I am trying to debug some errors in a live Merb app. There are a lot of lines of errors scrolling by, but I just need to see the first one. I can use grep to select these lines and print them, but it closes as soon as it reaches the end of the file.
What I would like to do is use grep like the Shift+F mode in less, where it keeps the file open and reports new matching lines as they are written to the log.
- or -
Is there a way to do this directly with less that I don't know about?
Try this:
tail -f dev.log | grep '^ERROR:'
The -f option tells tail to wait for more data when it hits EOF instead of exiting.
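The filter itself can be sanity-checked on static input first (printf stands in for the log here); with a live file, adding --line-buffered makes grep flush each match promptly if its output feeds a further pipe stage:

```shell
# Simulated log lines; keep only those starting with ERROR:
printf 'INFO: booting\nERROR: disk full\nDEBUG: probe\nERROR: net down\n' |
  grep '^ERROR:'

# Live variant: tail -f dev.log | grep --line-buffered '^ERROR:'
```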
Can't you do this with watch and tail?
watch -n 30 "grep '^ERROR:' dev.log | tail -n 30"
Background
I'm working on a change to my CI build that triggers a command which runs my Xcode unit tests. These tests log an extremely verbose amount of information: so much so that I exceed the 4 MB log-capture limit and my build gets terminated. As far as I can tell, there's no way for me to make the tests less verbose (the one way I came up with would require a change to the command that runs them, which I'm working on).
My Workaround
So I decided to get clever and try filtering my output with sed, like so:
test_running_command | sed '/xctest\[/d; /^$/d'
The sed command works when I run it on a file, filtering out lines containing xctest[ and empty lines, as intended. But when I incorporate it into my CI build, I see output streamed up to a certain point, and then it just stops. After 10 minutes, my CI build gets killed anyway, before I ever hit the 4 MB limit.
The Question
Why is sed hanging like this?
Troubleshooting Performed
I tried using awk, like so, which similarly hangs.
test_running_command | awk '$0 !~ /xctest\[/ && $0 !~ /^$/ {print}'
I tried a command from this answer to turn on line buffering for the original command, which just hung at a different place.
script -q /dev/null test_running_command | <sed or awk command from above>
As suggested by @CharlesDuffy in a comment, I used tee to write what goes into the pipe to a file, and determined that the left side of the pipe definitely gets much further than the right. I observed (on my local machine) that while the output to the console was frozen, before.txt was continuing to grow. This is the line I used:
test_running_command | tee before.txt | sed '/xctest\[/d; /^$/d' | tee after.txt
As suggested by @LuisMuñoz in a comment, I tried using stdbuf (after installing it with brew install coreutils) to disable buffering into and out of sed. I didn't see a difference in behavior; output still froze at an arbitrary point.
test_running_command | gstdbuf -i0 -o0 sed '/xctest\[/d; /^$/d'
I discovered the -l flag for sed. Apparently that turns on line-buffering mode, which produced the desired effect. I wonder if it was having problems because its buffer was getting overrun by the incredibly long lines produced. Regardless, this command ultimately worked:
test_running_command | sed -l '/xctest\[/d; /^$/d'
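For reference, grep can do the same filtering with explicit line buffering on both GNU and BSD systems. A minimal sketch, with a hypothetical test_running_command standing in for the real test runner (note that GNU sed's analogue of BSD sed's -l is -u/--unbuffered):

```shell
# Hypothetical stand-in for the real, verbose test command
test_running_command() { printf 'keep me\nxctest[123] noisy line\n\nkeep me too\n'; }

# -v drops lines matching either pattern; --line-buffered flushes each
# surviving line immediately instead of waiting for a full output buffer
test_running_command | grep --line-buffered -v -e 'xctest\[' -e '^$'
```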
As a side note, I had one test that was taking longer than 10 minutes on the CI server, and without log output, that was causing a timeout. I put a single printf statement inside its outer loop that produced just enough log output so the build didn't get killed. The printf line didn't get filtered by the sed command, so this is perfect.
I have a website that generates raw access logs. My web host doesn't have good support for viewing users online, so I've turned to SSH. Right now I am using the following command to see who is online:
watch tail -50 access.log
However there is a lot of junk in there. There is a bot that visits my website every 5 minutes, and whenever somebody loads a page, a bunch of assets (images, JS, CSS) also show up in the log.
Is there a way to tail the end of the file, and also filter out lines containing certain words, such as Alexabot, .css, .png, .jpg, and .js?
Also, it'd be ideal if the command/script would work backwards from the end of the file until 50 lines not matching the filter criteria are found, rather than just taking the last 50 lines and suppressing the spammy ones within that group.
I looked into awk for filtering, but I am stuck because I don't know how to combine awk with tail.
You could use grep with the -v (invert-match) option:
watch "tail -50 access.log | grep -v -E 'Alexabot|\.css|\.png|\.jpg|\.js'"
The -E flag enables extended regular expression (ERE) syntax, which allows matching multiple alternatives separated by |.
You could also try disabling output buffering with stdbuf -oL before feeding the stream to grep as
watch "tail -50 access.log | stdbuf -oL grep -v -E 'Alexabot|\.css|\.png|\.jpg|\.js'"
Here is an awk version that negates multiple patterns:
watch "tail -50 access.log | awk '!/Alexabot|\.css|\.png|\.jpg|\.js/'"
Or, if you want continuous monitoring:
stdbuf -oL tail -f access.log | stdbuf -oL awk '!/Alexabot|\.css|\.png|\.jpg|\.js/'
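The "work backwards until 50 non-matching lines are found" part of the question can be sketched with tac (GNU coreutils; assumed available), which reverses line order:

```shell
# Reverse the log, drop unwanted lines, keep the first 50 survivors,
# then reverse again to restore chronological order.
tac access.log | grep -v -E 'Alexabot|\.css|\.png|\.jpg|\.js' | head -n 50 | tac
```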
I'm relatively new to Linux, so please forgive me if the solution is simple or obvious.
I'm trying to set up a background running script that monitors a log file for certain keyword patterns with awk and tail, and then uses espeak to provide a simplified notification when these keywords appear in the log file (which uses sysklogd)
The concept is derived from this guide
This is a horrible example of what I'm trying to do:
#!/bin/bash
tail -f -n1 /var/log/example_main | awk '/example sshd/&&/session opened for user/{system("espeak \"Opening SSH session\"")}'
tail -f -n1 /var/log/example_main | awk '/example sshd/&&/session closed/{system("espeak \"Session closed. Goodbye.\"")}'
tail -f -n1 /var/log/example_main | awk '/example sshd/&&/authentication failure/{system("espeak \"Warning: Authentication Faliure\"")}'
tail -f -n1 /var/log/example_main | awk '/example sshd/&&/authentication failure/{system("espeak \"Authentication Failure. I have denied access.\"")}'
The first tail command by itself works perfectly: it monitors the defined log file for 'example sshd' and 'session opened for user', then uses espeak to say 'Opening SSH session'. But as you would expect from the excerpt above, the script never gets past this first tail command, since tail -f blocks forever and the subsequent lines are never reached.
I guess I have a few questions:
How should I set out this script?
What is the best way to constantly run this script in the background, e.g. via init?
Are there any tutorials/documentation somewhere that could help me out?
Is there already something like this available that I could use?
Thanks, any help would be greatly appreciated - sorry for the long post.
Personally, I would attempt to set each of these up as an individual cron job. This would allow you to run it at a specific time and at specified intervals.
For example, you could type crontab -e
Then inside, have each of these tail commands listed as such:
5 * * * * tail -f -n1 /var/log/example_main | awk '/example sshd/&&/session opened for user/{system("espeak \"Opening SSH session\"")}'
That would run that one command at 5 minutes after the hour, every hour.
This was a decent guide I found: HowTo: Add Jobs To cron
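Alternatively, all four patterns can live in a single awk program fed by one tail, so only one long-running process is needed. A sketch reusing the question's own patterns (the espeak calls are unchanged; swap in any notifier):

```shell
#!/bin/bash
# One long-running tail feeds one awk; each pattern/action pair fires
# independently, so a single process covers every event type.
tail -F -n0 /var/log/example_main | awk '
  /example sshd/ && /session opened for user/ { system("espeak \"Opening SSH session\"") }
  /example sshd/ && /session closed/          { system("espeak \"Session closed. Goodbye.\"") }
  /example sshd/ && /authentication failure/  { system("espeak \"Warning: Authentication failure\"") }
'
```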
For work, I occasionally need to monitor the output logs of services I create. These logs are short lived, and contain a lot of information that I don't necessarily need. Up until this point I've been watching them using:
grep <tag> * | less
where <tag> is either INFO, DEBUG, WARN, or ERROR. There are about 10x as many warns as there are errors, 10x as many debugs as warns, and so forth. It makes it difficult to catch one ERROR in a sea of irrelevant DEBUG messages. I would like a way to, for instance, make all 'WARN' messages appear on the left-hand side of the terminal and all the 'ERROR' messages appear on the right-hand side.
I have tried using tmux and screen, but it doesn't seem to be working on my dev machine.
Try doing this:
FILE=filename.log
vim -O <(grep 'ERR' "$FILE") <(grep 'WARN' "$FILE")
Just use sed to indent the desired lines. Or, use colors. For example, to make ERRORS red, you could do:
$ r=$( printf '\033[1;31m' ) # red; the escape sequence may vary by terminal
$ g=$( printf '\033[1;32m' ) # green
$ echo "$g" # set the default output color to green
$ sed "/ERROR/ { s/^/$r/; s/$/$g/; }" *
If these are live logs, how about running these two commands in separate terminals:
Errors:
tail -f * | grep ERROR
Warnings:
tail -f * | grep WARN
Edit
To automate this you could start it in a tmux session. I tend to do this with a tmux script similar to what I described here.
In your case the script file could contain something like this:
monitor.tmux
send-keys "tail -f * | grep ERROR\n"
split
send-keys "tail -f * | grep WARN\n"
Then run like this:
tmux new -d \; source-file monitor.tmux; tmux attach
You could do this using screen. Simply split the screen vertically and run tail -f LOGFILE | grep KEYWORD on each pane.
As a shortcut, you can use the following rc file:
split -v
screen bash -c "tail -f /var/log/syslog | grep ERR"
focus
screen bash -c "tail -f /var/log/syslog | grep WARN"
then launch your screen instance using:
screen -c monitor_log_screen.rc
You can of course extend this concept much further by making more splits and use commands like tail -f and watch to get live updates of different output.
Also explore screen's other features, such as multiple windows (with monitoring) and hardstatus, and you can put together quite a comprehensive "monitoring console".
I'm pretty much a novice to shell scripting. I'm trying to send the output of some piped commands to an open command in bash in OSX.
My ultimate goal is to compile a Flex/ActionScript application from TextWrangler by calling a bash script with a little AppleScript, and have the result played directly in a Flash Player. The AppleScript is pretty much doing its job, but the bash script doesn't work as I expect. I get the same results when I omit the AppleScript and simply run the command directly in Terminal.
This is what the AppleScript is sending to Terminal:
mxmlc -warnings=false DocumentClass.as | tail -n 1 | sed 's/[[:space:]].*$//' | open -a 'Flash Player'
So basically, I read the last line of the output of mxmlc, which usually looks something like this:
/Users/fireeyedboy/Desktop/DocumentClass.swf (994 bytes)
and I strip everything after the first space it encounters. I know it's hardly bulletproof yet, it's still just a proof of concept. When I get this roughly working I'll refine. It returns the desired result so far:
/Users/fireeyedboy/Desktop/DocumentClass.swf
But as you can see, I then try to pipe this sed result to the Flash Player and that's where it fails. The Flash Player seems to open way too early. I would expect the Flash Player to open only after the script finished the sed command. But it opens way earlier.
So my question is twofold:
Is it even possible to pipe an argument to the open command this way?
Do I need to use some type of delay command to get this working, since the open command doesn't seem to be waiting for the input?
You're trying to give the name of the swf file as input to stdin of the open command, which it doesn't support.
It expects the file name as an argument (similar to -a).
You can do something like this:
FILENAME=`mxmlc -warnings=false DocumentClass.as | tail -n 1 | sed 's/[[:space:]].*$//'`
open -a 'Flash Player' "$FILENAME"
or on a single line:
open -a 'Flash Player' `mxmlc -warnings=false DocumentClass.as | tail -n 1 | sed 's/[[:space:]].*$//'`
If you're using bash (or another modern POSIX shell), you can replace the pretty unreadable backtick character with $( and ):
open -a 'Flash Player' "$(mxmlc -warnings=false DocumentClass.as | tail -n 1 | sed 's/[[:space:]].*$//')"
All commands in a pipe are started at the same time. During this step, their input/outputs are chained together.
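A quick way to observe this (the sleep delays the data, not the startup of the right-hand stage):

```shell
# The right side prints immediately even though the left side sleeps;
# only the data flowing through the pipe is delayed.
{ sleep 0.2; echo late; } | { echo 'right side started'; cat; }
```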
My guess is that open -a 'Flash Player' doesn't wait for input but simply starts the Flash Player. I suggest trying to run the player with an argument instead:
name=$(mxmlc -warnings=false DocumentClass.as | tail -n 1 | sed 's/[[:space:]].*$//')
open -a 'Flash Player' "$name"
I'm not familiar with the open command as it seems to be a Mac thing, but I think what you want to do is:
open -a 'Flash Player' $(mxmlc -warnings=false DocumentClass.as | tail -n 1 | sed 's/[[:space:]].*$//')
In general you can't pipe arguments to a command, you have to specify that you want the output of the previous command to be treated as arguments, either as in my example or with the xargs command. Note that there is a limit on the maximum size of a command line, though.
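A sketch of the xargs route, with echo standing in for open so it runs anywhere (open itself is macOS-only):

```shell
# xargs turns each input line into an argument; -I{} substitutes the
# line wherever {} appears in the command template.
printf '/Users/fireeyedboy/Desktop/DocumentClass.swf\n' |
  xargs -I{} echo open -a 'Flash Player' {}
```

In real use you would drop the echo: `... | xargs -I{} open -a 'Flash Player' {}`.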