Redirecting grep output to file - bash

If I execute
$ java -jar selenium-server.jar 2>&1 | grep "jetty.Server"
I get, after a while, the output I expect:
$ 16:30:24.881 INFO - Started org.openqa.jetty.jetty.Server#6b0a2d64
But if I try to redirect grep's output to a file, it doesn't write a thing:
$ java -jar selenium-server.jar 2>&1 | grep "jetty.Server" > /tmp/ebook_selenium
Any idea why? Thanks

It turns out that grep flushes its output after each line when writing to a terminal, but block-buffers when writing to a file, so nothing shows up until the buffer fills or grep exits.
grep --line-buffered will force grep to output each line as it's processed.
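Applied to the pipeline from the question, that looks like:
$ java -jar selenium-server.jar 2>&1 | grep --line-buffered "jetty.Server" > /tmp/ebook_selenium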

Related

Redirect stdout / stderr to bash script

Apologies if this is a really simple question, but I couldn't find anything on Google for "redirect stdout and stderr to bash script" or "redirect bash script output to another bash script".
I know this will redirect stdout and stderr to log.txt, but I'm not sure how to redirect it to a bash script.
java -jar build/libs/bot-kt-1.1.3.jar > log.txt 2>&1
Ideally something that behaves like above and is like so
java -jar build/libs/bot-kt-1.1.3.jar > "./script.sh" 2>&1 # script.sh uses the $1 argument for input
You can use xargs to move data from a pipe into an argument:
java -jar build/libs/bot-kt-1.1.3.jar 2>&1 | xargs ./script.sh
Or, to pass everything as a single argument:
java -jar build/libs/bot-kt-1.1.3.jar 2>&1 | xargs --null ./script.sh
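For concreteness, here is a minimal sketch of what such a script.sh might look like if it takes the captured output as $1, as the question describes (the body is purely a placeholder):
#!/bin/bash
# Hypothetical script.sh: the captured program output arrives as the first argument
log="$1"
printf 'received %d characters of bot output\n' "${#log}"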
You can use command substitution for that:
./script.sh $(java -jar build/libs/bot-kt-1.1.3.jar 2>&1)
In addition to perreal's answer, if your script could be changed to read from stdin instead of expecting a parameter, you could also use a pipe:
java -jar build/libs/bot-kt-1.1.3.jar 2>&1 | ./script.sh
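A sketch of that stdin-reading variant, assuming the script only needs to see the output line by line (again a placeholder body):
#!/bin/bash
# Hypothetical script.sh reading the piped program output from stdin, line by line
while IFS= read -r line; do
    printf 'bot said: %s\n' "$line"
done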

tee command piped in a grep and redirected to file

I would like to use the following command in bash:
(while true; do date; sleep 1;done) | tee out.out 2>&1 | grep ^[A-Z] >log.log 2>&1 &
Unfortunately, until the pipeline is finished (by killing the parent of the sleep command, for example), the file log.log stays empty, while out.out has the expected content.
I first want to understand what's happening, and then I would like to fix it.
In order to fix this, you need to make grep line-buffered. This might depend on the implementation, but on BSD grep (shipped with Mac OS X), you simply need to add the --line-buffered option to grep:
(while true; do date; sleep 1;done) | tee out.out 2>&1 | grep --line-buffered "^[A-Z]" >log.log 2>&1 &
From the grep man page:
--line-buffered
Force output to be line buffered. By default, output is line buffered when standard output is a terminal and block buffered otherwise.
You can actually validate that behavior by letting the output go to the terminal instead:
(while true; do date; sleep 1;done) | tee out.out 2>&1 | grep "^[A-Z]" 2>&1 &
In that case, you don't need to buffer by line explicitly, because that's the default. However, when you redirect to a file, you must explicitly set that behaviour.
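As an aside, if a filter in the middle of a pipeline has no such flag of its own, GNU coreutils' stdbuf can impose line buffering from the outside. A sketch on the same pipeline (redundant here, since GNU and BSD grep both have --line-buffered, but the idea carries over to other filters):
(while true; do date; sleep 1; done) | tee out.out | stdbuf -oL grep "^[A-Z]" > log.log &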

Cannot redirect the output of the hwclock -r command

I am implementing a shell script and I want to analyse the output of the hwclock -r (--show) command, which displays the RTC time and date.
To do that I tried things like: hwclock -r | grep -v "grep" | grep "error" > /dev/null
to see if an error happened while reading RTC registers.
The problem is that the output always ends up on the console, no matter what. I tried redirecting the output to a file and then analysing its content, and I also tried tee -a to send it to both the console and a file, but with no success.
Is there a solution, or an explanation of what is happening with the hwclock -r command?
Thank you in advance.
I just solved it by forwarding error messages to a file then make the analysis.
hwclock -r 2> file.txt; grep -v "grep" | grep "error" > /dev/null will do the job.
You omitted file.txt in the first grep; with that fixed, the line reads:
hwclock -r 2> file.txt; grep -v "grep" file.txt | grep "error" > /dev/null
If you just want to check for "error", a reasonably recent bash also offers a shorter way:
hwclock -r |& grep error >/dev/null
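A minimal sketch of how that check might then be used inside the script (the message is just a placeholder):
if hwclock -r |& grep "error" > /dev/null; then
    echo "RTC read reported an error" >&2
fi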

Truncating File in Bash

I am writing the output of a command to a file from a bash script. The command produces output gradually; I am using grep to pick out specific lines and tee to write them to the file. Right now, every line ends up appended to the file. Instead, I want the file truncated each time the command produces new output, so that it always contains exactly one line. How can I achieve this?
The command I am using is:
2>&1 zypper -x -n in geany | grep -o --line-buffered "percent=\"[0-9]*\"" | tee /var/log/oneclick.log
This produces output like percent="10" and so on. Each time, only one line should exist in the file
If you need to overwrite the file for each line:
2>&1 zypper -x -n in geany |
grep -o --line-buffered "percent=\"[0-9]*\"" |
while IFS= read -r line; do
echo "$line" > /var/log/oneclick.log
echo "$line"
done
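To confirm that the file really does hold just one line while zypper runs, it can be watched from another terminal:
watch -n 1 cat /var/log/oneclick.log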

bash output redirect problem

I want to count the number of lines output from a command in a bash script. i.e.
COUNT=$(ls | wc -l)
But I also want the script to output the original output from ls. How to get this done? (My actual command is not ls and it has side effects. So I can't run it twice.)
The tee(1) utility may be helpful:
$ ls | tee /dev/tty | wc -l
CHANGES
qpi.doc
qpi.lib
qpi.s
4
info coreutils "tee invocation" includes the following example, which might be more instructive of tee(1)'s power:
wget -O - http://example.com/dvd.iso \
| tee >(sha1sum > dvd.sha1) \
>(md5sum > dvd.md5) \
> dvd.iso
That downloads the file once and sends the output to two child processes (started via bash(1) process substitution) as well as to tee(1)'s own stdout, which is redirected to a file.
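Applied to the original goal, the /dev/tty form from above can be captured directly, so the listing still reaches the terminal while the count lands in a variable (this assumes the script is attached to a terminal):
COUNT=$(ls | tee /dev/tty | wc -l)
echo "counted $COUNT entries"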
ls | tee tmpfile | first command
cat tmpfile | second command
tee is a good way to do that, but you can do something simpler:
ls > __tmpfile
cat __tmpfile | wc -l
cat __tmpfile
rm __tmpfile
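Since the real command must run only once anyway, capturing its output in a variable is another simple route; a sketch, with mycmd standing in for the actual command (hypothetical name):
output=$(mycmd)                          # run the command a single time
printf '%s\n' "$output"                  # reproduce its original output
COUNT=$(printf '%s\n' "$output" | wc -l) # count the lines without rerunning mycmd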
