Redirect both stdout and stderr to file, print stdout only [duplicate] - bash

I have a large amount of text coming in on stdout and stderr; I would like to log all of it to a file (in the same order), and print only what comes from stdout on the console for further processing (like grep).
Any combination of > file or &> file, even with | or |&, will permanently redirect the stream, and I cannot pipe it afterwards:
my_command > output.log | grep something # logs only stdout, prints only stderr
my_command &> output.log | grep something # logs everything in correct order, prints nothing
my_command > output.log |& grep something # logs everything in correct order, prints nothing
my_command &> output.log |& grep something # logs everything in correct order, prints nothing
Any use of tee will either:
- print what comes from stderr, then log everything that comes from stdout and print it out, so I lose the order of the incoming text; or
- log both in the correct order if I use |& tee, but I lose control over the streams, since now everything is on stdout.
For example:
my_command | tee output.log | grep something # logs only stdout, prints all of stderr then all of stdout
my_command |& tee output.log | grep something # logs everything, prints everything to stdout
my_command | tee output.log 3>&1 1>&2 2>&3 | tee -a output.log | grep something # logs only stdout, prints all of stderr then all of stdout
Now I'm all out of ideas.
This is what my test case looks like:
testFunction() {
  echo "output"
  1>&2 echo "error"
  echo "output-2"
  1>&2 echo "error-2"
  echo "output-3"
  1>&2 echo "error-3"
}
I would like my console output to look like:
output
output-2
output-3
And my output.log file to look like:
output
error
output-2
error-2
output-3
error-3
For context: I'm filtering the output of mvn clean install with grep to keep only minimal information in the terminal, but I would also like to have a full log somewhere in case I need to investigate a stack trace. The Java test logs are sent to stderr, so I chose to discard them in my console output.

While not really a solution that uses redirects or anything of that sort, you might want to use annotate-output for this.
Assuming script.sh contains your function, you can do:
$ annotate-output ./script.sh
13:17:15 I: Started ./script.sh
13:17:15 O: output
13:17:15 E: error
13:17:15 E: error-2
13:17:15 O: output-2
13:17:15 E: error-3
13:17:15 O: output-3
13:17:15 I: Finished with exitcode 0
So now it is easy to reprocess that information and send it to the files you want:
$ annotate-output ./script.sh \
| awk '{s=substr($0,13)}/ [OE]: /{print s> "logfile"}/ O: /{print s}'
output
output-2
output-3
$ cat logfile
output
error
error-2
output-2
error-3
output-3
Or any other combination of tee, sed, cut, etc.
As per a comment from @CharlesDuffy:
Since stdout and stderr are processed in parallel, it can happen that some lines received on
stdout will show up before later-printed stderr lines (and vice-versa).
This is unfortunately very hard to fix with the current annotation strategy. A fix would
involve switching to PTRACE'ing the process. Giving nice a (much) higher priority over the
executed program could, however, cause this behaviour to show up less frequently.
source: man annotate-output
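A pure-bash alternative, hedged the same way, is to tee each stream into the log separately with process substitution. The two tee processes append to the log concurrently, so the relative order of stdout and stderr lines is likewise only approximate (the names below are the ones from the question):

: > output.log   # start with an empty log
# stderr is logged and discarded; stdout is logged and passed on to grep
my_command 2> >(tee -a output.log >/dev/null) | tee -a output.log | grep something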

Related

shell: send grep output to stderr and leave stdout intact

I have a program that outputs to stdout (actually it outputs to stderr, but I can easily redirect that to stdout with 2>&1 or the like).
I would like to run grep on the output of the program and redirect all matches to stderr while leaving the unmatched lines on stdout (alternatively, I'd be happy with getting all lines - not just the unmatched ones - on stdout),
e.g.
$ myprogram() {
cat <<EOF
one line
a line with an error
another line
EOF
}
$ myprogram | greptostderr error >/dev/null
a line with an error
$ myprogram | greptostderr error 2>/dev/null
one line
another line
$
A trivial solution would be:
myprogram | tee logfile
grep error logfile 1>&2
rm logfile
However, I would rather get the matching lines on stderr when they occur, not when the program exits...
Eventually, I found this, which gave me a hint for a POSIX solution like so:
greptostderr() {
  while read LINE; do
    echo $LINE
    echo $LINE | grep -- "$@" 1>&2
  done
}
For whatever reason, this does not output anything (probably a buffering problem).
A somewhat ugly solution that seems to work goes like this:
greptostderr() {
  while read LINE; do
    echo $LINE
    echo $LINE | grep -- "$@" | tee /dev/stderr >/dev/null
  done
}
Are there any better ways to implement this?
Ideally I'm looking for a POSIX shell solution, but bash is fine as well...
I would use awk instead of grep, which gives you more flexibility in handling both matched and unmatched lines.
myprogram | awk -v p=error '{ print > ($0 ~ p ? "/dev/stderr" : "/dev/stdout")}'
Every line will be printed; the result of $0 ~ p determines whether the line is printed to standard error or standard output. (You may need to adjust the output file names based on your file system.)
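Using the myprogram function from the question, each side of the split can be checked by silencing the other stream:

$ myprogram | awk -v p=error '{ print > ($0 ~ p ? "/dev/stderr" : "/dev/stdout")}' 2>/dev/null
one line
another line
$ myprogram | awk -v p=error '{ print > ($0 ~ p ? "/dev/stderr" : "/dev/stdout")}' >/dev/null
a line with an error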

Tee to commands only, not stdout

I already know how to use tee with process substitution to send output to various commands and stdout, e.g.:
command0 | tee >(command1) >(command2)
With the above line, stdout will be composed of interleaved lines from command0, command1, and command2.
Is there a way to prevent tee from writing to stdout, without removing the output of any commands it pipes to? So for the example above, for stdout to only have output from command1 and command2?
Most answers about teeing without stdout only cover writing directly to files, and recommend something like this:
command0 | tee file1 file2 >/dev/null
But with process substitution, that would consume all output from the other commands too.
command0 | tee >(command1) >(command2) >/dev/null
Is there some way to tell tee not to print to stdout, or to only consume the output directly from tee?
Try this:
( command0 | tee >(command1 1>&3 ) | command2 ) 3>&1
It redirects the stdout of command1 to file descriptor 3, so that command2 sees only the original source. At the end, you redirect file descriptor 3 back to stdout.
Use this to test it:
( echo test | tee >( sed 's/^/1 /' >&3 ) | sed 's/^/2 /' ) 3>&1
The output is unordered; in my case it was:
2 test
1 test
I have seen a comment and an answer that use an extra >, but don't really explain why it does what it does. It seems to be redirecting output somewhere, and it does what I'm looking for. This works:
command0 | tee > >(command1) >(command2)
command0 | tee >(command1) > >(command2)
It appears not to matter where the extra > is, as long as it is before at least one of the arguments to tee. So this will not work:
command0 | tee >(command1) >(command2) >
As far as I can tell, the extra > is an ordinary output redirection: >(commandN) expands to a path like /dev/fd/63, so > >(commandN) redirects tee's own stdout into that command, while the remaining >(...) is passed to tee as a file argument. That way tee's stdout is consumed by one of the commands instead of reaching the terminal.
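A quick way to see this in action (output order between the two branches is unspecified):

# tee's stdout is redirected into the first sed, and the second process
# substitution is passed to tee as a file argument, so no unprocessed
# copy of the line reaches the terminal
echo test | tee > >(sed 's/^/1 /') >(sed 's/^/2 /')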

bash: parse output of command and store into variable

I have made a command which returns 'version:X'.
i.e.:
$ ./mybox -v
version:2
I don't understand why this isn't working:
$ VERSION=$( /home/mybox -v | sed 's/.*version:\([0-9]*\).*/\1/')
$ echo $VERSION
$
If I write this, it is OK:
$ VERSION=$( echo "version:2" | sed 's/.*version:\([0-9]*\).*/\1/')
$ echo $VERSION
2
It's pretty common for version/error/debugging information to be sent to stderr, not stdout. When running the command from a terminal, both will be printed, but only stdout will make it through the pipe to sed.
echo output always goes to stdout by default, which is why you're not having trouble there.
If the above is correct, you'll just need to redirect stderr (file descriptor 2) to stdout (file descriptor 1) before passing it along:
VERSION=$( /home/mybox -v 2>&1 | sed 's/.*version:\([0-9]*\).*/\1/')
# ^^^^
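A hypothetical stand-in for mybox (not the real binary) reproduces both the symptom and the fix:

mybox() { echo "version:2" >&2; }   # hypothetical: writes its version to stderr
VERSION=$( mybox -v | sed 's/.*version:\([0-9]*\).*/\1/')
echo "got: '$VERSION'"              # got: '' - stderr bypassed the pipe
VERSION=$( mybox -v 2>&1 | sed 's/.*version:\([0-9]*\).*/\1/')
echo "got: '$VERSION'"              # got: '2'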

bash, nested commands and redirects

I am trying to track the CPU usage of a process using a command like this:
top -b -d 1 | grep myprocess.exe
Next, I would like to redirect this to a log file, e.g.
top -b -d 1 | grep myprocess.exe > output.log
Now, this does not actually work because it thinks I am grepping myprocess.exe > output.log instead of myprocess.exe.
Does anybody know how I can get this redirect to work?
Now, this does not actually work because it thinks I am grepping myprocess.exe > output.log instead of myprocess.exe
Wrong. All should be fine. The first example runs the pipeline with stdout attached to your terminal (thus you see the output, but nothing is written to any file). The second runs it with stdout redirected to output.log (thus you see no output, but it goes right into your file).
If you want the output written to both, you need another process that gets your previous pipeline's stdout as stdin, and duplicates it. Like:
previous_pipeline | tee output.log
tee will print on stdout what it gets on stdin (so for stdout, everything is the same as before), but it additionally opens another file (given as a command-line argument) and writes a copy to it.
Try tee:
top -b -d 1 | grep myprocess.exe | tee output.log
If you want it to show no output:
top -b -d 1 | grep myprocess.exe | tee output.log > /dev/null
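One related caveat with a long-running source like top: grep block-buffers its output when it writes to a pipe or a file, so output.log can lag far behind what top is producing. With GNU grep you can force line buffering:

top -b -d 1 | grep --line-buffered myprocess.exe | tee output.log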

Capture log4J output with grep

I know that log4j by default outputs to stderr.
I have been capturing the output of my application with the following command:
application_to_run 2> log ; cat log | grep FATAL
Is there a way to capture the output without the auxiliary file?
If you want both stdout and stderr, use:
( application_to_run 2>&1 ) | grep FATAL
If you want stderr alone, you can use:
( application_to_run 2>&1 >/dev/null ) | grep FATAL
The first sends all output destined for file handle 2 (stderr) to file handle 1 (stdout), then pipes that through grep. The second does the same but also sends stdout to the bit bucket. This will work since redirection is a positional thing. First, stderr is redirected to the current stdout, then stdout is redirected to /dev/null.
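A small demonstration of that ordering, using a hypothetical helper that writes one line to each stream:

f() { echo out; echo err >&2; }   # hypothetical: one line to stdout, one to stderr
( f 2>&1 ) | cat                  # prints both lines: out, err
( f 2>&1 >/dev/null ) | cat       # prints only: err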
If you are asking how to redirect stderr to stdout so you can use it in a pipe, there are two ways I know of:
$ command 2>&1 | ...
$ command |& ...
(In bash, |& is shorthand for 2>&1 |.)
