Send standard output and standard error to different files, but also display them - bash

The answers to this question show us how to redirect standard output and standard error to two separate files.
But what if I also want to see the output on the console as it is created?
We can use tee to save one of the streams to a file, but the other stream then has to be left on the console or redirected to a file on its own.
$ command 2>error.log | tee output.log
How can I use tee on both streams?

I found the answer here.
$ ( command 2>&1 1>&3 | tee error.log >&2 ) 3>&1 | tee output.log
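As a sanity check, here is a sketch using a hypothetical emit function (not from the original question) that writes one line to each stream; after running the swap, error.log should hold only the stderr line and output.log only the stdout line, while both still appear on the console:

```shell
# Hypothetical test command: one line to stdout, one to stderr.
emit() {
  echo "to-stdout"
  echo "to-stderr" >&2
}

# fd 3 carries the original stdout into the subshell. Inside it,
# stderr is routed through the pipe to `tee error.log`, whose output
# is sent back to the real stderr; stdout goes to fd 3 instead.
# Outside, fd 3 is piped through `tee output.log`.
( emit 2>&1 1>&3 | tee error.log >&2 ) 3>&1 | tee output.log
```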

Related

Redirect both stdout and stderr to file, print stdout only [duplicate]

This question already has answers here:
Separately redirecting and recombining stderr/stdout without losing ordering
I have a large amount of text coming in stdout and stderr; I would like to log all of it in a file (in the same order), and print only what comes from stdout in the console for further processing (like grep).
Any combination of > file or &> file, even with | or |&, permanently redirects the stream, so I cannot pipe it afterwards:
my_command > output.log | grep something # logs only stdout, prints only stderr
my_command &> output.log | grep something # logs everything in correct order, prints nothing
my_command > output.log |& grep something # logs everything in correct order, prints nothing
my_command &> output.log |& grep something # logs everything in correct order, prints nothing
Any use of tee will either:
print what comes from stderr, then log and print everything that comes from stdout, so I lose the ordering of the text; or
log both in the correct order if I use |& tee, but then I lose control over the streams, since everything now ends up on stdout.
example:
my_command | tee output.log | grep something # logs only stdout, prints all of stderr then all of stdout
my_command |& tee output.log | grep something # logs everything, prints everything to stdout
my_command | tee output.log 3>&1 1>&2 2>&3 | tee -a output.log | grep something # logs only stdout, prints all of stderr then all of stdout
Now I'm all out of ideas.
This is what my test case looks like:
testFunction() {
    echo "output";
    1>&2 echo "error";
    echo "output-2";
    1>&2 echo "error-2";
    echo "output-3";
    1>&2 echo "error-3";
}
I would like my console output to look like:
output
output-2
output-3
And my output.log file to look like:
output
error
output-2
error-2
output-3
error-3
For more details: I'm filtering the output of mvn clean install with grep to keep only minimal information in the terminal, but I would also like a full log somewhere in case I need to investigate a stack trace. The Java test logs are sent to stderr, so I chose to discard them in my console output.
While not really a solution that uses redirection, you might want to use annotate-output for this.
Assume that script.sh contains your function, then you can do:
$ annotate-output ./script.sh
13:17:15 I: Started ./script.sh
13:17:15 O: output
13:17:15 E: error
13:17:15 E: error-2
13:17:15 O: output-2
13:17:15 E: error-3
13:17:15 O: output-3
13:17:15 I: Finished with exitcode 0
So now it is easy to reprocess that information and send it to the files you want:
$ annotate-output ./script.sh \
| awk '{s=substr($0,13)}/ [OE]: /{print s> "logfile"}/ O: /{print s}'
output
output-2
output-3
$ cat logfile
output
error
error-2
output-2
error-3
output-3
Or any other combination of tee sed cut ...
As per a comment from @CharlesDuffy:
Since stdout and stderr are processed in parallel, it can happen that some lines received on
stdout will show up before later-printed stderr lines (and vice-versa).
This is unfortunately very hard to fix with the current annotation strategy. A fix would
involve switching to PTRACE'ing the process. Giving nice a (much) higher priority over the
executed program could, however, cause this behaviour to show up less frequently.
source: man annotate-output

How to save the terminal screen output in piping command

I have a few commands that I'm piping together. The first command produces a big file as output, while what it prints on the screen is only a very short statistical summary. The big file output is processed fine through the pipe, but I'd also like to save the short screen output to a text file. How can I do that within the pipeline?
So far I've tried using tee with the redirections below:
&> someFile.txt
> someFile.txt
>> someFile.txt
But all of them captured the big file output, while I'd like only the short screen output.
Any ideas how to do that?
If you just want the output of command_to_refine_big_output on stdout and in a file called log in the current directory, this works:
command_with_big_output | command_to_refine_big_output | tee log
Note that this only writes stdout to the log file; if you want stderr too, you can do:
command_with_big_output | command_to_refine_big_output 2>&1 | tee log
or, if you want all output, errors included, from the complete chain:
command_with_big_output 2>&1 | command_to_refine_big_output 2>&1 | tee log
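A minimal demonstration of the first form, with hypothetical stand-ins for the two commands (a generator and a filter that condenses the output to a line count): only the filter's short result reaches the screen and the log file, not the big output.

```shell
# Hypothetical stand-ins for the two commands in the chain.
big_output() { printf '%s\n' one two three; }   # the "big file output"
refine()     { grep -c .; }                     # condenses it to a line count

# The count (3) is printed on the terminal and also written to ./log;
# the three raw lines never reach the screen or the log.
big_output | refine | tee log
```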

Pipe output to two different commands [duplicate]

Possible Duplicate:
osx/linux: pipes into two processes?
Is there a way to pipe the output from one command into the input of two other commands, running them simultaneously?
Something like this:
$ echo 'test' |(cat) |(cat)
test
test
The reason I want to do this is that I have a program which receives an FM radio signal from a USB SDR device, and outputs the audio as raw PCM data (like a .wav file but with no header.) Since the signal is not music but POCSAG pager data, I need to pipe it to a decoder program to recover the pager text. However I also want to listen to the signal so I know whether any data is coming in or not. (Otherwise I can't tell if the decoder is broken or there's just no data being broadcast.) So as well as piping the data to the pager decoder, I also need to pipe the same data to the play command.
Currently I only know how to do one - either pipe it to the decoder and read the data in silence, or pipe it to play and hear it without seeing any decoded text.
How can I pipe the same data to both commands, so I can read the text and hear the audio?
I can't use tee as it only writes the duplicated data to a file, but I need to process the data in real-time.
It should work if you use both tee and mkfifo.
mkfifo pipe
cat pipe | (command 1) &
echo 'test' | tee pipe | (command 2)
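A concrete sketch of the FIFO approach, using tr and rev as hypothetical stand-ins for the two consumers (output files upper.txt and rev.txt are also made up for the demo):

```shell
mkfifo pipe                              # create the named pipe

tr a-z A-Z < pipe > upper.txt &          # consumer 1 reads from the FIFO

echo 'test' | tee pipe | rev > rev.txt   # tee feeds the FIFO and consumer 2

wait                                     # let the background reader finish
rm pipe                                  # clean up the FIFO
```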
Recent versions of bash support the >(command) process-substitution syntax:
echo "Hello world." | tee >(sed 's/^/1st: /') >(sed 's/^/2nd cmd: /') >/dev/null
which may print (the order is not guaranteed):
2nd cmd: Hello world.
1st: Hello world.
To download somefile.ext, save it, and compute its md5sum and sha1sum in one pass:
wget -O - http://somewhere.someland/somepath/somefile.ext |
tee somefile.ext >(md5sum >somefile.md5) | sha1sum >somefile.sha1
or
wget -O - http://somewhere.someland/somepath/somefile.ext |
tee >(md5sum >somefile.md5) >(sha1sum >somefile.sha1) >somefile.ext
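The same pattern can be tried offline by replacing wget with cat on a local file (source.dat and the output names here are hypothetical); tee copies the stream into both process substitutions while writing the file itself:

```shell
# Create a local stand-in for the download.
printf 'hello\n' > source.dat

# One pass: the data lands in somefile.ext while md5sum and sha1sum
# each receive their own copy via process substitution (bash syntax).
cat source.dat |
  tee >(md5sum > somefile.md5) >(sha1sum > somefile.sha1) > somefile.ext
```

Note that the checksum processes run asynchronously, so the .md5 and .sha1 files may be written a moment after the pipeline returns.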
Old answer
There is a way to do that via unnamed pipe (tested under linux):
(( echo "hello" |
tee /dev/fd/5 |
sed 's/^/1st occure: /' >/dev/fd/4
) 5>&1 |
sed 's/^/2nd command: /'
) 4>&1
give:
2nd command: hello
1st occure: hello
This sample lets you download somefile.ext, save it, and compute its md5sum and sha1sum:
(( wget -O - http://somewhere.someland/somepath/somefile.ext |
tee /dev/fd/5 |
md5sum >/dev/fd/4
) 5>&1 |
tee somefile.ext |
sha1sum
) 4>&1
Maybe take a look at the tee command. What it does is simply write its input to a file, while also passing that input through to standard output. So something like:
echo "Hello" | tee try.txt | <some_command>
will create a file with content "Hello" AND also let "Hello" flow through the pipeline to become <some_command>'s stdin.
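For example, with tr standing in for <some_command> (try.txt is from the answer above; the tr step is just an illustration):

```shell
# tee writes "Hello" to try.txt and also forwards it to tr,
# which uppercases it on the terminal.
echo "Hello" | tee try.txt | tr '[:lower:]' '[:upper:]'
```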

Direct output to standard output and an output file simultaneously

I know that
./executable &>outputfile
will redirect the standard output and standard error to a file. This is what I want, but I would also like the output to continue to be printed in the terminal. What is the best way to do this?
Ok, here is my exact command: I have tried
./damp2Plan 10 | tee log.txt
and
./damp2Plan 10 2>&1 | tee log.txt
where 10 is just an argument passed to main. Neither works correctly: the very first printf statement in the code goes to the terminal and log.txt just fine, but none of the rest do. I'm on Ubuntu 12.04 (Precise Pangolin).
Use tee:
./executable 2>&1 | tee outputfile
tee outputs in chunks and there may be some delay before you see any output. If you want closer to real-time output, you could redirect to a file as you are now, and monitor it with tail -f in a different shell:
./executable > outputfile 2>&1
tail -f outputfile
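A quick check of the tee form with a hypothetical two-stream function (not from the question) confirms that both streams land in outputfile and on the terminal:

```shell
# Hypothetical command writing to both streams.
both() {
  echo "out line"            # stdout
  echo "err line" >&2        # stderr
}

# 2>&1 merges stderr into stdout before the pipe, so tee sees both.
both 2>&1 | tee outputfile
```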

Capture log4J output with grep

I know that log4j by default outputs to stderror.
I have been capturing the out put of my application with the following command:
application_to_run 2> log ; cat log | grep FATAL
Is there a way to capture the output without the auxiliary file?
If you want both stdout and stderr, use:
( application_to_run 2>&1 ) | grep FATAL
If you want stderr alone, you can use:
( application_to_run 2>&1 >/dev/null ) | grep FATAL
The first sends all output destined for file handle 2 (stderr) to file handle 1 (stdout), then pipes that through grep. The second does the same but also sends stdout to the bit bucket. This will work since redirection is a positional thing. First, stderr is redirected to the current stdout, then stdout is redirected to /dev/null.
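A quick check of the second form with a hypothetical app function that writes one line per stream: only the stderr line survives into the pipe, so grep sees it while stdout is discarded.

```shell
# Hypothetical application: one informational line on stdout,
# one fatal line on stderr.
app() {
  echo "INFO ok"             # stdout
  echo "FATAL disk" >&2      # stderr
}

# stderr is redirected to the current stdout (the pipe) first,
# then stdout is redirected to /dev/null; grep receives only stderr.
( app 2>&1 >/dev/null ) | grep FATAL
```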
If you are asking how to redirect stderr to stdout so you can use it in a pipe, there are two ways I know of:
$ command 2>&1 | ...
$ command |& ...
