Bash: Pipe stdout and catch stderr into a variable - bash

Is it possible to pipe normal stdout further to another program, but store stderr into a variable?
This is the use case:
mysqldump database | gzip > database.sql
In this scenario I would like to catch all errors/warnings produced by mysqldump and store them into a variable, but the normal stdout (which is the dump) should continue being piped to gzip.
Any ideas about how to accomplish this?

You can do the following:
mysqldump database 2> dump_errors | gzip > database.sql
error_var=$( cat dump_errors )
rm dump_errors
Here, all errors by mysqldump are redirected to a file called 'dump_errors', and stdout is piped to gzip, which in turn writes to database.sql.
Contents of 'dump_errors' are then assigned to the variable 'error_var', and the file 'dump_errors' is removed.
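If the fixed temporary filename is a concern, a slightly more robust variant uses mktemp. Here's a sketch, with a stand-in fake_dump function in place of mysqldump:

```shell
# Stand-in for mysqldump: emits a row on stdout and a warning on stderr.
fake_dump() { echo "data"; echo "oops" >&2; }

# mktemp avoids clashes if two runs happen at once.
err_file=$(mktemp)
fake_dump 2> "$err_file" | gzip > database.sql.gz
error_var=$(cat "$err_file")
rm -f "$err_file"

echo "captured: $error_var"
```

Afterwards, gunzip -c database.sql.gz recovers the dump, while $error_var holds everything the command wrote to stderr.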
Note the following redirections:
$ sort 1> output 2> errors # redirects stdout to "output", stderr to "errors"
$ sort 1> output 2>&1 # stderr goes to stdout, stdout writes to "output"
$ sort 2> errors 1>&2 # stdout goes to stderr, stderr writes to "errors"
$ sort 2>&1 > output # tie stderr to screen, redirect stdout to "output"
$ sort > output 2>&1 # redirect stdout to "output", tie stderr to "output"
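The difference between the last two lines is worth seeing in action, since redirections are processed left to right. A small sketch with a stand-in emit function:

```shell
emit() { echo out; echo err >&2; }

# 'emit > f 2>&1': stdout goes to f first, then stderr is pointed at f too.
emit > both.txt 2>&1

# 'emit 2>&1 > f': stderr first duplicates the *current* stdout (here, the
# pipe feeding the command substitution), then stdout alone goes to f.
captured=$(emit 2>&1 > out_only.txt)

echo "captured stderr: $captured"
```

So both.txt gets both streams, while in the second form only stderr reaches the command substitution.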

You could do something like:
errors=$(mysqldump database 2>&1 > >(gzip > database.sql))
Here, I'm using process substitution to get gzip to use mysqldump's output as stdin. Given the order of redirections (2>&1 before >), mysqldump's stderr should now be used for the command substitution.
Testing it out:
$ a=$(sh -c 'echo foo >&2; echo bar' 2>&1 > >(gzip > foo))
$ gunzip < foo
bar
$ echo $a
foo

Related

Is it possible to print only stdout to the screen while writing stdout and stderr to a logfile?

I know that it is possible to redirect both to a specific file:
./command 1> out.log 2> err.log
or
./command 1>test.log 2>&1
to write both to a file. But I don't know a way to write both to the same file (preserving the output order) while printing just one of both. tee isn't very helpful because it prints both file descriptors.
Writing standard output to one file and both to another is fairly simple with tee:
{ cmd | tee stdout.log; } &> both.log
Both descriptors of the compound command are redirected to both.log, but the standard output of cmd is first passed through tee to stdout.log before also being written to both.log.
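A quick way to check this behavior, using a stand-in emit function:

```shell
emit() { echo out; echo err >&2; }

# stdout.log gets only emit's stdout; both.log gets stdout (via tee) and stderr.
{ emit | tee stdout.log; } &> both.log
```

Note that &> both.log opens the file once and shares the offset between fd 1 and fd 2, so the two streams interleave rather than overwrite each other.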
Writing standard error to one and both to another is trickier.
{ foo 2>&1 1>&3 | tee stderr.log ; } 3>&1 | tee both.log > /dev/null
It's a little tricky to describe correctly. First, the command group's standard error is not redirected anywhere special; its standard output is the pipe to tee both.log. But 3>&1 also copies fd 3 to its standard output. So the question is: what gets written to fd 3?
Inside the command group, foo's standard output is initially the pipe to tee stderr.log. The 2>&1 copies foo's standard error to that pipe, and 1>&3 then redirects foo's standard output to the inherited fd 3.
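A test run of this pipeline, again with a stand-in foo:

```shell
foo() { echo out; echo err >&2; }

# stderr.log receives only foo's stderr; both.log receives both streams.
{ foo 2>&1 1>&3 | tee stderr.log ; } 3>&1 | tee both.log > /dev/null
```

Since both tees are pipeline members, the shell waits for them, so the files are complete when the command returns.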
It will be better to write them to different files, in this case it will be very easy:
cmd 2> stderr.log | tee -a stdout.log
But if you want a single file, you will need some tricks and an additional process to handle the redirection.
You can use several redirections:
foo () { echo 1 ; echo 2 >&2 ; }
(( foo | tee >(cat) >&3) &>log ) 3>&1
The tee command sends the stdout to file descriptors 1 (through the process substitution) and 3. Both stdout and stderr are redirected to the log. In the end, 3, the copy of stdout, is sent back to the terminal.
Alternatively, you can do without the process substitution, redirect the output directly to log, but use -a and >> for append. You need to clear the log beforehand.
: > log; (( foo | tee -a log >&3) 2>> log ) 3>&1
To answer the question: stdout to the console, and both stdout and stderr to a file?
{ cmd | tee /dev/tty; } &> /tmp/both.log
To answer the question: stderr to the console, and both stdout and stderr to a file?
{ cmd 3>&1 1>&2 2>&3 | tee /dev/tty; } &> /tmp/both.log
3>&1 1>&2 2>&3 means swapping stdout and stderr
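A quick way to convince yourself the swap works: command substitution captures stdout, so after the swap it sees what the command wrote to stderr. A sketch with a stand-in emit function:

```shell
emit() { echo out; echo err >&2; }

# fd3 saves stdout, stdout is pointed at stderr, stderr at the saved stdout.
# The $(...) therefore captures emit's *stderr*; its real stdout is discarded
# by the outer 2>/dev/null.
captured=$( (emit 3>&1 1>&2 2>&3) 2>/dev/null )
echo "captured: $captured"
```

This prints "captured: err".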

Save StdOut and StdErr to two different files and append them in third one

I would like to save result of my script run with sh.exe on windows into three different files:
stdout to stdout.txt; stderr to stderr.txt and append stdout and stderr to all.txt. I tried to use
foo.sh &> all.txt 2> stderr.txt
or
foo.sh 2>&1 1>logfile | tee -a logfile
but it doesn't even append stderr and stdout.
How can I do it?
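No answer was recorded here, but one possible sketch (bash process substitution, untested with sh.exe on Windows; emit stands in for foo.sh) sends each stream through its own tee, with one copy going to the dedicated file and the other appended to all.txt:

```shell
emit() { echo out; echo err >&2; }

: > all.txt   # start with an empty combined log
emit > >(tee stdout.txt >> all.txt) 2> >(tee stderr.txt >> all.txt)
```

Note the tees run asynchronously, so the relative order of the two streams in all.txt is not guaranteed.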

How do I copy stderr without stopping it writing to the terminal?

I want to write a shell script that runs a command, writing its stderr to my terminal as it arrives. However, I also want to save stderr to a variable, so I can inspect it later.
How can I achieve this? Should I use tee, or a subshell, or something else?
I've tried this:
# Create FD 3 that can be used so stdout still comes through
exec 3>&1
# Run the command, piping stdout to normal stdout, but saving stderr.
{ ERROR=$( "$@" 2>&1 1>&3 ) ; }
echo "copy of stderr: $ERROR"
However, this doesn't write stderr to the console, it only saves it.
I've also tried:
{ "$@"; } 2> >(tee stderr.txt >&2 )
echo "stderr was:"
cat stderr.txt
However, I don't want the temporary file.
I often want to do this, and find myself reaching for /dev/stderr, but there can be problems with this approach; for example, Nix build scripts give "permission denied" errors if they try to write to /dev/stdout or /dev/stderr.
After reinventing this wheel a few times, my current approach is to use process substitution as follows:
myCmd 2> >(tee >(cat 1>&2))
Reading this from the outside in:
This will run myCmd, leaving its stdout as-is. The 2> will redirect the stderr of myCmd to a different destination; the destination here is >(tee >(cat 1>&2)) which will cause it to be piped into the command tee >(cat 1>&2).
The tee command duplicates its input (in this case, the stderr of myCmd) to its stdout and to the given destination. The destination here is >(cat 1>&2), which will cause the data to be piped into the command cat 1>&2.
The cat command just passes its input straight to stdout. The 1>&2 redirects stdout to go to stderr.
Reading from the inside out:
The cat 1>&2 command redirects its stdin to stderr, so >(cat 1>&2) acts like /dev/stderr.
Hence tee >(cat 1>&2) duplicates its stdin to both stdout and stderr, acting like tee /dev/stderr.
We use 2> >(tee >(cat 1>&2)) to get 2 copies of stderr: one on stdout and one on stderr.
We can use the copy on stdout as normal, for example storing it in a variable. We can leave the copy on stderr to get printed to the terminal.
We can combine this with other redirections if we like, e.g.
# Create FD 3 that can be used so stdout still comes through
exec 3>&1
# Run the command, redirecting its stdout to the shell's stdout,
# duplicating its stderr and sending one copy to the shell's stderr
# and using the other to replace the command's stdout, which we then
# capture
{ ERROR=$( "$@" 2> >(tee >(cat 1>&2)) 1>&3 ) ; }
echo "copy of stderr: $ERROR"
Credit goes to @Etan Reisner for the fundamentals of the approach; however, it's better to use tee with /dev/stderr rather than /dev/tty in order to preserve normal behavior (if you send output to /dev/tty, the outside world doesn't see it as stderr output, so it can neither be captured nor suppressed):
Here's the full idiom:
exec 3>&1 # Save original stdout in temp. fd #3.
# Redirect stderr to *captured* stdout, send stdout to *saved* stdout, also send
# captured stdout (and thus stderr) to original stderr.
errOutput=$("$@" 2>&1 1>&3 | tee /dev/stderr)
exec 3>&- # Close temp. fd.
echo "copy of stderr: $errOutput"

how to redirect stdout and stderr to a file while showing stderr to screen?

The script should redirect all the output (stdout and stderr) to a log file, and only display stderr on the screen (notifying the user if an error happens). The command tee may help, but I don't know how to write it.
Thanks.
P.S., thanks lihao and konsolebox for the answers, but is there a way to keep the output in order? For example:
$ cat test.sh
echo "to stdout..1"
echo "to stderr..1" >&2
echo "to stdout..2"
echo "to stderr..2" >&2
$ sh test.sh 2>&1 >test.log | tee -a test.log
to stderr..1
to stderr..2
$ cat test.log
to stdout..1
to stdout..2
to stderr..1
to stderr..2
Command: { sh test.sh 2> >(tee /dev/fd/4); } 4>&1 >test.log has the same output.
how about the following:
cmd args 2>&1 >logfile | tee -a logfile
You should map normal stdout to another file descriptor (4), make the file the default output, then use tee to redirect output to the new file descriptor through /dev/fd. Of course you'd need process substitution to pass stderr output to tee:
{ cmd args 2> >(exec tee /dev/fd/4); } 4>&1 >file
If you want to make a general redirection for the script, place this at the beginning of it:
exec 4>&1 >file 2> >(exec tee /dev/fd/4)
You can restore normal output with:
exec >&4 4>&-
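For instance, a sketch of the whole-script form (the tee runs asynchronously and keeps handling stderr until the script exits):

```shell
# fd4 saves the original stdout; stdout then goes to file.log, and stderr
# goes through tee, which writes to file.log (its inherited stdout) and
# echoes a copy to the original stdout via /dev/fd/4.
exec 4>&1 > file.log 2> >(exec tee /dev/fd/4)
echo "to stdout"
echo "to stderr" >&2
exec >&4 4>&-   # restore normal stdout; file.log now holds both streams
```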

How to redirect stdout+stderr to one file while keeping streams separate?

Redirecting stdout+stderr such that both get written to a file while still outputting to stdout is simple enough:
cmd 2>&1 | tee output_file
But then now both stdout/stderr from cmd are coming on stdout. I'd like to write stdout+stderr to the same file (so ordering is preserved assuming cmd is single threaded) but then still be able to also separately redirect them, something like this:
some_magic_tee_variant combined_output cmd > >(command-expecting-stdout) 2> >(command-expecting-stderr)
So combined_output contains the both with order preserved, but the command-expecting-stdout only gets stdout and command-expecting-stderr only gets stderr. Basically, I want to log stdout+stderr while still allowing stdout and stderr to be separately redirected and piped. The problem with the tee approach is it globs them together. Is there a way to do this in bash/zsh?
From what I understand, this is what you are looking for. First I made a little script that writes to both stdout and stderr. It looks like this:
$ cat foo.sh
#!/bin/bash
echo foo 1>&2
echo bar
Then I ran it like this:
$ ./foo.sh 2> >(tee stderr | tee -a combined) 1> >(tee stdout | tee -a combined)
foo
bar
The results in my bash look like this:
$ cat stderr
foo
$ cat stdout
bar
$ cat combined
foo
bar
Note that the -a flag is required so the tees don't overwrite the other tee's content.
{ { cmd | tee out >&3; } 2>&1 | tee err >&2; } 3>&1
Or, to be pedantic:
{ { cmd 3>&- | tee out >&3 2> /dev/null; } 2>&1 | tee err >&2 3>&- 2> /dev/null; } 3>&1
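A run of the simpler form with a stand-in emit, discarding the on-screen duplicates so only the files remain (a sketch):

```shell
emit() { echo out; echo err >&2; }

# 'out' gets only stdout, 'err' gets only stderr. Redirections apply left
# to right, so after '> /dev/null', the trailing 3>&1 points fd 3 at
# /dev/null too, silencing the duplicates. Both tees are pipeline members,
# so the shell waits for them before moving on.
{ { emit | tee out >&3; } 2>&1 | tee err >&2; } > /dev/null 2> /dev/null 3>&1
```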
Note that it's futile to try and preserve order. It is basically impossible. The only solution would be to modify "cmd" or use some LD_PRELOAD or gdb hack.
Order can indeed be preserved. Here's an example which captures the standard output and error, in the order in which they are generated, to a logfile, while displaying only the standard error on any terminal screen you like. Tweak to suit your needs.
1.Open two windows (shells)
2.Create some test files
touch /tmp/foo /tmp/foo1 /tmp/foo2
3.In window1:
mkfifo /tmp/fifo
</tmp/fifo cat - >/tmp/logfile
4.Then, in window2:
(ls -l /tmp/foo /tmp/nofile /tmp/foo1 /tmp/nofile /tmp/nofile; echo successful test; ls /tmp/nofile1111) 2>&1 1>/tmp/fifo | tee /tmp/fifo 1>/dev/pts/1
Where /dev/pts/1 can be whatever terminal display you want. The subshell runs some "ls" and "echo" commands in sequence, some succeed (providing stdout) and some fail (providing stderr) in order to generate a mingled stream of output and error messages, so that you can verify the correct ordering in the log file.
Here's how I do it:
exec 3>log ; example_command 2>&1 1>&3 | tee -a log ; exec 3>&-
Worked Example
bash$ exec 3>log ; { echo stdout ; echo stderr >&2 ; } 2>&1 1>&3 | \
tee -a log ; exec 3>&-
stderr
bash$ cat log
stdout
stderr
Here's how that works:
exec 3>log sets up file descriptor 3 to redirect into the file called log, until further notice.
example_command to make this a working example, I used { echo stdout ; echo stderr >&2 ; }. Or you could use ls /tmp doesnotexist to provide output instead.
We need to jump ahead to the pipe | at this point, because bash sets it up first. The pipe redirects file descriptor 1 into itself, so now STDOUT is going into the pipe.
Now we can go back to where we were next in our left-to-right interpretation: 2>&1 this says errors from the program are to go to where STDOUT currently points, i.e. into the pipe we just set up.
1>&3 means STDOUT is redirected into file descriptor 3, which we earlier set up to output to the log file. So STDOUT from the command just goes into the log file, not to the terminal's STDOUT.
tee -a log takes its input from the pipe (which, you'll remember, is now carrying the errors from the command), writes it to STDOUT, and also appends it to the log file.
exec 3>&- closes the file descriptor 3.
Victor Sergienko's comment is what worked for me; adding exec to the front makes this work for the entire script (instead of having to put it after individual commands):
exec 2> >(tee -a output_file >&2) 1> >(tee -a output_file)
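For example, at the top of a script (a sketch; the tees run asynchronously, so give them a moment to flush):

```shell
: > output_file   # start clean; the tees append (-a)

# From here on, stdout and stderr each pass through their own tee:
# both are appended to output_file, while each still reaches its
# original destination.
exec 2> >(tee -a output_file >&2) 1> >(tee -a output_file)
echo "hello stdout"
echo "hello stderr" >&2
```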
