Piping both stdout and stderr in bash?

It seems that newer versions of bash have the &> operator, which (if I understand correctly), redirects both stdout and stderr to a file (&>> appends to the file instead, as Adrian clarified).
What's the simplest way to achieve the same thing, but instead piping to another command?
For example, in this line:
cmd-doesnt-respect-difference-between-stdout-and-stderr | grep -i SomeError
I'd like the grep to match on content both in stdout and stderr (effectively, have them combined into one stream).
Note: this question is asking about piping, not redirecting - so it is not a duplicate of the question it's currently marked as a duplicate of.

(Note that &>>file appends to a file while &> would redirect and overwrite a previously existing file.)
To combine stdout and stderr you redirect the latter to the former using 2>&1. In the examples below, the echo "stderr" 1>&2 deliberately does the reverse for a single command: it redirects echo's stdout (file descriptor 1) to stderr (file descriptor 2), so that the string is actually written on stderr. First, without combining:
$ { echo "stdout"; echo "stderr" 1>&2; } | grep -v std
stderr
$
stdout goes to stdout and stderr goes to stderr: grep only sees the stdout line (and filters it out), while "stderr" bypasses the pipe and prints straight to the terminal.
On the other hand:
$ { echo "stdout"; echo "stderr" 1>&2; } 2>&1 | grep -v std
$
After writing to both stdout and stderr, 2>&1 redirects stderr back to stdout and grep sees both strings on stdin, thus filters out both.
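Note that the order of redirections matters: 2>&1 duplicates whatever stdout points to at that moment. A minimal sketch to illustrate (out.log is just a scratch file for the example):
{ echo out; echo err >&2; } 2>&1 > out.log   # "err" still hits the terminal
{ echo out; echo err >&2; } > out.log 2>&1   # both lines end up in out.log
In the first line, fd 2 is duplicated from fd 1 while fd 1 still points at the terminal; only afterwards is fd 1 redirected to the file.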
You can read more about redirection in the Redirections section of the Bash manual.
Regarding your example (POSIX):
cmd-doesnt-respect-difference-between-stdout-and-stderr 2>&1 | grep -i SomeError
or, using >=bash-4:
cmd-doesnt-respect-difference-between-stdout-and-stderr |& grep -i SomeError

Bash has a shorthand for 2>&1 |, namely |&, which pipes both stdout and stderr (see the manual):
cmd-doesnt-respect-difference-between-stdout-and-stderr |& grep -i SomeError
This was introduced in Bash 4.0, see the release notes.
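If a script may also run under pre-4.0 shells, a small hedged sketch that picks the right form (cmd stands in for your real command):
if (( BASH_VERSINFO[0] >= 4 )); then
    cmd |& grep -i SomeError        # Bash 4+ shorthand
else
    cmd 2>&1 | grep -i SomeError    # portable equivalent
fi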


save command output to variable in bash script [duplicate]

I'm writing a script to backup a database. I have the following line:
mysqldump --user=$dbuser --password=$dbpswd \
--host=$host $mysqldb | gzip > $filename
I want to assign the stderr to a variable, so that it will send an email to myself letting me know what happened if something goes wrong. I've found solutions to redirect stderr to stdout, but I can't do that as the stdout is already being sent (via gzip) to a file. How can I separately store stderr in a variable $result ?
Try redirecting stderr to stdout and using $() to capture that. In other words:
VAR=$( (your-command-including-redirect) 2>&1 )
(Note the space after the first $( : without it, $(( would be parsed as the start of an arithmetic expansion.)
Since your command redirects stdout somewhere, it shouldn't interfere with stderr. There might be a cleaner way to write it, but that should work.
Edit:
This really does work. I've tested it:
#!/bin/bash
BLAH=$( (
    (
        echo out >&1    # stdout, redirected into the file "log"
        echo err >&2    # stderr, which the outer 2>&1 turns into captured stdout
    ) 1>log
) 2>&1 )
echo "BLAH=$BLAH"
will print BLAH=err and the file log contains out.
For any generic command in Bash, you can do something like this:
{ error=$(command 2>&1 1>&$out); } {out}>&1
Regular output appears normally; anything written to stderr is captured in $error (quote it as "$error" when using it, to preserve newlines). To send stdout to a file instead, add the file redirection before the {out} duplication, so that $out duplicates the file rather than the terminal, for example:
{ error=$(ls /etc/passwd /etc/bad 2>&1 1>&$out); } >output {out}>&1
Breaking it down, reading from the outside in, it:
creates a file descriptor $out for the whole block, duplicating stdout
captures the stdout of the whole command in $error (but see below)
the command itself redirects stderr to stdout (which gets captured above) then stdout to the original stdout from outside the block, so only the stderr gets captured
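Applied to the mysqldump/gzip case from the question, a sketch (the mail invocation and address are only illustrative):
{ error=$( { mysqldump --user="$dbuser" --password="$dbpswd" \
    --host="$host" "$mysqldb" | gzip > "$filename"; } 2>&1 1>&$out ); } {out}>&1
if [[ -n "$error" ]]; then
    mail -s "backup failed" admin@example.com <<< "$error"
fi
Here the 2>&1 is applied to the whole { mysqldump | gzip; } group, so $error collects the stderr of both commands while the compressed data still goes to $filename.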
You can save a reference to the original stdout in another file descriptor (e.g. 3) before the pipe takes it over, and then redirect stderr to that. One subtlety: in a pipeline the pipe is connected before a command's own redirections are performed, so the 3>&1 has to be placed on a subshell wrapping the pipeline, where stdout is still the command substitution's capture:
result=$( ( mysqldump --user="$dbuser" --password="$dbpswd" \
    --host="$host" "$mysqldb" 2>&3 | gzip > "$filename" ) 3>&1 )
The outer 3>&1 duplicates the command substitution's stdout into file descriptor 3. Inside, 2>&3 then sends mysqldump's stderr to that descriptor, i.e. into $result, while its stdout flows into the pipe to gzip as usual (the redirection of gzip's stdout into $filename is unrelated to mysqldump's streams).
Edit: Updated the command to redirect stderr from the mysqldump command and not gzip, I was too quick in my first answer.
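A quick way to convince yourself of the wiring, using a throwaway stand-in for mysqldump (all names here are made up):
result=$( ( { echo data; echo oops >&2; } 2>&3 | gzip > /tmp/out.gz ) 3>&1 )
echo "$result"      # prints: oops
zcat /tmp/out.gz    # prints: data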
dd writes both stdout and stderr:
$ dd if=/dev/zero count=50 > /dev/null
50+0 records in
50+0 records out
the two streams are independent and separately redirectable:
$ dd if=/dev/zero count=50 2> countfile | wc -c
25600
$ cat countfile
50+0 records in
50+0 records out
$ mail -s "countfile for you" thornate < countfile
if you really needed a variable:
$ variable=$(cat countfile)
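If the temporary file bothers you, the same capture can be done in one step by rearranging the streams inside a command substitution (a sketch, not part of the original answer):
variable=$(dd if=/dev/zero count=50 2>&1 >/dev/null)
Inside the $(), stdout is the capture pipe, so 2>&1 sends dd's status report into the variable, and >/dev/null then discards the actual zero bytes.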

Prefix for command output

From this question I learned how to add a prefix to each output of a command:
command | sed "s/^/[prefix] /"
But this only adds the prefix for each line from stdout.
I successfully used the following to add the prefix also to stderr output.
command 2>&1 | sed "s/^/[prefix] /"
But this sends the result to stdout only.
How can I prefix any output of command while pushing the lines to the previous output (preserving both stdout and stderr)?
As a combination of iBug's answer and this and especially this answer, I came up with a one-liner that uses temporary file descriptors:
command 1> >(sed "s/^/[prefix]/") 2> >(sed "s/^/[prefix]/" >&2)
Or as a function:
function prefix_cmd {
    local PREF="${1//\//\\/}"   # escape any / in the prefix so it is safe inside sed's s///
    shift
    local CMD=("$@")
    "${CMD[@]}" 1> >(sed "s/^/${PREF}/") 2> >(sed "s/^/${PREF}/" 1>&2)
}
prefix_cmd "prefix" command
You can only pipe stdout using the shell pipe syntax. You need two pipes if you want to process stdout and stderr separately. A named pipe may work here.
Here's a sample script that demonstrates the solution
#!/bin/bash
PREF="$1"
shift
NPOUT=pipe.out
NPERR=pipe.err
mkfifo $NPOUT $NPERR
# Make two background sed processes
sed "s/^/$PREF/" <$NPOUT &
sed "s/^/$PREF/" <$NPERR >&2 &
# Run the program
"$#" >$NPOUT 2>$NPERR
rm $NPOUT $NPERR
Usage:
./foo.sh "[prefix] " command -options
It will feed command with its stdin and send command's stdout and stderr to its stdout and stderr separately.
Note I didn't suppress sed's stderr, which may interfere with the output. You can do so like this:
sed "s/^/$PREF/" <$NPOUT 2>/dev/null &
^^^^^^^^^^^
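Two small hardening ideas for the script above (a sketch; NPDIR is a name I made up): create the FIFOs in a unique temporary directory and remove them even if command fails:
NPDIR=$(mktemp -d)
NPOUT=$NPDIR/pipe.out
NPERR=$NPDIR/pipe.err
trap 'rm -rf "$NPDIR"' EXIT   # clean up on any exit, not just success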

forward stdin to stdout

I'm looking for a way to "forward" stdin to stdout in a pipe, while in that step something is written to stderr. The example should clarify this:
echo "before.." | >&2 echo "some logging..."; [[forward stdin>stdout]] | cat
This should put "before.." to stdout, meanwhile "some logging..." to stderr.
How to do that? Or is there maybe another quite different approach to this?
Here's a solution based on your comments:
cat ~/.bashrc | tee >( cat -n >&2 ) | sort
cat ~/.bashrc represents the start of your pipeline, producing some data.
tee duplicates its input, writing to both stdout and any files listed as arguments.
>( ... ) is a bash construct that runs ... as a pipe subcommand but replaces itself by a filename (something that tee can open and write to).
cat -n represents modifying the input (adding line numbers).
>&2 redirects stdout to stderr.
sort represents the end of your pipeline (normal processing of the unchanged input).
Putting it all together, bash will
run cat ~/.bashrc, putting the contents of ~/.bashrc on stdout
... which is piped to the stdin of tee
run cat -n with stdout redirected to stderr and stdin redirected to a new pipe
run tee /dev/fd/63 (where /dev/fd/63 represents the other end of the cat -n pipe)
this is where it all comes together: tee reads its input and writes it to both its stdout and to the other pipe that goes to cat -n (and from there to stderr)
finally tee's stdout goes into sort
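A minimal run you can try (the two input lines are arbitrary):
printf 'b\na\n' | tee >( cat -n >&2 ) | sort
stdout receives the sorted lines (a, b) while stderr receives the numbered originals (1 b, 2 a); since the two streams are independent, how they interleave on your terminal may vary.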
Redirections follow the simple command they refer to, thus
echo "before" >&1
echo "some logging..." >&2
should do the trick, if I understand what you're trying to do.
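If you do want a reusable pipeline stage that forwards stdin untouched while logging to stderr, a sketch using cat:
echo "before.." | { echo "some logging..." >&2; cat; } | cat
The group writes the log line to stderr, and then cat copies stdin through to stdout, so "before.." travels down the pipe while "some logging..." appears on stderr.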

Redirecting stdout/stderr to multiple files

I was wondering how to redirect stderr to multiple outputs. I tried it with this script, but I couldn't get it to work quite right. The first file should have both stdout and stderr, and the 2nd should just have errors.
perl script.pl &> errorTestnormal.out &2> errorTest.out
Is there a better way to do this? Any help would be much appreciated. Thank you.
perl script.pl 2>&1 >errorTestnormal.out | tee -a errorTestnormal.out > errorTest.out
Will do what you want.
This is a bit messy, so let's go through it step by step.
With 2>&1 we say that what used to go to STDERR will now go to STDOUT, which at this point is the pipe to tee.
With >errorTestnormal.out we say that what used to go to STDOUT will now go to errorTestnormal.out.
So now STDOUT gets printed to a file, and STDERR gets sent down the pipe. We want to put STDERR into 2 different files, which we can do with tee: tee appends the text it's given to a file, and also echoes it to STDOUT.
We use tee to append to errorTestnormal.out, so it now contains all the STDOUT and STDERR output of script.pl.
Then we write the STDOUT of tee (which carries STDERR from script.pl) into errorTest.out.
After this, errorTestnormal.out has all the STDOUT and all the STDERR output (the two streams may interleave in arbitrary order), and errorTest.out contains only the STDERR output.
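You can sanity-check the construct with a stand-in for script.pl (the file names are just examples):
{ echo out; echo err >&2; } 2>&1 >both.log | tee -a both.log > errs.log
cat both.log   # out and err (order may vary)
cat errs.log   # err only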
I had to mess around with this for a while as well. In order to get stderr in both files, while only putting stdout into a single file (e.g. stderr into errors.log and output.log and then stdout into just output.log) AND in the order that they happen, this command is better:
((sh test.sh 2>&1 1>&3 | tee errors.log) 3>&1 | tee output.log) > /dev/null 2>&1
The last > /dev/null 2>&1 can be omitted if you want the stdout and stderr to still be output onto the screen.
I guess that with the second ">" you are trying to send the error output of the first redirection to errorTestnormal.out (and not that of script.pl) to errorTest.out, which is why it didn't work.

How to connect stderr to stdin using pipes?

The "|" pipe operator connects the stdout of one process to the stdin of another. Is there any way to create a pipe that connects the stderr of one process to the stdin of another keeping the stdout alive in my terminal? Searching on the internet gave me no information at all...
Thank you in advance,
Michalis.
If you're happy to mix stdout and stderr, then you can first redirect stderr to stdout and then pipe that:
theprogram 2>&1 | otherprogram
If you don't want stdout, you can kill that one:
theprogram 2>&1 1> /dev/null | otherprogram
If you do want to keep the original stdout as well, then you have to redirect it either to a file (in place of /dev/null), or to another file descriptor that you opened previously with exec; see the sketch below.
(Unfortunately there is no direct "pipe this file descriptor" syntax like 2|. That would have been handy.)
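For completeness, here is a sketch of that file-descriptor variant: a subshell's 3>&1 saves the original stdout before the pipe is connected, so theprogram's stdout still reaches the terminal while only its stderr is piped:
( theprogram 2>&1 1>&3 | otherprogram ) 3>&1
Note that otherprogram's own stdout also ends up on the terminal, mixed with theprogram's.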
You can get this effect with bash's process substitution feature:
somecommand 2> >(errorprocessor)
You could use named pipes:
mkfifo /my/pipe
error-handler </my/pipe &
do-something 2>/my/pipe
This should keep stdin & stdout of "do-something" in your terminal and redirect stderr to /my/pipe, which is read by "error-handler".
(I hope this works; I have no bash at hand to test it.)
You may also swap the stdout and stderr streams (i.e. stdout becomes the new stderr and stderr becomes the new stdout):
# after the swap, the original stderr goes through the pipe and gets upcased
ls -ld / xxx ~/.bashrc yyy 3>&1 1>&2 2>&3 3>&- | tr '[[:lower:]]' '[[:upper:]]'
# block original stdout by closing fd 1
ls -ld / xxx ~/.bashrc yyy 2>&1 1>&- | tr '[[:lower:]]' '[[:upper:]]'
