save command output to variable in bash script [duplicate]

I'm writing a script to back up a database. I have the following line:
mysqldump --user=$dbuser --password=$dbpswd \
--host=$host $mysqldb | gzip > $filename
I want to capture stderr in a variable, so that the script can send me an email letting me know what happened if something goes wrong. I've found solutions that redirect stderr to stdout, but I can't do that as the stdout is already being sent (via gzip) to a file. How can I separately store stderr in a variable $result?

Try redirecting stderr to stdout and using $() to capture that. In other words:
VAR=$( (your-command-including-redirect) 2>&1 )
Note the space after $( : without it, bash can mistake $(( for the start of an arithmetic expansion. Since your command redirects stdout somewhere, it shouldn't interfere with stderr. There might be a cleaner way to write it, but that should work.
Edit:
This really does work. I've tested it:
#!/bin/bash
BLAH=$( (
    (
        echo out >&1
        echo err >&2
    ) 1>log
) 2>&1 )
echo "BLAH=$BLAH"
will print BLAH=err and the file log contains out.
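Applied to the mysqldump line from the question, a sketch might look like this (variable names are the ones from the question; the quoting is added for safety):
result=$( (mysqldump --user="$dbuser" --password="$dbpswd" \
    --host="$host" "$mysqldb" | gzip > "$filename") 2>&1 )
stdout still flows through gzip into $filename, while anything either command writes to stderr ends up in $result.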

For any generic command in Bash, you can do something like this:
{ error=$(command 2>&1 1>&$out); } {out}>&1
Regular output appears normally; anything to stderr is captured in $error (quote it as "$error" when using it, to preserve newlines). To capture stdout to a file, just add a redirection at the end, for example:
{ error=$(ls /etc/passwd /etc/bad 2>&1 1>&$out); } {out}>&1 >output
Breaking it down, reading from the outside in, it:
- creates a file descriptor $out for the whole block, duplicating stdout
- captures the stdout of the whole command in $error (but see below)
- the command itself redirects stderr to stdout (which gets captured above), then stdout to the original stdout from outside the block, so only the stderr gets captured
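Tying this back to the question: once the stderr is in "$error", mailing it only when something actually went wrong is straightforward (the address below is a placeholder):
if [ -n "$error" ]; then
    mail -s "backup failed" you@example.com <<< "$error"
fi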

You can save a reference to the original stdout in another file descriptor (e.g. 3) and then redirect stderr to that. One caveat: redirections written on mysqldump itself are processed after the pipe is set up, so the 3>&1 has to go on a surrounding group, where file descriptor 1 still refers to the command substitution:
result=$( { mysqldump --user=$dbuser --password=$dbpswd \
    --host=$host $mysqldb 2>&3 | gzip > $filename; } 3>&1 )
So 3>&1 on the brace group duplicates the substitution's stdout into file descriptor 3 (notice this happens before stdout is redirected by the pipe inside). Then 2>&3 redirects mysqldump's stderr to file descriptor 3, which is still the substitution's stdout, even though mysqldump's own stdout now feeds the pipe. Finally, gzip's stdout is redirected into the file (notice that redirecting stdout from gzip is unrelated to the outputs of the mysqldump command), so the only thing captured in $result is mysqldump's stderr.
Edit: Updated the command to redirect stderr from the mysqldump command and not gzip; I was too quick in my first answer.
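A quick way to check the plumbing, using a hypothetical stand-in that writes one line to each stream:
result=$( { sh -c 'echo data; echo oops >&2' 2>&3 | gzip > /tmp/data.gz; } 3>&1 )
echo "$result"      # prints: oops
zcat /tmp/data.gz   # prints: data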

dd writes its data to stdout and its status report to stderr:
$ dd if=/dev/zero count=50 > /dev/null
50+0 records in
50+0 records out
The two streams are independent and separately redirectable:
$ dd if=/dev/zero count=50 2> countfile | wc -c
25600
$ cat countfile
50+0 records in
50+0 records out
$ mail -s "countfile for you" thornate < countfile
If you really need a variable:
$ variable=$(cat countfile)

Related

Piping perf output into a file [duplicate]

I know that in Linux, to redirect output from the screen to a file, I can use either > or tee. However, I'm not sure why part of the output is still printed to the screen and not written to the file.
Is there a way to redirect all output to file?
That part is written to stderr, use 2> to redirect it. For example:
foo > stdout.txt 2> stderr.txt
or if you want both in the same file:
foo > allout.txt 2>&1
Note: this works in (ba)sh; check your shell for the proper syntax.
All POSIX operating systems have 3 streams: stdin, stdout, and stderr. stdin is the input stream; it can be fed from a file or from another command's stdout. stdout is the primary output, which is redirected with >, >>, or |. stderr is the error output; it is handled separately so that error messages don't get piped into another command or written into a data file and break it. Normally it goes to the terminal, even when stdout is redirected. To redirect both to the same place, use:
$command &> /some/file
EDIT: thanks to Zack for pointing out that the above solution is not portable; use this instead:
$command > file 2>&1
If you want to silence the error, do:
$command 2> /dev/null
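One pitfall with the portable form: the order of the redirections matters, because 2>&1 duplicates whatever stdout points at in that moment:
$command 2>&1 > file   # stderr still goes to the terminal
$command > file 2>&1   # both streams end up in file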
To get the output both on the console and in a file (file.txt, for example):
make 2>&1 | tee file.txt
Note: & (in 2>&1) specifies that 1 is not a file name but a file descriptor.
Use this: "your command here" > log_file_name 2>&1
Detailed description of the redirection operators in Unix/Linux.
The > operator redirects the output, usually to a file, but it can also be to a device. You can use >> to append instead.
If you don't specify a number, then the standard output stream is assumed, but you can also redirect errors:
> file redirects stdout to file
1> file redirects stdout to file
2> file redirects stderr to file
&> file redirects stdout and stderr to file
/dev/null is the null device; it takes any input you give it and throws it away. It can be used to suppress any output.
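A quick demonstration of the operators above (file names are hypothetical):
echo hello > out.txt          # stdout to file (truncates)
echo again >> out.txt         # stdout appended to the file
ls /no/such/dir 2> err.txt    # stderr to file
make &> all.txt               # both streams to file (bash)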
Credits to osexp2003 and j.a. …
Instead of putting:
&>> your_file.log
after a command line in:
crontab -e
I use:
#!/bin/bash
exec &>> your_file.log
…
at the beginning of a Bash script.
Advantage: You have the log definitions within your script. Good for Git etc.
You can use the exec command to redirect all stdout/stderr output of any later commands.
sample script:
exec 2> your_file2 > your_file1
your other commands.....
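A minimal sketch of how that behaves (file names are hypothetical):
#!/bin/bash
exec 2>err.log >out.log
echo "this line lands in out.log"
ls /no/such/path    # this error message lands in err.log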
It might be the standard error. You can redirect it:
... > out.txt 2>&1
Command:
foo >> output.txt 2>&1
appends both streams to output.txt, without replacing the existing content.
Use >> to append:
command >> file

Piping both stdout and stderr in bash?

It seems that newer versions of bash have the &> operator, which (if I understand correctly) redirects both stdout and stderr to a file (&>> appends to the file instead, as Adrian clarified).
What's the simplest way to achieve the same thing, but instead piping to another command?
For example, in this line:
cmd-doesnt-respect-difference-between-stdout-and-stderr | grep -i SomeError
I'd like the grep to match on content both in stdout and stderr (effectively, have them combined into one stream).
Note: this question is asking about piping, not redirecting - so it is not a duplicate of the question it's currently marked as a duplicate of.
(Note that &>>file appends to a file while &> would redirect and overwrite a previously existing file.)
To combine stdout and stderr you redirect the latter to the former using 2>&1, i.e. you redirect stderr (file descriptor 2) to stdout (file descriptor 1). In the examples below, echo "stderr" 1>&2 does the reverse, redirecting echo's stdout to stderr so that a test line lands on stderr. First, without combining:
$ { echo "stdout"; echo "stderr" 1>&2; } | grep -v std
stderr
$
stdout goes to stdout, stderr goes to stderr. grep only sees stdout, hence stderr prints to the terminal.
On the other hand:
$ { echo "stdout"; echo "stderr" 1>&2; } 2>&1 | grep -v std
$
After writing to both stdout and stderr, 2>&1 redirects stderr back to stdout and grep sees both strings on stdin, thus filters out both.
You can read more about redirection in the Bash manual.
Regarding your example (POSIX):
cmd-doesnt-respect-difference-between-stdout-and-stderr 2>&1 | grep -i SomeError
or, using bash >= 4:
cmd-doesnt-respect-difference-between-stdout-and-stderr |& grep -i SomeError
Bash has a shorthand for 2>&1 |, namely |&, which pipes both stdout and stderr (see the manual):
cmd-doesnt-respect-difference-between-stdout-and-stderr |& grep -i SomeError
This was introduced in Bash 4.0; see the release notes.

How to echo a particular line alone to stdout from a script, if the script's stdout is redirected to a file during execution?

Consider ./my_script >/var/log/my_log
A single echo statement from this script alone must go to stdout.
How could this be accomplished?
So we have a small, clever program:
cat print2stdout
#!/bin/sh
echo some words secret and sent to null
echo some words to stdout > /dev/fd/3
The last line sends the echo to file descriptor 3, which the script expects its caller to have opened.
When invoking, we map file descriptor 3 to stdout and then redirect stdout to a file (here /dev/null).
The result looks like this:
./print2stdout 3>&1 >/dev/null
some words to stdout
Just use /dev/tty, which points to your terminal regardless of redirections.
#!/bin/sh
echo this line goes to the possibly redirected stdout
echo this line shows up on the screen > /dev/tty
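For example, running it with stdout redirected, as in the question (a hypothetical session):
$ ./my_script >/var/log/my_log
this line shows up on the screen
Note that this relies on the script being attached to a terminal; in a cron job, for instance, opening /dev/tty would fail.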

Redirecting stdout/stderr to multiple files

I was wondering how to redirect stderr to multiple outputs. I tried it with this script, but I couldn't get it to work quite right. The first file should have both stdout and stderr, and the 2nd should just have errors.
perl script.pl &> errorTestnormal.out &2> errorTest.out
Is there a better way to do this? Any help would be much appreciated. Thank you.
perl script.pl 2>&1 >errorTestnormal.out | tee -a errorTestnormal.out > errorTest.out
Will do what you want.
This is a bit messy; let's go through it step by step.
We say what used to go to STDERR will now go to STDOUT (2>&1).
We say what used to go to STDOUT will now go to errorTestnormal.out (>errorTestnormal.out).
So now, STDOUT gets printed to a file, and STDERR gets printed to STDOUT. We want to put STDERR into 2 different files, which we can do with tee. tee appends the text it's given to a file, and also echoes it to STDOUT.
We use tee to append to errorTestnormal.out, so it now contains all the STDOUT and STDERR output of script.pl.
Then we write the STDOUT of tee (which contains the STDERR from script.pl) into errorTest.out.
After this, errorTestnormal.out has all the STDOUT output and then all the STDERR output. errorTest.out contains only the STDERR output.
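A quick way to verify, with a hypothetical stand-in for script.pl that writes one line to each stream:
sh -c 'echo out; echo err >&2' 2>&1 >errorTestnormal.out | tee -a errorTestnormal.out > errorTest.out
cat errorTestnormal.out   # typically: out, then err
cat errorTest.out         # err only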
I had to mess around with this for a while as well. In order to get stderr into both files while putting stdout into only a single file (e.g. stderr into errors.log and output.log, and stdout into just output.log), AND in the order that they happen, this command is better:
((sh test.sh 2>&1 1>&3 | tee errors.log) 3>&1 | tee output.log) > /dev/null 2>&1
The last > /dev/null 2>&1 can be omitted if you want the stdout and stderr to still be output to the screen.
I guess that with the second ">" you were trying to send the error output of errorTestnormal.out (and not that of script.pl) to errorTest.out.
