I have the following two bash scripts:
one.bash:
#!/bin/bash
echo "I don't control this line of output to stdout"
echo "I don't control this line of output to stderr" >&2
echo "I do control this line of output to fd 5" >&5
callone.bash:
#!/bin/bash
# here I try to merge stdout and stderr into stderr.
# then direct fd5 into stdout.
bash ./one.bash 1>&2 5>&1
When I run it like this:
bash callone.bash 2>stderr.txt >stdout.txt
The stderr.txt file looks like this:
I don't control this line of output to stdout
I don't control this line of output to stderr
I do control this line of output to fd 5
and stdout is empty.
I would like the "do control" line to be output to only stdout.txt.
The restrictions on making changes are:
I can change anything in callone.bash.
I can change the line in one.bash that I control.
I can add an exec in one.bash related to file descriptor 5.
I have to run the script as indicated.
[EDIT] The use case for this is: I have a script that does all kinds of running of other scripts that can output to stderr and stdout. But I need to ensure that the user only sees the well controlled message. So I send the well controlled message to fd5, and everything else (stdout & stderr) is sent to the log.
Redirections are processed in order, left to right.
Once 1>&2 has run, fd 1 points to the same place as fd 2.
So when 5>&1 then runs, fd 5 is duplicated from where fd 1 points now (not from where it pointed originally).
You need to swap the two redirections:
bash ./one.bash 5>&1 1>&2
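A minimal, self-contained reproduction of the fix (the script bodies are trimmed to short messages, but the redirections match the ones above):

```shell
#!/bin/bash
# Recreate both scripts with the corrected redirection order.
cat > one.bash <<'EOF'
#!/bin/bash
echo "uncontrolled stdout"
echo "uncontrolled stderr" >&2
echo "controlled fd 5" >&5
EOF

cat > callone.bash <<'EOF'
#!/bin/bash
# 5>&1 first: fd 5 duplicates the ORIGINAL stdout.
# 1>&2 second: only then is stdout pointed at stderr.
bash ./one.bash 5>&1 1>&2
EOF

bash callone.bash 2>stderr.txt >stdout.txt
cat stdout.txt   # only the controlled fd-5 line
```

With the original order (1>&2 5>&1), fd 5 would have duplicated the already-redirected stdout, and all three lines would land in stderr.txt.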
Related
I'd like to know if there's a way in bash to have the current command statement printed to both stdout and a log file, in addition to the command's output. For example:
runme.sh:
# do some setup, e.g. create script.log
echo "Fully logged command"
Would write the following to stdout and to script.log:
+ echo 'Fully logged command'
Fully logged command
For example, if I use these lines early in the script:
set -x
exec > >(tee -ai script.log)
This shows the trace output from set -x on the terminal, but it never reaches the log file.
I have done a bit of testing, and it appears that set -x prints its messages to stderr. This means that you need to redirect stderr to stdout and pipe stdout to tee.
So if you are doing this:
set -x
exec > >(tee -ai output.log) 2>&1
... you are neatly getting everything that Bash executes in your log file as well, together with any output produced by any commands that you are executing.
But beware: any formatting that your programs may apply is lost.
As a side note, as has been explained in some answers here, any pipes are created before any redirections take effect. So when you are redirecting stderr to a piped stdout, that is also going to wind up in said pipe.
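Putting the pieces together, here is a minimal, runnable sketch of such a logging prologue (the file names are examples). Note the order: stdout is redirected into tee first, then stderr is duplicated onto it, so the set -x trace is captured too:

```shell
#!/bin/bash
# Write a small script that logs everything it does, then run it.
cat > runme.sh <<'EOF'
#!/bin/bash
# stdout -> tee (terminal + log); then stderr -> same place,
# so the set -x trace (which goes to stderr) is logged as well.
exec > >(tee -ai script.log) 2>&1
set -x
echo "Fully logged command"
EOF

bash runme.sh
sleep 1   # give the background tee process a moment to flush
cat script.log
```

The log ends up containing both the traced command line (prefixed with +) and its output.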
Terminals support ANSI color escapes, e.g. \033[31m switches to red and \033[0m switches back to the default color.
I would like to make a small bash wrapper that reliably prints stderr in red, i.e. it should put \033[31m before and \033[0m after everything that comes from stderr.
I'm not sure this is even possible, because when two parallel processes (or even a single process) write to both stdout and stderr, there would have to be a way to distinguish the two streams on a character-by-character basis.
Colorizing text is simple enough: read each line and echo it with the appropriate escape sequences at the beginning and end. But colorizing standard error is trickier, because a pipe only carries standard output, not standard error.
Here’s one approach that works by swapping standard error and standard output, then filtering standard output.
Here is our test command:
#!/bin/bash
echo hi
echo 'Error!' 1>&2
And the wrapper script:
#!/bin/bash
( # swap stderr and stdout
    exec 3>&1   # save a copy of stdout on fd 3
    exec 1>&2   # point stdout at stderr
    exec 2>&3-  # move the saved stdout from fd 3 into fd 2, closing fd 3
    "${@}"
) | while read -r line; do
    echo -e "\033[31m${line}\033[0m"
done
Then:
$ ./wrapper ./test-command
hi
Error! # <- shows up red
Unfortunately, all regular output from the wrapped command now comes out of stderr, not stdout, so you can't pipe the output into any further scripts. You could probably get around this by creating a temporary FIFO... but hopefully this little wrapper script is enough to meet your needs.
Based on andrewdotn's wrapper
Changes:
Puts the stderr output back to stderr
Avoid echo -e processing content in the lines
wrapper
#!/bin/bash
"${#}" 2> >(
while read line; do
echo -ne "\033[31m" 1>&2
echo -n "${line}" 1>&2
echo -e "\033[0m" 1>&2
done
)
Issues:
The output lines end up grouped, rather than mixed stdout/stderr
Test script:
#!/bin/bash
echo Hi
echo "\033[32mStuff"
echo message
echo error 1>&2
echo message
echo error 1>&2
echo message
echo error 1>&2
Output:
Hi
\033[32mStuff
message
message
message
error # <- shows up red
error # <- shows up red
error # <- shows up red
Is it possible, within a bash script, to make all output (except the output I specifically print with echo) go to a log file, but have any errors show up in the terminal as well (and in the log file too, of course)?
Here is what you can do by using an additional file descriptor:
#!/bin/bash
# open fd=3 redirecting to 1 (stdout)
exec 3>&1
# redirect stdout/stderr to a file but show stderr on terminal
exec >file.log 2> >(tee >(cat >&3))
# function echo to show echo output on terminal
echo() {
    # call the real echo command and send its output to fd 3 (the terminal)
    command echo "${@}" >&3
}
# script starts here
echo "show me"
printf "=====================\n"
printf "%s\n" "hide me"
ls foo-foo
date
tty
echo "end of run"
# close fd=3
exec 3>&-
After you run your script, it will display the following on the terminal:
show me
ls: cannot access 'foo-foo': No such file or directory
end of run
If you do cat file.log then it shows:
=====================
hide me
ls: cannot access 'foo-foo': No such file or directory
Fri Dec 2 14:20:47 EST 2016
/dev/ttys002
On the terminal we get only the output of the echo command plus all the errors.
In the log file we get the errors along with the remaining output from the script.
UNIX programs are given two output file descriptors, stdout and stderr, both of which go to the terminal by default.
Well behaved programs send their "standard" output to stdout, and errors to stderr. So for example echo writes to stdout. grep writes matching lines to stdout, but if something goes wrong, for example a file can't be read, the error goes to stderr.
You can redirect these with > (for stdout) and 2> (for stderr). So:
myscript >log 2>errors
Writes output to log and errors to errors.
So part of your requirement can be met simply with:
command >log
... errors will continue to go to the terminal, via stderr.
Your extra requirement is "except the output i specifically output with echo".
It might be enough for you that your echos go to stderr:
echo "Processing next part" >&2
The >&2 redirects stdout from this command to stderr. This is the standard way of outputting errors (and sometimes informational output) in shell scripts.
If you need more than this, you might want to do something more complicated with more file descriptors. Try: https://unix.stackexchange.com/questions/18899/when-would-you-use-an-additional-file-descriptor
Well behaved UNIX programs tend to avoid doing complicated things with extra file descriptors. The convention is to restrict yourself to stdout and stderr, with any further outputs being specified as filenames in the command line parameters.
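As a sketch of that convention, assuming a hypothetical myscript that sends informational messages to stderr and ordinary output to stdout (here stderr is captured to a messages file so the split is visible; in normal use it would simply stay on the terminal):

```shell
#!/bin/bash
# Hypothetical example script following the stdout/stderr convention.
cat > myscript <<'EOF'
#!/bin/sh
echo "Processing next part" >&2   # informational: stays out of the log
echo "bulk output line"           # ordinary output: goes to the log
EOF

sh myscript >log 2>messages
cat log        # the logged output
cat messages   # the informational message
```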
Suppose that a script of mine is invoked like this:
(script.sh 1>&2) 2>err
Is there a way to re-direct the output of one of the commands run by the script to standard output? I tried to do 2>&1 for that command, but that did not help. This answer suggests a solution for Windows command shell and re-directs to a file instead of the standard output.
For a simple example, suppose that the script is:
#!/bin/sh
# ... many commands whose output will go to `stderr`
echo aaa # command whose output needs to go to `stdout`; tried 2>&1
# ... many commands whose output will go to `stderr`
How do I cause the output of that echo to go to stdout (a sign of that would be that it would appear on the screen) when the script is invoked as shown above?
Send it to stderr in the script
echo this goes to stderr
echo so does this
echo this will end up in stdout >&2
echo more stderr
Run as
{ ./script.sh 3>&2 2>&1 1>&3 ; } 2>err
err contains
this goes to stderr
so does this
more stderr
Output to stdout
this will end up in stdout
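A self-contained check of the swap (the script body matches the one above; file names are examples). The outer 2>err and >out stand in for the terminal's stderr and stdout so the result is easy to inspect:

```shell
#!/bin/bash
cat > script.sh <<'EOF'
#!/bin/sh
echo this goes to stderr
echo so does this
echo this will end up in stdout >&2
echo more stderr
EOF

# 3>&2 saves the already-redirected stderr on fd 3;
# 2>&1 and 1>&3 then swap the script's two streams.
{ sh ./script.sh 3>&2 2>&1 1>&3 ; } 2>err >out
cat out   # -> this will end up in stdout
```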
Consider ./my_script >/var/log/my_log
A single echo statement from this script must still go to stdout.
How can this be accomplished?
So we have this small helper program:
cat print2stdout
#!/bin/sh
echo some words secret and sent to null
echo some words to stdout > /dev/fd/3
The last line sends that echo's output to file descriptor 3, which the caller must have opened.
When invoking, we map fd 3 to stdout, then redirect stdout itself to a file (here, /dev/null).
The result looks like this:
./print2stdout 3>&1 >/dev/null
some words to stdout
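A runnable version of the same demo (Linux and macOS both provide /dev/fd); the second invocation captures fd 3 to a file instead of the terminal, purely to make the result inspectable:

```shell
#!/bin/bash
cat > print2stdout <<'EOF'
#!/bin/sh
echo some words secret and sent to null
echo some words to stdout > /dev/fd/3
EOF

# fd 3 is mapped to the original stdout BEFORE stdout itself is silenced.
sh ./print2stdout 3>&1 >/dev/null
sh ./print2stdout 3>captured.txt >/dev/null
```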
Just use /dev/tty, which refers to your controlling terminal regardless of any redirections.
#!/bin/sh
echo this line goes to the possibly redirected stdout
echo this line shows up on the screen > /dev/tty