Bash - send all output to a log file, BUT show errors

Is it possible, within a bash script, to send all output to a log file, except the output I explicitly print with echo, but if there are errors in the output, have them show in the terminal (and in the log file too, of course)?

Here is what you can do by using an additional file descriptor:
#!/bin/bash
# open fd=3 redirecting to 1 (stdout)
exec 3>&1
# redirect stdout/stderr to a file but show stderr on terminal
exec >file.log 2> >(tee >(cat >&3))
# function echo to show echo output on terminal
echo() {
# call actual echo command and redirect output to fd=3
command echo "$@" >&3
}
# script starts here
echo "show me"
printf "=====================\n"
printf "%s\n" "hide me"
ls foo-foo
date
tty
echo "end of run"
# close fd=3
exec 3>&-
After you run the script it will display the following on the terminal:
show me
ls: cannot access 'foo-foo': No such file or directory
end of run
If you do cat file.log then it shows:
=====================
hide me
ls: cannot access 'foo-foo': No such file or directory
Fri Dec 2 14:20:47 EST 2016
/dev/ttys002
On the terminal we only get the output of the echo calls and all the errors.
The log file gets the errors plus the remaining output from the script.

UNIX programs are given two output file descriptors, stdout and stderr, both of which go to the terminal by default.
Well behaved programs send their "standard" output to stdout, and errors to stderr. So for example echo writes to stdout. grep writes matching lines to stdout, but if something goes wrong, for example a file can't be read, the error goes to stderr.
You can redirect these with > (for stdout) and 2> (for stderr). So:
myscript >log 2>errors
Writes output to log and errors to errors.
So part of your requirement can be met simply with:
command >log
... errors will continue to go to the terminal, via stderr.
Your extra requirement is "except the output i specifically output with echo".
It might be enough for you that your echos go to stderr:
echo "Processing next part" >&2
The >&2 redirects stdout from this command to stderr. This is the standard way of outputting errors (and sometimes informational output) in shell scripts.
If you need more than this, you might want to do something more complicated with more file descriptors. Try: https://unix.stackexchange.com/questions/18899/when-would-you-use-an-additional-file-descriptor
Well behaved UNIX programs tend to avoid doing complicated things with extra file descriptors. The convention is to restrict yourself to stdout and stderr, with any further outputs being specified as filenames in the command line parameters.
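To make the convention concrete, here is a minimal sketch (the file and message names are illustrative) of a script whose ordinary output can be redirected to a log while its progress notes go to stderr:

```shell
#!/bin/bash
# progress.sh - sketch: normal output goes to stdout (redirect it to a
# log); progress notes go to stderr, so they still reach the terminal.

note() {
    # informational message on stderr, the conventional channel for this
    echo "$*" >&2
}

note "Processing next part"
echo "this line lands in the log"   # ordinary output
date                                # ordinary output
note "Done"
```

Running it as `./progress.sh >run.log` shows only the two notes on the terminal; everything else ends up in run.log.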

Related

How to use the set command to redirect output to another file

Bash:
I was wondering how I could use the set command to redirect the output to another file.
For example, I would like to use the set command to re-direct the output to a file called "output", how would I do so?
I've tried using set -h output but it didn't seem to work. Any help is appreciated.
The set command is not for redirecting output. Maybe you are thinking of redirections.
For example, if you would like to redirect the output of echo 'hi!' to a file, you could run echo 'hi!' > output.
You might be thinking of exec
#!/usr/bin/env bash
echo "script starting"
# make a backup of fd1 (stdout) as fd3, then redirect fd1 to an output file
exec 3>&1 1>/tmp/output.file
# your script here
echo "hello world"
echo "...and so on..."
# turn off logging
exec 1>&3 3>&-
echo done
Running that looks like:
$ bash script.sh
script starting
done
$ cat /tmp/output.file
hello world
...and so on...
You can use this method to log output to both the terminal and the output file, using a process substitution:
exec 3>&1 1> >(tee /tmp/output.file)
and other fancy things like adding a timestamp to the output
exec 3>&1 1> >(ts | tee /tmp/output.file)
The set command sets or unsets shell options and positional parameters; it is not the right tool for redirecting output to a file. There are several ways to write the output of a command to a file; the most commonly used are listed below:
command > output.log
The standard output stream will be redirected to the file only, it
will not be visible in the terminal. If the file already exists, it
gets overwritten.
command >> output.log
The standard output stream will be redirected to the file only, it
will not be visible in the terminal. If the file already exists, the
new data will get appended to the end of the file.
command 2> output.log
The standard error stream will be redirected to the file only, it will
not be visible in the terminal. If the file already exists, it gets
overwritten.
command 2>> output.log
The standard error stream will be redirected to the file only, it will
not be visible in the terminal. If the file already exists, the new
data will get appended to the end of the file.
command &> output.log
Both the standard output and standard error stream will be redirected
to the file only, nothing will be visible in the terminal. If the file
already exists, it gets overwritten.
command &>> output.log
Both the standard output and standard error stream will be redirected
to the file only, nothing will be visible in the terminal. If the file
already exists, the new data will get appended to the end of the file.
command | tee output.log
The standard output stream will be copied to the file, it will still
be visible in the terminal. If the file already exists, it gets
overwritten.
command | tee -a output.log
The standard output stream will be copied to the file, it will still
be visible in the terminal. If the file already exists, the new data
will get appended to the end of the file.
command |& tee output.log
Both the standard output and standard error streams will be copied to
the file while still being visible in the terminal. If the file
already exists, it gets overwritten.
command |& tee -a output.log
Both the standard output and standard error streams will be copied to
the file while still being visible in the terminal. If the file
already exists, the new data will get appended to the end of the file.
command 2>&1 | tee output.log
Both the standard output and standard error streams will be copied to
the file while still being visible in the terminal; this is the
portable equivalent of |& tee. If the file already exists, it gets
overwritten.
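A quick way to see the difference between these forms is a small sketch (file names are illustrative) that runs the same function under several of the redirections above:

```shell
#!/bin/bash
# demo: route stdout and stderr of the same command to different places
emit() {
    echo "to stdout"
    echo "to stderr" >&2
}

emit  > out.log 2> err.log    # split the two streams into separate files
emit &> both.log              # merge both into one file (bash shorthand)
emit >> out.log 2>> err.log   # append instead of overwriting
```

Afterwards out.log holds two "to stdout" lines, err.log holds two "to stderr" lines, and both.log holds one of each.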

Echo bash command to stdout and a log file?

I'd like to know if there's a way in bash have the current command statement printed to both stdout and a log file, in addition to the command's output. For example:
runme.sh:
# do some setup, e.g. create script.log
echo "Fully logged command"
Would write the following to stdout and to script.log:
+ echo 'Fully logged command'
Fully logged command
For example, if I use these lines early in the script:
set -x
exec > >(tee -ai script.log)
This shows the trace output from set -x on the terminal but does not capture it in the log file.
set -x prints its trace to stderr. To capture it, stderr must be duplicated onto stdout after stdout has been pointed at tee, because redirections are applied left to right.
So if you do this:
set -x
exec > >(tee -ai output.log) 2>&1
... you neatly get everything that Bash executes in your log file, together with any output produced by the commands you run.
But beware: any terminal-specific formatting applied by your programs is lost, because their output now goes to a pipe rather than a tty.
As a side note: in a pipeline, the pipe is created before any redirections on the command take effect, so cmd 2>&1 | tee sends stderr into the pipe as well.
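Putting it together, here is a minimal self-contained sketch (the log path is illustrative) that mirrors both normal output and the set -x trace to the terminal and a log file:

```shell
#!/bin/bash
# trace.sh - sketch: send stdout AND the set -x trace (which is on
# stderr) to both the terminal and /tmp/output.log.

# stdout goes through tee first; only then is stderr duplicated onto it
exec > >(tee -ai /tmp/output.log) 2>&1
set -x

echo "Fully logged command"
date
```

Both the `+ echo 'Fully logged command'` trace line and the command output appear on the terminal and in /tmp/output.log.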

Re-direct output of command within a script whose output is re-directed

Suppose that a script of mine is invoked like this:
(script.sh 1>&2) 2>err
Is there a way to re-direct the output of one of the commands run by the script to standard output? I tried to do 2>&1 for that command, but that did not help. This answer suggests a solution for Windows command shell and re-directs to a file instead of the standard output.
For a simple example, suppose that the script is:
#!/bin/sh
# ... many commands whose output will go to `stderr`
echo aaa # command whose output needs to go to `stdout`; tried 2>&1
# ... many commands whose output will go to `stderr`
How do I cause the output of that echo to go to stdout (a sign of that would be that it would appear on the screen) when the script is invoked as shown above?
Send it to stderr in the script
echo this goes to stderr
echo so does this
echo this will end up in stdout >&2
echo more stderr
Run as
{ ./script.sh 3>&2 2>&1 1>&3 ; } 2>err
err contains
this goes to stderr
so does this
more stderr
Output to stdout
this will end up in stdout

bash file descriptor redirection

I have the following two bash scripts:
one.bash:
#!/bin/bash
echo "I don't control this line of output to stdout"
echo "I don't control this line of output to stderr" >&2
echo "I do control this line of output to fd 5" >&5
callone.bash:
#!/bin/bash
# here I try to merge stdout and stderr into stderr.
# then direct fd5 into stdout.
bash ./one.bash 1>&2 5>&1
When I run it like this:
bash callone.bash 2>stderr.txt >stdout.txt
The stderr.txt file looks like this:
I don't control this line of output to stdout
I don't control this line of output to stderr
I do control this line of output to fd 5
and stdout is empty.
I would like the "do control" line to be output to only stdout.txt.
The restrictions on making changes are:
I can change anything in callone.bash.
I can change the line in one.bash that I control.
I can add an exec in one.bash related to file descriptor 5.
I have to run the script as indicated.
[EDIT] The use case for this is: I have a script that does all kinds of running of other scripts that can output to stderr and stdout. But I need to ensure that the user only sees the well controlled message. So I send the well controlled message to fd5, and everything else (stdout & stderr) is sent to the log.
Redirections are applied in order, left to right.
Once 1>&2 has run, fd 1 is a copy of fd 2.
So when 5>&1 then runs, fd 5 is pointed at wherever fd 1 points now (not where it pointed originally).
You need to invert the two redirections:
bash ./one.bash 5>&1 1>&2
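The order-dependence can be demonstrated with a short sketch (the say helper is hypothetical):

```shell
#!/bin/bash
# order.sh - sketch: redirections are applied left to right
say() { echo "on fd 5" >&5; }

# fd5 := current stdout, THEN stdout := stderr:
# the fd-5 message still reaches the original stdout.
( say ) 5>&1 1>&2

# stdout := stderr first, so fd5 := stdout now also means stderr:
# the fd-5 message ends up on stderr instead.
( say ) 1>&2 5>&1
```

Running `bash order.sh >out.txt 2>err.txt` leaves exactly one "on fd 5" line in each file.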

Get whole output stream from remote machine after running .rb file using ssh [duplicate]

This question already has answers here:
How to redirect and append both standard output and standard error to a file with Bash
(8 answers)
Closed 6 years ago.
I know that in Linux, to redirect output from the screen to a file, I can either use the > or tee. However, I'm not sure why part of the output is still output to the screen and not written to the file.
Is there a way to redirect all output to file?
That part is written to stderr, use 2> to redirect it. For example:
foo > stdout.txt 2> stderr.txt
or if you want in same file:
foo > allout.txt 2>&1
Note: this works in (ba)sh, check your shell for proper syntax
All POSIX operating systems give each process 3 standard streams: stdin, stdout, and stderr. stdin is the input stream. stdout is the primary output, which is redirected with >, >>, or |. stderr is a separate output for errors and diagnostics, kept apart so that error messages are not mixed into data being piped to another command or written to a file; by default it still appears on the terminal even when stdout is redirected. To redirect both to the same place, use:
$command &> /some/file
EDIT: thanks to Zack for pointing out that the above solution is not portable; use instead:
$command > file 2>&1
If you want to silence the error, do:
$command 2> /dev/null
To get the output on the console AND in a file file.txt for example.
make 2>&1 | tee file.txt
Note: & (in 2>&1) specifies that 1 is not a file name but a file descriptor.
Use: your_command > log_file_name 2>&1
A detailed description of the redirection operators in Unix/Linux follows.
The > operator redirects the output, usually to a file, but it can also be to a device. You can use >> to append.
If you don't specify a number, the standard output stream is assumed, but you can also redirect errors:
> file redirects stdout to file
1> file redirects stdout to file
2> file redirects stderr to file
&> file redirects stdout and stderr to file
/dev/null is the null device it takes any input you want and throws it away. It can be used to suppress any output.
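A small sketch of the common /dev/null patterns (the paths are illustrative):

```shell
#!/bin/bash
# sketch: common ways to discard output with /dev/null
ls /no/such/dir  2> /dev/null   # hide only the error message
echo "hidden"     > /dev/null   # hide only the normal output
ls /no/such/dir  &> /dev/null   # hide everything (bash)
echo "visible"                  # untouched output still appears
```

Running this prints only "visible"; the missing-directory errors and the "hidden" line are all discarded.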
Credits to osexp2003 and j.a. …
Instead of putting:
&>> your_file.log
behind a line in:
crontab -e
I use:
#!/bin/bash
exec &>> your_file.log
…
at the beginning of a BASH script.
Advantage: You have the log definitions within your script. Good for Git etc.
You can use the exec command to redirect all stdout/stderr output of any commands that run later.
sample script:
exec 2> your_file2 > your_file1
your other commands.....
It might be the standard error. You can redirect it:
... > out.txt 2>&1
Command:
foo >> output.txt 2>&1
appends to the output.txt file, without replacing the content.
Use >> to append:
command >> file
