Save bash output and error messages to a log file within the script - bash

I'm trying to redirect stdout to a log file from within the script using the following code:
LOGFILE=logfile.txt
exec 2> $LOGFILE
But logfile.txt stays empty. Could anyone give me a hint?

The redirection 2> means redirect file descriptor 2, which is stderr, not stdout.
If you want stdout, write:
exec >$LOGFILE
If you want both:
exec >$LOGFILE 2>&1
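Put together in a script, the fix looks like this (a minimal sketch; the filename and echo lines are just examples):

```shell
#!/bin/bash
LOGFILE=logfile.txt

# From this point on, both stdout and stderr of every command
# below go to the log file instead of the terminal.
exec >"$LOGFILE" 2>&1

echo "a line on stdout"
echo "a line on stderr" >&2
```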

Related

How to redirect both stdout and stderr to a file from within a bash script?

I want to add a command to my bash script that directs all stderr and stdout to specific files. From this and many other sources, I know that from the command line I would use:
/path/to/script.sh >> log_file 2>> err_file
However, I want something inside my script, something akin to these slurm flags:
#!/bin/bash
#SBATCH -o slurm.stdout.txt # Standard output log
#SBATCH -e slurm.stderr.txt # Standard error log
<code>
Is there a way to direct output from within a script, or do I need to use >> log_file 2>> err_file every time I call the script? Thanks
You can use this:
exec >> file
exec 2>&1
at the start of your bash script. This will append both stdout and stderr to your file.
You can use this at the start of your bash script:
# Redirected Output
exec > log_file 2> err_file
If the file already exists, it is truncated to zero size. If you prefer to append, use this:
# Appending Redirected Output
exec >> log_file 2>> err_file
If you want to redirect both stdout and stderr to the same file, then you can use:
# Redirected Output
exec &> log_file
# This is semantically equivalent to
exec > log_file 2>&1
If you prefer to append, use this:
# Appending Redirected Output
exec >> log_file 2>&1
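The difference between truncating (>) and appending (>>) is easy to see with a throwaway file (the name demo.log is arbitrary):

```shell
#!/bin/bash
echo first  > demo.log    # creates/truncates: demo.log holds "first"
echo second >> demo.log   # appends: demo.log now holds two lines
echo third  > demo.log    # truncates again: only "third" remains
```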
With Slurm itself, you can also combine both streams into one file:
#SBATCH --output=serial_test_%j.log # Standard output and error log
This will send all output, i.e. both stdout and stderr, to a single log file called serial_test_<JOBID>.log.
Ref: https://help.rc.ufl.edu/doc/Sample_SLURM_Scripts

Running tcpdump in the background linux

Linux (Gentoo) and Linux (Redhat on AWS free)
I am a member of the pcap group and can run tcpdump as a non-root user.
I am trying to run a script that runs tcpdump in the background and sends the output to a text file, temp.txt. My script will create temp.txt, but /usr/bin/tcpdump -tttt will not write to it.
The command runs fine without nohup:
/usr/sbin/tcpdump -c 10 -tttt > `pwd`/temp.txt
Why will the nohup not work? The following is my script:
#!/bin/bash
#tpd-txt.sh
nohup /usr/sbin/tcpdump -c 10 -tttt > `pwd`/temp.txt > /dev/null 2>&1 &
Try
nohup /usr/sbin/tcpdump -c 10 -tttt >./temp.txt 2>&1 &
Your original command redirects stdout twice (> `pwd`/temp.txt > /dev/null); redirections are processed left to right and the last one wins, so the packet output ends up in /dev/null. I am assuming you also want standard error captured in the log; for that, 2>&1 must come after the file redirection, because 2>&1 >./temp.txt would leave stderr pointing at the original stdout.
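Because redirections are applied left to right, the position of 2>&1 relative to the file redirection matters. This can be checked directly (file names here are arbitrary):

```shell
#!/bin/bash
# ">file 2>&1": stdout goes to the file first, then stderr follows it.
sh -c 'echo out; echo err >&2' > both.txt 2>&1

# "2>&1 >file": stderr is duplicated to the *original* stdout (the terminal),
# and only stdout ends up in the file.
sh -c 'echo out; echo err >&2' 2>&1 > only_out.txt
```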
Below is a quick reference guide for output redirection in bash.
1>filename
# Redirect stdout to file "filename."
1>>filename
# Redirect and append stdout to file "filename."
2>filename
# Redirect stderr to file "filename."
2>>filename
# Redirect and append stderr to file "filename."
&>filename
# Redirect both stdout and stderr to file "filename."
2>&1
# Redirects stderr to stdout.
# Error messages get sent to the same place as standard output.
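A few one-liners confirm the table above (file names are arbitrary; ls /nonexistent is just a convenient way to produce stderr):

```shell
#!/bin/bash
echo hello 1>out.txt                 # stdout to file
ls /nonexistent 2>err.txt || true    # stderr to file; stdout untouched
echo world 1>>out.txt                # append to stdout's file
ls /nonexistent &>both.txt || true   # both streams to one file (bash)
```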

How do I copy stderr without stopping it writing to the terminal?

I want to write a shell script that runs a command, writing its stderr to my terminal as it arrives. However, I also want to save stderr to a variable, so I can inspect it later.
How can I achieve this? Should I use tee, or a subshell, or something else?
I've tried this:
# Create FD 3 that can be used so stdout still comes through
exec 3>&1
# Run the command, piping stdout to normal stdout, but saving stderr.
{ ERROR=$("$@" 2>&1 1>&3); }
echo "copy of stderr: $ERROR"
However, this doesn't write stderr to the console, it only saves it.
I've also tried:
{ "$@"; } 2> >(tee stderr.txt >&2)
echo "stderr was:"
cat stderr.txt
However, I don't want the temporary file.
I often want to do this, and find myself reaching for /dev/stderr, but there can be problems with this approach; for example, Nix build scripts give "permission denied" errors if they try to write to /dev/stdout or /dev/stderr.
After reinventing this wheel a few times, my current approach is to use process substitution as follows:
myCmd 2> >(tee >(cat 1>&2))
Reading this from the outside in:
This will run myCmd, leaving its stdout as-is. The 2> will redirect the stderr of myCmd to a different destination; the destination here is >(tee >(cat 1>&2)) which will cause it to be piped into the command tee >(cat 1>&2).
The tee command duplicates its input (in this case, the stderr of myCmd) to its stdout and to the given destination. The destination here is >(cat 1>&2), which will cause the data to be piped into the command cat 1>&2.
The cat command just passes its input straight to stdout. The 1>&2 redirects stdout to go to stderr.
Reading from the inside out:
The cat 1>&2 command redirects its stdin to stderr, so >(cat 1>&2) acts like /dev/stderr.
Hence tee >(cat 1>&2) duplicates its stdin to both stdout and stderr, acting like tee /dev/stderr.
We use 2> >(tee >(cat 1>&2)) to get 2 copies of stderr: one on stdout and one on stderr.
We can use the copy on stdout as normal, for example storing it in a variable. We can leave the copy on stderr to get printed to the terminal.
We can combine this with other redirections if we like, e.g.
# Create FD 3 that can be used so stdout still comes through
exec 3>&1
# Run the command, redirecting its stdout to the shell's stdout,
# duplicating its stderr and sending one copy to the shell's stderr
# and using the other to replace the command's stdout, which we then
# capture
{ ERROR=$("$@" 2> >(tee >(cat 1>&2)) 1>&3); }
echo "copy of stderr: $ERROR"
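Wrapped in a function, the combined idiom is easy to exercise; the name capture_err and the sample command below are just illustrative:

```shell
#!/bin/bash
exec 3>&1   # keep a handle on the real stdout

capture_err() {
  # stderr is duplicated: one copy continues to the terminal,
  # the other replaces stdout inside $(...) so we can capture it.
  ERROR=$("$@" 2> >(tee >(cat 1>&2)) 1>&3)
}

capture_err sh -c 'echo to-stdout; echo to-stderr >&2'
echo "copy of stderr: $ERROR"
```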
Credit goes to @Etan Reisner for the fundamentals of the approach; however, it's better to use tee with /dev/stderr rather than /dev/tty in order to preserve normal behavior: if you send to /dev/tty, the outside world doesn't see it as stderr output and can neither capture nor suppress it.
Here's the full idiom:
exec 3>&1 # Save original stdout in temp. fd #3.
# Redirect stderr to *captured* stdout, send stdout to *saved* stdout, also send
# captured stdout (and thus stderr) to original stderr.
errOutput=$("$@" 2>&1 1>&3 | tee /dev/stderr)
exec 3>&- # Close temp. fd.
echo "copy of stderr: $errOutput"

Automatically capture all stderr and stdout to a file and still show on console

I'm looking for a way to capture all standard output and standard error to a file, while also outputting it to console. So:
(set it up here)
set -x # I want to capture every line that's executed too
cat 'foo'
echo 'bar'
Now the output from foo and bar, as well as the debugging output from set -x, will be logged to some log file and shown on the console.
I can't control how the file is invoked, so it needs to be set up at the start of the file.
You can use exec and process substitution to send stdout and stderr inside of the script to tee. The process substitution is a bashism, so it is not portable and will not work if bash is called as /bin/sh or with --posix.
exec > >(tee foo.log) 2>&1
set -x # I want to capture every line that's executed too
cat 'foo'
echo 'bar'
sleep 2
The sleep at the end is there because output to the console is buffered by tee; it helps keep the prompt from returning before the output has finished.
Maybe create a proxy-script that calls the real script, redirecting stderr to stdout and piping it to tee?
Something like this:
#!/bin/bash
/path/to/real/script "$@" 2>&1 | tee file
If you want only stderr on your console, you can:
#!/bin/bash
set -e
outfile=logfile
exec > >(cat >> $outfile)
exec 2> >(tee -a $outfile >&2)
# write your code here
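A runnable sketch of that pattern (the file name is arbitrary; the final sleep gives the background cat/tee processes a moment to flush, for the same buffering reason noted earlier):

```shell
#!/bin/bash
outfile=logfile
exec > >(cat >> "$outfile")          # stdout: goes to the file only
exec 2> >(tee -a "$outfile" >&2)     # stderr: file and console

echo "only in the log"
echo "log and console" >&2

sleep 1   # let the background cat/tee finish writing
```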

How do I log stderr and stdout synchronously, but print stderr to screen only?

This is a task that I try to do pretty often.
I want to log both stderr and stdout to a log file, but print only stderr to the console.
I've tried with tee, but once I've merged stderr and stdout using 2>&1, I can no longer keep stdout off the screen, since both streams are merged.
Here is a simple example of what I tried
./dosomething.sh | tee -a log 2>&1
With that, I get both stderr and stdout in the log and on the screen.
Any Ideas?
From some reading on this site, this question has been asked before:
Write STDOUT & STDERR to a logfile, also write STDERR to screen
And also a question very similar here:
Save stdout, stderr and stdout+stderr synchronously
But neither of them redirects both stdout+stderr to a log and stderr to the screen while stdout and stderr are written synchronously to the log file.
I was able to get this working in bash:
(./tmp.sh 2> >(tee >(cat >&2) >&1)) > tmp.log
This does not work correctly in zsh (the prompt does not wait for the process to exit), and does not work at all in dash. A more portable solution may be to write a simple C program to do it.
I managed to get this working with this script in bash.
mkfifo stdout
mkfifo stderr
rm -f out
cat stderr | tee -a out &
cat stdout >> out &
(echo "stdout";
grep;
echo "an other stdout";
echo "again stdout";
stat) 2> stderr > stdout
rm -f stdout
rm -f stderr
The order of the output is preserved. With this script the process ends correctly.
Note: I used grep and stat without parameters to generate output on stderr (their usage errors), interleaved with the echo lines on stdout.