Redirect all messages to a log file from inside the script, but show some output on the console as well - bash

I want all stdout of the script written to a log file by default, but I want some messages displayed on the console as well. Below is the sample code I tried.
#!/bin/bash
LOG='output.txt'
exec 3>&1
exec 1>>$LOG
exec 2>>$LOG
echo "This should be written to log"
echo "This should be written to both screen as well as log"|tee -a >&3
This writes "This should be written to log" to the log file, which is good.
It writes "This should be written to both screen as well as log" only to the console, but I want it written to both the console and the log.

You're missing a filename on your tee command. Try it like this:
echo "This should be written to both screen as well as log"|tee -a $LOG >&3
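Putting the fix together, a minimal corrected version of the script might look like this (restoring stdout at the end is optional, added here for tidiness):

```shell
#!/bin/bash
LOG='output.txt'

# Keep a copy of the original stdout on fd 3 so we can still reach the screen
exec 3>&1
# From here on, stdout and stderr go to the log
exec 1>>"$LOG" 2>>"$LOG"

echo "This should be written to log"
# tee appends a copy to the log; its stdout goes to fd 3 (the screen)
echo "This should be written to both screen as well as log" | tee -a "$LOG" >&3

# Restore the original stdout and close fd 3
exec 1>&3 3>&-
```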

Redirect stderr to terminal and also file in bash

I have a peculiar case here which can be summed up as follows:
I want all error/stdout messages from my script redirected to a file, but there is one particular line which I want to go to the terminal as well as to the file.
This is the code:
exec &>test.log
echo "Check if this line is going to test.log"
echo "This should go to stderr" >> /dev/stderr
Now the last line should go to both stderr and test.log.
Can this be achieved somehow in bash?
Yes, the tee command allows you to direct output to one or more files, as well as stdout.
As you pointed out in a comment, this doesn't work by itself because of the exec command.
This should do what you want:
exec 3>&1 1>test.log
echo "Check if this line is going to test.log"
exec 1>&3 3>&-
echo "This should go to stderr" | tee -a test.log >> /dev/stderr
I got the information about how to restore stdout by properly setting up the original exec from here, and combined it with tee.

How to redirect the stream inline? Is it possible to avoid repeating the code? [duplicate]

Is it possible to redirect all of the output of a Bourne shell script to somewhere, but with shell commands inside the script itself?
Redirecting the output of a single command is easy, but I want something more like this:
#!/bin/sh
if [ ! -t 0 ]; then
# redirect all of my output to a file here
fi
# rest of script...
Meaning: if the script is run non-interactively (for example, cron), save off the output of everything to a file. If run interactively from a shell, let the output go to stdout as usual.
I want to do this for a script normally run by the FreeBSD periodic utility. It's part of the daily run, which I don't normally care to see every day in email, so I don't have it sent. However, if something inside this one particular script fails, that's important to me and I'd like to be able to capture and email the output of this one part of the daily jobs.
Update: Joshua's answer is spot-on, but I also wanted to save and restore stdout and stderr around the entire script, which is done like this:
# save stdout and stderr to file descriptors 3 and 4,
# then redirect them to "foo"
exec 3>&1 4>&2 >foo 2>&1
# ...
# restore stdout and stderr
exec 1>&3 2>&4
Addressing the question as updated.
#...part of script without redirection...
{
#...part of script with redirection...
} > file1 2>file2 # ...and others as appropriate...
#...residue of script without redirection...
The braces '{ ... }' provide a unit of I/O redirection. The braces must appear where a command could appear - simplistically, at the start of a line or after a semi-colon. (Yes, that can be made more precise; if you want to quibble, let me know.)
You are right that you can preserve the original stdout and stderr with the redirections you showed, but it is usually simpler for the people who have to maintain the script later to understand what's going on if you scope the redirected code as shown above.
The relevant sections of the Bash manual are Grouping Commands and I/O Redirection. The relevant sections of the POSIX shell specification are Compound Commands and I/O Redirection. Bash has some extra notations, but is otherwise similar to the POSIX shell specification.
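A small self-contained sketch of the brace-group form (file names here are illustrative):

```shell
#!/bin/bash
echo "before the group: goes to the terminal"
{
    echo "inside the group: goes to file1"
    echo "error inside the group: goes to file2" >&2
} > file1.log 2> file2.log
echo "after the group: goes to the terminal again"
```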
Typically we would place one of these at or near the top of the script. Scripts that parse their command lines would do the redirection after parsing.
Send stdout to a file
exec > file
Send both stdout and stderr to a file
exec > file
exec 2>&1
Append both stdout and stderr to a file
exec >> file
exec 2>&1
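As a runnable illustration of the append form (log name illustrative; the save/restore on fds 3 and 4 is only there so the demo gives the terminal back afterwards):

```shell
#!/bin/bash
LOG=exec_demo.log

# Save the original stdout and stderr so we can restore them later
exec 3>&1 4>&2
# Append both stdout and stderr to the log from here on
exec >> "$LOG"
exec 2>&1

echo "a normal message"
echo "an error message" >&2

# Restore the original streams and close the spares
exec 1>&3 2>&4 3>&- 4>&-
```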
As Jonathan Leffler mentioned in his comment:
exec has two separate jobs. The first is to replace the currently executing shell (script) with a new program. The other is to change the I/O redirections in the current shell; that second mode is selected by giving exec no command argument.
You can make the whole script a function like this:
main_function() {
do_things_here
}
then at the end of the script have this:
if [ -z "$TERM" ]; then
# if not run via terminal, log everything into a log file
main_function >> /var/log/my_uber_script.log 2>&1
else
# run via terminal, only output to screen
main_function
fi
Alternatively, you may log everything into logfile each run and still output it to stdout by simply doing:
# log everything, but also output to stdout
main_function 2>&1 | tee -a /var/log/my_uber_script.log
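One caveat with the tee form: afterwards, $? is tee's exit status, not main_function's. In bash you can recover the function's status from PIPESTATUS (names below are illustrative):

```shell
#!/bin/bash

main_function() {
    echo "doing things"
    return 3   # simulate a failure
}

main_function 2>&1 | tee -a uber_demo.log
status=${PIPESTATUS[0]}   # exit status of main_function, not of tee
echo "main_function exited with status $status" | tee -a uber_demo.log
```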
For saving the original stdout and stderr you can duplicate them onto spare file descriptors:
exec [fd number]>&1
exec [fd number]>&2
For example, the following code will print "walla1" and "walla2" to the log file (a.txt), "walla3" to stdout, "walla4" to stderr.
#!/bin/bash
exec 5>&1
exec 6>&2
exec 1> ~/a.txt 2>&1
echo "walla1"
echo "walla2" >&2
echo "walla3" >&5
echo "walla4" >&6
[ -t 0 ] || exec >> test.log
(If stdin is not a terminal, append all further stdout to test.log.)
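Spelled out as a runnable sketch, testing whether stdin is attached to a terminal (log name illustrative):

```shell
#!/bin/bash
# If stdin is not a terminal (cron, pipes, etc.), append everything to a log.
if [ ! -t 0 ]; then
    exec >> noninteractive_demo.log 2>&1
fi

echo "hello from the script"
```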
I finally figured out how to do it. I wanted not only to save the output to a file, but also to find out whether the bash script ran successfully.
I've wrapped the commands inside a function, called main_function with its output tee'd to a file, and then checked the exit status with if [ $? -eq 0 ].
#!/bin/bash
main_function() {
python command.py
}
main_function > >(tee -a "/var/www/logs/output.txt") 2>&1
if [ $? -eq 0 ]
then
echo 'Success!'
else
echo 'Failure!'
fi

Redirect output to a file AND console

Currently I have a script where I want all output to be redirected to both a file and the console.
#!/bin/bash
touch /mnt/ybdata/ybvwconf.log
{
... [Do some script code here]
} 2>&1 | tee -a /mnt/ybdata/ybvwconf.log
Above, you can see my current code, which works perfectly fine. It prints all the output to the console as well as piping it to the ybvwconf.log file. However, I was looking for a way to eliminate the curly brackets. Something along the lines of this:
#!/bin/bash
touch /mnt/ybdata/ybvwconf.log
exec 2>&1 | tee -a /mnt/ybdata/ybvwconf.log
... [Do some script code here]
I have tried this approach and sadly it does not work. I don't get any errors but no content appears in my log file. Any ideas what might be wrong?
You can place this at the top of your script to redirect both stdout and stderr to a file and show them on the terminal as well.
#!/bin/bash
exec &> >(tee /mnt/ybdata/ybvwconf.log; exit)
# your script code goes here
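To see the effect in a self-contained way (log name illustrative; the sleep gives the background tee time to flush before the script exits):

```shell
#!/bin/bash
LOG=conf_demo.log

# Mirror stdout and stderr to both the terminal and the log
exec &> >(tee "$LOG")

echo "config step 1"
echo "warning: something to note" >&2

# Give the background tee a moment to finish writing
sleep 1
```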

Shell, redirect all output to a file but still print echo

I have multiple vagrant provisions (shell scripts), and I would like to redirect all command output to a file while keeping the echo output in stdout.
Currently I redirect all output by doing
exec &>> provision.log
at the beginning of each provision.
This works great, but the console stays empty. So I would like to redirect everything except the echo commands to a file.
If there would be a possibility of redirecting all output (including echo) to a file and keeping only echo in stdout that would be the best.
Result should look like:
Console
Starting provision "master"
Updating packages...
Installing MySQL...
ERROR! check provision.log for details
provision.log
Starting provision "master"
Updating packages...
...
(output from apt-get)
...
Installing MySQL...
...
(output from apt-get install mysql-server-5.5)
...
ERROR! check provision.log for details
I do realize I could attach output redirect to every command but that is quite messy
You can duplicate the stdout file descriptor and use a custom echo function that also writes to the duplicated descriptor.
#!/bin/bash
# open fd=3 redirecting to 1 (stdout)
exec 3>&1
# function echo to show echo output on terminal
echo() {
# call the actual echo command twice: once to the log file (stdout) and once to fd 3 (terminal)
command echo "$@"
command echo "$@" >&3
}
# redirect stdout to a log file
exec >>logfile
printf "%s\n" "going to file"
date
echo "on stdout"
pwd
# close fd=3
exec 3>&-
You might redirect only stdout and keep stderr on the terminal. Your script should then echo to stderr, for example with echo something >&2 (or echo something > /dev/stderr).
You could also redirect the echos to /dev/tty, which makes sense only if the script was started on a terminal (not, e.g., through at or crontab).
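A hedged sketch of the /dev/tty variant (function and file names are illustrative; the probe falls back gracefully when there is no controlling terminal, e.g. under cron):

```shell
#!/bin/bash
LOG=provision_demo.log

# Print a message on the controlling terminal (if any) and always log it.
notify() {
    # Opening /dev/tty fails when the script has no controlling terminal
    if { : > /dev/tty; } 2>/dev/null; then
        echo "$*" > /dev/tty
    fi
    echo "$*"
}

exec >> "$LOG" 2>&1   # everything else goes only to the log

notify "Updating packages..."
echo "(verbose output that should only be in the log)"
```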

Automatically capture all stderr and stdout to a file and still show on console

I'm looking for a way to capture all standard output and standard error to a file, while also outputting it to console. So:
(set it up here)
set -x # I want to capture every line that's executed too
cat 'foo'
echo 'bar'
Now the output from foo and bar, as well as the debugging output from set -x, will be logged to some log file and shown on the console.
I can't control how the file is invoked, so it needs to be set up at the start of the file.
You can use exec and process substitution to send stdout and stderr inside of the script to tee. The process substitution is a bashism, so it is not portable and will not work if bash is called as /bin/sh or with --posix.
exec > >(tee foo.log) 2>&1
set -x # I want to capture every line that's executed too
cat 'foo'
echo 'bar'
sleep 2
The sleep is added at the end because the output to the console is buffered by tee; it helps prevent the prompt from returning before the output has finished.
Maybe create a proxy-script that calls the real script, redirecting stderr to stdout and piping it to tee?
Something like this:
#!/bin/bash
/path/to/real/script "$@" 2>&1 | tee file
If you want only stderr on your console, you can:
#!/bin/bash
set -e
outfile=logfile
exec > >(cat >> "$outfile")
exec 2> >(tee -a "$outfile" >&2)
# write your code here
