bash: send email after error on job

I set up a cron job that runs a script; the script runs a command which renews Let's Encrypt certificates.
#!/bin/bash
/usr/local/sbin/certbot-auto renew --renew-hook "service nginx reload" -q >> /var/log/certbot-renew.log | mail -s "CERTBOT Renewals" test@test.com < /var/log/certbot-renew.log
exit 0
This produced an email every time the cron ran, but what I want is an email only if there is an error or a renewal. I've read that &> will capture errors as well: will this work if I replace >> with &>, or should I be using 2>&1 to capture both stdout and stderr?

In this command:
command >>file 2>&1 | other command
the output is redirected to the file by >>, so nothing reaches the pipe. To feed both a file and a pipe, tee can duplicate the output:
command 2>&1 | tee -a file | other command
Otherwise, some shells accept &>> to redirect stdout and stderr to a file in append mode. The following command does the same; the order is important (fd 1 is redirected to the file first, then fd 2 to fd 1):
command >>file 2>&1
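To get an email only when there is something to report, here is a minimal sketch (my own, not part of the answer above); it relies on -q keeping certbot-auto silent unless an error occurs or a certificate is renewed:
#!/bin/bash
# Capture this run's output (stdout and stderr) instead of appending blindly.
OUT=$(/usr/local/sbin/certbot-auto renew --renew-hook "service nginx reload" -q 2>&1)
if [ -n "$OUT" ]; then
    # Something happened: log it and mail it.
    printf '%s\n' "$OUT" >> /var/log/certbot-renew.log
    printf '%s\n' "$OUT" | mail -s "CERTBOT Renewals" test@test.com
fi
exit 0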

Related

Running tcpdump in the background (Linux)

Linux (Gentoo) and Linux (Redhat on AWS free)
I am a member of the pcap group and can run tcpdump as a non-root user.
I am trying to run a script that runs tcpdump in the background and sends the output to a text file, temp.txt. My script will create the file temp.txt, but /usr/bin/tcpdump -tttt will not write to it.
I can run the script without nohup.
/usr/sbin/tcpdump -c 10 -tttt > `pwd`/temp.txt
Why will the nohup not work? The following is my script:
#!/bin/bash
#tpd-txt.sh
nohup /usr/sbin/tcpdump -c 10 -tttt > `pwd`/temp.txt > /dev/null 2>&1 &
Try
nohup /usr/sbin/tcpdump -c 10 -tttt >./temp.txt 2>&1 &
I am assuming you want standard error redirected to standard output as well, so it is captured in the same file. Note that the original script redirected stdout twice (> `pwd`/temp.txt > /dev/null); the last redirection wins, so everything ended up in /dev/null instead of temp.txt.
Below is a quick reference guide for output redirection in bash.
1>filename
# Redirect stdout to file "filename."
1>>filename
# Redirect and append stdout to file "filename."
2>filename
# Redirect stderr to file "filename."
2>>filename
# Redirect and append stderr to file "filename."
&>filename
# Redirect both stdout and stderr to file "filename."
2>&1
# Redirects stderr to stdout.
# Error messages get sent to the same place as standard output.
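The order of redirections matters. A small demo (my own addition, not from the original answer):
ls /nonexistent > out.txt 2>&1    # both streams end up in out.txt
ls /nonexistent 2>&1 > out.txt    # the error still reaches the terminal,
                                  # because fd 2 was duplicated from fd 1
                                  # before fd 1 was pointed at the file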

Output the shell script test status to a webpage

I have created tests using shUnit2 [shUnit2 is an xUnit-style unit test framework for Bourne-based shell scripts]. Once the tests are executed I can see the output on the console, including the test status. I would like to redirect the output, including errors/exceptions, to a webpage, just like a rake task in RSpec. Your help is much appreciated.
When you type a command in the terminal you'll get the normal output from stdout. If you want to see errors from stderr, you have to redirect them to stdout by appending 2>&1 to your command (YOUR_COMMAND 2>&1).
To view the output of a command in a web browser, you can pipe the output to netcat (e.g. YOUR_COMMAND 2>&1 | netcat -l -p PORT_NUMBER). The command then waits until you navigate your web browser to localhost:PORT_NUMBER. After the URL is opened, netcat prints some server/client-specific noise and then quits. You can suppress the netcat output by redirecting it to /dev/null (YOUR_COMMAND 2>&1 | netcat -l -p PORT_NUMBER 2>&1 >/dev/null).
If you want to keep the "command-output-server" alive after loading the content in the browser, you have to loop over the command. With while true you can loop infinitely. So do while true; do YOUR_COMMAND 2>&1 | netcat -l -p PORT_NUMBER 2>&1 >/dev/null; done to keep the server alive. With an & at the end you can run the whole thing in the background.
For example your final command could look like this:
while true; do date 2>&1 | netcat -l -p 8888 2>&1 >/dev/null; done &
(Browse to 127.0.0.1:8888 to see the current date and time)
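One refinement (my own sketch, not from the answer): prepend a minimal HTTP header so every browser treats the stream as plain text. Here ./run_tests.sh is a hypothetical stand-in for whatever runs your shUnit2 suite:
while true; do
    { printf 'HTTP/1.0 200 OK\r\nContent-Type: text/plain\r\n\r\n'
      ./run_tests.sh 2>&1    # hypothetical test runner
    } | netcat -l -p 8888 >/dev/null 2>&1
done &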

Bash Redirect to a file

I am trying to redirect the output of a command to a file. The command I am using (zypper) downloads packages from the internet. The command I am using is
zypper -x -n in geany >> log.txt
The command prints output to the console gradually, but with the redirection above the output is written to the file all at once, after the command finishes executing. How do I redirect the output to the file as it appears on the terminal, rather than getting it all at the end?
Not with bash itself, but via the tee command:
zypper -x -n in geany | tee log.txt
&>>FILE COMMAND
will append both the stdout and stderr of COMMAND to FILE.
In your case
&>>log.txt zypper -x -n in geany
If you want to pipe a command through a filter, you must make sure that the command writes to standard output (file descriptor 1); if it writes to standard error (file descriptor 2), you have to redirect fd 2 to fd 1 before the pipe, because only stdout passes through a pipe.
So you have to do so:
2>&1 COMMAND | FILTER
If you want to grep the output and at the same time keep it in a log file, you have to duplicate it with tee and use a filter like ... | tee log-file | grep options
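For example, combining the pieces above with the zypper command from the question (a sketch; the grep pattern is just an illustration):
zypper -x -n in geany 2>&1 | tee -a log.txt | grep -i error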

Print STDOUT in the middle of 2 pipes in Solaris (bash)

This command:
./somescript.sh > ../log/scriptlog.log
requires the output of the script to go to stdout, but inside the script the output of a command is piped straight into mailx:
command | mailx -s "Subject" recipient@somedomain.tld
What I would like to do is something like:
command | tee > /dev/stdout | mailx -s "Subject" recipient@somedomain.tld
where the output of the command goes to stdout (to be redirected into the ../log/scriptlog.log file) and also into stdin for the mailx command.
Any way to do that?
tee already sends its output to stdout:
... | tee -a log/scriptlog.log | ...
Alternatively, duplicate the script's stdout onto a spare file descriptor and have tee write to that:
exec 3>&1
command | tee /dev/fd/3 | mailx ...
or, using process substitution:
command | tee >(mailx ...)
I'll try process substitution. To clarify, I have a cron'd shell script. The cron entry is similar to:
/usr/script/myscript.sh > /usr/log/myscript.log
inside the script is a line similar to:
command | mailx -s "Subject" recipient
Since stdout from 'command' is being piped into the mailx command, it does not appear in the log file 'myscript.log', but I want it to.
I tried capturing it into a variable but the line feeds appear to be lost that way. I could use a temporary file, but I was hoping for something more elegant.
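A sketch of how the fd-duplication answer fits this setup (paths and the 'command' placeholder are taken from the question):
#!/bin/bash
# /usr/script/myscript.sh, invoked from cron as:
#   /usr/script/myscript.sh > /usr/log/myscript.log
exec 3>&1    # fd 3 now points at myscript.log
command | tee /dev/fd/3 | mailx -s "Subject" recipient@somedomain.tld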

Write STDOUT & STDERR to a logfile, also write STDERR to screen

I would like to run several commands, and capture all output to a logfile. I also want to print any errors to the screen (or optionally mail the output to someone).
Here's an example. The following command will run three commands, and will write all output (STDOUT and STDERR) into a single logfile.
{ command1 && command2 && command3 ; } > logfile.log 2>&1
Here is what I want to do with the output of these commands:
STDERR and STDOUT for all commands go to a logfile, in case I need it later; I usually won't look in it unless there are problems.
Print STDERR to the screen (or optionally, pipe to /bin/mail), so that any error stands out and doesn't get ignored.
It would be nice if the return codes were still usable, so that I could do some error handling. Maybe I want to send email if there was an error, like this:
{ command1 && command2 && command3 ; } > logfile.log 2>&1 || mailx -s "There was an error" stefanl@example.org
The problem I run into is that STDERR loses its identity during I/O redirection: 2>&1 merges STDERR into STDOUT, while a separate 2> error.log sends the errors off to a file where they won't stand out.
Here are a couple of juicier examples. Let's pretend that I am running some familiar build commands, but I don't want the entire build to stop just because of one error, so I use the --keep-going flag.
{ ./configure && make --keep-going && make install ; } > build.log 2>&1
Or, here's a simple (and perhaps sloppy) build-and-deploy script, which will keep going in the event of an error.
{ ./configure && make --keep-going && make install && rsync -av /foo devhost:/foo ; } > build-and-deploy.log 2>&1
I think what I want involves some sort of Bash I/O Redirection, but I can't figure this out.
(./doit >> log) 2>&1 | tee -a log
This will take stdout and append it to the log file.
The stderr is then converted to stdout, which is piped to tee, which appends it to the log (if you have Bash 4, you can replace 2>&1 | with |&) and sends it on to stdout, where it will either appear on the tty or can be piped to another command.
I used append mode for both so that, regardless of the order in which the shell redirection and tee open the file, you won't blow away the original. That said, stderr and stdout may be interleaved in an unexpected way.
If your system has /dev/fd/* nodes you can do it as:
( exec 5>logfile.txt ; { command1 && command2 && command3 ;} 2>&1 >&5 | tee /dev/fd/5 )
This opens file descriptor 5 on your logfile, executes the commands with standard error directed to standard out and standard out directed to fd 5, and pipes stdout (which now carries only the stderr of the commands) to tee, which duplicates it to fd 5, i.e. the log file.
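A quick way to see this in action (my own example, using one succeeding and one deliberately failing command):
( exec 5>logfile.txt ; { echo out && ls /nonexistent ;} 2>&1 >&5 | tee /dev/fd/5 )
# 'out' lands only in logfile.txt; the ls error lands in logfile.txt and on screen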
Here is how to run one or more commands, capturing standard output and error in the order in which they are generated, to a logfile, while displaying only the standard error on any terminal screen you like. It works in bash on Linux, and probably in most other environments. I will use an example to show how it's done.
Preliminaries:
Open two windows (shells, tmux sessions, whatever)
I will demonstrate with some test files, so create the test files:
touch /tmp/foo /tmp/foo1 /tmp/foo2
in window1:
mkfifo /tmp/fifo
0</tmp/fifo cat - >/tmp/logfile
Then, in window2:
(ls -l /tmp/foo /tmp/nofile /tmp/foo1 /tmp/nofile /tmp/nofile; echo successful test; ls /tmp/nofile1111) 2>&1 1>/tmp/fifo | tee /tmp/fifo 1>/dev/pts/2
Where you replace /dev/pts/2 with whatever tty you want the stderr to display.
The reason for the various successful and unsuccessful commands in the subshell is simply to generate a mingled stream of output and error messages, so that you can verify the correct ordering in the log file. Once you understand how it works, replace the "ls" and "echo" commands with scripts or commands of your choosing.
With this method, the ordering of output and error is preserved, the syntax is simple and clean, and there is only a single reference to the output file. Plus there is flexibility in putting the extra copy of stderr wherever you want.
Try:
command 2>&1 | tee output.txt
Additionally, you can direct stdout and stderr to different places:
command > stdout.txt 2> stderr.txt
command 2>&1 > stdout.txt | program_for_stderr
So some combination of the above should work for you; e.g. you could save stdout to a file, and send stderr both to a file and through a pipe to another program (with tee).
Add this at the beginning of your script:
#!/bin/bash
set -e
outfile=logfile
exec > >(cat >> "$outfile")
exec 2> >(tee -a "$outfile" >&2)
# write your code here
STDOUT and STDERR will be written to $outfile; only STDERR will be seen on the console.
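A short usage sketch (my own, with hypothetical echo commands) showing the effect:
#!/bin/bash
set -e
outfile=logfile
exec > >(cat >> "$outfile")
exec 2> >(tee -a "$outfile" >&2)

echo "this line goes only to $outfile"
echo "this line goes to $outfile and to the console" >&2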
