Condense multiple files to a single BASH script - bash

I have a .body, a .script, and a .sql file that I would like to condense into a single script, but I am not sure how to go about it.
.body contains the e-mail message.
The .sql spools the data to a .csv.
The .script runs the .sql and sends an e-mail with Report.zip attached:
sqlplus $user/$pass@$db @script.sql
(cat script.body; uuencode Report.zip Report.zip) | mail -s "Report" user@domain.com -- -f no-reply@domain.com
Is it possible that this (including the SQL) can all be done in a single BASH script?

If you can simply pass the script to sqlplus on stdin, you can do:
sqlplus $user/$pass@$db << END
<contents of sql script here>
END
(cat script.body; uuencode Report.zip Report.zip) | mail -s "Report" user@domain.com -- -f no-reply@domain.com
If you still want stdin free (useful if sqlplus might prompt for a password or something), and assuming sqlplus doesn't do anything unusual with the script file, you can do:
sqlplus $user/$pass@$db @<(cat << END
<contents of sql script here>
END
)
(cat script.body; uuencode Report.zip Report.zip) | mail -s "Report" user@domain.com -- -f no-reply@domain.com
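Putting it together, here is a minimal single-script sketch. The spool file name (report.csv), the placeholder query, and the inline body text are assumptions standing in for the contents of your .sql and .body files:

#!/bin/bash
# Hypothetical connection details; substitute your own.
user="scott"; pass="tiger"; db="MYDB"

# Inline what used to live in the .sql file: spool the query output to report.csv.
sqlplus -s "$user/$pass@$db" << 'SQL'
SPOOL report.csv
-- your SELECT statement(s) here
SPOOL OFF
EXIT
SQL

zip -q Report.zip report.csv

# Inline what used to live in the .body file, then attach the zip as before.
(
cat << 'BODY'
Please find the attached report.
BODY
uuencode Report.zip Report.zip
) | mail -s "Report" user@domain.com -- -f no-reply@domain.com

The quoted heredoc delimiters keep the shell from expanding anything inside the SQL and the message body.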

Related

bash script to accept log on stdin and email log if inputting process fails

I'm a sysadmin and I frequently have a situation where I have a script or command that generates a lot of output, which I would like to have emailed to me only if the command fails. It's pretty easy to write a script that runs the command, collects the output, and emails it if the command fails, but I was thinking I should be able to write a command that
1) accepts log info on stdin
2) waits for the inputting process to exit and sees what its exit status was
3a) if the inputting process exited cleanly, append the logging input to a normal log file
3b) if the inputting process failed, append the logging input to the normal log and also send me an email.
It would look something like this on the command line:
something_important | mailonfail.sh me@example.com /var/log/normal_log
That would make it really easy to use in crontabs.
I'm having trouble figuring out how to make my script wait for the writing process and evaluate how that process exits.
Just to be extra clear, here's how I can do it with a wrapper:
#!/bin/bash
something_important > output
ERR=$?
if [ "$ERR" -ne "0" ] ; then
  cat output | mail -s "something_important failed" me@example.com
fi
cat output >> /var/log/normal_log
Again, that's not what I want; I want to write a script and pipe commands into it.
Does that make sense? How would I do that? Am I missing something?
Thanks Everyone!
-Dylan
Yes it does make sense, and you are close.
Here is some advice:
#!/bin/sh
TEMPFILE=$(mktemp)
trap "rm -f $TEMPFILE" EXIT
if ! something_important > "$TEMPFILE"; then
  mail -s 'something goes oops' -a "$TEMPFILE" you@example.net
fi
cat "$TEMPFILE" >> /var/log/normal.log
I won't use bashisms, so /bin/sh is fine.
Create a temporary file to avoid conflicts, using mktemp(1).
Use trap to remove the file when the script exits, normally or not.
If the command fails, attach the file, which may or may not be preferable to embedding it.
If it's a big file you could even gzip it, but the attachment method will change:
# using mailx
gzip -c9 $TEMPFILE | uuencode fail.log.gz | mailx -s subject ...
# using mutt
gzip $TEMPFILE
mutt -a $TEMPFILE.gz -s ...
gzip -d $TEMPFILE.gz
etc.
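For the pipe-driven form the question actually asks for (something_important | mailonfail.sh ...), there is an extra wrinkle: a script reading from a pipe cannot see the exit status of the process writing to it. One workaround is to have the caller append the status as a final marker line; the EXITSTATUS convention below is an assumption of this sketch, not something from the thread:

#!/bin/bash
# Hypothetical mailonfail.sh. Expected usage (note the wrapper around the producer):
#   { something_important; echo "EXITSTATUS:$?"; } | mailonfail.sh me@example.com /var/log/normal_log
RECIPIENT=$1
LOGFILE=$2
TEMPFILE=$(mktemp)
trap 'rm -f "$TEMPFILE"' EXIT
cat > "$TEMPFILE"                      # collect everything arriving on stdin
STATUS=$(tail -n 1 "$TEMPFILE" | sed -n 's/^EXITSTATUS://p')
sed -i '$d' "$TEMPFILE"                # strip the marker line (GNU sed)
cat "$TEMPFILE" >> "$LOGFILE"          # always append to the normal log
if [ "${STATUS:-1}" -ne 0 ]; then      # missing or non-zero status: send mail
  mail -s "piped command failed" "$RECIPIENT" < "$TEMPFILE"
fi

If the marker feels intrusive, the other common option is to keep the consumer dumb and have the calling shell check ${PIPESTATUS[0]} after the pipeline, which is closer to the wrapper approach above.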

Sending email with Ruby through mutt

I'm trying to send an email in this way:
path_to_file="/home/username/tmp/filename.html"
subject="My Subject"
to.each do |address|
cmd = "`echo \"#{body}\" | mutt -s \"#{subject}\" #{address} -a #{path_to_file}`"
system(cmd)
end
end
This is running on a unix machine; the file has an .html extension and contains HTML code.
I receive the file correctly, with the same filename and extension, but it is completely empty. This is strange because when I run the same command (mutt) from the terminal, with the same path to the file:
echo "body here" | mutt -s "some subject" mamail#mail.com -a ${path_to_file}
Then it works just fine.
Any idea?

Sending the mail using mutt command

I have a requirement where I need to send two files, A and B. File A's content should be displayed in-line as the body of the mail, and file B as an attachment.
Is multiple attachment using mutt possible?
The command
echo "Hello everyone " | mutt -s 'My mail ' abc#gmail.com -a myFile.txt
writes Hello everyone as the body of the mail and myFile.txt as an attachment (inline).
Both of my files A and B are dynamically generated, so I cannot have an echo statement.
It's actually very simple:
mutt -s 'My mail ' abc@gmail.com -a report1.txt < report2.txt
If you had two scripts to create the reports, you could use pipes (i.e. no files would be created on disk):
script2 | mutt -s 'My mail ' abc@gmail.com -a <(script1)
cat A | mutt -s 'My mail' abc@gmail.com -a B
If the shell script prints file A's content to standard output, like this:
script >A
then you can use tee to print both to the file A and into the pipe to mutt:
script | tee A | mutt -s 'My mail' abc@gmail.com -a B
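If instead both files should go as attachments with a separate body, mutt needs the attachment list separated from the recipients with -- (see the man page excerpt quoted in the last answer below); a minimal sketch with the same placeholder names:

# Hypothetical: A and B are the two generated report files; the body comes from stdin.
echo "Both reports attached." | mutt -s 'My mail' -a A B -- abc@gmail.com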

How to save a postgres query to a file from a remote computer

I have a script which outputs some data from a postgres database:
#!/bin/bash
dbname="inventory"
username="postgres"
NOW=$(date +"%d-%m-%Y-%H%M%S")
OUTPUT="output.$NOW"
psql $dbname $username << EOF
\o | cat >> $OUTPUT
select count(*) from component;
select * from product where procode='AAA';
\q
\o
EOF
This happily exports the data I need to a text file if I run it on the local machine.
Now I want to do the same thing for 10 remote computers and have the output stored in one file on the machine I run the script from. I have the following code that will grep some information from the remote machines but I also need to add the postgres data above to my $FILE_OUT below:
#!/bin/bash
NOW=$(date +"%d-%m-%Y-%H%M%S")
dbname="inventory"
username="postgres"
echo "Enter the file name"
read FILE_IN
FILE_OUT="invcheck.$NOW"
for i in $(cat $FILE_IN); do
  echo $i >> $FILE_OUT
  ssh user@$i "ps -efw | grep TRADINGTIMEPLUS /opt/jrms/jrms.ini" >> $FILE_OUT
done
I have tried to put the postgres code in a line like:
ssh rmsasi@hj$i "psql $dbname $username << EOF \o | cat >> $FILE_OUT select * from component where code='1000';\q;\o; EOF"
but I get errors and the postgres data is not included in the $FILE_OUT
psql: warning: extra option o ignored
cat: from: No such file or directory
cat: component: No such file or directory
cat: where: No such file or directory
cat: code=1000: No such file or directory
bash: line 1: q: command not found
bash: line 1: o: command not found
bash: line 1: EOF: command not found
I am only new to scripting so hopefully I am just missing something simple.
My thanks in advance, and if there is a better way to do what I am trying to achieve, I'd appreciate any pointers.
psql's \ commands are terminated by newlines, not semicolons. A bit of a wart IMO.
Because of that you need newlines in your command. Literal newlines are permissible within a quoted literal in the shell, so you could insert them in the command, but it's better to just have your here-document locally and send it to psql's stdin via ssh's stdin. This'll still leave the output on the remote machine, though:
ssh rmsasi@hj$i "psql $dbname $username" << __EOF__
\o | cat >> $FILE_OUT
SELECT * from component where code='1000';
\q
\o
__EOF__
It's much better to instead read from psql's stdout or redirect it, rather than using \o commands; that way the output comes back down the ssh session and you can redirect it locally:
echo "SELECT * from component where code='1000';" |\
ssh rmsasi@hj$i >> $FILE_OUT \
"psql --quiet --no-align --no-readline --tuples-only -P footer=off --no-password $dbname $username"
Alternately you could have psql connect over the network from the other machines.
Since you're driving psql from bash you may want to know that you can do so interactively with a co-process. You also don't need to redirect output to a tempfile like that; just suppress psql's printing of prompts and headers (see linked answer) and you can simply read the results from psql's stdout.
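For reference, here is a minimal co-process sketch along those lines, reusing the dbname/username from the question and the psql flags from the answer above; buffering and error handling are glossed over, so treat it as an illustration rather than a drop-in script:

#!/bin/bash
# Start psql as a co-process (bash 4+); PSQL[0] is its stdout, PSQL[1] its stdin.
coproc PSQL { psql --quiet --no-align --tuples-only -P footer=off inventory postgres; }
PSQL_COPROC_PID=$PSQL_PID    # save the PID; bash clears it when the co-process exits

# Send a query down psql's stdin and read the single result line back.
echo "SELECT count(*) FROM component;" >&"${PSQL[1]}"
read -r count <&"${PSQL[0]}"
echo "component count: $count"

# Ask psql to quit, then wait for the co-process to finish.
echo '\q' >&"${PSQL[1]}"
wait "$PSQL_COPROC_PID" 2>/dev/null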

Is there a way to automatically e-mail after a long script is finished?

I am trying to run a long bash script overnight to get some data. I wanted to include a script that would automatically e-mail me the files after the scripts are completed. Is there a way to do this using mutt? I want something like below:
sh atoms.sh
sh angles.sh
mutt -a atoms.dat angles.dat -- [e-mail address]
Any takers?
EDIT: If there's any other way to achieve this -- "sending multiple attachments to an e-mail address after the scripts are finished" -- I'd appreciate it very much.
sh atoms.sh
sh angles.sh
mutt -s "data set from atoms.sh" [email address] < ./atom.dat
mutt -s "data set from angles.sh" [email address] < ./angles.dat
This disables mutt's terminal interaction and sends the e-mails after the jobs are finished.
-a file [...]
Attach a file to your message using MIME. To attach multiple files, separating
filenames and recipient addresses with "--" is mandatory, e.g. mutt -a img.jpg
*.png -- addr1 addr2.
( sh atoms.sh; sh angles.sh ) && mutt -s "man mutt" \
-a grab.sh raptor.mpd.ogg.m3u scripts/bussorakel \
-- emailAddress@example.com < /dev/null
Alternatively, you have:
( sh atoms.sh; sh angles.sh ) & FOR=$!
wait $FOR
mutt -s "last command done, sending email" (...)
