I have a backup script that runs every 2 weeks via cron on my macOS High Sierra machine.
That part works, and now I want to mail the log to myself using these 2 lines:
df -Ph /Volumes/USB_Storage >> "/Users/ralphschipper/Documents/Logs/rsync"$(date +"%Y-%m-%d").log
cat "/Users/ralphschipper/Documents/Logs/rsync"$(date +"%Y-%m-%d").log | /usr/bin/mail -s "Backuplog" user@gmail.com
The thing is: my backup starts at 10:00 pm on September 15, so the logfile is created on the 15th.
The backup finished at 1:00 am on September 16, so a new logfile is created.
In the end the mail was sent using the logfile that contains the df output from the 16th.
Does anyone know how to fix this?
Can I create a variable at the beginning of the process that stores the current date and use that?
Or can I send a single mail that contains both the logfile and the df results?
Regards,
Ralph
Store the date you want to use (and do the same with the complete filename).
backupdate=$(date +"%Y-%m-%d")
backupfile="/Users/ralphschipper/Documents/Logs/rsync${backupdate}.log"
df -Ph /Volumes/USB_Storage >> "${backupfile}"
cat "${backupfile}" | /usr/bin/mail -s "Backuplog of ${backupdate}" user#gmail.com
I'm working on a ksh script, to be put on a job scheduler and run hourly, that retrieves a file every hour via sftp. The script must navigate to a folder with yesterday's date (remote/path/yyyymmdd/). The filename also has yesterday's date and a timestamp (filename_yyyymmdd_hhmmss.dat). Since the job will be scheduled, my script has to use the previous hour; e.g. if the job runs at 11:02, the file to retrieve would be filename_yyyymmdd_10mmss.dat. The minutes and seconds will always be the same, e.g. 4949. There will be multiple files in the remote directory and I only want to retrieve the latest one, so that our jobs don't process multiple input files. The remote directory also has other files being created regularly, so I can't just retrieve the most recently modified files.
I have variables that return yesterday's date and the previous hour, but the sftp command isn't getting the full filename and isn't retrieving the file. I've tried concatenating the variables, using brackets, quotes and parentheses, assigning multiple variables to a single variable, and exporting the variables.
vdate=$(TZ=bb24 date '+%Y%m%d')
vhour=$(date '+%H')
prevhour=$((vhour - 1))
sftp user@host << EOF
lcd /my/dir/
cd /remote/path/$vdate/
get filename_$vdate_$prevhour*.dat
bye
EOF
exit
When running the script, the file cannot be found and the full filename isn't returned:
File "/remote/path/20190411/filename_20190411" not found.
instead of
File "/remote/path/20190411/filename_20190411_10*.dat" not found.
Every combination of variables that I try returns the same "not found", with the name ending after filename_$vdate.
I've tried some other combinations but always get the same not found:
newvar=${vdate}_${prevhour}
get filename_$newvar*.dat
and
newvar=${vdate}\\_${prevhour}
get filename_$newvar*.dat
File "/remote/path/20190411/filename_20190411" not found.
You have a problem in your script at prevhour=$((vhour - 1)):
this way a text value like 02 becomes 1 (not 01) after the subtraction, so it will match undesired files, or even none at all, since 00 - 1 is -1.
[edvin]$ vdate=$(TZ=bb24 date '+%Y%m%d')
[edvin]$ vhour=$(date '+%H')
[edvin]$ prevhour=$((vhour - 1))
[edvin]$ echo $vhour
03
[edvin]$ echo $prevhour
2
[edvin]$ prevhour=$(date -d '1 hour ago' '+%H')
[edvin]$ echo $prevhour
02
date's -d option is not available on some systems.
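If -d is not available (for instance with the BSD date on macOS), a rough equivalent, as a sketch outside the original answer, is the -v adjustment flag:
prevhour=$(date -v-1H '+%H')        # hour of one hour ago, still zero-padded
vtime=$(date -v-1H '+%Y%m%d_%H')    # date and hour of one hour ago in one go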
I believe that in your attempt the shell treated part of the surrounding text as belonging to the variable name, because you did not put the variables into {}, which separates them from the surrounding text.
This is my working solution, based on your attempt:
#!/bin/ksh
r_host='server2'
r_user='edvin'
l_dir='./content'
r_dir='./Test_folder'
# this still does not cover the midnight case:
# it turns 00 into 23, but the day would have to be yesterday as well
##vdate=$(TZ=bb24 date '+%Y%m%d')
##vhour=$(date '+%H') # not used
##prevhour=$(date -d '1 hour ago' '+%H')
# vtime = YYYYmmdd_HH, shifted back one hour
vtime=$(TZ=bb24 date -d '1 hour ago' '+%Y%m%d_%H')
sftp ${r_user}@${r_host} << EOF
lcd ${l_dir}
cd ${r_dir}
get filename_${vtime}*.dat
bye
EOF
exit
Output:
[edvin]$ ./script.ksh
Connected to server2.
sftp> lcd ./content
sftp> cd ./Test_folder
sftp> get filename_20190415_02*.dat
Fetching /home/edvin/Test_folder/filename_20190415_020000.dat to filename_20190415_020000.dat
Fetching /home/edvin/Test_folder/filename_20190415_020100.dat to filename_20190415_020100.dat
Fetching /home/edvin/Test_folder/filename_20190415_020200.dat to filename_20190415_020200.dat
Fetching /home/edvin/Test_folder/filename_20190415_020300.dat to filename_20190415_020300.dat
Fetching /home/edvin/Test_folder/filename_20190415_020400.dat to filename_20190415_020400.dat
Fetching /home/edvin/Test_folder/filename_20190415_020500.dat to filename_20190415_020500.dat
Fetching /home/edvin/Test_folder/filename_20190415_020600.dat to filename_20190415_020600.dat
Fetching /home/edvin/Test_folder/filename_20190415_020700.dat to filename_20190415_020700.dat
Fetching /home/edvin/Test_folder/filename_20190415_020800.dat to filename_20190415_020800.dat
Fetching /home/edvin/Test_folder/filename_20190415_020900.dat to filename_20190415_020900.dat
Fetching /home/edvin/Test_folder/filename_20190415_021000.dat to filename_20190415_021000.dat
sftp> bye
There are still many things that can go wrong with this solution:
if the remote directory does not exist or is not accessible, the script will still go on with the rest of the commands, and the same goes for the local directory and the files. The connection can also run into various problems you might want to handle. Since you want to schedule it, you may also need a way to avoid the script being spawned over and over again if a previous run is still going.
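For the last point, one simple guard is a lock directory around the transfer (a sketch; the lock path is made up for illustration):
lock=/tmp/sftp_fetch.lock
if ! mkdir "$lock" 2>/dev/null; then
    echo "previous run still active, exiting" >&2
    exit 1
fi
trap 'rmdir "$lock"' EXIT   # release the lock when the script exits
# ... run the sftp transfer here ...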
scp would be the preferred way to do this, since you are using passwordless authentication.
If scp is not an option for some reason, this can be handled quite well with expect.
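With key-based authentication in place, the whole transfer could, for example, collapse into a single scp call reusing the variables from the script above (a sketch; note the quotes so the remote glob is not expanded by the local shell):
scp "${r_user}@${r_host}:${r_dir}/filename_${vtime}*.dat" "${l_dir}/"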
I have a requirement to read a splunk log file for certain parameters and use that data to update an Oracle 11g DB table once those parameters are found.
For example:
Splunk log file name is: app.log
The input parameters in the log file would look like:
[timestamp] amount=100,name=xyz,time=19 May 2018 13:45 PM
The output from the shell script should be: amount is read into a variable and 100 is assigned to it. This value, 100, should be stored in an Oracle DB table.
I may have to use an awk script for this. I'm not sure how to approach it, as I am new to shell scripting.
tail -f|egrep -wi 'amount' /apps/JBoss/log/app.log
Commands like this don't seem to be working.
You can easily capture such values using a Perl regex.
amt=$(perl -ne 'print "$1\n" if /amount=(\d+)/' /apps/JBoss/log/app.log)
If you want to use pure shell commands,
amt=$(grep amount app.log| cut -f1 -d',' | cut -f2 -d '=')
You can then use this variable in an insert query run through sqlplus:
sqlplus -s USER/PWD<<SQL
INSERT INTO yourtable(column_name) VALUES(${amt});
commit;
exit
SQL
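Since the question mentions awk, an equivalent one-liner with awk could look like this (a sketch, assuming the sample line format shown above):
# split each line on "amount=", then keep everything up to the next comma
amt=$(awk -F'amount=' 'NF > 1 { split($2, a, ","); print a[1] }' /apps/JBoss/log/app.log)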
The DB Connect app may do the job for you. See http://docs.splunk.com/Documentation/DBX/3.1.3/DeployDBX/Createandmanagedatabaseoutputs.
For an input file (app.log) like:
[timestamp] amount=100,name=xyz,time=19 May 2018 13:45 PM
[timestamp] amount=150,name=xyz,time=19 May 2018 13:45 PM
[timestamp] amount=200,name=xyz,time=19 May 2018 13:45 PM
you could use grep's -P flag (PCRE):
arr=($(grep -oP "(?<=amount=)\d+" app.log))
This will store the values of amount in an array arr. Output:
echo ${arr[@]}
100 150 200
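To get each captured amount into the database, one could then loop over the array and reuse the sqlplus heredoc idea from the earlier answer (a sketch; the table and column names are placeholders):
for amt in "${arr[@]}"; do
    sqlplus -s USER/PWD <<SQL
INSERT INTO yourtable(column_name) VALUES(${amt});
commit;
exit
SQL
done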
I have a script:
-rwx------. 1 root root 135 Oct 15 12:00 /backup/purge.sh
#!/bin/bash
volume=`echo "list volumes" | bconsole|grep -i "Append\|Full"|awk '{print $4}'`
echo "purge volume=$volume yes" | bconsole
If I run it manually, it works.
If I put the script in crontab it doesn't do its job, although the log says it ran.
Oct 15 16:07:01 sdfdsfdsf CROND[36326]: (root) CMD (/backup/purge.sh)
The schedule:
07 16 * * * /backup/purge.sh
If I run manually:
/backup/purge.sh
Connecting to Director weewr:9101
1000 OK: 1 werewrewrewr Version: 7.0.5 (28 July 2014)
Enter a period to cancel a command.
purge volume=Vol-0001 yes
This command can be DANGEROUS!!!
It purges (deletes) all Files from a Job,
JobId, Client or Volume; or it purges (deletes)
all Jobs from a Client or Volume without regard
to retention periods. Normally you should use the
PRUNE command, which respects retention periods.
Automatically selected Catalog: MyCatalog
Using Catalog "MyCatalog"
1 File on Volume "Vol-0001" purged from catalog.
There are no more Jobs associated with Volume "Vol-0001". Marking it purged.
bconsole wasn't in the PATH, so I used the full path for the bconsole command, like this:
#!/bin/bash
volume=`echo "list volumes" | /sbin/bconsole|grep -i "Append\|Full"|awk '{print $4}'`
echo "purge volume=$volume yes" | /sbin/bconsole
Found the Solution !!!!
After a gob of Googling, I found this in a forum, from a person asking "How to: Add or display today's date from a shell script".
This is what I did:
I added the following to the beginning of my ftp script:
#!/bin/bash
TODAY=$(date)
HOST=$(hostname)
echo "--------------------------------------------"
echo "This script was run: $TODAY ON HOST:$HOST "
echo "--------------------------------------------"
# below is original code minus the #!/bin/sh
#
cd /folder where csv files are/
ftp -v -i -n 111.222.333.444 <<EOF
user mainuser dbuser
mput phas*.csv
bye
EOF
Now my log, on each cron run of the ftp, shows:
This script was run: Tue Nov 12 11:16:02 EST 2013 ON MyServer's HostName>
On the crontab, I changed the logging entry to use two > characters (>>) so the log is appended and not overwritten:
16 11 * * * /srv/phonedialer_tmp/ftp-date.sh &>> /srv/phonedialer_tmp/ftp-date.log
I found a way to create a log file of daily ftp's by searching here:
./ftp_csv.sh 2>&1 > ftp_csv.log
It works great in that it records each time the cron job runs. However, what I cannot find is a way to insert the date/time of each event. As you can see below, it records the transferring of the files.
Is there a way I can somehow add a date/timestamp to the beginning or end of each recorded event within the log file?
[stevek@localhost phonedialer_tmp]$ cat ftp_csv.log
Connected to 1.2.3.4 (1.2.3.4).
220 Microsoft FTP Service
331 Password required for mainuser.
230 User mainuser logged in.
221
Connected to 1.2.3.4 (1.2.3.4).
220 Microsoft FTP Service
331 Password required for mainuser.
230 User mainuser logged in.
221
Connected to 1.2.3.4 (1.2.3.4).
220 Microsoft FTP Service
331 Password required for mainuser.
230 User mainuser logged in.
221 ETC
Thanks so much for any information
I am trying to shift the dates of a series of files by 9 hours. I've gotten as far as this:
for i in *.MOV; do touch -r "$i" -d "-9 hours" "$i"; done
This should work on recent systems, but the touch command in OS X seems to be a bit outdated and does not support the -d switch.
I'm using Snow Leopard. Any idea on the best option for doing this with a single line command? I don't want to create a script for this.
OK, sorted it out. There is a gtouch command that knows the -d switch; it's part of GNU coreutils. See the comments below for information regarding availability on specific macOS versions.
For more information on using relative dates with the -d switch see the manual.
Looking at the Wikipedia page for touch, it appears you're accustomed to the GNU version of touch, which macOS isn't using.
For what you want to do, look into the SetFile command, which gets installed with the Xcode tools. It has -d and -m options, which reset the created and modified dates and times respectively.
http://developer.apple.com/library/mac/#documentation/Darwin/Reference/ManPages/man1/SetFile.1.html
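For example, setting the modified or created date explicitly would look roughly like this (a sketch; the exact date format is documented in the man page linked above):
SetFile -m "05/12/2010 01:00:00" myfile.MOV   # modification date/time
SetFile -d "05/12/2010 01:00:00" myfile.MOV   # creation date/time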
I don't know OS X, but it should be easy enough to:
get the current timestamp of the file
convert it to seconds
subtract 9 hours (9*60*60 seconds) from it
convert it back to the format accepted by touch's -t option
run the touch command
All of this can of course be done in a single for loop on the command line, for example as sketched below.
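On a BSD userland such as OS X, that loop might look roughly like this (a sketch; it assumes BSD stat -f %m for the epoch mtime and BSD date -r for the formatting, and touch -t resets both the access and modification times):
for i in *.MOV; do
    m=$(stat -f %m "$i")                              # current mtime in seconds since the epoch
    new=$((m - 9*60*60))                              # shift back by 9 hours
    touch -t "$(date -r "$new" +%Y%m%d%H%M.%S)" "$i"  # re-apply in touch -t format
done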
Here are simple examples from Wikipedia showing the back-and-forth conversion.
# To convert a specific time stamp to Unix epoch time (seconds since 1970-01-01):
date +"%s" -d "Fri Apr 24 13:14:39 CDT 2009"
# 1240596879
# To convert Unix epoch time (seconds since 1970-01-01) to a human readable format:
date -d "UTC 1970-01-01 1240596879 secs"
# Fri Apr 24 13:14:39 CDT 2009
# Or:
date -ud @1000000000
# Sun Sep 9 01:46:40 UTC 2001
# or: Haven't tested this but should work..
date -d @1000000000 +%y%m%d%H%M%S
# 010909014640