Get the time difference in seconds? - bash

I have the outputs below and I need to get the time difference in seconds.
------------------------------
Wed Nov 23 15:09:20 2016
------------------------------
Wed Nov 23 15:27:47 2016
------------------------------
Generally the month will be the same in both lines, and so will the year, so we can ignore them. The day of week and the day of month may differ, and the difference will certainly involve seconds and minutes, possibly hours as well.
I tried some awk and cut invocations splitting on :, but I am still having issues.
Thanks in advance!
Any help appreciated!

My first perl script ever :
# extract two dates and calculate the difference in seconds
# http://stackoverflow.com/questions/40781429/get-the-time-difference-in-seconds/
#
# usage: grep -e "20[0-2][0-9]" time_diff.txt | perl time_difference.pl
use strict;
use warnings;
use Date::Parse;

my $date_str1 = <STDIN>;
my $date_str2 = <STDIN>;
my $date1 = str2time($date_str1);
my $date2 = str2time($date_str2);
print $date2 - $date1, "\n";
Too bad you cannot use date -d; I was proud of this one-liner:
cat time_diff.txt | grep -e "20[0-2][0-9]" | xargs -i date -d{} +%s | (read -d "\n" t1 t2; echo $t2-$t1 | bc)
Tested with bash and zsh on Linux Mint 17.3
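For completeness, on a system that does have GNU date the whole thing is a three-liner; the timestamps below are the ones from the question:

```shell
# parse both timestamps into epoch seconds with GNU date, then subtract
t1=$(date -d 'Wed Nov 23 15:09:20 2016' +%s)
t2=$(date -d 'Wed Nov 23 15:27:47 2016' +%s)
diff=$(( t2 - t1 ))
echo "$diff"    # 1107 (18 minutes 27 seconds)
```

Since both timestamps are parsed in the same timezone, the difference is timezone-independent.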

Related

Bash convert a number of epoch values to datetime

I have a number of files of the form foo_[SECONDS.MILLISECONDS]_bar.tar.gz, and for each file I would like to get a datetime value (YYYYMMDDHHMMSS).
So far I have
ls -1 /filestore/*.tar.gz | cut -d _ -f 2 | date -f -
But this errors along the lines of
date: invalid date '1467535262.712041352'
How should a bash pipeline of epoch values be converted into a datetime string?
MWE
mkdir tmpBLAH
touch tmpBLAH/foo_1467483118.640314986_bar.tar.gz
touch tmpBLAH/foo_1467535262.712041352_bar.tar.gz
ls -1 tmpBLAH/*.tar.gz | cut -d _ -f 2 | date -f -
To convert an epoch time to a datetime, please try the following command:
date -d @1346338800 +'%Y%m%d%H%M%S'
1346338800 is an epoch time.
For your case, the command line would be:
echo 1467535262.712041352 | cut -d '.' -f 1 | xargs -I{} date -d @{} +'%Y%m%d%H%M%S'
you will get:
20160703174102
Something like this?
for f in /filestore/*.tar.gz; do
epoch=${f#*_}
date -d "@${epoch%%.*}" +%Y%m%d%H%M%S
done
The syntax of the date command differs between platforms; I have assumed GNU date, as commonly found on Linux. (You could probably use date -f if you add the @ before each timestamp, but I am not in a place where I can test this right now.) A loop makes some things easier, such as printing both the input file name and the converted date; otherwise a pipeline would be the most efficient and idiomatic solution.
As an aside, basically never use ls in scripts.
First, the -1 option to ls is useless, because ls prints its output one file per line by default, it's just that when the output is a terminal (not a pipe), it pretty-prints in columns. You can check that fact by just running ls | cat.
Then, date converts epoch timestamps safely only if they are prefixed with @.
% date -d 0
Sun Jul 3 00:00:00 CEST 2016
% LANG=C date -d @0
Thu Jan 1 01:00:00 CET 1970
% date -d 12345
date: invalid date '12345'
% date -d @12345
Thu Jan 1 04:25:45 CET 1970
Which gives:
printf "%s\n" tmpBLAH/foo_*_bar.tar.gz | sed 's/.*foo_/@/; s/_bar.*//' | date -f -
You can do:
for i in foo_*_bar.tar.gz; do date -d "@$(cut -d_ -f2 <<<"$i")" '+%Y%m%d%H%M%S'; done
The epoch time is provided with -d @<time> and the desired format with '+%Y%m%d%H%M%S'.
Example:
% for i in foo_*_bar.tar.gz; do date -d "@$(cut -d_ -f2 <<<"$i")" '+%Y%m%d%H%M%S'; done
20160703001158
20160703144102
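One caveat about the sample outputs above: they depend on each answerer's local timezone. Passing -u makes the result reproducible; a quick sketch with one of the question's timestamps:

```shell
epoch=1467535262.712041352
# strip the fractional part with parameter expansion, prefix @, format in UTC
out=$(date -u -d "@${epoch%%.*}" +%Y%m%d%H%M%S)
echo "$out"    # 20160703084102
```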

Calculating times using bash variables

I have a time in a variable
time1=14.25
is there any way i can take 30 minutes away from it ? the variable could be any time in 24hour format ?
Assuming your format is HH.MM, then with GNU date:
$ time1=14.25
$ date -d "$(tr . : <<< "$time1") 30 min ago" +%H.%M
13.55
I can't think of a pure-bash way.
You could try dc:
time2=`echo "$time1 0.30 - p" | dc`
or bc:
time2=`echo "$time1-0.30" | bc`
Beware, though: both of these do plain decimal arithmetic on HH.MM, not time arithmetic, so for the example above they give 13.95 rather than 13.55; they are only correct when the subtraction does not cross an hour boundary.
Will that do for you?
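If you want a pure-bash version that does real time arithmetic (no dc or bc, and it wraps past midnight), a sketch; the variable names are just for illustration:

```shell
time1=14.25
h=${time1%.*} m=${time1#*.}                    # split HH.MM on the dot
total=$(( 10#$h * 60 + 10#$m - 30 ))           # minutes since midnight, minus 30
(( total < 0 )) && total=$(( total + 1440 ))   # wrap past midnight
time2=$(printf '%02d.%02d' $(( total / 60 )) $(( total % 60 )))
echo "$time2"    # 13.55
```

The 10# prefix forces base-10 arithmetic so values like 08 or 09 are not misread as octal.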

How can I convert a logfile entry date into a date in the future in bash

I'm sure this answer is obvious but I'm banging my head on it and getting a headache and my Search Foo is failing me…
I have a log file with this date format:
Sep 1 16:55:00 stuff happening
Sep 1 16:55:01 THIS IS THE LINE YOU WANT at this time stamp
Sep 1 16:55:02 more stuff
Sep 1 16:55:02 THIS IS THE LINE YOU WANT at this time stamp
Sep 1 16:55:03 blah
Sep 1 16:55:04 blah and so on…..
My ultimate goal is to:
Find the last line in the log file with a given string eg: "THIS IS THE LINE…" this is my "magic time" that I will do calculations on later.
Take the date of that line and set a variable that is the date +NN seconds. The time in the future will usually just short of 24hrs in the future from the time in step 1 so crossing into the next day may happen if that is important.
At some point in the script, advance the system clock to the new date/time after which I will be checking for certain events to fire.
I know this is way wrong but so far I have figured out how to:
Grab the last date stamp for my event.
logDate=$(grep "THIS IS THE LINE" /logdir/my.log | tail -1 | cut -f1,2,3 -d" ")
Returns: Sep 1 16:55:02
Convert the date into a more usable format
logDate2="$(date -d "$logDate" +"%m-%d %H:%M:%S")"; echo $logDate2
Returns: 09-17 16:55:02
I'm stuck here - what I want is:
futuredate=$logdate2 + XXXSeconds
Could someone help me with the time calculation or perhaps point out a better way to do all of this?
Thanks.
I'm stuck here - what I want is:
futuredate=$logdate2 + XXXSeconds
You can do it by converting through timestamps:
# convert log date to timestamp
logts="$(date -d "$logDate" '+%s')"
# add timestamp with seconds
futurets=$(( logts + XXXSeconds ))
# get date based from timestamp, optionally you can add a format.
futuredate=$(date -d "@${futurets}")
# Get the time in seconds since the epoch (1970-01-01 00:00:00 UTC)
dateinseconds=$(date +"%s" -d "$(grep "THIS IS THE LINE" logfile | tail -1 | awk '{print $1, $2, $3}')")
# You can also use awk alone, without grep and tail, to keep the last matching date
dateinseconds=$(date +"%s" -d "$(awk '/THIS IS THE LINE/{d=$1" "$2" "$3} END{print d}' logfile)")
gotofuture=$(( dateinseconds + 2345 )) # Add 2345 seconds
newdate=$(date -d "@${gotofuture}")
echo "$newdate"
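To make the whole flow concrete, here is a self-contained sketch on a fabricated two-line log. The year is pinned and -u is used so the arithmetic is reproducible (real syslog lines carry no year, in which case GNU date assumes the current one):

```shell
# hypothetical two-line log; the year is pinned for reproducibility
log='Sep 1 16:55:00 2016 stuff happening
Sep 1 16:55:02 2016 THIS IS THE LINE YOU WANT'

logDate=$(grep "THIS IS THE LINE" <<<"$log" | tail -1 | cut -d' ' -f1-4)
logts=$(date -u -d "$logDate" +%s)                     # epoch seconds
futuredate=$(date -u -d "@$(( logts + 3600 ))" '+%Y-%m-%d %H:%M:%S')
echo "$futuredate"    # 2016-09-01 17:55:02
```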

pull last 5 minutes of syslog data (750mb) with tac combo sed/awk/grep/?

Trying to pull the last 5 minutes of logs (with grep matches),
so I do a tac syslog.log | sed using date -d "5 minutes ago".
Every line of the log uses this format:
Jun 14 14:03:58
Jul 3 08:04:35
so I really want to get the chunk of data from
Jul 4 08:12
to
Jul 4 08:17
I tried this method and it KINDA works (though it still matches every day on which 08:12: through 08:17: appears):
e=""
for (( i = 5; i >= 0; i-- ))
do
e='-e /'`date +\%R -d "-$i min"`':/p '$e;
done
tac /var/log/syslog.log | sed -n $e
e=""
for (( i = 5; i >= 0; i-- ))
do
if [[ -z $e ]]
then e=`date +\%R -d "-$i min"`
else e=$e'\|'`date +\%R -d "-$i min"`
fi
done
re=' \('$e'\):'
tac /var/log/syslog.log | sed -n -e "/$re/p" -e "/$re/!q"
This creates a single regular expression listing all the times from the last 5 minutes, connected with \|. It prints the lines that matches them. Then it uses the ! modifier to quit on the first line that doesn't match the RE.
If you know the format of the dates then why not do:
tac syslog.log | awk '/Jul 4 08:17/,/Jul 4 08:12/ { print } /Jul 4 08:11/ {exit}'
/ .. /,/ .. / is a regex range: it prints everything within the range. So as soon as /Jul 4 08:11/ appears on a line, your 5-minute window has been captured and you stop reading the file.
So it didn't really work with the above method, but I think I got it to work:
I added a RANGE for the {exit}:
awk '/'"$dtnow"'/,/'"$dt6min"'/ { print } /'"$dt7min"'/,/'"$dt11min"'/ {exit}'
Seems to work; I'm testing it again.
OK, it finally looks like it really works this time (it exits after the hour), using sed instead of awk; I got it to work after running through some tests:
tac /var/log/syslog.log | sed -e "$( date -d '-1 hour -6 minutes' '+/^%b %e %H:/q;'
date -d '-1 day -6 minutes' '+/^%b %e /q;'
date -d '-1 month -6 minutes' '+/^%b /q;'
for ((o=0;o<=5;o++)) do date -d "-$o minutes" '+/^%b %e %R:/p;'; done ; echo d)"
This works if log entries begin like "May 14 11:41" (strftime's %d zero-pads single-digit days while syslog space-pads them, so two-digit days are the safe case). The variable LASTMINUTES sets the last n minutes to keep:
awk 'BEGIN{ LASTMINUTES=30; for (L=0;L<=LASTMINUTES;L++) TAB[strftime("%b %d %H:%M",systime()-L*60)] } { if (substr($0,1,12) in TAB) print $0 }' log
To run the above script you need gawk which can be installed by:
apt-get install gawk
or
yum install gawk
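If gawk is not available, the same table-of-minutes idea works with any awk by precomputing the keys with GNU date in the shell. A sketch against a fabricated two-line log (the 10-minutes-ago "stale" line is there just to show it gets filtered out):

```shell
lastmin=5

# fabricated sample log: one line stamped now, one stamped 10 minutes ago
logfile=$(mktemp)
printf '%s current event\n%s stale event\n' \
    "$(date '+%b %e %H:%M:%S')" \
    "$(date -d '-10 min' '+%b %e %H:%M:%S')" > "$logfile"

# one "%b %-d %H:%M" key per minute of the window (GNU date; %-d is unpadded,
# matching awk's whitespace-collapsed $2)
keys=$(for ((i=0; i<=lastmin; i++)); do date -d "-$i min" '+%b %-d %H:%M'; done)

# plain awk: normalize each line's prefix and test membership in the key table
recent=$(awk -v keys="$keys" '
  BEGIN { n = split(keys, K, "\n"); for (i = 1; i <= n; i++) T[K[i]] }
  { k = $1 " " $2 " " substr($3, 1, 5); if (k in T) print }
' "$logfile")

echo "$recent"
rm -f "$logfile"
```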

Show past X days /var/log/messages using Bash?

In bash I can't think of a good way to do this, but I only want to see the past 30 days of entries in /var/log/messages*. The issue to me is how to do that with just the month and day. For example:
Sep 2 14:26:13 <SOME ENTRY>
Sep 4 14:26:13 <SOME ENTRY>
Sep 9 14:26:13 <SOME ENTRY>
Sep 14 14:26:13 <SOME ENTRY>
etc..
Any ideas ? HELP! ha ha
I think this is close. This will give you a sorted list of entries (most recent first) through the start of August. Depending on when you run it, it will give you as much as ~60 days instead of 30. On average, I suppose it would give you about 45. The other downside is that you need to adjust the grep statement at the end of the pipe as the date advances.
sort -k1Mr -k2nr <file> | grep -E "Aug|Sep"
a little late but...
egrep "^$(date -d '-2 days' '+%b %e')" /var/log/messages
-- This works, but it's ugly --
-- It prints only the matches for each date in the loop iteration (i.e. the last X days):
for (( i=0; i<=${MAXSEARCHDAYS}; i++)) ;do
egrep $(date --date "now -${i} days" +%b) ${USBFOUND} | grep $(date --date "now -${i} days" +%e) >> ${TEMPFILE}
done
sort -k1,1M -k2,2n ${TEMPFILE} | uniq >> ${LOGFILE}
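A per-day variant of the same loop idea, sketched against a fabricated file (assumes GNU date; %e space-pads single-digit days, matching syslog's "Sep  2" style, so the anchored pattern lines up):

```shell
days=30

# fabricated sample: one line stamped now, one in a different (non-syslog) format
logfile=$(mktemp)
printf '%s recent entry\n2016-07-04 08:12:00 old entry\n' \
    "$(date '+%b %e %H:%M:%S')" > "$logfile"

found=""
for (( i = 0; i < days; i++ )); do
  # one anchored "^%b %e" pattern per day; "|| true" because most days match nothing
  found+=$(grep "^$(date -d "-$i days" '+%b %e')" "$logfile" || true)
done

echo "$found"
rm -f "$logfile"
```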
