I'm trying to determine the number of days since the last .rpm was installed, using Ruby. However, when I try to parse the date with the Date class, I get an invalid date error (ArgumentError) on line 5. I've also tried converting it to an integer, but nothing seems to work. Any thoughts?
require 'date'
require 'time'
tset = `rpm -qa --last | head -1 | rev | cut -d ' ' -f4-6 | rev`
atime = Date.strptime(tset, '%Y-%m-%d')
today = Date.today
btime = Date.strptime(today, '%Y-%m-%d')
daterange = ("btime - atime")
puts daterange
Here is the output of my rpm command
[root@default-centos-67 dev]# rpm -qa --last | head -1 | rev | cut -d ' ' -f4-6 | rev
18 Nov 2016
Your call to strptime isn't using the correct format for the date string you are parsing.
The correct format string is: %d %b %Y
%d is the two-digit (zero-padded) day of the month
%b is the abbreviated month name
%Y is the four-digit year
Note that these fields are space-delimited, not hyphen-delimited.
All together, here's how to parse the date from tset:
Date.strptime(tset, '%d %b %Y')
The result of your rpm command is "18 Nov 2016" and your Date.strptime command is looking for a date with a different format.
Change:
atime = Date.strptime(tset, '%Y-%m-%d')
to
atime = Date.strptime(tset, '%d %b %Y')
I almost always turn to the documentation on strftime to help me with the format directives for parsing Date and DateTime objects.
Related
my log has this date format at the beginning of each line:
2018 Sep 21 17:16:27:796
I need to grep the last 10 minutes of this log... any help?
my current experiments:
tenminutesago=$(date --date='10 minutes ago' +"%Y %b %e %H:%M:%S"):999
My idea was to convert the log format to a progressive number and then check everything greater than that number.
I see that the command date +"%Y %b %e %H:%M:%S" gives a date in the same format as the log, and the command date +"%Y%m%e%H%M%S" gives the date as a progressive number (201810041204019).
You could do
for i in {10..0}; do
d=$(date -d "$i minutes ago" +'%Y %b %e %H:%M')
grep "$d" logfile
done
This just divides the problem into 11 sequential subtasks: getting all lines from 10 minutes ago, all lines from 9 minutes ago, and so on up to the current minute.
Edit:
Here's an alternative solution that prints all lines following the first one whose date stamp falls within the last 10 minutes, not only the lines that carry a date stamp themselves, and it also avoids reading the file from the start several times:
# build a regex pattern that matches any date in the given format from the last 10 minutes
pattern=$(date +'%Y %b %e %H:%M')
for i in {10..1}; do
pattern+=\|$(date -d "$i minutes ago" +'%Y %b %e %H:%M')
done
# print all lines starting from the first one that matches one of the dates in the pattern
awk "/$pattern/,0" logfile
Under the assumption that your log lines look like
YYYY Bbb dd HH:MM:SS:sss Some random log message is here
You can do the following:
awk -v d="$(date -d "10 minutes ago" "+%Y %m %d %T")" '
{ mm = sprintf("%0.2d",(index("JanFebMarAprMayJunJulAugSepOctNovDec",$2)+2)/3)
  s = $1 " " mm " " $3 " " $4 }
(s >= d){print}' logfile
The idea is to convert your date into a sortable format. Note that string comparison does not follow the calendar for month names: "Jan" < "Mar" holds, but so does "Feb" < "Jan". This is handled by converting the month into a two-digit number and then comparing the resulting string against the correctly formatted cutoff date.
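As a standalone illustration of the month-conversion trick used above (a sketch; any awk with index() and printf should do):
echo "Nov" | awk '{ printf "%02d\n", (index("JanFebMarAprMayJunJulAugSepOctNovDec", $1) + 2) / 3 }'
# prints 11, because "Nov" starts at position 31 and (31+2)/3 = 11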
Try your current approach without the seconds and milliseconds.
tenminutesago=$(date --date='10 minutes ago' +"%Y %b %e %H:%M")
It's not exactly the last ten minutes down to the second, but I think it is enough for most cases. That grep will give you the first line in the log within the time window. Now you can take the total number of lines, subtract the line number found by that grep, and then tail the file. The script could look like this:
LOGFILE="filename.log"
tenminutesago=$(date --date='10 minutes ago' +"%Y %b %e %H:%M") # matching pattern
tlines=$(cat $LOGFILE | wc -l) # Total lines in file
let lines=$tlines-$(grep -n "$tenminutesago" $LOGFILE | grep -m 1 -oP "^[0-9]*" || echo $tlines) # lines after matching occurence
echo "$lines lines FOUND from the last X minutes"
tail -n $lines $LOGFILE # last lines in file
As suggested by @Gem Taylor, this could be reduced using the +N option of tail.
LOGFILE="filename.log"
tenminutesago=$(date --date='10 minutes ago' +"%Y %b %e %H:%M") # matching pattern
lines=$(grep -n "$tenminutesago" $LOGFILE | grep -m 1 -oP "^[0-9]*" || echo "0") # lines after matching occurence
echo "$lines lines FOUND from the last X minutes"
[ "$lines" -gt 0 ] && tail -n +$lines $LOGFILE # last lines in file, only if a match was found
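For reference, tail -n +N prints from line N to the end of the file, which is what the second version relies on; a quick sanity check:
seq 10 | tail -n +8
# prints 8, 9 and 10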
I have a text log file, the format is like the following
Thread-28689296: Thu Aug 25 15:18:41 2016 [ info ]: xxxxx xxxxxx xxxxx
So I want to run cron job to find some certain error messages in last a few minutes. I wrote the following command
awk -vDate=`date +%b %d %H:%M:%S %Y` -vDate2=`date --date="2 minutes ago" +%b %d %H:%M:%S %Y` '$5 > Date && $5 < Date2' /var/log/dummy.log | grep "Fatal"
In the above command, I search for messages that have a timestamp between now and 2 minutes ago and that contain the string Fatal.
But I got the following error
date: extra operand '%d'
Try 'date --help' for more information.
date: extra operand '%d'
Try 'date --help' for more information.
If I run the date commands on their own, I get the following results:
date "+%b %d %H:%M:%S %Y"
Aug 25 15:25:01 2016
date --date="2 minutes ago" +"%b %d %H:%M:%S %Y"
Aug 25 15:31:42 2016
So the date commands in my awk script should be okay.
I also want to redirect the error messages found in the last 2 minutes to a file and mail them as an alert, but I did not get that far yet.
Please kindly advise me what is wrong in my awk script. Thanks a lot in advance!
The problem here is with date itself. Let's see how.
You are saying:
-vDate2=`date --date="2 minutes ago" +%b %d %H:%M:%S %Y`
Because you want to use
date --date="2 minutes ago" +%b %d %H:%M:%S %Y
However, if you try to run it you'll see that you get the error:
date: extra operand '%d'
Try 'date --help' for more information.
The problem is that you need to enclose the FORMAT controls within double quotes:
#                             v                  v
$ date --date="2 minutes ago" "+%b %d %H:%M:%S %Y"
Aug 25 14:49:31 2016
When this is done, all together your full awk one-liner can be:
awk -v Date="$(date "+%b %d %H:%M:%S %Y")" \
-v Date2="$(date --date="2 minutes ago" "+%b %d %H:%M:%S %Y")" \
'$5 > Date && $5 < Date2' file
Note I am using -v Date="$(date ...)":
$( ) for command substitution, since backticks ` are all but deprecated, or at least considered legacy.
Date="$( ... )" with the value quoted, to prevent errors if the content has spaces.
-v var=value with a space after -v, since -vvar=value is gawk-specific.
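To cover the second part of the question (mailing the matches as an alert), here is a rough sketch building on the command above. The output path, subject line and address are placeholders, and it assumes a local mail command is available:
awk -v Date="$(date "+%b %d %H:%M:%S %Y")" \
    -v Date2="$(date --date="2 minutes ago" "+%b %d %H:%M:%S %Y")" \
    '$5 > Date && $5 < Date2' /var/log/dummy.log | grep "Fatal" > /tmp/fatal_last_2min.log
# only send mail if something was found
[ -s /tmp/fatal_last_2min.log ] && mail -s "Fatal errors in dummy.log" admin@example.com < /tmp/fatal_last_2min.log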
I have a number of files in the form foo_[SECONDS.MILLISECONDS]_bar.tar.gz, and for each file I would like to get a datetime value (YYYYMMDDHHMMSS).
So far I have
ls -1 /filestore/*.tar.gz | cut -d _ -f 2 | date -f -
But this errors along the lines of
date: invalid date '1467535262.712041352'
How should a bash pipeline of epoch values be converted into a datetime string?
MWE
mkdir tmpBLAH
touch tmpBLAH/foo_1467483118.640314986_bar.tar.gz
touch tmpBLAH/foo_1467535262.712041352_bar.tar.gz
ls -1 tmpBLAH/*.tar.gz | cut -d _ -f 2 | date -f -
To convert an epoch time to a datetime, please try the following command:
date -d @1346338800 +'%Y%m%d%H%M%S'
1346338800 is an epoch time.
For your case, the command line would be as follows:
echo 1467535262.712041352 | cut -d '.' -f 1 | xargs -I{} date -d @{} +'%Y%m%d%H%M%S'
you will get:
20160703174102
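To apply the same conversion to every archive (a sketch, assuming GNU date and the /filestore path from the question):
for f in /filestore/*.tar.gz; do
  echo "$f" | cut -d '_' -f 2 | cut -d '.' -f 1 | xargs -I{} date -d @{} +'%Y%m%d%H%M%S'
done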
Something like this?
for f in /filestore/*.tar.gz; do
epoch=${f#*_}
date -d "@${epoch%%.*}" +%Y%m%d%H%M%S
done
The syntax of the date command differs between platforms; I have assumed GNU date, as commonly found on Linux. (You could probably use date -f if you add the @ before each timestamp, but I am not in a place where I can test this right now.) Running a loop makes some things easier, such as printing both the input file name and the converted date; otherwise, a pipeline would be the most efficient and idiomatic solution.
As an aside, basically never use ls in scripts.
First, the -1 option to ls is useless here: ls already prints its output one file per line when writing to a pipe, and only pretty-prints in columns when the output is a terminal. You can check that fact by running ls | cat.
Then, date converts epoch timestamps safely only if they are prefixed with an @.
% date -d 0
Sun Jul 3 00:00:00 CEST 2016
% LANG=C date -d @0
Thu Jan 1 01:00:00 CET 1970
% date -d 12345
date: invalid date '12345'
% date -d @12345
Thu Jan 1 04:25:45 CET 1970
Which gives:
printf "%s\n" tmpBLAH/foo_*_bar.tar.gz | sed 's/.*foo_/#/; s/_bar.*//' | date -f -
You can do:
for i in foo_*_bar.tar.gz; do date -d "@$(cut -d_ -f2 <<<"$i")" '+%Y%m%d%H%M%S'; done
The epoch time is provided with -d @<time> and the desired output format with '+%Y%m%d%H%M%S'.
Example:
% for i in foo_*_bar.tar.gz; do date -d "@$(cut -d_ -f2 <<<"$i")" '+%Y%m%d%H%M%S'; done
20160703001158
20160703144102
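If you also want to see which file each timestamp belongs to, a small variation of the loop above (a sketch, assuming the same filename layout and GNU date):
for i in foo_*_bar.tar.gz; do
  printf '%s %s\n' "$i" "$(date -d "@$(cut -d_ -f2 <<<"$i")" '+%Y%m%d%H%M%S')"
done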
I am completely new to shell scripting.
I need to change a given date into a customized format. I have a date in a variable in the format MM/DD/YY HH:MM:SS, but I want it in the format MM/DD/YYYY HH:MM:SS, i.e. with a four-digit year.
We can change the system date format, but I need the result back in a variable.
My code is below:
START_DATE="12/20/14 05:59:01"
yr=`echo $START_DATE | cut -d ' ' -f1 | cut -d '/' -f3`
yr_len=`echo $yr | wc -c`
if [ $yr_len -lt 4 ]
then
tmp_yr="20${yr}";
else
1=1;
fi
ln=`echo $tmp_yr|wc -c`
After this I am stuck on putting the date back together in the wanted format.
Can someone please help me?
Regards,
Sai.
Using GNU date:
date -d'02/16/15 09:16:04' "+%m/%d/%Y %T"
produces
02/16/2015 09:16:04
which is what you want. See man date for details about the formatting, or this question for a number of great examples.
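Since the result is wanted back in a variable, here is a minimal sketch applying the same command to the question's START_DATE (assuming GNU date; the variable name START_DATE_FULL is just an example):
START_DATE="12/20/14 05:59:01"
START_DATE_FULL=$(date -d "$START_DATE" "+%m/%d/%Y %T")   # reformat with a four-digit year
echo "$START_DATE_FULL"   # 12/20/2014 05:59:01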
One option may be using the date/time functions inside awk (they require gawk). Here is a one-liner:
echo '02/16/15 09:16:04' | sed 's|[/:]| |g' | awk '{d0=$3+2000FS$1FS$2FS$4FS$5FS$6; d1=mktime(d0); print strftime("%m/%d/%Y %T", d1)}'
output is:
02/16/2015 09:16:04
You can find more strftime formats in https://www.gnu.org/software/gawk/manual/html_node/Time-Functions.html
How would you parse a date in bash, with separate fields (years, months, days, hours, minutes, seconds) into different variables?
The date format is: YYYY-MM-DD hh:mm:ss
Does it have to be bash? You can use the GNU coreutils /bin/date binary for many transformations:
$ date --date="2009-01-02 03:04:05" "+%d %B of %Y at %H:%M and %S seconds"
02 January of 2009 at 03:04 and 05 seconds
This parses the given date and displays it in the chosen format. You can adapt that at will to your needs.
I had a different input time format, so here is a more flexible solution.
Convert dates in BSD/macOS
date -jf in_format [+out_format] in_date
where the formats use strftime (see man strftime).
For the given input format YYYY-MM-DD hh:mm:ss:
$ date -jf '%Y-%m-%d %H:%M:%S' '2017-05-10 13:40:01'
Wed May 10 13:40:01 PDT 2017
To read them into separate variables, I'm taking NVRAM's idea, but allowing you to use any strftime format:
$ date_in='2017-05-10 13:40:01'
$ format='%Y-%m-%d %H:%M:%S'
$ read -r y m d H M S <<< "$(date -jf "$format" '+%Y %m %d %H %M %S' "$date_in")"
$ for var in y m d H M S; do echo "$var=${!var}"; done
y=2017
m=05
d=10
H=13
M=40
S=01
In scripts, always use read -r.
In my case, I wanted to convert between timezones (see your /usr/share/zoneinfo directory for zone names):
$ format=%Y-%m-%dT%H:%M:%S%z
$ TZ=UTC date -jf $format +$format 2017-05-10T02:40:01+0200
2017-05-10T00:40:01+0000
$ TZ=America/Los_Angeles date -jf $format +$format 2017-05-10T02:40:01+0200
2017-05-09T17:40:01-0700
Convert dates in GNU/Linux
On a Mac, you can install the GNU version of date as gdate with brew install coreutils.
date [+out_format] -d in_date
where the out_format uses strftime (see man strftime).
In GNU coreutils' date command, there is no way to explicitly set an input format, since it tries to figure out the input format by itself, and stuff usually just works. (For detail, you can read the manual at coreutils: Date input formats.)
For example:
$ date '+%Y %m %d %H %M %S' -d '2017-05-10 13:40:01'
2017 05 10 13 40 01
To read them into separate variables:
$ read -r y m d H M S <<< "$(date '+%Y %m %d %H %M %S' -d "$date_in")"
To convert between timezones (see your /usr/share/zoneinfo directory for zone names), you can specify TZ="America/Los_Angeles" right in your input string. Note the literal " chars around the zone name, and the space character before in_date:
TZ=out_tz date [+out_format] 'TZ="in_tz" in_date'
For example:
$ format='%Y-%m-%d %H:%M:%S%z'
$ TZ=America/Los_Angeles date +"$format" -d 'TZ="UTC" 2017-05-10 02:40:01'
2017-05-09 19:40:01-0700
$ TZ=UTC date +"$format" -d 'TZ="America/Los_Angeles" 2017-05-09 19:40:01'
2017-05-10 02:40:01+0000
GNU date also understands hour offsets for the time zone:
$ TZ=UTC date +"$format" -d '2017-05-09 19:40:01-0700'
2017-05-10 02:40:01+0000
This is simple: just convert the dashes and colons to spaces (no need to change IFS) and use read, all on one line:
read Y M D h m s <<< ${date//[-:]/ }
For example:
$ date=$(date +'%Y-%m-%d %H:%M:%S')
$ read Y M D h m s <<< ${date//[-: ]/ }
$ echo "Y=$Y, m=$m"
Y=2009, m=57
$ t='2009-12-03 12:38:15'
$ a=(`echo $t | sed -e 's/[:-]/ /g'`)
$ echo ${a[*]}
2009 12 03 12 38 15
$ echo ${a[3]}
12
The array method is perhaps better, but this is what you were specifically asking for:
IFS=" :-"
read year month day hour minute second < <(echo "YYYY-MM-DD hh:mm:ss")
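A small variation keeps the IFS change local to the read command instead of altering it for the rest of the script (a sketch using a here-string with the question's format):
IFS=" :-" read year month day hour minute second <<< "2009-12-03 15:35:11"
echo "$year $month $day $hour $minute $second"   # 2009 12 03 15 35 11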
Pure Bash:
date="2009-12-03 15:35:11"
saveIFS="$IFS"
IFS="- :"
date=($date)
IFS="$saveIFS"
for field in "${date[@]}"
do
echo $field
done
2009
12
03
15
35
11
Instead of splitting in the shell, you can incorporate date calls in your script itself, like below, wherever you need them:
a=$(date +%Y)
b=$(date +%S)
c=$(date +%H)
a will be the year,
b will be the seconds,
c will be the hour, and so on.
Another solution to the OP's problem:
IFS=' -:' read Y M D h m s <<< '2014-03-26 16:36:41'
Converting a date to another format with BSD date and GNU date:
$ LC_ALL=C date -jf '%a %b %e %H:%M:%S %Z %Y' 'Wed Mar 26 16:36:41 EET 2014' +%F\ %T
2014-03-26 16:36:41
$ gdate -d 'Wed Mar 26 16:36:41 EET 2014' +%F\ %T
2014-03-26 16:36:41
GNU date recognizes Wed and Mar even in non-English locales but BSD date doesn't.
Converting seconds since epoch to a date and time with GNU date and BSD date:
$ gdate -d @1234567890 '+%F %T'
2009-02-14 01:31:30
$ date -r 1234567890 '+%F %T'
2009-02-14 01:31:30
Converting seconds to hours, minutes, and seconds with a POSIX shell, POSIX awk, GNU date, and BSD date:
$ s=12345;printf '%02d:%02d:%02d\n' $((s/3600)) $((s%3600/60)) $((s%60))
03:25:45
$ echo 12345|awk '{printf "%02d:%02d:%02d\n",$0/3600,$0%3600/60,$0%60}'
03:25:45
$ gdate -d @12345 +%T
05:25:45
$ date -r 12345 +%T
05:25:45
(The date-based commands print the local wall-clock time for epoch 12345, here in a UTC+2 timezone, which is why they differ from the 03:25:45 duration computed above.)
Converting seconds to days, hours, minutes, and seconds:
$ t=12345678
$ printf '%d:%02d:%02d:%02d\n' $((t/86400)) $((t/3600%24)) $((t/60%60)) $((t%60))
142:21:21:18
another pure bash
$ d="2009-12-03 15:35:11"
$ d=${d//[- :]/|}
$ IFS="|"
$ set -- $d
$ echo $1
2009
$ echo $2
12
$ echo $@
2009 12 03 15 35 11
have you tried using cut?
something like this:
dayofweek=$(date | cut -d" " -f1)
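Applied to the question's YYYY-MM-DD hh:mm:ss format, a cut-based sketch could look like this (the variable names are just examples):
ts="2009-12-03 15:35:11"
year=$(echo "$ts"  | cut -d- -f1)
month=$(echo "$ts" | cut -d- -f2)
day=$(echo "$ts"   | cut -d- -f3 | cut -d' ' -f1)
hour=$(echo "$ts"   | cut -d' ' -f2 | cut -d: -f1)
minute=$(echo "$ts" | cut -d' ' -f2 | cut -d: -f2)
second=$(echo "$ts" | cut -d' ' -f2 | cut -d: -f3)
echo "$year $month $day $hour $minute $second"   # 2009 12 03 15 35 11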