Date filter in Logstash: bad results (shows one day back) - elasticsearch

I am loading a CSV file into Elasticsearch using Logstash.
The CSV file contains a column 'deadline' which holds dates in this format:
"deadline": "15-06-2014"
I am using the date filter plugin in Logstash to index this as a date in Elasticsearch:
date {
  match => ["deadline", "dd-MM-yyyy"]
  target => "deadline_date"
}
But in the output the date has moved back one day:
"deadline_date": "2014-06-14T18:30:00.000Z"
I have one more field with the format "dd-MM-yyyy HH:mm":
date {
  match => ["launched", "dd-MM-yyyy HH:mm"]
  target => "launched_date"
}
This gives a result with the time changed:
"launched": "09-09-2013 18:19"
"launched_date": "2013-09-09T12:49:00.000Z"
Please help me figure out this issue.

You're missing your timezone:
date {
  match => ["deadline", "dd-MM-yyyy"]
  target => "deadline_date"
  timezone => "Etc/GMT"
}
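For context: both shifts in the question are exactly 5 hours 30 minutes, i.e. UTC+05:30, so the date filter is interpreting the input as the server's local time (apparently Indian Standard Time) and converting to UTC. If the CSV dates really are local Indian times and you want that conversion pinned down rather than dependent on the host's timezone, a sketch using "Asia/Kolkata" (my assumption about your zone) would be:
date {
  match => ["deadline", "dd-MM-yyyy"]
  target => "deadline_date"
  # assumption: the CSV holds wall-clock dates in UTC+05:30
  timezone => "Asia/Kolkata"
}
With "Etc/GMT" instead, as above, the dates are stored as midnight UTC with no shift.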

Related

Date format conversion in Logstash ELK stack

I have a date column in my table that I fetch using the jdbc input in Logstash. The problem is that Logstash sends a wrong value to Elasticsearch.
For example, if I have the date start_date="2018-03-01", in Elasticsearch I get the value "2018-02-28 23:00:00.000".
What I want is to keep the format of start_date, or at least send the value "2018-03-01 00:00:00.000" to Elasticsearch.
I tried to use this filter:
date {
  timezone => "UTC"
  match => ["start_date", "ISO8601", "yyyy-MM-dd HH:mm:ss"]
}
but it didn't work.
That is because you are trying to convert it to the UTC timezone. You need to change your configuration like this:
date {
  match => ["start_date", "yyyy-MM-dd"]
}
This would be enough to parse your date.
Let me know if that works.
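If the value already arrives shifted from the jdbc input itself (i.e. start_date is "2018-02-28 23:00:00.000" before any filter runs), it may also be worth looking at the input's jdbc_default_timezone option, which tells Logstash which timezone the database timestamps are in. A minimal sketch, assuming a UTC+1 database server ("Europe/Paris" is just an example zone):
input {
  jdbc {
    # ...connection settings omitted...
    # assumption: the database stores local times in a UTC+1 zone
    jdbc_default_timezone => "Europe/Paris"
  }
}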

Logstash omitting daylight saving time when parsing date

My log file contains timestamps without a timezone indicator, in the format dd-MMM-yyyy::HH:mm:ss.
My server is located in central Europe, so it is in timezone UTC+1, but it currently uses DST, which results in UTC+2.
A date in the log file, 2017-07-25::17:30:00, is parsed as 2017-07-25T16:30:00Z, but it should be 2017-07-25T15:30:00Z, as we are in DST now.
Logstash seems to consider only the timezone but not DST.
How can I fix this?
My logstash config:
date {
  match => ["logdate", "dd-MMM-yyyy::HH:mm:ss"]
  target => "@timestamp"
  remove_field => "logdate"
}
You need to specify the timezone your dates are in:
date {
  match => ["logdate", "dd-MMM-yyyy::HH:mm:ss"]
  target => "@timestamp"
  remove_field => "logdate"
  timezone => "Europe/Zurich"   # <-- add this line
}
You may change "Europe/Zurich" to whatever timezone makes sense for you (the tz database has the full list of valid zone names).
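The reason this fixes the DST problem: region-based zone names such as "Europe/Zurich" carry the full DST rules from the tz database, while fixed-offset zones never adjust. A sketch of the difference:
# region zone: UTC+1 in winter, UTC+2 during DST -- what this question needs
timezone => "Europe/Zurich"
# fixed-offset zone: always UTC+1, never adjusts for DST
# (note the inverted sign convention of the Etc/GMT zones)
# timezone => "Etc/GMT-1"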

Parsing a date field in Logstash to Elasticsearch

I am trying to parse log files from IIS into the ELK stack (Logstash 2.3, Elasticsearch 2.3 and Kibana 4.5 on a CentOS 7 VM).
I have attempted to parse a date field from the log message as the event timestamp, using the date filter below in my Logstash configuration:
date {
  match => ["date_timestamp", "yyyy-MM-dd HH:mm:ss"]
  timezone => "Europe/London"
  locale => "en"
  target => "@timestamp"
}
The first few characters of the entire log message that was sent to Elasticsearch are:
"message": "2016-03-01 03:30:49 .........
The date field above was indexed into Elasticsearch as:
"date_timestamp": "16-03-01 03:30:49",
However, the event timestamp produced by the date filter above is:
"@timestamp": "0016-03-01T03:32:04.000Z",
I would like the @timestamp to be exactly 2016-03-01T03:30:49, and I can't immediately figure out why the hours and minutes differ either.
I have looked at similar problems and documentation, such as related questions on SO and the Logstash documentation.
Any pointer in the right direction will be appreciated.
Regards
In your date_timestamp you have only 2 characters for the year ("16-03-01 03:30:49"), so the date pattern in your date filter is incorrect; it should be:
date {
  match => ["date_timestamp", "yy-MM-dd HH:mm:ss"]
  timezone => "Europe/London"
  locale => "en"
  target => "@timestamp"
}
(The odd minutes in "0016-03-01T03:32:04.000Z" most likely come from the timezone database: for year 0016, Europe/London resolves to local mean time, UTC-00:01:15, which accounts for the 75-second shift.)

How to set time in log as main @timestamp in elasticsearch

I'm using Logstash to index some old log files in my Elasticsearch DB.
I need Kibana/Elasticsearch to set the timestamp from within the logfile as the main @timestamp.
I'm using a grok filter in the following way:
%{TIMESTAMP_ISO8601:@timestamp}
Yet Elasticsearch sets the time of indexing as the main @timestamp and not the timestamp written in the log line.
Any idea what I am doing wrong here?
Thanks
Use the date filter to set the @timestamp field. Extract the timestamp in whatever format it's in into a separate (temporary) field, e.g. timestamp, and feed it to the date filter. In your case you'll most likely be able to use the special ISO8601 timestamp format token.
filter {
  date {
    match => ["timestamp", "ISO8601"]
    remove_field => ["timestamp"]
  }
}
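Putting the two pieces together, a minimal sketch (assuming the ISO8601 timestamp sits in the standard message field):
filter {
  # capture the log's own timestamp into a temporary field, not @timestamp
  grok {
    match => ["message", "%{TIMESTAMP_ISO8601:timestamp}"]
  }
  # parse it into @timestamp, then drop the temporary copy
  date {
    match => ["timestamp", "ISO8601"]
    remove_field => ["timestamp"]
  }
}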

Convert a string field to date

So, I have two fields in my log, timeLogged and timeQueued, and both have the date format 2014-06-14 19:41:21+0000.
My question is: how do I convert a string date value to a Logstash date, like @timestamp?
For the sole purpose of converting to @timestamp there is a dedicated date filter:
date {
  match => ["timeLogged", "YYYY-MM-dd HH:mm:ssZ"]
}
Now, in your case there are two fields to convert, so you will have to dig a little: either use a grok filter to copy the values into a generic "log_date" field, or try to see if the date filter can take several fields, like one of these possibilities:
date {
  match => ["timeLogged", "YYYY-MM-dd HH:mm:ssZ",
            "timeQueued", "YYYY-MM-dd HH:mm:ssZ"]
}
OR
date {
  match => ["timeLogged", "YYYY-MM-dd HH:mm:ssZ"]
  match => ["timeQueued", "YYYY-MM-dd HH:mm:ssZ"]
}
It is up to you to experiment, I never tried myself ;)
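For what it's worth, the date filter's match option takes a single source field followed by one or more format strings, so the two variants above may not parse the second field at all. One date block per field is the safer route; a sketch that writes the parsed value back over each string field:
date {
  match => ["timeLogged", "YYYY-MM-dd HH:mm:ssZ"]
  target => "timeLogged"
}
date {
  match => ["timeQueued", "YYYY-MM-dd HH:mm:ssZ"]
  target => "timeQueued"
}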
This should suffice:
date {
  match => ["timeLogged", "ISO8601", "YYYY-MM-dd HH:mm:ss"]
  target => "timeLogged"
  locale => "en"
}
You can try this filter:
filter {
  ruby {
    code => "
      event['timeLogged'] = Time.parse(event['timeLogged'])
      event['timeQueued'] = Time.parse(event['timeQueued'])
    "
  }
}
Use the powerful ruby library to do what you need!
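Note that event['field'] is the legacy event API from Logstash 2.x; on Logstash 5 and later the ruby filter must go through event.get and event.set, roughly like this:
filter {
  ruby {
    code => "
      # same conversion, using the Logstash 5+ event API
      event.set('timeLogged', Time.parse(event.get('timeLogged')))
      event.set('timeQueued', Time.parse(event.get('timeQueued')))
    "
  }
}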
