How to set time in log as main @timestamp in elasticsearch - elasticsearch

I'm using Logstash to index some old log files into my Elasticsearch DB.
I need Kibana/Elasticsearch to set the timestamp from within the log file as the main @timestamp.
I'm using a grok filter in the following way:
%{TIMESTAMP_ISO8601:@timestamp}
yet Elasticsearch sets the time of indexing as the main @timestamp and not the timestamp written in the log line.
Any idea what I am doing wrong here?
Thanks

Use the date filter to set the @timestamp field. Extract the timestamp in whatever format it's in into a separate (temporary) field, e.g. timestamp, and feed it to the date filter. In your case you'll most likely be able to use the special ISO8601 timestamp format token.
filter {
  date {
    match => ["timestamp", "ISO8601"]
    remove_field => ["timestamp"]
  }
}
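For completeness, a minimal sketch of the whole pipeline could look like this (the grok pattern and the temporary field names timestamp and rest are illustrative, not taken from the question):

filter {
  grok {
    # Pull the leading ISO8601 timestamp out of the raw line into a temporary field
    match => { "message" => "%{TIMESTAMP_ISO8601:timestamp} %{GREEDYDATA:rest}" }
  }
  date {
    # Parse the temporary field into @timestamp, then drop it
    match => ["timestamp", "ISO8601"]
    remove_field => ["timestamp"]
  }
}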

Related

date format conversion in logstash elk stack

I have a date column in my table that I fetch using the jdbc input in Logstash. The problem is that Logstash sends a wrong value to Elasticsearch.
For example, if I have a date start_date="2018-03-01", in Elasticsearch I get the value "2018-02-28 23:00:00.000".
What I want is to keep the format of start_date, or at least send the value "2018-03-01 00:00:00.000" to Elasticsearch.
I tried to use this filter:
date {
  timezone => "UTC"
  match => ["start_date", "ISO8601", "yyyy-MM-dd HH:mm:ss"]
}
but it didn't work.
That is because you are converting it to the UTC timezone. You need to change your configuration like this:
date {
  match => ["start_date", "yyyy-MM-dd"]
}
This would be enough to parse your date.
Let me know if that works.
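If the goal is to keep the parsed value in start_date itself rather than in @timestamp, a sketch using the date filter's standard target option (everything else unchanged) would be:

date {
  match => ["start_date", "yyyy-MM-dd"]
  # Write the parsed date back into start_date instead of overwriting @timestamp
  target => "start_date"
}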

Timestamp to date in Logstash

I am trying to convert a string field called timestamp, with the value 1510722000000, into a date field in Logstash. My sole purpose is to visualize data in Kibana using this date field. I tried using the date filter but it does not create the target field for me. Can anyone tell me how I can achieve this?
My date filter looks like this:
date {
  timezone => "UTC"
  match => ["timestamp", "UNIX_MS"]
  target => "@timestamp1"
}
The date filter I used is correct. The problem is that my timestamp input is an array of epoch-millisecond values. The date filter cannot convert an array of values into a date field. To parse the values in the array we can use a Ruby filter.
Take a look at this post: Parsing array of epoch time values in Ruby filter
Please note it will still not convert them into a date field format, but will just parse the values.
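As a rough illustration of that Ruby-filter route (an assumption-laden sketch, not the code from the linked post: it simply takes the first element of the array and turns it into a proper timestamp field named parsed_time), something like this could work:

ruby {
  code => '
    ts = event.get("timestamp")
    # The field arrives as an array of epoch-millisecond values; take the first one
    ts = ts.first if ts.is_a?(Array)
    # Convert milliseconds to seconds and store the result as a real timestamp field
    event.set("parsed_time", LogStash::Timestamp.at(ts.to_i / 1000.0)) if ts
  '
}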

elastic stack: I need to set the Time Filter field name with another field

I need to read messages (whose content is logs) from RabbitMQ with Logstash and then send them to Elasticsearch to build monitoring visualizations in Kibana. So I wrote the input for reading from RabbitMQ in Logstash like this:
input {
  rabbitmq {
    queue => "testLogstash"
    host => "localhost"
  }
}
and I wrote the output configuration for storing in Elasticsearch like this:
output {
  elasticsearch {
    hosts => "http://localhost:9200"
    index => "d13-%{+YYYY.MM.dd}"
  }
}
Both of them are placed in myConf.conf
In the content of each message there is a JSON document that contains fields like this:
{
  "mDate": "MMMM dd YYYY, HH:mm:ss.SSS",
  "name": "test name"
}
But there are two problems. First, there is no date field offered when creating a new index pattern (Time Filter field name). Second, when I use this field instead of the default @timestamp, it is not available when building graphs. I think the reason for this is the field's data type: it should be of type date, but it is treated as a string.
I tried to convert the value of the field to a date with mutate in my Logstash config like this:
filter {
  mutate {
    convert => { "mdate" => "date" }
  }
}
Now, two questions arise:
1. Is this the problem? If so, what is the right way to fix it?
2. My main need is to use the time when messages enter the queue, not when Logstash picks them up. What is the best solution?
If you don't specify a value for @timestamp, you should get the current system time when Elasticsearch indexes the document. With that, you should be able to see items in Kibana.
If I understand you correctly, you'd rather use your mDate field for @timestamp. For this, use the date{} filter in Logstash.
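A minimal sketch of that filter, assuming mDate really does arrive in the MMMM dd YYYY, HH:mm:ss.SSS layout shown in the question, could be:

filter {
  date {
    # Parse the month-name layout from the message and use it as @timestamp
    match => ["mDate", "MMMM dd YYYY, HH:mm:ss.SSS"]
  }
}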

How to add a new timestamp field in Logstash?

I'm parsing a log that I previously loaded on my localhost, and I would like to get the event date field in each row as a timestamp, but Kibana only gets it as a string.
Example:
I have this event
2016/09/27 13:33:49.701 GMT(09/27 15:33:49 +0200) INFO BILLINGGW ConvergysDelegate.getCustomer(): Calling getCustomerFromCache: 0001:606523
It was loaded on September 27th 2016, 16:04:53.222, but the logdate field (the event date) is: 2016/09/27 13:33:49.701.
In my Logstash filter I defined:
(?<logdate>%{NUMBER:year}/%{NUMBER:month}/%{NUMBER:day} %{HOUR:hday}:%{MINUTE:min}:%{SECOND:sec}) %{GREEDYDATA:result}
I also tried:
(?<logdate>%{YEAR:year}/%{MONTHNUM:month}/%{MONTHDAY:day} %{HOUR:hday}:%{MINUTE:min}:%{SECOND:sec}) %{GREEDYDATA:result}
And Kibana reads logdate as a string. How can I get Kibana to read it as a timestamp?
I tried with only the date:
(?<logdate>%{NUMBER:year}/%{NUMBER:month}/%{NUMBER:day})
and Kibana interpreted it properly as a timestamp, but the problem is how to correctly add the hours, minutes and seconds to the logdate field.
Could anyone help me?
Best regards.
You'll have to convert from a string to a timestamp, using the date filter.
date {
  match => [ "logdate", "yyyy/MM/dd HH:mm:ss.SSS", "yyyy/MM/dd HH:mm:ss" ]
}
This will attempt to parse the logdate field with these date patterns (the yyyy/MM/dd HH:mm:ss.SSS variant covers the milliseconds present in your example) and, if successful, will replace the @timestamp field with the result. You can specify another field for the parsed date with the target option.
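For example, a variant that leaves @timestamp alone and stores the result in a separate field (the name event_time is illustrative):

date {
  match => [ "logdate", "yyyy/MM/dd HH:mm:ss.SSS" ]
  # Store the parsed date here instead of replacing @timestamp
  target => "event_time"
}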

Logstash Dynamic Index From Document Field Fails

I still have problems figuring out how to tell Logstash to send documents to a dynamic index based on a document field. Furthermore, this field must be transformed in order to get the "real" index name at the very end.
Given that there is a field "time" (which is a UNIX timestamp), this field is already transformed with a date filter into a DateTime object for Elasticsearch.
Additionally, it should serve as the index name (YYYYMM). The index should NOT be derived from @timestamp, which is not touched.
Example:
{...,"time":1453412341,...}
It should go to the index 201601.
I use the following config:
filter {
  date {
    match => [ "time", "UNIX" ]
    target => "time"
    timezone => "Europe/Berlin"
  }
}
output {
  elasticsearch {
    index => "%{time}%{+YYYYMM}"
    document_type => "..."
    document_id => "%{ID}"
    hosts => "..."
  }
}
Sadly, it's not working. Any idea how to achieve that?
Thanks a lot!
The "%{+YYYYMM}" says to use the date values from #timestamp. If you want an index named after the YYYYMM in %{time}, you need to make a string out of that date field and then reference that string in the output stanza. There might be a mutate{} that would do it, or drop into ruby{}.
In most installations, you want to set @timestamp to the event's value. The default of Logstash's own time is not very useful (imagine if your events were delayed by an hour during processing). If you did that, then %{+YYYYMM} would work just fine.
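As an illustration of the ruby{} route (a sketch that assumes the date filter above has already turned time into a timestamp object), one way to build a YYYYMM string and reference it as the index name:

filter {
  ruby {
    # Format the parsed "time" field (stored in UTC) as a plain YYYYMM string in a new field
    code => 'event.set("yyyymm", event.get("time").time.strftime("%Y%m"))'
  }
}
output {
  elasticsearch {
    # Reference the plain string field instead of %{+YYYYMM}
    index => "%{yyyymm}"
    hosts => "..."
  }
}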
This is because the index name is created based on UTC time by default.
