Sort logs based on the timestamp in the original logs using Elasticsearch

I am using Logstash + Elasticsearch to index server logs. The logs are of this format:
17/03/15-06:29:30 31609 453749 545959 1 4 http://www.somesite.com/index.html - 0
Here is my logstash config file:
filter {
  grok {
    match => { "message" => "%{DATESTAMP:timestamp} %{NUMBER:some_id} %{NUMBER:some_id} %{NUMBER:some_id} %{NUMBER:some_id} %{NUMBER:some_id} %{DATA:url} %{GREEDYDATA:log_message}" }
  }
  date {
    match => ["timestamp", "dd/MM/YY-HH:mm:ss"]
    #remove_field => ["timestamp"]
  }
  mutate {
    remove_field => [ "message" ]
  }
}
I want to sort the logs using the timestamp string from the original logs. I have tried with and without the date filter, but either way I am not able to query the timestamp field, sort on it, or run a range query against it.
What should I do to make the timestamp field sortable and queryable? Is there a way to do this? Please comment if my question is unclear.
Thanks in advance.

See the image below showing the order in which the logs are loaded.
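A sketch of one way to get a sortable field (an assumption, since no answer was recorded for this question — the usual cause is that timestamp is indexed as an analyzed string, which breaks sorting and range queries): let the date filter write the parsed value into a dedicated date field and sort on that instead of the raw string. The field name log_time is only an illustration, and note that Joda-Time's two-digit-year token is lowercase yy, not YY:

date {
  match  => ["timestamp", "dd/MM/yy-HH:mm:ss"]
  target => "log_time"  # indexed as a date type, so it is sortable and range-queryable
}

A date field can then be sorted or range-queried directly, e.g. _search?sort=log_time:desc (the field name is an assumption, as above).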

Related

Is it possible to change a field by a previous value in logstash

I'm searching for a way to store a value in Logstash and use or modify it when a term matches a pattern.
Here is an example of my data source:
2017-04-12 15:49:57,641|OK|file1|98|||
2017-04-12 15:49:58,929|OK|file2|1387|null|msg_fils|
2017-04-12 15:49:58,931|OK|file3|2|msg_pere|msg_fils|
2017-04-12 15:50:17,666|OK|file1|25|||
2017-04-12 15:50:17,929|OK|file2|1387|null|msg_fils|
I'm using this grok filter to parse my source:
grok {
  match => { "message" => '%{TIMESTAMP_ISO8601:msgdates:date}\|%{WORD:verb}\|%{DATA:component}\|%{NUMBER:temps:int}\|%{DATA:msg_pere}\|%{DATA:msg_fils}\|' }
}
But what I actually want is to reuse the value from the previous line that contains file1 on the lines that follow it.
Can you tell me if it's possible or not?
Thanks
I have found a solution to my issue, and I'm sharing it here.
I'm using a plugin named logstash-filter-memorize; it can be installed with this command:
logstash-plugin install logstash-filter-memorize
So my filter looks like this:
grok {
  match => { "message" => '%{TIMESTAMP_ISO8601:msgdates:date}\|%{WORD:verb}\|%{DATA:component}\|%{NUMBER:temps:int}\|%{DATA:msg_pere}\|%{DATA:msg_fils}\|' }
}
if [component] =~ "file1" {
  mutate {
    add_field => [ "msg_id", "%{msgdates}" ]
  }
  memorize {
    # msg_id exists on file1 lines, so memorize stores its value
    fields  => [ "msg_id" ]
    default => { "msg_id" => "NOTFOUND" }
  }
}
memorize {
  # msg_id is absent on the other lines, so memorize fills it from the stored value
  fields => [ "msg_id" ]
}
I hope it can be useful to others.
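To make the flow concrete, here is how I'd expect the sample lines to come out (an assumption based on memorize's store-if-present, fill-if-absent behavior):

2017-04-12 15:49:57,641 ... file1  ->  msg_id = 2017-04-12 15:49:57,641  (stored)
2017-04-12 15:49:58,929 ... file2  ->  msg_id = 2017-04-12 15:49:57,641  (recalled)
2017-04-12 15:49:58,931 ... file3  ->  msg_id = 2017-04-12 15:49:57,641  (recalled)
2017-04-12 15:50:17,666 ... file1  ->  msg_id = 2017-04-12 15:50:17,666  (stored)
2017-04-12 15:50:17,929 ... file2  ->  msg_id = 2017-04-12 15:50:17,666  (recalled)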

Modify the content of a field using logstash

I am using Logstash to get data from a SQL database. There is a field called "code" whose content has this structure:
PO0000001209
ST0000000909
What I would like to do is remove the six zeros after the letters, to get the following result:
PO1209
ST0909
I will put the result in another field called "code_short" and use it for my query in Elasticsearch. I have configured the Logstash input and output, but I am not sure how to do this using grok or perhaps the mutate filter.
I have read some examples, but I am quite new to this and a bit stuck.
Any help would be appreciated. Thanks.
You could use a mutate/gsub filter for this but that will replace the value of the code field:
filter {
  mutate {
    gsub => [
      "code", "000000", ""
    ]
  }
}
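If you want to keep code intact, a minimal sketch (code_short as named in the question): copy the value first, then run gsub on the copy. Two separate mutate blocks are used because, within a single mutate, add_field is applied after gsub:

filter {
  mutate {
    # copy the original value into the new field
    add_field => { "code_short" => "%{code}" }
  }
  mutate {
    # strip the run of six zeros from the copy only
    gsub => [ "code_short", "000000", "" ]
  }
}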
Another option is to use a grok filter like this:
filter {
  grok {
    match => { "code" => "(?<prefix>[a-zA-Z]+)000000%{INT:suffix}" }
    add_field => { "code_short" => "%{prefix}%{suffix}" }
  }
}
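With the sample values from the question, the captures resolve as follows (worked through by hand, so double-check against your data):

code = PO0000001209  ->  prefix = PO, suffix = 1209  ->  code_short = PO1209
code = ST0000000909  ->  prefix = ST, suffix = 0909  ->  code_short = ST0909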

Timezone causing different results when doing a search query to an index in Elastic Search

I'm trying to get the results of a search query (i.e. searching within a given date range) against a particular index, so that I can pull results on a daily basis.
This is the query : http://localhost:9200/dialog_test/_search?q=timestamp:[2016-08-03T00:00:00.128%20TO%202016-08-03T23:59:59.128]
In the above, timestamp is a field I added in my logstash.conf to capture the actual log time. When I ran this query, I surprisingly got a number of hits (total hits: 24) where there should have been 0, since I have no log records from 2016-08-03. It actually returns the count for the next day (2016-08-04), which has 24 records in the log file. I'm sure something has gone wrong with the timezone.
My timezone is GMT+5:30.
Here is my filtering part of logstash conf:
filter {
  grok {
    patterns_dir => ["D:/ELK Stack/logstash/logstash-2.3.4/bin/patterns"]
    match => { "message" => "^%{LOGTIMESTAMP:logtimestamp}%{GREEDYDATA}" }
  }
  mutate {
    add_field => { "timestamp" => "%{logtimestamp}" }
    remove_field => ["logtimestamp"]
  }
  date {
    match => [ "timestamp", "ISO8601", "yyyyMMdd HH:mm:ss.SSS" ]
    target => "timestamp"
    locale => "en"
  }
}
EDIT:
This is a snapshot of the first 24 records from the log file, all dated 2016-08-04:
And this is a snapshot of the JSON response I got when I searched for 2016-08-03:
Where am I going wrong? Any help would be appreciated.
In your date filter you need to add a timezone, so that Logstash knows the incoming timestamps are local time and converts them to UTC correctly:
date {
  match    => [ "timestamp", "ISO8601", "yyyyMMdd HH:mm:ss.SSS" ]
  target   => "timestamp"
  locale   => "en"
  timezone => "Asia/Calcutta"   # <-- add this
}
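To illustrate the offset arithmetic behind the fix (the example timestamp below is made up): Elasticsearch stores dates in UTC, so a log stamped shortly after midnight on 2016-08-04 in GMT+5:30 lands on 2016-08-03 in UTC, which is exactly why 2016-08-04 records matched a 2016-08-03 range query:

2016-08-04 00:30:00 (+05:30 local)  ->  2016-08-03T19:00:00.000Z (as stored in UTC)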

Logstash: error for querying elasticsearch

Hello everyone,
Through Logstash, I want to query Elasticsearch to get fields from previous events, do some computation against fields of my current event, and add new fields. Here is what I did:
input file:
{"device":"device1","count":5}
{"device":"device2","count":11}
{"device":"device1","count":8}
{"device":"device3","count":100}
{"device":"device3","count":95}
{"device":"device3","count":155}
{"device":"device2","count":15}
{"device":"device1","count":55}
My expected output:
{"device":"device1","count":5,"previousCount=0","delta":0}
{"device":"device2","count":11,"previousCount=0","delta":0}
{"device":"device1","count":8,"previousCount=5","delta":3}
{"device":"device3","count":100,"previousCount=0","delta":0}
{"device":"device3","count":95,"previousCount=100","delta":-5}
{"device":"device3","count":155,"previousCount=95","delta":60}
{"device":"device2","count":15,"previousCount=11","delta":4}
{"device":"device1","count":55,"previousCount=8","delta":47}
Logstash filter part:
filter {
  elasticsearch {
    hosts => ["localhost:9200/device"]
    query => 'device:"%{[device]}"'
    sort => "@timestamp:desc"
    fields => ['count', 'previousCount']
  }
  if [previousCount] {
    ruby {
      code => "event['delta'] = event['count'] - event['previousCount']"
    }
  } else {
    mutate {
      add_field => { "previousCount" => "0" }
      add_field => { "delta" => "0" }
    }
  }
}
My problem:
For every line of my input file I get the following error: "Failed to query elasticsearch for previous event ..".
It seems that an event is not yet searchable in Elasticsearch by the time Logstash starts to process the next line. I don't know if my conclusion is correct and, if it is, why this happens.
So, do you know how I could solve this problem?
Thank you for your attention and your help.
S
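One thing that stands out in the config (an observation, not a confirmed fix): the elasticsearch filter's hosts option expects plain host:port entries; in recent versions of the plugin the index name goes in a separate index option. A sketch of that change:

elasticsearch {
  hosts  => ["localhost:9200"]   # host:port only, no index path
  index  => "device"             # index name goes here instead
  query  => 'device:"%{[device]}"'
  sort   => "@timestamp:desc"
  fields => ['count', 'previousCount']
}

Separately, the suspicion about timing is plausible: Elasticsearch only makes documents searchable after a refresh (roughly every second by default), so a previous event may genuinely not be visible yet when the next line is processed moments later.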

Logstash change time format

My log statement looks like this.
2014-04-23 06:40:29 INFO [1605853264] [ModuleName] - [ModuleName] -
Blah blah
I am able to parse it fine and it gets logged to ES correctly with the following ES field:
"LogTimestamp": "2014-04-23T13:40:29.000Z"
But my requirement is to log this statement as follows; note that the 'Z' is replaced with +0000. I tried replace and gsub, but neither changes the output:
"LogTimestamp": "2014-04-23T13:40:29.000+0000"
Can somebody help?
Here are my patterns (one definition per line in the patterns file):
TEMP_TIMESTAMP %{YEAR}-%{MONTHNUM}-%{MONTHDAY}\s%{HOUR}:%{MINUTE}:%{SECOND}
TEMP_LOG %{TEMP_TIMESTAMP:logdate}\s*?%{LOGLEVEL:TempLogLevel}\s*?\[\s*?%{BASE10NUM:TempThreadId}\]%{GREEDYDATA}
This is the filter config:
grok {
  patterns_dir => ["patterns"]
  match => ["message", "%{TEMP_LOG}"]
}
date {
  match => [ "logdate", "yyyy-MM-dd HH:mm:ss" ]
  target => "LogTimestamp"
  timezone => "PST8PDT"
}
mutate {
  gsub => ["logdate", ".000Z", ".000+0000"]
}
I haven't quite understood the meaning of fields in Logstash and how they map to Elasticsearch; that confusion is what is tripping me up here.
You can use the ruby plugin to do what you want.
Per your requirement, you want to change this:
"LogTimestamp": "2014-04-23T13:40:29.000Z"
to
"LogTimestamp": "2014-04-23T13:40:29.000+0000"
Try this filter:
filter {
  ruby {
    # re-render the timestamp with an explicit +00:00 offset (old event API)
    code => "event['LogTimestamp'] = event['LogTimestamp'].localtime('+00:00')"
  }
}
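If that still renders the offset with a colon (+00:00), an alternative sketch for the same pre-5.x event API is to format the value explicitly; Ruby's strftime %z emits +0000 without a colon. This assumes you are happy for LogTimestamp to become a plain string field:

filter {
  ruby {
    code => "
      t = event['LogTimestamp'].time  # unwrap the underlying Ruby Time
      event['LogTimestamp'] = t.getutc.strftime('%Y-%m-%dT%H:%M:%S.%L%z')
    "
  }
}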
Hope this can help you.
