ES SQL result doesn't use correct date mapping - elasticsearch

I'm experimenting with the SQL options in Elasticsearch and I noticed that a timestamp field I mapped as "strict_date_optional_time_nanos||epoch_millis" doesn't show up the way it was indexed. This is what the timestamp column looks like when I run SELECT * FROM index:
|       timeStamp        |
+------------------------+
|1970-01-20T04:38:39.243Z|
The actual value indexed is 1675772407310 (9th of Feb, 13:59:24). I can't seem to find any information as to why it comes back this way.
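For reference, the mapping would look something like this (a sketch reconstructed from the description; the index name "test" is taken from the answer below):
PUT test
{
  "mappings": {
    "properties": {
      "timeStamp": {
        "type": "date",
        "format": "strict_date_optional_time_nanos||epoch_millis"
      }
    }
  }
}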

I believe that Elasticsearch already performs the conversion to the datetime type internally. In this case, you can cast the field to get the value back in epoch format:
GET _sql?format=txt
{
  "query": """ SELECT CAST("timeStamp" AS BIGINT) FROM "test" """
}
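The response should then contain the raw epoch-millis value, along these lines (a sketch; the column header is approximated):
|CAST("timeStamp" AS BIGINT)|
+---------------------------+
|              1675772407310|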

Related

Date format conversion in Logstash ELK stack

I have a date column in my table that I fetch using the jdbc input in Logstash. The problem is that Logstash sends a wrong value to Elasticsearch.
For example, for a date start_date="2018-03-01", in Elasticsearch I get the value "2018-02-28 23:00:00.000".
What I want is to keep the format of start_date, or at least send the value "2018-03-01 00:00:00.000" to Elasticsearch.
I tried to use this filter :
date {
  timezone => "UTC"
  match => ["start_date", "ISO8601", "yyyy-MM-dd HH:mm:ss"]
}
but it didn't work.
That is because you are trying to convert it to the UTC timezone. You need to change your configuration like this:
date {
  match => ["start_date", "yyyy-MM-dd"]
}
This would be enough to parse your date.
Let me know if that works.
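If the goal is to keep start_date itself as a proper date rather than only setting @timestamp, the filter's target option can point back at the same field (a sketch, assuming start_date arrives as the string "2018-03-01"):
date {
  match => ["start_date", "yyyy-MM-dd"]
  # Overwrite start_date with the parsed date instead of @timestamp.
  target => "start_date"
}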

Timestamp to date in Logstash

I am trying to convert a string field called timestamp, with the value 1510722000000, into a date field in Logstash. My sole purpose is to visualize the data in Kibana using this date field. I tried using the date filter, but it does not create the target field for me. Can anyone tell me how I can achieve this?
My date filter looks like this:
date {
  timezone => "UTC"
  match => ["timestamp", "UNIX_MS"]
  target => "@timestamp1"
}
The date filter I used is correct. The problem is that my timestamp input is an array of epoch-millisecond values, and the date filter cannot convert an array of values into a date field. To parse the values in the array we can use a Ruby filter.
Take a look at this post: Parsing array of epoch time values in Ruby filter.
Please note it will still not convert the values into a date field; it only parses them.
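A minimal sketch of the Ruby-filter approach (this is not the linked post's exact code; the field name timestamp and taking the first array element are assumptions):
ruby {
  # If timestamp is an array, pull out its first epoch-millis value
  # so the date filter below has a scalar to parse.
  code => "
    ts = event.get('timestamp')
    event.set('timestamp_ms', ts.first) if ts.is_a?(Array)
  "
}
date {
  match => ["timestamp_ms", "UNIX_MS"]
  target => "@timestamp1"
}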

Retrieve string date and long date from query result

I have a date field defined in the index as:
"_reportDate": {
  "type": "date",
  "format": "yyyy-MM-dd HH:mm:ss||yyyy-MM-dd||epoch_millis"
}
and I have a query that reads from the _source field, which returns the _reportDate field as the string 2015-12-05 01:05:00.
I can't seem to find a way to get the date in a different format during query retrieval, apart from using a script field (which is not preferable). From what I understand, a date field is parsed to a long value when it is indexed in Elasticsearch; can we also retrieve that long value in an Elasticsearch query?
You need to store the field and, at search time, ask for this stored field.
If that does not work, you can always apply the script at index time with the ingest feature and a script processor.
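A minimal sketch of the stored-field approach (the index name myindex is an assumption; the mapping mirrors the one in the question):
PUT myindex
{
  "mappings": {
    "properties": {
      "_reportDate": {
        "type": "date",
        "format": "yyyy-MM-dd HH:mm:ss||yyyy-MM-dd||epoch_millis",
        "store": true
      }
    }
  }
}

GET myindex/_search
{
  "stored_fields": ["_reportDate"]
}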

How to add new timestamp field on logstash?

I'm parsing a log that I previously loaded on my localhost, and I would like to get the event date field in each row as a timestamp, but Kibana only reads it as a string.
Example:
I have this event:
2016/09/27 13:33:49.701 GMT(09/27 15:33:49 +0200) INFO BILLINGGW ConvergysDelegate.getCustomer(): Calling getCustomerFromCache: 0001:606523
It was loaded on September 27th 2016, 16:04:53.222, but the logdate field (the event date) is: 2016/09/27 13:33:49.701.
In my Logstash filter I defined:
(?<logdate>%{NUMBER:year}/%{NUMBER:month}/%{NUMBER:day} %{HOUR:hday}:%{MINUTE:min}:%{SECOND:sec}) %{GREEDYDATA:result}
I also tried:
(?<logdate>%{YEAR:year}/%{MONTHNUM:month}/%{MONTHDAY:day} %{HOUR:hday}:%{MINUTE:min}:%{SECOND:sec}) %{GREEDYDATA:result}
Kibana still reads logdate as a string. How can I get Kibana to read it as a timestamp?
I tried with only the date:
(?<logdate>%{NUMBER:year}/%{NUMBER:month}/%{NUMBER:day})
and Kibana interpreted it properly as a timestamp, but the problem is how to correctly add the hours, minutes and seconds to the logdate field.
Could anyone help me?
Best regards.
You'll have to convert the string to a timestamp using the date filter:
date {
  match => ["logdate", "yyyy/MM/dd HH:mm:ss.SSS"]
}
This will attempt to parse the logdate field with the date pattern yyyy/MM/dd HH:mm:ss.SSS (the .SSS covers the milliseconds in the sample event) and, if successful, will replace the @timestamp field with the result. You can specify another field for the parsed date with the target option.
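For example, to keep the parsed value on logdate itself instead of overwriting @timestamp (a sketch):
date {
  match => ["logdate", "yyyy/MM/dd HH:mm:ss.SSS"]
  # Write the parsed date back to logdate rather than @timestamp.
  target => "logdate"
}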

How to change Elasticsearch query result type?

I saved datetime data to ES; in the search results this field was converted into a timestamp (integer). Is there any way to turn it back into a string (just by modifying the query parameters)?
You can specify fields in the query, and Elasticsearch returns them in the format you originally stored them in.
You have two options:
1. Specify the date format at index time and return the same value.
2. Use a script to format the date in the format you need.
Here is an example of the second option:
curl -XGET http://localhost:9200/myindex/test/_search?pretty -d '
{
  "query": {
    "match_all": {}
  },
  "script_fields": {
    "aDate": {
      "script": "if (!_source.myDate?.equals('null')) new java.text.SimpleDateFormat('yyyy-MM-dd\\'T\\'HH:mm:ss').format(new java.util.Date(_source.myDate));"
    }
  }
}'
I would choose the first one, as scripts are generally a lot more expensive.
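On more recent Elasticsearch versions (an assumption; this is not part of the original answer), docvalue_fields with a format can achieve the same result without a script. The index and field names below are the hypothetical ones from the example above:
GET myindex/_search
{
  "query": { "match_all": {} },
  "docvalue_fields": [
    { "field": "myDate", "format": "yyyy-MM-dd'T'HH:mm:ss" }
  ]
}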
