Retrieve string date and long date from query result - elasticsearch

I have a date field defined in index as
"_reportDate": {
"type": "date",
"format": "yyyy-MM-dd HH:mm:ss||yyyy-MM-dd||epoch_millis"
}
and I have a query that reads from the _source field, which returns _reportDate as the string 2015-12-05 01:05:00.
I can't seem to find a way to get the date in a different format at query time apart from using a script field (which is not preferable). From what I understand, a date field is parsed to a long value to be indexed in Elasticsearch; can we also retrieve that long value during an Elasticsearch query?

You need to store the field and, at search time, ask for this stored field.
If that does not work, you can always apply a script at index time with the ingest feature and a script processor.
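A minimal sketch of the first option could look like this (the index name my_index and type doc are assumptions; on Elasticsearch 5.x and later the search parameter is stored_fields, while older versions use fields). Storing the field keeps the original source value so it can be requested back verbatim at search time:
PUT my_index
{
  "mappings": {
    "doc": {
      "properties": {
        "_reportDate": {
          "type": "date",
          "format": "yyyy-MM-dd HH:mm:ss||yyyy-MM-dd||epoch_millis",
          "store": true
        }
      }
    }
  }
}

GET my_index/_search
{
  "query": { "match_all": {} },
  "stored_fields": ["_reportDate"]
}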

Related

Reindex Elasticsearch converting unixtime to date

I have an Elasticsearch index which uses the @timestamp field to store the date in a date field.
There are many records which are missing the @timestamp field, but have a timestamp field containing a unix timestamp. (Generated from PHP, so seconds, not milliseconds.)
Note that the timestamp field is of date type, but numeric data seems to be stored there.
How can I use a Painless script in a reindex to set @timestamp where it is missing, if there is a numeric timestamp field with a unix timestamp?
Here's an example record that I would want to transform.
{
"_index": "my_log",
"_type": "doc",
"_id": "AWjEkbynNsX24NVXXmna",
"_score": 1,
"_source": {
"name": null,
"pid": "148651",
"timestamp": 1549486104
}
},
Did you have a look at the ingest module of Elasticsearch?
https://www.elastic.co/guide/en/elasticsearch/reference/current/date-processor.html
Parses dates from fields, and then uses the date or timestamp as the timestamp for the document. By default, the date processor adds the parsed date as a new field called @timestamp. You can specify a different field by setting the target_field configuration parameter. Multiple date formats are supported as part of the same date processor definition. They will be used sequentially to attempt parsing the date field, in the same order they were defined as part of the processor definition.
It does exactly what you want :) In your reindex statement you can direct documents through this ingest processor.
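For illustration, a rough sketch of such a pipeline and the reindex call might look like this (the pipeline name set_timestamp and the destination index my_log_v2 are made up; the processor-level if condition requires Elasticsearch 6.5 or later, and the UNIX format parses seconds rather than milliseconds):
PUT _ingest/pipeline/set_timestamp
{
  "description": "Set @timestamp from the numeric unix timestamp when it is missing",
  "processors": [
    {
      "date": {
        "if": "ctx['@timestamp'] == null && ctx.timestamp != null",
        "field": "timestamp",
        "formats": ["UNIX"],
        "target_field": "@timestamp"
      }
    }
  ]
}

POST _reindex
{
  "source": { "index": "my_log" },
  "dest": { "index": "my_log_v2", "pipeline": "set_timestamp" }
}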
If you need more help let me know, then I can jump behind a computer and help out :D

Range query for a keyword or a date type field?

I have a field which stores the insert time, such as 2016-10-10 11:00:00.000. I tried the keyword type and the date type, and both meet the range requirements, such as
{
"query": {
"range" : {
"time" : {
"gte" : "2016-10-10 11:00:00.000",
"lte" : "2016-10-10 12:00:00.000"
}
}
}
}
Which is better, keyword or date type?
In your case, since you're storing dates, it's indeed more appropriate to use the date data type. Internally, those dates will be stored as long timestamps and the range query will be run on them, so you get a numerical range.
keyword is intended for string data. If you store those dates as keyword, they will be stored as unanalyzed strings, and the range query run on them will treat them as a lexical range.
If you ever need to build a date_histogram aggregation out of those dates, the keyword type won't do it. So you should definitely prefer the date data type.
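For illustration, a minimal mapping and a date_histogram aggregation over such a field might look like the following (Elasticsearch 7.x syntax; the index name my_index and the millisecond date format are assumptions based on the sample value):
PUT my_index
{
  "mappings": {
    "properties": {
      "time": {
        "type": "date",
        "format": "yyyy-MM-dd HH:mm:ss.SSS"
      }
    }
  }
}

GET my_index/_search
{
  "size": 0,
  "aggs": {
    "per_hour": {
      "date_histogram": {
        "field": "time",
        "calendar_interval": "hour"
      }
    }
  }
}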

How to change Elasticsearch query result type?

I saved datetime data to ES. In the search results, this field was converted into a timestamp (integer). Is there any way to turn it back into a string (just by modifying the query parameters)?
You can specify fields in the query, and Elasticsearch returns the fields in the format you originally stored them in.
You have two options:
You can specify the date format at index time and return the same.
You can use scripts to format the date in the format you need, for example:
curl -XGET http://localhost:9200/myindex/test/_search?pretty -d '
{
"query":{
"match_all":{ }
},
"script_fields":{
"aDate":{
"script":"if (!_source.myDate?.equals('null')) new java.text.SimpleDateFormat('yyyy-MM-dd\\'T\\'HH:mm:ss').format(new java.util.Date(_source.myDate));"
}
}
}'
I would choose the first one, as scripts are generally a lot more expensive.
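On more recent Elasticsearch versions (6.4+), docvalue_fields with a format is another script-free way to get a date back in a chosen format. A rough sketch, assuming the field is called myDate:
GET myindex/test/_search
{
  "query": { "match_all": {} },
  "docvalue_fields": [
    { "field": "myDate", "format": "yyyy-MM-dd HH:mm:ss" }
  ]
}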

Logstash inserting dates as strings instead of dateOptionalTime

I have an Elasticsearch index with the following mapping:
"pickup_datetime": {
"type": "date",
"format": "dateOptionalTime"
}
Here is an example of a date contained in the file that is being read in
"pickup_datetime": "2013-01-07 06:08:51"
I am using Logstash to read and insert data into ES with the following lines to attempt to convert the date string into the date type.
date {
match => [ "pickup_datetime", "yyyy-MM-dd HH:mm:ss" ]
target => "pickup_datetime"
}
But the match never seems to occur.
What am I doing wrong?
It turns out the date filter was placed before the csv filter, which is where the columns get named, so the date filter could not find the pickup_datetime column because it had not yet been named.
It might be a good idea to clearly mention the sequentiality of the filters in the documentation to avoid others having similar problems in the future.
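In other words, the csv filter has to run first so the column exists by the time the date filter sees it. A rough sketch of the corrected filter order (the column list is invented for illustration):
filter {
  csv {
    # names the columns, including pickup_datetime (the other columns are placeholders)
    columns => ["pickup_datetime", "dropoff_datetime", "fare_amount"]
  }
  # runs after csv, so pickup_datetime now exists and can be parsed
  date {
    match => [ "pickup_datetime", "yyyy-MM-dd HH:mm:ss" ]
    target => "pickup_datetime"
  }
}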

How to search fields with '-' characters in elastic search

I am new to Elasticsearch. I have the following document, where one of the fields, "eventId", has "-" in its value.
When I try to search with the complete value of eventId, I don't get any results.
Sample Document app/event
{
"tags": {}
"eventId": "cc98d57b-c6bc-424c-b54c-df1e3df0d942",
}
I haven't created any explicit settings for my index.
Thanks.
You should check whether the tokenizer splits your value into multiple terms. Maybe your value is stored as five terms: "cc98d57b", "c6bc", "424c", "b54c" and "df1e3df0d942".
You can analyze that with the 'Kopf' Plugin (https://github.com/lmenezes/elasticsearch-kopf).
If that is your problem, you should change your field mapping so that the value is not analyzed ("index": "not_analyzed").
For an example of how to set that mapping, see: Elasticsearch mapping settings 'not_analyzed' and grouping by field in Java
After that, you should be able to search for your specific value.
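A minimal sketch of such a mapping (Elasticsearch 2.x-era syntax; the index name app and type event are taken from the sample document path, and on 5.x and later you would use "type": "keyword" instead):
PUT app
{
  "mappings": {
    "event": {
      "properties": {
        "eventId": {
          "type": "string",
          "index": "not_analyzed"
        }
      }
    }
  }
}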
