how to change elasticsearch query result type? - elasticsearch

I saved a datetime field to ES, but in the search results this field is returned as a timestamp (integer). Is there any way to turn it into a string (just by modifying the query parameters)?

You can specify fields in the query, and Elasticsearch will return those fields in the format in which you originally stored them:
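A minimal sketch of that approach, assuming the same myindex/test index and myDate field used in the next answer: on older Elasticsearch versions this is the fields parameter (values come back as they appear in _source), and recent versions have a similar fields option that returns values formatted according to the mapping.
# index, type and field names are taken from the answer below
GET /myindex/test/_search
{
  "query": { "match_all": {} },
  "fields": ["myDate"]
}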
You have two options:
You can specify the date format at index time and return the same.
You can use scripts to format the date in the format you need.
GET /myindex/test/_search
{
  "query": {
    "match_all": {}
  },
  "script_fields": {
    "aDate": {
      "script": "if (!_source.myDate?.equals('null')) new java.text.SimpleDateFormat('yyyy-MM-dd\\'T\\'HH:mm:ss').format(new java.util.Date(_source.myDate));"
    }
  }
}
I would choose the first one, as scripts are generally a lot more expensive.
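A minimal sketch of that first option, assuming a recent Elasticsearch version (older versions nest properties under a mapping type) and the myindex index and myDate field from above: declare the format you want in the mapping and index the date as that string.
# index and field names are the ones used in the example above
PUT /myindex
{
  "mappings": {
    "properties": {
      "myDate": {
        "type": "date",
        "format": "yyyy-MM-dd'T'HH:mm:ss"
      }
    }
  }
}
Since the _source of each hit is returned exactly as it was indexed, documents indexed with the formatted string come back with that string rather than a number.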

Related

Is it possible to query by field data type in Elasticsearch?

I need to query Elasticsearch by field data type, but I have not been successful in creating that query. I want to be able to (1) specify the type I want to search for in the query, i.e. all fields of {"type": "boolean"}, and also (2) get a field and see what the type is for that field.
The reason is to check that the field is mapped correctly. Let's say I inserted the following data into this index and fields, and I now want to see programmatically what the data types of those fields are. How would I query that?
POST /index_name1/_doc/
{
  "field1": "hello_field_2",
  "field2": "123456.54321",
  "field3.field4": false,
  "field3.field5.field10": "POINT(-117.918976 33.812511)",
  "field3.field5.field8": "field_of_dragons",
  "field3.field5.field9": "2022-05-26T07:47:26.133275Z"
}
I have tried:
GET /index_name1/_search
{
  "query": {
    "wildcard": {
      "field3.field4": { "type": "*" }
    }
  }
}
That gives [wildcard] query does not support [type].
I've tried many other queries and searched the documentation and threads, but can't find anything that will do this. It has got to be possible, right?
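For part (2), the usual way to see what type a field was mapped as is the mapping API rather than a search query; a minimal sketch against the index above (the field path is taken from the question):
GET /index_name1/_mapping
# or, for a single field:
GET /index_name1/_mapping/field/field3.field4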

Retrieve string date and long date from query result

I have a date field defined in index as
"_reportDate": {
"type": "date",
"format": "yyyy-MM-dd HH:mm:ss||yyyy-MM-dd||epoch_millis"
}
and I have a query that reads from the _source field, which returns the _reportDate field as the string 2015-12-05 01:05:00.
I can't seem to find a way to get the date in a different format at query time apart from using a script field (which is not preferable). From what I understand, a date field is parsed to a long value when it is indexed in Elasticsearch; can we also retrieve that long value in an Elasticsearch query?
You need to store the field and at search time ask for this stored field.
If that does not work, you can always apply the script at index time with the ingest feature and a script processor.
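A minimal sketch of the stored-field approach the answer describes, assuming a recent Elasticsearch version (older versions nest the mapping under a type name and use fields instead of stored_fields); the my_reports index name is a placeholder:
# hypothetical index name; field and format taken from the question
PUT /my_reports
{
  "mappings": {
    "properties": {
      "_reportDate": {
        "type": "date",
        "format": "yyyy-MM-dd HH:mm:ss||yyyy-MM-dd||epoch_millis",
        "store": true
      }
    }
  }
}

GET /my_reports/_search
{
  "query": { "match_all": {} },
  "stored_fields": ["_reportDate"]
}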

Range query for a keyword or a date type field?

I have a field which stores the insert time, such as 2016-10-10 11:00:00.000. I tried both the keyword type and the date type, and both meet the range requirements, such as
{
  "query": {
    "range": {
      "time": {
        "gte": "2016-10-10 11:00:00.000",
        "lte": "2016-10-10 12:00:00.000"
      }
    }
  }
}
Which is better here, the keyword type or the date type?
In your case, since you're storing dates, it's indeed more appropriate to use the date data type. Internally, those dates will be stored as long timestamps and the range query will run on them as a numerical range.
keyword is intended for string data. If you store those dates as keyword, they will be stored as unanalyzed strings and the range query run on them will treat them as a lexical range.
If you ever need to build a date_histogram aggregation out of those dates, the keyword type won't allow it, so you should definitely prefer the date data type.
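For illustration, a date_histogram over that field might look like the sketch below; it only works if time is mapped as date. The index name myindex is a placeholder, and the parameter name depends on your Elasticsearch version (calendar_interval in recent releases, interval in older ones).
# myindex is a placeholder; "time" is the field from the question
GET /myindex/_search
{
  "size": 0,
  "aggs": {
    "per_hour": {
      "date_histogram": {
        "field": "time",
        "calendar_interval": "hour"
      }
    }
  }
}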

To Select documents having same startDate and endDate

I have some documents, and each document has startDate and endDate date fields. I need all documents where both of these values are the same. I couldn't find any query which will help me do this.
Elasticsearch supports script filters, which you can use in this case.
Something like this is what you will need:
POST /<yourIndex>/<yourType>/_search
{
  "query": {
    "filtered": {
      "filter": {
        "script": {
          "script": "doc['startDate'].value == doc['endDate'].value"
        }
      }
    }
  }
}
This can be achieved in two ways:
Index solution - While indexing, add an additional field called isDateSame and set it to true or false based on the values of startDate and endDate. Then you can easily query on that field. This is the most efficient solution.
Script solution - Elasticsearch maintains all the indexed data in field data, which is more like the reverse of an inverted index. Using a script you can access any indexed field and do the comparison. This is pretty fast, but not as good as the first option. You can use a query like the one sketched below.
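A sketch of such a query, reusing the script expression from the earlier answer but written with the bool/filter syntax of current Elasticsearch versions (depending on the version and scripting language, the comparison may need to be doc['startDate'].value.millis == doc['endDate'].value.millis or similar):
POST /<yourIndex>/_search
{
  "query": {
    "bool": {
      "filter": {
        "script": {
          "script": {
            "source": "doc['startDate'].value == doc['endDate'].value"
          }
        }
      }
    }
  }
}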

ElasticSearch index unix timestamp

I have to index documents containing a 'time' field whose value is an integer representing the number of seconds since epoch (aka unix timestamp).
I've been reading ES docs and have found this:
http://www.elasticsearch.org/guide/reference/mapping/date-format.html
But it seems that if I want to submit unix timestamps and have them stored in a 'date' field (an integer field is not useful for me), I have only two options:
Implement my own date format
Convert to a supported format at the sender
Is there any other option I missed?
Thanks!
If you supply a mapping that tells ES the field is a date, it can accept epoch millis as input. If you want ES to auto-detect the type, you'll have to provide ISO 8601 or another discoverable format.
Update: I should also note that you can influence what strings ES will recognize as dates in your mapping. http://www.elastic.co/guide/en/elasticsearch/reference/current/mapping-date-format.html
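A minimal sketch of such a mapping, assuming a recent Elasticsearch version and a hypothetical events index: the built-in epoch_second format accepts the question's seconds-since-epoch values directly (epoch_millis does the same for milliseconds).
# "events" and "time" are placeholder names
PUT /events
{
  "mappings": {
    "properties": {
      "time": {
        "type": "date",
        "format": "epoch_second"
      }
    }
  }
}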
In case you want to use Kibana (which I expect) and visualize according to the time of a log entry, you will need at least one field to be a date field.
Please note that you have to set the field as date type BEFORE you input any data into the /index/type. Otherwise it will be stored as long and unchangeable.
Simple example that can be pasted into the marvel/sense plugin:
# Make sure the index isn't there
DELETE /logger
# Create the index
PUT /logger
# Add the mapping of properties to the document type `mem`
PUT /logger/_mapping/mem
{
  "mem": {
    "properties": {
      "timestamp": {
        "type": "date"
      },
      "free": {
        "type": "long"
      }
    }
  }
}
# Inspect the newly created mapping
GET /logger/_mapping/mem
Run each of these commands in sequence.
Generate free mem logs
Here is a simple script that echoes to your terminal and logs to your local Elasticsearch:
while (( 1==1 )); do memfree=`free -b|tail -n 1|tr -s ' ' ' '|cut -d ' ' -f4`; echo $memfree; curl -XPOST "localhost:9200/logger/mem" -d "{ \"timestamp\": `date +%s%3N`, \"free\": $memfree }"; sleep 1; done
Inspect the data in Elasticsearch
Paste this in your marvel/sense
GET /logger/mem/_search
Now you can move to Kibana and do some graphs. Kibana will autodetect your date field.
