Is it possible to save time fields in Elasticsearch in a format like HH:mm and then search them with a time range query like HH:mm-HH:mm?
Yes, you can store times in Elasticsearch in this format. Check the related documentation about the different date formats here:
https://www.elastic.co/guide/en/elasticsearch/reference/current/mapping-date-format.html
hour_minute or strict_hour_minute
A formatter for a two digit hour of day and two digit minute of hour: HH:mm.
You will have a mapping like this if you use the built-in format:
PUT my_index
{
"mappings": {
"properties": {
"date": {
"type": "date",
"format": "hour_minute"
}
}
}
}
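For example, a document holding only a time value can then be indexed like this (the document ID 1 is arbitrary):
PUT my_index/_doc/1
{
  "date": "10:15"
}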
To search, you can use the built-in format in your range query:
https://www.elastic.co/guide/en/elasticsearch/reference/current/query-dsl-range-query.html
GET /_search
{
"query": {
"range" : {
"age" : {
"gte" : "10:15",
"lte" : "20:13",
"format" : "hour_minute"
}
}
}
}
I'm trying to implement partial search on a date field in Elasticsearch. For example, if startDate is stored as "2019-08-28", I should be able to retrieve the same document while querying just "2019" or "2019-08" or "2019-0".
For other fields I'm doing this:
{
"simple_query_string": {
"fields": [
"customer"
],
"query": "* Andrew *",
"analyze_wildcard": "true",
"default_operator": "AND"
}}
which works perfectly on text fields, but the same doesn't work on date fields.
This is the mapping:
{"mappings":{"properties":{"startDate":{"type":"date"}}}}
Is there any way this can be achieved, be it a change in mapping or another query method? Also, I found this discussion related to partial dates in Elasticsearch; I'm not sure how relevant it is, but here it is:
https://github.com/elastic/elasticsearch/issues/45284
Excerpt from the ES docs:
Internally, dates are converted to UTC (if the time-zone is specified)
and stored as a long number representing milliseconds-since-the-epoch.
It is not possible to search it the way we can search a text field. However, we can tell ES to index the date field as both date and text, e.g.:
Index the date field as a multi-field:
PUT dates
{
  "mappings": {
    "properties": {
      "date": {
        "type": "date",
        "format": "year_month_day", //<======= yyyy-MM-dd
        "fields": {
          "formatted": {
            "type": "text", //<======= another representation of type text, accessible as date.formatted
            "analyzer": "whitespace" //<======= whitespace analyzer (standard would tokenize 2020-01-01 into 2020, 01 & 01)
          }
        }
      }
    }
  }
}
POST dates/_doc
{
"date":"2020-01-01"
}
POST dates/_doc
{
"date":"2019-01-01"
}
Use a wildcard query to search (you can even use n-grams at indexing time for faster searches if required; see the sketch after the query below):
GET dates/_search
{
"query": {
"wildcard": {
"date.formatted": {
"value": "2020-0*"
}
}
}
}
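As a sketch of the n-gram idea mentioned above (the index, analyzer, and sub-field settings here are only illustrative, not from the original answer), you could index the formatted value with an edge_ngram tokenizer so that prefix searches become plain match queries:
PUT dates_ngram
{
  "settings": {
    "analysis": {
      "tokenizer": {
        "date_edge_ngram": {
          "type": "edge_ngram",
          "min_gram": 1,
          "max_gram": 10
        }
      },
      "analyzer": {
        "date_prefix": {
          "type": "custom",
          "tokenizer": "date_edge_ngram"
        }
      }
    }
  },
  "mappings": {
    "properties": {
      "date": {
        "type": "date",
        "format": "year_month_day",
        "fields": {
          "formatted": {
            "type": "text",
            "analyzer": "date_prefix",
            "search_analyzer": "keyword"
          }
        }
      }
    }
  }
}
GET dates_ngram/_search
{
  "query": {
    "match": {
      "date.formatted": "2020-0"
    }
  }
}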
I am new to Elasticsearch and I am struggling with a date range query. I have to query the records which fall between particular dates. The JSON records pushed into the Elasticsearch database are as follows:
"messageid": "Some message id",
"subject": "subject",
"emaildate": "2020-01-01 21:09:24",
"starttime": "2020-01-02 12:30:00",
"endtime": "2020-01-02 13:00:00",
"meetinglocation": "some location",
"duration": "00:30:00",
"employeename": "Name",
"emailid": "abc#xyz.com",
"employeecode": "141479",
"username": "username",
"organizer": "Some name",
"organizer_email": "cde#xyz.com",
I have to query the records whose start time is between "2020-01-02 12:30:00" and "2020-01-10 12:30:00". I have written a query like this:
{
"query":
{
"bool":
{
"filter": [
{
"range" : {
"starttime": {
"gte": "2020-01-02 12:30:00",
"lte": "2020-01-10 12:30:00"
}
}
}
]
}
}
}
This query is not giving the expected results. I assume that the person who pushed the data into the Elasticsearch database at my office did not set the mapping, and Elasticsearch dynamically decided the data type of "starttime" as "text". Hence I am getting inconsistent results.
I can set the mapping like this :
PUT /meetings
{
"mappings": {
"dynamic": false,
"properties": {
.
.
.
.
"starttime": {
"type": "date",
"format":"yyyy-MM-dd HH:mm:ss"
}
.
.
.
}
}
}
And then the query would work, but I am not allowed to do so (office policies). What alternatives do I have so that I can achieve my task?
Update:
I assumed the data type to be "text", but by default Elasticsearch applies both "text" and "keyword" so that we can implement both full-text and keyword-based searches. If it is also set as "keyword", will this benefit me in any way? I do not have access to a lot of things in the office, which is why I am unable to debug the query; I only have the search API for which I have to build the query.
GET /meetings/_mapping output:
...
"starttime" : {
"type" : "text",
"fields" : {
"keyword" : {
"type" : "keyword",
"ignore_above" : 256
}
}
}
...
Date range queries will not work on a text field; for that, you have to use a date field.
Since you are working with dates, the best practice is to use the date field type.
I would suggest you reindex your index into another index so that you can change the type of your text field to a date field.
Step 1: Create index2 using the index1 mapping, and make sure to change the type of your date field from text to date (see the mapping sketch below).
Step 2: Run the Elasticsearch reindex API and reindex all your data from index1 to index2. Since you have changed the field type to date, Elasticsearch will now recognize this field as a date.
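A sketch of Step 1, using the starttime field and format from the question (any other fields from the index1 mapping would need to be copied over as well):
PUT index2
{
  "mappings": {
    "properties": {
      "starttime": {
        "type": "date",
        "format": "yyyy-MM-dd HH:mm:ss"
      }
    }
  }
}
The _reindex call below then covers Step 2: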
POST _reindex
{
"source":{ "index": "index1" },
"dest": { "index": "index2" }
}
Now you can run your normal date range queries on index2.
As @jzzfs suggested, the idea is to add a date sub-field to the starttime field. You first need to modify the mapping like this:
PUT meetings/_mapping
{
"properties": {
"starttime" : {
"type" : "text",
"fields" : {
"keyword" : {
"type" : "keyword",
"ignore_above" : 256
},
"date": {
"type" : "date",
"format" : "yyyy-MM-dd HH:mm:ss",
}
}
}
}
}
When done, you need to reindex your data using the update by query API so that the starttime.date field gets populated and indexed:
POST meetings/_update_by_query
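If documents are still being written to the index while this runs, you may want to add conflicts=proceed so that version conflicts don't abort the run (optional, depending on your setup):
POST meetings/_update_by_query?conflicts=proceed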
When the update is done, you'll be able to leverage the starttime.date sub-field in your query:
{
"query": {
"bool": {
"filter": [
{
"range": {
"starttime.date": {
"gte": "2020-01-02 12:30:00",
"lte": "2020-01-10 12:30:00"
}
}
}
]
}
}
}
There are ways of parsing text fields as dates at search time but the overhead is impractical... You could, however, keep the starttime as text by default but make it a multi-field and query it using starttime.as_date, for example.
I'm trying to implement a filter using Elasticsearch. I simply want to implement a range filter. I have the following data:
{
"result": [
{
"Id": "144039",
"posted_dt": 1506951883637,
"submit_dt": 1507609800000,
"title": "Request for Information (RFI) # 306-18-0018",
"fname": "RODRI",
"email": "",
"desc": "dummy Text"
}
]
}
I want to get the data from the last 3 or 5 days. I'm using this:
query = {
"bool": {
"must": [
{
"range" : {
"posted_dt" : {
"gte" : "now-3d/d",
"lt" : "now/d"
}
}
} ]
}
}
My mapping for posted_dt is:
"posted_dt": {
"type": "long"
},
I did try the filter as well but didn't succeed.
Your mapping of "posted_dt" field is incorrect. You intend to store date which is in epoch in millis but you are storing it as long type. So the date range filter won't work on long datatype. Update your "posted_dt" field's mapping like :
PUT my_index
{
"mappings": {
"my_type": {
"properties": {
"posted_dt": {
"type": "date",
"format": "epoch_millis"
}
}
}
}
}
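Note that you can't change the type of an existing field in place; a common approach (the index names below are just placeholders) is to create a new index with the corrected mapping and reindex into it. The existing numeric epoch values will be accepted by the epoch_millis format:
POST _reindex
{
  "source": { "index": "my_index" },
  "dest": { "index": "my_index_v2" }
}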
Refer to the Date datatype in Elasticsearch.
First, you need to share your mapping. Make sure that posted_dt and submit_dt are defined as date in your mapping. Here you are using a long, which is not the right type to deal with dates.
A side note: you should use filter instead of must in your case. It will be faster IMO (see the sketch below).
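For example, the date-math range from the question could go into a filter clause instead of must (a sketch, assuming posted_dt is mapped as a date as discussed above); in filter context the clause is not scored and can be cached:
{
  "query": {
    "bool": {
      "filter": [
        {
          "range": {
            "posted_dt": {
              "gte": "now-3d/d",
              "lt": "now/d"
            }
          }
        }
      ]
    }
  }
}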
In my previous question, I was introduced to the fields in a query_string query and how it can help me to search nested fields of a document.
{
"query": {
"query_string": {
"fields": ["*.id","id"],
"query": "2"
}
}
}
But it only works for matching; what if I want to do some comparison? After some reading and testing, it seems queries like range do not support fields. Is there any way I can perform a range query, e.g. on a date, over a field that can be scattered anywhere in the document hierarchy?
i.e. considering the following document:
{
"id" : 1,
"Comment" : "Comment 1",
"date" : "2016-08-16T15:22:36.967489",
"Reply" : [ {
"id" : 2,
"Comment" : "Inner comment",
"date" : "2016-08-16T16:22:36.967489"
} ]
}
Is there a query searching over the date field (like date > '2016-08-16T16:00:00.000000') which matches the given document because of the nested field, without explicitly giving the path Reply.date? Something like this (I know the following query is incorrect):
{
"query": {
"range" : {
"date" : {
"gte" : "2016-08-16T16:00:00.000000",
},
"fields": ["date", "*.date"]
}
}
}
The range query itself doesn't support it. However, you can leverage the query_string query (again), the fact that it lets you use wildcards in field names, and the fact that it supports range syntax, in order to achieve what you need:
{
"query": {
"query_string": {
"query": "\*date:[2016-08-16T16:00:00.000Z TO *]"
}
}
}
The above query will return your document because Reply.date matches the *date field pattern.
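If you prefer to keep the field patterns out of the query text, the same idea can be expressed through the fields parameter of query_string (a sketch reusing the patterns from the question; worth verifying against your mapping):
{
  "query": {
    "query_string": {
      "fields": ["date", "*.date"],
      "query": "[2016-08-16T16:00:00.000Z TO *]"
    }
  }
}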
I'm using the Elasticsearch date_histogram aggregation for binning/bucketing my data. This works fine when plotting the results of a single query:
{
"query": {...},
"aggs" : {
"timeline" : {
"date_histogram" : {
"field" : "date",
"interval" : "month"
}
}
}
}
However, I now want to use ES for binning/bucketing the results of multiple queries. At the end, I need a line chart with each query representing a single line on the chart.
So, is it possible to use a single bucketing for multiple queries?
OK, I ended up defining a custom range for the date field and executing multiple queries with the same custom ranges. Probably not the most efficient way, but it works fine.
{
  "query": {...},
  "aggs" : {
    "ranges" : {
      "date_range" : {
        "field": "date",
        "format": "yyyyMMdd",
        "ranges": [...]
      }
    }
  }
}
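For reference, the elided ranges value is a list of from/to buckets in the date_range aggregation format; with the yyyyMMdd format above it would look something like this (the boundaries here are only an illustration, not the original ones):
"ranges": [
  { "from": "20200101", "to": "20200201" },
  { "from": "20200201", "to": "20200301" }
]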