I am new to Elasticsearch and I'm trying to create an index for companies that have multiple branches in the city.
Each branch has its own geolocation point.
My company document looks like this:
{
"company_name": "Company X",
"branch": [
{
"address": {
// ... other fields
"location": "0.0000,1.1111"
}
}
]
}
The index has the following mapping:
{
"companies": {
"mappings": {
"dynamic_templates": [
{
"ids": {
"match": "id",
"match_mapping_type": "long",
"mapping": {
"type": "long"
}
}
},
{
"company_locations": {
"match": "location",
"match_mapping_type": "string",
"mapping": {
"type": "geo_point"
}
}
}
],
"properties": {
"branch": {
"properties": {
"address": {
"properties": {
// ...
"location": {
"type": "geo_point"
},
// ...
}
}
}
}
}
}
}
}
Now, in Elasticsearch I've indexed the following documents:
{
"company_name": "Company #1",
"branch": [
{
"address": {
"location": "39.615,19.8948"
}
}
]
}
and
{
"company_name": "Company #2",
"branch": [
{
"address": {
"location": "39.586,19.9028"
}
},
{
"address": {
"location": "39.612,19.9134"
}
},
{
"address": {
"location": "39.607,19.8946"
}
}
]
}
Now, here is my problem. If I run the following search query, the company displayed first is unfortunately Company #2, even though the geo-distance sort uses the exact location of Company #1:
GET companies/_search
{
"fields": [
"company_name",
"branch.address.location"
],
"_source": false,
"sort": [
{
"_geo_distance": {
"branch.address.location": {
"lon": 39.615,
"lat": 19.8948
},
"order": "asc",
"unit": "km"
}
}
]
}
Am I doing something wrong? Is there a way to sort the search results using this method?
Please keep in mind that if, for example, I search with a geolocation that is closer to one of the geolocations of Company #2, then I need Company #2 to come first.
Finally, if my setup isn't right for what I need, or if there's another way to achieve the same result with a different document structure, please let me know. I'm still at the beginning of the project, and it's easy to adapt to whatever is more appropriate.
The documentation here says: "Geopoint expressed as a string with the format: "lat,lon"."
Your location is "location": "39.615,19.8948", so 39.615 is the latitude and 19.8948 the longitude, and the query should probably be:
"branch.address.location": {
"lat": 39.615,
"lon": 19.8948
}
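Applying that to the original query from the question, only the lat and lon values need to be swapped; the rest of the request stays the same:
GET companies/_search
{
  "fields": [
    "company_name",
    "branch.address.location"
  ],
  "_source": false,
  "sort": [
    {
      "_geo_distance": {
        "branch.address.location": {
          "lat": 39.615,
          "lon": 19.8948
        },
        "order": "asc",
        "unit": "km"
      }
    }
  ]
}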
My Tests:
PUT idx_test
{
"mappings": {
"properties": {
"branch": {
"properties": {
"address": {
"properties": {
"location": {
"type": "geo_point"
}
}
}
}
}
}
}
}
POST idx_test/_doc/1
{
"company_name": "Company #1",
"branch": [
{
"address": {
"location": "39.615,19.8948"
}
}
]
}
POST idx_test/_doc/2
{
"company_name": "Company #2",
"branch": [
{
"address": {
"location": "39.586,19.9028"
}
},
{
"address": {
"location": "39.612,19.9134"
}
},
{
"address": {
"location": "39.607,19.8946"
}
}
]
}
Search by location "39.607,19.8946" company #2
GET idx_test/_search?
{
"fields": [
"company_name",
"branch.address.location"
],
"_source": false,
"sort": [
{
"_geo_distance": {
"branch.address.location": {
"lat": 39.607,
"lon": 19.8946
},
"order": "asc",
"unit": "km"
}
}
]
}
Response:
"hits": [
{
"_index": "idx_test",
"_id": "2",
"_score": null,
"fields": {
"branch.address.location": [
{
"coordinates": [
19.9028,
39.586
],
"type": "Point"
},
{
"coordinates": [
19.9134,
39.612
],
"type": "Point"
},
{
"coordinates": [
19.8946,
39.607
],
"type": "Point"
}
],
"company_name": [
"Company #2"
]
},
"sort": [
0
]
},
{
"_index": "idx_test",
"_id": "1",
"_score": null,
"fields": {
"branch.address.location": [
{
"coordinates": [
19.8948,
39.615
],
"type": "Point"
}
],
"company_name": [
"Company #1"
]
},
"sort": [
0.8897252783915647
]
}
]
Search by location "39.615,19.8948" company #1
GET idx_test/_search?
{
"fields": [
"company_name",
"branch.address.location"
],
"_source": false,
"sort": [
{
"_geo_distance": {
"branch.address.location": {
"lat": 39.615,
"lon": 19.8948
},
"order": "asc",
"unit": "km"
}
}
]
}
Response
"hits": [
{
"_index": "idx_test",
"_id": "1",
"_score": null,
"fields": {
"branch.address.location": [
{
"coordinates": [
19.8948,
39.615
],
"type": "Point"
}
],
"company_name": [
"Company #1"
]
},
"sort": [
0
]
},
{
"_index": "idx_test",
"_id": "2",
"_score": null,
"fields": {
"branch.address.location": [
{
"coordinates": [
19.9028,
39.586
],
"type": "Point"
},
{
"coordinates": [
19.9134,
39.612
],
"type": "Point"
},
{
"coordinates": [
19.8946,
39.607
],
"type": "Point"
}
],
"company_name": [
"Company #2"
]
},
"sort": [
0.8897285575578558
]
}
]
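A side note on the multi-branch case: when a document has several branch locations, _geo_distance sorting reduces the per-branch distances with the mode option, and for ascending order the default is min, so the closest branch decides the document's sort value. That is exactly the behaviour asked for above. To make it explicit, the sort can be written like this (a sketch against the idx_test index from my tests):
GET idx_test/_search
{
  "fields": [
    "company_name",
    "branch.address.location"
  ],
  "_source": false,
  "sort": [
    {
      "_geo_distance": {
        "branch.address.location": {
          "lat": 39.607,
          "lon": 19.8946
        },
        "order": "asc",
        "unit": "km",
        "mode": "min"
      }
    }
  ]
}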
Related
I am trying to simulate a watch and see if the actions trigger fine. But my problem is that the search returns no results.
My query:
Checks for a particular index.
Checks for a range.
Checks that the servicename field has a particular value.
This is my watch definition
{
"trigger": {
"schedule": {
"interval": "10m"
}
},
"input": {
"search": {
"request": {
"search_type": "query_then_fetch",
"indices": [
"datasolutions-svc-*"
],
"body": {
"query": {
"bool": {
"filter": [
{
"term": {
"level": {
"value": "ERROR"
}
}
},
{
"term": {
"servicename": [
"Iit.Det.Urm.MepsSubscriber"
]
}
},
{
"range": {
"#timestamp": {
"gte": "now-60m"
}
}
}
]
}
}
}
}
}
},
"condition": {
"compare": {
"ctx.payload.hits.total": {
"gt": 0
}
}
},
"actions": {
"notify-slack": {
"slack": {
"account": "elastic_watcher_alerts",
"proxy": {
"host": "proxy.dom",
"port": 80
},
"message": {
"from": "Error Monitor",
"to": [
"#det-errors"
],
"text": "The following error(s) have been logged",
"dynamic_attachments": {
"list_path": "ctx.payload.items",
"attachment_template": {
"color": "#f00",
"title": "{{msg}}",
"title_link": "https://elastic.mid.dom:port/{{index}}/doc/{{id}}?pretty",
"text": "{{msg}}",
"fields": [
{
"title": "Server",
"value": "{{host}}",
"short": true
},
{
"title": "Servicename",
"value": "{{service}}",
"short": true
}
]
}
}
}
}
}
},
"transform": {
"script": {
"source": "['items': ctx.payload.hits.hits.collect(hit -> ['msg': hit._source.message, 'service': hit._source.servicename, 'index': hit._index, 'id' : hit._id, 'host': hit._source.agent.hostname ])]",
"lang": "painless"
}
}
}
I am now trying to test it by using the simulate option and giving it an input. This input is copied from actual data that is in the index. I copied a JSON document from Kibana (in the Discover section), so the alternative input JSON should be OK.
Here's the alternative input:
{
"_index": "datasolutions-svc-live-7.7.0-2021.01",
"_type": "doc",
"_id": "Hre9SHcB1QIqYEnyxSCw",
"_version": 1,
"_score": null,
"_source": {
"exception": "System.Data.SqlClient.SqlException (0x80131904): blabla",
"agent": {
"hostname": "SATSVC3-DK1",
"name": "datasolutions-svc-live",
"id": "8c826ae1-e411-4257-a31f-08824dd58b5a",
"type": "filebeat",
"ephemeral_id": "e355bf8a-be67-4ed1-85f4-b9043674700e",
"version": "7.7.0"
},
"log": {
"file": {
"path": "D:\\logs\\7DaysRetention\\Iit.Det.Urm.MepsSubscriber\\Iit.Det.Urm.MepsSubscriber.log.20210128.log"
},
"offset": 17754757
},
"level": "ERROR",
"message": "Error while starting service.",
"#timestamp": "2021-02-17T10:00:28.343Z",
"ecs": {
"version": "1.5.0"
},
"host": {
"name": "datasolutions-svc-live"
},
"servicename": "Iit.Det.Urm.MepsSubscriber",
"codelocation": "Iit.Det.Urm.MepsSubscriber.MepsSubscriberService.OnStart:29"
},
"fields": {
"#timestamp": [
"2021-02-17T10:00:28.343Z"
]
},
"highlight": {
"servicename": [
"#kibana-highlighted-field#Iit.Det.Urm.MepsSubscriber#/kibana-highlighted-field#"
]
},
"sort": [
1611833128343
]
}
But when I run "simulate", I get ctx.payload.hits.total as null because apparently it does not find any results. Result of the simulate:
{
"watch_id": "_inlined_",
"node": "eMS-E34eT4-zZhGwtPNSmw",
"state": "execution_not_needed",
"user": "sum",
"status": {
"state": {
"active": true,
"timestamp": "2021-02-17T10:57:04.077Z"
},
"last_checked": "2021-02-17T10:57:04.077Z",
"actions": {
"notify-slack": {
"ack": {
"timestamp": "2021-02-17T10:57:04.077Z",
"state": "awaits_successful_execution"
}
}
},
"execution_state": "execution_not_needed",
"version": -1
},
"trigger_event": {
"type": "manual",
"triggered_time": "2021-02-17T10:57:04.077Z",
"manual": {
"schedule": {
"scheduled_time": "2021-02-17T10:57:04.077Z"
}
}
},
"input": {
"search": {
"request": {
"search_type": "query_then_fetch",
"indices": [
"datasolutions-svc-*"
],
"rest_total_hits_as_int": true,
"body": {
"query": {
"bool": {
"filter": [
{
"term": {
"level": {
"value": "ERROR"
}
}
},
{
"term": {
"servicename": [
"Iit.Det.Urm.MepsSubscriber"
]
}
},
{
"range": {
"#timestamp": {
"gte": "now-60m"
}
}
}
]
}
}
}
}
}
},
"condition": {
"compare": {
"ctx.payload.hits.total": {
"gt": 0
}
}
},
"metadata": {
"name": "datasolutions-svc-mepssubscriber",
"xpack": {
"type": "json"
}
},
"result": {
"execution_time": "2021-02-17T10:57:04.077Z",
"execution_duration": 0,
"input": {
"type": "simple",
"status": "success",
"payload": {
"highlight": {
"servicename": [
"#kibana-highlighted-field#Iit.Det.Urm.MepsSubscriber#/kibana-highlighted-field#"
]
},
"_index": "datasolutions-svc-live-7.7.0-2021.01",
"_type": "doc",
"_source": {
"exception": "System.Data.SqlClient.SqlException (0x80131904): blabla",
"agent": {
"hostname": "SATSVC3-DK1",
"name": "datasolutions-svc-live",
"id": "8c826ae1-e411-4257-a31f-08824dd58b5a",
"type": "filebeat",
"ephemeral_id": "e355bf8a-be67-4ed1-85f4-b9043674700e",
"version": "7.7.0"
},
"#timestamp": "2021-02-17T10:00:28.343Z",
"ecs": {
"version": "1.5.0"
},
"log": {
"file": {
"path": "D:\\logs\\7DaysRetention\\Iit.Det.Urm.MepsSubscriber\\Iit.Det.Urm.MepsSubscriber.log.20210128.log"
},
"offset": 17754757
},
"level": "ERROR",
"host": {
"name": "datasolutions-svc-live"
},
"servicename": "Iit.Det.Urm.MepsSubscriber",
"message": "Error while starting service.",
"codelocation": "Iit.Det.Urm.MepsSubscriber.MepsSubscriberService.OnStart:29"
},
"_id": "Hre9SHcB1QIqYEnyxSCw",
"sort": [
1611833128343
],
"_score": null,
"fields": {
"#timestamp": [
"2021-02-17T10:00:28.343Z"
]
},
"_version": 1
}
},
"condition": {
"type": "compare",
"status": "success",
"met": false,
"compare": {
"resolved_values": {
"ctx.payload.hits.total": null
}
}
},
"actions": []
},
"messages": []
}
I am not sure why it can't find the results. Can someone tell me what I am doing wrong?
I was able to solve it using the "Inspect" section of the Discover page for the index.
Finally, my input for the watcher query had to be changed to:
"input": {
"search": {
"request": {
"search_type": "query_then_fetch",
"indices": [
"datasolutions-svc-*"
],
"rest_total_hits_as_int": true,
"body": {
"query": {
"bool": {
"must": [],
"filter": [
{
"bool": {
"should": [
{
"match_phrase": {
"servicename": "Iit.Det.Urm.MepsSubscriber"
}
}
],
"minimum_should_match": 1
}
},
{
"match_phrase": {
"level": "ERROR"
}
},
{
"range": {
"#timestamp": {
"gte": "now-10m",
"format": "strict_date_optional_time"
}
}
}
],
"should": [],
"must_not": []
}
}
}
}
}
}
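A note on why the simulation above resolved ctx.payload.hits.total to null: when an alternative input is passed to the Execute Watch API, that object replaces the whole search payload, so a raw document copied from Discover has no hits.total for the condition to read. If you want the condition to fire during a simulation, the alternative input has to mimic the shape of a search response. A minimal sketch, where the hits wrapper is my assumption and the field values are taken from the document above:
POST _watcher/watch/_execute
{
  "watch": {
    // the inline watch definition from above goes here unchanged
  },
  "alternative_input": {
    "hits": {
      "total": 1,
      "hits": [
        {
          "_index": "datasolutions-svc-live-7.7.0-2021.01",
          "_id": "Hre9SHcB1QIqYEnyxSCw",
          "_source": {
            "level": "ERROR",
            "servicename": "Iit.Det.Urm.MepsSubscriber",
            "message": "Error while starting service.",
            "@timestamp": "2021-02-17T10:00:28.343Z"
          }
        }
      ]
    }
  }
}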
I'm on Elasticsearch 7.7 and I'm using the official PHP client to interact with the server.
My issue was somewhat solved here: https://discuss.elastic.co/t/need-to-return-part-of-a-doc-from-a-search-query-filter-is-parent-child-the-way-to-go/64514/2
However, "Types are deprecated in APIs in 7.0+": https://www.elastic.co/guide/en/elasticsearch/reference/7.x/removal-of-types.html
Here is my document:
{
"offering_id": "1190",
"account_id": "362353",
"service_id": "20087",
"title": "Quick Brown Mammal",
"slug": "Quick Brown Fox",
"summary": "Quick Brown Fox"
"header_thumb_path": "uploads/test/test.png",
"duration": "30",
"alter_ids": [
"59151",
"58796",
"58613",
"54286",
"51812",
"50052",
"48387",
"37927",
"36685",
"36554",
"28807",
"23154",
"22356",
"21480",
"220",
"1201",
"1192"
],
"premium": "f",
"featured": "f",
"events": [
{
"event_id": "9999",
"start_date": "2020-07-01 14:00:00",
"registration_count": "22",
"description": "boo"
},
{
"event_id": "9999",
"start_date": "2020-07-01 14:00:00",
"registration_count": "22",
"description": "xyz"
},
{
"event_id": "9999",
"start_date": "2020-08-11 11:30:00",
"registration_count": "41",
"description": "test"
}
]
}
Notice how the object may have one or many "events"
Searching based on event data is the most common use case.
For example:
Find events that start before 12pm
Find events with a description of "xyz"
Find events with a start date in the next 10 days.
I would like to NOT return any events that didn't match the query!
So, for example: find events with a description of "xyz" for a given service:
{
"query": {
"bool": {
"must": {
"match": {
"events.description": "xyz"
}
},
"filter": {
"bool": {
"must": [
{
"term": {
"service_id": 20087
}
}
]
}
}
}
}
}
I would want the result to look like this:
{
"offering_id": "1190",
"account_id": "362353",
"service_id": "20087",
"title": "Quick Brown Mammal",
"slug": "Quick Brown Fox",
"summary": "Quick Brown Fox"
"header_thumb_path": "uploads/test/test.png",
"duration": "30",
"alter_ids": [
"59151",
"58796",
"58613",
"54286",
"51812",
"50052",
"48387",
"37927",
"36685",
"36554",
"28807",
"23154",
"22356",
"21480",
"220",
"1201",
"1192"
],
"premium": "f",
"featured": "f",
"events": [
{
"event_id": "9999",
"start_date": "2020-07-01 14:00:00",
"registration_count": "22",
"description": "xyz"
}
]
}
However, instead it just returns the ENTIRE document, with all events.
Is it even possible to return only a subset of the data? Maybe with aggregations?
Right now, we're doing an "extra" round of filtering on the result set in the application (PHP in this case) to strip out the event blocks that don't match the desired results.
It would be nice to have Elasticsearch return exactly what's needed instead of doing extra processing on the result to pull out the applicable events.
I thought about restructuring the data around "events" instead, but then I would be duplicating data, since every event would also carry the parent offering data.
This used to be in SQL, where there was a relation instead of having the data nested like this.
A subset of the nested data can be returned using a Nested Aggregation along with a Filter Aggregation.
To learn more about these aggregations, refer to the official documentation:
Filter Aggregation
Nested Aggregation
Index Mapping:
{
"mappings": {
"properties": {
"offering_id": {
"type": "integer"
},
"account_id": {
"type": "integer"
},
"service_id": {
"type": "integer"
},
"title": {
"type": "text"
},
"slug": {
"type": "text"
},
"summary": {
"type": "text"
},
"header_thumb_path": {
"type": "keyword"
},
"duration": {
"type": "integer"
},
"alter_ids": {
"type": "integer"
},
"premium": {
"type": "text"
},
"featured": {
"type": "text"
},
"events": {
"type": "nested",
"properties": {
"event_id": {
"type": "integer"
},
"registration_count": {
"type": "integer"
},
"description": {
"type": "text"
}
}
}
}
}
}
Search Query:
{
"size": 0,
"aggs": {
"nested": {
"nested": {
"path": "events"
},
"aggs": {
"filter": {
"filter": {
"match": { "events.description": "xyz" }
},
"aggs": {
"total": {
"top_hits": {
"size": 10
}
}
}
}
}
}
}
}
Search Result:
"hits": [
{
"_index": "foo21",
"_type": "_doc",
"_id": "1",
"_nested": {
"field": "events",
"offset": 1
},
"_score": 1.0,
"_source": {
"event_id": "9999",
"start_date": "2020-07-01 14:00:00",
"registration_count": "22",
"description": "xyz"
}
}
]
Second Method: a nested query with inner_hits:
{
"query": {
"bool": {
"must": [
{
"match": {
"service_id": "20087"
}
},
{
"nested": {
"path": "events",
"query": {
"bool": {
"must": [
{
"match": {
"events.description": "xyz"
}
}
]
}
},
"inner_hits": {
}
}
}
]
}
}
}
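With this second method, each matching top-level hit comes back with an extra inner_hits section that contains only the nested events that matched the nested query, roughly like this (abridged, exact metadata will differ):
"hits": [
  {
    "_index": "foo21",
    "_id": "1",
    "_source": {
      // ... the full offering document ...
    },
    "inner_hits": {
      "events": {
        "hits": {
          "hits": [
            {
              "_nested": {
                "field": "events",
                "offset": 1
              },
              "_source": {
                "event_id": "9999",
                "start_date": "2020-07-01 14:00:00",
                "registration_count": "22",
                "description": "xyz"
              }
            }
          ]
        }
      }
    }
  }
]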
You can also go through these SO answers:
How to filter nested aggregation bucket?
Returning a partial nested document in ElasticSearch
I'm new to Elasticsearch and I have a few questions regarding nested object retrieval when a specific condition is matched.
I have a tree-like structure as follow:
{
"id": 4,
"sora": [
{
"pContext": {
"context": {
"sT": "D3",
"uT": "ST"
},
"entities": [
{
"name": "premium",
"bName": "premium",
"fT": "site",
"eT": "F_P",
"children": [
{
"name": "capa",
"bName": "capa",
"fT": "site",
"eT": "FFT",
"children": []
},
{
"name": "code",
"bName": "Codes",
"fT": "site",
"eT": "FFT",
"children": []
},
{
"name": "selection A",
"fT": "site",
"eT": "SELECTION_A",
"children": [
{
"name": "A1",
"fT": "site",
"eT": "ADD",
"children": []
},
{
"name": "A2",
"fT": "site",
"eT": "ADD",
"children": []
}
]
}
]
}
]
}
},
{
"pContext": {
"context": {
"sT": "D2",
"uT": "ST"
},
"entities": [
{
"name": "112",
"bName": "112",
"eT": "D_TYPE",
"children": []
}
]
}
}
]
}
My structure can have more levels.
I have many documents as described above. In order to filter my documents I can use the simple query syntax:
{
"_source": {
"excludes": [
"*.context"
]
},
"query": {
"bool": {
"must": [
{
  "match": {
    "sora.pContext.context.sT": "D3"
  }
},
{
  "match": {
    "sora.pContext.entities.name": "premium"
  }
},
{
  "match": {
    "sora.pContext.entities.fT": "site"
  }
}
]
}
}
}
What I would like to know is: how can I get the nested object that matches my query, together with its children? I need the object that matched the must (inclusive) filter. Is that possible?
How can I search for a field without specifying the path?
Thanks
EDIT:
My mapping:
{
"mappings": {
"abc": {
"properties": {
"id": {
"type": "integer"
},
"sora": {
"type": "nested",
"properties": {
"pContext": {
"type": "nested",
"properties": {
"context": {
"type": "nested",
"properties": {
"sT": {
"type": "text"
},
"uT": {
"type": "text"
}
}
},
"entities": {
"type": "nested",
"properties": {
"name": {
"type": "text"
},
"bName": {
"type": "text"
},
"fT": {
"type": "text"
},
"eT": {
"type": "text"
},
"children": {
"type": "object"
}
}
}
}
}
}
}
}
}
}
}
Yes, you can get the matching objects by using inner_hits along with a nested query (not the flat match query you added to the question).
Your query will look as below:
{
"_source": {
"excludes": [
"*.context"
]
},
"query": {
"bool": {
"filter": [
{
"nested": {
"inner_hits": {},
"path": "sora.pContext",
"query": {
"bool": {
"must": [
{
"nested": {
"path": "sora.pContext.context",
"query": {
"bool": {
"must": [
{
"match": {
"sora.pContext.context.sT": "D3"
}
}
]
}
}
}
},
{
"nested": {
"path": "sora.pContext.entities",
"query": {
"bool": {
"must": [
{
"match": {
"sora.pContext.entities.name": "premium"
}
},
{
"match": {
"sora.pContext.entities.fT": "site"
}
}
]
}
}
}
}
]
}
}
}
}
]
}
}
}
I have added a link to the inner_hits documentation, where you can see how the results will look.
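To give an idea of the shape (abridged, assuming the mapping from the question; the exact contents also depend on your _source filtering), each hit will carry an inner_hits block keyed by the nested path, holding only the sora.pContext objects that matched:
"inner_hits": {
  "sora.pContext": {
    "hits": {
      "hits": [
        {
          "_source": {
            "context": {
              "sT": "D3",
              "uT": "ST"
            },
            "entities": [
              {
                "name": "premium",
                "bName": "premium",
                "fT": "site",
                "eT": "F_P",
                "children": [
                  // ... the matched entity's children come along in its source ...
                ]
              }
            ]
          }
        }
      ]
    }
  }
}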
Well, if someone else is facing the same issue: my solution was to add all children at the same path/level as the parent, while keeping the mapping with the parent and its children. With that, I'm able to search and retrieve just the parts of the parent I want.
I've tried to wrap my head around nested queries but I can't get this to work.
I have 2 items in ES that look like this:
{
"_index": "catalog",
"_type": "products",
"_source": {
"product": {
"ean": "abc",
"features": {
"Product Type": "DVD player",
},
"color": "Black",
"manufacturer": "Sony",
"sitedetails": [
{
"name": "amazon.com",
"sku": "zzz",
"url": "http://www.amazon.com/dp/zzz"
}
],
"category": "Portable DVD Players"
}
}
},
{
"_index": "catalog",
"_type": "products",
"_source": {
"product": {
"ean": "def",
"features": {
"Product Type": "MP3 player",
},
"color": "Black",
"manufacturer": "LG",
"sitedetails": [
{
"name": "amazon.com",
"sku": "aaa",
"url": "http://www.amazon.com/dp/aaa"
}
],
"category": "MP3 Players"
}
}
}
2 questions:
What is the curl to get sku = zzz?
What is the curl to get both items on a search for "players"?
tnx!
Hey bro, let's do the magic.
First, you need a mapping that includes your nested objects, like this:
curl -XPUT "http://192.168.99.100:9200/catalog" -d'
{
"mappings": {
"products": {
"properties": {
"product": {
"type": "nested",
"properties": {
"features": {
"type":"nested"
},
"sitedetails": {
"type": "nested"
}
}
}
}
}
}
}'
After that, let's insert your data (change your Product Type to product_type):
curl -XPOST "http://192.168.99.100:9200/catalog/products" -d'
{
"product": {
"ean": "abc",
"features": {
"product_type": "DVD player"
},
"color": "Black",
"manufacturer": "Sony",
"sitedetails": [
{
"name": "amazon.com",
"sku": "zzz",
"url": "http://www.amazon.com/dp/zzz"
}
],
"category": "Portable DVD Players"
}
}'
Now, let's run the query:
curl -XPOST "http://192.168.99.100:9200/catalog/products/_search" -d'
{
"query": {
"bool": {
"must": [
{
"nested": {
"path": "product.features",
"query": {
"match": {
"product.features.product_type": "player"
}
}
}
},
{
"nested": {
"path": "product.sitedetails",
"query": {
"match": {
"product.sitedetails.sku": "zzz"
}
}
}
}
]
}
}
}'
And the response will be:
"hits": {
"total": 1,
"max_score": 1.4054651,
"hits": [
{
"_index": "catalog",
"_type": "products",
"_id": "AVM_fcYgvVoSi3OfqPTX",
"_score": 1.4054651,
"_source": {
"product": {
"ean": "abc",
"features": {
"Product Type": "DVD player"
},
"color": "Black",
"manufacturer": "Sony",
"sitedetails": [
{
"name": "amazon.com",
"sku": "zzz",
"url": "http://www.amazon.com/dp/zzz"
}
],
"category": "Portable DVD Players"
}
}
}
]
}
Hope it helps :D
Use:
curl 'http://localhost:9200/catalog/products/_search?q=sku:"zzz"&pretty=true'
curl 'http://localhost:9200/catalog/products/_search?q=sku:*&pretty=true' (if, as I suspect, you want to get the data for both sku:"zzz" and sku:"aaa").
References:
http://joelabrahamsson.com/elasticsearch-101/
http://www.elasticsearchtutorial.com/elasticsearch-in-5-minutes.html
I want to append a script field to an Elasticsearch result, but I can't find a working solution.
I have a script field like this:
{
"script_fields": {
"distance": {
"script": "doc[my_field_name].arcDistance(my_lat, my_lon)",
"params": {
"my_field_name": "geopoint",
"my_lat": 52.5,
"my_lon": 13.4
}
}
}
}
As a result I get something like this:
"hits": [
{
"fields": {
"distance": [
0
]
}
},
{
"fields": {
"distance": [
500
]
}
},
{
"fields": {
"distance": [
1000
]
}
}
]
But I need full documents together with the script fields. So I've tried this:
{
"script_fields": {
"distance": {
"script": "doc[my_field_name].arcDistance(my_lat, my_lon)",
"params": {
"my_field_name": "geopoint",
"my_lat": 52.5,
"my_lon": 13.4
}
},
"source": {
"script": "_source"
}
}
}
But as a result I get something like this:
"hits": [
{
"fields": {
"distance": [
0
],
"source": [
{
"id": "101",
"geopoint": {
"lon": 52.5,
"lat": 13.4
}
}
]
}
},
{
"fields": {
"distance": [
500
],
"source": [
{
"id": "101",
"geopoint": {
"lon": 52.5,
"lat": 13.4
}
}
]
}
},
{
"fields": {
"distance": [
1000
],
"source": [
{
"id": "101",
"geopoint": {
"lon": 52.5,
"lat": 13.4
}
}
]
}
}
]
The source is in this case the same for all hits. I thought _source was loaded per document, but it doesn't look like it.
How can I get the script field together with the document in the result, or isn't this possible?
I was on the wrong track. The solution was to change the request to:
{
"fields": [
"_source"
],
"script_fields": {
"distance": {
"script": "doc[my_field_name].arcDistance(my_lat, my_lon)",
"params": {
"my_field_name": "geopoint",
"my_lat": 52.5,
"my_lon": 13.4
}
}
}
}
The result then looks something like this:
"hits": [
{
"fields": {
"distance": [
0
]
},
"_source": {
  "id": "101",
  "geopoint": {
    "lat": 52.5,
    "lon": 13.4
  }
}
},
{
"fields": {
"distance": [
500
]
},
"_source": {
  "id": "102",
  "geopoint": {
    "lat": 52.5,
    "lon": 13.40739378
  }
}
},
{
"fields": {
"distance": [
1000
]
},
"_source": {
  "id": "103",
  "geopoint": {
    "lat": 52.5,
    "lon": 13.4147875
  }
}
}
]
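As a side note for newer Elasticsearch versions (5.x and later), where scripting is Painless and params are referenced explicitly, the same idea can be expressed by requesting _source alongside the script field. A minimal sketch, with my_index standing in for your index name:
GET my_index/_search
{
  "_source": true,
  "script_fields": {
    "distance": {
      "script": {
        "lang": "painless",
        "source": "doc[params.my_field_name].arcDistance(params.my_lat, params.my_lon)",
        "params": {
          "my_field_name": "geopoint",
          "my_lat": 52.5,
          "my_lon": 13.4
        }
      }
    }
  }
}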