Elasticsearch: get the intersection of coordinates

I have an index, field, containing documents like the one below. Given a set of coordinates for a polygon, I want to get any documents that intersect those coordinates. Is this possible with Elasticsearch's geo features, given the structure of this index? I am using Elasticsearch version 7.9.
{
"geo_json": {
"geometry": {
"coordinates": [
[
[
2.1228971832029644,
41.3011586218355
],
[
2.122596585111679,
41.3012384865674
],
[
2.1221786804481835,
41.30191870980272
],
[
2.1223509744761158,
41.302042636348716
],
[
2.1226735685285507,
41.30192972550523
],
[
2.1232820963718857,
41.30165984025794
],
[
2.1232820963718857,
41.30131559725024
],
[
2.1228971832029644,
41.3011586218355
]
]
],
"type": "Polygon"
},
"properties": null,
"type": "Feature"
},
"name": "Barcelona"
}
I have tried the following query, which returned the error "Field [geo_json.geometry.coordinates] is of unsupported type [float]. [geo_shape] query supports the following types [geo_shape,geo_point]":
GET field/_search
{
"query": {
"bool": {
"filter": {
"geo_shape": {
"geo_json.geometry.coordinates": {
"shape": {
"type": "polygon",
"coordinates": [
[
[
2.1228971832029644,
41.3011586218355
],
[
2.122596585111679,
41.3012384865674
],
[
2.1221786804481835,
41.30191870980272
],
[
2.1223509744761158,
41.302042636348716
],
[
2.1226735685285507,
41.30192972550523
],
[
2.1232820963718857,
41.30165984025794
],
[
2.1232820963718857,
41.30131559725024
],
[
2.1228971832029644,
41.3011586218355
]
]
]
},
"relation": "intersects"
}
}
}
}
}
}
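A quick way to see why this fails (a sketch, assuming the index was created without an explicit mapping): dynamic mapping indexes the numeric coordinate arrays as plain float fields, which is exactly what the error says. How the field actually ended up mapped can be confirmed with:
GET field/_mapping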

It surely is possible. Adapting my previous answer to your use case:
Set up the index mapping
PUT geoindex
{
"mappings": {
"properties": {
"area": {
"type": "float"
},
"center": {
"type": "geo_point"
},
"geo_json": {
"type": "geo_shape"
},
"name": {
"type": "text"
}
}
}
}
Add the Barcelona polygon
POST geoindex/_doc
{
"area": 5380.8444064004625,
"center": [ 2.1227303884100346, 41.30160062909211 ],
"geo_json": {
"type": "polygon",
"coordinates": [[[2.1228971832029644,41.3011586218355],[2.122596585111679,41.3012384865674],[2.1221786804481835,41.30191870980272],[2.1223509744761158,41.302042636348716],[2.1226735685285507,41.30192972550523],[2.1232820963718857,41.30165984025794],[2.1232820963718857,41.30131559725024],[2.1228971832029644,41.3011586218355]]]
},
"name": "Barcelona"
}
Check for the intersection with a second, overlapping query polygon:
POST geoindex/_search
{
"query": {
"geo_shape": {
"geo_json": {
"relation": "intersects",
"shape": {
"type": "polygon",
"coordinates": [[[2.122421264648437,41.30061251600798],[2.123579978942871,41.300354591849114],[2.123579978942871,41.30120896171846],[2.122957706451416,41.30129762210163],[2.122421264648437,41.30061251600798]]]
}
}
}
}
}
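If you would rather keep the Feature wrapper from your original document instead of indexing the bare geometry, a variant of the same idea (a sketch, not something I have run against your data) is to map only the nested geometry object as geo_shape, since geo_shape fields accept GeoJSON geometry objects directly:
PUT geoindex_feature
{
  "mappings": {
    "properties": {
      "name": { "type": "text" },
      "geo_json": {
        "properties": {
          "geometry": { "type": "geo_shape" }
        }
      }
    }
  }
}
POST geoindex_feature/_doc
{
  "name": "Barcelona",
  "geo_json": {
    "type": "Feature",
    "properties": null,
    "geometry": {
      "type": "Polygon",
      "coordinates": [[[2.1228971832029644,41.3011586218355],[2.122596585111679,41.3012384865674],[2.1221786804481835,41.30191870980272],[2.1223509744761158,41.302042636348716],[2.1226735685285507,41.30192972550523],[2.1232820963718857,41.30165984025794],[2.1232820963718857,41.30131559725024],[2.1228971832029644,41.3011586218355]]]
    }
  }
}
The intersects query then targets geo_json.geometry (the whole shape field) rather than geo_json.geometry.coordinates.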

Related

What is the best way in Elasticsearch to find available points (merchants) around a point (user) if these points (merchants) each use a different radius?

I used geo_shape for this problem and want to know if there are better (faster) ways to solve it. The ES version is 6.5.
mapping:
{
"index": {
"mappings": {
"merchant": {
"_all": {
"enabled": false
},
"properties": {
"delivery_circle": {
"type": "geo_shape",
"tree": "quadtree",
"precision": "50.0m",
"distance_error_pct": 0.025
}
}
}
}
}
}
document example:
{
"_source": {
"id": 1,
"delivery_circle": {
"coordinates": [ // merchant location, its radius is 4km
123.456,
1.2345
],
"radius": "4000m",
"type": "circle"
}
}
}
{
"_source": {
"id": 2,
"delivery_circle": {
"coordinates": [ // merchant location, its radius is 5km
123.567,
1.3456
],
"radius": "5000m",
"type": "circle"
}
}
}
search query example:
{
"query": {
"bool": {
"filter": [
{
"geo_shape": {
"delivery_circle": {
"relation": "contains",
"shape": {
"coordinates": [ // user location
123,
1
],
"type": "point"
}
}
}
}
]
}
}
}
This is an off-topic suggestion, but you could consider using a geohash for this. It can reduce your search time complexity.
https://en.wikipedia.org/wiki/Geohash
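For completeness, a rough sketch of how the geohash idea could look (the field name, precision, and cell values here are purely illustrative, not taken from the question): at indexing time, precompute the geohash cells that cover each merchant's delivery circle and store them as keywords, then resolve the user's location to a cell of the same precision and use a plain term query instead of a spatial one.
PUT merchants_geohash
{
  "mappings": {
    "properties": {
      "id": { "type": "integer" },
      "delivery_geohashes": { "type": "keyword" }
    }
  }
}
POST merchants_geohash/_doc
{
  "id": 1,
  "delivery_geohashes": [ "w959h", "w959j", "w959k" ] // precision-5 cells covering the 4km circle (illustrative values)
}
GET merchants_geohash/_search
{
  "query": {
    "term": {
      "delivery_geohashes": "w959j" // precision-5 geohash of the user's location
    }
  }
}
The trade-off is precision at the circle's border versus the number of cells stored per merchant.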

Find coordinates in a polygon

How can I find polygons that are stored in an Elasticsearch index?
Simple mapping:
PUT /regions
{
"mappings": {
"properties": {
"location": {
"type": "geo_shape"
}
}
}
}
And a simple polygon:
PUT /regions/_doc/1
{
"location" : {
"type" : "polygon",
"coordinates" : [
[
[53.847332102970626,27.485155519098047],
[53.84626875748117,27.487134989351038],
[53.8449047241684,27.48501067981124],
[53.84612634308789,27.482945378869765],
[53.847411219859,27.48502677306532],
[53.847332102970626,27.485155519098047]
]
]
}
}
According to the documentation, the geo-polygon query only lets me search for points that lie inside a polygon supplied in the request, but I need the opposite: to find stored polygons by a coordinate given in the query. I am on Elasticsearch version 7.6.
Query:
{
"query": {
"match_all": {}
},
"filter": {
"geo_shape": {
"geometry": {
"shape": {
"coordinates": [
53.846415,
27.485756
],
"type": "point"
},
"relation": "whithin"
}
}
}
}
You were on the right path but your query was heavily malformed. Here's the fix:
{
"query": {
"bool": {
"filter": {
"geo_shape": {
"location": {
"shape": {
"coordinates": [
53.846415,
27.485756
],
"type": "point"
},
"relation": "intersects"
}
}
}
}
}
}
Notice how I used intersects instead of within: with within, the indexed polygon would have to lie entirely inside the query shape, which can never happen when the query shape is a single point, whereas intersects matches any shared point. The reason is explained in more detail in this GIS StackExchange answer.

Elasticsearch - return polygons containing a given point ("relation": "intersects" returns the union instead)

I have multiple polygons in the index and I'm trying to return all the polygons that contain a given point.
But my query returns all the docs in the index, even when the point doesn't fall inside polygon 1.
It gives me the union instead of the intersection.
Query is given below
Map Ref. http://geojson.io/#map=7/-31.775/144.382
PUT /example3
{
"mappings": {
"_doc": {
"properties": {
"features": {
"properties": {
"geometry": {
"type": "geo_shape"
}
}
}
}
}
}
}
Sample data:
{
"type": "Feature",
"geometry": {
"name": "Poly - 04",
"type": "Polygon",
"coordinates": [
[
[
144.91670608520508,
-37.82524314302977
],
[
144.96846199035645,
-37.82524314302977
],
[
144.96846199035645,
-37.78787789236923
],
[
144.91670608520508,
-37.78787789236923
],
[
144.91670608520508,
-37.82524314302977
]
]
]
}
}
Query:
GET /example3/_search
{
"query":{
"bool": {
"must": {
"match_all": {}
},
"filter": {
"geo_shape": {
"features.geometry": {
"relation": "intersects",
"shape": {
"type": "point",
"coordinates" :
[
146.0138,
-37.1734
]
}
}
}
}
}
}
}
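No answer is recorded for this one, but one mismatch stands out in the snippets above (an observation, not a confirmed fix): the mapping declares the shape under features.geometry, while the sample document has geometry at the document root, so the shapes may never have been indexed into the mapped geo_shape field at all. A quick sketch to verify what actually got indexed:
GET /example3/_mapping
GET /example3/_search
{
  "query": {
    "exists": {
      "field": "features.geometry"
    }
  }
}
If the exists query returns no hits, the documents need to be reindexed with the geometry nested under features (or the query should target the field the shapes actually live in).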

Multi Match Query for multiple words with operator AND

My scenario is that my application has an inline search, just like the one in the header bar of the Udemy site, and the user can type more than one word into it. I want to query that multi-word search text against multiple fields.
The fields I am querying against have the following mapping:
_mapping
{
"category": {
"type": "keyword"
},
"designers": {
"type": "nested",
"properties": {
"name": {
"type": "keyword"
}
}
},
"story": {
"type": "text"
},
"foundryName": {
"type": "text",
}
}
My problem is: how can I do a multi-word search like "designerFirstName1 category1 foundryName1" and get results where each word of the query matches in at least one of the fields I am searching? Also, as I continue to add more words, the result set should get smaller.
Query
{
"query": {
"bool": {
"should": [
{
"nested": {
"path": "designers",
"query": {
"match": {
"designers.name": {
"query": "designerFirstName1 category1 foundryName1",
"fuzziness": "auto"
}
}
}
}
},
{
"multi_match": {
"query": "designerFirstName1 category1 foundryName1",
"type": "cross_fields",
"fields": [
"story",
"foundryName",
"category",
]
}
}
],
"minimum_should_match": 1
}
}
}
The expected result is that a document like the first one below should rank highest, and further down the list the results no longer contain all of the query words in any one of the fields (as shown below):
{
"category": [
"category1",
"category2"
],
"designers": [
{
"name": "designerFirstName1 designerLastName1"
},
{
"name": "designerFirstName2 designerLastName2"
}
],
"story": "Sphinx of black quartz, judge my vow! Sex-charged fop blew my junk TV quiz.",
"foundryName": "foundryName1"
},
{
"category": [
"category2",
"category3"
],
"designers": [
{
"name": "designerFirstName1 designerLastName1"
},
{
"name": "designerFirstName2 designerLastName2"
}
],
"story": "Sphinx of black quartz, judge my vow! Sex-charged fop blew my junk TV quiz.",
"foundryName": "foundryName1"
},
{
"category": [
"category1",
"category3"
],
"designers": [
{
"name": "designerFirstName3 designerLastName1"
},
{
"name": "designerFirstName2 designerLastName2"
}
],
"story": "Sphinx of black quartz, judge my vow! Sex-charged fop blew my junk TV quiz.",
"foundryName": "foundryName1"
},
{
"category": [
"category2",
"category3"
],
"designers": [
{
"name": "designerFirstName3 designerLastName1" /*changed here comparing with the above document*/
},
{
"name": "designerFirstName2 designerLastName2"
}
],
"story": "Sphinx of black quartz, judge my vow! Sex-charged fop blew my junk TV quiz.",
"foundryName": "foundryName1"
},
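No answer is attached to this one either, but the title hints at the usual direction: with "type": "cross_fields", setting "operator": "and" requires every term in the query to match in at least one of the listed fields, so each extra word narrows the result set. A sketch of that change (the nested designers.name field cannot take part in cross_fields, so it would still need its own nested clause or a copy_to field; this is an assumption, not part of the original post):
{
  "query": {
    "multi_match": {
      "query": "designerFirstName1 category1 foundryName1",
      "type": "cross_fields",
      "operator": "and",
      "fields": [ "story", "foundryName", "category" ]
    }
  }
}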

Usage of NestedFilter and changes in mapping

I'm trying to create a filter based on some product attributes that I've indexed. Earlier, they were indexed as follows:
"attributes": {
"name1": "value1",
"name2": "value2",
....
}
The filter I used earlier was generated from the URL query parameters. For example,
"/search?q=product&color=black&color=blue&size=xl"
would lead to:
>>> and_filter = ANDFilter(
[
ORFilter(
[
TermFilter('color', 'blue'),
TermFilter('color', 'black')
]
),
ORFilter(
[
TermFilter('size', 'xl')
]
)
]
)
>>> main_filter = BoolFilter().add_must(and_filter)
Due to some changes in the backend, the mapping had to be changed to a nested one.
New mapping:
"attributes":
{
"type": "nested",
"properties": {
"name": {"type": "string"},
"display_name": {"type": "string"},
"type": {"type": "string"},
"value": {"type": "string"}
}
}
I thought of the new filter like so:
>>> and_filter = ANDFilter(
[
ORFilter(
[
ANDFilter(
[
TermFilter("attributes.name", "color"),
TermFilter("attributes.value", "blue"),
]
),
ANDFilter(
[
TermFilter("attributes.name", "color"),
TermFilter("attributes.value", "black"),
]
)
]
),
ORFilter(
[
ANDFilter(
[
TermFilter("attributes.name", "size"),
TermFilter("attributes.value", "xl"),
]
)
]
)
]
)
>>> nested_filter = NestedFilter("attributes", BoolFilter().add_must(and_filter))
However, this doesn't seem to be the right way to do it. When I try to generate facets over products with this filter applied, all the counts come out to be zero every time.
Additionally, trying to search for products does not yield the expected results when the filter is applied.
I would appreciate some pointers as to how to design the filter correctly.
EDIT:
Old filter:
{
"bool": {
"must": [
{
"and": [
{
"or": [
{
"term": {
"color": "blue"
}
},
{
"term": {
"color": "black"
}
}
]
},
{
"or": [
{
"term": {
"size": "xl"
}
}
]
}
]
}
]
}
}
New filter:
{
"nested": {
"filter": {
"bool": {
"must": [
{
"and": [
{
"or": [
{
"and": [
{
"term": {
"attributes.name": "color"
}
},
{
"term": {
"attributes.value": "blue"
}
}
]
},
{
"and": [
{
"term": {
"attributes.name": "color"
}
},
{
"term": {
"attributes.value": "black"
}
}
]
}
]
},
{
"or": [
{
"and": [
{
"term": {
"attributes.name": "size"
}
},
{
"term": {
"attributes.value": "xl"
}
}
]
}
]
}
]
}
]
}
},
"path": "attributes"
}
}
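No answer is recorded here, but the likely culprit (standard nested-query semantics, not a confirmed fix from this thread) is that wrapping the entire AND inside a single NestedFilter asks Elasticsearch to find one attributes object that is simultaneously a color and a size, which can never exist, so the filter matches nothing and every facet count drops to zero. The usual fix is one nested filter per attribute condition, combined at the top level, roughly:
{
  "bool": {
    "must": [
      {
        "or": [
          {
            "nested": {
              "path": "attributes",
              "filter": {
                "and": [
                  { "term": { "attributes.name": "color" } },
                  { "term": { "attributes.value": "blue" } }
                ]
              }
            }
          },
          {
            "nested": {
              "path": "attributes",
              "filter": {
                "and": [
                  { "term": { "attributes.name": "color" } },
                  { "term": { "attributes.value": "black" } }
                ]
              }
            }
          }
        ]
      },
      {
        "nested": {
          "path": "attributes",
          "filter": {
            "and": [
              { "term": { "attributes.name": "size" } },
              { "term": { "attributes.value": "xl" } }
            ]
          }
        }
      }
    ]
  }
}
In pyes terms, that would mean building a separate NestedFilter("attributes", ...) around each (name, value) pair and joining those with the outer ORFilter/ANDFilter, instead of one NestedFilter around the whole thing.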
