Cannot query geo_point using geo polygon filter

Hi, I'm trying to query a geo_point in ElasticSearch using the geo polygon filter syntax from the official documentation, but no result is returned.
Here are some details:
I use the river plugin to index the data from MySQL into ES, with a mapping definition in which coordinate.value is a geo_point inside a nested structure.
I can see the documents in the head plugin.
The query json is:
{
  "query": {
    "filtered": {
      "query": {
        "match_all": {}
      },
      "filter": {
        "geo_polygon": {
          "coordinate.value": {
            "points": [
              [-180, 90],
              [-180, -90],
              [180, -90],
              [-180, 90]
            ]
          }
        }
      }
    }
  }
}
Can anyone tell me the correct way to query a geo_point? Thanks.

The polygon should be closed (i.e. the first and last points should be the same).
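For reference, a closed world-spanning rectangle repeats its first corner as the last point. A minimal sketch, reusing the question's coordinate.value field and the same filtered/geo_polygon syntax:
{
  "query": {
    "filtered": {
      "query": {
        "match_all": {}
      },
      "filter": {
        "geo_polygon": {
          "coordinate.value": {
            "points": [
              [-180, 90],
              [-180, -90],
              [180, -90],
              [180, 90],
              [-180, 90]
            ]
          }
        }
      }
    }
  }
}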

Related

elasticsearch: DISJOINT query relation not supported for Field

I have created the simplest possible index in ES v7.10; it maps the mylocation field to geo_point (not geo_shape):
PUT /myindex
{
  "mappings": {
    "dynamic": "false",
    "properties": {
      "mylocation": {
        "type": "geo_point"
      }
    }
  }
}
Then I pushed some data in; I've omitted this to keep the question short.
When I run the query below, everything works fine:
{
  "query": {
    "bool": {
      "must": {
        "match_all": {}
      },
      "filter": {
        "geo_shape": {
          "mylocation": {
            "shape": {
              "type": "polygon",
              "coordinates": [[ [13.0, 53.0], [0.0, 1.0], [0.0, 0.0], [13.0, 53.0] ]]
            },
            "relation": "intersects"
          }
        }
      }
    }
  }
}
When I replace intersects with disjoint, I get the error:
DISJOINT query relation not supported for Field [mylocation].
In the Elastic docs (for the relevant version, 7), it is mentioned that the geo_shape query "filter[s] documents indexed using the geo_shape or geo_point type" (and I am using geo_point). Further down the same page it is written that "disjoint" is supported!
What am I missing? Why do I get the error?
No idea how I missed it, but in the v7.10 docs (linked in the question) they write in the next paragraph:
When searching a field of type geo_point there is a single supported spatial relation operator:
INTERSECTS - (default) Return all documents whose geo_point field intersects the query geometry.
In v7.17 they no longer have this limitation...
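If you are stuck on 7.10, one workaround (my own sketch, not taken from the docs) is to negate the supported INTERSECTS relation with must_not, which for point fields is logically equivalent to DISJOINT:
{
  "query": {
    "bool": {
      "must_not": {
        "geo_shape": {
          "mylocation": {
            "shape": {
              "type": "polygon",
              "coordinates": [[ [13.0, 53.0], [0.0, 1.0], [0.0, 0.0], [13.0, 53.0] ]]
            },
            "relation": "intersects"
          }
        }
      }
    }
  }
}
Documents whose mylocation intersects the polygon are excluded, leaving exactly the disjoint ones.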

Does Elasticsearch support geo queries like ST_DWithin in PostGIS

How can I write an Elasticsearch query that checks whether a geo_point is within a specified distance (or radius/buffer) of a line (defined by two lat/lon pairs), i.e. inside a buffer shape around the line?
This is not implemented in Elastic as far as I know, but you can still achieve what you want by calculating the polygon offline and then using it in a geo_polygon query. The geo_shape query could also be used, but you need a geo_shape field instead of a geo_point one.
So, for instance, using turf you can precompute the polygon around the line with its buffer feature. Below, I'm defining a line along some road somewhere in San Jose (CA) and a buffer of 50 meters around that line/road:
const turf = require('@turf/turf');
const line = turf.lineString([[-121.862282, 37.315430], [-121.851553, 37.305532]], {name: 'line 1'});
const bufferPoly = turf.buffer(line, 50, {units: 'meters'});
You'll get the following polygon (abbreviated):
{
  "type": "Feature",
  "properties": {
    "name": "line 1"
  },
  "geometry": {
    "type": "Polygon",
    "coordinates": [
      [
        [-121.85121372045873, 37.305765606399724],
        [-121.85116304254947, 37.30570833334188],
        [-121.85112738572346, 37.30564429665501],
        [-121.85110812025259, 37.30557595721911],
        ...
        [-121.85121372045873, 37.305765606399724]
      ]
    ]
  }
}
Which, plotted on a map, is a thin band around the road.
Then you can leverage the geo_polygon query like this:
GET /_search
{
  "query": {
    "bool": {
      "must": {
        "match_all": {}
      },
      "filter": {
        "geo_polygon": {
          "your_geo_point": {
            "points": [
              [-121.85121372045873, 37.305765606399724],
              [-121.85116304254947, 37.30570833334188],
              [-121.85112738572346, 37.30564429665501],
              [-121.85110812025259, 37.30557595721911],
              ...
              [-121.85121372045873, 37.305765606399724]
            ]
          }
        }
      }
    }
  }
}
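If your field were mapped as geo_shape instead of geo_point (as mentioned above), a geo_shape query with the same buffer polygon would also work. A sketch, where your_geo_shape_field is a placeholder name and the full buffer coordinates go in place of the ellipsis:
GET /_search
{
  "query": {
    "bool": {
      "filter": {
        "geo_shape": {
          "your_geo_shape_field": {
            "shape": {
              "type": "polygon",
              "coordinates": [
                [
                  [-121.85121372045873, 37.305765606399724],
                  [-121.85116304254947, 37.30570833334188],
                  ...
                  [-121.85121372045873, 37.305765606399724]
                ]
              ]
            },
            "relation": "intersects"
          }
        }
      }
    }
  }
}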

Elasticsearch returns a row with existing field for "must not exists" query

I have an index with an optional date/time field called lastCheckoutDate. Filtering rows with a range or term query returns 0 rows, but I know there are documents where a value for this field exists.
The mapping query returns the expected answer, with:
... ,
"lastCheckoutDate": {
"type": "date"
},
...
Trying to identify which query would return the result I expect eventually led me to this expression:
{
  "from": 0,
  "query": {
    "bool": {
      "filter": [
        {
          "bool": {
            "must_not": [
              {
                "exists": {
                  "field": "lastCheckoutDate"
                }
              }
            ],
            "must": [
              {
                "nested": {
                  "path": "nested_path",
                  "query": {
                    "term": {
                      "nested_path.id": {
                        "value": "some_unique_id"
                      }
                    }
                  }
                }
              }
            ]
          }
        }
      ]
    }
  },
  "size": 50,
  "sort": [
    {
      "displaySequence": {
        "order": "asc"
      }
    }
  ]
}
which returned a single row with the path/value present:
hits[0]._source.lastCheckoutDate: 2020-01-23T00:00:00
The explain output for this query didn't shed any light on the "exists" details: ConstantScore(+ToParentBlockJoinQuery (nested_path.id:some_unique_id) -ConstantScore(_field_names:lastCheckoutDate)), product of:
So, is there any way to determine why the field is invisible to the query?
This works fine for the test database, which is created and dropped each time, but the existing storage always gives me 0 hits for any (from my point of view) valid query. Of course, I ran a migration on the existing database (at least the mapping info for the new field appeared).
The Elastic documentation lists some reasons why an "exists" query may not match:
- The field in the source JSON is null or []
- The field has "index" : false set in the mapping
- The length of the field value exceeded an ignore_above setting in the mapping
- The field value was malformed and ignore_malformed was defined in the mapping
But I'm not sure any of these apply to my case.
The documents were added before the migration happened, and AFAIK Elastic won't re-index existing documents until they are updated in the index.
That's why I had no issues on the test database.
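If stale documents are indeed the cause, one way to force them to be re-indexed against the current mapping is an update-by-query over the whole index (a sketch; the index name is a placeholder):
POST /your_index/_update_by_query?conflicts=proceed
After that, exists (and must_not exists) queries should see lastCheckoutDate for those documents.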
Try this:
GET /index_name/_search
{
  "query": {
    "bool": {
      "must_not": [
        {
          "exists": {
            "field": "fieldname"
          }
        }
      ]
    }
  }
}

Elastic Search Query (a like x and y) or (b like x and y)

Some background info: in the example below, the user searched for "HTML CSS". I split the search string into individual words and created the SQL query shown below.
Now I am trying to write an Elasticsearch query with the same logic as the following SQL query:
SELECT title, description
FROM `classes`
WHERE (`title` LIKE '%html%' AND `title` LIKE '%css%')
   OR (`description` LIKE '%html%' AND `description` LIKE '%css%')
I'm currently halfway there, but can't seem to get it right yet.
{
  "query": {
    "bool": {
      "must": [
        {
          "term": {
            "title": "html"
          }
        },
        {
          "term": {
            "title": "css"
          }
        }
      ]
    }
  },
  "_source": [
    "title"
  ],
  "size": 30
}
Now I need to figure out how to add the following logic:
OR (description LIKE '%html%' AND description LIKE '%css%')
One important point is that I only want to fetch documents that have both words in either title or description. I don't want to fetch documents that contain only one of the words.
I will update the question as I find more info.
Update: The chosen answer also provides a way to boost scoring based on the field.
Can you try the following query? You can use should to express the OR operation.
{
  "query": {
    "bool": {
      "should": [
        {
          "bool": {
            "must": [
              {
                "match": {            // use a term query instead if your field is not analyzed
                  "title": {
                    "query": "html css",
                    "operator": "and",
                    "boost": 2
                  }
                }
              }
            ]
          }
        },
        {
          "bool": {
            "must": [
              {
                "match": {
                  "description": {
                    "query": "html css",
                    "operator": "and"
                  }
                }
              }
            ]
          }
        }
      ],
      "minimum_number_should_match": 1
    }
  },
  "_source": [
    "title",
    "description"
  ]
}
Hope this helps!!
I feel the most appropriate query to use in this case is multi_match:
The multi_match query is a convenient way of running the same query on multiple fields.
So your query can be written as:
GET /_search
{
  "_source": ["title", "description"],
  "query": {
    "multi_match": {
      "query": "html css",
      "fields": ["title^2", "description"],
      "operator": "and"
    }
  }
}
- _source filters the response so that only the fields listed in the array are returned.
- ^2 boosts the title field by a factor of 2.
- "operator": "and" makes sure that all query terms must match in one of the fields.
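For intuition, with the default best_fields type this multi_match behaves roughly like a dis_max over per-field match queries, so the request above is more or less equivalent to this sketch:
{
  "_source": ["title", "description"],
  "query": {
    "dis_max": {
      "queries": [
        {
          "match": {
            "title": {
              "query": "html css",
              "operator": "and",
              "boost": 2
            }
          }
        },
        {
          "match": {
            "description": {
              "query": "html css",
              "operator": "and"
            }
          }
        }
      ]
    }
  }
}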
From the Elasticsearch 5.2 docs:
One option is to use the nested datatype instead of the object datatype.
More details here: https://www.elastic.co/guide/en/elasticsearch/reference/5.2/nested.html
Hope this helps

Using geo_shape filter inside bool filter

I'm trying to combine a geo_shape Elasticsearch filter with a basic term filter within a bool filter, so I can attempt to improve performance of our elasticsearch query, with little success.
This query is used over a set of polygons in Elasticsearch, to determine which shapes the specified point is in.
It seems as though, unless I have the wrong end of the stick, geo_shape filters can't be included inside a bool filter collection like this:
{
  "size": 1000,
  "fields": [],
  "query": {
    "filtered": {
      "query": {
        "match_all": {}
      },
      "filter": {
        "bool": {
          "must": [
            {
              "geo_shape": {
                "deliveryAreas.area": {
                  "shape": {
                    "coordinates": [-0.126208, 51.430874],
                    "type": "point"
                  }
                }
              }
            },
            {
              "term": {
                "restaurantState": 3
              }
            }
          ]
        }
      }
    }
  }
}
The query above runs but returns 0 results. Using the geo_shape query outside the bool works fine, but the combination of the two seems to fail. I assume it must be a syntax error, as the Elasticsearch docs recommend this approach to make expensive geo calls cheaper, but no luck so far.
