Elasticsearch - Cannot sort by distance location - "status": 500

Goal: search people and sort the results by the distance from their location to my location (input lat/lon).
I indexed many rows of data like this:
{"index":{"_id":"1"}}
{"account_number":1,"location":[22.23, 23.12],"balance":39225,"firstname":"Amber","lastname":"Duke","age":32,"gender":"M","address":"880 Holmes Lane","employer":"Pyrami","email":"amberduke@pyrami.com","city":"Brogan","state":"IL"}
Notice that I indexed the location here as "location": [22.23, 23.12].
For simplicity, below is the mapping of firstname, lastname, and location:
{
  "lat_lon_test2": {
    "mappings": {
      "properties": {
        "firstname": {
          "type": "text",
          "fields": {
            "keyword": {
              "type": "keyword",
              "ignore_above": 256
            }
          }
        },
        "lastname": {
          "type": "text",
          "fields": {
            "keyword": {
              "type": "keyword",
              "ignore_above": 256
            }
          }
        },
        "location": {
          "type": "float"
        },
        ..........................................
This is my trial:
GET localhost:9200/lat_lon_test2/_search?pretty
{
  "query": {
    "multi_match": {
      "fields": ["firstname", "lastname"],
      "minimum_should_match": "80%",
      "query": "Hattie"
    }
  },
  "sort": [
    {
      "_geo_distance": {
        "location": [
          40.715,
          -73.998 // my input location
        ],
        "order": "asc",
        "unit": "km",
        "distance_type": "plane"
      }
    }
  ]
}
I got this result:
{
  "error": {
    "root_cause": [
      {
        "type": "class_cast_exception",
        "reason": "class org.elasticsearch.index.fielddata.plain.SortedNumericIndexFieldData cannot be cast to class org.elasticsearch.index.fielddata.IndexGeoPointFieldData (org.elasticsearch.index.fielddata.plain.SortedNumericIndexFieldData and org.elasticsearch.index.fielddata.IndexGeoPointFieldData are in unnamed module of loader 'app')"
      }
    ],
    "type": "search_phase_execution_exception",
    "reason": "all shards failed",
    "phase": "query",
    "grouped": true,
    "failed_shards": [
      {
        "shard": 0,
        "index": "lat_lon_test2",
        "node": "oyC0x3WNRC-3ok_TzAMT2w",
        "reason": {
          "type": "class_cast_exception",
          "reason": "class org.elasticsearch.index.fielddata.plain.SortedNumericIndexFieldData cannot be cast to class org.elasticsearch.index.fielddata.IndexGeoPointFieldData (org.elasticsearch.index.fielddata.plain.SortedNumericIndexFieldData and org.elasticsearch.index.fielddata.IndexGeoPointFieldData are in unnamed module of loader 'app')"
        }
      }
    ],
    "caused_by": {
      "type": "class_cast_exception",
      "reason": "class org.elasticsearch.index.fielddata.plain.SortedNumericIndexFieldData cannot be cast to class org.elasticsearch.index.fielddata.IndexGeoPointFieldData (org.elasticsearch.index.fielddata.plain.SortedNumericIndexFieldData and org.elasticsearch.index.fielddata.IndexGeoPointFieldData are in unnamed module of loader 'app')",
      "caused_by": {
        "type": "class_cast_exception",
        "reason": "class org.elasticsearch.index.fielddata.plain.SortedNumericIndexFieldData cannot be cast to class org.elasticsearch.index.fielddata.IndexGeoPointFieldData (org.elasticsearch.index.fielddata.plain.SortedNumericIndexFieldData and org.elasticsearch.index.fielddata.IndexGeoPointFieldData are in unnamed module of loader 'app')"
      }
    }
  },
  "status": 500
}
I realised that I need to map location as a geo_point, so I did this:
PUT localhost:9200/lat_lon_test2/_mapping
{
  "properties": {
    "location": {
      "type": "geo_point"
    }
  }
}
I got this result:
{
  "error": {
    "root_cause": [
      {
        "type": "illegal_argument_exception",
        "reason": "mapper [location] cannot be changed from type [float] to [geo_point]"
      }
    ],
    "type": "illegal_argument_exception",
    "reason": "mapper [location] cannot be changed from type [float] to [geo_point]"
  },
  "status": 400
}
My Elasticsearch version: 7.9.3
Sorry for the long post.

You are changing the data type of location from float to geo_point, which is not possible; hence the exception. To fix the issue, create a new index with the proper mapping and reindex all your data into it.
Please refer to the documentation on updating the mapping of a field, which explains what can and cannot be changed. If you need to change the mapping of a field in existing indices, create a new index with the correct mapping and reindex your data into that index.
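Under that approach, a minimal sketch of the fix could look like the following (the new index name lat_lon_test3 is a hypothetical choice):

```json
PUT localhost:9200/lat_lon_test3
{
  "mappings": {
    "properties": {
      "location": { "type": "geo_point" }
    }
  }
}

POST localhost:9200/_reindex
{
  "source": { "index": "lat_lon_test2" },
  "dest": { "index": "lat_lon_test3" }
}
```

One caveat: when a geo_point is supplied as an array, Elasticsearch expects [lon, lat] order, so data indexed as [lat, lon] should be swapped during or after reindexing.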

Related

Open Search, exclude field from indexing in mapping

I have the following mapping:
{
  "properties": {
    "type": {
      "type": "keyword"
    },
    "body": {
      "type": "text"
    },
    "id": {
      "type": "keyword"
    },
    "date": {
      "type": "date"
    }
  }
}
The body field is going to be an email message; it's very long and I don't want to index it.
What is the proper way to exclude this field from indexing?
What I tried:
enabled: false - as I understand from the documentation, it applies only to object-type fields, but in my case the field is not really an object, so I'm not sure whether I can use it.
index: false / 'no' - this breaks the code entirely and does not allow me to search at all. My query contains the query itself plus aggregations with a filter. The filter contains a range:
date: { gte: someDay.getTime(), lte: 'now' }
P.S. someDay is a certain day in my case.
The error I get after applying index: false to the body field in the mapping is the following:
{
  "error": {
    "root_cause": [
      {
        "type": "number_format_exception",
        "reason": "For input string: \"now\""
      }
    ],
    "type": "search_phase_execution_exception",
    "reason": "all shards failed",
    "phase": "query",
    "grouped": true,
    "failed_shards": [
      {
        "shard": 0,
        "index": "test",
        "node": "eehPq21jQsmkotVOqQEMeA",
        "reason": {
          "type": "number_format_exception",
          "reason": "For input string: \"now\""
        }
      }
    ],
    "caused_by": {
      "type": "number_format_exception",
      "reason": "For input string: \"now\"",
      "caused_by": {
        "type": "number_format_exception",
        "reason": "For input string: \"now\""
      }
    }
  },
  "status": 400
}
I'm not sure how these cases are associated, as the error is about the date field while I'm adding the index property to the body field.
I'm using: "@opensearch-project/opensearch": "^1.0.2"
Please help me understand:
how to exclude a field from indexing;
why applying index: false to the body field in the mapping breaks the code, so that I get an error associated with the date field.
You should just modify your mapping to this:
"body": {
  "type": "text",
  "index": false
}
And it should work.
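For reference, a full create-index request with that change applied to the mapping from the question might look like this (the index name test-emails is hypothetical):

```json
PUT /test-emails
{
  "mappings": {
    "properties": {
      "type": { "type": "keyword" },
      "body": { "type": "text", "index": false },
      "id": { "type": "keyword" },
      "date": { "type": "date" }
    }
  }
}
```

With index: false, body is still stored in _source and returned in search hits; it just cannot be queried.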

ElasticSearch - Optional arguments for FILTER?

I want to make the FILTER optional: if I input a specific term, the result will be filtered by it; otherwise it should match everything.
Here is my trial code:
"filter": {
  "bool": {
    "should": [
      { "terms": { "item.brand": [ "{{brand}}" ] } },
      { "match_all": {} }
    ]
  }
}
However, when I run:
{
  "id": "bipbip002",
  "params": {
    "query_all": "Table",
    "brand": ""
  }
}
I got this:
{
  "error": {
    "root_cause": [
      {
        "type": "parsing_exception",
        "reason": "Unknown key for a START_OBJECT in [filter].",
        "line": 1,
        "col": 302
      }
    ],
    "type": "parsing_exception",
    "reason": "Unknown key for a START_OBJECT in [filter].",
    "line": 1,
    "col": 302
  },
  "status": 400
}
Currently working on version 7.
MAPPING INFO
"item": {
  "properties": {
    "brand": {
      "type": "text",
      "fields": {
        "keyword": {
          "type": "keyword",
          "ignore_above": 256
        }
      }
    },
    "name": {
      "properties": {
        "en": {
          "type": "text",
          "fields": {
            "keyword": {
              "type": "keyword",
              "ignore_above": 256
            }
          }
        }
      }
    }
  }
}
Logic behind it:
Search by item name; if users click the filter button, send an API call to an existing search template in which the filter argument is optional.
Conditional queries (i.e. if/else) are not supported directly in the Elasticsearch query DSL; you need to apply this conditional logic in the app server that fires the ES queries.
Please refer to the conditional clause section of the Elasticsearch search template documentation for more info.
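Since the request already goes through a search template, the conditional clause documentation describes Mustache sections, which can emit the terms filter only when the parameter is present. A sketch of the template fragment (this assumes brand is omitted from params entirely when unused, rather than sent as an empty string):

```json
"filter": {
  "bool": {
    "must": [
      {{#brand}}
      { "terms": { "item.brand": [ "{{brand}}" ] } }
      {{/brand}}
    ]
  }
}
```

When brand is absent, the section renders nothing, leaving an empty must clause that matches everything.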

Set values in multi-fields in Elasticsearch

I have the following structure recorded in elastic:
PUT /movies
{
  "mappings": {
    "title": {
      "properties": {
        "title": {
          "type": "string",
          "fields": {
            "de": {
              "type": "string",
              "analyzer": "german"
            },
            "en": {
              "type": "string",
              "analyzer": "english"
            },
            "fr": {
              "type": "string",
              "analyzer": "french"
            },
            "es": {
              "type": "string",
              "analyzer": "spanish"
            }
          }
        }
      }
    }
  }
}
But when I try to record values like this:
PUT movies/_doc/2
{
  "title": "fox",
  "field": "en"
}
I receive the following error:
{
  "error": {
    "root_cause": [
      {
        "type": "illegal_argument_exception",
        "reason": "Rejecting mapping update to [movies] as the final mapping would have more than 1 type: [_doc, title]"
      }
    ],
    "type": "illegal_argument_exception",
    "reason": "Rejecting mapping update to [movies] as the final mapping would have more than 1 type: [_doc, title]"
  },
  "status": 400
}
Maybe I am doing something wrong, since I am fairly new to Elasticsearch. My idea is to create a one-to-one mapping, and when I search for "fox" in any of these languages, to return results only in English, since that is how they are recorded in the DB.
Your mapping defines a mapping type "title", but when you create the document you use PUT movies/_doc/2, which refers to the mapping type _doc. That type doesn't exist, so ES tries to create it automatically, and in newer versions of ES having multiple mapping types in one index is forbidden.
You should just change it to: PUT movies/title/2
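Keeping the custom mapping type from the question, the indexing request would then become (a sketch using the question's document):

```json
PUT movies/title/2
{
  "title": "fox",
  "field": "en"
}
```

On newer Elasticsearch versions the usual approach is the opposite: keep PUT movies/_doc/2 and define the mapping without a custom type name, since mapping types were removed.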

Elasticsearch percolate mapping error

I want to use a percolate query in Elasticsearch, but I couldn't set up the mapping; I received the following error. Where is my mistake?
PUT /my-index
{
  "mappings": {
    "doctype": {
      "properties": {
        "message": {
          "type": "string"
        }
      }
    },
    "queries": {
      "properties": {
        "query": {
          "type": "percolator"
        }
      }
    }
  }
}
Error Message:
{
  "error": {
    "root_cause": [
      {
        "type": "mapper_parsing_exception",
        "reason": "No handler for type [percolator] declared on field [query]"
      }
    ],
    "type": "mapper_parsing_exception",
    "reason": "Failed to parse mapping [queries]: No handler for type [percolator] declared on field [query]",
    "caused_by": {
      "type": "mapper_parsing_exception",
      "reason": "No handler for type [percolator] declared on field [query]"
    }
  },
  "status": 400
}
Thanks

Sort parent type based on one field within an array of nested Object in elasticsearch

I have the below mapping in my index:
{
  "testIndex": {
    "mappings": {
      "type1": {
        "properties": {
          "text": {
            "type": "string"
          },
          "time_views": {
            "properties": {
              "timestamp": {
                "type": "long"
              },
              "views": {
                "type": "integer"
              }
            }
          }
        }
      }
    }
  }
}
"time_views" is actually an array, but its inner attributes are not arrays.
I want to sort my type1 records based on the maximum value of the "views" attribute of each type1 record. I read the Elasticsearch sort documentation; it covers use cases where sorting is based on a field (single or array) of a single nested object, but what I want is different: I want to pick the maximum value of "views" for each document and sort the documents based on these values.
I made this JSON query:
{
  "size": 10,
  "query": {
    "range": {
      "timeStamp": {
        "gte": 1468852617347,
        "lte": 1468939017347
      }
    }
  },
  "from": 0,
  "sort": [
    {
      "time_views.views": {
        "mode": "max",
        "nested_path": "time_views",
        "order": "desc"
      }
    }
  ]
}
But I got this error:
{
  "error": {
    "phase": "query",
    "failed_shards": [
      {
        "node": "n4rxRCOuSBaGT5xZoa0bHQ",
        "reason": {
          "reason": "[nested] nested object under path [time_views] is not of nested type",
          "col": 136,
          "line": 1,
          "index": "data",
          "type": "query_parsing_exception"
        },
        "index": "data",
        "shard": 0
      }
    ],
    "reason": "all shards failed",
    "grouped": true,
    "type": "search_phase_execution_exception",
    "root_cause": [
      {
        "reason": "[nested] nested object under path [time_views] is not of nested type",
        "col": 136,
        "line": 1,
        "index": "data",
        "type": "query_parsing_exception"
      }
    ]
  },
  "status": 400
}
As I mentioned above, time_views is an array, and I guess this error is because of that.
I can't even use the sort-on-array-field feature, because "time_views" is not a primitive type.
I think my last chance is to write a custom sort by scripting, but I don't know how.
Please tell me my mistake if it's possible to achieve what I want; otherwise, give me a simple script sample.
Thanks :)
The error message does a lot to explain what is wrong with the query. Actually, the problem is with the mapping, and I think you intended to use nested fields, since you are using nested sort options.
You just need to make your time_views field nested:
"mappings": {
  "type1": {
    "properties": {
      "text": {
        "type": "string"
      },
      "time_views": {
        "type": "nested",
        "properties": {
          "timestamp": {
            "type": "long"
          },
          "views": {
            "type": "integer"
          }
        }
      }
    }
  }
}
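Note that an existing field cannot be changed from object to nested in place, so this mapping has to go into a new index, followed by a reindex. A sketch (the name testIndex_v2 is hypothetical):

```json
PUT /testIndex_v2
{
  "mappings": {
    "type1": {
      "properties": {
        "text": { "type": "string" },
        "time_views": {
          "type": "nested",
          "properties": {
            "timestamp": { "type": "long" },
            "views": { "type": "integer" }
          }
        }
      }
    }
  }
}

POST /_reindex
{
  "source": { "index": "testIndex" },
  "dest": { "index": "testIndex_v2" }
}
```

After reindexing, the sort from the question should resolve time_views as a nested path.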
