I want to make FILTER an optional choice: if I input a specific term, the results are filtered by it; otherwise everything matches.
Here is my trial code:
"filter" : {
"bool" : {
"should" : [
{ "terms" : { "item.brand" : [ "{{brand}}"] }} ,
{"match_all":{}}
]
}
However, when I run:
{
    "id": "bipbip002",
    "params": {
        "query_all": "Table",
        "brand": ""
    }
}
I got this:
{
"error": {
"root_cause": [
{
"type": "parsing_exception",
"reason": "Unknown key for a START_OBJECT in [filter].",
"line": 1,
"col": 302
}
],
"type": "parsing_exception",
"reason": "Unknown key for a START_OBJECT in [filter].",
"line": 1,
"col": 302
},
"status": 400
}
Currently working on version 7.
MAPPING INFO
"item": {
"properties": {
"brand": {
"type": "text",
"fields": {
"keyword": {
"type": "keyword",
"ignore_above": 256
}
}
"name": {
"properties": {
"en": {
"type": "text",
"fields": {
"keyword": {
"type": "keyword",
"ignore_above": 256
}
}
Logic behind:
Search the name of an item; if the user clicks the filter button, send an API call to an existing template in which the filter argument is optional.
Conditional queries (i.e. if...else) are not supported directly in the Elasticsearch DSL; you need to do this conditional logic in the application server that fires the ES queries.
Please refer to the section on conditionals in the Elasticsearch documentation for more info.
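As a rough sketch of doing that branching in the application layer, the query body can be assembled as a plain Python dict before it is sent to ES. The helper name is mine, and the use of the item.brand.keyword sub-field (rather than the analyzed item.brand text field) for the terms filter is my assumption, based on the mapping above:

```python
def build_query(query_all, brand=None):
    """Build a search body; add the brand filter only when one is given."""
    body = {
        "query": {
            "bool": {
                "must": [{"match": {"item.name.en": query_all}}]
            }
        }
    }
    if brand:  # empty string or None -> no filter key, so everything matches
        body["query"]["bool"]["filter"] = [
            # terms on the keyword sub-field for an exact, unanalyzed match
            {"terms": {"item.brand.keyword": [brand]}}
        ]
    return body

print(build_query("Table", ""))       # no "filter" key present
print(build_query("Table", "Ikea"))   # filters on the given brand
```

With this shape, an empty brand param simply omits the filter clause instead of sending a terms query for an empty string.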
I use NLog to write log messages to Elasticsearch; the index structure is here:
"mappings": {
"logevent": {
"properties": {
"#timestamp": {
"type": "date"
},
"MachineName": {
"type": "text",
"fields": {
"keyword": {
"ignore_above": 256,
"type": "keyword"
}
}
},
"level": {
"type": "text",
"fields": {
"keyword": {
"ignore_above": 256,
"type": "keyword"
}
}
},
"message": {
"type": "text",
"fields": {
"keyword": {
"ignore_above": 256,
"type": "keyword"
}
}
}
}
}
}
I was able to get results using a text search:
GET /webapi-2022.07.28/_search
{
"query": {
"match": {
"message": "ERROR"
}
}
}
Result:
"hits" : [
{
"_index" : "webapi-2022.07.28",
"_type" : "logevent",
"_id" : "IFhYQoIBRhF4cR9wr-ja",
"_score" : 4.931916,
"_source" : {
"#timestamp" : "2022-07-28T01:07:58.8822339Z",
"level" : "Error",
"message" : """2022-07-28 09:07:58.8822|ERROR|AppSrv.Filter.AccountAuthorizeAttribute|[KO17111808]-[172.10.2.200]-[ERROR]-"message"""",
"MachineName" : "WIN-EPISTFOBD41"
}
}
//.....
]
But when I use the keyword field, I get nothing:
GET /webapi-2022.07.28/_search
{
"query": {
"term": {
"message.keyword": "ERROR"
}
}
}
I tried both term and match; the result is the same.
This is happening because the message field does not contain just ERROR; the .keyword field holds the whole string, with the other text around it. You need to use a text search in your case; the .keyword field is only useful for exact searches.
If your message field contained only the string ERROR, then searching on .keyword would produce a result; you can test this yourself by indexing a sample document.
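The difference can be illustrated without a cluster. The sketch below approximates the two behaviours in plain Python: a term query on the keyword field compares against the entire untokenized string, while a match query on the text field hits individual analyzer-produced tokens (the tokenization here is a rough stand-in for the standard analyzer, not its exact output):

```python
import re

message = '2022-07-28 09:07:58.8822|ERROR|AppSrv...|[KO17111808]-[ERROR]-"message"'

def term_keyword(field_value, term):
    """keyword field: exact whole-string comparison, like a term query."""
    return field_value == term

def match_text(field_value, term):
    """text field: lowercased tokens, any one of which can match."""
    tokens = re.findall(r"\w+", field_value.lower())
    return term.lower() in tokens

print(term_keyword(message, "ERROR"))  # False: ERROR is not the whole string
print(match_text(message, "ERROR"))    # True: "error" is one of the tokens
```

This is why the term query on message.keyword returns nothing while the match query on message finds the document.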
I am using the latest Java API for communicating with the Elasticsearch server, and I need to retrieve data in a sorted order.
SortOptions sort = new SortOptions.Builder()
        .field(f -> f.field("customer.keyword").order(SortOrder.Asc))
        .build();
List<SortOptions> list = new ArrayList<SortOptions>();
list.add(sort);

SearchResponse<Order> response = elasticsearchClient.search(b -> b
        .index("order")
        .size(100)
        .sort(list)
        .query(q -> q.bool(bq -> bq
                .filter(fb -> fb.range(r -> r.field("orderTime")
                        .gte(JsonData.of(timeStamp("01-01-2022-01-01-01")))
                        .lte(JsonData.of(timeStamp("01-01-2022-01-01-10")))))
                // .must(query)
        )), Order.class);
I wrote the above code to get search results sorted by customer, but I get the following error when I run the program:
Exception in thread "main" co.elastic.clients.elasticsearch._types.ElasticsearchException: [es/search] failed: [search_phase_execution_exception] all shards failed
at co.elastic.clients.transport.rest_client.RestClientTransport.getHighLevelResponse(RestClientTransport.java:281)
at co.elastic.clients.transport.rest_client.RestClientTransport.performRequest(RestClientTransport.java:147)
at co.elastic.clients.elasticsearch.ElasticsearchClient.search(ElasticsearchClient.java:1487)
at co.elastic.clients.elasticsearch.ElasticsearchClient.search(ElasticsearchClient.java:1504)
at model.OrderDAO.fetchRecordsQuery(OrderDAO.java:128)
The code runs fine if I remove the .sort() call.
My index is configured in the following format:
{
"order": {
"aliases": {},
"mappings": {
"properties": {
"customer": {
"type": "text",
"fields": {
"keyword": {
"type": "keyword",
"ignore_above": 256
}
}
},
"orderId": {
"type": "text",
"fields": {
"keyword": {
"type": "keyword",
"ignore_above": 256
}
}
},
"orderTime": {
"type": "long"
},
"orderType": {
"type": "text",
"fields": {
"keyword": {
"type": "keyword",
"ignore_above": 256
}
}
}
}
},
"settings": {
"index": {
"routing": {
"allocation": {
"include": {
"_tier_preference": "data_content"
}
}
},
"number_of_shards": "1",
"provided_name": "order",
"creation_date": "1652783550822",
"number_of_replicas": "1",
"uuid": "mrAj8ZT-SKqC43-UZAB-Jw",
"version": {
"created": "8010299"
}
}
}
}
}
Please let me know what is wrong here; also, if possible, please send me the correct syntax for using sort() in the new Java API.
Thanks a lot.
As you have confirmed in a comment, customer is a text-type field, and this is the reason you are getting the above error: sorting cannot be applied to a text-type field.
Your index should be configured like below so that sorting can be applied on the customer field:
{
"mappings": {
"properties": {
"customer": {
"type": "text",
"fields": {
"keyword": {
"type": "keyword",
"ignore_above": 256
}
}
}
}
}
}
Once you have an index mapping like the above, you can use customer.keyword as the field name for sorting and customer as the field name for free-text search.
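Independent of the client language, the request the Java builder produces boils down to the same JSON body. Here it is sketched as a plain Python dict, with the sort pointed at the keyword sub-field; the epoch-millis range bounds are placeholder values of my own, not taken from the question:

```python
search_body = {
    "size": 100,
    "sort": [
        # sort on the keyword sub-field; the text field itself is not sortable
        {"customer.keyword": {"order": "asc"}}
    ],
    "query": {
        "bool": {
            "filter": [
                # orderTime is mapped as long, so the bounds are epoch millis
                {"range": {"orderTime": {"gte": 1640998861000,
                                         "lte": 1640998870000}}}
            ]
        }
    }
}

print(search_body["sort"])
```

Comparing this body against what the client actually sends (e.g. via the transport's request logging) is a quick way to confirm the sort field is what you intended.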
Target: search for people and sort them by the distance from their location to my location (input lat/lon).
I indexed many rows of data like this:
{"index":{"_id":"1"}}
{"account_number":1,"location":[22.23, 23.12],"balance":39225,"firstname":"Amber","lastname":"Duke","age":32,"gender":"M","address":"880 Holmes Lane","employer":"Pyrami","email":"amberduke#pyrami.com","city":"Brogan","state":"IL"}
Notice the location here: I input "location": [22.23, 23.12].
For simplicity, below is the mapping of firstname, lastname and location:
{
"lat_lon_test2": {
"mappings": {
"properties": {
"firstname": {
"type": "text",
"fields": {
"keyword": {
"type": "keyword",
"ignore_above": 256
}
}
},
"lastname": {
"type": "text",
"fields": {
"keyword": {
"type": "keyword",
"ignore_above": 256
}
}
},
"location": {
"type": "float"
},
..........................................
This is my attempt:
GET localhost:9200/lat_lon_test2/_search?pretty
{
"query":
{
"multi_match":
{
"fields":["firstname", "lastname"],
"minimum_should_match": "80%",
"query": "Hattie"
}
}
,
"sort": [
{
"_geo_distance": {
"location": [
40.715,
-73.998 //my input location
],
"order": "asc",
"unit": "km",
"distance_type": "plane"
}
}
]
}
I got this result :
{
"error": {
"root_cause": [
{
"type": "class_cast_exception",
"reason": "class org.elasticsearch.index.fielddata.plain.SortedNumericIndexFieldData cannot be cast to class org.elasticsearch.index.fielddata.IndexGeoPointFieldData (org.elasticsearch.index.fielddata.plain.SortedNumericIndexFieldData and org.elasticsearch.index.fielddata.IndexGeoPointFieldData are in unnamed module of loader 'app')"
}
],
"type": "search_phase_execution_exception",
"reason": "all shards failed",
"phase": "query",
"grouped": true,
"failed_shards": [
{
"shard": 0,
"index": "lat_lon_test2",
"node": "oyC0x3WNRC-3ok_TzAMT2w",
"reason": {
"type": "class_cast_exception",
"reason": "class org.elasticsearch.index.fielddata.plain.SortedNumericIndexFieldData cannot be cast to class org.elasticsearch.index.fielddata.IndexGeoPointFieldData (org.elasticsearch.index.fielddata.plain.SortedNumericIndexFieldData and org.elasticsearch.index.fielddata.IndexGeoPointFieldData are in unnamed module of loader 'app')"
}
}
],
"caused_by": {
"type": "class_cast_exception",
"reason": "class org.elasticsearch.index.fielddata.plain.SortedNumericIndexFieldData cannot be cast to class org.elasticsearch.index.fielddata.IndexGeoPointFieldData (org.elasticsearch.index.fielddata.plain.SortedNumericIndexFieldData and org.elasticsearch.index.fielddata.IndexGeoPointFieldData are in unnamed module of loader 'app')",
"caused_by": {
"type": "class_cast_exception",
"reason": "class org.elasticsearch.index.fielddata.plain.SortedNumericIndexFieldData cannot be cast to class org.elasticsearch.index.fielddata.IndexGeoPointFieldData (org.elasticsearch.index.fielddata.plain.SortedNumericIndexFieldData and org.elasticsearch.index.fielddata.IndexGeoPointFieldData are in unnamed module of loader 'app')"
}
}
},
"status": 500
}
I realised that I need to map location to geo_point, so I did this:
PUT localhost:9200/lat_lon_test2/_mapping
{
"properties": {
"location": {
"type": "geo_point"
}
}
}
I got this result:
{
"error": {
"root_cause": [
{
"type": "illegal_argument_exception",
"reason": "mapper [location] cannot be changed from type [float] to [geo_point]"
}
],
"type": "illegal_argument_exception",
"reason": "mapper [location] cannot be changed from type [float] to [geo_point]"
},
"status": 400
}
My version: 7.9.3
Sorry for the long post.
You are changing the data type of location from float to geo_point, which is not possible; hence the exception. Please create a new index with the proper mapping and reindex all the data according to the new data type to fix the issue.
Please refer to the documentation section on updating the mapping of a field, which explains what is and is not allowed. If you need to change the mapping of a field in other indices as well, create a new index with the correct mapping and reindex your data into that index.
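The create-then-reindex flow can be sketched as the two request bodies involved, shown here as plain Python dicts (the new index name lat_lon_test3 is my placeholder). One caveat worth noting: when a geo_point is given as an array, Elasticsearch interprets it as [lon, lat], so data indexed as [lat, lon] floats may need its coordinates swapped during reindexing:

```python
# Body for: PUT /lat_lon_test3  -- the new index, mapped correctly up front
new_index_body = {
    "mappings": {
        "properties": {
            "location": {"type": "geo_point"}  # geo_point from the start
        }
    }
}

# Body for: POST /_reindex  -- copy documents from the old index to the new
reindex_body = {
    "source": {"index": "lat_lon_test2"},  # old index (location as float)
    "dest": {"index": "lat_lon_test3"}     # new index (location as geo_point)
}

print(new_index_body)
print(reindex_body)
```

After the reindex completes, the _geo_distance sort from the question should work against the new index.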
I have two synonym files with a few thousand lines each; here is the sample causing the problem:
en_synonyms file:
cereal, semolina, wheat
fr_synonyms file:
ble, cereale, wheat
This is the error I got :
{
"error": {
"root_cause": [
{
"type": "illegal_argument_exception",
"reason": "failed to build synonyms"
}
],
"type": "illegal_argument_exception",
"reason": "failed to build synonyms",
"caused_by": {
"type": "parse_exception",
"reason": "Invalid synonym rule at line 1",
"caused_by": {
"type": "illegal_argument_exception",
"reason": "term: wheat analyzed to a token (cereal) with position increment != 1 (got: 0)"
}
}
},
"status": 400
}
The mapping I used:
PUT wheat_syn
{
"mappings": {
"wheat": {
"properties": {
"description": {
"type": "text",
"fields": {
"synonyms": {
"type": "text",
"analyzer": "syn_text"
},
"keyword": {
"type": "keyword",
"ignore_above": 256
}
}
}
}
}
},
"settings": {
"number_of_shards": 1,
"analysis": {
"filter": {
"en_synonyms": {
"type": "synonym",
"tokenizer": "keyword",
"synonyms_path" : "analysis/en_synonyms.txt"
},
"fr_synonyms": {
"type": "synonym",
"tokenizer": "keyword",
"synonyms_path" : "analysis/fr_synonyms.txt"
}
},
"analyzer": {
"syn_text": {
"tokenizer": "standard",
"filter": ["lowercase", "en_synonyms", "fr_synonyms" ]
}
}
}
}
}
Both files contain the term wheat; when I remove it from one of them, the index is created successfully.
I thought about combining the two files, so the result would be:
cereal, semolina, wheat, ble, cereale
But in my case I can't do that manually, since it would take a lot of time (I'll look for a way to do it programmatically, depending on the answer to this question).
I found a simple solution:
Instead of using two files, I just concatenated the contents of en_synonyms and fr_synonyms into one file, all_synonyms:
cereal, semolina, wheat
ble, cereale, wheat
Then used it for the mapping.
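The concatenation is easy to do programmatically for files with thousands of lines. A minimal Python sketch (file names follow the question; the sample contents are written out first so the script is self-contained):

```python
from pathlib import Path

# Recreate the two sample files from the question
Path("en_synonyms.txt").write_text("cereal, semolina, wheat\n")
Path("fr_synonyms.txt").write_text("ble, cereale, wheat\n")

def merge_synonym_files(paths, out_path):
    """Concatenate synonym files, skipping blank lines."""
    lines = []
    for p in paths:
        for line in Path(p).read_text().splitlines():
            line = line.strip()
            if line:
                lines.append(line)
    Path(out_path).write_text("\n".join(lines) + "\n")

merge_synonym_files(["en_synonyms.txt", "fr_synonyms.txt"], "all_synonyms.txt")
print(Path("all_synonyms.txt").read_text())
# cereal, semolina, wheat
# ble, cereale, wheat
```

The merged all_synonyms.txt can then be referenced from a single synonym filter in the index settings.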
I have the following structure recorded in elastic:
PUT /movies
{
"mappings": {
"title": {
"properties": {
"title": {
"type": "string",
"fields": {
"de": {
"type": "string",
"analyzer": "german"
},
"en": {
"type": "string",
"analyzer": "english"
},
"fr": {
"type": "string",
"analyzer": "french"
},
"es": {
"type": "string",
"analyzer": "spanish"
}
}
}
}
}
}
}
But when I try to record values like this:
PUT movies/_doc/2
{
"title": "fox",
"field": "en"
}
I receive the following error:
{
"error": {
"root_cause": [
{
"type": "illegal_argument_exception",
"reason": "Rejecting mapping update to [movies] as the final mapping would have more than 1 type: [_doc, title]"
}
],
"type": "illegal_argument_exception",
"reason": "Rejecting mapping update to [movies] as the final mapping would have more than 1 type: [_doc, title]"
},
"status": 400
}
Maybe I am doing something wrong, since I am fairly new to Elastic. My idea is to create a one-to-one mapping so that when I search for Fox in any of these languages, results are returned only in English, since that is how they are recorded in the DB.
Your mapping defines a mapping type "title", but when you create the document you use PUT movies/_doc/2, which refers to the mapping type _doc. That type doesn't exist, so ES tries to create it automatically, and in newer versions of ES having more than one mapping type per index is forbidden.
You should just change it to: PUT movies/title/2
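The consistency rule the answer describes can be sketched in a few lines of Python (the helper name is mine): the type segment of the document URL must match the single mapping type defined for the index.

```python
mapping_body = {
    "mappings": {
        "title": {                     # the one mapping type from the question
            "properties": {
                "title": {"type": "text"}
            }
        }
    }
}

def doc_url(index, mapping_body, doc_id):
    """Build the document PUT path using the index's single mapping type."""
    (mapping_type,) = mapping_body["mappings"].keys()  # exactly one type
    return f"/{index}/{mapping_type}/{doc_id}"

print(doc_url("movies", mapping_body, 2))  # /movies/title/2
```

Equivalently, you could rename the mapping type to _doc in the index definition and keep PUT movies/_doc/2; what matters is that the two agree.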