Map params are reordered in Elasticsearch

I have a query that uses a script to sort the results. For my business logic, I need a param named cities to calculate the sort value, but when I use Debug.explain to print params.cities, the order is not what I expected; it has been reordered.
Does anyone know how to get the correct order inside the script? Thanks in advance for your help.
Query:
GET tour/product/_search
{
  "sort": [
    {
      "_script": {
        "script": {
          "inline": "Debug.explain(params.cities);",
          "lang": "painless",
          "params": {
            "cities": {
              "2": 100.0,
              "17": 3.71,
              "12": 3.63,
              "13": 3.63,
              "14": 3.6
            }
          }
        },
        "type": "number",
        "order": "desc"
      }
    }
  ]
}
Print result:
{ "error": {
"root_cause": [
{
"type": "painless_explain_error",
"reason": "painless_explain_error: null"
}
],
"type": "search_phase_execution_exception",
"reason": "all shards failed",
"phase": "query",
"grouped": true,
"failed_shards": [
{
"shard": 0,
"index": "tour-1",
"node": "y5z7ah5cSoCk6n3Q95gbxw",
"reason": {
"type": "script_exception",
"reason": "runtime error",
"painless_class": "HashMap",
"to_string": "{12=3.63, 2=100.0, 13=3.63, 14=3.6, 17=3.71}",
"java_class": "java.util.HashMap",
"script_stack": [
"Debug.explain(params.cities);",
" ^---- HERE"
],
"script": "Debug.explain(params.cities);",
"lang": "painless",
"caused_by": {
"type": "painless_explain_error",
"reason": "painless_explain_error: null"
}
}
}
],
"caused_by": {
"type": "script_exception",
"reason": "runtime error",
"painless_class": "HashMap",
"to_string": "{12=3.63, 2=100.0, 13=3.63, 14=3.6, 17=3.71}",
"java_class": "java.util.HashMap",
"script_stack": [
"Debug.explain(params.cities);",
" ^---- HERE"
],
"script": "Debug.explain(params.cities);",
"lang": "painless",
"caused_by": {
"type": "painless_explain_error",
"reason": "painless_explain_error: null"
}
}
},
"status": 500
}
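For what it's worth, a JSON object carries no ordering guarantee, and Painless deserializes object-valued params into a HashMap, so the key order of cities cannot be relied on (which is exactly what the Debug.explain output shows). One workaround is to pass cities as an array, which Painless receives as a List and which does preserve order. The sketch below is illustrative only: the city_id field and the scoring loop are assumptions, not taken from the original mapping.
GET tour/product/_search
{
  "sort": [
    {
      "_script": {
        "script": {
          "inline": "for (def city : params.cities) { if (doc['city_id'].value == city.id) { return city.score; } } return 0;",
          "lang": "painless",
          "params": {
            "cities": [
              { "id": "2", "score": 100.0 },
              { "id": "17", "score": 3.71 },
              { "id": "12", "score": 3.63 },
              { "id": "13", "score": 3.63 },
              { "id": "14", "score": 3.6 }
            ]
          }
        },
        "type": "number",
        "order": "desc"
      }
    }
  ]
}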

Related

Is there any problem in my elasticsearch request?

I'm trying to update a doc in elasticsearch by using:
POST /rcqmkg_eco_ugc_rec/_doc/aaa/_update
{
  "script": {
    "source": "ctx._source.flower_cnt_0 += params.flower_cnt_0",
    "lang": "painless",
    "params": {
      "flower_cnt_0": 1
    }
  }
}
But I got an illegal_argument_exception error. The result from Elasticsearch is:
{
"error": {
"root_cause": [
{
"type": "remote_transport_exception",
"reason": "[data1_xxx][xxx][indices:data/write/update[s]]"
}
],
"type": "illegal_argument_exception",
"reason": "failed to execute script",
"caused_by": {
"type": "script_exception",
"reason": "runtime error",
"script_stack": [
"ctx._source.flower_cnt_0 += params.flower_cnt_0",
" ^---- HERE"
],
"script": "ctx._source.flower_cnt_0 += params.flower_cnt_0",
"lang": "painless",
"caused_by": {
"type": "null_pointer_exception",
"reason": null
}
}
},
"status": 400
}
Where is the problem in my ES update request?
If the document you're trying to update might not contain a field called flower_cnt_0, you need to account for this in your script:
Try this instead:
POST /rcqmkg_eco_ugc_rec/_doc/aaa/_update
{
  "script": {
    "source": "ctx._source.flower_cnt_0 = (ctx._source.flower_cnt_0 ?: 0) + params.flower_cnt_0",
    "lang": "painless",
    "params": {
      "flower_cnt_0": 1
    }
  }
}
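Note that the null-coalescing fix above handles a document that exists but lacks the flower_cnt_0 field. If the whole document might be missing as well, the update API's upsert element can seed it; a sketch (same request as in the question, with an assumed initial value):
POST /rcqmkg_eco_ugc_rec/_doc/aaa/_update
{
  "script": {
    "source": "ctx._source.flower_cnt_0 = (ctx._source.flower_cnt_0 ?: 0) + params.flower_cnt_0",
    "lang": "painless",
    "params": {
      "flower_cnt_0": 1
    }
  },
  "upsert": {
    "flower_cnt_0": 1
  }
}
When the document does not exist, the upsert body is indexed as-is; when it does exist, the script runs.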

Scripted Metric Aggregation fails on unexpected = character

I'm trying to build an object containing all time deltas between two logs with the same recordId.
To do so, I'm executing the following Scripted Metric Aggregation query (via Fiddler) against the Elasticsearch cluster where the logs are stored:
{
  "query": {
    "exists": {
      "field": "recordId"
    }
  },
  "aggs": {
    "deltas": {
      "scripted_metric": {
        "init_script": {
          "source": "state.deltas = {};",
          "lang": "expression"
        },
        "map_script": {
          "source": " if (!(doc['topic'].value in state.deltas)) { state.deltas[doc['topic'].value] = {} } state.deltas[doc['topic'].value][doc['recordId'].value] = !(doc['recordId'].value in state.deltas[doc['topic'].value]) ? doc['#timestamp'].date.millisOfDay : Math.abs(state.deltas[doc['topic'].value][doc['recordId'].value] - doc['#timestamp'].date.millisOfDay) ",
          "lang": "expression"
        },
        "reduce_script": {
          "source": " res = {}; for (s in states) { for(topic in Object.keys(s.deltas)) { if (!(topic in res)) { res[topic] = {} } for(recordId in Object.keys(s.deltas[topic])) { res[topic][recordId] = !(recordId in res[topic]) ? s.deltas[topic][recordId] : Math.abs(res[topic][recordId] - s.deltas[topic][recordId]) } } } return res;",
          "lang": "expression"
        }
      }
    }
  }
}
But it fails because of an unexpected '=' character...
I've tried other variations, but it always throws the same error.
What am I missing?
{
"error": {
"root_cause": [
{
"type": "lexer_no_viable_alt_exception",
"reason": "lexer_no_viable_alt_exception: null"
}
],
"type": "search_phase_execution_exception",
"reason": "all shards failed",
"phase": "query",
"grouped": true,
"failed_shards": [
{
"shard": 0,
"index": "il0:il1-sys-logs-2020.10.07",
"node": "5bmIYI_iSVCtx41qEEbASg",
"reason": {
"type": "script_exception",
"reason": "compile error",
"script_stack": [
"state.deltas = {};",
" ^---- HERE"
],
"script": "state.deltas = {};",
"lang": "expression",
"caused_by": {
"type": "parse_exception",
"reason": "parse_exception: unexpected character '= ' on line (1) position (13)",
"caused_by": {
"type": "lexer_no_viable_alt_exception",
"reason": "lexer_no_viable_alt_exception: null"
}
}
}
}
],
"caused_by": {
"type": "lexer_no_viable_alt_exception",
"reason": "lexer_no_viable_alt_exception: null"
}
},
"status": 500
}
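The compile error is about the script language rather than the logic: "lang": "expression" selects Lucene expressions, which only allow a single numeric expression and support no assignments, maps, or loops, hence the complaint at the first '='. A scripted_metric written like this needs Painless. Below is a rough, untested Painless sketch of the same idea; it assumes topic and recordId are keyword fields, uses epoch millis rather than millisOfDay (the date accessor differs across ES versions), and adds the combine_script that recent versions require:
{
  "query": {
    "exists": {
      "field": "recordId"
    }
  },
  "aggs": {
    "deltas": {
      "scripted_metric": {
        "init_script": "state.deltas = [:];",
        "map_script": "String topic = doc['topic'].value; String recordId = doc['recordId'].value; long millis = doc['#timestamp'].value.toInstant().toEpochMilli(); if (!state.deltas.containsKey(topic)) { state.deltas[topic] = [:]; } def byRecord = state.deltas[topic]; byRecord[recordId] = byRecord.containsKey(recordId) ? Math.abs(byRecord[recordId] - millis) : millis;",
        "combine_script": "return state.deltas;",
        "reduce_script": "Map res = [:]; for (def deltas : states) { if (deltas == null) { continue; } for (def topic : deltas.keySet()) { if (!res.containsKey(topic)) { res[topic] = [:]; } def merged = res[topic]; def partial = deltas[topic]; for (def recordId : partial.keySet()) { merged[recordId] = merged.containsKey(recordId) ? Math.abs(merged[recordId] - partial[recordId]) : partial[recordId]; } } } return res;"
      }
    }
  }
}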

Script update with array method unique() causing error

I need to update an array value with another array and then remove duplicates, so I found the method unique() for Elasticsearch scripts, but it's causing an error:
{
"error": {
"root_cause": [
{
"type": "remote_transport_exception",
"reason": "[probook][127.0.0.1:9300][indices:data/write/update[s]]"
}
],
"type": "illegal_argument_exception",
"reason": "failed to execute script",
"caused_by": {
"type": "script_exception",
"reason": "runtime error",
"script_stack": [
"ctx._source.arrFieldName = ctx._source.arrFieldName.unique();",
" ^---- HERE"
],
"script": "ctx._source.arrFieldName.addAll([111, 222, 333]);ctx._source.arrFieldName = ctx._source.arrFieldName.unique();ctx._source.arrFieldNameCount = ctx._source.arrFieldName.length",
"lang": "painless",
"caused_by": {
"type": "illegal_argument_exception",
"reason": "dynamic method [java.util.ArrayList, unique/0] not found"
}
}
},
"status": 400
}
You should do it this way, using distinct() instead of unique(), in your inline Painless script:
{
  "script": {
    "inline": "ctx._source.arrFieldName = ctx._source.arrFieldName.stream().distinct().sorted().collect(Collectors.toList())"
  }
}
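Putting it together with the original script, the whole update body might look like this (a sketch: the literal values are moved into params so the script compiles once, and .size() is used instead of .length because the collected result is a List):
{
  "script": {
    "inline": "ctx._source.arrFieldName.addAll(params.newValues); ctx._source.arrFieldName = ctx._source.arrFieldName.stream().distinct().sorted().collect(Collectors.toList()); ctx._source.arrFieldNameCount = ctx._source.arrFieldName.size()",
    "lang": "painless",
    "params": {
      "newValues": [111, 222, 333]
    }
  }
}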

Update nested string field

I am trying to update the field image.uri with _update_by_query:
POST user/_update_by_query
{
  "script": {
    "source": "ctx._source.image.uri = 'https://example.com/default/image/profile.jpg'",
    "lang": "painless"
  },
  "query": {
    "bool": {
      "must_not": [
        {
          "exists": {
            "field": "image.id"
          }
        }
      ]
    }
  }
}
But it throws an error:
{
"error": {
"root_cause": [
{
"type": "script_exception",
"reason": "runtime error",
"script_stack": [
"ctx._source.image.uri = 'https://example.com/default/image/profile.jpg'",
" ^---- HERE"
],
"script": "ctx._source.image.uri = 'https://example.com/default/image/profile.jpg'",
"lang": "painless"
}
],
"type": "script_exception",
"reason": "runtime error",
"script_stack": [
"ctx._source.image.uri = 'https://example.com/default/image/profile.jpg'",
" ^---- HERE"
],
"script": "ctx._source.image.uri = 'https://example.com/default/image/profile.jpg'",
"lang": "painless",
"caused_by": {
"type": "null_pointer_exception",
"reason": null
}
},
"status": 500
}
A sample document:
{
"image": {
"uri": "https://example.com/resources/uploads/default_files/profile/thumb/large/default_profile.jpg"
},
"created": "2018-06-06T21:49:26Z",
"uid": 1,
"name": "Jason Cameron",
"username": "jason"
}
UPDATED RESPONSE
The problem could be coming from a document that has no image object in it.
Try to add strict mapping if possible, to avoid indexing documents without an image object.
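If such documents are already in the index, a null-safe variant of the script (a sketch, not from the original answer) creates the image object on the fly instead of hitting the NullPointerException:
POST user/_update_by_query
{
  "script": {
    "source": "if (ctx._source.image == null) { ctx._source.image = [:]; } ctx._source.image.uri = 'https://example.com/default/image/profile.jpg'",
    "lang": "painless"
  },
  "query": {
    "bool": {
      "must_not": [
        {
          "exists": {
            "field": "image.id"
          }
        }
      ]
    }
  }
}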
OLD RESPONSE (note: both " and ' are valid string delimiters in Painless, so this turned out not to be the cause):
Your problem comes from the use of ' to enclose your URI; strings must be enclosed in ".
Try modifying your script as follows:
"script": {
"source": "ctx._source.image.uri = \"https://example.com/default/image/profile.jpg\"",
"lang": "painless"
}

elasticsearch query text field with length of value more than 20

I would like to query the name field where the length of the value (text) is more than 20, using the following, but it's not working:
GET /groups/_search
{
  "query": {
    "bool": {
      "must": {
        "script": {
          "script": "_source.name.values.length() > 20"
        }
      }
    }
  }
}
The error message is:
{
"error": {
"root_cause": [
{
"type": "script_exception",
"reason": "compile error",
"script_stack": [
"_source.name.values.lengt ...",
"^---- HERE"
],
"script": "_source.name.values.length() > 5",
"lang": "painless"
}
],
"type": "search_phase_execution_exception",
"reason": "all shards failed",
"phase": "query",
"grouped": true,
"failed_shards": [
{
"shard": 0,
"index": "groups",
"node": "exBbDVGeToSDRzLLmOh8-g",
"reason": {
"type": "query_shard_exception",
"reason": "failed to create query: {\n \"bool\" : {\n \"must\" : [\n {\n \"script\" : {\n \"script\" : {\n \"inline\" : \"_source.name.values.length() > 5\",\n \"lang\" : \"painless\"\n },\n \"boost\" : 1.0\n }\n }\n ],\n \"disable_coord\" : false,\n \"adjust_pure_negative\" : true,\n \"boost\" : 1.0\n }\n}",
"index_uuid": "_VH1OfpdRhmd_UPV7uTNMg",
"index": "groups",
"caused_by": {
"type": "script_exception",
"reason": "compile error",
"script_stack": [
"_source.name.values.lengt ...",
"^---- HERE"
],
"script": "_source.name.values.length() > ",
"lang": "painless",
"caused_by": {
"type": "illegal_argument_exception",
"reason": "Variable [_source] is not defined."
}
}
}
}
]
},
"status": 400
}
No idea how I should fix it...
FYI: the ES version is 5.4.0.
I don't know whether the following issue is related:
Painless script_fields don't have access to a _source variable #20068
https://github.com/elastic/elasticsearch/issues/20068
The best and most optimal way to handle this is to also index another field with the length of the name field, let's call it nameLength. That way you shift the burden of computing the length of the name field at indexing time instead of having to do it (repeatedly) at query time.
So at indexing time if you have a name field like {"name": "A big brown fox"}, then you create a new field with the length of the name field, such as {"name": "A big brown fox", "nameLength": 15}.
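One way to populate such a field without touching the indexing application is an ingest pipeline with a script processor; a sketch (the pipeline name is made up, and on 5.x the script key is "inline" rather than "source"):
PUT _ingest/pipeline/name-length
{
  "description": "Store the length of the name field in nameLength",
  "processors": [
    {
      "script": {
        "lang": "painless",
        "source": "ctx.nameLength = ctx.name != null ? ctx.name.length() : 0"
      }
    }
  ]
}
Documents then need to be indexed through this pipeline (e.g. ?pipeline=name-length) for nameLength to be filled in.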
At query time, you'll be able to use a simple and quick range query on the nameLength field:
GET /groups/_search
{
  "query": {
    "bool": {
      "must": {
        "range": {
          "nameLength": {
            "gt": 20
          }
        }
      }
    }
  }
}
You can use:
params._source.name.length() > 20
In case this is a rare query, that's probably ok to do. Otherwise you should add a field for the name length, and use the range query.
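If you do go the script route for such a rare query, it would be plugged in like this (a sketch based on the suggestion above; whether _source is reachable from a script query depends on the version and context, and a doc-values check such as doc['name.keyword'].value.length() > 20 may be the safer variant if a keyword sub-field exists):
GET /groups/_search
{
  "query": {
    "bool": {
      "must": {
        "script": {
          "script": {
            "inline": "params._source.name.length() > 20",
            "lang": "painless"
          }
        }
      }
    }
  }
}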
