My Elasticsearch script sometimes works fine, and sometimes throws an exception such as:
{
"error": {
"root_cause": [
{
"type": "remote_transport_exception",
"reason": "[es77][ip:9300] [indices:data/write/update[s]]"
}
],
"type": "illegal_argument_exception",
"reason": "failed to execute script",
"caused_by": {
"type": "script_exception",
"reason": "failed to run inline script [newArray = [];ctx._source.CILastCallResultRemark?.each{ obj->if(obj.id!=item.id){newArray=newArray+obj} }; (ctx._source.CILastCallResultRemark=newArray+item)] using lang [groovy]",
"caused_by": {
"type": "no_class_def_found_error",
"reason": "sun/reflect/MethodAccessorImpl",
"caused_by": {
"type": "class_not_found_exception",
"reason": "sun.reflect.MethodAccessorImpl"
}
}
}
},
"status": 400
}
Here is the script:
{
  "script": {
    "inline": "newArray = [];ctx._source.CILastCallResultRemark?.each{ obj->if(obj.id!=item.id){newArray=newArray+obj}};(ctx._source.CILastCallResultRemark=newArray+item)",
    "params": {
      "item": {
        "id": "2",
        "remart": "x1"
      }
    }
  }
}
And here is the ES log:
Caused by: ScriptException[failed to run inline script [newArray = [];ctx._source.CILastCallResultRemark?.each{ obj->if(obj.id!=item.id){newArray=newArray+obj}};(ctx._source.CILastCallResultRemark=newArray+item)] using lang [groovy]]; nested: NoClassDefFoundError[sun/reflect/MethodAccessorImpl]; nested: ClassNotFoundException[sun.reflect.MethodAccessorImpl];
at org.elasticsearch.script.groovy.GroovyScriptEngineService$GroovyScript.run(GroovyScriptEngineService.java:318)
at org.elasticsearch.action.update.UpdateHelper.executeScript(UpdateHelper.java:251)
... 12 more
Caused by: java.lang.NoClassDefFoundError: sun/reflect/MethodAccessorImpl
I see the bug. I will update the ES version and try.
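For reference, if the upgrade ends up on Elasticsearch 6.x or later, Groovy scripting is no longer available and the same dedup-and-append update has to be rewritten in Painless (the default since 5.x). A minimal sketch, assuming the 7.x-style _update endpoint; the index name and document id below are placeholders, not from the original setup:

POST my-index/_update/1
{
  "script": {
    "lang": "painless",
    "source": "if (ctx._source.CILastCallResultRemark == null) { ctx._source.CILastCallResultRemark = []; } ctx._source.CILastCallResultRemark.removeIf(obj -> obj.id == params.item.id); ctx._source.CILastCallResultRemark.add(params.item);",
    "params": {
      "item": { "id": "2", "remart": "x1" }
    }
  }
}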
Related
When I import a dashboard from Kibana 7.5.1 into Kibana 7.4.1, it fails with the error "the file could not be processed". I tried calling the Kibana saved objects import API from the Dev Tools console instead, and it returns the following response:
POST /api/saved_objects/_import
{
  "file" : "C:\Users\dashboards-kibana\EKC-Dashboard-Prod.ndjson"
}
{
"error": {
"root_cause": [
{
"type": "mapper_parsing_exception",
"reason": "failed to parse"
}
],
"type": "mapper_parsing_exception",
"reason": "failed to parse",
"caused_by": {
"type": "json_parse_exception",
"reason": "Unrecognized character escape 'U' (code 85)\n at [Source: org.elasticsearch.common.bytes.BytesReference$MarkSupportingStreamInputWrapper#33da7471; line: 2, column: 17]"
}
},
"status": 400
}
This is my first time working with Kibana. Is there any way to solve the import error?
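For what it's worth, two separate problems seem to be at play. The json_parse_exception comes from the unescaped backslashes in the Windows path ('\U' is not a valid JSON escape), but even with that fixed, the saved objects import API does not accept a file path in a JSON body: it expects the .ndjson file itself as a multipart/form-data upload, which the Dev Tools console cannot send. A sketch using curl from the machine that holds the file (host, port and any authentication are assumptions):

curl -X POST "http://localhost:5601/api/saved_objects/_import" \
  -H "kbn-xsrf: true" \
  --form file=@"C:\Users\dashboards-kibana\EKC-Dashboard-Prod.ndjson"

Even then, importing objects exported from a newer Kibana (7.5.1) into an older one (7.4.1) may still be rejected, since saved object migrations only run forward.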
Elasticsearch version 7.7.0
This is the relevant part of the mapping:
const PROFILE_MAPPING = {
  mappings: {
    properties: {
      _userLocation: {
        type: "geo_point"
      },
      _ignoredBy: {
        type: "nested"
      }
    }
  }
};
_ignoredBy data example:
[{
"until" : "2020-12-03T16:20:43.176Z",
"user" : <USER_ID>
}]
and this is the script I'm running to update it:
await client.update({
  index,
  id: target,
  refresh: "wait_for",
  body: {
    script: {
      source:
        "ctx._source._ignoredBy.removeIf(item -> item.user == params.by.user);" +
        "ctx._source._ignoredBy.add(params.by)",
      params: {
        by: {
          user: initiator,
          until: addSeconds(new Date(), ignoreInterval)
        }
      }
    }
  }
});
and this is the error I'm getting:
{
"error": {
"root_cause": [
{
"type": "illegal_argument_exception",
"reason": "failed to execute script"
}
],
"type": "illegal_argument_exception",
"reason": "failed to execute script",
"caused_by": {
"type": "script_exception",
"reason": "runtime error",
"script_stack": ["item -> item.user == params.by.user);", "^---- HERE"],
"script": "ctx._source._ignoredBy.removeIf(item -> item.user == params.by.user);ctx._source._ignoredBy.add(params.by)",
"lang": "painless",
"position": { "offset": 32, "start": 32, "end": 69 },
"caused_by": { "type": "null_pointer_exception", "reason": null }
}
},
"status": 400
}
The weird thing is that this works 99% of the time, but errors keep appearing in the logs and I can't figure out the reason. The params passed in are definitely there, as they show up in the logs.
Such null pointers are hard to wrap one's head around, but my hunch is that ctx._source._ignoredBy itself is null for some documents -- for example when the field is missing from _source -- so calling .removeIf on it blows up.
In that spirit, I'd suggest adding one more check before calling .removeIf on it -- initialize the field in case it's null:
{
  "script": {
    "source": "if (ctx._source._ignoredBy == null) {ctx._source._ignoredBy = []; } ctx._source._ignoredBy.removeIf(item -> item.user == params.by.user); ctx._source._ignoredBy.add(params.by)",
    "params": {
      ...
    }
  }
}
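For completeness, a sketch of the same guarded update as a raw request, with the index name, document id and params values standing in for target, initiator and the computed until date from the JavaScript call above:

POST my-index/_update/<target-id>?refresh=wait_for
{
  "script": {
    "source": "if (ctx._source._ignoredBy == null) { ctx._source._ignoredBy = []; } ctx._source._ignoredBy.removeIf(item -> item.user == params.by.user); ctx._source._ignoredBy.add(params.by);",
    "params": {
      "by": {
        "user": "<USER_ID>",
        "until": "2020-12-03T16:20:43.176Z"
      }
    }
  }
}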
I am working with ELK and I have created an index pattern with build_date. Now, to calculate the average build duration, I need the difference between the start time and end time in minutes using a Painless script.
My Logstash output data is given below:
"build_end_time" => "2021-01-13 01:29:49",
"build_duration" => "6409651",
"build_start_time" => "2021-01-12 23:43:00",
"build_date" => "2021-01-12",
"#timestamp" => 2021-02-02T11:40:50.747Z,
The scripted field settings are given below:
Name: Duration_time
Language: painless
Type: number
Format: Duration
Input format: minutes
Output format: Human readable
Popularity: 0
Script: def doc['build_end_time'].date.millisOfDay - doc['build_start_time'].date.millisOfDay
It throws "Script is invalid":
{
"root_cause": [
{
"type": "script_exception",
"reason": "compile error",
"script_stack": [
"def doc['build_end_time'].date.m ...",
" ^---- HERE"
],
"script": "def doc['build_end_time'].date.millisOfDay - doc['build_start_time'].date.millisOfDay",
"lang": "painless",
"position": {
"offset": 7,
"start": 0,
"end": 32
}
}
],
"type": "search_phase_execution_exception",
"reason": "all shards failed",
"phase": "query",
"grouped": true,
"failed_shards": [
{
"shard": 0,
"index": "build-logs",
"node": "JSvuaBbCQr6uI5qKvavj7Q",
"reason": {
"type": "script_exception",
"reason": "compile error",
"script_stack": [
"def doc['build_end_time'].date.m ...",
" ^---- HERE"
],
"script": "def doc['build_end_time'].date.millisOfDay - doc['build_start_time'].date.millisOfDay",
"lang": "painless",
"position": {
"offset": 7,
"start": 0,
"end": 32
},
"caused_by": {
"type": "illegal_argument_exception",
"reason": "invalid sequence of tokens near ['['].",
"caused_by": {
"type": "no_viable_alt_exception",
"reason": null
}
}
}
}
]
}
Can someone help me get this working?
This looks like a syntax error.
Instead of
def doc['build_end_time'].date.millisOfDay - doc['build_start_time'].date.millisOfDay
use
return doc['build_end_time'].date.millisOfDay - doc['build_start_time'].date.millisOfDay
The return is actually not required -- you can leave it out entirely.
The def keyword declares a variable, so you could in theory write:
def result = doc['build_end_time'].date.millisOfDay - doc['build_start_time'].date.millisOfDay
but you'd need to return something -- so:
def result = '...'; return result
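One more thing worth flagging: in the sample data the build crosses midnight (start 23:43, end 01:29 the next day), so subtracting millisOfDay would come out negative. Assuming both fields are mapped as date, subtracting epoch milliseconds and converting to minutes is probably closer to what the Duration_time field wants, for example:

(doc['build_end_time'].date.millis - doc['build_start_time'].date.millis) / 60000.0

The exact date accessor depends on the stack version; on 7.x, doc['build_end_time'].value.toInstant().toEpochMilli() is the equivalent.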
Elasticsearch Azure plugin troubleshooting issue:
I am trying to snapshot index data from on-prem to Azure and am getting the following exception:
{
"error": {
"root_cause": [
{
"type": "repository_verification_exception",
"reason": "[qarepository] path is not accessible on master node"
}
],
"type": "repository_verification_exception",
"reason": "[qarepository] path is not accessible on master node",
"caused_by": {
"type": "i_o_exception",
"reason": "Can not write blob master.dat",
"caused_by": {
"type": "storage_exception",
"reason": "storage_exception: ",
"caused_by": {
"type": "i_o_exception",
"reason": "qaonpremesindex.blob.core.windows.net"
}
}
}
},
"status": 500
}
Steps followed:
Created a storage account in Azure
Created a blob container
Added the keystore values (account name & key)
PUT _snapshot/qarepository
{
"type": "azure"
}
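For reference, a repository registration usually also names the client and container explicitly; a minimal sketch, assuming the keystore holds azure.client.default.account and azure.client.default.key and that a container named qa-snapshots exists (the container name here is an assumption):

PUT _snapshot/qarepository
{
  "type": "azure",
  "settings": {
    "client": "default",
    "container": "qa-snapshots"
  }
}

That said, the innermost i_o_exception naming qaonpremesindex.blob.core.windows.net suggests the master node cannot resolve or reach the storage account endpoint at all, so it is worth verifying DNS resolution and outbound HTTPS access from the on-prem nodes before changing the repository settings.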
I'd like to log users' input to my RESTful API for debugging purposes, but whenever there's an empty field name in the JSON payload, an error is generated and the log entry is discarded.
For instance,
{
  "extra": {
    "request": {
      "body": {
        "": ""
      }
    }
  }
}
...will result in
{
"error": {
"root_cause": [
{
"type": "mapper_parsing_exception",
"reason": "failed to parse"
}
],
"type": "mapper_parsing_exception",
"reason": "failed to parse",
"caused_by": {
"type": "illegal_argument_exception",
"reason": "field name cannot be an empty string"
}
},
"status": 400
}
It seems to be caused by https://github.com/elastic/elasticsearch/blob/45e7e24736eeb4a157ac89bd16a374dbf917ae26/server/src/main/java/org/elasticsearch/index/mapper/DocumentParser.java#L191.
It's a bit tricky since it happens in the parsing phase... Is there any workaround to remove or rename such fields so that ES can ingest these logs?
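One workaround that may fit here, assuming the empty key always appears under extra.request.body, is an ingest pipeline with a script processor that strips it before the document ever reaches the mapper (the pipeline name and field path below are assumptions based on the example payload):

PUT _ingest/pipeline/strip-empty-keys
{
  "processors": [
    {
      "script": {
        "source": "def body = ctx.extra?.request?.body; if (body != null) { body.remove(''); }"
      }
    }
  ]
}

The pipeline can then be applied at index time with ?pipeline=strip-empty-keys or via the index.default_pipeline setting; if empty keys can show up at arbitrary depths, the script would need to walk the source recursively instead.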