Elasticsearch 5.0.0 weighted average - elasticsearch

I wanted to try a weighted average on ES 5.0.0.
I tried the following query:
GET ABC/xyz/_search
{
  "aggs": {
    "myAggr": {
      "terms": {
        "field": "UrunNo",
        "order": { "weightedAvg": "desc" }
      },
      "aggs": {
        "weightedAvg": { "avg": { "script": "[values: doc['BirimFiyat'].value, weights: doc['Adet'].value]" } }
      }
    }
  }
}
I get this error:
{"error": {
"root_cause": [
{ "type": "parsing_exception",
"reason": "Unexpected token VALUE_STRING [script] in [weightedAvg].",
"line": 9,
"col": 49
} ],
"type": "parsing_exception",
"reason": "Unexpected token VALUE_STRING [script] in [weightedAvg].",
"line": 9,
"col": 49
},"status": 400 }
What is the problem? Or is a weighted average even possible on ES 5.0.0?
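The dedicated weighted_avg metric aggregation only arrived in Elasticsearch 6.4, so there is no built-in weighted average on 5.0.0. On 5.x the script in an aggregation also seems to need to be an object (with inline/lang) rather than a bare string, which is what the VALUE_STRING parsing error is about, and in any case the avg aggregation has no concept of weights, so that Groovy-style map would not help. A common workaround is to compute sum(value * weight) and sum(weight) per bucket and divide them with a bucket_script pipeline. A minimal, untested sketch against the same fields (note that, as far as I know, terms buckets cannot be ordered by a pipeline aggregation, so the order clause has to go):
GET ABC/xyz/_search
{
  "size": 0,
  "aggs": {
    "myAggr": {
      "terms": { "field": "UrunNo" },
      "aggs": {
        "weightedSum": {
          "sum": {
            "script": {
              "lang": "painless",
              "inline": "doc['BirimFiyat'].value * doc['Adet'].value"
            }
          }
        },
        "totalWeight": {
          "sum": { "field": "Adet" }
        },
        "weightedAvg": {
          "bucket_script": {
            "buckets_path": {
              "ws": "weightedSum",
              "tw": "totalWeight"
            },
            "script": {
              "lang": "painless",
              "inline": "params.ws / params.tw"
            }
          }
        }
      }
    }
  }
}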

Related

Multi_terms aggregation gives me an error

I'm trying to use Elasticsearch 7.11.1 on Windows 10 and can't get the multi_terms aggregation to work. This query:
{
  "aggs": {
    "test_agg": {
      "multi_terms": {
        "terms": [
          { "field": "JobTitle.keyword" },
          { "field": "AboutMe.keyword" }
        ]
      }
    }
  }
}
gives me this:
{
"error": {
"root_cause": [
{
"type": "parsing_exception",
"reason": "Unknown aggregation type [multi_terms] did you mean [rare_terms]?",
"line": 4,
"col": 22
}
],
"type": "parsing_exception",
"reason": "Unknown aggregation type [multi_terms] did you mean [rare_terms]?",
"line": 4,
"col": 22,
"caused_by": {
"type": "named_object_not_found_exception",
"reason": "[4:22] unknown field [multi_terms]"
}
},
"status": 400
}
but this query:
{
"aggs": {
"test_agg": {
"terms":
{
"field": "JobTitle.keyword",
"size": "10"
}
}
}
}
works.
What am I doing wrong?
The problem is that you're using Elasticsearch 7.11.
As you can see in the release notes, the multi_terms aggregation was only added in 7.12.0.
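If upgrading is not an option, a similar result can often be obtained on 7.11 with a composite aggregation, which accepts several terms sources. It is not a drop-in replacement: composite pages through all key combinations in key order instead of returning the top buckets by document count. A rough sketch along the lines of the query above:
{
  "size": 0,
  "aggs": {
    "test_agg": {
      "composite": {
        "size": 100,
        "sources": [
          { "job": { "terms": { "field": "JobTitle.keyword" } } },
          { "about": { "terms": { "field": "AboutMe.keyword" } } }
        ]
      }
    }
  }
}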

Elasticsearch query for GCP alpha and beta APIs

I'm trying to get the query below to work against GCP logs. I need it to find beta APIs being used over the last 24 hours, but I keep getting an error. It's probably a simple syntax mistake, but I'm not seeing it.
GET /gcp-%2A/_search
{
  "query": {
    "range": {
      "timestamp": {
        "gte": "now-1d/d",
        "lt": "now/d"
      }
    },
    "wildcard": {
      "protoPayload.methodName": {
        "value": "*beta*",
        "boost": 1.0,
        "rewrite": "constant_score"
      }
    }
  }
}
{
"error": {
"root_cause": [
{
"type": "parsing_exception",
"reason": "[range] malformed query, expected [END_OBJECT] but found [FIELD_NAME]",
"line": 9,
"col": 13
}
],
"type": "parsing_exception",
"reason": "[range] malformed query, expected [END_OBJECT] but found [FIELD_NAME]",
"line": 9,
"col": 13
},
"status": 400
}
You were almost there: a query object can only contain a single top-level query clause, so the range and wildcard need to be combined inside a bool query:
GET /gcp-%2A/_search
{
  "query": {
    "bool": {
      "must": [
        {
          "range": {
            "timestamp": {
              "gte": "now-1d/d",
              "lt": "now/d"
            }
          }
        },
        {
          "wildcard": {
            "protoPayload.methodName": {
              "value": "*beta*",
              "boost": 1,
              "rewrite": "constant_score"
            }
          }
        }
      ]
    }
  }
}

Problems with Map parameters in Elasticsearch scripts

I cannot find a way to pass a Map as a named script parameter. Both the Groovy-style [1:0.2,3:0.4] and the JSON-style {1:0.2, 3:0.4} result in a syntax error. Examples:
GET tt/clip/_search
{
  "query": {
    "function_score": {
      "script_score": {
        "script": {
          "lang": "painless",
          "source": "return 0",
          "params": {
            "full_text_tfidf": [1:0.2,3:0.4]
          }
        }
      }
    }
  }
}
{
"error": {
"root_cause": [
{
"type": "parsing_exception",
"reason": "[script] failed to parse field [params]",
"line": 9,
"col": 35
}
],
"type": "parsing_exception",
"reason": "[script] failed to parse field [params]",
"line": 9,
"col": 35,
"caused_by": {
"type": "json_parse_exception",
"reason": "Unexpected character (':' (code 58)): was expecting comma to separate Array entries\n at [Source: org.elasticsearch.transport.netty4.ByteBufStreamInput#75a6a7c1; line: 9, column: 37]"
}
},
"status": 400
}
GET tt/clip/_search
{
  "query": {
    "function_score": {
      "script_score": {
        "script": {
          "lang": "painless",
          "source": "return 0",
          "params": {
            "full_text_tfidf": {1:0.2,3:0.4}
          }
        }
      }
    }
  }
}
{
"error": {
"root_cause": [
{
"type": "parsing_exception",
"reason": "[script] failed to parse field [params]",
"line": 9,
"col": 34
}
],
"type": "parsing_exception",
"reason": "[script] failed to parse field [params]",
"line": 9,
"col": 34,
"caused_by": {
"type": "json_parse_exception",
"reason": "Unexpected character ('1' (code 49)): was expecting double-quote to start field name\n at [Source: org.elasticsearch.transport.netty4.ByteBufStreamInput#6ed9faa9; line: 9, column: 36]"
}
},
"status": 400
}
On the other hand, I cannot say that params only work with primitive types: nested arrays are accepted successfully. Is it possible to pass maps as parameters?
The correct way to specify a map in the parameters is simply to use a JSON object (you're missing the double quotes around the keys):
GET tt/clip/_search
{
  "query": {
    "function_score": {
      "script_score": {
        "script": {
          "lang": "painless",
          "source": "return 0",
          "params": {
            "full_text_tfidf": {
              "1": 0.2,
              "3": 0.4
            }
          }
        }
      }
    }
  }
}
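Once the keys are quoted, the map arrives in Painless as an ordinary Map whose keys are the JSON field names (as strings). A small illustrative sketch of reading it back inside the script (the containsKey lookup and the 0.0 fallback are just examples, not part of the original question):
GET tt/clip/_search
{
  "query": {
    "function_score": {
      "script_score": {
        "script": {
          "lang": "painless",
          "source": "def w = params.full_text_tfidf; return w.containsKey('1') ? w['1'] : 0.0",
          "params": {
            "full_text_tfidf": {
              "1": 0.2,
              "3": 0.4
            }
          }
        }
      }
    }
  }
}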

Elasticsearch query + aggregation search

The data in my Elasticsearch index contains a field named facilityName. I need to check whether there are any duplicate records with facilityNameTypeCode "UWI" that have the same facilityName value. Here is a structure example:
"facilityName": [
{
"facilityNameTypeId": {
"facilityNameTypeCode": "Name"
},
"facilityName": "Rishav jayswal"
},
{
"facilityNameTypeId": {
"facilityNameTypeCode": "Name"
},
"facilityName": "R.M"
}
]
This is the query I created:
GET _search
{
  "query": {
    "term": { "facilityName.facilityNameTypeId.facilityNameTypeCode": "UWI" }
  },
  "aggs": {
    "duplicateNames": {
      "terms": {
        "field": "facilityName.facilityName",
        "size": 0,
        "min_doc_count": 2
      }
    }
  }
}
But I am getting this error:
{
"error": {
"root_cause": [
{
"type": "parsing_exception",
"reason": "[terms] failed to parse field [size]",
"line": 10,
"col": 27
}
],
"type": "parsing_exception",
"reason": "[terms] failed to parse field [size]",
"line": 10,
"col": 27,
"caused_by": {
"type": "illegal_argument_exception",
"reason": "[size] must be greater than 0. Found [0] in [duplicateNames]"
}
},
"status": 400
}
Can anyone suggest how to do this?
The error is pretty clear:
[size] must be greater than 0. Found [0] in [duplicateNames]
So simply set size to something bigger than 0; it doesn't make much sense to set it to 0 anyway:
"terms": {
"field": "facilityName.facilityName",
"size": 10,
"min_doc_count": 2
}
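If the intention behind size: 0 was the old 1.x behaviour of "return every bucket", that shorthand was removed; the usual substitute is an explicit upper bound comfortably above the expected number of distinct names, for example:
"terms": {
  "field": "facilityName.facilityName",
  "size": 10000,
  "min_doc_count": 2
}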

How can I apply match and range in the query DSL in Elasticsearch

I want to use match and range together; my query body is:
{
  "query": {
    "match": {
      "netscaler.ipadd": "192.68.2.39"
    },
    "range": {
      "#timestamp": {
        "gte": "2015-08-04T11:00:00",
        "lt": "2015-08-04T12:00:00"
      }
    }
  },
  "aggs": {
    "avg_grade": {
      "avg": { "field": "netscaler.stat.system.memusagepcnt" }
    }
  }
}
and Elasticsearch responds with:
{
"error": {
"root_cause": [{
"type": "parsing_exception",
"reason": "[match] malformed query, expected [END_OBJECT] but found [FIELD_NAME]",
"line": 6,
"col": 7
}],
"type": "parsing_exception",
"reason": "[match] malformed query, expected [END_OBJECT] but found [FIELD_NAME]",
"line": 6,
"col": 7
},
"status": 400
}
I need to know the best or correct way to do that.
If you have multiple queries, you should probably wrap them inside a bool query:
{
  "query": {
    "bool": {
      "must": [
        {
          "match": {
            "netscaler.ipadd": "192.68.2.39"
          }
        },
        {
          "range": {
            "#timestamp": {
              "gte": "2015-08-04T11:00:00",
              "lt": "2015-08-04T12:00:00"
            }
          }
        }
      ]
    }
  },
  "aggs": {
    "avg_grade": {
      "avg": {
        "field": "netscaler.stat.system.memusagepcnt"
      }
    }
  }
}
More info in the docs
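Since the range clause is a pure yes/no condition, it can also be moved from must into the bool query's filter clause (assuming a 2.x or later cluster, which the structured error suggests); it then no longer contributes to scoring and can be cached. A variant of the query part, with the aggs block unchanged:
{
  "query": {
    "bool": {
      "must": [
        { "match": { "netscaler.ipadd": "192.68.2.39" } }
      ],
      "filter": [
        {
          "range": {
            "#timestamp": {
              "gte": "2015-08-04T11:00:00",
              "lt": "2015-08-04T12:00:00"
            }
          }
        }
      ]
    }
  }
}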
