How to find out what my index is sorted by in Elasticsearch?

I created a new index in Elasticsearch (v6) using this command:
curl -XPUT -H 'Content-Type: application/json' http://localhost:9200/sorttest -d '
{
"settings" : {
"index" : {
"sort.field" : ["username", "date"],
"sort.order" : ["asc", "desc"]
}
},
"mappings": {
"_doc": {
"properties": {
"username": {
"type": "keyword",
"doc_values": true
},
"date": {
"type": "date"
}
}
}
}
}
'
The response was
{"acknowledged":true,"shards_acknowledged":true,"index":"sorttest"}
Next I checked the generated mapping:
curl -XGET localhost:9200/sorttest/_mapping?pretty
And the result was
{
"sorttest" : {
"mappings" : {
"_doc" : {
"properties" : {
"date" : {
"type" : "date"
},
"username" : {
"type" : "keyword"
}
}
}
}
}
}
The question is: how can I find out what kind of sorting is set for my index?

Just run
curl -XGET localhost:9200/sorttest?pretty
and you will see:
"settings" : {
"index" : {
...
"sort" : {
"field" : [
"username",
"date"
],
"order" : [
"asc",
"desc"
]
},
...
}
}
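If you want just the sort settings rather than the whole index definition, you can combine the `_settings` endpoint with the standard `filter_path` response filter; a minimal sketch against the `sorttest` index from the question:

```shell
# Return only the index.sort block from the settings response;
# filter_path drops every key that does not match the pattern.
curl -XGET "localhost:9200/sorttest/_settings?filter_path=*.settings.index.sort&pretty"
```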

Related

Update elastic search nested field based on query

I have an Elasticsearch index named pollstat with mapping as follows:
{
"pollstat" : {
"mappings" : {
"dynamic" : "false",
"properties" : {
"dt" : {
"properties" : {
"dte" : {
"type" : "date"
},
"is_polled" : {
"type" : "boolean"
}
}
},
"is_profiled" : {
"type" : "boolean"
},
"maid" : {
"type" : "keyword"
}
}
}
}
}
The above index is created using:
curl -XPUT "http://localhost:9200/pollstat" -H 'Content-Type: application/json' -d'
{
"settings": {
"number_of_shards": 5,
"number_of_replicas": 1
},
"mappings": {
"properties": {
"maid" : {
"type" : "keyword"
},
"dt" : {
"type" : "object",
"properties": {
"dte" : {"type":"date"},
"is_polled" : {"type":"boolean"}
}
},
"is_profiled" : {
"type" : "boolean"
}
},
"dynamic":false
}
}'
To add data into this index, I am using the following code:
curl -X POST "localhost:9200/pollstat/_doc/?pretty" -H 'Content-Type: application/json' -d'{"maid" : "fans", "dt" : [{"dte": "2022-03-19", "is_polled":true } ], "is_profiled":true } '
This is working.
The requirement is to append the dt field when a particular maid polls data on a specific date. In this case, if the maid fans polls data for another day, I want to append the same to the dt field.
I used the following code, which takes the document id to update the document.
curl -X POST "localhost:9200/pollstat/_doc/hQh4oH8BPfXX63hBUbPN/_update?pretty" -H 'Content-Type: application/json' -d'{"script": {"source": "ctx._source.dt.addAll(params.dt)", "params": {"dt": [{ "dte": "2019-07-16", "is_polled": true }, { "dte": "2019-07-17", "is_polled": false } ] } } } '
This is also working.
However, my application does not have visibility of the document id; it only has the maid, which is just as unique as the document id. Hence, to update a specific maid, I tried to do the same with a query on maid.
I used the following code:
curl -X POST "localhost:9200/pollstat/_update_by_query?pretty" -H 'Content-Type: application/json' -d'"query": {"match": { "maid": "fans" }, "script": {"source": "ctx._source.dt.addAll(params.dt)", "params": {"dt": [{ "dte": "2019-07-18", "is_polled": true }, { "dte": "2019-07-19", "is_polled": false } ] } } }'
This code executed without an error, and I got the following update status:
{
"took" : 8,
"timed_out" : false,
"total" : 1,
"updated" : 1,
"deleted" : 0,
"batches" : 1,
"version_conflicts" : 0,
"noops" : 0,
"retries" : {
"bulk" : 0,
"search" : 0
},
"throttled_millis" : 0,
"requests_per_second" : -1.0,
"throttled_until_millis" : 0,
"failures" : [ ]
}
However, my index was not getting updated.
It turned out I had to wrap the request body in proper JSON and, since the maid field has type keyword, use a term query instead of a match query. The final query is as follows:
curl -X POST "localhost:9200/pollstat/_update_by_query?pretty" -H 'Content-Type: application/json' -d'
{
"query": {
"term": { "maid": "fans" }},
"script": {
"source": "ctx._source.dt.addAll(params.dt)",
"params": {
"dt": [
{ "dte": "2019-07-18", "is_polled": true },
{ "dte": "2019-07-19", "is_polled": false }
]
}
}
}
'
Posting this answer for others' reference.
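To double-check that the `_update_by_query` call really appended the new entries, you can fetch the document back by its maid (a sketch, assuming the `pollstat` index and sample data from the question):

```shell
# Look the document up by its unique maid and inspect the dt array.
# Appended entries become visible once the index refreshes.
curl -XGET "localhost:9200/pollstat/_search?q=maid:fans&pretty"
```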

Elastic Search error - variable [relevancy] is not defined

I am trying to query my products ElasticSearch index and create a script_score but I keep receiving the error Variable [relevancy] is not defined.
I tried replacing the script with just a number, then with Math.log(_score) to make sure the script_score was working properly and the math function is ok, and both queries executed as expected. I also tried doc['relevancy'].value and received the same error.
My query is:
curl -X GET "localhost:9200/products/_search?pretty" -H 'Content-Type: application/json' -d'
{
"query": {
"function_score": {
"query": {
"multi_match" : {
"query": "KQ",
"fields": [ "item_id", "extended_desc", "mfg_part_no" ]
}
},
"script_score" : {
"script": "Math.log(_score) + Math.log(doc['relevancy'])"
},
"boost_mode": "replace"
}
}
}
'
And the mapping for this index is:
{
"products" : {
"mappings" : {
"properties" : {
"#timestamp" : {
"type" : "date"
},
"#version" : {
"type" : "text",
"fields" : {
"keyword" : {
"type" : "keyword",
"ignore_above" : 256
}
}
},
"extended_desc" : {
"type" : "text",
"fields" : {
"keyword" : {
"type" : "keyword",
"ignore_above" : 256
}
}
},
"frecno" : {
"type" : "long"
},
"item_id" : {
"type" : "text",
"analyzer" : "my_analyzer"
},
"mfg_part_no" : {
"type" : "text",
"fields" : {
"keyword" : {
"type" : "keyword",
"ignore_above" : 256
}
}
},
"relevancy" : {
"type" : "long"
}
}
}
}
}
Read the field as doc['relevancy'].value, and replace each ' with \u0027 because the JSON body is wrapped in a single-quoted string for curl:
curl -X GET "localhost:9200/products/_search?pretty" -H 'Content-Type: application/json' -d'
{
"query": {
"function_score": {
"query": {
"multi_match" : {
"query": "KQ",
"fields": [ "item_id", "extended_desc", "mfg_part_no" ]
}
},
"script_score" : {
"script": "Math.log(_score) + Math.log(doc[\u0027relevancy\u0027].value)"
},
"boost_mode": "replace"
}
}
}
'
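As an aside, an alternative to the \u0027 trick is the standard shell idiom for embedding a single quote inside a single-quoted string: close the quote, emit an escaped quote, and reopen it (`'\''`). A minimal sketch building the same Painless script string:

```shell
# '\'' closes the single-quoted string, appends a literal ', and reopens it.
script='Math.log(_score) + Math.log(doc['\''relevancy'\''].value)'
echo "$script"
# → Math.log(_score) + Math.log(doc['relevancy'].value)
```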

Elasticsearch mapping: select all fields via template to change their data type

Hi all, I am using elasticsearch-template.json to set the data type of all of my fields to string. Below is a snippet of the template:
{
"template": "logstash-*",
"settings": {
"index.refresh_interval": "5s",
"number_of_shards": 1,
"number_of_replicas": 0
},
"mappings": {
"logs": {
"_all": {
"enabled": true
},
"properties": {
"level1": {
"properties": {
"level2": {
"properties": {
"_all": {"type": "string"}
}
}
}
}
}
}
}
}
Under level2 I have lots of fields which get created dynamically, and I want to set all of them to string. I have tried the "*" character as well as the "%" character to select all the fields, but unfortunately it only gets added as a new field to the mapping. How can I specify in the template to select all the fields under a certain level?
I believe what you are looking for is dynamic_templates, using path_match instead of match. The following demonstrates how that might work:
curl -XDELETE 'http://localhost:9200/test-*'
curl -XDELETE http://localhost:9200/_template/test
curl -XPOST http://localhost:9200/_template/test -d '
{
"template": "test-*",
"mappings": {
"_default_": {
"dynamic_templates": [
{
"level1_level2_all": {
"path_match": "level1.level2.*",
"match_mapping_type": "*",
"mapping": {
"index": "not_analyzed",
"type": "string"
}
}
}
]
}
}
}
'
curl -XPOST http://localhost:9200/test-1/a -d '
{
"level1": {
"level2": {
"x":1
}
}
}'
curl -XPOST http://localhost:9200/test-1/a -d '
{
"level1": {
"level2": {
"y":1
}
}
}'
curl http://localhost:9200/test-1/_mapping?pretty
The output of which is:
"test-1" : {
"mappings" : {
"_default_" : {
"dynamic_templates" : [ {
"level1_level2_all" : {
"mapping" : {
"index" : "not_analyzed",
"type" : "string"
},
"match_mapping_type" : "*",
"path_match" : "level1.level2.*"
}
} ],
"properties" : { }
},
"a" : {
"dynamic_templates" : [ {
"level1_level2_all" : {
"mapping" : {
"index" : "not_analyzed",
"type" : "string"
},
"match_mapping_type" : "*",
"path_match" : "level1.level2.*"
}
} ],
"properties" : {
"level1" : {
"properties" : {
"level2" : {
"properties" : {
"x" : {
"type" : "string",
"index" : "not_analyzed"
},
"y" : {
"type" : "string",
"index" : "not_analyzed"
}
}
}
}
}
}
}
}
}
}
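Note that "type": "string" with "index": "not_analyzed" is pre-5.x syntax. On Elasticsearch 7.x and later (typeless mappings, and index_patterns instead of template in legacy templates), the equivalent dynamic template would map the fields to keyword instead; a sketch of how the same template might look:

```shell
# Same path_match-based dynamic template, written for Elasticsearch 7.x:
# keyword replaces the removed string/not_analyzed combination.
curl -XPUT "http://localhost:9200/_template/test" -H 'Content-Type: application/json' -d '
{
  "index_patterns": ["test-*"],
  "mappings": {
    "dynamic_templates": [
      {
        "level1_level2_all": {
          "path_match": "level1.level2.*",
          "mapping": { "type": "keyword" }
        }
      }
    ]
  }
}
'
```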

How to index percolator queries containing filters on inner objects?

Using Elasticsearch 2.1.1
I have documents with inner objects:
{
"level1": {
"level2": 42
}
}
I want to register percolator queries applying filters on the inner property:
$ curl -XPUT http://localhost:9200/myindex/.percolator/myquery?pretty -d '{
"query": {
"filtered": {
"filter": {
"range": {
"level1.level2": {
"gt": 10
}
}
}
}
}
}'
It fails because I don't have a mapping:
{
"error" : {
"root_cause" : [ {
"type" : "query_parsing_exception",
"reason" : "Strict field resolution and no field mapping can be found for the field with name [level1.level2]",
"index" : "myindex",
"line" : 1,
"col" : 58
} ],
"type" : "percolator_exception",
"reason" : "failed to parse query [myquery]",
"index" : "myindex",
"caused_by" : {
"type" : "query_parsing_exception",
"reason" : "Strict field resolution and no field mapping can be found for the field with name [level1.level2]",
"index" : "myindex",
"line" : 1,
"col" : 58
}
},
"status" : 500
}
So I start again, but this time I add a mapping template before:
curl -XDELETE http://localhost:9200/_template/myindex
curl -XDELETE http://localhost:9200/myindex
curl -XPUT http://localhost:9200/_template/myindex?pretty -d '
{
"template": "myindex",
"mappings" : {
"mytype" : {
"properties" : {
"level1" : {
"properties" : {
"level2" : {
"type" : "long"
}
}
}
}
}
}
}
'
I try to register my percolator query again:
curl -XPUT http://localhost:9200/myindex/.percolator/myquery?pretty -d '{
"query": {
"filtered": {
"filter": {
"range": {
"level1.level2": {
"gt": 10
}
}
}
}
}
}'
And now it succeeds:
{
"_index" : "myindex",
"_type" : ".percolator",
"_id" : "myquery",
"_version" : 1,
"_shards" : {
"total" : 1,
"successful" : 1,
"failed" : 0
},
"created" : true
}
And I can see the mapping that has been created:
curl http://localhost:9200/myindex/_mapping?pretty
{
"myindex" : {
"mappings" : {
".percolator" : {
"properties" : {
"query" : {
"type" : "object",
"enabled" : false
}
}
},
"mytype" : {
"properties" : {
"level1" : {
"properties" : {
"level2" : {
"type" : "long"
}
}
}
}
}
}
}
}
Now my problem is that I also need to perform searches on my percolator queries and the default percolate mapping doesn’t index the query field.
So I start again, this time specifying in my mapping template that I want percolator queries to be indexed (note "enabled": true):
curl -XPUT http://localhost:9200/_template/myindex?pretty -d '
{
"template": "myindex",
"mappings" : {
".percolator" : {
"properties" : {
"query" : {
"type" : "object",
"enabled" : true
}
}
},
"mytype" : {
"properties" : {
"level1" : {
"properties" : {
"level2" : {
"type" : "long"
}
}
}
}
}
}
}
'
I try to register my percolator query again:
curl -XPUT http://localhost:9200/myindex/.percolator/myquery?pretty -d '{
"query": {
"filtered": {
"filter": {
"range": {
"level1.level2": {
"gt": 10
}
}
}
}
}
}'
But now I get an error:
{
"error" : {
"root_cause" : [ {
"type" : "mapper_parsing_exception",
"reason" : "Field name [level1.level2] cannot contain '.'"
} ],
"type" : "mapper_parsing_exception",
"reason" : "Field name [level1.level2] cannot contain '.'"
},
"status" : 400
}
How can I create and index a percolator query matching an inner property?
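For reference, on Elasticsearch 5.x and later the percolator moved from the .percolator type to a dedicated percolator field type, which is indexed and searchable out of the box and sidesteps the "enabled" workaround entirely. A sketch of the newer-style mapping (field names taken from the question; not applicable to 2.1.1 itself):

```shell
# Elasticsearch 5+ style: queries are stored in a field of type "percolator",
# alongside the regular document fields they will match against.
curl -XPUT "http://localhost:9200/myindex" -H 'Content-Type: application/json' -d '
{
  "mappings": {
    "properties": {
      "query": { "type": "percolator" },
      "level1": {
        "properties": {
          "level2": { "type": "long" }
        }
      }
    }
  }
}
'
```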

MapperParsingException [Analyzer [dbl_metaphone] not found for field [phonetic]]

I've an index on an Elasticsearch cluster, and I want to support phonetic matching.
This is my request:
curl -XPUT "http://localhost:9200/propertywebsites/_mapping/doc?pretty" -i -d '
{
"properties" : {
"phoneticbuilding" : {
"type" : "string",
"fields" : {
"phonetic" : {
"type" : "string",
"analyzer" : "dbl_metaphone"
}}}}
}
'
I received this error response:
HTTP/1.1 400 Bad Request
Content-Type: application/json; charset=UTF-8
Content-Length: 116
{
"error" : "MapperParsingException[Analyzer [dbl_metaphone] not found for field [phonetic]]",
"status" : 400
}
Does anyone have any idea why the dbl_metaphone analyzer couldn't be recognized for the phonetic field?
My Elasticsearch version is 1.7.2.
Update 1
I already have the analyzer defined as follows:
PUT myIndexName/
{
"settings": {
"analysis": {
"filter": {
"dbl_metaphone": {
"type": "phonetic",
"encoder": "double_metaphone"
}
},
"analyzer": {
"dbl_metaphone": {
"tokenizer": "standard",
"filter": "dbl_metaphone"
}
}
}
}
}
Update 2
Querying the settings with this request:
curl -XGET "http://localhost:9200/propertywebsites/_settings?pretty"
I get the following response:
{
"propertywebsites" : {
"settings" : {
"index" : {
"creation_date" : "1451838136296",
"number_of_shards" : "5",
"number_of_replicas" : "1",
"version" : {
"created" : "1070299"
},
"uuid" : "KVOuKVgGRBudsSplownrgg",
"analsis" : {
"filter" : {
"dbl_metaphone" : {
"type" : "phonetic",
"encoder" : "double_metaphone"
}
},
"analyzer" : {
"dbl_metaphone" : {
"filter" : "dbl_metaphone",
"tokenizer" : "standard"
}
}
}
}
}
}
}
"dbl_metaphone" is a token filter, not an analyzer. You need to first install the Phonetic Analysis plugin and then create a custom analyzer with it. Find more information at https://www.elastic.co/guide/en/elasticsearch/guide/current/phonetic-matching.html. Note, too, that your settings output above shows the section spelled "analsis" rather than "analysis", so the custom analyzer you defined was never registered under the analysis settings in the first place.
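A sketch of the plugin installation for the 1.x line (the 2.7.x plugin series is the one that paired with Elasticsearch 1.7.x, but verify against the plugin's compatibility table; the exact bin/plugin syntax also varies slightly across 1.x releases):

```shell
# Install the phonetic analysis plugin on every node, then restart them,
# so the "phonetic" token filter type becomes available to index settings.
bin/plugin install elasticsearch/elasticsearch-analysis-phonetic/2.7.0
```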
