I'm sending my data to Elasticsearch with an index_number field on each document; it's a unique identifier. When I try to sort on it from the Python client, I get the consistency problem you can see in the picture.
This is my query DSL:
"size": 1,
"query": {
"match_all": {}
},
"sort": [
{
"index_number.keyword": {
"order": "asc",
"missing": "_last",
"unmapped_type": "String"
}
}
]
In my Logstash output:
output{
elasticsearch {
hosts => ["localhost:9200"]
index => "logstash_%{+yyyy-MM-dd}"
manage_template => true
template_name => "logstash_template"
template => "..../logstash_template.json"
http_compression => true
}
}
In my logstash_template.json:
...
{
"index_patterns": ["logstash_*"],
"template": {
"settings":{
"number_of_shards": 1,
"number_of_replicas": 0,
"index": {
"sort.field": "index_number",
"sort.order": "asc"
}
},
"mappings": {
"dynamic_templates":{
"string_fields": {
"match": "*",
"match_mapping_type": "string",
"mapping": {"type":"keyword"}
}
},
"properties": {
"index_number": {
"type": "keyword",
"fields": {
"numeric": {
"type": "double"
}
}
}
}
}
}
}
....
Mapping on Elasticsearch:
{
"logstash_2020-03-12" : {
"mappings" : {
"properties" : {
.....
"index_number" : {
"type" : "text",
"fields" : {
"keyword" : {
"type" : "keyword",
"ignore_above" : 256
}
}
},
"city" : {
"type" : "text",
"fields" : {
"keyword" : {
"type" : "keyword",
"ignore_above" : 256
}
}
},
"country" : {
"type" : "text",
"fields" : {
"keyword" : {
"type" : "keyword",
"ignore_above" : 256
}
}
},
-----
}
}
}
}
How can I solve it? Thanks for answering.
You need to add template_overwrite to your Logstash output configuration, otherwise logstash_template is not overwritten if it already exists:
output{
elasticsearch {
hosts => ["localhost:9200"]
index => "logstash_%{+yyyy-MM-dd}"
manage_template => true
template_overwrite => true <-- add this
template_name => "logstash_template"
template => "..../logstash_template.json"
http_compression => true
}
}
Make sure that your logstash_template.json file has the following format:
{
"index_patterns": [
"logstash_*"
],
"settings": {
"number_of_shards": 1,
"number_of_replicas": 0,
"index": {
"sort.field": "index_number",
"sort.order": "asc"
}
},
"mappings": {
"dynamic_templates": {
"string_fields": {
"match": "*",
"match_mapping_type": "string",
"mapping": {
"type": "keyword"
}
}
},
"properties": {
"index_number": {
"type": "keyword",
"fields": {
"numeric": {
"type": "double"
}
}
}
}
}
}
You had mappings and settings enclosed within a template section, but that layout is only for the new composable index templates, which the Elasticsearch output plugin for Logstash doesn't support yet. You need to use the legacy index template format.
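Once Logstash restarts with the corrected template, a quick sanity check (assuming the template name from the config above) is to fetch it through the legacy template endpoint:
GET _template/logstash_template
Also keep in mind that index templates are only applied at index creation time, so the index sort and the keyword mapping for index_number will only show up on indices created after the corrected template is in place.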
I have an index template from which I am creating an index:
PUT /_index_template/example_template
{
"index_patterns": [
"example*"
],
"priority": 1,
"template": {
"aliases": {
"example":{}
},
"mappings": {
"dynamic":strict,
"_source":
{"enabled": false},
"properties": {
"SomeID":
{ "type": "keyword", "index" : true,"store":true,"ignore_above":5},
"firstName":
{ "type": "text", "index" : true,"store":true},
"lastName":
{ "type": "text", "index" : false},
"PersonInfo": {
"type": "object",
"dynamic":"true",
"properties": {
"FirstName": {
"type": "keyword",
"index": true,
"store": false
}
}
}
}
},
"settings": {
"index": {
"number_of_shards": 1,
"number_of_replicas": 3
}
}
}
}
As you can see in the template mappings, I am setting dynamic to strict so that new fields can't be added to the mappings, while on the inner object, PersonInfo, I set dynamic to true, which takes precedence and allows new field mappings to be inserted.
PUT example10022021/_doc/1
{
"SomeID":"1234",
"firstName":"Nishikant",
"PersonInfo.service_data":"random"
}
Here service_data gets added to the mappings, since dynamic is true:
"PersonInfo" : {
"dynamic" : "true",
"properties" : {
"FirstName" : {
"type" : "keyword"
},
"service_data" : {
"type" : "text",
"fields" : {
"keyword" : {
"type" : "keyword",
"ignore_above" : 256
}
}
}
}
}
Is there any way to disable dynamic mapping completely, e.g. by specifying it globally?
Thanks!
Steps I took after @Val's answer:
PUT /_index_template/example_template
{
"index_patterns": [
"example*"
],
"priority": 1,
"template": {
"aliases": {
"order":{}
},
"mappings": {
"dynamic": "strict",
"dynamic_templates": [
{
"objects": {
"match_mapping_type": "object",
"mapping": {
"dynamic": "strict"
}
}
}
],
"_source":
{"enabled": false},
"properties": {
"BillToID":
{ "type": "keyword", "index" : true,"store":true,"ignore_above":5},
"firstName":
{ "type": "text", "index" : true,"store":true},
"lastName":
{ "type": "text", "index" : false},
"PersonInfo": {
"type": "object",
"dynamic":true,
"properties": {
"FirstName": {
"type": "keyword",
"index": true,
"store": false
}
}
}
}
},
"settings": {
"index": {
"number_of_shards": 1,
"number_of_replicas": 3
}
}
}
}
Then I create an index:
PUT example10022021
then insert a document:
POST example10022021/_doc/1
{
"BillToID":"1234",
"firstName":"Nishikant",
"PersonInfo.service_data":"random"
}
This returns 200 OK. Now if you check the mappings again:
GET example10022021
in the output you can see the dynamic field mapping getting added (this is the behavior I don't want):
"PersonInfo" : {
"dynamic" : "true",
"properties" : {
"FirstName" : {
"type" : "keyword"
},
"service_data" : {
"type" : "text",
"fields" : {
"keyword" : {
"type" : "keyword",
"ignore_above" : 256
}
}
}
}
}
What you can do is create another index template that applies to all indices, i.e. one using the * name pattern:
PUT /_index_template/common_template
{
"index_patterns": [
"*"
],
"priority": 0,
"template": {
"mappings": {
"dynamic": "strict",
...
If you want to also restrict the creation of dynamic fields inside inner objects, you can leverage dynamic templates, like this:
PUT /_index_template/common_template
{
"index_patterns": [
"*"
],
"priority": 1000,
"template": {
"settings": {},
"mappings": {
"dynamic": "strict",
"dynamic_templates": [
{
"objects": {
"match_mapping_type": "object",
"mapping": {
"dynamic": "strict"
}
}
}
],
"properties": {
"test": {
"type": "object",
"properties": {
"inner": {
"type": "integer"
}
}
}
}
}
}
}
With the above index template, you can create a document like this one:
POST test/_doc/
{
"test": {
"inner": 1
}
}
But not like this one:
POST test/_doc/
{
"test": {
"inner": 1,
"inner2": 2 <--- this will not be allowed
}
}
Elasticsearch is not recognizing my list of objects as a nested type.
I would like that to happen automatically, without needing to update the mapping for every such field.
I need the response of the _mapping API to have some sort of identifier that distinguishes properties which are of list type.
For example:
When I index such a document into a new test index ('mapping_index'):
{
"text":"value",
"list":[{"a":"b","c":"d"},{"a":"q","c":"f"}]
}
and hit the mapping API:
localhost:9200/mapping_index/_mapping
I get
{
"mapping_index": {
"mappings": {
"_doc": {
"properties": {
"list": {
"properties": {
"a": {
"type": "text",
"fields": {
"keyword": {
"type": "keyword",
"ignore_above": 256
}
}
},
"c": {
"type": "text",
"fields": {
"keyword": {
"type": "keyword",
"ignore_above": 256
}
}
}
}
},
"text": {
"type": "text",
"fields": {
"keyword": {
"type": "keyword",
"ignore_above": 256
}
}
}
}
}
}
}
}
I would want something like
"type" : "nested"
for the "list" key in this response, so that another service which uses these fields stored in ES can tell that "list" is a multi-value key.
I've read about dynamic templates (https://www.elastic.co/guide/en/elasticsearch/reference/current/dynamic-templates.html) and think they might help me, but I'm not really sure.
Any help is much appreciated.
You can use dynamic_templates: a dynamic template with match_mapping_type: "object" will match any object-typed field and map it as nested:
{
"mappings": {
"dynamic_templates": [
{
"objects": {
"match": "*",
"match_mapping_type": "object",
"mapping": {
"type": "nested"
}
}
}
]
}
}
Data:
{
"list": [
{
"a": "b",
"c": "d"
},
{
"a": "q",
"c": "f"
}
]
}
Result:
"index80" : {
"mappings" : {
"dynamic_templates" : [
{
"objects" : {
"match" : "*",
"match_mapping_type" : "object",
"mapping" : {
"type" : "nested"
}
}
}
],
"properties" : {
"list" : {
"type" : "nested",
"properties" : {
"a" : {
"type" : "text",
"fields" : {
"keyword" : {
"type" : "keyword",
"ignore_above" : 256
}
}
},
"c" : {
"type" : "text",
"fields" : {
"keyword" : {
"type" : "keyword",
"ignore_above" : 256
}
}
}
}
}
}
}
}
}
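As a usage note, once list is mapped as nested, queries that need to match a and c within the same object have to go through a nested query. A minimal sketch against the index and sample data shown above:
GET index80/_search
{
"query": {
"nested": {
"path": "list",
"query": {
"bool": {
"must": [
{ "match": { "list.a": "b" } },
{ "match": { "list.c": "d" } }
]
}
}
}
}
}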
Using ELK 6.x, it seems I cannot plot points because geoip.location is not populated.
I also added a template, which I hope is correct. I'm not an expert, but I am pretty sure my points aren't rendered because the data is missing there.
Kibana 6.4.2
Logstash 6.4.2-1
Elasticsearch 6.4.2
Here are my configs:
input {
udp {
port => 9996
codec => netflow {
versions => [5, 7, 9, 10]
}
type => netflow
}
}
filter {
geoip {
source => "[netflow][ipv4_src_addr]"
target => "src_geoip"
database => "/usr/share/GeoIP/GeoLite2-City.mmdb"
}
geoip {
source => "[netflow][ipv4_dst_addr]"
target => "dst_geoip"
database => "/usr/share/GeoIP/GeoLite2-City.mmdb"
}
}
Output:
output {
if [type] == "netflow" {
elasticsearch {
hosts => ["localhost:9200"]
index => "logstash-%{+YYYY.MM.dd}"
}
} else {
elasticsearch {
hosts => ["localhost:9200"]
sniffing => true
manage_template => false
index => "%{[@metadata][beat]}-%{+YYYY.MM.dd}"
document_type => "%{[@metadata][type]}"
}
}
}
The mapping looks like this:
"geoip": {
"dynamic": "true",
"properties": {
"ip": {
"type": "ip"
},
"latitude": {
"type": "half_float"
},
"location": {
"type": "geo_point"
},
"longitude": {
"type": "half_float"
}
}
},
Template
{
"logstash": {
"order": 0,
"version": 60001,
"index_patterns": [
"logstash-*"
],
"settings": {
"index": {
"refresh_interval": "5s"
}
},
"mappings": {
"_default_": {
"dynamic_templates": [
{
"message_field": {
"path_match": "message",
"match_mapping_type": "string",
"mapping": {
"type": "text",
"norms": false
}
}
},
{
"string_fields": {
"match": "*",
"match_mapping_type": "string",
"mapping": {
"type": "text",
"norms": false,
"fields": {
"keyword": {
"type": "keyword",
"ignore_above": 256
}
}
}
}
}
],
"properties": {
"#timestamp": {
"type": "date"
},
"#version": {
"type": "keyword"
},
"geoip": {
"dynamic": true,
"properties": {
"ip": {
"type": "ip"
},
"location": {
"type": "geo_point"
},
"latitude": {
"type": "half_float"
},
"longitude": {
"type": "half_float"
}
}
}
}
}
},
"aliases": {}
}
}
My indices come back with src or dst fields, but only the ones below:
# dst_geoip.latitude 26.097
# dst_geoip.location.lat 26.097
# dst_geoip.location.lon -80.181
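To double-check whether the geo_point mapping is actually being applied to the fields my geoip filters write to (which is my guess at the problem), I believe the field mapping API can be queried directly, e.g.:
GET logstash-*/_mapping/field/dst_geoip.location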
I can't figure out why highlighting is not working. The query works, but the highlight just shows the field content without em tags. Here are my settings and mappings:
PUT wmsearch
{
"settings": {
"index.mapping.total_fields.limit": 2000,
"analysis": {
"analyzer": {
"custom": {
"type": "custom",
"tokenizer": "custom_token",
"filter": [
"lowercase"
]
},
"custom2": {
"type": "custom",
"tokenizer": "keyword",
"filter": [
"lowercase"
]
}
},
"tokenizer": {
"custom_token": {
"type": "ngram",
"min_gram": 3,
"max_gram": 10
}
}
}
},
"mappings": {
"doc": {
"properties": {
"document": {
"properties": {
"reference": {
"type": "text",
"analyzer": "custom"
}
}
},
"scope" : {
"type" : "nested",
"properties" : {
"level" : {
"type" : "integer"
},
"ancestors" : {
"type" : "keyword",
"index" : "true"
},
"value" : {
"type" : "keyword",
"index" : "true"
},
"order" : {
"type" : "integer"
}
}
}
}
}
}
}
Here is my query:
GET wmsearch/_search
{
"query": {
"simple_query_string" : {
"fields": ["document.reference"],
"analyzer": "custom2",
"query" : "bloom"
}
},
"highlight" : {
"fields" : {
"document.reference" : {}
}
}
}
The query does return the correct results, and the highlight field exists within the results. However, there are no em tags around "bloom"; it just shows the entire string with no tags at all.
Does anyone see any issues here, or can anyone help?
Thanks
I got it to work by adding "index_options": "offsets" to my mappings for document.reference.
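For reference, here is a sketch of how the document.reference part of the mapping looks with that change applied (everything else stays as in the settings and mappings above):
"document": {
"properties": {
"reference": {
"type": "text",
"analyzer": "custom",
"index_options": "offsets"
}
}
}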
I am trying to use the completion suggester with the Greek language. Unfortunately I have problems with accents like ά. I've tried a few ways.
One was simply to set the greek analyzer in the mapping, the other a lowercase analyzer with asciifolding. No success; with the greek analyzer I don't even get a result with the accent.
Below is what I did. It would be great if anyone could help me out here.
Mapping
PUT t1
{
"mappings": {
"profession" : {
"properties" : {
"text" : {
"type" : "keyword"
},
"suggest" : {
"type" : "completion",
"analyzer": "greek"
}
}
}
}
}
Dummy document:
POST t1/profession/?refresh
{
"suggest" : {
"input": [ "Μάγειρας"]
}
,"text": "Μάγειρας"
}
Query
GET t1/profession/_search
{ "suggest":
{ "profession" :
{ "prefix" : "Μα"
, "completion" :
{ "field" : "suggest"}
}}}
I found a way to do it with a custom analyzer, or via a plugin for ES which I highly recommend when it comes to non-Latin texts.
Option 1
PUT t1
{ "settings":
{ "analysis":
{ "filter":
{ "greek_lowercase":
{ "type": "lowercase"
, "language": "greek"
}
}
, "analyzer":
{ "autocomplete":
{ "tokenizer": "lowercase"
, "filter":
[ "greek_lowercase" ]
}
}
}}
, "mappings": {
"profession" : {
"properties" : {
"text" : {
"type" : "keyword"
},
"suggest" : {
"type" : "completion",
"analyzer": "autocomplete"
}
}}}
}
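As a sanity check for Option 1 (not part of the fix itself), the _analyze API can be used against the index created above to see which tokens the autocomplete analyzer produces for an accented input, and therefore what the suggester will match against:
POST t1/_analyze
{
"analyzer": "autocomplete",
"text": "Μάγειρας"
}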
Option 2: ICU Plugin
Install the ES plugin:
https://www.elastic.co/guide/en/elasticsearch/plugins/current/analysis-icu.html
{ "settings": {
"index": {
"analysis": {
"normalizer": {
"latin": {
"filter": [
"custom_latin_transform"
]
}
},
"analyzer": {
"latin": {
"tokenizer": "keyword",
"filter": [
"custom_latin_transform"
]
}
},
"filter": {
"noDelimiter": {"type": "word_delimiter"},
"custom_latin_transform": {
"type": "icu_transform",
"id": "Greek-Latin/UNGEGN; Lower(); NFD; [:Nonspacing Mark:] Remove; NFC"
}
}
}
}
}
, "mappings":
{ "doc" : {
"properties" : {
"verbose" : {
"type" : "keyword"
},
"name" : {
"type" : "keyword"
},
"slugHash":{
"type" : "keyword",
"normalizer": "latin"
},
"level": { "type": "keyword" },
"hirarchy": {
"type" : "keyword"
},
"geopoint": { "type": "geo_point" },
"suggest" :
{ "type" : "completion"
, "analyzer": "latin"
, "contexts":
[ { "name": "level"
, "type": "category"
, "path": "level"
}
]
}}
}
}}
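And a sketch of how the suggester could then be queried with a prefix plus a level context; the index name my_index and the context value city are just placeholders for illustration:
GET my_index/_search
{
"suggest": {
"profession_suggest": {
"prefix": "ma",
"completion": {
"field": "suggest",
"contexts": {
"level": [ "city" ]
}
}
}
}
}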