I've got the following mapping for an ES index (I'm not including the config for the analyzers and other settings):
{
  "mappings": {
    "properties": {
      "topCustomer": {
        "type": "text",
        "analyzer": "autocomplete",
        "search_analyzer": "autocomplete_search",
        "fields": {
          "keyword": {
            "type": "keyword",
            "ignore_above": 256
          }
        }
      },
      "topCustomer_suggest": {
        "type": "completion",
        "contexts": [
          {
            "name": "index_name",
            "type": "category"
          }
        ]
      },
      "customer": {
        "type": "nested",
        "include_in_root": true,
        "properties": {
          "customer_name": {
            "type": "text",
            "analyzer": "autocomplete",
            "search_analyzer": "autocomplete_search",
            "fields": {
              "keyword": {
                "type": "keyword",
                "ignore_above": 256
              },
              "customer_name_suggest": {
                "type": "completion",
                "contexts": [
                  {
                    "name": "index_name",
                    "type": "category"
                  }
                ]
              }
            }
          },
          "customer_level": {
            "type": "integer"
          }
        }
      }
    }
  }
}
Also, I have the following Logstash configuration file:
input {
  jdbc {
    # Input config
  }
}
filter {
  mutate {
    remove_field => ["@version"]
  }
  ruby {
    code => "
      input = event.get('topCustomer').strip.gsub(/[\(\)]+/, '').split(/[\s\/\-,]+/)
      event.set('[topCustomer_suggest][input]', input)
      contexts = { 'index_name' => [event.get('type')] }
      event.set('[topCustomer_suggest][contexts]', contexts)

      input = event.get('[customer][customer_name]').strip.gsub(/[\(\)]+/, '').split(/[\s\/\-,]+/)
      event.set('[customer][customer_name][fields][customer_name_suggest][input]', input)
      contexts = { 'index_name' => [event.get('type')] }
      event.set('[customer][customer_name][fields][customer_name_suggest][contexts]', contexts)
    "
  }
}
output {
  elasticsearch {
    index => "%{type}"
    manage_template => false
    hosts => ["localhost:9200"]
  }
}
Now, when I try to refresh my index to apply the changes I made to one of these files, I get the following error:
Could not index event to Elasticsearch ...
:response=>{"index"=>{"_index"=>"customers", "_type"=>"_doc", "_id"=>"...", "status"=>400,
"error"=>{"type"=>"illegal_argument_exception", "reason"=>"Contexts are mandatory in context enabled completion field [customer.customer_name.customer_name_suggest]"}}}
I tried modifying my config file so that the event.set calls in the ruby filter section match the field path the error displays; I also tried many other combinations to see whether this was causing the error.
As you can see, I defined another completion field in the mapping (topCustomer_suggest), and that field works as expected. The difference is that it is not inside a nested field.
Notice that customer_name_suggest is a sub-field, not an 'independent' field like topCustomer_suggest. Is this the correct way of doing it, or should I not make customer_name_suggest a sub-field? I really don't understand why I'm getting the error, since I am defining the contexts property in the mapping.
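One thing worth noting: multi-fields are indexed from the value of their parent field, so there is no way to send a separate {input, contexts} payload to a completion sub-field; that would be consistent with the "contexts are mandatory" error. A sketch of the alternative (field names taken from the mapping above), with the suggester as a sibling property inside the nested object rather than a sub-field:
"customer": {
  "type": "nested",
  "include_in_root": true,
  "properties": {
    "customer_name": {
      "type": "text",
      "analyzer": "autocomplete",
      "search_analyzer": "autocomplete_search",
      "fields": {
        "keyword": { "type": "keyword", "ignore_above": 256 }
      }
    },
    "customer_name_suggest": {
      "type": "completion",
      "contexts": [
        { "name": "index_name", "type": "category" }
      ]
    },
    "customer_level": { "type": "integer" }
  }
}
The ruby filter would then set [customer][customer_name_suggest][input] and [customer][customer_name_suggest][contexts] directly, instead of addressing the sub-field through [fields].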
I don't know whether my index has dynamic mapping enabled or disabled. When I use the get-index-mapping command, it just returns this information:
GET /my_index1/_mapping
{
  "my_index1": {
    "mappings": {
      "properties": {
        "goodsName": {
          "fields": {
            "keyword": {
              "ignore_above": 256,
              "type": "keyword"
            }
          },
          "type": "text"
        },
        "auditTime": {
          "type": "long"
        },
        "createUserId": {
          "type": "long"
        }
      }
    }
  }
}
If you don't explicitly set dynamic to false or strict, it is true by default. If you set it explicitly, you will see it in your mappings:
{
  "mappings": {
    "dynamic": false,
    "properties": {
      "name": {
        "type": "text"
      }
    }
  }
}
And when you index the following document:
{"name":"products", "clickCount":1, "bookingCount":2, "isPromoted":1}
Only the field name will be indexed; the rest won't be. If you call the _mapping endpoint again, it will return exactly the mappings above.
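As a quick illustration (index name hypothetical), dynamic: "strict" goes one step further and rejects such documents instead of silently ignoring the unmapped fields:
PUT my_index2
{
  "mappings": {
    "dynamic": "strict",
    "properties": {
      "name": { "type": "text" }
    }
  }
}

PUT my_index2/_doc/1
{"name": "products", "clickCount": 1}
The second request fails with a strict_dynamic_mapping_exception, because clickCount is not allowed by the mapping.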
I want my Elasticsearch index to match the exact value for all the fields. How do I map my index as "not_analyzed" for all the fields?
I'd suggest making use of multi-fields in your mapping (which is the default behavior if you don't create a mapping yourself, i.e. dynamic mapping).
That way you can switch between full-text search and exact-match searches when required.
Note that for exact matches you need the keyword datatype plus a Term Query; a sketch follows below. Sample examples are provided in the links I've specified.
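For instance, a minimal sketch of an exact-match search (index and field names hypothetical), assuming dynamic mapping has created a .keyword sub-field:
GET my_index/_search
{
  "query": {
    "term": {
      "productName.keyword": {
        "value": "Blue Widget 3000"
      }
    }
  }
}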
Hope it helps!
You can use a dynamic_templates mapping for this. By default, Elasticsearch maps string fields as text with index: true, like below:
{
  "products2": {
    "mappings": {
      "product": {
        "properties": {
          "color": {
            "type": "text",
            "fields": {
              "keyword": {
                "type": "keyword",
                "ignore_above": 256
              }
            }
          },
          "type": {
            "type": "text",
            "fields": {
              "keyword": {
                "type": "keyword",
                "ignore_above": 256
              }
            }
          }
        }
      }
    }
  }
}
As you can see, it also creates a keyword field as a multi-field. This keyword field is indexed but, unlike the text field, not analyzed. If you want to drop this default behaviour, you can use the configuration below for the index when creating it:
PUT products
{
  "settings": {
    "number_of_shards": 1,
    "number_of_replicas": 0
  },
  "mappings": {
    "product": {
      "dynamic_templates": [
        {
          "strings": {
            "match_mapping_type": "string",
            "mapping": {
              "type": "keyword",
              "index": false
            }
          }
        }
      ]
    }
  }
}
After doing this, the index will look like below:
{
  "products": {
    "mappings": {
      "product": {
        "dynamic_templates": [
          {
            "strings": {
              "match_mapping_type": "string",
              "mapping": {
                "type": "keyword",
                "index": false
              }
            }
          }
        ],
        "properties": {
          "color": {
            "type": "keyword",
            "index": false
          },
          "type": {
            "type": "keyword",
            "index": false
          }
        }
      }
    }
  }
}
Note: I don't know your exact use case, but you can use the multi-field feature as mentioned by @Kamal. Otherwise, you cannot search on the not-analyzed fields. You can also use the dynamic_templates mapping to keep some fields searchable; a sketch follows below.
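For instance, a variant of the template above that keeps strings exact-match searchable (a sketch; omitting "index": false leaves the keyword fields indexed, so term queries work):
PUT products
{
  "mappings": {
    "product": {
      "dynamic_templates": [
        {
          "strings": {
            "match_mapping_type": "string",
            "mapping": {
              "type": "keyword"
            }
          }
        }
      ]
    }
  }
}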
Please check the documentation for more information :
https://www.elastic.co/guide/en/elasticsearch/reference/current/dynamic-templates.html
I also explained this behaviour in an article; sorry, but it is in Turkish. You can check its example code samples with Google Translate if you want.
I am new to Elasticsearch and I am trying to use Logstash to load data into an index. Following is part of my Logstash config:
filter {
  aggregate {
    task_id => "%{code}"
    code => "
      map['campaignId'] = event.get('CAM_ID')
      map['country'] = event.get('COUNTRY')
      map['countryName'] = event.get('COUNTRYNAME')
      # etc
    "
    push_previous_map_as_event => true
    timeout => 5
  }
}
output {
  elasticsearch {
    document_id => "%{code}"
    document_type => "company"
    index => "company_v1"
    codec => "json"
    hosts => ["127.0.0.1:9200"]
  }
}
I was expecting the aggregation to map, for instance, the column 'CAM_ID' into a property named 'campaignId' in the Elasticsearch index. Instead, it creates a property named 'cam_id', which is the column name in lowercase. The same happens with the rest of the properties.
Following is the index definition (mappings and settings) after Logstash has run:
{
  "company_v1": {
    "aliases": {},
    "mappings": {
      "company": {
        "properties": {
          "@timestamp": {
            "type": "date"
          },
          "@version": {
            "type": "text",
            "fields": {
              "keyword": {
                "type": "keyword",
                "ignore_above": 256
              }
            }
          },
          "cam_id": {
            "type": "long"
          },
          "campaignId": {
            "type": "long"
          },
          "cam_type": {
            "type": "text",
            "fields": {
              "keyword": {
                "type": "keyword",
                "ignore_above": 256
              }
            }
          },
          "campaignType": {
            "type": "text"
          }
        }
      }
    },
    "settings": {
      "index": {
        "creation_date": "1545905435871",
        "number_of_shards": "5",
        "number_of_replicas": "1",
        "uuid": "Dz0x16ohQWWpuhtCB3Y4Vw",
        "version": {
          "created": "6050399"
        },
        "provided_name": "company_v1"
      }
    }
  }
}
'campaignId' and 'campaignType' were created by me when I created the index, but Logstash created the other two.
Can someone explain how to configure Logstash to customize the document property names when data is being loaded?
Thank you very much.
Best Regards
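Two hedged observations, not a definitive answer: if the input is the jdbc plugin, it lowercases column names by default (its lowercase_column_names option), which would explain 'cam_id'; and with push_previous_map_as_event => true, the original per-row events are emitted alongside the aggregated map events, which would explain both 'cam_id' and 'campaignId' appearing in the mapping. A minimal sketch of renaming fields with the mutate filter (field names taken from the mapping shown above):
filter {
  mutate {
    # Rename the lowercased source columns to the desired property names.
    rename => {
      "cam_id"   => "campaignId"
      "cam_type" => "campaignType"
    }
  }
}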
I have a problem with nested object mapping using the ruby filter plugin.
My object should have a field cmds, which is an array of objects like this:
"cmds": [
{
"number": 91,
"errors": [],
"errors_count": 0
},
{
"number": 92,
"errors": ["ERROR_1"],
"errors_count": 1
}]
In Elasticsearch I need to find objects where number = 91 and errors_count > 0, so the object above shouldn't be a correct result. But my query (below) matches it:
GET /logs/default/_search
{
  "query": {
    "bool": {
      "must": [
        {
          "match": {
            "cmds.number": 91
          }
        },
        {
          "range": {
            "cmds.errors_count": {
              "gt": 0
            }
          }
        }
      ]
    }
  }
}
I know this is because the JSON document is flattened into a simple key-value format, and I should map the cmds field as type nested instead of type object.
The problem is that I have no idea how to do that in my Logstash ruby script with event.set.
I have the following code:
i = 0
for t in commandTexts do
  commandv = Command.new(t)
  # Build a plain hash for each command. Note: the [field][subfield] bracket
  # syntax is only for event.get/event.set paths, not for keys inside the value.
  cmd = {
    'number'       => commandv.hexnumber,
    'command_text' => commandv.command_text,
    'errors'       => commandv.errors,
    'has_error'    => commandv.has_error,
    'errors_count' => commandv.errors_count
  }
  if i == 0
    event.set('[cmds]', [cmd])
  else
    event.set('[cmds]', event.get('[cmds]') + [cmd])
  end
  i += 1
end
I'm new to ruby and my code is not perfect, but the cmds field looks fine in Elasticsearch. The only problem is that it is not nested. Please help.
OK, I did it. I'm still new to ELK, and I'm sometimes confused about where (Logstash, Kibana, ruby scripts) I should do what is needed.
My code is fine. Using Kibana, I deleted my index and made a new one with the correct mapping
code:
PUT /logs?pretty
{
  "mappings": {
    "default": {
      "properties": {
        "cmds": {
          "type": "nested",
          "properties": {
            "command_text": {
              "type": "text",
              "fields": {
                "keyword": {
                  "type": "keyword",
                  "ignore_above": 256
                }
              }
            },
            "errors": {
              "type": "text",
              "fields": {
                "keyword": {
                  "type": "keyword",
                  "ignore_above": 256
                }
              }
            },
            "errors_count": {
              "type": "long"
            },
            "has_error": {
              "type": "boolean"
            },
            "number": {
              "type": "long"
            }
          }
        }
      }
    }
  }
}
Earlier I was trying to create the new index just by setting "type" to "nested":
PUT /logs?pretty
{
  "mappings": {
    "default": {
      "properties": {
        "cmds": {
          "type": "nested"
        }
      }
    }
  }
}
But it wasn't working correctly (the cmds field was not added to Elasticsearch), so I did it with the full mapping (all properties).
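With cmds mapped as nested, the original bool query needs to be wrapped in a nested query so that both conditions must hold on the same array element. A sketch, reusing the fields from above:
GET /logs/default/_search
{
  "query": {
    "nested": {
      "path": "cmds",
      "query": {
        "bool": {
          "must": [
            { "match": { "cmds.number": 91 } },
            { "range": { "cmds.errors_count": { "gt": 0 } } }
          ]
        }
      }
    }
  }
}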
Is it possible to use multi-fields to set and query multilingual fields?
Consider this mapping:
PUT multi_test
{
  "mappings": {
    "data": {
      "_field_names": {
        "enabled": false
      },
      "properties": {
        "book_title": {
          "type": "text",
          "fields": {
            "english": {
              "type": "text",
              "analyzer": "english"
            },
            "german": {
              "type": "text",
              "analyzer": "german"
            },
            "italian": {
              "type": "text",
              "analyzer": "italian"
            }
          }
        }
      }
    }
  }
}
I tried the following, but it doesn't work:
PUT multi_test/data/1
{
  "book_title.english": "It's good",
  "book_title.german": "Das gut"
}
The error seems to indicate I'm trying to add new fields:
{ "error": { "root_cause": [ { "type": "mapper_parsing_exception",
"reason": "Could not dynamically add mapping for field
[book_title.english]. Existing mapping for [book_title] must be of
type object but found [text]." } ], "type":
"mapper_parsing_exception", "reason": "Could not dynamically add
mapping for field [book_title.english]. Existing mapping for
[book_title] must be of type object but found [text]." }, "status":
400 }
What am I doing wrong here?
If my approach is unworkable, what is a better way to do this?
The problem is that you are using fields for the field book_title.
The fields keyword is used when you want to index the same field's data in multiple ways, i.e. with different analyzers or other setting changes, but the value is the same for every field name under fields. Here is the link describing multi-fields: https://www.elastic.co/guide/en/elasticsearch/reference/2.4/multi-fields.html
In your use case the mapping should be like below:
PUT multi_test
{
  "mappings": {
    "data": {
      "_field_names": {
        "enabled": false
      },
      "properties": {
        "book_title": {
          "properties": {
            "english": {
              "type": "text",
              "analyzer": "english"
            },
            "german": {
              "type": "text",
              "analyzer": "german"
            },
            "italian": {
              "type": "text",
              "analyzer": "italian"
            }
          }
        }
      }
    }
  }
}
This will define book_title as an object type, and you can add multiple fields with different data under book_title.
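With this object mapping, the document source nests the languages under book_title, and each sub-field can be queried directly. A minimal sketch reusing the values from the question:
PUT multi_test/data/1
{
  "book_title": {
    "english": "It's good",
    "german": "Das gut"
  }
}

GET multi_test/_search
{
  "query": {
    "match": {
      "book_title.german": "gut"
    }
  }
}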