Update mapping index parameter of an existing field in Elasticsearch

I have the mapping
{
  "test" : {
    "mappings" : {
      "properties" : {
        "description" : {
          "type" : "text"
        },
        "location" : {
          "type" : "keyword",
          "index" : false
        },
        "title" : {
          "type" : "text"
        }
      }
    }
  }
}
and I want to update the index parameter of the location field to true.
I am trying:
PUT /test/_mapping
{
  "properties": {
    "location": {
      "type": "keyword",
      "index": true
    }
  }
}
and I am getting
{"error":{"root_cause":[{"type":"illegal_argument_exception","reason":"Mapper for [location] conflicts with existing mapping:\n[mapper [location] has different [index] values]"}],"type":"illegal_argument_exception","reason":"Mapper for [location] conflicts with existing mapping:\n[mapper [location] has different [index] values]"},"status":400}
How to update the index parameter?

What you are trying to achieve is called a breaking (or conflicting) change, and it is not possible; that is exactly what the error message says.
Think about what the index parameter does and why changing it is a breaking change. From the index docs:
The index option controls whether field values are indexed. It accepts
true or false and defaults to true. Fields that are not indexed are
not queryable.
Earlier the index value was false, so the values in your existing documents were not indexed and were not queryable. Changing it to true now does not make sense, because those earlier documents would still have no indexed values; that is why it is called a breaking change.
You have to create a new index with the new index setting and then use the reindex API to move the data across.
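For example, a minimal sketch (the new index name test_v2 is just a placeholder): create the new index with the desired mapping, reindex the old data into it, and then point your application (or an alias) at the new index.
PUT /test_v2
{
  "mappings": {
    "properties": {
      "description": { "type": "text" },
      "location": { "type": "keyword", "index": true },
      "title": { "type": "text" }
    }
  }
}

POST /_reindex
{
  "source": { "index": "test" },
  "dest": { "index": "test_v2" }
}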

Related

Disable mapping for a specific field using an Index Template Elasticsearch 6.8

I have an EFK pipeline set up. Every day a new index is created using the logstash-* prefix, and every time a new field is sent by Fluentd, the field is added to the index pattern logstash-*. I'm trying to create an index template that will disable indexing on a specific field when an index is created. I got this to work in ES 7.1 using the PUT below:
PUT _template/logstash-test
{
  "index_patterns": ["logstash-*"],
  "mappings": {
    "dynamic_templates" : [
      {
        "params" : {
          "path_match" : "params",
          "mapping" : {
            "enabled": false
          }
        }
      }
    ]
  }
}
However when I try this on Elasticsearch 6.8 I get the following error:
"type": "illegal_argument_exception",
"reason": "Malformed [mappings] section for type [dynamic_templates], should include an inner object describing the mapping"
It is a little different in Elasticsearch 6.x, which still had mapping types; these are no longer used in later versions.
Try something like this:
PUT _template/logstash-test
{
  "index_patterns": ["logstash-*"],
  "mappings": {
    "_doc": {
      "dynamic_templates" : [
        {
          "params" : {
            "path_match" : "params",
            "mapping" : {
              "enabled": false
            }
          }
        }
      ]
    }
  }
}
If your index uses a different custom type rather than _doc, use that type name in the mapping instead.
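To verify the template is picked up, you can index a document into a new index that matches the pattern and check its mapping; the index name and document below are only examples.
PUT logstash-2019.06.01/_doc/1
{
  "message": "some log line",
  "params": { "user_id": 42 }
}

GET logstash-2019.06.01/_mapping
The params object should then appear in the mapping with "enabled": false and no sub-fields under it.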

How to update Index mapping to update fielddata data type

I don't know how my field mapping changed from keyword to text, but it is now an issue and I need to change it from text back to keyword.
We have a huge amount of data, so reindexing would take 2 to 3 days. We are now looking for a way to update the index mapping so that the issue is resolved.
In our lower environment the field is still keyword; in prod it has changed.
We are using AWS Elasticsearch 7.1.
Please help.
This is what we want
{
  "mappings": {
    "properties": {
      "objectID": {
        "type": "keyword"
      }
    }
  }
}
But this gives us the error:
"type": "resource_already_exists_exception",
This is our search query:
Finally, we have upgraded our ES cluster; could that be the root cause of the issue?
It's a dynamic mapping for this field.
As stated in the question, the field now has a dynamic mapping. This means that your current index mapping for the objectId field is of text type with a keyword multi-field:
{
  "<index-name>" : {
    "mappings" : {
      "properties" : {
        "objectId" : {
          "type" : "text",
          "fields" : {
            "keyword" : {
              "type" : "keyword",
              "ignore_above" : 256
            }
          }
        }
      }
    }
  }
}
You cannot change the mapping of the objectId field from text to keyword using the update mapping API. If you try to do so, you will get the error below:
{
  "error" : {
    "root_cause" : [
      {
        "type" : "illegal_argument_exception",
        "reason" : "mapper [objectId] cannot be changed from type [text] to [keyword]"
      }
    ],
    "type" : "illegal_argument_exception",
    "reason" : "mapper [objectId] cannot be changed from type [text] to [keyword]"
  },
  "status" : 400
}
So instead you can use the objectId.keyword field (which was already created by dynamic mapping, as stated in the question above), or you need to use the reindex API.
With the reindex API, you have to create a new index with the required mapping and then reindex the old data into the new index (based on the new mapping).
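If exact-value matching is all you need, you can simply query the keyword sub-field directly; a sketch with a placeholder index name and value:
GET my-index/_search
{
  "query": {
    "term": {
      "objectId.keyword": "some-object-id"
    }
  }
}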

Creating a new field into existing index - ElasticSearch

I want to create a new field and add it to an existing index so that I can send a unique value to that new field. I was hoping there was an API to do this without having to do it in the Kibana dev-tools console, but I ran into this article that explains how to add new fields to an existing index.
I tried to add it under the _source field, but that was not allowed:
PUT customer-simulation-es-app-logs-development-2021-07/_mapping
{
  "_source": {
    "TransactionKey": {
      "type": "keyword"
    }
  }
}
So I then added it under properties, which was allowed:
PUT customer-simulation-es-app-logs-development-2021-07/_mapping
{
  "properties": {
    "TransactionKey": {
      "type": "keyword"
    }
  }
}
To make sure it was updated, I ran GET customer-simulation-es-app-logs-development-2021-07/_mapping, which did return it:
{
  "customer-simulation-es-app-logs-development-2021-07" : {
    "mappings" : {
      "properties" : {
        "#timestamp" : {
          "type" : "date"
        },
        "TransactionKey" : {
          "type" : "keyword"
        },
        "exceptions" : {
          "properties" : {
            "ClassName" : {
              "type" : "text",
              "fields" : {
                "keyword" : {
                  "type" : "keyword",
                  "ignore_above" : 256
                }
              }
            },
            .....
But when I go to Discover and type TransactionKey into the field search, nothing shows up. Did I not add the new field correctly to the existing index?
If you're running a version prior to 7.11, then you need to go to Stack Management > Index pattern and refresh your index pattern before seeing your new field in the Discover view. You need to do this every time your index mapping changes.
Since 7.11, index patterns are refreshed automatically whenever needed.
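Independently of Kibana, you can confirm the field is searchable on the Elasticsearch side by indexing a test document and querying it back (the value below is made up; remove the test document afterwards):
POST customer-simulation-es-app-logs-development-2021-07/_doc?refresh=true
{
  "TransactionKey": "test-key-123"
}

GET customer-simulation-es-app-logs-development-2021-07/_search
{
  "query": {
    "term": { "TransactionKey": "test-key-123" }
  }
}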

How to update index mapping with dynamic as strict in future?

I am new to elastic search. I have an index named users, which has a lot of fields I know. But a few more fields can be added in the future.
So when defining my mapping, I want to include the fields I currently know with dynamic set to "strict". But if I want to add a new field in the future, how will I update the mapping, and if I do, will I have to reindex everything?
I found in the ES documentation that mappings are applied only at index creation time, so I am a little confused about the right way to approach this.
You can always update the mapping in the future, even after setting dynamic to strict, by using the put mapping API. You will not need to reindex the existing data unless you want the newly added field to have a value for the older documents that were indexed before the mapping was updated with the new field.
Let's assume you already have an index test with one field, say field1 of type keyword. In the future you need to add a new field, say field2 of type integer. You can do so with the put mapping API as below:
PUT test/_mapping
{
  "properties": {
    "field2": {
      "type": "integer"
    }
  }
}
After executing the above, if you check the mapping using
GET test/_mapping
you can see the new field in the response as well:
{
  "test" : {
    "mappings" : {
      "dynamic" : "strict",
      "properties" : {
        "field1" : {
          "type" : "keyword"
        },
        "field2" : {
          "type" : "integer"
        }
      }
    }
  }
}
Inner objects inherit the dynamic setting from their parent object or from the mapping type. In the following example, dynamic mapping is disabled at the type level, so no new top-level fields will be added dynamically.
However, the user.social_networks object enables dynamic mapping, so you can add fields to this inner object.
https://www.elastic.co/guide/en/elasticsearch/reference/current/dynamic.html
PUT my-index-000001
{
  "mappings": {
    "dynamic": false,
    "properties": {
      "user": {
        "properties": {
          "name": {
            "type": "text"
          },
          "social_networks": {
            "dynamic": true,
            "properties": {}
          }
        }
      }
    }
  }
}
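For instance, a quick sketch against the same example index: a document can introduce new fields under user.social_networks (dynamic is true there), while an unknown top-level field is kept in _source but not added to the mapping, because dynamic is false at the top level.
PUT my-index-000001/_doc/1
{
  "user": {
    "name": "Alice",
    "social_networks": {
      "twitter": "@alice"
    }
  },
  "unexpected_field": "kept in _source but not mapped"
}

GET my-index-000001/_mapping
The mapping response should now contain a dynamically added user.social_networks.twitter field, while unexpected_field does not appear.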

Storing only selected fields and not storing _all in pyes/elasticsearch

I am trying to use pyes with Elasticsearch as a full-text search engine. I store only UUIDs and the indexed string fields; the actual data is stored in MongoDB and retrieved using the UUIDs. Unfortunately, I am unable to create a mapping that wouldn't store the original data. I've tried various combinations of the "store"/"source" settings and disabling "_all", but I can still get the text of the indexed fields back. The documentation seems misleading on this topic, as it's just a copy of the original docs.
Can anyone please provide an example of mapping that would only store some fields and not the original document JSON?
Sure, you could use something like this (with two fields, 'uuid' and 'data'):
{
  "mytype" : {
    "_source" : {
      "enabled" : false
    },
    "_all" : {
      "enabled" : false
    },
    "properties" : {
      "data" : {
        "store" : "no",
        "type" : "string"
      },
      "uuid" : {
        "store" : "yes",
        "type" : "string",
        "index" : "not_analyzed"
      }
    }
  }
}
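As a quick check (a sketch assuming an older Elasticsearch version consistent with the string/not_analyzed syntax above; index and type names are placeholders), you can search on the data field but only ask for the stored uuid back, since _source is disabled:
POST myindex/mytype/_search
{
  "query": {
    "match": { "data": "some text" }
  },
  "fields": ["uuid"]
}
Each hit should then contain only the stored uuid value, which you can use to look the document up in MongoDB.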
