Adding new fields to mappings requires re-indexing data in Elasticsearch - elasticsearch

I'm using Elasticsearch 1.7.2. If I want to add new fields to existing mappings, do I need to re-index the entire data set?
Ex:
Mappings:
{
  "properties": {
    "a": { "type": "string" },
    "b": { "type": "string" }
  }
}
If I want to add a field "c", do we need to re-index the entire data set? Please help me, thanks in advance.
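(For reference, adding a brand-new field generally does not require a full re-index; the answers below confirm that new fields can be added to an existing mapping. A minimal sketch of the Put Mapping call on 1.x, assuming hypothetical names my_index and my_type:)
PUT /my_index/_mapping/my_type
{
  "properties": {
    "c": { "type": "string" }
  }
}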

Related

Add _id to the _source as a separate field for all existing docs in an index

I'm new to Elasticsearch. I need to go through all the documents, take the _id, and add it to the _source as a separate field via a script. Is it possible? If yes, can I have an example of something similar or a link to similar scripts? I haven't seen anything like that in the docs. Why do I need it? Because after that I will run SELECT queries with Open Distro SQL, and that framework cannot return fields which are not in _source. If anyone can suggest anything I would be very grateful.
There are two options:
First option: add the new field to your existing index, populate it, and build the index again.
Second option: simply define the new field in a new index mapping (keep all other fields the same) and then use the Reindex API with the script below.
"script": {
"source": "ctx._source.<your-field-name> = ctx._id"
}
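Put together, the full reindex request might look roughly like this (the index names and field name are placeholders for your own):
POST _reindex
{
  "source": { "index": "<source-index>" },
  "dest": { "index": "<new-index>" },
  "script": {
    "source": "ctx._source.<your-field-name> = ctx._id"
  }
}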

How to project a new field in response in ElasticSearch?

I am using Elasticsearch 6.2.
I have an index products with index_type productA containing data with the following structure:
{
"id": 1,
"parts": ["part1", "part2",...]
.....
.....
}
Now, at query time, I want to add or project a field parts_count to the response which simply represents the number of parts, i.e. the length of the parts array. Also, if possible, I would like to sort the documents of productA based on the generated field parts_count.
I have gone through most of the docs but haven't found a way to achieve this.
Note:
I don't want to update the mapping and add dynamic fields. I am not sure if Elasticsearch allows it. I just wanted to mention it.
Did you read about Script Fields and on Script Based Sorting?
I think you should be able to achieve both things, and this does not require any mapping updates.
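As a rough sketch (not tested against your mapping), assuming parts is mapped as keyword so that doc values are available (if it is a text field with a keyword sub-field, use doc['parts.keyword'] instead), and keeping in mind that doc values deduplicate values, something like this should work on 6.2:
GET products/_search
{
  "query": { "match_all": {} },
  "script_fields": {
    "parts_count": {
      "script": {
        "lang": "painless",
        "source": "doc['parts'].size()"
      }
    }
  },
  "sort": {
    "_script": {
      "type": "number",
      "script": {
        "lang": "painless",
        "source": "doc['parts'].size()"
      },
      "order": "desc"
    }
  }
}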

Overwrite/Update Existing Elasticsearch Index Mapping (geo_point) using Kibana

I am trying to update the mapping for a geo_point field in my elasticsearch index but am running into issues. I am using the dev tool console in Kibana.
The data for the geo_point is in a double array format. I am using Spark with the elasticsearch-hadoop-5.3.1.jar, and the data is coming into Elasticsearch/Kibana but remains in a number format, while I need to convert it to a geo_point.
It seems that I am unable to update the index mapping once it is defined. I've tried using the method below:
PUT my_index
{
"mappings": {
"my_type": {
"properties": {
"my_location": {
"type": "geo_point"
}
}
}
}
}
but this results in an "index already exists" exception.
Thanks for any suggestions.
The command you used just tries to create a new index with the mappings mentioned. For more information read the footnotes in the first example here.
As per Elasticsearch documentation, updating mappings of an existing field is not possible.
Updating Field Mappings
In general, the mapping for existing fields cannot be updated. There
are some exceptions to this rule. For instance:
new properties can be added to Object datatype fields.
new multi-fields can be added to existing fields.
the ignore_above parameter can be updated.
As geo_point doesn't fall into any of the cases mentioned above, you cannot modify the mapping of that field.
You might need to reindex the data.
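A rough sketch of that reindex, assuming my_index is your existing index and my_index_v2 is a hypothetical new index name: create the new index with the geo_point mapping first, then copy the data across with the Reindex API.
PUT my_index_v2
{
  "mappings": {
    "my_type": {
      "properties": {
        "my_location": {
          "type": "geo_point"
        }
      }
    }
  }
}

POST _reindex
{
  "source": { "index": "my_index" },
  "dest": { "index": "my_index_v2" }
}
Since geo_point accepts an array in [lon, lat] order, the existing double-array values should index as geo points in the new index.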

Kibana keeps some fields unindexed

So I have an index in elasticsearch, and I want to search and visualize the index with Kibana. But several fields are not indexed by Kibana, and have this bubble:
This field is not indexed thus unavailable for visualization and search.
This is a snippet of one of the fields that is not indexed by Kibana:
"_event_name" : {
"type" : "string"
},
I tried to enter Kibana's index settings and click "Reload field list", but it doesn't help.
Does anyone know what could be the problem?
Thanks in advance
The fields might not be indexed as mentioned here.
Apparently, Kibana doesn't index fields that start with underscore.
How are you loading the data into Elasticsearch? Logstash? A Beat? curl? Please describe that, and if you can include your config file, that would be good.
You can look at your mapping in your browser with something like this:
http://localhost:9200/logstash-2016.07.20/_mapping?pretty
(change the host and index name)

Will updating "_mappings" reflect any changes in indexed data in Elasticsearch

I didn't find any change in my search results even after updating some fields in my index [_mapping]. So I want to know: will updating "_mappings" re-index existing data in Elasticsearch, or will only data inserted after the update be affected by those index parameters [settings and mappings]?
EX:
Initially I created my index fields as follows:
"fname":{
"type":"string",
"boost":5
}
"lname":{
"type":"string",
"boost":1
}
Then I inserted some data; it's working fine.
After updating my index mapping as follows:
"fname":{
"type":"string",
"boost":1
}
"lname":{
"type":"string",
"boost":5
}
Even after updating the boost values in the index, I'm still getting the same results. Why?
1: After each and every update of the index [settings and mappings], will Elasticsearch re-index the data again?
2: Do we end up with differently indexed data in the same item type?
Please clarify this.
While you can add fields to the mappings of an index, any other change to already existing fields will either only operate on new documents or fail.
As mentioned in the comments to the question, there is an interesting article about zero-downtime index switching and there is a whole section about index management in the definitive guide.
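A minimal sketch of that zero-downtime switch, assuming hypothetical names my_index_v1, my_index_v2, and my_type, an alias my_index that your application queries, and a version with the Reindex API: create a new index with the corrected boosts, reindex into it, then flip the alias atomically.
PUT my_index_v2
{
  "mappings": {
    "my_type": {
      "properties": {
        "fname": { "type": "string", "boost": 1 },
        "lname": { "type": "string", "boost": 5 }
      }
    }
  }
}

POST _reindex
{
  "source": { "index": "my_index_v1" },
  "dest": { "index": "my_index_v2" }
}

POST _aliases
{
  "actions": [
    { "remove": { "index": "my_index_v1", "alias": "my_index" } },
    { "add": { "index": "my_index_v2", "alias": "my_index" } }
  ]
}
Because the alias switch is atomic, searches keep working throughout, and the documents indexed into the new index pick up the new boost values.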

Resources