Can't update mapping in elasticsearch - elasticsearch

I am putting an analyzer into the mapping using PUT /job/_mapping/doc/ but get conflicts.
But there isn't an analyzer in the existing mapping.
PUT /job/_mapping/doc/
{
"properties":{
"title": {
"type": "text",
"analyzer":"ik_smart",
"search_analyzer":"ik_smart"
}
}
}
{
"error": {
"root_cause": [
{
"type": "illegal_argument_exception",
"reason": "Mapper for [title] conflicts with existing mapping in other types:\n[mapper [title] has different [analyzer]]"
}
],
"type": "illegal_argument_exception",
"reason": "Mapper for [title] conflicts with existing mapping in other types:\n[mapper [title] has different [analyzer]]"
},
"status": 400
}
"title": {
"type": "text",
"fields": {
"keyword": {
"type": "keyword",
"ignore_above": 256
}
},
"fielddata": true
},
The (Logstash) output config is like this:
output {
elasticsearch {
hosts => ["<Elasticsearch Hosts>"]
user => "<user>"
password => "<password>"
index => "<table>"
document_id => "%{<MySQL_PRIMARY_KEY>}"
}
}

You can't update a mapping in Elasticsearch: you can add new fields to a mapping, but you can't change the mapping of an existing field. Elasticsearch uses the mapping at indexing time, which is why you can't update the mapping of an existing field. The analyzer is part of the mapping (in fact, if you don't specify one, Elasticsearch applies a default one); the analyzer tells Elasticsearch how to index the documents. So you need to:
create a new index with your new mappings (including the analyzer)
reindex your documents from your existing index to the new one (https://www.elastic.co/guide/en/elasticsearch/reference/current/docs-reindex.html)
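A minimal sketch of those two steps, assuming the same doc type and ik analyzers as above and a placeholder name job_v2 for the new index:

PUT /job_v2
{
  "mappings": {
    "doc": {
      "properties": {
        "title": {
          "type": "text",
          "analyzer": "ik_smart",
          "search_analyzer": "ik_smart"
        }
      }
    }
  }
}

POST /_reindex
{
  "source": { "index": "job" },
  "dest": { "index": "job_v2" }
}

Once the reindex has completed, point your application (or an index alias) at the new index.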

Updating Mapping:
Once a document is indexed into an index, i.e. the mapping is generated under a given type (as in our case, the mapping of EmployeeCode, EmployeeName & isDeveloper is generated under the type "customtype"), we cannot modify it afterwards. If we want to modify it, we need to delete the index first, apply the modified mapping manually, and then re-index the data. But if you want to add a new property under a given type, that is feasible. For example, the document attached to our index "inkashyap-1002" under type "customtype" is as follows:
{
"inkashyap-1002": {
"mappings": {
"customtype": {
"properties": {
"EmployeeCode": {
"type": "long"
},
"isDeveloper": {
"type": "boolean"
},
"EmployeeName": {
"type": "text",
"fields": {
"keyword": {
"type": "keyword",
"ignore_above": 256
}
}
}
}
}
}
}
}
Now let's add another property, "Grade", by indexing a new document into the index "inkashyap-1002" under the type "customtype":
curl -XPUT localhost:9200/inkashyap-1002/customtype/2 -d '{
"EmployeeName": "Vaibhav Kashyap",
"EmployeeCode": 13629,
"isDeveloper": true,
"Grade": 5
}'
Now hit the GET mapping API. In the results, you can see there is another field added called "Grade".
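For reference, the generated mapping can be checked at any time with the get mapping API:

curl -XGET 'localhost:9200/inkashyap-1002/_mapping?pretty'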
Common Error:
In the index "inkashyap-1002", so far we have indexed 2 documents. Both documents had the same type for the field "EmployeeCode", and the type was "long". Now let us try to index a document like the one below:
curl -XPUT localhost:9200/inkashyap-1002/customtype/3 -d '{
"EmployeeName": "Vaibhav Kashyap",
"EmployeeCode": "onethreesixtwonine",
"isDeveloper": true,
"Grade": 5
}'
Note that here "EmployeeCode" is given as a string value. The response to the above request will be like the one below:
{
"error": {
"root_cause": [
{
"type": "mapper_parsing_exception",
"reason": "failedtoparse[
EmployeeCode
]"
}
],
"type": "mapper_parsing_exception",
"reason": "failedtoparse[
EmployeeCode
]",
"caused_by": {
"type": "number_format_exception",
"reason": "Forinputstring: \"onethreesixtwonine\""
}
},
"status": 400
}
In the above response, we can see the error "mapper_parsing_exception" on the field "EmployeeCode". This indicates that the field was expected to be of another type, not a string. In such cases, re-index the document with the appropriate type.
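For example, re-indexing document 3 with a numeric EmployeeCode (matching the existing long mapping) goes through without errors:

curl -XPUT localhost:9200/inkashyap-1002/customtype/3 -d '{
  "EmployeeName": "Vaibhav Kashyap",
  "EmployeeCode": 13629,
  "isDeveloper": true,
  "Grade": 5
}'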

Related

How can I fix this error: Mapper for [description] conflicts with existing mapper: Cannot update parameter [analyzer] from [my_analyzer] to [default]

I am new to Elasticsearch, please help me out with this problem.
I am using Elasticsearch version 7.13.2.
I created an index with a custom analyzer and filter like this:
PUT /analyzers_test
{
"settings": {
"analysis": {
"analyzer": {
"english_stop":{
"type":"standard",
"stopwords":"_english_"
},
"my_analyzer":{
"type":"custom",
"tokenizer":"standard",
"char_filter":["html_strip"
],
"filter":[
"lowercase",
"trim",
"my_stemmer"]
}
},
"filter": {
"my_stemmer":{
"type":"stemmer",
"name":"english"
}
}
}
}
}
Then I created a mapping for the document I will have and specified the custom analyzer I created earlier (there is no document in the index yet),
like so:
PUT /analyzers_test/_mapping
{
"properties": {
"description":{
"type": "text",
"analyzer": "my_analyzer"
},
"teaser":{
"type": "text"
}
}
}
When I try to create a document like so
POST /analyzers_test/1
{
"description":"drinking",
"teaser":"drinking"
}
I get the following error:
{
"error" : {
"root_cause" : [
{
"type" : "illegal_argument_exception",
"reason" : "Mapper for [description] conflicts with existing mapper:\n\tCannot update parameter [analyzer] from [my_analyzer] to [default]"
}
],
"type" : "illegal_argument_exception",
"reason" : "Mapper for [description] conflicts with existing mapper:\n\tCannot update parameter [analyzer] from [my_analyzer] to [default]"
},
"status" : 400
}
Use the index API to add the document to your index. You are missing the _doc in the request path.
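With the _doc part added to the path, the request from the question becomes:

POST /analyzers_test/_doc/1
{
  "description": "drinking",
  "teaser": "drinking"
}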
"You cannot change the mapping (including the analyzer) of an existing field. What you need to do if you want to change the mapping of existing documents is reindex those documents to another index with the updated mapping." Abdon Pijpelink

Getting error action_request_validation_exception while mapping a new field in an already existing Elasticsearch index

I am trying to add a new field to my already existing Elasticsearch index, but I'm getting the below exception:
{
"type": "action_request_validation_exception",
"reason": "Validation Failed: 1: mapping type is missing;"
}
I'm using the below API
PUT order/_mapping
{
"properties": {
"title": { "type": "text"}
}
}
You need to add the mapping type to the PUT request and modify the request as:
PUT order/{{mapping-type}}/_mapping
{
"properties": {
"title": { "type": "text"}
}
}
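For example, if the index was created with a mapping type named my_type (a hypothetical name, substitute your own), the request would be:

PUT order/my_type/_mapping
{
  "properties": {
    "title": { "type": "text" }
  }
}

Note that on Elasticsearch 7.x a typed path like this additionally requires the include_type_name=true query parameter.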

Cannot create elasticsearch mapping or get date field to work

{
"error" : {
"root_cause" : [
{
"type" : "illegal_argument_exception",
"reason" : "Types cannot be provided in put mapping requests, unless the include_type_name parameter is set to true."
}
]
},
"status" : 400
}
I've read that the above is because types are deprecated in Elasticsearch 7.7. Is that also valid for data types? I mean, how am I supposed to say that I want the data to be considered a date?
My current mapping has this element:
"Time": {
"type": "text",
"fields": {
"keyword": {
"type": "keyword",
"ignore_above": 256
}
}
},
I just wanted to create it again with type: "date", but I noticed that even copy-pasting the current mapping (which works) yields an error... The index and mappings I have are generated automatically by https://github.com/jayzeng/scrapy-elasticsearch
My goal is simply to have a date field. I have all my dates in my index, but when I want to filter in Kibana I can see it is not considered a date field. And modifying the mapping doesn't seem like an option.
Obvious ELK noob here, please bear with me (:
The error is quite ironic because I pasted the mapping from an existing index/mapping...
You're experiencing this because of the breaking change in 7.0 regarding named doc types.
Long story short, instead of
PUT example_index
{
"mappings": {
"_doc_type": {
"properties": {
"Time": {
"type": "date",
"fields": {
"keyword": {
"type": "keyword"
}
}
}
}
}
}
}
you should do
PUT example_index
{
"mappings": { <-- Note no top-level doc type definition
"properties": {
"Time": {
"type": "date",
"fields": {
"keyword": {
"type": "keyword"
}
}
}
}
}
}
I'm not familiar with your scrapy package but at a glance it looks like it's just a wrapper around the native py ES client which should forward your mapping definitions to ES.
Caveat: if you have your fields defined as text and you intend to change them to date (or add a field of type date), you'll get the following error:
mapper [Time] of different type, current_type [text], merged_type [date]
You basically have two options:
drop the index, set the new mapping & reindex everything (easiest but introduces downtime)
extend the mapping with a new field, let's call it Time_as_datetime and update your existing docs using a script (introduces a brand new field/property -- that's somewhat verbose)
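A minimal sketch of the second option, assuming the existing Time strings are in a format the date type can parse (the new field name and the script are illustrative):

PUT example_index/_mapping
{
  "properties": {
    "Time_as_datetime": { "type": "date" }
  }
}

POST example_index/_update_by_query
{
  "script": {
    "lang": "painless",
    "source": "ctx._source.Time_as_datetime = ctx._source.Time"
  }
}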

How to create a mutlitype index in Elasticsearch?

Several pages in the Elasticsearch documentation mention how to query a multi-type index.
But I failed to create one in the first place.
Here is my minimal example (on an Elasticsearch 6.x server):
PUT /myindex
{
"settings" : {
"number_of_shards" : 1
}
}
PUT /myindex/people/123
{
"first name": "John",
"last name": "Doe"
}
PUT /myindex/dog/456
{
"name": "Rex"
}
Index creation and the first insert went well, but the dog type insert attempt failed:
{
"error": {
"root_cause": [
{
"type": "illegal_argument_exception",
"reason": "Rejecting mapping update to [myindex] as the final mapping would have more than 1 type: [people, dog]"
}
],
"type": "illegal_argument_exception",
"reason": "Rejecting mapping update to [myindex] as the final mapping would have more than 1 type: [people, dog]"
},
"status": 400
}
But this is exactly what I'm trying to do, buddy! Having "more than 1 type" in my index.
Do you know what I have to change in my calls to achieve this?
Many thanks.
Multiple mapping types are not supported from Elastic 6.0.0 onwards. See breaking changes for details.
You can still effectively use multiple types by implementing your own custom type field.
For example:
{
"mappings": {
"doc": {
"properties": {
"type": {
"type": "keyword"
},
"first_name": {
"type": "text"
},
"last_name": {
"type": "text"
}
}
}
}
}
This is described in removal of types.
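For example, documents can then be indexed with their logical type in that field and filtered on it at query time (index and field names here follow the question and the mapping above):

PUT myindex/doc/1
{
  "type": "people",
  "first_name": "John",
  "last_name": "Doe"
}

GET myindex/_search
{
  "query": {
    "bool": {
      "filter": {
        "term": { "type": "people" }
      }
    }
  }
}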

Why can't I mix my data types in Elasticsearch?

Currently, I'm trying to implement a variation of the car example here:
https://www.elastic.co/blog/managing-relations-inside-elasticsearch
If I run:
PUT /vehicle/_doc/1
{
"metadata":[
{
"key":"make",
"value":"Saturn"
},
{
"key":"year",
"value":"2015"
}
]
}
the code works correctly.
But if I delete the index and change 2015 from a string to a number:
DELETE /vehicle
PUT /vehicle/_doc/1
{
"metadata":[
{
"key":"make",
"value":"Saturn"
},
{
"key":"year",
"value":2015
}
]
}
I get the following error message:
{ "error": {
"root_cause": [
{
"type": "illegal_argument_exception",
"reason": "mapper [metadata.value] of different type, current_type [long], merged_type [text]"
}
],
"type": "illegal_argument_exception",
"reason": "mapper [metadata.value] of different type, current_type [long], merged_type [text]" }, "status": 400 }
How do I fix this error?
PUT /vehicle/_doc/1
{
"metadata":[
{
"key":"make",
"value":"Saturn"
},
{
"key":"year",
"value":2015
}
]
}
After deleting the index and then trying to index a new document as above, the following steps take place:
When Elasticsearch cannot find an index named vehicle and automatic index creation is enabled (which it is by default), it creates a new index named vehicle.
Based on the input document, Elasticsearch then tries to make a best guess at the data type of each field in the document. This is known as dynamic field mapping.
For the above document, since metadata is an array of objects, the field metadata is assumed to be of the object data type.
Now comes the step of deciding the data types of the fields of the individual objects. When it encounters the first object it finds two fields, key and value. Both fields have string values ("make" and "Saturn" respectively), and hence Elasticsearch identifies the data type of both fields as text.
Elasticsearch then tries to define the mapping as below:
{
"properties": {
"metadata": {
"properties": {
"key": {
"type": "text",
"fields": {
"keyword": {
"type": "keyword",
"ignore_above": 256
}
}
},
"value": {
"type": "text",
"fields": {
"keyword": {
"type": "keyword",
"ignore_above": 256
}
}
}
}
}
}
When it encounters the second object, the value of the value field is numeric (2015), for which it guesses the data type long. This conflicts with the previously identified data type, which was text. A field cannot have mixed data types; data types are strict, and hence the error.
To resolve the error you need to make sure the input values for the fields are of the same type in each of the objects, as below:
PUT /vehicle/_doc/1
{
"metadata":[
{
"key":"make",
"value":2016
},
{
"key":"year",
"value":2015
}
]
}
For the above use case, it could be modeled better as:
PUT /vehicle/_doc/1
{
"metadata":[
{
"make":"Saturn",
"year": 2015
}
]
}
