I have an existing mapping for a field, and I want to change it to a multi-field.
The existing mapping is
{
"my_index": {
"mappings": {
"my_type": {
"properties": {
"author": {
"type": "string"
},
"isbn": {
"type": "string",
"analyzer": "standard",
"fields": {
"ngram": {
"type": "string",
"search_analyzer": "keyword"
}
}
},
"title": {
"type": "string",
"analyzer": "english",
"fields": {
"std": {
"type": "string",
"analyzer": "standard"
}
}
}
}
}
}
}
}
Based on the documentation, I should be able to change "author" to a multi-field by executing the following
PUT /my_index
{
"mappings": {
"my_type": {
"properties": {
"author":
{
"type": "multi-field",
"fields": {
"ngram": {
"type": "string",
"indexanalyzer": "ngram_analyzer",
"search_analyzer": "keyword"
},
"name" : {
"type": "string"
}
}
}
}
}
}
}
But instead I get the following error:
{
"error": "IndexAlreadyExistsException[[my_index] already exists]",
"status": 400
}
Am I missing something really obvious?
Instead of PUT to /my_index do:
POST /my_index/_mapping
You won't be able to change the field type in an already existing index.
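Adding new sub-fields to an existing string field is allowed, though. A minimal sketch of what that could look like for the author field (assuming Elasticsearch 1.x syntax, and that an ngram_analyzer is already defined in the index settings; the analyzer names are only illustrative):
POST /my_index/_mapping/my_type
{
  "properties": {
    "author": {
      "type": "string",
      "fields": {
        "name": {
          "type": "string"
        },
        "ngram": {
          "type": "string",
          "index_analyzer": "ngram_analyzer",
          "search_analyzer": "keyword"
        }
      }
    }
  }
}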
If you can't recreate your index, you can make use of the copy_to field to achieve a similar capability:
PUT /my_index
{
"mappings": {
"my_type": {
"properties": {
"author":
{
"type": "string",
"copy_to": ["author-name","author-ngram"]
}
"author-ngram": {
"type": "string",
"indexanalyzer": "ngram_analyzer",
"search_analyzer": "keyword"
},
"author-name" : {
"type": "string"
}
}
}
}
}
}
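After re-indexing your documents, the copied fields can then be queried directly; a rough usage sketch with the field names above (the search term is made up):
GET /my_index/my_type/_search
{
  "query": {
    "match": {
      "author-ngram": "tolkien"
    }
  }
}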
While I have not tried it in your particular example, it is indeed possible to update field mappings by first closing the index and then applying the mappings.
Example:
POST /my_index/_close
POST /my_index/_mapping
{
"my_field:{"new_mapping"}
}
POST /my_index/_open
I have tested it by adding a "copy_to" mapping property to an already mapped field.
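For illustration, a rough sketch of that kind of test against the index from the question (the author_name target is just an illustrative field name):
POST /my_index/_close

POST /my_index/_mapping/my_type
{
  "properties": {
    "author": {
      "type": "string",
      "copy_to": "author_name"
    }
  }
}

POST /my_index/_open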
Based on https://gist.github.com/nicolashery/6317643.
I'm wondering if there is a way to set all nested fields to a specific type.
The main reason I'm looking for something like this is that the attributes inside category are not known in advance and can vary.
That's the mapping I have, and every single property must have its type explicitly set:
PUT data
{
"mappings": {
"properties": {
"category": {
"type": "nested",
"properties": {
"mode": {
"type": "text",
"analyzer": "keyword"
},
"sequence": {
"type": "text",
"analyzer": "keyword"
}
}
}
}
}
}
I'm looking for something like this pseudo mapping below:
PUT data
{
"mappings": {
"properties": {
"category": {
"type": "nested",
"properties.*": {
"type": "text",
"analyzer": "keyword"
}
}
}
}
}
Maybe this is not the way to go; if you have any other solution for handling these dynamic attributes, it would be really appreciated.
You can achieve this with dynamic templates:
PUT data
{
"mappings": {
"dynamic_templates": [
{
"category_fields": {
"path_match": "category.*",
"mapping": {
"type": "text",
"analyzer": "keyword"
}
}
}
],
"properties": {
"category": {
"type": "nested"
}
}
}
}
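As a quick sanity check (a sketch with made-up attribute names and values), any new attribute that shows up under category is then mapped as keyword-analyzed text, which you can confirm with GET data/_mapping after indexing something like:
POST data/_doc/1
{
  "category": {
    "mode": "auto",
    "sequence": "42",
    "brand_new_attribute": "whatever"
  }
}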
I would like to add the uax_url_email analyzer so I can search the email field in an Elasticsearch document. However, I get an error.
Here is the index definition I am working with:
{
"user": {
"aliases": {},
"mappings": {
"user": {
"properties": {
"created": {
"type": "date"
},
"email": {
"type": "text",
"fields": {
"keyword": {
"type": "keyword",
"ignore_above": 256
}
}
}
}
}
},
"settings": {
"index": {
"number_of_shards": "5",
"provided_name": "user",
"creation_date": "1521016015646",
"analysis": {
"analyzer": {
"my_analyzer": {
"tokenizer": "my_tokenizer"
}
},
"tokenizer": {
"my_tokenizer": {
"type": "uax_url_email",
"max_token_length": "255"
}
}
},
"number_of_replicas": "1",
"uuid": "RS96V9gFQbG5UmoQ2R_gLA",
"version": {
"created": "6010099"
}
}
}
}
}
PUT /user/_mapping/email
{
"mappings": {
"_doc": {
"properties": {
"email": {
"type": "text",
"fields": {
"my_analyzer": {
"email": "text",
"analyzer": "my_analyzer"
}
}
}
}
}
}
}
I got an error stating "root_cause":
{
"error": {
"root_cause": [
{
"type": "mapper_parsing_exception",
"reason": "Root mapping definition has unsupported parameters: [mappings : {_doc={properties={email={type=text, fields={my_analyzer={email=text, analyzer=my_analyzer}}}}}}]"
}
],
"type": "mapper_parsing_exception",
"reason": "Root mapping definition has unsupported parameters: [mappings : {_doc={properties={email={type=text, fields={my_analyzer={email=text, analyzer=my_analyzer}}}}}}]"
},
"status": 400
}
Nothing is found. I want my analyzer and tokenizer to work on the email field; any help will be highly appreciated.
This should work:
PUT /user/_mapping/user
{
"properties": {
"email": {
"type": "text",
"fields": {
"my_analyzer": {
"type": "text",
"analyzer": "my_analyzer"
}
}
}
}
}
Your mistake was that you thought that the index type was _doc, but looking at your mapping, the index type is user. Basically, you have a user index with a user type.
The format of the command is PUT /[INDEX_NAME]/_mapping/[TYPE_NAME] { "properties": { "[FIELD_TO_BE_UPDATED]": {...} } }.
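Once the mapping update is applied (and existing documents are re-indexed so the new sub-field gets populated), the sub-field can be targeted explicitly; a minimal sketch with a made-up address:
GET /user/_search
{
  "query": {
    "match": {
      "email.my_analyzer": "john.doe@example.com"
    }
  }
}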
I am using Elasticsearch 5.1.1.
I have a requirement where in I want to index data in multiple languages.
I used following mapping:
PUT http://localhost:9200/movies
{
"mappings": {
"title": {
"properties": {
"title": {
"type": "string",
"fields": {
"de": {
"type": "string",
"analyzer": "german"
},
"en": {
"type": "string",
"analyzer": "english"
},
"fr": {
"type": "string",
"analyzer": "french"
},
"es": {
"type": "string",
"analyzer": "spanish"
}
}
}
}
}
}
}
When I try to insert some data like this:
POST http://localhost:9200/movies/movie/1
{
"title.en" :"abc123"
}
I am getting following error:
{
"error": {
"root_cause": [
{
"type": "remote_transport_exception",
"reason": "[IQ7CUTp][127.0.0.1:9300][indices:data/write/index[p]]"
}
],
"type": "illegal_argument_exception",
"reason": "[title] is defined as an object in mapping [movie] but this name is already used for a field in other types"
},
"status": 400
}
Can someone point out what is wrong here?
The problem is that the title field is declared as a string, yet you're trying to access the title.en sub-field as you would if title were an object field. You need to change your mapping like this instead, and then it will work:
{
"mappings": {
"title": {
"properties": {
"title": {
"type": "object", <--- change this
"properties": { <--- and this
"de": {
"type": "string",
"analyzer": "german"
},
"en": {
"type": "string",
"analyzer": "english"
},
"fr": {
"type": "string",
"analyzer": "french"
},
"es": {
"type": "string",
"analyzer": "spanish"
}
}
}
}
}
}
}
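With that mapping in place, the original indexing request should be accepted; in 5.x a dotted field name like title.en is expanded into the object structure, so it is equivalent to indexing:
POST http://localhost:9200/movies/movie/1
{
  "title": {
    "en": "abc123"
  }
}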
As far as I can see, you have defined title both as a type and as a property.
The error seems to point to this issue.
From your POST call, I see that the type is movie.
Do you really want the title as a type?
You should define the mapping for the title inside the movie type.
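For example, a sketch of what that could look like, keeping the multi-field approach but nesting it under the movie type (text is used here, the 5.x replacement for string; you would then index a single title value and query title.en, title.de, and so on):
PUT http://localhost:9200/movies
{
  "mappings": {
    "movie": {
      "properties": {
        "title": {
          "type": "text",
          "fields": {
            "de": { "type": "text", "analyzer": "german" },
            "en": { "type": "text", "analyzer": "english" },
            "fr": { "type": "text", "analyzer": "french" },
            "es": { "type": "text", "analyzer": "spanish" }
          }
        }
      }
    }
  }
}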
Using Sense, I'm trying to create a mapping for an index with three properties. When I try to create it, I get the following response:
{
"error": "MapperParsingException[Root type mapping not empty after parsing! Remaining fields: [mappings : {gram={properties={gram={type=string, fields={gram_bm25={type=string, similarity=BM25}, gram_lmd={type=string}}}, sentiment={type=string, index=not_analyzed}, word={type=string, index=not_analyzed}}}}]]",
"status": 400
}
This is what I have in the Sense console:
PUT /pos/_mapping/gram
{
"mappings": {
"gram": {
"properties": {
"gram": {
"type": "string",
"fields": {
"gram_bm25": {
"type": "string", "similarity": "BM25"
},
"gram_lmd": {
"type": "string"
}
}
},
"sentiment": {
"type": "string", "index": "not_analyzed"
},
"word": {
"type": "string", "index": "not_analyzed"
}
}
}
}
}
The name of the index is 'pos' and I call the type 'gram'.
I have created the index with the same name.
I have validated the JSON using http://jsonlint.com/.
I tried using XPUT in the console and I got the 'acknowledged' response, but the mapping is still {} when I request it in Sense.
This question does not solve my problem; I always use the same name everywhere.
Any suggestions?
Thanks!
You just have the API syntax wrong. You've combined two different methods, basically.
Either create your index, then apply a mapping:
DELETE /pos
PUT /pos
PUT /pos/gram/_mapping
{
"gram": {
"properties": {
"gram": {
"type": "string",
"fields": {
"gram_bm25": {
"type": "string",
"similarity": "BM25"
},
"gram_lmd": {
"type": "string"
}
}
},
"sentiment": {
"type": "string",
"index": "not_analyzed"
},
"word": {
"type": "string",
"index": "not_analyzed"
}
}
}
}
Or do it all at once when you create the index:
DELETE /pos
PUT /pos
{
"mappings": {
"gram": {
"properties": {
"gram": {
"type": "string",
"fields": {
"gram_bm25": {
"type": "string",
"similarity": "BM25"
},
"gram_lmd": {
"type": "string"
}
}
},
"sentiment": {
"type": "string",
"index": "not_analyzed"
},
"word": {
"type": "string",
"index": "not_analyzed"
}
}
}
}
}
Here's the code I used:
http://sense.qbox.io/gist/6d645cc069f5f0fcf14f497809f7f79aff7de161
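In either case, you can verify that the mapping actually took effect (instead of the empty {} you were seeing) with:
GET /pos/_mapping/gram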
I have 2 separate indices, each containing a different type.
I want to get combined records from both.
The problem is that one type has the field 'email' while the other has 'work_email'; however, I want to treat them as the same thing for sorting purposes.
That is why I am trying to use index_name in one of the types.
Here are mappings:
Index1:
"mappings": {
"people": {
"properties": {
"work_email": {
"type": "string",
"index_name": "email",
"fields": {
"raw": {
"type": "string",
"index": "not_analyzed"
}
}
}
}
}
}
Index2:
"mappings": {
"companies": {
"properties": {
"email": {
"type": "string",
"fields": {
"raw": {
"type": "string",
"index": "not_analyzed"
}
}
}
}
}
}
I expect this to work:
GET /index1,index2/people,companies/_search?
{
"sort": [
{
"email.raw": {
"order": "asc"
}
}
]
}
But I get an error saying that there is no such field in the 'people' type.
Am I doing something wrong, or is there a better way to achieve what I need?
Here you can find a recreation script that illustrates the problem: https://gist.github.com/pmishev/11375297
There is a problem in the way you map the multi-field. Check out the mapping below and try to index again; you should get the results you expect.
"mappings": {
"people": {
"properties": {
"work_email":{
"type": "multi_field",
"fields": {
"work_email":{
"type": "string",
"index_name": "email"
},
"raw":{
"type": "string",
"index": "not_analyzed"
}
}
}
}
}
}
We should set the type to multi_field, and under fields we should specify the required sub-fields.
I ended up adding a 'copy_to' property to my mapping:
"mappings": {
"people": {
"properties": {
"work_email": {
"type": "string",
"copy_to": "email",
"fields": {
"raw": {
"type": "string",
"index": "not_analyzed"
}
}
}
}
}
}
So now I can address both fields as email.
It's not ideal, as this means that the email field is actually indexed twice, but that was the only thing that worked.
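If you also need the email.raw sub-field from the original sort to resolve in both indices, one option (a sketch, not part of the original setup) is to declare the copy target explicitly in the people index, so that both indices expose the same not_analyzed sub-field:
"mappings": {
  "people": {
    "properties": {
      "work_email": {
        "type": "string",
        "copy_to": "email"
      },
      "email": {
        "type": "string",
        "fields": {
          "raw": {
            "type": "string",
            "index": "not_analyzed"
          }
        }
      }
    }
  }
}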