Here is my mapping for the Product document type. I have "copy_to": "product_all" in the mapping.
I expect the value of label in brand to be copied into product_all on product. Is this the correct way to refer to a field in the outer object from an inner object, i.e. product_all in type product from brand? The mapping below doesn't work: a query on the product_all field for the value of label in brand returns no results. Am I missing something?
{
"product": {
"_timestamp": {
"enabled": true,
"store": true
},
"_all": {
"enabled": false
},
"dynamic": "strict",
"properties": {
"brand": {
"properties": {
"id": {
"type": "long"
},
"label": {
"type": "multi_field",
"fields": {
"label": {
"type": "string",
"index_analyzer": "productAnalyzer",
"search_analyzer": "productAnalyzer"
},
"raw": {
"type": "string",
"index": "not_analyzed"
}
},
"copy_to": "product_all"
}
}
},
"product_all": {
"type": "string",
"index_analyzer": "productAnalyzer",
"search_analyzer": "productAnalyzer"
}
}
}
}
I moved "copy_to":"product_all" inside label.label of mutifield. Now it works
"label": {
"type": "multi_field",
"fields": {
"label": {
"type": "string",
"copy_to": "product_all"
"index_analyzer": "productAnalyzer",
"search_analyzer": "productAnalyzer"
},
"raw": {
"type": "string",
"index": "not_analyzed"
}
}
}
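For reference, a quick sanity check against the copied field could look like this (the index name products and the search text are just placeholders, not taken from the original mapping):
POST /products/product/_search
{
  "query": {
    "match": {
      "product_all": "my brand label"
    }
  }
}
A document whose brand.label contains the searched term should now come back, since its value is copied into product_all at index time.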
I am using Elasticsearch 5.1.1.
I have a requirement wherein I want to index data in multiple languages.
I used the following mapping:
PUT http://localhost:9200/movies
{
"mappings": {
"title": {
"properties": {
"title": {
"type": "string",
"fields": {
"de": {
"type": "string",
"analyzer": "german"
},
"en": {
"type": "string",
"analyzer": "english"
},
"fr": {
"type": "string",
"analyzer": "french"
},
"es": {
"type": "string",
"analyzer": "spanish"
}
}
}
}
}
}
}
When I try to insert some data with:
POST http://localhost:9200/movies/movie/1
{
"title.en" :"abc123"
}
I am getting the following error:
{
"error": {
"root_cause": [
{
"type": "remote_transport_exception",
"reason": "[IQ7CUTp][127.0.0.1:9300][indices:data/write/index[p]]"
}
],
"type": "illegal_argument_exception",
"reason": "[title] is defined as an object in mapping [movie] but this name is already used for a field in other types"
},
"status": 400
}
Can someone point out what is wrong here?
The problem is that the title field is declared as a string, and you're trying to access the title.en sub-field as you would if title were an object field. You need to change your mapping like this instead, and then it will work:
{
"mappings": {
"title": {
"properties": {
"title": {
"type": "object", <--- change this
"properties": { <--- and this
"de": {
"type": "string",
"analyzer": "german"
},
"en": {
"type": "string",
"analyzer": "english"
},
"fr": {
"type": "string",
"analyzer": "french"
},
"es": {
"type": "string",
"analyzer": "spanish"
}
}
}
}
}
}
}
As I can see, you have defined title both as a type and as a property.
The error seems to point to exactly this issue.
From your POST call, I see that the type is movie.
Do you really want title as a type?
You should define the mapping for title inside the movie type.
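Putting both answers together, here is a sketch of what the mapping could look like with the title object defined inside the movie type; since this is 5.1.1, text is used instead of the deprecated string type (everything else mirrors the original mapping):
PUT http://localhost:9200/movies
{
  "mappings": {
    "movie": {
      "properties": {
        "title": {
          "type": "object",
          "properties": {
            "de": { "type": "text", "analyzer": "german" },
            "en": { "type": "text", "analyzer": "english" },
            "fr": { "type": "text", "analyzer": "french" },
            "es": { "type": "text", "analyzer": "spanish" }
          }
        }
      }
    }
  }
}
With that in place, the original POST to /movies/movie/1 with "title.en": "abc123" should index without the name clash, because in 5.x a dotted field name is expanded into the title object.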
I have the following legacy mapping code that works in ES 1.7 but fails in 5.2. The things that fail are that multi_field and path are no longer supported. The documentation mentions that these fields were removed but fails to provide the remedy beyond suggesting to use copy_to. Can someone give a bit more details on that?
{
"sample": {
"_parent": {
"type": "security"
},
"properties": {
"securityDocumentId": {
"type": "string",
"index": "not_analyzed",
"include_in_all": false
},
"id": {
"type": "multi_field",
"path": "full",
"fields": {
"indexer_sample_id": {
"type": "string"
},
"id": {
"type": "string",
"include_in_all": false
}
}
},
"sampleid": {
"type": "multi_field",
"path": "just_name",
"fields": {
"sampleid": {
"type": "string",
"analyzer": "my_analyzer"
},
"sample.sampleid": {
"type": "string",
"analyzer": "my_analyzer"
},
"sample.sampleid.sort": {
"type": "string",
"analyzer": "case_insensitive_sort_analyzer"
},
"sample.sampleid.name.autocomplete": {
"type": "string",
"analyzer": "autocomplete"
}
}
},
The path option's default value was full, so you can leave it out since it was deprecated in 2.0. The path value just_name doesn't exist anymore and you MUST reference all your fields by their full path name. The multi-fields can be rewritten very simply:
{
"sample": {
"_parent": {
"type": "security"
},
"properties": {
"securityDocumentId": {
"type": "keyword",
"include_in_all": false
},
"id": {
"type": "text",
"fields": {
"indexer_sample_id": {
"type": "text"
},
"id": {
"type": "text",
"include_in_all": false
}
}
},
"sampleid": {
"type": "text",
"fields": {
"sampleid": {
"type": "text",
"analyzer": "my_analyzer"
},
"sample.sampleid": {
"type": "text",
"analyzer": "my_analyzer"
},
"sample.sampleid.sort": {
"type": "text",
"analyzer": "case_insensitive_sort_analyzer"
},
"sample.sampleid.name.autocomplete": {
"type": "text",
"analyzer": "autocomplete"
}
}
},
Note that I'm not sure of the usefulness and added value of the id sub-fields.
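With this rewritten mapping, the sub-fields are addressed by their full path name in queries, for instance (the index name and search text here are made up):
GET /myindex/sample/_search
{
  "query": {
    "match": {
      "id.indexer_sample_id": "S-12345"
    }
  }
}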
I am trying to create an index with a custom default analyzer.
I already checked the following questions:
Analyzer not found exception while creating an index with mapping and settings
How to specify an analyzer while creating an index in ElasticSearch
mapper_parsing_exception for a custom analyzer while creating index in elasticsearch?
but they didn't solve the issue.
Here is my schema:
PUT /emails
{
"mappings": {
"email": {
"analyzer": "lkw",
"properties": {
"createdOn": {
"type": "date",
"store": true,
"format": "strict_date_optional_time||epoch_millis"
},
"data": {
"type": "object",
"dynamic": "true"
},
"from": {
"type": "string",
"store": true
},
"id": {
"type": "string",
"store": true
},
"sentOn": {
"type": "date",
"store": true,
"format": "strict_date_optional_time||epoch_millis"
},
"sesId": {
"type": "string",
"store": true
},
"subject": {
"type": "string",
"store": true,
"analyzer": "standard"
},
"templates": {
"properties": {
"html": {
"type": "string",
"store": true
},
"plainText": {
"type": "string",
"store": true
}
}
},
"to": {
"type": "string",
"store": true
},
"type": {
"type": "string",
"store": true
}
}
},
"event": {
"_parent": {
"type": "email"
},
"analyzer": "lkw",
"properties": {
"id": {
"type": "string",
"store": true
},
"origin": {
"type": "string",
"store": true
},
"time": {
"type": "date",
"store": true,
"format": "strict_date_optional_time||epoch_millis"
},
"type": {
"type": "string",
"store": true
},
"userAgent": {
"type": "string",
"store": true
}
}
}
},
"settings": {
"analysis": {
"analyzer": {
"lkw": {
"tokenizer": "keyword",
"filter": [
"lowercase"
],
"type": "custom"
}
}
}
}
}
When I execute the command above, I get this error:
{
"error": {
"root_cause": [
{
"type": "mapper_parsing_exception",
"reason": "Root mapping definition has unsupported parameters: [analyzer : lkw]"
}
],
"type": "mapper_parsing_exception",
"reason": "Failed to parse mapping [event]: Root mapping definition has unsupported parameters: [analyzer : lkw]",
"caused_by": {
"type": "mapper_parsing_exception",
"reason": "Root mapping definition has unsupported parameters: [analyzer : lkw]"
}
},
"status": 400
}
Since you have only a few string fields, I suggest you simply specify your lkw analyzer where you need it, just like you did for the standard one:
PUT /emails
{
"mappings": {
"email": {
"properties": {
"createdOn": {
"type": "date",
"store": true,
"format": "strict_date_optional_time||epoch_millis"
},
"data": {
"type": "object",
"dynamic": "true"
},
"from": {
"type": "string",
"store": true,
"analyzer": "lkw"
},
"id": {
"type": "string",
"store": true,
"analyzer": "lkw"
},
"sentOn": {
"type": "date",
"store": true,
"format": "strict_date_optional_time||epoch_millis"
},
"sesId": {
"type": "string",
"store": true,
"analyzer": "lkw"
},
"subject": {
"type": "string",
"store": true,
"analyzer": "standard"
},
"templates": {
"properties": {
"html": {
"type": "string",
"store": true,
"analyzer": "lkw"
},
"plainText": {
"type": "string",
"store": true,
"analyzer": "lkw"
}
}
},
"to": {
"type": "string",
"store": true,
"analyzer": "lkw"
},
"type": {
"type": "string",
"store": true,
"analyzer": "lkw"
}
}
},
"event": {
"_parent": {
"type": "email"
},
"properties": {
"id": {
"type": "string",
"store": true,
"analyzer": "lkw"
},
"origin": {
"type": "string",
"store": true,
"analyzer": "lkw"
},
"time": {
"type": "date",
"store": true,
"format": "strict_date_optional_time||epoch_millis"
},
"type": {
"type": "string",
"store": true,
"analyzer": "lkw"
},
"userAgent": {
"type": "string",
"store": true,
"analyzer": "lkw"
}
}
}
},
"settings": {
"analysis": {
"analyzer": {
"lkw": {
"tokenizer": "keyword",
"filter": [
"lowercase"
],
"type": "custom"
}
}
}
}
}
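Once the index is created, you can sanity-check what the lkw analyzer does (keyword tokenizer plus lowercase, i.e. the whole input kept as a single lowercased token) with the _analyze API. The sample text is made up, and the exact request syntax varies a bit between versions; this is the request-body form used by recent releases:
GET /emails/_analyze
{
  "analyzer": "lkw",
  "text": "John.Doe@Example.COM"
}
The response should contain a single token, john.doe@example.com.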
Using Sense, I'm trying to create a mapping for an index with three properties. When I try to create it, I get the following response:
{
"error": "MapperParsingException[Root type mapping not empty after parsing! Remaining fields: [mappings : {gram={properties={gram={type=string, fields={gram_bm25={type=string, similarity=BM25}, gram_lmd={type=string}}}, sentiment={type=string, index=not_analyzed}, word={type=string, index=not_analyzed}}}}]]",
"status": 400
}
This is what I have in the Sense console:
PUT /pos/_mapping/gram
{
"mappings": {
"gram": {
"properties": {
"gram": {
"type": "string",
"fields": {
"gram_bm25": {
"type": "string", "similarity": "BM25"
},
"gram_lmd": {
"type": "string"
}
}
},
"sentiment": {
"type": "string", "index": "not_analyzed"
},
"word": {
"type": "string", "index": "not_analyzed"
}
}
}
}
}
The name of the index is 'pos' and I call the type 'gram'.
I have created the index with the same name.
I have validated the JSON using http://jsonlint.com/
I tried using XPUT in the console and I got the 'acknowledged' response, but the mapping is still {} when I request it in Sense.
This question does not solve my problem. I always use the same name everywhere.
Any suggestions?
Thanks!
You just have the API syntax wrong. You've combined two different methods, basically.
Either create your index, then apply a mapping:
DELETE /pos
PUT /pos
PUT /pos/gram/_mapping
{
"gram": {
"properties": {
"gram": {
"type": "string",
"fields": {
"gram_bm25": {
"type": "string",
"similarity": "BM25"
},
"gram_lmd": {
"type": "string"
}
}
},
"sentiment": {
"type": "string",
"index": "not_analyzed"
},
"word": {
"type": "string",
"index": "not_analyzed"
}
}
}
}
Or do it all at once when you create the index:
DELETE /pos
PUT /pos
{
"mappings": {
"gram": {
"properties": {
"gram": {
"type": "string",
"fields": {
"gram_bm25": {
"type": "string",
"similarity": "BM25"
},
"gram_lmd": {
"type": "string"
}
}
},
"sentiment": {
"type": "string",
"index": "not_analyzed"
},
"word": {
"type": "string",
"index": "not_analyzed"
}
}
}
}
}
Here's the code I used:
http://sense.qbox.io/gist/6d645cc069f5f0fcf14f497809f7f79aff7de161
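Either way, you can verify that the mapping was actually stored (instead of the empty {} you were seeing) with:
GET /pos/_mapping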
I tried to use the following mapping to index my data:
{
"mappings": {
"chow-demo": {
"properties": {
"#fields": {
"dynamic": "true",
"properties": {
"asgid": {
"type": "string",
"analyzer": "keyword"
},
"asid": {
"type": "long"
},
"astid": {
"type": "long"
},
"clfg": {
"analyzer": "keyword",
"type": "string"
},
"httpcode": {
"type": "long"
},
"oid": {
"type": "string"
},
"onid": {
"type": "long"
},
"ptrnr": {
"analyzer": "keyword",
"type": "string"
},
"pguid": {
"analyzer": "keyword",
"type": "string"
},
"ptid": {
"type": "long"
},
"sid": {
"type": "long"
},
"src_url": {
"analyzer": "keyword",
"type": "string"
},
"title": {
"analyzer": "keyword",
"type": "string"
},
"ts": {
"type": "long"
}
}
},
"#timestamp": {
"format": "dateOptionalTime",
"type": "date"
},
"#message": {
"type": "string"
},
"#source": {
"type": "string"
},
"#type": {
"analyzer": "keyword",
"type": "string"
},
"#tags": {
"type": "string"
},
"#source_host": {
"type": "string"
},
"#source_path": {
"type": "string"
}
}
},
"chow-clfg": {
"_parent": {
"type": "chow-demo"
},
"dynamic": "true",
"properties": {
"_ttl": {
"enabled": true,
"default": "1h"
},
"clfg": {
"analyzer": "keyword",
"type": "string"
},
"#timestamp": {
"format": "dateOptionalTime",
"type": "date"
},
"count": {
"type": "long"
}
}
}
}
}
I tried to populate the parent type "chow-demo" without populating the child type "chow-clfg", and the documents refused to index (no documents were indexed into Elasticsearch).
When I take out the child mapping for "chow-clfg", indexing works properly as usual. Hence I have the following questions:
Is my mapping structure wrong?
Must the parent and child be indexed together at the same time before the data can be successfully indexed?
I really need help with this question for my project to progress. Thanks!
Yes, your mapping is wrong. The _ttl element should be one level higher in the chow-clfg type; in other words, _ttl should be on the same level as _parent. However, I am not quite sure how this problem would affect your ability to index.
Parents and children don't have to be indexed together.
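To make the first point concrete, here is a sketch of the corrected chow-clfg type with _ttl moved up next to _parent (the properties are unchanged from the question):
"chow-clfg": {
  "_parent": {
    "type": "chow-demo"
  },
  "_ttl": {
    "enabled": true,
    "default": "1h"
  },
  "dynamic": "true",
  "properties": {
    "clfg": {
      "analyzer": "keyword",
      "type": "string"
    },
    "#timestamp": {
      "format": "dateOptionalTime",
      "type": "date"
    },
    "count": {
      "type": "long"
    }
  }
}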