dynamic mapping for nested type - elasticsearch

I have an index/type of test1/all which looks as follows:
{
  "test1": {
    "mappings": {
      "all": {
        "properties": {
          "colors": {
            "properties": {
              "H": {"type": "double"},
              "S": {"type": "long"},
              "V": {"type": "long"},
              "color_percent": {"type": "long"}
            }
          },
          "file_name": {
            "type": "string"
          },
          "id": {
            "type": "string"
          },
          "no_of_colors": {
            "type": "long"
          }
        }
      }
    }
  }
}
I would like to make the colors field nested, so I am trying the following:
PUT /test1/all/_mapping
{
  "mappings": {
    "all": {
      "properties": {
        "file_name": {
          "type": "string",
          "index": "not_analyzed"
        },
        "id": {
          "type": "string",
          "index": "not_analyzed"
        },
        "no_of_colors": {
          "type": "long",
          "index": "not_analyzed"
        },
        "colors": {
          "type": "nested",
          "properties": {
            "H": {"type": "double"},
            "S": {"type": "long"},
            "V": {"type": "long"},
            "color_percent": {"type": "integer"}
          }
        }
      }
    }
  }
}
But I get the following error:
{
  "error": "MapperParsingException[Root type mapping not empty after parsing! Remaining fields: [mappings : {all={properties={file_name={type=string, index=not_analyzed}, id={type=string, index=not_analyzed}, no_of_colors={type=integer, index=not_analyzed}, colors={type=nested, properties={H={type=double}, S={type=long}, V={type=long}, color_percent={type=integer}}}}}}]]",
  "status": 400
}
Any suggestions? Appreciate the help.

You're almost there; you simply need to remove the mappings section, like this:
PUT /test1/all/_mapping
{
  "properties": {
    "file_name": {
      "type": "string",
      "index": "not_analyzed"
    },
    "id": {
      "type": "string",
      "index": "not_analyzed"
    },
    "no_of_colors": {
      "type": "long",
      "index": "not_analyzed"
    },
    "colors": {
      "type": "nested",
      "properties": {
        "H": {
          "type": "double"
        },
        "S": {
          "type": "long"
        },
        "V": {
          "type": "long"
        },
        "color_percent": {
          "type": "integer"
        }
      }
    }
  }
}
However, note that this will not work either, because you cannot change the colors field from object to nested, nor the other string fields from analyzed to not_analyzed. You need to delete your index and re-create it from scratch.
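For reference, here is a minimal sketch of that delete-and-recreate sequence, assuming the same pre-5.x string/not_analyzed syntax used in the question (deleting the index also removes its documents, so they will need to be indexed again afterwards):
DELETE /test1

PUT /test1
{
  "mappings": {
    "all": {
      "properties": {
        "file_name": { "type": "string", "index": "not_analyzed" },
        "id": { "type": "string", "index": "not_analyzed" },
        "no_of_colors": { "type": "long" },
        "colors": {
          "type": "nested",
          "properties": {
            "H": { "type": "double" },
            "S": { "type": "long" },
            "V": { "type": "long" },
            "color_percent": { "type": "integer" }
          }
        }
      }
    }
  }
}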

Related

elasticsearch mapping exception when using dynamic templates

Hi, I am using Elasticsearch to index some documents, but the documents will have fields like goal1Completion, goal2Completion, ..., goal100Completion. So I was trying to do the mapping with dynamic templates, and I came up with the following, but it throws an error:
{
  "mappings": {
    "date": {
      "properties": {
        "sessions": {
          "type": "long"
        },
        "viewId": {
          "type": "string",
          "index": "not_analyzed"
        },
        "webPropertyId": {
          "type": "string",
          "index": "not_analyzed"
        },
        "dynamic_templates": [
          {
            "goalCompletions": {
              "match_pattern": "regex",
              "match": "goal\\d+\\w+",
              "mapping": {
                "type": "long"
              }
            }
          }
        ]
      }
    }
  }
}
The error is: "reason": "Expected map for property [fields] on field [dynamic_templates] but got a class java.lang.String"
What could be the problem here?
You need to pull dynamic_templates out of the properties map.
{
  "mappings": {
    "date": {
      "properties": {
        "sessions": {
          "type": "long"
        },
        "viewId": {
          "type": "string",
          "index": "not_analyzed"
        },
        "webPropertyId": {
          "type": "string",
          "index": "not_analyzed"
        }
      },
      "dynamic_templates": [    <--- Pull this out of properties
        {
          "goalCompletions": {
            "match_pattern": "regex",
            "match": "goal\\d+\\w+",
            "mapping": {
              "type": "long"
            }
          }
        }
      ]
    }
  }
}
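Assuming an index (say ga_data, a placeholder name) is created with the corrected body above, a quick way to verify the template is to index a document containing one of the goalNCompletion fields and read the mapping back:
PUT /ga_data/date/1
{
  "viewId": "12345",
  "sessions": 10,
  "goal7Completion": 3
}

GET /ga_data/_mapping/date
In the response, goal7Completion should show up mapped as long, picked up by the goalCompletions template.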

Recreation of mapping in Elasticsearch

I have created my index on Elasticsearch, and through Kibana as well, and have uploaded data. Now I want to change the mapping for the index and change some fields to not_analyzed. Below is the mapping I want to replace the existing one with, but when I run the command below it gives me this error:
{
  "error": {
    "root_cause": [
      {
        "type": "index_already_exists_exception",
        "reason": "already exists",
        "index": "rettrmt"
      }
    ],
    "type": "index_already_exists_exception",
    "reason": "already exists",
    "index": "rettrmt"
  },
  "status": 400
}
Kindly help me get this resolved.
curl -XPUT 'http://10.56.139.61:9200/rettrmt' -d '{
  "rettrmt": {
    "aliases": {},
    "mappings": {
      "RETTRMT": {
        "properties": {
          "#timestamp": {
            "type": "date",
            "format": "strict_date_optional_time||epoch_millis"
          },
          "#version": {
            "type": "string"
          },
          "acid": {
            "type": "string"
          },
          "actor_id": {
            "type": "string",
            "index": "not_analyzed"
          },
          "actor_type": {
            "type": "string",
            "index": "not_analyzed"
          },
          "channel_id": {
            "type": "string",
            "index": "not_analyzed"
          },
          "circle": {
            "type": "string",
            "index": "not_analyzed"
          },
          "cr_dr_indicator": {
            "type": "string",
            "index": "not_analyzed"
          },
          "host": {
            "type": "string"
          },
          "message": {
            "type": "string"
          },
          "orig_input_amt": {
            "type": "double"
          },
          "path": {
            "type": "string"
          },
          "r_cre_id": {
            "type": "string"
          },
          "sub_use_case": {
            "type": "string",
            "index": "not_analyzed"
          },
          "tran_amt": {
            "type": "double"
          },
          "tran_id": {
            "type": "string"
          },
          "tran_particulars": {
            "type": "string"
          },
          "tran_particulars_2": {
            "type": "string"
          },
          "tran_remarks": {
            "type": "string"
          },
          "tran_sub_type": {
            "type": "string"
          },
          "tran_timestamp": {
            "type": "date",
            "format": "strict_date_optional_time||epoch_millis"
          },
          "tran_type": {
            "type": "string"
          },
          "type": {
            "type": "string"
          },
          "use_case": {
            "type": "string",
            "index": "not_analyzed"
          }
        }
      }
    },
    "settings": {
      "index": {
        "creation_date": "1457331693603",
        "uuid": "2bR0yOQtSqqVUb8lVE2dUA",
        "number_of_replicas": "1",
        "number_of_shards": "5",
        "version": {
          "created": "2000099"
        }
      }
    },
    "warmers": {}
  }
}'
You first need to delete your index and then recreate it with the proper mapping. You're getting an index_already_exists_exception error because you're trying to create an index that already exists, hence the conflict.
Run this first:
curl -XDELETE 'http://10.56.139.61:9200/rettrmt'
And then you can run your command again. Note that this will erase your data, so you will have to repopulate your index.
Did you try something like this?
curl -XPUT 'http://10.56.139.61:9200/rettrmt/_mapping/RETTRMT' -d '
{
  "properties": {
    "actor_id": {    // or whichever properties you want to add
      "type": "string",
      "index": "not_analyzed"
    }
  }
}'
Works for me.
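In either case you can check what actually got applied by reading the mapping back:
curl -XGET 'http://10.56.139.61:9200/rettrmt/_mapping/RETTRMT?pretty'
Keep in mind that, as in the first answer above, a PUT to _mapping can only add new fields or sub-fields; changing settings of a field that already exists (for example flipping an analyzed string to not_analyzed) is rejected as a conflict, which is why the delete-and-recreate route is usually the one that sticks.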

Create multi-field in Elasticsearch

I want to create an index, and here is my mapping. I want to create a multi-field on the field 'findings': one with the default mapping (analyzed) and the other one, 'orig', as not_analyzed.
PUT nto
{
  "mappings": {
    "_default_": {
      "properties": {
        "date": {
          "type": "string",
          "index": "not_analyzed"
        },
        "bo": {
          "type": "string",
          "index": "not_analyzed"
        },
        "pg": {
          "type": "string"
        },
        "rate": {
          "type": "float"
        },
        "findings": {
          "type": "multi_field",
          "fields": {
            "findings": {
              "type": "string",
              "index": "analyzed"
            },
            "orig": {
              "type": "string",
              "index": "not_analyzed"
            }
          }
        }
      }
    }
  }
}
Once I create the mapping, I don't see the orig field being created. Here is the mapping that I see:
{
  "ccdn": {
    "aliases": {},
    "mappings": {
      "test": {
        "properties": {
          "bo": {
            "type": "string",
            "index": "not_analyzed"
          },
          "date": {
            "type": "string",
            "index": "not_analyzed"
          },
          "findings": {
            "type": "string",
            "fields": {
              "orig": {
                "type": "string",
                "index": "not_analyzed"
              }
            }
          },
          "pg": {
            "type": "string"
          },
          "rate": {
            "type": "float"
          }
        }
      },
      "_default_": {
        "properties": {
          "bo": {
            "type": "string",
            "index": "not_analyzed"
          },
          "date": {
            "type": "string",
            "index": "not_analyzed"
          },
          "findings": {
            "type": "string",
            "fields": {
              "orig": {
                "type": "string",
                "index": "not_analyzed"
              }
            }
          },
          "pg": {
            "type": "string"
          },
          "rate": {
            "type": "float"
          }
        }
      }
    },
    "settings": {
      "index": {
        "creation_date": "1454893575663",
        "uuid": "wJndGz1aSVSFjtidywsRPg",
        "number_of_replicas": "1",
        "number_of_shards": "5",
        "version": {
          "created": "2020099"
        }
      }
    },
    "warmers": {}
  }
}
I don't see the default (analyzed) 'findings' field being created.
The Elasticsearch multi_field type has been deprecated; see the multi-fields section of the mapping documentation instead.
You might need something like this for your findings field:
"findings": {
  "type": "string",
  "index": "analyzed",
  "fields": {
    "orig": { "type": "string", "index": "not_analyzed" }
  }
}
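Assuming the index is recreated with that corrected findings definition, the two variants can then be queried independently; the search text below is made up, just to illustrate the difference between the analyzed field and its not_analyzed sub-field:
GET nto/_search
{
  "query": { "match": { "findings": "minor wear" } }
}

GET nto/_search
{
  "query": { "term": { "findings.orig": "minor wear on left edge" } }
}
The match query goes through analysis against findings, while the term query only hits documents whose findings value is exactly the original string, thanks to findings.orig being not_analyzed.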

Update Elasticsearch mapping type without deleting it

I have this mapping type on my index.
{
  "iotsens-summarizedmeasures": {
    "mappings": {
      "summarizedmeasure": {
        "properties": {
          "id": {
            "type": "long"
          },
          "location": {
            "type": "boolean"
          },
          "rawValue": {
            "type": "string"
          },
          "sensorId": {
            "type": "string"
          },
          "summaryTimeUnit": {
            "type": "string"
          },
          "timestamp": {
            "type": "date",
            "format": "dateOptionalTime"
          },
          "value": {
            "type": "string"
          },
          "variableName": {
            "type": "string"
          }
        }
      }
    }
  }
}
I want to update the sensorId field to:
"sensorId": {
  "type": "string",
  "index": "not_analyzed"
}
Is there any way to update the index without deleting and re-mapping it? I don't need to change the type of the field, I only want to set "index": "not_analyzed".
Thank you.
What you can do is make a multi-field out of your existing sensorId field, like this, with a sub-field called raw which is not_analyzed:
curl -XPUT localhost:9200/iotsens-summarizedmeasures/_mapping/summarizedmeasure -d '{
  "summarizedmeasure": {
    "properties": {
      "sensorId": {
        "type": "string",
        "fields": {
          "raw": {
            "type": "string",
            "index": "not_analyzed"
          }
        }
      }
    }
  }
}'
However, you still have to re-index your data to make sure all sensorId.raw sub-fields get created.
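Once the data has been re-indexed, sensorId.raw can be used wherever the exact, untokenized value is needed, for instance in a term query or a terms aggregation (the sensor id below is invented for illustration):
curl -XPOST 'localhost:9200/iotsens-summarizedmeasures/summarizedmeasure/_search' -d '{
  "query": {
    "term": { "sensorId.raw": "sensor-0042" }
  },
  "aggs": {
    "sensors": {
      "terms": { "field": "sensorId.raw" }
    }
  }
}'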

MapperParsingException when creating mapping for elasticsearch index

Using Sense, I'm trying to create a mapping for an index with three properties. When I try to create it, I get the following response:
{
  "error": "MapperParsingException[Root type mapping not empty after parsing! Remaining fields: [mappings : {gram={properties={gram={type=string, fields={gram_bm25={type=string, similarity=BM25}, gram_lmd={type=string}}}, sentiment={type=string, index=not_analyzed}, word={type=string, index=not_analyzed}}}}]]",
  "status": 400
}
This is what I have in the Sense console:
PUT /pos/_mapping/gram
{
  "mappings": {
    "gram": {
      "properties": {
        "gram": {
          "type": "string",
          "fields": {
            "gram_bm25": {
              "type": "string", "similarity": "BM25"
            },
            "gram_lmd": {
              "type": "string"
            }
          }
        },
        "sentiment": {
          "type": "string", "index": "not_analyzed"
        },
        "word": {
          "type": "string", "index": "not_analyzed"
        }
      }
    }
  }
}
The name of the index is 'pos' and I call the type 'gram'.
I have created the index with the same name.
I have validated the JSON using http://jsonlint.com/.
I tried using XPUT in the console and got the 'acknowledged' response, but the mapping is still {} when I request it in Sense.
This question does not solve my problem; I always use the same name everywhere.
Any suggestions?
Thanks!
You just have the API syntax wrong. You've combined two different methods, basically.
Either create your index, then apply a mapping:
DELETE /pos
PUT /pos
PUT /pos/gram/_mapping
{
  "gram": {
    "properties": {
      "gram": {
        "type": "string",
        "fields": {
          "gram_bm25": {
            "type": "string",
            "similarity": "BM25"
          },
          "gram_lmd": {
            "type": "string"
          }
        }
      },
      "sentiment": {
        "type": "string",
        "index": "not_analyzed"
      },
      "word": {
        "type": "string",
        "index": "not_analyzed"
      }
    }
  }
}
Or do it all at once when you create the index:
DELETE /pos
PUT /pos
{
  "mappings": {
    "gram": {
      "properties": {
        "gram": {
          "type": "string",
          "fields": {
            "gram_bm25": {
              "type": "string",
              "similarity": "BM25"
            },
            "gram_lmd": {
              "type": "string"
            }
          }
        },
        "sentiment": {
          "type": "string",
          "index": "not_analyzed"
        },
        "word": {
          "type": "string",
          "index": "not_analyzed"
        }
      }
    }
  }
}
Here's the code I used:
http://sense.qbox.io/gist/6d645cc069f5f0fcf14f497809f7f79aff7de161
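As a final check, fetching the mapping back in Sense should now return the full gram definition instead of the empty {} you were seeing:
GET /pos/_mapping/gram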
