elasticsearch mapping exception when using dynamic templates - elasticsearch

Hi, I am using Elasticsearch to index some documents, but the documents will have fields like goal1Completion, goal2Completion....goal100Completion. So I was trying to map them with dynamic templates, and I came up with the following, but it throws an error:
{
"mappings": {
"date": {
"properties": {
"sessions": {
"type": "long"
},
"viewId": {
"type": "string",
"index": "not_analyzed"
},
"webPropertyId": {
"type": "string",
"index": "not_analyzed"
},
"dynamic_templates": [
{
"goalCompletions": {
"match_pattern": "regex",
"match": "goal\\d+\\w+",
"mapping": {
"type": "long"
}
}
}
]
}
}
}
}
error:"reason": "Expected map for property [fields] on field [dynamic_templates] but got a class java.lang.String"
what could be thee problem here?

You need to pull dynamic_templates out of the properties map:
{
"mappings": {
"date": {
"properties": {
"sessions": {
"type": "long"
},
"viewId": {
"type": "string",
"index": "not_analyzed"
},
"webPropertyId": {
"type": "string",
"index": "not_analyzed"
}
},
"dynamic_templates": [ <--- Pull this out of properties
{
"goalCompletions": {
"match_pattern": "regex",
"match": "goal\\d+\\w+",
"mapping": {
"type": "long"
}
}
}
]
}
}
}
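Once the corrected mapping is in place, you can check that the template actually fires by indexing a document that contains one of the goal fields and then asking for the mapping back. A minimal sketch, assuming a placeholder index name myindex (the index name, type name and sample values here are made up):
PUT myindex
{ "mappings": { "date": { "dynamic_templates": [ { "goalCompletions": { "match_pattern": "regex", "match": "goal\\d+\\w+", "mapping": { "type": "long" } } } ] } } }

POST myindex/date
{ "viewId": "12345", "goal7Completion": 3 }

GET myindex/_mapping
In the response, goal7Completion should now appear under the date type with "type": "long", confirming the dynamic template was applied.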

Related

In Elasticsearch, how can I have an index with automatic mapping where all fields are not analyzed?

I tried to do:
PUT /index_name/
{ "index" : {
"analysis" : {
"analyzer" :"not_analyzed"
}
}
}
but I'm not sure it is the right way...
Edit:
I applied both methods from the answers, but I have no way to test them. If I do
GET /index_name/_mapping
...
"metaData_requestHeaders_accept-language": {
"type": "string"
},
"metaData_requestHeaders_akamai-origin-hop": {
"type": "string"
},
"metaData_requestHeaders_alexatoolbar-alx_ns_ph": {
"type": "string"
},
"metaData_requestHeaders_authorization": {
"type": "string"
},
"metaData_requestHeaders_c": {
"type": "string"
},
"metaData_requestHeaders_cache-control": {
"type": "string"
},
"metaData_requestHeaders_ckiooe": {
"type": "string"
},
...
As you can see, the automatic mapping does not show which analyzer is used, so I have no way to verify that this is actually working. Any ideas?
Use dynamic templates, like this:
PUT my_index
{
"mappings": {
"my_type": {
"dynamic_templates": [
{
"strings": {
"match_mapping_type": "string",
"mapping": {
"type": "string",
"index": "not_analyzed"
}
}
}
]
}
}
}
After applying the above template, you should see something like this:
GET /my_index/_mapping
{
"my_index": {
"mappings": {
"my_type": {
"dynamic_templates": [
{
"strings": {
"mapping": {
"index": "not_analyzed",
"type": "string"
},
"match_mapping_type": "string"
}
}
],
"properties": {}
}
}
}
}
The above mapping indicates that all string fields will be not_analyzed by default.
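If you want to verify this on real data, a quick check is to index a document and run a term query against the full, unmodified value; a not_analyzed string is stored as a single term, so the exact phrase matches, while an analyzed string would have been lowercased and split into tokens. A rough sketch with a made-up field name:
POST my_index/my_type
{ "title": "Foo Bar" }

POST my_index/_refresh

GET my_index/_search
{ "query": { "term": { "title": "Foo Bar" } } }
With the template in place this returns the document; with analyzed strings the term query on "Foo Bar" would find nothing, because only the tokens foo and bar were indexed.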
I guess what you're looking for is Elasticsearch dynamic templates, which allow you to create mappings dynamically.
You want something like this:
PUT index_name
{
"mappings": {
"type_name": {
"dynamic_templates": [
{
"strings": {
"match_mapping_type": "string",
"mapping": {
"type": "string",
"fields": {
"raw": {
"type": "string",
"index": "not_analyzed"
}
}
}
}
}
]
}
}
}
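With this template every dynamically mapped string gets an analyzed field for full-text search plus a not_analyzed raw sub-field for exact matching, sorting and aggregations. A small usage sketch, using a made-up field name status:
GET index_name/_search
{ "query": { "match": { "status": "active user" } } }

GET index_name/_search
{ "size": 0, "aggs": { "by_status": { "terms": { "field": "status.raw" } } } }
The first request analyzes the query text and matches individual tokens; the second aggregates on the untouched original values.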

Create multi field in Elasticsearch

I want to create an index and here is my mapping. I want to create a multi field on the field 'findings': one with the default mapping (analyzed) and the other one, 'orig' (not_analyzed).
PUT nto
{
"mappings": {
"_default_": {
"properties": {
"date": {
"type": "string",
"index": "not_analyzed"
},
"bo": {
"type": "string",
"index": "not_analyzed"
},
"pg": {
"type": "string"
},
"rate": {
"type": "float"
},
"findings": {
"type": "multi_field",
"fields": {
"findings": {
"type": "string",
"index": "analyzed"
},
"orig": {
"type": "string",
"index":"not_analyzed"
}
}
}
}
}
}
}
Once I create the mapping, I don't see the orig field being created. Here is the mapping that I see:
{
"ccdn": {
"aliases": {},
"mappings": {
"test": {
"properties": {
"bo": {
"type": "string",
"index": "not_analyzed"
},
"date": {
"type": "string",
"index": "not_analyzed"
},
"findings": {
"type": "string",
"fields": {
"orig": {
"type": "string",
"index": "not_analyzed"
}
}
},
"pg": {
"type": "string"
},
"rate": {
"type": "float"
}
}
},
"_default_": {
"properties": {
"bo": {
"type": "string",
"index": "not_analyzed"
},
"date": {
"type": "string",
"index": "not_analyzed"
},
"findings": {
"type": "string",
"fields": {
"orig": {
"type": "string",
"index": "not_analyzed"
}
}
},
"pg": {
"type": "string"
},
"rate": {
"type": "float"
}
}
}
},
"settings": {
"index": {
"creation_date": "1454893575663",
"uuid": "wJndGz1aSVSFjtidywsRPg",
"number_of_replicas": "1",
"number_of_shards": "5",
"version": {
"created": "2020099"
}
}
},
"warmers": {}
}
}
I don't see the default 'findings' field (analyzed) being created.
The Elasticsearch multi_field type is deprecated. See here.
You might need something like this for your findings field:
"findings": {
"type": "string",
"index": "analyzed",
"fields": {
"orig": { "type": "string", "index": "not_analyzed" }
}
}
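After recreating the index with this mapping, the analyzed findings field serves full-text queries while the findings.orig sub-field serves exact matches and sorting. A rough sketch, with made-up document content and the index/type names from the question:
POST nto/test/1
{ "findings": "fracture of the left wrist" }

GET nto/test/_search
{ "query": { "match": { "findings": "wrist fracture" } } }

GET nto/test/_search
{ "query": { "term": { "findings.orig": "fracture of the left wrist" } } }
The match query hits on individual analyzed tokens, while the term query only matches the complete original string.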

elasticsearch "having not" query

Some documents have a category field. In some of these docs the category field's value equals "-1". I need a query that returns documents which have a category field whose value is not equal to "-1".
I tried this:
GET webproxylog/_search
{
"query": {
"filtered": {
"filter": {
"not":{
"filter": {"and": {
"filters": [
{"term": {
"category": "-1"
}
},
{
"missing": {
"field": "category"
}
}
]
}}
}
}
}
}
}
But it does not work; it returns docs that do not have a category field.
EDIT
Mapping:
{
"webproxylog": {
"mappings": {
"accesslog": {
"properties": {
"category": {
"type": "string",
"index": "not_analyzed"
},
"clientip": {
"type": "string",
"index": "not_analyzed"
},
"clientmac": {
"type": "string",
"index": "not_analyzed"
},
"clientname": {
"type": "string",
"index": "not_analyzed"
},
"duration": {
"type": "long"
},
"filetype": {
"type": "string",
"index": "not_analyzed"
},
"hierarchycode": {
"type": "string",
"index": "not_analyzed"
},
"loggingdate": {
"type": "date",
"format": "dateOptionalTime"
},
"reqmethod": {
"type": "string",
"index": "not_analyzed"
},
"respsize": {
"type": "long"
},
"resultcode": {
"type": "string",
"index": "not_analyzed"
},
"url": {
"type": "string",
"analyzer": "slash_analyzer"
},
"user": {
"type": "string",
"index": "not_analyzed"
}
}
}
}
}
}
If your category field is a string and is analyzed by default, then your -1 will be indexed as 1 (the analyzer strips the minus sign).
You will need that field to be not_analyzed, or you need to add a sub-field which is not analyzed (as in my solution below).
Something like this:
DELETE test
PUT /test
{
"mappings": {
"test": {
"properties": {
"category": {
"type": "string",
"fields": {
"notAnalyzed": {
"type": "string",
"index": "not_analyzed"
}
}
}
}
}
}
}
POST /test/test/1
{"category": "-1"}
POST /test/test/2
{"category": "2"}
POST /test/test/3
{"category": "3"}
POST /test/test/4
{"category": "4"}
POST /test/test/5
{"category2": "-1"}
GET /test/test/_search
{
"query": {
"bool": {
"must_not": [
{
"term": {
"category.notAnalyzed": {
"value": "-1"
}
}
},
{
"filtered": {
"filter": {
"missing": {
"field": "category"
}
}
}
}
]
}
}
}
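The two must_not clauses together mean "category must not equal -1 and category must not be missing", so only documents 2, 3 and 4 above should come back. The same condition can also be written with an exists filter instead of negating missing; a sketch using the same pre-2.x filtered/bool filter style as the question:
GET /test/test/_search
{
"query": {
"filtered": {
"filter": {
"bool": {
"must": [ { "exists": { "field": "category" } } ],
"must_not": [ { "term": { "category.notAnalyzed": "-1" } } ]
}
}
}
}
}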

Create ElasticSearch dynamic template to ensure all fields are set to not_analyzed

I have an Elasticsearch type for which I want the mapping to be set dynamically. There are a few select fields on that type that I want to be analyzed, but everything else should be set to "not_analyzed".
I've come up with the following snippet. This sets all string fields to not be analyzed, but doesn't cover all other data types. I tried using the "generic" field shown in the documentation but it didn't help. Can anyone tell me how I can accomplish this?
{
"TypeName": {
"dynamic_templates": [
{
"template_name": {
"match": "*",
"match_mapping_type": "string",
"mapping": {
"index": "no",
"type": "string"
}
}
}
],
"dynamic": true,
"properties": {
"url": {
"index": "analyzed",
"type": "string"
},
"resourceUrl": {
"index": "analyzed",
"type": "string"
}
}
}
}
{
"mappings": {
"TypeName": {
"dynamic_templates": [
{
"base": {
"mapping": {
"index": "not_analyzed"
},
"match": "*",
"match_mapping_type": "*"
}
}
],
"dynamic": true,
"properties": {
"url": {
"index": "analyzed",
"type": "string"
},
"resourceUrl": {
"index": "analyzed",
"type": "string"
}
}
}
}
}
Or, as an index-level template applied to every type via _default_:
{
"mappings": {
"_default_": {
"dynamic_templates": [
{
"base": {
"mapping": {
"index": "not_analyzed"
},
"match": "*",
"match_mapping_type": "*"
}
}
]
}
}
}
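A quick way to check the catch-all template, sketched here with placeholder index, type and field names: create the index with the _default_ mapping above, index a document containing a new string field, and look at the generated mapping.
PUT test_index
{ "mappings": { "_default_": { "dynamic_templates": [ { "base": { "match": "*", "match_mapping_type": "*", "mapping": { "index": "not_analyzed" } } } ] } } }

POST test_index/TypeName
{ "title": "Some Title", "count": 7 }

GET test_index/_mapping
The title field should come back as a string with "index": "not_analyzed", while any fields you map explicitly (like url and resourceUrl above) keep whatever you declare for them.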

MapperParsingException when creating mapping for elasticsearch index

Using Sense, I'm trying to create a mapping for an index with three properties. When I try to create it, I get the following response:
{
"error": "MapperParsingException[Root type mapping not empty after parsing! Remaining fields: [mappings : {gram={properties={gram={type=string, fields={gram_bm25={type=string, similarity=BM25}, gram_lmd={type=string}}}, sentiment={type=string, index=not_analyzed}, word={type=string, index=not_analyzed}}}}]]",
"status": 400
}
This is what I have in the Sense console:
PUT /pos/_mapping/gram
{
"mappings": {
"gram": {
"properties": {
"gram": {
"type": "string",
"fields": {
"gram_bm25": {
"type": "string", "similarity": "BM25"
},
"gram_lmd": {
"type": "string"
}
}
},
"sentiment": {
"type": "string", "index": "not_analyzed"
},
"word": {
"type": "string", "index": "not_analyzed"
}
}
}
}
}
The name of the index is 'pos' and I call the type 'gram'.
I have created the index with the same name.
I have validated the JSON using http://jsonlint.com/.
I tried using XPUT in the console and I got the 'acknowledged' response, but the mapping is still {} when I request it in Sense.
This question does not solve my problem. I always use the same name everywhere.
Any suggestions?
Thanks!
You just have the API syntax wrong; you've basically combined two different methods. When you PUT to /pos/_mapping/gram, the request body should contain only the type mapping itself, without the outer "mappings" wrapper (that wrapper belongs in the create-index request).
Either create your index, then apply a mapping:
DELETE /pos
PUT /pos
PUT /pos/gram/_mapping
{
"gram": {
"properties": {
"gram": {
"type": "string",
"fields": {
"gram_bm25": {
"type": "string",
"similarity": "BM25"
},
"gram_lmd": {
"type": "string"
}
}
},
"sentiment": {
"type": "string",
"index": "not_analyzed"
},
"word": {
"type": "string",
"index": "not_analyzed"
}
}
}
}
Or do it all at once when you create the index:
DELETE /pos
PUT /pos
{
"mappings": {
"gram": {
"properties": {
"gram": {
"type": "string",
"fields": {
"gram_bm25": {
"type": "string",
"similarity": "BM25"
},
"gram_lmd": {
"type": "string"
}
}
},
"sentiment": {
"type": "string",
"index": "not_analyzed"
},
"word": {
"type": "string",
"index": "not_analyzed"
}
}
}
}
}
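Either way, you can confirm the mapping took effect with:
GET /pos/_mapping
which should now return the gram type with its gram, sentiment and word properties instead of an empty {}.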
Here's the code I used:
http://sense.qbox.io/gist/6d645cc069f5f0fcf14f497809f7f79aff7de161
