Elasticsearch mapping issue: failed to parse field

I have this mapping
PUT /mytest
{
  "mappings": {
    "properties": {
      "value": { "type": "object" }
    }
  }
}
When I insert this document
POST /mytest/_doc/4
{
  "value": { "value": "test" }
}
I got the following error:
{
  "error": {
    "root_cause": [
      {
        "type": "mapper_parsing_exception",
        "reason": "failed to parse field [value.value] of type [long] in document with id '4'. Preview of field's value: 'test'"
      }
    ],
    "type": "mapper_parsing_exception",
    "reason": "failed to parse field [value.value] of type [long] in document with id '4'. Preview of field's value: 'test'",
    "caused_by": {
      "type": "illegal_argument_exception",
      "reason": "For input string: \"test\""
    }
  },
  "status": 400
}
I know the naming convention is bad; still, this is a valid JSON request, so I'm not sure why Elasticsearch doesn't accept it.

This error is telling you that you don't have an explicit mapping for the value property inside your value object. The [long] in the error message suggests that value.value was already dynamically mapped as a long by an earlier document, so a string like "test" no longer fits. The example below would properly define the value.value property within your mytest index:
PUT mytest
{
  "mappings": {
    "properties": {
      "value": {
        "type": "object",
        "properties": {
          "value": {
            "type": "text"
          }
        }
      }
    }
  }
}
However, I don't think that was your intention. As a best practice, try following the Elastic Common Schema (ECS) when naming your index properties.
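Note that if value.value has already been dynamically mapped as long, the existing mapping cannot be changed in place; the index has to be recreated. A minimal sequence (a sketch, and destructive, since it deletes the index) would be:

DELETE /mytest

PUT /mytest
{
  "mappings": {
    "properties": {
      "value": {
        "properties": {
          "value": { "type": "text" }
        }
      }
    }
  }
}

POST /mytest/_doc/4
{
  "value": { "value": "test" }
}

The last request should now return "result": "created" instead of the mapper_parsing_exception.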

Related

OpenSearch: exclude field from indexing in mapping

I have the following mapping:
{
  "properties": {
    "type": {
      "type": "keyword"
    },
    "body": {
      "type": "text"
    },
    "id": {
      "type": "keyword"
    },
    "date": {
      "type": "date"
    }
  }
}
The body field is going to hold an email message; it's very long and I don't want to index it.
What is the proper way to exclude this field from indexing?
What I tried:
enabled: false - as I understand from the documentation, it applies only to object-type fields, but in my case the field is not really an object, so I'm not sure I can use it.
index: false/'no' - this breaks searching entirely. My query contains the query itself plus aggregations with a filter. The filter contains a range:
date: { gte: someDay.getTime(), lte: 'now' }
P.S. someDay is a certain day in my case.
The error I get after applying index: false in mapping to the body field is the following:
{
  "error": {
    "root_cause": [
      {
        "type": "number_format_exception",
        "reason": "For input string: \"now\""
      }
    ],
    "type": "search_phase_execution_exception",
    "reason": "all shards failed",
    "phase": "query",
    "grouped": true,
    "failed_shards": [
      {
        "shard": 0,
        "index": "test",
        "node": "eehPq21jQsmkotVOqQEMeA",
        "reason": {
          "type": "number_format_exception",
          "reason": "For input string: \"now\""
        }
      }
    ],
    "caused_by": {
      "type": "number_format_exception",
      "reason": "For input string: \"now\"",
      "caused_by": {
        "type": "number_format_exception",
        "reason": "For input string: \"now\""
      }
    }
  },
  "status": 400
}
I'm not sure how these two are related, since the error is about the date field while I'm adding the index property to the body field.
I'm using: "@opensearch-project/opensearch": "^1.0.2"
Please help me understand:
how to exclude a field from indexing;
why applying index: false to the body field in the mapping breaks the code and I get an error associated with the date field.
You should just modify your mapping to this:
"body": {
"type": "text",
"index": false
}
And it should work.
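As for the second question, a likely explanation (an assumption, since the full index state isn't shown) is that the index was recreated without an explicit date mapping, so date got dynamically mapped as a numeric type; a range filter with lte: 'now' against a numeric field fails with exactly this number_format_exception, because 'now' is date math and is only understood by date fields. With the full mapping in place, both the exclusion and the range filter should work:

PUT test
{
  "mappings": {
    "properties": {
      "type": { "type": "keyword" },
      "body": { "type": "text", "index": false },
      "id": { "type": "keyword" },
      "date": { "type": "date" }
    }
  }
}

GET test/_search
{
  "query": {
    "range": {
      "date": { "gte": 1640995200000, "lte": "now" }
    }
  }
}

(The index name test is taken from the failed_shards output above; the gte value is a placeholder standing in for someDay.getTime().)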

Trying to partially update a doc but getting an error regarding a date field with epoch_second format

I'm trying to partially update an existing document which is already in the index and indexed well, meaning I can view it in Kibana, which uses the timestamp field to display documents. I'm trying to update only the name of the doc with id test_id.
My request is:
POST shirkan_test/_update/test_id
{
  "doc": {
    "name": "shirkan"
  },
  "doc_as_upsert": true
}
Getting the following error:
{
  "error": {
    "root_cause": [
      {
        "type": "mapper_parsing_exception",
        "reason": "failed to parse field [timestamp] of type [date] in document with id 'test_id'. Preview of field's value: '1.602505857664299E9'"
      }
    ],
    "type": "mapper_parsing_exception",
    "reason": "failed to parse field [timestamp] of type [date] in document with id 'test_id'. Preview of field's value: '1.602505857664299E9'",
    "caused_by": {
      "type": "illegal_argument_exception",
      "reason": "failed to parse date field [1.602505857664299E9] with format [epoch_second]",
      "caused_by": {
        "type": "date_time_parse_exception",
        "reason": "Failed to parse with all enclosed parsers"
      }
    }
  },
  "status": 400
}
I'd much appreciate any help with this.
Thanks.
EDIT: adding index mapping
{
  "mapping": {
    "properties": {
      "condition": {
        "type": "text",
        "index": false
      },
      "name": {
        "type": "text",
        "fields": {
          "keyword": {
            "type": "keyword",
            "ignore_above": 256
          }
        }
      },
      "rank": {
        "type": "double"
      },
      "timestamp": {
        "type": "date",
        "format": "epoch_second"
      }
    }
  }
}
EDIT 2: Changing timestamp format to strict_date_optional_time_nanos doesn't yield such an error and the upsert works well.
It looks like, for now, the solution that worked for me is to change the timestamp field format from epoch_second to strict_date_optional_time_nanos. Other formats may work as well. I had to completely delete the index and recreate it, since I ran into the same error message when trying to re-index.
As mentioned in one of my comments, I filed a bug report here:
https://github.com/elastic/elasticsearch/issues/64050
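For reference, a minimal sketch of the recreated index with the changed format (field names taken from the mapping above), after which the same upsert goes through:

PUT shirkan_test
{
  "mappings": {
    "properties": {
      "condition": { "type": "text", "index": false },
      "name": {
        "type": "text",
        "fields": {
          "keyword": { "type": "keyword", "ignore_above": 256 }
        }
      },
      "rank": { "type": "double" },
      "timestamp": {
        "type": "date",
        "format": "strict_date_optional_time_nanos"
      }
    }
  }
}

POST shirkan_test/_update/test_id
{
  "doc": { "name": "shirkan" },
  "doc_as_upsert": true
}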

How to create a sub-object in Elasticsearch 7.x

Earlier I was using version 1.x and was creating sub-object mappings using the syntax below.
"foo": {
"type": "integer",
"doc_values": true
},
"foo.bar": {
"type": "integer",
"doc_values": true
},
"foo.bar.baz": {
"type": "integer",
"doc_values": true
},
But now, when I use the same mapping syntax in ES 7.x, I get the below error:
{
  "error": {
    "root_cause": [
      {
        "type": "illegal_argument_exception",
        "reason": "Can't merge a non object mapping [foo] with an object mapping [foo]"
      }
    ],
    "type": "illegal_argument_exception",
    "reason": "Can't merge a non object mapping [foo] with an object mapping [foo]"
  },
  "status": 400
}
I came across this SO post: Can't merge a non object mapping with an object mapping error in machine learning (beta) module. But note that I am not updating the mapping; I am creating a new mapping and still getting this error. Please advise what to do.
Here is an example of the object type; refer to the Elasticsearch documentation on the object type for more info.
"mappings": {
"properties": {
"user": {
"properties": {
"age": { "type": "integer" },
"name": {
"properties": {
"first": { "type": "text" },
"last": { "type": "text" }
}
}
}
}
}
In the snippet below, name is defined as an object and further properties are added using dot notation such as name.firstname. In your mapping, foo is of type integer and you then add foo.bar, so it throws an error; foo must be of type object.
"properties" : {
"name" : {
"type" : "object"
},
"name.firstname":{
"type" : "text"
},
"name.lastname":{
"type" : "text"
}
}
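Applied to the mapping from the question, a 7.x version might look like this (a sketch, and the index name my_index is just a placeholder; it assumes only the leaf level needs to hold integer values, since a field cannot be both an integer and an object):

PUT my_index
{
  "mappings": {
    "properties": {
      "foo": {
        "properties": {
          "bar": {
            "properties": {
              "baz": { "type": "integer", "doc_values": true }
            }
          }
        }
      }
    }
  }
}

Here foo and foo.bar are plain objects and only foo.bar.baz carries a value; if foo itself must also store an integer, it needs a separately named leaf field (e.g. a hypothetical foo.value) instead.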

failed to parse timestamp elasticsearch

I'm new to elasticsearch and am trying to create my first index but am having issues with a timestamp field that was working before...
I created my index like this:
PUT /kafkasdp
{
"mappings": {
"kafka_logs": {
"properties": {
"timestamp": {
"type": "date"
},
"log_level": {
"type": "string"
},
"message1": {
"type": "string"
},
"message2": {
"type": "string"
}
}
}
}
}
and then I'm trying to send data like this:
POST /kafkasdp/kafka_logs
{
  "timestamp": "2017-02-03 19:27:20,606",
  "log_level": "INFO",
  "message2": "Deleting segment 1 from log omega-replica-sync-dev-8. (kafka.log.Log)"
}
but keep getting this error:
{
  "error": {
    "root_cause": [
      {
        "type": "mapper_parsing_exception",
        "reason": "failed to parse [timestamp]"
      }
    ],
    "type": "mapper_parsing_exception",
    "reason": "failed to parse [timestamp]",
    "caused_by": {
      "type": "illegal_argument_exception",
      "reason": "Invalid format: \"2017-02-03 19:27:20,606\" is malformed at \" 19:27:20,606\""
    }
  },
  "status": 400
}
I thought my timestamp was a valid date?
Read about the date type in the Elasticsearch reference: you should specify the format of the dates you expect in your documents:
PUT your_index_name
{
  "mappings": {
    "your_index_type": {
      "properties": {
        "date": {
          "type": "date",
          "format": "yyyy-MM-dd HH:mm:ss,SSS"
        }
      }
    }
  }
}
As you did not specify one, Elasticsearch expects date values in the default format strict_date_optional_time||epoch_millis, i.e. ISO 8601 such as 2017-02-03T19:27:20.606Z, or epoch milliseconds.
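Applied to the index from the question, that would be (a sketch; the index must be deleted and recreated first, since the format of an existing field cannot be changed in place):

DELETE /kafkasdp

PUT /kafkasdp
{
  "mappings": {
    "kafka_logs": {
      "properties": {
        "timestamp": {
          "type": "date",
          "format": "yyyy-MM-dd HH:mm:ss,SSS"
        },
        "log_level": { "type": "string" },
        "message1": { "type": "string" },
        "message2": { "type": "string" }
      }
    }
  }
}

After that, the original document with "timestamp": "2017-02-03 19:27:20,606" should be accepted.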

Nested type in Elasticsearch: "object mapping can't be changed from nested to non-nested" when indexing a document

I am trying to index some nested documents into an Elasticsearch (v2.3.1) mapping which looks as follows (based on this example from the documentation):
PUT /my_index
{
  "mappings": {
    "blogpost": {
      "properties": {
        "title": { "type": "string" },
        "comments": {
          "type": "nested",
          "properties": {
            "name": { "type": "string" },
            "comment": { "type": "string" }
          }
        }
      }
    }
  }
}
However, I do not understand what my JSON documents have to look like in order to fit into that mapping. I tried with
PUT /my_index/some_type/1
{
  "title": "some_title",
  "comments": {
    "name": "some_name",
    "comment": "some_comment"
  }
}
as well as with
PUT /my_index/some_type/1
{
  "title": "some_title",
  "comments": [
    {
      "name": "some_name",
      "comment": "some_comment"
    }
  ]
}
which both result in
{
  "error": {
    "root_cause": [
      {
        "type": "remote_transport_exception",
        "reason": "[Caiman][172.18.0.4:9300][indices:data/write/index[p]]"
      }
    ],
    "type": "illegal_argument_exception",
    "reason": "object mapping [comments] can't be changed from nested to non-nested"
  },
  "status": 400
}
What is the correct format for indexing nested documents? Any working examples are much appreciated; most examples here on SO or on other pages concentrate on nested queries rather than on how the documents were indexed in the first place.
It seems you're actually creating a document of type some_type, whose comments field defaults to a normal object (i.e. not nested); that is not allowed, since you already have a nested object called comments in the blogpost mapping type of the same index.
Try this instead and it should work:
PUT /my_index/blogpost/1
{
  "title": "some_title",
  "comments": {
    "name": "some_name",
    "comment": "some_comment"
  }
}
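To verify that the nested mapping is actually being used, a nested query along these lines should match the document (a sketch using the field names from the mapping above):

GET /my_index/blogpost/_search
{
  "query": {
    "nested": {
      "path": "comments",
      "query": {
        "match": { "comments.name": "some_name" }
      }
    }
  }
}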
