Getting error in Elasticsearch while creating index using postman - elasticsearch

I have installed Elasticsearch 5.1 on Ubuntu 14.04 and performed some operations such as creating and deleting indices. Then I installed Kibana 5.1. Now I want to create a new index in Elasticsearch using Postman (PUT localhost:9200/my_index), but I'm getting this error:
{
  "error": {
    "root_cause": [
      {
        "type": "illegal_argument_exception",
        "reason": "unknown setting [index.country] please check that any required plugins are installed, or check the breaking changes documentation for removed settings"
      }
    ],
    "type": "illegal_argument_exception",
    "reason": "unknown setting [index.country] please check that any required plugins are installed, or check the breaking changes documentation for removed settings"
  },
  "status": 400
}
I remember that I once used country as an index or type name, but I have since purged Elasticsearch and Kibana (and deleted the directories related to them) and reinstalled both. I still get this error. If anyone knows the solution, it would be appreciated.
Here is the output of some queries that may help in solving the issue.
GET localhost:9200/_mapping
{
  ".kibana": {
    "mappings": {
      "server": {
        "properties": {
          "uuid": {
            "type": "keyword"
          }
        }
      },
      "config": {
        "properties": {
          "buildNum": {
            "type": "keyword"
          }
        }
      }
    }
  }
}
GET localhost:9200/_cat/indices?v
[
  {
    "health": "yellow",
    "status": "open",
    "index": ".kibana",
    "uuid": "O_ORG0ONQNCEe8JU_C0SKQ",
    "pri": "1",
    "rep": "1",
    "docs.count": "1",
    "docs.deleted": "0",
    "store.size": "3.1kb",
    "pri.store.size": "3.1kb"
  }
]
GET localhost:9200/country
{
  "error": {
    "root_cause": [
      {
        "type": "index_not_found_exception",
        "reason": "no such index",
        "resource.type": "index_or_alias",
        "resource.id": "country",
        "index_uuid": "na",
        "index": "country"
      }
    ],
    "type": "index_not_found_exception",
    "reason": "no such index",
    "resource.type": "index_or_alias",
    "resource.id": "country",
    "index_uuid": "na",
    "index": "country"
  },
  "status": 404
}

You can simply send a PUT request such as:
http://localhost:9200/indexname <--- give your index name
And then within your request body you could give the mappings:
{
  "mappings": {
    "message_logs": {
      "properties": {
        "anyfield": {       <-- give your field
          "type": "text"    <-- and the type
        }
      }
    }
  }
}
This SO answer might help you if you'd rather create the index using cURL.
The above is just a sample that you can adapt.
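The same request body can be assembled programmatically before sending it from Postman or any HTTP client. A minimal sketch in Python (just building the JSON; the mapping type message_logs follows the sample above, and the field names are placeholders):

```python
import json

def build_create_index_body(field_types):
    # Build the body for PUT /<indexname>. ES 5.x expects a mapping
    # type level ("message_logs" here) above "properties".
    return {
        "mappings": {
            "message_logs": {
                "properties": {
                    field: {"type": ftype}
                    for field, ftype in field_types.items()
                }
            }
        }
    }

body = build_create_index_body({"anyfield": "text"})
print(json.dumps(body, indent=2))
```

Paste the printed JSON into the Postman request body for PUT localhost:9200/indexname.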

Related

How to make a field use a lowercase analyzer after field creation?

I run
PUT /vehicles/_doc/123
{
  "make": "Honda civic",
  "color": "Blue",
  "from": "Japan",
  "size": "Big",
  "HP": 250,
  "milage": 24000,
  "price": 19300.97
}
at the start
and after that I run
PUT /vehicles
{
  "settings": {
    "analysis": {
      "analyzer": {
        "my_lowercase_analyzer": {
          "tokenizer": "standard",
          "filter": [
            "lowercase"
          ]
        }
      }
    }
  },
  "mappings": {
    "properties": {
      "make": {
        "type": "text",
        "analyzer": "my_lowercase_analyzer"
      }
    }
  }
}
It gives this exception:
{
  "error": {
    "root_cause": [
      {
        "type": "resource_already_exists_exception",
        "reason": "index [vehicles/o66DtxmERa2lmo2WiiZ65w] already exists",
        "index_uuid": "o66DtxmERa2lmo2WiiZ65w",
        "index": "vehicles"
      }
    ],
    "type": "resource_already_exists_exception",
    "reason": "index [vehicles/o66DtxmERa2lmo2WiiZ65w] already exists",
    "index_uuid": "o66DtxmERa2lmo2WiiZ65w",
    "index": "vehicles"
  },
  "status": 400
}
Is there a way to update the analyzer after creation?
Updating the analyzer of an existing field is a breaking change; you can't update it on the same index. You now have the options below:
Add another field on which you can define the new analyzer (of course this is not recommended, as you will lose the old data, but in some cases it may be useful).
Create a new index with the same field and the updated analyzer definition, and after that use the Reindex API to copy the data from the old index to the new one.
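The second option boils down to two request bodies. A sketch in Python (the new index name vehicles_v2 is one I made up; the analyzer definition follows the question):

```python
# Body for PUT /vehicles_v2 -- same field, updated analyzer definition.
new_index_body = {
    "settings": {
        "analysis": {
            "analyzer": {
                "my_lowercase_analyzer": {
                    "tokenizer": "standard",
                    "filter": ["lowercase"],
                }
            }
        }
    },
    "mappings": {
        "properties": {
            "make": {"type": "text", "analyzer": "my_lowercase_analyzer"}
        }
    },
}

# Body for POST /_reindex -- copy the data from old to new index.
reindex_body = {
    "source": {"index": "vehicles"},
    "dest": {"index": "vehicles_v2"},
}
```

Once the reindex completes, searches can be pointed at the new index (or an alias can be swapped over to it).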

OpenSearch: exclude field from indexing in mapping

I have the following mapping:
{
  "properties": {
    "type": {
      "type": "keyword"
    },
    "body": {
      "type": "text"
    },
    "id": {
      "type": "keyword"
    },
    "date": {
      "type": "date"
    }
  }
}
The body field is going to hold an email message; it's very long and I don't want to index it.
What is the proper way to exclude this field from indexing?
What I tried:
enabled: false - as I understand from the documentation, it applies only to object-type fields, but in my case it's not really an object, so I'm not sure whether I can use it.
index: false/'no' - this breaks the code entirely and does not allow me to search. My query contains the query itself and aggregations with a filter. The filter contains a range:
date: { gte: someDay.getTime(), lte: 'now' }
P.S. someDay is a certain day in my case.
The error I get after applying index: false to the body field in the mapping is the following:
{
  "error": {
    "root_cause": [
      {
        "type": "number_format_exception",
        "reason": "For input string: \"now\""
      }
    ],
    "type": "search_phase_execution_exception",
    "reason": "all shards failed",
    "phase": "query",
    "grouped": true,
    "failed_shards": [
      {
        "shard": 0,
        "index": "test",
        "node": "eehPq21jQsmkotVOqQEMeA",
        "reason": {
          "type": "number_format_exception",
          "reason": "For input string: \"now\""
        }
      }
    ],
    "caused_by": {
      "type": "number_format_exception",
      "reason": "For input string: \"now\"",
      "caused_by": {
        "type": "number_format_exception",
        "reason": "For input string: \"now\""
      }
    }
  },
  "status": 400
}
I'm not sure how these cases are associated, as the error is about the date field while I'm adding the index property to the body field.
I'm using: "@opensearch-project/opensearch": "^1.0.2"
Please help me to understand:
how to exclude a field from indexing;
why applying index: false to the body field in the mapping breaks the code and I get an error associated with the date field.
You should just modify your mapping to this:
"body": {
  "type": "text",
  "index": false
}
And it should work.
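For reference, the search body from the question can be assembled like this (a sketch in Python; the match clause and the someDay value are placeholders I made up). The point is that the range clause targets the date field, which must keep its date mapping for "now" to be understood; only body is excluded from indexing:

```python
# Hypothetical value for someDay.getTime() (milliseconds since epoch).
some_day_ms = 1609459200000

search_body = {
    "query": {"match": {"type": "email"}},  # placeholder query
    "aggs": {
        "recent": {
            "filter": {
                # "now" is only valid against a field mapped as "date".
                "range": {"date": {"gte": some_day_ms, "lte": "now"}}
            }
        }
    },
}
```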

Trying to partially update a doc but getting an error regarding a date field with epoch_second format

I'm trying to partially update an existing document which is already in the index and is indexed well, meaning I can view it in Kibana, which uses the timestamp field to display documents. I'm trying to update only the doc's name for id test_id.
my request is:
POST shirkan_test/_update/test_id
{
  "doc": {
    "name": "shirkan"
  },
  "doc_as_upsert": true
}
Getting the following error:
{
  "error": {
    "root_cause": [
      {
        "type": "mapper_parsing_exception",
        "reason": "failed to parse field [timestamp] of type [date] in document with id 'test_id'. Preview of field's value: '1.602505857664299E9'"
      }
    ],
    "type": "mapper_parsing_exception",
    "reason": "failed to parse field [timestamp] of type [date] in document with id 'test_id'. Preview of field's value: '1.602505857664299E9'",
    "caused_by": {
      "type": "illegal_argument_exception",
      "reason": "failed to parse date field [1.602505857664299E9] with format [epoch_second]",
      "caused_by": {
        "type": "date_time_parse_exception",
        "reason": "Failed to parse with all enclosed parsers"
      }
    }
  },
  "status": 400
}
Much appreciate any help with this.
Thanks.
EDIT: adding index mapping
{
  "mapping": {
    "properties": {
      "condition": {
        "type": "text",
        "index": false
      },
      "name": {
        "type": "text",
        "fields": {
          "keyword": {
            "type": "keyword",
            "ignore_above": 256
          }
        }
      },
      "rank": {
        "type": "double"
      },
      "timestamp": {
        "type": "date",
        "format": "epoch_second"
      }
    }
  }
}
EDIT 2: Changing the timestamp format to strict_date_optional_time_nanos doesn't yield such an error, and the upsert works well.
Looks like, for now, the solution that worked for me is to change the timestamp field format from epoch_second to strict_date_optional_time_nanos. Other formats may work as well. I had to completely delete the index and recreate it, since I came across the same error message when trying to re-index.
As mentioned in one of my comments, I filed a bug report here:
https://github.com/elastic/elasticsearch/issues/64050
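Note that the value the error complains about, 1.602505857664299E9, is a float rendered in scientific notation, which the epoch_second parser rejected. Another workaround to consider (my assumption, as an alternative to changing the format) is to truncate the timestamp to whole seconds on the client before indexing:

```python
import time

# time.time() returns fractional seconds, e.g. 1602505857.664299;
# serialized naively this can become "1.602505857664299E9".
raw = time.time()

# Truncating to whole seconds yields a plain integer that the strict
# "epoch_second" format can parse.
epoch_seconds = int(raw)

doc = {"name": "shirkan", "timestamp": epoch_seconds}
```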

Set values in multi-fields in Elasticsearch

I have the following structure recorded in elastic:
PUT /movies
{
  "mappings": {
    "title": {
      "properties": {
        "title": {
          "type": "string",
          "fields": {
            "de": {
              "type": "string",
              "analyzer": "german"
            },
            "en": {
              "type": "string",
              "analyzer": "english"
            },
            "fr": {
              "type": "string",
              "analyzer": "french"
            },
            "es": {
              "type": "string",
              "analyzer": "spanish"
            }
          }
        }
      }
    }
  }
}
But when I try to index values like this:
PUT movies/_doc/2
{
  "title": "fox",
  "field": "en"
}
I receive the following error:
{
  "error": {
    "root_cause": [
      {
        "type": "illegal_argument_exception",
        "reason": "Rejecting mapping update to [movies] as the final mapping would have more than 1 type: [_doc, title]"
      }
    ],
    "type": "illegal_argument_exception",
    "reason": "Rejecting mapping update to [movies] as the final mapping would have more than 1 type: [_doc, title]"
  },
  "status": 400
}
Maybe I am doing something wrong, since I am fairly new to Elasticsearch. My idea is to create a one-to-one mapping so that when I search for "fox" in any of these languages, results are returned only in English, since that is how they are recorded in the DB.
Your mapping declares a mapping type "title", but when you create documents you use PUT movies/_doc/2, which refers to the mapping type _doc. That type doesn't exist, so ES tries to create it automatically, and in newer versions of ES having multiple mapping types is forbidden.
You should just change it to: PUT movies/title/2
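Once documents index under the correct type, a cross-language search could be expressed as a multi_match over the sub-fields. A sketch (not from the answer; the field names follow the mapping in the question):

```python
# Body for GET /movies/_search: query all language sub-fields at once.
# Each sub-field analyzes "fox" with its own language analyzer.
query_body = {
    "query": {
        "multi_match": {
            "query": "fox",
            "fields": ["title", "title.de", "title.en", "title.fr", "title.es"],
        }
    }
}
```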

Update a particular field in ES 2.4.0 document?

I have a document like this:
{
  "College": "UCLA",
  "University": "American",
  "Branch": "MECH",
  "Name": {
    "first": "john",
    "middle": "william",
    "last": "Richards"
  }
}
I have to update the last field in the Name group.
I tried the following, but it shows an error:
POST alma/matter/1/_update
{
  "doc": { "Name.last": "junior" }
}
Error:
{
"error": {
"root_cause": [
{
"type": "mapper_parsing_exception",
"reason": "Field name [Name.last] cannot contain '.'"
}
],
"type": "mapper_parsing_exception",
"reason": "Field name [Name.last] cannot contain '.'"
},
"status": 400
}
You're not allowed to reference fields with dots in their names. You can do it like this instead:
POST alma/matter/1/_update
{
  "doc": { "Name": { "last": "junior" } }
}
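If the update bodies are built in code, the dotted-to-nested conversion can be done generically. A small helper sketch (my own, not from the answer):

```python
def dotted_to_nested(doc):
    """Convert dotted keys like "Name.last" into nested objects,
    since ES 2.x rejects field names containing '.'."""
    out = {}
    for key, value in doc.items():
        parts = key.split(".")
        node = out
        for part in parts[:-1]:
            # Walk/create intermediate objects for each path segment.
            node = node.setdefault(part, {})
        node[parts[-1]] = value
    return out

print(dotted_to_nested({"Name.last": "junior"}))
```

The result can be wrapped as `{"doc": ...}` and sent to the `_update` endpoint.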
