Unable to create visualization using curl command in elasticsearch - elasticsearch

I am trying to create a visualization using a curl command. I am using elasticsearch 6.2.3. I am able to create the same in elasticsearch 5.6.8.
I am using this command:
curl -XPUT http://localhost:9200/.kibana/visualization/vis1 -H 'Content-Type: application/json' -d @vis1.json
It is showing this error:
{"error":{"root_cause":[{"type":"illegal_argument_exception","reason":"Rejecting mapping update to [.kibana] as the final mapping would have more than 1 type: [visualization, doc]"}],"type":"illegal_argument_exception","reason":"Rejecting mapping update to [.kibana] as the final mapping would have more than 1 type: [visualization, doc]"},"status":400}
Contents of vis1.json:
{
  "title": "vis1",
  "visState": "{\"title\":\"vis1\",\"type\":\"table\",\"params\":{\"perPage\":10,\"showMeticsAtAllLevels\":false,\"showPartialRows\":false,\"showTotal\":false,\"sort\":{\"columnIndex\":null,\"direction\":null},\"totalFunc\":\"sum\"},\"aggs\":[{\"id\":\"1\",\"enabled\":true,\"type\":\"count\",\"schema\":\"metric\",\"params\":{}},{\"id\":\"2\",\"enabled\":true,\"type\":\"date_histogram\",\"schema\":\"split\",\"params\":{\"field\":\"UsageEndDate\",\"interval\":\"M\",\"customInterval\":\"2h\",\"min_doc_count\":1,\"extended_bounds\":{},\"row\":false}},{\"id\":\"3\",\"enabled\":true,\"type\":\"terms\",\"schema\":\"bucket\",\"params\":{\"field\":\"ProductName.keyword\",\"otherBucket\":false,\"otherBucketLabel\":\"Other\",\"missingBucket\":false,\"missingBucketLabel\":\"Missing\",\"size\":5,\"order\":\"desc\",\"orderBy\":\"1\"}}]}",
  "uiStateJSON": "{\"vis\":{\"params\":{\"sort\":{\"columnIndex\":null,\"direction\":null}}}}",
  "description": "",
  "version": 1,
  "kibanaSavedObjectMeta": {
    "searchSourceJSON": "{\"index\":\"4eb9f840-3969-11e8-ae19-552e148747c3\",\"filter\":[],\"query\":{\"language\":\"lucene\",\"query\":\"\"}}"
  }
}
This is working fine in elasticsearch 5.6.8 but not in 6.2.3.
Thanks in advance.

In Kibana 6, the mapping of the .kibana index has changed in order to satisfy the upcoming "one mapping per index" breaking change.
You can try this way instead:
curl -XPUT http://localhost:9200/.kibana/doc/visualization:vis1 -H 'Content-Type: application/json' -d @vis1.json
Also the vis1.json file needs to be changed a little bit (the content needs to be moved to the visualization sub-section), like this:
{
  "type": "visualization",
  "updated_at": "2018-04-10T10:00:00.000Z",
  "visualization": {
    "title": "vis1",
    "visState": "{\"title\":\"vis1\",\"type\":\"table\",\"params\":{\"perPage\":10,\"showMeticsAtAllLevels\":false,\"showPartialRows\":false,\"showTotal\":false,\"sort\":{\"columnIndex\":null,\"direction\":null},\"totalFunc\":\"sum\"},\"aggs\":[{\"id\":\"1\",\"enabled\":true,\"type\":\"count\",\"schema\":\"metric\",\"params\":{}},{\"id\":\"2\",\"enabled\":true,\"type\":\"date_histogram\",\"schema\":\"split\",\"params\":{\"field\":\"UsageEndDate\",\"interval\":\"M\",\"customInterval\":\"2h\",\"min_doc_count\":1,\"extended_bounds\":{},\"row\":false}},{\"id\":\"3\",\"enabled\":true,\"type\":\"terms\",\"schema\":\"bucket\",\"params\":{\"field\":\"ProductName.keyword\",\"otherBucket\":false,\"otherBucketLabel\":\"Other\",\"missingBucket\":false,\"missingBucketLabel\":\"Missing\",\"size\":5,\"order\":\"desc\",\"orderBy\":\"1\"}}]}",
    "uiStateJSON": "{\"vis\":{\"params\":{\"sort\":{\"columnIndex\":null,\"direction\":null}}}}",
    "description": "",
    "version": 1,
    "kibanaSavedObjectMeta": {
      "searchSourceJSON": "{\"index\":\"4eb9f840-3969-11e8-ae19-552e148747c3\",\"filter\":[],\"query\":{\"language\":\"lucene\",\"query\":\"\"}}"
    }
  }
}
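If the PUT succeeds, you can sanity-check the saved object by fetching it back (a minimal check, assuming the default .kibana index and the id used above):
curl -XGET http://localhost:9200/.kibana/doc/visualization:vis1?pretty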

Related

{"error":"no handler found for uri [/blog/article/1] and method [PUT]"} elasticsearch

I am trying to add data to elasticsearch with both PUT and POST:
curl -k -XPUT 'https://localhost:9200/blog/article/1' -d '{"title": "New version of Elasticsearch released!", "content": "Version 2.2 released today!", "priority": 10, "tags": ["announce", "elasticsearch", "release"] }'
but I am getting error:
{"error":"no handler found for uri [/blog/article/1] and method [PUT]"}
curl -k -XPOST 'https://localhost:9200/blog/article/' -d '{"title": "New version of Elasticsearch released!", "content": "Version 2.2 released today!", "priority": 10, "tags": ["announce", "elasticsearch", "release"] }'
{"error":"no handler found for uri [/blog/article/] and method [POST]"}
Tldr;
This is expected behaviour as those endpoints do not exist.
You should refer to the official documentation for indexing documents.
Solution
A request to index a document should look like one of these:
PUT /<target>/_doc/<_id>
POST /<target>/_doc/
PUT /<target>/_create/<_id>
POST /<target>/_create/<_id>
In my example I chose the first flavour.
Notice that I have renamed the index to blog_article:
curl -k -XPUT 'https://localhost:9200/blog_article/_doc/1' -H "Content-Type: application/json" -d '{"title": "New version of Elasticsearch released!", "content": "Version 2.2 released today!", "priority": 10, "tags": ["announce", "elasticsearch", "release"] }'
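If you don't need to control the id yourself, the second flavour (POST without an id) lets Elasticsearch generate one. A minimal sketch with the same document, assuming the same blog_article index:
curl -k -XPOST 'https://localhost:9200/blog_article/_doc/' -H "Content-Type: application/json" -d '{"title": "New version of Elasticsearch released!", "content": "Version 2.2 released today!", "priority": 10, "tags": ["announce", "elasticsearch", "release"] }'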

Unable to update index with mapping in Elasticsearch 7.x

I am trying to follow this resource to update the mapping in an existing index but it gives an error:
{
  "error": {
    "root_cause": [
      {
        "type": "illegal_argument_exception",
        "reason": "mapper [NAME] cannot be changed from type [text] to [keyword]"
      }
    ],
    "type": "illegal_argument_exception",
    "reason": "mapper [NAME] cannot be changed from type [text] to [keyword]"
  },
  "status": 400
}
Below is the request I am sending:
curl -X PUT \
http://localhost:9200/company/_mapping \
-H 'cache-control: no-cache' \
-H 'content-type: application/json' \
-H 'postman-token: a8384316-7374-069c-05e5-5be4e0a8f6d8' \
-d '{
  "dynamic": "strict",
  "properties": {
    "NAME": {
      "type": "keyword"
    },
    "DOJ": {
      "type": "date"
    }
  }
}'
I know I can re-create an index with the new mapping but why can't I update the existing one?
The comments in this thread are correct in that changing a text field to a keyword field would be considered a mapping breaking change (and raise the "cannot be changed from [type] to [type]" exception).
You can, however, extend the original text mapping to become a multi-field mapping containing a keyword:
curl -X PUT \
http://localhost:9200/company/_mapping \
-H 'content-type: application/json' \
-d '{
  "dynamic": "strict",
  "properties": {
    "NAME": {
      "type": "text",
      "fields": {
        "keyword": {            <---
          "type": "keyword"
        }
      }
    },
    "DOJ": {
      "type": "date"
    }
  }
}'
Once this adjustment is through, you can pick up the new sub-field on already-indexed documents through a blank Update by query call (note that _update_by_query only accepts POST):
curl -X POST \
http://localhost:9200/company/_update_by_query \
-H 'content-type: application/json'
Finally, you can target NAME.keyword in your queries. The good thing is, you'll have retained the original text field too.
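For example, a term query against the new sub-field could look like this (the value "Acme" is just a placeholder):
curl -X GET http://localhost:9200/company/_search -H 'content-type: application/json' -d '{
  "query": {
    "term": {
      "NAME.keyword": "Acme"
    }
  }
}'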
You cannot change the mapping of an existing field; you need to create a new index with the correct mapping and then reindex your data.
Except for supported mapping parameters, you can’t change the mapping or field type of an existing field. Changing an existing field could invalidate data that’s already indexed.
https://www.elastic.co/guide/en/elasticsearch/reference/current/indices-put-mapping.html#updating-field-mappings
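A sketch of that approach, assuming a new index named company_v2 has already been created with the desired keyword mapping:
curl -X POST http://localhost:9200/_reindex -H 'content-type: application/json' -d '{
  "source": { "index": "company" },
  "dest": { "index": "company_v2" }
}'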
Can you specify the NAME field? It happened to me because the field name included a dot (e.g. "histological.group").

changing the timestamp format of elasticsearch index

I am trying to load log records into elasticsearch (7.3.1) and show the results in kibana. I am facing the fact that although records are loaded into elasticsearch and a curl GET shows them, they are not visible in kibana.
Most of the time, this is because of the timestamp format. In my case, the proper timestamp format should be basic_date_time, but the index only has:
# curl -XGET 'localhost:9200/og/_mapping'
{"og":{"mappings":{"properties":{"@timestamp":{"type":"date"},"componentName":{"type":"text","fields":{"keyword":{"type":"keyword","ignore_above":256}}}}}}}
I would like to add the format 'basic_date_time' to the @timestamp property, but each try I make is either not accepted by elasticsearch or does not change the index field.
I simply fail to get the right command to do the job.
For example, the simplest I could think of,
curl -H 'Content-Type: application/json' -XPUT 'http://localhost:9200/og/_mapping' -d'
{"mappings":{"properties":{"@timestamp":{"type":"date","format":"basic_date_time"}}}}
'
gives error
{"error":{"root_cause":[{"type":"mapper_parsing_exception","reason":"Root mapping definition has unsupported parameters: [mappings : {properties={#timestamp={format=basic_date_time, type=date}}}]"}],"type":"mapper_parsing_exception","reason":"Root mapping definition has unsupported parameters: [mappings : {properties={#timestamp={format=basic_date_time, type=date}}}]"},"status":400}%
and trying to do it via kibana with
PUT /og
{
  "mappings": {
    "properties": {
      "@timestamp": { "type": "date", "format": "basic_date_time" }
    }
  }
}
gives
{
  "error": {
    "root_cause": [
      {
        "type": "resource_already_exists_exception",
        "reason": "index [og/NIT2FoNfQpuPT3Povp97bg] already exists",
        "index_uuid": "NIT2FoNfQpuPT3Povp97bg",
        "index": "og"
      }
    ],
    "type": "resource_already_exists_exception",
    "reason": "index [og/NIT2FoNfQpuPT3Povp97bg] already exists",
    "index_uuid": "NIT2FoNfQpuPT3Povp97bg",
    "index": "og"
  },
  "status": 400
}
I am not sure if I should even try this in kibana. But I would be very glad if I could find the right curl command to get the index changed.
Thanks for helping, Ruud
You can do it either via curl like this:
curl -H 'Content-Type: application/json' -XPUT 'http://localhost:9200/og/_mapping' -d '{
  "properties": {
    "@timestamp": {
      "type": "date",
      "format": "basic_date_time"
    }
  }
}'
Or in Kibana like this:
PUT /og/_mapping
{
  "properties": {
    "@timestamp": {
      "type": "date",
      "format": "basic_date_time"
    }
  }
}
Also worth noting: once an index mapping is created, you usually cannot modify it (with very few exceptions). You can create a new index with the correct mapping and reindex your data into it.
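In either case, you can verify what the mapping now looks like by fetching it again:
curl -XGET 'localhost:9200/og/_mapping?pretty'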

how to upload bulk json data to an elasticsearch server using curl?

I am using ES version 6.1.3.
I am trying to upload a bulk of JSON data using curl on Windows 7.
Here's my command:
curl -H "Content-Type:application/json" -XPOST "http://localhost:9200/fooditems/item/_bulk?pretty" --data-binary @food_items.json
I tried these alternate commands for it:
curl -H "Content-Type:application/x-ndjson" -XPOST "http://localhost:9200/fooditems/item/_bulk?pretty" --data-binary @food_items.json
curl -XPOST "http://localhost:9200/fooditems/item/_bulk?pretty" --data-binary @food_items.json
And here is my sample data from food_items.json:
> {"index":{"_index":"fooditems","_type":"item","_id":0}}
> {"id":"0","name":"Jowar Puffed"}
> {"index":{"_index":"fooditems","_type":"item","_id":1}}
> {"id":"1","name":"Uamp and tea"}
> {"index":{"_index":"fooditems","_type":"item","_id":2}}
> {"id":"2","name":"Banana Chips"}
The mapping I have provided for the data through Kibana is:
POST /fooditems/item
{
  "mappings": {
    "_default_": {
      "properties": {
        "id": {
          "type": "text"
        },
        "name": {
          "type": "integer"
        }
      }
    }
  }
}
I have encountered these errors:
"The bulk request must be terminated by a newline [\n]"
illegal_argument_exception
I am not sure what they are trying to say.
Please help me out here.
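The first error means exactly what it says: the _bulk endpoint requires the NDJSON payload to be terminated by a final \n, so make sure food_items.json ends with a newline. In a Unix-like shell (e.g. Git Bash on Windows) you can append one like this:
echo >> food_items.json
The illegal_argument_exception most likely comes from the separate mapping request: POSTing a body containing "mappings" to /fooditems/item indexes it as a document rather than defining a mapping. In ES 6.x, mappings are defined when the index is created (PUT /fooditems with a mappings body) or afterwards via PUT /fooditems/_mapping/item.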

upserting batches into elasticsearch store with bulk API

I have a huge set of documents with the same index and same type but obviously different ids. I want to either update existing ones or insert new ones in batches. How can I achieve this using the bulk indexing API? I want to do something like below but it throws an error. Basically, I want to upsert multiple docs in batches which have the same index and same type.
curl -s -H "Content-Type: application/json" -XPOST localhost:9200/_bulk -d'
{ "index": {"_type": "sometype", "_index": "someindex"}}
{ "_id": "existing_id", "field1": "test1"}
{ "_id": "existing_id2", "field2": "test2"}
'
You need to do it like this:
curl -s -H "Content-Type: application/json" -XPOST localhost:9200/someindex/sometype/_bulk -d'
{ "index": {"_id": "existing_id"}}
{ "field1": "test1"}
{ "index": {"_id": "existing_id2"}}
{ "field2": "test2"}
'
Since all documents are in the same index/type, move that to the URL and only specify the _id for each document you want to update in your bulk.
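If you need true upsert semantics (create the document when the _id doesn't exist yet, update it otherwise), the bulk API also supports the update action with doc_as_upsert. A sketch under the same index/type assumptions:
curl -s -H "Content-Type: application/json" -XPOST localhost:9200/someindex/sometype/_bulk -d'
{ "update": {"_id": "existing_id"}}
{ "doc": { "field1": "test1" }, "doc_as_upsert": true }
{ "update": {"_id": "existing_id2"}}
{ "doc": { "field2": "test2" }, "doc_as_upsert": true }
'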
