escape triple quotes in curl correctly - elasticsearch

I have the following curl request:
curl -H "Content-Type: application/json" -X POST http://localhost:9200/_reindex\?wait_for_completion\=true -d '{"source": {"index": "analytics-prod-2019.12.30", "size":1000 }, "dest": {"index": "analytics-prod-2019.12"}, "conflicts": "proceed", "script": { "lang": "painless","source: """ctx._source.index = ctx._index; def eventData = ctx._source["event.data"]; if(eventData != null) { eventData.remove("realmDb.size"); eventData.remove("realmDb.format"); eventData.remove("realmDb.contents"); }""" } }'
but this fails with the following error:
{"error":{"root_cause":[{"type":"x_content_parse_exception","reason":"[1:166] [script] failed to parse object"}],"type":"x_content_parse_exception","reason":"[1:166] [reindex] failed to parse field [script]","caused_by":{"type":"x_content_parse_exception","reason":"[1:166] [script] failed to parse object","caused_by":{"type":"json_parse_exception","reason":"Unexpected character ('\"' (code 34)): was expecting a colon to separate field name and value\n at [Source: org.elasticsearch.common.bytes.BytesReference$MarkSupportingStreamInputWrapper#51c48433; line: 1, column: 177]"}}},"status":400}
If I remove the script field from the request, it works just fine:
curl -H "Content-Type: application/json" -X POST http://localhost:9200/_reindex\?wait_for_completion\=true -d '{"source":{"index":"analytics-prod-2019.12.30","size":1000},"dest":{"index":"test-index"},"conflicts":"proceed"}'
Using the Kibana UI works fine.
What is the correct way to run this in curl?

Use a single " (rather than the triple-quote """ syntax, which only the Kibana console understands) to surround your script value, and \u0027 to escape the single quotes inside your Painless script.
curl -H "Content-Type: application/json" -X POST http://localhost:9200/_reindex\?wait_for_completion\=true -d '
{
"source": {
"index": "analytics-prod-2019.12.30",
"size": 1000
},
"dest": {
"index": "analytics-prod-2019.12"
},
"conflicts": "proceed",
"script": {
"lang": "painless",
"source": "ctx._source.index = ctx._index; def eventData = ctx._source[\u0027event.data\u0027]; if(eventData != null) { eventData.remove(\u0027realmDb.size\u0027); eventData.remove(\u0027realmDb.format\u0027); eventData.remove(\u0027realmDb.contents\u0027); }"
}
}
'
You can also see an example of this here: click the Copy as cURL link and review the example in that format.
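As a local sanity check (no Elasticsearch needed, assuming python3 is available), you can confirm that a JSON parser decodes \u0027 back into a single quote, so the Painless script arrives intact:

```shell
# JSON parsers decode \u0027 into a literal single quote, so the script
# string Elasticsearch receives contains normal Painless quotes.
body='{"script": {"lang": "painless", "source": "def d = ctx._source[\u0027event.data\u0027];"}}'
printf '%s' "$body" \
  | python3 -c 'import json, sys; print(json.load(sys.stdin)["script"]["source"])'
```

This prints the decoded script with plain single quotes around event.data.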

Your "source" key was missing its closing double quote (and the triple-quote """ syntax is only understood by the Kibana console, not by plain JSON).
Corrected:
curl -H "Content-Type: application/json" \
-X POST http://localhost:9200/_reindex\?wait_for_completion\=true \
-d '{"source": {"index": "analytics-prod-2019.12.30", "size":1000 }, "dest": {"index": "analytics-prod-2019.12"}, "conflicts": "proceed", "script": { "lang": "painless","source": "ctx._source.index = ctx._index; def eventData = ctx._source[\"event.data\"]; if (eventData != null) { eventData.remove(\"realmDb.size\"); eventData.remove(\"realmDb.format\"); eventData.remove(\"realmDb.contents\"); }" } }'
You can either use single quotes as @Zsolt pointed out, but even Kibana itself uses escaped double quotes when you click "Copy as cURL".
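A quick way to verify the escaping locally (assuming python3 is available) is to pipe the single-quoted body through a JSON parser; the \" sequences are exactly what JSON needs for quotes embedded inside the script string:

```shell
# Inside the shell's single quotes, \" passes through untouched, and the
# JSON parser decodes it to a double quote within the script string.
body='{"script": {"lang": "painless", "source": "def d = ctx._source[\"event.data\"];"}}'
printf '%s' "$body" \
  | python3 -c 'import json, sys; print(json.load(sys.stdin)["script"]["source"])'
```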

curl -XPOST "http://elasticsearch:9200/_reindex?requests_per_second=115&wait_for_completion=true" -H 'Content-Type: application/json' -d'
{
"source": {
"index": "analytics-prod-2019.12.30",
"size": 1000
},
"dest": {
"index": "analytics-prod-2019.12"
},
"script": {
"lang": "painless",
"source": " ctx._source.index = ctx._index;\n def eventData = ctx._source[\"event.data\"];\n if (eventData != null) {\n eventData.remove(\"realmDb.size\");\n eventData.remove(\"realmDb.format\");\n eventData.remove(\"realmDb.contents\");\n }"
}
}'
I had to escape the inner double quotes as \".
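If you prefer literal single quotes inside a single-quoted shell string, there is also the '\'' idiom: close the quote, emit an escaped ', and reopen. A hypothetical fragment showing the effect:

```shell
# '\'' = end the single-quoted string, emit an escaped literal ', reopen.
# The resulting JSON string contains a plain single quote, which JSON
# allows unescaped inside double-quoted strings.
body='{"source": "eventData.remove('\''realmDb.size'\'')"}'
printf '%s\n' "$body"
```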

Related

How to sort data analytics/queries elasticsearch?

Is there any solution for sorting by newest date? I want to see the search history per user.
I followed the documentation but got stuck:
curl -X POST '<ENTERPRISE_SEARCH_BASE_URL>/api/as/v1/engines/national-parks-demo/analytics/queries' \
-H 'Content-Type: application/json' \
-H 'Authorization: Bearer private-xxxxxxxxxxxxxxxxxxxxxxxx' \
-d '{
"filters": {
"all": [
{
"date": {
"from": "2022-02-01T12:00:00+00:00",
"to": "2022-12-31T00:00:00+00:00"
}
}, {
"tag": "C001"
}
]
},
"page": {
"size": 20
}
}'
Sorry for my bad English.
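One thing to check before the API call: the -d body as posted opens with a doubled { (the brace after the opening '{ is repeated), which is not valid JSON. A locally checkable version of the same payload (dates and tag taken from the question, assuming python3 is available):

```shell
# The question's body begins '{ { ... which no JSON parser accepts.
# This validates a corrected single-brace version of the same payload.
body='{"filters": {"all": [{"date": {"from": "2022-02-01T12:00:00+00:00", "to": "2022-12-31T00:00:00+00:00"}}, {"tag": "C001"}]}, "page": {"size": 20}}'
printf '%s' "$body" | python3 -m json.tool > /dev/null && echo "valid JSON"
```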

Elasticsearch bulk update geo_location of all documents with curl

I have a bunch of documents in Elasticsearch that don't have a geo_point attribute.
Now I want to add it to all of them.
With some research I found the command below, but it was originally used to update a string attribute.
curl -XPOST "http://localhost:9200/products/_update_by_query" -H 'Content-Type: application/json' -d'
{
"script": {
"source": "ctx._source.location = {'lat': 0.0, 'lon':0.0}",
"lang": "painless"
},
"query": {
"match_all": {}
}
}'
I thought I'd just replace the string with a geo_point, but it gives me this error:
{
"error":{
"root_cause":[{
"type":"parse_exception",
"reason":"expected one of [inline], [file] or [stored] fields, but found none"
}],
"type":"parse_exception",
"reason":"expected one of [inline], [file] or [stored] fields, but found none"
},
"status":400
}
I appreciate any help.
Good job so far!
It looks like you're running an older version of ES. Try the command below, which simply replaces source with inline, as was the norm in older versions:
curl -XPOST "http://localhost:9200/products/_update_by_query" -H 'Content-Type: application/json' -d'
{
"script": {
"inline": "ctx._source.location = ['lat': 0.0, 'lon':0.0]",
"lang": "painless"
},
"query": {
"match_all": {}
}
}'
Note, however, that if your location field is already of type text or string you cannot change it to geo_point with this command. You'll need to either create a new field named differently than location and of type geo_point or create a new index with the proper mapping for the location field.
Edit: if the above doesn't work, try replacing the single quotes ' with \" like so:
curl -XPOST "http://localhost:9200/products/_update_by_query" -H 'Content-Type: application/json' -d'
{
"script": {
"inline": "ctx._source.location = [\"lat\": 0.0, \"lon\":0.0]",
"lang": "painless"
},
"query": {
"match_all": {}
}
}'
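To confirm the escaped body parses before sending it, a local check (assuming python3 is available) can extract the inline script and show that the \" sequences decode to the plain quotes Painless needs for its ['key': value] map-literal syntax:

```shell
# Parse the request body locally and print the decoded inline script.
body='{"script": {"inline": "ctx._source.location = [\"lat\": 0.0, \"lon\": 0.0]", "lang": "painless"}, "query": {"match_all": {}}}'
printf '%s' "$body" \
  | python3 -c 'import json, sys; print(json.load(sys.stdin)["script"]["inline"])'
```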

Add additional attribute to an existing document if the attribute doesn't exist elasticsearch

I have a specific requirement where I have to add an additional attribute to an Elasticsearch index which has n documents. This has to be done only if the documents don't contain the attribute. This task basically involves 2 steps:
1) searching
2) updating
I know how to do this with multiple queries, but it would be great if I could manage to do it in a single query. Is it possible? If yes, can someone tell me how this can be done?
You can use update by query combined with the exists query to update and add the new field to only those documents which do not contain the attribute.
For example, suppose you have only one document containing the field attrib2, while the others don't have that field.
curl -XPUT "http://localhost:9200/my_test_index/doc/1" -H 'Content-Type: application/json' -d'
{
"attrib1": "value1"
}'
curl -XPUT "http://localhost:9200/my_test_index/doc/2" -H 'Content-Type: application/json' -d'
{
"attrib1": "value21"
}'
curl -XPUT "http://localhost:9200/my_test_index/doc/3" -H 'Content-Type: application/json' -d'
{
"attrib1": "value31",
"attrib2": "value32"
}'
The following update by query will do the job.
curl -XPOST "http://localhost:9200/my_test_index/_update_by_query" -H 'Content-Type: application/json' -d'
{
"script": {
"lang": "painless",
"source": "ctx._source.attrib2 = params.attrib2",
"params": {
"attrib2": "new_value_for_attrib2"
}
},
"query": {
"bool": {
"must_not": [
{
"exists": {
"field": "attrib2"
}
}
]
}
}
}'
It will set the field attrib2 to the new value new_value_for_attrib2 on only those documents which don't already have that field.
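Before running it against a live index, the request body can be sanity-checked locally (assuming python3 is available), for example to confirm that the must_not/exists clause really targets attrib2:

```shell
# Parse the body and pull out the field name the exists clause filters on.
body='{"script": {"lang": "painless", "source": "ctx._source.attrib2 = params.attrib2", "params": {"attrib2": "new_value_for_attrib2"}}, "query": {"bool": {"must_not": [{"exists": {"field": "attrib2"}}]}}}'
printf '%s' "$body" \
  | python3 -c 'import json, sys; print(json.load(sys.stdin)["query"]["bool"]["must_not"][0]["exists"]["field"])'
```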

Nested mvel script update

I am using this expression to update a nested document:
curl -XPOST 'localhost:9200/event/docs/cPd4cfqGTe2Hw9sq0qs_NQ/_update' -d '{
"script": "foreach (item : ctx._source.to) { item['read'] = true }"
}'
But it always says ClassCastException: boolean cannot be cast to String. I tried putting the true in a param, and tried 'T', 'true', 'TRUE', TRUE, and 1.
Running out of ideas.
The document sample:
{
"prop": "test",
"to": [{"id": "1", "read":false},
{"id":"2","read": true}]
}
I also tried changing "id" just to test, and it tells me I can't cast HashMap to String:
curl -XPOST 'localhost:9200/event/docs/cPd4cfqGTe2Hw9sq0qs_NQ/_update' -d '{
"script": "foreach (item : ctx._source.to) { item['id'] = '3' }",
}'
The script sent to the Update API is not right.
I have corrected it. First, index the sample document:
curl -XPOST 'http://localhost:9200/vm/vm/vm' -d '{
"prop":"test",
"to": [{"id": "1", "read":false},
{"id":"2","read": true}]
}'
And for the update, use the below script:
curl -XPOST 'localhost:9200/vm/vm/vm/_update' -d '{
"script": "for (item in ctx._source.to) { item.id = 4 }"
}'
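Part of the original problem is shell quoting rather than the script itself: the single quotes around 'read' sit inside an already single-quoted -d string, so the shell silently removes them before curl sends anything. A minimal demonstration of what actually gets sent:

```shell
# The inner 'read' quotes terminate and reopen the shell string, so the
# JSON that curl would send contains item[read] with no quotes at all.
body='{"script": "foreach (item : ctx._source.to) { item['read'] = true }"}'
printf '%s\n' "$body"
```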

Elasticsearch Delete Query By Date

I'm running the following query :
q='{
"filtered" : {
"query" : {
"match_all" : {}
},
"filter": {
"and": [
{
"range": {
"creation_time": {
"from": "2012-08-30",
"to": "2012-08-31",
"include_lower": true,
"include_upper": true
}
}
}
]
}
}
}'
My domain is an ec2 server
curl -XDELETE "http://#{mydomain}:9200/monitoring/mention_reports/_query?q=#{q}"
When I hit this query it gives me:
curl: (3) [globbing] nested braces not supported at pos 118
Please help me, thanks.
If you’re trying to execute curl from the command line, it should look like:
q='YOUR_QUERY_CODE_GOES_HERE'
curl -v -H "Content-type: application/json" -H "Accept: application/json" \
-XDELETE -d "$q" http://localhost:9200/monitoring/mention_reports/_query
If you're executing from inside Ruby, you can format the request as you do now, but the silver bullet is still the headers:
-H "Content-type: application/json" -H "Accept: application/json"
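The quotes around $q matter too: unquoted, the shell word-splits the JSON before curl ever sees it, which is where the "globbing" error comes from. A minimal demonstration of the difference:

```shell
# Word-splitting demo: the same JSON expands to 2 words unquoted, 1 quoted.
q='{"match_all": {}}'
set -- $q            # unquoted: split on the space inside the JSON
echo "unquoted: $#"
set -- "$q"          # quoted: one intact argument, as curl -d "$q" needs
echo "quoted: $#"
```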
