Elasticsearch Delete Query By Date - Ruby

I'm running the following query:
q='{
  "filtered" : {
    "query" : {
      "match_all" : {}
    },
    "filter": {
      "and": [
        {
          "range": {
            "creation_time": {
              "from": "2012-08-30",
              "to": "2012-08-31",
              "include_lower": true,
              "include_upper": true
            }
          }
        }
      ]
    }
  }
}'
Here #{mydomain} points to an EC2 server:
curl -XDELETE "http://#{mydomain}:9200/monitoring/mention_reports/_query?q=#{q}"
When I run this query, curl gives me:
curl: (3) [globbing] nested braces not supported at pos 118
Please help. Thanks!

If you're trying to run curl from the command line, it should look like this:
q='YOUR_QUERY_CODE_GOES_HERE'
curl -v -H "Content-type: application/json" -H "Accept: application/json" \
-XDELETE -d $q http://localhost:9200/monitoring/mention_reports/_query
If you execute the request from inside Ruby, you can format it the way you already do, but the key is still to send the headers:
-H "Content-type: application/json" -H "Accept: application/json"

Related

How to sort data analytics/queries elasticsearch?

Is there any way to sort the results by date, newest first? I want to see the search history per user.
I followed the documentation but got stuck:
curl -X POST '<ENTERPRISE_SEARCH_BASE_URL>/api/as/v1/engines/national-parks-demo/analytics/queries' \
-H 'Content-Type: application/json' \
-H 'Authorization: Bearer private-xxxxxxxxxxxxxxxxxxxxxxxx' \
-d '{
  "filters": {
    "all": [
      {
        "date": {
          "from": "2022-02-01T12:00:00+00:00",
          "to": "2022-12-31T00:00:00+00:00"
        }
      },
      {
        "tag": "C001"
      }
    ]
  },
  "page": {
    "size": 20
  }
}'
Sorry for my bad English.

Need help in complicate aggregation with ElasticSearch query

I'm trying to build an aggregation query with Elasticsearch 6.8:
I want to find the last documents for a specific date, and the last documents before that date, grouped by document fields.
curl -X PUT "http://localhost:9200/test/xxx/1" -H 'Content-Type: application/json' -d'{"c1" : "1","c2": "1-1","ts": "2020-01-01T06:00:00.000+0000", "rec_type": "t1"}'
curl -X PUT "http://localhost:9200/test/xxx/2" -H 'Content-Type: application/json' -d'{"c1" : "1","c2": "1-2","ts": "2020-01-02T06:00:00.000+0000", "rec_type": "t1"}'
curl -X PUT "http://localhost:9200/test/xxx/3" -H 'Content-Type: application/json' -d'{"c1" : "1","c2": "1-3","ts": "2020-03-16T06:00:00.000+0000", "rec_type": "t1"}'
curl -X PUT "http://localhost:9200/test/xxx/4" -H 'Content-Type: application/json' -d'{"c1" : "1","c2": "1-4","ts": "2020-03-16T09:00:00.000+0000", "rec_type": "t1"}'
curl -X PUT "http://localhost:9200/test/xxx/5" -H 'Content-Type: application/json' -d'{"c1" : "2","c2": "2-1","ts": "2020-01-01T06:00:00.000+0000", "rec_type": "t1"}'
curl -X PUT "http://localhost:9200/test/xxx/6" -H 'Content-Type: application/json' -d'{"c1" : "2","c2": "2-2","ts": "2020-01-02T06:00:00.000+0000", "rec_type": "t1"}'
curl -X PUT "http://localhost:9200/test/xxx/7" -H 'Content-Type: application/json' -d'{"c1" : "2","c2": "2-3","ts": "2020-03-16T06:00:00.000+0000", "rec_type": "t1"}'
curl -X PUT "http://localhost:9200/test/xxx/8" -H 'Content-Type: application/json' -d'{"c1" : "2","c2": "2-4","ts": "2020-03-16T09:00:00.000+0000", "rec_type": "t1"}'
curl -X PUT "http://localhost:9200/test/_mapping/_doc" -H 'Content-Type: application/json' -d'{"properties" : {"c2": {"type": "text", "fielddata": true}}}'
curl -X PUT "http://localhost:9200/test/_mapping/_doc" -H 'Content-Type: application/json' -d'{"properties" : {"ts": {"type": "date"}}}'
Data looks like:
c1 | c2  | ts                  | rec_type
---+-----+---------------------+---------
1  | 1-1 | 2020-01-01T06:00:00 | t1
1  | 1-2 | 2020-01-02T06:00:00 | t1
1  | 1-3 | 2020-03-16T06:00:00 | t1
1  | 1-4 | 2020-03-16T09:00:00 | t1
2  | 2-1 | 2020-01-01T06:00:00 | t1
2  | 2-2 | 2020-01-02T06:00:00 | t1
2  | 2-3 | 2020-03-16T06:00:00 | t1
2  | 2-4 | 2020-03-16T09:00:00 | t1
My query returns only the last record for the specific date (1-4 and 2-4 in the table above), but I also want the last records before that date (1-2 and 2-2) in the same bucket:
{
  "size":0,
  "query":{
    "bool":{
      "must":[
        {
          "bool":{
            "must":[
              {
                "term":{
                  "rec_type.keyword":{
                    "value":"t1",
                    "boost":1.0
                  }
                }
              },
              {
                "range":{
                  "ts":{
                    "from":"2020-03-16T00:00:00.000+0000",
                    "to":"2020-03-16T23:59:59.000+0000",
                    "include_lower":true,
                    "include_upper":true,
                    "boost":1.0
                  }
                }
              }
            ],
            "adjust_pure_negative":true,
            "boost":1.0
          }
        }
      ],
      "adjust_pure_negative":true,
      "boost":1.0
    }
  },
  "aggregations":{
    "last_records":{
      "composite":{
        "size":100,
        "sources":[
          {
            "record":{
              "terms":{
                "field":"c1.keyword",
                "order":"asc"
              }
            }
          }
        ]
      },
      "aggregations":{
        "top_hits":{
          "top_hits":{
            "from":0,
            "size":1,
            "version":false,
            "explain":false,
            "sort":[
              {
                "ts":{
                  "order":"desc"
                }
              },
              {
                "c2":{
                  "order":"desc"
                }
              }
            ]
          }
        }
      }
    }
  }
}
So the question is: is this even possible, and how?
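The thread does not include an answer, but as a rough sketch only (not from the original thread): one way this is sometimes approached in ES 6.8 is a filters aggregation with one named range bucket for the specific date and one for everything before it, each grouped by c1.keyword with a single top hit. Field names and date bounds below are reused from the question; treat the approach itself as an assumption to verify:
curl -X POST "http://localhost:9200/test/_search" -H 'Content-Type: application/json' -d'
{
  "size": 0,
  "query": { "term": { "rec_type.keyword": "t1" } },
  "aggregations": {
    "periods": {
      "filters": {
        "filters": {
          "on_date": { "range": { "ts": { "gte": "2020-03-16T00:00:00.000+0000", "lte": "2020-03-16T23:59:59.000+0000" } } },
          "before_date": { "range": { "ts": { "lt": "2020-03-16T00:00:00.000+0000" } } }
        }
      },
      "aggregations": {
        "by_c1": {
          "terms": { "field": "c1.keyword", "size": 100 },
          "aggregations": {
            "last_record": { "top_hits": { "size": 1, "sort": [ { "ts": { "order": "desc" } } ] } }
          }
        }
      }
    }
  }
}'
With the sample data above, the on_date bucket would be expected to return 1-4 and 2-4, and the before_date bucket 1-2 and 2-2.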

escape triple quotes in curl correctly

I have the following curl request
curl -H "Content-Type: application/json" -X POST http://localhost:9200/_reindex\?wait_for_completion\=true -d '{"source": {"index": "analytics-prod-2019.12.30", "size":1000 }, "dest": {"index": "analytics-prod-2019.12"}, "conflicts": "proceed", "script": { "lang": "painless","source: """ctx._source.index = ctx._index; def eventData = ctx._source["event.data"]; if(eventData != null) { eventData.remove("realmDb.size"); eventData.remove("realmDb.format"); eventData.remove("realmDb.contents"); }""" } }'
but this fails with the following error:
{"error":{"root_cause":[{"type":"x_content_parse_exception","reason":"[1:166] [script] failed to parse object"}],"type":"x_content_parse_exception","reason":"[1:166] [reindex] failed to parse field [script]","caused_by":{"type":"x_content_parse_exception","reason":"[1:166] [script] failed to parse object","caused_by":{"type":"json_parse_exception","reason":"Unexpected character ('\"' (code 34)): was expecting a colon to separate field name and value\n at [Source: org.elasticsearch.common.bytes.BytesReference$MarkSupportingStreamInputWrapper#51c48433; line: 1, column: 177]"}}},"status":400}
If I remove the script field from the request, this works just fine:
curl -H "Content-Type: application/json" -X POST http://localhost:9200/_reindex\?wait_for_completion\=true -d '{"source":{"index":"analytics-prod-2019.12.30","size":1000},"dest":{"index":"test-index"},"conflicts":"proceed"}}'
Using the Kibana UI works fine.
What is the correct way to run this in curl?
Use a single " to surround your script value and \u0027 to escape in your Painless script.
curl -H "Content-Type: application/json" -X POST http://localhost:9200/_reindex\?wait_for_completion\=true -d '
{
  "source": {
    "index": "analytics-prod-2019.12.30",
    "size": 1000
  },
  "dest": {
    "index": "analytics-prod-2019.12"
  },
  "conflicts": "proceed",
  "script": {
    "lang": "painless",
    "source": "ctx._source.index = ctx._index; def eventData = ctx._source[\u0027event.data\u0027]; if(eventData != null) { eventData.remove(\u0027realmDb.size\u0027); eventData.remove(\u0027realmDb.format\u0027); eventData.remove(\u0027realmDb.contents\u0027); }"
  }
}
'
You can also see an example of this here: click on the Copy as cURL link and review the example in that format.
Your "source" field name was missing its closing double quote.
Corrected:
curl -H "Content-Type: application/json" \
-X POST http://localhost:9200/_reindex\?wait_for_completion\=true \
-d '{"source": {"index": "analytics-prod-2019.12.30", "size":1000 }, "dest": {"index": "analytics-prod-2019.12"}, "conflicts": "proceed", "script": { "lang": "painless","source": "ctx._source.index = ctx._index; def eventData = ctx._source[\"event.data\"]; if (eventData != null) { eventData.remove(\"realmDb.size\"); eventData.remove(\"realmDb.format\"); eventData.remove(\"realmDb.contents\"); }" } }'
You can use single quotes as #Zsolt pointed out, but even Kibana itself, when you click "Copy as cURL", uses escaped double quotes.
curl -XPOST "http://elasticsearch:9200/_reindex?requests_per_second=115&wait_for_completion=true" -H 'Content-Type: application/json' -d'
{
  "source": {
    "index": "analytics-prod-2019.12.30",
    "size": 1000
  },
  "dest": {
    "index": "analytics-prod-2019.12"
  },
  "script": {
    "lang": "painless",
    "source": " ctx._source.index = ctx._index;\n def eventData = ctx._source[\"event.data\"];\n if (eventData != null) {\n eventData.remove(\"realmDb.size\");\n eventData.remove(\"realmDb.format\");\n eventData.remove(\"realmDb.contents\");\n }"
  }
}'
I just had to escape the inner double quotes as \".

Elasticsearch 6 create new field requires data type but "Indices created in 6.x only allow a single-type per index"

Creating a new field in Elasticsearch 6.6.2 gives the following error:
{
  "error": {
    "root_cause": [
      {
        "type": "action_request_validation_exception",
        "reason": "Validation Failed: 1: mapping type is missing;"
      }
    ],
    "type": "action_request_validation_exception",
    "reason": "Validation Failed: 1: mapping type is missing;"
  },
  "status": 400
}
The request:
curl --request PUT http://10.1.3.81:9200/netswitch_message/_mapping -H "Content-Type: application/json" -d \
'{
  "properties": {
    "amount": {"type": "integer"}
  }
}'
This gives the error no matter what data type I use. The index already contains fields of type integer, text/keyword, text and date.
curl --request PUT http://10.1.3.81:9200/netswitch_message/_mapping -H "Content-Type: application/json" -d "{\"properties\": {\"amount\": {\"type\": \"integer\"}}}"
curl --request PUT http://10.1.3.81:9200/netswitch_message/_mapping -H "Content-Type: application/json" -d "{\"properties\": {\"amount\": {\"type\": \"text\"}}}"
curl --request PUT http://10.1.3.81:9200/netswitch_message/_mapping/data -H "Content-Type: application/json" -d "{\"properties\": {\"amount\": {}}}"
curl --request PUT http://10.1.3.81:9200/netswitch_message/_mapping -H "Content-Type: application/json" -d "{\"properties\": {\"amount\": {}}}
I expected this to create a new field. Instead, every variant returned the same validation error:
{"error":{"root_cause":[{"type":"action_request_validation_exception","reason":"Validation Failed: 1: mapping type is missing;"}],"type":"action_request_validation_exception","reason":"Validation Failed: 1: mapping type is missing;"},"status":400}
You are right that 6.x restricts you to a single _type, but you do still need to supply the name of that type (in 7.x, it defaults to _doc).
Change your mapping to specify the _type, like the following which sets it to "my-type":
curl --request PUT http://10.1.3.81:9200/netswitch_message/_mapping/my-type -H "Content-Type: application/json" -d \
'{
  "properties": {
    "amount": {"type": "integer"}
  }
}'
See: https://www.elastic.co/guide/en/elasticsearch/reference/6.6/indices-put-mapping.html#indices-put-mapping
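If you're not sure what the existing type is called, you can check the index's current mapping first and reuse that type name in the PUT _mapping path (same host as in the question):
curl http://10.1.3.81:9200/netswitch_message/_mapping
The type name appears directly under "mappings" in the response; since 6.x allows only one type per index, that is the name to use.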

Elasticsearch 7 unable to create index

I am trying to create an index using the following syntax:
curl -H "Content-Type: application/json" -XPUT 127.0.0.1:9200/movies -d '
{
  "mappings": {
    "movie": {
      "properties": {
        "year": {"type":"date"}
      }
    }
  }
}'
I guess "movie" cannot be child of the "mappings", can someone please help me transform this into Elasticsearch 7 compatible syntax.
I tried using "movie.year" : {"type":"date"} but then it fails on following insert statement
curl -H "Content-Type: application/json" -XPUT 127.0.0.1:9200/movies/movie/109487 -d '
{
  "genre":["IMAX", "Sci-Fi"],
  "title":"Intersteller",
  "year":2014
}'
I copied this from an Elasticsearch 6 tutorial. The insert fails with:
"Rejecting mapping update to [movies] as the final mapping would have more than 1 type: [_doc, movie]"
In ES 7, there are no more types. You need to do it like this.
First, create the index:
curl -H "Content-Type: application/json" -XPUT 127.0.0.1:9200/movies -d '
{
  "mappings": {
    "properties": {
      "year": {"type":"date"}
    }
  }
}'
Then, index your document:
curl -H "Content-Type: application/json" -XPUT 127.0.0.1:9200/movies/_doc/109487 -d '
{
  "genre":["IMAX", "Sci-Fi"],
  "title":"Intersteller",
  "year":2014
}'
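To double-check the result, you can fetch the generated mapping and the indexed document back (same host and index as above):
curl 127.0.0.1:9200/movies/_mapping
curl 127.0.0.1:9200/movies/_doc/109487
The mapping should show "year" with type "date", and the document should come back under "_source".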
