Elasticsearch Lucene query in Grafana

I have Grafana 2.6 with Elasticsearch 1.6.2 as a datasource.
On each of my documents I have a field "status" that can have the values "Queued" or "Complete".
I would like to graph the number of documents with status:Queued over time.
Here is one document:
{
  "_index": "myindex",
  "_type": "e_sdoc",
  "_id": "AVHFTlZiGCWSWOI9Qtj4",
  "_score": 3.2619324,
  "_source": {
    "status": "Queued",
    "update_date": "2015-12-04T00:01:35.589956",
    "md5": "738b67990f820ba28f3c10bc6c8b6ea3",
    "sender": "Someone",
    "type": "0",
    "last_client_update": "2015-11-18T18:13:32.879085",
    "uuid": "a80efd11-8ecc-4ef4-afb3-e8cd75d167ad",
    "name": "Europe",
    "insert_date": "2015-11-18T18:14:34.302295",
    "filesize": 10948809532,
    "is_online": "off",
    "id1": 77841,
    "id2": 53550932
  },
  "fields": {
    "insert_date": [
      1447870474302
    ],
    "update_date": [
      1449187295589
    ],
    "last_client_update": [
      1447870412879
    ]
  }
}
My question is: Grafana wants a Lucene query to submit to ES, but I have no idea what I should use.
I have searched through the official docs and Grafana issues, and looked into the ES queries made by Kibana, but I can't find a valid syntax that works :/

The time field was the problem: it seems there is no @timestamp in my documents.
I edited my Elasticsearch datasource and changed 'Time field name' from @timestamp to update_date.
I now have data points!
(see comments for the Lucene query)
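For reference, a query of this shape is what I would expect in the Grafana query field; this is my guess at the kind of Lucene query the comments describe, not a verified copy of it. Paired with a Count metric and a Date Histogram on update_date, it counts only the queued documents:

status:Queued

If the status field is mapped as not_analyzed, the term is case sensitive and may need to be written exactly as stored, e.g. status:"Queued".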

Related

ElasticSearch - Multiple query on one call (with sub limit)

I have a problem with Elasticsearch, and I need your help :)
I currently have a single index that holds my documents. These documents represent either Products or Categories.
The structure is this:
{
  "_index": "documents-XXXX",
  "_type": "_doc",
  "_id": "cat-31",
  "_score": 1.0,
  "_source": {
    "title": "Category A",
    "type": "category",
    "uniqId": "cat-31",
    [...]
  }
},
{
  "_index": "documents-XXXX",
  "_type": "_doc",
  "_id": "prod-1",
  "_score": 1.0,
  "_source": {
    "title": "Product 1",
    "type": "product",
    "uniqId": "prod-1",
    [...]
  }
},
What I'd like to do, in one call, is get 5 documents whose type is "product" and 2 documents whose type is "category". Do you think it's possible?
That is, two queries in a single call, each with its own limit.
Also, wouldn't it be better to make two different indices, one for the products and the other for the categories?
If so, I have the same question: how do I run both queries in a single call?
Thanks in advance
If product and category are different contexts, I would try to separate them into different indices. Is this type field used in all your queries to filter results? E.g., do you always search for a term like xpto in docs with type product, or do you also search without applying any filter?
About your other question: you can send two queries in one request. The Multi Search API can help with this.
You would get two responses, one for each query.
GET my-index-000001/_msearch
{ }
{"query": { "term": { "type": { "value": "product" } }}}
{"index": "my-index-000001"}
{"query": { "term": { "type": { "value": "category" } }}}

What does _doc mean in elasticsearch sort search return?

When I search with sort in Elasticsearch using the _search API, I get _doc in the sort field. What is the difference between it and _doc as a document type?
Elasticsearch version: 6.2.2
"sort": [
1577413214250, # timestamp
393 # _doc
]
Actually, Kibana also uses _doc when implementing "Surrounding Documents":
{"index":["prophet-job-*"],"ignore_unavailable":true,"preference":1577428415532}
{"version":true,"size":5,"search_after":[1577413214250,385],"sort":[{"#timestamp":{"order":"asc","unmapped_type":"boolean"}},{"_doc":{"order":"desc","unmapped_type":"boolean"}}],"_source":{"excludes":[]},"stored_fields":["*"],"script_fields":{},"docvalue_fields":["#timestamp"],"query":{"bool":{"must":[{"match_all":{}}],"filter":[],"should":[],"must_not":[]}}}
{"index":["prophet-job-*"],"ignore_unavailable":true,"preference":1577428415532}
{"version":true,"size":5,"search_after":[1577413214250,385],"sort":[{"#timestamp":{"order":"desc","unmapped_type":"boolean"}},{"_doc":{"order":"asc","unmapped_type":"boolean"}}],"_source":{"excludes":[]},"stored_fields":["*"],"script_fields":{},"docvalue_fields":["#timestamp"],"query":{"bool":{"must":[{"match_all":{}}],"filter":[],"should":[],"must_not":[]}}}
_doc in the context of sorting can be used when you do not care about the order and simply want the documents to be returned in the most efficient way possible. Think of it as a search option, as opposed to the index doc type "_doc".
For more information about sorting by _doc, see the official documentation.
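As a quick sketch (the index name is a placeholder), a request that just wants to sweep over documents as cheaply as possible can sort on _doc explicitly:

GET my-index-000001/_search
{
  "size": 1000,
  "sort": [ "_doc" ],
  "query": { "match_all": {} }
}

The values you then see in each hit's "sort" array are internal Lucene doc IDs, which is why they carry no business meaning.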
Hi Dennis, below is a simple sample: http://test.kibana.some.net/elasticsearch/_msearch
Request payload:
{"index":["some-index-*"],"ignore_unavailable":true,"preference":1577931761749}
{"version":true,"size":5,"search_after":[1577931865123,12],"sort":[{"#timestamp":{"order":"asc","unmapped_type":"boolean"}},{"_doc":{"order":"desc","unmapped_type":"boolean"}}],"_source":{"excludes":[]},"stored_fields":["*"],"script_fields":{},"docvalue_fields":["#timestamp"],"query":{"bool":{"must":[{"match_all":{}}],"filter":[],"should":[],"must_not":[]}}}
{"index":["some-index-*"],"ignore_unavailable":true,"preference":1577931761749}
{"version":true,"size":5,"search_after":[1577931865123,12],"sort":[{"#timestamp":{"order":"desc","unmapped_type":"boolean"}},{"_doc":{"order":"asc","unmapped_type":"boolean"}}],"_source":{"excludes":[]},"stored_fields":["*"],"script_fields":{},"docvalue_fields":["#timestamp"],"query":{"bool":{"must":[{"match_all":{}}],"filter":[],"should":[],"must_not":[]}}}
and the partial response is
{
  "_index": "some-index-2020.01.02",
  "_type": "doc",
  "_id": "123456",
  "_version": 1,
  "_score": null,
  "_source": {
    "@timestamp": "2020-01-02T02:24:25.123Z",
    "prospector": {
      "type": "log"
    },
    "@version": "1",
    "tags": [
      "beats_input_codec_plain_applied"
    ],
    "fields": {
      "some-values": "xxxxxx"
    },
    "message": "[2020-01-02 10:24:24] [INFO] [evaluation.py:277] Finished evaluation at 2020-01-02-02:24:24"
  },
  "fields": {
    "@timestamp": [
      "2020-01-02T02:24:25.123Z"
    ]
  },
  "sort": [
    1577931865123,
    11 # this is the _doc value
  ]
}
The second time I run the same search, the response has the same content except that the _doc value changed to 12, so I'm confused about the definition of this field.

What does @ mean in Elasticsearch documents?

My question is: "What does the # mean in elastic search documents?" #timestamp automatically gets created along with #version. Why is this and what's the point?
Here is some context... I have a web app that writes logs to files. Then I have logstash forward these logs to elastic search. Finally, I use Kibana to visualize everything.
Here is an example of one of the documents in elastic search:
{
  "_index": "logstash-2018.02.17",
  "_type": "doc",
  "_id": "0PknomEBajxXe2bTzwxm",
  "_version": 1,
  "_score": null,
  "_source": {
    "@timestamp": "2018-02-17T05:06:13.362Z",
    "source": "source",
    "@version": "1",
    "message": "message",
    "env": "development",
    "host": "127.0.0.1"
  },
  "fields": {
    "@timestamp": [
      "2018-02-17T05:06:13.362Z"
    ]
  },
  "sort": [
    1518843973362
  ]
}
Fields prefixed with @ are usually metadata generated by Logstash; @timestamp is the time at which the event was processed by Logstash. Similarly, @version is also added by Logstash to denote the version number of the document.
Here is the reference.
The @ fields are metadata created by Logstash. They are part of the data itself.
More info is here.
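Since @timestamp is just a regular date field on the document, you can query it like any other field. A minimal sketch, using the index name from the example above:

GET logstash-2018.02.17/_search
{
  "query": {
    "range": {
      "@timestamp": {
        "gte": "2018-02-17T00:00:00Z",
        "lte": "2018-02-17T23:59:59Z"
      }
    }
  }
}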

elasticsearch - add custom field to a specific index

I have the JSON for my index that looks like this:
{
  "_index": "myindex",
  "_type": "external",
  "_id": "1",
  "_source": {
    "id": "1",
    "name": "myName",
    "description": "myDescription",
    "source": "mySource"
  }
}
And I want to add a string field in _source named topic.
How can I do this?
You can update the index mapping to add the new topic field as follows:
curl -XPUT 'http://localhost:9200/myindex/_mapping/external' -d '
{
  "external": {
    "properties": {
      "id": { "type": "string" },
      "name": { "type": "string" },
      "description": { "type": "string" },
      "source": { "type": "string" },
      "topic": { "type": "string" }
    }
  }
}'
Strictly speaking, the step above is not necessary (dynamic mapping would add the field automatically), but it is always good to control what you are indexing.
Now you can index your documents with the new field and it will be reflected in newly indexed or updated documents. However, documents that were indexed earlier will still not contain this new field; you will have to reindex them.
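For example, indexing a new document that includes the extra field (the values here are made up) would look like this, in the same style as the mapping call above:

curl -XPUT 'http://localhost:9200/myindex/external/2' -d '
{
  "id": "2",
  "name": "anotherName",
  "description": "anotherDescription",
  "source": "anotherSource",
  "topic": "myTopic"
}'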

Kibana 4 index patterns time-field

Is there a way to make Kibana 4 use a timestamp field stored as epoch time as the time field when creating an index pattern?
I know how to do this with the _timestamp field by editing the metaFields in the settings, but I would like to use a custom field instead.
E.g., let's say this is the document I am storing in ES:
{
  "_id": "AVCbqgiV7A6BIPyJuJRS",
  "_index": "scm-get-config-stg",
  "_score": 1.0,
  "_source": {
    "serverDetails": {
      "cloudDC": "xxx",
      "cloudName": "yyyy",
      "hostName": "hostname",
      "ipAddress": "10.247.194.49",
      "runOnEnv": "stg",
      "serverTimestamp": 1445720623246
    }
  },
  "_type": "telemetry"
}
Now I would like to create an index pattern where the Time-field name should be serverTimestamp.
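One possible approach (a sketch, assuming Elasticsearch 2.x or later, where the epoch_millis date format exists) is to map serverTimestamp as a date so that Kibana lists it in the Time-field name dropdown:

curl -XPUT 'http://localhost:9200/scm-get-config-stg/_mapping/telemetry' -d '
{
  "properties": {
    "serverDetails": {
      "properties": {
        "serverTimestamp": {
          "type": "date",
          "format": "epoch_millis"
        }
      }
    }
  }
}'

If serverTimestamp is already dynamically mapped as a long, this mapping change will be rejected; in that case the data would need to be reindexed into a new index created with this mapping.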
