Timelion syntax doesn't work - elasticsearch

I'm trying to use Timelion.
When I tried the .es(*) function, I got no results.
I have a dataset in Elasticsearch, and its structure is:
{
  "_index": "test",
  "_type": "testtype",
  "_id": "abcdefg0",
  "_score": 1,
  "_source": {
    "name": "name",
    "gender": "Male",
    "timestamp": "2016-07-26T06:10:56Z",
    "is_foreigner": false
  }
}
All fields are of string type except the timestamp field, which is of date type.
Do I need an additional field? Or do I need to add a numeric field?

I found the solution.
The timestamp field name was wrong.
You must use @timestamp as the timestamp field.
Or, if you want to use a custom field, go to the timelion.json file and change the Timelion configuration:
"es": {
"timefield": "timestamp",
"default_index": "sensor_log",
"allow_url_parameter": false
}
The default value in timelion.json is:
"es": {
"timefield": "#timestamp",
"default_index": "sensor_log",
"allow_url_parameter": false
}
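Alternatively, you can override the time field per expression with the timefield argument of the .es() function, without editing timelion.json at all (a sketch using the index and field names from the question):

```
.es(index=test, timefield=timestamp)
```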
Self-question, self-answer. :)
Anyway, I hope this answer helps somebody.

Related

ElasticSearch - Multiple query on one call (with sub limit)

I have a problem with ElasticSearch, I need you :)
I have an index in which I store my documents. These documents represent either products or categories.
The structure is this:
{
  "_index": "documents-XXXX",
  "_type": "_doc",
  "_id": "cat-31",
  "_score": 1.0,
  "_source": {
    "title": "Category A",
    "type": "category",
    "uniqId": "cat-31",
    [...]
  }
},
{
  "_index": "documents-XXXX",
  "_type": "_doc",
  "_id": "prod-1",
  "_score": 1.0,
  "_source": {
    "title": "Product 1",
    "type": "product",
    "uniqId": "prod-1",
    [...]
  }
},
What I'd like to do, in one call, is get 5 documents whose type is "product" and 2 documents whose type is "category". Do you think that's possible?
That is, two queries in a single call, each with its own limit.
Also, wouldn't it be better to make two different indices, one for the products and one for the categories?
If so, I have the same question: how do I run both queries in a single call?
Thanks in advance
If product and category are different contexts, I would try to separate them into different indices. Is this type field used in all your queries to filter results? E.g., do you always search for a term like xpto only in docs with type product, or do you also search without applying any filter?
About your other question: you can send two queries in one request. The Multi Search API can help with this.
You would get two responses, one for each query.
GET my-index-000001/_msearch
{ }
{ "query": { "term": { "type": { "value": "product" } } }, "size": 5 }
{ "index": "my-index-000001" }
{ "query": { "term": { "type": { "value": "category" } } }, "size": 2 }

Nested attribute term Query

I have documents that look something like below:
{
  "_index": "lines",
  "_type": "lineitems",
  "_id": "4002_11",
  "_score": 2.6288738,
  "_source": {
    "data": {
      "type": "Shirt"
    }
  }
}
I want to get a count based on the type attribute's value. Any suggestion on this?
I tried a term query but had no luck with that.
You should use the terms aggregation; it returns the number of documents in each bucket of the "type" field's values.
https://www.elastic.co/guide/en/elasticsearch/reference/current/search-aggregations-bucket-terms-aggregation.html
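A minimal request for the documents above might look like this (a sketch; data.type.keyword assumes the default dynamic mapping created a keyword sub-field for the string):

```
GET lines/_search
{
  "size": 0,
  "aggs": {
    "types": {
      "terms": { "field": "data.type.keyword" }
    }
  }
}
```

Each bucket in the response carries a doc_count for one distinct type value.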

Elastic filter with dot (.) in name

I'm pretty new to ELK and seem to be starting with the complicated questions ;-)
I have elements that look like the following:
{
  "_index": "asd01",
  "_type": "doc",
  "_id": "...",
  "_score": 0,
  "_source": {
    "@version": "1",
    "my-key": "hello.world.to.everyone",
    "@timestamp": "2018-02-05T13:45:00.000Z",
    "msg": "myval1"
  }
},
{
  "_index": "asd01",
  "_type": "doc",
  "_id": "...",
  "_score": 0,
  "_source": {
    "@version": "1",
    "my-key": "helloworld.from.someone",
    "@timestamp": "2018-02-05T13:44:59.000Z",
    "msg": "myval2"
  }
}
I want to filter for my-key values that start with "hello." and ignore elements that start with "helloworld.". The dot seems to be interpreted as a wildcard, and no kind of escaping seems to work.
I want to do this with a filter so that I can use the same expression in Kibana as well as in the API directly.
Can someone point me to how to get this working with Elasticsearch 6.1.1?
The dot is not being used as a wildcard; it's being removed by the default analyzer (the standard analyzer). If you do not specify a mapping, Elasticsearch will create one for you. For string fields it creates a multi-field: the default is text (analyzed with the standard analyzer) plus a keyword sub-field with the keyword analyzer. If you do not want this behaviour, you must specify the mapping explicitly during index creation, or update it and reindex the data.
Try using this:
GET asd01/_search
{
  "query": {
    "wildcard": {
      "my-key.keyword": {
        "value": "hello.*"
      }
    }
  }
}
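Since the pattern is anchored at the start of the value, a prefix query on the same keyword sub-field would also work and avoids general wildcard matching (a sketch against the same index):

```
GET asd01/_search
{
  "query": {
    "prefix": {
      "my-key.keyword": "hello."
    }
  }
}
```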

ElasticSearch: POST update without Index, Type and Id

Here is one of my documents in elasticsearch:
{
  "_index": "2017-10-21",
  "_type": "cat",
  "_id": "14",
  "_score": 2.2335923,
  "_source": {
    "name": "Biscuit",
    "breed": "Persian",
    "age": "3"
  }
}
I know that it's possible to do a POST update to add a new field to an existing document this way:
POST [index]/[type]/[id]/_update
So for example, if I want to add a new field "hairy" to my document:
POST 2017-10-21/cat/14/_update
{
  "script" : "ctx._source.hairy = 'yes'"
}
I will have this result:
{
  "_index": "2017-10-21",
  "_type": "cat",
  "_id": "14",
  "_source": {
    "name": "Biscuit",
    "breed": "Persian",
    "age": "3",
    "hairy": "yes"
  }
}
However, I would like to add a new field to ALL my documents, no matter their index, type or id. Unfortunately, even after hours of research I haven't found a way to do a POST update without using index, type or id.
So, my questions are: Is it even possible? If it's not, is there another way to do what I want to do?
Thank you in advance for any help you can provide!
I have finally found a solution!
I needed to use POST _update_by_query instead of POST _update:
POST */_update_by_query
{
  "script" : {
    "inline": "ctx._source.hairy = 'yes'"
  }
}
The star (wildcard) means that you want to target all existing indices.
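One caveat: if other writes touch the documents while the update runs, the request can abort on version conflicts. _update_by_query accepts a conflicts=proceed parameter, which counts conflicts instead of aborting:

```
POST */_update_by_query?conflicts=proceed
{
  "script" : {
    "inline": "ctx._source.hairy = 'yes'"
  }
}
```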

Kibana 4 index patterns time-field

Is there a way to make Kibana 4 show a timestamp field, which is in epoch time, as the time field when creating an index pattern?
I know how to do this with the _timestamp field by editing metaFields in the settings, but I would like this to be a custom field.
E.g., let's say this is the document I am storing in ES:
{
  "_id": "AVCbqgiV7A6BIPyJuJRS",
  "_index": "scm-get-config-stg",
  "_score": 1.0,
  "_source": {
    "serverDetails": {
      "cloudDC": "xxx",
      "cloudName": "yyyy",
      "hostName": "hostname",
      "ipAddress": "10.247.194.49",
      "runOnEnv": "stg",
      "serverTimestamp": 1445720623246
    }
  },
  "_type": "telemetry"
}
Now I would like to create an index pattern where the Time-field name should be serverTimestamp.
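One approach is to map the field as a date at index-creation time: Elasticsearch (2.0 and later) supports the epoch_millis date format, and any field mapped as a date appears in Kibana's Time-field name dropdown. A sketch, assuming the index and type names from the document above and a fresh index:

```
PUT scm-get-config-stg
{
  "mappings": {
    "telemetry": {
      "properties": {
        "serverDetails": {
          "properties": {
            "serverTimestamp": {
              "type": "date",
              "format": "epoch_millis"
            }
          }
        }
      }
    }
  }
}
```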
