I'm using Kibana on top of Logstash, and I want to filter items in the index to today, or the last 24 hours is fine too.
So apparently this requires me to run a range query against the underlying Elasticsearch engine that would look like:
"range" : {
"timestamp" : {
"gte": "now-24h",
"lte": "now",
}
}
However, I can't put that in the filter box in Kibana 3. What I tried there is a numeric range query, and it doesn't work, but it shows the input box and the idea.
So my question: how can I create a filter that filters the events to a date range in Kibana 3?
Found it, it's in the top menu:
Clicking it generates the range filter, which shows up as the 2nd filter on the left.
I want to maintain the last 2 versions of documents in Elasticsearch.
For example, I created the first version of product123:
PUT /products/_doc/product123
{
    "name" : "toothPaste",
    "price" : 10
}
Then the second update for product123:
PUT /products/_doc/product123
{
    "name" : "toothPaste",
    "price" : 12
}
When I query using the GET API, I get "price": 12 (the current version).
Is it possible to also get "price": 10 (the previous version) from the same index?
The only way to do this in Elasticsearch is to manage it yourself, as updates applied to a document do not retain the previous version.
You could do this using separate documents, as MAZux mentioned above, or you could do it in different fields, e.g. price and previous_price, as sketched below.
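A rough sketch of the second approach: route updates through the Update API with a script that copies the current value into previous_price before overwriting it. The endpoint form below assumes Elasticsearch 7+, and previous_price / new_price are illustrative names, not an established mapping:

POST /products/_update/product123
{
    "script": {
        "source": "ctx._source.previous_price = ctx._source.price; ctx._source.price = params.new_price",
        "params": { "new_price": 12 }
    }
}

A GET of the document would then return both price and previous_price.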
Is there any way in Elasticsearch to set a default date range when the to and from fields are null? That is, whenever to and from are empty, Elasticsearch should perform the search on the basis of a defined default range. I have written a query, but it only works when to and from are defined:
"range": {
"time": {
"from": "2018-01-16T07:05:00",
"to": "2018-01-16T10:59:09",
"include_lower": true,
"include_upper": true
}
}
There is no default date range in Elasticsearch right now.
If you don't provide any date range filter, Elasticsearch will search all records that match your query, across the entire index or alias you point the search at.
My suggestion would be: if you want a default time frame in your filter, you have to set it in your code (i.e. client side). Your program should fill in a time frame, for example the last 30 days, whenever to and from are empty.
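As a minimal sketch, when to and from come in empty the client could substitute a date-math range like this instead (the 30-day window is just an example default):

"range": {
    "time": {
        "gte": "now-30d/d",
        "lte": "now"
    }
}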
Is it possible to exclude results based on the outcome of an aggregation?
In other words, I have aggregated on a term and a whole bunch of results appear in a data table, ordered in descending order by the count. Is it possible to configure Kibana / Elasticsearch to exclude results where the count is 1 or less (where count is an aggregation)?
I realise I can export the raw data from the data table visualization and delete those records manually in a text editor or Excel. But I am trying to convince my organization that Elasticsearch is a cool new thing, and this is one of their first requirements...
You can exclude results from the search by applying a filter. Here is a sample that may be helpful:
"query": {
"bool": {
"filter": {
"range": {
"Your_term": {
"gte": 1
}
}
}
}
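Note that this filter works on field values, not on bucket counts. If the goal is specifically to drop aggregation buckets whose count is 1 or less, the terms aggregation's min_doc_count parameter does that directly. A sketch, where by_term is an arbitrary aggregation name and Your_term is the placeholder field from the sample above:

{
    "aggs": {
        "by_term": {
            "terms": {
                "field": "Your_term",
                "min_doc_count": 2
            }
        }
    }
}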
Edit: I found the answer, see below for Logstash <= 2.0.
Plugin created for Logstash 2.0
Whoever is interested in this with Logstash 2.0 or above: I created a plugin that makes this dead simple.
The GEM is here:
https://rubygems.org/gems/logstash-filter-dateparts
Here is the documentation and source code:
https://github.com/mikebski/logstash-datepart-plugin
I've got a bunch of data in Logstash with an @timestamp spanning a couple of weeks. I have a duration field that is a number field, and I can do a date histogram. But I would like a histogram over hour of day rather than a linear histogram from date x to date y: the x axis should be 0 -> 23 instead of date x -> date y.
I think I can use the JSON Input advanced text input to add a field to the result set which is the hour of day of the @timestamp. The help text says:
"Any JSON formatted properties you add here will be merged with the elasticsearch aggregation definition for this section. For example shard_size on a terms aggregation."
This leads me to believe it can be done, but it gives no examples.
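For reference, what that box accepts is a small JSON fragment that gets merged into the aggregation definition, e.g. the shard_size example from the help text (the value 1000 is just an illustration):

{ "shard_size": 1000 }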
Edited to add:
I have tried setting up an entry in the scripted fields based on the link below, but it will not work like the examples on their blog with 4.1. The following script gives an error when trying to add a field with format number and name test_day_of_week: Integer.parseInt("1234")
The problem looks like the scripting is not very robust. Oddly enough, I want to do exactly what they are doing in the examples (add fields for day of month, day of week, etc...). I can get the field to work if the script is doc['@timestamp'], but I cannot manipulate the timestamp.
The docs say Lucene expressions are allowed, and show some trig and GCD examples for GIS-type stuff, but nothing for dates...
There is this update to the BLOG:
UPDATE: As a security precaution, starting with version 4.0.0-RC1,
Kibana scripted fields default to Lucene Expressions, not Groovy, as
the scripting language. Since Lucene Expressions only support
operations on numerical fields, the example below dealing with date
math does not work in Kibana 4.0.0-RC1+ versions.
There is no suggestion for how to actually do this now. I guess I could go off and enable the Groovy plugin...
Any ideas?
EDIT - THE SOLUTION:
I added a filter using Ruby to do this, and it was pretty simple:
Basically, in a Ruby script you can access event['field'] and you can create new ones. I use the Ruby time methods to create new fields based on the @timestamp of the event.
ruby {
    # Break the event timestamp into calendar parts as separate fields
    code => "ts = event['@timestamp']; event['weekday'] = ts.wday; event['hour'] = ts.hour; event['minute'] = ts.min; event['second'] = ts.sec; event['mday'] = ts.day; event['yday'] = ts.yday; event['month'] = ts.month;"
}
This no longer appears to work in Logstash 1.5.4: the Ruby date elements appear to be unavailable, the filter throws a "rubyexception", and the fields are not added to the Logstash events.
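For what it's worth, on newer Logstash versions (5.x+) the event is accessed through the get/set API and the timestamp needs converting to a Ruby Time first. A hedged, untested sketch along the same lines:

ruby {
    # event.get returns a LogStash::Timestamp; .time converts it to a Ruby Time
    code => "ts = event.get('@timestamp').time; event.set('weekday', ts.wday); event.set('hour', ts.hour); event.set('minute', ts.min);"
}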
I've spent some time searching for a way to recover the functionality we had with Groovy scripted fields, which are no longer available for dynamic scripting, to provide me with fields such as "hourofday", "dayofweek", et cetera. What I've done is add these as Groovy script files directly on the Elasticsearch nodes themselves, like so:
/etc/elasticsearch/scripts/
hourofday.groovy
dayofweek.groovy
weekofyear.groovy
... and so on.
Those script files contain a single line of Groovy, like so:
Integer.parseInt(new Date(doc["@timestamp"].value).format("d"))   (dayofmonth)
Integer.parseInt(new Date(doc["@timestamp"].value).format("u"))   (dayofweek)
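Along the same lines, an hourofday.groovy would presumably use the "H" (0-23 hour) format pattern; this one is my extrapolation rather than from the original set:

Integer.parseInt(new Date(doc["@timestamp"].value).format("H"))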
To reference these in Kibana, first create a new search and save it, or choose one of your existing saved searches, in the "Settings -> Saved Objects -> Searches" page (take a copy of the existing JSON before you change it, just in case). Then modify the query to add "Script Fields", so you get something like this:
{
    "query" : {
        ...
    },
    "script_fields": {
        "minuteofhour": {
            "script_file": "minuteofhour"
        },
        "hourofday": {
            "script_file": "hourofday"
        },
        "dayofweek": {
            "script_file": "dayofweek"
        },
        "dayofmonth": {
            "script_file": "dayofmonth"
        },
        "dayofyear": {
            "script_file": "dayofyear"
        },
        "weekofmonth": {
            "script_file": "weekofmonth"
        },
        "weekofyear": {
            "script_file": "weekofyear"
        },
        "monthofyear": {
            "script_file": "monthofyear"
        }
    }
}
As shown, the "script_fields" line should fall outside the "query" itself, or you will get an error. Also ensure the script files are available to all your Elasticsearch nodes.
Is it possible to set a fixed timespan for a saved visualization or a saved search in Kibana 4?
Scenario:
I want to create one dashboard with 2 visualizations with different time spans.
A metric counting unique users within 10 min (last 10 minutes)
A metric counting today's unique users (from 00.00am until now)
Note that changing the time span on the dashboard does not affect the visualizations. Possible?
You could add a date range query to the saved search you base each visualisation on. E.g., if your timestamp field is called timestamp:
timestamp:[now-6M/M TO now]
where the time range is from 'now' back to '6 months ago', rounding to the start of the month.
Because Kibana also now supports JSON-based query DSL, you could also achieve the same thing by entering this into the search box instead:
{
    "range" : {
        "timestamp" : {
            "gte": "now-6M/M",
            "lte": "now"
        }
    }
}
For more on date range queries see https://www.elastic.co/guide/en/elasticsearch/reference/current/query-dsl-range-query.html#ranges-on-dates
However, changing the dashboard timescale will override this if it's a subset. So if you use the above 6-month range in the saved search but a 3-month range in the dashboard, you'll filter down to 3 months of data.