Alert when percent of total count exceeds threshold - elasticsearch

The background
We need to know when a higher than usual percentage of requests receive responses with status code 500. Not the raw count, the percent of total.
Web request logs are shipped to Elasticsearch and visualized with Grafana.
Using the SingleStat Math plugin, we created a dashboard that displays the percent of all requests whose status code is 500.
Query A - count all where status code is 500
Query B - count all
Query Math: A/B * 100
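Reproduced outside Grafana, those two counts and the math look roughly like this. This is only a sketch in Python against the Elasticsearch count API; the index name (weblogs), the status field (status), and the body-style call of the 7.x Python client are assumptions, not our actual setup.

from elasticsearch import Elasticsearch

es = Elasticsearch("http://localhost:9200")

# Query A - count of requests whose status code is 500 (field name is assumed)
count_500 = es.count(index="weblogs", body={"query": {"term": {"status": 500}}})["count"]

# Query B - count of all requests
count_all = es.count(index="weblogs", body={"query": {"match_all": {}}})["count"]

# Query Math - A / B * 100
percent_500 = (count_500 / count_all * 100) if count_all else 0.0
print(f"{percent_500:.2f}% of requests returned 500")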
We even set the threshold in the widget to turn red when it hits 1%.
That was easy. This, of course, requires someone to watch the dashboard. What we need is an alert.
The problem
How do I create an alert that fires under the same circumstances, i.e. when, over a given period of time, the number of 500s exceeds 1% of the total number of requests?
I understand that alerts only apply to the graph panel, so presumably the answer starts with creating a graph showing the percent of total.
I can add the two queries to the graph as with the SingleStat Math widget, but I don't see how to set the alert reducer to use both, let alone to divide one by the other.
It seems this should be simple: send me an email when that widget turns red. But how do I do this?
What I've tried
Using MetaQueries (type: Arithmetic, expression: A/B), but no data appears.
Using Dashboard as a datasource, hoping to pull the value from the SingleStat Math widget. This appears to pull only the queries, not the calculation.

#biscuit314 You can calculate the percentage in the graph panel using the MetaQueries plugin. I have verified this; it works. You just need to get the syntax of the percentage calculation right, and the query that uses the MetaQueries datasource needs to sit below the queries it references. That should make it work.
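Another way to end up with a single series an alert rule can reduce - separate from the MetaQueries setup above - is to let Elasticsearch compute the ratio itself with a bucket_script pipeline aggregation. A sketch, again with assumed index and field names:

from elasticsearch import Elasticsearch

es = Elasticsearch("http://localhost:9200")

# One bucket per minute; inside each bucket, count the 500s and divide by the
# bucket's total document count to get a percentage per interval.
body = {
    "size": 0,
    "aggs": {
        "per_minute": {
            "date_histogram": {"field": "@timestamp", "fixed_interval": "1m"},
            "aggs": {
                "errors": {"filter": {"term": {"status": 500}}},
                "error_pct": {
                    "bucket_script": {
                        "buckets_path": {"errors": "errors>_count", "total": "_count"},
                        "script": "params.errors * 100.0 / params.total",
                    }
                },
            },
        }
    },
}

resp = es.search(index="weblogs", body=body)
for bucket in resp["aggregations"]["per_minute"]["buckets"]:
    print(bucket["key_as_string"], bucket["error_pct"]["value"])

If I remember right, Grafana's Elasticsearch data source can build the same thing through its Bucket Script metric type, which leaves the alert rule with a single series to apply a threshold to.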

Related

How to perform a "from" in Elasticsearch scroll context?

I have a large dataset to query and display on a website in a table.
I built a pagination system on top of a scroll, but I can only display a maximum of 100 items at a time, so I run into trouble when I want to display page 200 and beyond: I have to scroll all the way to that page, and it takes too long.
I have checked other parts of my code and didn't find any other performance issues; it's just the scroll queries that make my API calls so slow. I tried raising the request size from 100 to 10,000, but it doesn't change anything.
I don't think sliced scroll can be the solution, or else I didn't understand the feature.
I'm desperately searching for a way to skip the scroll queries that come before the data I'm looking for, even if it's not a precise method.
Hoping someone has a solution, or at least a clue.
Edit:
More details about what I'm trying to achieve.
I log some of my users' actions, such as calls, in Elasticsearch indexes. They perform millions of actions per month, so Elasticsearch seems like a good option for storing them, given that I never have to update them once they are stored.
I'm creating a page where my users can search for actions they've performed, but they build the "query" themselves. I mean they can select the period and many other parameters, order by many parameters, etc. The number of results can be 1 or 100,000 items, but I can't show 100,000 items on my page for UI reasons, so I have to manage pagination and send only part of the results to the page.
For now I do this with a scroll query with a size of 1,000, and I scroll until I reach the current page of my pagination. I tried varying the size, but it's not really conclusive because I can't know the number of results before the query is made.
And the deeper my users go into the pagination, the longer the query takes.
I could raise index.max_result_window to an unreachably high number (though I don't know what that implies), make a simple query with a from, and keep a second scroll query for the export case, but I wonder if there is a way to skip some steps in a scroll when I know I'm going to take 100 items after the 1,000,000th one?
Edit: I looked at how Google designs its pagination and noticed that if you want to go deep into the search results, you can't, unless you go step by step. You can't jump directly to the 500th page.
This is how I did mine.
So I just redesigned my pagination to do the same as Google and force my users to use more precise filters to get fewer results. Thank you #Val for getting me to ask the right questions :)
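For the record, that step-by-step design maps naturally onto search_after: each page request carries the sort values of the last hit of the previous page, so Elasticsearch never has to walk through the earlier pages. A sketch in Python; the index name, sort field, and tie-breaker field are assumptions:

from elasticsearch import Elasticsearch

es = Elasticsearch("http://localhost:9200")

PAGE_SIZE = 100

def fetch_page(user_query, after=None):
    # "after" is the "sort" array of the last hit of the previous page (None for page 1).
    body = {
        "size": PAGE_SIZE,
        "query": user_query,
        # The sort must be total: a unique tie-breaker field ("actionId" here) is assumed.
        "sort": [{"@timestamp": "desc"}, {"actionId": "asc"}],
    }
    if after is not None:
        body["search_after"] = after
    resp = es.search(index="user-actions", body=body)
    hits = resp["hits"]["hits"]
    next_cursor = hits[-1]["sort"] if hits else None
    return hits, next_cursor

# Page 1, then page 2 from the cursor that page 1 returned.
page1, cursor = fetch_page({"match_all": {}})
page2, cursor = fetch_page({"match_all": {}}, after=cursor)

The trade-off is exactly the Google-style one: you can only move to a page whose predecessor you have already seen.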

Netsuite Formula Calcs

So I was able to create a search that shows historical unit rates at the item level; I can then filter by customer to show any price increases that have taken place, to build a price tracker. It works as advertised. Below is a screenshot of the output. "Base Price" is the minimum item rate over the item's lifespan since we transitioned to NetSuite. Then, for 1-6 months back, it pulls the item rate, and as you can see, around the 3rd month it increases and shows the change.
Here is how the search looks when executed
What I am attempting to do now is subtract the formula values from the "Base Price", because at the end of the day that is the total impact value. Eventually I want to bring in quantity so we can see the total impact of these changes and track whether we are seeing an increase or not.
Ex.: If the base is $2 and we sell 20 a month, that's $40 in sales. Now we raise it to $3, so sales would be $60. But we want to show the $20 of increase impact instead.
Below is the Results tab setup used to generate the above.
Here is the view from the Results Tab
Is there a way to create that calculation somehow in NetSuite? I am almost thinking the issue is that I used DECODE instead of CASE WHEN?
Thank you
You can generally combine supported functions; you just need to make sure that any functions applied in the Summary or Function columns are replicated in the formula. For example, your first column "Item Rate" has a Summary function of "Minimum" applied - this needs to be included in the formula, something like:
DECODE(...) - MIN({rate})

Fixed time range for a grafana panel

I'm using several time series panels on my dashboards, which are showing the values based on the selected time range.
Now I want to add a gauge panel that should show the number of payment transactions since midnight (I know about all the problems with server restarts, the behaviour of the rate function, etc.; the gauge panel will still be an interesting part of my dashboard). So my query has to be independent of the time range selected for the dashboard.
I've found the variables ${__from} and ${__to} in the Grafana docs, but I'm not sure how I can use them in a query, or how to use them flexibly, e.g. taking the current date/time as "to" and calculating "from" as the midnight value.
Does anybody have an idea whether this is generally possible?
Thanks in advance
Matthias

REST API - Retrieve previous query in dynamoDB

I have 100 rows of data in DynamoDB and an API with the path api/get/{number}.
When I pass number=1, the API should return the first 10 values; when I pass number=2, it should return the next 10 values. I did something like this with a query, lastEvaluatedKey, and a sort on createdOn. The problem is that if the user passes number=10 after number=2, the lastEvaluatedKey is still that of page 2, so the result would be the data of page 3. How can I get the data directly? Also, if the user goes from number=3 back to number=1, the data returned will still not be that of page 1.
I am using this to make API calls for a pagination control in HTML.
I am using Java 1.8 and aws-java-sdk-dynamodb.
Non-sequential pagination in DynamoDB is tough - you have to design your data model around it, if it's an operation that needs to be efficient at all times. For a recommendation in your specific case I'd need more details about the data and access patterns.
In general you have the option of setting the ExclusiveStartKey attribute in the query call, which is similar to an offset in relational databases, but only similar and not identical. The ExclusiveStartKey is the key after which the query will continue, meaning data from your table and not just a number.
That means you usually can't guess it, unless it's a sequential number - which isn't ideal.
For sequential pagination, i.e. the user goes from page 1 to page 2, page 2 to page 3, etc., you can pass that key along in the request as a token, but that won't work if the user moves in the other direction (page 3 to page 2) or just randomly navigates to page 14.
In your case you only have a limited amount of data - 100 items - so my suggestion for your specific case would be to query all items, limit the number of items in the response to n * 10, where n is the result page, and then return the last 10 items from that result to your client.
This is a solution that would get expensive at scale (time + cost), but fortunately not many people use pagination to go to page 7 or 8 (you could bury a body on page 2 of the Google search results).
Yan Cui has written an interesting post on this problem on Hackernoon, you might want to check it out.
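A sketch of that n * 10 approach, in Python with boto3 rather than the Java SDK just to show the shape of it; the table, partition key, and sort key names are made up:

import boto3
from boto3.dynamodb.conditions import Key

dynamodb = boto3.resource("dynamodb")
table = dynamodb.Table("UserActions")  # hypothetical table name

PAGE_SIZE = 10

def get_page(user_id, page_number):
    # Read the first page_number * 10 items in one query, then keep only the last 10.
    # With ~100 small items the 1 MB response limit won't be hit; otherwise you would
    # have to loop on LastEvaluatedKey.
    resp = table.query(
        KeyConditionExpression=Key("userId").eq(user_id),  # partition key is assumed
        ScanIndexForward=False,           # newest first, assuming createdOn is the sort key
        Limit=page_number * PAGE_SIZE,    # DynamoDB stops reading after this many items
    )
    items = resp["Items"]
    return items[(page_number - 1) * PAGE_SIZE:]

At 100 items this is cheap; the cost grows linearly with the page number, which is exactly the trade-off described above.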

elasticsearch query to display lag time between two datetime fields

I have set up a dashboard that displays incoming message rates in a graph and filtered terms in a pie chart, all without really having to do anything. I want to be able to add a table, graph, or some other type of panel that displays the elapsed time between two datetime fields that appear in my messages:
{"pubDate":"2014-02-27T07:11:44",
"fetchDate":"2014-02-27T07:11:55",
"client":"ME",
"query":"(work OR my job boring OR mundane OR stodgy OR interesting
OR sucks OR exciting OR fun)",
"lang":"en"}
What I am able to get now is a language-filtered graph and a query-filtered pie chart, along with the normal incoming rate graph. I want to create a graph/table that gives the difference between pubDate and fetchDate. Is this possible?
Also related to the same project: the query field does not work correctly, in that Elasticsearch seems to break long queries into individual terms, and I wonder whether it interprets the OR itself or just splits based on length.
Any advice appreciated.
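One way to get that difference is to have Elasticsearch compute it per document with a script field and then chart or aggregate the result. A sketch in Python; the index name is an assumption, and the exact date accessor in the script depends on the Elasticsearch version:

from elasticsearch import Elasticsearch

es = Elasticsearch("http://localhost:9200")

body = {
    "query": {"match_all": {}},
    "_source": ["pubDate", "fetchDate"],
    # Compute fetchDate - pubDate in milliseconds for every hit.
    "script_fields": {
        "lag_ms": {
            "script": {
                "lang": "painless",
                "source": (
                    "doc['fetchDate'].value.toInstant().toEpochMilli()"
                    " - doc['pubDate'].value.toInstant().toEpochMilli()"
                ),
            }
        }
    },
}

resp = es.search(index="messages", body=body)
for hit in resp["hits"]["hits"]:
    print(hit["_source"]["pubDate"], "->", hit["_source"]["fetchDate"],
          hit["fields"]["lag_ms"][0], "ms")

The same script can also feed an avg aggregation to get the average lag per time bucket.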
