Kibana - How to count the number of error logs and the type of error - elasticsearch

I monitor our team project's error logs in Kibana and report on them, like: from yesterday to today there have been 50 errors, 20 of them IP authentication errors and 30 of them host errors... or something like that.
I want to automate this process: count the number of errors and their types and post the results to Slack (similar to Microsoft Teams). I was looking at web scraping with Python to extract those error logs, but it doesn't quite look like what I'm looking for.
How would you go about this?

Build a Watcher for that.
Query your data by timeframe, aggregate by error category to get your counts, schedule the Watcher to fire at whatever frequency you're comfortable with, and send the results directly to Slack (a connector is provided out of the box).
How to do it:
https://www.elastic.co/guide/en/elasticsearch/reference/current/watcher-api-put-watch.html
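For illustration, here is a rough sketch of such a watch created through the Watcher PUT watch API with Python and requests. The index pattern, the log.level and error.category fields, the Slack account name and the channel are all assumptions; swap in whatever matches your own mappings and connector configuration.

import requests

ES = "https://localhost:9200"      # assumed Elasticsearch endpoint
AUTH = ("elastic", "changeme")     # assumed credentials

# Hypothetical daily watch: count the last 24h of errors per category and post to Slack.
watch = {
    "trigger": {"schedule": {"daily": {"at": "08:00"}}},
    "input": {
        "search": {
            "request": {
                "indices": ["app-logs-*"],          # placeholder index pattern
                "body": {
                    "size": 0,
                    "query": {
                        "bool": {
                            "filter": [
                                {"term": {"log.level": "error"}},               # placeholder field
                                {"range": {"@timestamp": {"gte": "now-24h"}}}
                            ]
                        }
                    },
                    "aggs": {
                        "by_category": {"terms": {"field": "error.category"}}   # placeholder field
                    }
                }
            }
        }
    },
    "condition": {"compare": {"ctx.payload.hits.total": {"gt": 0}}},
    "actions": {
        "notify_slack": {
            "slack": {
                "account": "monitoring",            # Slack account configured on the Elasticsearch side
                "message": {
                    "to": ["#errors"],
                    "text": "{{ctx.payload.hits.total}} errors in the last 24h: "
                            "{{#ctx.payload.aggregations.by_category.buckets}}{{key}}={{doc_count}} {{/ctx.payload.aggregations.by_category.buckets}}"
                }
            }
        }
    }
}

resp = requests.put(f"{ES}/_watcher/watch/daily_error_report", json=watch, auth=AUTH)
resp.raise_for_status()
print(resp.json())

One caveat: as far as I know Watcher itself is a paid (Gold+) feature on self-managed clusters, so check the subscription page if you are on the free license.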

Related

Elastic SIEM - alerting and correlation

I was asked to research how a very basic SIEM can be built with the Elastic Stack.
I managed to set up the stack with Elasticsearch, Kibana and Beats, but now: how can I write correlation rules, like: if someone failed to log in 10 times in the last 3 minutes - ALERT. Or if there is unusual port-scanning activity (detect nmap activity) - ALERT. How can this be done, using only free options?
Elastic's free and open license allows the use of detections.
Machine Learning is a paid feature, but correlations (EQL) and normal detections (query rules) can be built. You also get to use the Kibana interface to triage the signals into cases.
https://www.elastic.co/subscriptions
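If you want to prototype the "10 failed logins in 3 minutes" rule outside of the Kibana detection UI first, here is a minimal sketch of the equivalent check with Python and requests. The index pattern and the ECS field names (event.category, event.outcome, user.name) are assumptions about how your Beats data is mapped.

import requests

ES = "https://localhost:9200"      # assumed Elasticsearch endpoint
AUTH = ("elastic", "changeme")     # assumed credentials

# Count authentication failures per user over the last 3 minutes.
query = {
    "size": 0,
    "query": {
        "bool": {
            "filter": [
                {"term": {"event.category": "authentication"}},
                {"term": {"event.outcome": "failure"}},
                {"range": {"@timestamp": {"gte": "now-3m"}}}
            ]
        }
    },
    "aggs": {
        "by_user": {"terms": {"field": "user.name", "size": 100}}
    }
}

resp = requests.post(f"{ES}/logs-*/_search", json=query, auth=AUTH)   # placeholder index pattern
resp.raise_for_status()

for bucket in resp.json()["aggregations"]["by_user"]["buckets"]:
    if bucket["doc_count"] >= 10:
        print(f"ALERT: {bucket['key']} failed to log in {bucket['doc_count']} times in the last 3 minutes")

In the Kibana detection engine the same logic maps onto a threshold rule (group by user.name, threshold 10), and a port-scan rule is usually the same idea applied over destination ports.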

How to permanently save heroku logs?

I see how to view heroku logs live
heroku logs -t
and how to view the last n logs e.g.
heroku logs -n 500
would show the last 500 lines of logs. But n can only be so large, because Heroku simply doesn't store them forever. For example, I tried heroku logs -n 5000 and got about 1,500 lines of logs, but no more.
So suppose I wanted to be able to view logs from farther back, what would I have to do to make them available? Is there some heroku setting/addon I need to implement in order for them to be permanently made available? Or is it the application layer's responsibility to store logs somewhere (e.g. persist them in a database)?
Heroku retains only the last 1,500 log lines, and they are at most 1 week old.
There are several add-ons you can use on Heroku; however, they will all have some limitations (max logs, max days) when using a free plan.
Papertrail is a good solution I can recommend: a nice dashboard for queries and alerting, saved searches, and a free plan that includes the last 7 days of logs. When purchasing a plan (starting from $8 per month) the limitations are relaxed and you can keep logs for up to 1 year.
Not an answer, but just adding some advice I received in other forums:
Papertrail is awesome; it's a log management app (nothing to do with the Ruby gem of the same name).
Some suggest not worrying about collecting all the logs and instead using something like Sentry so that errors are reported, which saves sifting through all the logs.

How to download 300k log lines from my application?

I am running a job on my Heroku app that generates about 300k lines of log within 5 minutes. I need to extract all of them into a file. How can I do this?
The Heroku UI only shows logs in real time, from the moment it was opened, and only keeps 10k lines.
I attached a LogDNA add-on as a drain, but their export also only allows 10k lines. To even have the option of exporting, I need to apply a search filter (I typed 2020 because all the lines start with a date, but still...). I can scroll through all the logs to see them, but as I scroll up, the bottom gets truncated, so I can't even copy-paste them myself.
I then attached Sumo Logic as a drain, which is better, because the export limit is 100k. However, I still need to filter the logs into 30s to 60s intervals and download them separately. Also, it exports to a CSV file in reverse order (newest first, not what I want), so I still have to work on the file after it's downloaded.
Is there no option to get actual raw log files in full?
Is there no option to get actual raw log files in full?
There are no actual raw log files.
Heroku's architecture requires that logging be distributed. By default, its Logplex service aggregates log output from all services into a single stream and makes it available via heroku logs. However,
Logplex is designed for collating and routing log messages, not for storage. It retains the most recent 1,500 lines of your consolidated logs, which expire after 1 week.
For longer persistence you need something else. In addition to commercial logging services like those you mentioned, you have several options:
Log to a database instead of files. Something like Apache Cassandra might be a good fit.
Send your logs to a logging server via Syslog (my preference):
Syslog drains allow you to forward your Heroku logs to an external Syslog server for long-term archiving.
Send your logs to a custom logging process via HTTPS.
Log drains also support messaging via HTTPS. This makes it easy to write your own log-processing logic and run it on a web service (such as another Heroku app).
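For the HTTPS option, here is a minimal sketch of a drain receiver using Flask. The route and file path are placeholders; Logplex POSTs batches of syslog-framed lines, and a real deployment would also want TLS and some authentication on the endpoint.

from flask import Flask, request

app = Flask(__name__)

@app.route("/logs", methods=["POST"])
def receive_logs():
    # Logplex delivers batches of syslog-framed log lines in the request body;
    # append them to a local file here, or forward them to a database / object
    # store in a real deployment.
    with open("drained.log", "ab") as f:
        f.write(request.get_data())
        f.write(b"\n")
    return "", 204

if __name__ == "__main__":
    app.run(port=8080)

You would then attach it with something like heroku drains:add https://your-receiver.example.com/logs -a your-app (URL hypothetical).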
Speaking solely from the Sumo Logic point of view, since that’s the only one I’m familiar with here, you could do this with its Search Job API: https://help.sumologic.com/APIs/Search-Job-API/About-the-Search-Job-API
The Search Job API lets you kick off a search, poll it for status, and then when complete, page through the results (up to 1M records, I believe) and do whatever you want with them, such as dumping them into a CSV file.
But this is only available to trial and Enterprise accounts.
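To sketch what that flow looks like with Python and requests (the endpoint host depends on your Sumo Logic deployment region, and the credentials, query and time range below are placeholders):

import csv
import time
import requests

API = "https://api.sumologic.com/api/v1"     # adjust to your deployment region
AUTH = ("ACCESS_ID", "ACCESS_KEY")           # placeholder credentials
HEADERS = {"Accept": "application/json"}

# 1. Kick off the search job.
job = requests.post(f"{API}/search/jobs", auth=AUTH, headers=HEADERS, json={
    "query": "_sourceCategory=heroku",       # placeholder query
    "from": "2020-06-01T00:00:00",
    "to": "2020-06-01T01:00:00",
    "timeZone": "UTC",
}).json()
job_id = job["id"]

# 2. Poll until the job has gathered all results.
while True:
    status = requests.get(f"{API}/search/jobs/{job_id}", auth=AUTH, headers=HEADERS).json()
    if status["state"] == "DONE GATHERING RESULTS":
        break
    time.sleep(5)

# 3. Page through the raw messages and dump them to a CSV file.
with open("logs.csv", "w", newline="") as f:
    writer = csv.writer(f)
    offset, limit = 0, 10000
    while offset < status["messageCount"]:
        page = requests.get(f"{API}/search/jobs/{job_id}/messages",
                            params={"offset": offset, "limit": limit},
                            auth=AUTH, headers=HEADERS).json()
        for msg in page["messages"]:
            writer.writerow([msg["map"].get("_messagetime"), msg["map"].get("_raw")])
        offset += limit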
I just looked at Heroku's docs, and it does not look like they have a native way to retrieve more than 1,500 lines; you do have to forward those logs via syslog to a separate server/service.
I think your best solution is going to depend on your use case, though, such as why specifically you need these logs in a CSV.

Quick and easy way to see how many hits a heroku app has received?

I can see the logs for a heroku app with heroku logs -t
Is there a way to easily see how many hits an app has received in, say, the past 24 hours? (Preferably using a quick command in the CLI, but otherwise through the Heroku website.)
There are a few ways I see here:
the Heroku dashboard provides you with a Metrics tab, where you can see the throughput of your application.
If this is not exact enough, you can add a logging add-on (Logentries, for example) and then analyze the router logs there. Logentries provides you with counting, grouping, etc.
Same as a logging add-on, but you can also add your own log drain and then analyze the logs yourself :)
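If you just want a very rough number from the CLI without any add-on, here is a minimal sketch that counts the router lines still sitting in Heroku's 1,500-line buffer (which usually covers far less than 24 hours, so the add-on/drain route is the real answer; "your-app" is a placeholder):

import re
import subprocess
from collections import Counter

# Pull whatever is left in Heroku's 1,500-line log buffer.
out = subprocess.run(
    ["heroku", "logs", "--num", "1500", "--app", "your-app"],
    capture_output=True, text=True, check=True,
).stdout

# Router lines look like: ... heroku[router]: at=info method=GET path="/" ...
methods = Counter(
    m.group(1)
    for line in out.splitlines() if "heroku[router]" in line
    for m in re.finditer(r'method=(\w+)', line)
)

print(f"router requests in buffer: {sum(methods.values())}")
for method, count in methods.most_common():
    print(f"{method}: {count}")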

Bing Search API supports multiple request?

I'm trying to run a load test on my page, which uses the Bing Search API for user queries in a search box. I want to know how many searching users my site can handle. To do that I configured a JMeter test, but when I run it, about 90% of the searches fail. Is there any limitation on this API when multiple users search at the same time?
Bing does have API limits; the free tier is 5,000 queries per month. You should be able to see your limits at
https://datamarket.azure.com/dataset/explore/5BA839F1-12CE-4CCE-BF57-A49D98D29A44
To validate this, I created a load test to hit my 5,000-per-month limit and re-create the rate-limit condition.
The error output and a copy of the JMX file are available at Load Test Bing. The test launched on 5 servers, each running 100 users, looping 11 times.
This first test run shows that you can quickly hit the per-minute limit, which you can see in your error responses:
Code(503) Message(The number of requests per minute for the subscription has reached the maximum threshold that is allowed. Please try again after….
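If you stay under the monthly quota but still trip the per-minute ceiling, the usual client-side mitigation is to back off and retry when that 503 comes back. A generic sketch (the URL, parameters and key are placeholders for whatever Bing endpoint and credentials you are calling):

import time
import requests

URL = "https://example.invalid/bing/search"     # placeholder endpoint
PARAMS = {"q": "test query"}                    # placeholder query
HEADERS = {"Authorization": "Bearer YOUR_KEY"}  # placeholder credentials

def search_with_backoff(max_retries=5):
    delay = 1
    for _ in range(max_retries):
        resp = requests.get(URL, params=PARAMS, headers=HEADERS)
        if resp.status_code != 503:
            resp.raise_for_status()   # fail loudly on any other error
            return resp.json()
        # Rate limited: wait, then retry with exponential backoff.
        time.sleep(delay)
        delay *= 2
    raise RuntimeError("still rate limited after retries")

result = search_with_backoff()

That won't raise the limit, of course; it just keeps the load test (or your real traffic) from failing outright the moment the per-minute threshold is hit.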
