Any way to have Kibana 4 send alerts or take action on specific conditions - elasticsearch

I know Kibana 4 itself does not support alerting or taking action on specific conditions, but I would really like to use Kibana while also having a way to take automatic actions or send alerts if something is not behaving correctly. Are there any solutions or tools that work well together? Currently I am using CloudWatch and the Elasticsearch service in AWS. Would all the alerts and actions have to be set up separately in CloudWatch? Maybe I could have Kibana generate something I could act on, like reading from a queue to which Kibana sends JSON alerts, or something of that nature?

ElastAlert looks like an interesting tool. You can define conditions and get alerts by email, to Slack, etc. From its documentation:

If you have data being written into Elasticsearch in near real time and want to be alerted when that data matches certain patterns, ElastAlert is the tool for you. If you can see it in Kibana, ElastAlert can alert on it.
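For illustration, here is a rough Python sketch of the kind of "frequency" rule ElastAlert evaluates on a schedule: count recent matching documents and alert when a threshold is crossed. ElastAlert itself is configured through YAML rule files, so this is only the idea in code; the endpoint, index name, log-level field, and SMTP details below are placeholder assumptions.

```python
# Frequency-style check, similar in spirit to one ElastAlert rule.
# All names below (index, "level" field, addresses) are placeholders.
import smtplib
from datetime import datetime, timedelta, timezone
from email.message import EmailMessage

from elasticsearch import Elasticsearch  # pip install elasticsearch (8.x client assumed)

es = Elasticsearch("http://localhost:9200")  # assumed endpoint

WINDOW = timedelta(minutes=5)
THRESHOLD = 50  # alert if more than 50 matching docs in the window

def check_errors():
    now = datetime.now(timezone.utc)
    query = {
        "bool": {
            "filter": [
                {"term": {"level": "ERROR"}},  # assumed log-level field
                {"range": {"@timestamp": {
                    "gte": (now - WINDOW).isoformat(),
                    "lte": now.isoformat(),
                }}},
            ]
        }
    }
    # Count documents matching the query in the time window.
    count = es.count(index="logstash-*", query=query)["count"]
    if count > THRESHOLD:
        send_alert(count)

def send_alert(count):
    msg = EmailMessage()
    msg["Subject"] = f"Error spike: {count} errors in the last 5 minutes"
    msg["From"] = "alerts@example.com"   # placeholder addresses
    msg["To"] = "oncall@example.com"
    msg.set_content("Check the Kibana dashboard for details.")
    with smtplib.SMTP("localhost") as smtp:  # assumed local mail relay
        smtp.send_message(msg)

if __name__ == "__main__":
    check_errors()  # e.g. run every minute from cron
```

ElastAlert handles the scheduling, state, and realerting logic for you, which is why it is usually preferable to hand-rolling a poller like this.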

Related

Elasticsearch/Elastic Cloud Alert Creation

I am a newbie to Elastic in general, and currently I am trying to manage our alerts for CPU/disk/memory in Elastic Cloud. I can create the alerts manually just fine, but that takes a huge amount of time, and if we migrate I want to be able to create the alerts in some automated way. In the past I have worked with Azure and created alerts with Az PowerShell and the like, so I am searching for how to automate alert creation for our infrastructure in Elastic Cloud. I went through the documentation for Alerting, but I'm not sure I understand how to use the API to actually do this.
Is there a way to automate, let's say, the creation of CPU alerts for 10 different hosts that we monitor with Elastic? Is using the API the only way, and are there any materials other than the official documentation that can help me achieve this? And am I even on the correct path? Thank you in advance.
Let me share some knowledge of Azure Monitor, where you connect your resources to Azure Monitor and manage alerts there. Alerts can send you an email or call a webhook when some metric (for example, database size or CPU usage) reaches a threshold. There are several ways to create alerts: the Azure Portal, the command-line interface, PowerShell, and the Azure Monitor REST API. Hope it will help you.
You can also automate alerts using an Azure Automation runbook with metric alerts, where the alerts can be automated according to customized dimension values, and once the alert criteria are met it can even send an email.
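On the Elastic side specifically, recent Kibana versions expose an HTTP Alerting API that can be scripted, which is the usual way to create many similar rules at once. Here is a hedged sketch: the deployment URL, credentials, rule_type_id, and params below are placeholders, since the exact shape depends on the rule type you pick. Creating one rule manually and fetching its JSON back through the API is the easiest way to get the real values.

```python
# Hedged sketch: create one alerting rule per host via Kibana's
# Alerting API (POST /api/alerting/rule). Values marked as placeholders
# must be replaced with ones copied from a manually created rule.
import requests  # pip install requests

KIBANA = "https://my-deployment.kb.us-east-1.aws.found.io:9243"  # placeholder URL
AUTH = ("elastic", "changeme")  # placeholder; prefer an API key in practice
HOSTS = [f"host-{i:02d}" for i in range(1, 11)]  # the 10 monitored hosts

for host in HOSTS:
    rule = {
        "name": f"High CPU on {host}",
        "rule_type_id": "metrics.alert.inventory.threshold",  # assumption; check your rule type
        "consumer": "alerts",
        "schedule": {"interval": "1m"},
        "params": {},  # placeholder; copy params from an existing rule's JSON
        "actions": [],  # add email/webhook connectors here
    }
    resp = requests.post(
        f"{KIBANA}/api/alerting/rule",
        json=rule,
        auth=AUTH,
        headers={"kbn-xsrf": "true"},  # required header for Kibana APIs
    )
    resp.raise_for_status()
    print(f"Created rule for {host}: {resp.json()['id']}")
```

So yes, you are on the correct path: the API (or tooling built on it, such as Terraform's Elastic providers) is the standard way to avoid clicking out each alert by hand.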

Whenever Tableau sends a data-driven alert, I want to trigger an external API call

I am using Tableau Server's data-driven alerts. Whenever an alert's threshold is passed, we send an email to users. However, I also want to call an external API from my application (not in Tableau).
I have considered using AWS SNS to trigger a Lambda, but was hoping to see if anyone else has faced a similar use case. It does not seem like Tableau's REST APIs provide enough metadata to handle this use case for data-driven alerts.
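The SNS-to-Lambda route is straightforward to wire up once the alert reaches an SNS topic (for example, by relaying the alert email or a webhook into SNS). A minimal sketch of the Lambda side, assuming the function is subscribed to that topic; the external endpoint and payload shape are placeholders:

```python
# Lambda handler: for each SNS record, forward the alert to an
# external HTTP API. Uses only the standard library, so no Lambda
# layers are needed. The endpoint URL is a placeholder assumption.
import json
import urllib.request

EXTERNAL_API = "https://api.example.com/alerts"  # placeholder endpoint

def handler(event, context):
    for record in event["Records"]:
        message = record["Sns"]["Message"]  # raw SNS message body
        req = urllib.request.Request(
            EXTERNAL_API,
            data=json.dumps({"source": "tableau", "alert": message}).encode(),
            headers={"Content-Type": "application/json"},
            method="POST",
        )
        with urllib.request.urlopen(req) as resp:
            print(f"External API responded with {resp.status}")
```

The harder part, as noted above, is getting rich alert metadata out of Tableau in the first place; whatever you can publish into the SNS message is all the Lambda will have to work with.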

How to integrate Firestore health checks and dashboard metrics with our internal company systems

Context: this is my first use of Firestore. I want to use it to push notification status to our mobile application. I can see that there is a Google Firestore dashboard under the Analytics umbrella. In our company we mainly use three tools for monitoring our applications: Zabbix, Dynatrace, and a certain internal solution based on Elasticsearch. I need to integrate our internal monitoring systems with the metrics resulting from our first Firestore project.
What I am looking for, based on personal assumptions:
1) Maybe there exist some GET endpoints that I can connect to and poll for information, let's say each minute
2) Maybe, following the idea of the Realtime Database pushing events across a long-lived connection, I can code a Spring Boot application that imports the Firebase SDK and connects to some specific Firestore endpoint that will push any events of interest (e.g. delays based on custom logic, or a dead service)
3) Maybe some plugin I can connect straight to a Kafka hosted in our internal datacenter
4) Some plugin to connect from Firestore/Firebase to third-party tools (e.g. Zabbix, Dynatrace, or Elasticsearch)
5) Some dependency I could import in Google Cloud Functions, triggered from the Firestore health-check engine, in order to post data to some internal endpoint
Perhaps there is already some approach universally used for the scenario where you have to connect Firestore to an internal monitoring system. I would highly appreciate it if you could tell me, so I can narrow my Google searches, because I am not finding anything useful.
Please note, comparing monitoring approaches is not part of this question. It is a very solid fact that our company uses internal dashboards and some custom alert triggers. I just mentioned the names above to clarify what I mean by internal monitoring tools. The focus of this question is HOW to IMPORT/INTEGRATE/OBSERVE/CONSUME Firestore monitoring data. Our internal stack is beyond this question.
Here is the official documentation for Cloud Monitoring, which you can use to collect metrics, events, and metadata from Google Cloud Platform products and to create dashboards, charts, and alerts.
Please let me know if you have further questions.
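Concretely, this covers option 1 from the question (polling): Firestore usage metrics are exposed through the Cloud Monitoring API, which you can poll each minute and forward to Zabbix, Dynatrace, or Elasticsearch. A minimal sketch with the google-cloud-monitoring Python client; the project ID is a placeholder, and the metric type shown is, to the best of my knowledge, one of the published Firestore metrics (check the Cloud Monitoring metrics list for the ones you need):

```python
# Poll a Firestore metric from the Cloud Monitoring API over the last
# 5 minutes. Requires: pip install google-cloud-monitoring, plus
# application-default credentials with monitoring.viewer access.
import time

from google.cloud import monitoring_v3

client = monitoring_v3.MetricServiceClient()
project = "projects/my-gcp-project"  # placeholder project ID

now = time.time()
interval = monitoring_v3.TimeInterval({
    "start_time": {"seconds": int(now - 300)},  # 5-minute window
    "end_time": {"seconds": int(now)},
})

results = client.list_time_series(request={
    "name": project,
    # Assumed Firestore metric; swap in whichever metric you monitor.
    "filter": 'metric.type = "firestore.googleapis.com/document/read_count"',
    "interval": interval,
    "view": monitoring_v3.ListTimeSeriesRequest.TimeSeriesView.FULL,
})

for series in results:
    for point in series.points:
        # Forward these values to Zabbix/Dynatrace/Elasticsearch here.
        print(point.interval.end_time, point.value.int64_value)
```

Running a poller like this inside your datacenter also addresses options 3 and 4, since the same loop can publish to Kafka or index directly into your internal Elasticsearch.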

How not to have sensitive data in Elasticsearch?

I'm installing the ELK stack for my company. My cousin uses it for his company too; he's a programmer, so I asked him if he bought X-Pack, and he said no, since the MySQL logs he processes aren't of value. I know that I can buy X-Pack or use Nginx to add authentication, but let's assume that I won't do any authentication, like many Elastic users. I have a couple of questions about that scenario.
So I have Filebeat shipping MySQL logs to Logstash, which feeds them to Elasticsearch, and analytics is done in Kibana.
How do you make sure that no information of value ends up in the logs while still having meaningful analytics? My company develops an ERP and has many companies as customers, so at the very least you'll have the company ID and the user ID in the logs in order to have any meaningful data; isn't this considered sensitive data?
How do you make sure that no unauthorized user sends a POST request to Elasticsearch or accesses Kibana? Do you run them locally, not on the internet?
Do you filter any sensitive data out of the logs before shipping them with Filebeat?
I'm just trying to understand how many users manage to run ELK without authentication while still being able to get meaningful data.
How do you make sure that no information of value ends up in the logs while still having meaningful analytics? My company develops an ERP and has many companies as customers, so at the very least you'll have the company ID and the user ID in the logs in order to have any meaningful data; isn't this considered sensitive data?
If you don't want sensitive data stored in Elasticsearch, you need to filter it out or anonymize it. For example, you can use the Logstash fingerprint filter to create a fingerprint combining the company ID and user ID fields, or you can remove any field with sensitive data from your message.
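For illustration, here is roughly what that anonymization amounts to, sketched in Python rather than Logstash configuration (the secret key and field names are placeholders): the raw IDs are replaced by a stable keyed hash, so dashboards can still count and group per customer without the original IDs ever reaching Elasticsearch.

```python
# Keyed-hash anonymization, the same idea as Logstash's fingerprint
# filter in its HMAC-SHA256 mode. Key and field names are placeholders.
import hashlib
import hmac

SECRET_KEY = b"rotate-me-and-keep-out-of-the-repo"  # placeholder key

def fingerprint(company_id: str, user_id: str) -> str:
    """Stable, keyed hash of the two IDs; same inputs -> same output."""
    payload = f"{company_id}|{user_id}".encode()
    return hmac.new(SECRET_KEY, payload, hashlib.sha256).hexdigest()

def anonymize(event: dict) -> dict:
    # Remove the raw IDs and keep only the fingerprint for analytics.
    event["entity_fp"] = fingerprint(event.pop("company_id"),
                                     event.pop("user_id"))
    return event  # now safe(r) to ship to Elasticsearch

print(anonymize({"company_id": "acme", "user_id": "42", "msg": "login ok"}))
```

The key matters: with a plain unkeyed hash, anyone holding the logs could brute-force small ID spaces back to the originals.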
How do you make sure that no unauthorized user sends a POST request to Elasticsearch or accesses Kibana? Do you run them locally, not on the internet?
Without authentication this is almost impossible. You would need full control over who knows about your Elasticsearch instance and who can access it; if someone besides you has access, they can send requests to your instance. To avoid that, you can use a firewall on your servers and only allow access from specific IPs.
Even if you take some precautions, running an Elasticsearch instance in production without any kind of access control is not recommended and is very risky.
You should use an access control method; it could be X-Pack, NGINX, or a plugin like Search Guard.
Please check out Search Guard (https://search-guard.com/). The basic version (which is sufficient for most use cases and definitely better than nothing) is free and open source (Apache 2 License).
Disclaimer: I work for Search Guard/floragunn GmbH
If you need to grant some access/privileges, you can use Grafana instead of Kibana for free.
For ES access, this is like any DB security: configure your server to allow only certain IPs on ports 9200 and 9300.
You can also look at https://github.com/sscarduzio/elasticsearch-readonlyrest-plugin to secure delete queries (still free).

What is the best way to send email reports from Kibana dashboard?

I've set up an ELK (Elasticsearch, Logstash, and Kibana) stack and created some Kibana dashboard widgets. So far everything has gone amazingly well. Now I want to send daily and weekly emails with the generated reports.
What is the best way to do that? Do I need to install a plugin, or can I send it right from Kibana?
You can use ElastAlert. You will be able to mail a link to the Kibana dashboard showing only the data for the period you want; the time-period parameter in the top right corner will be set automatically in Kibana.
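For context, such a link works because Kibana encodes the dashboard's time range in the _g query parameter of the URL, so you can also construct a time-scoped link yourself. A hedged sketch; the base URL, dashboard ID, and exact URL shape (which varies across Kibana versions) are all assumptions:

```python
# Build a Kibana dashboard URL pinned to the reporting period by
# encoding the time range into the _g parameter. Base URL and
# dashboard ID are placeholders; verify the URL shape against your
# Kibana version by copying a link from the UI.
from datetime import datetime, timedelta, timezone

KIBANA = "https://kibana.example.com"   # placeholder base URL
DASHBOARD_ID = "abc123-dashboard-id"    # placeholder dashboard ID

def dashboard_link(days: int = 1) -> str:
    end = datetime.now(timezone.utc)
    start = end - timedelta(days=days)
    # Kibana's global state uses Rison syntax, not JSON.
    time_range = (f"(time:(from:'{start.isoformat()}',"
                  f"to:'{end.isoformat()}'))")
    return f"{KIBANA}/app/dashboards#/view/{DASHBOARD_ID}?_g={time_range}"

print(dashboard_link(days=7))  # link for a weekly report email
```

Anyone opening the link sees the dashboard with the time picker already set to that window, which is exactly what ElastAlert's Kibana-link alert does for you.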
There are some workarounds, such as PhantomJS, but they are not straightforward to implement. For specific events and Kibana queries there are alerting mechanisms available (Watcher, Logz.io), but I'm guessing you're looking to receive the entire dashboard by email.
There are two out-of-the-box options for sending email reports from a Kibana dashboard:
Skedler, which allows you to schedule and send automated email reports based on your Kibana dashboard or search.
If you have an Elasticsearch license/subscription, you can use the Reporting plugin.
Hope it helps.
You can use Sentinl, which extends Kibana with alerting and reporting functionality to monitor, notify, and report on data-series changes using standard queries, programmable validators, and a variety of configurable actions. Think of it as a free and independent "Watcher" which also has scheduled "Reporting" capabilities (PNG/PDF snapshots).
The greatest thing about Sentinl is that you can easily configure alerts through its native app interface integrated in Kibana.
