Show logs stored in Elasticsearch on Grafana

I use Elasticsearch and Graylog to show and analyse logs. This solution works well, but I want to replace Graylog with Grafana. I see that Grafana can produce a lot of great graphs, but I haven't found any way to show logs on it.
I want to collect syslogs and auth logs, as well as system metrics, then parse and store them in Elasticsearch, and finally show both of them, logs and metrics, on Grafana.
If you have a suggestion or a solution for that, please help me.

Related

Need to drop the commented lines and some lines matching the specified string

I have installed ELK on one server and Filebeat on another server where the logs reside. My logs are shipped and I am able to view them in Kibana. But I don't want the commented lines, or lines containing certain text, to be displayed in Kibana. So I used drop_event and exclude_lines in Filebeat, and I even used the drop filter in Logstash, but I don't see them reflected in the Kibana dashboard. Can anyone help with this?
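For reference, a minimal Filebeat sketch of both approaches mentioned above (the log path and patterns here are hypothetical placeholders; adjust them to your files):

```yaml
filebeat.inputs:
  - type: log
    paths:
      - /var/log/myapp/*.log        # hypothetical path to your logs
    # Lines matching any of these regexes are dropped before shipping:
    # '^#' drops commented lines, 'DEBUG' drops lines containing that string
    exclude_lines: ['^#', 'DEBUG']

processors:
  # Alternative: drop whole events after they are parsed
  - drop_event:
      when:
        regexp:
          message: '^#'
```

One common gotcha: these settings only affect events shipped after the change. Documents already indexed in Elasticsearch stay there, so a Kibana dashboard can still show old commented lines until they age out of your time range or you delete/reindex the old data.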

ELK with Grafana instead of Kibana for centralized log

When it comes to centralized log tools, I see a lot of comparisons of ELK vs EFK vs Loki vs others.
But I have a hard time finding information about "ELG": ELK (or EFK) but with Grafana instead of Kibana.
I know Grafana can use Elasticsearch as a datasource, so it should technically work. But how good is it? Are there any drawbacks compared to using Kibana? Maybe there are more existing dashboards for Kibana than for Grafana when it comes to logs?
I am asking because I would like to have one UI system for both my metrics dashboards and my log dashboards.
Kibana is part of the stack, so it is deeply integrated with Elasticsearch: you get a lot of pre-built dashboards and apps inside Kibana, like SIEM and Observability. If you use Filebeat, Metricbeat or any other Beat to collect data, you get ready-made dashboards for a lot of systems, services and devices, so it is pretty easy to visualize your data without much work; basically you just need to follow the documentation.
But if you have data that doesn't fit one of the pre-built dashboards, or you want more flexibility to create your own dashboards, Kibana needs more work than Grafana. Kibana also only works with Elasticsearch, so if you have other datasources you would need to put that data into Elasticsearch first. On the other hand, if you want map visualizations, the Kibana Maps app is pretty good.
The Grafana plugin for Elasticsearch has some small bugs, but overall it works fine, and things will probably improve since Elastic and Grafana have made a partnership to improve the plugin.
So, if all your data is in Elasticsearch, use Kibana; if you have different datasources, use Grafana.
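If you go the Grafana route, wiring Elasticsearch in is a small provisioning file. A sketch, assuming Elasticsearch on localhost:9200 and a Filebeat index pattern (the datasource name is a placeholder, and field names vary a bit between Grafana versions: older releases put the index pattern in a top-level `database` field instead of `jsonData.index`):

```yaml
# /etc/grafana/provisioning/datasources/elasticsearch.yaml
apiVersion: 1

datasources:
  - name: Elasticsearch-logs          # hypothetical datasource name
    type: elasticsearch
    access: proxy
    url: http://localhost:9200        # assumed Elasticsearch address
    jsonData:
      index: "filebeat-*"             # assumed index pattern
      timeField: "@timestamp"
```

With the datasource in place, Grafana's Explore view and the Logs panel can display raw log lines from Elasticsearch alongside your metric graphs.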

Showing crashed/terminated pod logs on Kibana

I am currently working on the ELK setup for my Kubernetes clusters. I set up logging for all the pods and fortunately, it's working fine.
Now I want to push all terminated/crashed pod logs (which we get via kubectl describe, but not as docker logs) to my Kibana instance as well.
I checked on my server for those logs, but they don't seem to be stored anywhere on my machine. (inside /var/log/)
Maybe it's not enabled, or I might not be aware of where to find them.
If these logs are available in a log file similar to the system log then I think it would be very easy to put them on Kibana.
It would be a great help if anyone can help me achieve this.
You need to use kube-state-metrics, from which you can get all pod-related metrics. You can configure kube-state-metrics to feed Elasticsearch; it will create an index for each kind of metric. Then you can easily use that index to display your charts/graphs in the Kibana UI.
https://github.com/kubernetes/kube-state-metrics
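kube-state-metrics exposes its data in Prometheus format, so one way to get it into Elasticsearch is to scrape it with Metricbeat's prometheus module. A sketch, assuming kube-state-metrics is reachable at the usual in-cluster service address (both hostnames below are assumptions for your environment):

```yaml
# metricbeat.yml sketch
metricbeat.modules:
  - module: prometheus
    period: 30s
    # Assumed kube-state-metrics service address; adjust to your cluster
    hosts: ["kube-state-metrics.kube-system.svc:8080"]
    metrics_path: /metrics

output.elasticsearch:
  hosts: ["http://elasticsearch:9200"]   # assumed Elasticsearch address
```

Pod termination reasons (e.g. OOMKilled, Error) show up as kube-state-metrics series, so once indexed you can chart crashed/terminated pods in Kibana even though their logs were never written under /var/log/.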

Getting data from Jira to Elasticsearch

What is the best way to get information about creation and closing of issues in Jira into Elasticsearch? I want to visualize the average resolution time for our issues in Kibana.
Any advice is welcome!
You might want to take a look at this GitHub project, which claims to do what you are looking for. I haven't tested it yet, but it is the closest to your request.
https://github.com/DaGrisa/agile-metrics/
Look at this page:
https://ilaesolution.atlassian.net/wiki/spaces/ELA/pages/31883454/Elastic+Log+For+Jira
There is a Jira plugin called Elastic Log. You can configure it in your Jira instance and the information will be pushed to Elasticsearch. Later you can create visualizations and dashboards in Kibana.
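If you would rather avoid a plugin, you can pull resolved issues from Jira's REST search API yourself and index one document per issue with a precomputed resolution time, which Kibana can then average. A minimal standard-library sketch; `JIRA_URL`, `ES_URL`, and the `jira-issues` index name are hypothetical placeholders, and authentication and error handling are omitted:

```python
import datetime as dt
import json
import urllib.parse
import urllib.request

JIRA_URL = "https://your-company.atlassian.net"  # hypothetical Jira instance
ES_URL = "http://localhost:9200"                 # assumed Elasticsearch address

def resolution_seconds(created: str, resolved: str) -> float:
    """Resolution time from Jira's 'created'/'resolutiondate' timestamps."""
    fmt = "%Y-%m-%dT%H:%M:%S.%f%z"  # Jira's default timestamp format
    return (dt.datetime.strptime(resolved, fmt)
            - dt.datetime.strptime(created, fmt)).total_seconds()

def fetch_resolved_issues(jql: str = "resolution is not EMPTY"):
    """Page through Jira's search API, yielding (key, created, resolutiondate)."""
    start = 0
    while True:
        query = urllib.parse.urlencode({
            "jql": jql,
            "fields": "created,resolutiondate",
            "startAt": start,
        })
        with urllib.request.urlopen(f"{JIRA_URL}/rest/api/2/search?{query}") as resp:
            data = json.load(resp)
        issues = data["issues"]
        if not issues:
            break
        for issue in issues:
            f = issue["fields"]
            yield issue["key"], f["created"], f["resolutiondate"]
        start += len(issues)
        if start >= data["total"]:
            break

def index_issue(key: str, created: str, resolved: str) -> None:
    """Index one document so Kibana can average 'resolution_seconds'."""
    doc = json.dumps({
        "created": created,
        "resolved": resolved,
        "resolution_seconds": resolution_seconds(created, resolved),
    }).encode()
    req = urllib.request.Request(
        f"{ES_URL}/jira-issues/_doc/{key}", data=doc, method="PUT",
        headers={"Content-Type": "application/json"})
    urllib.request.urlopen(req)
```

Run on a schedule (cron), this keeps the index current; in Kibana an average aggregation on `resolution_seconds` then gives you the visualization you want.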

How to Analyze logs from multiple sources in ELK

I have started working on ELK recently and have a question about handling multiple types of logs.
I have two sets of logs on my server that I want to analyse, one from my android application and the other from my website. I have successfully transferred logs from this server via filebeat to the ELK server.
I have created two filters for the two types of logs and have successfully imported these logs into Logstash and then Kibana.
This link helped me do the above:
https://www.digitalocean.com/community/tutorials/how-to-install-elasticsearch-logstash-and-kibana-elk-stack-on-centos-7
The above link says to use the logs in the filebeat index in Kibana and start analysing (which I successfully did for one type of log). But the problem I am facing is that since these two kinds of logs are very different, they need to be analysed differently. How do I do this in Kibana? Should I create multiple Filebeat indices and import them, should it be just one single index, or is there some other way? I am not very clear on this (I could not find much documentation), so I would ask you to please help and guide me here.
Elasticsearch organizes data by index and type. Elastic used to compare these to SQL concepts, but now offers a new explanation.
Since you say that the logs are very different, Elastic's guidance is that you should use different indices.
In Kibana, a visualization is tied to an index. If you have one panel built from each index, you can show them both on the same dashboard.
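One common way to split the two log types into separate indices is to tag each Filebeat input with a custom field (a hypothetical `log_type` here) and branch on it in Logstash. A sketch, with placeholder paths and hostnames:

```conf
# logstash.conf sketch — assumes Filebeat sets fields.log_type
# ("android" or "website") on each of its two inputs
input {
  beats { port => 5044 }
}

filter {
  if [fields][log_type] == "android" {
    # android-specific grok/parsing goes here
  } else if [fields][log_type] == "website" {
    # website-specific parsing goes here
  }
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]                      # assumed ES address
    index => "%{[fields][log_type]}-%{+YYYY.MM.dd}"  # one index per log type
  }
}
```

On the Filebeat side, each input sets the field via `fields: {log_type: android}` (and likewise `website`). You then create one index pattern per type in Kibana and analyse each with its own visualizations.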
