I am really new to the ELK stack, so any help will be appreciated.
The idea was to have:
rsyslog server -> redis -> ELK stack
by following this recipe: https://sematext.com/blog/recipe-rsyslog-redis-logstash/
I can see traffic going all the way to Elasticsearch (tcpdump shows it), but I have not been able to debug Elasticsearch itself yet.
If I go to "Stack Monitoring", Logstash is not showing up there; digging deeper, it only says that a "Logstash node has been detected", and nothing more.
The issue was that Kibana does not automatically show logs under Observability/Stream.
At the top of that page there is a link to the settings, where you have to choose
the log index pattern that you have created.
A little unintuitive, considering the massive screaming button for adding integrations.
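For reference, the Logstash end of that recipe boils down to something like the sketch below (the Redis key, hosts and index name are placeholders, not taken from the recipe). Whatever the `index` option produces is what the log index pattern in those settings has to match, here `rsyslog-*`:

```
# Minimal sketch of the Redis -> Elasticsearch leg of the pipeline.
# Host names, the Redis key and the index name are assumptions.
input {
  redis {
    host      => "127.0.0.1"
    data_type => "list"        # rsyslog pushes log lines onto a Redis list
    key       => "rsyslog"
  }
}

output {
  elasticsearch {
    hosts => ["http://localhost:9200"]
    index => "rsyslog-%{+YYYY.MM.dd}"   # point the log index pattern at rsyslog-*
  }
}
```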
I updated Elasticsearch and Kibana last week from 7.6.2 to 7.17.5. In doing so, the Stack Monitoring page within Kibana broke. I was previously using a legacy exporter to send the monitoring data from my Elasticsearch cluster to my Kibana/monitoring cluster.
After diving down many rabbit holes I installed Metricbeat and got it to export the data to the monitoring cluster under .monitoring-es-7-mb-*. To no avail: Kibana still shows an empty Stack Monitoring page.
Does anyone have any idea what I should check to get the monitoring data to show up on the Stack Monitoring page?
I am currently working on the ELK setup for my Kubernetes clusters. I set up logging for all the pods and, fortunately, it's working fine.
Now I also want to push the logs of terminated/crashed pods (the information we get by describing the pod, not the Docker logs) to my Kibana instance.
I checked for those logs on my server, but they don't seem to be stored anywhere on my machine (inside /var/log/).
Maybe it's not enabled, or I may simply not be aware of where to find them.
If these logs were available in a log file similar to the system log, then I think it would be very easy to get them into Kibana.
It would be a great help if anyone can help me achieve this.
You need to use kube-state-metrics, which exposes all pod-related metrics, including the state of terminated/crashed pods. You can then ship those metrics to Elasticsearch (for example with Metricbeat's kubernetes module, which scrapes kube-state-metrics); this will create an index for the different kinds of metrics. You can then easily use that index to display your charts/graphs in the Kibana UI.
https://github.com/kubernetes/kube-state-metrics
There are more than 50 Java applications (they are not microservices, so we don't have to worry about multiple instances of a service). My architect has designed a solution that collects the log files, feeds them into a Kafka topic, from Kafka feeds them into Logstash, and pushes them to Elasticsearch so we can view the logs in Kibana. I am new to Kafka and the ELK stack. Will someone point me in the right direction on how to do this task? I have learnt that Log4j and SLF4J can be configured to push logs to a Kafka topic.
1. How do I consume from Kafka and load the logs into Logstash? Do I have to write a Kafka consumer, or can this be done just by configuration?
2. How will Logstash feed the logs to Elasticsearch?
3. How can I differentiate the logs of all 50 applications? Do I have to create a topic for each and every application?
That is the business problem; now I need step-by-step expert advice. Thanks in advance.
Essentially, what your architect has laid out for you can be divided into two major components based upon their function (at the architecture level):
Log Buffer (Kafka)
Log Ingester (ELK)
[Java Applications] =====> [Kafka] ------> [ELK]
If you study ELK, you may feel that it is sufficient for your solution and that Kafka is surplus. However, Kafka has an important role to play when it comes to scale: when many of your Java applications send logs at once, ELK may become overloaded and break.
To protect ELK from overload, your architect has set up a buffer (Kafka). Kafka receives the logs from the applications and queues them up while ELK is under load. This way you do not break ELK, and you also do not lose logs while ELK is struggling.
Answers to your questions, in the same order:
(1) Logstash has 'input' plugins that can be used to set up a link between Kafka and Logstash, so no custom Kafka consumer is needed. Read up on Logstash and its plugins:
i- Logstash Guide or Reference
ii- Input plugins (scroll down to find the Kafka plugin)
(2) Logstash will feed the received logs to Elasticsearch through its Elasticsearch output plugin. See the Logstash output plugin for Elasticsearch.
(3) I may not be spot on here, but I think you will be able to filter and distinguish the logs at the Logstash level once you receive them from Kafka. You can apply tags or fields to each log message on reception; this additional information can then be used in Elasticsearch and Kibana to tell the applications apart. A combined sketch of (1)-(3) follows below.
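The following Logstash pipeline is a minimal sketch of those three answers put together; the broker addresses, topic name, JSON codec, field name and index layout are assumptions, not something prescribed by your architect's design:

```
# Sketch: consume application logs from Kafka and index them in Elasticsearch.
input {
  kafka {
    bootstrap_servers => "kafka1:9092,kafka2:9092"   # assumed broker list
    topics            => ["app-logs"]                # assumed topic name
    codec             => "json"                      # assumes apps emit JSON
    decorate_events   => true   # keep Kafka metadata (topic, partition, ...)
  }
}

filter {
  mutate {
    # Copy the Kafka topic into a regular field so Elasticsearch/Kibana
    # can tell the applications apart (assumes one topic per application).
    add_field => { "application" => "%{[@metadata][kafka][topic]}" }
  }
}

output {
  elasticsearch {
    hosts => ["http://localhost:9200"]
    # One index per application, rolled daily (just one possible layout).
    index => "logs-%{application}-%{+YYYY.MM.dd}"
  }
}
```

If all applications share a single topic instead, the same idea applies, except the distinguishing field would have to come from the log event itself (for example a field added by the Log4j/SLF4J layout) rather than from the topic name.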
Implementation Steps
As somebody who is new to Kafka and ELK, follow these steps towards your solution:
Step 1: Start by setting up ELK first. Once you do that, you will be able to see how logs are visualized, and it will become clearer what the end solution may look like.
Guide to ELK Stack
Step 2: Set up Kafka to link your application logs to ELK.
Caveats:
You may find that ELK has a fairly steep learning curve. It takes time to understand how each element of the ELK stack works and what its individual configuration options and languages are.
To get a deep understanding of ELK, take the local deployment path and set up ELK on your own system; avoid the hosted/cloud ELK services for that purpose.
Logstash has a Kafka input and an Elasticsearch output, so this is just configuration on the Logstash side. You could differentiate the applications using configuration on the Log4j side (although using multiple topics is another possibility).
I'm new to the ELK stack, so I'm not sure what the problem is. I have a configuration file (see screenshot; it's based on the Elasticsearch tutorial):
Configuration File
Logstash is able to read the logs (it says "Pipeline main started"), but when the configuration file is run, Elasticsearch doesn't react. I can search through the files.
However, when I open Kibana, it says no results were found. I checked and made sure that my time range covers the full day.
Any help would be appreciated!
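For what it's worth, a config based on that tutorial usually has roughly the shape sketched below; the path, index name and the extra stdout output are assumptions for illustration, not the contents of the screenshot:

```
# Illustrative sketch only, not the configuration from the screenshot.
input {
  file {
    path           => "/var/log/myapp/*.log"   # assumed log location
    start_position => "beginning"
    sincedb_path   => "/dev/null"              # re-read files on each run while testing
  }
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "myapp-%{+YYYY.MM.dd}"            # assumed index name
  }
  # Printing events to the console confirms that events actually leave
  # Logstash before looking at Elasticsearch or the Kibana time range.
  stdout { codec => rubydebug }
}
```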
I have started working with ELK recently and have a question about handling multiple types of logs.
I have two sets of logs on my server that I want to analyse, one from my Android application and the other from my website. I have successfully shipped the logs from this server to the ELK server via Filebeat.
I have created two filters, one for each type of log, and have successfully imported these logs into Logstash and then Kibana.
This link helped me do the above:
https://www.digitalocean.com/community/tutorials/how-to-install-elasticsearch-logstash-and-kibana-elk-stack-on-centos-7
The above link says to use the logs from the filebeat index in Kibana and start analysing them (which I successfully did for one type of log). The problem I am facing is that since the two kinds of logs are very different, they need to be analysed differently. How do I do this in Kibana? Should I create multiple Filebeat indexes there and import them, should it be just one single index, or is there some other way? I am not very clear on this (and could not find much documentation), so I would request you to please help and guide me here.
Elasticsearch organizes data by index and type. Elastic used to compare these to SQL concepts, but now offers a different explanation.
Since you say that the logs are very different, Elastic's guidance is to use different indexes.
In Kibana, each visualization is tied to an index. With one panel per index, you can still show both on the same dashboard.
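On the Logstash side, one way to split the two kinds of logs into separate indexes is a conditional on a field set by Filebeat; the field name, port and index names below are assumptions, a sketch rather than a drop-in config:

```
# Sketch: route two different log types from Filebeat to separate indexes.
input {
  beats {
    port => 5044
  }
}

output {
  # Assumes Filebeat adds a custom field such as "log_type" per prospector/input.
  if [fields][log_type] == "android" {
    elasticsearch {
      hosts => ["localhost:9200"]
      index => "android-logs-%{+YYYY.MM.dd}"
    }
  } else {
    elasticsearch {
      hosts => ["localhost:9200"]
      index => "website-logs-%{+YYYY.MM.dd}"
    }
  }
}
```

In Kibana you would then create one index pattern per index (for example android-logs-* and website-logs-*) and build the visualizations for each type of log on top of its own pattern.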