Filebeat unable to send data to Logstash, which results in empty data in Elasticsearch & Kibana

I am trying to deploy the ELK stack on the OpenShift platform (OKD v3.11), using Filebeat to automatically pick up the logs.
The Kibana dashboard is up and the Elasticsearch & Logstash APIs are working fine, but Filebeat is not sending data to Logstash: I do not see any data arriving at the Logstash listener on port 5044.
I found on the Elastic forums that the following iptables command should resolve my issue, but no luck:
iptables -A OUTPUT -t mangle -p tcp --dport 5044 -j MARK --set-mark 10
Still nothing arrives at the Logstash listener. Please help me if I am missing anything, and let me know if you need any more information.
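For reference, a basic connectivity check from the Filebeat pod to the Logstash beats port (just a sketch; it assumes the Logstash service is reachable under the name logstash and that nc exists in the image) would be:
nc -vz logstash 5044
If that fails, the problem is DNS/service/network rather than anything Filebeat-specific.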
NOTE:
The filebeat.yml, logstash.yml & logstash.conf files work perfectly when deployed on plain Kubernetes.

The steps I have followed to debug this issue are:
Check if Kibana is coming up,
Check if the Elasticsearch APIs are working,
Check if Logstash is accessible from Filebeat.
Everything was working fine in my case. I then raised the log level in filebeat.yml and found a "Permission Denied" error while Filebeat was accessing the Docker container logs under the "/var/lib/docker/containers//" folder.
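The log-level change is just the standard Filebeat logging settings; roughly, the following was added to filebeat.yml (a minimal sketch, nothing else changed):
logging.level: debug
logging.selectors: ["*"]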
I fixed the issue by setting SELinux to permissive mode with the following command:
sudo setenforce Permissive
After this, the ELK stack started to receive the logs.
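Note that setenforce only changes the mode until the next reboot; to make it persistent (an extra step, not strictly part of the fix above) the usual approach is to edit /etc/selinux/config as well:
# confirm the current mode
getenforce
# make permissive mode survive a reboot
sudo sed -i 's/^SELINUX=enforcing/SELINUX=permissive/' /etc/selinux/config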

Related

Is it possible to send logs from two different machines to Elasticsearch without Logstash?

I have installed Elasticsearch, Kibana and Auditbeat on an Ubuntu machine, so I'm monitoring the log events on that machine. I also installed Winlogbeat on a Windows machine to monitor it as well, and I configured it to send the logs to the Elasticsearch on the Ubuntu machine.
This is the configuration of the winlogbeat.yml.
But when I try to run Winlogbeat, I get the following error while it is trying to connect to Kibana on the Ubuntu machine.
On the Ubuntu machine, Kibana, Elasticsearch and Auditbeat work properly.
This is the configuration of the elasticsearch.yml:
And this is the kibana.yml configuration:
I just modified the kibana.yml file to allow connections from remote hosts:
server.host: "0.0.0.0"
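For anyone hitting the same thing, the Winlogbeat side typically only needs the Kibana and Elasticsearch endpoints pointed at the remote machine; a minimal sketch, where 192.168.1.10 is a placeholder for the Ubuntu machine's address:
setup.kibana:
  host: "http://192.168.1.10:5601"
output.elasticsearch:
  hosts: ["http://192.168.1.10:9200"]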

"Attempting to reconnect to backoff(elasticsearch(http://localhost:9200)) with 3 reconnect attempt(s)" error appears

I am running Filebeat, Elasticsearch and Kibana to collect Nginx logs from my local machine. I am connecting Filebeat directly to Elasticsearch in the Filebeat configuration, but as soon as I start Filebeat it shows errors like "pipeline/output.go:145 Attempting to reconnect to backoff(elasticsearch(http://localhost:9200)) with 3 reconnect attempt(s)" and no logs are received by Kibana.
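That error usually indicates Filebeat cannot reach Elasticsearch at the configured address. A minimal sanity check, and the kind of output block involved (a sketch, assuming the default local install on port 9200):
curl http://localhost:9200
output.elasticsearch:
  hosts: ["http://localhost:9200"]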

Where are the Elasticsearch server logs on localhost?

My Elasticsearch server is hosted on port 9200.
My application server makes requests to the ES server.
I would like to see the request params and the request URL that hit the Elasticsearch server.
Where can I see these?
OS: macOS Mojave
If you are using a Unix-based OS, you should be able to find the Elasticsearch logs in:
/var/log/elasticsearch
I'd also check /var/log/messages, tailing and filtering it for Elasticsearch:
tail -f /var/log/messages | grep elasticsearch
If you are using a Windows system, open the elasticsearch.yml file under the config folder, uncomment the line below, and provide your local path:
path.logs: <local_path>
Save the elasticsearch.yml file and start the server.
Now you can see all the logs under your local path.
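For example (the directory is just a hypothetical placeholder; any writable path works):
path.logs: D:\elasticsearch\logs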

send logs to external elasticsearch from openshift projects

I'm trying to send specific OpenShift project logs to an unsecured external Elasticsearch.
I have tried the solution in https://github.com/richm/docs/releases/tag/20190205142308, but found that it only works when Elasticsearch is secured.
Later I also tried using the elasticsearch plugin by adding it to output-applications.conf.
output-applications.conf:
<match *.*>
  @type elasticsearch
  host xxxxx
  port 9200
  logstash_format true
</match>
All other files are the same as described in https://github.com/richm/docs/releases/tag/20190205142308 under "Application logs from specific namespaces/pods/containers".
I included output-applications.conf in the fluent.conf file.
In the fluentd logs I don't see anything other than the message "[info]: reading config file path="/etc/fluent/fluent.conf"", and the data is not reaching Elasticsearch.
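The only things I can think of checking so far (assuming the fluentd image ships the standard CLI tooling) are whether the elasticsearch output plugin is installed at all and whether the extra config file is really picked up:
fluent-gem list | grep elasticsearch
fluentd --dry-run -c /etc/fluent/fluent.conf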
Can anyone tell how to proceed?

Does fluentd depend on rsyslog?

Still wrapping my head around logging technology. I'm following the fluentd to Graylog2 recipe, but I don't understand this step:
Open /etc/rsyslog.conf and add the following line to the beginning of the file: *.* @127.0.0.1:5140 Then, restart rsyslogd by running sudo /etc/init.d/rsyslog restart.
What's supposed to listen on 127.0.0.1:5140? Is rsyslog a fluentd dependency?
According to Parse Syslog Messages Robustly:
The problem with syslog is that services have a wide range of log format, and no single parser can parse all syslog messages effectively.
rsyslog seems to be the recommended way to forward logs to Fluentd.
Fluentd listens on port 5140 if you enable its syslog input. Changing the line in /etc/rsyslog.conf forwards the traffic from rsyslog to Fluentd.
However, if you don't want to turn on rsyslog, you can just send the traffic straight to port 5140, so Fluentd does not strictly depend on rsyslog.
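For context, "enabling the syslog input" on the Fluentd side is just a small source block like the sketch below (port 5140 matches the rsyslog line above; the tag is up to you):
<source>
  @type syslog
  port 5140
  bind 0.0.0.0
  tag system
</source>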
