Can't Connect to Logstash Server from Client using Filebeat - elasticsearch

I'm trying to start Filebeat and send its output to the ELK server, but I get the following error when running /usr/share/filebeat/bin/filebeat -e -c /etc/filebeat/filebeat.yml
Error message:
2021-10-18T11:46:18.575Z ERROR [logstash] logstash/async.go:280 Failed to publish events caused by: write tcp 172.31.20.157:48724->MyPublicIP:5044: write: connection reset by peer
2021-10-18T11:46:18.575Z INFO [publisher] pipeline/retry.go:219 retryer: send unwait signal to consumer
2021-10-18T11:46:18.575Z INFO [publisher] pipeline/retry.go:223 done
2021-10-18T11:46:20.215Z ERROR [publisher_pipeline_output] pipeline/output.go:180 failed to publish events: write tcp 172.31.20.157:48724->MyPublicIP:5044: write: connection reset by peer
2021-10-18T11:46:20.215Z INFO [publisher_pipeline_output] pipeline/output.go:143 Connecting to backoff(async(tcp://MyPublicIP:5044))
2021-10-18T11:46:20.215Z INFO [publisher] pipeline/retry.go:219 retryer: send unwait signal to consumer
2021-10-18T11:46:20.215Z INFO [publisher] pipeline/retry.go:223 done
2021-10-18T11:46:20.216Z INFO [publisher_pipeline_output] pipeline/output.go:151 Connection to backoff(async(tcp://MyPublicIP:5044)) established
2021-10-18T11:46:20.273Z ERROR [logstash] logstash/async.go:280 Failed to publish events caused by: write tcp 172.31.20.157:48726->MyPublicIP:5044: write: connection reset by peer
Here is the filebeat.yml config:
# ------------------------------ Logstash Output -------------------------------
output.logstash:
  # The Logstash hosts
  hosts: ["MyPublicIP:5044"]
And here is the logstash-sample.conf on the ELK server:
input {
  beats {
    port => 5044
    type => "logs"
    ssl => true
    ssl_certificate => "/etc/pki/tls/certs/logstash-forwarder.crt"
    ssl_key => "/etc/pki/tls/private/logstash-forwarder.key"
  }
  file {
    path => "/var/log/*.log"
    type => "syslog"
  }
  file {
    path => "/tmp/logstash.txt"
  }
}
output {
  elasticsearch {
    hosts => ["MyPublicIP:9200"]
    index => "%{[@metadata][beat]}-%{[@metadata][version]}-%{+YYYY.MM.dd}"
    #user => "elastic"
    #password => "changeme"
  }
}
I'm able to telnet to Logstash from the Filebeat client server:
[root@ip-172-x-x-x-system]# telnet MyPublicIP 5044
Trying MyPublicIP...
Connected to MyPublicIP.
Escape character is '^]'.
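One mismatch worth checking: the beats input above has ssl => true, while the Filebeat output has no TLS settings, and a plaintext client talking to a TLS-only input commonly fails with exactly this kind of "connection reset by peer". A minimal sketch of a matching Filebeat output, assuming logstash-forwarder.crt is copied to the client (the client-side path is an assumption):
output.logstash:
  hosts: ["MyPublicIP:5044"]
  # Trust the certificate used by the Logstash beats input
  # (path on the client is an assumption)
  ssl.certificate_authorities: ["/etc/pki/tls/certs/logstash-forwarder.crt"]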

Related

How to make Logstash public over the network

Need your support. I configured ELK and deployed it on a separate server (configuration below). It works inside that server, but I'm trying to access Logstash from other servers on the same network.
Is this possible in a local environment?
Logstash Configuration
# Read input from filebeat by listening to port 5044 on which filebeat will send the data
input {
  beats {
    host => "0.0.0.0"
    port => "5044"
  }
}
output {
  stdout {
    codec => rubydebug
  }
  # Sending properly parsed log events to elasticsearch
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "e-%{+YYYY.MM.dd}"
    document_type => "%{[@metadata][type]}"
  }
}
Filebeat Configuration
filebeat.inputs:
- type: log
  # Change to true to enable this input configuration.
  enabled: true
  # Paths that should be crawled and fetched. Glob based paths.
  paths:
    #- /var/log/*.log
    - E:\IMR-App\imrh\logs\imrh.log

# ------------------------------ Logstash Output -------------------------------
output.logstash:
  # The Logstash hosts
  hosts: ["0.0.0.0:5044"]

why is filebeat unable to connect to logstash

I am trying to send a log4net log to Logstash to be parsed and then end up in Elasticsearch. I have added the ports to the Windows firewall security settings and allowed all connections to both 5044 and 9600.
In the Filebeat log, I get this error:
pipeline/output.go:100 Failed to connect to backoff(async(tcp://[http://hostname:5044]:5044)): lookup http://hostname:5044: no such host
Filebeat.yml (Logstash section)
#----------------------------- Logstash output --------------------------------
output.logstash:
  # The Logstash hosts
  hosts: ["http://hostname:5044"]

  # Optional SSL. By default is off.
  # List of root certificates for HTTPS server verifications
  #ssl.certificate_authorities: ["/etc/pki/root/ca.pem"]

  # Certificate for SSL client authentication
  #ssl.certificate: "/etc/pki/client/cert.pem"

  # Client Certificate Key
  #ssl.key: "/etc/pki/client/cert.key"
#================================ Processors =====================================
Logstash.yml
I have set the http.host to 0.0.0.0
# ------------ Metrics Settings --------------
#
# Bind address for the metrics REST endpoint
#
http.host: "0.0.0.0"
#
# Bind port for the metrics REST endpoint, this option also accept a range
# (9600-9700) and logstash will pick up the first available ports.
#
# http.port: 9600-9700
Logstash Filter Config
input {
  beats {
    port => "5044"
  }
}
filter {
  if [type] == "log4net" {
    grok {
      match => [ "message", "%{TIMESTAMP_ISO8601:timestamp} \[%{NUMBER:threadid}\] %{WORD:level}\s*%{DATA:class} \[%{DATA:NDC}\]\s+-\s+%{GREEDYDATA:message}" ]
    }
    date {
      match => ["timestamp", "MMM d HH:mm:ss", "MMM dd HH:mm:ss"]
      remove_field => ["timestamp"]
    }
    mutate {
      update => {
        "type" => "log4net-logs"
      }
    }
  }
}
output {
  elasticsearch {
    hosts => ["http://hostname:9200"]
    index => "%{[@metadata][beat]}-%{[@metadata][version]}-%{+YYYY.MM.dd}"
    #user => "elastic"
    #password => "changeme"
  }
}
You can try using hostname:
hosts: ["hostname:5044"]
As mentioned by @Adrian Dr, try using:
hosts: ["hostname:5044"]
But also bind logstash to a single port:
http.port: 9600
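To confirm that Logstash itself is running before debugging the Beats side, one can also query the monitoring REST endpoint that http.host/http.port configure (hostname is the same placeholder used in the question):
# Should return Logstash node info as JSON if the service is up
curl http://hostname:9600/?pretty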
I had the same error. It's because you include the protocol.
You have to remove 'http://' from the hosts field.
hosts: ["somename.com:5044"]
or ip
hosts: ["10.10.10.1:5044"]

Consul proxy failed to dial: dial tcp 127.0.0.1:0: connect: connection refused

I'm trying to run a Consul Connect proxy, but it displays unexpected errors.
This is my configuration:
{
  "service": {
    "name": "api",
    "check": {
      "name": "HTTP 80",
      "http": "http://localhost:80",
      "interval": "10s",
      "timeout": "10s"
    },
    "connect": {
      "sidecar_service": {
        "proxy": {
          "upstreams": [{
            "destination_name": "elasticsearch",
            "local_bind_port": 9200
          }]
        }
      }
    }
  }
}
Here is the command with its log output:
$ consul connect proxy -sidecar-for elasticsearch
==> Consul Connect proxy starting...
Configuration mode: Agent API
Sidecar for ID: elasticsearch
Proxy ID: elasticsearch-sidecar-proxy
==> Log data will now stream in as it occurs:
2019/06/03 08:00:54 [INFO] Proxy loaded config and ready to serve
2019/06/03 08:00:54 [INFO] TLS Identity: spiffe://fadce594-37c1-8586-1b57-c6245436684c.consul/ns/default/dc/dc1/svc/elasticsearch
2019/06/03 08:00:54 [INFO] TLS Roots : [Consul CA 8]
2019/06/03 08:00:54 [INFO] public listener starting on 0.0.0.0:21000
2019/06/03 08:01:02 [ERR] failed to dial: dial tcp 127.0.0.1:0: connect: connection refused
^C==> Consul Connect proxy shutdown
Any suggestions?
The issue is that the service has no port, so the proxy tried to connect to proxy.local_service_port, which defaults to the parent service's port. Specifying the port for the parent service solves the issue.
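For illustration, a sketch of what the parent registration for the elasticsearch sidecar could look like with the port filled in; this registration is not shown in the question, and the port value 9200 is an assumption:
{
  "service": {
    "name": "elasticsearch",
    "port": 9200,
    "connect": {
      "sidecar_service": {}
    }
  }
}
With a port set, the sidecar forwards inbound connections to 127.0.0.1:9200 instead of 127.0.0.1:0.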

Filebeat to logstash connection refused

I'm trying to send log files from Filebeat -> Logstash -> Elasticsearch, but I'm getting the following error in the Filebeat log:
2017-12-07T16:15:38+05:30 ERR Failed to connect: dial tcp [::1]:5044: connectex: No connection could be made because the target machine actively refused it.
My filebeat and logstash configurations are as follows:
1. filebeat.yml
filebeat.prospectors:
- input_type: log
  paths:
    - C:\Users\shreya\Data\mylog.log
  document_type: springlog
  multiline.pattern: ^\[[0-9]{4}-[0-9]{2}-[0-9]{2}
  multiline.negate: true
  multiline.match: before

output.logstash:
  hosts: ["localhost:5044"]
2. logstash.yml
http.host: "127.0.0.1"
http.port: 5044
3. Logstash conf file:
input {
  beats {
    port => 5044
    codec => multiline {
      pattern => "^(%{TIMESTAMP_ISO8601})"
      negate => true
      what => "previous"
    }
  }
}
filter {
  grok {
    id => "myspringlogfilter"
    match => { "message" => "%{TIMESTAMP_ISO8601:timestamp}; [LOG_LEVEL=%{LOGLEVEL:log-level}, CMPNT_NM= %{GREEDYDATA:component}, MESSAGE=%{GREEDYDATA:message}" }
    overwrite => ["message"]
  }
}
output {
  elasticsearch {
    hosts => "localhost:9200"
    manage_template => false
    index => "%{[@metadata][beat]}-%{[@metadata][version]}-%{+YYYY.MM.dd}"
    document_type => "%{[@metadata][type]}"
  }
  stdout {
    codec => rubydebug
  }
}
The problem got solved after I commented out the metrics settings in logstash.yml as follows:
# ------------ Metrics Settings --------------
#
# Bind address for the metrics REST endpoint
#
#http.host: "127.0.0.1"
#
# Bind port for the metrics REST endpoint, this option also accept a range
# (9600-9700) and logstash will pick up the first available ports.
#
#http.port: 5044
#
But I still don't know why this solved the issue, as both (Filebeat and Logstash) were pointing to the same port. If someone could explain the reason, thanks in advance!
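A plausible explanation (an assumption, not confirmed in the thread): http.host and http.port in logstash.yml configure Logstash's monitoring REST API, not the Beats listener, so setting http.port: 5044 made the metrics endpoint compete with the beats { port => 5044 } input for the same port. Keeping the metrics API on its defaults avoids the clash, for example:
# logstash.yml: keep the monitoring REST endpoint on its default range
http.host: "127.0.0.1"
http.port: 9600
# The Beats listener port (5044) is set in the pipeline config, not here.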

CircuitBreaker::rescuing exceptions {:name=>"Beats input", :exception=>LogStash::Inputs::Beats::InsertingToQueueTakeTooLong, :level=>:warn}

I am new to the ELK stack. I am trying to set up Filebeat --> Logstash --> Elasticsearch --> Kibana. While trying to send Filebeat output to the Logstash input, I get the error below on the Logstash side:
CircuitBreaker::rescuing exceptions {:name=>"Beats input", :exception=>LogStash::Inputs::Beats::InsertingToQueueTakeTooLong, :level=>:warn}
Beats input: The circuit breaker has detected a slowdown or stall in the pipeline, the input is closing the current connection and rejecting new connection until the pipeline recover. {:exception=>LogStash::Inputs::BeatsSupport::CircuitBreaker::HalfOpenBreaker, :level=>:warn}
I am using Logstash 2.3.2 with Filebeat 1.2.2 and Elasticsearch 2.2.1.
My Logstash config:
input {
  beats {
    port => 5044
    # codec => multiline {
    #   pattern => "^%{TIME}"
    #   negate => true
    #   what => previous
    # }
  }
}
filter {
  grok {
    match => { "message" => "^%{TIME:time}\s+%{LOGLEVEL:level}" }
  }
}
output {
  elasticsearch {
    hosts => ["host:9200"]
    manage_template => false
    index => "%{[@metadata][beat]}-%{+YYYY.MM.dd}"
    document_type => "%{[@metadata][type]}"
  }
}
My Filebeat config:
filebeat:
  prospectors:
    - paths:
        - "*.log"
      input_type: log
      tail_files: false

output:
  logstash:
    hosts: ["host:5044"]
    compression_level: 3

shipper:

logging:
  to_files: true
  files:
    path: /tmp
    name: mybeat.log
    level: error
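The circuit breaker warning means the beats input is rejecting connections because the downstream pipeline (often a slow Elasticsearch output) cannot drain events fast enough. With the older Ruby beats input shipped with Logstash 2.3, a commonly suggested mitigation was to raise the input's congestion_threshold; a sketch, with the caveat that this option is specific to those older plugin versions and was removed later:
input {
  beats {
    port => 5044
    # Give the pipeline more time before the circuit breaker opens
    # (congestion_threshold exists only in the older Ruby beats input)
    congestion_threshold => 40
  }
}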
