Logstash with queue enabled does not ack http input events after jdbc input runs - elasticsearch

I'm using Logstash with the persistent queue enabled.
I've set up Logstash to inject rows from MySQL via the jdbc input plugin on startup. Currently this injects 1846 rows.
I also have an http input.
When I take down ES and restart Logstash, I get the expected errors:
logstash_1  WARN  logstash.outputs.amazones - Failed to flush outgoing items {:outgoing_count=>1, :exception=>"Faraday::ConnectionFailed", :backtrace=>nil}
logstash_1  ERROR logstash.outputs.amazones - Attempted to send a bulk request to Elasticsearch configured at …
I'd expect that in this situation, hitting the Logstash http input would still result in an ack. In fact, the HTTP POST does not return, and the event is not seen in the Logstash logs.
My logstash.yml looks like:
queue.type: persisted
queue.checkpoint.writes: 1
queue.max_bytes: 8gb
queue.page_capacity: 512mb
And my logstash.conf:
input {
  jdbc {
    jdbc_connection_string => "${JDBC_CONNECTION_STRING}"
    jdbc_user => "${JDBC_USER}"
    jdbc_password => "${JDBC_PASSWORD}"
    jdbc_driver_library => "/home/logstash/jdbc_driver.jar"
    jdbc_driver_class => "com.mysql.jdbc.Driver"
    statement => "
      SELECT blah blah blah
    "
  }
  http {
    host => "0.0.0.0"
    port => 31311
  }
}
output {
  stdout { codec => json_lines }
  amazon_es {
    hosts => ["${AWS_ES_HOST}"]
    region => "${AWS_REGION}"
    aws_access_key_id => "${AWS_ACCESS_KEY_ID}"
    aws_secret_access_key => "${AWS_SECRET_ACCESS_KEY}"
    index => "${INDEX_NAME}"
    document_type => "data"
    document_id => "%{documentid}"
  }
}
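As a sanity check, a stripped-down pipeline along these lines (a hypothetical test reusing the same host and port, not part of the original setup) can confirm whether the http input acks on its own once the amazon_es output is out of the picture:

input {
  # Same http input as above, in isolation
  http {
    host => "0.0.0.0"
    port => 31311
  }
}
output {
  # No Elasticsearch involved; events are printed as they are processed
  stdout { codec => json_lines }
}

If a POST against this pipeline returns promptly, the stall in the full setup points at the interaction between the persisted queue and the blocked amazon_es output rather than at the http input itself.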
Is it possible for the http input to still ack events? I'm pretty sure the queue cannot be full, as each event payload is about 850 characters.
Thanks in advance

Related

Trying to consume data from RabbitMQ to Elasticsearch

I am trying to consume data from RabbitMQ into Elasticsearch, and I followed this tutorial: https://akintola-lonlon.medium.com/logstash-5-easy-steps-to-consume-data-from-rabbitmq-to-elasticsearch-8fb0eb6e9196
This is my RabbitMQ queue.
This is my logstash-rabbitmq.conf:
input {
  rabbitmq {
    id => "rabbitmq_logs"
    host => "localhost"
    port => 5672
    vhost => "/"
    queue => "system_logs"
    ack => false
  }
}
filter {
  grok {
    match => { "message" => "%{COMBINEDAPACHELOG}" }
  }
  date {
    match => ["timestamp", "dd/MMM/yyyy:HH:mm:ss Z"]
  }
}
output {
  elasticsearch {
    hosts => ["127.0.0.1:9200"]
    index => "logstash_rabbit_mq_hello"
  }
  stdout {
    codec => rubydebug
  }
}
Then when I run sudo bin/logstash -f conf.d/logstash-rabbitmq.conf I get the following error:
[2022-10-17T10:08:43,917][WARN ][logstash.inputs.rabbitmq ][main][rabbitmq_logs] Error while setting up connection, will retry {:exception=>MarchHare::PreconditionFailed, :message=>"PRECONDITION_FAILED - inequivalent arg 'durable' for queue 'system_logs' in vhost '/': received 'false' but current is 'true'", :cause=>#<Java::JavaIo::IOException: >}
[2022-10-17T10:08:43,917][WARN ][logstash.inputs.rabbitmq ][main][rabbitmq_logs] RabbitMQ connection was closed {:url=>"amqp://guest:XXXXXX@localhost:5672/", :automatic_recovery=>true, :cause=>#<Java::ComRabbitmqClient::ShutdownSignalException: clean connection shutdown; protocol method: #method<connection.close>(reply-code=200, reply-text=OK, class-id=0, method-id=0)>}
[2022-10-17T10:08:44,929][INFO ][logstash.inputs.rabbitmq ][main][rabbitmq_logs] Connected to RabbitMQ {:url=>"amqp://guest:XXXXXX@localhost:5672/"}
How can I fix this problem?
I am a beginner with RabbitMQ and ELK; please help me.
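The PRECONDITION_FAILED message says the queue system_logs already exists on the broker as durable, while the rabbitmq input is declaring it with durable => false (the plugin's default). A minimal sketch of an input whose declaration matches the broker's existing queue, assuming the queue really was created as durable:

input {
  rabbitmq {
    id => "rabbitmq_logs"
    host => "localhost"
    port => 5672
    vhost => "/"
    queue => "system_logs"
    # Must match the arguments the queue was originally declared with,
    # otherwise RabbitMQ rejects the redeclaration with PRECONDITION_FAILED
    durable => true
    ack => false
  }
}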

Got response code '400' contacting Elasticsearch at URL in logstash

I am new to Elasticsearch. I tried to configure Elasticsearch, Kibana, and Logstash with the MQTT plugin, in order to send logs to Elasticsearch through the Logstash MQTT input. I installed them locally on a Mac, but when starting Logstash it throws the following error:
[2021-11-12T17:26:37,976][ERROR][logstash.outputs.elasticsearch][logstash_pipeline] Unable to get license information {:url=>"http://localhost:9200/", :exception=>LogStash::Outputs::ElasticSearch::HttpClient::Pool::BadResponseCodeError, :message=>"Got response code '400' contacting Elasticsearch at URL 'http://localhost:9200/_license'"}
My Logstash configuration file looks like this:
input {
  mqtt {
    host => "localhost"
    port => 1883
    topic => "test"
    qos => 0
    certificate_path => "/Users/john/logstash-7.10.2/logstash/m2mqtt_ca.crt"
  }
}
output {
  elasticsearch {
    hosts => ["http://localhost:9200"]
    index => "%{[@metadata][beat]}-%{[@metadata][version]}-%{+YYYY.MM.dd}"
    #user => "elastic"
    #password => "changeme"
  }
}
Can anybody tell me what the issue is?

Getting a Logstash configuration error while transferring MySQL data to Kibana; the SQL DB password is blank, so I am passing jdbc_password => " "

I want to upload MySQL table data to Kibana using Logstash and JDBC.
The MySQL database username is "root" and the password is blank. I tried giving the password as "", " ", and "Null", but it's not working.
This is my logstash configuration file:
input {
  jdbc {
    jdbc_driver_library => "C:/elasticsearch-7.3.0/driver/com.mysql.jdbc_5.1.5.jar"
    jdbc_driver_class => "com.mysql.jdbc.Driver"
    jdbc_connection_string => "jdbc:mysql://localhost:3306/dbname?useSSL=false"
    jdbc_user => "root"
    jdbc_password => " "
    statement => "SELECT * FROM table"
  }
}
}
output {
  stdout { codec => rubydebug }
  elasticsearch {
    hosts => ["localhost"]
    index => "index_name"
  }
}
Logstash output:
[2019-11-06T13:02:28,143][ERROR][logstash.inputs.jdbc ] Failed to load C:/elasticsearch-7.3.0/driver/com.mysql.jdbc_5.1.5.jar {:exception=>#}
[2019-11-06T13:02:28,146][ERROR][logstash.javapipeline ] A plugin had an unrecoverable error. Will restart this plugin.
Pipeline_id:main
Plugin: "root", jdbc_password=>, statement=>"SELECT * FROM tracker", jdbc_driver_library=>"C:/elasticsearch-7.3.0/driver/com.mysql.jdbc_5.1.5.jar", jdbc_connection_string=>"jdbc:mysql://localhost:3306/pvtrace?useSSL=false", id=>"5eccb173adcbec4cd0c68701c4737d83e11f82fdc157788bc9b76507e2a70a06", jdbc_driver_class=>"com.mysql.jdbc.Driver", enable_metric=>true, codec=>"plain_feefd4f8-c2ca-4050-8044-04f466e0c157", enable_metric=>true, charset=>"UTF-8">, jdbc_paging_enabled=>false, jdbc_page_size=>100000, jdbc_validate_connection=>false, jdbc_validation_timeout=>3600, jdbc_pool_timeout=>5, sql_log_level=>"info", connection_retry_attempts=>1, connection_retry_attempts_wait_time=>0.5, parameters=>{"sql_last_value"=>1970-01-01 00:00:00 UTC}, last_run_metadata_path=>"C:\Users\himanshika.yeduvans/.logstash_jdbc_last_run", use_column_value=>false, tracking_column_type=>"numeric", clean_run=>false, record_last_run=>true, lowercase_column_names=>true>
Error: com.mysql.jdbc.Driver not loaded. Are you sure you've included the correct jdbc driver in :jdbc_driver_library?
Exception: LogStash::ConfigurationError
Stack: C:/logstash-7.3.0/vendor/bundle/jruby/2.5.0/gems/logstash-input-jdbc-4.3.13/lib/logstash/plugin_mixins/jdbc/jdbc.rb:163:in `open_jdbc_connection'
C:/logstash-7.3.0/vendor/bundle/jruby/2.5.0/gems/logstash-input-jdbc-4.3.13/lib/logstash/plugin_mixins/jdbc/jdbc.rb:221:in `execute_statement'
C:/logstash-7.3.0/vendor/bundle/jruby/2.5.0/gems/logstash-input-jdbc-4.3.13/lib/logstash/inputs/jdbc.rb:277:in `execute_query'
C:/logstash-7.3.0/vendor/bundle/jruby/2.5.0/gems/logstash-input-jdbc-4.3.13/lib/logstash/inputs/jdbc.rb:263:in `run'
C:/logstash-7.3.0/logstash-core/lib/logstash/java_pipeline.rb:309:in `inputworker'
C:/logstash-7.3.0/logstash-core/lib/logstash/java_pipeline.rb:302:in `block in start_input'
[2019-11-06T13:03:31,349][WARN ][logstash.runner ] SIGINT received. Shutting down.
[2019-11-06T13:03:32,070][ERROR][logstash.inputs.jdbc ] Failed to load C:/elasticsearch-7.3.0/driver/com.mysql.jdbc_5.1.5.jar {:exception=>#}
[2019-11-06T13:03:36,354][WARN ][logstash.runner ] Received shutdown signal, but pipeline is still waiting for in-flight events
to be processed. Sending another ^C will force quit Logstash, but this may cause data loss.
Check whether the JDBC driver jar is actually present at the mentioned path:
"C:/elasticsearch-7.3.0/driver/com.mysql.jdbc_5.1.5.jar"

Consume messages from RabbitMQ in Logstash

I'm trying to read logs from a RabbitMQ queue with Logstash and then pass them to Elasticsearch, but with no success. Here is my Logstash config:
input {
  rabbitmq {
    host => "localhost"
    port => 15672
    heartbeat => 30
    durable => true
    exchange => "logging_queue"
    exchange_type => "logging_queue"
  }
}
output {
  elasticsearch {
    hosts => "localhost:9200"
  }
  stdout {}
}
But no index is created, so of course I can't see any logs in Kibana.
There are some messages in the queue.
I think the correct (default) port is 5672; 15672 is the port of the web admin console:
input {
  rabbitmq {
    host => "localhost"
    port => 5672        # <-- change this
    heartbeat => 30
    durable => true
    exchange => "logging_queue"
    exchange_type => "logging_queue"
  }
}
output {
  elasticsearch {
    hosts => "localhost:9200"
  }
  stdout {}
}
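One more detail worth flagging: exchange_type is the AMQP type of the exchange (e.g. "direct", "fanout", or "topic"), not the exchange's name, so "logging_queue" is not a valid value there. A hedged sketch, assuming the exchange is actually a fanout and inventing a queue name for the binding:

input {
  rabbitmq {
    host => "localhost"
    port => 5672
    heartbeat => 30
    durable => true
    queue => "logging_queue_consumer"  # hypothetical queue to bind to the exchange
    exchange => "logging_queue"
    exchange_type => "fanout"          # assumption: the exchange's actual type
  }
}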

Sending JSON from one Logstash to another

I have a 3-node setup:
10.x.x.1 - application and Filebeat
10.x.x.2 - machine for parsing, running Logstash
10.x.x.3 - centralized Logstash node from which we need to push messages into Elasticsearch
On 10.x.x.2, when I set the output codec to stdout, I can see the messages coming from 10.x.x.1.
Now I need to forward all the JSON messages from 10.x.x.2 to 10.x.x.3. I tried using TCP, but the messages are not getting sent.
10.x.x.2 Logstash conf file:
input {
  beats {
    port => 5045
  }
}
output {
  #stdout { codec => rubydebug }
  tcp {
    host => "10.x.x.3"
    port => 3389
  }
}
10.x.x.3 Logstash conf file:
input {
  tcp {
    host => "10.x.x.3"
    port => 3389
    #mode => "server"
    #codec => "json"
  }
}
output {
  stdout { codec => rubydebug }
}
Is there any plugin which can send JSON data from one Logstash server to another?
Your config should work, but you have to be careful with the codec properties.
First, set the codec to "line" on the output AND the input plugins of the two Logstash instances, and check whether logs are coming in. With the codec set to "line" you should have no problem forwarding the logs. Then move on to the "json" codec.
Do not forget that you can activate Logstash's debug mode with the --debug argument, and log to a file with the -l logFileName arguments.
When you start working with the json codec, look for "_jsonparsefailure" tags, which could explain why logs are not transferred between the two Logstash instances.
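For illustration, a minimal sketch of the line-codec test this answer suggests, reusing the hosts and ports from the question (the explicit codec settings are the only additions):

# 10.x.x.2 (sender): forward beats events over TCP with the line codec
input {
  beats {
    port => 5045
  }
}
output {
  tcp {
    host => "10.x.x.3"
    port => 3389
    codec => "line"
  }
}

# 10.x.x.3 (receiver): read the TCP stream with the matching codec
input {
  tcp {
    host => "10.x.x.3"
    port => 3389
    mode => "server"
    codec => "line"
  }
}
output {
  stdout { codec => rubydebug }
}

Once events flow end to end with "line", switching both codecs to "json" narrows any remaining failure down to JSON parsing, which is exactly what a "_jsonparsefailure" tag on the received events would indicate.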
