Logstash - JDBC - MySQL config error

After watching this tutorial:
https://www.youtube.com/watch?v=ZnI_rlrei1s
I'm trying to fetch data from my localhost MySQL (using Laravel Valet MySQL) with Logstash and JDBC and send it to an Elasticsearch server.
This is my config:
# file: db.conf
input {
  jdbc {
    # MySQL jdbc connection string to our database, mydb
    jdb_connection_string => "jdbc:mysql://localhost:3306/dragon"
    # The user we wish to execute our statement as
    jdbc_user => "root"
    # The user password
    jdbc_password => ""
    # The path to our downloaded jdbc driver
    jdbc_driver_library => "/Users/muaz/downloads/logstash/mysql-connector-java-5.1.39-bin.jar"
    # The name of the driver class for MySQL
    jdbc_driver_class => "com.mysql.jdbc.Driver"
    # Our query
    statement = "SELECT * from Receipt"
  }
}
output {
  # stdout { codec => json_lines }
  elasticsearch {
    # protocol = https
    index => "power_receipt"
    document_type => "Receipt"
    document_id => "%{id}"
    host => "https://search-power-yidhfhkidiiddcccyhyrijaagamu.ap-southeast-1.es.amazonaws.com"
  }
}
And I run it using this command (in the logstash folder):
./bin/logstash agent -f db.conf
It produces:
fetched an invalid config {:config=>"# file: db.conf\ninput {\n\tjdbc {\n\t\t# MySQL jdbc connection string to our database, mydb\n\t\tjdb_connection_string => \"jdbc:mysql://localhost:3306/dragon\"\n\t\t# The user we wish to execute our statement as \n\t\tjdbc_user => \"root\"\n\t\t# The user password\n\t\tjdbc_password => \"\"\n\t\t# The path to our downloaded jdbc driver\n\t\tjdbc_driver_library => \"/Users/muaz/downloads/logstash/mysql-connector-java-5.1.39-bin.jar\"\n\t\t# The name of the deliver clas for MySQL\n\t\tjdbc_driver_class => \"com.mysql.jdbc.Driver\"\n\t\t# Our query\n\t\tstatement = \"SELECT * from Receipt\"\n\t}\n}\noutput {\n\t# stdout { codec => json_lines }\n\telasticsearch {\n\t\t# protocol = https\n\t\tindex => \"slurp_receipt\"\n\t\tdocument_type => \"Receipt\"\n\t\tdocument_id => \"%{id}\"\n\t\thost => \"https://search-power-yidhfhkidiiddcccyhyrijaagamu.ap-southeast-1.es.amazonaws.com\"\n\t}\n}\n\n\n", :reason=>"Expected one of #, => at line 15, column 13 (byte 521) after # file: db.conf\ninput {\n\tjdbc {\n\t\t# MySQL jdbc connection string to our database, mydb\n\t\tjdb_connection_string => \"jdbc:mysql://localhost:3306/dragon\"\n\t\t# The user we wish to execute our statement as \n\t\tjdbc_user => \"root\"\n\t\t# The user password\n\t\tjdbc_password => \"\"\n\t\t# The path to our downloaded jdbc driver\n\t\tjdbc_driver_library => \"/Users/muaz/downloads/logstash/mysql-connector-java-5.1.39-bin.jar\"\n\t\t# The name of the deliver clas for MySQL\n\t\tjdbc_driver_class => \"com.mysql.jdbc.Driver\"\n\t\t# Our query\n\t\tstatement ", :level=>:error}
How do I solve this?
Thank you

You have a typo on the last line of your jdbc input:
statement = "SELECT * from Receipt"
should read
statement => "SELECT * from Receipt"
(Note that jdb_connection_string on the first setting should likewise be jdbc_connection_string.)
Also, in your elasticsearch output you need to change
host => "https://search-power-yidhfhkidiiddcccyhyrijaagamu.ap-southeast-1.es.amazonaws.com"
to
hosts => ["https://search-power-yidhfhkidiiddcccyhyrijaagamu.ap-southeast-1.es.amazonaws.com"]
A corrected config putting these fixes together is sketched below.

Related

How to send data from HTTP input to Elasticsearch using Logstash and the jdbc_streaming filter?

I want to send data from an HTTP input to Elasticsearch using Logstash, and I want to enrich my data using the jdbc_streaming filter plugin. This is my Logstash config:
input {
  http {
    id => "sensor_data_http_input"
    user => "sensor_data"
    password => "sensor_data"
  }
}
filter {
  jdbc_streaming {
    jdbc_driver_library => "E:\ElasticStack\mysql-connector-java-8.0.18\mysql-connector-java-8.0.18.jar"
    jdbc_driver_class => "com.mysql.jdbc.Driver"
    jdbc_connection_string => "jdbc:mysql://localhost:3306/sensor_metadata"
    jdbc_user => "elastic"
    jdbc_password => "hide"
    statement => "select st.sensor_type as sensorType, l.customer as customer, l.department as department, l.building_name as buildingName, l.room as room, l.floor as floor, l.location_on_floor as locationOnFloor, l.latitude, l.longitude from sensors s inner join sensor_type st on s.sensor_type_id=st.sensor_type_id inner join location l on s.location_id=l.location_id where s.sensor_id= :sensor_identifier"
    parameters => { "sensor_identifier" => "sensor_id" }
    target => lookupResult
  }
  mutate {
    rename => { "[lookupResult][0][sensorType]" => "sensorType" }
    rename => { "[lookupResult][0][customer]" => "customer" }
    rename => { "[lookupResult][0][department]" => "department" }
    rename => { "[lookupResult][0][buildingName]" => "buildingName" }
    rename => { "[lookupResult][0][room]" => "room" }
    rename => { "[lookupResult][0][floor]" => "floor" }
    rename => { "[lookupResult][0][locationOnFloor]" => "locationOnFloor" }
    add_field => {
      "location" => "%{lookupResult[0]latitude},%{lookupResult[0]longitude}"
    }
    remove_field => ["lookupResult", "headers", "host"]
  }
}
output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "sensor_data-%{+YYYY.MM.dd}"
    user => "elastic"
    password => "hide"
  }
}
But when I start Logstash, I see the following error:
[2020-01-09T22:57:16,260]
[ERROR][logstash.javapipeline]
[main] Pipeline aborted due to error {
:pipeline_id=>"main",
:exception=>#<TypeError: failed to coerce jdk.internal.loader.ClassLoaders$AppClassLoader to java.net.URLClassLoader>,
:backtrace=>[
"org/jruby/java/addons/KernelJavaAddons.java:29:in `to_java'",
"E:/ElasticStack/Logstash/logstash-7.4.1/vendor/bundle/jruby/2.5.0/gems/logstash-filter-jdbc_streaming-1.0.7/lib/logstash/plugin_mixins/jdbc_streaming.rb:48:in `prepare_jdbc_connection'",
"E:/ElasticStack/Logstash/logstash-7.4.1/vendor/bundle/jruby/2.5.0/gems/logstash-filter-jdbc_streaming-1.0.7/lib/logstash/filters/jdbc_streaming.rb:200:in `prepare_connected_jdbc_cache'",
"E:/ElasticStack/Logstash/logstash-7.4.1/vendor/bundle/jruby/2.5.0/gems/logstash-filter-jdbc_streaming-1.0.7/lib/logstash/filters/jdbc_streaming.rb:116:in `register'", "org/logstash/config/ir/compiler/AbstractFilterDelegatorExt.java:56:in `register'",
"E:/ElasticStack/Logstash/logstash-7.4.1/logstash-core/lib/logstash/java_pipeline.rb:195:in `block in register_plugins'", "org/jruby/RubyArray.java:1800:in `each'",
"E:/ElasticStack/Logstash/logstash-7.4.1/logstash-core/lib/logstash/java_pipeline.rb:194:in `register_plugins'",
"E:/ElasticStack/Logstash/logstash-7.4.1/logstash-core/lib/logstash/java_pipeline.rb:468:in `maybe_setup_out_plugins'",
"E:/ElasticStack/Logstash/logstash-7.4.1/logstash-core/lib/logstash/java_pipeline.rb:207:in `start_workers'",
"E:/ElasticStack/Logstash/logstash-7.4.1/logstash-core/lib/logstash/java_pipeline.rb:149:in `run'",
"E:/ElasticStack/Logstash/logstash-7.4.1/logstash-core/lib/logstash/java_pipeline.rb:108:in `block in start'"],
:thread=>"#<Thread:0x17fa8113 run>"
}
[2020-01-09T22:57:16,598]
[ERROR][logstash.agent] Failed to execute action {
:id=>:main,
:action_type=>LogStash::ConvergeResult::FailedAction,
:message=>"Could not execute action: PipelineAction::Create<main>, action_result: false",
:backtrace=>nil
}
I am enriching my HTTP input with data from my MySQL database, but Logstash doesn't start at all.
I see two potential problems, but you need to check which is really the issue here:
The MySQL driver class name has changed to com.mysql.cj.jdbc.Driver.
A classloader problem can occur when you use a recent JDBC driver outside the classloader path in combination with newer JDK versions. There are several issues about this on GitHub.
Put the driver in the Logstash folder under <logstash-install-dir>/vendor/jar/jdbc/ (you need to create this folder first). If this doesn't work, move the driver under <logstash-install-dir>/logstash-core/lib/jars and don't provide any driver path in the config file: jdbc_driver_library => ""
The problem was solved by removing the jdbc_driver_library option entirely from the config file and also, as mentioned, setting jdbc_driver_class to com.mysql.cj.jdbc.Driver. A sketch of the fixed filter follows.
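For illustration, assuming the driver jar was moved into <logstash-install-dir>/logstash-core/lib/jars as described above, the relevant part of the filter would look something like:
filter {
  jdbc_streaming {
    # driver jar now lives in logstash-core/lib/jars, so no jdbc_driver_library here
    jdbc_driver_class => "com.mysql.cj.jdbc.Driver"
    jdbc_connection_string => "jdbc:mysql://localhost:3306/sensor_metadata"
    jdbc_user => "elastic"
    jdbc_password => "hide"
    # statement, parameters and target unchanged from the original config
  }
}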

Elasticsearch 6.2.4 / Logstash 6.2.4 migration error from MySQL to Elasticsearch

Hi, please have a look at the issue below. I am clueless how to fix it.
I've downloaded Elasticsearch 6.2.4 and Logstash 6.2.4 on a Windows machine.
I'm trying to import data from MySQL to Elasticsearch using Logstash, but I'm getting the error below:
C:\logstash-6.2.4\bin>logstash -f logstash.conf
Error: Could not find or load main class Files\Apache
Here are the steps I'm following:
First I started Elasticsearch, which is running perfectly on port 9200.
Then I added the script below in logstash.yml, which has all the migration instructions:
# ------------ MySQL to ElasticSearch -------------
input {
  jdbc {
    jdbc_connection_string => "jdbc:mysql://localhost:3306/MySQL_ElasticSearch_Demo"
    # The user we wish to execute our statement as
    jdbc_user => "root"
    jdbc_password => "root"
    # The path to our downloaded jdbc driver
    jdbc_driver_library => "C:\mysql-connector-java-5.1.46/mysql-connector-java-5.1.46.jar"
    jdbc_driver_class => "com.mysql.jdbc.Driver"
    # our query
    statement => "SELECT * FROM user"
  }
}
output {
  stdout { codec => json_lines }
  elasticsearch {
    "hosts" => "localhost:9200"
    "index" => "users"
    "document_type" => "usersData"
  }
}
I'm trying to run Logstash via the command prompt using the command below:
C:\logstash-6.2.4\bin>logstash -f logstash.conf
Error: Could not find or load main class Files\Apache
Any help will be much appreciated. Thanks in advance!

How to connect Cassandra with a Logstash input?

Logstash.conf
input { tcp { port => 7199 } }
output { elasticsearch { hosts => ["localhost"] } }
Cassandra is running on port 7199 and the JHipster application is running on localhost:8080.
We are unable to add logs into Logstash from my_application.
No log4j2 file found.
I think you can use the JDBC plugin:
https://github.com/logstash-plugins/logstash-input-jdbc
input {
  jdbc {
    jdbc_connection_string => "jdbc:cassandra://hostname:XXXX" # Your port
    jdbc_user => "user" # The user value
    jdbc_password => "password" # The password
    jdbc_driver_library => "$PATH/cassandra_driver.jar" # Jar path
    jdbc_driver_class => "org.apache.cassandra.cql.jdbc.CassandraDriver" # Driver
    statement => "SELECT * FROM keyspace.my_table" # Your query
  }
}
I had the same issue. It was solved by downloading a Cassandra JDBC driver from DatabaseSchema.
Also, when you want to add the jar files, put them under
logstashFolder/logstash-core/lib/jar
There seems to be a bug in Logstash where it only looks in this path for external jar files.
Also, if some jar files are duplicated, use the latest ones. Combining this with the config above gives the sketch below.
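As an illustration (hostname, port XXXX, and keyspace are placeholders carried over from the answer above), the input with the jar placed in the Logstash jars folder might look like:
input {
  jdbc {
    # the driver jar was copied into logstashFolder/logstash-core/lib/jar,
    # so no external path is given here
    jdbc_driver_library => ""
    jdbc_driver_class => "org.apache.cassandra.cql.jdbc.CassandraDriver"
    jdbc_connection_string => "jdbc:cassandra://hostname:XXXX"
    jdbc_user => "user"
    jdbc_password => "password"
    statement => "SELECT * FROM keyspace.my_table"
  }
}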

Logstash with queue enabled does not ack http input events after jdbc input runs

I’m using Logstash with queuing enabled.
I’ve set up Logstash to inject rows from MySQL via the jdbc input plugin on startup. Currently this injects 1846 rows.
I also have an http input.
When I take down ES and restart Logstash, as expected I get errors:
logstash_1 WARN logstash.outputs.amazones - Failed to flush outgoing items {:outgoing_count=>1, :exception=>"Faraday::ConnectionFailed", :backtrace=>nil}
logstash_1 ERROR logstash.outputs.amazones - Attempted to send a bulk request to Elasticsearch configured at …
I’d expect that in this situation hitting the Logstash http input would still result in an ack.
Actually the http POST does not return, and the injection is not seen in the Logstash logs.
My logstash.yaml looks like:
queue {
  type: persisted
  checkpoint.writes: 1
  queue.max_bytes: 8gb
  queue.page_capacity: 512mb
}
And my logstash.conf:
input {
  jdbc {
    jdbc_connection_string => "${JDBC_CONNECTION_STRING}"
    jdbc_user => "${JDBC_USER}"
    jdbc_password => "${JDBC_PASSWORD}"
    jdbc_driver_library => "/home/logstash/jdbc_driver.jar"
    jdbc_driver_class => "com.mysql.jdbc.Driver"
    statement => "
      SELECT blah blah blah
    "
  }
  http {
    host => "0.0.0.0"
    port => 31311
  }
}
output {
  stdout { codec => json_lines }
  amazon_es {
    hosts => ["${AWS_ES_HOST}"]
    region => "${AWS_REGION}"
    aws_access_key_id => '${AWS_ACCESS_KEY_ID}'
    aws_secret_access_key => '${AWS_SECRET_ACCESS_KEY}'
    "index" => "${INDEX_NAME}"
    "document_type" => "data"
    "document_id" => "%{documentid}"
  }
}
Is it possible for the http input to still ack events? I'm pretty sure the queue cannot be full, since each event payload is about 850 characters.
Thanks in advance

Logstash not starting up

I am trying to start Logstash 5.4 on my Linux RHEL 6 server, but I'm getting the following message:
WARNING: Default JAVA_OPTS will be overridden by the JAVA_OPTS defined in the environment. Environment JAVA_OPTS are -Xms1G .Xmx64G
Error: Could not find or load main class .Xmx64G
The following is my logstash.conf, in which I'm trying to ingest data from SQL Server:
input {
  jdbc {
    jdbc_driver_library => "/usr/share/logstash/mysql-connector-java-5.1.42-bin.jar"
    jdbc_driver_class => "com.microsoft.sqlserver.jdbc.SQLServerDriver"
    jdbc_connection_string => "jdbc:sqlserver://9.37.92.72:1433;databaseName=KaiserPermanente;"
    jdbc_user => "sa"
    jdbc_password => "passw0rd!"
    statement => "select * from IEVDIncident ;"
  }
}
output {
  elasticsearch {
    hosts => "http://localhost:9200"
    index => "kaiserpermanente"
  }
  stdout { codec => json_lines }
}
Please tell me how I can resolve this. Thanks
It seems you have an environment variable JAVA_OPTS with the value -Xms1G .Xmx64G, so it overrides Logstash's own options. You need to change your variable to -Xms1G -Xmx64G, i.e. replace the . with a -. For example:
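A minimal sketch, assuming JAVA_OPTS is exported somewhere in your shell profile (the actual file depends on your setup):
# the stray . made the JVM parse .Xmx64G as a class name to load,
# hence "Could not find or load main class .Xmx64G"
export JAVA_OPTS="-Xms1G -Xmx64G"
After correcting the variable, restart Logstash and the error should go away.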
