Logstash: Error: mongodb.jdbc.MongoDriver not loaded - elasticsearch

I am getting the below error while using the MongoDB Java Driver to read data from MongoDB and push it to Elasticsearch:
Error: mongodb.jdbc.MongoDriver not loaded. Are you sure you've included the correct jdbc driver in :jdbc_driver_library?
Platform Info:
OS- RHEL 6.6
Logstash- 5.5.0
Elasticsearch- 5.5.0
Mongodb- 3.2.13
Jars- mongodb-driver-core-3.4.2.jar, mongo-java-driver-3.4.2.jar and bson-3.4.2.jar
Logstash config
input {
  jdbc {
    jdbc_driver_library => "/home/pdwiwe/logstash-5.5.0/bin/mongo-java-driver-3.4.2.jar"
    jdbc_driver_class => "mongodb.jdbc.MongoDriver"
    jdbc_connection_string => "jdbc:mongo://hostname:27017?authSource=admin"
    jdbc_user => "user"
    jdbc_password => "pwd"
    statement => "select * from system.users"
  }
}
output {
  if "_grokparsefailure" not in [tags] {
    elasticsearch {
      hosts => [ "localhost:9200" ]
      index => "mongodb-data"
    }
  }
}
Logstash Service Start:
/home/pdwiwe/logstash-5.5.0/bin$ sh logstash -f mongo.conf

mongodb.jdbc.MongoDriver is not a driver class in the mongo-java-driver.
AFAIK, that driver does not support JDBC at all.
Various JDBC drivers wrap the mongo-java-driver, such as Unity, Simba, and DbSchema.
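As an illustration only (not part of the original answer): jdbc_driver_library would then point at one of those wrapper jars rather than at mongo-java-driver. A minimal sketch assuming the UnityJDBC wrapper, which is where the mongodb.jdbc.MongoDriver class actually comes from; the jar name and path are assumptions, so adjust them to whichever wrapper you download:
input {
  jdbc {
    # Assumed file name of the UnityJDBC wrapper jar, not the plain mongo-java-driver jar
    jdbc_driver_library => "/home/pdwiwe/logstash-5.5.0/bin/mongodb_unityjdbc_full.jar"
    # Driver class supplied by the wrapper, not by mongo-java-driver
    jdbc_driver_class => "mongodb.jdbc.MongoDriver"
    jdbc_connection_string => "jdbc:mongo://hostname:27017?authSource=admin"
    jdbc_user => "user"
    jdbc_password => "pwd"
    statement => "select * from system.users"
  }
}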

Related

Logstash error when loading data from sql to elasticsearch

I am trying to load data from MSSQL to Elasticsearch using Logstash. I am using ES 7.17.3, Logstash 7.17.3, and JDBC driver 10.2.
This is what my Logstash config file looks like:
input {
  jdbc {
    jdbc_connection_string => "jdbc:sqlserver://DATABASE_SERVER_PLACEHOLDER;database=DATABASE_PLACEHOLDER;responseBuffering=adaptive;integratedSecurity=true;applicationIntent=ReadOnly;multiSubnetFailover=true"
    jdbc_driver_library => "C:\Program Files\Microsoft JDBC Driver 10.2 for SQL Server\sqljdbc_10.2\enu\auth\x64"
    jdbc_driver_class => "com.microsoft.sqlserver.jdbc.SQLServerDriver"
    jdbc_user => ""
    jdbc_password => ""
    statement => "select * from Presentation"
  }
}
output {
  elasticsearch {
    hosts => [ "localhost:9200" ]
    index => "test"
  }
}
When I run Logstash I get the below error:
Error: unable to load C:\Program Files\Microsoft JDBC Driver 10.2 for SQL Server\sqljdbc_10.2\enu\auth\x64 from :jdbc_driver_library, no such file to load -- C:/Program Files/Microsoft JDBC Driver 10.2 for SQL Server/sqljdbc_10.2/enu/auth/x64
Exception: LogStash::PluginLoadingError
I tried looking for a solution but nothing worked. Can you please help me out?
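(A note beyond the original thread.) The error says "no such file to load": jdbc_driver_library points at the ...\enu\auth\x64 folder, which holds the native authentication DLLs, not at the driver jar itself. A sketch of the relevant lines, assuming the jar is named mssql-jdbc-10.2.0.jre11.jar (the actual name and location depend on the download and JRE version):
input {
  jdbc {
    # Point at the driver jar file itself, not at the auth\x64 directory
    jdbc_driver_library => "C:\Program Files\Microsoft JDBC Driver 10.2 for SQL Server\sqljdbc_10.2\enu\mssql-jdbc-10.2.0.jre11.jar"
    jdbc_driver_class => "com.microsoft.sqlserver.jdbc.SQLServerDriver"
    # remaining settings as in the original config
  }
}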

Getting logstash configuration error while transferring mysql data to kibana, sql db password is blank so i am passing jdbc_password=" "

I want to upload MySQL table data to Kibana using Logstash and JDBC.
The MySQL database username is "root" and the password is blank. I tried giving the password as "", " ", and "Null", but it's not working.
This is my logstash configuration file:
input {
  jdbc {
    jdbc_driver_library => "C:/elasticsearch-7.3.0/driver/com.mysql.jdbc_5.1.5.jar"
    jdbc_driver_class => "com.mysql.jdbc.Driver"
    jdbc_connection_string => "jdbc:mysql://localhost:3306/dbname?useSSL=false"
    jdbc_user => "root"
    jdbc_password => " "
    statement => "SELECT * FROM table"
  }
}
output {
  stdout { codec => rubydebug }
  elasticsearch {
    hosts => ["localhost"]
    index => "index_name"
  }
}
Logstash output:
[2019-11-06T13:02:28,143][ERROR][logstash.inputs.jdbc ] Failed to load C:/elasticsearch-7.3.0/driver/com.mysql.jdbc_5.1.5.jar {:exception=>#}
[2019-11-06T13:02:28,146][ERROR][logstash.javapipeline ] A plugin had an unrecoverable error. Will restart this plugin.
Pipeline_id:main
Plugin: "root", jdbc_password=>, statement=>"SELECT * FROM tracker", jdbc_driver_library=>"C:/elasticsearch-7.3.0/driver/com.mysql.jdbc_5.1.5.jar", jdbc_connection_string=>"jdbc:mysql://localhost:3306/pvtrace?useSSL=false", id=>"5eccb173adcbec4cd0c68701c4737d83e11f82fdc157788bc9b76507e2a70a06", jdbc_driver_class=>"com.mysql.jdbc.Driver", enable_metric=>true, codec=>"plain_feefd4f8-c2ca-4050-8044-04f466e0c157", enable_metric=>true, charset=>"UTF-8">, jdbc_paging_enabled=>false, jdbc_page_size=>100000, jdbc_validate_connection=>false, jdbc_validation_timeout=>3600, jdbc_pool_timeout=>5, sql_log_level=>"info", connection_retry_attempts=>1, connection_retry_attempts_wait_time=>0.5, parameters=>{"sql_last_value"=>1970-01-01 00:00:00 UTC}, last_run_metadata_path=>"C:\Users\himanshika.yeduvans/.logstash_jdbc_last_run", use_column_value=>false, tracking_column_type=>"numeric", clean_run=>false, record_last_run=>true, lowercase_column_names=>true>
Error: com.mysql.jdbc.Driver not loaded. Are you sure you've included the correct jdbc driver in :jdbc_driver_library?
Exception: LogStash::ConfigurationError
Stack: C:/logstash-7.3.0/vendor/bundle/jruby/2.5.0/gems/logstash-input-jdbc-4.3.13/lib/logstash/plugin_mixins/jdbc/jdbc.rb:163:in `open_jdbc_connection'
C:/logstash-7.3.0/vendor/bundle/jruby/2.5.0/gems/logstash-input-jdbc-4.3.13/lib/logstash/plugin_mixins/jdbc/jdbc.rb:221:in `execute_statement'
C:/logstash-7.3.0/vendor/bundle/jruby/2.5.0/gems/logstash-input-jdbc-4.3.13/lib/logstash/inputs/jdbc.rb:277:in `execute_query'
C:/logstash-7.3.0/vendor/bundle/jruby/2.5.0/gems/logstash-input-jdbc-4.3.13/lib/logstash/inputs/jdbc.rb:263:in `run'
C:/logstash-7.3.0/logstash-core/lib/logstash/java_pipeline.rb:309:in `inputworker'
C:/logstash-7.3.0/logstash-core/lib/logstash/java_pipeline.rb:302:in `block in start_input'
[2019-11-06T13:03:31,349][WARN ][logstash.runner ] SIGINT received. Shutting down.
[2019-11-06T13:03:32,070][ERROR][logstash.inputs.jdbc ] Failed to load C:/elasticsearch-7.3.0/driver/com.mysql.jdbc_5.1.5.jar {:exception=>#}
[2019-11-06T13:03:36,354][WARN ][logstash.runner ] Received shutdown signal, but pipeline is still waiting for in-flight events
to be processed. Sending another ^C will force quit Logstash, but this may cause data loss.
Check whether the JDBC driver jar is actually present at the mentioned path:
"C:/elasticsearch-7.3.0/driver/com.mysql.jdbc_5.1.5.jar"

Elasticsearch-6.24 logstash-6.2.4 migration error from MySQL to ElasticSearch

Hi, please have a look at the issue below; I am clueless about how to fix it.
I've downloaded Elasticsearch 6.2.4 and Logstash 6.2.4 on a Windows machine.
I'm trying to import data from MySQL to Elasticsearch using Logstash, but I'm getting the below error:
C:\logstash-6.2.4\bin>logstash -f logstash.conf
Error: Could not find or load main class Files\Apache
Here are the steps I'm following:
First I started Elasticsearch, which is running fine on port 9200.
Then I added the below script to logstash.yml, which has all the migration instructions.
# ------------ MySQL to ElasticSearch -------------
input {
  jdbc {
    jdbc_connection_string => "jdbc:mysql://localhost:3306/MySQL_ElasticSearch_Demo"
    # The user we wish to execute our statement as
    jdbc_user => "root"
    jdbc_password => "root"
    # The path to our downloaded jdbc driver
    jdbc_driver_library => "C:\mysql-connector-java-5.1.46/mysql-connector-java-5.1.46.jar"
    jdbc_driver_class => "com.mysql.jdbc.Driver"
    # our query
    statement => "SELECT * FROM user"
  }
}
output {
  stdout { codec => json_lines }
  elasticsearch {
    "hosts" => "localhost:9200"
    "index" => "users"
    "document_type" => "usersData"
  }
}
I'm trying to run Logstash from the command prompt using the below command:
C:\logstash-6.2.4\bin>logstash -f logstash.conf
Error: Could not find or load main class Files\Apache
Any help will be much appreciated. Thanks in advance!

how to connect cassandra with logstash input?

Logstash.conf
input { tcp { port => 7199 } } output { elasticsearch { hosts => ["localhost"] } }
Cassandra is running on port 7199 and the JHipster application is running on localhost:8080.
We are unable to get data into Logstash from my_application.
No log4j2 file found.
I think you can use the JDBC plugin:
https://github.com/logstash-plugins/logstash-input-jdbc
input {
  jdbc {
    jdbc_connection_string => "jdbc:cassandra://hostname:XXXX" # Your port
    jdbc_user => "user"                                        # The user value
    jdbc_password => "password"                                # The password
    jdbc_driver_library => "$PATH/cassandra_driver.jar"        # Jar path
    jdbc_driver_class => "org.apache.cassandra.cql.jdbc.CassandraDriver" # Driver
    statement => "SELECT * FROM keyspace.my_table"             # Your query
  }
}
I had the same issue. It was solved by downloading a Cassandra JDBC driver from DbSchema.
Also, when you want to add the jar files, put them in
logstashFolder/logstash-core/lib/jars
There seems to be a bug where Logstash only looks in this path for external jar files.
If some of the jar files are duplicated, keep only the latest ones.
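Copying the driver there would look something like this (a sketch; the jar name and Logstash install path are placeholders for your own):
cp /path/to/cassandra_driver.jar /path/to/logstashFolder/logstash-core/lib/jars/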

Logstash not starting up

I am trying to start Logstash 5.4 on my Linux RHEL 6 server but I'm getting the following message:
WARNING: Default JAVA_OPTS will be overridden by the JAVA_OPTS defined in the environment. Environment JAVA_OPTS are -Xms1G .Xmx64G
Error: Could not find or load main class .Xmx64G
Following is my logstash.conf, in which I'm trying to ingest data from SQL Server:
input {
  jdbc {
    jdbc_driver_library => "/usr/share/logstash/mysql-connector-java-5.1.42-bin.jar"
    jdbc_driver_class => "com.microsoft.sqlserver.jdbc.SQLServerDriver"
    jdbc_connection_string => "jdbc:sqlserver://9.37.92.72:1433;databaseName=KaiserPermanente;"
    jdbc_user => "sa"
    jdbc_password => "passw0rd!"
    statement => "select * from IEVDIncident ;"
  }
}
output {
  elasticsearch {
    hosts => "http://localhost:9200"
    index => "kaiserpermanente"
  }
  stdout { codec => json_lines }
}
Please tell me how I can resolve this one. Thanks.
It seems you have an environment variable JAVA_OPTS with the value -Xms1G .Xmx64G, so it overrides Logstash's own options. You need to change the variable to -Xms1G -Xmx64G, i.e. replace the . with -.
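For example, in a bash-style shell (a sketch; where JAVA_OPTS is actually set depends on your environment, e.g. a profile script or service definition):
export JAVA_OPTS="-Xms1G -Xmx64G"   # hyphen before Xmx64G, not a dot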
