Logstash jdbc connect to mssql

Exception: ArgumentError
Stack: /usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/rufus-scheduler-3.0.9/lib/rufus/scheduler/cronline.rb:60:in `initialize'
/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/rufus-scheduler-3.0.9/lib/rufus/scheduler/jobs.rb:604:in `initialize'
/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/rufus-scheduler-3.0.9/lib/rufus/scheduler.rb:629:in `do_schedule'
/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-integration-jdbc-5.2.6/lib/logstash/plugin_mixins/jdbc/scheduler.rb:129:in `do_schedule'
/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/rufus-scheduler-3.0.9/lib/rufus/scheduler.rb:249:in `schedule_cron'
/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-integration-jdbc-5.2.6/lib/logstash/plugin_mixins/jdbc/scheduler.rb:23:in `start_cron_scheduler'
/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-integration-jdbc-5.2.6/lib/logstash/inputs/jdbc.rb:323:in `run'
/usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:410:in `inputworker'
/usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:401:in `block in start_input'
My input.conf:
input {
  jdbc {
    jdbc_driver_library => "/usr/share/logstash/logstash-core/lib/jars/mssql-jdbc.jar"
    jdbc_driver_class => "com.microsoft.sqlserver.jdbc.SQLServerDriver"
    jdbc_connection_string => "jdbc:sqlserver://db:1433;databaseName=db2022"
    jdbc_user => "xxx"
    jdbc_password => "xxx"
    statement => "Select * From db"
    schedule => "/14 * * *"
    add_field => { "tag" => "db" }
    type => "db"
  }
}
My Logstash version: logstash 7.17.6
My java:
openjdk version "11.0.16" 2022-07-19
OpenJDK Runtime Environment Temurin-11.0.16+8 (build 11.0.16+8)
OpenJDK 64-Bit Server VM Temurin-11.0.16+8 (build 11.0.16+8, mixed mode)
JDBC driver version: mssql-jdbc-7.2.2.jre11.jar
Can you help me? I don't understand what is wrong.
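For what it's worth, the ArgumentError raised from rufus-scheduler's cronline.rb usually means the schedule value is not a valid cron expression: the jdbc input's schedule option uses rufus-scheduler cron syntax, which expects five fields (minute, hour, day of month, month, day of week), while "/14 * * *" has only four and is missing the * in front of /14. A minimal sketch of the corrected input, assuming the intent was "every 14 minutes":
input {
  jdbc {
    jdbc_driver_library => "/usr/share/logstash/logstash-core/lib/jars/mssql-jdbc.jar"
    jdbc_driver_class => "com.microsoft.sqlserver.jdbc.SQLServerDriver"
    jdbc_connection_string => "jdbc:sqlserver://db:1433;databaseName=db2022"
    jdbc_user => "xxx"
    jdbc_password => "xxx"
    statement => "Select * From db"
    # five cron fields: minute hour day-of-month month day-of-week
    schedule => "*/14 * * * *"
    add_field => { "tag" => "db" }
    type => "db"
  }
}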

Related

Logstash Error: unable to load mysql-connector-java- from :jdbc_driver_library, no such file to load -- mysql-connector-java-

I want to move some data from MySQL to Elasticsearch using Logstash, but I get Error: unable to load mysql-connector-java- from :jdbc_driver_library, no such file to load -- mysql-connector-java-, and I already moved the MySQL connector to [logstash_folder]\logstash-core\lib\jars. This is my config:
input {
  jdbc {
    jdbc_driver_library => "mysql-connector-java-8.0.18.jar"
    jdbc_driver_class => "com.mysql.cj.jdbc.Driver"
    jdbc_connection_string => "jdbc:mysql://localhost:3306/reporting"
    jdbc_user => "*"
    jdbc_password => "*"
    tracking_column => "id_activity"
    use_column_value => true
    statement => "select tbactivity.id_project, tbuser.name, tbproject.project_name, tbactivity.dateinsert, tbactivity.datetime_start, tbactivity.datetime_end, tbactivity.description from tbactivity inner join tbproject on tbactivity.id_project = tbproject.id_project inner join tbuser on tbactivity.id_user = tbuser.id_user where tbproject.id_category = 2"
  }
}
filter {
  mutate {
    copy => { "id_activity" => "[@metadata][_id]" }
  }
}
output {
  elasticsearch {
    user => "elastic"
    password => "changeme"
    hosts => "localhost:9200"
    index => "activity"
    document_type => "text"
    document_id => "%{[id_activity]}"
  }
}
And the error I get:
Error: unable to load mysql-connector-java-8.0.18.jar from :jdbc_driver_library, no such file to load -- mysql-connector-java-8.0.18
Exception: LogStash::PluginLoadingError
Stack: D:/logstash-7.5.2/vendor/bundle/jruby/2.5.0/gems/logstash-input-jdbc-4.3.19/lib/logstash/plugin_mixins/jdbc/jdbc.rb:152:in `block in load_driver_jars'
org/jruby/RubyArray.java:1800:in `each'
D:/logstash-7.5.2/vendor/bundle/jruby/2.5.0/gems/logstash-input-jdbc-4.3.19/lib/logstash/plugin_mixins/jdbc/jdbc.rb:145:in `load_driver_jars'
D:/logstash-7.5.2/vendor/bundle/jruby/2.5.0/gems/logstash-input-jdbc-4.3.19/lib/logstash/plugin_mixins/jdbc/jdbc.rb:167:in `open_jdbc_connection'
D:/logstash-7.5.2/vendor/bundle/jruby/2.5.0/gems/logstash-input-jdbc-4.3.19/lib/logstash/plugin_mixins/jdbc/jdbc.rb:243:in `execute_statement'
D:/logstash-7.5.2/vendor/bundle/jruby/2.5.0/gems/logstash-input-jdbc-4.3.19/lib/logstash/inputs/jdbc.rb:309:in `execute_query'
D:/logstash-7.5.2/vendor/bundle/jruby/2.5.0/gems/logstash-input-jdbc-4.3.19/lib/logstash/inputs/jdbc.rb:281:in `run'
D:/logstash-7.5.2/logstash-core/lib/logstash/java_pipeline.rb:332:in `inputworker'
D:/logstash-7.5.2/logstash-core/lib/logstash/java_pipeline.rb:324:in `block in start_input'
[2020-02-03T14:54:49,708][ERROR][logstash.javapipeline ][main] A plugin had an unrecoverable error. Will restart this plugin.
Pipeline_id:main
I hope you can help me
Thank You
Regards
Kevin
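A hedged suggestion based on the stack trace: load_driver_jars resolves a bare file name relative to Logstash's working directory, not relative to logstash-core\lib\jars, so either give jdbc_driver_library the absolute path to the jar, or omit the option entirely, since jars placed in logstash-core\lib\jars are already on Logstash's classpath. A sketch of the lines that change, with the install path assumed from the stack trace:
input {
  jdbc {
    # absolute path to the copied jar (install path assumed from the stack trace)
    jdbc_driver_library => "D:/logstash-7.5.2/logstash-core/lib/jars/mysql-connector-java-8.0.18.jar"
    jdbc_driver_class => "com.mysql.cj.jdbc.Driver"
    jdbc_connection_string => "jdbc:mysql://localhost:3306/reporting"
    # remaining settings unchanged
  }
}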

Exception: LogStash::ConfigurationError

I am trying to connect to an Oracle database via Logstash and am getting the below error.
Error: oracle.jdbc.OracleDriver not loaded. Are you sure you've included the correct jdbc driver in :jdbc_driver_library?
Exception: LogStash::ConfigurationError
Stack: D:/softwares/logstash-6.2.4/logstash-6.2.4/vendor/bundle/jruby/2.3.0/gems/logstash-input-jdbc-4.3.9/lib/logstash/plugin_mixins/jdbc.rb:162:in `open_jdbc_connection'
Please find my logstash config file:
input {
  jdbc {
    jdbc_driver_library => "D:\data\ojdbc14.jar"
    jdbc_driver_class => "oracle.jdbc.OracleDriver"
    jdbc_connection_string => "jdbc:oracle:thin:@localhost:1521:xe"
    jdbc_user => "user_0ne"
    jdbc_password => "xxxyyyzzz"
    statement => "SELECT * FROM PRODUCT"
  }
}
output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "my_index"
  }
}
Logstash config file (corrected):
input {
  jdbc {
    jdbc_driver_library => "D:\Karthikeyan\data\ojdbc14.jar"
    # problem was in this line; the driver class needs the Java:: prefix
    jdbc_driver_class => "Java::oracle.jdbc.OracleDriver"
    jdbc_connection_string => "jdbc:oracle:thin:@localhost:1521:xe"
    jdbc_user => "vb"
    jdbc_password => "123456"
    statement => "SELECT * FROM VB_PRODUCT"
  }
}
output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "my_index"
  }
}
You can validate the configuration file using:
/usr/share/logstash/bin/logstash -f /etc/logstash/conf.d/sample.conf --config.test_and_exit
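Since the stack trace shows a Windows install under D:/softwares/logstash-6.2.4, the equivalent validation there would run through the .bat entry point; a sketch, with the config path assumed:
D:\softwares\logstash-6.2.4\logstash-6.2.4\bin\logstash.bat -f D:\path\to\sample.conf --config.test_and_exit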

Logstash Error: A plugin had an unrecoverable error

A plugin had an unrecoverable error. Will restart this plugin.
Plugin: <LogStash::Inputs::Jdbc jdbc_connection_string=>"jdbc:mysql://dns/db", jdbc_user=>"root", jdbc_password=><password>, jdbc_driver_library=>"/home/ubuntu/mysql-connector-java-5.1.21.jar", jdbc_driver_class=>"com.mysql.jdbc.Driver", statement=>"SELECT * FROM table;", codec=><LogStash::Codecs::JSON id=>"json_ff05abb6-1b36-4ebf-aba1-1f8cf47a13a5", enable_metric=>true, charset=>"UTF-8">, id=>"93f23172918335b7f06ba3f8ee201c0b78f2c8e2-1", enable_metric=>true, jdbc_paging_enabled=>false, jdbc_page_size=>100000, jdbc_validate_connection=>false, jdbc_validation_timeout=>3600, jdbc_pool_timeout=>5, sql_log_level=>"info", connection_retry_attempts=>1, connection_retry_attempts_wait_time=>0.5, parameters=>{"sql_last_value"=>1970-01-01 00:00:00 UTC}, last_run_metadata_path=>"/home/ubuntu/.logstash_jdbc_last_run", use_column_value=>false, tracking_column_type=>"numeric", clean_run=>false, record_last_run=>true, lowercase_column_names=>true>
Error: undefined method `close_jdbc_connection' for #<Sequel::JDBC::Database:0x745d6c19>
logstash-5.5.0
Conf file:
input {
  jdbc {
    jdbc_connection_string => "jdbc:mysql://dns:3306/stats"
    jdbc_user => "root"
    jdbc_password => "sdf"
    #jdbc_validate_connection => true
    jdbc_driver_library => "/home/ubuntu/mysql-connector-java-5.1.42/mysql-connector-java-5.1.42-bin.jar"
    jdbc_driver_class => "com.mysql.jdbc.Driver"
    statement => "SELECT * FROM table;"
    #codec => "json"
  }
}
output {
  elasticsearch {
    index => "mysqltest"
    document_type => "mysqltest_type"
    document_id => "%{id}"
    hosts => "dns:80"
  }
}
What is that about? How can I solve this?
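One hedged lead: the error shows the plugin calling close_jdbc_connection on its Sequel database handle, where no such method exists, which typically points at a version mismatch between the logstash-input-jdbc plugin bundled with 5.5.0 and its dependencies. A first thing to try, assuming the plugin is simply stale:
# run from the Logstash install directory
bin/logstash-plugin update logstash-input-jdbc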

How to load data into AWS Elasticsearch using Logstash

I didn't find any proper documentation in the Logstash output plugins for loading data into AWS ES; I only found that the output plugin speaks the HTTP protocol. Without specifying port 9200, can we load data into AWS ES?
input {
  jdbc {
    jdbc_connection_string => "jdbc:mysql://localhost/elasticsearch"
    jdbc_user => "root"
    jdbc_password => "empower"
    #jdbc_validate_connection => true
    jdbc_driver_library => "/home/wtc082/Documents/com.mysql.jdbc_5.1.5.jar"
    jdbc_driver_class => "com.mysql.jdbc.Driver"
    statement => "SELECT * FROM index_part_content_local LIMIT 10;"
    schedule => "1 * * * *"
    #codec => "json"
  }
}
output {
  elasticsearch {
    index => "mysqltest"
    document_type => "mysqltest_type"
    document_id => "%{partnum}"
    hosts => "AWSURI"
  }
}
Can we do it like this?
Actually, I was using Logstash 2.4 to load data from MySQL into ES 5.x; when I used the Logstash 5.x version it solved my issue. I didn't get any error while running the conf file.
Thanks Val
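On the port question itself: Amazon Elasticsearch Service domains are served over HTTPS on port 443 rather than 9200, so the stock elasticsearch output can reach them as long as the scheme and port are spelled out in hosts. A minimal sketch with a hypothetical domain endpoint:
output {
  elasticsearch {
    # hypothetical AWS ES endpoint; HTTPS on 443, no :9200 involved
    hosts => ["https://search-mydomain-abc123.us-east-1.es.amazonaws.com:443"]
    index => "mysqltest"
  }
}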

Logstash Amazon Output Elasticsearch Config Error

I'm trying to send data from MySQL to Amazon Elasticsearch Service using the Logstash JDBC MySQL input, and I got an error. My config db.conf is as follows:
input {
  jdbc {
    # MySQL jdbc connection string to our database
    jdbc_connection_string => "jdbc:mysql://awsmigration.XXXXXXXX.ap-southeast-1.rds.amazonaws.com:3306/admin_slurp?zeroDateTimeBehavior=convertToNull"
    # The user we wish to execute our statement as
    jdbc_user => "root"
    jdbc_password => "XXXXXX"
    # The path to our downloaded jdbc driver
    jdbc_driver_library => "/opt/logstash/drivers/mysql-connector-java-5.1.39/mysql-connector-java-5.1.39-bin.jar"
    # The name of the driver class for MySQL
    jdbc_driver_class => "com.mysql.jdbc.Driver"
    # our query
    statement => "SELECT *, id as _id from Receipt"
    jdbc_paging_enabled => true
    jdbc_page_size => 200
  }
}
output {
  amazon_es {
    hosts => ["https://search-XXXXXXXX.ap-southeast-1.es.amazonaws.com:443"]
    region => "ap-southeast-1"
    # aws_access_key_id, aws_secret_access_key optional if instance profile is configured
    aws_access_key_id => 'XXXXXXXX'
    aws_secret_access_key => 'XXXXXXXX'
    index => "slurp_receipt"
  }
}
The error:
fetched an invalid config {:config=>" jdbc {\n # Postgres jdbc connection string to our database, mydb\n jdbc_connection_string => \"jdbc:mysql://awsmigration.XXXXXXXX.ap-southeast-1.rds.amazonaws.com:3306/admin_slurp?zeroDateTimeBehavior=convertToNull\"\n # The user we wish to execute our statement as\n jdbc_user => \"dryrun\"\n jdbc_password => \"dryruntesting\"\n # The path to our downloaded jdbc driver\n jdbc_driver_library => \"/opt/logstash/drivers/mysql-connector-java-5.1.39/mysql-connector-java-5.1.39-bin.jar\"\n # The name of the driver class for Postgresql\n jdbc_driver_class => \"com.mysql.jdbc.Driver\"\n # our query\n statement => \"SELECT *, id as _id from Receipt\"\n\n jdbc_paging_enabled => true\n jdbc_page_size => 200\n }\n}\noutput {\n amazon_es {\n hosts => [\"https://search-XXXXXXXX-southeast-1.es.amazonaws.com:443\"]\n region => \"ap-southeast-1\"\n # aws_access_key_id, aws_secret_access_key optional if instance profile is configured\n aws_access_key_id => 'XXXXXXXX'\n aws_secret_access_key => 'XXXXXXXX'\n index => \"slurp_receipt\"\n }\n}\n\n\n", :reason=>"Expected one of #, input, filter, output at line 1, column 5 (byte 5) after ", :level=>:error}
I'm using Ubuntu 14, Logstash 2.3.4.
How to solve it?
Thank you
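One clue worth noting: the :config dump inside the error begins with " jdbc {" rather than "input {", and the parser fails at line 1, column 5 expecting input, filter, or output, so the text Logstash actually read is missing its opening input block (a truncated file, or -f pointing at a fragment). Whatever file gets loaded has to open at the top level the way the posted config does:
input {
  jdbc {
    # settings as posted above
  }
}
output {
  # ...
}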
