I'm using Amazon Elasticsearch Service 2.3.4 and Logstash 2.3.0.
My configuration:
input {
jdbc {
# MySQL JDBC connection string to our database
jdbc_connection_string => "jdbc:mysql://awsmigration.XXXXXXXXX.ap-southeast-1.rds.amazonaws.com:3306/admin?zeroDateTimeBehavior=convertToNull"
# The user we wish to execute our statement as
jdbc_user => "dryrun"
jdbc_password => "dryruntesting"
# The path to our downloaded jdbc driver
jdbc_driver_library => "/opt/logstash/drivers/mysql-connector-java-5.1.39/mysql-connector-java-5.1.39-bin.jar"
# The name of the driver class for MySQL
jdbc_driver_class => "com.mysql.jdbc.Driver"
# our query
statement => "SELECT * from Receipt"
jdbc_paging_enabled => true
jdbc_page_size => 200
}
}
output {
elasticsearch {
index => "slurp_receipt"
document_type => "Receipt"
document_id => "%{uid}"
hosts => ["https://search-XXXXXXXXXXXX.ap-southeast-1.es.amazonaws.com:443"]
aws_access_key_id => 'XXXXXXXXXXXXXXXXX'
aws_secret_access_key => 'XXXXXXXXXXXXXXX'
}
}
I got this error:
Fri Aug 26 07:30:13 UTC 2016 WARN: Establishing SSL connection without server's identity verification is not recommended. According to MySQL 5.5.45+, 5.6.26+ and 5.7.6+ requirements SSL connection must be established by default if explicit option isn't set. For compliance with existing applications not using SSL the verifyServerCertificate property is set to 'false'. You need either to explicitly disable SSL by setting useSSL=false, or set useSSL=true and provide truststore for server certificate verification.
Unknown setting 'aws_access_key_id' for elasticsearch {:level=>:error}
Unknown setting 'aws_secret_access_key' for elasticsearch {:level=>:error}
Pipeline aborted due to error {:exception=>#<LogStash::ConfigurationError: Something is wrong with your configuration.>, :backtrace=>["/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.3.4-java/lib/logstash/config/mixin.rb:134:in `config_init'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.3.4-java/lib/logstash/outputs/base.rb:63:in `initialize'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.3.4-java/lib/logstash/output_delegator.rb:74:in `register'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.3.4-java/lib/logstash/pipeline.rb:181:in `start_workers'", "org/jruby/RubyArray.java:1613:in `each'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.3.4-java/lib/logstash/pipeline.rb:181:in `start_workers'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.3.4-java/lib/logstash/pipeline.rb:136:in `run'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.3.4-java/lib/logstash/agent.rb:473:in `start_pipeline'"], :level=>:error}
How can I solve it?
Assuming you are using the amazon_es plugin, your output should look like this:
output {
amazon_es {
index => "slurp_receipt"
hosts => ["https://search-XXXXXXXXXXXX.ap-southeast-1.es.amazonaws.com:443"]
aws_access_key_id => 'XXXXXXXXXXXXXXXXX'
aws_secret_access_key => 'XXXXXXXXXXXXXXX'
}
}
aws_access_key_id and aws_secret_access_key are not valid configuration options for the Logstash elasticsearch plugin.
See the plugin documentation.
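If the amazon_es output is not installed yet, you can add it with the Logstash plugin manager first. A minimal sketch (the plugin manager binary is bin/plugin on Logstash 2.x and bin/logstash-plugin on 5.x and later):
./bin/plugin install logstash-output-amazon_es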
Logstash.conf
input { tcp { port => 7199 } } output { elasticsearch { hosts => ["localhost"] } }
Cassandra is running on port 7199 and the JHipster application is running on localhost:8080.
We are unable to send data into Logstash from my_application.
No log4j2 file found.
I think you can use the JDBC plugin:
https://github.com/logstash-plugins/logstash-input-jdbc
input {
jdbc {
jdbc_connection_string => "jdbc:cassandra://hostname:XXXX" # Your port
jdbc_user => "user" # The user value
jdbc_password => "password" # The password
jdbc_driver_library => "$PATH/cassandra_driver.jar" # Jar path
jdbc_driver_class => "org.apache.cassandra.cql.jdbc.CassandraDriver" # Driver
statement => "SELECT * FROM keyspace.my_table" # Your query
}
}
I had the same issue. It was solved by downloading a Cassandra JDBC driver from DbSchema.
Also, when you want to add the jar files, put them in
logstashFolder/logstash-core/lib/jar
There seems to be a bug where Logstash only looks in this path for external jar files.
Also, if some jar files are duplicated, use the latest ones.
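For example (a sketch; adjust logstashFolder and the driver file name to match your setup):
cp /path/to/cassandra_driver.jar logstashFolder/logstash-core/lib/jar/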
I am getting the below error while using the MongoDB Java Driver to read data from MongoDB and push it to Elasticsearch:
Error: mongodb.jdbc.MongoDriver not loaded. Are you sure you've included the correct jdbc driver in :jdbc_driver_library?
Platform Info:
OS- RHEL 6.6
Logstash- 5.5.0
Elasticsearch- 5.5.0
Mongodb- 3.2.13
Jars- mongodb-driver-core-3.4.2.jar, mongo-java-driver-3.4.2.jar and bson-3.4.2.jar
Logstash config
input{
jdbc{
jdbc_driver_library => "/home/pdwiwe/logstash-5.5.0/bin/mongo-java-driver-3.4.2.jar"
jdbc_driver_class => "mongodb.jdbc.MongoDriver"
jdbc_connection_string => "jdbc:mongo://hostname:27017?authSource=admin"
jdbc_user => "user"
jdbc_password => "pwd"
statement => "select * from system.users"
}
}
output {
if "_grokparsefailure" not in [tags]{
elasticsearch {
hosts => [ "localhost:9200" ]
index => "mongodb-data"
}
}
}
Logstash Service Start:
/home/pdwiwe/logstash-5.5.0/bin$ sh logstash -f mongo.conf
mongodb.jdbc.MongoDriver is not a Driver class in the mongo-java-driver.
AFAIK, this driver does not support JDBC.
Various JDBC drivers wrap the mongo-java-driver, such as Unity, Simba, and DbSchema.
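As a rough sketch of what the input could look like with one of those wrappers (the jar name and driver class below are illustrative assumptions for the UnityJDBC wrapper; check your vendor's documentation for the exact values and connection string format):
input {
  jdbc {
    jdbc_driver_library => "/home/pdwiwe/logstash-5.5.0/bin/unityjdbc.jar"       # hypothetical jar name/path
    jdbc_driver_class => "unity.jdbc.UnityDriver"                                # assumed class name for UnityJDBC
    jdbc_connection_string => "jdbc:mongo://hostname:27017?authSource=admin"     # exact format depends on the driver
    jdbc_user => "user"
    jdbc_password => "pwd"
    statement => "select * from system.users"
  }
}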
I am trying to start Logstash 5.4 on my Linux RHEL 6 server but I'm getting the following message:
WARNING: Default JAVA_OPTS will be overridden by the JAVA_OPTS defined in the environment. Environment JAVA_OPTS are -Xms1G .Xmx64G
Error: Could not find or load main class .Xmx64G
Following is my logstash.conf, in which I'm trying to ingest data from SQL Server:
input {
jdbc {
jdbc_driver_library => "/usr/share/logstash/mysql-connector-java-5.1.42-bin.jar"
jdbc_driver_class => "com.microsoft.sqlserver.jdbc.SQLServerDriver"
jdbc_connection_string => "jdbc:sqlserver://9.37.92.72:1433;databaseName=KaiserPermanente;"
jdbc_user => "sa"
jdbc_password => "passw0rd!"
statement => "select * from IEVDIncident ;"
}
}
output {
elasticsearch {
hosts => "http://localhost:9200"
index => "kaiserpermanente"
}
stdout { codec => json_lines }
}
Please tell me how I can resolve this one. Thanks.
It seems you have an environment variable JAVA_OPTS with the value -Xms1G .Xmx64G, which overrides Logstash's own options. You need to change the variable to -Xms1G -Xmx64G, i.e. replace the . with a -.
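For example, before starting Logstash again (a sketch; add it to your shell profile if you want it to persist):
export JAVA_OPTS="-Xms1G -Xmx64G"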
I'm trying to run this:
input {
twitter {
# add your data
consumer_key => "shhhhh"
consumer_secret => "shhhhh"
oauth_token => "shhhhh"
oauth_token_secret => "shhhhh"
keywords => ["words"]
full_tweet => true
}
}
output {
elasticsearch_http {
host => "shhhhh"
index => "idx_ls"
index_type => "tweet_ls"
}
}
This is the error I got:
Sending Logstash's logs to /usr/local/Cellar/logstash/5.2.1/libexec/logs which is now configured via log4j2.properties
[2017-02-24T04:48:03,060][ERROR][logstash.plugins.registry] Problems loading a plugin with {:type=>"output", :name=>"elasticsearch_http", :path=>"logstash/outputs/elasticsearch_http", :error_message=>"NameError", :error_class=>NameError, :error_backtrace=>["/usr/local/Cellar/logstash/5.2.1/libexec/logstash-core/lib/logstash/plugins/registry.rb:221:in `namespace_lookup'", "/usr/local/Cellar/logstash/5.2.1/libexec/logstash-core/lib/logstash/plugins/registry.rb:157:in `legacy_lookup'", "/usr/local/Cellar/logstash/5.2.1/libexec/logstash-core/lib/logstash/plugins/registry.rb:133:in `lookup'", "/usr/local/Cellar/logstash/5.2.1/libexec/logstash-core/lib/logstash/plugins/registry.rb:175:in `lookup_pipeline_plugin'", "/usr/local/Cellar/logstash/5.2.1/libexec/logstash-core/lib/logstash/plugin.rb:129:in `lookup'", "/usr/local/Cellar/logstash/5.2.1/libexec/logstash-core/lib/logstash/pipeline.rb:452:in `plugin'", "(eval):12:in `initialize'", "org/jruby/RubyKernel.java:1079:in `eval'", "/usr/local/Cellar/logstash/5.2.1/libexec/logstash-core/lib/logstash/pipeline.rb:98:in `initialize'", "/usr/local/Cellar/logstash/5.2.1/libexec/logstash-core/lib/logstash/agent.rb:246:in `create_pipeline'", "/usr/local/Cellar/logstash/5.2.1/libexec/logstash-core/lib/logstash/agent.rb:95:in `register_pipeline'", "/usr/local/Cellar/logstash/5.2.1/libexec/logstash-core/lib/logstash/runner.rb:264:in `execute'", "/usr/local/Cellar/logstash/5.2.1/libexec/vendor/bundle/jruby/1.9/gems/clamp-0.6.5/lib/clamp/command.rb:67:in `run'", "/usr/local/Cellar/logstash/5.2.1/libexec/logstash-core/lib/logstash/runner.rb:183:in `run'", "/usr/local/Cellar/logstash/5.2.1/libexec/vendor/bundle/jruby/1.9/gems/clamp-0.6.5/lib/clamp/command.rb:132:in `run'", "/usr/local/Cellar/logstash/5.2.1/libexec/lib/bootstrap/environment.rb:71:in `(root)'"]}
[2017-02-24T04:48:03,073][ERROR][logstash.agent ] fetched an invalid config {:config=>"input { \n twitter {\n # add your data\n consumer_key => \"shhhhh\"\n consumer_secret => \"Shhhhhh\"\n oauth_token => \"shhhh\"\n oauth_token_secret => \"shhhhh\"\n keywords => [\"word\"]\n full_tweet => true\n }\n}\noutput { \n elasticsearch_http {\n host => \"shhhhh.amazonaws.com\"\n index => \"idx_ls\"\n index_type => \"tweet_ls\"\n }\n}\n", :reason=>"Couldn't find any output plugin named 'elasticsearch_http'. Are you sure this is correct? Trying to load the elasticsearch_http output plugin resulted in this error: Problems loading the requested plugin named elasticsearch_http of type output. Error: NameError NameError"}
I've tried installing elasticsearch_http but it doesn't seem to be a package. I've also tried
logstash-plugin install logstash-input-elasticsearch
and
logstash-plugin install logstash-output-elasticsearch
which did install, but I got the same error.
I'm totally new to Logstash, so this might be very simple.
I am trying to follow this: https://www.rittmanmead.com/blog/2015/08/three-easy-ways-to-stream-twitter-data-into-elasticsearch/
I tried Val's answer and got this:
[2017-02-24T05:12:45,385][WARN ][logstash.outputs.elasticsearch] Attempted to resurrect connection to dead ES instance, but got an error. {:url=>#<URI::HTTP:0x4c2332e0 URL:http://shhhhh:9200/>, :error_type=>LogStash::Outputs::ElasticSearch::HttpClient::Pool::HostUnreachableError, :error=>"Elasticsearch Unreachable: [http://sshhhhhh:9200/][Manticore::ConnectTimeout] connect timed out"}
I can go to the URL and get a response in the browser, and the permissions are set open, so I'm not sure what the issue would be.
The elasticsearch_http output no longer exists. You need to use the elasticsearch output instead:
elasticsearch {
hosts => "localhost:9200"
index => "idx_ls"
document_type => "tweet_ls"
}
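Regarding the connection timeout in the follow-up: that attempt went to port 9200, but an AWS-hosted Elasticsearch domain (as in the original question above) serves HTTPS on port 443, so it may be worth pointing hosts at the full endpoint, for example:
hosts => "https://search-XXXXXXXXXXXX.ap-southeast-1.es.amazonaws.com:443"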
Just an addition to Val's answer: you can also leave the port out of your hosts parameter:
output {
elasticsearch {
index => "idx_ls"
document_type => "tweet_ls"
hosts => "localhost"
}
}
By default ES runs on port 9200, so you don't have to set it explicitly.
After watching this tutorial:
https://www.youtube.com/watch?v=ZnI_rlrei1s
I'm trying to fetch my localhost MySQL data (using Laravel Valet MySQL) with Logstash and JDBC and send it to an Elasticsearch server.
This is my config:
# file: db.conf
input {
jdbc {
# MySQL jdbc connection string to our database, mydb
jdb_connection_string => "jdbc:mysql://localhost:3306/dragon"
# The user we wish to execute our statement as
jdbc_user => "root"
# The user password
jdbc_password => ""
# The path to our downloaded jdbc driver
jdbc_driver_library => "/Users/muaz/downloads/logstash/mysql-connector-java-5.1.39-bin.jar"
# The name of the deliver clas for MySQL
jdbc_driver_class => "com.mysql.jdbc.Driver"
# Our query
statement = "SELECT * from Receipt"
}
}
output {
# stdout { codec => json_lines }
elasticsearch {
# protocol = https
index => "power_receipt"
document_type => "Receipt"
document_id => "%{id}"
host => "https://search-power-yidhfhkidiiddcccyhyrijaagamu.ap-southeast-1.es.amazonaws.com"
}
}
And I run it with this command (in the logstash folder):
./bin/logstash agent -f db.conf
It produces:
fetched an invalid config {:config=>"# file: db.conf\ninput {\n\tjdbc {\n\t\t# MySQL jdbc connection string to our database, mydb\n\t\tjdb_connection_string => \"jdbc:mysql://localhost:3306/dragon\"\n\t\t# The user we wish to execute our statement as \n\t\tjdbc_user => \"root\"\n\t\t# The user password\n\t\tjdbc_password => \"\"\n\t\t# The path to our downloaded jdbc driver\n\t\tjdbc_driver_library => \"/Users/muaz/downloads/logstash/mysql-connector-java-5.1.39-bin.jar\"\n\t\t# The name of the deliver clas for MySQL\n\t\tjdbc_driver_class => \"com.mysql.jdbc.Driver\"\n\t\t# Our query\n\t\tstatement = \"SELECT * from Receipt\"\n\t}\n}\noutput {\n\t# stdout { codec => json_lines }\n\telasticsearch {\n\t\t# protocol = https\n\t\tindex => \"slurp_receipt\"\n\t\tdocument_type => \"Receipt\"\n\t\tdocument_id => \"%{id}\"\n\t\thost => \"https://search-power-yidhfhkidiiddcccyhyrijaagamu.ap-southeast-1.es.amazonaws.com\"\n\t}\n}\n\n\n", :reason=>"Expected one of #, => at line 15, column 13 (byte 521) after # file: db.conf\ninput {\n\tjdbc {\n\t\t# MySQL jdbc connection string to our database, mydb\n\t\tjdb_connection_string => \"jdbc:mysql://localhost:3306/dragon\"\n\t\t# The user we wish to execute our statement as \n\t\tjdbc_user => \"root\"\n\t\t# The user password\n\t\tjdbc_password => \"\"\n\t\t# The path to our downloaded jdbc driver\n\t\tjdbc_driver_library => \"/Users/muaz/downloads/logstash/mysql-connector-java-5.1.39-bin.jar\"\n\t\t# The name of the deliver clas for MySQL\n\t\tjdbc_driver_class => \"com.mysql.jdbc.Driver\"\n\t\t# Our query\n\t\tstatement ", :level=>:error}
How can I solve it?
Thank you.
You have a typo on the last line of your jdbc input:
statement = "SELECT * from Receipt"
should read
statement => "SELECT * from Receipt"
Also, in your elasticsearch output you need to change
host => "https://search-power-yidhfhkidiiddcccyhyrijaagamu.ap-southeast-1.es.amazonaws.com"
to
hosts => ["https://search-power-yidhfhkidiiddcccyhyrijaagamu.ap-southeast-1.es.amazonaws.com"]