Logstash cannot connect to Elasticsearch

{:timestamp=>"2017-07-19T15:56:36.517000+0530", :message=>"Attempted to send a bulk request to Elasticsearch configured at '[\"http://localhost:9200\"]', but Elasticsearch appears to be unreachable or down!", :error_message=>"Connection refused (Connection refused)", :class=>"Manticore::SocketException", :level=>:error}
{:timestamp=>"2017-07-19T15:56:37.761000+0530", :message=>"Connection refused (Connection refused)", :class=>"Manticore::SocketException", :backtrace=>["/opt/logstash/vendor/bundle/jruby/1.9/gems/manticore-0.6.0-java/lib/manticore/response.rb:37:in `initialize'", "org/jruby/RubyProc.java:281:in `call'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/manticore-0.6.0-java/lib/manticore/response.rb:79:in `call'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/manticore-0.6.0-java/lib/manticore/response.rb:256:in `call_once'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/manticore-0.6.0-java/lib/manticore/response.rb:153:in `code'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/elasticsearch-transport-1.0.18/lib/elasticsearch/transport/transport/http/manticore.rb:84:in `perform_request'", "org/jruby/RubyProc.java:281:in `call'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/elasticsearch-transport-1.0.18/lib/elasticsearch/transport/transport/base.rb:257:in `perform_request'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/elasticsearch-transport-1.0.18/lib/elasticsearch/transport/transport/http/manticore.rb:67:in `perform_request'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/elasticsearch-transport-1.0.18/lib/elasticsearch/transport/transport/sniffer.rb:32:in `hosts'", "org/jruby/ext/timeout/Timeout.java:147:in `timeout'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/elasticsearch-transport-1.0.18/lib/elasticsearch/transport/transport/sniffer.rb:31:in `hosts'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/elasticsearch-transport-1.0.18/lib/elasticsearch/transport/transport/base.rb:79:in `reload_connections!'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-output-elasticsearch-2.7.1-java/lib/logstash/outputs/elasticsearch/http_client.rb:72:in `sniff!'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-output-elasticsearch-2.7.1-java/lib/logstash/outputs/elasticsearch/http_client.rb:60:in `start_sniffing!'", "org/jruby/ext/thread/Mutex.java:149:in `synchronize'", 
"/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-output-elasticsearch-2.7.1-java/lib/logstash/outputs/elasticsearch/http_client.rb:60:in `start_sniffing!'", "org/jruby/RubyKernel.java:1479:in `loop'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-output-elasticsearch-2.7.1-java/lib/logstash/outputs/elasticsearch/http_client.rb:59:in `start_sniffing!'"], :level=>:error}
{:timestamp=>"2017-07-19T15:56:38.520000+0530", :message=>"Attempted to send a bulk request to Elasticsearch configured at '[\"http://localhost:9200\"]', but Elasticsearch appears to be unreachable or down!", :error_message=>"Connection refused (Connection refused)", :class=>"Manticore::SocketException", :level=>:error}
Though Elasticsearch is running on 127.0.0.1:9200.
I do not understand where Logstash is taking this configuration from.
I have not configured Logstash to connect to Elasticsearch on localhost.
In logstash.service I have
ExecStart=/usr/share/logstash/bin/logstash "--path.settings" "/etc/logstash"
and in /etc/logstash I have logstash.yml with
path.config: /etc/logstash/conf.d
In /etc/logstash/conf.d I have
output {
  elasticsearch {
    hosts => ["10.2.0.10:9200"]
    manage_template => false
    index => "%{[@metadata][beat]}-%{+YYYY.MM.dd}"
    document_type => "%{[@metadata][type]}"
  }
}

conf.d is a directory, right; you need a file in there, something like myconf.conf, in the following format:
input {
}
filter {
  # can be empty
}
output {
}
Once you apply all your changes, you need to restart your Logstash service for it to pick them up. You can also control that in your logstash.yml settings file if you want Logstash to reload itself automatically whenever you change any of the files under conf.d.
You can also break up your conf files, e.g. 1_input.conf, 2_filter.conf and 99_output.conf, so that each contains its own plugin section, i.e. input, filter and output.
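The auto-reload behavior mentioned above is controlled by the `config.reload.*` settings in logstash.yml; a minimal sketch (the interval value here is just an example):

```yaml
# logstash.yml — make Logstash watch the pipeline config files and
# reload them automatically instead of requiring a service restart
config.reload.automatic: true
config.reload.interval: 3s   # how often to check conf.d for changes
```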

Start Elasticsearch.
Write a conf file for Logstash to connect and upload data into Elasticsearch.
input {
  file {
    type => "csv"
    path => "path for csv."
    start_position => "beginning"
  }
}
filter {
  csv {
    columns => ["Column1","Column2"]
    separator => ","
  }
  mutate {
    convert => {"Column1" => "float"}
  }
}
output {
  elasticsearch {
    hosts => "http://localhost:9200"
  }
  stdout { codec => rubydebug }
}
The host for Elasticsearch can be configured in the elasticsearch.yml file.
Run the conf file: logstash.bat -f abc.conf
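The relevant elasticsearch.yml settings are `network.host` and `http.port`; the values below are a sketch of a local-only setup, not necessarily what your node uses:

```yaml
# elasticsearch.yml — where Elasticsearch binds its HTTP interface
network.host: 127.0.0.1
http.port: 9200
```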

Related

Logstash pipeline is failing when adding filter block in it

I am creating a Logstash pipeline where I give a log file as input and read those logs in Elasticsearch. I want to add a geoip filter to my Logstash pipeline configuration, but when I add it, the pipeline fails and shuts down.
Here are the errors:
[2022-03-17T12:41:05,243][WARN ][logstash.outputs.elasticsearch][main] Elasticsearch Output configured with `ecs_compatibility => v8`, which resolved to an UNRELEASED preview of version 8.0.0 of the Elastic Common Schema. Once ECS v8 and an updated release of this plugin are publicly available, you will need to update this plugin to resolve this warning.
[2022-03-17T12:41:05,293][ERROR][logstash.javapipeline ][main] Pipeline error {:pipeline_id=>"main", :exception=>#<LogStash::ConfigurationError: GeoIP Filter in ECS-Compatiblity mode requires a `target` when `source` is not an `ip` sub-field, eg. [client][ip]>, :backtrace=>["D:/logstash-8.1.0/vendor/bundle/jruby/2.5.0/gems/logstash-filter-geoip-7.2.11-java/lib/logstash/filters/geoip.rb:143:in `auto_target_from_source!'", "D:/logstash-8.1.0/vendor/bundle/jruby/2.5.0/gems/logstash-filter-geoip-7.2.11-java/lib/logstash/filters/geoip.rb:133:in `setup_target_field'", "D:/logstash-8.1.0/vendor/bundle/jruby/2.5.0/gems/logstash-filter-geoip-7.2.11-java/lib/logstash/filters/geoip.rb:108:in `register'", "org/logstash/config/ir/compiler/AbstractFilterDelegatorExt.java:75:in `register'", "D:/logstash-8.1.0/logstash-core/lib/logstash/java_pipeline.rb:232:in `block in register_plugins'", "org/jruby/RubyArray.java:1821:in `each'", "D:/logstash-8.1.0/logstash-core/lib/logstash/java_pipeline.rb:231:in `register_plugins'", "D:/logstash-8.1.0/logstash-core/lib/logstash/java_pipeline.rb:590:in `maybe_setup_out_plugins'", "D:/logstash-8.1.0/logstash-core/lib/logstash/java_pipeline.rb:244:in `start_workers'", "D:/logstash-8.1.0/logstash-core/lib/logstash/java_pipeline.rb:189:in `run'", "D:/logstash-8.1.0/logstash-core/lib/logstash/java_pipeline.rb:141:in `block in start'"], "pipeline.sources"=>["D:/logstash-8.1.0/my-logstash.conf"], :thread=>"#<Thread:0x6ea94258 run>"}
[2022-03-17T12:41:05,314][INFO ][logstash.javapipeline ][main] Pipeline terminated {"pipeline.id"=>"main"}
[2022-03-17T12:41:05,357][INFO ][logstash.outputs.elasticsearch][main] Using a default mapping template {:es_version=>8, :ecs_compatibility=>:v8}
[2022-03-17T12:41:05,390][ERROR][logstash.agent ] Failed to execute action {:id=>:main, :action_type=>LogStash::ConvergeResult::FailedAction, :message=>"Could not execute action: PipelineAction::Create<main>, action_result: false", :backtrace=>nil}
[2022-03-17T12:41:05,499][DEBUG][logstash.instrument.periodicpoller.os] Stopping
[2022-03-17T12:41:05,523][DEBUG][logstash.instrument.periodicpoller.jvm] Stopping
[2022-03-17T12:41:05,525][DEBUG][logstash.instrument.periodicpoller.persistentqueue] Stopping
[2022-03-17T12:41:05,532][DEBUG][logstash.instrument.periodicpoller.deadletterqueue] Stopping
[2022-03-17T12:41:05,556][DEBUG][logstash.agent ] Shutting down all pipelines {:pipelines_count=>0}
When I use the below configuration without the filter, it works fine:
input {
  file {
    path => "D:/nest/es-logging-example/log/info/*.log"
    start_position => beginning
    sincedb_path => "NULL"
  }
}
output {
  elasticsearch {
    hosts => "localhost:9200"
    index => "myapplogs"
  }
  stdout {}
}
But when I add the filter to the configuration file, it fails and shuts down:
input {
  file {
    path => "D:/nest/es-logging-example/log/info/*.log"
    start_position => beginning
    sincedb_path => "NULL"
  }
}
filter {
  geoip {
    source => "clientip"
  }
}
output {
  elasticsearch {
    hosts => "localhost:9200"
    index => "myapplogs"
  }
  stdout {}
}
What am I doing wrong in the 2nd configuration?
The error states it plainly: the GeoIP filter in ECS-compatibility mode requires a target when source is not an ip sub-field. You're simply missing an explicit target field.
So your filter should look like this:
filter {
  geoip {
    source => "clientip"
    target => "clientgeo"
  }
}
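Alternatively, as the error message itself hints, no target is needed when the source field already is an ECS-style ip sub-field. A sketch, assuming (hypothetically) your events stored the address under [client][ip] instead of a top-level clientip:

```
filter {
  geoip {
    # no target required here: [client][ip] is an `ip` sub-field,
    # so ECS-compatibility mode can derive the target automatically
    source => "[client][ip]"
  }
}
```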

Logstash sync mongo data to elasticsearch

I'm a newbie at using Logstash and Elasticsearch. I want to sync my MongoDB data into Elasticsearch using the Logstash plugin (logstash-input-mongodb).
My mongodata.conf is:
input {
  uri => 'mongodb://127.0.0.1:27017/final?ssl=true'
  placeholder_db_dir => '/opt/logstash-mongodb/'
  placeholder_db_name => 'logstash_sqlite.db'
  collection => 'twitter_stream'
  batch_size => 5000
}
filter {
}
output {
  stdout {
    codec => rubydebug
  }
  elasticsearch {
    action => "index"
    index => "twitter_stream"
    hosts => ["localhost:9200"]
  }
}
When I run bin/logstash -f /etc/logstash/conf.d/mongodata.conf --path.settings /etc/logstash/, the following error is displayed:
Sending Logstash logs to /var/log/logstash which is now configured via log4j2.properties
[2020-02-28T08:48:20,246][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified
[2020-02-28T08:48:20,331][INFO ][logstash.runner ] Starting Logstash {"logstash.version"=>"7.6.0"}
[2020-02-28T08:48:20,883][ERROR][logstash.agent ] Failed to execute action {:action=>LogStash::PipelineAction::Create/pipeline_id:main, :exception=>"LogStash::ConfigurationError", :message=>"Expected one of [ \t\r\n], "#", "{" at line 2, column 13 (byte 21) after input {\n uri ", :backtrace=>["/usr/share/logstash/logstash-core/lib/logstash/compiler.rb:47:in compile_imperative'", "/usr/share/logstash/logstash-core/lib/logstash/compiler.rb:55:in compile_graph'", "/usr/share/logstash/logstash-core/lib/logstash/compiler.rb:17:in block in compile_sources'", "org/jruby/RubyArray.java:2580:in map'", "/usr/share/logstash/logstash-core/lib/logstash/compiler.rb:14:in compile_sources'", "org/logstash/execution/AbstractPipelineExt.java:161:in initialize'", "org/logstash/execution/JavaBasePipelineExt.java:47:in initialize'", "/usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:27:in initialize'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline_action/create.rb:36:in execute'", "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:326:in block in converge_state'"]}
[2020-02-28T08:48:21,114][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600}
[2020-02-28T08:48:25,969][INFO ][logstash.runner ] Logstash shut down.
Please help me, I don't have any idea about this.
Your configuration is wrong: you need to specify what type of input you are using.
Try changing your input to this:
input {
  mongodb {
    uri => 'mongodb://127.0.0.1:27017/final?ssl=true'
    placeholder_db_dir => '/opt/logstash-mongodb/'
    placeholder_db_name => 'logstash_sqlite.db'
    collection => 'twitter_stream'
    batch_size => 5000
  }
}

Error with elasticsearch_http Logstash

I'm trying to run this:
input {
  twitter {
    # add your data
    consumer_key => "shhhhh"
    consumer_secret => "shhhhh"
    oauth_token => "shhhhh"
    oauth_token_secret => "shhhhh"
    keywords => ["words"]
    full_tweet => true
  }
}
output {
  elasticsearch_http {
    host => "shhhhh"
    index => "idx_ls"
    index_type => "tweet_ls"
  }
}
This is the error I got:
Sending Logstash's logs to /usr/local/Cellar/logstash/5.2.1/libexec/logs which is now configured via log4j2.properties
[2017-02-24T04:48:03,060][ERROR][logstash.plugins.registry] Problems loading a plugin with {:type=>"output", :name=>"elasticsearch_http", :path=>"logstash/outputs/elasticsearch_http", :error_message=>"NameError", :error_class=>NameError, :error_backtrace=>["/usr/local/Cellar/logstash/5.2.1/libexec/logstash-core/lib/logstash/plugins/registry.rb:221:in `namespace_lookup'", "/usr/local/Cellar/logstash/5.2.1/libexec/logstash-core/lib/logstash/plugins/registry.rb:157:in `legacy_lookup'", "/usr/local/Cellar/logstash/5.2.1/libexec/logstash-core/lib/logstash/plugins/registry.rb:133:in `lookup'", "/usr/local/Cellar/logstash/5.2.1/libexec/logstash-core/lib/logstash/plugins/registry.rb:175:in `lookup_pipeline_plugin'", "/usr/local/Cellar/logstash/5.2.1/libexec/logstash-core/lib/logstash/plugin.rb:129:in `lookup'", "/usr/local/Cellar/logstash/5.2.1/libexec/logstash-core/lib/logstash/pipeline.rb:452:in `plugin'", "(eval):12:in `initialize'", "org/jruby/RubyKernel.java:1079:in `eval'", "/usr/local/Cellar/logstash/5.2.1/libexec/logstash-core/lib/logstash/pipeline.rb:98:in `initialize'", "/usr/local/Cellar/logstash/5.2.1/libexec/logstash-core/lib/logstash/agent.rb:246:in `create_pipeline'", "/usr/local/Cellar/logstash/5.2.1/libexec/logstash-core/lib/logstash/agent.rb:95:in `register_pipeline'", "/usr/local/Cellar/logstash/5.2.1/libexec/logstash-core/lib/logstash/runner.rb:264:in `execute'", "/usr/local/Cellar/logstash/5.2.1/libexec/vendor/bundle/jruby/1.9/gems/clamp-0.6.5/lib/clamp/command.rb:67:in `run'", "/usr/local/Cellar/logstash/5.2.1/libexec/logstash-core/lib/logstash/runner.rb:183:in `run'", "/usr/local/Cellar/logstash/5.2.1/libexec/vendor/bundle/jruby/1.9/gems/clamp-0.6.5/lib/clamp/command.rb:132:in `run'", "/usr/local/Cellar/logstash/5.2.1/libexec/lib/bootstrap/environment.rb:71:in `(root)'"]}
[2017-02-24T04:48:03,073][ERROR][logstash.agent ] fetched an invalid config {:config=>"input { \n twitter {\n # add your data\n consumer_key => \"shhhhh\"\n consumer_secret => \"Shhhhhh\"\n oauth_token => \"shhhh\"\n oauth_token_secret => \"shhhhh\"\n keywords => [\"word\"]\n full_tweet => true\n }\n}\noutput { \n elasticsearch_http {\n host => \"shhhhh.amazonaws.com\"\n index => \"idx_ls\"\n index_type => \"tweet_ls\"\n }\n}\n", :reason=>"Couldn't find any output plugin named 'elasticsearch_http'. Are you sure this is correct? Trying to load the elasticsearch_http output plugin resulted in this error: Problems loading the requested plugin named elasticsearch_http of type output. Error: NameError NameError"}
I've tried installing elasticsearch_http, but it doesn't seem to be a package. I've also tried
logstash-plugin install logstash-input-elasticsearch
and
logstash-plugin install logstash-output-elasticsearch
which did install, but I got the same error.
I'm totally new to Logstash, so this might be very simple.
I am trying to follow this: https://www.rittmanmead.com/blog/2015/08/three-easy-ways-to-stream-twitter-data-into-elasticsearch/
I tried Val's answer and got this:
[2017-02-24T05:12:45,385][WARN ][logstash.outputs.elasticsearch] Attempted to resurrect connection to dead ES instance, but got an error. {:url=>#<URI::HTTP:0x4c2332e0 URL:http://shhhhh:9200/>, :error_type=>LogStash::Outputs::ElasticSearch::HttpClient::Pool::HostUnreachableError, :error=>"Elasticsearch Unreachable: [http://sshhhhhh:9200/][Manticore::ConnectTimeout] connect timed out"}
I can go to the URL and get a response in the browser, and I have it set open on permissions, so I'm not sure what the issue with that would be.
The elasticsearch_http output is no longer alive. You need to use the elasticsearch output instead.
elasticsearch {
  hosts => "localhost:9200"
  index => "idx_ls"
  document_type => "tweet_ls"
}
Just an addition to @Val's answer. What if you have your hosts parameter without the port:
output {
  elasticsearch {
    index => "idx_ls"
    document_type => "tweet_ls"
    hosts => "localhost"
  }
}
By default ES runs on port 9200, so you don't have to set it explicitly.

Logstash not writing output to elasticsearch

The code below is my Logstash conf file. I provide my nginx access log file as input and output to Elasticsearch. I also write the output to a text file, which works fine, but the output is never written to Elasticsearch.
input {
  file {
    path => "filepath"
    start_position => "beginning"
  }
}
output {
  file {
    path => "filepath"
  }
  elasticsearch {
    host => localhost
    port => "9200"
  }
}
I also tried executing the Logstash binary from the command line using the -e option:
input { stdin { } } output { elasticsearch { host => localhost } }
which works fine; I get the output written to Elasticsearch. But in the former case I don't. Help me solve this.
I tried a few things, and I have no idea why your case with just host works; if I try it, I get timeouts. This is the configuration that works for me:
elasticsearch {
  protocol => "http"
  host => "localhost"
  port => "9200"
}
I tried this with Logstash 1.4.2 and Elasticsearch 1.4.4.
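For anyone on a newer Logstash, note that the separate protocol/host/port options above belong to the old 1.x output; in later versions they were merged into a single hosts option (shown elsewhere in these answers), so the equivalent output would be something like:

```
output {
  elasticsearch {
    hosts => ["http://localhost:9200"]
  }
}
```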

Unable to load index to elasticsearch using logstash

I'm unable to load an index to Elasticsearch using Logstash. The following are my logstash.conf settings. To me the config settings seem fine; please help if I'm missing something.
Assume that the Logstash and Elasticsearch services are running fine.
input {
  file {
    type => "IISLog"
    path => "C:/inetpub/logs/LogFiles/W3SVC1/u_ex140930.log"
    start_postition => "beginning"
  }
}
output {
  stdout { debug => true debug_format => "ruby" }
  elasticsearch_http {
    host => "localhost"
    port => 9200
    protocol => "http"
    index => "iislogs2"
  }
}
You can start by checking the following:
Check the Logstash log file for errors.
Run the following command: telnet localhost 9200 and verify you are able to connect.
Check the Elasticsearch log files for errors.
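If telnet is not available, a curl probe does the same connectivity check; a small sketch assuming the default port 9200 (adjust host and port to your setup):

```shell
# Probe Elasticsearch's HTTP port; prints one word either way.
if curl -s --max-time 5 "http://localhost:9200" > /dev/null 2>&1; then
  echo "reachable"
else
  echo "unreachable"
fi
```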
