Logstash sync mongo data to elasticsearch

I'm a newbie to Logstash and Elasticsearch. I want to sync my MongoDB data into Elasticsearch using the Logstash plugin logstash-input-mongodb.
My mongodata.conf is:
input {
  uri => 'mongodb://127.0.0.1:27017/final?ssl=true'
  placeholder_db_dir => '/opt/logstash-mongodb/'
  placeholder_db_name => 'logstash_sqlite.db'
  collection => 'twitter_stream'
  batch_size => 5000
}
filter {
}
output {
  stdout {
    codec => rubydebug
  }
  elasticsearch {
    action => "index"
    index => "twitter_stream"
    hosts => ["localhost:9200"]
  }
}
When I run bin/logstash -f /etc/logstash/conf.d/mongodata.conf --path.settings /etc/logstash/, the following error is displayed:
Sending Logstash logs to /var/log/logstash which is now configured via log4j2.properties
[2020-02-28T08:48:20,246][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified
[2020-02-28T08:48:20,331][INFO ][logstash.runner ] Starting Logstash {"logstash.version"=>"7.6.0"}
[2020-02-28T08:48:20,883][ERROR][logstash.agent ] Failed to execute action {:action=>LogStash::PipelineAction::Create/pipeline_id:main, :exception=>"LogStash::ConfigurationError", :message=>"Expected one of [ \t\r\n], \"#\", \"{\" at line 2, column 13 (byte 21) after input {\n uri ", :backtrace=>["/usr/share/logstash/logstash-core/lib/logstash/compiler.rb:47:in `compile_imperative'", "/usr/share/logstash/logstash-core/lib/logstash/compiler.rb:55:in `compile_graph'", "/usr/share/logstash/logstash-core/lib/logstash/compiler.rb:17:in `block in compile_sources'", "org/jruby/RubyArray.java:2580:in `map'", "/usr/share/logstash/logstash-core/lib/logstash/compiler.rb:14:in `compile_sources'", "org/logstash/execution/AbstractPipelineExt.java:161:in `initialize'", "org/logstash/execution/JavaBasePipelineExt.java:47:in `initialize'", "/usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:27:in `initialize'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline_action/create.rb:36:in `execute'", "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:326:in `block in converge_state'"]}
[2020-02-28T08:48:21,114][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600}
[2020-02-28T08:48:25,969][INFO ][logstash.runner ] Logstash shut down.
Please help me, I don't have any idea about this.

Your configuration is wrong: you need to specify which input plugin you are using. Change your input block to this:
input {
  mongodb {
    uri => 'mongodb://127.0.0.1:27017/final?ssl=true'
    placeholder_db_dir => '/opt/logstash-mongodb/'
    placeholder_db_name => 'logstash_sqlite.db'
    collection => 'twitter_stream'
    batch_size => 5000
  }
}
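With the mongodb block in place, the rest of the pipeline (the empty filter and the elasticsearch output) can stay as it was. You can also verify the syntax before starting the pipeline; a quick check, using the same paths as in the question:
bin/logstash -f /etc/logstash/conf.d/mongodata.conf --path.settings /etc/logstash/ --config.test_and_exit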

Related

Logstash not ingesting content into elasticsearch

I have installed elasticsearch-8.2.3, logstash-8.2.3, and kibana-8.2.3. I have configured the Logstash conf file to ingest content into Elasticsearch; Logstash runs without any error, but it is not ingesting the content.
Below is the conf file:
input {
  #stdin { type => "stdin-type" }
  file {
    path => "D:/logstash-8.2.3/inspec/*.*"
    type => "file"
    start_position => "beginning"
    sincedb_path => "NUL"
    ignore_older => 0
  }
}
filter {
  csv {
    columns => [
      "itemid","itemtitle","rlabel","ayear","rid","rsid","anotatedby","anotatetime","antype","astate","broaderlevel3","broaderlevel2","broaderlevel1","categorylabel","toppreferedlabel"
    ]
    separator => ","
    remove_field => ["type","host"]
  }
  mutate {
    split => { "antype" => ";" }
    split => { "broaderlevel3" => ";" }
    split => { "broaderlevel2" => ";" }
    split => { "broaderlevel1" => ";" }
    split => { "categorylabel" => ";" }
    split => { "toppreferedlabel" => ";" }
  }
}
output {
  stdout { }
  elasticsearch {
    hosts => ["localhost"]
    index => "iet-tv"
  }
}
I don't get any error message while running Logstash, but the content is not getting ingested into Elasticsearch.
Below is the log:
[2022-06-29T14:03:03,579][INFO ][logstash.runner ] Log4j configuration path used is: D:\logstash-8.2.3\config\log4j2.properties
[2022-06-29T14:03:03,595][WARN ][logstash.runner ] The use of JAVA_HOME has been deprecated. Logstash 8.0 and later ignores JAVA_HOME and uses the bundled JDK. Running Logstash with the bundled JDK is recommended. The bundled JDK has been verified to work with each specific version of Logstash, and generally provides best performance and reliability. If you have compelling reasons for using your own JDK (organizational-specific compliance requirements, for example), you can configure LS_JAVA_HOME to use that version instead.
[2022-06-29T14:03:03,598][INFO ][logstash.runner ] Starting Logstash {"logstash.version"=>"8.2.3", "jruby.version"=>"jruby 9.2.20.1 (2.5.8) 2021-11-30 2a2962fbd1 OpenJDK 64-Bit Server VM 11.0.15+10 on 11.0.15+10 +indy +jit [mswin32-x86_64]"}
[2022-06-29T14:03:03,600][INFO ][logstash.runner ] JVM bootstrap flags: [-Xms1g, -Xmx1g, -XX:+UseConcMarkSweepGC, -XX:CMSInitiatingOccupancyFraction=75, -XX:+UseCMSInitiatingOccupancyOnly, -Djava.awt.headless=true, -Dfile.encoding=UTF-8, -Djruby.compile.invokedynamic=true, -Djruby.jit.threshold=0, -XX:+HeapDumpOnOutOfMemoryError, -Djava.security.egd=file:/dev/urandom, -Dlog4j2.isThreadContextMapInheritable=true, -Djruby.regexp.interruptible=true, -Djdk.io.File.enableADS=true, --add-opens=java.base/java.security=ALL-UNNAMED, --add-opens=java.base/java.io=ALL-UNNAMED, --add-opens=java.base/java.nio.channels=ALL-UNNAMED, --add-opens=java.base/sun.nio.ch=ALL-UNNAMED, --add-opens=java.management/sun.management=ALL-UNNAMED]
[2022-06-29T14:03:03,736][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified
[2022-06-29T14:03:11,340][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600, :ssl_enabled=>false}
[2022-06-29T14:03:12,628][INFO ][org.reflections.Reflections] Reflections took 153 ms to scan 1 urls, producing 120 keys and 395 values
[2022-06-29T14:03:15,580][INFO ][logstash.javapipeline ] Pipeline `main` is configured with `pipeline.ecs_compatibility: v8` setting. All plugins in this pipeline will default to `ecs_compatibility => v8` unless explicitly configured otherwise.
[2022-06-29T14:03:15,662][INFO ][logstash.outputs.elasticsearch][main] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["//localhost"]}
[2022-06-29T14:03:16,210][INFO ][logstash.outputs.elasticsearch][main] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://localhost:9200/]}}
[2022-06-29T14:03:16,532][WARN ][logstash.outputs.elasticsearch][main] Restored connection to ES instance {:url=>"http://localhost:9200/"}
[2022-06-29T14:03:16,549][INFO ][logstash.outputs.elasticsearch][main] Elasticsearch version determined (8.2.3) {:es_version=>8}
[2022-06-29T14:03:16,553][WARN ][logstash.outputs.elasticsearch][main] Detected a 6.x and above cluster: the `type` event field won't be used to determine the document _type {:es_version=>8}
[2022-06-29T14:03:16,627][INFO ][logstash.outputs.elasticsearch][main] Config is not compliant with data streams. `data_stream => auto` resolved to `false`
[2022-06-29T14:03:16,627][INFO ][logstash.outputs.elasticsearch][main] Config is not compliant with data streams. `data_stream => auto` resolved to `false`
[2022-06-29T14:03:16,632][WARN ][logstash.outputs.elasticsearch][main] Elasticsearch Output configured with `ecs_compatibility => v8`, which resolved to an UNRELEASED preview of version 8.0.0 of the Elastic Common Schema. Once ECS v8 and an updated release of this plugin are publicly available, you will need to update this plugin to resolve this warning.
[2022-06-29T14:03:16,652][INFO ][logstash.filters.csv ][main] ECS compatibility is enabled but `target` option was not specified. This may cause fields to be set at the top-level of the event where they are likely to clash with the Elastic Common Schema. It is recommended to set the `target` option to avoid potential schema conflicts (if your data is ECS compliant or non-conflicting, feel free to ignore this message)
[2022-06-29T14:03:16,694][INFO ][logstash.outputs.elasticsearch][main] Using a default mapping template {:es_version=>8, :ecs_compatibility=>:v8}
[2022-06-29T14:03:16,762][INFO ][logstash.javapipeline ][main] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>4, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50, "pipeline.max_inflight"=>500, "pipeline.sources"=>["D:/logstash-8.2.3/conf/inspec.conf"], :thread=>"#<Thread:0x48e38277 run>"}
[2022-06-29T14:03:18,017][INFO ][logstash.javapipeline ][main] Pipeline Java execution initialization time {"seconds"=>1.25}
[2022-06-29T14:03:18,102][INFO ][logstash.javapipeline ][main] Pipeline started {"pipeline.id"=>"main"}
[2022-06-29T14:03:18,171][INFO ][filewatch.observingtail ][main][2c845ee5978dc5ed1bf8d0f617965d2013df9d31461210f0e7c2b799e02f6bb8] START, creating Discoverer, Watch with file and sincedb collections
[2022-06-29T14:03:18,220][INFO ][logstash.agent ] Pipelines running {:count=>1, :running_pipelines=>[:main], :non_running_pipelines=>[]}
Any suggestions much appreciated.
Thanks
Dharmendra Kumar Singh
In Filebeat, ignore_older => 0 turns off age-based filtering. In a Logstash file input it tells the input to ignore any file more than zero seconds old, and since the file input sleeps between its periodic polls for new files, that can mean it ignores all files, even ones that are being updated. Remove the option, or set it to a generous value, as in the sketch below.
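A minimal corrected input, assuming the same path as in the question; ignore_older is simply omitted (or, if you do want age-based filtering, set it to a generous number of seconds):
input {
  file {
    path => "D:/logstash-8.2.3/inspec/*.*"
    start_position => "beginning"
    sincedb_path => "NUL"
    # ignore_older => 86400  # optional: skip files older than one day
  }
}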
In my case (Windows 10, Logstash 8.1.0), a file path with backslashes (C:\path\to\csv\etc.CSV) caused the same issue; changing the backslashes to forward slashes fixed the problem.
Here is a working Logstash config:
input {
  file {
    path => "C:/path/to/csv/file.csv"
    type => "file"
    start_position => "beginning"
    sincedb_path => "NUL"
  }
}
filter {
  csv {
    columns => [
      "WID","LID","IID","Product","QTY","TID"
    ]
    separator => ","
  }
  mutate {
    rename => {
      "WID" => "w_id"
      "LID" => "l_id"
      "IID" => "i_id"
      "Product" => "product"
      "QTY" => "quantity"
    }
    convert => {
      "w_id" => "integer"
      "l_id" => "integer"
      "i_id" => "integer"
      "quantity" => "float"
    }
    remove_field => [
      "@timestamp",
      "@version",
      "host",
      "message",
      "type",
      "path",
      "event",
      "log",
      "TID"
    ]
  }
}
output {
  elasticsearch {
    action => "index"
    hosts => ["https://127.0.0.1:9200"]
    index => "product_inline"
  }
  stdout { }
}

Logstash pipeline is failing when adding filter block in it

I am creating a Logstash pipeline where I give a log file as input and read those logs in Elasticsearch. I want to add a geoip filter to my Logstash pipeline configuration, but when I add it, the pipeline fails and shuts down.
Here are the errors:
[2022-03-17T12:41:05,243][WARN ][logstash.outputs.elasticsearch][main] Elasticsearch Output configured with `ecs_compatibility => v8`, which resolved to an UNRELEASED preview of version 8.0.0 of the Elastic Common Schema. Once ECS v8 and an updated release of this plugin are publicly available, you will need to update this plugin to resolve this warning.
[2022-03-17T12:41:05,293][ERROR][logstash.javapipeline ][main] Pipeline error {:pipeline_id=>"main", :exception=>#<LogStash::ConfigurationError: GeoIP Filter in ECS-Compatiblity mode requires a `target` when `source` is not an `ip` sub-field, eg. [client][ip]>, :backtrace=>["D:/logstash-8.1.0/vendor/bundle/jruby/2.5.0/gems/logstash-filter-geoip-7.2.11-java/lib/logstash/filters/geoip.rb:143:in `auto_target_from_source!'", "D:/logstash-8.1.0/vendor/bundle/jruby/2.5.0/gems/logstash-filter-geoip-7.2.11-java/lib/logstash/filters/geoip.rb:133:in `setup_target_field'", "D:/logstash-8.1.0/vendor/bundle/jruby/2.5.0/gems/logstash-filter-geoip-7.2.11-java/lib/logstash/filters/geoip.rb:108:in `register'", "org/logstash/config/ir/compiler/AbstractFilterDelegatorExt.java:75:in `register'", "D:/logstash-8.1.0/logstash-core/lib/logstash/java_pipeline.rb:232:in `block in register_plugins'", "org/jruby/RubyArray.java:1821:in `each'", "D:/logstash-8.1.0/logstash-core/lib/logstash/java_pipeline.rb:231:in `register_plugins'", "D:/logstash-8.1.0/logstash-core/lib/logstash/java_pipeline.rb:590:in `maybe_setup_out_plugins'", "D:/logstash-8.1.0/logstash-core/lib/logstash/java_pipeline.rb:244:in `start_workers'", "D:/logstash-8.1.0/logstash-core/lib/logstash/java_pipeline.rb:189:in `run'", "D:/logstash-8.1.0/logstash-core/lib/logstash/java_pipeline.rb:141:in `block in start'"], "pipeline.sources"=>["D:/logstash-8.1.0/my-logstash.conf"], :thread=>"#<Thread:0x6ea94258 run>"}
[2022-03-17T12:41:05,314][INFO ][logstash.javapipeline ][main] Pipeline terminated {"pipeline.id"=>"main"}
[2022-03-17T12:41:05,357][INFO ][logstash.outputs.elasticsearch][main] Using a default mapping template {:es_version=>8, :ecs_compatibility=>:v8}
[2022-03-17T12:41:05,390][ERROR][logstash.agent ] Failed to execute action {:id=>:main, :action_type=>LogStash::ConvergeResult::FailedAction, :message=>"Could not execute action: PipelineAction::Create<main>, action_result: false", :backtrace=>nil}
[2022-03-17T12:41:05,499][DEBUG][logstash.instrument.periodicpoller.os] Stopping
[2022-03-17T12:41:05,523][DEBUG][logstash.instrument.periodicpoller.jvm] Stopping
[2022-03-17T12:41:05,525][DEBUG][logstash.instrument.periodicpoller.persistentqueue] Stopping
[2022-03-17T12:41:05,532][DEBUG][logstash.instrument.periodicpoller.deadletterqueue] Stopping
[2022-03-17T12:41:05,556][DEBUG][logstash.agent ] Shutting down all pipelines {:pipelines_count=>0}
When I use the below configuration without the filter, it works fine:
input {
  file {
    path => "D:/nest/es-logging-example/log/info/*.log"
    start_position => beginning
    sincedb_path => "NULL"
  }
}
output {
  elasticsearch {
    hosts => "localhost:9200"
    index => "myapplogs"
  }
  stdout {}
}
But on adding the filter to the configuration file, it fails and shuts down:
input {
  file {
    path => "D:/nest/es-logging-example/log/info/*.log"
    start_position => beginning
    sincedb_path => "NULL"
  }
}
filter {
  geoip {
    source => "clientip"
  }
}
output {
  elasticsearch {
    hosts => "localhost:9200"
    index => "myapplogs"
  }
  stdout {}
}
What am I doing wrong in the 2nd configuration?
The error states exactly this:
GeoIP Filter in ECS-Compatiblity mode requires a `target` when `source` is not an `ip` sub-field. You're simply missing an explicit target field.
So your filter should look like this:
filter {
  geoip {
    source => "clientip"
    target => "clientgeo"
  }
}
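Alternatively, the error message itself hints at another route: if the source is an ip sub-field such as [client][ip], geoip can derive its own target. A sketch based on that hint (not part of the original answer):
filter {
  mutate {
    rename => { "clientip" => "[client][ip]" }
  }
  geoip {
    source => "[client][ip]"
  }
}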

Error while parsing csv to kafka in logstash

I am trying to send CSV data to Kafka using Logstash, with my own configuration script named test.conf.
I got this error while parsing:
Using JAVA_HOME defined java: /usr/lib/jvm/java-1.8.0-openjdk-1.8.0.262.b10-0.el7_8.x86_64
WARNING, using JAVA_HOME while Logstash distribution comes with a bundled JDK
WARNING: Could not find logstash.yml which is typically located in $LS_HOME/config or /etc/logstash. You can specify the path using --path.settings. Continuing using the defaults
Could not find log4j2 configuration at path /usr/share/logstash/config/log4j2.properties. Using default config which logs errors to the console
[INFO ] 2021-05-24 19:12:08.565 [main] runner - Starting Logstash {"logstash.version"=>"7.10.0", "jruby.version"=>"jruby 9.2.13.0 (2.5.7) 2020-08-03 9a89c94bcc OpenJDK 64-Bit Server VM 25.262-b10 on 1.8.0_262-b10 +indy +jit [linux-x86_64]"}
[FATAL] 2021-05-24 19:12:08.616 [main] runner - An unexpected error occurred! {:error=>#<ArgumentError: Path "/usr/share/logstash/data" must be a writable directory. It is not writable.>, :backtrace=>["/usr/share/logstash/logstash-core/lib/logstash/settings.rb:530:in `validate'", "/usr/share/logstash/logstash-core/lib/logstash/settings.rb:290:in `validate_value'", "/usr/share/logstash/logstash-core/lib/logstash/settings.rb:201:in `block in validate_all'", "org/jruby/RubyHash.java:1415:in `each'", "/usr/share/logstash/logstash-core/lib/logstash/settings.rb:200:in `validate_all'", "/usr/share/logstash/logstash-core/lib/logstash/runner.rb:317:in `execute'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/clamp-0.6.5/lib/clamp/command.rb:67:in `run'", "/usr/share/logstash/logstash-core/lib/logstash/runner.rb:273:in `run'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/clamp-0.6.5/lib/clamp/command.rb:132:in `run'", "/usr/share/logstash/lib/bootstrap/environment.rb:88:in `<main>'"]}
[ERROR] 2021-05-24 19:12:08.623 [main] Logstash - java.lang.IllegalStateException: Logstash stopped processing because of an error: (SystemExit) exit
This is the command used to run logstash.
/usr/share/logstash/bin/logstash -f test.conf
Here is the config file.
input {
  file {
    path => "/home/data/*.csv"
    start_position => "beginning"
    sincedb_path => "/dev/null"
  }
}
filter {
  mutate {
    add_field => {
      "timestamp" => "%{Date} %{Time}"
    }
  }
  date { match => ["timestamp", "dd-MM-YYYY HH:mm:ss"] }
  csv {
    remove_field => ["Date", "Time"]
  }
  grok {
    match => { "message" => [
      "^%{DATE:timestamp},%{NUMBER:ab},%{NUMBER:cd},%{NUMBER:ef},%{NUMBER:gh},%{NUMBER:ij},%{NUMBER:kl},%{NUMBER:mn},%{NUMBER:op},%{NUMBER:qr},%{NUMBER:st},%{NUMBER:uv},%{NUMBER:wx},%{NUMBER:yz}$"
    ] }
  }
}
output {
  stdout { codec => rubydebug }
  if "_grokparsefailure" not in [tags] {
    kafka {
      codec => "json"
      topic_id => "abcd1234"
      bootstrap_servers => "192.16.12.119:9092"
    }
  }
}
Please help me with this.
First of all, make sure your server IP and the given port (192.16.12.119) are reachable, e.g. with telnet 192.16.12.119 9092.
After that: you forgot one field in the Kafka output section. Add the group_id field to your Kafka output, such as:
kafka {
  group_id => "35834"
  topics => ["your topic name"]
  bootstrap_servers => "192.16.12.199:9092"
  codec => json
}
If it doesn't work, then change your bootstrap servers to the "advertised" form, like below:
bootstrap_servers => "advertised.listeners=PLAINTEXT://192.16.12.199:9092"
In the end, do not mention your server or system IP on the internet; it's not safe ;)
Your config is not even being loaded; you have a FATAL error when starting Logstash.
[FATAL] 2021-05-24 19:12:08.616 [main] runner - An unexpected error occurred! {:error=>#<ArgumentError: Path "/usr/share/logstash/data" must be a writable directory. It is not writable.>,
The user you are running Logstash as does not have permission to write to this directory; it needs write permission on the path.data directory or Logstash won't start.
Your logstash.yml file is also not being loaded.
WARNING: Could not find logstash.yml which is typically located in $LS_HOME/config or /etc/logstash. You can specify the path using --path.settings. Continuing using the defaults
First, give the user running Logstash permission to write into path.data (see the permissions sketch at the end of this answer). You can also change path.data in the logstash.yml file, then pass the path to that file on the command line.
Considering that you installed logstash using a package manager like yum or apt, your logstash.yml file will be in the directory /etc/logstash/.
So you need to run logstash this way:
/usr/share/logstash/bin/logstash -f /path/to/your/config.conf --path.settings /etc/logstash/.
In the logstash.yml you need to set path.data to a directory where the user has permissions to write.
path.data: /path/to/writable/directory
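For example, a permissions sketch, assuming Logstash was installed from a package and runs as the logstash user (adjust the user and directory to your setup):
# give the logstash user ownership of the default data directory
sudo chown -R logstash:logstash /usr/share/logstash/data
# alternatively, point path.data (in logstash.yml) at a directory the user already owns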

Logstash error: parsing xml file

I am new to ELK and I need your help.
I would like to get some information about the CPU and memory. This information is generated every 30 minutes.
My XML file:
<?xml version="1.0" encoding="UTF-8"?>
<measData>
<measInfo Id="SensorProcessingCounters">
<measType p="1">SensorsProcessed</measType>
<measValue xxxxxxxxx >
<r p="1">81</r>
</measValue>
</measInfo>
</measData>
My Logstash file.conf:
input {
  file {
    path => "/home/test/Desktop/data/file.xml"
    start_position => beginning
    sincedb_path => "/dev/null"
    codec => multiline {
      pattern => "<measData>|</measData>"
      negate => true
      what => "previous"
    }
  }
}
filter {
  xml {
    store_xml => false
    source => "message"
    xpath =>
      ["//measInfo[@measInfoId="SensorProcessingCounters"]/measValue/r[@p='1']/text()", "SensorProcessingCounters"
      ]
  }
  mutate {
    convert => {
      "SensorProcessingCounters" => "float"
    }
  }
}
output {
  elasticsearch {
    action => "index"
    hosts => ["localhost:9200"]
    index => "stock"
  }
  stdout {}
}
Error message:
[2018-07-12T11:16:19,253][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified
[2018-07-12T11:16:19,973][INFO ][logstash.runner ] Starting Logstash {"logstash.version"=>"6.3.1"}
[2018-07-12T11:16:20,649][ERROR][logstash.agent ] Failed to execute action {:action=>LogStash::PipelineAction::Create/pipeline_id:main, :exception=>"LogStash::ConfigurationError", :message=>"Expected one of #, {, ,, ] at line 20, column 27 (byte 432) after filter\r\n{\r\nxml {\r\nstore_xml => false\r\nsource => \"message\"\r\nxpath =>\r\n[\"//measInfo[@measInfoId=\"", :backtrace=>["/home/test/Desktop/logstash-6.3.1/logstash-core/lib/logstash/compiler.rb:42:in `compile_imperative'", "/home/test/Desktop/logstash-6.3.1/logstash-core/lib/logstash/compiler.rb:50:in `compile_graph'", "/home/test/Desktop/logstash-6.3.1/logstash-core/lib/logstash/compiler.rb:12:in `block in compile_sources'", "org/jruby/RubyArray.java:2486:in `map'", "/home/test/Desktop/logstash-6.3.1/logstash-core/lib/logstash/compiler.rb:11:in `compile_sources'", "/home/test/Desktop/logstash-6.3.1/logstash-core/lib/logstash/pipeline.rb:49:in `initialize'", "/home/test/Desktop/logstash-6.3.1/logstash-core/lib/logstash/pipeline.rb:167:in `initialize'", "/home/test/Desktop/logstash-6.3.1/logstash-core/lib/logstash/pipeline_action/create.rb:40:in `execute'", "/home/test/Desktop/logstash-6.3.1/logstash-core/lib/logstash/agent.rb:305:in `block in converge_state'"]}
[2018-07-12T11:16:21,024][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600}
Thank you
For this line:
["//measInfo[#measInfoId="SensorProcessingCounters"]/measValue/r[#p='1']/text()",
"SensorProcessingCounters"
I guess you should use single quotes:
["//measInfo[#measInfoId='SensorProcessingCounters']/measValue/r[#p='1']/text()",
"SensorProcessingCounters"
because quotes mismatch.
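Putting it together, the corrected filter block would look like this (a sketch; only the quoting is changed from the question's filter):
filter {
  xml {
    store_xml => false
    source => "message"
    xpath => [
      "//measInfo[@measInfoId='SensorProcessingCounters']/measValue/r[@p='1']/text()",
      "SensorProcessingCounters"
    ]
  }
  mutate {
    convert => { "SensorProcessingCounters" => "float" }
  }
}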

Logstash cannot connect to Elasticsearch

{:timestamp=>"2017-07-19T15:56:36.517000+0530", :message=>"Attempted to send a bulk request to Elasticsearch configured at '[\"http://localhost:9200\"]', but Elasticsearch appears to be unreachable or down!", :error_message=>"Connection refused (Connection refused)", :class=>"Manticore::SocketException", :level=>:error}
{:timestamp=>"2017-07-19T15:56:37.761000+0530", :message=>"Connection refused (Connection refused)", :class=>"Manticore::SocketException", :backtrace=>["/opt/logstash/vendor/bundle/jruby/1.9/gems/manticore-0.6.0-java/lib/manticore/response.rb:37:in `initialize'", "org/jruby/RubyProc.java:281:in `call'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/manticore-0.6.0-java/lib/manticore/response.rb:79:in `call'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/manticore-0.6.0-java/lib/manticore/response.rb:256:in `call_once'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/manticore-0.6.0-java/lib/manticore/response.rb:153:in `code'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/elasticsearch-transport-1.0.18/lib/elasticsearch/transport/transport/http/manticore.rb:84:in `perform_request'", "org/jruby/RubyProc.java:281:in `call'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/elasticsearch-transport-1.0.18/lib/elasticsearch/transport/transport/base.rb:257:in `perform_request'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/elasticsearch-transport-1.0.18/lib/elasticsearch/transport/transport/http/manticore.rb:67:in `perform_request'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/elasticsearch-transport-1.0.18/lib/elasticsearch/transport/transport/sniffer.rb:32:in `hosts'", "org/jruby/ext/timeout/Timeout.java:147:in `timeout'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/elasticsearch-transport-1.0.18/lib/elasticsearch/transport/transport/sniffer.rb:31:in `hosts'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/elasticsearch-transport-1.0.18/lib/elasticsearch/transport/transport/base.rb:79:in `reload_connections!'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-output-elasticsearch-2.7.1-java/lib/logstash/outputs/elasticsearch/http_client.rb:72:in `sniff!'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-output-elasticsearch-2.7.1-java/lib/logstash/outputs/elasticsearch/http_client.rb:60:in `start_sniffing!'", "org/jruby/ext/thread/Mutex.java:149:in `synchronize'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-output-elasticsearch-2.7.1-java/lib/logstash/outputs/elasticsearch/http_client.rb:60:in `start_sniffing!'", "org/jruby/RubyKernel.java:1479:in `loop'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-output-elasticsearch-2.7.1-java/lib/logstash/outputs/elasticsearch/http_client.rb:59:in `start_sniffing!'"], :level=>:error}
{:timestamp=>"2017-07-19T15:56:38.520000+0530", :message=>"Attempted to send a bulk request to Elasticsearch configured at '[\"http://localhost:9200\"]', but Elasticsearch appears to be unreachable or down!", :error_message=>"Connection refused (Connection refused)", :class=>"Manticore::SocketException", :level=>:error}
Elasticsearch is in fact running on 127.0.0.1:9200.
I do not understand where Logstash is taking this configuration from; I have not configured Logstash to connect to Elasticsearch on localhost.
In logstash.service I have:
ExecStart=/usr/share/logstash/bin/logstash "--path.settings" "/etc/logstash"
and in /etc/logstash I have logstash.yml with:
path.config: /etc/logstash/conf.d
In /etc/logstash/conf.d I have:
output {
  elasticsearch {
    hosts => ["10.2.0.10:9200"]
    manage_template => false
    index => "%{[@metadata][beat]}-%{+YYYY.MM.dd}"
    document_type => "%{[@metadata][type]}"
  }
}
conf.d is your config directory, right? You need a file there, something like myconf.conf, in the following format:
input {
}
filter {
  # can be empty
}
output {
}
Once you apply all your changes, you need to restart your Logstash service for them to take effect. You can also have Logstash reload itself whenever a file under conf.d changes, by enabling that in your logstash.yml settings file, as sketched below.
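A minimal logstash.yml sketch for automatic reload, assuming Logstash 5.x or later (config.reload.automatic and config.reload.interval are standard settings; the interval is illustrative):
path.config: /etc/logstash/conf.d
config.reload.automatic: true
config.reload.interval: 3s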
You can also break up your conf files, e.g. 1_input.conf, 2_filter.conf, and 99_output.conf, so that each contains its own section, i.e. input, filter, and output.
Start Elasticsearch.
Write a conf file for Logstash to connect and upload data into Elasticsearch.
input {
  file {
    type => "csv"
    path => "path for csv."
    start_position => "beginning"
  }
}
filter {
  csv {
    columns => ["Column1","Column2"]
    separator => ","
  }
  mutate {
    convert => {"Column1" => "float"}
  }
}
output {
  elasticsearch {
    hosts => "http://localhost:9200"
  }
  stdout { codec => rubydebug }
}
The host for Elasticsearch can be configured in the elasticsearch.yml file; a minimal sketch below.
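A sketch of the relevant elasticsearch.yml settings, assuming a local single-node setup (network.host and http.port are standard Elasticsearch settings; the values here are illustrative defaults):
network.host: 127.0.0.1
http.port: 9200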
Run the conf file (logstash.bat -f abc.conf).
