Logstash 5.1.1 config file execution error? - logstash-configuration

This config is used to feed the slowlog of Elasticsearch 5.1.1 into Logstash 5.1.1 as an input:
input {
  file {
    path => "C:\Users\571952\Downloads\elasticsearch-5.1.1\elasticsearch-5.1.1\logs\elasticsearch_index_search_slowlog"
    start_position => "beginning"
  }
}
filter {
  grok { # parses the common bits
    match => [ "message", "[%{URIHOST}:%{ISO8601_SECOND}][%{LOGLEVEL:log_level}][%{DATA:es_slowquery_type}]\s*[%{DATA:es_host}]\s*[%{DATA:es_index}]\s*[%{DATA:es_shard}]\s*took[%{DATA:es_duration}],\s*took_millis[%{DATA:es_duration_ms:float}],\s*types[%{DATA:es_types}],\s*stats[%{DATA:es_stats}],\s*search_type[%{DATA:es_search_type}],\s*total_shards[%{DATA:es_total_shards:float}],\s*source[%{GREEDYDATA:es_source}],\s*extra_source[%{GREEDYDATA:es_extra_source}]" ]
  }
  mutate {
    gsub => [
      "source_body", "], extra_source[$", ""
    ]
  }
}
output {
  file {
    path => "C:\Users\571952\Desktop\logstash-5.1.1\just_queries"
    codec => "json_lines"
  }
}
When I run this config, the command prompt shows an error like this:
[2017-01-04T18:30:32,032][ERROR][logstash.agent ] Pipeline aborted due to error {:exception=>#<RegexpError: premature end of char-class: /], extra_source[$/>, :backtrace=>["org/jruby/RubyRegexp.java:1424:in `initialize'", "C:/Users/571952/Desktop/logstash-5.1.1/vendor/bundle/jruby/1.9/gems/logstash-filter-mutate-3.1.3/lib/logstash/filters/mutate.rb:196:in `register'", "org/jruby/RubyArray.java:1653:in `each_slice'", "C:/Users/571952/Desktop/logstash-5.1.1/vendor/bundle/jruby/1.9/gems/logstash-filter-mutate-3.1.3/lib/logstash/filters/mutate.rb:184:in `register'", "C:/Users/571952/Desktop/logstash-5.1.1/logstash-core/lib/logstash/pipeline.rb:230:in `start_workers'", "org/jruby/RubyArray.java:1613:in `each'", "C:/Users/571952/Desktop/logstash-5.1.1/logstash-core/lib/logstash/pipeline.rb:230:in `start_workers'", "C:/Users/571952/Desktop/logstash-5.1.1/logstash-core/lib/logstash/pipeline.rb:183:in `run'", "C:/Users/571952/Desktop/logstash-5.1.1/logstash-core/lib/logstash/agent.rb:292:in `start_pipeline'"]}
[2017-01-04T18:30:32,141][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600}
[2017-01-04T18:30:35,036][WARN ][logstash.agent ] stopping pipeline {:id=>"main"}
Can anyone help me in solving this problem?
This is the content of my slowlog:
[2016-12-28T15:53:21,341][DEBUG][index.search.slowlog.query] [vVhZxH7] [sw][0] took[184.7micros], took_millis[0], types[], stats[], search_type[QUERY_THEN_FETCH], total_shards[5], source[{
"ext" : { }
}],
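The stack trace points at the mutate filter, not the grok: gsub patterns are regular expressions, and the unescaped `[` in `], extra_source[$` opens a character class that never closes, which is exactly what "premature end of char-class" means. A minimal sketch of the fix, escaping the brackets (and assuming the intended target is the `es_source` field captured by the grok pattern, since no `source_body` field is created anywhere in this config):

```
mutate {
  gsub => [
    # \[ and \] match literal brackets instead of opening a character class
    "es_source", "\], extra_source\[$", ""
  ]
}
```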

Related

Logstash pipeline is failing when adding filter block in it

I am creating a Logstash pipeline where I give a log file as input and read those logs into Elasticsearch. I want to add a geoip filter to my Logstash pipeline configuration, but when I add it, the pipeline fails and shuts down.
Here is the error:
[2022-03-17T12:41:05,243][WARN ][logstash.outputs.elasticsearch][main] Elasticsearch Output configured with `ecs_compatibility => v8`, which resolved to an UNRELEASED preview of version 8.0.0 of the Elastic Common Schema. Once ECS v8 and an updated release of this plugin are publicly available, you will need to update this plugin to resolve this warning.
[2022-03-17T12:41:05,293][ERROR][logstash.javapipeline ][main] Pipeline error {:pipeline_id=>"main", :exception=>#<LogStash::ConfigurationError: GeoIP Filter in ECS-Compatiblity mode requires a `target` when `source` is not an `ip` sub-field, eg. [client][ip]>, :backtrace=>["D:/logstash-8.1.0/vendor/bundle/jruby/2.5.0/gems/logstash-filter-geoip-7.2.11-java/lib/logstash/filters/geoip.rb:143:in `auto_target_from_source!'", "D:/logstash-8.1.0/vendor/bundle/jruby/2.5.0/gems/logstash-filter-geoip-7.2.11-java/lib/logstash/filters/geoip.rb:133:in `setup_target_field'", "D:/logstash-8.1.0/vendor/bundle/jruby/2.5.0/gems/logstash-filter-geoip-7.2.11-java/lib/logstash/filters/geoip.rb:108:in `register'", "org/logstash/config/ir/compiler/AbstractFilterDelegatorExt.java:75:in `register'", "D:/logstash-8.1.0/logstash-core/lib/logstash/java_pipeline.rb:232:in `block in register_plugins'", "org/jruby/RubyArray.java:1821:in `each'", "D:/logstash-8.1.0/logstash-core/lib/logstash/java_pipeline.rb:231:in `register_plugins'", "D:/logstash-8.1.0/logstash-core/lib/logstash/java_pipeline.rb:590:in `maybe_setup_out_plugins'", "D:/logstash-8.1.0/logstash-core/lib/logstash/java_pipeline.rb:244:in `start_workers'", "D:/logstash-8.1.0/logstash-core/lib/logstash/java_pipeline.rb:189:in `run'", "D:/logstash-8.1.0/logstash-core/lib/logstash/java_pipeline.rb:141:in `block in start'"], "pipeline.sources"=>["D:/logstash-8.1.0/my-logstash.conf"], :thread=>"#<Thread:0x6ea94258 run>"}
[2022-03-17T12:41:05,314][INFO ][logstash.javapipeline ][main] Pipeline terminated {"pipeline.id"=>"main"}
[2022-03-17T12:41:05,357][INFO ][logstash.outputs.elasticsearch][main] Using a default mapping template {:es_version=>8, :ecs_compatibility=>:v8}
[2022-03-17T12:41:05,390][ERROR][logstash.agent ] Failed to execute action {:id=>:main, :action_type=>LogStash::ConvergeResult::FailedAction, :message=>"Could not execute action: PipelineAction::Create<main>, action_result: false", :backtrace=>nil}
[2022-03-17T12:41:05,499][DEBUG][logstash.instrument.periodicpoller.os] Stopping
[2022-03-17T12:41:05,523][DEBUG][logstash.instrument.periodicpoller.jvm] Stopping
[2022-03-17T12:41:05,525][DEBUG][logstash.instrument.periodicpoller.persistentqueue] Stopping
[2022-03-17T12:41:05,532][DEBUG][logstash.instrument.periodicpoller.deadletterqueue] Stopping
[2022-03-17T12:41:05,556][DEBUG][logstash.agent ] Shutting down all pipelines {:pipelines_count=>0}
When I use the configuration below, without a filter, it works fine:
input {
  file {
    path => "D:/nest/es-logging-example/log/info/*.log"
    start_position => beginning
    sincedb_path => "NULL"
  }
}
output {
  elasticsearch {
    hosts => "localhost:9200"
    index => "myapplogs"
  }
  stdout {}
}
But after adding a filter to the configuration file, it fails and shuts down:
input {
  file {
    path => "D:/nest/es-logging-example/log/info/*.log"
    start_position => beginning
    sincedb_path => "NULL"
  }
}
filter {
  geoip {
    source => "clientip"
  }
}
output {
  elasticsearch {
    hosts => "localhost:9200"
    index => "myapplogs"
  }
  stdout {}
}
What I am doing wrong in 2nd configuration?
What the error states is this:
"GeoIP Filter in ECS-Compatiblity mode requires a `target` when `source` is not an `ip` sub-field"
You're simply missing an explicit target field. So your filter should look like this:
filter {
  geoip {
    source => "clientip"
    target => "clientgeo"
  }
}
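An alternative sketch, if you would rather keep ECS-style field names than invent a custom target: rename the field to an `ip` sub-field first, so the filter can derive the target on its own (the `[client][ip]` name here is just the example the error message itself suggests, not something from the original question):

```
filter {
  mutate {
    rename => { "clientip" => "[client][ip]" }
  }
  geoip {
    # source ends in [ip], so ECS-compatibility mode derives the target automatically
    source => "[client][ip]"
  }
}
```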

Logstash sync mongo data to elasticsearch

I'm a newbie to Logstash and Elasticsearch. I want to sync my MongoDB data into Elasticsearch using the Logstash plugin logstash-input-mongodb.
My mongodata.conf is:
input {
  uri => 'mongodb://127.0.0.1:27017/final?ssl=true'
  placeholder_db_dir => '/opt/logstash-mongodb/'
  placeholder_db_name => 'logstash_sqlite.db'
  collection => 'twitter_stream'
  batch_size => 5000
}
filter {
}
output {
  stdout {
    codec => rubydebug
  }
  elasticsearch {
    action => "index"
    index => "twitter_stream"
    hosts => ["localhost:9200"]
  }
}
When I run bin/logstash -f /etc/logstash/conf.d/mongodata.conf --path.settings /etc/logstash/,
the following error is displayed:
Sending Logstash logs to /var/log/logstash which is now configured via log4j2.properties
[2020-02-28T08:48:20,246][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified
[2020-02-28T08:48:20,331][INFO ][logstash.runner ] Starting Logstash {"logstash.version"=>"7.6.0"}
[2020-02-28T08:48:20,883][ERROR][logstash.agent ] Failed to execute action {:action=>LogStash::PipelineAction::Create/pipeline_id:main, :exception=>"LogStash::ConfigurationError", :message=>"Expected one of [ \t\r\n], \"#\", \"{\" at line 2, column 13 (byte 21) after input {\n uri ", :backtrace=>["/usr/share/logstash/logstash-core/lib/logstash/compiler.rb:47:in `compile_imperative'", "/usr/share/logstash/logstash-core/lib/logstash/compiler.rb:55:in `compile_graph'", "/usr/share/logstash/logstash-core/lib/logstash/compiler.rb:17:in `block in compile_sources'", "org/jruby/RubyArray.java:2580:in `map'", "/usr/share/logstash/logstash-core/lib/logstash/compiler.rb:14:in `compile_sources'", "org/logstash/execution/AbstractPipelineExt.java:161:in `initialize'", "org/logstash/execution/JavaBasePipelineExt.java:47:in `initialize'", "/usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:27:in `initialize'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline_action/create.rb:36:in `execute'", "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:326:in `block in converge_state'"]}
[2020-02-28T08:48:21,114][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600}
[2020-02-28T08:48:25,969][INFO ][logstash.runner ] Logstash shut down.
Please help me, I don't have any idea about this.
Your configuration is wrong; you need to specify what type of input you are using.
Try changing your input to this one:
input {
  mongodb {
    uri => 'mongodb://127.0.0.1:27017/final?ssl=true'
    placeholder_db_dir => '/opt/logstash-mongodb/'
    placeholder_db_name => 'logstash_sqlite.db'
    collection => 'twitter_stream'
    batch_size => 5000
  }
}
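One caveat worth adding: logstash-input-mongodb is a community plugin and is not bundled with Logstash, so if the `mongodb` input is still unrecognized after this change, it likely needs to be installed first (run from the Logstash home directory):

```
bin/logstash-plugin install logstash-input-mongodb
```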

Google Protobuf undefined method `msgclass' for nil:NilClass

I use Google protobuf in the Logstash input; Logstash fails at startup when running:
./bin/logstash -f logstash.conf -r
The error is:
[ERROR][logstash.agent ] Failed to execute action {:action=>LogStash::PipelineAction::Create/pipeline_id:main, :exception=>"NoMethodError", :message=>"undefined method `msgclass' for nil:NilClass", :backtrace=>["/home/logstash/vendor/bundle/jruby/2.3.0/gems/logstash-codec-protobuf-1.1.0/lib/logstash/codecs/protobuf.rb:101:in `register'", "/home/logstash/logstash-core/lib/logstash/codecs/base.rb:20:in `initialize'", "/home/logstash/logstash-core/lib/logstash/plugins/plugin_factory.rb:97:in `plugin'", "/home/logstash/logstash-core/lib/logstash/pipeline.rb:110:in `plugin'", "(eval):8:in '", "org/jruby/RubyKernel.java:994:in `eval'", "/home/logstash/logstash-core/lib/logstash/pipeline.rb:82:in `initialize'", "/home/logstash/logstash-core/lib/logstash/pipeline.rb:167:in `initialize'", "/home/logstash/logstash-core/lib/logstash/pipeline_action/create.rb:40:in `execute'", "/home/logstash/logstash-core/lib/logstash/agent.rb:305:in `block in converge_state'"]}
logstash.conf is setting:
input {
beats {
port => 5044
ssl => false
codec => protobuf {
class_name => ["Elk.ElkData"]
include_path => ["/home/logstash/test_code/elk.pb.rb"]
protobuf_version => 3
}
type => "protobuf"
}
}
output {
stdout { codec => rubydebug }
}
The `register` method in logstash/vendor/bundle/jruby/2.3.0/gems/logstash-codec-protobuf-1.1.0/lib/logstash/codecs/protobuf.rb is:
def register
  @metainfo_messageclasses = {}
  @metainfo_enumclasses = {}
  @metainfo_pb2_enumlist = []
  include_path.each { |path| load_protobuf_definition(path) }
  if @protobuf_version == 3
    @pb_builder = Google::Protobuf::DescriptorPool.generated_pool.lookup(class_name).msgclass
  else
    @pb_builder = pb2_create_instance(class_name)
  end
end
Google::Protobuf::DescriptorPool.generated_pool.lookup(class_name).msgclass
Logstash version is 6.3.0, protoc version is 3.6.1, and ruby-protoc version is 1.6.1.
The related Elastic community thread is here:
https://discuss.elastic.co/t/logstash-uses-protobuf-running-error-nomethoderror-message-undefined-method-msgclass-for-nil-nilclass/144806?u=sun_changlong
Is it my environment or the protobuf version? It works in a protobuf 2 environment. Suggestions are welcome.
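No answer is recorded here, but the trace itself narrows it down: `msgclass` is called on the result of `Google::Protobuf::DescriptorPool.generated_pool.lookup(class_name)`, so the lookup returned nil, meaning no message is registered in the descriptor pool under that name. One thing worth checking (an assumption on my part, not something the question confirms): `class_name` should be a plain string that exactly matches the fully qualified, case-sensitive message name registered in the generated elk.pb.rb, rather than an array:

```
input {
  beats {
    port => 5044
    ssl => false
    codec => protobuf {
      # assumption: class_name as a string, matching the message name
      # registered in elk.pb.rb exactly (case-sensitive)
      class_name => "Elk.ElkData"
      include_path => ["/home/logstash/test_code/elk.pb.rb"]
      protobuf_version => 3
    }
    type => "protobuf"
  }
}
```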

Logstash error: parsing xml file

I am new to ELK and I need your help.
I would like to get some information about the CPU and memory. This information is generated every 30 minutes.
My XML file:
<?xml version="1.0" encoding="UTF-8"?>
<measData>
<measInfo Id="SensorProcessingCounters">
<measType p="1">SensorsProcessed</measType>
<measValue xxxxxxxxx >
<r p="1">81</r>
</measValue>
</measInfo>
</measData>
My logstash file.conf
input {
  file {
    path => "/home/test/Desktop/data/file.xml"
    start_position => beginning
    sincedb_path => "/dev/null"
    codec => multiline {
      pattern => "<measData>|</measData>"
      negate => true
      what => "previous"
    }
  }
}
filter {
  xml {
    store_xml => false
    source => "message"
    xpath => [
      "//measInfo[@measInfoId="SensorProcessingCounters"]/measValue/r[@p='1']/text()", "SensorProcessingCounters"
    ]
  }
  mutate {
    convert => { "SensorProcessingCounters" => "float" }
  }
}
output {
  elasticsearch {
    action => "index"
    hosts => ["localhost:9200"]
    index => "stock"
  }
  stdout {}
}
Error message:
[2018-07-12T11:16:19,253][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified
[2018-07-12T11:16:19,973][INFO ][logstash.runner ] Starting Logstash {"logstash.version"=>"6.3.1"}
[2018-07-12T11:16:20,649][ERROR][logstash.agent ] Failed to execute action {:action=>LogStash::PipelineAction::Create/pipeline_id:main, :exception=>"LogStash::ConfigurationError", :message=>"Expected one of #, {, ,, ] at line 20, column 27 (byte 432) after filter\r\n{\r\nxml {\r\nstore_xml => false\r\nsource => \"message\"\r\nxpath =>\r\n[\"//measInfo[@measInfoId=\"", :backtrace=>["/home/test/Desktop/logstash-6.3.1/logstash-core/lib/logstash/compiler.rb:42:in `compile_imperative'", "/home/test/Desktop/logstash-6.3.1/logstash-core/lib/logstash/compiler.rb:50:in `compile_graph'", "/home/test/Desktop/logstash-6.3.1/logstash-core/lib/logstash/compiler.rb:12:in `block in compile_sources'", "org/jruby/RubyArray.java:2486:in `map'", "/home/test/Desktop/logstash-6.3.1/logstash-core/lib/logstash/compiler.rb:11:in `compile_sources'", "/home/test/Desktop/logstash-6.3.1/logstash-core/lib/logstash/pipeline.rb:49:in `initialize'", "/home/test/Desktop/logstash-6.3.1/logstash-core/lib/logstash/pipeline.rb:167:in `initialize'", "/home/test/Desktop/logstash-6.3.1/logstash-core/lib/logstash/pipeline_action/create.rb:40:in `execute'", "/home/test/Desktop/logstash-6.3.1/logstash-core/lib/logstash/agent.rb:305:in `block in converge_state'"]}
[2018-07-12T11:16:21,024][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600}
Thank you
For this line:
["//measInfo[@measInfoId="SensorProcessingCounters"]/measValue/r[@p='1']/text()",
"SensorProcessingCounters"
I guess you should use single quotes:
["//measInfo[@measInfoId='SensorProcessingCounters']/measValue/r[@p='1']/text()",
"SensorProcessingCounters"
because the quotes mismatch: the inner double quotes around SensorProcessingCounters terminate the surrounding config string early.
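Putting that fix back into the filter block, it would look like this. Also double-check that the attribute name in the XPath matches the one actually used in your XML: the sample above shows `Id`, while the XPath queries `measInfoId`, and whichever your real file uses must appear after the `@`:

```
filter {
  xml {
    store_xml => false
    source => "message"
    xpath => [
      # single quotes inside the double-quoted config string
      "//measInfo[@measInfoId='SensorProcessingCounters']/measValue/r[@p='1']/text()",
      "SensorProcessingCounters"
    ]
  }
}
```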

error while importing csv file into elasticsearch using logstash

I want to import a CSV file into Elasticsearch using Logstash. My config is below:
input {
  file {
    path => "C:\Users\welcome\Dropbox\IT_Department\BigDataProjects\StudentWithdraElastic\student_withdraw.csv"
    start_position => "beginning"
    sincedb_path => "j:\null"
  }
}
filter {
  csv {
    separator => ","
    columns => ["DEPT_NAME", "CERT_NAME", "SPEC_NAME", "STUDENT_NO", "STUD_NAME", "GENDER", "ADVISORS_NAME", "ACADEMIC_YEAR", "REQUEST_NO", "withdraw_reason_category", "STATUS", "WITHDRAW_DATE", "LECTURER", "COURSE_NO", "COURSE_NAME", "SECTION_NO", "TOTAL_REG"]
    mutate { convert => ["SECTION_NO", "integer"] }
    mutate { convert => ["TOTAL_REG", "integer"] }
}
output {
  elasticsearch {
    hosts => "https://hadoop.hct.org"
    index => "withDrawIndex"
    document_type => "studentWithdrawDocument"
  }
  stdout {}
}
I run it with the command below:
C:\Elastic\logstash-6.2.2>bin\logstash -f C:\Users\welcome\Dropbox\IT_Department\BigDataProjects\StudentWithdraElastic\logstash_withdraw.config
It gives the following error:
Sending Logstash's logs to C:/Elastic/logstash-6.2.2/logs which is now configured via log4j2.properties
[2018-03-29T09:26:09,570][INFO ][logstash.modules.scaffold] Initializing module {:module_name=>"fb_apache", :directory=>"C:/Elastic/logstash-6.2.2/modules/fb_apache/configuration"}
[2018-03-29T09:26:09,595][INFO ][logstash.modules.scaffold] Initializing module {:module_name=>"netflow", :directory=>"C:/Elastic/logstash-6.2.2/modules/netflow/configuration"}
[2018-03-29T09:26:09,964][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified
[2018-03-29T09:26:10,565][INFO ][logstash.runner ] Starting Logstash {"logstash.version"=>"6.2.2"}
[2018-03-29T09:26:11,027][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600}
[2018-03-29T09:26:11,400][ERROR][logstash.agent ] Failed to execute action {:action=>LogStash::PipelineAction::Create/pipeline_id:main, :exception=>"LogStash::ConfigurationError", :message=>"Expected one of #, => at line 13, column 12 (byte 486) after filter {\ncsv {\nseparator => \",\"\ncolumns => [\"DEPT_NAME\"\t,\"CERT_NAME\",\t\"SPEC_NAME\"\t,\"STUDENT_NO\",\t\"STUD_NAME\",\t\"GENDER\",\t\"ADVISORS_NAME\",\t\"ACADEMIC_YEAR\",\t\"REQUEST_NO\",\n\t\t\"withdraw_reason_category\",\t\"STATUS\",\t\"WITHDRAW_DATE\",\t\"LECTURER\",\t\"COURSE_NO\",\t\"COURSE_NAME\",\t\"SECTION_NO\",\t\"TOTAL_REG\"]\n mutate ", :backtrace=>["C:/Elastic/logstash-6.2.2/logstash-core/lib/logstash/compiler.rb:42:in `compile_imperative'", "C:/Elastic/logstash-6.2.2/logstash-core/lib/logstash/compiler.rb:50:in `compile_graph'", "C:/Elastic/logstash-6.2.2/logstash-core/lib/logstash/compiler.rb:12:in `block in compile_sources'", "org/jruby/RubyArray.java:2486:in `map'", "C:/Elastic/logstash-6.2.2/logstash-core/lib/logstash/compiler.rb:11:in `compile_sources'", "C:/Elastic/logstash-6.2.2/logstash-core/lib/logstash/pipeline.rb:51:in `initialize'", "C:/Elastic/logstash-6.2.2/logstash-core/lib/logstash/pipeline.rb:169:in `initialize'", "C:/Elastic/logstash-6.2.2/logstash-core/lib/logstash/pipeline_action/create.rb:40:in `execute'", "C:/Elastic/logstash-6.2.2/logstash-core/lib/logstash/agent.rb:315:in `block in converge_state'", "C:/Elastic/logstash-6.2.2/logstash-core/lib/logstash/agent.rb:141:in `with_pipelines'", "C:/Elastic/logstash-6.2.2/logstash-core/lib/logstash/agent.rb:312:in `block in converge_state'", "org/jruby/RubyArray.java:1734:in `each'", "C:/Elastic/logstash-6.2.2/logstash-core/lib/logstash/agent.rb:299:in `converge_state'", "C:/Elastic/logstash-6.2.2/logstash-core/lib/logstash/agent.rb:166:in `block in converge_state_and_update'", "C:/Elastic/logstash-6.2.2/logstash-core/lib/logstash/agent.rb:141:in `with_pipelines'", 
"C:/Elastic/logstash-6.2.2/logstash-core/lib/logstash/agent.rb:164:in `converge_state_and_update'", "C:/Elastic/logstash-6.2.2/logstash-core/lib/logstash/agent.rb:90:in `execute'", "C:/Elastic/logstash-6.2.2/logstash-core/lib/logstash/runner.rb:348:in `block in execute'", "C:/Elastic/logstash-6.2.2/vendor/bundle/jruby/2.3.0/gems/stud-0.0.23/lib/stud/task.rb:24:in `block in initialize'"]}
I searched for solutions but didn't find anything that helps.
You're missing a closing curly brace at the end of your csv filter:
filter {
  csv {
    separator => ","
    columns => ["DEPT_NAME", "CERT_NAME", "SPEC_NAME", "STUDENT_NO", "STUD_NAME", "GENDER", "ADVISORS_NAME", "ACADEMIC_YEAR", "REQUEST_NO", "withdraw_reason_category", "STATUS", "WITHDRAW_DATE", "LECTURER", "COURSE_NO", "COURSE_NAME", "SECTION_NO", "TOTAL_REG"]
  }    <-- this is missing
  mutate { convert => ["SECTION_NO", "integer"] }
  mutate { convert => ["TOTAL_REG", "integer"] }
}
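One further problem will surface once the config parses (an additional observation, not part of the original answer): Elasticsearch index names must be lowercase, so `withDrawIndex` will be rejected at indexing time. A sketch of the corrected output block:

```
output {
  elasticsearch {
    hosts => "https://hadoop.hct.org"
    # index names must be all lowercase in Elasticsearch
    index => "withdrawindex"
    document_type => "studentWithdrawDocument"
  }
  stdout {}
}
```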
