Translation Missing Error in Logstash Logs - elasticsearch

I'm trying to feed data from CSV files into Elasticsearch using Logstash. My Logstash config file looks like this:
input {
  file {
    path => "C:\Users\shreya\Data\RetailData.csv"
    start_position => "beginning"
    #sincedb_path => "C:\Users\shreya\null"
  }
}
filter {
  csv {
    separator => ","
    id => "Store_ID"
    columns => ["Store","Date","Temperature","Fuel_Price", "MarkDown1", "MarkDown2", "MarkDown3", "MarkDown4", "CPI", "Unemployment", "IsHoliday"]
  }
  mutate {convert => ["Store", "integer"]}
  mutate {convert => ["Date", "date"]}
  mutate {convert => ["Temperature", "float"]}
  mutate {convert => ["Fuel_Price", "float"]}
  mutate {convert => ["CPI", "float"]}
  mutate {convert => ["Unemployment", "float"]}
}
output {
  elasticsearch {
    action => "index"
    hosts => "localhost:9200"
    index => "store"
    document_type => "store_retail"
  }
  stdout {}
  #stdout {
  #  codec => rubydebug
  #}
}
But I'm getting an error and can't figure out how to solve it. I'm new to Logstash. My error log looks like this:
[2017-12-02T15:56:38,150][INFO ][logstash.modules.scaffold] Initializing module {:module_name=>"fb_apache", :directory=>"C:/Users/shreya/logstash-6.0.0/modules/fb_apache/configuration"}
[2017-12-02T15:56:38,165][INFO ][logstash.modules.scaffold] Initializing module {:module_name=>"netflow", :directory=>"C:/Users/shreya/logstash-6.0.0/modules/netflow/configuration"}
[2017-12-02T15:56:38,243][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified
[2017-12-02T15:56:39,117][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600}
[2017-12-02T15:56:42,965][WARN ][logstash.outputs.elasticsearch] You are using a deprecated config setting "document_type" set in elasticsearch. Deprecated settings will continue to work, but are scheduled for removal from logstash in the future. Document types are being deprecated in Elasticsearch 6.0, and removed entirely in 7.0. You should avoid this feature If you have any questions about this, please visit the #logstash channel on freenode irc. {:name=>"document_type", :plugin=><LogStash::Outputs::ElasticSearch action=>"index", hosts=>["localhost:9200"], index=>"store", document_type=>"store_retail", id=>"91a4406a13e9377abb312acf5f6be8e609a685f9c84a5906af957e956119798c">}
[2017-12-02T15:56:43,604][INFO ][logstash.outputs.elasticsearch] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://localhost:9200/]}}
[2017-12-02T15:56:43,604][INFO ][logstash.outputs.elasticsearch] Running health check to see if an Elasticsearch connection is working {:healthcheck_url=>http://localhost:9200/, :path=>"/"}
[2017-12-02T15:56:43,854][WARN ][logstash.outputs.elasticsearch] Restored connection to ES instance {:url=>"http://localhost:9200/"}
[2017-12-02T15:56:43,932][INFO ][logstash.outputs.elasticsearch] Using mapping template from {:path=>nil}
[2017-12-02T15:56:43,933][INFO ][logstash.outputs.elasticsearch] Attempting to install template {:manage_template=>{"template"=>"logstash-*", "version"=>60001, "settings"=>{"index.refresh_interval"=>"5s"}, "mappings"=>{"_default_"=>{"dynamic_templates"=>[{"message_field"=>{"path_match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false}}}, {"string_fields"=>{"match"=>"*", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false, "fields"=>{"keyword"=>{"type"=>"keyword", "ignore_above"=>256}}}}}], "properties"=>{"#timestamp"=>{"type"=>"date"}, "#version"=>{"type"=>"keyword"}, "geoip"=>{"dynamic"=>true, "properties"=>{"ip"=>{"type"=>"ip"}, "location"=>{"type"=>"geo_point"}, "latitude"=>{"type"=>"half_float"}, "longitude"=>{"type"=>"half_float"}}}}}}}}
[2017-12-02T15:56:43,964][INFO ][logstash.outputs.elasticsearch] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["//localhost:9200"]}
[2017-12-02T15:56:44,011][ERROR][logstash.pipeline ] Error registering plugin {:pipeline_id=>"main", :plugin=>"#<LogStash::FilterDelegator:0x3e4985f1 #metric_events_out=org.jruby.proxy.org.logstash.instrument.metrics.counter.LongCounter$Proxy2 - namespace: [stats, pipelines, main, plugins, filters, e3501f879986420bd95a59d8a1c006d9bc4351a481c96bd5366e7edb54bc6fbb, events] key: out value:0, #metric_events_in=org.jruby.proxy.org.logstash.instrument.metrics.counter.LongCounter$Proxy2 - namespace: [stats, pipelines, main, plugins, filters, e3501f879986420bd95a59d8a1c006d9bc4351a481c96bd5366e7edb54bc6fbb, events] key: in value:0, #logger=#<LogStash::Logging::Logger:0x48eebcf8 #logger=#<Java::OrgApacheLoggingLog4jCore::Logger:0x113b0d16>>, #metric_events_time=org.jruby.proxy.org.logstash.instrument.metrics.counter.LongCounter$Proxy2 - namespace: [stats, pipelines, main, plugins, filters, e3501f879986420bd95a59d8a1c006d9bc4351a481c96bd5366e7edb54bc6fbb, events] key: duration_in_millis value:0, #id=\"e3501f879986420bd95a59d8a1c006d9bc4351a481c96bd5366e7edb54bc6fbb\", #klass=LogStash::Filters::Mutate, #metric_events=#<LogStash::Instrument::NamespacedMetric:0x7c8acc8 #metric=#<LogStash::Instrument::Metric:0x3afcd9b5 #collector=#<LogStash::Instrument::Collector:0x73e63041 #agent=nil, #metric_store=#<LogStash::Instrument::MetricStore:0x60e51f03 #store=#<Concurrent::Map:0x00000000000fb0 entries=3 default_proc=nil>, #structured_lookup_mutex=#<Mutex:0x2209413b>, #fast_lookup=#<Concurrent::Map:0x00000000000fb4 entries=86 default_proc=nil>>>>, #namespace_name=[:stats, :pipelines, :main, :plugins, :filters, :e3501f879986420bd95a59d8a1c006d9bc4351a481c96bd5366e7edb54bc6fbb, :events]>, #filter=<LogStash::Filters::Mutate convert=>{\"Date\"=>\"date\"}, id=>\"e3501f879986420bd95a59d8a1c006d9bc4351a481c96bd5366e7edb54bc6fbb\", enable_metric=>true, periodic_flush=>false>>", :error=>"translation missing: en.logstash.agent.configuration.invalid_plugin_register", :thread=>"#<Thread:0x3cc2461b#C:/Users/shreya/logstash-6.0.0/logstash-core/lib/logstash/pipeline.rb:290 run>"}
[2017-12-02T15:56:44,042][ERROR][logstash.pipeline ] Pipeline aborted due to error {:pipeline_id=>"main", :exception=>#<LogStash::ConfigurationError: translation missing: en.logstash.agent.configuration.invalid_plugin_register>, :backtrace=>["C:/Users/shreya/logstash-6.0.0/vendor/bundle/jruby/2.3.0/gems/logstash-filter-mutate-3.1.6/lib/logstash/filters/mutate.rb:186:in `block in register'", "org/jruby/RubyHash.java:1343:in `each'", "C:/Users/shreya/logstash-6.0.0/vendor/bundle/jruby/2.3.0/gems/logstash-filter-mutate-3.1.6/lib/logstash/filters/mutate.rb:184:in `register'", "C:/Users/shreya/logstash-6.0.0/logstash-core/lib/logstash/pipeline.rb:388:in `register_plugin'", "C:/Users/shreya/logstash-6.0.0/logstash-core/lib/logstash/pipeline.rb:399:in `block in register_plugins'", "org/jruby/RubyArray.java:1734:in `each'", "C:/Users/shreya/logstash-6.0.0/logstash-core/lib/logstash/pipeline.rb:399:in `register_plugins'", "C:/Users/shreya/logstash-6.0.0/logstash-core/lib/logstash/pipeline.rb:801:in `maybe_setup_out_plugins'", "C:/Users/shreya/logstash-6.0.0/logstash-core/lib/logstash/pipeline.rb:409:in `start_workers'", "C:/Users/shreya/logstash-6.0.0/logstash-core/lib/logstash/pipeline.rb:333:in `run'", "C:/Users/shreya/logstash-6.0.0/logstash-core/lib/logstash/pipeline.rb:293:in `block in start'"], :thread=>"#<Thread:0x3cc2461b#C:/Users/shreya/logstash-6.0.0/logstash-core/lib/logstash/pipeline.rb:290 run>"}
[2017-12-02T15:56:44,058][ERROR][logstash.agent ] Failed to execute action {:id=>:main, :action_type=>LogStash::ConvergeResult::FailedAction, :message=>"Could not execute action: LogStash::PipelineAction::Create/pipeline_id:main, action_result: false", :backtrace=>nil}

The problem comes from the convert target in one of the mutate filters. From the documentation:
Valid conversion targets are: integer, float, string, and boolean.
So this part is causing the crash:
mutate {convert => ["Date", "date"]}
If you want to convert a string to a date, you'll have to use the date filter instead, as sketched below.
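A minimal sketch of what the filter section could look like with a date filter in place of the invalid convert. This assumes the Date column is formatted like yyyy-MM-dd; adjust the match pattern to whatever your CSV actually contains:
filter {
  csv {
    separator => ","
    columns => ["Store","Date","Temperature","Fuel_Price", "MarkDown1", "MarkDown2", "MarkDown3", "MarkDown4", "CPI", "Unemployment", "IsHoliday"]
  }
  # "date" is not a valid convert target, so parse the Date column with the date filter instead
  date {
    match  => ["Date", "yyyy-MM-dd"]  # assumed format; change to match your data
    target => "Date"                  # write the parsed timestamp back into the Date field
  }
  mutate {convert => ["Store", "integer"]}
  mutate {convert => ["Temperature", "float"]}
  mutate {convert => ["Fuel_Price", "float"]}
  mutate {convert => ["CPI", "float"]}
  mutate {convert => ["Unemployment", "float"]}
}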

You can validate your config file with the command below, which shows the error details:
./logstash -f /etc/logstash/conf.d/your_config_file.conf --config.test_and_exit

Related

pushing data from logstash to elasticsearch

I have stored my Logstash configuration file in the same folder where Logstash is installed.
While trying to push data from Logstash to Elasticsearch, it shows that the server has started, but no data is pushed to Elasticsearch. How can I validate whether data is being pushed to Elasticsearch or not?
This is my Logstash configuration file:
input {
  file {
    path => "C:\Elastic\GOOG.csv"
    start_position => "beginning"
  }
}
filter {
  csv {
    columns => ["date_of_record","open","high","low","close","volume","adj_close"]
    separator => ","
  }
  date {
    match => ["date_of_record","yyyy-MM-dd"]
  }
  mutate {
    convert => ["open","float"]
    convert => ["high","float"]
    convert => ["low","float"]
    convert => ["close","float"]
    convert => ["volume","integer"]
    convert => ["adj_close","float"]
  }
}
output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "CSVGOGO"
  }
}
Logstash Logs are:
c:\Elastic>.\logstash-7.0.0\bin\logstash -f .\gogo.conf
Sending Logstash logs to c:/Elastic/logstash-7.0.0/logs which is now configured via log4j2.properties
[2019-10-12T20:13:24,602][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified
[2019-10-12T20:13:24,831][INFO ][logstash.runner ] Starting Logstash {"logstash.version"=>"7.0.0"}
[2019-10-12T20:14:42,358][INFO ][logstash.outputs.elasticsearch] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://localhost:9200/]}}
[2019-10-12T20:14:43,392][WARN ][logstash.outputs.elasticsearch] Restored connection to ES instance {:url=>"http://localhost:9200/"}
[2019-10-12T20:14:43,868][INFO ][logstash.outputs.elasticsearch] ES Output version determined {:es_version=>7}
[2019-10-12T20:14:43,882][WARN ][logstash.outputs.elasticsearch] Detected a 6.x and above cluster: the `type` event field won't be used to determine the document _type {:es_version=>7}
[2019-10-12T20:14:43,961][INFO ][logstash.outputs.elasticsearch] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["//localhost:9200"]}
[2019-10-12T20:14:43,971][INFO ][logstash.outputs.elasticsearch] Using default mapping template
[2019-10-12T20:14:44,124][INFO ][logstash.javapipeline ] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>4, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50, "pipeline.max_inflight"=>500, :thread=>"#<Thread:0x22517e24 run>"}
[2019-10-12T20:14:44,604][INFO ][logstash.outputs.elasticsearch] Attempting to install template {:manage_template=>{"index_patterns"=>"logstash-*", "version"=>60001, "settings"=>{"index.refresh_interval"=>"5s", "number_of_shards"=>1, "index.lifecycle.name"=>"logstash-policy", "index.lifecycle.rollover_alias"=>"logstash"}, "mappings"=>{"dynamic_templates"=>[{"message_field"=>{"path_match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false}}}, {"string_fields"=>{"match"=>"*", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false, "fields"=>{"keyword"=>{"type"=>"keyword", "ignore_above"=>256}}}}}], "properties"=>{"#timestamp"=>{"type"=>"date"}, "#version"=>{"type"=>"keyword"}, "geoip"=>{"dynamic"=>true, "properties"=>{"ip"=>{"type"=>"ip"}, "location"=>{"type"=>"geo_point"}, "latitude"=>{"type"=>"half_float"}, "longitude"=>{"type"=>"half_float"}}}}}}}
[2019-10-12T20:14:48,863][INFO ][logstash.inputs.file ] No sincedb_path set, generating one based on the "path" setting {:sincedb_path=>"c:/Elastic/logstash-7.0.0/data/plugins/inputs/file/.sincedb_1eb0c3bd994c60a8564bc344e0f91452", :path=>["C:\\Elastic\\GOOG.csv"]}
[2019-10-12T20:14:48,976][INFO ][logstash.javapipeline ] Pipeline started {"pipeline.id"=>"main"}
[2019-10-12T20:14:49,319][INFO ][filewatch.observingtail ] START, creating Discoverer, Watch with file and sincedb collections
[2019-10-12T20:14:49,331][INFO ][logstash.agent ] Pipelines running {:count=>1, :running_pipelines=>[:main], :non_running_pipelines=>[]}
[2019-10-12T20:14:52,244][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600}
The data will only be pushed to ES if it flows correctly through the input and the filters.
Input: make sure the file is actually being read by the file input.
Filter: try adding a ruby filter that prints the data it got from the input.
Output: also write the output to the console to make sure it matches your expectation (see the sketch below).
Also, you can start Logstash in debug mode to get more information.
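A minimal sketch of these debugging aids (a ruby filter that prints each event, plus a console output), to be added alongside your existing filters and output:
filter {
  # print each event as it passes through the filter stage
  ruby {
    code => "puts event.to_hash.inspect"
  }
}
output {
  # echo every event to the console in addition to Elasticsearch
  stdout { codec => rubydebug }
}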
For the ELK stack, to test whether data is pushed to ES (assuming you have Kibana installed), follow the process below.
Explanation:
1. optional: add stdout to the Logstash pipeline to show what is going on:
stdout { codec => rubydebug }
2. mandatory: add sincedb_path => "/dev/null" to the file input (on Windows use sincedb_path => "NUL"). Logstash has a feature called sincedb: it keeps track of where it last stopped reading a file before it crashed or was stopped.
3. mandatory: the index name should be lowercase (csvgogo).
4. optional: document_type => "csvfile"; if you don't set it, the default is 'logs'.
So your Logstash pipeline may look like the following:
input {
  file {
    path => "C:\Elastic\GOOG.csv"
    start_position => "beginning"
    sincedb_path => "/dev/null"
  }
}
filter {
  csv {
    columns => ["date_of_record","open","high","low","close","volume","adj_close"]
    separator => ","
  }
  date {
    match => ["date_of_record","yyyy-MM-dd"]
  }
  mutate {
    convert => ["open","float"]
    convert => ["high","float"]
    convert => ["low","float"]
    convert => ["close","float"]
    convert => ["volume","integer"]
    convert => ["adj_close","float"]
  }
}
output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "csvgogo"
    document_type => "csvfile" # default is 'logs'
  }
}
1. Try Kibana's Dev Tools ('http://localhost:5601/app/kibana') to run the query:
GET /csvgogo/_search
{
  "query": {
    "match_all": {}
  }
}
2. Try the browser: 'http://localhost:9200/csvgogo/_search?pretty', where 'csvgogo' is your ES index name. It will show you the raw data from Elasticsearch directly in the browser.

logstash configuration pipeline

I have a log file that looks like this:
116.50.181.5 - - [18/May/2015:19:05:32 +0000] "GET /images/web/2009/banner.png HTTP/1.1" 200 52315 "http://www.semicomplete.com/style2.css" "Mozilla/5.0 (X11; Linux x86_64; rv:26.0) Gecko/20100101 Firefox/26.0"
My Logstash configuration is as below:
input {
  file {
    path => "C:\Users\PC\Documents\elk\Input\listening.txt"
    start_position => "beginning"
  }
}
filter {
  grok {
    match => {
      "message" => '%{IPORHOST:clientip} %{USER:ident} %{USER:auth} \[%{HTTPDATE:timestamp}\] "%{WORD:verb} %{DATA:request} HTTP/%{NUMBER:httpversion}" %{NUMBER:response:int} (?:-|%{NUMBER:bytes:int}) %{QS:referrer} %{QS:agent}'
    }
  }
  date {
    match => [ "timestamp", "dd/MMM/YYYY:HH:mm:ss Z" ]
    locale => en
  }
  geoip {
    source => "clientip"
  }
  useragent {
    source => "agent"
    target => "useragent"
  }
}
output {
  elasticsearch {
    hosts => "http://localhost:9200"
    index => "log"
  }
}
Everything works just fine, I have no errors in Logstash, but the data doesn't appear in Elasticsearch as expected.
C:\elk\logstash-7.1.1\bin>logstash -f logstashETL.conf
Sending Logstash logs to C:/elk/logstash-7.1.1/logs which is now configured via log4j2.properties
[2019-06-12T16:02:27,371][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified
[2019-06-12T16:02:27,405][INFO ][logstash.runner ] Starting Logstash {"logstash.version"=>"7.1.1"}
[2019-06-12T16:02:36,087][INFO ][logstash.outputs.elasticsearch] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://localhost:9200/]}}
[2019-06-12T16:02:36,344][WARN ][logstash.outputs.elasticsearch] Restored connection to ES instance {:url=>"http://localhost:9200/"}
[2019-06-12T16:02:36,428][INFO ][logstash.outputs.elasticsearch] ES Output version determined {:es_version=>7}
[2019-06-12T16:02:36,428][WARN ][logstash.outputs.elasticsearch] Detected a 6.x and above cluster: the `type` event field won't be used to determine the document _type {:es_version=>7}
[2019-06-12T16:02:36,469][INFO ][logstash.outputs.elasticsearch] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["http://localhost:9200"]}
[2019-06-12T16:02:36,493][INFO ][logstash.outputs.elasticsearch] Using default mapping template
[2019-06-12T16:02:36,513][INFO ][logstash.javapipeline ] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>4, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50, "pipeline.max_inflight"=>500, :thread=>"#<Thread:0x75642d2 run>"}
[2019-06-12T16:02:36,753][INFO ][logstash.outputs.elasticsearch] Attempting to install template {:manage_template=>{"index_patterns"=>"logstash-*", "version"=>60001, "settings"=>{"index.refresh_interval"=>"5s", "number_of_shards"=>1}, "mappings"=>{"dynamic_templates"=>[{"message_field"=>{"path_match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false}}}, {"string_fields"=>{"match"=>"*", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false, "fields"=>{"keyword"=>{"type"=>"keyword", "ignore_above"=>256}}}}}], "properties"=>{"#timestamp"=>{"type"=>"date"}, "#version"=>{"type"=>"keyword"}, "geoip"=>{"dynamic"=>true, "properties"=>{"ip"=>{"type"=>"ip"}, "location"=>{"type"=>"geo_point"}, "latitude"=>{"type"=>"half_float"}, "longitude"=>{"type"=>"half_float"}}}}}}}
[2019-06-12T16:02:37,814][INFO ][logstash.inputs.file ] No sincedb_path set, generating one based on the "path" setting {:sincedb_path=>"C:/elk/logstash-7.1.1/data/plugins/inputs/file/.sincedb_636c54fa423804cc695f80e1cb9d6ccd", :path=>["C:\\Users\\PC\\Documents\\elk\\Input\\listening.txt"]}
[2019-06-12T16:02:37,878][INFO ][logstash.javapipeline ] Pipeline started {"pipeline.id"=>"main"}
[2019-06-12T16:02:37,988][INFO ][logstash.agent ] Pipelines running {:count=>1, :running_pipelines=>[:main], :non_running_pipelines=>[]}
[2019-06-12T16:02:38,008][INFO ][filewatch.observingtail ] START, creating Discoverer, Watch with file and sincedb collections
[2019-06-12T16:02:38,773][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600}
Maybe there is something wrong or missing in my code.
Add the settings below to your file input (see the sketch after these steps):
start_position => "beginning"
sincedb_path => "/dev/null"
sincedb_path => "/dev/null" means Logstash doesn't store sincedb files. These files keep the byte offset of where Logstash left off in the file.
Then go to the logstash/data/plugins/inputs/file directory and run the command below there:
rm -r .sincedb*
Finally, run your Logstash pipeline again. It should work.
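A minimal sketch of the adjusted input block, assuming the same file path as in the question; note that on Windows "NUL" plays the role of /dev/null:
input {
  file {
    path => "C:\Users\PC\Documents\elk\Input\listening.txt"
    start_position => "beginning"
    sincedb_path => "NUL"   # Windows equivalent of /dev/null; use "/dev/null" on Linux/macOS
  }
}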

Not able to map csv file from logstash to kibana in Window

I'm trying to feed data from CSV files into Elasticsearch using Logstash. My Logstash config file looks like this:
input {
  file {
    path => "D:\Log Anlyser\data\cars.csv"
    start_position => "beginning"
    sincedb_path => "NUL"
  }
}
filter {
  csv {
    separator => ","
    columns => [ "maker", "model", "mileage", "manufacture_year", "engine_displacement", "engine_power", "body_type", "color_slug", "stk_year", "transmission", "door_count", "seat_count", "fuel_type", "date_created", "date_last_seen", "price_eur" ]
  }
  mutate {convert => ["milage", "integer"] }
  mutate {convert => ["price_eur", "float"] }
  mutate {convert => ["engine_power", "integer"] }
  mutate {convert => ["door_count", "integer"] }
  mutate {convert => ["seat_count", "integer"] }
}
output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => ["cars-%{+YYYY.MM.dd}"]
  }
}
When firing this command for Logstash on Windows: logstash -f cars.conf, I am getting this:
Sending Logstash logs to D:/Log_Anlyser/logstash/logs which is now configured via log4j2.properties
[2019-02-26T12:05:51,690][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified
[2019-02-26T12:05:51,721][INFO ][logstash.runner ] Starting Logstash {"logstash.version"=>"6.6.1"}
[2019-02-26T12:05:57,133][INFO ][logstash.pipeline ] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>4, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50}
[2019-02-26T12:05:57,510][INFO ][logstash.outputs.elasticsearch] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://localhost:9200/]}}
[2019-02-26T12:05:57,664][WARN ][logstash.outputs.elasticsearch] Restored connection to ES instance {:url=>"http://localhost:9200/"}
[2019-02-26T12:05:57,711][INFO ][logstash.outputs.elasticsearch] ES Output version determined {:es_version=>5}
[2019-02-26T12:05:57,742][INFO ][logstash.outputs.elasticsearch] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["//localhost:9200"]}
[2019-02-26T12:05:57,758][INFO ][logstash.outputs.elasticsearch] Using mapping template from {:path=>nil}
[2019-02-26T12:05:57,852][INFO ][logstash.outputs.elasticsearch] Attempting to install template {:manage_template=>{"template"=>"logstash-*", "version"=>50001, "settings"=>{"index.refresh_interval"=>"5s"}, "mappings"=>{"_default_"=>{"_all"=>{"enabled"=>true, "norms"=>false}, "dynamic_templates"=>[{"message_field"=>{"path_match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false}}}, {"string_fields"=>{"match"=>"*", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false, "fields"=>{"keyword"=>{"type"=>"keyword", "ignore_above"=>256}}}}}], "properties"=>{"#timestamp"=>{"type"=>"date", "include_in_all"=>false}, "#version"=>{"type"=>"keyword", "include_in_all"=>false}, "geoip"=>{"dynamic"=>true, "properties"=>{"ip"=>{"type"=>"ip"}, "location"=>{"type"=>"geo_point"}, "latitude"=>{"type"=>"half_float"}, "longitude"=>{"type"=>"half_float"}}}}}}}}
[2019-02-26T12:05:58,179][INFO ][logstash.pipeline ] Pipeline started successfully {:pipeline_id=>"main", :thread=>"#<Thread:0x274079d5 run>"}
[2019-02-26T12:05:58,226][INFO ][logstash.agent ] Pipelines running {:count=>1, :running_pipelines=>[:main], :non_running_pipelines=>[]}
[2019-02-26T12:05:58,226][INFO ][filewatch.observingtail ] START, creating Discoverer, Watch with file and sincedb collections
[2019-02-26T12:05:58,547][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600}
Now, while connecting to Kibana (localhost:5601), I am not able to map the data. I'm getting this error:
Unable to fetch mapping. Do you have indices matching the pattern?
Can you please help?
I found the problem, and the mistake is very silly: the path of the CSV file was wrong. The earlier path was path => "D:\Log Anlyser\data\cars.csv". The correct path is:
path => "D:/Log_Anlyser/data/cars.csv"
With that it works.
There might be a few reasons; maybe the data is not reaching ES at all. You can check that by verifying the index exists, by running:
GET es-url:9200/_cat/indices/cars*
If the index exists, then you should be able to create the index pattern in Kibana.
If the index is missing, then either Logstash is not reading the input file or Elasticsearch is not reachable. You need to check the Logstash logs and make sure data reaches ES.
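For example, assuming Elasticsearch is running locally on port 9200 as in the config above, the same check from the command line would be:
curl "http://localhost:9200/_cat/indices/cars*?v"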

Input as file path in logstash config didn't work

When I run a command like this (on a Windows system):
logstash -f logstash-apache.conf
there's no output and it didn't store any log in Elasticsearch, so I think it didn't work.
By the way, I referred to this website: https://www.elastic.co/guide/en/logstash/current/config-examples.html#config-examples
This is my conf file (logstash-apache.conf):
input {
  file {
    path => ["C:/Users/User/Downloads/logstash-5.5.1/bin/access_log.txt"]
    start_position => "beginning"
  }
}
filter {
  if [path] =~ "access" {
    mutate { replace => { "type" => "apache_access" } }
    grok {
      match => { "message" => "%{COMBINEDAPACHELOG}" }
    }
  }
  date {
    match => [ "timestamp" , "dd/MMM/yyyy:HH:mm:ss Z" ]
  }
}
output {
  elasticsearch {
    hosts => ["localhost:9200"]
  }
  stdout { codec => rubydebug }
}
This is the output:
C:\Users\User\Downloads\logstash-5.5.1\bin>logstash -f logstash-apache.conf
ERROR StatusLogger No log4j2 configuration file found. Using default configuration: logging only errors to the console.
Sending Logstash's logs to C:/Users/User/Downloads/logstash-5.5.1/logs which is now configured via log4j2.properties
[2017-08-18T08:35:20,504][INFO ][logstash.outputs.elasticsearch] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[localhost:9200/]}}
[2017-08-18T08:35:20,509][INFO ][logstash.outputs.elasticsearch] Running health check to see if an Elasticsearch connection is working {:healthcheck_url=>localhost:9200/, :path=>"/"}
[2017-08-18T08:35:20,668][WARN ][logstash.outputs.elasticsearch] Restored connection to ES instance {:url=>#}
[2017-08-18T08:35:20,670][INFO ][logstash.outputs.elasticsearch] Using mapping template from {:path=>nil}
[2017-08-18T08:35:20,725][INFO ][logstash.outputs.elasticsearch] Attempting to install template {:manage_template=>{"template"=>"logstash-", "version"=>50001, "settings"=>{"index.refresh_interval"=>"5s"}, "mappings"=>{"_default_"=>{"_all"=>{"enabled"=>true, "norms"=>false}, "dynamic_templates"=>[{"message_field"=>{"path_match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false}}}, {"string_fields"=>{"match"=>"", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false, "fields"=>{"keyword"=>{"type"=>"keyword", "ignore_above"=>256}}}}}], "properties"=>{"#timestamp"=>{"type"=>"date", "include_in_all"=>false}, "#version"=>{"type"=>"keyword", "include_in_all"=>false}, "geoip"=>{"dynamic"=>true, "properties"=>{"ip"=>{"type"=>"ip"}, "location"=>{"type"=>"geo_point"}, "latitude"=>{"type"=>"half_float"}, "longitude"=>{"type"=>"half_float"}}}}}}}}
[2017-08-18T08:35:20,734][INFO ][logstash.outputs.elasticsearch] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>[#]}
[2017-08-18T08:35:21,010][INFO ][logstash.pipeline ] Starting pipeline {"id"=>"main", "pipeline.workers"=>4, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>5, "pipeline.max_inflight"=>500}
[2017-08-18T08:35:21,896][INFO ][logstash.pipeline ] Pipeline main started
[2017-08-18T08:35:22,036][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600}
thank you in advance :)

logstash output not showing up in kibana

I just started learning Elasticsearch and am trying to dump IIS logs into ES via Logstash and see how they look in Kibana.
I have set up all three agents successfully and they run without errors. But when I run Logstash on my stored log files, the logs don't show up in Kibana.
(I am using ES 5.0, which doesn't have the 'head' plugin.)
This is the output I see in logstash command.
Sending Logstash logs to C:/elasticsearch-5.0.0/logstash-5.0.0-rc1/logs which is now configured via log4j2.properties.
06:28:26.067 [[main]-pipeline-manager] INFO logstash.outputs.elasticsearch - Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>["http://localhost:9200"]}}
06:28:26.081 [[main]-pipeline-manager] INFO logstash.outputs.elasticsearch - Using mapping template from {:path=>nil}
06:28:26.501 [[main]-pipeline-manager] INFO logstash.outputs.elasticsearch - Attempting to install template {:manage_template=>{"template"=>"logstash-*", "version"=>50001, "settings"=>{"index.refresh_interval"=>"5s"}, "mappings"=>{"_default_"=>{"_all"=>{"enabled"=>true, "norms"=>false}, "dynamic_templates"=>[{"message_field"=>{"path_match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false}}}, {"string_fields"=>{"match"=>"*", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false, "fields"=>{"keyword"=>{"type"=>"keyword"}}}}}], "properties"=>{"#timestamp"=>{"type"=>"date", "include_in_all"=>false}, "#version"=>{"type"=>"keyword", "include_in_all"=>false}, "geoip"=>{"dynamic"=>true, "properties"=>{"ip"=>{"type"=>"ip"}, "location"=>{"type"=>"geo_point"}, "latitude"=>{"type"=>"half_float"}, "longitude"=>{"type"=>"half_float"}}}}}}}}
06:28:26.573 [[main]-pipeline-manager] INFO logstash.outputs.elasticsearch - New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["localhost:9200"]}
06:28:26.717 [[main]-pipeline-manager] INFO logstash.pipeline - Starting pipeline {"id"=>"main", "pipeline.workers"=>4, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>5, "pipeline.max_inflight"=>500}
06:28:26.736 [[main]-pipeline-manager] INFO logstash.pipeline - Pipeline main started
06:28:26.857 [Api Webserver] INFO logstash.agent - Successfully started Logstash API endpoint {:port=>9600}
But Kibana doesn't show any indexes. I am a newbie here and am not sure what's going on internally. Could you please help me understand what is wrong here?
Logstash Config file:
input {
  file {
    type => "iis-w3c"
    path => "C:/Users/ras/Desktop/logs/logs/LogFiles/test/aug1/*.log"
  }
}
filter {
  grok {
    match => ["message", "%{TIMESTAMP_ISO8601:log_timestamp} %{WORD:serviceName} %{WORD:serverName} %{IP:serverIP} %{WORD:method} %{URIPATH:uriStem} %{NOTSPACE:uriQuery} %{NUMBER:port} %{NOTSPACE:username} %{IPORHOST:clientIP} %{NOTSPACE:protocolVersion} %{NOTSPACE:userAgent} %{NOTSPACE:cookie} %{NOTSPACE:referer} %{NOTSPACE:requestHost} %{NUMBER:response} %{NUMBER:subresponse} %{NUMBER:win32response} %{NUMBER:bytesSent} %{NUMBER:bytesReceived} %{NUMBER:timetaken}"]
  }
  mutate {
    ## Convert some fields from strings to integers
    convert => ["bytesSent", "integer"]
    convert => ["bytesReceived", "integer"]
    convert => ["timetaken", "integer"]
    ## Create a new field for the reverse DNS lookup below
    add_field => { "clientHostname" => "%{clientIP}" }
    ## Finally remove the original log_timestamp field since the event will
    ## have the proper date on it
    remove_field => [ "log_timestamp"]
  }
}
output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "%{type}-%{+YYYY.MM}"
  }
  stdout { codec => rubydebug }
}
You can check the names of the indexes present in Elasticsearch with a plugin like kopf, or with the _cat/indices endpoint, which you can access directly in a browser at [ip of ES]:9200/_cat/indices or via curl: curl [ip of ES]:9200/_cat/indices.
In Kibana you have to provide a pattern for the index names, which is by default logstash-*, as shown in your screenshot. This default is used in Kibana because, in the elasticsearch output plugin for Logstash, the default index pattern is logstash-%{+YYYY.MM.dd} (cf. the docs), which is used to name the indexes created by this plugin.
But in your case, the plugin is configured with index => "%{type}-%{+YYYY.MM}", so the indexes created will have the form iis-w3c-%{+YYYY.MM}. You'll therefore have to replace logstash-* with iis-w3c-* in the "Index name or pattern" field.
