Input as file path in logstash config didn't work - elasticsearch

When I run a command like this (on a Windows system):
logstash -f logstash-apache.conf
there is no output, and it doesn't store any logs in Elasticsearch, so I think it isn't working.
By the way, I referred to this page: https://www.elastic.co/guide/en/logstash/current/config-examples.html#config-examples
This is my conf file (logstash-apache.conf):
input {
  file {
    path => ["C:/Users/User/Downloads/logstash-5.5.1/bin/access_log.txt"]
    start_position => "beginning"
  }
}
filter {
  if [path] =~ "access" {
    mutate { replace => { "type" => "apache_access" } }
    grok {
      match => { "message" => "%{COMBINEDAPACHELOG}" }
    }
  }
  date {
    match => [ "timestamp" , "dd/MMM/yyyy:HH:mm:ss Z" ]
  }
}
output {
  elasticsearch {
    hosts => ["localhost:9200"]
  }
  stdout { codec => rubydebug }
}
This is the output:
C:\Users\User\Downloads\logstash-5.5.1\bin>logstash -f logstash-apache.conf
ERROR StatusLogger No log4j2 configuration file found. Using default configuration: logging only errors to the console.
Sending Logstash's logs to C:/Users/User/Downloads/logstash-5.5.1/logs which is now configured via log4j2.properties
[2017-08-18T08:35:20,504][INFO ][logstash.outputs.elasticsearch] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[localhost:9200/]}}
[2017-08-18T08:35:20,509][INFO ][logstash.outputs.elasticsearch] Running health check to see if an Elasticsearch connection is working {:healthcheck_url=>localhost:9200/, :path=>"/"}
[2017-08-18T08:35:20,668][WARN ][logstash.outputs.elasticsearch] Restored connection to ES instance {:url=>#}
[2017-08-18T08:35:20,670][INFO ][logstash.outputs.elasticsearch] Using mapping template from {:path=>nil}
[2017-08-18T08:35:20,725][INFO ][logstash.outputs.elasticsearch] Attempting to install template {:manage_template=>{"template"=>"logstash-*", "version"=>50001, "settings"=>{"index.refresh_interval"=>"5s"}, "mappings"=>{"_default_"=>{"_all"=>{"enabled"=>true, "norms"=>false}, "dynamic_templates"=>[{"message_field"=>{"path_match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false}}}, {"string_fields"=>{"match"=>"*", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false, "fields"=>{"keyword"=>{"type"=>"keyword", "ignore_above"=>256}}}}}], "properties"=>{"@timestamp"=>{"type"=>"date", "include_in_all"=>false}, "@version"=>{"type"=>"keyword", "include_in_all"=>false}, "geoip"=>{"dynamic"=>true, "properties"=>{"ip"=>{"type"=>"ip"}, "location"=>{"type"=>"geo_point"}, "latitude"=>{"type"=>"half_float"}, "longitude"=>{"type"=>"half_float"}}}}}}}}
[2017-08-18T08:35:20,734][INFO ][logstash.outputs.elasticsearch] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>[#]}
[2017-08-18T08:35:21,010][INFO ][logstash.pipeline ] Starting pipeline {"id"=>"main", "pipeline.workers"=>4, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>5, "pipeline.max_inflight"=>500}
[2017-08-18T08:35:21,896][INFO ][logstash.pipeline ] Pipeline main started
[2017-08-18T08:35:22,036][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600}
thank you in advance :)

Related

logstash configuration pipeline

I have a log file that looks like this:
116.50.181.5 - - [18/May/2015:19:05:32 +0000] "GET /images/web/2009/banner.png HTTP/1.1" 200 52315 "http://www.semicomplete.com/style2.css" "Mozilla/5.0 (X11; Linux x86_64; rv:26.0) Gecko/20100101 Firefox/26.0"
My Logstash configuration is as below:
input {
  file {
    path => "C:\Users\PC\Documents\elk\Input\listening.txt"
    start_position => "beginning"
  }
}
filter {
  grok {
    match => {
      "message" => '%{IPORHOST:clientip} %{USER:ident} %{USER:auth} \[%{HTTPDATE:timestamp}\] "%{WORD:verb} %{DATA:request} HTTP/%{NUMBER:httpversion}" %{NUMBER:response:int} (?:-|%{NUMBER:bytes:int}) %{QS:referrer} %{QS:agent}'
    }
  }
  date {
    match => [ "timestamp", "dd/MMM/YYYY:HH:mm:ss Z" ]
    locale => en
  }
  geoip {
    source => "clientip"
  }
  useragent {
    source => "agent"
    target => "useragent"
  }
}
output {
  elasticsearch {
    hosts => "http://localhost:9200"
    index => "log"
  }
}
Everything works just fine, and I have no errors in Logstash, but the data doesn't appear in Elasticsearch as expected.
C:\elk\logstash-7.1.1\bin>logstash -f logstashETL.conf
Sending Logstash logs to C:/elk/logstash-7.1.1/logs which is now configured via log4j2.properties
[2019-06-12T16:02:27,371][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified
[2019-06-12T16:02:27,405][INFO ][logstash.runner ] Starting Logstash {"logstash.version"=>"7.1.1"}
[2019-06-12T16:02:36,087][INFO ][logstash.outputs.elasticsearch] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://localhost:9200/]}}
[2019-06-12T16:02:36,344][WARN ][logstash.outputs.elasticsearch] Restored connection to ES instance {:url=>"http://localhost:9200/"}
[2019-06-12T16:02:36,428][INFO ][logstash.outputs.elasticsearch] ES Output version determined {:es_version=>7}
[2019-06-12T16:02:36,428][WARN ][logstash.outputs.elasticsearch] Detected a 6.x and above cluster: the `type` event field won't be used to determine the document _type {:es_version=>7}
[2019-06-12T16:02:36,469][INFO ][logstash.outputs.elasticsearch] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["http://localhost:9200"]}
[2019-06-12T16:02:36,493][INFO ][logstash.outputs.elasticsearch] Using default mapping template
[2019-06-12T16:02:36,513][INFO ][logstash.javapipeline ] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>4, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50, "pipeline.max_inflight"=>500, :thread=>"#<Thread:0x75642d2 run>"}
[2019-06-12T16:02:36,753][INFO ][logstash.outputs.elasticsearch] Attempting to install template {:manage_template=>{"index_patterns"=>"logstash-*", "version"=>60001, "settings"=>{"index.refresh_interval"=>"5s", "number_of_shards"=>1}, "mappings"=>{"dynamic_templates"=>[{"message_field"=>{"path_match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false}}}, {"string_fields"=>{"match"=>"*", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false, "fields"=>{"keyword"=>{"type"=>"keyword", "ignore_above"=>256}}}}}], "properties"=>{"@timestamp"=>{"type"=>"date"}, "@version"=>{"type"=>"keyword"}, "geoip"=>{"dynamic"=>true, "properties"=>{"ip"=>{"type"=>"ip"}, "location"=>{"type"=>"geo_point"}, "latitude"=>{"type"=>"half_float"}, "longitude"=>{"type"=>"half_float"}}}}}}}
[2019-06-12T16:02:37,814][INFO ][logstash.inputs.file ] No sincedb_path set, generating one based on the "path" setting {:sincedb_path=>"C:/elk/logstash-7.1.1/data/plugins/inputs/file/.sincedb_636c54fa423804cc695f80e1cb9d6ccd", :path=>["C:\\Users\\PC\\Documents\\elk\\Input\\listening.txt"]}
[2019-06-12T16:02:37,878][INFO ][logstash.javapipeline ] Pipeline started {"pipeline.id"=>"main"}
[2019-06-12T16:02:37,988][INFO ][logstash.agent ] Pipelines running {:count=>1, :running_pipelines=>[:main], :non_running_pipelines=>[]}
[2019-06-12T16:02:38,008][INFO ][filewatch.observingtail ] START, creating Discoverer, Watch with file and sincedb collections
[2019-06-12T16:02:38,773][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600}
Maybe there is something wrong or missing in my code.
Add the lines below to your input:
start_position => "beginning"
sincedb_path => "/dev/null"
sincedb_path => "/dev/null" means Logstash doesn't store sincedb files. These files keep the byte offset of where Logstash left off in each file.
Then go to the logstash/data/plugins/inputs/file directory and run the command below there:
rm -r .sincedb*
Finally, run your Logstash pipeline again. It should work.
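Since this question is on Windows (the logs show C:/elk/logstash-7.1.1), note that /dev/null and rm are Unix conventions. A sketch of the equivalent input block, assuming the same file path as in the question, would use NUL instead:
input {
  file {
    path => "C:\Users\PC\Documents\elk\Input\listening.txt"
    start_position => "beginning"
    sincedb_path => "NUL"    # Windows equivalent of /dev/null: no sincedb state is persisted
  }
}
The stale sincedb files under logstash/data/plugins/inputs/file can then be removed from cmd with something like del .sincedb* instead of rm -r .sincedb*.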

Not able to map csv file from logstash to kibana in Windows

I'm trying to feed data from CSV files into Elasticsearch using Logstash. My Logstash config file looks like this:
input {
  file {
    path => "D:\Log Anlyser\data\cars.csv"
    start_position => "beginning"
    sincedb_path => "NUL"
  }
}
filter {
  csv {
    separator => ","
    columns => [ "maker", "model", "mileage", "manufacture_year", "engine_displacement", "engine_power", "body_type", "color_slug", "stk_year", "transmission", "door_count", "seat_count", "fuel_type", "date_created", "date_last_seen", "price_eur" ]
  }
  mutate { convert => ["milage", "integer"] }
  mutate { convert => ["price_eur", "float"] }
  mutate { convert => ["engine_power", "integer"] }
  mutate { convert => ["door_count", "integer"] }
  mutate { convert => ["seat_count", "integer"] }
}
output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => ["cars-%{+YYYY.MM.dd}"]
  }
}
While firing this command for Logstash on Windows: logstash -f cars.conf, I am getting this:
Sending Logstash logs to D:/Log_Anlyser/logstash/logs which is now configured via log4j2.properties
[2019-02-26T12:05:51,690][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified
[2019-02-26T12:05:51,721][INFO ][logstash.runner ] Starting Logstash {"logstash.version"=>"6.6.1"}
[2019-02-26T12:05:57,133][INFO ][logstash.pipeline ] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>4, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50}
[2019-02-26T12:05:57,510][INFO ][logstash.outputs.elasticsearch] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://localhost:9200/]}}
[2019-02-26T12:05:57,664][WARN ][logstash.outputs.elasticsearch] Restored connection to ES instance {:url=>"http://localhost:9200/"}
[2019-02-26T12:05:57,711][INFO ][logstash.outputs.elasticsearch] ES Output version determined {:es_version=>5}
[2019-02-26T12:05:57,742][INFO ][logstash.outputs.elasticsearch] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["//localhost:9200"]}
[2019-02-26T12:05:57,758][INFO ][logstash.outputs.elasticsearch] Using mapping template from {:path=>nil}
[2019-02-26T12:05:57,852][INFO ][logstash.outputs.elasticsearch] Attempting to install template {:manage_template=>{"template"=>"logstash-*", "version"=>50001, "settings"=>{"index.refresh_interval"=>"5s"}, "mappings"=>{"_default_"=>{"_all"=>{"enabled"=>true, "norms"=>false}, "dynamic_templates"=>[{"message_field"=>{"path_match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false}}}, {"string_fields"=>{"match"=>"*", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false, "fields"=>{"keyword"=>{"type"=>"keyword", "ignore_above"=>256}}}}}], "properties"=>{"@timestamp"=>{"type"=>"date", "include_in_all"=>false}, "@version"=>{"type"=>"keyword", "include_in_all"=>false}, "geoip"=>{"dynamic"=>true, "properties"=>{"ip"=>{"type"=>"ip"}, "location"=>{"type"=>"geo_point"}, "latitude"=>{"type"=>"half_float"}, "longitude"=>{"type"=>"half_float"}}}}}}}}
[2019-02-26T12:05:58,179][INFO ][logstash.pipeline ] Pipeline started successfully {:pipeline_id=>"main", :thread=>"#<Thread:0x274079d5 run>"}
[2019-02-26T12:05:58,226][INFO ][logstash.agent ] Pipelines running {:count=>1, :running_pipelines=>[:main], :non_running_pipelines=>[]}
[2019-02-26T12:05:58,226][INFO ][filewatch.observingtail ] START, creating Discoverer, Watch with file and sincedb collections
[2019-02-26T12:05:58,547][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600}
Now, while connecting to Kibana (localhost:5601), I am not able to map the data. I'm getting this error:
Unable to fetch mapping. Do you have indices matching the pattern?
Can you please help?
I found the problem, and the mistake was very silly: the path of the CSV file was wrong. The earlier path was path => "D:\Log Anlyser\data\cars.csv". The current path is:
path => "D:/Log_Anlyser/data/cars.csv"
Now it works.
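In context, the corrected input block would look like this (a sketch that keeps the other settings from the question; note that both the slashes and the folder name changed):
input {
  file {
    path => "D:/Log_Anlyser/data/cars.csv"    # forward slashes, underscore instead of a space
    start_position => "beginning"
    sincedb_path => "NUL"
  }
}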
There might be a few reasons; maybe the data is not reaching ES at all. You can check that by verifying that the index exists, by running:
GET es-url:9200/_cat/indices/cars*
If the index exists, then you should be able to create the index pattern in Kibana.
If the index is missing, then either Logstash is not reading the input file or Elasticsearch is not reachable. You need to check the Logstash logs and make sure data reaches ES.
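For example, from a terminal this check could look like the following, assuming Elasticsearch is on localhost:9200 as in the question:
curl "http://localhost:9200/_cat/indices/cars*?v"
If a cars-* index shows up in the response, the data reached ES and the index pattern can be created in Kibana.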

Logstash not creating indexes in Windows 10

I have used the zip files to start Logstash, Kibana and Elasticsearch. I am ingesting a CSV file from Logstash into Elasticsearch.
input {
  file {
    path => "D:\tls202_part01\tls202_part01.csv"
    start_position => "beginning"
  }
}
filter {
  csv {
    separator => ","
    columns => ["appln_id", "appln_title_lg", "appln_title"]
  }
  mutate {
    convert => ["appln_id", "integer"]
    convert => ["appln_title_lg", "string"]
    convert => ["appln_title", "string"]
  }
}
output {
  elasticsearch {
    hosts => "localhost"
    index => "title"
  }
  stdout {
    codec => rubydebug
  }
}
This is my config file. When I search for the index title, it is not there, and the Logstash logs are these:
Sending Logstash logs to D:/logstash-6.5.4/logs which is now configured via log4j2.properties
[2018-12-26T10:22:35,672][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified
[2018-12-26T10:22:35,699][INFO ][logstash.runner ] Starting Logstash {"logstash.version"=>"6.5.4"}
[2018-12-26T10:22:41,588][INFO ][logstash.pipeline ] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>8, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50}
[2018-12-26T10:22:42,051][INFO ][logstash.outputs.elasticsearch] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://localhost:9200/]}}
[2018-12-26T10:22:42,297][WARN ][logstash.outputs.elasticsearch] Restored connection to ES instance {:url=>"http://localhost:9200/"}
[2018-12-26T10:22:42,370][INFO ][logstash.outputs.elasticsearch] ES Output version determined {:es_version=>6}
[2018-12-26T10:22:42,376][WARN ][logstash.outputs.elasticsearch] Detected a 6.x and above cluster: the `type` event field won't be used to determine the document _type {:es_version=>6}
[2018-12-26T10:22:42,417][INFO ][logstash.outputs.elasticsearch] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["//localhost"]}
[2018-12-26T10:22:42,439][INFO ][logstash.outputs.elasticsearch] Using mapping template from {:path=>nil}
[2018-12-26T10:22:42,473][INFO ][logstash.outputs.elasticsearch] Attempting to install template {:manage_template=>{"template"=>"logstash-*", "version"=>60001, "settings"=>{"index.refresh_interval"=>"5s"}, "mappings"=>{"_default_"=>{"dynamic_templates"=>[{"message_field"=>{"path_match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false}}}, {"string_fields"=>{"match"=>"*", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false, "fields"=>{"keyword"=>{"type"=>"keyword", "ignore_above"=>256}}}}}], "properties"=>{"@timestamp"=>{"type"=>"date"}, "@version"=>{"type"=>"keyword"}, "geoip"=>{"dynamic"=>true, "properties"=>{"ip"=>{"type"=>"ip"}, "location"=>{"type"=>"geo_point"}, "latitude"=>{"type"=>"half_float"}, "longitude"=>{"type"=>"half_float"}}}}}}}}
[2018-12-26T10:22:43,330][INFO ][logstash.inputs.file ] No sincedb_path set, generating one based on the "path" setting {:sincedb_path=>"D:/logstash-6.5.4/data/plugins/inputs/file/.sincedb_bb5ff7ebd070422c5b611ac87e9e7087", :path=>["D:\\tls202_part01\\tls202_part01.csv"]}
[2018-12-26T10:22:43,390][INFO ][logstash.pipeline ] Pipeline started successfully {:pipeline_id=>"main", :thread=>"#<Thread:0x389cc614 run>"}
[2018-12-26T10:22:43,499][INFO ][logstash.agent ] Pipelines running {:count=>1, :running_pipelines=>[:main], :non_running_pipelines=>[]}
[2018-12-26T10:22:43,532][INFO ][filewatch.observingtail ] START, creating Discoverer, Watch with file and sincedb collections
[2018-12-26T10:22:43,842][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600}
The CSV file is large: 2 GB of CSV data.
Also, Kibana is showing "no Elasticsearch data found" for creating indexes.
It seems that Logstash didn't find your file. Change your path from backslashes to forward slashes and see if it works:
path => "D:/tls202_part01/tls202_part01.csv"
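The adjusted input block would then look like this (a sketch; the sincedb_path => "NUL" line is an optional addition so that repeated test runs on Windows re-read the file from the beginning):
input {
  file {
    path => "D:/tls202_part01/tls202_part01.csv"
    start_position => "beginning"
    sincedb_path => "NUL"    # optional: discard sincedb state between runs
  }
}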

Running logstash via central NFS location

I have configured an ELK server, and all of its components are on the same server. However, I'm trying to pick up my logstash-syslog.conf from a central point on NFS so I don't need to install Logstash on each client.
1) My logstash-syslog.conf file:
input {
  file {
    path => [ "/var/log/messages" ]
    type => "test"
  }
}
filter {
  if [type] == "test" {
    grok {
      match => { "message" => "%{SYSLOGTIMESTAMP:syslog_timestamp} %{SYSLOGHOST:syslog_hostname} %{DATA:syslog_program}(?:\[%{POSINT:syslog_pid}\])?: %{GREEDYDATA:syslog_message}" }
      add_field => [ "received_at", "%{@timestamp}" ]
      add_field => [ "received_from", "%{host}" ]
    }
    syslog_pri { }
    date {
      match => [ "syslog_timestamp", "MMM d HH:mm:ss", "MMM dd HH:mm:ss" ]
    }
  }
}
output {
  if "automount" in [message] {
    elasticsearch {
      hosts => "oida-elk:9200"
      #index => "newmsglog-%{+YYYY.MM.dd}"
      index => "%{type}-%{+YYYY.MM.dd}"
      document_type => "msg"
    }
    stdout {}
  }
}
2) When I run this on the clients to get the data, it starts the thread and just gets stuck there:
[Myclient1 =~] # $ /home/data/logstash-6.0.0/bin/logstash -f /home/data/logstash-6.0.0/conf.d/ --path.data=/tmp/klm
3) After running the above command, it shows the logs below and then does not proceed any further:
[2018-03-05T21:20:51,014][WARN ][logstash.outputs.elasticsearch] Restored connection to ES instance {:url=>"http://oida-elk:9200/"}
[2018-03-05T21:20:51,078][INFO ][logstash.outputs.elasticsearch] Using mapping template from {:path=>nil}
[2018-03-05T21:20:51,085][INFO ][logstash.outputs.elasticsearch] Attempting to install template {:manage_template=>{"template"=>"logstash-*", "version"=>60001, "settings"=>{"index.refresh_interval"=>"5s"}, "mappings"=>{"_default_"=>{"dynamic_templates"=>[{"message_field"=>{"path_match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false}}}, {"string_fields"=>{"match"=>"*", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false, "fields"=>{"keyword"=>{"type"=>"keyword", "ignore_above"=>256}}}}}], "properties"=>{"@timestamp"=>{"type"=>"date"}, "@version"=>{"type"=>"keyword"}, "geoip"=>{"dynamic"=>true, "properties"=>{"ip"=>{"type"=>"ip"}, "location"=>{"type"=>"geo_point"}, "latitude"=>{"type"=>"half_float"}, "longitude"=>{"type"=>"half_float"}}}}}}}}
[2018-03-05T21:20:51,101][INFO ][logstash.outputs.elasticsearch] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["//oida-elk:9200"]}
[2018-03-05T21:20:51,297][INFO ][logstash.pipeline ] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>1, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>5, "pipeline.max_inflight"=>125, :thread=>"#<Thread:0x2ea3b180#/home/data/logstash-6.0.0/logstash-core/lib/logstash/pipeline.rb:290 run>"}
[2018-03-05T21:20:51,746][INFO ][logstash.pipeline ] Pipeline started {"pipeline.id"=>"main"}
[2018-03-05T21:20:51,800][INFO ][logstash.agent ] Pipelines running {:count=>1, :pipelines=>["main"]}
Please help or suggest anything.
Can you run the Logstash config validation using the -t flag? Also, you have to pass the full path to the file when using the -f flag.
Can you tail the file to see if it has newer entries coming in?
I would like to add that for reading files into Logstash, Filebeat is a better choice.
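For example, the validation run and a check for new entries could look like this (a sketch, assuming the paths from the question and a config file named logstash-syslog.conf inside conf.d):
/home/data/logstash-6.0.0/bin/logstash -t -f /home/data/logstash-6.0.0/conf.d/logstash-syslog.conf
tail -f /var/log/messages
The -t flag (--config.test_and_exit) only parses and validates the configuration, so it reports syntax errors without starting the pipeline.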

How to create Index and Mapping into ES from LOGSTASH

I've been following this tutorial for importing data from a DB into Logstash and creating an index and mapping in Elasticsearch:
INSERT INTO LOGSTASH SELECT DATA FROM DATABASE
This is my output based on my configuration file:
[2017-10-12T11:50:45,807][INFO ][logstash.modules.scaffold] Initializing module {:module_name=>"fb_apache", :directory=>"C:/Users/Bruno/Downloads/logstash-5.6.2/logstash-5.6.2/modules/fb_apache/configuration"}
[2017-10-12T11:50:45,812][INFO ][logstash.modules.scaffold] Initializing module {:module_name=>"netflow", :directory=>"C:/Users/Bruno/Downloads/logstash-5.6.2/logstash-5.6.2/modules/netflow/configuration"}
[2017-10-12T11:50:46,518][INFO ][logstash.outputs.elasticsearch] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://localhost:9200/]}}
[2017-10-12T11:50:46,521][INFO ][logstash.outputs.elasticsearch] Running health check to see if an Elasticsearch connection is working {:healthcheck_url=>http://localhost:9200/, :path=>"/"}
[2017-10-12T11:50:46,652][WARN ][logstash.outputs.elasticsearch] Restored connection to ES instance {:url=>"http://localhost:9200/"}
[2017-10-12T11:50:46,654][INFO ][logstash.outputs.elasticsearch] Using mapping template from {:path=>nil}
[2017-10-12T11:50:46,716][INFO ][logstash.outputs.elasticsearch] Attempting to install template {:manage_template=>{"template"=>"logstash-*", "version"=>50001, "settings"=>{"index.refresh_interval"=>"5s"}, "mappings"=>{"_default_"=>{"_all"=>{"enabled"=>true, "norms"=>false}, "dynamic_templates"=>[{"message_field"=>{"path_match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false}}}, {"string_fields"=>{"match"=>"*", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false, "fields"=>{"keyword"=>{"type"=>"keyword", "ignore_above"=>256}}}}}], "properties"=>{"@timestamp"=>{"type"=>"date", "include_in_all"=>false}, "@version"=>{"type"=>"keyword", "include_in_all"=>false}, "geoip"=>{"dynamic"=>true, "properties"=>{"ip"=>{"type"=>"ip"}, "location"=>{"type"=>"geo_point"}, "latitude"=>{"type"=>"half_float"}, "longitude"=>{"type"=>"half_float"}}}}}}}}
[2017-10-12T11:50:46,734][INFO ][logstash.outputs.elasticsearch] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["//localhost:9200"]}
[2017-10-12T11:50:46,749][INFO ][logstash.pipeline ] Starting pipeline {"id"=>"main", "pipeline.workers"=>4, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>5, "pipeline.max_inflight"=>500}
[2017-10-12T11:50:47,053][INFO ][logstash.pipeline ] Pipeline main started
[2017-10-12T11:50:47,196][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600}
[2017-10-12T11:50:47,817][INFO ][logstash.inputs.jdbc ] (0.130000s) SELECT * from EP_RDA_STRING
[2017-10-12T11:50:53,095][WARN ][logstash.agent ] stopping pipeline {:id=>"main"}
Everything seems OK, at least I think so, except that when I query the ES server to list the indices and mappings, the result is empty.
http://localhost:9200/_all/_mapping
{}
http://localhost:9200/_cat/indices?v
health status index uuid pri rep docs.count docs.deleted store.size pri.store.size
This is my config file:
input {
  jdbc {
    # sqlserver jdbc connection string to our database, mydb
    jdbc_connection_string => "jdbc:sqlserver://localhost:1433;databaseName=RDA; integratedSecurity=true;"
    # The user we wish to execute our statement as
    jdbc_user => ""
    # The path to our downloaded jdbc driver
    jdbc_driver_library => "C:\mypath\sqljdbc_6.2\enu\mssql-jdbc-6.2.1.jre8.jar"
    # The name of the driver class for SQL Server
    jdbc_driver_class => "com.microsoft.sqlserver.jdbc.SQLServerDriver"
    # our query
    statement => "SELECT * from EP_RDA_STRING"
  }
}
output {
  elasticsearch {
    index => "RDA"
    document_type => "RDA_string_view"
    document_id => "%{ndb_no}"
    hosts => "localhost:9200"
  }
}
Which version of Logstash are you using? What command are you using to start Logstash? Make sure that the input and output blocks resemble the ones given below:
input {
  beats {
    port => "29600"
    type => "weblogic-server"
  }
}
filter {
}
output {
  elasticsearch {
    hosts => ["127.0.0.1:9200"]
    index => "logstash-%{+YYYY.MM.dd}"
  }
  stdout { codec => rubydebug }
}
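After restarting Logstash with a config shaped like this, you can verify from a terminal whether the index was actually created (assuming the same local ES instance as in the question):
curl "http://localhost:9200/_cat/indices?v"
If nothing matching your index name appears in the response, the events never reached Elasticsearch, and the Logstash logs are the next place to look.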