Logstash not importing data - elasticsearch

I am working on an ELK stack setup. I want to import data from a CSV file on my PC to Elasticsearch via Logstash. Elasticsearch and Kibana are working properly.
Here is my logstash.conf file:
input {
  file {
    path => "C:/Users/aron/Desktop/es/archive/weapons.csv"
    start_position => "beginning"
    sincedb_path => "NUL"
  }
}
filter {
  csv {
    separator => ","
    columns => ["name", "type", "country"]
  }
}
output {
  elasticsearch {
    hosts => ["http://localhost:9200/"]
    index => "weapons"
    document_type => "ww2_weapon"
  }
  stdout {}
}
And a sample row from my .csv file looks like this:
Name              | Type      | Country
10.5 cm Kanone 17 | Field Gun | Germany
German characters are all showing up.
I am running logstash via: logstash.bat -f path/to/logstash.conf
It starts working, but it freezes and becomes unresponsive along the way; here is a screenshot of stdout.
In Kibana, it created the index and imported 2 documents, but the data is all messed up. What am I doing wrong?

If your task is only to import that CSV, you are better off using the file upload in Kibana.
It should be available under the following link (for Kibana > v8):
<your Kibana domain>/app/home#/tutorial_directory/fileDataViz
Logstash is used if you want to do this job on a regular basis with new files coming in over time.

You can try the configuration below. It runs perfectly on my machine.
input {
  file {
    path => "path/filename.csv"
    start_position => "beginning"
    sincedb_path => "NULL"
  }
}
filter {
  csv {
    separator => ","
    columns => ["field1","field2",...]
  }
}
output {
  stdout { codec => rubydebug }
  elasticsearch {
    hosts => "https://localhost:9200"
    user => "username"        # if any
    password => "password"    # if any
    index => "indexname"
    document_type => "doc_type"
  }
}
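One note on sincedb_path: to make Logstash re-read the file from the beginning on every run, you would normally point it at the platform's null device rather than a file literally named "NULL" (Logstash will otherwise create that file and track its position in it). A minimal sketch of the input block for both platforms:
input {
  file {
    path => "path/filename.csv"
    start_position => "beginning"
    sincedb_path => "NUL"            # Windows null device
    # sincedb_path => "/dev/null"    # Linux/macOS null device
  }
}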

Related

Elasticsearch not receiving input from logstash

I'm running logstash where the output is set to elasticsearch on my localhost. However, when I open up elasticsearch, it appears that it did not receive any data from logstash. Logstash parses the csv file correctly, as I can see by the output in the terminal.
I've tried modifying the conf file, but the problem remains. The conf file is below
input {
  file {
    path => "/Users/kevinliu/Desktop/logstash_tutorial/gg.csv"
    start_position => "beginning"
    sincedb_path => "/dev/null"
  }
}
filter {
  csv {
    separator => ","
    columns => ["name","price","unit","url"]
  }
}
output {
  elasticsearch {
    hosts => "localhost"
    index => "gg-prices"
  }
  stdout {}
}
When I access localhost:9200/ I just see the default "You Know, for Search" message from Elasticsearch.

Logstash multiple logs

I am following an online tutorial and have been provided with a cars.csv file and the following Logstash config file. My logstash is running perfectly well and is indexing the CSV as we speak.
The question is, I have another log file (entirely different data) which I need to parse and index into a different index.
1. How do I add this configuration without restarting Logstash?
2. If the above isn't possible and I edit the config file then restart Logstash, it won't reindex the entire cars file, will it?
3. If I do 2, how do I format the config for multiple styles of log file?
For example, my new log file looks like this:
01-01-2017 ORDER FAILED: £12.11 Somewhere : Fraud
Existing Config File:
input {
  file {
    path => "/opt/cars.csv"
    start_position => "beginning"
    sincedb_path => "/dev/null"
  }
}
filter {
  csv {
    separator => ","
    columns =>
    [
      "maker",
      "model",
      "mileage",
      "manufacture_year",
      "engine_displacement",
      "engine_power",
      "body_type",
      "color_slug",
      "stk_year",
      "transmission",
      "door_count",
      "seat_count",
      "fuel_type",
      "date_last_seen",
      "date_created",
      "price_eur"
    ]
  }
  mutate {
    convert => ["mileage", "integer"]
  }
  mutate {
    convert => ["price_eur", "float"]
  }
  mutate {
    convert => ["engine_power", "integer"]
  }
  mutate {
    convert => ["door_count", "integer"]
  }
  mutate {
    convert => ["seat_count", "integer"]
  }
}
output {
  elasticsearch {
    hosts => "localhost"
    index => "cars"
    document_type => "sold_cars"
  }
  stdout {}
}
Config file for orders.log
input {
  file {
    path => "/opt/logs/orders.log"
    start_position => "beginning"
    sincedb_path => "/dev/null"
  }
}
filter {
  grok {
    match => { "message" => "(?<date>[0-9-]+) (?<order_status>ORDER [a-zA-Z]+): (?<order_amount>£[0-9.]+) (?<order_location>[a-zA-Z]+)( : (?<order_failure_reason>[A-Za-z ]+))?" }
  }
  mutate {
    convert => ["order_amount", "float"]
  }
}
output {
  elasticsearch {
    hosts => "localhost"
    index => "sales"
    document_type => "order"
  }
  stdout {}
}
Disclaimer: I'm a complete newbie. Second day using ELK.
For point 1, you can either set the following in your logstash.yml file:
config.reload.automatic: true
Or, when running Logstash with a conf file, start it like this:
bin/logstash -f conf-file-name.conf --config.reload.automatic
With either of these settings in place, you can start Logstash and from then on any change you make to the conf file will be picked up automatically.
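For reference, the corresponding logstash.yml entries would look like this; config.reload.interval controls how often the config is checked, and 3s (the default) is shown only as an illustration:
config.reload.automatic: true
config.reload.interval: 3s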
2. If the above isn't possible and I edit the config file then restart logstash - it won't reindex the entire cars file will it?
If you use sincedb_path => "/dev/null", Logstash won't remember where it has stopped reading a file and will reindex it at each restart. You'll have to remove this line if you wish for Logstash to remember (see here).
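For example, pointing sincedb_path at a real, writable file (the path below is only an illustration) lets Logstash persist its read position across restarts:
input {
  file {
    path => "/opt/cars.csv"
    start_position => "beginning"
    # store the read position here instead of discarding it
    sincedb_path => "/var/lib/logstash/sincedb_cars"
  }
}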
3. How do I format the config for multiple styles of log file?
To support multiple styles of log file, you can put tags on the file inputs (see https://www.elastic.co/guide/en/logstash/5.5/plugins-inputs-file.html#plugins-inputs-file-tags) and then use conditionals (see https://www.elastic.co/guide/en/logstash/5.5/event-dependent-configuration.html#conditionals) in your config.
Like this:
file {
  path => "/opt/cars.csv"
  start_position => "beginning"
  sincedb_path => "/dev/null"
  tags => [ "csv" ]
}
file {
  path => "/opt/logs/orders.log"
  start_position => "beginning"
  sincedb_path => "/dev/null"
  tags => [ "log" ]
}
if "csv" in [tags] {
  ...
} else if "log" in [tags] {
  ...
}
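Putting it together, a single pipeline handling both files could look roughly like this. This is only a sketch assembled from the configs above; the csv column list is abbreviated and should be the full list from the cars config:
input {
  file {
    path => "/opt/cars.csv"
    start_position => "beginning"
    sincedb_path => "/dev/null"
    tags => [ "csv" ]
  }
  file {
    path => "/opt/logs/orders.log"
    start_position => "beginning"
    sincedb_path => "/dev/null"
    tags => [ "log" ]
  }
}
filter {
  if "csv" in [tags] {
    csv {
      separator => ","
      # full column list from the cars.csv config goes here
      columns => [ "maker", "model", "mileage", "price_eur" ]
    }
    mutate { convert => ["mileage", "integer"] }
    mutate { convert => ["price_eur", "float"] }
  } else if "log" in [tags] {
    grok {
      # same pattern as in the orders.log config above
      match => { "message" => "(?<date>[0-9-]+) (?<order_status>ORDER [a-zA-Z]+): (?<order_amount>£[0-9.]+) (?<order_location>[a-zA-Z]+)( : (?<order_failure_reason>[A-Za-z ]+))?" }
    }
    mutate { convert => ["order_amount", "float"] }
  }
}
output {
  if "csv" in [tags] {
    elasticsearch {
      hosts => "localhost"
      index => "cars"
    }
  } else if "log" in [tags] {
    elasticsearch {
      hosts => "localhost"
      index => "sales"
    }
  }
  stdout {}
}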

Importing csv file into elastic search

I am trying to import a huge CSV file into Elasticsearch, using Logstash for the job.
Sample csv file [Note: multiple values]:
Shop_name,Review_Title,Review_Text,,,,
Accord ,Excellent ,Nice Collection.,,,,,
Accord , Bad ,Not too comfortable,,,
Accord , Good ,excellent location and Staff,,,
Accord , Good ,Great Colletion,,,
Shopon,good, staff very good ,,,
Harrisons ,Spacious,Nice Colletion
Logstash configuration:
input {
  file {
    path => ["shopreview.csv"]
    start_position => "beginning"
  }
}
filter {
  csv {
    columns => [
      "Shop_name",
      "Review_Title",
      "Review_Text"
    ]
  }
}
output {
  stdout { codec => rubydebug }
  elasticsearch {
    action => "index"
    hosts => ["127.0.0.1:9200"]
    index => "reviews"
    document_type => "shopreview"
    document_id => "%{Shop_name}"
    workers => 1
  }
}
Here, when I query for a review I should get all the reviews for the particular shop.
Issue
When I query localhost:9020/review/shopreview/Accord I am not getting all values, only 1 entry. Is the config missing something? I am a little new to the ELK stack.

Multiple logs in single config file to elasticsearch

I want to send logs from different locations to Elasticsearch using a Logstash conf file.
input {
  file {
    path => "C:/Users/611166850/Desktop/logstash-5.0.2/logstash-5.0.2/logs/logs/ce.log"
    type => "CE"
    start_position => "beginning"
  }
  file {
    path => "C:/Users/611166850/Desktop/logstash-5.0.2/logstash-5.0.2/logs/logs/spovp.log"
    type => "SP"
    start_position => "beginning"
  }
  file {
    path => "C:/Users/611166850/Desktop/logstash-5.0.2/logstash-5.0.2/logs/logs/ovpportal_log"
    type => "OVP"
    start_position => "beginning"
  }
}
output {
  elasticsearch {
    action => "index"
    hosts => "localhost:9200"
    index => "multiple"
    codec => json
    workers => 1
  }
}
This is the config file I use, but Kibana is not recognising this index. Can someone help with this?
Thanks in advance, Rashmi
Check Logstash's log file for errors (maybe your config file is not correct).
Also search ES directly for the index in question; maybe the problem is not Kibana and you don't have any index with this name.
Try starting Logstash in debug mode to see if there are any logs in it.
You can also try writing the Logstash output to a file on the local system rather than sending it directly to Elasticsearch. Uncomment a block as per your requirement:
# block-1
# if "_grokparsefailure" not in [tags] {
#   stdout {
#     codec => rubydebug { metadata => true }
#   }
# }

# block-2
# if "_grokparsefailure" not in [tags] {
#   file {
#     path => "/tmp/out-try1.logstash"
#   }
# }
With either of these methods you can get the output to the console or to a file. Comment out the _grokparsefailure part in case you don't see any output in the file.
Note: in Kibana, default indices have @timestamp in their fields, so check:
1. whether Kibana is able to recognize the index if you uncheck the checkbox on the page where you create the new index
2. whether your logs are properly parsed; if not, you need to build grok filters with patterns matching your logs
All Elasticsearch indices are visible at http://elasticsearch-ip:9200/_cat/indices?v (your Elasticsearch IP), so try that too. Share what you find.

elasticsearch is not indexing from logstash.conf

I'm new to Elasticsearch. This is my logstash.conf file:
input {
  #stdin {}
  file {
    path => "/demo_logs/2015-12-14.txt"
    start_position => "beginning"
  }
}
filter {
  csv {
    columns => ["data_date", "ip", "method", "status", "time"]
    separator => ","
  }
}
output {
  elasticsearch {
    action => "index"
    hosts => "localhost"
    index => "logstash-%{+YYYY.MM.dd}"
    workers => 1
  }
  stdout { codec => rubydebug }
}
I have triggered the conf file using bin/logstash -f logstash.conf
Logstash started, but there was no sign of it opening the file or indexing anything into Elasticsearch.
So I just uncommented the stdin {} to take input from the terminal instead, as below:
input {
  stdin {}
  #file {
  #  path => "/demo_logs/2015-12-14.txt"
  #  start_position => "beginning"
  #}
}
Again I ran the conf file and entered this value:
2015-12-14 07:29:24.356302,127.0.0.1,get_names,exit,0:00:00.298635
It shows an error:
Trouble parsing csv {:field=>"message", :source=>"", :exception=>#<NoMethodError: undefined method `each_index' for nil:NilClass>, :level=>:warn}
Can anybody help with these two things, so that I can successfully run the logstash.conf file on a .txt file and index all the values in that .txt file?
I also tried with a grok filter but could not make it work; a grok-based solution would also be great for me.
Thanks
This could mainly be due to the fact that your data contains blank lines. If it's a log file, you can open it in Notepad++ and go to Edit -> Line Operations -> Remove Blank Lines. Hope this helps.
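Alternatively, if you would rather not edit the file by hand, blank lines can be skipped inside the pipeline itself. This is just a sketch of that approach (not part of the original answer), dropping whitespace-only events before the csv filter runs:
filter {
  # drop events whose message is empty or whitespace-only
  if [message] =~ /^\s*$/ {
    drop { }
  }
  csv {
    columns => ["data_date", "ip", "method", "status", "time"]
    separator => ","
  }
}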