I want to send logs from different locations to Elasticsearch using a Logstash conf file.
input {
  file {
    path => "C:/Users/611166850/Desktop/logstash-5.0.2/logstash-5.0.2/logs/logs/ce.log"
    type => "CE"
    start_position => "beginning"
  }
  file {
    path => "C:/Users/611166850/Desktop/logstash-5.0.2/logstash-5.0.2/logs/logs/spovp.log"
    type => "SP"
    start_position => "beginning"
  }
  file {
    path => "C:/Users/611166850/Desktop/logstash-5.0.2/logstash-5.0.2/logs/logs/ovpportal_log"
    type => "OVP"
    start_position => "beginning"
  }
}
output {
  elasticsearch {
    action => "index"
    hosts => "localhost:9200"
    index => "multiple"
    codec => json
    workers => 1
  }
}
This is the config file I use, but Kibana is not recognising this index. Can someone help with this?
Thanks in advance, Rashmi
Check Logstash's log file for errors (maybe your config file is not correct).
Also query Elasticsearch directly for the index you expect; maybe the problem is not Kibana and there simply is no index with that name.
Try starting Logstash in debug mode to see if there are any errors in its logs.
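For example (a sketch only; adjust the config file name to yours), Logstash 5.x accepts a log-level flag on the command line:
logstash.bat -f your-config.conf --log.level=debug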
You can also try writing the Logstash output to a file on the local system rather than sending it directly to Elasticsearch. Uncomment one of the blocks below as per your requirement:
# block-1
# if "_grokparsefailure" not in [tags] {
#   stdout {
#     codec => rubydebug { metadata => true }
#   }
# }
# block-2
# if "_grokparsefailure" not in [tags] {
#   file {
#     path => "/tmp/out-try1.logstash"
#   }
# }
With either of these methods you can get the output to the console or to a file. Comment out the _grokparsefailure conditional in case you don't see any output in the file.
Note: in Kibana, default index patterns have @timestamp in their fields, so check:
1. whether Kibana is able to recognize the index if you uncheck the time-based-events checkbox on the page where you create a new index pattern
2. whether your logs are properly parsed; if not, you need to work out grok filters with patterns matching your logs (see the sketch below)
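A minimal grok filter could look like the following sketch; the pattern is purely illustrative and has to be adapted to the actual format of the ce/spovp/ovpportal logs:
filter {
  grok {
    # illustrative pattern only; replace it with one that matches your log lines
    match => { "message" => "%{TIMESTAMP_ISO8601:timestamp} %{LOGLEVEL:level} %{GREEDYDATA:msg}" }
  }
}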
All Elasticsearch indices are visible at http://elasticsearch-ip:9200/_cat/indices?v (substitute your Elasticsearch IP), so try that too and share what you find.
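For example, from the machine running Elasticsearch (assuming the localhost:9200 host used in the config above):
curl "http://localhost:9200/_cat/indices?v"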
Related
I am working on an ELK stack setup. I want to import data from a CSV file on my PC to Elasticsearch via Logstash. Elasticsearch and Kibana are working properly.
Here is my logstash.conf file:
input {
  file {
    path => "C:/Users/aron/Desktop/es/archive/weapons.csv"
    start_position => "beginning"
    sincedb_path => "NUL"
  }
}
filter {
  csv {
    separator => ","
    columns => ["name", "type", "country"]
  }
}
output {
  elasticsearch {
    hosts => ["http://localhost:9200/"]
    index => "weapons"
    document_type => "ww2_weapon"
  }
  stdout {}
}
And a sample row of data from my .csv file looks like this:
Name | Type | Country
10.5 cm Kanone 17 | Field Gun | Germany
German characters are all showing up.
I am running logstash via: logstash.bat -f path/to/logstash.conf
It starts working, but it freezes and becomes unresponsive along the way; here is a screenshot of stdout.
In Kibana, it created the index and imported 2 documents, but the data is all messed up. What am I doing wrong?
If your task is only to import that CSV, you are better off using the file upload in Kibana.
It should be available under the following link (for Kibana v8 and later):
<your Kibana domain>/app/home#/tutorial_directory/fileDataViz
Logstash is used if you want to do this job on a regular basis with new files coming in over time.
You can try the configuration below; it runs perfectly on my machine.
input {
  file {
    path => "path/filename.csv"
    start_position => "beginning"
    sincedb_path => "NUL"    # "NUL" on Windows, "/dev/null" on Linux/macOS
  }
}
filter {
  csv {
    separator => ","
    columns => ["field1","field2",...]
  }
}
output {
  stdout { codec => rubydebug }
  elasticsearch {
    hosts => "https://localhost:9200"
    user => "username"          # if any
    password => "password"      # if any
    index => "indexname"
    document_type => "doc_type"
  }
}
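After Logstash has run, one way to verify that documents actually arrived (assuming the same host and the placeholder index name used above; add -u username:password if security is enabled):
curl "https://localhost:9200/indexname/_count?pretty"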
I'm running Logstash with the output set to Elasticsearch on my localhost. However, when I open up Elasticsearch, it appears that it did not receive any data from Logstash. Logstash parses the CSV file correctly, as I can see from the output in the terminal.
I've tried modifying the conf file, but the problem remains. The conf file is below:
input {
  file {
    path => "/Users/kevinliu/Desktop/logstash_tutorial/gg.csv"
    start_position => "beginning"
    sincedb_path => "/dev/null"
  }
}
filter {
  csv {
    separator => ","
    columns => ["name","price","unit","url"]
  }
}
output {
  elasticsearch {
    hosts => "localhost"
    index => "gg-prices"
  }
  stdout {}
}
When I access localhost:9200/ I just see the default "You Know, for Search" message from Elasticsearch.
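One quick check (a sketch, assuming the host and index name from the config above) is to ask Elasticsearch directly how many documents the index holds, since the root URL only returns cluster information:
curl "http://localhost:9200/gg-prices/_count?pretty"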
I am trying to use Logstash 5.5 to analyze archived (.gz) files generated every minute. Each .gz file contains a CSV file. My .conf file looks like this:
input {
  file {
    type => "gzip"
    path => [ "C:\data*.gz" ]
    start_position => "beginning"
    sincedb_path => "gzip"
    codec => gzip_lines
  }
}
filter {
  csv {
    separator => ","
    columns => ["COL1","COL2","COL3","COL4","COL5","COL6","COL7"]
  }
}
output {
  elasticsearch {
    hosts => "localhost:9200"
    index => "mydata"
    document_type => "zipdata"
  }
  stdout {}
}
Initially I was getting an error about the missing gzip_lines plugin, so I installed it. After installing the plugin, Logstash says "Successfully started Logstash API endpoint", but nothing gets indexed. The Logstash logs show no sign of data being indexed into Elasticsearch, and when I look for the index in Kibana, it is not available there. It means that Logstash is not putting data into Elasticsearch.
Maybe I am using the wrong configuration. Please suggest what the correct way of doing this is.
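A sketch of how the input block is more usually written (an assumption, not a confirmed fix): the Logstash file input expects forward slashes even on Windows, and curly quotes around the path and column names break config parsing, so straight quotes are needed; the path below assumes the archives live under C:/data. The gzip_lines codec still has to be installed separately, as the poster already did.
input {
  file {
    path => [ "C:/data/*.gz" ]      # forward slashes and straight quotes
    start_position => "beginning"
    sincedb_path => "NUL"           # on Windows, discard sincedb state
    codec => gzip_lines
  }
}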
I'm new to Elasticsearch. This is my logstash.conf file:
input {
  #stdin {}
  file {
    path => "/demo_logs/2015-12-14.txt"
    start_position => "beginning"
  }
}
filter {
  csv {
    columns => ["data_date", "ip", "method", "status", "time"]
    separator => ","
  }
}
output {
  elasticsearch {
    action => "index"
    hosts => "localhost"
    index => "logstash-%{+YYYY.MM.dd}"
    workers => 1
  }
  stdout { codec => rubydebug }
}
I started Logstash with the conf file using bin/logstash -f logstash.conf.
Logstash started, but it never opened the file or indexed anything into Elasticsearch, so I uncommented the stdin {} input to provide input from the terminal, as below:
input {
  stdin {}
  #file {
  #  path => "/demo_logs/2015-12-14.txt"
  #  start_position => "beginning"
  #}
}
I ran the conf file again and entered this value:
2015-12-14 07:29:24.356302,127.0.0.1,get_names,exit,0:00:00.298635
It shows this error:
Trouble parsing csv {:field=>"message", :source=>"", :exception=>#<NoMethodError: undefined method `each_index' for nil:NilClass>, :level=>:warn}
Can anybody help with these two things: successfully executing the logstash.conf file against a .txt file, and indexing all the values in that .txt file?
I also tried with a grok filter but could not make it work; a grok-based solution would be equally fine for me.
Thanks
This is most likely due to the fact that your data contains blank lines. If it's a log file, you can open it in Notepad++ and go to Edit -> Line Operations -> Remove Empty Lines. Hope this helps.
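Alternatively, an often-suggested way to handle this inside Logstash itself (a sketch, not part of the original answer) is to drop events whose message is empty before the csv filter runs:
filter {
  # skip events whose message is empty or whitespace-only
  if [message] =~ /^\s*$/ {
    drop { }
  }
  csv {
    columns => ["data_date", "ip", "method", "status", "time"]
    separator => ","
  }
}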
I am getting started with Logstash and Elasticsearch, and I would like to index .pdf or .doc files in Elasticsearch via Logstash.
I configured Logstash with the multiline codec to get each file into a single message in Elasticsearch. Below is my configuration file:
input {
  file {
    path => "D:/BaseCV/*"
    codec => multiline {
      # Grok pattern names are valid! :)
      pattern => ""
      what => "previous"
    }
  }
}
output {
  stdout {
    codec => "rubydebug"
  }
  elasticsearch {
    hosts => "localhost"
    index => "cvindex"
    document_type => "file"
  }
}
When Logstash starts, the first file I add arrives in Elasticsearch as a single message, but the following files are spread over several messages. I would like the correspondence to be: 1 file = 1 message.
Is this possible? What should I change in my setup to solve the problem?
Thank you for your feedback.
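One commonly suggested approach (a sketch under assumptions, not a confirmed fix for this setup) is to use a pattern that never matches together with negate and an auto_flush_interval, so that all lines of a file are merged into one event and flushed after a short idle period:
input {
  file {
    path => "D:/BaseCV/*"
    codec => multiline {
      pattern => "^__NEVER_MATCHES__"   # hypothetical marker that no real line starts with
      negate => true                    # every line is therefore appended to the previous one
      what => "previous"
      auto_flush_interval => 2          # flush the accumulated event after 2 seconds of inactivity
    }
  }
}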