Logstash setup LoadError - Ruby

I'm trying to set up Logstash and I'm following this tutorial exactly. But when I run the command
bin/logstash -e 'input { stdin { } } output { stdout {} }'
it gives me the following error:
warning: --1.9 ignored
LoadError: no such file to load -- bundler
require at org/jruby/RubyKernel.java:940
require at C:/jruby-9.0.0.0/lib/ruby/stdlib/rubygems/core_ext/kernel_require.rb:54
setup! at C:/Users/ryan.dai/Desktop/logstash-1.5.3/lib/bootstrap/bundler.rb:43
<top> at c:/Users/ryan.dai/Desktop/logstash-1.5.3/lib/bootstrap/environment.rb:46
I tried jruby -S gem install bundler as someone else suggested, but it doesn't work. I'm totally new to Ruby; what is happening and what should I do?
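A minimal way to check whether bundler is visible to the same JRuby that Logstash invokes (assuming the jruby on your PATH is the C:/jruby-9.0.0.0 one from the stack trace):

jruby -S gem list bundler     # is bundler installed for this JRuby?
jruby -S gem env gemdir       # where this JRuby installs and looks for gems
jruby -S gem install bundler  # install it for this JRuby if it is missing

If bundler is listed here but Logstash still raises the LoadError, the gem path used by Logstash's bootstrap likely differs from the one gem installs into.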

You can follow the URL below for installing the entire ELK setup.
Here you need to pass the log file as the path in the input section of the Logstash configuration:
input {
  file {
    path => "/tmp/access_log"
    start_position => "beginning"
  }
}
filter {
  grok {
    match => { "message" => "%{COMBINEDAPACHELOG}" }
  }
  date {
    match => [ "timestamp" , "dd/MMM/yyyy:HH:mm:ss Z" ]
  }
}
output {
  elasticsearch { host => localhost }
  stdout { codec => rubydebug }
}
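Note that host => localhost is the 1.x-era syntax shown above; from Logstash 2.0 onward the elasticsearch output takes a hosts array instead, so on newer versions the output block would be:

output {
  elasticsearch { hosts => ["localhost:9200"] }
  stdout { codec => rubydebug }
}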
ELK Setup Installation
Commands for running from the CMD prompt:
logstash -f logstash.conf (run Logstash)
logstash --configtest -f logstash.conf (test the configuration)
logstash --debug -f logstash.conf (debug the Logstash configuration)
Logstash Configuration Examples

Related

Error while connecting Logstash and Elasticsearch

I am very, very new to ELK. I installed ELK version 5.6.12 on a CentOS server. Elasticsearch and Kibana work fine, but I cannot connect Logstash to Elasticsearch.
I have set environment variables as:
export JAVA_HOME=/usr/local/jdk1.8.0_131
export PATH=/usr/local/jdk1.8.0_131/bin:$PATH
I run a simple test:
bin/logstash -e 'input { stdin { } } output { elasticsearch { host => localhost:9200 protocol => "http" port => "9200" } }'
I get error :
WARNING: Could not find logstash.yml which is typically located in $LS_HOME/config or /etc/logstash. You can specify the path using --path.settings. Continuing using the defaults
Could not find log4j2 configuration at path /etc/logstash/logstash.yml/log4j2.properties. Using default config which logs errors to the console
Simple "slash" mentioned in official documentation of Logstash works like following :
$bin/logstash -e 'input { stdin { } } output { stdout {} }'
Hello
WARNING: Could not find logstash.yml which is typically located in $LS_HOME/config or /etc/logstash. You can specify the path using --path.settings. Continuing using the defaults
Could not find log4j2 configuration at path /usr/share/logstash/config/log4j2.properties. Using default config which logs errors to the console
The stdin plugin is now waiting for input:
{
      "@version" => "1",
          "host" => "localhost",
    "@timestamp" => 2018-11-01T04:44:58.648Z,
       "message" => "Hello"
}
What could be the problem?
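As a side note, the error path /etc/logstash/logstash.yml/log4j2.properties suggests --path.settings was pointed at the logstash.yml file itself; it should point at the directory containing it. Also, the Logstash 5.x elasticsearch output no longer accepts host or protocol. A sketch of the equivalent invocation (untested, assuming the default /etc/logstash settings directory):

bin/logstash --path.settings /etc/logstash -e 'input { stdin { } } output { elasticsearch { hosts => ["localhost:9200"] } }'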

Logstash codec rubydebug is not working

I installed the latest version of Logstash on a Windows machine and tried to execute the configuration below:
"
input{
stdin{}
}
output
{
stdout {codec => rubydebug}
}"
The output data is not shown in the expected format. Do I need to install a rubydebug plugin? How can I install it on Windows, and what is the command to execute?
Create the file logstash-simple.conf in the Logstash directory:
input { stdin { } }
output {
  elasticsearch { hosts => ["localhost:9200"] }
  stdout { codec => rubydebug }
}
then run the following command from Windows (my case):
bin\logstash.bat -f logstash-simple.conf
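If the codec is working, there is nothing extra to install: rubydebug ships with Logstash. Each line you type should come back pretty-printed as a Ruby-style hash, roughly like this (host and timestamp will differ on your machine):

{
      "@version" => "1",
          "host" => "mymachine",
    "@timestamp" => 2018-01-01T00:00:00.000Z,
       "message" => "hello world"
}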

Logfile won't appear in Elasticsearch

I'm very new to Logstash and Elasticsearch. I am trying to stash my first log in Logstash in a way that I can (correct me if that is not the purpose) search it using Elasticsearch.
I have a log that looks like this basically:
2016-12-18 10:16:55,404 - INFO - flowManager.py - loading metadata xml
So, I have created a config file test.conf that looks like this:
input {
  file {
    path => "/home/usr/tmp/logs/mylog.log"
    type => "test-type"
    id => "NEWTRY"
  }
}
filter {
  grok {
    match => { "message" => "%{YEAR:year}-%{MONTHNUM:month}-%{MONTHDAY:day} %{HOUR:hour}:%{MINUTE:minute}:%{SECOND:second} - %{LOGLEVEL:level} - %{WORD:scriptName}.%{WORD:scriptEND} - " }
  }
}
output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "ecommerce"
    codec => line { format => "%{year}-%{month}-%{day} %{hour}:%{minute}:%{second} - %{level} - %{scriptName}.%{scriptEND} - \"%{message}\"" }
  }
}
And then: ./bin/logstash -f test.conf
I do not see the log in Elasticsearch when I go to http://localhost:9200/ecommerce or to http://localhost:9200/ecommerce/test-type/NEWTRY.
Please tell me what I am doing wrong... :\
Thanks,
Heather
I found a solution eventually:
I added both sincedb_path => "/dev/null" (which, from what I understood, is for testing environments only) and start_position => "beginning" to the file input plugin, and the file appeared in both Elasticsearch and Kibana.
Thanks anyway for responding and trying to help!
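For later readers, a sketch of the corrected file input (the question's config plus the two options mentioned above):

input {
  file {
    path => "/home/usr/tmp/logs/mylog.log"
    type => "test-type"
    id => "NEWTRY"
    start_position => "beginning"
    sincedb_path => "/dev/null"  # forget read positions between runs; testing only
  }
}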

Unable to view Apache log in Elasticsearch

I have installed the ELK stack on Windows and configured Logstash to read an Apache log file. I can't seem to see the output in Elasticsearch. I am very new to the ELK stack.
Environment Setup
Elasticsearch: http://localhost:9200/
Logstash :
Kibana : http://localhost:5601/
All 3 applications above are running as a service.
I have created a file called "logstash.conf" at "C:\Elk\logstash\conf\logstash.conf" to read Apache logs, with the following:
input {
  file {
    path => "C:\Elk\apache.log"
    start_position => "beginning"
  }
}
output {
  elasticsearch { hosts => ["localhost:9200"] }
}
I then restarted my Logstash service and now wish to see whether Elasticsearch is indexing the content of my log. How do I go about doing this?
Try adding the following lines to your Logstash conf and let us know if there are any grok parsing failures, which would mean the pattern used in your filter section is not correct:
output {
  stdout { codec => json }
  file { path => "C:/POC/output3.txt" }
}
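Independently of the extra outputs, you can ask Elasticsearch directly whether an index was created. With no index option set, Logstash writes to a logstash-YYYY.MM.dd index by default, so (assuming Elasticsearch on localhost:9200):

curl "http://localhost:9200/_cat/indices?v"
curl "http://localhost:9200/logstash-*/_search?pretty&size=1"

One common Windows gotcha worth checking: the file input generally wants forward slashes in paths, i.e. path => "C:/Elk/apache.log" rather than backslashes.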

Logstash not writing output to elasticsearch

The code below is my Logstash conf file. I provide my nginx access log file as input and output to Elasticsearch. I also write the output to a text file, which works fine, but the output is never written to Elasticsearch.
input {
  file {
    path => "filepath"
    start_position => "beginning"
  }
}
output {
  file {
    path => "filepath"
  }
  elasticsearch {
    host => localhost
    port => "9200"
  }
}
I also tried executing the Logstash binary from the command line using the -e option:
input { stdin { } } output { elasticsearch { host => localhost } }
which works fine; I get the output written to Elasticsearch. But in the former case I don't. Help me solve this.
I tried a few things; I have no idea why your case with just host works. If I try it, I get timeouts. This is the configuration that works for me:
elasticsearch {
  protocol => "http"
  host => "localhost"
  port => "9200"
}
I tried this with Logstash 1.4.2 and Elasticsearch 1.4.4.
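For anyone on a newer version: host and protocol were removed from the elasticsearch output in Logstash 2.0, where the equivalent block becomes:

elasticsearch {
  hosts => ["localhost:9200"]
}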
