Logstash codec rubydebug is not working - elasticsearch

I installed the latest version of Logstash on a Windows machine and tried to execute the configuration below:
"
input{
stdin{}
}
output
{
stdout {codec => rubydebug}
}"
The output is not shown in the rubydebug format. Do I need to install a rubydebug plugin? If so, how can I install it on Windows, and what is the command to run?

Create a config file logstash-simple.conf in the Logstash directory:
input { stdin { } }
output {
  elasticsearch { hosts => ["localhost:9200"] }
  stdout { codec => rubydebug }
}
Then run the following command from Windows (in my case):
bin\logstash.bat -f logstash-simple.conf
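For what it's worth, rubydebug ships with Logstash core, so no separate plugin install should be needed; it simply pretty-prints each event as a Ruby hash. A rough Python analogy of the kind of structure it renders (the field values below are invented examples; @version, @timestamp, and host are the standard event fields):

```python
from pprint import pprint

# Approximation of a Logstash event as the rubydebug codec would render it.
# rubydebug prints a Ruby hash; the values here are invented for illustration.
event = {
    "message": "hello world",
    "@version": "1",
    "@timestamp": "2016-12-18T10:16:55.404Z",
    "host": "WIN-MACHINE",
}
pprint(event)
```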

Related

Logfile won't appear in Elasticsearch

I'm very new to Logstash and Elasticsearch. I am trying to ship my first log to Logstash so that I can (correct me if that is not the purpose) search it using Elasticsearch.
I have a log that basically looks like this:
2016-12-18 10:16:55,404 - INFO - flowManager.py - loading metadata xml
So, I have created a config file test.conf that looks like this:
input {
  file {
    path => "/home/usr/tmp/logs/mylog.log"
    type => "test-type"
    id => "NEWTRY"
  }
}
filter {
  grok {
    match => { "message" => "%{YEAR:year}-%{MONTHNUM:month}-%{MONTHDAY:day} %{HOUR:hour}:%{MINUTE:minute}:%{SECOND:second} - %{LOGLEVEL:level} - %{WORD:scriptName}.%{WORD:scriptEND} - " }
  }
}
output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "ecommerce"
    codec => line { format => "%{year}-%{month}-%{day} %{hour}:%{minute}:%{second} - %{level} - %{scriptName}.%{scriptEND} - \"%{message}\"" }
  }
}
And then: ./bin/logstash -f test.conf
I do not see the log in Elasticsearch when I go to http://localhost:9200/ecommerce or http://localhost:9200/ecommerce/test-type/NEWTRY.
Please tell me what I am doing wrong... :\
Thanks,
Heather
I found a solution eventually:
I added both sincedb_path => "/dev/null" (which, from what I understood, is for testing environments only) and start_position => "beginning" to the file input plugin, and the log appeared in both Elasticsearch and Kibana.
Thanks anyway for responding and trying to help!
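As a sanity check, the grok pattern in the question can be approximated with a plain regular expression outside Logstash. This is a rough Python sketch (the named groups mirror the grok field names; it is an approximation, not how Logstash itself evaluates grok):

```python
import re

# Rough regex equivalent of the grok pattern in the filter above;
# group names mirror the grok field names. %{SECOND} allows a
# comma-separated fraction, hence (?:,\d+)? here.
pattern = re.compile(
    r"(?P<year>\d{4})-(?P<month>\d{2})-(?P<day>\d{2}) "
    r"(?P<hour>\d{2}):(?P<minute>\d{2}):(?P<second>\d{2}(?:,\d+)?) "
    r"- (?P<level>[A-Z]+) - (?P<scriptName>\w+)\.(?P<scriptEND>\w+) - "
)

line = "2016-12-18 10:16:55,404 - INFO - flowManager.py - loading metadata xml"
m = pattern.match(line)
print(m.groupdict())
```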

sending json from one logstash to another

I have a 3-node setup:
10.x.x.1 - application and Filebeat
10.x.x.2 - machine for parsing, running Logstash
10.x.x.3 - centralized Logstash node from which we need to push messages into Elasticsearch
On 10.x.x.2, when I set the output codec to stdout, I can see the messages coming from 10.x.x.1.
Now I need to forward all the JSON messages from 10.x.x.2 to 10.x.x.3. I tried using TCP, but the messages are not getting sent.
10.x.x.2 Logstash conf file:
input {
  beats {
    port => 5045
  }
}
output {
  #stdout { codec => rubydebug }
  tcp {
    host => "10.x.x.3"
    port => 3389
  }
}
10.x.x.3 Logstash conf file:
input {
  tcp {
    host => "10.x.x.3"
    port => 3389
    #mode => "server"
    #codec => "json"
  }
}
output {
  stdout { codec => rubydebug }
}
Is there any plugin that can send JSON data from one Logstash server to another?
Your config should work, but you have to be careful with the codec properties.
First, set the codec to "line" on both the output and the input plugins of the two Logstash instances, and see whether logs come through. With the codec set to "line" you should logically have no problem forwarding the logs. Then work on the "json" codec.
Do not forget that you can activate Logstash's debug mode with the --debug argument, and log to a file with -l logFileName.
When you start working with the json codec, look for "_jsonparsefailure" tags, which could explain why logs are not being transferred between the two Logstash instances.
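The newline-delimited JSON framing that a json-family codec uses over TCP can be illustrated outside Logstash. A minimal Python sketch (this is a local socket pair, not a Logstash client; the event fields are invented):

```python
import json
import socket

# Newline-delimited JSON ("json_lines") framing, the kind of stream a
# tcp output with a JSON codec sends between two Logstash instances.
sender, receiver = socket.socketpair()

event = {"host": "10.x.x.1", "message": "hello from filebeat"}
sender.sendall((json.dumps(event) + "\n").encode("utf-8"))
sender.close()

# The receiving side splits on newlines and parses each line as JSON;
# a malformed line is what would earn a _jsonparsefailure tag in Logstash.
buf = receiver.makefile("r", encoding="utf-8")
decoded = json.loads(buf.readline())
print(decoded)
```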

LogStash setup LoadError

I'm trying to set up Logstash and I'm following this tutorial exactly. But when I run the command
bin/logstash -e 'input { stdin { } } output { stdout {} }'
it gives me the following error:
warning: --1.9 ignored
LoadError: no such file to load -- bundler
require at org/jruby/RubyKernel.java:940
require at C:/jruby-9.0.0.0/lib/ruby/stdlib/rubygems/core_ext/kernel_require.rb:54
setup! at C:/Users/ryan.dai/Desktop/logstash-1.5.3/lib/bootstrap/bundler.rb:43
<top> at c:/Users/ryan.dai/Desktop/logstash-1.5.3/lib/bootstrap/environment.rb:46
I tried jruby -S gem install bundler as suggested by someone else, but it doesn't work. I'm totally new to Ruby; what is happening, and what should I do?
You can follow the URL below for installing the entire ELK setup.
Here you need to pass the file (log) path as the input of the Logstash configuration:
input {
  file {
    path => "/tmp/access_log"
    start_position => "beginning"
  }
}
filter {
  grok {
    match => { "message" => "%{COMBINEDAPACHELOG}" }
  }
  date {
    match => [ "timestamp" , "dd/MMM/yyyy:HH:mm:ss Z" ]
  }
}
output {
  elasticsearch { host => localhost }
  stdout { codec => rubydebug }
}
ELK Setup Installation
Commands for running from the CMD prompt:
logstash -f logstash.conf to run Logstash
logstash --configtest -f logstash.conf to test the configuration
logstash --debug -f logstash.conf to debug the Logstash configuration
Logstash configuration Examples
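The grok and date filters above can be sanity-checked outside Logstash. A rough Python sketch of what %{COMBINEDAPACHELOG} and the "dd/MMM/yyyy:HH:mm:ss Z" date match do (the regex is a simplification of the real grok pattern, and the sample log line is invented for illustration):

```python
import re
from datetime import datetime

# Simplified stand-in for grok's %{COMBINEDAPACHELOG} (the real pattern
# captures more fields, e.g. referrer and user agent).
apache_re = re.compile(
    r'(?P<clientip>\S+) \S+ \S+ \[(?P<timestamp>[^\]]+)\] '
    r'"(?P<verb>\S+) (?P<request>\S+) \S+" (?P<response>\d{3}) (?P<bytes>\S+)'
)

line = '127.0.0.1 - - [18/Dec/2016:10:16:55 +0200] "GET /index.html HTTP/1.1" 200 512'
m = apache_re.match(line)
fields = m.groupdict()

# Logstash's date pattern "dd/MMM/yyyy:HH:mm:ss Z" corresponds to
# strptime's "%d/%b/%Y:%H:%M:%S %z" in Python.
ts = datetime.strptime(fields["timestamp"], "%d/%b/%Y:%H:%M:%S %z")
print(fields["clientip"], fields["response"], ts.isoformat())
```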

Logstash got error to send bulk of actions to elasticsearch server at localhost on Windows

I wrote a .conf file as in the example given in the Logstash documentation and tried to run it. Logstash started, but when I entered the input it gave the error mentioned in the title.
I am using Windows 8.1, and the .conf file is saved in logstash-1.5.0/bin.
Here is the .conf file:
input { stdin { } }
output {
  elasticsearch { host => localhost }
  stdout { codec => rubydebug }
}
Try this; "logstash" should be the same as your cluster name in elasticsearch.yml:
output {
  elasticsearch {
    cluster => "logstash"
  }
}
I found the error. It was because I had not installed Elasticsearch before running Logstash.
Thanks for trying to help me out.

Logstash not writing output to elasticsearch

The code below is my Logstash conf file. I provide my nginx access log file as input and output to Elasticsearch. I also write the output to a text file, which works fine, but the output is never written to Elasticsearch.
input {
  file {
    path => "filepath"
    start_position => "beginning"
  }
}
output {
  file {
    path => "filepath"
  }
  elasticsearch {
    host => localhost
    port => "9200"
  }
}
I also tried executing the Logstash binary from the command line using the -e option:
input { stdin { } } output { elasticsearch { host => localhost } }
which works fine; the output is written to Elasticsearch. But in the former case it isn't. Help me solve this.
I tried a few things, but I have no idea why your case with just host works; if I try it, I get timeouts. This is the configuration that works for me:
elasticsearch {
  protocol => "http"
  host => "localhost"
  port => "9200"
}
I tested this with Logstash 1.4.2 and Elasticsearch 1.4.4.
