Logstash exec plugin - available arguments - elasticsearch

I have the following in my logstash configuration file:
input {
  file {
    path => "C:/myfile.txt"
  }
}
output {
  exec {
    command => 'mytest.bat %message% %path%'
    interval => 0
  }
}
The %message% and %path% parameters are being passed to the batch file.
I am expecting to see:
message contain the line of the input file currently being parsed
path contain C:/myfile.txt
However, this is what the batch file receives:
message "%message%"
path "C:/logstash-1.5.0/vendor/bundle/jruby/1.9/bin"
What is the correct way to define the placeholders for:
the current line to be output
the name of the file being parsed
Thanks

Please modify your config to:
input {
  file {
    path => "C:/myfile.txt"
  }
}
output {
  exec {
    command => "mytest.bat %{message} %{path}"
    interval => 0
  }
}
If you want to reference a field from the event, use the sprintf format %{message}, not %message%.
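For reference, a minimal sketch of that sprintf format in the exec output; the quoting is my addition, not from the original answer:
output {
  exec {
    # %{message} expands to the event's message; %{path} is the source
    # file path set by the file input. Quoting the placeholders guards
    # against spaces in the message splitting the batch arguments.
    command => 'mytest.bat "%{message}" "%{path}"'
  }
}
Nested fields use bracket notation, e.g. %{[fieldname][subfield]}.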

Related

Function to read role, environment file in masterless puppet

I'm working with Puppet 4.5 in a masterless configuration and am trying to create a Puppet function to read a simple config file that assigns roles and environments. I don't have any integration with hiera/facter that I can change.
The file format is:
host1::java_app_node::qa
host2::nodejs_app_node::prod
The Puppet function that will read this file is in a module called homebase. I want the function to return a hash or array of hashes that splits the config values. This will let me use them in templates.
In modules/homebase/manifests/init.pp I define:
$role_file = 'puppet://role.lst'
I then created modules/homebase/functions/get_roles.pp as follows:
function homebase::get_roles() {
  $func_name = 'homebase::get_roles()'
  if ! File.exists?($::homebase::role_file) {
    fail("Could not find #{$::homebase::role_file}")
  }
  hosts = { }
  File.open($::homebase::role_file).each |line| {
    parts = line.split(/::/)
    hosts[parts[0]] = { 'host' => parts[0], 'role' => parts[1], 'env' => parts[2] }
  }
  return hosts
}
In other classes, I then want to call:
class myapp {
  $servers = homebase::get_roles().each | k, v | {
    $v['host'] if $v['role'] =~ /myapp/ && $v['env'] == $environment
  }
  file { 'myapp.cfg':
    ensure => file,
    path   => '/opt/myapp/myapp.cfg',
    source => template("/myapp/myapp.cfg.erb"),
    mode   => '0644',
    owner  => myuser,
    group  => myuser,
  }
}
Seems like there would be a better way to do this. Am I completely off base?
There turned out to be a much easier way to do this than trying to create a function to read a non-standard configuration file. Instead, I used a site.pp file to create node {} entries. I also parameterized the myapp class to take inputs based on the node.
So my site.pp looks like:
node 'server1.mydomain', 'server2.mydomain' {
  $myvar = [ 'val1', 'val2' ]
  class { 'myapp':
    values => $myvar
  }
}
This could probably be improved. One issue is that with a non-Puppet configuration file I was also able to control execution from my bash wrapper script. Much of the need for that went away, though, with the node definitions.
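For completeness, a minimal sketch of what the parameterized class might look like; the values parameter and the template path are illustrative assumptions, not from the original post:
class myapp (
  Array[String] $values = [],
) {
  # Render the config from the node-scoped values passed in site.pp.
  file { '/opt/myapp/myapp.cfg':
    ensure  => file,
    content => template('myapp/myapp.cfg.erb'),
    mode    => '0644',
  }
}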

logstash: how to use environment variables in input host

I want to print the host source to the output. For this, a local or global variable is necessary, but I don't want to use global variables set with 'export ...'.
So, before the input{}, I put the host into metadata and then use it in input{}, like below:
filter {
  environment {
    add_field => {
      "[#metadata][TEMP]" => "127.0.0.1"
    }
  }
}
input {
  udp {
    host => "%{[#metadata][TEMP]}"
    port => "10000"
  }
}
output {
  udp {
    host => "127.0.0.1"
    port => "10001"
  }
}
But Logstash does not run, and the log shows:
[WARN ][logstash.inputs.udp ] UDP listener died {:exception=>#<SocketError: bind: name or service not known>}
How can I solve this problem?
Let me try to answer your question in two steps.
The error message
Your config file is malformed. The workflow is always like this:
# This is a comment. You should use comments to describe
# parts of your configuration.
input {
  ...
}
filter {
  ...
}
output {
  ...
}
That is why you get the error message: events always flow input -> filter -> output, regardless of where the sections appear in the file, so your filter cannot add the field before the input starts. The udp input therefore receives the literal, unresolved string "%{[#metadata][TEMP]}" as its host, which fails to bind.
Multiple input sources
If you want to add information to your events depending on which input is used, you can add a type during input handling. Here is an example config file:
input {
  file {
    type => "file"
    path => "/var/log/some_name.log"
  }
  udp {
    type => "udp"
    host => "127.0.0.1"
    port => "10001"
  }
}
filter {
  # can be omitted, if not used
}
output {
  udp {
    host => "127.0.0.1"
    port => "10001"
  }
}
The type is stored as part of the event itself, so you can also use the type to search for it in Kibana.
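Building on that, a sketch of how the type could drive different outputs; the branching shown here is my illustration, not from the original answer:
output {
  if [type] == "udp" {
    udp {
      host => "127.0.0.1"
      port => "10001"
    }
  } else {
    stdout { codec => rubydebug }
  }
}
As an aside, if the underlying goal was to avoid hard-coding the host, Logstash 5.0 and later can read environment variables directly in the config with the ${VAR} syntax (e.g. host => "${UDP_HOST}"); check the documentation for your version.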

Logfile won't appear in elasticsearch

I'm very new to Logstash and Elasticsearch. I am trying to ship my first log to Logstash in a way that lets me (correct me if that is not the purpose) search it using Elasticsearch.
I have a log that looks like this basically:
2016-12-18 10:16:55,404 - INFO - flowManager.py - loading metadata xml
So, I have created a config file test.conf that looks like this:
input {
  file {
    path => "/home/usr/tmp/logs/mylog.log"
    type => "test-type"
    id => "NEWTRY"
  }
}
filter {
  grok {
    match => { "message" => "%{YEAR:year}-%{MONTHNUM:month}-%{MONTHDAY:day} %{HOUR:hour}:%{MINUTE:minute}:%{SECOND:second} - %{LOGLEVEL:level} - %{WORD:scriptName}.%{WORD:scriptEND} - " }
  }
}
output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "ecommerce"
    codec => line { format => "%{year}-%{month}-%{day} %{hour}:%{minute}:%{second} - %{level} - %{scriptName}.%{scriptEND} - \"%{message}\"" }
  }
}
And then: ./bin/logstash -f test.conf
I do not see the log in Elasticsearch when I go to http://localhost:9200/ecommerce or http://localhost:9200/ecommerce/test-type/NEWTRY.
Please tell me what I am doing wrong... :\
Thanks,
Heather
I found a solution eventually:
I added both sincedb_path => "/dev/null" (which, from what I understood, is meant for testing environments only) and start_position => "beginning" to the file input plugin, and the log appeared in both Elasticsearch and Kibana.
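For reference, a minimal sketch of the resulting file input under that fix, using the paths from the question:
input {
  file {
    path => "/home/usr/tmp/logs/mylog.log"
    type => "test-type"
    # Forget stored read positions and read from the start on every run;
    # suitable for testing only, since events get re-ingested each time.
    sincedb_path => "/dev/null"
    start_position => "beginning"
  }
}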
Thanks anyway for responding and trying to help!

puppet notify Exec doesn't work

Here is my code; don't worry about the variables, which are already set in the original code. I am just putting a small snippet here to show you what it is doing. The following code updates the file /etc/sysctl.d/pgsql.conf but does not trigger the notify or the Exec to reload the file. What is wrong here?
$sysctl_config = "/etc/sysctl.d/pgsql.conf"

exec { 'update_sysctl_shmall':
  unless  => "grep -q ^kernel.shmall ${sysctl_config}",
  command => "/bin/echo \"kernel.shmall = ${shmall}\" >> ${sysctl_config}",
}

file { '/etc/sysctl.d/pgsql.conf':
  ensure => present,
  notify => Exec['reload_sysctl'],
}

exec { 'reload_sysctl':
  provider    => shell,
  command     => '/bin/sysctl --system',
  logoutput   => 'on_failure',
  refreshonly => true,
}
The following code:
file { '/etc/sysctl.d/pgsql.conf':
  ensure => present,
  notify => Exec['reload_sysctl'],
}
only ensures that the /etc/sysctl.d/pgsql.conf file exists. Since the file already exists, the resource sees no change and sends no event; that is why the Exec was not triggered to reload the file.
Please check the following links about notifications in Puppet: 1, 2.
UPDATE:
Consider using audit metaparemeter:
file { '/etc/sysctl.d/pgsql.conf':
  audit  => 'content',
  ensure => present,
  notify => Exec['reload_sysctl'],
}
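Another option, offered as a sketch rather than the canonical fix: since it is the update_sysctl_shmall exec, not the file resource, that actually appends to the file, that exec can notify the reload directly:
exec { 'update_sysctl_shmall':
  unless  => "grep -q ^kernel.shmall ${sysctl_config}",
  command => "/bin/echo \"kernel.shmall = ${shmall}\" >> ${sysctl_config}",
  # An exec reports "changed" whenever it runs, so this notify fires
  # exactly when the unless check let the append happen.
  notify  => Exec['reload_sysctl'],
}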

LogStash setup LoadError

I'm trying to set up LogStash and I'm following this tutorial exactly. But when I run the command
bin/logstash -e 'input { stdin { } } output { stdout {} }'
it gives me the following error:
warning: --1.9 ignored
LoadError: no such file to load -- bundler
require at org/jruby/RubyKernel.java:940
require at C:/jruby-9.0.0.0/lib/ruby/stdlib/rubygems/core_ext/kernel_require.rb:54
setup! at C:/Users/ryan.dai/Desktop/logstash-1.5.3/lib/bootstrap/bundler.rb:43
<top> at c:/Users/ryan.dai/Desktop/logstash-1.5.3/lib/bootstrap/environment.rb:46
I tried jruby -S gem install bundler as suggested by someone else, but it doesn't work. I'm totally new to Ruby; what is happening, and what should I do?
You can follow the URL below for installing the entire ELK setup.
Here you need to pass the log file as the path in the input section of the Logstash configuration.
input {
  file {
    path => "/tmp/access_log"
    start_position => "beginning"
  }
}
filter {
  grok {
    match => { "message" => "%{COMBINEDAPACHELOG}" }
  }
  date {
    match => [ "timestamp" , "dd/MMM/yyyy:HH:mm:ss Z" ]
  }
}
output {
  elasticsearch { host => localhost }
  stdout { codec => rubydebug }
}
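Note that host => localhost is the Logstash 1.x form of the elasticsearch output; in Logstash 2.0 and later the option is hosts and takes an array. A sketch for newer versions, under that assumption:
output {
  # "hosts" (array) replaced "host" in the elasticsearch output from 2.0 on.
  elasticsearch { hosts => ["localhost:9200"] }
  stdout { codec => rubydebug }
}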
ELK Setup Installation
Commands for running from the command prompt:
logstash -f logstash.conf (runs Logstash)
logstash --configtest -f logstash.conf (tests the configuration)
logstash --debug -f logstash.conf (runs Logstash with debug output)
Logstash configuration Examples
