I am trying to send logs to the Windows event log using Logstash. After adding some Ruby code, it produces the error below. How can I send logs to the Windows event log?
input {
  file {
    type => "json"
    path => ["C:/Temp/logs/*.json"]
    start_position => "beginning"
    codec => "json"
    discover_interval => 120
    stat_interval => 60
    sincedb_write_interval => 60
    close_older => 60
  }
}
filter {
  mutate {
    remove_field => [ "path" ]
  }
  ruby {
    code => "
      require 'win32/eventlog'
      logger = Win32::EventLog.new
      logger.report_event(:event_type => Win32::EventLog::INFO, :data => "a test event log entry")
    "
  }
}
output {
  stdout {
    codec => rubydebug
  }
  elasticsearch {
    hosts => ["http://loguser:xxxx@192.158.5.84:333"]
    index => "logstash-%{+YYYY.MM}"
  }
}
Error:
[2018-03-20T09:51:28,629][ERROR][logstash.agent ] Cannot create pipeline {:reason=>"Expected one of #, {, } at line 23, column 75 (byte 464) after filter {\nmutate {\n remove_field => [ \"path\" ] \n\n}\n ruby {\n init => \" require 'win32/eventlog' \n\t \"\n code => \"\n logger = Win32::EventLog.new\n logger.report_event(:event_type => Win32::EventLog::INFO, :data => \""}
As you can tell from the syntax highlighting in your question, there is an issue with the double quotes you are using. Pay close attention to the black letters in the code block:
"
require 'win32/eventlog'
logger = Win32::EventLog.new
logger.report_event(:event_type => Win32::EventLog::INFO, :data => "a test event log entry")
"
You are wrapping the code block in double quotes, but are also using them to delimit the string inside the event: "a test event log entry". The first quote of that string ends the code block, and Logstash reports a syntax error because it expected you to close the instruction with a }.
You can also see this in the error message, where it reports the value of the data attribute as a single double quote: :data => \".
Try wrapping the string in single quotes: 'a test event log entry' to fix this issue.
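For example, the ruby block from your config with the inner string single-quoted (and moving the require into init, as your error output already shows, keeps the code string minimal):
ruby {
  init => "require 'win32/eventlog'"
  code => "
    logger = Win32::EventLog.new
    logger.report_event(:event_type => Win32::EventLog::INFO, :data => 'a test event log entry')
  "
}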
Related
I have the following log file; every Logstash record is supposed to contain multiple lines that end with a row of dash signs ('----'). From the example log you can see that I have four keys, the last key being 'Message'. I have configured the following config file, which handles all the lines except those that follow 'Message :'. As you can see from the log, in some cases 'Message :' contains more than one line, and for these cases I need all the lines coming after 'Message :' to be part of the 'Message' value and not separate lines.
Please help me fix this issue.
log file (input.log)
Timestamp :2022-11-03 09:42:08.095
User :USER1
Type :warning
Message :Return code : EXCI_NO_ERROR 0
------------------------------------------
Timestamp :2022-11-03 09:42:08.095
User :USER1
Type :warning
Message :Abend code : 1564
------------------------------------------
Timestamp :2022-11-03 09:42:08.095
User :USER1
Type :warning
Message :Buffer received from xxx
line1
line2
line3
line4
------------------------------------------
Timestamp :2022-11-03 09:42:08.095
User :USER1
Type :warning
Message :Return code : EXCI_NO_ERROR 0
------------------------------------------
Timestamp :2022-11-03 09:42:08.095
User :USER1
Type :warning
Message :Abend code : 1564
------------------------------------------
config file
input {
  file {
    path => "/etc/logstash/input.log"
    start_position => "beginning"
    sincedb_path => "/dev/null"
    codec => multiline {
      # any line not starting with '-----' is merged with the previous line
      pattern => "^-----"
      negate => true
      what => "previous"
    }
  }
}
filter {
  kv {
    #source => "message"
    field_split => "\n"
    value_split => ":"
  }
}
output {
  file {
    path => "/etc/logstash/output.log"
  }
  #stdout { codec => rubydebug }
}
TL;DR:
Take a step back and use other filter types:
dissect
mutate
You will find one possible solution below.
Solution
Mutate: remove the ----\n separator from the message
Drop empty messages
Dissect the message into the necessary fields
input {
  file {
    path => "/tmp/input.log"
    start_position => "beginning"
    sincedb_path => "/dev/null"
    mode => "read"
    codec => multiline {
      # any line that is not made up solely of dashes is merged with the previous line
      pattern => "^-+$"
      negate => true
      what => "previous"
    }
  }
}
filter {
  mutate { gsub => ["message", "^-+(\n)?", ""] }
  if [message] == "" {
    drop {}
  }
  dissect {
    mapping => {
      "message" => "Timestamp :%{Timestamp}
User :%{User}
Type :%{Type}
Message :%{Message}"
    }
  }
}
output {
  stdout { codec => rubydebug }
}
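With the sample log above, each separator-delimited record should then come out as one event with the four keys populated; for the multiline record, roughly (a sketch, not verbatim rubydebug output):
{
  "Timestamp" => "2022-11-03 09:42:08.095",
  "User" => "USER1",
  "Type" => "warning",
  "Message" => "Buffer received from xxx\nline1\nline2\nline3\nline4"
}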
I have the following ruby hash.
config = {
  'output' => {
    'elasticsearch' => {
      'hosts' => ['localhost:9200']
    }
  }
}
I am trying to represent this as a Logstash configuration file (https://www.elastic.co/guide/en/logstash/current/configuration.html), in this case something that looks similar to this:
output {
  elasticsearch { hosts => ["localhost:9200"] }
}
I've tried using map, which is close, but "elasticsearch" should not be followed by "=>", and "elasticsearch" and "hosts" should not be quoted.
puts config.map{|k, v| "#{k} #{v}"}.join('&')
output {"elasticsearch"=>{"hosts"=>["localhost:9200"]}}
I've also tried converting to JSON and using gsub, but in this case I need to unindent the string, and "output" and "elasticsearch" should not be quoted.
puts JSON.pretty_generate(config).gsub(/^[{}]$/, "")
         .gsub(": {", " {")
         .gsub(": ", " => ")[1..-2]
"output" {
  "elasticsearch" {
    "hosts" => [
      "localhost:9200"
    ]
  }
}
While each implementation is close, it's still off by a bit. Is there a simple way to achieve this?
The Logstash config format isn't standard JSON or anything. It may be best to just write a serializer for it. I took a quick stab at it:
def serialize_config(config, tabs = 0)
  clauses = []
  config.each do |key, val|
    case val
    when Hash
      clauses << format("%s {\n%s%s}", key, serialize_config(val, tabs + 1), "\t" * tabs)
    else
      clauses << format("%s => %s", key, val.inspect)
    end
  end
  clauses.map { |c| format("%s%s\n", "\t" * tabs, c) }.join
end
config = {
  'output' => {
    'elasticsearch' => {
      'hosts' => ['localhost:9200']
    },
    'ruby' => {
      "code" => "event.cancel if rand <= 0.90"
    }
  }
}
puts serialize_config(config)
Which gives this output:
output {
  elasticsearch {
    hosts => ["localhost:9200"]
  }
  ruby {
    code => "event.cancel if rand <= 0.90"
  }
}
You'd want to check it against more complex Logstash configs, though.
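One caveat beyond complex configs: a Ruby Hash cannot hold two plugins of the same name in the same section, because hash keys must be unique, so a config with, say, two file inputs is not representable this way. A hypothetical workaround is to use nested arrays of pairs as the intermediate structure, e.g.:
config = [
  ['input', [
    ['file', [['path', '/tmp/a.log']]],
    ['file', [['path', '/tmp/b.log']]]
  ]]
]
with serialize_config adjusted to recurse on arrays of pairs instead of hashes.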
The only thing that is certain about [url][queryString] is that the key begins with 404; or that the key is long. I need to remove such keys.
If I use the Ruby code below, it gives a "cannot convert linked hashmap to string" exception.
input {
  file {
    # Wildcards work, here :)
    path => ["C:\Users\ppurush\Desktop\test\*.log"]
    start_position => "beginning"
  }
}
filter {
  ruby {
    code =>
      "
      require json
      my_hash = JSON.parse([url][queryString])
      my_hash.delete_if { |key,value| key.to_s.match(/^404;/) }
      "
  }
}
output {
  stdout{}
  elasticsearch {
    host => localhost
  }
}
You get a ruby exception because your ruby code is invalid. Try this instead:
filter {
  ruby {
    init => "require 'json'"
    code => "
      my_hash = JSON.parse( event['url']['queryString'] )
      my_hash.delete_if { |key,value| key.to_s.match(/^404;/) }
      # write the cleaned hash back onto the event
      event['url']['queryString'] = my_hash
    "
  }
}
This works if your event has a [url][queryString] field that contains valid JSON. You might already have some kind of filter that achieves this (e.g. grok). You might also consider using Logstash's built-in json filter, and maybe drop to delete certain events.
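For reference, a sketch of that json filter approach, assuming the raw JSON string arrives in [url][queryString] and that the nested field reference is accepted as source:
filter {
  json {
    # parse the JSON string in place, replacing it with the parsed structure
    source => "[url][queryString]"
    target => "[url][queryString]"
  }
}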
EDIT:
Suppose your input is plain json (I had to tidy this up):
{"id":"val1","host":"val2","app":"val3","#timestamp":"2015-08-04T19:00:03.6429322Z","#timestampEnd":"2015-08-04T19:00:03.6429322Z","vid":"val4","vidNew":"val5","sessionId":"val6","url":{"rawUrl":"val7","path":"val8","queryString":{"404;dfdgfdgfghfhjghhhhhhhhhhhhh":""}},"net":{"method":"GET","status":"200","size":"0","timeTakenMillis":"0"},"context":{"SearchType":""}}
You can use codec => "json" in your file input.
input {
  file {
    path => ["C:\Users\ppurush\Desktop\test\*.log"]
    start_position => "beginning"
    codec => "json"
  }
}
You will get a field:
"url" => {
"rawUrl" => "val7",
"path" => "val8",
"queryString" => {
"404;dfdgfdgfghfhjghhhhhhhhhhhhh" => ""
}
}
So 404;dfdgfdgfghfhjghhhhhhhhhhhhh is a variable, too. To check for it and delete the event you could do something like this:
if [url][queryString][404;dfdgfdgfghfhjghhhhhhhhhhhhh] {
  drop {}
}
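If you would rather drop only the offending keys than the whole event, here is a sketch using the same pre-5.x event API as above, assuming queryString has already been parsed into a hash (e.g. by the json codec):
filter {
  ruby {
    code => "
      qs = event['url']['queryString']
      if qs.is_a?(Hash)
        # remove every key that starts with '404;'
        qs.delete_if { |key, _| key.to_s.start_with?('404;') }
        event['url']['queryString'] = qs
      end
    "
  }
}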
I have a log with a format similar to:
name=johnny amount=30 uuid=2039248934
The problem is that I am using this parser on multiple log files, each containing numerous kv pairs.
Is there a way to recognize when values are integers and cast them as such (rather than as strings), without having to use mutate on every single key-value pair?
I found this link, but it was very vague about where the template JSON file was supposed to go and how to use it.
Can kv be told to auto-detect numeric values and emit them as numeric JSON values?
You can use the ruby plugin to do it.
input {
  stdin {}
}
filter {
  ruby {
    code => "
      fieldArray = event['message'].split(' ');
      for field in fieldArray
        name = field.split('=')[0];
        value = field.split('=')[1];
        if value =~ /\A\d+\Z/
          event[name] = value.to_i
        else
          event[name] = value
        end
      end
    "
  }
}
output {
  stdout { codec => rubydebug }
}
First, split the message into an array on spaces.
Then, for each key-value mapping, check whether the value is numeric; if it is, convert it to an Integer.
Here is the sample output for your input:
{
  "message" => "name=johnny amount=30 uuid=2039248934",
  "@version" => "1",
  "@timestamp" => "2015-06-25T08:24:39.755Z",
  "host" => "BEN_LIM",
  "name" => "johnny",
  "amount" => 30,
  "uuid" => 2039248934
}
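Note that /\A\d+\Z/ only matches whole numbers, so a value like 3.14 would stay a string. A hypothetical extension of the same check for decimals (same pre-5 event API):
if value =~ /\A\d+\Z/
  event[name] = value.to_i
elsif value =~ /\A\d+\.\d+\Z/
  event[name] = value.to_f
else
  event[name] = value
end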
Updated solution for Logstash 5:
input {
  stdin {}
}
filter {
  ruby {
    code => "
      fieldArray = event.get('message').split(' ');
      for field in fieldArray
        name = field.split('=')[0];
        value = field.split('=')[1];
        if value =~ /\A\d+\Z/
          event.set(name, value.to_i)
        else
          event.set(name, value)
        end
      end
    "
  }
}
output {
  stdout { codec => rubydebug }
}
Note, if you decide to upgrade to Logstash 5, there are some breaking changes:
https://www.elastic.co/guide/en/logstash/5.0/breaking-changes.html
In particular, it is the event API that changed: fields must now be read and written with event.get and event.set. Here is what I used to get it working (based on Ben Lim's example):
input {
  stdin {}
}
filter {
  ruby {
    code => "
      fieldArray = event.get('message').split(' ');
      for field in fieldArray
        name = field.split('=')[0];
        value = field.split('=')[1];
        if value =~ /\A\d+\Z/
          event.set(name, value.to_i)
        else
          event.set(name, value)
        end
      end
    "
  }
}
output {
  stdout { codec => rubydebug }
}
I'm trying to display some Mongo data that I've been collecting with Logstash using the Mongostat tool. Mongostat displays values with a suffix like "b", "k", or "g" to signify bytes, kilobytes, or gigabytes, which is fine if I'm just reading the output, but I want to throw this into Kibana and display it in a graphical format to see trends.
I've done this with several other log files and everything was fine. When I use a grok filter everything is fine, but since adding a Ruby filter, data seems to be duplicated in all fields other than the Logstash-generated fields and the new field created in my Ruby filter.
Here are the relevant parts of my conf file:
input {
  file {
    path => "/var/log/mongodb/mongostat.log"
    type => "mongostat"
    start_position => "end"
  }
}
filter {
  if [type] == "mongostat" {
    grok {
      patterns_dir => "/opt/logstash/patterns"
      match => ["message","###a bunch of filtering that i know works###"]
      add_tag => "mongostat"
    }
    if [mongoMappedQualifier] == 'b' {
      ruby {
        code => "event['mongoMappedKB'] = event['mongoMapped'].to_f / 1024"
      }
    }
    if [mongoMappedQualifier] == 'k' {
      ruby {
        code => "event['mongoMappedKB'] = event['mongoMapped'].to_f * 1"
      }
    }
    if [mongoMappedQualifier] == 'm' {
      ruby {
        code => "event['mongoMappedKB'] = event['mongoMapped'].to_f * 1024"
      }
    }
    if [mongoMappedQualifier] == 'g' {
      ruby {
        code => "event['mongoMappedKB'] = event['mongoMapped'].to_f * 1048576"
      }
    }
  }
}
output {
  if [type] == "mongostat" {
    redis {
      host => "redis"
      data_type => "list"
      key => "logstash-mongostat"
    }
  }
}
Any idea why or how this can be fixed?