Converting epoch time to date in Logstash using the ruby filter

I have a field named "timestamp" in my configuration. It holds an array of epoch times in milliseconds. I want to use the ruby filter to convert each epoch time in the array into a date format consumable by Kibana, and store the converted values in a new field as an array. I am getting syntax errors. Can anyone help me out? I am new to Ruby.
ruby {
  code => {'
    event.get("timestamp").each do |x| {
      event["timestamp1"] = Time.at(x)
    }
  '}
}

I don't know about Logstash, but the Ruby code you include within quotes is invalid (you cannot mix a do block with braces), and the code option takes a plain string, not a hash. Try this:
ruby {
  code => '
    event.get("timestamp").each { |x| event.set("timestamp1", Time.at(x)) }
  '
}
If you intend the timestamp key to increment, include an index:
ruby {
  code => '
    event.get("timestamp").each_with_index { |x, i| event.set("timestamp#{i}", Time.at(x)) }
  '
}

This will take a timestamp array with values in milliseconds since the epoch and create a new field holding the parsed times. This code goes inside a ruby filter. Note: this does not convert into a Date field format.
code => '
  timestamps = Array.new
  event.get("timestamp").each { |x|
    timestamps.push(Time.at(x.to_i / 1000))
  }
  event.set("timestamp1", timestamps)
'
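To get values Kibana will actually index as dates, ISO 8601 strings are a safe target, since Elasticsearch parses them as dates by default. A minimal plain-Ruby sketch of the same conversion (the input array is hypothetical; note the 1000.0, which keeps the millisecond fraction that integer division discards):
require 'time'

# Hypothetical input: epoch times in milliseconds, as in the question.
epoch_millis = ["1420070400000", "1420070401500"]

# Divide by 1000.0 so the millisecond fraction survives, then emit
# UTC ISO 8601 strings, which Elasticsearch indexes as dates.
timestamps = epoch_millis.map { |x| Time.at(x.to_i / 1000.0).utc.iso8601(3) }
# => ["2015-01-01T00:00:00.000Z", "2015-01-01T00:00:01.500Z"]
Inside the ruby filter, the resulting array would then be stored with event.set("timestamp1", timestamps) as above.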

Related

How to use Logstash filter to convert into nested object for elasticsearch output?

I have the following event (row) from a JDBC input:
{"academic_session_id" : "as=1|dur=2015-16,as=2|dur=2016-17",
"branch_id" : 1}
I want to convert or format it into the following using Logstash filters:
{"branch_id": 1,"sessions":[{"as":"1","dur":"2015-16"},{"as":"2","dur":"2016-17"}]}
Suggestions for alternatives to Logstash are also welcome.
Note: I am using Elasticsearch 5.x.
Since this is a pretty customized manipulation of the data, I would use the ruby filter, and just write a script using the code setting to parse the data. Something like this would work:
filter {
  ruby {
    code => "
      academic_session = event.get('academic_session_id').split(',').map { |data| data.split('|') }
      sessions = academic_session.map do |arr|
        temp_hash = {}
        arr.each do |kv|
          k, v = kv.split('=')
          temp_hash[k] = v
        end
        temp_hash
      end
      event.set('sessions', sessions)
    "
    remove_field => ['academic_session_id']
  }
}
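Since the code option is plain Ruby, the transform can be sanity-checked outside Logstash. A standalone version of the same logic, using the sample value from the question:
academic_session_id = "as=1|dur=2015-16,as=2|dur=2016-17"

# Split into sessions, then split each session into key=value pairs.
sessions = academic_session_id.split(',').map do |pair|
  pair.split('|').each_with_object({}) do |kv, h|
    k, v = kv.split('=')
    h[k] = v
  end
end

p sessions
# => [{"as"=>"1", "dur"=>"2015-16"}, {"as"=>"2", "dur"=>"2016-17"}]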

Logstash to Elasticsearch: keys with dots

I'm facing a problem with my Logstash configuration, which you can find below.
The ruby filter removes every dot "." from my field names. Everything seems to work fine (the result of the filtering is correct), but Elasticsearch responds with: "status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"Field name [/ConsumerAdminWebService/getConsumerTransactions.call] cannot contain '.'"} where getConsumerTransactions.call is one of my field keys.
input {
  http_poller {
    urls => {
      uatBackend1 => {
        method => get
        url => "http://some-url/"
        headers => {
          Accept => "application/json"
        }
      }
    }
    request_timeout => 60
    # Run every 30 seconds
    schedule => { cron => "* * * * * UTC" }
    codec => "json"
    metadata_target => "http_poller_metadata"
  }
}
filter {
  ruby {
    init => "
      def remove_dots hash
        new = Hash.new
        hash.each { |k, v|
          if v.is_a? Hash
            v = remove_dots(v)
          end
          new[ k.gsub('.', '_') ] = v
          if v.is_a? Array
            v.each { |elem|
              if elem.is_a? Hash
                elem = remove_dots(elem)
              end
              new[ k.gsub('.', '_') ] = elem
            } unless v.nil?
          end
        } unless hash.nil?
        return new
      end
    "
    code => "
      event.instance_variable_set(:@data, remove_dots(event.to_hash))
    "
  }
}
output {
  elasticsearch {
    hosts => "localhost"
  }
}
I'm afraid this line of code is not correct: event.instance_variable_set(:@data, remove_dots(event.to_hash)). The result is somehow pinned to the event, but the original data persists unchanged and is what gets delivered to the Elasticsearch API.
Some clarifications:
I use an ES version > 2.0, so dots are not allowed.
The ruby filter should replace dots with "_", and it works: the resulting data is fully correct, yet ES replies with the error above. I suspect the filter does not replace the event data but simply adds a new field to the Event object, so ES still reads the original data, not the updated version.
To be honest, Ruby is magic to me :)
If you're using ES 2.x, this is a version issue: ES does not accept field names that contain the . dot.
According to this response in this thread:
Field names cannot contain the . character in Elasticsearch 2.0.
As a workaround you will have to mutate (rename) your field names to use _ or - instead of the . dot. This ticket explains the issue; dots are allowed again in ES versions after 2.x. Hope it helps!
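As a sketch of that workaround inside a ruby filter: rather than reaching into the event's internals with instance_variable_set, read the event as a hash and rewrite the offending fields through the public event API (event.to_hash, event.remove, event.set). This minimal version handles top-level keys only; nested keys would still need the recursive treatment from the question:
filter {
  ruby {
    code => "
      # Rename top-level keys containing dots via the public event API,
      # instead of replacing the event's internal state.
      event.to_hash.each do |k, v|
        next unless k.include?('.')
        event.remove(k)
        event.set(k.gsub('.', '_'), v)
      end
    "
  }
}
There is also a dedicated de_dot filter plugin for exactly this job, if it is available in your installation.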

Getting date value from an array hash

I have a hash that looks like:
tagArray = { :"2014Date"=>["11/22/2014"], :"2015Date"=>["03/21/2015"] }
Since I already know that for a given key there is only one element in the array of values, how can I get just the value, not an array?
value = tagArray[:"2015Date"]
=> ["03/21/2015"]
I can index the array and get the date:
value = value.fetch(0).to_s
=> "03/21/2015"
However, I am looking for a more elegant way of doing it.
Note: I am using Ruby 2.2, and I need to strptime the date afterwards to get it out of mm/dd/yyyy format, which is the end goal.
This is a bit simpler:
value = tagArray[:"2015Date"].last
=> "03/21/2015"

Could not load : can't convert nil into String

I want to read logs only from after a particular date. My approach is to drop all events before that date; here I am dropping all logs from before June 1, 2015.
Logstash config file:
input {
  file {
    path => [
      "/var/log/rsyslog/**/*.log"
    ]
  }
}
filter {
  grok {
    match => ["path", "/var/log/rsyslog/(?<server>[^/]+)/%{YEAR:year}-%{MONTHNUM:month}-%{MONTHDAY:month_day}/(?<logtype>.*).log"]
  }
  if [year] < "2015" and [month] < "6" and [month_day] < "1" {
    drop { }
  }
}
My logstash.err file keeps printing this:
Could not load : can't convert nil into String
Any idea why?
One of the three values (year, month, or month_day) is nil, because the regex does not match some of the lines in the log file.
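Since grok tags events it cannot parse with _grokparsefailure, one way to guard against the nil values is to handle those events before comparing the captured fields. A sketch using the pattern from the question:
filter {
  grok {
    match => ["path", "/var/log/rsyslog/(?<server>[^/]+)/%{YEAR:year}-%{MONTHNUM:month}-%{MONTHDAY:month_day}/(?<logtype>.*).log"]
  }
  # Unmatched events carry the _grokparsefailure tag, so the captured
  # fields are only compared when they actually exist.
  if "_grokparsefailure" in [tags] {
    drop { }
  } else if [year] < "2015" and [month] < "6" and [month_day] < "1" {
    drop { }
  }
}
Note that the date test itself is also fragile: with and, a log from 2014-07-02 would not be dropped, because its month is not less than "6". Building a single zero-padded year-month-day field (for example with mutate add_field) and comparing it lexically against "2015-06-01" is more robust.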

Can someone explain/annotate this Ruby snippet with comments?

Please explain this Ruby code so I can convert it to PHP:
data = Hash.new({})
mysql_results.each { |r| data[r['year']][r['week']] = r['count'] }
(year_low..year_high).each do |year|
  (1..52).each do |week|
    puts "#{year} #{week} #{data[year][week]}"
  end
end
data = Hash.new({})
# create hash 'data' whose default value for missing keys is {}
mysql_results.each { |r| data[r['year']][r['week']] = r['count'] }
# maps each row from the SQL query into the hash like this: data[2010][30] = 23
# So you can access 'count' for every year and week in a very simple way
(year_low..year_high).each do |year|
  # for (year = year_low; year <= year_high; year++)
  (1..52).each do |week|
    # for (week = 1; week <= 52; week++)
    puts "#{year} #{week} #{data[year][week]}"
    # printf("%d %d %d\n", year, week, data[year][week]);
  end
end
Sorry for mixing C with pseudo code, but I hope it helps!
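One caveat worth adding to these annotations: Hash.new({}) hands out a single shared default hash for every missing key, so data[r['year']][r['week']] = r['count'] writes into that one shared hash and nothing is ever stored under the year keys; the lookups in the print loop only work by accident and mix all years together. The idiomatic Ruby form stores a fresh inner hash per key:
data = Hash.new { |h, k| h[k] = {} }
# the block runs once per missing key, storing a new empty hash under it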
The first bit is just forming an array like so:
$data[2009][17] = 10;
PHP equivalent of that would be:
foreach ($mysql_results as $r) {
    $data[$r['year']][$r['week']] = $r['count'];
}
The second part would equate to the following:
foreach (range($year_low, $year_high) as $year) {
    foreach (range(1, 52) as $week) {
        print $year.' '.$week.' '.$data[$year][$week]."\n";
    }
}
Hope that helps :)
$data = array();
# Build an array of 'count' per year/week
foreach ($mysql_results as $r) {
    $data[$r['year']][$r['week']] = $r['count'];
}
# Loop through the $data variable, printing out the 'count' for each year in the array,
# and all 52 weeks of that year
for ($year = $year_low; $year <= $year_high; $year++) {
    for ($week = 1; $week <= 52; $week++) {
        echo "$year $week {$data[$year][$week]}\n";
    }
}
Note that $year_low and $year_high are not defined in the snippet, but their values should be known to you.
Also, $mysql_results should be an array containing all rows returned by the database.
In short, the following code does this:
Make an array grouped per year, then per week, containing the value 'count'
Loop through this array, displaying, in order, the year, the week, and the value for 'count', if any
