ruby {
  code => "
    event['data'].each {|k, v|
      event[k] = v
    }
    event.remove('data')
  "
}
This was used in Logstash 1.5 to split key/value pairs out of a field called "data". We now need to migrate to Logstash 6.8.0. Could someone help me with this?
I have tried the following, but it's not working properly:
ruby {
  code => "
    event.get('data').each {|k, v|
      event.set(k, v)
    }
    event.remove('data')
  "
}
The log looks like this:
{"data":[{"key1":"value1","key2":"value2","key3":"value3"}]}
The issue I'm facing is that I'm trying to automate a Terraform script to bootstrap VMs. I keep all the templates in JSON files, and depending on the instance it takes one, two, or more templates to create the node JSON file.
My code looks like:
template.each do |t|
  node = JSON.parse(File.read("./terraform/node/#{t}.json"))
  atr << node.each { |k, v| puts "#{k}: #{v}" }
  concatenated = atr.flatten
  temp = concatenated
  File.open("./terraform/node/#{instance_name}.json","w") do |f|
    f.puts JSON.pretty_generate(temp)
  end
end
The output file looks like:
[{"haproxy"=>{"app_server_role"=>["s1", "s2"]}}, {"apache"=>{"listen_ports"=>["80", "443"]}}, {"tags"=>[]}]
The issue is that inside the array we have each template stored exactly as it appears in the ERB:
{"haproxy"=>{"app_server_role"=>["s1", "s2"]}}
{"tags"=>[]} ...
What I want is a single valid JSON document with the contents of the templates merged, like:
{"haproxy"=>{"app_server_role"=>["s1", "s2"]}, "apache"=>{"listen_ports"=>["80", "443"]}, "tags"=>[]}
output_file = [{"haproxy"=>{"app_server_role"=>["s1", "s2"]}}, {"apache"=>{"listen_ports"=>["80", "443"]}}, {"tags"=>[]}]
output_file.each_with_object({}) { |h, h2| h2.merge!(h) }
=> {"haproxy"=>{"app_server_role"=>["s1", "s2"]}, "apache"=>{"listen_ports"=>["80", "443"]}, "tags"=>[]}
Another great option, suggested by mudasobwa:
output_file.reduce(&:merge)
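Applied back to the original loop, a sketch along these lines (keeping the question's template and instance_name variables, which are assumed to be defined elsewhere) merges every template into one hash and writes a single valid JSON file once:
require 'json'

# Merge each template hash into one hash instead of collecting them in an array
merged = template.each_with_object({}) do |t, acc|
  acc.merge!(JSON.parse(File.read("./terraform/node/#{t}.json")))
end

# Write the output once, after all templates have been merged
File.open("./terraform/node/#{instance_name}.json", "w") do |f|
  f.puts JSON.pretty_generate(merged)
end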
I'm trying to get the node role from the Kubeclient API.
Command: client.get_nodes()[0].metadata.labels
Kubeclient::Node beta.kubernetes.io/arch="amd64",
beta.kubernetes.io/instance-type="t2.medium",
beta.kubernetes.io/os="linux",
failure-domain.beta.kubernetes.io/region="eu-west-1",
failure-domain.beta.kubernetes.io/zone="eu-west-1a",
kubernetes.io/hostname="ip-X-X-XX-XX.eu-west-1.compute.internal",
kubernetes.io/role="node", node-role.kubernetes.io/node="">
I need to get the value of kubernetes.io/role="node", which is node. Can someone help with the Ruby code to extract this from the output?
Maybe you can do this:
require 'kubeclient'

config = Kubeclient::Config.read('/path/to/.kube/config')
client = Kubeclient::Client.new(
  config.context.api_endpoint,
  config.context.api_version,
  {
    ssl_options: config.context.ssl_options,
    auth_options: config.context.auth_options
  }
)

# Print the label kubernetes.io/role
puts client.get_nodes()[0].metadata.labels['kubernetes.io/role']

# Iterate over all labels
client.get_nodes()[0].metadata.labels.each_pair do |key, value|
  puts "#{key} = #{value}"
end
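If you need the role of every node rather than just the first one, a small sketch reusing the same client (and the same label access shown above) could be:
# Build a hostname => role map across all nodes
roles = client.get_nodes.each_with_object({}) do |node, acc|
  labels = node.metadata.labels
  acc[labels['kubernetes.io/hostname']] = labels['kubernetes.io/role']
end
puts roles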
I have defined a custom function currently based on the very simple example here: https://docs.puppet.com/guides/custom_functions.html
module Puppet::Parser::Functions
  newfunction(:transform_service_hash) do |args|
    filename = args[0]
    hash_to_be_transformed = args[1]
    File.open(filename, 'a') { |fd| fd.puts hash_to_be_transformed }
  end
end
This kinda works. I can call it like this:
$my_hash = { key => "value1" , key2 => "value2" }
notify{ "new hash!! $my_hash" :}
transform_service_hash('/var/tmp/blah',$my_hash)
and the file displays:
mgt21 ~ # cat /var/tmp/blah
keyvalue1key2value2
But, if I try to access elements of the hash, nothing changes:
module Puppet::Parser::Functions
  newfunction(:transform_service_hash) do |args|
    filename = args[0]
    hash_to_be_transformed = args[1]
    element1 = hash_to_be_transformed["key"]
    File.open(filename, 'a') { |fd| fd.puts element1 }
  end
end
The above block outputs the exact same data to /var/tmp/blah.
And, interestingly, if I remove the filename argument and define the path statically in the module:
$my_hash = { key => "value1" , key2 => "value2" }
notify{ "new hash!! $my_hash. element1 is: $my_hash.key" :}
transform_service_hash($my_hash)
and
module Puppet::Parser::Functions
  newfunction(:transform_service_hash) do |args|
    hash_to_be_transformed = args[0]
    element1 = hash_to_be_transformed["key"]
    File.open('/var/tmp/blah2', 'a') { |fd| fd.puts element1 }
  end
end
I get the following error: "Error 400 on SERVER: can't convert Hash into String" with a line reference pointing to "transform_service_hash($my_hash)"
I am new to both Puppet and Ruby, so I'm unsure whether I am not passing the element properly, not receiving it properly, or whether this is something that Puppet cannot handle. Please note that I am using Puppet 3.8 and Ruby 1.8.7.
Thanks for any help. I've been banging my head against this, and google hasn't been forthcoming yet.
---Edit to clarify my goals (I also edited my code a bit for specificity): I am attempting to pass a hash into a custom ruby function within puppet. The "test" hash has two elements: one string and one array. It is defined as such:
$my_hash = { key => "value1" , key2 => ['array_value1', 'array_value2'] }
$my_display_element=$my_hash["key2"][0]
notify{ "new hash!! $my_hash. the first value of the array stored in element2 is: $my_display_element" :}
transform_service_hash('/var/tmp/blah',$my_hash)
The function appears like so:
module Puppet::Parser::Functions
  newfunction(:transform_service_hash) do |args|
    filename = args[0]
    hash_to_be_transformed = args[1]
    element1 = args[1]["key"]
    element2 = args[1]["key2"][0]
    #element1 = hash_to_be_transformed["key"]
    #element2 = hash_to_be_transformed["key2"][0]
    File.open(filename, 'a') { |fd| fd.puts "hash_to_be_transformed: #{hash_to_be_transformed}\n" }
    File.open(filename, 'a') { |fd| fd.puts "element1: #{element1}\n" }
    File.open(filename, 'a') { |fd| fd.puts "element2: #{element2}\n" }
  end
end
For now, I just want to verify that I can access elements of the passed hash as a hash. So I'd love for the output file to look like:
hash_to_be_transformed: keyvalue1key2array_value1array_value2
element1: value1
element2: array_value1
However, in the output file, I see:
mgt21 ~ # cat /var/tmp/blah
keyvalue1key2array_value1array_value2
Clearly, something is off here: my text is not being added, and the full hash is printed out just once, seemingly in string form.
I believe this may be related to the error I get when I don't pass in a file name (see above). I think my hash is being interpreted (or passed) as a string and, as such, I am unable to access its elements. Unfortunately, I have still been unable to verify this or figure out why it might be happening.
---Edit2 based on Matt's answer below.
I decided to simplify my code to isolate this "can't convert Hash into String error". I also made his suggested changes to remove the ambiguity from my key declarations.
$my_hash = { 'key' => "value1" , 'key2' => ['array_value1', 'array_value2'] }
$my_display_element=$my_hash["key2"][0]
notify{ "new hash!! $my_hash. the first value of the array stored in element2 is: $my_display_element" :}
transform_service_hash($my_hash)
and
module Puppet::Parser::Functions
  newfunction(:transform_service_hash) do |args|
    hash_to_be_transformed = args[0]
    element1 = args[0]['key']
    element2 = args[0]['key2'][0]
    File.open('/var/tmp/blah', 'a') { |fd| fd.puts "hash_to_be_transformed: #{hash_to_be_transformed}\n" }
    File.open('/var/tmp/blah', 'a') { |fd| fd.puts "element1: #{element1}\n" }
    File.open('/var/tmp/blah', 'a') { |fd| fd.puts "element2: #{element2}\n" }
  end
end
But, I still end up with the same "Hash to String error". It is worth noting that I also tried simplifying my hash to:
$my_hash = { 'key' => "value1" , 'key2' => "value2" }
and I still get the "Hash to String error".
I quickly took your custom parser function and converted it into pure ruby like the following:
hash = { 'key' => 'value1', 'key2' => %w(array_value1 array_value2) }

def newfunction(filename, a_hash)
  element1 = a_hash['key']
  element2 = a_hash['key2'][0]
  File.open(filename, 'a') do |fd|
    fd.puts "hash_to_be_transformed: #{a_hash}"
    fd.puts "element1: #{element1}"
    fd.puts "element2: #{element2}"
  end
end

newfunction('foo.txt', hash)
This results in the output text file like the following:
hash_to_be_transformed: {"key"=>"value1", "key2"=>["array_value1", "array_value2"]}
element1: value1
element2: array_value1
This seems to confirm my initial suspicion about what is going wrong here. Your hash in Puppet of:
$my_hash = { key => "value1" , key2 => ['array_value1', 'array_value2'] }
has keys of implicit/ambiguous types. In the ruby code I used to test, I explicitly established them as strings. This also correlates strongly with these lines in your code failing:
element1 = args[1]["key"]
element2 = args[1]["key2"][0]
and your error message of:
Error 400 on SERVER: can't convert Hash into String
because you are specifying in your Ruby code that you expect the keys to be strings. Changing your hash in Puppet to:
$my_hash = { 'key' => "value1" , 'key2' => "value2" }
should fix this.
On an unrelated note, I recommend using linters while you learn these languages. Puppet-Lint, RuboCop, and Reek will all point out suboptimal and messy parts of your code.
On a related note, you may want to put something like this at the top of your custom parser function:
raise(Puppet::ParseError, 'newfunction expects two arguments') if args.length != 2
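Putting that guard together with the function from the question, a sketch (with the message adjusted to the actual function name) might look like:
module Puppet::Parser::Functions
  newfunction(:transform_service_hash) do |args|
    # Fail fast with a clear message if the wrong number of arguments is passed
    raise(Puppet::ParseError, 'transform_service_hash expects two arguments') if args.length != 2

    filename = args[0]
    hash_to_be_transformed = args[1]
    File.open(filename, 'a') { |fd| fd.puts hash_to_be_transformed['key'] }
  end
end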
After much gnashing of teeth (and some very helpful pointers from @MattSchuchard), I realized that none of the changes to my function were taking effect. One needs to restart the puppetmaster service after each change to a custom function: docs.puppet.com/guides/custom_functions.html (appropriately, under "Gotchas").
Once I started restarting this service after each change to the function, my hash was able to be parsed properly:
from the .pp file:
$filename = "/var/tmp/test"
$my_hash = { 'key' => "value1" , 'key2' => ["M\'lady\n*doffs cap*", 'array_value2'] }
transform_service_hash($filename, $my_hash)
from the ruby file:
module Puppet::Parser::Functions
  newfunction(:transform_service_hash) do |args|
    filename = args[0]
    hash_to_be_transformed = args[1]
    array_val = hash_to_be_transformed['key2'][0]
    File.open(filename, 'a') { |fd| fd.puts "#{array_val}\n" }
  end
end
and output:
mgt21 tmp # cat test
M'lady
*doffs cap*
I've struggled with this problem for a while, and I'm finally going to ask here for help.
Take a very straightforward hash that represents some event:
{
  :eventkey=>"someeventkey",
  :web_id=>"77d5f434-5a40-4582-88e8-9667b7774c7d",
  :apikey=>"eaf3b6e1-b020-41b6-b67f-98f1cc0a9590",
  :details=> {
    :phone=>"1-936-774-6886",
    :email=>"dasia_schuster#wisokytrantow.com",
    :pageUrl=>"http://ortiz.info/joe"
  }
}
My goal is to create a 'master record' for the entire hash, with the fields in the record being all the keys that do not contain values that are also hashes. When I run into a value that is a hash (in this case 'details'), I need to create a separate record for each k/v pair in that hash bearing the same record id as the parent master record.
I'm not getting the recursion right somehow. Ideally I would get back a single primary record:
{
  :recordid=>"some-generated-record-id",
  :web_id=>"77d5f434-5a40-4582-88e8-9667b7774c7d",
  :apikey=>"eaf3b6e1-b020-41b6-b67f-98f1cc0a9590",
  :details=>nil
}
And a distinct entry for each key in the nested hash:
{
  :recordid=>"some-generated-detail-record-id",
  :parentid=>"the-parent-id-from-the-master-record",
  :phone=>"1-936-774-6886"
}
{
  :recordid=>"another-generated-detail-record-id",
  :parentid=>"the-same-parent-id-from-the-master-record",
  :email=>"dasia_schuster#wisokytrantow.com"
}
And so on. I'm trying to get this set of records back as an array of hashes.
I've gotten as far as being able to generate the master record as well as a detail record, but the single detail record contains all of the detail keys rather than one record per key/value pair.
def eventToBreakout(eventhash, sequenceid = -1, parentrecordid = nil, records = [])
  recordid = SecureRandom.uuid
  sequenceid += 1
  recordstruc = { :record_id => recordid, :parent_record_id => parentrecordid, :record_processed_ts => Time.now, :sequence_id => sequenceid }
  eventhash.each_pair do |k, v|
    if recurse?(v)
      eventToBreakout(v, sequenceid, recordid, records)
    else
      if !recordstruc.keys.include?(k)
        recordstruc[k] = v
      end
    end
  end
  records << recordstruc
  records
end
I've included my code and here is the output I'm currently getting from it.
[{:record_id=>"ed98be89-4c1f-496e-beb4-ede5f38dd549",
:parent_record_id=>"fa77299b-95b0-429d-ad8a-f5d365f2f357",
:record_processed_ts=>2016-04-25 16:46:10 -0500,
:sequence_id=>1,
:phone=>"1-756-608-8114",
:email=>"hipolito_wilderman#morar.co",
:pageUrl=>"http://haag.net/alexie.marvin"},
{:record_id=>"fa77299b-95b0-429d-ad8a-f5d365f2f357",
:parent_record_id=>nil,
:record_processed_ts=>2016-04-25 16:46:10 -0500,
:sequence_id=>0,
:eventts=>2016-04-25 22:10:32 -0500,
:web_id=>"a61c57ae-3a01-4994-8803-8d8292df3338",
:apikey=>"9adbc7a4-03ff-4fcc-ac81-ae8d0ee01ef0"}]
Maybe you want something along these lines?
input = { id: 'parent', value: 'parent value', child: { child_value: 1 } }
record = {}

input.each do |k, v|
  if v.is_a? Hash
    v[:parent_id] = input[:id]
    (record[:children] ||= []) << v
  else
    record[k] = v
  end
end

puts record
# {:id=>"parent", :value=>"parent value", :children=>[{:child_value=>1, :parent_id=>"parent"}]}
By the way, this is a good example to get started with "spec" or "test" frameworks like Minitest or RSpec (both can be used for either style). You have already defined the input and the expected output, and "just" need to code until all test/spec runs are green.
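For instance, a minimal Minitest sketch (wrapping the snippet above in a made-up flatten_event method so there is something concrete to test) could look like this:
require 'minitest/autorun'

# Wrap the transformation from the snippet above in a method;
# flatten_event is a hypothetical name, not part of the original code
def flatten_event(input)
  record = {}
  input.each do |k, v|
    if v.is_a? Hash
      v[:parent_id] = input[:id]
      (record[:children] ||= []) << v
    else
      record[k] = v
    end
  end
  record
end

class FlattenEventTest < Minitest::Test
  def test_child_hash_becomes_child_record
    input = { id: 'parent', value: 'parent value', child: { child_value: 1 } }
    record = flatten_event(input)
    assert_equal 'parent', record[:id]
    assert_equal [{ child_value: 1, parent_id: 'parent' }], record[:children]
  end
end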
In the following Ruby example, is there a mode to have YAML NOT silently ignore the duplicate key 'one'?
irb(main):001:0> require 'yaml'
=> true
irb(main):002:0> str = '{ one: 1, one: 2 }'
=> "{ one: 1, one: 2 }"
irb(main):003:0> YAML.load(str)
=> {"one"=>2}
Thanks!
Using Psych, you can traverse the AST to find duplicate keys. I'm using the following helper method in my test suite to validate that there are no duplicate keys in my i18n translations:
require 'psych'

def duplicate_keys(file_or_content)
  yaml = file_or_content.is_a?(File) ? file_or_content.read : file_or_content
  duplicate_keys = []

  validator = ->(node, parent_path) do
    if node.is_a?(Psych::Nodes::Mapping)
      children = node.children.each_slice(2) # In a Mapping, every other child is a key node; the other is the value node.
      duplicates = children.map { |key_node, _value_node| key_node }.group_by(&:value).select { |_value, nodes| nodes.size > 1 }
      duplicates.each do |key, nodes|
        duplicate_key = {
          file: (file_or_content.path if file_or_content.is_a?(File)),
          key: parent_path + [key],
          occurrences: nodes.map { |occurrence| "line: #{occurrence.start_line + 1}" },
        }.compact
        duplicate_keys << duplicate_key
      end
      # Note: `try` comes from ActiveSupport; outside Rails, replace it with
      # `key_node.respond_to?(:value) ? key_node.value : nil`.
      children.each { |key_node, value_node| validator.call(value_node, parent_path + [key_node.try(:value)].compact) }
    else
      node.children.to_a.each { |child| validator.call(child, parent_path) }
    end
  end

  ast = Psych.parse_stream(yaml)
  validator.call(ast, [])
  duplicate_keys
end
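A quick usage sketch (the i18n path is only an example; the helper accepts either a File object or a raw YAML string):
# Scan a YAML file on disk for duplicate keys
duplicate_keys(File.open('config/locales/en.yml'))

# Or scan a raw YAML string; this reports the duplicated "one" key with its line number
duplicate_keys('{ one: 1, one: 2 }')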
There is a solution involving a linter, but I'm not sure it will be relevant to you since it's not a 100% Ruby solution. I'll post it anyway since I don't know any way to do this in Ruby:
You can use the yamllint command-line tool:
sudo pip install yamllint
Specifically, it has a rule key-duplicates that detects duplicated keys:
$ cat test.yml
{ one: 1, one: 2 }
$ yamllint test.yml
test.yml
1:11 error duplication of key "one" in mapping (key-duplicates)
One of the things I do to help maintain the YAML files I use is write code that initially generates them from a known structure in Ruby. That gets me started.
Then, I'll write a little snippet that loads it and outputs what it parsed using either PrettyPrint or Awesome Print so I can compare that to the file.
I also sort the fields as necessary to make it easy to look for duplicates.
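A rough sketch of that workflow (the file name and structure here are just placeholders):
require 'yaml'
require 'pp'

# Generate the YAML file from a known Ruby structure to start with
config = { 'one' => 1, 'two' => 2 }
File.write('settings.yml', config.to_yaml)

# Later, load it back and pretty-print what was actually parsed,
# sorting the keys to make duplicates and drift easier to spot
parsed = YAML.load_file('settings.yml')
pp parsed.sort.to_h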
No. You'd have to decide how to rename the keys, since hash keys have to be unique. I'd go for a workaround like manually looking for keys that are the same and renaming them before you do a YAML::load.
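A rough sketch of that kind of workaround (very naive: it only matches simple block-style "key:" lines, ignores nesting, and the file name is just a placeholder):
require 'yaml'

raw = File.read('config.yml')

# Naively collect "key:" occurrences and flag repeats before loading,
# so they can be renamed by hand (tally requires Ruby 2.7+)
keys = raw.scan(/^\s*([\w.-]+)\s*:/).flatten
dupes = keys.tally.select { |_key, count| count > 1 }.keys
warn "Duplicate keys found: #{dupes.join(', ')}" if dupes.any?

data = YAML.load(raw)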