I want to store multidimensional arrays in text files and reload them efficiently. The tricky part is that the array includes strings which could look like " ] , [ \\\"" or anything.
The easiest way of writing the array to a file is just my_array.inspect (right?)
How do I then recreate the array as quickly and painlessly as possible from a string read back from the text file that might look like "[\" ] , [ \\\\\\\"\"]" (as in the above case)?
If your array only includes objects with literal representations, such as numbers, strings, arrays, and hashes, you can use eval.
a = [1, 2, 3].inspect
# => "[1, 2, 3]"
eval(a)
# => [1, 2, 3]
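For example, the awkward string from the question survives the round-trip, because inspect emits a valid, fully escaped Ruby literal (a quick sketch):

```ruby
# The awkward string from the question: a literal ` ] , [ \"`
s = " ] , [ \\\""
a = [[1, s], [2.5, "plain"]]

serialized = a.inspect        # produces a valid Ruby array literal
restored   = eval(serialized) # parse it back

restored == a  #=> true
```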
In my opinion, this sounds like too much trouble. Use YAML instead.
require 'yaml'
a = [ [ [], [] ], [ [], [] ] ]
File.open("output.yml", "w") do |f|
  f.write a.to_yaml
end
b = YAML.load_file('output.yml')
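The YAML round-trip is lossless for the awkward strings from the question, too; a quick in-memory sketch (same idea as the file version above, minus the file):

```ruby
require 'yaml'

tricky = " ] , [ \\\""   # the problematic string from the question
a = [[tricky, "plain"], [1, 2]]

restored = YAML.load(a.to_yaml)
restored == a  #=> true
```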
As an alternative, you could use JSON instead.
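A minimal JSON sketch; note that JSON only supports strings, numbers, arrays, hashes, booleans, and nil (symbols become strings, and hash keys come back as strings):

```ruby
require 'json'

a = [["tricky ] , [ \"", 1.5], [2, 3]]

File.write("output.json", a.to_json)
restored = JSON.parse(File.read("output.json"))
restored == a  #=> true
```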
Yet another option is Marshal. Say you have an array ary; you could write it to a file:
File.open(path, 'w') { |f| f.write Marshal.dump(ary) }
and then re-create the array by reading the file into a string and saying
ary = Marshal.load(File.read(path))
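Marshal also round-trips the awkward strings (and symbols, which JSON does not preserve); the trade-off is a binary, Ruby-version-sensitive format, so it suits caches better than interchange. A quick in-memory sketch:

```ruby
# In-memory round-trip; Marshal.dump output is binary and Ruby-specific
a = [[" ] , [ \\\"", 1], [:sym, 2.5]]

restored = Marshal.load(Marshal.dump(a))
restored == a  #=> true
```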
Related
Within Chef, the attribute looks as below:
default['cluster']['ipaddress'] = ["10.211.130.108", "10.211.242.203"]
Within the Chef recipe, I have tried to put every array element in double quotes using map:
json_nodes = node['consul']['cluster']['ipaddress'].map { |s| "#{s.to_s}:8300" }
bash 'configuring file.json' do
  code <<-EOH
    echo #{json_nodes} > "/home/user1/file.json"
  EOH
end
I get the following output within the file /home/user1/file.json:
[10.211.130.108:8300, 10.211.242.203:8300]
The output should have double quotes as follows:
["10.211.130.108:8300", "10.211.242.203:8300"]
How about:
json_nodes = node['cluster']['ipaddress'].map { |s| "#{s.to_s}:8300" }
I don't have your data but here is an example from my console:
[3] pry(main)> arr = [1,2,3,4,5]
=> [1, 2, 3, 4, 5]
[4] pry(main)> arr.map { |s| "#{s.to_s}:3000" }
=> ["1:3000", "2:3000", "3:3000", "4:3000", "5:3000"]
Note that when I put .join on the end, it concatenates them all into a single string.
[8] pry(main)> arr.map { |s| "#{s.to_s}:3000" }.join(',')
=> "1:3000,2:3000,3:3000,4:3000,5:3000"
Things become a lot easier when you use Chef the way it is meant to be used (and don't shell out to bash). Translated into idiomatic Chef, your problem becomes:
file "/home/user1/file.json" do
  content node['consul']['cluster']['ipaddress'].map { |s| "#{s}:8300" }.to_s
end
If you need non-root ownership and read permissions, have a look at the owner and mode options of the file resource.
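As an aside, Array#to_s only happens to look like JSON here (it delegates to inspect); if the file must be guaranteed-valid JSON, to_json is the safer choice. A quick comparison, reusing the addresses from the question:

```ruby
require 'json'

nodes = ["10.211.130.108", "10.211.242.203"].map { |s| "#{s}:8300" }

nodes.to_s    #=> "[\"10.211.130.108:8300\", \"10.211.242.203:8300\"]"
nodes.to_json #=> "[\"10.211.130.108:8300\",\"10.211.242.203:8300\"]"
```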
This code will do what you specified in your own answer. It's a bit strange, though; the string you are aiming for looks like an array literal.
a = ['10.211.130.108','10.211.242.203']
a.map! { |s| "\"#{s}:8300\"" }
result = "[#{a.join(',')}]"
puts result
# Output: ["10.211.130.108:8300","10.211.242.203:8300"]
I was able to get double quotes using the following:
json_nodes = node['cluster']['ipaddress'].map { |s| "\"#{s.to_s}:8300\"" }
I'm still struggling with a basic problem I have not found an answer to online.
I am getting CSV like data as name and quantity:
Foo, 1.5
Bar, 1.2
Foo, 1.1
...
And want to consolidate it to unique names with the totals as a new value:
Foo, 2.6 #total of both Foo lines
Bar, 1.2
...
The data set is never large, but the task is quite repetitive.
I tried converting it into an array of hashes, finding the unique names, and then using inject, but it got quite complicated and did not work. Looping through everything by hand also seems less than ideal.
Does anyone have a nice and easy idea or solution I am missing? (I only found "Extract value from row in csv and sum it" for PHP.)
First of all, you can use Ruby's CSV library to parse and convert your CSV data:
require 'csv'
csv_data = "Foo, 1.5\nBar, 1.2\nFoo, 1.1"
data_array = CSV.parse(csv_data, converters: :numeric)
#=> [["Foo", 1.5], ["Bar", 1.2], ["Foo", 1.1]]
To sum the values I'd use a hash along with each_with_object:
data_array.each_with_object(Hash.new(0)) { |(k, v), h| h[k] += v }
#=> {"Foo"=>2.6, "Bar"=>1.2}
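Going the other way, once you have the totals hash, CSV.generate rebuilds the consolidated CSV text (a small sketch; the hash literal mirrors the result above):

```ruby
require 'csv'

totals = { "Foo" => 2.6, "Bar" => 1.2 }

csv_out = CSV.generate do |csv|
  totals.each { |name, qty| csv << [name, qty] }
end
csv_out  #=> "Foo,2.6\nBar,1.2\n"
```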
Passing 0.0 as the default option for your Hash accounts nicely for the first occurrence of each item:
input = [ ['Foo', 1.5],
          ['Bar', 1.2],
          ['Foo', 1.1] ]
result = input.inject(Hash.new(0.0)) do |sum, (key, value)|
  sum[key] += value
  sum
end
p result
#=> {"Foo"=>2.6, "Bar"=>1.2}
An array of arrays seems the easiest approach.
Let's say that (avoiding the constant name CSV, which the CSV library already takes):
myCSV = [["foo", 1.5], ["bar", 1.2], ["foo", 1.1]]
Just do:
myCSV.each_with_object(Hash.new(0.0)) { |row, sum| sum[row[0]] += row[1] }
=> {
"foo" => 2.6,
"bar" => 1.2
}
If you are reading from a file, it's more or less the same using the CSV library (the :numeric converter makes the quantities arrive as floats rather than strings):
require 'csv'
sum = Hash.new(0.0)
CSV.foreach("path/to/file.csv", converters: :numeric) do |row|
  sum[row[0]] += row[1]
end
Using Ruby 2.1 (with ActiveSupport 3.x, if that helps), I want to convert an array like this:
[ :apples, :bananas, :strawberries ]
Into a hash like this:
{ :apples => 20, :bananas => 20, :strawberries => 20 }
Technically, this works:
array = [ :apples, :bananas, :strawberries ]
hash = Hash[array.zip(Array.new(array.length, 20))]
# => {:apples=>20, :bananas=>20, :strawberries=>20}
But that seems really clunky, and I feel like there's a more straightforward way to do this. Is there one?
I looked at Enumerable#zip as well as the default value option for Hash#new but didn't see anything providing a simple method for this conversion.
Another answer:
ary = [ :apples, :bananas, :strawberries ]
Hash[[*ary.each_with_object(20)]]
# => {:apples=>20, :bananas=>20, :strawberries=>20}
Alternatively (as pointed out by the OP):
ary.each_with_object(20).to_h
# => {:apples=>20, :bananas=>20, :strawberries=>20}
Basically, calling each_with_object returns an Enumerator object of pairs consisting of each value and the number 20 (i.e. [:apples, 20], ...) which can subsequently be converted to a hash.
Use Hash[]:
Hash[array.map { |f| [f, 20] }]
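On Ruby 2.6 and newer (so newer than the 2.1 in the question), to_h also accepts a block, which keeps the left-to-right flow without the Hash[] wrapper:

```ruby
array = [:apples, :bananas, :strawberries]
array.to_h { |f| [f, 20] }
#=> {:apples=>20, :bananas=>20, :strawberries=>20}
```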
I think Array#product will be helpful here:
ary = [ :apples, :bananas, :strawberries ]
Hash[ary.product([20])]
# => {:apples=>20, :bananas=>20, :strawberries=>20}
I'm building a config file for one of our inline apps. It's essentially a JSON file. I'm having a lot of trouble getting Puppet/Ruby 1.8 to output the hash/JSON the same way each time.
I'm currently using
<%= require "json"; JSON.pretty_generate data %>
While that outputs human-readable content, it doesn't guarantee the same order each time, which means Puppet will often send out change notifications for unchanged data.
I've also tried
<%= require "json"; JSON.pretty_generate Hash[*data.sort.flatten] %>
Which will generate the same data/order each time. The problem comes when data has a nested array.
data => { beanstalkd => [ "server1", ] }
becomes
"beanstalkd": "server1",
instead of
"beanstalkd": ["server1"],
I've been fighting with this on and off for a few days now, so I would like some help.
Since hashes in Ruby 1.9+ preserve insertion order, and the question is tagged with ruby, here's a method that will sort a hash recursively (without affecting the ordering of arrays):
def sort_hash(h)
  {}.tap do |h2|
    h.sort.each do |k, v|
      h2[k] = v.is_a?(Hash) ? sort_hash(v) : v
    end
  end
end
h = {a:9, d:[3,1,2], c:{b:17, a:42}, b:2 }
p sort_hash(h)
#=> {:a=>9, :b=>2, :c=>{:a=>42, :b=>17}, :d=>[3, 1, 2]}
require 'json'
puts sort_hash(h).to_json
#=> {"a":9,"b":2,"c":{"a":42,"b":17},"d":[3,1,2]}
Note that this will fail catastrophically if your hash has keys that cannot be compared. (If your data comes from JSON, this will not be the case, since all keys will be strings.)
Hash is an unordered data structure. In some languages (ruby, for example) there's an ordered version of hash, but in most cases in most languages you shouldn't rely on any specific order in a hash.
If order is important to you, you should use an array. So, your hash
{a: 1, b: 2}
becomes this
[{a: 1}, {b: 2}]
I think it doesn't force too many changes in your code.
Workaround for your situation
Try this:
data = {beanstalkId: ['server1'], ccc: 2, aaa: 3}
data2 = data.keys.sort.map {|k| [k, data[k]]}
puts Hash[data2]
#=> {:aaa=>3, :beanstalkId=>["server1"], :ccc=>2}
I've got an array of hashes representing objects as a response to an API call. I need to pull data from some of the hashes, and one particular key serves as an id for the hash object. I would like to convert the array into a hash with the keys as the ids, and the values as the original hash with that id.
Here's what I'm talking about:
api_response = [
  { :id => 1, :foo => 'bar' },
  { :id => 2, :foo => 'another bar' },
  # ..
]
ideal_response = {
  1 => { :id => 1, :foo => 'bar' },
  2 => { :id => 2, :foo => 'another bar' },
  # ..
}
There are two ways I could think of doing this:
Map the data to the ideal_response (below).
Use api_response.find { |x| x[:id] == i } for each record I need to access.
What I'm hoping for is a third method I'm unaware of, possibly a way of using map to build a hash natively.
My method of mapping:
keys = data.map { |x| x[:id] }
mapped = Hash[*keys.zip(data).flatten]
I can't help but feel like there is a more performant, tidier way of doing this. Option 2 is very performant when there are a very minimal number of records that need to be accessed. Mapping excels here, but it starts to break down when there are a lot of records in the response. Thankfully, I don't expect there to be more than 50-100 records, so mapping is sufficient.
Is there a smarter, tidier, or more performant way of doing this in Ruby?
Ruby <= 2.0
> Hash[api_response.map { |r| [r[:id], r] }]
#=> {1=>{:id=>1, :foo=>"bar"}, 2=>{:id=>2, :foo=>"another bar"}}
However, Hash::[] is pretty ugly and breaks the usual left-to-right OOP flow. That's why Facets proposed Enumerable#mash:
> require 'facets'
> api_response.mash { |r| [r[:id], r] }
#=> {1=>{:id=>1, :foo=>"bar"}, 2=>{:id=>2, :foo=>"another bar"}}
This basic abstraction (convert enumerables to hashes) was asked to be included in Ruby long ago, alas, without luck.
Note that your use case is covered by Active Support: Enumerable#index_by
Ruby >= 2.1
[UPDATE] Still no love for Enumerable#mash, but now we have Array#to_h. It creates an intermediate array, but it's better than nothing:
> object = api_response.map { |r| [r[:id], r] }.to_h
Something like:
ideal_response = api_response.group_by{|i| i[:id]}
#=> {1=>[{:id=>1, :foo=>"bar"}], 2=>[{:id=>2, :foo=>"another bar"}]}
It uses Enumerable's group_by, which works on collections and returns matches for whatever key value you want. Because it expects to find multiple occurrences of matching key values, it appends the hits to arrays, so you end up with a hash of arrays of hashes. You could peel back the inner arrays if you wanted, but you would risk overwriting content if two of your hash IDs collided; group_by avoids that with the inner array.
Accessing a particular element is easy:
ideal_response[1][0] #=> {:id=>1, :foo=>"bar"}
ideal_response[1][0][:foo] #=> "bar"
The way you show at the end of the question is another valid way of doing it. Both are reasonably fast and elegant.
For this I'd probably just go:
ideal_response = api_response.each_with_object(Hash.new) { |o, h| h[o[:id]] = o }
Not super pretty with the multiple brackets in the block but it does the trick with just a single iteration of the api_response.