I have data in the format:
data={"services"=>[{"name"=>"foo","checks"=>["script1","script2"]},
{"name"=>"bar","checks"=>["script3"]}]}
I am trying to replace each check in "checks" with "/bin/#{check}". The code I was playing with is:
data["services"].map! {|svc|
svc["checks"].map! {|check|
check = "/bin/#{check}"
}
}
But this code replaces the whole service hash instead of just one key, and I get:
{"services"=>[["/bin/script1", "/bin/script2"], ["/bin/script3"]]}
Can I use map! in deeply nested arrays of hashes of arrays of... ? Or how else can I do what I need?
I suggest avoiding changing data and just return a new hash:
new_data = data.each_with_object({}) do |(service, values), hash|
  hash[service] = values.map do |value|
    checks = value['checks'].map { |check| "/bin/#{check}" }
    value.merge('checks' => checks)
  end
end
=> {"services"=>[{"name"=>"foo", "checks"=>["/bin/script1", "/bin/script2"]}, {"name"=>"bar", "checks"=>["/bin/script3"]}]}
Context and Code Examples
I have an Array with instances of a class called TimesheetEntry.
Here is the constructor for TimesheetEntry:
def initialize(parameters = {})
  @date = parameters.fetch(:date)
  @project_id = parameters.fetch(:project_id)
  @article_id = parameters.fetch(:article_id)
  @hours = parameters.fetch(:hours)
  @comment = parameters.fetch(:comment)
end
I create an array of TimesheetEntry objects with data from a .csv file:
timesheet_entries = []
CSV.parse(source_file, csv_parse_options).each do |row|
  timesheet_entries.push(TimesheetEntry.new(
    :date => Date.parse(row['Date']),
    :project_id => row['Project'].to_i,
    :article_id => row['Article'].to_i,
    :hours => row['Hours'].gsub(',', '.').to_f,
    :comment => row['Comment'].to_s.empty? ? "N/A" : row['Comment']
  ))
end
I also have a Set of Hashes, each containing two elements, created like this:
all_timesheets = Set.new []
timesheet_entries.each do |entry|
  all_timesheets << { 'date' => entry.date, 'entries' => [] }
end
Now, I want to populate the Array inside each of those Hashes with TimesheetEntry objects.
Each Hash's array must contain only the TimesheetEntry objects of one specific date.
I have done that like this:
timesheet_entries.each do |entry|
  all_timesheets.each do |timesheet|
    if entry.date == timesheet['date']
      timesheet['entries'].push entry
    end
  end
end
While this approach gets the job done, it's not very efficient (I'm fairly new to this).
Question
What would be a more efficient way of achieving the same end result? In essence, I want to "split" the Array of TimesheetEntry objects, "grouping" objects with the same date.
You can fix the performance problem by replacing the Set with a Hash, which is a dictionary-like data structure.
This means that your inner loop all_timesheets.each do |timesheet| ... if entry.date ... will simply be replaced by a more efficient hash lookup: all_timesheets[entry.date].
Also, there's no need to create the keys in advance and then populate the date groups. These can both be done in one go:
all_timesheets = {}
timesheet_entries.each do |entry|
  all_timesheets[entry.date] ||= []  # create the key if it's not already there
  all_timesheets[entry.date] << entry
end
A nice thing about hashes is that you can customize their behavior when a non-existing key is encountered. You can use the constructor that takes a block to specify what happens in this case. Let's tell our hash to automatically add new keys and initialize them with an empty array. This allows us to drop the all_timesheets[entry.date] ||= [] line from the above code:
all_timesheets = Hash.new { |hash, key| hash[key] = [] }
timesheet_entries.each do |entry|
  all_timesheets[entry.date] << entry
end
There is, however, an even more concise way of achieving this grouping, using the Enumerable#group_by method:
all_timesheets = timesheet_entries.group_by { |e| e.date }
And, of course, there's a way to make this even more concise, using yet another trick:
all_timesheets = timesheet_entries.group_by(&:date)
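For illustration, here is the shape group_by produces, using hypothetical OpenStruct stand-ins for TimesheetEntry:
require 'date'
require 'ostruct'

entries = [
  OpenStruct.new(date: Date.new(2024, 1, 1), hours: 2.0),
  OpenStruct.new(date: Date.new(2024, 1, 1), hours: 3.5),
  OpenStruct.new(date: Date.new(2024, 1, 2), hours: 1.0)
]

entries.group_by(&:date)
# => a Hash whose keys are the Date objects and whose values are arrays of the
#    entries that share that date: the first two entries under 2024-01-01 and
#    the third under 2024-01-02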
I have an arbitrary list of file names I'd like to sort into a hash. I'd like to do it like this:
## Example file name: "hello.world.random_hex"
file_name_list.each do |file|
  name_array = file.split('.')
  files[name_array[0].to_sym][name_array[1].to_sym] << file
end
Those keys may not exist and I'd like for them to be automatically created with a default value of [] so the << works as expected. The final files hash would look like:
{ :hello => { :world => [ "hello.world.random_hex", "hello.world.other_random_hex" ] } }
How can I initialize files to accomplish this?
If there are always two levels of keys like this, you can do it using the block form of Hash.new:
files = Hash.new { |hash, key| hash[key] = Hash.new { |h, k| h[k] = [] } }
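For example, a quick sketch of how the loop from the question behaves with that default in place:
files = Hash.new { |hash, key| hash[key] = Hash.new { |h, k| h[k] = [] } }

["hello.world.random_hex", "hello.world.other_random_hex"].each do |file|
  name_array = file.split('.')
  files[name_array[0].to_sym][name_array[1].to_sym] << file
end
# files => {:hello=>{:world=>["hello.world.random_hex", "hello.world.other_random_hex"]}}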
(On the other hand, if the keys can be nested to an arbitrary depth, this is much harder because the Hash can't know whether the value for a nonexistent key should be a Hash or an Array at the time it is accessed.)
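As an aside (not part of the answer above), the usual trick for arbitrary depth is a hash whose default_proc reuses itself; note that it can only produce hashes at every level, never an array at the leaves:
nested = Hash.new { |h, k| h[k] = Hash.new(&h.default_proc) }
nested[:a][:b][:c] = 1
# nested => {:a=>{:b=>{:c=>1}}}
# nested[:a][:b] << "x" would raise NoMethodError, because the default value is
# always another Hash, never an Array; that is exactly the ambiguity described above.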
I have been using Ruby for a while, but this is my first time doing anything with a database. I've been playing around with MongoDB for a bit and, at this point, I've begun to try and populate a simple database.
Here is my problem. I have a text file containing data in a particular format. When I read that file in, the data is stored in nested arrays like so:
dataFile = ["sectionName", ["key1", "value1"], ["key2", "value2", ["key3", ["value3A", "value3B"]]]
The format is always the same: the first element of the array is a string, and each subsequent element is an array formatted as a key/value pair. The value, however, can be a string, an array of two strings, or a series of arrays that contain their own key/value array pairs. I don't know any details about the data file before I read it in, only that it conforms to these rules.
I want to read this into a Mongo database, preserving this basic structure. So, for instance, if I were to do this by hand, it would look like this:
newDB = mongo_client.db("newDB")
newCollection = newDB["dataFile1"]
doc = {"section_name" => "sectionName", "key1" => "value1", "key2" => "value2", "key3" => ["value3A", "value3B"]}
ID = newCollection.insert(doc)
I know there has to be an easy way to do this. So far, I've been trying various recursive functions to parse the data out, turn it into mongo commands and try to populate my database. But it just feels clunky, like there is a better way. Any insight into this problem would be appreciated.
The value that you gave for the variable dataFile isn't a valid array, because it is missing a closing square bracket.
If we make the definition of dataFile a valid line of Ruby code, the following code yields the hash that you described. It uses map.with_index to visit each element of the array and transforms that array into a new array of key/value hashes. The transformed array of hashes is then flattened and converted into a single hash using the inject method.
dataFile = ["sectionName", ["key1", "value1"], ["key2", "value2", ["key3", ["value3A", "value3B"]]]]
puts dataFile.map.with_index {
|e, ix|
case ix
when 0
{ "section_name" => e }
else
list = []
list.push( { e[0] => e[1] } )
if( e.length > 2 )
list.push(
e[2..e.length-1].map {|p|
{ p[0] => p[1] }
}
)
end
list
end
}.flatten.inject({ }) {
|accum, e|
key = e.keys.first
accum[ key ] = e[ key ]
accum
}.inspect
The output looks like:
{"section_name"=>"sectionName", "key1"=>"value1", "key2"=>"value2", "key3"=>["value3A", "value3B"]}
For input that looked like this:
["sectionName", ["key1", "value1"], ["key2", "value2", ["key3", ["value3A", "value3B"]], ["key4", ["value4A", "value4B"]]], ["key5", ["value5A", "value5B"]]]
We would see:
{"section_name"=>"sectionName", "key1"=>"value1", "key2"=>"value2", "key3"=>["value3A", "value3B"], "key4"=>["value4A", "value4B"], "key5"=>["value5A", "value5B"]}
Note the arrays "key3" and "key4", which is what I consider as being called a series of arrays. If the structure has array of arrays of unknown depth then we would need a different implementation - maybe use an array to keep track of the position as the program walks through this arbitrarily nested array of arrays.
In the following test, please find two solutions.
The first converts the input to a nested Hash, which I think is what you want, without flattening the input data.
The second stores the key-value pairs exactly as given in the input.
I've chosen to fix the missing closing square bracket in a way that preserves the key-value pairs.
The major message here is that while the top-level data structure for MongoDB is a document, mapped to a Ruby Hash and therefore key-value by definition, the values can be any shape, including nested arrays or hashes.
So I hope the test examples cover the range and show that you can match the storage in MongoDB to fit your needs.
test.rb
require 'mongo'
require 'test/unit'
require 'pp'
class MyTest < Test::Unit::TestCase
  def setup
    @coll = Mongo::MongoClient.new['test']['test']
    @coll.remove
    @dataFile = ["sectionName", ["key1", "value1"], ["key2", "value2"], ["key3", ["value3A", "value3B"]]]
    @key, *@value = @dataFile
  end

  test "nested array data as hash value" do
    input_doc = {@key => Hash[*@value.flatten(1)]}
    @coll.insert(input_doc)
    fetched_doc = @coll.find.first
    assert_equal(input_doc[@key], fetched_doc[@key])
    puts "#{name} fetched hash value doc:"
    pp fetched_doc
  end

  test "nested array data as array value" do
    input_doc = {@key => @value}
    @coll.insert(input_doc)
    fetched_doc = @coll.find.first
    assert_equal(input_doc[@key], fetched_doc[@key])
    puts "#{name} fetched array doc:"
    pp fetched_doc
  end
end
ruby test.rb
$ ruby test.rb
Loaded suite test
Started
test: nested array data as array value(MyTest) fetched array doc:
{"_id"=>BSON::ObjectId('5357d4ac7f11ba0678000001'),
"sectionName"=>
[["key1", "value1"], ["key2", "value2"], ["key3", ["value3A", "value3B"]]]}
.test: nested array data as hash value(MyTest) fetched hash value doc:
{"_id"=>BSON::ObjectId('5357d4ac7f11ba0678000002'),
"sectionName"=>
{"key1"=>"value1", "key2"=>"value2", "key3"=>["value3A", "value3B"]}}
.
Finished in 0.009493 seconds.
2 tests, 2 assertions, 0 failures, 0 errors, 0 pendings, 0 omissions, 0 notifications
100% passed
210.68 tests/s, 210.68 assertions/s
I have the following code to create an array to object hash:
tp = TupleProfile.new(98, 99)
keyDict = Hash[Array[98,99] => tp]
keyDict[[98,99]].addLatency(0.45)
puts keyDict[[98,99]].getAvg()
This works, but I'd like to be able to call addLatency without checking for an existing hash value:
keyDict[[100,98]].addLatency(0.45) #throws error right now
So I want to create a default value that varies based on the key, something like:
keyDict = Hash.new(TupleProfile.new(theKey[0], theKey[1]))
Where theKey is some sort of special directive. Is there any reasonably clean way to do this, or am I better off checking each time or making a wrapper class for the hash?
Try the Hash.new block notation:
keyDict = Hash.new { |hash, key| hash[key] = TupleProfile.new(*key) }
Using the standard parameter notation (Hash.new(xyz)) would only instantiate a single TupleProfile object for the whole hash; with the block form, there will be one for each individual key.
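With that in place, the call from the question should work without pre-populating the key (assuming TupleProfile responds to addLatency as shown earlier):
keyDict = Hash.new { |hash, key| hash[key] = TupleProfile.new(*key) }
keyDict[[100, 98]].addLatency(0.45)  # the TupleProfile is created on first access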
If I understand your question, I think you might be able to use a default procedure. The code in the default procedure will get run if you ask for a key that doesn't exist. Here is an example using a tuple key:
class Test
  def initialize(a, b); @a = a; @b = b; end
  attr_accessor :a, :b
end

keyDict = {}
keyDict.default_proc = proc do |hash, (key1, key2)|
  hash[[key1, key2]] = Test.new(key1, key2)
end

keyDict[[99,200]]
=> #<Test:0x007f9681ad2720 @a=99, @b=200>
keyDict[[99,200]].a
=> 99
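One side effect of assigning inside the block or default proc (in both answers) is that merely reading a missing key also stores the newly created object:
keyDict.key?([99, 200])  # => true, because the lookup above created and stored the Test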