I'm stuck trying to safely navigate a hash from json.
The JSON could contain just a string, eg:
h2 = { location: 'Australia' }
or it could be further nested:
h1 = { location: { formatted: 'Australia', code: 'AU' } }
If I dig into the flat one:
h2.dig(:location, :formatted)
I get TypeError: String does not have #dig method.
Basically I'm trying to load the JSON and then populate the Rails model with whatever data is available, which may be optional. It seems backwards to check every nested step with an if.
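For illustration, this is the sort of guard-every-step code I'd like to avoid (json here stands for the already-parsed hash; the field names are just examples):
json = { location: 'Australia' } # could also be the nested form
formatted =
  if json[:location].is_a?(Hash)
    json[:location][:formatted]
  end
# formatted => nil here; with the nested form it would be "Australia"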
Hash#dig has no magic. It reduces over the arguments, recursively calling Hash#[] on whatever was returned from the previous call.
h1 = { location: { formatted: 'Australia', code: 'AU' } }
h1.dig :location, :code
#⇒ "AU"
It works because h1[:location] returned a hash.
h2 = { location: 'Australia' }
h2.dig :location, :code
It raises because h2[:location] returned a String.
That said, the solution would be to reimplement Hash#dig, as usual :)
It's extremely trivial: just take the list of keys to dig and (surprise) reduce over it, returning either the value or nil.
%i|location code|.reduce(h2) do |acc, e|
  acc.is_a?(Hash) ? acc[e] : nil
end
#⇒ nil

%i|location code|.reduce(h1) do |acc, e|
  acc.is_a?(Hash) ? acc[e] : nil
end
#⇒ "AU"
Shameless plug: you might find the iteraptor gem, which I created for this exact purpose, useful.
You can use a simple piece of code like that:
def nested_value(hash, key)
  return hash if key == ''
  keys = key.split('.')
  value = hash[keys.first] || hash[keys.first.to_sym]
  return value unless value.is_a?(Hash)
  nested_value(value, keys[1..-1].join('.'))
end
h1 = { location: { formatted: 'Australia', code: 'AU' } }
h2 = { 'location' => 'Australia' }
p nested_value(h1, 'location.formatted') # => Australia
p nested_value(h2, 'location.formatted') # => Australia
You can also use that method to get any nested value of a hash by providing the key in the format foo.bar.baz.qux. The method also doesn't care whether the hash has string keys or symbol keys.
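For instance, even mixed string and symbol keys resolve (a quick check of my own, with a hypothetical h3):
h3 = { 'location' => { formatted: 'Australia' } }
p nested_value(h3, 'location.formatted') # => "Australia"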
I don't know if this leads to the expected behaviour (see the examples below), but you can define a patch for the Hash class as follows:
module MyHashPatch
  def safe_dig(params) # ok, call as you like..
    tmp = self
    res = nil
    params.each do |param|
      if (tmp.is_a? Hash) && (tmp.has_key? param)
        tmp = tmp[param]
        res = tmp
      else
        break
      end
    end
    res
  end
end
Hash.include MyHashPatch
Then test on your hashes:
h1 = { location: { formatted: 'Australia', code: 'AU' } }
h2 = { location: 'Australia' }
h1.safe_dig([:location, :formatted]) #=> "Australia"
h2.safe_dig([:location, :formatted]) #=> "Australia"
h1.safe_dig([:location, :code]) #=> "AU"
h2.safe_dig([:location, :code]) #=> "Australia"
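One thing to note (my own observation from reading the implementation above, not part of the original answer): the loop keeps the last value it successfully found instead of returning nil, so digging past a missing key falls back to that last value:
h1.safe_dig([:location, :missing]) #=> { formatted: 'Australia', code: 'AU' }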
Given I have this hash:
h = { a: 'a', b: 'b', c: { d: 'd', e: 'e'} }
And I convert to OpenStruct:
o = OpenStruct.new(h)
=> #<OpenStruct a="a", b="b", c={:d=>"d", :e=>"e"}>
o.a
=> "a"
o.b
=> "b"
o.c
=> {:d=>"d", :e=>"e"}
o.c.d
NoMethodError: undefined method `d' for {:d=>"d", :e=>"e"}:Hash
I want all the nested keys to be methods as well. So I can access d as such:
o.c.d
=> "d"
How can I achieve this?
You can monkey-patch the Hash class
require 'json'
require 'ostruct'

class Hash
  def to_o
    JSON.parse to_json, object_class: OpenStruct
  end
end
then you can say
h = { a: 'a', b: 'b', c: { d: 'd', e: 'e'} }
o = h.to_o
o.c.d # => 'd'
See Convert a complex nested hash to an object.
I came up with this solution:
h = { a: 'a', b: 'b', c: { d: 'd', e: 'e'} }
json = h.to_json
=> "{\"a\":\"a\",\"b\":\"b\",\"c\":{\"d\":\"d\",\"e\":\"e\"}}"
object = JSON.parse(json, object_class:OpenStruct)
object.c.d
=> "d"
So for this to work, I had to do an extra step: convert it to json.
Personally I use the recursive-open-struct gem - it's then as simple as RecursiveOpenStruct.new(<nested_hash>).
But for the sake of recursion practice, I'll show you a fresh solution:
require 'ostruct'
def to_recursive_ostruct(hash)
  result = hash.each_with_object({}) do |(key, val), memo|
    memo[key] = val.is_a?(Hash) ? to_recursive_ostruct(val) : val
  end
  OpenStruct.new(result)
end
puts to_recursive_ostruct(a: { b: 1}).a.b
# => 1
edit
Weihang Jian showed a slight improvement to this here https://stackoverflow.com/a/69311716/2981429
def to_recursive_ostruct(hash)
  hash.each_with_object(OpenStruct.new) do |(key, val), memo|
    memo[key] = val.is_a?(Hash) ? to_recursive_ostruct(val) : val
  end
end
Also see https://stackoverflow.com/a/63264908/2981429 which shows how to handle arrays
note
The reason this is better than the JSON-based solutions is that you can lose data when you convert to JSON. For example, if you convert a Time object to JSON and then parse it, it will be a string. There are many other examples of this:
class Foo; end
JSON.parse({obj: Foo.new}.to_json)["obj"]
# => "#<Foo:0x00007fc8720198b0>"
yeah ... not super useful. You've completely lost your reference to the actual instance.
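By contrast, the non-JSON conversion keeps the original object reference intact (a quick sketch of my own, reusing the to_recursive_ostruct helper above):
foo = Foo.new
to_recursive_ostruct(obj: foo).obj.equal?(foo) # => true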
Here's a recursive solution that avoids converting the hash to json:
def to_o(obj)
  if obj.is_a?(Hash)
    return OpenStruct.new(obj.map{ |key, val| [ key, to_o(val) ] }.to_h)
  elsif obj.is_a?(Array)
    return obj.map{ |o| to_o(o) }
  else # Assumed to be a primitive value
    return obj
  end
end
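A quick usage sketch of my own (the hash values here are made up for illustration; it assumes require 'ostruct'):
o = to_o(users: [{ name: 'Ada' }, { name: 'Grace' }])
o.users.first.name # => "Ada"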
My solution is cleaner and faster than @max-pleaner's.
I don't actually know why, but I don't instantiate extra Hash objects:
def dot_access(hash)
  hash.each_with_object(OpenStruct.new) do |(key, value), struct|
    struct[key] = value.is_a?(Hash) ? dot_access(value) : value
  end
end
Here is the benchmark for your reference:
require 'ostruct'

def dot_access(hash)
  hash.each_with_object(OpenStruct.new) do |(key, value), struct|
    struct[key] = value.is_a?(Hash) ? dot_access(value) : value
  end
end

def to_recursive_ostruct(hash)
  result = hash.each_with_object({}) do |(key, val), memo|
    memo[key] = val.is_a?(Hash) ? to_recursive_ostruct(val) : val
  end
  OpenStruct.new(result)
end

require 'benchmark/ips'

Benchmark.ips do |x|
  hash = { a: 1, b: 2, c: { d: 3 } }
  x.report('dot_access') { dot_access(hash) }
  x.report('to_recursive_ostruct') { to_recursive_ostruct(hash) }
end
Warming up --------------------------------------
dot_access 4.843k i/100ms
to_recursive_ostruct 5.218k i/100ms
Calculating -------------------------------------
dot_access 51.976k (± 5.0%) i/s - 261.522k in 5.044482s
to_recursive_ostruct 50.122k (± 4.6%) i/s - 250.464k in 5.008116s
My solution, based on max pleaner's answer and similar to Xavi's answer:
require 'ostruct'
def initialize_open_struct_deeply(value)
  case value
  when Hash
    OpenStruct.new(value.transform_values { |hash_value| send __method__, hash_value })
  when Array
    value.map { |element| send __method__, element }
  else
    value
  end
end
Here is one way to override the initializer so you can do OpenStruct.new({ a: "b", c: { d: "e", f: ["g", "h", "i"] }}).
Further, this class is included when you require 'json', so be sure to do this patch after the require.
class OpenStruct
  def initialize(hash = nil)
    @table = {}
    if hash
      hash.each_pair do |k, v|
        self[k] = v.is_a?(Hash) ? OpenStruct.new(v) : v
      end
    end
  end

  def keys
    @table.keys.map{|k| k.to_s}
  end
end
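With that patch in place, the example from the prose above works (my own quick check):
o = OpenStruct.new({ a: "b", c: { d: "e", f: ["g", "h", "i"] } })
o.c.d # => "e"
o.c.f.last # => "i"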
Basing a conversion on OpenStruct works fine until it doesn't. For instance, none of the other answers here properly handle these simple hashes (keys like display collide with methods Ruby already defines on every object, so OpenStruct-based access can silently hit those instead):
people = { person1: { display: { first: 'John' } } }
creds = { oauth: { trust: true }, basic: { trust: false } }
The method below works with those hashes, modifying the input hash rather than returning a new object.
def add_indifferent_access!(hash)
  hash.each_pair do |k, v|
    hash.instance_variable_set("@#{k}", v.tap { |v| send(__method__, v) if v.is_a?(Hash) } )
    hash.define_singleton_method(k, proc { hash.instance_variable_get("@#{k}") } )
  end
end
then
add_indifferent_access!(people)
people.person1.display.first # => 'John'
Or if your context calls for a more inline call structure:
creds.yield_self(&method(:add_indifferent_access!)).oauth.trust # => true
Alternatively, you could mix it in:
module HashExtension
  def very_indifferent_access!
    each_pair do |k, v|
      instance_variable_set("@#{k}", v.tap { |v| v.extend(HashExtension) && v.send(__method__) if v.is_a?(Hash) } )
      define_singleton_method(k, proc { self.instance_variable_get("@#{k}") } )
    end
  end
end
and apply to individual hashes:
favs = { song1: { title: 'John and Marsha', author: 'Stan Freberg' } }
favs.extend(HashExtension).very_indifferent_access!
favs.song1.title
Here is a variation for monkey-patching Hash, should you opt to do so:
class Hash
  def with_very_indifferent_access!
    each_pair do |k, v|
      instance_variable_set("@#{k}", v.tap { |v| v.send(__method__) if v.is_a?(Hash) } )
      define_singleton_method(k, proc { instance_variable_get("@#{k}") } )
    end
  end
end
# Note the omission of "v.extend(HashExtension)" vs. the mix-in variation.
Comments to other answers expressed a desire to retain class types. This solution accommodates that.
people = { person1: { created_at: Time.now } }
people.with_very_indifferent_access!
people.person1.created_at.class # => Time
Whatever solution you choose, I recommend testing with this hash:
people = { person1: { display: { first: 'John' } }, person2: { display: { last: 'Jingleheimer' } } }
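For example, with the add_indifferent_access! method from above (my own check):
add_indifferent_access!(people)
people.person1.display.first # => "John"
people.person2.display.last  # => "Jingleheimer"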
If you are ok with monkey-patching the Hash class, you can do:
require 'ostruct'
module Structurizable
  def each_pair(&block)
    each do |k, v|
      v = OpenStruct.new(v) if v.is_a? Hash
      yield k, v
    end
  end
end
Hash.prepend Structurizable
people = { person1: { display: { first: 'John' } }, person2: { display: { last: 'Jingleheimer' } } }
puts OpenStruct.new(people).person1.display.first
Ideally, instead of prepending this, we should be able to use a Refinement, but for some reason I can't understand it didn't work for the each_pair method (also, unfortunately, Refinements are still pretty limited).
If I want to recursively merge 2 hashes, I can do so with the following function:
def recursive_merge(a, b)
  a.merge(b) { |key, a_item, b_item| recursive_merge(a_item, b_item) }
end
This works great, in that I can now do:
aHash = recursive_merge(aHash,newHash)
But I'd like to add this as a self-updating style method similar to merge!. I can add in the returning function:
class Hash
  def recursive_merge(newHash)
    self.merge(newHash) { |key, a_item, b_item| a_item.recursive_merge(b_item) }
  end
end
But am not sure how to re-create the bang function that updates the original object without association.
class Hash
  def recursive_merge!(newHash)
    self.merge(newHash) { |key, a_item, b_item| a_item.recursive_merge(b_item) }
    # How do I set "self" to this new hash?
  end
end
Edit: example as per comments.
h = { :a => { :b => "1" } }
h.recursive_merge!({ :a => { :c => "2" } })
=> {:a=>{:b=>"1", :c=>"2"}}
A regular merge results in :b=>"1" being overwritten by :c=>"2".
Use merge! rather than attempt to update self. I don't believe it makes sense to use merge! anywhere but at the top level, so I wouldn't call the bang version recursively. Instead, use merge! at the top level, and call the non-bang method recursively.
It may also be wise to check both values being merged are indeed hashes, otherwise you may get an exception if you attempt to recursive_merge on a non-hash object.
#!/usr/bin/env ruby
class Hash
  def recursive_merge(other)
    self.merge(other) { |key, value1, value2| value1.is_a?(Hash) && value2.is_a?(Hash) ? value1.recursive_merge(value2) : value2 }
  end

  def recursive_merge!(other)
    self.merge!(other) { |key, value1, value2| value1.is_a?(Hash) && value2.is_a?(Hash) ? value1.recursive_merge(value2) : value2 }
  end
end
h1 = { a: { b:1, c:2 }, d:1 }
h2 = { a: { b:2, d:4 }, d:2 }
h3 = { d: { b:1, c:2 } }
p h1.recursive_merge(h2) # => {:a=>{:b=>2, :c=>2, :d=>4}, :d=>2}
p h1.recursive_merge(h3) # => {:a=>{:b=>1, :c=>2}, :d=>{:b=>1, :c=>2}}
p h1.recursive_merge!(h2) # => {:a=>{:b=>2, :c=>2, :d=>4}, :d=>2}
p h1 # => {:a=>{:b=>2, :c=>2, :d=>4}, :d=>2}
If you have a specific reason to fully merge in place, possibly for speed, you can experiment with making the second function call itself recursively, rather than delegate the recursion to the first function. Be aware that may produce unintended side effects if the hashes store shared objects.
Example:
h1 = { a:1, b:2 }
h2 = { a:5, c:9 }
h3 = { a:h1, b:h2 }
h4 = { a:h2, c:h1 }
p h3.recursive_merge!(h4)
# Making recursive calls to recursive_merge
# => {:a=>{:a=>5, :b=>2, :c=>9}, :b=>{:a=>5, :c=>9}, :c=>{:a=>1, :b=>2}}
# Making recursive calls to recursive_merge!
# => {:a=>{:a=>5, :b=>2, :c=>9}, :b=>{:a=>5, :c=>9}, :c=>{:a=>5, :b=>2, :c=>9}}
As you can see, the second (shared) copy of h1 stored under the key :c is updated to reflect the merge of h1 and h2 under the key :a. This may be surprising and unwanted. Hence why I recommend using recursive_merge for the recursion, and not recursive_merge!.
I have an OpenStruct that is nested with many other OpenStructs. What's the best way to deeply convert them all to JSON?
Ideally:
x = OpenStruct.new
x.y = OpenStruct.new
x.y.z = OpenStruct.new
x.y.z = 'hello'
x.to_json
# => {"y": {"z": "hello"}}
Reality
{ <OpenStruct= ....> }
There is no default method to accomplish such a task, because the built-in #to_h returns the Hash representation but doesn't deep-convert the values.
If a value is an OpenStruct, it's returned as such and not converted into a Hash.
However, this is not that complicated to solve. You can create a method that traverses each key/value pair of an OpenStruct instance (e.g. using each_pair), recursively descends into nested OpenStructs when the value is an OpenStruct, and returns a Hash of just basic Ruby types.
Such Hash can then easily be serialized using either .to_json or JSON.dump(hash).
This is a very quick example, with an update from @Yuval Rimar for arrays of OpenStructs:
def openstruct_to_hash(object, hash = {})
  case object
  when OpenStruct then
    object.each_pair do |key, value|
      hash[key] = openstruct_to_hash(value)
    end
    hash
  when Array then
    object.map { |v| openstruct_to_hash(v) }
  else object
  end
end
openstruct_to_hash(OpenStruct.new(foo: 1, bar: OpenStruct.new(baz: 2)))
# => {:foo=>1, :bar=>{:baz=>2}}
Fixes to above solution to handle arrays
def open_struct_to_hash(object, hash = {})
  object.each_pair do |key, value|
    hash[key] = case value
                when OpenStruct then open_struct_to_hash(value)
                when Array then value.map { |v| open_struct_to_hash(v) }
                else value
                end
  end
  hash
end
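A quick check of my own (values made up for illustration):
open_struct_to_hash(OpenStruct.new(foo: 1, bars: [OpenStruct.new(baz: 2)]))
# => {:foo=>1, :bars=>[{:baz=>2}]}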
Here's yet another approach, modified from lancegatlin's answer. Also adding the method to the OpenStruct class itself.
class OpenStruct
  def deep_to_h
    each_pair.map do |key, value|
      [
        key,
        case value
        when OpenStruct then value.deep_to_h
        when Array then value.map { |el| el.is_a?(OpenStruct) ? el.deep_to_h : el }
        else value
        end
      ]
    end.to_h
  end
end
in initializers/open_struct.rb:
require 'ostruct'
# Because @table is an instance variable of OpenStruct and Object#as_json returns a Hash of instance variables.
class OpenStruct
  def as_json(options = nil)
    @table.as_json(options)
  end
end
Usage:
OpenStruct.new({ a: { b: 123 } }).as_json
# Result
{
  "a" => {
    "b" => 123
  }
}
Edit:
This seems to do almost the same thing (notice the keys are symbols instead of strings)
OpenStruct.new({ a: { b: 123 } }).marshal_dump
# Result
{
  :a => {
    :b => 123
  }
}
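One difference worth noting (my own observation; it assumes ActiveSupport is loaded, as in the initializer context above): marshal_dump only returns the top-level table, so nested OpenStructs stay OpenStructs, whereas the patched as_json recurses because Hash#as_json calls as_json on every value:
nested = OpenStruct.new(a: OpenStruct.new(b: 123))
nested.as_json      # => { "a" => { "b" => 123 } }
nested.marshal_dump # => { :a => #<OpenStruct b=123> }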
The same function, but it can accept arrays as input too:
def openstruct_to_hash(object, hash = {})
  case object
  when OpenStruct then
    object.each_pair do |key, value|
      hash[key] = openstruct_to_hash(value)
    end
    hash
  when Array then
    object.map { |v| openstruct_to_hash(v) }
  else object
  end
end
None of the above worked for me with a copy and paste. Adding this solution:
require 'json'
require 'ostruct'
class OpenStruct
  def deep_to_h
    each_pair.map do |key, value|
      [
        key,
        case value
        when OpenStruct then value.deep_to_h
        when Array then value.map { |el| el.class == OpenStruct ? el.deep_to_h : el }
        else value
        end
      ]
    end.to_h
  end
end
json = <<HERE
{
  "string": "fooval",
  "string_array": [
    "arrayval"
  ],
  "int": 2,
  "hash_array": [
    {
      "string": "barval",
      "string2": "bazval"
    },
    {
      "string": "barval2",
      "string2": "bazval2"
    }
  ]
}
HERE
os = JSON.parse(json, object_class: OpenStruct)
puts JSON.pretty_generate os.to_h
puts JSON.pretty_generate os.deep_to_h
I'm creating a nested hash in Ruby REXML and want to update the hash when I enter a loop.
My code is like:
hash = {}
doc.elements.each('//address') do |n|
  a = # ...
  b = # ...
  hash = { "NAME" => { a => { "ADDRESS" => b } } }
end
When I execute the above code the hash gets overwritten and I get only the info in the last iteration of the loop.
I don't want to use the following way as it makes my code verbose
hash["NAME"] = {}
hash["NAME"][a] = {}
and so on...
So could someone help me out on how to make this work...
Assuming the names are unique:
hash.merge!({"NAME" => { a => { "ADDRESS" => b } } })
You always create a new hash in each iteration, which gets saved in hash.
Just assign the key directly in the existing hash:
hash["NAME"] = { a => { "ADDRESS" => b } }
hash = {"NAME" => {}}
doc.elements.each('//address') do |n|
a = ...
b = ...
hash['NAME'][a] = {'ADDRESS' => b, 'PLACE' => ...}
end
blk = proc { |hash, key| hash[key] = Hash.new(&blk) }
hash = Hash.new(&blk)

doc.elements.each('//address').each do |n|
  a = # ...
  b = # ...
  hash["NAME"][a]["ADDRESS"] = b
end
Basically creates a lazily instantiated infinitely recurring hash of hashes.
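As a standalone illustration of the idea outside the REXML loop (my own sketch):
blk = proc { |hash, key| hash[key] = Hash.new(&blk) }
h = Hash.new(&blk)
h[:a][:b][:c] = 1
h # => {:a=>{:b=>{:c=>1}}}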
EDIT: Just thought of something that could work, this is only tested with a couple of very simple hashes so may have some problems.
class Hash
  def can_recursively_merge? other
    Hash === other
  end

  def recursive_merge! other
    other.each do |key, value|
      if self.include? key and self[key].can_recursively_merge? value
        self[key].recursive_merge! value
      else
        self[key] = value
      end
    end
    self
  end
end
Then use hash.recursive_merge!("NAME" => { a => { "ADDRESS" => b } }) in your code block (the parentheses are needed so the hash literal isn't parsed as a block).
This simply recursively merges a hierarchy of hashes, and any other types if you define the recursive_merge! and can_recursively_merge? methods on them.
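A small usage sketch (my own, with made-up values in place of the asker's a and b):
hash = { "NAME" => { "Smith" => { "ADDRESS" => "Sydney" } } }
hash.recursive_merge!("NAME" => { "Jones" => { "ADDRESS" => "Perth" } })
# => {"NAME"=>{"Smith"=>{"ADDRESS"=>"Sydney"}, "Jones"=>{"ADDRESS"=>"Perth"}}}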