I want to convert a hash that has an id property to an object using OpenStruct; however, the resulting object's #id returns the native object id, e.g.
test = OpenStruct.new({:id => 666})
test.id # => 70262018230400
Is there any way to override this? At the moment my workaround isn't so pretty.
OpenStruct uses a combination of define_method calls inside an unless self.respond_to?(name) check, plus method_missing. This means that if a property name conflicts with the name of any existing method on the object, you will run into this problem.
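Roughly, the guard works like the following runnable sketch (simplified; the real logic lives in OpenStruct#new_ostruct_member and #method_missing, and SketchStruct is just an illustrative name):
# Simplified sketch of OpenStruct's accessor definition. On Ruby 1.8,
# Object#id already exists, so respond_to?(:id) is true, the :id accessor
# is never defined, and test.id resolves to the built-in method instead
# of @table[:id].
class SketchStruct
  def initialize(hash)
    @table = {}
    hash.each_pair do |key, value|
      key = key.to_sym
      @table[key] = value
      unless respond_to?(key)  # the guard described above
        define_singleton_method(key)       { @table[key] }
        define_singleton_method("#{key}=") { |v| @table[key] = v }
      end
    end
  end
end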
tokland's answer is good, but another alternative is to undefine the id method, e.g.
test.instance_eval('undef id')
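With the test object from the question (on a Ruby where Object#id still exists), the lookup then falls through to OpenStruct's method_missing and the hash value comes back:
test.id # => 666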
You could also incorporate this into your own customised version of OpenStruct e.g.
class OpenStruct2 < OpenStruct
undef id
end
irb(main):009:0> test2 = OpenStruct2.new({:id => 666})
=> #<OpenStruct2 id=666>
irb(main):010:0> test2.id
=> 666
This was the classic workaround; I'd also be glad to hear a better way:
>> OpenStruct.send(:define_method, :id) { @table[:id] }
=> #<Proc:0x00007fbd43798990@(irb):1>
>> OpenStruct.new(:id => 666).id
=> 666
I've switched to using Hashery and its BasicStruct (a renamed version of OpenObject as of the latest version, 1.4), as that allows me to do this:
x = BasicStruct.new({:id => 666, :sub => BasicStruct.new({:foo => 'bar', :id => 777})})
x.id # => 666
x.sub.id # => 777
x.sub.foo # => "bar"
Just wondering if something like:
# frozen_string_literal: true
exists, but for Array and Hash?
The goal is to not have to .freeze every single one of those within the same globals file.
I didn't find any library that monkey-patches default Ruby classes like Array or Hash, but I did find an interesting gem, immutable-ruby, that may fit your needs.
Simple example
require "immutable/hash"
person = Immutable::Hash[name: "Simon", gender: :male]
# => Immutable::Hash[:name => "Simon", :gender => :male]
You cannot simply modify its values, because it is immutable. You can still perform operations on the hash, but a new copy is returned to you:
friend = person.put(:name, "James") # => Immutable::Hash[:name => "James", :gender => :male]
person # => Immutable::Hash[:name => "Simon", :gender => :male]
friend[:name] # => "James"
person[:name] # => "Simon"
I found a way to handle it without using another gem, using only vscode and rubocop:
Install the rubocop extension on vscode
Open your .vscode/settings.json
Append these rules:
{
"editor.formatOnSave": true,
"editor.formatOnSaveTimeout": 5000,
"ruby.format": "rubocop"
}
Save
Enjoy
Thanks to Tom Lord for the hint.
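For reference, the cop that presumably does the actual freezing on save is RuboCop's Style/MutableConstant (enabled by default and auto-correctable), so constants in a globals file get .freeze appended automatically. A hypothetical before/after (constant names are just examples):
# Before format-on-save
ADMIN_ROLES = ['owner', 'editor']
DEFAULTS    = { retries: 3 }

# After RuboCop auto-correct (Style/MutableConstant)
ADMIN_ROLES = ['owner', 'editor'].freeze
DEFAULTS    = { retries: 3 }.freeze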
I have a document with the field admins and am looking to add new users into this field. The value for these new users is a simple number string.
def modify_admin(identity, doc)
  ip_addr = "127.0.0.1:27017"
  client = Mongo::Client.new([ip_addr], :database => "camp")
  if doc[0] == 'r'
    doc = doc[2..-1]
    client[:inventory].update_one({"name": doc}, {$push => {"admins" => identity}})
    client.close
  end
end
The update on the collection happens in this line: client[:inventory].update_one({"name": doc}, {$push => {"admins" => identity}}).
However, I am running into the error "NilClass instances are not allowed as keys in a BSON document." (BSON::InvalidKey).
I have tried different syntax for the $push method but nothing seems to work.
My document structure is as follows (I'm using symbols for the fields):
document = {:name => build_array[1], :owner => identity, :admins => identity}
How can I add new values to the :owner field using Ruby?
In Ruby, a bare $push is parsed as a global variable, which is nil here (hence the NilClass key error). All you need is to wrap $push in quotes so it is sent as the string operator:
- client[:inventory].update_one({"name": doc}, {$push => {"admins" => identity}})
+ client[:inventory].update_one({"name": doc}, {"$push" => {"admins" => identity}})
And you should be fine
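Putting the fix back into the question's method, a minimal corrected sketch (same hypothetical collection, database, and argument handling as the question) looks like:
require 'mongo'

def modify_admin(identity, doc)
  client = Mongo::Client.new(['127.0.0.1:27017'], :database => 'camp')
  if doc[0] == 'r'
    doc = doc[2..-1]
    # "$push" is quoted so it reaches the server as an operator, not a nil Ruby global
    client[:inventory].update_one({ 'name' => doc }, { '$push' => { 'admins' => identity } })
  end
  client.close
end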
I'm trying to build a class that will basically be used as a data structure for storing values/nested values. I want there to be two methods, get and set, that accept a dot-notated path to recursively set or get variables.
For example:
bag = ParamBag.new
bag.get('foo.bar') # => nil
bag.set('foo.bar', 'baz')
bag.get('foo.bar') # => 'baz'
The get method could also take a default return value if the value doesn't exist:
bag.get('foo.baz', false) # => false
I could also initialize a new ParamBag with a Hash.
How would I manage this in Ruby? I've done this in other languages, where I would take the value by reference in order to set a nested path, but I'm not sure how I'd do that in Ruby.
This was a fun exercise but still falls under the "you probably should not do this" category.
To accomplish what you want, OpenStruct can be used with some slight modifications.
require 'ostruct'

class ParamBag < OpenStruct
  def method_missing(name, *args, &block)
    if super.nil?
      modifiable[new_ostruct_member(name)] = ParamBag.new
    end
  end
end
This class will let you chain however many method calls together you would like and set any number of parameters.
Tested with Ruby 2.2.1
2.2.1 :023 > p = ParamBag.new
=> #<ParamBag>
2.2.1 :024 > p.foo
=> #<ParamBag>
2.2.1 :025 > p.foo.bar
=> #<ParamBag>
2.2.1 :026 > p.foo.bar = {}
=> {}
2.2.1 :027 > p.foo.bar
=> {}
2.2.1 :028 > p.foo.bar = 'abc'
=> "abc"
Basically, drop your get and set methods and call methods as you normally would.
I do not advise you to actually do this. I would instead suggest you use OpenStruct by itself to achieve some flexibility without going too crazy. If you find yourself needing to chain a ton of methods and have them never fail, maybe take a step back and ask, "Is this really the right way to approach this problem?" If the answer to that question is a resounding yes, then ParamBag might just be perfect.
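If you do want the literal get/set API from the question rather than method chaining, here is a minimal sketch over a plain nested Hash (no OpenStruct); the class name and dot-path semantics are just illustrative:
class ParamBag
  def initialize(hash = {})
    @data = hash
  end

  # bag.get('foo.bar') walks the nested hashes, returning the default if any key is missing
  def get(path, default = nil)
    path.split('.').reduce(@data) do |node, key|
      return default unless node.is_a?(Hash) && node.key?(key)
      node[key]
    end
  end

  # bag.set('foo.bar', 'baz') creates intermediate hashes as needed
  def set(path, value)
    *parents, leaf = path.split('.')
    node = parents.reduce(@data) do |n, key|
      n[key] = {} unless n[key].is_a?(Hash)
      n[key]
    end
    node[leaf] = value
  end
end

bag = ParamBag.new
bag.set('foo.bar', 'baz')
bag.get('foo.bar')        # => 'baz'
bag.get('foo.baz', false) # => false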
Let's say we have a MongoDB collection called "images", and a MongoMapper-powered application with a corresponding "Image" model. If we set up a MongoMapper query using this model, we see that it is of type Plucky::Query and returns results of type Image:
>> Image.where(:file_type => 'image/jpeg').class
=> Plucky::Query
>> Image.where(:file_type => 'image/jpeg').first.class
=> Image
We can run the corresponding query directly on the Mongo adapter, mostly bypassing MongoMapper, by accessing the MongoMapper.connection. If we do it this way, the query is of type Mongo::Cursor and returns raw data results of type BSON::OrderedHash:
>> MongoMapper.connection.db(dbname).collection('images').find({ :file_type => 'image/jpeg' }).class
=> Mongo::Cursor
>> MongoMapper.connection.db(dbname).collection('images').find({ :file_type => 'image/jpeg' }).first.class
=> BSON::OrderedHash
The question is, is there a way to take a Plucky::Query like above and convert it to (or retrieve from it) a basic, non-extended Mongo::Cursor object?
At first I thought I found a solution with find_each, which does actually take a Plucky::Query and return a Mongo::Cursor:
>> Image.where(:file_type => 'image/jpeg').find_each.class
=> Mongo::Cursor
But it turns out this Mongo::Cursor is somehow extended or otherwise different from the one above, because it still returns Image objects instead of BSON::OrderedHash objects:
>> Image.where(:file_type => 'image/jpeg').find_each.first.class
=> Image
Update: I can't simply bypass MongoMapper query magic altogether like I did in the second case because I need to access features of MongoMapper (specifically named scopes) to build up the query, so what I end up with is a Plucky::Query. But then I want the results to be plain data objects, not models, because all I need is data and I don't want the overhead of model instantiation.
If you drop to the driver, the transformer is nil by default:
1.9.3p194 :003 > Image.collection.find({ :file_type => 'image/jpeg' }, { :limit => 1 }).first.class
=> BSON::OrderedHash
MongoMapper achieves the conversion by setting a "transformer" lambda on the plucky query. You can see this in the MongoMapper source code:
def query(options={})
  query = Plucky::Query.new(collection, :transformer => transformer)
  ...
end
...
def transformer
  @transformer ||= lambda { |doc| load(doc) }
end
So after each mongo document retrieval, this Plucky::Query runs the transformation that loads the model. Looking at the Plucky source code we see that there is a simple setter method []= we can use to disable this. So this is the solution:
plucky_query = Image.where(:file_type => 'image/jpeg')
plucky_query.first.class
# => Image
plucky_query[:transformer] = nil
plucky_query.first.class
# => BSON::OrderedHash
If you don't mind monkey-patching you can encapsulate like so:
module Plucky
  class Query
    def raw_data
      self[:transformer] = nil
      self
    end
  end
end
Then you could simply write:
Image.where(:file_type => 'image/jpeg').raw_data.first.class
# => BSON::OrderedHash
Using Mongoid, is it possible to use "update_all" to push a value onto an array field for all entries matching a certain criteria?
Example:
class Foo
  include Mongoid::Document

  field :username
  field :bar, :type => Array

  def update_all_bars
    array_of_names = ['foo','bar','baz']
    Foo.any_in(username: array_of_names).each do |f|
      f.push(:bar, 'my_new_val')
    end
  end
end
I'm wondering if there's a way to update all the users at once (to push the value 'my_new_val' onto the :bar field for each matching entry) using "update_all" (or something similar) instead of looping through them to update them one at a time. I've tried everything I can think of, and so far no luck.
Thanks
You need to call that through the MongoDB driver. You can do:
Foo.collection.update(
  Foo.any_in(username: array_of_names).selector,
  {'$push' => {bar: 'my_new_val'}},
  {:multi => true}
)
Or
Foo.collection.update(
  {username: {'$in' => array_of_names}},
  {'$push' => {bar: 'my_new_val'}},
  {:multi => true}
)
You can open a pull request or a feature request if you want this built into Mongoid.
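For what it's worth, on a newer Mongoid that exposes the mongo 2.x driver, the :multi option is gone and update_many is the equivalent. A minimal sketch, reusing the criteria's selector and the array from the question (this assumes Foo.collection returns a mongo 2.x Mongo::Collection):
selector = Foo.any_in(username: array_of_names).selector
Foo.collection.update_many(selector, { '$push' => { 'bar' => 'my_new_val' } })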