I'm sending serialized data to a class which needs to access a Mongoid document that may or may not be embedded.
In the case of an embedded document, I accept a variable number of arguments which I reduce over to get the embedded document.
The code is pretty simple:
def perform(object, *arguments)
  @opts = arguments.extract_options!
  @object = arguments.reduce(object) { |object, args| object.public_send(*args) }
  # [...]
end
I used public_send because AFAIK I only need to call public methods.
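For context, a hypothetical invocation could look like this (the argument shapes are my assumption, not taken from the real job):
# Each extra argument is [method_name, *args]; the reduce walks from the
# root document down to the embedded document (group_id is hypothetical).
perform(user, [:groups], [:find, group_id])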
However, when I try to access an embedded document I get a really strange result where @object is an enumerator.
After some debugging, here is what I found: for any root document object with an embedded collection items, I have:
object.items.public_send(:find)
# => #<Enumerator: ...>
object.items.send(:find) # or __send__
# => nil
The method that gets called is not the same at all when I use public_send versus send!
How is it even possible?
Is it normal? Is that a bug?
public_send seems to invoke the find method of Array (Enumerable), but send (or __send__) invokes the find method of Mongoid.
Edit: simple reproducible case:
require 'mongoid'

class User
  include Mongoid::Document
  field :name, type: String
  embeds_many :groups
end

class Group
  include Mongoid::Document
  field :name, type: String
  embedded_in :user
end

Mongoid.load_configuration({
  sessions: {
    default: {
      database: 'send_find',
      hosts: [
        'localhost:27017'
      ]
    }
  }
})
user = User.create(name: 'john')
user.groups.create(name: 'g1')
user.groups.create(name: 'g2')
puts "public_send :find"
puts user.groups.public_send(:find).inspect
# => #<Enumerator: [#<Group _id: 5530dea57735334b69010000, name: "g1">, #<Group _id: 5530dea57735334b69020000, name: "g2">]:find>
puts "send :find"
puts user.groups.send(:find).inspect
# => nil
puts "__send__ :find"
puts user.groups.__send__(:find).inspect
# => nil
Okay, after a few hours of debugging, I found that it is actually a bug in Mongoid.
The relation is not an array but a proxy around the array, which delegates most methods to the array.
Since public_send was also delegated to the target array but send and __send__ were not, the behavior was not the same.
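To illustrate the mechanism, here is a minimal stand-in (my own names, not Mongoid's actual proxy code): undefining public_send makes it fall through to method_missing and hit the underlying array, while send still dispatches on the proxy itself.
class ArrayProxy
  # public_send falls through to method_missing and is forwarded to the
  # target array; send and __send__ are left alone, so they dispatch on
  # the proxy itself.
  undef_method :public_send

  def initialize(target)
    @target = target
  end

  # Stand-in for Mongoid's own find (find-by-id), which returns nil here.
  def find(id = nil)
    nil
  end

  def method_missing(name, *args, &block)
    @target.send(name, *args, &block)
  end

  def respond_to_missing?(name, include_private = false)
    @target.respond_to?(name, include_private) || super
  end
end

proxy = ArrayProxy.new([1, 2, 3])
p proxy.public_send(:find) # => #<Enumerator: [1, 2, 3]:find>  (Array/Enumerable#find)
p proxy.send(:find)        # => nil                            (the proxy's own find)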
For more information, see my pull request and the associated commit.
I'm trying to create a test that checks whether my post's embedded document (author) calls its callback method.
Code:
class Post
  include Mongoid::Document
  include Mongoid::Timestamps::Created
  include Mongoid::Timestamps::Updated
  {....}

  # relations
  embeds_one :author, cascade_callbacks: true
  accepts_nested_attributes_for :author
  {...}
end

class Author
  include Mongoid::Document
  include Mongoid::Timestamps::Created
  include Mongoid::Timestamps::Updated
  {...}

  embedded_in :post

  after_save :my_callback_method

  def my_callback_method
    {...}
  end
  {...}
end
test:
RSpec.describe Author, :type => :model do
  context "Create author on Post" do
    let!(:post) { create(:post, :with_external_author) }

    it "should call after_save method my_callback_method when saving" do
      expect(post.author).to receive(:my_callback_method)
      expect(post.save).to eq true
    end
  end
end
When I try to run this spec, I get:
Failure/Error: expect(post.author).to receive(:my_callback_method)
(#<Author _id: 5c7ea762f325709edac2ae84, created_at: 2019-03-05 16:44:18 UTC, updated_at: 2019-03-05 16:44:18 UTC>). my_callback_method(*(any args))
expected: 1 time with any arguments
received: 0 times with any arguments
Can you help me understand how I should test these embedded document callbacks?
First of all, you should trust Mongoid to call after_save, and test my_callback_method in isolation.
Now, since (as discussed in the comments) you want to check that nobody removed the after_save declaration, you can add a test for that:
RSpec.describe Author, :type => :model do
  context "Author" do
    it "should define my_callback_method for after_save" do
      result = Author._save_callbacks.select { |cb| cb.kind.eql?(:after) }.collect(&:filter).include?(:my_callback_method)
      expect(result).to eq true
    end
  end
end
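For the "test my_callback_method in isolation" part, a minimal sketch (the real expectations depend on what my_callback_method does, which is elided in the question):
RSpec.describe Author, :type => :model do
  describe "#my_callback_method" do
    it "does its work when called directly" do
      author = Author.new
      # Replace with expectations about what my_callback_method actually does.
      expect { author.my_callback_method }.not_to raise_error
    end
  end
end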
Your code looks correct, but there are a number of outstanding issues in Mongoid related to callbacks around persistence. First ensure the callback is invoked in normal operation (i.e. when you save a post from a Rails console).
I am running Ruby 2.1 and Mongoid 5.0 (no Rails).
I want to track on a before_save callback whether or not an embedded field has changed.
I can use the document.attribute_changed? or document.changed methods to check normal fields, but somehow these don't work on relations (embeds_one, has_one, etc.).
Is there a way of detecting these changes before saving the document?
My model is something like this
class Company
  include Mongoid::Document
  include Mongoid::Attributes::Dynamic

  field :name, type: String
  # ...

  embeds_one :address, class_name: 'Address', inverse_of: :address
  # ...

  before_save :activate_flags

  def activate_flags
    if self.changes.include? 'address'
      # self.changes never includes "address"
    end
    if self.address_changed?
      # This throws an exception
    end
  end
end
One example of how I save my document is:
#...
company.address = AddressUtilities.parse address
company.save
#After this, the callback is triggered, but self.changes is empty...
#...
I have read the documentation and Googled the hell out of it, but I can't find a solution.
I have found this gem, but it's old and doesn't work with newer versions of Mongoid. I want to check whether there is another way of doing it before considering fixing the gem or sending a pull request...
Adding these two methods to your model and calling get_embedded_document_changes should give you a hash with the changes to all of its embedded documents:
def get_embedded_document_changes
  data = {}
  relations.each do |name, relation|
    next unless [:embeds_one, :embeds_many].include? relation.macro.to_sym

    # only if changes are present
    child = send(name.to_sym)
    next unless child
    next if child.previous_changes.empty?

    child_data = get_previous_changes_for_model(child)
    data[name] = child_data
  end
  data
end

def get_previous_changes_for_model(model)
  data = {}
  model.previous_changes.each do |key, change|
    data[key] = { :from => change[0], :to => change[1] }
  end
  data
end
[ source: https://gist.github.com/derickbailey/1049304 ]
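Illustrative usage on the Company model from earlier (the field name and values are assumptions, and this relies on Mongoid populating previous_changes on the embedded document once the parent has been saved):
company.address.city = "Berlin"   # hypothetical embedded field
company.save

company.get_embedded_document_changes
# => { "address" => { "city" => { :from => "Paris", :to => "Berlin" } } }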
I need to search within Mongoid objects that have array attributes. Here are the relevant objects:
class Author
  include Mongoid::Document
  field :name, type: String
end

class Book
  include Mongoid::Document
  field :name, type: String
  field :authors, type: Array
end
I can see that at least one book has a given author:
Book.all.sample.authors
=> [BSON::ObjectId('5363c73a4d61635257805e00'),
BSON::ObjectId('5363c73a4d61635257835e00'),
BSON::ObjectId('5363c73a4d61635257c75e00'),
BSON::ObjectId('5363c73b4d616352574a5f00')]
But I'm unable to find books that have that author.
Book.where(authors: '5363c73a4d61635257805e00').first
=> nil
I've tried the solution listed here: https://groups.google.com/forum/#!topic/mongoid/csNOcugYH0U but it didn't work for me:
Book.any_in(:author => ["5363c73b4d616352574a5f00"]).first
=> nil
I'm not sure what I'm doing wrong. Any ideas? I'd prefer to use Mongoid Origin commands.
This output:
Book.all.sample.authors
=> [BSON::ObjectId('5363c73a4d61635257805e00'),
BSON::ObjectId('5363c73a4d61635257835e00'),
BSON::ObjectId('5363c73a4d61635257c75e00'),
BSON::ObjectId('5363c73b4d616352574a5f00')]
tells us that authors contains BSON::ObjectIds. ObjectIds are often presented as Strings, and sometimes you can use a String instead of a full-blown ObjectId (such as with Model.find), but they're still not Strings. You are searching the array for a String:
Book.where(authors: '5363c73a4d61635257805e00')
but '5363c73a4d61635257805e00' and ObjectId('5363c73a4d61635257805e00') are not the same thing inside MongoDB. You need to search for the right thing:
Book.where(authors: BSON::ObjectId('5363c73a4d61635257805e00'))
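The same applies to the any_in form from the question, assuming the ids are converted to ObjectIds first (note the field is authors, not author):
Book.any_in(:authors => [BSON::ObjectId('5363c73b4d616352574a5f00')])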
You might want to monkey patch a to_bson_id method into various places. Something like this:
class String
  def to_bson_id
    BSON::ObjectId.from_string(self)
  end
end

module Mongoid
  module Document
    def to_bson_id
      id
    end
  end
end

module BSON
  class ObjectId
    def to_bson_id
      self
    end
  end
end

class NilClass
  def to_bson_id
    self
  end
end
That should do the trick. Then you can say things like:
Book.where(authors: '5363c73a4d61635257805e00'.to_bson_id)
Book.where(authors: some_string_or_object_id.to_bson_id)
and The Right Thing happens.
You might want to rename authors to author_ids to make its nature a little clearer.
I'm using data_mapper/sinatra and trying to create some attributes with attr_accessor. The following example code:
require 'json'

class Person
  include DataMapper::Resource

  property :id, Serial
  property :first_name, String

  attr_accessor :last_name
end

ps = Person.new
ps.first_name = "Mike"
ps.last_name = "Smith"

p ps.to_json
produces this output:
"{\"id\":null,\"first_name\":\"Mike\"}"
Obviously I would like for it to give me both the first and last name attributes. Any ideas on how to get this to work in the way one would expect so that my json has all of the attributes?
Also, feel free to also explain why my expectation (that I'd get all of the attributes) is incorrect. I'm guessing some internal list of attributes isn't getting the attr_accessor instance variables added to it or something. But even so, why?
DataMapper has its own serialization library, dm-serializer, that provides a to_json method for any DataMapper resource. If you require DataMapper with require 'data_mapper' in your code, you are using the data_mapper meta-gem that requires dm-serializer as part of its setup.
The to_json method provided by dm-serializer only serializes the DataMapper properties of your object (i.e. those you've specified with property) and not the "normal" attributes (those you've defined with attr_accessor). This is why you get id and first_name but not last_name.
In order to avoid using dm-serializer you need to explicitly require only those libraries you need, rather than relying on data_mapper. You will need at least dm-core and maybe others.
The "normal" json library doesn't include any attributes in the default to_json call on an object; it just uses the object's to_s method. So in this case, if you replace require 'data_mapper' with require 'dm-core', you will get something like "\"#<Person:0x000001013a0320>\"".
To create json representations of your own objects you need to create your own to_json method. A simple example would be to just hard code the attributes you want in the json:
def to_json
  { :id => id, :first_name => first_name, :last_name => last_name }.to_json
end
You could instead create a method that looks at the attributes and properties of the object and builds the appropriate JSON from them, rather than hardcoding them this way.
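A rough sketch of that idea, assuming the attr_accessor values live in ordinary instance variables (in practice you would also filter out DataMapper's internal instance variables):
def to_json
  extras = instance_variables.each_with_object({}) do |ivar, hash|
    hash[ivar.to_s.delete("@").to_sym] = instance_variable_get(ivar)
  end
  # `attributes` is DataMapper's hash of declared properties; let it win
  # over any plain instance variable with the same name.
  extras.merge(attributes).to_json
end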
Note that if you create your own to_json method you can still require 'data_mapper'; your to_json will replace the one provided by dm-serializer. In fact dm-serializer also adds an as_json method that you could use to build the combined to_json method, e.g.:
def to_json
  as_json.merge({ :last_name => last_name }).to_json
end
Thanks to Matt I did some digging and found the :methods param for dm-serializer's to_json method. Their to_json method was pretty decent and was basically just a wrapper around an as_json helper method, so I overrode it by adding a few lines:
if options[:include_attributes]
  options[:methods] = [] if options[:methods].nil?
  options[:methods].concat(model.attributes).uniq!
end
The completed method override looks like:
module DataMapper
  module Serializer
    def to_json(*args)
      options = args.first
      options = {} unless options.kind_of?(Hash)

      if options[:include_attributes]
        options[:methods] = [] if options[:methods].nil?
        options[:methods].concat(model.attributes).uniq!
      end

      result = as_json(options)

      # default to making JSON
      if options.fetch(:to_json, true)
        MultiJson.dump(result)
      else
        result
      end
    end
  end
end
This works along with an attributes method I added to a base module I use with my models. The relevant section is below:
module Base
  def self.included(base)
    base.extend(ClassMethods)
  end

  module ClassMethods
    def attr_accessor(*vars)
      @attributes ||= []
      @attributes.concat vars
      super(*vars)
    end

    def attributes
      @attributes || []
    end
  end

  def attributes
    self.class.attributes
  end
end
Now my original example:
require 'json'

class Person
  include DataMapper::Resource
  include Base

  property :id, Serial
  property :first_name, String

  attr_accessor :last_name
end

ps = Person.new
ps.first_name = "Mike"
ps.last_name = "Smith"

p ps.to_json :include_attributes => true
Works as expected, with the new option parameter.
What I could have done to selectively get the attributes I wanted without having to do the extra work was to just pass the attribute names into the :methods param.
p ps.to_json :methods => [:last_name]
Or, since I already had my Base module:
p ps.to_json :methods => Person.attributes
Now I just need to figure out how I want to support collections.
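For collections, one hedged option is to map each resource through the same options-aware serialization (names below follow the example above):
people = Person.all
people.map { |person| person.as_json(:methods => Person.attributes) }.to_json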
First off, I am not using Rails. I am using Sinatra for this project with Active Record.
I want to be able to override either to_json or as_json on my Model class and have it define some 'default' options. For example I have the following:
class Vendor < ActiveRecord::Base
  def to_json(options = {})
    if options.empty?
      super :only => [:id, :name]
    else
      super options
    end
  end
end
where Vendor has more attributes than just id and name. In my route I have something like the following:
@vendors = Vendor.where({})
@vendors.to_json
Here @vendors is an Array of Vendor objects (obviously). The returned JSON, however, is not produced by my to_json method and includes all of the model's attributes.
I don't really have the option of modifying the route because I am actually using a modified sinatra-rest gem (http://github.com/mikeycgto/sinatra-rest).
Any ideas on how to achieve this functionality? I could do something like the following in my sinatra-rest gem but this seems silly:
@PLURAL.collect! { |obj| obj.to_json }
Try overriding serializable_hash instead:
def serializable_hash(options = nil)
  { :id => id, :name => name }
end
More information here.
If you override as_json instead of to_json, each element in the array will be formatted with as_json before the array is converted to JSON.
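For example, a sketch along those lines (not the poster's actual model):
class Vendor < ActiveRecord::Base
  def as_json(options = {})
    super(options.merge(:only => [:id, :name]))
  end
end

# Array#to_json calls as_json on each element, so the default options
# apply to every vendor in the collection:
@vendors = Vendor.where({})
@vendors.to_a.to_json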
I'm using the following to expose only accessible attributes:
def as_json(options = {})
  options[:only] ||= self.class.accessible_attributes.to_a
  super(options)
end