Runtime changing model with mongodb/mongoid - ruby

I have to add several fields to a Mongoid model. I know there are no migrations with MongoDB, but if I go on without dropping the DB (making Rails "regenerate" the DB entirely), it doesn't display or use the new fields at all!
What's the best way to go here? Is there something softer than dropping/recreating the database?
Thanks in advance
luca

In general it is possible to update old documents with the new fields at runtime; there is no need for migrations in MongoDB.
You may want to write rake tasks to update your old documents with the new fields and default values.
You can find those documents by checking for the new fields, which default to nil.
Update
Easy style:
If you define a new field with a default value, this value will be used until you set a new one:
app/models/my_model.rb
class MyModel
  include Mongoid::Document

  field :name, type: String
  field :data, type: String

  # NEW FIELD
  field :note, type: String, default: "no note given so far!"
end
If you query your database, you should get the default value for documents that didn't have this field before your extension:
(rails console)
MyModel.first
#=> #<MyModel …other fields…, note: "no note given so far!">
I tested this on a fresh Rails stack with a current Mongoid on Ruby 1.9.2 - it should work with other stacks, too.
More complex style:
If you didn't set a default value, you'll get nil for this new field.
app/models/my_model.rb
class MyModel
  include Mongoid::Document

  field :name, type: String
  field :data, type: String

  # NEW FIELD
  field :note, type: String
end
(rails console)
MyModel.first
#=> #<MyModel …other fields…, note: nil>
Then you could set up a rake task and migration file like in this example:
lib/tasks/my_model_migration.rake:
namespace :mymodel do
  desc "MyModel migration task"
  task :migrate => :environment do
    require "./db/migrate.rb"
  end
end
db/migrate.rb:
olds = MyModel.where(note: nil)
# criteria selecting documents without a valid :note field (nil)
olds.each do |doc|
  doc.note = "(migration) no note given yet"
  # or whatever your desired default value should be
  doc.save! rescue puts "Could not modify doc #{doc.id}/#{doc.name}"
  # the rescue is only a failsafe in case something goes wrong
end
Run this migration with rake mymodel:migrate.
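If you don't need per-document validations or callbacks for the backfill, a single batch update is usually enough. A minimal sketch, assuming Mongoid's Criteria#update_all (it issues one $set on the server and skips callbacks):
MyModel.where(note: nil).update_all(note: "(migration) no note given yet")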
This is only a starting point and you can extend this to a full mongoid migration engine.
The :environment dependency in task :migrate => :environment do … is necessary; otherwise rake won't load your models.

It is a little ridiculous to say that you don't need migrations with MongoDB or Mongoid. Any sophisticated app needs to be refactored from time to time, and that can mean pulling fields out of disparate documents into a new one.
Writing one-off rake tasks is less convenient and more error prone than having migrations be part of your deploy script so that they always get run in every environment.
https://github.com/adacosta/mongoid_rails_migrations brings AR-style migrations to Mongoid.
You might need them less often, but you will certainly need them as an app grows.
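For illustration, a migration with that gem looks roughly like this (the class and field names are made up; check the gem's README for the exact generator and rake commands):
class AddDefaultNotes < Mongoid::Migration
  def self.up
    # backfill the new field on existing documents
    MyModel.where(note: nil).update_all(note: "no note given so far!")
  end

  def self.down
    # nothing sensible to undo for a backfill
  end
end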

Below is a nice code example for a data migration script with Mongoid and the Ruby mongo driver - to be used when your updated model no longer matches production data.
http://pivotallabs.com/users/lee/blog/articles/1548-mongoid-migrations-using-the-mongo-driver
I wish we would stop using "no migrations with mongoid" as a slogan. It'll turn people to MongoDB for the wrong reasons, and it's only partially true. No schema, true, but data still needs to be maintained, which IMO is harder with MongoDB than with an RDBMS. There are other, great reasons for choosing MongoDB, and it depends on your problem.


How to test a GraphQL schema with graphql-ruby?

My goal is to test the types of my GraphQL schema in Ruby; I'm using the graphql-ruby gem.
I couldn't find any best practice for this, so I'm wondering what's the best way to test the fields and types of a schema.
The gem recommends against testing the schema directly (http://graphql-ruby.org/schema/testing.html), but I still find it valuable to be able to know when the schema changes unexpectedly.
Having a type like this:
module Types
  class DeskType < GraphQL::Schema::Object
    field :id, ID, 'Id of this Desk', null: false
    field :location, String, 'Location of the Desk', null: false
    field :custom_id, String, 'Human-readable unique identifier for this desk', null: false
  end
end
My first approach has been to use the fields hash in the GraphQL::Schema::Object type, for example:
Types::DeskType.fields['location'].type.to_s #=> 'String!'
Creating an RSpec matcher, I could come up with tests that look like this:
RSpec.describe Types::DeskType do
  it 'has the expected schema fields' do
    fields = {
      'id': 'ID!',
      'location': 'String!',
      'customId': 'String!'
    }
    expect(described_class).to match_schema_fields(fields)
  end
end
This approach has some drawbacks though:
- The code in the matcher depends on the implementation of the GraphQL::Schema::Object class; any breaking changes will break the test suite after an update.
- We're repeating code: the test asserts the same fields the type already declares.
- Writing these tests gets tedious, and that makes devs less likely to write them.
It looks like you want to test your schema because you want to know if it is going to break the client. Basically you should avoid this.
Instead you can use gems like graphql-schema_comparator to print breaking changes.
I suggest having a rake task for dumping your schema (and committing the dump to your repo).
You can write a spec that checks whether the schema dump is up to date - then you make sure you always have a current schema dump.
Set up your CI to compare the schema of the current branch with the schema on the master branch.
Fail your build if the schema has dangerous or breaking changes; a sketch of both pieces follows below.
You can even generate a schema changelog using schema_comparator ;) Or you can use Slack notifications to send any schema changes there so your team can easily track them.
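For illustration, a minimal sketch of that setup - the schema name MySchema, the dump path, and the task name are assumptions; the compare/breaking? API is taken from graphql-schema_comparator's README:
# lib/tasks/graphql.rake
namespace :graphql do
  desc 'Dump the GraphQL schema IDL (commit the result)'
  task dump_schema: :environment do
    File.write('schema.graphql', GraphQL::Schema::Printer.print_schema(MySchema))
  end
end

# in CI: compare the committed dump against the current schema
require 'graphql/schema_comparator'

result = GraphQL::SchemaComparator.compare(
  File.read('schema.graphql'),
  GraphQL::Schema::Printer.print_schema(MySchema)
)
abort "Breaking schema changes: #{result.breaking_changes.map(&:message)}" if result.breaking?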
What I feel is an improvement over my first approach is to use snapshot testing for the GraphQL schema: instead of testing each of the types/mutation schemas one by one, I created a single test:
RSpec.describe MySchema do
  it 'renders the full schema' do
    schema = GraphQL::Schema::Printer.print_schema(MySchema)
    expect(schema).to match_snapshot('schema')
  end
end
This approach uses a slightly modified version of the rspec-snapshot gem; see my PR here.
The gem doesn't let you update the snapshot with a single command like in Jest, so I also created a rake task to delete the current snapshot:
namespace :tests do
  desc 'Deletes the schema snapshot'
  task delete_schema_snapshot: :environment do
    snapshot_path = Rails.root.join('spec', 'fixtures', 'snapshots', 'schema.snap')
    File.delete(snapshot_path) if File.exist?(snapshot_path)
  end
end
With this you'll get a pretty RSpec diff when the schema has been modified.
The top-level Schema object has an #execute method. You can use this to write tests like
RSpec.describe MySchema do
  it 'fetches an object' do
    id = 'Zm9vOjE'
    query = <<~GRAPHQL
      query GetObject($id: ID!) {
        node(id: $id) { __typename id }
      }
    GRAPHQL
    res = described_class.execute(
      query,
      variables: { id: id }
    )
    expect(res['errors']).to be_nil
    expect(res['data']['node']['__typename']).to eq('Foo')
    expect(res['data']['node']['id']).to eq(id)
  end
end
The return value of the #execute method will be the conventional HTTP-style response, as a string-keyed hash. (Actually it's a GraphQL::Query::Result, but it delegates most things to an embedded hash.)
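For example (GraphQL::Query::Result and its hash delegation come from graphql-ruby; the trivial query is only illustrative):
res = MySchema.execute('{ __typename }')
res.class    #=> GraphQL::Query::Result
res['data']  #=> {"__typename" => "Query"}
res.to_h     # the plain string-keyed hash, handy for rendering as JSON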

Use `created_at` instead of `inserted_at`

I want to use an old Rails MySQL database/table with a new Elixir application, without changing it. Obviously I run into the created_at and inserted_at problem. https://hexdocs.pm/ecto/Ecto.Schema.html#timestamps/1 says that I can solve it, but I can't get it to work.
Here's what I do:
mix phoenix.new address_book --database mysql
cd address_book
mix phoenix.gen.html User users first_name last_name
mix phoenix.server
web/models/user.ex
defmodule AddressBook.User do
  use AddressBook.Web, :model

  schema "users" do
    field :first_name, :string
    field :last_name, :string

    timestamps([{:created_at, :updated_at}])
  end
[...]
But then I get an error. How can I fix this?
You aren't calling the timestamps function with the correct argument. It takes a keyword list of options, so it should be either:
timestamps(inserted_at: :created_at)
Or the equivalent:
timestamps([{:inserted_at, :created_at}])
You are calling:
timestamps(created_at: :updated_at)
Since there is no :created_at option, it is silently ignored and the timestamps being used don't change.
You can configure this for all your schemas by putting:
@timestamps_opts inserted_at: :created_at
in your web.ex file (in the schema section).
In addition to the answer above, the best way to set this across a context is:
defmodule MyApp.Schema do
  defmacro __using__(_) do
    quote do
      use Ecto.Schema
      @timestamps_opts inserted_at: :created_at
    end
  end
end
Then, instead of having to define that in each schema, you can just use MyApp.Schema.
I use this pattern in my phoenix app that uses a coexisting db-schema generated by rails.
https://hexdocs.pm/ecto/Ecto.Schema.html
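For illustration, a schema using that module might look like this (module and field names are hypothetical):
defmodule MyApp.User do
  use MyApp.Schema

  schema "users" do
    field :name, :string

    # now written as created_at instead of inserted_at
    timestamps()
  end
end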

time to live doesn't work on mongoid

See topic #2443 in the Mongoid changelog:
https://github.com/mongoid/mongoid/blob/master/CHANGELOG.md
Time to live (the expire_after_seconds option) is supported in Mongoid, but it doesn't work for me.
I executed the sample code, then I tried replacing Time with DateTime and using timestamps (created_at), but it still doesn't work.
class Event
  include Mongoid::Document

  field :created_at, type: DateTime

  index({ created_at: 1 }, { expire_after_seconds: 3600 })
end
Mongoid does not "automatically" create indexes of any kind when loading the model class. This is considered to be a "separate" task, for which there is the following rake command (see the bottom of the indexing documentation):
rake db:mongoid:create_indexes
Of course, if you are not using this in a "rails" config, then you would look at alternate means to create the indexes on collections when you want to. You can either script this externally or use the mongo driver's ensureIndex method directly.
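For example, here is a minimal sketch of creating that TTL index straight from the current mongo Ruby driver (host, database, and collection names are assumptions):
require 'mongo'

client = Mongo::Client.new(['127.0.0.1:27017'], database: 'my_app_development')
# TTL index: mongod removes documents roughly 3600s after their created_at value
client[:events].indexes.create_one({ created_at: 1 }, expire_after: 3600)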

Embedding documents in existing documents with the Ruby Driver for MongoDB

I'm trying to embed a document inside an existing document using the Ruby Driver.
Here's what my primary document looks like:
db = Mongo::Connection.new.db("Portfolios")
project_collection = db.collection("Projects")
new_Project = { :url => 'http://www.tekfolio.me/billy/portfolio/focus', :author => 'Billy'}
project_collection.insert(new_Project)
After I've created my new_Project and added it to my project_collection, I may or may not later want to add another embedded collection, called assets, to the same document. This is where I'm stuck. The following code doesn't seem to do anything:
new_asset = { :image_url => 'http://assets.tekfolio.me/portfolios/68fbb25a-8353-41a8-a779-4bd9762b00f2/projects/13/assets/20/focus2.PNG'}
new_Project.assest.insert(new_asset)
I'm certain I've butchered my understanding of MongoDB, the Ruby driver, and the embedded document concept, and would appreciate your help getting me out of this wet paper bag I can't seem to get out of ;)
Have you tried just setting the value of asset without insert and instead using update?
new_Project["asset"] = new_asset
project_collection.update({"_id" => new_Project["_id"]}, new_Project)
I think you are trying to "update" the new_Project record with the asset.
It doesn't work because you are only updating the hash in Ruby, not in Mongo. You have to first get the reference to the object in Mongo, update it, and then save it; check this info:
http://www.mongodb.org/display/DOCS/Updating+Data+in+Mongo
(If you can, assign the asset before inserting, and it should work.)
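As a variation, you can also push the embedded document atomically on the server, without re-sending the whole project. A sketch assuming the same legacy Ruby driver API used in the question:
project_collection.update(
  { '_id' => new_Project['_id'] },
  { '$push' => { 'assets' => new_asset } }
)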

MongoMapper - manually setting an ObjectId failing with "illegal ObjectID format"

I've got a simple model object:
class UserRating
  include MongoMapper::EmbeddedDocument

  key :idea_id, ObjectId
  key :rating, Integer
end
I'm trying to set an idea_id on this object with:
user_rating.idea_id = ObjectId.new
This throws: "illegal ObjectID format"
This sure seems like simple code... The only oddity I'm noticing is that ObjectID != ObjectId; that could just be a problem with the error message, I'm not sure. Very simple code, no idea why I can't make it work. If it helps, this is in the context of a Rails 3 beta 4 project, inside a Cucumber test. I am hitting the mongodb daemon successfully, so it's not a weird connection issue. Would really appreciate any pointers.
MongoMapper has a proxy object called ObjectId - in this case, you want a BSON::ObjectID, which represents an ID as it is stored in MongoDB itself.
You probably want:
key :idea_id, BSON::ObjectID, :index => true
No, you want ObjectId. When you assign it, you'll want to pass an actual object id, which gets generated for each MM model:
user_rating.idea_id = idea.id
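If you instead need a brand-new id rather than an existing document's, generate it through the bson gem's class; note the spelling is BSON::ObjectId in bson >= 1.0 and BSON::ObjectID in older releases. A hedged sketch:
user_rating.idea_id = BSON::ObjectId.new  # fresh id from the bson gem
user_rating.idea_id = idea.id             # or reference an existing document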
