I am using Cequel as an ORM for Cassandra, without Rails.
I have a problem when trying to create a simple list of projects.
First I defined the model with three columns, which should form a compound partition key.
class Project
  include Cequel::Record
  key :client, :text, { partition: true }
  key :type, :text, { partition: true }
  key :dep, :text, { partition: true }
end
Later when I try to create the Project via
project = Project.create!({client: "test", type: "test", dep: "test"})
I get an error that the table does not exist.
The tables are not created automatically by Project.create!; I have to create the table manually first:
connection.schema.create_table(:projects) do
  partition_key :client, :text
  partition_key :type, :text
  partition_key :dep, :text
end
But this syntax is different from the documented record definition, and I only found it by sifting through the source code. This creates two problems:
1. Code overhead
2. I don't know the equivalent syntax for has_many and belongs_to, so I cannot create the table correctly if the record includes these associations.
Am I overlooking a method to create the table automatically from the Project class definition?
The tables can be created by calling the method synchronize_schema on the record class. So, in your case, you should execute Project.synchronize_schema before actually attempting to read from or write to the table.
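As a sketch, assuming the cequel gem and a Cassandra cluster reachable on 127.0.0.1 (the host and keyspace names here are placeholders), the call slots in right after the record definition:

```ruby
require 'cequel' # assumes the cequel gem is installed

# Placeholder connection settings - adjust host and keyspace to your cluster.
connection = Cequel.connect(host: '127.0.0.1', keyspace: 'my_app')
Cequel::Record.connection = connection

class Project
  include Cequel::Record
  key :client, :text, partition: true
  key :type, :text, partition: true
  key :dep, :text, partition: true
end

# Creates the projects table from the record definition itself,
# so no hand-written create_table block is needed.
Project.synchronize_schema

Project.create!(client: "test", type: "test", dep: "test")
```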
Given that you are building a broader project, you can consider using Rake tasks for it. You also need to migrate so that the tables are actually created in Cassandra; you can use rake cequel:migrate for that. There are more tasks, which you can list via rake --tasks.
If you are creating a custom project with models in custom locations, you will probably need to adapt the rake migration task a little. This is an implementation I did for my project: https://github.com/octoai/gems/blob/master/octo-core/Rakefile#L75. Also take a look at how the models are defined: https://github.com/octoai/gems/tree/master/octo-core/lib/octocore/models
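Outside Rails, a minimal hand-rolled task can simply call synchronize_schema on every record class. The require path and the model list below are assumptions about your project layout:

```ruby
# Rakefile (sketch) - adjust the require and the model list to your own project
require_relative 'lib/models' # assumed file that defines your Cequel records

namespace :db do
  desc 'Create or update Cassandra tables from the record definitions'
  task :sync do
    [Project].each do |model| # list every Cequel::Record class here
      model.synchronize_schema
      puts "Synchronized #{model.name}"
    end
  end
end
```

Then rake db:sync plays the role of rake cequel:migrate for a non-Rails setup.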
Hope this helps.
Related
I have an existing legacy Firebird database with nonstandard table and field names.
I would like to write a Sinatra app that can access it and display information. I've seen things like dm-is-reflective that appear to work when a database has proper naming conventions, but how do I use DataMapper (or ActiveRecord, whichever is easier) to access those tables?
For example, assuming I had these two tables:
Bookshelfs
  shelf_id: integer
  level: integer
  created: timestamp

Book
  id: integer
  id_of_shelf: integer
  title: string
  pages: integer
Tables like these, with odd naming conventions that don't follow any set pattern, and where one table's records might "own" multiple entries in another table even though no foreign key is assigned.
How would you set up DataMapper (or ActiveRecord) to communicate with it?
Look into this gem to get set up with ActiveRecord on Sinatra:
https://github.com/bmizerany/sinatra-activerecord
As for defining the relations, ActiveRecord can do this easily.
class Book < ActiveRecord::Base
  belongs_to :bookshelf, :class_name => 'Bookshelf', :foreign_key => 'id_of_shelf'
end

class Bookshelf < ActiveRecord::Base
  has_many :books, :class_name => 'Book', :foreign_key => 'id_of_shelf'
end
Assuming you figured out how to connect to your legacy database using ActiveRecord's Firebird adapter, the next thing I would do is define a view on top of each table, e.g.
CREATE VIEW books AS SELECT * FROM Book;
CREATE VIEW bookshelves AS SELECT * FROM Bookshelfs;
This way you can simply define models Book and Bookshelf in ActiveRecord as per usual and it will find everything in the right place inside the database.
Is it possible with DataMapper to generate the models from an existing database schema? That is, to do the inverse of a migration, which takes the models and generates the SQL. What I want is: given a database schema, generate the models.
Try checking https://github.com/yogo/dm-reflection or any of its forks.
In the end, I found that the best solution so far is the dm-is-reflective plugin: https://github.com/godfat/dm-is-reflective.
It doesn't generate the code for DataMapper models reflecting an existing database schema, but the property access methods are automatically available (as long as you keep using the plugin, of course).
Here is an example of use:
require 'data_mapper'
require 'dm-is-reflective'

DataMapper.setup(:default, "postgres://user:pwd@localhost/db")

class Table
  include DataMapper::Resource

  is :reflective # activate dm-is-reflective
  reflect        # reflect each property; you can be more specific (see the plugin documentation)
end

DataMapper.finalize

# Even if no field is defined, all of them are accessible
entry = Table.first(:id => 469)
print entry.anotherField
Using Rails 3.2.2, Ruby 1.9.3dev, and MySQL.
I am new to Ruby and Rails. We have an existing database with a couple hundred tables. We would like to try out Rails to see if it would be a positive change from PHP and ZendFramework.
Migrating data into another database is not an option for us because we have several other applications currently using this database. We wanted to "attach" a rails project to the existing database.
The part I am struggling with is generating all the models from our existing database.
I have seen a couple of older posts describing automated techniques, including the Magic Model Generator, while others said there is no way to do this and you just have to create them all manually.
I was not successful in generating models using the Magic Model Generator (perhaps it is Rails 2 only?).
Long ago, when we switched to ZendFramework, I wrote a quick script to analyze the database and generate all the model files for us. It would seem this is a somewhat common scenario.
Note: We use ID instead of id, and many tables have foreign key relationships.
So I wanted to ask the community what is the best (way/practice) to handle this?
It's not that difficult, just takes a bit more configuration. Here's a basic template for a model:
class YourIdealModelName < ActiveRecord::Base
  self.table_name = 'actual_table_name'
  self.primary_key = 'ID'

  belongs_to :other_ideal_model,
    :foreign_key => 'foreign_key_on_other_table'

  has_many :some_other_ideal_models,
    :foreign_key => 'foreign_key_on_this_table',
    :primary_key => 'primary_key_on_other_table'
end
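If you have a couple hundred tables, a small script can stamp out that boilerplate. This is only a sketch: the camelize helper, the model_source function, and the example table names are all made up for illustration, and associations would still be added by hand:

```ruby
# Turn a legacy table name into a Ruby class name, e.g. "customer_orders" -> "CustomerOrders"
def camelize(table)
  table.split('_').map(&:capitalize).join
end

# Emit the source of a minimal ActiveRecord model bound to a legacy table
# with a capitalized ID primary key, as in the question.
def model_source(table, primary_key: 'ID')
  <<~RUBY
    class #{camelize(table)} < ActiveRecord::Base
      self.table_name = '#{table}'
      self.primary_key = '#{primary_key}'
    end
  RUBY
end

puts model_source('customer_orders')
```

In a real run you would pull the table list from the database (for example via SHOW TABLES in MySQL) and write each generated model out under app/models.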
I am no expert and have not researched this much.
Without thinking too much, the first solution that comes to mind is to create the models and migrations the Rails way, so that you avoid problems with, for example, key and foreign key naming. If you already have data, you should migrate it to the Rails database.
One reason to do this is that models are supposed to be not just data accessors but also to contain the business logic.
If I am using DataMapper and have two databases, is there any way, using migration.rb, to copy a table (for example, a person table) from database 1 to database 2 (same schema and values)?
I am referring to this: https://github.com/datamapper/dm-migrations/blob/master/examples/sample_migration.rb
It only shows how to add/modify/drop tables.
Thanks for help.
I don't think that's the intention of dm-migrations. I believe the easiest way would be something like this:
DataMapper.setup(:default, db1_config)
DataMapper.setup(:new, db2_config)

class Foo
  include DataMapper::Resource

  property :id, Serial
  property :name, String
  # ...
end

DataMapper.finalize

Foo.all.each do |foo|
  DataMapper.repository(:new) do
    # It may not let you set the "id" attribute here...
    Foo.create(foo.attributes)
  end
end
Edit
In hindsight, I'm not sure whether you were asking how to copy table structure as opposed to table data. This is obviously copying table data.
I'm running an import script which imports a CSV dump of a database into a local SQLite database using DataMapper.
My models look like this:
class Staff
  include DataMapper::Resource

  property :staff_id, String, :key => true
  property :full_name, String
end

class Project
  include DataMapper::Resource

  property :project_id, Integer, :key => true
  property :title, String
  property :status, String

  belongs_to :staff
end
The CSV contains the primary key, so when I do the import I use that as the key. The next time I run the import I clear the tables and start again, but DataMapper complains because the primary keys have already been taken.
Is there a way to stop DataMapper complaining about this, or should I just delete the .db file and re-create an empty .db file just before the import runs? If so, what's the easiest way to do this?
You can use DataMapper.auto_migrate! to blow away the tables, and then recreate them matching the current model state. The new tables will be empty of any data from previous runs.
So just after you define your models, but before you begin importing the data do something like the following:
DataMapper.finalize.auto_migrate!
If you create the complete DB from the import, I'd really recommend nuking the database completely to avoid any spill-over from earlier runs. This also keeps your build-db code path well tested.