I use Sequel::Model.db to interact with my DB, but the DB structure sometimes gets changed outside the application, for example via the DB console.
This method:
Sequel::Model.db.schema('table_name')
still returns the old schema, which I guess was cached on the first connection.
How can I reset that cache or, ideally, get an up-to-date view of the DB on each request?
I tried to use a new connection every time:
def db
  @db ||= Sequel.connect(Sequel::Model.db.opts)
end
but, predictably, I got this error eventually:
Sequel::DatabaseConnectionError - PG::ConnectionBad: FATAL: sorry, too many clients already
You shouldn't be changing the structure of the database in an incompatible way while Sequel is running. The easiest way to solve this issue is just to restart the process after changing the database schema, and Sequel will pick up the new database structure.
If you really wanted to try to do this without restarting the process, you could remove the cached schemas (Sequel::Model.db.instance_variable_get(:@schemas).clear) and reset the dataset for all model classes (ModelClass.dataset = ModelClass.dataset for each Sequel::Model). However, that doesn't necessarily result in the same thing, since if you remove a column, the old column name will still have a method defined for it.
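A minimal sketch of that in-process reset (User and Post are example model names; repeat the dataset reassignment for every model class in your app):

# clear the schema cache Sequel keeps on the Database object
Sequel::Model.db.instance_variable_get(:@schemas).clear

# re-assigning each model's dataset makes it re-read the (now uncached) schema
[User, Post].each { |model| model.dataset = model.dataset }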
Related
I am writing a ruby script that needs to connect to a primary database in order to retrieve a list of secondary databases to connect to. The steps I need to carry out are:
Connect to primary DB
Retrieve list of secondary databases to connect to
Iterate through list of secondary databases, extracting a single model from each database
I don't need to be connected to multiple secondary databases at once, so open -> retrieve object -> close is fine.
All of the examples I have seen so far describe multiple connections where the databases are described in database.yml, which is not possible here as the number of databases I need to connect to can vary.
This blog post describes using a connection pool as follows:
spec = database_a
ActiveRecord::ConnectionPool.open(spec) do |conn|
  User.find_all
end

spec = database_b
ActiveRecord::ConnectionPool.open(spec) do |conn|
  User.find_all
end
However, ConnectionPool seems to have changed and the .open method no longer exists.
I would appreciate any pointers.
The problem lies with this:
[1] pry(main)> ActiveRecord::ConnectionPool
NameError: uninitialized constant ActiveRecord::ConnectionPool
ActiveRecord::ConnectionPool doesn't exist. However, on my journey to find it, I found ActiveRecord::ConnectionAdapters::ConnectionPool, which looks like what you want.
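For the open -> query -> close pattern in the question, one common alternative that sidesteps ConnectionPool entirely is to re-point ActiveRecord::Base at each secondary database in turn. A rough sketch, not a drop-in solution (the primary spec, the SecondaryDatabase model, its to_spec method, and the User model are all assumptions):

require 'active_record'

# connect to the primary and materialize the secondary connection specs
ActiveRecord::Base.establish_connection(adapter: 'postgresql', database: 'primary')
secondary_specs = SecondaryDatabase.all.map(&:to_spec)  # hypothetical lookup

secondary_specs.each do |spec|
  ActiveRecord::Base.establish_connection(spec)    # swaps the default connection pool
  record = User.first                              # extract the single model you need
  ActiveRecord::Base.connection_pool.disconnect!   # close before moving to the next DB
end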
I need to do multiple updates on a PostgreSQL database at once. To make it faster, I would like to avoid continual round-trips between my Ruby application and the database.
I've learned about the #update_sql method, but my strategy for queuing those queries is not working. I build a list, sqls, which holds many UPDATE strings, and then call db[sqls.join(';') + ';']. If, instead of executing this line, I print the resulting SQL and copy/paste it into the database console, it works.
Why doesn't this work? How can I send multiple updates in a single SQL statement?
Sequel doesn't offer an API that supports multiple queries in a single statement. If the driver you are using supports it, you can use Sequel::Database#synchronize to get access to the driver connection object, and call methods directly on that.
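If you want to try that raw-driver route, something along these lines could work with the postgres adapter backed by the pg gem (a sketch; db is your Sequel::Database and sqls is the array of UPDATE strings from the question):

# Database#synchronize checks a connection out of Sequel's pool and yields the
# underlying driver object; with the pg gem that object responds to #exec, and
# PostgreSQL's simple query protocol accepts several statements joined by ';'
db.synchronize do |conn|
  conn.exec(sqls.join(';'))
end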
We have been running a single server for the past few months; it holds all the files and SQL data and serves the application. We recently bought two more servers to set up replication, because our database load was so high.
We are going to use simple master-slave replication with transactional replication in MSSQL; however, the methods we use to access LINQ entities must be changed.
All functions that update need to connect to the master, while all the ones that select need to query the slave.
How can we change the connection string based on the operation being performed?
Any help would be appreciated.
Thanks
The simplest approach would be:
Create two connection strings in the web.config <connectionStrings> section, one for read and one for write.
When querying data, pass the read connection string name to the context's constructor,
and pass the write connection string name when updating.
If you are using LINQ to Entities, you can pass the connection string to the context instance, e.g. ModelContext ctx = new ModelContext("[edmx format connectionstring]");
How do I connect to a different DB depending on the request.host value?
Using Sinatra and MongoDB with Mongoid.
I need to read a Sinatra application's menu, data, etc. from different databases. I want to deploy it in only one place and, depending on the request.host (subdomain) value, serve the specific pages.
You're probably better off storing all your data in one database and marking/tagging/categorizing it depending on the subdomain you're on.
If you already set up your Mongoid connection manually, you could do something like this:
connection = Mongo::Connection.new
Mongoid.database = connection.db(request.host)
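If you want this to happen on every request, a Sinatra before filter could set the database per request (a sketch, assuming the same Mongoid 2.x-era Mongoid.database= API shown above):

configure do
  # one shared Mongo connection for the app
  set :mongo_connection, Mongo::Connection.new
end

before do
  # pick the database for this request from the host/subdomain;
  # note that Mongoid.database is process-global, so this assumes requests
  # are handled one at a time
  Mongoid.database = settings.mongo_connection.db(request.host)
end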
But still, I think you're better off with one database.
How do I prevent the connection to the Oracle server from being lost if it is kept idle for some time?
If you use the newer JDBC 4.0 spec, there is an isValid() method available on a connection that lets you check whether the connection is still usable; if it isn't, get a new connection (reconnect) and then execute your SQL.
One possible way that I know of to keep the database connection from being lost is to send a dummy query after the threshold time. By threshold I mean the time after which the connection to the database is expected to become idle or get lost.
Something like this (sketched here in Java/JDBC, since that is what the answer above assumes; connection and lastPingTime are tracked by your own code):

long pingTimeToDb = 60_000L;                       // threshold in milliseconds
long now = System.currentTimeMillis();
if (now - lastPingTime > pingTimeToDb) {
    // send a dummy query to keep the connection alive
    try (Statement stmt = connection.createStatement()) {
        stmt.execute("select 1 from dual");
    }
    lastPingTime = now;
}