Ruby Sequel and PostgreSQL - too many clients (connections)

I'm using the Sequel gem inside a DB class that is used across my app (a Rack app) and it's instantiated only once.
The DB class initialises Sequel once and has some methods I call, mainly read-only:
def initialize
  @psql ||= Sequel.connect('postgres://localhost/mydb')
end

def query_example
  @psql[:users].order(:time)
end
The app is basically an API. Something like:
class API < Grape::API
  format :json

  before do
    @db = Db.new
  end

  get '/' do
    @db.query_example
  end
end
This works until I reach 100 connections in PostgreSQL. I assume Sequel is using some sort of connection pool but somehow isn't freeing up the connections? I can see the 100 SELECTs in the pg_stat_activity table with a status of 'idle'. However, every new request fails with the following error:
Sequel::DatabaseConnectionError: PG::ConnectionBad: FATAL: sorry, too many clients already
/Users/lopezj2/.rvm/gems/ruby-2.1.2/gems/sequel-4.22.0/lib/sequel/adapters/postgres.rb:236:in `initialize'
/Users/lopezj2/.rvm/gems/ruby-2.1.2/gems/sequel-4.22.0/lib/sequel/adapters/postgres.rb:236:in `new'
/Users/lopezj2/.rvm/gems/ruby-2.1.2/gems/sequel-4.22.0/lib/sequel/adapters/postgres.rb:236:in `connect'
/Users/lopezj2/.rvm/gems/ruby-2.1.2/gems/sequel-4.22.0/lib/sequel/connection_pool.rb:101:in `make_new'
It looks like Sequel is trying to create a new connection in the pool; however, the app is not particularly chatty.

You should create the connection pool only once and check out a connection from it for each request, but in your code you create a new pool (a new Sequel::Database) on every request.
You can change your DB class like this:
class DB
  class << self
    attr_reader :psql
  end

  # Note that @psql is a class instance variable
  @psql = Sequel.connect('postgres://localhost/mydb')

  def query_example
    DB.psql[:users].order(:time)
  end
end
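The same idea can also be expressed without the wrapper class by connecting once at boot and referencing the resulting handle everywhere. This variant is not from the answer above, just a common Sequel arrangement; the max_connections option is an illustrative setting:

require 'sequel'
require 'grape'

# Connect exactly once per process; Sequel manages its own connection pool.
DB = Sequel.connect('postgres://localhost/mydb', max_connections: 10)

class API < Grape::API
  format :json

  get '/' do
    # Each call checks a connection out of the shared pool and returns it when done.
    DB[:users].order(:time).all
  end
end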

Related

rspec test failing - methods.include?

I keep getting this error in RSpec. Could someone please tell me what I'm doing wrong?
1) MyServer uses module
Failure/Error: expect(MyClient.methods.include?(:connect)).to be true
expected true
got false
# ./spec/myclient_spec.rb:13:in `block (2 levels) in <top (required)>'
This is my client.rb
#!/bin/ruby
require 'socket'

# Simple reusable socket client
module SocketClient
  def connect(host, port)
    sock = TCPSocket.new(host, port)
    begin
      result = yield sock
    ensure
      sock.close
    end
    result
  rescue Errno::ECONNREFUSED
  end
end

# Should be able to connect to MyServer
class MyClient
  include SocketClient
end
And this is my spec.rb
describe 'My server' do
  subject { MyClient.new('localhost', port) }
  let(:port) { 1096 }

  it 'uses module' do
    expect(MyClient.const_defined?(:SocketClient)).to be true
    expect(MyClient.methods.include?(:connect)).to be true
  end
end
I have method connect defined in module SocketClient. I don't understand why the test keeps failing.
The class MyClient doesn't have a method named connect. Try it: MyClient.connect will not work.
If you want to check what methods a class defines for its instances, use instance_methods: MyClient.instance_methods.include?(:connect) will be true. methods lists the methods an object itself responds to, so MyClient.new(*args).methods.include?(:connect) would be true.
Really, though, for checking whether a specific instance method exists on a class you should use method_defined?, and for checking whether an object itself responds to a specific method, you should use respond_to?:
MyClient.method_defined?(:connect)
MyClient.new(*args).respond_to?(:connect)
If you really do want MyClient.connect to work directly, you'd need to use Object#extend rather than Module#include (see What is the difference between include and extend in Ruby?).
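A quick illustration of the distinction, assuming the SocketClient module from the question (MyOtherClient is a made-up name for the extend case):

class MyClient
  include SocketClient        # adds connect as an instance method
end

MyClient.method_defined?(:connect)    # => true
MyClient.new.respond_to?(:connect)    # => true
MyClient.methods.include?(:connect)   # => false (not a class-level method)

class MyOtherClient
  extend SocketClient         # adds connect to the class object itself
end

MyOtherClient.respond_to?(:connect)   # => true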

How do I make a class conditionally return one of two other classes?

I have a design problem.
I'm writing a REST client in Ruby. For reasons beyond my control, it has to extend another gem that uses my network's ZooKeeper instance to do service lookup. My client takes a user-provided tier and, based on that value, queries the ZooKeeper registry for the appropriate service URL.
The problem is that I also need to be able to run my client against a locally running version of the service under test. When the service is running locally, ZooKeeper is obviously not involved, so I simply need to be able to make GET requests against the localhost resource URL.
When a user instantiates my gem, they call something like:
client = MyRestClient.new(tier: :dev)
or in local mode
client = MyRestClient.new(tier: :local)
I would like to avoid conditionally hacking the constructor in MyRestClient (and all of the GET methods in MyRestClient) to alter requests based on :local vs. :requests_via_the_zk_gem.
I'm looking for an elegant and clean way to handle this situation in Ruby.
One thought was to create two client classes, one for :local and the other for :not_local. But then I don't know how to provide a single gem interface that will return the correct client object.
If MyClient has a constructor that looks something like this:
class MyClient
  attr_reader :the_klass

  def initialize(opts = {})
    if opts[:tier] == :local
      @the_klass = LocalClass.new
    else
      @the_klass = ZkClass.new
    end
    @the_klass
  end
end
then I end up with something like:
test = MyClient.new(tier: :local)
=> #<MyClient:0x007fe4d881ed58 @the_klass=#<LocalClass:0x007fe4d883afd0>>
test.class
=> MyClient
test.the_klass.class
=> LocalClass
Those who then use my gem would have to make calls like:
@client = MyClient.new(tier: :local)
@client.the_klass.get
which doesn't seem right.
I could use a module to return the appropriate class, but then I'm faced with the question of how to provide a single public interface for my gem. I can't instantiate a module with .new.
My sense is that this is a common OO problem and I just haven't run into it yet. It's also possible the answer is staring me in the face and I just haven't found it yet.
Most grateful for any help.
A common pattern is to pass the service into the client, something like:
class MyClient
  attr_reader :service

  def initialize(service)
    @service = service
  end

  def some_method
    service.some_method
  end
end
And create it with:
client = MyClient.new(LocalClass.new)
# or
client = MyClient.new(ZkClass.new)
You could move these two into class methods:
class MyClient
  def self.local
    new(LocalClass.new)
  end

  def self.dev
    new(ZkClass.new)
  end
end
And instead call:
client = MyClient.local
# or
client = MyClient.dev
You can use method_missing to delegate from your client to the actual class.
def method_missing(m, *args, &block)
  @the_class.send(m, *args, &block)
end
So whenever a method gets called on your class that doesn't exist (like get in your example), it will be called on @the_class instead.
It's good style to also define the corresponding respond_to_missing? btw:
def respond_to_missing?(m, include_private = false)
  @the_class.respond_to?(m)
end
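Putting this together with the constructor from the question, a minimal sketch (names follow the question; LocalClass and ZkClass are stand-ins, and the get call at the end only works if the wrapped class defines one):

class MyClient
  def initialize(opts = {})
    @the_klass = opts[:tier] == :local ? LocalClass.new : ZkClass.new
  end

  def method_missing(m, *args, &block)
    @the_klass.send(m, *args, &block)
  end

  def respond_to_missing?(m, include_private = false)
    @the_klass.respond_to?(m, include_private) || super
  end
end

client = MyClient.new(tier: :local)
client.get('/some/resource')   # forwarded to LocalClass#get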
The use case you are describing looks like a classic factory method use case.
The common solution for this is to create a method (not new) which returns the relevant class instance:
class MyClient
  def self.create_client(opts = {})
    if opts[:tier] == :local
      LocalClass.new
    else
      ZkClass.new
    end
  end
end
And now your usage is:
test = MyClient.create_client(tier: :local)
=> #<LocalClass:0x007fe4d881ed58>
test.class
=> LocalClass

Can I hook into ActiveRecord connection establishment?

I would like to add a user-defined function using SQLite3's create_function, which will be used by database triggers.
Is there a way to hook into ActiveRecord connection establishment to run some code each time a connection to the database is made, where one could create the function and make it available to the triggers? This would also be useful for setting pragmas on the connection.
Here is what I've found on my own. Using an initializer in app/config/initializers I do this:
ActiveRecord::ConnectionAdapters::AbstractAdapter.class_eval do
  alias_method :orig_initialize, :initialize

  def initialize(connection, logger = nil, pool = nil)
    orig_initialize(connection, logger, pool)
    if connection.is_a? SQLite3::Database
      # 'reverse' is just an example :^)
      connection.create_function('reverse', 1) { |func, value| func.result = if value then value.to_s.reverse end }
    end
  end
end
I'm not sure if this class gets reloaded, but I'll cross that bridge if I get to it.
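Since the question also mentions pragmas, the same hook is a natural place to set them on every new connection. A sketch along the same lines (the foreign_keys pragma and the alias name are just illustrative choices, not from the original answer):

ActiveRecord::ConnectionAdapters::AbstractAdapter.class_eval do
  alias_method :orig_initialize_for_pragmas, :initialize

  def initialize(connection, logger = nil, pool = nil)
    orig_initialize_for_pragmas(connection, logger, pool)
    # Apply per-connection settings as soon as the raw handle exists.
    connection.execute('PRAGMA foreign_keys = ON') if connection.is_a?(SQLite3::Database)
  end
end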

Transactions in Ruby Sequel module: how to get DB object?

I'm working in a Sinatra application using Sequel.
I want to make a transaction; according to the manual, I have to use the DB object. How can I get this object from any part of my code?
You can define it in your base app.rb (or equivalent) or include a separate file where you configure the DB object if you wish.
For example, in one of my Sinatra apps, I have an app.rb that contains something like:
class App < Sinatra::Application
  # lots of stuff here...
end

require_relative 'models/init'
In my models/init.rb I configure DB:
require 'yaml'
require 'sequel'

conf = YAML.load(File.open(File.expand_path('./config/dbconn.yml')))
env = ENV['RACK_ENV'] || 'development'

DB = Sequel.connect(host: conf['database'][env]['host'],
                    port: conf['database'][env]['port'],
                    database: conf['database'][env]['schema'],
                    username: conf['database'][env]['username'],
                    password: conf['database'][env]['password'],
                    adapter: conf['database'][env]['adapter'],
                    encoding: conf['database'][env]['encoding'])

raise "Unable to connect to #{conf['database'][env]['host']}" unless DB.test_connection
...
That's one way. Hope it helps.
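Once DB is a top-level constant like this, it is reachable from any route, helper, or model file; a quick sketch of a transaction (the route, table, and column names are made up for illustration):

post '/orders' do
  DB.transaction do
    # Both inserts commit together or roll back together.
    order_id = DB[:orders].insert(user_id: params['user_id'])
    DB[:order_items].insert(order_id: order_id, sku: params['sku'])
  end
end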
You mention that you want to reference it from any part of your code; however, I've found that transactions tend to be wrapped inside the models, and from there it's relatively easy:
class X < Sequel::Model
  def self.y
    self.db.transaction do
      ...
    end
  end

  def z
    db.transaction do
      ...
    end
  end
end

Ruby ActiveRecord Dynamic Model creation

I am trying to establish multiple DB connections with ActiveRecord. Currently I need to insert data into 2 databases in total, and the number of databases may grow.
So I created 2 classes dynamically which extend ActiveRecord::Base:
Object.const_set("Connection1",Class.new(ActiveRecord::Base) do
self.abstract_class = true
self.establish_connection({
:host=>"localhost", :username=>"root", :password=>"root", :database=>"db1", :encoding=>"utf8", :adapter=>"mysql2"})
end)
Object.const_set("Connection2",Class.new(ActiveRecord::Base) do
self.abstract_class = true
self.establish_connection({
:host=>"localhost", :username=>"root", :password=>"root", :database=>"db2", :encoding=>"utf8", :adapter=>"mysql2"})
end)
Then I created dynamic models extending each class accordingly:
Object.const_set("ConnectionUser1",Class.new(Connection1) do
self.table_name = 'user'
def self.foo
all.count
end
end)
Object.const_set("ConnectionUser2",Class.new(Connection2) do
self.table_name = 'user'
def self.foo
all.count
end
end)
Then when I tried to call the foo method:
p ConnectionUser1.foo
p ConnectionUser2.foo
It gives me an ActiveRecord::ConnectionNotEstablished error.
I heard that if a model doesn't have its own connection, ActiveRecord will use its parent's connection.
So, according to this, ConnectionUser1 should use the connection of Connection1 and ConnectionUser2 the connection of Connection2.
Then why does ActiveRecord fail to establish a connection?
Any help will be appreciated.
Thank you.
Take a look at the link below, which shows how to use multiple databases with ActiveRecord:
How do i work with two different databases in rails with active records?
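For reference, the approach in that answer boils down to an abstract base class per database, with models inheriting from it. A sketch of the statically defined equivalent, using the same credentials as the question (whether the Class.new version above behaves identically is exactly what is in question here):

class Connection1 < ActiveRecord::Base
  self.abstract_class = true
  establish_connection(
    :adapter => "mysql2", :host => "localhost",
    :username => "root", :password => "root",
    :database => "db1", :encoding => "utf8")
end

class ConnectionUser1 < Connection1
  self.table_name = 'user'

  def self.foo
    all.count
  end
end

p ConnectionUser1.foo  # uses Connection1's connection pool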
