Is it better to create new controller instance for each HTTP request in Rack based app or to use the same instance? - ruby

I'm creating a very simple Rack based application as I want it to do a very specific task.
The server.rb looks something like this:
Path = File.expand_path("#{File.dirname __FILE__}/../../")
require "bundler/setup"
require "thin"
require "rack"
%w(parser auth controller).each do |file|
  require "#{Path}/app/server/#{file}.rb"
end

builder = Rack::Builder.app do
  use Auth
  run Parser.new
end

Rack::Handler::Thin.run(builder, :Port => 8080, :threaded => true)
parser.rb looks like:
class Parser
  def initialize
    @controller = Controller.new
  end

  def call(env)
    req = Rack::Request.new(env).params
    res = Rack::Response.new
    res['Content-Type'] = "text/plain"
    command = req["command"]
    if command =~ /\A(register|r|subscribe|s)\z/i
      @controller.register
    end
    res.write command
    res.finish
  end
end
Now my question, from a design perspective: is it better to create one instance of Controller and use it for every request (like I did in the code above), or to create a new controller instance for each request (i.e. change @controller.register to Controller.new.register)? Which is better to use, and why?
Thanks in advance

The overhead of creating a new controller per request is likely not that large.
If you store state in the controller (instance variables etcetera) and you reuse it, you could run into concurrency issues such as race conditions or deadlock when under load.
If you take care to ensure that your Controller object stores no state, you can reuse it. If it does store any per-request state, you will need to ensure that the shared resources are properly synchronized.
My 2c: create a new controller per request until you can confirm that doing so is an actual performance problem. It's simpler, cleaner, and less prone to strange bugs.
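For illustration, a per-request version of the Parser#call from the question might look like this (just a sketch; Controller#register is assumed to work as in your app):
class Parser
  def call(env)
    req = Rack::Request.new(env).params
    res = Rack::Response.new
    res['Content-Type'] = "text/plain"
    command = req["command"]
    if command =~ /\A(register|r|subscribe|s)\z/i
      # A fresh controller per request: no state is shared between
      # requests, so there is nothing to synchronize.
      Controller.new.register
    end
    res.write command
    res.finish
  end
end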

Related

Sinatra Multiple Parallel Requests Variable Behaviour

I am fairly new to Ruby and would like to understand how class instance variables behave in the case of multiple parallel requests.
I have a method inside my controller class which is called once for each request for a specific operation (create in this case):
class DeployProvision
  def self.create(data)
    raise "Input JSON not received." unless data
    # $logger.info input_data.inspect
    failure = false
    response_result = ""
    response_status = "200"
    @validator = SchemaValidate.new
    validation = @validator.validate_create_workflow(data.to_json)
  end
end
This method is called as (DeployProvision.create(data))
I am a little confused about how the @validator class instance variable behaves when multiple requests come in. Is it shared among multiple requests? Is it a good idea to declare this as a class instance variable instead of a local variable?
I am working on an existing code base and would like to understand the intent of making @validator a class instance variable instead of a local variable.
You can write an ultra-simple script like this:
require 'sinatra'
class Foo
  def self.bar
    @test = Time.now
    puts @test
  end
end

get '/' do
  Foo.bar
end
and you'll see that it doesn't buy you anything, because with every call you're creating a new instance of Time (SchemaValidate in your code).
If you used memoization and had something like @validator ||= SchemaValidate.new you would have one instance of SchemaValidate stored between requests.
I don't think that would change anything in terms of performance, and I have no idea why anyone would do something like that.
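For reference, the memoized version of the toy example above would look like this (just a sketch):
class Foo
  def self.bar
    # ||= assigns only on the first call; later requests reuse the same
    # object, so it is effectively shared between requests.
    @test ||= Time.now
    puts @test
  end
end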
You can have some fun with ultra-simple Sinatra scripts like this to test how it behaves.
Good luck with this code!

return json from ruby using rack

I'm still fairly new to server-side scripting and am trying out a little Ruby to write myself some small helpers and learn some new things.
I'm currently trying to write a small Ruby app which sends a JSON listing of all images within a specific folder to my page, where I can work with them further in JS.
I've read quite a few introductions to Ruby and Rails and got a recommendation to look into Rack as a lightweight layer between server and app.
While the Ruby part works fine, I have difficulty understanding how to send out the generated JSON in response to a future AJAX call, for example. I hope someone can give me a few hints or sources to look into for further understanding. Thanks!
require 'json'
class listImages
  def call(env)
    imageDir = Dir.chdir("./img");
    files = Dir.glob("img*")
    n = 0
    tempHash = {}
    files.each do |i|
      tempHash["img#{n}"] = i
      n += 1
    end
    File.open("temp.json","w") do |f|
      f.write(tempHash.to_json)
    end
    [200,{"Content-Type" => "application/javascript"}, ["temp.json"]]
  end
  puts "All done!"
end
run listImages.new
if $0 == __FILE__
  require 'rack'
  Rack::Handler::WEBrick.run MyApp.new
end
You don't have to save the JSON to a file before you can send it. Just send it directly:
[200, {"Content-Type" => "application/json"}, [tempHash.to_json]]
With your current code, you are only sending the String "temp.json".
That said, the rest of your code looks a little bit messy and doesn't conform to Ruby coding standards:
Start your classnames with an uppercase: class ListImages, not class listImages.
Use underscores, not camelcase for variable names: image_dir, not imageDir.
The puts "All done!" statement is outside the method definition and will be called early, when the class is loaded.
You define a class ListImages but in the last line of your code you refer to MyApp.
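Putting those points together, a cleaned-up version might look something like this (just a sketch; it keeps your img* naming and directory layout):
require 'json'
require 'rack'

class ListImages
  def call(env)
    Dir.chdir("./img")
    files = Dir.glob("img*")
    temp_hash = {}
    files.each_with_index do |file, n|
      temp_hash["img#{n}"] = file
    end
    # Send the JSON directly instead of writing it to a file first.
    [200, { "Content-Type" => "application/json" }, [temp_hash.to_json]]
  end
end

if $0 == __FILE__
  Rack::Handler::WEBrick.run ListImages.new
end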

How do I make a class conditionally return one of two other classes?

I have a design problem.
I'm writing a REST client in Ruby. For reasons beyond my control, it has to extend another gem that uses my network's ZooKeeper instance to do service lookup. My client takes a user-provided tier and, based on that value, queries the ZooKeeper registry for the appropriate service URL.
The problem is that I also need to be able to run my client against a locally running version of the service under test. When the service is running locally, ZooKeeper is obviously not involved, so I simply need to be able to make GET requests against the localhost resource URL.
When a user instantiates my gem, they call something like:
client = MyRestClient.new(tier: :dev)
or in local mode
client = MyRestClient.new(tier: :local)
I would like to avoid conditionally hacking the constructor in MyRestClient (and all of the GET methods in MyRestClient) to alter requests based on :local vs. :requests_via_the_zk_gem.
I'm looking for an elegant and clean way to handle this situation in Ruby.
One thought was to create two client classes, one for :local and the other for :not_local. But then I don't know how to provide a single gem interface that will return the correct client object.
If MyClient has a constructor that looks something like this:
class MyClient
  attr_reader :the_klass

  def initialize(opts={})
    if opts[:tier] == :local
      @the_klass = LocalClass.new
    else
      @the_klass = ZkClass.new
    end
    @the_klass
  end
end
then I end up with something like:
test = MyClient.new(tier: :local)
=> #<MyClient:0x007fe4d881ed58 @the_klass=#<LocalClass:0x007fe4d883afd0>>
test.class
=> MyClient
test.the_klass.class
=> LocalClass
Those who then use my gem would have to make calls like:
@client = MyClient.new(tier: :local)
@client.the_klass.get
which doesn't seem right.
I could use a module to return the appropriate class, but then I'm faced with the question of how to provide a single public interface for my gem. I can't instantiate a module with .new.
My sense is that this is a common OO problem and I just haven't run into it yet. It's also possible the answer is staring me in the face and I just haven't found it yet.
Most grateful for any help.
A common pattern is to pass the service into the client, something like:
class MyClient
  attr_reader :service

  def initialize(service)
    @service = service
  end

  def some_method
    service.some_method
  end
end
And create it with:
client = MyRestClient.new(LocalClass.new)
# or
client = MyRestClient.new(ZkClass.new)
You could move these two into class methods:
class MyClient
  def self.local
    new(LocalClass.new)
  end

  def self.dev
    new(ZkClass.new)
  end
end
And instead call:
client = MyRestClient.local
# or
client = MyRestClient.dev
You can use method_missing to delegate from your client to the actual class.
def method_missing(m, *args, &block)
  @the_class.send(m, *args, &block)
end
So whenever a method gets called on your class that doesn't exist (like get in your example), it will be called on @the_class instead.
It's good style to also define the corresponding respond_to_missing? btw:
def respond_to_missing?(m, include_private = false)
  @the_class.respond_to?(m)
end
The use case you are describing looks like a classic factory method use case.
The common solution for this is to create a method (rather than new) which returns the relevant class instance:
class MyClient
  def self.create_client(opts={})
    if opts[:tier] == :local
      LocalClass.new
    else
      ZkClass.new
    end
  end
end
And now your usage is:
test = MyClient.create_client(tier: :local)
=> #<LocalClass:0x007fe4d881ed58>
test.class
=> LocalClass

Ruby object with object property

I have a class that uses the AWS S3 gem and have several methods in my class that utilise the gem. My issue is, that rather than configuring it in several locations, I'd like to make it a property of my object.
In PHP, we'd do this:
<?php
class myClass {
    private $obj;

    public function __construct() {
        $this->obj = new Object();
    }
}
?>
And then I could use $this->obj->method() anywhere in myClass.
I am having a hard time getting something similar to work in Ruby.
My scenario is similar to this:
require 'aws/s3'

class ProfileVideo < ActiveRecord::Base
  def self.cleanup
    # <snip> YAML load my config etc etc
    AWS::S3::Base.establish_connection!(
      :access_key_id => @aws_config['aws_key'],
      :secret_access_key => @aws_config['aws_secret']
    )
  end

  def self.another_method
    # I want to use AWS::S3 here without needing to establish a connection again
  end
end
I have also noticed that initialize in my class fails to execute: a simple puts "here" inside it does nothing, even though puts "here" works in the other methods when run from the rake task. I'm not sure if rake simply never creates an instance the way running ProfileVideo.new would?
Anyway, thanks in advance.
I'm not familiar with the S3 gem in particular but here are a couple ways you could go about this.
If you simply want to make establishing the connection easier, you can create a method in your model like so:
def open_s3
  return if @s3_opened
  AWS::S3::Base.establish_connection!(
    :access_key_id => @aws_config['aws_key'],
    :secret_access_key => @aws_config['aws_secret']
  )
  @s3_opened = true
end
then you can call open_s3 at the top of any methods that require it and it will only open once.
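For example, since cleanup and another_method in the question are class methods, the helper would need to live at the class level as well (a sketch, with @aws_config assumed to be loaded the same way as in cleanup):
class ProfileVideo < ActiveRecord::Base
  def self.open_s3
    return if @s3_opened
    AWS::S3::Base.establish_connection!(
      :access_key_id => @aws_config['aws_key'],
      :secret_access_key => @aws_config['aws_secret']
    )
    @s3_opened = true
  end

  def self.another_method
    open_s3  # establishes the connection only on the first call
    # ... use AWS::S3 here ...
  end
end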
Another route you could take is to place the connection code in a before hook set to fire before any other hooks (IIRC, the order in which you define them sets the order in which they fire) and then make your calls.
In either case, I would recommend against putting your AWS key and secret into your code. Instead, those should go into a config file that is ignored by your version control system and generated on deploy for remote systems.
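For example, one common arrangement is a git-ignored YAML file loaded at startup (a sketch; the file name and keys here are just placeholders):
require 'yaml'

# config/aws.yml is listed in .gitignore and might look like:
#   aws_key: YOUR_KEY
#   aws_secret: YOUR_SECRET
@aws_config = YAML.load_file('config/aws.yml')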

Calling Sinatra from within Sinatra

I have a Sinatra based REST service app and I would like to call one of the resources from within one of the routes, effectively composing one resource from another. E.g.
get '/someresource' do
  otherresource = get '/otherresource'
  # do something with otherresource, return a new resource
end

get '/otherresource' do
  # etc.
end
A redirect will not work since I need to do some processing on the second resource and create a new one from it. Obviously I could a) use RestClient or some other client framework, or b) structure my code so that all of the logic for otherresource is in a method and just call that. However, it feels like it would be much cleaner if I could just re-use my resources from within Sinatra using its DSL.
Another option (I know this isn't answering your actual question) is to put your common code (even the template render) within a helper method, for example:
helpers do
  def common_code( layout = true )
    @title = 'common'
    erb :common, :layout => layout
  end
end

get '/foo' do
  @subtitle = 'foo'
  common_code
end

get '/bar' do
  @subtitle = 'bar'
  common_code
end

get '/baz' do
  @subtitle = 'baz'
  @common_snippet = common_code( false )
  erb :large_page_with_common_snippet_injected
end
Sinatra's documentation covers this: essentially, you use the underlying Rack interface's call method:
http://www.sinatrarb.com/intro.html#Triggering%20Another%20Route
Triggering Another Route
Sometimes pass is not what you want, instead you would like to get the result of calling another route. Simply use call to achieve this:
get '/foo' do
  status, headers, body = call env.merge("PATH_INFO" => '/bar')
  [status, headers, body.map(&:upcase)]
end

get '/bar' do
  "bar"
end
I was able to hack something up by making a quick and dirty rack request and calling the Sinatra (a rack app) application directly. It's not pretty, but it works. Note that it would probably be better to extract the code that generates this resource into a helper method instead of doing something like this. But it is possible, and there might be better, cleaner ways of doing it than this.
#!/usr/bin/env ruby
require 'rubygems'
require 'stringio'
require 'sinatra'
get '/someresource' do
  resource = self.call(
    'REQUEST_METHOD' => 'GET',
    'PATH_INFO' => '/otherresource',
    'rack.input' => StringIO.new
  )[2].join('')
  resource.upcase
end

get '/otherresource' do
  "test"
end
If you want to know more about what's going on behind the scenes, I've written a few articles on the basics of Rack you can read. There is What is Rack? and Using Rack.
This may or may not apply in your case, but when I’ve needed to create routes like this, I usually try something along these lines:
%w(main other).each do |uri|
  get "/#{uri}" do
    @res = "hello"
    @res.upcase! if uri == "other"
    @res
  end
end
Building on AboutRuby's answer, I needed to support fetching static files in lib/public as well as query parameters and cookies (for maintaining authenticated sessions). I also chose to raise exceptions on non-200 responses (and handle them in the calling functions).
If you trace Sinatra's self.call method in sinatra/base.rb, it takes an env parameter and builds a Rack::Request with it, so you can dig in there to see what parameters are supported.
I don't recall all the conditions of the return statements (I think there were some Ruby 2 changes), so feel free to tune to your requirements.
Here's the function I'm using:
def get_route url
  fn = File.join(File.dirname(__FILE__), 'public'+url)
  return File.read(fn) if File.exist?(fn)
  base_url, query = url.split('?')
  begin
    result = self.call('REQUEST_METHOD' => 'GET',
                       'PATH_INFO' => base_url,
                       'QUERY_STRING' => query,
                       'rack.input' => StringIO.new,
                       'HTTP_COOKIE' => @env['HTTP_COOKIE'] # Pass auth credentials
                      )
  rescue Exception => e
    puts "Exception when fetching self route: #{url}"
    raise e
  end
  raise "Error when fetching self route: #{url}" unless result[0] == 200 # status
  return File.read(result[2].path) if result[2].is_a? Rack::File
  return result[2].join('') rescue result[2].to_json
end
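Called from a route, it might be used like this (a sketch; the path and query string are just placeholders):
get '/someresource' do
  other = get_route '/otherresource?format=json'
  # post-process the fetched resource however you need
  other.upcase
end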
