I'm writing a Ruby gem where I have a Connection module for the Faraday configuration:
module Example
  module Connection
    private

    def connection
      Faraday.new(url: 'http://localhost:3000/api') do |conn|
        conn.request :url_encoded             # form-encode POST params
        conn.response :logger                 # log requests to STDOUT
        conn.adapter Faraday.default_adapter  # make requests with Net::HTTP
        conn.use Faraday::Response::ParseJson
        conn.use FaradayMiddleware::RaiseHttpException
      end
    end
  end
end
The second module, which makes the API requests, looks like this:
module Example
  module Request
    include Connection

    def get(uri)
      connection.get(uri).body
    end

    def post(url, attributes)
      response = connection.post(url) do |request|
        request.body = attributes.to_json
      end
    end

    def self.extended(base)
      base.include(InstanceMethods)
    end

    module InstanceMethods
      include Connection

      def put(url, attributes)
        response = connection.put(url) do |request|
          request.body = attributes.to_json
        end
      end
    end
  end
end
The Customer class, where I use Request, looks like this:
module Example
  class Customer
    extend Request

    attr_accessor :id, :name, :age

    def initialize(attrs)
      attrs.each do |key, value|
        instance_variable_set("@#{key}", value)
      end
    end

    def self.all
      customers = get('v1/customer')
      customers.map { |cust| new cust }
    end

    def save
      params = {
        id: self.id,
        age: self.age,
        name: self.name,
      }
      put("v1/customers/#{self.id}", params)
    end
  end
end
So, as you can see, in the Customer.all class method I'm calling Request#get, which is available because I extended Request in Customer. Then I'm using the self.extended hook in the Request module to make Request#put available to Customer instances. My question is: is this a good approach to using mixins, or do you have any suggestions?
Mixins are a strange beast. Best practices vary depending on who you talk to. As far as reuse goes, you've achieved that here with mixins, and you have a nice separation of concerns.
However, mixins are a form of inheritance (take a peek at #ancestors). I would challenge this design: you shouldn't use inheritance here, because a Customer doesn't have an "is-a" relationship with Connection. I would recommend composition instead (e.g. pass in the Connection/Request), as it makes more sense in this case and gives stronger encapsulation.
One guideline for writing mixins is to make everything end in "-able", so you would have Enumerable, Sortable, Runnable, Callable, etc. In this sense, mixins are generic extensions that provide some sort of helpers that depend on a very specific interface (e.g. Enumerable depends on the including class implementing #each).
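To illustrate that guideline, here's a tiny made-up example (the Cacheable and Catalogue names are invented, not from your code): the mixin only works because the including class promises to implement one method, #fetch, just as Enumerable relies on #each.

module Cacheable
  # Everything this mixin offers is built on top of one method (#fetch)
  # that the including class must define.
  def cached_fetch(key)
    @cache ||= {}
    @cache[key] ||= fetch(key)
  end
end

class Catalogue
  include Cacheable

  def fetch(key)
    "value for #{key}" # pretend this is an expensive lookup
  end
end

Catalogue.new.cached_fetch(:price) # => "value for price"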
You could also use mixins for cross-cutting concerns. For example, we've used mixins in our background jobs so that we could add logging without having to touch the class's source code. In this case, if a new job wants logging, it just mixes in the concern, which is coupled to the framework and will inject itself properly.
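A minimal sketch of that kind of cross-cutting mixin (the job and module names here are invented for illustration):

module Loggable
  # Wraps #perform with log lines without touching the job's own code.
  def perform(*args)
    puts "[#{self.class.name}] starting with #{args.inspect}"
    result = super
    puts "[#{self.class.name}] finished"
    result
  end
end

class ImportJob
  # prepend places the module before the class in the ancestor chain,
  # so Loggable#perform runs around ImportJob#perform via super.
  prepend Loggable

  def perform(file)
    # ... do the actual work ...
  end
end

ImportJob.new.perform('customers.csv')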
My general rule of thumb is: don't use them if you don't have to. They make understanding the code a lot more complicated in most cases.
EDIT: Adding an example of composition. In order to maintain the interface you have above, you'd need some sort of global connection state, so it may not make sense. Here's an alternative that uses composition:
class CustomerConnection
  # CustomerConnection is composed of a Connection and retains isolation
  # of responsibilities. It also uses constructor injection (e.g. takes
  # its dependencies in the constructor), which means easy testing.
  def initialize(connection)
    @connection = connection
  end

  def all_customers
    @connection.get('v1/customers').map { |res| Customer.new(res) }
  end
end
connection = Connection.new
CustomerConnection.new(connection).all_customers
I have a design problem.
I'm writing a REST client in Ruby. For reasons beyond my control, it has to extend another gem that uses my network's ZooKeeper instance to do service lookup. My client takes a user-provided tier and, based on that value, queries the ZooKeeper registry for the appropriate service URL.
The problem is that I also need to be able to run my client against a locally running version of the service under test. When the service is running locally, ZooKeeper is obviously not involved, so I simply need to be able to make GET requests against the localhost resource URL.
When a user instantiates my gem, they call something like:
client = MyRestClient.new(tier: :dev)
or in local mode
client = MyRestClient.new(tier: :local)
I would like to avoid conditionally hacking the constructor in MyRestClient (and all of the GET methods in MyRestClient) to alter requests based on :local vs. :requests_via_the_zk_gem.
I'm looking for an elegant and clean way to handle this situation in Ruby.
One thought was to create two client classes, one for :local and the other for :not_local. But then I don't know how to provide a single gem interface that will return the correct client object.
If MyClient has a constructor that looks something like this:
class MyClient
  attr_reader :the_klass

  def initialize(opts = {})
    if opts[:tier] == :local
      @the_klass = LocalClass.new
    else
      @the_klass = ZkClass.new
    end
    @the_klass
  end
end
then I end up with something like:
test = MyClient.new(tier: :local)
=> #<MyClient:0x007fe4d881ed58 @the_klass=#<LocalClass:0x007fe4d883afd0>>
test.class
=> MyClient
test.the_klass.class
=> LocalClass
Those who then use my gem would have to make calls like:
@client = MyClient.new(tier: :local)
@client.the_klass.get
which doesn't seem right.
I could use a module to return the appropriate class, but then I'm faced with the question of how to provide a single public interface for my gem. I can't instantiate a module with .new.
My sense is that this is a common OO problem and I just haven't run into it yet. It's also possible the answer is staring me in the face and I just haven't found it yet.
Most grateful for any help.
A common pattern is to pass the service into the client, something like:
class MyClient
  attr_reader :service

  def initialize(service)
    @service = service
  end

  def some_method
    service.some_method
  end
end
And create it with:
client = MyRestClient.new(LocalClass.new)
# or
client = MyRestClient.new(ZkClass.new)
You could move these two into class methods:
class MyClient
  def self.local
    new(LocalClass.new)
  end

  def self.dev
    new(ZkClass.new)
  end
end
And instead call:
client = MyRestClient.local
# or
client = MyRestClient.dev
You can use method_missing to delegate from your client to the actual class.
def method_missing(m, *args, &block)
  @the_class.send(m, *args, &block)
end
So whenever a method gets called on your class that doesn't exist (like get in your example), it will be called on @the_class instead.
It's good style to also define the corresponding respond_to_missing? btw:
def respond_to_missing?(m, include_private = false)
  @the_class.respond_to?(m)
end
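Putting the two hooks together inside the wrapper class (a sketch, reusing the @the_klass name from the question), it could look like this:

class MyClient
  def initialize(opts = {})
    @the_klass = opts[:tier] == :local ? LocalClass.new : ZkClass.new
  end

  # Forward unknown methods (e.g. #get) to the wrapped client.
  def method_missing(m, *args, &block)
    if @the_klass.respond_to?(m)
      @the_klass.send(m, *args, &block)
    else
      super
    end
  end

  def respond_to_missing?(m, include_private = false)
    @the_klass.respond_to?(m) || super
  end
end

MyClient.new(tier: :local).get('v1/customers') # delegated, assuming LocalClass implements #get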
The use case you are describing looks like a classic factory method use case.
The common solution for this is to create a method (not new) which returns the relevant class instance:
class MyClient
  def self.create_client(opts = {})
    if opts[:tier] == :local
      LocalClass.new
    else
      ZkClass.new
    end
  end
end
And now your usage is:
test = MyClient.create_client(tier: :local)
=> #<LocalClass:0x007fe4d881ed58>
test.class
=> LocalClass
Situation: testing a Rails application using RSpec, FactoryGirl and VCR.
Every time a User is created, an associated Stripe customer is created through Stripe's API. While testing, it doesn't really make sense to add a VCR.use_cassette or describe "...", vcr: {cassette_name: 'stripe-customer'} do ... to every spec where User creation is involved. My current solution is the following:
RSpec.configure do |config|
  config.around do |example|
    VCR.use_cassette('stripe-customer') do |cassette|
      example.run
    end
  end
end
But this isn't sustainable because the same cassette will be used for every http request, which of course is very bad.
Question: How can I use specific fixtures (cassettes) based on individual request, without specifying the cassette for every spec?
I have something like this in mind, pseudo-code:
stub_request(:post, "api.stripe.com/customers").with(File.read("cassettes/stripe-customer"))
Relevant pieces of code (as a gist):
# user_observer.rb
class UserObserver < ActiveRecord::Observer
  def after_create(user)
    user.create_profile!
    begin
      customer = Stripe::Customer.create(
        email: user.email,
        plan: 'default'
      )
      user.stripe_customer_id = customer.id
      user.save!
    rescue Stripe::InvalidRequestError => e
      raise e
    end
  end
end
# vcr.rb
require 'vcr'

VCR.configure do |config|
  config.default_cassette_options = { record: :once, re_record_interval: 1.day }
  config.cassette_library_dir = 'spec/fixtures/cassettes'
  config.hook_into :webmock
  config.configure_rspec_metadata!
end
# user_spec.rb
describe :InstanceMethods do
  let(:user) { FactoryGirl.create(:user) }

  describe "#flexible_name" do
    it "returns the name when name is specified" do
      user.profile.first_name = "Foo"
      user.profile.last_name = "Bar"
      user.flexible_name.should eq("Foo Bar")
    end
  end
end
Edit
I ended up doing something like this:
VCR.configure do |vcr|
  vcr.around_http_request do |request|
    if request.uri =~ /api.stripe.com/
      uri = URI(request.uri)
      name = "#{[uri.host, uri.path, request.method].join('/')}"
      VCR.use_cassette(name, &request)
    elsif request.uri =~ /twitter.com/
      VCR.use_cassette('twitter', &request)
    else
    end
  end
end
VCR 2.x includes a feature specifically to support use cases like these:
https://relishapp.com/vcr/vcr/v/2-4-0/docs/hooks/before-http-request-hook!
https://relishapp.com/vcr/vcr/v/2-4-0/docs/hooks/after-http-request-hook!
https://relishapp.com/vcr/vcr/v/2-4-0/docs/hooks/around-http-request-hook!
VCR.configure do |vcr|
  vcr.around_http_request(lambda { |req| req.uri =~ /api.stripe.com/ }) do |request|
    VCR.use_cassette(request.uri, &request)
  end
end
IMO, libraries like this should provide you with a mock class, but w/e.
You can do your pseudocode example already with Webmock, which is the default internet mocking library that VCR uses.
body = YAML.load(File.read 'cassettes/stripe-customer.yml')['http_interactions'][0]['response']['body']['string']
stub_request(:post, "api.stripe.com/customers").to_return(:body => body)
You could put that in a before block that only runs on a certain tag, then tag the requests that make API calls.
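For example, something along these lines (a sketch; the stripe: true tag name and the before-hook wiring are my assumptions, not from the original post):

RSpec.configure do |config|
  # Runs only for examples tagged with `stripe: true`,
  # e.g. `it "creates a customer", stripe: true do ... end`
  config.before(:each, stripe: true) do
    body = YAML.load(File.read('cassettes/stripe-customer.yml'))['http_interactions'][0]['response']['body']['string']
    stub_request(:post, "api.stripe.com/customers").to_return(body: body)
  end
end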
In their tests, they override the methods that delegate to RestClient (link). You could do this as well; take a look at their test suite to see how they use it, in particular their use of test_response. I think this is a terribly hacky way of doing things, and would feel really uncomfortable with it (note that I'm in the minority with this discomfort), but it should work for now (it has the potential to break without you knowing until runtime). If I were to do this, I'd want to build out real objects for the two mocks (the one mocking rest-client, and the other mocking the rest-client response).
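The general shape of that approach is roughly the following (a purely hypothetical sketch; the module, method, and response names are invented and are not Stripe's actual internals):

# Hypothetical: suppose the gem funnels every HTTP call through one
# method that delegates to RestClient.
module PaymentGem
  def self.execute_request(opts)
    RestClient::Request.execute(opts)
  end
end

# A canned response object standing in for the real RestClient response.
FakeResponse = Struct.new(:code, :body)

# In the test suite, that single delegation point is redefined so no
# network call is ever made.
def PaymentGem.execute_request(_opts)
  FakeResponse.new(200, '{"id": "cus_123"}')
end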
The whole point (mostly, anyway) of VCR is just to replay the response of a previous request. If you are in there picking and choosing what response goes back to what request, you are, quote/unquote, doing-it-wrong.
Like Joshua already said, you should use Webmock for something like this. That's what VCR is using behind the scenes anyway.
I am trying to use some functionality in ActiveModel but I'm having trouble making everything work. I've included my class file and the test I'm running.
The test is failing with:
undefined method `attr_accessible'
I really don't know why, since MassAssignmentSecurity should bring that in, and it is in fact running. I've also tried including all of ActiveModel, but that doesn't work either. It doesn't seem to matter whether I use include or extend to bring in MassAssignmentSecurity.
If I pass in some attributes in my test to exercise "assign_attributes" in the initializer, that fails as well. I'm fairly new to Rails, so I'm hoping I'm just missing something really simple.
TIA.
Using rails 3.2.12
my_class.rb
class MyClass
  include ActiveModel::MassAssignmentSecurity
  include ActiveModel::Validations
  include ActiveModel::Conversion
  extend ActiveModel::Naming
  extend ActiveSupport::Callbacks

  attr_accessible :persisted, :creds

  def initialize(attributes = nil, options = {})
    @persisted = false
    assign_attributes(attributes, options) if attributes
    yield self if block_given?
  end
end
my_class_spec.rb
require 'spec_helper'

describe MyClass do
  before do
    @testcase = MyClass.new
  end

  subject { @testcase }

  it_should_behave_like "ActiveModel"
  it { MyClass.should include(ActiveModel::MassAssignmentSecurity) }
  it { should respond_to(:persisted) }
end
support/active_model.rb
shared_examples_for "ActiveModel" do
  include ActiveModel::Lint::Tests

  # to_s is to support ruby-1.9
  ActiveModel::Lint::Tests.public_instance_methods.map { |m| m.to_s }.grep(/^test/).each do |m|
    example m.gsub('_', ' ') do
      send m
    end
  end

  def model
    subject
  end
end
Yikes! What a mess I was yesterday. Might as well answer my own question since I figured out my issues.
attr_accessible in MassAssignmentSecurity does not work like it does with ActiveRecord. It does not create getters and setters. You still have to use attr_accessor if you want those created.
assign_attributes is a convenience method that someone wrote to wrap the mass assignment sanitizer and isn't something baked into MassAssignmentSecurity. An example implementation is below:
def assign_attributes(values, options = {})
  sanitize_for_mass_assignment(values, options[:as]).each do |k, v|
    send("#{k}=", v)
  end
end
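Putting both fixes together, a sketch of the corrected class might look like this (the attribute names are the ones from the question; the usage line at the end is just illustrative):

class MyClass
  include ActiveModel::MassAssignmentSecurity
  include ActiveModel::Validations
  include ActiveModel::Conversion
  extend ActiveModel::Naming
  extend ActiveSupport::Callbacks

  # attr_accessor creates the getters/setters; attr_accessible only
  # whitelists which keys mass assignment may touch.
  attr_accessor :persisted, :creds
  attr_accessible :persisted, :creds

  def initialize(attributes = nil, options = {})
    @persisted = false
    assign_attributes(attributes, options) if attributes
  end

  def assign_attributes(values, options = {})
    sanitize_for_mass_assignment(values, options[:as]).each do |k, v|
      send("#{k}=", v)
    end
  end
end

MyClass.new(creds: 'token').creds # => "token"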
I've been trying the Padrino framework in one of my projects, and there is one thing that really annoys me. I want to implement, for instance, a user registration process using OmniAuth, and I want to break my request handler (controller action) into separate methods, like this:
get ":provider/callback" do
#user = find_the_user_by_oauth(request)
create_user unless #user
store_user_in_session
end
def find_the_user_by_oauth(request)
#...
end
def store_user_in_session
session[:user_id] = #user.id
end
I know it would be nicer to push the logic into the model layer, but my question is: how can I break controller logic into separate methods and share information among them (e.g. via instance variables)? In Rails I created these methods in the private section of my controller, but here it seems I would have to extend the Application class, because the code above throws an undefined method exception. I tried helpers, but helpers don't know about the instance variables, so you have to pass the variables around every time.
What is a good way to keep my controller actions clean in Padrino?
To define a method inside a Padrino controller you can use define_method instead of def.
For your example, do something like this:
Admin.controllers :dummy do
  define_method :find_the_user_by_oauth do |request|
    request.params["username"]
    # ...
  end

  define_method :store_user_in_session do
    session[:user_id] = @user
  end

  get :test do
    @user = find_the_user_by_oauth(request)
    create_user unless @user
    store_user_in_session()
    session.inspect
  end
end
Padrino runs the block sent to Admin.controllers using instance_eval.
See this answer for the differences between define_method and def: https://stackoverflow.com/a/3171649
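A small, generic illustration of that difference in plain Ruby (not Padrino-specific):

class Foo; end

Foo.instance_eval do
  # `def` inside instance_eval defines a method on Foo's singleton class,
  # i.e. a class method of Foo.
  def from_def
    :def
  end

  # `define_method` is sent to Foo itself, so it defines an instance method.
  define_method(:from_define_method) { :define_method }
end

Foo.from_def                   # => :def
Foo.new.from_define_method     # => :define_method
Foo.new.respond_to?(:from_def) # => false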
Possibly off-topic, but would you consider using the Espresso Framework instead?
Then you could solve your issue as simply as:
class App < E
  def index provider, action = 'callback'
    @user = find_the_user_by_oauth
    create_user unless @user
    store_user_in_session
  end

  private

  def find_the_user_by_oauth
    # provider, action are accessed via `action_params`
    # action_params[:provider]
    # action_params[:action]
  end

  def store_user_in_session
    session[:user_id] = @user.id
  end
end
I'm quite new to OOP and I'm concerned that this class that I've written is really poorly designed. It seems to disobey several principles of OOP:
It doesn't contain its own data, but relies on a YAML file for values.
Its methods need to be called in a particular order.
It has a lot of instance variables and methods.
It does work, however. It's robust, but I'll need to modify the source code to add new getter methods every time I add page elements.
It's a model of an HTML document used in an automated test suite. I keep thinking that some of the methods could be put in subclasses, but I'm concerned that I'd then have too many classes.
What do you think?
class BrandFlightsPage < FlightSearchPage
  attr_reader :route, :date, :itinerary_type, :no_of_pax,
              :no_results_error_container, :submit_button_element

  def initialize(browser, page, brand)
    super(browser, page)
    # Get reference to config file
    config_file = File.join(File.dirname(__FILE__), '..', 'config', 'site_config.yml')
    # Store hash of config values in local variable
    config = YAML.load_file config_file
    @brand = brand # brand is specified by the customer in the features file
    # Define instance variables from the hash keys
    config.each do |k, v|
      instance_variable_set("@#{k}", v)
    end
  end

  def visit
    @browser.goto(@start_url)
  end

  def set_origin(origin)
    self.text_field(@route[:attribute] => @route[:origin]).set origin
  end

  def set_destination(destination)
    self.text_field(@route[:attribute] => @route[:destination]).set destination
  end

  def set_departure_date(outbound)
    self.text_field(@route[:attribute] => @date[:outgoing_date]).set outbound
  end

  def set_journey_type(type)
    if type == "return"
      self.radio(@route[:attribute] => @itinerary_type[:single]).set
    else
      self.radio(@route[:attribute] => @itinerary_type[:return]).set
    end
  end

  def set_return_date(inbound)
    self.text_field(@route[:attribute] => @date[:incoming_date]).set inbound
  end

  def set_number_of_adults(adults)
    self.select_list(@route[:attribute] => @no_of_pax[:adults]).select adults
  end

  def set_no_of_children(children)
    self.select_list(@route[:attribute] => @no_of_pax[:children]).select children
  end

  def set_no_of_seniors(seniors)
    self.select_list(@route[:attribute] => @no_of_adults[:seniors]).select seniors
  end

  def no_flights_found_message
    @browser.div(@no_results_error_container[:attribute] => @no_results_error_container[:error_element]).text
    raise UserErrorNotDisplayed, "Expected user error message not displayed" unless divFlightResultErrTitle.exists?
  end

  def submit_search
    self.link(@submit_button_element[:attribute] => @submit_button_element[:button_element]).click
  end
end
If this class is designed as a Facade, then it's not (too) bad a design. It provides a coherent, unified way to perform related operations that rely on a variety of unrelated behavior holders.
However, it appears to have poor separation of concerns, in that this class essentially couples all the various implementation details, which might turn out to be somewhat tricky to maintain.
Finally, the fact that the methods need to be called in a specific order may hint that you're trying to model a state machine, in which case it should probably be broken down into several classes (one per "state"). I don't think there's a "too many methods" or "too many classes" point you'd reach; the point is that the features provided by each class need to be coherent and make sense. Where to draw the line is up to you and your specific implementation's domain requirements.
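To make the state machine idea concrete, here is a rough, hypothetical sketch (the class and method names are invented for illustration, not taken from the code above): each state of the search flow is its own small class, and each step returns the next state, so the required call order is enforced by the objects themselves rather than by convention.

class SearchForm
  def initialize(browser, config)
    @browser = browser
    @config  = config
  end

  # Filling in the itinerary returns the next state, so you cannot
  # submit a search before the form has been filled in.
  def with_itinerary(origin, destination, outbound)
    # ... drive the browser to fill the fields ...
    ReadyToSubmit.new(@browser, @config)
  end
end

class ReadyToSubmit
  def initialize(browser, config)
    @browser = browser
    @config  = config
  end

  def submit
    # ... click the submit button ...
    ResultsPage.new(@browser, @config)
  end
end

class ResultsPage
  def initialize(browser, config)
    @browser = browser
    @config  = config
  end

  def no_flights_found_message
    # ... read the error container text ...
  end
end

# Illustrative usage (given a browser and a config hash): the call order
# is enforced by what each state exposes.
SearchForm.new(browser, config)
          .with_itinerary('LHR', 'JFK', '2024-01-01')
          .submit
          .no_flights_found_message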