How to invoke Mongoid model inside Sidekiq worker? - ruby

Is it possible to write MongoDB document via Mongoid from Sidekiq worker?
# lib/worker.rb
require 'sidekiq'
require 'model' # lib/model.rb

class AwesomeWorker
  include Sidekiq::Worker

  def perform
    RandomNumberModel.create!(random_number: Random.new.rand(1..100))
  end
end
When I run Sidekiq it returns uninitialized constant AwesomeWorker::RandomNumberModel.
What am I doing wrong?

Try ::RandomNumberModel.create!(random_number: Random.new.rand(1..100)) so the constant is looked up from the top level.
If that still fails, you made a mistake in the path to the required file.
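If the :: prefix doesn't help, the model file may simply never be loaded, since outside of Rails nothing boots Mongoid or requires your code for you. A minimal boot file sketch (the file names and the mongoid.yml path are assumptions, not from the question):

```ruby
# boot.rb -- hypothetical loader, started with: sidekiq -r ./boot.rb
require 'sidekiq'
require 'mongoid'

# With no Rails around, load the Mongoid config explicitly.
Mongoid.load!('config/mongoid.yml', :development)

require_relative 'lib/model'  # defines RandomNumberModel
require_relative 'lib/worker' # defines AwesomeWorker
```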

Hope it helps someone :)
You have to create your worker under the app/workers folder. Then every model will be easily accessible there.
# app/workers
class AwesomeWorker
  include Sidekiq::Worker

  def perform
    RandomNumberModel.create!(random_number: Random.new.rand(1..100))
  end
end

Related

Newly created sidekiq worker not found (NameError: uninitialized constant UniqueJobWorkerTest::UniqueJobWorker)

unique_job_worker.rb
# -*- encoding : utf-8 -*-
require_relative 'logging_helper'

class UniqueJobWorker
  include Sidekiq::Worker
  include WorkerHelper

  sidekiq_options retry: false,
                  backtrace: true,
                  queue: :sender,
                  failures: true

  def perform(worker, campaign_guid, queue)
    require 'pry'; binding.pry
  end
end
unique_job_worker_test.rb
require 'test_helper'
require 'mocha/setup'

class UniqueJobWorkerTest < ActiveSupport::TestCase
  def setup
    require 'pry'; binding.pry
    @worker = UniqueJobWorker.new
  end

  test "it exists" do
    assert @worker
  end
end
When the job is enqueued through Redis I get this response:
INFO -- : Exception: uninitialized constant UniqueJobWorker
Any suggestions as to why my newly created worker, UniqueJobWorker, is not being found during runtime through redis or through a simple test?
Thanks ahead of time!
When you use Sidekiq outside of Rails, you need to use the -r option to tell it how to load your workers. So (assuming that your worker is in a sub-directory called workers):
% sidekiq -r ./workers/unique_job_worker.rb
If you have multiple workers, an option is to create a loader file to ensure everything is loaded.
load_workers.rb
($LOAD_PATH << 'workers').uniq!
require 'unique_job_worker'
require 'other_worker'
...
Then require the loader file on the command line:
% sidekiq -r ./load_workers.rb
I had the same issue, and it ended up being a Redis namespace issue:
https://github.com/mperham/sidekiq/issues/2834#issuecomment-184800981
Adding that fixed it for me:
config.redis = {
  url: ENV['REDIS_URL'],
  namespace: "some_namespace_different_for_each_app"
}
You also need the redis-namespace gem BTW
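For completeness, the namespace has to be set on both the server and the client side; a sketch of what such an initializer might look like (the namespace string and the file path are placeholders):

```ruby
# config/initializers/sidekiq.rb (sketch; needs the redis-namespace gem)
require 'sidekiq'

redis_config = {
  url: ENV['REDIS_URL'],
  namespace: 'some_namespace_different_for_each_app'
}

Sidekiq.configure_server { |config| config.redis = redis_config }
Sidekiq.configure_client { |config| config.redis = redis_config }
```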

Dry::Web::Container yielding different objects with multiple calls to resolve

I'm trying write a test to assert that all defined operations are called on a successful run. I have the operations for a given process defined in a list and resolve them from a container, like so:
class ProcessController
  def call(input)
    operations.each { |o| container[o].(input) }
  end

  def operations
    ['operation1', 'operation2']
  end

  def container
    My::Container # This is a Dry::Web::Container
  end
end
Then I test it as follows:
RSpec.describe ProcessController do
  let(:container) { My::Container }

  it 'executes all operations' do
    subject.operations.each do |op|
      expect(container[op]).to receive(:call).and_call_original
    end

    expect(subject.(input)).to be_success
  end
end
This fails because calling container[operation_name] from inside ProcessController and from inside the test yield different instances of the operations. I can verify it by comparing the object ids. Other than that, I know the code is working correctly and all operations are being called.
The container is configured to auto register these operations and has been finalized before the test begins to run.
How do I make resolving the same key return the same item?
TL;DR - https://dry-rb.org/gems/dry-system/test-mode/
Hi, to get the behaviour you're asking for, you'd need to use the memoize option when registering items with your container.
Note that Dry::Web::Container inherits Dry::System::Container, which includes Dry::Container::Mixin, so while the following example is using dry-container, it's still applicable:
require 'bundler/inline'

gemfile(true) do
  source 'https://rubygems.org'
  gem 'dry-container'
end

class MyItem; end

class MyContainer
  extend Dry::Container::Mixin

  register(:item) { MyItem.new }
  register(:memoized_item, memoize: true) { MyItem.new }
end
MyContainer[:item].object_id
# => 47171345299860
MyContainer[:item].object_id
# => 47171345290240
MyContainer[:memoized_item].object_id
# => 47171345277260
MyContainer[:memoized_item].object_id
# => 47171345277260
However, to do this from dry-web, you'd need to either memoize all objects auto-registered under the same path, or add the # auto_register: false magic comment to the top of the files that define the dependencies and boot them manually.
Memoizing could cause concurrency issues depending on which app server you're using and whether or not your objects are mutated during the request lifecycle, hence the design of dry-container to not memoize by default.
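To see the identity semantics in isolation, here is a minimal hand-rolled container sketch in plain Ruby (MiniContainer is hypothetical, not part of dry-container): factory registrations build a new object on every resolve, while memoized ones cache the first instance.

```ruby
# A toy container: register blocks as factories, optionally memoized.
class MiniContainer
  def initialize
    @factories = {}
    @memoize = {}
    @cache = {}
  end

  def register(key, memoize: false, &factory)
    @factories[key] = factory
    @memoize[key] = memoize
  end

  def [](key)
    return @factories[key].call unless @memoize[key]

    @cache[key] ||= @factories[key].call
  end
end

container = MiniContainer.new
container.register(:item) { Object.new }
container.register(:memoized_item, memoize: true) { Object.new }

fresh_ids  = [container[:item].object_id, container[:item].object_id]
cached_ids = [container[:memoized_item].object_id, container[:memoized_item].object_id]
```

Resolving :item twice yields two distinct object_ids, while :memoized_item yields the same one both times, which is exactly the difference the failing expectation above trips over.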
Another, arguably better option, is to use stubs:
# Extending above code
require 'dry/container/stub'
MyContainer.enable_stubs!
MyContainer.stub(:item, 'Some string')
MyContainer[:item]
# => "Some string"
Side note:
dry-system provides an injector so that you don't need to call the container manually in your objects, so your process controller would become something like:
class ProcessController
  include My::Importer['operation1', 'operation2']

  def call(input)
    [operation1, operation2].each do |operation|
      operation.(input)
    end
  end
end
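The injector mixin essentially generates reader methods that resolve from the container. A rough plain-Ruby sketch of the idea (Injector and REGISTRY here are hypothetical stand-ins for dry-auto_inject and the real container):

```ruby
# Stand-in registry mapping keys to callable dependencies.
REGISTRY = {
  'operation1' => ->(input) { input + 1 },
  'operation2' => ->(input) { input * 2 }
}

# Builds a module with one reader method per key, each resolving
# its dependency from the registry at call time.
def Injector(*keys)
  Module.new do
    keys.each do |key|
      define_method(key) { REGISTRY[key] }
    end
  end
end

class ProcessController
  include Injector('operation1', 'operation2')

  def call(input)
    [operation1, operation2].map { |operation| operation.(input) }
  end
end

result = ProcessController.new.call(3)  # => [4, 6]
```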

Transactions in Ruby Sequel module: how to get DB object?

I'm working in a Sinatra application using Sequel.
I want to make a transaction, according to the manual I have to use the DB object, how can I get this object from any part of my code?
You can define it in your base app.rb (or equivalent) or include a separate file where you configure the DB object if you wish.
For example, in one of my Sinatra apps, I have an app.rb that ends like this:

class App < Sinatra::Application
  # lots of stuff here...
end

require_relative 'models/init'
In my models/init.rb I configure DB:

require 'sequel'

conf = YAML.load(File.open(File.expand_path('./config/dbconn.yml')))
env = ENV['RACK_ENV'] || 'development'

DB = Sequel.connect(host: conf['database'][env]['host'],
                    port: conf['database'][env]['port'],
                    database: conf['database'][env]['schema'],
                    username: conf['database'][env]['username'],
                    password: conf['database'][env]['password'],
                    adapter: conf['database'][env]['adapter'],
                    encoding: conf['database'][env]['encoding'])

raise "Unable to connect to #{conf['database'][env]['host']}" unless DB.test_connection
...
That's one way. Hope it helps.
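For reference, the dbconn.yml that this init code reads would be shaped roughly like the following (every value here is a placeholder, not from the original post):

```yaml
database:
  development:
    host: localhost
    port: 5432
    schema: myapp_development
    username: myuser
    password: secret
    adapter: postgres
    encoding: utf8
```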
You mention that you want to reference it from any part of your code; however, I've found that transactions are best encapsulated within the models, and from there it's relatively easy:
class X < Sequel::Model
  def self.y
    db.transaction do
      ...
    end
  end

  def z
    db.transaction do
      ...
    end
  end
end

How to test em-mongo + Goliath?

The app below saves some data to the db, and I want to test that it saves properly.
require 'goliath'

class App < Goliath::API
  def response(env)
    db = EM::Mongo::Connection.new('localhost').db('hello')
    db.collection('coll').insert('identifier' => 1)
    [204, {}, {}]
  end
end
require 'goliath/test_helper'

Goliath.env = :test

describe App do
  include Goliath::TestHelper

  it do
    with_api(described_class) do
      get_request do |req|
        db = EM::Mongo::Connection.new('localhost').db('hello')
        db.collection('coll').first.callback do |rec|
          rec['identifier'].should == 100
        end
      end
    end
  end
end
The above spec passes since reactor ends before callback returns. I thought about manually starting a reactor like:
EM.run do
  db = EM::Mongo::Connection.new('localhost').db('hello')
  db.collection('coll').first.callback do |rec|
    rec['identifier'].should == 100
    EM.stop
  end
end
Though I'm not sure if starting the reactor for every spec is good practice. Help please?
The problem is that when the get_request is setup we add a callback on the request that stops the event loop. So, as soon as your block finishes (which will be before the connection is even created), it will stop the reactor.
I'm not sure the best solution, but a crappy one would be to override:
def hookup_request_callbacks(req, errback, &blk)
  req.callback(&blk)
  req.callback { stop }
  req.errback(&errback) if errback
  req.errback { stop }
end
in your test class after you include Goliath::TestHelper. Then, I think, you should be able to write your own that just has something like:
def hookup_request_callbacks(req, errback, &blk)
  req.callback(&blk)
  req.errback(&errback) if errback
  req.errback { stop }
end
You'll just have to make sure you call stop in your callback from Mongo.
I haven't actually tested this, so let me know if something doesn't work and I can dig in further.
#dj2's solution works great, but I decided to use the mongo gem in my specs instead of em-mongo. Since mongo blocks, I don't have to worry about Goliath stopping the reactor before the database returns results.

Including external Ruby module inside a module?

I'm making a Rack framework for Ruby that defines an App::Router module like the following:
module App
  Router = HttpRouter.new do
    get('/') { |env| erb('home') }
  end
end
Notice the erb() method I wish to use in my router. The problem is getting methods from an external source (my framework) into the module so they're available inside the do block.
Is there a way to pull modules from an external source into a module in another file?
Thanks.
Is erb a method you define somewhere? Try something like this:
require 'path/to/module/with/erb_method'

module App
  include YourModule

  Router = HttpRouter.new do
    get('/') { |env| erb('home') }
  end
end
module App
  def foo
    "bar"
  end
end

module Route
  include App
end

include Route

foo
# => "bar"
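One reason this pattern works inside a DSL block is that many router DSLs evaluate the block with instance_eval, so any module mixed into the router is visible there. A minimal sketch (TinyRouter and ViewHelpers are hypothetical, not HttpRouter's actual implementation):

```ruby
# Helper methods we want available inside routing blocks.
module ViewHelpers
  def erb(template)
    "rendered #{template}" # stand-in for real template rendering
  end
end

class TinyRouter
  include ViewHelpers

  def initialize(&routes)
    @handlers = {}
    instance_eval(&routes) # the block runs with the router as self
  end

  def get(path, &handler)
    @handlers[path] = handler
  end

  def dispatch(path)
    instance_exec(&@handlers[path]) # handlers also see the helpers
  end
end

router = TinyRouter.new do
  get('/') { erb('home') }
end

result = router.dispatch('/')  # => "rendered home"
```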