How to control multiple daemon processes in Ruby? - ruby

How can I start many processes (all running the same script) with the Daemons gem, and how can I then stop an individual one as needed? Thanks!

Consider using bluepill or god for this task. Both use daemons internally, and both provide a nice Ruby way of defining your processes and monitoring them.
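If you want to see the underlying idea without any gem, "many copies of one script, stoppable individually" can be sketched with plain Process.spawn and Process.kill. This is only a sketch: the sleeping worker stands in for your real script, and the Daemons gem's :multiple option (one pid file per instance) plays a similar bookkeeping role in practice.

```ruby
require 'rbconfig'

# Start three copies of the same (stand-in) worker script.
pids = 3.times.map do
  Process.spawn(RbConfig.ruby, '-e', 'sleep 60')
end

# Stop just the second worker by its pid.
Process.kill('TERM', pids[1])
Process.wait(pids[1])

# The remaining workers are untouched; signal 0 only checks existence.
survivors = (pids - [pids[1]]).select do |pid|
  begin
    Process.kill(0, pid)
    true
  rescue Errno::ESRCH
    false
  end
end

puts "still running: #{survivors.size}"  # prints "still running: 2"
survivors.each { |pid| Process.kill('TERM', pid); Process.wait(pid) }
```

bluepill and god add the pieces this sketch lacks: pid-file management, restart-on-crash, and memory/CPU conditions.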

Related

runit call script with start/stop

I have some SysVinit scripts with start/stop actions that are used for remote server deployment. Since I am now using runit for other deployment purposes, I don't want to duplicate the scripts (for maintenance reasons). Is it possible for runit to invoke these scripts, or is there another approach? Thank you in advance.
It is unlikely that you can use these scripts unmodified with runit. Runit expects the service started by its run script to remain in the foreground and not exit.
A SysVInit script expects the opposite behavior. Because it does not perform any process monitoring (and the lack of process monitoring is presumably why you are switching to runit), SysVInit scripts expect services to run in the background, and will exit after starting the service.
These are two fundamentally incompatible models.
You could consider using systemd instead of runit, which provides good process monitoring while also being able to follow processes that fork.
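To make the incompatibility concrete, here is a Ruby sketch of both startup styles; the sleeping child stands in for the real service, and the --no-daemon flag in the comment is a placeholder, not a real option of any particular service.

```ruby
# SysVinit-style startup: fork the service, detach it, and return
# immediately. A supervisor watching this starter sees it exit at once
# and (wrongly) concludes the service died.
service_pid = fork do
  Process.setsid          # new session, detached from the terminal
  sleep 60                # stand-in for the long-running service
end
Process.detach(service_pid)
puts "starter exits; service lives on as pid #{service_pid}"

# runit-style startup would instead *become* the service, e.g.:
#   exec('/usr/local/bin/myservice', '--no-daemon')   # hypothetical flag
# so the run script's pid IS the service's pid, and the script never
# returns while the service is healthy.

Process.kill('TERM', service_pid)  # clean up the sketch's child
```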

Can I use Sidekiq for continuous processes?

I have several processes which currently run as rake tasks. Can I somehow use Sidekiq to execute a process in a continuous loop? Is that a best-practice with Sidekiq?
These processes, though they run in the background in a continuous loop in their respective rake tasks now, occasionally fail. Then I have to restart the rake task.
I am trying a couple of options, with help from the SO community. One is to figure out how to monitor the rake tasks with monit. But that means each process will have to have its own environment, adding to server load. Since I'm running in a virtualized environment, I want to eliminate that wherever possible.
The other option is just to leverage the Sidekiq option I already have. I use Sidekiq now for background processing, but it's always just one-offs. Is there some way I can have a continuous process in Sidekiq? And also be notified of failures and have the processes restart automatically?
The answer, per Sidekiq author Mike Perham, is to use a cron job for scheduled tasks like this. You can create a rake task that submits the job to Sidekiq to run in the background, then create a cron job to schedule it.
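A sketch of that wiring, with illustrative names throughout (SyncJob, the rake task name, the paths, and the schedule are all made up; this assumes the sidekiq gem and a Rails environment):

```ruby
# app/workers/sync_job.rb
require 'sidekiq'

class SyncJob
  include Sidekiq::Worker
  sidekiq_options retry: 3   # Sidekiq retries failures and records them in its UI

  def perform
    # ... the body of what the continuous rake loop used to do ...
  end
end

# lib/tasks/sync.rake
# task enqueue_sync: :environment do
#   SyncJob.perform_async
# end

# crontab entry: enqueue every five minutes instead of looping forever
# */5 * * * * cd /srv/app && bundle exec rake enqueue_sync
```

This way a crashed run does not take anything down: the next cron tick simply enqueues a fresh job, and Sidekiq's retry/failure tracking covers the notification part of the question.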
I don't know why you chose Sidekiq; is this project-specific? Previously I faced the same problem, but I migrated to delayed_job and it satisfied my needs. If the ActiveRecord objects are transactional, use delayed_job; otherwise go for Resque, which is also a nice one.

delayed_job with multiple workers and upstart

I want to switch my single delayed_job process to multiple workers. I currently have an upstart job that runs rake and uses respawn with no 'expect fork' stanza, since rake does not fork. Now, to switch to multiple workers, I need to use 'expect' in my upstart configuration file. Any suggestions?
Out of the box, it appears that upstart's expect stanza does not support the behavior outlined in https://github.com/collectiveidea/delayed_job#running-jobs, as there are multiple workers that each fork twice to daemonize.
As outlined in this question about upstart, Can upstart expect/respawn be used on processes that fork more than twice?, you can use a bit of scripting to shepherd the processes yourself in the different hooks.
Another option would be to use upstart job instances (http://upstart.ubuntu.com/cookbook/#instance) to start multiple jobs that do not fork.
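For the instance route, a hypothetical upstart job could look like the following (the file name, paths, and user are illustrative, and the start on/stop on stanzas are omitted):

```
# /etc/init/delayed-job.conf (hypothetical)
description "delayed_job worker"

# Started per-instance, e.g.:
#   start delayed-job N=0
#   start delayed-job N=1
instance $N

respawn
# Run the worker in the foreground so no `expect` stanza is needed:
# upstart tracks the exec'd process directly.
exec su -s /bin/sh -c 'cd /srv/app && bundle exec rake jobs:work' deploy
```

Because rake jobs:work stays in the foreground, this sidesteps the double-fork problem entirely while still giving you N supervised workers.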
I'm not very clear on what you're asking, but if you want multiple delayed jobs to run in the background: when you start delayed_job using a command like rake jobs:work, you can specify the number of consumer threads you want to spawn. Hope that helps.

Way to know that a worker has finished a job/process in Resque

Is there any way to know whether a worker has finished a particular job/process in Resque?
Scenario: I have 5 workers doing some specific processing, and I want to know when the processing is done so I can proceed with the other part of the code.
I am using Ruby 1.8.7 and Rails 3.1.1 if that is of any help.
You can try Gearman if you need to know this. Otherwise, log information from within your job code, or use redis-cli to check whether the key for your job has a value; resque-web and resque-status can also help you.
You probably want to use something like resque-status: https://github.com/quirkey/resque-status.
If that doesn't quite meet your needs, you can always check the wiki plugin page for more possibilities: https://github.com/defunkt/resque/wiki/plugins
It is also not hard to store the fact of job completion as an extra field in your database.
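That last suggestion can be sketched without any gem at all: have the job write a completion flag that the coordinating code polls. Here a plain Hash stands in for Redis (with redis-rb you would use redis.set / redis.get, or a database column; the key name is made up):

```ruby
# Stand-in store; swap for Redis or a DB column in real code.
store = {}

# What the Resque job's perform method would do at the end of its work.
def perform_job(job_id, store)
  # ... the actual work ...
  store["job:#{job_id}:done"] = true
end

perform_job(42, store)

# The coordinating code checks for completion before moving on.
done = store.fetch('job:42:done', false)
puts done  # prints "true"
```

With 5 workers you would write one flag per job and proceed once all 5 keys are set.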

Something like PPerl for Ruby?

I've used PPerl for daemon-like processes.
This program turns ordinary perl scripts into long running daemons, making subsequent executions extremely fast. It forks several processes for each script, allowing many processes to call the script at once.
Does anyone know of something like this for Ruby? Right now I am planning on using a wrapper around curl to call a REST web service written in Sinatra running on JRuby. I'm hoping there is a simpler option.
Have you looked at using Nailgun? It sets up a background JVM process that your scripts execute in. That way you can use JRuby without incurring the JVM startup time you would normally get with each script run.
You mean like daemons?
Simple example of in-process daemonization:

require 'rubygems'
require 'daemons'

# Detach this process from the terminal and keep it running in the background.
Daemons.daemonize

loop do
  `touch /tmp/me`   # stand-in for the real periodic work
  sleep 1
end
Also, instead of using curl, have you looked at rest-client?
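A hypothetical call against your Sinatra service (the URL and path are made up), replacing the curl wrapper with one in-process HTTP call:

```ruby
require 'rest-client'   # gem install rest-client

# One HTTP call instead of shelling out to curl.
response = RestClient.get('http://localhost:4567/status')
puts response.code
puts response.body
```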