I am using Sidekiq's delay_until functionality like this:
email.check_for_response_jid = CheckForResponse.delay_until(email.check_for_response).perform_async(email.id)
Is there any way I can test that the job is created when called like this?
I am using Hangfire hosted by IIS with an app pool set to "AlwaysRunning", and the Autofac extension for DI. Currently, background jobs in Hangfire are executing sequentially. Both jobs are similar in nature and involve file I/O: the first job executes and starts generating the requisite file; the second job starts executing, then stops until the first job is complete, at which point it resumes.

I am not sure if this is an issue related to DI and lifetime scope, though I tend to think not, as I register everything with instance-per-dependency scope. I am using OWIN to bootstrap Hangfire, I am not passing any BackgroundJobServerOptions, and I am not applying any hints via attributes. I am using the default configuration for workers.

What would be causing the jobs to execute sequentially?

I send a POST request to Web API and add jobs to the queue with the following:

BackgroundJob.Enqueue<ExecutionWrapperContext>(c => c.ExecuteJob(job.SearchId, $"{request.User} : {request.SearchName}"));
Thanks In Advance
I was looking for that exact behavior recently and I managed to get it by using this attribute:
[DisableConcurrentExecution(<timeout>)]
Could it be that you have this attribute applied, either on the job or globally?
Is this what you were looking for?
var hangfireJobId = BackgroundJob.Enqueue<ExecutionWrapperContext>(x => x.ExecuteJob1(arguments));
hangfireJobId = BackgroundJob.ContinueWith<ExecutionWrapperContext>(hangfireJobId, x => x.ExecuteJob2(arguments));
This will execute the first part and, when it has finished, start the second part.
I am working on CakePHP 3.4 project.
I have to execute a command that scans through the files and directories of a particular directory.
This might take a long time depending on the size of the directory, so I want to run it in the background and show a "running" label in the view until it has completed successfully.
How can I run a Shell task in the background from a controller and update the database on completion?
I'm new to Shell tasks.
You're thinking along the right lines about running this in the background if it is a time-consuming task. You will need some form of queuing system that allows you to add jobs to a queue, which then get run in the background by processing the queue from a cronjob. Take a look at the Queue plugin for doing this.
You'll basically need to create a queue task containing the functionality that needs to run in the background, and then add a job to the queue that will run that task. The Queue plugin's documentation shows how to do this, and there are plenty of example queue tasks included with the plugin.
If you need to indicate the status of the queued job you could save the job's ID in a session and check if it is complete when loading a page.
You can dispatch a Shell task from the controller. If you want to run this in the background you could, for example, run this controller action via JavaScript/Ajax.
// at the top of your controller file:
use Cake\Console\ShellDispatcher;

// in the controller action -- this task may run looooong
set_time_limit(0);

$shell = new ShellDispatcher();
$output = $shell->run(['cake', 'bake', 'model', 'Products']);
if ($output === 0) {
    $this->Flash->success('Yep!');
} else {
    $this->Flash->error('Nope!');
}
But you could indeed have googled this at least. ;-)
EDIT: Forget this one; go for drmonkeyninja's answer.
I've just started using wercker and I'd like a job to run regularly (e.g. daily, hourly). I realize this may be an anti-pattern, but is it possible? My intent is not to keep the container running indefinitely, just that my workflow is executed on a particular interval.
You can use a call to the Wercker API to trigger a build for any project which is set up already in Wercker.
So maybe set up a cron job somewhere that uses curl to make the right API call?
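As a sketch of that cron-driven approach: the script below builds the trigger request using only Ruby's standard library. The API path, pipeline id, and token here are placeholder assumptions; take the real values from Wercker's API documentation and your account settings.

```ruby
# Sketch: trigger a Wercker build from a cron-driven script.
# Endpoint path, pipeline id, and token are placeholders -- check
# Wercker's API documentation for the real values.
require "net/http"
require "uri"
require "json"

WERCKER_TOKEN = ENV.fetch("WERCKER_TOKEN", "your-api-token")

def build_trigger_request(pipeline_id)
  uri = URI("https://app.wercker.com/api/v3/runs")   # assumed endpoint
  req = Net::HTTP::Post.new(uri)
  req["Authorization"] = "Bearer #{WERCKER_TOKEN}"
  req["Content-Type"]  = "application/json"
  req.body = JSON.generate("pipelineId" => pipeline_id)
  [uri, req]
end

uri, req = build_trigger_request("example-pipeline-id")
# To actually fire it from the script:
#   Net::HTTP.start(uri.host, uri.port, use_ssl: true) { |http| http.request(req) }
```

A crontab entry such as 0 6 * * * /usr/bin/ruby /path/to/trigger_wercker.rb would then kick off the workflow daily.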
I'm studying the possibility of using Resque to store rake tasks as jobs and execute them later across multiple workers. Is this possible? I've already read the documentation for Redis + Resque and searched for examples of someone doing this, without success.
The rake tasks that I would like to execute as jobs would fire Cucumber + Selenium to perform some web automation tests, which take some time to complete.
Thanks in Advance!
Rodrigo.
As Amar said, using invoke worked just fine.
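For anyone landing here later, here is a minimal sketch of that pattern. The task and queue names are illustrative, and the job class is just a plain Ruby class in the shape Resque expects (a @queue variable plus self.perform), so no Resque worker is actually spun up here; a real app would load its Rakefile instead of defining tasks inline.

```ruby
# Sketch: a Resque-style job whose perform method invokes a rake task.
require "rake"

$invoked = []

# Stub :environment so tasks that depend on it still run here;
# a Rails app would load its real Rakefile instead.
Rake::Task.define_task(:environment)

Rake::Task.define_task("tests:acceptance", [:suite] => :environment) do |_t, args|
  $invoked << args[:suite]
  puts "running acceptance suite: #{args[:suite]}"
end

class RakeTaskJob
  @queue = :rake_tasks   # Resque reads this to pick the queue

  def self.perform(task_name, *args)
    task = Rake::Task[task_name]
    task.reenable        # rake tasks only run once per process otherwise
    task.invoke(*args)
  end
end

# A worker picking up Resque.enqueue(RakeTaskJob, "tests:acceptance", "smoke")
# would end up calling:
RakeTaskJob.perform("tests:acceptance", "smoke")
```

The reenable call matters: without it, a long-lived worker process would silently skip the task the second time it is enqueued.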
I have an app running on Apache + Passenger, and I have an initializer that sets up rufus-scheduler and then schedules jobs.
It seems the initializer is executed multiple times after the app has started, which schedules duplicate jobs within rufus-scheduler.
I am not sure why the initializers are executed multiple times without a restart.
Initializers are not the right place to do this. Each initializer is executed once for every process your web server runs; e.g. if Apache starts 4 processes to accept connections to your Rails application, your initializer is executed 4 times.
A simple solution would be to move the scheduling into a rake task run as part of your deployment strategy, so it happens in exactly one process.
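If the scheduling has to stay inside the web app, another common guard is a non-blocking exclusive file lock, so that only the first server process actually schedules jobs. This is a sketch using only the standard library; the lock path and the commented-out scheduling calls are illustrative, and rufus-scheduler itself ships a lockfile option implementing the same idea.

```ruby
# Sketch: let only one server process own the scheduler, via a
# non-blocking exclusive file lock. Lock path is illustrative.
require "tmpdir"

LOCK_PATH = File.join(Dir.tmpdir, "my_app_scheduler.lock")

def acquire_scheduler_lock(path = LOCK_PATH)
  file = File.open(path, File::RDWR | File::CREAT, 0o644)
  if file.flock(File::LOCK_EX | File::LOCK_NB)
    file   # keep the handle open for the life of the process
  else
    file.close
    nil    # another process already owns the scheduler
  end
end

if (lock = acquire_scheduler_lock)
  puts "this process runs the scheduler"
  # scheduler = Rufus::Scheduler.new
  # scheduler.every("10m") { ... }
else
  puts "scheduler already running elsewhere; skipping"
end
```

The lock is released automatically when the owning process exits, so a crashed process never wedges the scheduler permanently.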