I'm torn on whether to schedule jobs or commands in the scheduler. I can't really find any in-depth data on why I would choose one over the other. Typically I've considered how long a given scheduled task will run, and if it's "long" I'll create a job, but I've recently switched a few jobs over to commands because I can run them manually.
Also, if I'm using commands in the scheduler and I'm using runInBackground() how does that differ from a job?
When you use runInBackground(), you're just sending the command to the shell background, like appending & to a command on the shell.
Jobs can be executed on queues, which means they can be retried, scaled, run through middleware, executed in batches and monitored with tools like Laravel Horizon.
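As a rough illustration of the difference, here is a scheduler sketch (the command name, job class and queue name are placeholders, not from the question):
// In app/Console/Kernel.php
protected function schedule(Schedule $schedule)
{
    // A command sent to the shell background: cron fires it and moves on.
    $schedule->command('report:generate')->daily()->runInBackground();

    // A job pushed onto a queue: picked up by queue:work, and can be retried,
    // batched and monitored with Horizon. (Placeholder job and queue names.)
    $schedule->job(new \App\Jobs\GenerateReport, 'reports')->daily();
}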
Tip: you can dispatch your jobs as commands by registering commands in routes/console.php that just dispatch the job, for example:
Artisan::command('my-job-command', fn () => dispatch(new MyJob()));
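That command can then be run manually with php artisan my-job-command, or scheduled like any other command, for example:
$schedule->command('my-job-command')->hourly();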
The commands in this file are registered automatically by this code in the Kernel:
protected function commands()
{
    $this->load(__DIR__ . '/Commands');

    require base_path('routes/console.php');
}
Related
I have job A which downloads XML and then dispatches another job B which creates data in the database. Job B is dispatched in a loop and there can be more than 10,000 items. I first tried to use the chain method, but the problem is that if the queue is processed in the wrong sequence it will not work. Then I tried to use batches from the new Laravel 8. Collecting all the jobs (more than 10,000) into one batch can cause an out-of-memory exception. The other problem is calling job C at the end: this job updates some credentials, which is why jobs A and B must have run successfully. Is there a good approach for this situation?
Laravel's job batching feature allows you to easily execute a batch of jobs and then perform some action when the batch of jobs has completed executing.
If you have an out-of-memory problem with job batching, you are doing something wrong. Since the queued jobs are executed one by one (if you have it configured that way), there should be no problem even with more than 100k records. So make sure you create one job per item, and you won't have problems with this.
Then, you could do something like this.
use App\Jobs\ProcessPodcast;
use App\Models\Podcast;
use Illuminate\Bus\Batch;
use Illuminate\Support\Facades\Bus;
use Throwable;

$jobs = [
    new ProcessPodcast(Podcast::find(1)),
    new ProcessPodcast(Podcast::find(2)),
    new ProcessPodcast(Podcast::find(3)),
    new ProcessPodcast(Podcast::find(4)),
    new ProcessPodcast(Podcast::find(5)),
    // ...and so on for all your items.
    // In practice this array should be generated by a foreach over your items.
];

Bus::batch($jobs)->then(function (Batch $batch) {
    // All jobs completed successfully...
    // Update some credentials...
})->catch(function (Batch $batch, Throwable $e) {
    // First batch job failure detected...
})->finally(function (Batch $batch) {
    // The batch has finished executing...
})->dispatch();
I need to run a job in a queue which takes a long time (around 2 hours). It checks the availability of a certain service. So instead of running one job for two hours, which constantly (every five minutes) makes an API request, I thought of using Laravel's scheduling for queued jobs. I could call the scheduler from anywhere with the Artisan helper:
Artisan::call('schedule:run', [
    'args' => $args
]);
This would dispatch a job. But I can't figure out how to pass the arguments ($arg1, $arg2, ..) that my job class requires from Kernel.php.
// Dispatch the job to the "heartbeats" queue...
$schedule->job(new Heartbeat($arg1, $arg2, ..), 'heartbeats')->everyFiveMinutes();
I tried to pass args in schedule method, but I suppose that's not the right way to do it.
Make an artisan command. For example:
php artisan make:command DispatchJobFromDynamicCalculatedArg
Then:
1. Calculate $args however you want inside this artisan command.
2. Dispatch the target job with the arguments it actually needs.
3. Schedule the artisan command in the Kernel.php file.
That's it. A sketch of such a command is shown below.
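Here is a minimal sketch of that command (the class name comes from the make:command call above; the Heartbeat job and the 'heartbeats' queue come from the question; the command signature and the way the arguments are calculated are placeholders):
<?php

namespace App\Console\Commands;

use App\Jobs\Heartbeat;
use Illuminate\Console\Command;

class DispatchJobFromDynamicCalculatedArg extends Command
{
    // Hypothetical command name; pick whatever fits your app.
    protected $signature = 'heartbeat:dispatch';

    protected $description = 'Calculate the arguments and dispatch the Heartbeat job';

    public function handle()
    {
        // Calculate $arg1, $arg2, ... however you need to (placeholder values).
        $arg1 = 'https://example.com/health';
        $arg2 = 30;

        Heartbeat::dispatch($arg1, $arg2)->onQueue('heartbeats');
    }
}
Then in Kernel.php: $schedule->command('heartbeat:dispatch')->everyFiveMinutes();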
I'm trying to get department details from an API which supports pagination, so I spawn one job per page, like the following:
/departments?id=1&page=1 -> job1
/departments?id=1&page=2 -> job2
How can I keep track of these jobs for a particular department, as I have to write the responses to a txt file?
The jobs are instantiated via a controller class like:
class ParseAllDeptsJob implements ShouldQueue
{
    public function handle()
    {
        foreach (Departments::all() as $dept) {
            ParseDeptJob::dispatch($dept);
        }
    }
}
You can chain jobs using withChain(). A chained job will not run if a job earlier in the chain fails.
From the documentation:
Job chaining allows you to specify a list of queued jobs that should be run in sequence. If one job in the sequence fails, the rest of the jobs will not be run. To execute a queued job chain, you may use the withChain method on any of your dispatchable jobs:
In your case, this is how you'd do it:
ParseAllDeptsJob::withChain([
    new SendEmailNotification
])->dispatch();
SendEmailNotification won't be dispatched if an error occurs while processing ParseAllDeptsJob.
I need to run a series of jobs in sequence in Laravel at a scheduled interval (weekly). The withChain method works perfectly for this:
firstJob::withChain([
    new secondJob,
    new thirdJob
]);
When trying to run the chain within the Scheduler:
$schedule->job(firstJob::withChain([
    new secondJob,
    new thirdJob
]))->weekly();
I get the following error:
In BoundMethod.php line 135:
Method Illuminate\Foundation\Bus\PendingDispatch::handle() does not exist
The output I get from the Scheduler in the cli is:
Running scheduled command: Illuminate\Foundation\Bus\PendingDispatch
So I understand that the job() method isn't actually being given the job itself, but the PendingDispatch returned by withChain() via the Dispatchable trait.
My question is how can I run chained Jobs within the Laravel Task Scheduler?
I fixed this by replacing $schedule->job() with $schedule->call(). The closure simply runs job::withChain(). I now have supervisord making sure the queue:work artisan command is running in the background, so the scheduler is only responsible for pushing the chain onto the queue at the allocated time.
Using $schedule->call() loses the ability to specify a queue name and other job-specific parameters.
To retain those, you should use $schedule->job(), which can take a chained job as follows:
$schedule->job((new firstJob())->chain([
    new secondJob(),
    new thirdJob()
]), 'queue-name')->everyFiveMinutes();
Late answer, but may help other people:
$schedule->call(function () {
    firstJob::withChain([
        new secondJob,
        new thirdJob
    ])->dispatch()->allOnQueue('queue_name');
})->weekly();
Is it possible to use dispatchShell from a Controller?
My mission is to start a shell job when the user has signed up.
I'm using CakePHP 2.0
If you can't mitigate the need to do this, as dogmatic suggests, then read on.
So you have a (potentially) long-running job you want to perform and you don't want the user to wait.
As the PHP code your user is executing happens during a request that has been started by Apache, any code that is executed will stall that request until its completion (unless you hit Apache's request timeout).
If the above isn't acceptable for your application, then you will need to trigger PHP outside the Apache request (i.e. from the command line).
Usability-wise, at this point it would make sense to notify your user that you are processing data in the background. Anything from a message telling them they can check back later to a spinning progress bar that polls your application over ajax to detect job completion.
The simplest approach is to have a cronjob that executes a PHP script (ie. CakePHP shell) on some interval (at minimum, this is once per minute). Here you can perform such tasks in the background.
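A rough sketch of such a shell in CakePHP 2 (the shell name, the User model and the processed flag are assumptions, not from the question):
// app/Console/Command/SignupProcessorShell.php
App::uses('AppShell', 'Console/Command');

class SignupProcessorShell extends AppShell {

    public $uses = array('User');

    public function main() {
        // Pick up any users whose post-signup work hasn't been done yet.
        $pending = $this->User->find('all', array(
            'conditions' => array('User.processed' => false),
        ));

        foreach ($pending as $user) {
            // ...do the long-running work for this user here...

            $this->User->id = $user['User']['id'];
            $this->User->saveField('processed', true);
            $this->out('Processed user ' . $user['User']['id']);
        }
    }
}
Cron would then run it on an interval, e.g. Console/cake signup_processor from the app directory once per minute.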
Some issues arise with background jobs, however. How do you know when they failed? How do you know when you need to retry? What if a job doesn't complete within the cron interval? Will a race condition occur?
The proper, but more complicated setup, would be to use a work/message queue system. They allow you to handle the above issues more gracefully, but generally require you to run a background daemon on a server to catch and handle any incoming jobs.
The way this works is, in your code (when a user registers) you insert a job into the queue. The queue daemon picks up the job instantly (it doesn't run on an interval so it's always waiting) and hands it to a worker process (a CakePHP shell for example). It's instant and - if you tell it - it knows if it worked, it knows if it failed, it can retry if you want and it doesn't accidentally handle the same job twice.
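For example, with Beanstalkd and the Pheanstalk PHP library (both mentioned below), inserting a job from your registration code might look roughly like this (a sketch: the tube name and payload are made up, and the constructor style depends on your Pheanstalk version):
// Somewhere in your registration logic, after the user has been saved.
// (Assumes the Pheanstalk library is already autoloaded, e.g. via Composer.)
$pheanstalk = new Pheanstalk\Pheanstalk('127.0.0.1');

// Put a small JSON payload on the "signups" tube; a worker process picks it up
// and does the heavy lifting outside the web request.
$pheanstalk->useTube('signups')->put(json_encode(array(
    'user_id' => $this->User->id,
)));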
There are a number of these available, such as Beanstalkd, dropr, Gearman, RabbitMQ, etc. There are also a number of CakePHP plugins (of varying age) that can help:
cakephp-queue (MySQL)
CakePHP-Queue-Plugin (MySQL)
CakeResque (Redis)
cakephp-gearman (Gearman)
and others.
I have had experience using CakePHP with both Beanstalkd (+ the PHP Pheanstalk library) and the CakePHP Queue plugin (first one above). I have to credit Beanstalkd (written in C) for being very lightweight, simple and fast. However, with regards to CakePHP development, I found the plugin faster to get up and running because:
The plugin comes with all the PHP code you need to get started. With Beanstalkd, you need to write more code (such as a PHP daemon that polls the queue looking for jobs)
The Beanstalkd server infrastructure becomes more complex (I had to install multiple instances of beanstalkd for dev/test/prod, and install supervisord to look after the processes).
Developing/testing is a bit easier since it's a self-contained CakePHP + MySQL solution. You simply need to type cake queue add user signup and cake queue runworker.
I was able to run a console shell from a controller/action; see the example below.
App::uses('ShellDispatcher', 'Console');

// ...

public function aco_sync() {
    $command = '-app ' . APP . ' AclExtras.AclExtras aco_sync -r adminControllers -p UserAdmin';
    $args = explode(' ', $command);
    $dispatcher = new ShellDispatcher($args, false);

    if ($dispatcher->dispatch()) {
        $this->Session->setFlash('OK');
    } else {
        $this->Session->setFlash('Error');
    }

    return $this->redirect(array('action' => 'index'));
}
In CakePHP 3 you can dispatch shells from the controller and do it almost the same way as in CakePHP 2. The documentation does not mention this.
// In your controller:
$shell = new \Cake\Console\Shell;
$shell->dispatchShell('shell_class param1 param2');

// Or, as the docs suggest:
$shell->dispatchShell('shell_class', 'param1', 'param2');
Beware of stdout & stderr in unit tests.
Dispatching a shell turns on stdout and stderr logging with ConsoleLogger, and will give you all the logging in your console if you have something like the code snippet above in code that you are testing from phpunit.
public function getEbayOrder() {
    $this->autoRender = false;

    App::import('Console/Command', 'AppShell');
    App::import('Console/Command', 'EbayShell');

    $job = new EbayShell();
    $job->dispatchMethod('get_orders');

    echo "RESPONSE";
}
Anything is possible, but why would you want to? If you find you need to do something in both a shell and the actual application, look at using libs.
You stick the code in the lib and then call it from both your app and the shell.
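A minimal sketch of that pattern in CakePHP 2 (the OrderImporter name and its method are made up for illustration):
// app/Lib/OrderImporter.php
class OrderImporter {

    public function run($userId) {
        // The shared logic lives here, usable from web requests and shells alike.
    }
}
Both the controller and the shell then load it with App::uses('OrderImporter', 'Lib'); and call it: $importer = new OrderImporter(); $importer->run($userId);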
If this is to initialize AclExtras, the best way is:
App::import('Console/Command', 'AppShell');
App::import('Plugin/AclExtras/Console/Command', 'AclExtrasShell');
$job = new AclExtrasShell();
$job->startup();
$job->dispatchMethod('aco_sync');
But avoid this unless you have no possibility of running the console script.