How does Laravel handle multiple command tasks scheduled at the same time? - laravel

This is something I've been wondering about for a while: how does Laravel's task scheduler handle multiple tasks scheduled at the same time?
Let's say I had 4 different commands, each set to execute at 1:15 AM:
$schedule->command('emails:send')->daily()->at('1:15');
$schedule->command('cache:maintenance')->daily()->at('1:15');
$schedule->command('users:remove-deleted')->daily()->at('1:15');
$schedule->command('users:notification-reminders')->daily()->at('1:15');
Also, for argument's sake, let's say each command took 2-5 minutes to complete. The cron entry runs schedule:run every minute, so what would happen at 1:16 AM if the first command hasn't completed yet? Does Laravel place the remaining commands into a queue automatically, or would I have to explicitly create a queue worker for each command?

If you have a queue set up with multiple workers, jobs can run simultaneously; most production servers can easily handle many tasks at the same time. Otherwise, commands scheduled at the same time are run sequentially, in the order they are defined.
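In short: without extra configuration the four commands run one after another, in definition order. A minimal sketch of making them start together instead, using the scheduler's runInBackground method (withoutOverlapping additionally prevents a run from stacking on a still-running previous one):

```php
// app/Console/Kernel.php — a sketch. runInBackground() lets the commands
// start together at 1:15 instead of queueing behind each other, and
// withoutOverlapping() stops a new run from starting while a previous
// run of the same command is still going.
$schedule->command('emails:send')->daily()->at('1:15')
         ->runInBackground()->withoutOverlapping();
$schedule->command('cache:maintenance')->daily()->at('1:15')
         ->runInBackground()->withoutOverlapping();
```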

Related

Laravel queue nested processes

I want to use a queue for file uploads. Users can upload files. Each file will have around 500 rows. Now I want to implement this logic:
Maximum of 5 files can be processed at the same time. The remaining files should be in the queue.
Each file should have 5 processes, so 5 rows will be inserted into the database at the same time. In short, there will be a maximum of 25
processes (5 processes for each of the 5 files).
Now I am adding all files to one queue, and the files are processed one by one, first-in first-out: the 2nd file has to wait for the 1st to finish.
How can I implement this? Or do you have any other suggestions?
What exactly is the difference between processing a file, and inserting rows into the DB?
If you want to run multiple workers for the same queue, you can simply start more workers using php artisan queue:work and additionally use flags to specify the queues --queue=process-files for example. See the documentation.
In a production environment, consider configuring Supervisor to run a specific number of workers on a queue using the numprocs directive.
Do I understand correctly you want to run 25 queue workers per user? That does not seem right. Instead, you should consider creating queues for fast/slow jobs.
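The Supervisor setup mentioned above might look like the following sketch; the program name, paths, and queue name are placeholders:

```ini
; /etc/supervisor/conf.d/laravel-worker.conf — a sketch; adjust paths and names.
; numprocs starts 5 worker processes on the same queue, giving 5 files
; processed in parallel while the rest wait in the queue.
[program:laravel-worker]
command=php /var/www/app/artisan queue:work --queue=process-files --sleep=3 --tries=3
process_name=%(program_name)s_%(process_num)02d
numprocs=5
autostart=true
autorestart=true
user=www-data
```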

How do I configure queue workers, connections and the rate limiter to avoid jobs failing

My project consumes several 3rd-party APIs that enforce request limiting. My project calls these APIs through Laravel jobs, and I am using spatie/laravel-rate-limited-job-middleware for rate limiting.
Once a project is submitted, around 60 jobs are dispatched on average. These jobs need to be executed at 1 job/minute.
There is one supervisord program running 2 processes of the default queue with --tries=3.
Also, in config/queue.php for redis I am using 'retry_after' => (60 * 15) to avoid retrying while a job is executing.
My current rate limiter middleware is set up this way:
return (new RateLimited())
->allow(1)
->everySeconds(60)
->releaseAfterBackoff($this->attempts());
What happens is that 3 jobs get processed in 3 minutes, but after that all the jobs fail.
What I can work out is that all the jobs are requeued every minute, and once they cross the tries threshold (3) they are moved to failed_jobs.
I tried removing the --tries flag, but that didn't work. I also tried increasing it to --tries=20, but then the jobs fail after 20 minutes.
I don't want to hardcode the --tries flag, as in some situations more than 100 jobs can be dispatched.
I also want to increase the number of queue worker processes in Supervisor so that a few jobs can execute in parallel.
I understand this is an issue with configuring the retry/timeout flags, but I don't understand how to fix it. Need help...
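One way out of the --tries dead end is Laravel's time-based retry limit: if the job class defines a retryUntil method, the worker retries until that deadline instead of counting a fixed number of attempts, so releases from the rate limiter no longer exhaust the job. A sketch (the 3-hour window is an assumption, sized for 100+ jobs draining at 1 job/minute):

```php
// In the job class — a sketch. When retryUntil() is defined, Laravel uses
// this time limit rather than the --tries attempt count, which suits jobs
// that are repeatedly released back onto the queue by a rate limiter.
public function retryUntil(): \DateTimeInterface
{
    // 100+ jobs at 1 job/minute need roughly 2 hours; allow 3 to be safe.
    return now()->addHours(3);
}
```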

Will a low priority job in the artisan queue stop a high priority task from being executed if the low priority job takes a long time to complete?

I'm running the artisan queue worker with pm2 and was thinking of running two artisan workers: one that could process the high-priority queue, and another that would process low-priority, long jobs.
The issue is that pm2 does not allow running the same script as a separate instance.
I know that I can set priorities with --queue=live-high,live-low,default, but my problem is that if a low-priority job takes 5 minutes to complete, I need to be able to process high-priority jobs in the meantime.
From the Laravel Documentation:
Background Tasks
By default, multiple commands scheduled at the same time will execute sequentially. If you have long-running commands, this may cause subsequent commands to start much later than anticipated. If you would like to run commands in the background so that they may all run simultaneously, you may use the runInBackground method:
$schedule->command('analytics:report')
->daily()
->runInBackground();
https://laravel.com/docs/5.7/scheduling#background-tasks
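Regarding the pm2 limitation in the question: pm2 can launch the same artisan script twice as long as each process gets a distinct name, so a dedicated high-priority worker keeps draining its queue while a low-priority worker grinds through the long jobs. A sketch (process names and paths are placeholders):

```shell
# Two independent workers under pm2 — the high-priority worker never waits
# on the low-priority one. --name makes each a separate pm2 process.
pm2 start artisan --name worker-high --interpreter php -- queue:work --queue=live-high
pm2 start artisan --name worker-low  --interpreter php -- queue:work --queue=live-low,default
```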

How can I associate a queued job with a particular user in Laravel?

I've built a system based on Laravel where users are able to begin a "task" which repeats a number of times, with a delay between each repetition. I've accomplished this by queueing a job with an amount argument, which then recursively queues an additional job until the count is up.
For example, I start my task with 3 repetitions:
A job is queued with an amount argument of 3. It is run, the amount is decremented to 2, and the same job is queued again with a delay of 5 seconds.
When the job runs again, the process repeats with an amount of 1.
The last job executes, and now that the amount has reached 0, it is not queued again and the tasks have been completed.
This is working as expected, but I need to know whether a user currently has any tasks being processed. I need to be able to do the following:
Check if a particular queue has any jobs started by a particular user.
Check the value that was set for amount on that job.
I'm using the database driver for a queue named tasks. Is there any existing method to accomplish my goals here?
Thanks!
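For what it's worth, with the database driver the pending jobs are ordinary rows in the jobs table, so a crude check is possible. This sketch assumes the job exposes a public $user_id property, in which case the serialized job object inside the payload JSON contains a matchable "user_id" fragment:

```php
use Illuminate\Support\Facades\DB;

// A sketch: look for a pending job on the `tasks` queue whose serialized
// payload mentions this user. Fragile (it pattern-matches serialized PHP),
// but it needs no extra bookkeeping tables.
$hasPendingTask = DB::table('jobs')
    ->where('queue', 'tasks')
    ->where('payload', 'like', '%"user_id";i:' . (int) $userId . ';%')
    ->exists();
```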
You shouldn't be using delay to queue repetitions of the same job over and over. That functionality is meant for something like retrying a failed network request. Keeping jobs in the queue for hours at a time can lead to memory issues with your queues if the count gets too high.
I would suggest you use the php artisan schedule:run functionality to run a command every 1-5 minutes that checks the database to see whether it is time to run a user's job. If so, kick off that job and add a status flag to the user table (or whatever table you want to use to keep track of these things). When finished, mark that same row as completed and wait for the next cron run to do it again.
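The scheduler-based approach suggested above could be sketched like this; the user_tasks table, its columns, and the RunUserTask job are illustrative names:

```php
// app/Console/Kernel.php — a sketch. Every minute, pick up due tasks from a
// tracking table, mark them running so they aren't picked up twice, and
// dispatch the real job.
$schedule->call(function () {
    $due = DB::table('user_tasks')
        ->where('status', 'pending')
        ->where('run_at', '<=', now())
        ->get();

    foreach ($due as $task) {
        DB::table('user_tasks')->where('id', $task->id)->update(['status' => 'running']);
        RunUserTask::dispatch($task); // hypothetical job class
    }
})->everyMinute();
```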

Laravel Queues for multi user environment

I am using Laravel 5.1, and I have a task that takes around 2 minutes to process; specifically, this task generates a report...
Now, it is obvious that I can't make the user wait 2 minutes on the same page where I took the user's input; instead, I should process this task in the background and notify the user later about its completion...
To achieve this, Laravel provides queues that run tasks in the background (if I haven't misunderstood). Now, for a multi-user environment, i.e. if more than one user requests report generation (say there are 4 users), does the name "queues" mean the tasks will be performed one after the other (i.e. the 4th user's report will only be generated once the 3rd user's report is done)?
If queues complete their tasks one after another, is there any way to have tasks processed in the background immediately on the user's request, with the user notified later when their task is complete?
Queue-based architecture is a little more complicated than that. The queue provides you an interface to different messaging implementations such as RabbitMQ or Beanstalkd.
At any point in your code you can send a message to the queue, which in this context is termed a job. Your queue will then hold multiple jobs, which leave it in FIFO order.
As for your questions: there are workers that listen to the queue, pick up a job, and execute it. It's up to you how many workers you want. If you have one worker, your tasks will be executed one after another; the more workers, the more parallel processes.
Worker processes are started with Laravel's command-line interface, Artisan. Each process is one worker. You can start multiple workers with Supervisor.
Since you know for sure that you are going to send the notification to the user after around 2 minutes, I suggest using a cron job to check every 2 minutes whether there are any reports to generate and, if there are, send the notification to the user. That check is a single simple query, so you don't need to worry much about performance.
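Putting the worker-based answer together, a minimal sketch on a current Laravel version (class names are illustrative; note that Laravel 5.1 predates the notification system, so there you would send an email instead):

```php
use Illuminate\Bus\Queueable;
use Illuminate\Contracts\Queue\ShouldQueue;
use Illuminate\Foundation\Bus\Dispatchable;
use Illuminate\Queue\InteractsWithQueue;

// A sketch: the controller dispatches this job and returns immediately; a
// worker started with `php artisan queue:work` runs it in the background.
class GenerateReport implements ShouldQueue
{
    use Dispatchable, InteractsWithQueue, Queueable;

    public function __construct(public User $user) {}

    public function handle(): void
    {
        // ... the ~2 minute report generation work ...
        $this->user->notify(new ReportReady()); // hypothetical notification
    }
}

// In the controller: queue the work, respond to the user right away.
GenerateReport::dispatch($user);
```

With 4 users and one worker the reports are generated one after another; starting more workers (e.g. via Supervisor) lets them run in parallel.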
