Elastic Beanstalk and Laravel queues

I'm implementing Laravel queues with the database driver, and when I start the process that listens for jobs, another process is started as well.
Basically I run php artisan queue:listen, and a php artisan queue:work process is started automatically.
So the second process here is generated automatically, and it also doesn't point to the folder where it should be running.

The listener:
php artisan queue:listen
starts a long-running process that will "run" (process) new jobs as they are pushed onto the queue. See docs.
The processor:
php artisan queue:work
will process new jobs as they are pushed onto the queue. See docs.
So basically queue:listen runs queue:work as and when new jobs are pushed.
P.S.: You don't need to worry about this, but it's good to know how it works. You can dig into the code if you need more information.
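As a rough sketch of the relationship (the behaviour described here is the documented default, not something stated in the original answer): queue:listen keeps a lightweight parent process alive and, for each incoming job, boots a fresh worker sub-process roughly equivalent to:
php artisan queue:work --once
Because every job runs in a freshly booted application, queue:listen picks up code changes automatically, at the cost of extra overhead per job. A plain php artisan queue:work keeps the application in memory, so it is faster but needs php artisan queue:restart after deployments.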

Related

How to stop a laravel SyncQueue

I've tried queue:clear and even tried removing all jobs, but when I add them again the former queue starts working again, as evidenced by the timely log entries. I'd just like to start fresh, but I couldn't find any way to actually stop the former queue.
You can use
php artisan queue:restart
This will stop all running queue workers so that you can start fresh with a new worker process.
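Note that queue:restart signals the workers through the cache, so a cache driver must be configured, and each worker exits only after finishing its current job; a process manager such as Supervisor is then expected to start fresh workers. A typical reset looks like this (a sketch, assuming Laravel 8+ where queue:clear is available):
php artisan queue:clear
php artisan queue:restart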

Laravel 7 - Stop Processing Jobs and Clear Queue

I have a production system on AWS and use Laravel Forge. There is a single default queue that is processing Jobs.
I've created a number of jobs and now wish to delete them (as they take many hours to complete and I realize my input data was bad). I created a new job with good data, but it won't be processed until all the others have finished.
How can I delete all jobs?
It was previously set up using the redis queue driver. I could not figure out how to delete jobs, so I switched the driver to database and restarted the server, thinking that this would at least get the jobs to stop processing. However, much to my dismay, they continue to be processed :-(
I even deleted the worker from the Forge UI and restarted the server: the jobs still process.
Why do jobs continue to be processed?
How can I stop them?
You can use:
php artisan queue:clear redis
It will clear all jobs from the default queue on the redis connection. If you put jobs on another queue, you should specify the queue name, for example:
php artisan queue:clear redis --queue=custom_queue_name
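As for why the jobs kept processing: long-running queue:work daemons keep the booted application, including the old queue connection, in memory, so changing the driver in .env has no effect until the workers themselves are restarted. A sketch of a full reset on the original redis connection (the queue name is just the example from above):
php artisan queue:clear redis --queue=custom_queue_name
php artisan queue:restart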

Laravel - Is there a way to run a queued job from another queued job on different queue?

Imagine:
1- Eating job
2- Drinking job
3- Eating Queue
4- Drinking Queue
I have an Eating job running on the Eating queue, and from this job I want to run a Drinking job, but on the Drinking queue.
Is it possible?
I just found what solves my problem.
Instead of using php artisan queue:work, I used php artisan queue:listen for the job that runs another job on a different queue.
It looks like queue:work only processes jobs on the queue it was started for and doesn't pick up the other queue, but when I used queue:listen it worked fine.
A note: this helped on a Windows machine, while on a Linux server it works normally with php artisan queue:work.
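Dispatching onto a second queue from inside a job works regardless of which command you run; the catch is that a worker must actually be watching that queue. A minimal sketch, using the hypothetical job names from the question (not code from the original post):

<?php

namespace App\Jobs;

use Illuminate\Bus\Queueable;
use Illuminate\Contracts\Queue\ShouldQueue;
use Illuminate\Foundation\Bus\Dispatchable;
use Illuminate\Queue\InteractsWithQueue;

class EatingJob implements ShouldQueue
{
    use Dispatchable, InteractsWithQueue, Queueable;

    public function handle(): void
    {
        // ... eating logic ...

        // Push the follow-up job onto the "drinking" queue.
        DrinkingJob::dispatch()->onQueue('drinking');
    }
}

Then start a worker that watches both queues, for example php artisan queue:work --queue=eating,drinking, or run a separate worker per queue.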

How to debug Redis queue in Laravel

I'm new to Laravel. I have implemented a queue with Redis, and Supervisor is installed to monitor it, but I can't figure out some things.
The Supervisor configuration is:
command=php <laravel path>/artisan queue:work --once
autostart=true
autorestart=true
user=www-data
numprocs=2
redirect_stderr=true
stdout_logfile=<laravel path>/worker.log
Questions:
Will any error produced by a job executed by the queue be stored in worker.log, or, depending on the error, will it be stored there or elsewhere?
How can I see the data of the job that is currently running?
How can I see the queue contents and check whether the queue is working?
How can I tell whether Supervisor is working?
Taylor built Laravel Horizon, available since Laravel 5.5. It is an absolute must if you have a job/queue-heavy application:
Laravel Horizon
While it takes just a little bit of configuration to get up and running, once you do, you'll have all the metrics and data that you need to monitor and inspect your jobs.
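If you go that route, a minimal setup sketch (these are the standard Horizon commands; Horizon requires the redis queue driver):
composer require laravel/horizon
php artisan horizon:install
php artisan horizon
The dashboard (served at /horizon by default) shows throughput, runtimes, failures and the payload of individual jobs, which covers most of the questions above. Failed job details also end up in the failed_jobs table if you have created it.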

Laravel queuing jobs with Redis

Do I have to run migrations to create the jobs and failed_jobs tables?
php artisan queue:table
php artisan queue:failed-table
php artisan migrate
The jobs table is used when your queue driver is database. (Since you're using Redis, you don't need it.)
The failed_jobs table is used when your queued jobs fail to run. It's good to have this table so that you can track jobs that failed.
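So with the redis driver only the failed-jobs migration is strictly useful. A sketch of the minimal setup, plus the standard commands for inspecting failures later:
php artisan queue:failed-table
php artisan migrate
php artisan queue:failed
php artisan queue:retry all
queue:failed lists the failed jobs with their IDs and exceptions, and queue:retry all pushes them back onto the queue.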
