Running artisan queue:work with additional arguments - laravel

I am trying to run queued jobs, and pass additional parameters through the command line. My use case is this:
I have 4 queue:work processes running through supervisor. The jobs in my queue all require access to a proxy server, through which I can only have 4 processes running at any given time. When I start a queued job, I have to find a process number (1 through 4) that is not currently being used, then run my command through that process.
I have been using a database table to store the processes, with an in_use column that keeps track of whether each one is being used, but the problem I'm seeing is that when two queue:work commands run simultaneously, the same proxy process can be picked from the database for both.
What I want
php artisan queue:work --process=1
Then I want to somehow retrieve that argument inside the job, so I can run each of my 4 processes separately in supervisor.
As a workaround, I have created a custom artisan command which will take the argument, but I then lose the queue functionality. I don't want to have to develop a custom queue process.
Is there a way to pass this argument? Or, alternatively, is there a way that I could pop jobs off the queue from within my custom artisan command, and then run them manually rather than through queue:work?

The problem could be solved by using dedicated queues, so that each queue has a specific proxy process attached to it. The only thing left is to create a function/process to determine which queue the job should go to.
https://laravel.com/docs/5.1/queues#pushing-jobs-onto-the-queue
Check out the part: Specifying The Queue For A Job
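For illustration, a rough sketch of what that looks like in Laravel 5.1, inside your controller or wherever you dispatch from (the FetchThroughProxy job class, the pickFreeProxySlot() helper and $payload are placeholders, not something from your code):

// Decide which of the four proxy slots this job should use
// (placeholder helper, e.g. a simple round-robin or your existing in_use table behind a lock).
$slot = $this->pickFreeProxySlot(); // returns 1..4

// Push the job onto the queue dedicated to that proxy.
$this->dispatch((new FetchThroughProxy($payload))->onQueue('proxy_' . $slot));

Then each supervisor program runs one worker bound to one proxy, e.g. php artisan queue:work --queue=proxy_1, so no two jobs ever share the same proxy process at the same time.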

Related

Recently added json language file values are not updated in email blade

I send mail as a cron job with Laravel. When I try to use the last value I added to my resources/lang/de.json file in the mail blade template file (resources/views/mails/...blade.php), it gives output as if that value is not defined. However, if I use the same key in a blade file I created before, it works without any errors. In addition, the keys that I added to the same file (de.json) earlier work without errors in the same mail blade file.
Thinking it's some kind of cache situation, I researched and found that restarting the queue worker might fix the problem. However, even though I ran 'php artisan queue:restart' both locally and on the server over SSH, there was no improvement.
Do you have any ideas?
Since queue workers are long-lived processes, they will not notice changes to your code without being restarted. So, the simplest way to deploy an application using queue workers is to restart the workers during your deployment process. https://laravel.com/docs/9.x/queues#queue-workers-and-deployment
But php artisan queue:restart only instructs all queue workers to gracefully exit after they finish processing their current job, so that no existing jobs are lost. And I have seen a lot of reports of this command not actually restarting and redeploying the worker.
So, the simplest way:
stop the worker manually (Ctrl+C),
then start the worker again with php artisan queue:work.
This might help.

How to stop the queue command from being killed in Laravel?

I want the command php artisan queue:work to stay active and not get killed for a long time...
When we have queues and we run the server, if we only use the command php artisan queue:work, it can get killed for some reason and then our queues don't work anymore. What should I do in this case?
Your question is quite ambiguous, but I'll assume that you need the command to keep running while you still have access to that same command-line instance, without closing or stopping the process.
I would recommend using Screen for this, which effectively lets you have virtual terminals open within the one you already have.
Give the following article a read
https://www.digitalocean.com/community/tutorials/how-to-install-and-use-screen-on-an-ubuntu-cloud-server
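For example, a rough outline of the screen workflow (the session name is arbitrary):

screen -S queue-worker        # open a named virtual terminal
php artisan queue:work        # start the worker inside it
# detach with Ctrl+A then D; the worker keeps running in the background
screen -r queue-worker        # reattach later to check on it

Note that this only keeps the process alive while the server itself stays up; for automatic restarts after crashes you would still want a process monitor such as supervisord.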

Laravel 5.5 listen dynamically generated queues

My application requires dynamically generated queues with some prefix, like "process_user_1", "process_user_2", "process_user_n".
The main idea is to separate the execution of some jobs depending on the model ID.
To run a watcher I need to execute the command php artisan queue:work --queue process_user_1
I couldn't find a way to use a pattern like
php artisan queue:work --queue process_user_*
The only way I have found is to start them manually each time before a job is sent. But that's so dirty...
Maybe someone knows another way?
EXAMPLE:
I have 10 users. Each user has 100 jobs to process in the queue.
If I put them all on one queue like "process_user_job", users will wait a long time for their jobs to finish.
So I want to separate the queues to speed up returning results.
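For context, the dispatch side of the setup described above looks roughly like this in Laravel 5.5 (ProcessUserData and $payload are just placeholders):

// Each user's jobs go onto their own queue, e.g. "process_user_7".
dispatch((new ProcessUserData($payload))->onQueue('process_user_' . $user->id));

The open question is how to have workers pick those queues up without starting php artisan queue:work --queue process_user_N by hand for every user.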

Confusions about Laravel Queues

I am using Laravel Queues with IronMQ, but I am a little bit confused about how this process works.
I have set my default connection in queue.php as 'default' => 'iron' and have also set the iron settings in the same file.
Now I use
$this->dispatch(new createEvents($data, $user));
where the createEvents class is a job class created as explained in the Laravel tutorial. Now, when the following code is executed
$this->dispatch(new createEvents($data, $user));
it successfully creates a queue in my IronMQ account under my project.
Now here is where my confusion starts. I have queued some tasks onto that queue, but how do I run that queue? How do I run the tasks that are queued? Do I need to write some extra code for it, or do I need to do some settings for it? Please guide me.
You don't need to go to your server and run this command by hand; you need to have a process that keeps running and performs those jobs.
I would recommend "supervisord".
http://supervisord.org/
This program launches a script and keeps it running; even if the script fails, supervisord will relaunch it (up to a certain number of failures, of course).
After you install it, you should probably create a supervisor task file like this:
[program:queue]
command=php artisan queue:listen --tries=3 --env=your_environment
directory=/path/to/laravel
stdout_logfile=/path/to/laravel/app/storage/logs/supervisord.log
redirect_stderr=true
autostart=true
autorestart=true
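Once that file is saved (typically under /etc/supervisor/conf.d/ on Ubuntu/Debian, though the exact location depends on your setup), tell supervisord to load it and start the program:

supervisorctl reread
supervisorctl update
supervisorctl start queue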
You can run php artisan queue:listen and it will start processing all queued jobs,
or you can target a specific queue with php artisan queue:listen --queue=queue_name.
Don't forget to run php artisan queue:failed-table (and then php artisan migrate). This will create the failed_jobs table in your database.
So if anything goes wrong while a queued job runs, the failed job will be saved to the database.
If you want failed jobs to be inserted into the database, add the --tries option when running listen:
php artisan queue:listen connection-name --tries=3
To run the failed jobs again, use php artisan queue:retry all.
Hope I answered your question.
Once your job is in the queue, and according to your question it is, you have two simple options:
Run one or more queue listeners on the same/different servers (using supervisor is recommended in Laravel documentation, see sample configuration)
Run queue worker manually or automatically, on regular basis (crontab)
php artisan queue:work iron
This command will fetch one job from the queue and process it. You launch it again – it fetches one more, and so on.
If you don't do any such processing and your queue driver is not 'sync', your job will never see the light of day.
My advice – launch queue workers manually on your development/test machine, and use supervisor on production server.
If your project is small and it doesn't require great scalability, you may want to simply switch to 'sync' driver (jobs will be processed immediately). There is no need to make the infrastructure more complicated, unless there is real necessity!
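For that last option, the change is a one-liner in config/queue.php (app/config/queue.php on older versions); something along these lines:

// config/queue.php
'default' => 'sync', // jobs are executed immediately in the same request, no worker needed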

How to fire Laravel Queues with beanstalkd

I'm pretty new to the whole queued jobs thing in Laravel 4. I have some process-heavy tasks I need the site to run in the background after being fired by the user performing a particular action.
When I was doing local development for my site I was using this:
Queue::push('JobClass', array('somedata' => $dataToBeSent));
And I was using the local "sync" driver to do it. (The jobs would just fire immediately, impacting the user experience, but I assumed that when going into the production phase I could switch to beanstalkd and they would then run in the background.)
Which brings me to where I'm at now. I have beanstalkd set up, with the dependencies installed through composer and the beanstalkd process listening for new jobs. I installed a beanstalk admin interface and can see my jobs going into the queue, but I have no idea how to actually get them to run!
Any help would be appreciated, thanks!
This is actually a really badly documented feature in Laravel.
What you actually need to do is have JobClass.php in a folder that is auto-loaded; I use app/commands, but it can also go in app/controllers or app/models if you like. And this class needs to have a fire method that takes the $job and $data arguments.
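A bare-bones sketch of such a class (the class name, file location and payload key are just examples matching the Queue::push call above):

<?php
// app/commands/JobClass.php

class JobClass {

    // Laravel 4 calls fire() with the job instance and the data array you pushed.
    public function fire($job, $data)
    {
        // ... do the heavy lifting here using $data['somedata'] ...

        // Remove the job from the queue once it has finished successfully.
        $job->delete();
    }
}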
To run these, simply execute php artisan queue:listen --timeout=60 in your terminal, and it will keep emptying the queue until it's empty or it has been running for longer than 60 seconds. (Small note: the timeout is the time limit for starting a job, so it may end up running for 69 seconds if one job takes 10 seconds.)
If you only want to run 1 job (perfect for testing), run php artisan queue:work
There are tools like supervisord that make sure your job handlers keep running, but I recommend just making a cron task that starts every X minutes, based on how fast the data needs to be processed and how much data comes in.
Keep in mind that you need to use the full path to artisan:
php /some/path/to/artisan queue:work
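For example, a crontab entry (reusing the placeholder path above) that pops one job every five minutes might look like:

*/5 * * * * php /some/path/to/artisan queue:work >> /dev/null 2>&1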
