Laravel run the queue job every second - laravel

I have created a queue job which needs to run every second. How can I do that? I created the job using an artisan command, but it does not run every second. I think I need to reconfigure some Supervisor config files.

The Laravel docs have an example of exactly that; see https://laravel.com/docs/5.6/queues#supervisor-configuration
The default example is:
[program:laravel-worker]
process_name=%(program_name)s_%(process_num)02d
command=php /var/www/html/yourproject/artisan queue:work sqs --sleep=3 --tries=3
autostart=true
autorestart=true
user=forge
numprocs=8
redirect_stderr=true
stdout_logfile=/var/www/html/yourproject/storage/logs/worker.log
Note that you need a queue connection set in config/queue.php; then, in the Supervisor command artisan queue:work, you can specify the connection. The example above uses sqs, but you can configure other connections such as redis.

I'm successfully using Spatie's Short Schedule package for this.

You may use crontab for your queueable job, but cron only allows a minimum interval of one minute. Use crontab -e to set your schedule (https://crontab.guru/ can help); for example, adding */2 * * * * php /var/www/html/your-project-folder/artisan queue:work >> /dev/null 2>&1 runs the worker every 2 minutes.
Alternatively, you could write a shell script with an infinite loop that runs your task and then sleeps for one second.
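Such a wrapper can be sketched like this (the artisan path is an assumption; the demo caps at five iterations so it terminates, where a real script would loop forever):

```shell
#!/bin/sh
# Sketch: poll the queue once per second. In production you would
# loop forever; here we cap at 5 iterations so the demo terminates.
i=0
while [ "$i" -lt 5 ]; do
    # php /var/www/html/yourproject/artisan queue:work --once   # real call (path assumed)
    echo "tick $i"                                              # stand-in for the demo
    i=$((i + 1))
    sleep 1
done
echo "done after $i ticks"
```

The commented-out queue:work --once call processes a single job per iteration, so the loop's sleep controls the polling rate.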

Related

Running Artisan Horizon on Shared Hosting

I tried to create a cronjob on shared hosting with artisan horizon like this:
/usr/local/bin/ea-php74 /home/example/example.com/artisan schedule:run 1>> /dev/null 2>&1
/usr/local/bin/ea-php74 /home/example/example.com/artisan horizon>> /dev/null 2>&1
but after a few hours our server goes down. Any solutions for us?
Horizon is not supposed to run from a cronjob: every time cron triggers that line, a new Horizon process is started, which is probably why your server is going down.
The right solution is to set up Horizon using Supervisor: https://laravel.com/docs/8.x/horizon#supervisors
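For reference, the Supervisor config for Horizon follows the same shape as the queue-worker one above; a sketch for this host (paths from the question, user name assumed):

```ini
[program:horizon]
process_name=%(program_name)s
command=/usr/local/bin/ea-php74 /home/example/example.com/artisan horizon
autostart=true
autorestart=true
user=example
redirect_stderr=true
stdout_logfile=/home/example/example.com/storage/logs/horizon.log
stopwaitsecs=3600
```

This way exactly one Horizon process runs, and Supervisor restarts it if it dies.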

supervisor returns error too many arguments, expected arguments "command"

I want to run the command php artisan schedule:run >> /dev/null 2>&1 using Supervisor, but it returns the error too many arguments, expected arguments "command".
My /etc/supervisord.d/conf.d/job-runner.conf file contents:
[program:job-runner]
command=php /home/mysite/public_html/artisan schedule:run >> /dev/null 2>&1
autostart=true
autorestart=true
user=apache
redirect_stderr=true
stdout_logfile=/home/mysite/public_html/storage/logs/job-runner.log
[supervisord]
How can I fix this?
You should not use Supervisor for this: Supervisor is meant to manage long-running processes, not to execute one-off scripts.
The command will run, the script will execute and exit, and Supervisor will then auto-restart (repeat) it at an uncontrolled rate (as fast as the hardware allows), which can cause runaway CPU and memory consumption.
You should use a cron job, as specified in the docs, to execute scheduled tasks at a controlled rate.
https://laravel.com/docs/5.7/scheduling#introduction

Bash to start and kill process on Ubuntu in a given period

I have this situation: a PHP script running in an Ubuntu terminal (xfce4-terminal) as a console process (the PHP code contains a loop that does some processing).
The problem: every two days this process is killed due to memory overuse.
What I need: a bash script that starts the process and, every 48 hours, kills it and starts it again.
The optimal solution is fixing the memory leak: trace the leaking function and post a new question with the relevant code if you need help.
For this specific case, though, you can use something like this:
while true
do
timeout 12h php myfile.php
done
This is an infinite loop that starts your command and kills it after 12 hours (or any other duration you want: 30m, 1d, etc.).
A more stable solution is creating a systemd service or deploying your script with a process manager like Supervisor or Monit.
Supervisor has a config parameter autorestart; if you set it to true, Supervisor restarts your script every time it crashes, which is a stable, production-ready solution.
A sample Supervisor config from this post:
[program:are_we_there_yet]
command=php /var/www/areWeThereYet.php
numprocs=1
directory=/tmp
autostart=true
autorestart=true
startsecs=5
startretries=10
redirect_stderr=false
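The systemd alternative mentioned above can be sketched as a unit file (path from the sample config; user, PHP binary location, and service name are assumptions). RuntimeMaxSec mirrors the 48-hour kill-and-restart requirement directly:

```ini
# /etc/systemd/system/are-we-there-yet.service
[Unit]
Description=Long-running PHP worker
After=network.target

[Service]
ExecStart=/usr/bin/php /var/www/areWeThereYet.php
Restart=always
RestartSec=5
User=www-data
# Restart proactively every 48 hours, like the timeout-loop approach
RuntimeMaxSec=172800

[Install]
WantedBy=multi-user.target
```

Enable it with systemctl enable --now are-we-there-yet; systemd then handles both crash recovery and the periodic restart.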

How to run queue worker on shared hosting

My Laravel application has a queued event listener and I have also set up the cronjob to run schedule:run every minute.
But I don't know how I can run the php artisan queue:work command persistently in the background. I found this thread, where this was the most upvoted approach:
$schedule->command('queue:work --daemon')->everyMinute()->withoutOverlapping();
However, on a different thread some people complained that the above-mentioned command creates multiple queue workers.
How can I safely run a queue worker?
Since Laravel 5.7, there's a queue:work option that stops the worker when the queue is empty:
php artisan queue:work --stop-when-empty
As this is mostly just for emails or a few small jobs, I put it on a cronjob that runs every minute. This isn't really a solution for more than about 100 jobs per minute, I'd say, but it works for my emails. It runs for roughly 5 seconds each minute just to send the emails, depending on how many there are and how big each job is.
Steps
Create new command: php artisan make:command SendContactEmails
In SendContactEmails.php, change: protected $signature = 'emails:work';
In the handle() method, add:
return $this->call('queue:work', [
'--queue' => 'emails', // remove this if queue is default
'--stop-when-empty' => null,
]);
Schedule your command every minute:
protected function schedule(Schedule $schedule)
{
$schedule->command('emails:work')->everyMinute();
// you can add ->withoutOverlapping(); if you think it won't finish in 1 minute
}
Update your cronjobs:
* * * * * /usr/local/bin/php /home/username/project/artisan schedule:run > /dev/null 2>&1
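Assembled, steps 1–3 yield a command class along these lines (a sketch: the namespace and description are assumptions beyond what the steps specify):

```php
<?php

namespace App\Console\Commands;

use Illuminate\Console\Command;

class SendContactEmails extends Command
{
    // Step 2: custom signature so the scheduler can call `emails:work`
    protected $signature = 'emails:work';

    protected $description = 'Drain the emails queue, then exit';

    // Step 3: delegate to queue:work and stop once the queue is empty
    public function handle()
    {
        return $this->call('queue:work', [
            '--queue' => 'emails', // remove this if queue is default
            '--stop-when-empty' => null,
        ]);
    }
}
```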
Source
Processing All Queued Jobs & Then Exiting
The --stop-when-empty option may be used to instruct the worker to process all jobs and then exit gracefully. This option can be useful when working Laravel queues within a Docker container if you wish to shut down the container after the queue is empty:
php artisan queue:work --stop-when-empty
Are you using cPanel? You can set this up in the Scheduler or Cron Jobs menu and put the command there.
You can set up a scheduled task like this:
$schedule->command('queue:work --stop-when-empty')->everyMinute()->withoutOverlapping();

How to kill the laravel queue:listen --queue=notification?

For a cron job, I am using the following code in Laravel 5.1 and run the command every minute. But even after removing the cronjob from crontab, the Laravel code still executes.
$this->call('queue:listen', [
'--queue' => 'notification-emails','--timeout'=>'30'
]);
What could be the problem? How can I stop this queue listener?
You're probably looking for queue:work, which stops when no more jobs are left, whereas queue:listen persists.
If you want to kill the existing process, you have to do it manually; there is no Laravel command to kill all queue:listen processes.
Keep in mind that you will not find a process named artisan queue:listen; look for artisan schedule:run instead, because queue:listen, when called internally, does not create a separate process.
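For the manual kill, matching on the command line works; this sketch demonstrates the idea with a dummy process standing in for artisan schedule:run:

```shell
#!/bin/sh
# Start a dummy long-lived process (stand-in for the scheduled listener)
sleep 300 &
pid=$!

# Kill any process whose full command line matches the pattern;
# for the real case you would use: pkill -f 'artisan schedule:run'
pkill -f 'sleep 300'

# Reap the child; its exit status is nonzero because it was killed
wait "$pid" 2>/dev/null || true
echo "killed pid $pid"
```

Use ps aux | grep artisan first if you want to inspect the matching processes before killing them.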
