This question already has answers here:
How to keep Laravel Queue system running on server
(20 answers)
Closed 4 years ago.
I have implemented a Laravel queue. The thing is, I have to run the command php artisan queue:listen every time. Is there any way for the jobs to be executed automatically, without running any command manually?
Here's a one-liner to put into your crontab (let it run, say, every 5 minutes):
cd /path/to/your/project && jobs -l | grep `cat queue.pid` || { nohup /usr/bin/php artisan queue:listen & echo $! > queue.pid; }
Two variables here:
1. /path/to/your/project -- your Laravel project root; effectively, the folder where php artisan would work;
2. /usr/bin/php -- the path to the PHP executable on the server (which php).
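One caveat: jobs -l lists only the jobs of the current shell, and a cron-spawned shell has none, so the grep check above is not reliable under cron. A sketch of the same pid-file pattern using kill -0 instead, with sleep standing in for php artisan queue:listen so it can be tried anywhere:

```shell
# Pid-file pattern: restart the worker only if the saved PID is not alive.
# "sleep 60" is a stand-in for the long-running artisan command.
cd "$(mktemp -d)"
if ! kill -0 "$(cat queue.pid 2>/dev/null)" 2>/dev/null; then
  nohup sleep 60 >/dev/null 2>&1 &
  echo $! > queue.pid
fi
kill -0 "$(cat queue.pid)" && echo "worker running"   # prints "worker running"
```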
Yes. If you use Linux you can use, for example, Supervisor, which will run php artisan queue:listen (you need to add this command to the Supervisor configuration file) and make sure the command is running at all times.
I want to run the php artisan schedule:work command, but the issue is that when I close PuTTY it terminates the operation, while I need it to keep processing on the server.
My server is running Ubuntu 20.04.1 LTS.
Actually, the schedule:work command is meant for the local development environment.
To run your scheduler on a server, you should add a cron job like the following.
First, go to your terminal, SSH into your server, cd into your project, and run this command:
crontab -e
This will open the server's crontab file. Paste the line below into the file, save, and then exit.
* * * * * cd /path-to-your-project && php artisan schedule:run >> /dev/null 2>&1
Here we added one Cron job which executes every minute to start the Laravel scheduler.
Don't forget to replace /path-to-your-project with your project path.
Use "nohup php artisan schedule:work"
Go to your project path, e.g. cd /var/www/mywebsite
Run this command:
crontab -e
Choose an editor (nano, number 1) if it shows a list of editors.
At the bottom of the file, add this:
* * * * * php /var/www/mywebsite/artisan schedule:run >> /dev/null 2>&1
Save the file (Ctrl+S in recent nano, or Ctrl+O then Ctrl+X).
I'm using a Bitbucket Pipeline to deploy and run some artisan commands,
but there is a problem that gives me a headache: when an artisan command fails, Envoy shows the error/exception but does not continue to run the next Envoy task. It keeps showing me the exception until I kill the PHP process on the VPS (using the kill/pkill command).
Here is my Envoy task:
@task('start_check_log', ['on' => 'web'])
cd /home/deployer/mywork/laravel/
nohup bash -c "php artisan serve --env=dusk.local 2>&1 &" && sleep 2
curl -vk http://localhost:8000 &
php artisan check_log
sudo kill $(sudo lsof -t -i:8000)
php artisan cache:clear
php artisan config:clear
@endtask
php artisan check_log just checks the log file; I want to detect whether an error occurred, but when an error comes up, Envoy gets stuck on it.
I've resolved this problem; it was simply my mistake. I had to chain the commands (php artisan check_log && sleep 2) so that Envoy continues with the remaining tasks.
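For what it's worth, the underlying shell behavior can be sketched with false standing in for a failing artisan command (assuming Envoy runs each task as an ordinary shell script):

```shell
# "false" stands in for a failing artisan command. Chaining with "||"
# lets the script continue past the failure instead of stopping there.
steps=""
false || steps="recovered"
steps="$steps and continued"
echo "$steps"   # prints "recovered and continued"
```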
I am trying to create an nginx-laravel-mysql stack of Docker containers using [laradock][1], a free docker-compose plugin for Laravel.
To make it work, I have to run php artisan key:generate either from my local environment or from within a running container (both bad practices).
I tried adding command: /bin/bash -c "php artisan key:generate" to my docker-compose.yml file. This causes an exit; when I run docker-compose ps, I see laradock_workspace_1 /bin/bash -c nohup php art ... Exit 1. Adding nohup gives the same result. In fact, any command I run here causes an exit.
On to the Dockerfile. If I add RUN php artisan key:generate (or any variation of it), I get this:
ERROR: Service 'workspace' failed to build: The command '/bin/sh -c php artisan key:generate' returned a non-zero code: 1
If I run that same command as CMD or ENTRYPOINT, even with nohup, it runs and generates the key, but exits:
docker-compose ps says: laradock_workspace_1 /bin/sh -c nohup php artis... Exit 0
I can add restart: always to docker-compose.yml, but that begets a vicious cycle of key generation, exit, and restart.
Any ideas how to execute this command (or any command) from Dockerfile or docker-compose.yml without exiting?
EDIT: to answer @dnephin's question: php artisan key:generate adds a hash to the /.env file and adds a value to a PHP array. It just takes that command, no input. When I run docker-compose run workspace php artisan key:generate, I get Could not open input file: artisan.
Strangely, when I run docker-compose run workspace pwd, I see the correct path to my Laravel files (and I can see all of them if I run docker-compose exec workspace bash), but when I try to run docker-compose run workspace ls, I see nothing. It's like the files aren't there.
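For what it's worth, one common pattern is to generate the key at container start rather than at build time: with laradock the application code is typically mounted as a volume, so it does not exist during the image build, which is why RUN php artisan key:generate fails there. A hypothetical override (the service name and the php-fpm hand-off are assumptions, not taken from laradock's files) that generates the key and then execs a long-running process so the container stays up:

```yaml
# Hypothetical docker-compose fragment -- service and process names assumed.
services:
  workspace:
    command: bash -c "php artisan key:generate --force && exec php-fpm"
```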
In my project I am using the database queue and executing it with the command
php artisan queue:listen
in a console window, and it works. But on my Windows server there are many projects using queues, so many console windows are open. It is quite inconvenient. Is it possible to run this command in the background without keeping a console window open?
You can use the command below, but note the process will not survive a server restart:
nohup php artisan queue:work --daemon &
The trailing ampersand (&) causes the process to start in the background, so you can continue to use the shell and do not have to wait until the script is finished.
See nohup
nohup - run a command immune to hangups, with output to a non-tty
This will write output to a file named nohup.out in the directory where you run the command. If you have no interest in the output you can redirect stdout and stderr to /dev/null, or similarly you could append it to your normal Laravel log. For example:
nohup php artisan queue:work --daemon > /dev/null 2>&1 &
nohup php artisan queue:work --daemon >> app/storage/logs/laravel.log 2>&1 &
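The mechanics of nohup and & can be sketched with sleep standing in for the queue worker:

```shell
# nohup detaches the command from hangup signals; the trailing & puts it
# in the background; $! holds the PID of the most recent background job.
nohup sleep 3 >/dev/null 2>&1 &
pid=$!
echo "shell is free while PID $pid runs in the background"
kill "$pid"   # stop the stand-in worker
```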
But you should also use something like Supervisord to ensure that the service remains running and is restarted after crashes/failures.
Running queue:listen with supervisord
supervisord is a *nix utility to monitor and control processes; below is a portion of /etc/supervisord.conf that works well.
Portion of supervisord.conf for queue:listen
[program:l5beauty-queue-listen]
command=php /PATH/TO/l5beauty/artisan queue:listen
user=NONROOT-USER
process_name=%(program_name)s_%(process_num)d
directory=/PATH/TO/l5beauty
stdout_logfile=/PATH/TO/l5beauty/storage/logs/supervisord.log
redirect_stderr=true
numprocs=1
You’ll need to replace the /PATH/TO/ to match your local install. Likewise, the user setting will be unique to your installation.
I'm having a bit of a nightmare getting a crontab/cronjob to run an Artisan command.
I have another Artisan command running via cronjob no problems but this second command won't run.
Firstly, when I do 'crontab -e' and edit the file to contain:
0 0 * * * /usr/local/bin/php /home/purple/public_html/artisan feeds:send
The cronjob doesn't run at all.
If I go to cPanel and add the cronjob there, it runs but I receive the following error:
open(public/downloads/feeds/events.csv): failed to open stream: No such file or directory
The thing is the file exists and the directories have the correct permissions. If I run the command when logged in via SSH as root or the user purple (php artisan feeds:send) the command runs flawlessly and completes its tasks no problem.
If in cPanel, I edit the cronjob to use:
0 0 * * * php /home/purple/public_html/artisan feeds:send
I receive the following error:
There are no commands defined in the "feeds" namespace.
The funny thing is that my other command is registered in the crontab file, works fine, and has no reference in cPanel at all.
Any help would be much appreciated. For reference, I have included the command and the model that the command uses.
Feed.php Model:
http://laravel.io/bin/1e2n
DataFeedController.php Controller:
http://laravel.io/bin/6x0E
SendFeeds.php Command:
http://laravel.io/bin/BW3d
start/artisan.php:
http://laravel.io/bin/2xV3
FeedInterface.php Interface:
http://laravel.io/bin/LxnO
As you can see there is a GetRates command, which works.
Well, it looks like I had to cd into the script directory first before running the command, which makes sense now that I've worked it out. Easy when you know how, eh!
* * * * * cd /home/purple/public_html/ && /usr/local/bin/php artisan feeds:send
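That conclusion also explains the earlier error: the command opens public/downloads/feeds/events.csv relative to the current working directory, and cron does not start in the project root. A quick illustration of the cwd dependence, using plain files rather than artisan:

```shell
# A relative path resolves against the current working directory, so the
# same command succeeds or fails depending on where cron starts it.
dir=$(mktemp -d)
mkdir -p "$dir/public"
echo "csv data" > "$dir/public/events.csv"
cd /
cat public/events.csv 2>/dev/null || echo "not found from /"
cd "$dir"
cat public/events.csv   # prints "csv data"
```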