PHP Laravel schedule command to run once across all servers - laravel

I'm trying to figure out whether I can schedule a command to run across all servers.
Currently I have a command clean:directories and I run it like this:
$schedule->command('clean:directories')->daily();
It scans the filesystem and removes files older than a set date. I need it to run on a queue server rather than just on the main server.
Update: for now I've added an entry to the crontab on the specific server I would like this run on.

If your Laravel application is deployed to multiple servers, you should consider adding the schedule:run command on each of them as a Linux cron job.
Laravel Docs:
* * * * * cd /path-to-your-project && php artisan schedule:run >> /dev/null 2>&1
Running the Scheduler (Laravel Docs)
Before you add the schedule:run command on your other servers, you should bear in mind that all of the jobs defined in your kernel will be executed on all servers. If you have commands that should only be executed once, you should chain the onOneServer() method onto them. * (See requirements below)
Official Laravel Docs:
If your application's scheduler is running on multiple servers, you may limit a scheduled job to only execute on a single server. For instance, assume you have a scheduled task that generates a new report every Friday night. If the task scheduler is running on three worker servers, the scheduled task will run on all three servers and generate the report three times. Not good!
Running Tasks on One Server (Laravel Docs)
onOneServer Requirements:
To utilize this feature, your application must be using the database, memcached, dynamodb, or redis cache driver as your application's default cache driver. In addition, all servers must be communicating with the same central cache server.
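Applied to the command from the question, the Kernel entry could look like the following sketch (assuming a shared cache driver, such as redis, is already configured as the default on all servers):

```php
// app/Console/Kernel.php — sketch, assuming a shared central cache
// (database, memcached, dynamodb, or redis) is the default cache driver.
protected function schedule(Schedule $schedule)
{
    // The scheduler fires on every server, but the shared cache lock
    // taken by onOneServer() ensures only one server runs the command.
    $schedule->command('clean:directories')
             ->daily()
             ->onOneServer();
}
```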

Related

Should I remove Laravel scheduling in Cloud Run?

I'm migrating my Laravel 8 app to Cloud Run, but I have a problem with my schedulers. My Laravel app uses Laravel Scheduling, and I have 5 tasks:
protected function schedule(Schedule $schedule)
{
    $schedule->command(Commands\CmdOne::class)->monthlyOn(1, '02:10');
    $schedule->command(Commands\CmdTwo::class)->dailyAt('04:00');
    $schedule->command(Commands\CmdThree::class)->dailyAt('04:00');
    $schedule->command(Commands\CmdFour::class)->dailyAt('05:00');
    $schedule->command('activations:clean')->daily();
}
But I think it's risky to place the cron inside the container, because Cloud Run can run multiple container instances of my app, and I'm afraid the tasks will run multiple times. My tasks send email to my customers and I want to run them just once.
e.g.: if Cloud Run creates 5 instances of my container at 05:00 AM, then the command $schedule->command(Commands\CmdFour::class)->dailyAt('05:00'); will be executed 5 times, and I don't want that.
So I looked at Google Cloud Scheduler, and I could expose a web service to run my tasks. But I don't know if that is the right way, or whether there is another way to execute my tasks. I don't know if removing the Laravel Scheduler is the right approach.
If I use Cloud Scheduler, I have to create 5 crons in Cloud Scheduler. That's fine for one application, but if I have 10 apps (with the same code base but different Cloud Run services) it will be hard to manage all these crons, because I'll have 5 crons per app, so 50 crons in this case.
Do you have a better way to manage this?
If you have the right cache setup (shared by all servers), then you can use the onOneServer() method.
See https://laravel.com/docs/9.x/scheduling#running-tasks-on-one-server
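Under that assumption (e.g. a shared Redis instance configured as the default cache driver for all Cloud Run instances), the tasks from the question could be marked like this sketch shows; this is not a verified Cloud Run setup:

```php
// app/Console/Kernel.php — sketch; requires every instance to talk to
// the same central cache (database, redis, memcached, or dynamodb).
protected function schedule(Schedule $schedule)
{
    // Only the instance that acquires the cache lock runs the task,
    // so the email is sent once even with 5 containers running.
    $schedule->command(Commands\CmdFour::class)
             ->dailyAt('05:00')
             ->onOneServer();
}
```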

gcloud cron jobs and laravel

I am trying to execute an API in Laravel every minute.
The API's method is GET; however, I could not specify the method in the cron.yaml file. Could I use a DELETE method here, and how? The code should be deployed on Google Cloud.
I have created a cron.yaml file that has the following format:
cron:
- description: "every minutes job"
  url: /deletestories
  schedule: every 1 mins
  retry_parameters:
    min_backoff_seconds: 2.5
    max_doublings: 5
I also created the deletestories API, which deletes rows under specific conditions.
However, this isn't working, and when I open the Google Cloud console I cannot find any errors or any executed cron jobs.
This cron.yaml file appears to be a Google App Engine cron configuration. If that is correct, then only the GET method is supported; you cannot use DELETE.
The GAE cron service itself consists simply of scheduled GET requests that your app needs to handle. From Scheduling Tasks With Cron for Python (the same applies to other languages and to the flexible environment cron as well):
A cron job makes an HTTP GET request to a URL as scheduled. The
handler for that URL executes the logic when it is called.
You also need to deploy your cron.yaml file for it to be effective (e.g. with gcloud app deploy cron.yaml). You should be able to see the deployed cron configuration in the developer console's Cron Jobs tab under the Task Queues menu (where you can also manually trigger any of the cron jobs). The GET requests performed for the respective cron jobs should appear in your app's request logs as well, when executed.

Running artisan queue:work with additional arguments

I am trying to run queued jobs and pass additional parameters through the command line. My use case is this:
I have 4 queue:work processes running through supervisor. The jobs in my queue all require access to a proxy server, through which I can only have 4 processes running at any given time. When I start a queued job, I have to find a process number (1 through 4) that is not currently being used, then run my command through that process.
I have been using a database table to store the processes; it has an in_use column which keeps track of whether each one is being used. The problem I'm seeing is that when two queue:work commands run simultaneously, the same proxy process can be picked from the database for both.
What I want
php artisan queue:work --process=1
Then somehow retrieve that argument inside the job, so I can run my 4 processes separately in supervisor.
As a workaround, I have created a custom artisan command which will take the argument, but I then lose the queue functionality. I don't want to have to develop a custom queue process.
Is there a way to pass this argument? Or, alternatively, is there a way that I could pop jobs off the queue from within my custom artisan command, and then run them manually rather than through queue:work?
The problem could be solved by using dedicated queues, so that each queue has a specific proxy process attached to it. The only thing left is to create a function/process that determines which queue a job should go to.
https://laravel.com/docs/5.1/queues#pushing-jobs-onto-the-queue
Check out the part: Specifying The Queue For A Job
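As a sketch of that "function to determine the queue" idea: hash each job onto one of four dedicated queues, each served by its own supervisor-managed worker (php artisan queue:work --queue=proxy-N). The queue names and the hashing scheme are assumptions for illustration, not from the question:

```php
<?php
// Sketch: map a job key to one of 4 dedicated per-proxy queues.
// Queue names proxy-1 .. proxy-4 are assumptions; each would be served
// by its own worker process: php artisan queue:work --queue=proxy-N
function proxyQueueFor(string $jobKey, int $processes = 4): string
{
    // crc32 gives a stable non-negative hash on 64-bit PHP, so the
    // same key always lands on the same proxy queue, and two workers
    // started at the same time cannot race for the same slot.
    $n = (crc32($jobKey) % $processes) + 1;
    return "proxy-{$n}";
}
```

A job could then be dispatched onto the chosen queue, e.g. with ->onQueue(proxyQueueFor($key)) in recent Laravel versions.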

How to fire Laravel Queues with beanstalkd

I'm pretty new to the whole queued jobs thing in Laravel 4. I have some process-heavy tasks I need the site to run in the background after they are fired by the user performing a particular action.
When I was doing the local development for my site I was using this:
Queue::push('JobClass', array('somedata' => $dataToBeSent));
And I was using the local "sync" driver to do it. (The jobs would just fire automatically, impacting the user experience, but I assumed that when going into production I could switch to beanstalkd and they would then run in the background.)
Which brings me to where I'm at now. I have beanstalkd set up with the dependencies installed with composer and the beanstalkd process listening for new jobs. I installed a beanstalk admin interface and can see my jobs going into the queue, but I have no idea how to actually get them to run!
Any help would be appreciated, thanks!
This is actually a really badly documented feature in Laravel.
What you actually need to do is have JobClass.php in a folder that is auto-loaded. I use app/commands, but it can also be in app/controllers or app/models if you like. The class needs a fire method that takes the $job and $data arguments.
To run these, simply execute php artisan queue:listen --timeout=60 in your terminal, and it will keep emptying the queue until it is empty or it has been running for longer than 60 seconds. (Small note: the timeout is the time limit for starting a job, so it may run for 69 seconds if one job takes 10 seconds.)
If you only want to run a single job (perfect for testing), run php artisan queue:work.
There are tools like Supervisord that make sure your job handlers keep running, but I recommend just creating a cron task that runs every X minutes, based on how fast the data needs to be processed and how much data comes in.
Keep in mind you need to use the full path to artisan:
php /some/path/to/artisan queue:work
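For reference, here is a minimal sketch of what such a handler class can look like in Laravel 4. The class name JobClass and the payload key somedata come from the question's Queue::push call; the strtoupper body is just a placeholder for the real work:

```php
<?php
// app/commands/JobClass.php — Laravel 4-style queue handler sketch.
class JobClass
{
    // Laravel 4 calls fire($job, $data) when a worker pops the job.
    public function fire($job, array $data)
    {
        // Placeholder work: process the payload sent via Queue::push().
        $result = strtoupper($data['somedata']);

        // Delete the job from beanstalkd so it is not retried.
        $job->delete();

        return $result;
    }
}
```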

Running background services ruby

I have to run a couple of scripts which crawl some 1000s of web pages and save some information, every 10 minutes.
I am using DreamHost shared hosting for my PHP site.
What would be the appropriate way to configure these scripts in cron so that they execute 24x7?
Please let me know which host I can use for the same.
If you can SSH into your server, you need to run crontab -e to edit your cron jobs and then add a line like this:
*/10 * * * * /path/to/ruby /path/to/your/script.rb
