I am trying to execute an API in Laravel every minute.
The API's method is GET. However, I could not specify the method in the cron.yaml file. Could I use the DELETE method here, and if so, how? The code should be deployed on Google Cloud.
I have created a cron.yaml file that has the following format:
cron:
- description: "every minutes job"
  url: /deletestories
  schedule: every 1 mins
  retry_parameters:
    min_backoff_seconds: 2.5
    max_doublings: 5
I also created the deletestories API, which deletes rows under specific conditions.
However, this isn't working, and when I open the Google Cloud console I cannot find any errors or any executed cron jobs.
This cron.yaml file appears to be a Google App Engine cron configuration. If that is the case, then only the GET method is supported; you cannot use DELETE.
The GAE cron service itself consists simply of scheduled GET requests that your app needs to handle. From Scheduling Tasks With Cron for Python (the same applies to other languages and to the flexible environment cron as well):
A cron job makes an HTTP GET request to a URL as scheduled. The handler for that URL executes the logic when it is called.
You also need to deploy your cron.yaml file for it to take effect. You should then be able to see the deployed cron configuration in the developer console's Cron Jobs tab under the Task Queues menu (where you can also trigger any of the cron jobs manually). The GET requests performed for the respective cron jobs should also show up in your app's request logs once they execute.
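Since only GET is available, the /deletestories endpoint has to be exposed as a GET route on the Laravel side and do the deletion in its handler. Below is a minimal sketch; the Story model, the deletion condition, and the header guard are assumptions for illustration, not part of the question. App Engine sets the X-Appengine-Cron: true header on its cron requests (and strips it from external traffic), and the cron configuration itself is deployed with gcloud app deploy cron.yaml.

use Illuminate\Http\Request;
use Illuminate\Support\Facades\Route;

// routes/web.php: the cron service can only send GET requests,
// so the cleanup endpoint is registered as a GET route.
Route::get('/deletestories', function (Request $request) {
    // Reject anything that does not come from the App Engine cron service.
    abort_unless($request->header('X-Appengine-Cron') === 'true', 403);

    // Placeholder deletion logic: adjust the model and condition to your schema.
    \App\Story::where('created_at', '<', now()->subDay())->delete();

    return response('ok', 200);
});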
I'm migrating my Laravel 8 app to Cloud Run, but I have a problem with my schedulers. My Laravel app uses Laravel Scheduling, so I have 5 tasks:
protected function schedule(Schedule $schedule) {
    $schedule->command(Commands\CmdOne::class)->monthlyOn(1, '02:10');
    $schedule->command(Commands\CmdTwo::class)->dailyAt('04:00');
    $schedule->command(Commands\CmdThree::class)->dailyAt('04:00');
    $schedule->command(Commands\CmdFour::class)->dailyAt('05:00');
    $schedule->command('activations:clean')->daily();
}
But I think it's risky to place the cron inside the container, because Cloud Run can run multiple container instances of my app and I'm afraid the tasks will run multiple times; my tasks send email to my customers and I want them to run just once.
e.g. if Cloud Run creates 5 instances of my container at 05:00 AM, the command $schedule->command(Commands\CmdFour::class)->dailyAt('05:00'); will be executed 5 times, and I don't want that.
So I looked at Google Cloud Scheduler, where I could expose a web service to run my tasks. But I don't know if that's the right approach, or whether there is another way to execute my tasks. I'm also not sure if removing the Laravel Scheduler is the right way to go.
If I use Cloud Scheduler, I have to create 5 cron jobs in it. That's fine for one application, but if I have 10 apps (with the same code base but different Cloud Run services) it will be hard to manage all these cron jobs, because I'll have 5 per app, so 50 in this case.
Do you have a better way to manage this?
If you have the right cache setup (shared by all servers) then you can use the onOneServer() method.
See https://laravel.com/docs/9.x/scheduling#running-tasks-on-one-server
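Applied to the schedule from the question, that would look roughly like the sketch below, with every Cloud Run instance pointing at the same shared cache (for example Redis via CACHE_DRIVER=redis; the cache choice is an assumption) so that only the instance that grabs the lock actually runs the task:

// app/Console/Kernel.php
protected function schedule(Schedule $schedule) {
    // onOneServer() requires a cache store shared by all instances
    // (redis, memcached, database or dynamodb).
    $schedule->command(Commands\CmdFour::class)
             ->dailyAt('05:00')
             ->onOneServer();

    $schedule->command('activations:clean')
             ->daily()
             ->onOneServer();
}

The remaining commands from the question would get the same ->onOneServer() treatment.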
I have two apps running on the same server.
Now it seems that when I add withoutOverlapping() to the scheduled job and manage the base cron entry via cron itself, these two apps block each other's execution.
Could that be?
Yes, withoutOverlapping only works per application.
Laravel creates a file in the storage folder with a hash of the job. This way, if the file exists, Laravel knows the job is still running. The one application cannot possibly know if the other one is currently running a job because it does not have access to the storage folder of the other application.
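A rough conceptual sketch of that mechanism (this is not Laravel's actual implementation, only an illustration of why locks in one app's storage folder cannot block another app):

// Simplified file-based mutex: the lock file lives inside this
// application's own storage directory, so a second application with a
// different storage path never sees it.
$lock = storage_path('framework/schedule-' . sha1('process:queue 0'));

if (file_exists($lock)) {
    return;                     // a previous run is still in progress
}

file_put_contents($lock, getmypid());

try {
    // ... run the actual job ...
} finally {
    @unlink($lock);             // release the mutex when the job finishes
}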
If your code looks like the following
$schedule->command('process:queue 0')->everyMinute()->withoutOverlapping();
$schedule->command('process:queue 1')->everyMinute()->withoutOverlapping();
It is because the same command with different parameters might be considered overlapping.
I.e. the hash of the job considers only the command signature.
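If that is what you are running into, one way around it (assuming you are free to change the command signatures; the names below are made up) is to register the two queues as separately named commands so that each scheduled entry produces a different hash:

$schedule->command('process:queue-zero')->everyMinute()->withoutOverlapping();
$schedule->command('process:queue-one')->everyMinute()->withoutOverlapping();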
I've just started using Wercker and I'd like a job to run regularly (e.g. daily or hourly). I realize this may be an anti-pattern, but is it possible? My intent is not to keep the container running indefinitely, just to have my workflow executed at a particular interval.
You can use a call to the Wercker API to trigger a build for any project which is set up already in Wercker.
So maybe set up a cron job somewhere that uses curl to make the right API call?
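Something along these lines could work; the endpoint path, token, and pipeline ID below are placeholders and should be checked against the current Wercker API documentation before use:

#!/bin/sh
# trigger-wercker.sh - run from cron, e.g. "0 2 * * * /path/to/trigger-wercker.sh"
# WERCKER_TOKEN and the pipeline ID are placeholders for your own values.
curl -s -X POST "https://app.wercker.com/api/v3/runs" \
     -H "Authorization: Bearer $WERCKER_TOKEN" \
     -H "Content-Type: application/json" \
     -d '{"pipelineId": "<your-pipeline-id>"}'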
I am looking for a way to link an Azure scheduler or WebJob to the Laravel schedule.
My understanding is that, to set up an Azure schedule, I would need an endpoint to link to my Laravel app, and I am not sure how to achieve that.
TL;DR
You can use WebJobs under Web Apps with a command-line script to trigger the Laravel scheduler.
Full reference
Azure provides WebJobs that can fire on various triggers, including cron-like schedules. In order to run the Laravel scheduler you need to trigger the schedule:run command every minute. For now I'll assume artisan lives in D:\home\site\wwwroot\artisan, which is the default location for PHP-based deployments.
Create a new file called runsched.cmd, or anything else as long as it has the .cmd extension. Edit the file with Notepad and add:
php %HOME%\site\wwwroot\artisan schedule:run
Save the file and go to the Azure portal.
Select your Web App and find WebJobs under the application settings. Click Add and a side panel will appear.
Give the WebJob a name, for example LaravelScheduler, and upload the runsched.cmd file from the first step.
Set Type to Triggered and make sure Triggers is set to Scheduled.
Now you can specify how often the command must be triggered. Even though the portal says 'CRON Expression', the cron format is not the same as the Linux notation. If you set the expression to all asterisks, as shown in the Laravel documentation, your command will be triggered every second, which is far too often for most applications. The correct CRON expression is:
0 * * * * *
If your job looks something like this, click OK.
The Laravel scheduler will now be triggered every minute. To verify that everything is working correctly, you can trigger the job once yourself (select the LaravelScheduler job and click Run) and check the job status. If Azure reports Failed under Status, check the logs and make sure you've entered the correct paths.
Hope that explains it.
Quick note: Microsoft likes to change Azure Portal on a regular basis, so any of these instructions may have changed by now.
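As an alternative to entering the schedule in the portal, a triggered WebJob can also read its schedule from a settings.job file deployed next to runsched.cmd, using the same six-field expression; this keeps the schedule in source control together with the script:

{
  "schedule": "0 * * * * *"
}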
Is it possible for CakePHP to execute a CakePHP shell task in the background, e.g. for running long reports? I would also want to report the current status back to the user by updating a table during the report generation and querying it using Ajax.
Yes, you can run shells in the background via normal system calls like
/path/to/cake/console/cake -app /path/to/app/ <shell> <task>
The tricky part is to start one asynchronously from PHP; the best option would be to put jobs in a queue and run the shell as a cron job every so often, which then processes the queue. You can then also update the status of the job in the queue and poll that information via AJAX.
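The fire-and-forget start can look something like the sketch below (the shell and task names are placeholders); redirecting the output and appending & is what keeps exec() from blocking the PHP request on Linux/Unix:

// Inside a controller action: start the report shell and return immediately.
$cmd = '/path/to/cake/console/cake -app /path/to/app report generate';
exec($cmd . ' > /dev/null 2>&1 &');

// Meanwhile the shell updates a status row that the Ajax polling reads.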
Consider implementing it as a daemon: http://pear.php.net/package/System_Daemon