Is it possible for CakePHP to execute a shell task in the background, e.g. for running long reports? I would also want to report the current status back to the user by updating a table during the report generation and querying it via Ajax.
Yes, you can run shells in the background via normal system calls like
/path/to/cake/console/cake -app /path/to/app/ <shell> <task>
The tricky part is to start one asynchronously from PHP; the best option would be to put jobs in a queue and run the shell as a cron job every so often, which then processes the queue. You can then also update the status of the job in the queue and poll that information via AJAX.
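If you do want to fire one off directly from a request, the usual trick is to background the process and detach its output so the call returns immediately. A rough sketch - the paths and the "report generate" shell/task below are placeholders for your own setup:

// Paths and the shell/task name are placeholders; build the console command shown above.
$cmd = '/path/to/cake/console/cake -app /path/to/app/ report generate';

// nohup plus output redirection and a trailing '&' make exec() return immediately
// while the shell keeps running in the background.
exec('nohup ' . $cmd . ' > /dev/null 2>&1 &');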
Consider implementing it as a daemon: http://pear.php.net/package/System_Daemon
Related
I am new to NiFi. My requirement is to trigger a NiFi process group from an external scheduling tool called Control-M. I tried using a shell script to start and stop the process group via curl commands. The process group fetches data from a text file and writes it into a database, but I am unable to determine when the process group has completed, because I only see statuses like Started, Running and Stopped, but no Completed state. I am stuck on this issue and need your valuable inputs on how to determine that all the records have been inserted into the database by the process group.
NiFi is not a batch 'start & stop' style tool. NiFi is built to work with continuous streams of data, meaning that flows are 'always on'. It is not intended to be used with batch schedulers like ControlM, Oozie, Airflow, etc. As such, there is no 'Completed' status for a flow.
That said, if you want to schedule flows in this way, it is possible - but you need to build it into the flow yourself. You will need to define what 'Completed' means and build that logic into your flow - e.g. a MonitorActivity processor after your last processor to watch for activity.
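If you are driving the group from outside (as you already do with curl), another common approximation is to poll the REST status endpoint and treat "nothing queued and no active threads" as done. A rough sketch in PHP, assuming an unsecured instance and the standard /nifi-api/flow/process-groups/{id}/status endpoint (field names can vary between NiFi versions, and the host and group id are placeholders):

// Placeholders: adjust the host and the process group id to your environment.
$nifi = 'http://localhost:8080';
$groupId = 'your-process-group-id';

do {
    sleep(10);
    $status = json_decode(file_get_contents("$nifi/nifi-api/flow/process-groups/$groupId/status"), true);
    $snapshot = $status['processGroupStatus']['aggregateSnapshot'];
    // "Done" here means: no flowfiles left in any queue and no processor threads still running.
    $busy = $snapshot['flowFilesQueued'] > 0 || $snapshot['activeThreadCount'] > 0;
} while ($busy);

// At this point you can stop the group again with your existing curl/PUT call.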
I found this similar question: "How to check If the current app process is running within a queue environment in Laravel".
But actually this is the opposite of what I want. I want to be able to distinguish between code being executed manually via an artisan command launched on the CLI, and a job being run as a result of a POST trigger via a controller or as part of a scheduled run.
Basically, I want to distinguish between a job being run via the SYNC driver, manually triggered by the developer with eyes on the CLI output, and everything else.
app()->runningInConsole() returns true in both cases, so it is not useful to me.
Is there another way to detect this? For example, is there a way to detect the currently used queue connection? Keep in mind that it is possible to change the queue connection at runtime, so just checking the value in the env file is not enough.
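For the "detect the currently used queue connection" part, the closest thing I can think of is asking the job itself at runtime. A rough sketch, assuming the job class uses the InteractsWithQueue trait and that getConnectionName() is available on the underlying job instance (I have not verified this across Laravel versions):

// Inside the job's handle() method. $this->job should be set when the job is
// processed through a queue connection (the sync driver included).
$connection = $this->job ? $this->job->getConnectionName() : null;

if ($connection === 'sync' && app()->runningInConsole()) {
    // Most likely a developer running the job inline from an artisan command.
}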
I recently submitted a training job with a command that looked like:
gcloud ai-platform jobs submit training foo --region us-west2 --master-image-uri us.gcr.io/bar:latest -- baz qux
(more on how this command works here: https://cloud.google.com/ml-engine/docs/training-jobs)
There was a bug in my code which caused the job to just keep running rather than terminate. Two weeks and $61 later, I discovered my error and cancelled the job. I want to make sure I don't make that kind of mistake again.
I'm considering using the timeout command within the training container to kill the process if it takes too long (typical runtime is about 2 or 3 hours), but rather than trust the container to kill itself, I would prefer to configure GCP to kill it externally.
Is there a way to achieve this?
As a workaround, you could write a small script that runs your command and then sleeps for the time you want before running a cancel-job command.
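Something along these lines, for example (just a sketch - PHP here only to match the rest of this page, a plain shell script works the same way; the job id, region, image and the 6-hour cap are placeholders):

<?php
// Submit the training job, wait up to a hard cap, then cancel it in case it is still running.
$jobId = 'foo_' . date('Ymd_His');
shell_exec("gcloud ai-platform jobs submit training {$jobId} --region us-west2 --master-image-uri us.gcr.io/bar:latest -- baz qux");

sleep(6 * 3600); // hard cap, comfortably above the typical 2-3 hour runtime

// If the job already finished, the cancel call should at worst print an error, which is harmless here.
shell_exec("gcloud ai-platform jobs cancel {$jobId}");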
As a timeout setting is not available in the AI Platform training service, I took the liberty of opening a Public Issue with a Feature Request to record the lack of this option. You can track the PI progress here.
Besides the script mentioned above, you can also try:
TimeOut Keras callback, or timeout= Optuna param (depending on which library you actually use)
Cron-triggered Lambda (Cloud Function)
I am using HangFire hosted by IIS with an app pool set to "AlwaysRunning", and the Autofac extension for DI. Currently, background jobs run with HangFire are executing sequentially. Both jobs are similar in nature and involve file I/O: the first job starts and begins generating the requisite file; the second job starts executing, but then stalls until the first job is complete, at which point it resumes.

I am not sure if this is an issue related to DI and the lifetime scope. I tend to think not, as I create everything with instance-per-dependency scope. I am using OWIN to bootstrap HangFire and I am not passing any BackgroundJobServer options, nor am I applying any hints via attributes. I am using the default configuration for workers. What would be causing the jobs to execute sequentially?

I am sending a POST request to Web API and adding jobs to the queue with the following:

BackgroundJob.Enqueue<ExecutionWrapperContext>(c => c.ExecuteJob(job.SearchId, $"{request.User} : {request.SearchName}"));
Thanks In Advance
I was looking for that exact behavior recently and I managed to get it by using this attribute:
[DisableConcurrentExecution(<timeout>)]
Might it be that you have this attribute applied, either on the job or globally?
Is this what you were looking for?
// Enqueue the first job and keep its id so the second one can be chained to it.
var hangfireJobId = BackgroundJob.Enqueue<ExecutionWrapperContext>(x => x.ExecuteJob1(arguments));
// The continuation only starts once the first job has finished.
hangfireJobId = BackgroundJob.ContinueWith<ExecutionWrapperContext>(hangfireJobId, x => x.ExecuteJob2(arguments));
This will basically execute the first part and when that is finished it will start the second part
I am working on CakePHP 3.4 project.
I have to execute some command to scan through the files and directories of a particular directory.
This might take a long time depending on the size of the directory, therefore I want to run it in the background and show a "running" label in the view until it has completed successfully.
How can I run a Shell task in the background from a controller and update the database on execution?
I'm new to Shell tasks.
You're thinking along the right lines about running this in the background if it is a time-consuming task. You will need to use some form of queuing system that lets you add jobs to a queue, which then get run in the background by processing the queue from a cron job. Take a look at the Queue plugin for doing this.
You'll basically need to create a queue task that contains the functionality that you need running in the background and then add a job to the queue that will run that task in the background. The Queue plugin's documentation shows how to do this and there are a load of example queue tasks included with the plugin.
If you need to indicate the status of the queued job you could save the job's ID in a session and check if it is complete when loading a page.
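Roughly, the shape of it looks like this - just a sketch, as the exact base class, task naming and method signatures depend on the plugin version (check its docs); the task name, scanning logic and session key below are made up for illustration:

// src/Shell/Task/QueueScanTask.php - the background work lives in a queue task.
namespace App\Shell\Task;

use Queue\Shell\Task\QueueTask;

class QueueScanTask extends QueueTask
{
    public function run(array $data, $jobId)
    {
        // Scan $data['path'] here and write progress/results to your status table.
    }
}

// In the controller: add a job to the queue and remember its id for the AJAX status checks.
$this->loadModel('Queue.QueuedJobs');
$job = $this->QueuedJobs->createJob('Scan', ['path' => $path]);
$this->request->session()->write('Scan.jobId', $job->id);

The queue itself is then processed in the background by running the plugin's worker shell (e.g. bin/cake queue runworker) from a cron job.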
You can dispatch a Shell task from the controller. If you want to run this in the background you could, for example, run this controller action via JavaScript/Ajax.
// Requires `use Cake\Console\ShellDispatcher;` at the top of the controller.

// This task may run for a long time, so lift PHP's execution time limit first.
set_time_limit(0);

$shell = new ShellDispatcher();
// Dispatch the shell as if `cake bake model Products` had been run on the command line.
$output = $shell->run(['cake', 'bake', 'model', 'Products']);

// run() returns the shell's exit code; 0 means success.
if ($output === 0) {
    $this->Flash->success('Yep!');
} else {
    $this->Flash->error('Nope!');
}
But you could indeed have googled this at least. ;-)
EDIT: Forget this one, go for drmonkeyninja’s answer.