I am new to the Spring Boot scheduler. I want to schedule multiple jobs that start and stop based on user requirements, and the start and stop operations have to be exposed as a RESTful service. Is there a way to achieve this with a Spring scheduler, where I can start a job when the user requests it and stop it when a stop is requested?
If you need to run a method in a new thread, annotate it with @Async.
If you need to start a job on a user trigger and stop it on a user trigger, Spring Batch is a good option, if it is allowed in your project.
Running a Spring Batch job:
jobLauncher.run(job, jobParameters);
Stopping a running job (via the JobOperator, which sends a stop signal for the given execution id):
jobOperator.stop(jobExecution.getId());
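Putting the two calls together, a minimal sketch of the REST endpoints could look like the following. The controller name, the `/jobs` paths, and the `exampleJob` bean are assumptions for illustration; any configured `Job` bean works.

```java
// Hypothetical REST endpoints to start and stop a Spring Batch job.
// Assumes a Job bean plus Spring Batch's JobLauncher and JobOperator are configured.
@RestController
@RequestMapping("/jobs")
public class JobControlController {

    private final JobLauncher jobLauncher;
    private final JobOperator jobOperator;
    private final Job exampleJob;

    public JobControlController(JobLauncher jobLauncher, JobOperator jobOperator, Job exampleJob) {
        this.jobLauncher = jobLauncher;
        this.jobOperator = jobOperator;
        this.exampleJob = exampleJob;
    }

    @PostMapping("/start")
    public Long start() throws Exception {
        // A unique parameter so each request creates a new JobInstance
        JobParameters params = new JobParametersBuilder()
                .addLong("requestedAt", System.currentTimeMillis())
                .toJobParameters();
        JobExecution execution = jobLauncher.run(exampleJob, params);
        return execution.getId(); // the client uses this id to stop the job later
    }

    @PostMapping("/stop/{executionId}")
    public boolean stop(@PathVariable long executionId) throws Exception {
        // Sends a stop signal; the job stops at the next chunk/step boundary
        return jobOperator.stop(executionId);
    }
}
```

Note that `stop` is cooperative: the execution finishes its current chunk before the status changes to STOPPED.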
Related
In our Spring Batch application, the workers' item processors interact with another service asynchronously through Kafka. The requirement is that we need an acknowledgement in order to retry failed batches, but the condition is not to wait for that acknowledgement.
Is there any mechanism in Spring Batch by which we can consume from Kafka asynchronously?
Is it possible to rerun a specific local worker step in a rerun of the job?
We implement producers and consumers over the same step using a Spring Batch decider. Thus, during the first run it only produces to Kafka, and on the second run it consumes from Kafka.
We are looking for a solution where we can consume from Kafka asynchronously in a Spring Batch application, in order to rerun a specific worker step.
Is there any mechanism in Spring Batch by which we can consume from Kafka asynchronously? Is it possible to rerun a specific local worker step in a rerun of the job?
According to your diagram, you are doing that call from an item processor. The closest "feature" you can get from Spring Batch is the AsyncItemProcessor. This is a special processor that processes items asynchronously in a separate thread. The resulting Future is unwrapped by an AsyncItemWriter with the result of the call.
Other than that, I do not see any other obvious way to do that with a built-in feature from Spring Batch. So you would have to manage that in a custom ItemProcessor.
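For reference, the AsyncItemProcessor/AsyncItemWriter pair (from spring-batch-integration) is wired roughly as below. The delegate names and the String item types are assumptions for illustration; the delegate processor would be the component making the Kafka/service call.

```java
// Sketch of AsyncItemProcessor / AsyncItemWriter wiring.
// The delegate processor performs the remote call on a separate thread;
// the AsyncItemWriter unwraps the Future before delegating to the real writer.
@Bean
public AsyncItemProcessor<String, String> asyncItemProcessor(ItemProcessor<String, String> remoteCallProcessor) {
    AsyncItemProcessor<String, String> asyncProcessor = new AsyncItemProcessor<>();
    asyncProcessor.setDelegate(remoteCallProcessor);            // does the Kafka/service call
    asyncProcessor.setTaskExecutor(new SimpleAsyncTaskExecutor());
    return asyncProcessor;
}

@Bean
public AsyncItemWriter<String> asyncItemWriter(ItemWriter<String> delegateWriter) {
    AsyncItemWriter<String> asyncWriter = new AsyncItemWriter<>();
    asyncWriter.setDelegate(delegateWriter);                    // receives the unwrapped results
    return asyncWriter;
}
```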
I have used Spring Cloud Data Flow to control some batch jobs. In SCDF, after I defined some tasks, they were launched as jobs with a running status. When I tried to stop a particular job, it did not stop immediately. I found that the job kept running until it finished its current step.
For example, my job 'ABC' has two steps, A and B. In SCDF, I stop job 'ABC' while step A is executing, and job 'ABC' keeps running until step A completes; it then does not execute step B.
So, is there any way to stop a job immediately from Spring Cloud Data Flow?
From Spring Cloud Data Flow, the batch job stop operation is delegated to the Spring Batch API. This means there is nothing Spring Cloud Data Flow offers to stop a batch job immediately, as it needs to be handled by Spring Batch or by the job implementation itself.
When a stop request is sent for a running batch job execution, the terminateOnly flag of the current step execution is set to true, which means the step execution is ready to be stopped, depending on the underlying step implementation.
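To illustrate what "depending on the underlying step implementation" means: chunk-oriented steps check the flag between chunks, but a long-running custom tasklet has to cooperate itself. A hedged sketch (the loop body and class name are made up):

```java
// A long-running custom Tasklet that cooperates with the stop signal.
// Chunk-oriented steps already honor terminateOnly between chunks;
// a tasklet doing its own long loop must check the flag explicitly.
public class CooperativeTasklet implements Tasklet {

    @Override
    public RepeatStatus execute(StepContribution contribution, ChunkContext chunkContext) throws Exception {
        StepExecution stepExecution = chunkContext.getStepContext().getStepExecution();
        for (int i = 0; i < 1000; i++) {
            if (stepExecution.isTerminateOnly()) {
                // terminateOnly was set by the stop request; end the step now
                throw new JobInterruptedException("Stop requested for step "
                        + stepExecution.getStepName());
            }
            doUnitOfWork(i);
        }
        return RepeatStatus.FINISHED;
    }

    private void doUnitOfWork(int i) {
        // placeholder for one unit of the step's work
    }
}
```

A tasklet that frequently returns RepeatStatus.CONTINUABLE achieves the same effect, since the framework checks for interruption between invocations.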
I have two Spring Boot applications. One features the creation and scheduling of jobs. In the other I configured the Quartz Scheduler, which prepares the job parameters using a shared database and launches the Spring Batch job.
I need to update the running Quartz Scheduler when the user updates or adds a new job from the UI. Also, if the server restarts, I need to restart the scheduler and the jobs.
How should I update my Quartz Scheduler when a new job is added or updated by the user? My Quartz Scheduler is always running. Can I use RestTemplate so that my UI application notifies my scheduler application about the jobs?
You will need to store the jobs to schedule in a database (the shared database) from the UI application, then create a job and build a trigger for it.
It's hard to say more without any code, but there is a guide that does something like what you want:
https://www.callicoder.com/spring-boot-quartz-scheduler-email-scheduling-example/
This is a dynamic scheduler/trigger
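A minimal sketch of scheduling and rescheduling a Quartz job at runtime, e.g. when the UI application notifies the scheduler application over REST. The service name, the `batch-jobs` group, and the `LaunchBatchJob` job class are assumptions for illustration; with spring-boot-starter-quartz and a JDBC job store, the scheduled jobs also survive a server restart.

```java
// Dynamically adding/updating Quartz triggers at runtime.
// Assumes the Scheduler bean is auto-configured by spring-boot-starter-quartz.
@Service
public class DynamicSchedulingService {

    private final Scheduler scheduler;

    public DynamicSchedulingService(Scheduler scheduler) {
        this.scheduler = scheduler;
    }

    // Called when the user adds a new job from the UI
    public void scheduleJob(String name, String cronExpression) throws SchedulerException {
        JobDetail jobDetail = JobBuilder.newJob(LaunchBatchJob.class)
                .withIdentity(name, "batch-jobs")
                .storeDurably()
                .build();
        Trigger trigger = TriggerBuilder.newTrigger()
                .forJob(jobDetail)
                .withIdentity(name + "-trigger", "batch-jobs")
                .withSchedule(CronScheduleBuilder.cronSchedule(cronExpression))
                .build();
        scheduler.scheduleJob(jobDetail, trigger);
    }

    // Called when the user updates an existing job's schedule
    public void rescheduleJob(String name, String cronExpression) throws SchedulerException {
        TriggerKey key = TriggerKey.triggerKey(name + "-trigger", "batch-jobs");
        Trigger newTrigger = TriggerBuilder.newTrigger()
                .withIdentity(key)
                .withSchedule(CronScheduleBuilder.cronSchedule(cronExpression))
                .build();
        scheduler.rescheduleJob(key, newTrigger);
    }
}
```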
I have created a project with two REST APIs that launch different jobs. My project is connected to a MySQL database. I would like to monitor both jobs in Spring Cloud Data Flow. Please help me out: how do we configure SCDF with MySQL so that both jobs are monitored? Additionally, I would like to know whether, if we launch a job by firing the API, SCDF will monitor those job instances. If not, please let me know how we can do that.
Thanks in Advance
Please take a moment to read the Spring Batch Admin to SCDF migration guide. It is a requirement that the jobs are wrapped with the Spring Cloud Task programming model.
Once you have the batch-job wrapped as a Task, you can register them in SCDF to build Task/Batch pipelines using SCDF's DSL or the GUI.
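The wrapping itself is small. A sketch of the application class, assuming the spring-cloud-starter-task dependency is on the classpath (class name is made up):

```java
// Wrapping a Spring Batch application as a Spring Cloud Task,
// so SCDF can launch it and record its task/job executions.
@EnableTask
@EnableBatchProcessing
@SpringBootApplication
public class BatchTaskApplication {

    public static void main(String[] args) {
        SpringApplication.run(BatchTaskApplication.class, args);
    }
}
```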
As for the datasource, all you have to make sure of is that the same datasource is shared between SCDF and the batch jobs. With this, SCDF's Dashboard will automatically list the jobs and their execution details.
Here are a few examples for your reference.
Additionally, I would like to know whether, if we launch a job by firing the API, SCDF will monitor those job instances.
Assuming you're referring to SCDF's task launch API (e.g., a scheduled trigger or some other means): yes, if triggered that way, the job executions will be captured in the database, as long as SCDF and the batch jobs share a common datasource, as explained previously.
I got a task where we have to publish the metadata of one table to another application via REST web services.
Basically the need is:
- It has to be a weekly scheduler, and every week we push the data to them.
- It should work synchronously.
- The job scheduler will kick off the job and call the REST client.
I am thinking of using the Spring scheduler, as it is simple, rather than the Quartz scheduler. Let me know your views and perspectives.
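For a simple weekly push like this, Spring's built-in @Scheduled support is usually enough; Quartz would mainly add value for persistent or dynamically managed triggers. A hedged sketch (the endpoint URL, the cron time, and the metadata loading are placeholders), which also requires @EnableScheduling on a configuration class:

```java
// Weekly, synchronous push of table metadata via a REST client.
// The scheduled method runs in the scheduler thread, so the call is synchronous.
@Component
public class WeeklyMetadataPublisher {

    private final RestTemplate restTemplate = new RestTemplate();

    // Every Monday at 02:00 (seconds minutes hours day-of-month month day-of-week)
    @Scheduled(cron = "0 0 2 * * MON")
    public void pushMetadata() {
        List<String> metadata = loadTableMetadata();
        restTemplate.postForEntity("https://example.com/api/metadata", metadata, Void.class);
    }

    private List<String> loadTableMetadata() {
        // placeholder: read the table metadata to publish
        return List.of();
    }
}
```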