Is there any option to run Spring Batch job automatically without scheduling or trigger it from other source?
We can run a Spring Batch job with a scheduling expression, or launch it from a main method, as below:
public static void runBatch() throws Exception {
    JobLauncher jobLauncher = (JobLauncher) getApplicationContext().getBean("jobLauncher");
    Job job = (Job) getApplicationContext().getBean("myJob"); // look up the Job bean here
    JobParametersBuilder jobParameters = new JobParametersBuilder();
    // Set job parameters, then run the job:
    JobExecution jobExecution = jobLauncher.run(job, jobParameters.toJobParameters());
}

public static void main(String[] args) throws Exception {
    runBatch();
}
That is, we need to call the main method manually or from some other scheduler for the job to be triggered. Can this batch process be started automatically, without the main method or a scheduler? Or is there a better option?
The Spring Batch job should start on its own, without being triggered from any entry point, like a daemon thread.
Also, suppose a job is currently processing data with Spring Batch: how can the next Spring Batch job be triggered automatically once the running job completes?
Many options exist to trigger the batch:
To call the batch from another scheduler, set the run parameters there and trigger the Job.
If it has to be triggered automatically, say at a certain interval, use fixed-delay scheduling (@Scheduled with fixedDelay).
To trigger it manually, you can go for the MBean approach, where it can be triggered from JConsole.
Or expose an endpoint that calls runBatch.
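For the follow-up question about chaining jobs, one option is a JobExecutionListener that launches the next job when the current one completes. A hedged sketch (the listener class and "nextJob" are my names, not from the question; assumes Spring Batch is on the classpath):

```java
import org.springframework.batch.core.BatchStatus;
import org.springframework.batch.core.Job;
import org.springframework.batch.core.JobExecution;
import org.springframework.batch.core.JobExecutionListener;
import org.springframework.batch.core.JobParametersBuilder;
import org.springframework.batch.core.launch.JobLauncher;

// Register this listener on the first job; it triggers the next job on success.
public class ChainingListener implements JobExecutionListener {

    private final JobLauncher jobLauncher;
    private final Job nextJob;

    public ChainingListener(JobLauncher jobLauncher, Job nextJob) {
        this.jobLauncher = jobLauncher;
        this.nextJob = nextJob;
    }

    @Override
    public void beforeJob(JobExecution jobExecution) {
        // nothing to do before the first job runs
    }

    @Override
    public void afterJob(JobExecution jobExecution) {
        if (jobExecution.getStatus() == BatchStatus.COMPLETED) {
            try {
                // Fresh parameters so the next run gets a new JobInstance
                jobLauncher.run(nextJob, new JobParametersBuilder()
                        .addLong("run.id", System.currentTimeMillis())
                        .toJobParameters());
            } catch (Exception e) {
                throw new IllegalStateException("Could not launch follow-up job", e);
            }
        }
    }
}
```

This keeps the chaining logic inside Spring Batch itself, so no external scheduler or daemon is needed for the hand-off.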
I am scheduling a task that runs at a fixed rate in Spring Boot. The function that I am using to schedule the task is as below:
private void scheduleTask(Store store, int frequency) {
    final ScheduledExecutorService scheduler = Executors.newScheduledThreadPool(1);
    Runnable task = store::scan;
    scheduler.scheduleAtFixedRate(task, 0, frequency, TimeUnit.MILLISECONDS);
}
This works fine, but if there is an exception at application startup, the application should exit. What happens instead is that I see the exception in the log and the message "Application failed to start", yet the scheduler keeps running, although it looks like only the scheduled thread is still alive.
Any hints on how to properly schedule an asynchronous task in a Spring Boot application? I tried the @Scheduled annotation but it does not run at all.
@Scheduled should work. Have you added the @EnableScheduling annotation to a @Configuration class or to the @SpringBootApplication class? The Scheduling Getting Started guide explains it in detail.
Regarding the scheduleTask method: what calls it? Is it started outside the Spring context? If so, Spring won't stop it; you have to manage its lifecycle yourself.
You should try @Scheduled, as it manages the thread pools/executors for you, and most people find it easier to understand.
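On the lifecycle point: a ScheduledExecutorService uses non-daemon worker threads by default, so if nothing calls shutdown(), its thread keeps the JVM alive even after the Spring context fails to start. A minimal, framework-free sketch (class and method names are mine) showing an explicit shutdown:

```java
import java.util.concurrent.CountDownLatch;
import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.TimeUnit;

public class SchedulerLifecycle {

    // Schedules a repeating task, waits for its first run, then shuts the
    // pool down. Returns true if the executor actually terminated.
    static boolean runOnceAndShutDown() {
        ScheduledExecutorService scheduler = Executors.newScheduledThreadPool(1);
        CountDownLatch firstRun = new CountDownLatch(1);
        scheduler.scheduleAtFixedRate(firstRun::countDown, 0, 100, TimeUnit.MILLISECONDS);
        try {
            firstRun.await();
            // Without this call, the non-daemon worker thread keeps the JVM alive
            scheduler.shutdown();
            return scheduler.awaitTermination(5, TimeUnit.SECONDS);
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
            return false;
        }
    }

    public static void main(String[] args) {
        System.out.println("terminated=" + runOnceAndShutDown());
    }
}
```

If you stay with a hand-rolled executor instead of @Scheduled, the shutdown() call needs to be wired into the application's shutdown path (e.g. a @PreDestroy method), which is exactly the lifecycle work Spring's scheduling support does for you.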
Two questions on Spring Batch; can someone please shed more light on these?
1) I have implemented registerShutdownHook in my Spring Batch project, but when I kill my batch process it does not stop immediately; it waits until the entire batch process has completed. Is that how it works?
public static void main(String[] args) {
    final AbstractApplicationContext appContext =
            new AnnotationConfigApplicationContext("com.lexisnexis.batch", "com.lexisnexis.rules");
    appContext.registerShutdownHook();
    ...
}
Does it need to stop all running batches when we kill the process with this registerShutdownHook code in place?
2) What is the best way to restart all stopped jobs?
Yes, that's how it works. The shutdown hook is needed to close the Spring application context gracefully; please check the Spring Framework documentation.
To restart a stopped job, invoke jobOperator.restart(executionId), or use jobLauncher.run with the same parameters you used to start the original job.
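A hedged sketch of the JobOperator restart path (variable and bean names are placeholders; assumes a JobOperator bean is configured in the context):

```java
// Sketch only: requires Spring Batch on the classpath and a configured JobOperator.
JobOperator jobOperator = (JobOperator) context.getBean("jobOperator");

// List the execution ids of a job instance, then restart a STOPPED/FAILED one.
List<Long> executionIds = jobOperator.getExecutions(jobInstanceId);
Long newExecutionId = jobOperator.restart(executionIds.get(0));
```

restart() creates a new JobExecution for the same JobInstance, so completed steps are skipped and processing resumes from the failed or stopped step, per the job's restartability settings.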
I have a Spring Batch job that takes a long time to execute. After it had been executing for a while I decided I wanted to stop it, but whenever I restart the server the job continues executing after the server comes back up.
I want to know where Spring Batch saves this state, so that I can delete it and stop that from happening.
I found out there are properties I can configure to make the job non-restartable, and I will use that going forward, but for now I just need to make sure the job stops for good.
You can see the documentation here, which shows and describes the Spring Batch meta-data tables.
If you just want to prevent restart of the job, the configuration below should help: preventRestart() marks the job as not restartable.
https://docs.spring.io/spring-batch/trunk/apidocs/org/springframework/batch/core/job/builder/JobBuilderHelper.html
@Bean
public Job myJob(JobBuilderFactory jobs) throws Exception {
    return jobs.get("myJob")
            .flow(step1())
            .build()
            .preventRestart()
            .build();
}
I need to create batch jobs using Spring Batch.
The job will access an Oracle DB, fetch records, process them in a tasklet, and commit the results.
I am planning to use Hibernate with Spring to deal with the data.
Jobs will be executed via AutoSys. I am using CommandLineJobRunner as the entry point.
(Extra info - I am using DynamicWebProject converted to Gradle, STS, Spring 4.0, Hibernate 5.0, NO Spring Boot)
I have a few queries/doubts about this application. They are more towards environment/deployment.
Do I need to deploy the whole app as a WAR in Tomcat (or any server) to instantiate all the beans (Spring and Hibernate)?
If yes, how can I start jobs using CommandLineJobRunner?
If no, I will have to instantiate the beans manually in a main method using ClassPathXmlApplicationContext. In that case, how should I execute jobs? Do I need to create a JAR (is this mandatory)?
How can I test these jobs on the command line? Do I need to pass the JARs (Spring, Hibernate, and other dependencies) when using CommandLineJobRunner to execute jobs?
I am new to batch jobs and all your comments would be of great help.
Thanks
No server is needed for Spring Batch applications.
You can launch a job using the jobLauncher bean; below is sample code.
public class MyJobLauncher {

    public static void main(String[] args) throws Exception {
        GenericApplicationContext context =
                new AnnotationConfigApplicationContext(MyBatchConfiguration.class);
        JobLauncher jobLauncher = (JobLauncher) context.getBean("jobLauncher");
        Job job = (Job) context.getBean("myJobName"); // the bean name of your job
        JobParameters jobParameters = new JobParametersBuilder().toJobParameters();
        JobExecution execution = jobLauncher.run(job, jobParameters);
    }
}
You will need to create a JAR, and all the other JARs it depends on must be on the classpath as well. You can use the Maven Assembly Plugin for this.
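For reference, a CommandLineJobRunner invocation might look like the following (the JAR name, configuration class, job name, and parameter are placeholders, not from the question):

```shell
# Run the job from the command line. The classpath must contain your code
# plus the Spring, Spring Batch, and Hibernate JARs (e.g. an assembly "fat" JAR).
# Arguments: <config class or XML> <job name> [jobParameters...]
java -cp my-batch-app-with-dependencies.jar \
  org.springframework.batch.core.launch.support.CommandLineJobRunner \
  com.example.MyBatchConfiguration myJobName run.date=2016-06-01
```

CommandLineJobRunner exits with a non-zero status when the job fails, which is what schedulers such as AutoSys typically check to decide success or failure.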
Currently I'm moving from Spring XD as my workflow and runtime environment to Spring Cloud DataFlow and Apache Airflow. I want to create workflows in Airflow and use custom Airflow operator to run Spring Cloud Tasks on Spring Cloud DataFlow server by REST-API.
It's possible using:
curl -X GET http://SERVER:9393/tasks/deployments/...
Unfortunately, Data Flow doesn't return the job execution ID in the response to this request, which would give a simple way to monitor the app. Is there a way to get this ID synchronously? Getting the last execution of a specific job can lead to mistakes, e.g. missing a job execution if I run many identical jobs at the same time.
On Spring Data Flow I am running Spring Batch jobs, so maybe a better way is to somehow set the job execution ID myself and pass it as an input parameter?
Try to use the following annotations to collect the task information from your bean:
public class MyBean {

    @BeforeTask
    public void methodA(TaskExecution taskExecution) {
        // taskExecution.getExecutionId() is available here, before the task runs
    }

    @AfterTask
    public void methodB(TaskExecution taskExecution) {
    }

    @FailedTask
    public void methodC(TaskExecution taskExecution, Throwable throwable) {
    }
}
https://docs.spring.io/spring-cloud-task/docs/current-SNAPSHOT/reference/htmlsingle/#features-task-execution-listener