I've developed a Spring Batch job which gets its data from JDBC. The problem I'm facing is that it executes on project startup regardless of the enabled property, whose value is FALSE. I've tried to create a conditional bean on the property, but that didn't work either and the job is still executed on project startup.
Following is my code snippet.
@Bean
@ConditionalOnProperty(value = "wallet-manager.djeezyConfig.enableJob", havingValue = "false")
public Job createJob() {
    return jobBuilderFactory.get("DJeezy wallet cleaner job")
            .incrementer(new RunIdIncrementer())
            .flow(Step1())
            .end()
            .build();
}
@Bean
@ConditionalOnProperty(value = "wallet-manager.djeezyConfig.enableJob", havingValue = "false")
public Step Step1() {
    return stepBuilderFactory.get("DJeezy wallet cleaner job - step1")
            .<ResellerWallet, ResellerWallet>chunk(wConfig.getDjeezyConfig().getChunkSize())
            .reader(resellerWalletItemReader)
            //.processor(resellerWalletProcessor)
            .writer(resellerWalletItemWriter)
            .faultTolerant()
            .skip(EmptyResultDataAccessException.class)
            .build();
}
I've also tried commenting out the @Scheduled annotation, but it still executes the job and steps.
//@Scheduled(fixedDelay = 15000)
public void scheduleByFixedRate() throws Exception {
    if (config.getDjeezyConfig().isEnableJob()) {
        System.out.println("Batch job starting");
        JobParameters jobParameters = new JobParametersBuilder()
                .addString("time", format.format(Calendar.getInstance().getTime()))
                .toJobParameters();
        jobLauncher.run(job, jobParameters);
        System.out.println("Batch job executed successfully\n");
    }
}
Can someone please guide me on what I'm missing here, and how I can prevent my job and step from being executed on startup?
I hope you are using this property in your properties file:
spring.batch.job.enabled=false
This should work: by default, Spring Boot runs all Job beans in the context at startup, and this property disables that behavior.
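Note also that the @ConditionalOnProperty in the question uses havingValue = "false", so the beans are created precisely when the property is FALSE. Assuming the property is meant to enable the job when true, a corrected sketch of the condition (same bean as in the question) would be:
@Bean
@ConditionalOnProperty(value = "wallet-manager.djeezyConfig.enableJob", havingValue = "true")
public Job createJob() {
    // only registered when wallet-manager.djeezyConfig.enableJob=true
    return jobBuilderFactory.get("DJeezy wallet cleaner job")
            .incrementer(new RunIdIncrementer())
            .flow(Step1())
            .end()
            .build();
}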
I'm working on a project that includes Spring Batch. Before copying the code snippets, I'm going to summarize briefly how the job works with a cron:
the cron calls a REST API on my project (@PostMapping("/jobs/external/{jobName}"))
in the post method, I get the job and execute it
in each execution, I'm supposed to run a step
the step contains a reader (an external REST call to the Elastic API to get documents) and a processor
Now my problem: in catalina.out, I can see the REST call from the cron every 10 minutes, as configured in my cron. BUT the step doesn't seem to make that call to Elastic every 10 minutes; the batch process always has the same set of data, fetched once when the batch is invoked during the Tomcat restart.
Job REST API:
@PostMapping("/jobs/external/{jobName}")
@Timed
public ResponseEntity start(@PathVariable String jobName) throws BatchException {
    log.info("LAUNCHING JOB FROM EXTERNAL : {}, timestamp : {}", jobName, Instant.now().toString());
    try {
        Job job = jobRegistry.getJob(jobName);
        JobParametersBuilder builder = new JobParametersBuilder();
        builder.addDate("date", new Date());
        return Optional.of(jobLauncher.run(job, builder.toJobParameters()))
                .map(BatchExecutionVM::new)
                .map(exec -> ResponseEntity
                        .ok()
                        .headers(HeaderUtil.createAlert("jobManagement.started", jobName))
                        .body(exec))
                .orElseGet(() -> ResponseEntity.badRequest().build());
    } catch (NoSuchJobException aEx) {
        log.warn(JOB_NOT_FOUND, aEx);
        throw new BatchException();
    } catch (JobInstanceAlreadyCompleteException | JobExecutionAlreadyRunningException | JobRestartException aEx) {
        log.warn("Job execution error.", aEx);
        throw new BatchException();
    } catch (JobParametersInvalidException aEx) {
        log.warn("Job parameters are invalid.", aEx);
        throw new BatchException();
    }
}
Job configuration:
@Bean
public Job usualJob() {
    return jobBuilderFactory
            .get("usualJob")
            .incrementer(new SimpleJobIncrementer())
            .flow(readUsualStep())
            .end()
            .build();
}

@Bean
public Step readUsualStep() {
    // TODO: simplify, we don't need a chunk here
    return stepBuilderFactory.get("readUsualStep")
            .allowStartIfComplete(true)
            .<AlertDocument, Void>chunk(25)
            .readerIsTransactionalQueue()
            .reader(rowItemReader())
            .processor(rowItemProcessor())
            .build();
}

@Bean
public ItemReader<AlertDocument> rowItemReader() {
    return new UsualItemReader(usualService.getLastAlerts());
}

@Bean
public UsualMapRowProcessor rowItemProcessor() {
    return new UsualMapRowProcessor();
}
I don't know why usualService.getLastAlerts() is called just once and not every 10 minutes.
Thanks to M. Deinum, this is basically the solution:
@Bean
@StepScope
public ItemReader<AlertDocument> rowItemReader() {
    return new UsualItemReader(usualService.getLastAlerts());
}
Annotating the reader bean with the @StepScope annotation makes it re-instantiated for every step execution, so usualService.getLastAlerts() is called on each run instead of only once at context startup.
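As a side note, a step-scoped bean also gets late binding of job parameters. Since the controller above already adds a "date" parameter, it could be injected into the reader like this (the date argument is shown purely to illustrate the injection; the reader itself is unchanged):
@Bean
@StepScope
public ItemReader<AlertDocument> rowItemReader(@Value("#{jobParameters['date']}") Date date) {
    // the bean is re-created for each step execution, so both the injected
    // parameter and the service call are fresh on every run
    return new UsualItemReader(usualService.getLastAlerts());
}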
I have 2 jobs where I have to launch job2 based on the status of job1.
Is it right to make the following call:
JobExecution jobExecution1 = jobLauncher.run(job1, jobParameters1);
if (jobExecution1.getStatus() == BatchStatus.COMPLETED) {
    JobExecution jobExecution2 = jobLauncher.run(job2, jobParameters2);
}
I have a doubt that the initial jobExecution1 status might not be final when the condition is checked.
If anyone could explain more about this process, it would be great. Thanks in advance.
It depends on the TaskExecutor implementation that you configure in your JobLauncher:
If the task executor is synchronous, then the first job will be run until completion (either success or failure) before launching the second job. In this case, your code is correct.
If the task executor is asynchronous, then the call JobExecution jobExecution1 = jobLauncher.run(job1, jobParameters1); will return immediately and the status of jobExecution1 would be unknown at that point. This means it is incorrect to check the status of job1 right after launching it as a condition to launch job2 (job1 will be run in a separate thread and could still be running at that point).
By default, Spring Batch configures a synchronous task executor in the JobLauncher.
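For illustration, here is a minimal sketch of how an asynchronous launcher is typically configured (Spring Batch 4 style, matching the JobBuilderFactory usage elsewhere in this thread); with this in place, checking the status immediately after run() would indeed be unreliable:
@Bean
public JobLauncher asyncJobLauncher(JobRepository jobRepository) throws Exception {
    SimpleJobLauncher jobLauncher = new SimpleJobLauncher();
    jobLauncher.setJobRepository(jobRepository);
    // run() now returns immediately; the returned JobExecution may still be
    // in STARTING/STARTED state while the job runs on another thread
    jobLauncher.setTaskExecutor(new SimpleAsyncTaskExecutor());
    jobLauncher.afterPropertiesSet();
    return jobLauncher;
}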
There are multiple ways. We are using the option below, chaining jobs with a decider. Here is the code snippet; please let me know if you face any issues.
public class MyDecider implements JobExecutionDecider {
    @Override
    public FlowExecutionStatus decide(JobExecution jobExecution, StepExecution stepExecution) {
        String exitCode = new Random().nextFloat() < .70f ? "CORRECT" : "INCORRECT";
        return new FlowExecutionStatus(exitCode);
    }
}

@Bean
public JobExecutionDecider myDecider() {
    return new MyDecider();
}
@Bean
public Job myTestJob() {
    return this.jobBuilderFactory.get("myTestJob")
            .start(step1())
            .next(step2())
            .on("FAILED").to(step3())
            .from(step4())
            .on("*").to(myDecider())
            .on("PRESENT").to(myNextStepOrJob())
            .next(myDecider()).on("CORRECT").to(myNextStepOrJob())
            .from(myDecider()).on("INCORRECT").to(myNextStepOrJob())
            .from(myDecider())
            .on("NOT_PRESENT").to(myNextStepOrJob())
            .end()
            .build();
}
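Note that each .from(myDecider()) transition above refers back to the same decider bean: the string inside the FlowExecutionStatus returned by decide(...) is what the .on(...) patterns match against, so "CORRECT" and "INCORRECT" are what route the flow to the next step or job.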
I'm currently trying to launch a Job programmatically in my Spring application.
Here is my actual code:
public void launchJob(String[] args)
        throws JobParametersInvalidException, JobExecutionAlreadyRunningException,
        JobRestartException, JobInstanceAlreadyCompleteException {
    String jobName = args[0];
    JobLauncher jobLauncher = context.getBean(JobLauncher.class);
    Job job = context.getBean(jobName, Job.class);
    JobParameters jobParameters = getJobParameters(args);
    JobExecution jobExecution = jobLauncher.run(job, jobParameters);
    BatchStatus batchStatus = jobExecution.getStatus();
}
And here is how I launch it:
String[] args = {"transformXML2CSV", "myFile.xml", "myFile.csv"};
springBatchJobRunner.launchJob(args);
But I have some trouble during the launch. The first is retrieving the app context: my first try was to annotate my class with @Service and use @Autowired like this:
@Autowired
private ApplicationContext context;
But this way my context is always null.
My second try was to get the context this way:
ApplicationContext context = new AnnotationConfigApplicationContext(SpringBatchNoDatabaseConfiguration.class);
The SpringBatchNoDatabaseConfiguration is my @Configuration and the Job is inside it.
Using this, my context is not null, but I get strange behavior that I can't understand or prevent: I run the launchJob function from my processor class, and when I get the context via AnnotationConfigApplicationContext, my processor class is rebuilt and a NullPointerException inside it stops the whole process.
I really don't understand this last part: why is it re-instantiating my processor class when I get the context?
As indicated in the comments above, you run a parent batch (with spring-batch) which at some point needs your job to process an XML file.
I suggest you keep the same spring-batch context and run the XML file processing job as a nested job of the parent job. You can do that using the JobStep class and Spring Batch's controlling step flow feature. As an example, here is what your parent job could look like:
public Job parentJob() {
    JobParameters processXmlFileJobParameters = getJobParameters(
            new String[]{"transformXML2CSV", "myFile.xml", "myFile.csv"});
    return this.jobBuilderFactory.get("parentJob")
            .start(firstStepOfParentJob())
            .on("PROCESS_XML_FILE").to(processXmlFileStep(processXmlFileJobParameters))
            .from(firstStepOfParentJob()).on("*").to(lastStepOfParentJob())
            .from(processXmlFileStep(processXmlFileJobParameters))
            .next(lastStepOfParentJob())
            .end().build();
}

public Step firstStepOfParentJob() {
    return stepBuilderFactory.get("firstStepOfParentJob")
            // ... depends on your parent job's business
            .build();
}

public Step lastStepOfParentJob() {
    return stepBuilderFactory.get("lastStepOfParentJob")
            // ... depends on your parent job's business
            .build();
}

public Step processXmlFileStep(JobParameters processXmlFileJobParameters) {
    return stepBuilderFactory.get("processXmlFileStep")
            .job(processXmlFileJob())
            .parametersExtractor((job, exec) -> processXmlFileJobParameters)
            .build();
}

public Job processXmlFileJob() {
    return jobBuilderFactory.get("processXmlFileJob")
            // ... describe here your XML processing job
            .build();
}
It seems a second Spring context is initialized with the instruction new AnnotationConfigApplicationContext(SpringBatchNoDatabaseConfiguration.class), so Spring initializes your beans a second time.
I recommend you use Spring Boot to automatically launch your jobs at startup.
If you don't want to use Spring Boot and want to launch your job manually, your main method should look like this:
public static void main(String[] args) {
    String[] localArgs = {"transformXML2CSV", "myFile.xml", "myFile.csv"};
    ApplicationContext context = new AnnotationConfigApplicationContext(SpringBatchNoDatabaseConfiguration.class);
    Job job = context.getBean(Job.class);
    JobLauncher jobLauncher = context.getBean(JobLauncher.class);
    JobParameters jobParameters = getJobParameters(localArgs);
    JobExecution jobExecution = jobLauncher.run(job, jobParameters);
    // Other code to do stuff after the job ends...
}
Notice that this code should be completed with your own needs. The class SpringBatchNoDatabaseConfiguration has to be annotated with @Configuration and @EnableBatchProcessing and should define the beans of your job.
You can also use the Spring Batch class CommandLineJobRunner with your Java config, as explained here: how to launch Spring Batch Job using CommandLineJobRunner having Java configuration.
It saves writing the code above.
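For reference, CommandLineJobRunner takes the fully qualified name of the configuration class, the job bean name, and then key=value job parameters. Applied to the snippets above it might look like this (the package com.example and the parameter names inputFile/outputFile are hypothetical; the original launchJob used positional args instead):
java -cp <your-classpath> org.springframework.batch.core.launch.support.CommandLineJobRunner \
    com.example.SpringBatchNoDatabaseConfiguration transformXML2CSV \
    inputFile=myFile.xml outputFile=myFile.csv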
All the steps pass, yet the job completes with a FAILED status.
@Bean
public Job job() {
    return this.jobBuilderFactory.get("person-job")
            .start(initializeBatch())
            .next(readBodystep())
            .on("STOPPED")
            .stopAndRestart(initializeBatch())
            .end()
            .validator(batchJobParamValidator)
            .incrementer(jobParametersIncrementer)
            .listener(jobListener)
            .build();
}
@Bean
public Flow preProcessingFlow() {
    return new FlowBuilder<Flow>("preProcessingFlow")
            .start(extractFooterAndBodyStep())
            .next(readFooterStep())
            .build();
}
@Bean
public Step initializeBatch() {
    return this.stepBuilderFactory.get("initializeBatch")
            .flow(preProcessingFlow())
            .build();
}

@Bean
public Step readBodystep() {
    return this.stepBuilderFactory.get("readChunkStep")
            .<PersonDTO, PersonBO>chunk(10)
            .reader(personFileBodyReader)
            .processor(itemProcessor())
            .writer(dummyWriter)
            .listener(new ReadFileStepListener())
            .listener(personFileBodyReader)
            .build();
}
Is anything wrong with the above configuration?
When I remove the stopAndRestart configuration, the job passes.
For your use case, it is not stopAndRestart that you need; it is rather setting allowStartIfComplete on the step. With that, if the job fails, the step will be re-executed even if it was successfully completed in the previous run.
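Applied to the job above, a minimal sketch (same beans as in the question, trimmed to the relevant parts) would drop the stopAndRestart transition and mark the step restartable instead:
@Bean
public Step readBodystep() {
    return this.stepBuilderFactory.get("readChunkStep")
            .<PersonDTO, PersonBO>chunk(10)
            // re-run this step on a restart even if it completed last time
            .allowStartIfComplete(true)
            .reader(personFileBodyReader)
            .processor(itemProcessor())
            .writer(dummyWriter)
            .build();
}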
I am new to Spring Batch and I am trying something where I read about 2000 records from a CSV file every 10 seconds using a Quartz scheduler and write them into a database.
The problem is that every time, it starts reading the file from the beginning and hence writes the same set of records into the database.
I've tried dynamically changing the parameter setLinesToSkip but to no avail, which is probably because it is included in my default bean definition.
Is there some way by which I can resume processing from the same spot, or maybe update the value in setLinesToSkip?
@Bean
public Step stepOne() {
    return stepBuilderFactory
            .get("stepOne")
            .<Stock, Stock>chunk(5)
            .reader(reader())
            .processor(processor())
            .writer(writer())
            .build();
}

@Bean
public Job readCSVFileJob1() {
    return jobBuilderFactory
            .get("readCSVFileJob1")
            .incrementer(new RunIdIncrementer())
            .start(stepOne())
            .build();
}

@Bean
public ItemProcessor<Stock, Stock> processor() {
    return new DBLogProcessor();
}

@Bean
public FlatFileItemReader<Stock> reader() {
    FlatFileItemReader<Stock> itemReader = new FlatFileItemReader<Stock>();
    itemReader.setLineMapper(lineMapper());
    itemReader.setLinesToSkip(1);
    itemReader.setMaxItemCount(2000);
    itemReader.setResource(new FileSystemResource("example.csv"));
    return itemReader;
}