All the steps pass, yet the job still completes with a FAILED status.
@Bean
public Job job() {
return this.jobBuilderFactory.get("person-job")
.start(initializeBatch())
.next(readBodystep())
.on("STOPPED")
.stopAndRestart(initializeBatch())
.end()
.validator(batchJobParamValidator)
.incrementer(jobParametersIncrementer)
.listener(jobListener)
.build();
}
@Bean
public Flow preProcessingFlow() {
return new FlowBuilder<Flow>("preProcessingFlow")
.start(extractFooterAndBodyStep())
.next(readFooterStep())
.build();
}
@Bean
public Step initializeBatch() {
return this.stepBuilderFactory.get("initializeBatch")
.flow(preProcessingFlow())
.build();
}
@Bean
public Step readBodystep() {
return this.stepBuilderFactory.get("readChunkStep")
.<PersonDTO, PersonBO>chunk(10)
.reader(personFileBodyReader)
.processor(itemProcessor())
.writer(dummyWriter)
.listener(new ReadFileStepListener())
.listener(personFileBodyReader)
.build();
}
Is anything wrong with the above configuration?
When I remove the stopAndRestart configuration, the job completes successfully.
For your use case, it is not stopAndRestart that you need; rather, set allowStartIfComplete on the step. With that, if the job fails, the step will be re-executed on restart even if it completed successfully in the previous run.
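A minimal sketch of what that could look like on the step from your snippet (same reader, processor and writer beans assumed):

@Bean
public Step readBodystep() {
    return this.stepBuilderFactory.get("readChunkStep")
            .<PersonDTO, PersonBO>chunk(10)
            .reader(personFileBodyReader)
            .processor(itemProcessor())
            .writer(dummyWriter)
            .allowStartIfComplete(true) // re-run this step on a restart even if it completed before
            .listener(new ReadFileStepListener())
            .listener(personFileBodyReader)
            .build();
}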
I've developed a Spring Batch job which gets its data from JDBC. The problem I'm facing is that it executes on project startup regardless of the enabled property, whose value is FALSE. I also tried to make the bean conditional on the property, but that didn't work either and the job still runs on project startup.
The following is my code snippet.
@Bean
@ConditionalOnProperty(value = "wallet-manager.djeezyConfig.enableJob", havingValue = "false")
public Job createJob() {
return jobBuilderFactory.get("DJeezy wallet cleaner job")
.incrementer(new RunIdIncrementer())
.flow(Step1())
.end()
.build();
}
@Bean
@ConditionalOnProperty(value = "wallet-manager.djeezyConfig.enableJob", havingValue = "false")
public Step Step1() {
return stepBuilderFactory.get("DJeezy wallet cleaner job - step1")
.<ResellerWallet,ResellerWallet> chunk(wConfig.getDjeezyConfig().getChunkSize())
.reader(resellerWalletItemReader)
//.processor(resellerWalletProcessor)
.writer(resellerWalletItemWriter)
.faultTolerant()
.skip(EmptyResultDataAccessException.class)
.build();
}
I've also tried commenting out the @Scheduled annotation, but it still executes the job and steps.
//@Scheduled(fixedDelay = 15000)
public void scheduleByFixedRate() throws Exception {
if(config.getDjeezyConfig().isEnableJob()) {
System.out.println("Batch job starting");
JobParameters jobParameters = new JobParametersBuilder()
.addString("time", format.format(Calendar.getInstance().getTime())).toJobParameters();
jobLauncher.run(job, jobParameters);
System.out.println("Batch job executed successfully\n");
}
}
Can someone please guide me on what I'm missing here, and how I can prevent my job and step from being executed on startup?
I hope you are using this property in your properties file; it disables Spring Boot's automatic execution of jobs at startup:
spring.batch.job.enabled=false
This should work.
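As a side note on the @ConditionalOnProperty attempt: the bean is created when the property value equals havingValue, so with havingValue = "false" and the property set to FALSE the job bean is still registered. If the intent is to create the job only when it is enabled, the condition would need havingValue = "true" (a sketch, reusing the snippet from the question):

@Bean
@ConditionalOnProperty(value = "wallet-manager.djeezyConfig.enableJob", havingValue = "true")
public Job createJob() {
    return jobBuilderFactory.get("DJeezy wallet cleaner job")
            .incrementer(new RunIdIncrementer())
            .flow(Step1())
            .end()
            .build();
}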
I'm working on a project that includes Spring Batch. Before copying the code snippets, let me summarize how the job works with a cron:
the cron calls a REST API on my project (@PostMapping("/jobs/external/{jobName}"))
in the POST method, I get the job and execute it.
each execution is supposed to run one step.
the step contains a reader (an external REST call to the Elastic API to get documents) and a processor.
Now my problem: in catalina.out I can see the REST call from the cron every 10 minutes, as configured. BUT the step doesn't seem to make that call to Elastic every 10 minutes; the batch process always works on the same set of data, fetched once when the batch is first run after the Tomcat restart.
The job REST API:
@PostMapping("/jobs/external/{jobName}")
@Timed
public ResponseEntity start(@PathVariable String jobName) throws BatchException {
log.info("LAUNCHING JOB FROM EXTERNAL : {}, timestamp : {}", jobName, Instant.now().toString());
try {
Job job = jobRegistry.getJob(jobName);
JobParametersBuilder builder = new JobParametersBuilder();
builder.addDate("date", new Date());
return Optional.of(jobLauncher.run(job, builder.toJobParameters()))
.map(BatchExecutionVM::new)
.map(exec -> ResponseEntity
.ok()
.headers(HeaderUtil.createAlert("jobManagement.started", jobName))
.body(exec))
.orElseGet(() -> ResponseEntity.badRequest().build());
} catch (NoSuchJobException aEx) {
log.warn(JOB_NOT_FOUND, aEx);
throw new BatchException();
} catch (JobInstanceAlreadyCompleteException | JobExecutionAlreadyRunningException | JobRestartException aEx) {
log.warn("Job execution error.", aEx);
throw new BatchException();
} catch (JobParametersInvalidException aEx) {
log.warn("Job parameters are invalid.", aEx);
throw new BatchException();
}
}
Job configuration:
@Bean
public Job usualJob() {
return jobBuilderFactory
.get("usualJob")
.incrementer(new SimpleJobIncrementer())
.flow(readUsualStep())
.end()
.build();
}
@Bean
public Step readUsualStep() {
// TODO: simplify, we don't need a chunk here
return stepBuilderFactory.get("readUsualStep")
.allowStartIfComplete(true)
.<AlertDocument, Void>chunk(25)
.readerIsTransactionalQueue()
.reader(rowItemReader())
.processor(rowItemProcessor())
.build();
}
@Bean
public ItemReader<AlertDocument> rowItemReader() {
return new UsualItemReader(usualService.getLastAlerts());
}
@Bean
public UsualMapRowProcessor rowItemProcessor() {
return new UsualMapRowProcessor();
}
I don't know why usualService.getLastAlerts() is called just once and not every 10 minutes.
Thanks to M. Deinum, this is basically the solution:
@Bean
@StepScope
public ItemReader<AlertDocument> rowItemReader() {
return new UsualItemReader(usualService.getLastAlerts());
}
Annotating the reader bean with @StepScope makes it re-instantiated for every step execution, so usualService.getLastAlerts() is evaluated on each run instead of once at context startup.
I have created a Spring Batch app and I'm struggling to implement a simple flow with a condition. Here's what I want to implement:
I tried to achieve this by implementing the following code:
@Bean
public Job job(JobCompletionNotificationListener listener) {
return jobs.get(Constants.JOB_SIARD_FILES_PROCESSOR + new Date().getTime())
.incrementer(new RunIdIncrementer())
.listener(listener)
.start(step1())
.next(decider()).on("yes").to(step2345Flow())
.end()
.build();
}
@Bean
public Flow step2345Flow() {
return new FlowBuilder<SimpleFlow>("yes_flow")
.start(step2())
.next(step3())
.next(step4())
.next(step5())
.build();
}
When the condition is "yes" the flow works just fine, but when the condition is "no" the flow always ends with the execution status FAILED. I want it to be COMPLETED, just like the first flow, but without executing steps 2, 3, 4 and 5.
I hope someone can help me with this.
Spring Batch does not allow alternative branches in the flow to be implicit. In other words, you need an on(...) for each case.
Assuming decider() yields a proxied bean, it should work fine with
@Bean
public Job job(JobCompletionNotificationListener listener) {
return jobs.get(Constants.JOB_SIARD_FILES_PROCESSOR + new Date().getTime())
.incrementer(new RunIdIncrementer())
.listener(listener)
.start(step1())
.next(decider()).on("yes").to(step2345Flow())
.from(decider()).on("no").end()
.end()
.build();
}
To cover really all cases, you can also use on("*") instead of on("no").
Please also have a second look at the official documentation: https://docs.spring.io/spring-batch/docs/4.3.x/reference/html/index-single.html#controllingStepFlow
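For completeness, a minimal sketch of what decider() could look like, since it is not shown in the question; the deciding condition is a placeholder, but the "yes"/"no" statuses match the on(...) transitions above:

@Bean
public JobExecutionDecider decider() {
    // org.springframework.batch.core.job.flow.JobExecutionDecider
    return (jobExecution, stepExecution) ->
            shouldRunSteps2To5() // hypothetical condition
                    ? new FlowExecutionStatus("yes")
                    : new FlowExecutionStatus("no");
}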
I have tried to find the solution, but I can't... ㅠㅠ
I want to separate the steps in a job like below:
step1.class -> step2.class -> step3.class -> done
The reason I split it like this is that I have to run a different query in each step.
@Bean
public Job bundleJob() {
return jobBuilderFactory.get(JOB_NAME)
.start(step1) // bean
.next(step2) // bean
.next(step3()) // and here is the inline code, e.g. reader, processor, writer
.build();
}
My purpose is to use the data returned from step1 and step2 in the later steps,
but the JpaItemReader seems to behave asynchronously, so it doesn't process in the order above.
The debug flow looks like this:
readerStep1 -> writerStep1 -> readerStep2 -> writerStep2 -> readerStep3 -> writerStep3
and then
-> processorStep1 -> processorStep2 -> processorStep3
That is the big problem for me...
How can I make each step in the job wait for the previous one, including its querying?
Aha! I got it.
The point is when the beans are created in the configuration.
I had annotated all of the steps' components as beans, so they were all created by Spring at startup.
The solution is late binding with @JobScope or @StepScope:
@Bean
@StepScope // lazily created bean
public ListItemReader<Dto> itemReader() {
// business logic
return new ListItemReader<>(dto);
}
To have separate steps in your job you can use a Flow with a TaskletStep. Sharing a snippet for your reference:
@Bean
public Job processJob() throws Exception {
Flow fetchData = new FlowBuilder<Flow>("fetchData")
.start(fetchDataStep()).build();
Flow transformData = new FlowBuilder<Flow>("transformData")
.start(transformDataStep()).build();
Job job = jobBuilderFactory.get("processTenantLifeCycleJob").incrementer(new RunIdIncrementer())
.start(fetchData).next(transformData).next(processData()).end()
.listener(jobCompletionListener()).build();
ReferenceJobFactory referenceJobFactory = new ReferenceJobFactory(job);
registry.register(referenceJobFactory);
return job;
}
@Bean
public TaskletStep fetchDataStep() {
return stepBuilderFactory.get("fetchData")
.tasklet(fetchDataValue()).listener(fetchDataStepListener()).build();
}
@Bean
@StepScope
public FetchDataValue fetchDataValue() {
return new FetchDataValue();
}
@Bean
public TaskletStep transformDataStep() {
return stepBuilderFactory.get("transformData")
.tasklet(transformValue()).listener(sendReportDataCompletionListener()).build();
}
@Bean
@StepScope
public TransformValue transformValue() {
return new TransformValue();
}
@Bean
public Step processData() {
return stepBuilderFactory.get("processData").<String, Data>chunk(chunkSize)
.reader(processDataReader()).processor(dataProcessor()).writer(processDataWriter())
.listener(processDataListener())
.taskExecutor(backupTaskExecutor()).build();
}
In this example I have used two Flows, fetchData and transformData, each of which runs a tasklet step backed by a class.
To return values from steps 1 and 2, you can store them in the job's execution context and retrieve them in the processData step, which has a reader, processor and writer.
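A sketch of that hand-off (the key name fetchedValue and the transform call are assumptions): write the value into the step's ExecutionContext inside the tasklet, promote it to the job's ExecutionContext with an ExecutionContextPromotionListener registered on fetchDataStep(), and late-bind it into a step-scoped bean:

// Inside FetchDataValue.execute(): stash the value in the step's context.
chunkContext.getStepContext().getStepExecution()
        .getExecutionContext().put("fetchedValue", value);

@Bean
public ExecutionContextPromotionListener promotionListener() {
    ExecutionContextPromotionListener listener = new ExecutionContextPromotionListener();
    listener.setKeys(new String[] {"fetchedValue"}); // promote this key to the job scope
    return listener;
}

@Bean
@StepScope
public ItemProcessor<String, Data> dataProcessor(
        @Value("#{jobExecutionContext['fetchedValue']}") String fetchedValue) {
    return item -> transform(item, fetchedValue); // hypothetical use of the shared value
}

The promotion listener goes on the producing step, e.g. .tasklet(fetchDataValue()).listener(promotionListener()).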
I am new to Spring Batch and I am trying something where I read about 2000 records from a CSV file every 10 seconds using a Quartz scheduler and write them into a database.
The problem is that every time it starts reading the file from the beginning, and hence writes the same set of records into the database.
I've tried dynamically changing the parameter setLinesToSkip, but to no avail, which is probably because it is fixed in my default bean definition.
Is there some way I can resume processing from the same spot, or maybe update the value in setLinesToSkip?
@Bean
public Step stepOne() {
return stepBuilderFactory
.get("stepOne")
.<Stock,Stock>chunk(5)
.reader(reader())
.processor(processor())
.writer(writer())
.build();
}
@Bean
public Job readCSVFileJob1() {
return jobBuilderFactory
.get("readCSVFileJob1")
.incrementer(new RunIdIncrementer())
.start(stepOne())
.build();
}
@Bean
public ItemProcessor<Stock, Stock> processor(){
return new DBLogProcessor();
}
@Bean
public FlatFileItemReader<Stock> reader() {
FlatFileItemReader<Stock> itemReader = new FlatFileItemReader<Stock>();
itemReader.setLineMapper(lineMapper());
itemReader.setLinesToSkip(1);
itemReader.setMaxItemCount(2000);
itemReader.setResource(new FileSystemResource("example.csv"));
return itemReader;
}
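One hedged way to update the skip count per run is the @StepScope late binding shown in the answers above: make the reader step-scoped and derive linesToSkip from a job parameter. The parameter name linesToSkip and the offset bookkeeping are assumptions, not part of the original snippet:

@Bean
@StepScope
public FlatFileItemReader<Stock> reader(
        @Value("#{jobParameters['linesToSkip']}") Long alreadyProcessed) { // assumed job parameter
    FlatFileItemReader<Stock> itemReader = new FlatFileItemReader<Stock>();
    itemReader.setLineMapper(lineMapper());
    // Skip the header line plus everything handled in earlier runs.
    itemReader.setLinesToSkip(1 + (alreadyProcessed == null ? 0 : alreadyProcessed.intValue()));
    itemReader.setMaxItemCount(2000);
    itemReader.setResource(new FileSystemResource("example.csv"));
    return itemReader;
}

The scheduler would then pass the running offset as a parameter, e.g. new JobParametersBuilder().addLong("linesToSkip", offset), increasing offset by the number of records written in each run.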