Spring Batch with Quartz configuration not working - looking for Spring metadata tables instead of Quartz tables - spring-boot

I have a requirement to develop a Spring Boot batch application using Quartz as the scheduler.
I have created the Spring Batch application and configured Quartz to trigger the job and persist job/trigger data in the Quartz RDBMS tables.
The Spring Boot application starts successfully, but when the job is triggered as per the cron configuration, it throws the error below.
2020-04-27 20:42:09.967 INFO 1460 --- [ main] c.b.g.batch.IemsBatchApplication : Started IemsBatchApplication in 12.112 seconds (JVM running for 13.22)
org.springframework.jdbc.BadSqlGrammarException: PreparedStatementCallback; bad SQL grammar [SELECT JOB_INSTANCE_ID, JOB_NAME from BATCH_JOB_INSTANCE where JOB_NAME = ? and JOB_KEY = ?]; nested exception is java.sql.SQLSyntaxErrorException: ORA-00942: table or view does not exist
I am not able to find the root cause of why the application is trying to persist data in the Spring Batch metadata tables rather than the Quartz tables.
The Quartz tables are already created in my database.
Please help me.
Please find the code below.
@Configuration
public class BatchSchedulerConfig {
@Autowired
@Qualifier("batchDataSource")
private DataSource batchDataSource;
@Autowired
JobLauncher jobLauncher;
@Autowired
JobLocator jobLocator;
@Bean
public JobDetailFactoryBean jobDetailFactoryBean() {
JobDetailFactoryBean jobDetailFactoryBean = new JobDetailFactoryBean();
jobDetailFactoryBean.setJobClass(BatchJobLauncherConfig.class);
Map<String, Object> map = new HashMap<String, Object>();
map.put("jobName", IEMSBatchConstant.MonthlyElectricityMeterReadingJob);
jobDetailFactoryBean.setJobDataAsMap(map);
jobDetailFactoryBean.setGroup("etl_group");
jobDetailFactoryBean.setName("etl_job");
return jobDetailFactoryBean;
}
@Bean
public CronTriggerFactoryBean cronTriggerFactoryBean(JobDetailFactoryBean jobDetailFactoryBean) {
CronTriggerFactoryBean cronTriggerFactoryBean = new CronTriggerFactoryBean();
cronTriggerFactoryBean.setJobDetail(jobDetailFactoryBean.getObject());
cronTriggerFactoryBean.setStartDelay(3000);
cronTriggerFactoryBean.setName("cron_trigger");
cronTriggerFactoryBean.setGroup("cron_group");
cronTriggerFactoryBean.setCronExpression("0 0/2 * 1/1 * ? *");
return cronTriggerFactoryBean;
}
@Bean
public SchedulerFactoryBean schedulerFactoryBean(CronTriggerFactoryBean cronTriggerFactoryBean) throws IOException {
SchedulerFactoryBean scheduler = new SchedulerFactoryBean();
Map<String, Object> map = new HashMap<String, Object>();
map.put("joblocator", jobLocator);
map.put("JobLauncher", jobLauncher);
scheduler.setSchedulerContextAsMap(map);
scheduler.setTriggers(cronTriggerFactoryBean.getObject());
scheduler.setDataSource(batchDataSource);
scheduler.setQuartzProperties(quartzProperties());
// scheduler.setSchedulerName(schedulerName);
return scheduler;
}
public Properties quartzProperties() throws IOException {
PropertiesFactoryBean propertiesFactoryBean = new PropertiesFactoryBean();
propertiesFactoryBean.setLocation(new ClassPathResource("/quartz.properties"));
propertiesFactoryBean.afterPropertiesSet();
return propertiesFactoryBean.getObject();
}
}
@Configuration
@EnableConfigurationProperties
public class BatchBaseConfig {
private static final Logger log = LoggerFactory.getLogger(BatchBaseConfig.class);
@Bean
public JobLauncher jobLauncher(JobRepository jobRepository) throws Exception {
SimpleJobLauncher simpleJobLauncher = new SimpleJobLauncher();
simpleJobLauncher.setJobRepository(jobRepository);
return simpleJobLauncher;
}
@Bean
public JobRepository jobRepository(DataSource batchDataSource) throws Exception {
DataSourceTransactionManager batchTransactionManager = new DataSourceTransactionManager();
JobRepositoryFactoryBean jobRepositoryFactoryBean = new JobRepositoryFactoryBean();
batchTransactionManager.setDataSource(batchDataSource);
jobRepositoryFactoryBean.setTransactionManager(batchTransactionManager);
jobRepositoryFactoryBean.setDatabaseType(DatabaseType.ORACLE.getProductName());
//jobRepositoryFactoryBean.setIsolationLevelForCreate("ISOLATION_DEFAULT");
jobRepositoryFactoryBean.setDataSource(batchDataSource);
jobRepositoryFactoryBean.setTablePrefix("QRTZ_");
jobRepositoryFactoryBean.afterPropertiesSet();
return jobRepositoryFactoryBean.getObject();
}
#Bean(name = "batchDataSource")
#Primary
#ConfigurationProperties(prefix = "quartz")
public DataSource batchDataSource() {
DataSource batchDbSrc = DataSourceBuilder.create().build();
return batchDbSrc;
}
#Bean(name = "appDataSource")
#ConfigurationProperties(prefix = "app")
public DataSource appDataSource() {
DataSource appDbSrc = DataSourceBuilder.create().build();
return appDbSrc;
}
@Bean
public JobRegistryBeanPostProcessor jobRegistryBeanPostProcessor(JobRegistry jobRegistry) {
JobRegistryBeanPostProcessor jobRegistryBeanPostProcessor = new JobRegistryBeanPostProcessor();
jobRegistryBeanPostProcessor.setJobRegistry(jobRegistry);
return jobRegistryBeanPostProcessor;
}
}
@Getter
@Setter
public class BatchJobLauncherConfig extends QuartzJobBean {
private static final Logger log = LoggerFactory.getLogger(BatchJobLauncherConfig.class);
private JobLauncher jobLauncher;
private JobLocator joblocator;
private String jobName;
@Override
protected void executeInternal(JobExecutionContext context) throws JobExecutionException {
try {
JobExecution jobExecution = jobLauncher.run(joblocator.getJob(jobName), new
JobParameters());
log.info("{}_{} was completed successfully", jobExecution.getJobConfigurationName(),
jobExecution.getId());
} catch (JobExecutionAlreadyRunningException | JobRestartException |
JobInstanceAlreadyCompleteException
| JobParametersInvalidException | NoSuchJobException e) {
e.printStackTrace();
} catch (Exception e) {
e.printStackTrace();
}
}
}
@Configuration
public class JobConfig {
private static final Logger log = LoggerFactory.getLogger(JobConfig.class);
@Autowired
private JobBuilderFactory jobBuilderFactory;
@Autowired
private StepBuilderFactory stepBuilderFactory;
@Autowired
private ReadingFromDB readingFromDB;
@Autowired
private ReadingFileWriter readingFileWriter;
@Qualifier(value = "meterReadingJob")
@Bean
public Job meterReadingJob() throws Exception {
log.info("Enter into meterReadingJob method ");
return this.jobBuilderFactory.get("meterReadingJob")
.start(createReadingFile()).build();
}
@Bean
public Step createReadingFile() throws Exception {
return this.stepBuilderFactory.get("createReadingFile").chunk(1)
.reader(readingFromDB).writer(readingFileWriter).build();
}
}
Application.properties
spring.batch.job.enabled=false
spring.main.allow-bean-definition-overriding=true
#Spring's default configuration to generate datasource
app.jdbcUrl=jdbc:oracle:thin:@//<Removed>
app.username=<Removed>
app.password=<Removed>
app.driverClassName=oracle.jdbc.OracleDriver
spring.app.jpa.database-platform=org.hibernate.dialect.Oracle10gDialect
#Quartz dataSource
quartz.jdbcUrl=jdbc:oracle:thin:@//<Removed>
quartz.username=<Removed>
quartz.password=<Removed>
quartz.driverClassName=oracle.jdbc.OracleDriver
Quartz.properties
#scheduler name will be "MyScheduler"
org.quartz.scheduler.instanceName=MyNewScheduler
org.quartz.scheduler.instanceId=AUTO
#maximum of 3 jobs can be run simultaneously
org.quartz.threadPool.class=org.quartz.simpl.SimpleThreadPool
org.quartz.threadPool.threadCount=50
#Quartz persistent jobStore config
org.quartz.jobStore.class=org.quartz.impl.jdbcjobstore.JobStoreTX
org.quartz.jobStore.driverDelegateClass=org.quartz.impl.jdbcjobstore.StdJDBCDelegate
org.quartz.jobStore.tablePrefix=QRTZ_
#org.quartz.jobStore.dataSource=batchDataSource
org.quartz.jobStore.useProperties=false
org.quartz.jobStore.isClustered=false
Both datasources are created, and the trigger and job entries are written to the datasource named batchDataSource.
The issue occurs when the job is triggered and Spring Batch tries to insert the job details into the database.
Thanks in advance.

Related

Spring Batch - How to Automatically Restart a Scheduled Job with Multi-Threaded Steps if Failed?

I'm new to Spring Batch. I have a scheduled job which needs to run every 2 hours. This job has several multi-threaded steps which should run independently of each other. The job is currently launched using a JobLauncher, as shown below.
@Component
@EnableScheduling
public class JobScheduler {
private static final Logger logger = LoggerFactory.getLogger(JobScheduler.class);
@Autowired
private JobLauncher jobLauncher;
@Autowired
private Job job;
@Scheduled(cron = "0 0 */2 * * ?")
@Retryable(maxAttempts = 3, backoff = @Backoff(delay = 60000),
include = {SQLException.class, RuntimeException.class})
public void automatedTask() {
JobParameters jobParameters = new JobParametersBuilder().addLong("time", System.currentTimeMillis()).toJobParameters();
try {
JobExecution jobExecution = jobLauncher.run(job, jobParameters);
} catch (JobInstanceAlreadyCompleteException | JobRestartException | JobParametersInvalidException |
JobExecutionAlreadyRunningException ex) {
logger.error("Error occurred when executing job scheduler", ex);
}
}
}
Mentioned below is my BatchConfig class.
@Configuration
@EnableBatchProcessing
@EnableRetry
public class BatchConfig {
@Autowired
private JobBuilderFactory jobBuilderFactory;
@Autowired
private StepBuilderFactory stepBuilderFactory;
@Autowired
private DataSource dataSource;
@Bean
@StepScope
public JdbcPagingItemReader<Model> reader1() {
StringBuffer selectClause = new StringBuffer();
selectClause.append("SELECT ");
selectClause.append("* ");
StringBuffer fromClause = new StringBuffer();
fromClause.append("FROM ");
fromClause.append("TABLENAME");
OraclePagingQueryProvider oraclePagingQueryProvider = new OraclePagingQueryProvider();
oraclePagingQueryProvider.setSelectClause(selectClause.toString());
oraclePagingQueryProvider.setFromClause(fromClause.toString());
Map<String, Order> orderByKeys = new HashMap<>();
orderByKeys.put("id", Order.ASCENDING);
oraclePagingQueryProvider.setSortKeys(orderByKeys);
JdbcPagingItemReader<Model> jdbcPagingItemReader = new JdbcPagingItemReader<>();
jdbcPagingItemReader.setSaveState(false);
jdbcPagingItemReader.setDataSource(dataSource);
jdbcPagingItemReader.setQueryProvider(oraclePagingQueryProvider);
jdbcPagingItemReader.setRowMapper(BeanPropertyRowMapper.newInstance(Model.class));
return jdbcPagingItemReader;
}
@Bean
@StepScope
public JdbcPagingItemReader<Model> reader2() {
}
@Bean
@StepScope
public JdbcPagingItemReader<Model> reader3() {
}
@Bean
@StepScope
public ItemWriter<Model> writer1() {
return new CustomItemWriter1();
}
@Bean
@StepScope
public ItemWriter<Model> writer2() {
return new CustomItemWriter2();
}
@Bean
@StepScope
public ItemWriter<Model> writer3() {
return new CustomItemWriter3();
}
@Bean
public Step step1() {
ThreadPoolTaskExecutor taskExecutor = new ThreadPoolTaskExecutor();
taskExecutor.setCorePoolSize(4);
taskExecutor.setMaxPoolSize(4);
taskExecutor.afterPropertiesSet();
return stepBuilderFactory.get("step1")
.<Model, Model>chunk(1000)
.reader(reader1())
.writer(writer1())
.faultTolerant()
.skipPolicy(new AlwaysSkipItemSkipPolicy())
.skip(Exception.class)
.listener(new CustomSkipListener())
.taskExecutor(taskExecutor)
.build();
}
@Bean
public Step step2() {
}
@Bean
public Step step3() {
}
@Bean
public Job myJob() {
return jobBuilderFactory.get("myJob").incrementer(new RunIdIncrementer())
// .listener(new CustomJobExecutionListener())
.start(step1()).on("*").to(step2())
.from(step1()).on(ExitStatus.FAILED.getExitCode()).to(step2())
.from(step2()).on("*").to(step3())
.from(step2()).on(ExitStatus.FAILED.getExitCode()).to(step3())
.end().build();
}
}
I've added a conditional flow to the job so that each subsequent step runs regardless of a failure in the previous step. Everything works fine in the initial steps, but if an exception is thrown in the last step, the exit status of the whole job becomes FAILED. To solve this, and to handle any other failures in the job, I tried to implement restart functionality. Please note that I'm not saving the state in the readers due to multi-threading and I'm not sure whether this could affect the restarting.
I have referred the accepted solution in the below question,
https://stackoverflow.com/questions/38846457/how-can-you-restart-a-failed-spring-batch-job-and-let-it-pick-up-where-it-left-o
but I don't quite understand how or where to call the jobOperator.restart method.
I've tried it as shown below, expecting the job to restart after launching if it failed, but it didn't work at all. Also, this implementation would break the @Retryable annotation because of the try-catch block catching Exception.
@Component
@EnableScheduling
public class JobScheduler {
private static final Logger logger = LoggerFactory.getLogger(JobScheduler.class);
@Autowired
private JobLauncher jobLauncher;
@Autowired
private Job job;
@Autowired
private JobRepository jobRepository;
@Autowired
private JobRegistry jobRegistry;
@Autowired
private DataSource dataSource;
@Scheduled(cron = "0 0 */2 * * ?")
@Retryable(maxAttempts = 3, backoff = @Backoff(delay = 60000),
include = {SQLException.class, RuntimeException.class})
public void automatedTask() {
JobParameters jobParameters = new JobParametersBuilder().addLong("time", System.currentTimeMillis()).toJobParameters();
try {
JobExecution jobExecution = jobLauncher.run(job, jobParameters);
JobExplorer jobExplorer = this.getJobExplorer(dataSource);
JobOperator jobOperator = this.getJobOperator(jobLauncher, jobRepository, jobRegistry, jobExplorer);
List<JobInstance> jobInstances = jobExplorer.getJobInstances("myJob",0,1);
if(!jobInstances.isEmpty()){
JobInstance jobInstance = jobInstances.get(0);
List<JobExecution> jobExecutions = jobExplorer.getJobExecutions(jobInstance);
if(!jobExecutions.isEmpty()){
for(JobExecution execution: jobExecutions){
if(execution.getStatus().equals(BatchStatus.FAILED)){
jobOperator.restart(execution.getId());
}
}
}
}
} catch (Exception ex) {
logger.error("Error occurred when executing job scheduler", ex);
}
}
@Bean
public JobOperator getJobOperator(final JobLauncher jobLauncher, final JobRepository jobRepository,
final JobRegistry jobRegistry, final JobExplorer jobExplorer) {
final SimpleJobOperator jobOperator = new SimpleJobOperator();
jobOperator.setJobLauncher(jobLauncher);
jobOperator.setJobRepository(jobRepository);
jobOperator.setJobRegistry(jobRegistry);
jobOperator.setJobExplorer(jobExplorer);
return jobOperator;
}
@Bean
public JobExplorer getJobExplorer(final DataSource dataSource) throws Exception {
final JobExplorerFactoryBean bean = new JobExplorerFactoryBean();
bean.setDataSource(dataSource);
bean.setTablePrefix("BATCH_");
bean.setJdbcOperations(new JdbcTemplate(dataSource));
bean.afterPropertiesSet();
return bean.getObject();
}
}
I then tried adding a custom JobExecutionListener like the one below, expecting it to restart the job after it runs, if it failed. But it just fails because all the @Autowired beans are null.
public class CustomJobExecutionListener {
private static final Logger logger = LoggerFactory.getLogger(CustomJobExecutionListener.class);
@Autowired
private JobLauncher jobLauncher;
@Autowired
private JobRepository jobRepository;
@Autowired
private JobRegistry jobRegistry;
@Autowired
private DataSource dataSource;
@BeforeJob
public void beforeJob(JobExecution jobExecution) {
}
@AfterJob
public void afterJob(JobExecution jobExecution) {
try {
JobExplorer jobExplorer = this.getJobExplorer(dataSource);
JobOperator jobOperator = this.getJobOperator(jobLauncher, jobRepository, jobRegistry, jobExplorer);
if(jobExecution.getStatus().equals(BatchStatus.FAILED)){
jobOperator.restart(jobExecution.getId());
}
} catch (Exception ex) {
logger.error("Unknown error occurred when executing after job execution listener", ex);
}
}
@Bean
public JobRegistryBeanPostProcessor jobRegistryBeanPostProcessor(JobRegistry jobRegistry) {
final JobRegistryBeanPostProcessor jobRegistryBeanPostProcessor = new JobRegistryBeanPostProcessor();
jobRegistryBeanPostProcessor.setJobRegistry(jobRegistry);
return jobRegistryBeanPostProcessor;
}
@Bean
public JobOperator getJobOperator(final JobLauncher jobLauncher, final JobRepository jobRepository,
final JobRegistry jobRegistry, final JobExplorer jobExplorer) {
final SimpleJobOperator jobOperator = new SimpleJobOperator();
jobOperator.setJobLauncher(jobLauncher);
jobOperator.setJobRepository(jobRepository);
jobOperator.setJobRegistry(jobRegistry);
jobOperator.setJobExplorer(jobExplorer);
return jobOperator;
}
@Bean
public JobExplorer getJobExplorer(final DataSource dataSource) throws Exception {
final JobExplorerFactoryBean bean = new JobExplorerFactoryBean();
bean.setDataSource(dataSource);
bean.setTablePrefix("BATCH_");
bean.setJdbcOperations(new JdbcTemplate(dataSource));
bean.afterPropertiesSet();
return bean.getObject();
}
}
What am I doing wrong? How should the restart functionality be implemented for this job?
Appreciate your kind help!
Please note that I'm not saving the state in the readers due to multi-threading and I'm not sure whether this could affect the restarting.
It certainly affects restartability. Multi-threading in steps is incompatible with restartability. From the javadoc of the JdbcPagingItemReader that you are using, you can read the following:
The implementation is thread-safe in between calls to open(ExecutionContext),
but remember to use saveState=false if used in a multi-threaded client
(no restart available).
Without restart data, Spring Batch cannot restart the step from where it left off. This is a trade-off that you have accepted by using a multi-threaded step.
but I don't quite understand how or where to call the jobOperator.restart method at.
Now with regard to restarting the failed job, a few notes:
Trying to restart a job in a JobExecutionListener is incorrect. This listener is called in the scope of the current job execution, while a restart will have its own, distinct job execution.
JobOperator#restart should not be called inside the scheduled method, otherwise it will be called for every scheduled run. You can find an example here: https://stackoverflow.com/a/55137314/5019386
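To make that second point concrete, here is a minimal sketch (not taken from the linked answer) of keeping the restart logic in its own scheduled method, separate from the launch schedule, using the JobExplorer exposed by @EnableBatchProcessing and the JobOperator bean you already define. The job name "myJob", the cron expression, and the class name are assumptions based on the question.
import org.springframework.batch.core.BatchStatus;
import org.springframework.batch.core.JobExecution;
import org.springframework.batch.core.JobInstance;
import org.springframework.batch.core.explore.JobExplorer;
import org.springframework.batch.core.launch.JobOperator;
import org.springframework.scheduling.annotation.Scheduled;
import org.springframework.stereotype.Component;

@Component
public class FailedJobRecoverer {

    private final JobExplorer jobExplorer;
    private final JobOperator jobOperator;

    public FailedJobRecoverer(JobExplorer jobExplorer, JobOperator jobOperator) {
        this.jobExplorer = jobExplorer;
        this.jobOperator = jobOperator;
    }

    // Runs on its own schedule, offset from the launch schedule.
    @Scheduled(cron = "0 30 */2 * * ?")
    public void restartFailedExecutions() throws Exception {
        // Look at the most recent instances of the job and restart the ones that failed.
        for (JobInstance instance : jobExplorer.getJobInstances("myJob", 0, 10)) {
            for (JobExecution execution : jobExplorer.getJobExecutions(instance)) {
                if (execution.getStatus() == BatchStatus.FAILED) {
                    // Creates a new execution for the same job instance and parameters.
                    jobOperator.restart(execution.getId());
                }
            }
        }
    }
}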

How to configure Spring Cloud Data Flow for Spring Batch

I have a Spring Batch project that I want to configure on Spring Cloud Data Flow. I am able to register it on SCDF, but when I launch the task my job is not running.
Following is my configuration file:
@SpringBootApplication
@EnableBatchProcessing
@EnableTask
public class BatchApplication {
/*@Autowired
BatchCommandLineRunner batchcommdrunner;
@Bean
public CommandLineRunner commandLineRunner() {
System.out.println("Executed at :" + new SimpleDateFormat().format(new Date()));
return batchcommdrunner ;
}*/
public static void main(String[] args) {
SpringApplication.run(BatchApplication.class, args);
}
}
And this is my batch configuration file:
@Configuration
public class BatchConfiguaration {
@Autowired
private DataSource datasouce;
@Autowired
private JobBuilderFactory jobBuilderFactory;
@Autowired
private StepBuilderFactory stepBuilderFactory;
@Autowired
public Environment env;
@Bean(name = "reader")
@StepScope
public ItemReader<Schedules> reader(@Value("#{stepExecutionContext[scheduleRecs]}") List<Schedules> scherecs) {
ItemReader<Schedules> reader = new IteratorItemReader<Schedules>(scherecs);
return reader;
}
#Bean(name = "CWSreader")
#StepScope
public ItemReader<Contents> CWSreader(#Value("#{stepExecutionContext[scheduleRecs]}") List<Contents> scherecs) {
ItemReader<Contents> reader = new IteratorItemReader<Contents>(scherecs);
return reader;
}
#SuppressWarnings("rawtypes")
#Bean
#StepScope
public BatchProcessor processor() {
return new BatchProcessor();
}
#Bean(name = "batchSchedulePreparedStatement")
#StepScope
public BatchSchedulePreparedStatement batchSchedulePreparedStatement() {
return new BatchSchedulePreparedStatement();
}
#SuppressWarnings({ "rawtypes", "unchecked" })
#Bean(name = "batchWriter")
#StepScope
public BatchWriter batchWriter() {
BatchWriter batchWriter = new BatchWriter();
batchWriter.setDataSource(datasouce);
batchWriter.setSql(env.getProperty("batch.insert.schedule.query"));
batchWriter.setItemPreparedStatementSetter(batchSchedulePreparedStatement());
return batchWriter;
}
#Bean("acheronDbTm")
#Qualifier("acheronDbTm")
public PlatformTransactionManager platformTransactionManager() {
return new ResourcelessTransactionManager();
}
#Bean
public JobExplorer jobExplorer() throws Exception {
MapJobExplorerFactoryBean explorerFactoryBean = new MapJobExplorerFactoryBean();
explorerFactoryBean.setRepositoryFactory(mapJobRepositoryFactoryBean());
explorerFactoryBean.afterPropertiesSet();
return explorerFactoryBean.getObject();
}
@Bean
public MapJobRepositoryFactoryBean mapJobRepositoryFactoryBean() {
MapJobRepositoryFactoryBean mapJobRepositoryFactoryBean = new MapJobRepositoryFactoryBean();
mapJobRepositoryFactoryBean.setTransactionManager(platformTransactionManager());
return mapJobRepositoryFactoryBean;
}
@Bean
public JobRepository jobRepository() throws Exception {
return mapJobRepositoryFactoryBean().getObject();
}
@Bean
public SimpleJobLauncher jobLauncher() throws Exception {
SimpleJobLauncher jobLauncher = new SimpleJobLauncher();
jobLauncher.setJobRepository(jobRepository());
return jobLauncher;
}
#Bean(name = "batchPartition")
#StepScope
public BatchPartition batchPartition() {
BatchPartition batchPartition = new BatchPartition();
return batchPartition;
}
#Bean(name="taskExecutor")
public TaskExecutor taskExecutor() {
ThreadPoolTaskExecutor poolTaskExecutor = new ThreadPoolTaskExecutor();
poolTaskExecutor.setCorePoolSize(10);
poolTaskExecutor.setMaxPoolSize(30);
poolTaskExecutor.setQueueCapacity(35);
poolTaskExecutor.setThreadNamePrefix("Acheron");
poolTaskExecutor.afterPropertiesSet();
return poolTaskExecutor;
}
#Bean(name = "masterStep")
public Step masterStep() {
return stepBuilderFactory.get("masterStep").partitioner(slave()).partitioner("slave", batchPartition())
.taskExecutor(taskExecutor()).build();
}
#Bean(name = "slave")
public Step slave() {
return stepBuilderFactory.get("slave").chunk(100).faultTolerant().retryLimit(2)
.retry(DeadlockLoserDataAccessException.class).reader(reader(null)).processor(processor())
.writer(batchWriter()).build();
}
#Bean(name = "manageStagingScheduleMaster")
public Job manageStagingScheduleMaster(final Step masterStep) throws Exception {
return jobBuilderFactory.get("manageStagingScheduleMaster").preventRestart().incrementer(new RunIdIncrementer())
.start(masterStep).build();
}
Can anyone help me configure it properly, or is there any other way I can monitor my batch jobs?
I also tried Spring Boot Admin, but it does not support Java configuration; is there any way in SBA to add jobs without defining them in XML?
I am launching this job from a controller:
JobParametersBuilder builder = new JobParametersBuilder();
System.out.println("Job Builder " + builder);
JobParameters jobParameters = builder.toJobParameters();
JobExecution execution = jobLauncher.run(job, jobParameters);
return execution.getStatus().toString();
This sample shows a basic Spring batch application that can be launched as a task in Spring Cloud Data Flow.
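As a rough illustration of that shape, here is a minimal sketch of a batch job wrapped as a Spring Cloud Task. It assumes the spring-cloud-starter-task and spring-boot-starter-batch dependencies are on the classpath; the class, job, and step names are placeholders rather than names from the sample.
import org.springframework.batch.core.Job;
import org.springframework.batch.core.Step;
import org.springframework.batch.core.configuration.annotation.EnableBatchProcessing;
import org.springframework.batch.core.configuration.annotation.JobBuilderFactory;
import org.springframework.batch.core.configuration.annotation.StepBuilderFactory;
import org.springframework.batch.core.launch.support.RunIdIncrementer;
import org.springframework.batch.repeat.RepeatStatus;
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.cloud.task.configuration.EnableTask;
import org.springframework.context.annotation.Bean;

@SpringBootApplication
@EnableTask
@EnableBatchProcessing
public class TaskBatchApplication {

    // A single-step job so that SCDF has something concrete to launch and track.
    @Bean
    public Job sampleJob(JobBuilderFactory jobs, StepBuilderFactory steps) {
        Step step = steps.get("sampleStep")
                .tasklet((contribution, chunkContext) -> {
                    System.out.println("Running inside the launched task");
                    return RepeatStatus.FINISHED;
                })
                .build();
        return jobs.get("sampleJob")
                .incrementer(new RunIdIncrementer())
                .start(step)
                .build();
    }

    public static void main(String[] args) {
        SpringApplication.run(TaskBatchApplication.class, args);
    }
}
Note that for SCDF to show job executions, the application should write its batch metadata to a persistent JobRepository rather than the map-based one configured in the question.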

How can I restrict duplicate job creation using Spring Boot and Spring Batch?

I created a Spring Boot application with Spring Batch and scheduling. When I create only one job, things work fine. But when I try to create another job using the modular approach, the jobs and their steps run many times and get duplicated.
I am getting the error below.
2017-08-24 16:05:00.581 INFO 16172 --- [cTaskExecutor-2] o.s.b.c.l.support.SimpleJobLauncher : Job: [FlowJob: [name=importDeptJob]] completed with the following parameters: [{JobID1=1503579900035}] and the following status: [FAILED]
2017-08-24 16:05:00.581 ERROR 16172 --- [cTaskExecutor-2] o.s.batch.core.step.tasklet.TaskletStep : JobRepository failure forcing rollback
org.springframework.dao.OptimisticLockingFailureException: Attempt to update step execution id=1 with wrong version (3), where current version is 1
Can anyone please guide me on how to resolve these issues and run the jobs in parallel, independently of each other?
Below are the configuration classes: ModularJobConfiguration.java, DeptBatchConfiguration.java, CityBatchConfiguration.java and BatchScheduler.java.
@Configuration
@EnableBatchProcessing(modular=true)
public class ModularJobConfiguration {
@Bean
public ApplicationContextFactory firstJob() {
return new GenericApplicationContextFactory(DeptBatchConfiguration.class);
}
@Bean
public ApplicationContextFactory secondJob() {
return new GenericApplicationContextFactory(CityBatchConfiguration.class);
}
}
@Configuration
@EnableBatchProcessing
@Import({BatchScheduler.class})
public class DeptBatchConfiguration {
private static final Logger LOGGER = LoggerFactory.getLogger(DeptBatchConfiguration.class);
@Autowired
private SimpleJobLauncher jobLauncher;
@Autowired
public JobBuilderFactory jobBuilderFactory;
@Autowired
public StepBuilderFactory stepBuilderFactory;
@Autowired
public JobExecutionListener listener;
public ItemReader<DepartmentModelReader> deptReaderSO;
@Autowired
@Qualifier("dataSourceReader")
private DataSource dataSourceReader;
@Autowired
@Qualifier("dataSourceWriter")
private DataSource dataSourceWriter;
@Scheduled(cron = "0 0/1 * * * ?")
public void performFirstJob() throws Exception {
long startTime = System.currentTimeMillis();
LOGGER.info("Job1 Started at :" + new Date());
JobParameters param = new JobParametersBuilder().addString("JobID1",String.valueOf(System.currentTimeMillis())).toJobParameters();
JobExecution execution = (JobExecution) jobLauncher.run(importDeptJob(jobBuilderFactory,stepdept(deptReaderSO,customWriter()),listener), param);
long endTime = System.currentTimeMillis();
LOGGER.info("Job1 finished at " + (endTime - startTime) / 1000 + " seconds with status :" + execution.getExitStatus());
}
@Bean
public ItemReader<DepartmentModelReader> deptReaderSO() {
//LOGGER.info("Inside deptReaderSO Method");
JdbcCursorItemReader<DepartmentModelReader> deptReaderSO = new JdbcCursorItemReader<>();
//deptReaderSO.setSql("select id, firstName, lastname, random_num from reader");
deptReaderSO.setSql("SELECT DEPT_CODE,DEPT_NAME,FULL_DEPT_NAME,CITY_CODE,CITY_NAME,CITY_TYPE_NAME,CREATED_USER_ID,CREATED_G_DATE,MODIFIED_USER_ID,MODIFIED_G_DATE,RECORD_ACTIVITY,DEPT_CLASS,DEPT_PARENT,DEPT_PARENT_NAME FROM TBL_SAMPLE_SAFTY_DEPTS");
deptReaderSO.setDataSource(dataSourceReader);
deptReaderSO.setRowMapper(
(ResultSet resultSet, int rowNum) -> {
if (!(resultSet.isAfterLast()) && !(resultSet.isBeforeFirst())) {
DepartmentModelReader recordSO = new DepartmentModelReader();
recordSO.setDeptCode(resultSet.getString("DEPT_CODE"));
recordSO.setDeptName(resultSet.getString("DEPT_NAME"));
recordSO.setFullDeptName(resultSet.getString("FULL_DEPT_NAME"));
recordSO.setCityCode(resultSet.getInt("CITY_CODE"));
recordSO.setCityName(resultSet.getString("CITY_NAME"));
recordSO.setCityTypeName(resultSet.getString("CITY_TYPE_NAME"));
recordSO.setCreatedUserId(resultSet.getInt("CREATED_USER_ID"));
recordSO.setCreatedGDate(resultSet.getDate("CREATED_G_DATE"));
recordSO.setModifiedUserId(resultSet.getString("MODIFIED_USER_ID"));
recordSO.setModifiedGDate(resultSet.getDate("MODIFIED_G_DATE"));
recordSO.setRecordActivity(resultSet.getInt("RECORD_ACTIVITY"));
recordSO.setDeptClass(resultSet.getInt("DEPT_CLASS"));
recordSO.setDeptParent(resultSet.getString("DEPT_PARENT"));
recordSO.setDeptParentName(resultSet.getString("DEPT_PARENT_NAME"));
// LOGGER.info("RowMapper record : {}", recordSO.getDeptCode() +" | "+recordSO.getDeptName());
return recordSO;
} else {
LOGGER.info("Returning null from rowMapper");
return null;
}
});
return deptReaderSO;
}
@Bean
public ItemProcessor<DepartmentModelReader, DepartmentModelWriter> processor() {
//LOGGER.info("Inside Processor Method");
return new RecordProcessor();
}
@Bean
public ItemWriter<DepartmentModelWriter> customWriter(){
//LOGGER.info("Inside customWriter Method");
return new CustomItemWriter();
}
@Bean
public Job importDeptJob(JobBuilderFactory jobs, Step stepdept,JobExecutionListener listener){
return jobs.get("importDeptJob")
.incrementer(new RunIdIncrementer())
.listener(listener())
.flow(stepdept).end().build();
}
@Bean
public Step stepdept(ItemReader<DepartmentModelReader> deptReaderSO,
ItemWriter<DepartmentModelWriter> writerSO) {
LOGGER.info("Inside stepdept Method");
return stepBuilderFactory.get("stepdept").<DepartmentModelReader, DepartmentModelWriter>chunk(5)
.reader(deptReaderSO).processor(processor()).writer(customWriter()).transactionManager(platformTransactionManager(dataSourceWriter)).build();
}
@Bean
public JobExecutionListener listener() {
return new JobCompletionNotificationListener();
}
@Bean
public JdbcTemplate jdbcTemplate(DataSource dataSource) {
return new JdbcTemplate(dataSource);
}
@Bean
public BatchWriteService batchWriteService() {
return new BatchWriteService();
}
@Bean
public PlatformTransactionManager platformTransactionManager(@Qualifier("dataSourceWriter") DataSource dataSourceWriter) {
JpaTransactionManager transactionManager = new JpaTransactionManager();
transactionManager.setDataSource(dataSourceWriter);
return transactionManager;
}
}
@Configuration
@EnableBatchProcessing
@Import({BatchScheduler.class})
public class CityBatchConfiguration {
private static final Logger LOGGER = LoggerFactory.getLogger(CityBatchConfiguration.class);
@Autowired
private SimpleJobLauncher jobLauncher;
@Autowired
public JobBuilderFactory jobBuilderFactory;
@Autowired
public StepBuilderFactory stepBuilderFactory;
@Autowired
public JobExecutionListener listener;
public ItemReader<CitiesModelReader> citiesReaderSO;
@Autowired
@Qualifier("dataSourceReader")
private DataSource dataSourceReader;
@Autowired
@Qualifier("dataSourceWriter")
private DataSource dataSourceWriter;
@Scheduled(cron = "0 0/1 * * * ?")
public void performSecondJob() throws Exception {
long startTime = System.currentTimeMillis();
LOGGER.info("\n Job2 Started at :" + new Date());
JobParameters param = new JobParametersBuilder().addString("JobID2",String.valueOf(System.currentTimeMillis())).toJobParameters();
JobExecution execution = (JobExecution) jobLauncher.run(importCitiesJob(jobBuilderFactory,stepcity(citiesReaderSO,customCitiesWriter()),listener), param);
long endTime = System.currentTimeMillis();
LOGGER.info("Job2 finished at " + (endTime - startTime) / 1000 + " seconds with status :" + execution.getExitStatus());
}
@Bean
public ItemReader<CitiesModelReader> citiesReaderSO() {
//LOGGER.info("Inside readerSO Method");
JdbcCursorItemReader<CitiesModelReader> readerSO = new JdbcCursorItemReader<>();
readerSO.setSql("SELECT CITY_CODE,CITY_NAME,PARENT_CITY,CITY_TYPE,CITY_TYPE_NAME,CREATED_G_DATE,CREATED_USER_ID,MODIFIED_G_DATE,MODIFIED_USER_ID,RECORD_ACTIVITY FROM TBL_SAMPLE_SAFTY_CITIES");
readerSO.setDataSource(dataSourceReader);
readerSO.setRowMapper(
(ResultSet resultSet, int rowNum) -> {
if (!(resultSet.isAfterLast()) && !(resultSet.isBeforeFirst())) {
CitiesModelReader recordSO = new CitiesModelReader();
recordSO.setCityCode(resultSet.getLong("CITY_CODE"));
recordSO.setCityName(resultSet.getString("CITY_NAME"));
recordSO.setParentCity(resultSet.getInt("PARENT_CITY"));
recordSO.setCityType(resultSet.getString("CITY_TYPE"));
recordSO.setCityTypeName(resultSet.getString("CITY_TYPE_NAME"));
recordSO.setCreatedGDate(resultSet.getDate("CREATED_G_DATE"));
recordSO.setCreatedUserId(resultSet.getString("CREATED_USER_ID"));
recordSO.setModifiedGDate(resultSet.getDate("MODIFIED_G_DATE"));
recordSO.setModifiedUserId(resultSet.getString("MODIFIED_USER_ID"));
recordSO.setRecordActivity(resultSet.getInt("RECORD_ACTIVITY"));
//LOGGER.info("RowMapper record : {}", recordSO.toString());
return recordSO;
} else {
LOGGER.info("Returning null from rowMapper");
return null;
}
});
return readerSO;
}
@Bean
public ItemProcessor<CitiesModelReader,CitiesModelWriter> citiesProcessor() {
//LOGGER.info("Inside Processor Method");
return new RecordCitiesProcessor();
}
@Bean
public ItemWriter<CitiesModelWriter> customCitiesWriter(){
LOGGER.info("Inside customCitiesWriter Method");
return new CustomCitiesWriter();
}
@Bean
public Job importCitiesJob(JobBuilderFactory jobs, Step stepcity,JobExecutionListener listener) {
LOGGER.info("Inside importCitiesJob Method");
return jobs.get("importCitiesJob")
.incrementer(new RunIdIncrementer())
.listener(listener())
.flow(stepcity).end().build();
}
@Bean
public Step stepcity(ItemReader<CitiesModelReader> readerSO,
ItemWriter<CitiesModelWriter> writerSO) {
LOGGER.info("Inside stepCity Method");
return stepBuilderFactory.get("stepcity").<CitiesModelReader, CitiesModelWriter>chunk(5)
.reader(readerSO).processor(citiesProcessor()).writer(customCitiesWriter()).transactionManager(platformTransactionManager(dataSourceWriter)).build();
}
@Bean
public JobExecutionListener listener() {
return new JobCompletionNotificationListener();
}
@Bean
public JdbcTemplate jdbcTemplate(DataSource dataSource) {
return new JdbcTemplate(dataSource);
}
@Bean
public BatchWriteService batchWriteService() {
return new BatchWriteService();
}
@Bean
public PlatformTransactionManager platformTransactionManager(@Qualifier("dataSourceWriter") DataSource dataSourceWriter) {
JpaTransactionManager transactionManager = new JpaTransactionManager();
transactionManager.setDataSource(dataSourceWriter);
return transactionManager;
}
}
@Configuration
@EnableScheduling
public class BatchScheduler {
private static final Logger LOGGER = LoggerFactory.getLogger(BatchScheduler.class);
@Bean
public ResourcelessTransactionManager resourcelessTransactionManager() {
return new ResourcelessTransactionManager();
}
@Bean
public MapJobRepositoryFactoryBean mapJobRepositoryFactory(
ResourcelessTransactionManager txManager) throws Exception {
LOGGER.info("Inside mapJobRepositoryFactory method");
MapJobRepositoryFactoryBean factory = new
MapJobRepositoryFactoryBean(txManager);
factory.afterPropertiesSet();
return factory;
}
@Bean
public JobRepository jobRepository(
MapJobRepositoryFactoryBean factory) throws Exception {
LOGGER.info("Inside jobRepository method");
return factory.getObject();
}
@Bean
public SimpleJobLauncher jobLauncher(JobRepository jobRepository) {
LOGGER.info("Inside jobLauncher method");
SimpleJobLauncher launcher = new SimpleJobLauncher();
launcher.setJobRepository(jobRepository);
final SimpleAsyncTaskExecutor simpleAsyncTaskExecutor = new SimpleAsyncTaskExecutor();
launcher.setTaskExecutor(simpleAsyncTaskExecutor);
return launcher;
}
}
The map-based SimpleJobRepository created from MapJobRepositoryFactoryBean is not thread-safe.
From the Javadocs:
A FactoryBean that automates the creation of a SimpleJobRepository using non-persistent in-memory DAO implementations. This repository is only really intended for use in testing and rapid prototyping. In such settings you might find that ResourcelessTransactionManager is useful (as long as your business logic does not use a relational database). Not suited for use in multi-threaded jobs with splits, although it should be safe to use in a multi-threaded step.
You can create a JDBC-based SimpleJobRepository from JobRepositoryFactoryBean, which may utilize an in-memory H2 database if you don't require that Batch metadata be persisted.
Since you are using Spring Boot, to use an H2-backed JobRepository simply remove your JobRepository bean and add the following dependency to your pom.xml file:
<dependency>
<groupId>com.h2database</groupId>
<artifactId>h2</artifactId>
<scope>runtime</scope>
</dependency>
Spring Boot will automatically configure a DataSource as though you had configured the following in your application.properties file and automatically use that DataSource in the creation of a JobRepository.
spring.datasource.url=jdbc:h2:mem:testdb
spring.datasource.driverClassName=org.h2.Driver
spring.datasource.username=sa
spring.datasource.password=
Alternatively, to use some other JDBC-backed JobRepository add the JDBC dependencies for your RDBMS of choice to your project, and configure a DataSource for it (either in code as a DataSource bean, or in application.properties using the spring.datasource prefix as shown above). Spring Boot will automatically use this DataSource during the creation of the JobRepository bean.
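For the "DataSource bean in code" option mentioned above, a minimal sketch might look like the following; the MariaDB driver, URL, and credentials are placeholders rather than values from the question, and in Spring Boot 1.x the DataSourceBuilder import is org.springframework.boot.autoconfigure.jdbc.DataSourceBuilder instead of the 2.x package shown here.
import javax.sql.DataSource;
import org.springframework.boot.jdbc.DataSourceBuilder;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.context.annotation.Primary;

@Configuration
public class BatchMetadataDataSourceConfig {

    // Primary DataSource that Spring Boot will also use when it builds the JobRepository.
    @Bean
    @Primary
    public DataSource dataSource() {
        return DataSourceBuilder.create()
                .driverClassName("org.mariadb.jdbc.Driver")
                .url("jdbc:mariadb://localhost:3306/batch_meta")
                .username("batch")
                .password("secret")
                .build();
    }
}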

Batch with Spring Boot & JPA - use in-memory datasource for batch-related tables

Context
I'm trying to develop a batch service with Spring Boot, using a JPA repository. Using two different datasources, I want the batch-related tables created in an in-memory database, so that they do not pollute my business database.
Following multiple topics on the web, I came up with this configuration of my two datasources :
@Configuration
public class DataSourceConfiguration {
@Bean(name = "mainDataSource")
@Primary
@ConfigurationProperties(prefix="spring.datasource")
public DataSource mainDataSource(){
return DataSourceBuilder.create().build();
}
@Bean(name = "batchDataSource")
public DataSource batchDataSource( @Value("${batch.datasource.url}") String url ){
return DataSourceBuilder.create().url( url ).build();
}
}
The first one, mainDataSource, uses the default Spring database configuration. The batchDataSource defines an embedded HSQL database, in which I want the batch and step tables to be created.
# DATASOURCE (DataSourceAutoConfiguration & DataSourceProperties)
spring.datasource.url=jdbc:mariadb://localhost:3306/batch_poc
spring.datasource.username=root
spring.datasource.password=
spring.datasource.driver-class-name=org.mariadb.jdbc.Driver
spring.datasource.max-age=10000
spring.datasource.initialize=false
# JPA (JpaBaseConfiguration, HibernateJpaAutoConfiguration)
spring.jpa.generate-ddl=false
spring.jpa.show-sql=true
spring.jpa.database=MYSQL
# SPRING BATCH (BatchDatabaseInitializer)
spring.batch.initializer.enabled=false
# ----------------------------------------
# PROJECT SPECIFIC PROPERTIES
# ----------------------------------------
# BATCH DATASOURCE
batch.datasource.url=jdbc:hsqldb:file:C:/tmp/hsqldb/batchdb
Here is my batch config :
@Configuration
@EnableBatchProcessing
public class BatchConfiguration {
private static final Logger LOG = Logger.getLogger( BatchConfiguration.class );
@Bean
public BatchConfigurer configurer(){
return new CustomBatchConfigurer();
}
@Bean
public Job importElementsJob( JobBuilderFactory jobs, Step step1 ){
return jobs.get("importElementsJob")
.incrementer( new RunIdIncrementer() )
.flow( step1 )
.end()
.build();
}
@Bean
public Step step1( StepBuilderFactory stepBuilderFactory, ItemReader<InputElement> reader,
ItemWriter<List<Entity>> writer, ItemProcessor<InputElement, List<Entity>> processor ){
return stepBuilderFactory.get("step1")
.<InputElement, List<Entity>> chunk(100)
.reader( reader )
.processor( processor )
.writer( writer )
.build();
}
@Bean
public ItemReader<InputElement> reader() throws IOException {
return new CustomItemReader();
}
@Bean
public ItemProcessor<InputElement, List<Entity>> processor(){
return new CutsomItemProcessor();
}
@Bean
public ItemWriter<List<Entity>> writer(){
return new CustomItemWriter();
}
}
The BatchConfigurer, using the in-memory database :
public class CustomBatchConfigurer extends DefaultBatchConfigurer {
@Override
@Autowired
public void setDataSource( @Qualifier("batchDataSource") DataSource dataSource) {
super.setDataSource(dataSource);
}
}
And, finally, my writer :
public class CustomItemWriter implements ItemWriter<List<Entity>> {
private static final Logger LOG = Logger.getLogger( EntityWriter.class );
@Autowired
private EntityRepository entityRepository;
@Override
public void write(List<? extends List<Entity>> items)
throws Exception {
if( items != null && !items.isEmpty() ){
for( List<Entity> entities : items ){
for( Entity entity : entities ){
Entity fromDb = entityRepository.findById( entity.getId() );
// Insert
if( fromDb == null ){
entityRepository.save( entity );
}
// Update
else {
// TODO : entityManager.merge()
}
}
}
}
}
}
The EntityRepository interface extends JpaRepository.
Problem
When I separate the datasources this way, nothing happens when I call the save method of the repository. I see the select queries from the call of findById() in the logs. But nothing for the save. And my output database is empty at the end.
When I go back to a single datasource configuration (removing the configurer bean and letting Spring Boot manage the datasource alone), the insert queries work fine.
Maybe the main datasource configuration is not good enough for JPA to perform the inserts correctly. But what is missing ?
I finally solved the problem by implementing my own BatchConfigurer, based on the Spring class BasicBatchConfigurer, and forcing the use of a map-based JobRepository and JobExplorer. No more custom datasource configuration, only one datasource that I let Spring Boot manage: it's easier that way.
My custom BatchConfigurer :
public class CustomBatchConfigurer implements BatchConfigurer {
private static final Logger LOG = Logger.getLogger( CustomBatchConfigurer.class );
private final EntityManagerFactory entityManagerFactory;
private PlatformTransactionManager transactionManager;
private JobRepository jobRepository;
private JobLauncher jobLauncher;
private JobExplorer jobExplorer;
/**
* Create a new {@link CustomBatchConfigurer} instance.
* @param entityManagerFactory the entity manager factory
*/
public CustomBatchConfigurer( EntityManagerFactory entityManagerFactory ) {
this.entityManagerFactory = entityManagerFactory;
}
@Override
public JobRepository getJobRepository() {
return this.jobRepository;
}
@Override
public PlatformTransactionManager getTransactionManager() {
return this.transactionManager;
}
@Override
public JobLauncher getJobLauncher() {
return this.jobLauncher;
}
@Override
public JobExplorer getJobExplorer() throws Exception {
return this.jobExplorer;
}
@PostConstruct
public void initialize() {
try {
// transactionManager:
LOG.info("Forcing the use of a JPA transactionManager");
if( this.entityManagerFactory == null ){
throw new Exception("Unable to initialize batch configurer : entityManagerFactory must not be null");
}
this.transactionManager = new JpaTransactionManager( this.entityManagerFactory );
// jobRepository:
LOG.info("Forcing the use of a Map based JobRepository");
MapJobRepositoryFactoryBean jobRepositoryFactory = new MapJobRepositoryFactoryBean( this.transactionManager );
jobRepositoryFactory.afterPropertiesSet();
this.jobRepository = jobRepositoryFactory.getObject();
// jobLauncher:
SimpleJobLauncher jobLauncher = new SimpleJobLauncher();
jobLauncher.setJobRepository(getJobRepository());
jobLauncher.afterPropertiesSet();
this.jobLauncher = jobLauncher;
// jobExplorer:
MapJobExplorerFactoryBean jobExplorerFactory = new MapJobExplorerFactoryBean(jobRepositoryFactory);
jobExplorerFactory.afterPropertiesSet();
this.jobExplorer = jobExplorerFactory.getObject();
}
catch (Exception ex) {
throw new IllegalStateException("Unable to initialize Spring Batch", ex);
}
}
}
My configuration class looks like this now :
@Configuration
@EnableBatchProcessing
public class BatchConfiguration {
@Bean
public BatchConfigurer configurer( EntityManagerFactory entityManagerFactory ){
return new CustomBatchConfigurer( entityManagerFactory );
}
[...]
}
And my properties files :
# DATASOURCE (DataSourceAutoConfiguration & DataSourceProperties)
spring.datasource.url=jdbc:mariadb://localhost:3306/inotr_poc
spring.datasource.username=root
spring.datasource.password=root
spring.datasource.driver-class-name=org.mariadb.jdbc.Driver
spring.datasource.max-age=10000
spring.datasource.initialize=true
# JPA (JpaBaseConfiguration, HibernateJpaAutoConfiguration)
spring.jpa.generate-ddl=false
spring.jpa.show-sql=true
spring.jpa.database=MYSQL
# SPRING BATCH (BatchDatabaseInitializer)
spring.batch.initializer.enabled=false
Thanks for the above posts! I had been struggling for the past couple of days to get my Spring Boot with Batch application working with an in-memory, map-based job repository and a resourceless transaction manager. (I cannot let Spring Batch use my application datasource for the batch metadata tables, as I don't have the DDL access to create the BATCH_ tables there.)
I finally arrived at the configuration below after looking at the above posts, and it worked perfectly!
public class CustomBatchConfigurer implements BatchConfigurer {
private static final Logger LOG = LoggerFactory.getLogger(CustomBatchConfigurer.class);
// private final EntityManagerFactory entityManagerFactory;
private PlatformTransactionManager transactionManager;
private JobRepository jobRepository;
private JobLauncher jobLauncher;
private JobExplorer jobExplorer;
/**
* Create a new {@link CustomBatchConfigurer} instance.
* @param entityManagerFactory the entity manager factory
public CustomBatchConfigurer( EntityManagerFactory entityManagerFactory ) {
this.entityManagerFactory = entityManagerFactory;
}
*/
@Override
public JobRepository getJobRepository() {
return this.jobRepository;
}
@Override
public PlatformTransactionManager getTransactionManager() {
return this.transactionManager;
}
@Override
public JobLauncher getJobLauncher() {
return this.jobLauncher;
}
@Override
public JobExplorer getJobExplorer() throws Exception {
return this.jobExplorer;
}
@PostConstruct
public void initialize() {
try {
// transactionManager:
LOG.info("Forcing the use of a Resourceless transactionManager");
this.transactionManager = new ResourcelessTransactionManager();
// jobRepository:
LOG.info("Forcing the use of a Map based JobRepository");
MapJobRepositoryFactoryBean jobRepositoryFactory = new MapJobRepositoryFactoryBean( this.transactionManager );
jobRepositoryFactory.afterPropertiesSet();
this.jobRepository = jobRepositoryFactory.getObject();
// jobLauncher:
SimpleJobLauncher jobLauncher = new SimpleJobLauncher();
jobLauncher.setJobRepository(getJobRepository());
jobLauncher.afterPropertiesSet();
this.jobLauncher = jobLauncher;
// jobExplorer:
MapJobExplorerFactoryBean jobExplorerFactory = new MapJobExplorerFactoryBean(jobRepositoryFactory);
jobExplorerFactory.afterPropertiesSet();
this.jobExplorer = jobExplorerFactory.getObject();
}
catch (Exception ex) {
throw new IllegalStateException("Unable to initialize Spring Batch", ex);
}
}
}
And below is the bean I added in my job configuration class:
@Bean
public BatchConfigurer configurer(){
return new CustomBatchConfigurer();
}
Eria's answer worked! However, I had modified it to use:
org.springframework.batch.support.transaction.ResourcelessTransactionManager
From CustomBatchConfigurer:
@PostConstruct
public void initialize() {
try {
// transactionManager:
LOGGER.info("Forcing the use of ResourcelessTransactionManager for batch db");
this.transactionManager = new ResourcelessTransactionManager();
//the rest of the code follows...
}

Spring Batch with annotations

I'm currently trying to create a batch job with the Spring annotations, but the batch is never called. No error occurs; my batch just isn't called. It's a simple batch that retrieves values from the database and adds messages to a queue (RabbitMQ).
The main configuration class:
@Configuration
@EnableBatchProcessing
public class BatchInfrastructureConfiguration {
@Bean
public JobLauncher getJobLauncher() throws Exception {
SimpleJobLauncher jobLauncher = new SimpleJobLauncher();
jobLauncher.setJobRepository(getJobRepository());
jobLauncher.afterPropertiesSet();
return jobLauncher;
}
public JobRepository getJobRepository() throws Exception {
MapJobRepositoryFactoryBean factory = new MapJobRepositoryFactoryBean();
factory.setTransactionManager(new ResourcelessTransactionManager());
factory.afterPropertiesSet();
return (JobRepository) factory.getObject();
}
}
The configuration class specific to my batch
@Configuration
@Import(BatchInfrastructureConfiguration.class)
public class PurchaseStatusBatchConfiguration {
@Inject
private JobBuilderFactory jobBuilders;
@Inject
private StepBuilderFactory stepBuilders;
@Bean
public Job purchaseStatusJob(){
return jobBuilders.get("purchaseStatusJob")
.start(step())
.build();
}
@Bean
public Step step(){
return stepBuilders.get("purchaseStatusStep")
.tasklet(new PurchaseStatusBatch())
.build();
}
}
The batch class:
public class PurchaseStatusBatch implements Tasklet {
@Inject
private PurchaseRepository purchaseRepository;
@Inject
@Qualifier(ApplicationConst.BEAN_QUALIFIER_PURCHASE_QUEUE)
private RabbitTemplate rabbitTemplate;
@Override
public RepeatStatus execute(StepContribution contribution, ChunkContext chunkContext) throws Exception {
PurchaseDto purchaseDto;
PurchaseMessage purchaseMessage;
List<Purchase> notVerifiedPurchase = purchaseRepository.findByVerified(false);
for (Purchase purchase : notVerifiedPurchase) {
purchaseDto = new PurchaseDto();
purchaseDto.setOrderId(purchase.getOrderId());
purchaseDto.setProductId(purchase.getProductId());
purchaseDto.setPurchaseToken(purchase.getPurchaseToken());
purchaseDto.setUserScrapbookKey(purchase.getUserScrapbookKey());
purchaseMessage = new PurchaseMessage();
purchaseMessage.setPurchaseDto(purchaseDto);
rabbitTemplate.convertAndSend(purchaseMessage);
}
return null;
}
}
The job runner (class calling the batch):
@Service
public class PurchaseStatusJobRunner {
@Inject
private JobLocator jobLocator;
@Inject
private JobLauncher jobLauncher;
//@Scheduled(fixedDelay = 3000L)
//@Scheduled(cron="* * * * *") // every 1 minute
@Scheduled(fixedDelay = 3000L)
public void runJob() throws JobExecutionAlreadyRunningException, JobRestartException, JobInstanceAlreadyCompleteException, JobParametersInvalidException, NoSuchJobException {
jobLauncher.run(jobLocator.getJob("purchaseStatusJob"), new JobParameters());
}
}
Short answer: the problem is that your PurchaseStatusBatch instance is not a Spring bean, so all its attributes are null (they have not been injected).
Try this:
@Bean
public Step step(){
return stepBuilders.get("purchaseStatusStep")
.tasklet(purchaseStatusBatch())
.build();
}
@Bean
public PurchaseStatusBatch purchaseStatusBatch() {
return new PurchaseStatusBatch()
}
When you want feedback on your job execution, just use the JobExecution instance returned by the JobLauncher. You can get the ExitStatus, all exceptions caught by the JobLauncher, and more information.
Another solution to get feedback would be providing a real database to your JobRepository and then checking execution statuses in it.
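As a small sketch of the first option, the body of runJob() in the PurchaseStatusJobRunner above could inspect the returned execution like this; it assumes an SLF4J "log" field on the runner, and the log messages are illustrative.
// Inside PurchaseStatusJobRunner#runJob(), reusing the injected jobLauncher and jobLocator:
JobExecution execution = jobLauncher.run(jobLocator.getJob("purchaseStatusJob"), new JobParameters());
if (execution.getStatus() == BatchStatus.COMPLETED) {
    log.info("purchaseStatusJob finished with exit status {}", execution.getExitStatus());
} else {
    // Every exception collected during the run is available on the execution.
    for (Throwable failure : execution.getAllFailureExceptions()) {
        log.error("purchaseStatusJob failure", failure);
    }
}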
