How to reset MultiResourceItemReader for each job run. Step scope not working - spring

How can I initialize the MultiResourceItemReader for each job run? Currently, with this setup, it still uses the same instance for each job run.
I added @StepScope, but it still uses the same old list of files that has already been processed. I am not sure what else I have to add to this code.
I also tried @JobScope, but that did not work either; there is something fundamental I am missing.
@Configuration
@EnableBatchProcessing
public class BatchConfiguration {

    @Autowired
    public JobBuilderFactory jobBuilderFactory;

    @Autowired
    public StepBuilderFactory stepBuilderFactory;

    @Value("file:ftp-inbound/*.csv")
    @Autowired
    private Resource[] inputResources;

    @Autowired
    private StepBuilderFactory steps;

    @Autowired
    private JobBuilderFactory jobs;

    @Autowired
    private ResourceLoader resourceLoader;

    @Bean
    public FlatFileItemReader<AccommodationRoomAvailability> itemReader() throws UnexpectedInputException, ParseException, IOException {
        FlatFileItemReader<AccommodationRoomAvailability> reader = new FlatFileItemReader<AccommodationRoomAvailability>();
        DelimitedLineTokenizer tokenizer = new DelimitedLineTokenizer();
        String[] tokens = {"Product ID", "Allotment", "Kamertype", "Zoeknaam", "Hotel", "Datum", "Beschikbaar", "Nachten"};
        tokenizer.setNames(tokens);
        tokenizer.setDelimiter(";");
        tokenizer.setStrict(true);
        reader.setLinesToSkip(1);
        DefaultLineMapper<AccommodationRoomAvailability> lineMapper = new DefaultLineMapper<AccommodationRoomAvailability>();
        lineMapper.setLineTokenizer(tokenizer);
        lineMapper.setFieldSetMapper(new RecordFieldSetMapper());
        reader.setLineMapper(lineMapper);
        return reader;
    }

    @Bean
    @Qualifier("multiResourceReader")
    @StepScope
    public MultiResourceItemReader<AccommodationRoomAvailability> multiResourceItemReader() throws Exception {
        MultiResourceItemReader<AccommodationRoomAvailability> resourceItemReader = new MultiResourceItemReader<AccommodationRoomAvailability>();
        resourceItemReader.setResources(inputResources);
        resourceItemReader.setDelegate(itemReader());
        resourceItemReader.setStrict(false);
        resourceItemReader.setSaveState(false);
        // resourceItemReader.read();
        return resourceItemReader;
    }

    @Bean
    public ItemProcessor<AccommodationRoomAvailability, String> itemProcessor() {
        return new AvailabilityProcessor();
    }

    @Bean
    public ItemWriter itemWriter() {
        return new ItemWriter() {
            @Override
            public void write(List list) throws Exception {
            }
        };
    }

    @Bean
    protected Step step1(@Qualifier("multiResourceReader") MultiResourceItemReader<AccommodationRoomAvailability> reader,
            ItemProcessor<AccommodationRoomAvailability, String> processor, ItemWriter writer) {
        return steps.get("step1")/*.listener(new StepListener())*/
                .<AccommodationRoomAvailability, String>chunk(30000)
                .reader(reader)
                .processor(processor)
                .writer(writer)
                .build();
    }

    @Bean
    public Step step2() throws IOException {
        FileDeletingTasklet task = new FileDeletingTasklet();
        task.setResources(inputResources);
        return stepBuilderFactory.get("step2")
                .tasklet(task)
                .build();
    }

    @Bean(name = "job")
    public Job job(@Qualifier("step1") Step step1, Step step2) throws IOException {
        return jobs.get("job")
                .start(step1).on("*").to(step2).end()
                // .flow(step1).on("").to(step2()).end()
                .build();
    }
}

Once your application context is created, the resources injected with @Value("file:ftp-inbound/*.csv") will be the same during the whole lifetime of your app. That's why the reader always reads the same values.
You need to pass these resources as a parameter to your job and late-bind them in your reader with step scope. In your example, it would be something like:
@Bean
@Qualifier("multiResourceReader")
@StepScope
public MultiResourceItemReader<AccommodationRoomAvailability> multiResourceItemReader(@Value("#{jobParameters['inputResources']}") Resource[] inputResources) throws Exception {
    MultiResourceItemReader<AccommodationRoomAvailability> resourceItemReader = new MultiResourceItemReader<AccommodationRoomAvailability>();
    resourceItemReader.setResources(inputResources);
    resourceItemReader.setDelegate(itemReader());
    resourceItemReader.setStrict(false);
    resourceItemReader.setSaveState(false);
    return resourceItemReader;
}
Then pass the input resources as a parameter to your job:
JobParameters jobParameters = new JobParametersBuilder()
        .addString("inputResources", "file:ftp-inbound/*.csv")
        .toJobParameters();
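For completeness, a minimal launch sketch (jobLauncher and job are assumed to be injected wherever you trigger the run; the timestamp parameter is hypothetical and only there to make each job instance unique):
// Illustrative sketch: each run can point the step-scoped reader at a
// (possibly different) file pattern through the job parameters.
JobParameters jobParameters = new JobParametersBuilder()
        .addString("inputResources", "file:ftp-inbound/*.csv")
        .addLong("run.timestamp", System.currentTimeMillis()) // hypothetical uniqueness parameter
        .toJobParameters();
JobExecution execution = jobLauncher.run(job, jobParameters);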
currently with this setup its still using the same instance for each job run
That's because your resources are always the same when they are injected into a field of your configuration class. If you use the job parameters approach from the previous example, you will get a different instance whenever you run the job with a different set of files.
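An alternative sketch, in case you prefer not to pass the pattern as a job parameter (assumption: re-expanding the pattern when the step starts is acceptable): because a step-scoped bean is created anew for each step execution, resolving the resources inside the bean method picks up whatever files are present at run time.
@Bean
@StepScope
public MultiResourceItemReader<AccommodationRoomAvailability> multiResourceItemReader() throws Exception {
    MultiResourceItemReader<AccommodationRoomAvailability> reader = new MultiResourceItemReader<>();
    // The pattern is expanded here, each time the step-scoped bean is created,
    // instead of once at application startup.
    reader.setResources(new PathMatchingResourcePatternResolver()
            .getResources("file:ftp-inbound/*.csv"));
    reader.setDelegate(itemReader());
    reader.setStrict(false);
    reader.setSaveState(false);
    return reader;
}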

Related

spring batch job with partitions: setting clientInfo in an Oracle session not working for all partitions

I have a Spring Batch job using partitions, and the reader is a JdbcCursorItemReader. In this reader I need an authorization to correctly read encrypted data, so when I declare my reader I call the method just below.
The problem is that some partitions read null values for the field that needs to be decrypted (I checked in the database and the data is not null), so the only explanation is that the authorization is not set. Why does it work for some partitions and not for all?
private void authorize() {
    // Authorize
    JdbcTemplate jdbcTemplate = new JdbcTemplate(dataSource);
    jdbcTemplate.update(setClientInfo, authorization);
}
And this is how I declare my reader:
@Bean
@StepScope
public JdbcCursorItemReader<MyEntity> reader(@Value("#{stepExecutionContext['modulo']}") Integer modulo)
        throws IOException {
    ClassPathResource resource = new ClassPathResource(SQL_FILE);
    BufferedReader reader = new BufferedReader(new InputStreamReader(resource.getInputStream()));
    String query = FileCopyUtils.copyToString(reader);
    query = query.replace(MODULO_LABEL, String.valueOf(modulo));
    query = query.replace(GRID_SIZE_LABEL, String.valueOf(gridSize));
    authorize();
    JdbcCursorItemReader<MyEntity> cursorItemReader = new JdbcCursorItemReader<>();
    cursorItemReader.setSql(query);
    final int partitionSize = maxNumberCards / gridSize;
    cursorItemReader.setMaxItemCount(partitionSize);
    cursorItemReader.setDataSource(dataSource);
    cursorItemReader.setRowMapper(myRowMapper);
    return cursorItemReader;
}
And my job configuration:
@Configuration
@EnableBatchProcessing
@RefreshScope
public class MyFunctionJobConfiguration {

    @Autowired
    public JobBuilderFactory jobBuilderFactory;

    @Autowired
    public StepBuilderFactory stepBuilderFactory;

    @Autowired
    JdbcCursorItemReader<MyEntity> reader;

    @Value("${max-number-card-to-process}")
    private Integer MAX_NUMBER_CARD;

    @Value("${chunck-size:10}")
    private int chunckSize;

    @Value("${grid-size:1}")
    private int gridSize;

    private final static String JOB_DISABLED = "job is disabled, check the configuration file !";

    @Value("${job.enabled}")
    private boolean batchIsEnabled;

    private static final Logger LOGGER = LoggerFactory.getLogger("FUNCTIONAL_LOGGER");

    @Bean
    @StepScope
    @RefreshScope
    public MyEntityWriter writer() {
        return new MyEntityWriter();
    }

    @Bean
    @StepScope
    @RefreshScope
    public MyFunctionProcessor processor() throws IOException {
        return new MyFunctionProcessor();
    }

    @Bean
    public MyPrationner partitioner() {
        return new MyPrationner();
    }

    @Bean
    public Step masterStep() throws SQLException, IOException, ClassNotFoundException {
        return stepBuilderFactory.get("masterStep")
                .partitioner("MyFunctionStep", partitioner())
                .step(myFunctionStep())
                .gridSize(gridSize)
                .taskExecutor(myFunctionTaskExecutor())
                .build();
    }

    @Bean
    public TaskExecutor myFunctionTaskExecutor() {
        ThreadPoolTaskExecutor taskExecutor = new ThreadPoolTaskExecutor();
        taskExecutor.setThreadNamePrefix("MyFunctionTaskExecutor_");
        int corePoolSize = gridSize + 2;
        int maxPoolSize = corePoolSize * 2;
        taskExecutor.setMaxPoolSize(maxPoolSize);
        taskExecutor.setAllowCoreThreadTimeOut(true);
        taskExecutor.setCorePoolSize(corePoolSize);
        taskExecutor.setQueueCapacity(Integer.MAX_VALUE);
        return taskExecutor;
    }

    @Bean
    public Step myFunctionStep() throws IOException, ClassNotFoundException, SQLException {
        return stepBuilderFactory.get("MyFunctionStep")
                .<MyEntity, MyEntity>chunk(chunckSize)
                .reader(reader)
                .faultTolerant()
                .skipLimit(MAX_NUMBER_CARD)
                .skip(InvalidCardNumberException.class)
                .skip(TokenManagementException.class)
                .processor(processor())
                .listener(new MyEntityProcessListener())
                .writer(writer())
                .listener(new MyEntityWriteListener())
                .build();
    }

    @Bean
    public Job myFunctionJob(@Qualifier("myFunctionStep") Step myFunctionStep)
            throws SQLException, IOException, ClassNotFoundException {
        if (!batchIsEnabled) {
            LOGGER.error(JOB_DISABLED);
            System.exit(0);
        }
        return jobBuilderFactory.get("MyFunctionJob")
                .listener(new MyFunctionJobListener())
                .incrementer(new RunIdIncrementer())
                .flow(masterStep())
                .end()
                .build();
    }
}
I am trying to run a Spring Batch job with partitions to read data from an Oracle database (the SQL contains a decryption function). This requires setting an authorization for every connection session.
The problem is that when the batch runs, some partitions do not decrypt the data and return null, and the only reason for that is that the authorization is not set.
The JdbcCursorItemReader does not use a JdbcTemplate. It directly creates connections to the database from the data source object passed to it. So you should not expect a call to authorize, which operates on a separate JdbcTemplate instance (and therefore a different connection), to impact the behaviour of the JdbcCursorItemReader. You said it works for some partitions, and that's really surprising.
If you want to take control of how the connection to the database is configured and override the default settings (for example, by adding some authorization attributes), you need to extend JdbcCursorItemReader and override the protected void openCursor(Connection con) method, something like:
class MyCustomJdbcCursorItemReader<T> extends JdbcCursorItemReader<T> {
    @Override
    protected void openCursor(Connection con) {
        super.openCursor(con);
        // con.setClientInfo(); // set client info as needed here
    }
}
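For illustration, a slightly fuller sketch (the client-info property name "OCSID.CLIENTID" and the constructor-injected authorization value are assumptions to adapt to whatever your decryption function reads from the session). Since the superclass prepares and executes the query inside openCursor, the client info is set before delegating:
class AuthorizingJdbcCursorItemReader<T> extends JdbcCursorItemReader<T> {

    private final String authorization;

    AuthorizingJdbcCursorItemReader(String authorization) {
        this.authorization = authorization;
    }

    @Override
    protected void openCursor(Connection con) {
        try {
            // Set on the very connection the cursor will use, before the
            // statement is prepared and executed by the superclass.
            con.setClientInfo("OCSID.CLIENTID", authorization); // property name is an assumption
        } catch (SQLClientInfoException e) {
            throw new IllegalStateException("Could not set client info", e);
        }
        super.openCursor(con);
    }
}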

MongoDB Spring Batch job using Gradle

I want to create an item reader, writer, and processor for a Spring Batch job using Gradle. I am having trouble with a few things. The delimited() portion is giving me an error. I am trying to read two fields for now: rxFname and rxLname.
Here is my code:
@Configuration
@EnableBatchProcessing
public class PaymentPortalJob {

    private static final Logger LOG =
            LoggerFactory.getLogger(PaymentPortalJob.class);

    @Autowired
    public JobBuilderFactory jobBuilderFactory;

    @Autowired
    public StepBuilderFactory stepBuilderFactory;

    @Bean
    MongoItemReader<PaymentAudit> fileReader() {
        return new MongoItemReaderBuilder<PaymentAudit>()
                .name("file-reader")
                .targetType(PaymentAudit.class)
                .delimited().delimiter(",").names(new String[] {"rxFname", "rxLname"})
                .build();
    }

    @Bean
    public ItemProcessor<PaymentAudit, PaymentAudit> processor() {
        return new PaymentAuditItemProcessor();
    }

    @Bean
    public ItemWriter<PaymentAudit> writer() {
        MongoItemWriter<PaymentAudit> writer = new MongoItemWriter<>();
        try {
            writer.setTemplate(mongoTemplate());
        } catch (Exception e) {
            LOG.error(e.toString());
        }
        writer.setCollection("paymentAudit");
        return writer;
    }

    @Bean
    Job job(JobBuilderFactory jbf,
            StepBuilderFactory sbf,
            ItemReader<? extends PaymentAudit> ir,
            ItemWriter<? super PaymentAudit> iw) {
        Step s1 = sbf.get("file-db")
                .<PaymentAudit, PaymentAudit>chunk(100)
                .reader(ir)
                .writer(iw)
                .build();
        return jbf.get("etl")
                .incrementer(new RunIdIncrementer())
                .start(s1)
                .build();
    }

    @Bean
    public MongoDbFactory mongoDbFactory() throws Exception {
        return new SimpleMongoDbFactory(new MongoClient(), "db-name");
    }

    @Bean
    public MongoTemplate mongoTemplate() throws Exception {
        return new MongoTemplate(mongoDbFactory());
    }
}
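As a side note on the error: delimited(), delimiter(), and names() belong to FlatFileItemReaderBuilder, which is why the compiler rejects them on a MongoItemReaderBuilder; a Mongo reader is driven by a query instead. A hedged sketch of what the reader and writer could look like (the JSON query, sort field, and writer bean name are assumptions, not a confirmed fix):
@Bean
MongoItemReader<PaymentAudit> fileReader(MongoTemplate mongoTemplate) {
    Map<String, Sort.Direction> sorts = new HashMap<>();
    sorts.put("rxLname", Sort.Direction.ASC); // MongoItemReader requires a sort; the field is an assumption
    return new MongoItemReaderBuilder<PaymentAudit>()
            .name("file-reader")
            .template(mongoTemplate)
            .targetType(PaymentAudit.class)
            .jsonQuery("{}") // all documents; narrow as needed
            .sorts(sorts)
            .build();
}

@Bean
ItemWriter<PaymentAudit> mongoWriter(MongoTemplate mongoTemplate) {
    return new MongoItemWriterBuilder<PaymentAudit>()
            .template(mongoTemplate)
            .collection("paymentAudit")
            .build();
}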

how to configure spring dataflow for spring batch

I have a Spring Batch project that I want to configure on Spring Cloud Data Flow. I am able to register it on SCDF, but when launching the task my job is not running.
Following is my configuration file:
@SpringBootApplication
@EnableBatchProcessing
@EnableTask
public class BatchApplication {

    /*@Autowired
    BatchCommandLineRunner batchcommdrunner;

    @Bean
    public CommandLineRunner commandLineRunner() {
        System.out.println("Executed at :" + new SimpleDateFormat().format(new Date()));
        return batchcommdrunner;
    }*/

    public static void main(String[] args) {
        SpringApplication.run(BatchApplication.class, args);
    }
}
And this is my batch configuration file:
@Configuration
public class BatchConfiguaration {

    @Autowired
    private DataSource datasouce;

    @Autowired
    private JobBuilderFactory jobBuilderFactory;

    @Autowired
    private StepBuilderFactory stepBuilderFactory;

    @Autowired
    public Environment env;

    @Bean(name = "reader")
    @StepScope
    public ItemReader<Schedules> reader(@Value("#{stepExecutionContext[scheduleRecs]}") List<Schedules> scherecs) {
        ItemReader<Schedules> reader = new IteratorItemReader<Schedules>(scherecs);
        return reader;
    }

    @Bean(name = "CWSreader")
    @StepScope
    public ItemReader<Contents> CWSreader(@Value("#{stepExecutionContext[scheduleRecs]}") List<Contents> scherecs) {
        ItemReader<Contents> reader = new IteratorItemReader<Contents>(scherecs);
        return reader;
    }

    @SuppressWarnings("rawtypes")
    @Bean
    @StepScope
    public BatchProcessor processor() {
        return new BatchProcessor();
    }

    @Bean(name = "batchSchedulePreparedStatement")
    @StepScope
    public BatchSchedulePreparedStatement batchSchedulePreparedStatement() {
        return new BatchSchedulePreparedStatement();
    }

    @SuppressWarnings({ "rawtypes", "unchecked" })
    @Bean(name = "batchWriter")
    @StepScope
    public BatchWriter batchWriter() {
        BatchWriter batchWriter = new BatchWriter();
        batchWriter.setDataSource(datasouce);
        batchWriter.setSql(env.getProperty("batch.insert.schedule.query"));
        batchWriter.setItemPreparedStatementSetter(batchSchedulePreparedStatement());
        return batchWriter;
    }

    @Bean("acheronDbTm")
    @Qualifier("acheronDbTm")
    public PlatformTransactionManager platformTransactionManager() {
        return new ResourcelessTransactionManager();
    }

    @Bean
    public JobExplorer jobExplorer() throws Exception {
        MapJobExplorerFactoryBean explorerFactoryBean = new MapJobExplorerFactoryBean();
        explorerFactoryBean.setRepositoryFactory(mapJobRepositoryFactoryBean());
        explorerFactoryBean.afterPropertiesSet();
        return explorerFactoryBean.getObject();
    }

    @Bean
    public MapJobRepositoryFactoryBean mapJobRepositoryFactoryBean() {
        MapJobRepositoryFactoryBean mapJobRepositoryFactoryBean = new MapJobRepositoryFactoryBean();
        mapJobRepositoryFactoryBean.setTransactionManager(platformTransactionManager());
        return mapJobRepositoryFactoryBean;
    }

    @Bean
    public JobRepository jobRepository() throws Exception {
        return mapJobRepositoryFactoryBean().getObject();
    }

    @Bean
    public SimpleJobLauncher jobLauncher() throws Exception {
        SimpleJobLauncher jobLauncher = new SimpleJobLauncher();
        jobLauncher.setJobRepository(jobRepository());
        return jobLauncher;
    }

    @Bean(name = "batchPartition")
    @StepScope
    public BatchPartition batchPartition() {
        BatchPartition batchPartition = new BatchPartition();
        return batchPartition;
    }

    @Bean(name = "taskExecutor")
    public TaskExecutor taskExecutor() {
        ThreadPoolTaskExecutor poolTaskExecutor = new ThreadPoolTaskExecutor();
        poolTaskExecutor.setCorePoolSize(10);
        poolTaskExecutor.setMaxPoolSize(30);
        poolTaskExecutor.setQueueCapacity(35);
        poolTaskExecutor.setThreadNamePrefix("Acheron");
        poolTaskExecutor.afterPropertiesSet();
        return poolTaskExecutor;
    }

    @Bean(name = "masterStep")
    public Step masterStep() {
        return stepBuilderFactory.get("masterStep").partitioner(slave()).partitioner("slave", batchPartition())
                .taskExecutor(taskExecutor()).build();
    }

    @Bean(name = "slave")
    public Step slave() {
        return stepBuilderFactory.get("slave").chunk(100).faultTolerant().retryLimit(2)
                .retry(DeadlockLoserDataAccessException.class).reader(reader(null)).processor(processor())
                .writer(batchWriter()).build();
    }

    @Bean(name = "manageStagingScheduleMaster")
    public Job manageStagingScheduleMaster(final Step masterStep) throws Exception {
        return jobBuilderFactory.get("manageStagingScheduleMaster").preventRestart().incrementer(new RunIdIncrementer())
                .start(masterStep).build();
    }
}
Can anyone help me configure it properly, or is there any other way I can monitor my batch jobs?
I also tried Spring Boot Admin, but it does not support Java configuration; is there a way in SBA to add jobs without defining them in XML?
I am launching this job from a controller:
JobParametersBuilder builder = new JobParametersBuilder();
System.out.println("Job Builder " + builder);
JobParameters jobParameters = builder.toJobParameters();
JobExecution execution = jobLauncher.run(job, jobParameters);
return execution.getStatus().toString();
This sample shows a basic Spring Batch application that can be launched as a task in Spring Cloud Data Flow.
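One more hedged observation about the controller snippet above: jobLauncher.run is invoked with empty JobParameters, and the RunIdIncrementer configured on the job is only applied when launching through a JobOperator, not by JobLauncher.run directly. Against a persistent job repository, a second launch with the same (empty) parameters would be rejected for a completed instance, so adding a unique parameter per launch is a safer sketch:
// Sketch: a unique parameter per launch avoids JobInstanceAlreadyCompleteException
// ("launch.time" is a hypothetical parameter name).
JobParameters jobParameters = new JobParametersBuilder()
        .addLong("launch.time", System.currentTimeMillis())
        .toJobParameters();
JobExecution execution = jobLauncher.run(job, jobParameters);
return execution.getStatus().toString();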

Spring Batch with Spring Boot terminates before children complete with AsyncItemProcessor

I'm using Spring Batch with an AsyncItemProcessor and things are behaving unexpectedly. Let me show the code first.
I followed a simple example, as shown in the Spring Batch project:
@EnableBatchProcessing
@SpringBootApplication
@Import({HttpClientConfigurer.class, BatchJobConfigurer.class})
public class PerfilEletricoApp {
    public static void main(String[] args) throws Exception { // NOSONAR
        System.exit(SpringApplication.exit(SpringApplication.run(PerfilEletricoApp.class, args)));
        //SpringApplication.run(PerfilEletricoApp.class, args);
    }
}
-- EDIT
If I just sleep the main process to give slf4j a few seconds to flush the logs, everything works as expected.
@EnableBatchProcessing
@SpringBootApplication
@Import({HttpClientConfigurer.class, BatchJobConfigurer.class})
public class PerfilEletricoApp {
    public static void main(String[] args) throws Exception { // NOSONAR
        //System.exit(SpringApplication.exit(SpringApplication.run(PerfilEletricoApp.class, args)));
        ConfigurableApplicationContext context = SpringApplication.run(PerfilEletricoApp.class, args);
        Thread.sleep(1000 * 5);
        System.exit(SpringApplication.exit(context));
    }
}
-- END OF EDIT
I'm reading a text file with a single field and then using an AsyncItemProcessor to get multithreaded processing, which consists of an HTTP GET on a URL to fetch some data; I'm also using a NoOpWriter to do nothing in the write part. I'm saving the results of the GET in the processor part of the job (using log.trace / log.warn).
@Configuration
public class HttpClientConfigurer {

    // [... properties and configs omitted]

    @Bean
    public CloseableHttpClient createHttpClient() {
        // ... creates and returns a poolable HTTP client, etc.
    }
}
As for the Job:
@Configuration
public class BatchJobConfigurer {

    @Autowired
    private JobBuilderFactory jobs;

    @Autowired
    private StepBuilderFactory steps;

    @Value("${async.tps:10}")
    private Integer tps;

    @Value("${com.bemobi.perfilelerico.sourcedir:/AppServer/perfil-eletrico/source-dir/}")
    private String sourceDir;

    @Bean
    public ItemReader<String> reader() {
        MultiResourceItemReader<String> reader = new MultiResourceItemReader<>();
        reader.setResources(new Resource[] { new FileSystemResource(sourceDir) });
        reader.setDelegate((ResourceAwareItemReaderItemStream<? extends String>) flatItemReader());
        return reader;
    }

    @Bean
    public ItemReader<String> flatItemReader() {
        FlatFileItemReader<String> itemReader = new FlatFileItemReader<>();
        itemReader.setLineMapper(new DefaultLineMapper<String>() {{
            setLineTokenizer(new DelimitedLineTokenizer() {{
                setNames(new String[] { "sample-field-001" });
            }});
            setFieldSetMapper(new SimpleStringFieldSetMapper<>());
        }});
        return itemReader;
    }

    @Bean
    public ItemProcessor asyncItemProcessor() {
        AsyncItemProcessor<String, OiPaggoResponse> asyncItemProcessor = new AsyncItemProcessor<>();
        asyncItemProcessor.setDelegate(processor());
        asyncItemProcessor.setTaskExecutor(getAsyncExecutor());
        return asyncItemProcessor;
    }

    @Bean
    public ItemProcessor<String, OiPaggoResponse> processor() {
        return new PerfilEletricoItemProcessor();
    }

    /**
     * Using a NoOpItemWriter<T> so we satisfy the Spring Batch flow but don't use the writer for anything else.
     * @return a NoOpItemWriter<OiPaggoResponse>
     */
    @Bean
    public ItemWriter<OiPaggoResponse> writer() {
        return new NoOpItemWriter<>();
    }

    @Bean
    protected Step step1() throws Exception {
        /*
        Problem starts here: if I use processor(), everything ends nicely, but if I insist
        on asyncItemProcessor(), the job ends and the logs from the processor are not stored on disk.
        */
        return this.steps.get("step1").<String, OiPaggoResponse>chunk(10)
                .reader(reader())
                .processor(asyncItemProcessor())
                .writer(writer())
                .build();
    }

    @Bean
    public Job job() throws Exception {
        return this.jobs.get("consulta-perfil-eletrico").start(step1()).build();
    }

    @Bean(name = "asyncExecutor")
    public TaskExecutor getAsyncExecutor() {
        ThreadPoolTaskExecutor executor = new ThreadPoolTaskExecutor();
        executor.setCorePoolSize(tps);
        executor.setMaxPoolSize(tps);
        executor.setQueueCapacity(tps * 1000);
        executor.setRejectedExecutionHandler(new ThreadPoolExecutor.CallerRunsPolicy());
        executor.setThreadNamePrefix("AsyncExecutor-");
        return executor;
    }
}
-- UPDATED WITH AsyncItemWriter (Working version)
/* Wrapped writer */
@Bean
public ItemWriter asyncItemWriter() {
    AsyncItemWriter<OiPaggoResponse> asyncItemWriter = new AsyncItemWriter<>();
    asyncItemWriter.setDelegate(writer());
    return asyncItemWriter;
}

/* AsyncItemWriter set on the step */
@Bean
protected Step step1() throws Exception {
    return this.steps.get("step1").<String, OiPaggoResponse>chunk(10)
            .reader(reader())
            .processor(asyncItemProcessor())
            .writer(asyncItemWriter())
            .build();
}
--
Any thoughts on why the AsyncItemProcessor doesn't wait for all the children to complete before sending an OK-Completed signal to the context?
The issue is that the AsyncItemProcessor is creating Futures that no one is waiting for. Wrap your NoOpItemWriter in the AsyncItemWriter so that someone is waiting for the Futures. That will cause the job to complete as expected.

spring batch with annotations

I'm currently trying to create a batch with the Spring annotations, but the batch is never called. No error occurs; my batch simply isn't called. It's a simple batch that retrieves values from the database and adds messages to a queue (RabbitMQ).
The main configuration class:
@Configuration
@EnableBatchProcessing
public class BatchInfrastructureConfiguration {

    @Bean
    public JobLauncher getJobLauncher() throws Exception {
        SimpleJobLauncher jobLauncher = new SimpleJobLauncher();
        jobLauncher.setJobRepository(getJobRepository());
        jobLauncher.afterPropertiesSet();
        return jobLauncher;
    }

    public JobRepository getJobRepository() throws Exception {
        MapJobRepositoryFactoryBean factory = new MapJobRepositoryFactoryBean();
        factory.setTransactionManager(new ResourcelessTransactionManager());
        factory.afterPropertiesSet();
        return (JobRepository) factory.getObject();
    }
}
The configuration class specific to my batch
@Configuration
@Import(BatchInfrastructureConfiguration.class)
public class PurchaseStatusBatchConfiguration {

    @Inject
    private JobBuilderFactory jobBuilders;

    @Inject
    private StepBuilderFactory stepBuilders;

    @Bean
    public Job purchaseStatusJob() {
        return jobBuilders.get("purchaseStatusJob")
                .start(step())
                .build();
    }

    @Bean
    public Step step() {
        return stepBuilders.get("purchaseStatusStep")
                .tasklet(new PurchaseStatusBatch())
                .build();
    }
}
The batch class:
public class PurchaseStatusBatch implements Tasklet {

    @Inject
    private PurchaseRepository purchaseRepository;

    @Inject
    @Qualifier(ApplicationConst.BEAN_QUALIFIER_PURCHASE_QUEUE)
    private RabbitTemplate rabbitTemplate;

    @Override
    public RepeatStatus execute(StepContribution contribution, ChunkContext chunkContext) throws Exception {
        PurchaseDto purchaseDto;
        PurchaseMessage purchaseMessage;
        List<Purchase> notVerifiedPurchase = purchaseRepository.findByVerified(false);
        for (Purchase purchase : notVerifiedPurchase) {
            purchaseDto = new PurchaseDto();
            purchaseDto.setOrderId(purchase.getOrderId());
            purchaseDto.setProductId(purchase.getProductId());
            purchaseDto.setPurchaseToken(purchase.getPurchaseToken());
            purchaseDto.setUserScrapbookKey(purchase.getUserScrapbookKey());
            purchaseMessage = new PurchaseMessage();
            purchaseMessage.setPurchaseDto(purchaseDto);
            rabbitTemplate.convertAndSend(purchaseMessage);
        }
        return RepeatStatus.FINISHED;
    }
}
The job runner (class calling the batch):
@Service
public class PurchaseStatusJobRunner {

    @Inject
    private JobLocator jobLocator;

    @Inject
    private JobLauncher jobLauncher;

    //@Scheduled(fixedDelay = 3000L)
    //@Scheduled(cron = "* * * * *") // every 1 minute
    @Scheduled(fixedDelay = 3000L)
    public void runJob() throws JobExecutionAlreadyRunningException, JobRestartException, JobInstanceAlreadyCompleteException, JobParametersInvalidException, NoSuchJobException {
        jobLauncher.run(jobLocator.getJob("purchaseStatusJob"), new JobParameters());
    }
}
Short answer: the problem is that your PurchaseStatusBatch instance is not a Spring bean, so all its attributes are null (they have not been injected).
Try this:
@Bean
public Step step() {
    return stepBuilders.get("purchaseStatusStep")
            .tasklet(purchaseStatusBatch())
            .build();
}

@Bean
public PurchaseStatusBatch purchaseStatusBatch() {
    return new PurchaseStatusBatch();
}
When you want feedback on your job execution, just use the JobExecution instance returned by the JobLauncher. You can get the ExitStatus, you can get all the exceptions caught by the JobLauncher, and more information.
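For example, a short sketch using the runner from the question:
JobExecution execution = jobLauncher.run(jobLocator.getJob("purchaseStatusJob"), new JobParameters());
// The returned execution reports the outcome and carries any failures.
System.out.println("Status: " + execution.getStatus());
System.out.println("Exit status: " + execution.getExitStatus());
execution.getAllFailureExceptions().forEach(Throwable::printStackTrace);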
Another way to get feedback would be to provide a real database for your JobRepository and then check the execution statuses in it.
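A sketch of that second option, assuming Spring Boot's DataSourceBuilder is available (the URL and credentials are placeholders): with @EnableBatchProcessing, exposing a DataSource bean is enough to get a JDBC-backed JobRepository, so execution statuses land in the BATCH_* tables.
@Bean
public DataSource dataSource() {
    // Placeholders: point this at your real database.
    return DataSourceBuilder.create()
            .url("jdbc:postgresql://localhost:5432/batch")
            .username("batch")
            .password("secret")
            .build();
}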
