Spring Batch: How to chain multiple ItemProcessors with different types? - spring-boot

I have to compose two processors as follows:
processor 1 implements the ItemProcessor interface as ItemProcessor<A, B> (transforms the data).
processor 2 implements the ItemProcessor interface as ItemProcessor<B, B> (treats the transformed data).
CompositeItemProcessor<I, O> requires the delegates to be of the same type; moreover, when passing it to the step, the step is already configured with the fixed types <A, B>.
How can I chain these processors with different types and assign the composite to the step's processor?

You need to declare your step as well as your composite processor with <A, B>. Here is a quick example:
import java.util.Arrays;
import org.springframework.batch.core.Job;
import org.springframework.batch.core.JobParameters;
import org.springframework.batch.core.configuration.annotation.EnableBatchProcessing;
import org.springframework.batch.core.configuration.annotation.JobBuilderFactory;
import org.springframework.batch.core.configuration.annotation.StepBuilderFactory;
import org.springframework.batch.core.launch.JobLauncher;
import org.springframework.batch.item.ItemProcessor;
import org.springframework.batch.item.ItemReader;
import org.springframework.batch.item.ItemWriter;
import org.springframework.batch.item.support.CompositeItemProcessor;
import org.springframework.batch.item.support.ListItemReader;
import org.springframework.context.ApplicationContext;
import org.springframework.context.annotation.AnnotationConfigApplicationContext;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
@EnableBatchProcessing
public class MyJobConfiguration {

    @Bean
    public ItemReader<A> itemReader() {
        return new ListItemReader<>(Arrays.asList(new A("a1"), new A("a2")));
    }

    @Bean
    public ItemProcessor<A, B> itemProcessor1() {
        return item -> new B(item.name);
    }

    @Bean
    public ItemProcessor<B, B> itemProcessor2() {
        return item -> item; // TODO process item as needed
    }

    @Bean
    public ItemProcessor<A, B> compositeItemProcessor() {
        CompositeItemProcessor<A, B> compositeItemProcessor = new CompositeItemProcessor<>();
        compositeItemProcessor.setDelegates(Arrays.asList(itemProcessor1(), itemProcessor2()));
        return compositeItemProcessor;
    }

    @Bean
    public ItemWriter<B> itemWriter() {
        return items -> {
            for (B item : items) {
                System.out.println("item = " + item.name);
            }
        };
    }

    @Bean
    public Job job(JobBuilderFactory jobs, StepBuilderFactory steps) {
        return jobs.get("job")
                .start(steps.get("step")
                        .<A, B>chunk(2)
                        .reader(itemReader())
                        .processor(compositeItemProcessor())
                        .writer(itemWriter())
                        .build())
                .build();
    }

    public static void main(String[] args) throws Exception {
        ApplicationContext context = new AnnotationConfigApplicationContext(MyJobConfiguration.class);
        JobLauncher jobLauncher = context.getBean(JobLauncher.class);
        Job job = context.getBean(Job.class);
        jobLauncher.run(job, new JobParameters());
    }

    static class A {
        String name;
        public A(String name) { this.name = name; }
    }

    static class B {
        String name;
        public B(String name) { this.name = name; }
    }
}
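As a type-safe alternative to CompositeItemProcessor's raw delegate list, you could also compose the two delegates by hand in a lambda. A minimal sketch, reusing the two processor beans defined above and short-circuiting on null the same way CompositeItemProcessor does (a null result means the item is filtered):

@Bean
public ItemProcessor<A, B> compositeItemProcessor(ItemProcessor<A, B> itemProcessor1,
                                                  ItemProcessor<B, B> itemProcessor2) {
    return item -> {
        // First delegate transforms A into B.
        B transformed = itemProcessor1.process(item);
        // Stop the chain if the item was filtered out by the first delegate.
        return transformed == null ? null : itemProcessor2.process(transformed);
    };
}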

Related

How to add AsyncItemWriter in Spring Batch to a Step correctly?

In a Spring Batch job like the following, I'm trying to use an AsyncItemWriter:
@Bean
public Step readWriteStep() throws Exception {
    return stepBuilderFactory.get("readWriteStep")
            .listener(listener)
            .<Data, Data>chunk(10)
            .reader(dataItemReader())
            .writer(dataAsyncWriter())
            .build();
}

@Bean
public AsyncItemWriter<Data> dataAsyncWriter() throws Exception {
    AsyncItemWriter<Data> asyncItemWriter = new AsyncItemWriter<>();
    asyncItemWriter.setDelegate(dataItemWriter);
    asyncItemWriter.afterPropertiesSet();
    return asyncItemWriter;
}
If I try it like this, IntelliJ complains:
Required type: ItemWriter<? super Data>
Provided: AsyncItemWriter<Data>
When I change .<Data, Data>chunk(10) to .<Data, Future<Data>>chunk(10), IntelliJ shows no warning, but when I run the job I get the following exception:
java.lang.ClassCastException: Data cannot be cast to class java.util.concurrent.Future (Data is in unnamed module of loader 'app'; java.util.concurrent.Future is in module java.base of loader 'bootstrap')
What are the first and the second type parameters in .<Data, Data>chunk(10) for? Is the first what the processor takes and the second what the processor gives back?
How do I solve this problem?
Your example should compile if you change the step definition to use the following:
.<Data, Future<Data>>chunk(10)
As for the type parameters: the first is the step's input type (what the reader returns and the processor takes), and the second is the step's output type (what the processor produces and the writer consumes).
That said, I'm not sure this will work correctly at runtime, because the AsyncItemWriter is expected to unwrap items from their enclosing Futures, and those Futures are created by an AsyncItemProcessor.
In other words, AsyncItemWriter and AsyncItemProcessor should be used in conjunction for this pattern to work. Here is a quick example with both of them:
import java.util.Arrays;
import java.util.List;
import java.util.concurrent.Future;
import org.springframework.batch.core.Job;
import org.springframework.batch.core.JobParameters;
import org.springframework.batch.core.Step;
import org.springframework.batch.core.configuration.annotation.EnableBatchProcessing;
import org.springframework.batch.core.configuration.annotation.JobBuilderFactory;
import org.springframework.batch.core.configuration.annotation.StepBuilderFactory;
import org.springframework.batch.core.launch.JobLauncher;
import org.springframework.batch.integration.async.AsyncItemProcessor;
import org.springframework.batch.integration.async.AsyncItemWriter;
import org.springframework.batch.item.ItemProcessor;
import org.springframework.batch.item.ItemReader;
import org.springframework.batch.item.ItemWriter;
import org.springframework.batch.item.support.ListItemReader;
import org.springframework.context.ApplicationContext;
import org.springframework.context.annotation.AnnotationConfigApplicationContext;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.core.task.SimpleAsyncTaskExecutor;

@Configuration
@EnableBatchProcessing
public class SO72477556 {

    @Bean
    public ItemReader<Data> dataItemReader() {
        return new ListItemReader<Data>(Arrays.asList());
    }

    @Bean
    public ItemProcessor<Data, Data> dataItemProcessor() {
        return new ItemProcessor<Data, Data>() {
            @Override
            public Data process(Data item) throws Exception {
                return item;
            }
        };
    }

    @Bean
    public AsyncItemProcessor<Data, Data> asyncDataItemProcessor() {
        AsyncItemProcessor<Data, Data> asyncItemProcessor = new AsyncItemProcessor<>();
        asyncItemProcessor.setDelegate(dataItemProcessor());
        asyncItemProcessor.setTaskExecutor(new SimpleAsyncTaskExecutor());
        return asyncItemProcessor;
    }

    @Bean
    public ItemWriter<Data> dataItemWriter() {
        return new ItemWriter<Data>() {
            @Override
            public void write(List<? extends Data> items) throws Exception {
            }
        };
    }

    @Bean
    public AsyncItemWriter<Data> dataAsyncWriter() throws Exception {
        AsyncItemWriter<Data> asyncItemWriter = new AsyncItemWriter<>();
        asyncItemWriter.setDelegate(dataItemWriter());
        asyncItemWriter.afterPropertiesSet();
        return asyncItemWriter;
    }

    @Bean
    public Step readWriteStep(StepBuilderFactory stepBuilderFactory) throws Exception {
        return stepBuilderFactory.get("readWriteStep")
                .<Data, Future<Data>>chunk(10)
                .reader(dataItemReader())
                .processor(asyncDataItemProcessor())
                .writer(dataAsyncWriter())
                .build();
    }

    @Bean
    public Job job(JobBuilderFactory jobs, StepBuilderFactory steps) throws Exception {
        return jobs.get("job")
                .start(readWriteStep(steps))
                .build();
    }

    public static void main(String[] args) throws Exception {
        ApplicationContext context = new AnnotationConfigApplicationContext(SO72477556.class);
        JobLauncher jobLauncher = context.getBean(JobLauncher.class);
        Job job = context.getBean(Job.class);
        jobLauncher.run(job, new JobParameters());
    }

    static class Data {}
}

Spring Batch: typeMismatch.java.sql.Date, typeMismatch

I'm trying to send this CSV file to the database with Spring Batch.
users.csv:
2021-06-22,test1@gmail.com, testFullname1, testMatricule1, 1234, testUsername1
2021-06-22,test2@gmail.com, testFullname2, testMatricule2, 0000, testUsername2
and I get this error:
Failed to convert property value of type 'java.lang.String' to required type 'java.sql.Date' for property
Here is the batch configuration:
package sofrecom.collaborateur.config;

import javax.sql.DataSource;
import org.springframework.batch.core.Job;
import org.springframework.batch.core.Step;
import org.springframework.batch.core.configuration.annotation.EnableBatchProcessing;
import org.springframework.batch.core.configuration.annotation.JobBuilderFactory;
import org.springframework.batch.core.configuration.annotation.StepBuilderFactory;
import org.springframework.batch.core.launch.support.RunIdIncrementer;
import org.springframework.batch.item.database.BeanPropertyItemSqlParameterSourceProvider;
import org.springframework.batch.item.database.JdbcBatchItemWriter;
import org.springframework.batch.item.file.FlatFileItemReader;
import org.springframework.batch.item.file.mapping.BeanWrapperFieldSetMapper;
import org.springframework.batch.item.file.mapping.DefaultLineMapper;
import org.springframework.batch.item.file.transform.DelimitedLineTokenizer;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.core.io.ClassPathResource;
import sofrecom.collaborateur.model.DAOUser;

@Configuration
@EnableBatchProcessing
public class BatchConfiguration {

    @Autowired
    public JobBuilderFactory jobBuilderFactory;

    @Autowired
    public StepBuilderFactory stepBuilderFactory;

    @Autowired
    public DataSource dataSource;

    @Bean
    public FlatFileItemReader<DAOUser> reader() {
        FlatFileItemReader<DAOUser> reader = new FlatFileItemReader<DAOUser>();
        reader.setResource(new ClassPathResource("users.csv"));
        reader.setLineMapper(new DefaultLineMapper<DAOUser>() {{
            setLineTokenizer(new DelimitedLineTokenizer() {{
                setNames(new String[] { "dateIntegration", "email", "fullname", "matricule", "password", "username" });
            }});
            setFieldSetMapper(new BeanWrapperFieldSetMapper<DAOUser>() {{
                setTargetType(DAOUser.class);
            }});
        }});
        return reader;
    }

    @Bean
    public UserItemProcessor processor() {
        return new UserItemProcessor();
    }

    @Bean
    public JdbcBatchItemWriter<DAOUser> writer() {
        JdbcBatchItemWriter<DAOUser> writer = new JdbcBatchItemWriter<DAOUser>();
        writer.setItemSqlParameterSourceProvider(new BeanPropertyItemSqlParameterSourceProvider<DAOUser>());
        writer.setSql("INSERT INTO user (date_integration, email, fullname, matricule, password, username) VALUES (:dateIntegration, :email, :fullname, :matricule, :password, :username)");
        writer.setDataSource(dataSource);
        return writer;
    }

    @Bean
    public Job importUserJob(JobCompletionNotificationListener listener) {
        return jobBuilderFactory.get("importUserJob")
                .incrementer(new RunIdIncrementer())
                .listener(listener)
                .flow(step1())
                .end()
                .build();
    }

    @Bean
    public Step step1() {
        return stepBuilderFactory.get("step1")
                .<DAOUser, DAOUser>chunk(10)
                .reader(reader())
                .processor(processor())
                .writer(writer())
                .build();
    }
}
And this is the UserItemProcessor:
package sofrecom.collaborateur.config;

import java.sql.Date;
import java.text.SimpleDateFormat;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.batch.item.ItemProcessor;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.security.crypto.password.PasswordEncoder;
import sofrecom.collaborateur.model.DAOUser;

public class UserItemProcessor implements ItemProcessor<DAOUser, DAOUser> {

    private static final Logger log = LoggerFactory.getLogger(UserItemProcessor.class);

    @Autowired
    private PasswordEncoder bcryptEncoder;

    @Override
    public DAOUser process(final DAOUser person) throws Exception {
        final String password = bcryptEncoder.encode(person.getUsername());
        final DAOUser transformedPerson = new DAOUser(person.getDateIntegration(), person.getEmail(), person.getFullname(), person.getMatricule(), password, person.getUsername());
        log.info("Converting (" + person + ") into (" + transformedPerson + ")");
        return transformedPerson;
    }
}
Any solution please?
The problem is that the BeanWrapperFieldSetMapper does not know, by default, how to convert a String like 2021-06-22 to an object of type java.sql.Date (the type of the dateIntegration field in your domain object DAOUser). The way to tell this mapper how to deal with custom conversions is to register a ConversionService that has a converter from String to java.sql.Date. Here is a quick example:
// Additional imports needed for this snippet:
// import org.springframework.core.convert.converter.Converter;
// import org.springframework.core.convert.support.DefaultConversionService;
@Bean
public FlatFileItemReader<DAOUser> reader() {
    DefaultConversionService conversionService = new DefaultConversionService();
    conversionService.addConverter(new Converter<String, Date>() { // java.sql.Date
        @Override
        public Date convert(String s) {
            return Date.valueOf(s);
        }
    });
    BeanWrapperFieldSetMapper<DAOUser> fieldSetMapper = new BeanWrapperFieldSetMapper<>();
    fieldSetMapper.setConversionService(conversionService);
    fieldSetMapper.setTargetType(DAOUser.class);
    FlatFileItemReader<DAOUser> reader = new FlatFileItemReader<>();
    reader.setResource(new ClassPathResource("users.csv"));
    reader.setLineMapper(new DefaultLineMapper<DAOUser>() {{
        setLineTokenizer(new DelimitedLineTokenizer() {{
            setNames(new String[] { "dateIntegration", "email", "fullname", "matricule", "password", "username" });
        }});
        setFieldSetMapper(fieldSetMapper);
    }});
    return reader;
}
You can find a complete example based on what you shared in this repository.

Spring Batch integration configuration with Azure Service Bus

I am trying to configure the inbound and outbound adapters for the manager and worker beans as provided in the Spring Batch remote partitioning samples. I am having difficulty since the samples configure them against an AMQP ConnectionFactory, and when I follow the Spring Integration samples for Azure Service Bus, there is no class which can provide a connection factory. Help appreciated.
Below is the sample code:
import com.microsoft.azure.spring.integration.core.DefaultMessageHandler;
import com.microsoft.azure.spring.integration.core.api.CheckpointConfig;
import com.microsoft.azure.spring.integration.core.api.CheckpointMode;
import com.microsoft.azure.spring.integration.servicebus.inbound.ServiceBusQueueInboundChannelAdapter;
import com.microsoft.azure.spring.integration.servicebus.queue.ServiceBusQueueOperation;
import org.apache.commons.logging.Log;
import org.apache.commons.logging.LogFactory;
import org.springframework.batch.core.Job;
import org.springframework.batch.core.Step;
import org.springframework.batch.core.configuration.annotation.JobBuilderFactory;
import org.springframework.batch.integration.partition.RemotePartitioningManagerStepBuilderFactory;
import org.springframework.beans.factory.annotation.Qualifier;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.integration.annotation.IntegrationComponentScan;
import org.springframework.integration.annotation.ServiceActivator;
import org.springframework.integration.channel.DirectChannel;
import org.springframework.integration.dsl.IntegrationFlow;
import org.springframework.integration.dsl.IntegrationFlows;
import org.springframework.messaging.MessageChannel;
import org.springframework.messaging.MessageHandler;
import org.springframework.util.concurrent.ListenableFutureCallback;

@Configuration
@IntegrationComponentScan
public class ManagerConfiguration {

    private static final int GRID_SIZE = 3;
    private static final String REQUEST_QUEUE_NAME = "digital.intg.batch.cm.request";
    private static final String REPLY_QUEUE_NAME = "digital.intg.batch.cm.reply";
    private static final String MANAGER_INPUT_CHANNEL = "manager.input";
    private static final String MANAGER_OUTPUT_CHANNEL = "manager.output";
    private static final Log LOGGER = LogFactory.getLog(ManagerConfiguration.class);

    private final JobBuilderFactory jobBuilderFactory;
    private final RemotePartitioningManagerStepBuilderFactory managerStepBuilderFactory;

    public ManagerConfiguration(JobBuilderFactory jobBuilderFactory,
                                RemotePartitioningManagerStepBuilderFactory managerStepBuilderFactory) {
        this.jobBuilderFactory = jobBuilderFactory;
        this.managerStepBuilderFactory = managerStepBuilderFactory;
    }

    /*
     * Configure outbound flow (requests going to workers)
     */
    @Bean( name = MANAGER_OUTPUT_CHANNEL )
    public DirectChannel managerRequests() {
        return new DirectChannel();
    }

    /*
     * Configure inbound flow (replies coming from workers)
     */
    @Bean( name = MANAGER_INPUT_CHANNEL )
    public DirectChannel managerReplies() {
        return new DirectChannel();
    }

    @Bean
    public ServiceBusQueueInboundChannelAdapter managerQueueMessageChannelAdapter(
            @Qualifier( MANAGER_INPUT_CHANNEL ) MessageChannel inputChannel, ServiceBusQueueOperation queueOperation) {
        queueOperation.setCheckpointConfig(CheckpointConfig.builder().checkpointMode(CheckpointMode.MANUAL).build());
        ServiceBusQueueInboundChannelAdapter adapter = new ServiceBusQueueInboundChannelAdapter(REPLY_QUEUE_NAME,
                queueOperation);
        adapter.setOutputChannel(inputChannel);
        return adapter;
    }

    @Bean
    @ServiceActivator( inputChannel = MANAGER_OUTPUT_CHANNEL )
    public MessageHandler managerQueueMessageSender(ServiceBusQueueOperation queueOperation) {
        DefaultMessageHandler handler = new DefaultMessageHandler(REQUEST_QUEUE_NAME, queueOperation);
        handler.setSendCallback(new ListenableFutureCallback<Void>() {
            @Override
            public void onSuccess(Void result) {
                LOGGER.info("Manager Request Message was sent successfully.");
            }
            @Override
            public void onFailure(Throwable ex) {
                LOGGER.info("There was an error sending request message to worker.");
            }
        });
        return handler;
    }

    @Bean
    public IntegrationFlow managerOutboundFlow(MessageHandler managerQueueMessageSender) {
        return IntegrationFlows
                .from(managerRequests())
                .handle(managerQueueMessageSender)
                .get();
    }

    @Bean
    public IntegrationFlow managerInboundFlow(ServiceBusQueueInboundChannelAdapter managerQueueMessageChannelAdapter) {
        return IntegrationFlows
                .from(managerQueueMessageChannelAdapter)
                .channel(managerReplies())
                .get();
    }

    /*
     * Configure the manager step
     */
    @Bean
    public Step managerStep() {
        return this.managerStepBuilderFactory.get("managerStep")
                .partitioner("workerStep", new BasicPartitioner())
                .gridSize(GRID_SIZE)
                .outputChannel(managerRequests())
                .inputChannel(managerReplies())
                //.aggregator()
                .build();
    }

    @Bean
    public Job remotePartitioningJob() {
        return this.jobBuilderFactory.get("remotePartitioningJob")
                .start(managerStep())
                .build();
    }
}
The sample uses ActiveMQ because it is easily embeddable in a JVM for our tests and samples, but you can use any other broker that you want.
What should I inject here?
You should inject any dependency required by the queueMessageChannelAdapter handler:
.handle(queueMessageChannelAdapter)
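For reference, the worker side can mirror the manager configuration: an inbound adapter drains the request queue into the worker step's input channel. Below is a minimal sketch using the same Azure Service Bus starter as above; the channel bean and the inline tasklet are hypothetical, RemotePartitioningWorkerStepBuilderFactory comes from spring-batch-integration, and the reply-side outbound flow (mirroring managerQueueMessageSender, but pointing at the reply queue) is omitted for brevity:

import com.microsoft.azure.spring.integration.servicebus.inbound.ServiceBusQueueInboundChannelAdapter;
import com.microsoft.azure.spring.integration.servicebus.queue.ServiceBusQueueOperation;
import org.springframework.batch.core.Step;
import org.springframework.batch.integration.partition.RemotePartitioningWorkerStepBuilderFactory;
import org.springframework.batch.repeat.RepeatStatus;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.integration.annotation.IntegrationComponentScan;
import org.springframework.integration.channel.DirectChannel;

@Configuration
@IntegrationComponentScan
public class WorkerConfiguration {

    private static final String REQUEST_QUEUE_NAME = "digital.intg.batch.cm.request";

    private final RemotePartitioningWorkerStepBuilderFactory workerStepBuilderFactory;

    public WorkerConfiguration(RemotePartitioningWorkerStepBuilderFactory workerStepBuilderFactory) {
        this.workerStepBuilderFactory = workerStepBuilderFactory;
    }

    // Channel on which partition requests arrive from the manager.
    @Bean
    public DirectChannel workerRequests() {
        return new DirectChannel();
    }

    // Inbound adapter draining the request queue into the worker input channel.
    @Bean
    public ServiceBusQueueInboundChannelAdapter workerQueueMessageChannelAdapter(ServiceBusQueueOperation queueOperation) {
        ServiceBusQueueInboundChannelAdapter adapter =
                new ServiceBusQueueInboundChannelAdapter(REQUEST_QUEUE_NAME, queueOperation);
        adapter.setOutputChannel(workerRequests());
        return adapter;
    }

    // The worker step consumes partition requests from the input channel.
    @Bean
    public Step workerStep() {
        return this.workerStepBuilderFactory.get("workerStep")
                .inputChannel(workerRequests())
                .tasklet((contribution, chunkContext) -> RepeatStatus.FINISHED) // hypothetical work
                .build();
    }
}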

Spring Batch example for a single job to read from a table, then split the results by type and process in parallel

Each parallel step will create a file; if all steps succeed, these files will be moved together to an output folder. If any of these steps fail, none of the files should go to the output folder and the whole job is failed. Help with a code example is much appreciated (I am a batch noob).
read from a table then split the results by type and process in parallel
You can partition the data by type using a partitioned step. Partitions are processed in parallel, and each partition creates a file. You then add a step after the partitioned step to clean up the files if any of the partitions fail. Here is a quick example you can try:
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Paths;
import java.util.HashMap;
import java.util.Map;
import org.springframework.batch.core.Job;
import org.springframework.batch.core.JobParameters;
import org.springframework.batch.core.Step;
import org.springframework.batch.core.configuration.annotation.EnableBatchProcessing;
import org.springframework.batch.core.configuration.annotation.JobBuilderFactory;
import org.springframework.batch.core.configuration.annotation.StepBuilderFactory;
import org.springframework.batch.core.configuration.annotation.StepScope;
import org.springframework.batch.core.launch.JobLauncher;
import org.springframework.batch.core.partition.support.Partitioner;
import org.springframework.batch.core.step.tasklet.Tasklet;
import org.springframework.batch.item.ExecutionContext;
import org.springframework.batch.repeat.RepeatStatus;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.context.ApplicationContext;
import org.springframework.context.annotation.AnnotationConfigApplicationContext;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.core.task.SimpleAsyncTaskExecutor;

@Configuration
@EnableBatchProcessing
public class PartitionJobSample {

    @Autowired
    private JobBuilderFactory jobs;

    @Autowired
    private StepBuilderFactory steps;

    @Bean
    public Step step1() {
        return steps.get("step1")
                .partitioner(workerStep().getName(), partitioner())
                .step(workerStep())
                .gridSize(3)
                .taskExecutor(taskExecutor())
                .build();
    }

    @Bean
    public SimpleAsyncTaskExecutor taskExecutor() {
        return new SimpleAsyncTaskExecutor();
    }

    @Bean
    public Partitioner partitioner() {
        return gridSize -> {
            Map<String, ExecutionContext> map = new HashMap<>(gridSize);
            for (int i = 0; i < gridSize; i++) {
                ExecutionContext executionContext = new ExecutionContext();
                executionContext.put("data", "data" + i);
                String key = "partition" + i;
                map.put(key, executionContext);
            }
            return map;
        };
    }

    @Bean
    public Step workerStep() {
        return steps.get("workerStep")
                .tasklet(getTasklet(null))
                .build();
    }

    @Bean
    @StepScope
    public Tasklet getTasklet(@Value("#{stepExecutionContext['data']}") String partitionData) {
        return (contribution, chunkContext) -> {
            System.out.println(Thread.currentThread().getName() + " processing partitionData = " + partitionData);
            Files.createFile(Paths.get(partitionData + ".txt"));
            return RepeatStatus.FINISHED;
        };
    }

    @Bean
    public Step moveFilesStep() {
        return steps.get("moveFilesStep")
                .tasklet((contribution, chunkContext) -> {
                    System.out.println("moveFilesStep");
                    // add code to move files where needed
                    return RepeatStatus.FINISHED;
                })
                .build();
    }

    @Bean
    public Step cleanupFilesStep() {
        return steps.get("cleanupFilesStep")
                .tasklet((contribution, chunkContext) -> {
                    System.out.println("cleaning up..");
                    deleteFiles();
                    return RepeatStatus.FINISHED;
                })
                .build();
    }

    @Bean
    public Job job() {
        return jobs.get("job")
                .flow(step1()).on("FAILED").to(cleanupFilesStep())
                .from(step1()).on("*").to(moveFilesStep())
                .from(moveFilesStep()).on("*").end()
                .from(cleanupFilesStep()).on("*").fail()
                .build()
                .build();
    }

    public static void main(String[] args) throws Exception {
        deleteFiles();
        ApplicationContext context = new AnnotationConfigApplicationContext(PartitionJobSample.class);
        JobLauncher jobLauncher = context.getBean(JobLauncher.class);
        Job job = context.getBean(Job.class);
        jobLauncher.run(job, new JobParameters());
    }

    private static void deleteFiles() throws IOException {
        for (int i = 0; i <= 2; i++) {
            Files.deleteIfExists(Paths.get("data" + i + ".txt"));
        }
    }
}
This example creates 3 dummy partitions ("data0", "data1" and "data2"). Each partition creates a file. If all partitions finish correctly, you will have three files, "data0.txt", "data1.txt" and "data2.txt", which will be moved in the moveFilesStep.
Now let's make one of the partitions fail, for example the partition processing "data2":
@Bean
@StepScope
public Tasklet getTasklet(@Value("#{stepExecutionContext['data']}") String partitionData) {
    return (contribution, chunkContext) -> {
        if (partitionData.equals("data2")) {
            throw new Exception("Boom!");
        }
        System.out.println(Thread.currentThread().getName() + " processing partitionData = " + partitionData);
        Files.createFile(Paths.get(partitionData + ".txt"));
        return RepeatStatus.FINISHED;
    };
}
In this case, the cleanupFilesStep will be triggered and will delete all files.
Hope this helps.

Spring Batch With Annotation and Caching

Does anyone have a good example of Spring Batch (using annotations) caching a reference table so that it is accessible to a processor?
I just need a simple cache: run a query which returns some byte[] and keep it in memory until the job finishes executing.
Appreciate any help on this topic.
Thanks!
A JobExecutionListener can be used to populate the cache with reference data before the job is executed and clear the cache after the job is finished.
Here is an example:
import org.springframework.batch.core.Job;
import org.springframework.batch.core.JobExecution;
import org.springframework.batch.core.JobExecutionListener;
import org.springframework.batch.core.JobParameters;
import org.springframework.batch.core.Step;
import org.springframework.batch.core.StepContribution;
import org.springframework.batch.core.configuration.annotation.EnableBatchProcessing;
import org.springframework.batch.core.configuration.annotation.JobBuilderFactory;
import org.springframework.batch.core.configuration.annotation.StepBuilderFactory;
import org.springframework.batch.core.launch.JobLauncher;
import org.springframework.batch.core.scope.context.ChunkContext;
import org.springframework.batch.core.step.tasklet.Tasklet;
import org.springframework.batch.repeat.RepeatStatus;
import org.springframework.cache.CacheManager;
import org.springframework.cache.concurrent.ConcurrentMapCacheManager;
import org.springframework.context.ApplicationContext;
import org.springframework.context.annotation.AnnotationConfigApplicationContext;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
@EnableBatchProcessing
public class MyJob {

    private JobBuilderFactory jobs;
    private StepBuilderFactory steps;

    public MyJob(JobBuilderFactory jobs, StepBuilderFactory steps) {
        this.jobs = jobs;
        this.steps = steps;
    }

    @Bean
    public CacheManager cacheManager() {
        return new ConcurrentMapCacheManager(); // return the implementation you want
    }

    @Bean
    public Tasklet tasklet() {
        return new MyTasklet(cacheManager());
    }

    @Bean
    public Step step() {
        return steps.get("step")
                .tasklet(tasklet())
                .build();
    }

    @Bean
    public JobExecutionListener jobExecutionListener() {
        return new CachingJobExecutionListener(cacheManager());
    }

    @Bean
    public Job job() {
        return jobs.get("job")
                .start(step())
                .listener(jobExecutionListener())
                .build();
    }

    class MyTasklet implements Tasklet {

        private CacheManager cacheManager;

        public MyTasklet(CacheManager cacheManager) {
            this.cacheManager = cacheManager;
        }

        @Override
        public RepeatStatus execute(StepContribution contribution, ChunkContext chunkContext) throws Exception {
            String name = (String) cacheManager.getCache("referenceData").get("foo").get();
            System.out.println("Hello " + name);
            return RepeatStatus.FINISHED;
        }
    }

    class CachingJobExecutionListener implements JobExecutionListener {

        private CacheManager cacheManager;

        public CachingJobExecutionListener(CacheManager cacheManager) {
            this.cacheManager = cacheManager;
        }

        @Override
        public void beforeJob(JobExecution jobExecution) {
            // populate cache as needed. Can use a jdbcTemplate to query the db here and populate the cache
            cacheManager.getCache("referenceData").put("foo", "bar");
        }

        @Override
        public void afterJob(JobExecution jobExecution) {
            // clear cache when the job is finished
            cacheManager.getCache("referenceData").clear();
        }
    }

    public static void main(String[] args) throws Exception {
        ApplicationContext context = new AnnotationConfigApplicationContext(MyJob.class);
        JobLauncher jobLauncher = context.getBean(JobLauncher.class);
        Job job = context.getBean(Job.class);
        jobLauncher.run(job, new JobParameters());
    }
}
When executed, it prints:
Hello bar
which means data is correctly retrieved from the cache. You would need to adapt the sample to query the database and populate the cache; a sketch of that follows.
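For instance, the beforeJob callback could load the reference data with a JdbcTemplate instead of the hard-coded value. A minimal sketch, assuming a hypothetical reference_data table with id and payload columns and a jdbcTemplate field injected into the listener (the cast to RowCallbackHandler disambiguates JdbcTemplate's overloaded query method):

@Override
public void beforeJob(JobExecution jobExecution) {
    // Hypothetical query; adapt the table and column names to your schema.
    jdbcTemplate.query("SELECT id, payload FROM reference_data", (RowCallbackHandler) rs ->
            // Cache each row's payload (a byte[]) under its id.
            cacheManager.getCache("referenceData").put(rs.getString("id"), rs.getBytes("payload")));
}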
Hope this helps.
You can also use the ehcache-jsr107 implementation, which is very quick to set up. A Spring and Ehcache integration example is available here. You should be able to set up the same with Spring Batch; see the sketch below.
Hope this helps.
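To wire a JSR-107 provider such as Ehcache into the first example, the cacheManager bean could return a JCacheCacheManager instead of the ConcurrentMapCacheManager. A minimal sketch, assuming the cache-api and ehcache jars are on the classpath (note that, unlike ConcurrentMapCacheManager, JCache caches must be created up front):

import javax.cache.Caching;
import javax.cache.configuration.MutableConfiguration;
import org.springframework.cache.CacheManager;
import org.springframework.cache.jcache.JCacheCacheManager;

@Bean
public CacheManager cacheManager() {
    // Picks up the JSR-107 provider found on the classpath (e.g. Ehcache).
    javax.cache.CacheManager jCacheManager = Caching.getCachingProvider().getCacheManager();
    // Create the cache used by the tasklet and listener above.
    jCacheManager.createCache("referenceData", new MutableConfiguration<>());
    return new JCacheCacheManager(jCacheManager);
}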
