Spring Batch Reader is reading alternate records - spring-boot

I have created a sample Spring Batch application that reads records from a DB and displays them in the writer. However, only the even-numbered (alternate) records are printed.
It's not a database problem, as the behavior is consistent with both H2 and Oracle.
There are 100 records in total in my DB.
With JdbcCursorItemReader, only 50 records are read, and only alternate ones, as can be seen from the log snapshot.
With JdbcPagingItemReader, only 5 records are read, again alternate ones, as can be seen from the log snapshot.
My configuration is given below. Why is the reader skipping the odd-numbered records?
@Bean
public ItemWriter<Safety> safetyWriter() {
    return items -> {
        for (Safety item : items) {
            log.info(item.toString());
        }
    };
}

@Bean
public JdbcCursorItemReader<Safety> cursorItemReader() throws Exception {
    JdbcCursorItemReader<Safety> reader = new JdbcCursorItemReader<>();
    reader.setSql("select * from safety");
    reader.setDataSource(dataSource);
    reader.setRowMapper(new SafetyRowMapper());
    reader.setVerifyCursorPosition(false);
    reader.afterPropertiesSet();
    return reader;
}

@Bean
JdbcPagingItemReader<Safety> safetyPagingItemReader() throws Exception {
    JdbcPagingItemReader<Safety> reader = new JdbcPagingItemReader<>();
    reader.setDataSource(dataSource);
    reader.setFetchSize(10);
    reader.setRowMapper(new SafetyRowMapper());
    H2PagingQueryProvider queryProvider = new H2PagingQueryProvider();
    queryProvider.setSelectClause("*");
    queryProvider.setFromClause("safety");
    Map<String, Order> sortKeys = new HashMap<>(1);
    sortKeys.put("id", Order.ASCENDING);
    queryProvider.setSortKeys(sortKeys);
    reader.setQueryProvider(queryProvider);
    return reader;
}

@Bean
public Step importSafetyDetails() throws Exception {
    return stepBuilderFactory.get("importSafetyDetails")
            .<Safety, Safety>chunk(chunkSize)
            //.reader(cursorItemReader())
            .reader(safetyPagingItemReader())
            .writer(safetyWriter())
            .listener(new StepListener())
            .listener(new ChunkListener())
            .build();
}

@Bean
public Job job() throws Exception {
    return jobBuilderFactory.get("job")
            .start(importSafetyDetails())
            .build();
}
The domain classes look like below:
@NoArgsConstructor
@AllArgsConstructor
@Data
public class Safety {
    private int id;
}

public class SafetyRowMapper implements RowMapper<Safety> {
    @Override
    public Safety mapRow(ResultSet resultSet, int i) throws SQLException {
        if (resultSet.next()) {
            Safety safety = new Safety();
            safety.setId(resultSet.getInt("id"));
            return safety;
        }
        return null;
    }
}

@SpringBootApplication
@EnableBatchProcessing
public class SpringBatchSamplesApplication {
    public static void main(String[] args) {
        SpringApplication.run(SpringBatchSamplesApplication.class, args);
    }
}
application.yml configuration is as below:
spring:
  application:
    name: spring-batch-samples
  main:
    allow-bean-definition-overriding: true
  datasource:
    url: jdbc:h2:mem:testdb;DB_CLOSE_DELAY=-1;DB_CLOSE_ON_EXIT=FALSE
    username: sa
    password:
    driver-class-name: org.h2.Driver
    hikari:
      connection-timeout: 20000
      maximum-pool-size: 10
  h2:
    console:
      enabled: true
  batch:
    initialize-schema: never
server:
  port: 9090
The SQL is as below:
CREATE TABLE safety (
id int NOT NULL,
CONSTRAINT PK_ID PRIMARY KEY (id)
);
INSERT INTO safety (id) VALUES (1);
...100 records are inserted
The listener classes are as below:
@Slf4j
public class StepListener {
    @AfterStep
    public ExitStatus afterStep(StepExecution stepExecution) {
        log.info("In step {} ,Exit Status: {} ,Read Records: {} ,Committed Records: {} ,Skipped Read Records: {} ,Skipped Write Records: {}",
                stepExecution.getStepName(),
                stepExecution.getExitStatus().getExitCode(),
                stepExecution.getReadCount(),
                stepExecution.getCommitCount(),
                stepExecution.getReadSkipCount(),
                stepExecution.getWriteSkipCount());
        return stepExecution.getExitStatus();
    }
}

@Slf4j
public class ChunkListener {
    @BeforeChunk
    public void beforeChunk(ChunkContext context) {
        log.info("<< Before the chunk");
    }

    @AfterChunk
    public void afterChunk(ChunkContext context) {
        log.info("<< After the chunk");
    }
}

I tried to reproduce your problem, but I couldn't. It would be great if you could share more code.
In the meantime, I created a simple job that reads the 100 records from the "safety" table and prints them to the console, and it works fine.
@SpringBootApplication
@EnableBatchProcessing
public class ReaderWriterProblem implements CommandLineRunner {

    @Autowired
    DataSource dataSource;
    @Autowired
    StepBuilderFactory stepBuilderFactory;
    @Autowired
    JobBuilderFactory jobBuilderFactory;
    @Autowired
    private JobLauncher jobLauncher;
    @Autowired
    private ApplicationContext context;

    public static void main(String[] args) {
        String[] arguments = new String[]{LocalDateTime.now().toString()};
        SpringApplication.run(ReaderWriterProblem.class, arguments);
    }

    @Bean
    public ItemWriter<Safety> safetyWriter() {
        return new ItemWriter<Safety>() {
            @Override
            public void write(List<? extends Safety> items) throws Exception {
                for (Safety item : items) {
                    //log.info(item.toString());
                    System.out.println(item);
                }
            }
        };
    }

//    @Bean
//    public JdbcCursorItemReader<Safety> cursorItemReader() throws Exception {
//        JdbcCursorItemReader<Safety> reader = new JdbcCursorItemReader<>();
//
//        reader.setSql("select * from safety ");
//        reader.setDataSource(dataSource);
//        reader.setRowMapper(new SafetyRowMapper());
//        reader.setVerifyCursorPosition(false);
//        reader.afterPropertiesSet();
//
//        return reader;
//    }

    @Bean
    JdbcPagingItemReader<Safety> safetyPagingItemReader() throws Exception {
        JdbcPagingItemReader<Safety> reader = new JdbcPagingItemReader<>();
        reader.setDataSource(dataSource);
        reader.setFetchSize(10);
        reader.setRowMapper(new SafetyRowMapper());
        PostgresPagingQueryProvider queryProvider = new PostgresPagingQueryProvider();
        queryProvider.setSelectClause("*");
        queryProvider.setFromClause("safety");
        Map<String, Order> sortKeys = new HashMap<>(1);
        sortKeys.put("id", Order.ASCENDING);
        queryProvider.setSortKeys(sortKeys);
        reader.setQueryProvider(queryProvider);
        return reader;
    }

    @Bean
    public Step importSafetyDetails() throws Exception {
        return stepBuilderFactory.get("importSafetyDetails")
                .<Safety, Safety>chunk(5)
                //.reader(cursorItemReader())
                .reader(safetyPagingItemReader())
                .writer(safetyWriter())
                .listener(new MyStepListener())
                .listener(new MyChunkListener())
                .build();
    }

    @Bean
    public Job job() throws Exception {
        return jobBuilderFactory.get("job")
                .listener(new JobListener())
                .start(importSafetyDetails())
                .build();
    }

    @Override
    public void run(String... args) throws Exception {
        JobParametersBuilder jobParametersBuilder = new JobParametersBuilder();
        jobParametersBuilder.addString("date", LocalDateTime.now().toString());
        try {
            Job job = (Job) context.getBean("job");
            jobLauncher.run(job, jobParametersBuilder.toJobParameters());
        } catch (JobExecutionAlreadyRunningException | JobRestartException | JobInstanceAlreadyCompleteException | JobParametersInvalidException e) {
            e.printStackTrace();
        }
    }

    public static class JobListener implements JobExecutionListener {
        @Override
        public void beforeJob(JobExecution jobExecution) {
            System.out.println("Before job");
        }

        @Override
        public void afterJob(JobExecution jobExecution) {
            System.out.println("After job");
        }
    }

    private static class SafetyRowMapper implements RowMapper<Safety> {
        @Override
        public Safety mapRow(ResultSet resultSet, int i) throws SQLException {
            Safety safety = new Safety();
            safety.setId(resultSet.getInt("ID"));
            return safety;
        }
    }

    public static class MyStepListener implements StepExecutionListener {
        @Override
        public void beforeStep(StepExecution stepExecution) {
            System.out.println("Before Step");
        }

        @Override
        public ExitStatus afterStep(StepExecution stepExecution) {
            System.out.println("After Step");
            return ExitStatus.COMPLETED;
        }
    }

    private static class MyChunkListener implements ChunkListener {
        @Override
        public void beforeChunk(ChunkContext context) {
            System.out.println("Before Chunk");
        }

        @Override
        public void afterChunk(ChunkContext context) {
            System.out.println("After Chunk");
        }

        @Override
        public void afterChunkError(ChunkContext context) {
        }
    }
}
Hope this helps
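For what it's worth, the likely root cause in the original code is the SafetyRowMapper: a RowMapper must not call resultSet.next(), because the reader has already positioned the cursor on the current row before invoking mapRow. Calling next() again inside mapRow consumes an extra row each time, which is exactly why every other record disappears. A minimal corrected mapper, assuming the Safety class from the question:

public class SafetyRowMapper implements RowMapper<Safety> {
    @Override
    public Safety mapRow(ResultSet resultSet, int i) throws SQLException {
        // The cursor is already on the current row; do NOT call resultSet.next() here.
        Safety safety = new Safety();
        safety.setId(resultSet.getInt("id"));
        return safety;
    }
}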

Related

#Transactional timeout is not working in spring boot application

I am using Spring Boot and JdbcTemplate in my application. I am trying to implement a timeout for a select query, but it is not working.
The query takes longer than the timeout, but no timeout exception is thrown.
@Service
@Slf4j
public class SchedulerService {
    @Autowired
    UserService userExportService;
    @Autowired
    private UserExportDao userExportDao;
    @Value("${queryTest}")
    private String queryFetchByExportFlagCustom;

    @Scheduled(fixedDelay = 10000)
    public void triggerUserExport() {
        List<UserExportCustom> userList;
        try {
            userList = userExportDao.findByExportFlag(0, queryFetchByExportFlagCustom);
            userExportService.exportUsers(userList, schedulerCount);
        } catch (Exception e) {
            e.printStackTrace();
        }
    }
}
@Repository
@Slf4j
public class UserExportDao extends JdbcDaoImpl<UserExportCustom, Long> {
    @Autowired
    BeanPropertyRowMapper<UserExportCustom> userExportCustomRowMapper;

    @Transactional(readOnly = true, timeout = 1)
    public List<UserExportCustom> findByExportFlag(Integer exportFlag, String query) {
        List<UserExportCustom> userExportCustomList = null;
        try {
            SqlParameterSource namedParameters = new MapSqlParameterSource().addValue("exportFlag", exportFlag, Types.INTEGER);
            userExportCustomList = namedParameterJdbcTemplate.query(query, namedParameters, userExportCustomRowMapper);
        } catch (Exception e) {
            e.printStackTrace();
            log.error("Error in findByExportFlag: \n" + e);
        }
        return userExportCustomList;
    }
}
public class JdbcDaoImpl<T, ID> implements JdbcDao<T, ID> {
    @Autowired
    protected JdbcTemplate jdbcTemplate;
    @Autowired
    protected NamedParameterJdbcTemplate namedParameterJdbcTemplate;

    @Override
    public List<T> findAll() {
        throw new IllegalStateException();
    }

    @Override
    public T save(T api) {
        throw new IllegalStateException();
    }

    @Override
    public T update(T api) {
        throw new IllegalStateException();
    }

    @Override
    public T saveOrUpdate(T api) {
        throw new IllegalStateException();
    }

    @Override
    public T findOne(String unique) {
        throw new IllegalStateException();
    }

    @Override
    public T findOneById(ID id) {
        throw new IllegalStateException();
    }

    @Override
    public void delete(ID id) {
        throw new IllegalStateException();
    }
}
If the query takes more than 1 second, it should throw a timeout exception, but it does not.
The try/catch swallows the exception and stops @Transactional from working; remove the try/catch so the exception can propagate and the timeout takes effect.
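A sketch of the DAO method with the try/catch removed, based on the code in the question, so the timeout exception can propagate to the transaction infrastructure:

@Transactional(readOnly = true, timeout = 1)
public List<UserExportCustom> findByExportFlag(Integer exportFlag, String query) {
    // No try/catch: if the query runs past the 1-second timeout, the exception
    // must propagate out of the @Transactional method for the timeout and
    // rollback to take effect; let the scheduled caller handle or log it.
    SqlParameterSource namedParameters =
            new MapSqlParameterSource().addValue("exportFlag", exportFlag, Types.INTEGER);
    return namedParameterJdbcTemplate.query(query, namedParameters, userExportCustomRowMapper);
}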

The FlatFileItemReader reads only one line from the CSV file - Spring Batch

I'm creating a Spring Batch job to populate data into a database table from a given CSV file.
I created a customized FlatFileItemReader.
My problem is that the read() method is invoked only once, so only the first line of my CSV file is inserted into the database.
@Configuration
@EnableBatchProcessing
public class SpringBatchConfig {
    private MultipartFile[] files;

    @Bean
    public Job job(JobBuilderFactory jobBuilderFactory, StepBuilderFactory stepBuilderFactory,
                   ItemReader<MyModelEntity> itemReader,
                   ItemWriter<MyModelEntity> itemWriter) {
        Step step = stepBuilderFactory.get("Load-CSV-file_STP")
                .<MyModelEntity, MyModelEntity>chunk(12)
                .reader(itemReader)
                .writer(itemWriter)
                .build();
        return jobBuilderFactory.get("Load-CSV-Files")
                .incrementer(new RunIdIncrementer())
                .start(step)
                .build();
    }

    @Bean
    ItemReader<MyModelEntity> myModelCsvReader() throws Exception {
        return new MyModelCsvReader();
    }
}
The MyModelCsvReader:
@Component
@StepScope
public class MyModelCsvReader implements ItemReader<MyModelEntity> {
    @Value("#{jobParameters['SDH']}")
    private String sdhPath;
    private boolean batchJobState = false;

    @Autowired
    MyModelFieldSetMapper myModelFieldSetMapper;

    public LineMapper<MyModelEntity> lineMapper() throws Exception {
        DefaultLineMapper<MyModelEntity> defaultLineMapper = new DefaultLineMapper<MyModelEntity>();
        DelimitedLineTokenizer lineTokenizer = new DelimitedLineTokenizer();
        lineTokenizer.setDelimiter(",");
        lineTokenizer.setStrict(false);
        lineTokenizer.setNames(new String[]{
                "clientId", "ddId", "institName", "progName",
                "qual", "startDate", "endDate", "eType", "country", "comments"
        });
        defaultLineMapper.setLineTokenizer(lineTokenizer);
        defaultLineMapper.setFieldSetMapper(myModelFieldSetMapper);
        return defaultLineMapper;
    }

    @Override
    public MyModelEntity read()
            throws Exception, UnexpectedInputException, ParseException, NonTransientResourceException {
        //if(!batchJobState)
        {
            FlatFileItemReader<MyModelEntity> flatFileItemReader = new FlatFileItemReader<MyModelEntity>();
            flatFileItemReader.setMaxItemCount(2000);
            flatFileItemReader.setResource(new UrlResource("file:\\" + sdhPath));
            flatFileItemReader.setName("CSV-Reader");
            flatFileItemReader.setLinesToSkip(1);
            flatFileItemReader.setLineMapper(lineMapper());
            flatFileItemReader.open(new ExecutionContext());
            batchJobState = true;
            return flatFileItemReader.read();
        }
        // return null;
    }
}
The FieldSetMapper implementation:
@Component
public class MyModelFieldSetMapper implements FieldSetMapper<MyModelEntity> {
    //private SiteService siteService = BeanUtil.getBean(SiteServiceImpl.class);
    @Autowired
    private SiteService siteService;

    @Override
    public MyModelEntity mapFieldSet(FieldSet fieldSet) throws BindException {
        if (fieldSet == null) {
            return null;
        }
        MyModelEntity myModel = new MyModelEntity();
        // setting MyModel attribute values
        return myModel;
    }
}
Any contribution is welcome. Thanks.
// the reader after extending FlatFileItemReader
@Component
@StepScope
public class CustomUserItemReader extends FlatFileItemReader<User> {
    @Value("#{jobParameters['UserCSVPath']}")
    private String UserCSVPath;
    private boolean batchJobState;

    public CustomUserItemReader() throws Exception {
        super();
        setResource(new UrlResource("file:\\" + UserCSVPath));
        setLineMapper(lineMapper());
        afterPropertiesSet();
        setStrict(false);
    }

    public LineMapper<User> lineMapper() throws Exception {
        DefaultLineMapper<User> defaultLineMapper = new DefaultLineMapper<User>();
        DelimitedLineTokenizer lineTokenizer = new DelimitedLineTokenizer();
        lineTokenizer.setDelimiter(",");
        lineTokenizer.setStrict(false);
        lineTokenizer.setNames(new String[]{"name", "dept", "salary", "endDate"});
        defaultLineMapper.setLineTokenizer(lineTokenizer);
        defaultLineMapper.setFieldSetMapper(new CustomUserFieldSetMapper());
        //defaultLineMapper.setFieldSetMapper(fieldSetMapper);
        return defaultLineMapper;
    }

    @Override
    public User read()
            throws Exception, UnexpectedInputException, ParseException, NonTransientResourceException {
        //if(!batchJobState)
        {
            // flatFileItemReader.setMaxItemCount(2000);
            this.setResource(new UrlResource("file:\\" + UserCSVPath));
            this.setName("CSV-Reader");
            this.setLinesToSkip(1);
            //flatFileItemReader.setLineMapper(lineMapper());
            this.open(new ExecutionContext());
            User e = this.read();
            batchJobState = true;
            return e;
        }
        // return null;
    }

    public String getUserCSVPath() {
        return UserCSVPath;
    }

    public void setUserCSVPath(String userCSVPath) {
        UserCSVPath = userCSVPath;
    }
}
Thanks for all your suggestions, even though I had implemented ItemReader<>. I fixed the problem by moving the instantiation of the FlatFileItemReader out of the read() method.
Creating a new FlatFileItemReader on every call meant each one read only the first line.
Thanks.
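A sketch of that fix applied to the MyModelCsvReader above: create and open the FlatFileItemReader once, then only delegate to it in read() (the lineMapper() method is the one already shown in the question):

@Component
@StepScope
public class MyModelCsvReader implements ItemReader<MyModelEntity> {

    private final FlatFileItemReader<MyModelEntity> delegate = new FlatFileItemReader<>();

    @Value("#{jobParameters['SDH']}")
    private String sdhPath;

    @PostConstruct
    public void init() throws Exception {
        // Built and opened once, not on every read() call.
        delegate.setResource(new UrlResource("file:\\" + sdhPath));
        delegate.setLinesToSkip(1);
        delegate.setLineMapper(lineMapper());
        delegate.afterPropertiesSet();
        delegate.open(new ExecutionContext());
    }

    @Override
    public MyModelEntity read() throws Exception {
        // Each call now advances through the file instead of re-reading the first line.
        return delegate.read();
    }
}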

Spring batch run multiple jobs in parallel

I am new to Spring Batch and couldn't figure out how to do this.
Basically I have two Spring Batch configurations, and both have to run in parallel, i.e. when I request execute_job1 then BatchConfig1 has to execute, and when I request execute_job2 then BatchConfig2 has to execute. How can I do this?
Controller
@RestController
public class JobExecutionController {
    @Autowired
    JobLauncher jobLauncher;
    @Autowired
    Job job;

    /**
     * @return
     */
    @RequestMapping("/execute_job1")
    @ResponseBody
    public void executeBatchJob1() {
    }

    /**
     * @return
     */
    @RequestMapping("/execute_job2")
    @ResponseBody
    public void executeBatchJob2() {
    }
}
BatchConfig1
@Configuration
@EnableBatchProcessing
public class BatchConfig {
    @Autowired
    private JobBuilderFactory jobs;
    @Autowired
    private StepBuilderFactory steps;

    @Bean
    public Step stepOne() {
        return steps.get("stepOne")
                .tasklet(new MyTaskOne())
                .build();
    }

    @Bean
    public Step stepTwo() {
        return steps.get("stepTwo")
                .tasklet(new MyTaskTwo())
                .build();
    }

    @Bean
    public Job demoJob() {
        return jobs.get("exportUserJob1")
                .incrementer(new RunIdIncrementer())
                .start(stepOne())
                .next(stepTwo())
                .build();
    }
}
BatchConfig2:
@Configuration
@EnableBatchProcessing
public class BatchConfig2 {
    @Autowired
    public JobBuilderFactory jobBuilderFactory;
    @Autowired
    public StepBuilderFactory stepBuilderFactory;
    @Autowired
    public DataSource dataSource;

    @Bean
    public JdbcCursorItemReader<User> reader() {
        JdbcCursorItemReader<User> reader = new JdbcCursorItemReader<User>();
        reader.setDataSource(dataSource);
        reader.setSql("SELECT id,name FROM user");
        reader.setRowMapper(new UserRowMapper());
        return reader;
    }

    public class UserRowMapper implements RowMapper<User> {
        @Override
        public User mapRow(ResultSet rs, int rowNum) throws SQLException {
            User user = new User();
            user.setId(rs.getInt("id"));
            user.setName(rs.getString("name"));
            return user;
        }
    }

    @Bean
    public UserItemProcessor processor() {
        return new UserItemProcessor();
    }

    @Bean
    public FlatFileItemWriter<User> writer() {
        FlatFileItemWriter<User> writer = new FlatFileItemWriter<User>();
        writer.setResource(new ClassPathResource("users.csv"));
        writer.setLineAggregator(new DelimitedLineAggregator<User>() {
            {
                setDelimiter(",");
                setFieldExtractor(new BeanWrapperFieldExtractor<User>() {
                    {
                        setNames(new String[]{"id", "name"});
                    }
                });
            }
        });
        return writer;
    }

    @Bean
    public Step step1() {
        return stepBuilderFactory.get("step1").<User, User>chunk(10).reader(reader()).processor(processor())
                .writer(writer()).build();
    }

    @Bean
    public Job exportUserJob() {
        return jobBuilderFactory.get("exportUserJob2").incrementer(new RunIdIncrementer()).flow(step1()).end().build();
    }
}
You can run a job with JobLauncher. The code for the controller:
@RestController
public class JobExecutionController {

    public JobExecutionController(JobLauncher jobLauncher,
                                  @Qualifier("demoJob") Job demoJob,
                                  @Qualifier("exportUserJob") Job exportUserJob) {
        this.jobLauncher = jobLauncher;
        this.demoJob = demoJob;
        this.exportUserJob = exportUserJob;
    }

    JobLauncher jobLauncher;
    Job demoJob;
    Job exportUserJob;

    @RequestMapping("/execute_job1")
    @ResponseBody
    public void executeBatchJob1() {
        try {
            JobExecution jobExecution = jobLauncher.run(demoJob, generateJobParameter());
            log.info("Job started in thread: " + jobExecution.getJobParameters().getString("JobThread"));
        } catch (JobExecutionAlreadyRunningException | JobRestartException | JobParametersInvalidException | JobInstanceAlreadyCompleteException e) {
            log.error("Something went wrong during job execution", e);
        }
    }

    @RequestMapping("/execute_job2")
    @ResponseBody
    public void executeBatchJob2() {
        try {
            JobExecution jobExecution = jobLauncher.run(exportUserJob, generateJobParameter());
            log.info("Job started in thread: " + jobExecution.getJobParameters().getString("JobThread"));
        } catch (JobExecutionAlreadyRunningException | JobRestartException | JobParametersInvalidException | JobInstanceAlreadyCompleteException e) {
            log.error("Something went wrong during job execution", e);
        }
    }

    private JobParameters generateJobParameter() {
        Map<String, JobParameter> parameters = new HashMap<>();
        parameters.put("Job start time", new JobParameter(Instant.now().toEpochMilli()));
        parameters.put("JobThread", new JobParameter(Thread.currentThread().getId()));
        return new JobParameters(parameters);
    }
}
To prevent your jobs from running at application startup, add the following Spring Batch setting to application.properties: spring.batch.job.enabled=false
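Note also that the JobLauncher auto-configured by @EnableBatchProcessing is synchronous, so each HTTP request blocks until its job finishes. If the two jobs should genuinely run in parallel, a launcher backed by an async task executor is the usual approach; a minimal sketch (the bean name is illustrative):

@Bean
public JobLauncher asyncJobLauncher(JobRepository jobRepository) throws Exception {
    SimpleJobLauncher jobLauncher = new SimpleJobLauncher();
    jobLauncher.setJobRepository(jobRepository);
    // Hand each job off to its own thread so REST calls return immediately
    // and both jobs can run concurrently.
    jobLauncher.setTaskExecutor(new SimpleAsyncTaskExecutor());
    jobLauncher.afterPropertiesSet();
    return jobLauncher;
}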

JpaPagingItemReader - endless Loop on failed read

Hi, I am getting an endless loop when an exception is thrown.
Is there something wrong with the transaction manager I am using?
Thanks!
Below is my job setup:
@Bean
public ItemReader<Entity> bizAppReader() {
    @SuppressWarnings("serial")
    Map<String, Object> map = new HashMap<String, Object>() {{
        put("status", Arrays.asList(new ApplicationStatus[]{ApplicationStatus.COMPLETED, ApplicationStatus.PENDING, ApplicationStatus.INITIATE}));
    }};
    return new JpaPagingItemReaderBuilder<Entity>()
            .name("bizAppJpaReader")
            .entityManagerFactory(entityManagerFactory)
            .queryString("from Entity b where b.status in (:status)").parameterValues(map)
            .build();
}

@Bean
public Step bizApplicatonStep(@Value("${batch.chunk_size:10}") int chunkSize,
                              ItemReader<Entity> bizAppReader,
                              ItemProcessor<Entity, Map<ApplicationStatus, String>> bizAppProcessor,
                              ItemWriter<Map<ApplicationStatus, String>> bizAppWriter) {
    return stepBuilderFactory.get("bizApplicatonStep")
            .<Entity, Map<ApplicationStatus, String>>chunk(chunkSize).reader(bizAppReader)
            .processor(bizAppProcessor).writer(bizAppWriter)
            .faultTolerant()
            .skip(Exception.class)
            .noRollback(Exception.class)
            .skipPolicy((e, i) -> {
                log.error("Exception occurred : ", e);
                return true;
            })
            .allowStartIfComplete(true)
            .build();
}
@Slf4j
@SpringBootApplication
@EnableBatchProcessing
public class BatchApplication extends DefaultBatchConfigurer {
    public static void main(String[] args) throws Exception {
        ConfigurableApplicationContext context = SpringApplication.run(BatchApplication.class, args);
        JobLauncher jobLauncher = context.getBean(JobLauncher.class);
        Job job;
        JobParameters jobParameters;
        if (ArrayUtils.isNotEmpty(args) && args[0].equals("bookingJob")) {
            job = context.getBean("bookingJob", Job.class);
            jobParameters = new JobParametersBuilder()
                    .addString("startMonth", args[1])
                    .addString("endMonth", args[2])
                    .toJobParameters();
        } else {
            job = context.getBean("bizApplicationJob", Job.class);
            jobParameters = new JobParametersBuilder().toJobParameters();
        }
        JobExecution jobExecution = jobLauncher.run(job, jobParameters);
        if (BatchStatus.FAILED.equals(jobExecution.getStatus())) {
            log.error("job execution completed, exit code is: 1");
            System.exit(1);
        }
        log.info("job execution completed, exit code is: 0");
    }

    @Override
    protected JobRepository createJobRepository() throws Exception {
        return new MapJobRepositoryFactoryBean(new ResourcelessTransactionManager()).getObject();
    }
}
Stacktrace:
10:13:20.657 [main] ERROR c.u.pweb.bizacct.batch.BizJobConfig - Exception occurred :
java.lang.IllegalStateException: Transaction already active
at org.hibernate.engine.transaction.internal.TransactionImpl.begin(TransactionImpl.java:52)
===================== Update: application to replicate =====================
This happens if the reader fails to read because of an invalid enum value in the DB.
@Slf4j
@SpringBootApplication
@EnableBatchProcessing
public class DemoBatchApplication extends DefaultBatchConfigurer {
    public static void main(String[] args) {
        SpringApplication.run(DemoBatchApplication.class, args);
    }

    @Override
    public void setDataSource(DataSource dataSource) {
        super.setDataSource(null);
    }

    private final EntityManagerFactory entityManagerFactory;

    public DemoBatchApplication(EntityManagerFactory entityManagerFactory) {
        this.entityManagerFactory = entityManagerFactory;
    }

    @Override
    public PlatformTransactionManager getTransactionManager() {
        return new JpaTransactionManager(this.entityManagerFactory);
    }
}
enum
public enum ApplicationStatus {
MA, AS, SH, CP, AA, AP
}
table
@Entity
@Table(name = "APP_TBL_TEST")
@Data
@Builder
@AllArgsConstructor
@NoArgsConstructor
public class ApplicationEntity {
    @Id
    @GeneratedValue(generator = "system-uuid")
    @GenericGenerator(name = "system-uuid", strategy = "uuid2")
    @Column(name = "ID", length = 50)
    private String id;

    @Column(name = "APPLICATION_NAME", unique = true, length = 50)
    private String referenceNumber;

    @Column(name = "STATUS")
    @Enumerated(EnumType.STRING)
    private ApplicationStatus status;
}
batch config
@Configuration
@Slf4j
@Data
public class ApplicationBatchCfg {
    @Autowired
    public EntityManagerFactory entityManagerFactory;
    @Autowired
    public JobBuilderFactory jobBuilderFactory;
    @Autowired
    public StepBuilderFactory stepBuilderFactory;

    @Bean
    public Step applicationStep(@Value("${batch.chunk_size:5}") int chunkSize,
                                ItemReader<ApplicationEntity> appReader,
                                ItemProcessor<ApplicationEntity, String> appProcessor,
                                ItemWriter<String> appWriter) {
        return stepBuilderFactory.get("applicatonStep")
                .<ApplicationEntity, String>chunk(chunkSize)
                .reader(appReader)
                .processor(appProcessor)
                .writer(appWriter)
                .faultTolerant()
                .skip(Exception.class)
                .noRollback(Exception.class)
                .skipPolicy((e, i) -> {
                    log.error("Exception occurred : ", e);
                    return true;
                })
                .skipLimit(5)
                .allowStartIfComplete(true)
                .build();
    }

    @Bean
    public Job applicationJob(Step applicationStep) {
        return jobBuilderFactory.get("applicationJob").incrementer(new RunIdIncrementer())
                .start(applicationStep).build();
    }

    @Bean
    public ItemReader<ApplicationEntity> appReader() {
        return new JpaPagingItemReaderBuilder<ApplicationEntity>()
                .name("appReader")
                .entityManagerFactory(entityManagerFactory)
                .queryString("from ApplicationEntity")
                .pageSize(3)
                .build();
    }

    @Bean
    public ItemProcessor<ApplicationEntity, String> appProcessor() {
        return new ItemProcessor<ApplicationEntity, String>() {
            @Override
            public String process(ApplicationEntity item) throws Exception {
                // nothing, for demo only
                log.info("item {}", item);
                return null;
            }
        };
    }

    @Bean
    public ItemWriter<String> appWriter() {
        return new ItemWriter<String>() {
            @Override
            public void write(List<? extends String> items) throws Exception {
                // nothing, for demo only
            }
        };
    }
}
Inside the yml:
spring:
  jpa:
    properties:
      hibernate:
        show_sql: true
        use_sql_comments: true
        format_sql: true
        dialect: org.hibernate.dialect.Oracle10gDialect
  datasource:
    useWallet: false
    url: jdbc:oracle:thin:@DB12397:1521:azd
    platform: oracle
    username: user
    password: pwd

Spring Batch: pass data between reader and writer

I would like to get data in the Writer that I've set in the Reader of my step. I know about the ExecutionContexts (step and job) and about the ExecutionContextPromotionListener via http://docs.spring.io/spring-batch/trunk/reference/html/patterns.html#passingDataToFutureSteps
The problem is that in the Writer I'm retrieving a null value for 'npag'.
The line in the ItemWriter:
LOG.info("INSIDE WRITE, NPAG: " + nPag);
I've been trying some workarounds without luck, and looking at answers to other similar questions... Any help? Thanks!
Here's my code:
READER
@Component
public class LCItemReader implements ItemReader<String> {
    private StepExecution stepExecution;
    private int nPag = 1;

    @Override
    public String read() throws CustomItemReaderException {
        ExecutionContext stepContext = this.stepExecution.getExecutionContext();
        stepContext.put("npag", nPag);
        nPag++;
        return "content";
    }

    @BeforeStep
    public void saveStepExecution(StepExecution stepExecution) {
        this.stepExecution = stepExecution;
    }
}
WRITER
@Component
@StepScope
public class LCItemWriter implements ItemWriter<String> {
    private String nPag;

    @Override
    public void write(List<? extends String> continguts) throws Exception {
        try {
            LOG.info("INSIDE WRITE, NPAG: " + nPag);
        } catch (Throwable ex) {
            LOG.error("Error: " + ex.getMessage());
        }
    }

    @BeforeStep
    public void retrieveInterstepData(StepExecution stepExecution) {
        JobExecution jobExecution = stepExecution.getJobExecution();
        ExecutionContext jobContext = jobExecution.getExecutionContext();
        this.nPag = jobContext.get("npag").toString();
    }
}
JOB/STEP BATCH CONFIG
@Bean
public Job lCJob() {
    return jobs.get("lCJob")
            .listener(jobListener)
            .start(lCStep())
            .build();
}

@Bean
public Step lCStep() {
    return steps.get("lCStep")
            .<String, String>chunk(1)
            .reader(lCItemReader)
            .processor(lCProcessor)
            .writer(lCItemWriter)
            .listener(promotionListener())
            .build();
}
LISTENER
@Bean
public ExecutionContextPromotionListener promotionListener() {
    ExecutionContextPromotionListener executionContextPromotionListener = new ExecutionContextPromotionListener();
    executionContextPromotionListener.setKeys(new String[]{"npag"});
    return executionContextPromotionListener;
}
The ExecutionContextPromotionListener specifically states that it works at the end of a step, i.e. after the writer executes, so the promotion you are counting on does not occur when you think it does.
If I were you, I would set the value in the step context and read it from there if you need it within a single step; otherwise I would set it in the job context.
The other aspect is @BeforeStep. That marks a method to execute before the step context exists, whereas the way you are setting the nPag value in the reader happens after the step has started executing.
You are trying to read the value of nPag before it is set in the reader, ending up with the default value, which is null. You need to read the value of nPag from the execution context directly at the time of logging. You can keep a reference to the jobContext. Try this:
@Component
@StepScope
public class LCItemWriter implements ItemWriter<String> {
    private String nPag;
    private ExecutionContext jobContext;

    @Override
    public void write(List<? extends String> continguts) throws Exception {
        try {
            this.nPag = jobContext.get("npag").toString();
            LOG.info("INSIDE WRITE, NPAG: " + nPag);
        } catch (Throwable ex) {
            LOG.error("Error: " + ex.getMessage());
        }
    }

    @BeforeStep
    public void retrieveInterstepData(StepExecution stepExecution) {
        JobExecution jobExecution = stepExecution.getJobExecution();
        jobContext = jobExecution.getExecutionContext();
    }
}
In your Reader and Writer you need to implement the ItemStream interface and keep the ExecutionContext as a member variable. Here I have given an example with a Processor instead of a Writer, but the same is applicable to a Writer as well. It's working fine for me, and I am able to pass values from the reader to the processor.
I set the value in the context in the reader and get the value in the processor.
public class EmployeeItemReader implements ItemReader<Employee>, ItemStream {
    ExecutionContext context;

    @Override
    public Employee read() throws Exception, UnexpectedInputException, ParseException, NonTransientResourceException {
        context.put("ajay", "i am going well");
        Employee emp = new Employee();
        emp.setEmpId(1);
        emp.setFirstName("ajay");
        emp.setLastName("goswami");
        return emp;
    }

    @Override
    public void close() throws ItemStreamException {
        // TODO Auto-generated method stub
    }

    @Override
    public void open(ExecutionContext arg0) throws ItemStreamException {
        context = arg0;
    }

    @Override
    public void update(ExecutionContext arg0) throws ItemStreamException {
        // TODO Auto-generated method stub
        context = arg0;
    }
}
My processor
public class CustomItemProcessor implements ItemProcessor<Employee, ActiveEmployee>, ItemStream {
    ExecutionContext context;

    @Override
    public ActiveEmployee process(Employee emp) throws Exception {
        // See this line
        System.out.println(context.get("ajay"));
        ActiveEmployee actEmp = new ActiveEmployee();
        actEmp.setEmpId(emp.getEmpId());
        actEmp.setFirstName(emp.getFirstName());
        actEmp.setLastName(emp.getLastName());
        actEmp.setAdditionalInfo("Employee is processed");
        return actEmp;
    }

    @Override
    public void close() throws ItemStreamException {
        // TODO Auto-generated method stub
    }

    @Override
    public void open(ExecutionContext arg0) throws ItemStreamException {
        // TODO Auto-generated method stub
    }

    @Override
    public void update(ExecutionContext arg0) throws ItemStreamException {
        context = arg0;
    }
}
Hope this helps.
