spring boot spring batch: how to set query dynamically to ItemReader

I am new to Spring. I have a use case where I need to execute multiple SQL queries and return the same POJO for each query. I would like to write one item reader and change the query in each step. Is there a way to do this?

You can use Spring Batch late binding by adding @StepScope to your reader.
Sample code:
@StepScope
@Bean
public ItemReader<Pojo> myReader() {
    JdbcCursorItemReader<Pojo> reader = new JdbcCursorItemReader<>();
    reader.setDataSource(basicDataSource);
    // You can inject the SQL as per your need.
    // Some examples:
    // using #{jobParameters['']}
    // using #{jobExecutionContext['input.file.name']}
    // using #{stepExecutionContext['input.file.name']}
    reader.setSql("Your-SQL");
    reader.setRowMapper(new MyMapper());
    return reader;
}
Check section 5.4 of the Spring Batch reference documentation:
https://docs.spring.io/spring-batch/reference/html/configureStep.html
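For instance, the query can be bound from a job parameter so the reader resolves its SQL at execution time rather than at startup. A minimal sketch, assuming an illustrative parameter key 'query':
@StepScope
@Bean
public JdbcCursorItemReader<Pojo> myReader(
        @Value("#{jobParameters['query']}") String sql) {
    JdbcCursorItemReader<Pojo> reader = new JdbcCursorItemReader<>();
    reader.setDataSource(basicDataSource);
    reader.setSql(sql); // resolved when the step runs, via late binding
    reader.setRowMapper(new MyMapper());
    return reader;
}
The same mechanism works with #{stepExecutionContext[...]} if each step populates its own query key, which lets every step of the job run a different query against the same reader definition.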

Related

Conditional writer for spring batch

I use Spring Boot and Spring Batch.
public ItemWriter<T> writerOne() {
    // ItemWriter is an interface, so return a lambda (or anonymous class)
    return items -> {
        // your logic here
    };
}
public ItemWriter<T> writerTwo() {
    return items -> {
        // your logic here
    };
}
public CompositeItemWriter<T> compositeItemWriter() {
    CompositeItemWriter<T> writer = new CompositeItemWriter<>();
    writer.setDelegates(Arrays.asList(writerOne(), writerTwo()));
    return writer;
}
I read a CSV file, process it, and afterwards need to call two writers.
Depending on a field value, writerTwo must be called.
Is there any way to achieve this?
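One way to do this (a hedged sketch, not from the thread itself) is Spring Batch's ClassifierCompositeItemWriter, which routes each item to a delegate chosen by a classifier. The item type MyItem, its getField() accessor, and the matched value are all illustrative:
@Bean
public ClassifierCompositeItemWriter<MyItem> classifierWriter() {
    ClassifierCompositeItemWriter<MyItem> writer = new ClassifierCompositeItemWriter<>();
    // send matching items to writerTwo(), everything else to writerOne()
    writer.setClassifier(item ->
            "some-value".equals(item.getField()) ? writerTwo() : writerOne());
    return writer;
}
If the delegates are ItemStreams (e.g. file writers), they also need to be registered on the step via stream() so they are opened and closed properly.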

Spring Batch - How to reads 5 million records in faster ways?

I'm developing a Spring Boot v2.2.5.RELEASE and Spring Batch example. In this example, I'm reading 5 million records using JdbcPagingItemReader from a Postgres system in one data-center and writing them into MongoDB in another data-center.
This migration is too slow, and I need to improve the performance of this batch job. I'm not sure how to use partitioning, because the PK of that table holds UUID values, so I can't think of using ColumnRangePartitioner. Is there a best approach to implement this?
Approach-1:
@Bean
public JdbcPagingItemReader<Customer> customerPagingItemReader() {
    // reading database records using JDBC in a paging fashion
    JdbcPagingItemReader<Customer> reader = new JdbcPagingItemReader<>();
    reader.setDataSource(this.dataSource);
    reader.setFetchSize(1000);
    reader.setRowMapper(new CustomerRowMapper());
    // sort keys
    Map<String, Order> sortKeys = new HashMap<>();
    sortKeys.put("cust_id", Order.ASCENDING);
    // Postgres implementation of a PagingQueryProvider using database-specific features
    PostgresPagingQueryProvider queryProvider = new PostgresPagingQueryProvider();
    queryProvider.setSelectClause("*");
    queryProvider.setFromClause("from customer");
    queryProvider.setSortKeys(sortKeys);
    reader.setQueryProvider(queryProvider);
    return reader;
}
For the Mongo writer, I've used Spring Data Mongo as a custom writer.
Job details:
@Bean
public Job multithreadedJob() {
    return this.jobBuilderFactory.get("multithreadedJob")
            .start(step1())
            .build();
}
@Bean
public Step step1() {
    ThreadPoolTaskExecutor taskExecutor = new ThreadPoolTaskExecutor();
    taskExecutor.setCorePoolSize(4);
    taskExecutor.setMaxPoolSize(4);
    taskExecutor.afterPropertiesSet();
    return this.stepBuilderFactory.get("step1")
            .<Customer, Customer>chunk(100)
            .reader(customerPagingItemReader())
            .writer(writer(null))
            .taskExecutor(taskExecutor)
            .build();
}
Approach-2: Would AsyncItemProcessor and AsyncItemWriter be the better option, given that I still have to read using the same JdbcPagingItemReader?
Approach-3: Partitioning. How do I use it when my PK is a UUID?
Partitioning (approach 3) is the best option IMO. If your primary key is a String, you can try to create a compound key (aka a combination of columns to make up a unique key).
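As a hedged illustration of that idea (not from the answer itself), a UUID key can also be partitioned by hashing it and splitting on the remainder. The class and column names below are illustrative, and the WHERE-clause trick assumes Postgres:
import java.util.HashMap;
import java.util.Map;
import org.springframework.batch.core.partition.support.Partitioner;
import org.springframework.batch.item.ExecutionContext;

public class UuidHashPartitioner implements Partitioner {

    @Override
    public Map<String, ExecutionContext> partition(int gridSize) {
        Map<String, ExecutionContext> partitions = new HashMap<>();
        for (int i = 0; i < gridSize; i++) {
            ExecutionContext context = new ExecutionContext();
            context.putInt("mod", i);             // remainder this partition handles
            context.putInt("gridSize", gridSize); // total number of partitions
            partitions.put("partition" + i, context);
        }
        return partitions;
    }
}
Each worker step would then use a @StepScope reader whose WHERE clause filters on the partition keys, e.g. where mod(abs(hashtext(cust_id::text)), #{stepExecutionContext['gridSize']}) = #{stepExecutionContext['mod']} (hashtext is Postgres-specific; any stable hash of the UUID works).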

Writing to multiple files dynamically in Spring Batch

In Spring Batch I configure a file writer as such:
@Bean
public FlatFileItemWriter<MyObject> flatFileItemWriter() throws Exception {
    FlatFileItemWriter<MyObject> itemWriter = new FlatFileItemWriter<>();
    // the pass-through aggregator just calls toString on any item passed in
    itemWriter.setLineAggregator(new PassThroughLineAggregator<>());
    String outputPath = File.createTempFile("output", ".out").getAbsolutePath();
    System.out.println(">>output path=" + outputPath);
    itemWriter.setResource(new FileSystemResource(outputPath));
    itemWriter.afterPropertiesSet();
    return itemWriter;
}
What happens if MyObject is a complex structure that can vary depending on configuration settings, etc., and I want to write different parts of that structure to different files?
How do I do this?
Have you looked at CompositeItemWriter? You may need a CompositeLineMapper in your reader, as well as a ClassifierCompositeItemProcessor, depending on your needs.
Below is an example of a CompositeItemWriter:
@Bean
public ItemWriter fileWriter() {
    CompositeItemWriter compWriter = new CompositeItemWriter();
    // note: each delegate still needs its own resource and line aggregator configured
    FlatFileItemWriter<MyObject_data> dataWriter = new FlatFileItemWriter<>();
    FlatFileItemWriter<MyObject_otherdata> otherWriter = new FlatFileItemWriter<>();
    List<ItemWriter> iList = new ArrayList<>();
    iList.add(dataWriter);
    iList.add(otherWriter);
    compWriter.setDelegates(iList);
    return compWriter;
}
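To actually send different parts of the object to different files, each delegate extracts only its own fields. A hedged sketch of configuring one delegate (the path and property name are illustrative):
FlatFileItemWriter<MyObject> dataWriter = new FlatFileItemWriter<>();
dataWriter.setResource(new FileSystemResource("data.out")); // illustrative output path
BeanWrapperFieldExtractor<MyObject> extractor = new BeanWrapperFieldExtractor<>();
extractor.setNames(new String[] {"dataPart"}); // illustrative property of MyObject
DelimitedLineAggregator<MyObject> aggregator = new DelimitedLineAggregator<>();
aggregator.setDelimiter(",");
aggregator.setFieldExtractor(extractor);
dataWriter.setLineAggregator(aggregator);
dataWriter.afterPropertiesSet();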

spring-boot MyBatisBatchItemWriter Cannot change the ExecutorType when there is an existing transaction

I'm using Spring Boot and MyBatis' MyBatisBatchItemWriter. In a demo, writing data to the database (MySQL) works with no problem, but when used in my project I get:
org.springframework.dao.TransientDataAccessResourceException: Cannot change the ExecutorType when there is an existing transaction
at org.mybatis.spring.SqlSessionUtils.getSqlSession(SqlSessionUtils.java:91) ~[mybatis-spring-1.2.2.jar:1.2.2]
at org.mybatis.spring.SqlSessionTemplate$SqlSessionInterceptor.invoke(SqlSessionTemplate.java:353) ~[mybatis-spring-1.2.2.jar:1.2.2]
at com.sun.proxy.$Proxy45.update(Unknown Source) ~[na:na]
This is my demo:
@Bean
public MyBatisBatchItemWriter<Hfbank> writer() {
    MyBatisBatchItemWriter<Hfbank> writer = new MyBatisBatchItemWriter<>();
    writer.setSqlSessionFactory(sqlSessionFactory);
    String statementId = "com.springboot.dao.HfbankDao.insertSelective";
    writer.setStatementId(statementId);
    CompositeItemWriter compositeItemWriter = new CompositeItemWriter();
    List delegates = new ArrayList();
    delegates.add(writer);
    compositeItemWriter.setDelegates(delegates);
    writer.setAssertUpdates(false);
    return writer;
}
This is my project's MyBatisBatchItemWriter:
@Bean
@StepScope
public MyBatisBatchItemWriter<ChannelDataInfo> writer(
        @Value("#{jobParameters['channelid']}") Long channelid) {
    MyBatisBatchItemWriter<ChannelDataInfo> writer = new MyBatisBatchItemWriter<>();
    SqlSession sqlSession = sqlSessionFactory.openSession(ExecutorType.BATCH);
    writer.setSqlSessionFactory(sqlSessionFactory);
    String statementId = "com.kaigejava.fundcheck.repository.ChannelDataInfoRepository.insertSelective";
    writer.setStatementId(statementId);
    CompositeItemWriter compositeItemWriter = new CompositeItemWriter();
    List delegates = new ArrayList();
    delegates.add(writer);
    compositeItemWriter.setDelegates(delegates);
    writer.setAssertUpdates(false);
    return writer;
}
Why is the demo OK, but my project throws this error?
Because it says exactly that: you can't change the executor type inside a transaction.
It looks like you've tried to batch-write something as part of a broader transaction that includes other SQL operations, but that transaction was started with the SIMPLE (default) or REUSE executor type.
Batch writes require the BATCH executor type, but once a transaction has started, its executor type cannot be changed. So perform your batch operations in a separate transaction, or run a nested transaction if your RDBMS allows it.
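A minimal sketch of the separate-transaction suggestion, assuming Spring transaction management and a mapper backed by a SqlSessionTemplate configured with ExecutorType.BATCH (the service, repository, and method names are illustrative):
@Service
public class BatchWriteService {

    private final ChannelDataInfoRepository repository; // assumed BATCH-executor mapper

    public BatchWriteService(ChannelDataInfoRepository repository) {
        this.repository = repository;
    }

    // REQUIRES_NEW suspends the outer SIMPLE/REUSE transaction so the batch
    // write runs in its own transaction with the BATCH executor type.
    @Transactional(propagation = Propagation.REQUIRES_NEW)
    public void batchInsert(List<ChannelDataInfo> items) {
        items.forEach(repository::insertSelective);
    }
}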

Spring-Batch Java Based FileItemWriter for CSV

I have a Spring Batch service containing an ItemWriter to write data to a CSV.
I have used the example given by the Spring Batch guide: https://spring.io/guides/gs/batch-processing/
I tried to modify the ItemWriter to create the CSV again.
The problems I am facing are:
1. It is not creating the CSV file if it is not present.
2. If I make it available beforehand, it does not write data to it.
@Bean
public ItemWriter<Person> writer(DataSource dataSource) {
    FlatFileItemWriter<Person> csvWriter = new FlatFileItemWriter<>();
    csvWriter.setResource(new ClassPathResource("csv/new-data.csv"));
    csvWriter.setShouldDeleteIfExists(true);
    DelimitedLineAggregator<Person> lineAggregator = new DelimitedLineAggregator<>();
    lineAggregator.setDelimiter(",");
    BeanWrapperFieldExtractor<Person> fieldExtractor = new BeanWrapperFieldExtractor<>();
    String[] names = {"firstName", "lastName"};
    fieldExtractor.setNames(names);
    lineAggregator.setFieldExtractor(fieldExtractor);
    csvWriter.setLineAggregator(lineAggregator);
    return csvWriter;
}
I have gone through various links, but they show examples with XML-based configuration. How do I do it completely in Java?
You are using a ClassPathResource to write. I'm not sure, but I don't think you can write to a ClassPathResource. Try using a normal FileSystemResource and try again.
Moreover, how do you inject the writer? Are you sure that it really is instantiated as a Spring bean?
Also, why do you have DataSource as a parameter, since you don't need a data source to instantiate a FlatFileItemWriter?
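A minimal sketch of that suggestion (the output path is illustrative):
// Write to the file system instead of the classpath; FlatFileItemWriter
// creates the file when the step opens the writer.
csvWriter.setResource(new FileSystemResource("output/new-data.csv"));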
