Spring Batch: can a classifier return multiple writers?

I am new to Spring Batch.
I have a large master table that gets merged into 5 separate tables.
My requirement is to create writers that write to these 5 separate tables, but the writers need to be invoked conditionally.
For example: if a certain field is not set in the master table, I'll call 2 of the writers and skip the other 3.
I used a composite writer with a classifier to check the condition, but a classifier only returns 1 writer. Can the classifier return multiple writers, or is there another class that can satisfy my requirement?

You can create a custom ItemWriter that combines several ItemWriters to handle a specific case. For example, suppose there are two cases that require different ItemWriters. Case 1 requires writing to table1 and table2:
@Component
public class Case1ItemWriter implements ItemWriter<Foo> {

    @Autowired
    private JdbcBatchItemWriter<Foo> writer1; // writes to table1

    @Autowired
    private JdbcBatchItemWriter<Foo> writer2; // writes to table2

    @Override
    public void write(List<? extends Foo> items) throws Exception {
        writer1.write(items);
        writer2.write(items);
    }
}
And case 2 requires writing to table3 and table4:
@Component
public class Case2ItemWriter implements ItemWriter<Foo> {

    @Autowired
    private JdbcBatchItemWriter<Foo> writer3; // writes to table3

    @Autowired
    private JdbcBatchItemWriter<Foo> writer4; // writes to table4

    @Override
    public void write(List<? extends Foo> items) throws Exception {
        writer3.write(items);
        writer4.write(items);
    }
}
Then implement a Classifier that determines which case applies and returns the corresponding ItemWriter:
@Component
public class MyClassifier implements Classifier<Foo, ItemWriter<Foo>> {

    @Autowired
    private Case1ItemWriter case1ItemWriter;

    @Autowired
    private Case2ItemWriter case2ItemWriter;

    @Override
    public ItemWriter<Foo> classify(Foo foo) {
        if (foo.isBlabBlaBla()) {
            return case1ItemWriter;
        } else {
            // ...
            return case2ItemWriter;
        }
    }
}
Finally, configure this Classifier on the ClassifierCompositeItemWriter.
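For completeness, a minimal sketch of that last wiring step (the bean method name and the Foo item type are illustrative, not from the original post):
@Bean
public ClassifierCompositeItemWriter<Foo> classifierItemWriter(MyClassifier myClassifier) {
    ClassifierCompositeItemWriter<Foo> writer = new ClassifierCompositeItemWriter<>();
    // Each item is routed to whichever single ItemWriter the classifier returns for it
    writer.setClassifier(myClassifier::classify);
    return writer;
}
Note that the classifier still returns exactly one writer per item; the trick is that this one writer internally delegates to several writers. Instead of the hand-written Case1ItemWriter/Case2ItemWriter, each case could also be a CompositeItemWriter configured with the relevant delegates.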

Related

Access Job Parameter in Custom ItemProcessor

I am implementing a custom ItemProcessor<I, O> in Spring Batch to process data from a REST API.
I want to access some values from the job parameters inside my ItemProcessor class.
Any suggestion on how to do that?
In a Tasklet we can access the JobParameters, but I am not sure how to do it in an ItemProcessor.
MyItemProcessor.java
@Component
public class MyItemProcessor implements ItemProcessor<User, UserDetails> {

    @Override
    public UserDetails process(User user) throws Exception {
        // access values from job parameters here
        return null;
    }
}
You can make your item processor step-scoped and inject job parameters in it. The following is one way of doing that:
@Component
@StepScope
public class MyItemProcessor implements ItemProcessor<User, UserDetails> {

    @Value("#{jobParameters}")
    private JobParameters jobParameters;

    @Override
    public UserDetails process(User user) throws Exception {
        // access values from job parameters here
        return null;
    }
}
You could also inject a specific parameter if you want with something like the following:
@Component
@StepScope
public class MyItemProcessor implements ItemProcessor<User, UserDetails> {

    @Value("#{jobParameters['myParameter']}")
    private String myParameter;

    @Override
    public UserDetails process(User user) throws Exception {
        // use myParameter as needed here
        return null;
    }
}
Since field injection is not recommended, you can inject job parameters in your item processor when you define it as a bean, something like:
// Note how nothing related to Spring is used here, and the processor can be unit tested as a regular Java class
public class MyItemProcessor implements ItemProcessor<User, UserDetails> {

    private String myParameter;

    public MyItemProcessor(String myParameter) {
        this.myParameter = myParameter;
    }

    @Override
    public UserDetails process(User user) throws Exception {
        // use this.myParameter as needed here
        return null;
    }
}
Once that is in place, you can declare your item processor bean as follows:
@Bean
@StepScope
public MyItemProcessor itemProcessor(@Value("#{jobParameters['myParameter']}") String myParameter) {
    return new MyItemProcessor(myParameter);
}
For more details about scoped beans, please check the documentation here: Late Binding of Job and Step Attributes.
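As a usage sketch, the parameter referenced above would be supplied when launching the job, for example with a JobParametersBuilder (the jobLauncher and job variables here are assumed to be available; they are not part of the original answer):
JobParameters params = new JobParametersBuilder()
        .addString("myParameter", "some-value")
        .toJobParameters();
// The step-scoped processor is instantiated at step execution time,
// with 'myParameter' resolved from these job parameters
jobLauncher.run(job, params);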

The 2nd step cannot fetch data inserted by the 1st step

I configured 2 steps in my job.
The first step reads from a CSV file and writes it to the database.
Below is the code of the writer:
public class TempTableWriter implements ItemWriter<Module> {

    @Autowired
    DataSource ds;

    @Autowired
    ModuleService moduleService;

    @Override
    public void write(List<? extends Module> items) throws Exception {
        moduleService.updateModuleList(items);
    }
}

@Service
@Transactional(propagation = Propagation.REQUIRED)
public class ModuleService {

    @Autowired
    ModuleDao moduleDao;

    @Transactional(isolation = Isolation.READ_UNCOMMITTED)
    public List<Module> getModuleList() {
        return moduleDao.getModuleList();
    }

    public void updateModuleList(List<? extends Module> items) {
        items.forEach(item -> {
            // p1..p4 stand for the item's column values (kept as in the original post)
            moduleDao.updateModuleList(p1, p2, p3, p4);
        });
    }
}

@Repository
public class ModuleDao {

    @Autowired
    @Qualifier("moduleMapper")
    private RowMapper moduleMapper;

    @Autowired
    private JdbcTemplate jdbctemplate;

    public List<Module> getModuleList() {
        return jdbctemplate.query("select * from [schema].[t1] ORDER BY p1", moduleMapper);
    }

    public void updateModuleList(String requestId, String fileName, String orcStatus, String context) {
        jdbctemplate.update("insert into [schema].[t1] values(?,?,?,?)", requestId, fileName, orcStatus, context);
    }
}
Then in the 2nd step, when I try to fetch the data stored by step 1, I always get null; but after the job completes, I can see that the data really is stored in the database.
Similarly, I tried to use JdbcBatchItemWriter to save the records, but I still cannot fetch them in the next step.

Preprocessing in a Spring Boot batch job with multiple threads

I have a multi-threaded Spring Batch job. In my processor I want to use a global variable, say a map. The map contains some values that are queried from a table and used by the processor. How can I achieve this? If I put the logic that populates the map in the processor, the query will be executed for every record fetched by the item reader, which would be millions of records. Is there a way to do this?
You can intercept the step execution; see the Spring Batch Reference Documentation, section 5.1.10, Intercepting Step Execution.
For example, you can implement the StepExecutionListener interface:
@Component
@JobScope
public class Processor implements ItemProcessor<Integer, Integer>, StepExecutionListener {

    private final Map<String, String> map = new HashMap<>();

    @Override
    public void beforeStep(StepExecution stepExecution) {
        // initialize the variable once before the step
        map.put("KEY", "VALUE");
    }

    @Override
    public Integer process(Integer item) throws Exception {
        // use the variable for each item
        final String key = map.get("KEY");
        // ...
        return item;
    }

    // ... (the remaining StepExecutionListener method, afterStep, is omitted)
}
Or use the @BeforeStep annotation:
@Component
@JobScope
public class Processor implements ItemProcessor<Integer, Integer> {

    private final Map<String, String> map = new HashMap<>();

    @BeforeStep
    public void beforeStep(StepExecution stepExecution) {
        // initialize the variable once before the step
        map.put("KEY", "VALUE");
    }

    @Override
    public Integer process(Integer item) throws Exception {
        // use the variable for each item
        final String key = map.get("KEY");
        // ...
        return item;
    }
}
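If the map really does need to come from a table, as described in the question, the lookup can run once in beforeStep. A rough sketch, assuming a JdbcTemplate and a hypothetical lookup_table with key/value columns (none of these names come from the original post):
@Component
@JobScope
public class Processor implements ItemProcessor<Integer, Integer> {

    private final Map<String, String> map = new HashMap<>();

    @Autowired
    private JdbcTemplate jdbcTemplate;

    @BeforeStep
    public void beforeStep(StepExecution stepExecution) {
        // Runs once before the step starts, so the query is not repeated per item
        for (Map<String, Object> row : jdbcTemplate.queryForList(
                "select lookup_key, lookup_value from lookup_table")) {
            map.put((String) row.get("lookup_key"), (String) row.get("lookup_value"));
        }
    }

    @Override
    public Integer process(Integer item) throws Exception {
        // Worker threads only read the map here, so the single upfront query is enough
        String value = map.get(String.valueOf(item));
        return value != null ? item : null; // e.g. filter items with no lookup entry
    }
}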

Strange behaviour when integrating Spring Data JPA and Spring Cache

When I integrate Spring Data JPA and Spring Cache, there is a strange behaviour I can't explain.
I am using Spring Boot to set up my demo project. The code is below.
My config bean:
@Configuration
public class AppConfig {

    @Bean
    public CacheManager cacheManager() {
        return new ConcurrentMapCacheManager("Person");
    }
}
My entity bean.
@Entity
public class Person implements Serializable {

    private static final long serialVersionUID = 2263555241909370718L;

    @Id
    @GeneratedValue
    private Long id;

    private String name;

    public Long getId() {
        return id;
    }

    public void setId(Long id) {
        this.id = id;
    }

    public String getName() {
        return name;
    }

    public void setName(String name) {
        this.name = name;
    }
}
My JPA interface. I override some methods from JpaRepository and add the @Cacheable annotation:
public interface PersonRepository extends JpaRepository<Person, Long> {

    @Override
    @CacheEvict(value = "Person", allEntries = true)
    public void delete(Long id);

    @Cacheable("Person")
    public Person findByName(String name);

    @Override
    @Query("select p from Person p where p.id = :id + 1L")
    @Cacheable("Person")
    public Person findOne(@Param("id") Long id);
}
My unit test class
@RunWith(SpringJUnit4ClassRunner.class)
@SpringApplicationConfiguration(classes = SpringDataDemoApplication.class)
public class SpringDataDemoApplicationTests {

    @Autowired
    private PersonRepository personRepository;

    @Autowired
    private CacheManager cacheManager;

    @Before
    public void setup() {
        Person p1 = new Person();
        p1.setName("Chris");
        personRepository.save(p1);
    }

    @Test
    public void test2() {
        JpaRepository<Person, Long> jpaRepository = personRepository;

        Person p = personRepository.findOne(0L);
        Assert.assertNotNull(p);
        p = personRepository.findOne(0L);
        Assert.assertNotNull(p);

        System.err.println("---------------------------------------");

        p = jpaRepository.findOne(0L);
        Assert.assertNotNull(p);
        p = jpaRepository.findOne(0L);
        Assert.assertNotNull(p);
    }
}
The output is very strange.
Hibernate: insert into person (id, name) values (default, ?)
Hibernate: select person0_.id as id1_0_, person0_.name as name2_0_ from person person0_ where person0_.id=?+1
---------------------------------------
Hibernate: select person0_.id as id1_0_, person0_.name as name2_0_ from person person0_ where person0_.id=?+1
Hibernate: select person0_.id as id1_0_, person0_.name as name2_0_ from person person0_ where person0_.id=?+1
I expected only one SQL statement to be printed. The jpaRepository.findOne(0L) calls do not use the cached object.
The cache annotation stops working once I assign the PersonRepository instance to its parent interface, JpaRepository.
The two variables point to exactly the same reference, even if it is a proxy object. Why does calling a method on the same reference produce two different results?
I also notice that the @Query annotation works fine: both the JpaRepository and PersonRepository references use the customized SQL.
I guess there may be some difference between how Spring Cache and Spring Data JPA generate the proxy advisor. Could this be a bug?
Add @EnableCaching to your configuration:
@EnableCaching
@Configuration
public class AppConfig {

    @Bean
    public CacheManager cacheManager() {
        return new ConcurrentMapCacheManager("Person");
    }
}
Declaring the cache annotations does not automatically trigger their actions per se; you have to explicitly enable the caching behaviour with the @EnableCaching annotation. One advantage of this approach is that you can disable caching by removing a single configuration line rather than all the annotations in your code.
I think I found the reason why this happens. I added some AOP code to track which method is called on the repository:
@Aspect
@Configuration
public class AppConfig {

    @Bean
    public CacheManager cacheManager() {
        return new ConcurrentMapCacheManager();
    }

    @AfterReturning("execution(* org..*Repository.*(..))")
    public void logServiceAccess(JoinPoint joinPoint) {
        Arrays.asList(joinPoint.getTarget().getClass().getMethods())
                .stream()
                .filter(m -> m.getName().startsWith("findOne"))
                .forEach(m -> System.err.println(m));
        System.err.println("Completed: " + joinPoint);
    }
}
The output is
public final java.lang.Object com.sun.proxy.$Proxy66.findOne(java.io.Serializable)
public final org.chris.demo.domain.Person com.sun.proxy.$Proxy66.findOne(java.lang.Long)
The Spring container proxies 2 findOne methods with different argument types. I think this is caused by generics: the generic type information is erased at compile time, so the erased findOne(Serializable) from the parent interface coexists with the specialized findOne(Long).
When I call the method through the parent interface, the public final java.lang.Object com.sun.proxy.$Proxy66.findOne(java.io.Serializable) method is invoked, and there is no way to attach the @Cacheable annotation to that method.
I don't know whether there is a way to force the Spring container to generate only one findOne method when a sub-interface overrides the method via generics.
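This is easy to reproduce outside of Spring. Below is a small, self-contained sketch (the interfaces are simplified stand-ins for CrudRepository and PersonRepository, not the real Spring Data types) that lists the findOne variants reflection sees:
import java.io.Serializable;
import java.lang.reflect.Method;

public class BridgeMethodDemo {

    // Simplified stand-in for the generic parent repository interface
    interface BaseRepo<T, ID extends Serializable> {
        T findOne(ID id);
    }

    // Simplified stand-in for the specialized sub-interface
    interface PersonRepo extends BaseRepo<Person, Long> {
        @Override
        Person findOne(Long id);
    }

    static class Person {}

    public static void main(String[] args) {
        // Two findOne variants show up: the erased findOne(Serializable) from the generic
        // parent and the specialized findOne(Long); depending on the compiler, one of them
        // may also be marked as a synthetic bridge method.
        for (Method m : PersonRepo.class.getMethods()) {
            System.out.println(m + "  bridge=" + m.isBridge() + "  synthetic=" + m.isSynthetic());
        }
    }
}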

Issue with transactions in multiple services (Spring Framework/JTA): org.hibernate.ObjectDeletedException: deleted instance passed to merge

I receive the following exception during program execution:
org.hibernate.ObjectDeletedException: deleted instance passed to merge: [ns.entity.Category#<null>]; nested exception is java.lang.IllegalArgumentException: org.hibernate.ObjectDeletedException: deleted instance passed to merge: [ns.entity.Category#<null>]
The following code throws the exception:
importer.foo();
Importer service:
@Service
@Transactional
public class Importer {

    @Autowired
    private UserService userService;

    @Autowired
    private CategoryService categoryService;

    @Transactional
    public void foo() {
        User user = userService.findByLogin("max");
        categoryService.delete(user.getCategories());
    }
}
UserService (uses CrudRepository):
@Service
@Repository
@Transactional
public class UserServiceImpl implements UserService {

    @Autowired
    private UserRepository repository;

    @Override
    @Transactional(readOnly = true)
    public User findById(Long userId) {
        return repository.findOne(userId);
    }
}
CategoryService (uses CrudRepository):
@Service
@Repository
@Transactional
public class CategoryServiceImpl implements CategoryService {

    @Autowired
    private CategoryRepository repository;

    @Override
    @Transactional
    public void delete(Set<Category> categories) {
        repository.delete(categories);
    }
}
The following code snippet in CategoryServiceImpl.delete() works without exception:
for (Category category : categories) {
    Category newCat = findById(category.getCategoryId());
    if (newCat != null) {
        delete(newCat);
    }
}
From what I understand, two different transactions are used (one read-only and one for the deletion). Is it possible to re-use the same transaction for all calls? Removing (readOnly = true) from UserServiceImpl.findById() does not help.
I thought that just one transaction should be used for all three methods (Importer.foo(), UserServiceImpl.findById(), CategoryServiceImpl.delete()), according to the Spring documentation.
