Scaling Spring Batch using AsyncItemProcessor and AsyncItemWriter

I am trying to write a solution to scale a Spring Batch job. The job reads 600,000 rows from a MySQL database, processes them, and updates the status of each processed row to 'Completed'.
I am using AsyncItemProcessor and AsyncItemWriter to scale the job.
Problem:
Running the job asynchronously takes the same amount of time as (or more than) running it synchronously, so I am not getting any benefit from multithreading.
package com.example.batchprocessing;
import javax.sql.DataSource;
import org.springframework.batch.core.Job;
import org.springframework.batch.core.JobExecutionListener;
import org.springframework.batch.core.Step;
import org.springframework.batch.core.configuration.annotation.EnableBatchProcessing;
import org.springframework.batch.core.configuration.annotation.JobBuilderFactory;
import org.springframework.batch.core.configuration.annotation.StepBuilderFactory;
import org.springframework.batch.core.launch.support.RunIdIncrementer;
import org.springframework.batch.integration.async.AsyncItemProcessor;
import org.springframework.batch.integration.async.AsyncItemWriter;
import org.springframework.batch.item.ItemProcessor;
import org.springframework.batch.item.database.BeanPropertyItemSqlParameterSourceProvider;
import org.springframework.batch.item.database.JdbcBatchItemWriter;
import org.springframework.batch.item.database.JdbcCursorItemReader;
import org.springframework.batch.item.database.builder.JdbcBatchItemWriterBuilder;
import org.springframework.batch.item.file.FlatFileItemReader;
import org.springframework.batch.item.file.FlatFileItemWriter;
import org.springframework.batch.item.file.builder.FlatFileItemReaderBuilder;
import org.springframework.batch.item.file.mapping.BeanWrapperFieldSetMapper;
import org.springframework.batch.item.file.mapping.DefaultLineMapper;
import org.springframework.batch.item.file.transform.BeanWrapperFieldExtractor;
import org.springframework.batch.item.file.transform.DelimitedLineAggregator;
import org.springframework.batch.item.file.transform.DelimitedLineTokenizer;
import org.springframework.batch.item.support.CompositeItemWriter;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.beans.factory.annotation.Qualifier;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.core.io.ClassPathResource;
import org.springframework.core.io.FileSystemResource;
import org.springframework.core.task.SimpleAsyncTaskExecutor;
import org.springframework.core.task.TaskExecutor;
import org.springframework.jdbc.core.JdbcTemplate;
import org.springframework.scheduling.concurrent.ThreadPoolTaskExecutor;
import org.springframework.transaction.annotation.Transactional;
import java.util.Arrays;
@Configuration
@EnableBatchProcessing
public class BatchConfiguration {

    @Autowired
    public JobBuilderFactory jobBuilderFactory;

    @Autowired
    public StepBuilderFactory stepBuilderFactory;

    @Bean
    public JdbcCursorItemReader<Person> reader(DataSource dataSource) {
        JdbcCursorItemReader<Person> reader = new JdbcCursorItemReader<>();
        reader.setDataSource(dataSource);
        reader.setSql("SELECT * from people");
        reader.setRowMapper(new UserRowMapper());
        return reader;
    }

    @Bean
    public PersonItemProcessor processor() {
        return new PersonItemProcessor();
    }

    @Bean
    public AsyncItemProcessor<Person, Person> asyncItemProcessor() throws Exception {
        ThreadPoolTaskExecutor executor = new ThreadPoolTaskExecutor();
        executor.setCorePoolSize(30);
        executor.setMaxPoolSize(50);
        executor.setQueueCapacity(10000);
        executor.setThreadNamePrefix("BatchProcessing-");
        executor.afterPropertiesSet();

        AsyncItemProcessor<Person, Person> asyncProcessor = new AsyncItemProcessor<>();
        asyncProcessor.setDelegate(processor());
        asyncProcessor.setTaskExecutor(executor);
        asyncProcessor.afterPropertiesSet();
        return asyncProcessor;
    }

    @Bean
    public JdbcBatchItemWriter<Person> writer(DataSource dataSource) {
        return new JdbcBatchItemWriterBuilder<Person>()
                .itemSqlParameterSourceProvider(new BeanPropertyItemSqlParameterSourceProvider<>())
                .sql("UPDATE people set status= 'completed' where person_id= :id")
                .dataSource(dataSource)
                .build();
    }

    @Bean
    public AsyncItemWriter<Person> asyncItemWriter() {
        AsyncItemWriter<Person> asyncWriter = new AsyncItemWriter<>();
        asyncWriter.setDelegate(writer(null));
        return asyncWriter;
    }

    @Bean
    public Job importUserJob(JobCompletionNotificationListener listener, Step step1) {
        return jobBuilderFactory.get("importUserJob")
                .incrementer(new RunIdIncrementer())
                .listener(listener)
                .flow(step1)
                .end()
                .build();
    }

    @Bean
    public Step step1(JdbcBatchItemWriter<Person> writer) throws Exception {
        return stepBuilderFactory.get("step1")
                .<Person, Person> chunk(10000)
                .reader(reader(null))
                //.processor(processor())
                //.writer(writer)
                // raw cast needed: AsyncItemProcessor actually returns Future<Person>
                .processor((ItemProcessor) asyncItemProcessor())
                .writer(asyncItemWriter())
                //.throttleLimit(30)
                .build();
    }
}
What am I doing wrong? Why does the AsyncItemProcessor take the same amount of time as (or more than) synchronous processing? Usually it should take less time. I can see in the logs that multiple threads are working, but the end time is ultimately the same as with synchronous processing.

I agree with @5019386: AsyncItemProcessor is helpful when processing an item is expensive. One suggestion I can make is to move the code that creates the ThreadPoolTaskExecutor into a separate @Bean, so that you avoid recreating the thread pool.
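For example, here is a minimal sketch of that suggestion, reusing the settings and imports from the question (the bean name batchTaskExecutor is illustrative):

@Bean
public TaskExecutor batchTaskExecutor() {
    // created once by the container instead of inside asyncItemProcessor()
    ThreadPoolTaskExecutor executor = new ThreadPoolTaskExecutor();
    executor.setCorePoolSize(30);
    executor.setMaxPoolSize(50);
    executor.setQueueCapacity(10000);
    executor.setThreadNamePrefix("BatchProcessing-");
    return executor;
}

@Bean
public AsyncItemProcessor<Person, Person> asyncItemProcessor(TaskExecutor batchTaskExecutor) throws Exception {
    AsyncItemProcessor<Person, Person> asyncProcessor = new AsyncItemProcessor<>();
    asyncProcessor.setDelegate(processor());
    asyncProcessor.setTaskExecutor(batchTaskExecutor);
    asyncProcessor.afterPropertiesSet();
    return asyncProcessor;
}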

Related

Implementing JUnit test cases in Spring batch Jobs calling from Controller

I am new to JUnit and am literally struggling to write JUnit test cases for my code. I am working with Spring Boot, Batch, and JPA. I have configured the jobs, and reading files and writing into the DB all work fine. But when it comes to JUnit, I don't even have an idea of where to start. Can anyone help me? Below is my code.
FileProcessController.java
import java.util.Date;
import java.util.HashMap;
import java.util.Map;
import org.apache.commons.lang3.exception.ExceptionUtils;
import org.springframework.batch.core.Job;
import org.springframework.batch.core.JobParameter;
import org.springframework.batch.core.JobParameters;
import org.springframework.batch.core.launch.JobLauncher;
import org.springframework.batch.core.repository.JobInstanceAlreadyCompleteException;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.beans.factory.annotation.Qualifier;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.ResponseBody;
import org.springframework.web.bind.annotation.RestController;
import com.sc.batchservice.model.StatusResponse;
import lombok.extern.slf4j.Slf4j;
@Slf4j
@RestController
@RequestMapping("/fileProcess")
public class FileProcessController {

    @Autowired
    private JobLauncher jobLauncher;

    @Autowired
    @Qualifier("euronetFileParseJob")
    private Job job;

    @GetMapping(path = "/process")
    public @ResponseBody StatusResponse process() throws Exception {
        try {
            Map<String, JobParameter> parameters = new HashMap<>();
            parameters.put("date", new JobParameter(new Date()));
            jobLauncher.run(job, new JobParameters(parameters));
            return new StatusResponse(true);
        } catch (Exception e) {
            log.error("Exception", e);
            Throwable rootException = ExceptionUtils.getRootCause(e);
            String errMessage = rootException.getMessage();
            log.info("Root cause is instance of JobInstanceAlreadyCompleteException --> " + (rootException instanceof JobInstanceAlreadyCompleteException));
            if (rootException instanceof JobInstanceAlreadyCompleteException) {
                log.info(errMessage);
                return new StatusResponse(false, "This job has been completed already!");
            } else {
                throw e;
            }
        }
    }
}
BatchConfig.java
import org.springframework.batch.core.Job;
import org.springframework.batch.core.Step;
import org.springframework.batch.core.configuration.annotation.EnableBatchProcessing;
import org.springframework.batch.core.configuration.annotation.JobBuilderFactory;
import org.springframework.batch.core.configuration.annotation.StepBuilderFactory;
import org.springframework.batch.core.launch.support.RunIdIncrementer;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.core.io.Resource;
import com.sc.batchservice.model.DetailsDTO;
@Configuration
@EnableBatchProcessing
public class BatchConfig {

    private JobBuilderFactory jobBuilderFactory;

    @Autowired
    public void setJobBuilderFactory(JobBuilderFactory jobBuilderFactory) {
        this.jobBuilderFactory = jobBuilderFactory;
    }

    @Autowired
    StepBuilderFactory stepBuilderFactory;

    @Value("file:${input.files.location}${input.file.pattern}")
    private Resource[] fileInputs;

    @Value("${euronet.file.column.names}")
    private String filecolumnNames;

    @Value("${euronet.file.column.lengths}")
    private String fileColumnLengths;

    @Value("${input.files.location}")
    private String inputFileLocation;

    @Value("${input.file.pattern}")
    private String inputFilePattern;

    @Autowired
    FlatFileWriter flatFileWriter;

    @Bean
    public Job euronetFileParseJob() {
        return jobBuilderFactory.get("euronetFileParseJob")
                .incrementer(new RunIdIncrementer())
                .start(fileStep())
                .build();
    }

    public Step fileStep() {
        return stepBuilderFactory.get("fileStep")
                .<DetailsDTO, DetailsDTO>chunk(10)
                .reader(new FlatFileReader(fileInputs, filecolumnNames, fileColumnLengths))
                .writer(flatFileWriter)
                .build();
    }
}
FlatFileReader.java
import org.springframework.batch.item.file.FlatFileItemReader;
import org.springframework.batch.item.file.MultiResourceItemReader;
import org.springframework.batch.item.file.mapping.BeanWrapperFieldSetMapper;
import org.springframework.batch.item.file.mapping.DefaultLineMapper;
import org.springframework.batch.item.file.mapping.FieldSetMapper;
import org.springframework.batch.item.file.transform.FixedLengthTokenizer;
import org.springframework.core.io.Resource;
import com.sc.batchservice.model.DetailsDTO;
import com.sc.batchservice.util.CommonUtil;
import lombok.extern.slf4j.Slf4j;
@Slf4j
public class FlatFileReader extends MultiResourceItemReader<DetailsDTO> {

    // constructor name corrected to match the class name
    public FlatFileReader(Resource[] fileInputs, String filecolumnNames, String fileColumnLengths) {
        setResources(fileInputs);
        setDelegate(reader(filecolumnNames, fileColumnLengths));
        setStrict(true);
    }

    private FlatFileItemReader<DetailsDTO> reader(String filecolumnNames, String fileColumnLengths) {
        FlatFileItemReader<DetailsDTO> flatFileItemReader = new FlatFileItemReader<>();
        FixedLengthTokenizer tokenizer = CommonUtil.fixedLengthTokenizer(filecolumnNames, fileColumnLengths);
        FieldSetMapper<DetailsDTO> mapper = createMapper();
        DefaultLineMapper<DetailsDTO> lineMapper = new DefaultLineMapper<>();
        lineMapper.setLineTokenizer(tokenizer);
        lineMapper.setFieldSetMapper(mapper);
        flatFileItemReader.setLineMapper(lineMapper);
        return flatFileItemReader;
    }

    /*
     * Mapping column data to DTO
     */
    private FieldSetMapper<DetailsDTO> createMapper() {
        BeanWrapperFieldSetMapper<DetailsDTO> mapper = new BeanWrapperFieldSetMapper<>();
        try {
            mapper.setTargetType(DetailsDTO.class);
        } catch (Exception e) {
            log.error("Exception in mapping column data to dto ", e);
        }
        return mapper;
    }
}
I also have Writer, Entity, and model classes, but given any example or idea for the code up to this point, I can proceed with those classes.
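As a starting point, here is a minimal sketch of a job-level test using Spring Batch's test support. It assumes Spring Batch 4.1+, where @SpringBatchTest is available; the test class name and the use of BatchConfig as the configuration source are illustrative, not taken from the original code:

import static org.junit.Assert.assertEquals;
import org.junit.Test;
import org.junit.runner.RunWith;
import org.springframework.batch.core.BatchStatus;
import org.springframework.batch.core.JobExecution;
import org.springframework.batch.test.JobLauncherTestUtils;
import org.springframework.batch.test.context.SpringBatchTest;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.boot.test.context.SpringBootTest;
import org.springframework.test.context.junit4.SpringRunner;

@RunWith(SpringRunner.class)
@SpringBatchTest              // registers a JobLauncherTestUtils for the Job in the context
@SpringBootTest(classes = BatchConfig.class)
public class EuronetFileParseJobTest {

    @Autowired
    private JobLauncherTestUtils jobLauncherTestUtils;

    @Test
    public void jobCompletesSuccessfully() throws Exception {
        // launches euronetFileParseJob end to end and asserts its final status
        JobExecution execution = jobLauncherTestUtils.launchJob();
        assertEquals(BatchStatus.COMPLETED, execution.getStatus());
    }
}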

Retry feature is not working in Spring Batch

I have a batch job where I am using Spring Batch 3.0.x.
My use case is to retry the job in case of any intermediate failures in between.
I am using chunk-based processing and a StepBuilderFactory for the job, but I could not see any difference after adding retry to it.
return stepBuilderFactory.get("ValidationStepName")
.<Long, Info> chunk(10)
.reader(.....)
.processor(.....)
// .faultTolerant()
// .retryLimit(5)
// .retryLimit(5).retry(Exception.class)
.writer(......)
.faultTolerant()
.retryLimit(5)
//.retryLimit(5).retry(Exception.class)
.transactionManager(jpaTransactionManager())
.listener(new ChunkNotificationListener())
.build();
I am not sure if I am missing something here; I am expecting that adding retryLimit() will retry the same chunk n times upon getting any exception.
I am expecting that adding retryLimit() will retry the same chunk n times upon getting any exception.
If you specify a retry limit, you need to specify which exceptions to retry. Otherwise you will get an IllegalStateException with the message: "If a retry limit is provided then retryable exceptions must also be specified".
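Applied to the step above, that means declaring the retryable exception type alongside the limit. A sketch based on the original builder chain (the elided reader/processor/writer stay as they were):

return stepBuilderFactory.get("ValidationStepName")
        .<Long, Info> chunk(10)
        .reader(.....)
        .processor(.....)
        .writer(......)
        .faultTolerant()
        .retryLimit(5)
        .retry(Exception.class)   // without this, the step configuration fails at startup
        .transactionManager(jpaTransactionManager())
        .listener(new ChunkNotificationListener())
        .build();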
EDIT:
Point 1: The following test passes with version 3.0.9:
import java.util.Arrays;
import org.junit.Rule;
import org.junit.Test;
import org.junit.rules.ExpectedException;
import org.junit.runner.RunWith;
import org.mockito.Mock;
import org.mockito.junit.MockitoJUnitRunner;
import org.springframework.batch.core.configuration.annotation.StepBuilderFactory;
import org.springframework.batch.core.repository.JobRepository;
import org.springframework.batch.core.step.tasklet.TaskletStep;
import org.springframework.batch.item.support.ListItemReader;
import org.springframework.batch.item.support.ListItemWriter;
import org.springframework.transaction.PlatformTransactionManager;
@RunWith(MockitoJUnitRunner.class)
public class TestRetryConfig {

    @Rule
    public ExpectedException expectedException = ExpectedException.none();

    @Mock
    private JobRepository jobRepository;

    @Mock
    PlatformTransactionManager transactionManager;

    @Test
    public void testRetryLimitWithoutException() {
        expectedException.expect(IllegalStateException.class);
        expectedException.expectMessage("If a retry limit is provided then retryable exceptions must also be specified");

        StepBuilderFactory stepBuilderFactory = new StepBuilderFactory(jobRepository, transactionManager);
        TaskletStep step = stepBuilderFactory.get("step")
                .<Integer, Integer>chunk(2)
                .reader(new ListItemReader<>(Arrays.asList(1, 2, 3)))
                .writer(new ListItemWriter<>())
                .faultTolerant()
                .retryLimit(3)
                .build();
    }
}
It shows that if you specify a retry limit without the exception type(s) to retry, the step configuration fails.
Point 2: The following sample shows that the declared exception type is retried as expected (also tested with version 3.0.9):
import java.util.Arrays;
import org.springframework.batch.core.Job;
import org.springframework.batch.core.JobParameters;
import org.springframework.batch.core.Step;
import org.springframework.batch.core.configuration.annotation.EnableBatchProcessing;
import org.springframework.batch.core.configuration.annotation.JobBuilderFactory;
import org.springframework.batch.core.configuration.annotation.StepBuilderFactory;
import org.springframework.batch.core.launch.JobLauncher;
import org.springframework.batch.item.ItemReader;
import org.springframework.batch.item.ItemWriter;
import org.springframework.batch.item.support.ListItemReader;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.context.ApplicationContext;
import org.springframework.context.annotation.AnnotationConfigApplicationContext;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
@Configuration
@EnableBatchProcessing
public class MyJob {

    @Autowired
    private JobBuilderFactory jobs;

    @Autowired
    private StepBuilderFactory steps;

    @Bean
    public ItemReader<Integer> itemReader() {
        return new ListItemReader<>(Arrays.asList(1, 2, 3, 4, 5, 6, 7, 8, 9, 10));
    }

    @Bean
    public ItemWriter<Integer> itemWriter() {
        return items -> {
            for (Integer item : items) {
                System.out.println("item = " + item);
                if (item.equals(7)) {
                    throw new Exception("Sevens are sometime nasty, let's retry them");
                }
            }
        };
    }

    @Bean
    public Step step() {
        return steps.get("step")
                .<Integer, Integer>chunk(5)
                .reader(itemReader())
                .writer(itemWriter())
                .faultTolerant()
                .retryLimit(3)
                .retry(Exception.class)
                .build();
    }

    @Bean
    public Job job() {
        return jobs.get("job")
                .start(step())
                .build();
    }

    public static void main(String[] args) throws Exception {
        ApplicationContext context = new AnnotationConfigApplicationContext(MyJob.class);
        JobLauncher jobLauncher = context.getBean(JobLauncher.class);
        Job job = context.getBean(Job.class);
        jobLauncher.run(job, new JobParameters());
    }
}
It prints:
item = 1
item = 2
item = 3
item = 4
item = 5
item = 6
item = 7
item = 6
item = 7
item = 6
item = 7
Item 7 is retried 3 times, and then the step fails as expected.
I hope this helps.

Spring Batch ResourcelessTransactionManager messes with persistence.xml?

I am working on an application and have been asked to implement a scheduled Spring Batch job. I have set up a configuration file where I declare a @Bean ResourcelessTransactionManager, but it seems to interfere with the persistence.xml.
There is already a persistence.xml in another module, and there is no compilation error. However, I get a NoUniqueBeanDefinitionException when I request a page that returns a view item.
This is the error:
Caused by: org.springframework.beans.factory.NoUniqueBeanDefinitionException: No qualifying bean of type [org.springframework.transaction.PlatformTransactionManager] is defined: expected single matching bean but found 2: txManager,transactionManager
at org.springframework.beans.factory.support.DefaultListableBeanFactory.getBean(DefaultListableBeanFactory.java:365)
at org.springframework.beans.factory.support.DefaultListableBeanFactory.getBean(DefaultListableBeanFactory.java:331)
at org.springframework.transaction.interceptor.TransactionAspectSupport.determineTransactionManager(TransactionAspectSupport.java:366)
at org.springframework.transaction.interceptor.TransactionAspectSupport.invokeWithinTransaction(TransactionAspectSupport.java:271)
at org.springframework.transaction.interceptor.TransactionInterceptor.invoke(TransactionInterceptor.java:96)
at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179)
at org.springframework.aop.framework.CglibAopProxy$DynamicAdvisedInterceptor.intercept(CglibAopProxy.java:653)
at com.mypackage.services.MyClassService$$EnhancerBySpringCGLIB$$9e8bf16f.registryEvents(<generated>)
at com.mypackage.controllers.MyClassSearchView.init(MyClassSearchView.java:75)
... 168 more
Is there a way to tell Spring Batch to use the data source defined in the persistence.xml of the other module, or is this caused by something else?
I created a separate BatchScheduler class as shown below and imported it into the BatchConfiguration class. I am sharing both classes. BatchConfiguration contains a second transaction manager, jpaTransactionManager.
import org.springframework.batch.core.launch.support.SimpleJobLauncher;
import org.springframework.batch.core.repository.JobRepository;
import org.springframework.batch.core.repository.support.MapJobRepositoryFactoryBean;
import org.springframework.batch.support.transaction.ResourcelessTransactionManager;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.scheduling.annotation.EnableScheduling;

@Configuration
@EnableScheduling
public class BatchScheduler {

    @Bean
    public ResourcelessTransactionManager resourcelessTransactionManager() {
        return new ResourcelessTransactionManager();
    }

    @Bean
    public MapJobRepositoryFactoryBean mapJobRepositoryFactory(
            ResourcelessTransactionManager resourcelessTransactionManager) throws Exception {
        MapJobRepositoryFactoryBean factory = new MapJobRepositoryFactoryBean(resourcelessTransactionManager);
        factory.afterPropertiesSet();
        return factory;
    }

    @Bean
    public JobRepository jobRepository(
            MapJobRepositoryFactoryBean factory) throws Exception {
        return factory.getObject();
    }

    @Bean
    public SimpleJobLauncher jobLauncher(JobRepository jobRepository) {
        SimpleJobLauncher launcher = new SimpleJobLauncher();
        launcher.setJobRepository(jobRepository);
        return launcher;
    }
}
BatchConfiguration.java:
import java.io.IOException;
import java.util.Date;
import java.util.Properties;
import javax.sql.DataSource;
import org.springframework.batch.core.Job;
import org.springframework.batch.core.JobExecution;
import org.springframework.batch.core.JobParameters;
import org.springframework.batch.core.JobParametersBuilder;
import org.springframework.batch.core.Step;
import org.springframework.batch.core.configuration.annotation.EnableBatchProcessing;
import org.springframework.batch.core.configuration.annotation.JobBuilderFactory;
import org.springframework.batch.core.configuration.annotation.StepBuilderFactory;
import org.springframework.batch.core.configuration.annotation.StepScope;
import org.springframework.batch.core.launch.JobLauncher;
import org.springframework.batch.core.launch.support.RunIdIncrementer;
import org.springframework.batch.item.database.JpaItemWriter;
import org.springframework.batch.item.file.mapping.BeanWrapperFieldSetMapper;
import org.springframework.batch.item.file.mapping.DefaultLineMapper;
import org.springframework.batch.item.file.transform.DelimitedLineTokenizer;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.beans.factory.annotation.Qualifier;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.context.annotation.Import;
import org.springframework.context.support.PropertySourcesPlaceholderConfigurer;
import org.springframework.core.env.Environment;
import org.springframework.core.io.FileSystemResource;
import org.springframework.core.io.support.PathMatchingResourcePatternResolver;
import org.springframework.jdbc.datasource.DriverManagerDataSource;
import org.springframework.orm.jpa.JpaTransactionManager;
import org.springframework.orm.jpa.JpaVendorAdapter;
import org.springframework.orm.jpa.LocalContainerEntityManagerFactoryBean;
import org.springframework.orm.jpa.vendor.Database;
import org.springframework.orm.jpa.vendor.HibernateJpaVendorAdapter;
import org.springframework.scheduling.TaskScheduler;
import org.springframework.scheduling.annotation.EnableScheduling;
import org.springframework.scheduling.annotation.Scheduled;
import org.springframework.scheduling.concurrent.ConcurrentTaskScheduler;
import org.springframework.transaction.PlatformTransactionManager;
import trade.api.common.constants.Constants;
import trade.api.entity.SecurityEntity;
import trade.api.trade.batch.item.processor.SecurityItemProcessor;
import trade.api.trade.batch.item.reader.NseSecurityReader;
import trade.api.trade.batch.notification.listener.SecurityJobCompletionNotificationListener;
import trade.api.trade.batch.tasklet.SecurityReaderTasklet;
import trade.api.vo.SecurityVO;
@Configuration
@EnableBatchProcessing
@EnableScheduling
@Import({OhlcMonthBatchConfiguration.class, OhlcWeekBatchConfiguration.class, OhlcDayBatchConfiguration.class, OhlcMinuteBatchConfiguration.class})
public class BatchConfiguration {

    private static final String OVERRIDDEN_BY_EXPRESSION = null;

    /*
     * Load the properties
     */
    @Value("${database.driver}")
    private String databaseDriver;

    @Value("${database.url}")
    private String databaseUrl;

    @Value("${database.username}")
    private String databaseUsername;

    @Value("${database.password}")
    private String databasePassword;

    @Autowired
    public JobBuilderFactory jobBuilderFactory;

    @Autowired
    public StepBuilderFactory stepBuilderFactory;

    @Autowired
    private JobLauncher jobLauncher;

    @Bean
    public TaskScheduler taskScheduler() {
        return new ConcurrentTaskScheduler();
    }

    // second, minute, hour, day of month, month, day(s) of week
    // @Scheduled(cron = "0 0 21 * * 1-5") on week days
    @Scheduled(cron = "${schedule.insert.security}")
    public void importSecuritySchedule() throws Exception {
        System.out.println("Job Started at :" + new Date());
        JobParameters param = new JobParametersBuilder().addString("JobID",
                String.valueOf(System.currentTimeMillis())).toJobParameters();
        JobExecution execution = jobLauncher.run(importSecuritesJob(), param);
        System.out.println("Job finished with status :" + execution.getStatus());
    }

    @Bean
    SecurityJobCompletionNotificationListener securityJobCompletionNotificationListener() {
        return new SecurityJobCompletionNotificationListener();
    }

    //Import Equity OHLC End
    //Import Equity Start
    // tag::readerwriterprocessor[]
    @Bean
    public SecurityReaderTasklet securityReaderTasklet() {
        return new SecurityReaderTasklet();
    }

    @Bean
    @StepScope
    public NseSecurityReader<SecurityVO> nseSecurityReader(@Value("#{jobExecutionContext[" + Constants.SECURITY_DOWNLOAD_FILE + "]}") String pathToFile) throws IOException {
        NseSecurityReader<SecurityVO> reader = new NseSecurityReader<SecurityVO>();
        reader.setLinesToSkip(1);
        reader.setResource(new FileSystemResource(pathToFile));
        reader.setLineMapper(new DefaultLineMapper<SecurityVO>() {{
            setLineTokenizer(new DelimitedLineTokenizer() {{
                setNames(new String[] { "symbol", "nameOfCompany", "series", "dateOfListing", "paidUpValue", "marketLot", "isinNumber", "faceValue" });
            }});
            setFieldSetMapper(new BeanWrapperFieldSetMapper<SecurityVO>() {{
                setTargetType(SecurityVO.class);
            }});
        }});
        return reader;
    }

    @Bean
    public SecurityItemProcessor processor() {
        return new SecurityItemProcessor();
    }

    @Bean
    public JpaItemWriter<SecurityEntity> writer() {
        JpaItemWriter<SecurityEntity> writer = new JpaItemWriter<SecurityEntity>();
        writer.setEntityManagerFactory(entityManagerFactory().getObject());
        return writer;
    }
    // end::readerwriterprocessor[]

    // tag::jobstep[]
    @Bean
    public Job importSecuritesJob() throws IOException {
        return jobBuilderFactory.get("importSecuritesJob")
                .incrementer(new RunIdIncrementer())
                .listener(securityJobCompletionNotificationListener())
                .start(downloadSecurityStep())
                .next(insertSecurityStep())
                .build();
    }

    @Bean
    public Step downloadSecurityStep() throws IOException {
        return stepBuilderFactory.get("downloadSecurityStep")
                .tasklet(securityReaderTasklet())
                .build();
    }

    @Bean
    public Step insertSecurityStep() throws IOException {
        return stepBuilderFactory.get("insertSecurityStep")
                .transactionManager(jpaTransactionManager())
                .<SecurityVO, SecurityEntity> chunk(100)
                .reader(nseSecurityReader(OVERRIDDEN_BY_EXPRESSION))
                .processor(processor())
                .writer(writer())
                .build();
    }
    // end::jobstep[]
    //Import Equity End

    @Bean
    public DataSource dataSource() {
        DriverManagerDataSource dataSource = new DriverManagerDataSource();
        dataSource.setDriverClassName(databaseDriver);
        dataSource.setUrl(databaseUrl);
        dataSource.setUsername(databaseUsername);
        dataSource.setPassword(databasePassword);
        return dataSource;
    }

    @Bean
    public LocalContainerEntityManagerFactoryBean entityManagerFactory() {
        LocalContainerEntityManagerFactoryBean lef = new LocalContainerEntityManagerFactoryBean();
        lef.setPackagesToScan("trade.api.entity");
        lef.setDataSource(dataSource());
        lef.setJpaVendorAdapter(jpaVendorAdapter());
        lef.setJpaProperties(new Properties());
        return lef;
    }

    @Bean
    public JpaVendorAdapter jpaVendorAdapter() {
        HibernateJpaVendorAdapter jpaVendorAdapter = new HibernateJpaVendorAdapter();
        jpaVendorAdapter.setDatabase(Database.MYSQL);
        jpaVendorAdapter.setGenerateDdl(true);
        jpaVendorAdapter.setShowSql(false);
        jpaVendorAdapter.setDatabasePlatform("org.hibernate.dialect.MySQLDialect");
        return jpaVendorAdapter;
    }

    @Bean
    @Qualifier("jpaTransactionManager")
    public PlatformTransactionManager jpaTransactionManager() {
        return new JpaTransactionManager(entityManagerFactory().getObject());
    }

    @Bean
    public static PropertySourcesPlaceholderConfigurer dataProperties(Environment environment) throws IOException {
        String[] activeProfiles = environment.getActiveProfiles();
        final PropertySourcesPlaceholderConfigurer ppc = new PropertySourcesPlaceholderConfigurer();
        ppc.setLocations(new PathMatchingResourcePatternResolver().getResources("classpath*:application-" + activeProfiles[0] + ".properties"));
        return ppc;
    }
    //// Import Security End
}
Problem solved. There was a PlatformTransactionManager bean located in another configuration file. I marked it as @Primary and now the problem is fixed. Thanks everyone for the help.
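For reference, here is a minimal sketch of that fix (the method body is illustrative; what matters is the @Primary on the transaction manager bean that should win by default):

import javax.persistence.EntityManagerFactory;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Primary;
import org.springframework.orm.jpa.JpaTransactionManager;
import org.springframework.transaction.PlatformTransactionManager;

@Bean
@Primary   // disambiguates when more than one PlatformTransactionManager bean exists
public PlatformTransactionManager transactionManager(EntityManagerFactory entityManagerFactory) {
    return new JpaTransactionManager(entityManagerFactory);
}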

Spring Boot with Spring Batch and JPA configuration

I have a simple batch application that reads a CSV file into a Postgres database.
I have uploaded the code to this repo:
https://github.com/soasathish/spring-batch-with-jpa.git
I have problems configuring the database writer using Spring Data JPA: I am getting a 'managed bean not found' issue. The same Spring Data JPA configuration works in a different project; when I try to integrate it with Spring Batch, it fails with 'managed bean not found'.
The batch config has a Spring job with only one step:
1) The reader reads from the CSV files.
2) The processor applies some rules (Drools) to the records.
3) The writer uses Spring Data JPA to write to the DB.
Please run schema-postgresql.sql to set up the database.
Could someone help? I know it's a minor issue, but any direction or help would be appreciated.
Code for the repository configuration (DataSourceConfiguration.java):
package uk.gov.iebr.batch.config;
import static uk.gov.iebr.batch.config.AppProperties.DRIVER_CLASS_NAME;
import static uk.gov.iebr.batch.config.AppProperties.IEBR_DB_PASSWORD_KEY;
import static uk.gov.iebr.batch.config.AppProperties.IEBR_DB_URL_KEY;
import static uk.gov.iebr.batch.config.AppProperties.IEBR_DB_USER_KEY;
import java.util.Properties;
import javax.sql.DataSource;
import org.hibernate.jpa.HibernatePersistenceProvider;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.ComponentScan;
import org.springframework.context.annotation.Configuration;
import org.springframework.context.annotation.Primary;
import org.springframework.context.annotation.PropertySource;
import org.springframework.core.env.Environment;
import org.springframework.data.jpa.repository.config.EnableJpaRepositories;
import org.springframework.jdbc.datasource.DriverManagerDataSource;
import org.springframework.orm.jpa.JpaTransactionManager;
import org.springframework.orm.jpa.LocalContainerEntityManagerFactoryBean;
import org.springframework.orm.jpa.vendor.HibernateJpaVendorAdapter;
import org.springframework.transaction.PlatformTransactionManager;
import org.springframework.transaction.annotation.EnableTransactionManagement;
@Configuration
@PropertySource({"classpath:application.properties"})
@EnableJpaRepositories({"uk.gov.iebr.batch.repository"})
@EnableTransactionManagement
@ComponentScan(basePackages = "uk.gov.iebr.batch.repository")
public class DataSourceConfiguration {

    @Autowired
    Environment env;

    @Bean(name = "allsparkEntityMF")
    public LocalContainerEntityManagerFactoryBean allsparkEntityMF() {
        final LocalContainerEntityManagerFactoryBean em = new LocalContainerEntityManagerFactoryBean();
        em.setDataSource(allsparkDS());
        em.setPersistenceUnitName("allsparkEntityMF");
        // note: calling setPackagesToScan twice overwrites the first call,
        // so both packages are passed in a single call here
        em.setPackagesToScan("uk.gov.iebr.batch", "uk.gov.iebr.batch.repository");
        em.setPersistenceProvider(new HibernatePersistenceProvider());
        HibernateJpaVendorAdapter a = new HibernateJpaVendorAdapter();
        em.setJpaVendorAdapter(a);
        Properties p = hibernateSpecificProperties();
        p.setProperty("hibernate.ejb.entitymanager_factory_name", "allsparkEntityMF");
        em.setJpaProperties(p);
        return em;
    }

    @Bean(name = "allsparkDS")
    public DataSource allsparkDS() {
        final DriverManagerDataSource dataSource = new DriverManagerDataSource();
        dataSource.setDriverClassName(env.getProperty(DRIVER_CLASS_NAME));
        dataSource.setUrl(env.getProperty(IEBR_DB_URL_KEY));
        dataSource.setUsername(env.getProperty(IEBR_DB_USER_KEY));
        dataSource.setPassword(env.getProperty(IEBR_DB_PASSWORD_KEY));
        return dataSource;
    }

    @Bean
    public Properties hibernateSpecificProperties() {
        final Properties p = new Properties();
        p.setProperty("hibernate.hbm2ddl.auto", env.getProperty("spring.jpa.hibernate.ddl-auto"));
        p.setProperty("hibernate.dialect", env.getProperty("spring.jpa.hibernate.dialect"));
        p.setProperty("hibernate.show-sql", env.getProperty("spring.jpa.show-sql"));
        p.setProperty("hibernate.cache.use_second_level_cache", env.getProperty("spring.jpa.hibernate.cache.use_second_level_cache"));
        p.setProperty("hibernate.cache.use_query_cache", env.getProperty("spring.jpa.hibernate.cache.use_query_cache"));
        return p;
    }

    @Bean(name = "defaultTm")
    public PlatformTransactionManager transactionManager() {
        JpaTransactionManager txManager = new JpaTransactionManager();
        txManager.setEntityManagerFactory(allsparkEntityMF().getObject());
        return txManager;
    }
}
Batch config file:
package uk.gov.iebr.batch;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.batch.core.Job;
import org.springframework.batch.core.Step;
import org.springframework.batch.core.configuration.annotation.EnableBatchProcessing;
import org.springframework.batch.core.configuration.annotation.JobBuilderFactory;
import org.springframework.batch.core.configuration.annotation.StepBuilderFactory;
import org.springframework.batch.core.launch.support.RunIdIncrementer;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.boot.autoconfigure.EnableAutoConfiguration;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.context.annotation.Import;
import org.springframework.context.annotation.PropertySource;
import uk.gov.iebr.batch.config.AllSparkDataSourceConfiguration;
import uk.gov.iebr.batch.config.DataSourceConfiguration;
import uk.gov.iebr.batch.dao.PersonDao;
import uk.gov.iebr.batch.model.Person;
import uk.gov.iebr.batch.step.Listener;
import uk.gov.iebr.batch.step.Processor;
import uk.gov.iebr.batch.step.Reader;
import uk.gov.iebr.batch.step.Writer;
@Configuration
@EnableBatchProcessing
// spring boot configuration
@EnableAutoConfiguration
// file that contains the properties
@PropertySource("classpath:application.properties")
@Import({DataSourceConfiguration.class, AllSparkDataSourceConfiguration.class})
public class BatchConfig {

    private static final Logger log = LoggerFactory.getLogger(BatchConfig.class);

    @Autowired
    public JobBuilderFactory jobBuilderFactory;

    @Autowired
    public StepBuilderFactory stepBuilderFactory;

    @Autowired
    public PersonDao PersonDao;

    @Autowired
    public DataSourceConfiguration dataSourceConfiguration;

    @Bean
    public Job job() {
        long startTime = System.currentTimeMillis();
        log.info("START OF BATCH ========================================================================" + startTime);
        return jobBuilderFactory.get("job").incrementer(new RunIdIncrementer())
                //.listener(new Listener(PersonDao))
                .flow(step1()).end().build();
    }

    @Bean
    public Step step1() {
        return stepBuilderFactory.get("step1").<Person, Person>chunk(10)
                .reader(Reader.reader("tram-data.csv"))
                .processor(new Processor()).writer(new Writer(PersonDao)).build();
    }
}
Writer calls this PersonDaoImpl:
public class PersonDaoImpl implements PersonDao {

    @Autowired
    DataSourceConfiguration dataSource;

    @Autowired
    PersonRepository personrepo;

    @Override
    public void insert(List<? extends Person> Persons) {
        personrepo.save(Persons);
    }
}
Based on the code you provided and the stack trace in your comment, it's complaining that it can't find a @Bean named entityManagerFactory.
The reason this is happening is that you are using @EnableJpaRepositories, and its entityManagerFactoryRef attribute defaults to entityManagerFactory. This attribute defines the name of the @Bean for the EntityManagerFactory.
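One way to reconcile the two, sticking with the bean names from your configuration, would be to point the repositories at the custom factory explicitly. A sketch, not the only possible fix:

@EnableJpaRepositories(
        basePackages = "uk.gov.iebr.batch.repository",
        entityManagerFactoryRef = "allsparkEntityMF",   // the custom EntityManagerFactory bean
        transactionManagerRef = "defaultTm")            // the matching JpaTransactionManager bean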
That said, I think your application configuration is preventing Spring Boot's normal auto-configuration from being processed.
I would recommend removing the IEBRFileProcessApplication class and following this example for configuring your Spring Boot application (you could use a ServletInitializer if you want).
@SpringBootApplication
public class Application extends SpringBootServletInitializer {

    @Override
    protected SpringApplicationBuilder configure(SpringApplicationBuilder application) {
        return application.sources(Application.class);
    }

    public static void main(String[] args) throws Exception {
        SpringApplication.run(Application.class, args);
    }
}
I also can't really see a need for DataSourceConfiguration and AllSparkDataSourceConfiguration, so I would recommend removing them. If you really need to specify your own DataSource, let me know and I can provide an additional example.
Between the @SpringBootApplication and @EnableBatchProcessing annotations, everything that is necessary will be bootstrapped for you.
All you need on BatchConfig is @Configuration and @EnableBatchProcessing.
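That is, the class header of BatchConfig reduces to something like this sketch (the reader/processor/writer beans stay as they are):

@Configuration
@EnableBatchProcessing
public class BatchConfig {
    // no @EnableAutoConfiguration, no @PropertySource,
    // and no @Import of the DataSource configurations
}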
If you make these changes to simplify your code base, then your problems should disappear.
UPDATE:
I created a pull request located here https://github.com/soasathish/spring-batch-with-jpa/pull/1
Please take a look at the javadoc here for an explanation of how @EnableBatchProcessing works: http://docs.spring.io/spring-batch/apidocs/org/springframework/batch/core/configuration/annotation/EnableBatchProcessing.html

Unable to consume message from activeMQ using JMSListener

First I am sending a message to ActiveMQ, and that works. Now I want to consume that message from the queue using @JmsListener, but it is not working. Please help!
Here is my config class:
ApplicationConfig.java
package com.ge.health.gam.poc.config;
import javax.annotation.Resource;
import javax.jms.ConnectionFactory;
import org.apache.activemq.ActiveMQConnectionFactory;
import org.apache.activemq.RedeliveryPolicy;
import org.springframework.context.ApplicationContext;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.jms.annotation.EnableJms;
import org.springframework.jms.annotation.JmsListener;
import org.springframework.jms.config.DefaultJmsListenerContainerFactory;
import org.springframework.jms.core.JmsOperations;
import org.springframework.jms.core.JmsTemplate;
import org.springframework.jms.support.destination.BeanFactoryDestinationResolver;
import org.springframework.jms.support.destination.DestinationResolver;
@Configuration
@EnableJms
public class AppConfig {

    @Resource
    ApplicationContext applicationContext;

    @Bean
    public DefaultJmsListenerContainerFactory jmsListenerContainerFactory() {
        DefaultJmsListenerContainerFactory factory = new DefaultJmsListenerContainerFactory();
        factory.setConnectionFactory(connectionFactory());
        // BeanFactoryDestinationResolver resolves destination names as Spring bean names
        factory.setDestinationResolver(destinationResolver());
        factory.setConcurrency("9");
        factory.setTransactionManager(null);
        return factory;
    }

    @Bean
    public ConnectionFactory connectionFactory() {
        ActiveMQConnectionFactory activeMQConnectionFactory = new ActiveMQConnectionFactory("tcp://localhost:61616");
        RedeliveryPolicy policy = new RedeliveryPolicy();
        policy.setMaximumRedeliveries(0);
        activeMQConnectionFactory.setRedeliveryPolicy(policy);
        return activeMQConnectionFactory;
        //return new ActiveMQConnectionFactory("tcp://localhost:61616");
    }

    @Bean
    JmsTemplate jmsTemplate(ConnectionFactory connectionFactory) {
        return new JmsTemplate(connectionFactory);
    }

    @Bean
    public DestinationResolver destinationResolver() {
        return new BeanFactoryDestinationResolver(applicationContext);
    }
}
This is the DemoConsumer class:
package com.ge.health.gam.poc.consumer;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.beans.factory.annotation.Qualifier;
import org.springframework.jms.annotation.JmsListener;
import org.springframework.jms.core.JmsTemplate;
import org.springframework.messaging.handler.annotation.SendTo;
import org.springframework.stereotype.Component;
@Component
public class DemoConsumer {

    @JmsListener(destination = "NewQueue")
    public void demoConsumeMessage(String message) {
        System.out.println("Message Received");
    }
}
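One thing worth checking, given the configuration above: because the container factory uses a BeanFactoryDestinationResolver, the listener's destination "NewQueue" is looked up as the name of a Spring bean, not as a physical queue name. A sketch of what such a bean could look like (this bean is not in the original post; it is an assumption about the missing piece):

@Bean(name = "NewQueue")
public javax.jms.Queue newQueue() {
    // physical ActiveMQ queue that the bean name "NewQueue" resolves to
    return new org.apache.activemq.command.ActiveMQQueue("NewQueue");
}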
