Spring Batch Admin UI not showing jobs that are configured - spring-boot

I have integrated Spring Boot with spring-batch-admin successfully. I used the sample guide from Spring Batch for setting up jobs. I added a couple more tasklets to the jobs. My UI shows only one job; I expect to see three of them. I also expect the three jobs to have start/stop functionality and to take job parameters from the UI.
I have pushed the entire code here; please feel free to open pull requests if you have solutions or improvements.
Here is my job.xml located in src/main/resources/batch/jobs/
<?xml version="1.0" encoding="UTF-8"?>
<beans xmlns="http://www.springframework.org/schema/beans" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xmlns:batch="http://www.springframework.org/schema/batch"
xsi:schemaLocation="http://www.springframework.org/schema/batch http://www.springframework.org/schema/batch/spring-batch.xsd
http://www.springframework.org/schema/beans http://www.springframework.org/schema/beans/spring-beans-4.1.xsd">
<!-- This is the XML way to define jobs but it will be very handy if you already have jobs like this -->
<batch:job id="FirstJob" restartable="true">
<batch:step id="firstStep">
<batch:tasklet ref="firstTasklet" start-limit="1" />
</batch:step>
</batch:job>
<bean id="firstTasklet" class="hello.FirstTasklet">
<property name="property" value="${custom-property}" />
</bean>
</beans>
Here is my BatchConfiguration
package hello;
import javax.sql.DataSource;
import org.springframework.batch.core.Job;
import org.springframework.batch.core.JobExecutionListener;
import org.springframework.batch.core.Step;
import org.springframework.batch.core.StepContribution;
import org.springframework.batch.core.configuration.annotation.EnableBatchProcessing;
import org.springframework.batch.core.configuration.annotation.JobBuilderFactory;
import org.springframework.batch.core.configuration.annotation.StepBuilderFactory;
import org.springframework.batch.core.configuration.annotation.StepScope;
import org.springframework.batch.core.launch.support.RunIdIncrementer;
import org.springframework.batch.core.scope.context.ChunkContext;
import org.springframework.batch.core.step.tasklet.Tasklet;
import org.springframework.batch.item.database.BeanPropertyItemSqlParameterSourceProvider;
import org.springframework.batch.item.database.JdbcBatchItemWriter;
import org.springframework.batch.item.file.FlatFileItemReader;
import org.springframework.batch.item.file.mapping.BeanWrapperFieldSetMapper;
import org.springframework.batch.item.file.mapping.DefaultLineMapper;
import org.springframework.batch.item.file.transform.DelimitedLineTokenizer;
import org.springframework.batch.repeat.RepeatStatus;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.core.io.ClassPathResource;
import org.springframework.jdbc.core.JdbcTemplate;
@Configuration
//@EnableBatchProcessing
public class BatchConfiguration {
@Autowired
public JobBuilderFactory jobBuilderFactory;
@Autowired
public StepBuilderFactory stepBuilderFactory;
@Autowired
public DataSource dataSource;
// @Value("#{jobParameters['file']:sample-data.csv}")
// String filename;
// tag::readerwriterprocessor[]
@Bean
//@StepScope
public FlatFileItemReader<Person> reader() {
FlatFileItemReader<Person> reader = new FlatFileItemReader<Person>();
reader.setResource(new ClassPathResource("sample-data.csv"));
reader.setLineMapper(new DefaultLineMapper<Person>() {{
setLineTokenizer(new DelimitedLineTokenizer() {{
setNames(new String[] { "firstName", "lastName" });
}});
setFieldSetMapper(new BeanWrapperFieldSetMapper<Person>() {{
setTargetType(Person.class);
}});
}});
return reader;
}
@Bean
public PersonItemProcessor processor() {
return new PersonItemProcessor();
}
@Bean
public JdbcBatchItemWriter<Person> writer() {
JdbcBatchItemWriter<Person> writer = new JdbcBatchItemWriter<Person>();
writer.setItemSqlParameterSourceProvider(new BeanPropertyItemSqlParameterSourceProvider<Person>());
writer.setSql("INSERT INTO people (first_name, last_name) VALUES (:firstName, :lastName)");
writer.setDataSource(dataSource);
return writer;
}
// end::readerwriterprocessor[]
// tag::jobstep[]
@Bean
public Job importUserJob(JobCompletionNotificationListener listener) {
return jobBuilderFactory.get("importUserJob")
.incrementer(new RunIdIncrementer())
.listener(listener)
.flow(step1())
.end()
.build();
}
@Bean
public Step step1() {
return stepBuilderFactory.get("step1")
.<Person, Person> chunk(10)
.reader(reader())
.processor(processor())
.writer(writer())
.build();
}
// end::jobstep[]
@Bean
@StepScope
public FailableTasklet tasklet(@Value("#{jobParameters[fail]}") Boolean failable) {
if(failable != null) {
return new FailableTasklet(failable);
}
else {
return new FailableTasklet(false);
}
}
public static class FailableTasklet implements Tasklet {
private final boolean fail;
public FailableTasklet(boolean fail) {
this.fail = fail;
}
@Override
public RepeatStatus execute(StepContribution contribution, ChunkContext chunkContext) throws Exception {
System.out.println("Tasklet was executed");
if(fail) {
throw new RuntimeException("This exception was expected");
}
else {
return RepeatStatus.FINISHED;
}
}
}
}
screenshot of UI

The Spring Batch Admin UI displays only the jobs. You won't see steps/tasklets in the UI, as they cannot be run individually. But after you run your job, you can see the stats of each step that was performed in that job. Hope this helps.
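If the goal is to see three separate entries in the job list, each of them has to be registered as its own Job bean (or its own <batch:job> element); tasklets added as extra steps of an existing job only show up as step statistics inside that job's executions. A minimal sketch in the style of the Java config above, using a placeholder tasklet (the secondJob/secondStep names are illustrative, not from the original code):
@Bean
public Job secondJob() {
    // Registered under its own name, so it appears as a separate row in the jobs list
    return jobBuilderFactory.get("secondJob")
            .start(stepBuilderFactory.get("secondStep")
                    .tasklet((contribution, chunkContext) -> RepeatStatus.FINISHED)
                    .build())
            .build();
}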

Related

spring batch : typeMismatch.java.sql.Date,typeMismatch

I'm trying to send this CSV file to the database with Spring Batch:
users.csv
2021-06-22,test1@gmail.com, testFullname1, testMatricule1, 1234, testUsername1
2021-06-22,test2@gmail.com, testFullname2, testMatricule2, 0000, testUsername2
and I have this error:
Failed to convert property value of type 'java.lang.String' to required type 'java.sql.Date' for property
Here's my batch config:
package sofrecom.collaborateur.config;
import javax.sql.DataSource;
import org.springframework.batch.core.Job;
import org.springframework.batch.core.Step;
import org.springframework.batch.core.configuration.annotation.EnableBatchProcessing;
import org.springframework.batch.core.configuration.annotation.JobBuilderFactory;
import org.springframework.batch.core.configuration.annotation.StepBuilderFactory;
import org.springframework.batch.core.launch.support.RunIdIncrementer;
import org.springframework.batch.item.database.BeanPropertyItemSqlParameterSourceProvider;
import org.springframework.batch.item.database.JdbcBatchItemWriter;
import org.springframework.batch.item.file.FlatFileItemReader;
import org.springframework.batch.item.file.mapping.BeanWrapperFieldSetMapper;
import org.springframework.batch.item.file.mapping.DefaultLineMapper;
import org.springframework.batch.item.file.transform.DelimitedLineTokenizer;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.core.io.ClassPathResource;
import sofrecom.collaborateur.model.DAOUser;
@Configuration
@EnableBatchProcessing
public class BatchConfiguration {
@Autowired
public JobBuilderFactory jobBuilderFactory;
@Autowired
public StepBuilderFactory stepBuilderFactory;
@Autowired
public DataSource dataSource;
@Bean
public FlatFileItemReader<DAOUser> reader() {
FlatFileItemReader<DAOUser> reader = new FlatFileItemReader<DAOUser>();
reader.setResource(new ClassPathResource("users.csv"));
reader.setLineMapper(new DefaultLineMapper<DAOUser>() {{
setLineTokenizer(new DelimitedLineTokenizer() {{
setNames(new String[] { "dateIntegration","email","fullname","matricule","password","username" });
}});
setFieldSetMapper(new BeanWrapperFieldSetMapper<DAOUser>() {{
setTargetType(DAOUser.class);
}});
}});
return reader;
}
@Bean
public UserItemProcessor processor() {
return new UserItemProcessor();
}
@Bean
public JdbcBatchItemWriter<DAOUser> writer() {
JdbcBatchItemWriter<DAOUser> writer = new JdbcBatchItemWriter<DAOUser>();
writer.setItemSqlParameterSourceProvider(new BeanPropertyItemSqlParameterSourceProvider<DAOUser>());
writer.setSql("INSERT INTO user ( date_integration,email,fullname,matricule,password,username) VALUES ( :dateIntegration,:email,:fullname,:matricule,:password,:username)");
writer.setDataSource(dataSource);
return writer;
}
@Bean
public Job importUserJob(JobCompletionNotificationListener listener) {
return jobBuilderFactory.get("importUserJob")
.incrementer(new RunIdIncrementer())
.listener(listener)
.flow(step1())
.end()
.build();
}
@Bean
public Step step1() {
return stepBuilderFactory.get("step1")
.<DAOUser, DAOUser> chunk(10)
.reader(reader())
.processor(processor())
.writer(writer())
.build();
}
}
And this is the user item processor:
package sofrecom.collaborateur.config;
import java.sql.Date;
import java.text.SimpleDateFormat;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.batch.item.ItemProcessor;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.security.crypto.password.PasswordEncoder;
import sofrecom.collaborateur.model.DAOUser;
public class UserItemProcessor implements ItemProcessor<DAOUser, DAOUser> {
private static final Logger log = LoggerFactory.getLogger(UserItemProcessor.class);
@Autowired
private PasswordEncoder bcryptEncoder;
@Override
public DAOUser process(final DAOUser person) throws Exception {
final String password = bcryptEncoder.encode(person.getUsername());
final DAOUser transformedPerson = new DAOUser(person.getDateIntegration(),person.getEmail(),person.getFullname(),person.getMatricule(),password,person.getUsername());
log.info("Converting (" + person + ") into (" + transformedPerson + ")");
return transformedPerson;
}
}
Any solution, please?
The problem is that the BeanWrapperFieldSetMapper does not know, by default, how to convert a String like 2021-06-22 to an object of type java.sql.Date (which is the field dateIntegration) in your domain object DAOUser. The way to tell this mapper how to deal with custom conversions is by registering a ConversionService. This conversion service should have a converter from String to java.sql.Date registered in it. Here is a quick example:
@Bean
public FlatFileItemReader<DAOUser> reader() {
DefaultConversionService conversionService = new DefaultConversionService();
conversionService.addConverter(new Converter<String, Date>() { // java.sql.Date
@Override
public Date convert(String s) {
return Date.valueOf(s);
}
});
BeanWrapperFieldSetMapper<DAOUser> fieldSetMapper = new BeanWrapperFieldSetMapper<>();
fieldSetMapper.setConversionService(conversionService);
fieldSetMapper.setTargetType(DAOUser.class);
FlatFileItemReader<DAOUser> reader = new FlatFileItemReader<>();
reader.setResource(new ClassPathResource("users.csv"));
reader.setLineMapper(new DefaultLineMapper<DAOUser>() {{
setLineTokenizer(new DelimitedLineTokenizer() {{
setNames(new String[] { "dateIntegration","email","fullname","matricule","password","username" });
}});
setFieldSetMapper(fieldSetMapper);
}});
return reader;
}
You can find a complete example based on what you shared in this repository.

Spring Batch: Can't quite work out Conditional Flow

Edited to update my latest configuration: Is this on the right track for my use-case?
I have a flow that's supposed to go like this:
The FileRetrievingTasklet retrieves a remote file and places the
"type" of that file in the execution context.
If the file is of type "YEARLY", proceed to the yearlyStep().
If the file is of type "QUARTERLY", proceed to the quarterlyStep().
Finish.
This seems so simple, but what I have doesn't work. The job finishes with FAILED after the tasklet step.
Here's my job config:
@Bean
public Job fundsDistributionJob() {
return jobBuilderFactory
.get("fundsDistributionJob")
.start(retrieveFileStep(stepBuilderFactory))
.on("YEARLY").to(yearEndStep())
.from(retrieveFileStep(stepBuilderFactory))
.on("QUARTERLY").to(quarterlyStep())
.end()
.listener(new FileWorkerJobExecutionListener())
.build();
}
And one of the steps:
@Bean
public Step quarterlyStep() {
return stepBuilderFactory.get("quarterlyStep")
.<Item, Item>chunk(10)
.reader(quarterlyReader())
.processor(processor())
.writer(writer())
.listener(new StepItemReadListener())
.faultTolerant()
.skipPolicy(new DistSkipPolicy())
.build();
}
Can someone tell me what's missing?
The approach with a decider (before your edit) is the way to go. You just had an issue with your flow definition. Here is an example that works as you described:
import org.springframework.batch.core.Job;
import org.springframework.batch.core.JobParameters;
import org.springframework.batch.core.Step;
import org.springframework.batch.core.configuration.annotation.EnableBatchProcessing;
import org.springframework.batch.core.configuration.annotation.JobBuilderFactory;
import org.springframework.batch.core.configuration.annotation.StepBuilderFactory;
import org.springframework.batch.core.job.flow.FlowExecutionStatus;
import org.springframework.batch.core.job.flow.JobExecutionDecider;
import org.springframework.batch.core.launch.JobLauncher;
import org.springframework.batch.repeat.RepeatStatus;
import org.springframework.context.ApplicationContext;
import org.springframework.context.annotation.AnnotationConfigApplicationContext;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
@Configuration
@EnableBatchProcessing
public class MyJob {
private final JobBuilderFactory jobs;
private final StepBuilderFactory steps;
public MyJob(JobBuilderFactory jobs, StepBuilderFactory steps) {
this.jobs = jobs;
this.steps = steps;
}
@Bean
public Step retrieveFileStep() {
return steps.get("retrieveFileStep")
.tasklet((contribution, chunkContext) -> {
System.out.println("Downloading file..");
chunkContext.getStepContext().getStepExecution()
.getExecutionContext().put("type", Type.YEARLY);
return RepeatStatus.FINISHED;
})
.build();
}
@Bean
public JobExecutionDecider fileMapperDecider() {
return (jobExecution, stepExecution) -> {
Type type = (Type) stepExecution.getExecutionContext().get("type");
return new FlowExecutionStatus(type == Type.YEARLY ? "yearly" : "quarterly");
};
}
@Bean
public Step yearlyStep() {
return steps.get("yearlyStep")
.tasklet((contribution, chunkContext) -> {
System.out.println("running yearlyStep");
return RepeatStatus.FINISHED;
})
.build();
}
@Bean
public Step quarterlyStep() {
return steps.get("quarterlyStep")
.tasklet((contribution, chunkContext) -> {
System.out.println("running quarterlyStep");
return RepeatStatus.FINISHED;
})
.build();
}
@Bean
public Job job() {
return jobs.get("job")
.start(retrieveFileStep())
.next(fileMapperDecider())
.from(fileMapperDecider()).on("yearly").to(yearlyStep())
.from(fileMapperDecider()).on("quarterly").to(quarterlyStep())
.build()
.build();
}
public static void main(String[] args) throws Exception {
ApplicationContext context = new AnnotationConfigApplicationContext(MyJob.class);
JobLauncher jobLauncher = context.getBean(JobLauncher.class);
Job job = context.getBean(Job.class);
jobLauncher.run(job, new JobParameters());
}
enum Type {
YEARLY, QUARTERLY
}
}
It prints:
Downloading file..
running yearlyStep
If you change the type attribute in the execution context to Type.QUARTERLY in retrieveFileStep, it prints:
Downloading file..
running quarterlyStep
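For reference, that change is just the value written to the execution context inside retrieveFileStep:
// Routing the flow through the "quarterly" branch of the decider
chunkContext.getStepContext().getStepExecution()
        .getExecutionContext().put("type", Type.QUARTERLY);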

Dynamic step creation in Spring Batch using custom parameter for decision making

Perfect solution for dynamic step creation in Spring Batch.
It's just that I am not able to get parameters into this, which will decide which steps need to be executed, or figure out how I can pass the steps array.
@Bean
public Job job() {
Step[] stepsArray = // create your steps array or pass it as a parameter
SimpleJobBuilder jobBuilder = jobBuilderFactory.get("mainCalculationJob")
.incrementer(new RunIdIncrementer())
.start(truncTableTaskletStep());
for (Step step : stepsArray) {
jobBuilder.next(step);
}
return jobBuilder.build();
}
Thanks.
I am looking for how to pass this steps array as a parameter and get it in the above function.
Here is an example of how to pass the steps array as a parameter:
import org.springframework.batch.core.Job;
import org.springframework.batch.core.JobParameters;
import org.springframework.batch.core.Step;
import org.springframework.batch.core.configuration.annotation.EnableBatchProcessing;
import org.springframework.batch.core.configuration.annotation.JobBuilderFactory;
import org.springframework.batch.core.configuration.annotation.StepBuilderFactory;
import org.springframework.batch.core.job.builder.SimpleJobBuilder;
import org.springframework.batch.core.launch.JobLauncher;
import org.springframework.batch.repeat.RepeatStatus;
import org.springframework.context.ApplicationContext;
import org.springframework.context.annotation.AnnotationConfigApplicationContext;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
@Configuration
@EnableBatchProcessing
public class MyJobConfiguration {
private final JobBuilderFactory jobBuilderFactory;
private final StepBuilderFactory stepBuilderFactory;
public MyJobConfiguration(JobBuilderFactory jobBuilderFactory, StepBuilderFactory stepBuilderFactory) {
this.jobBuilderFactory = jobBuilderFactory;
this.stepBuilderFactory = stepBuilderFactory;
}
public Step initialStep() {
return stepBuilderFactory.get("initialStep")
.tasklet((contribution, chunkContext) -> {
System.out.println("initial step");
return RepeatStatus.FINISHED;
})
.build();
}
@Bean
public Step[] dynamicSteps() {
// load steps sequence from db and create steps here
Step step1 = stepBuilderFactory.get("step1")
.tasklet((contribution, chunkContext) -> {
System.out.println("hello");
return RepeatStatus.FINISHED;
})
.build();
Step step2 = stepBuilderFactory.get("step2")
.tasklet((contribution, chunkContext) -> {
System.out.println("world");
return RepeatStatus.FINISHED;
})
.build();
return new Step[]{step1, step2};
}
@Bean
public Job job(Step[] dynamicSteps) {
SimpleJobBuilder jobBuilder = jobBuilderFactory.get("job")
.start(initialStep());
for (Step step : dynamicSteps) {
jobBuilder.next(step);
}
return jobBuilder.build();
}
public static void main(String[] args) throws Exception {
ApplicationContext context = new AnnotationConfigApplicationContext(MyJobConfiguration.class);
JobLauncher jobLauncher = context.getBean(JobLauncher.class);
Job job = context.getBean(Job.class);
jobLauncher.run(job, new JobParameters());
}
}
Nothing related to Spring Batch here, this is Spring dependency injection: passing an array of beans of type Step as a parameter to another bean definition method (of type Job).

Spring Batch With Annotation and Caching

Does anyone have a good example of Spring Batch (using annotations) to cache a reference table which will be accessible to the processor?
I just need a simple cache: run a query which returns some byte[] and keep it in memory for as long as the job is executing.
Appreciate any help on this topic.
Thanks !
A JobExecutionListener can be used to populate the cache with reference data before the job is executed and clear the cache after the job is finished.
Here is an example:
import org.springframework.batch.core.Job;
import org.springframework.batch.core.JobExecution;
import org.springframework.batch.core.JobExecutionListener;
import org.springframework.batch.core.JobParameters;
import org.springframework.batch.core.Step;
import org.springframework.batch.core.StepContribution;
import org.springframework.batch.core.configuration.annotation.EnableBatchProcessing;
import org.springframework.batch.core.configuration.annotation.JobBuilderFactory;
import org.springframework.batch.core.configuration.annotation.StepBuilderFactory;
import org.springframework.batch.core.launch.JobLauncher;
import org.springframework.batch.core.scope.context.ChunkContext;
import org.springframework.batch.core.step.tasklet.Tasklet;
import org.springframework.batch.repeat.RepeatStatus;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.cache.CacheManager;
import org.springframework.cache.concurrent.ConcurrentMapCacheManager;
import org.springframework.context.ApplicationContext;
import org.springframework.context.annotation.AnnotationConfigApplicationContext;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
@Configuration
@EnableBatchProcessing
public class MyJob {
private JobBuilderFactory jobs;
private StepBuilderFactory steps;
public MyJob(JobBuilderFactory jobs, StepBuilderFactory steps) {
this.jobs = jobs;
this.steps = steps;
}
@Bean
public CacheManager cacheManager() {
return new ConcurrentMapCacheManager(); // return the implementation you want
}
@Bean
public Tasklet tasklet() {
return new MyTasklet(cacheManager());
}
@Bean
public Step step() {
return steps.get("step")
.tasklet(tasklet())
.build();
}
@Bean
public JobExecutionListener jobExecutionListener() {
return new CachingJobExecutionListener(cacheManager());
}
@Bean
public Job job() {
return jobs.get("job")
.start(step())
.listener(jobExecutionListener())
.build();
}
class MyTasklet implements Tasklet {
private CacheManager cacheManager;
public MyTasklet(CacheManager cacheManager) {
this.cacheManager = cacheManager;
}
@Override
public RepeatStatus execute(StepContribution contribution, ChunkContext chunkContext) throws Exception {
String name = (String) cacheManager.getCache("referenceData").get("foo").get();
System.out.println("Hello " + name);
return RepeatStatus.FINISHED;
}
}
class CachingJobExecutionListener implements JobExecutionListener {
private CacheManager cacheManager;
public CachingJobExecutionListener(CacheManager cacheManager) {
this.cacheManager = cacheManager;
}
@Override
public void beforeJob(JobExecution jobExecution) {
// populate cache as needed. Can use a jdbcTemplate to query the db here and populate the cache
cacheManager.getCache("referenceData").put("foo", "bar");
}
@Override
public void afterJob(JobExecution jobExecution) {
// clear cache when the job is finished
cacheManager.getCache("referenceData").clear();
}
}
public static void main(String[] args) throws Exception {
ApplicationContext context = new AnnotationConfigApplicationContext(MyJob.class);
JobLauncher jobLauncher = context.getBean(JobLauncher.class);
Job job = context.getBean(Job.class);
jobLauncher.run(job, new JobParameters());
}
}
When executed, it prints:
Hello bar
which means the data is correctly retrieved from the cache. You would need to adapt the sample to query the database and populate the cache (see the comments in the code and the sketch below).
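As a rough sketch of that adaptation (the reference_data table, its columns, and the injected JdbcTemplate are assumptions, not part of the original sample), beforeJob could look like this:
@Override
public void beforeJob(JobExecution jobExecution) {
    // Query the database and put each row into the cache before any step runs
    jdbcTemplate.query("SELECT ref_key, ref_value FROM reference_data", rs -> {
        cacheManager.getCache("referenceData").put(rs.getString("ref_key"), rs.getBytes("ref_value"));
    });
}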
Hope this helps.
You can use the ehcache-jsr107 implementation. It is very quick to set up. A Spring and Ehcache integration example is available here. You should be able to set up the same with Spring Batch as well. Hope this helps.
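As a minimal sketch of that wiring, assuming the Ehcache JSR-107 module is on the classpath, you could replace the ConcurrentMapCacheManager from the earlier answer with a JCache-backed CacheManager; the rest of the listener/tasklet code stays unchanged:
import javax.cache.Caching;
import javax.cache.configuration.MutableConfiguration;
import org.springframework.cache.CacheManager;
import org.springframework.cache.jcache.JCacheCacheManager;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
@Configuration
public class CacheConfig {
    @Bean
    public CacheManager cacheManager() {
        // Resolve whichever JSR-107 provider is on the classpath (e.g. Ehcache 3)
        javax.cache.CacheManager jcacheManager = Caching.getCachingProvider().getCacheManager();
        // Create the cache used by the listener and tasklet in the sample above
        jcacheManager.createCache("referenceData", new MutableConfiguration<>());
        // Expose it through Spring's CacheManager abstraction
        return new JCacheCacheManager(jcacheManager);
    }
}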

Spring Batch ResourcelessTransactionManager messes with persistence.xml?

I am working on an application and have been asked to implement a scheduled Spring Batch job. I have set up a configuration file where I define a @Bean ResourcelessTransactionManager, but it seems to mess with the persistence.xml.
There is already a persistence.xml in another module, and there is no compilation error. I get a NoUniqueBeanDefinitionException when I request a page that returns a view item.
This is the error:
Caused by: org.springframework.beans.factory.NoUniqueBeanDefinitionException: No qualifying bean of type [org.springframework.transaction.PlatformTransactionManager] is defined: expected single matching bean but found 2: txManager,transactionManager
at org.springframework.beans.factory.support.DefaultListableBeanFactory.getBean(DefaultListableBeanFactory.java:365)
at org.springframework.beans.factory.support.DefaultListableBeanFactory.getBean(DefaultListableBeanFactory.java:331)
at org.springframework.transaction.interceptor.TransactionAspectSupport.determineTransactionManager(TransactionAspectSupport.java:366)
at org.springframework.transaction.interceptor.TransactionAspectSupport.invokeWithinTransaction(TransactionAspectSupport.java:271)
at org.springframework.transaction.interceptor.TransactionInterceptor.invoke(TransactionInterceptor.java:96)
at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179)
at org.springframework.aop.framework.CglibAopProxy$DynamicAdvisedInterceptor.intercept(CglibAopProxy.java:653)
at com.mypackage.services.MyClassService$$EnhancerBySpringCGLIB$$9e8bf16f.registryEvents(<generated>)
at com.mypackage.controllers.MyClassSearchView.init(MyClassSearchView.java:75)
... 168 more
Is there a way to tell Spring Batch to use the data source defined in the persistence.xml of the other module, or is this maybe caused by something else?
I created a separate BatchScheduler Java class as below and included it in the BatchConfiguration Java class. I am sharing both classes. BatchConfiguration contains another jpaTransactionManager.
import org.springframework.batch.core.repository.JobRepository;
import org.springframework.batch.core.repository.support.MapJobRepositoryFactoryBean;
import org.springframework.batch.core.launch.support.SimpleJobLauncher;
import org.springframework.batch.support.transaction.ResourcelessTransactionManager;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.scheduling.annotation.EnableScheduling;
@Configuration
@EnableScheduling
public class BatchScheduler {
@Bean
public ResourcelessTransactionManager resourcelessTransactionManager() {
return new ResourcelessTransactionManager();
}
@Bean
public MapJobRepositoryFactoryBean mapJobRepositoryFactory(
ResourcelessTransactionManager resourcelessTransactionManager) throws Exception {
MapJobRepositoryFactoryBean factory = new
MapJobRepositoryFactoryBean(resourcelessTransactionManager);
factory.afterPropertiesSet();
return factory;
}
@Bean
public JobRepository jobRepository(
MapJobRepositoryFactoryBean factory) throws Exception {
return factory.getObject();
}
@Bean
public SimpleJobLauncher jobLauncher(JobRepository jobRepository) {
SimpleJobLauncher launcher = new SimpleJobLauncher();
launcher.setJobRepository(jobRepository);
return launcher;
}
}
BatchConfiguration contains another jpaTransactionManager.
import java.io.IOException;
import java.util.Date;
import java.util.Properties;
import javax.sql.DataSource;
import org.springframework.batch.core.Job;
import org.springframework.batch.core.JobExecution;
import org.springframework.batch.core.JobParameters;
import org.springframework.batch.core.JobParametersBuilder;
import org.springframework.batch.core.Step;
import org.springframework.batch.core.configuration.annotation.EnableBatchProcessing;
import org.springframework.batch.core.configuration.annotation.JobBuilderFactory;
import org.springframework.batch.core.configuration.annotation.StepBuilderFactory;
import org.springframework.batch.core.configuration.annotation.StepScope;
import org.springframework.batch.core.launch.JobLauncher;
import org.springframework.batch.core.launch.support.RunIdIncrementer;
import org.springframework.batch.item.database.JpaItemWriter;
import org.springframework.batch.item.file.mapping.BeanWrapperFieldSetMapper;
import org.springframework.batch.item.file.mapping.DefaultLineMapper;
import org.springframework.batch.item.file.transform.DelimitedLineTokenizer;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.beans.factory.annotation.Qualifier;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.context.annotation.Import;
import org.springframework.context.support.PropertySourcesPlaceholderConfigurer;
import org.springframework.core.env.Environment;
import org.springframework.core.io.FileSystemResource;
import org.springframework.core.io.support.PathMatchingResourcePatternResolver;
import org.springframework.jdbc.datasource.DriverManagerDataSource;
import org.springframework.orm.jpa.JpaTransactionManager;
import org.springframework.orm.jpa.JpaVendorAdapter;
import org.springframework.orm.jpa.LocalContainerEntityManagerFactoryBean;
import org.springframework.orm.jpa.vendor.Database;
import org.springframework.orm.jpa.vendor.HibernateJpaVendorAdapter;
import org.springframework.scheduling.TaskScheduler;
import org.springframework.scheduling.annotation.EnableScheduling;
import org.springframework.scheduling.annotation.Scheduled;
import org.springframework.scheduling.concurrent.ConcurrentTaskScheduler;
import org.springframework.transaction.PlatformTransactionManager;
import trade.api.common.constants.Constants;
import trade.api.entity.SecurityEntity;
import trade.api.trade.batch.item.processor.SecurityItemProcessor;
import trade.api.trade.batch.item.reader.NseSecurityReader;
import trade.api.trade.batch.notification.listener.SecurityJobCompletionNotificationListener;
import trade.api.trade.batch.tasklet.SecurityReaderTasklet;
import trade.api.vo.SecurityVO;
@Configuration
@EnableBatchProcessing
@EnableScheduling
@Import({OhlcMonthBatchConfiguration.class, OhlcWeekBatchConfiguration.class, OhlcDayBatchConfiguration.class, OhlcMinuteBatchConfiguration.class})
public class BatchConfiguration {
private static final String OVERRIDDEN_BY_EXPRESSION = null;
/*
Load the properties
*/
#Value("${database.driver}")
private String databaseDriver;
#Value("${database.url}")
private String databaseUrl;
#Value("${database.username}")
private String databaseUsername;
#Value("${database.password}")
private String databasePassword;
@Autowired
public JobBuilderFactory jobBuilderFactory;
@Autowired
public StepBuilderFactory stepBuilderFactory;
@Autowired
private JobLauncher jobLauncher;
@Bean
public TaskScheduler taskScheduler() {
return new ConcurrentTaskScheduler();
}
//second, minute, hour, day of month, month, day(s) of week
//@Scheduled(cron = "0 0 21 * * 1-5") on week days
@Scheduled(cron="${schedule.insert.security}")
public void importSecuritySchedule() throws Exception {
System.out.println("Job Started at :" + new Date());
JobParameters param = new JobParametersBuilder().addString("JobID",
String.valueOf(System.currentTimeMillis())).toJobParameters();
JobExecution execution = jobLauncher.run(importSecuritesJob(), param);
System.out.println("Job finished with status :" + execution.getStatus());
}
@Bean SecurityJobCompletionNotificationListener securityJobCompletionNotificationListener() {
return new SecurityJobCompletionNotificationListener();
}
//Import Equity OHLC End
//Import Equity Start
// tag::readerwriterprocessor[]
@Bean
public SecurityReaderTasklet securityReaderTasklet() {
return new SecurityReaderTasklet();
}
@Bean
@StepScope
public NseSecurityReader<SecurityVO> nseSecurityReader(@Value("#{jobExecutionContext["+Constants.SECURITY_DOWNLOAD_FILE+"]}") String pathToFile) throws IOException {
NseSecurityReader<SecurityVO> reader = new NseSecurityReader<SecurityVO>();
reader.setLinesToSkip(1);
reader.setResource(new FileSystemResource(pathToFile));
reader.setLineMapper(new DefaultLineMapper<SecurityVO>() {{
setLineTokenizer(new DelimitedLineTokenizer() {{
setNames(new String[] { "symbol", "nameOfCompany", "series", "dateOfListing", "paidUpValue", "marketLot", "isinNumber", "faceValue" });
}});
setFieldSetMapper(new BeanWrapperFieldSetMapper<SecurityVO>() {{
setTargetType(SecurityVO.class);
}});
}});
return reader;
}
@Bean
public SecurityItemProcessor processor() {
return new SecurityItemProcessor();
}
@Bean
public JpaItemWriter<SecurityEntity> writer() {
JpaItemWriter<SecurityEntity> writer = new JpaItemWriter<SecurityEntity>();
writer.setEntityManagerFactory(entityManagerFactory().getObject());
return writer;
}
// end::readerwriterprocessor[]
// tag::jobstep[]
@Bean
public Job importSecuritesJob() throws IOException {
return jobBuilderFactory.get("importSecuritesJob")
.incrementer(new RunIdIncrementer())
.listener(securityJobCompletionNotificationListener())
.start(downloadSecurityStep())
.next(insertSecurityStep())
.build();
}
@Bean
public Step downloadSecurityStep() throws IOException {
return stepBuilderFactory.get("downloadSecurityStep")
.tasklet(securityReaderTasklet())
.build();
}
@Bean
public Step insertSecurityStep() throws IOException {
return stepBuilderFactory.get("insertSecurityStep")
.transactionManager(jpaTransactionManager())
.<SecurityVO, SecurityEntity> chunk(100)
.reader(nseSecurityReader(OVERRIDDEN_BY_EXPRESSION))
.processor(processor())
.writer(writer())
.build();
}
// end::jobstep[]
//Import Equity End
@Bean
public DataSource dataSource() {
DriverManagerDataSource dataSource = new DriverManagerDataSource();
dataSource.setDriverClassName(databaseDriver);
dataSource.setUrl(databaseUrl);
dataSource.setUsername(databaseUsername);
dataSource.setPassword(databasePassword);
return dataSource;
}
@Bean
public LocalContainerEntityManagerFactoryBean entityManagerFactory() {
LocalContainerEntityManagerFactoryBean lef = new LocalContainerEntityManagerFactoryBean();
lef.setPackagesToScan("trade.api.entity");
lef.setDataSource(dataSource());
lef.setJpaVendorAdapter(jpaVendorAdapter());
lef.setJpaProperties(new Properties());
return lef;
}
@Bean
public JpaVendorAdapter jpaVendorAdapter() {
HibernateJpaVendorAdapter jpaVendorAdapter = new HibernateJpaVendorAdapter();
jpaVendorAdapter.setDatabase(Database.MYSQL);
jpaVendorAdapter.setGenerateDdl(true);
jpaVendorAdapter.setShowSql(false);
jpaVendorAdapter.setDatabasePlatform("org.hibernate.dialect.MySQLDialect");
return jpaVendorAdapter;
}
@Bean
@Qualifier("jpaTransactionManager")
public PlatformTransactionManager jpaTransactionManager() {
return new JpaTransactionManager(entityManagerFactory().getObject());
}
@Bean
public static PropertySourcesPlaceholderConfigurer dataProperties(Environment environment) throws IOException {
String[] activeProfiles = environment.getActiveProfiles();
final PropertySourcesPlaceholderConfigurer ppc = new PropertySourcesPlaceholderConfigurer();
ppc.setLocations(new PathMatchingResourcePatternResolver().getResources("classpath*:application-"+activeProfiles[0]+".properties"));
return ppc;
}
//// Import Security End
}
Problem solved. There was a PlatformTransactionManager bean located in another configuration file. I marked it as @Primary and now the problem is fixed. Thanks everyone for the help.
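For anyone hitting the same NoUniqueBeanDefinitionException, the fix boils down to something like the following in the other configuration class (bean and parameter names here are illustrative, not the actual ones from that module):
@Bean
@Primary
public PlatformTransactionManager transactionManager(EntityManagerFactory entityManagerFactory) {
    // Marked as primary so the transaction interceptor no longer finds two equally eligible candidates
    return new JpaTransactionManager(entityManagerFactory);
}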
