Spring Batch: How to change the default isolation level? - spring-boot

I read in the documentation:
"The default is ISOLATION_SERIALIZABLE, which prevents accidental
concurrent execution of the SAME job"
However, when I launch DIFFERENT jobs at the same time (with the default isolation level of SERIALIZABLE), I get the error: ORA-08177: can't serialize access for this transaction. Is that normal?
Second, to change the default isolation level to READ_COMMITTED, I understood that this level cannot be changed in application.properties and that I have to redefine the BatchConfigurer. Is that correct?
Using BasicBatchConfigurer, I must define an explicit constructor (implicit super constructor BasicBatchConfigurer() is undefined for default constructor).
However, I get the error:
Parameter 0 of constructor in MyBatchConfigurer required a bean of type 'org.springframework.boot.autoconfigure.batch.BatchProperties' that could not be found.
How do I create the BatchProperties properties, DataSource dataSource and TransactionManagerCustomizers transactionManagerCustomizers beans?
This is my code :
PeopleApplication.java
@SpringBootApplication
@EnableAutoConfiguration(exclude = { BatchAutoConfiguration.class })
public class PeopleApplication {
    public static void main(String[] args) throws Exception {
        ConfigurableApplicationContext ctx = new SpringApplicationBuilder(PeopleApplication.class)
                .web(WebApplicationType.NONE)
                .run(args);
        int exitValue = SpringApplication.exit(ctx);
        System.exit(exitValue);
    }
}
MyBatchConfigurer.java
@Component
@PropertySource("classpath:fileA.properties")
public class MyBatchConfigurer extends BasicBatchConfigurer implements CommandLineRunner, ExitCodeGenerator {
    protected MyBatchConfigurer(BatchProperties properties, DataSource dataSource, TransactionManagerCustomizers transactionManagerCustomizers) {
        super(properties, dataSource, transactionManagerCustomizers);
    }

    @Override
    protected String determineIsolationLevel() {
        return "ISOLATION_" + Isolation.READ_COMMITTED;
    }

    @Override
    public void run(String... args) {
        ...
    }
    ...
}
Regards.
RESPONSE:
Use @EnableAutoConfiguration instead of:
@EnableAutoConfiguration(exclude = { BatchAutoConfiguration.class })
This way, the BatchProperties, DataSource and TransactionManagerCustomizers beans will be created automatically.

Please see the reply from Mahmoud, which explains "can't serialize access for this transaction" in Spring Batch very clearly.
An example of the usage is below; it overrides only the isolation level.
application.properties
spring.application.name=SpringBatch
####### SPRING ##############
spring.main.banner-mode=off
spring.main.web-application-type=none
spring.batch.initialize-schema=always
# Disable the default job launch on startup if you want to control it yourself
spring.batch.job.enabled=false
######## JDBC Datasource ########
# connection timeout 10 min
spring.datasource.hikari.connection-timeout=600000
spring.datasource.hikari.minimum-idle=5
spring.datasource.hikari.maximum-pool-size=100
spring.datasource.hikari.idle-timeout=600000
spring.datasource.hikari.max-lifetime=1800000
spring.datasource.hikari.auto-commit=true
spring.datasource.hikari.poolName=SpringBoot-HikariCP
spring.datasource.url=jdbc:oracle:thin:@ngecom.ae:1521:ngbilling
spring.datasource.username=ngbilling
spring.datasource.password=ngbilling
@SpringBootApplication
public class YourApplication {
    public static void main(String[] args) {
        SpringApplication.run(YourApplication.class, args);
    }
}
// Your manual batch scheduler
class BatchJobScheduler extends BasicBatchConfigurer {
    @Autowired
    private JobLauncher jobLauncher;
    @Autowired
    private ApplicationArguments args;
    @Autowired
    private Job myJob;

    protected BatchJobScheduler(BatchProperties properties, DataSource dataSource,
            TransactionManagerCustomizers transactionManagerCustomizers) {
        super(properties, dataSource, transactionManagerCustomizers);
    }

    @Override
    protected String determineIsolationLevel() {
        return "ISOLATION_" + Isolation.READ_COMMITTED;
    }

    //@Scheduled(cron = "${batch.cron}")
    public void notScheduledJob() throws Exception {
        String appId = args.getOptionValues("appId").get(0);
        JobParameters params = new JobParametersBuilder()
                .addLong("jobId" + appId, System.currentTimeMillis())
                .toJobParameters();
        jobLauncher.run(myJob, params);
    }
}
// Your batch configuration; Spring Batch will wire the rest for you
@Configuration
@EnableBatchProcessing
@EnableScheduling
public class BatchConfiguration {
    Logger logger = LoggerFactory.getLogger(BatchConfiguration.class);
    @Value("${batch.commit.chunk}")
    private Integer chunkSize;
    @Value("${batch.error.skipCount}")
    private Integer skipErrorCount;
    @Autowired
    private Environment environment;
    @Autowired
    public JobBuilderFactory jobBuilderFactory;
    @Autowired
    public StepBuilderFactory stepBuilderFactory;
    @Autowired
    private DataSource dataSource;
    @Autowired
    private ApplicationArguments args;
    @Autowired
    private JdbcTemplate jdbcTemplate;

    @Bean
    public Job myJob() throws Exception {
        return jobBuilderFactory.get("myJob").incrementer(new RunIdIncrementer())
                .listener(new JobListener()).start(myStep()).build();
    }

    @Bean
    public Step myStep() throws Exception {
        return stepBuilderFactory.get("myStep").<InputObject, OutPutObject>chunk(chunkSize)
                .reader(yourReader(null)).processor(yourProcessor()).writer(yourWriter())
                //.faultTolerant()
                //.skipLimit(skipErrorCount).skip(Exception.class)//.noSkip(FileNotFoundException.class)
                .listener(invoiceSkipListener())
                //.noRetry(Exception.class)
                //.noRollback(Exception.class)
                .build();
    }

    // reader, processor, writer and skip-listener beans go here (see the sketch below)
}

Related

Spring Boot - Auto wiring service having String constructor

How do I @Autowire the bean class TransactionManagerImpl, which has a one-argument (String) constructor, without using new in a spring-boot application?
Even after searching through many posts I couldn't find any clue on how to autowire it without using new.
I need to autowire TransactionManager in three different classes, and the constructor parameter is different in all three classes.
This looks like a very basic scenario.
@Service
public class TransactionManagerImpl implements TransactionManager {
    private final Logger logger = LoggerFactory.getLogger(this.getClass());
    String txnLogFile;

    @ConstructorProperties({"txnLogFile"})
    public TransactionManagerImpl(String txnLogFile) {
        this.txnLogFile = txnLogFile;
    }
}
Is there any specific requirement that makes you want to use the @Service annotation?
If not, then you can use @Bean to create a bean for TransactionManagerImpl like below.
@Configuration
public class Config {
    @Value("${txnLogFile}")
    private String txnLogFile;

    @Bean
    public TransactionManager transactionManager() {
        return new TransactionManagerImpl(txnLogFile);
    }
}
and remove the @Service annotation from TransactionManagerImpl.
Putting aside other complications, it can be done like this:
public TransactionManagerImpl(@Value("${txnLogFile}") String txnLogFile) {
    this.txnLogFile = txnLogFile;
}
Finally, I did it as below; not sure if this is the best way to do it. I did not want to have three implementations just because of one variable.
application.yaml
app:
  type-a:
    txn-log-file: data/type-a-txn-info.csv
  type-b:
    txn-log-file: data/type-b-txn-info.csv
  default:
    txn-log-file: data/default/txn-info.csv
MainApplication.java
@SpringBootApplication
public class MainApplication {
    public static void main(String[] args) {
        new SpringApplicationBuilder(MainApplication.class).web(WebApplicationType.NONE).run(args);
    }

    @Bean
    public TransactionManager transactionManager(@Value("${app.default.txn-log-file}") String txnLogFile) {
        return new TransactionManagerImpl(txnLogFile);
    }

    @Bean
    public CsvService csvService(String txnLogFile) {
        return new CsvServiceImpl(txnLogFile);
    }
}
TypeOneRoute.java
@Configuration
public class TypeOneRoute extends RouteBuilder {
    @Value("${app.type-a.txn-log-file}")
    private String txnLogFile;
    @Autowired
    private ApplicationContext applicationContext;
    private TransactionManager transactionManager;

    @Override
    public void configure() throws Exception {
        transactionManager = applicationContext.getBean(TransactionManager.class, txnLogFile);
        transactionManager.someOperation();
    }
}
TypeTwoRoute.java
@Configuration
public class TypeTwoRoute extends RouteBuilder {
    @Value("${app.type-b.txn-log-file}")
    private String txnLogFile;
    @Autowired
    private ApplicationContext applicationContext;
    private TransactionManager transactionManager;

    @Override
    public void configure() throws Exception {
        transactionManager = applicationContext.getBean(TransactionManager.class, txnLogFile);
        transactionManager.create();
    }
}
TransactionManager.java
@Service
@Scope(value = ConfigurableBeanFactory.SCOPE_PROTOTYPE)
public interface TransactionManager {
    public ZonedDateTime create() throws IOException, ParseException;
}
TransactionManagerImpl.java
public class TransactionManagerImpl implements TransactionManager {
    @Autowired
    private ApplicationContext applicationContext;
    private String txnLogFile;

    public TransactionManagerImpl(String txnLogFile) {
        this.txnLogFile = txnLogFile;
    }

    private CsvService csvService;

    @PostConstruct
    public void init() {
        csvService = applicationContext.getBean(CsvService.class, txnLogFile);
    }

    public ZonedDateTime create() throws IOException, ParseException {
        try {
            csvService.createTxnInfoFile();
            return csvService.getLastSuccessfulTxnTimestamp();
        } catch (IOException e) {
            throw new IOException("Exception occurred in getTxnStartDate()", e);
        }
    }
}
Initially, the TransactionManager bean is registered with the default file (app.default.txn-log-file), and when I actually get it from the ApplicationContext I replace that value with the parameter passed to getBean().
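The mechanism being relied on here is that ApplicationContext.getBean(Class, Object...) can pass explicit arguments to a prototype-scoped bean definition, overriding the defaults. A small, self-contained sketch of that pattern (the class and bean names are illustrative, not taken from the post above):
import org.springframework.context.annotation.AnnotationConfigApplicationContext;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.context.annotation.Scope;

@Configuration
class PrototypeArgsDemo {

    // Prototype scope is required: each getBean() call creates a new instance,
    // and any arguments passed to getBean() are fed into this factory method.
    @Bean
    @Scope("prototype")
    public GreetingService greetingService(String logFile) {
        return new GreetingService(logFile);
    }

    static class GreetingService {
        private final String logFile;
        GreetingService(String logFile) { this.logFile = logFile; }
        String describe() { return "writing to " + logFile; }
    }

    public static void main(String[] args) {
        try (AnnotationConfigApplicationContext ctx =
                     new AnnotationConfigApplicationContext(PrototypeArgsDemo.class)) {
            // The second argument overrides the bean definition's factory-method argument
            GreetingService a = ctx.getBean(GreetingService.class, "data/type-a-txn-info.csv");
            GreetingService b = ctx.getBean(GreetingService.class, "data/type-b-txn-info.csv");
            System.out.println(a.describe()); // writing to data/type-a-txn-info.csv
            System.out.println(b.describe()); // writing to data/type-b-txn-info.csv
        }
    }
}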

Unit Testing using JUnit for Spring Batch without XML configuration

I am new to Spring Batch and I started developing a simple batch application. Now I am thinking of some unit testing using JUnit, which could be healthy for my app and code ;)
The problem is that I couldn't find any resource (examples, tutorials, ...) on the internet that shows how to perform unit testing with Spring Batch when using no XML.
Here is my code, to be more clear:
Config class:
package my.company.project.name.batch.config;

@Configuration
@EnableBatchProcessing
@ComponentScan({
        "my.company.project.name.batch.reader",
        "my.company.project.name.batch.tasklet",
        "my.company.project.name.batch.processor",
        "my.company.project.name.batch.writer"
})
@Import({CommonConfig.class})
public class MyItemBatchConfig {
    @Autowired
    private StepBuilderFactory steps;
    @Autowired
    private JobBuilderFactory jobBuilderFactory;
    @Autowired
    private MyItemTasklet myItemTasklet;

    @Bean
    public Job myItemJob(@Qualifier("myItem") Step loadProducts) {
        return jobBuilderFactory.get("myItemJob").start(loadProducts).build();
    }

    @Bean(name = "myItem")
    public Step myMethod() {
        return steps.get("myItem").tasklet(myItemTasklet).build();
    }
}
MyItemReader class:
package my.company.project.name.batch.reader;

@Component
public class MyItemReader implements ItemReader<MyItem> {
    @Value("${batch.load.produit.csv.file.path}")
    private String csvFilePath;
    private LinkedList<MyItem> myItems;

    @PostConstruct
    public void init() {
        myItems = new LinkedList<>(CsvUtil.getCsvReader(MyItem.class, csvFilePath));
    }

    @Override
    public MyItem read() throws Exception {
        return myItems.poll();
    }
}
ItemProcessor class:
package my.company.project.name.batch.processor;

@Component
public class MyItemProcessor implements ItemProcessor<MyItem, MyItemProcessorResult> {
    public MyItemProcessorResult process(MyItem item) throws Exception {
        // processing business logic
    }
}
ItemWriter class:
package my.company.project.name.batch.writer;

@Component
public class MyItemWriter implements ItemWriter<MyItem> {
    @Override
    public void write(List<? extends MyItem> myItems) throws Exception {
        // writer business logic
    }
}
MyItemBatchTasklet class that will call all the previous classes in order to achieve the task wanted by the batch:
package my.company.project.name.batch.tasklet;

@Component
public class MyItemBatchTasklet implements Tasklet {
    @Autowired
    public MyItemReader myItemReader;
    @Autowired
    public MyItemProcessor myItemProcessor;
    @Autowired
    public MyItemWriter myItemWriter;

    @Override
    public RepeatStatus execute(StepContribution contribution, ChunkContext chunkContext) throws Exception {
        // calling myItemReader, myItemProcessor and myItemWriter to do the business logic
        return RepeatStatus.FINISHED;
    }
}
MyItemTaskletLauncher class that will launch the tasklet from its main method:
package my.company.project.name.batch;

public class MyItemTaskletLauncher {
    public MyItemTaskletLauncher() {
        // No implementation
    }

    public static void main(String[] args) throws IOException, JobExecutionException, NamingException {
        Launcher.launchWithConfig("Launching MyItemTasklet ...", MyItemBatchConfig.class, false);
    }
}
I made a simple batch application using Spring Batch, MyBatis and JUnit.
The application's test code runs the unit tests without XML.
Here is the test class for the Job.
@RunWith(SpringJUnit4ClassRunner.class)
@SpringBootTest(classes = {xxx.class, yyy.class, zzz.class, xxxJobLauncherTestUtils.class})
public class JobTest {
    @Autowired
    @Qualifier(value = "xxxJobLauncherTestUtils")
    private JobLauncherTestUtils xxxJobLauncherTestUtils;

    @Test
    public void testXxxJob() throws Exception {
        JobExecution jobExecution = xxxJobLauncherTestUtils.launchJob();
        assertThat(jobExecution.getStatus(), is(BatchStatus.COMPLETED));
    }
}

@Component(value = "xxxJobLauncherTestUtils")
class XxxjobLauncherTestUtils extends JobLauncherTestUtils {
    @Autowired
    @Qualifier(value = "xxxJob")
    @Override
    public void setJob(Job job) {
        super.setJob(job);
    }
}
For details, please see the link below.
https://github.com/Maeno/spring-batch-example/tree/master/src/test
I hope that it will be helpful.
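If you only want to exercise a single step, for example the tasklet step "myItem" defined in the question's configuration, JobLauncherTestUtils also offers launchStep. A minimal sketch under those assumptions (the configuration class name and step name come from the question; the nested test configuration is illustrative and presumes a usable DataSource and batch infrastructure in the context):
@RunWith(SpringJUnit4ClassRunner.class)
@SpringBootTest(classes = MyItemBatchConfig.class) // assumes the question's configuration class
public class MyItemStepTest {

    @TestConfiguration
    static class StepTestConfig {
        @Bean
        public JobLauncherTestUtils jobLauncherTestUtils() {
            // job, jobLauncher and jobRepository are autowired into this helper
            return new JobLauncherTestUtils();
        }
    }

    @Autowired
    private JobLauncherTestUtils jobLauncherTestUtils;

    @Test
    public void myItemStepCompletes() throws Exception {
        // Launch only the "myItem" tasklet step instead of the full job
        JobExecution execution = jobLauncherTestUtils.launchStep("myItem");
        assertThat(execution.getStatus(), is(BatchStatus.COMPLETED));
    }
}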

Spring with Quartz, using SchedulerFactoryBean in custom service

I have a job with a bean injected into it. I achieved this using this solution.
In that solution the job trigger is set while configuring the SchedulerFactoryBean bean in the config class.
But I want to schedule it in my custom SchedulerService service by calling the scheduleTrackRetry method.
I tried something like this (see below), but the job is not fired at the appropriate time.
@Service
public class SchedulerService {
    @Autowired
    SchedulerFactoryBean quartzScheduler;
    @Autowired
    JobDetailFactoryBean jobDetailFactoryBean;
    @Autowired
    CronTriggerFactoryBean cronTriggerFactoryBean;

    public void scheduleTrackRetry() {
        cronTriggerFactoryBean.setJobDetail(jobDetailFactoryBean.getObject());
        quartzScheduler.setTriggers(cronTriggerFactoryBean.getObject());
    }
}
So, please tell me how I could achieve the desired behaviour.
Here are my job and config classes:
@Component
public class TrackRetryJob implements Job {
    private static final Logger LOGGER = LoggerFactory.getLogger(TrackRetryJob.class);
    @Autowired
    private TimeformBatchService timeformBatchService;

    @Override
    public void execute(JobExecutionContext context) throws JobExecutionException {
        LOGGER.info("Run retry for fetching greyhound tracks");
        timeformBatchService.consumeTracks();
    }
}

@Configuration
public class QuartzConfig {
    @Autowired
    private ApplicationContext applicationContext;

    @Bean
    public SchedulerFactoryBean quartzScheduler() {
        SchedulerFactoryBean quartzScheduler = new SchedulerFactoryBean();
        // custom job factory of spring with DI support for @Autowired!
        AutowiringSpringBeanJobFactory jobFactory = new AutowiringSpringBeanJobFactory();
        jobFactory.setApplicationContext(applicationContext);
        quartzScheduler.setJobFactory(jobFactory);
        quartzScheduler.setTriggers(getTrigger().getObject());
        return quartzScheduler;
    }

    @Bean
    public JobDetailFactoryBean retryTrackFetch() {
        JobDetailFactoryBean jobDetailFactory = new JobDetailFactoryBean();
        jobDetailFactory.setJobClass(TrackRetryJob.class);
        jobDetailFactory.setGroup("group1");
        return jobDetailFactory;
    }

    @Bean
    public CronTriggerFactoryBean getTrigger() {
        CronTriggerFactoryBean cronTriggerFactoryBean = new CronTriggerFactoryBean();
        cronTriggerFactoryBean.setJobDetail(retryTrackFetch().getObject());
        cronTriggerFactoryBean.setCronExpression("0 53 * * * ?");
        cronTriggerFactoryBean.setGroup("group1");
        return cronTriggerFactoryBean;
    }
}

public final class AutowiringSpringBeanJobFactory extends SpringBeanJobFactory implements ApplicationContextAware {
    private transient AutowireCapableBeanFactory beanFactory;

    @Override
    public void setApplicationContext(final ApplicationContext context) {
        beanFactory = context.getAutowireCapableBeanFactory();
    }

    @Override
    protected Object createJobInstance(final TriggerFiredBundle bundle) throws Exception {
        final Object job = super.createJobInstance(bundle);
        beanFactory.autowireBean(job);
        return job;
    }
}
I found the solution here.
I could not use SchedulerFactoryBean as a normal bean: when I try to inject it, Spring injects the Scheduler bean instead. Therefore my service should look like this:
@Service
public class SchedulerService {
    @Autowired
    Scheduler scheduler;
}
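With the Quartz Scheduler injected, the service can register the job and trigger itself at runtime instead of relying on the FactoryBean setters. A minimal sketch of what scheduleTrackRetry could look like, assuming the TrackRetryJob class, group and cron expression from the configuration above (this uses the plain Quartz builder API and is only an illustration):
@Service
public class SchedulerService {

    @Autowired
    private Scheduler scheduler;

    public void scheduleTrackRetry() throws SchedulerException {
        // Build the job detail for the Spring-aware TrackRetryJob
        JobDetail jobDetail = JobBuilder.newJob(TrackRetryJob.class)
                .withIdentity("retryTrackFetch", "group1")
                .storeDurably()
                .build();

        // Fire at minute 53 of every hour, matching the cron from QuartzConfig
        Trigger trigger = TriggerBuilder.newTrigger()
                .withIdentity("retryTrackFetchTrigger", "group1")
                .forJob(jobDetail)
                .withSchedule(CronScheduleBuilder.cronSchedule("0 53 * * * ?"))
                .build();

        // Register job and trigger with the running scheduler
        scheduler.scheduleJob(jobDetail, trigger);
    }
}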

JobLauncherTestUtils throws NoUniqueBeanDefinitionException while trying to test spring batch steps

I am using Spring Boot and Spring Batch. I have defined more than one job.
I am trying to build a JUnit test for a specific step within a job.
Therefore I am using the JobLauncherTestUtils library.
When I run my test case I always get a NoUniqueBeanDefinitionException.
This is my test class:
@RunWith(SpringJUnit4ClassRunner.class)
@SpringApplicationConfiguration(classes = {BatchConfiguration.class})
public class ProcessFileJobTest {

    @Configuration
    @EnableBatchProcessing
    static class TestConfig {
        @Autowired
        private JobBuilderFactory jobBuilder;
        @Autowired
        private StepBuilderFactory stepBuilder;

        @Bean
        public JobLauncherTestUtils jobLauncherTestUtils() {
            JobLauncherTestUtils jobLauncherTestUtils = new JobLauncherTestUtils();
            jobLauncherTestUtils.setJob(jobUnderTest());
            return jobLauncherTestUtils;
        }

        @Bean
        public Job jobUnderTest() {
            return jobBuilder.get("job-under-test")
                    .start(processIdFileStep())
                    .build();
        }

        @Bean
        public Step processIdFileStep() {
            return stepBuilder.get("processIdFileStep")
                    .<PushItemDTO, PushItemDTO>chunk(1) //important to be one in this case to commit after every line read
                    .reader(reader(null))
                    .processor(processor(null, null, null, null))
                    .writer(writer())
                    // .faultTolerant()
                    // .skipLimit(10) //default is set to 0
                    // .skip(MySQLIntegrityConstraintViolationException.class)
                    .build();
        }

        @Bean
        @Scope(value = "step", proxyMode = ScopedProxyMode.INTERFACES)
        public ItemStreamReader<PushItemDTO> reader(@Value("#{jobExecutionContext[filePath]}") String filePath) {
            ...
            return itemReader;
        }

        @Bean
        @Scope(value = "step", proxyMode = ScopedProxyMode.INTERFACES)
        public ItemProcessor<PushItemDTO, PushItemDTO> processor(@Value("#{jobParameters[pushMessage]}") String pushMessage,
                                                                 @Value("#{jobParameters[jobId]}") String jobId,
                                                                 @Value("#{jobParameters[taskId]}") String taskId,
                                                                 @Value("#{jobParameters[refId]}") String refId) {
            return new PushItemProcessor(pushMessage, jobId, taskId, refId);
        }

        @Bean
        public LineMapper<PushItemDTO> lineMapper() {
            DefaultLineMapper<PushItemDTO> lineMapper = new DefaultLineMapper<PushItemDTO>();
            ...
            return lineMapper;
        }

        @Bean
        public ItemWriter writer() {
            return new someWriter();
        }
    }

    @Autowired
    protected JobLauncher jobLauncher;
    @Autowired
    JobLauncherTestUtils jobLauncherTestUtils;

    @Test
    public void processIdFileStepTest1() throws Exception {
        JobParameters jobParameters = new JobParametersBuilder().addString("filePath", "C:\\etc\\files\\2015_02_02").toJobParameters();
        JobExecution jobExecution = jobLauncherTestUtils.launchStep("processIdFileStep", jobParameters);
    }
}
and that's the exception:
Caused by: org.springframework.beans.factory.NoUniqueBeanDefinitionException: No qualifying bean of type [org.springframework.batch.core.Job] is defined: expected single matching bean but found 3: jobUnderTest,executeToolJob,processFileJob
Any idea?
Thanks.
Added the BatchConfiguration class:
package com.mycompany.notification_processor_service.batch.config;

import com.mycompany.notification_processor_service.common.config.CommonConfiguration;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.context.annotation.*;
import org.springframework.jdbc.core.JdbcTemplate;
import org.springframework.jdbc.datasource.DriverManagerDataSource;
import javax.sql.DataSource;

@ComponentScan("com.mycompany.notification_processor_service.batch")
@PropertySource("classpath:application.properties")
@Configuration
@Import({CommonConfiguration.class})
@ImportResource({"classpath:applicationContext-pushExecuterService.xml"/*,"classpath:si/integration-context.xml"*/})
public class BatchConfiguration {
    @Value("${database.driver}")
    private String databaseDriver;
    @Value("${database.url}")
    private String databaseUrl;
    @Value("${database.username}")
    private String databaseUsername;
    @Value("${database.password}")
    private String databasePassword;

    @Bean
    public DataSource dataSource() {
        DriverManagerDataSource dataSource = new DriverManagerDataSource();
        dataSource.setDriverClassName(databaseDriver);
        dataSource.setUrl(databaseUrl);
        dataSource.setUsername(databaseUsername);
        dataSource.setPassword(databasePassword);
        return dataSource;
    }

    @Bean
    public JdbcTemplate jdbcTemplate(DataSource dataSource) {
        return new JdbcTemplate(dataSource);
    }
}
and this is CommonConfiguration:
@ComponentScan("com.mycompany.notification_processor_service")
@Configuration
@EnableJpaRepositories(basePackages = {"com.mycompany.notification_processor_service.common.repository.jpa"})
@EnableCouchbaseRepositories(basePackages = {"com.mycompany.notification_processor_service.common.repository.couchbase"})
@EntityScan({"com.mycompany.notification_processor_service"})
@EnableAutoConfiguration
@EnableTransactionManagement
@EnableAsync
public class CommonConfiguration {
}
I had the same issue, and the easiest way is to inject the job into the setter of JobLauncherTestUtils, as Mariusz explained in the Spring JIRA:
@Bean
public JobLauncherTestUtils getJobLauncherTestUtils() {
    return new JobLauncherTestUtils() {
        @Override
        @Autowired
        public void setJob(@Qualifier("ncsvImportJob") Job job) {
            super.setJob(job);
        }
    };
}
So I see the jobUnderTest bean. Somewhere in all those imports, you're importing the two other jobs as well. I see your BatchConfiguration class imports other things, and you also have component scanning turned on. Carefully trace through all your configurations; something is picking up the definitions for those beans.
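Another hedged option, if you prefer to keep the imports as they are: mark the job under test as @Primary so the @Autowired setter inherited by JobLauncherTestUtils has a single candidate to bind. A sketch of the bean from the question's TestConfig, with only the annotation added (an illustration, not the only possible fix):
// Sketch: making jobUnderTest the primary Job bean so autowiring by type
// no longer fails with NoUniqueBeanDefinitionException.
@Bean
@Primary
public Job jobUnderTest() {
    return jobBuilder.get("job-under-test")
            .start(processIdFileStep())
            .build();
}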
I also ran into this issue and couldn't get JobLauncherTestUtils to work properly. It might be caused by this issue.
I ended up autowiring the SimpleJobLauncher and my Job into the unit test and simply calling:
launcher.run(importAccountingDetailJob, params);
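A hedged sketch of that approach, bypassing JobLauncherTestUtils entirely (the job name importAccountingDetailJob comes from the answer; the configuration class, qualifier and parameter names are illustrative assumptions):
@RunWith(SpringJUnit4ClassRunner.class)
@ContextConfiguration(classes = BatchConfiguration.class)
public class ImportAccountingDetailJobTest {

    @Autowired
    private JobLauncher launcher;

    @Autowired
    @Qualifier("importAccountingDetailJob") // pick the one job to run by name
    private Job importAccountingDetailJob;

    @Test
    public void jobCompletes() throws Exception {
        JobParameters params = new JobParametersBuilder()
                .addLong("run.id", System.currentTimeMillis()) // unique parameters for each run
                .toJobParameters();
        JobExecution execution = launcher.run(importAccountingDetailJob, params);
        assertEquals(BatchStatus.COMPLETED, execution.getStatus());
    }
}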
An old post, but I thought of providing my solution as well.
In this case I am automatically registering a JobLauncherTestUtils per job:
@Configuration
public class TestConfig {
    private static final Logger logger = LoggerFactory.getLogger(TestConfig.class);
    @Autowired
    private AbstractAutowireCapableBeanFactory beanFactory;
    @Autowired
    private List<Job> jobs;

    @PostConstruct
    public void registerServices() {
        jobs.forEach(j -> {
            JobLauncherTestUtils u = create(j);
            final String name = j.getName() + "TestUtils";
            beanFactory.registerSingleton(name, u);
            beanFactory.autowireBean(u);
            logger.info("Registered JobLauncherTestUtils {}", name);
        });
    }

    private JobLauncherTestUtils create(final Job j) {
        return new MyJobLauncherTestUtils(j);
    }

    private static class MyJobLauncherTestUtils extends JobLauncherTestUtils {
        MyJobLauncherTestUtils(Job j) {
            this.setJob(j);
        }

        @Override // to remove @Autowired from base class
        public void setJob(Job job) {
            super.setJob(job);
        }
    }
}
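With that registration in place, a test can look up the utility by its generated name. A short, hypothetical usage sketch (assuming a job bean named "importJob", so the registered helper is "importJobTestUtils"):
// Fragment of a test class using the per-job helper registered above
@Autowired
@Qualifier("importJobTestUtils") // "<jobName>TestUtils", as registered in TestConfig
private JobLauncherTestUtils importJobTestUtils;

@Test
public void importJobCompletes() throws Exception {
    JobExecution execution = importJobTestUtils.launchJob();
    assertEquals(BatchStatus.COMPLETED, execution.getStatus());
}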

Spring jdbc configuration

I have been trying to implement a web service using Spring. This web service will provide data access to a MySQL database using JDBC. I am trying not to use any XML configuration files, and I have run into a problem trying to connect to the database.
I am following the tutorial http://spring.io/guides/tutorials/rest/ but I changed a few things along the way.
Now that I am trying to implement the connection with the database, I get an error when starting the Tomcat instance, and I guess the problem is in the configuration.
Here follows some of my code:
Datasource configuration:
@Configuration
@Profile("mySQL")
@PropertySource("classpath:/services.properties")
public class MySQLDataSourceConfiguration implements DataSourceConfiguration {
    @Inject
    private Environment environment;

    @Bean
    public DataSource dataSource() throws Exception {
        BasicDataSource dataSource = new BasicDataSource();
        dataSource.setPassword(environment.getProperty("dataSource.password"));
        dataSource.setUrl(environment.getProperty("dataSource.url"));
        dataSource.setUsername(environment.getProperty("dataSource.user"));
        dataSource.setDriverClassName(environment.getPropertyAsClass("dataSource.driverClass", Driver.class).getName());
        return dataSource;
    }
}
The file services.properties is where I keep my database configuration, so when I want to change the database I only have to change four fields.
The JdbcConfiguration class for the setup of the JdbcTemplate:
@Configuration
@EnableTransactionManagement
@PropertySource("classpath:/services.properties")
@Import({ MySQLDataSourceConfiguration.class })
public class JdbcConfiguration {
    @Autowired
    private DataSourceConfiguration dataSourceConfiguration;
    @Inject
    private Environment environment;

    @Bean
    public JdbcTemplate setupJdbcTemplate() throws Exception {
        return new JdbcTemplate(dataSourceConfiguration.dataSource());
    }

    @Bean
    public PlatformTransactionManager transactionManager(DataSource dataSource) throws Exception {
        return new DataSourceTransactionManager(dataSource);
    }
}
Then there is the repository, which receives the template:
@Transactional
@Repository
@Qualifier("jdbcRepository")
public class JdbcIndividualRepository implements IndividualsRepository {
    private static final Logger LOG = LoggerFactory.getLogger(JdbcIndividualRepository.class);

    @Autowired
    private JdbcTemplate jdbcTemplate;

    @Autowired
    public JdbcIndividualRepository(DataSource jdbcDataSource) {
        LOG.info("JDBCRepo arg constructor");
        this.jdbcTemplate = new JdbcTemplate(jdbcDataSource);
    }

    @Override
    public Individual save(Individual save) {
        String sql = "INSERT INTO Individual(idIndividual, Name) VALUES(?,?)";
        this.jdbcTemplate.update(sql, save.getId(), save.getName());
        return save;
    }

    @Override
    public void delete(String key) {
        String sql = "DELETE FROM Individual WHERE idIndividual=?";
        jdbcTemplate.update(sql, key);
    }

    @Override
    public Individual findById(String key) {
        String sql = "SELECT i.* FROM Individual i WHERE i.idIndividual=?";
        return this.jdbcTemplate.queryForObject(sql, new IndividualRowMapper(), key);
    }

    @Override
    public List<Individual> findAll() {
        String sql = "SELECT * FROM Individual";
        return new LinkedList<Individual>(this.jdbcTemplate.query(sql, new IndividualRowMapper()));
    }
}
Then I register the jdbc configuration in the initializer class when creating the root context of the application as follows:
private WebApplicationContext createRootContext(ServletContext servletContext) {
    AnnotationConfigWebApplicationContext rootContext = new AnnotationConfigWebApplicationContext();
    rootContext.register(CoreConfig.class, SecurityConfig.class, JdbcConfiguration.class);
    rootContext.refresh();
    servletContext.addListener(new ContextLoaderListener(rootContext));
    servletContext.setInitParameter("defaultHtmlEscape", "true");
    return rootContext;
}
However, the Tomcat server won't start because it can't autowire the MySQLDataSourceConfiguration class.
Does anyone know what the problem might be? I can give more details on the code, but the question is already really long.
I appreciate any kind of help!
Cheers
EDIT
Solved by changing the JdbcConfiguration class to:
@Configuration
@EnableTransactionManagement
@PropertySource("classpath:/services.properties")
@Import({ MySQLDataSourceConfiguration.class })
public class JdbcConfiguration {
    @Autowired
    private DataSource dataSource;
    @Inject
    private Environment environment;

    @Bean
    public JdbcTemplate setupJdbcTemplate() throws Exception {
        return new JdbcTemplate(dataSource);
    }

    @Bean
    public PlatformTransactionManager transactionManager(DataSource dataSource) throws Exception {
        return new DataSourceTransactionManager(dataSource);
    }

    @Bean
    public IndividualsRepository createRepo() {
        return new JdbcIndividualRepository(dataSource);
    }
}
Remove
@Autowired
private DataSourceConfiguration dataSourceConfiguration;
because that's not how it's supposed to be used. Instead, add the following to the same class:
@Autowired DataSource dataSource;
and use it like this: new JdbcTemplate(dataSource);
Also, try adding @ComponentScan to the JdbcConfiguration class. From what I see in your code, the class JdbcIndividualRepository is not picked up by anything.
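A hedged sketch of that suggestion, adding @ComponentScan so the @Repository class is detected (the scanned package name does not appear in the post and is purely illustrative):
@Configuration
@EnableTransactionManagement
@PropertySource("classpath:/services.properties")
@Import({ MySQLDataSourceConfiguration.class })
@ComponentScan("com.example.individuals.repository") // illustrative package holding JdbcIndividualRepository
public class JdbcConfiguration {

    @Autowired
    private DataSource dataSource; // the DataSource bean contributed by MySQLDataSourceConfiguration

    @Bean
    public JdbcTemplate setupJdbcTemplate() {
        return new JdbcTemplate(dataSource);
    }

    @Bean
    public PlatformTransactionManager transactionManager(DataSource dataSource) {
        return new DataSourceTransactionManager(dataSource);
    }
}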
In your JdbcConfiguration class, you are trying to autowire the DataSourceConfiguration. I'm not really sure that's possible; typically you should autowire the DataSource, not the DataSourceConfiguration.
@Import({ MySQLDataSourceConfiguration.class })
public class JdbcConfiguration {
    @Autowired
    private DataSource dataSource;

    @Bean
    public JdbcTemplate setupJdbcTemplate() throws Exception {
        return new JdbcTemplate(dataSource);
    }
}
Also if you have several DataSources and you're using Spring profiles to separate them, it's easier to provide all the DataSource beans in one file and annotate each bean with a different profile:
@Configuration
public class DataSourceConfig {

    @Bean
    @Profile("Test")
    public DataSource devDataSource() {
        // ... configure data source
    }

    @Bean
    @Profile("Prod")
    public DataSource prodDataSource() {
        // ... configure data source
    }
}
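The active profile then decides which DataSource bean is created. A brief sketch of how that could be selected programmatically in the root-context setup shown earlier (setting spring.profiles.active as a JVM argument or property would have the same effect; the registered config classes are the ones from this question):
// Select the "Prod" profile when building the context, so only prodDataSource() is registered
AnnotationConfigWebApplicationContext rootContext = new AnnotationConfigWebApplicationContext();
rootContext.getEnvironment().setActiveProfiles("Prod");
rootContext.register(CoreConfig.class, SecurityConfig.class, JdbcConfiguration.class, DataSourceConfig.class);
rootContext.refresh();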
