Spring Boot and Spring JDBC multiple datasources - spring-boot

I am trying to use two datasources with my SpringBoot application and can't get the second datasource to autowire. I have tried many things but this is the closest I have come:
My Yaml file:
spring:
  first-datasource:
    url: MyURLString1
    username: User
    password: Password
    driver-class-name: oracle.jdbc.OracleDriver
  second-datasource:
    url: MyURLString2
    username: User
    password: Password
    driver-class-name: oracle.jdbc.OracleDriver
My Application Class:
@SpringBootApplication
public class MyApplication {

    public static void main(String[] args) {
        SpringApplication.run(MyApplication.class, args);
    }

    @Bean
    @Primary
    @ConfigurationProperties(prefix = "spring.first-datasource")
    public DataSource firstDataSource() {
        return DataSourceBuilder.create().build();
    }

    @Bean
    @ConfigurationProperties(prefix = "spring.second-datasource")
    public DataSource secondDataSource() {
        return DataSourceBuilder.create().build();
    }
}
And Finally my DAO:
@Repository
public class MyDao {

    private static final String FIRST_SELECT = "select * from SomeTableInDB1";
    private static final String SECOND_SELECT = "select * from AnotherTableInDB2";

    @Autowired
    private JdbcTemplate firstJdbcTemplate;

    @Autowired
    @Qualifier("secondDataSource")
    private JdbcTemplate secondJdbcTemplate;

    List<DB1Entity> getDB1Entity(Long id) {
        return firstJdbcTemplate.query(FIRST_SELECT, new Object[] {id}, new BeanPropertyRowMapper(DB1Entity.class));
    }

    List<DB2Entity> getDB2Entity(Long id) {
        return secondJdbcTemplate.query(SECOND_SELECT, new Object[] {id}, new BeanPropertyRowMapper(DB2Entity.class));
    }
}
This is the closest I have come so far. I say it is the closest because if I remove the @Qualifier then both of my DAO methods actually work, assuming that the SECOND_SELECT statement is valid SQL for my DB1. Once I put in the @Qualifier for my non-primary datasource, I get an autowire error because Spring is expecting a DataSource object, not a JdbcTemplate object. That is weird to me, as it does work with the primary datasource.
Here is my error:
Could not autowire field: private org.springframework.jdbc.core.JdbcTemplate org.my.classpath.secondJdbcTemplate; nested exception is org.springframework.beans.factory.NoSuchBeanDefinitionException: No qualifying bean of type [org.springframework.jdbc.core.JdbcTemplate] found for dependency: expected at least 1 bean which qualifies as autowire candidate for this dependency. Dependency annotations: {@org.springframework.beans.factory.annotation.Autowired(required=true), @org.springframework.beans.factory.annotation.Qualifier(value=secondDataSource)}

You create beans of type DataSource, but try to autowire a JdbcTemplate, which is a mismatch. You probably should have something like this:
private JdbcTemplate jdbcTemplate1;
private JdbcTemplate jdbcTemplate2;

@Autowired
@Qualifier("firstDataSource")
public void setFirstDataSource(DataSource dataSource) {
    this.jdbcTemplate1 = new JdbcTemplate(dataSource);
}

@Autowired
@Qualifier("secondDataSource")
public void setSecondDataSource(DataSource dataSource) {
    this.jdbcTemplate2 = new JdbcTemplate(dataSource);
}
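Equivalently (a sketch along the same lines, not taken from the original post), you can expose each JdbcTemplate as its own bean in the configuration class and inject it by qualifier, which keeps the DataSource wiring out of the repository:

@Bean(name = "firstJdbcTemplate")
public JdbcTemplate firstJdbcTemplate(@Qualifier("firstDataSource") DataSource dataSource) {
    // wraps the primary datasource
    return new JdbcTemplate(dataSource);
}

@Bean(name = "secondJdbcTemplate")
public JdbcTemplate secondJdbcTemplate(@Qualifier("secondDataSource") DataSource dataSource) {
    // wraps the second datasource
    return new JdbcTemplate(dataSource);
}

The DAO can then autowire @Qualifier("secondJdbcTemplate") JdbcTemplate directly, which is what the original question was attempting.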

Ideally, though it is not a mandate, one of the datasources should be marked @Primary for most of the default wiring via annotations to work. Also, we need to create a TransactionManager for each of the datasources separately, otherwise Spring would not know how to enforce transactions. Following is a complete example of how this can be done:
@Primary
@Bean(name = "dataSource")
@ConfigurationProperties(prefix="datasource.mysql")
public DataSource dataSource() {
    return DataSourceBuilder.create().build();
}

@Primary
@Bean(name = "transactionManager")
public DataSourceTransactionManager transactionManager(@Qualifier("dataSource") DataSource dataSource) {
    return new DataSourceTransactionManager(dataSource);
}

@Bean(name = "postGresDataSource")
@ConfigurationProperties(prefix="datasource.postgres")
public DataSource postgresDataSource() {
    return DataSourceBuilder.create().build();
}

@Bean(name = "postGresTransactionManager")
public DataSourceTransactionManager postGresTransactionManager(@Qualifier("postGresDataSource") DataSource dataSource) {
    return new DataSourceTransactionManager(dataSource);
}

@Transactional(transactionManager = "postGresTransactionManager")
public void createCustomer(Customer cust) {
    customerDAO.create(cust);
}

// specifying a transactionManager attribute is optional if we
// want to use the default transactionManager, since we
// already marked one of the TMs above with @Primary
@Transactional
public void createOrder(Order order) {
    orderDAO.create(order);
}
Hope this helps.

Here is another 'not working' situation that confused me for several days:
When configuring two data sources of the same type in a Spring Boot application, @Qualifier does not pick up the right beans as expected. It behaves as if it is not recognized by the framework at all.
The reason is that @SpringBootApplication contains @EnableAutoConfiguration, which in Spring Boot auto-configures a data source for you.
That auto-configuration can badly interfere with @Qualifier's behavior.
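One way around this (a sketch, assuming you want to declare every DataSource yourself) is to switch off Boot's datasource auto-configuration so that only your explicit beans exist:

@SpringBootApplication(exclude = {
        DataSourceAutoConfiguration.class,
        DataSourceTransactionManagerAutoConfiguration.class
})
public class MyApplication {

    public static void main(String[] args) {
        SpringApplication.run(MyApplication.class, args);
    }
}

The excluded classes come from org.springframework.boot.autoconfigure.jdbc; only exclude them if you do not rely on the single auto-configured DataSource anywhere else.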

In my case, following @Aman Tuladhar's answer, it worked like this (Spring Boot 2.1.3.RELEASE):
@Configuration
public class JDBCConfig {

    @Bean("first-datasource")
    public JdbcTemplate paymentsJDBCTemplate(@Qualifier("first-db") DataSource paymentsDataSource) {
        return new JdbcTemplate(paymentsDataSource);
    }

    @Bean("second-datasource")
    public JdbcTemplate parametersJDBCTemplate(@Qualifier("second-db") DataSource paramsDataSource) {
        return new JdbcTemplate(paramsDataSource);
    }

    @Bean("first-db")
    public DataSource paymentsDataSource(Environment env) {
        return buildDataSource(env, "first");
    }

    @Bean("second-db")
    public DataSource paramsDataSource(Environment env) {
        return buildDataSource(env, "second");
    }

    private DataSource buildDataSource(Environment env, String prop) {
        DriverManagerDataSource dataSource = new DriverManagerDataSource();
        dataSource.setDriverClassName(env.getProperty("spring." + prop + "-datasource.driver-class-name"));
        dataSource.setUrl(env.getProperty("spring." + prop + "-datasource.url"));
        dataSource.setUsername(env.getProperty("spring." + prop + "-datasource.username"));
        dataSource.setPassword(env.getProperty("spring." + prop + "-datasource.password"));
        return dataSource;
    }
}
... and used like this:
@Autowired @Qualifier("first-datasource")
private JdbcTemplate jdbcTemplate1;
@Autowired @Qualifier("second-datasource")
private JdbcTemplate jdbcTemplate2;
... application.yml (the prefixes must match what buildDataSource reads):
spring:
  first-datasource:
    driver-class-name: com.mysql.cj.jdbc.Driver
    url: ${DATABASE_1_URL}
    username: ${DATABASE_1_USERNAME}
    password: ${DATABASE_1_PASSWORD}
  second-datasource:
    driver-class-name: com.mysql.cj.jdbc.Driver
    url: ${DATABASE_2_URL}
    username: ${DATABASE_2_USERNAME}
    password: ${DATABASE_2_PASSWORD}
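Note that DriverManagerDataSource opens a new physical connection for every call and is really only meant for tests; for real use you would normally build a pooled datasource instead. A minimal variant of buildDataSource using HikariCP (assumed to be on the classpath, as it is with spring-boot-starter-jdbc) could look like this:

private DataSource buildDataSource(Environment env, String prop) {
    HikariDataSource dataSource = new HikariDataSource();
    dataSource.setDriverClassName(env.getProperty("spring." + prop + "-datasource.driver-class-name"));
    dataSource.setJdbcUrl(env.getProperty("spring." + prop + "-datasource.url"));
    dataSource.setUsername(env.getProperty("spring." + prop + "-datasource.username"));
    dataSource.setPassword(env.getProperty("spring." + prop + "-datasource.password"));
    return dataSource;
}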

Related

One database overrides the other when the Primary bean is defined on it, and vice versa

I have 2 JDBC datasources defined in a Spring Boot application that are used in a Spring Batch job. However, after autowiring the datasources, only one gets used: the one annotated @Primary. If I move the annotation to the other JDBC datasource, that one gets used instead. In a nutshell, only one of the JDBC datasources is ever used. I use Lombok in some places but I'm unsure whether that plays a part.
Here are the datasources:
application.yml
symphony:
datasource:
driver-class-name: oracle.jdbc.OracleDriver
url: ...
type: com.zaxxer.hikari.HikariDataSource
username: <USR>
password: <PWD>
jndi-name: false
repo:
datasource:
driver-class-name: org.h2.Driver
url: jdbc:h2:mem:db;DB_CLOSE_DELAY=-1
username: sa
password: sa
jndi-name: false
Here is the first datasource:
@Configuration
public class RepoDbConfig {

    @Bean
    @ConfigurationProperties("repo.datasource")
    public DataSourceProperties repoDataProperties() {
        return new DataSourceProperties();
    }

    @Bean(name = "repoDataSource")
    public DataSource dataSourcerepo() {
        DataSource dataSource = repoDataProperties().initializeDataSourceBuilder().type(BasicDataSource.class)
                .build();
        return dataSource;
    }

    @Bean(name = "repoJdbcTemplate")
    public JdbcTemplate repoJdbcTemplate(DataSource repoDataSource) {
        return new JdbcTemplate(repoDataSource);
    }
}
Here is the second datasource:
#Configuration
public class SymphonyDbConfig {
#Primary
#Bean
#ConfigurationProperties("symphony.datasource")
public DataSourceProperties symphonyDataSourceProperties() {
return new DataSourceProperties();
}
#Primary
#Bean(name = "symphonyDataSource")
public DataSource dataSourcesymphony() {
HikariDataSource dataSource = symphonyDataSourceProperties().initializeDataSourceBuilder().type(HikariDataSource.class)
.build();
return dataSource;
}
#Primary
#Bean(name = "symphonyJdbcTemplate")
public JdbcTemplate symphonyJdbcTemplate(DataSource symphonyDataSource) {
return new JdbcTemplate(symphonyDataSource);
}
}
The JobRepository beans are configured like this:
@Configuration
@RequiredArgsConstructor
public class JobRepositoryConfig {

    @Qualifier("repoDataSource")
    final DataSource repoDataSource;

    @Bean("repoTransactionManager")
    AbstractPlatformTransactionManager repoTransactionManager() {
        return new ResourcelessTransactionManager();
    }

    @Bean("repoJobRepository")
    public JobRepository repoJobRepository(DataSource repoDataSource) throws Exception {
        JobRepositoryFactoryBean jobRepositoryFactoryBean = new JobRepositoryFactoryBean();
        jobRepositoryFactoryBean.setDataSource(repoDataSource);
        jobRepositoryFactoryBean.setTransactionManager(repoTransactionManager());
        jobRepositoryFactoryBean.setDatabaseType(DatabaseType.H2.getProductName());
        return jobRepositoryFactoryBean.getObject();
    }

    @Bean("repoAppJobLauncher")
    public JobLauncher careLocationAppJobLauncher(JobRepository repoJobRepository) {
        SimpleJobLauncher simpleJobLauncher = new SimpleJobLauncher();
        simpleJobLauncher.setJobRepository(repoJobRepository);
        return simpleJobLauncher;
    }
}
Finally, the batch job beans are configured here. The only part not shown is the launching of the job; all the required beans are shown:
@Configuration
@EnableBatchProcessing
@EnableScheduling
@RequiredArgsConstructor
@Slf4j
public class CellBatchConfig {

    private final JobBuilderFactory jobBuilderFactory;

    @Qualifier("repoAppJobLauncher")
    private final JobLauncher repoAppJobLauncher;

    private final StepBuilderFactory stepBuilderFactory;

    @Value("${chunk-size}")
    private int chunkSize;

    @Qualifier("symphonyDataSource")
    final DataSource symphonyDataSource;

    @Qualifier("repoDataSource")
    final DataSource repoDataSource;

    @Bean
    public JdbcPagingItemReader<CenterDto> cellItemReader(PagingQueryProvider pagingQueryProvider) {
        return new JdbcPagingItemReaderBuilder<CenterDto>()
                .name("cellItemReader")
                .dataSource(symphonyDataSource)
                .queryProvider(pagingQueryProvider)
                .pageSize(chunkSize)
                .rowMapper(new CellRowMapper())
                .build();
    }

    @Bean
    public PagingQueryProvider pagingQueryProvider() {
        OraclePagingQueryProvider pagingQueryProvider = new OraclePagingQueryProvider();
        final Map<String, Order> sortKeys = new HashMap<>();
        sortKeys.put("ID", Order.ASCENDING);
        pagingQueryProvider.setSortKeys(sortKeys);
        pagingQueryProvider.setSelectClause(" ID, CELL_NO, MAT_VO ");
        pagingQueryProvider.setFromClause(" from pvc.cells");
        return pagingQueryProvider;
    }

    .......
}
The error results from only one of the datasources being used: the Oracle datasource ends up being used to query the Spring Batch job repository, which fails. Here is the key portion of the stack trace; it is trying to use the Oracle datasource to query the JobRepository tables and fails as a result:
Caused by: org.springframework.jdbc.BadSqlGrammarException:
PreparedStatementCallback; bad SQL grammar [SELECT JOB_INSTANCE_ID, JOB_NAME from
BATCH_JOB_INSTANCE
where JOB_NAME = ? and JOB_KEY = ?]; nested exception is
java.sql.SQLSyntaxErrorException: ORA-00942: table or view does not exist
In the class JobRepositoryConfig, in this bean:
#Bean("symphonyJobRepository")
public JobRepository symphonyJobRepository(DataSource repoDataSource) throws Exception {
JobRepositoryFactoryBean jobRepositoryFactoryBean = new JobRepositoryFactoryBean();
jobRepositoryFactoryBean.setDataSource(repoDataSource);
jobRepositoryFactoryBean.setTransactionManager(repoTransactionManager());
jobRepositoryFactoryBean.setDatabaseType(DatabaseType.H2.getProductName());
return jobRepositoryFactoryBean.getObject();
}
you didn't use the field
@Qualifier("repoDataSource") final DataSource repoDataSource;
The method parameter is resolved on its own, so Spring injects the DataSource bean annotated with @Primary.
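One way to make the intended datasource explicit (a sketch based on the code above, not part of the original answer) is to qualify the method parameter itself:

@Bean("repoJobRepository")
public JobRepository repoJobRepository(@Qualifier("repoDataSource") DataSource repoDataSource) throws Exception {
    JobRepositoryFactoryBean jobRepositoryFactoryBean = new JobRepositoryFactoryBean();
    // now the H2 repo datasource is injected instead of the @Primary Oracle one
    jobRepositoryFactoryBean.setDataSource(repoDataSource);
    jobRepositoryFactoryBean.setTransactionManager(repoTransactionManager());
    jobRepositoryFactoryBean.setDatabaseType(DatabaseType.H2.getProductName());
    return jobRepositoryFactoryBean.getObject();
}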
I fixed it by making one bean the primary and also adding the qualifiers on the specific beans, which had been missing, an omission on my part. For example, here I added the @Qualifier:
@Primary
@Bean(name = "symphonyDataSource")
@Qualifier("symphonyDataSource") // This was missing
public DataSource dataSourcesymphony() {
    HikariDataSource dataSource = symphonyDataSourceProperties().initializeDataSourceBuilder().type(HikariDataSource.class)
            .build();
    return dataSource;
}

Annotation @Value is not working in Spring Boot

I want to read a few values from a properties file through the @Value annotation, but I am getting an error.
Injection of autowired dependencies failed; nested exception is java.lang.IllegalArgumentException: Could not resolve placeholder 'rcm.datasource.driverClassName' in value "${rcm.datasource.driverClassName}"
properties file
rcm.datasource.driverClassName=com.microsoft.sqlserver.jdbc.SQLServerDriver
java class
@Configuration
public class RcmDBConfig {

    @Value("${rcm.datasource.driverClassName}")
    private String driverClassName;

    @Bean(name = "rcmEntityManagerFactory")
    public LocalContainerEntityManagerFactoryBean productEntityManager() {
        LocalContainerEntityManagerFactoryBean em = new LocalContainerEntityManagerFactoryBean();
        System.out.println(driverClassName);
        // ... rest of the entity manager configuration omitted in the question
        return em;
    }
}
The value can be injected if you have specified the component-scan packages properly; otherwise Spring Boot cannot detect the packages for auto-configuration.
Ex:
@Configuration
@EnableAutoConfiguration
@ComponentScan({"com.deb.xyz.*"})
@SpringBootApplication
public class Application {

    public static void main(String[] args) {
        SpringApplication.run(Application.class, args);
    }
}
and your RcmDBConfig.java must be in a package like com.deb.xyz.config.
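If the rcm.datasource.* entries live in a file other than application.properties, the placeholder also cannot resolve until that file is loaded. A sketch of one way to do that (the file name rcm.properties is an assumption) is to declare it with @PropertySource:

@Configuration
@PropertySource("classpath:rcm.properties") // hypothetical file name; adjust to wherever the properties actually live
public class RcmDBConfig {

    @Value("${rcm.datasource.driverClassName}")
    private String driverClassName;

    // ...
}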
You can use it like this:
@Configuration
public class MyDatabaseConfig {

    @Bean(name = "myDataSource")
    @Primary
    @ConfigurationProperties(prefix = "spring.datasource")
    public DataSource myDataSource() {
        return DataSourceBuilder.create().type(HikariDataSource.class).build();
    }
}
Here is the yml file:
spring:
  datasource:
    driver-class-name: com.mysql.cj.jdbc.Driver
    username: ****
    password: ***********
    jdbc-url: jdbc:mysql://localhost:3306/myshema?autoReconnect=true&autoReconnectForPools=true&zeroDateTimeBehavior=convertToNull&serverTimezone=UTC
Or, if you just want to read yml or properties files into a Java class, do it like this:
@Component
@ConfigurationProperties("spring.datasource")
@Getter
@Setter
public class SpringYMLData {
    // relaxed binding maps driver-class-name and jdbc-url onto these camelCase fields
    private String driverClassName;
    private String username;
    private String password;
    private String jdbcUrl;
}
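The bound values can then be injected wherever they are needed, for example (a small sketch using the class above):

@Component
public class DataSourceInfoLogger {

    private final SpringYMLData springYMLData;

    public DataSourceInfoLogger(SpringYMLData springYMLData) {
        this.springYMLData = springYMLData;
    }

    @PostConstruct
    public void logDriver() {
        // getters are generated by Lombok's @Getter
        System.out.println(springYMLData.getDriverClassName());
    }
}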

Spring boot multiple data sources - Connectivity Error Handling

I'm working with multiple data sources (Oracle and SQL Server) in a Spring Boot REST application. The application has more than 25 endpoints for processing client requests. But when one of the databases (Oracle or SQL Server) is down or unavailable for some reason, my application is unable to start.
I looked at a couple of examples on Google and Stack Overflow, but they're different from what I'm looking for...
package com.foobar;

@Configuration
@EnableTransactionManagement
@EnableJpaRepositories(
        entityManagerFactoryRef = "entityManagerFactory",
        basePackages = { "com.foobar.foo.repo" }
)
public class FooDbConfig {

    @Primary
    @Bean(name = "dataSource")
    @ConfigurationProperties(prefix = "spring.datasource")
    public DataSource dataSource() {
        return DataSourceBuilder.create().build();
    }

    @Primary
    @Bean(name = "entityManagerFactory")
    public LocalContainerEntityManagerFactoryBean entityManagerFactory(
            EntityManagerFactoryBuilder builder,
            @Qualifier("dataSource") DataSource dataSource) {
        return builder
                .dataSource(dataSource)
                .packages("com.foobar.foo.domain")
                .persistenceUnit("foo")
                .build();
    }

    @Primary
    @Bean(name = "transactionManager")
    public PlatformTransactionManager transactionManager(
            @Qualifier("entityManagerFactory") EntityManagerFactory entityManagerFactory) {
        return new JpaTransactionManager(entityManagerFactory);
    }
}
The same configuration is used for the 2nd data source, but with different properties.
I'm using the example below as a base code reference to implement my requirements:
Example link
I'm looking for a solution where, if only one of the N DB servers is available, the application still starts and processes client requests, and whenever the 2nd DB server becomes available it connects automatically and serves the other endpoints' requests.
I recently created a solution for multitenancy with multiple datasources using Liquibase; if you don't use Liquibase, just remove that part and it works too!
Example of application.yml
spring:
  dataSources:
    - tenantId: db1
      url: jdbc:postgresql://localhost:5432/db1
      username: postgres
      password: 123456
      driver-class-name: org.postgresql.Driver
      liquibase:
        enabled: true
        default-schema: public
        change-log: classpath:db/master/changelog/db.changelog-master.yaml
    - tenantId: db2
      url: jdbc:postgresql://localhost:5432/db2
      username: postgres
      password: 123456
      driver-class-name: org.postgresql.Driver
    - tenantId: db3
      url: jdbc:postgresql://localhost:5432/db3
      username: postgres
      password: 123456
      driver-class-name: org.postgresql.Driver
DataSourceConfiguration
@Configuration
@EnableTransactionManagement
@EntityScan(basePackages = { "br.com.dijalmasilva.springbootmultitenancyliquibase" })
@EnableJpaRepositories(basePackages = { "br.com.dijalmasilva.springbootmultitenancyliquibase" })
public class DataSourceConfiguration {

    @Bean(name = "dataSources")
    @Primary
    public Map<Object, Object> getDataSources(DataSourceProperties dataSourceProperties) {
        return dataSourceProperties.getDataSources().stream().map(dataSourceProperty -> {
            DataSource dataSource = DataSourceBuilder.create()
                    .url(dataSourceProperty.getUrl())
                    .username(dataSourceProperty.getUsername())
                    .password(dataSourceProperty.getPassword())
                    .driverClassName(dataSourceProperty.getDriverClassName())
                    .build();
            return new TenantIdDataSource(dataSourceProperty.getTenantId(), dataSource);
        }).collect(Collectors.toMap(TenantIdDataSource::getTenantId, TenantIdDataSource::getDataSource));
    }

    @Bean(name = "tenantRoutingDataSource")
    @DependsOn("dataSources")
    public DataSource dataSource(Map<Object, Object> dataSources) {
        AbstractRoutingDataSource tenantRoutingDataSource = new TenantRoutingDataSource();
        tenantRoutingDataSource.setTargetDataSources(dataSources);
        tenantRoutingDataSource.setDefaultTargetDataSource(dataSources.get("db1"));
        tenantRoutingDataSource.afterPropertiesSet();
        return tenantRoutingDataSource;
    }

    @Data
    @AllArgsConstructor
    private class TenantIdDataSource {
        private Object tenantId;
        private Object dataSource;
    }
}
TenantRoutingDataSource
public class TenantRoutingDataSource extends AbstractRoutingDataSource {

    @Override
    protected Object determineCurrentLookupKey() {
        return TenantContext.getCurrentTenant();
    }
}
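TenantContext is referenced above but not shown in the answer; a minimal ThreadLocal-based sketch (names assumed, the linked project contains its own version) could look like this:

public class TenantContext {

    private static final ThreadLocal<String> CURRENT_TENANT = new ThreadLocal<>();

    public static void setCurrentTenant(String tenantId) {
        CURRENT_TENANT.set(tenantId);
    }

    public static String getCurrentTenant() {
        return CURRENT_TENANT.get();
    }

    public static void clear() {
        CURRENT_TENANT.remove();
    }
}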
DataSourceProperties
@Data
@Component
@ConfigurationProperties(prefix = "spring")
public class DataSourceProperties {
    private List<DataSourceProperty> dataSources = new ArrayList<>();
}
DataSourceProperty
@Data
public class DataSourceProperty {
    private String tenantId;
    private String url;
    private String username;
    private String password;
    private String driverClassName;
    private LiquibaseProperties liquibase;
}
See the complete code; maybe it will help you!
Link of project: https://github.com/dijalmasilva/spring-boot-multitenancy-datasource-liquibase

How to use Multiple JdbcOperations and Multiple JdbcTemplates in Spring

I have 2 different datasources which I want to use in the same file, querying each of them through a JdbcOperations implementation. Is this possible?
@Repository
public class TestRepository {

    private JdbcOperations jdbcOperations;

    @Inject
    @Qualifier("dataSource1")
    private DataSource dataSource1;

    @Inject
    @Qualifier("dataSource2")
    private DataSource dataSource2;

    @Bean
    @Qualifier("jdbcTemplate1")
    public JdbcTemplate jdbcTemplate1(@Qualifier("dataSource1") DataSource dataSource) {
        return new JdbcTemplate(dataSource);
    }

    @Bean
    @Qualifier("jdbcTemplate2")
    public JdbcTemplate jdbcTemplate2(@Qualifier("dataSource2") DataSource dataSource) {
        return new JdbcTemplate(dataSource);
    }

    @Inject
    public TestRepository(JdbcOperations jdbcOperations) {
        this.jdbcOperations = jdbcOperations; // HOW DO I SPECIFY WHICH JDBCTEMPLATE SHOULD BE USED FOR INITIALIZING THIS JDBCOPERATIONS
    }
}
Above is my code; note that JdbcOperations is initialized in the constructor, but there is no way to specify which JdbcTemplate the jdbcOperations should use.
The qualifier should actually be put at the parameter level:
public TestRepository(@Qualifier("jdbcTemplate2") JdbcOperations jdbcOperations) {
    this.jdbcOperations = jdbcOperations;
}
This uses the bean named jdbcTemplate2.
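As a side note (not part of the original answer), the @Bean methods for the templates normally belong in a @Configuration class rather than in the repository itself, for example:

@Configuration
public class JdbcTemplateConfig {

    @Bean
    public JdbcTemplate jdbcTemplate1(@Qualifier("dataSource1") DataSource dataSource) {
        return new JdbcTemplate(dataSource);
    }

    @Bean
    public JdbcTemplate jdbcTemplate2(@Qualifier("dataSource2") DataSource dataSource) {
        return new JdbcTemplate(dataSource);
    }
}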

Spring jdbc configuration

I have been trying to implement a web service using Spring. This web service provides data access to a MySQL database using JDBC. I am trying not to use any XML configuration files, and I have run into a problem connecting to the database.
I am following the tutorial http://spring.io/guides/tutorials/rest/ but I changed a few things along the way.
Now that I am trying to implement the connection to the database, I get an error when starting the Tomcat instance, and I guess the problem is within the configuration.
Here follows some of my code:
Datasource configuration:
@Configuration
@Profile("mySQL")
@PropertySource("classpath:/services.properties")
public class MySQLDataSourceConfiguration implements DataSourceConfiguration {

    @Inject
    private Environment environment;

    @Bean
    public DataSource dataSource() throws Exception {
        BasicDataSource dataSource = new BasicDataSource();
        dataSource.setPassword(environment.getProperty("dataSource.password"));
        dataSource.setUrl(environment.getProperty("dataSource.url"));
        dataSource.setUsername(environment.getProperty("dataSource.user"));
        dataSource.setDriverClassName(environment.getPropertyAsClass("dataSource.driverClass", Driver.class).getName());
        return dataSource;
    }
}
The file services.properties is where I keep my database configuration, so when I want to change the database I only have to change 4 fields.
The JdbcConfiguration class for the setup of the JdbcTemplate:
@Configuration
@EnableTransactionManagement
@PropertySource("classpath:/services.properties")
@Import({ MySQLDataSourceConfiguration.class })
public class JdbcConfiguration {

    @Autowired
    private DataSourceConfiguration dataSourceConfiguration;

    @Inject
    private Environment environment;

    @Bean
    public JdbcTemplate setupJdbcTemplate() throws Exception {
        return new JdbcTemplate(dataSourceConfiguration.dataSource());
    }

    @Bean
    public PlatformTransactionManager transactionManager(DataSource dataSource) throws Exception {
        return new DataSourceTransactionManager(dataSource);
    }
}
Then there is the repository, which receives the template.
@Transactional
@Repository
@Qualifier("jdbcRepository")
public class JdbcIndividualRepository implements IndividualsRepository {

    private static final Logger LOG = LoggerFactory.getLogger(JdbcIndividualRepository.class);

    @Autowired
    private JdbcTemplate jdbcTemplate;

    @Autowired
    public JdbcIndividualRepository(DataSource jdbcDataSource) {
        LOG.info("JDBCRepo arg constructor");
        this.jdbcTemplate = new JdbcTemplate(jdbcDataSource);
    }

    @Override
    public Individual save(Individual save) {
        String sql = "INSERT INTO Individual(idIndividual, Name) VALUES(?,?)";
        this.jdbcTemplate.update(sql, save.getId(), save.getName());
        return save;
    }

    @Override
    public void delete(String key) {
        String sql = "DELETE FROM Individual WHERE idIndividual=?";
        jdbcTemplate.update(sql, key);
    }

    @Override
    public Individual findById(String key) {
        String sql = "SELECT i.* FROM Individual i WHERE i.idIndividual=?";
        return this.jdbcTemplate.queryForObject(sql, new IndividualRowMapper(), key);
    }

    @Override
    public List<Individual> findAll() {
        String sql = "SELECT * FROM Individual";
        return new LinkedList<Individual>(this.jdbcTemplate.query(sql, new IndividualRowMapper()));
    }
}
Then I register the jdbc configuration in the initializer class when creating the root context of the application as follows:
private WebApplicationContext createRootContext(ServletContext servletContext) {
    AnnotationConfigWebApplicationContext rootContext = new AnnotationConfigWebApplicationContext();
    rootContext.register(CoreConfig.class, SecurityConfig.class, JdbcConfiguration.class);
    rootContext.refresh();
    servletContext.addListener(new ContextLoaderListener(rootContext));
    servletContext.setInitParameter("defaultHtmlEscape", "true");
    return rootContext;
}
However, the Tomcat server won't run because it can't autowire the class MySQLDataSourceConfiguration.
Does anyone know what the problem might be? I can give more details on the code, but the question is already really large.
I appreciate any kind of help!
Cheers
EDIT
Solved by changing the JdbcConfiguration class to:
@Configuration
@EnableTransactionManagement
@PropertySource("classpath:/services.properties")
@Import({ MySQLDataSourceConfiguration.class })
public class JdbcConfiguration {

    @Autowired
    private DataSource dataSource;

    @Inject
    private Environment environment;

    @Bean
    public JdbcTemplate setupJdbcTemplate() throws Exception {
        return new JdbcTemplate(dataSource);
    }

    @Bean
    public PlatformTransactionManager transactionManager(DataSource dataSource) throws Exception {
        return new DataSourceTransactionManager(dataSource);
    }

    @Bean
    public IndividualsRepository createRepo() {
        return new JdbcIndividualRepository(dataSource);
    }
}
Remove
@Autowired
private DataSourceConfiguration dataSourceConfiguration;
because that's not how it's supposed to be used. Instead, add the following to the same class:
@Autowired DataSource dataSource;
and use it like this: new JdbcTemplate(dataSource);
Also, try adding @ComponentScan to the JdbcConfiguration class. From what I see in your code, the class JdbcIndividualRepository is not picked up by anything.
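For example (a sketch; the base package is an assumption), so that JdbcIndividualRepository is actually discovered:

@Configuration
@EnableTransactionManagement
@PropertySource("classpath:/services.properties")
@ComponentScan("com.example.myapp.repository") // assumed package containing JdbcIndividualRepository
@Import({ MySQLDataSourceConfiguration.class })
public class JdbcConfiguration {
    // ... beans as above
}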
In your class JdbcConfiguration, you are trying to autowire DataSourceConfiguration. I'm not really sure if that's possible - typically you should autowire the DataSource, not the DataSourceConfiguration.
@Import({ MySQLDataSourceConfiguration.class })
public class JdbcConfiguration {

    @Autowired
    private DataSource dataSource;

    @Bean
    public JdbcTemplate setupJdbcTemplate() throws Exception {
        return new JdbcTemplate(dataSource);
    }
Also if you have several DataSources and you're using Spring profiles to separate them, it's easier to provide all the DataSource beans in one file and annotate each bean with a different profile:
@Configuration
public class DataSourceConfig {

    @Bean
    @Profile("Test")
    public DataSource devDataSource() {
        // ... configure data source
    }

    @Bean
    @Profile("Prod")
    public DataSource prodDataSource() {
        // ... configure data source
    }
}

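The matching profile can then be activated at startup so that only the corresponding DataSource bean is created, e.g. with spring.profiles.active=Prod in application.properties or -Dspring.profiles.active=Prod on the command line.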