I'm getting the error below after 5-6 requests:
org.springframework.dao.DataAccessResourceFailureException
Unable to acquire JDBC Connection; nested exception is org.hibernate.exception.JDBCConnectionException: Unable to acquire JDBC Connection
The code below works perfectly, except that it exhausts the connection pool after a few requests.
I'm new to the Spring framework and put all of this together from online samples. I have tried a few variants and they all failed. Any help would be appreciated. Thanks.
application.yml
spring:
  datasource:
    type: com.zaxxer.hikari.HikariDataSource
    dataSourceClassName: com.mysql.jdbc.jdbc2.optional.MysqlDataSource
    jdbcUrl: jdbc:mysql://localhost:3306/db_name?autoReconnect=true&useSSL=false&useUnicode=true&characterEncoding=utf8
    catalog: db_name
    username: myusername
    password: mypassword
    testOnBorrow: true
    validationQuery: SELECT 1
    testWhileIdle: true
    timeBetweenEvictionRunsMillis: 3600000
  jpa:
    show_sql: true
    hibernate:
      dialect: org.hibernate.dialect.MySQLDialect
      show_sql: false
      format_sql: true
      connection:
        provider_class: com.zaxxer.hikari.hibernate.HikariConnectionProvider
        release_mode: after_transaction
...
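Note: testOnBorrow, testWhileIdle and timeBetweenEvictionRunsMillis are DBCP/Tomcat-pool property names and are not recognized by HikariCP; the rough Hikari equivalents are connectionTestQuery, idleTimeout and maxLifetime. A minimal programmatic sketch of those settings, with purely illustrative values (not the accepted fix below):

import com.zaxxer.hikari.HikariConfig;
import com.zaxxer.hikari.HikariDataSource;

public class HikariSettingsSketch {

    public static HikariDataSource buildPool() {
        HikariConfig config = new HikariConfig();
        config.setJdbcUrl("jdbc:mysql://localhost:3306/db_name?useSSL=false");
        config.setUsername("myusername");
        config.setPassword("mypassword");
        // Hikari validates connections itself; a test query is only needed for
        // drivers that do not support JDBC4 Connection.isValid().
        config.setConnectionTestQuery("SELECT 1");
        // Rough equivalents of the DBCP-style idle-eviction settings above.
        config.setIdleTimeout(600_000);   // ms a connection may sit idle
        config.setMaxLifetime(1_800_000); // ms before a connection is retired
        config.setMaximumPoolSize(10);
        return new HikariDataSource(config);
    }
}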
ApplicationConfiguration.java
@Configuration
@PropertySource("classpath:application.yml")
@EnableTransactionManagement
@EnableSwagger2
@EntityScan("com...dal.data")
public class ApplicationConfiguration extends WebMvcConfigurerAdapter {

    @Configuration
    @ConfigurationProperties(prefix = "spring.datasource")
    public class JpaConfig extends HikariConfig {}

    @Autowired
    private JpaConfig jpaConfig;

    @Bean(destroyMethod = "close")
    public DataSource dataSource() {
        return new HikariDataSource(jpaConfig);
    }

    @Bean
    public SessionFactory sessionFactory() {
        LocalSessionFactoryBuilder factoryBuilder = new LocalSessionFactoryBuilder(dataSource());
        factoryBuilder.addAnnotatedClasses(
            com...dal.data.MyEntity.class, ...
        );
        return factoryBuilder.buildSessionFactory();
    }
}
TestDaoImpl.java
@Repository
@Scope(value = "request", proxyMode = ScopedProxyMode.TARGET_CLASS)
public class TestDaoImpl implements TestDao {

    private static final Logger logger = LoggerFactory.getLogger(TestDaoImpl.class);

    @PersistenceContext
    private EntityManager em;

    @SuppressWarnings("unchecked")
    @Override
    public List<MyEntity> getEntities() {
        return em.unwrap(Session.class)
                .createCriteria(MyEntity.class, "myEntity")
                .list();
    }

    @Override
    @Transactional
    public void saveTest(MyEntity test) throws OperationException {
        try {
            em.persist(test);
        } catch (Exception e) {
            logger.error("ERROR saving test", e);
            throw new OperationException("PS-SERVER");
        }
    }
}
This code works fine.
The issue was that another @Repository class in the project was doing
@Inject
private SessionFactory sessionFactory;
which was eating up the connections, even though that code is never called by the test service. I'm still not sure how that works, but once I replaced that code with
@PersistenceContext
private EntityManager em;
it worked.
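For illustration, a minimal sketch of what the corrected repository looks like after the swap; the class and entity names here are placeholders, not from the original project:

@Repository
public class OtherDaoImpl { // hypothetical name

    // Container-managed EntityManager instead of an injected SessionFactory,
    // so connections are tied to the surrounding Spring transaction and
    // released when it ends.
    @PersistenceContext
    private EntityManager em;

    @Transactional(readOnly = true)
    public List<OtherEntity> findAll() { // OtherEntity is a placeholder
        return em.createQuery("select o from OtherEntity o", OtherEntity.class)
                 .getResultList();
    }
}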
Related
I'm using both NamedParameterJdbcTemplate and Spring Data JPA in my project, so I have two configuration classes that prepare the DataSource for a DB2 database. But when I run the project I get a "Name cannot be null" error while creating the EntityManagerFactory. I'm not able to identify the root cause of this problem; can somebody please point out the problem and the mistakes I am making?
I tried put("hibernate.ddl-auto", "none"); as well, but I get the same error.
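Note: hibernate.ddl-auto is the Spring Boot key (spring.jpa.hibernate.ddl-auto), not a property Hibernate itself reads; the native Hibernate key is hibernate.hbm2ddl.auto. A small sketch of what disabling schema tooling would look like in the JPA properties used further down (the other entry is copied from that block; whether this resolves the error here is not confirmed):

Properties jpaReadProperties = new Properties();
jpaReadProperties.put("hibernate.dialect", "org.hibernate.dialect.DB2Dialect");
// Native Hibernate key; "hibernate.ddl-auto" on its own is not read by Hibernate.
jpaReadProperties.put("hibernate.hbm2ddl.auto", "none");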
Error
Caused by: javax.persistence.PersistenceException: [PersistenceUnit: read] Unable to build Hibernate SessionFactory
at org.hibernate.jpa.boot.internal.EntityManagerFactoryBuilderImpl.persistenceException(EntityManagerFactoryBuilderImpl.java:954)
at org.hibernate.jpa.boot.internal.EntityManagerFactoryBuilderImpl.build(EntityManagerFactoryBuilderImpl.java:882)
at org.hibernate.jpa.HibernatePersistenceProvider.createContainerEntityManagerFactory(HibernatePersistenceProvider.java:135)
at org.springframework.orm.jpa.LocalContainerEntityManagerFactoryBean.createNativeEntityManagerFactory(LocalContainerEntityManagerFactoryBean.java:353)
at org.springframework.orm.jpa.AbstractEntityManagerFactoryBean.buildNativeEntityManagerFactory(AbstractEntityManagerFactoryBean.java:370)
at org.springframework.orm.jpa.AbstractEntityManagerFactoryBean.afterPropertiesSet(AbstractEntityManagerFactoryBean.java:359)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.invokeInitMethods(AbstractAutowireCapableBeanFactory.java:1687)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.initializeBean(AbstractAutowireCapableBeanFactory.java:1624)
... 16 common frames omitted
Caused by: java.lang.IllegalArgumentException: Name cannot be null
at org.hibernate.boot.model.relational.QualifiedNameParser$NameParts.<init>(QualifiedNameParser.java:34)
at org.hibernate.boot.model.relational.QualifiedNameImpl.<init>(QualifiedNameImpl.java:24)
at org.hibernate.boot.model.relational.QualifiedSequenceName.<init>(QualifiedSequenceName.java:16)
at org.hibernate.tool.schema.extract.internal.SequenceInformationExtractorLegacyImpl.extractMetadata(SequenceInformationExtractorLegacyImpl.java:51)
at org.hibernate.tool.schema.extract.internal.DatabaseInformationImpl.initializeSequences(DatabaseInformationImpl.java:64)
at org.hibernate.tool.schema.extract.internal.DatabaseInformationImpl.<init>(DatabaseInformationImpl.java:60)
at org.hibernate.tool.hbm2ddl.SchemaUpdate.execute(SchemaUpdate.java:123)
at org.hibernate.tool.hbm2ddl.SchemaUpdate.execute(SchemaUpdate.java:101)
at org.hibernate.internal.SessionFactoryImpl.<init>(SessionFactoryImpl.java:472)
at org.hibernate.boot.internal.SessionFactoryBuilderImpl.build(SessionFactoryBuilderImpl.java:444)
at org.hibernate.jpa.boot.internal.EntityManagerFactoryBuilderImpl.build(EntityManagerFactoryBuilderImpl.java:879)
... 22 common frames omitted
Process finished with exit code
Configuration Classes
@Setter
@Getter
@Configuration
@PropertySource("classpath:application.yml")
@ConfigurationProperties("app.datasource.db2.credentials.hikari")
public class HikariReadProperties {
    private String poolName;
    private int minimumIdle;
    private int maximumPoolSize;
    private int idleTimeout;
    private String connectionTestQuery;
}

public class HikariConfigRead extends HikariConfig {

    protected final HikariReadProperties hikariReadProperties;
    protected final String PERSISTENCE_UNIT_NAME = "read";

    protected HikariConfigRead(HikariReadProperties hikariReadProperties) {
        this.hikariReadProperties = hikariReadProperties;
        setPoolName(this.hikariReadProperties.getPoolName());
        setMinimumIdle(this.hikariReadProperties.getMinimumIdle());
        setMaximumPoolSize(this.hikariReadProperties.getMaximumPoolSize());
        setIdleTimeout(this.hikariReadProperties.getIdleTimeout());
        setConnectionTestQuery(this.hikariReadProperties.getConnectionTestQuery());
    }
}
@Configuration
@ConfigurationProperties("app.datasource.db2.credentials")
@EnableTransactionManagement
@EnableJpaRepositories(entityManagerFactoryRef = "entityManagerFactory",
        transactionManagerRef = "transactionManagerRead", basePackages = {"com.testing.db2migration.repository"})
public class JpaConfiguration extends HikariConfigRead {

    protected JpaConfiguration(HikariReadProperties hikariReadProperties) {
        super(hikariReadProperties);
    }

    @Bean(name = "db2DataSource")
    public HikariDataSource hikariDataSource() {
        return new HikariDataSource(this);
    }

    @Bean(name = "entityManagerFactory")
    public LocalContainerEntityManagerFactoryBean entityManagerFactoryWrite(
            final @Qualifier("db2DataSource") HikariDataSource dataSourceWrite) {
        return new LocalContainerEntityManagerFactoryBean() {{
            setDataSource(dataSourceWrite);
            setPersistenceProviderClass(HibernatePersistenceProvider.class);
            setPersistenceUnitName(PERSISTENCE_UNIT_NAME);
            setPackagesToScan("com.testing.db2migration.model");
            Properties JPA_READ_PROPERTIES = new Properties() {{
                put("hibernate.dialect", "org.hibernate.dialect.DB2Dialect");
                put("hibernate.hbm2ddl.auto", "update");
                put("hibernate.ddl-auto", "update");
                put("show-sql", "true");
            }};
            setJpaProperties(JPA_READ_PROPERTIES);
        }};
    }

    @Bean(name = "transactionManagerRead")
    public PlatformTransactionManager transactionManagerWrite(
            @Qualifier("entityManagerFactory") EntityManagerFactory entityManagerFactoryWrite) {
        return new JpaTransactionManager(entityManagerFactoryWrite);
    }
}
public class BaseConfig {

    protected Environment environment;

    protected DataSource getHikariDataSource() {
        HikariDataSource ds = new HikariDataSource();
        String driverClassName = environment.getProperty(Constants.ENVIRONMENT_ROOT + "db2-driverClassName");
        String url = environment.getProperty(Constants.ENVIRONMENT_ROOT + "db2-url");
        String userName = environment.getProperty(Constants.ENVIRONMENT_ROOT + "db2-username");
        String password = environment.getProperty(Constants.ENVIRONMENT_ROOT + "db2-password");
        String dbTestQuery = environment.getProperty(Constants.ENVIRONMENT_ROOT + "db2-dbTestQuery");
        ds.setJdbcUrl(url);
        ds.setUsername(userName);
        ds.setPassword(password);
        ds.setDriverClassName(driverClassName);
        ds.setConnectionTestQuery(dbTestQuery);
        return ds;
    }
}
@Configuration
@ComponentScan("com.testing")
public class LocalConfig extends BaseConfig implements EnvironmentAware {

    @Override
    public void setEnvironment(Environment environment) {
        this.environment = environment;
    }

    @Bean(name = "dataSource")
    @Primary
    public DataSource dataSource() throws SQLException {
        return this.getHikariDataSource();
    }

    @Bean("jdbcTemplate")
    @Autowired
    public JdbcTemplate jdbcTemplate(DataSource dataSource) {
        return new JdbcTemplate(dataSource);
    }

    ..................
    // Normal bean declarations
}
application.yaml
vcap:
  services:
    store-service:
      credentials:
        env: QA
        db2-driverClassName: com.ibm.db2.jcc.DB2Driver
        db2-url: jdbc:db2://localhost:/APP1
        db2-schema: YQ1MM
        db2-username: ******
        db2-password: *****
        db2-dbTestQuery: SELECT CURRENT SQLID FROM SYSIBM.SYSDUMMY1
app:
  datasource:
    db2:
      credentials:
        env: QA
        driver-class-name: com.ibm.db2.jcc.DB2Driver
        jdbc-url: jdbc:db2://localhost/APP1
        hibernate:
          default_schema: YQ1MM
        username: *****
        password: *****
        hikari:
          maximum-pool-size: 10
          connectionTestQuery: SELECT CURRENT SQLID FROM SYSIBM.SYSDUMMY1
Model
@Entity
@Table(name = "epr_str")
public class EprStr {

    @Id
    @Column(name = "str_bu_id")
    private String strBuId;

    @Column(name = "str_nbr")
    private String strNbr;
}
Table Structure
str_bu_id is the primary key.
I have two JDBC datasources defined in a Spring Boot application, used in a Spring Batch job. However, after autowiring the datasources, only one gets used: the one annotated @Primary. If I place the annotation on the other JDBC datasource, that one gets used instead. In a nutshell, only one of the JDBC datasources ever gets used. I use Lombok in some places, but I'm unsure whether that plays a part.
Here are the datasources:
application.yml
symphony:
  datasource:
    driver-class-name: oracle.jdbc.OracleDriver
    url: ...
    type: com.zaxxer.hikari.HikariDataSource
    username: <USR>
    password: <PWD>
    jndi-name: false
repo:
  datasource:
    driver-class-name: org.h2.Driver
    url: jdbc:h2:mem:db;DB_CLOSE_DELAY=-1
    username: sa
    password: sa
    jndi-name: false
Here is the first datasource:
@Configuration
public class RepoDbConfig {

    @Bean
    @ConfigurationProperties("repo.datasource")
    public DataSourceProperties repoDataProperties() {
        return new DataSourceProperties();
    }

    @Bean(name = "repoDataSource")
    public DataSource dataSourcerepo() {
        DataSource dataSource = repoDataProperties().initializeDataSourceBuilder()
                .type(BasicDataSource.class)
                .build();
        return dataSource;
    }

    @Bean(name = "repoJdbcTemplate")
    public JdbcTemplate repoJdbcTemplate(DataSource repoDataSource) {
        return new JdbcTemplate(repoDataSource);
    }
}
Here is the second datasource:
@Configuration
public class SymphonyDbConfig {

    @Primary
    @Bean
    @ConfigurationProperties("symphony.datasource")
    public DataSourceProperties symphonyDataSourceProperties() {
        return new DataSourceProperties();
    }

    @Primary
    @Bean(name = "symphonyDataSource")
    public DataSource dataSourcesymphony() {
        HikariDataSource dataSource = symphonyDataSourceProperties().initializeDataSourceBuilder()
                .type(HikariDataSource.class)
                .build();
        return dataSource;
    }

    @Primary
    @Bean(name = "symphonyJdbcTemplate")
    public JdbcTemplate symphonyJdbcTemplate(DataSource symphonyDataSource) {
        return new JdbcTemplate(symphonyDataSource);
    }
}
The JobRepository beans are configured like this:
@Configuration
@RequiredArgsConstructor
public class JobRepositoryConfig {

    @Qualifier("repoDataSource")
    final DataSource repoDataSource;

    @Bean("repoTransactionManager")
    AbstractPlatformTransactionManager repoTransactionManager() {
        return new ResourcelessTransactionManager();
    }

    @Bean("repoJobRepository")
    public JobRepository repoJobRepository(DataSource repoDataSource) throws Exception {
        JobRepositoryFactoryBean jobRepositoryFactoryBean = new JobRepositoryFactoryBean();
        jobRepositoryFactoryBean.setDataSource(repoDataSource);
        jobRepositoryFactoryBean.setTransactionManager(repoTransactionManager());
        jobRepositoryFactoryBean.setDatabaseType(DatabaseType.H2.getProductName());
        return jobRepositoryFactoryBean.getObject();
    }

    @Bean("repoAppJobLauncher")
    public JobLauncher careLocationAppJobLauncher(JobRepository repoJobRepository) {
        SimpleJobLauncher simpleJobLauncher = new SimpleJobLauncher();
        simpleJobLauncher.setJobRepository(repoJobRepository);
        return simpleJobLauncher;
    }
}
Finally, the Batch Job beans used for the job are configured here. The only part not shown is the launching of the job; all the required beans are shown here:
@Configuration
@EnableBatchProcessing
@EnableScheduling
@RequiredArgsConstructor
@Slf4j
public class CellBatchConfig {

    private final JobBuilderFactory jobBuilderFactory;

    @Qualifier("repoAppJobLauncher")
    private final JobLauncher repoAppJobLauncher;

    private final StepBuilderFactory stepBuilderFactory;

    @Value("${chunk-size}")
    private int chunkSize;

    @Qualifier("symphonyDataSource")
    final DataSource symphonyDataSource;

    @Qualifier("repoDataSource")
    final DataSource repoDataSource;

    @Bean
    public JdbcPagingItemReader<CenterDto> cellItemReader(PagingQueryProvider pagingQueryProvider) {
        return new JdbcPagingItemReaderBuilder<CenterDto>()
                .name("cellItemReader")
                .dataSource(symphonyDataSource)
                .queryProvider(pagingQueryProvider)
                .pageSize(chunkSize)
                .rowMapper(new CellRowMapper())
                .build();
    }

    @Bean
    public PagingQueryProvider pagingQueryProvider() {
        OraclePagingQueryProvider pagingQueryProvider = new OraclePagingQueryProvider();
        final Map<String, Order> sortKeys = new HashMap<>();
        sortKeys.put("ID", Order.ASCENDING);
        pagingQueryProvider.setSortKeys(sortKeys);
        pagingQueryProvider.setSelectClause(" ID, CELL_NO, MAT_VO ");
        pagingQueryProvider.setFromClause(" from pvc.cells");
        return pagingQueryProvider;
    }
    .......
}
The error results from only one of the datasources being used: that same datasource also ends up being used to query the Spring Batch job repository, which then fails. Here is the key portion of the stack trace; it is trying to use the Oracle datasource to query the JobRepository tables and fails as a result:
Caused by: org.springframework.jdbc.BadSqlGrammarException:
PreparedStatementCallback; bad SQL grammar [SELECT JOB_INSTANCE_ID, JOB_NAME from
BATCH_JOB_INSTANCE
where JOB_NAME = ? and JOB_KEY = ?]; nested exception is
java.sql.SQLSyntaxErrorException: ORA-00942: table or view does not exist
In the class JobRepositoryConfig, in the bean:
@Bean("symphonyJobRepository")
public JobRepository symphonyJobRepository(DataSource repoDataSource) throws Exception {
    JobRepositoryFactoryBean jobRepositoryFactoryBean = new JobRepositoryFactoryBean();
    jobRepositoryFactoryBean.setDataSource(repoDataSource);
    jobRepositoryFactoryBean.setTransactionManager(repoTransactionManager());
    jobRepositoryFactoryBean.setDatabaseType(DatabaseType.H2.getProductName());
    return jobRepositoryFactoryBean.getObject();
}
you didn't use the qualified field:
@Qualifier("repoDataSource")
final DataSource repoDataSource;
The unqualified DataSource method parameter shadows it, so Spring injects the DataSource bean annotated with @Primary.
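For illustration, a minimal sketch of the corrected bean with the qualifier applied directly to the parameter, using the names from the snippets above:

@Bean("symphonyJobRepository")
public JobRepository symphonyJobRepository(
        @Qualifier("repoDataSource") DataSource repoDataSource) throws Exception {
    JobRepositoryFactoryBean jobRepositoryFactoryBean = new JobRepositoryFactoryBean();
    // The qualified H2 datasource now backs the Batch metadata tables,
    // instead of the @Primary Oracle datasource.
    jobRepositoryFactoryBean.setDataSource(repoDataSource);
    jobRepositoryFactoryBean.setTransactionManager(repoTransactionManager());
    jobRepositoryFactoryBean.setDatabaseType(DatabaseType.H2.getProductName());
    return jobRepositoryFactoryBean.getObject();
}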
I fixed it by making one bean the primary and also adding the qualifiers on the specific beans, which had been missing (an omission on my part). For example, here I added the @Qualifier:
@Primary
@Bean(name = "symphonyDataSource")
@Qualifier("symphonyDataSource") // This was missing
public DataSource dataSourcesymphony() {
    HikariDataSource dataSource = symphonyDataSourceProperties().initializeDataSourceBuilder()
            .type(HikariDataSource.class)
            .build();
    return dataSource;
}
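On the consuming side, the same qualifier then has to be honoured where the datasource is injected. A sketch using an explicit constructor (with Lombok's @RequiredArgsConstructor, field-level @Qualifier is not copied to the generated constructor unless lombok.copyableAnnotations is configured, so an explicit constructor is the safer assumption here):

@Configuration
public class CellBatchConfig {

    private final DataSource symphonyDataSource;
    private final DataSource repoDataSource;

    // Explicit constructor so the qualifiers are applied during injection.
    public CellBatchConfig(@Qualifier("symphonyDataSource") DataSource symphonyDataSource,
                           @Qualifier("repoDataSource") DataSource repoDataSource) {
        this.symphonyDataSource = symphonyDataSource;
        this.repoDataSource = repoDataSource;
    }
}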
I'm working with multiple data sources (Oracle and SQL Server) in a Spring Boot REST application. The application has more than 25 endpoints for processing client requests. But when one of the databases is down, i.e. Oracle or SQL Server is not available for some reason, my application fails to start.
I looked at a couple of examples on Google and Stack Overflow, but they're different from what I'm looking for...
package com.foobar;

@Configuration
@EnableTransactionManagement
@EnableJpaRepositories(
    entityManagerFactoryRef = "entityManagerFactory",
    basePackages = { "com.foobar.foo.repo" }
)
public class FooDbConfig {

    @Primary
    @Bean(name = "dataSource")
    @ConfigurationProperties(prefix = "spring.datasource")
    public DataSource dataSource() {
        return DataSourceBuilder.create().build();
    }

    @Primary
    @Bean(name = "entityManagerFactory")
    public LocalContainerEntityManagerFactoryBean entityManagerFactory(
            EntityManagerFactoryBuilder builder,
            @Qualifier("dataSource") DataSource dataSource) {
        return builder
                .dataSource(dataSource)
                .packages("com.foobar.foo.domain")
                .persistenceUnit("foo")
                .build();
    }

    @Primary
    @Bean(name = "transactionManager")
    public PlatformTransactionManager transactionManager(
            @Qualifier("entityManagerFactory") EntityManagerFactory entityManagerFactory) {
        return new JpaTransactionManager(entityManagerFactory);
    }
}
The same configuration is used for the second datasource, but with different properties.
I'm using the example below as a base code reference to implement my requirements:
Example link
I'm looking for a solution where, if only one of the N DB servers is available, the application still starts and processes client requests, and whenever the second DB server becomes available it connects automatically and serves the other endpoints' requests.
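One related setting, assuming HikariCP is the connection pool (an aside, not what the answer below does): initializationFailTimeout can be set to a negative value so the pool, and therefore the application, starts even when its database is unreachable, and connection failures only surface on first use. A minimal sketch with placeholder connection details:

import com.zaxxer.hikari.HikariConfig;
import com.zaxxer.hikari.HikariDataSource;
import javax.sql.DataSource;

public class LenientDataSourceSketch {

    public static DataSource oracleDataSource() {
        HikariConfig config = new HikariConfig();
        config.setJdbcUrl("jdbc:oracle:thin:@//localhost:1521/XE"); // placeholder URL
        config.setUsername("user");     // placeholder
        config.setPassword("password"); // placeholder
        // Negative value: do not fail pool initialization if the DB is down;
        // the pool keeps trying to obtain connections in the background.
        config.setInitializationFailTimeout(-1);
        return new HikariDataSource(config);
    }
}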
I recently created a solution for multitenancy with multiple datasources, using Liquibase; if you don't use Liquibase, just remove that part and it works too!
Example of application.yml
spring:
  dataSources:
    - tenantId: db1
      url: jdbc:postgresql://localhost:5432/db1
      username: postgres
      password: 123456
      driver-class-name: org.postgresql.Driver
      liquibase:
        enabled: true
        default-schema: public
        change-log: classpath:db/master/changelog/db.changelog-master.yaml
    - tenantId: db2
      url: jdbc:postgresql://localhost:5432/db2
      username: postgres
      password: 123456
      driver-class-name: org.postgresql.Driver
    - tenantId: db3
      url: jdbc:postgresql://localhost:5432/db3
      username: postgres
      password: 123456
      driver-class-name: org.postgresql.Driver
DataSourceConfiguration
@Configuration
@EnableTransactionManagement
@EntityScan(basePackages = { "br.com.dijalmasilva.springbootmultitenancyliquibase" })
@EnableJpaRepositories(basePackages = { "br.com.dijalmasilva.springbootmultitenancyliquibase" })
public class DataSourceConfiguration {

    @Bean(name = "dataSources")
    @Primary
    public Map<Object, Object> getDataSources(DataSourceProperties dataSourceProperties) {
        return dataSourceProperties.getDataSources().stream().map(dataSourceProperty -> {
            DataSource dataSource = DataSourceBuilder.create()
                    .url(dataSourceProperty.getUrl())
                    .username(dataSourceProperty.getUsername())
                    .password(dataSourceProperty.getPassword())
                    .driverClassName(dataSourceProperty.getDriverClassName())
                    .build();
            return new TenantIdDataSource(dataSourceProperty.getTenantId(), dataSource);
        }).collect(Collectors.toMap(TenantIdDataSource::getTenantId, TenantIdDataSource::getDataSource));
    }

    @Bean(name = "tenantRoutingDataSource")
    @DependsOn("dataSources")
    public DataSource dataSource(Map<Object, Object> dataSources) {
        AbstractRoutingDataSource tenantRoutingDataSource = new TenantRoutingDataSource();
        tenantRoutingDataSource.setTargetDataSources(dataSources);
        tenantRoutingDataSource.setDefaultTargetDataSource(dataSources.get("db1"));
        tenantRoutingDataSource.afterPropertiesSet();
        return tenantRoutingDataSource;
    }

    @Data
    @AllArgsConstructor
    private class TenantIdDataSource {
        private Object tenantId;
        private Object dataSource;
    }
}
TenantRoutingDataSource
public class TenantRoutingDataSource extends AbstractRoutingDataSource {

    @Override
    protected Object determineCurrentLookupKey() {
        return TenantContext.getCurrentTenant();
    }
}
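TenantContext is referenced above but not shown in these snippets; a minimal sketch, assuming the usual ThreadLocal-based holder (the real class in the linked project may differ):

public class TenantContext {

    // Holds the tenant id (e.g. "db1") for the current request thread.
    private static final ThreadLocal<String> CURRENT_TENANT = new ThreadLocal<>();

    public static void setCurrentTenant(String tenantId) {
        CURRENT_TENANT.set(tenantId);
    }

    public static String getCurrentTenant() {
        return CURRENT_TENANT.get();
    }

    public static void clear() {
        CURRENT_TENANT.remove();
    }
}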
DataSourceProperties
@Data
@Component
@ConfigurationProperties(prefix = "spring")
public class DataSourceProperties {
    private List<DataSourceProperty> dataSources = new ArrayList<>();
}
DataSourceProperty
@Data
public class DataSourceProperty {
    private String tenantId;
    private String url;
    private String username;
    private String password;
    private String driverClassName;
    private LiquibaseProperties liquibase;
}
See the complete code; maybe it will help you!
Link of project: https://github.com/dijalmasilva/spring-boot-multitenancy-datasource-liquibase
I am using Spring Boot and Spring Data, and I want to use a MySQL datasource primarily, but fall back to an H2 datasource if the MySQL connection fails.
So far I can switch just by moving @Primary between the configurations, but if I put @Primary on MySQL (the main data source) and stop the MySQL server on my PC, the other bean does not come up... What do I need?
application.yml:
# Main properties
spring:
  application:
    name: app
  jpa:
    database: default
    show-sql: false
    hibernate:
      ddl-auto: update
    properties:
      hibernate:
        format_sql: false
        current_session_context_class: org.springframework.orm.hibernate5.SpringSessionContext

# Main database: MySQL
main.datasource:
  url: jdbc:mysql://localhost:3306/app?useSSL=false
  driver-class-name: com.mysql.jdbc.Driver
  username: sa
  password: sa

# Backup database: H2
backup.datasource:
  url: jdbc:h2:${project.directory}/app;DB_CLOSE_ON_EXIT=FALSE
  driver-class-name: org.h2.Driver
  username: sa
  password: sa
Main data source
@Configuration
@EnableTransactionManagement
@EnableJpaRepositories("org.app")
@EntityScan("org.app")
public class MainDataSourceConfig {

    @Primary
    @Bean(name = "mainDataSource")
    @ConfigurationProperties(prefix = "main.datasource")
    public DataSource mainDataSource() {
        return DataSourceBuilder.create().build();
    }
}
Backup data source:
@Configuration
@EnableTransactionManagement
@EnableJpaRepositories("org.app")
@EntityScan("org.app")
public class BackupDataSourceConfig {

    @Bean(name = "backupDataSource")
    @ConfigurationProperties(prefix = "backup.datasource")
    public DataSource backupDataSource() {
        return DataSourceBuilder.create().build();
    }
}
Thanks!
I figured out how to do it. Hope this can help someone:
@Configuration
@EnableTransactionManagement
@EnableJpaRepositories("org.app")
@EntityScan("org.app")
public class DataSourceConfig {

    private static final String USERNAME = "sa";
    private static final String PASSWORD = "sa";

    @Bean
    @Primary
    public DataSource dataSource() {
        DataSource dataSource;
        try {
            dataSource = getMainDataSource();
            dataSource.getConnection().isValid(500);
        } catch (Exception e) {
            log.error("Main database not valid.", e);
            dataSource = getBackupDataSource();
        }
        return dataSource;
    }

    private DataSource getMainDataSource() {
        return DataSourceBuilder.create()
                .driverClassName("com.mysql.jdbc.Driver")
                .username(USERNAME)
                .password(PASSWORD)
                .url("jdbc:mysql://localhost:3306/app?useSSL=false")
                .build();
    }

    private DataSource getBackupDataSource() {
        return DataSourceBuilder.create()
                .driverClassName("org.h2.Driver")
                .username(USERNAME)
                .password(PASSWORD)
                .url("jdbc:h2:/app;DB_CLOSE_ON_EXIT=FALSE")
                .build();
    }
}
Just one bean.
I am trying to use two datasources with my Spring Boot application and can't get the second datasource to autowire. I have tried many things, but this is the closest I have come:
My Yaml file:
spring:
  first-datasource:
    url: MyURLString1
    username: User
    password: Password
    driver-class-name: oracle.jdbc.OracleDriver
  second-datasource:
    url: MyURLString2
    username: User
    password: Password
    driver-class-name: oracle.jdbc.OracleDriver
My Application Class:
@SpringBootApplication
public class MyApplication {

    public static void main(String[] args) {
        SpringApplication.run(MyApplication.class, args);
    }

    @Bean
    @Primary
    @ConfigurationProperties(prefix = "spring.first-datasource")
    public DataSource firstDataSource() {
        return DataSourceBuilder.create().build();
    }

    @Bean
    @ConfigurationProperties(prefix = "spring.second-datasource")
    public DataSource secondDataSource() {
        return DataSourceBuilder.create().build();
    }
}
And Finally my DAO:
@Repository
public class MyDao {

    private static final String FIRST_SELECT = "select * from SomeTableInDB1";
    private static final String SECOND_SELECT = "select * from AnotherTableInDB2";

    @Autowired
    private JdbcTemplate firstJdbcTemplate;

    @Autowired
    @Qualifier("secondDataSource")
    private JdbcTemplate secondJdbcTemplate;

    List<DB1Entity> getDB1Entity(Long id) {
        return firstJdbcTemplate.query(FIRST_SELECT, new Object[] {id}, new BeanPropertyRowMapper(DB1Entity.class));
    }

    List<DB2Entity> getDB2Entity(Long id) {
        return secondJdbcTemplate.query(SECOND_SELECT, new Object[] {id}, new BeanPropertyRowMapper(DB2Entity.class));
    }
}
This is the closest I have come so far. I say closest because if I remove the @Qualifier, both of my DAO methods actually work, assuming the SECOND_SELECT statement is valid SQL for DB1. Once I put in the @Qualifier for my non-primary datasource, I get an autowire error because Spring is expecting a DataSource object, not a JdbcTemplate object. That is weird to me, as it does work with the primary datasource.
Here is my error:
Could not autowire field: private org.springframework.jdbc.core.JdbcTemplate org.my.classpath.secondJdbcTemplate; nested exception is org.springframework.beans.factory.NoSuchBeanDefinitionException: No qualifying bean of type [org.springframework.jdbc.core.JdbcTemplate] found for dependency: expected at least 1 bean which qualifies as autowire candidate for this dependency. Dependency annotations: {#org.springframework.beans.factory.annotation.Autowired(required=true), #org.springframework.beans.factory.annotation.Qualifier(value=secondDataSource)}
You create beans of type DataSource but try to autowire a JdbcTemplate, which is a mismatch. You probably should have something like this:
private JdbcTemplate jdbcTemplate1;
private JdbcTemplate jdbcTemplate2;

@Autowired
@Qualifier("firstDataSource")
public void setFirstDataSource(DataSource dataSource) {
    this.jdbcTemplate1 = new JdbcTemplate(dataSource);
}

@Autowired
@Qualifier("secondDataSource")
public void setSecondDataSource(DataSource dataSource) {
    this.jdbcTemplate2 = new JdbcTemplate(dataSource);
}
Ideally, but not as a mandate, one of the datasources should be marked @Primary for most of the default wiring via annotations to work. Also, we need to create a TransactionManager for each datasource separately, otherwise Spring would not know how to enforce transactions. The following is a complete example of how this should be done:
@Primary
@Bean(name = "dataSource")
@ConfigurationProperties(prefix = "datasource.mysql")
public DataSource dataSource() {
    return DataSourceBuilder.create().build();
}

@Primary
@Bean(name = "transactionManager")
public DataSourceTransactionManager transactionManager(@Qualifier("dataSource") DataSource dataSource) {
    return new DataSourceTransactionManager(dataSource);
}

@Bean(name = "postGresDataSource")
@ConfigurationProperties(prefix = "datasource.postgres")
public DataSource postgresDataSource() {
    return DataSourceBuilder.create().build();
}

@Bean(name = "postGresTransactionManager")
public DataSourceTransactionManager postGresTransactionManager(@Qualifier("postGresDataSource") DataSource dataSource) {
    return new DataSourceTransactionManager(dataSource);
}

@Transactional(transactionManager = "postGresTransactionManager")
public void createCustomer(Customer cust) {
    customerDAO.create(cust);
}

// Specifying a transactionManager attribute is optional if we
// want to use the default transactionManager, since we
// already marked one of the TMs above with @Primary.
@Transactional
public void createOrder(Order order) {
    orderDAO.create(order);
}
Hope this helps.
Here is another "not working" situation that confused me for several days: when configuring two data sources of the same type in a Spring Boot application, @Qualifier did not pick up the right beans as expected; it behaved as if it were not recognized by the Spring framework.
The reason is that @SpringBootApplication includes @EnableAutoConfiguration, and Spring Boot's auto-configuration sets up its own DataSource, which can badly interfere with how @Qualifier resolves your beans.
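A minimal sketch of one common way to keep Boot's DataSource auto-configuration out of the picture when all datasources are declared manually (assuming that trade-off is acceptable for the application):

import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.boot.autoconfigure.jdbc.DataSourceAutoConfiguration;

// Excluding DataSourceAutoConfiguration stops Boot from creating its own
// DataSource bean, so only the explicitly declared, qualified beans remain.
@SpringBootApplication(exclude = DataSourceAutoConfiguration.class)
public class MyApplication {
    public static void main(String[] args) {
        SpringApplication.run(MyApplication.class, args);
    }
}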
In my case, following @Aman Tuladhar's answer, it worked like this (Spring Boot 2.1.3.RELEASE):
@Configuration
public class JDBCConfig {

    @Bean("first-datasource")
    public JdbcTemplate paymentsJDBCTemplate(@Qualifier("first-db") DataSource paymentsDataSource) {
        return new JdbcTemplate(paymentsDataSource);
    }

    @Bean("second-datasource")
    public JdbcTemplate parametersJDBCTemplate(@Qualifier("second-db") DataSource paramsDataSource) {
        return new JdbcTemplate(paramsDataSource);
    }

    @Bean("first-db")
    public DataSource paymentsDataSource(Environment env) {
        return buildDataSource(env, "first");
    }

    @Bean("second-db")
    public DataSource paramsDataSource(Environment env) {
        return buildDataSource(env, "second");
    }

    private DataSource buildDataSource(Environment env, String prop) {
        DriverManagerDataSource dataSource = new DriverManagerDataSource();
        dataSource.setDriverClassName(env.getProperty("spring." + prop + "-datasource.driver-class-name"));
        dataSource.setUrl(env.getProperty("spring." + prop + "-datasource.url"));
        dataSource.setUsername(env.getProperty("spring." + prop + "-datasource.username"));
        dataSource.setPassword(env.getProperty("spring." + prop + "-datasource.password"));
        return dataSource;
    }
}
... and used like this:
@Autowired
@Qualifier("first-datasource")
private JdbcTemplate jdbcTemplate1;

@Autowired
@Qualifier("second-datasource")
private JdbcTemplate jdbcTemplate2;
... application.yml:
spring:
  # Keys match the "spring.first-datasource.*" / "spring.second-datasource.*"
  # properties read by buildDataSource() above.
  first-datasource:
    driver-class-name: com.mysql.cj.jdbc.Driver
    url: ${DATABASE_1_URL}
    username: ${DATABASE_1_USERNAME}
    password: ${DATABASE_1_PASSWORD}
  second-datasource:
    driver-class-name: com.mysql.cj.jdbc.Driver
    url: ${DATABASE_2_URL}
    username: ${DATABASE_2_USERNAME}
    password: ${DATABASE_2_PASSWORD}