Multiple DataSources referring to only one database in Spring Boot - spring

In my application I need to integrate two DataSources, but when I add the second database using JdbcTemplate the first one stops working; instead, all tables are created in the second datasource.
DataSource #1
@Configuration
@Profile("mariadb4j")
@EnableAutoConfiguration(exclude = {DataSourceAutoConfiguration.class})
public class EmbeddedMariaDBConfig {

    private static final Logger L = LoggerFactory.getLogger(EmbeddedMariaDBConfig.class);
    private static final String DB_SERVICE = "dbServiceBean";

    @Bean(name = {DB_SERVICE})
    MariaDB4jSpringService mariaDB4jSpringService() {
        L.info("Initializing MariaDB4j service");
        return new MariaDB4jSpringService();
    }

    @Bean(name = "adminDataSource")
    @Primary
    @DependsOn(DB_SERVICE)
    DataSource dataSource(MariaDB4jSpringService mdb, DataSourceProperties dataSourceProperties) throws ManagedProcessException {
        String dbName = dataSourceProperties.getName();
        L.debug("Embedded MariaDB datasource properties from spring: [{}]", dataSourceProperties);
        mdb.getDB().createDB(dbName);
        if (L.isDebugEnabled()) {
            DBConfigurationBuilder ecfg = mdb.getConfiguration();
            L.debug("JDBC URL for embedded MariaDB as reported by driver: [{}]", ecfg.getURL(dbName));
            L.debug("JDBC URL from spring config: [{}]", dataSourceProperties.getUrl());
            L.debug("JDBC Username: [{}]", dataSourceProperties.getUsername());
            L.debug("JDBC Password: [{}]", dataSourceProperties.getPassword());
        }
        return DataSourceBuilder
                .create()
                .username(dataSourceProperties.getUsername())
                .password(dataSourceProperties.getPassword())
                .url(dataSourceProperties.getUrl())
                .driverClassName(dataSourceProperties.getDriverClassName())
                .build();
    }
}
Yaml Configuration
spring:
  profiles: mariadb4j
  datasource:
    username: root
    password: password
    driver-class-name: com.mysql.jdbc.Driver
    url: jdbc:mysql://localhost:3306/t1
mariaDB4j:
  dataDir: /tmp/mariadb
  port: 3900
DataSource #2
@Configuration
public class ExoDBConfig {

    @Bean(name = "user")
    public DataSource dataSource() {
        DriverManagerDataSource dataSource = new DriverManagerDataSource();
        dataSource.setDriverClassName("com.mysql.jdbc.Driver");
        dataSource.setUrl("jdbc:mysql://localhost:3306/t2");
        dataSource.setUsername("root");
        dataSource.setPassword("password");
        return dataSource;
    }

    @Bean
    public JdbcTemplate jdbcTemplate(@Qualifier("user") DataSource dataSource) {
        return new JdbcTemplate(dataSource);
    }
}
When I use only the first DataSource everything works fine and the tables are created in the t1 database, but as soon as I add the second DataSource everything points to it and all the tables are created in the t2 database.

Just create the DataSource object inside the JdbcTemplate bean itself; that works for me. You can also create a separate properties file for the t2 database.
@Bean
public JdbcTemplate jdbcTemplate() {
    DriverManagerDataSource dataSource = new DriverManagerDataSource();
    dataSource.setDriverClassName("com.mysql.jdbc.Driver");
    dataSource.setUrl("jdbc:mysql://localhost:3306/t2");
    dataSource.setUsername("user");
    dataSource.setPassword("password");
    return new JdbcTemplate(dataSource);
}
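As a sketch of the "separate properties file" idea: the t2 settings could be bound with Spring Boot's DataSourceProperties instead of being hardcoded. The t2.datasource prefix and bean method names below are illustrative assumptions, not from the original post:
@Configuration
public class ExoDBConfig {

    // Binds the hypothetical "t2.datasource" properties (url, username, password, driver-class-name)
    @Bean
    @ConfigurationProperties("t2.datasource")
    public DataSourceProperties t2DataSourceProperties() {
        return new DataSourceProperties();
    }

    // Builds the second DataSource from the bound properties instead of hardcoded values
    @Bean(name = "user")
    public DataSource t2DataSource() {
        return t2DataSourceProperties().initializeDataSourceBuilder().build();
    }

    @Bean
    public JdbcTemplate jdbcTemplate(@Qualifier("user") DataSource dataSource) {
        return new JdbcTemplate(dataSource);
    }
}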

Related

One database overrides the other when @Primary is defined on one bean, and vice versa

I have 2 JDBC DataSources defined in a Spring Boot application and used in a Spring Batch job. However, after autowiring the datasources, only one gets used: the one annotated @Primary. If I place the annotation on the other JDBC datasource, that one gets used instead. In a nutshell, only one of the JDBC datasources ever gets used. I use Lombok in some places but I'm unsure whether that plays a part.
Here are the datasources:
application.yml
symphony:
  datasource:
    driver-class-name: oracle.jdbc.OracleDriver
    url: ...
    type: com.zaxxer.hikari.HikariDataSource
    username: <USR>
    password: <PWD>
    jndi-name: false
repo:
  datasource:
    driver-class-name: org.h2.Driver
    url: jdbc:h2:mem:db;DB_CLOSE_DELAY=-1
    username: sa
    password: sa
    jndi-name: false
Here is the first datasource:
@Configuration
public class RepoDbConfig {

    @Bean
    @ConfigurationProperties("repo.datasource")
    public DataSourceProperties repoDataProperties() {
        return new DataSourceProperties();
    }

    @Bean(name = "repoDataSource")
    public DataSource dataSourcerepo() {
        DataSource dataSource = repoDataProperties().initializeDataSourceBuilder().type(BasicDataSource.class)
                .build();
        return dataSource;
    }

    @Bean(name = "repoJdbcTemplate")
    public JdbcTemplate repoJdbcTemplate(DataSource repoDataSource) {
        return new JdbcTemplate(repoDataSource);
    }
}
Here is the second datasource:
@Configuration
public class SymphonyDbConfig {

    @Primary
    @Bean
    @ConfigurationProperties("symphony.datasource")
    public DataSourceProperties symphonyDataSourceProperties() {
        return new DataSourceProperties();
    }

    @Primary
    @Bean(name = "symphonyDataSource")
    public DataSource dataSourcesymphony() {
        HikariDataSource dataSource = symphonyDataSourceProperties().initializeDataSourceBuilder().type(HikariDataSource.class)
                .build();
        return dataSource;
    }

    @Primary
    @Bean(name = "symphonyJdbcTemplate")
    public JdbcTemplate symphonyJdbcTemplate(DataSource symphonyDataSource) {
        return new JdbcTemplate(symphonyDataSource);
    }
}
The JobRepository beans are configured like this:
@Configuration
@RequiredArgsConstructor
public class JobRepositoryConfig {

    final @Qualifier("repoDataSource")
    DataSource repoDataSource;

    @Bean("repoTransactionManager")
    AbstractPlatformTransactionManager repoTransactionManager() {
        return new ResourcelessTransactionManager();
    }

    @Bean("repoJobRepository")
    public JobRepository repoJobRepository(DataSource repoDataSource) throws Exception {
        JobRepositoryFactoryBean jobRepositoryFactoryBean = new JobRepositoryFactoryBean();
        jobRepositoryFactoryBean.setDataSource(repoDataSource);
        jobRepositoryFactoryBean.setTransactionManager(repoTransactionManager());
        jobRepositoryFactoryBean.setDatabaseType(DatabaseType.H2.getProductName());
        return jobRepositoryFactoryBean.getObject();
    }

    @Bean("repoAppJobLauncher")
    public JobLauncher careLocationAppJobLauncher(JobRepository repoJobRepository) {
        SimpleJobLauncher simpleJobLauncher = new SimpleJobLauncher();
        simpleJobLauncher.setJobRepository(repoJobRepository);
        return simpleJobLauncher;
    }
}
Finally, the Batch Job beans used for the job are configured here. The only part not shown is the launching of the job; all the required beans are shown:
@Configuration
@EnableBatchProcessing
@EnableScheduling
@RequiredArgsConstructor
@Slf4j
public class CellBatchConfig {

    private final JobBuilderFactory jobBuilderFactory;

    @Qualifier("repoAppJobLauncher")
    private final JobLauncher repoAppJobLauncher;

    private final StepBuilderFactory stepBuilderFactory;

    @Value("${chunk-size}")
    private int chunkSize;

    @Qualifier("symphonyDataSource")
    final DataSource symphonyDataSource;

    @Qualifier("repoDataSource")
    final DataSource repoDataSource;

    @Bean
    public JdbcPagingItemReader<CenterDto> cellItemReader(PagingQueryProvider pagingQueryProvider) {
        return new JdbcPagingItemReaderBuilder<CenterDto>()
                .name("cellItemReader")
                .dataSource(symphonyDataSource)
                .queryProvider(pagingQueryProvider)
                .pageSize(chunkSize)
                .rowMapper(new CellRowMapper())
                .build();
    }

    @Bean
    public PagingQueryProvider pagingQueryProvider() {
        OraclePagingQueryProvider pagingQueryProvider = new OraclePagingQueryProvider();
        final Map<String, Order> sortKeys = new HashMap<>();
        sortKeys.put("ID", Order.ASCENDING);
        pagingQueryProvider.setSortKeys(sortKeys);
        pagingQueryProvider.setSelectClause(" ID, CELL_NO, MAT_VO ");
        pagingQueryProvider.setFromClause(" from pvc.cells");
        return pagingQueryProvider;
    }

    .......
}
The error results from only one of the datasources being used: the Oracle datasource ends up being used to query the Spring Batch job repository, which fails. Here is the key portion of the stack trace; it is trying to use the Oracle datasource to look up the JobRepository tables and fails as a result:
Caused by: org.springframework.jdbc.BadSqlGrammarException:
PreparedStatementCallback; bad SQL grammar [SELECT JOB_INSTANCE_ID, JOB_NAME from
BATCH_JOB_INSTANCE
where JOB_NAME = ? and JOB_KEY = ?]; nested exception is
java.sql.SQLSyntaxErrorException: ORA-00942: table or view does not exist
In the class JobRepositoryConfig, in the bean:
#Bean("symphonyJobRepository")
public JobRepository symphonyJobRepository(DataSource repoDataSource) throws Exception {
JobRepositoryFactoryBean jobRepositoryFactoryBean = new JobRepositoryFactoryBean();
jobRepositoryFactoryBean.setDataSource(repoDataSource);
jobRepositoryFactoryBean.setTransactionManager(repoTransactionManager());
jobRepositoryFactoryBean.setDatabaseType(DatabaseType.H2.getProductName());
return jobRepositoryFactoryBean.getObject();
}
you didn't use the field declared with the qualifier:
final @Qualifier("repoDataSource") DataSource repoDataSource;
So Spring injects the DataSource bean annotated with @Primary instead.
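A minimal sketch of how that bean could take the qualifier at the injection point instead, assuming the rest of JobRepositoryConfig stays as posted above:
@Bean("repoJobRepository")
public JobRepository repoJobRepository(@Qualifier("repoDataSource") DataSource repoDataSource) throws Exception {
    JobRepositoryFactoryBean jobRepositoryFactoryBean = new JobRepositoryFactoryBean();
    // With the qualifier on the parameter, Spring injects the H2 "repoDataSource"
    // instead of the @Primary Oracle datasource
    jobRepositoryFactoryBean.setDataSource(repoDataSource);
    jobRepositoryFactoryBean.setTransactionManager(repoTransactionManager());
    jobRepositoryFactoryBean.setDatabaseType(DatabaseType.H2.getProductName());
    return jobRepositoryFactoryBean.getObject();
}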
I fixed it by making one bean the primary and also adding the qualifiers on the specific beans, which had been missing; an omission on my part. For example, here I added the @Qualifier:
@Primary
@Bean(name = "symphonyDataSource")
@Qualifier("symphonyDataSource") // This was missing
public DataSource dataSourcesymphony() {
    HikariDataSource dataSource = symphonyDataSourceProperties().initializeDataSourceBuilder().type(HikariDataSource.class)
            .build();
    return dataSource;
}

Spring Boot Multitenancy - Hibernate - Use ddl-auto to update all schemas when entity structure changes

I am new to Spring Boot and trying to implement a multi-tenancy architecture using Spring Boot 2, Hibernate and Flyway. I followed the tutorial https://reflectoring.io/flyway-spring-boot-multitenancy/ to understand the concepts and was able to implement the architecture as described.
However, if I add a new field to an entity class, everything breaks because Hibernate does not create the new column in the tenant databases. From reading the theory and Stack Overflow questions, I understand that Flyway is meant to solve this problem, but I am not able to make it work.
Can somebody tell me where I am going wrong? My requirement is: when I add a new field to an entity class, all tables in all tenant databases should get updated with that field. Below is the code.
Application Properties
spring:
  jpa:
    database-platform: org.hibernate.dialect.MySQL5InnoDBDialect
  flyway:
    enabled: false
tenants:
  datasources:
    vw:
      jdbcUrl: jdbc:mysql://localhost:3306/vw
      driverClassName: com.mysql.jdbc.Driver
      username: vikky
      password: Test#123
    bmw:
      jdbcUrl: jdbc:mysql://localhost:3306/bmw
      driverClassName: com.mysql.jdbc.Driver
      username: vikky
      password: Test#123
Datasource Configuration
@Configuration
public class DataSourceConfiguration {

    private final DataSourceProperties dataSourceProperties;

    public DataSourceConfiguration(DataSourceProperties dataSourceProperties) {
        this.dataSourceProperties = dataSourceProperties;
    }

    @Bean
    public DataSource dataSource() {
        TenantRoutingDataSource customDataSource = new TenantRoutingDataSource();
        customDataSource.setTargetDataSources(dataSourceProperties.getDatasources());
        return customDataSource;
    }

    @PostConstruct
    public void migrate() {
        dataSourceProperties
                .getDatasources()
                .values()
                .stream()
                .map(dataSource -> (DataSource) dataSource)
                .forEach(this::migrate);
    }

    private void migrate(DataSource dataSource) {
        Flyway flyway = Flyway.configure().dataSource(dataSource).load();
        flyway.migrate();
    }
}
DataSource properties
@Component
@ConfigurationProperties(prefix = "tenants")
public class DataSourceProperties {

    private Map<Object, Object> datasources = new LinkedHashMap<>();

    public Map<Object, Object> getDatasources() {
        return datasources;
    }

    public void setDatasources(Map<String, Map<String, String>> datasources) {
        datasources.forEach((key, value) -> this.datasources.put(key, convert(value)));
    }

    public DataSource convert(Map<String, String> source) {
        return DataSourceBuilder.create()
                .url(source.get("jdbcUrl"))
                .driverClassName(source.get("driverClassName"))
                .username(source.get("username"))
                .password(source.get("password"))
                .build();
    }
}
Tenant Routing Data Source
public class TenantRoutingDataSource extends AbstractRoutingDataSource {

    @Override
    protected Object determineCurrentLookupKey() {
        return ThreadTenantStorage.getTenantId();
    }
}
Header Interceptor
@Component
public class HeaderTenantInterceptor implements WebRequestInterceptor {

    public static final String TENANT_HEADER = "X-tenant";

    @Override
    public void preHandle(WebRequest request) throws Exception {
        ThreadTenantStorage.setTenantId(request.getHeader(TENANT_HEADER));
    }

    @Override
    public void postHandle(WebRequest request, ModelMap model) throws Exception {
        ThreadTenantStorage.clear();
    }

    @Override
    public void afterCompletion(WebRequest request, Exception ex) throws Exception {
    }
}
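ThreadTenantStorage is referenced above but not included in the post. A minimal sketch, assuming it is the simple ThreadLocal holder used in the linked reflectoring.io tutorial:
public class ThreadTenantStorage {

    // Holds the tenant id for the current request thread
    private static final ThreadLocal<String> currentTenant = new ThreadLocal<>();

    public static void setTenantId(String tenantId) {
        currentTenant.set(tenantId);
    }

    public static String getTenantId() {
        return currentTenant.get();
    }

    public static void clear() {
        currentTenant.remove();
    }
}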
There are other classes as well, such as the web configuration and controllers, but I don't think they need to be posted here.
After a lot of research, I understood that Flyway is only required for production, where we do not want to update table definitions via ddl-auto. Since that was not my case, I added the configuration below to update all schemas according to the entity structure:
@Configuration
public class AutoDDLConfig {

    @Value("${spring.datasource.username}")
    private String username;

    @Value("${spring.datasource.password}")
    private String password;

    @Value("${schemas.list}")
    private String schemasList;

    @Bean
    public void bb() {
        if (StringUtils.isBlank(schemasList)) {
            return;
        }
        String[] tenants = schemasList.split(",");
        for (String tenant : tenants) {
            tenant = tenant.trim();
            DriverManagerDataSource dataSource = new DriverManagerDataSource();
            dataSource.setDriverClassName("com.mysql.jdbc.Driver"); // Change here to the MySQL driver
            dataSource.setSchema(tenant);
            dataSource.setUrl("jdbc:mysql://localhost/" + tenant
                    + "?autoReconnect=true&characterEncoding=utf8&useSSL=false&useTimezone=true&serverTimezone=Asia/Kolkata&useLegacyDatetimeCode=false&allowPublicKeyRetrieval=true");
            dataSource.setUsername(username);
            dataSource.setPassword(password);
            LocalContainerEntityManagerFactoryBean emfBean = new LocalContainerEntityManagerFactoryBean();
            emfBean.setDataSource(dataSource);
            emfBean.setPackagesToScan("com"); // Mention the JPA entity path here, or leave it to scan all packages
            emfBean.setJpaVendorAdapter(new HibernateJpaVendorAdapter());
            emfBean.setPersistenceProviderClass(HibernatePersistenceProvider.class);
            Map<String, Object> properties = new HashMap<>();
            properties.put("hibernate.hbm2ddl.auto", "update");
            properties.put("hibernate.default_schema", tenant);
            emfBean.setJpaPropertyMap(properties);
            emfBean.setPersistenceUnitName(dataSource.toString());
            emfBean.afterPropertiesSet();
        }
    }
}

Spring boot multiple data sources - Connectivity Error Handling

I'm working with multiple data sources (Oracle and SQL Server) in a Spring Boot REST application. The application has more than 25 endpoints for processing client requests, but when one of the databases (Oracle or SQL Server) is unavailable for some reason, the application fails to start.
I've looked at a couple of examples on Google and Stack Overflow, but they are different from what I'm looking for...
package com.foobar;

@Configuration
@EnableTransactionManagement
@EnableJpaRepositories(
        entityManagerFactoryRef = "entityManagerFactory",
        basePackages = { "com.foobar.foo.repo" }
)
public class FooDbConfig {

    @Primary
    @Bean(name = "dataSource")
    @ConfigurationProperties(prefix = "spring.datasource")
    public DataSource dataSource() {
        return DataSourceBuilder.create().build();
    }

    @Primary
    @Bean(name = "entityManagerFactory")
    public LocalContainerEntityManagerFactoryBean entityManagerFactory(
            EntityManagerFactoryBuilder builder,
            @Qualifier("dataSource") DataSource dataSource) {
        return builder
                .dataSource(dataSource)
                .packages("com.foobar.foo.domain")
                .persistenceUnit("foo")
                .build();
    }

    @Primary
    @Bean(name = "transactionManager")
    public PlatformTransactionManager transactionManager(
            @Qualifier("entityManagerFactory") EntityManagerFactory entityManagerFactory) {
        return new JpaTransactionManager(entityManagerFactory);
    }
}
The second data source has the same configuration but different properties.
I'm using the example below as a base code reference for implementing my requirements:
Example link
I'm looking for a solution where, if only one of the N database servers is available, the application still starts and serves the requests it can, and whenever the second DB server becomes available it connects automatically and processes the other endpoints' requests.
I recently created a solution for multitenancy with datasources using Liquibase; if you don't use Liquibase, just remove that part and it works too!
Example of application.yml
spring:
  dataSources:
    - tenantId: db1
      url: jdbc:postgresql://localhost:5432/db1
      username: postgres
      password: 123456
      driver-class-name: org.postgresql.Driver
      liquibase:
        enabled: true
        default-schema: public
        change-log: classpath:db/master/changelog/db.changelog-master.yaml
    - tenantId: db2
      url: jdbc:postgresql://localhost:5432/db2
      username: postgres
      password: 123456
      driver-class-name: org.postgresql.Driver
    - tenantId: db3
      url: jdbc:postgresql://localhost:5432/db3
      username: postgres
      password: 123456
      driver-class-name: org.postgresql.Driver
DataSourceConfiguration
@Configuration
@EnableTransactionManagement
@EntityScan(basePackages = { "br.com.dijalmasilva.springbootmultitenancyliquibase" })
@EnableJpaRepositories(basePackages = { "br.com.dijalmasilva.springbootmultitenancyliquibase" })
public class DataSourceConfiguration {

    @Bean(name = "dataSources")
    @Primary
    public Map<Object, Object> getDataSources(DataSourceProperties dataSourceProperties) {
        return dataSourceProperties.getDataSources().stream().map(dataSourceProperty -> {
            DataSource dataSource = DataSourceBuilder.create()
                    .url(dataSourceProperty.getUrl())
                    .username(dataSourceProperty.getUsername())
                    .password(dataSourceProperty.getPassword())
                    .driverClassName(dataSourceProperty.getDriverClassName())
                    .build();
            return new TenantIdDataSource(dataSourceProperty.getTenantId(), dataSource);
        }).collect(Collectors.toMap(TenantIdDataSource::getTenantId, TenantIdDataSource::getDataSource));
    }

    @Bean(name = "tenantRoutingDataSource")
    @DependsOn("dataSources")
    public DataSource dataSource(Map<Object, Object> dataSources) {
        AbstractRoutingDataSource tenantRoutingDataSource = new TenantRoutingDataSource();
        tenantRoutingDataSource.setTargetDataSources(dataSources);
        tenantRoutingDataSource.setDefaultTargetDataSource(dataSources.get("db1"));
        tenantRoutingDataSource.afterPropertiesSet();
        return tenantRoutingDataSource;
    }

    @Data
    @AllArgsConstructor
    private class TenantIdDataSource {
        private Object tenantId;
        private Object dataSource;
    }
}
TenantRoutingDataSource
public class TenantRoutingDataSource extends AbstractRoutingDataSource {

    @Override
    protected Object determineCurrentLookupKey() {
        return TenantContext.getCurrentTenant();
    }
}
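TenantContext supplies the lookup key here but is not shown in the answer. A minimal sketch, assuming the same ThreadLocal pattern the routing datasource expects:
public class TenantContext {

    // Lookup key returned by TenantRoutingDataSource.determineCurrentLookupKey()
    private static final ThreadLocal<String> currentTenant = new ThreadLocal<>();

    public static void setCurrentTenant(String tenantId) {
        currentTenant.set(tenantId);
    }

    public static String getCurrentTenant() {
        return currentTenant.get();
    }

    public static void clear() {
        currentTenant.remove();
    }
}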
DataSourceProperties
@Data
@Component
@ConfigurationProperties(prefix = "spring")
public class DataSourceProperties {

    private List<DataSourceProperty> dataSources = new ArrayList<>();
}
DataSourceProperty
@Data
public class DataSourceProperty {

    private String tenantId;
    private String url;
    private String username;
    private String password;
    private String driverClassName;
    private LiquibaseProperties liquibase;
}
See the complete code; maybe it will help you!
Link of project: https://github.com/dijalmasilva/spring-boot-multitenancy-datasource-liquibase

Spring data - If primary datasource fails the second does not come up

I am using Spring Boot and Spring Data, and I want to use a MySQL datasource primarily but fall back to an H2 datasource if the MySQL connection fails.
So far I switch by moving the @Primary annotation between the configurations, but if I put @Primary on the MySQL (main) data source and stop the MySQL server on my machine, the other bean does not come up... What do I need?
application.yml:
# Main properties
spring:
  application:
    name: app
  jpa:
    database: default
    show-sql: false
    hibernate:
      ddl-auto: update
    properties:
      hibernate:
        format_sql: false
        current_session_context_class: org.springframework.orm.hibernate5.SpringSessionContext
# Main database: MySQL
main.datasource:
  url: jdbc:mysql://localhost:3306/app?useSSL=false
  driver-class-name: com.mysql.jdbc.Driver
  username: sa
  password: sa
# Backup database: H2
backup.datasource:
  url: jdbc:h2:${project.directory}/app;DB_CLOSE_ON_EXIT=FALSE
  driver-class-name: org.h2.Driver
  username: sa
  password: sa
Main data source
@Configuration
@EnableTransactionManagement
@EnableJpaRepositories("org.app")
@EntityScan("org.app")
public class MainDataSourceConfig {

    @Primary
    @Bean(name = "mainDataSource")
    @ConfigurationProperties(prefix = "main.datasource")
    public DataSource mainDataSource() {
        return DataSourceBuilder.create().build();
    }
}
Backup data source:
@Configuration
@EnableTransactionManagement
@EnableJpaRepositories("org.app")
@EntityScan("org.app")
public class BackupDataSourceConfig {

    @Bean(name = "backupDataSource")
    @ConfigurationProperties(prefix = "backup.datasource")
    public DataSource backupDataSource() {
        return DataSourceBuilder.create().build();
    }
}
Thanks!
I figured out how to do it. Hope this helps someone:
@Configuration
@EnableTransactionManagement
@EnableJpaRepositories("org.app")
@EntityScan("org.app")
public class DataSourceConfig {

    private static final String USERNAME = "sa";
    private static final String PASSWORD = "sa";

    @Bean
    @Primary
    public DataSource dataSource() {
        DataSource dataSource;
        try {
            dataSource = getMainDataSource();
            dataSource.getConnection().isValid(500);
        } catch (Exception e) {
            log.error("Main database not valid.", e);
            dataSource = getBackupDataSource();
        }
        return dataSource;
    }

    private DataSource getMainDataSource() {
        return DataSourceBuilder.create()
                .driverClassName("com.mysql.jdbc.Driver")
                .username(USERNAME)
                .password(PASSWORD)
                .url("jdbc:mysql://localhost:3306/app?useSSL=false")
                .build();
    }

    private DataSource getBackupDataSource() {
        return DataSourceBuilder.create()
                .driverClassName("org.h2.Driver")
                .username(USERNAME)
                .password(PASSWORD)
                .url("jdbc:h2:/app;DB_CLOSE_ON_EXIT=FALSE")
                .build();
    }
}
Just one bean.

My Spring Database (H2) is not Persisting

This is my config file
@Configuration
@ComponentScan
public class Config {

    @Bean
    public DataSource datasource() {
        return new EmbeddedDatabaseBuilder()
                .setName("MyDB")
                .setType(EmbeddedDatabaseType.H2)
                .addScript("schema.sql")
                .build();
    }

    @Bean
    public JdbcOperations jdbcTemplate(DataSource ds) {
        return new JdbcTemplate(ds);
    }
}
After I run the program, I cannot find the "MyDB" database.
I know it is an in-memory database. How do I make it embedded so that the data persists after I close the program and I can find "MyDB" in the project folder?
@Bean
public DataSource h2DataSource() {
    DriverManagerDataSource dataSource = new DriverManagerDataSource();
    dataSource.setDriverClassName(driver);
    dataSource.setUrl(url);
    dataSource.setUsername(username);
    dataSource.setPassword(password);
    return dataSource;
}
driver: org.h2.Driver
url: jdbc:h2:file:${java.io.tmpdir}/database/db_name;AUTO_SERVER=TRUE
username: user
password: pass
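The driver, url, username and password fields used in the bean above are not declared in the answer. A minimal sketch of a complete configuration class, assuming those values are injected with @Value from the property keys listed above; the class name is illustrative:
@Configuration
public class PersistentH2Config {

    // Bound from the properties shown above (assumed to be top-level keys)
    @Value("${driver}")
    private String driver;

    @Value("${url}")
    private String url;

    @Value("${username}")
    private String username;

    @Value("${password}")
    private String password;

    // A file-based H2 URL (jdbc:h2:file:...) keeps the data on disk between runs,
    // unlike the in-memory database created by EmbeddedDatabaseBuilder
    @Bean
    public DataSource h2DataSource() {
        DriverManagerDataSource dataSource = new DriverManagerDataSource();
        dataSource.setDriverClassName(driver);
        dataSource.setUrl(url);
        dataSource.setUsername(username);
        dataSource.setPassword(password);
        return dataSource;
    }

    @Bean
    public JdbcOperations jdbcTemplate(DataSource ds) {
        return new JdbcTemplate(ds);
    }
}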
