I have a bit of a problem: I have several microservices, but one of them throws an exception that the others don't throw, while they work perfectly ...
[2020-09-28 16:55:38.304]|ERROR|TIBCO EMS Session Dispatcher (21297)|org.hibernate.engine.jdbc.spi.SqlExceptionHelper.logExceptions(142): --- ERROR: relation "computed.fluxeventlogging" does not exist
Position: 502
[2020-09-28 16:55:38.307]|INFO |TIBCO EMS Session Dispatcher (21297)|org.hibernate.event.internal.DefaultLoadEventListener.doOnLoad(116): --- HHH000327: Error performing load command
org.hibernate.exception.SQLGrammarException: could not extract ResultSet
at org.hibernate.exception.internal.SQLStateConversionDelegate.convert(SQLStateConversionDelegate.java:103) ~[hibernate-core-5.4.12.Final.jar:5.4.12.Final]
at org.hibernate.exception.internal.StandardSQLExceptionConverter.convert(StandardSQLExceptionConverter.java:42) ~[hibernate-core-5.4.12.Final.jar:5.4.12.Final]
I have the right permissions in the database.
My entity is located in a common project used by all the microservices, added as a Maven dependency.
@Entity
@Data
@Table(name = "fluxeventlogging", schema = "computed")
@IdClass(FluxEventLoggingIdEntity.class)
public class FluxEventLoggingEntity implements Serializable {

    @Id
    @Column(name = "fluxeventuuid", columnDefinition = "uuid")
    private UUID fluxEventUuid;

    @Column(name = "lastupdatedate")
    private Instant lastUpdateDate;

    @Column(name = "businessfluxtype")
    private String businessFluxType;

    @Column(name = "fluxprocessortype")
    private String fluxProcessorType;

    @Column(name = "valuewhichcauseupdate")
    private String valueWhichCauseUpdate;

    @Column(name = "oldvaluecause")
    private String oldValueCause;

    @Column(name = "newvaluecause")
    private String newValueCause;

    @Column(name = "currenteventlifestate")
    private String currentEventLifeState;

    @Column(name = "nexteventlifestate")
    private String nextEventLifeState;

    @Column(name = "generatederror", columnDefinition = "text")
    private String generatedError;
}
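The FluxEventLoggingIdEntity referenced by @IdClass is not shown above; for a single @Id field it would look roughly like the following sketch (the field name and type are copied from the entity, everything else is assumed):

import java.io.Serializable;
import java.util.Objects;
import java.util.UUID;

// Sketch of the id class the entity references; it is not shown in the original post.
// JPA requires the field name/type to match the entity's @Id field and an equals()/hashCode() pair.
public class FluxEventLoggingIdEntity implements Serializable {

    private UUID fluxEventUuid;

    public FluxEventLoggingIdEntity() {
    }

    @Override
    public boolean equals(Object o) {
        return o instanceof FluxEventLoggingIdEntity
                && Objects.equals(fluxEventUuid, ((FluxEventLoggingIdEntity) o).fluxEventUuid);
    }

    @Override
    public int hashCode() {
        return Objects.hash(fluxEventUuid);
    }
}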
However, I pinned the Hibernate version in the pom to:
<dependency>
<groupId>org.hibernate</groupId>
<artifactId>hibernate-core</artifactId>
<version>5.4.12.Final</version>
</dependency>
because this project is the only one that uses a JDBC data source for JPA, configured like this:
custom.datasource.url: jdbc:postgresql://localhost:5432/postgres
custom.datasource.database: postgres
custom.datasource.driver: pool
custom.datasource.protocol: postgres
custom.datasource.localhost: localhost
custom.datasource.port: 5432
custom.datasource.password: postgres
custom.datasource.username: postgres
custom.datasource.driverclassname: org.postgresql.Driver
spring:
  jpa:
    properties:
      hibernate:
        dialect: org.hibernate.dialect.PostgreSQL82Dialect
    hibernate:
      ddl-auto: create
    show-sql: false
    database-platform: org.hibernate.dialect.PostgreSQLDialect
  datasource:
    driver-class-name: org.postgresql.Driver
    url: jdbc:postgresql://localhost:5432/postgres
    username: postgres
    password: postgres
    schema: classpath:/schema.sql
    initialization-mode: always
  r2dbc:
    url: r2dbc:postgresql://postgres:postgres@localhost:5432/postgres
    pool:
      enabled: true
      initial-size: 00
      max-size: 500
      max-idle-time: 30m
      validation-query: SELECT 1
And another data source for R2DBC (reactive Postgres):
@Configuration
@Slf4j
public class DatabaseConfiguration {

    private static Map<String, Object> PROPERTIES;

    @Autowired
    DataSource dataSource;

    @Bean
    public ConnectionFactory r2dbcConnectionFactory() {
        if (PROPERTIES == null) {
            Yaml yaml = new Yaml();
            InputStream inputStream = this.getClass()
                    .getClassLoader()
                    .getResourceAsStream("application.yml");
            PROPERTIES = yaml.load(inputStream);
        }
        log.info("Init r2dbc with host: {}", PROPERTIES.get("custom.datasource.localhost").toString());
        log.info("Init r2dbc with port: {}", PROPERTIES.get("custom.datasource.port").toString());
        log.info("Init r2dbc with database: {}", PROPERTIES.get("custom.datasource.database").toString());
        log.info("Init r2dbc with username: {}", PROPERTIES.get("custom.datasource.username").toString());
        log.info("Init r2dbc with driver: {}", PROPERTIES.get("custom.datasource.driver").toString());
        log.info("Init r2dbc with protocol: {}", PROPERTIES.get("custom.datasource.protocol").toString());
        ConnectionFactoryOptions options = ConnectionFactoryOptions.builder()
                .option(ConnectionFactoryOptions.DRIVER, PROPERTIES.get("custom.datasource.driver").toString())
                .option(ConnectionFactoryOptions.PROTOCOL, PROPERTIES.get("custom.datasource.protocol").toString())
                .option(ConnectionFactoryOptions.USER, PROPERTIES.get("custom.datasource.username").toString())
                .option(ConnectionFactoryOptions.PASSWORD, PROPERTIES.get("custom.datasource.password").toString())
                .option(ConnectionFactoryOptions.HOST, PROPERTIES.get("custom.datasource.localhost").toString())
                .option(ConnectionFactoryOptions.PORT, Integer.parseInt(PROPERTIES.get("custom.datasource.port").toString()))
                .option(ConnectionFactoryOptions.DATABASE, PROPERTIES.get("custom.datasource.database").toString())
                .build();
        return ConnectionFactories.get(options);
        //return ConnectionFactories.get(ConnectionFactoryOptions.parse(PROPERTIES.get("custom.r2dbc.url").toString()));
    }

    @Bean
    public DataSource getDataSource() {
        if (PROPERTIES == null) {
            Yaml yaml = new Yaml();
            InputStream inputStream = this.getClass()
                    .getClassLoader()
                    .getResourceAsStream("application.yml");
            PROPERTIES = yaml.load(inputStream);
        }
        log.info("Init datasource with url: {}", PROPERTIES.get("custom.datasource.url").toString());
        log.info("Init datasource with username: {}", PROPERTIES.get("custom.datasource.username").toString());
        log.info("Init datasource with driver: {}", PROPERTIES.get("custom.datasource.driverclassname").toString());
        DataSourceBuilder dataSourceBuilder = DataSourceBuilder.create();
        dataSourceBuilder.url(PROPERTIES.get("custom.datasource.url").toString());
        dataSourceBuilder.username(PROPERTIES.get("custom.datasource.username").toString());
        dataSourceBuilder.password(PROPERTIES.get("custom.datasource.password").toString());
        dataSourceBuilder.driverClassName(PROPERTIES.get("custom.datasource.driverclassname").toString());
        return dataSourceBuilder.build();
    }

    @Bean
    public LocalContainerEntityManagerFactoryBean entityManagerFactory() {
        // JpaVendorAdapter can be autowired as well if it's configured in application properties.
        HibernateJpaVendorAdapter vendorAdapter = new HibernateJpaVendorAdapter();
        vendorAdapter.setGenerateDdl(false);
        LocalContainerEntityManagerFactoryBean factory = new LocalContainerEntityManagerFactoryBean();
        factory.setJpaVendorAdapter(vendorAdapter);
        // Add packages to scan for entities.
        factory.setPackagesToScan("fr.microservice2.database", "fr.microservice.common");
        factory.setDataSource(dataSource);
        return factory;
    }

    @Bean
    public PlatformTransactionManager transactionManager(EntityManagerFactory entityManagerFactory) {
        JpaTransactionManager txManager = new JpaTransactionManager();
        txManager.setEntityManagerFactory(entityManagerFactory);
        return txManager;
    }
}
I don't know why only this microservice can't find my table computed.fluxeventlogging ...
while I have plenty of other tables in this schema that don't raise the problem.
Does anyone have an idea, please?
Thank you and best regards
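Worth noting about the setup above: because the LocalContainerEntityManagerFactoryBean in DatabaseConfiguration is created by hand, Spring Boot does not apply the spring.jpa.properties block from application.yml to it, so the dialect and DDL settings would have to be set on the factory explicitly. A minimal sketch of what that could look like, with the values copied from the YAML above (whether "create" is actually wanted here is an assumption):

// Sketch only: a variant of the entityManagerFactory() bean from DatabaseConfiguration above
// that passes the Hibernate settings explicitly, since a hand-built factory does not pick up
// the spring.jpa.* properties on its own.
@Bean
public LocalContainerEntityManagerFactoryBean entityManagerFactory() {
    HibernateJpaVendorAdapter vendorAdapter = new HibernateJpaVendorAdapter();
    LocalContainerEntityManagerFactoryBean factory = new LocalContainerEntityManagerFactoryBean();
    factory.setJpaVendorAdapter(vendorAdapter);
    factory.setPackagesToScan("fr.microservice2.database", "fr.microservice.common");
    factory.setDataSource(dataSource);

    Properties jpaProperties = new Properties();
    jpaProperties.put("hibernate.dialect", "org.hibernate.dialect.PostgreSQL82Dialect");
    // Mirrors spring.jpa.hibernate.ddl-auto from the YAML; "none" may be safer if schema.sql owns the schema.
    jpaProperties.put("hibernate.hbm2ddl.auto", "create");
    factory.setJpaProperties(jpaProperties);
    return factory;
}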
Related
I have used NamedParameterJdbcTemplate and Spring Data JPA together in my project, for which I have two configuration classes preparing the data source for a DB2 database, but when I run the project I am facing a
"Name cannot be null" error while creating the EntityManagerFactory. I am not able to identify the root cause of this problem; can somebody please point out the problem and the mistakes I am making?
I tried put("hibernate.ddl-auto", "none"); as well, but I get the same error.
Error
Caused by: javax.persistence.PersistenceException: [PersistenceUnit: read] Unable to build Hibernate SessionFactory
at org.hibernate.jpa.boot.internal.EntityManagerFactoryBuilderImpl.persistenceException(EntityManagerFactoryBuilderImpl.java:954)
at org.hibernate.jpa.boot.internal.EntityManagerFactoryBuilderImpl.build(EntityManagerFactoryBuilderImpl.java:882)
at org.hibernate.jpa.HibernatePersistenceProvider.createContainerEntityManagerFactory(HibernatePersistenceProvider.java:135)
at org.springframework.orm.jpa.LocalContainerEntityManagerFactoryBean.createNativeEntityManagerFactory(LocalContainerEntityManagerFactoryBean.java:353)
at org.springframework.orm.jpa.AbstractEntityManagerFactoryBean.buildNativeEntityManagerFactory(AbstractEntityManagerFactoryBean.java:370)
at org.springframework.orm.jpa.AbstractEntityManagerFactoryBean.afterPropertiesSet(AbstractEntityManagerFactoryBean.java:359)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.invokeInitMethods(AbstractAutowireCapableBeanFactory.java:1687)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.initializeBean(AbstractAutowireCapableBeanFactory.java:1624)
... 16 common frames omitted
Caused by: java.lang.IllegalArgumentException: Name cannot be null
at org.hibernate.boot.model.relational.QualifiedNameParser$NameParts.<init>(QualifiedNameParser.java:34)
at org.hibernate.boot.model.relational.QualifiedNameImpl.<init>(QualifiedNameImpl.java:24)
at org.hibernate.boot.model.relational.QualifiedSequenceName.<init>(QualifiedSequenceName.java:16)
at org.hibernate.tool.schema.extract.internal.SequenceInformationExtractorLegacyImpl.extractMetadata(SequenceInformationExtractorLegacyImpl.java:51)
at org.hibernate.tool.schema.extract.internal.DatabaseInformationImpl.initializeSequences(DatabaseInformationImpl.java:64)
at org.hibernate.tool.schema.extract.internal.DatabaseInformationImpl.<init>(DatabaseInformationImpl.java:60)
at org.hibernate.tool.hbm2ddl.SchemaUpdate.execute(SchemaUpdate.java:123)
at org.hibernate.tool.hbm2ddl.SchemaUpdate.execute(SchemaUpdate.java:101)
at org.hibernate.internal.SessionFactoryImpl.<init>(SessionFactoryImpl.java:472)
at org.hibernate.boot.internal.SessionFactoryBuilderImpl.build(SessionFactoryBuilderImpl.java:444)
at org.hibernate.jpa.boot.internal.EntityManagerFactoryBuilderImpl.build(EntityManagerFactoryBuilderImpl.java:879)
... 22 common frames omitted
Process finished with exit code
Configuration Classes
@Setter
@Getter
@Configuration
@PropertySource("classpath:application.yml")
@ConfigurationProperties("app.datasource.db2.credentials.hikari")
public class HikariReadProperties {
    private String poolName;
    private int minimumIdle;
    private int maximumPoolSize;
    private int idleTimeout;
    private String connectionTestQuery;
}

public class HikariConfigRead extends HikariConfig {
    protected final HikariReadProperties hikariReadProperties;
    protected final String PERSISTENCE_UNIT_NAME = "read";

    protected HikariConfigRead(HikariReadProperties hikariReadProperties) {
        this.hikariReadProperties = hikariReadProperties;
        setPoolName(this.hikariReadProperties.getPoolName());
        setMinimumIdle(this.hikariReadProperties.getMinimumIdle());
        setMaximumPoolSize(this.hikariReadProperties.getMaximumPoolSize());
        setIdleTimeout(this.hikariReadProperties.getIdleTimeout());
        setConnectionTestQuery(this.hikariReadProperties.getConnectionTestQuery());
    }
}
@Configuration
@ConfigurationProperties("app.datasource.db2.credentials")
@EnableTransactionManagement
@EnableJpaRepositories(entityManagerFactoryRef = "entityManagerFactory",
        transactionManagerRef = "transactionManagerRead", basePackages = {"com.testing.db2migration.repository"})
public class JpaConfiguration extends HikariConfigRead {

    protected JpaConfiguration(HikariReadProperties hikariReadProperties) {
        super(hikariReadProperties);
    }

    @Bean(name = "db2DataSource")
    public HikariDataSource hikariDataSource() {
        return new HikariDataSource(this);
    }

    @Bean(name = "entityManagerFactory")
    public LocalContainerEntityManagerFactoryBean entityManagerFactoryWrite(
            final @Qualifier("db2DataSource") HikariDataSource dataSourceWrite) {
        return new LocalContainerEntityManagerFactoryBean() {{
            setDataSource(dataSourceWrite);
            setPersistenceProviderClass(HibernatePersistenceProvider.class);
            setPersistenceUnitName(PERSISTENCE_UNIT_NAME);
            setPackagesToScan("com.testing.db2migration.model");
            Properties JPA_READ_PROPERTIES = new Properties() {{
                put("hibernate.dialect", "org.hibernate.dialect.DB2Dialect");
                put("hibernate.hbm2ddl.auto", "update");
                put("hibernate.ddl-auto", "update");
                put("show-sql", "true");
            }};
            setJpaProperties(JPA_READ_PROPERTIES);
        }};
    }

    @Bean(name = "transactionManagerRead")
    public PlatformTransactionManager transactionManagerWrite(
            @Qualifier("entityManagerFactory") EntityManagerFactory entityManagerFactoryWrite) {
        return new JpaTransactionManager(entityManagerFactoryWrite);
    }
}
public class BaseConfig {
    protected Environment environment;

    protected DataSource getHikariDataSource() {
        HikariDataSource ds = new HikariDataSource();
        String driverClassName = environment.getProperty(Constants.ENVIRONMENT_ROOT + "db2-driverClassName");
        String url = environment.getProperty(Constants.ENVIRONMENT_ROOT + "db2-url");
        String userName = environment.getProperty(Constants.ENVIRONMENT_ROOT + "db2-username");
        String password = environment.getProperty(Constants.ENVIRONMENT_ROOT + "db2-password");
        String dbTestQuery = environment.getProperty(Constants.ENVIRONMENT_ROOT + "db2-dbTestQuery");
        ds.setJdbcUrl(url);
        ds.setUsername(userName);
        ds.setPassword(password);
        ds.setDriverClassName(driverClassName);
        ds.setConnectionTestQuery(dbTestQuery);
        return ds;
    }
}
@Configuration
@ComponentScan("com.testing")
public class LocalConfig extends BaseConfig implements EnvironmentAware {

    @Override
    public void setEnvironment(Environment environment) {
        this.environment = environment;
    }

    @Bean(name = "dataSource")
    @Primary
    public DataSource dataSource() throws SQLException {
        return this.getHikariDataSource();
    }

    @Bean("jdbcTemplate")
    @Autowired
    public JdbcTemplate jdbcTemplate(DataSource dataSource) {
        return new JdbcTemplate(dataSource);
    }

    ..................
    // Normal bean declarations
}
application.yaml
vcap:
  services:
    store-service:
      credentials:
        env: QA
        db2-driverClassName: com.ibm.db2.jcc.DB2Driver
        db2-url: jdbc:db2://localhost:/APP1
        db2-schema: YQ1MM
        db2-username: ******
        db2-password: *****
        db2-dbTestQuery: SELECT CURRENT SQLID FROM SYSIBM.SYSDUMMY1
app:
  datasource:
    db2:
      credentials:
        env: QA
        driver-class-name: com.ibm.db2.jcc.DB2Driver
        jdbc-url: jdbc:db2://localhost/APP1
        hibernate:
          default_schema: YQ1MM
        username: *****
        password: *****
        hikari:
          maximum-pool-size: 10
          connectionTestQuery: SELECT CURRENT SQLID FROM SYSIBM.SYSDUMMY1
Model
@Entity
@Table(name = "epr_str")
public class EprStr {

    @Id
    @Column(name = "str_bu_id")
    private String strBuId;

    @Column(name = "str_nbr")
    private String strNbr;
}
Table Structure
str_bu_id is the primary key.
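One detail visible in the stack trace above: the failure happens inside Hibernate's SchemaUpdate, and "hibernate.ddl-auto" is a Spring Boot key rather than a Hibernate one, so passing it through setJpaProperties has no effect. A hedged sketch of the Hibernate-level equivalents (the schema name YQ1MM is taken from the YAML above; whether this resolves the error is an assumption):

// Sketch only: Hibernate-level keys for the settings JpaConfiguration tries to set.
// "hibernate.hbm2ddl.auto" is the Hibernate key; setting it to "none" skips the
// SchemaUpdate step that appears in the stack trace.
Properties jpaReadProperties = new Properties();
jpaReadProperties.put("hibernate.dialect", "org.hibernate.dialect.DB2Dialect");
jpaReadProperties.put("hibernate.hbm2ddl.auto", "none");
jpaReadProperties.put("hibernate.default_schema", "YQ1MM");
jpaReadProperties.put("hibernate.show_sql", "true"); // "show-sql" is also a Spring-style key, not a Hibernate one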
I recently tried using R2DBC with Postgres (r2dbc-postgresql 0.8.8.RELEASE Maven dependency) and Spring Boot 2.5.2.
I declared the following properties in my application.yml:
spring:
  jpa:
    properties:
      hibernate:
        dialect: org.hibernate.dialect.PostgreSQL82Dialect
    hibernate:
      ddl-auto: create
    show-sql: false
    database-platform: org.hibernate.dialect.PostgreSQLDialect
  r2dbc:
    url: r2dbc:postgresql://localhost:5432/postgres
    username: postgres
    password: postgres
    pool:
      enabled: true
      initial-size: 00
      max-size: 500
      max-idle-time: 30m
      validation-query: SELECT 1
  sql:
    init:
      schema-locations: classpath:/schema.sql
      mode: always
Below is my Application.java:
@EnableJpaRepositories(basePackages = {"fr.mycompany.common"})
@EntityScan("fr.mycompany")
@SpringBootApplication(exclude = {DataSourceAutoConfiguration.class})
@EnableConfigurationProperties
public class Application {
Note that "fr.company.common" is in another subproject with its own entities and repositories.
My DatabaseConfig class:
@Configuration
@EnableTransactionManagement
@EnableR2dbcRepositories(basePackages = "fr.mycompany.activite.ingester.database.repos")
@Slf4j
public class DatabaseConfig extends AbstractR2dbcConfiguration {

    @Value("${spring.r2dbc.host}")
    private String host;

    @Value("${spring.r2dbc.username}")
    private String username;

    @Value("${spring.r2dbc.password}")
    private String password;

    @Value("${spring.r2dbc.database}")
    private String database;

    @Override
    public ConnectionFactory connectionFactory() {
        log.info("Init r2dbc with host: {}", host);
        log.info("Init r2dbc with database: {}", database);
        log.info("Init r2dbc with username: {}", username);
        log.info("Init r2dbc with password: {}", password);
        return new PostgresqlConnectionFactory(PostgresqlConnectionConfiguration.builder()
                .username(username)
                .password(password)
                .host(host)
                .database(database)
                .build());
    }

    @Bean
    ReactiveTransactionManager transactionManager(ConnectionFactory connectionFactory) {
        return new R2dbcTransactionManager(connectionFactory);
    }

    @Bean
    public ConnectionFactoryInitializer initializer(ConnectionFactory connectionFactory) {
        ConnectionFactoryInitializer initializer = new ConnectionFactoryInitializer();
        initializer.setConnectionFactory(connectionFactory);
        CompositeDatabasePopulator populator = new CompositeDatabasePopulator();
        populator.addPopulators(new ResourceDatabasePopulator(new ClassPathResource("schema.sql")));
        populator.addPopulators(new ResourceDatabasePopulator(new ClassPathResource("data.sql")));
        initializer.setDatabasePopulator(populator);
        return initializer;
    }
}
My repository:
public interface OrdreDeTravailPivotV2SplittedRepository extends ReactiveCrudRepository<OrdreDeTravailPivotV2SplittedEntity, OrdreDeTravailPivotV2IdEntity> {

    @Query(value = "SELECT * FROM splitted.ordredetravail WHERE idot = :idOt ORDER BY datemajstatut DESC LIMIT 1", nativeQuery = true)
    Optional<OrdreDeTravailPivotV2SplittedEntity> findLastByItOt(String idOt);
}
my entity:
@Entity
@Getter
@Setter
@IdClass(OrdreDeTravailIdEntity.class)
@Table(name = "ordredetravail", schema = "splitted")
@TypeDef(
        name = "jsonb",
        typeClass = JsonBinaryType.class
)
public class OrdreDeTravailSplittedEntity implements ISplittedEntity {

    @Id
    @Column(name = "idot")
    private String idOt;

    @Id
    @Column(name = "datemajstatut")
    private Instant dateMajStatut;
Finally my business class which uses my repository:
@Slf4j
@Component
public class OrdreDeTravailConverter implements IModelConverter<OrdreDeTravailRawEntity, OrdreDeTravailSplittedEntity, OrdreDeTravailComputedEntity, OrdreDeTravailInputConversionModel> {

    private final OrdreDeTravailSplittedRepository ordreDeTravailSplittedRepository;
    private final OrdreDeTravailComputedRepository ordreDeTravailComputedRepository;

    @Autowired
    public OrdreDeTravailConverter(OrdreDeTravailSplittedRepository ordreDeTravailSplittedRepository, OrdreDeTravailComputedRepository ordreDeTravailComputedRepository) {
        this.ordreDeTravailSplittedRepository = ordreDeTravailSplittedRepository;
        this.ordreDeTravailComputedRepository = ordreDeTravailComputedRepository;
    }
And when I try to launch the application, I get the following error:
Caused by: org.springframework.beans.factory.NoSuchBeanDefinitionException: No qualifying bean of type 'fr.mycompany.activite.demande.orion.ingester.database.repos.splitted.OrdreDeTravailPivotV2SplittedRepository' available: expected at least 1 bean which qualifies as autowire candidate. Dependency annotations: {}
at org.springframework.beans.factory.support.DefaultListableBeanFactory.raiseNoMatchingBeanFound(DefaultListableBeanFactory.java:1790)
at org.springframework.beans.factory.support.DefaultListableBeanFactory.doResolveDependency(DefaultListableBeanFactory.java:1346)
at org.springframework.beans.factory.support.DefaultListableBeanFactory.resolveDependency(DefaultListableBeanFactory.java:1300)
at org.springframework.beans.factory.support.ConstructorResolver.resolveAutowiredArgument(ConstructorResolver.java:887)
at org.springframework.beans.factory.support.ConstructorResolver.createArgumentArray(ConstructorResolver.java:791)
... 85 more
Do you have an idea?
Best regards
Adrien
Maybe it's because you include multiple implementations of RepositoryFactorySupport. You need to make the AnnotationRepositoryConfigurationSource#hasExplicitFilters function return true. In Spring Boot 2.4.2 you must use @EnableR2dbcRepositories and must have "includeFilters" or "excludeFilters".
My solution (Kotlin):

// Custom annotation
// Used on the repository interfaces
@Target(AnnotationTarget.CLASS)
@kotlin.annotation.Retention(AnnotationRetention.RUNTIME)
@MustBeDocumented
annotation class R2dbcRepository()

// Used on the configuration class
@EnableR2dbcRepositories(includeFilters = [ComponentScan.Filter(type = FilterType.ANNOTATION, classes = [R2dbcRepository::class])])
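For reference, a rough Java equivalent of the Kotlin snippet above (a sketch; the annotation and class names are assumed to mirror the Kotlin version):

import java.lang.annotation.Documented;
import java.lang.annotation.ElementType;
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.annotation.Target;

// Marker annotation placed on the R2DBC repository interfaces.
@Target(ElementType.TYPE)
@Retention(RetentionPolicy.RUNTIME)
@Documented
public @interface R2dbcRepository {
}

// On the configuration class, only interfaces carrying the marker are picked up:
// @EnableR2dbcRepositories(includeFilters = @ComponentScan.Filter(
//         type = FilterType.ANNOTATION, classes = R2dbcRepository.class))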
I'm working with multiple data sources (Oracle and SQL Server) in a Spring Boot REST application. In this application, I have more than 25 endpoints to process client requests. But when one of the databases is down, i.e. Oracle or SQL Server is not available for some reason, my application is unable to start the server.
I looked at a couple of examples on Google and Stack Overflow, but they are different from what I'm looking for...
package com.foobar;
@Configuration
@EnableTransactionManagement
@EnableJpaRepositories(
        entityManagerFactoryRef = "entityManagerFactory",
        basePackages = {"com.foobar.foo.repo"}
)
public class FooDbConfig {

    @Primary
    @Bean(name = "dataSource")
    @ConfigurationProperties(prefix = "spring.datasource")
    public DataSource dataSource() {
        return DataSourceBuilder.create().build();
    }

    @Primary
    @Bean(name = "entityManagerFactory")
    public LocalContainerEntityManagerFactoryBean entityManagerFactory(
            EntityManagerFactoryBuilder builder,
            @Qualifier("dataSource") DataSource dataSource) {
        return builder
                .dataSource(dataSource)
                .packages("com.foobar.foo.domain")
                .persistenceUnit("foo")
                .build();
    }

    @Primary
    @Bean(name = "transactionManager")
    public PlatformTransactionManager transactionManager(
            @Qualifier("entityManagerFactory") EntityManagerFactory entityManagerFactory) {
        return new JpaTransactionManager(entityManagerFactory);
    }
}
The same configuration exists for the 2nd data source, but with different properties.
I'm using the example below as a base code reference to implement my requirements:
Example link
I'm looking for a solution where, if only one DB server out of N is available, the application still starts and processes client requests, and whenever the 2nd DB server becomes available, it connects automatically and processes the other endpoints' requests.
I recently created a solution for multitenancy with data sources, using Liquibase; if you don't use Liquibase, just remove that part and it works too!
Example of application.yml
spring:
  dataSources:
    - tenantId: db1
      url: jdbc:postgresql://localhost:5432/db1
      username: postgres
      password: 123456
      driver-class-name: org.postgresql.Driver
      liquibase:
        enabled: true
        default-schema: public
        change-log: classpath:db/master/changelog/db.changelog-master.yaml
    - tenantId: db2
      url: jdbc:postgresql://localhost:5432/db2
      username: postgres
      password: 123456
      driver-class-name: org.postgresql.Driver
    - tenantId: db3
      url: jdbc:postgresql://localhost:5432/db3
      username: postgres
      password: 123456
      driver-class-name: org.postgresql.Driver
DataSourceConfiguration
@Configuration
@EnableTransactionManagement
@EntityScan(basePackages = {"br.com.dijalmasilva.springbootmultitenancyliquibase"})
@EnableJpaRepositories(basePackages = {"br.com.dijalmasilva.springbootmultitenancyliquibase"})
public class DataSourceConfiguration {

    @Bean(name = "dataSources")
    @Primary
    public Map<Object, Object> getDataSources(DataSourceProperties dataSourceProperties) {
        return dataSourceProperties.getDataSources().stream().map(dataSourceProperty -> {
            DataSource dataSource = DataSourceBuilder.create()
                    .url(dataSourceProperty.getUrl())
                    .username(dataSourceProperty.getUsername())
                    .password(dataSourceProperty.getPassword())
                    .driverClassName(dataSourceProperty.getDriverClassName())
                    .build();
            return new TenantIdDataSource(dataSourceProperty.getTenantId(), dataSource);
        }).collect(Collectors.toMap(TenantIdDataSource::getTenantId, TenantIdDataSource::getDataSource));
    }

    @Bean(name = "tenantRoutingDataSource")
    @DependsOn("dataSources")
    public DataSource dataSource(Map<Object, Object> dataSources) {
        AbstractRoutingDataSource tenantRoutingDataSource = new TenantRoutingDataSource();
        tenantRoutingDataSource.setTargetDataSources(dataSources);
        tenantRoutingDataSource.setDefaultTargetDataSource(dataSources.get("db1"));
        tenantRoutingDataSource.afterPropertiesSet();
        return tenantRoutingDataSource;
    }

    @Data
    @AllArgsConstructor
    private class TenantIdDataSource {
        private Object tenantId;
        private Object dataSource;
    }
}
TenantRoutingDataSource
public class TenantRoutingDataSource extends AbstractRoutingDataSource {

    @Override
    protected Object determineCurrentLookupKey() {
        return TenantContext.getCurrentTenant();
    }
}
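TenantContext is referenced above but not shown; a minimal sketch, assuming a ThreadLocal-based holder (the class and getter names follow the call in determineCurrentLookupKey, the rest is assumed):

// Minimal ThreadLocal-based tenant holder; typically a web filter or interceptor
// sets the tenant id at the start of a request and clears it afterwards.
public final class TenantContext {

    private static final ThreadLocal<Object> CURRENT_TENANT = new ThreadLocal<>();

    private TenantContext() {
    }

    public static void setCurrentTenant(Object tenantId) {
        CURRENT_TENANT.set(tenantId);
    }

    public static Object getCurrentTenant() {
        return CURRENT_TENANT.get();
    }

    public static void clear() {
        CURRENT_TENANT.remove();
    }
}

When getCurrentTenant() returns null, the AbstractRoutingDataSource falls back to the default target data source (db1 in the configuration above).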
DataSourceProperties
@Data
@Component
@ConfigurationProperties(prefix = "spring")
public class DataSourceProperties {
    private List<DataSourceProperty> dataSources = new ArrayList<>();
}
DataSourceProperty
@Data
public class DataSourceProperty {
    private String tenantId;
    private String url;
    private String username;
    private String password;
    private String driverClassName;
    private LiquibaseProperties liquibase;
}
See the complete code; maybe it will help you!
Link of project: https://github.com/dijalmasilva/spring-boot-multitenancy-datasource-liquibase
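The routing setup above still builds a pool per tenant at startup; whether the application survives one database being down mostly depends on whether anything opens a connection eagerly during boot. With HikariCP, the pool can be told to skip the initial connection attempt entirely, which is one way to let the service start and connect later. A minimal sketch, assuming HikariCP as the pool (URL and credentials copied from the YAML above):

import com.zaxxer.hikari.HikariConfig;
import com.zaxxer.hikari.HikariDataSource;
import javax.sql.DataSource;

// Sketch: a pool that does not try to connect at startup, so the application
// can boot even if this particular database is currently unreachable.
public class LazyPoolFactory {

    public static DataSource lazyDataSource() {
        HikariConfig config = new HikariConfig();
        config.setJdbcUrl("jdbc:postgresql://localhost:5432/db2");
        config.setUsername("postgres");
        config.setPassword("123456");
        config.setDriverClassName("org.postgresql.Driver");
        // A negative value skips HikariCP's initial connection attempt and validation.
        config.setInitializationFailTimeout(-1);
        return new HikariDataSource(config);
    }
}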
In my application I need to integrate two data sources, but when I integrate the second database using JdbcTemplate, the previous one stops working; instead, all tables are created in the second data source.
1st DataSource
@Configuration
@Profile("mariadb4j")
@EnableAutoConfiguration(exclude = {DataSourceAutoConfiguration.class})
public class EmbeddedMariaDBConfig {

    private static final Logger L = LoggerFactory.getLogger(EmbeddedMariaDBConfig.class);
    private static final String DB_SERVICE = "dbServiceBean";

    @Bean(name = {DB_SERVICE})
    MariaDB4jSpringService mariaDB4jSpringService() {
        L.info("Initializing MariaDB4j service");
        return new MariaDB4jSpringService();
    }

    @Bean(name = "adminDataSource")
    @Primary
    @DependsOn(DB_SERVICE)
    DataSource dataSource(MariaDB4jSpringService mdb, DataSourceProperties dataSourceProperties) throws ManagedProcessException {
        String dbName = dataSourceProperties.getName();
        L.debug("Embedded MariaDB datasource properties from spring: [{}]", dataSourceProperties);
        mdb.getDB().createDB(dbName);
        if (L.isDebugEnabled()) {
            DBConfigurationBuilder ecfg = mdb.getConfiguration();
            L.debug("JDBC URL for embedded MariaDB as reported by driver: [{}]", ecfg.getURL(dbName));
            L.debug("JDBC URL from spring config: [{}]", dataSourceProperties.getUrl());
            L.debug("JDBC Username: [{}]", dataSourceProperties.getUsername());
            L.debug("JDBC Password: [{}]", dataSourceProperties.getPassword());
        }
        return DataSourceBuilder
                .create()
                .username(dataSourceProperties.getUsername())
                .password(dataSourceProperties.getPassword())
                .url(dataSourceProperties.getUrl())
                .driverClassName(dataSourceProperties.getDriverClassName())
                .build();
    }
}
Yaml Configuration
spring:
  profiles: mariadb4j
  datasource:
    username: root
    password: password
    driver-class-name: com.mysql.jdbc.Driver
    url: jdbc:mysql://localhost:3306/t1
mariaDB4j:
  dataDir: /tmp/mariadb
  port: 3900
2nd DataSource
@Configuration
public class ExoDBConfig {

    @Bean(name = "user")
    public DataSource dataSource() {
        DriverManagerDataSource dataSource = new DriverManagerDataSource();
        dataSource.setDriverClassName("com.mysql.jdbc.Driver");
        dataSource.setUrl("jdbc:mysql://localhost:3306/t2");
        dataSource.setUsername("root");
        dataSource.setPassword("password");
        return dataSource;
    }

    @Bean
    public JdbcTemplate jdbcTemplate(@Qualifier("user") DataSource dataSource) {
        return new JdbcTemplate(dataSource);
    }
}
When I use only one data source (the 1st one), it works fine and the tables are created in the t1 database, but when I integrate the 2nd data source, it points to the second one, meaning all the tables are created in the t2 database.
Just create the DataSource object in the JdbcTemplate bean itself; that works for me, and you can create a separate properties file for the t2 database:
@Bean
public JdbcTemplate jdbcTemplate() {
    DriverManagerDataSource dataSource = new DriverManagerDataSource();
    dataSource.setDriverClassName("com.mysql.jdbc.Driver");
    dataSource.setUrl("jdbc:mysql://localhost:3306/t2");
    dataSource.setUsername("user");
    dataSource.setPassword("password");
    return new JdbcTemplate(dataSource);
}
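If both databases need to stay reachable through JdbcTemplate, another option is to expose two qualified templates, one per data source, so neither configuration overrides the other. A sketch (the bean and qualifier names reuse the ones from the snippets above, everything else is assumed):

import javax.sql.DataSource;
import org.springframework.beans.factory.annotation.Qualifier;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.jdbc.core.JdbcTemplate;

@Configuration
public class JdbcTemplatesConfig {

    // Template bound to the primary (t1) data source.
    @Bean(name = "t1JdbcTemplate")
    public JdbcTemplate t1JdbcTemplate(@Qualifier("adminDataSource") DataSource t1DataSource) {
        return new JdbcTemplate(t1DataSource);
    }

    // Template bound to the secondary (t2) data source.
    @Bean(name = "t2JdbcTemplate")
    public JdbcTemplate t2JdbcTemplate(@Qualifier("user") DataSource t2DataSource) {
        return new JdbcTemplate(t2DataSource);
    }
}

Injection points then pick the template they want with @Qualifier("t1JdbcTemplate") or @Qualifier("t2JdbcTemplate").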
I am using Spring Boot and Spring Data, and I want to use primarily a MySQL data source, but if it fails to connect, fall back to an H2 data source.
So far, I make the switch just by moving the @Primary annotation between the configurations, but if I put @Primary on the MySQL (main) data source and stop the MySQL server on my PC, the other bean does not come up... What do I need?
application.yml:
# Main properties
spring:
  application:
    name: app
  jpa:
    database: default
    show-sql: false
    hibernate:
      ddl-auto: update
    properties:
      hibernate:
        format_sql: false
        current_session_context_class: org.springframework.orm.hibernate5.SpringSessionContext

# Main database: MySQL
main.datasource:
  url: jdbc:mysql://localhost:3306/app?useSSL=false
  driver-class-name: com.mysql.jdbc.Driver
  username: sa
  password: sa

# Backup database: H2
backup.datasource:
  url: jdbc:h2:${project.directory}/app;DB_CLOSE_ON_EXIT=FALSE
  driver-class-name: org.h2.Driver
  username: sa
  password: sa
Main data source
@Configuration
@EnableTransactionManagement
@EnableJpaRepositories("org.app")
@EntityScan("org.app")
public class MainDataSourceConfig {

    @Primary
    @Bean(name = "mainDataSource")
    @ConfigurationProperties(prefix = "main.datasource")
    public DataSource mainDataSource() {
        return DataSourceBuilder.create().build();
    }
}
Backup data source:
@Configuration
@EnableTransactionManagement
@EnableJpaRepositories("org.app")
@EntityScan("org.app")
public class BackupDataSourceConfig {

    @Bean(name = "backupDataSource")
    @ConfigurationProperties(prefix = "backup.datasource")
    public DataSource backupDataSource() {
        return DataSourceBuilder.create().build();
    }
}
Thanks!
I figured out how to do it. Hope this can help someone:
@Configuration
@EnableTransactionManagement
@EnableJpaRepositories("org.app")
@EntityScan("org.app")
public class DataSourceConfig {

    private static final String USERNAME = "sa";
    private static final String PASSWORD = "sa";

    @Bean
    @Primary
    public DataSource dataSource() {
        DataSource dataSource;
        try {
            dataSource = getMainDataSource();
            dataSource.getConnection().isValid(500);
        } catch (Exception e) {
            log.error("Main database not valid.", e);
            dataSource = getBackupDataSource();
        }
        return dataSource;
    }

    private DataSource getMainDataSource() {
        return DataSourceBuilder.create()
                .driverClassName("com.mysql.jdbc.Driver")
                .username(USERNAME)
                .password(PASSWORD)
                .url("jdbc:mysql://localhost:3306/app?useSSL=false")
                .build();
    }

    private DataSource getBackupDataSource() {
        return DataSourceBuilder.create()
                .driverClassName("org.h2.Driver")
                .username(USERNAME)
                .password(PASSWORD)
                .url("jdbc:h2:/app;DB_CLOSE_ON_EXIT=FALSE")
                .build();
    }
}
Just one bean.
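One small detail in the configuration above: dataSource.getConnection().isValid(500) opens a connection that is never closed. A hedged variant of the same check using try-with-resources (behaviour otherwise unchanged; uses java.sql.Connection):

// Sketch: validate the main data source without leaking the test connection,
// falling back to the backup data source on any failure.
@Bean
@Primary
public DataSource dataSource() {
    DataSource mainDataSource = getMainDataSource();
    try (Connection connection = mainDataSource.getConnection()) {
        if (connection.isValid(500)) {
            return mainDataSource;
        }
    } catch (Exception e) {
        log.error("Main database not valid.", e);
    }
    return getBackupDataSource();
}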