I recently tried to use R2DBC with Postgres ("0.8.8.RELEASE" r2dbc-postgresql Maven dependency) and Spring Boot "2.5.2".
I declared the following properties in my application.yml:
spring:
  jpa:
    properties:
      hibernate:
        dialect: org.hibernate.dialect.PostgreSQL82Dialect
    hibernate:
      ddl-auto: create
    show-sql: false
    database-platform: org.hibernate.dialect.PostgreSQLDialect
  r2dbc:
    url: r2dbc:postgresql://localhost:5432/postgres
    username: postgres
    password: postgres
    pool:
      enabled: true
      initial-size: 00
      max-size: 500
      max-idle-time: 30m
      validation-query: SELECT 1
  sql:
    init:
      schema-locations: classpath:/schema.sql
      mode: always
Below is my Application.java:

@EnableJpaRepositories(basePackages = {"fr.mycompany.common"})
@EntityScan("fr.mycompany")
@SpringBootApplication(exclude = {DataSourceAutoConfiguration.class})
@EnableConfigurationProperties
public class Application {
Note that "fr.mycompany.common" is in another subproject with its own entities & repositories.
My DatabaseConfig class:
@Configuration
@EnableTransactionManagement
@EnableR2dbcRepositories(basePackages = "fr.mycompany.activite.ingester.database.repos")
@Slf4j
public class DatabaseConfig extends AbstractR2dbcConfiguration {

    @Value("${spring.r2dbc.host}")
    private String host;

    @Value("${spring.r2dbc.username}")
    private String username;

    @Value("${spring.r2dbc.password}")
    private String password;

    @Value("${spring.r2dbc.database}")
    private String database;

    @Override
    public ConnectionFactory connectionFactory() {
        log.info("Init r2dbc with host: {}", host);
        log.info("Init r2dbc with database: {}", database);
        log.info("Init r2dbc with username: {}", username);
        log.info("Init r2dbc with password: {}", password);
        return new PostgresqlConnectionFactory(PostgresqlConnectionConfiguration.builder()
                .username(username)
                .password(password)
                .host(host)
                .database(database)
                .build());
    }

    @Bean
    ReactiveTransactionManager transactionManager(ConnectionFactory connectionFactory) {
        return new R2dbcTransactionManager(connectionFactory);
    }

    @Bean
    public ConnectionFactoryInitializer initializer(ConnectionFactory connectionFactory) {
        ConnectionFactoryInitializer initializer = new ConnectionFactoryInitializer();
        initializer.setConnectionFactory(connectionFactory);
        CompositeDatabasePopulator populator = new CompositeDatabasePopulator();
        populator.addPopulators(new ResourceDatabasePopulator(new ClassPathResource("schema.sql")));
        populator.addPopulators(new ResourceDatabasePopulator(new ClassPathResource("data.sql")));
        initializer.setDatabasePopulator(populator);
        return initializer;
    }
}
My repository:
public interface OrdreDeTravailPivotV2SplittedRepository extends ReactiveCrudRepository<OrdreDeTravailPivotV2SplittedEntity, OrdreDeTravailPivotV2IdEntity> {

    @Query(value = "SELECT * FROM splitted.ordredetravail WHERE idot = :idOt ORDER BY datemajstatut DESC LIMIT 1", nativeQuery = true)
    Optional<OrdreDeTravailPivotV2SplittedEntity> findLastByItOt(String idOt);
}
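Note that the @Query/nativeQuery/Optional combination above is JPA-style; in Spring Data R2DBC, @Query takes the SQL string directly (there is no nativeQuery attribute) and query methods return Mono/Flux. A sketch of the reactive form, assuming that is what was intended:

```java
// Sketch of the Spring Data R2DBC form (an assumption, not the poster's code):
// @Query takes plain SQL, and the single result is a Mono rather than an Optional.
public interface OrdreDeTravailPivotV2SplittedRepository
        extends ReactiveCrudRepository<OrdreDeTravailPivotV2SplittedEntity, OrdreDeTravailPivotV2IdEntity> {

    @Query("SELECT * FROM splitted.ordredetravail WHERE idot = :idOt ORDER BY datemajstatut DESC LIMIT 1")
    Mono<OrdreDeTravailPivotV2SplittedEntity> findLastByIdOt(String idOt);
}
```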
My entity:

@Entity
@Getter
@Setter
@IdClass(OrdreDeTravailIdEntity.class)
@Table(name = "ordredetravail", schema = "splitted")
@TypeDef(name = "jsonb", typeClass = JsonBinaryType.class)
public class OrdreDeTravailSplittedEntity implements ISplittedEntity {

    @Id
    @Column(name = "idot")
    private String idOt;

    @Id
    @Column(name = "datemajstatut")
    private Instant dateMajStatut;
Finally, my business class which uses my repository:

@Slf4j
@Component
public class OrdreDeTravailConverter implements IModelConverter<OrdreDeTravailRawEntity, OrdreDeTravailSplittedEntity, OrdreDeTravailComputedEntity, OrdreDeTravailInputConversionModel> {

    private final OrdreDeTravailSplittedRepository ordreDeTravailSplittedRepository;
    private final OrdreDeTravailComputedRepository ordreDeTravailComputedRepository;

    @Autowired
    public OrdreDeTravailConverter(OrdreDeTravailSplittedRepository ordreDeTravailSplittedRepository, OrdreDeTravailComputedRepository ordreDeTravailComputedRepository) {
        this.ordreDeTravailSplittedRepository = ordreDeTravailSplittedRepository;
        this.ordreDeTravailComputedRepository = ordreDeTravailComputedRepository;
    }
When I try to launch the application, I get the following error:
Caused by: org.springframework.beans.factory.NoSuchBeanDefinitionException: No qualifying bean of type 'fr.mycompany.activite.demande.orion.ingester.database.repos.splitted.OrdreDeTravailPivotV2SplittedRepository' available: expected at least 1 bean which qualifies as autowire candidate. Dependency annotations: {}
at org.springframework.beans.factory.support.DefaultListableBeanFactory.raiseNoMatchingBeanFound(DefaultListableBeanFactory.java:1790)
at org.springframework.beans.factory.support.DefaultListableBeanFactory.doResolveDependency(DefaultListableBeanFactory.java:1346)
at org.springframework.beans.factory.support.DefaultListableBeanFactory.resolveDependency(DefaultListableBeanFactory.java:1300)
at org.springframework.beans.factory.support.ConstructorResolver.resolveAutowiredArgument(ConstructorResolver.java:887)
at org.springframework.beans.factory.support.ConstructorResolver.createArgumentArray(ConstructorResolver.java:791)
... 85 more
Do you have an idea?
Best regards,
Adrien
Maybe it is because you include multiple implementations of RepositoryFactorySupport. You need to make the AnnotationRepositoryConfigurationSource#hasExplicitFilters method return true. In Spring Boot 2.4.2 you must use @EnableR2dbcRepositories and must set "includeFilters" or "excludeFilters".
My solution (Kotlin):

// Custom annotation, used on the repository interface
@Target(AnnotationTarget.CLASS)
@kotlin.annotation.Retention(AnnotationRetention.RUNTIME)
@MustBeDocumented
annotation class R2dbcRepository

// Used on the configuration class
@EnableR2dbcRepositories(includeFilters = [ComponentScan.Filter(type = FilterType.ANNOTATION, classes = [R2dbcRepository::class])])
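The same idea in Java would look roughly like this (a sketch; the marker annotation name simply mirrors the Kotlin one above):

```java
import java.lang.annotation.Documented;
import java.lang.annotation.ElementType;
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;

// Marker annotation placed on R2DBC repository interfaces (illustrative name)
@Target(ElementType.TYPE)
@Retention(RetentionPolicy.RUNTIME)
@Documented
public @interface R2dbcRepository {
}

// On the configuration class: only marked interfaces are picked up, which
// makes AnnotationRepositoryConfigurationSource#hasExplicitFilters return true
@EnableR2dbcRepositories(includeFilters = @ComponentScan.Filter(
        type = FilterType.ANNOTATION, classes = R2dbcRepository.class))
```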
Related
I have used NamedParameterJdbcTemplate and Spring Data JPA together in my project, for which I have two configuration classes preparing the data source for a DB2 database. But when I run the project, I face a
"Name cannot be null" error while creating the EntityManagerFactory. I am not able to identify the root cause of this problem; can somebody please point out the problem and the mistakes I am making?
I tried put("hibernate.ddl-auto", "none"); as well, but got the same error.
Error
Caused by: javax.persistence.PersistenceException: [PersistenceUnit: read] Unable to build Hibernate SessionFactory
at org.hibernate.jpa.boot.internal.EntityManagerFactoryBuilderImpl.persistenceException(EntityManagerFactoryBuilderImpl.java:954)
at org.hibernate.jpa.boot.internal.EntityManagerFactoryBuilderImpl.build(EntityManagerFactoryBuilderImpl.java:882)
at org.hibernate.jpa.HibernatePersistenceProvider.createContainerEntityManagerFactory(HibernatePersistenceProvider.java:135)
at org.springframework.orm.jpa.LocalContainerEntityManagerFactoryBean.createNativeEntityManagerFactory(LocalContainerEntityManagerFactoryBean.java:353)
at org.springframework.orm.jpa.AbstractEntityManagerFactoryBean.buildNativeEntityManagerFactory(AbstractEntityManagerFactoryBean.java:370)
at org.springframework.orm.jpa.AbstractEntityManagerFactoryBean.afterPropertiesSet(AbstractEntityManagerFactoryBean.java:359)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.invokeInitMethods(AbstractAutowireCapableBeanFactory.java:1687)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.initializeBean(AbstractAutowireCapableBeanFactory.java:1624)
... 16 common frames omitted
Caused by: java.lang.IllegalArgumentException: Name cannot be null
at org.hibernate.boot.model.relational.QualifiedNameParser$NameParts.<init>(QualifiedNameParser.java:34)
at org.hibernate.boot.model.relational.QualifiedNameImpl.<init>(QualifiedNameImpl.java:24)
at org.hibernate.boot.model.relational.QualifiedSequenceName.<init>(QualifiedSequenceName.java:16)
at org.hibernate.tool.schema.extract.internal.SequenceInformationExtractorLegacyImpl.extractMetadata(SequenceInformationExtractorLegacyImpl.java:51)
at org.hibernate.tool.schema.extract.internal.DatabaseInformationImpl.initializeSequences(DatabaseInformationImpl.java:64)
at org.hibernate.tool.schema.extract.internal.DatabaseInformationImpl.<init>(DatabaseInformationImpl.java:60)
at org.hibernate.tool.hbm2ddl.SchemaUpdate.execute(SchemaUpdate.java:123)
at org.hibernate.tool.hbm2ddl.SchemaUpdate.execute(SchemaUpdate.java:101)
at org.hibernate.internal.SessionFactoryImpl.<init>(SessionFactoryImpl.java:472)
at org.hibernate.boot.internal.SessionFactoryBuilderImpl.build(SessionFactoryBuilderImpl.java:444)
at org.hibernate.jpa.boot.internal.EntityManagerFactoryBuilderImpl.build(EntityManagerFactoryBuilderImpl.java:879)
... 22 common frames omitted
Process finished with exit code
Configuration Classes
@Setter
@Getter
@Configuration
@PropertySource("classpath:application.yml")
@ConfigurationProperties("app.datasource.db2.credentials.hikari")
public class HikariReadProperties {
    private String poolName;
    private int minimumIdle;
    private int maximumPoolSize;
    private int idleTimeout;
    private String connectionTestQuery;
}

public class HikariConfigRead extends HikariConfig {

    protected final HikariReadProperties hikariReadProperties;
    protected final String PERSISTENCE_UNIT_NAME = "read";

    protected HikariConfigRead(HikariReadProperties hikariReadProperties) {
        this.hikariReadProperties = hikariReadProperties;
        setPoolName(this.hikariReadProperties.getPoolName());
        setMinimumIdle(this.hikariReadProperties.getMinimumIdle());
        setMaximumPoolSize(this.hikariReadProperties.getMaximumPoolSize());
        setIdleTimeout(this.hikariReadProperties.getIdleTimeout());
        setConnectionTestQuery(this.hikariReadProperties.getConnectionTestQuery());
    }
}
@Configuration
@ConfigurationProperties("app.datasource.db2.credentials")
@EnableTransactionManagement
@EnableJpaRepositories(entityManagerFactoryRef = "entityManagerFactory",
        transactionManagerRef = "transactionManagerRead", basePackages = {"com.testing.db2migration.repository"})
public class JpaConfiguration extends HikariConfigRead {

    protected JpaConfiguration(HikariReadProperties hikariReadProperties) {
        super(hikariReadProperties);
    }

    @Bean(name = "db2DataSource")
    public HikariDataSource hikariDataSource() {
        return new HikariDataSource(this);
    }

    @Bean(name = "entityManagerFactory")
    public LocalContainerEntityManagerFactoryBean entityManagerFactoryWrite(
            final @Qualifier("db2DataSource") HikariDataSource dataSourceWrite) {
        return new LocalContainerEntityManagerFactoryBean() {{
            setDataSource(dataSourceWrite);
            setPersistenceProviderClass(HibernatePersistenceProvider.class);
            setPersistenceUnitName(PERSISTENCE_UNIT_NAME);
            setPackagesToScan("com.testing.db2migration.model");
            Properties JPA_READ_PROPERTIES = new Properties() {{
                put("hibernate.dialect", "org.hibernate.dialect.DB2Dialect");
                put("hibernate.hbm2ddl.auto", "update");
                put("hibernate.ddl-auto", "update");
                put("show-sql", "true");
            }};
            setJpaProperties(JPA_READ_PROPERTIES);
        }};
    }

    @Bean(name = "transactionManagerRead")
    public PlatformTransactionManager transactionManagerWrite(
            @Qualifier("entityManagerFactory") EntityManagerFactory entityManagerFactoryWrite) {
        return new JpaTransactionManager(entityManagerFactoryWrite);
    }
}
public class BaseConfig {

    protected Environment environment;

    protected DataSource getHikariDataSource() {
        HikariDataSource ds = new HikariDataSource();
        String driverClassName = environment.getProperty(Constants.ENVIRONMENT_ROOT + "db2-driverClassName");
        String url = environment.getProperty(Constants.ENVIRONMENT_ROOT + "db2-url");
        String userName = environment.getProperty(Constants.ENVIRONMENT_ROOT + "db2-username");
        String password = environment.getProperty(Constants.ENVIRONMENT_ROOT + "db2-password");
        String dbTestQuery = environment.getProperty(Constants.ENVIRONMENT_ROOT + "db2-dbTestQuery");
        ds.setJdbcUrl(url);
        ds.setUsername(userName);
        ds.setPassword(password);
        ds.setDriverClassName(driverClassName);
        ds.setConnectionTestQuery(dbTestQuery);
        return ds;
    }
}
@Configuration
@ComponentScan("com.testing")
public class LocalConfig extends BaseConfig implements EnvironmentAware {

    @Override
    public void setEnvironment(Environment environment) {
        this.environment = environment;
    }

    @Bean(name = "dataSource")
    @Primary
    public DataSource dataSource() throws SQLException {
        return this.getHikariDataSource();
    }

    @Bean("jdbcTemplate")
    @Autowired
    public JdbcTemplate jdbcTemplate(DataSource dataSource) {
        return new JdbcTemplate(dataSource);
    }

    // .................. normal bean declarations
}
application.yaml
vcap:
  services:
    store-service:
      credentials:
        env: QA
        db2-driverClassName: com.ibm.db2.jcc.DB2Driver
        db2-url: jdbc:db2://localhost:/APP1
        db2-schema: YQ1MM
        db2-username: ******
        db2-password: *****
        db2-dbTestQuery: SELECT CURRENT SQLID FROM SYSIBM.SYSDUMMY1
app:
  datasource:
    db2:
      credentials:
        env: QA
        driver-class-name: com.ibm.db2.jcc.DB2Driver
        jdbc-url: jdbc:db2://localhost/APP1
        hibernate:
          default_schema: YQ1MM
        username: *****
        password: *****
        hikari:
          maximum-pool-size: 10
          connectionTestQuery: SELECT CURRENT SQLID FROM SYSIBM.SYSDUMMY1
Model
@Entity
@Table(name = "epr_str")
public class EprStr {

    @Id
    @Column(name = "str_bu_id")
    private String strBuId;

    @Column(name = "str_nbr")
    private String strNbr;
}
Table Structure
str_bu_id is the primary key.
I'm trying to inject a properties bean into another context bean.
(Spring Boot 2.7.3 / Java 11)
My application.yml is like below:
spring:
  config:
    active: dev
---
spring:
  config:
    activate:
      on-profile: dev
keycloak:
  username: "local"
  password: "local"
---
spring:
  config:
    activate:
      on-profile: stg
keycloak:
  username: "stg"
  password: "stg"
---
spring:
  config:
    activate:
      on-profile: prod
keycloak:
  username: "prod"
  password: "prod"
and my KafkaProducerConfig.java looks like below:

import org.springframework.boot.context.properties.ConfigurationProperties;
import org.springframework.boot.context.properties.ConstructorBinding;
import org.springframework.stereotype.Component;
import lombok.Getter;

@Getter
@Component
@ConstructorBinding
@ConfigurationProperties("keycloak")
public class KafkaProducerConfig {

    private final String username;
    private final String password;

    public KafkaProducerConfig(String username, String password) {
        this.username = username;
        this.password = password;
    }
}
And finally, I failed to inject it within another class.
Actually, the UserDataProducer class is extended by a context bean class, which means UserDataProducer is also instantiated by the Spring IoC container, as far as I know.
I also tried @DependsOn, which doesn't work.
@Slf4j
@DependsOn(value = {"KafkaProducerConfig"})
public class UserDataProducer {

    @Autowired
    KafkaProducerConfig kafkaProducerConfig;

    private final String topicName;

    public UserDataProducer() {
        log.info("===========================================================");
        log.info("Initializing UserDataProducer ...");
        System.out.println(kafkaProducerConfig.getPassword());
        log.info("===========================================================");
        // additional properties for transactional producing
        topicName = ProducerConfig.PRODUCER_PROPS.getProperty("default.topic");
    }
}
@Slf4j
@Component
public class UserDataProducer {

    // use static initializer block to initialize your static fields
    // private static final Producer<String, Object> producer;
    // initializer order: static {} -> instance block {} -> constructor

    private final String topicName;

    public UserDataProducer(KafkaProducerConfig kafkaProducerConfig) {
        log.info("===========================================================");
        log.info("Initializing UserDataProducer ...");
        System.out.println(kafkaProducerConfig.getPassword());
        log.info("===========================================================");
        // additional properties for transactional producing
        topicName = ProducerConfig.PRODUCER_PROPS.getProperty("default.topic");
    }
}
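The difference between the two versions comes down to ordering: @Autowired fields are populated only after the constructor has returned, so reading them inside the constructor sees null, while a constructor parameter is available immediately. A self-contained plain-Java sketch of that lifecycle (names are illustrative, not Spring API):

```java
// Sketch of the bean lifecycle that makes field injection fail in a constructor.
// The "container" instantiates the bean first (constructor runs), and only then
// sets @Autowired fields; constructor injection hands the dependency in up front.
class Config {
    String password = "secret";
}

class FieldInjectedProducer {
    Config config; // would be @Autowired in Spring; still null during construction

    final String seenInConstructor;

    FieldInjectedProducer() {
        // config has not been "injected" yet, exactly as in the question
        seenInConstructor = (config == null) ? "null" : config.password;
    }
}

class ConstructorInjectedProducer {
    final String seenInConstructor;

    ConstructorInjectedProducer(Config config) {
        // the dependency is available before any constructor code runs
        seenInConstructor = config.password;
    }
}
```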
I have a bit of a problem: I have several microservices, but one throws an exception that the others don't throw, and they work perfectly ...
[2020-09-28 16:55:38.304]|ERROR|TIBCO EMS Session Dispatcher (21297)|org.hibernate.engine.jdbc.spi.SqlExceptionHelper.logExceptions(142): --- ERROR: relation "computed.fluxeventlogging" does not exist
Position: 502
[2020-09-28 16:55:38.307]|INFO |TIBCO EMS Session Dispatcher (21297)|org.hibernate.event.internal.DefaultLoadEventListener.doOnLoad(116): --- HHH000327: Error performing load command
org.hibernate.exception.SQLGrammarException: could not extract ResultSet
at org.hibernate.exception.internal.SQLStateConversionDelegate.convert(SQLStateConversionDelegate.java:103) ~[hibernate-core-5.4.12.Final.jar:5.4.12.Final]
at org.hibernate.exception.internal.StandardSQLExceptionConverter.convert(StandardSQLExceptionConverter.java:42) ~[hibernate-core-5.4.12.Final.jar:5.4.12.Final]
I have the right permissions in the database (screenshot omitted).
My entity is located in a common project used by all the microservices, added as a Maven dependency:
@Entity
@Data
@Table(name = "fluxeventlogging", schema = "computed")
@IdClass(FluxEventLoggingIdEntity.class)
public class FluxEventLoggingEntity implements Serializable {

    @Id
    @Column(name = "fluxeventuuid", columnDefinition = "uuid")
    private UUID fluxEventUuid;

    @Column(name = "lastupdatedate")
    private Instant lastUpdateDate;

    @Column(name = "businessfluxtype")
    private String businessFluxType;

    @Column(name = "fluxprocessortype")
    private String fluxProcessorType;

    @Column(name = "valuewhichcauseupdate")
    private String valueWhichCauseUpdate;

    @Column(name = "oldvaluecause")
    private String oldValueCause;

    @Column(name = "newvaluecause")
    private String newValueCause;

    @Column(name = "currenteventlifestate")
    private String currentEventLifeState;

    @Column(name = "nexteventlifestate")
    private String nextEventLifeState;

    @Column(name = "generatederror", columnDefinition = "text")
    private String generatedError;
}
However, I pinned the Hibernate version in the pom to:
<dependency>
<groupId>org.hibernate</groupId>
<artifactId>hibernate-core</artifactId>
<version>5.4.12.Final</version>
</dependency>
because this project is the only one which uses one data source for JPA, like this:
custom.datasource.url: jdbc:postgresql://localhost:5432/postgres
custom.datasource.database: postgres
custom.datasource.driver: pool
custom.datasource.protocol: postgres
custom.datasource.localhost: localhost
custom.datasource.port: 5432
custom.datasource.password: postgres
custom.datasource.username: postgres
custom.datasource.driverclassname: org.postgresql.Driver
spring:
  jpa:
    properties:
      hibernate:
        dialect: org.hibernate.dialect.PostgreSQL82Dialect
    hibernate:
      ddl-auto: create
    show-sql: false
    database-platform: org.hibernate.dialect.PostgreSQLDialect
  datasource:
    driver-class-name: org.postgresql.Driver
    url: jdbc:postgresql://localhost:5432/postgres
    username: postgres
    password: postgres
    schema: classpath:/schema.sql
    initialization-mode: always
  r2dbc:
    url: r2dbc:postgresql://postgres:postgres@localhost:5432/postgres
    pool:
      enabled: true
      initial-size: 00
      max-size: 500
      max-idle-time: 30m
      validation-query: SELECT 1
And another data source for R2DBC (reactive Postgres):
@Configuration
@Slf4j
public class DatabaseConfiguration {

    private static Map<String, Object> PROPERTIES;

    @Autowired
    DataSource dataSource;

    @Bean
    public ConnectionFactory r2dbcConnectionFactory() {
        if (PROPERTIES == null) {
            Yaml yaml = new Yaml();
            InputStream inputStream = this.getClass()
                    .getClassLoader()
                    .getResourceAsStream("application.yml");
            PROPERTIES = yaml.load(inputStream);
        }
        log.info("Init r2dbc with host: {}", PROPERTIES.get("custom.datasource.localhost").toString());
        log.info("Init r2dbc with port: {}", PROPERTIES.get("custom.datasource.port").toString());
        log.info("Init r2dbc with database: {}", PROPERTIES.get("custom.datasource.database").toString());
        log.info("Init r2dbc with username: {}", PROPERTIES.get("custom.datasource.username").toString());
        log.info("Init r2dbc with driver: {}", PROPERTIES.get("custom.datasource.driver").toString());
        log.info("Init r2dbc with protocol: {}", PROPERTIES.get("custom.datasource.protocol").toString());
        ConnectionFactoryOptions options = ConnectionFactoryOptions.builder()
                .option(ConnectionFactoryOptions.DRIVER, PROPERTIES.get("custom.datasource.driver").toString())
                .option(ConnectionFactoryOptions.PROTOCOL, PROPERTIES.get("custom.datasource.protocol").toString())
                .option(ConnectionFactoryOptions.USER, PROPERTIES.get("custom.datasource.username").toString())
                .option(ConnectionFactoryOptions.PASSWORD, PROPERTIES.get("custom.datasource.password").toString())
                .option(ConnectionFactoryOptions.HOST, PROPERTIES.get("custom.datasource.localhost").toString())
                .option(ConnectionFactoryOptions.PORT, Integer.parseInt(PROPERTIES.get("custom.datasource.port").toString()))
                .option(ConnectionFactoryOptions.DATABASE, PROPERTIES.get("custom.datasource.database").toString())
                .build();
        return ConnectionFactories.get(options);
        //return ConnectionFactories.get(ConnectionFactoryOptions.parse(PROPERTIES.get("custom.r2dbc.url").toString()));
    }

    @Bean
    public DataSource getDataSource() {
        if (PROPERTIES == null) {
            Yaml yaml = new Yaml();
            InputStream inputStream = this.getClass()
                    .getClassLoader()
                    .getResourceAsStream("application.yml");
            PROPERTIES = yaml.load(inputStream);
        }
        log.info("Init datasource with url: {}", PROPERTIES.get("custom.datasource.url").toString());
        log.info("Init datasource with username: {}", PROPERTIES.get("custom.datasource.username").toString());
        log.info("Init datasource with driver: {}", PROPERTIES.get("custom.datasource.driverclassname").toString());
        DataSourceBuilder dataSourceBuilder = DataSourceBuilder.create();
        dataSourceBuilder.url(PROPERTIES.get("custom.datasource.url").toString());
        dataSourceBuilder.username(PROPERTIES.get("custom.datasource.username").toString());
        dataSourceBuilder.password(PROPERTIES.get("custom.datasource.password").toString());
        dataSourceBuilder.driverClassName(PROPERTIES.get("custom.datasource.driverclassname").toString());
        return dataSourceBuilder.build();
    }

    @Bean
    public LocalContainerEntityManagerFactoryBean entityManagerFactory() {
        // JpaVendorAdapter can be autowired as well if it's configured in application properties.
        HibernateJpaVendorAdapter vendorAdapter = new HibernateJpaVendorAdapter();
        vendorAdapter.setGenerateDdl(false);
        LocalContainerEntityManagerFactoryBean factory = new LocalContainerEntityManagerFactoryBean();
        factory.setJpaVendorAdapter(vendorAdapter);
        // Add packages to scan for entities.
        factory.setPackagesToScan("fr.microservice2.database", "fr.microservice.common");
        factory.setDataSource(dataSource);
        return factory;
    }

    @Bean
    public PlatformTransactionManager transactionManager(EntityManagerFactory entityManagerFactory) {
        JpaTransactionManager txManager = new JpaTransactionManager();
        txManager.setEntityManagerFactory(entityManagerFactory);
        return txManager;
    }
}
I don't know why only this microservice doesn't find my table computed.fluxeventlogging, while I have plenty of other tables in this schema where the problem does not come up.
Does anyone have an idea, please?
Thank you and best regards
I'm working with multiple data sources (Oracle and SQL Server) in a Spring Boot REST application. In this application, more than 25 endpoints exist to process client requests. But when one of the databases is down, i.e. Oracle or SQL Server is not available for some reason, my application is unable to start.
I looked at a couple of examples on Google and Stack Overflow, but they're different from what I'm looking for...
package com.foobar;
@Configuration
@EnableTransactionManagement
@EnableJpaRepositories(
        entityManagerFactoryRef = "entityManagerFactory",
        basePackages = { "com.foobar.foo.repo" }
)
public class FooDbConfig {

    @Primary
    @Bean(name = "dataSource")
    @ConfigurationProperties(prefix = "spring.datasource")
    public DataSource dataSource() {
        return DataSourceBuilder.create().build();
    }

    @Primary
    @Bean(name = "entityManagerFactory")
    public LocalContainerEntityManagerFactoryBean entityManagerFactory(
            EntityManagerFactoryBuilder builder,
            @Qualifier("dataSource") DataSource dataSource) {
        return builder
                .dataSource(dataSource)
                .packages("com.foobar.foo.domain")
                .persistenceUnit("foo")
                .build();
    }

    @Primary
    @Bean(name = "transactionManager")
    public PlatformTransactionManager transactionManager(
            @Qualifier("entityManagerFactory") EntityManagerFactory entityManagerFactory) {
        return new JpaTransactionManager(entityManagerFactory);
    }
}
The same configuration exists for the 2nd data source, but with different properties.
I'm using the example below as a base code reference to implement my requirements:
Example link
I'm looking for a solution where, if only one DB server out of N is available, the application still starts and processes client requests, and whenever the 2nd DB server becomes available it connects automatically and processes the other endpoints' requests.
I recently created a solution for multitenancy with data sources, using Liquibase; if you don't use Liquibase, just removing that part works too!
Example of application.yml
spring:
  dataSources:
    - tenantId: db1
      url: jdbc:postgresql://localhost:5432/db1
      username: postgres
      password: 123456
      driver-class-name: org.postgresql.Driver
      liquibase:
        enabled: true
        default-schema: public
        change-log: classpath:db/master/changelog/db.changelog-master.yaml
    - tenantId: db2
      url: jdbc:postgresql://localhost:5432/db2
      username: postgres
      password: 123456
      driver-class-name: org.postgresql.Driver
    - tenantId: db3
      url: jdbc:postgresql://localhost:5432/db3
      username: postgres
      password: 123456
      driver-class-name: org.postgresql.Driver
DataSourceConfiguration
@Configuration
@EnableTransactionManagement
@EntityScan(basePackages = { "br.com.dijalmasilva.springbootmultitenancyliquibase" })
@EnableJpaRepositories(basePackages = { "br.com.dijalmasilva.springbootmultitenancyliquibase" })
public class DataSourceConfiguration {

    @Bean(name = "dataSources")
    @Primary
    public Map<Object, Object> getDataSources(DataSourceProperties dataSourceProperties) {
        return dataSourceProperties.getDataSources().stream().map(dataSourceProperty -> {
            DataSource dataSource = DataSourceBuilder.create()
                    .url(dataSourceProperty.getUrl())
                    .username(dataSourceProperty.getUsername())
                    .password(dataSourceProperty.getPassword())
                    .driverClassName(dataSourceProperty.getDriverClassName())
                    .build();
            return new TenantIdDataSource(dataSourceProperty.getTenantId(), dataSource);
        }).collect(Collectors.toMap(TenantIdDataSource::getTenantId, TenantIdDataSource::getDataSource));
    }

    @Bean(name = "tenantRoutingDataSource")
    @DependsOn("dataSources")
    public DataSource dataSource(Map<Object, Object> dataSources) {
        AbstractRoutingDataSource tenantRoutingDataSource = new TenantRoutingDataSource();
        tenantRoutingDataSource.setTargetDataSources(dataSources);
        tenantRoutingDataSource.setDefaultTargetDataSource(dataSources.get("db1"));
        tenantRoutingDataSource.afterPropertiesSet();
        return tenantRoutingDataSource;
    }

    @Data
    @AllArgsConstructor
    private class TenantIdDataSource {
        private Object tenantId;
        private Object dataSource;
    }
}
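Stripped of the Spring types, the getDataSources bean above just folds a list of (tenantId, dataSource) pairs into a lookup map keyed by tenant id. A minimal plain-Java sketch of that stream shape (TenantPair stands in for TenantIdDataSource, and strings stand in for the real DataSource objects):

```java
import java.util.List;
import java.util.Map;
import java.util.stream.Collectors;

// Plain-Java sketch of the tenant-id -> data-source map built in getDataSources.
class TenantMapSketch {

    static class TenantPair {
        final Object tenantId;
        final Object dataSource;

        TenantPair(Object tenantId, Object dataSource) {
            this.tenantId = tenantId;
            this.dataSource = dataSource;
        }
    }

    // Fold the configured pairs into the map handed to the routing data source
    static Map<Object, Object> toMap(List<TenantPair> pairs) {
        return pairs.stream()
                .collect(Collectors.toMap(p -> p.tenantId, p -> p.dataSource));
    }
}
```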
TenantRoutingDataSource
public class TenantRoutingDataSource extends AbstractRoutingDataSource {

    @Override
    protected Object determineCurrentLookupKey() {
        return TenantContext.getCurrentTenant();
    }
}
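TenantContext, referenced in determineCurrentLookupKey, is not shown in the answer; a common ThreadLocal-based sketch of it (an assumption, not the project's actual code) looks like this:

```java
// Sketch of a ThreadLocal-backed tenant holder: each request thread sets its
// tenant id before touching the database, and the routing data source reads it.
class TenantContext {

    private static final ThreadLocal<String> CURRENT = new ThreadLocal<>();

    static void setCurrentTenant(String tenantId) {
        CURRENT.set(tenantId);
    }

    static String getCurrentTenant() {
        return CURRENT.get();
    }

    static void clear() {
        CURRENT.remove(); // avoid leaking the tenant id across pooled threads
    }
}
```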
DataSourceProperties
@Data
@Component
@ConfigurationProperties(prefix = "spring")
public class DataSourceProperties {
    private List<DataSourceProperty> dataSources = new ArrayList<>();
}
DataSourceProperty
@Data
public class DataSourceProperty {
    private String tenantId;
    private String url;
    private String username;
    private String password;
    private String driverClassName;
    private LiquibaseProperties liquibase;
}
See the complete code; maybe it will help you!
Link of the project: https://github.com/dijalmasilva/spring-boot-multitenancy-datasource-liquibase
What I need is 2 repositories created out of a single entity:

interface TopicRepository extends ReactiveCrudRepository<Topic, String>
interface BackupTopicRepository extends ReactiveCrudRepository<Topic, String>

How is that possible? Right now only one is created.
This is how you would do it.
@Configuration
@ConfigurationProperties(prefix = "mongodb.topic")
@EnableMongoRepositories(basePackages = "abc.def.repository.topic", mongoTemplateRef = "topicMongoTemplate")
@Setter
class TopicMongoConfig {

    private String host;
    private int port;
    private String database;

    @Primary
    @Bean(name = "topicMongoTemplate")
    public MongoTemplate topicMongoTemplate() throws Exception {
        final Mongo mongoClient = createMongoClient(new ServerAddress(host, port));
        return new MongoTemplate(mongoClient, database);
    }

    private Mongo createMongoClient(ServerAddress serverAddress) {
        return new MongoClient(serverAddress);
    }
}
Another configuration
@Configuration
@ConfigurationProperties(prefix = "mongodb.backuptopic")
@EnableMongoRepositories(basePackages = "abc.def.repository.backuptopic", mongoTemplateRef = "backupTopicMongoTemplate")
@Setter
class BackupTopicMongoConfig {

    private String host;
    private int port;
    private String database;

    @Primary
    @Bean(name = "backupTopicMongoTemplate")
    public MongoTemplate backupTopicMongoTemplate() throws Exception {
        final Mongo mongoClient = createMongoClient(new ServerAddress(host, port));
        return new MongoTemplate(mongoClient, database);
    }

    private Mongo createMongoClient(ServerAddress serverAddress) {
        return new MongoClient(serverAddress);
    }
}
Your TopicRepository and BackupTopicRepository should reside in abc.def.repository.topic and abc.def.repository.backuptopic respectively.
You also need these properties defined in your properties or yml file:

mongodb:
  topic:
    host:
    database:
    port:
  backuptopic:
    host:
    database:
    port:
Lastly, disable Spring Boot autoconfiguration for Mongo:

@SpringBootApplication(exclude = {MongoAutoConfiguration.class, MongoDataAutoConfiguration.class})