How to set up Liquibase in Spring for multiple data sources?

I need to set up Liquibase for two data sources in Spring. At the moment it seems that only one Liquibase setup is possible, and I can only choose which one of the data sources it runs against.

If you are using Spring Boot, here is a setup that can help you:
Configuration class:
@Configuration
public class DatasourceConfig {

    @Primary
    @Bean
    @ConfigurationProperties(prefix = "datasource.primary")
    public DataSource primaryDataSource() {
        return DataSourceBuilder.create().build();
    }

    @Bean
    @ConfigurationProperties(prefix = "datasource.primary.liquibase")
    public LiquibaseProperties primaryLiquibaseProperties() {
        return new LiquibaseProperties();
    }

    @Bean
    public SpringLiquibase primaryLiquibase() {
        return springLiquibase(primaryDataSource(), primaryLiquibaseProperties());
    }

    @Bean
    @ConfigurationProperties(prefix = "datasource.secondary")
    public DataSource secondaryDataSource() {
        return DataSourceBuilder.create().build();
    }

    @Bean
    @ConfigurationProperties(prefix = "datasource.secondary.liquibase")
    public LiquibaseProperties secondaryLiquibaseProperties() {
        return new LiquibaseProperties();
    }

    @Bean
    public SpringLiquibase secondaryLiquibase() {
        return springLiquibase(secondaryDataSource(), secondaryLiquibaseProperties());
    }

    // Builds a SpringLiquibase instance from a DataSource and its Liquibase properties.
    private static SpringLiquibase springLiquibase(DataSource dataSource, LiquibaseProperties properties) {
        SpringLiquibase liquibase = new SpringLiquibase();
        liquibase.setDataSource(dataSource);
        liquibase.setChangeLog(properties.getChangeLog());
        liquibase.setContexts(properties.getContexts());
        liquibase.setDefaultSchema(properties.getDefaultSchema());
        liquibase.setDropFirst(properties.isDropFirst());
        liquibase.setShouldRun(properties.isEnabled());
        liquibase.setLabels(properties.getLabels());
        liquibase.setChangeLogParameters(properties.getParameters());
        liquibase.setRollbackFile(properties.getRollbackFile());
        return liquibase;
    }
    ...
}
properties.yml
datasource:
  primary:
    url: jdbc:mysql://localhost/primary
    username: username
    password: password
    liquibase:
      change-log: classpath:/db/changelog/db.primary.changelog-master.xml
  secondary:
    url: jdbc:mysql://localhost/secondary
    username: username
    password: password
    liquibase:
      change-log: classpath:/db/changelog/db.secondary.changelog-master.xml

I've built a project where you can create multiple data sources, each with its own changesets, so if you need to add another data source you only change your application.yml; no code changes are needed.
Configuration class
@Configuration
@ConditionalOnProperty(prefix = "spring.liquibase", name = "enabled", matchIfMissing = true)
@EnableConfigurationProperties(LiquibaseProperties.class)
@AllArgsConstructor
public class LiquibaseConfiguration {

    private LiquibaseProperties properties;
    private DataSourceProperties dataSourceProperties;

    @Bean
    @DependsOn("tenantRoutingDataSource")
    public MultiTenantDataSourceSpringLiquibase liquibaseMultiTenancy(Map<Object, Object> dataSources,
            @Qualifier("taskExecutor") TaskExecutor taskExecutor) {
        // The TaskExecutor lets the Liquibase changesets run asynchronously.
        MultiTenantDataSourceSpringLiquibase liquibase = new MultiTenantDataSourceSpringLiquibase(taskExecutor);
        dataSources.forEach((tenant, dataSource) -> liquibase.addDataSource((String) tenant, (DataSource) dataSource));
        dataSourceProperties.getDataSources().forEach(dbProperty -> {
            if (dbProperty.getLiquibase() != null) {
                liquibase.addLiquibaseProperties(dbProperty.getTenantId(), dbProperty.getLiquibase());
            }
        });
        liquibase.setContexts(properties.getContexts());
        liquibase.setChangeLog(properties.getChangeLog());
        liquibase.setDefaultSchema(properties.getDefaultSchema());
        liquibase.setDropFirst(properties.isDropFirst());
        liquibase.setShouldRun(properties.isEnabled());
        return liquibase;
    }
}
application.yml
spring:
  dataSources:
    - tenantId: db1
      url: jdbc:postgresql://localhost:5432/db1
      username: postgres
      password: 123456
      driver-class-name: org.postgresql.Driver
      liquibase:
        enabled: true
        default-schema: public
        change-log: classpath:db/master/changelog/db.changelog-master.yaml
    - tenantId: db2
      url: jdbc:postgresql://localhost:5432/db2
      username: postgres
      password: 123456
      driver-class-name: org.postgresql.Driver
    - tenantId: db3
      url: jdbc:postgresql://localhost:5432/db3
      username: postgres
      password: 123456
      driver-class-name: org.postgresql.Driver
 
Repository link: https://github.com/dijalmasilva/spring-boot-multitenancy-datasource-liquibase

I needed to support a dynamic number of DataSources, not a fixed one. I found that you can reuse the same SpringLiquibase bean for multiple DataSources by creating a service like this:
@Service
@DependsOn("liquibase")
public class LiquibaseService {

    @Autowired
    @Qualifier("liquibase")
    private SpringLiquibase liquibase;

    @PostConstruct
    public void initialize() {
        /* Obtain the DataSources from wherever. I obtain them from a master DB. It's up to you. */
        List<DataSource> dataSources = obtainDataSources();
        for (DataSource dataSource : dataSources) {
            try {
                liquibase.setDataSource(dataSource);
                liquibase.setChangeLog("classpath:liquibase/emp.changelog.xml");
                liquibase.setShouldRun(true);
                // This runs Liquibase against the current DataSource.
                liquibase.afterPropertiesSet();
            } catch (LiquibaseException ex) {
                throw new RuntimeException(ex);
            }
        }
    }
}
For this to work, you should have a SpringLiquibase bean declared somewhere. In this example, I have it in one of my configuration files:
@Bean
public SpringLiquibase liquibase(LiquibaseProperties properties) {
    SpringLiquibase liquibase = new SpringLiquibase();
    liquibase.setDataSource(systemDataSource);
    liquibase.setChangeLog("classpath:liquibase/sis.changelog.xml");
    liquibase.setContexts(properties.getContexts());
    liquibase.setDefaultSchema(properties.getDefaultSchema());
    liquibase.setDropFirst(properties.isDropFirst());
    liquibase.setLabels(properties.getLabels());
    liquibase.setChangeLogParameters(properties.getParameters());
    liquibase.setRollbackFile(properties.getRollbackFile());
    // shouldRun is false because we run the process manually. Don't let SpringLiquibase do it.
    liquibase.setShouldRun(false);
    return liquibase;
}
The above highly depends on your DataSource configuration requirements. You may also need to put this on your main Application class so the Spring Liquibase auto-configuration doesn't kick in:
@SpringBootApplication(exclude = {
    LiquibaseAutoConfiguration.class
})
public class Application {
    // Stuff...
}

Just define two data sources and two SpringLiquibase beans:
<bean id="liquibase1" class="liquibase.integration.spring.SpringLiquibase">
<property name="dataSource" ref="dataSource1" />
<property name="changeLog" value="classpath:db1-changelog.xml" />
</bean>
<bean id="liquibase2" class="liquibase.integration.spring.SpringLiquibase">
<property name="dataSource" ref="dataSource2" />
<property name="changeLog" value="classpath:db2-changelog.xml" />
</bean>

You can also run multiple Liquibase instances (i.e., you are not limited to just a primary and a secondary).
For example, your Java configuration can have:
@Bean
@ConfigurationProperties(prefix = "liquibase1")
...
@Bean
@ConfigurationProperties(prefix = "liquibase2")
...
@Bean
@ConfigurationProperties(prefix = "liquibase3")
Your application.properties can have:
liquibase1.default-schema=schemaA
...
liquibase2.default-schema=schemaB
...
liquibase3.default-schema=schemaC
...
These SpringLiquibase instances can use the same DataSource or different DataSources, however you like.
Running order? I haven't found any official documentation, but from my observation while debugging, the Liquibase migrations run in the order you declare them in application.properties. If you want to run a migration against one data source, then another, and then come back to the first and run something else, you may want to try this multiple-instance approach.
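For illustration, here is a minimal sketch of what one of those beans might look like, assuming a liquibase1 prefix and an existing dataSource bean (the bean names and the change-log property are hypothetical):
@Bean
@ConfigurationProperties(prefix = "liquibase1")
public LiquibaseProperties liquibase1Properties() {
    return new LiquibaseProperties();
}

@Bean
public SpringLiquibase liquibase1(DataSource dataSource) {
    // Each SpringLiquibase bean gets its own change log and default schema,
    // but several of them may point at the same DataSource.
    SpringLiquibase liquibase = new SpringLiquibase();
    liquibase.setDataSource(dataSource);
    liquibase.setChangeLog(liquibase1Properties().getChangeLog());
    liquibase.setDefaultSchema(liquibase1Properties().getDefaultSchema());
    return liquibase;
}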

Related

Camel route to read data from H2 DB in Spring boot app

My Spring Boot app is using a route to read from a MySQL DB.
from("sql:select * from Students?repeatCount=1").convertBodyTo(String.class)
.to("file:outbox");
Now I want to create a route to read from the in-memory H2 DB, but I'm not sure which Camel components to use and how to create the route.
Since you have Spring Boot, you can reference a specific data source via the dataSource=#mysql parameter: introduce separate DataSource beans and use whichever one you need:
@Configuration
public class DataSourceConfig {

    @Bean(name = "mysqlProperties")
    @ConfigurationProperties("spring.datasource.mysql")
    public DataSourceProperties mysqlDataSourceProperties() {
        return new DataSourceProperties();
    }

    @Bean(name = "mysql")
    public DataSource mysqlDataSource(@Qualifier("mysqlProperties") DataSourceProperties properties) {
        return properties
                .initializeDataSourceBuilder()
                .build();
    }

    @Bean(name = "h2Properties")
    @ConfigurationProperties("spring.datasource.h2")
    public DataSourceProperties h2DataSourceProperties() {
        return new DataSourceProperties();
    }

    @Bean(name = "h2")
    public DataSource h2DataSource(@Qualifier("h2Properties") DataSourceProperties properties) {
        return properties
                .initializeDataSourceBuilder()
                .build();
    }
}
Afterwards you can declare the different DataSources in your application.yml, but don't forget to disable/exclude the data source auto-configuration.
spring:
  autoconfigure:
    exclude:
      - org.springframework.boot.autoconfigure.jdbc.DataSourceAutoConfiguration
  datasource:
    mysql:
      url: jdbc:mysql://localhost:3306/db_example
      username: user
      password: pass
      driver-class-name: com.mysql.cj.jdbc.Driver
    h2:
      url: jdbc:h2:mem:h2testdb
      username: sa
      password: sa
      driver-class-name: org.h2.Driver
Your Camel route should then look like this to use everything properly:
from("sql:select * from Students?repeatCount=1&dataSource=#mysql")
.convertBodyTo(String.class)
.to("file:outbox");

Spring Boot 2.0 JUnit test @Transactional rollback doesn't work

I migrated some projects from Spring Boot 1.5.10 to 2.0 (Spring Boot 2.0.3.RELEASE, JDK 10, MySQL, HikariCP).
After this migration, all data written by the JUnit test cases remains in the database. I think @Transactional (org.springframework.transaction.annotation.Transactional) isn't working.
Here is part of the application.yml and the test class.
spring:
  datasource:
    driver-class-name: com.mysql.jdbc.Driver
  jpa:
    show-sql: true
    hibernate:
      ddl-auto: none
    database: mysql
    database-platform: org.hibernate.dialect.MySQL5Dialect
Here is the data source configuration.
@Configuration
@EnableTransactionManagement
public class DatasourceJPAConfig {

    @Bean
    @Primary
    @ConfigurationProperties(prefix = "spring.datasource")
    public DataSource dataSource() {
        return DataSourceBuilder.create().build();
    }
}
Here is part of the JUnit test class.
@Transactional
@Rollback(true)
@RunWith(SpringRunner.class)
@ActiveProfiles("local")
@SpringBootTest
public class RepoTests {

    @Autowired
    private TestRepository testRepository;

    @Test
    public void saveTest() {
        var name = "test";
        var description = "test description";
        var item = TestDomain.builder()
                .name(name)
                .description(description)
                .build();
        testRepository.save(item);
        var optional = testRepository.findById(item.getId());
        assertTrue(optional.isPresent());
        assertEquals(optional.get().getDescription(), description);
        assertEquals(optional.get().getName(), name);
    }
}
After running the saveTest method, the row count in the database increases by 1.
Add the DataSource to the test and set auto-commit to false:
@Autowired
private DataSource dataSource;
And inside the test:
((HikariDataSource)dataSource).setAutoCommit(false);
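Putting it together, a minimal sketch of the adjusted test class might look like this (assuming HikariCP is the connection pool, as in the question; the @Before hook is just one place to flip the flag before each test runs):
@Transactional
@RunWith(SpringRunner.class)
@ActiveProfiles("local")
@SpringBootTest
public class RepoTests {

    @Autowired
    private DataSource dataSource;

    @Autowired
    private TestRepository testRepository;

    @Before
    public void disableAutoCommit() {
        // HikariCP enables auto-commit by default, so every statement is
        // committed immediately and @Transactional has nothing to roll back.
        ((HikariDataSource) dataSource).setAutoCommit(false);
    }

    // ... test methods as above ...
}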

Non-primary embedded database schema not being initialized

I'm currently working on a Spring Boot (1.5.10.RELEASE) application that uses 2 embedded databases. These two databases are defined in the same way, but one is defined in the application itself and the other in a custom auto-configuration class that lives in an imported JAR; yet only the one flagged as @Primary (the one in the application) gets its schema initialized.
This is the current definition for both datasources:
Primary, on the application:
@Configuration
public class DataSourceConfiguration {

    @Bean
    @ConfigurationProperties(prefix = "first.datasource")
    @Primary
    public DataSourceProperties firstProperties() {
        return new DataSourceProperties();
    }

    @Bean
    @Primary
    public DataSource firstDataSource() {
        return firstProperties().initializeDataSourceBuilder().build();
    }

    @Bean
    @Primary
    public JdbcTemplate firstTemplate() {
        return new JdbcTemplate(firstDataSource());
    }
}
On the autoconfiguration:
@Configuration
@ConditionalOnProperty(name = "second.datasource.url")
public class SecondDataSourceAutoconfiguration {

    @Bean
    @ConfigurationProperties(prefix = "second.datasource")
    public DataSourceProperties secondProperties() {
        return new DataSourceProperties();
    }

    @Bean
    public DataSource secondDataSource() {
        return secondProperties().initializeDataSourceBuilder().build();
    }

    @Bean
    public JdbcTemplate secondJdbcTemplate(@Qualifier("secondDataSource") DataSource datasource) {
        return new JdbcTemplate(datasource);
    }
}
And my application.yml fills the properties:
first:
  datasource:
    url: jdbc:h2:firstdb;DB_CLOSE_ON_EXIT=FALSE
second:
  datasource:
    url: jdbc:h2:seconddb
    platform: h2
My resources folder contains both a schema.sql, which is executed against firstdb, and a schema-h2.sql, which should be executed against seconddb but is not. I tried playing around with the datasource.schema and datasource.initialize properties, switching the script names and the platform property to the first data source (in that case schema-h2.sql gets executed against firstdb, but nothing against seconddb), and changing the embedded database provider to HSQLDB; but I can't get the schema for the non-primary in-memory DB initialized in any way.

Disable production datasource autoconfiguration in a Spring DataJpaTest

In my application.yml, I have the following configuration (to be able to customize variables for different environments with docker/docker-compose):
spring:
  datasource:
    url: ${SPRING_DATASOURCE_URL}
    username: ${SPRING_DATASOURCE_USERNAME}
    password: ${SPRING_DATASOURCE_PASSWORD}
The trouble is that Spring tries to auto-configure this data source while I am in a @DataJpaTest, so with an embedded H2 database, and obviously it does not like the placeholders.
I tried to exclude some auto-configurations:
@DataJpaTest(excludeAutoConfiguration = {
    DataSourceAutoConfiguration.class,
    DataSourceTransactionManagerAutoConfiguration.class,
    HibernateJpaAutoConfiguration.class})
But then nothing works; the entityManagerFactory is missing, ...
I could probably use profiles, but I would prefer another solution if possible.
Did you try defining your own DataSource bean?
@Configuration
public class JpaTestConfiguration {
    //...
    @Bean
    public DataSource dataSource() {
        DriverManagerDataSource dataSource = new DriverManagerDataSource();
        dataSource.setDriverClassName("org.h2.Driver");
        dataSource.setUrl("jdbc:h2:mem:db;DB_CLOSE_DELAY=-1");
        dataSource.setUsername("sa");
        dataSource.setPassword("");
        return dataSource;
    }
    //...
}
Then import that configuration into your test class:
@Import(JpaTestConfiguration.class)
@DataJpaTest
public class YourRepositoryTest {
    //...
}

Spring Cloud Task - specify database config

I have a Spring Cloud Task that loads data from SQL Server into Cassandra, which will be run on Spring Cloud Data Flow.
One of the requirements of Spring Cloud Task is to provide a relational database to persist metadata such as task execution state. But I don't want to use either of the above databases for that; instead I have to specify a third database for persistence. However, it seems that Spring Cloud Task automatically picks up the SQL Server data source properties from application.properties. How can I specify another DB for task state persistence?
My current properties:
spring.datasource.url=jdbc:sqlserver://iphost;databaseName=dbname
spring.datasource.username=user
spring.datasource.password=password
spring.datasource.driverClassName=com.microsoft.sqlserver.jdbc.SQLServerDriver
spring.jpa.show-sql=false
#spring.jpa.hibernate.dialect=org.hibernate.dialect.SQLServer2012Dialect
spring.jpa.hibernate.naming.physical-strategy=org.hibernate.boot.model.naming.PhysicalNamingStrategyStandardImpl
spring.jpa.hibernate.ddl-auto=none
spring.data.cassandra.contact-points=ip
spring.data.cassandra.port=9042
spring.data.cassandra.username=username
spring.data.cassandra.password=password
spring.data.cassandra.keyspace-name=mykeyspace
spring.data.cassandra.schema-action=CREATE_IF_NOT_EXISTS
Update 1:
I added the code below to point to the third database, as suggested by Michael Minella. Now Spring Cloud Task is able to connect to this DB and persist state. But now my batch job's source queries are also connecting to this database. The only thing I changed was to add a data source for the task.
spring.task.datasource.url=jdbc:postgresql://host:5432/testdb?stringtype=unspecified
spring.task.datasource.username=user
spring.task.datasource.password=password
spring.task.datasource.driverClassName=org.postgresql.Driver
@Configuration
public class DataSourceConfigs {

    @Bean(name = "taskDataSource")
    @ConfigurationProperties(prefix = "spring.task.datasource")
    public DataSource getDataSource() {
        return DataSourceBuilder.create().build();
    }
}
@Configuration
public class DDTaskConfigurer extends DefaultTaskConfigurer {

    @Autowired
    public DDTaskConfigurer(@Qualifier("taskDataSource") DataSource dataSource) {
        super(dataSource);
    }
}
Update #2:
@Component
@StepScope
public class MyItemReader extends RepositoryItemReader<Scan> implements InitializingBean {

    @Autowired
    private ScanRepository repository;

    private Integer lastScanIdPulled = null;

    public MyItemReader(Integer _lastIdPulled) {
        super();
        if (_lastIdPulled == null || _lastIdPulled <= 0) {
            lastScanIdPulled = 0;
        } else {
            lastScanIdPulled = _lastIdPulled;
        }
    }

    @PostConstruct
    protected void setUpRepo() {
        final Map<String, Sort.Direction> sorts = new HashMap<>();
        sorts.put("id", Direction.ASC);
        this.setRepository(this.repository);
        this.setSort(sorts);
        this.setMethodName("findByScanGreaterThanId");
        List<Object> methodArgs = new ArrayList<Object>();
        System.out.println("lastScanIdPulled >>> " + lastScanIdPulled);
        if (lastScanIdPulled == null || lastScanIdPulled <= 0) {
            lastScanIdPulled = 0;
        }
        methodArgs.add(lastScanIdPulled);
        this.setArguments(methodArgs);
    }
}
@Repository
public interface ScanRepository extends JpaRepository<Scan, Integer> {

    @Query("...")
    Page<Scan> findAllScan(final Pageable pageable);

    @Query("...")
    Page<Scan> findByScanGreaterThanId(int id, final Pageable pageable);
}
Update #3:
If I add a data source configuration for the repository, I now get the exception below. Before you mention that one of the data sources needs to be declared @Primary: I already tried that.
Caused by: java.lang.IllegalStateException: Expected one datasource and found 2
at org.springframework.cloud.task.batch.configuration.TaskBatchAutoConfiguration$TaskBatchExecutionListenerAutoconfiguration.taskBatchExecutionListener(TaskBatchAutoConfiguration.java:65) ~[spring-cloud-task-batch-1.0.3.RELEASE.jar:1.0.3.RELEASE]
at org.springframework.cloud.task.batch.configuration.TaskBatchAutoConfiguration$TaskBatchExecutionListenerAutoconfiguration$$EnhancerBySpringCGLIB$$baeae6b9.CGLIB$taskBatchExecutionListener$0(<generated>) ~[spring-cloud-task-batch-1.0.3.RELEASE.jar:1.0.3.RELEASE]
at org.springframework.cloud.task.batch.configuration.TaskBatchAutoConfiguration$TaskBatchExecutionListenerAutoconfiguration$$EnhancerBySpringCGLIB$$baeae6b9$$FastClassBySpringCGLIB$$5a898c9.invoke(<generated>) ~[spring-cloud-task-batch-1.0.3.RELEASE.jar:1.0.3.RELEASE]
at org.springframework.cglib.proxy.MethodProxy.invokeSuper(MethodProxy.java:228) ~[spring-core-4.3.14.RELEASE.jar:4.3.14.RELEASE]
at org.springframework.context.annotation.ConfigurationClassEnhancer$BeanMethodInterceptor.intercept(ConfigurationClassEnhancer.java:358) ~[spring-context-4.3.14.RELEASE.jar:4.3.14.RELEASE]
at org.springframework.cloud.task.batch.configuration.TaskBatchAutoConfigu
@Configuration
@EnableTransactionManagement
@EnableJpaRepositories(
        entityManagerFactoryRef = "myEntityManagerFactory",
        basePackages = { "com.company.dd.collector.tool" },
        transactionManagerRef = "TransactionManager"
)
public class ToolDbConfig {

    @Bean(name = "myEntityManagerFactory")
    public LocalContainerEntityManagerFactoryBean myEntityManagerFactory(
            EntityManagerFactoryBuilder builder,
            @Qualifier("ToolDataSource") DataSource dataSource) {
        return builder
                .dataSource(dataSource)
                .packages("com.company.dd.collector.tool")
                .persistenceUnit("tooldatasource")
                .build();
    }

    @Bean(name = "myTransactionManager")
    public PlatformTransactionManager transactionManager(
            @Qualifier("myEntityManagerFactory") EntityManagerFactory entityManagerFactory) {
        return new JpaTransactionManager(entityManagerFactory);
    }
}
@Configuration
public class DataSourceConfigs {

    @Bean(name = "taskDataSource")
    @ConfigurationProperties(prefix = "spring.task.datasource")
    public DataSource getDataSource() {
        return DataSourceBuilder.create().build();
    }

    @Primary
    @Bean(name = "ToolDataSource")
    @ConfigurationProperties(prefix = "tool.datasource")
    public DataSource dataSource() {
        return DataSourceBuilder.create().build();
    }
}
You need to create a TaskConfigurer to specify the DataSource to be used. You can read about this interface in the documentation here: https://docs.spring.io/spring-cloud-task/1.1.1.RELEASE/reference/htmlsingle/#features-task-configurer
The javadoc can be found here: https://docs.spring.io/spring-cloud-task/docs/current/apidocs/org/springframework/cloud/task/configuration/TaskConfigurer.html
UPDATE 1:
When using more than one DataSource, both Spring Batch and Spring Cloud Task follow the same paradigm: each has a *Configurer interface that is used to specify which DataSource to use. For Spring Batch you use the BatchConfigurer (typically by just extending DefaultBatchConfigurer), and, as noted above, the TaskConfigurer is used in Spring Cloud Task. This is because when there is more than one DataSource, the framework has no way of knowing which one to use. A sketch of such a BatchConfigurer is shown below.
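For illustration, a minimal sketch of a BatchConfigurer that mirrors the DDTaskConfigurer from the question, reusing its "taskDataSource" qualifier (adjust to whichever DataSource should hold the Batch metadata tables):
@Configuration
public class DDBatchConfigurer extends DefaultBatchConfigurer {

    @Autowired
    public DDBatchConfigurer(@Qualifier("taskDataSource") DataSource dataSource) {
        // Points Spring Batch's job repository and transaction manager at the
        // chosen metadata DataSource instead of letting the framework guess.
        super(dataSource);
    }
}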
