Spring and Flyway - migrate before application context starts up

As the title suggests, I am looking for any means of running Flyway migrations before Spring's application context (the persistence context, to be precise) is loaded. The reason is that I have a few queries that run at application startup, and my tests fail because those queries execute against database tables that do not exist yet. I am using H2 as my test database. Right now I am using only the Flyway core dependency:
<dependency>
    <groupId>org.flywaydb</groupId>
    <artifactId>flyway-core</artifactId>
    <version>6.5.0</version>
    <scope>test</scope>
</dependency>
and I have a single Flyway configuration class as follows:
@Configuration
class FlywayConfig {

    private static final String resourcePath = "classpath:flyway/migrations";
    private static final String sampleDataPath = "classpath:flyway/sample_data";

    @Bean
    Flyway flyway(@Value("${spring.datasource.url}") String url,
                  @Value("${spring.datasource.username}") String username,
                  @Value("${spring.datasource.password}") String password) {
        FluentConfiguration fluentConfiguration = Flyway.configure().dataSource(url, username, password);
        fluentConfiguration.locations(resourcePath, sampleDataPath);
        Flyway flyway = fluentConfiguration.load();
        return flyway;
    }
}
and the properties are defined in application.yml
spring:
  datasource:
    username: sa
    password: sa
    url: 'jdbc:h2:mem:testdb;Mode=Oracle;IGNORE_CATALOGS=TRUE;DB_CLOSE_DELAY=-1;'
    platform: h2
  h2:
    console:
      enabled: true
  jpa:
    show-sql: true
What I would like to achieve, in that particular order, is that:
1. Flyway does the migration
2. the Spring context loads up

I managed to accomplish what I wanted by creating the DataSource object manually in a configuration class (instead of letting Spring build it from application.yml) and putting @DependsOn("flyway") on the DataSource bean. This way I made sure that any database connection from the application context is only established after the migration has run in the Flyway bean (which, by the way, I also tweaked). Previously I was cleaning and migrating Flyway right before the tests; now this happens while the application context beans are being initialized. Here is the code that worked for me:
@Configuration
class DatabaseConfig {

    private static final String resourcePath = "classpath:flyway/migrations";
    private static final String sampleDataPath = "classpath:flyway/sample_data";
    private static final String dataSourceUrl = "jdbc:h2:mem:testdb;Mode=Oracle;IGNORE_CATALOGS=TRUE;DB_CLOSE_DELAY=-1;";
    private static final String username = "sa";
    private static final String password = "sa";

    @Bean("flyway")
    public Flyway flyway() {
        FluentConfiguration fluentConfiguration = Flyway.configure().dataSource(dataSourceUrl, username, password);
        fluentConfiguration.locations(resourcePath, sampleDataPath);
        Flyway flyway = fluentConfiguration.load();
        flyway.clean();
        flyway.migrate();
        return flyway;
    }

    @DependsOn("flyway")
    @Bean
    public DataSource dataSource() {
        DataSourceBuilder dataSourceBuilder = DataSourceBuilder.create();
        dataSourceBuilder.driverClassName("org.h2.Driver");
        dataSourceBuilder.url(dataSourceUrl);
        dataSourceBuilder.username(username);
        dataSourceBuilder.password(password);
        return dataSourceBuilder.build();
    }
}
and here is the application.yml file (I got rid of the datasource-related entries):
spring:
  h2:
    console:
      enabled: true
  jpa:
    show-sql: true
  flyway:
    enabled: false
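For completeness: if you keep Spring Boot's Flyway auto-configuration enabled instead of disabling it, the same clean-then-migrate behaviour can be expressed with a FlywayMigrationStrategy bean, and Boot then orders the migration before JPA initialisation on its own. This is only a sketch under that assumption (spring.flyway.enabled: true and spring.flyway.locations pointing at the same folders), not the setup I ended up using:
@Configuration
class FlywayStrategyConfig {

    // Sketch: relies on Spring Boot's Flyway auto-configuration, which makes the
    // JPA EntityManagerFactory depend on the Flyway migration having run.
    @Bean
    FlywayMigrationStrategy cleanMigrateStrategy() {
        return flyway -> {
            flyway.clean();
            flyway.migrate();
        };
    }
}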

Related

Amazon RDS read replica configuration for a Postgres database from a Spring Boot application deployed on PCF?

Hi all, we currently have our Spring Boot code deployed to PCF, which is running on AWS.
We are using an AWS database; there is a CUPS service whose VCAP_SERVICES entry holds the DB parameters.
Below is our configuration to get the DataSource:
@Bean
public DataSource dataSource() {
    if (dataSource == null) {
        dataSource = connectionFactory().dataSource();
        configureDataSource(dataSource);
    }
    return dataSource;
}

@Bean
public JdbcTemplate jdbcTemplate() {
    return new JdbcTemplate(dataSource());
}

private void configureDataSource(DataSource dataSource) {
    org.apache.tomcat.jdbc.pool.DataSource tomcatDataSource = asTomcatDatasource(dataSource);
    tomcatDataSource.setTestOnBorrow(true);
    tomcatDataSource.setValidationQuery("SELECT 1");
    tomcatDataSource.setValidationInterval(30000);
    tomcatDataSource.setTestWhileIdle(true);
    tomcatDataSource.setTimeBetweenEvictionRunsMillis(60000);
    tomcatDataSource.setRemoveAbandoned(true);
    tomcatDataSource.setRemoveAbandonedTimeout(60);
    tomcatDataSource.setMaxActive(Environment.getAsInt("MAX_ACTIVE_DB_CONNECTIONS", tomcatDataSource.getMaxActive()));
}

private org.apache.tomcat.jdbc.pool.DataSource asTomcatDatasource(DataSource dataSource) {
    Objects.requireNonNull(dataSource, "There is no DataSource configured");
    DataSource targetDataSource = ((DelegatingDataSource) dataSource).getTargetDataSource();
    return (org.apache.tomcat.jdbc.pool.DataSource) targetDataSource;
}
Now that we have read replicas created, what configuration do I need to modify so our Spring Boot application uses the read replicas?
Is just @Transactional(readOnly = true) on the read call enough, so that it is taken care of automatically, or do I need to add some more configuration?
@Repository
public class PostgresSomeRepository implements SomeRepository {

    @Autowired
    public PostgresSomeRepository(JdbcTemplate jdbcTemplate, RowMapper<Consent> rowMapper) {
        this.jdbcTemplate = jdbcTemplate;
        this.rowMapper = rowMapper;
    }

    @Override
    @Transactional(readOnly = true)
    public List<SomeValue> getSomeGetCall(List<String> userIds, String applicationName, String propositionName, String since, String... types) {
        //Some Logic
        try {
            return jdbcTemplate.query(sql, rowMapper, paramList.toArray());
        } catch (DataAccessException ex) {
            throw new ErrorGettingConsent(ex.getMessage(), ex);
        }
    }
}
Note: we have not added any Spring AWS JDBC dependency.
Let's assume the cloud service name is my_db.
Map the cloud service to the application config application-cloud.yml, used by default in CF (by the way, this is better than using the connector because you can customize the datasource):
spring:
  datasource:
    type: com.zaxxer.hikari.HikariDataSource
    # my_db
    url: ${vcap.services.my_db.credentials.url}
    username: ${vcap.services.my_db.credentials.username}
    password: ${vcap.services.my_db.credentials.password}
    hikari:
      poolName: Hikari
      auto-commit: false
      data-source-properties:
        cachePrepStmts: true
        prepStmtCacheSize: 250
        prepStmtCacheSqlLimit: 2048
        useServerPrepStmts: true
  jpa:
    generate-ddl: false
    show-sql: true
Put the service in the application manifest.yml:
---
applications:
- name: my-app
  env:
    SPRING_PROFILES_ACTIVE: "cloud" # by default
  services:
  - my_db
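As for actually sending reads to the replica: @Transactional(readOnly = true) on its own does not switch the connection, so a second DataSource bound to the replica plus some routing is needed. A minimal sketch, assuming a hypothetical replica binding exposed under replica.datasource.* and a writer bean named writerDataSource (these names are illustrative, not part of the answer above):
@Configuration
public class ReplicaRoutingConfig {

    // Hypothetical replica pool: bind replica.datasource.url/username/password
    // to the read replica's credentials (e.g. another VCAP service entry).
    @Bean
    @ConfigurationProperties("replica.datasource")
    public DataSource replicaDataSource() {
        return DataSourceBuilder.create().type(HikariDataSource.class).build();
    }

    // Routes read-only transactions to the replica, everything else to the writer.
    @Bean
    @Primary
    public DataSource routingDataSource(@Qualifier("writerDataSource") DataSource writer,
                                        @Qualifier("replicaDataSource") DataSource replica) {
        AbstractRoutingDataSource routing = new AbstractRoutingDataSource() {
            @Override
            protected Object determineCurrentLookupKey() {
                return TransactionSynchronizationManager.isCurrentTransactionReadOnly()
                        ? "replica" : "writer";
            }
        };
        Map<Object, Object> targets = new HashMap<>();
        targets.put("writer", writer);
        targets.put("replica", replica);
        routing.setTargetDataSources(targets);
        routing.setDefaultTargetDataSource(writer);
        routing.afterPropertiesSet();
        // Deferring connection acquisition means the readOnly flag is already set
        // when the lookup key is evaluated.
        return new LazyConnectionDataSourceProxy(routing);
    }
}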

Camel route to read data from H2 DB in Spring boot app

My Spring Boot app is using a route to read from a MySQL DB.
from("sql:select * from Students?repeatCount=1").convertBodyTo(String.class)
.to("file:outbox");
Now I want to create a route to read from an in-memory H2 DB, but I'm not sure which Camel components to use and how to create the route.
If you are on Spring Boot, you can tell the SQL component which connection to use via the dataSource=#beanName parameter: introduce separate DataSource beans and reference the one you need:
@Configuration
public class DataSourceConfig {

    @Bean(name = "mysqlProperties")
    @ConfigurationProperties("spring.datasource.mysql")
    public DataSourceProperties mysqlDataSourceProperties() {
        return new DataSourceProperties();
    }

    @Bean(name = "mysql")
    public DataSource mysqlDataSource(@Qualifier("mysqlProperties") DataSourceProperties properties) {
        return properties
                .initializeDataSourceBuilder()
                .build();
    }

    @Bean(name = "h2Properties")
    @ConfigurationProperties("spring.datasource.h2")
    public DataSourceProperties h2DataSourceProperties() {
        return new DataSourceProperties();
    }

    @Bean(name = "h2")
    public DataSource h2DataSource(@Qualifier("h2Properties") DataSourceProperties properties) {
        return properties
                .initializeDataSourceBuilder()
                .build();
    }
}
Afterwards you can declare the different DataSources in your application.yml, but don't forget to disable/exclude the datasource auto-configuration:
spring:
  autoconfigure:
    exclude:
      - org.springframework.boot.autoconfigure.jdbc.DataSourceAutoConfiguration
  datasource:
    mysql:
      url: jdbc:mysql://localhost:3306/db_example
      username: user
      password: pass
      driver-class-name: com.mysql.cj.jdbc.Driver
    h2:
      url: jdbc:h2:mem:h2testdb
      username: sa
      password: sa
      driver-class-name: org.h2.Driver
Your Camel route should then look like this to pick the DataSource explicitly:
from("sql:select * from Students?repeatCount=1&dataSource=#mysql")
    .convertBodyTo(String.class)
    .to("file:outbox");

Micronaut - Configuring Oracle UCP Multiple Data Sources And jdbcOperations

I am developing my first Micronaut application and I'm having a problem configuring multiple Oracle datasources with UCP.
I'm following the official tutorial (https://micronaut-projects.github.io/micronaut-sql/latest/guide/), but when I try to perform a select, I get an error:
io.micronaut.transaction.jdbc.exceptions.CannotGetJdbcConnectionException: No current JDBC Connection found. Consider wrapping this call in transactional boundaries.
I checked the DataSourceFactory and the PoolDataSource is set correctly.
What am I missing?
Thanks!
micronaut:
  application:
    name: myapp
datasources:
  first:
    url: url
    connectionFactoryClassName: oracle.jdbc.pool.OracleDataSource
    username: user
    password: password
    minPoolSize: 1
    maxPoolSize: 10
  second:
    url: url
    connectionFactoryClassName: oracle.jdbc.pool.OracleDataSource
    username: user
    password: password
    minPoolSize: 1
    maxPoolSize: 10
@JdbcRepository(dialect = Dialect.ORACLE)
public abstract class MyRepository {

    @Inject
    @Named("first")
    protected final DataSource dataSource;

    protected final JdbcOperations jdbcOperations;

    public MyRepository(JdbcOperations jdbcOperations, DataSource dataSource) {
        this.dataSource = dataSource;
        this.jdbcOperations = jdbcOperations;
    }
}

@Singleton
public class MyDao extends MyRepository {

    public MyDao(JdbcOperations jdbcOperations, DataSource dataSource) {
        super(jdbcOperations, dataSource);
    }

    @Transactional
    public Long find() {
        String sql = "select * from table where id = 1";
        return this.jdbcOperations.prepareStatement(sql, st -> {
            return 1L;
        });
    }
}
The first datasource needs to be called 'default'.
Also, when using a transaction with a non-default datasource, you need to use @TransactionalAdvice with the datasource name, e.g. @TransactionalAdvice("second").
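A sketch of how that advice could be applied to the code above, assuming the first datasource has been renamed to default in the YAML and the second one is addressed by name (this is an illustration of the answer, not a verified setup):
// Hypothetical DAO variant following the advice above: methods that must run on
// the non-default datasource name it explicitly via @TransactionalAdvice.
@Singleton
public class MySecondDao extends MyRepository {

    public MySecondDao(@Named("second") JdbcOperations jdbcOperations,
                       @Named("second") DataSource dataSource) {
        super(jdbcOperations, dataSource);
    }

    @TransactionalAdvice("second")
    public Long find() {
        String sql = "select id from some_table where id = 1";
        return this.jdbcOperations.prepareStatement(sql, st -> 1L);
    }
}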

Spring Boot 2.0 JUnit test @Transactional rollback doesn't work

I migrated some projects from Spring Boot 1.5.10 to 2.0.
Spring Boot 2.0.3.RELEASE, JDK 10, MySQL, HikariCP.
After this work, in JUnit tests all the data from the test cases remains in the database. I think @Transactional (org.springframework.transaction.annotation.Transactional) no longer works.
Here are the relevant parts of application.yml and the test class.
spring:
  datasource:
    driver-class-name: com.mysql.jdbc.Driver
  jpa:
    show-sql: true
    hibernate:
      ddl-auto: none
    database: mysql
    database-platform: org.hibernate.dialect.MySQL5Dialect
Here is the datasource configuration.
@Configuration
@EnableTransactionManagement
public class DatasourceJPAConfig {

    @Bean
    @Primary
    @ConfigurationProperties(prefix = "spring.datasource")
    public DataSource dataSource() {
        return DataSourceBuilder.create().build();
    }
}
Here is part of the JUnit test class.
@Transactional
@Rollback(true)
@RunWith(SpringRunner.class)
@ActiveProfiles("local")
@SpringBootTest
public class RepoTests {

    @Autowired
    private TestRepository testRepository;

    @Test
    public void saveTest() {
        var name = "test";
        var description = "test description";
        var item = TestDomain.builder()
                .name(name)
                .description(description)
                .build();
        testRepository.save(item);
        var optional = testRepository.findById(item.getId());
        assertTrue(optional.isPresent());
        assertEquals(optional.get().getDescription(), description);
        assertEquals(optional.get().getName(), name);
    }
}
After running the saveTest method, the table gains one row in the database.
Add the DataSource to the test and set auto-commit to false:
@Autowired
private DataSource dataSource;
And inside the test:
((HikariDataSource) dataSource).setAutoCommit(false);
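A sketch of where that call could live, assuming the injected pool really is HikariCP (the same effect can also be configured declaratively with spring.datasource.hikari.auto-commit: false):
// Sketch: turn off auto-commit before each test so the test-managed transaction
// can roll back; assumes the DataSource is a HikariDataSource.
@Autowired
private DataSource dataSource;

@Before
public void disableAutoCommit() {
    ((HikariDataSource) dataSource).setAutoCommit(false);
}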

Spring Boot externalised configuration for DataSource not working

I have an application, using Spring 4.3.6 and Spring Boot 1.4.4, which will be exported as a JAR. I want to connect to a remote Oracle database, but I am having trouble externalising the configuration without breaking the application.
This is my current workaround:
import org.apache.tomcat.jdbc.pool.DataSource;

@Bean
public DataSource dataSource() {
    DataSource dataSource = new DataSource();
    dataSource.setUrl("jdbc:oracle:thin:@ip-address:port:orcl");
    dataSource.setUsername("user");
    dataSource.setPassword("password");
    dataSource.setDriverClassName("oracle.jdbc.OracleDriver");
    return dataSource;
}
With the above, my application is able to connect to the database and execute queries successfully. However, when I try to externalise the configuration as follows:
@Bean
@ConfigurationProperties(prefix = "app.datasource")
public DataSource dataSource() {
    return new DataSource();
}

// application.properties
app.datasource.url=jdbc:oracle:thin:@ip-address:port:orcl
app.datasource.username=user
app.datasource.password=password
app.datasource.driver-class-name=oracle.jdbc.OracleDriver
I will get the following error when trying to execute jdbcTemplate.update(query) in my Spring Boot controller (note that without externalising, the above works):
org.springframework.jdbc.CannotGetJdbcConnectionException: Could not get JDBC Connection; nested exception is java.sql.SQLException: The url cannot be null
I have tried to remove @ConfigurationProperties and change app.datasource to spring.datasource. I have also tried to use DataSourceBuilder.create().build(), which returns javax.sql.DataSource, but the same error is thrown in both cases.
I'm doing something wrong. What's the correct way to successfully externalise the configuration?
Suppose you have two datasources for two different Oracle databases. Then you have the following properties file:
/path/to/config/application.properties
oracle1.username=YourUser1
oracle1.password=YourPassword1
oracle1.url=jdbc:oracle:thin:@localhost:1521:XE
oracle2.username=YourUser2
oracle2.password=YourPassword2
oracle2.url=jdbc:oracle:thin:@192.168.0.3:1521:XE
Then in a configuration file:
import oracle.jdbc.pool.OracleDataSource;

@Configuration
public class DatasourcesConfig {

    @Autowired
    private Environment env;

    @Primary
    @Bean(name = "dataSource1")
    DataSource oracle1DataSource() throws SQLException {
        OracleDataSource dataSource = new OracleDataSource();
        dataSource.setUser(env.getProperty("oracle1.username"));
        dataSource.setPassword(env.getProperty("oracle1.password"));
        dataSource.setURL(env.getProperty("oracle1.url"));
        return dataSource;
    }

    @Bean(name = "dataSource2")
    DataSource oracle2DataSource() throws SQLException {
        OracleDataSource dataSource = new OracleDataSource();
        dataSource.setUser(env.getProperty("oracle2.username"));
        dataSource.setPassword(env.getProperty("oracle2.password"));
        dataSource.setURL(env.getProperty("oracle2.url"));
        return dataSource;
    }
}
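To actually use the two beans, one option (a sketch only; the JdbcTemplate bean names are my own choice, not part of the answer) is to expose one JdbcTemplate per datasource and inject them by qualifier:
// Sketch: one JdbcTemplate per datasource, selected via the bean names defined above.
@Configuration
public class JdbcTemplatesConfig {

    @Bean(name = "jdbcTemplate1")
    public JdbcTemplate jdbcTemplate1(@Qualifier("dataSource1") DataSource dataSource) {
        return new JdbcTemplate(dataSource);
    }

    @Bean(name = "jdbcTemplate2")
    public JdbcTemplate jdbcTemplate2(@Qualifier("dataSource2") DataSource dataSource) {
        return new JdbcTemplate(dataSource);
    }
}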
If you want to specify the external location of your application.properties file when running the jar, set spring.config.location as a system property (note that it must come before -jar, otherwise it is passed to the application as a program argument instead). You can try:
java -Dspring.config.location=/path/to/config/ -jar target/your-application-0.0.1.jar
Make sure the application.properties file is excluded when building the jar.
There should be no need to create the DataSource yourself.
Make sure you have the
<dependency>
    <groupId>org.springframework</groupId>
    <artifactId>spring-jdbc</artifactId>
</dependency>
dependency in your classpath, plus the Oracle driver, and put the following properties in your application.properties file:
spring.datasource.url=jdbc:oracle:thin:@ip-address:port:orcl
spring.datasource.username=user
spring.datasource.password=password
spring.datasource.driver-class-name=oracle.jdbc.OracleDriver
After that you should be able to @Autowire your DataSource.
For more information have a look at:
https://docs.spring.io/spring-boot/docs/current/reference/html/boot-features-sql.html#boot-features-connect-to-production-database
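For example, a minimal sketch of consuming the auto-configured DataSource (the class and field names here are illustrative):
// Sketch: Spring Boot builds the DataSource from the spring.datasource.* properties
// and injects it; no manual DataSource bean is required.
@Component
public class SomeDao {

    private final JdbcTemplate jdbcTemplate;

    public SomeDao(DataSource dataSource) {
        this.jdbcTemplate = new JdbcTemplate(dataSource);
    }
}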
You cannot override the predefined properties provided by Spring Boot.
Just use the following properties in the application.properties file.
spring.datasource.url=jdbc:oracle:thin:@ip-address:port:orcl
spring.datasource.data-username=user
spring.datasource.data-password=password
spring.datasource.driver-class-name=oracle.jdbc.OracleDriver
See Also : https://docs.spring.io/spring-boot/docs/current/reference/html/common-application-properties.html
Apart from the above, to clarify: @ConfigurationProperties is used at class level here, with the prefix "app", not "app.datasource":
@ConfigurationProperties(prefix = "app")
Now you have a class named DBProperties, and its fields have the same names as the last part of the keys in application.properties:
public class DBProperties {

    private String url;
    private String username;
    private String password;

    // setters and getters
}
Now the actual config/component class should look like the following:
@Component
@ConfigurationProperties(prefix = "app")
public class MyComponent {

    DBProperties datasource = new DBProperties();

    public DBProperties getDatasource() {
        return datasource;
    }

    public void setDatasource(DBProperties datasource) {
        this.datasource = datasource;
    }
}
Please note: the instance variable is named datasource, which matches the second part of the key, and it is a class-level instance.
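To close the loop on how those bound properties reach a DataSource, a minimal sketch (the configuration class and the use of DataSourceBuilder are my own illustration, not part of the answer):
// Sketch: builds the DataSource from the values bound under "app.datasource"
// via the MyComponent holder shown above.
@Configuration
public class AppDataSourceConfig {

    @Bean
    public DataSource dataSource(MyComponent component) {
        DBProperties db = component.getDatasource();
        return DataSourceBuilder.create()
                .url(db.getUrl())
                .username(db.getUsername())
                .password(db.getPassword())
                .driverClassName("oracle.jdbc.OracleDriver")
                .build();
    }
}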
