Integrating Liquibase into an existing Spring Boot Gradle project - spring-boot

I have a Spring Boot project which initially used hibernate.ddl-auto: update to create the schema. Now I want to migrate to Liquibase, so the steps I followed are:
ran diffChangeLog on the command line to capture the existing schema -> the changelog got generated in the db/changelog folder.
ran changelogSync on the command line; the Liquibase tables (changelog and lock) were created and all the changesets for the existing schema were marked as executed.
So far so good. Now I want to move this change to production. When my Spring Boot project starts it should automatically run the changelogSync command beforehand and then bootstrap the project, so that I don't need to run these commands manually. How can I achieve this?

As far as diffChangeLog is concerned, it is used to point out differences in general and generates changesets to resolve most of them. The changelogSync command, on the other hand, ONLY marks all undeployed changesets in your changelog as executed in your database; it does not deploy any changes. From what I understand, this is not the best approach to migrate the database.

There are two ways you can go. The first is to change your pipeline (in Jenkins, Bamboo or any other tool) so that changelogSync is executed before your application even starts (the advantage being that the Liquibase command is completely decoupled from your app). The second approach is to implement CommandLineRunner like this:
import org.springframework.boot.CommandLineRunner;
import org.springframework.stereotype.Component;

@Component
public class CommandLineAppStartupRunner implements CommandLineRunner {
    @Override
    public void run(String... args) throws Exception {
        Runtime rt = Runtime.getRuntime();
        // run changelogSync (or any other command) as an external process
        Process pr = rt.exec("liquibase changelogSync");
        pr.waitFor(); // block until the command has finished
    }
}
You can even pass the command as an application argument to the CommandLineRunner if you want.
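A minimal sketch of that variant, assuming the command arrives as the first application argument (e.g. java -jar app.jar "liquibase changelogSync"); the class name ArgumentDrivenStartupRunner is just illustrative:

import org.springframework.boot.CommandLineRunner;
import org.springframework.stereotype.Component;

@Component
public class ArgumentDrivenStartupRunner implements CommandLineRunner {
    @Override
    public void run(String... args) throws Exception {
        if (args.length > 0) {
            // args[0] is whatever command was passed on the command line
            Runtime.getRuntime().exec(args[0]).waitFor();
        }
    }
}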

Related

Liquibase migrations from custom spring-boot-starter

I am developing a custom spring-boot-starter that can be included in any project. For it to work, it needs to create a couple of tables in the main application's database. For this I use Liquibase migrations, which are defined inside the spring-boot-starter module. In the starter's application.properties file I also declare the properties
spring.datasource.url=jdbc:postgresql://localhost:5432/message
spring.liquibase.change-log=classpath:liquibase/changelog.xml
When starting the main application, which has my starter as a dependency, I get an exception:
Description:
Liquibase failed to start because no changelog could be found at 'classpath:/db/changelog/db.changelog-master.yaml'.
Action:
Make sure a Liquibase changelog is present at the configured path.
The application looks for the changelog path and the migrations inside the main project, but does not look for them inside the starter.
Please tell me what needs to be done so that when the application starts, the migrations defined in the starter are also picked up.
How should Liquibase be set up? Or maybe there is some other way to run SQL scripts if the required tables are missing?
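One possible approach (a sketch, not a confirmed solution) is for the starter to register its own SpringLiquibase bean in an auto-configuration class instead of relying on spring.liquibase.change-log, so the starter's changelog is applied against the application's DataSource regardless of what the main application configures. The class and bean names below are illustrative:

import javax.sql.DataSource;
import liquibase.integration.spring.SpringLiquibase;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
public class StarterLiquibaseAutoConfiguration {

    @Bean
    public SpringLiquibase starterLiquibase(DataSource dataSource) {
        SpringLiquibase liquibase = new SpringLiquibase();
        liquibase.setDataSource(dataSource);
        // changelog shipped inside the starter's own resources, as in the question
        liquibase.setChangeLog("classpath:liquibase/changelog.xml");
        return liquibase;
    }
}

The auto-configuration class would still need to be registered (e.g. via spring.factories or the auto-configuration imports file) so that the main application picks it up.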

Spring Batch with unknown datasource

I have a working Spring Boot application which embeds a Spring Batch job. The job is not run on a schedule; instead we kick it off via an endpoint. It is working as it should. The basics of the batch are:
Kick the endpoint to start the job
Reader reads from input file
Processor reads from an Oracle database using a JPA repository and a simple Spring datasource config
Writer writes to output file
However there are new requirements:
The schema of the repository database is from here on unknown at application startup. The tables are the same; it is just an unknown schema. This is out of our control and you might think it is stupid, but there are reasons for it and it can't be changed. With the current functionality this means we need to reconfigure the datasource once we know the new schema name and restart the application. This job will only run a limited number of times while migrating from one system to another, so it has a limited lifecycle and we just need a "quick fix" rather than rewriting the whole app. So what I would like to do is:
Send the schema name as a query parameter to the application, put it into the job parameters, and then obtain a new datasource when the processor reads from the repository. Would this be doable at all using Spring Batch? Any help appreciated!
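A minimal sketch of the first half of that idea, assuming the endpoint receives the schema as a query parameter and hands it to the job as a JobParameter (the endpoint path and parameter name are illustrative). A step-scoped bean can then inject it with @Value("#{jobParameters['schema']}") and build or adjust the datasource accordingly:

import org.springframework.batch.core.Job;
import org.springframework.batch.core.JobParameters;
import org.springframework.batch.core.JobParametersBuilder;
import org.springframework.batch.core.launch.JobLauncher;
import org.springframework.web.bind.annotation.PostMapping;
import org.springframework.web.bind.annotation.RequestParam;
import org.springframework.web.bind.annotation.RestController;

@RestController
public class MigrationJobController {

    private final JobLauncher jobLauncher;
    private final Job migrationJob;

    public MigrationJobController(JobLauncher jobLauncher, Job migrationJob) {
        this.jobLauncher = jobLauncher;
        this.migrationJob = migrationJob;
    }

    @PostMapping("/jobs/migration")
    public void start(@RequestParam String schema) throws Exception {
        JobParameters params = new JobParametersBuilder()
                .addString("schema", schema)                      // the schema name unknown at startup
                .addLong("startedAt", System.currentTimeMillis()) // makes every launch a new job instance
                .toJobParameters();
        jobLauncher.run(migrationJob, params);
    }
}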

How to programmatically trigger flyway migration before spring-boot flyway-ootb migration is executed?

In my spring-boot project I'm using the out-of-the-box integration for Flyway (org.flywaydb:flyway-core) and have some migration scripts which are executed on startup and tracked in the default flyway_schema_history table.
The project also uses a module that brings its own Flyway migration scripts, which are applied programmatically and tracked in a separate moduleX_schema_history table.
As the migrations of the main project need to work on some of the tables created by the module's migration, the module migration has to happen before Flyway migrates the main project's scripts.
How can I execute the module's migration programmatically, before the main application's Flyway integration kicks in?
How and when is the Flyway migration triggered?
//EDIT:
I tried to execute code before FlywayAutoConfiguration via a @Configuration class annotated with
@AutoConfigureBefore({FlywayAutoConfiguration.class})
@AutoConfigureAfter({DataSourceAutoConfiguration.class})
but unfortunately the class is still instantiated after FlywayAutoConfiguration.
//EDIT:
I also asked the related (more general) question of how to order auto-configuration coming from modules: How to use @AutoConfigureOrder in a spring boot configuration class from a module - Stack Overflow
Flyway migration will kick in during startup if you have configured it correctly in your application yaml/properties file.
In order to run another Flyway migration before the main app migration, you can use the Flyway command line, or create a separate custom_pom.xml project file for the other module and trigger it via the Maven plugin from the command line (mvn flyway:migrate) before starting the main app.
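A programmatic alternative (a sketch, not part of the original answer) is to use Spring Boot's FlywayMigrationStrategy, which hands you the auto-configured Flyway instance before it migrates: run the module's migration there first, against its own history table, and only then delegate to the main migration. The history table name comes from the question; the locations path for the module scripts is assumed:

import javax.sql.DataSource;
import org.flywaydb.core.Flyway;
import org.springframework.boot.autoconfigure.flyway.FlywayMigrationStrategy;
import org.springframework.stereotype.Component;

@Component
public class ModuleFirstMigrationStrategy implements FlywayMigrationStrategy {

    private final DataSource dataSource;

    public ModuleFirstMigrationStrategy(DataSource dataSource) {
        this.dataSource = dataSource;
    }

    @Override
    public void migrate(Flyway flyway) {
        // run the module's migration first, tracked in its own history table
        Flyway.configure()
                .dataSource(dataSource)
                .table("moduleX_schema_history")
                .locations("classpath:db/migration/moduleX") // assumed location of the module scripts
                .load()
                .migrate();
        // then let the auto-configured Flyway handle the main project's scripts
        flyway.migrate();
    }
}

Note that Flyway.configure() requires Flyway 6 or newer; older versions configure the instance via setters instead.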

Flyway with Spring Boot overwrites the whole DB every time I switch run mode between WAR and IDE

I'm facing a very weird issue while integrating Flyway DB migration with a Spring Boot application.
When I run the application from the executable WAR on the command line, it creates a new DB at application start-up.
Now, if I switch the application run mode to the IDE (i.e. run from STS), it again fires all the scripts from my db/migration folder. I can see the installed_on column timestamp change every time I switch between these two run modes. I have tried enabling the baselineOnMigrate property, but it didn't have any effect.
Do you think it's something related to the Spring Boot embedded Tomcat? Both runs create their own individual embedded Tomcat.
Please find my spring boot application.properties below:
mssql.dbname=issueDB
mssql.password=password
mssql.dbserver=localhost
mssql.port=1501
spring.datasource.driverClassName=com.microsoft.sqlserver.jdbc.SQLServerDriver
spring.datasource.url=jdbc:sqlserver://${mssql.dbserver}:${mssql.port};databaseName=${mssql.dbname}
spring.datasource.username=user
spring.datasource.password=${mssql.password}
spring.flyway.baselineOnMigrate=true
spring.flyway.locations=classpath:db/migration/testissue
spring.flyway.out-of-order=true
spring.flyway.baseline-version=1.3
spring.flyway.placeholder-prefix=$
spring.flyway.placeholder-suffix=$
spring.flyway.mixed=true
spring.flyway.cleanOnValidationError=true
I suppose it could be caused by the property spring.flyway.cleanOnValidationError=true. According to the docs:
Whether to automatically call clean or not when a validation error occurs.
This is exclusively intended as a convenience for development. Even though we strongly recommend not to change migration scripts once they have been checked into SCM and run, this provides a way of dealing with this case in a smooth manner. The database will be wiped clean automatically, ensuring that the next migration will bring you back to the state checked into SCM.
It may be that you get some validation problems when running your application in the two different ways against the same database, and Flyway then just cleans your database and re-applies the current state of the scripts.
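If that behaviour is not wanted, it should be enough to turn the flag off (or remove it) in application.properties, for example:
spring.flyway.clean-on-validation-error=false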

Flyway to execute some scripts before migrations to provide some sprocs in my schema

I have an in-house lib with all the sprocs to create the audit tables and triggers for the tables that I will be adding in the future as migration scripts, and I want these sprocs to be deployed to my schema before any of the migrations run. I have a specific DDL script that does it. The reason I want it that way is that whenever I need to add a new table, I can call the sproc to create its triggers and audit tables. That way my migration scripts stay cleaner and simpler. Also, this helps keep the HSQL and Oracle scripts as much in sync as possible, since I don't care about having audit tables or triggers for HSQL while doing dev work.
I have extracted the DDL that creates the sprocs from the library into its own folder, separate from my migration scripts, as I want to keep the two apart.
I don't have a Flyway bean of my own because Spring Boot creates it for me, and I don't see any properties for configuring callbacks.
I am using Spring Boot 1.3.3.RELEASE and would prefer to stay on Flyway 3.2.1, which comes with the Spring Boot parent dependencies.
Could someone suggest the best way to do this?
If a higher version of Flyway supports this, would you recommend overriding the specific Flyway version provided by Spring Boot with the newer one?
Option 1:
Use callbacks. Flyway 3.x has support for these, which was improved upon in version 4. This includes hooks for beforeMigrate, beforeEachMigrate and beforeBaseline, which is what you are after.
There are a few ways to do this:
Add the named SQL files (e.g. beforeMigrate.sql, beforeEachMigrate.sql) to the migrations directory.
Configure your callback Java classes with the callbacks property.
Spring Boot offers FlywayMigrationStrategy to hook into the lifecycle and get access to the Flyway object (see the example at the end of this answer). See the Spring Boot Flyway documentation.
Option 2:
Add those stored procedures to your V1 migration so they are callable from each subsequent migration. It means you will have to forgo your desire of keeping them separate from the migrations, of course, but it is much simpler.
import org.flywaydb.core.Flyway;
import org.springframework.boot.autoconfigure.flyway.FlywayMigrationStrategy;
import org.springframework.stereotype.Component;

@Component
public class FlywayFactory implements FlywayMigrationStrategy {
    @Override
    public void migrate(Flyway flyway) {
        // register the callback(s) before triggering the migration
        flyway.setCallbacks(new FlywayCallbackService());
        flyway.migrate();
    }
}
This did the work. Thanks for the answer @markdsievers
