How should I control order of execution between Liquibase and springboot data.sql? - spring

We have a library (a jar that is internal to the company but external to this project) that includes some Liquibase scripts to add tables to a schema to support the functionality of that library.
Using Spring Boot and Maven to run integration tests with H2, we have been using SQL files, listed in property files, to initialise the DB.
We want to be able to add data to the tables created by that external library for the integration tests, but we are finding that the tables haven't been created by Liquibase yet when Spring Boot/Spring Data attempts to run the insert statements in our SQL files.
Given the errors we're seeing (tables not existing when Spring attempts to run the insert.sql files), it looks like Spring is executing those files before Liquibase has done its thing.
How can I ensure the Liquibase configuration run by the library to create the tables has completed before Spring does its thing and runs the SQL files specified by the spring.datasource.data property?
We don't really want to include the test data in the library (which was working, but introduced other issues we are trying to work around, namely Liquibase inserting test data into the production DB).

What about using a different configuration context for your tests?
You would have an application.properties in your test folder, and there you would define another changelog that includes all the changelogs that are needed (even the ones from the library) as well as the .sql file that you are currently running, probably via JPA. Try to look here if that helps.
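For example, something like this (a rough sketch only; the file names and paths are assumptions, and the property name assumes Spring Boot 2.x). The test properties point Liquibase at a test-only master changelog; that changelog includes the library's changelog first and then a changeset that loads the test data, so Liquibase itself guarantees the ordering:

# src/test/resources/application.properties (hypothetical path)
spring.liquibase.change-log=classpath:db/changelog/test-master.xml

<!-- src/test/resources/db/changelog/test-master.xml (hypothetical) -->
<databaseChangeLog
        xmlns="http://www.liquibase.org/xml/ns/dbchangelog"
        xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
        xsi:schemaLocation="http://www.liquibase.org/xml/ns/dbchangelog
            http://www.liquibase.org/xml/ns/dbchangelog/dbchangelog-3.8.xsd">

    <!-- run the library's changelog first so its tables exist -->
    <include file="liquibase/library-changelog.xml"/>

    <!-- then load the test data via the same mechanism -->
    <changeSet id="insert-test-data" author="it" context="test">
        <sqlFile path="db/testdata/insert-test-data.sql"/>
    </changeSet>
</databaseChangeLog>

With the inserts handled by Liquibase you can drop them from spring.datasource.data, so Spring's own script runner no longer races against the migration.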

Related

Liquibase migrations from custom spring-boot-starter

I am developing a custom spring-boot-starter that can be included in any project. For it to work, it needs to create a couple of tables in the database of the main application. For this I use Liquibase migrations, which are described in the spring-boot-starter module. Also, in the starter's application.properties file I declare the properties
spring.datasource.url=jdbc:postgresql://localhost:5432/message
spring.liquibase.change-log=classpath:liquibase/changelog.xml
When starting the main application, which contains my starter in its dependencies, I get an exception:
Description:
Liquibase failed to start because no changelog could be found at 'classpath:/db/changelog/db.changelog-master.yaml'.
Action:
Make sure a Liquibase changelog is present at the configured path.
The application looks for the changelog path and the migrations inside the main project, but does not look for them inside the starter.
Please tell me what needs to be done so that, when the application starts, the migrations described in the starter are also picked up.
How should Liquibase be set up? Or maybe there is some other way to run SQL scripts if the required tables are missing?
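One possible approach (a sketch based on assumptions about the setup, not a verified answer): an application.properties bundled inside a library jar is generally not a reliable way to configure the consuming application, so instead the starter can ship an auto-configuration that registers its own SpringLiquibase bean pointing at the changelog packaged in the jar, for example:

// Hypothetical auto-configuration inside the starter; class and bean names are invented.
import javax.sql.DataSource;
import liquibase.integration.spring.SpringLiquibase;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
public class StarterLiquibaseAutoConfiguration {

    // Runs the starter's own changelog against the application's DataSource,
    // independently of the application's spring.liquibase.change-log setting.
    @Bean
    public SpringLiquibase starterLiquibase(DataSource dataSource) {
        SpringLiquibase liquibase = new SpringLiquibase();
        liquibase.setDataSource(dataSource);
        liquibase.setChangeLog("classpath:liquibase/changelog.xml"); // changelog bundled in the starter jar
        return liquibase;
    }
}

The auto-configuration class still has to be registered (e.g. in META-INF/spring.factories for older Spring Boot versions), and Spring Boot's own Liquibase auto-configuration backs off once a SpringLiquibase bean exists, so check how this interacts with the main application's own migrations.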

Populate database using spring / hibernate / flyway / postgresql

I'm trying to populate my database with around 150 different values (one for each row).
So far, I've found two different ways to implement the inserts, but none of them seems to be the best way to do it.
Flyway + Postgres: One of them is to create a migration file and make use of the COPY command from Postgres, but to do so I need to give superuser permissions to the user, and that doesn't seem to be a good choice.
Spring Boot: place a data.sql file on the classpath with a lot of inserts. If I'm not wrong, I would have to write 150 INSERT INTO ... statements.
In previous projects I have used Liquibase, and it has a loadData command which is very convenient for doing what it says it does. You just give the file and table name, and that's it. You end up with your CSV file's values in your table rows.
Is there a similar way to do that in Flyway? What is the best way to populate the database?
Actually there is a way; you can find more info on the official documentation's page.
You need to add some spring boot properties too:
spring.flyway.enabled=true
spring.flyway.locations=classpath:/db/migration
spring.flyway.schemas=public
Properties details here
In my case, I use repeatable scripts for my needs, but take care with the prefixes.
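For reference data like the 150 rows here, one option is a repeatable migration (a sketch only; the file name, table and columns are made up). The prefix is what matters: V scripts are versioned and run exactly once, while R scripts are repeatable and re-run whenever their checksum changes, so a repeatable data script should be written to be idempotent:

-- src/main/resources/db/migration/R__load_reference_data.sql (hypothetical)
-- Re-executed whenever this file changes, so make it idempotent.
DELETE FROM reference_values;
INSERT INTO reference_values (id, label) VALUES (1, 'first');
INSERT INTO reference_values (id, label) VALUES (2, 'second');
-- ...and so on for the remaining rows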
Flyway is a direct competitor of Liquibase, so if you need to track the status of migrations, manage distributed migration (many instances of the same service starting simultaneously, with only one instance actually executing a migration), check upon startup which migrations should be applied and execute only the relevant ones, and get all the other benefits you would expect from a "migration management system", then you should use Flyway rather than managing SQL files directly.
Spring Boot has integrations with both Flyway and Liquibase, so you can place your migrations in the "resources" folder, define a couple of properties, and Spring Boot will run Flyway automatically.
For example, here you can find a tutorial on Flyway integration with Spring Boot.
Since Flyway's migrations are SQL files, you can put whatever you want in them (even PL/SQL, I believe), and it will even manage a transaction per migration, guaranteeing migration "atomicity" (all or nothing, no partial migration).
So the straightforward approach would be creating a SQL file with 150 inserts and running it via Flyway in Spring, or even via Maven, depending on your actual setup.
If you want more fine-grained control and SQL is not flexible enough, it's possible to implement the migration in Java code. See the official Flyway documentation.
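A minimal sketch of such a Java-based migration (class, package and table names are invented; the BaseJavaMigration API is the one used by recent Flyway versions):

// src/main/java/db/migration/V2__Load_reference_data.java (hypothetical)
package db.migration;

import java.sql.Statement;
import org.flywaydb.core.api.migration.BaseJavaMigration;
import org.flywaydb.core.api.migration.Context;

public class V2__Load_reference_data extends BaseJavaMigration {
    @Override
    public void migrate(Context context) throws Exception {
        // Flyway provides the connection and manages the surrounding transaction.
        try (Statement stmt = context.getConnection().createStatement()) {
            stmt.execute("INSERT INTO reference_values (id, label) VALUES (1, 'first')");
        }
    }
}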

Spring Boot: Executing the Newer version of SQL file each time we rebuild the application

I have a spring-boot application with PostgreSQL. Some of the tables are created from models, and other tables have to be created before the application starts or while it is starting. That can be done by running an SQL file at startup.
But the DB tends to change over time; we may have to alter some of the tables, add some new tables without disturbing the existing data, etc.
Is there a way to add new SQL files and, each time we rebuild and rerun, run only the SQL files which have not yet been run by the spring-boot application? And not run any of the SQL files at start-up if everything has already been executed?
For your scenario Liquibase is the best solution. You can integrate Liquibase into your Spring Boot application.
Ex: https://javadeveloperzone.com/spring-boot/spring-boot-liquibase-example/
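A rough sketch of what that can look like (file and changeSet names are invented): each change goes into its own changeSet, Liquibase records applied changeSets in its DATABASECHANGELOG table, and on subsequent startups only the changeSets it hasn't seen before are executed:

# src/main/resources/db/changelog/db.changelog-master.yaml (hypothetical)
databaseChangeLog:
  - changeSet:
      id: 001-create-tables
      author: dev
      changes:
        - sqlFile:
            path: db/sql/001_create_tables.sql
  - changeSet:
      id: 002-add-status-column   # added later; only this one runs on the next startup
      author: dev
      changes:
        - sqlFile:
            path: db/sql/002_add_status_column.sql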
You can use Flyway. It allows you to have versioned SQL scripts:
flywaydb.org
flywaydb spring boot plugin
Examples:
Spring Boot Database Migrations with Flyway
Incrementally changing your db with java and flyway
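In practice that looks roughly like this (file names and SQL are invented): each change gets the next version number, Flyway records applied versions in its flyway_schema_history table, and on restart only the versions it hasn't applied yet are executed.

-- src/main/resources/db/migration/V1__create_tables.sql (hypothetical)
CREATE TABLE customer (id BIGINT PRIMARY KEY, name VARCHAR(100));

-- src/main/resources/db/migration/V2__add_status_column.sql
-- Added in a later build; only this file runs on the next startup because V1 is already recorded.
ALTER TABLE customer ADD COLUMN status VARCHAR(20);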

jHipster - Loading data

Is there a workflow for adding data with jHipster?
I want to add static data for the H2 database in the first instance; is this process different when using MySQL or PostgreSQL?
I note there is a users.csv which is loaded via liquibase, I'm guessing I create another csv and load that.
Cheers..
It's the same for all SQL databases, as it uses Liquibase: you must create a Liquibase changelog that uses loadData to load your CSV file from src/main/resources/config/liquibase/data and refer to it from master.xml.
Additionally, you can restrict your changelog so that this data is only loaded for H2 by setting dbms="h2" on the changeSet, or use a Liquibase context (context="dev") that you can then set in application*.yml.
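A sketch of such a changelog entry (the file, table and column names are made up; the loadData, context and dbms attributes are the relevant parts), which you would then reference from master.xml with an <include>:

<!-- src/main/resources/config/liquibase/changelog/20240101000000_load_my_entity.xml (hypothetical) -->
<changeSet id="load-my-entity-data" author="dev" context="dev">
    <loadData
            file="config/liquibase/data/my_entity.csv"
            separator=";"
            tableName="my_entity">
        <column name="id" type="numeric"/>
        <column name="name" type="string"/>
    </loadData>
</changeSet>

The dev context can be activated through the standard spring.liquibase.contexts property (or the profile-specific Liquibase settings jHipster already generates).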

HSQL Unit Test -- How to Create Multiple In-Memory Schemas?

I would like to use HSQL within my DAO unit tests for a web application. The web app is written against MySQL and uses three different schemas within the same MySQL database. Some schemas have FK relationships with data in the other schemas. If I'm to unit test, I must be able to execute against a database that can hold multiple schemas.
I know that HSQL supports multiple schemas, but I don't know how to configure HSQL to have multiple schemas set up for an in-memory database. I read that I can define multiple schemas in the server.properties file, but the file needs to be in the location from which the Java class was called -- the junit.jar location? If so, that would be hard to support in my Java Maven application. How can I:
Run an in-memory hsql database to start up with three databases?
Where would I place the server.properties file in my Maven app?
Could I point hsql to use a server.properties file in a location other than where the junit jar is (that's a showstopper for me)?
Is it possible to configure multiple schemas for an in-memory database just via a tricked out jdbc url?
I wish I could untangle the schemas from each other, but that's not possible at this time.
Thanks for your help!
HSQLDB supports multiple schemas in the same database. Foreign keys can reference tables from different schemas. The following apply to the very latest HSQLDB 2.2.6 snapshot available from http://hsqldb.org
Before running your tests, execute CREATE SCHEMA schemaname for each schema.
It doesn't matter where; you can specify the absolute path in the command-line arguments when running the server. See the HSQLDB Guide and the JavaDoc on Server.
Yes.
No. You use the SQL statements to create the schemas.
Note you have two options for running HSQLDB: one is as a server, the other is as an embedded (in-process) database. In the case of the server, it must be started before the test run. In both cases, you need to connect to the database and create the schemas before your tests.
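A minimal sketch of that setup step for the embedded case (schema names are invented; the AUTHORIZATION DBA clause is HSQLDB's syntax):

// Hypothetical test setup: connect to an in-memory HSQLDB and create the schemas
// before the tests create tables and load data.
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.Statement;

public class CreateSchemas {
    public static void main(String[] args) throws Exception {
        try (Connection conn = DriverManager.getConnection("jdbc:hsqldb:mem:testdb", "SA", "");
             Statement stmt = conn.createStatement()) {
            stmt.execute("CREATE SCHEMA SCHEMA_A AUTHORIZATION DBA");
            stmt.execute("CREATE SCHEMA SCHEMA_B AUTHORIZATION DBA");
            stmt.execute("CREATE SCHEMA SCHEMA_C AUTHORIZATION DBA");
            // Tables can now reference each other across schemas,
            // e.g. SCHEMA_B.ORDERS with a foreign key to SCHEMA_A.CUSTOMERS.
        }
    }
}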
It is possible to create different DBs by setting the DB name. By default it creates the DB with the name testdb, but if we want to create multiple DBs we can set the name explicitly.
import org.springframework.jdbc.datasource.embedded.EmbeddedDatabase;
import org.springframework.jdbc.datasource.embedded.EmbeddedDatabaseBuilder;
import org.springframework.jdbc.datasource.embedded.EmbeddedDatabaseType;

// Build an in-memory HSQL database with an explicit name and run the schema/data scripts.
EmbeddedDatabase db = new EmbeddedDatabaseBuilder()
        .setType(EmbeddedDatabaseType.HSQL).setName("DB_NAME")
        .addScript("DDL.SQL")
        .addScript("DML.SQL")
        .build();
If you run the line below multiple times (once per database name), you can see the databases:
DatabaseManagerSwing.main(new String[] { "--url", "jdbc:hsqldb:mem:" + dbName, "--user", "sa", "--password", "" });

Resources