Spring Data JPA appends the create statements each time I run the project as a Spring app

I have a problem with the Spring JPA schema generation. I set the following properties in the application.properties file:
spring.jpa.properties.javax.persistence.schema-generation.scripts.action=create
spring.jpa.properties.javax.persistence.schema-generation.scripts.create-target=create.sql
spring.jpa.properties.javax.persistence.schema-generation.scripts.create-source=metadata
Each time I run the project or build it with Maven, it appends all the create statements to the create.sql file. I tried putting create.sql under the target path; the build then removes it and creates a new one each time, but I get an error that it cannot read create.sql. Any idea how to stop the create statements from being appended to the end of the file on every run?

This question is a duplicate of "Spring JPA DDL file generation - how to delete or clean file before generating", where I have described a solution that works for Hibernate versions >= 5.5.3:
spring.jpa.properties.hibernate.hbm2ddl.schema-generation.script.append=false
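Putting it together with the properties from the question, a minimal application.properties sketch (assuming Hibernate >= 5.5.3):

spring.jpa.properties.javax.persistence.schema-generation.scripts.action=create
spring.jpa.properties.javax.persistence.schema-generation.scripts.create-target=create.sql
spring.jpa.properties.javax.persistence.schema-generation.scripts.create-source=metadata
# overwrite create.sql on each run instead of appending to it
spring.jpa.properties.hibernate.hbm2ddl.schema-generation.script.append=false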

Related

Liquibase migrations from custom spring-boot-starter

I am developing a custom spring-boot-starter that can be included in any project. For it to work, it needs to create a couple of tables in the main application's database. For this I use Liquibase migrations, which are defined in the spring-boot-starter module. In the starter's application.properties file I also set the following properties:
spring.datasource.url=jdbc:postgresql://localhost:5432/message
spring.liquibase.change-log=classpath:liquibase/changelog.xml
When starting the main application, which contains my starter in dependencies, I get an exception:
Description:
Liquibase failed to start because no changelog could be found at 'classpath:/db/changelog/db.changelog-master.yaml'.
Action:
Make sure a Liquibase changelog is present at the configured path.
The application looks for the changelog path and the migrations inside the project itself, but does not look for them inside the starter's scope.
Please tell me what needs to be done so that, when the application starts, the migrations described in the starter are also applied.
How should Liquibase be set up? Or maybe there is some other way to run SQL scripts if the required tables are missing?
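One possible direction, as a hedged sketch rather than a confirmed fix (the class and bean names below are illustrative): the starter can register its own SpringLiquibase bean pointing at the changelog packaged inside the starter jar, so the migrations run regardless of the host application's spring.liquibase.* settings. The configuration class still has to be registered the usual starter way (META-INF/spring.factories or AutoConfiguration.imports).

import javax.sql.DataSource;
import liquibase.integration.spring.SpringLiquibase;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

// Hypothetical auto-configuration class shipped inside the starter
@Configuration
public class StarterLiquibaseAutoConfiguration {

    @Bean
    public SpringLiquibase starterLiquibase(DataSource dataSource) {
        SpringLiquibase liquibase = new SpringLiquibase();
        liquibase.setDataSource(dataSource);
        // the changelog lives on the starter's classpath, not in the host project
        liquibase.setChangeLog("classpath:liquibase/changelog.xml");
        return liquibase;
    }
}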

How should I control the order of execution between Liquibase and Spring Boot's data.sql?

We have a library (internal to the company, external to the project) that includes some Liquibase scripts to add tables to a schema to support the functionality of that library.
Using Spring Boot and Maven to run integration tests with H2, we have been using SQL files, listed in property files, to initialise the DB.
We want to add data to the tables created by the external library for the ITs, but we find that the tables haven't been created by Liquibase yet when Spring Boot/Spring Data attempts to run the insert statements in our SQL files.
Given the errors we're seeing (tables not existing when Spring attempts to run the insert.sql files), it looks like Spring is executing those files before Liquibase has done its thing.
How can I ensure the Liquibase config run by the library to create tables has completed before Spring runs the SQL files specified by the spring.datasource.data property?
We don't really want to include the test data in the library (which was working, but introduced other issues we are trying to work around, namely Liquibase inserting test data into the production DB).
What about using a different context for your tests?
You would then have an application.properties in your test folder where you define another changelog that includes all the changelogs that are needed (even the ones from the library), and you would also include the .sql file that you are currently running, probably with JPA. Try to look here if that helps.
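A minimal sketch of what such a test changelog could look like; the file names and the changeSet id are assumptions, not from the original thread. It first includes the library's changelog so its tables exist, then loads the test data that used to run via spring.datasource.data:

<databaseChangeLog
    xmlns="http://www.liquibase.org/xml/ns/dbchangelog"
    xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
    xsi:schemaLocation="http://www.liquibase.org/xml/ns/dbchangelog
        http://www.liquibase.org/xml/ns/dbchangelog/dbchangelog-4.4.xsd">

    <!-- run the library's migrations first so its tables exist -->
    <include file="liquibase/changelog.xml"/>

    <!-- then insert the test data; the path is resolved from the classpath -->
    <changeSet id="insert-test-data" author="it" context="test">
        <sqlFile path="insert.sql"/>
    </changeSet>
</databaseChangeLog>

The test application.properties would then point spring.liquibase.change-log at this file and leave spring.datasource.data unset, so Liquibase controls the whole ordering.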

Spring Boot: executing the newer version of an SQL file each time we rebuild the application

I have a spring-boot application with PostgreSQL. Some of the tables are created from models, and other tables have to be created before or while the application starts. That can be done by running an SQL file at startup.
But the DB tends to change over time: we may have to alter some of the tables, add some new tables without disturbing the existing data, etc.
Is there a way to add new SQL files and have the spring-boot application run only the SQL files that have not been run yet each time we rebuild and rerun? And not run any of the SQL files at start-up if everything has already been executed?
For your scenario Liquibase is the best solution. You can integrate Liquibase into your Spring Boot application.
Ex: https://javadeveloperzone.com/spring-boot/spring-boot-liquibase-example/
You can use Flyway. It allows you to have versioned SQL scripts:
flywaydb.org
flywaydb spring boot plugin
Examples:
Spring Boot Database Migrations with Flyway
Incrementally changing your db with java and flyway
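As a rough sketch of the Flyway convention (the file names below are made up for illustration): versioned scripts live under src/main/resources/db/migration by default, and Flyway records every applied version in its flyway_schema_history table, so on the next start it only runs scripts it has not seen before.

src/main/resources/db/migration/V1__create_base_tables.sql
src/main/resources/db/migration/V2__add_status_column.sql
src/main/resources/db/migration/V3__new_lookup_table.sql

With flyway-core on the classpath, Spring Boot runs these automatically at startup, which is exactly the "run only what has not been run yet" behaviour asked about above.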

How to read a file generated dynamically in Spring Boot

I have a Spring Boot application in which I am generating a file dynamically, and I want the Flyway plugin to read that file in the same run.
I am able to generate the file, but the Flyway plugin cannot find it during the same run. If I restart the app, the file shows up and can be read on the next run.
I want to generate the file and read it in the same run.
spring.devtools.restart.additional-paths=
I tried this, but it restarts the application and writes into the same file again, so records are duplicated; it also does not refresh the resource directory, so the file still cannot be read.
I tried many other ways, but none of them worked for me. Is there any solution?
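One hedged guess at the cause, not taken from the original thread: classpath resources are copied to target/classes at build time, so a file written into src/main/resources mid-run is invisible until the next build. Pointing Flyway at a filesystem location in addition to the classpath would let it pick up files generated at runtime:

# also scan a directory the application writes into at runtime (directory name is illustrative)
spring.flyway.locations=classpath:db/migration,filesystem:./generated-migrations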

Using Liquibase and Spring Boot

I have a Spring Boot application and want to use Liquibase to generate the changelogs for my JPA entities. However, I encounter different issues, depending on my approach.
My first approach is to use the diff goal of the Maven plugin. The url is my H2 development database with the H2 driver, and the reference URL is something like "hibernate:spring:myBasePackage.myEntityPackage?dialect=org.hibernate.dialect.H2Dialect" with the driver "liquibase.ext.hibernate.database.connection.HibernateDriver". In that case Liquibase seems to recognize my entities, but it prints the differences to the console, and the differences do not have the form of a changelog file.
My second approach would be to use the generateChangeLog goal of the Maven plugin. In this case my url is "hibernate:spring:myBasePackage.myEntityPackage?dialect=org.hibernate.dialect.H2Dialect" with the driver "liquibase.ext.hibernate.database.connection.HibernateDriver". In this case I am getting an error "Unable to resolve persistence unit root URL: class path resource [] cannot be resolved to URL because it does not exist". This error can be found in both the Spring and the Liquibase issue trackers, but it seems it is always said that this error has already been fixed.
My third approach is basically like the second, but in this case I am using a "hibernate:classic" url with an implementation of "CustomClassicConfigurationFactory", which registers my annotated classes explicitly. This does work. However, in this case I have to do this in my application jar, and I have to add the application jar as a dependency of the Maven plugin. Thus I have to build my application jar (and install it to the local Maven repository) before I can generate the changelogs. This seems cumbersome.
My questions are:
Is there an easier way to generate the changelogs for JPA entities in a Spring boot based application?
Why are the first two approaches not working?
Is there a way to simplify the third approach?
I am using:
Spring Boot 1.5.4.RELEASE
Liquibase-Hibernate4 3.6
Liquibase 3.5.3
Many thanks in advance.
In the first approach, using liquibase:diff, the change sets for the entities (create-table change sets) will not be generated, since Liquibase does not treat a new JPA entity as a change.
In the second approach, generateChangeLog, the changelog is generated from the given database; it won't look into your JPA entities.
In order to generate the DDL scripts for your JPA entities, just add the following to your JPA properties:
<property name="javax.persistence.schema-generation.scripts.action" value="drop-and-create"/>
<property name="javax.persistence.schema-generation.scripts.create-target" value="./ddl/create.sql"/>
<property name="javax.persistence.schema-generation.scripts.drop-target" value="./ddl/drop.sql"/>
The above will generate the scripts in the ddl folder under the project root.
You can check the other properties here https://thoughts-on-java.org/standardized-schema-generation-data-loading-jpa-2-1/
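In a Spring Boot application the same JPA properties can go into application.properties with the spring.jpa.properties prefix, matching the first question above:

spring.jpa.properties.javax.persistence.schema-generation.scripts.action=drop-and-create
spring.jpa.properties.javax.persistence.schema-generation.scripts.create-target=./ddl/create.sql
spring.jpa.properties.javax.persistence.schema-generation.scripts.drop-target=./ddl/drop.sql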
