Liquibase execution on a specific environment (Maven + Spring Boot)

My application (Maven + Spring Boot) uses Liquibase, but I need it to execute only on the dev environment.
On the other environments (prod and CI, for example) I need to block the execution.
Can I do this?
Thanks.

There are two options:
For your prod environment you can set the spring.liquibase.enabled=false property. This disables Liquibase altogether, so no changeSet will be executed.
Use Liquibase contexts. When executing Maven you can add the -Dliquibase.contexts=dev_context property (for Spring Boot it is spring.liquibase.contexts=dev_context).
In your changeSets you can then specify the context attribute:
<changeSet id="foo" author="bar" context="dev_context">
<!-- your logic here -->
</changeSet>
This way your changeSet will be executed only for dev_context.
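With Spring Boot, the cleanest way to wire this up is usually per-profile property files. A minimal sketch, assuming the standard application-{profile}.properties convention (the context name is reused from above, the file split is illustrative):

# application-dev.properties -- run Liquibase, including dev-only changeSets
spring.liquibase.enabled=true
spring.liquibase.contexts=dev_context

# application-prod.properties -- skip Liquibase entirely
spring.liquibase.enabled=false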

Thanks for the answer.
I tried this but it didn't work.
I think the problem is in my pom.xml.
If I delete the plugin tag, Liquibase doesn't run at all.
<plugin>
    <groupId>org.liquibase</groupId>
    <artifactId>liquibase-maven-plugin</artifactId>
    <version>3.8.4</version>
    <configuration>
        <promptOnNonLocalDatabase>false</promptOnNonLocalDatabase>
        <propertyFile>src/main/resources/liquibase.properties</propertyFile>
    </configuration>
    <executions>
        <execution>
            <phase>process-resources</phase>
            <goals>
                <goal>update</goal>
            </goals>
        </execution>
    </executions>
</plugin>
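That pom.xml is the likely culprit: it binds the liquibase-maven-plugin to the process-resources phase, so Liquibase runs during the Maven build, independently of Spring Boot, and the spring.liquibase.* properties never apply to it. If you keep the plugin, the context has to be passed to the plugin itself. A minimal sketch, assuming the plugin's contexts configuration parameter and reusing dev_context from above:

<configuration>
    <promptOnNonLocalDatabase>false</promptOnNonLocalDatabase>
    <propertyFile>src/main/resources/liquibase.properties</propertyFile>
    <!-- only changeSets tagged with this context run; can also be passed
         on the command line as -Dliquibase.contexts=dev_context -->
    <contexts>dev_context</contexts>
</configuration>

Alternatively, move the whole plugin into a dev-only Maven profile so prod and CI builds never execute it.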

How to invoke specific execution

I am trying to replace maven-exec with MavenInvokerPlugin because of problems on Jenkins with forwarding the Maven settings file.
In bash the call looks straightforward:
mvn dependency:copy-dependencies#resolve-maven-deps
My translation to a MavenInvokerPlugin configuration is:
<plugin>
    <artifactId>maven-invoker-plugin</artifactId>
    <version>3.0.1</version>
    <configuration>
        <projectsDirectory>${project.basedir}/src/main/docker</projectsDirectory>
        <localRepositoryPath>${project.build.mavenDependencies}</localRepositoryPath>
        <goal>dependency:copy-dependencies#resolve-maven-deps</goal>
    </configuration>
    <executions>
        <execution>
            <id>integration-test</id>
            <goals>
                <goal>run</goal>
            </goals>
            <phase>compile</phase>
        </execution>
    </executions>
</plugin>
It looks like the execution id is completely ignored, because I tried random strings and mvn still builds the project successfully:
mvn dependency:copy-dependencies#asdfasdfa
So I'd like to know whether this feature is supported at all, and what I am doing wrong.
P.S. I know that calling Maven from within Maven is an anti-pattern, but this is exactly that rare case where there is no other way.
After looking at projects that use the maven invoker I figured out the trick.
The goal tag is not used; instead, provide a pom and an invokerPropertiesFile:
<pom>${project.basedir}/xxx/pom.xml</pom>
<invokerPropertiesFile>${project.basedir}/invoker.properties</invokerPropertiesFile>
Content of the file:
invoker.goals=compile -P resolve-maven-deps
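Putting the trick together, the plugin block would look roughly like this (a minimal sketch assembled from the snippets above; the xxx path is the questioner's placeholder, and whether you still need projectsDirectory depends on your setup):

<plugin>
    <artifactId>maven-invoker-plugin</artifactId>
    <version>3.0.1</version>
    <configuration>
        <!-- pom of the nested build; its goals live in invoker.properties, not in a goal tag -->
        <pom>${project.basedir}/xxx/pom.xml</pom>
        <invokerPropertiesFile>${project.basedir}/invoker.properties</invokerPropertiesFile>
    </configuration>
    <executions>
        <execution>
            <id>integration-test</id>
            <goals>
                <goal>run</goal>
            </goals>
            <phase>compile</phase>
        </execution>
    </executions>
</plugin>

The invoker launches a separate Maven build whose goals come from invoker.properties, so selecting a profile (-P resolve-maven-deps) replaces the goal#execution-id addressing that the plugin ignores.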

Using sonar.test.exclusions with Sonarqube 6.3

I'm currently evaluating Sonarqube 6.3 (a big upgrade from my current 5.5 instance) and I'm getting confused trying to work out the functionality of the sonar.test.exclusions setting.
There's this question, Sonar Maven Plugin: How do I exclude test source directories?, which seems to indicate that the setting excludes test files from analysis (which is what I'm after: I don't want my Sonar ruleset run over my unit tests). The documentation at https://docs.sonarqube.org/display/SONAR/Narrowing+the+Focus also says it is used to 'exclude unit test files' (perhaps this could be expanded to make it clearer?).
Thing is, when I add sonar.test.exclusions with a value of **/src/test/** and then run my analysis, I'm still getting code smells and the like being found for:
Foo/src/test/java/foo/bar/BarTest.java
Foo/src/test/java/lah/LahTest.java
etc.
When I use sonar.exclusions instead, they don't show up. Why is sonar.test.exclusions not doing what I expect?
First of all: if you have a Maven project, you should use the scanner for Maven (mvn sonar:sonar). It simplifies your configuration and automatically registers the src/test/java folder as a test directory.
If you want to do the configuration manually, or understand what is going on under the hood, here is the explanation: the SonarQube scanner works with two sets of files, main and test. Main source files are configured with the property sonar.sources; test source files with sonar.tests.
On top of that, you can filter content out of each set with the sonar.exclusions and sonar.test.exclusions properties.
In your case the problem is that Foo/src/test/java/foo/bar/BarTest.java is being treated as a main source file; that is why sonar.test.exclusions has no effect.
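A minimal sketch of the manual scanner configuration, assuming a standard Maven layout (a sonar-project.properties file; paths are illustrative):

# sonar-project.properties
sonar.sources=src/main/java
sonar.tests=src/test/java
# each exclusion filters only within its own set:
sonar.exclusions=**/generated/**/*
sonar.test.exclusions=src/test/**/*

sonar.test.exclusions only filters files already registered as tests via sonar.tests; it never removes files the scanner has classified as main sources, which is exactly the symptom described above.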
Using Maven with the verify goal (mvn clean verify sonar:sonar install), I have used this configuration without problems:
...
<properties>
    ....
    <sonar.exclusions>
        **/generated/**/*,
        **/model/**/*
    </sonar.exclusions>
    <sonar.test.exclusions>
        src/test/**/*
    </sonar.test.exclusions>
    ....
    <sonar.java.coveragePlugin>jacoco</sonar.java.coveragePlugin>
    <sonar.jacoco.reportPath>${project.basedir}/../target/jacoco.exec</sonar.jacoco.reportPath>
    <sonar.coverage.exclusions>
        **/generated/**/*,
        **/model/**/*
    </sonar.coverage.exclusions>
    <jacoco.version>0.7.5.201505241946</jacoco.version>
    ....
</properties>
....
Coverage exclusions are configured in the properties above; the JaCoCo plugin configuration is:
.....
<build>
    <plugins>
        <plugin>
            <groupId>org.jacoco</groupId>
            <artifactId>jacoco-maven-plugin</artifactId>
            <version>${jacoco.version}</version>
            <executions>
                <execution>
                    <id>prepare-agent</id>
                    <goals>
                        <goal>prepare-agent</goal>
                    </goals>
                </execution>
                <execution>
                    <id>report</id>
                    <phase>prepare-package</phase>
                    <goals>
                        <goal>report</goal>
                    </goals>
                </execution>
                <execution>
                    <id>post-unit-test</id>
                    <phase>test</phase>
                    <goals>
                        <goal>report</goal>
                    </goals>
                    <configuration>
                        <dataFile>target/jacoco.exec</dataFile>
                        <outputDirectory>target/jacoco-ut</outputDirectory>
                    </configuration>
                </execution>
            </executions>
            <configuration>
                <systemPropertyVariables>
                    <jacoco-agent.destfile>target/jacoco.exec</jacoco-agent.destfile>
                </systemPropertyVariables>
            </configuration>
        </plugin>
    </plugins>
</build>
....

How to generate a symbol map in a GWT mvp4g project?

I am using mvp4g on the GWT client side. I want to generate the symbolMap files used by the RemoteLogging servlet, but when I try to generate them using mvn clean install and specifying the -extra folder_name property in the gwt-maven-plugin configuration, I don't see the symbolMap files. It is not a plain GWT app but GWT with mvp4g, and I don't know whether mvp4g is causing the problem.
mvp4g generates Java code on the client side. This happens before the compiler translates the code to JavaScript, so it should not interfere with symbol map generation. Check your settings. Maybe this post helps: How to generate symbol map in gwt using maven?
Update:
I use this maven configuration:
<gwt.output>myPathToTheProjectDirectory/output</gwt.output>
<gwt.gen>genSources</gwt.gen>
<gwt.extra>extra</gwt.extra>
And this for the gwt-maven-plugin:
<plugin>
    <groupId>org.codehaus.mojo</groupId>
    <artifactId>gwt-maven-plugin</artifactId>
    <version>2.6.1</version>
    <executions>
        <execution>
            <phase>compile</phase>
            <id>bla</id>
            <goals>
                <goal>compile</goal>
            </goals>
            <configuration>
                <draftCompile>false</draftCompile>
                <disableClassMetadata>true</disableClassMetadata>
                <compileReport>true</compileReport>
                <warSourceDirectory>${gwt.war}</warSourceDirectory>
                <webappDirectory>${gwt.output}</webappDirectory>
                <gen>${gwt.output}/${gwt.gen}</gen>
                <extra>${gwt.output}/${gwt.extra}</extra>
                <fragmentCount>8</fragmentCount>
                <extraJvmArgs>-Xms1G -Xmx1G -Xss1024k -XX:MaxPermSize=1024m -Dgwt.persistentunitcache=false</extraJvmArgs>
                <localWorkers>7</localWorkers>
            </configuration>
        </execution>
    </executions>
</plugin>
When I execute mvn compile, the symbol maps are written to the folder myPathToTheProjectDirectory/output/extra/symbolmaps.
Try adding this: <set-property name="compiler.useSourceMaps" value="true" />
It solved the problem for me.
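For context, a set-property directive like that belongs in the GWT module descriptor (the *.gwt.xml file); a minimal sketch with illustrative module contents:

<module rename-to="myapp">
    <inherits name="com.google.gwt.user.User"/>
    <!-- emit source/symbol maps so stack traces can be resymbolized server-side -->
    <set-property name="compiler.useSourceMaps" value="true"/>
    <entry-point class="com.example.client.MyEntryPoint"/>
</module>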

Liquibase on multiple databases

I have already implemented Liquibase with Maven. We are currently using a single database (DB2), but now we need to add a second database to the application, which will have different objects.
I've seen that I can define a new profile in Maven, but I couldn't find out how to control which objects get created on which database.
Is there a solution to this? Can I support two different databases with different objects using Liquibase?
As you can see in the documentation, you can use two different executions, like this:
<plugin>
    <groupId>org.liquibase</groupId>
    <artifactId>liquibase-maven-plugin</artifactId>
    <version>3.0.5</version>
    <executions>
        <execution>
            <phase>process-resources</phase>
            <configuration>
                <changeLogFile>PATH_TO_CHANGELOG_1</changeLogFile>
                ... connection properties ...
            </configuration>
            <goals>
                <goal>update</goal>
            </goals>
        </execution>
        <execution>
            <phase>process-resources</phase>
            <configuration>
                <changeLogFile>PATH_TO_CHANGELOG_2</changeLogFile>
                ... connection properties ...
            </configuration>
            <goals>
                <goal>update</goal>
            </goals>
        </execution>
    </executions>
</plugin>
The only problem with this approach is that you need two different changelog.xml files, one per database.
Also, you can use preconditions in your changelog file to choose which changeSets will be processed by each database.
For example:
<changeSet id="1" author="bob">
<preConditions onFail="MARK_RAN">
<dbms type="oracle" />
</preConditions>
<comment>Comments should go after preCondition. If they are before then liquibase usually gives error.</comment>
<dropTable tableName="oldtable"/>
</changeSet>
The onFail="MARK_RAN" makes Liquibase skip the changeset but marks it as run, so the next time it will not try again. See the customPrecondition tag in the documentation for more complex preconditions.
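For cases the built-in precondition tags don't cover, customPrecondition delegates the check to your own class; a hedged sketch (the class and parameter names are hypothetical):

<changeSet id="2" author="bob">
    <preConditions onFail="MARK_RAN">
        <!-- the class must implement liquibase.precondition.CustomPrecondition -->
        <customPrecondition className="com.example.liquibase.TargetDatabasePrecondition">
            <param name="expectedDatabase" value="db2"/>
        </customPrecondition>
    </preConditions>
    <createTable tableName="newtable">
        <column name="id" type="int"/>
    </createTable>
</changeSet>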
You may want to have 2 separate changelogs to manage the two databases, even if they are both used by the same application.
As Arturo says, you can have two or more execution nodes, but you must give every execution node a separate id.
<plugin>
    <groupId>org.liquibase</groupId>
    <artifactId>liquibase-maven-plugin</artifactId>
    <version>3.0.5</version>
    <executions>
        <execution>
            <id>db1-update</id>
            <phase>process-resources</phase>
            <configuration>
                <changeLogFile>src/main/resources/org/liquibase/db1.xml</changeLogFile>
                <driver>org.postgresql.Driver</driver>
                <url>jdbc:postgresql://localhost/db1</url>
                <username>..</username>
                <password>..</password>
            </configuration>
            <goals>
                <goal>update</goal>
            </goals>
        </execution>
        <execution>
            <id>db2-update</id>
            <phase>process-resources</phase>
            <configuration>
                <changeLogFile>src/main/resources/org/liquibase/db2.xml</changeLogFile>
                <driver>org.postgresql.Driver</driver>
                <url>jdbc:postgresql://localhost/db2</url>
                <username>...</username>
                <password>...</password>
            </configuration>
            <goals>
                <goal>update</goal>
            </goals>
        </execution>
        <execution>
            <id>db3-update</id>
            <phase>process-resources</phase>
            <configuration>
                <changeLogFile>src/main/resources/org/liquibase/db3.xml</changeLogFile>
                <driver>org.postgresql.Driver</driver>
                <url>jdbc:postgresql://localhost/db3</url>
                <username>...</username>
                <password>...</password>
            </configuration>
            <goals>
                <goal>update</goal>
            </goals>
        </execution>
    </executions>
</plugin>
You can use preconditions inside a changeSet or changelog and set conditions according to the database:
<preConditions onFail="WARN">
    <dbms type="oracle" />
    <runningAs username="SYSTEM"/>
</preConditions>
This way you can use the precondition tag inside each changeSet and apply conditions per database. See the preconditions documentation for additional details.
Old question, but I'll still answer because I had the same requirement today, and I am opting for another solution.
If you can, I would recommend using separate changelogs, as already proposed in the other answers.
But if you want to keep the changelogs unified, as I need for my specific case, I would use labels instead of preconditions to filter the changeSets to be executed.
<changeSet id="0001:1" author="oz" labels="clickhouse">
<sql>...SOMESQL...</sql>
</changeSet>
<changeSet id="0001:2" author="oz" labels="mongodb">
<ext:createCollection collectionName="myCollection">
...SOMEJSON....
</ext:createCollection>
</changeSet>
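At update time you then choose which labels run against each connection; a hedged sketch, assuming the liquibase.labels property of the liquibase-maven-plugin (newer releases phrase this as a label filter) and the label names from above:

mvn liquibase:update -Dliquibase.labels=clickhouse

Only changeSets whose labels match the filter are executed and recorded for that database.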
This prevents polluting the DATABASECHANGELOG table of each of the two databases with executions of changeSets that belong to the other database.
However, it causes problems (for the current release, 4.6.1, at least) for any Liquibase operation that uses tags, such as rollbackToTag or updateToTag.

How to use Liquibase to update database in embedded Glassfish instance

I'm in the process of converting our db management to Liquibase, which is running along nicely.
As a next step I want to ensure that all future modifications get tested before deployment to common environments, continuous-integration style. I am trying to do this with the following setup:
Build our ear containing EJB webservices
Use the maven-embedded-glassfish-plugin to start an embedded Glassfish 3 instance during the pre-integration-test Maven phase
Create my datasource as part of the start goal
Deploy the ear during the deploy goal
Still in pre-integration-test, run liquibase:update against the same database URL, in this case an H2 file database
Then run our SoapUI tests against the deployed application
But when I get this far, the application can't find any data in the database. So the question is whether I've missed something in my setup, or whether there's a better way to organize this.
pom.xml, embedded Glassfish
<plugin>
    <groupId>org.glassfish.embedded</groupId>
    <artifactId>maven-embedded-glassfish-plugin</artifactId>
    <version>4.0</version>
    <configuration>
        <ports>
            <http-listener>9090</http-listener>
            <https-listener>9191</https-listener>
        </ports>
        <goalPrefix>embedded-glassfish</goalPrefix>
        <app>${project.build.directory}/school-application-${project.version}.ear</app>
        <name>school-application</name>
        <commands>
            <command>create-jdbc-connection-pool --datasourceclassname=org.h2.jdbcx.JdbcDataSource --restype=javax.sql.DataSource --property URL=jdbc\:h2\:~/tmpLB\;AUTO_SERVER\=TRUE schoolDSPool</command>
            <command>create-jdbc-resource --connectionpoolid schoolDSPool jdbc/schoolDS</command>
        </commands>
    </configuration>
    <dependencies>
        <dependency>
            <groupId>com.h2database</groupId>
            <artifactId>h2</artifactId>
            <version>1.3.176</version>
        </dependency>
    </dependencies>
    <executions>
        <execution>
            <goals>
                <goal>start</goal>
                <goal>admin</goal>
                <goal>deploy</goal>
                <goal>undeploy</goal>
                <goal>stop</goal>
            </goals>
        </execution>
    </executions>
</plugin>
pom.xml, Liquibase
<plugin>
    <groupId>org.liquibase</groupId>
    <artifactId>liquibase-maven-plugin</artifactId>
    <version>3.1.1</version>
    <dependencies>
        <dependency>
            <groupId>company.school</groupId>
            <artifactId>school-db</artifactId>
            <version>${project.version}</version>
            <systemPath>../school-db/target</systemPath>
            <type>jar</type>
        </dependency>
        <dependency>
            <groupId>com.h2database</groupId>
            <artifactId>h2</artifactId>
            <version>1.3.176</version>
        </dependency>
    </dependencies>
    <executions>
        <execution>
            <phase>integration-test</phase>
            <configuration>
                <promptOnNonLocalDatabase>false</promptOnNonLocalDatabase>
                <changeLogFile>db.changelog-master.xml</changeLogFile>
                <driver>org.h2.Driver</driver>
                <url>jdbc:h2:~/tmpLB;AUTO_SERVER=TRUE</url>
                <logging>info</logging>
            </configuration>
            <goals>
                <goal>update</goal>
            </goals>
        </execution>
    </executions>
</plugin>
I have one changeSet in the changelog that inserts data into the targeted tables.
Do I have the right users set up?
Is there a way to run Liquibase in the same process as Glassfish and use a mem: database instead?
Thanks and regards,
Christian
OK, so there was an "easy" solution to the problem.
There was no data in the database because that changeSet in the Liquibase changelog couldn't complete: I had the insert statements in a separate SQL file that I called with the <sqlFile> Liquibase tag, and the inserts violated some foreign key constraints, so they never got executed.
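For reference, the changeSet in question looked roughly like this (a hedged sketch; the id, author, and file name are hypothetical):

<changeSet id="insert-test-data" author="christian">
    <!-- inserts loaded from a separate file; this is where the FK violations occurred -->
    <sqlFile path="test-data.sql" relativeToChangelogFile="true"/>
</changeSet>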
What put me off was that Liquibase seems to hide errors coming from included SQL files. I will try to reproduce that and file a Jira issue if I succeed.
/Christian
