generate-sources phase from containing project is broken - spring

I have a project that defines a module in its pom.xml:
<modules>
<module>mytestmodule</module>
</modules>
The module uses spring-boot-starter-parent as its parent. Additionally, the module's pom.xml uses the jooq-codegen-maven plugin to generate source files from a database:
<plugin>
<groupId>org.jooq</groupId>
<artifactId>jooq-codegen-maven</artifactId>
<executions>
<execution>
<phase>generate-sources</phase>
<goals>
<goal>generate</goal>
</goals>
<configuration>
<jdbc>
<driver>xxx</driver>
<url>xxx</url>
<user>xxx</user>
<password>xxx</password>
</jdbc>
<generator>
<target>
<packageName>org.jooq.schema</packageName>
<directory>src/main/java</directory>
</target>
</generator>
</configuration>
</execution>
</executions>
</plugin>
When I run the generate-sources phase from the module directory, it correctly generates the source files in a schema directory. However, when I run the same command from the containing project (the parent directory), not only does it not generate the source files, it also deletes the existing ones. How can I fix this?
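One detail that often matters here is that jOOQ's generator by default cleans the directory it generates into, and a relative <directory> may resolve differently depending on where the build is started. Purely as a hedged sketch (the generated-sources path below is an assumption, not part of the original setup), the target block could be anchored to the module and kept out of src/main/java:
<generator>
  <target>
    <packageName>org.jooq.schema</packageName>
    <!-- assumption: generate into the build directory, anchored to the module,
         so the path resolves the same whether the build is started from the
         module or from the containing project -->
    <directory>${project.basedir}/target/generated-sources/jooq</directory>
  </target>
</generator>
If that directory is not picked up as a compile source root automatically, the build-helper-maven-plugin can add it.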

Related

Maven build lifecycle phase synchronization across modules

I have a Maven project with a reactor and a couple of modules, most of which are packaged as WARs. The order in which they are specified in the reactor / root pom.xml defines the order in which they are built.
pom.xml
....
<module>library1</module>
<module>library2</module>
<module>webapp1</module><!--war-->
<module>webapp2</module><!--war-->
<module>blackduck-scan</module><!-- create file to be placed into webapp2 post-build but pre-packaging of webapp2 -->
...
The last module is purposely there simply to run an executable in the prepare-package phase, more precisely a Black Duck license scanner, which eventually produces the license notice file that is then placed into the /webapp folder of one of the web applications, to be displayed after deployment.
The idea is that this notice file is placed after compilation of the applications but before they are packaged as WAR artifacts, so that it is included in the current delivery of our pipeline without rebuilding just for the sake of repackaging.
<plugin>
<groupId>org.codehaus.mojo</groupId>
<artifactId>exec-maven-plugin</artifactId>
<version>3.1.0</version>
<configuration>
<executable>java</executable>
<workingDirectory>.</workingDirectory>
<arguments>
<argument>-jar</argument>
<argument>synopsys-detect-8.4.0.jar</argument>
</arguments>
...
</configuration>
<executions>
<execution>
<goals>
<goal>exec</goal>
</goals>
<phase>prepare-package</phase>
</execution>
</executions>
</plugin>
I've tried two options to achieve this, without success:
a) Adding the plugin/goal to the reactor pom.xml leads to the goal being executed first, as soon as the reactor reaches the prepare-package phase, which may yield incomplete scan results because the actual project has not yet been built.
b) Adding the plugin/goal as a module, as described above, puts the execution at the end of the chain; by then, however, packaging of the webapps has already been completed.
c) A third (arguably working) but less elegant approach would be to split this into two separate Maven calls:
mvn clean install && mvn package
I see that modules are built in sequence, and for good reason. However, is there any method to "synchronize" build phases such that each phase is started only after the previous phase has been completed for all modules? Effectively, to simply call, all included:
mvn clean install
I believe the option is to:
disable WAR creation and run the exploded goal instead
use the maven-assembly-plugin to pack the target WAR
inject the Black Duck execution between maven-war-plugin and maven-assembly-plugin
Something like:
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-war-plugin</artifactId>
<executions>
<execution>
<!-- disabling building war -->
<id>default-war</id>
<phase>none</phase>
</execution>
<execution>
<id>exploded</id>
<phase>prepare-package</phase>
<goals>
<goal>exploded</goal>
</goals>
</execution>
</executions>
</plugin>
<!-- blackduck execution here -->
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-assembly-plugin</artifactId>
<executions>
<execution>
<phase>package</phase>
<goals>
<goal>single</goal>
</goals>
</execution>
</executions>
<configuration>
<descriptors>
<descriptor>war-assembly.xml</descriptor>
</descriptors>
<appendAssemblyId>false</appendAssemblyId>
</configuration>
</plugin>
war-assembly.xml:
<assembly>
<id>war</id>
<formats>
<format>war</format>
</formats>
<includeBaseDirectory>false</includeBaseDirectory>
<fileSets>
<fileSet>
<directory>${project.build.directory}/${project.build.finalName}</directory>
<outputDirectory>.</outputDirectory>
<includes>
<include>/**/*</include>
</includes>
</fileSet>
</fileSets>
</assembly>

Generate Javadoc for multimodule project

I have read everything I can find on solving this and my attempts still fail. The best I can do is get the Javadoc of exactly one module to show up: the last one built. (For now, I'm not trying to bundle Javadoc into any JARs. I'm also not trying to do anything "site"-related.) I just want to put the Javadoc into a subdirectory under the project root for easy access.
Here's what's in my parent pom.xml:
<build>
<plugins>
.
.
.
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-javadoc-plugin</artifactId>
<version>3.2.0</version>
<configuration>
<noqualifier>all</noqualifier>
<reportOutputDirectory>${user.dir}/documents</reportOutputDirectory>
<destDir>javadoc</destDir>
</configuration>
<executions>
<execution>
<id>attach-javadocs</id>
<phase>package</phase>
<goals>
<goal>jar</goal>
</goals>
</execution>
</executions>
</plugin>
What I'm putting into subordinate pom.xml files is identical to the above except for
<goals>
<goal>javadoc</goal>
</goals>
I have played with replacing the <execution> in the parent and sometimes subordinate pom.xml files with:
<execution>
<id>aggregate</id>
<goals>
<goal>aggregate</goal>
</goals>
</execution>
but it makes no difference.
I think the following configuration is the reason your reports get overwritten:
<configuration>
<reportOutputDirectory>${user.dir}/documents</reportOutputDirectory>
</configuration>
All module builds will be written to the same directory, hence overwriting the previous build.
The solution is to use the default output directory and configure the output directory for the aggregated javadoc instead. This way the reactor build will create javadoc output files in each module's target directory. These can then be used by the aggregate goal to be combined.
This can be done by configuring your parent POM as follows:
<build>
<plugins>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-javadoc-plugin</artifactId>
<version>3.2.0</version>
<configuration>
<!-- Default configuration for all reports -->
<noqualifier>all</noqualifier>
<destDir>javadoc</destDir>
</configuration>
<executions>
<execution>
<id>aggregate</id>
<goals>
<goal>aggregate</goal>
</goals>
<configuration>
<!-- Specific configuration for the aggregate report -->
<reportOutputDirectory>${user.dir}/documents</reportOutputDirectory>
<destDir>javadoc</destDir>
</configuration>
</execution>
...
</executions>
</plugin>
...
</plugins>
</build>
(there is no need for any additional configuration in the module POM files)
The aggregated javadoc can now be created by running
mvn compile javadoc:javadoc javadoc:aggregate
(note that the compile or package phase is required for the reactor to resolve inter-module dependencies)

maven zip uber-jar and shell script

I would like maven to combine an uber-jar created by the shade-plugin and a shell script from the all_files directory.
The project structure looks like this:
all_files/
mvn_script.sh
projB-shaded.jar
maven_project/
guide/
parent-pom.xml
projA/
pom.xml
projB/
pom.xml
The jar is produced by projB's pom file and then placed into the outermost folder, ready to be zipped together with the shell script. The reason is that the shell script calls the jar file to execute the project.
I want other programmers to be able to easily unzip the file and run the shell script without worry, and I also need Maven to package the script and jar together. I'm not sure exactly how to implement that with the shade plugin.
Note: I do not want to use assembly-plugin because it doesn't package dependent jars well.
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-shade-plugin</artifactId>
<version>2.4.3</version>
<executions>
<execution>
<phase>package</phase>
<goals>
<goal>shade</goal>
</goals>
<configuration>
<minimizeJar>true</minimizeJar>
<outputFile>../../guide/${project.artifactId}-${project.version}-shaded.jar</outputFile>
<transformers>
<transformer implementation="org.apache.maven.plugins.shade.resource.ManifestResourceTransformer">
<manifestEntries>
<Main-Class>projB.classB</Main-Class>
</manifestEntries>
</transformer>
</transformers>
</configuration>
</execution>
</executions>
</plugin>
You don't want to use the maven-assembly-plugin for creating the uber-jar. But you will want to use it to create that ZIP.
Currently, your maven-shade-plugin is bound to the package phase. You could shift that execution to the prepare-package phase (since it actually prepares your final packaging) and add an execution of the maven-assembly-plugin bound to the package phase. Your assembly would then create a ZIP from the shaded JAR (which will already exist, since the shade plugin will have been executed) and the shell script.
A sample descriptor would be the following:
<assembly xmlns="http://maven.apache.org/plugins/maven-assembly-plugin/assembly/1.1.3"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/plugins/maven-assembly-plugin/assembly/1.1.3 http://maven.apache.org/xsd/assembly-1.1.3.xsd">
<id>your-id</id>
<formats>
<format>zip</format>
</formats>
<files>
<file>
<source>${project.build.directory}/projB-shaded.jar</source>
<outputDirectory>/</outputDirectory>
</file>
<file>
<source>/path/to/mvn_script.sh</source>
<outputDirectory>/</outputDirectory>
</file>
</files>
</assembly>
with the following POM configuration:
<plugin>
<artifactId>maven-shade-plugin</artifactId>
<version>2.4.3</version>
<executions>
<execution>
<phase>prepare-package</phase>
<goals>
<goal>shade</goal>
</goals>
<configuration>
<!-- current configuration -->
</configuration>
</execution>
</executions>
</plugin>
<plugin>
<artifactId>maven-assembly-plugin</artifactId>
<version>2.6</version>
<executions>
<execution>
<id>assemble</id>
<goals>
<goal>single</goal>
</goals>
<phase>package</phase>
<configuration>
<descriptors>
<descriptor>/path/to/assembly.xml</descriptor>
</descriptors>
</configuration>
</execution>
</executions>
</plugin>
The typical location for the assembly descriptor is under src/assembly as per the Maven standard directory layout.
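With that layout, the descriptor reference in the POM would typically become something like the following (the file name zip.xml is just a placeholder, not from the original setup):
<descriptors>
  <!-- "zip.xml" is a hypothetical descriptor name under src/assembly -->
  <descriptor>src/assembly/zip.xml</descriptor>
</descriptors>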

Execute script as part of mvn package

My pom.xml contains
<plugin>
<artifactId>maven-war-plugin</artifactId>
<version>2.3</version>
<configuration>
<warName>${project.artifactId}</warName>
<outputDirectory>${wlp.install.dir}/usr/servers/liberty/apps</outputDirectory>
<failOnMissingWebXml>false</failOnMissingWebXml>
</configuration>
</plugin>
When I run mvn package I can see this step running:
[INFO] --- maven-war-plugin:2.3:war (default-war) @ frontEnd ---
That's great. However, I also want to run a shell script before the war file is created. I tried adding
<plugin>
<artifactId>maven-antrun-plugin</artifactId>
<version>1.7</version>
<configuration>
<tasks>
<exec dir="${basedir}"
executable="${basedir}/src/main/webapp/concat"/>
</tasks>
</configuration>
</plugin>
before the maven-war plugin, but it does not run. I don't even see antrun in the output of mvn. Adding the <tasks> element to the <configuration> for maven-war-plugin does nothing either.
What can I do to have maven simply run a script as part of mvn package?
The position in the pom.xml is irrelevant; you have to bind the maven-antrun-plugin execution to the correct lifecycle phase (e.g. compile), as shown below:
<plugin>
<artifactId>maven-antrun-plugin</artifactId>
<version>1.7</version>
<executions>
<execution>
<phase> <!-- a lifecycle phase --> </phase>
<configuration>
<target>
<!--
Place any Ant task here. You can add anything
you can add between <target> and </target> in a
build.xml.
-->
</target>
</configuration>
<goals>
<goal>run</goal>
</goals>
</execution>
</executions>
</plugin>
See The maven-antrun-plugin Usage Page for more details and The Maven Introduction to the Build Lifecycle for further reference.
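For the concat script from the original question, a concrete version of that binding might look like the sketch below. The prepare-package phase is an assumption (chosen because it runs after compilation and before the war goal), and the execution id is arbitrary:
<plugin>
  <artifactId>maven-antrun-plugin</artifactId>
  <version>1.7</version>
  <executions>
    <execution>
      <id>run-concat</id>
      <!-- prepare-package runs after compile and before the war is assembled -->
      <phase>prepare-package</phase>
      <goals>
        <goal>run</goal>
      </goals>
      <configuration>
        <target>
          <!-- the same exec task as in the original <tasks> attempt -->
          <exec dir="${basedir}" executable="${basedir}/src/main/webapp/concat"/>
        </target>
      </configuration>
    </execution>
  </executions>
</plugin>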

How do I include test classes and configuration in my war for integration testing using maven?

I currently have a maven web project that I am attempting to write integration tests for. For the structure of the project, I've defined test stubs under src/test/java, whilst the spring bean definitions for these stubs sit under src/test/resources.
What I would like is that, when I build my war artifact, all of the test stub classes are compiled and included in the war along with the Spring bean definition files. I've tried to do this with the maven-war-plugin, but the only things I've been able to copy are the resources. Simply put, I'd like to make use of the test classpath and include all of these classes in my war file.
It seems the useTestClassPath option of the maven jetty plugin would solve my problem, but the project I'm working on currently uses Tomcat 6.0. Is there another Maven plugin, or a way I can configure the maven-war-plugin, to achieve my objective?
You can also do it straightforwardly. This will add both test classes and test resources to the WEB-INF/classes:
<plugin>
<artifactId>maven-antrun-plugin</artifactId>
<version>1.7</version>
<executions>
<execution>
<phase>process-test-classes</phase>
<configuration>
<target>
<copy todir="${basedir}/target/classes">
<fileset dir="${basedir}/target/test-classes" includes="**/*" />
</copy>
</target>
</configuration>
<goals>
<goal>run</goal>
</goals>
</execution>
</executions>
</plugin>
I also recommend placing it in a separate profile, such as "integration", and overriding the war name in that profile so that the normal war (without tests packaged in) cannot be confused with the testing war.
The full example with the profile is here. You may run mvn clean package to get a war war-it-test.war without tests included, or mvn clean package -Pintegration to get a war war-it-test-integration.war with the tests included.
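A minimal sketch of such a profile, assuming the profile id and the overridden war name are free choices:
<profiles>
  <profile>
    <id>integration</id>
    <build>
      <!-- hypothetical name so the test war cannot be confused with the normal one -->
      <finalName>${project.artifactId}-integration</finalName>
      <plugins>
        <!-- move the antrun copy execution shown above into this profile -->
      </plugins>
    </build>
  </profile>
</profiles>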
I believe the following configuration for the maven war plugin would do what you want. You copy your test-classes to your WEB-INF/classes folder. You can even filter those resources.
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-war-plugin</artifactId>
<executions>
<execution>
<id>generate-test-war</id>
<phase>pre-integration-test</phase>
<goals>
<goal>war</goal>
</goals>
</execution>
</executions>
<configuration>
<warSourceDirectory>${basedir}/src/test/webapp</warSourceDirectory>
<warName>${project.artifactId}-test</warName>
<webappDirectory>${basedir}/target/${project.artifactId}-test</webappDirectory>
<primaryArtifact>false</primaryArtifact>
<webResources>
<resource>
<directory>${basedir}/target/test-classes</directory>
<targetPath>WEB-INF/classes</targetPath>
</resource>
</webResources>
</configuration>
</plugin>
See http://maven.apache.org/plugins/maven-war-plugin/examples/adding-filtering-webresources.html
You can use the build-helper-maven-plugin to add additional folders to the "normal" class path.
But I would recommend creating a new folder for your integration tests (for example src/it/java) and adding this folder rather than the "normal" test folder (src/test/java); do the same for the resources folder.
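A minimal sketch of that approach with the build-helper-maven-plugin; the src/it/java and src/it/resources folders follow the suggestion above, and the plugin version is only an example:
<plugin>
  <groupId>org.codehaus.mojo</groupId>
  <artifactId>build-helper-maven-plugin</artifactId>
  <version>3.3.0</version>
  <executions>
    <execution>
      <id>add-it-sources</id>
      <phase>generate-sources</phase>
      <goals>
        <goal>add-source</goal>
      </goals>
      <configuration>
        <sources>
          <!-- integration-test stubs live here instead of src/test/java -->
          <source>src/it/java</source>
        </sources>
      </configuration>
    </execution>
    <execution>
      <id>add-it-resources</id>
      <phase>generate-resources</phase>
      <goals>
        <goal>add-resource</goal>
      </goals>
      <configuration>
        <resources>
          <!-- Spring bean definitions for the stubs -->
          <resource>
            <directory>src/it/resources</directory>
          </resource>
        </resources>
      </configuration>
    </execution>
  </executions>
</plugin>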
Instead of the maven-antrun-plugin, you could use the maven-resources-plugin and configure it like this:
<plugin>
<artifactId>maven-resources-plugin</artifactId>
<version>2.5</version>
<executions>
<execution>
<phase>process-test-classes</phase>
<id>test-classes</id>
<goals>
<goal>copy-resources</goal>
</goals>
<configuration>
<overwrite>false</overwrite>
<outputDirectory>${project.build.directory}/${project.build.finalName}/WEB-INF/classes</outputDirectory>
<resources>
<resource>
<directory>${project.build.directory}/test-classes</directory>
</resource>
</resources>
</configuration>
</execution>
</executions>
</plugin>
Use the Tomcat 7 plugin with an additional classpath directories configuration.
<plugin>
<groupId>org.apache.tomcat.maven</groupId>
<artifactId>tomcat7-maven-plugin</artifactId>
<version>2.2</version>
<configuration>
<additionalClasspathDirs>
<additionalClasspathDir>${project.build.testOutputDirectory}</additionalClasspathDir>
</additionalClasspathDirs>
</configuration>
<executions>
<execution>
<id>start-tomcat</id>
<goals>
<goal>run</goal>
</goals>
<configuration>
<fork>true</fork>
</configuration>
</execution>
</executions>
</plugin>
You can use the following configuration in your pom.xml to add the test classes to your jar or war file without getting any errors in pom.xml:
<pluginManagement>
<plugins>
<plugin>
<groupId>org.eclipse.m2e</groupId>
<artifactId>lifecycle-mapping</artifactId>
<version>1.0.0</version>
<configuration>
<lifecycleMappingMetadata>
<pluginExecutions>
<pluginExecution>
<pluginExecutionFilter>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-antrun-plugin</artifactId>
<versionRange>[1.7,)</versionRange>
<goals>
<goal>run</goal>
</goals>
</pluginExecutionFilter>
<action>
<execute />
</action>
</pluginExecution>
</pluginExecutions>
</lifecycleMappingMetadata>
</configuration>
</plugin>
</plugins>
</pluginManagement>
<plugins>
<plugin>
<artifactId>maven-antrun-plugin</artifactId>
<version>1.7</version>
<executions>
<execution>
<phase>process-test-classes</phase>
<configuration>
<target>
<copy todir="${basedir}/target/classes">
<fileset dir="${basedir}/target/test-classes" includes="**/*" />
</copy>
</target>
</configuration>
<goals>
<goal>run</goal>
</goals>
</execution>
</executions>
</plugin>
</plugins>
We need to add the below plugin to the pom.xml in order to add the test classes to the jar. Thanks to @IvonSopov.
<plugin>
<artifactId>maven-antrun-plugin</artifactId>
<executions>
<execution>
<phase>process-test-classes</phase>
<configuration>
<target>
<copy todir="${basedir}/target/classes">
<fileset dir="${basedir}/target/test-classes" includes="**/*" />
</copy>
</target>
</configuration>
<goals>
<goal>run</goal>
</goals>
</execution>
</executions>
</plugin>
After adding this plugin, the build succeeds and the test classes are added to the jar, but pom.xml then shows an error like:
Plugin execution not covered by lifecycle configuration:
org.apache.maven.plugins:maven-antrun-plugin:1.8:run
(execution:default, phase: process-test-classes)
In order to remove this error, we need to include the below pluginManagement block as a separate element within the build tag (not inside the plugins element we added earlier):
<pluginManagement>
<plugins>
<plugin>
<groupId>org.eclipse.m2e</groupId>
<artifactId>lifecycle-mapping</artifactId>
<version>1.0.0</version>
<configuration>
<lifecycleMappingMetadata>
<pluginExecutions>
<pluginExecution>
<pluginExecutionFilter>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-antrun-plugin</artifactId>
<versionRange>[1.7,)</versionRange>
<goals>
<goal>run</goal>
</goals>
</pluginExecutionFilter>
<action>
<execute />
</action>
</pluginExecution>
</pluginExecutions>
</lifecycleMappingMetadata>
</configuration>
</plugin>
</plugins>
</pluginManagement>
Now we can create the jar which includes the test classes without any errors.
