How to run a Maven plugin execution only if the resulting output is not already present

I have inherited a POM that attempts to avoid repeating build steps by using a profile
that is only activated when the step output does not exist:
<profile>
  <id>run-once</id>
  <activation>
    <file>
      <missing>target/some-output</missing>
    </file>
  </activation>
  <build>
    <plugins>
      <plugin>
        ...
        <executions>
          <execution>
            ... slow process to produce target/some-output ...
          </execution>
        </executions>
      </plugin>
    </plugins>
  </build>
</profile>
However, as Maven experts no doubt realized immediately, this does not work if the developer runs mvn clean install. Maven calculates the active profiles once, before running clean, so if target/some-output was present at that point, the run-once profile is not active. The result is that target/some-output is removed by the clean phase but is not recreated later in the build, and the ensuing WAR is broken because some-output is missing.
Is there a standard solution to this problem (besides avoiding mvn clean install)? I'm about to make the plugin execution unconditional to prevent the silent creation of a broken WAR.
More generally, is there a standard technique to keep Maven from recreating artifacts like some-output when they are already up to date? Or is the idea that if make-style dependency management is important, one should use Gradle or Rake instead of Maven?

I don't think there is a standard solution to this problem. There are, though, various options I can think of (and most likely others as well):
You could obviously activate the profile manually with mvn clean install -Prun-once, but then you have to remember to do that every time, of course.
Configure the maven-enforcer-plugin together with its requireFilesExist rule to make sure the files exist and fail the build if they don't (at least then you won't get the silent creation of a broken WAR); a sample configuration is sketched after this list.
Modify the profile to have it create the files in a location under your src folder (e.g. src/main/gen) that is excluded from being checked into your source repository (if you are using one), and then configure the maven-resources-plugin and its copy-resources goal to copy these resources to the correct location under your build directory; that way clean won't delete them (see the second sketch below).
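For the enforcer option, a minimal sketch might look like the following (the plugin version, execution id and the verify phase are illustrative assumptions, not taken from the original POM):
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-enforcer-plugin</artifactId>
  <version>3.0.0-M3</version>
  <executions>
    <execution>
      <id>enforce-output-exists</id>
      <phase>verify</phase>
      <goals>
        <goal>enforce</goal>
      </goals>
      <configuration>
        <rules>
          <requireFilesExist>
            <files>
              <!-- fail the build if the slow step's output is missing -->
              <file>${project.build.directory}/some-output</file>
            </files>
          </requireFilesExist>
        </rules>
      </configuration>
    </execution>
  </executions>
</plugin>
For the resources option, a sketch assuming the hypothetical src/main/gen location mentioned above:
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-resources-plugin</artifactId>
  <executions>
    <execution>
      <id>copy-generated-output</id>
      <phase>process-resources</phase>
      <goals>
        <goal>copy-resources</goal>
      </goals>
      <configuration>
        <!-- copy the pre-generated files back under target/ so clean cannot lose them -->
        <outputDirectory>${project.build.directory}/some-output</outputDirectory>
        <resources>
          <resource>
            <directory>src/main/gen</directory>
          </resource>
        </resources>
      </configuration>
    </execution>
  </executions>
</plugin>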

Related

How to disable jar creation in commandline in a maven project?

I have a maven project for which I'm running two separate builds.
In one build I want to save build time by disabling jar creation for the Maven modules in it (there are 45 Maven modules). The maven-jar-plugin is used to create the jars.
I want to conditionally disable jar creation from the command line, that is, something similar to -DskipTests, which skips the unit tests even though the surefire plugin is bound by default.
The maven-jar-plugin does not provide any skip option.
However, several ways are possible to achieve your requirement.
You may simply skip the phase that brings in jar creation by default (via the default lifecycle bindings), that is, the package phase, and invoke
mvn clean test
The later phases would not make much sense without a jar anyway: package, install and deploy would have nothing to process. Depending on your strategy for integration tests, if any, the integration-test phases may also be affected.
Alternatively, you can configure your POM as follows:
<properties>
  <jar.creation>package</jar.creation>
</properties>
<build>
  <plugins>
    <plugin>
      <groupId>org.apache.maven.plugins</groupId>
      <artifactId>maven-jar-plugin</artifactId>
      <version>3.0.0</version>
      <executions>
        <execution>
          <id>default-jar</id>
          <phase>${jar.creation}</phase>
        </execution>
      </executions>
    </plugin>
  </plugins>
</build>
With this setup the default behavior still produces a jar, while executing Maven as follows:
mvn clean install -Djar.creation=false
would instead skip the creation of the jar.
What we are actually doing:
We are re-defining the default execution of the maven-jar-plugin
We are overriding its execution id (default-jar), thereby gaining control over that specific execution
We are binding its execution to a phase that is configurable via a property
The default property value keeps the binding on the package phase
On the command line you can change it to any value that is not a standard Maven phase; -Djar.creation=none would work just as well.

How do I configure my Maven-war-plugin's useCache feature so that consecutive builds are faster?

I’m using Maven 3.3.0 on Mac Yosemite. I wanted to make use of the maven-war-plugin’s useCache feature, but it isn’t doing anything in my multi-module project. When I run
mvn clean install -DskipTests
my project takes about 1:25 to run with the below configuration
<profile>
  <id>prepare-deploy-war-to-jboss</id>
  <activation>
    <file>
      <exists>${basedir}/src/main/webapp</exists>
    </file>
  </activation>
  <build>
    <plugins>
      <plugin>
        <groupId>org.apache.maven.plugins</groupId>
        <artifactId>maven-war-plugin</artifactId>
        <version>2.6</version>
        <configuration>
          <useCache>true</useCache>
          <cacheFile>/tmp/${project.artifactId}/war/work</cacheFile>
        </configuration>
      </plugin>
    </plugins>
  </build>
</profile>
Then I run the same command again and the project takes the same amount of time. I can see "work" files getting created, so the plugin is definitely running, but consecutive builds do not seem to get any faster.
My question here isn't so much why useCache isn't speeding up my build, but how I can configure the plugin differently so that consecutive runs do speed up the build. If there is another plugin I should be using that would speed up back-to-back builds, that would also suffice here.
Looking at the WAR mojo code (at the time of writing), the cache is mainly used by the webapp structure handling for overlay management, so in most cases it would indeed not improve build time.
Moreover, as its official documentation states, the cache mechanism is an experimental feature, hence disabled by default, and it probably does not (yet) meet user expectations.
Regardless of the effectiveness of this cache option, some hints to speed up Maven builds:
Consider whether you really need to clean at each and every run
Consider building offline (-o option) if everything you need is already on your local cache
Consider using threads during your build (-T option)
Consider going into quiet mode (-q option), switching off the build log temporarily and getting only error logs (basically: no news, good news); a combined example of these command-line options appears at the end of this answer
In your case, the War Plugin is activated by the existence of a directory structure typical of war packaging, which probably means this profile sits in the aggregator/parent POM and is activated only in the war module. Although the impact is likely very small, also consider moving the War Plugin configuration to the module concerned and avoiding such profile-triggered configuration
Last but not least, during development build time probably matters more than war size, so you could switch off the default re-compression of external libraries added to the war file via the recompressZippedFiles option:
Indicates if zip archives (jar,zip etc) being added to the war should be compressed again. Compressing again can result in smaller archive size, but gives noticeably longer execution time.
Default: true
So a sample configuration would look like:
<properties>
  <war.recompress.files>false</war.recompress.files>
</properties>
<build>
  <finalName>webapp</finalName>
  <plugins>
    <plugin>
      <groupId>org.apache.maven.plugins</groupId>
      <artifactId>maven-war-plugin</artifactId>
      <version>2.6</version>
      <configuration>
        <recompressZippedFiles>${war.recompress.files}</recompressZippedFiles>
      </configuration>
    </plugin>
  </plugins>
</build>
Note: since there is no user property for this configuration entry, I also added a property for it, to switch it on/off on demand via the command line (or via a profile).
You can then compare execution times by running the default build (with the configuration above, which disables recompression) against a run that switches recompression back on on demand:
mvn clean install -Dwar.recompress.files=true
You may then consider moving this into a profile to switch it on/off depending on the development phase.
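As a side note on the first hints above (offline, parallel and quiet modes), a combined invocation, assuming all required artifacts are already in the local repository, could look like:
mvn install -o -T 1C -q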

separate local download and install repositories using maven?

I want to rebuild my project structure from scratch from time to time and want to purge the built repository in order to do that. However, I don't want to remove downloaded files from Maven Central and other repositories. Is there a simple way to tell Maven to install my built artifacts into a separate repository, i.e. other than the one used to store downloaded, external files?
I am NOT talking about deploy, just mvn install.
UPDATE
I found an alternate solution using only one local repository for both downloaded and self-built artifacts: the self-built ones are accompanied by files called "maven-metadata-local.xml", so I select the repository directories to purge based on the existence of that file now...
You cannot do that with the install goal. maven-install-plugin will install the artifact to the same local repository that downloaded artifacts are fetched into. By default, this is ${user.home}/.m2/repository. You can change that by setting the system property maven.repo.local to another location (or by telling Maven to use a specific settings.xml). However, at the moment Maven cannot be configured to install specific artifacts to a local repository different from the one it fetches downloads into.
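For example, pointing the whole local repository elsewhere for a single invocation (keep in mind this relocates downloaded artifacts too, not only the ones you install; the path is just a placeholder) would look like:
mvn install -Dmaven.repo.local=/path/to/alternate/repo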
A possible work-around would be to declare an execution of the install-file goal, bound to the install phase, and configure it to install all of the artifacts you want into the specified local repository.
<plugin>
  <artifactId>maven-install-plugin</artifactId>
  <version>2.5.2</version>
  <executions>
    <execution>
      <phase>install</phase>
      <goals>
        <goal>install-file</goal>
      </goals>
      <configuration>
        <file><!-- path to artifact to install --></file>
        <pomFile><!-- path to POM of artifact --></pomFile>
        <localRepositoryPath><!-- path to repository you want to install to --></localRepositoryPath>
      </configuration>
    </execution>
  </executions>
</plugin>
MINSTALL-126 is an enhancement request asking whether this could be added to the maven-install-plugin. In the meantime, see the following workaround, slightly extending what's proposed above, from a blog post I wrote at http://blog2.vorburger.ch/2016/06/maven-install-into-additional.html, which has some background on why this is useful:
<profiles>
  <profile>
    <activation>
      <property>
        <name>addInstallRepositoryPath</name>
      </property>
    </activation>
    <build>
      <plugins>
        <plugin>
          <artifactId>maven-install-plugin</artifactId>
          <executions>
            <execution>
              <id>additional-install</id>
              <phase>install</phase>
              <goals>
                <goal>install-file</goal>
              </goals>
              <configuration>
                <file>${project.build.directory}/${project.build.finalName}.jar</file>
                <localRepositoryPath>${addInstallRepositoryPath}</localRepositoryPath>
              </configuration>
            </execution>
          </executions>
        </plugin>
      </plugins>
    </build>
  </profile>
</profiles>
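With such a profile in the POM, an additional install into a separate repository can then be requested on demand (the path below is just a placeholder), for example:
mvn clean install -DaddInstallRepositoryPath=/path/to/other/repo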
To formalize and expand on the “update” in the question (by the way you should not hesitate to answer your own question):
I came to a similar conclusion independently and include
find -L ~/.m2/repository \( -type d -name '*-SNAPSHOT' -prune -o -type f -name maven-metadata-local.xml \) -exec rm -rfv {} \;
in a general “cleanup” script that I run from time to time. Note that this differs from the ideal of install:install always going to a separate location in (at least) three ways:
You have to remember to run this script, so in the meantime you could have a local repository "polluted" with things you built. On occasion this will mean that a build works locally for you but does not work for others. (Or even fails only for you, or succeeds for everyone but behaves subtly differently.) This defeats the goal of reproducible builds, unless you have plenty of Internet bandwidth and are willing to run docker run --rm -v "$PWD":/usr/src/mymaven -w /usr/src/mymaven maven mvn clean install!
If someone has intentionally deployed SNAPSHOTs to a shared repository, this script will delete them, so your next build will have to repeat the download.
Local installs of release versions are not deleted. Now if these came from release:perform because you were the one cutting the release, that is not so bad—presumably the remote artifact is identical to your local copy anyway. Where this gets really evil is if, in the course of trying to debug some problem in someone else’s released artifact by rebuilding from sources with some diagnostic patches (say), you forget to edit pom.xml to use a SNAPSHOT or other distinguishing version, and install the result. Maven will never notice that your local copy differs from the official version, and you can get into weird situations months later. Of course this has never happened to me.
The latter two problems could perhaps be addressed with a more complicated script that parsed maven-metadata-*.xml files rather than assuming that all, and only, SNAPSHOTs were local builds. Or as the submitter hints at, just delete the whole version directory if maven-metadata-local.xml is present (distinguishing this somehow from the parent artifact directory, which will also have such a file, and resolver-status.properties too).
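A hypothetical, untested sketch of that second idea, assuming (and this is only an assumption) that a locally installed version directory is exactly one that contains both a maven-metadata-local.xml and a *.pom, whereas the artifact-level directory carries the metadata file but no POM:
# Hypothetical sketch: remove version directories that look locally installed
# (maven-metadata-local.xml sitting next to a *.pom), leaving artifact-level
# metadata directories and pure downloads alone. GNU find assumed.
find ~/.m2/repository -type f -name maven-metadata-local.xml -printf '%h\n' |
while read -r dir; do
    if ls "$dir"/*.pom >/dev/null 2>&1; then
        rm -rfv "$dir"
    fi
done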
While it is nice that Maven 3 records some information about where artifacts in the local repository came from, it is not good enough. What I for one would really appreciate is if install:install always saved to a distinct location, so that the main local repository could be trusted to be purely a cache of downloads. Local artifact resolution would then prefer one or the other repository in case of conflict based on a command-line switch (after issuing a warning).

Maven install goal does not generate pom for modules

I'm running a multi-module Maven project and am seeing unexpected behavior. This is the first time I've seen this...
My parent module configures the install plugin, defining its classifier.
<plugin>
  <artifactId>maven-install-plugin</artifactId>
  <configuration>
    <generatePom>true</generatePom>
    <classifier>${env}</classifier>
  </configuration>
</plugin>
<!-- ... -->
<modules>
  <module>webapp-formation</module>
  <module>db-formation</module>
</modules>
But when I run mvn install the .pom files are not generated for my modules. Only my parent is associated with a .pom file in my repositories. Thus, trying to browse to my modules' artifacts on Archiva (after running mvn deploy of course!) simply fails. I can browse to the parent but not its children.
So... I need to add the undocumented attribute generatePom to my plugin configuration to have the .pom files generated --copied would be a better word actually-- for all my modules. --I say undocumented attribute because this attribute is documented only for the install-file goal, which is not the one run by default. The install goal does not expect that attribute...
Of course, if I do not configure my install plugin --so not configuring the classifier-- I have no problem and all .pom files are generated properly.
For you guys, is that a normal behavior? Something that you have already seen? Or should I just file a bug?
Thanks,
Olivier.
What you describe as an undocumented attribute is simply wrong, because attributes are specific to each goal, which means the given configuration will not change anything: the generatePom attribute is only valid for the install-file goal. So you can remove it.
In general such a configuration does not make sense, because if you have different environments you should take a different approach. Just remove the configuration with <classifier>${env}</classifier> as well and try to deploy via:
mvn clean deploy

Separate Jenkins-Project for deploying to JBoss

I have a Jenkins build which builds a Maven project with -PmyProfile clean package. This works fine. Now I want the project to be deployable, but as a separate task (JBoss deployment) so it can be triggered explicitly via the Jenkins GUI. For that, I have the following in my POM:
<profiles>
  <profile>
    <id>myProfile</id>
    <properties>...</properties>
    <build>
      <plugins>
        <plugin>
          <groupId>org.jboss.as.plugins</groupId>
          <artifactId>jboss-as-maven-plugin</artifactId>
          <version>7.0.0.Final</version>
          <configuration>
            <hostname>localhost</hostname>
            <port>29999</port>
            <username>admin</username>
            <password>admin</password>
            <filename>${project.build.finalName}.war</filename>
            <name>my-webapp</name>
          </configuration>
        </plugin>
      </plugins>
    </build>
  </profile>
</profiles>
Now I only want to call that single deployment via mvn jboss-as:deploy separately. But how would I do that? If I create a second Jenkins project, everything needs to be built again, so that's pretty stupid. Building as a separate module does not work, either (some error with "building single modules not supported for maven 3").
Any ideas?
Thanks
It sucks a little, but you can always get stuff from another Jenkins workspace by using a filesystem-relative path like ../../SecondJob/workspace (or a symlink). I used to do this for the same case (deploying as a separate job) for all my projects and it works; it's just not elegant, but I believe there's no built-in solution in Jenkins for that.
Alternatively, it seems there's a Jenkins plugin for that, but I haven't used it and can't say anything about it.
Possible trick:
Have only one project, but parameterize it with a DEPLOY parameter set to FALSE by default. The build will contain your main build as well as an "Invoke top-level Maven targets" post-build step for deployment. The deployment step will be invoked only if DEPLOY is TRUE. To do that, use the Conditional Build Step plugin.
There is a new deploy-only goal added in version 7.5.Final of the jboss-as-maven-plugin. You can grab the war from the first job with the Copy Artifact Plugin; see the example after the references below.
References:
https://docs.jboss.org/jbossas/7/plugins/maven/latest/deploy-only-mojo.html
https://github.com/jbossas/jboss-as-maven-plugin/pull/56/commits
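Assuming the plugin version in the POM is raised to at least 7.5.Final and the profile holding its configuration is activated, the separate deployment job could then invoke just the deployment goal, along the lines of:
mvn -PmyProfile jboss-as:deploy-only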
