Maven inheritance and evaluation of properties - maven

Since system scope is said to be deprecated and dangerous, we use a local repository. The repository lives in the parent folder and is used by most of the submodules. Including the repo, however, gets messy: providing its URL is something of a hack.
We tried ${project.parent.basedir}/repo in the submodules, but this evaluates to nothing. We also tried to set it in the parent POM, ...
<repository>
<id>project_repo</id>
<url>file://${project.basedir}/project_repo</url>
</repository>
but Maven ships the URL verbatim to the submodules, which in turn evaluate the property themselves. This led us to the mess of just using the relative parent directory, forcing the submodules to be subfolders of the parent POM:
<url>file://${project.basedir}/../project_repo</url>
This is problem Y. The question concerning X is, why does maven inherit before evaluation and how can I avoid this?

forcing the submodules to be subfolders of the parent pom
Regardless of the other issues you faced, this is actually the recommended approach in general: have a multi-module/aggregation project (the parent) with submodules as subfolders, so that one central entry-point folder (the parent) provides common configuration and governance (its pom.xml file) and hosts the modules (subfolders).
but maven decides to ship the url as given to the submodules which in turn evaluate the property.
Indeed, project.basedir is evaluated as the folder containing the pom.xml currently being built (or sub-built, in the case of a module), since at any given time the building project is the module itself.
From the official documentation concerning project.basedir:
The directory that the current project resides in.
If you want to always point to the folder from which you launched your build (that is, the aggregator/parent project in this case), you could use session.executionRootDirectory instead.
However, be careful, especially if you wish to build a module directly from its own directory: you may then run into trouble (path issues). You should always run the build from the parent, using reactor options such as -pl (projects to build).
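As a sketch, the repository declaration from the question could then look like the following (the id and folder name are simply the ones used in the question; this assumes the repo folder sits next to the aggregator POM):

```xml
<repository>
    <id>project_repo</id>
    <!-- resolves against the directory mvn was launched from, i.e. the aggregator root -->
    <url>file://${session.executionRootDirectory}/project_repo</url>
</repository>
```

A single module would then still be built from the aggregator root, e.g. mvn -pl module1 -am install, so that the property resolves to the expected directory.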
This is also a trigger for further thought: maintenance and readability of the project may suffer from this approach. An enterprise Maven repository would be a better solution.
Further reading on SO:
Maven variable for reactor root
Finding the root directory of a multi module maven reactor project
Update
The question concerning X is, why does maven inherit before evaluation and how can I avoid this?
Concerning your X question, here is the explanation I could find:
The answer lies in the core of a Maven build, the Maven Model Builder:
The effective model builder, with profile activation, inheritance, interpolation, ...
In particular, it performs the following steps in the following order:
phase 1
profile activation
raw model validation
model normalization
profile injection
parent resolution until super-pom
inheritance assembly
model interpolation
url normalization
Bold is mine (inheritance assembly and model interpolation). That is, inheritance assembly is performed before model interpolation simply because the model builder is designed that way.

Related

Is it possible to see the actual pom.xml executed, including parent dependencies/plugins?

I need to extract a project from a repository which uses several layers of parent projects. Every parent project adds some dependencies, plugins or properties. This is becoming a nightmare, as I am no longer able to build the project once I have manually added pieces from the parent projects.
Is there a way to create a list of all dependencies/plugins/properties which are linked by a single pom.xml so that I can build a portable, single Maven project?
Thanks
You can create the effective POM (https://maven.apache.org/plugins/maven-help-plugin/effective-pom-mojo.html), which is a kind of merge of all parent POMs.
This is useful to understand the complete list and configuration of plugins.
Whether this helps you to build a "portable" Maven project, I don't know. Without the appropriate Maven repositories with all the plugins, dependencies and so on, Maven will not build.
The base concept of Maven is CoC (Convention over Configuration). Maven has a SuperPOM, and every model inherits from it. The SuperPOM is located in the maven-model-builder JAR. Here is the source: https://maven.apache.org/ref/3.6.2/maven-model-builder/super-pom.html
Each Maven goal uses a merged model called the effective POM. The help plugin has an effective-pom goal which displays the full model, including the parent model(s) and the SuperPOM.
So the answer is: just run mvn help:effective-pom to see the actual model.
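For example (the output file name here is just an illustration):

```shell
# print the effective POM (parents + SuperPOM merged) to the console
mvn help:effective-pom

# or dump it to a file for easier inspection
mvn help:effective-pom -Doutput=effective-pom.xml
```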

Building "out of source" with Maven

Summary:
I would like to know how to build Maven applications consisting of several Maven projects, with all inputs coming from a read-only source tree and all outputs going to a given (temporary) folder, without breaking IDE (Netbeans) support. None of the proposed solutions works for me, so I ask in detail to explain my problem as well as possible.
In detail:
For quite a while I have been trying to get my Maven builds integrated into our production build environment, which has several requirements:
no internet access (all inputs must be under version control)
source tree is read-only
all build artifacts must be below a build directory
well defined toolchain (under version control or in read-only Virtual Machine)
libraries shall not know which applications use them (i.e. "no parent POM")
to support development cycle, preferably it should work with IDEs (Netbeans), optional
to support development cycle, preferably it should work incrementally, optional
This is similar to Out-of-tree build with maven. Is it possible? and other questions related to maven build directories, but unfortunately none of the proposed solutions works here.
I have the "central repository" accessible via a "file:" URL, as well as the "local repository" below the (temporary) build directory, but I did not find any way to have Maven's "target" directories below the build directory without breaking anything else.
I have several applications that share some libraries. For each library I have a separate "parent POM" (violating the DRY principle, because I found no better way). Each contains application- and environment-specific settings such as the distributionManagement repository path, which is defined using ${env.variable} so that the build tool can inject the system- and application-specific values. Default values are provided to keep developers using Netbeans happy.
Unfortunately this does not work for build directory.
I could set a build directory as in Maven: How to change path to target directory from command line?. Then all class files (of all libraries and applications) built by one parent POM are put into one and the same directory. Maven can be run once and works, but any later run will fail! At least the unit tests fail, because Maven (Surefire) finds the tests of all projects, not only of the one Maven is currently processing. So it tries to run unit tests from the application while building library A (which in my case fails because those tests also need library B, which is not provided as a dependency of library A).
I also tried constructions like "-DbuildDirectory=$BUILDDIR/maven-target/\${project.name}". This works for compiling the sources, but again not for the tests. Although Maven (3.1.1) evaluates ${project.name} correctly when storing the files, it passes it literally (!) to the classpath, so when compiling the unit tests javac does not find the test objects ("error: cannot find symbol"). Having a symlink named ${project.name} pointing to the correct name of course works only for one project (library A, for example), because I found no way to change it from within a pom.xml.
Currently my closest approach is a global target directory that is cleaned before each build, but obviously this is bad when developing with this build system (working incrementally without an IDE).
Now I'm even considering generating the pom.xml files from the build system with the values filled in, and copying all sources to the build tree. This should work for our defined release builds, but would be uncomfortable during normal development (with an IDE).
Is there any trick or is this really impossible with Maven?
EDITED to answer the questions
Thanks so much for spending time on my issue / help request! I answer as clearly as possible.
A team mate suggested to add the motivation for all this:
It should be possible to reproduce any release in ten or 20 years, even if the internet has changed.
Why do you need to change the value of "target"?
Because the source tree is read-only, Maven cannot create the directory at its default position. A writable build directory is available, but outside the source code tree.
Maven will not write to /src, so no problem there
"source tree" does not reference to the "src" folder within the sources / inputs for the build process, but to the whole structure - including POM files, resources and even binary libs if needed. So the directory that contains src also contains pom.xml and belongs to read-only VCS controlled "source tree".
Could you give an example of something that does not work with the standard project layout?
An example of such a read-only source tree: burn a whole project structure (application with libraries) onto a CD-R and ensure that "mvn compile test package deploy" works.
(One motivation for the read-only source-tree requirement is to ensure that the build process does not accidentally (or intentionally) manipulate the source code, which could cause the build artifacts to change with the number of repeats, breaking reproducibility.)
using a repository manager
As I understand it, a repository manager is a complex service, typically running on a dedicated host. I think using file URLs instead of such a complex service keeps the dependencies smaller. Please note that the whole repository has to be version controlled, so automatic updating must not happen anyway, although, as I understand it, that is the main advantage of a repository manager.
Do you use dependency jar's
Yes, sure. The libraries and applications depend on several JARs. Even Maven depends on several JARs, such as the compiler plugin.
If your request states that you need to build all dependencies yourself ?
Actually I'm afraid that this could be the case, and yes, I'm aware this would be a huge effort, unfortunately. Even with Debian packages this was quite difficult to realize, and they put a lot of effort into making it possible. At the moment I don't see a realistic chance of building all these libraries, unfortunately.
"...all build artifacts must be below a build directory"- Can you please explain that in detail?
The build process is supposed to create all files below a given build directory, for example /tmp/$USER/build/$PROJECTNAME/$PROJECTVERSION/$APPLICATION/$VARIANT/. The build system can create arbitrary structures below that, but is not supposed to change anything outside this. A list of files within this tree is defined as output (build result) and taken from there. Typically some binaries or ZIP files are copied to version control system.
Some call this "out of tree build", "build out of source" or similar.
well defined toolchain
The idea is to have a virtual machine image (under version control), create a dynamic "clone" of it for each build, install the needed toolchains (from version control), for example the build tools, the Maven installation and its repository, give it access to the specific version of the sources to be built (for example, in form of a read-only ClearCase view), run a build entry-point script, collect the created outputs (for example some "release.zip" and "buildlogs.zip") and finally discard the whole virtual machine including all its contents. Of course the toolchains could be fully preinstalled in the image; it is just not done for practical reasons (and yes, we also use Docker).
For impact analysis it's very important thing to know who is using which library ?
Yes, this surely is very true. However, the sources should abstract from it, and usually do. For example, the authors of JUnit don't know everybody who is using it. We like to use a library in several projects at the same time, so we cannot reference a single parent POM, because there would be multiple parent POMs.
I hope I was able to explain better this time.
EDIT #2 to answer the new questions
Thanks again for your time getting through this long post! I hope this time I was able to explain the last missing bits and make it clear. I think some of the questions seem to be comments about the requirements. I'm afraid we could slip into a discussion.
In maven you have a directory structure like:
root
+- pom.xml
+- src
The root directory is under version control. So I don't understand the following:
"source tree" does not reference to the "src" folder within the
sources / inputs for the build process, but to the whole structure -
root
+- pom.xml
+- src
+-- ...
+- target
+-- ...
I added "target" to the structure to better illustrate the problem. As you say, the root directory is under version control and thus read-only. Thus root/pom.xml, root/src and root itself are read-only.
When Maven tries to create root/target, it gets an error (Read-only file system), because root and all its subfolders are read-only.
Here even the parent folder is under version control, and in the case of ClearCase there are even more additional read-only parent directories, but I think this does not matter here.
We are talking about terra bytes of artifacts....They have to be backed up but not checked into version control.
This is out of scope, but I try to answer nevertheless.
The one ClearCase instance I'm working on right now, for example, has 2.2 TB, so terabytes, exactly. Backup may not be sufficient if standards require traceability of all changes between releases, for example. However, I think this is out of scope of my question.
a file-based repository does not work well and needs to be copied to each machine where you use it, which is a hassle, or requires a kind of network file system mounted on many machines. A repository manager works http(s)-based, which is much simpler to handle.
I think this is out of scope and not related to my question, but I try to answer anyway.
The problem is that, for reproducibility, you would need to archive everything behind this http(s) service for the next ten or 20 years and keep it running. With history, because it might be required to reproduce a five-year-old state. Of course the needed packages might not be available on the internet anymore (remember Codehaus.org?). Challenging tasks.
a repository manager is much more simpler and keeps the conventions
Unfortunately it does not fulfill the requirements, or only at a high price (managing additional virtual machines plus version control of the stored packages).
building all dependencies yourself: I think that is simply not worth the effort. I don't see a real advantage in that
I think such requirements are not uncommon (https://en.wikipedia.org/wiki/Source_code_escrow), but I'm afraid we are getting off topic and start discussing requirements.
If the JUnit team can access the request from maven central (which can be done) they can see who is using JUnit
I think it's off-topic, but how should the JUnit developers be able to know all "my" projects that use JUnit?
Where is the relationship to the parent pom?
I'm sorry if I caused confusion, this was a side note only. I think it is out of scope of my question. I'll try to answer nevertheless.
To avoid redundancy, one approach in Maven is to use a parent POM and inherit from it. For example, you can define distributionManagement in a parent POM and make the library POMs inherit from it. When you have project-specific distributionManagement requirements, you need project-specific parent POM files. If two projects share one and the same library, which of the two project POMs should the library inherit from (have as its parent)? Having the distributionManagement setting in an aggregator POM does not work, as it is not propagated.
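To illustrate the redundancy described above: each application family ends up with its own copy of a parent POM like the following, differing only in the injected values (the coordinates, repository id and environment variable here are illustrative, not taken from the actual project):

```xml
<project xmlns="http://maven.apache.org/POM/4.0.0">
    <modelVersion>4.0.0</modelVersion>
    <groupId>com.example.appA</groupId>
    <artifactId>appA-parent</artifactId>
    <version>1.0</version>
    <packaging>pom</packaging>

    <!-- inherited by every library POM that names this parent;
         a library shared by appA and appB cannot inherit from both -->
    <distributionManagement>
        <repository>
            <id>build-integration</id>
            <url>file:///${env.DEPLOY_DIR}</url>
        </repository>
    </distributionManagement>
</project>
```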
EDIT #3 to explain read-only
Hi @JF Meier, thank you for your comment.
Read-only means it cannot be written to, so Maven cannot create the target directory. Because of this, javac cannot store the class files and compilation aborts.
Maybe it is too abstract, so let me give a full real example:
cat pom.xml:
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<groupId>test</groupId>
<artifactId>mavenproject1</artifactId>
<version>1.0-SNAPSHOT</version>
</project>
cat src/main/java/HelloWorld.java:
public class HelloWorld {
public static void main(String[] args) { System.out.println("Hello, World!"); }
}
Now copy & paste from the shell (I cut a few boring lines):
steffen@node1:/view/dev_test_project_v1/vobs/playground/steffen/minimal_pom $ mvn compile
[...]
[ERROR] Failure executing javac, but could not parse the error:
javac: directory not found: /view/dev_test_project_v1/vobs/playground/steffen/minimal_pom/target/classes
This is one problem. A small detail I have to correct: it is not showing the Read-only file system error here, which I think is a bug, but in strace we can see it:
21192 mkdir("/view/dev_test_project_v1/vobs/playground/steffen/minimal_pom/target/classes", 0777) = -1 ENOENT (No such file or directory)
21192 mkdir("/view/dev_test_project_v1/vobs/playground/steffen/minimal_pom/target", 0777) = -1 EROFS (Read-only file system)
21192 mkdir("/view/dev_test_project_v1/vobs/playground/steffen/minimal_pom/target", 0777) = -1 EROFS (Read-only file system)
This leads to the javac: directory not found error.
It is read-only:
steffen@node1:/view/dev_test_project_v1/vobs/playground/steffen/minimal_pom $ mkdir -p target/classes
mkdir: cannot create directory ‘target’: Read-only file system
The mkdir tool correctly shows the error message.
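The same failure mode can be reproduced without ClearCase by making a plain directory read-only with chmod (a minimal sketch; it assumes a non-root shell, since root bypasses permission checks):

```shell
# Reproduce the failure mode with a plain read-only directory (no ClearCase).
demo=$(mktemp -d)
mkdir -p "$demo/root/src"
chmod a-w "$demo/root"          # simulate the read-only source tree
if mkdir "$demo/root/target" 2>/dev/null; then
    echo "unexpected: target could be created (running as root?)"
else
    echo "mkdir failed: directory is read-only"
fi
chmod u+w "$demo/root"          # restore write permission so cleanup works
rm -rf "$demo"
```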
Out-of-tree builds with Maven and no Internet
In case someone else faces similar requirements, I describe my solution in form of an example. I understand and happily accept that most people surely do not want or need "out-of-tree" builds with Maven at all. This is for the very few others, like me, who have no choice.
So this is not suited for "normal Maven builds", but only for the specific requirements described in the question (no internet access, everything read-only except a /tmp folder). Such a configuration could be a build system in a virtual machine which is run from a VM template, dynamically instantiated by Jenkins jobs.
A word about ClearCase
This approach is not ClearCase specific, but the example uses ClearCase paths.
The following example uses a read-only ClearCase view toolchain_view mounted at /view/toolchain_view/ (containing Maven, all Java packages and other build tooling) and a read-only source code view (containing an application) available below /view/steffen_hellojava_source_view/. Instead of /view/toolchain_view/, one could use /opt/vendor/name/toolchain/version/ or similar, and any other version control system for the sources, so this approach is not ClearCase-specific.
For those who don't know ClearCase, a word about it: the view behaves similarly to an NFS file system, where the server selects the file content based on a version description. The file content itself is in a database called a Versioned Object Base (VOB). Using a so-called Configuration Specification (CS), one can define which content version to show for which file (element), in form of selection rules (such as selecting by a LABEL). For example:
---[ toolchain-1_4_7_0.cs ]---------------------------------------->8=======
element /vobs/playvob/toolchain/... TOOLCHAIN_VERSION_1_4_7_0
# TOOLCHAIN_VERSION_1_4_7_0 automatically also selects:
# element /vobs/playvob/toolchain/3p/maven/... TOOLCHAIN_MAVEN_3_1_1_0
# element /vobs/playvob/toolchain/3p/maven-repo/... TOOLCHAIN_MAVENREPO_1_0_1_0
# element /vobs/playvob/toolchain/3p/java/... TOOLCHAIN_JAVA_1_7_0_U60
=======8<-------------------------------------------------------------------
Now a ClearCase view such as toolchain_view can be created and configured to use these selection rules: cleartool -tag toolchain_view -setcs toolchain-1_4_7_0.cs. In the example it is mounted as /view/toolchain_view/, which prefixes the element (file) paths.
Configuring Maven
Now we need to configure Maven to
use our file structure /view/toolchain_view/vobs/playvob/toolchain/3p/maven-repo/ as only Central Repository
store the local repository below /tmp and
have the target directories also below /tmp
The first two can be configured in the Maven settings file, but the last one unfortunately seems to require specific changes in the POM file.
mvn_settings.xml
Here is an excerpt from a --global-settings file for Maven. I compacted it a bit. Essential is the use of file:/// URLs for both central repositories, and enforcing this as the one and only mirror:
<!-- Automatically generated by toolchain creator scripts.
This sets the maven-repo version to the correct value, for example
toolchain version 1.4.7.0 defines Maven repo version 1.0.1. -->
<settings ...>
<localRepository>${env.MAVEN_LOCAL_REPO}</localRepository>
<profiles>
<profile>
<id>1</id>
<activation><activeByDefault>true</activeByDefault></activation>
<repositories><repository>
<id>3rdparty</id><name>third party repo</name>
<url>file:///view/toolchain_view/vobs/playvob/toolchain/3p/maven-repo/</url>
<snapshots><enabled>true</enabled><updatePolicy>never</updatePolicy></snapshots>
<releases><enabled>true</enabled><updatePolicy>never</updatePolicy></releases>
</repository></repositories>
<pluginRepositories><pluginRepository>
<id>3rdparty</id><name>third party repo</name>
<url>file:///view/toolchain_view/vobs/playvob/toolchain/3p/maven-repo/</url>
<snapshots><enabled>true</enabled><updatePolicy>never</updatePolicy></snapshots>
<releases><enabled>true</enabled><updatePolicy>never</updatePolicy></releases>
</pluginRepository></pluginRepositories>
</profile>
</profiles>
<!-- Unfortunately **required** for file-based "central repository": -->
<mirrors><mirror>
<id>dehe_repo_1.0.1</id><name>Vendor Name Central 1.0.1</name>
<url>file:///view/toolchain_view/vobs/playvob/toolchain/3p/maven-repo/</url>
<mirrorOf>*,!3rdparty</mirrorOf>
</mirror></mirrors>
</settings>
To ease deployment we could use paths that include version numbers, such as /opt/vendor/name/toolchain/1.4.7.0/mvn_settings.xml, which could be provided by own Debian packages.
Pointing localRepository to a build-specific (temporary) directory by setting an environment variable allows building when everything is read-only except the temporary directory, which is needed for "out-of-tree" builds.
When building with the IDE (Netbeans) we can use this settings file as well, but usually it is more comfortable not to. In that case, however, one has to pay attention not to accidentally add dependencies: if they are not included in the pinned Maven repo, compilation on the build system will break.
hellojava/pom.xml
To support out-of-tree builds, we also need to move the target folders out of the read-only source tree (normally they are created next to pom.xml and src in the Java package directory, which itself is under version control and thus read-only here). This is implemented by using two properties: buildDirectory and deployDirectory. The default buildDirectory is the normal target folder, so when buildDirectory is not set, Maven builds as usual. This is nice, because we don't need a specific POM file for the IDE (Netbeans).
Surefire generates unit test reports, which of course also need to go to our build directory.
<project>
...
<properties>
<project.skipTests>false</project.skipTests>
<project.testFailureIgnore>false</project.testFailureIgnore>
<buildDirectory>${project.basedir}/target</buildDirectory>
<deployDirectory>defaultDeploy</deployDirectory>
</properties>
...
<build>
<!-- https://stackoverflow.com/a/3908061 -->
<directory>${buildDirectory}/hellojava</directory>
<!-- https://stackoverflow.com/a/6733858/9095109 -->
<plugins>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-surefire-plugin</artifactId>
<version>2.12.4</version>
<configuration>
<skipTests>${project.skipTests}</skipTests>
<testFailureIgnore>${project.testFailureIgnore}</testFailureIgnore>
<workingDirectory>${project.build.directory}/test-run</workingDirectory>
</configuration>
</plugin>
</plugins>
</build>
...
<distributionManagement>
<repository>
<id>build-integration</id>
<name>Deployment Repository</name>
<url>file:///${deployDirectory}</url>
</repository>
</distributionManagement>
...
</project>
Putting it all together
The values for the properties in this POM file now have to be set via command-line parameters, and MAVEN_LOCAL_REPO has to be configured, for example:
#!/bin/bash
# Normally in a wrapper which is automatically generated by toolchain creator.
# The path to the pom.xml and build directories then of course are parameters.
export JAVA_HOME="/view/toolchain_view/vobs/playvob/toolchain/3p/java/"
export PATH="/view/toolchain_view/vobs/playvob/toolchain/bin:$PATH"
# or: export PATH="/opt/vendor/name/toolchain/1.4.7.0/bin:$PATH"
: ${MAVEN_LOCAL_REPO:="$HOME/.m2/repository"}
: ${MAVEN_OPTS:="-XX:PermSize=64m -XX:MaxPermSize=192m"}
export MAVEN_LOCAL_REPO="/tmp/steffen-build/hellojava/mavenbuild/repo"
/view/toolchain_view/vobs/playvob/toolchain/3p/maven/bin/mvn -e \
--global-settings /view/toolchain_view/vobs/playvob/toolchain/mvn_settings.xml \
-DbuildDirectory=/tmp/steffen-build/hellojava/mavenbuild/target/ \
-DdeployDirectory=/tmp/steffen-build/hellojava/mavenbuild/output/ \
-Dproject.skipTests \
-f /view/steffen_hellojava_source_view/hellojava/pom.xml \
compile package deploy
Now /tmp can be a RAM disk on a "read-only system". All artifacts are written below the dedicated build directory. We could have the toolchain and the complete source tree on a DVD or a read-only NFS archive server, and still compile them, without internet access. This should still work in 20 years, even if Maven Central has been renamed or whatever.
Of course wrapper scripts can hide all the details. In my case, they are integrated in a cmake-based build system and the top level build directory is configured by a Jenkins job.
Checking Requirements
For each requirement from the original question we can check whether this approach meets it:
no internet access (all inputs must be under version control)
OK, "downloading" from file:///
source tree is read-only
OK, works with read-only /view tree
all build artifacts must be below a build directory
OK, Maven Local Repository is configured by setting MAVEN_LOCAL_REPO
well defined toolchain (under version control or in read-only Virtual Machine)
OK, is in /view/ or /opt and all versions are "pinned" (fixed)
libraries shall not know which applications use them (i.e. "no parent POM")
OK, but not nice, since all POM files need to be adjusted
to support development cycle, preferably it should work with IDEs (Netbeans), optional
OK, same POM files work for IDEs
to support development cycle, preferably it should work incrementally, optional
OK, as long as the build tree and the Local Repository are kept, Maven works incrementally
So all the (very specific) input requirements are fulfilled. Good. So this implements out-of-tree builds without Internet.
So based on your update I add another update here:
Why do you need to change the value of "target"?
Because the source tree is read-only, Maven cannot create the
directory at its default position. A writable build directory is
available, but outside the source code tree.
You seem to misunderstand the idea of the target directory, or have a misunderstanding of how Maven works.
The target directory is intended to contain all generated/compiled things, which are NOT checked into source control. In other words, the target directory is by default ignored during check-in and will never, and should never, be checked into version control.
Maven will not write to /src, so no problem there
In maven you have a directory structure like:
root
+- pom.xml
+- src
+-- ...
The root directory is under version control. So I don't understand the following:
"source tree" does not reference to the "src" folder within the
sources / inputs for the build process, but to the whole structure -
including POM files, resources and even binary libs if needed. So the
directory that contains src also contains pom.xml and belongs to
read-only VCS controlled "source tree".
Coming to the next point:
As I understood, a repository manager is a complex service typically
running on a host. I think using file URLs instead of such a complex
service keeps the dependencies smaller. Please note that the whole
repository has to be version controlled, so automatically updating
must not work anyway, but as I understand is the advantage of a
repository manager.
The whole repository should be version controlled? That sounds like you don't have experience with real builds in corporate environments. We are talking about terabytes of artifacts... They have to be backed up, but not checked into version control.
The reason is simply that each artifact is uniquely identified by its coordinates: groupId, artifactId, version.
Apart from the argument you gave, setting up a repository manager is simpler than you think. Apart from that, a file-based repository does not work well and needs to be copied to each machine where you use it, which is a hassle, or requires a kind of network file system mounted on many machines. A repository manager works http(s)-based, which is much simpler to handle.
Coming to next point:
The build process is supposed to create all files below a given build
directory, for example
/tmp/$USER/build/$PROJECTNAME/$PROJECTVERSION/$APPLICATION/$VARIANT/.
Maven does this in the target directory.
The build system can create arbitrary structures below that, but is
not supposed to change anything outside this. A list of files within
this tree is defined as output (build result) and taken from there.
Typically some binaries or ZIP files are copied to version control
system. Some call this "out of tree build", "build out of source" or
similar.
As I mentioned before, I would not recommend putting the resulting artifacts under version control, because a repository manager is much simpler and keeps the conventions...
Coming back to your point of building all dependencies yourself: I think that is simply not worth the effort. I don't see a real advantage in that... in particular because you can consume all artifacts via their correct coordinates from Maven repositories (or from your own repository manager inside your organisation)...
Next point:
Yes, this surely is very true. However, the sources should abstract
from it, and usually do. For example, the authors of JUnit dont' know
everybody who is using it.
If the JUnit team can access the request logs of Maven Central (which can be done), they can see who is using JUnit. This is much simpler inside an organisation: use the access log of the repository manager and extract the usage information...
What I don't understand is:
We like to use a library in several
projects at the same time, so we cannot mention a parent POM, because
there are multiple parent POMs.
This is called a simple dependency in Maven. Where is the relationship to the parent POM? You seem to misunderstand, or have no understanding of, how Maven works...?
Let me add the following to khmarbaise's answer.
I don't see your "in ten years reproducible" issue with the standard Maven structure. You just:
Check out the source from the version control system to an empty directory.
Build your project with Maven in this directory.
Deploy the results to wherever you want.
The source in the version control system is not changed; everything will work exactly the same every time. You only have to make sure that you use only release versions in your dependencies (not SNAPSHOT versions), so that Nexus/Artifactory gives you the same artifact every time.
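A minimal sketch of such a rebuild, assuming a pinned release tag and an archived local repository (the repository URL, tag and paths here are hypothetical):

```shell
# check out the exact released state into an empty directory
git clone --branch v1.2.3 https://example.com/project.git rebuild
cd rebuild

# build offline against the archived repository; -o forbids any download
mvn -o -Dmaven.repo.local=/archive/m2-repo clean verify
```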

Maven release and recomposed POM family

I'm having some issues running the Maven release plugin in my company's specific maven structure. Let me explain the concept of recomposed family I'm referring to. My projects are multimodule projects, where each module may have a different parent than its natural father.
project/
pom.xml (the natural multimodule reactor father) (I list as modules all that follows)
module1/pom.xml (my parent is NOT ../pom.xml, I'm a half sibling)
module2/pom.xml (my parent is ../pom.xml, I'm a natural child) (I have a dependency on module1)
project-war/pom.xml (my parent is NOT ../pom.xml, I'm a half sibling)
The reason we adopt this "foster parent" strategy is that we want to activate some plugins by default for specific "adopted siblings". For example, every WAR project needs to define a specific maven-resources-plugin execution ID. Since we have about 80 WARs to generate, imagine the maintenance if we had to add an extra execution step to ALL WARs by hand. This works well for development purposes; we do have valid, running SNAPSHOTs building and deploying.
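Such a "foster parent" for the WAR modules could look roughly like this (coordinates and the execution details are illustrative, not taken from the question):

```xml
<!-- war-parent/pom.xml: shared parent for all WAR modules -->
<project>
  <modelVersion>4.0.0</modelVersion>
  <groupId>com.example</groupId>
  <artifactId>war-parent</artifactId>
  <version>1.0-SNAPSHOT</version>
  <packaging>pom</packaging>
  <build>
    <plugins>
      <plugin>
        <groupId>org.apache.maven.plugins</groupId>
        <artifactId>maven-resources-plugin</artifactId>
        <executions>
          <!-- execution every WAR child inherits automatically -->
          <execution>
            <id>copy-war-resources</id>
            <phase>prepare-package</phase>
            <goals>
              <goal>copy-resources</goal>
            </goals>
            <configuration>
              <outputDirectory>${project.build.directory}/extra</outputDirectory>
              <resources>
                <resource>
                  <directory>src/main/extra</directory>
                </resource>
              </resources>
            </configuration>
          </execution>
        </executions>
      </plugin>
    </plugins>
  </build>
</project>
```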
Now that we want to release, the maven-release-plugin seems not to like this specific structure. In short, since module2 needs module1, but module1 has a different father, the release plugin keeps module1 as a SNAPSHOT.
My question here is: has anyone managed to release a project with recomposed family members? Is there any configuration I need in the release plugin to enable a full release of such projects?
Violating the inheritance between modules and parents is going to give you more problems than anything else.
Your only options here are either:
fix the parent-children model so the release plugin can do its job (and then move those module activations to the children where you want it)
do the tagging, changing of versions, and build+release (mvn deploy, and optionally also site-deploy) manually
After some further tests, here are my conclusions about this issue.
The recomposed family CAN be tackled by Maven on a simple condition: ALL members of the family are present in the reactor.
That would mean having a "super-reactor" that has a reference to both the project you want to release, and the parent poms that may be declared by some modules. Of course, the release plugin will then release everything in this super reactor. If you have many projects that all refer the same parent projects, you should release them in one shot.
Moreover, for the build model to be generated correctly, all relative paths must be right. In my specific case we wanted to avoid exactly that, so I guess we are back to a fixed folder structure.
It could be handy to exclude some projects from the release plugin, but of course that creates a potential instability.
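The "super-reactor" described above could look like this (module paths and coordinates are hypothetical): an aggregator POM that lists both the foster parents and the actual project, so the release plugin sees every family member in one reactor.

```xml
<!-- super-reactor/pom.xml: aggregates the project AND its foster parents -->
<project>
  <modelVersion>4.0.0</modelVersion>
  <groupId>com.example</groupId>
  <artifactId>super-reactor</artifactId>
  <version>1.0-SNAPSHOT</version>
  <packaging>pom</packaging>
  <modules>
    <!-- the foster parent(s) referenced by some modules -->
    <module>war-parent</module>
    <!-- the actual multimodule project -->
    <module>project</module>
  </modules>
</project>
```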

Good approach to Maven project design, or an antipattern?

Context:
I have a multimodules maven project that looks like:
Root
 |-- ModuleA
 |-- ModuleB
 |-- ModuleC
 |-- ModuleD
 |-- .......
There are around 25 modules under the Root:
A few of them represent the core of the application (5 modules)
And each of the remaining modules represents the business-process implementation related to a type of customer. These modules are completely independent of each other.
When packaging or releasing the 'Root' project, the artifact generated is a single ZIP file aggregating all the JARs related to 'Root' modules.
The single ZIP file is generated according to an assembly descriptor, it represents the delivery artifact.
At deployment time on the target environment, the single ZIP is unzipped into a directory where it is consumed (class-loaded) by an 'engine', a Java web application that provides the final services.
Constraints
The 'business constraints' on one side,
And the will to reduce regressions between different versions on
the other side
The above constraints lead us to adopt the following release scenarios:
Either we release the Root and ALL its submodules. This means that
the resulting ZIP will aggregate all the submodule JARs with the same
version. The ZIP will contain something similar to:
[ModuleA-1.1.jar, ModuleB-1.1.jar, ModuleC-1.1.jar, ModuleD-1.1.jar,
......., ModuleX-1.1.jar].
Or we release the Root and A FEW of its submodules, the ones we want to update.
The resulting ZIP will aggregate all the submodule JARs: the released submodules with their latest released versions, the unreleased submodules with another 'appropriate' version.
For example, with such an incremental release, the ZIP will contain something similar to [ModuleA-1.2.jar, ModuleB-1.1.jar, ModuleC-1.2.jar, ModuleD-1.1.1.jar,
......., ModuleX-1.1.2.jar].
These 2 scenarios were made possible by:
Either declaring the modules as MAVEN MODULES 'module' for the first
scenario
Or declaring the modules as MAVEN DEPENDENCIES 'dependency' for
the second scenario, the INCREMENTAL scenario.
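In POM terms, the two scenarios differ roughly like this (artifact names and versions are illustrative):

```xml
<!-- Scenario 1: full release - submodules are reactor modules -->
<modules>
  <module>ModuleA</module>
  <module>ModuleB</module>
</modules>

<!-- Scenario 2: incremental release - unchanged submodules are
     pulled in as fixed, already-released dependencies instead -->
<dependencies>
  <dependency>
    <groupId>com.example</groupId>
    <artifactId>ModuleB</artifactId>
    <version>1.1</version>
  </dependency>
</dependencies>
```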
Question
Both scenarios work perfectly, BUT in the 2nd (INCREMENTAL) scenario, maven-release-plugin:prepare uploads to the SCM (SVN) all the modules [ModuleA, ModuleB, ModuleD, .... ModuleX]: it uploads both the released and the non-released ones, even though the 'non-released modules' are declared as a 'dependency' in the POM and not as a 'module'.
1/ IS THERE a way to avoid uploading the 'non-released' modules? Is there a way to inject an 'exclude directory list' into the SCM SVN provider?
2/ A MORE global question: is the approach we use a correct one, or is it an antipattern? In the latter case, what should the alternative be?
Thank you.
To me, your approach looks like an antipattern. I recommend only having projects in the same hierarchy that you want to release together. Projects with a different release lifecycle should live on their own - otherwise you will keep running into the issues you mentioned. If you run the release plugin from a root directory (multi-module setup), all of the content of that root directory will be tagged in SVN.
In your case, I would probably create the following hierarchies:
Core
One per customer type
Potentially one per type to bundle them (zip), depending on your structure
I would group it by the way you create the release. It might mean that you have to run the release plugin a couple of times instead of just once when you make a change e.g. in Core, but it will be a lot cleaner.
Your packaging project will then pull in all of the dependencies and package/assemble them.
If you have common configuration options, I recommend to put them into a common parent pom. This doesn't have to be your root (multi-module) pom.
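Such a common parent does not need to sit above the children on disk: with an empty relativePath, Maven resolves it from the repository instead of looking for ../pom.xml (the coordinates below are examples):

```xml
<!-- in each child project's pom.xml -->
<parent>
  <groupId>com.example</groupId>
  <artifactId>common-parent</artifactId>
  <version>1.0</version>
  <!-- empty: skip the ../pom.xml lookup, resolve from the repository -->
  <relativePath/>
</parent>
```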
Did you try running the maven-release-plugin with the -pl (--projects) argument plus the list of all modules you want to release?
Basically, this argument allows you to specify the list of modules the Maven command should be performed against (if you omit it, all submodules are included; that is the default behavior).
See more details about this command line here.
I never tried to use it with the maven-release-plugin, and I don't know if it will work, especially regarding SCM operations.

Maven2: Multiple inheritance or profile selection

I have a Master/Child pom project with many child modules. For some of the child modules I have a series of tasks that only they need to undertake, basically read a properties file to configure a jboss deploy location.
At present I have copied the build section into a few POMs and this is now starting to ring alarm bells. I really should write this once and use it many times.
This has set me on a philosophical path.
a) I believe I could create a profile in the master POM and, when running the child POMs, enable this profile, thus using the functionality defined in the parent POM. This keeps the current structure of 1 parent : many children in place. It also raises the question: could I actually activate that profile in the child POM, so that the person building the child does not have to use a -P command-line option?
b) I could change the existing parent/master POM into a grandparent and have a new parent POM inherit from it, adding the property-reading/JBoss-deploying facilities to this new parent. The child POMs requiring this functionality would then inherit from the new parent, giving me a grandfather-father-son inheritance. This would mean that the person doing the build would not need to remember the command line, at the cost of another level of inheritance.
Question is, which method is the "maven way"?
Unfortunately option a) is not supported by Maven. I had a similar situation to solve and tried the child-enables-profile-defined-in-parent bit, only to find it doesn't work; details elsewhere in StackOverflow.
I don't know if there is an official best practice, as I didn't find one when I did my hunt before. I used a combination of grandparent/parent/child POMs and the <pluginManagement> configuration.
I would try <pluginManagement> in your parent POM first. If that doesn't quite meet the need the extra POM level might work. Good luck!
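A minimal sketch of the <pluginManagement> approach: the parent defines the full configuration but does not activate it, and only the children that need it opt in. (The properties-maven-plugin and the file name are examples standing in for the question's property-reading step, not taken from it.)

```xml
<!-- parent POM: define the configuration once, inactive by default -->
<build>
  <pluginManagement>
    <plugins>
      <plugin>
        <groupId>org.codehaus.mojo</groupId>
        <artifactId>properties-maven-plugin</artifactId>
        <version>1.2.1</version>
        <executions>
          <execution>
            <phase>initialize</phase>
            <goals>
              <goal>read-project-properties</goal>
            </goals>
            <configuration>
              <files>
                <file>jboss-deploy.properties</file>
              </files>
            </configuration>
          </execution>
        </executions>
      </plugin>
    </plugins>
  </pluginManagement>
</build>
```

A child that needs the behavior then merely names the plugin, inheriting the managed configuration:

```xml
<!-- child POM: opt in -->
<build>
  <plugins>
    <plugin>
      <groupId>org.codehaus.mojo</groupId>
      <artifactId>properties-maven-plugin</artifactId>
    </plugin>
  </plugins>
</build>
```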

Resources