Maven build modules out of order

I am trying to build a Maven project that has several modules. I expect them to be built in the order given in the pom.xml, but the build order is not the same as the order of the modules in the pom.xml file. What could be the reason for that?
My Maven version: Apache Maven 3.0.4

Maven decides the order based on the dependencies. So if you have three submodules A, B and C, you need to go into each submodule and make the dependencies explicit. For example, if you go to the pom.xml of B and declare that it depends on A and C, Maven will build A and C first (in their declared order, since neither depends on the other) and will build B at the end.
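As a minimal sketch (the com.example coordinates and versions are placeholders, not from the question), B's pom.xml would make those relationships explicit, and the reactor will then schedule A and C before B no matter how the parent lists them:
<!-- B/pom.xml (sketch; coordinates are placeholders) -->
<dependencies>
  <dependency>
    <groupId>com.example</groupId>
    <artifactId>A</artifactId>
    <version>1.0</version>
  </dependency>
  <dependency>
    <groupId>com.example</groupId>
    <artifactId>C</artifactId>
    <version>1.0</version>
  </dependency>
</dependencies>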

The order you declare in the parent POM is only relevant when there is no dependency relationship between modules that contradicts it. If a module that appears earlier in the list depends on a module listed after it, the declared order is not used: Maven works out the dependency graph and first builds the modules that other modules need.
For more on build order, have a look at the related questions below and the Maven documentation on reactor sorting.
From the specs:
Reactor Sorting
Because modules within a multi-module build can depend on each other, it is important that the reactor sorts all the projects in a way that guarantees any project is built before it is required.
The following relationships are honoured when sorting projects:
1. a project dependency on another module in the build
2. a plugin declaration where the plugin is another module in the build
3. a plugin dependency on another module in the build
4. a build extension declaration on another module in the build
5. the order declared in the modules element (if no other rule applies)
Note that only "instantiated" references are used - dependencyManagement and pluginManagement elements will not cause a change to the reactor sort order.
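To illustrate that last note with a hedged sketch (placeholder coordinates): a sibling module listed only under dependencyManagement does not move it earlier in the reactor; an actual dependencies entry in the consuming module does.
<!-- parent pom.xml: managing a sibling's version alone does NOT change reactor order -->
<dependencyManagement>
  <dependencies>
    <dependency>
      <groupId>com.example</groupId>
      <artifactId>A</artifactId>
      <version>1.0</version>
    </dependency>
  </dependencies>
</dependencyManagement>

<!-- B/pom.xml: this "instantiated" reference is what makes the reactor build A before B -->
<dependencies>
  <dependency>
    <groupId>com.example</groupId>
    <artifactId>A</artifactId>
  </dependency>
</dependencies>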

Related

maven multi-module project with one plugin module

I am thinking about creating a multi-module project in maven, with one plugin module (i.e. this module is used as a plugin in other projects, not a dependency).
Question: Is it feasible to have a plugin as a module in a multi module maven project?
It is feasible, and the official Maven multi-module/reactor documentation also describes how having a plugin as a module is handled by the build:
Because modules within a multi-module build can depend on each other, it is important that the reactor sorts all the projects in a way that guarantees any project is built before it is required.
The following relationships are honoured when sorting projects:
a project dependency on another module in the build
a plugin declaration where the plugin is another module in the build
a plugin dependency on another module in the build
[..]
Once the plugin is installed and deployed, it will then not bring with it any knowledge of its module nature, that is, it will be seen as a normal plugin by the projects that will use it via its unique maven coordinates (GAV).
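A rough aggregator sketch of that setup (module and artifact names here are hypothetical): the plugin is just another module, and a sibling that declares it under build/plugins will be built after it thanks to the plugin-declaration rule quoted above.
<!-- aggregator pom.xml (sketch) -->
<groupId>com.example</groupId>
<artifactId>aggregator</artifactId>
<version>1.0</version>
<packaging>pom</packaging>
<modules>
  <module>my-maven-plugin</module> <!-- its own pom uses <packaging>maven-plugin</packaging> -->
  <module>app</module>             <!-- declares my-maven-plugin under <build><plugins> -->
</modules>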
The same applies to archetypes, which can be modules of a multi-module project and then be used individually.
Also note that you can have, as an example, an aggregator project whose projects (then modules) are completely unrelated to each other, and simply aggregate them because you want to build them all together (often not really meaningful, but useful in some cases).

Maven dependency vs multimodule?

I'm very new to Maven. Can someone please explain the difference between using Maven modules versus just adding a dependency from your Maven project to another Maven project in your workspace? When would you use one over the other?
A dependency is a pre-built entity. You get the artifact for that dependency from Maven Central (or Nexus or the like.) It is common to use dependencies for code that belongs to other teams or projects. For example, suppose you need a CSV library in Android. You'd pull it as a dependency.
A Maven module gets built just like your project does. It is common to use Maven modules for components that the project owns. For example, maybe your project creates three jar files.
A dependency can be thought of as a lib/jar (a.k.a. an artifact in Maven parlance) that you need in order to build and/or run your code.
This artifact can either be built by one of the modules of your multi-module project or be a third-party pre-built library (for example log4j).
One of the concepts of Maven is that each module produces a single artifact (say, a jar). So in the case of a complex project it is a good idea to split your project into multiple modules, and these modules can depend on each other via declared dependencies.
See http://books.sonatype.com/mvnex-book/reference/multimodule-sect-intro.html for an example of how a web app is split into parent and child modules and how they are linked.
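A rough sketch of that layout (module names loosely follow the linked example; coordinates and versions are placeholders): the parent aggregates the modules, each module inherits from the parent, and a module's dependencies can mix sibling artifacts with pre-built third-party ones.
<!-- parent pom.xml (sketch) -->
<groupId>com.example</groupId>
<artifactId>simple-parent</artifactId>
<version>1.0</version>
<packaging>pom</packaging>
<modules>
  <module>simple-weather</module>
  <module>simple-webapp</module>
</modules>

<!-- simple-webapp/pom.xml (sketch) -->
<parent>
  <groupId>com.example</groupId>
  <artifactId>simple-parent</artifactId>
  <version>1.0</version>
</parent>
<artifactId>simple-webapp</artifactId>
<dependencies>
  <dependency> <!-- built by a sibling module in the same reactor -->
    <groupId>com.example</groupId>
    <artifactId>simple-weather</artifactId>
    <version>1.0</version>
  </dependency>
  <dependency> <!-- pre-built third-party artifact -->
    <groupId>log4j</groupId>
    <artifactId>log4j</artifactId>
    <version>1.2.17</version>
  </dependency>
</dependencies>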
One of the most confusing aspects of Maven is the fact that the parent pom can act as both a parent and as an aggregator.
99% of the functionality you think about in Maven is the parent pom aspect, where you inherit things like repositories, plugins, and most importantly, dependencies.
Dependencies are hard, tangible relationships between your libs that are evaluated during each build. If you think of your software as a meal, it's basically saying A requires ingredient B.
So let's say you're preparing lasagne. Then your dependency chain would look something like this:
lasagne
  <- meatSauce
      <- groundBeef
      <- tomatoPaste
  <- cheese
  <- noodles
The key thing is that each of the above items (meatSauce, groundBeef, cheese, etc.) is an individual build with its own set of dependencies.
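For instance (a sketch with made-up coordinates), meatSauce's own pom.xml would carry its hard dependencies regardless of any aggregation:
<!-- meatSauce/pom.xml (sketch) -->
<dependencies>
  <dependency>
    <groupId>com.example.lasagne</groupId>
    <artifactId>groundBeef</artifactId>
    <version>1.0</version>
  </dependency>
  <dependency>
    <groupId>com.example.lasagne</groupId>
    <artifactId>tomatoPaste</artifactId>
    <version>1.0</version>
  </dependency>
</dependencies>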
By contrast, the only section of your pom that pertains to aggregation is the modules section:
<modules>
  <module>meatSauce</module>
  <module>groundBeef</module>
  <module>tomatoPaste</module>
  <module>cheese</module>
  <module>noodles</module>
</modules>
Aggregation simply tells your build engine that it should run these 5 builds in rapid succession:
groundBeef -> tomatoPaste -> cheese -> noodles -> meatSauce
The main benefit of aggregation is the convenience (just click build once) and ensuring the builds are in the correct order (e.g. you wouldn't want to build meatSauce before tomatoPaste).
Here's the thing though: even if you organize the libs as standalone projects without module aggregation, your build will still come out the same provided you build in the correct order.
Moreover, both Jenkins and Eclipse have mechanisms for triggering builds if a dependent project has changed (e.g. changing groundBeef will automatically trigger meatSauce).
Therefore, if you're building out of Jenkins or Eclipse, there is no need for aggregation.

Maven pom to generate classfiles for another pom

I have a Maven project that depends on generated sources. These sources need to be generated by a java program built by another maven pom. (In this case the sources are generated by the greendao source generator, but they could be generated by any generic java executable.)
I haven't dealt with this sort of interdependency between maven projects before. Assuming I want to use a reactor to build these two submodules, how can I ensure that the first module is built AND executed and generates its source files to be included in the second module?
Module A generates sources. Module B depends on Module A. If you say that Module B depends on Module A in Module B's pom, I think the reactor will figure out it needs to build Module A first.
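One hedged way to wire that up (this goes beyond the answer above; the coordinates, plugin versions and the generator's main class are assumptions): Module B declares Module A as a dependency so the reactor builds A first, binds exec-maven-plugin's java goal to generate-sources to run the generator using A's classes, and registers the output directory as a source root via build-helper-maven-plugin.
<!-- B/pom.xml (sketch) -->
<dependencies>
  <dependency>
    <groupId>com.example</groupId>
    <artifactId>module-a-generator</artifactId>
    <version>1.0</version>
  </dependency>
</dependencies>
<build>
  <plugins>
    <plugin>
      <groupId>org.codehaus.mojo</groupId>
      <artifactId>exec-maven-plugin</artifactId>
      <version>3.1.0</version>
      <executions>
        <execution>
          <phase>generate-sources</phase>
          <goals><goal>java</goal></goals>
          <configuration>
            <!-- hypothetical entry point of the generator built in Module A -->
            <mainClass>com.example.generator.Main</mainClass>
            <arguments>
              <argument>${project.build.directory}/generated-sources/greendao</argument>
            </arguments>
          </configuration>
        </execution>
      </executions>
    </plugin>
    <plugin>
      <groupId>org.codehaus.mojo</groupId>
      <artifactId>build-helper-maven-plugin</artifactId>
      <version>3.4.0</version>
      <executions>
        <execution>
          <phase>generate-sources</phase>
          <goals><goal>add-source</goal></goals>
          <configuration>
            <sources>
              <source>${project.build.directory}/generated-sources/greendao</source>
            </sources>
          </configuration>
        </execution>
      </executions>
    </plugin>
  </plugins>
</build>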

Tycho: Parent POM needs to list plug-ins included in my feature?

I am transitioning from using Buckminster to build an Eclipse product to Tycho. I've mavenized my plug-ins and features and have a question:
I created a parent feature with a POM that references my features and plugins. I don't know if I am doing this correctly, but I find that I need to add all features and plugins as modules. So if I have pluginA, pluginB and feature1 that includes pluginA and pluginB, I add all three to parent POM. This is a bit strange to me, because in Buckminster I had to reference only feature1 and it would get its dependencies based on the feature.xml file.
Am I doing something wrong in my Tycho builds, or is this how it's supposed to work?
A Tycho build is driven by Maven, i.e. Maven first determines which modules should be part of the build reactor, and then Tycho builds the modules. Therefore, you'll need an aggregator POM that tells Maven about the list of artifacts to be built.
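A minimal aggregator sketch for the example above (artifact IDs and the Tycho version are placeholders): the parent lists every plug-in and feature as a module and enables Tycho as a build extension, while each module's own POM uses the matching Tycho packaging type.
<!-- aggregator pom.xml (sketch) -->
<groupId>com.example.rcp</groupId>
<artifactId>parent</artifactId>
<version>1.0.0-SNAPSHOT</version>
<packaging>pom</packaging>
<modules>
  <module>pluginA</module>  <!-- <packaging>eclipse-plugin</packaging> -->
  <module>pluginB</module>  <!-- <packaging>eclipse-plugin</packaging> -->
  <module>feature1</module> <!-- <packaging>eclipse-feature</packaging> -->
</modules>
<build>
  <plugins>
    <plugin>
      <groupId>org.eclipse.tycho</groupId>
      <artifactId>tycho-maven-plugin</artifactId>
      <version>2.7.5</version>
      <extensions>true</extensions>
    </plugin>
  </plugins>
</build>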

How to create a maven assembly with transitive dependencies for different deployment scenarios?

I'm having a problem reconciling building a project for use within an application server and for use as a stand-alone application.
To give an overall simplified context, say I have three Projects A, B, C.
Project A depends on Project B which depends on Project C.
Project C has a dependency X which is marked as provided, since it was expected to be available as a JEE library within, say, an application server (e.g. jms.jar).
So if I perform an assembly build of Project A, I get all the transitive dependencies save for those marked as provided as expected.
Now I have a new deployment scenario where Project A needs to be used in a standalone environment i.e. outside an application server.
So now I need the jms jar to be a compile dependency. Does this mean that I should explicitly add a compile dependency for X in Project A? Doesn't this violate the Law of Demeter, i.e. don't talk to strangers, in the sense Project A shouldn't explicitly know about Project C but only about Project B?
This is a simple example, but in reality I have multiple dependencies which have been marked as provided but now need to be compile or runtime dependencies so that they end up in the artifact produced by the maven-assembly-plugin.
Is this a fundamental problem with Maven or am I not using the tools correctly?
Thanks in advance for any guidance.
If you need your build to have variations in it for different scenarios, you need to use profiles and keep certain things (such as some of the dependencies) in the various profiles.
http://maven.apache.org/pom.html#Profiles
Different dependencies for different build profiles in maven answers a similar question; just swap its "release" and "debug" profiles for "Project A" and "Project C".
Provided dependencies are a difficult subject. First of all: Provided dependencies are not transitive in the following sense: If your project C has a provided dependency on X, then A will not get the dependency. It is silently ignored. This fits with the following meaning of "provided" which I propose:
Only the artifacts that are actually deployed should mark dependencies as "provided". Libraries or other jars that are not individually deployed to a specific server should not have provided dependencies. Instead, they should declare their dependencies as compile dependencies. In your example: Project C should have a compile dependency on X. If project A knows that X is provided, it sets X to provided in "dependencyManagement". As project A should know the environment in which it runs, it should decide what is provided and what is not. And "dependencyManagement" is the right place to declare this.
If your project A should be able to run within and without a given server, you probably need to make a lot of adjustments, even change the type from ear to jar. So you either use build profiles for this, which then have different dependencyManagement entries, or you split A into two projects which depend on some other project that contains the common elements.
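A hedged sketch of combining those two ideas in Project A (javax.jms-api stands in for X; version numbers are only examples): each profile carries a dependencyManagement entry that pins X to the scope matching the deployment, so the standalone profile lets the assembly pick it up while the app-server profile keeps it provided.
<!-- Project A pom.xml (sketch) -->
<profiles>
  <profile>
    <id>appserver</id>
    <dependencyManagement>
      <dependencies>
        <dependency>
          <groupId>javax.jms</groupId>
          <artifactId>javax.jms-api</artifactId>
          <version>2.0.1</version>
          <scope>provided</scope> <!-- the container supplies it -->
        </dependency>
      </dependencies>
    </dependencyManagement>
  </profile>
  <profile>
    <id>standalone</id>
    <dependencyManagement>
      <dependencies>
        <dependency>
          <groupId>javax.jms</groupId>
          <artifactId>javax.jms-api</artifactId>
          <version>2.0.1</version>
          <scope>compile</scope> <!-- packaged into the assembly -->
        </dependency>
      </dependencies>
    </dependencyManagement>
  </profile>
</profiles>
Building with mvn -Pstandalone package would then pick up the compile-scoped entry.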
If some given project C already has a provided dependency on X and you cannot change that, this is effectively the same as a missing dependency in C. This has to be repaired at some point, and this could be project A itself.
