maven multimodule build: package vs. install

What is the correct way of building a multi-module Maven project: via package or via install, if NONE of the modules will be a dependency for another project? I think in this case package is the only way to build it, but I see people (IMO) abusing the install goal, and I don't get why.
Are there any official recommendations on how a multi-module project should be built?
UPD: I have only one explanation. Sometimes people are simply unaware of the -pl, -am and -rf Maven options, which leads them to install the modules' artifacts into the repository when they only want to build part of the reactor.
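For example, a minimal sketch (the module name moduleB is just a placeholder): this builds one module plus whatever it needs, straight from the reactor and without touching the local repository:
mvn package -pl moduleB -am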

If none of the projects is a dependency of any other project, you have a weird setup. Why build them together if they don't belong together?
"Sometimes people are simply unaware of the -pl, -am and -rf Maven options"
In a normal setup, where there are dependencies between modules, these options (at least -rf) don't work if the modules aren't installed.
OK, you are using a dependency management system without dependencies. Yes, you can use package instead of install. But you are not avoiding bad usage patterns; you are either missing out on features or grouping together things that don't belong together.

Related

Maven to Gradle -- command line options

I'm making a case for moving our builds from Maven to Gradle. Below are a few of the Maven command-line options my team finds useful. What are the Gradle equivalent choices?
-am, --also-make: If project list is specified, also build projects required by the list
-amd, --also-make-dependents: If project list is specified, also build projects that depend on projects on the list
-o, --offline: Work offline
-pl, --projects: Build specified reactor projects instead of all projects
-rf, --resume-from: Resume reactor from specified project
Maven Examples:
I only want to build the sub-project I'm working on and its dependencies.
mvn install --also-make --projects :my-sub-project
After fixing a build issue, I want to restart the build from the point of failure.
mvn install --resume-from :my-sub-project
I don't want to download external dependencies from a central repo.
mvn install --offline
Here are some rough analogues:
-am: buildNeeded (This triggers a full build of all upstream projects; building those parts of upstream projects that are required to fulfill the command at hand is automatic in Gradle.)
-amd: buildDependents
-o: --offline
-pl: :subproject1:build :subproject2:build
-rf: No direct analogue (not reliable, wouldn't work for parallel builds, etc.), but Gradle's incremental build will get you to the "resume point" quickly.
Note that Gradle's core concepts differ significantly from Maven's. To give one example, in Gradle build order is solely determined by task relationships, and there is no such concept as an execution dependency between projects. Due to these differences, some Maven features aren't necessary or useful in Gradle, some you get for free, and some come in a different form.
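As a rough illustration of those analogues (the subproject name my-sub-project is a placeholder; buildNeeded and buildDependents come from the Java plugin):
gradle :my-sub-project:build            # upstream project outputs are produced as needed
gradle :my-sub-project:buildNeeded      # also fully build and test upstream projects (~ -am)
gradle :my-sub-project:buildDependents  # also build and test downstream projects (~ -amd)
gradle --offline build                  # work offline (~ -o)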

How to resolve dependencies between modules within multi-module project?

After working with Maven for a while, I am thrilled by the many features that Maven brings to the build architecture, particularly the dependency management. However, I have run into one issue again and again: how Maven resolves dependencies between modules of a multi-module project. I am wondering whether this is a big flaw in the current Maven implementation and/or whether there is a satisfactory workaround.
Let's say I have a multi-module Maven project. The parent POM contains three modules: moduleA (jar), moduleB (jar), and moduleC (war). B depends on A, and C depends on B. Simple enough? Now I want to run mvn dependency:go-offline at the parent project, which is supposed to resolve all the dependencies and bring them into the local .m2 directory. It fails because Maven complains that it cannot resolve the dependency on moduleA while it is processing moduleB. Because all these modules belong to one groupId, I even tried using -DexcludeGroupIds=x.y.z to exclude these module dependencies, but it still fails at the same point.
I understand why Maven is complaining: moduleA has not been built yet, so there is no moduleA:jar artifact in my local or internal repository when the go-offline goal is executed. But IMHO the plugin should treat these inter-module dependencies differently; in this case, it should simply ignore them. One might argue that I can simply do mvn clean install, which will install moduleA:jar into the local repository. After that, running mvn dependency:go-offline will certainly work. But that workaround defeats the purpose of the go-offline goal, which allows us to resolve and pull dependencies into our local repository without building the whole project. I used the dependency:copy-dependencies goal in another case and it has the same issue.
I also ran into a similar issue in other scenarios: mvn clean generate-sources could not resolve dependencies. When I ran mvn clean compile, everything worked fine, but when I ran mvn clean generate-sources, it failed because Maven could not resolve the inter-module dependencies. In that case, the failure was caused by @requiresDependencyResolution in the antrun plugin.
Since both the antrun plugin and the dependency plugin are very popular in the Maven world, I am sure I am not the only one who has run into this issue. Has anyone found a solution or workaround?
Maven has the concept of a "reactor", where artifacts that have been built in a single run (e.g. mvn package) are available for dependency resolution during the build. For example, if your dependency graph yields the build order moduleA, moduleB, moduleC, and you run mvn package, Maven will build moduleA, package its artifact and add it to the reactor, then build moduleB, package it and add it to the reactor, and then do the same for moduleC. This means moduleB has access to moduleA's artifact for dependency resolution, and moduleC has access to moduleA and moduleB. This only works if artifacts are actually built, i.e. when you run the package phase.
The problem is that when you don't run the package phase because you're not interested in the artifacts (as in your dependency:go-offline example), artifacts for the modules that have already been processed don't get built and thus are not added to the reactor. I find this annoying as well; I think Maven should look at the POM files in its list of modules to build and resolve against those as well, but it doesn't.
In short, the solution to your problem is to run mvn package dependency:go-offline. This will not install artifacts in your local repository (which I believe is very bad practice) but it will put them in the reactor for the duration of the build, meaning that Maven will be able to resolve the dependency from your moduleB on the moduleA that has already been built. The downside is that every module will be tested and packaged, which is a lot of work when all you wanted was dependency:go-offline.
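A sketch of that invocation (run from the parent directory; -DskipTests is optional and only reduces the extra test cost mentioned above):
mvn -DskipTests package dependency:go-offline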
Either way, hope this helps.
This has finally been resolved with Maven Dependency Plugin version 3.1.2.
You can make sure it's used by pinning the version in your pom.xml:
<build>
<pluginManagement>
<plugins>
<plugin>
<artifactId>maven-dependency-plugin</artifactId>
<version>3.1.2</version>
</plugin>
</plugins>
</pluginManagement>
</build>
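With that version pinned, the goal can be run directly from the parent without a prior install, e.g.:
mvn dependency:go-offline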
I have created a JIRA ticket with a sample project here:
https://issues.apache.org/jira/browse/MDEP-516
Please vote for it.
You explained why it doesn't work, so you understand the issues. The problem for you is that it stops when it can't find A.jar, but that only happens once it gets to building B. So there is a sort of, sometimes useful, strategy.
You have to mess with A by itself. Just build A. Use your plan of loading dependencies and then building it.
Once it builds, you can move on to doing the same thing with B and then C. Step by step.
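A sketch of that step-by-step approach, assuming the moduleA/moduleB/moduleC layout from the question (run from the reactor root):
# fetch dependencies for, then build and install, one module at a time
mvn -pl moduleA dependency:go-offline install
mvn -pl moduleB dependency:go-offline install
mvn -pl moduleC dependency:go-offline install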
One thing to remember here is that it's sometimes OK to build B with an old snapshot of A in the local repo. You only need a freshly built snapshot of A in the repo if there are signature changes or new things required by B.
There are some discussions here too: Maven Modules + Building a Single Specific Module
One final note: usually these sorts of questions come up when people have builds that take too long. There are several ways to make builds go faster:
Get faster hardware. The build computer, the disk storage or the network speed are typical components that are cheaper to upgrade than the time wasted on slow builds.
Make the build go faster by not building stuff that doesn't need rebuilding. (For example, I had a build that rebuilt all the generated code every time. I added some stuff into the build that kept it from doing that except when dependencies to the generated code changed.)
Speed up the tests. Sometimes this means breaking the tests into two parts. Part 1 is fast tests and part 2 is slow tests. Run the fast tests on every build and the slow tests before any checkin of code or release of artifacts.
Break a multi-module build into 2 or more separate builds and use human intelligence to decide when to rebuild things. This works well when some jars are stable and don't change much any more.
Fill in your own method to make the build go faster.
I doubt such functionality would ever be possible with Maven. Whilst your projects share a common parent and depend upon each other, Maven cannot possibly know where to find these projects in order to build them. It also cannot determine whether the projects just need to be built, or whether you've specified the wrong version number for your dependency.
That is supported by the dependency:go-offline goal starting from Maven Dependency Plugin v3.1.2. See the related JIRA ticket MDEP-204 and the patch 23b7ca8790ae14175ed8e3a20c75c6274efe5ad8 with the fix.

Best practice wrt. `mvn install`, multi-module projects, and running one submodule

I tend to avoid using mvn install in my multi-module projects because I feel like I then don't know exactly which version of a submodule is used when building/launching other submodules (particularly when switching between branches very often).
I tend to use mvn package a lot and then mvn verify.
I'm now facing the issue in a FOSS project (a Maven archetype moreover) where I'd like to use Maven's best practices.
It's a multi-module project with a webapp submodule depending on the other modules, and what worries me is the ease of development along with mvn jetty:run (or jetty:start).
Currently, I defined 2 profiles:
prod, the default one, declares dependencies on the other submodules;
dev on the other hand does not depend on the other modules, and configures the jetty-maven-plugin by adding the other modules' output directories as extraClasspath and resourcesAsCSV.
That way, I can mvn package once and then cd webapp && mvn jetty:start -Pdev and quickly iterate, reloading the webapp without the need to even stop the server.
AFAICT, extraClasspath was added for that exact purpose (JETTY-1206).
I've been pointed at the tomcat7-maven-plugin, which can resolve modules from the reactor build when using Maven 3 (and I raised an issue to bring the same to Jetty: JETTY-1517), but that hardly solves my problem:
if I hadn't removed the dependency on the other submodules in the dev profile, I'd have had to run mvn install first so that validating the POM doesn't fail, even though jetty:start doesn't use those dependencies afterwards.
So here's my question: is mvn install really that common? Or is my approach of putting the intra-reactor dependencies only in the prod profile OK?
(note that I have the exact same problem with the gwt-maven-plugin, so please don't tell me to simply switch to Tomcat; that wouldn't even work actually, details here)
mvn install is common, in particular in relation to multi-module builds, because it gives you the chance to run a single module from your multi-module build.
This can be achieved by using:
mvn -pl submodule <lifecycle-phase>
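For example (the module name webapp is hypothetical, and the sibling modules need to have been installed at least once):
mvn install            # run once from the root of the reactor
mvn -pl webapp package  # afterwards, build just that one module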
I just found a workaround (which seems logical as an afterthought): https://jira.codehaus.org/browse/JETTY-1517?focusedCommentId=306630&page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel#comment-306630
In brief: skip the plugin by default in the parent module then re-enable it where needed.
This however only works if the plugin can be skipped (i.e. has a skip configuration) and is only used in one specific submodule, and it has to be selectively done for each plugin you need/want to run that way (in my case, jetty:run and gwt:run).
I do most of my development on my laptop. For the projects I'm currently working on, my local repository is really more of a temporary holding area. I run mvn install all the time. Putting artifacts in one's local repo is the only way I know of to share built artifacts between projects, especially if you are working on projects which are related but are not (and should not be) part of the same multi-module build.
When I'm done developing I commit changes to the shared SCM and let Jenkins build & deploy the code to the shared remote repo. Then I either blow away the changed projects in my local repository so the next build brings down the freshly built artifacts, or I run Maven with -U to force updates.
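A rough sketch of those two options (the repository path is a placeholder for your project's groupId/artifactId):
# force snapshot updates from the remote repository
mvn -U clean package
# or delete the stale artifacts from the local repository so they are fetched fresh
rm -rf ~/.m2/repository/com/example/myproject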
This works well for me, YMMV.

Why does maven recognize dependencies on only installed POM files?

I've got a project with Maven in which one subproject (A) wants to depend on another subproject (B) which uses "pom" packaging.
If I do this the straightforward way, where A specifies a dependency on B with <type>pom</type>, things work perfectly if I do "mvn install", but if I run any phase earlier than install, such as mvn compile or mvn package, then it fails while trying to build A: it goes looking for B's pom in the repository, and doesn't find it.
I don't really want this pom in the repository, because it's part of our active source code and changes frequently.
For all the jar-packaged projects we build, it seems to work fine to keep them out of the repository, build with mvn package, and Maven knows how to find all the dependencies in the source and build trees it manages without resorting to the repository; however for the pom-packaged project it always wants to go to the repository.
A couple things I learned while trying to understand this:
Maven best practices encourage you to use pom-packaged projects to group dependencies, but with the added step of "mvn install" on the POM project
Maven lifecycle documentation says "a project that is purely metadata (packaging value is pom) only binds goals to the install and deploy phases"; maybe this is why the POM project is invisible as a dependency target unless I invoke the install phase? I tried binding the compiler plugin to the compile phase and this didn't seem to help.
Is there a way that I can specify the POM subproject as a dependency of another subproject in the same parent project, without installing the POM project to the repository?
It isn't purely a question of which goals are bound to which lifecycle phases for POM projects. If it were, then binding the "package" goal would solve the problem.
When building a multi-module project, Maven reads the POMs of all modules to determine dependencies between modules, so that it can build the depended-upon modules before the depending modules. It's able to achieve this even when running the "package" goal (such that the depended-upon modules are not yet in the local repository).
Therefore, the code that constructs the classpath for builds must be managing several cases, notably:
extra-project jar dependency, where it looks for the POM in the local repository, handles its dependencies, and adds the POM's jar to the classpath
extra-project pom dependency, where it looks for the POM in the local repository and handles its dependencies
intra-project jar dependency, where it looks for the POM within the project tree, handles its dependencies, and adds that module's target/classes folder to the classpath
intra-project pom dependency, where for some reason it doesn't look for the POM within the project tree, and therefore doesn't handle its dependencies.
Notice the asymmetry in the last two cases, as compared to the first two.
I can see two solutions to your problem. One is to file a bug report, or rather a request to change the behaviour (since it's obviously intentional), perhaps only for the case of intra-project dependencies on multi-module projects. Or indeed propose a patch. But since the behaviour is intentional, you might meet a refusal. In the best of cases, you're in for a long wait. (I'd vote for your bug report though - I've been stung by that same behaviour, in a different context.)
The other solution is simply to run an install on your project. I don't really understand why you don't want the POM project in your repository: if needs be, you can use a snapshot repository, where it doesn't matter if things change frequently, to avoid polluting your main repository.
Configuring maven-install-plugin to run during the compile phase, and copy the relevant pom.xml to the repository, seems to accomplish what I wanted as far as Maven itself is concerned, though m2eclipse still is not happy (it throws "failed to read artifact descriptor" errors with no additional description for the pom.xml that has a dependency on the other POM project).

Build single module from multimodule pom

Is it possible to do?
The environment: Multimodule pom consists of 3 modules: mm1, mm2, mm3. Module mm2 has mm1 as dependency. It is possible to build parent pom without any errors.
The question: Is it possible to build single module mm2 (i.e., run maven from mm2 base directory) without installing mm1 into local repository?
Thanks.
I'm not sure what you mean exactly by "without installing mm1 into local repository". Do you mean previously to building mm2 or never?
In doubt, maybe one of the new build options announced in the Maven Tips and Tricks: Advanced Reactor Options blog post can help:
Starting with the Maven 2.1 release, there are new Maven command line options which allow you to manipulate the way that Maven will build multimodule projects. These new options are:
-rf, --resume-from
Resume reactor from specified project
-pl, --projects
Build specified reactor projects instead of all projects
-am, --also-make
If project list is specified, also build projects required by the list
-amd, --also-make-dependents
If project list is specified, also build projects that depend on projects on the list
I was specifically thinking of the -pl and -am options. To build a subset of the modules, run the following from the root directory:
$ mvn --projects mm2 --also-make install
However, I'm not sure this answers your question (which is not totally clear for me).
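If the goal is to avoid installing anything, the same options should work with package as well, since mm1 would then be resolved from the reactor rather than from the local repository:
$ mvn --projects mm2 --also-make package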
Without automatically installing, no, but it is possible to build only chosen projects. You need to have a multi-module build (I'm assuming you do). In reactor mode every command needs to be run from the root of the reactor.
So in your case:
mvn reactor:make -Dmake.folders=mm2
In this case you build the mm2 module and the modules on which it depends (mm1).
Useful links:
Maven reactor plugin reference
Maven book reactor chapter
From the book examples, I build only the persist project and its dependency, the model project. Other projects are untouched:
mvn reactor:make -Dmake.folders=sample-persist
(Screenshot from the Maven book showing only sample-persist and sample-model being built: http://www.sonatype.com/books/maven-book/reference/figs/web/running_aro-dependencies.png)
Another useful command is reactor:make-dependents, which builds the projects that depend on X.
This goes against the dependency principles of Maven 2. What exactly is the point of doing that?
However, we could imagine defining mm2's dependency on mm1 as a system dependency:
<dependency>
<groupId>...</groupId>
<artifactId>mm1</artifactId>
<version>...</version>
<scope>system</scope>
<!-- systemPath must be an absolute path to the jar file itself, not a directory; the file name below is hypothetical -->
<systemPath>${project.basedir}/../mm1/target/mm1-${project.version}.jar</systemPath>
</dependency>
