Maven dependency vs multimodule?

I'm very new to Maven. Can someone please explain the difference between using Maven modules and just adding a dependency from your Maven project to another Maven project in your workspace? When would you use one over the other?

A dependency is a pre-built entity. You get the artifact for that dependency from Maven Central (or Nexus, or the like). It is common to use dependencies for code that belongs to other teams or projects. For example, suppose you need a CSV library in Android. You'd pull it as a dependency.
A Maven module gets built just like your project does. It is common to use Maven modules for components that the project owns. For example, maybe your project creates three jar files.
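As a rough sketch, pulling in such a pre-built library is just a matter of declaring it in your POM; the opencsv coordinates below are only an illustration, and any published CSV library would look the same:
<dependencies>
  <dependency>
    <!-- resolved as a pre-built artifact from Maven Central or your repository manager -->
    <groupId>com.opencsv</groupId>
    <artifactId>opencsv</artifactId>
    <version>5.7.1</version>
  </dependency>
</dependencies>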

A dependency can be thought of as a lib/jar (aka an artifact, in Maven parlance) that you need in order to build and/or run your code.
This artifact can either be built by one of the modules of your multi-module project or be a third-party pre-built library (for example log4j).
One of the concepts of Maven is that each module produces a single artifact (say, a jar). So in the case of a complex project it is a good idea to split your project into multiple modules. These modules can then depend on each other via declared dependencies.
See http://books.sonatype.com/mvnex-book/reference/multimodule-sect-intro.html for an example of how a web app is split into parent and child modules and how they are linked.
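As a minimal sketch of that idea (all names below are made up), a parent POM aggregates two modules, and one module depends on the artifact produced by its sibling just like any other dependency:
<!-- parent/pom.xml -->
<groupId>com.example</groupId>
<artifactId>parent</artifactId>
<version>1.0.0</version>
<packaging>pom</packaging>
<modules>
  <module>core</module>
  <module>webapp</module>
</modules>

<!-- webapp/pom.xml: depends on the jar built by the sibling core module -->
<dependencies>
  <dependency>
    <groupId>com.example</groupId>
    <artifactId>core</artifactId>
    <version>1.0.0</version>
  </dependency>
</dependencies>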

One of the most confusing aspects of Maven is the fact that the parent pom can act as both a parent and as an aggregator.
99% of the functionality you think about in Maven is the parent pom aspect, where you inherit things like repositories, plugins, and most importantly, dependencies.
Dependencies are hard, tangible relationships between your libs that are evaluated during each build. If you think of your software as a meal, it's basically saying A requires ingredient B.
So let's say you're preparing lasagne. Then your dependency chain would look something like this:
lasagne
  <- meatSauce
       <- groundBeef
       <- tomatoPaste
  <- cheese
  <- noodles
The key thing is that each of the above items (meatSauce, groundBeef, cheese, etc.) is an individual build with its own set of dependencies.
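In POM terms, the meatSauce build would declare its two ingredients as ordinary dependencies (the group and version below are placeholders):
<!-- meatSauce/pom.xml -->
<dependencies>
  <dependency>
    <groupId>com.example.lasagne</groupId>
    <artifactId>groundBeef</artifactId>
    <version>1.0.0</version>
  </dependency>
  <dependency>
    <groupId>com.example.lasagne</groupId>
    <artifactId>tomatoPaste</artifactId>
    <version>1.0.0</version>
  </dependency>
</dependencies>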
By contrast, the only section of your pom that pertains to aggregation is the modules section:
<modules>
  <module>meatSauce</module>
  <module>groundBeef</module>
  <module>tomatoPaste</module>
  <module>cheese</module>
  <module>noodles</module>
</modules>
Aggregation simply tells your build engine that it should run these 5 builds in rapid succession:
groundBeef -> tomatoPaste -> cheese -> noodles -> meatSauce
The main benefits of aggregation are convenience (just click build once) and ensuring the builds run in the correct order (e.g. you wouldn't want to build meatSauce before tomatoPaste).
Here's the thing though: even if you organize the libs as standalone projects without module aggregation, your build will still come out the same provided you build in the correct order.
Moreover, both Jenkins and Eclipse have mechanisms for triggering builds if a dependent project has changed (e.g. changing groundBeef will automatically trigger meatSauce).
Therefore, if you're building out of Jenkins or Eclipse, there is no need for aggregation.

Related

Gradle share dependencies in a cascade manner between related projects

I have the following Java project structure:
Util
 |
 -- Core
      |
      -- Services
      |
      -- Tools
The Tools and Services projects reference the Core and Util projects. The thing is, I ended up writing the same dependencies over and over in each project; there must be a better way to inherit the dependencies of the referenced projects and add new ones if needed.
I know about multi-project builds in Gradle, but this is not quite like a multi-project setup, since I can basically take the Core library, compile it (which will then contain the Core + Util libs) and use it in another project.
I wonder what would be the best way to approach this?
Repeating the same dependencies in every project is usually reasonable, because in a bigger project you never know when they will diverge, and you don't want to deal with compilation/runtime problems when someone changes the common dependencies list.
I believe it is more pragmatic to add a dependency-analysis plugin to your build. It will help you remove unnecessary dependencies and explicitly add transitive dependencies. And if you add this plugin to your build chain, it will help you keep your dependencies healthy in the future. Pick this plugin here: gradle-dependency-analyze, or maybe there is a better fork or equivalent somewhere.
You don't really have many options in your case, because there are only two kinds of dependencies: (1) external (some other jar artifact) or (2) internal (another module in a multi-module build).
2.1. When you use an external Maven-like dependency, it comes with its own dependencies (known as "transitive dependencies"). This means that if you do compile 'yourgroup:Core:1.0' then you will get Util as a transitive dependency. But as I mentioned above, it is better to list transitive dependencies explicitly if they are used during compilation, or to prevent them from being accidentally removed and crashing your application at runtime.
2.2. If your projects live in the same version control repository and usually change and build together, then the multimodule layout is your best choice. In this case, you will refer to the Core dependency like compile project(':Util:Core') and it will grab Util as a transitive dependency as well. And you will be able to do what you asked for and define the dependencies for Services and Tools once, inside the subprojects {} closure in Core/build.gradle.
Having a multimodule build doesn't limit you from using the Core library elsewhere. Whether or not it is a multimodule build, you can always add the maven-publish plugin to Core/build.gradle, execute the publishToMavenLocal task, and reference Core.jar from another project the same way you do for external dependencies.
You can always put your common code (like the one which will add common dependencies) in the external gradle file or custom plugin and apply it in Services and Tools.

Is it possible to build a "sub jar" of a Maven project?

I have a situation at the moment where I have:
Project A which is built into a fat jar using Maven assembly plugin.
Project B which uses the jar built in Project A. It is added to the project as a resource and launched in a separate process using a process builder.
I wonder if it's possible to achieve similar behaviour using just one Maven project, i.e. build the jar containing only the classes and dependencies required for Project A, and then build the rest of the project with that prebuilt jar.
Sorry if I'm not being very clear here.
This is against a few of Maven's core concepts:
One project, one model (POM). Two projects (A, B), two models (POMs).
There's one artifactId in a POM. What is a second artifact (jar) supposed to be named?
One project leads to one artifact. There is no additional "prebuilt jar" built within the very same project.
Dependencies are for the whole project (and possibly for sub-module projects). I'm not aware of a way to build a jar "containing only the classes and dependencies required for project A".
Artifacts are stored:
in <project>/target temporarily
in the local Maven repository (default: ~/.m2/repository)
possibly in a remote Maven repository
... while resources are taken from <project>/src/main/resources during the build.
There might be some tricky solutions (which possibly have pitfalls, too) to achieve this if one thinks about it thoroughly. But I'd never ever recommend such a thing.
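If it helps, here is a sketch of how the two-project setup from the question is usually wired without any single-project tricks (coordinates are hypothetical): Project A attaches its fat jar under a classifier, and Project B consumes it as a normal dependency instead of a copied resource:
<!-- project-a/pom.xml: the assembly plugin attaches the fat jar under a classifier -->
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-assembly-plugin</artifactId>
  <executions>
    <execution>
      <phase>package</phase>
      <goals>
        <goal>single</goal>
      </goals>
      <configuration>
        <descriptorRefs>
          <descriptorRef>jar-with-dependencies</descriptorRef>
        </descriptorRefs>
      </configuration>
    </execution>
  </executions>
</plugin>

<!-- project-b/pom.xml: depend on the attached fat jar by its classifier -->
<dependency>
  <groupId>com.example</groupId>
  <artifactId>project-a</artifactId>
  <version>1.0.0</version>
  <classifier>jar-with-dependencies</classifier>
</dependency>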

Maven multi-project depths

I was trying to build my Maven POMs in something similar to the following hierarchical form:
root
 +-- A-POM
      +-- B-POM
      +-- C-POM
           +-- D-POM
I was hoping that this could take care of my changed module problem. That is, if C is changed, then A must be rebuilt, etc.
But I ran into the issue that it seems the packaging at root is "pom", and after that I can't have A with packaging "war" and then continue to drill in to have A include B and C as its modules. It seems to me that any POM which does not have "pom" in its <packaging> element can't have child modules. Is my understanding correct? Is there a way to do what I wanted to do?
In addition, I don't seem to be able to chain the "changed" mechanism in Maven (most likely due to my lack of knowledge). I would like Maven to detect that a dependent project has changed and rebuild all the affected projects.
Thanks so much!
The reactor project (the root of the multi-module project) must have pom packaging. So your nested structure is invalid, since A is not of type pom, and I'm pretty sure you won't get it to work this way.
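In other words, every POM that declares <modules> needs <packaging>pom</packaging>. One common workaround (module names below are placeholders) is to turn A into a pom-packaged aggregator of its own and push the actual war into a child module:
<!-- root/pom.xml -->
<packaging>pom</packaging>
<modules>
  <module>a-parent</module>
</modules>

<!-- root/a-parent/pom.xml: intermediate aggregator, also packaging "pom" -->
<packaging>pom</packaging>
<modules>
  <module>a-war</module> <!-- the module that actually has <packaging>war</packaging> -->
  <module>b</module>
  <module>c</module>
</modules>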
Second point: Maven is a modularized build system and uses repository mechanisms to locate pre-built artifacts, instead of checking out all modules from version control and building them in a monolithic way like in the old days ;) This means that Maven cannot know what to rebuild when you change something in one module, since it simply does not have all the other modules there at that time.
I think this is more a CI task than something that should be handled by the build system itself. You can achieve such behavior with an appropriate build/CI server like Jenkins, which supports upstream and downstream projects. This means it is able to detect dependencies between the projects and trigger other builds as soon as a dependency has been built. This comes close to the behavior you are trying to achieve.
Btw. rebuilding other projects is only required for SNAPSHOT dependencies. Jenkins with the Maven plugin supports this behavior but, depending on the number of SNAPSHOT dependencies in your project, this can cause long chains of project builds on the server. Some folks are of the opinion that SNAPSHOT versions are, in general, hell for CI tasks since these artifacts can change over time and are not reproducible. You could consider completely omitting SNAPSHOT versions and building final versions each time. This would also obviate your requirement to rebuild other modules as soon as a module changes. There are simply no changes until you upgrade dependency versions.

Dependency on non-maven module

The entire Java project has an Ant build; however, a couple of modules have a Maven build too.
My new module (Maven-built, say A) has a dependency on an existing module (or simply a folder?, say B) which is built using Ant; that build just packages the src into a jar and drops it inside the project.
The Maven build for module A fails (it is unable to locate module B's files). Options:
1. Package module B using Maven and push it to the m2 repo.
I do not want to go with this option.
Please let me know what other options are available for the same.
If you have control over the source of all your modules, and if you have decided that Maven is your way forward, then I recommend going all-in as soon as possible. If you do not, you will have continuous problems getting the two build strategies to play along, and not only at build time but also at deploy time, when it is time to collect all your runtime dependencies. Unless, that is, you build one uber-JAR as your only deployable artifact.
If your modules will (almost) always release in sync, then consider using a multi-module project setup.
When I say "all-in", I do not mean that you have to give up Ant completely. You can use the maven-antrun-plugin to kick off your existing Ant builds.
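For instance, here is a rough sketch of calling module B's existing Ant build from module A's POM via maven-antrun-plugin; the path, phase, plugin version and Ant target name are assumptions about your layout:
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-antrun-plugin</artifactId>
  <version>3.1.0</version>
  <executions>
    <execution>
      <id>build-module-b</id>
      <!-- run early enough that B's jar exists before A compiles against it -->
      <phase>generate-resources</phase>
      <goals>
        <goal>run</goal>
      </goals>
      <configuration>
        <target>
          <!-- delegate to the existing Ant build of module B -->
          <ant antfile="${project.basedir}/../moduleB/build.xml" target="jar"/>
        </target>
      </configuration>
    </execution>
  </executions>
</plugin>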
You should also consider running your own repository server, e.g. Nexus, to take full advantage of your maven builds.

How should I share Maven DependencyManagement from multiple sources?

I have a multimodule Maven project. Some of the modules create custom packaging for the libraries produced by the other modules. The packaging being used has its own suite of versioned dependencies that I need to play nice with.
As an example: my parent POM might have an entry for e.g. commons-codec:commons-codec 1.4, my "core-lib" POM includes it as a dependency (sans explicit version), and I want to make sure my packaging module bundles in the right version. However, the specific type of custom packaging that I'm using also needs e.g. log4j:log4j 1.2.15, and I want to make sure that when my packaging module runs, it also bundles the correct log4j version.
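Concretely, that arrangement looks something like this (versions taken from the example above):
<!-- parent POM -->
<dependencyManagement>
  <dependencies>
    <dependency>
      <groupId>commons-codec</groupId>
      <artifactId>commons-codec</artifactId>
      <version>1.4</version>
    </dependency>
  </dependencies>
</dependencyManagement>

<!-- core-lib POM: no version, it is inherited from the parent's dependencyManagement -->
<dependencies>
  <dependency>
    <groupId>commons-codec</groupId>
    <artifactId>commons-codec</artifactId>
  </dependency>
</dependencies>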
Here's the wrinkle: the example POM I'm working from for "project that makes {custom packaging}" uses a parent that's provided by the custom-packaging team. If I use their parent, I lose the version info for commons-codec. If I use my parent, I lose the version info for log4j.
Now, ordinarily if I ask "how do I make A and B depend on the same version", you'd answer "make A and B have the same parent, and include a dependencyManagement section in the parent". My problem is, I need A, B, and C to depend on the same version, but I don't have any control over C.
I think this is what Maven "mixins" are meant to address, but of course they don't exist yet. In the meantime, what I've been doing is picking one parent, then copy-and-pasting the dependencyManagement section from the other POM, with a comment saying "make sure you keep this up to date". Obviously this is an ugly, ugly hack, but I haven't found another way to keep current with both sides.
What about using the assembly plugin to pack up your artifact with all its dependencies and having your packaging module run on that instead? Then you're not trying any pom magic. It's just a matter of one project using the artifact from another project, like usual.
For now, I'm going to accept the answer of "this is one of the really sucky things about Maven". Maybe this question can get updated when Maven 3.1 finally launches.
Could you not activate multiple profiles which have their own dependencies section, pulling in the required libraries when enabled? This allows some nice flexibility due to the ways that profiles can be activated.
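A rough sketch of what that might look like (the profile id, the activation property and the dependency are placeholders; log4j is taken from the example above):
<profiles>
  <profile>
    <id>custom-packaging</id>
    <activation>
      <property>
        <name>custom.packaging</name>
      </property>
    </activation>
    <!-- these dependencies are only added when the profile is active -->
    <dependencies>
      <dependency>
        <groupId>log4j</groupId>
        <artifactId>log4j</artifactId>
        <version>1.2.15</version>
      </dependency>
    </dependencies>
  </profile>
</profiles>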
