How to create a maven assembly with transitive dependencies for different deployment scenarios? - maven

I'm having a problem reconciling building a project for use within an application server and for use as a stand-alone application.
To give an overall simplified context, say I have three Projects A, B, C.
Project A depends on Project B which depends on Project C.
Project C has a dependency X which is marked as provided since it was expected that it would be available as a JEE library within say an application server. i.e. jms.jar.
So if I perform an assembly build of Project A, I get all the transitive dependencies save for those marked as provided as expected.
Now I have a new deployment scenario where Project A needs to be used in a standalone environment i.e. outside an application server.
So now I need the jms jar to be a compile dependency. Does this mean that I should explicitly add a compile dependency for X in Project A? Doesn't this violate the Law of Demeter, i.e. don't talk to strangers, in the sense Project A shouldn't explicitly know about Project C but only about Project B?
This is a simple example, but in reality I have multiple dependencies which were marked as provided and now need to be compile or runtime dependencies so that they end up in the artifact produced by the maven assembly plugin.
Is this a fundamental problem with Maven or am I not using the tools correctly?
Thanks in advance for any guidance.

If you need your build to have variations in it for different scenarios, you need to use profiles and keep certain things (such as some of the dependencies) in the various profiles.
http://maven.apache.org/pom.html#Profiles
The question "Different dependencies for different build profiles in maven" answers a similar case - just read its "release" and "debug" profiles as your two deployment scenarios (app server vs. standalone).
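As a rough sketch (the profile id and coordinates are illustrative, not taken from the question's POMs), Project A could keep the container-provided jars in a profile that is only activated for the standalone build, e.g. mvn package -Pstandalone:

<profiles>
  <profile>
    <id>standalone</id>
    <dependencies>
      <!-- jars the app server would normally provide; use whichever
           JMS API artifact your standalone runtime actually needs -->
      <dependency>
        <groupId>javax.jms</groupId>
        <artifactId>jms-api</artifactId>
        <version>1.1-rev-1</version>
      </dependency>
    </dependencies>
  </profile>
</profiles>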

Provided dependencies are a difficult subject. First of all: Provided dependencies are not transitive in the following sense: If your project C has a provided dependency on X, then A will not get the dependency. It is silently ignored. This fits with the following meaning of "provided" which I propose:
Only the artifacts that are actually deployed should mark dependencies as "provided". Libraries or other jars that are not individually deployed to a specific server should not have provided dependencies. Instead, they should declare their dependencies as compile dependencies. In your example: Project C should have a compile dependency on X. If project A knows that X is provided, it sets X to provided in its dependencyManagement. As project A should know the environment in which it runs, it is the one to decide what is provided and what is not, and dependencyManagement is the right place to declare this.
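A minimal sketch of that arrangement (group id, artifact id and version are placeholders): C declares X as an ordinary compile dependency, and A pins the scope in its own POM:

<!-- in Project A's pom.xml -->
<dependencyManagement>
  <dependencies>
    <dependency>
      <groupId>com.example</groupId>
      <artifactId>X</artifactId>
      <version>1.0</version>
      <scope>provided</scope>
    </dependency>
  </dependencies>
</dependencyManagement>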
If your project A should be able to run within and without a given server, you probably need to make a lot of adjustments, even change the type from ear to jar. So you either use build profiles for this, which then have different dependencyManagement entries, or you split A into two projects which depend on some other project that contains the common elements.
If some given project C already has a provided dependency on X and you cannot change that, this is effectively the same as a missing dependency in C. This has to be repaired at some point, and this could be project A itself.

Related

Is it possible to force a Maven plugin to be included in a project from a dependency of that project?

I have three Java projects. The first is an application, com.foo:foo-application:1.0.0, and the second is a module used as a dependency to that application, com.foo:foo-framework:1.0.0. The third is a Maven plugin authored by our team, com.foo:foo-plugin:1.0.0.
My intention is that any project, e.g. foo-application, which uses classes available in foo-framework must also validate that it has used those classes correctly, where said validation is enforced by foo-plugin.
Is there a way to enforce this behaviour within foo-framework's POM.xml, whereby any Maven module which declares it as a dependency in its own POM will have foo-plugin executed as part of its build lifecycle?
No (at least no way that I'm aware of).
When you declare a dependency on something, you're declaring a dependency on its output artifacts (and transitively their dependencies, as optionally described in that artifact's pom.xml file). There's no place in a pom file to force anything on the build importing it - the build importing it may not even be a Maven build.
It appears you may be able to do something similar via other tools though - for example, Checkstyle supports discovering rules from dependencies on the classpath (not exactly what you want, and it depends on users of your library running Checkstyle configured just right).
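For illustration only (the artifact names and the resource path are made up, and the plugin version is whatever you standardize on), the usual pattern is for the consuming project to add the ruleset jar as a dependency of the maven-checkstyle-plugin and point configLocation at a resource inside it:

<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-checkstyle-plugin</artifactId>
  <configuration>
    <!-- resolved from the plugin's classpath, i.e. from foo-build-rules -->
    <configLocation>foo-checkstyle/checkstyle.xml</configLocation>
  </configuration>
  <dependencies>
    <dependency>
      <groupId>com.foo</groupId>
      <artifactId>foo-build-rules</artifactId>
      <version>1.0.0</version>
    </dependency>
  </dependencies>
  <executions>
    <execution>
      <goals>
        <goal>check</goal>
      </goals>
    </execution>
  </executions>
</plugin>

Note that this still has to live in the consumer's (or a shared parent's) POM - it cannot be forced on consumers from foo-framework itself.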

Gradle share dependencies in a cascade manner between related projects

I have the following Java projects structure:
Util
|
-- Core
|
-- Services
|
-- Tools
The Tools and Services projects reference the Core and Util projects. The thing is that I ended up writing the same dependencies over and over in each project; there must be a better way to inherit the dependencies of the referenced projects and add new ones if needed.
I know about multi-project builds in Gradle, but this is not quite that case, since I can basically take the Core library, compile it (which will then contain the Core + Util libs) and use it in another project.
I wonder what would be the best way to approach this?
Repeating the same dependencies in every project is usually reasonable, because in a bigger project you never know when they will start to differ, and you don't want to deal with compilation/runtime problems when someone changes the common dependency list.
I believe it is more pragmatic to add a dependency-analyser plugin to your build. It will help you remove unnecessary dependencies and explicitly add transitive dependencies, and if you add it to your build chain it will help you keep your dependencies healthy in the future. Pick the plugin up here: gradle-dependency-analyze, or maybe there is a better fork or equivalent somewhere.
In your case you really have only two options, because there are only two kinds of dependencies: (1) external (some other jar artifact) or (2) internal (another module in a multimodule build).
2.1 When you use an external maven-like dependency, it comes with its own dependencies ("transitive dependencies"). That means if you do compile 'yourgroup:Core:1.0' then you will get Util as a transitive dependency. But as mentioned above, it is better to list transitive dependencies explicitly if they are used during compilation, or to prevent them from being accidentally removed and crashing your application at runtime.
2.2 If your projects live in the same version control repository and usually change and build together, then the multimodule layout is your best choice. In this case, you will refer to the Core dependency like compile project(':Util:Core') and it will grab Util as a transitive dependency as well. And you will be able to do what you asked for and define dependencies for Services and Tools once, inside the subprojects {} closure in Core/build.gradle.
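For example, assuming the layout described above (Services and Tools as subprojects of Core), a sketch of that closure could look like this:

// Core/build.gradle
subprojects {
    apply plugin: 'java'
    dependencies {
        // every subproject (Services, Tools, ...) gets Core,
        // and Util comes along transitively via Core's own dependency
        compile project(':Util:Core')
    }
}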
Having a multimodule build doesn't limit you from using the Core library elsewhere. Whether it is a multimodule build or not, you can always add the maven-publish plugin to Core/build.gradle, execute the publishToMavenLocal task and reference Core.jar from another project the same way you do for external dependencies.
You can always put your common code (like the code which adds the common dependencies) in an external gradle file or a custom plugin and apply it in Services and Tools.

Is it possible to build a "sub jar" of a Maven project?

I have a situation at the moment where I have:
Project A which is built into a fat jar using Maven assembly plugin.
Project B which uses the jar built in Project A. It is added to the project as a resource and launched in a separate process using a process builder.
I wonder if it's possible to achieve similar behaviour using just one Maven project, i.e. build the jar containing only the classes and dependencies required for project A, and then build the rest of the project with the prebuilt jar.
Sorry if I'm not being very clear here.
This is against a few of Maven's core concepts:
One project, one model (POM). Two projects (A, B), two models (POMs).
There's one artifactId in a POM. What is a second artifact (jar) supposed to be named?
One project leads to one artifact. There is no additional "prebuilt jar" built within the very same project.
Dependencies are for the whole project (and possibly sub-module projects). I'm not aware of a way to build a jar "containing only the classes and dependencies required for project A".
Artifacts are stored:
in <project>/target temporarily
in the local Maven repository (default: ~/.m2/repository)
possibly in a remote Maven repository
... while resources are taken from <project>/src/main/resources during the build.
There might be some tricky solutions (which have possibly pitfalls, too) to achieve this if one thinks about it thoroughly. But I'd never ever recommend such.

Maven dependency vs multimodule?

Very new to Maven; can someone please explain to me the difference between using Maven modules vs. just adding a dependency from your Maven project to another Maven project in your workspace? When would you use one over the other?
A dependency is a pre-built entity. You get the artifact for that dependency from Maven Central (or Nexus or the like.) It is common to use dependencies for code that belongs to other teams or projects. For example, suppose you need a CSV library in Android. You'd pull it as a dependency.
A Maven module gets built just like your project does. It is common to use Maven modules for components that the project owns. For example, maybe your project creates three jar files.
A dependency can be thought of as a lib/jar (aka Artifact in Maven parlance) that you need to use for building and/or running your code.
This artifact can either be built by one of the modules of your multi-module project or be a third-party pre-built library (for example log4j).
One of the concepts of Maven is that each module produces a single artifact (say a jar). So in the case of a complex project, it is a good idea to split your project into multiple modules, and these modules can depend on each other via declared dependencies.
See http://books.sonatype.com/mvnex-book/reference/multimodule-sect-intro.html for example of how a web app is split to parent and child modules and how they are linked.
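As a minimal illustration (the module names and group id are made up, not taken from the linked book), one module declares its sibling as a dependency by its coordinates, and Maven works out the build order from that:

<!-- web/pom.xml, where web and core are modules of the same parent -->
<dependency>
  <groupId>com.example</groupId>
  <artifactId>core</artifactId>
  <version>${project.version}</version>
</dependency>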
One of the most confusing aspects of Maven is the fact that the parent pom can act as both a parent and as an aggregator.
99% of the functionality you think about in Maven is the parent pom aspect, where you inherit things like repositories, plugins, and most importantly, dependencies.
Dependencies are hard, tangible relationships between your libs that are evaluated during each build. If you think of your software as a meal, it's basically saying A requires ingredient B.
So let's say you're preparing lasagne. Then your dependency chain would look something like this:
lasagne
<- meatSauce
<- groundBeef
<- tomatoPaste
<- cheese
<- noodles
The key thing is that each of the above items (meatSauce, groundBeef, cheese, etc.) is an individual build with its own set of dependencies.
By contrast, the only section of your pom that pertains to aggregation is the modules section:
<modules>
<module>meatSauce</module>
<module>groundBeef</module>
<module>tomatoPaste</module>
<module>cheese</module>
<module>noodles</module>
</modules>
Aggregation simply tells your build engine that it should run these 5 builds in rapid succession:
groundBeef -> tomatoPaste -> cheese -> noodles -> meatSauce
The main benefit of aggregation is the convenience (just click build once) and ensuring the builds are in the correct order (e.g. you wouldn't want to build meatSauce before tomatoPaste).
Here's the thing though: even if you organize the libs as standalone projects without module aggregation, your build will still come out the same provided you build in the correct order.
Moreover, both Jenkins and Eclipse have mechanisms for triggering builds if a dependent project has changed (e.g. changing groundBeef will automatically trigger meatSauce).
Therefore, if you're building out of Jenkins or Eclipse, there is no need for aggregation.

How should I share Maven DependencyManagement from multiple sources?

I have a Maven project multimodule project. Some of the modules create custom packaging for the libraries produced by the other modules. The packaging being used has its own suite of versioned dependencies that I need to play nice with.
As an example: my parent POM might have an entry for e.g. commons-codec:commons-codec 1.4, my "core-lib" POM includes it as a dependency (sans explicit version), and I want to make sure my packaging module bundles in the right version. However, the specific type of custom packaging that I'm using also needs e.g. log4j:log4j 1.2.15, and I want to make sure that when my packaging module runs, it also bundles the correct log4j version.
Here's the wrinkle: the example POM I'm working from for "project that makes {custom packaging}" uses a parent that's provided by the custom-packaging team. If I use their parent, I lose the version info for commons-codec. If I use my parent, I lose the version info for log4j.
Now, ordinarily if I ask "how do I make A and B depend on the same version", you'd answer "make A and B have the same parent, and include a dependencyManagement section in the parent". My problem is, I need A, B, and C to depend on the same version, but I don't have any control over C.
I think this is what Maven "mixins" are meant to address, but of course they don't exist yet. In the meantime, what I've been doing is picking one parent, then copy-and-pasting the dependencyManagement section from the other POM, with a comment saying "make sure you keep this up to date". Obviously this is an ugly, ugly hack, but I haven't found another way to keep current with both sides.
What about using the assembly plugin to pack up your artifact with all its dependencies and having your packaging module run on that instead? Then you're not trying any pom magic. It's just a matter of one project using the artifact from another project, like usual.
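A rough sketch of that suggestion (this is just the stock jar-with-dependencies setup of the assembly plugin, nothing specific to the custom packaging in the question):

<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-assembly-plugin</artifactId>
  <configuration>
    <descriptorRefs>
      <descriptorRef>jar-with-dependencies</descriptorRef>
    </descriptorRefs>
  </configuration>
  <executions>
    <execution>
      <id>make-assembly</id>
      <phase>package</phase>
      <goals>
        <goal>single</goal>
      </goals>
    </execution>
  </executions>
</plugin>

The packaging module then consumes that assembled artifact like any other dependency.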
For now, I'm going to accept the answer of "this is one of the really sucky things about Maven". Maybe this question can get updated when Maven 3.1 finally launches.
Could you not activate multiple profiles, each with its own dependencies section pulling in the required libraries when enabled? This allows some nice flexibility due to the ways that profiles can be activated.
