Do I need the Maven wrapper for submodules? - maven

If I have a parent Maven project that contains the Maven wrapper files, is there any reason to also include the wrapper files in the submodules of this project? Does it really matter? Are there any possible side effects of doing this that I need to look out for? I only plan on running Maven commands from the parent project, but some day I may want to break the submodules up into their own separate projects.
Edit: To be clear, I'm talking about mvnw, mvnw.cmd, and the .mvn directory

The Maven Wrapper is only needed at the root level of the project. Its purpose is to pin the Maven version used to build the project and to download that version automatically.
If you later decide to split up the project, that would mean creating separate repositories (in Git), and each of those projects would then have its own root, where it would again make sense to add the Maven Wrapper.
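Should you split a module out later, regenerating the wrapper in the new root is a one-off command. A minimal sketch, assuming a recent Maven Wrapper Plugin (the version number below is only an example):
$ cd new-project-root
$ mvn -N wrapper:wrapper -Dmaven=3.9.6    # generates mvnw, mvnw.cmd and .mvn, pinned to that Maven version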

Related

Gradle search local maven or gradle repository

I am using Gradle and its local repository is at \.gradle\caches\modules-2\files-2.1, which has all the downloaded jars but not my modules. Is there any specific place I should be searching for them?
I need it because in settings.gradle I have a dependency path specified like:
include ':model'
project (':model').projectDir = new File(settingsDir, './model')
in a new project. Also, I don't want to specify the path that way, because if multiple projects depend on this project, spelling out the path everywhere becomes difficult and awkward.
How can I make Gradle find it in the local Maven or Gradle repositories?
I'm still not sure what is being asked here, and I suspect there is some confusion over how multi-project builds work. So I'm going to attempt to provide a general-purpose answer.
The first question you need to answer is whether you're interested in dependencies between projects that are part of the same build — as in part of a multi-project build — or in separate builds.
Project dependencies (multi-project builds)
Project dependencies are covered in the user manual and only apply to multi-project builds. They use a logical path, using colons as 'path' separators, to specify the location of the target module, like so:
dependencies {
implementation project(":model")
}
At this point, Gradle needs to know where ":model" exists on the file system. There's no getting around that. You have a few options:
Follow the convention of directory structure matching the logical path structure, i.e. have a MyBigProject/model directory containing the ":model" child project
Specify the file path of ":model" in settings.gradle, e.g. with project(":model").projectDir = new File(rootDir, "unusual/path/to/model")
Automate the discovery of projects
The most common approach is the first one. The second is not unusual, particularly if you want to put child projects into a separate directory, like subprojects — something the build of Gradle itself does. I haven't seen the last option done, and I don't know whether it runs into problems.
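For illustration, a settings.gradle along these lines covers the first two options (the project names and the unusual path here are hypothetical):
rootProject.name = 'MyBigProject'

// Option 1: conventional layout, ":model" lives in MyBigProject/model
include ':model'

// Option 2: explicit location for a project that lives outside the convention
include ':reports'
project(':reports').projectDir = new File(rootDir, 'unusual/path/to/reports')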
For the sake of completeness, and at your own risk if you use something like it, here's an example of automatic discovery of projects in the settings.gradle file:
rootDir.eachDir { File dir ->
    if ("build.gradle" in dir.listFiles()*.name) {
        include dir.name
    }
}
This fragment basically looks for directories within the root project folder that have a build.gradle file in them and adds them as child projects. The child projects' directory names become the projects' names.
It's not particularly clever, and you should really use different names for the build files, but it may give you some ideas to work with.
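If you do want distinct build file names per project, the settings script can assign them; a small sketch, assuming the standard ProjectDescriptor.buildFileName property (the naming scheme is just an example):
// place this after the include(...) calls in settings.gradle
rootProject.children.each { child ->
    // e.g. the ":model" project would then use model.gradle instead of build.gradle
    child.buildFileName = "${child.name}.gradle"
}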
Non-project dependencies
As with project dependencies, Gradle needs to know where to get the corresponding JAR or other form of artifact for a specified module. You normally specify Maven Central or something similar for this, but there are other useful, but less common, options:
Copy a project's artifacts into the local Maven repository — both the Maven Plugin and Maven Publish Plugin support this
Publish to a Maven-compatible repository using a file:// URL rather than an HTTP/HTTPS one, which protects your projects from corruption of Maven Local (both approaches are sketched below)
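On the consuming side, either option shows up as a repository declaration. A minimal build.gradle sketch (the file path is hypothetical):
repositories {
    mavenLocal()                                       // artifacts installed into ~/.m2/repository
    maven { url = uri('file:///path/to/team-repo') }   // a Maven-layout repository reachable via file://
    mavenCentral()
}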
Worth noting is that Gradle supports composite builds that allow you to substitute a normal dependency with (effectively) a project dependency from another build. So if model were part of a separate build but you had the source code and build locally, you could make changes and immediately test them in another build's project without going through the whole "install" intermediate step that's common in the Maven world (and Gradle pre-composite-builds).
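A composite build is declared in the consuming build's settings.gradle; a sketch (the relative path and the dependency coordinates are hypothetical):
// Gradle substitutes a matching binary dependency, e.g. 'com.example:model:1.0-SNAPSHOT',
// with the project built by the included build.
includeBuild('../model-build')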
Hope all this makes sense.

How to make IntelliJ reference a local project for a dependency?

Working in a multi-module Maven project, call it "app." I need to work on the source of one of the dependencies, call it "lib", and be able to easily test/debug "app" against my changes in "lib."
In Eclipse this is an option in its Maven and Gradle plug-ins, and it is straightforward since Eclipse doesn't bind the concepts of "workspace" and "project" as tightly as IntelliJ does. When I cloned the repo for "lib", IntelliJ offered to create a new project for it, but how do I force "app" to use the local working copy of "lib" for compilation and runtime?
To put it another way, can IntelliJ basically encapsulate running an install of "lib" behind the scenes so that "app" uses the updated (snapshot) version of it?
The obvious, cleanest choice would be to combine the two projects into a common Maven multi-pom project. If that is something you can't do (perhaps the projects belong to different teams etc.), then I could imagine you could fake it by using symlinks.
Create a wrapper project with just a pom file and two modules. Instead of folders for the modules, use symbolic links to the actual file locations. Obviously the reactor root pom would not be the parent pom.
Now open the wrapper pom as IntelliJ project.
I don't know if this works, but it's worth a try.
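Roughly along these lines (all names and paths here are hypothetical):
$ mkdir wrapper-project && cd wrapper-project
$ ln -s /path/to/app app    # symlink to the existing "app" checkout
$ ln -s /path/to/lib lib    # symlink to the existing "lib" checkout
# then add a minimal aggregator pom.xml here with <packaging>pom</packaging>,
# listing <module>app</module> and <module>lib</module>, and open that pom in IntelliJ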

Maven module dependency source instead of repository jars

I have a multi-module project, i.e.
parent
module1
module2
In one dev cycle, I added a class mod1.A to module1. Class mod2.B in module2 depends on it.
I do not have the artifacts in my local .m2/repository. Running this:
$ cd prj/module2
$ mvn -o exec:java -Dexec.mainClass=mod2.B
results in an error along the lines of:
The following artifacts could not be resolved: com.example:module1:jar:1.0-SNAPSHOT
After I install the artifacts via mvn install while in the prj folder, it all works as expected.
However, this presents an issue in at least two ways:
I have to go through the slower install phase instead of the faster compile phase
I have two working copies of the same project with conflicting modifications. I cannot run the same Java class against each copy's own modifications, only against whichever copy is currently installed, since both use the same SNAPSHOT version
There are workarounds for both (skipping parts of the build for the first, using different snapshot versions for the second), but they are far from usable in practice.
Is there a way to make Maven use the local modules instead of the artifacts from the local Maven repository?
If I understand your question correctly, it seems like you are living a bit outside the norm here: you have two local "copies" of the project with different modifications, that you want to work with alternately when running "exec:java". And Maven is getting in your way: it expects your local .m2 repository area to be in play, but the version strings in each copy are the same, so you end up with the changes interfering among the copies.
To me, it sounds like what you are trying to do is to test your changes. I suggest you just write an actual JUnit or TestNG test in module2 that tests what you want (it can just call mod2.B Main if you want). Then, from your chosen project directory, you can run mvn test -Dtest=MyTestName. It won't "install" anything and it will find the dependencies the way you want it to.
Otherwise, I can see three options.
Change the version string locally in one of the copies (mvn versions:set -DnewVersion=B-SNAPSHOT can do this for you). That way any "installed" jars from your work on that copy will not be considered by the other copy, and vice-versa. You refer to this as being "far from usable" ... I think it should be fine? These are different versions of the project! They should have different version strings! I strongly recommend this option out of the three. (You can do mvn versions:revert when done if you used :set, or you can rely on version control to undo the change.)
Select a different local repository for Maven to use when working on one of the projects, with a command-line flag as per https://stackoverflow.com/a/7071791/58549 (options 1 and 2 are sketched after this list). I don't really think this is a good solution, since you would have to be very careful about using the right flags every time with both projects. Also you'd end up having to re-download Maven plugins and any other dependencies into your new local repository anyway, which is kind of a waste of time.
Try to avoid using any local repository at all. You seem to be trying to make this option work. I don't think this is a great approach either; you're fighting against Maven's expectations, and it limits your flexibility a lot. Maven will indeed find dependencies from the "reactor" (i.e., the executing mvn process) first, but this means all of the required modules must be available in the reactor to be found, which means you can only run mvn at the top level. So if instead you want to just do "mvn exec:java" inside a single module, mvn needs to find that module's dependencies somewhere ... and that's what the local repo is generally used for.
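A command-line sketch of options 1 and 2 (the version string and the repository path are hypothetical):
$ mvn versions:set -DnewVersion=B-SNAPSHOT        # option 1: give this working copy its own version
$ mvn versions:revert                             # undo later, or revert the pom via version control
$ mvn -Dmaven.repo.local=/tmp/alt-repo install    # option 2: use a separate local repository for this checkout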
If you're dead set on going with option 3 (instead of option 1), then I suggest you follow the comments on your question and create a profile that runs your exec selectively against module2 and binds it to a lifecycle phase. But this is in practice very close to just wrapping it with a test.
For IntelliJ users:
I solved this problem using IntelliJ's Run configuration. It has the options Resolve workspace artifacts and Add before launch task -> Build. See this picture for clarification:
Run configuration example
The whole point of modules in Maven is to create decoupling between them. You either build each module independently, so that you can work on one module without touching the other, or include both modules as sub-modules in the parent pom and build the parent, which will resolve dependencies between its sub-modules and trigger their builds.
It looks like you have two options here:
Review the structure of your project. Do you really need to split it into two separate modules, if you change code in both of them simultaneously?
Import the project into a Maven-aware IDE (IntelliJ IDEA is very good at working with Maven), and let the IDE handle the compilation. Once the code base is finished and stabilized, build normally with Maven.

Maven multi-project depths

I was trying to build Maven pom in something similar to the following hierarchical form:
root
+-- A-POM
    +-- B-POM
    +-- C-POM
        +-- D-POM
I was hoping that this could take care of my changed module problem. That is, if C is changed, then A must be rebuilt, etc.
But I ran into the issue that the packaging at root is "pom", and below that I can't have A with packaging "war" and then continue to drill in and have A include B and C as its modules. It seems to me that any POM which does not have "pom" as its packaging can't have child modules. Is my understanding correct? Is there a way to do what I wanted to do?
In addition, I don't seem to be able to chain the "changed" mechanism in Maven (probably due to my lack of knowledge). I would like Maven to detect that a dependent project has changed and rebuild all the affected projects.
Thanks so much!
The reactor project (the root of the multi-module project) must have pom packaging; in fact, any project that declares modules must. So your nested structure is invalid, since A is not of type pom, and I'm pretty sure you won't get it to work this way.
Second point: Maven is a modularized build system and uses repository mechanisms to locate pre-built artifacts, instead of checking out all modules from version control and building them in a monolithic way like in the old days ;) This means that Maven cannot know what to rebuild when you change something in your module, since it simply does not have all the other modules available at that time.
I think this is more a CI task than something that should be handled by the build system itself. You can achieve such behavior with an appropriate build/CI server like Jenkins, which supports upstream and downstream projects: it detects dependencies between projects and triggers other builds as soon as a dependency has been built. This comes close to the behavior you are trying to achieve.
Btw., rebuilding other projects is only required for SNAPSHOT dependencies. Jenkins with the Maven plugin supports this behavior, but, depending on the number of SNAPSHOT dependencies in your project, this can cause long chains of project builds on the server. Some folks are of the opinion that SNAPSHOT versions are generally hell for CI tasks, since these artifacts can change over time and are not reproducible. You could consider omitting SNAPSHOT versions entirely and building final versions each time. This would also remove your requirement to rebuild other modules as soon as a module changes: there are simply no changes until you upgrade dependency versions.

Good approach of a maven project design or antipattern design

Context:
I have a multi-module Maven project that looks like:
Root ---- ModuleA
     ---- ModuleB
     ---- ModuleC
     ---- ModuleD
     ---- .......
There are around 25 modules under the Root:
A few of them represent the core of the application (5 modules)
And each of the remaining modules implements the business processes related to one type of customer. These modules are completely independent of each other.
When packaging or releasing the 'Root' project, the artifact generated is a single ZIP file aggregating all the JARs related to 'Root' modules.
The single ZIP file is generated according to an assembly descriptor; it represents the delivery artifact.
At deployment time on the target environment, the single ZIP is unzipped into a directory where it is consumed (class-loaded) by an 'engine', a Java web application that provides the final services.
Constraints
The 'business constraints' on one side,
And the desire to reduce regressions between different versions on the other side
The above constraints led us to adopt the following release scenarios:
Either we release the Root and ALL its submodules. This means that the resulting ZIP aggregates all the submodule JARs with the same version. The ZIP will contain something similar to:
[ModuleA-1.1.jar, ModuleB-1.1.jar, ModuleC-1.1.jar, ModuleD-1.1.jar, ......., ModuleX-1.1.jar].
Or we release the Root and A FEW of its submodules, the ones that we want to update.
The resulting ZIP still aggregates all the submodule JARs: the released submodules are aggregated with their newly released versions, and the unreleased submodules with another 'appropriate' version.
For example, after such an incremental release, the ZIP will contain something similar to:
[ModuleA-1.2.jar, ModuleB-1.1.jar, ModuleC-1.2.jar, ModuleD-1.1.1.jar, ......., ModuleX-1.1.2.jar].
These 2 scenarios were made possible by:
Either declaring the modules as MAVEN MODULES 'module' for the first scenario,
Or declaring the modules as MAVEN DEPENDENCIES 'dependency' for the second, INCREMENTAL scenario.
Question
Both scenarios are working perfectly, BUT when we are in the 2nd scenario (INCREMENTAL), maven-release-plugin:prepare uploads to the SCM (svn) all the modules [ModuleA, ModuleB, ModuleD, .... ModuleX]: it uploads both the released and the non-released ones, even though the 'non-released modules' are declared as a 'dependency' in the pom and not as a 'module'.
1/ IS THERE a way to avoid uploading the 'non-released' modules? Is there a way to inject an 'exclude directory list' into the SCM svn provider?
2/ A MORE global question: is the approach we use a correct one, or is it an anti-pattern? In that case, what should the alternative be?
Thank you.
To me, your approach looks like an antipattern. I recommend to only have projects in the same hierarchy that you want to release together. Projects with a different release lifecycle should live on their own - otherwise you will keep running into the issues you mentioned. If you run the release plugin from a root directory (multi-module setup), all of the content of that root directory will be tagged in SVN.
In your case, I would probably create the following hierarchies:
Core
One per customer type
Potentially one per type to bundle them (zip), depending on your structure
I would group it by the way you create the release. It might mean that you have to run the release plugin a couple of times instead of just once when you make a change e.g. in Core, but it will be a lot cleaner.
Your packaging project will then pull in all of the dependencies and package/assemble them.
If you have common configuration options, I recommend to put them into a common parent pom. This doesn't have to be your root (multi-module) pom.
Did you try running the maven-release-plugin with the -r argument plus the list of all modules you want to release?
Basically, this argument allows you to specify the list of modules against which the Maven command should be performed. (If you omit it, all submodules are included; that is the default behavior.)
See more details about this command line here.
I have never tried it with the maven-release-plugin, and I don't know whether it will work, especially regarding the SCM operations.
