My situation: I have a project which contains several Maven modules. I make changes to one of them, and suddenly I find out that my project can no longer be built because of errors in other modules. To fix this I need to run svn update and rebuild the whole project.
My assumption: probably, during the build of my module, some of the artifacts are taken from the central repository in their newest versions, while others are outdated and taken from my local repo.
My question: I don't want to rebuild my project each time someone updates ANOTHER Maven module. I want to download the already-built artifacts from the central repository instead of rebuilding them myself. Is that possible?
You can tell the Maven reactor which modules to build. In your case, when you only change a single module and want to speed up the build, you can pass the -pl (Projects List) parameter to Maven. For example:
mvn install -pl module-with-changes
That will build the single module, taking its dependencies from your local Maven repository or downloading them from Central (whichever is newer). That said, if you have already run mvn install for the whole project and the other artifacts have not been updated in the Central repository, Maven will see your local artifacts as the latest and will not re-download them.
Another issue you might hit with the -pl parameter is when other modules in your project depend on the module you are building: if there is a problem in a dependent module, you will not see it by building only the dependency module. To avoid that you can also pass -amd (Also Make Dependents), like this:
mvn install -pl module-with-changes -amd
That will trigger the build of module-with-changes, plus the modules that depend on it, plus their dependents.
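For completeness: there is also -am (Also Make), which additionally builds the modules your module depends on, and the flags can be combined, e.g.:
mvn install -pl module-with-changes -am -amd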
You can get more info about Reactor parameters from here:
http://www.sonatype.com/people/2009/10/maven-tips-and-tricks-advanced-reactor-options/
Related
I have a question about how the Maven dependency resolution mechanism works in a multi-module project.
Normally I only use 'mvn clean install' when I build my multi-module projects, and my assumption was that if any module in the project needs a previous module, the dependency is resolved by going to the local repository and loading the corresponding 'jar'.
For project-internal reasons, I have to use 'mvn clean compile'. This command naturally does not create any 'jar', since 'install' is not part of it. So I started wondering how dependency resolution works in a multi-module project when no 'jar' is created but the project is still able to see the changes from previous builds. Are the target directories used for dependency management?
Or is the target directory used for 'mvn clean compile' but the local repository for 'mvn clean install'?
Can anybody explain to me how dependency resolution works in a multi-module project?
Thanks for any answers.
I think you will understand better if you look at https://maven.apache.org/guides/introduction/introduction-to-the-lifecycle.html
There is a lifecycle in the process of building a jar. The compile phase will compile the code and create a complete classes folder in your target directory. This phase will also resolve all the dependencies in your poms and download to your local repo any that are not already there.
The install phase will create the jar from the classes directory and install it in your local repository.
I really think you will need to run the install phase to get anything useful.
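To see the difference yourself, compare what each command leaves behind (a sketch, assuming the standard project layout):
mvn clean compile   # fills target/classes; copies nothing to ~/.m2/repository
mvn clean package   # additionally creates target/<artifactId>-<version>.jar
mvn clean install   # additionally copies that jar (and the pom) into ~/.m2/repository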
Maven is made of separate components.
There is a component that deals with a given module and, among other things, tries to get its dependencies. It ALWAYS gets the dependencies from the local repository, possibly after downloading them. If a dependency is not there and can't be downloaded, the build will fail. At the end, the module creates its own artifact and publishes it to the local repo.
Then there is a component that, when you ask it to build several Maven modules (for example by calling mvn at the root of a project), uses the declared dependencies to find the best ordering for the build, so that if a given module depends on another, it is built after the module it depends on. It then calls the previous component I described, building each module in order.
In all cases, a given module's dependencies are always taken from the local repo. The expectation is that the modules built before it have actually pushed their artifacts to the local repo, typically via mvn install, though with the proper configuration you could force this at any step (which may not be a good idea).
In any case, if the previous module's jar was not built and put into the repo, there is no way that jar can be added to the classpath for the next module to be compiled.
Doing compile only on multiple projects isn't going to be of any use.
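To make the ordering concrete, here is a minimal hypothetical aggregator pom; the reactor will build subA before subB whenever subB declares a dependency on subA, regardless of their order in the <modules> list:
<project>
  <modelVersion>4.0.0</modelVersion>
  <groupId>com.example</groupId>
  <artifactId>parent</artifactId>
  <version>1.0-SNAPSHOT</version>
  <packaging>pom</packaging>
  <modules>
    <module>subB</module>
    <module>subA</module>
  </modules>
</project>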
I cloned the git repository of the Apache ActiveMQ Artemis project (https://github.com/apache/activemq-artemis) and then typed
mvn -Ptests test -pl :integration-tests
I was surprised to see log messages like the following:
...
Downloading: http://repository.apache.org/snapshots/org/apache/activemq/artemis-selector/1.4.0-SNAPSHOT/artemis-selector-1.4.0-20160625.030221-11.jar
Downloading: http://repository.apache.org/snapshots/org/apache/activemq/artemis-core-client/1.4.0-SNAPSHOT/artemis-core-client-1.4.0-20160625.030211-11.jar
...
Since e.g. artemis-core-client is contained in the git repository I cloned, I'd have expected Maven to just build it from there.
That way, when I make changes in the core client source, they would get picked up by the integration tests.
Instead, Maven is downloading the jar from the remote repository.
Question: how do I configure Maven to always build all modules that are in the git repository and to download only "true" dependencies, by which I mean things not in the git repository?
You are not executing the Maven build on the main project, i.e. on the main pom.xml, which indeed defines the artemis-selector and artemis-core-client modules, among others.
You are executing the Maven build on the tests project and its pom.xml, where only test modules are defined. This is a side/test project which has the previous pom file as its parent, but it plays no role in its parent's modules definition. Hence, its dependencies are resolved not as reactor modules but as ordinary Maven dependencies.
You should first install the former project (via mvn clean install), so that its libraries are available in your local Maven cache (hence no downloading will be triggered), and then execute the tests project.
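For example (a sketch; -DskipTests just speeds up the first pass, and I'm assuming the tests aggregator lives in the tests/ directory of the clone):
cd activemq-artemis
mvn clean install -DskipTests
cd tests
mvn -Ptests test -pl :integration-tests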
Check the Maven docs on the difference between inheritance and aggregation to clarify this further.
From Stack Overflow, the following threads could also be interesting:
What is the difference between using maven -pl option and running maven from module level?
Maven multi module project cannot find sibling module
According to the Maven lifecycle, mvn install will "install the package into the local repository, for use as a dependency in other projects locally". The local repository then stores all the jars that I downloaded remotely.
My modules have dependencies with other modules. When I run mvn package, nothing is stored in my local repository, but the dependencies appear to be fulfilled. So how does Maven handle the inter-module dependencies? Does Maven refer to the jars of each module from the built target directories or does it fetch them from another location?
Corey,
You are correct: going strictly by the Maven docs, running mvn compile on:
parent_pom/
    subA/
        pom.xml
    subB/
        pom.xml   # depends on subA
should fail since subA hasn't been pushed out to the local repo.
What's happening under the hood is that Maven uses the reactor to trick the build into looking into the target dir of earlier submodules in the same build.
Beyond the scope of this particular question: the Maven reactor is one of the most opaque parts of Maven, but also one of the most powerful if you master it. You would do well to read up on it.
Hope that helps.
It depends on the phase you're executing. Before compile, Maven will fail, since no classes have been compiled. Between compile and package, target/classes is used. For package and later, target/<artifactId>-<version>.jar is used.
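Schematically, for a sibling module within the same reactor build:
mvn compile   # siblings are resolved from each module's target/classes
mvn package   # siblings are resolved from each module's target/<artifactId>-<version>.jar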
I have a maven multi module project with several modules. I want to deploy them (mvn deploy) only if they all pass a full mvn install (which includes the tests).
Currently, I run a mvn install on the project. If all modules pass, I run mvn deploy to do the deployment. The problem I see is the waste of time in calling mvn twice (even if I skip the tests on the second run).
Does anyone have an idea on this?
EDIT: I have learned that using Artifactory as a repository manager and the maven-artifactory-plugin with your maven setup will add the atomic deploy behaviour to the mvn deploy command. See the Build Integration section in the Artifactory documentation.
[DISCLOSURE - I'm associated with JFrog. Artifactory creator.]
Take a look at the deployAtEnd parameter of the maven-deploy-plugin: http://maven.apache.org/plugins/maven-deploy-plugin/deploy-mojo.html
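A sketch of enabling it (the parameter exists as of maven-deploy-plugin 2.8), either in the pom:
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-deploy-plugin</artifactId>
  <version>2.8.2</version>
  <configuration>
    <deployAtEnd>true</deployAtEnd>
  </configuration>
</plugin>
or ad hoc on the command line:
mvn clean deploy -DdeployAtEnd=true
With deployAtEnd, each module's artifacts are staged locally and only uploaded once the last module has built successfully, which gets you close to the atomic behaviour you are after.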
This is a bit tricky. Maven is not atomic when it executes the build lifecycle, so a broken set of artifacts may end up in a repository.
One solution I know of is Nexus Pro: http://www.sonatype.com/Products/Nexus-Professional/Features - it allows you to promote builds or to define certain repos as staging, so that only verified versions get promoted for use. Maybe Artifactory has something similar - I just don't know.
If that solution is too expensive, you probably need to create a cleanup build or profile to remove artifacts that were already uploaded. My first guess would be to write a Maven plugin that uses the repository manager's remote API - or maybe the existing Maven features are already sufficient. But since deploy also means updating the metadata XML files, I don't think there is a delete - I'm not sure on this either.
I tend to avoid using mvn install in my multi-module projects, because I feel I then don't know which exact version of a submodule is used when building / launching other submodules (particularly when switching between branches very often).
I tend to use mvn package a lot and then mvn verify.
I'm now facing this issue in a FOSS project (a Maven archetype, moreover) where I'd like to follow Maven's best practices.
It's a multi-module project with a webapp submodule depending on the other modules, and what worries me is the ease of development along with mvn jetty:run (or jetty:start).
Currently, I have defined two profiles:
prod, the default one, declares dependencies on the other submodules;
dev, on the other hand, does not depend on the other modules, and configures the jetty-maven-plugin by adding the other modules' output directories as extraClasspath and resourcesAsCSV.
That way, I can mvn package once and then cd webapp && mvn jetty:start -Pdev and quickly iterate, reloading the webapp without the need to even stop the server.
AFAICT, extraClasspath was added for that exact purpose (JETTY-1206).
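For reference, the dev profile's Jetty configuration looks roughly like this (paths are illustrative and element names vary between jetty-maven-plugin versions):
<profile>
  <id>dev</id>
  <build>
    <plugins>
      <plugin>
        <groupId>org.mortbay.jetty</groupId>
        <artifactId>jetty-maven-plugin</artifactId>
        <configuration>
          <webAppConfig>
            <extraClasspath>../core/target/classes</extraClasspath>
            <resourcesAsCSV>../core/src/main/resources</resourcesAsCSV>
          </webAppConfig>
        </configuration>
      </plugin>
    </plugins>
  </build>
</profile>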
I've been pointed at the tomcat7-maven-plugin, which can resolve modules from the reactor build when using Maven 3 (and I raised an issue to bring the same to Jetty: JETTY-1517), but that hardly solves my problem.
If I hadn't removed the dependency on the other submodules from the dev profile, I'd have had to run mvn install first so that validating the POM doesn't fail, even though jetty:start doesn't use those dependencies afterwards.
So here's my question: is mvn install really that common? Or is my approach of putting the intra-reactor dependencies only in the prod profile OK?
(note that I have the exact same problem with the gwt-maven-plugin, so please don't tell me to simply switch to Tomcat; that wouldn't even work actually, details here)
mvn install is common, particularly in connection with multi-module builds, because it gives you the chance to run a single module from your multi-module build.
This can be achieved by using:
mvn -pl submodule <lifecycle-phase>
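For example, to build and install only one submodule (the name is illustrative):
mvn -pl webapp install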
I just found a workaround (which seems logical as an afterthought): https://jira.codehaus.org/browse/JETTY-1517?focusedCommentId=306630&page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel#comment-306630
In brief: skip the plugin by default in the parent module, then re-enable it where needed.
This however only works if the plugin can be skipped (i.e. has a skip configuration) and is only used in one specific submodule; and it has to be done selectively for each plugin you need/want to run that way (in my case, jetty:run and gwt:run).
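A sketch of the pattern with the Jetty plugin (it does expose a skip parameter; check the docs of each plugin you want to treat this way). In the parent pom:
<plugin>
  <groupId>org.mortbay.jetty</groupId>
  <artifactId>jetty-maven-plugin</artifactId>
  <configuration>
    <skip>true</skip>
  </configuration>
</plugin>
And in the webapp submodule, override it:
<plugin>
  <groupId>org.mortbay.jetty</groupId>
  <artifactId>jetty-maven-plugin</artifactId>
  <configuration>
    <skip>false</skip>
  </configuration>
</plugin>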
I do most of my development on my laptop. For the projects I'm currently working on, my local repository is really more of a temporary holding area. I run mvn install all the time. Putting artifacts in one's local repo is the only way I know of to share built artifacts between projects, especially if you are working on projects which are related but are not (and should not be) part of the same multi-module build.
When I'm done developing I commit changes to the shared SCM and let Jenkins build & deploy the code to the shared remote repo. Then I either blow away the changed projects in my local repository so the next build brings down the freshly built artifacts, or I run Maven with -U to force updates.
This works well for me, YMMV.
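For what it's worth, both of those cleanup options can be scripted; a sketch (dependency:purge-local-repository is a goal of the maven-dependency-plugin):
mvn dependency:purge-local-repository   # removes this project's dependencies from the local repo so they get re-resolved
mvn clean install -U                    # forces Maven to check remote repos for updated snapshots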