What does maven clean install -U do?

I have Eclipse IDE with the m2e plugin, Maven, and a WebLogic app server running on my local box.
I have imported someone else's multiple Maven projects from Bitbucket to my box. I was told that one of them is the main project and the rest are its dependencies; I had never seen anything like that before, having always dealt with a single Maven project. Anyhow, the instructions say I have to run a Maven command such as "clean install -U".
So in the IDE I edited the run configuration for each Maven project, setting the goal to "clean install -U". From reading the Maven guide I roughly understand what each term means, but when you combine them and pass that parameter, what does it actually do? I didn't expect a jar (web app) to be deployed to an application server, but it was.

-U forces maven to check any external dependencies (third party dependencies) that might need to be updated based on your POM files.
clean install are both basic maven lifecycle phases (https://maven.apache.org/guides/introduction/introduction-to-the-lifecycle.html).
install normally would simply take the artifact that is built and put it in the local repository, i.e. a directory on the box you are building on (.m2 directory most of the time). It would not do a deployment to a server - typically the deploy phase would be used to do that.
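As a rough illustration (the coordinates are made up), for a hypothetical com.example:my-webapp:1.0.0-SNAPSHOT war, running mvn clean install -U ends with the artifact sitting in the local repository under the standard groupId/artifactId/version layout:

~/.m2/repository/com/example/my-webapp/1.0.0-SNAPSHOT/my-webapp-1.0.0-SNAPSHOT.war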
However, developers can override and add to what Maven does in the various phases, so, just like in the days of Ant, things can easily devolve into chaos no one can understand on complex projects ;-).
Sometimes, in the integration-test phase, developers will tell Maven to start up a container temporarily to run the web app on, so that tests can be run against it, and then that container is shut down when the integration-test phase completes.
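A sketch of that pattern (the plugin choice and configuration are assumptions, not something taken from your POMs) would bind the jetty-maven-plugin's start and stop goals around the integration-test phase:

<plugin>
  <groupId>org.eclipse.jetty</groupId>
  <artifactId>jetty-maven-plugin</artifactId>
  <!-- version omitted; use whatever the project already declares -->
  <executions>
    <execution>
      <id>start-jetty</id>
      <phase>pre-integration-test</phase>
      <goals>
        <goal>start</goal>
      </goals>
    </execution>
    <execution>
      <id>stop-jetty</id>
      <phase>post-integration-test</phase>
      <goals>
        <goal>stop</goal>
      </goals>
    </execution>
  </executions>
</plugin>

If the project you imported has something like this (or a similar setup with another container plugin), that would explain the app being deployed during the build.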

Related

Jenkins & Maven - build process

I am learning about Jenkins and I have to explore some existing build jobs that others wrote (in the company where I'm working).
So I am trying to understand a job which uses the mvn command.
So under the build part (inside the job), I see these details:
Maven version: 3.0.5
Root POM: pom.xml
Goals and options: clean install -U -Pnotest,docs
I'm trying to understand what this mvn command means.
I tried to google it: "clean install -U"
But I didn't find what the -U parameter means.
And I don't know what "-Pnotest,docs" is.
Can you guide me on how to find out what it is? (Maybe "-Pnotest,docs" comes from an XML file, or from Artifactory, etc.)
Thanks a lot!!!!
-U forces a check for missing releases and updated snapshots on remote repositories.
If Maven is regularly used in your company, and you will have to work with it on a day-to-day basis, I would advise you to find a mentor (any colleague who knows the tool well and is ready to share their knowledge with you) and work with them. Maven, when you first look at it, can be quite a mouthful, and you'll learn it more efficiently with their help.
For the problem at hand, Elarbi Mohamed Aymen's answer already tells you what the -U flag corresponds to. As for -P, it is used to activate profiles (in your case notest and docs). These profiles are usually defined in the pom.xml of the project being built.
See Running Apache Maven for the basic commands, and as advised on that page run mvn -h to have the complete list of flags the command can use.
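To illustrate, profiles like notest and docs are typically declared in a pom.xml roughly like this (the contents below are guesses at what such profiles might do; the real definitions live in your project's POMs), and then activated from the command line with -P:

<profiles>
  <profile>
    <id>notest</id>
    <properties>
      <maven.test.skip>true</maven.test.skip>
    </properties>
  </profile>
  <profile>
    <id>docs</id>
    <!-- e.g. extra plugin configuration that builds documentation -->
  </profile>
</profiles>

mvn clean install -U -Pnotest,docs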
Maven is one of the mechanisms for handling the build process and checking project dependencies, especially for Java.
One option would be to physically include the dependencies (artifacts / libs) in the project, but that's not very useful: when a new version comes out you have to replace the file, and if you use the same lib in several apps you have to handle it manually in all of them.
Instead there is Maven: it has a global repository with shared, commonly used artifacts / libs - see https://repo1.maven.org/maven2/.
Besides that, you can build your own libs / artifacts - in this case modules / applications that are reusable - and store them in a private repository; that is the artifactory.
When you want to build your project, a Maven project has a pom.xml, which is like a manual telling Maven what to do / how to build.
clean and install are common lifecycle phases: clean wipes the project's build output (the target directory), install builds the project and copies the resulting artifact into your local repository, and the -U parameter forces Maven to re-check the remote repositories for updated dependencies.
You can also adjust the build in the pom file, e.g. in a multi-module ("tree") build the dependent modules are built first and then the parent project.
E.g. with -D you pass parameters to Maven:
mvn archetype:generate -DgroupId=com.mycompany.app -DartifactId=my-app
- that will generate a new project based on the given archetype ("template"), with the given groupId and artifactId - the groupId can be e.g. the company name, and the artifactId is then the name of the specific app / component.
-P,--activate-profiles <arg>    Comma-delimited list of profiles to activate
-D,--define <arg>               Define a system property
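Putting those flags together, an invocation might look like this (the profile names and the property are just placeholders):

mvn clean install -U -Pnotest,docs -Dmy.property=someValue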

How to configure maven multi module project and FindBugs?

I have a multi module maven project ('assembler'), and I would like to add a FindBugs phase.
The problem is that some of the projects cannot be built stand-alone since they have dependencies on other projects... however when I invoke mvn package from the 'assembler' project it works fine and all inner dependencies are resolved.
The problem is that when the mvn findbugs:findbugs command is executed, those inner dependencies are not resolved, and Maven complains.
Searching Google, I found a way to make it work using mvn install, but I do not like this approach since eventually this should be used in CI with Jenkins and I do not want to rely on the local Maven repo on the build server.
Will appreciate hearing your ideas.
Thanks.
Don't be afraid of mvn install, it is your friend. What you need is to set up a centralized Maven repository (such as Nexus or Artifactory), or rent one (such as Bintray), to which you deploy your Maven artifacts.
Jenkins or any other decent CI engine can do the actual deploying for you, either natively or through Maven. Once deployed, other users - Jenkins included - will be able to resolve their dependencies and you will be able to run FindBugs, PMD, JaCoco, host Javadoc or pretty much anything you want.
Side-note: Don't get confused by the term deploy, as most people new to Maven do. In the context of Maven it has absolutely nothing to do with your target environment. It simply means that an artifact is pushed out to a remote repository and nothing else.
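A minimal sketch of the repository side of that setup (the ids and URL are placeholders for your own Nexus/Artifactory instance, with credentials for the matching server entries kept in settings.xml):

<distributionManagement>
  <repository>
    <id>releases</id>
    <url>https://repo.example.com/repository/maven-releases/</url>
  </repository>
  <snapshotRepository>
    <id>snapshots</id>
    <url>https://repo.example.com/repository/maven-snapshots/</url>
  </snapshotRepository>
</distributionManagement>

With that in the parent POM, Jenkins can simply run mvn clean deploy and the built artifacts become resolvable for every other build, including the one that runs FindBugs.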

Maven deploy multi module project only if all modules build successfully

I have a maven multi module project with several modules. I want to deploy them (mvn deploy) only if they all pass a full mvn install (which includes the tests).
Currently, I run a mvn install on the project. If all modules pass, I run mvn deploy to do the deployment. The problem I see is the waste of time calling mvn twice (even if I skip tests on the second run).
Does anyone have an idea on this?
EDIT: I have learned that using Artifactory as a repository manager and the maven-artifactory-plugin with your maven setup will add the atomic deploy behaviour to the mvn deploy command. See the Build Integration section in the Artifactory documentation.
[DISCLOSURE - I'm associated with JFrog. Artifactory creator.]
Take a look at the deployAtEnd parameter of Maven Deployment plugin: http://maven.apache.org/plugins/maven-deploy-plugin/deploy-mojo.html
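A minimal sketch of what that looks like (version omitted; deployAtEnd needs a reasonably recent maven-deploy-plugin, so check the linked page):

<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-deploy-plugin</artifactId>
  <configuration>
    <deployAtEnd>true</deployAtEnd>
  </configuration>
</plugin>

With deployAtEnd enabled (it can also be passed as -DdeployAtEnd=true on the command line), each module's upload is postponed until the last module in the reactor has built, so a failure in a later module keeps the earlier ones from being deployed.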
This is a bit tricky. Maven is not atomic when it executes the build life-cycle. So a broken set of artifacts may end up in a repository.
One solution I know is Nexus Pro: http://www.sonatype.com/Products/Nexus-Professional/Features - it allows you to promote builds or define certain repos as staging. So only verified versions get promoted to be used. Maybe artifactory has something similar - I just don't know.
If that solution is too expensive you probably need to create a cleanup build or profile to remove artifacts that were already uploaded. My first guess would be to write a Maven plugin that uses the repository manager's remote API - or maybe the existing Maven features are already sufficient. But since deploy also means updating the metadata XML files, I don't think there is a delete - not sure on this either.

Best practice wrt. `mvn install`, multi-module projects, and running one submodule

I tend to avoid using mvn install in my multi-module projects because I feel like I then don't know which exact version of a submodule is used when building / launching other submodules (particularly when switching between branches very often).
I tend to use mvn package a lot and then mvn verify.
I'm now facing the issue in a FOSS project (a Maven archetype moreover) where I'd like to use Maven's best practices.
It's a multi-module project with a webapp submodule depending on the other modules, and what worries me is the ease of development along with mvn jetty:run (or jetty:start).
Currently, I defined 2 profiles:
prod, the default one, declares dependencies on the other submodules;
dev on the other hand does not depend on the other modules, and configures the jetty-maven-plugin by adding the other modules' output directories as extraClasspath and resourcesAsCSV.
That way, I can mvn package once and then cd webapp && mvn jetty:start -Pdev and quickly iterate, reloading the webapp without the need to even stop the server.
AFAICT, extraClasspath was added for that exact purpose (JETTY-1206).
I've been pointed at the tomcat7-maven-plugin which can resolve modules from the reactor build when using Maven 3 (and I raised an issue to bring the same to Jetty: JETTY-1517), but that hardly solves my problem:
If I hadn't removed the dependency on the other submodules in the dev profile, I'd have had to do an mvn install first so that validating the POM doesn't fail, even though jetty:start doesn't use those dependencies afterwards.
So here's my question: is mvn install really that common? or my approach of putting the intra-reactor dependencies only in the prod profile OK?
(note that I have the exact same problem with the gwt-maven-plugin, so please don't tell me to simply switch to Tomcat; that wouldn't even work actually, details here)
mvn install is common, in particular in relation to multi-module builds, because it gives you the chance to run a single module from your multi-module build.
This can be achieved by using:
mvn -pl submodule <lifecycle-phase>
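For example (module name and phase are placeholders), to build only the webapp module plus the reactor modules it depends on:

mvn -pl webapp -am install

The -am (--also-make) flag tells Maven to also build the modules that the selected module depends on.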
I just found a workaround (which seems logical as an afterthought): https://jira.codehaus.org/browse/JETTY-1517?focusedCommentId=306630&page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel#comment-306630
In brief: skip the plugin by default in the parent module then re-enable it where needed.
This however only works if the plugin can be skipped (i.e. has a skip configuration) and is only used in one specific submodule, and it has to be selectively done for each plugin you need/want to run that way (in my case, jetty:run and gwt:run).
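Roughly, that looks like this (assuming the plugin exposes a skip flag, as jetty-maven-plugin does; plugin versions are omitted):

<!-- parent pom.xml -->
<build>
  <pluginManagement>
    <plugins>
      <plugin>
        <groupId>org.eclipse.jetty</groupId>
        <artifactId>jetty-maven-plugin</artifactId>
        <configuration>
          <skip>true</skip>
        </configuration>
      </plugin>
    </plugins>
  </pluginManagement>
</build>

<!-- webapp module pom.xml -->
<build>
  <plugins>
    <plugin>
      <groupId>org.eclipse.jetty</groupId>
      <artifactId>jetty-maven-plugin</artifactId>
      <configuration>
        <skip>false</skip>
      </configuration>
    </plugin>
  </plugins>
</build>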
I do most of my development on my laptop. For the projects I'm currently working on, my local repository is really more of a temporary holding area. I run mvn install all the time. Putting artifacts in one's local repo is the only way I know of to share built artifacts between projects, especially if you are working on projects which are related but are not (and should not be) part of the same multi-module build.
When I'm done developing I commit changes to the shared SCM and let Jenkins build & deploy the code to the shared remote repo. Then I either blow away the changed projects in my local repository so the next build brings down the freshly built artifacts, or I run Maven with -U to force updates.
This works well for me, YMMV.

Maven WAR overlay problems, while using Hudson + Artifactory

We have three artifacts:
common.jar : with common classes.
public.war : depending on the common.jar, contains only public site resources.
internal.war : depends on both common.jar and public.war, adding authentication information and security context resource files. Also contains a few administration site classes.
Currently I have structured these in such a way that internal.war overlays itself with public.war.
Building the project locally and installing the artifacts to the local repo works perfectly.
Problems start when trying to get the Hudson builds working with following sequence:
1. Build all projects in dependency order.
2. Modify common.jar (say, add a new class method).
3. Modify internal.war classes in such a way that they are compile-time dependent on the changes done in step 2.
4. Commit both changes, triggering the Hudson builds.
5. Internal.war build fails because it cannot find the symbols added in step 2.
Somehow the build in step 5 is using an old version of common.jar, and failing because of it.
The common.jar version number does not change, let's say it's 1.0.0-SNAPSHOT for the purposes of this example.
If I DO change the common.jar version number, the build works. (Supposedly because there is only one release by a release version number).
Now, what could cause this using of old artifacts in Hudson builds?
We are running Maven builds on Hudson with the command "clean package -e -X -U".
"Deploy artifacts to maven repository" has been checked.
It's hard to definitively answer this without access to the real poms, but here is what I would do:
1) Make sure Hudson is using the exact same version of Maven as you are on your local machine
2) Examine the effective pom.xml of internal.war on the Hudson machine in a terminal via mvn help:effective-pom making sure you are running the same mvn executable as your Hudson job does. You need to verify the version of the common.jar in the effective pom.xml of internal.war. It could be different than what you expect due to profiles or settings.xml differences.
3) Check the settings.xml file for your Hudson install of Maven. In particular you need to verify all is well in your distributionManagement, servers, and repositories stanzas. Another good way to check this is to go to your internal.war project and run mvn help:effective-settings and see if what is there matches what is on your local machine.
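For reference, both diagnostics are plain commands run from the internal.war module directory (add the same -P flags your Hudson job uses, if any, so the comparison is apples to apples):

mvn help:effective-pom
mvn help:effective-settings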
Something is awry and it won't take long to find with the right analysis.
