Maven build breaks but dependency not in tree

I have a build which runs fine on my local machine but is failing in Jenkins.
The build breaks with:
[WARNING] The POM for net.sourceforge.nekohtml:nekohtml:jar:1.9.16 is missing, no dependency information available
My project does not use this jar: it is not listed in my POMs and it does not show up when I run
mvn dependency:tree
no matter which debug/verbose options I pass. There is no reference to neko anywhere in my code.
This is an initial setup in Jenkins, so it could be that something is not configured quite right, but I have no idea what it is: the build works fine on my local machine even when using the same settings.xml as Jenkins.
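A few commands can help narrow down where the artifact is being requested; this is only a diagnostic sketch, since such a warning may come from a plugin's dependencies rather than from the project's own:
mvn dependency:tree -Dincludes=net.sourceforge.nekohtml:nekohtml
mvn dependency:resolve-plugins
mvn help:effective-pom
The first shows whether nekohtml appears anywhere in the project's dependency tree; the second resolves plugin dependencies as well, which dependency:tree does not cover; the third dumps the fully resolved POM so that inherited or profile-activated configuration becomes visible.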

Related

Installation and deployment of maven test-jar

I've discovered the wonderful test-jar facility in Maven: https://maven.apache.org/plugins/maven-jar-plugin/examples/create-test-jar.html
But it may happen that one project needs to use the test-jar of another project. From https://stackoverflow.com/a/6469256/421049 and from experimentation, it seemed that mvn install does not install the test-jar into the local ~/.m2/repository. So how can one project on my machine use the test jars of another project that is not in the same aggregate POM?
Yet it would seem from the question "Maven deploy not to upload test jar" that deploying a project to Maven Central does in fact deploy the test-jar. So I can deploy it to Nexus but not install it locally? And if I deploy it to Nexus, will my local project that declares a dependency of <type>test-jar</type> find it on Maven Central?
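For reference, such a dependency is declared with the test-jar type; a minimal sketch with hypothetical coordinates (com.example:foo:1.2.3 stands in for the real project):
<dependency>
  <groupId>com.example</groupId>
  <artifactId>foo</artifactId>
  <version>1.2.3</version>
  <!-- resolves the attached foo-1.2.3-tests.jar artifact -->
  <type>test-jar</type>
  <scope>test</scope>
</dependency>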
It turns out that maven-jar-plugin does in fact install the test-jar (e.g. foo-1.2.3-tests.jar) in the local Maven repository ~/.m2/repository/.... Wonderful!
My problem is that I had inadvertently configured the maven-jar-plugin to be in a separate profile named release. (I had copied and pasted to the wrong location in my POM.) That's why the test-jar didn't show up in my local repository after a mvn install, and that's why it suddenly showed up later (after I used -P release once in testing), and I thought I had just missed it when I looked the first time.
I moved the maven-jar-plugin to the <build> section and everything works fine: the test-jar is put into the local Maven repository by mvn install.
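For reference, this is roughly what the configuration looks like once it sits directly under <build> instead of inside the release profile, following the linked create-test-jar example (the plugin version is omitted here and would normally be managed elsewhere):
<build>
  <plugins>
    <plugin>
      <groupId>org.apache.maven.plugins</groupId>
      <artifactId>maven-jar-plugin</artifactId>
      <executions>
        <execution>
          <goals>
            <!-- attaches an additional <artifactId>-<version>-tests.jar artifact -->
            <goal>test-jar</goal>
          </goals>
        </execution>
      </executions>
    </plugin>
  </plugins>
</build>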
In my case, I was setting maven.test.skip for a particular build profile. That property causes tests not to be compiled or packaged at all, which also prevents them from being installed or deployed. The solution was to set the skipTests property instead.
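A sketch of that change, assuming the property lives in a profile (the profile id is only an example): skipTests skips running the tests but still compiles and packages them, so the test-jar is still produced, whereas maven.test.skip=true skips compiling them altogether.
<profile>
  <id>quick</id>
  <properties>
    <!-- tests are compiled and packaged but not executed -->
    <skipTests>true</skipTests>
  </properties>
</profile>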

Setting up maven to compile (instead of downloading) dependencies

I cloned the git repository of Apache ActiveMQ Artemis project (https://github.com/apache/activemq-artemis) and then typed
mvn -Ptests test -pl :integration-tests
I was surprised to see log messages like the following
...
Downloading: http://repository.apache.org/snapshots/org/apache/activemq/artemis-selector/1.4.0-SNAPSHOT/artemis-selector-1.4.0-20160625.030221-11.jar
Downloading: http://repository.apache.org/snapshots/org/apache/activemq/artemis-core-client/1.4.0-SNAPSHOT/artemis-core-client-1.4.0-20160625.030211-11.jar
...
Since artemis-core-client, for example, is contained in the git repository I cloned, I would have expected Maven to simply build it from there.
That way, when I make changes in the core client source, they get picked up by the integration tests.
Instead, Maven downloads the jar from the remote repository.
Question: How do I configure Maven to always build the modules that are in the git repository and download only "true" dependencies, by which I mean artifacts that are not in the git repository?
You are not executing the Maven build on the main project, that is, on the main pom.xml, which indeed defines the artemis-selector and artemis-core-client modules, among others.
You are executing the Maven build on the tests project and its pom.xml, where only test modules are defined. This is a side/test project that has the previous POM as its parent, but it plays no role in its parent's modules definition. Hence, its dependencies are not resolved as reactor modules but as ordinary Maven dependencies.
You should first install the former project (via mvn clean install) so that its libraries are available in your local Maven cache (and no downloading is triggered), and then execute the tests project.
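A sketch of that two-step workflow (the -DskipTests flag is optional and only speeds up the first pass):
mvn clean install -DskipTests
mvn -Ptests test -pl :integration-tests
The first command, run from the root of the clone, builds and installs the main modules into ~/.m2/repository; the second, run as before, then picks up those freshly installed snapshots for the integration tests.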
Check the Maven docs on the difference between inheritance and aggregation to clarify this further.
From Stack Overflow, the following threads may also be of interest:
What is the difference between using maven -pl option and running maven from module level?
Maven multi module project cannot find sibling module

How to configure maven multi module project and FindBugs?

I have a multi module maven project ('assembler'), and I would like to add a FindBugs phase.
The problem is that some of the projects cannot be built standalone since they have dependencies on other projects... however, when I invoke mvn package from the 'assembler' project, it works fine and all inner dependencies are resolved.
But when the mvn findbugs:findbugs command is executed, those inner dependencies are not resolved and Maven complains.
Searching Google, I found a way to make it work using mvn install, but I do not like this approach since eventually this will run in CI with Jenkins and I do not want to rely on the local Maven repository on the build server.
Will appreciate hearing your ideas.
Thanks.
Don't be afraid of mvn install; it is your friend. What you need is to set up a centralized Maven repository (such as Nexus or Artifactory), or rent one (such as Bintray), to which you deploy your Maven artifacts.
Jenkins or any other decent CI engine can do the actual deploying for you, either natively or through Maven. Once deployed, other users - Jenkins included - will be able to resolve their dependencies, and you will be able to run FindBugs, PMD, JaCoCo, host Javadoc, or pretty much anything you want.
Side note: don't get confused by the term deploy, as people new to Maven usually are. In the context of Maven, it has absolutely nothing to do with your target environment. It simply means that an artifact is pushed to a remote repository and nothing else.
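For reference, pointing mvn deploy at such a repository only requires a distributionManagement section plus matching credentials in settings.xml; a minimal sketch with hypothetical repository ids and URLs:
<distributionManagement>
  <repository>
    <!-- the id must match a <server> entry in settings.xml that holds the credentials -->
    <id>releases</id>
    <url>https://repo.example.com/repository/maven-releases/</url>
  </repository>
  <snapshotRepository>
    <id>snapshots</id>
    <url>https://repo.example.com/repository/maven-snapshots/</url>
  </snapshotRepository>
</distributionManagement>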

Maven deploy multi module project only if all modules build successfully

I have a maven multi module project with several modules. I want to deploy them (mvn deploy) only if they all pass a full mvn install (which includes the tests).
Currently, I run mvn install on the project. If all modules pass, I run mvn deploy to do the deployment. The problem I see is the wasted time of calling mvn twice (even if I skip tests on the second run).
Does anyone have an idea on this?
EDIT: I have learned that using Artifactory as a repository manager and the maven-artifactory-plugin with your maven setup will add the atomic deploy behaviour to the mvn deploy command. See the Build Integration section in the Artifactory documentation.
[DISCLOSURE - I'm associated with JFrog. Artifactory creator.]
Take a look at the deployAtEnd parameter of the Maven Deploy Plugin: http://maven.apache.org/plugins/maven-deploy-plugin/deploy-mojo.html
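A minimal sketch of that setting in the parent POM (the plugin version here is only an example; deployAtEnd requires a reasonably recent maven-deploy-plugin):
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-deploy-plugin</artifactId>
  <version>2.8.2</version>
  <configuration>
    <!-- each module's artifacts are staged and only uploaded once the last module has built successfully -->
    <deployAtEnd>true</deployAtEnd>
  </configuration>
</plugin>
With this in place, a single mvn deploy run covers both the verification and the upload.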
This is a bit tricky. Maven is not atomic when it executes the build life-cycle. So a broken set of artifacts may end up in a repository.
One solution I know is Nexus Pro: http://www.sonatype.com/Products/Nexus-Professional/Features - it allows you to promote builds or define certain repos as staging, so only verified versions get promoted for use. Maybe Artifactory has something similar; I just don't know.
If that solution is too expensive, you probably need to create a cleanup build or profile to remove artifacts that were already uploaded. My first guess would be to write a Maven plugin that uses the repository manager's remote API, or maybe the existing Maven features are already sufficient. But since deploy also means updating the metadata XML files, I don't think there is a delete operation; I'm not sure on this either.

Maven WAR overlay problems, while using Hudson + Artifactory

We have three artifacts:
common.jar : with common classes.
public.war : depending on the common.jar, contains only public site resources.
internal.war : depends on both common.jar and public.war, adding authentication information and security context resource files. Also contains a few administration site classes.
Currently I have structured these in such a way that internal.war overlays itself with public.war.
Building the project locally and installing the artifacts to the local repo works perfectly.
Problems start when trying to get the Hudson builds working with the following sequence:
1) Build all projects in dependency order.
2) Modify common.jar (say, add a new class method).
3) Modify internal.war classes in such a way that they are compile-time dependent on the changes made in step 2.
4) Commit both changes, triggering the Hudson builds.
5) The internal.war build fails because it cannot find the symbols added in step 2.
Somehow the build in step 5 is using an old version of common.jar and failing because of it.
The common.jar version number does not change, let's say it's 1.0.0-SNAPSHOT for the purposes of this example.
If I DO change the common.jar version number, the build works (presumably because there is only one release per release version number).
Now, what could cause Hudson builds to use these old artifacts?
We are running Maven builds on Hudson with the command "clean package -e -X -U".
"Deploy artifacts to maven repository" has been checked.
It's hard to definitively answer this without access to the real poms, but here is what I would do:
1) Make sure Hudson is using the exact same version of Maven as you are on your local machine
2) Examine the effective pom.xml of internal.war on the Hudson machine in a terminal via mvn help:effective-pom, making sure you are running the same mvn executable as your Hudson job does. You need to verify the version of common.jar in the effective pom.xml of internal.war; it could be different from what you expect due to profiles or settings.xml differences.
3) Check the settings.xml file for your Hudson install of Maven. In particular, verify that all is well in your distributionManagement, servers, and repositories stanzas. Another good way to check this is to go to your internal.war project and run mvn help:effective-settings and see whether what is there matches what is on your local machine (see the command sketch after this list).
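A sketch of those two checks, run from the internal.war project directory (the optional -Doutput parameter of help:effective-pom just writes the result to a file, which makes diffing easier):
mvn help:effective-pom -Doutput=effective-pom.xml
mvn help:effective-settings
Comparing the output produced on the Hudson machine with the output produced locally usually pinpoints the mismatch.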
Something is awry and it won't take long to find with the right analysis.

Resources