Maven multimodule project with Jenkins and git - maven

Currently we have a number of Maven projects (specifically Apache Camel projects) that live in isolation. We also have one Jenkins job per project, because we need to specify the pom.xml file in the Maven build, and since the projects are independent we require one job per project.
However, we also know that most of the projects share a lot of dependencies, and we want to turn them into a Maven multi-module project with a parent pom file where the dependencies and versions are stated. We also want fewer Jenkins jobs to maintain, and to be able to add more projects without having to create new Jenkins jobs.
My question is: in the Jenkins job for the Maven build I still need to specify a single pom file. Does this mean that I need to point to the parent pom file and then add the parent directory as the directory for Jenkins to receive the git trigger? In other words, whenever code is committed to any of the child projects, the job gets triggered and it uses the parent pom file to build only the project where the code was committed?
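For reference, the kind of parent pom being described would look roughly like this (the groupId, artifactId and module names are hypothetical placeholders):
<project xmlns="http://maven.apache.org/POM/4.0.0">
  <modelVersion>4.0.0</modelVersion>
  <groupId>com.example</groupId>                 <!-- hypothetical coordinates -->
  <artifactId>camel-projects-parent</artifactId>
  <version>1.0.0-SNAPSHOT</version>
  <packaging>pom</packaging>
  <modules>
    <module>camel-route-a</module>               <!-- one entry per existing project -->
    <module>camel-route-b</module>
  </modules>
  <dependencyManagement>
    <dependencies>
      <!-- shared dependency versions declared once and inherited by the children -->
    </dependencies>
  </dependencyManagement>
</project>
A single Jenkins Maven job can then point its Root POM at this file; the Maven project type also offers an option to build only the modules affected by a commit, which is the behaviour described above.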

Related

How to run a maven plugin without a POM in Jenkins?

I have a plugin which can run either with a pom.xml or without one (it depends on the version of the artifact we're building: new versions go without a pom. Strange, I know).
I want to have that plugin run in Jenkins.
But when creating a Maven project, I have to set a pom (or, by default, Jenkins assumes there is one in the given base folder).
Question: Is it possible to configure Jenkins to not use a pom when there is none?
As per my comment, you should use a Jenkins freestyle project build in this case, in order to have more flexibility and avoid the default assumptions of a Jenkins Maven build.
In such a build, you can then configure a build step executing a shell or a Windows command (depending on the Jenkins server OS).
Indeed, in a Jenkins Maven build a pom file is always required, as mentioned in the inline help of the Configuration > Build > Root POM entry:
If your workspace has the top-level pom.xml in somewhere other than the 1st module's root directory, specify the path (relative to the module root) here, such as parent/pom.xml.
If left empty, defaults to pom.xml
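As a sketch of such a shell build step (the plugin coordinates and parameter below are hypothetical, and this assumes the plugin's goal is declared as not requiring a project), the goal can be invoked by its fully qualified name so that Maven never looks for a pom:
# Freestyle job > Build > Execute shell
mvn com.example.plugins:my-plugin:1.2.3:run -DsomeParameter=value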

Publish the jar file first and then pom file using artifactoryPublish

Is there a way to force the artifactoryPublish task to publish the jar file first and then the pom file? Or to make the pom file visible to others only when the jar file is available?
We have multiple Git repositories where jars are shared across the repositories. The Hudson builds for them are triggered at almost the same time, so we have hit the case of Gradle finding the pom but not the jar. When we look in Artifactory we see both; it's just that the jar gets uploaded a couple of seconds later than the pom.
What is the best way to solve this (other than changing when the Hudson builds trigger)?

How to download maven dependencies from Jenkins without a binary repository

Are there any plugins or ways to download the dependencies for a Maven project from Jenkins? I am using Jenkins for a multi-module desktop application. Although I know I could just archive all the dependencies, I don't see why there isn't the ability to download dependencies using the Maven that is installed on the same machine as Jenkins. Preferably one would specify the location of a pom and then be able, with one click, to download all the dependencies for that pom. Can you do this? I do not need or want an entire binary repository for this feature.
Edit: I will try and rephrase this as I don't think people are understanding.
In Jenkins one has the ability to archive artifacts at the end of a build. Also in Jenkins you have integration with Maven. When building a jar in Maven you arguably have 2 options:
You can use the assembly plugin, which zips all the dependency .class files together with those produced from your source code, resulting in 1 jar.
You can create a jar of just your source code which references all the dependency jars, located in a separate folder.
In Jenkins one also has the ability to download the latest artifact. Now if I am using option 2, I can either archive just the jar which my sources produced, which I would say is more desirable for space and is the whole purpose of the archive functionality, or I can archive the libraries too.
Here is the PROBLEM!! If I don't archive the libraries then I cannot easily run this jar, as it is a desktop application and its dependencies cannot be obtained in the same manner as clicking on a link from Jenkins. So let's say my question is: what is the easiest way to obtain them? Extra info: assume Jenkins is running as a server and you can't use Artifactory or another server application; that seems to me to be massive overkill.
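For reference, "option 2" above is typically produced with a maven-jar-plugin manifest configuration along these lines (the libs/ prefix is an arbitrary choice):
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-jar-plugin</artifactId>
  <configuration>
    <archive>
      <manifest>
        <addClasspath>true</addClasspath>        <!-- writes a Class-Path entry into MANIFEST.MF -->
        <classpathPrefix>libs/</classpathPrefix> <!-- dependency jars expected in a libs/ folder -->
      </manifest>
    </archive>
  </configuration>
</plugin>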
Use the Maven plugin and create a Maven job for your project. Jenkins will then use the Maven command you provide in the job configuration to build the project. This means Maven will download the project's dependencies and store them on the machine Jenkins is running on. Normally this would be <JENKINS_HOME>/.m2/repository. This way you get a local repository that only contains the dependencies of the projects you created Maven jobs for.
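If the aim is to have the dependency jars sitting next to the built jar so that both can be archived and downloaded from Jenkins together, one option (a sketch, not part of the answer above) is to add the Maven Dependency Plugin's copy-dependencies goal to the build command:
# copies every dependency of the pom into target/dependency by default
mvn package dependency:copy-dependencies
The target/dependency folder can then be archived alongside the jar, which matches the "separate folder" layout described in the question.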

Set a different version to my jars at project deployment stage - Maven

I have a Maven deployment problem:
When I execute Maven deploy, Maven pushes all my jars to a remote repository under the project version which is specified in the POM files:
<version>version.x.y.z</version>
The problem is that I don't want to overwrite my previous jars every time I rebuild my project; I want to increment the version automatically on every build, as part of the build process.
(So I don't want to use a CLI tool such as the Versions Maven Plugin to change the pom files before the build.)
I have an environment variable, $project.buildnumber, that I can use to set the project version.
Is it possible to configure maven-deploy-plugin to automatically change the version (for instance using this environment variable)?
Many thanks!!
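One way to achieve this without rewriting the pom files (a sketch assuming Maven 3.5+ and its CI-friendly versions; the 1.0 prefix and the build-number variable are placeholders) is to make the version a property that the CI job passes on the command line:
<!-- in pom.xml -->
<version>${revision}</version>
<properties>
  <!-- default used for local builds; the CI job overrides it -->
  <revision>1.0.0-SNAPSHOT</revision>
</properties>
The deploy step then becomes something like mvn deploy -Drevision=1.0.$BUILD_NUMBER. For install/deploy the flatten-maven-plugin is usually added as well, so that the resolved version rather than the raw property ends up in the published pom.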

Maven: Change the "test" phase directory from local .m2 to target?

Forgive me if this is remedial, but I am still new to Maven and its functionality.
In my project, when it "builds" and gets to the compile phase, it will create a target directory with just compiled libraries and update (or create if not there) the local .m2 directory.
When I get to the "test" phase, I want it to build against the target directory's library files, and not the local .m2 directory.
Any hints, recommendations, or suggests would be greatly appreciated. Thanks!
Maven has this concept of “the reactor”, which is just a fancy term for the list of projects being built. At the start of a Maven build, and at the end, Maven prints out this list of projects (using /project/name if defined or groupId:artifactId otherwise).
For each project in the reactor, Maven maintains a list of artifacts that have been attached. By default, each module's pom.xml is attached, and as each plugin runs, it has the option of attaching additional artifacts. Most plugins do not attach artifacts; here are some that do:
jar:jar creates a .jar and attaches it
war:war creates a .war and attaches it
source:jar creates a .jar of the source Java code and attaches it with a classifier of sources
javadoc:jar creates a .jar of the Javadocs and attaches it with a classifier of javadoc
There is also a default primary artifact (this is the one that gets replaced by jar:jar) which is actually a directory and not a file; as such it will not get installed or deployed to the local repository cache or a remote repository.
So while in the reactor, if a plugin that attaches the primary artifact has not run yet and another plugin asks for the primary artifact, it will be given the directory ${project.build.outputDirectory}. If the primary artifact has already been attached, then that attached artifact will be provided instead.
The test phase happens before the package phase, so it will use the directory and not the .jar. The integration-test phase happens after, so it will always use the .jar.
Things get more complex in a multi-module project (which is where my long intro should help you out).
Maven has to build the test classpath. If one of the dependencies is within the reactor, Maven will use the artifact attached to the reactor. Otherwise it will use the local cache (populating from the remote repositories if necessary).
When you run
mvn test
In a multi-module project from the root, there is no replacement of the default (directory-based) artifact, so the intra-module classpath will point to the target/classes directories.
When you run
mvn package
In the same project, however, because each module completes its lifecycle sequentially, all the dependent modules will have swapped in their .jar files as their attached artifacts.
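To make that concrete (module names hypothetical: module-b depends on module-a in the same reactor):
# run from the root of the multi-module checkout
mvn test
#   module-b's test classpath contains module-a/target/classes, because jar:jar
#   has not run and the directory-based default artifact is still attached
mvn package
#   module-a finishes its lifecycle (including jar:jar) before module-b starts,
#   so module-b's test classpath contains module-a's freshly built .jar instead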
All of this should show you that Maven is doing the sensible thing. Hope this has helped.
The test phase is going to execute tests on your project. The project won't reference itself via the dependency mechanism. Only dependencies will be referenced via your local repository, i.e. .m2/repository.
Also, it's not the compile phase that installs the artifact to the local repository; it's the install phase. And then there's a later phase, called deploy, that will deploy the artifact to a remote repository, provided you have a remote repository configured as the deploy target. Note that install and deploy are nearly identical phases, except install is a local-only thing; thus it's the common phase to run when doing dev-environment work. Normally the build server will do the deploy step.
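For completeness, deploy only has somewhere to publish to once a target repository is configured in the pom; the id and URL below are placeholders, and the id has to match a <server> entry with credentials in settings.xml:
<distributionManagement>
  <repository>
    <id>releases</id>
    <url>https://repo.example.com/releases</url>
  </repository>
</distributionManagement>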

Resources