Build Once, Deploy Anywhere, with Maven, Jenkins, and Artifactory

I'm in the latter stages of setting up a CI environment for my project. I'm using Maven, Jenkins and Artifactory Pro, and can successfully build my project and deploy its artifacts to Artifactory. I have also written a bash script to retrieve the resulting artifacts of a specific build from Artifactory and copy them somewhere.
The main part I'm missing right now is automated versioning. I've looked at enabling Artifactory release management, which is really cool, but it involves rebuilding the project. I'm really trying to follow the mantra of 'Build Once, Deploy Anywhere', so any rebuilding is a no-no.
My question boils down to: Is there an automated way (either with one of the aforementioned tools, or a plugin) to handle versioning, without rebuilding an artifact?

Artifactory Pro allows you to easily extend Artifactory's behavior with your own plugins written in Groovy (https://www.jfrog.com/confluence/display/RTF/User+Plugins).
There you can find an example of a promote extension that changes your artifacts' versions without the need for a new build; a sketch follows below.
You can find more useful examples in the GitHub "artifactory-user-plugins" repository.
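As a rough sketch of such a plugin (the repo keys, group path, plugin name, and parameter handling here are all assumptions; check the JFrog user-plugin docs for the exact closure contract):

import org.artifactory.repo.RepoPathFactory

// Hypothetical promotion hook: copies a build's artifact from the snapshot
// repo to the release repo under its release version, without rebuilding.
promotions {
    promoteToRelease(users: ['jenkins'], params: [targetVersion: '']) { buildName, buildNumber, params ->
        def version = params['targetVersion'][0]  // plugin params arrive as lists of strings
        def source = RepoPathFactory.create('libs-snapshot-local',
                "com/example/app/${version}-SNAPSHOT/app-${version}-SNAPSHOT.jar")
        def target = RepoPathFactory.create('libs-release-local',
                "com/example/app/${version}/app-${version}.jar")
        repositories.copy(source, target)          // promote the bits as-is
    }
}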

Building and deploying native code using Maven

I've spent years trying to deploy libraries that use native code to Maven Central. I've run into the following problems:
There weren't any good plugins for building native code using Maven. native-maven-plugin was a very rigid build system that, among other things, made it difficult to debug the resulting binaries. You'd have to manually synchronize the native-maven-plugin build system with the native IDE you use for debugging.
Maven did not replace variables in deployed pom.xml files: MNG-2971, MNG-4223. This meant that libraries had to declare platform-specific dependencies once per Maven profile (as opposed to declaring the dependency once and setting a different classifier per profile); otherwise, anyone who depended on your library had to re-define those same properties in their project file in order to resolve transitive dependencies. See Maven: Using inherited property in dependency classifier causes build failure.
Jenkins had abysmal support for running similar logic across different platforms (e.g. "shell" vs "batch" tasks, and coordinating a build across multiple machines)
Running Windows, Linux and Mac in virtual machines was way too slow and fragile. Even if you got it working, attempting to configure the VMs as Jenkins slaves was a lesson in frustration (you'd get frequent intermittent build errors).
Maven Central requires a main jar for artifacts that are platform-specific: OSSRH-975
Sonatype OSS Repository Hosting and maven-release-plugin assumed that it would be possible to release a project atomically from a single machine, but I needed to build the OS-specific bits on separate machines.
I'm going to use this Stack Overflow question to document how I've managed to overcome these limitations.
Here is how I overcame the aforementioned problems:
I used CMake for building native code. The beauty of this system is that it generates project files for your favorite (native) IDE. You use the same project files to compile and debug the code. You no longer need to synchronize the two systems manually.
Maven didn't support CMake, so I built my own plugin: https://github.com/cmake-maven-project/cmake-maven-project
I manually hard-coded platform-specific dependencies into each Maven profile, instead of defining the dependency once with a different classifier per profile. This was more work, but it doesn't look like they will be fixing this bug in Maven anytime soon.
I plan to investigate http://www.mojohaus.org/flatten-maven-plugin/ and https://github.com/mjiderhamn/promote-maven-plugin as alternatives in the near future.
Jenkins pipeline does a good job of orchestrating a build across multiple machines.
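For example, a minimal declarative-pipeline sketch (the agent labels and the 'native' Maven profile are assumptions, stand-ins for your own node and build setup):

pipeline {
    agent none
    stages {
        stage('build native bits') {
            parallel {
                stage('linux') {
                    agent { label 'linux' }
                    steps { sh 'mvn -Pnative package' }
                }
                stage('windows') {
                    agent { label 'windows' }
                    steps { bat 'mvn -Pnative package' }
                }
                stage('mac') {
                    agent { label 'mac' }
                    steps { sh 'mvn -Pnative package' }
                }
            }
        }
    }
}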
Running Jenkins slaves on virtual machines is still very error-prone, but I've managed to work around most of the problems. I've uploaded my VMWare configuration steps and Jenkins job configuration to help others get started.
I now create an empty JAR file for platform-specific artifacts in order to suppress the Sonatype error. This was actually recommended by Sonatype's support staff.
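One low-tech way to produce such a jar (the author's build may wire this up differently; the artifact name is illustrative) is a jar containing nothing but a readme:

# a placeholder jar with just a readme satisfies the main-jar requirement
echo "Platform-specific artifacts only; see the classifier jars." > README.txt
jar cf app-1.2.3.jar README.txt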
It turns out that maven-release-plugin delegates to other plugins under the hood. Instead of invoking it, I do the following:
Use mvn versions:set to change the version number from SNAPSHOT to a release and back.
Tag and commit the release myself.
Use nexus-staging:rc-open, nexus-staging:deploy -DstagingProfileId=${stagingProfileId} -DstagingRepositoryId=${stagingRepositoryId}, and nexus-staging:rc-close to upload artifacts from different platforms into the same repository. This is called a Staging Workflow (referenced below).
Upon review, release the repository to Maven Central.
Important: do not enable <autoReleaseAfterClose> in the nexus-staging plugin because it closes the staging repository after each deploy instead of waiting for all deploys to complete.
Per https://issues.sonatype.org/browse/NEXUS-18753 it isn't possible to release SNAPSHOT artifacts atomically (there is no workaround). When releasing SNAPSHOTs, you need to skip rc-open and rc-close, and invoke nexus-staging:deploy without -DstagingProfileId=${stagingProfileId} -DstagingRepositoryId=${stagingRepositoryId}. Each artifact will be uploaded into a separate repository.
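Putting the release steps together, the happy path looks roughly like this (1.2.3 and the <...> ids are placeholders; this is a sketch of the goals named above, not a copy-paste recipe):

# switch the POMs from 1.2.3-SNAPSHOT to 1.2.3, then tag and commit
mvn versions:set -DnewVersion=1.2.3
git commit -am "Release 1.2.3" && git tag v1.2.3
# open one staging repository and note the repository id it reports
mvn nexus-staging:rc-open -DstagingProfileId=<profileId>
# on each platform machine, deploy into that same staging repository
mvn clean deploy -DstagingProfileId=<profileId> -DstagingRepositoryId=<repoId>
# once every platform has deployed, close the repository for review
mvn nexus-staging:rc-close -DstagingRepositoryId=<repoId>
# move development back to the next snapshot
mvn versions:set -DnewVersion=1.2.4-SNAPSHOT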
See my Requirements API for a real-life example that works.
Other quirks to watch out for:
skipNexusStagingDeployMojo must be false in the last reactor module (otherwise no artifacts will be deployed): https://issues.sonatype.org/browse/NEXUS-12365. The best workaround is to use Maven profiles to omit whatever modules you want when deploying (don't use skipNexusStagingDeployMojo at all).
skipLocalStaging prevents deploying multiple artifacts into the same repository: https://issues.sonatype.org/browse/NEXUS-12351

Best practices for storing Maven artifacts in Nexus

In my company we have teams working on services which are built using Maven POMs and Gradle build scripts. The problem I have is that when the teams build their web applications, the jar files created by one team member need to be available to other team members in their POM files.
What we were thinking was to have a local Nexus repo and push the built jar files to Nexus, so that when any other team member builds, they can refer to the same jar file.
However, this could lead to versioning problems, as two team members could be generating the same jar file if they change different files in the same project.
What I would like to know is: are there any best practices for doing these types of builds and versioning?
There are many different opinions and strategies on how to manage this process. Some aspects are relatively common, however.
I'd say there are two key elements:
* Proper use of version definitions and references
* Automated builds and Nexus deployments
If you have work ongoing for a specific artifact, for a specific release, then those changes should all go into a specific numbered version of the artifact. While work is ongoing, that version should end with "-SNAPSHOT". When the work for a release is completed, the "-SNAPSHOT" suffix should be removed from the version number. You also likely want to have separate repositories in the Nexus server for "snapshot" and "release" artifacts.
Concerning pushing artifacts to Nexus, this should always be done through automation; manually pushing artifacts should be very rare. When a regular build is done for ongoing work, it should automatically deploy the "-SNAPSHOT" artifact to the snapshot repo. When your build automation runs a "release" build, it should deploy the release artifacts to the release repo.
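Since your teams also use Gradle, here is a sketch of that routing on the Gradle side with the maven-publish plugin applied (the Nexus URLs and credential property names are placeholders):

publishing {
    publications {
        maven(MavenPublication) {
            from components.java
        }
    }
    repositories {
        maven {
            def releases  = 'https://nexus.example.com/repository/maven-releases/'
            def snapshots = 'https://nexus.example.com/repository/maven-snapshots/'
            // route the artifact by its version suffix
            url = version.toString().endsWith('-SNAPSHOT') ? snapshots : releases
            credentials {
                username = findProperty('nexusUser')      // e.g. from gradle.properties
                password = findProperty('nexusPassword')
            }
        }
    }
}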
There are many other options and details you'll want to examine. Only implement features in this process that provide clear value in your situation. It's very easy to set up a process that is more complicated than you need.

What is Workflow management using Maven?

I have only a little experience using Maven with Eclipse. One of the job descriptions I received has "Workflow management using Maven" as a required skill. What does this mean? What might they expect?
I think they want you to correct them? :D
I'm not sure what they're referring to. I would guess it relates to the developer workflow of creating and delivering software with Eclipse (?) and Maven.
Setting up a project from scratch is often done from a Maven archetype (a project template, if you like). A lot of open source frameworks offer archetypes to start with.
For existing projects you would check out the code from version control and import it into Eclipse. The m2eclipse plugin is required to do that (but I think it's quite common to have it).
Then there is building the software, which is done by executing Maven phases (which will in turn execute plugins). See maven-phases for more details. Maven phases have default plugins that execute (for example, compile will run the compiler plugin).
So your workflow would look like this: you modify the files, compile them, test them, package them, and deploy the artifacts into the Maven repository. The Maven install phase will store the artifacts in your local repository; the Maven deploy phase will upload them into the company's repository.
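In commands, that workflow is just the phases run in order (each phase also runs the ones before it, so in practice a single mvn deploy covers the lot):

mvn compile     # compile the sources
mvn test        # run the unit tests
mvn package     # build the jar/war
mvn install     # copy it into your local repository (~/.m2/repository)
mvn deploy      # upload it to the company's repository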
From there the files are installed. You can also use Maven plugins to install the software into an application server; that depends on the traditions of the company.
I would not think of the workflow as some strict step-by-step thing like BPMN. Development is usually shaped by a large amount of personal practice (whether tests are written in advance or while implementing, and so on).
Hope that will help :)

Best practice for using Maven or Gradle without internet access

My company has a policy that software deployed into production has to be built on a specific machine that has no access to the internet.
We're currently using Maven. When running a build on development machines, Maven automatically downloads the dependencies from the central Maven repository without problems. Then, before going to production, we put all the files in the local Maven repository (.m2/repository) into source control, and run an offline build with
mvn -o -Dmaven.repo.local=<local repo dir> package
This method works, but managing thousands of files in source control is a real pain, particularly the dependencies for Maven plugins. Thus my question: how can I improve the workflow so as to make it easier to maintain the dependencies in source control?
I'm considering switching to Gradle, mainly because it's more flexible and doesn't depend on plugins downloaded from a repository. But then I found out that the Gradle local cache directory is not transportable between computers, which means I cannot check it into source control.
Suggestions and recommendations are all appreciated.
Use an internal repository manager like Nexus or Artifactory, and always put released artifacts into production.
Building the project on the production machine is not a good idea, though. Better to use a complete artifact like an EAR or WAR with all dependencies included, or something like jar-with-dependencies or another assembled distribution. Build the project on your CI server and deploy the complete package to the production server with one click.
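If checking dependencies into source control has to stay for now, one way to reduce the pain is to populate a dedicated repository directory in a single step instead of curating .m2 by hand (the goal below is the standard maven-dependency-plugin one; the directory name is just an example, and go-offline can miss the odd plugin dependency, so verify the offline build once):

# on a machine with internet access, download everything the build needs
mvn dependency:go-offline -Dmaven.repo.local=third-party-repo
# commit third-party-repo, then build offline against it
mvn -o -Dmaven.repo.local=third-party-repo package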

What's the purpose of an artifact repository?

Wherever you read about continuous delivery or continuous integration it's recommended to use an artifact repository to store the artifacts even though Jenkins already stores them for each build.
So why is it recommended to use an artifact repository? Is there a smooth solution to work with the artifacts of the Jenkins builds, ex. to use these artifacts for deployment?
An artifact repository and continuous integration tools serve two different purposes; one cannot be substituted for the other. Check this video from JFrog, the company behind the Artifactory artifact repository, about why one should use an artifact repository.
Jenkins stores the artifacts as plain files without versioning, while artifacts in an artifact repository can be version-controlled. So you have a lot more flexibility in retrieving artifacts and governing them. Read this very good article on why we need them; not all of those things are supported by continuous integration tools like Jenkins.
Moreover, you can also look at the Artifactory plugin for Jenkins which integrates the two.
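As a rough scripted-pipeline sketch of that integration ('my-artifactory', the tool name, and the repo keys are placeholders configured in Jenkins):

node {
    checkout scm                                  // fetch the project sources
    def server = Artifactory.server 'my-artifactory'
    def rtMaven = Artifactory.newMavenBuild()
    rtMaven.tool = 'maven-3'                      // a Maven installation name in Jenkins
    rtMaven.deployer server: server,
                     releaseRepo: 'libs-release-local',
                     snapshotRepo: 'libs-snapshot-local'
    def buildInfo = rtMaven.run pom: 'pom.xml', goals: 'clean install'
    server.publishBuildInfo buildInfo             // record the build in Artifactory
}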
An artifact repository is needed, but the artifact repo is a conceptual piece and not always a distinct tool. With Jenkins you should have MD5 signatures and (I think) a way of downloading the files you want (a web service call, right?) from your remote server. Certainly, if you're doing something simple like using the Jenkins build pipeline plugin, it should be able to access the right versions of the files smoothly.
Alternatively, if you are using a separate deployment tool, the better ones bundle an artifact repository.
Regardless, you want what the ITIL folks call a Definitive Media/Software Library: definitive in that the bits are secure, trusted, and official; and a library in that they can be easily looked up and accessed. When working with an artifact repository, you need to make sure it's adequately secure, it is backed up, and it is accessible for your deployments (including to production). If you look at Jenkins and it meets your criteria in those categories, consider yourself done. If it's lacking, and I wouldn't be surprised if it was, then you need either a dedicated tool like the Maven repos, or something bundled with the deploy tooling.
For more of my rambling on the subject, there's a recorded webcast. The slides for that are up on Slideshare.
I haven't kept up to date with Jenkins; we still use a version of the CI server from when it was originally called Hudson.
In your projects' POMs, you should normally point to your own artifact repository, where you can fetch and deploy your own (company) projects.
Using an artifact repository with your CI server, it can then deploy successfully built snapshots and releases, which become available to other developers.
