Maven deploy configs in pom.xml vs. Jenkins post-build action

I see two alternatives for how I can deploy my project to Nexus:
Configure distributionManagement and the deploy plugin in pom.xml, so that in Jenkins I only have to call mvn deploy and my project will be deployed to the environment
Create a Post-build Action in Jenkins -> Deploy artifacts to Maven repository, where I can set the repository URL, repository ID and so on
Question
What are the pros and cons of each approach compared with the other?

If you are configuring the deployment in the Jenkins build configuration you are doing two things:
you are separating the deployment from the project itself and can therefore potentially have different deployments for the same project
you remove the deployment setup from your version control setup/your source code
If you are leaving it in the pom using the default Maven setup, you can run the deployment of the project without modification from the command line on any machine that has the credentials set up correctly. This can greatly help with troubleshooting, and it makes the setup independent of whatever CI server you use.
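For reference, a minimal distributionManagement block might look like the sketch below; the repository IDs and URLs are placeholders for your own Nexus instance, and the matching credentials go into settings.xml under the same server IDs:

  <distributionManagement>
    <!-- placeholders: point these at your own Nexus repositories -->
    <repository>
      <id>nexus-releases</id>
      <url>https://nexus.example.com/repository/maven-releases/</url>
    </repository>
    <snapshotRepository>
      <id>nexus-snapshots</id>
      <url>https://nexus.example.com/repository/maven-snapshots/</url>
    </snapshotRepository>
  </distributionManagement>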
Both approaches, as well as more custom setups like using the Artifactory Build Integration or the Nexus Staging Maven Plugin, are fine. It will mostly depend on what you are aiming to achieve.
Personally I believe that the configuration should not be isolated to Jenkins and should remain with the project in the pom. But that is just my 2c.

Thanks for adding the Artifactory tag, now I can give you one more option - Artifactory Build Integration. With the Artifactory Jenkins plugin you can configure your deployment options (target repository, whether or not you want to deploy build information, environment variables, custom properties, etc.) without polluting your developers' pom with CI-only information.

Related

Build Once, Deploy Anywhere, with Maven, Jenkins, and Artifactory

I'm in the latter stages of setting up a CI environment for my project. I'm using Maven, Jenkins and Artifactory Pro and can successfully build my project and deploy its artifacts to Artifactory. I have also written a bash script to retrieve the resulting artifacts of a specific build from Artifactory and copy them somewhere.
The main part I'm missing right now is automated versioning. I've looked at enabling Artifactory release management, which is really cool, but involves the rebuilding of the project. I'm really trying to follow the mantra of 'Build Once, Deploy Anywhere', so any rebuilding is a no-no.
My question boils down to: Is there an automated way (either with one of the aforementioned tools, or a plugin) to handle versioning, without rebuilding an artifact?
Artifactory Pro allows you to easily extend Artifactory's behavior with your own plugins written in Groovy (https://www.jfrog.com/confluence/display/RTF/User+Plugins).
You can find here an example of a promotion extension that will change your artifact versions without the need for a new build.
You can find more useful examples in the GitHub "artifactory-user-plugins" repository.

Best practice for using Maven or Gradle without internet access

My company has a policy that software deployed into production has to be built on a specific machine that has no access to the internet.
We're currently using Maven. When running a build on development machines, Maven automatically downloads the dependencies from the central Maven repository without problems. Then, before going to production, we put all files in the local Maven repository (.m2/repository) into source control, and then run an offline build with
mvn -o -Dmaven.repo.local=<local repo dir> package
This method works, but managing thousands of files in source control is a real pain, particularly the dependencies for Maven plugins. Thus my question: how can I improve the workflow so as to make it easier to maintain the dependencies in source control?
I'm considering switching to Gradle, mainly because it's more flexible and doesn't depend on plugins downloaded from a repository. But then I found out that the Gradle local cache directory is not transportable between computers, which means I cannot check it into source control.
Suggestions and recommendations are all appreciated.
Use an internal repository manager like Nexus or Artifactory. Always put released artefacts into production.
But building the project on the production machine is not a good idea. Better to use a complete artefact like an EAR or WAR with all dependencies included, or something like jar-with-dependencies or another assembled distribution. Build the project on your CI server and deploy the complete package with one click to the production server.
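If you do introduce such a repository manager, a mirror entry in settings.xml pointed at it is usually all the build machines need. A sketch, with the host name as a placeholder for your internal server:

  <settings>
    <mirrors>
      <!-- route all dependency and plugin downloads through the internal manager -->
      <mirror>
        <id>internal-repo</id>
        <mirrorOf>*</mirrorOf>
        <url>https://nexus.example.com/repository/maven-public/</url>
      </mirror>
    </mirrors>
  </settings>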

How to deploy locally already installed artifact with Maven

Story
I know that the mvn deploy command runs through the whole lifecycle. My problem is that it takes too much time in my case. Let me explain:
There is an application built up from a Server and a single-sourced Eclipse RAP & RCP client
The communication is defined by shared API projects which are built together with the Server, but are also needed by the GUI projects
The GUI projects are built by Tycho, so it's impossible to build both of them in one build (in one reactor, EDIT: since the P2 artifacts are different for RCP and RAP)
I build a release with a multi-step Jenkins build. To make sure that everything is fine, I first run a clean install for the Server and the GUI variations one by one, and then I deploy them if nothing fails.
Question
Building everything twice takes a lot of time. Is there anything like "please simply deploy all built artifacts as they are from my local repository to the POM-defined repository, skipping the whole lifecycle"?
If you already have the artifact from the previous build, you may consider deploy:deploy-file, following the Guide to deploying 3rd party JARs to a remote repository. I always use this goal to publish a stable artifact to the developers' public remote repository so that other teams can test/use it.
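A sketch of such an invocation; the file name, coordinates and repository details are placeholders, and the repositoryId must match a server entry with credentials in your settings.xml:

  # deploy an already-built artifact without re-running the build lifecycle
  mvn deploy:deploy-file \
      -Dfile=target/my-app-1.0.0.jar \
      -DpomFile=pom.xml \
      -DrepositoryId=nexus-releases \
      -Durl=https://nexus.example.com/repository/maven-releases/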
I hope this may help.
I don't think that there is a pure Maven solution to this. The problem is that your deploy-only build won't know which artifacts to deploy; AFAIK this information is only in the in-memory Maven model and not persisted to the target folder.
The problem can be solved with a Maven repository manager that supports staging, like the (commercial) Nexus Pro. Then, your build would deploy straight away into a staging repository, and only promote the artifacts to the (main) repository if everything succeeded.

Maven deploy: forcing the deploy even if artifact already exists

I'm building a project which is made up of several (sometimes unrelated) modules and some more non-standard Java modules (built with Ant).
Each maven module is deployed to the releases repository on completion.
If the build fails in the middle, I might have some modules already deployed, so if I try to rebuild, the new attempt to deploy will fail since the artifacts are already deployed.
Is it possible to force a deploy or instead, remove the deployed artifact before I deploy again?
It sounds like the middleware admins have configured your remote repo instance (Nexus or Artifactory or whatever) to not allow artifact redeployment, and as khmarbaise says there are good reasons for that. Nexus can be configured to allow artifact deletion by users in a particular role or with artifact deletion privileges. If your admins have it set up that way, perhaps you can request the delete privilege and remove the offending artifacts. Or perhaps the Nexus admin will agree to do it for you.
If neither of these is possible, here are some things to try which might keep this from happening in the future:
If you are using the release plugin, do a dry run (-DdryRun=true on the release:prepare command line) first. Maven should report any errors without committing to SCM.
Try running mvn install on your group of projects first. This will install the artifacts to your local repo, not the remote. If there's a problem you can whack the artifacts out of your local repo and start from scratch, repeating until you get a complete build.
If you are running a multi-module build, there are command line options that allow resuming a Maven build from a particular project forward.
Define -Dmaven.deploy.skip=true on the Maven command line. This is similar to suggestion #2, except Maven will actually load and configure the deploy plugin; it just won't do the actual deploy to the remote repo. Once everything works, remove the skip property (see the sketch after this list).
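A sketch of what suggestions 1 and 4 look like on the command line:

  # suggestion 1: dry-run the release so problems surface before anything is committed to SCM
  mvn release:prepare -DdryRun=true

  # suggestion 4: run the full build but skip the actual upload to the remote repository
  mvn clean deploy -Dmaven.deploy.skip=true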
I know it might be late, but in Nexus there's an option which allows the redeployment of artifacts.
Just select the repositories on the left, choose the repository whose policy you want to change, and then set it to Allow Redeploy.
The possible options have been increased ;)
Use the parameter deployAtEnd (more information: here). With this parameter the artifacts are deployed only if all modules were built successfully.
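A minimal configuration sketch for this; the plugin version is an assumption (deployAtEnd needs maven-deploy-plugin 2.8 or newer), so use whatever version your build already resolves:

  <plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-deploy-plugin</artifactId>
    <version>2.8.2</version>
    <configuration>
      <!-- hold back every module's deployment until the last module has built -->
      <deployAtEnd>true</deployAtEnd>
    </configuration>
  </plugin>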

Adding default maven build configuration on Jenkins

I am setting up a CI system using Jenkins for Maven-based projects. I was wondering whether there is a way to specify a build configuration that would be common to all the projects deployed on Jenkins.
For instance, I want all the projects to generate Javadocs, hence I require the maven-javadoc-plugin in the Maven POM. As I understand it, this can't be added to the settings.xml file. And I don't have access to the super POM. And editing the super POM isn't a good idea anyway.
What is the best way to add a common build profile for all the projects?
Corporate POM, that's what I was looking for.
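A corporate parent POM along those lines might look roughly like the sketch below; the coordinates and plugin version are placeholders, and each project then references it via its <parent> element:

  <project xmlns="http://maven.apache.org/POM/4.0.0">
    <modelVersion>4.0.0</modelVersion>
    <!-- placeholder coordinates for the shared corporate parent -->
    <groupId>com.example.build</groupId>
    <artifactId>corporate-parent</artifactId>
    <version>1.0.0</version>
    <packaging>pom</packaging>

    <build>
      <plugins>
        <!-- every inheriting project generates and attaches Javadoc jars -->
        <plugin>
          <groupId>org.apache.maven.plugins</groupId>
          <artifactId>maven-javadoc-plugin</artifactId>
          <version>3.6.3</version>
          <executions>
            <execution>
              <id>attach-javadocs</id>
              <goals>
                <goal>jar</goal>
              </goals>
            </execution>
          </executions>
        </plugin>
      </plugins>
    </build>
  </project>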
