Maven deploy: forcing the deploy even if the artifact already exists

I'm building a project that is made up of several (sometimes unrelated) Maven modules and some more non-standard Java modules (built with Ant).
Each Maven module is deployed to the releases repository on completion.
If the build fails in the middle, I might have some modules already deployed, so if I try to rebuild, the new attempt to deploy will fail because those artifacts are already deployed.
Is it possible to force a deploy, or alternatively to remove the already deployed artifacts before I deploy again?

It sounds like the middleware admins have configured your remote repository manager (Nexus, Artifactory, or whatever) to disallow artifact redeployment, and as @khmarbaise says, there are good reasons for that. Nexus can be configured to allow artifact deletion by users in a particular role or with artifact deletion privileges. If your admins have it set up that way, perhaps you can request the delete privilege and remove the offending artifacts. Or perhaps the Nexus admin will agree to do it for you.
If neither of these is possible, here are some things to try that might keep this from happening in the future (example commands follow the list):
1. If you are using the release plugin, do a dry run first (-DdryRun=true on the release:prepare command line). Maven should report any errors without committing anything to SCM.
2. Try running mvn install on your group of projects first. This installs the artifacts into your local repository, not the remote one. If there is a problem, you can delete the artifacts from your local repository and start from scratch, repeating until you get a complete build.
3. If you are running a multi-module build, there are command-line options that allow resuming a Maven build from a particular project onward.
4. Define -Dmaven.deploy.skip=true on the Maven command line. This is similar to suggestion 2, except that Maven will actually load and configure the deploy plugin; it just won't do the actual deploy to the remote repository. Once everything works, remove the skip property.
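For illustration, the corresponding commands might look like this (:my-module is a placeholder for one of your module artifactIds):

    mvn release:prepare -DdryRun=true      # suggestion 1: dry-run the release
    mvn clean install                      # suggestion 2: install to the local repository only
    mvn deploy -rf :my-module              # suggestion 3: resume the multi-module build from a given module
    mvn deploy -Dmaven.deploy.skip=true    # suggestion 4: run the deploy plugin but skip the actual upload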

I know it might be late, but Nexus has an option that allows the redeployment of artifacts.
Just select Repositories on the left, choose the repository whose policy you want to change, and set it to Allow Redeploy.

The available options have increased since then ;)
Use the deployAtEnd parameter of the maven-deploy-plugin. With this parameter the artifacts are deployed only if all modules were built successfully.
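A minimal sketch of that configuration in the parent POM (the plugin version shown is just one example of a version that supports deployAtEnd):

    <plugin>
      <groupId>org.apache.maven.plugins</groupId>
      <artifactId>maven-deploy-plugin</artifactId>
      <version>2.8.2</version>
      <configuration>
        <!-- upload all modules only after the whole reactor has built successfully -->
        <deployAtEnd>true</deployAtEnd>
      </configuration>
    </plugin>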

Related

How to ensure that Maven doesn't accidentally deploy to the release repository?

So I'm working for a customer that uses mvn deploy statements in his build scripts, and I'm trying to figure out a way to prevent Maven from accidentally overwriting artifacts in the release repo of Artifactory, for instance if a developer forgets to mark his POM version with -SNAPSHOT on his feature branch.
I'm no Maven expert, but I've seen some suggestions, like using certain Maven plugins; however, those plugins' usage must be configured in the POM, and then I'm back where I started: what if this is forgotten on a feature branch? There must be an established method to ensure that no artifacts from feature branches are deployed into the release repo, and that no artifacts from release branches are deployed into the snapshot repo by accident.
One way I can think of, which has also been suggested, is to simply disallow redeployment on the release repo in Artifactory, but what if I have a validation build that fires after a PR is created and then another CI build fires and tries to redeploy?
Is there another good way to achieve this?
One solution is to ensure specific users/groups do not have the Delete permission.
See more here: https://www.jfrog.com/confluence/display/JFROG/Permissions#Permissions-RepositoryPermissions.
NOTE: I haven't used Artifactory in a while, but this makes sense according to the docs.
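For reference, routing releases and snapshots to separate Artifactory repositories (so that each repository can enforce its own redeploy policy) is typically done via distributionManagement; the ids and URLs below are placeholders:

    <distributionManagement>
      <repository>
        <id>company-releases</id>
        <url>https://artifactory.example.com/artifactory/libs-release-local</url>
      </repository>
      <snapshotRepository>
        <id>company-snapshots</id>
        <url>https://artifactory.example.com/artifactory/libs-snapshot-local</url>
      </snapshotRepository>
    </distributionManagement>

Maven chooses between repository and snapshotRepository based on whether the version ends in -SNAPSHOT, so this routing alone already keeps snapshot builds out of the release repository.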

Maven: different local repositories for SNAPSHOT and RELEASE artifacts

Is it possible in Maven to configure different local repositories for SNAPSHOT and RELEASE artifacts?
The reason I am asking: we are using Jenkins for continuous builds of our project. To ensure consistency (if the same artifact is built by different Jenkins jobs because of a race condition, we can experience chaotic behavior), we create a fresh local repository for Jenkins before each build starts.
Now the problem is that our project is huge, so for every build we have to download lots of dependencies from our Nexus, but when you think about it, there is no reason to download the RELEASE artifacts anew every time. The RELEASE artifacts don't change from build to build; for example, Spring 4.5, httpclient 4.0 and aspectj 1.8.1 are the same from one build to the next.
So to ensure consistency, we actually only need to avoid keeping SNAPSHOT dependencies in the repository. If we could have two local repositories, one for RELEASE artifacts and the other for SNAPSHOTs, then before every build we could delete the SNAPSHOT repository but reuse the local RELEASE repository, which would save gigabytes of downloads from Nexus.
I know we can do RELEASE/SNAPSHOT configurations for remote repositories; is it possible to do the same sort of configuration for local repositories?
If this is not possible, how would you solve this problem?
There is currently no way to achieve this, and yes, I agree with the sentiment.
Reasonably recent versions of Jenkins' Maven plugin allow you to specify a custom local repository without having to edit a settings.xml file; the option is right there on the job configuration screen (in the Advanced section, select Use private Maven repository).
So, what I would do is use this option, and precede the Maven build step with a script that deletes all directories in the local private repository whose names end in -SNAPSHOT.
It's repulsive, but I can't think of any other way.
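A rough sketch of that pre-build script, assuming the private repository lives at $WORKSPACE/.repository (the location Jenkins uses for the private-repository option):

    # remove every SNAPSHOT version directory before the Maven step runs
    find "$WORKSPACE/.repository" -type d -name '*-SNAPSHOT' -prune -exec rm -rf {} +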

How to deploy an already locally installed artifact with Maven

Story
I know that the maven deploy command runs through the whole lifecycle. My problem is that it takes too much time in my case. Let me explain:
There is an application built up from a Server and a single-sourced Eclipse RAP & RCP client.
The communication is defined by shared API projects which are built together with the Server, but are also needed by the GUI projects.
The GUI projects are built by Tycho, so it's impossible to build both of them in one build (in one reactor; EDIT: since the P2 artifacts are different for RCP and RAP).
I build a release with a multi-step Jenkins build. To make sure that everything is fine, I first run a clean install for the Server and the GUI variations one by one, and then I deploy them if nothing fails.
Question
Building everything twice takes a lot of time. Is there anything like "please simply deploy all built artifacts as they are from my local repository to the POM-defined repository, skipping the whole lifecycle"?
If you already have the artifact from the previous build, you may consider the deploy:deploy-file goal, following the Guide to deploying 3rd party JARs to remote repository. I always use this goal to publish a stable artifact to the developer public remote repository to let other teams test/use it.
I hope this may help.
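A hedged example of that goal; every path, id, and URL below is a placeholder for your own values, and repositoryId must match a server entry in your settings.xml that holds the credentials:

    mvn deploy:deploy-file \
      -Dfile=target/my-module-1.0.0.jar \
      -DpomFile=pom.xml \
      -DrepositoryId=releases \
      -Durl=https://nexus.example.com/content/repositories/releases/

Because -DpomFile is given, the groupId/artifactId/version/packaging are taken from that POM instead of having to be passed on the command line.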
I don't think that there is a pure Maven solution to this. The problem is that your deploy-only build won't know which artifacts to deploy; AFAIK this information is only in the in-memory Maven model and not persisted to the target folder.
The problem can be solved with a Maven repository manager that supports staging, like the (commercial) Nexus Pro. Then, your build would deploy straight away into a staging repository, and only promote the artifacts to the (main) repository if everything succeeded.

Is there a plugin that ties Jenkins' builds to Maven (Nexus) artifacts

I was wondering if there is a Hudson/Jenkins plugin that ties repository artifacts to the build that created them? I was looking at the question "remove artifacts from nexus repository" and thinking that deleting a build in Jenkins should also offer the option to remove the artifacts the build created.
We are currently running Jenkins 1.447 and Nexus Open Source 1.9.2.3. Our Jenkins builds create artifacts in our Nexus repository using the maven deploy goal. It appears that once those artifacts have been deployed, there is no similarly automated mechanism to remove them. We would like to tie the Jenkins build to its Nexus artifacts. I figure that if we have decided to remove the build from Jenkins, we have no use for the build and therefore have no need to store the artifacts for that build either. We would like deleting the build to trigger deleting the Nexus artifacts.
If there's nothing available I figure I could start writing something, but I wanted to check and see how others handle this.
You can use the Nexus REST API to build your own Jenkins plugin that does that job for you. You could store the Jenkins build number using the Nexus custom-metadata plugin. Once the build is deleted, your custom Jenkins plugin can delete all artifacts in Nexus that have that build number in their metadata. I had a similar problem and wrote a custom Jenkins plugin. Have a look at the tutorial and the source code on GitHub. It should be pretty straightforward.
Tutorial:
http://blog.codecentric.de/en/2012/08/tutorial-create-a-jenkins-plugin-to-integrate-jenkins-and-nexus-repository/
Sourcecode:
https://github.com/marcelbirkner/nexus-metadata-plugin/
You can purge artifacts from a local repository via the maven-dependency-plugin.
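For reference, that local purge is the dependency:purge-local-repository goal, run from the project whose dependencies you want to remove and re-resolve:

    mvn dependency:purge-local-repository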
If you have a release, it does not make sense to number it based on the build server's build numbers. SNAPSHOTs exist for exactly this purpose. Furthermore, the usual practice is to delete SNAPSHOTs from Nexus via a scheduled task after some time, whereas releases are never deleted from Nexus.
Since you know the name of the release, you could build a custom job or trigger that uses a wget command to delete the artifact from the Nexus repository.
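A sketch of such a call against the Nexus 1.x/2.x REST API (curl shown instead of wget); the host, repository id, artifact path, and credentials are placeholders, and the exact URL layout depends on your Nexus version:

    # delete one released version from the "releases" repository
    curl -u deployer:secret -X DELETE \
      "https://nexus.example.com/service/local/repositories/releases/content/com/example/my-module/1.0.0/"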
As the proper user in Nexus, you do have the ability to delete release artifacts, not just SNAPSHOTs.

How to enable access to a Maven repository from inside GlassFish?

I have the following problem. We have a central Maven repository hosted on our company server. Our team is working on a project. Everyone here uses that repository to get the required artifacts. If something is missing at the moment and is required for the task the developer is currently dealing with, he installs the artifact manually into the central repository, so that his commits don't break the automated builds.
Now, each developer also has GlassFish v2 installed on his machine, for testing and debugging purposes. Before committing his changes, the developer builds the .ear for the project with Maven's help. However, after the developer deploys the ear to his local GlassFish, frequent errors arise, because the set of GlassFish libraries may not contain all the latest dependencies from the central company repository.
Right now, in case of an error, the developer simply reads the log and sees what exactly is missing. After that he manually copies the required jar into his local $GLASSFISH_HOME$/lib dir. But that seems a little bit frustrating. How can this be done automatically?
Right now we are trying to implement the following solution. The developer has to synchronize his local Maven repository, gathering all the artifacts from the central one that are required by the project. This local repository has to be placed on the Java classpath so that GlassFish would also see it. Is that a correct approach? Maybe there is a way to install all the required artifacts from the central repository directly into the $GLASSFISH_HOME$ directory, and this can be done automatically during deploy?
About having to install dependencies: if the developers need to install dependencies that are missing from public Maven repositories, take into account that Maven proxies usually have the ability to cache public repos. For instance, Archiva has a proxying cache. If the dependencies are your own project's deliverables, you should consider releasing them and deploying them with Maven to your company repo.
About latest versions: you need to tell Maven which versions of the dependencies to use. I would prefer editing my POMs manually; anyway, there is a variety of ways to achieve that.
The libraries should be part of the project, I think. If they are not standard GlassFish libraries, they should be included, for instance, in your WAR file as part of your project. If they are not standard but also not part of your project (not the regular approach), consider managing this GlassFish as a project of its own (its own git/svn repo, its own POM, its own versions, its own everything).
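For example, whether a library ends up packaged inside the WAR/EAR or is expected to come from GlassFish is controlled by the dependency scope; the coordinates below are placeholders:

    <dependency>
      <groupId>com.example</groupId>
      <artifactId>shared-lib</artifactId>
      <version>1.2.3</version>
      <!-- default (compile) scope: Maven packages the jar into the WAR/EAR -->
      <!-- use <scope>provided</scope> only for libraries GlassFish already ships -->
    </dependency>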
Good luck.
