What if a Maven artifact is not available anymore?

We have just migrated all of our projects to Maven, and one question came up for which I couldn't find a suitable answer yet.
What if Maven Central shuts down forever, or just one artifact is no longer available? Or what if some versions disappear and the software cannot run with the newer ones?
Since we're not hosting a copy of the artifacts locally, should we host copies of every jar somewhere for such a scenario?

Unlikely, but if you build your artifacts with Maven, you already have a copy of each relevant artifact in your local repository. If you back it up from time to time, you have the necessary level of security.
Alternatively, use a company repository manager (Nexus/Artifactory) to proxy Maven Central. It will also keep copies of the artifacts you use.
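A minimal sketch of the settings.xml mirror entry that routes all downloads through such a repository manager; the host name and repository path are placeholders:

    <settings>
      <mirrors>
        <mirror>
          <id>company-mirror</id>
          <!-- send requests for every repository, including Maven Central, to the manager -->
          <mirrorOf>*</mirrorOf>
          <url>https://repo.example.com/repository/maven-public/</url>
        </mirror>
      </mirrors>
    </settings>

The manager then caches every artifact it serves, so your builds keep working even if the upstream repository disappears.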

Related

Should I backup maven/ivy dependencies?

In a project whose dependencies are managed online by Ivy/Maven, is it good practice to back up the artifacts, e.g. by including them in the project's source code / version control (e.g. svn, git), or is it safe enough to do so only with the pom.xml / ivy.xml? Of course, not backing up the artifacts somehow makes the project totally dependent on the availability of the Maven repository.
EDIT:
The reason for my question is that I fear that certain artifacts may at some point no longer be available on the official Maven repository, or even that the repository itself goes down. Having the libs under version control guarantees that the project can still be built in such a case.
My advice is not to back up the artifacts. If an artifact is no longer available, you should consider replacing it, although I see very little chance of that happening. In any case, you can always recover the old artifact JAR from a previous build for the rare scenario where you have no other option and must use the old artifact, so I would not suggest keeping a separate backup.
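That said, if you do decide to keep the jars next to the project, one option is the standard maven-dependency-plugin, which can copy every dependency of the build into a folder of your choice. A sketch, with the output directory as an arbitrary example:

    <plugin>
      <groupId>org.apache.maven.plugins</groupId>
      <artifactId>maven-dependency-plugin</artifactId>
      <executions>
        <execution>
          <id>backup-dependencies</id>
          <phase>package</phase>
          <goals>
            <goal>copy-dependencies</goal>
          </goals>
          <configuration>
            <!-- arbitrary example folder; point it at whatever you put under version control -->
            <outputDirectory>${project.basedir}/lib-backup</outputDirectory>
          </configuration>
        </execution>
      </executions>
    </plugin>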

How do I get my local Maven to overwrite an existing artefact that was overwritten in my network repository

I have a Maven repository folder on our network drive, which contains all the artefacts we use.
Everyone in the office uses a standard settings.xml file on their local Maven setup which contains the location of that network drive as a remote repository.
In this way, we keep the network Maven repository folder updated so the local environment on everyone's computers simply downloads from that central repository folder, which avoids re-downloading off the internet for everyone.
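For reference, the relevant part of our settings.xml looks roughly like this (the server and share names here are made up):

    <profiles>
      <profile>
        <id>network-repo</id>
        <repositories>
          <repository>
            <id>shared-drive</id>
            <!-- UNC path to the network drive; the exact file: URL syntax may vary by OS -->
            <url>file:////fileserver/maven-repo</url>
          </repository>
        </repositories>
      </profile>
    </profiles>
    <activeProfiles>
      <activeProfile>network-repo</activeProfile>
    </activeProfiles>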
We are busy developing a new library, let's say "MyLib 1.0.0". We install it into the central network Maven repository folder, and everyone's local Maven projects use that dependency. But we have not officially released "MyLib 1.0.0" yet; it is still a work in progress, so whenever we make further updates we overwrite the "MyLib 1.0.0" artefact in the central repository.
The problem is that everyone's local Maven has already downloaded the artefact into its local repository, so it won't re-download it: it already exists. I don't want to increase the version of "MyLib" yet because it is not an official release, and I also don't want everyone to have to change the dependency version in their pom.xml files. I just want to replace the "MyLib 1.0.0" file and have everyone's local Maven download and overwrite their local copy automatically. (At the moment everyone has to be told to remove the artefact from their local Maven repository manually, at which point it will re-download the latest copy of "MyLib 1.0.0".)
What is best practice for the above, or how can I go about achieving this?
Based on your description, two things come to mind.
First:
Stop using a network drive and start using a repository manager instead.
Apart from anything else, a network drive is terribly slow.
Second:
In Maven a release is immutable, which means that once you have deployed version 1.0 to your repository, it will never be changed again. If you need to make a change, you have to use a different version, for example 1.0.1.
You should use SNAPSHOT versions instead of releases while you are still developing; using a release version here is exactly what is causing your problem:
"At the moment everyone has to be told to go and remove the artefact from their local Maven repository manually, at which point it will re-download the latest copy of 'MyLib 1.0.0'"
Which shows you are working against the concepts of Maven.
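As a sketch, all the library's own pom.xml needs while it is still under development is a SNAPSHOT version (the coordinates here are made up):

    <groupId>com.example</groupId>
    <artifactId>mylib</artifactId>
    <!-- -SNAPSHOT marks the version as still changing; Maven will re-check the remote repository for updates -->
    <version>1.0.0-SNAPSHOT</version>

Everyone then depends on 1.0.0-SNAPSHOT, and their local Maven periodically re-checks the remote repository for a newer snapshot (immediately, if they build with mvn -U).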
And third: it sounds like you don't use continuous integration for your builds.

What's the purpose of an artifact repository?

Wherever you read about continuous delivery or continuous integration, it is recommended to use an artifact repository to store the artifacts, even though Jenkins already stores them for each build.
So why is it recommended to use an artifact repository? And is there a smooth way to work with the artifacts of the Jenkins builds, e.g. to use them for deployment?
An artifact repository and continuous integration tools serve two different purposes and one cannot be substituted with the other. Check this video from Artifactory, one of the providers of artifact repositories, about why one should use an artifact repository.
Jenkins stores the artifacts as plain files without versioning while artifacts in an artifact repository can be version controlled. So you have a lot more flexibility in retrieving artifacts and governing them. Read this very good article on why we need them. Surely not all of those things are supported by continuous integration tools like Jenkins.
Moreover, you can also look at the Artifactory plugin for Jenkins which integrates the two.
An artifact repository is needed, but the artifact repo is a conceptual piece and not always a distinct tool. With Jenkins you should have MD5 signatures and (I think) a way of downloading the files you want (a web service call, right?) from your remote server. Certainly, if you're doing something simple like using the Jenkins build pipeline plugin, it should be able to access the right versions of the files smoothly.
Alternatively, if you are using a separate deployment tool, the better ones bundle an artifact repository.
Regardless, you want what the ITIL folks call a Definitive Media/Software Library: definitive in that the bits are secure, trusted, and official, and a library in that they can be easily looked up and accessed. When working with an artifact repository, you need to make sure it is adequately secure, that it is backed up, and that it is accessible for your deployments (including to production). If you look at Jenkins and it meets your criteria in those categories, consider yourself done. If it's lacking, and I wouldn't be surprised if it was, then you need either a dedicated tool like the Maven repos, or something bundled with the deploy tooling.
For more of my rambling on the subject, there's a recorded webcast. The slides for that are up on Slideshare.
I haven't kept up to date with Jenkins; we still use a version of the CI server from when it was originally called Hudson.
In your projects' POMs you should normally point to your own artifact repository, where you can fetch and deploy your own (company) artifacts.
With an artifact repository attached to your CI server, the server can deploy successfully built snapshots and releases, which then become available to the other developers.
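A sketch of the pom.xml section that tells Maven, and thereby the CI server, where to deploy; the IDs and URLs are placeholders, and each <id> must match a <server> entry with credentials in settings.xml:

    <distributionManagement>
      <repository>
        <id>company-releases</id>
        <url>https://repo.example.com/repository/releases/</url>
      </repository>
      <snapshotRepository>
        <id>company-snapshots</id>
        <url>https://repo.example.com/repository/snapshots/</url>
      </snapshotRepository>
    </distributionManagement>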

Maven and ibiblio

I searched the Apache documentation and ibiblio.org extensively and could not find a decent, straight answer.
My questions:
When I download a jar via a Maven dependency (set up in the pom), how can I be sure that the file does not change on the remote repository? For example, if I'm using log4j version 1.2.3, downloaded from ibiblio.org (or any other repo for that matter), how can I be sure I'm getting the exact same jar each time?
Does Maven delete jars from the local repository? Let's assume I'm not clearing the repository at all: will it fill up eventually, or does Maven have some mechanism to clear out old jars?
By Maven convention, a released version like log4j 1.2.3 will never be changed. It stays in your local repository until you delete it manually; on the remote side it could only be changed by the admins of Maven Central, and I assume they would never do such a stupid thing.
Furthermore, the download is by default done from Maven Central (repo1.maven.org/maven2), not from ibiblio.
One of Maven's "tricks" is to download a released artifact only once; that improves your build performance, in contrast to SNAPSHOT dependencies, which are re-checked for updates.
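On the question of being sure the jar has not changed: Maven downloads a .sha1 checksum alongside each artifact, and you can make a checksum mismatch fail the build. A sketch of a repository definition with strict policies (this overrides the built-in central definition):

    <repository>
      <id>central</id>
      <url>https://repo1.maven.org/maven2</url>
      <releases>
        <!-- a released artifact is downloaded once and never re-checked -->
        <updatePolicy>never</updatePolicy>
        <!-- refuse the artifact if it does not match its published checksum -->
        <checksumPolicy>fail</checksumPolicy>
      </releases>
    </repository>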
You could configure your own repository and point all your project POMs at it. It's easy to configure your POMs to use a different (private) repository, but I've never set one up myself. It doesn't seem too hard, apart from managing it to keep all the needed artifacts available.

How to enable access to a Maven repository from inside Glassfish?

I have the following problem. We have a central Maven repository hosted on our company server, and our team is working on a project. Everyone here uses that repository to get the required artifacts. If something is missing and is required for the task a developer is currently working on, he installs the artifact manually into the central repository, so that his commits don't break the automated builds.
Each developer also has Glassfish v2 installed on his machine, for testing and debugging purposes. Before committing changes, the developer builds the .ear for the project with Maven's help. However, after the developer deploys the ear to his local Glassfish, errors frequently arise, because the set of Glassfish libraries may not contain all the latest dependencies from the central company repository.
Right now in case of the error the developer simply reads the log and looks what exactly is missing. After that he manually copies the required jar inside his local $GLASSFISH_HOME$/lib dir. But that seems a little bit frustrating. How can this be done automatically?
Right now we are trying to implement the following solution. The developer has to synchronize his local maven repository gathering all the artifacts from the central one that are required by the project. This local repository has to be placed on the java classpath, so that glassfish would also see it. Is that a correct approach? Maybe there is a way to install directly all the required artifacts from the central repository inside $GLASSFISH_HOME$/dir and this can be done automatically during deploy?
About having to install dependencies: if the developers need to install dependencies that are missing from public Maven repositories, take into account that Maven proxies can usually cache public repos; for instance, Archiva has a proxying cache. If the dependencies are your own project deliverables, you should consider releasing them and deploying them to your company repo with Maven.
About latest versions: you need to tell Maven which versions of the dependencies to use. I would prefer to edit my POMs manually, but there is a variety of ways to achieve that.
The libraries should be part of the project, I think. If they are not standard Glassfish libraries, they should be included in your project, for instance inside your war file. If they are neither standard nor part of your project (not the usual approach), consider managing this Glassfish as a project of its own (its own git/svn repo, its own pom, its own versions, its own everything).
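As a sketch, the difference is just the dependency scope in the pom.xml (the versions here are only illustrative):

    <dependencies>
      <!-- default compile scope: packaged into the war/ear together with the project -->
      <dependency>
        <groupId>log4j</groupId>
        <artifactId>log4j</artifactId>
        <version>1.2.17</version>
      </dependency>
      <!-- provided scope: compiled against, but expected to come from Glassfish itself -->
      <dependency>
        <groupId>javax.servlet</groupId>
        <artifactId>servlet-api</artifactId>
        <version>2.5</version>
        <scope>provided</scope>
      </dependency>
    </dependencies>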
Good luck.
