In a project whose dependencies are managed online by Ivy/Maven, is it good practice to back up the artifacts, e.g. by including them in the project's version control (SVN, Git), or is it safe enough to do so only with the pom.xml / ivy.xml? Of course, not having the artifacts backed up somehow makes the project totally dependent on the availability of the Maven repository.
EDIT:
The reason for my question is that I fear certain artifacts may no longer be available on the official Maven repository one day, or even that the repository itself goes down. Having the libs under version control would guarantee that the project can still be built in such a case.
My advice is not to back up the artifacts. If an artifact becomes unavailable, you should consider replacing it, though I see very little chance of that happening. In the rare case where you have no other option and must use the old artifact, you can always recover its JAR from a previous build. So I would not suggest keeping a separate backup.
Related
As we just migrated all projects to Maven, one question came up for which I couldn't find any suitable solution yet.
What if Maven Central shuts down forever, or just one artifact is not available anymore? Or what if some versions are no longer available and the software cannot run with the newer ones?
Since we're not hosting a copy of the artifacts locally, should we host copies of every jar somewhere for such a scenario?
Unlikely, but if you build your artifacts with Maven, you already have a copy of each relevant artifact in your local repository. If you back it up from time to time, you have the necessary level of security.
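For reference, the local repository defaults to ~/.m2/repository; if it helps your backup routine, you can relocate it in settings.xml (the path below is just an example, not a Maven default):

```xml
<!-- ~/.m2/settings.xml — relocating the local repository so it is easy to back up -->
<settings xmlns="http://maven.apache.org/SETTINGS/1.0.0">
  <!-- example path; adjust to wherever your backed-up repository should live -->
  <localRepository>/data/backup/m2-repository</localRepository>
</settings>
```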
Alternatively, use a company repository manager (Nexus/Artifactory) to proxy Maven Central. It will also keep copies of the artifacts you use.
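A minimal sketch of that proxy setup, assuming a repository manager at a hypothetical internal URL; the mirror routes all downloads through the manager, which then caches every artifact it serves:

```xml
<!-- ~/.m2/settings.xml — route all downloads through a caching repository manager -->
<settings xmlns="http://maven.apache.org/SETTINGS/1.0.0">
  <mirrors>
    <mirror>
      <id>company-nexus</id>
      <!-- mirrorOf * sends every repository request through the manager -->
      <mirrorOf>*</mirrorOf>
      <!-- hypothetical URL; replace with your Nexus/Artifactory group endpoint -->
      <url>https://nexus.example.com/repository/maven-public/</url>
    </mirror>
  </mirrors>
</settings>
```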
In my company we have teams working on services that are built using Maven poms and Gradle build scripts. The problem I have is that when the teams build their web applications, the jar files created by one team member need to be available to the other team members in their pom files.
What we were thinking was to have a local Nexus repo and push the built jar files to it, so that when any other team member builds, they can refer to the same jar file.
However, this could lead to versioning problems, as two team members could generate the same jar file after changing different files in the same project.
What I would like to know is: are there any best practices for doing these types of builds and versioning?
There are many different opinions and strategies on how to manage this process. Some aspects are relatively common, however.
I'd say there are two key elements:
* Proper use of version definitions and references
* Automated builds and nexus deployments
If you have work ongoing for a specific artifact, for a specific release, then those changes should all go into a specific numbered version of the artifact. While work is ongoing, that version should end with "-SNAPSHOT"; when the work for a release is completed, the "-SNAPSHOT" suffix is removed. You also likely want separate repositories in the Nexus server for "snapshot" and "release" artifacts.
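As a minimal sketch (the coordinates are made up), the version element is all that distinguishes the two states:

```xml
<!-- pom.xml — ongoing work carries the -SNAPSHOT suffix -->
<project xmlns="http://maven.apache.org/POM/4.0.0">
  <modelVersion>4.0.0</modelVersion>
  <groupId>com.example.team</groupId>
  <artifactId>billing-service</artifactId>
  <!-- during development: 1.4.0-SNAPSHOT; for the release this becomes 1.4.0 -->
  <version>1.4.0-SNAPSHOT</version>
</project>
```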
Concerning pushing artifacts to Nexus, this should always be done through automation; manually pushing artifacts should be very rare. When a regular build is done for ongoing work, it should automatically deploy the "-SNAPSHOT" artifact to the snapshot repo. When your build automation runs a "release" build, it deploys the release artifact to the release repo.
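A sketch of the pom section that drives this routing (the repository URLs are hypothetical); `mvn deploy` picks the snapshot or release repository automatically, depending on whether the version ends in -SNAPSHOT:

```xml
<!-- pom.xml fragment — where `mvn deploy` publishes artifacts -->
<distributionManagement>
  <snapshotRepository>
    <id>company-snapshots</id>
    <url>https://nexus.example.com/repository/maven-snapshots/</url>
  </snapshotRepository>
  <repository>
    <id>company-releases</id>
    <url>https://nexus.example.com/repository/maven-releases/</url>
  </repository>
</distributionManagement>
```

Credentials for those ids belong in each developer's (or the CI server's) settings.xml under &lt;servers&gt;, not in the pom.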
There are many other options and details you'll want to examine. Only implement features in this process that provide clear value in your situation. It's very easy to set up a process that is more complicated than you need.
Wherever you read about continuous delivery or continuous integration, it is recommended to use an artifact repository to store the artifacts, even though Jenkins already stores them for each build.
So why is an artifact repository recommended? Is there a smooth way to work with the artifacts of Jenkins builds, e.g. to use those artifacts for deployment?
An artifact repository and a continuous integration tool serve two different purposes, and one cannot be substituted for the other. Check this video from Artifactory, one of the providers of artifact repositories, on why one should use an artifact repository.
Jenkins stores artifacts as plain files without versioning, while artifacts in an artifact repository can be version-controlled, so you have a lot more flexibility in retrieving and governing them. Read this very good article on why we need them; surely not all of those things are supported by continuous integration tools like Jenkins.
Moreover, you can also look at the Artifactory plugin for Jenkins which integrates the two.
An artifact repository is needed, but the artifact repo is a conceptual piece and not always a distinct tool. With Jenkins you should have MD5 signatures and (I think) a way of downloading the files you want (a web service call, right?) from your remote server. Certainly, if you're doing something simple like using the Jenkins build pipeline plugin, it should be able to access the right versions of the files smoothly.
Alternatively, if you are using a separate deployment tool, the better ones bundle an artifact repository.
Regardless, you want what the ITIL folks call a Definitive Media/Software Library: definitive in that the bits are secure, trusted, and official, and a library in that they can be easily looked up and accessed. When working with an artifact repository, you need to make sure it is adequately secure, that it is backed up, and that it is accessible for your deployments (including to production). If you look at Jenkins and it meets your criteria in those categories, consider yourself done. If it's lacking, and I wouldn't be surprised if it were, then you need either a dedicated tool like the Maven repos or something bundled with the deploy tooling.
For more of my rambling on the subject, there's a recorded webcast. The slides for that are up on Slideshare.
I haven't kept up to date with Jenkins; we still use a version of the CI server from when it was originally called Hudson.
In your projects, your poms should normally point to your own artifact repository, where you can fetch and deploy your own (company) projects.
Using an artifact repository with your CI server, it can deploy successfully built snapshots and releases, which then become available to other developers.
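For example, a pom-side pointer for resolving company artifacts (the URL is a hypothetical repository manager endpoint), with snapshot resolution enabled so that teammates' latest builds are picked up:

```xml
<!-- pom.xml fragment — fetch company artifacts, including snapshots, from the shared repo -->
<repositories>
  <repository>
    <id>company-repo</id>
    <url>https://nexus.example.com/repository/maven-public/</url>
    <snapshots>
      <!-- allow snapshot resolution for in-progress artifacts -->
      <enabled>true</enabled>
    </snapshots>
  </repository>
</repositories>
```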
I searched a lot in the Apache documentation and on ibiblio.org, and I could not find a decent, straight answer.
My questions:
When I download a jar via a Maven dependency (set up in the pom), how can I be sure that the file does not change on the remote repository? For example, if I'm using log4j version 1.2.3, downloaded from ibiblio.org (or any other repo for that matter), how can I be sure I'm getting exactly the same jar each time?
Does Maven delete jars from the local repository? Let's assume I'm not clearing the repository at all: will it fill up eventually, or does Maven have some kind of mechanism to clear out old jars?
By Maven's conventions, a released version like log4j 1.2.3 will never change. It stays in your local repository until you manually delete it. It can't be changed by anyone except the admins of Maven Central, and I suppose they wouldn't do such a stupid thing.
Furthermore, the download is by default done from Maven Central (repo1.maven.org/maven2) rather than from ibiblio.
One of the "tricks" in Maven is that a released artifact is downloaded only once, which improves your build performance, in contrast to SNAPSHOT dependencies, which are checked for updates again and again.
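Maven also downloads the SHA-1/MD5 checksums published next to each artifact. As a sketch, a repository definition can be made strict about mismatches, which is about as close as you get to a guarantee that the jar you receive has not been altered:

```xml
<!-- pom.xml fragment — fail the build on a checksum mismatch for release artifacts -->
<repositories>
  <repository>
    <id>central</id>
    <url>https://repo1.maven.org/maven2</url>
    <releases>
      <!-- "fail" instead of the default "warn" -->
      <checksumPolicy>fail</checksumPolicy>
    </releases>
  </repository>
</repositories>
```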
You could configure your own repository and point all your project poms at it. It's easy to configure your poms to use a different (private) repository, though I've never set one up myself. It doesn't seem too hard, other than managing it to keep all the needed artifacts available.
I have the following problem. We have a central Maven repository hosted on our company server. Our team is working on a project, and everyone here uses that repository to get the required artifacts. If something is missing that is required for the task a developer is currently dealing with, he installs that artifact manually into the central repository so that his commits don't break the automated builds.
Now, each developer also has Glassfish v2 installed on his machine, for testing and debugging purposes. Before committing his changes, the developer builds the .ear for the project with Maven's help. However, after the developer deploys the ear to his local Glassfish, frequent errors arise, because the set of Glassfish libraries may not contain all the latest dependencies from the central company repository.
Right now, in case of an error, the developer simply reads the log and sees what exactly is missing. After that, he manually copies the required jar into his local $GLASSFISH_HOME/lib dir. But that seems a little bit frustrating. How can this be done automatically?
Right now we are trying to implement the following solution: the developer synchronizes his local Maven repository, gathering all the artifacts the project requires from the central one. This local repository has to be placed on the Java classpath so that Glassfish also sees it. Is that a correct approach? Or maybe there is a way to install all the required artifacts from the central repository directly into the $GLASSFISH_HOME/lib dir, automatically, during deploy?
About having to install dependencies: if developers need to install dependencies that are missing from public Maven repositories, take into account that Maven proxies can usually cache public repos; for instance, Archiva has a proxying cache. If the dependencies are your own project deliverables, you should consider releasing them and deploying them with Maven to your company repo.
About latest versions: you need to tell Maven which versions of the dependencies to use. I would prefer editing my poms manually, but there is a variety of ways to achieve that.
The libraries should be part of the project, I think. If they are not standard Glassfish libraries, they should be included, for instance, in your war file as part of your project (see the sketch below). If they are not standard but also not part of your project (not the regular approach), consider managing this Glassfish as a project of its own (its own git/svn repo, own pom, own versions, own everything).
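For the "part of your war" option, any dependency left at the default compile scope is packaged into WEB-INF/lib by the war plugin, so nothing has to be copied into Glassfish by hand. A sketch (the dependency itself is just an example):

```xml
<!-- pom.xml fragment — compile-scope dependencies are bundled into WEB-INF/lib -->
<dependency>
  <groupId>org.apache.commons</groupId>
  <artifactId>commons-lang3</artifactId>
  <version>3.12.0</version>
  <!-- default (compile) scope: packaged into the war;
       use 'provided' only for libraries Glassfish itself supplies -->
</dependency>
```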
Good luck.