Since it is sometimes really slow to download jars from the Maven Central repository, I want to back up my local Maven repository and restore it on another computer, maybe even on another OS.
I'm working on Windows 7 at the moment, so do I just have to zip all the repository files in "C:\Users\[username]\.m2\repository" and unzip them to the repository location on the other computer to restore it? Or do I have to do something special?
Thanks for your answer.
As far as I know, that is all you have to do; Maven does not store any special information in the repository. If you do this frequently, you could also use something like rsync to reduce the time and load involved.
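As a rough sketch (assuming a Unix-like shell or Cygwin on the Windows side; the host name and paths are placeholders), the rsync call could look like this:

    # push the whole local repository to another machine, transferring only changed files
    rsync -avz "$HOME/.m2/repository/" user@other-machine:.m2/repository/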
And if you have many machines that need basically the same set of jars, consider using a repository manager such as Artifactory. You can set it up to work as a transparent cache, and you can also deploy your own artifacts to it.
You can just make a zip and copy it to another system. If you can't unzip the archive into the default Maven repository location, you can set the new location in your settings.xml, for example as shown below.
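A minimal settings.xml fragment along these lines (the path is just a placeholder for wherever you unpacked the archive) points Maven at a non-default local repository:

    <!-- %USERPROFILE%\.m2\settings.xml on Windows, ~/.m2/settings.xml elsewhere -->
    <settings>
      <!-- hypothetical location of the restored repository -->
      <localRepository>D:/maven/repository</localRepository>
    </settings>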
I would suggest downloading a repository manager like Nexus or Archiva and using that. Then you can just drop your local repository and it will be downloaded again the next time you build.
I have created a Maven plugin that depends on a series of JARs. I have uploaded these JARs and POMs to the relevant location on the server under nexus-data/blobs/maven-thirdparty/{group-id}.
When I run the tasks "Rebuild Maven repository metadata" and "Rebuild repository index", these files do not appear when I try to browse them at http://{nexus-server}:port/#browse/browse/assets.
How can I get the Nexus server to recognize the files in the repository?
I will look into that in the future. I posted this same question to the Sonatype forum and they pointed me to a utility that assists with this, and it worked perfectly.
https://github.com/simpligility/maven-repository-tools/tree/master/maven-repository-provisioner
Thanks to Peter Lynch - https://support.sonatype.com/hc/en-us/articles/236210187-How-do-I-export-import-a-Maven-2-format-repository-over-HTTP-
What you are trying to do will not work, although it did in Nexus Repo 2. There is currently no mechanism for adding jars into the blobstore in this way. We've intentionally added the blobstore and everything that goes along with it so that we can do more fun things with searching, metadata, etc. that were much more difficult with a system that just has files on a path. What I might suggest instead is using something like this:
Nexus Exchange - Nexus Repository Import Scripts
GitHub Repo for Scripts
That should help you get the JARs into Nexus Repository 3, into a repo of your choice. I maintain that repo, so if you run into issues, create an issue and I'll see what I can do to help you out! ~Sonatype Community Nerd
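If you only need to push a handful of files by hand, a plain HTTP upload to a Maven 2 hosted repository also works in Nexus Repository 3. A hedged sketch (repository name, coordinates and credentials below are all placeholders):

    curl -u admin:admin123 --upload-file mylib-1.0.0.jar \
      "http://{nexus-server}:port/repository/maven-thirdparty/com/example/mylib/1.0.0/mylib-1.0.0.jar"
    curl -u admin:admin123 --upload-file mylib-1.0.0.pom \
      "http://{nexus-server}:port/repository/maven-thirdparty/com/example/mylib/1.0.0/mylib-1.0.0.pom"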
I would like to know the difference between generating artifacts from the cache and from updates.
Does generating artifacts fetch from the remote repository if the artifact is not available in the local repository?
I did not find any relevant posts about this.
I do not know if I understand your question correctly; tell me if I should delete the answer.
In Maven, there are remote repositories and the local repository (the local one is under the .m2 folder).
The moment you build a project, the dependencies needed to build it are downloaded from the remote repositories and saved into the local repository. So the next time you build a project with that dependency, there is no need to download it again, because it is already in your local repository.
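As a concrete illustration (the coordinates below are just an example dependency), a declaration like this in a pom.xml ends up cached under a predictable path in the local repository:

    <dependency>
      <groupId>org.apache.commons</groupId>
      <artifactId>commons-lang3</artifactId>
      <version>3.12.0</version>
    </dependency>
    <!-- after the first build the jar lives locally at:
         ~/.m2/repository/org/apache/commons/commons-lang3/3.12.0/commons-lang3-3.12.0.jar -->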
There are people who say that the cache and the local repository are the same. But, for example, if you use Eclipse and go into the .m2 folder, you can see a .cache folder; there should be an m2e folder inside it. There Eclipse saves the indexes it uses to manage dependencies.
I have a Maven repository folder on our network drive, which contains all the artefacts we use.
Everyone in the office uses a standard settings.xml file on their local Maven setup which contains the location of that network drive as a remote repository.
In this way, we keep the network Maven repository folder updated, so everyone's local environment simply downloads from that central repository folder, which avoids everyone re-downloading from the internet.
We are busy developing a new library, let's say "MyLib 1.0.0". We install it into the central network Maven repository folder, and everyone's local Maven projects use that dependency in their project(s). But we have not officially released "MyLib 1.0.0" yet; it is still a work in progress, so each time we make further updates to it, we overwrite the "MyLib 1.0.0" artefact in the central repository.
The problem is that because everyone's local Maven has already downloaded the artefact into their local Maven repository, it won't be re-downloaded; it already exists. I don't want to increase the version of "MyLib" yet because it is not an official release, and I also don't want everyone to have to change their dependency version in their pom.xml files. I just want to replace the "MyLib 1.0.0" file and have everyone's local Maven download and overwrite their local copy automatically. (At the moment everyone has to be told to go and remove the artefact from their local Maven repository manually, at which point it will re-download the latest copy of "MyLib 1.0.0".)
What is best practice for the above, or how can I go about achieving this?
Based on your description, two things come to mind.
First:
Stop using a network drive. Better to start using a repository manager.
Apart from that, a network drive is damned slow.
Second:
In Maven a release is immutable, which means that if you have version 1.0 and deploy it to your repository, it will never be changed again. If you need to make a change to it, you have to use a different version, for example 1.0.1.
You should start using SNAPSHOTs instead of releases while you are developing; not doing so is exactly what is causing your problem:
"At the moment everyone has to be told to go and remove the artefact from their local Maven repository manually, at which point it will re-download the latest copy of 'MyLib 1.0.0'"
This shows you are working against the concepts of Maven.
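A minimal sketch of what that looks like in practice (coordinates are illustrative): the work-in-progress library advertises itself as a snapshot, and consumers reference that snapshot version, so Maven knows the artifact may change and will re-check the repository according to the update policy (or immediately with mvn -U).

    <!-- pom.xml of the library that is still under development -->
    <groupId>com.example</groupId>
    <artifactId>mylib</artifactId>
    <version>1.0.0-SNAPSHOT</version>

Once it is ready for an official release, you change the version to 1.0.0 and never touch that artifact again.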
And third: it sounds like you don't use continuous integration for your builds.
I know how to do it for an external repository, but not for my local repository, since I don't have a <repository> entry for my local repository in my settings.xml.
I use snapshot versions for my sub-projects, so when I rebuild the parent project I want Maven to get all the sub-projects' snapshot versions from my local repository, not just once a day (which seems to be what happens by default) but always.
If I'm understanding your comment, I think @FrVaBe may have the correct answer. When you change code for a child project on your development machine, it's up to you to rebuild the snapshot and get it into your local artifact repository (via mvn install) so it's available for the parent project to use.
If, however, you want your parent project build to pull in changes made by your teammates and published to the corporate remote repository more often than once per day, read on.
Here is a summary of how Maven Central (and kin), remote repositories (e.g. a company instance of Nexus or Artifactory) and your local repository work together. If you always want the latest version of snapshots to be downloaded on every build, go into your settings.xml file, find the <snapshots> section of the repository containing the snapshot you want, and change the <updatePolicy> value to "always". Personally, I rarely do this; I simply add the -U option to my mvn command line when I want to ensure I have the latest version of a snapshot from my remote repo.
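In settings.xml that repository entry (inside a profile) would look roughly like this; the id and URL are placeholders for your corporate repository:

    <repository>
      <id>company-snapshots</id>
      <url>https://repo.example.com/maven/snapshots</url>
      <snapshots>
        <enabled>true</enabled>
        <updatePolicy>always</updatePolicy>
      </snapshots>
    </repository>

The command-line equivalent is simply mvn -U clean install.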
There is no update policy for the local repository!
The local repository is just a bunch of files. When you install to your local repository, your local projects already reference the artifacts directly. There is no update that needs to be performed, except that maybe your IDE needs to be refreshed to pick up the newer files.
In this manner you can build local snapshots all day long with no versioning headaches, no updates required and no old artifacts left hanging around afterwards. Nice and clean but not so obvious if you're new to Maven and still getting to grips with all these repositories and their fancy update mechanisms.
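The day-to-day loop, assuming hypothetical module directory names, is nothing more than:

    # rebuild the changed child and publish the snapshot to the local repository
    cd my-child-module
    mvn clean install

    # the parent build now resolves the fresh snapshot straight from ~/.m2/repository
    cd ../my-parent-project
    mvn clean package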
I think you misunderstood something. Maven will always take the latest/newest SNAPSHOT from your local repository. But in your project setup (Project Inheritance) you need to build the sub-projects on their own if you changed something.
An automatic build of the sub-projects only happens with a Project Aggregation layout.
The difference is explained in the Project Inheritance vs Project Aggregation section of the documentation.
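Roughly, an aggregation (multi-module) layout means the parent pom.xml lists its children as modules (the names below are illustrative), so building the parent builds them all; with pure inheritance the children only declare a <parent> and must be built individually:

    <!-- parent/aggregator pom.xml -->
    <groupId>com.example</groupId>
    <artifactId>parent</artifactId>
    <version>1.0.0-SNAPSHOT</version>
    <packaging>pom</packaging>

    <modules>
      <module>child-a</module>
      <module>child-b</module>
    </modules>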
I am currently working on a project that requires a fair number of libraries. I have also installed some libraries into the local repository. The next question is how I should write my pom.xml to tell Maven which ones should be pulled from the Internet and which should be pulled from the local system.
You don't. All repositories are equal in the eyes of Maven. You write ordinary <dependency/> elements in your POM, and they work the same either way. Maven always searches the local repository first, with no special <repository/> elements needed.
You might find it more convenient in the long term to install a repository manager to share local deployments (Artifactory, Nexus, etc.) than to be running install:install-file all the time.
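For reference, manually installing a third-party jar into the local repository is a single command (the file name and coordinates below are placeholders):

    mvn install:install-file \
      -Dfile=vendor-lib-2.1.jar \
      -DgroupId=com.vendor \
      -DartifactId=vendor-lib \
      -Dversion=2.1 \
      -Dpackaging=jar

After that, the library is referenced with an ordinary <dependency> element just like anything downloaded from Maven Central.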