We want to maintain a common Maven repository for all the systems within our local network, i.e., there should not be a .m2 directory on every system, but rather one on a common server (say, with some local IP 172.<>).
Can this be achieved via a file transfer protocol or some other service?
Operating System: Windows
While this is actually possible (you can give Maven a settings.xml on the command line, so you can always point to the one on the network), I would strongly recommend against it:
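For illustration only, a minimal sketch of what that would look like (the IP and UNC paths here are hypothetical): each machine would run mvn -s \\172.16.0.10\maven\settings.xml clean install, and that shared settings.xml would point the local repository at a network share:
<settings xmlns="http://maven.apache.org/SETTINGS/1.0.0">
  <!-- hypothetical network share holding the shared repository -->
  <localRepository>\\172.16.0.10\maven\repository</localRepository>
</settings>
But as the next point explains, this breaks down under concurrent use.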
The Maven local repository is not thread-safe. When two builds run against it at the same time, anything might break, especially with SNAPSHOT versions. I speak from experience: we tried to have only one local repository on our build server, and we got wrong results in different builds.
If you want a repository for your team, you need a repository manager such as Nexus or Artifactory.
Related
Where are the Maven local repository and the pom.xml file kept in a real-world project if the code is on GitHub? I mean, can I keep my local repository on another machine and use it in my project? If yes, how?
Local repositories are not meant for sharing. They are also not "thread-safe" in any way, so accessing them simultaneously from two different builds might break things.
They are populated by the artifacts Maven downloads from Maven Central and other repositories, plus the stuff you build yourself. As they are more or less a cache, there is no need to share them.
If you need a repository that is used from different machines or by different users, set up a Nexus/Artifactory server.
Is it possible to specify a remote settings.xml file for Maven to use?
It would be convenient to update one settings.xml file in some remote location (a server), so the rest of the dev team wouldn't have to download it manually.
Quite likely you could do this using tricks outside Maven itself, such as symlinking or, as mentioned before me, sharing it through a repository.
But you probably should not. The settings.xml file is used for local settings, specific to your machine. You use it, for example, to specify the location of your local application server or a local database connection. You would have to force every user to use the same file system layout and server setup, which would probably cause more hassle than a shared settings.xml would save.
The proper way to share settings across a project is to include them in the project's pom. If you want to share across a team or organisation regardless of project, you can use a parent pom, or even several layers of them.
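As a hedged sketch of that approach (the coordinates here are made up): a team-wide parent POM holds the shared, machine-independent configuration, and each project references it:
<!-- team-parent pom.xml (hypothetical coordinates), deployed once to the team's repository -->
<project xmlns="http://maven.apache.org/POM/4.0.0">
  <modelVersion>4.0.0</modelVersion>
  <groupId>com.example.team</groupId>
  <artifactId>team-parent</artifactId>
  <version>1.0.0</version>
  <packaging>pom</packaging>
  <properties>
    <!-- shared settings live here instead of in each user's settings.xml -->
    <maven.compiler.release>17</maven.compiler.release>
  </properties>
</project>
<!-- each project's pom.xml then inherits from it -->
<parent>
  <groupId>com.example.team</groupId>
  <artifactId>team-parent</artifactId>
  <version>1.0.0</version>
</parent>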
The simple answer is no, because the settings.xml defines the configuration for accessing remote resources; furthermore, it can contain passwords/keys, etc., which would not make sense to store remotely.
You can, however, create a Git repository which contains ${HOME}/.m2/, including the settings.xml as a template, so that onboarding is simpler.
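A sketch of such a template (the server id and environment variable names are made up; Maven interpolates ${env.*} references in settings.xml), so credentials stay out of the repository:
<settings xmlns="http://maven.apache.org/SETTINGS/1.0.0">
  <servers>
    <server>
      <!-- id must match the mirror/repository id used by the build -->
      <id>company-nexus</id>
      <username>${env.NEXUS_USER}</username>
      <password>${env.NEXUS_PASS}</password>
    </server>
  </servers>
</settings>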
I want to set up a development environment that allows reusing artifacts from public Maven repositories like Maven Central and Codehaus. Specifically, I like the concept of transitive dependencies.
In our company, our production network cannot export any data outside, but we can push data inside. We already have some gateways to copy files from the outside into our network. I could use these to copy the required packages manually, but then we would miss the power of Maven. In our case, the perfect solution would be to be able to get data from public repositories while being forbidden to deploy to external repos.
So I would like to have your expert view on this problem.
We can use various means, as long as it is guaranteed that no data can be exported outside our network:
External packages are created on a disk area that is read-only from production servers.
Some HTTP requests are filtered.
Using a repository manager, such as Nexus.
In the repository management guide, Nexus mentions this possibility (http://books.sonatype.com/nexus-book/reference/confignx-sect-manage-repo.html). I would like confirmation from you about how secure it is. Specifically, it must be updatable only by the IT manager.
Regards,
Loïc.
This is completely feasible and a common setup with Nexus. Here are the steps, roughly:
Lock all developers and the CI server inside the network, disallowing direct access to outside servers
Setup Nexus to proxy external repositories like Central as desired
Allow Nexus to reach those external repositories via the proxy
Configure developer and CI server machines to access Nexus to get the dependencies (and transitive dependencies) as desired; see the settings.xml sketch after this list
Optionally you can also
Configure CI servers to deploy any internal packages to Nexus
Configure deployment tools to get components for deployment from Nexus
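To illustrate step 4, a sketch of the client-side settings.xml on each developer and CI machine (the hostname and the maven-public group URL follow common Nexus defaults, but they are assumptions here):
<settings xmlns="http://maven.apache.org/SETTINGS/1.0.0">
  <mirrors>
    <mirror>
      <id>company-nexus</id>
      <!-- route every repository request through the internal Nexus group -->
      <mirrorOf>*</mirrorOf>
      <url>http://nexus.internal.example:8081/repository/maven-public/</url>
    </mirror>
  </mirrors>
</settings>
For the optional deployment step, projects would additionally declare a <distributionManagement> section pointing at hosted repositories in Nexus, and the CI server would run mvn deploy.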
Also note this can be done with different repository formats and toolchains. The most common one is Maven, but Nexus also supports npm, NuGet, RubyGems, sites, YUM and others.
And if you want to make some of your packages in Nexus available to the outside, you can configure this as well, with multiple options to choose from.
Also note that a proxy repository is by definition read-only in terms of direct deployments to it. That's what a hosted repository is for...
I have small open-source projects hosted on GitHub which I want to make available to others via Maven. I have a small webspace where I can host static files. How can I create a repo there? Also, I would want to remove old snapshots from it if possible.
Standard Maven repository implementations are almost all Tomcat web apps. Each of them is backed by a static repository, just like your local repository. The webapp serves the purpose of searching and managing the artifacts stored in that static repository.
If you want to host the repository with static web access only, you'll have to perform the management manually and provide a manually generated static HTML page that lists the GAV coordinates of all artifacts in the repo. No user but you could ever upload to the repository unless you give out your password or enable anonymous FTP access.
If Maven doesn't try to upload anything to the repo until the deploy phase, then this approach is still partly usable; only mvn clean deploy would fail.
You can check whether it is doable like this (I assume you have those projects in your local repo):
upload your local repository folder to a URL
for the purpose of testing, mirror your central repo to that URL
try to build your project with dependencies from your repo
Open your settings.xml file and under the <mirrors> node add:
<mirror>
  <!-- an id so the mirror can be matched to server credentials if needed -->
  <id>my-static-repo</id>
  <url>http://your/url/repo</url>
  <mirrorOf>*</mirrorOf>
</mirror>
and see if mvn clean install succeeds. Please give feedback.
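As an alternative to the catch-all mirror test, consumers can also declare the static repo directly in their POM (a sketch; the URL is a placeholder):
<repositories>
  <repository>
    <id>my-static-repo</id>
    <url>https://example.com/maven/repo</url>
  </repository>
</repositories>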
In this SO answer I have outlined the way I set up my OSS projects, which are all hosted on GitHub. There are actually a number of free services out there you could use when you would like to run an OSS project.
I would recommend publishing to Maven Central, if your plugin is well-tested and expected to bring other people benefits as well. You can use CloudBees' BuildHive as a free Jenkins CI.
A static repo works great, per my experience.
I scp'd my local repository up to a static Apache server. A legit repo. Not as easy to maintain as a real repo, of course, but quite a bit cheaper if you've already got a plain vanilla web host.
Other than setting the permissions properly (same as required for you to browse the folders), it was a pretty painless procedure.
The only two things I did to make it more reasonable were
1 - Wrote a script to "rm -rf ...." most of the contents of my local repo, so that the only things I am deploying are those few artifacts that are not available in the general repos.
2 - Tarred it up before scp'ing it to my web host.
Hope this helps.
The guy below did something similar, only using FTP, which saves him a lot of manual work if he updates his binaries often.
http://stuartsierra.com/2009/09/08/run-your-own-maven-repository
I think I know how to do it now: I use mvn deploy to create a repository on the file system and then upload it to the webserver. If I'm not wrong, there doesn't even need to be a file listing.
The command I'm using is:
mvn deploy -DaltDeploymentRepository=local::default::file:./repo
This creates/updates the local repository automatically, so the repo can be synced with a server.
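If you'd rather not pass the flag on every invocation, the same thing can be configured in the POM (a sketch; the id and directory are arbitrary), after which a plain mvn deploy does the job:
<distributionManagement>
  <repository>
    <id>local-static-repo</id>
    <!-- deploy into a local directory that is then synced to the web host -->
    <url>file:./repo</url>
  </repository>
</distributionManagement>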
Hudson provides the option to have a Maven build job use a private local repository, or use the common one from the Maven installation, i.e. one shared with other build jobs. I have the sense that our builds should use private local repositories to ensure that they are clean builds. However, this causes performance issues, particularly with respect to the bandwidth of downloading all dependencies for each job -- we also have the jobs configured to start with a clean workspace, which seems to nuke the private Maven repo along with the rest of the build space.
For daily, continuous integration builds, what are the pros and cons of choosing whether or not to use a private local maven repository for each build job? Is it a big deal to share a local repo with other jobs?
Interpreting the Jenkins documentation, you would use a private Maven repository if:
You end up having builds incorrectly succeed just because you have all the dependencies in your local repository, despite the fact that none of the repositories in the POM might have them.
You have problems with concurrent Maven processes trying to use the same local repository.
Furthermore:
When using this option, consider setting up a Maven artifact manager so that you don't have to hit remote Maven repositories too often.
Also, you could explore your SCM's clean option (rather than a workspace clean) to avoid this repository getting nuked.
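For reference, the Jenkins option corresponds to Maven's maven.repo.local property, so you can reproduce a private-repository build outside Jenkins with something like (the directory name here is arbitrary):
mvn -Dmaven.repo.local=.repository clean install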
I believe Sonatype recommends using a local Nexus instance, even though their own research (State of the Software Supply Chain report, 2015) shows that less than 5% of traffic to Maven Central comes from such repositories.
To get back to the question: assuming you have a local Nexus instance and high-bandwidth connectivity (tens of Gbps at least) between your build server (e.g. Jenkins) and Nexus, I can see few drawbacks to using a private local repo; in fact, I would call the decrease in build performance a reasonable trade-off.
That said, what exactly are we trading off? We accept a small performance penalty on the downside, and on the upside we know with certainty that independent, clean builds against our local Nexus instance as proxy work.
The latter is important: consider the case where the local repo on the build server (probably in the Jenkins user's home directory) has an artefact that is not cached in Nexus (not improbable if you started off your builds against Maven Central). This out-of-sync state is suboptimal because your cache TTL settings in Nexus can mean that builds fail if Nexus' upstream connectivity to Central goes down temporarily.
Finally, to add more to the benefits side of the trade-off: I spent hours today getting an artefact into the shared Jenkins user's .m2/repository. Earlier in the day, upstream connectivity to Central was up and down for hours (a mysterious issue in an enterprise context). In the end I deleted the entire shared Jenkins user's .m2/repository so it would all be retrieved from the local Nexus.
It's worth considering having some builds use the shared .m2/repository (in the Jenkins user's home directory) as well as builds using private local repositories (fast and less fast builds). In my case, however, I may opt for private local repositories only in the first instance -- I may be able to accept the penalty if I optimise the build by focusing on low-hanging fruit (e.g. splitting up a multi-module build).