Can Nexus/Artifactory store copies from a public repository? - maven

The requirements are as follows. We need copies of the binaries our projects depend on stored on our own repository server. We can't just proxy the public repository because we have had several cases in the past where binaries on the public repository were changed without the release number changing, and we want to avoid the problems that causes. We therefore want to specify manually when a binary is downloaded from the public repository and when it is updated. No changes are ever to be made to a binary stored on our repository server without manual interaction.
Is there a way to achieve this? I.e. to say "I want artifacts X, Y, Z" copied to my repository server (preferably including their dependencies). Is this possible with either Nexus or Artifactory?

Yes. In Nexus define your own local repository, manually download the versions you want and add them to your repository. You may have to set up "manual routing" for dependency resolution to ensure that Nexus consults the repos in the correct order.
Then make sure your pom files refer to the specific versions you have downloaded.
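For illustration, here is roughly how a manually downloaded artifact can be pushed into such a hosted repository with the Maven deploy plugin; the coordinates, URL and repositoryId below are placeholders for your own setup (the repositoryId must match a <server> entry with credentials in your settings.xml):

mvn deploy:deploy-file \
  -Dfile=commons-lang3-3.12.0.jar \
  -DgroupId=org.apache.commons \
  -DartifactId=commons-lang3 \
  -Dversion=3.12.0 \
  -Dpackaging=jar \
  -Durl=https://nexus.example.com/repository/thirdparty-releases/ \
  -DrepositoryId=thirdparty-releases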

One thing that will make this a little easier is that you can place the downloaded artifacts directly into the local storage directory of a Nexus repository (you don't need to upload them into Nexus).
See here for details: https://support.sonatype.com/entries/38605563

Related

How to use Artifactory as a local repository for Go modules

I have Artifactory set up and working, serving other artifacts (RPM, etc)
I would like to have local copies of public and private Go programs and libraries
to ensure version consistency
to give public repositories time to get bugs out
to be protected from unauthorized alterations in public repositories
I've created a Go repository in Artifactory and populated it with, as an example, spf13/viper using the JFrog CLI (which created a zip file and a mod file).
Questions:
Is the zip file the proper way to store Go modules in Artifactory?
How does one use the zip file in a Go program? E.g. the URL to get the zip file is http://hostname/artifactory/reponame/github.com/spf13/viper/@v/v1.6.1.zip (and .mod for the mod file). For instance, do I set GOPATH to some value?
Is there a way to ensure that all of a package's requirements are automatically included in the local Artifactory repository at the time the primary package (e.g. viper) is added to it?
Answering the 3rd question first -
Here's another article that will help - https://jfrog.com/blog/why-goproxy-matters-and-which-to-pick/. There are two ways to publish private go modules to Artifactory. The first is the traditional way, i.e. via the JFrog CLI, which is highlighted in another article.
Another way is to point a remote repository at a private GitHub repository; this capability was added recently. In this case, the virtual repository will have two remotes: the first remote repository defaults to GoCenter, via which public go modules are fetched, and the second remote repository points to the private VCS system.
Setting GOPROXY to ONLY the virtual go modules repository will ensure that Artifactory continues to be the source of truth for both public and private go modules. If you want to store compiled go binaries, you can use a local generic repository, but I would advise using a custom layout to structure the contents of a generic repository.
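As a minimal sketch of that setup (the virtual repository name go-virtual, hostname and credentials are placeholders; Artifactory exposes Go repositories under its api/go/... endpoint):

# resolve all modules through the Artifactory virtual repository only
export GOPROXY=https://myuser:mytoken@artifactory.example.com/artifactory/api/go/go-virtual
# fetch a module; it is cached in Artifactory on first request
go get github.com/spf13/viper@v1.6.1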
Answering the first 2 questions -
Go modules are Golang's dependency management system, similar to what Maven is for Java. In Artifactory, there are 3 files for every go module version: go.mod, .info, and the archive file.
Artifactory follows the GOPROXY protocol, hence the dependencies mentioned in go.mod will be automatically fetched from the virtual repository. This includes the archive file too, which is a collection of packages (source files).
There's additional metadata that's stored for public go modules such as tile and lookup requests since GoSumDB requests are cached to ensure that Artifactory remains the source of truth for modules and metadata even in an air-gapped environment.
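For reference, these are the GOPROXY protocol paths a proxy such as Artifactory serves for the viper example, relative to the GOPROXY base URL (hostname and repository name omitted here):

github.com/spf13/viper/@v/list          # available versions, one per line
github.com/spf13/viper/@v/v1.6.1.info   # version metadata (JSON)
github.com/spf13/viper/@v/v1.6.1.mod    # the go.mod for that version
github.com/spf13/viper/@v/v1.6.1.zip    # the module source archive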

Can I keep Maven local repository on another machine and use it in my project?

Where are the Maven repository and the pom.xml file kept in a real project if the code is on GitHub? I mean, can I keep my local repository somewhere on another machine and use it in my project? If yes, how?
Local repositories are not meant for sharing. They are also not "thread-safe" in any way, so accessing them simultaneously from two different builds might break things.
They are populated by the artifacts Maven downloads from MavenCentral and other repositories, and also the stuff you build yourself. As they are more or less a form of cache, there is no need to share them.
If you need a repository that is used from different machines or by different users, set up a Nexus/Artifactory server.
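Once such a server exists, each machine points its builds at it via settings.xml; a minimal sketch with a placeholder id and URL:

<settings>
  <mirrors>
    <mirror>
      <id>company-repo</id>
      <mirrorOf>*</mirrorOf>
      <url>https://repo.example.com/repository/maven-public/</url>
    </mirror>
  </mirrors>
</settings>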

Sharing Maven local repository

Maven stores all jars under the local repository ~/.m2/repository/. This occupies a lot of space when there are many users.
So, is it possible to share this local repository between multiple users, perhaps under a different directory structure?
Simple answer: no. The local repository is, as the name implies, for a single user, not for multiple users. Apart from that, Maven itself is not designed for this, and it will usually lead to problems.
Of course you can share the local repository. You just need a folder to which all of you have write permission, e.g. /local/.m2/repository. To share this folder, each of you can change the local repository in your settings.xml as below:
<localRepository>/local/.m2/repository</localRepository>
Alternatively, you can all use the settings.xml in ${MAVEN_HOME}/conf, and then you don't need to change each user's private settings.xml.
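For context, a minimal settings.xml carrying only that setting (the path is just an example) looks like:

<settings xmlns="http://maven.apache.org/SETTINGS/1.0.0">
  <localRepository>/local/.m2/repository</localRepository>
</settings>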

Guidelines when splitting artifact repositories

I am looking for an article which describes a set of guidelines to follow when creating repositories in an artifact repository manager.
I know that:
You need to keep snapshots in snapshot repositories.
You need to keep releases in release repositories.
Third-party artifacts should be in a separate repository (the same goes for forked/patched versions of third-party libraries).
It's generally a good idea to prefix the names with int-* and ext-*.
Usually different product lines end up having their own repositories as sometimes their artifacts don't depend on each other.
I've been trying to find an article on this to illustrate to a client how this separation of artifacts into repositories is handled by other companies and organizations.
Many thanks in advance!
I am not aware of the existence of such an article, but as @tieTYT mentioned, you can look at the Artifactory default repositories. They reflect years of experience in binary management, continuous integration and delivery.
Those practices still apply even if you use Nexus (and you can observe them without installing Artifactory by looking at the public JFrog Artifactory instance at http://repo.jfrog.org).
For your convenience, here are the defaults (important usage emphasised):
Local Repositories:
libs-snapshot-local: Deploy here your local snapshots
libs-release-local: Deploy here your local releases
ext-snapshot-local: Deploy here 3rd-party snapshots which aren't available in remote repos
ext-release-local: Deploy here 3rd-party releases which aren't available in remote repos
plugins-snapshot-local: Deploy here your plugin (usually, maven) snapshots
plugins-release-local: Deploy here your plugin (usually, maven) releases
Remote Repositories:
jcenter: proxy of http://jcenter.bintray.com. Normally, that's the only remote repo you'll need. It includes whatever exists in maven central plus all other major maven repositories
Virtual Repositories:
remote-repos: aggregation of all the remote repositories
libs-release: this is the resolution repository for release builds. It includes remote-repos, libs-release-local and ext-release-local
libs-snapshot: this is the resolution repository for snapshot builds. It includes remote-repos, libs-snapshot-local and ext-snapshot-local
repo: this is a special virtual repository that aggregates everything. Generally, do not use it if you ever plan on building a release pipeline using a binary repository.
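As an illustration of how the resolution repositories are consumed (the hostname is a placeholder; this roughly matches the kind of settings.xml Artifactory can generate for you), a Maven client would resolve releases and snapshots through the two virtual repositories:

<settings>
  <profiles>
    <profile>
      <id>artifactory</id>
      <repositories>
        <repository>
          <id>libs-release</id>
          <url>https://artifactory.example.com/artifactory/libs-release</url>
          <snapshots><enabled>false</enabled></snapshots>
        </repository>
        <repository>
          <id>libs-snapshot</id>
          <url>https://artifactory.example.com/artifactory/libs-snapshot</url>
          <snapshots><enabled>true</enabled></snapshots>
        </repository>
      </repositories>
    </profile>
  </profiles>
  <activeProfiles>
    <activeProfile>artifactory</activeProfile>
  </activeProfiles>
</settings>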
I'll be glad to advise on specific questions.
As is the case with many questions about best practices, the answer is: It depends.
Technically there are only two distinctions that are required:
Snapshot vs release repo
Hosted vs proxy repository
Snapshot vs release repositories are a required distinction since the Maven repository format, and therefore Maven and other build tools, differentiate how they work with the metadata and what they do during upload.
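The distinction is visible, for example, in a project's deployment configuration, where release and snapshot targets are declared separately (the ids and URLs below are placeholders):

<distributionManagement>
  <repository>
    <id>releases</id>
    <url>https://repo.example.com/repository/maven-releases/</url>
  </repository>
  <snapshotRepository>
    <id>snapshots</id>
    <url>https://repo.example.com/repository/maven-snapshots/</url>
  </snapshotRepository>
</distributionManagement>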
For proxy repositories you will just have to add as many you need to proxy. This will depend on what components you require and will be separate for proxying snapshot and release repos.
For hosted repositories you also have to have separate snapshot and release repos. Beyond that, it is all up for grabs. Having a separate third-party repo as preconfigured in Nexus (and Artifactory) and other setups is certainly useful, but not really necessary. You can have all those distinctions sorted out by internal metadata where required.
Along the same lines you can have one release repo for everyone or one for each team or whatever. You can still apply access rights within those repositories to separate access and so on in Nexus with repository targets. I assume Artifactory and Archiva can do something similar. The question here mostly boils down to ease of administration, backups, security setup and access for users.
Naming conventions like you mentioned can help if you want to have separate repositories, but technically none of this is necessary.
Other things I have seen are e.g. migration repos that are used to migrate legacy project libraries into a repo but become frozen after the migration is done, separate repos per team, separate repos per project and so on. Another aspect is separate repos for different levels of approval and so on (e.g. check out the problems with that at http://blog.sonatype.com/people/2013/10/golden-repository/)
In the end, however, this all really hinges on usability and metadata and is not required. Ultimately these repositories will in most cases be grouped together and accessed via one group, which flattens out the whole separation. And access rights still carry through into the group, so everything can still be controlled as you like. So it turns out to be a matter of taste in how you want to slice, dice and manage it.
PS: I am referring to the Maven repositories and format. Once you add a whole bunch of other formats into the mix and wrappers around them exposing them in other formats, everything gets more complicated, but the ideas behind things stay similar.

How to host a maven repo without Nexus

I have small open-source projects hosted on Github which I want to make available for others via Maven. I have a small webspace where I can host static files. How can I create a repo? Also, I would want to remove old snapshots from there if possible.
Standard maven repository implementations are almost all Tomcat web apps. Each of them sits on top of a static repository, just like your local repository; the webapp serves the purpose of searching and managing the artifacts stored in that static repository.
If you want to host the repository with static web access only, you'll have to perform the management manually and provide a static, manually generated HTML page that lists the GAV coordinates of all artifacts in the repo. No one but you could ever upload to the repository unless you give out your password or enable anonymous FTP access.
Since Maven doesn't try to upload anything to the repo until the deploy phase, this approach is still partly usable, although running mvn clean deploy would fail.
You can check whether it is doable like this (I assume that you have those projects in your local repo):
upload your local repository folder to a URL
for the purpose of testing, mirror your central repo to that URL
try to build your project with dependencies from your repo
Open your settings.xml file and under <mirrors> node add:
<mirror>
  <id>my-static-repo</id>
  <url>http://your/url/repo</url>
  <mirrorOf>*</mirrorOf>
</mirror>
and see if mvn clean install succeeds. Please give feedback.
In this SO answer I have outlined the way I set up my OSS projects, which are all hosted on Github. There are actually a number of free services out there you could use when you would like to run an OSS project.
I would recommend publishing to Maven Central if your plugin is well-tested and expected to benefit other people as well. You can use CloudBees' BuildHive as a free Jenkins CI.
A static repo works great, per my experience.
I scp'd up my local repository into a static apache server. Legit repo. Not as easy to maintain as a real repo of course, but quite a bit cheaper if you've already got a plain vanilla web host.
Other than setting the permissions properly (same as required for you to browse the folders), it was a pretty painless procedure.
The only two things I did to make it more reasonable were
1 - Wrote a script to "rm -rf ...." on most of the contents of my local repo so that the only thing I am deploying is those few artifacts that are not available in the general repos.
2 - Tarred it up first before scping to my web host.
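A rough sketch of that prune-and-upload step (paths, host and the pruning rule are made up here; the original script's exact contents weren't given):

# copy the local repo, prune unwanted entries (here: snapshot versions), then ship it
cp -r ~/.m2/repository repo-export
find repo-export -type d -name '*-SNAPSHOT' -prune -exec rm -rf {} +
tar czf repo-export.tar.gz repo-export
scp repo-export.tar.gz user@webhost.example.com:/var/www/maven/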
Hope this helps.
The guy below did something similar, only using FTP which saves him a lot of hand work if he updates his binaries very often.
http://stuartsierra.com/2009/09/08/run-your-own-maven-repository
I think I know how to do it now. I'm using mvn deploy now to create a local repository on the file system and then I upload it to the webserver. If I'm not wrong, there doesn't even need to be a file listing.
The command I'm using is:
mvn deploy -DaltDeploymentRepository=local::default::file:./repo
This creates/updates the local repository automatically, so the repo can be synced with a server.
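Consumers can then reference the uploaded repo from their pom.xml; a minimal sketch with a placeholder URL:

<repositories>
  <repository>
    <id>my-static-repo</id>
    <url>https://www.example.com/repo</url>
  </repository>
</repositories>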
