Artifactory Remote Repository - caching

I'm new to Artifactory. I'm trying to understand better how the remote repository cache mechanism works.
Right now I have 3 imported remote repositories in the system:
google-code-cache.
java.net-cache.
jcenter-cache.
With the first two (google-code and java.net), after I go to the remote URL and download an artifact, it seems to be saved in the repository cache.
With jcenter that doesn't work: even when I download a file, it doesn't end up in the cache.
Can anyone help me understand the logic behind the cache mechanism better - when does it save artifacts to the cache and when doesn't it?
Thanks,
Nadav

First of all, with JCenter you don't need the other repositories. You'll be able to retrieve all the artifacts from java.net and google-code directly from jcenter.
Second, the easiest way to configure remote repositories is using the configuration import functionality. Just click on the Import button. The default server to import from will be repo.jfrog.org, which is a good choice, since it has a lot of remote repos to choose from. Select 'jcenter' from the list of proposed repositories and it will be added to your Artifactory and configured correctly.
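If you prefer to script the setup instead of using the UI, a remote repository can also be created through Artifactory's repository configuration REST API. The sketch below assumes an admin user, the default port, and only the minimal JSON fields; check the documentation for your Artifactory version for the full schema:

# Create a 'jcenter' remote repository that proxies Bintray's JCenter (host and credentials are placeholders)
curl -u admin:PASSWORD -X PUT "http://ARTIFACTORY_HOST:8081/artifactory/api/repositories/jcenter" \
     -H "Content-Type: application/json" \
     -d '{"rclass": "remote", "packageType": "maven", "url": "https://jcenter.bintray.com"}'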

Related

Artifactory - Problems with overit-central and This item is not cached

We've been experiencing problems with our overit-central repository on Artifactory. Some elements appear as "not cached", and when we try to retrieve them from a build we get an "element not found" error. The same elements do exist in the overit-central repo, but somehow our Artifactory instance doesn't seem to find or cache them. (The problem doesn't happen with all the elements of the repo, and we don't know why; some of them are actually cached.)
We've already run a re-index of the repository, but the "This item is not cached" message continues to appear.
We tried creating a new remote repository pointed at the same URL and running some Maven re-indexes from Artifactory, but it didn't help.
How do we force Artifactory to cache these elements? The documentation on the JFrog support page mentions it, but doesn't explain how to do it.
Thank you in advance
Regards
A remote repository in Artifactory is a mirror of the endpoint you have configured. It doesn't download artifacts by default, which is why it says "Artifact is not cached". This is the intended behavior; it lets you know that you have never downloaded this artifact.
In order to cache an artifact, you have to simply download it from the remote endpoint. This can be done in two ways:
From the UI: right-click --> Download
Using the API: curl -uUSERNAME:PASSWORD http://ARTIFACTORY_URL:PORT/artifactory/REMOTE_REPOSITORY/PATH_TO_ARTIFACT
Once the artifact has been downloaded it is cached, and you will be able to see it in the "name-of-the-remote-repository-cache" repository, which shows the downloaded, cached artifacts.
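As a minimal sketch (the artifact path and credentials below are placeholders): requesting the artifact through the remote repository triggers the download, and the File Info REST API can then confirm that it is present in the cache:

# First request goes out to the remote endpoint and caches the artifact in Artifactory
curl -u USERNAME:PASSWORD -O "http://ARTIFACTORY_URL:PORT/artifactory/overit-central/com/example/lib/1.0/lib-1.0.jar"
# The cached copy should now show up under the corresponding -cache repository
curl -u USERNAME:PASSWORD "http://ARTIFACTORY_URL:PORT/artifactory/api/storage/overit-central-cache/com/example/lib/1.0/lib-1.0.jar"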

Remote repository for a go project in Artifactory doesn't proxy?

I'm trying to understand how to work with a remote repository in Artifactory for a Go project. My initial expectation was that it would work transparently: all I would need to do is point the GOPROXY variable at a virtual repository (with a local and a remote repository behind it), run go build, and dependencies would either be served from the Artifactory cache or downloaded by Artifactory transparently, similar to the way it works for Maven dependencies.
When I tried that, it complained that the dependencies weren't found in Artifactory. Ok.
Reading the documentation, two things stand out. First, there's nothing there about GOPROXY and everything is about using the Artifactory CLI. That's a big downside for several reasons.
Second, you need to publish dependencies manually with jfrog rt go-publish go --self=false --deps=ALL, and only then do they appear under a local repository.
So I'm trying to figure out 1) whether I can avoid using the JFrog CLI, and 2) what the point of remote repositories is if they don't proxy. Or maybe I'm missing something?
Artifactory 6.3.0
I understand your confusion about the blog post you mentioned, though I have a feeling the writer's intent was more to show how the JFrog CLI can be used.
To answer your questions:
1) Yes, you don't have to use the JFrog CLI to build. Please check out the documentation on how to set up a remote repository for Go. It will guide you through setting up GitHub or GoCenter as a remote repository for your Go builds, and will allow you to set the GOPROXY environment variable following this structure: <protocol>://<username>:<password>@<artifactory domain>/api/go/<go repository>.
2) Remote repositories will absolutely act as a proxy, caching the contents you download from the remote repository (quoting the user guide: "A remote Go repository in Artifactory serves as a caching proxy for a public Go registry such as GoCenter or GitHub.")
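As a rough sketch of the non-CLI flow (the repository name go-virtual, the host name, and the credentials are placeholders; the exact URL layout may differ between Artifactory versions):

# Point the Go toolchain at the Artifactory Go repository instead of a public proxy
export GOPROXY="https://USERNAME:PASSWORD@artifactory.example.com/api/go/go-virtual"
# A plain build now resolves modules through Artifactory, which caches them in the remote repository's cache
go build ./...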

Adding remote repository in artifactory

I had a jcenter repository in my Artifactory under remote repositories. Since some artifacts, such as qpid, were missing, I decided to delete jcenter and add it back. After I deleted the jcenter repository and added it back, it didn't download any artifacts.
The repository tree structure looks like this
Before removing the jcenter repository it was like this.
Why is it not able to import any artifacts? I can see that the URL associated with it (http://jcenter.bintray.com/) has plenty of artifacts.
A remote repository in Artifactory serves as a caching proxy. This means that it downloads artifacts from the remote URL and caches them in Artifactory.
When you deleted the JCenter repository from Artifactory, you deleted all of its cached artifacts.
After recreating the repository, your cache is empty. This is why, when browsing jcenter-cache, you see no artifacts. You can use the remote browsing capability to see which artifacts are available at the remote URL but not currently cached.
To re-populate the cache, you will need to download artifacts from the remote repository. Usually the best way to do this is to run the builds which use this repository.
If the problem is that artifacts are not resolved at all from the remote repository, try the following:
Make sure the repository is configured correctly in Artifactory. Use the "Test" button to make sure that the URL is correct and you can reach the remote URL.
Check that your build tool is properly configured to use the repository you configured. One way of checking this is to monitor the Artifactory request log for requests coming from your build tool, as in the sketch below.
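For example, a quick way to watch the request log while your build runs (the path below is the default for an Artifactory 6.x installation and may differ in your setup):

# Follow incoming requests and filter for the repository in question
tail -f $ARTIFACTORY_HOME/logs/request.log | grep jcenter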
Deleting a repository is not a good practice when you are missing some dependencies. A better approach would be to check whether they are available at the remote URL and download them into the cache. Artifactory has the option to perform a remote search in Bintray, which can help when you are looking for artifacts in JCenter.

Nexus OSS: publish to static mirror

Do you know a way to configure Nexus OSS so that it publishes the artifact repository to a remote server in a form that can be statically served, e.g. by Apache httpd? I'd like to use this static copy to serve only my own artifacts, so the Nexus server could actively trigger an update in case something new is published.
Technically, I think it should be possible to create the metadata for the repo and store it in static files, but I'm not sure about that. Any hints appreciated.
If there is another repo manager to achieve that, it would be fine for me as well.
I clearly understand the advantages of using the repo manager directly, but due to IT rules I can run Nexus only internally, and it would be necessary to have these artifacts available in a (private) repo copy on the Internet as well.
A typical way to solve this IT requirement of only exposing known servers like Apache httpd is to set up Apache httpd as a reverse proxy, as documented here.
You can make that approach more restrictive by exposing only a specific repository or, better, a repository group (so you can combine snapshots and releases), and tying that together with a specific user or a restricted setup of the anonymous user that is used by default when no credentials are passed through.
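For illustration, a minimal sketch of such a setup on a Debian-style Apache httpd installation, exposing only a single repository group of an internal Nexus (all host names, paths, and file names below are assumptions):

# Enable the Apache proxy modules (Debian/Ubuntu-style commands; adjust for your distribution)
a2enmod proxy proxy_http
# Expose only the 'public' repository group of the internal Nexus instance
cat > /etc/apache2/sites-available/nexus-proxy.conf <<'EOF'
<VirtualHost *:80>
    ServerName repo.example.com
    ProxyPass        /nexus/content/groups/public/ http://internal-nexus:8081/nexus/content/groups/public/
    ProxyPassReverse /nexus/content/groups/public/ http://internal-nexus:8081/nexus/content/groups/public/
</VirtualHost>
EOF
a2ensite nexus-proxy
systemctl reload apache2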
Also, if you need more help, feel free to contact us on the user mailing list or on HipChat.

Nexus - proxy repositories with no indexes?

I'm trying to add a proxy to a public repository (specifically camel-extra). However, I get the following error in my Nexus logs:
Cannot fetch remote index for repository camel-extra
and then further down:
The remoteURL we requested does not exists on remote server (remoteUrl="http://camel-extra.googlecode.com/svn/maven2/releases/.index/nexus-maven-repository-index.properties")
I've ensured that 'Download Remote Indexes' is 'True', repaired the index, updated the index, all to no avail. Browsing to the provided URL shows that the artifacts are there.
So if a repository doesn't have this file, is it not proxy-able through Nexus?
TIA,
Roy
UPDATE
Thanks for the answers everyone - was able to pull the artifacts without the index. Thanks again!
Repositories without published indexes will still be proxy-able through Nexus (or any other MRM). The index is only a "topping" that provides useful extras like searching the whole remote content, etc.
The index does not participate in proxying at all, so its absence on the remote does not affect the main functionality of Nexus: proxying artifacts from the remote repository.
From the Nexus documentation, it appears that downloading an index is configurable: "The default for new proxy repositories is enabled, but all of the default repositories included in Nexus have this option disabled."
You should disable the 'Download Remote Indexes' option for this repository.
Yes, it is proxy-able. Just try to download an artifact that is hosted in that repository. The indexes only affect searching and the index that Nexus publishes in turn.
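For instance, fetching a single artifact straight through the proxy repository is enough to confirm this (the Nexus base URL below is a placeholder and the path layout matches a default Nexus 2.x install; substitute a real group/artifact/version from camel-extra):

# If the file comes back, proxying works even though the remote index is missing
curl -O "http://NEXUS_HOST:8081/nexus/content/repositories/camel-extra/GROUP/ARTIFACT/VERSION/ARTIFACT-VERSION.jar"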
