I have artifacts in JFrog.
Example:
https://test.com/artifactory/users/data-config/1.0.0/user.json
https://test.com/artifactory/users/data-config/1.0.1/user.json
https://test.com/artifactory/users/data-config/2.0.0/user.json
Is there a way I can download the latest version using curl? In this case that would be
https://test.com/artifactory/users/data-config/2.0.0/user.json
Considering your comment that the files are not Maven-based, I assume they are deployed to a generic repository. In that case, irrespective of the file type, the only way to resolve a file via a direct cURL call is with the following command:
curl -u<USERNAME>:<PASSWORD> -O "http://<HOSTNAME>:<PORT>/artifactory/generic-repository/<TARGET_FILE_PATH>"
So the target path has to be given in full when the file is resolved via a direct download (API call), i.e. you have to supply the complete target file path manually.
However, for other package types, such as a Maven release or snapshot, the available options are described here.
A few additional options are also available:
• Artifact Latest Version Search Based on Layout
• Artifact Latest Version Search Based on Properties
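For example, the layout-based search can be called over the REST API and the result fed into a second cURL call. This is only a rough sketch: it assumes the repository uses a layout Artifactory can parse, and the group/artifact/repository names below (data-config, user, users) are guesses at how the example path maps onto coordinates, so adjust them to your setup.
# Ask Artifactory for the latest version string (Artifact Latest Version Search Based on Layout).
latest=$(curl -s -u<USERNAME>:<PASSWORD> \
  "https://test.com/artifactory/api/search/latestVersion?g=data-config&a=user&repos=users")
# Then download that version of the file directly.
curl -u<USERNAME>:<PASSWORD> -O "https://test.com/artifactory/users/data-config/${latest}/user.json"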
I'm having an issue where an external tool makes a call that causes mvn to download a dependency on the fly. This download, however, goes to the "central" enterprise Artifactory repo rather than one of our normal Artifactory repos, and I'm trying to figure out how to mirror the enterprise repo so the request points to the appropriate repo.
All I've seen indicates I should be able to do this by setting the mirror in the settings.xml file, and I've passed the path to this settings file via the -s option.
But the mirror is still being ignored.
Is there something special about resolving a dependency from the command line that bypasses mirrors?
It turned out that the reason setting mirrors wasn't working was that the deployment mechanisms in place weren't actually putting the XML files where they were supposed to go. To get around it, we added code to the deployment script that writes the XML files into the .m2 folder.
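For reference, a minimal sketch of the kind of step the deployment script might run; the repository id and URL are placeholders, and <mirrorOf>*</mirrorOf> simply routes every request, including ones aimed at central, through the internal repo:
mkdir -p ~/.m2
cat > ~/.m2/settings.xml <<'EOF'
<settings>
  <mirrors>
    <mirror>
      <!-- Send all repository requests, including "central", to the internal Artifactory repo -->
      <id>internal-artifactory</id>
      <mirrorOf>*</mirrorOf>
      <url>https://artifactory.example.com/artifactory/maven-virtual</url>
    </mirror>
  </mirrors>
</settings>
EOF
With that in place, mvn -s ~/.m2/settings.xml dependency:resolve should pull the dependency through the mirror instead of hitting the enterprise repo directly.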
I am new to Gradle. Here is my scenario: I have a Gradle project that doesn't contain any Java code. All it has is a build.gradle file that packages other war/jar/libs/config files from certain source directories and creates a TAR.GZ output. This build.gradle file was in use before we implemented DevOps.
Now, after implementing DevOps, we use an Artifactory repository to store all war/jar/libs/config files.
We want to update this build.gradle file to fetch/download all the files from Artifactory to build the TAR package, rather than sourcing them from local directories.
I have a specific need:
• Produce 3 different package types (LIGHT / PARTIAL / FULL): the LIGHT package will have a pre-defined set of files, the PARTIAL package a custom selection, and the FULL package everything
• I want to pass the option, via the gradle.properties file
• Gradle should download the files from Artifactory, according to the package type mentioned in the properties file (LIGHT/PARTIAL/FULL)
• Is it possible to build such dynamism into a build.gradle file?
Please guide. THANKS A LOT
You can use File Specs to download files from Artifactory for your different packages. For example, you can have a different File Spec for each of the package types you mentioned and download the files using the REST API or the JFrog CLI. These File Specs can also be used in your Gradle builds.
You can find a few sample specs on GitHub.
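As a rough sketch (the repository path, spec file names, and the packageType property are all placeholders, and it assumes the JFrog CLI is installed and configured), the build could keep one spec per package type, e.g. light-spec.json / partial-spec.json / full-spec.json, and pick one based on gradle.properties:
# Read the package type (LIGHT/PARTIAL/FULL) from gradle.properties.
PKG_TYPE=$(grep '^packageType=' gradle.properties | cut -d= -f2 | tr '[:upper:]' '[:lower:]')
# Example File Spec for the LIGHT package; pattern and target are placeholders.
cat > light-spec.json <<'EOF'
{
  "files": [
    {
      "pattern": "libs-release-local/com/acme/app/*.war",
      "target": "build/package/light/"
    }
  ]
}
EOF
# Download everything matching the selected spec from Artifactory.
jfrog rt download --spec "${PKG_TYPE}-spec.json"
From build.gradle this could be wrapped in an Exec task (or you could call the Artifactory REST API instead), and the downloaded tree then fed into your existing TAR.GZ packaging logic.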
How can a corporate Maven repository be used (to the exclusion of other repositories) with sbt 0.11.x, as described in "how do I get sbt to use a local maven proxy repository (Nexus)?"? There is no mention of ivyRepositories in the new sbt wiki on GitHub, so I'm assuming the accepted solution there is out of date.
Step 1: Follow the instructions at Detailed Topics: Proxy Repositories, which I have summarised and added to below:
(If you are using Artifactory, you can skip this step.) Create an entirely separate Maven proxy repository (or group) on your corporate Maven repository, to proxy Ivy-style repositories such as these two important ones:
http://repo.typesafe.com/typesafe/ivy-releases/
http://repo.scala-sbt.org/scalasbt/sbt-plugin-releases/
This is needed because some repository managers cannot handle Ivy-style and Maven-style repositories being mixed together.
Create a file named repositories, listing both your main corporate repository and any extra one that you created in step 1, in the format shown below:
[repositories]
my-maven-proxy-releases: http://repo.example.com/maven-releases/
my-ivy-proxy-releases: http://repo.example.com/ivy-releases/, [organization]/[module]/(scala_[scalaVersion]/)(sbt_[sbtVersion]/)[revision]/[type]s/[artifact](-[classifier]).[ext]
Either save that file in the .sbt directory inside your home directory, or specify it on the sbt command line (you will need to specify it explicitly if you have disabled sharing):
sbt -Dsbt.repository.config=<path-to-your-repo-file>
Good news for those using older versions of sbt: Even though, in the sbt 0.12.0 launcher jar at least, the boot properties files for older sbt versions don't contain the required line (the one that mentions repository.config), it will still work for those versions of sbt if you edit those files to add the required line, and repackage them into the sbt 0.12.0 launcher jar! This is because the feature is implemented in the launcher, not in sbt itself. And the sbt 0.12.0 launcher is claimed to be able to launch all versions of sbt, right back to 0.7!
Step 2: To make sure external repositories are not being used, remove the default repositories from your resolvers. This can be done in one of three ways:
Add the command line option -Dsbt.override.build.repos=true mentioned on the Detailed Topics page above. This will cause the repositories you specified in the file to override any repositories specified in any of your sbt files. This might only work in sbt 0.12 and above, though - I haven't tried it yet.
With the same effect as option 1, you can set overrideBuildResolvers := true, with the advantage that you can control which projects it applies to, depending on the scope (a project / ThisBuild / Global) you define it in. This works in sbt 0.13.
Use fullResolvers := Seq( resolver(s) for your corporate maven repositories ) in your build files, instead of resolvers ++= or resolvers := or whatever you used to use.
Finally, note that the sbt launcher script has a bug in reading the sbtopts file, so if you decide to put your common sbt command-line options in there, make sure the last line of the file ends in a newline (Emacs in particular can fail to ensure this, unless configured to do so).
An alternative for Step 2 of the accepted answer (I am using sbt 0.13.1):
Add file .sbtopts to the project root directory with contents:
-Dsbt.override.build.repos=true
Another alternative is to add this line in $SBT_HOME/conf/.sbtopts, but this would force the setting for all projects.
Unpack the sbt-launcher.jar and copy the sbt.boot.properties file to a location of your choice. Change the launch script to use this file. In the file, change the repositories section to only contain your local repo and the corporate one. The distinction between Maven and Ivy comes from the given pattern (no pattern means Maven pattern by default).
Here is an example:
[repositories]
local
corporate: http://inhouse.acme.com/releases/
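A launch script along these lines should pick the file up; the paths and the launcher jar name are placeholders for whatever your sbt installation uses:
# Point the sbt launcher at the customized boot properties file.
java -Dsbt.boot.properties=/path/to/custom/sbt.boot.properties -jar /path/to/sbt-launch.jar "$@"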
I run my own little Maven repo for some open source projects. I have no dedicated server, so I use a Google Code repository: I deploy to the file system and then commit and push. Works perfectly for me.
But some Maven tools look for a nexus-maven-repository-index.properties file and the index (in GZ). I would like to generate this index so that:
• the warning that it's missing goes away
• Maven doesn't query the repo for artefacts that are not there.
How can I do that? Is there a tool (a Java main) that can generate such an index? Tips on how to use the proper Nexus jars from a little command-line tool are also welcome.
I came across this post while I was searching for a solution to add a local repository to my Maven project using IntelliJ IDEA.
Sonatype have changed their paths and reorganized the downloads since the last post, so here is an updated step-by-step tutorial to get your repository indexed for use with IntelliJ IDEA:
Download the latest stand-alone indexer from here.
Extract it somewhere and go into this directory
From the console (with the extracted indexer CLI jar on the classpath), run this command: export REPODIR=/path/to/your/local/repo/ && java -cp <indexer-cli-jar> org.sonatype.nexus.index.cli.NexusIndexerCli -r $REPODIR -i $REPODIR/.index -d $REPODIR/.index -n localrepo
In the .index directory within the repository directory, a number of files will be created, including nexus-maven-repository-index.gz, which is the file IntelliJ looks for.
You can use the Maven Indexer CLI to produce the index directly, but why bother hosting your own repo when OSS projects can use a hosted one for free?
http://nexus.sonatype.org/oss-repository-hosting.html
I was looking at the Maven Indexer... but I am not sure what the last parameter indexDir in this method is for:
public RepositoryIndexer createRepositoryIndexer(String repositoryId,
File repositoryBasedir,
File indexDir)
Is it something like a starting point within the repositoryBasedir?
I am using Hudson and the "Publish artifacts to FTP" option. It makes up its own directory based on the date and time of the build. I would like to override that with a fixed name/location. How can I do that? Is it possible?
What version of the FTP Publisher plugin are you using? I just installed the 1.0 version on Hudson 1.361, and in that version I can control the path by selecting the Flatten files option in the job configuration. Timestamp directories can also be disabled (that was the default).
That solves the path control. As for controlling the filename, there doesn't seem to be a way to rename the file. You will either have to create an artifact with the right name during the build, or use some other tool for the FTP upload.
Edit: For example, the Post build task plugin, which can run arbitrary shell commands based on the result of the build.
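As a sketch of what such a post-build step might do (the artifact name, FTP host, target directory, and credentials are all placeholders):
# Copy the built artifact to a fixed name, then upload it to a fixed FTP location.
cp target/myapp-*.war target/myapp-latest.war
curl -T target/myapp-latest.war --user <USERNAME>:<PASSWORD> "ftp://ftp.example.com/deploy/"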