In the code below:
def server = Artifactory.server 'server_id'
def uploadSpec = """{
    "files": [
        {
            "pattern": "${WORKSPACE}/$repoName/target/$repoName-0.1-$jarType.jar",
            "target": "libs-release-local/a/b/c/"
        }
    ]
}"""
server.upload(uploadSpec)
server.upload(uploadSpec) does not create the repository path a/b/c under libs-release-local when run for the first time.
Jenkins is connected to Artifactory with admin privileges.
1) Why does the above code not create the repository path to upload the artifact in Artifactory?
2) Does server.upload internally use the JFrog CLI?
1) The path will be created under the repository only if an artifact is uploaded. If no error occurred and the path was not created, I am assuming the pattern did not find any match and no artifacts were uploaded.
As to your comment, you can diagnose the problem by setting 'failNoOp' to true, which will fail the build if no files are affected:
server.upload(uploadSpec, true)
The Console Output will also contain a line such as "Deploying artifact: /path/to/artifact" for every artifact uploaded.
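In newer versions of the plugin the same flag can also be passed as a named argument; a minimal sketch, assuming your plugin version supports the spec-style call:
server.upload spec: uploadSpec, failNoOp: true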
2) No, the Jenkins Artifactory Plug-in does not rely on JFrog CLI at all.
You can use JFrog's REST API for this: https://www.jfrog.com/confluence/display/RTF/Artifactory+REST+API
and use curl to PUT your artifact, e.g.:
sh("curl -u username:password# -X PUT \"${mavenRepo}/${relativeMavenPath}/${serviceName}/${serviceTag}/${serviceName}-${serviceTag}.jar\" -T services.jar")
I am using a Pipeline job which should upload all the jars to JFrog. It is working, but it uploads all the jars to JFrog without their folder structure, e.g.:
libs-release-local/one.jar
libs-release-local/two.jar
But I want to upload all the jars along with their folder structure, like below, e.g.:
libs-release-local/abc/efg/abc/one.jar
libs-release-local/ABC/EFG/ABC/two.jar
Note: here the folder structure may change based on the jar.
So how do I make changes in the script so that it picks up the folder structure and uploads it for every jar?
Here is the current script I am using:
stage('Uploading to artifactory') {
    steps {
        rtUpload (
            serverId: "<server id>",
            spec: '''{
                "files": [
                    {
                        "pattern": "**/*.jar",
                        "target": "libs-bt-test-local/"
                    }
                ]
            }'''
        )
    }
}
Let me know if there is a possible way to include a loop which will dynamically change the directory structure for every jar.
The target value can be edited with placeholders in order to dynamically determine the upload path.
For example: libs-bt-test-local/{1}
For further information and examples, refer to the Using File Specs documentation page:
https://www.jfrog.com/confluence/display/JFROG/Using+File+Specs#UsingFileSpecs-UsingPlaceholders
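For this case, a sketch of what the spec could look like (assuming the libs-bt-test-local repository from your script, and assuming ** works inside a capture group the same way * does): the parentheses in the pattern capture the directory part and the file name, and {1}/{2} reuse them in the target.
rtUpload (
    serverId: "<server id>",
    spec: '''{
        "files": [
            {
                "pattern": "(**)/(*.jar)",
                "target": "libs-bt-test-local/{1}/{2}"
            }
        ]
    }'''
)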
I have a Java project that makes use of Gradle to build and package. My purpose is to create artifacts that are published to Maven Central.
As a first step, I configured my Gradle project as shown in the following example from the documentation:
https://docs.gradle.org/current/userguide/publishing_maven.html#publishing_maven:complete_example
When I run gradle publishToMavenLocal, I get the following files installed in my local repository:
maven-metadata-local.xml
my-library-1.0.2-SNAPSHOT.jar
my-library-1.0.2-SNAPSHOT.jar.asc
my-library-1.0.2-SNAPSHOT-javadoc.jar
my-library-1.0.2-SNAPSHOT-javadoc.jar.asc
my-library-1.0.2-SNAPSHOT.pom
my-library-1.0.2-SNAPSHOT.pom.asc
my-library-1.0.2-SNAPSHOT-sources.jar
my-library-1.0.2-SNAPSHOT-sources.jar.asc
The files are all OK. The only issue I have is that checksum files (md5 and sha1) are not generated. However, checksum files are a requirement to have artifacts deployed on Maven Central via OSS Sonatype.
How can I generate the missing checksum files? It seems the maven-publish and signing plugins do not have an option for this purpose. What is wrong?
The solution I found was to use shadow along with ant.checksum:
tasks.withType(Jar) { task ->
    task.doLast {
        ant.checksum algorithm: 'md5', file: task.archivePath
        ant.checksum algorithm: 'sha1', file: task.archivePath
        ant.checksum algorithm: 'sha-256', file: task.archivePath, fileext: '.sha256'
        ant.checksum algorithm: 'sha-512', file: task.archivePath, fileext: '.sha512'
    }
}
Invoking gradle publishShadowPublicationToMavenLocal will generate the checksum files as needed, although it won't publish them to ~/.m2.
At first I thought those signatures should have been automatic, so I opened https://github.com/johnrengelman/shadow/issues/718 to discuss.
I thought this was a bug in Gradle and opened an issue, but as described here this actually mimics the mvn install behavior. It sounds like Maven Local works a little differently than a Maven repository.
The proper way to test this locally is to use a file-based repository. Since you're only using it to test (and not to actually share things with other projects), I think putting it into the build directory is best. Add the repositories section below to the publishing block. Then when you run ./gradlew publish, it will publish to your build directory.
Kotlin
repositories {
    maven {
        // change URLs to point to your repos, e.g. http://my.org/repo
        val releasesRepoUrl = uri(layout.buildDirectory.dir("repos/releases"))
        val snapshotsRepoUrl = uri(layout.buildDirectory.dir("repos/snapshots"))
        url = if (version.toString().endsWith("SNAPSHOT")) snapshotsRepoUrl else releasesRepoUrl
    }
}
Groovy
repositories {
    maven {
        // change URLs to point to your repos, e.g. http://my.org/repo
        def releasesRepoUrl = layout.buildDirectory.dir('repos/releases')
        def snapshotsRepoUrl = layout.buildDirectory.dir('repos/snapshots')
        url = version.endsWith('SNAPSHOT') ? snapshotsRepoUrl : releasesRepoUrl
    }
}
These two samples are actually from the link you shared. It's possible they were added later or you (like me) thought that publishToMavenLocal should behave the same as publish (apart from where the files actually go).
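For context, here is a minimal sketch (assuming a plain Java library published via from components.java) of where that repositories block sits inside the publishing block, Groovy DSL:
publishing {
    publications {
        mavenJava(MavenPublication) {
            // publish the main jar plus its generated pom
            from components.java
        }
    }
    repositories {
        maven {
            // local file-based repository inside the build directory, for testing only
            def releasesRepoUrl = layout.buildDirectory.dir('repos/releases')
            def snapshotsRepoUrl = layout.buildDirectory.dir('repos/snapshots')
            url = version.endsWith('SNAPSHOT') ? snapshotsRepoUrl : releasesRepoUrl
        }
    }
}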
I am trying to filter the artifacts that get published to Artifactory and running into two issues:
1) The include/exclude filtering is not working for me as expected.
2) I have not found a way to set it to exclude unpublished artifacts from buildinfo.
1) The relevant section of my jenkinsfile looks like this:
def now = new Date()
def changelist = now.format("yyyyMMddHHmm", TimeZone.getTimeZone('US/Central'))
def server = Artifactory.server env.ARTIFACTORY_SERVER_ID
server.credentialsId = 'creds-artifactory'
def rtMaven = Artifactory.newMavenBuild()
rtMaven.resolver server: server, releaseRepo: 'releases-repo', snapshotRepo: 'snapshots-repo'
rtMaven.deployer server: server, releaseRepo: 'candidates-repo', snapshotRepo: 'snapshots-repo'
rtMaven.deployer.artifactDeploymentPatterns.addInclude("myGroupId:myDistArtifactId*")
buildInfo = rtMaven.run pom: 'pom.xml', goals: "clean install -B -Dchangelist=.${changelist}".toString()
server.publishBuildInfo buildInfo
I have also tried to exclude, with no luck. The only way I got this to work was with a simple filter like exclude "*.zip", but I have not found any other way to make it work based on the artifactId. What am I missing? Based on the docs I saw, this should be working.
2) The other issue is that the excluded artifacts (metadata) still get published, since they are in my buildInfo, but in Artifactory they show as deleted (i.e. no attached binary). Is there a way to update buildInfo and remove the excluded artifacts?
Maybe this documentation article will be helpful; see the uploading example:
def uploadSpec = """{
    "files": [
        {
            "pattern": "bazinga/*froggy*.zip",
            "target": "bazinga-repo/froggy-files/"
        }
    ]
}"""
server.upload spec: uploadSpec
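If you end up filtering at the File Spec level, note that recent spec versions also accept an exclusions field (a sketch; the repository name and patterns are placeholders, and older plugin versions use excludePatterns instead):
def uploadSpec = """{
    "files": [
        {
            "pattern": "target/*.zip",
            "target": "candidates-repo/",
            "exclusions": ["*-sources.zip", "*-tests.zip"]
        }
    ]
}"""
server.upload spec: uploadSpec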
How to upload multiple Maven artifacts (.zip) to different targets in JFrog Artifactory using a Jenkins pipeline script?
If you're using the Jenkins Artifactory plugin, you get File Spec support (uploadSpec and downloadSpec, the same spec format used by the JFrog CLI). The Artifactory.server name is configured under the Jenkins global settings after installing the Artifactory plugin.
def server = Artifactory.server 'artifactory'
def uploadSpec = """{
    "files": [
        {
            "pattern": "*-file-1.zip",
            "target": "location1/1"
        },
        {
            "pattern": "*-file-2.zip",
            "target": "location2/2"
        }
    ]
}"""
def buildInfo = server.upload(uploadSpec)
More info on File Specs is available on the JFrog website: https://www.jfrog.com/confluence/display/RTF/Using+File+Specs
You can use the Maven deploy:deploy-file goal (http://maven.apache.org/plugins/maven-deploy-plugin/deploy-file-mojo.html) to upload arbitrary files.
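For example, a sketch of such a call from a pipeline step (the URL, repository id, coordinates, and the credentials assumed to be configured in settings.xml are all placeholders):
sh """mvn deploy:deploy-file \\
    -Durl=https://artifactory.example.com/artifactory/libs-release-local \\
    -DrepositoryId=artifactory \\
    -Dfile=target/my-file-1.zip \\
    -DgroupId=com.example \\
    -DartifactId=my-file \\
    -Dversion=1.0 \\
    -Dpackaging=zip"""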
I am trying to upload a pom.xml file to a Maven repository hosted on an Artifactory server. The <project> section of the pom.xml looks like this:
<groupId>com.x.y.z</groupId>
<artifactId>common</artifactId>
<version>2.3.0-RELEASE</version>
<packaging>jar</packaging>
I am using the Artifactory plugin for Jenkins in a Pipeline script, and here is the uploadSpec:
{
    "files": [
        {
            "pattern": "target/common-2.3.0-RELEASE.jar",
            "target": "REPOSITORY/com/x/y/z/common/2.3.0-RELEASE/common-2.3.0-RELEASE.jar"
        },
        {
            "pattern": "pom.xml",
            "target": "REPOSITORY/com/x/y/z/common/2.3.0-RELEASE/common-2.3.0-RELEASE.pom"
        }
    ]
}
When I now try to upload the artifact, I'm getting the following error message:
java.io.IOException: Failed to deploy file.
Status code: 409
Response message: Artifactory returned the following errors:
The target deployment path 'com/x/y/z/common/2.3.0-RELEASE/common-2.3.0-RELEASE.pom'
does not match the POM's expected path prefix 'com/x/y/z/common/2.2.7'.
Please verify your POM content for correctness and make sure the source path is a valid Maven repository root path. Status code: 409
Before I upload the RELEASE, I upload a SNAPSHOT which (in this case) had the version 2.2.7-SNAPSHOT. After that I bump the version to 2.3.0, re-build the project with mvn clean install and then start another upload to Artifactory. Somehow Artifactory still seems to expect the "old" version, when I try to upload the new version.
Edit
When I upload the file with curl, everything works as expected:
curl -u user:password -T pom.xml \
  "http://DOMAIN/artifactory/REPOSITORY/com/x/y/z/common/2.3.0-RELEASE/common-2.3.0-RELEASE.pom"
So it seems like this is related to the Jenkins Artifactory Plugin.
You upload your pom file to an incorrect location. You use
REPOSITORY/com/x/y/z/common-2.3.0-RELEASE.pom
as a path, when the path should be
REPOSITORY/com/x/y/z/common/2.3.0-RELEASE/common-2.3.0-RELEASE.pom
Note the version-named directory that was missing.
The good news is that you don't even need to bother with it. When you use our Artifactory.newMavenBuild for Maven builds, we'll take care of the correct deployment. See the example.
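For reference, a minimal sketch of that approach (the server id, repository names, and goals are placeholders):
def server = Artifactory.server 'server_id'
def rtMaven = Artifactory.newMavenBuild()
rtMaven.resolver server: server, releaseRepo: 'libs-release', snapshotRepo: 'libs-snapshot'
rtMaven.deployer server: server, releaseRepo: 'libs-release-local', snapshotRepo: 'libs-snapshot-local'
def buildInfo = rtMaven.run pom: 'pom.xml', goals: 'clean install'
server.publishBuildInfo buildInfo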
Can you try the below code in pipeline script?
{
"pattern": "pom.xml",
"target": "REPOSITORY/com/x/y/z/common/2.3.0-RELEASE/common-2.3.0-RELEASE.pom"
}
Or, if it doesn't work, you can use maven deploy in the pipeline script:
def mvnHome = tool mvnName
sh """${mvnHome}/bin/mvn deploy:deploy-file -Durl=file:///C:/m2-repo \\
    -DrepositoryId=some.id \\
    -Dfile=path-to-your-artifact-jar \\
    -DpomFile=path-to-your-pom.xml"""