I've been trying to use the Jenkins Artifactory plugin to upload an artifact to Artifactory, but I've been running into an issue with creating the artifact (what's odd is that my pipeline completes, so I must be triggering a silent error).
Here is the code (I give more details regarding what I think is going wrong below):
def server = Artifactory.server 'rc-artifact'
def uploadSpec =
"""{
    "files": [
        {
            "pattern": "${unencryptedZipName}",
            "target": "builds/SedTuningGui/${env.BUILD_NUMBER}/${unencryptedZipName}",
            "props": "type=zip"
        }
    ]
}"""
def buildInfo = server.upload spec: unencryptedUploadSpec
print buildInfo.getProperties().toString()
and the console output (the portion of it that I'm comfortable showing):
[..., deployableArtifacts:[], artifacts:[], ...]
As you can see, both the 'artifacts' and 'deployableArtifacts' arrays are empty, which means the server didn't receive the zip file that I built. I know the zip file exists, because I've successfully robocopied it.
All of this has led me to believe that the def buildInfo = server.upload... line isn't working: the built zip file exists, but it isn't included in the buildInfo object.
This is my first time working with a Jenkins pipeline, and I think I'm close to closing this issue out, I just need a bit of direction.
The file spec you defined is uploadSpec, but in the following line, def buildInfo = server.upload spec: unencryptedUploadSpec, you use unencryptedUploadSpec. Try def buildInfo = server.upload spec: uploadSpec instead.
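For reference, the corrected lines would simply reuse the variable name that was actually defined for the file spec:

```groovy
// Upload using the spec variable defined above (uploadSpec, not unencryptedUploadSpec)
def buildInfo = server.upload spec: uploadSpec
print buildInfo.getProperties().toString()
```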
Related
Here is the snippet of code where I am trying to download some folder contents from JFrog Artifactory in a Jenkins pipeline script:
stage ('Pull from BAMS Artifactory') {
    def server = Artifactory.newServer url: u_rl, credentialsId: creds
    def downloadSpec = """{
        "files": [
            {
                "pattern": "default.npm.global/transfer-pricing/ooxp-common-lib/*.**",
                "target": "default.npm.global/"
            }
        ]
    }"""
    server.download(downloadSpec)
}
The following are the files in that folder, ooxp-common-lib:
../
ooxp-common-lib-1.0.0.tgz 30-Oct-2018 22:33 14.24 KB
ooxp-common-lib-1.0.0.tgz.md5 30-Oct-2018 22:33 32 bytes
ooxp-common-lib-1.0.0.tgz.sha1 30-Oct-2018 22:33 40 bytes
However, only the .tgz file is downloaded; the .tgz.md5 and .tgz.sha1 files are not downloaded to the workspace.
I have tried many ways but have not been able to download the files with those extensions.
Can someone please help?
The query you are using is a part of Artifactory Query Language, in which * replaces any string, and ? replaces any character, until it hits the next dot in the query. That's why *.** will catch ooxp-common-lib-1.0.0.tgz, but won't catch ooxp-common-lib-1.0.0.tgz.md5 or ooxp-common-lib-1.0.0.tgz.sha1.
The solution, as @yahavi suggested in the comments, is just using a single *, which will catch everything: default.npm.global/transfer-pricing/ooxp-common-lib/*
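Applied to the snippet above, the corrected download spec would look like this (only the pattern changes):

```groovy
// A single * matches every file in the folder, including .tgz.md5 and .tgz.sha1
def downloadSpec = """{
    "files": [
        {
            "pattern": "default.npm.global/transfer-pricing/ooxp-common-lib/*",
            "target": "default.npm.global/"
        }
    ]
}"""
server.download(downloadSpec)
```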
I'm trying to convert my freestyle job into a scripted pipeline. I'm using Gradle for the build and Artifactory to resolve my dependencies and publish artifacts.
My build is parameterized with 3 params. In the freestyle job, when I configure Invoke Gradle script, I have the checkbox Pass all job parameters as System properties, and in my
project.gradle file I use the params with System.getProperty().
Now, implementing my pipeline, I define the job parameters and have them as environment variables in the Jenkinsfile, but can I pass these params to the Gradle task?
Following the official tutorial for using the Artifactory-Gradle plugin in a pipeline, I run my build with:
buildinfo = rtGradle.run rootDir: "workspace/project/", buildFile: 'project.gradle', tasks: "cleanJunitPlatformTest build"
Can I pass params to the Gradle build and use them in my .gradle file?
Thanks
Yes, you can. If you are using sh ''', switch it to sh """ so the values are expanded.
Jenkinsfile
#!/usr/bin/env groovy
pipeline {
agent any
parameters {
string(name: 'firstParam', defaultValue: 'AAA', description: '')
string(name: 'secondParam', defaultValue: 'BBB', description: '')
string(name: 'thirdParam', defaultValue: 'CCC', description: '')
}
stages {
stage ('compile') {
steps {
sh """
gradle -PfirstParam=${params.firstParam} -PsecondParam=${params.secondParam} -PthirdParam=${params.thirdParam} clean build
"""
}
}
}
}
and inside your build.gradle you can access them as
def firstParam = project.getProperty("firstParam")
You can also use a system property with the -D prefix, as opposed to a project property with -P. In that case you can get the value inside build.gradle as
def firstParam = System.getProperty("firstParam")
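For example, a minimal sketch of the system-property variant, reusing the parameter names from the Jenkinsfile above:

```groovy
// Pass the pipeline parameter as a JVM system property with -D
// instead of a Gradle project property with -P
sh """
gradle -DfirstParam=${params.firstParam} clean build
"""
```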
I went through the following link and successfully implemented a task which calls the build.gradle file from another project, i.e. the solution provided by @karl worked for me.
But I need something up on that.
Can somebody help me understand how I can pass command-line arguments while calling another build.gradle? The command-line argument should be a variable which I have generated from my current build.gradle file.
In my case, I am defining a buildNumber and doing something like this:
def buildNumber = '10.0.0.1'
def projectToBuild = 'projectName'
def projectPath = "Path_till_my_Project_Dir"
task executeSubProj << {
def tempTask = tasks.create(name: "execute_$projectToBuild", type: GradleBuild)
// ****** I need to pass buildNumber as command line argument in "$projectPath/$projectToBuild/build.gradle" ******
tempTask.tasks = ['build']
tempTask.buildFile = "$projectPath/$projectToBuild/build.gradle"
tempTask.execute()
}
You should never call execute directly on any Gradle object. The fact that it's feasible doesn't mean it should be done; it's highly discouraged, since you intrude on Gradle's internal execution graph structure.
What you need is a task of type GradleBuild, which has a StartParameter field that can be used to carry build options.
So:
task buildP2(type: GradleBuild) {
buildFile = '../p2/build.gradle'
startParameter.projectProperties = [lol: 'lol']
}
Full demo can be found here, navigate to p1 directory and run gradle buildP2.
You should modify your script in the following way:
def buildNumber = '10.0.0.1'
def projectToBuild = 'projectName'
def projectPath = "Path_till_my_Project_Dir"
task executeSubProj(type: GradleBuild) {
buildFile = "$projectPath/$projectToBuild/build.gradle"
tasks = ['build']
startParameter.projectProperties = [buildNumber: '10.0.0.1']
}
In the project that is executed use project.findProperty('buildNumber') to get the required value.
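In the executed sub-project's build.gradle, that might look like this (a sketch; the fallback default is an assumption):

```groovy
// Read the property passed via startParameter.projectProperties;
// findProperty returns null instead of throwing when it is absent,
// so a fallback (assumed here) keeps standalone builds working.
def buildNumber = project.findProperty('buildNumber') ?: '0.0.0.0'
version = buildNumber
```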
I have code that reads in a pom.xml file then attempts to re-serialize and write it back out:
// Get the file raw text
def pomXMLText = readFile(pomFile)
// Parse the pom.xml file
def project = new XmlSlurper(false, false).parseText(pomXMLText)
... do some useful stuff ...
def pomFileOut = "$WORKSPACE/pomtest.xml"
def pomXMLTextOut = groovy.xml.XmlUtil.serialize(project)
println "pomXMLTextOut = $pomXMLTextOut" // <-- This line prints to updated XML
writeFile file: pomFileOut, text: pomXMLTextOut // <-- This line crashes with the error listed in the posting title: java.io.NotSerializableException: groovy.util.slurpersupport.NodeChild
I've tried casting the pomXMLTextOut variable to a String. I tried applying the .text() method, which gets a Jenkins sandbox security error. Has anyone else been able to successfully write an XML file from a Groovy script running in a Jenkins pipeline?
BTW, I've also tried using a File object, but that isn't remotable across Jenkins nodes; it works as long as the job always runs on the master.
You could try a @NonCPS annotation and enclose those non-serializable objects in a function like this
@NonCPS
def writeToFile(String text) {
...
}
Here's the explanation from the Pipeline Groovy plugin documentation:
@NonCPS methods may safely use non-Serializable objects as local variables
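Applied to the pom example above, one possible shape (a sketch; transformPom is a hypothetical helper name) keeps the non-serializable XmlSlurper objects entirely inside a single @NonCPS method, so only a plain String ever crosses a CPS checkpoint:

```groovy
@NonCPS
String transformPom(String pomText) {
    // Parse, modify, and serialize entirely inside this method so no
    // groovy.util.slurpersupport.NodeChild needs to be serialized by Jenkins
    def project = new XmlSlurper(false, false).parseText(pomText)
    // ... do some useful stuff ...
    return groovy.xml.XmlUtil.serialize(project)
}

def pomXMLTextOut = transformPom(readFile(pomFile))
writeFile file: "$WORKSPACE/pomtest.xml", text: pomXMLTextOut
```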
I want to make my Gradle build intelligent when building my model.
To achieve this, I was planning to read the schema files, determine what is included, and then first build the included models (if they are not already present).
I'm pretty new to Groovy and Gradle, so please take that into account.
What I have:
A build.gradle file in the root directory, with n subdirectories (subprojects added to settings.gradle). I have only one Gradle build file, because I defined tasks like:
subprojects {
task init
task includeDependencies(type: checkDependencies)
task build
task dist
(...)
}
I will return to checkDependencies shortly.
Schema files located externally, which I can see.
Each of them has from 0 to 3 lines of code that declare dependencies and look like this:
#include "ModelDir/ModelName.idl"
In my build.gradle I created a task that should open and read those dependencies, and preferably return them:
class parsingIDL extends DefaultTask {
    String idlFileName = "*def file name*"
    def regex = ~/#include .*\/(\w*).idl/

    @TaskAction
    def checkDependencies() {
        File idlFile = new File(idlFileName)
        if (!idlFile.exists()) {
            logger.error("File not found")
        } else {
            idlFile.eachLine { line ->
                def dep = []
                def matcher = regex.matcher(line)
                (...)*
            }
        }
    }
}
What should I have in (...)* to find all the dependencies, and how should I define, for example, that
subprojectA::build.dependsOn([subprojectB::dist, subprojectC::dist])?
Everything I could find on the internet created a dep that output the following:
[]
[]
[modelName]
[]
[]
(...)
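One way the elided matcher logic could be filled in (a sketch, not the asker's original code): move dep outside the eachLine closure so matches accumulate across lines instead of being reset per line, and pull the model name out of capture group 1:

```groovy
// Collect the model name (capture group 1) from every matching
// #include line; dep lives outside the closure so it accumulates
def dep = []
idlFile.eachLine { line ->
    def matcher = (line =~ /#include .*\/(\w+)\.idl/)
    if (matcher.find()) {
        dep << matcher[0][1]   // first match, first capture group
    }
}
```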