How to refer to the parameters in groovy pipeline script? - jenkins-pipeline

I merged this as vars/gitCheckout.groovy and added it as a shared library in Jenkins:
def call(String branch = '*/master') {
    checkout([$class: 'GitSCM',
        branches: [[name: ${branch}]],
        doGenerateSubmoduleConfigurations: false,
        extensions: [[$class: 'SubmoduleOption',
            disableSubmodules: false,
            parentCredentials: false,
            recursiveSubmodules: true,
            reference: '',
            trackingSubmodules: false]],
        submoduleCfg: [],
        userRemoteConfigs: [[url: 'https://my-server.com/some/project.git']]])
}
I call this method as below from a Jenkins pipeline script:
@Library('jenkins-library@master') _
pipeline {
    agent { label 'my-server' }
    stages {
        stage('Git Checkout') {
            steps {
                gitCheckout()
            }
        }
    }
}
This fails with the error java.lang.NoSuchMethodError: No such DSL method '$' found among steps [ArtifactoryGradleBuild, MavenDescriptorStep, ....
I tried $branch and params.branch, but neither worked; the code works fine if I drop the parameter and hardcode the branch name. Also, whenever I make any update to this .groovy script, do I have to test it by merging and running it as a Jenkins job, or is there a way to test the script before merging?

Replace ${branch} in the 3rd line with just branch. You use $ with a variable name when you interpolate variables inside double-quoted Groovy strings:
def value = "current branch is: ${branch}" // produces: current branch is: */master
If you leave out the $, no interpolation happens:
def value = "current branch is: branch" // produces: current branch is: branch
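For reference, a trimmed sketch of the corrected vars/gitCheckout.groovy (same checkout as in the question, with the interpolation fixed; the submodule options are omitted here for brevity):
// vars/gitCheckout.groovy
// 'branch' is a plain Groovy variable in this scope, so reference it directly
def call(String branch = '*/master') {
    checkout([$class: 'GitSCM',
        branches: [[name: branch]],
        doGenerateSubmoduleConfigurations: false,
        extensions: [],
        submoduleCfg: [],
        userRemoteConfigs: [[url: 'https://my-server.com/some/project.git']]])
}
It can then be called as gitCheckout() or, say, gitCheckout('*/dev').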

Related

Using Function in JenkinsFile Parameter Description

I am trying to use a function in a Jenkinsfile declarative pipeline parameter's description but am struggling to make it work.
The idea is to have a Jenkins job specific to an environment, and I would like the choice parameter to show the environment name in the description of the variable.
My pipeline looks like this:
def check_env = app_env(ENVS, env.JOB_NAME)
pipeline {
    agent { label 'master' }
    options {
        disableConcurrentBuilds()
        buildDiscarder(logRotator(numToKeepStr: '20'))
        timestamps()
    }
    parameters {
        string(name: 'myVariable', defaultValue: "/", description: 'Enter Path To App e.g / OR /dummy_path for ' {check_env} )
    }
    stages {
        stage('Running App') {
            agent {
                docker {
                    image 'myApp:latest'
                }
            }
            steps {
                script {
                    sh label: 'App', script: "echo \"App is running in ${check_env}\""
                }
            }
        }
    }
}
I have tried multiple combinations for check_env, e.g. check_env, check_env(), and ${check_env}, but none of them worked.
String Interpolation
I believe this is simply a string interpolation issue. Notice my use of double quotes below:
parameters {
    string(name: 'myVariable', defaultValue: "/", description: "Enter Path To App e.g / OR /dummy_path for ${check_env}")
}
Your build page should then interpolate the variable.
To test, I simply set def check_env = 'live', since I do not have the code for your method.
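For completeness, a minimal sketch of that test, with check_env stubbed to 'live' (the app_env helper from the question is assumed and not shown):
// stub; in the real pipeline this comes from app_env(ENVS, env.JOB_NAME)
def check_env = 'live'
pipeline {
    agent any
    parameters {
        // double quotes make Groovy interpolate ${check_env} into the description
        string(name: 'myVariable', defaultValue: "/",
               description: "Enter Path To App e.g / OR /dummy_path for ${check_env}")
    }
    stages {
        stage('Show') {
            steps {
                echo "myVariable = ${params.myVariable}"
            }
        }
    }
}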

How to pass parameters to a bash script in a Jenkins scripted pipeline and set the credentials for remote host login

I'm new to Jenkins scripted pipelines. Below is the code I'm trying to execute on a remote host. I want to know two things:
1) How to pass credentials without hard-coding them, unlike the way I did in the script below.
2) How to pass a parameter to my test.sh script. Meaning I want to pass it as:
sshScript remote: remote, script: "myscript.sh ${version}"
Update:
Below is the script I have so far:
node {
    properties([
        parameters([
            string(name: 'version', defaultValue: '', description: 'Enter the version in x.y.z format')
        ])
    ])
    version = params.version.trim()
    def remote = [:]
    remote.name = 'Filetransfer'
    remote.host = 'X.X.XX.XXXX'
    remote.allowAnyHosts = true
    withCredentials([usernamePassword(credentialsId: 'saltmaster', passwordVariable: 'password', usernameVariable: 'ops')]) {
        remote.user = ops
        remote.password = password
        stage('Filetransfer') {
            sshCommand remote: remote, command: "hostname"
            //sshCommand remote: remote, command: "whoami"
            sshGet remote: remote, from: '/srv/salt/tm-server/files/docker-compose.yaml', into: '/home/jenkins/jenkins-data/docker-compose.yaml', override: true
            //sshScript remote: remote, script: '/home/jenkins/jenkins-data/rebuilt_dockercompose.sh "${version}"'
        }
        sh 'echo "Executing the script now ..."'
        sh "echo Current version: ${version}"
        sh "/home/jenkins/jenkins-data/rebuilt_dockercompose.sh \"${version}\""
    }
}
Here is what you can do:
Pass credentials without hard-coding
In your Jenkins instance, add a global credential of type SSH Username with private key.
In your pipeline, use the withCredentials([sshUserPrivateKey(...)]) binding from the Credentials Binding plugin to pass the credentials. This also masks the credentials in the console output.
Pass user-defined parameters
Since yours is a scripted pipeline, use the parameters([string(...)]) block wrapped inside a properties([]) block to let users enter the version as a string parameter.
Pass the parameter as an argument to your shell script.
Modified pipeline script
node {
    properties([
        parameters([
            string(name: 'version', defaultValue: '', description: 'Enter the version in x.y.z format')
        ])
    ])
    def remote = [:]
    remote.name = 'testPlugin'
    remote.host = 'x.x.x.x'
    remote.allowAnyHosts = true
    withCredentials([
        // note: binding variable names must be valid Groovy identifiers (no hyphens)
        sshUserPrivateKey(credentialsId: 'ssh-credentials', usernameVariable: 'SSH_USER', passphraseVariable: 'SSH_PASS')
    ]) {
        remote.user = SSH_USER
        remote.password = SSH_PASS
        stage('testPlugin') {
            sshPut remote: remote, from: 'myscript.sh', into: '.'
            sshScript remote: remote, script: "myscript.sh ${params.version}"
        }
    }
}
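One caveat: sshUserPrivateKey binds a private key, not a password, so with the SSH Steps plugin the connection is typically made via keyFileVariable and remote.identityFile instead of remote.password. A hedged sketch (credential ID and host are placeholders):
node {
    def remote = [:]
    remote.name = 'testPlugin'
    remote.host = 'x.x.x.x'          // placeholder host
    remote.allowAnyHosts = true
    withCredentials([
        sshUserPrivateKey(credentialsId: 'ssh-credentials',  // assumed credential ID
                          keyFileVariable: 'SSH_KEY',
                          usernameVariable: 'SSH_USER')
    ]) {
        remote.user = SSH_USER
        remote.identityFile = SSH_KEY  // authenticate with the bound key file
        stage('testPlugin') {
            sshCommand remote: remote, command: 'hostname'
        }
    }
}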

Jenkins declarative pipeline: download the latest upload (build) from Artifactory and get its properties

Any suggestions on this little problem are very welcome! :)
Downloading the latest build works fine, but the returned object does not contain any properties.
Is it possible to get the properties from a downloaded build?
The goal is to get an input box with a predefined value displaying the previous version, e.g. "R1G", and give the user the option to edit the value to e.g. R2A or any other value, or only abort (abort meaning there will be no version).
The user also has the option to do nothing, which will lead to a timeout and finally an abort.
I want to:
download the latest build from the Artifactory repo
store the build.number in "def prev_build"
display prev_build in an input for the user to update (a customized number)
'''some code
echo 'Publishing Artifact.....'
script {
    def artifactory_server_down = Artifactory.server 'Artifactory'
    def downLoad = """{
        "files": [
            {
                "pattern": "reponame/",
                "target": "${WORKSPACE}/prev/",
                "recursive": "false",
                "flat" : "false"
            }
        ]
    }"""
    def buildInfodown = artifactory_server_down.download(downLoad)
    // Don't need to publish because I only need the properties
    // Grab the latest revision name here and use it again
    echo 'Retrieving revision from last uploaded build.....'
    env.LAST_BUILD_NAME = buildInfodown.build.number
    // Yes it's a map, and I have tried with ['build.number'], but the map is empty
}
echo "Previous build name is $env.LAST_BUILD_NAME" // Will not contain the old (latest) number
''' End of snippet
The output is null, or the default value I gave the variable, not the expected version number.
Yes. Firstly, the properties should be present in the artifacts you are trying to download.
build.number and the like are part of the buildinfo.json file of the artifacts; they are not properties but metadata of a kind. This info is visible under the "Builds" menu in Artifactory: select the repo and build number, and in the last column/tab there is the build info. Click on that; this file holds all the info you need for the artifacts.
The build.number and other info are pushed/uploaded to Artifactory by the CI. For example, in Jenkins there is an option available when pushing to Artifactory, "Capture and publish build info", and this step does the work.
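For illustration, a minimal sketch of that capture-and-publish step with the Artifactory pipeline plugin (server ID, pattern, and target below are placeholders); publishing the BuildInfo returned by the upload is what creates the buildinfo.json entry visible under the "Builds" menu:
script {
    def server = Artifactory.server 'Artifactory'  // assumed server ID
    def uploadSpec = """{
        "files": [
            { "pattern": "build/libs/*.jar", "target": "reponame/" }
        ]
    }"""
    def buildInfo = server.upload(uploadSpec)  // upload returns a BuildInfo object
    server.publishBuildInfo(buildInfo)         // makes build.number etc. resolvable later
}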
Thanks a lot for your help.
I see your suggestion works, but by the time I got your answer I had already implemented another solution that also works well: I am using the Artifactory Query Language.
https://www.jfrog.com/confluence/display/RTF/Artifactory+Query+Language
Just before my pipeline declaration in the pipeline file I added:
def artifactory_url = 'https://lote.corp.saab.se:8443/artifactory/api/search/aql'
def artifactory_search = 'items.find({ "repo":"my_repo"},{"@product.productNumber": {"$match":"produktname"}}).sort({"$desc":["created"]})'
pipeline
{
and ...
stage('Get latest revision') {
    steps {
        script {
            def json_text = sh(script: "curl -H 'X-JFrog-Art-Api:${env.RECIPE_API_KEY}' -X POST '${artifactory_url}' -d '${artifactory_search}' -H 'Content-Type: text/plain' -k", returnStdout: true).trim()
            def response = readJSON text: json_text
            VERSION = response.results[0].path
            echo "${VERSION}"
            println 'using each & entry'
            response[0].each { entry ->
                println 'Key:' + entry.key + ', Value:' + entry.value
            }
        }
    }
}
stage('Do release on master')
{
    when
    {
        branch "master"
    }
    options {
        timeout(time: 1, unit: 'HOURS')
    }
    steps {
        script {
            RELEASE_SCOPE = input message: 'User input required', ok: 'Ok to go?!',
                parameters: [
                    choice(name: 'RELEASE_TYPE',
                        choices: 'Artifactory\nClearCaseAndArtifactory\nAbort',
                        description: 'What is the release scope?'),
                    string(name: 'VERSION', defaultValue: VERSION,
                        description: '''Edit release name please!!''', trim: false)
                ]
        }
        echo 'Build both RPM and Zip packages'
        ... gradlew -Pversion=${RELEASE_SCOPE['VERSION']} clean buildPackages"
        script {
            def artifactory_server = Artifactory.server 'Artifactory'
            def buildInfo = Artifactory.newBuildInfo()
            def uploadSpec = """{
                "files": [
                    {
                        "pattern": "${env.WORKSPACE}/prodname/release/build/distributions/prodname*.*",
                        "target": "test_repo/${RELEASE_SCOPE['VERSION']}/",
                        "props": "product.name=ProdName;build.name=${JOB_NAME};build.number=${env.BUILD_NUMBER};product.revision=${RELEASE_SCOPE['VERSION']};product.productNumber=produktname"
                    }
                ]
            }"""
            println(uploadSpec)
            artifactory_server.upload(uploadSpec)
        }
    }
}

Jenkins Pipeline Choose Specific Branch but take from default(master) branch

I have a Jenkins pipeline that I would like to take user input to check out a specific branch of the user's choosing. I.e., if I create a branch 'dev' and commit it in git, Jenkins still takes the default branch (master).
Can anyone please help me make it take the code from the 'dev' branch?
Thanks much in advance.
stage('Git Checkout') {
    steps {
        checkout(
            [$class: 'GitSCM',
             branches: [[name: '*/dev']],
             doGenerateSubmoduleConfigurations: false,
             extensions: [],
             submoduleCfg: [],
             userRemoteConfigs: [[credentialsId: '987654322234245676543',
                                  url: 'http://repo.xyz.com/user/devop.git']]]
        )
    }
}
You can try the pipeline step git:
stage('Git Checkout') {
    steps {
        git(branch: 'dev',
            credentialsId: '987654322234245676543',
            url: 'http://repo.xyz.com/user/devop.git')
    }
}
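If the branch really should come from user input, a hedged variant makes it a build parameter (the parameter name BRANCH is an assumption):
pipeline {
    agent any
    parameters {
        // the user picks or edits the branch at build time
        string(name: 'BRANCH', defaultValue: 'dev', description: 'Branch to check out')
    }
    stages {
        stage('Git Checkout') {
            steps {
                git(branch: params.BRANCH,
                    credentialsId: '987654322234245676543',
                    url: 'http://repo.xyz.com/user/devop.git')
            }
        }
    }
}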

How to trigger a multiple run in a single pipeline job of jenkins?

I have a pipeline job which runs the below pipeline groovy script:
pipeline {
    agent none
    parameters {
        string(name: 'Unique_Number', defaultValue: '', description: 'Enter Unique Number')
    }
    stages {
        stage('Build') {
            agent { node { label 'Build' } }
            steps {
                script {
                    sh 'build.sh'
                }
            }
        }
        stage('Deploy') {
            agent { node { label 'Deploy' } }
            steps {
                script {
                    sh 'deploy.sh'
                }
            }
        }
        stage('Test') {
            agent { node { label 'Test' } }
            steps {
                script {
                    sh 'test.sh'
                }
            }
        }
    }
}
I just trigger this job multiple times with a different unique ID number as the input parameter, so as a result I have multiple runs/builds of this job at different stages.
With this, I need to promote multiple runs/builds to the next stage (i.e., from Build to Deploy, or from Deploy to Test) in this pipeline job as one single build, instead of triggering each and every run/build to the next stage. Is there any possibility?
I was also trying to do the same thing and found no relevant answers; may this help someone.
This will read a file that contains Jenkins job names and run them iteratively from one single job.
Please adapt the code below to your Jenkins setup.
pipeline {
    agent any
    stages {
        stage('Hello') {
            steps {
                script {
                    git branch: 'Your Branch name', credentialsId: 'Your credentials', url: 'Your BitBucket Repo URL'
                    // Read a file from the workspace that contains the Jenkins job names
                    def filePath = readFile "${WORKSPACE}/ Your File Location"
                    // Read the file line by line
                    def lines = filePath.readLines()
                    // Iterate and run the Jenkins jobs one by one
                    for (line in lines) {
                        build(job: "$line/branchName",
                            parameters: [
                                string(name: 'vertical', value: "${params.vertical}"),
                                string(name: 'environment', value: "${params.environment}"),
                                string(name: 'branch', value: "${params.aerdevops_branch}"),
                                string(name: 'project', value: "${params.host_project}")
                            ]
                        )
                    }
                }
            }
        }
    }
}
You can start multiple jobs from one pipeline if you run something like:
build job: "One", wait: false
build job: "Two", wait: false
Your main job starts the child pipelines, and the child pipelines run in parallel.
You can read the pipeline build step documentation for more information.
Also, you can read about parallel runs in a declarative pipeline; there are many examples of parallel execution available.
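For example, a minimal sketch of a parallel block in a declarative pipeline (stage names and steps are illustrative; the child jobs from above could equally be triggered inside the branches):
pipeline {
    agent any
    stages {
        stage('Fan out') {
            parallel {
                stage('One') {
                    steps {
                        echo 'first branch'   // e.g. build job: 'One', wait: false
                    }
                }
                stage('Two') {
                    steps {
                        echo 'second branch'  // e.g. build job: 'Two', wait: false
                    }
                }
            }
        }
    }
}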
