Jenkins pipeline: use of a global variable within the pipeline

I've got a Jenkins pipeline with multiple stages that need to know how the build was triggered (by user, timer, etc.), and I'd like to avoid duplicating the following line in each stage:
currentBuild.rawBuild.getCauses()[0].class.getName().contains('TimerTriggerCause')
When using that command in each when block it works as expected, but when it is placed in the environment block it keeps failing:
[Pipeline] node
Running on Jenkins in /var/lib/jenkins/jobs/test-pipeline/workspace
[Pipeline] {
[Pipeline] withEnv
[Pipeline] {
[Pipeline] stage
[Pipeline] { (Stage on timer)
[Pipeline] }
[Pipeline] // stage
[Pipeline] }
[Pipeline] // withEnv
[Pipeline] }
[Pipeline] // node
[Pipeline] End of Pipeline
java.lang.NoSuchMethodError: No such DSL method '$' found among steps [archive, bat, build, catchError...zip] or globals [currentBuild, docker, env, params, pipeline, scm]
at org.jenkinsci.plugins.workflow.cps.DSL.invokeMethod(DSL.java:199)
at org.jenkinsci.plugins.workflow.cps.CpsScript.invokeMethod(CpsScript.java:122)
at sun.reflect.GeneratedMethodAccessor513.invoke(Unknown Source)
Jenkins script:
pipeline {
    agent {
        label 'master'
    }
    environment {
        DAY = Calendar.getInstance().get(Calendar.DAY_OF_WEEK)
        HOUR = Calendar.getInstance().get(Calendar.HOUR_OF_DAY)
        ONTIMER = currentBuild.rawBuild.getCauses()[0].class.getName().contains('TimerTriggerCause')
    }
    stages {
        stage('Stage on timer') {
            when {
                expression {
                    return (${ONTIMER} && (${DAY} != Calendar.SATURDAY && ${DAY} != Calendar.SUNDAY))
                }
            }
            steps {
                echo "on timer..."
            }
        }
    }
}
The two other variables, DAY and HOUR, do work fine when used in the when block. Any idea?

After some more trial and error, I got the behaviour I wanted.
When dealing with environment variables, the when condition uses a slightly different syntax: it has its own environment keyword.
when {
    environment name: 'ONTIMER', value: 'true'
}
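One thing worth keeping in mind: values assigned in the environment block are always stored as strings, so the Boolean result of the trigger check becomes the string 'true' or 'false'. That is why a bare expression { env.ONTIMER } would always be truthy (any non-empty string is), while an explicit string comparison works. A sketch of the equivalent expression form:

```groovy
when {
    // env.ONTIMER is a String, so compare against 'true' explicitly
    expression { env.ONTIMER == 'true' }
}
```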
As a bonus, an integer value can be used in the when block as well:
when {
    allOf {
        environment name: 'ONTIMER', value: 'true'
        expression { return Integer.parseInt(env.HOUR) < 11 }
    }
}
Even better, it's possible to use the triggeredBy keyword and act on this:
when {
    anyOf {
        expression { return params.RUN }
        allOf {
            triggeredBy "TimerTrigger"
            expression {
                Integer.parseInt(env.HOUR) < 13
            }
        }
    }
}
Values to be used with triggeredBy include:
TimerTrigger
SCMTrigger
BuildUpstreamCause
UserIdCause
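For example, a minimal sketch (the user ID jdoe is a placeholder) that limits a stage to builds started manually by a given user, using the detail parameter documented for triggeredBy:

```groovy
when {
    // matches only builds started by user 'jdoe'
    triggeredBy cause: 'UserIdCause', detail: 'jdoe'
}
```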

Related

How to use parameter to skip a stage of Jenkins declarative pipeline?

I'm trying to create a Jenkinsfile that runs a release build step only if a boolean parameter isRelease is true, and otherwise skips this step.
My Jenkinsfile looks like this (extract):
pipeline {
    agent { label 'build' }
    tools {
        jdk 'OpenJDK 11'
    }
    parameters {
        booleanParam(name: 'isRelease', description: 'Run a release build?', defaultValue: false)
    }
    stages {
        stage('Build') {
            steps {
                sh 'mvn clean compile'
            }
        }
        stage('Test') {
            steps {
                sh 'mvn test'
            }
        }
        stage('Release') {
            when {
                beforeAgent true
                expression { return isRelease }
            }
            steps {
                sh 'echo "######### Seems to a release!"'
            }
        }
    }
}
However, I don't seem to understand how to use the parameters variable properly. What happens is that the release step is always executed.
I changed expression { return isRelease } to expression { return "${params.isRelease}" } which did not make a difference. Changing it to expression { return ${params.isRelease} } causes the step to fail with java.lang.NoSuchMethodError: No such DSL method '$' found among steps.
What's the right way to use a parameter to skip a step?
You were closest on your first attempt. The latter two failed because:
Converting a Boolean to a String always yields a truthy result, because any non-empty String is truthy.
That is not valid Jenkins Pipeline or Groovy syntax.
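A quick plain-Groovy illustration of the first point (any non-empty string, including "false", coerces to true):

```groovy
// Groovy truth: non-empty strings are truthy, regardless of their content
assert "false"            // non-empty String => true
assert "${false}"         // GString rendering to "false" is still non-empty => true
assert !"".asBoolean()    // empty string => false
assert !false             // the Boolean itself behaves as expected
```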
The issue here with the first attempt is that you need to access isRelease from within the params object.
when {
    beforeAgent true
    expression { params.isRelease }
}
The when directive documentation actually has a very similar example in the expression subsection for reference.

How to create jenkinsfile that supports ansiColors but also executes on jenkins host that does not have ansiColors installed

I have added ansiColors to a Jenkinsfile that is used on multiple jenkins hosts following
Jenkins pipeline ansicolor console output
Unfortunately, not all jenkins hosts have the AnsiColor plugin installed.
On hosts where the plugin is not installed, I get an error
Started by upstream project "**********************" build number 450
originally caused by:
Started by timer
Obtained ............../Jenkinsfile from git https://host:/repo.git
Running in Durability level: MAX_SURVIVABILITY
[Pipeline] Start of Pipeline
[Pipeline] End of Pipeline
java.lang.NoSuchMethodError: No such DSL method 'ansiColor' found among steps [
Is there a way to code the Jenkinsfile so that it will use ansiColor when available, but still execute when the plugin is missing?
My (trimmed) Jenkinsfile:
ansiColor('xterm') {
    withFolderProperties {
        env.getEnvironment()
        def nodelabel
        try { nodelabel = "${env.CCEBUILD_NODE}" } catch (e) { }
        if (nodelabel == "null" || nodelabel == null) { nodelabel = "devts" }
        // trimmed
        node("${nodelabel}") {
            stage('Info') {
                dir("${applroot}") {
                    callAnt(antbaseparameters, "info")
                }
            }
            // trimmed
        } // node("${nodelabel}")
    } // withFolderProperties
} // ansiColor
You can try putting your code into a function and do something like this:
def doSomething() {
    withFolderProperties { ... }
}
try {
    ansiColor('xterm') {
        doSomething()
    }
} catch (NoSuchMethodError e) {
    doSomething()
}
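An alternative sketch is to query the plugin manager, so the decision is explicit rather than driven by a thrown error. This touches the Jenkins API, so it typically requires running outside the Groovy sandbox or an administrator-approved signature; ansicolor is the plugin's short name:

```groovy
// Returns a PluginWrapper when the AnsiColor plugin is installed, null otherwise
def hasAnsiColor = Jenkins.instance.pluginManager.getPlugin('ansicolor') != null

def body = {
    withFolderProperties {
        // trimmed
    }
}

if (hasAnsiColor) {
    ansiColor('xterm') { body() }
} else {
    body()
}
```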

Terraform is not recognized as internal or external command in jenkins

I installed Jenkins on a local Windows machine, then installed the Terraform plugin and made the configuration change in Global Tool Configuration in Jenkins. But when I run the Jenkins pipeline I get:
'terraform' is not recognized as an internal or external command, operable program or batch file.
Code :
pipeline {
    agent any
    stages {
        stage('Hello') {
            steps {
                bat 'terraform --version'
                echo 'Hello World'
            }
        }
    }
}
Can you help me with what I am doing wrong here?
Started by user admin
Running in Durability level: MAX_SURVIVABILITY
[Pipeline] Start of Pipeline
[Pipeline] node
Running on Jenkins in C:\Program Files (x86)\Jenkins\workspace\actimize2
[Pipeline] {
[Pipeline] stage
[Pipeline] { (Hello)
[Pipeline] script
[Pipeline] {
[Pipeline] tool
[Pipeline] }
[Pipeline] // script
[Pipeline] }
[Pipeline] // stage
[Pipeline] }
[Pipeline] // node
[Pipeline] End of Pipeline
ERROR: No tool named terraform found
Finished: FAILURE
Terraform config:
You'll need to get the Terraform home using a tool command and then add it to the Path environment variable so that the shell interpreter invoked by bat can find the terraform command:
def tfHome = tool name: 'Terraform', type: 'com.cloudbees.jenkins.plugins.customtools.CustomTool'
env.Path = "${tfHome};${env.Path}"
In your pipeline, this would look like the following (note the script block, which is needed because tool and the env assignment are scripted steps):
pipeline {
    agent any
    stages {
        stage('Hello') {
            steps {
                script {
                    def tfHome = tool name: 'Terraform', type: 'com.cloudbees.jenkins.plugins.customtools.CustomTool'
                    env.Path = "${tfHome};${env.Path}"
                }
                bat 'terraform --version'
                echo 'Hello World'
            }
        }
    }
}
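A variation on the same idea (a sketch, assuming the same 'Terraform' tool name) is to scope the PATH change with withEnv rather than mutating env.Path for the rest of the build; the PATH+identifier form prepends a directory to PATH for the enclosed block only:

```groovy
steps {
    script {
        def tfHome = tool name: 'Terraform', type: 'com.cloudbees.jenkins.plugins.customtools.CustomTool'
        // PATH+TF=<value> prepends <value> to PATH inside this block only
        withEnv(["PATH+TF=${tfHome}"]) {
            bat 'terraform --version'
        }
    }
}
```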
You can also use tool directly in the bat command (this is what I used to do when I was using Jenkins regularly). Note the doubled backslash: in a double-quoted Groovy string, \t would be a tab escape:
pipeline {
    agent any
    stages {
        stage('Hello') {
            steps {
                bat "${tool name: 'Terraform', type: 'com.cloudbees.jenkins.plugins.customtools.CustomTool'}\\terraform --version"
                echo 'Hello World'
            }
        }
    }
}
You can see a worked example in this Automating Terraform Projects with Jenkins article.

Execute all stages and steps even if one stage or step fails

I'm setting up the Jenkins pipeline shown below. My build gets aborted if the first stage fails, but I want all of the stages and steps mentioned in stages to execute.
pipeline {
    agent none
    stages {
        stage("build and test the project") {
            agent {
                docker "coolhub/vault:jenkins"
            }
            stages {
                stage("build") {
                    steps {
                        sh 'echo "build.sh"'
                    }
                }
                stage("test") {
                    steps {
                        sh 'echo "test.sh"'
                    }
                }
            }
        }
    }
}
I'd like all of the stages and steps mentioned in stages to execute first; once they have all run, the job should finally be marked as aborted and show which stages and steps failed.
Yeah, well, there's currently no way to do that apart from try/catch blocks in a script.
More here: Ignore failure in pipeline build step.
stage('someStage') {
    steps {
        script {
            try {
                build job: 'system-check-flow'
            } catch (err) {
                echo err.toString()
            }
        }
        echo currentBuild.result
    }
}
In hakamairi's answer, the stage is not marked as failed. It is now possible to fail a stage, continue the execution of the pipeline and choose the result of the build:
pipeline {
    agent any
    stages {
        stage('1') {
            steps {
                sh 'exit 0'
            }
        }
        stage('2') {
            steps {
                catchError(buildResult: 'SUCCESS', stageResult: 'FAILURE') {
                    sh "exit 1"
                }
            }
        }
        stage('3') {
            steps {
                sh 'exit 0'
            }
        }
    }
}
In the example above, all stages will execute, the pipeline will be successful, but stage 2 will show as failed.
As you might have guessed, you can freely choose the buildResult and stageResult, in case you want it to be unstable or anything else. You can even fail the build and continue the execution of the pipeline.
Just make sure your Jenkins is up to date, since this is a fairly new feature.

Referencing variable in declarative jenkins pipeline

I am using the Groovy below to call a bat command. No matter how I reference LOCAL_WORKSPACE within the bat command, it is not evaluated.
What am I missing?
Error
nuget restore $env.LOCAL_WORKSPACE
"Input file does not exist: $env.LOCAL_WORKSPACE"
Script
pipeline {
    agent any
    stages {
        stage('Clone repo') {
            steps {
                deleteDir()
                git branch: 'myBranch', changelog: false, credentialsId: 'myCreds', poll: false, url: 'http://myRepoURL'
            }
        }
        stage("Set any variables") {
            steps {
                script {
                    LOCAL_BUILD_PATH = "$env.WORKSPACE"
                }
            }
        }
        stage('Build It, yes we can') {
            parallel {
                stage("Build one") {
                    steps {
                        echo LOCAL_BUILD_PATH
                        bat 'nuget restore %LOCAL_WORKSPACE%'
                    }
                }
            }
        }
    }
}
You cannot set variables to share data between stages. Basically each script has its own namespace.
What you can do is use an environment directive as described in the pipeline syntax docs. Those constants are globally available, but they are constants, so you cannot change them in any stage.
You can calculate the values though. For example I use an sh step to get the current number of commits on master like this:
pipeline {
    agent any
    environment {
        COMMITS_ON_MASTER = sh(script: "git rev-list HEAD --count", returnStdout: true).trim()
    }
    stages {
        stage("Print commits") {
            steps {
                echo "There are ${env.COMMITS_ON_MASTER} commits on master"
            }
        }
    }
}
You can use environment variables to store data and access it from any stage. For example, you can set env.LOCAL_ENVR in one stage and read it in another:
stage('Stage1') {
    steps {
        script {
            env.LOCAL_ENVR = '2'
        }
    }
}
stage('Stage2') {
    steps {
        echo "${env.LOCAL_ENVR}"
    }
}
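Note that environment variables are always strings: assigning '2' (or even the integer 2) stores the text "2". If a later stage needs a number, it has to parse the value back, as in this sketch:

```groovy
stage('Stage2') {
    steps {
        script {
            // env values are Strings, so convert before doing arithmetic
            def n = env.LOCAL_ENVR.toInteger()
            echo "double: ${n * 2}"
        }
    }
}
```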
