How to use a parameter to skip a stage of a Jenkins declarative pipeline?

I'm trying to create a Jenkinsfile that runs a release build step only if a boolean parameter isRelease is true, and skips it otherwise.
My Jenkinsfile looks like this (extract):
pipeline {
    agent { label 'build' }
    tools {
        jdk 'OpenJDK 11'
    }
    parameters {
        booleanParam(name: 'isRelease', description: 'Run a release build?', defaultValue: false)
    }
    stages {
        stage('Build') {
            steps {
                sh 'mvn clean compile'
            }
        }
        stage('Test') {
            steps {
                sh 'mvn test'
            }
        }
        stage('Release') {
            when {
                beforeAgent true
                expression { return isRelease }
            }
            steps {
                sh 'echo "######### Seems to a release!"'
            }
        }
    }
}
However, I don't seem to understand how to use the parameters properly: the Release stage is always executed.
I changed expression { return isRelease } to expression { return "${params.isRelease}" }, which did not make a difference. Changing it to expression { return ${params.isRelease} } makes the stage fail with java.lang.NoSuchMethodError: No such DSL method '$' found among steps.
What's the right way to use a parameter to skip a step?

You were closest on your first attempt. The latter two failed because:
Converting the Boolean to a String makes the expression always pass, because a non-empty String (even "false") is truthy in Groovy.
expression { return ${params.isRelease} } is not valid Groovy: ${...} interpolation only works inside a double-quoted String, which is why Jenkins complains about a DSL method named '$'.
The issue with the first attempt is that you need to access isRelease through the params object:
when {
    beforeAgent true
    expression { params.isRelease }
}
The when directive documentation actually has a very similar example in the expression subsection for reference.
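For completeness, here is the Release stage from the question with only that change applied (everything else, including the echo text, is taken from the original Jenkinsfile):
stage('Release') {
    when {
        beforeAgent true
        // params.isRelease is the Boolean parameter defined in the parameters block
        expression { params.isRelease }
    }
    steps {
        sh 'echo "######### Seems to a release!"'
    }
}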

Related

Can we use the triggers directive for multiple stages?

Hi, I want to know whether multiple stages can each have a triggers directive. If not, then I need some other way to schedule each stage.
pipeline {
    stages {
        stage {
            triggers { cron(#some_exp) }
            steps {
                # Some steps
            }
        }
        stage {
            triggers { cron(#some_exp) }
            steps {
                # Some steps
            }
        }
    }
}
If you want to trigger each stage separately, that probably means each stage is a separate unit of business logic. In that case you should seriously consider creating a separate job for each of them, rather than packing them all into a single job with multiple triggers and tangled logic.
However, if you still want a single job, you can do it with the Parameterized Scheduler Plugin, which lets you define cron triggers that start the job with a specific parameter value. You can then use that value in a when condition to decide which stage to execute.
Here is an example of how to implement it:
pipeline {
    agent any
    parameters {
        string(name: 'STAGE', defaultValue: 'setup', description: 'Which stage to run')
    }
    triggers {
        parameterizedCron('''
            */2 * * * * %STAGE=setup
            */3 * * * * %STAGE=build
        ''')
    }
    stages {
        stage('Setup') {
            when {
                expression { STAGE == 'setup' }
            }
            steps {
                echo "In Setup stage - STAGE parameter is ${STAGE}"
                ...
            }
        }
        stage('Build') {
            when {
                expression { STAGE == 'build' }
            }
            steps {
                echo "In Build stage - STAGE parameter is ${STAGE}"
                ...
            }
        }
        ...
    }
}

Jenkinsfile - a way to skip the whole pipeline?

I use the declarative syntax for developing the (multibranch) pipeline script, and I'm looking for a way to skip the whole pipeline based on some condition, without having to modify the when directive on every single stage.
Current use case: I'm setting up a cron trigger to build at night, but I only want, say, the release/v1 and develop branches to go through the pipeline at night, not the dozens of other branches.
triggers {
    cron('H 21 * * 1-5')
}
// SKIP PIPELINE if triggered by timer AND branch not 'release/v1' OR 'develop'
stages {
    stage('build') {
        when { ... }
    }
    stage('UT') {
        when { ... }
    }
    etc...
}
Any hints would be appreciated.
You can nest stages if you have version 1.3 or later of the pipeline definition plugin (see its changelog).
Using that, you can nest your whole job inside a parent stage and put a when directive on the parent stage. All child stages are skipped if the parent stage is skipped. Here is an example:
pipeline {
    agent any
    stages {
        stage('Parent') {
            when {
                //...
            }
            stages {
                stage('build') {
                    steps {
                        //..
                    }
                }
                stage('UT') {
                    steps {
                        //...
                    }
                }
            }
        }
    }
}
You can use the SCM Skip plugin. Then use it in the pipeline like so:
scmSkip(deleteBuild: true, skipPattern:'.*#skip-ci.*')
This also makes it easy to delete the build and to use a regex pattern. The problem is that it aborts subsequent builds and then ignores them for the relevant PR if you're connected to a GitHub organization.
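For context, a minimal sketch of where the step would sit in a declarative pipeline (the stage name and the assumption that the SCM checkout has already happened, as in a multibranch job, are mine):
pipeline {
    agent any
    stages {
        stage('Checkout') {
            steps {
                // aborts (and deletes) this build if the last commit message matches the pattern
                scmSkip(deleteBuild: true, skipPattern: '.*#skip-ci.*')
            }
        }
        // the remaining stages only run when the build was not aborted
    }
}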

Execute all stages and steps even if one stage or step fails

I'm setting up the Jenkins pipeline shown below. My build gets aborted if the first stage fails, but I want all stages and steps mentioned in stages to execute first.
pipeline {
    agent none
    stages {
        stage("build and test the project") {
            agent {
                docker "coolhub/vault:jenkins"
            }
            stages {
                stage("build") {
                    steps {
                        sh 'echo "build.sh"'
                    }
                }
                stage("test") {
                    steps {
                        sh 'echo "test.sh"'
                    }
                }
            }
        }
    }
}
I'd like all stages and steps mentioned in stages to execute first.
After all stages have run, the Jenkins job should finally be aborted, showing which stages and steps failed.
Yeah, well, there's currently no way to do that apart from try/catch blocks inside a script block.
More here: Ignore failure in pipeline build step.
stage('someStage') {
    steps {
        script {
            try {
                build job: 'system-check-flow'
            } catch (err) {
                // echo expects a String, so interpolate the caught exception
                echo "Caught: ${err}"
            }
        }
        // interpolate so a null result does not break the echo step
        echo "${currentBuild.result}"
    }
}
In hakamairi's answer, the stage is not marked as failed. It is now possible to fail a stage, continue the execution of the pipeline and choose the result of the build:
pipeline {
    agent any
    stages {
        stage('1') {
            steps {
                sh 'exit 0'
            }
        }
        stage('2') {
            steps {
                catchError(buildResult: 'SUCCESS', stageResult: 'FAILURE') {
                    sh 'exit 1'
                }
            }
        }
        stage('3') {
            steps {
                sh 'exit 0'
            }
        }
    }
}
In the example above, all stages will execute, the pipeline will be successful, but stage 2 will show as failed.
As you might have guessed, you can freely choose the buildResult and stageResult, in case you want it to be unstable or anything else. You can even fail the build and continue the execution of the pipeline.
Just make sure your Jenkins is up to date, since this is a fairly new feature.
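As a minimal sketch of that last variant (the stage name and the failing command are placeholders), failing the build while still continuing would look like this:
stage('2') {
    steps {
        // marks both the stage and the overall build as FAILURE, but later stages still run
        catchError(buildResult: 'FAILURE', stageResult: 'FAILURE') {
            sh 'exit 1'
        }
    }
}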

Running a script after a successful build in Jenkins

I am trying to run a bash script after a successful build in Jenkins.
stages {
    stage("test") {
        steps {
            ...
        }
        post {
            success {
                steps {
                    sh "./myscript"
                }
            }
        }
    }
}
I am getting an error saying that method "steps" does not exist. How can I run a script after a successful build?
You need to remove the "steps" block inside the "success" block and call the script directly inside "success".
According to the docs (which are admittedly confusing here), "success" is itself a container for steps, so there is no need to nest another "steps" block:
https://jenkins.io/doc/book/pipeline/syntax/#post
stages {
    stage("test") {
        steps {
            ...
        }
        post {
            success {
                sh "./myscript"
            }
        }
    }
}

Referencing a variable in a declarative Jenkins pipeline

I am using the Groovy below to call a bat command, but no matter how I reference LOCAL_WORKSPACE within the bat command, it is not evaluated.
What am I missing?
Error
nuget restore $env.LOCAL_WORKSPACE
"Input file does not exist: $env.LOCAL_WORKSPACE"
Script
pipeline {
    agent any
    stages {
        stage('Clone repo') {
            steps {
                deleteDir()
                git branch: 'myBranch', changelog: false, credentialsId: 'myCreds', poll: false, url: 'http://myRepoURL'
            }
        }
        stage("Set any variables") {
            steps {
                script {
                    LOCAL_BUILD_PATH = "$env.WORKSPACE"
                }
            }
        }
        stage('Build It, yes we can') {
            parallel {
                stage("Build one") {
                    steps {
                        echo LOCAL_BUILD_PATH
                        bat 'nuget restore %LOCAL_WORKSPACE%'
                    }
                }
            }
        }
    }
}
You cannot set variables to share data between stages. Basically each script has its own namespace.
What you can do is use an environment directive as described in the pipeline syntax docs. Those constants are globally available, but they are constants, so you cannot change them in any stage.
You can calculate the values, though. For example, I use an sh step to get the current number of commits on master like this:
pipeline {
    agent any
    environment {
        COMMITS_ON_MASTER = sh(script: "git rev-list HEAD --count", returnStdout: true).trim()
    }
    stages {
        stage("Print commits") {
            steps {
                echo "There are ${env.COMMITS_ON_MASTER} commits on master"
            }
        }
    }
}
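Applied to the bat step from the question, a minimal sketch (assuming the packages should be restored from the workspace root; WORKSPACE is the built-in Jenkins variable the question already reads via env.WORKSPACE):
stage("Build one") {
    steps {
        // Groovy interpolates ${env.WORKSPACE} because the string is double-quoted
        bat "nuget restore ${env.WORKSPACE}"
        // alternatively, let the Windows shell expand the built-in variable at run time
        bat 'nuget restore %WORKSPACE%'
    }
}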
You can use environment variables to store data and access it across stages. For example, if you define LOCAL_ENVR as a Jenkins parameter, you can manipulate the variable from your stages:
stage('Stage1') {
    steps {
        script {
            env.LOCAL_ENVR = '2'
        }
    }
}
stage('Stage2') {
    steps {
        echo "${env.LOCAL_ENVR}"
    }
}
