What is the correct way to write a condition using when in a Jenkinsfile?

I am trying to write a when condition inside a single stage block in my Jenkinsfile. I have tried the code below, and I know it is not correct: a declarative pipeline stage expects only a single when block. How can I combine both of my when blocks into one?
stages {
    stage('Approve Dev Deployment') {
        agent { label 'docker-kitchensink-slave' }
        when {
            anyOf {
                expression {
                    return (env.GIT_BRANCH.equals('master') || env.GIT_BRANCH.startsWith('hotfix-'))
                }
            }
        }
        when {
            expression {
                input message: 'Deploy test?'
                return true
            }
            beforeAgent true
        }
        steps {
            approveDeployment()
        }
    }
}

Write a function with the conditions outside the pipeline scope and use the function as the condition. Note that when expects a built-in condition such as expression, so the function call has to be wrapped:
def checkcondition() {
    your_condition
}
stages {
    stage('Approve Dev Deployment') {
        agent { label 'docker-kitchensink-slave' }
        when { expression { return checkcondition() } }
        steps {
            approveDeployment()
        }
    }
}
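Alternatively, the two blocks can be merged directly: a single when block accepts multiple nested conditions, which can be combined explicitly with allOf. A minimal sketch based on the conditions in the question (the built-in branch condition replaces the equals check; whether you want the input prompt evaluated as part of when at all is a separate design question):

```groovy
when {
    beforeAgent true
    allOf {
        anyOf {
            branch 'master'
            expression { env.GIT_BRANCH.startsWith('hotfix-') }
        }
        expression {
            // unusual but matches the question: prompt for approval as part of the condition
            input message: 'Deploy test?'
            return true
        }
    }
}
```

With beforeAgent true the whole condition is evaluated before the stage's agent is allocated, so the docker-kitchensink-slave node is not tied up while waiting for the input prompt.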

Related

Jenkinsfile fail next stage

I want the next stage to fail (or be skipped) if the previous one failed, but the one after that should still run. I can't really share any code, so I hope I can get some leads from you on how I should achieve this. For example:
stages {
    stage {
        // stage that will fail
    }
    stage {
        // stage that should fail if the previous one fails
    }
    stage {
        // stage that should fail if the previous one fails
    }
    stage {
        // stage that should run either way
    }
}
This may not be the only way, but it is one way that works: set an environment variable switch and evaluate it in a when block at the beginning of each subsequent conditional stage. Those stages will not actually "fail"; they will simply be skipped because of the when condition. The last stage has no when block, so it runs regardless.
// declarative
environment {
    FAIL = false
}
stages {
    stage('Stage that might fail') {
        steps {
            script {
                try {
                    sh 'whatever happens that may cause this stage to fail'
                } catch (err) {
                    echo err.getMessage()
                    // sets the switch that is evaluated in the when block of subsequent stages
                    env.FAIL = true
                }
            }
        }
    }
    stage('Stage that should fail if previous fails') {
        when {
            expression {
                // even though FAIL was set with a boolean value, environment variables are stored as strings
                return env.FAIL != 'true'
            }
        }
        steps {
            // do something
        }
    }
    stage('Stage that should fail if previous fails (2)') { // stage names must be unique in declarative pipeline
        when {
            expression {
                return env.FAIL != 'true'
            }
        }
        steps {
            // do something
        }
    }
    stage('Stage that should run either way') {
        steps {
            // no when block, so this stage always executes
            // do stuff
        }
    }
}
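As an aside, an alternative to the try/catch and environment-variable switch is the built-in catchError step, which can mark the stage (and the build) as failed while still letting later stages run. A sketch, reusing the sh command from above:

```groovy
stage('Stage that might fail') {
    steps {
        // marks this stage and the build as FAILURE, but lets the pipeline continue
        catchError(buildResult: 'FAILURE', stageResult: 'FAILURE') {
            sh 'whatever happens that may cause this stage to fail'
        }
    }
}
```

To skip the middle stages you would still need a when condition, e.g. checking that currentBuild.result is not 'FAILURE'.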

How to resolve "please provide compiled classes with sonar.java.binaries property"?

I tried all the solutions posted by other people, but I am still struggling to resolve this issue. I believe that in my case it has something to do with the agents. I will post two snippets: one works and the other doesn't, even though both call the same Groovy methods.
The first snippet works fine and the pipeline executes successfully:
pipeline {
    agent { label 'docker-kitchensink-slave' }
    stages {
        stage('Checkout') {
            steps {
                checkout scm
            }
        }
        // Build and Unit Tests
        stage('Build and Unit Tests') {
            steps {
                script {
                    if (buildType.buildSystem == 'npm') {
                        buildNpm(configuration)
                    } else {
                        build(configuration)
                    }
                }
            }
        }
        // SonarQube Analysis
        stage('SonarQube analysis') {
            steps {
                script {
                    if (buildType.buildSystem != 'npm') {
                        sonarQubeGating(configuration)
                    }
                }
            }
        }
        // Build Docker Image and Push to Artifactory
        stage('Build Docker Image and Push to Artifactory') {
            steps {
                artifactoryImagePush(configuration)
            }
        }
        // Approve DEV Deployment
        stage('Approve Dev Deployment') {
            agent none
            when {
                anyOf {
                    expression {
                        return (env.GIT_BRANCH.equals('master') || env.GIT_BRANCH.startsWith('hotfix-'))
                    }
                }
            }
            steps {
                approveDeployment()
            }
        }
    }
}
The second snippet doesn't work:
pipeline {
    agent none
    stages {
        stage('Checkout') {
            agent { label 'docker-kitchensink-slave' }
            steps {
                checkout scm
            }
        }
        // Build and Unit Tests
        stage('Build and Unit Tests') {
            agent { label 'docker-kitchensink-slave' }
            steps {
                script {
                    if (buildType.buildSystem == 'npm') {
                        buildNpm(configuration)
                    } else {
                        build(configuration)
                    }
                }
            }
        }
        // SonarQube Analysis
        stage('SonarQube analysis') {
            agent { label 'docker-kitchensink-slave' }
            steps {
                script {
                    if (buildType.buildSystem != 'npm') {
                        sonarQubeGating(configuration)
                    }
                }
            }
        }
        // Build Docker Image and Push to Artifactory
        stage('Build Docker Image and Push to Artifactory') {
            agent { label 'docker-kitchensink-slave' }
            steps {
                unstash 'artifacts'
                unstash 'artifacts'
                artifactoryImagePush(configuration)
            }
        }
        // Approve DEV Deployment
        stage('Approve Dev Deployment') {
            agent none
            when {
                anyOf {
                    expression {
                        return (env.GIT_BRANCH.equals('master') || env.GIT_BRANCH.startsWith('hotfix-'))
                    }
                }
            }
            steps {
                approveDeployment()
            }
        }
    }
}
I get the following error:
[ERROR] Failed to execute goal org.sonarsource.scanner.maven:sonar-maven-plugin:3.6.0.1398:sonar (default-cli) on project xyz-service: Your project contains .java files, please provide compiled classes with sonar.java.binaries property, or exclude them from the analysis with sonar.exclusions property. -> [Help 1]
Below is my sonar code:
void call(Map optionParams = [:]) {
    script {
        try {
            String jacocoPath = optionParams.get('buildSystem').equals('gradle') ?
                'build/JacocoReport/test/jacocoTestReport.xml' : 'target/site/jacoco/jacoco.xml'
            glSonarMavenScan gitUserCredentialsId: 'sonar-key', javaVersionForSonar: '11.0', mavenVersion: '3.5.4',
                additionalProps: ['sonar.coverage.jacoco.xmlReportPaths': jacocoPath]
        } catch (Exception e) {
            echo "The following Sonar exception thrown ${e}"
            // stop the build here, unless 'requireSonar' is set to false (String or Boolean)
            if (!optionParams.get('requireSonar').toString().equalsIgnoreCase('false')) {
                throw e
            }
        }
    }
}
I'm a little confused about what you're trying to achieve here, since you are also showing the code that works. Are you trying to understand why the first block works compared to the second, or are you just trying to get it working? If the latter, you are clearly already done.
If the former: I'm only familiar with scripted pipeline, not declarative pipeline, but it seems possible that if there is more than one build node satisfying that label, each of those agent lines could select a build node, and each could potentially select a different one. If the build step executes on a different node than the SonarQube scan runs on, you will find yourself in a workspace without any compiled classes.
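If the stages do land on different nodes, one common workaround is to carry the compiled classes across explicitly with stash/unstash. A sketch using the question's stage names (the stash name 'classes' and the target/** pattern are assumptions based on a Maven layout):

```groovy
stage('Build and Unit Tests') {
    agent { label 'docker-kitchensink-slave' }
    steps {
        script {
            build(configuration)
        }
        // save the compiled classes so another node can restore them
        stash name: 'classes', includes: 'target/**'
    }
}
stage('SonarQube analysis') {
    agent { label 'docker-kitchensink-slave' }
    steps {
        // restore the compiled classes into this node's workspace before scanning
        unstash 'classes'
        script {
            sonarQubeGating(configuration)
        }
    }
}
```

This makes the scan independent of which node Jenkins happens to schedule each stage on.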

Jenkins pipelines with parallel and different containers

I am already running Jenkins pipelines with parallel stages based on the example from: Is it possible to create parallel Jenkins Declarative Pipeline stages in a loop?
I want to run each job in a different, isolated container, with the same agent label for all of them. I tried a few options, but all of them ended up with errors. I think I need to combine declarative and scripted pipeline, but I am not sure how.
Things I tried:
def generateTerraformStage(env) {
    return {
        agent { label 'local_terraform' }
        stage("stage: Terraform ${TERRAFORM_ACTION} ${env}") {
            echo "${env}"
            sleep 30
        }
    }
}
stage('parallel stages') {
    agent { label 'local_terraform' }
    steps {
        script {
            parallel parallelStagesMapEnvironment
        }
    }
}
One of the errors I got during testing:
"java.lang.NoSuchMethodError: No such DSL method 'agent' found among steps" and "java.lang.IllegalArgumentException: Expected named arguments but got org.jenkinsci.plugins.workflow.cps.CpsClosure2#560f3533"
Dynamic parallel stages can be created only with Scripted Pipeline; the built-in Declarative Pipeline directives (agent, options, when, etc.) are not available there.
I don't see any indication that you really need dynamic stages (e.g. stages based on a value returned by a third-party service), so I prepared two solutions:
dynamic parallel stages: stages are generated based on something
static parallel stages: you know all the stages up front (a when block can be used to disable those that are not needed, e.g. based on passed-in parameters)
pipeline {
    // ...
    stages {
        stage('dynamic parallel stages') {
            steps {
                script {
                    // params.ENVS == 'envA,envB,envC'
                    def values = params.ENVS.split(',')
                    def stages = [:]
                    for (def value in values) {
                        stages[value] = generateTerraformStage(value)
                    }
                    parallel stages
                }
            }
        }
        stage('static parallel stages') {
            parallel {
                stage('envA') {
                    agent { label 'local_terraform' }
                    when {
                        expression { return params.ENVS.split(',').contains('envA') }
                    }
                    steps {
                        terraformStageLogic 'envA'
                    }
                }
                stage('envB') {
                    agent { label 'local_terraform' }
                    when {
                        expression { return params.ENVS.split(',').contains('envB') }
                    }
                    steps {
                        terraformStageLogic 'envB'
                    }
                }
                stage('envC') {
                    agent { label 'local_terraform' }
                    when {
                        expression { return params.ENVS.split(',').contains('envC') }
                    }
                    steps {
                        terraformStageLogic 'envC'
                    }
                }
                // ...
            }
        }
    }
}
Closure<Void> generateTerraformStage(env) {
    return {
        node('local_terraform') {
            stage("stage: Terraform ${TERRAFORM_ACTION} ${env}") {
                echo "${env}"
                sleep 30
            }
        }
    }
}
void terraformStageLogic(env) {
    echo "${env}"
    sleep 30
}
When you don't use the workspace in the stage responsible for generating or executing the other stages (dynamic parallel stages and static parallel stages), you don't need to allocate a node to it; doing so would waste resources.
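Since the goal is an isolated container per branch, the scripted generator above can also wrap each branch's work in its own container via the Docker Pipeline plugin. A sketch, assuming that plugin is installed and using a hypothetical image name:

```groovy
// each parallel branch allocates a node matching the label, then runs its
// stage inside a fresh container, so branches are isolated from each other
Closure<Void> generateTerraformStage(env) {
    return {
        node('local_terraform') {
            docker.image('hashicorp/terraform:latest').inside {
                stage("stage: Terraform ${TERRAFORM_ACTION} ${env}") {
                    echo "${env}"
                    sleep 30
                }
            }
        }
    }
}
```

The containers are created and torn down per branch, while the agent label stays the same for all of them, which matches the requirement in the question.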

How to write a dynamic declarative pipeline that contains sequential job inside parallel job

I'm trying to write declarative pipeline code that accepts a map and creates a pipeline from it. I am able to achieve sequential stages or parallel stages, but I am facing problems making a pipeline that contains sequential stages inside parallel stages.
The input data is a Map. Each list in the map should run in parallel, and the items inside the list for each key should run sequentially.
Example data: [1:[11,12], 2:[21,22], 3:[31,32]]
The output should look like the attached image. Could someone give me some ideas?
Below is the code I have tried.
def stageData = [1: [11, 12], 2: [21, 22], 3: [31, 32]]

def getDeployStages1(stageData) {
    Map deployStages = [:]
    stageData.each { key, stgValue ->
        stgValue.each { value ->
            deployStages.put("${value}", {
                echo "${value}"
            })
        }
    }
    return deployStages
}

def getDeployStages2(stageData) {
    Map deployStages = [:]
    stageData.each { key, stgValue ->
        List stgs = []
        stgValue.each { value ->
            stgs.add(stage("${value}") {
                echo "${value}"
            })
        }
        deployStages.put("${key}", stgs)
    }
    return deployStages
}
pipeline {
    agent any
    stages {
        stage('deploy1') {
            steps {
                script {
                    parallel getDeployStages1(stageData)
                }
            }
        }
        stage('deploy2') {
            steps {
                script {
                    parallel getDeployStages2(stageData)
                }
            }
        }
    }
}
According to this documentation, you can nest the stages this way:
pipeline {
    agent none
    stages {
        stage("build and deploy on Windows and Linux") {
            parallel {
                stage("windows") {
                    agent {
                        label "windows"
                    }
                    stages {
                        stage("build") {
                            steps {
                                bat "run-build.bat"
                            }
                        }
                        stage("deploy") {
                            when {
                                branch "master"
                            }
                            steps {
                                bat "run-deploy.bat"
                            }
                        }
                    }
                }
                stage("linux") {
                    agent {
                        label "linux"
                    }
                    stages {
                        stage("build") {
                            steps {
                                sh "./run-build.sh"
                            }
                        }
                        stage("deploy") {
                            when {
                                branch "master"
                            }
                            steps {
                                sh "./run-deploy.sh"
                            }
                        }
                    }
                }
            }
        }
    }
}
This should result in the flow pictured in the documentation.
To apply this in your case, you can simplify your functions to return just the elements that need to run sequentially (just the values).
pipeline {
    agent any
    stages {
        stage('parallel') {
            parallel {
                stage('deploy1') {
                    steps {
                        script {
                            // generate the sequential (scripted) stages from the values
                            def list = getDeployStages1(stageData)
                            for (int i = 0; i < list.size(); i++) {
                                stage("${list[i]}") {
                                    echo "${list[i]}"
                                }
                            }
                        }
                    }
                }
                stage('deploy2') {
                    steps {
                        script {
                            // similar
                        }
                    }
                }
            }
        }
    }
}

Jenkins pipeline stage inside another stage based on if condition

I am trying to write stages inside another stage based on an if condition, but I am not able to come up with a solution. Can anyone guide me on this?
stages {
    stage('Example') {
        steps {
            script {
                if (!(fileExists("c:/test.txt"))) {
                    echo "Inside if"
                    stage('1') {
                        echo "stage1"
                    }
                    stage('2') {
                        echo "stage2"
                    }
                } else {
                    stage('else stage') {
                        echo "else stage1"
                    }
                }
            }
        }
    }
}
This worked for me:
when {
    expression {
        return !(fileExists("c:/test.txt"))
    }
}
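Putting that condition in context: with when, each conditional branch becomes its own top-level stage, and the else case simply inverts the expression. A sketch based on the stages in the question:

```groovy
stages {
    stage('1') {
        when {
            expression { return !(fileExists('c:/test.txt')) }
        }
        steps {
            echo 'stage1'
        }
    }
    stage('2') {
        when {
            expression { return !(fileExists('c:/test.txt')) }
        }
        steps {
            echo 'stage2'
        }
    }
    stage('else stage') {
        when {
            expression { return fileExists('c:/test.txt') }
        }
        steps {
            echo 'else stage1'
        }
    }
}
```

Stages whose when condition evaluates to false are skipped rather than failed, which replaces the if/else nesting from the original attempt.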
