How can I set the Jenkins job to pass when the tests actually fail - shell

I'm trying to do something similar to this question:
Jenkins failed build: I want it to pass
I want to create a pipeline job in Jenkins for all the known-bug tests. The job should PASS when all the tests FAIL, and it should be RED when even one test passes.
I found this solution:
stage('Tests') {
    steps {
        script {
            // running the tests
            status = sh "${mvnHome}/bin/mvn clean test -e -Dgroups=categories.knownBug"
            if (status === "MARK-AS-UNSTABLE") {
                currentBuild.result = "STABLE"
            }
        }
    }
}
but I got this error:
Unsupported operation in this context # line 47, column 39.
if (status === "MARK-AS-UNSTABLE") {
------------EDIT---------
Thanks to @yrc, I changed the code to:
try {
    sh "${mvnHome}/bin/mvn clean test -e -Dgroups=categories.knownBug"
} catch (err) {
    echo "Caught: ${err}"
    currentBuild.result = "STABLE"
}
It did help with the error message, but I want the job to pass when the tests fail; right now, both the tests and the job have failed.

Just wrap your execution in a try-catch block. Note that "STABLE" is not one of Jenkins' build results (SUCCESS, UNSTABLE, FAILURE, ABORTED, NOT_BUILT); use "SUCCESS" to keep the build green:
try {
    sh "${mvnHome}/bin/mvn clean test -e -Dgroups=categories.knownBug"
} catch (err) {
    echo "Caught: ${err}"
    currentBuild.result = "SUCCESS"
}
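For the follow-up problem (the job should only be green when the known-bug tests fail, and red when they pass), catching the exception is not enough, because a passing test run throws nothing to catch. A minimal sketch of one way to invert the result, using sh with returnStatus: true so the step reports Maven's exit code instead of failing the build (the Maven command and test group are taken from the question; the stage name is just a placeholder):
stage('Known bug tests') {
    steps {
        script {
            // returnStatus: true returns the exit code instead of aborting the step
            def rc = sh(script: "${mvnHome}/bin/mvn clean test -e -Dgroups=categories.knownBug", returnStatus: true)
            if (rc == 0) {
                // the known-bug tests passed, which is the unwanted outcome here
                error 'Known-bug tests passed unexpectedly - failing the build'
            } else {
                echo "Known-bug tests failed as expected (exit code ${rc})"
            }
        }
    }
}
Note that this only distinguishes a failing Maven run from a passing one; verifying that every individual test failed would require parsing the test reports.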

Related

Jenkinsfile setting stageResult variable based on conditions

I'm trying to write a Jenkinsfile in Declarative syntax. The requirement is that if certain environment variables do not match, the pipeline stage should be shown as "Aborted", while the overall build status may be either Aborted or Unstable. In the Jenkins online documentation and the Snippet Generator I saw the following style:
catchError(buildResult: 'ABORTED', stageResult: 'ABORTED') {
// some block
}
catchError does not serve my purpose, since it is meant for the case where the script inside the stage fails or there is an execution error. The default when condition in Declarative syntax would be appropriate, but Jenkins does not allow setting the stage result to 'ABORTED' from it:
when {
    expression {
        SCM_BRANCH_NAME ==~ /(master|QA)/
    }
    expression {
        ENVIRONMENT ==~ /(QA)/
    }
    allOf {
        environment ignoreCase: true, name: 'PRODUCT_NAME', value: 'PRODUCT-1'
    }
}
Please see the sample Jenkinsfile below, which uses an if/else format.
Sample Jenkinsfile
pipeline {
    agent {
        node {
            label 'master'
        }
    }
    environment {
        ENVIRONMENT = "QA"
        PRODUCT_NAME = "PRODUCT-1"
        SCM_BRANCH_NAME = "master"
    }
    stages {
        stage('Testing-1') {
            when {
                expression {
                    ENVIRONMENT ==~ /(QA)/
                }
            }
            steps {
                script {
                    if (PRODUCT_NAME == 'PRODUCT-1') {
                        sh """
                            echo "Reached Here ${ENVIRONMENT} - ${PRODUCT_NAME} - ${SCM_BRANCH_NAME}"
                            # do testing for product 1
                        """
                    }
                    else {
                        stageResult = 'ABORTED'
                        echo "PRODUCT not Available for Testing"
                    }
                }
            }
        }
        stage('Testing-2') {
            steps {
                sh '''
                    echo "Reached Second Stage"
                '''
            }
        }
    }
}
Any suggestions on how to set the stage result to Aborted when the conditions are not met, either with a plugin or with a sample script, are greatly appreciated.
Thanks
The snippet below worked for me. You can use a catchError block and throw an error from inside it when some condition is met. You can also handle the exception in the stage's post failure section.
pipeline {
    agent {
        node {
            label 'master'
        }
    }
    environment {
        ENVIRONMENT = "QA"
        PRODUCT_NAME = "PRODUCT-1"
        SCM_BRANCH_NAME = "master"
    }
    stages {
        stage('Testing-1') {
            when {
                expression {
                    ENVIRONMENT ==~ /(QA)/
                }
            }
            steps {
                script {
                    catchError(buildResult: 'FAILURE', stageResult: 'ABORTED') {
                        if (PRODUCT_NAME != 'PRODUCT-1') {
                            error("PRODUCT not Available for Testing")
                        }
                        sh """
                            echo "Reached Here ${ENVIRONMENT} - ${PRODUCT_NAME} - ${SCM_BRANCH_NAME}"
                            # do testing for product 1
                        """
                    }
                }
            }
            post {
                failure {
                    echo "Something failed and the error has been caught"
                }
            }
        }
        stage('Testing-2') {
            steps {
                sh '''
                    echo "Reached Second Stage"
                '''
            }
        }
    }
}
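If, as the question asks, the overall build should not be marked as failed but the stage should still show as aborted, the same pattern should also work with a softer buildResult (a small variation on the snippet above, not verified here):
catchError(buildResult: 'UNSTABLE', stageResult: 'ABORTED') {
    // condition check and test steps as above
}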

How to configure a Jenkins pipeline with the Logstash plugin?

Use case: I want to send the Jenkins job console log to Elasticsearch, and from there to Kibana, so that I can visualise the data.
I am using the Logstash plugin to achieve this. For freestyle jobs the Logstash plugin configuration works fine, but for pipeline jobs I get all the expected data (build number, job name, build duration, and so on) except the build result, i.e. success or failure.
I tried two approaches:
1.
stage('send to ES') {
    logstashSend failBuild: true, maxLines: -1
}
2.
timestamps {
    logstash {
        node() {
            sh '''
                echo 'Hello, World!'
            '''
            try {
                stage('GitSCM') {
                    git url: 'github repo.git'
                }
                stage('Initialize') {
                    jdk = tool name: 'jdk'
                    env.JAVA_HOME = "${jdk}"
                    echo "jdk installation path is: ${jdk}"
                    sh "${jdk}/bin/java -version"
                    sh '$JAVA_HOME/bin/java -version'
                    def mvnHome = tool 'mvn'
                }
                stage('Build Stage') {
                    def mvnHome = tool 'mvn'
                    sh "${mvnHome}/bin/mvn -B verify"
                }
                currentBuild.result = 'SUCCESS'
            } catch (Exception err) {
                currentBuild.result = 'FAILURE'
            }
        }
    }
}
But with both approaches I am not getting the build result (success or failure) in Elasticsearch or Kibana.
Can someone help?
I didn't find a clean way to do that; my solution was to add these lines at the end of the Jenkinsfile:
echo "Current result: ${currentBuild.currentResult}"
logstashSend failBuild: true, maxLines: 3
In my case I don't need to send all the console logs, only one log entry with the result per job.
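One caveat with putting these lines at the very end of the Jenkinsfile is that they never run if an earlier step throws. A minimal sketch of a scripted pipeline that sends the result even on failure (the build step is only a placeholder, not taken from the question):
node {
    try {
        stage('Build') {
            // placeholder for the real build and test steps
            sh 'mvn -B verify'
        }
    } catch (err) {
        currentBuild.result = 'FAILURE'
        throw err
    } finally {
        // currentBuild.currentResult is SUCCESS, UNSTABLE or FAILURE at this point
        echo "Current result: ${currentBuild.currentResult}"
        logstashSend failBuild: false, maxLines: 3
    }
}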

post equivalent in scripted pipeline?

What is the equivalent of 'post' in a scripted pipeline, compared to a declarative pipeline?
https://jenkins.io/doc/book/pipeline/syntax/#post
For a scripted pipeline, everything must be written programmatically, and most of the work is done in the finally block:
Jenkinsfile (Scripted Pipeline):
node {
    try {
        stage('Test') {
            sh 'echo "Fail!"; exit 1'
        }
        echo 'This will run only if successful'
    } catch (e) {
        echo 'This will run only if failed'
        // Since we're catching the exception in order to report on it,
        // we need to re-throw it, to ensure that the build is marked as failed
        throw e
    } finally {
        def currentResult = currentBuild.result ?: 'SUCCESS'
        if (currentResult == 'UNSTABLE') {
            echo 'This will run only if the run was marked as unstable'
        }
        def previousResult = currentBuild.getPreviousBuild()?.result
        if (previousResult != null && previousResult != currentResult) {
            echo 'This will run only if the state of the Pipeline has changed'
            echo 'For example, if the Pipeline was previously failing but is now successful'
        }
        echo 'This will always run'
    }
}
https://jenkins.io/doc/pipeline/tour/running-multiple-steps/#finishing-up
You can modify @jf2010's solution so that it looks a little neater (in my opinion):
try {
    pipeline()
} catch (e) {
    postFailure(e)
} finally {
    postAlways()
}

def pipeline() {
    stage('Test') {
        sh 'echo "Fail!"; exit 1'
    }
    println 'This will run only if successful'
}

def postFailure(e) {
    println "Failed because of $e"
    println 'This will run only if failed'
}

def postAlways() {
    println 'This will always run'
}

How to keep a process running after the stage is finished in a declarative Jenkins pipeline

pipeline {
    agent none
    stages {
        stage('Server') {
            agent {
                node {
                    label "xxx"
                    customWorkspace "/home/xxx/server"
                }
            }
            steps {
                // start server
                sh 'node server.js &'
            }
        }
        stage('RunCase') {
            agent {
                node {
                    label 'clientServer'
                    customWorkspace "/home/xxx/CITest"
                }
            }
            steps {
                sh 'start test'
                sh 'run case here'
            }
        }
    }
}
I created the above Jenkins pipeline. What I want to do is:
1. start the server on the server node.
2. start the tests on the test node.
However, I found that the server process is killed when the second stage starts.
So how do I keep the server running until the testing in my second stage is finished? I tried using &, but it still doesn't work; it seems Jenkins kills every process I started in the first stage.
One solution is to run the two stages in parallel. For more information see these two resources: parallel-declarative-blog and jenkins-pipeline-syntax. But be careful: it is not guaranteed that the first stage starts before the second one, so you may need to add a waiting period before your tests. Here is an example Jenkinsfile:
pipeline {
    agent none
    stages {
        stage('Run Tests') {
            parallel {
                stage('Start Server') {
                    steps {
                        sh 'node server.js &'
                    }
                }
                stage('Run Cases') {
                    steps {
                        sh 'run case here'
                    }
                }
            }
        }
    }
}
Another solution would be to start the node server in the background. For this you can try different tools, like nohup or pm2.
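A detail that may help here (not part of the answer above): Jenkins' ProcessTreeKiller normally kills every process a step has spawned as soon as the build on that node finishes. A commonly used workaround, sketched under the assumption that the server node stays online for the test stage, is to detach the process and override the environment cookie Jenkins uses to track it:
stage('Server') {
    steps {
        // JENKINS_NODE_COOKIE=dontKillMe stops the ProcessTreeKiller from reaping
        // the server when the step ends; nohup plus redirection detaches it.
        sh 'JENKINS_NODE_COOKIE=dontKillMe nohup node server.js > server.log 2>&1 &'
    }
}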

How to define a Jenkins build trigger in a Jenkinsfile to start a build after another job

I would like to define a build trigger in my Jenkinsfile. I know how to do it for the BuildDiscarderProperty:
properties([[$class: 'jenkins.model.BuildDiscarderProperty', strategy: [$class: 'LogRotator', numToKeepStr: '50', artifactNumToKeepStr: '20']]])
How can I set a build trigger that starts the job when another project has been built? I cannot find a suitable entry in the Java API docs.
Edit:
My solution is to use the following code:
stage('Build Agent') {
    if (env.BRANCH_NAME == 'develop') {
        try {
            // try to start subsequent job, but don't wait for it to finish
            build job: '../Agent/develop', wait: false
        } catch (Exception ex) {
            echo "An error occurred while building the agent."
        }
    }
    if (env.BRANCH_NAME == 'master') {
        // start subsequent job and wait for it to finish
        build job: '../Agent/master', wait: true
    }
}
I just looked for the same thing and found this Jenkinsfile in jenkins-infra/jenkins.io.
In short:
properties([
    pipelineTriggers([cron('H/30 * * * *')])
])
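The cron example shows the mechanism; for the original question (start the job when another project has been built) the snippet generator offers an upstream trigger. A sketch, assuming the upstream job is called 'other-job' (a placeholder name):
properties([
    pipelineTriggers([
        // run this job after 'other-job' completes with result SUCCESS or better
        upstream(upstreamProjects: 'other-job', threshold: hudson.model.Result.SUCCESS)
    ])
])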
This is an example:
// Project test1
pipeline {
    agent any
    stages {
        stage('hello') {
            steps {
                container('dind') {
                    sh """
                        echo "Hello world!"
                    """
                }
            }
        }
    }
    post {
        success {
            build propagate: false, job: 'test2'
        }
    }
}
post {} will be executed when project test1 has been built, and the code inside
success {} will only be executed when project test1 is successful.
build propagate: false, job: 'test2' will call project test2.
propagate: false means that project test2's result does not affect project test1's
result; to invoke test2 without waiting for it to finish, add wait: false as well.
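A sketch of the post block with the asynchronous variant spelled out, in case test1 should not block on test2 at all (wait and propagate are documented parameters of the build step):
post {
    success {
        // wait: false fires test2 asynchronously; propagate only matters when
        // waiting, since it controls whether test2's result can fail test1
        build job: 'test2', wait: false
    }
}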
