Jenkins trigger conditional build steps in shell? - bash

I am using Jenkins as a server to run cron jobs that are conditional on the success of other jobs. These can be run as multiple execute shell steps. I am specifically wondering if there is a way to make an execute shell step contingent on the exit status of the previous execute shell step.

This is the default behaviour. Each build step, such as an "Execute Shell" step, returns an exit code (that of its last command). If it is 0, the next build step is executed. If it is non-zero, Jenkins marks the build as FAILED and skips straight to the post-build steps.
If your shell script returns 0 on success and anything else on failure, just put several "Execute Shell" build steps one after another.
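
As a minimal sketch (the script path and file names are placeholders, not taken from the question), a first "Execute Shell" step could end with an explicit exit code; if the check fails, the non-zero exit fails the build and the following "Execute Shell" step never runs:

#!/bin/bash
# Step 1: run a nightly export; abort the build if it fails.
if ! /opt/myapp/export.sh --output /var/data/export.csv; then
    echo "export failed, aborting build" >&2
    exit 1    # non-zero -> Jenkins fails the build, later steps are skipped
fi
exit 0        # zero -> Jenkins runs the next "Execute Shell" step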

Related

In Travis, is it possible to mark a stage as "Cancelled" instead of "Failed" when running a bash script?

There does exist a "Cancelled" state, which you can invoke by clicking the small x next to the job in the Travis UI.
Is it possible to enter this cancelled state from a bash script invoked by your .travis.yml? From the Travis docs:
If script returns a non-zero exit code, the build is failed
So returning a different error code doesn't help. Is it just not doable?

bash: stop subshell script from being marked as failed if one step exits with an error

I am running a script through the SLURM job scheduler on an HPC cluster.
I am invoking a subshell script from a master script.
The subshell script contains several steps. One step sometimes fails because of the quality of the data; it is not required for the later steps, but if it fails, my whole subshell script is marked with a "failed" status in the job scheduler. However, I need this subshell script to have a "completed" status, because it is a dependency in my master script.
I tried setting
set +e
in my subshell script right before the optional step, but it doesn't seem to work: I still get a non-zero exit code and a FAILED status in the job scheduler.
In short: I need the subshell script to have the status "completed" in the job scheduler, no matter whether that one particular step finishes with errors or not. I would appreciate help with this.
For Slurm jobs submitted with sbatch, the job exit code is taken to be the return code of the submission script itself. The return code of a Bash script is that of the last command in the script.
So if you just end your script with exit 0, Slurm should consider it COMPLETED no matter what.
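As a minimal sketch of such a submission script (the step names are placeholders, not from the question), the optional step can be guarded with || true so that its failure never becomes the script's exit status, and the script ends with an explicit exit 0:

#!/bin/bash
#SBATCH --job-name=subshell_job   # hypothetical job name, for illustration

./required_step.sh                # placeholder for the steps that must run

# Optional quality-control step: "|| true" keeps a failure here from
# propagating into the script's exit status.
./optional_step.sh || true

./further_steps.sh

# The script's return code is that of its last command, so an explicit
# exit 0 makes Slurm record the job as COMPLETED.
exit 0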

Windows "Start" command doesn't return from within "Execute shell script" step

Within a kettle job we need to call a program that doesn't return until it is stopped. From a command line this can be done with the Start command of Windows:
Start "some title" /b "C:\windows-style\path with spaces\program.exe" unqoted_param -i -s "quoted param"
This works well by starting the program in another shell while the calling shell returns and can continue. From within a kettle job this should be possible too, I think, by simply running the above command in an "Execute a shell script" step with the "Insert script" option.
However instead of returning from running the program in a new shell, the execution waits for the program to finish. This is not what we want because while the program is running (it's a VPN connection) we need to perform some other steps before the program is stopped again.
I suspect this might have something to do with how kettle runs the script, namely by putting the commands in a temporary batch file and then running that one. At least that's how it is presented in the job log:
2019/09/17 09:40:24 - Step Name - Executing command : cmd.exe /C "C:\Users\username\AppData\Local\Temp\kettle_69458258-d91e-11e9-b9fb-5f418528413ashell.bat"
2019/09/17 09:40:25 - Step Name - (stdout)
2019/09/17 09:40:25 - Step Name - (stdout) C:\pentaho_path>Start "some title" /b "C:\windows-style\path with spaces\program.exe" unqoted_param -i -s "quoted param"
For a quick solution, you can use parallel execution in the job.
From the Start step (or whichever step precedes the need for the VPN), activate the option to run the subsequent steps in parallel. Then you can put the shell script step in its own branch while the rest of the job can continue (with a wait step on the other branch to allow the VPN to start).
From the question, you are probably running jobs from the Pentaho server. If you happen to run them from a scheduler with kitchen.bat, you could of course start the VPN before calling kitchen.

Is it possible to have a bash script Jenkins job report Red, Amber, and Green?

I have a couple of bash scripts running on Jenkins which return exit codes 0 and 1 depending on their outcome, and the Jenkins job shows as green or red respectively.
We have other Jenkins jobs which sometimes go amber. What exit code do I need to return to get that from a bash script?
The "Execute Shell" build step has an Advanced button.
Click on it and you can enter the exit code that Jenkins will interpret as "unstable".
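For example, assuming you enter 2 in that Advanced field as the exit code to treat as "unstable" (the value 2 and the check below are placeholders, not from the question), the end of the script could look like this:

#!/bin/bash
run_my_checks                 # hypothetical command whose outcome we inspect
status=$?

if [ "$status" -eq 0 ]; then
    exit 0    # green: build succeeds
elif [ "$status" -eq 2 ]; then
    exit 2    # amber: matches the "unstable" exit code set under Advanced
else
    exit 1    # red: any other non-zero code fails the build
fi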

I am building a pipeline job in Jenkins in Groovy, and it has to run a batch command first and, upon success, build another job

I have to build a Jenkins pipeline job using a Groovy script. The job first has to run a Windows batch command and, only if that command succeeds, trigger a build of another job. How can I find out whether the Windows batch command ran successfully? I am showing sample code for the query.
import groovy.json.JsonSlurper
import hudson.model.*
import hudson.EnvVars

pipeline {
    agent any
    stages {
        stage('Build') {
            steps {
                bat 'some batch command here'
                // if(bat build successful)---> need help here
                build 'xyz' // xyz is another job that I am calling here
            }
        }
    }
}
This is the default behavior of pipeline steps: if one of the steps fails, execution stops there.
The documentation of the bat step notes:
Normally, a script which exits with a nonzero status code will cause the step to fail with an exception
If your batch command fails, its errorlevel will be non-zero, which is enough to make the chain of steps fail, so build 'xyz' will only run if the bat command succeeded.
