How to save the command history of a Jenkins build run? - shell

I wrote a declarative Jenkins pipeline and would like to track the CLI commands executed by Jenkins. To do this, I added a stage with the step sh 'history -a' in it:
pipeline {
    options {
        ...
    }
    agent {
        node {
            ...
        }
    }
    stages {
        stage('Build') {
            steps {
                sh 'hostname'
                sh 'pwd'
                ...
            }
        }
        ...
        stage('History') {
            steps {
                sh 'history -a'
            }
        }
    }
    post {
        ...
    }
}
But that is not working:
Console Output
...
[Pipeline] // stage
[Pipeline] stage
[Pipeline] { (Tear Down)
[Pipeline] sh
+ history -a
/path/to/project-root#tmp/durable-66ba15cc/script.sh: 1: history: not found
[Pipeline] }
...
Other Linux commands like hostname, ls, or pwd are working fine.
Why does history run into an error? How can I store the shell commands called by Jenkins in the context of a pipeline?

That specific error you are getting is because history is not available where your sh step runs: history is a shell builtin of interactive shells like Bash, not a standalone executable, and the Jenkins sh step runs scripts non-interactively (often via /bin/sh), so there is no command history to save - hence history: not found.
If you want to store the sh commands, I think you need to write them to a file that you create at the beginning and append to every time you have a sh step, or you can just use the pipeline log file (the console output), which already echoes every command prefixed with +.
You can find here a thread about the location of the pipeline or build logs, in case it helps.
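For example, a minimal sketch of the file-based approach (the loggedSh helper and the commands.log file name are my own, not built-in Jenkins features; it assumes the commands contain no single quotes):
// Hypothetical helper: record each command in the workspace, then run it.
def loggedSh(String cmd) {
    sh "printf '%s\\n' '${cmd}' >> commands.log"
    sh cmd
}

pipeline {
    agent any
    stages {
        stage('Build') {
            steps {
                script {
                    loggedSh('hostname')
                    loggedSh('pwd')
                }
            }
        }
        stage('History') {
            steps {
                // Print everything that was run through the helper.
                sh 'cat commands.log'
            }
        }
    }
}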

Related

sh command is not executing in jenkins pipeline project

While executing the sh command in a Jenkins pipeline project, we are getting an error like the one below, and the sh command is not working.
Error message:
[Pipeline] sh
Warning: JENKINS-41339 probably bogus PATH=/bin/sh:/usr/atria/bin:/usr/atria/bin:$PATH; perhaps you meant to use ‘PATH+EXTRA=/something/bin’?
process apparently never started in /var/lib/jenkins/workspace/QSearch_pipelineTest#tmp/durable-6d5deef7
(running Jenkins temporarily with -Dorg.jenkinsci.plugins.durabletask.BourneShellScript.LAUNCH_DIAGNOSTICS=true might make the problem clearer)
[Pipeline] }
Below is the Jenkins pipeline script code:
pipeline {
    agent any
    environment {
        DATE = "December 17th"
    }
    stages {
        stage("Env Variables") {
            environment {
                NAME = "Alex"
            }
            steps {
                echo "Today is ${env.DATE}"
                echo "My name ${env.NAME}"
                echo "My path is ${env.PATH}"
                script {
                    env.WEBSITE = "phoenixNAP KB"
                    env.PATH = "/bin/sh:$PATH"
                }
                echo "This is an example for ${env.WEBSITE}"
                echo "My path ${env.PATH}"
                sh 'env'
                withEnv(["TEST_VARIABLE=TEST_VALUE"]) {
                    echo "The value of TEST_VARIABLE is ${env.TEST_VARIABLE}"
                }
            }
        }
    }
}
Below is the output of the Jenkins build job:
...
[Pipeline] echo
My path /bin/sh:/usr/atria/bin:/usr/atria/bin:$PATH
[Pipeline] sh
Warning: JENKINS-41339 probably bogus PATH=/bin/sh:/usr/atria/bin:/usr/atria/bin:$PATH; perhaps you meant to use ‘PATH+EXTRA=/something/bin’?
process apparently never started in /var/lib/jenkins/workspace/QSearch_pipelineTest#tmp/durable-6d5deef7
(running Jenkins temporarily with -Dorg.jenkinsci.plugins.durabletask.BourneShellScript.LAUNCH_DIAGNOSTICS=true might make the problem clearer)
[Pipeline] }
[Pipeline] // withEnv
[Pipeline] }
[Pipeline] // stage
[Pipeline] }
[Pipeline] // withEnv
[Pipeline] }
[Pipeline] // node
[Pipeline] End of Pipeline
ERROR: script returned exit code -2
Finished: FAILURE
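The warning itself points at the supported alternative: overwriting PATH wholesale (here with a value that starts with /bin/sh, a file rather than a directory, and ends in a literal $PATH) can leave the agent unable to launch the shell for later sh steps, which is why the process "apparently never started". A minimal sketch of the PATH+EXTRA convention the warning suggests (the /usr/atria/bin directory is just taken from the output above):
pipeline {
    agent any
    stages {
        stage('Env Variables') {
            steps {
                // 'PATH+<NAME>=<dir>' prepends <dir> to PATH instead of
                // replacing it, so the shell launcher keeps working.
                withEnv(['PATH+EXTRA=/usr/atria/bin']) {
                    sh 'echo $PATH'
                }
            }
        }
    }
}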

Execute shell script from Jenkins Pipeline

I am trying to execute a shell script from my Jenkins pipeline. I have provided both absolute and relative paths in the shell command, and I am still facing a No such file or directory error while building the pipeline.
This is a simple script, yet it is not working.
Try 1:
stage('Executing shell script') {
    steps {
        sh '/home/patching/shell_script.sh'
    }
}
Try 2:
stage('Executing shell script') {
    steps {
        sh './shell_script.sh'
    }
}
Try 3:
stage('Executing shell script') {
    steps {
        dir('/home/patching/shell_script.sh') {
            sh './shell_script.sh'
        }
    }
}
I really don't know what is really wrong with the script. Could some one help me on this?
I found out why it wasn't able to find the file I wanted to run: the job was running on a different slave.
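Given that diagnosis, a minimal sketch of one fix (the label patching-node is hypothetical): pin the pipeline to the node that actually has the script on disk.
pipeline {
    // Hypothetical label; use the label of the agent where the script lives.
    agent { label 'patching-node' }
    stages {
        stage('Executing shell script') {
            steps {
                sh '/home/patching/shell_script.sh'
            }
        }
    }
}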
Can you try this
pipeline {
    agent any
    stages {
        stage('Hello World') {
            steps {
                script {
                    sh '''
                        /home/ubuntu/First.sh
                    '''
                }
            }
        }
    }
}

Jenkins pipeline: using variable created in shell 1, in shell 2

In my pipeline I have 2 stages, and they both call a different shell script (the way they're called is some custom code, as we have different pipelines interacting with each other):
stage('first') {
    steps {
        script {
            stageErrors.add(launchPortalStep.launchStepWithVerification('shell', "chmod +x ./script1.sh && ./script1.sh"))
        }
    }
}
stage('second') {
    steps {
        script {
            stageErrors.add(launchPortalStep.launchStepWithVerification('shell', "chmod +x ./script2.sh && ./script2.sh"))
        }
    }
}
The thing is, in script2 I need to use a variable that is set in script1. The only ways I've thought of that could work would mean pretty much rewriting everything, and that's not a possibility.
Script1 must return either 0 or 1, so it's not possible to return the value and store it in an environment variable that would then be passed as a parameter to script2.
The scripts are also far too long to inline in the pipeline as sh commands (that would add at least 700 lines).
What can I do?
You should use a properties file (it can even be a temp file) to pass the needed variables between the stages.
Write the needed variables to a properties file (in VAR=VAL format) in the shell script called in the first stage (sh 'echo "MY_ENV_VAR=test_var" > test.prop').
Read the properties file in the second stage with the readProperties function, which is part of the Pipeline Utility Steps plugin (def props = readProperties file: 'test.prop'). Then set the needed variables as environment variables (env.MY_ENV_VAR = props.MY_ENV_VAR) so you will be able to use them in your second shell script as environment variables (sh 'echo MY_ENV_VAR = ${MY_ENV_VAR}').
Complete example code:
pipeline {
    agent { label 'MISC' }
    stages {
        stage('First') {
            steps {
                sh 'echo "MY_ENV_VAR=test_var" > test.prop'
            }
        }
        stage('Second') {
            steps {
                script {
                    // readProperties is a step in the Pipeline Utility Steps plugin
                    def props = readProperties file: 'test.prop'
                    // Assuming the key name is MY_ENV_VAR in the properties file.
                    env.MY_ENV_VAR = props.MY_ENV_VAR
                    sh 'echo MY_ENV_VAR = ${MY_ENV_VAR}'
                }
            }
        }
    }
}
Output:
[Pipeline] stage
[Pipeline] { (First)
[Pipeline] sh
+ echo MY_ENV_VAR=test_var
[Pipeline] }
[Pipeline] // stage
[Pipeline] stage
[Pipeline] { (Second)
[Pipeline] script
[Pipeline] {
[Pipeline] readProperties
[Pipeline] sh
+ echo MY_ENV_VAR = test_var
MY_ENV_VAR = test_var
[Pipeline] }
[Pipeline] // script
[Pipeline] }
[Pipeline] // stage

Jenkins pipeline to exit if shell script fails in any stage

I have Jenkins pipeline jobs which run shell scripts internally. Even though a shell script fails, the job shows as passed.
My Pipeline:
    stage('Code Checkout') {
        timestamps {
            step([$class: 'WsCleanup'])
            echo "check out======GIT =========== on ${env.gitlabBranch}"
            checkout scm
        }
    }
    stage("build") {
        sh 'sh script.sh'
    }
}
catch (err) {
    currentBuild.result = 'FAILURE'
    emailExtraMsg = "Build Failure:" + err.getMessage()
    throw err
}
}
}
LOG:
+ sh script.sh
$RELEASE_BRANCH is empty
[Pipeline] }
[Pipeline] // stage
[Pipeline] }
[Pipeline] // node
[Pipeline] End of Pipeline
Finished: SUCCESS
It looks like your script returns a zero status code; otherwise the sh step would throw an exception, as described in the sh step documentation. The problem may be that the exit status of sh script.sh is the exit status of the last executed command, and your script may do something after the error happens (e.g. echo something before exit). The simplest and most brutal way to make sure every error is propagated is to put set -e at the top of your script.
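For illustration, a minimal sketch of such a script (the $RELEASE_BRANCH check is hypothetical, modeled on the log message above):
#!/bin/sh
# Abort on the first command that exits with a non-zero status,
# so the sh step in the pipeline fails too.
set -e

# Hypothetical check, modeled on the log message above.
if [ -z "$RELEASE_BRANCH" ]; then
    echo '$RELEASE_BRANCH is empty'
    exit 1
fi

# ... rest of the build commands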
You don't need any catch to get this behavior (I mean failing on script error) unless you want to do some extra operations in case of an error. But if you do, then you should enclose the script execution in a try clause:
stage("build") {
    try {
        sh 'sh script.sh'
    }
    catch (err) {
        currentBuild.result = 'FAILURE'
        emailExtraMsg = "Build Failure:" + err.getMessage()
        throw err
    }
}

Jenkins pipeline not executing next stage after failure in one stage of running bash script

I am running a shell script inside a Docker container via a Jenkins Groovy pipeline script. The bash script sets some environment variables and then executes unit tests. The stdout of the unit test execution is dumped to a text file.
I later copy this text file out of the container for usage.
Here is the shell script:
#!/bin/bash
source /root/venv/bin/activate
export PYTHONPATH=/foo/bar
cd unit_tests
rm -f results.txt
python tests.py >> results.txt
My pipeline script is as follows:
stage('Run Unit Tests') {
    steps {
        sh '''
            docker-compose -f ./dir1/docker-compose-test.yml up -d
            docker cp /supporting_files/run_unit_tests.sh container_1:/foo/bar/
            docker exec container_1 /bin/bash run_unit_tests.sh
            docker cp container_1:/foo/bar/unit_tests/results.txt .
        '''
    }
}
stage('Reporting') {
    steps {
        //steps for reporting
    }
}
The problem is that whenever any test fails, results.txt has the appropriate text about the failures and their stack traces, but the pipeline stops executing, saying:
[Pipeline] }
ERROR: script returned exit code 1
Because of this I am not able to execute the next steps of parsing the results.txt file and reporting the results.
How do I make the pipeline execute the next stage?
I tried some things like:
1. Using catchError:
stage('Run Unit Tests') {
    steps {
        catchError(buildResult: 'SUCCESS', stageResult: 'FAILURE') {
            sh '''
            //Running the commands above
            '''
        }
    }
}
2. Using try:
try {
    stage('Run Unit Tests') {
        sh '''
        //Executing tests
        '''
    }
} catch (e) {
    echo e.toString()
}
But neither of them helps.
Also, the shell script simply dumps the stdout of the running tests into a text file, so I don't understand why exit code 1 is returned when the operation itself does not fail. I looked at the text file afterwards; it had the correct failure and error counts with stack traces.
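For what it's worth, the non-zero exit code is expected: python tests.py is the last command in the script, test runners exit non-zero when tests fail, and the >> redirection captures stdout but not the exit status, which docker exec then propagates to the sh step. A minimal sketch of one workaround, using the sh step's documented returnStatus parameter so the step reports the status instead of throwing, with the docker cp moved to its own step so it always runs:
stage('Run Unit Tests') {
    steps {
        script {
            // returnStatus: true makes sh return the exit code
            // instead of failing the step on a non-zero status.
            def rc = sh(returnStatus: true,
                        script: 'docker exec container_1 /bin/bash run_unit_tests.sh')
            echo "Unit tests exited with status ${rc}"
            // Copy the results out regardless of the test outcome.
            sh 'docker cp container_1:/foo/bar/unit_tests/results.txt .'
        }
    }
}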
