sh command is not executing in Jenkins pipeline project

While executing the sh step in a Jenkins pipeline project, we get the error below and the sh command does not work.
Error message:
[Pipeline] sh
Warning: JENKINS-41339 probably bogus PATH=/bin/sh:/usr/atria/bin:/usr/atria/bin:$PATH; perhaps you meant to use ‘PATH+EXTRA=/something/bin’?
process apparently never started in /var/lib/jenkins/workspace/QSearch_pipelineTest@tmp/durable-6d5deef7
(running Jenkins temporarily with -Dorg.jenkinsci.plugins.durabletask.BourneShellScript.LAUNCH_DIAGNOSTICS=true might make the problem clearer)
[Pipeline] }
Below is the Jenkins pipeline script:
pipeline {
    agent any
    environment {
        DATE = "December 17th"
    }
    stages {
        stage("Env Variables") {
            environment {
                NAME = "Alex"
            }
            steps {
                echo "Today is ${env.DATE}"
                echo "My name ${env.NAME}"
                echo "My path is ${env.PATH}"
                script {
                    env.WEBSITE = "phoenixNAP KB"
                    env.PATH = "/bin/sh:$PATH"
                }
                echo "This is an example for ${env.WEBSITE}"
                echo "My path ${env.PATH}"
                sh 'env'
                withEnv(["TEST_VARIABLE=TEST_VALUE"]) {
                    echo "The value of TEST_VARIABLE is ${env.TEST_VARIABLE}"
                }
            }
        }
    }
}
Below is the output of the Jenkins build job:
...
[Pipeline] echo
My path /bin/sh:/usr/atria/bin:/usr/atria/bin:$PATH
[Pipeline] sh
Warning: JENKINS-41339 probably bogus PATH=/bin/sh:/usr/atria/bin:/usr/atria/bin:$PATH; perhaps you meant to use ‘PATH+EXTRA=/something/bin’?
process apparently never started in /var/lib/jenkins/workspace/QSearch_pipelineTest@tmp/durable-6d5deef7
(running Jenkins temporarily with -Dorg.jenkinsci.plugins.durabletask.BourneShellScript.LAUNCH_DIAGNOSTICS=true might make the problem clearer)
[Pipeline] }
[Pipeline] // withEnv
[Pipeline] }
[Pipeline] // stage
[Pipeline] }
[Pipeline] // withEnv
[Pipeline] }
[Pipeline] // node
[Pipeline] End of Pipeline
ERROR: script returned exit code -2
Finished: FAILURE
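A note on the cause, which the warning itself hints at: the script assigns env.PATH = "/bin/sh:$PATH", but /bin/sh is a file, not a directory, and the literal $PATH left in the output shows the old value was never expanded, so the standard binary directories vanish from PATH and the agent can no longer find sh to start the durable task. A minimal sketch of the script block with a directory prepended instead (/usr/local/bin is an illustrative placeholder, not from the question):

```groovy
script {
    env.WEBSITE = "phoenixNAP KB"
    // Prepend a directory to PATH, never the shell binary itself.
    // "/usr/local/bin" here is an illustrative placeholder directory.
    env.PATH = "/usr/local/bin:${env.PATH}"
}
```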

Related

Jenkins pipeline : using variable created in shell 1, in shell 2

In my pipeline I have 2 stages, and each calls a different shell script (the way they're called is some custom code, as we have different pipelines interacting with each other):
stage('first') {
    steps {
        script {
            stageErrors.add(launchPortalStep.launchStepWithVerification('shell', "chmod +x ./script1.sh && ./script1.sh"))
        }
    }
}
stage('second') {
    steps {
        script {
            stageErrors.add(launchPortalStep.launchStepWithVerification('shell', "chmod +x ./script2.sh && ./script2.sh"))
        }
    }
}
The thing is, in script2 I need to use a variable that is set in script1. The only ways I've thought of would mean pretty much rewriting everything, and that's not a possibility.
Script1 must return either 0 or 1, so it's not possible to return the value and store it in an environment variable that would then be passed as a parameter to script2.
The scripts are also far too long to inline in the pipeline with sh commands (that would add at least 700 lines).
What can I do?
You can use a properties file (even a temp file) to pass the needed variables between stages.
Write the needed variables into a properties file (in VAR=VAL format) in the shell script called in the first stage (sh 'echo "MY_ENV_VAR=test_var" > test.prop').
Read the properties file in the second stage with the readProperties function, which is part of the Pipeline Utility Steps plugin (def props = readProperties file: 'test.prop'). Then set the needed variables as environment variables (env.MY_ENV_VAR = props.MY_ENV_VAR), so you can use them in your second shell script as environment variables (sh 'echo MY_ENV_VAR = ${MY_ENV_VAR}').
Complete example code:
pipeline {
    agent { label 'MISC' }
    stages {
        stage('First') {
            steps {
                sh 'echo "MY_ENV_VAR=test_var" > test.prop'
            }
        }
        stage('Second') {
            steps {
                script {
                    // readProperties is a step in Pipeline Utility Steps plugin
                    def props = readProperties file: 'test.prop'
                    // Assuming the key name is MY_ENV_VAR in properties file.
                    env.MY_ENV_VAR = props.MY_ENV_VAR
                    sh 'echo MY_ENV_VAR = ${MY_ENV_VAR}'
                }
            }
        }
    }
}
Output:
[Pipeline] stage
[Pipeline] { (First)
[Pipeline] sh
+ echo MY_ENV_VAR=test_var
[Pipeline] }
[Pipeline] // stage
[Pipeline] stage
[Pipeline] { (Second)
[Pipeline] script
[Pipeline] {
[Pipeline] readProperties
[Pipeline] sh
+ echo MY_ENV_VAR = test_var
MY_ENV_VAR = test_var
[Pipeline] }
[Pipeline] // script
[Pipeline] }
[Pipeline] // stage
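For reference, the same VAR=VAL round trip can be sketched in plain shell, with readProperties replaced by sourcing the file (the file name test.prop is taken from the example above):

```shell
# Stage "First" equivalent: write the variable in VAR=VAL format.
echo "MY_ENV_VAR=test_var" > test.prop

# Stage "Second" equivalent: readProperties is replaced here by sourcing
# the file, which defines the same variable in the current shell.
. ./test.prop
echo "MY_ENV_VAR = ${MY_ENV_VAR}"
```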

How to run a Jenkins declarative pipeline on a node selected via parameter and label option

I want to run a pipeline with the node set as parameter via the Node and Label plugin.
How do I change the declarative pipeline
pipeline {
agent {
label 'whatever'
}
...
to use EXECUTION_NODE as agent to execute the pipeline? This seems to be much more complicated than I thought, or I am missing something obvious.
The issue is this: to present the "Build with parameters" page, Jenkins needs to run your pipeline and parse its parameters. To run a pipeline, Jenkins needs a node; to have a node, it parses your pipeline. So the node is already selected by the time the dialog is shown. Moreover, in a declarative pipeline the nodes of all the stages are selected at the beginning.
You can try running a scripted pipeline, or a combination of scripted and declarative, by calling node() and supplying params.EXECUTION_NODE as the label. A scripted pipeline executes the script line by line.
Edit: this is working:
NODE = null
echo "This should be Null: $NODE"
node() {
    stage("Define node") {
        NODE = params.NODE
        echo "This is now $NODE"
    }
}
pipeline {
    agent { node { label "$NODE" } }
    parameters { string(name: 'NODE', defaultValue: 'some_node', description: '') }
    stages {
        stage("Main") {
            steps {
                echo "Hi"
            }
        }
    }
}
Here is the output of a second run, with 'master' as the parameter:
Started by user marat
Running in Durability level: MAX_SURVIVABILITY
[Pipeline] Start of Pipeline
[Pipeline] echo
This should be Null: null
[Pipeline] node
Running on Jenkins in /home/jenkins/workspace/test
[Pipeline] {
[Pipeline] stage
[Pipeline] { (Define node)
[Pipeline] echo
This is now master
[Pipeline] }
[Pipeline] // stage
[Pipeline] }
[Pipeline] // node
[Pipeline] node
Running on master in /var/jenkins_home/workspace/test
[Pipeline] {
[Pipeline] stage
[Pipeline] { (Main)
[Pipeline] echo
Hi
[Pipeline] }
[Pipeline] // stage
[Pipeline] }
[Pipeline] // node
[Pipeline] End of Pipeline
Finished: SUCCESS

Jenkins pipeline to exit if shell script fails in any stage

I have Jenkins pipeline jobs which run shell scripts internally. Even though the shell script fails, the job still shows as passed.
My Pipeline:
stage('Code Checkout') {
    timestamps {
        step([$class: 'WsCleanup'])
        echo "check out======GIT =========== on ${env.gitlabBranch}"
        checkout scm
    }
}
stage("build") {
    sh 'sh script.sh'
}
}
catch (err) {
    currentBuild.result = 'FAILURE'
    emailExtraMsg = "Build Failure:" + err.getMessage()
    throw err
}
}
}
LOG:
+ sh script.sh
$RELEASE_BRANCH is empty
[Pipeline] }
[Pipeline] // stage
[Pipeline] }
[Pipeline] // node
[Pipeline] End of Pipeline
Finished: SUCCESS
It looks like your script returns with a zero status code; otherwise it would throw an exception, as described in the sh step documentation. The problem may be that the exit status of sh script.sh is the exit status of the last executed command, and your script may do something after the error happens (e.g. echo something before exit). The simplest and most brutal way to make sure every error is propagated is to put set -e at the top of your script.
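Outside Jenkins, the masking effect and the set -e fix can be sketched in plain shell (the file names under /tmp are illustrative):

```shell
# A script whose last command succeeds masks an earlier failure,
# because the script's exit status is that of the LAST command.
cat > /tmp/masked.sh <<'EOF'
false
echo "cleanup message"
EOF

sh /tmp/masked.sh
echo "masked exit status: $?"    # 0 -- Jenkins would report SUCCESS

# With set -e the script aborts at the first failing command.
cat > /tmp/strict.sh <<'EOF'
set -e
false
echo "never reached"
EOF

status=0
sh /tmp/strict.sh || status=$?
echo "strict exit status: $status"    # 1 -- the Jenkins sh step would fail
```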
You don't need a catch block for this behavior (failing on script error) unless you want to perform extra operations in case of error. If you do, enclose the script execution in a try clause:
stage("build") {
    try {
        sh 'sh script.sh'
    }
    catch (err) {
        currentBuild.result = 'FAILURE'
        emailExtraMsg = "Build Failure:" + err.getMessage()
        throw err
    }
}

Multiple variables in Jenkins shell

I'm trying to print two variables in a Jenkins shell step (one of which is a global). When I print each of them in its own shell step it works; however, when I put both variables on a single line it fails. See the output below; the line seems to be cut off after the first variable.
I've tried printing two local variables, and that seems to work. However, I need the global one.
#!/usr/bin/env groovy
def START
node('master') {
    // options {
    //     timestamps()
    // }
    stage("one") {
        script {
            START = sh(script: 'date --utc +%FT%T', returnStdout: true)
        }
        stage("two") {
            def END = sh(script: 'date --utc +%FT%T', returnStdout: true)
            sh "echo start $START"
            sh "echo end $END"
            sh "echo $START and $END"
        }
    }
}
+ date --utc +%FT%T
[Pipeline] sh
+ echo start 2019-08-01T14:48:08
start 2019-08-01T14:48:08
[Pipeline] sh
+ echo end 2019-08-01T14:48:09
end 2019-08-01T14:48:09
[Pipeline] sh
+ echo 2019-08-01T14:48:08
2019-08-01T14:48:08
+ and 2019-08-01T14:48:09
/var/jenkins_home@tmp/durable-979e1b9e/script.sh: 2: /var/jenkins_home@tmp/durable-979e1b9e/script.sh: and: not found
[Pipeline] }
[Pipeline] // stage
[Pipeline] }
[Pipeline] // stage
[Pipeline] }
[Pipeline] // node
[Pipeline] End of Pipeline
ERROR: script returned exit code 127
Finished: FAILURE
The first two commands work only because $START/$END come at the very end of the line. The real culprit is that sh(..., returnStdout: true) keeps the trailing newline printed by date. When START is embedded in the middle of a line, that newline splits the generated shell script into two commands: the shell runs echo 2019-08-01T14:48:08, then tries to execute and 2019-08-01T14:48:09 as a command, which fails with "and: not found" and exit code 127. Strip the newline with .trim() when capturing the output.
For more on Groovy GString interpolation, have a look at these examples: http://grails.asia/groovy-gstring-examples
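A minimal sketch of the fix (the .trim() calls are my addition, not from the original post):

```groovy
// Trim the trailing newline that `date` writes, so each variable is a
// single-line string and the generated shell script stays on one line.
START = sh(script: 'date --utc +%FT%T', returnStdout: true).trim()
def END = sh(script: 'date --utc +%FT%T', returnStdout: true).trim()
sh "echo $START and $END"
```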

Getting SerializableException in Jenkinsfile on curl call

I'm working on a pipeline script that isn't even building anything. It clones a repo and then gets some info about the repo, and also uses the BitBucket REST API to get other information about the repository.
The following is an excerpt of the Jenkinsfile:
stageName = 'GET-COMMITS-AND-USERS'
stage(stageName) {
    withCredentials([[$class: 'UsernamePasswordMultiBinding', credentialsId: params.JP_MechIdCredentials, usernameVariable: 'USERNAME', passwordVariable: 'PASSWORD']]) {
        def uniqueCommitterMap = {}
        def format = 'yyyy-MM-dd'
        def now = new Date()
        def aWhileAgo = now - params.JP_DaysInPastToLookFor.toInteger()
        def uniqueCommitterEmails = sh(returnStdout: true, script: "git log --date=short --pretty=format:'%ce' --after='${aWhileAgo.format(format)}' --before='${now.format(format)}' | sort -u")
        now = null
        aWhileAgo = null
        println "uniqueCommitterEmails[${uniqueCommitterEmails}]"
        def uniqueCommitterEmailList = uniqueCommitterEmails.split(/[ \t\n]+/)
        uniqueCommitterEmails = null
        println "uniqueCommitterEmailList[${uniqueCommitterEmailList}] size[${uniqueCommitterEmailList.size()}]"
        for (int ctr = 0; ctr < uniqueCommitterEmailList.size(); ++ctr) {
            println "entry[${uniqueCommitterEmailList[ctr]}]"
            println "entry[${uniqueCommitterEmailList[ctr].split('@')}]"
            uniqueCommitterMap[uniqueCommitterEmailList[ctr].split("@")[0]] = uniqueCommitterEmailList[ctr]
        }
        println "uniqueCommitterMap[${uniqueCommitterMap}]"
        println "end of uCM."
        uniqueCommitterEmailList = null
        def cmd = "curl -u ${USERNAME}:${PASSWORD} https://.../rest/api/1.0/projects/${params.JP_ProjectName}/repos/${params.JP_RepositoryName}/permissions/users?limit=9999"
        USERNAME = null
        PASSWORD = null
        println "cmd[${cmd}]"
        def usersJson = sh(returnStdout: true, script: cmd.toString())
        println "Past curl call." // Don't get here
The following is an excerpt of the console output when I run this job with appropriate parameters:
[Pipeline] echo
end of uCM.
cmd[curl -u ****:**** https://.../rest/api/1.0/projects/.../repos/.../permissions/users?limit=9999]
[Pipeline] echo
[Pipeline] sh
[workspace] Running shell script
[Pipeline] }
[Pipeline] // withCredentials
[Pipeline] }
[Pipeline] // stage
[Pipeline] }
[Pipeline] // node
[Pipeline] End of Pipeline
[DOSSIER] Response Code: 201
java.io.NotSerializableException: java.io.StringWriter
at org.jboss.marshalling.river.RiverMarshaller.doWriteObject(RiverMarshaller.java:860)
at org.jboss.marshalling.river.BlockMarshaller.doWriteObject(BlockMarshaller.java:65)
at org.jboss.marshalling.river.BlockMarshaller.writeObject(BlockMarshaller.java:56)
at org.jboss.marshalling.MarshallerObjectOutputStream.writeObjectOverride(MarshallerObjectOutputStream.java:50)
at org.jboss.marshalling.river.RiverObjectOutputStream.writeObjectOverride(RiverObjectOutputStream.java:179)
at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:344)
at java.util.HashMap.internalWriteEntries(HashMap.java:1777)
at java.util.HashMap.writeObject(HashMap.java:1354)
at sun.reflect.GeneratedMethodAccessor28.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
As you can see, it appears to execute the "sh" step to call "curl" for the BitBucket REST API, but it doesn't get past that. I can't figure out what object it's complaining about.
Update:
I'm running Jenkins 2.19.2.
The pipeline has the following settings:
"Do not allow concurrent builds": on
10 total defined parameters, one a Credentials parameter, which is referenced in this block
To answer your question I ran Jenkins v2.32.2 from the official Dockerfile and created the following test pipeline:
node() {
    stage('serialize') {
        def USERNAME = 'myusername'
        def PASSWORD = 'mypassword'
        def cmd = "echo curl -u ${USERNAME}:${PASSWORD} https://.../${params.TEST_PARAM1}/permissions/users?limit=9999"
        USERNAME = null
        PASSWORD = null
        println "cmd[${cmd}]"
        def usersJson = sh(returnStdout: true, script: cmd)
        println "Past curl call."
    }
}
I also added a text parameter to the build job to have something similar to your params.JP_ProjectName variable.
And this is my output when running with the text parameter set to "defaultValue modified":
Started by user admin
[Pipeline] node
Running on master in /var/jenkins_home/workspace/42217046
[Pipeline] {
[Pipeline] stage
[Pipeline] { (serialize)
[Pipeline] echo
cmd[echo curl -u myusername:mypassword https://.../defaultValue modified/permissions/users?limit=9999]
[Pipeline] sh
[42217046] Running shell script
+ echo curl -u myusername:mypassword https://.../defaultValue modified/permissions/users?limit=9999
[Pipeline] echo
Past curl call.
[Pipeline] }
[Pipeline] // stage
[Pipeline] }
[Pipeline] // node
[Pipeline] End of Pipeline
Finished: SUCCESS
As you can see, the pipeline finished successfully, and I can see no issue with it.
Maybe you can update your question with a screenshot of your job configuration and the version number of your Jenkins installation.
I came across the same issue, but it seems that the issue is not caused by sh at all. It is probably caused by a variable you've defined above the sh step, which is not Serializable.
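If a non-Serializable local is indeed the culprit, a common workaround is to keep such objects out of the CPS-transformed scope, e.g. by doing the work inside a @NonCPS method that returns only Serializable data. A sketch of that pattern (the helper name committerNames is made up for illustration):

```groovy
// Sketch: keep non-Serializable objects out of CPS-transformed scope.
@NonCPS
def committerNames(String raw) {
    // Anything non-Serializable (matchers, writers, iterators) stays
    // inside this method; only a plain, Serializable Map escapes.
    def map = [:]
    raw.split(/[ \t\n]+/).each { email ->
        map[email.split('@')[0]] = email
    }
    return map
}

node {
    def raw = sh(returnStdout: true, script: "git log --pretty=format:'%ce' | sort -u")
    def committers = committerNames(raw)   // Serializable Map only
    sh "echo found ${committers.size()} committers"
}
```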
