I have something like this in a Jenkinsfile (Groovy), and I want to record the stdout and the exit code in a variable in order to use that information later.
sh "ls -l"
How can I do this, especially as it seems that you cannot really run arbitrary Groovy code inside a Jenkinsfile?
The latest version of the pipeline sh step allows you to do the following:
// Git committer email
GIT_COMMIT_EMAIL = sh(
    script: 'git --no-pager show -s --format=\'%ae\'',
    returnStdout: true
).trim()
echo "Git committer email: ${GIT_COMMIT_EMAIL}"
Another feature is the returnStatus option.
// Test commit message for flags
BUILD_FULL = sh(
    script: "git log -1 --pretty=%B | grep '\\[jenkins-full]'",
    returnStatus: true
) == 0
echo "Build full flag: ${BUILD_FULL}"
These options were added based on this issue.
See official documentation for the sh command.
For declarative pipelines (see comments), you need to wrap the code in a script step:
script {
    GIT_COMMIT_EMAIL = sh(
        script: 'git --no-pager show -s --format=\'%ae\'',
        returnStdout: true
    ).trim()
    echo "Git committer email: ${GIT_COMMIT_EMAIL}"
}
The current Pipeline version natively supports returnStdout and returnStatus, which make it possible to get the output or status from sh/bat steps.
An example:
// Note: the returned string includes a trailing newline; call .trim() if you don't want it
def ret = sh(script: 'uname', returnStdout: true)
println ret
See the official documentation.
The quick answer is this:
sh "ls -l > commandResult"
result = readFile('commandResult').trim()
I think there exists a feature request to be able to get the result of the sh step directly, but as far as I know, there is currently no other option.
EDIT: JENKINS-26133
EDIT2: Not sure since which version, but sh/bat steps can now return the standard output. Simply:
def output = sh returnStdout: true, script: 'ls -l'
If you want to get the stdout AND know whether the command succeeded or not, just use returnStdout and wrap it in an exception handler:
Scripted pipeline:
try {
    // Fails with non-zero exit if dir1 does not exist
    def dir1 = sh(script: 'ls -la dir1', returnStdout: true).trim()
} catch (Exception ex) {
    println("Unable to read dir1: ${ex}")
}
Output:
[Pipeline] sh
[Test-Pipeline] Running shell script
+ ls -la dir1
ls: cannot access dir1: No such file or directory
[Pipeline] echo
Unable to read dir1: hudson.AbortException: script returned exit code 2
Unfortunately, hudson.AbortException is missing any useful method to obtain that exit status, so if the actual value is required you'd need to parse it out of the message (ugh!).
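For illustration, here is a minimal sketch of that message-scraping approach (it assumes the usual "script returned exit code N" wording of the exception message, which is an implementation detail):
try {
    sh(script: 'ls -la dir1', returnStdout: true)
} catch (Exception ex) {
    def msg = ex.getMessage() ?: ''
    // The message typically looks like "script returned exit code 2";
    // grab the last whitespace-separated token as the code
    def exitCode = msg.contains('exit code') ? msg.tokenize().last().toInteger() : -1
    println("Command failed with exit code ${exitCode}")
}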
Contrary to the Javadoc https://javadoc.jenkins-ci.org/hudson/AbortException.html the build is not failed when this exception is caught. It fails when it's not caught!
Update:
If you also want the STDERR output from the shell command, Jenkins unfortunately fails to properly support that common use-case. A 2017 ticket JENKINS-44930 is stuck in a state of opinionated ping-pong whilst making no progress towards a solution - please consider adding your upvote to it.
As to a solution now, there could be a couple of possible approaches:
a) Redirect STDERR to STDOUT with 2>&1 - but it's then up to you to parse that out of the main output, and you won't get the output if the command failed, because you're in the exception handler.
b) Redirect STDERR to a temporary file (the name of which you prepare earlier) with 2>filename (but remember to clean up the file afterwards) - i.e. the main code becomes:
def stderrfile = 'stderr.out'
try {
    def dir1 = sh(script: "ls -la dir1 2>${stderrfile}", returnStdout: true).trim()
} catch (Exception ex) {
    def errmsg = readFile(stderrfile)
    println("Unable to read dir1: ${ex} - ${errmsg}")
}
c) Go the other way: set returnStatus=true instead, dispense with the exception handler, and always capture output to a file, i.e.:
def outfile = 'stdout.out'
def status = sh(script: "ls -la dir1 >${outfile} 2>&1", returnStatus: true)
def output = readFile(outfile).trim()
if (status == 0) {
    // output is the directory listing from stdout
} else {
    // output is the error message from stderr
}
Caveat: the above code is Unix/Linux-specific - Windows requires completely different shell commands.
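For Windows agents, a rough equivalent of option c) using the bat step could look like the following sketch (dir dir1 stands in for whatever command you actually need):
def outfile = 'stdout.out'
// bat also supports returnStatus, and cmd.exe understands > and 2>&1 redirection
def status = bat(script: "dir dir1 > ${outfile} 2>&1", returnStatus: true)
def output = readFile(outfile).trim()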
Here is a sample case, which I believe will make sense:
node('master') {
    stage('stage1') {
        // split() breaks the combined stdout of all commands into whitespace-separated tokens
        def commit = sh(returnStdout: true, script: '''echo hi
echo bye | grep -o "e"
date
echo lol''').split()
        echo "${commit[-1]}" // the last token of the combined output
    }
}
For those who need to use the output in subsequent shell commands rather than in Groovy, something like this example could be done:
stage('Show Files') {
    environment {
        MY_FILES = sh(script: 'cd mydir && ls -l', returnStdout: true)
    }
    steps {
        sh '''
            echo "$MY_FILES"
        '''
    }
}
I found the examples on Code Maven to be quite useful.
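In a scripted pipeline, a similar effect can be achieved with the withEnv step; a minimal sketch (assuming mydir exists on the agent):
def myFiles = sh(script: 'ls -l mydir', returnStdout: true).trim()
withEnv(["MY_FILES=${myFiles}"]) {
    sh 'echo "$MY_FILES"'
}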
All the above methods will work, but to use the variable as an environment variable inside your code you need to export it first.
script {
    // Run the command of your choice and capture its output to a file
    sh "shell-command-here > command"
    command_var = readFile('command').trim()
    // Each sh step runs in a fresh shell, so a plain `sh "export ..."` would not persist;
    // assigning to env makes the value visible to subsequent steps
    env.command_var = command_var
}
Replace the shell command with the command of your choice. Now, if you are using Python code launched from a later step, you can just call os.getenv("command_var"), which will return the output of the shell command executed previously.
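For example, a hypothetical follow-up step consuming that variable (the inline Python one-liner is just for illustration):
sh 'python3 -c "import os; print(os.getenv(\'command_var\'))"'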
How to read a shell variable in Groovy / how to assign a shell return value to a Groovy variable.
Requirement: open a text file, read the lines using shell, store the values in Groovy, and get the parameters from each line.
Here , is the delimiter.
Ex: releaseModule.txt
./APP_TSBASE/app/team/i-home/deployments/ip-cc.war/cs_workflowReport.jar,configurable-wf-report,94,23crb1,artifact
./APP_TSBASE/app/team/i-home/deployments/ip.war/cs_workflowReport.jar,configurable-temppweb-report,394,rvu3crb1,artifact
========================
Here we want to get the module name (2nd parameter, configurable-wf-report), the build number (3rd parameter, 94), and the commit id (4th parameter, 23crb1).
def module = sh(script: """awk -F',' '{ print \$2 "," \$3 "," \$4 }' releaseModule.txt | sort -u """, returnStdout: true).trim()
echo module
List lines = module.split('\n').findAll { !it.startsWith(',') }
def buildid
def Modname
lines.each {
    List det1 = it.split(',')
    buildid = det1[1].trim()
    Modname = det1[0].trim()
    tag = det1[2].trim()
    echo Modname
    echo buildid
    echo tag
}
If you don't have a single sh command but a block of sh commands, returnStdout won't work then.
I had a similar issue where I applied something which is not a clean way of doing this, but it eventually worked and served the purpose.
Solution -
In the shell block, echo the value and write it into some file.
Outside the shell block and inside the script block, read this file, trim it, and assign it to any local/env variable.
example -
steps {
    script {
        sh '''
            # Using '>' so a new file is created each time with the newest value of PATH
            echo $PATH > path.txt
        '''
        path = readFile(file: 'path.txt')
        path = path.trim() // local Groovy variable assignment
        // One can assign this value to an env variable as below -
        env.PATH = path // if you want to assign it to an env var
        // Note: params is read-only at runtime, so it cannot be assigned this way
    }
}
The easiest way is to do it this way:
my_var=`echo 2`
echo $my_var
Output:
2
Note that this is not a simple single quote but a backquote ( ` ).
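Inside a Jenkinsfile, that shell-level capture would sit within an sh step; for example, a small sketch combining it with returnStdout:
script {
    def out = sh(returnStdout: true, script: '''
        my_var=`echo 2`
        echo $my_var
    ''').trim()
    echo out // prints 2
}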
For example, when I echo %APPDATA% on the cmd line, it is expected to print the path to the AppData folder, like this: "C:\Users\User\AppData". How can I do the same in a Jenkins pipeline?
def app1 = "%APPDATA%"
pipeline {
    agent any
    stages {
        stage('Test') {
            steps {
                echo "\"${app1}\""
            }
        }
    }
}
The following methods should work.
echo "${APPDATA}"
echo "${env.APPDATA}"
echo "$APPDATA"
I have a Jenkinsfile where I am trying to loop over an array and pass its elements to a shell script block, which works fine.
But when I loop over a HashMap and try to pass its keys, it throws an error.
stage('Validation') {
    steps {
        script {
            test1 = ["elem1", "elem2"]
            test2 = [key1: "value1", key2: "value2"]
            for (defaults in test1) {
                test = defaults
                echo "before shell====> ${test}"
                status = sh(returnStdout: true, script: """
                    echo "${test}"
                """).trim()
            }
            echo "======started next HashMap loop==========="
            for (defaults in test2) {
                test = defaults.key
                echo "before shell====> ${test}"
                status = sh(returnStdout: true, script: """
                    echo "${test}"
                """).trim()
            }
        }
    }
}
The output ends with this error:
Caused: java.io.NotSerializableException: java.util.LinkedHashMap$Entry
When I replace the second loop to iterate over the HashMap entries like this:
test2.each { entry ->
    test = entry.key
    echo "before shell====> ${test}"
    status = sh(returnStdout: true, script: """
        echo "${test}"
    """).trim()
}
I get the expected output: the keys are printed.
That was probably what you wanted. For some reason the "for each" loop does not trigger serialization, as stated in the comments of JENKINS-49732.
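For completeness, another common workaround is to avoid holding non-serializable Map.Entry objects across pipeline steps at all, e.g. by iterating over a serializable copy of the keys, or by extracting them in a @NonCPS helper (a sketch; keysOf is a made-up helper name):
// Option 1: iterate over a serializable copy of the keys
for (k in new ArrayList(test2.keySet())) {
    sh "echo ${k}"
}
// Option 2: extract the keys in a @NonCPS helper so the Entry objects never reach CPS code
@NonCPS
def keysOf(Map m) {
    m.collect { entry -> entry.key }
}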
I am trying to log into an instance and check whether the file test.txt is not empty; if so, echo some text and make the build unstable using the Jenkins pipeline (Jenkinsfile). But that's not working.
I have this:
post {
    always {
        sh "ssh ubuntu@$Ip 'if [ -s test.txt ] ; then echo some text && cat test.txt ; fi'"
        currentBuild.result = 'UNSTABLE'
    }
}
Instead of doing above, can I parse through the console log of the latest build to find something eg: some text and if that's found I want to make the build unstable
You need to return standard out from the script:
String stdOut = sh(returnStdout: true, script: "ssh ubuntu@$Ip 'if [ -s test.txt ] ; then echo some text && cat test.txt ; fi'")
if (stdOut.trim() != "") {
    // test.txt exists and is non-empty, so mark the build unstable
    currentBuild.result = 'UNSTABLE'
}
Or, you could use returnStatus to return the exit code of the script instead. The documentation for the sh step can be found here.
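For example, a sketch of the returnStatus variant (test -s exits with 0 only when the file exists and is non-empty):
def status = sh(returnStatus: true, script: "ssh ubuntu@$Ip 'test -s test.txt'")
if (status == 0) {
    currentBuild.result = 'UNSTABLE'
}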
In Jenkins, I want to get user input and pass it to a shell script for further use.
I tried to set it as an environment variable, but the shell script failed to get the latest value and the old value is echoed.
pipeline {
    agent none
    environment {
        myVar = 'something_default'
    }
    stages {
        stage('First stage') {
            agent none
            steps {
                echo "update myVar by user input"
                script {
                    test = input message: 'Hello',
                            ok: 'Proceed?',
                            parameters: [
                                string(name: 'input', defaultValue: 'update it!', description: '')
                            ]
                    myVar = "${test['input']}"
                }
                echo "${myVar}" // value is updated
            }
        }
        stage('stage 2') {
            agent any
            steps {
                echo "${myVar}" // check whether myVar passes between stages; it outputs the expected value
                sh("./someShell.sh") // the script just contains an echo, e.g. echo "myVar is ${myVar}"
                                     // it echoes the old value, i.e. something_default
            }
        }
    }
}
The environment variables that we set in the pipeline script are accessible only within the script. So even if you declare your variable as global, it will not work inside a shell script.
The only option I can think of is to pass it as an argument to the shell script:
sh("./someShell.sh ${myVar}")
EDIT:
Updated answer based on OP's query about a shell script for parsing the input:
LINE="[fristPara:100, secondPaa:abc]"
LINE=$(echo $LINE | sed 's/\[//g')
LINE=$(echo $LINE | sed 's/\]//g')
while read -d, -r pair; do
IFS=':' read -r key val <<<"$pair"
echo "$key = $val"
done <<<"$LINE,
"
You need to pass the variables between your stages as environment variables, e.g. like this:
stage("As user for input") {
steps {
env.DO_SOMETING = input (...)
env.MY_VAR = ...
}
}
stage("Do something") {
when { environment name: 'DO_SOMETING', value: 'yes' }
steps {
echo "DO_SOMETING has the value ${env.DO_SOMETHING}"
echo "MY_VAR has the value ${env.MY_VAR}"
}
}
You have to declare the variable at global scope so that both places refer to the same instance.
def myVar
pipeline { ... }
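For instance, a minimal sketch of that pattern (the stage names are made up):
def myVar // global scope, shared by all stages

pipeline {
    agent any
    stages {
        stage('Set') {
            steps {
                script {
                    myVar = sh(script: 'uname', returnStdout: true).trim()
                }
            }
        }
        stage('Use') {
            steps {
                echo "myVar is ${myVar}"
            }
        }
    }
}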