Using parameters within a shell command in a Jenkinsfile for a Jenkins pipeline

I want to use defined parameters in Jenkinsfile in several shell commands, but I get an exception. In my example I want to execute a simple docker command. The parameter defines the path to docker executable.
This is my very short Jenkinsfile:
pipeline {
    agent any
    parameters {
        string(defaultValue: '/Applications/Docker.app/Contents/Resources/bin/docker', description: '', name: 'docker')
    }
    stages {
        stage('Test') {
            steps {
                sh 'sudo ${params.docker} ps -a'
            }
        }
    }
}
And I get the following exception:
[e2e-web-tests_master-U4G4QJHPUACAEACYSISPVBCMQBR2LS5EZRVEKG47I2XHRI54NCCQ] Running shell script
/Users/Shared/Jenkins/Home/workspace/e2e-web-tests_master-U4G4QJHPUACAEACYSISPVBCMQBR2LS5EZRVEKG47I2XHRI54NCCQ#tmp/durable-e394f175/script.sh: line 2: ${params.docker}: bad substitution
When I change the Jenkinsfile so that the parameter is not used inside the shell command, it passes successfully:
pipeline {
    agent any
    parameters {
        string(defaultValue: '/Applications/Docker.app/Contents/Resources/bin/docker', description: '', name: 'docker')
    }
    stages {
        stage('Test') {
            steps {
                sh 'sudo /Applications/Docker.app/Contents/Resources/bin/docker ps -a'
            }
        }
    }
}
So, how can I use parameters inside a shell command in Jenkinsfile? I tried string and text as parameter types.

The issue you have is that single quotes produce a standard Java String.
Double quotes produce a templatable String, which will return a GString if it contains a template expression, or else a standard Java String.
So if you use double quotes:
stages {
    stage('Test') {
        steps {
            sh "sudo ${params.docker} ps -a"
        }
    }
}
then Groovy substitutes the value of params.docker for ${params.docker} inside the sh script before it runs in the pipeline.
If you want to put " characters inside "sudo ${params.docker} ps -a", it doesn't work like bash quoting (which is confusing); you use Java-style escaping instead, so "sudo \"${params.docker}\" ps -a".
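In context, that escaped form might look like the sketch below (same parameter as above; the escaped quotes reach the shell, so the path can contain spaces):
steps {
    sh "sudo \"${params.docker}\" ps -a"
}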

Related

Jenkins Declarative Pipeline is not supporting shell/bash syntax

I have a shell script inside my Jenkins pipeline which calls mvn. For that I have to pass a variable value to mvn. The variable is not being substituted inside the Jenkins pipeline's shell, but when I try the same thing from my local machine's shell it works as expected.
ARTIFACT_NAME="Sample_Artifact"
pipeline {
    agent {
        node {
            label "${AGENT}"
        }
    }
    stages {
        stage("Setting MultiJob Properties") {
            steps {
                sh '''set +x
                export VERSION=$(mvn -B -q -Dexec.executable=echo -Dexec.args=\${${ARTIFACT_NAME}} )
                echo $VERSION
                '''
            }
        }
    }
}
Expected Process: export VERSION=$(mvn -B -q -Dexec.executable=echo -Dexec.args=${Sample_Artifact} )
Expected Output: 1.0001
ARTIFACT_NAME - I am passing it from the Jenkins UI.
${${ARTIFACT_NAME}} - This substitution works perfectly in Freestyle jobs, but it throws an error in Pipeline jobs.
Error Message: script.sh: 3: Bad substitution
Can anyone please help me resolve the issue?
As Ian wrote, you're passing the whole script as a literal (''') instead of an interpolated string ("""), so the variable name doesn't get substituted with its value:
pipeline {
    agent {
        node {
            label AGENT
        }
    }
    stages {
        stage("Setting MultiJob Properties") {
            steps {
                sh """set +x
                export VERSION=\$(mvn -B -q -Dexec.executable=echo -Dexec.args=\${$ARTIFACT_NAME})
                echo \$VERSION"""
            }
        }
    }
}
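A minimal sketch of the underlying difference (hypothetical variable, not from the original job):
def artifact = "Sample_Artifact"
sh 'echo ${artifact}'   // single quotes: the shell receives the literal text ${artifact} and,
                        // with no shell variable of that name defined, prints an empty line
sh "echo ${artifact}"   // double quotes: Groovy interpolates first, so the shell runs: echo Sample_Artifact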

Access string variable from bash in jenkinsfile groovy script

I'm building several Android apps in a Docker image using Gradle and a bash script. The script is triggered by Jenkins, which runs the Docker image.
In the bash script I gather information about the success of the builds. I want to pass that information to the Groovy script of the Jenkinsfile.
I tried creating a txt file in the Docker container, but the Groovy script in the Jenkinsfile cannot find that file.
This is the groovy script of my jenkinsfile:
script {
    try {
        sh script:'''
        #!/bin/bash
        ./jenkins.sh
        '''
    } catch(e) {
        currentBuild.result = "FAILURE"
    } finally {
        String buildResults = null
        try {
            def pathToBuildResults = "[...]/buildResults.txt"
            buildResults = readFile "${pathToBuildResults}"
        } catch(e) {
            buildResults = "error receiving build results. Error: " + e.toString()
        }
    }
}
In my jenkins.sh bash script I do the following:
[...]
buildResults+=" $appName: Build Failed!"   # this is done for several apps
echo "$buildResults" | cat > $pathToBuildResults   # this works; I checked that the file is created
[...]
The file is created, but Groovy cannot find it. I think the reason is that the Jenkins script does not run inside the Docker container.
How can I access the string buildResults from the bash script in my Groovy Jenkins script?
One option you have, in order to avoid reading the results file at all, is to modify your jenkins.sh script to print the results to standard output instead of writing them to a file, and then use the sh step to capture that output.
Something like:
script {
    try {
        String buildResults = sh returnStdout: true, script:'''
        #!/bin/bash
        ./jenkins.sh
        '''
        // You now have the output of jenkins.sh inside the buildResults variable
    } catch(e) {
        currentBuild.result = "FAILURE"
    }
}
This way you avoid handling output files and directly get the results you need, which you can then parse and use however you like.
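Note that the captured output usually ends with a trailing newline, so it is worth trimming it before use; a small sketch:
String buildResults = sh(returnStdout: true, script: './jenkins.sh').trim()
echo "Build results: ${buildResults}"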

Env variable value got reset to original even after assigning the pom version number in jenkins script

I have a scenario where I have to read the Maven POM versions for different components and assign the version to a Docker image (tag). But after I read the POM and assign the version to a global variable, the variable resets to its original value later in the Groovy Jenkins script. Below is a sample. HMAP_VERSION should be 1.2.1, but when it is used around the line sh "docker login -u ${ART_USERNAME} -p ${ART_PASSWORD} test.com" the value is UNINITIALISED.
Can somebody tell me what might have gone wrong? This works with a single Maven file which is read in the environment block as below:
environment {
    CLOADER_VERSION = readMavenPom().getVersion()
}
Below is a sample of what I'm trying to do.
#! groovy
environment {
    HMAP_VERSION = "UNINITIALISED"
    CLOADER_VERSION = "UNINITIALISED"
}
stages {
    stage('Build Cloader') {
        steps {
            checkout([$class: 'GitSCM' "rest is removed")
            dir('isa-casloader') {
                script {
                    CLOADER_VERSION = readMavenPom().getVersion()
                }
                container('build') {
                    sh '/opt/apache-maven/bin/mvn -s settings.xml -B clean install -DskipTests=true'
                }
            }
        }
    }
    stage ('Build Casloader Docker Image') {
        steps {
            dir('isa-casloader') {
                container('tools') {
                    echo("CLOADER_VERSION=${CLOADER_VERSION}")
                    withCredentials() {
                        sh "docker login -u ${ART_USERNAME} -p ${ART_PASSWORD} testing.com"
                        sh 'docker build -t testing.com:${CLOADER_VERSION} .'
                        sh 'docker push testing.com:${CLOADER_VERSION}'
                    }
                }
            }
        }
    }
    stage ('Build Heat Map Docker Image') {
        steps {
            checkout([$class: 'GitSCM', "rest is commented"])
            dir('apps') {
                container('tools') {
                    script {
                        def pom = readMavenPom file: 'pom-docker.xml'
                        HMAP_VERSION = pom.version
                    }
                    echo("HMAP_VERSION=${HMAP_VERSION}")
                    withCredentials() {
                        sh "docker login -u ${ART_USERNAME} -p ${ART_PASSWORD} test.com"
                        sh 'docker build -t test.com:${HMAP_VERSION} .'
                        sh 'docker push test.com:${HMAP_VERSION}'
                    }
                }
            }
        }
    }
}
}
By my read of your code, you're mixing environment variables with variables within the Groovy context.
These lines create environment variables, which are accessible in the shell as $HMAP_VERSION and $CLOADER_VERSION:
environment {
    HMAP_VERSION = "UNINITIALISED"
    CLOADER_VERSION = "UNINITIALISED"
}
However, you're populating a Groovy variable here:
script {
    CLOADER_VERSION = readMavenPom().getVersion()
}
To instead populate the environment variable, you'd want to use env.CLOADER_VERSION instead.
This changes what context the variables are evaluated in when you're calling out to shell using the sh directive:
1-> sh "docker login -u ${ART_USERNAME} -p ${ART_PASSWORD} testing.com"
2-> sh 'docker build -t testing.com:${CLOADER_VERSION} .'
3-> sh 'docker push testing.com:${CLOADER_VERSION}'
In line 1 above, the command is quoted using double quotes ("), which means that the variables ART_USERNAME and ART_PASSWORD are evaluated in the context of the Groovy script.
However, in lines 2 and 3 the commands are quoted using single quotes ('), which means that those variables are evaluated by the shell (likely /bin/sh) and therefore take their values from the environment.
The easiest fix would be to ensure that values you want exposed in the shell are always accessed using the env. prefix in the Groovy context:
// set environment for CLOADER_VERSION
env.CLOADER_VERSION = readMavenPom().getVersion()
// print value of environment variable CLOADER_VERSION
echo("CLOADER_VERSION=${env.CLOADER_VERSION}")
// set environment for HMAP_VERSION
env.HMAP_VERSION = pom.version
// print value of environment variable HMAP_VERSION
echo("HMAP_VERSION=${env.HMAP_VERSION}")
Cheers.
Thanks for the response. My issue got resolved. In the Docker context shown below,
withCredentials() {
    sh "docker login -u ${ART_USERNAME} -p ${ART_PASSWORD} testing.com"
    sh 'docker build -t testing.com:${CLOADER_VERSION} .'
    sh 'docker push testing.com:${CLOADER_VERSION}'
}
the login command is correct because it is inside double quotes, but the next statements were in single quotes, so the variables' latest values were not being resolved. When I changed those statements to double quotes, it worked!
Below are the corrected commands:
withCredentials() {
    sh "docker login -u ${ART_USERNAME} -p ${ART_PASSWORD} testing.com"
    sh "docker build -t testing.com:${CLOADER_VERSION} ."
    sh "docker push testing.com:${CLOADER_VERSION}"
}
Thank you.

Accessing Shell variable from within Jenkins Pipeline

I am trying the lines below in my Jenkins Pipeline. I am assigning the variable IMAGE_NAME in a shell and trying to access it in the Jenkins Pipeline script, but I am not able to. Any idea how to do that?
stage('Build: Get Image') {
    steps {
        echo 'Getting docker image'
        sh "IMAGE_NAME=`grep -ri \"Successfully built\" $BUILD_FILE_NAME | awk \'{print \$3}\'`"
        echo "Image Name is:$IMAGE_NAME"
    }
}
You can define it as an environment variable:
env.some_var = 'AAAA'
And print it from the shell with:
sh 'echo $some_var'
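A small sketch of both ways of resolving it (the variable name is just a placeholder):
script {
    env.some_var = 'AAAA'
    sh 'echo $some_var'          // resolved by the shell from the environment
    sh "echo ${env.some_var}"    // resolved by Groovy before the shell runs
}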
proxy_host = 'abc.com'
stage('Docker Up') {
    steps {
        script {
            sh("""
            echo ${http_proxy}
            """)
        }
    }
}
The catch here is to use double quotes (") to execute the shell script. I tested it and it works fine.
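With double quotes, the Groovy variable defined at the top (proxy_host) is interpolated before the shell runs; a sketch:
script {
    sh """
    echo ${proxy_host}
    """
}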

Jenkins: Pipeline sh bad substitution error

A step in my pipeline uploads a .tar to an Artifactory server. I am getting a Bad substitution error when passing in env.BUILD_NUMBER, but the same command works when the number is hard-coded. The script is written in Groovy through Jenkins and is running in the Jenkins workspace.
sh 'curl -v --user user:password --data-binary ${buildDir}package${env.BUILD_NUMBER}.tar -X PUT "http://artifactory.mydomain.com/artifactory/release-packages/package${env.BUILD_NUMBER}.tar"'
returns the errors:
[Pipeline] sh
[Package_Deploy_Pipeline] Running shell script
/var/lib/jenkins/workspace/Package_Deploy_Pipeline#tmp/durable-4c8b7958/script.sh: 2:
/var/lib/jenkins/workspace/Package_Deploy_Pipeline#tmp/durable-4c8b7958/script.sh: Bad substitution
[Pipeline] } //node
[Pipeline] Allocate node : End
[Pipeline] End of Pipeline
ERROR: script returned exit code 2
If I hard-code a build number in place of ${env.BUILD_NUMBER}, I get no errors and the code runs successfully.
sh 'curl -v --user user:password --data-binary ${buildDir}package113.tar -X PUT "http://artifactory.mydomain.com/artifactory/release-packages/package113.tar"'
I use ${env.BUILD_NUMBER} within other sh commands within the same script and have no issues in any other places.
This turned out to be a syntax issue. Wrapping the command in 's caused ${env.BUILD_NUMBER} to be passed literally instead of its value. I wrapped the whole command in "s and escaped the nested double quotes. Works fine now.
sh "curl -v --user user:password --data-binary ${buildDir}package${env.BUILD_NUMBER}.tar -X PUT \"http://artifactory.mydomain.com/artifactory/release-packages/package${env.BUILD_NUMBER}.tar\""
In order to pass Groovy parameters into bash scripts in Jenkins pipelines (which sometimes causes bad substitutions), you have two options:
The triple double quotes way [ " " " ]
OR
the triple single quotes way [ ' ' ' ]
With triple double quotes you can render a normal Groovy variable using ${someVariable}; if it's an environment variable, ${env.someVariable}; if it's a parameter injected into your job, ${params.someVariable}.
example:
def YOUR_APPLICATION_PATH= "${WORKSPACE}/myApp/"
sh """#!/bin/bash
cd ${YOUR_APPLICATION_PATH}
npm install
"""
With triple single quotes things get a little bit tricky: you can either pass the value through an environment variable and reference it with "\${someVariable}", or concatenate the Groovy variable using ''' + someVariable + '''.
examples:
def YOUR_APPLICATION_PATH= "${WORKSPACE}/myApp/"
sh '''#!/bin/bash
cd ''' + YOUR_APPLICATION_PATH + '''
npm install
'''
OR
pipeline {
    agent { node { label "test" } }
    environment {
        YOUR_APPLICATION_PATH = "${WORKSPACE}/myapp/"
    }
    continue...
    continue...
    continue...
    sh '''#!/bin/bash
    cd "\${YOUR_APPLICATION_PATH}"
    npm install
    '''
(Note that inside the shell script there is no env. prefix: a variable defined in the environment block is referenced directly, as above. Writing cd "\${env.YOUR_APPLICATION_PATH}" in the single-quoted script would itself produce a bad substitution in bash.)
Actually, you seem to have misunderstood the env variable. In your sh block, you should access ${BUILD_NUMBER} directly.
Reason/Explanation: env represents the environment inside the script. This environment is used/available directly to anything that is executed, e.g. shell scripts.
Please also pay attention not to write anything to env.*; use withEnv{} blocks instead.
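For example, a withEnv sketch (the variable name is just illustrative):
withEnv(["TARGET_ENV=staging"]) {
    // TARGET_ENV is set only for the steps inside this block
    sh 'echo Deploying to $TARGET_ENV'
}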
Usually the most common cause of the
Bad substitution
error is using sh instead of bash.
Especially when using Jenkins, if you're using Execute shell, make sure your command starts with a shebang, e.g. #!/bin/bash -xe or #!/usr/bin/env bash.
I can definitely tell you, it's all about the sh shell versus the bash shell. I fixed this problem by specifying #!/bin/bash -xe as follows:
node {
    stage("Preparing") {
        sh '''#!/bin/bash -xe
        colls=( col1 col2 col3 )
        for eachCol in ${colls[@]}
        do
            echo $eachCol
        done
        '''
    }
}
I had this same issue when working on a Jenkins Pipeline for Amazon S3 Application upload.
My script was like this:
pipeline {
    agent any
    parameters {
        string(name: 'Bucket', defaultValue: 's3-pipeline-test', description: 'The name of the Amazon S3 Bucket')
        string(name: 'Prefix', defaultValue: 'my-website', description: 'Application directory in the Amazon S3 Bucket')
        string(name: 'Build', defaultValue: 'public/', description: 'Build directory for the application')
    }
    stages {
        stage('Build') {
            steps {
                echo 'Running build phase'
                sh 'npm install' // Install packages
                sh 'npm run build' // Build project
                sh 'ls' // List project files
            }
        }
        stage('Deploy') {
            steps {
                echo 'Running deploy phase'
                withCredentials([[$class: 'AmazonWebServicesCredentialsBinding', accessKeyVariable: 'AWS_ACCESS_KEY_ID', credentialsId: 'AWSCredentials', secretKeyVariable: 'AWS_SECRET_ACCESS_KEY']]) {
                    sh 'aws s3 ls' // List AWS S3 buckets
                    sh 'aws s3 sync "${params.Build}" s3://"${params.Bucket}/${params.Prefix}" --delete' // Sync project files with AWS S3 Bucket project path
                }
            }
        }
    }
    post {
        success {
            echo 'Deployment to Amazon S3 succeeded'
        }
        failure {
            echo 'Deployment to Amazon S3 failed'
        }
    }
}
Here's how I fixed it:
Seeing that it's an interpolation of variables, I had to change the single quotation marks (' ') in this line of the script:
sh 'aws s3 sync "${params.Build}" s3://"${params.Bucket}/${params.Prefix}" --delete' // Sync project files with AWS S3 Bucket project path
to double quotation marks (" "):
sh "aws s3 sync ${params.Build} s3://${params.Bucket}/${params.Prefix} --delete" // Sync project files with AWS S3 Bucket project path
So my script looked like this afterwards:
pipeline {
    agent any
    parameters {
        string(name: 'Bucket', defaultValue: 's3-pipeline-test', description: 'The name of the Amazon S3 Bucket')
        string(name: 'Prefix', defaultValue: 'my-website', description: 'Application directory in the Amazon S3 Bucket')
        string(name: 'Build', defaultValue: 'public/', description: 'Build directory for the application')
    }
    stages {
        stage('Build') {
            steps {
                echo 'Running build phase'
                sh 'npm install' // Install packages
                sh 'npm run build' // Build project
                sh 'ls' // List project files
            }
        }
        stage('Deploy') {
            steps {
                echo 'Running deploy phase'
                withCredentials([[$class: 'AmazonWebServicesCredentialsBinding', accessKeyVariable: 'AWS_ACCESS_KEY_ID', credentialsId: 'AWSCredentials', secretKeyVariable: 'AWS_SECRET_ACCESS_KEY']]) {
                    sh 'aws s3 ls' // List AWS S3 buckets
                    sh "aws s3 sync ${params.Build} s3://${params.Bucket}/${params.Prefix} --delete" // Sync project files with AWS S3 Bucket project path
                }
            }
        }
    }
    post {
        success {
            echo 'Deployment to Amazon S3 succeeded'
        }
        failure {
            echo 'Deployment to Amazon S3 failed'
        }
    }
}
That's all.
I hope this helps.
I was having an issue where ${env.MAJOR_VERSION} showed up literally in the name of the jar artifact, so I approached it by adding an environment block to the Jenkinsfile.
pipeline {
    agent any
    environment {
        MAJOR_VERSION = 1
    }
    stages {
        stage('build') {
            steps {
                sh 'ant -f build.xml -v'
            }
        }
    }
    post {
        always {
            archiveArtifacts artifacts: 'dist/*.jar', fingerprint: true
        }
    }
}
That solved the issue, and the bad substitution error no longer appeared in my Jenkins build output. So the environment block plays an important role in the Jenkinsfile.
The suggestion from @avivamg didn't work for me; here is the syntax that works for me:
sh "python3 ${env.WORKSPACE}/package.py --product productname " +
"--build_dir ${release_build_dir} " +
"--signed_product_dir ${signed_product_dir} " +
"--version ${build_version}"
I got a similar issue, but my use case is a little different.
steps {
    sh '''#!/bin/bash -xe
    VAR=TRIAL
    echo $VAR
    if [ -d /var/lib/jenkins/.m2/'\${params.application_name}' ]
    then
        echo 'working'
        echo ${VAR}
    else
        echo 'not working'
    fi
    '''
    }
}
Here I'm trying to declare a variable inside the script and also use a parameter from outside.
After trying multiple approaches, the following script worked:
stage('cleaning com/avizva directory') {
    steps {
        sh """#!/bin/bash -xe
        VAR=TRIAL
        echo \$VAR
        if [ -d /var/lib/jenkins/.m2/${params.application_name} ]
        then
            echo 'working'
            echo \${VAR}
        else
            echo 'not working'
        fi
        """
    }
}
Changes made:
Replaced triple single quotes --> triple double quotes.
Whenever I want to refer to a local shell variable, I used the escape character:
$VAR --> \$VAR
This caused the error Bad Substitution:
pipeline {
    agent any
    environment {
        DOCKER_IMAGENAME = "mynginx:latest"
        DOCKER_FILE_PATH = "./docker"
    }
    stages {
        stage('DockerImage-Build') {
            steps {
                sh 'docker build -t ${env.DOCKER_IMAGENAME} ${env.DOCKER_FILE_PATH}'
            }
        }
    }
}
This fixed it: replacing ' with " in the sh command:
pipeline {
    agent any
    environment {
        DOCKER_IMAGENAME = "mynginx:latest"
        DOCKER_FILE_PATH = "./docker"
    }
    stages {
        stage('DockerImage-Build') {
            steps {
                sh "docker build -t ${env.DOCKER_IMAGENAME} ${env.DOCKER_FILE_PATH}"
            }
        }
    }
}
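Alternatively (a sketch), since DOCKER_IMAGENAME and DOCKER_FILE_PATH are defined in the environment block, the shell itself can resolve them, so single quotes also work if the env. prefix is dropped:
sh 'docker build -t ${DOCKER_IMAGENAME} ${DOCKER_FILE_PATH}'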
The Jenkins script was failing inside the sh command line, e.g.:
sh 'npm run build' <-- fails, referring to package.json
It needed to be changed to:
sh 'npm run ng build....'
because ng is not found on the $PATH used by package.json.
