Change groovy variables inside shell executor in Jenkins pipeline

I have a Jenkins pipeline job where I am taking some build variables as input, and if the variables are not passed by the user, I execute a script and get the value of those variables. Later I have to use the value of these variables to trigger other jobs.
So my code looks something like this:
node {
    withCredentials([[$class: 'StringBinding', credentialsId: 'DOCKER_HOST', variable: 'DOCKER_HOST']]) {
        env.T_RELEASE_VERSION = T_RELEASE_VERSION
        env.C_RELEASE_VERSION = C_RELEASE_VERSION
        env.N_RELEASE_VERSION = N_RELEASE_VERSION
        env.F_RELEASE_VERSION = F_RELEASE_VERSION
        ....
        stage concurrency: 1, name: 'consul-get-version'
        sh '''
        if [ -z ${T_RELEASE_VERSION} ]
        then
            export T_RELEASE_VERSION=$(ruby common/consul/services_prod_version.rb prod_t_release_version)
            aws ecr get-login --region us-east-1
            aws ecr list-images --repository-name t-server | grep ${T_RELEASE_VERSION}
        else
            aws ecr get-login --region us-east-1
            aws ecr list-images --repository-name t-server | grep ${T_RELEASE_VERSION}
        fi
        .......
        't-integ-pipeline' : {
            build job: 't-integ-pipeline', parameters: [[$class: 'StringParameterValue', name: 'RELEASE_VERSION', value: T_RELEASE_VERSION],
                [$class: 'BooleanParameterValue', name: 'FASTFORWARD_TO_DEPLOY', value: true]]
        },
        ......
......
The issue is that when I trigger the main job with an empty T_RELEASE_VERSION, the child job t-integ-pipeline is triggered with an empty value for the RELEASE_VERSION parameter.
How can I change a groovy parameter inside a shell executor and then access it again in the groovy executor with the modified value?

When using env-inject it was possible to store values in a properties file and then inject them as environment variables. I couldn't find any equally easy way to do it in Pipeline.
Here is a solution anyway: store the values to a file, and read the file back from the pipeline. Then use Eval or similar to transform the contents into a parsable object (a hash); a sketch follows the links below.
Eval.me example: Serializing groovy map to string with quotes
Write/Read to file example:
https://wilsonmar.github.io/jenkins2-pipeline/
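For instance, a minimal sketch of the write/read approach (versions.txt is an illustrative file name, and readProperties comes from the Pipeline Utility Steps plugin):
node {
    // The shell computes the value and writes it out as a KEY=VALUE line
    sh 'echo "T_RELEASE_VERSION=$(ruby common/consul/services_prod_version.rb prod_t_release_version)" > versions.txt'
    // Groovy then reads the file back; readProperties parses it into a map
    def props = readProperties file: 'versions.txt'
    echo "T_RELEASE_VERSION is now: ${props.T_RELEASE_VERSION}"
}
The EDIT below shows the same round-trip with a plain readFile plus trim().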
EDIT
Manish's solution, for readability:
sh 'ruby common/consul/services_prod_version.rb prod_n_release_version > status'
N_RELEASE_VERSION_NEW = readFile('status').trim()
sh 'ruby common/consul/services_prod_version.rb prod_q_release_version > status'
Q_RELEASE_VERSION_NEW = readFile('status').trim()

I found a way to change the Groovy variable from the shell without storing it in a file. There is an example of this in the git-tag-message-plugin; I use the method like below:
script {
    N_RELEASE_VERSION_NEW = getN_RELEASE_VERSION_NEW()
}

String getN_RELEASE_VERSION_NEW() {
    return sh(script: "ruby common/consul/services_prod_version.rb prod_n_release_version", returnStdout: true)?.trim()
}
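To close the loop on the original question, the captured value can then be fed to the downstream build (a sketch reusing the job and parameter names from the question):
script {
    N_RELEASE_VERSION_NEW = getN_RELEASE_VERSION_NEW()
    build job: 't-integ-pipeline', parameters: [
        [$class: 'StringParameterValue', name: 'RELEASE_VERSION', value: N_RELEASE_VERSION_NEW],
        [$class: 'BooleanParameterValue', name: 'FASTFORWARD_TO_DEPLOY', value: true]
    ]
}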


How to get the output of kubectl describe deployment nginx | grep Image in an environment variable?
My code:
stage('Deployment'){
    script {
        sh """
        export KUBECONFIG=/tmp/kubeconfig
        kubectl describe deployment nginx | grep Image"""
    }
}
In this situation, you can access the environment variables in the pipeline scope through the env object, and assign values to its members to initialize new environment variables. You can also use the optional returnStdout parameter of the sh step to return the step's stdout and assign it to a Groovy variable (which works here because it is within a script block).
script {
    env.IMAGE = sh(script: 'export KUBECONFIG=/tmp/kubeconfig && kubectl describe deployment nginx | grep Image', returnStdout: true).trim()
}
Note you would probably also want to place the KUBECONFIG environment variable in an environment directive at the pipeline scope instead (unless the kubeconfig differs between scopes):
pipeline {
    environment { KUBECONFIG = '/tmp/kubeconfig' }
}
You can use the syntax:
someVariable = sh(returnStdout: true, script: some_script).trim()
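For example, applied to the kubectl command from the question (a sketch; the awk filter that isolates the image name is an assumption about the desired output):
script {
    // awk prints the second whitespace-separated field of the "Image:" line
    def image = sh(
        returnStdout: true,
        script: 'KUBECONFIG=/tmp/kubeconfig kubectl describe deployment nginx | grep Image | awk \'{print $2}\''
    ).trim()
    echo "Deployed image: ${image}"
}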

Does "sh" command in Jenkins file starts a new session or a new shell?

I observed this scenario while writing a Jenkinsfile to first authenticate a session on AWS and then push a Docker image to the designated ECR. The below code block works fine and pushes the image to ECR:
stage('build and push images') {
    steps {
        sh """
        sh assume_role.sh
        source /tmp/${assume_role_session_name}
        aws ecr get-login --region ${aws_region} --registry-ids ${ROLEARN} --no-include-email
        docker build -t my-docker-image .
        docker tag my-docker-image:latest ${ROLEARN}.dkr.ecr.${aws_region}.amazonaws.com/${ECR_name}:${ECS_TAG_VERSION}
        docker push ${ROLEARN}.dkr.ecr.${aws_region}.amazonaws.com/${ECR_name}:${ECS_TAG_VERSION}
        docker rmi -f my-docker-image:latest
        """
    }
}
However, when I divided each step into an individual sh command (like below), docker push failed because the Jenkins agent hadn't been authenticated, which means the authentication token wasn't passed to the docker push command line.
stage('build and push images') {
    steps {
        sh "assume_role.sh"
        sh "source /tmp/${assume_role_session_name}"
        sh "aws ecr get-login --region ${aws_region} --registry-ids ${ROLEARN} --no-include-email"
        sh "docker build -t my-docker-image ."
        sh "docker tag my-docker-image:latest ${ROLEARN}.dkr.ecr.${aws_region}.amazonaws.com/${ECR_name}:${ECS_TAG_VERSION}"
        sh "docker push ${ROLEARN}.dkr.ecr.${aws_region}.amazonaws.com/${ECR_name}:${ECS_TAG_VERSION}"
        sh "docker rmi -f my-docker-image:latest"
    }
}
Thus, I suspect that each sh starts a new session in the Jenkins steps, in between which authentication tokens cannot be passed. I don't know whether my guess is correct, or how to find evidence to support it.
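A quick way to gather evidence for that guess (a minimal sketch; MARKER is an illustrative name):
steps {
    // Each sh step spawns its own shell process, so MARKER dies
    // when the first shell exits...
    sh 'export MARKER=set-in-first-step'
    // ...and the second shell never sees it: this prints "MARKER is: unset"
    sh 'echo "MARKER is: ${MARKER:-unset}"'
}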
I thought I would share my solution to the annoying need to repeatedly assume the role in every sh block. Passing the extracted credentials (dynamically, of course) as environment variables solved the issue for me, and there was no need to re-authenticate in different scripts.
Adding the credentials to the environment variables makes every script use them.
environment {
    ACCESS = sh(
        returnStdout: true,
        script: '''
            echo "$(aws \
                sts assume-role \
                --role-arn="arn:aws:iam::\${AWS_ACCOUNT_DEV}:role/\${ASSUME_ROLE}" \
                --role-session-name="jenkins" \
                --output json
            )"
        '''
    ).trim()
}
stages {
    stage('Create env variables') {
        steps {
            script {
                // AWS_ACCESS_KEY_ID, AWS_SECRET_ACCESS_KEY and AWS_SESSION_TOKEN are
                // the standard credential variables the AWS CLI reads, so every later
                // sh step picks them up automatically
                env.AWS_ACCESS_KEY_ID = sh(
                    returnStdout: true,
                    script: '''
                        echo "${ACCESS}" | jq -re '.Credentials.AccessKeyId'
                    '''
                ).trim()
                env.AWS_SECRET_ACCESS_KEY = sh(
                    returnStdout: true,
                    script: '''
                        echo "${ACCESS}" | jq -re '.Credentials.SecretAccessKey'
                    '''
                ).trim()
                env.AWS_SESSION_TOKEN = sh(
                    returnStdout: true,
                    script: '''
                        echo "${ACCESS}" | jq -re '.Credentials.SessionToken'
                    '''
                ).trim()
            }
        }
    }
}
To your question, this StackOverflow answer describes what happens to the environment variables set within the sh execution.
Hope this helps ;)

Curl returns Invalid JSON error in a Jenkins Pipeline script but returns the expected response on a bash shell run or in a Jenkins Freestyle job

I am writing a Jenkins Pipeline job for setting up AWS infrastructure using API calls to our in-house AWS CLI wrapper library. Running the raw bash scripts on a CentOS box or as a Jenkins Freestyle job runs fine. However, it fails in the context of a Pipeline job. I think that the quotes may need to be different for the Pipeline job but I am not sure how.
After further investigation, I found that the curl command returns the wrong response from the service when running the scripts within a Jenkins Pipeline job.
pipeline {
    agent any
    stages {
        stage('Checkout code from Git') {
            steps {
                echo "Checkout code from a GitHub repository"
                // Checkout code from a GitHub repository
                checkout([$class: 'GitSCM', branches: [[name: '*/master']], doGenerateSubmoduleConfigurations: false, extensions: [[$class: 'SubmoduleOption', disableSubmodules: false, parentCredentials: false, recursiveSubmodules: true, reference: '', trackingSubmodules: false]], submoduleCfg: [], userRemoteConfigs: [[credentialsId: 'xxxx', url: 'git@github.com:bbc/repo.git']]])
            }
        }
        stage('Call our internal AWS CLI Wrapper System API to perform an ACTION on a specified ENVIRONMENT') {
            steps {
                script {
                    if ("${params.ENVIRONMENT}" == 'int' && "${params.ACTION}" == 'create') {
                        echo "ENVIRONMENT=${params.ENVIRONMENT}, ACTION=${params.ACTION}"
                        echo ""
                        sh '''#!/bin/bash
                        # Create Neptune Cluster for the Int environment
                        cd blah-db
                        echo "Current working directory is $PWD"
                        CLOUD_FORMATION_FILE=$PWD/infrastructure/templates/neptune-cluster.json
                        echo "The CloudFormation file to operate on is $CLOUD_FORMATION_FILE"
                        echo "Running jq to transform the source CloudFormation file"
                        template=$(jq -M '.Parameters.Env.Default="int"' $CLOUD_FORMATION_FILE)
                        echo "Echoing the transformed CloudFormation file: \n$template"
                        echo "Running curl to make the http request to our internal AWS CLI Wrapper System"
                        curl -d "{\"aws_account\": \"1111111111\", \"region\": \"us-east-1\", \"name_suffix\": \"cluster\", \"template\": $template}" \
                        -H 'Content-Type: application/json' -H 'Accept: application/json' https://base.api.url/v1/services/blah-neptune/int/stacks \
                        --cert /path/to/client/certificate/client.crt --key /path/to/client/private-key/client.key
                        cd ..
                        pwd
                        # Set a timer to run for 300 seconds or 5 minutes to create a delay to allow for the Neptune Cluster to be fully provisioned first before adding instances to it.
                        '''
                    }
                }
            }
        }
    }
}
The actual result that I get from making the API call:
{"error": "Invalid JSON. Expecting property name: line 1 column 1 (char 1)"}
Try changing the curl command as follows (note how the quoting breaks out of the single quotes around $template so the shell can still expand it):
curl -d '{"aws_account": "1111111111", "region": "us-east-1", "name_suffix": "cluster", "template": '"$template"'}'
Or assign the whole cmd to a variable and print it out, to see whether it is what you expect:
cmd = '''#!/bin/bash
cd blah-db
...
'''
echo cmd // compare the output string to the cmd of freestyle job.
sh cmd

Assign extracted values from aws command to variables in jenkins pipeline

def id
def state
pipeline {
    agent any
    stages {
        stage('aws') {
            steps {
                script {
                    /* extract load generator instanceId */
                    sh "aws ec2 describe-instances --filters 'Name=tag:Name,Values=xxx' --output text --query 'Reservations[*].Instances[*].{id:InstanceId,state:State.Name}' --region us-east-1"
                    echo "id and state: ${id} ${state}"
                }
            }
        }
    }
}
I am trying to extract the instance id and state of the xxx instance using the above command, and I am able to get their values.
But when I try to echo them, I get null: they are not being assigned to the ${id} and ${state} variables.
Is there any way I could assign them to the above variables in a Jenkins pipeline?
Note: I don't want to use jq.
Thanks
Your current implementation doesn't assign any variables: shell, Jenkins, or otherwise. id and state are just aliases for other fields in the context of the aws command. In order to have access to those values in the context of the pipeline, I'd recommend combining the output of the sh step with the readJSON step (it's part of the Pipeline Utility Steps plugin). Then you can do something like this:
def id
def state
pipeline {
    agent any
    stages {
        stage('aws') {
            steps {
                script {
                    /* extract load generator instanceId; note --output json so the
                       result can actually be parsed by readJSON */
                    instanceInfo = sh (
                        script: "aws ec2 describe-instances --filters 'Name=tag:Name,Values=xxx' --output json --query 'Reservations[*].Instances[*].{id:InstanceId,instanceState:State.Name}' --region us-east-1",
                        returnStdout: true
                    ).trim()
                    instanceJSON = readJSON text: instanceInfo
                    instanceJSON.each { instance ->
                        echo "${instance.id[0]}: ${instance.instanceState[0]}"
                    }
                }
            }
        }
    }
}
(I hand-fudged a couple of those items for my minimal test case; please post any errors you get and we'll clean things up)

Jenkins: Pipeline sh bad substitution error

A step in my pipeline uploads a .tar to an Artifactory server. I am getting a Bad substitution error when passing in env.BUILD_NUMBER, but the same command works when the number is hard-coded. The script is written in Groovy through Jenkins and is running in the Jenkins workspace.
sh 'curl -v --user user:password --data-binary ${buildDir}package${env.BUILD_NUMBER}.tar -X PUT "http://artifactory.mydomain.com/artifactory/release-packages/package${env.BUILD_NUMBER}.tar"'
returns the errors:
[Pipeline] sh
[Package_Deploy_Pipeline] Running shell script
/var/lib/jenkins/workspace/Package_Deploy_Pipeline#tmp/durable-4c8b7958/script.sh: 2:
/var/lib/jenkins/workspace/Package_Deploy_Pipeline#tmp/durable-4c8b7958/script.sh: Bad substitution
[Pipeline] } //node
[Pipeline] Allocate node : End
[Pipeline] End of Pipeline
ERROR: script returned exit code 2
If I hard-code a build number and swap out ${env.BUILD_NUMBER}, I get no errors and the code runs successfully.
sh 'curl -v --user user:password --data-binary ${buildDir}package113.tar -X PUT "http://artifactory.mydomain.com/artifactory/release-packages/package113.tar"'
I use ${env.BUILD_NUMBER} within other sh commands within the same script and have no issues in any other places.
This turned out to be a syntax issue. Wrapping the command in single quotes caused the literal text ${env.BUILD_NUMBER} to be passed instead of its value. I wrapped the whole command in double quotes and escaped the nested ones. Works fine now.
sh "curl -v --user user:password --data-binary ${buildDir}package${env.BUILD_NUMBER}.tar -X PUT \"http://artifactory.mydomain.com/artifactory/release-packages/package${env.BUILD_NUMBER}.tar\""
To pass Groovy parameters into bash scripts in Jenkins pipelines (which sometimes causes bad substitutions), you have two options:
The triple double quotes way [ """ ]
OR
The triple single quotes way [ ''' ]
In triple double quotes you can render a normal Groovy variable using ${someVariable}; if it's an environment variable, ${env.someVariable}; if it's a parameter injected into your job, ${params.someVariable}.
example:
def YOUR_APPLICATION_PATH= "${WORKSPACE}/myApp/"
sh """#!/bin/bash
cd ${YOUR_APPLICATION_PATH}
npm install
"""
In triple single quotes things get a little bit tricky: you can pass the value through an environment variable and use it as "\${someVariable}", or concatenate the Groovy variable using ''' + someVariable + '''.
examples:
def YOUR_APPLICATION_PATH= "${WORKSPACE}/myApp/"
sh '''#!/bin/bash
cd ''' + YOUR_APPLICATION_PATH + '''
npm install
'''
OR
pipeline {
    agent { node { label "test" } }
    environment {
        YOUR_APPLICATION_PATH = "${WORKSPACE}/myapp/"
    }
    continue...
    continue...
    continue...
    sh '''#!/bin/bash
    cd "\${YOUR_APPLICATION_PATH}"
    npm install
    '''
(Note that inside the shell script you reference the environment variable without the env. prefix: \${env.YOUR_APPLICATION_PATH} would itself produce a bad substitution, because env.YOUR_APPLICATION_PATH is Groovy syntax, not a shell variable name.)
Actually, you seem to have misunderstood the env variable. In your sh block, you should access ${BUILD_NUMBER} directly.
Reason/Explanation: env represents the environment inside the script. This environment is available directly to anything that is executed, e.g. shell scripts.
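For example, keeping the single-quoted Groovy string from the question and letting the shell itself expand the variable (a sketch with the credentials and the buildDir prefix elided):
sh 'curl -v --user user:password --data-binary package${BUILD_NUMBER}.tar -X PUT "http://artifactory.mydomain.com/artifactory/release-packages/package${BUILD_NUMBER}.tar"'
Because the Groovy string is single-quoted, ${BUILD_NUMBER} reaches the shell untouched, and the shell resolves it from the environment Jenkins provides.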
Please also pay attention not to write anything to env.*; use withEnv{} blocks instead.
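A minimal withEnv sketch (BUILD_LABEL is an illustrative name):
withEnv(["BUILD_LABEL=build-${env.BUILD_NUMBER}"]) {
    // Inside the block BUILD_LABEL is part of the process environment,
    // so even a single-quoted sh step can expand it
    sh 'echo "uploading ${BUILD_LABEL}.tar"'
}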
Usually the most common cause of the Bad substitution error is using sh instead of bash.
Especially when using Jenkins, if you're using Execute shell, make sure your command starts with a shebang, e.g. #!/bin/bash -xe or #!/usr/bin/env bash.
I can definitely tell you it's all about the sh shell versus the bash shell. I fixed this problem by specifying #!/bin/bash -xe as follows:
node {
    stage("Preparing") {
        sh '''#!/bin/bash -xe
        colls=( col1 col2 col3 )
        for eachCol in "${colls[@]}"
        do
            echo $eachCol
        done
        '''
    }
}
I had this same issue when working on a Jenkins Pipeline for Amazon S3 Application upload.
My script was like this:
pipeline {
    agent any
    parameters {
        string(name: 'Bucket', defaultValue: 's3-pipeline-test', description: 'The name of the Amazon S3 Bucket')
        string(name: 'Prefix', defaultValue: 'my-website', description: 'Application directory in the Amazon S3 Bucket')
        string(name: 'Build', defaultValue: 'public/', description: 'Build directory for the application')
    }
    stages {
        stage('Build') {
            steps {
                echo 'Running build phase'
                sh 'npm install' // Install packages
                sh 'npm run build' // Build project
                sh 'ls' // List project files
            }
        }
        stage('Deploy') {
            steps {
                echo 'Running deploy phase'
                withCredentials([[$class: 'AmazonWebServicesCredentialsBinding', accessKeyVariable: 'AWS_ACCESS_KEY_ID', credentialsId: 'AWSCredentials', secretKeyVariable: 'AWS_SECRET_ACCESS_KEY']]) {
                    sh 'aws s3 ls' // List AWS S3 buckets
                    sh 'aws s3 sync "${params.Build}" s3://"${params.Bucket}/${params.Prefix}" --delete' // Sync project files with AWS S3 Bucket project path
                }
            }
        }
    }
    post {
        success {
            echo 'Deployment to Amazon S3 succeeded'
        }
        failure {
            echo 'Deployment to Amazon S3 failed'
        }
    }
}
Here's how I fixed it:
Since the line interpolates Groovy variables, I had to change the single quotation marks (' ') in this line of the script:
sh 'aws s3 sync "${params.Build}" s3://"${params.Bucket}/${params.Prefix}" --delete' // Sync project files with AWS S3 Bucket project path
to double quotation marks (" "):
sh "aws s3 sync ${params.Build} s3://${params.Bucket}/${params.Prefix} --delete" // Sync project files with AWS S3 Bucket project path
So my script looked like this afterwards:
pipeline {
    agent any
    parameters {
        string(name: 'Bucket', defaultValue: 's3-pipeline-test', description: 'The name of the Amazon S3 Bucket')
        string(name: 'Prefix', defaultValue: 'my-website', description: 'Application directory in the Amazon S3 Bucket')
        string(name: 'Build', defaultValue: 'public/', description: 'Build directory for the application')
    }
    stages {
        stage('Build') {
            steps {
                echo 'Running build phase'
                sh 'npm install' // Install packages
                sh 'npm run build' // Build project
                sh 'ls' // List project files
            }
        }
        stage('Deploy') {
            steps {
                echo 'Running deploy phase'
                withCredentials([[$class: 'AmazonWebServicesCredentialsBinding', accessKeyVariable: 'AWS_ACCESS_KEY_ID', credentialsId: 'AWSCredentials', secretKeyVariable: 'AWS_SECRET_ACCESS_KEY']]) {
                    sh 'aws s3 ls' // List AWS S3 buckets
                    sh "aws s3 sync ${params.Build} s3://${params.Bucket}/${params.Prefix} --delete" // Sync project files with AWS S3 Bucket project path
                }
            }
        }
    }
    post {
        success {
            echo 'Deployment to Amazon S3 succeeded'
        }
        failure {
            echo 'Deployment to Amazon S3 failed'
        }
    }
}
That's all
I hope this helps
I was having this issue when referencing ${env.MAJOR_VERSION} in the name of a jar artifact. I solved it by adding an environment block to the Jenkinsfile:
pipeline {
    agent any
    environment {
        MAJOR_VERSION = 1
    }
    stages {
        stage('build') {
            steps {
                sh 'ant -f build.xml -v'
            }
        }
    }
    post {
        always {
            archiveArtifacts artifacts: 'dist/*.jar', fingerprint: true
        }
    }
}
That solved it: the bad substitution error no longer showed up in my Jenkins build output. So the environment block plays an important role in a Jenkinsfile.
The suggestion from @avivamg didn't work for me; here is the syntax that works for me:
sh "python3 ${env.WORKSPACE}/package.py --product productname " +
"--build_dir ${release_build_dir} " +
"--signed_product_dir ${signed_product_dir} " +
"--version ${build_version}"
I got a similar issue, but my use case is a little different:
steps {
    sh '''#!/bin/bash -xe
    VAR=TRIAL
    echo $VAR
    if [ -d /var/lib/jenkins/.m2/'\${params.application_name}' ]
    then
        echo 'working'
        echo ${VAR}
    else
        echo 'not working'
    fi
    '''
}
Here I'm trying to declare a variable inside the script and also use a parameter from outside. After trying multiple ways, the following script worked:
stage('cleaning com/avizva directory') {
    steps {
        sh """#!/bin/bash -xe
        VAR=TRIAL
        echo \$VAR
        if [ -d /var/lib/jenkins/.m2/${params.application_name} ]
        then
            echo 'working'
            echo \${VAR}
        else
            echo 'not working'
        fi
        """
    }
}
Changes made:
Replaced triple single quotes with triple double quotes.
Wherever I refer to a local shell variable, I used the escape character:
$VAR --> \$VAR
This caused the error Bad Substitution:
pipeline {
    agent any
    environment {
        DOCKER_IMAGENAME = "mynginx:latest"
        DOCKER_FILE_PATH = "./docker"
    }
    stages {
        stage('DockerImage-Build') {
            steps {
                sh 'docker build -t ${env.DOCKER_IMAGENAME} ${env.DOCKER_FILE_PATH}'
            }
        }
    }
}
This fixed it: replacing ' with " in the sh command:
pipeline {
    agent any
    environment {
        DOCKER_IMAGENAME = "mynginx:latest"
        DOCKER_FILE_PATH = "./docker"
    }
    stages {
        stage('DockerImage-Build') {
            steps {
                sh "docker build -t ${env.DOCKER_IMAGENAME} ${env.DOCKER_FILE_PATH}"
            }
        }
    }
}
The Jenkins script can also fail inside the sh command line itself. E.g.:
sh 'npm run build' <-- fails, referring to package.json
needs to be changed to:
sh 'npm run ng build....'
because ng is not found on the $PATH used by package.json.
