How to extract command output from a multi-line shell step in Jenkins - shell

How do I get the output of kubectl describe deployment nginx | grep Image into an environment variable?
My code:
stage('Deployment') {
    script {
        sh """
        export KUBECONFIG=/tmp/kubeconfig
        kubectl describe deployment nginx | grep Image"""
    }
}

In this situation, you can access the pipeline-scoped environment variables through the env object and assign values to its members to initialize new environment variables. You can also use the optional returnStdout parameter of the sh step to return the step's stdout and assign it to a Groovy variable (possible here because the call is inside a script block in the pipeline).
script {
env.IMAGE = sh(script: 'export KUBECONFIG=/tmp/kubeconfig && kubectl describe deployment nginx | grep Image', returnStdout: true).trim()
}
Note that you would also want to place the KUBECONFIG environment variable within the environment directive at the pipeline scope instead (unless the kubeconfig will be different in different scopes):
pipeline {
environment { KUBECONFIG = '/tmp/kubeconfig' }
}
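Putting the two together, a minimal end-to-end sketch (assuming the kubeconfig path and deployment name from the question) might look like this:
pipeline {
    agent any
    environment {
        KUBECONFIG = '/tmp/kubeconfig'
    }
    stages {
        stage('Deployment') {
            steps {
                script {
                    // capture the matching line from stdout and expose it as an environment variable
                    env.IMAGE = sh(
                        script: 'kubectl describe deployment nginx | grep Image',
                        returnStdout: true
                    ).trim()
                    echo "Image line: ${env.IMAGE}"
                }
            }
        }
    }
}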

You can use the syntax:
someVariable = sh(returnStdout: true, script: some_script).trim()

Related

How can I source Terraform HCL variables in bash?

I have Terraform variables defined like
variable "location" {
type = string
default = "eastus"
description = "Desired Azure Region"
}
variable "resource_group" {
type = string
default = "my-rg"
description = "Desired Azure Resource Group Name"
}
and potentially / partially overridden in a terraform.tfvars file
location = "westeurope"
and then exposed as outputs, e.g. in a file outputs.tf:
output "resource_group" {
  value = var.resource_group
}
output "location" {
  value = var.location
}
How can I "source" the effective variable values in a bash script to work with these values?
One way is to read the Terraform output values as JSON and then use a utility like jq to convert and source them as variables:
source <(terraform output --json | jq -r 'keys[] as $k | "\($k|ascii_upcase)=\(.[$k] | .value)"')
Note that output is only available after executing terraform apply or terraform refresh, since it is read from the state.
If jq is not available or not desired, sed can be used to convert Terraform HCL output into variables, even with upper case variable names:
source <(terraform output | sed -r 's/^([a-z_]+)\s+=\s+(.*)$/\U\1=\L\2/')
or, using the -chdir argument, if the Terraform templates / modules are in another folder:
source <(terraform -chdir=$TARGET_INFRA_FOLDER output | sed -r 's/^([a-z_]+)\s+=\s+(.*)$/\U\1=\L\2/')
Then these variables are available in bash script:
LOCATION="westeurope"
RESOURCE_GROUP="my-rg"
and can be addressed as $LOCATION and $RESOURCE_GROUP.

How to pick variables in vars.tf from BASH script

I am provisioning an EC2 instance using Terraform. It also has a startup script. I have a vars.tf where I have specified all the variables. In my bash.sh script it should pick up one variable from vars.tf.
Is it possible to refer to a variable in vars.tf from a bash script? Below is my use case.
bash.sh
#!/bin/bash
docker login -u username -p token docker.io
vars.tf
variable "username" {
default = "myuser"
}
variable "token" {
default = "mytoken"
}
My bash script should pick up the variable from vars.tf.
If this is not possible, is there any workaround?
In order to provide Terraform variables to a script, we can use the templatefile function. This function reads the content of a template file and injects Terraform variables in the places marked by the templating syntax (${ ... }).
First we want to create a template file with the bash script and save it as init.tftpl:
#!/bin/bash
docker login -u ${username} -p ${token} docker.io
When creating the instance, we can use templatefile to provide the rendered script as user data:
resource "aws_instance" "web" {
ami = "ami-xxxxxxxxxxxxxxxxx"
instance_type = "t2.micro"
user_data = templatefile("init.tftpl", {
username = var.username
token = var.token
})
}

Escaping Dollar sign in Jenkins credentials

I have test$001 as the value of a Jenkins secret text credential. Later, in the pipeline script, I access that value and write it to a YAML file as shown below, which is used as a K8s ConfigMap.
The problem is with the dollar sign in the value.
environment {
TEST_CRED=credentials('TEST_CRED')
}
script.sh
cat << EOF > test.yaml
...
data:
TEST: ${TEST_CRED}
EOF
Expected: test$001
Printed: test$$001 (Note extra dollar sign being inserted automatically)
I tried the following possibilities to escape this dollar sign; nothing worked.
TEST_01: '${TEST_CRED}'
TEST_02: ${TEST_CRED}
TEST_03: '$${TEST_CRED}'
TEST_04: $${TEST_CRED}
TEST_05: "$${TEST_CRED}"
TEST_08: $TEST_CRED
When storing the value in the Jenkins secret text credential, escape the dollar sign. So test$001 should actually be stored as test\$001.
Following works for me:
pipeline {
    agent any
    environment {
        MYTEST_CRED = credentials('TEST_CRED')
    }
    stages {
        stage('Special Char') {
            steps {
                sh """
cat << EOF > test.yaml
Name: test-config
Namespace: default
data:
TEST: ${MYTEST_CRED}
EOF
"""
            }
        }
    }
}
This is an example where I'm passing an unescaped string to the Jenkins job via parameters, and things are not going my way.
// Original and expected value. Works fine with pure groovy
echo env.SECRET_VALUE
test#U$3r
// But this variable in shell is getting messed up
// sh("\$ENV") and sh('$ENV') are using value of shell env variale
sh("echo \$SECRET_VALUE")
test#U$$3r
sh('echo $SECRET_VALUE')
test#U$$3r
// sh("$ENV") and sh("${ENV}") are using value of groovy variables passed to the shell
sh("echo $SECRET_VALUE")
test#Ur
sh("echo ${SECRET_VALUE}")
test#Ur
Let's try to fix it
env.ESCAPED_SECRET_VALUE = env.SECRET_VALUE.replaceAll(/(!|"|#|#|\$|%|&|\\/|\(|\)|=|\?)/, /\\$0/)
// groovy variable is becoming a bit broken
echo env.ESCAPED_SECRET_VALUE
test\#U\$3r
// shell env variable is still broken
sh("echo \$ESCAPED_SECRET_VALUE")
test\#U\$$3r
sh('echo $ESCAPED_SECRET_VALUE')
test\#U\$$3r
// But if we pass the Groovy env variable to the shell, it looks good
sh("echo $ESCAPED_SECRET_VALUE")
test#U$3r
sh("echo ${ESCAPED_SECRET_VALUE}")
test#U$3r
If you are using the command directly in sh(script: ""), just pass the escaped Groovy variable. If you need to invoke a shell script file, try passing the value of this escaped Groovy variable to it as an input argument.
Example:
sh("./my_super_script.sh $ESCAPED_SECRET_VALUE")
# my_super_script.sh
#!/bin/bash
SECRET_VALUE=$1
echo $SECRET_VALUE
I did a setup as per your requirement and got the desired results. The setup consists of the following:
Setup Jenkins secret text credential
Setup Binding in the Jenkins job
Configuring the build to create the test.yaml
Content of test.yaml
$ cat test.yaml
...
data:
TEST: test$001
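For reference, a rough pipeline-form sketch of that setup (the credential ID TEST_CRED and the escaped stored value test\$001 are assumptions carried over from the question) could be:
pipeline {
    agent any
    stages {
        stage('Write ConfigMap') {
            steps {
                // bind the secret text credential to a shell environment variable
                withCredentials([string(credentialsId: 'TEST_CRED', variable: 'TEST_CRED')]) {
                    // single quotes: the shell, not Groovy, expands $TEST_CRED
                    sh '''
cat << EOF > test.yaml
data:
TEST: ${TEST_CRED}
EOF
cat test.yaml
'''
                }
            }
        }
    }
}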

Env variable value got reset to original even after assigning the pom version number in jenkins script

I have a scenario where I have to read the Maven pom versions for different components and assign the version to a Docker image tag. But after I read the pom and assign it to a global variable, it resets to the original value in the Groovy Jenkins script. Below is a sample. The HMAP_VERSION value will be 1.2.1, but when it is used around the line sh "docker login -u ${ART_USERNAME} -p ${ART_PASSWORD} test.com", the value will be UNINITIALISED.
Can somebody tell me what might have gone wrong? This works with a single Maven file, which is read in the environment block as below:
environment {
CLOADER_VERSION = readMavenPom().getVersion()
}
Below is a sample of what I'm trying to do.
#! groovy
environment {
    HMAP_VERSION = "UNINITIALISED"
    CLOADER_VERSION = "UNINITIALISED"
}
stages {
    stage('Build Cloader') {
        steps {
            checkout([$class: 'GitSCM' "rest is removed")
            dir('isa-casloader') {
                script {
                    CLOADER_VERSION = readMavenPom().getVersion()
                }
                container('build') {
                    sh '/opt/apache-maven/bin/mvn -s settings.xml -B clean install -DskipTests=true'
                }
            }
        }
    }
    stage ('Build Casloader Docker Image') {
        steps {
            dir('isa-casloader') {
                container('tools') {
                    echo("CLOADER_VERSION=${CLOADER_VERSION}")
                    withCredentials() {
                        sh "docker login -u ${ART_USERNAME} -p ${ART_PASSWORD} testing.com"
                        sh 'docker build -t testing.com:${CLOADER_VERSION} .'
                        sh 'docker push testing.com:${CLOADER_VERSION}'
                    }
                }
            }
        }
    }
    stage ('Build Heat Map Docker Image') {
        steps {
            checkout([$class: 'GitSCM', "rest is commented"])
            dir('apps') {
                container('tools') {
                    script {
                        def pom = readMavenPom file: 'pom-docker.xml'
                        HMAP_VERSION = pom.version
                    }
                    echo("HMAP_VERSION=${HMAP_VERSION}")
                    withCredentials() {
                        sh "docker login -u ${ART_USERNAME} -p ${ART_PASSWORD} test.com"
                        sh 'docker build -t test.com:${HMAP_VERSION} .'
                        sh 'docker push test.com:${HMAP_VERSION}'
                    }}}}}}}
By my read of your code, you're mixing environment variables with variables within the Groovy context.
These lines create environment variables, which are accessible in the shell as $HMAP_VERSION and $CLOADER_VERSION:
environment {
HMAP_VERSION = "UNINITIALISED"
CLOADER_VERSION = "UNINITIALISED"
}
However, you're populating a Groovy variable here:
script {
CLOADER_VERSION = readMavenPom().getVersion()
}
To populate the environment variable instead, you'd want to use env.CLOADER_VERSION.
This determines the context in which the variables are evaluated when you call out to the shell using the sh step:
1-> sh "docker login -u ${ART_USERNAME} -p ${ART_PASSWORD} testing.com"
2-> sh 'docker build -t testing.com:${CLOADER_VERSION} .'
3-> sh 'docker push testing.com:${CLOADER_VERSION}'
In line 1 above, the command is quoted using double quotes ("), which means that the variables ART_USERNAME and ART_PASSWORD are evaluated in the context of the Groovy script.
However, in lines 2 and 3 the commands are quoted using single quotes ('), which means that those variables are evaluated by the shell (likely /bin/sh) and therefore use the values from the environment.
The easiest fix would be to ensure that values you want exposed in the shell are always accessed using the env. prefix in the Groovy context:
// set environment for CLOADER_VERSION
env.CLOADER_VERSION = readMavenPom().getVersion()
// print value of environment variable CLOADER_VERSION
echo("CLOADER_VERSION=${env.CLOADER_VERSION}")
// set environment for HMAP_VERSION
env.HMAP_VERSION = pom.version
// print value of environment variable HMAP_VERSION
echo("HMAP_VERSION=${env.HMAP_VERSION}")
Cheers.
Thanks for the response. My issue got resolved. In the Docker context shown below,
withCredentials() {
sh "docker login -u ${ART_USERNAME} -p ${ART_PASSWORD} testing.com"
sh 'docker build -t testing.com:${CLOADER_VERSION} .'
sh 'docker push testing.com:${CLOADER_VERSION}'
}
the login command is fine since it is inside double quotes, but the next statements were in single quotes, so the variables' latest values were not getting resolved. When I changed the statements to use double quotes, it worked!
Below is the proper command:
withCredentials() {
sh "docker login -u ${ART_USERNAME} -p ${ART_PASSWORD} testing.com"
sh "docker build -t testing.com:${CLOADER_VERSION} ."
sh "docker push testing.com:${CLOADER_VERSION}"
}
Thank you.

Accessing Shell variable from within Jenkins Pipeline

I am trying the lines below in my Jenkins Pipeline. I am assigning the variable IMAGE_NAME in a shell step and trying to access it in the Jenkins Pipeline script, but I am not able to. Any idea how to do that?
stage('Build: Get Image') {
    steps {
        echo 'Getting docker image'
        sh "IMAGE_NAME=`grep -ri \"Successfully built\" $BUILD_FILE_NAME | awk \'{print \$3}\'`"
        echo "Image Name is:$IMAGE_NAME"
    }
}
You can define it as an env variable:
env.some_var = 'AAAA'
And print it with:
sh "echo ${env.some_var}"
proxy_host = 'abc.com'
stage('Docker Up') {
    steps {
        script {
            sh("""
            echo ${proxy_host}
            """)
        }
    }
}
The catch here is to use double quotes (") for the shell script, so that Groovy interpolates the variable. I tested it and it works fine.
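To get a value computed by the shell back into the pipeline, as the question asks, the returnStdout approach from the first question also works here; a sketch along those lines (assuming BUILD_FILE_NAME is already set as an environment variable):
stage('Build: Get Image') {
    steps {
        script {
            // run the grep/awk in the shell and capture its stdout in Groovy
            env.IMAGE_NAME = sh(
                script: 'grep -ri "Successfully built" "$BUILD_FILE_NAME" | awk \'{print $3}\'',
                returnStdout: true
            ).trim()
            echo "Image Name is: ${env.IMAGE_NAME}"
        }
    }
}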
