Jenkins pipelines: shell script cannot get the updated environment variable

In Jenkins, I want to get user input and pass it to a shell script for further use.
I tried to set it as an environment variable, but the shell script fails to pick up the latest value and echoes the old one.
pipeline {
    agent none
    environment {
        myVar = 'something_default'
    }
    stages {
        stage('First stage') {
            agent none
            steps {
                echo "update myVar by user input"
                script {
                    test = input message: 'Hello',
                        ok: 'Proceed?',
                        parameters: [
                            string(name: 'input', defaultValue: 'update it!', description: '')
                        ]
                    myVar = "${test['input']}"
                }
                echo "${myVar}" // value is updated
            }
        }
        stage('stage 2') {
            agent any
            steps {
                echo "${myVar}" // myVar passes between stages and echoes the expected value
                sh("./someShell.sh") // the script just contains an echo, e.g. echo "myVar is ${myVar}"
                // it echoes the old value, i.e. something_default
            }
        }
    }
}

The environment variables that we set in the pipeline script are accessible only within the script itself. So even if you declare your variable as global, it will not work inside a shell script.
The only option I can think of is to pass it as an argument to the shell script:
sh("./someShell.sh ${myVar}")
EDIT:
Updated answer based on the OP's follow-up about a shell script for parsing the input:
LINE="[fristPara:100, secondPaa:abc]"
LINE=$(echo $LINE | sed 's/\[//g')
LINE=$(echo $LINE | sed 's/\]//g')
while read -d, -r pair; do
IFS=':' read -r key val <<<"$pair"
echo "$key = $val"
done <<<"$LINE,
"

You need to pass the variables between your stages as environment variables, e.g. like this:
stage("As user for input") {
steps {
env.DO_SOMETING = input (...)
env.MY_VAR = ...
}
}
stage("Do something") {
when { environment name: 'DO_SOMETING', value: 'yes' }
steps {
echo "DO_SOMETING has the value ${env.DO_SOMETHING}"
echo "MY_VAR has the value ${env.MY_VAR}"
}
}
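Tying this back to the original question, here is a minimal sketch of a complete pipeline (assuming a single string parameter, in which case input returns that parameter's value directly) where the value stored in env is visible to a sh step in a later stage:
pipeline {
    agent any
    stages {
        stage('Ask for input') {
            steps {
                script {
                    // with a single parameter, input returns that parameter's value
                    env.MY_VAR = input message: 'Hello', ok: 'Proceed?',
                        parameters: [string(name: 'input', defaultValue: 'update it!', description: '')]
                }
            }
        }
        stage('Use it in the shell') {
            steps {
                // env vars are exported to the shell, so plain $MY_VAR works here
                sh 'echo "MY_VAR is $MY_VAR"'
            }
        }
    }
}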

You have to declare the variable at global scope so that both places refer to the same instance:
def myVar
pipeline { ... }
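Even then, the shell process itself never sees a plain Groovy variable; it has to be interpolated into the sh command with double quotes so Groovy substitutes the value before the shell runs. A sketch (not from the original answer) of that pattern:
def myVar

pipeline {
    agent any
    stages {
        stage('Use it') {
            steps {
                script {
                    myVar = 'updated value'
                    // double quotes: Groovy substitutes myVar before the shell sees the command
                    sh "echo \"myVar is ${myVar}\""
                }
            }
        }
    }
}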

Related

How can I echo the value of a Windows system variable inside a Jenkins pipeline?

For example, when I echo %APPDATA% on the cmd line it prints the path to the AppData folder, like this: "C:\Users\User\AppData". How can I do the same in a Jenkins pipeline?
def app1 = "%APPDATA%"
pipeline {
    agent any
    stages {
        stage('Test') {
            steps {
                echo "\"${app1}\""
            }
        }
    }
}
The following methods should work.
echo "${APPDATA}"
echo "${env.APPDATA}"
echo "$APPDATA"

Looping over a HashMap in a Jenkinsfile and passing its keys to shell, PowerShell, or bat: not working

I have a Jenkinsfile where I loop over an array and pass its elements to a shell script block, and that works fine.
But when I loop over a HashMap and try to pass its keys, it throws an error.
stage('Validation') {
    steps {
        script {
            test1 = ["elem1", "elem2"]
            test2 = [key1: "value1", key2: "value2"]
            for (defaults in test1) {
                test = defaults
                echo "before shell====> ${test}"
                status = sh(returnStdout: true, script: """
                    echo "${test}"
                """).trim()
            }
            echo "======started next HashMap loop==========="
            for (defaults in test2) {
                test = defaults.key
                echo "before shell====> ${test}"
                status = sh(returnStdout: true, script: """
                    echo "${test}"
                """).trim()
            }
        }
    }
}
The output ends with this error:
Caused: java.io.NotSerializableException: java.util.LinkedHashMap$Entry
When I replace the second loop over the HashMap entries with this:
test2.each { entry ->
    test = entry.key
    echo "before shell====> ${test}"
    status = sh(returnStdout: true, script: """
        echo "${test}"
    """).trim()
}
it works and I get the output I expect.
That was probably what you wanted. For some reason the .each-style loop does not trigger serialization of the map entries, as stated in the comments of JENKINS-49732.
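Another workaround seen for this error (a sketch, not taken from the answer above) is to avoid holding a non-serializable Map.Entry at all, for example by copying the keys into a plain list first:
stage('Validation') {
    steps {
        script {
            test2 = [key1: "value1", key2: "value2"]
            // the keys are plain Strings and the List copy is serializable,
            // so nothing non-serializable has to survive across the sh step
            def keys = test2.keySet() as List
            for (k in keys) {
                echo "before shell====> ${k}"
                status = sh(returnStdout: true, script: """
                    echo "${k}"
                """).trim()
            }
        }
    }
}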

How to assign a shell command's output as the value of an environment variable?

Below is my pipeline snippet. I am trying to assign the RSTATE variable a value at run time. The value is stored in a text file, but we need to grep and cut it, so the output of a shell command should become its value.
pipeline {
    agent any
    environment {
        RSTATE = 'R4C'
        ISO_REV = 'TA'
        BuildSource = '18'
    }
    stages {
        stage('get Rstate') {
            steps {
                echo env.RSTATE
            }
        }
    }
}
I am trying to assign the RSTATE value like this:
RSTATE = sh(script: 'grep RSTATE /proj/MM/scm/com/iv_build/mm18_1/rstate/next_rstate.txt | cut -d "=" -f2', returnStdout: true).trim()
But this is not working.
I also tried to run a shell script, but that does not work either. Only a hard-coded value works. Please suggest.
I tested this and it works; you need to validate that your script returns the value you want.
pipeline {
    agent any
    environment {
        RSTATE = 'R4C'
        ISO_REV = 'TA'
        BuildSource = '18'
    }
    stages {
        stage('get Rstate') {
            steps {
                script {
                    def RSTATE2 = sh(script: 'echo \${RSTATE}', returnStdout: true).trim()
                    echo env.RSTATE
                    echo RSTATE2
                }
            }
        }
    }
}
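If the goal is specifically to capture the grep/cut output from the question, the same returnStdout pattern can be applied to that command inside a script block; a sketch using the path from the question:
pipeline {
    agent any
    stages {
        stage('get Rstate') {
            steps {
                script {
                    // capture the command's stdout and keep it for later stages
                    env.RSTATE = sh(
                        script: 'grep RSTATE /proj/MM/scm/com/iv_build/mm18_1/rstate/next_rstate.txt | cut -d "=" -f2',
                        returnStdout: true
                    ).trim()
                    echo env.RSTATE
                }
            }
        }
    }
}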

Accessing Shell variable from within Jenkins Pipeline

In the lines below, I am assigning the variable IMAGE_NAME in a shell step and trying to access it in the Jenkins pipeline script, but I am not able to. Any idea how to do that?
stage('Build: Get Image') {
    steps {
        echo 'Getting docker image'
        sh "IMAGE_NAME=`grep -ri \"Successfully built\" $BUILD_FILE_NAME | awk \'{print \$3}\'`"
        echo "Image Name is:$IMAGE_NAME"
    }
}
You can define it as an env variable:
env.some_var = 'AAAA'
And print it with (note the double quotes, so Groovy interpolates the value before the shell runs):
sh "echo ${env.some_var}"
proxy_host = 'abc.com'
stage('Docker Up') {
    steps {
        script {
            sh("""
                echo ${proxy_host}
            """)
        }
    }
}
The catch here is to use double quotes (") to execute the shell script, so that Groovy substitutes the variable before the shell runs. I tested it and it works fine.
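For the original IMAGE_NAME question, the usual alternative to assigning the variable inside the sh step is to capture the step's output on the Groovy side; a sketch reusing the grep/awk command from the question:
stage('Build: Get Image') {
    steps {
        echo 'Getting docker image'
        script {
            // run the same grep/awk, but capture its stdout back into the pipeline
            env.IMAGE_NAME = sh(
                script: "grep -ri 'Successfully built' ${BUILD_FILE_NAME} | awk '{print \$3}'",
                returnStdout: true
            ).trim()
        }
        echo "Image Name is: ${env.IMAGE_NAME}"
    }
}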

Jenkins Pipeline Syntax - Need to get Parameters for Job from wget

I'm new to Jenkins Pipeline, Groovy syntax, etc.
I have a Jenkins job that takes 5 parameters.
I want to schedule a pipeline that checks for listings via wget (the format is CSV; I can switch to JSON output as well). The CSV is one row with the 5 parameters listed:
a,b,c,d,e
I need to parse that list and pass the parameters to the job if there are rows; if not, skip and complete the pipeline.
I have searched and basically got as far as this for testing:
pipeline {
    environment {
        testVar = 'foo'
    }
    agent any
    stages {
        stage('Example Build') {
            steps {
                echo 'Hello World'
                script {
                    sh "RESULT=\$(wget -qO- https://www.url.com/getlist)"
                    sh "echo \$RESULT"
                    // variable define based on parse of CSV???
                }
            }
        }
        stage('Example Deploy') {
            when {
                expression { testVar == 'foo' }
            }
            steps {
                echo 'Deploying'
                build job: 'Testing', parameters: [
                    string(name: 's_e', value: 'u'),
                    string(name: 't_e', value: 't'),
                    string(name: 's_s', value: 'DS'),
                    string(name: 't_s', value: 'SH'),
                    string(name: 'efg', value: 'TEST')
                ]
            }
        }
    }
}
Obviously I have more work to do around parsing RESULT (and I am not sure how to achieve this in Pipeline).
I then need to check whether RESULT is empty or not, and then pass the variables to the build.
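For reference, the parsing could also be done inside the pipeline itself; a sketch (URL, job name, and parameter names taken from the question) that captures the CSV in a single sh call so RESULT is not lost between separate sh steps, then splits it in Groovy:
stage('Example Deploy') {
    steps {
        script {
            def result = sh(script: 'wget -qO- https://www.url.com/getlist', returnStdout: true).trim()
            if (result) {
                def fields = result.tokenize(',')   // ["a", "b", "c", "d", "e"]
                build job: 'Testing', parameters: [
                    string(name: 's_e', value: fields[0]),
                    string(name: 't_e', value: fields[1]),
                    string(name: 's_s', value: fields[2]),
                    string(name: 't_s', value: fields[3]),
                    string(name: 'efg', value: fields[4])
                ]
            } else {
                echo 'Nothing to do'
            }
        }
    }
}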
I opted for a different approach.
Instead, I now have a Jenkins job where I use the "Trigger/call builds on other projects" build step.
Before that build step, I have some shell code to get the CSV information via wget:
RESULT=$(wget -qO- https://url.com/getlist)
if [ -z "$RESULT" ]
then
    echo "Nothing to do"
    # exit 1
else
    echo "$RESULT"
    s_env_upper=$(echo $RESULT | awk -F',' '{print $1}')
    t_env_upper=$(echo $RESULT | awk -F',' '{print $2}')
    s_env=$(echo $s_env_upper | tr '[A-Z]' '[a-z]')
    t_env=$(echo $t_env_upper | tr '[A-Z]' '[a-z]')
    echo "s_env=$s_env" > params.cfg
    echo "t_env=$t_env" >> params.cfg
fi
Hope this helps someone else... I was breaking my heart trying to get the pipeline to do the work, and the answer was simpler.
