Jenkins Pipeline Syntax - Need to get Parameters for Job from wget - bash

I'm new to Jenkins Pipeline, Groovy syntax, etc.
I have a Jenkins job that takes 5 parameters.
I want to schedule a pipeline that checks for listings via wget (the format is CSV, though I can also switch to JSON output), where the CSV is one row with 5 parameters:
a,b,c,d,e
I need to parse that list and pass the parameters to the job IF there are rows; if not, skip and complete the pipeline.
I have searched and basically got as far as this for testing:
pipeline {
    environment {
        testVar = 'foo'
    }
    agent any
    stages {
        stage('Example Build') {
            steps {
                echo 'Hello World'
                script {
                    sh "RESULT=\$(wget -qO- https://www.url.com/getlist)"
                    sh "echo \$RESULT"
                    // variable define based on parse of CSV???
                }
            }
        }
        stage('Example Deploy') {
            when {
                expression { testVar == 'foo' }
            }
            steps {
                echo 'Deploying'
                build job: 'Testing', parameters: [
                    string(name: 's_e', value: 'u'),
                    string(name: 't_e', value: 't'),
                    string(name: 's_s', value: 'DS'),
                    string(name: 't_s', value: 'SH'),
                    string(name: 'efg', value: 'TEST')
                ]
            }
        }
    }
}
Obviously I have more work to do around parsing RESULT (but I am not sure how I can achieve this in Pipeline).
I then need to check whether RESULT is empty or not, then pass the variables to the build.
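For the parsing itself, the shell side can split the single CSV row into named variables before anything is handed to Groovy; a minimal sketch (the URL and the `p1`..`p5` variable names are placeholders, not the job's real parameters):

```shell
# Sketch: split one CSV row into five fields and bail out when the list is empty.
# RESULT is hard-coded here; in the real job it would come from:
# RESULT=$(wget -qO- https://www.url.com/getlist)
RESULT="a,b,c,d,e"
if [ -n "$RESULT" ]; then
    IFS=',' read -r p1 p2 p3 p4 p5 <<< "$RESULT"
    echo "p1=$p1 p2=$p2 p3=$p3 p4=$p4 p5=$p5"
else
    echo "Nothing to do"
fi
```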

I opted for a different approach.
Instead, I now have a Jenkins job where I use "Trigger/Call Builds on other Projects".
Before that's added as a build step, I have some code to get the CSV information via wget:
RESULT=$(wget -qO- https://url.com/getlist)
if [ -z "$RESULT" ]; then
    echo "Nothing to do"
    # exit 1
else
    echo "$RESULT"
    s_env_upper=$(echo "$RESULT" | awk -F',' '{print $1}')
    t_env_upper=$(echo "$RESULT" | awk -F',' '{print $2}')
    s_env=$(echo "$s_env_upper" | tr '[:upper:]' '[:lower:]')
    t_env=$(echo "$t_env_upper" | tr '[:upper:]' '[:lower:]')
    echo "s_env=$s_env" > params.cfg
    echo "t_env=$t_env" >> params.cfg
fi
Hope this helps someone else; I was breaking my heart trying to get Pipeline to do the work, and the answer was simpler.

Related

Create .tfvars file using Bash from Jenkins pipeline

I'm trying to create a ".tfvars" file on the fly using a Bash script with Jenkins parameters as arguments; here is what I have so far:
Jenkins pipeline file
pipeline {
    agent { label "${params.environment}_slave" }
    parameters {
        string(name: 'branch', defaultValue: "main")
        choice(name: 'environment', choices: ['nonprod', 'prod'], description: 'Describe where you want this pipeline to run')
        booleanParam(name: 'bool', defaultValue: "false")
        string(name: 'string', defaultValue: "value")
        text(name: 'blabla', defaultValue: '''test\test-api\nmlflow''')
        string(name: 'int', defaultValue: "1234")
    }
    environment {
        SCM_URL = "https://my_git/my_repo"
    }
    stages {
        stage("Test if prerequisites have been executed") {
            steps {
                git branch: "$params.branch", url: "${SCM_URL}"
                sh "chmod +x -R ${env.WORKSPACE}"
                sh "./script.sh \"${params}\""
            }
        }
    }
}
Bash script:
params=$1
modified=${params:1:-1}
res=$(echo $modified | sed 's/:/=/g')
while IFS='=' read -r key value; do
    array["$key"]="$value"
done <<< "$res"
for key in "${!array[@]}"; do
    echo "$key=${array[$key]}" >> terraforms.tfvars
done
printf "terraforms.tfvars ======== \n"
cat terraforms.tfvars
and when I run everything in Jenkins, here is the result :
+ chmod +x -R /home/jenkins/workspace/my_repo
[Pipeline] sh
+ ./script.sh '[environment:nonprod, bool:false, string:value, blabla:test
test-api
mlflow, branch:main, int:1234]'
terraforms.tfvars ========
0=nonprod, bool=false, string=value, blabla=test test-api mlflow, branch=main, int=1234
I don't understand why I have 0=nonprod instead of environment=nonprod.
Any ideas, or suggestions about the whole thing?
Thank you very much.
I had 0=nonprod because I didn't declare the array as associative first:
declare -A array
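With the associative declaration in place, the string keys survive; a condensed sketch of the corrected script (the sample input imitates the `${params}` string Jenkins passes, and the splitting is simplified compared to the original loop):

```shell
#!/bin/bash
# Sketch of the fix: declare the array as associative before filling it.
declare -A array                     # without this, bash treats it as an indexed
                                     # array and every string key collapses to 0
params='[environment:nonprod, bool:false, branch:main]'   # stand-in for $1
modified=${params:1:-1}              # strip the surrounding [ ]
res=$(echo "$modified" | sed 's/:/=/g')
IFS=', ' read -ra pairs <<< "$res"   # split into key=value pairs
for pair in "${pairs[@]}"; do
    IFS='=' read -r key value <<< "$pair"
    array["$key"]="$value"
done
for key in "${!array[@]}"; do
    echo "$key=${array[$key]}"
done
```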

Running bash script from pipeline always hangs

I've created a simple pipeline which attempts to run a script; I'll then do something else with the output. However, the script (CheckTagsDates.sh) never finishes according to Jenkins. If I SSH into the Jenkins slave node, su to the jenkins user, and navigate to the correct workspace folder, I can execute the command successfully.
pipeline {
    agent { label 'agent' }
    stages {
        stage('Check for releases in past 24hr') {
            steps {
                sh 'chmod +x CheckTagsDates.sh'
                script {
                    def CheckTagsDates = sh(script: './CheckTagsDates.sh', returnStdout: true)
                    echo "${CheckTagsDates}"
                }
            }
        }
    }
}
Here are the contents of the CheckTagsDates.sh file:
#!/bin/bash
i=0
while read -r line
do
    array[$i]="$line"
    (( i++ ))
done < <( curl -L -s 'https://registry.hub.docker.com/v2/repositories/library/centos/tags' | jq -r '."results"[] | "\(.name)&\(.last_updated)"' )
for i in "${array[@]}"
do
    echo "$i" | cut -d '&' -f 1
    echo "$i" | cut -d '&' -f 2
done
Here is the output from the script in the console
latest
2020-01-18T00:42:35.531397Z
centos8.1.1911
2020-01-18T00:42:33.410905Z
centos8
2020-01-18T00:42:29.783497Z
8.1.1911
2020-01-18T00:42:19.111164Z
8
2020-01-18T00:42:16.802842Z
centos7.7.1908
2019-11-12T00:42:46.131268Z
centos7
2019-11-12T00:42:41.619579Z
7.7.1908
2019-11-12T00:42:34.744446Z
7
2019-11-12T00:42:24.00689Z
centos7.6.1810
2019-07-02T14:42:37.943412Z
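As an aside, the read-into-an-array loop in CheckTagsDates.sh can be written more compactly with mapfile, which also sidesteps the manual counter; a sketch with inline sample data standing in for the curl | jq output:

```shell
# Sketch: mapfile fills the array in one step; the printf lines imitate
# the "name&last_updated" records produced by curl | jq.
mapfile -t array < <(printf '%s\n' \
    'latest&2020-01-18T00:42:35.531397Z' \
    '8&2020-01-18T00:42:16.802842Z')
for entry in "${array[@]}"; do
    echo "${entry%%&*}"   # tag name (before the &)
    echo "${entry#*&}"    # last_updated (after the &)
done
```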
As I told you in a comment, I think that is a wrong use of the echo instruction for string interpolation.
Jenkins Pipeline uses rules identical to Groovy for string interpolation. Groovy’s String interpolation support can be confusing to many newcomers to the language. While Groovy supports declaring a string with either single quotes, or double quotes, for example:
def singlyQuoted = 'Hello'
def doublyQuoted = "World"
Only the latter string will support the dollar-sign ($) based string interpolation, for example:
def username = 'Jenkins'
echo 'Hello Mr. ${username}'
echo "I said, Hello Mr. ${username}"
Would result in:
Hello Mr. ${username}
I said, Hello Mr. Jenkins
Understanding how to use string interpolation is vital for using some of Pipeline’s more advanced features.
Source: https://jenkins.io/doc/book/pipeline/jenkinsfile/#string-interpolation
As a workaround for this case, I would suggest doing the parsing of the JSON content in Groovy instead of shell, and limiting the script to only retrieving the JSON.
pipeline {
    agent { label 'agent' }
    stages {
        stage('Check for releases in past 24hr') {
            steps {
                script {
                    def TagsDates = sh(script: "curl -L -s 'https://registry.hub.docker.com/v2/repositories/library/centos/tags'", returnStdout: true).trim()
                    TagsDates = readJSON(text: TagsDates)
                    TagsDates.results.each {
                        echo("${it.name}")
                        echo("${it.last_updated}")
                    }
                }
            }
        }
    }
}

Jenkins pipeline I need to execute the shell command and the result is the value of def variable. What shall I do? Thank you

In a Jenkins pipeline, I need to execute a shell command and use its result as the value of a def variable.
What should I do? Thank you.
def projectFlag = sh("`kubectl get deployment -n ${namespace} | grep ${project} | wc -l`")
//
if ( "${projectFlag}" == 1 ) {
    def projectCI = sh("`kubectl get deployment ${project} -n ${namespace} -o jsonpath={..image}`")
    echo "$projectCI"
} else if ( "$projectCI" == "${imageTag}" ) {
    sh("kubectl delete deploy ${project} -n ${namespaces}")
    def redeployFlag = '1'
    echo "$redeployFlag"
    if ( "$projectCI" != "${imageTag}" ) {
        sh("kubectl set image deployment/${project} ${appName}=${imageTag} -n ${namespaces}")
    } else {
        def redeployFlag = '2'
    }
}
I believe you're asking how to save the result of a shell command to a variable for later use?
The way to do this is to use some optional parameters available on the shell step interface. See https://jenkins.io/doc/pipeline/steps/workflow-durable-task-step/#sh-shell-script for the documentation
def projectFlag = sh(
    returnStdout: true,
    script: "kubectl get deployment -n ${namespace} | grep ${project} | wc -l"
).trim()
Essentially set returnStdout to true. The .trim() is critical for ensuring you don't pickup a \n newline character which will ruin your evaluation logic.
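The effect of the stray newline is easy to reproduce in plain bash; this sketch mimics what sh(returnStdout: true) hands back before .trim() is applied:

```shell
# Sketch: a captured value that still carries its trailing newline,
# as returnStdout delivers it before .trim().
raw=$'1\n'
if [ "$raw" = "1" ]; then echo "matches"; else echo "does not match"; fi
trimmed="${raw%$'\n'}"   # what .trim() effectively does here
if [ "$trimmed" = "1" ]; then echo "matches after trim"; fi
```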

Make build UNSTABLE if text found in console log using jenkinsfile (jenkins pipeline)

I am trying to log into an instance and check whether the file test.txt is non-empty; if it is, echo some text and make the build unstable using the Jenkins pipeline (Jenkinsfile). But that's not working.
I have this:
post {
    always {
        sh "ssh ubuntu@$Ip 'if [ -s test.txt ]; then echo some text && cat test.txt; fi'"
        currentBuild.result = 'UNSTABLE'
    }
}
Instead of doing above, can I parse through the console log of the latest build to find something eg: some text and if that's found I want to make the build unstable
You need to return standard out from the script:
String stdOut = sh returnStdout: true, script: "ssh ubuntu@$Ip 'if [ -s test.txt ]; then echo some text && cat test.txt; fi'"
if (stdOut != "") {
    currentBuild.result = 'UNSTABLE'
}
Or, you could use returnStatus to return the exit code of the script. The documentation for the sh step can be found here
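If you go the returnStatus route, the [ -s ] test alone already yields a usable exit code; a local sketch (using a temp file in place of the remote test.txt):

```shell
# Sketch: [ -s FILE ] succeeds only when FILE exists and is non-empty,
# so its exit status can drive the UNSTABLE decision directly.
tmpfile=$(mktemp)                                  # starts out empty
if [ -s "$tmpfile" ]; then empty_status=0; else empty_status=1; fi
echo "some text" > "$tmpfile"
if [ -s "$tmpfile" ]; then nonempty_status=0; else nonempty_status=1; fi
rm -f "$tmpfile"
echo "empty: $empty_status, non-empty: $nonempty_status"
```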

jenkins pipelines: shell script cannot get the updated environment variable

In Jenkins, I want to get a user input and pass it to a shell script for further use.
I tried setting it as an environment variable, but the shell script fails to get the latest value and echoes the old value instead.
pipeline {
    agent none
    environment {
        myVar = 'something_default'
    }
    stages {
        stage('First stage') {
            agent none
            steps {
                echo "update myVar by user input"
                script {
                    test = input message: 'Hello',
                        ok: 'Proceed?',
                        parameters: [
                            string(name: 'input', defaultValue: 'update it!', description: '')
                        ]
                    myVar = "${test['input']}"
                }
                echo "${myVar}" // value is updated
            }
        }
        stage('stage 2') {
            agent any
            steps {
                echo "${myVar}" // check whether myVar passes between stages; it outputs the expected value
                sh("./someShell.sh") // the script just contains an echo, e.g. echo "myVar is ${myVar}"
                // it echoes the old value, i.e. something_default
            }
        }
    }
}
The environment variables that we set in the pipeline script are accessible only within the script, so even if you declare your variable as global, it will not work inside a shell script.
The only option I can think of is to pass it as an argument to the shell script:
sh("./someShell.sh ${myVar}")
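On the receiving end, the value arrives as the first positional parameter; a sketch of what someShell.sh would then look like (wrapped in a function here so it runs standalone, with an illustrative input value):

```shell
# Sketch: the body of someShell.sh, wrapped in a function for a standalone demo.
# In the real script, the first line would simply be: myVar="$1"
someShell() {
    myVar="$1"
    echo "myVar is ${myVar}"
}
result=$(someShell "value_from_input")   # stands in for ./someShell.sh "$myVar"
echo "$result"
```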
EDIT:
Updated answer based on OP's query about a shell script for parsing the input:
LINE="[fristPara:100, secondPaa:abc]"
LINE=$(echo $LINE | sed 's/\[//g')
LINE=$(echo $LINE | sed 's/\]//g')
while read -d, -r pair; do
    IFS=':' read -r key val <<< "$pair"
    echo "$key = $val"
done <<< "$LINE,
"
You need to pass the variables between your stages as environment variables, e.g. like this:
stage("Ask user for input") {
    steps {
        script {
            env.DO_SOMETHING = input (...)
            env.MY_VAR = ...
        }
    }
}
stage("Do something") {
    when { environment name: 'DO_SOMETHING', value: 'yes' }
    steps {
        echo "DO_SOMETHING has the value ${env.DO_SOMETHING}"
        echo "MY_VAR has the value ${env.MY_VAR}"
    }
}
You have to declare the variable at a global scope so that both places refer to the same instance:
def myVar
pipeline { ... }
