Given the following script:
node {
def hello = "Hello"
stage("Greetings") {
echo "${hello}world!"
}
}
The logs display Helloworld!, as expected.
However, when I attempt to use the variable in a multi-line sh command:
node {
def hello = "Hello"
stage("Greetings") {
sh '''
echo ${hello}world!
'''
}
}
The variable is treated as an empty string, and the output is just world!
Why does this happen, and how can I fix it?
Use double quotes instead of single quotes. Groovy only performs ${...} interpolation inside double-quoted (including triple-double-quoted) strings; with triple single quotes the literal ${hello} is passed to the shell, which expands it as an undefined, and therefore empty, shell variable. When you switch to triple double quotes, escape genuine shell variables as \$VAR so Groovy leaves them for the shell:
node {
def hello = "Hello"
stage("Greetings") {
sh """
export GREETINGS=5
echo ${hello}world \$GREETINGS times!
"""
}
}
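If you'd rather keep the shell script in single quotes so that Groovy never rewrites it, a sketch of an alternative is to hand the value over as a real environment variable with withEnv and let the shell do all of the expansion; the Jenkins documentation recommends this pattern, particularly for values that may contain user-controlled or sensitive data:
node {
    def hello = "Hello"
    stage("Greetings") {
        // HELLO becomes an environment variable of the sh step, so the shell
        // expands ${HELLO} itself; Groovy leaves the single-quoted script untouched.
        withEnv(["HELLO=${hello}"]) {
            sh '''
                echo ${HELLO}world!
            '''
        }
    }
}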
I have written a shell script in a Groovy function which should return the output (on a single line) as abcd-def-chart, but I am getting the output split across two lines as shown below:
abcd-def
-chart
My groovy code:
String getChartName(Map configuration = [:]) {
if (configuration.chartName != null) {
return configuration.chartName
}
chartName = ""
if (configuration.buildSystem == 'maven') {
chartName = getMavenProjectName() + "-chart"
}
echo "chartName: ${chartName}"
return chartName
}
String getMavenProjectName() {
echo "inside getMavenProjectName +++++++"
def mavenChartName = sh returnStdout:true, script: '''
#!/bin/bash
GIT_LOG=$(env -i git config --get remote.origin.url)
basename "$GIT_LOG" .git; '''
echo "mavenChartName: ${mavenChartName}"
return mavenChartName
}
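One likely explanation: sh(returnStdout: true) returns the command's output including its trailing newline, so the later concatenation with "-chart" starts on a new line. A sketch of the usual fix is to trim the value before returning it:
String getMavenProjectName() {
    // returnStdout captures the trailing newline printed by basename,
    // so trim it before the caller appends "-chart"
    def mavenChartName = sh(returnStdout: true, script: '''
        GIT_LOG=$(env -i git config --get remote.origin.url)
        basename "$GIT_LOG" .git
    ''').trim()
    echo "mavenChartName: ${mavenChartName}"
    return mavenChartName
}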
I've created a simple pipeline which attempts to run a script, after which I'll do something else with the output; however, the script (CheckTagsDates.sh) never finishes according to Jenkins. If I SSH into the Jenkins slave node, su as the jenkins user, and navigate to the correct workspace folder, I can execute the command successfully.
pipeline {
agent {label 'agent'}
stages {
stage('Check for releases in past 24hr') {
steps{
sh 'chmod +x CheckTagsDates.sh'
script {
def CheckTagsDates = sh(script: './CheckTagsDates.sh', returnStdout: true)
echo "${CheckTagsDates}"
}
}
}
}
}
Here are the contents of the CheckTagsDates.sh file:
#!/bin/bash
while read line
do
array[ $i ]="$line"
(( i++ ))
done < <( curl -L -s 'https://registry.hub.docker.com/v2/repositories/library/centos/tags'|jq -r '."results"[] | "\(.name)&\(.last_updated)"')
for i in "${array[@]}"
do
echo $i | cut -d '&' -f 1
echo $i | cut -d '&' -f 2
done
Here is the output from the script in the console
latest
2020-01-18T00:42:35.531397Z
centos8.1.1911
2020-01-18T00:42:33.410905Z
centos8
2020-01-18T00:42:29.783497Z
8.1.1911
2020-01-18T00:42:19.111164Z
8
2020-01-18T00:42:16.802842Z
centos7.7.1908
2019-11-12T00:42:46.131268Z
centos7
2019-11-12T00:42:41.619579Z
7.7.1908
2019-11-12T00:42:34.744446Z
7
2019-11-12T00:42:24.00689Z
centos7.6.1810
2019-07-02T14:42:37.943412Z
As I mentioned in a comment, I think this is an incorrect use of the echo instruction for string interpolation.
Jenkins Pipeline uses rules identical to Groovy for string interpolation. Groovy’s String interpolation support can be confusing to many newcomers to the language. While Groovy supports declaring a string with either single quotes, or double quotes, for example:
def singlyQuoted = 'Hello'
def doublyQuoted = "World"
Only the latter string will support the dollar-sign ($) based string interpolation, for example:
def username = 'Jenkins'
echo 'Hello Mr. ${username}'
echo "I said, Hello Mr. ${username}"
Would result in:
Hello Mr. ${username}
I said, Hello Mr. Jenkins
Understanding how to use string interpolation is vital for using some of Pipeline’s more advanced features.
Source: https://jenkins.io/doc/book/pipeline/jenkinsfile/#string-interpolation
As a workaround for this case, I would suggest parsing the JSON content in Groovy instead of in the shell, and limiting the script to only retrieving the JSON:
pipeline {
agent {label 'agent'}
stages {
stage('Check for releases in past 24hr') {
steps{
script {
def TagsDates = sh(script: "curl -L -s 'https://registry.hub.docker.com/v2/repositories/library/centos/tags'", returnStdout: true).trim()
TagsDates = readJSON(text: TagsDates)
TagsDates.results.each {
echo("${it.name}")
echo("${it.last_updated}")
}
}
}
}
}
}
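Note that readJSON comes from the Pipeline Utility Steps plugin, which must be installed for this to work. If you also want to keep only the tags updated within the last 24 hours, as the stage name suggests, here is a rough sketch of the extra filtering inside the same script block (assuming the ISO-8601 last_updated format shown in the question; the java.time calls may need script approval in a sandboxed pipeline):
def cutoff = java.time.Instant.now().minus(java.time.Duration.ofHours(24))
TagsDates.results.each {
    // last_updated looks like 2020-01-18T00:42:35.531397Z, which Instant.parse accepts
    if (java.time.Instant.parse(it.last_updated).isAfter(cutoff)) {
        echo("${it.name} was updated in the past 24h (${it.last_updated})")
    }
}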
How do I write the shell script below in Groovy?
process_name="spin_user"
if grep -i ${process_name} /tmp/output.log ; then
echo "Success"
grep -i ${process_name} output.log > final_output.log
else
echo "failure"
fi
<< edited in response to comment >>
1. Pure Groovy Solution
If you just want to implement the functionality of your bash script in Groovy, you can do something like this:
def processName = 'spin_user'
def outLog = new File('/tmp/output.log')
def finalLog = new File('final_output.log')
def lines = outLog.readLines()
def hasProcess = { it.toLowerCase().contains(processName) }
if(lines.any(hasProcess)) {
println "Sucess"
finalLog.text = lines.findAll(hasProcess).join('\n')
} else {
println "failure"
}
It should be noted that if your log file is large, there are better ways of searching for a string that do not require loading the entire file into memory.
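For example, a sketch that streams the file line by line with eachLine instead of readLines() (same assumed paths as above; note that, unlike the version above, final_output.log is created even when nothing matches):
def processName = 'spin_user'
def outLog = new File('/tmp/output.log')
def finalLog = new File('final_output.log')
def found = false
finalLog.withWriter { writer ->
    // eachLine reads lazily, one line at a time, so the whole log never sits in memory
    outLog.eachLine { line ->
        if (line.toLowerCase().contains(processName)) {
            found = true
            writer.writeLine(line)
        }
    }
}
println(found ? 'Success' : 'failure')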
2. Process Management Solution
If you were specifically looking to use the Linux grep command from within Groovy, the above will naturally not help you. The following Groovy code:
import java.util.concurrent.*
def processName = 'spin_user'
def outLog = '/tmp/output.log'
def finalLog = 'final_output.log'
def result = exec('grep', '-i', processName, outLog)
if (result) {
println "success"
new File(finalLog).text = result
} else {
println "failure"
}
String exec(String... args) {
def out = new StringBuffer()
def err = new StringBuffer()
def process = args.toList().execute()
process.consumeProcessOutput(out, err)
process.waitForOrKill(5000)
if (err) {
println "ERROR: executing ${args} returned ${process.exitValue()}"
println "STDERR: ${err}"
}
out
}
will execute the grep command and should get you closer to what you want.
It should be noted that the output redirection (>) in your shell command is, as far as I know, hard to do on an external process from Java/Groovy, so we write the output to the final_output.log file from within Groovy instead of executing the command with output redirection.
I also added a five-second maximum timeout on the grep process execution. This is not required and that line can safely be removed; it is just there as a safeguard for cases where grep blocks indefinitely.
Below is my pipeline snippet. I am trying to assign the RSTATE variable a value at run time. The value is stored in a text file, but we need to grep and cut it, so the output of a shell command should become its value.
pipeline
{
agent any
environment
{
RSTATE = 'R4C'
ISO_REV = 'TA'
BuildSource = '18'
}
stages
{
stage('get Rstate')
{
steps
{
echo env.RSTATE
}
}
}
}
I am trying to assign the RSTATE value like this:
RSTATE = sh(script: 'grep RSTATE /proj/MM/scm/com/iv_build/mm18_1/rstate/next_rstate.txt | cut -d "=" -f2', returnStdout: true).trim()
But this is not working. I also tried running a shell script, but that does not work either. Only a hard-coded value works. Please suggest a solution.
I tested this and it works; you need to validate that your script returns the value you want.
pipeline
{
agent any
environment
{
RSTATE = 'R4C'
ISO_REV = 'TA'
BuildSource = '18'
}
stages
{
stage('get Rstate')
{
steps {
script {
def RSTATE2 = sh ( script: 'echo \${RSTATE}', returnStdout: true).trim()
echo env.RSTATE
echo RSTATE2
}
}
}
}
}
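For the original goal, one sketch (an assumption rather than part of the answer above, reusing the path from the question) is to drop the hard-coded value from the environment directive and assign env.RSTATE inside a script block; assignments made through env this way remain visible to later steps and stages:
pipeline {
    agent any
    stages {
        stage('get Rstate') {
            steps {
                script {
                    // compute the value once and expose it as an environment variable
                    env.RSTATE = sh(
                        script: 'grep RSTATE /proj/MM/scm/com/iv_build/mm18_1/rstate/next_rstate.txt | cut -d "=" -f2',
                        returnStdout: true
                    ).trim()
                    echo env.RSTATE
                }
            }
        }
    }
}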
In Jenkins, I want to get user input and pass it to a shell script for further use.
I tried to set it as an environment variable, but the shell script fails to get the latest value and echoes the old value instead.
pipeline {
agent none
environment{
myVar='something_default'
}
stages {
stage('First stage') {
agent none
steps{
echo "update myVar by user input"
script {
test = input message: 'Hello',
ok: 'Proceed?',
parameters: [
string(name: 'input', defaultValue: 'update it!', description: '')
]
myVar = "${test['input']}"
}
echo "${myVar}" // value is updated
}
}
stage('stage 2') {
agent any
steps{
echo "${myVar}" // try to see can myVar pass between stage and it output expected value
sh("./someShell.sh") // the script just contain a echo e.g. echo "myVar is ${myVar}"
// it echo the old value. i.e.something_default
}
}
}
}
The variables that we set in the pipeline script are accessible only within the Groovy script itself. So, even if you declare your variable as global, it will not be visible inside a shell script.
The only option I can think of is to pass it as an argument to the shell script:
sh("./someShell.sh ${myVar}")
EDIT:
Updated answer based on the OP's query about a shell script for parsing the input:
LINE="[fristPara:100, secondPaa:abc]"
LINE=$(echo $LINE | sed 's/\[//g')
LINE=$(echo $LINE | sed 's/\]//g')
while read -d, -r pair; do
IFS=':' read -r key val <<<"$pair"
echo "$key = $val"
done <<<"$LINE,
"
You need to pass the variables between your stages as environment variables, e.g. like this:
stage("As user for input") {
steps {
env.DO_SOMETING = input (...)
env.MY_VAR = ...
}
}
stage("Do something") {
when { environment name: 'DO_SOMETING', value: 'yes' }
steps {
echo "DO_SOMETING has the value ${env.DO_SOMETHING}"
echo "MY_VAR has the value ${env.MY_VAR}"
}
}
You have to declare the variable at global scope so that both places refer to the same instance.
def myVar
pipeline { ... }
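A minimal sketch of that idea, simplified to agent any and reusing the names from the question (note that the Groovy variable still has to be interpolated into the sh call, or exported as an environment variable, for the shell script to see it):
def myVar  // global, shared by every stage in the pipeline

pipeline {
    agent any
    stages {
        stage('First stage') {
            steps {
                script {
                    // with a single parameter, input returns that parameter's value directly
                    myVar = input(message: 'Hello', ok: 'Proceed?',
                        parameters: [string(name: 'input', defaultValue: 'update it!', description: '')])
                }
            }
        }
        stage('stage 2') {
            steps {
                // pass the value as an argument so the shell script receives it as $1
                sh "./someShell.sh '${myVar}'"
            }
        }
    }
}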