jq in a Jenkins pipeline not saving output to variable - jenkins-pipeline

So in my Jenkins pipeline I run a couple of curl commands across different stages. I store the output of Stage1 into a file, and for every item in that list I run another curl command and use jq to extract some values from its output.
However, from the second stage I can't seem to store the jq-extracted values into variables to echo them later. What am I doing wrong?
{Stage1}
.
.
.
{Stage2}
def lines = stageOneList.readLines()
lines.each { line ->
    println line
    stageTwoList = sh(script: "curl -u $apptoken" + " -X GET --url " + '"' + "$appurl" + "components/tree?component=" + line + '"', returnStdout: true)
    pfName = sh(script: "jq -r '.component.name' <<< '${stageTwoList}' ")
    pfKey = sh(script: "jq -r '.component.key' <<< '${stageTwoList}' ")
    echo "Component Names and Keys\n | $pfName | $pfKey |"
}
This returns the following at the end of Stage2:
[Pipeline] sh
+ jq -r .component.name
digital-hot-wallet-gateway
[Pipeline] sh
+ jq -r .component.key
dhwg
[Pipeline] echo
Component Names and Keys
| null | null |
Any help in the right direction appreciated!

You passed true for the returnStdout argument of the sh step that sets stageTwoList, but then forgot to pass the same argument to the next two sh steps that parse the JSON, so their return values are null:
def lines = stageOneList.readLines()
lines.each { line ->
    println line
    stageTwoList = sh(script: "curl -u $apptoken" + " -X GET --url " + '"' + "$appurl" + "components/tree?component=" + line + '"', returnStdout: true)
    pfName = sh(script: "jq -r '.component.name' <<< '${stageTwoList}' ", returnStdout: true)
    pfKey = sh(script: "jq -r '.component.key' <<< '${stageTwoList}' ", returnStdout: true)
    echo "Component Names and Keys\n | $pfName | $pfKey |"
}
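One extra note (mine, not part of the original answer): sh with returnStdout: true also returns the command's trailing newline, so pfName and pfKey will carry line breaks into the echo unless you trim them, for example:
pfName = sh(script: "jq -r '.component.name' <<< '${stageTwoList}' ", returnStdout: true).trim()
pfKey = sh(script: "jq -r '.component.key' <<< '${stageTwoList}' ", returnStdout: true).trim()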
Note that you can also make this much easier on yourself by doing the JSON parsing natively in Groovy with Jenkins Pipeline step methods:
String stageTwoList = sh(script: "curl -u $apptoken" + " -X GET --url " + '"' + "$appurl" + "components/tree?component=" + line + '"', returnStdout: true)
Map stageTwoListData = readJSON(text: stageTwoList)
pfName = stageTwoListData['component']['name']
pfKey = stageTwoListData['component']['key']
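Putting it together, the whole Stage2 loop could look roughly like this (a sketch; readJSON comes from the Pipeline Utility Steps plugin, which this assumes is installed):
def lines = stageOneList.readLines()
lines.each { line ->
    println line
    // Fetch the component tree for this item and parse the JSON in Groovy instead of shelling out to jq
    String stageTwoList = sh(script: "curl -u $apptoken" + " -X GET --url " + '"' + "$appurl" + "components/tree?component=" + line + '"', returnStdout: true)
    Map stageTwoListData = readJSON(text: stageTwoList)
    def pfName = stageTwoListData['component']['name']
    def pfKey = stageTwoListData['component']['key']
    echo "Component Names and Keys\n | $pfName | $pfKey |"
}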

Related

Linux command execution is not working from Ruby

In my Ruby code I have to search for lines in a file and then process them. Below is the code that runs zgrep on the file and then processes the output. The code is not running; I get the error below every time.
2022-09-19 15:24:/src/SVSHelpers/file_search.rb(20):INFO:{"scanid"=>"110718", "rawrev"=>"SC_FLD_IODB0", "srchpattern"=>".*pie_mcawrp.FatalErrDetected", "limit"=>"20"}
2022-09-19 15:24:/src/SVSHelpers/file_search.rb(23):INFO:zgrep -i -m20 ".*pie_mcawrp.FatalErrDetected" /proj/platform_scandump/scanview30/ASCII/ScanviewStg/110718.txt.gz
2022-09-19 15:24:/src/SVSHelpers/file_search.rb(45):ERROR:problem in signal retrieval command: zgrep -i -m20 ".*pie_mcawrp.FatalErrDetected" /proj/platform_scandump/scanview30/ASCII/ScanviewStg/110718.txt.gz, error : undefined method `flush' for ScanView::ReportScriptingHelper::ScanViewScript, searchedline :
Here is my code:
items["srchpattern"] = items["srchpattern"].gsub("?", ".*")
$logger.info("#{items}")
command = "zgrep -i -m" + items["limit"] + ' "' + items["srchpattern"] + '" ' + $ascii_file_path + '/' + items["scanid"] +'.txt.gz'
#command='zgrep -i -m20 ".*pie_mcawrp.FatalErrDetected" /proj/platform_scandump/scanview30/ASCII/ScanviewStg/110718.txt.gz'
$logger.info("#{command}")
ll = `#{command}`
$logger.info("found lines : #{ll}")
searchedlines = ll.split("\n")
signals = []
for line in searchedlines do
tokens = line.split(",")
signals.push({
"signalNameString": tokens[0],
"signalHexValueString": tokens[1],
"signalDecimalValue": tokens[1].to_i(16)
})
end
return JSON.dump({
"statusmsg": "passed",
"data": {
"scanOrSystemIdString": items["scanid"],
"rtlSignalDatApisList": signals
}
})
This code is executed inside instant_eval; if it is run outside of instant_eval it runs fine. It also runs fine on Windows, but it causes this problem on Linux.

Can not find or read the content from variables returning null from params

I have an issue where the prompt allows the user to pick the params value based on what is loaded into the variables. The user can select the value from the variables, but the value of the params is not returned. The echo is blank, and inside the node the params value is not returned either.
+ echo
[Pipeline] }
[Pipeline] // stage
[Pipeline] }
[Pipeline] // dir
[Pipeline] }
[Pipeline] // node
[Pipeline] End of Pipeline
java.lang.NullPointerException: Cannot invoke method $() on null object
at org.codehaus.groovy.runtime.NullObject.invokeMethod(NullObject.java:91)
at org.codehaus.groovy.runtime.callsite.PogoMetaClassSite.call(PogoMetaClassSite.java:48)
at sun.nio.fs.UnixException.rethrowAsIOException(UnixException.java:102)
Script:
#!/usr/bin/env groovy
stage('Connect Primary') {
    node("Primary") {
        script {
            GET_LISTSTANDBY = sh(script: "sudo cat /pathtofile/samplestandby.txt", returnStdout: true).trim()
            println "$GET_LISTSTANDBY"
        }
        stage('Connect Primary DB Server') {
            node("nodename2") {
                sh """
                    sudo su - postgres -c 'repmgr cluster show | grep -i "standby" | sed 's/standby.*//' | sed -r 's/^.{4}//' | cut -d "|" -f 2 | sed 's/^[[:space:]]*//' > samplestandby.txt'
                    samplestandby=`sudo cat /pathtofile/samplestandby.txt | sed 's/ //g'`
                    echo "\${samplestandby}"
                    sudo cp -R /pathtofile/samplestandby.txt ${env.WORKSPACE}/dir-switch
                """.stripIndent()
                script {
                    GET_samplestandby = sh(script: "sudo cat /pathtofile/samplestandby.txt", returnStdout: true).trim()
                    println "$GET_samplestandby"
                }
            }
        }
        stage('Prompt to select Standby') {
            script {
                def nodechosen = input message: 'Choose', ok: 'Next',
                    parameters: [choice(name: 'standbynode', choices: "${GET_LISTSTANDBY}", description: 'Select the option')]
                node(nodechosen) {
                    echo "Running in Selected node for the choice prompt"
                }
            }
        }
Use the ${WORKSPACE} Jenkins environment variable in your getNodeNames() function instead of the current directory.
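For example (a minimal sketch, not from the original answer, assuming the file was copied into ${env.WORKSPACE}/dir-switch as in the pipeline above), reading the file relative to the workspace avoids depending on whatever the current directory happens to be:
script {
    // Read the standby list from the workspace copy rather than a node-local path
    def standbyList = readFile("${env.WORKSPACE}/dir-switch/samplestandby.txt").trim()
    def nodechosen = input message: 'Choose', ok: 'Next',
        parameters: [choice(name: 'standbynode', choices: standbyList, description: 'Select the option')]
    echo "Chosen node: ${nodechosen}"
}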

Output not printed to screen when command chain is executed from Groovy Script

I wrote the following Groovy code:
#!/opt/groovy-2.4.12/bin/groovy
String todayDate = new Date().format( 'yy-MM-dd' )
def p = ['/usr/bin/aws', 'rds', 'describe-db-snapshots', '--db-instance-identifier dev-rds-2017-10-02', '--snapshot-type automated', '--query "DBSnapshots[?SnapshotCreateTime>=' + todayDate +'.DBSnapshotIdentifier"'].execute() | 'grep rds'.execute() | ['tr', '-d', '\'\"|[:space:]\''].execute()
p.waitFor()
//p.waitFor()
println todayDate
println p.text
It is supposed to return the ID of the latest RDS snapshot.
When I run the script in a terminal, the only output I get is that of println todayDate, but not the output of the AWS CLI command.
Edit #1:
This is the output of the command when I run it in terminal:
$ /usr/bin/aws rds describe-db-snapshots --db-instance-identifier dev-rds-2017-10-02 --snapshot-type automated --query "DBSnapshots[?SnapshotCreateTime>='$todaydate'].DBSnapshotIdentifier" | grep rds | tr -d '\"'
rds:dev-rds-2017-10-02-2017-11-22-00-05
Edit #2:
[jenkins#ip-X-X-X-X ~]$ /usr/bin/aws rds describe-db-snapshots --db-instance-identifier dev-rds-2017-10-02 --snapshot-type automated --query "DBSnapshots[?SnapshotCreateTime>='2017-11-23'].DBSnapshotIdentifier" | grep rds | tr -d '\"'
rds:dev-rds-2017-10-02-2017-11-23-00-05
[jenkins#ip-X-X-X-X ~]$ groovy -d rdssnapshotid.groovy
/usr/bin/aws rds describe-db-snapshots --db-instance-identifier dev-rds-2017-10-02 --snapshot-type automated --query "DBSnapshots[?SnapshotCreateTime>='2017-11-23'].DBSnapshotIdentifier" | grep rds | tr -d '\"'
[jenkins#ip-X-X-X-X ~]$
Any idea what I'm doing wrong? Cause I get no errors...
You are using it incorrectly.
First check whether the statement generates the right command; it does not.
Here is a fixed script that generates the expected command. Later you can execute it.
def todaydate = new Date().format( 'yy-MM-dd' )
def cmd = ['/usr/bin/aws', 'rds', 'describe-db-snapshots', '--db-instance-identifier', 'dev-rds-2017-10-02', '--snapshot-type', 'automated', '--query', "\"DBSnapshots[?SnapshotCreateTime>='${todaydate}'].DBSnapshotIdentifier\"", '|', 'grep', 'rds', '|', 'tr', '-d', "'\\\"'"]
println cmd.join(' ')
If you want to execute it, then do:
def process = cmd.execute()
process.waitFor()
println process.text
EDIT: Based on the comments, here is a version that addresses the pipe issue (though the generated command above is OK):
def todaydate = new Date().format( 'yy-MM-dd' )
def process = ['/usr/bin/aws', 'rds', 'describe-db-snapshots', '--db-instance-identifier', 'dev-rds-2017-10-02', '--snapshot-type', 'automated', '--query', "\"DBSnapshots[?SnapshotCreateTime>='${todaydate}'].DBSnapshotIdentifier\""].execute() | ['grep', 'rds'].execute() | ['tr', '-d', "'\\\"'"].execute()
process.waitFor()
println process.text
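One caveat (my note, not part of the original answer): when the arguments are passed as a list to execute(), no shell is involved, so the escaped quotes around the --query expression and the tr argument are handed to the commands literally. A variant without that extra quoting might look like this (an untested sketch):
def todaydate = new Date().format('yy-MM-dd')
// No shell here, so each argument is passed exactly as written, without surrounding quotes
def process = ['/usr/bin/aws', 'rds', 'describe-db-snapshots',
               '--db-instance-identifier', 'dev-rds-2017-10-02',
               '--snapshot-type', 'automated',
               '--query', "DBSnapshots[?SnapshotCreateTime>='${todaydate}'].DBSnapshotIdentifier"].execute() |
              ['grep', 'rds'].execute() |
              ['tr', '-d', '"'].execute()
process.waitFor()
println process.text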

Can't store sh command output through DSL (groovy) in Jenkins pipeline job

I would like to get the last build's console output in a Jenkins pipeline job and attach it to an email (using the emailext plugin). curl works fine and returns the proper build output, but I can't store it in a variable to attach to the email. I'm using the latest Jenkins version.
I can see there are a couple of related posts about simple sh commands, but that approach doesn't work for storing the curl response.
Tried code:
1.
def consoleOutput = sh(returnStdout: true, script: 'curl http://' + jenkinsUser + ':' + jenkinsUserToken + '#' + jenkinsServer + ':8080/job/' + 'myJob/lastBuild/consoleText').trim()
echo consoleOutput
2.
sh 'curl http://' + jenkinsUser + ':' + jenkinsUserToken + '#' + jenkinsServer + ':8080/job/' + "${env.JOB_NAME}" + '/lastBuild/consoleText; echo $? > status'
def consoleOutput = readFile('status').trim()
3.
def consoleOutput = sh(script: 'curl http://' + jenkinsUser + ':' + jenkinsUserToken + '#' + jenkinsServer + ':8080/job/' + '/myJob/lastBuild/consoleText', returnStatus: true).split("\r?\n")
echo consoleOutput
It looks like you're missing the inner array, the returnStdout: true argument, and some escaped double quotes for running the script:
sh([script: "curl \"http://${jenkinsUser}:${jenkinsUserToken}#${jenkinsServer}:8080/job/myJob/lastBuild/consoleText\"", returnStdout: true]).trim()
Also, there are multiple ways to run shell scripts, and it depends on the type of Jenkins pipeline you are using.
In a Jenkins declarative pipeline you need to include a script {...} block for all script-type code and variable assignments, and that would look like this:
pipeline {
    agent {
        ...
    }
    parameters {
        ...
    }
    environment {
        ...
    }
    stages {
        stage('Run Required Scripts') {
            steps {
                ...
                script {
                    NOTIFIER_BUILD_NAME = sh([script: "./getNotifier.sh", returnStdout: true]).trim()
                    EMAIL_TEXT = sh([script: "./printEmailText.sh ${CURRENT_BUILD} ${PREVIOUS_BUILD}", returnStdout: true]).trim()
                    BODY = sh([script: "curl \"http://${jenkinsUser}:${jenkinsUserToken}#${jenkinsServer}:8080/job/myJob/lastBuild/consoleText\"", returnStdout: true]).trim()
                }
            }
        }
        stage('Send Email') {
            when {
                expression {
                    // Only send when there is text.
                    "${EMAIL_TEXT}" != "";
                }
            }
            steps {
                emailext(
                    to: 'software#company.com',
                    subject: "You have mail - ${EMAIL_TEXT}",
                    body: """${NOTIFIER_BUILD_NAME} - ${EMAIL_TEXT}:
                        ...
                        ${BODY}
                    """,
                    attachLog: false
                )
            }
        }
    }
}
In a Jenkins scripted pipeline you don't need a script {} block; you can put it in most places. Mostly I've put it in stage blocks, stage('some stage') {...}, and I've done it like this:
V5_DIR = WORKSPACE + '/' + sh([script: "basename ${V5_GIT_URL} .git", returnStdout: true]).trim()
Although I've also used curl commands (for scripted pipelines) and didn't need the inner array...
lastSuccessfulCommit = sh(
script: "curl -sL --user ${JENKINS_API_USER}:${JENKINS_API_PSW} \"${lastSuccessfulCommitUrl}\" | sed -e 's/<[^>]*>//g'",
returnStdout: true
)
And for reference, echoing vars looks like this in both:
sh([script: "echo \"Value: ${someVariable}\""])
Hopefully this documentation helps a bit too, but I know Jenkins documentation has been pretty spotty recently, so I also found a great gist about how not to do things in Jenkins declarative pipelines. Good luck!
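For completeness, a scripted-pipeline version of the same idea might look like this (my sketch, not from the answer above; it assumes the Email Extension plugin is installed and that jenkinsUser, jenkinsUserToken, and jenkinsServer are defined elsewhere, e.g. from credentials):
node {
    stage('Mail last build log') {
        // Capture the console log of the last build of myJob via the Jenkins REST API
        def consoleOutput = sh(
            script: "curl -s --user \"${jenkinsUser}:${jenkinsUserToken}\" \"http://${jenkinsServer}:8080/job/myJob/lastBuild/consoleText\"",
            returnStdout: true
        ).trim()
        // Send it as the body of the notification email
        emailext(
            to: 'software#company.com',
            subject: 'Last build log for myJob',
            body: consoleOutput,
            attachLog: false
        )
    }
}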

Gradle Exec task fails running sed

My build script has the following task:
task editProjectArtificat (type:Exec) {
executable "sed"
args "-e '/myInsertionMatchingPattern/r " + projectDir.toString() + "/scripts/install/myTextFileToInsert' < " + projectDir.toString() + "/build/scripts/MyOriginalInputFile > " + projectDir.toString() + "/build/scripts/MyChangedOutputFile"
}
gradle build fails when the above task executes, with this error message:
sed: 1: " '/myInsertionPattern/r ...": invalid command code
FAILURE: Build failed with an exception.
What went wrong:
Execution failed for task ':MyProject:editProjectArtificat'.
Process 'command 'sed'' finished with non-zero exit value 1
However, when I change the build.gradle script to make the task look like this:
task editProjectArtificat (type:Exec) {
executable "sed"
args "-e /myInsertionMatchingPattern/r " + projectDir.toString() + "/scripts/install/myTextFileToInsert < " + projectDir.toString() + "/build/scripts/MyOriginalInputFile > " + projectDir.toString() + "/build/scripts/MyChangedOutputFile"
}
Now, with both of the ' characters removed from the args line, we no longer get Gradle build errors; however, sed does not produce the MyChangedOutputFile file as expected when the Gradle build is done.
Typing the sed command with both ' characters in a shell produces the expected output, and sed fails when they are removed from the shell command. My understanding is that sed needs the ' quotes around the matching pattern and commands.
I don't know Gradle, but it seems like args needs a list with each argument separated. However, you are using redirection (< and >), which has to be done by a shell, so you shouldn't be executing sed but bash. You want to have something like bash -c "sed -e '/.../r ...' < ... > ...", so something like this might work:
task editProjectArtificat (type:Exec) {
executable "bash"
args "-c", "sed -e '/myInsertionMatchingPattern/r " + projectDir.toString() + "/scripts/install/myTextFileToInsert' < " + projectDir.toString() + "/build/scripts/MyOriginalInputFile > " + projectDir.toString() + "/build/scripts/MyChangedOutputFile"
}
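Equivalently (my sketch, not from the original answer), the same idea can be written with commandLine, which Gradle's Exec task also supports:
task editProjectArtificat (type: Exec) {
    // Let bash handle the quoting and the input/output redirection
    commandLine 'bash', '-c',
        "sed -e '/myInsertionMatchingPattern/r " + projectDir.toString() + "/scripts/install/myTextFileToInsert' < " +
        projectDir.toString() + "/build/scripts/MyOriginalInputFile > " +
        projectDir.toString() + "/build/scripts/MyChangedOutputFile"
}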
