In Jenkins pipeline, how can I convert a String to Date? - jenkins-pipeline

I just want to convert 2019-11-05T08:43:43.488-0500 to a Date object. I have seen Groovy's String-to-Date conversion, but it doesn't work in a pipeline (I'm aware that not all Groovy features work in pipelines).

You can use java.text.SimpleDateFormat to parse a String into a Date object in a Jenkins Pipeline. This is actually what Date.parse(format, date) does under the hood - https://github.com/apache/groovy/blob/GROOVY_2_4_12/src/main/org/codehaus/groovy/runtime/DefaultGroovyStaticMethods.java#L186
You will, however, need to approve the DateFormat.parse(date) method signature when you run it for the first time in a Jenkins Pipeline:
Scripts not permitted to use method java.text.DateFormat parse java.lang.String. Administrators can decide whether to approve or reject this signature.
[Pipeline] End of Pipeline
org.jenkinsci.plugins.scriptsecurity.sandbox.RejectedAccessException: Scripts not permitted to use method java.text.DateFormat parse java.lang.String
at org.jenkinsci.plugins.scriptsecurity.sandbox.whitelists.StaticWhitelist.rejectMethod(StaticWhitelist.java:175)
When you approve it, the following code should work for you:
import java.text.SimpleDateFormat

pipeline {
    agent any

    stages {
        stage("Test") {
            steps {
                script {
                    def date = "2019-11-05T08:43:43.488-0500"
                    def format = "yyyy-MM-dd'T'HH:mm:ss.SSSZ"
                    def parsed = new SimpleDateFormat(format).parse(date)
                    echo "date = ${parsed}"
                }
            }
        }
    }
}
The output:
Running on Jenkins in /home/wololock/.jenkins/workspace/pipeline-sandbox
[Pipeline] {
[Pipeline] stage
[Pipeline] { (Test)
[Pipeline] script
[Pipeline] {
[Pipeline] echo
date = Tue Nov 05 14:43:43 CET 2019
[Pipeline] }
[Pipeline] // script
[Pipeline] }
[Pipeline] // stage
[Pipeline] }
[Pipeline] // node
[Pipeline] End of Pipeline
Finished: SUCCESS
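If you prefer the newer java.time API, the same timestamp can be parsed roughly as sketched below (note that these classes may also require script approval in the sandbox; the "Z" pattern letter matches a zone offset such as -0500):

```groovy
import java.time.OffsetDateTime
import java.time.format.DateTimeFormatter

// Parse the timestamp together with its UTC offset
def formatter = DateTimeFormatter.ofPattern("yyyy-MM-dd'T'HH:mm:ss.SSSZ")
def parsed = OffsetDateTime.parse("2019-11-05T08:43:43.488-0500", formatter)
echo "date = ${parsed}"
```

Unlike SimpleDateFormat, the java.time classes are immutable and thread-safe, which can matter when stages run in parallel.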

Related

Error exit code 1 showing in Jenkins console output (I do not want to see it)

I have a job running in a Jenkins pipeline, and the console output shows "error exit code 1" because I am using an if statement to set the stage to NOT_BUILT. Is there a way to hide this "exit code 1" error? I do not want to use the when directive; if possible, I would like to keep the if statement and have a blank stage, but without the "exit code 1" error message in the console output.
This is my script below :
if (route53 == 'false') {
    catchError(buildResult: 'SUCCESS', stageResult: 'NOT_BUILT') {
        sh "exit 1"
    }
}
else if (route53 == 'true' && all == "Yes") {
    catchError(buildResult: 'SUCCESS', stageResult: 'NOT_BUILT') {
        sh "exit 1"
    }
}
The pipeline console output shows the following. The stage graph is fine, as it shows a blank stage, but the console output error is what I really want to get rid of:
+ exit 1
[Pipeline] }
[Pipeline] }
ERROR: script returned exit code 1
[Pipeline] }
ERROR: script returned exit code 1
[Pipeline] }
ERROR: script returned exit code 1
[Pipeline] }
When using declarative pipelines, the NOT_BUILT state is reserved for a stage that was not executed because its when directive evaluated to false, and there is no direct way to set it except with the catchError workaround. (By the way, you can control the error message by using error('Your message') instead of exit 1.)
Therefore, it is easiest to achieve this with the when directive, which also makes your pipeline more readable. If you insist on using if statements, you can still use them inside a when directive with the generic expression option, which allows you to run arbitrary Groovy code and return the relevant Boolean value according to your needs.
So you can keep your original logic and just update it to return a Boolean:
stage('Conditional stage') {
    when {
        expression {
            if (route53 == 'false') {
                return false
            }
            else if (route53 == 'true' && all == "Yes") {
                return false
            }
            return true
        }
    }
    steps {
        ...
    }
}
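If you do stick with the catchError workaround, replacing sh "exit 1" with the error step at least lets you control the message that appears in the console. A sketch, reusing the question's route53 condition:

```groovy
if (route53 == 'false') {
    catchError(buildResult: 'SUCCESS', stageResult: 'NOT_BUILT') {
        // error() fails the stage with a custom message instead of
        // the generic "script returned exit code 1"
        error('Stage skipped: route53 is disabled')
    }
}
```

The stage still ends up NOT_BUILT and the build still succeeds; only the console message changes.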

How to archive artifacts at the end of a pipeline stage

I have the following pipeline
node("testNode") {
    def env = "${ENV}"
    stage("Copy artifact") {
        copyArtifacts(projectName: 'appBuildJob', selector: lastCompleted())
    }
    stage("Archive artifact") {
        // Archive the build output artifacts.
        archiveArtifacts artifacts: "app/build/outputs/apk/app-${ENV}.apk"
    }
    stage('env1') {
        if (env == "env1") {
            buildResult = build(job: 'env1Tests', propagate: false).result
            currentBuild.description = 'env1 - ' + buildResult
        } else {
            echo 'Env param is ' + env + '. Nothing to do here.'
        }
    }
    stage('env2') {
        if (env == "env2") {
            buildResult = build(job: 'env2Tests', propagate: false).result
            currentBuild.description = 'env2 - ' + buildResult
        } else {
            echo 'Env param is ' + env + '. Nothing to do here.'
        }
    }
}
My problem is that the appBuildJob job can run with either the env1 or env2 parameter, and both env1Tests and env2Tests depend on the artifact of my pipeline. So if, for example, env1 runs now, it saves the env1 artifact. If I then run env2, env2Tests fails because it cannot find the env2 app - it only finds the env1 app. After that failure, once the pipeline ends, the env2 artifact is saved, so if I immediately run env2 again, it works.
I want to save and overwrite the artifact at the end of the "Archive artifact" stage, but it only seems to happen when the whole pipeline has ended.
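One possible direction (a sketch, assuming the downstream test jobs also use the Copy Artifact plugin, and with a hypothetical upstream job name) is to have each test job copy only the artifact matching its own environment via the filter parameter, so a leftover artifact from the other environment is never picked up:

```groovy
// Hypothetical step inside env2Tests: copy only the env2 APK
// from the latest successful run of the upstream pipeline.
copyArtifacts(
    projectName: 'upstreamPipeline',  // assumed upstream job name
    selector: lastSuccessful(),
    filter: 'app/build/outputs/apk/app-env2.apk'
)
```

This sidesteps the timing question entirely: each test job becomes insensitive to which environment's artifact was archived last.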

What is the best possible way to read the data from a file using readFile and converting it to a List in groovy?

I'm trying to read values from text files and putting the values into the list using the below method.
def myKeys = []
new File('/tmp/A.txt').eachLine { line ->
    myKeys << line
}

def myValues = []
new File('/tmp/B.txt').eachLine { line ->
    myValues << line
}
The problem is that Jenkins doesn't allow this to run on an agent, and I'm not sure how to use the readFile step here, because on its own it doesn't solve my problem - I want to build a List, which readFile alone cannot do.
You can get the same result using the readFile step. It reads a given file from your workspace and returns the content of the file as a string. Then you can use the String.eachLine(closure) method to iterate over every line and add it to the list. Keep one thing in mind, however - if you want to use String.eachLine(), you need to call it from a @NonCPS method. Otherwise, you will get at best a single element from the iteration.
Take a look at the following example:
pipeline {
    agent any

    stages {
        stage("Read test.txt file") {
            steps {
                script {
                    final String content = readFile(file: "test.txt")
                    final List myKeys = extractLines(content)
                    echo "myKeys = ${myKeys}"
                }
            }
        }
    }
}

@NonCPS
List extractLines(final String content) {
    List myKeys = []
    content.eachLine { line ->
        myKeys << line
    }
    return myKeys
}
In this example, we use a simple test.txt file with the following content:
$ cat test.txt
123
qwe
asd
zxc
Running this example pipeline produces the following output:
Running on Jenkins in /home/wololock/.jenkins/workspace/jobA
[Pipeline] {
[Pipeline] stage
[Pipeline] { (Read test.txt file)
[Pipeline] script
[Pipeline] {
[Pipeline] readFile
[Pipeline] echo
myKeys = [123, qwe, asd, zxc]
[Pipeline] }
[Pipeline] // script
[Pipeline] }
[Pipeline] // stage
[Pipeline] }
[Pipeline] // node
[Pipeline] End of Pipeline
Finished: SUCCESS
You could use a similar approach to extract keys and values from two different files, e.g.:
def myKeys = extractLines(readFile(file:"/tmp/A.txt"))
def myValues = extractLines(readFile(file:"/tmp/B.txt"))
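If you then need to combine the two lists into a key-value map, Groovy's transpose() and collectEntries() can do it. A sketch (like eachLine, this iteration belongs in a @NonCPS method; pairUp is a hypothetical helper name):

```groovy
@NonCPS
Map pairUp(List keys, List values) {
    // [keys, values].transpose() pairs elements positionally:
    // [[k1, v1], [k2, v2], ...], which collectEntries turns into a map
    return [keys, values].transpose().collectEntries { k, v -> [(k): v] }
}

// e.g. pairUp(["a", "b"], ["1", "2"]) yields [a:1, b:2]
```

This assumes both files have the same number of lines; transpose() silently truncates to the shorter list otherwise.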

How to pass pipeline stage variable to makefile

I am calculating the value of a local variable (S_STACK_ID) inside a dynamic stage of a Jenkins pipeline.
I need to pass the S_STACK_ID variable to a makefile, so that it can be used there to uniquely identify the ECS stack to be deployed.
I have tried the code below, but it passes a blank ARGS to the makefile:
stage('build') {
    steps {
        script {
            def stages = [failFast: true]
            for (int i = 1; i < 5; i++) {
                stages["LG ${i}"] = {
                    stage("LG ${i}") {
                        S_STACK_ID = env.STACK_ID + i
                        withCredentials([[
                        sh 'make ARGS="${S_STACK_ID}" build'
                    }
                }
            }
            parallel stages
        }
    }
}
sh 'make ARGS="myStack" build'       // This correctly passes "myStack" to the makefile
sh 'make ARGS="${S_STACK_ID}" build' // This passes a blank value instead of S_STACK_ID, which is my issue
Thanks
This worked, as double quotes are required for Groovy to interpolate variables into the string passed to sh:
sh "make clean \"ARGS=${S_STACK_ID}\""
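An alternative sketch: expose the value as an environment variable with withEnv and keep the sh string single-quoted, letting the shell itself do the expansion. This also avoids interpolating Groovy variables directly into shell commands, which Jenkins flags as a potential injection risk:

```groovy
withEnv(["S_STACK_ID=${S_STACK_ID}"]) {
    // Single quotes: ${S_STACK_ID} is expanded by the shell,
    // which now sees it as an environment variable.
    sh 'make ARGS="${S_STACK_ID}" build'
}
```

Here the single-quoted string reaches the shell verbatim, and the shell resolves ${S_STACK_ID} from the environment set up by withEnv.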

I have 3 stages to build in Jenkins using pipeline code (scripted)

I have 3 stages (a, b, c) to run on Jenkins using scripted pipeline code. I need to run stages a and b in parallel and run c after a succeeds. With the code below, Blue Ocean shows only the task names, but I want to see the stage names (in this case I have only 2 tasks for 3 stages, and stages a and c are in one task). Can someone help me figure out how to view all three stages in this situation?
def stages = [failFast: false]
def testList = ["a", "b", "c"]
def tasks = [:]

tasks["a-and-c"] = {
    stage("a") {
        ansiColor('xterm') {
            sh "ls -lart; sleep 30"
        }
        if (currentBuild.currentResult == 'SUCCESS') {
            stage("c") {
                ansiColor('xterm') {
                    sh "ls -lart"
                }
            }
        } else {
            sh 'exit'
        }
    }
}

tasks["c"] = {
    stage("c") {
        ansiColor('xterm') {
            sh "ls -lart; sleep 20"
        }
    }
}

parallel tasks
I am expecting a separate view in Blue Ocean for all three stages. Right now I get a-and-c and b in parallel, but I am looking for a and b in parallel, with c running after a succeeds. Thank you in advance.
