I am trying to set up a Jenkins pipeline script that sends out an email when a job has been running for more than 24 hours.
// Long running jobs
pipeline {
    agent any
    environment {
        EMAIL_ALERT_TO = "address"
        EMAIL_ALERT_CC = "address"
    }
    stages {
        stage('def methods') {
            steps {
                script {
                    Jenkins.instance.getAllItems(Job).each() { job ->
                        if (job.isBuilding()) {
                            def myBuild = job.getLastBuild()
                            def runningSince = groovy.time.TimeCategory.minus(new Date(), myBuild.getTime())
                            echo "myBuild = ${myBuild}"
                            echo "runningSince = ${runningSince}"
                            env.myBuild = myBuild
                            env.runningSince = runningSince
                        }
                    }
                }
            }
        }
    }
    post {
        // Email out the results
        always {
            script {
                if (runningSince.hours >= 1) {
                    mail to: "${env.EMAIL_ALERT_TO}",
                         cc: "${env.EMAIL_ALERT_CC}",
                         subject: "Long Running Jobs",
                         body: "Build: ${myBuild} ---- Has Been Running for ${runningSince.hours} hours:${runningSince.minutes} minutes"
                }
            }
        }
    }
}
I am seeing a RejectedAccessException, which appears to be related to arrays/lists.
This is what I believe you are looking for
https://issues.jenkins-ci.org/browse/JENKINS-54952?page=com.atlassian.jira.plugin.system.issuetabpanels%3Achangehistory-tabpanel
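If the script has to run in the sandbox anyway, one way to reduce the number of rejected signatures is to avoid TimeCategory and use plain millisecond arithmetic. A minimal sketch (my own, assuming the Jenkins.instance/getAllItems and related signatures have been approved in In-process Script Approval, or that the script runs outside the sandbox):
script {
    // Iterate over every job; isBuilding() picks out those with a run in progress.
    Jenkins.instance.getAllItems(Job).each { job ->
        if (job.isBuilding()) {
            def build = job.getLastBuild()
            // Plain long arithmetic instead of TimeCategory, which tends to trigger sandbox rejections.
            long runningMillis = System.currentTimeMillis() - build.getStartTimeInMillis()
            long runningHours = runningMillis.intdiv(1000 * 60 * 60)
            if (runningHours >= 24) {
                echo "${job.fullName} #${build.number} has been running for ${runningHours} hours"
            }
        }
    }
}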
So I am already running Jenkins pipelines with parallel stages based on the example from: Is it possible to create parallel Jenkins Declarative Pipeline stages in a loop?
I want to run each job in a different isolated container; the agent name should be the same for all of them. I tried a few options, and all of them ended up with errors. I think I need to use both declarative and scripted pipelines, but I am not sure how.
Things I tried:
def generateTerraformStage(env) {
    return {
        agent { label 'local_terraform' }
        stage("stage: Terraform ${TERRAFORM_ACTION} ${env}") {
            echo "${env}"
            sleep 30
        }
    }
}

stage('parallel stages') {
    agent { label 'local_terraform' }
    steps {
        script {
            parallel parallelStagesMapEnvironment
        }
    }
}
One of the errors I got during testing:
"java.lang.NoSuchMethodError: No such DSL method 'agent' found among steps" and "java.lang.IllegalArgumentException: Expected named arguments but got org.jenkinsci.plugins.workflow.cps.CpsClosure2#560f3533"
Dynamic parallel stages can be created only by using a Scripted Pipeline. The API built into Declarative Pipeline (agent, options, when, etc.) is not available there.
I don't see any indication that you really need dynamic stages (e.g. based on the value returned by a 3rd-party service), so I prepared two solutions:
dynamic parallel stages - stages are generated at runtime based on some input
static parallel stages - you know all the stages upfront (the when block can be used to disable the ones that are not needed - e.g. based on passed-in parameters)
pipeline {
    // ...
    stages {
        stage('dynamic parallel stages') {
            steps {
                script {
                    // params.ENVS == 'envA,envB,envC' (comma-separated string parameter)
                    def values = params.ENVS.split(',')
                    def stages = [:]
                    for (def value in values) {
                        stages[value] = generateTerraformStage(value)
                    }
                    parallel stages
                }
            }
        }
        stage('static parallel stages') {
            parallel {
                stage('envA') {
                    agent { label 'local_terraform' }
                    when {
                        expression { return params.ENVS.split(',').contains('envA') }
                    }
                    steps {
                        terraformStageLogic 'envA'
                    }
                }
                stage('envB') {
                    agent { label 'local_terraform' }
                    when {
                        expression { return params.ENVS.split(',').contains('envB') }
                    }
                    steps {
                        terraformStageLogic 'envB'
                    }
                }
                stage('envC') {
                    agent { label 'local_terraform' }
                    when {
                        expression { return params.ENVS.split(',').contains('envC') }
                    }
                    steps {
                        terraformStageLogic 'envC'
                    }
                }
                // ...
            }
        }
    }
}
Closure<Void> generateTerraformStage(env) {
    return {
        node('local_terraform') {
            stage("stage: Terraform ${TERRAFORM_ACTION} ${env}") {
                echo "${env}"
                sleep 30
            }
        }
    }
}

void terraformStageLogic(env) {
    echo "${env}"
    sleep 30
}
When you don't use the workspace in the stage responsible for generating or executing the other stages (the dynamic parallel stages and static parallel stages above), you don't need to allocate any node to it; doing so would be a waste of resources.
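To illustrate that point, here is a minimal sketch (my addition, assuming ENVS is a comma-separated string parameter) that keeps the orchestrating stage off an executor by declaring agent none at the top level, while the closures returned by generateTerraformStage above allocate their own node:
pipeline {
    agent none
    stages {
        stage('dynamic parallel stages') {
            steps {
                script {
                    def stages = [:]
                    for (def value in params.ENVS.split(',')) {
                        // Each generated closure calls node('local_terraform') itself,
                        // so no executor is held while the branches are being built.
                        stages[value] = generateTerraformStage(value)
                    }
                    parallel stages
                }
            }
        }
    }
}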
How can I save a command in a variable and execute it anywhere in the stage? I tried different ways, but without success. Here is my example:
pipeline {
    agent any
    environment {
        myscript = sh '''
            echo "hello"
            echo "hello"
            echo "hello"
        '''
    }
    stages {
        stage("RUN") {
            steps {
                sh "${myscript}"
            }
        }
    }
}
You can do it like this. Not with a Groovy variable, but you can make it more dynamic with a Groovy function/method:
def reusableScript(message) {
    sh """
        echo Hello World
        echo Hi ${message}
    """
}

pipeline {
    agent any
    stages {
        stage('01') {
            steps {
                script {
                    reusableScript("From ${env.STAGE_NAME}")
                }
            }
        }
        stage('02') {
            steps {
                script {
                    reusableScript("From ${env.STAGE_NAME}")
                }
            }
        }
    }
}
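If the goal is specifically to capture a command's output into a variable (which the environment block in the question seems to be after), another option is sh with returnStdout. A small sketch, with the stage name chosen just for illustration:
pipeline {
    agent any
    stages {
        stage('capture') {
            steps {
                script {
                    // returnStdout: true makes sh return the command's output instead of printing it.
                    def greeting = sh(script: 'echo "hello"', returnStdout: true).trim()
                    echo "Captured: ${greeting}"
                }
            }
        }
    }
}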
I'm trying to write declarative pipeline code that accepts a map and creates a pipeline. I can achieve sequential stages or parallel stages, but I am facing problems making a pipeline that contains sequential stages inside parallel stages.
The input data would be a Map. Each list in the map should run in parallel, and the items inside the list corresponding to each key should run sequentially.
Example data: [1:[11,12], 2:[21,22], 3:[31,32]]
The output should look like the attached image. Could someone give me some idea?
Below is the code I have tried.
def stageData = [1:[11,12], 2:[21,22], 3:[31,32]]

def getDeployStages1(stageData) {
    Map deployStages = [:]
    stageData.each { key, stgValue ->
        List stgs = []
        stgValue.each { value ->
            deployStages.put("${value}", {
                echo "${value}"
            })
        }
    }
    return deployStages
}

def getDeployStages2(stageData) {
    Map deployStages = [:]
    stageData.each { key, stgValue ->
        List stgs = []
        stgValue.each { value ->
            stgs.add(stage("${value}") {
                echo "${value}"
            })
        }
        deployStages.put("${key}", stgs)
    }
    return deployStages
}

pipeline {
    agent any
    stages {
        stage ("deploy1") {
            steps {
                script {
                    parallel getDeployStages1(stageData)
                }
            }
        }
        stage ("deploy2") {
            steps {
                script {
                    parallel getDeployStages2(stageData)
                }
            }
        }
    }
}
According to this documentation, you can nest the stages in this way:
pipeline {
    agent none
    stages {
        stage("build and deploy on Windows and Linux") {
            parallel {
                stage("windows") {
                    agent {
                        label "windows"
                    }
                    stages {
                        stage("build") {
                            steps {
                                bat "run-build.bat"
                            }
                        }
                        stage("deploy") {
                            when {
                                branch "master"
                            }
                            steps {
                                bat "run-deploy.bat"
                            }
                        }
                    }
                }
                stage("linux") {
                    agent {
                        label "linux"
                    }
                    stages {
                        stage("build") {
                            steps {
                                sh "./run-build.sh"
                            }
                        }
                        stage("deploy") {
                            when {
                                branch "master"
                            }
                            steps {
                                sh "./run-deploy.sh"
                            }
                        }
                    }
                }
            }
        }
    }
}
This should result in the following flow.
To apply this to your case, you can simplify your functions to return just the elements that need to run sequentially (just the values).
pipeline {
    agent any
    stages {
        stage ("parallel") {
            parallel {
                stage ("deploy1") {
                    stages {
                        def list = getDeployStages1(stageData)
                        for (int i = 0; i < list.size(); i++) {
                            stage(i) {
                                echo("${list[i]}")
                            }
                        }
                    }
                }
                stage ("deploy2") {
                    stages {
                        // similar
                    }
                }
            }
        }
    }
}
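If the map-driven input makes the declarative nesting awkward, an alternative (my own sketch, not part of the original answer) is to build the parallel branches in a script block, with each branch running its values as sequential stage steps:
def stageData = [1: [11, 12], 2: [21, 22], 3: [31, 32]]

def buildParallelBranches(Map data) {
    def branches = [:]
    data.each { key, values ->
        // One parallel branch per key; the values inside it run one after another.
        branches["deploy-${key}"] = {
            values.each { value ->
                stage("${value}") {
                    echo "${value}"
                }
            }
        }
    }
    return branches
}

pipeline {
    agent any
    stages {
        stage('deploy') {
            steps {
                script {
                    parallel buildParallelBranches(stageData)
                }
            }
        }
    }
}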
I need to fail one Jenkins pipeline stage when a file contains 'error'. I do not know how to return an error from bash to Jenkins.
stage('check if file contains error and exit if true') {
    steps {
        sh "grep 'error' filetocheck.txt"
    }
}
Reference: Is it possible to capture the stdout from the sh DSL command in the pipeline
This worked for me:
def runShell(String command) {
    def responseCode = sh returnStatus: true, script: "${command} &> tmp.txt"
    def output = readFile(file: "tmp.txt")
    return (output != "")
}

pipeline {
    agent any
    stages {
        stage('check shellcheck') {
            steps {
                script {
                    if (runShell('grep \'error\' file_to_parse.txt')) {
                        sh "exit 1"
                    }
                }
            }
        }
    }
}
You can try using String.count(charSequence), where the string can be the contents of a file (read with readFile) or any other string.
def contents = readFile('path/to/file.txt')
if (contents.count('error') > 0)
    return stageResultMap.didB2Succeed = false
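Another option (my own sketch, not from the answers above) is to rely on grep's exit status directly and fail the stage with the error step:
stage('check file for errors') {
    steps {
        script {
            // grep exits 0 when it finds a match and 1 when it does not.
            def status = sh(script: "grep -q 'error' filetocheck.txt", returnStatus: true)
            if (status == 0) {
                error "filetocheck.txt contains 'error'"
            }
        }
    }
}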
I am trying to write stages inside another stage based on an if condition. I am not able to come up with a solution. Can anyone guide me on this?
stages {
    stage('Example') {
        steps {
            script {
                if (!(fileExists("c:/test.txt"))) {
                    echo "Inside if"
                    stage('1') {
                        echo "stage1"
                    }
                    stage('2') {
                        echo "stage2"
                    }
                }
                else {
                    stage('else stage') {
                        echo "else stage1"
                    }
                }
            }
        }
    }
}
This worked for me.
when {
    expression {
        return !(fileExists("c:/test.txt"))
    }
}
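For completeness, a sketch (my own, reusing the stage names from the question) of how that when block can sit on separate declarative stages instead of an if/else inside script:
pipeline {
    agent any
    stages {
        stage('1') {
            when {
                // Runs only when the file does not exist.
                expression { return !fileExists("c:/test.txt") }
            }
            steps {
                echo "stage1"
            }
        }
        stage('else stage') {
            when {
                // Runs only when the file exists.
                expression { return fileExists("c:/test.txt") }
            }
            steps {
                echo "else stage1"
            }
        }
    }
}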