I use the declarative syntax for developing the (multibranch) pipeline script and I'm looking for a way to skip the whole pipeline based on some condition, without having to modify the when on every single stage.
Current use case: I'm setting up a cron trigger to run builds at night, but I only want, say, the release/v1 and develop branches to go through the pipeline at night, not the dozens of other branches.
triggers {
cron('H 21 * * 1-5')
}
// SKIP PIPELINE if triggered by timer AND branch is neither 'release/v1' nor 'develop'
stages {
stage('build') {
when { ... }
}
stage('UT') {
when { ... }
}
etc...
}
Any hints would be appreciated.
You can nest stages if you have version 1.3 or later of the Declarative Pipeline plugin.
Using that, you can nest your whole job in a parent stage, and use a when directive on the parent stage. All child stages will be skipped if the parent stage is skipped. Here is an example:
pipeline {
agent any
stages {
stage('Parent') {
when {
//...
}
stages {
stage('build') {
steps {
//..
}
}
stage('UT') {
steps {
//...
}
}
}
}
}
}
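For the use case in the question (skip everything when the nightly timer fired and the branch is neither release/v1 nor develop), the parent stage's when could look roughly like the sketch below. The triggeredBy condition needs a reasonably recent Declarative Pipeline version, and the echo steps are placeholders:
pipeline {
    agent any
    triggers {
        cron('H 21 * * 1-5')
    }
    stages {
        stage('Parent') {
            when {
                // proceed if the build was NOT started by the timer,
                // or if the branch is one of the allowed ones
                anyOf {
                    not { triggeredBy 'TimerTrigger' }
                    branch 'release/v1'
                    branch 'develop'
                }
            }
            stages {
                stage('build') {
                    steps {
                        echo 'build'
                    }
                }
                stage('UT') {
                    steps {
                        echo 'unit tests'
                    }
                }
            }
        }
    }
}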
You can use the SCM Skip plugin. Then use it in the pipeline like so:
scmSkip(deleteBuild: true, skipPattern:'.*#skip-ci.*')
This also lets you delete the build and match commit messages with a regular expression easily. The problem is that it aborts the subsequent builds and then ignores them for the relevant PR if you're connected with a GitHub organization.
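A minimal sketch of where the step would typically go, assuming the plugin is installed (the stage names are just examples):
pipeline {
    agent any
    stages {
        stage('Checkout') {
            steps {
                checkout scm
                // abort (and delete) the build if the last commit message matches the pattern
                scmSkip(deleteBuild: true, skipPattern: '.*#skip-ci.*')
            }
        }
        stage('Build') {
            steps {
                echo 'Only runs if the commit was not skipped'
            }
        }
    }
}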
I would like to run the same Jenkins job in parallel, but with different Maven opts values. How can I achieve that? I've tried various Jenkins plugins, but with no luck.
I'm trying to configure pipelines using Groovy scripts, but I'm enough of a beginner that I can't figure out how to achieve what I want. The goal is to run the same Jenkins job in parallel, where the only difference is the environment my tests should run in.
Maybe there is already a solution you could point me to.
You should be able to use parallel blocks for this. The following is a sample:
pipeline {
agent none
stages {
stage('Run Tests') {
parallel {
stage('Test On Dev') {
agent {
label "IfYouwantToChangeAgent"
}
steps {
sh "mvn clean test -Dsomething=dev"
}
post {
always {
junit "**/TEST-*.xml"
}
}
}
stage('Test On QA') {
agent {
label "QA"
}
steps {
sh "mvn clean test -Dsomething=qa"
}
post {
always {
junit "**/TEST-*.xml"
}
}
}
}
}
}
}
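If what actually differs between the runs is MAVEN_OPTS rather than a -D property, each parallel stage can also set its own environment block. A sketch, with assumed labels and values:
stage('Run Tests') {
    parallel {
        stage('Test On Dev') {
            agent { label 'dev' }
            environment {
                MAVEN_OPTS = '-Xmx1g' // assumed value
            }
            steps {
                sh 'mvn clean test'
            }
        }
        stage('Test On QA') {
            agent { label 'qa' }
            environment {
                MAVEN_OPTS = '-Xmx2g' // assumed value
            }
            steps {
                sh 'mvn clean test'
            }
        }
    }
}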
I'm trying to create a Jenkinsfile that runs a release build step only if a boolean parameter isRelease is true, and otherwise skips this step.
My Jenkinsfile looks like this (extract):
pipeline {
agent { label 'build' }
tools {
jdk 'OpenJDK 11'
}
parameters {
booleanParam( name: 'isRelease', description: 'Run a release build?', defaultValue: false)
}
stages {
stage('Build') {
steps {
sh 'mvn clean compile'
}
}
stage('Test') {
steps {
sh 'mvn test'
}
}
stage('Release') {
when {
beforeAgent true
expression { return isRelease }
}
steps {
sh 'echo "######### Seems to be a release!"'
}
}
}
}
However, I don't seem to understand how to use the parameters variable properly. What happens is that the release step is always executed.
I changed expression { return isRelease } to expression { return "${params.isRelease}" } which did not make a difference. Changing it to expression { return ${params.isRelease} } causes the step to fail with java.lang.NoSuchMethodError: No such DSL method '$' found among steps.
What's the right way to use a parameter to skip a step?
You were closest on your first attempt. The latter two failed because:
- Converting the Boolean to a String makes the expression always truthy, because a non-empty String is truthy.
- ${params.isRelease} outside of a string is not valid Pipeline (Groovy) syntax.
The issue with the first attempt is that you need to access isRelease from within the params object:
when {
beforeAgent true
expression { params.isRelease }
}
The when directive documentation actually has a very similar example in the expression subsection for reference.
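Plugged back into the Jenkinsfile from the question, the Release stage would then read:
stage('Release') {
    when {
        beforeAgent true
        expression { params.isRelease }
    }
    steps {
        sh 'echo "######### Seems to be a release!"'
    }
}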
I have three Gradle tasks. If I execute them one by one on their own, they work. But when I execute them from another task, it doesn't work. Here is what my tasks look like:
import com.github.gradle.node.npm.task.NpmTask
plugins {
id("com.github.node-gradle.node") version "3.4.0"
}
// Executing this task on its own is working
tasks.register<NpmTask>("buildFrontEnd") {
workingDir.set(file("${projectDir}/frontend"))
args.set(listOf("run", "build"))
}
// Executing this task on its own is working
tasks.register<Delete>("cleanFrontEnd") {
delete(
fileTree("${projectDir}/backend/main/resources/static/js"),
)
}
// Executing this task on its own is working
tasks.register<Copy>("copyFrontEnd") {
into("$projectDir")
copy {
from("${projectDir}/frontend/dist/css")
into("${projectDir}/backend/main/resources/static/css")
}
copy {
from("${projectDir}/frontend/dist/js")
into("${projectDir}/backend/main/resources/static/js")
}
}
// This task is not executing "copyFrontEnd"
tasks.register("frontEndBuild") {
dependsOn("buildFrontEnd", "cleanFrontEnd", "copyFrontEnd")
tasks.findByName("copyFrontEnd")?.mustRunAfter("buildFrontEnd", "cleanFrontEnd")
// Tried this too but it is not working
// tasks.findByName("cleanFrontEnd")?.mustRunAfter("buildFrontEnd")
// tasks.findByName("copyFrontEnd")?.mustRunAfter("cleanFrontEnd")
}
This is the output
> Task :cleanFrontEnd
> Task :copyFrontEnd NO-SOURCE
> Task :frontEndBuild
BUILD SUCCESSFUL in 20s
2 actionable tasks: 2 executed
For :copyFrontEnd it says NO-SOURCE, but if that is the case, why does it work when executed on its own? Is there any way to fix this?
When you register a task, the bit inside the { } block configures the task. This block will always run whenever the task is required.
tasks.register<Copy>("copyFrontEnd") {
into("$projectDir")
// these copy actions will ALWAYS run whenever Gradle configures this task
copy {
from("${projectDir}/frontend/dist/css")
into("${projectDir}/backend/main/resources/static/css")
}
copy {
from("${projectDir}/frontend/dist/js")
into("${projectDir}/backend/main/resources/static/js")
}
}
Because you've used register when creating the task, Gradle won't execute the task configuration block unless the task is required. But when it is, those copy actions will run immediately.
To add task actions, which will actually do the work of the task, add them using doLast { } or doFirst { }.
tasks.register<Copy>("copyFrontEnd") {
into("$projectDir")
doLast {
copy {
from("${projectDir}/frontend/dist/css")
into("${projectDir}/backend/main/resources/static/css")
}
copy {
from("${projectDir}/frontend/dist/js")
into("${projectDir}/backend/main/resources/static/js")
}
}
}
Now those copy actions will only run when the task runs.
This still won't work however...
NO-SOURCE
Task did not need to execute its actions.
Task has inputs and outputs, but no sources
https://docs.gradle.org/current/userguide/more_about_tasks.html#sec:task_outcomes
When you register a Copy task, you need to specify what the source is.
tasks.register<Copy>("copyFrontEnd") {
into("$projectDir")
// from(...) // no from, no source - Gradle thinks "this task doesn't need to run"
}
In your case, you can specify the source with from(...) and get rid of the two separate copy actions, for example by copying the whole dist/ directory into static/:
tasks.register<Copy>("copyFrontEnd") {
    from("${projectDir}/frontend/dist")
    into("${projectDir}/backend/main/resources/static")
}
You might want to fiddle around with includes to only copy the css/ and js/ directories - https://docs.gradle.org/current/userguide/working_with_files.html#sec:copying_directories_example
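For example, something along these lines (a sketch, using the paths from the question):
tasks.register<Copy>("copyFrontEnd") {
    // copy only the css/ and js/ subdirectories of dist/
    from("${projectDir}/frontend/dist") {
        include("css/**", "js/**")
    }
    into("${projectDir}/backend/main/resources/static")
}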
Typesafe references
This last bit is unrelated, but I thought I'd include it as it can help make your buildscript clearer.
While you can reference tasks by their names, it's nicer to reference them via a variable. When you register a task, it returns a handle to that task.
You can use that handle to specify task dependencies.
val buildFrontEndTask: TaskProvider<NpmTask> = tasks.register<NpmTask>("buildFrontEnd") {
// ...
}
val cleanFrontEndTask: TaskProvider<Delete> = tasks.register<Delete>("cleanFrontEnd") {
// ...
}
val copyFrontEndTask: TaskProvider<Copy> = tasks.register<Copy>("copyFrontEnd") {
// ...
}
tasks.register("frontEndBuild") {
// here you can set the task dependencies based on the task providers
dependsOn(buildFrontEndTask, cleanFrontEndTask, copyFrontEndTask)
}
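One caveat: dependsOn alone does not impose an order between the three tasks. If the clean and copy must run after the frontend build, as the mustRunAfter calls in the question suggest, the same handles can express that ordering, for example:
// assumed ordering: build first, then clean, then copy
cleanFrontEndTask.configure { mustRunAfter(buildFrontEndTask) }
copyFrontEndTask.configure { mustRunAfter(buildFrontEndTask, cleanFrontEndTask) }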
Hi, I want to know whether multiple stages can each have a triggers directive. If not, then I need some other way to schedule each stage.
pipeline{
stages{
stage{
triggers{cron (#some_exp)}
steps{
# Some steps
}
}
stage{
triggers{cron (#some_exp)}
steps{
# Some steps
}
}
}
}
If you want to trigger each stage separately, each stage is probably a separate unit of business logic, and you should therefore seriously consider creating a separate job for each one, rather than packing them all into a single job with multiple triggers and tangled logic.
However, if you still want a single job, you can do it with the Parameterized Scheduler plugin, which lets you define cron triggers that start the job with a specific parameter value. You can then use this parameter as a condition to determine which stage to execute.
Here is an example of implementing it:
pipeline {
agent any
parameters {
string(name: 'STAGE', defaultValue: 'setup', description: 'Which stage to run')
}
triggers {
parameterizedCron('''
*/2 * * * * %STAGE=setup
*/3 * * * * %STAGE=build
''')
}
stages {
stage('Setup') {
when {
expression { params.STAGE == 'setup' }
}
steps {
echo "In Setup stage - STAGE parameter is ${STAGE}"
...
}
}
stage('Build') {
when {
expression { params.STAGE == 'build' }
}
steps {
echo "In Build stage - STAGE parameter is ${STAGE}"
...
}
}
...
}
}
I'm trying to run the following process in my Jenkinsfile:
1. Build the app
2. Trigger deploy of two components to the test environment, in parallel:
   - foo deploy
   - bar deploy
3. Run tests on a deployed app
Steps 2 and 3 require locking a resource, because I have only one test environment available.
There's no problem achieving that without running step 2 in parallel; however, when I configure the Jenkinsfile to run them together, I get the following error from Jenkins:
WorkflowScript: 19: Parallel stages or branches can only be included in a top-level stage. # line 19, column 7.
stage('Deploy Foo') {
^
Here's the full Jenkinsfile:
pipeline {
agent any
stages {
stage('Build') {
steps {
powershell(script: '.\\ci\\build.ps1 -Script .\\ci\\build\\build.cake')
}
}
stage('Deploy and run tests') {
when {
branch('develop')
}
options {
lock('test-env')
}
stages {
stage('Deploy') {
parallel {
stage('Deploy Foo') {
steps {
build(job: 'Deploy_Foo')
}
}
stage('Deploy Bar') {
steps {
build(job: 'Deploy_Bar')
}
}
}
}
stage('Run tests') {
steps {
powershell(script: '.\\ci\\build.ps1 -Script .\\ci\\test\\build.cake')
}
}
}
}
}
}
I have also tried locking the test-env resource separately for the Deploy and Test stages; however, this increases the risk of a race condition, where some other running job waiting for that resource could "jump in" between the Deploy and Test stages of the current job.
Is there any way to achieve such mix of sequential and parallel stages as described above in Jenkinsfile?