I want to execute nested declarative scripts that already exist. Say I have this declarative script in my workspace, called test.DS:
pipeline {
    agent any
    stages {
        stage('parallel-stages') {
            parallel {
                stage('stage-1') {
                    steps {
                        sh "echo this is stage-1"
                    }
                }
                stage('stage-2') {
                    steps {
                        sh "echo this is stage-2"
                    }
                }
            }
        }
    }
}
What would a declarative script that runs this test.DS script look like?
Below is one possible solution:
node {
    load './test.DS'
}
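A minimal usage sketch, assuming test.DS comes from the same repository as the Jenkinsfile and therefore needs to be checked out into the workspace before it can be loaded:
node {
    // Populate the workspace so ./test.DS exists, then load and run it
    checkout scm
    load './test.DS'
}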
I would like to run a Jenkins job, the same job but with different Maven opts values, in parallel. How can I achieve that? I was trying to use different Jenkins plugins, but with no luck.
I'm trying to configure pipelines using Groovy scripts, but I'm such an amateur that I can't figure out how to achieve what I want. The goal is to run the same Jenkins job in parallel, where the only thing that differs is the environment my tests should run in.
Maybe there is already a solution you could point me to.
You should be able to use parallel blocks for this. The following is a sample:
pipeline {
    agent none
    stages {
        stage('Run Tests') {
            parallel {
                stage('Test On Dev') {
                    agent {
                        label "IfYouwantToChangeAgent"
                    }
                    steps {
                        sh "mvn clean test -Dsomething=dev"
                    }
                    post {
                        always {
                            junit "**/TEST-*.xml"
                        }
                    }
                }
                stage('Test On QA') {
                    agent {
                        label "QA"
                    }
                    steps {
                        sh "mvn clean test -Dsomething=qa"
                    }
                    post {
                        always {
                            junit "**/TEST-*.xml"
                        }
                    }
                }
            }
        }
    }
}
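Since the question was specifically about different Maven opts per run, a hedged variation (the MAVEN_OPTS value below is purely illustrative) is to give each parallel stage its own environment block:
stage('Test On Dev') {
    agent {
        label "IfYouwantToChangeAgent"
    }
    environment {
        // Illustrative value; mvn picks MAVEN_OPTS up from the environment
        MAVEN_OPTS = '-Xmx1024m'
    }
    steps {
        sh "mvn clean test -Dsomething=dev"
    }
}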
pipeline {
    agent any
    stages {
        stage('Deploy') {
            steps {
                retry(3) {
                    sh './flakey-deploy.sh migrateOut --argFile sourcearg.txt --encassAsPolicyDependency --dest OTK.xml --folderName /OTK'
                }
            }
        }
    }
}
I would like to supply this part of the command, "migrateOut --argFile sourcearg.txt --encassAsPolicyDependency --dest OTK.xml --folderName /OTK", from a file placed in the Jenkins workspace.
TIA
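A minimal sketch of one way to do this, assuming the argument string lives in a hypothetical workspace file named deploy-args.txt and is read with the built-in readFile step:
pipeline {
    agent any
    stages {
        stage('Deploy') {
            steps {
                script {
                    // deploy-args.txt is a hypothetical file containing:
                    // migrateOut --argFile sourcearg.txt --encassAsPolicyDependency --dest OTK.xml --folderName /OTK
                    def deployArgs = readFile('deploy-args.txt').trim()
                    retry(3) {
                        sh "./flakey-deploy.sh ${deployArgs}"
                    }
                }
            }
        }
    }
}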
I'm setting up the Jenkins pipeline shown below. My build gets aborted if the first stage fails, but I want all of the stages and steps listed under stages to execute first.
pipeline {
    agent none
    stages {
        stage("build and test the project") {
            agent {
                docker "coolhub/vault:jenkins"
            }
            stages {
                stage("build") {
                    steps {
                        sh 'echo "build.sh"'
                    }
                }
                stage("test") {
                    steps {
                        sh 'echo "test.sh"'
                    }
                }
            }
        }
    }
}
I'd like all the stages and steps listed under stages to execute first. Once every stage has run, the Jenkins job should then be aborted, showing which stages and steps failed.
Yeah, well, there's currently no way to do that apart from try/catch blocks in a script block.
More here: Ignore failure in pipeline build step.
stage('someStage') {
    steps {
        script {
            try {
                build job: 'system-check-flow'
            } catch (err) {
                // echo expects a String, so convert the caught exception
                echo err.toString()
            }
        }
        echo currentBuild.result
    }
}
In hakamairi's answer, the stage is not marked as failed. It is now possible to fail a stage, continue the execution of the pipeline and choose the result of the build:
pipeline {
    agent any
    stages {
        stage('1') {
            steps {
                sh 'exit 0'
            }
        }
        stage('2') {
            steps {
                catchError(buildResult: 'SUCCESS', stageResult: 'FAILURE') {
                    sh "exit 1"
                }
            }
        }
        stage('3') {
            steps {
                sh 'exit 0'
            }
        }
    }
}
In the example above, all stages will execute, the pipeline will be successful, but stage 2 will show as failed:
As you might have guessed, you can freely choose the buildResult and stageResult, in case you want it to be unstable or anything else. You can even fail the build and continue the execution of the pipeline.
Just make sure your Jenkins is up to date, since this is a fairly new feature.
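For example, a minimal sketch of that last case, marking both the stage and the overall build as failed while still letting later stages run:
catchError(buildResult: 'FAILURE', stageResult: 'FAILURE') {
    sh 'exit 1'
}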
I'm trying to run the following process in my Jenkinsfile:
1. Build the app
2. Trigger deploy of two components to the test environment, in parallel:
   - foo deploy
   - bar deploy
3. Run tests on a deployed app
Steps 2 and 3 require locking a resource because I have only one test environment available.
There's no problem achieving that without running step 2 in parallel; however, when I configure the Jenkinsfile to run them together I get the following error from Jenkins:
WorkflowScript: 19: Parallel stages or branches can only be included in a top-level stage. # line 19, column 7.
stage('Deploy Foo') {
^
Here's the full Jenkinsfile:
pipeline {
    agent any
    stages {
        stage('Build') {
            steps {
                powershell(script: '.\\ci\\build.ps1 -Script .\\ci\\build\\build.cake')
            }
        }
        stage('Deploy and run tests') {
            when {
                branch('develop')
            }
            options {
                lock('test-env')
            }
            stages {
                stage('Deploy') {
                    parallel {
                        stage('Deploy Foo') {
                            steps {
                                build(job: 'Deploy_Foo')
                            }
                        }
                        stage('Deploy Bar') {
                            steps {
                                build(job: 'Deploy_Bar')
                            }
                        }
                    }
                }
                stage('Run tests') {
                    steps {
                        powershell(script: '.\\ci\\build.ps1 -Script .\\ci\\test\\build.cake')
                    }
                }
            }
        }
    }
}
I have also tried locking the test-env resource separately for the Deploy and Test stages; however, this increases the risk of a race condition where some other running job waiting for that resource could "jump in" between the Deploy and Test stages of the current job.
Is there any way to achieve such a mix of sequential and parallel stages in a Jenkinsfile, as described above?
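One hedged workaround sketch, in case restructuring or upgrading the Pipeline plugins does not help, is to replace the declarative parallel block with the scripted parallel step inside a single Deploy stage, so the lock still wraps the whole sequence (job names taken from the Jenkinsfile above):
stage('Deploy') {
    steps {
        script {
            // The scripted parallel step is not subject to the declarative
            // restriction on parallel blocks inside nested stages.
            parallel(
                'Deploy Foo': { build(job: 'Deploy_Foo') },
                'Deploy Bar': { build(job: 'Deploy_Bar') }
            )
        }
    }
}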
I am trying to run a bash script after a successful build in Jenkins.
stages {
    stage("test") {
        steps {
            ...
        }
        post {
            success {
                steps {
                    sh "./myscript"
                }
            }
        }
    }
}
I am getting an error saying that method "steps" does not exist. How can I run a script after a successful build?
You need to remove the "steps" block inside the "success" block and call the script directly inside "success".
According to the docs (which are a bit confusing on this point), "success" is itself a container for steps, so there is no need to add another nested "steps" block:
https://jenkins.io/doc/book/pipeline/syntax/#post
stages {
    stage("test") {
        steps {
            ...
        }
        post {
            success {
                sh "./myscript"
            }
        }
    }
}
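If the script should run only after the whole build succeeds, rather than after this one stage, a hedged alternative sketch is a pipeline-level post block (the echo step is a placeholder for the real test steps):
pipeline {
    agent any
    stages {
        stage("test") {
            steps {
                sh 'echo "running tests"'
            }
        }
    }
    post {
        success {
            // Runs once, only when the entire build has succeeded
            sh "./myscript"
        }
    }
}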