I want to send a few parameters from a Jenkins job to one of my shell scripts on a Linux server. Below is my Jenkins pipeline job:
def MY_VAR
def BUILD_NUMBER
pipeline {
    agent any
    stages {
        stage('Stage One') {
            steps {
                script {
                    BUILD_NUMBER = "${currentBuild.number}"
                    MY_VAR = 'abc'
                }
            }
        }
        stage('Stage Two') {
            steps {
                sh '''
                    cd /scripts/
                    ./my_scripts.sh $BUILD_NUMBER $MY_VAR'''
            }
        }
    }
}
Here I am able to send the value of BUILD_NUMBER but not of MY_VAR. It seems to me this is happening because MY_VAR is set inside the pipeline. Can anybody please help with a solution?
If you want to interpolate $MY_VAR when executing the sh step, you need to replace the single quotes with double quotes.
def MY_VAR
def BUILD_NUMBER
pipeline {
    agent any
    stages {
        stage('Stage One') {
            steps {
                script {
                    BUILD_NUMBER = "${currentBuild.number}"
                    MY_VAR = 'abc'
                }
            }
        }
        stage('Stage Two') {
            steps {
                sh """
                    cd /scripts/
                    ./my_scripts.sh $BUILD_NUMBER $MY_VAR"""
            }
        }
    }
}
The $BUILD_NUMBER worked because the pipeline exposes env.BUILD_NUMBER, and this variable can be accessed inside the shell step as bash's $BUILD_NUMBER environment variable. Alternatively, you could set MY_VAR as an environment variable and keep the single quotes in the sh step. Something like this should do the trick:
pipeline {
    agent any
    stages {
        stage('Stage One') {
            steps {
                script {
                    // you can remove the BUILD_NUMBER assignment - env.BUILD_NUMBER already exists
                    // BUILD_NUMBER = "${currentBuild.number}"
                    env.MY_VAR = 'abc'
                }
            }
        }
        stage('Stage Two') {
            steps {
                sh '''
                    cd /scripts/
                    ./my_scripts.sh $BUILD_NUMBER $MY_VAR'''
            }
        }
    }
}
You can learn more about Jenkins Pipeline environment variables from one of my blog posts.
pipeline {
    agent any
    stages {
        stage('Deploy') {
            steps {
                retry(3) {
                    sh './flakey-deploy.sh migrateOut --argFile sourcearg.txt --encassAsPolicyDependency --dest OTK.xml --folderName /OTK'
                }
            }
        }
    }
}
I would like to read this part of the command, "migrateOut --argFile sourcearg.txt --encassAsPolicyDependency --dest OTK.xml --folderName /OTK", from a file placed in the Jenkins workspace.
TIA
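One hedged sketch, assuming the arguments are kept on a single line in a workspace file (the file name deploy-args.txt here is hypothetical): read the file with readFile and interpolate the result into the sh step.

pipeline {
    agent any
    stages {
        stage('Deploy') {
            steps {
                retry(3) {
                    script {
                        // deploy-args.txt (hypothetical name) holds the argument string:
                        // migrateOut --argFile sourcearg.txt --encassAsPolicyDependency --dest OTK.xml --folderName /OTK
                        def deployArgs = readFile('deploy-args.txt').trim()
                        sh "./flakey-deploy.sh ${deployArgs}"
                    }
                }
            }
        }
    }
}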
How do I convert the declarative pipeline below to a scripted pipeline?
I have this syntax for a declarative pipeline, and I would like to use the Dockerfile in my Jenkinsfile, which is a scripted (node()) pipeline.
pipeline {
    agent { dockerfile true }
    stages {
        stage('Test') {
            steps {
                sh 'node --version'
                sh 'svn --version'
            }
        }
    }
}
You can try it like this:
node {
    checkout scm
    def customImage = docker.build("my-image:${env.BUILD_ID}")
    customImage.inside {
        sh 'node --version'
        sh 'svn --version'
    }
}
The build() method builds the Dockerfile in the current directory by default. This can be overridden by providing a directory path containing a Dockerfile as the second argument of the build() method, for example:
node {
    checkout scm
    def customImage = docker.build("my-image:${env.BUILD_ID}", "./dockerfiles/test")
    customImage.inside {
        sh 'node --version'
        sh 'svn --version'
    }
}
I want to execute nested declarative scripts that pre-exist. Say I have this declarative script in my workspace, and it's called test.DS:
pipeline {
    agent any
    stages {
        stage('parallel stages') {
            parallel {
                stage('stage-1') {
                    steps {
                        sh "echo this is stage-1"
                    }
                }
                stage('stage-2') {
                    steps {
                        sh "echo this is stage-2"
                    }
                }
            }
        }
    }
}
What would a declarative script look like that will run this script test.DS?
Below is one possible solution, using a scripted pipeline:
node {
    load './test.DS'
}
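If the outer script itself has to be declarative, a minimal sketch is to call load from a script block (this assumes your Jenkins version supports loading a file containing a declarative pipeline block):

pipeline {
    agent any
    stages {
        stage('Run nested script') {
            steps {
                script {
                    // test.DS must already be present in the workspace,
                    // e.g. after a checkout scm step
                    load './test.DS'
                }
            }
        }
    }
}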
I'm setting up the Jenkins pipeline mentioned below. My build gets aborted if the first stage fails, but I want all the stages and steps mentioned in stages to execute first.
pipeline {
    agent none
    stages {
        stage("build and test the project") {
            agent {
                docker "coolhub/vault:jenkins"
            }
            stages {
                stage("build") {
                    steps {
                        sh 'echo "build.sh"'
                    }
                }
                stage("test") {
                    steps {
                        sh 'echo "test.sh"'
                    }
                }
            }
        }
    }
}
I'd like to execute all the stages and steps mentioned in stages first. After all stages have executed, the Jenkins job should finally be aborted and show which stages and steps failed.
Yeah, well, there's currently no way to do that apart from try/catch blocks in a script.
More here: Ignore failure in pipeline build step.
stage('someStage') {
    steps {
        script {
            try {
                build job: 'system-check-flow'
            } catch (err) {
                // echo expects a String, so convert the caught exception
                echo err.toString()
            }
        }
        echo currentBuild.result
    }
}
In hakamairi's answer, the stage is not marked as failed. It is now possible to fail a stage, continue the execution of the pipeline and choose the result of the build:
pipeline {
    agent any
    stages {
        stage('1') {
            steps {
                sh 'exit 0'
            }
        }
        stage('2') {
            steps {
                catchError(buildResult: 'SUCCESS', stageResult: 'FAILURE') {
                    sh "exit 1"
                }
            }
        }
        stage('3') {
            steps {
                sh 'exit 0'
            }
        }
    }
}
In the example above, all stages will execute and the pipeline will be successful, but stage 2 will show as failed.
As you might have guessed, you can freely choose the buildResult and stageResult, in case you want it to be unstable or anything else. You can even fail the build and continue the execution of the pipeline.
Just make sure your Jenkins is up to date, since this is a fairly new feature.
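For instance, a variant of stage 2 above that marks the whole build as FAILURE while still letting stage 3 run:

stage('2') {
    steps {
        // the build result becomes FAILURE, but later stages still execute
        catchError(buildResult: 'FAILURE', stageResult: 'FAILURE') {
            sh 'exit 1'
        }
    }
}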
I am using the Groovy below to call a bat command. No matter how I reference LOCAL_WORKSPACE within the bat command, it does not evaluate it.
What am I missing?
Error
nuget restore $env.LOCAL_WORKSPACE
"Input file does not exist: $env.LOCAL_WORKSPACE"
Script
pipeline {
    agent any
    stages {
        stage('Clone repo') {
            steps {
                deleteDir()
                git branch: 'myBranch', changelog: false, credentialsId: 'myCreds', poll: false, url: 'http://myRepoURL'
            }
        }
        stage('Set any variables') {
            steps {
                script {
                    LOCAL_BUILD_PATH = "$env.WORKSPACE"
                }
            }
        }
        stage('Build It, yes we can') {
            parallel {
                stage('Build one') {
                    steps {
                        echo LOCAL_BUILD_PATH
                        bat 'nuget restore %LOCAL_WORKSPACE%'
                    }
                }
            }
        }
    }
}
You cannot set variables to share data between stages; basically, each script has its own namespace.
What you can do is use an environment directive, as described in the pipeline syntax docs. Those values are globally available, but they are constants, so you cannot change them in any stage.
You can compute the values, though. For example, I use an sh step to get the current number of commits on master like this:
pipeline {
    agent any
    environment {
        COMMITS_ON_MASTER = sh(script: "git rev-list HEAD --count", returnStdout: true).trim()
    }
    stages {
        stage("Print commits") {
            steps {
                echo "There are ${env.COMMITS_ON_MASTER} commits on master"
            }
        }
    }
}
You can use environment variables to store values and access them across stages. For example, if you define LOCAL_ENVR as a Jenkins parameter, you can manipulate the variable from your stages:
stage('Stage1') {
    steps {
        script {
            env.LOCAL_ENVR = '2'
        }
    }
}
stage('Stage2') {
    steps {
        echo "${env.LOCAL_ENVR}"
    }
}
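Applied to the original question, a sketch along these lines should work, since Jenkins exposes entries set on env as environment variables that bat can read with %...% syntax (LOCAL_BUILD_PATH is the name from the question's script):

stage('Set any variables') {
    steps {
        script {
            // store the workspace path as an environment variable
            env.LOCAL_BUILD_PATH = env.WORKSPACE
        }
    }
}
stage('Build one') {
    steps {
        // environment variables are visible to bat as %VAR%
        bat 'nuget restore %LOCAL_BUILD_PATH%'
    }
}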