Jenkins function of shared library executed on wrong node - jenkins-pipeline

I have a simple shared library and want to call a function which creates a file.
My pipeline script looks like this:
@Library('jenkins-shared-library') _
pipeline {
    agent {
        node {
            label 'node1'
        }
    }
    stages {
        stage('test') {
            steps {
                script {
                    def host = sh(script: 'hostname', returnStdout: true).trim()
                    echo "Hostname is: ${host}"
                    def testlib = new TestLibrary(script:this)
                    testlib.createFile("/tmp/file1")
                }
            }
        }
    }
}
This pipeline job is triggered by another job that runs on the built-in master node.
The 'test' stage is correctly executed on 'node1'.
Problem: the file "/tmp/file1" is created on the Jenkins master instead of on "node1".
I also tried it without the shared library, loading a Groovy script directly in a step:
pipeline {
    agent {
        node {
            label 'node1'
        }
    }
    stages {
        stage('test') {
            steps {
                script {
                    def script = load "/path/to/script.groovy"
                    script.createFile("/tmp/file1")
                }
            }
        }
    }
}
This also creates the file on the master node instead of on "node1".
Is there no way to load external libs or classes and execute them on the node the stage is running on? I don't want to place all the code directly into the step.

OK, I found it myself: Groovy code by definition always runs on the master. There is no way to run loaded scripts or shared-library Groovy code on a node other than the master; only pipeline steps (such as `sh`) are executed on the agent allocated to the stage.
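A common workaround, then, is to have the library delegate the actual file operation to a step like `sh`, which does run on the stage's agent. Below is a minimal sketch of what such a class could look like; the class name `TestLibrary` follows the question, but the implementation shown here is an assumption, not the asker's actual code:

```groovy
// src/org/example/TestLibrary.groovy (hypothetical shared-library class)
package org.example

class TestLibrary implements Serializable {
    def script   // the pipeline script ('this'), passed in from the Jenkinsfile

    TestLibrary(Map args) {
        this.script = args.script
    }

    // Runs on the stage's agent, because 'sh' is a pipeline step.
    // Using java.io.File here instead would touch the master's filesystem.
    void createFile(String path) {
        script.sh "touch '${path}'"
    }
}
```

In other words, calling `new File(path).createNewFile()` inside the library would still create the file on the master; only steps (`sh`, `bat`, `writeFile`, etc.) are forwarded to the agent.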

Related

How can I use vSphere Cloud Plugin inside Jenkins pipeline code?

I have a setup at work where the vSphere hosts are manually restarted before the execution of a specific Jenkins job. As a noob in the office, I automated this process by adding an extra build step to restart the VMs with the help of the vSphere Cloud Plugin (https://wiki.jenkins-ci.org/display/JENKINS/vSphere+Cloud+Plugin).
I would now like to integrate this as pipeline code; please advise.
I have already checked that this plugin is Pipeline compatible.
I currently trigger the vSphere host restart in the pipeline by making it remotely trigger a job configured with the vSphere cloud plugin:
pipeline {
    agent any
    stages {
        stage('Restarting vSphere') {
            steps {
                script {
                    sh "curl -v 'http://someserver.com/job/Vivin/job/executor_configurator/buildWithParameters?Host=build-114&token=bonkers'"
                }
            }
        }
        stage('Setting Executors') {
            steps {
                script {
                    def jenkins = Jenkins.getInstance()
                    jenkins.getNodes().each {
                        if (it.displayName == 'brewery-133') {
                            echo 'brewery-133'
                            it.setNumExecutors(8)
                        }
                    }
                }
            }
        }
    }
}
I would like to integrate the vSphere cloud plugin directly in the pipeline code itself; please help me integrate it.
pipeline {
    agent any
    stages {
        stage('Restarting vSphere') {
            steps {
                // vSphere cloud plugin code that is requested
            }
        }
        stage('Setting Executors') {
            steps {
                script {
                    def jenkins = Jenkins.getInstance()
                    jenkins.getNodes().each {
                        if (it.displayName == 'brewery-133') {
                            echo 'brewery-133'
                            it.setNumExecutors(8)
                        }
                    }
                }
            }
        }
    }
}
Well, I found the solution myself with the help of the 'Pipeline Syntax' feature found in the menu of a Jenkins pipeline job.
The 'Pipeline Syntax' page contains the syntax of all the parameters made available via the API of the plugins installed on a Jenkins server; with it we can generate or develop the snippet we need.
http://<jenkins server url>/job/<pipeline job name>/pipeline-syntax/
My Jenkinsfile (pipeline) now looks like this:
pipeline {
    agent any
    stages {
        stage('Restarting vSphere') {
            steps {
                vSphere buildStep: [$class: 'PowerOff', evenIfSuspended: false, ignoreIfNotExists: false, shutdownGracefully: true, vm: 'brewery-133'], serverName: 'vspherecentral'
                vSphere buildStep: [$class: 'PowerOn', timeoutInSeconds: 180, vm: 'brewery-133'], serverName: 'vspherecentral'
            }
        }
        stage('Setting Executors') {
            steps {
                script {
                    def jenkins = Jenkins.getInstance()
                    jenkins.getNodes().each {
                        if (it.displayName == 'brewery-133') {
                            echo 'brewery-133'
                            it.setNumExecutors(1)
                        }
                    }
                }
            }
        }
    }
}
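If the restart is needed in several pipelines, the two `vSphere` steps can also be wrapped in a shared-library step. This is a sketch only; the step name `restartVm` and the default server name are assumptions, not something from the answer above:

```groovy
// vars/restartVm.groovy (hypothetical shared-library step)
def call(String vmName, String serverName = 'vspherecentral') {
    // Gracefully power-cycle the given VM via the vSphere Cloud Plugin
    vSphere buildStep: [$class: 'PowerOff', evenIfSuspended: false,
                        ignoreIfNotExists: false, shutdownGracefully: true,
                        vm: vmName], serverName: serverName
    vSphere buildStep: [$class: 'PowerOn', timeoutInSeconds: 180,
                        vm: vmName], serverName: serverName
}
```

Any pipeline that loads the library could then call `restartVm('brewery-133')` inside a `steps` block.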

Is there a way to move the entire post {} build section in Jenkinsfile to the global pipeline library?

I'm relatively new to Jenkins pipelines, but having already implemented a few, I've realised I need to start using a Jenkins shared library before I go mad.
I've already figured out how to define some repetitive steps in the library and call them with less clutter from the Jenkinsfile, but I'm not sure whether the same can be done for the entire post build section (though I've read about how to define the entire pipeline in the lib, and similar), as this is a pretty much static end of every single pipeline:
@Library('jenkins-shared-library') _
pipeline {
    agent none
    stages {
        stage('System Info') {
            agent any
            steps { printSysInfo() }
        }
        stage('Init') {
            agent { label 'WinZipSE' }
            steps { init('SCMroot') }
        }
        stage('Build') {
            agent any
            steps { doMagic() }
        }
    }
    // This entire 'post {}' section needs to go to a shared lib
    // and be called just with a simple method call, e.g.
    // doPostBuild()
    post {
        always {
            node('master') {
                googlechatnotification(
                    message: '[$BUILD_STATUS] Build $JOB_NAME $BUILD_NUMBER has finished',
                    url: 'id:credential_id_for_Ubuntu')
                step(
                    [$class: 'Mailer',
                     recipients: 'sysadmins@example.com me@example.com',
                     notifyEveryUnstableBuild: true,
                     sendToIndividuals: true]
                )
            }
        }
        success {
            node('master') {
                echo 'This will run only if successful'
            }
        }
        failure {
            node('master') {
                echo 'This will run only if failed'
            }
        }
        // and so on
    }
}
I just don't know how to achieve that syntactically. Sure, I can define the entire post build section in a lib var like doPostBuild.groovy:
def call() {
    post {...}
}
but how would I eventually call it from within my Jenkinsfile, outside of the defined post {} block (AKA the stages)?
I could call it within some stage('post build') { doPostBuild() }, but that won't behave the way a true post {} section is supposed to work; e.g. it won't get executed if there was a failure in one of the previous stages.
Any thoughts on that and mainly working examples?
I am not entirely sure whether this will work, as I don't use declarative pipelines, so I'm unsure how rigid the top-level structure is. But I would revert to a script block:
@Library('jenkins-shared-library') _
pipeline {
    agent none
    stages {
        stage('System Info') {
            agent any
            steps { printSysInfo() }
        }
        stage('Init') {
            agent { label 'WinZipSE' }
            steps { init('SCMroot') }
        }
        stage('Build') {
            agent any
            steps { doMagic() }
        }
    }
    // This entire 'post {}' section needs to go to a shared lib
    // and be called just with a simple method call, e.g.
    // doPostBuild()
    script {
        doPostBuild()
    }
}
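Another pattern worth considering: declarative pipeline does allow the entire pipeline, including its post section, to be defined inside a shared-library global variable, leaving only a one-line Jenkinsfile per job. This is a sketch under assumptions; the var name `standardPipeline` and the stage layout are illustrative, not taken from the question:

```groovy
// vars/standardPipeline.groovy (hypothetical global var in the shared library)
def call(Closure body) {
    pipeline {
        agent none
        stages {
            stage('Build') {
                agent any
                // Run the job-specific steps passed in from the Jenkinsfile
                steps { script { body() } }
            }
        }
        // The shared, static post section lives here once, for all jobs,
        // so 'failure'/'success' semantics are preserved.
        post {
            always  { echo 'send chat/mail notifications here' }
            success { echo 'This will run only if successful' }
            failure { echo 'This will run only if failed' }
        }
    }
}
```

A consuming Jenkinsfile would then reduce to `@Library('jenkins-shared-library') _` followed by `standardPipeline { doMagic() }`.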

How can I define multiple agents in declarative jenkinsfile?

In my Jenkinsfile I want a particular stage to run on both agents in parallel. For example:
stage('abc') {
    agent {
        label "dev6" && "dev7"
    }
    steps {
        xyz()
    }
}
I have two slaves labelled dev6 and dev7. I want xyz() to start on both agents dev6 and dev7 at the same time, in parallel. What is the correct way to do it? Do I need a parallel block? With the above code it just starts the function on one of dev6 or dev7. I tried
label "dev6 || dev7"
label "dev6 && dev7"
but it doesn't work. Can someone help?
Thanks
You need parallel at the level of stages; the reason is that you actually want this to run twice, on separate agents. Unless I misunderstood you.
pipeline {
    agent none
    stages {
        stage('Test') {
            parallel {
                stage('Test On dev6') {
                    agent {
                        label "dev6"
                    }
                    steps {
                        xyz()
                    }
                }
                stage('Test On dev7') {
                    agent {
                        label "dev7"
                    }
                    steps {
                        xyz()
                    }
                }
            }
        }
    }
}
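If the same steps have to run on many labels, the parallel branches can also be built dynamically instead of being written out by hand. This is a scripted-pipeline sketch; the label list is an assumption, and `xyz()` is the function from the question:

```groovy
// Scripted-pipeline sketch: run the same closure on several labels in parallel
def labels = ['dev6', 'dev7']          // assumed list of agent labels
def branches = [:]

for (l in labels) {
    def label = l                      // capture the loop variable for the closure
    branches["Test on ${label}"] = {
        node(label) {                  // allocate an executor on that agent
            xyz()
        }
    }
}

parallel branches
```

The map keys become the branch names shown in the build log, so each agent's output is easy to tell apart.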

How to trigger a multiple run in a single pipeline job of jenkins?

I have a pipeline job which runs with the below pipeline Groovy script:
pipeline {
    agent none
    parameters {
        string(name: 'Unique_Number', defaultValue: '', description: 'Enter Unique Number')
    }
    stages {
        stage('Build') {
            agent { node { label 'Build' } }
            steps {
                script {
                    sh 'build.sh'
                }
            }
        }
        stage('Deploy') {
            agent { node { label 'Deploy' } }
            steps {
                script {
                    sh 'deploy.sh'
                }
            }
        }
        stage('Test') {
            agent { node { label 'Test' } }
            steps {
                script {
                    sh 'test.sh'
                }
            }
        }
    }
}
I just trigger this job multiple times with a different unique ID number as the input parameter, so as a result I have multiple runs/builds of this job at different stages.
Now I need a way to promote several of those runs/builds to the next stage (i.e. from Build to Deploy, or from Deploy to Test) as one single build, instead of triggering each and every run/build to the next stage individually. Is there any possibility?
I was also trying to do the same thing and found no relevant answers; maybe this helps someone.
This will read a file that contains the Jenkins job names and run them iteratively from one single job.
Please adapt the code below to your Jenkins setup.
pipeline {
    agent any
    stages {
        stage('Hello') {
            steps {
                script {
                    git branch: 'Your Branch name', credentialsId: 'Your credentials', url: 'Your BitBucket Repo URL'
                    // Read a file from the workspace that contains the Jenkins job names
                    def filePath = readFile "${WORKSPACE}/ Your File Location"
                    // Read the file line by line
                    def lines = filePath.readLines()
                    // Iterate and run the Jenkins jobs one by one
                    for (line in lines) {
                        build(job: "$line/branchName",
                            parameters: [
                                string(name: 'vertical', value: "${params.vertical}"),
                                string(name: 'environment', value: "${params.environment}"),
                                string(name: 'branch', value: "${params.aerdevops_branch}"),
                                string(name: 'project', value: "${params.host_project}")
                            ]
                        )
                    }
                }
            }
        }
    }
}
You can start multiple jobs from one pipeline if you run something like:
build job: 'One', wait: false
build job: 'Two', wait: false
Your main job starts the child pipelines, and the child pipelines run in parallel.
You can read the Pipeline Build Step documentation for more information.
You can also read about parallel runs in declarative pipelines; there you can find a lot of examples of parallel running.
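The two ideas can also be combined: triggering the downstream jobs from a `parallel` step makes the parent wait for both children and fail if either fails. A sketch, assuming downstream jobs named 'One' and 'Two' exist:

```groovy
// Scripted-pipeline sketch: trigger two downstream jobs concurrently
// and wait for both results (wait defaults to true for the build step).
parallel(
    'Run One': { build job: 'One' },
    'Run Two': { build job: 'Two' }
)
```

With `wait: false` instead, the parent merely fires the children and moves on, losing visibility of their results.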

How to keep a process running after the stage is finished in a declarative Jenkins pipeline

pipeline {
    agent none
    stages {
        stage('Server') {
            agent {
                node {
                    label "xxx"
                    customWorkspace "/home/xxx/server"
                }
            }
            steps {
                sh 'node server.js &'
                //start server
            }
        }
        stage('RunCase') {
            agent {
                node {
                    label 'clientServer'
                    customWorkspace "/home/xxx/CITest"
                }
            }
            steps {
                sh 'start test'
                sh 'run case here'
            }
        }
    }
}
I created the above Jenkins pipeline. What I want to do is:
1. start the server on the server node;
2. start the tests on the test node.
However, I found that the server process is killed when the second stage starts.
So how can I keep the server running until my second-stage testing work is finished? I tried using &, but it is still not working; it seems Jenkins kills all processes I started in the first stage.
One solution is to run the two stages in parallel. For more information see these two resources: parallel-declarative-blog and jenkins-pipeline-syntax. But be careful: it is not guaranteed that the first stage starts before the second one, so you may need a waiting time before your tests. Here is an example Jenkinsfile:
pipeline {
    agent none
    stages {
        stage('Run Tests') {
            parallel {
                stage('Start Server') {
                    steps {
                        sh 'node server.js &'
                    }
                }
                stage('Run Cases') {
                    steps {
                        sh 'run case here'
                    }
                }
            }
        }
    }
}
Another solution would be to start the node server in the background, detached from the build. For this you can try different tools, like nohup or pm2.
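The underlying reason the server dies is Jenkins' ProcessTreeKiller, which kills every process spawned by a build when the build ends. It identifies those processes by an environment-variable cookie, so overriding that cookie before launching the server makes Jenkins skip it during cleanup. A sketch for the first stage's steps block (the log file path is an assumption):

```groovy
steps {
    // ProcessTreeKiller tags build processes via JENKINS_NODE_COOKIE
    // (BUILD_ID on older Jenkins versions); changing it here means
    // the detached server survives the end of the stage.
    sh 'JENKINS_NODE_COOKIE=dontKillMe nohup node server.js > server.log 2>&1 &'
}
```

Note the server then keeps running on the agent indefinitely, so a later stage or job should shut it down explicitly.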
