Context: MultiBranch Pipeline Shared Library
We are working on test automation for pipeline development in our CI/CD system.
Our CI/CD system triggers builds on commits to Bitbucket repositories that contain a Jenkinsfile (multibranch pipeline) for the microservices we deploy.
Each commit (with a Jenkinsfile in the root) triggers a build with the multibranch plugin.
The current approach applies the same commit-triggered builds to the shared pipeline repository that contains the shared library used by the system.
So the sharedLibrary project has its own Jenkinsfile, which is executed on each commit and contains code that references the library itself.
@Library('sharedLibrary@currentBranch') _ // this is the relevant part
import groovy.transform.Field
@Field def projectList = [p1, p2, ... ] // list of projects to be tested
node('testNode') {
    projectList.each { project2Test ->
        def gitConfig = setupGitParameters(project2Test)
        stage('checkout ' + project2Test) {
            withCredentials(gitConfig) {
                sh """
                    git clone ${project2Test}
                    cd ${project2Test}
                    git config ....
                    git checkout -b ${testBranch}
                    echo 'patch Jenkinsfile'
                    sed -i ${changeWhatisNeeded} Jenkinsfile
                    git commit -am "pipeline test with Library ${env.BRANCH_NAME} #${env.BUILD_NUMBER}" # to force a commit
                    git push origin ${testBranch} # this push creates a new build on the MB pipeline
                """
            }
        }
    }
}
So I am trying to parameterize the branch of the @Library annotation to match the current branch automatically, but so far I have failed.
I am hoping for any pointers to solve this issue, as it is essential for complete test automation.
As we all know, manually editing string constants is a proven no-go for anything automated!
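One direction worth trying (a sketch, not verified against this exact setup): the @Library annotation is resolved before the pipeline script runs, so it cannot interpolate a variable, but the dynamic library step from the Pipeline: Shared Groovy Libraries plugin is evaluated at runtime and can:

// The @Library annotation only accepts string constants, but the 'library'
// step is evaluated at runtime, so the branch being built can load itself:
library "sharedLibrary@${env.BRANCH_NAME}"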
I'm working on a build job in Bamboo that performs a few tasks (updating various dependencies) on a project in Git repository A. This build is triggered by changes in Git repository B, so whenever a commit is pushed to repository B, the build for repository A starts.
However, this build should only start if the commit in repository B was made by a specific user. So in the build script of repository A I'd like to run git log against repository B to check whether the latest commit was made by this user, and if not, exit the script.
What I currently do is this:
if [ "$(git log -1 --pretty=format:'%an')" != "SPECIFIC_USERNAME" ]; then
echo '-----THIS COMMIT IS NOT SCHEDULED FOR BUILD-----'
exit 0
fi
However, running git log here only shows the log of repository A, whereas I need the log of repository B.
Can this be done somehow?
I appreciate any help!
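One possible direction (a sketch; the clone URL and the temporary path are placeholders, not your actual values): make repository B's history available to the script by cloning it shallowly, then read the author from that clone with git -C:

#!/bin/bash
# Shallow-clone repository B (placeholder URL) so its log is available,
# then read the author of its latest commit without leaving repository A.
git clone --depth 1 ssh://git@server/projects/repo-b.git /tmp/repo-b
if [ "$(git -C /tmp/repo-b log -1 --pretty=format:'%an')" != "SPECIFIC_USERNAME" ]; then
    echo '-----THIS COMMIT IS NOT SCHEDULED FOR BUILD-----'
    exit 0
fi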
We are just starting out using Jenkins Multi-branch pipelines. I like the idea of Jenkins automatically creating a new Jenkins job when a new branch is created. It will make sure that all releasable development is being built in Jenkins. We have about 40 or 50 projects that get branched for almost every release, and creating those 40 or so jobs every time we branch is error-prone work.
However, I see there are two types of pipeline builds in Jenkins:
Regular Pipeline builds: You specify the location and branch in your Jenkins job, and you can choose whether to use a script defined inside the Jenkins job configuration or a script from your source repository. This would allow us to maintain a single Jenkinsfile for all of our jobs: if we change something in the build procedure, we only have to edit a single Jenkinsfile.
Multi-Branch Pipeline builds: Jenkins automatically creates a new Jenkins job when a new branch is created. This means we no longer have to create dozens of new Jenkins projects when a new branch appears. However, it looks like the Jenkinsfile must be located in the root of the project, so if we make a basic change to the build procedure, we have to update all Jenkins projects.
I'd like to be able to use the Multi-branch Pipeline build, but I want to either specify where in our repository to pull the Jenkinsfile from, or include a master Jenkinsfile from a repository URL.
Is there a way to do this with Jenkins Multi-branch pipelines?
If you have common build logic across repos, you can move most of the pipeline logic into a separate Groovy script, which can then be referenced from any Jenkinsfile.
This could be done either by checking out the repo that the Groovy script lives in to another directory and doing a standard Groovy load, or, probably the better approach, by storing it as a Groovy script in the Jenkins Global Script Library, which is essentially a self-contained Git repo within Jenkins
(see https://github.com/jenkinsci/workflow-cps-global-lib-plugin/blob/master/README.md for more details).
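For the first option, a minimal Jenkinsfile sketch could look like the following (the repo URL and paths are hypothetical, and the loaded script must end with return this so its methods are callable):

node {
    // fetch the repo that holds the shared script into a subdirectory
    dir('pipeline-scripts') {
        git url: 'https://example.com/pipeline-scripts.git'
    }
    // 'load' compiles and runs the script, returning the script object
    def common = load 'pipeline-scripts/scriptName.groovy'
    common.someMethod()
}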
We had a similar requirement and created a global Groovy method in a script that was maintained in Git and deployed to Jenkins' Global Script Library under /vars/ whenever it changed:
e.g. the script scriptName.groovy contains
def someMethod() {
    // some build logic
    stage 'Some Stage'
    node() {
        // do something
    }
}
That way the common function could be called in any Jenkinsfile via
scriptName.someMethod()
I haven't been able to find any info about this, so I hope you can help me with this one.
I have a Maven project hosted in Bitbucket with a webhook pointing to someurl/bitbucket-hook/. This hook triggers the build of my project, which is defined by a pipeline with this structure:
node {
    stage 'Checkout'
    git url: 'https:...'
    def mvnHome = tool 'M3'
    // various stages here
    ...
    stage 'Release'
    sh "${mvnHome}/bin/mvn -B clean install release:prepare release:perform release:clean"
}
The problem is that the Maven release plugin pushes changes to Bitbucket, and this triggers the Jenkins script again, creating an infinite loop of builds. Is there a way to prevent this?
I've tried setting a quiet period in Jenkins, with no success.
From my perspective you should have separate jobs for build and release, and the release job should be triggered manually. Anyway, if there is some reason to have both in one job, you can check the message of the last commit:
node {
    git 'https...'
    // capture the message of the latest commit
    sh 'git log -1 > GIT_LOG'
    git_log = readFile 'GIT_LOG'
    // abort when the commit came from the release plugin itself
    if (git_log.contains('[maven-release-plugin]')) {
        currentBuild.result = 'ABORTED'
        return
    }
    ... // continue with release or whatever
}
The approach in the article A New Way to Do Continuous Delivery with Maven and Jenkins Pipeline solves the infinite loop:
Use the Maven release plugin to prepare a release with pushChanges=false (we are not going to push the release commits back to master) and preparationGoals=initialize (we don't care if the tag is bad as we will only push tags that are good)
sh "${mvnHome}/bin/mvn -DreleaseVersion=${version} -DdevelopmentVersion=${pom.version} -DpushChanges=false -DlocalCheckout=true -DpreparationGoals=initialize release:prepare release:perform -B"
Another solution could be to change the git hook (post-receive) to do a conditional curl, similar to this script:
#!/bin/bash
# Only notify the build server when the commit was not made by the release plugin
git_log=$(git log --branches -1)
if ! [[ $git_log =~ .*maven-release-plugin.* ]]; then
    curl "http://buildserver:8080/git/notifyCommit?url=ssh://git@server/projects/name.git"
fi
Since GitLab 7.6, or thereabouts, there is a new option to use TeamCity directly from GitLab projects. The setup shows this message:
The build configuration in TeamCity must use the build format number %build.vcs.number%. You will also want to configure monitoring of all branches so merge requests build; that setting is in the VCS root advanced settings.
I'm not sure how this works. Let's say I have a repository Foo.
I have set up a build on TeamCity to listen to Foo with the branch specification: +:refs/pull/*/merge
I then fork Foo in GitLab as FooFork, make a change, then request a merge FooFork -> Foo.
But nothing happens to test this merge, which is what I was expecting GitLab to do. If I accept the merge, the build server jumps into action (immediately) and builds twice (master and /ref/master).
I've also set the build configuration to use exactly %build.vcs.number% as the build number, as prescribed, but GitLab doesn't seem to give me any information about the build result.
So I'm a bit confused as to what exactly this GitLab -> TeamCity integration is supposed to do and whether I'm doing something wrong.
I'm currently running GitLab 7.9 and TeamCity 8.1.4.
Update:
It seems this use case was not supported prior to version 8 - https://github.com/gitlabhq/gitlabhq/issues/7240
I'm running GitLab 8.0.2 and TeamCity 9.1.1 and am able to run CI builds on branches and merge requests.
I trigger CI builds for specific branches by setting a VCS trigger together with the branch specification +:refs/heads/(xyz*), where xyz is the prefix from our ticket system, since all active branches need to be named after an entry in our issue tracker.
I trigger builds for merge requests via the branch specification +:refs/(merge-requests/*)
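Put together, the branch specification field on the VCS root contains both lines (xyz again stands for the ticket prefix):

+:refs/heads/(xyz*)
+:refs/(merge-requests/*)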
Everything works as expected and lets us know the status of all feature/bug branches and merge requests automatically.
Thanks to Rob's comment linking to the GitLab 8 release notes entry on the merge request spec.
Same problem here. There might be another way, which I'm evaluating right now. Since there's no direct way of getting the merged state from the target MR, you have to build it on your own:
IMO the following steps are needed:
1.) init an empty repo: $ git init
2.) add your target repo: $ git remote add origin git@your-repo:<origin.group>/<origin.repo>.git
3.) add the remote with the feature to merge: $ git remote add feature git@your-repo:<feature.group>/<feature.repo>.git
4.) fetch both remotes: $ git fetch --all
5.) check out your feature branch: $ git checkout -b <feature.branch> feature/<feature.branch>
6.) check out your original branch: $ git checkout -b <origin.branch> origin/<origin.branch>
7.) rebase the feature into your original branch: $ git rebase <feature.branch>
As stated here [1], GitLab CE can fire an event on creation of a merge request, so all you have to do is build something that can evaluate the webhooks.
[1] http://doc.gitlab.com/ce/web_hooks/web_hooks.html#merge-request-events
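As an illustration only (the server, port, and endpoint are assumptions, not part of the original answer), a minimal receiver for such merge request events could look like this:

import com.sun.net.httpserver.HttpServer
import groovy.json.JsonSlurper

// Minimal sketch of a webhook receiver: listens for GitLab merge request
// events and reacts when a new MR is opened. Port and endpoint are arbitrary.
def server = HttpServer.create(new InetSocketAddress(8000), 0)
server.createContext('/gitlab-hook') { exchange ->
    def payload = new JsonSlurper().parse(exchange.requestBody)
    if (payload.object_kind == 'merge_request' && payload.object_attributes?.action == 'open') {
        // kick off the rebase/build steps outlined above
        println "MR opened: ${payload.object_attributes.title}"
    }
    exchange.sendResponseHeaders(200, -1)
    exchange.close()
}
server.start()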
I am new to Bitbucket and was trying to set up a simple build pipeline. I clicked on the pipeline menu option, edited the example file, and committed. This created a pipeline YAML file on my master branch. It ran and built fine, but it did not build my develop branch.
Do I need a pipeline YAML file on each branch?
I can see from the docs that I can put branch-specific steps into the one file. If I edit the file that has been committed on master to include a section for the develop branch, will this run when I commit to the develop branch, or will it only trigger on a commit to the master branch?
Bitbucket runs the pipeline that has a corresponding definition for the branch you have committed to. So if you commit the pipelines configuration file to master, only the default or master pipeline from this file will be executed. If you want to run a pipeline for the develop branch, you need to commit this file to the develop branch as well. Note that the default pipeline is executed regardless of the branch name if there is no other pipeline defined for that particular branch. So your comment is correct: you need to have the bitbucket-pipelines.yml in each branch.
Here is how Bitbucket resolves the pipeline execution configuration:
If there is no bitbucket-pipelines.yml, no pipelines will run for the branch.
If there is a bitbucket-pipelines.yml with only a default pipeline definition, Bitbucket will execute the default pipeline:
pipelines:
  default:
    - step:
        script:
          - echo "Running the default pipeline"
If there is also a specific pipeline defined for a particular branch, let's say develop, it goes under the branches section and Bitbucket will execute it instead of the default:
pipelines:
  default:
    - step:
        script:
          - echo "This will not be executed if the branch is develop"
  branches:
    develop:
      - step:
          script:
            - echo "Running the develop pipeline"
Note that if the branch name were something else, let's say release, the default pipeline would be executed, since there is no pipeline defined for the release branch.
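If the release branch should get its own steps instead of falling back to the default, it can be added under branches in the same way (a sketch following the pattern above):

pipelines:
  default:
    - step:
        script:
          - echo "Fallback for branches without their own pipeline"
  branches:
    develop:
      - step:
          script:
            - echo "Running the develop pipeline"
    release:
      - step:
          script:
            - echo "Running the release pipeline"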