Gitlab hook on opening merge request - continuous-integration

I have a buildbot server and GitLab. I could not figure out how to trigger builds whenever a merge request is opened on GitLab. The goal is for buildbot to write a comment back to the merge request whenever a build succeeds or fails (the build being done on the merge request branch merged with the upstream branch).
Any hints on how to trigger that?
Thanks!

The GitLab team has actually merged changes that make it possible to fire web hooks whenever a merge request is opened or updated:
see https://github.com/gitlabhq/gitlabhq/pull/5881 and
https://github.com/gitlabhq/gitlabhq/issues/1137

You could implement a service like the one for GitLab CI. That service actually posts back to the merge request whether GitLab CI passed or failed the test suite.

I implemented one and am contributing it back to the buildbot project, see https://github.com/buildbot/buildbot/pull/1820
It uses webhooks and posts comments back to the merge request to show build status.
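For anyone wiring this up by hand, here is a minimal sketch (Ruby/Sinatra, matching the web-hook example further down this page) of a receiver for GitLab's merge request web hook. The /gitlab/merge_request route and the hand-off to the build system are placeholders, not buildbot or GitLab API; the payload keys follow GitLab's documented merge request event format.

require 'sinatra/base'
require 'json'

class MergeRequestHook < Sinatra::Base
  post '/gitlab/merge_request' do
    payload = JSON.parse(request.body.read)
    halt 400 unless payload['object_kind'] == 'merge_request'

    mr = payload['object_attributes'] || {}
    source = mr['source_branch']
    target = mr['target_branch']
    puts "Merge request !#{mr['iid']}: build #{source} merged into #{target}"

    # Placeholder: hand the branch pair to your build system here (e.g. POST
    # to buildbot's change hook), and once the build finishes post the result
    # back as a note on the merge request via the GitLab API.
    status 200
  end
end

Pointing the project's web hook (with merge request events enabled) at this endpoint delivers the payload; posting the build result back to the merge request is what the buildbot pull request linked above adds.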

Related

TeamCity doesn't trigger builds consistently for changes on a PR

I have a TeamCity server with a Bitbucket Server repository. On TeamCity there is a build pipeline set up to validate pull requests, and basically it works as expected. But while new pull requests trigger a build within a minute, when I commit changes to an existing pull request it can take up to an hour until TeamCity shows the changes on the build configuration page. As soon as it does find them, it triggers the build as expected.
It also doesn't make a difference if I select "Check for pending changes" in the Actions menu.
Strangely, on my other build pipeline, which builds changes on the master branch, new commits trigger a build within a minute as well.
(Screenshots omitted: the Pull Request build feature, my branch specifications, the "Changes checking" settings, and the trigger.)
We use TeamCity Professional 2019.2 (build 71499)
EDIT1: I just realized that there are two different views for changes: one for the branch and one for the pull request. The changes show up very quickly in the branch view for the pull request branch, but not in the pull request view.
(Screenshots of the branch view and the pull request view omitted; for reference, they were taken at 16:32.)
EDIT2: I used this article to set it up: https://www.jetbrains.com/help/teamcity/2019.2/pull-requests.html
EDIT3: I just found out that I can trigger the build by browsing the pull request on the Bitbucket Server page. No idea how that works, though.
It seems this is by design of Bitbucket Server:
https://community.atlassian.com/t5/Bitbucket-questions/Change-pull-request-refs-after-Commit-instead-of-after-Approval/qaq-p/194702
TL;DR: The refs for pull request branches are not updated immediately, for performance reasons. The easiest way to trigger an update is to view the pull request on the Bitbucket Server website.
Comments and ref updates to master will also trigger it, if I understood that correctly.
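If comments really do refresh the pull request refs, a small script could nudge a stale PR through the Bitbucket Server REST API instead of someone opening it in the browser. This is only a sketch under that assumption; the base URL, project key, repository slug, PR id and credentials are placeholders.

require 'net/http'
require 'json'
require 'uri'

BASE_URL = 'https://bitbucket.example.com'   # your Bitbucket Server (placeholder)
PROJECT  = 'PROJ'                            # project key (placeholder)
REPO     = 'my-repo'                         # repository slug (placeholder)
PR_ID    = 42                                # pull request to nudge (placeholder)

# Post a trivial comment on the pull request via the REST API.
uri = URI("#{BASE_URL}/rest/api/1.0/projects/#{PROJECT}/repos/#{REPO}/pull-requests/#{PR_ID}/comments")
request = Net::HTTP::Post.new(uri, 'Content-Type' => 'application/json')
request.basic_auth('ci-bot', ENV.fetch('BITBUCKET_PASSWORD'))
request.body = JSON.generate(text: 'Nudging pull request refs so TeamCity picks up the new commits.')

response = Net::HTTP.start(uri.host, uri.port, use_ssl: uri.scheme == 'https') do |http|
  http.request(request)
end
puts "Bitbucket Server answered #{response.code}"

The obvious downside is comment noise on every nudge, so treat this as a stopgap rather than a fix.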

How can I remove an Azure Pipelines build from GitHub checks

I just setup CI/CD for a GitHub repo.
The CI build, which validates a pull request, is set up as a GitHub Action.
The CD build (which should run after the pull request is merged) is set up using Azure Pipelines, as I would like to use the generated artifacts as a trigger for a release pipeline in Azure Pipelines as well.
The only thing still bugging me is that the CD build also triggers automatically for a pull request, and I can't figure out where to configure those checks.
Among the checks that currently run when a pull request is created, I want to get rid of the Continuous Delivery build.
I tried to configure the branch protection rules, but this has no effect.
On the Azure Pipelines side I completely disabled the triggers, but this also has no visible effect to me.
I tested "Disable pull request validation" in the Triggers of the Azure DevOps pipeline. On my side it works well, and the build pipeline validation check is not displayed in the GitHub pull request.
First check whether the pipeline source repo on which you set the "Disable pull request validation" option corresponds to the GitHub repo where the pull request was created. Then try a few more times; it is possible that the settings are not applied immediately.
In addition, as a workaround you can opt out of pull request validation entirely by specifying pr: none in the YAML. Please refer to the official documentation.
# no PR triggers
pr: none

How do you get a BitBucket pull request to trigger a Bamboo build?

I'm working on a CI/CD environment for my workplace. One of the issues we're struggling with is the pull request workflow we currently use, and which we want to continue to use.
We can get Bitbucket to tell Bamboo to make a build when you commit to a feature branch and push it to Bitbucket, but for the life of me I can't find anything in there related to actual pull requests. In our instance, it would be great if a plan branch were created when a pull request for a feature branch is opened in Bitbucket. Subsequent commits would then trigger additional builds on the PR plan branch. Is this even possible in Bamboo with Bitbucket?
Updating this to say that proper pull request support was finally added to Bamboo in version 6.0:
release notes
You can now configure the plan branching model to automatically create branches when new PRs are added to Bitbucket.
It looks like somebody answered your question on another forum: https://answers.atlassian.com/questions/17435563/how-do-you-get-a-bitbucket-pull-request-to-trigger-a-bamboo-build which references this feature request for Bamboo: https://jira.atlassian.com/browse/BAM-14844
I'm posting this here for other people like myself who stumble across this question.
The bamboo feature called plan branches has been implemented in recent releases. See the link posted by Mynock, or directly here.

How can I configure the Bluemix Pipeline to either tag builds or create a work item (defect) according to the state of the build?

I have a Build & Deploy pipeline in Bluemix. I would like to create a condition where, if the build fails, it automatically assigns a defect (i.e., a work item on the "Track & Plan" page) to whoever delivered the latest change (or just assigns it to the main owner of the app/project). Also, if the build completes successfully, I would like to tag it.
Tagging is OK, that's general Git knowledge; I just want to solve two problems with that plan:
1. How do we trigger a specific subsequent stage in the pipeline if the current build fails/passes?
2. How do I create a work item from the pipeline? Do I need to create a separate Git repo and build some sort of API package that allows me to invoke a mechanism that creates the ticket?
I guess I'm going a bit maverick with this pipeline; please share your thoughts.
As of right now you cannot create a work item from the pipeline. That is a great feature improvement and I can take it back to the team.
For your question about triggering a stage if something passes or fails: the way it works now, the next stage is only triggered if the previous one succeeds. The pipeline is based on Jenkins, and Jenkins doesn't allow you to trigger a specific job depending on whether another job passes or fails. You would want to detect the pass or fail within your stage and apply your logic based on that.
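Since the detection has to live inside the stage itself, a generic wrapper along these lines can branch on the build result. This is only a sketch, not a Bluemix-specific API: run_build.sh, the tag format, and notify_owner are placeholders you would replace with your own build command, tagging scheme, and tracker call.

# Detect pass/fail of the build inside the stage and act on it.
def notify_owner(message)
  # Placeholder: replace with a call to your tracker's API to open a defect
  # and assign it to the last committer or the project owner.
  warn message
end

build_ok = system('./run_build.sh')   # placeholder build command

if build_ok
  sha = `git rev-parse --short HEAD`.strip
  system('git', 'tag', '-a', "build-#{sha}", '-m', 'successful build')
  system('git', 'push', 'origin', '--tags')
else
  notify_owner("Build failed at #{Time.now}")
  exit 1
end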

How to push from Gitlab to Github with webhooks

My Google-fu is failing me for what seems obvious if I can only find the right manual.
I have a GitLab server which was installed by our hosting provider.
The GitLab server has many projects.
For some of these projects, I want GitLab to automatically push to a remote repository (in this case GitHub) every time there is a push from a local client to GitLab.
Like this: client --> gitlab --> github
Any tags and branches should also be pushed.
AFAICT I have 3 options:
1. Configure the local client with two remotes and push simultaneously to GitLab and GitHub. I want to avoid this because developers.
2. Add a git post-receive hook in the repository on the GitLab server. This would be the most flexible option (I have sufficient Linux experience to write shell scripts as git hooks) and I have found documentation on how to do this, but I want to avoid this too, because then the hosting provider would need to give me shell access.
3. Use web hooks in GitLab. I am unfamiliar with even the very basics of web hooks, and I am unable to locate understandable documentation or even a simple step-by-step example. This is the documentation from GitLab that I found, and I do not understand it: http://demo.gitlab.com/help/web_hooks/web_hooks
I would appreciate good pointers, and I will summarize and document a solution when I find it.
EDIT
I'm using this Ruby code for a web hook:
require 'sinatra/base'
require 'json'

class PewPewPew < Sinatra::Base
  post '/pew' do
    # GitLab POSTs the push event as a JSON document in the request body
    push = JSON.parse(request.body.read)
    puts "I got some JSON: #{push.inspect}"
  end
end
Next: find out how to tell the gitlab server that it has to push a repository. I am going back to the GitLab API.
EDIT
I think I have an idea. On the server where I run the webhook, I pull from GitLab and then I push to GitHub. I can even do some "magic" (running tests, building jars, deploying to Artifactory, ...) before I push to GitHub. In fact, it would be great if Jenkins were able to push to a remote repository after a successful build; then I wouldn't need to write my own webhook, because I'm pretty sure Jenkins already provides a webhook for GitLab, either natively or via a plugin. But I don't know. Yet.
EDIT
I solved it in Jenkins.
You can set more than one git remote in a Jenkins job. I used Git Publisher as a Post-Build Action and it worked like a charm, exactly what I wanted.
Option 1 (two remotes on the client) would work, of course.
Option 2 (a post-receive hook on the server) is possible but dangerous, because GitLab Shell automatically symlinks hooks into repositories for you, and those are necessary for permission checks: https://github.com/gitlabhq/gitlab-shell/tree/823aba63e444afa2f45477819770fec3cb5f0159/hooks so I'd rather stay away from it.
Web hooks (option 3) are not suitable directly: on certain events (in your case, push) they make an HTTP request in a fixed format, not Git protocol requests.
Of course, you could write a server that consumes the hook, clones and pushes, but a service (single push and no deployment) or GitLab CI (already implements hook management) would be strictly better solutions.
Services would be the best option if someone implements one: they live in the source tree, would do a single push, and require no extra deployment overhead.
GitLab CI or other CIs like Jenkins are the best option currently available. They are essentially already-implemented servers for the web hooks, which automatically clone for you: all you have to do then is push from them.
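For completeness, here is a minimal sketch of the "write a server that consumes the hook, clones and pushes" approach, in the same Sinatra style as the hook above. It assumes a bare mirror clone already exists (created with git clone --mirror from the GitLab URL); the mirror path and the GitHub URL are placeholders, and the push-credentials question raised further down still applies.

require 'sinatra/base'
require 'json'

class MirrorHook < Sinatra::Base
  MIRROR_PATH = '/var/mirrors/myproject.git'            # bare mirror clone of the GitLab repo (placeholder)
  GITHUB_URL  = 'git@github.com:example/myproject.git'  # push target (placeholder)

  post '/mirror' do
    push = JSON.parse(request.body.read)
    repo = push['repository'] ? push['repository']['name'] : 'unknown'
    puts "Push received for #{repo}, mirroring to GitHub"

    Dir.chdir(MIRROR_PATH) do
      # Update all refs (branches and tags) from GitLab, then mirror them to GitHub.
      system('git', 'remote', 'update', '--prune') or halt 500
      system('git', 'push', '--mirror', GITHUB_URL) or halt 500
    end
    status 200
  end
end

Jenkins with the Git Publisher post-build action, as described in the Jenkins answer below, does the same job without any custom code.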
The keywords you want to Google for are "gitlab mirror github". That led me, for instance, to "Gitlab repository mirroring". There seems to be no perfect, easy solution today.
Also this has already been proposed at the feature request forum at: http://feedback.gitlab.com/forums/176466-general/suggestions/4614663-automatic-push-to-remote-mirror-repo-after-push-to Always check there ;) Go and upvote the request.
The key difficulty now is how to store the push credentials.
I solved it in Jenkins. You can set more than one git remote in a Jenkins job; I used Git Publisher as a Post-Build Action and it worked like a charm, exactly what I wanted.
I added "-publisher" jobs that run after "" is built successfully. I could have done it in one job, but I decided to split it up. The build jobs are triggered by a web hook in GitLab; the publisher jobs use an @daily schedule from the BuildResultTrigger plugin.
