How to create a webhook between Bitbucket and Azure DevOps?

We have all our repositories in Bitbucket, and I'm trying to set up a continuous integration service in Azure DevOps that builds the project after each push.
We have created a dedicated user account for the Bitbucket repositories that has read-only access to all repositories.
However, creating a CI webhook trigger from Bitbucket to Azure DevOps requires admin access to the repositories, and we do not want to give that level of access to the CI user account.
I could add the webhook to the Bitbucket repository manually, but I'm missing the URL the webhook should post the trigger to.
The URL is something like https://dev.azure.com/myorganization/_apis/public/hooks/externalEvents?publisherId ...
I think it's called the deployment trigger URL, but I cannot find it anywhere. Does the new Azure DevOps support manually added webhooks, or do we have to work around this somehow?

I'm in the same boat as you all. I don't want to give my CI account "Admin" rights to ANY repo.
My workaround so far has been to give the CI account temporary access in order to create the webhook when the pipeline is first saved, then downgrade it after the webhook has been created, knowing that any changes will require another temporary permission elevation.
FWIW, the webhook URL that is used is this:
https://[REDACTED].visualstudio.com/_apis/public/hooks/externalEvents?publisherId=bitbucket&channelId=[REDACTED]&api-version=5.1-preview
As you can see, we are in an understandable Catch-22 here: we could conceivably create the pipeline and get that channelId in order to create the webhook in Bitbucket manually, but we can't even SAVE a pipeline without repo Admin rights, so we can't get the channelId.
I wish there were a way to disable the automatic webhook creation so we could create the hook manually on the Bitbucket side.
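Once you do have a channelId (for example via the temporary-elevation workaround above), the hook itself can in principle be registered on the Bitbucket side through Bitbucket Cloud's 2.0 REST API rather than the repository UI. A minimal sketch, assuming Bitbucket Cloud, an app password with webhook permissions, and placeholder workspace, repo and channelId values:

require 'net/http'
require 'json'
require 'uri'

# Placeholders -- substitute your own workspace, repo slug, organization and channelId.
workspace = 'myworkspace'
repo_slug = 'myrepo'
azure_url = 'https://dev.azure.com/myorganization/_apis/public/hooks/externalEvents' \
            '?publisherId=bitbucket&channelId=CHANNEL_ID&api-version=5.1-preview'

uri = URI("https://api.bitbucket.org/2.0/repositories/#{workspace}/#{repo_slug}/hooks")
req = Net::HTTP::Post.new(uri, 'Content-Type' => 'application/json')
req.basic_auth('bitbucket-user', ENV.fetch('BB_APP_PASSWORD'))  # app password, not the account password
req.body = {
  description: 'Azure DevOps CI trigger',
  url:         azure_url,
  active:      true,
  events:      ['repo:push']   # push events are what the CI trigger listens for
}.to_json

res = Net::HTTP.start(uri.hostname, uri.port, use_ssl: true) { |http| http.request(req) }
puts "#{res.code}: #{res.body}"

This only needs webhook admin on the Bitbucket side for the account running the script, not for the CI account itself.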

I know it has been a long time since this was asked, but recently I faced the exact same issue, and I thought I should add this here for anyone struggling to find out where these URLs come from.
I was seeing two webhooks in Bitbucket in the format https://dev.azure.com/[myorganization]/_apis/public/hooks/externalEvents?publisherId=... and I was trying to figure out how they were created in the first place.
As it turns out, when you create a new Bitbucket pipeline in Azure and select a repository for it, Azure automatically creates these webhooks in Bitbucket for you! In other words, there doesn't seem to be a way to deduce these URLs from anywhere; rather, they are created by Azure when you create the pipeline, and they are deleted by Azure when you delete the pipeline from Azure.
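If you just want to inspect what Azure has created on the Bitbucket side, the hooks can also be listed through Bitbucket Cloud's 2.0 REST API. A small sketch with placeholder workspace/repo names and an app password; the description/url/events fields are assumed from that API:

require 'net/http'
require 'json'
require 'uri'

# Placeholders -- use your own workspace, repo slug and credentials.
uri = URI('https://api.bitbucket.org/2.0/repositories/myworkspace/myrepo/hooks')
req = Net::HTTP::Get.new(uri)
req.basic_auth('bitbucket-user', ENV.fetch('BB_APP_PASSWORD'))

res = Net::HTTP.start(uri.hostname, uri.port, use_ssl: true) { |http| http.request(req) }
JSON.parse(res.body)['values'].each do |hook|
  puts "#{hook['description']}: #{hook['url']} (events: #{hook['events'].join(', ')})"
end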

Related

GitLab Custom CI configuration path and merge request

For one of our repositories we set the "Custom CI configuration path" in GitLab to a remote gitlab-ci.yml. We want to do this to prevent developers from changing the gitlab-ci.yml file (since protected files are only available in EE Premium and up). But apart from that purpose, the Custom CI configuration path feature should work for merge requests anyway.
Being in repo
group1/repo1
we set
.gitlab-ci.yml@group1/repo1-ci
The repo1-ci repository exists and CI works correctly when we push to the configured branches, etc.
For Merge Request functionality GitLab tells us:
Detached merge request pipeline #123 failed for ...
Project group1/repo1-ci not found or access denied!
We added the developers to the repo1-ci repo as Developers so they are able to read the files. It does not help. In any case, the expectation is that the pipeline is not run with user permissions, so it should simply find the gitlab-ci.yml file.
Any ideas on this?
So our expectations were right, and it seems we have to add one important thing to our considerations:
If a user interacts with the merge request features in the GitLab UI and you are using a "Custom CI configuration path" for your gitlab-ci.yml file, please ensure that
the user has at least read permission on that remote file, even if you moved it to another repo on purpose (e.g. to use enhanced file protection in PREMIUM/ULTIMATE, or to push/merge-protect the branches against the Developer role), and
the user's running session has actually picked up this permission change.
The last part is what failed for our users, as it only worked one day later. It seems they just kept working from their already-open merge request page, and GitLab checked accessibility from that session (using a cookie, token or something similar that had not been updated with the new access to the remote repo/file).
It works!

Trigger a Bamboo plan from Bitbucket webhooks

I spent a couple of hours trying to figure out why I'm not able to trigger a webhook from Bitbucket to Bamboo, and I have found nothing yet.
Issue:
I want to know when a PR is merged or a branch is deleted, which as far as I can see I cannot track from Bamboo itself, so I need a webhook in Bitbucket that calls a Bamboo REST API, based on the page below, unless there is a better idea.
Based on that page I thought I could trigger a webhook:
https://confluence.atlassian.com/bamboo/triggering-a-bamboo-build-from-bitbucket-cloud-using-webhooks-873949130.html
But this solution is not working, because each time I get this error message:
{"message":"Anonymous user can't access this resource. If it should be available, modify anonymous user permissions at Administration > Security settings","status-code":401}
The only access we have granted to the Anonymous group is View, which as far as I can see is not enough to call this API from Bitbucket:
https://confluence.atlassian.com/bamboo/bamboo-permissions-369296034.html
So I don't know what to do or how to track whether a PR is merged or a branch is deleted.
I would appreciate it if someone could tell me what the problem is.
FYI: the Bamboo and Bitbucket versions are the latest ones.
What is your Bamboo version? This issue was addressed in Bamboo 6.7.0. Under Bamboo > Administration > Security settings you can grant or deny anonymous users access to a given webhook.
The easiest way is to enable triggers for anonymous users. However, as @Hamed mentioned, allowing anonymous access is not feasible in some environments. The problem is that we cannot even use <User>:<Password>@<Bamboo URL>, as the auth details get stripped off.
One possible way of doing this is to keep a proxy between Bitbucket and Bamboo and add the authentication headers at the proxy level.
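To sketch that proxy idea (this is nothing Bitbucket or Bamboo ship, just an illustration): a tiny Sinatra app that accepts the Bitbucket webhook and forwards it to Bamboo with credentials attached. The Bamboo queue endpoint and plan key below are placeholders; point it at whatever trigger URL you actually use.

require 'sinatra'
require 'net/http'
require 'uri'

# Placeholder Bamboo trigger endpoint -- replace host and PROJ-PLANKEY with your own.
BAMBOO_TRIGGER = URI('https://bamboo.example.com/rest/api/latest/queue/PROJ-PLANKEY')

post '/bitbucket-hook' do
  req = Net::HTTP::Post.new(BAMBOO_TRIGGER)
  req.basic_auth('ci-user', ENV.fetch('BAMBOO_PASSWORD'))  # auth is added here, never exposed to Bitbucket
  req['Content-Type'] = request.content_type if request.content_type
  req.body = request.body.read                             # pass the webhook payload through untouched

  res = Net::HTTP.start(BAMBOO_TRIGGER.hostname, BAMBOO_TRIGGER.port, use_ssl: true) do |http|
    http.request(req)
  end
  status res.code.to_i
  res.body
end

Bitbucket then only ever sees the proxy URL, and the Bamboo credentials stay on the proxy host.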

What are webhookId and deployKeyId when mirroring a Bitbucket repository in Google Cloud?

We have a lot of Bitbucket repositories that I am trying to link to Google Cloud so I can use automatic build triggers to build container images.
If you set up a build trigger from a Bitbucket source through the admin panel interface, Google seems to create a mirrored source repository, add an SSH key on Bitbucket (so it can pull any changes) and also add a Bitbucket service (old-style webhook) on the repository so that pushing to Bitbucket triggers a pull into Google (i.e. mirroring). This all seems to work well, but I would like to be able to set this up programmatically via an API.
I can set up the Bitbucket side fine using their API, but I am not sure how to go about creating the Google repository this way.
Using the API to fetch an existing mirrored repository gives some clues about how it works; the mirrorConfig block is the key part.
{
  "name": "projects/my_project_id/repos/bitbucket-organisation-myrepo",
  "size": "7670706",
  "url": "https://source.developers.google.com/p/my_project_id/r/bitbucket-organisation-myrepo",
  "mirrorConfig": {
    "url": "ssh://git@bitbucket.org/organisation/myrepo.git",
    "webhookId": "12299619",
    "deployKeyId": "6759258"
  }
}
The POST service set up in Bitbucket is:
https://source.developers.google.com/webhook/bitbucket?id=M12A8PQNCVD&project=385688625156
Notice how the IDs in Google don't correspond to the Bitbucket webhook URL. Adding another repository gives completely different IDs and creates a new SSH key pair on my Bitbucket user every time.
webhookId looks like it links to some sort of "Google Cloud webhook", which must be how the Bitbucket webhook is tied to the Google side, but I can't see where to find or create these.
deployKeyId sounds like it links to some sort of credential store, which must be where the private key is kept.
My question is: what are these two IDs really, and which APIs can I use to set up webhooks and deploy keys in Google Cloud?
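For reference, this is roughly how the repo metadata above (including mirrorConfig) can be read back via the Cloud Source Repositories REST API; the project and repo names are placeholders and the token comes from the gcloud CLI. As far as I can tell, mirrorConfig is only readable here, not settable, which seems to be the crux of the problem.

require 'net/http'
require 'json'
require 'uri'

# Placeholders -- use your own project id and repo name.
project = 'my_project_id'
repo    = 'bitbucket-organisation-myrepo'
token   = `gcloud auth print-access-token`.strip  # assumes an authenticated gcloud CLI

uri = URI("https://sourcerepo.googleapis.com/v1/projects/#{project}/repos/#{repo}")
req = Net::HTTP::Get.new(uri, 'Authorization' => "Bearer #{token}")

res = Net::HTTP.start(uri.hostname, uri.port, use_ssl: true) { |http| http.request(req) }
puts JSON.pretty_generate(JSON.parse(res.body)['mirrorConfig'])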

Heroku review app deploy with GitHub: how to change the deploying user

I inherited a Heroku account. We moved to a team setup and I'm getting started with review apps.
When trying to create a review app, Heroku complains:
Cannot create this review app • Your role collab on the team xxxxx is not allowed to perform that action.
However, I'm an admin on Heroku.
Another admin tried also and had the same problem.
The deploys prior to moving to a team seem to have been initiated by another user, who is a collaborator, but I know for certain that this user did not actually trigger the deploys: the activity shows as initiated by this user, yet it was actually triggered by the owner of the (then) personal app clicking "create review app".
I'm trying to understand how the deploy activity is linked to the GitHub account, so my first question is:
Is the deploy activity associated with a specific GitHub user? If so, where/how is that user defined?
I get the impression I need to disconnect the GitHub account from the pipeline and re-connect with one of the admins' accounts, but I'm wary of disconnecting without understanding the consequences: the Heroku help on this is not clear at all.
My second question is:
What happens when I disconnect the GitHub account from the pipeline? Should I worry that it will mess up my running dynos? If not when disconnecting, could it cause trouble on re-connection?
Thanks
The only way I found to do this was to unlink the GitHub integration in Heroku and re-link it with the account I wanted to use.
I had to do this recently. I had set up a pipeline in Heroku with review apps using an old Heroku account. Later I wanted to remove that old account's access to the Heroku apps, and I ran into the same issue you did.
Here's what I did.
I unlinked GitHub from the pipeline. This deleted all existing review apps, but it left the staging and production apps alone, still in their stages in the pipeline.
Then I re-linked GitHub to the pipeline using the more permanent Heroku account.
And I had to set up the review apps configuration in the pipeline again.
At that point, I had to recreate all review apps that existed before I started.

How to push from GitLab to GitHub with webhooks

My Google-fu is failing me for what seems obvious if I can only find the right manual.
I have a GitLab server which was installed by our hosting provider.
The GitLab server hosts many projects.
For some of these projects, I want GitLab to automatically push to a remote repository (in this case GitHub) every time there is a push from a local client to GitLab.
Like this: client --> gitlab --> github
Any tags and branches should also be pushed.
AFAICT I have 3 options:
Configure the local client with two remotes and push simultaneously to GitLab and GitHub. I want to avoid this because developers.
Add a git post-receive hook to the repository on the GitLab server. This would be the most flexible option (I have sufficient Linux experience to write shell scripts as git hooks) and I have found documentation on how to do this, but I want to avoid it too, because then the hosting provider would need to give me shell access.
Use webhooks in GitLab. I am unfamiliar with even the basics of webhooks, and I am unable to locate understandable documentation or even a simple step-by-step example. This is the GitLab documentation I found, and I do not understand it: http://demo.gitlab.com/help/web_hooks/web_hooks
I would appreciate good pointers, and I will summarize and document a solution when I find it.
EDIT
I'm using this Ruby code for a web hook:
require 'sinatra/base'
require 'json'

class PewPewPew < Sinatra::Base
  # GitLab POSTs the push event as a JSON body to this endpoint.
  post '/pew' do
    push = JSON.parse(request.body.read)
    puts "I got some JSON: #{push.inspect}"
  end
end
Next: find out how to tell the GitLab server that it has to push a repository. I am going back to the GitLab API.
EDIT
I think I have an idea. On the server where I run the webhook, I pull from GitLab and then push to GitHub. I can even do some "magic" (running tests, building jars, deploying to Artifactory, ...) before pushing to GitHub. In fact, it would be great if Jenkins were able to push to a remote repository after a successful build; then I wouldn't need to write my own webhook, because I'm pretty sure Jenkins already provides a webhook for GitLab, either natively or via a plugin. But I don't know yet.
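To make that pull-then-push idea concrete before going the Jenkins route below, here is a rough sketch of a webhook receiver built on the same Sinatra approach as above. The route, paths and GitHub URL are placeholders; it assumes a bare mirror clone of the GitLab repo on the webhook host and an SSH key with write access to the GitHub repo.

require 'sinatra/base'
require 'json'

class MirrorHook < Sinatra::Base
  # Placeholders -- a bare clone made with `git clone --mirror <gitlab-url>` and your GitHub remote.
  LOCAL_CLONE = '/srv/mirrors/myproject.git'
  GITHUB_URL  = 'git@github.com:myorg/myproject.git'

  post '/mirror' do
    event = JSON.parse(request.body.read)          # no token verification here -- add it for real use
    puts "push received for #{event.dig('repository', 'name')}"

    Dir.chdir(LOCAL_CLONE) do
      system('git', 'fetch', '--prune', 'origin') or halt 500   # origin points at GitLab
      system('git', 'push', '--mirror', GITHUB_URL) or halt 500 # pushes all branches and tags
    end
    'ok'
  end
end

Run it behind rackup (or any Rack server) and point a GitLab push webhook at /mirror; branches and tags come along because of --mirror.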
EDIT
I solved it in Jenkins.
You can set more than one git remote in a Jenkins job. I used Git Publisher as a Post-Build Action and it worked like a charm, exactly what I wanted.
Option 1 would work, of course.
Option 2 is possible but dangerous, because GitLab Shell automatically symlinks hooks into repositories for you, and those hooks are necessary for permission checks: https://github.com/gitlabhq/gitlab-shell/tree/823aba63e444afa2f45477819770fec3cb5f0159/hooks so I'd rather stay away from it.
Web hooks (option 3) are not directly suitable: they make an HTTP request in a fixed format on certain events, in your case push, not Git protocol requests.
Of course, you could write a server that consumes the hook, clones and pushes, but a service (single push and no deployment) or GitLab CI (which already implements hook management) would be strictly better solutions.
Services would be the best option if someone implemented one: they live in the source tree, would do a single push, and require no extra deployment overhead.
GitLab CI, or other CIs like Jenkins, is the best option currently available. They are essentially already-implemented servers for the webhooks, and they automatically clone for you: all you have to do then is push from them.
The keywords you want to Google for are "gitlab mirror github". That led me to GitLab repository mirroring, for instance. There seems to be no perfect, easy solution today.
Also, this has already been proposed on the feature request forum at http://feedback.gitlab.com/forums/176466-general/suggestions/4614663-automatic-push-to-remote-mirror-repo-after-push-to Always check there ;) Go and upvote the request.
The key difficulty now is how to store the push credentials.
I solved it in Jenkins. You can set more than one git remote in a Jenkins job. I used Git Publisher as a Post-Build Action and it worked like a charm, exactly what I wanted.
I added "-publisher" jobs that run after "" is built successfully. I could have done it in one job, but I decided to split it up. The build jobs are triggered by a webhook in GitLab; the publisher jobs use an @daily schedule from the BuildResultTrigger plugin.
