My employer uses an on-prem bitbucket server, and it echoes back a pull request URL after I do a git push. Is there a way to have a global hook which lets me open this URL directly in my browser every time I git push from anywhere, be it a terminal or an IDE?
Not really: there is no post-push client-side hook.
So, as mentioned here, you would need to make a Git wrapper script (see the sketch after this list) in order to:
detect the push
parse its stderr output
extract the URL
call a browser with it.
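A minimal sketch of such a wrapper (bash), assuming the server echoes back a URL containing "pull-requests" and that xdg-open is available (use open on macOS); the exact URL pattern is an assumption to adjust for your server:

#!/bin/bash
# Wrapper named "git" placed earlier in PATH; "command git" calls the real binary.
output=$(command git "$@" 2>&1)
status=$?
printf '%s\n' "$output"
if [ "$1" = "push" ]; then
  # git push writes its messages to stderr, captured above via 2>&1.
  url=$(printf '%s\n' "$output" | grep -Eo 'https?://[^[:space:]]*pull-requests[^[:space:]]*' | head -n 1)
  [ -n "$url" ] && xdg-open "$url"
fi
exit $status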
My objective is to sync changes across all 4 repos ("web", "android", "iOS" & "config"), with "config" as the single source of truth.
Now of course each individual platform repo has its own content, but at its root there will be a folder with the exact same content as the "config" repo, so:
the root of web will have a folder "config" whose content is identical to the "config" repo
the root of android will have a folder "config" whose content is identical to the "config" repo
etc.
Developers will only make changes to the "config" repo, and once a PR is successfully merged, I wish to create a PR against the "config" folder of each of "web", "android" & "iOS" individually.
I do realize that Bitbucket provides documentation for its API; however, I'm not exactly sure which API suits my needs.
My best guess is that I will have a script, triggered when a PR is merged successfully, that queries the last commit of the master branch by running the command below:
curl -u username:password https://bitbucket.xx.xxx.com/rest/api/1.0/projects/PROJECT/repos/config/branches/default
# {"id":"refs/heads/master","displayId":"master","type":"BRANCH","latestCommit":"afff780e4xxxxxxxx","latestChangeset":"afff780e4xxxxxxxxx","isDefault":true}
Now that I have the latestCommit, I can then query the changes:
curl -u username:password https://bitbucket.xx.xxx.com/rest/api/1.0/projects/PROJECT/repos/config/commits/afff780e4xxxxxxxx
This gives me tons of metadata, but I'm not sure what to do with it in order to proceed and create a new PR in the other repos. I'd appreciate it if anyone could shed some light on this.
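For reference, Bitbucket Server exposes a pull-requests resource on each repo; a hedged sketch of creating a PR in "web" (the branch name config-sync is an assumption, and this presumes the script has already pushed the updated "config" folder to that branch - the API call only opens the PR, it does not copy files):

curl -u username:password -H "Content-Type: application/json" \
  -X POST https://bitbucket.xx.xxx.com/rest/api/1.0/projects/PROJECT/repos/web/pull-requests \
  -d '{"title": "Sync config",
       "fromRef": {"id": "refs/heads/config-sync",
                   "repository": {"slug": "web", "project": {"key": "PROJECT"}}},
       "toRef": {"id": "refs/heads/master",
                 "repository": {"slug": "web", "project": {"key": "PROJECT"}}}}'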
During a Google Cloud Build, is there a way to get information about whether the build is associated with a Pull Request, such as the Pull Request number/id?
It seems that no such substitution variable is available at the moment; ref: https://cloud.google.com/cloud-build/docs/configuring-builds/substitute-variable-values
In GitHub, a single branch can be associated with multiple Pull Requests.
You can look up all PRs associated with a given branch ref using the GitHub API: https://developer.github.com/v3/pulls/
Cloud Build does not currently provide the Pull Request information, but if it did this would probably come from something like the Check Suite data, which also treats PRs as a list.
Not from the GitHub API, but you can get the PR # from the command line:
$ hub pr list -f "%I%n" -h "$(git rev-parse --abbrev-ref HEAD)"
12345
Source: This blog post.
Now they are available as Cloud Build substitution variables. From the official documentation:
Cloud Build provides the following GitHub-specific default substitutions available for pull request triggers:
$_HEAD_BRANCH : head branch of the pull request
$_BASE_BRANCH : base branch of the pull request
$_HEAD_REPO_URL : url of the head repo of the pull request
$_PR_NUMBER : number of the pull request
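A minimal sketch of using these in a cloudbuild.yaml attached to a pull request trigger (the echo step is purely illustrative):

steps:
- name: 'gcr.io/cloud-builders/gcloud'
  entrypoint: 'bash'
  args: ['-c', 'echo "Building PR #$_PR_NUMBER: $_HEAD_BRANCH -> $_BASE_BRANCH"']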
I am making two different Go modules:
source.cloud.google.com/me/a
source.cloud.google.com/me/b
with source.cloud.google.com/me/common as a common lib dependency (to share a model).
I'm trying to go get source.cloud.google.com/me/common (I even manually wrote it in the go.mod file) but I keep receiving the following error:
package source.cloud.google.com/me/common:
unrecognized import path "source.cloud.google.com/me/common"
(parse https://source.cloud.google.com/me/common?go-get=1: no go-import meta tags ())
I have gcloud set up so that I can use app deploy and create new source repositories. I've tried setting up SSH for Google Cloud and attempted to use manual credentials. This works neither locally nor in the Google Cloud Build service.
I want to do two things:
Be able to go get this dependency: source.cloud.google.com/me/common
Be able to integrate this go get into my App Engine automated build pipeline.
Any help would be appreciated.
Configure the repo on https://source.cloud.google.com/
Authorize manual git access: https://source.developers.google.com/p/YOUR_PROJECT_ID/r/YOUR_REPO
In this example: https://source.developers.google.com/p/m/r/common
Your common module path should then be source.developers.google.com/p/m/r/common.git
Run go get source.developers.google.com/p/m/r/common.git in the other module.
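A sketch of what the consuming module ends up with, assuming common exposes a "model" subpackage (the pseudo-version is illustrative; go get records the real one):

// in a .go file of module "a":
import "source.developers.google.com/p/m/r/common.git/model"

// added to a's go.mod by go get:
require source.developers.google.com/p/m/r/common.git v0.0.0-20200101000000-abcdef123456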
I would try the following steps:
Make sure it has manual git access - you can try a git clone from folder "a" to check whether correct git access is in place; delete the clone after it succeeds.
Make sure that you are using HTTPS - looks like you are good in that regard; go get defaults to HTTPS in recent Go versions.
Now, coming to the actual problem: it looks like your private version control system isn't sending the required "go-import" meta tag.
For example, request any GitHub-hosted Go module page with ?go-get=1 and you will see a "go-import" meta tag in the response.
In order to fix it, the VCS server needs to respond with a tag like this when go get tries to download the "common" module:
<meta name="go-import" content="source.cloud.google.com/me/common git https://source.cloud.google.com/me/common">
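To check what a server actually returns, fetch the page the go tool requests; a GitHub repo is shown here as a known-good reference:

curl -s "https://github.com/golang/tools?go-get=1" | grep go-import
# <meta name="go-import" content="github.com/golang/tools git https://github.com/golang/tools.git">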
This works:
go get source.developers.google.com/p/YOUR_PROJECT_ID/r/YOUR_REPO.git
I have a Python Scrapy spider that I want to run at regular intervals on Heroku or similar.
It produces a JSON file that I would like to commit to a Github repo.
Given Heroku or another similar platform, how can I set it up to automatically commit post-run?
You could write an item pipeline that keeps a static list of items.
Give this pipeline a method called spider_closed and use the dispatcher to attach that method as a signal handler for the spider_closed signal.
In spider_closed, use json.dump to serialize your data and save it to a file, then commit it to GitHub.
This repo has good code examples:
https://github.com/dm03514/CraigslistGigs/blob/master/craigslist_gigs/pipelines.py
This seems to be a good library to use for the git part:
https://pypi.python.org/pypi/GitPython/
HTH - if you need any more help I'd be happy to update this response.
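A minimal sketch of the pipeline described above, using the crawler.signals API (current Scrapy) rather than the old dispatcher; the file name, commit message, and the assumption that the spider runs inside a clone of the target repo are all illustrative:

import json

import git  # GitPython
from scrapy import signals


class JsonToGitPipeline(object):
    def __init__(self):
        self.items = []

    @classmethod
    def from_crawler(cls, crawler):
        pipeline = cls()
        crawler.signals.connect(pipeline.spider_closed, signal=signals.spider_closed)
        return pipeline

    def process_item(self, item, spider):
        # Keep every scraped item in memory until the spider finishes.
        self.items.append(dict(item))
        return item

    def spider_closed(self, spider):
        # Serialize the collected items to a file in the working tree.
        with open("items.json", "w") as f:
            json.dump(self.items, f, indent=2)
        # Commit and push with GitPython.
        repo = git.Repo(".")
        repo.index.add(["items.json"])
        repo.index.commit("Update scraped items")
        repo.remote("origin").push()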
I have a main repo and then a development fork of that repo. I work off the dev fork and submit pull requests for code review to the main repo; if a PR gets accepted, my boss merges it into the main repo. We want to set up an event hook similar to "Post-Receive URLs" that will send a POST to my main web app once a pull request is accepted, so it can do a git pull. If I have this right, "Post-Receive URLs" only works if I commit directly to the repository, is that correct? So it won't work if I merge a pull request.
If I have this right, "Post-Receive URLs" only works if I commit directly to the repository, is that correct?
Yes, so it is not activated in the case of a merge done directly within the repo.
And this thread mentions that a "hook on merge" (i.e. on the auto-commit created by a merge) might not work.
A background job in charge of monitoring new commits (and checking whether a commit is the result of a merge by looking at its parents: more than one parent means "merge") is more appropriate; a sketch of that check follows.
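A sketch of that parent check (bash; the job scheduling around it is left out):

# git rev-list --parents prints the commit hash followed by its parents,
# so more than two words on the line means more than one parent, i.e. a merge.
words=$(git rev-list --parents -n 1 HEAD | wc -w)
if [ "$words" -gt 2 ]; then
  echo "HEAD is a merge commit"
fi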