Since GitLab 7.6, or thereabouts, there is a new option to use TeamCity directly from GitLab projects. In the setup there is this message:
The build configuration in TeamCity must use the build number format
%build.vcs.number%. You will also want to configure monitoring of all
branches so merge requests build; that setting is in the VCS root
advanced settings.
I'm not sure how this works. Let's say I have a repository Foo.
I have set up a build on TeamCity to listen to Foo with the branch specification: +:refs/pull/*/merge
I then fork Foo in GitLab as FooFork, make a change, and request a merge FooFork -> Foo.
But nothing happens to test this merge, which is what I was expecting GitLab to do. If I accept the merge, then the build server jumps into action (immediately) and builds twice (master and /ref/master).
I've also set the build configuration to use exactly %build.vcs.number% as the build number, as prescribed, but GitLab doesn't seem to give me any information about the build result.
So I'm a bit confused as to what exactly this GitLab -> TeamCity integration is supposed to do and whether I'm doing something wrong.
I'm currently running GitLab 7.9 and TeamCity 8.1.4.
Update:
It seems this use case was not supported prior to GitLab 8: https://github.com/gitlabhq/gitlabhq/issues/7240
I'm running GitLab 8.0.2 and TeamCity 9.1.1 and am able to run CI builds on branches and merge requests.
I trigger CI builds for specific branches by setting a VCS trigger together with the branch specification +:refs/heads/(xyz*), where xyz is the prefix from our ticket system, since all active branches need to be named after an entry in our issue tracker.
I trigger builds for merge requests via the branch specification +:refs/(merge-requests/*)
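For reference, the combined branch specification field on the VCS root would then look roughly like this (xyz again standing in for the ticket prefix):
+:refs/heads/(xyz*)
+:refs/(merge-requests/*)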
Everything works as expected and lets us know the status of all feature/bug branches and merge requests automatically.
Thanks to Rob's comment linking to the GitLab 8 release notes entry on the merge request spec.
Same problem here. There might be another way, which I'm evaluating right now. Since there's no direct way of getting the merged state from the target MR, you have to build it on your own:
IMO, these are the steps:
1.) init an empty repo: $ git init
2.) add your target repo as a remote: $ git remote add origin git@your-repo:<origin.group>/<origin.repo>.git
3.) add the feature repo to be merged as a second remote: $ git remote add feature git@your-repo:<feature.group>/<feature.repo>.git
4.) fetch both remotes: $ git fetch --all
5.) check out your feature branch: $ git checkout -b <feature.branch> feature/<feature.branch>
6.) check out your original branch: $ git checkout -b <origin.branch> origin/<origin.branch>
7.) rebase the feature branch into your original branch: $ git rebase <feature.branch>
As stated here [1], GitLab CE can fire an event on creation of a merge request,
so all you have to do is build a small service that evaluates those webhooks.
[1] http://doc.gitlab.com/ce/web_hooks/web_hooks.html#merge-request-events
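A rough sketch of such a handler, assuming the merge-request event JSON arrives on stdin and jq is available; the payload field names are assumptions based on the merge-request event documentation and may differ between GitLab versions:
#!/bin/sh
# Hypothetical handler invoked by a minimal webhook receiver with the
# merge-request event JSON on stdin.
set -e
PAYLOAD=$(cat)
SOURCE_URL=$(echo "$PAYLOAD" | jq -r '.object_attributes.source.git_ssh_url')
TARGET_URL=$(echo "$PAYLOAD" | jq -r '.object_attributes.target.git_ssh_url')
SOURCE_BRANCH=$(echo "$PAYLOAD" | jq -r '.object_attributes.source_branch')
TARGET_BRANCH=$(echo "$PAYLOAD" | jq -r '.object_attributes.target_branch')
WORKDIR=$(mktemp -d)
cd "$WORKDIR"
git init .
git remote add origin "$TARGET_URL"    # the merge target
git remote add feature "$SOURCE_URL"   # the branch to be merged
git fetch --all
git checkout -b "$TARGET_BRANCH" "origin/$TARGET_BRANCH"
git rebase "feature/$SOURCE_BRANCH"    # non-zero exit code if the merge request does not apply cleanly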
I want to build a project in Azure Pipelines, and I want to know the idiomatic way to obtain the latest tag, the distance from that tag, and the repo remote path/URL so I can pass those values into the actual build script inside the repository.
Previously our build script would invoke hg log -r . --template with a clever template, but when moving to the Continua CI build server we found that the build agent doesn't have access to the actual repository during a build, and we had to find another way.
I'm assuming the same issue would crop up with Azure Pipelines and haven't quite found the relevant docs yet on artifact versioning.
Many thanks in advance.
For git at least, Azure Pipelines does a full clone of the repo by default, unless you explicitly denote that you're doing a shallow clone (source: https://learn.microsoft.com/en-us/azure/devops/pipelines/repos/pipeline-options-for-git?view=azure-devops).
Deriving the version/tag can be done via normal git commands (e.g. git describe --tags or whatever you prefer), which can then be saved as VSO variables to be accessed in later steps in the same job (see https://learn.microsoft.com/en-us/azure/devops/pipelines/process/variables?view=azure-devops&tabs=yaml%2Cbatch#set-variables-using-expressions for more info on how to do that).
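A hedged sketch of that approach as an inline script step; the variable names latestTag, tagDistance, and repoUrl are made up for illustration:
#!/bin/sh
# Derive version info from the (full) clone and expose it as pipeline variables
# via Azure Pipelines logging commands.
LATEST_TAG=$(git describe --tags --abbrev=0)
TAG_DISTANCE=$(git rev-list "${LATEST_TAG}..HEAD" --count)
REPO_URL=$(git config --get remote.origin.url)
echo "##vso[task.setvariable variable=latestTag]${LATEST_TAG}"
echo "##vso[task.setvariable variable=tagDistance]${TAG_DISTANCE}"
echo "##vso[task.setvariable variable=repoUrl]${REPO_URL}"
Later steps in the same job can then read these as $(latestTag), $(tagDistance), and $(repoUrl).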
SonarQube Server Version 7.0 (build 36138)
SonarQube Branch Plugin 7.0 (build 413)
sonar-maven-plugin:3.4.0.905
Java project
SonarQube is set up with a master branch already.
As part of a Jenkins build job we execute the following command:
mvn sonar:sonar -Dsonar.host.url=<our host> -Dsonar.projectName=<project name> -Dsonar.projectKey=<project name> -Dsonar.branch.name=${BRANCH}
where BRANCH is set to the branch name we are building in Jenkins.
Analysis appears to work when we build our "develop" branch, in that the develop branch shows up in SonarQube if it wasn't there before and the timestamp for the analysis is correct on the server, but there are two issues:
1) I've set "develop" to be a long-lived branch per the instructions in https://docs.sonarqube.org/display/PLUG/Branch+Plugin by modifying the long-lived branch regex in the SQ server to be:
(branch|release|develop)-.*
but I only see the Issues and Code tabs on the "develop" branch display. And in the Jenkins job, I see the message:
[INFO] Branch name: develop, type: short living
which leads me to believe that develop is not being recognized as a long-lived branch.
2) There is no output in the Issues tab. Only the code tab shows anything. But the master branch output shows 225 issues, so I would expect the same list of issues in the develop branch (since they haven't been addressed).
Questions:
Do long-lived branches show all of the same output that you normally see for the master branch, including "Overview"?
Is there something I need to do to specify the "develop" branch as long-lived in the Maven command above?
Any idea why the Issues tab doesn't show anything?
Many thanks,
Wes
In my opinion it is your regex you should check first. You are using (branch|release|develop)-.*; notice the hyphen (-) you have kept after the group.
So SonarQube expects the branch name to start with branch-, release-, or develop-, which develop does not. In your case I believe the regex should be (branch|release|develop).*
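A quick way to sanity-check the two patterns locally, with grep -Ex standing in for the anchored Java regex the plugin applies to the branch name:
echo develop | grep -Ex '(branch|release|develop)-.*' || echo 'no match: a trailing hyphen is required'
echo develop | grep -Ex '(branch|release|develop).*'   # prints "develop", i.e. recognized as long-lived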
Seems you hit this:
There's already a Jira ticket
https://jira.sonarsource.com/browse/MMF-1265
https://community.sonarsource.com/t/long-lived-branches-quality-gate-does-not-fail-in-first-analysis/175
Since travis-ci.org doesn't support bitbucket.org, I need another CI service which supports it and allows managing the build commands in a VCS file (like .travis.yml in Travis).
My rather frustrating research results so far:
semaphoreci.com: projects which are forks aren't listed, even after refreshing the project list
app.shippable.com: signing up with both github.com and bitbucket.org doesn't work
codeship.com: doesn't support running commands as the root user (https://codeship.com/documentation/faq/root-level-access/)
www.snap-ci.com: no support for bitbucket.org (http://www.slant.co/topics/186/~hosted-continuous-integration-services)
I don't get why people would not want to share the CI service build commands in the VCS - the chances of good collaboration without such a feature seem small to me. Even if one adds a script file in the VCS, it still needs to be set up in the CI service, which appears to be an unnecessary step.
A few months ago Bitbucket launched Pipelines. Quoting from the link:
Continuous delivery is now seamlessly integrated into your Bitbucket Cloud repositories.
You may use it on free plans, but next year they will reduce the build minutes for free plans from 500 minutes to 50 minutes, as described in this link.
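The build commands live in a bitbucket-pipelines.yml committed to the repository, which matches the .travis.yml-style workflow asked about. A minimal sketch; the image and script lines are placeholders:
image: atlassian/default-image:2
pipelines:
  default:
    - step:
        script:
          - echo "run your build commands here"
          - ./build.sh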
Also, CircleCI supports Bitbucket. It has a free plan with 1,500 build minutes, and builds can be triggered by a commit or tag in Bitbucket. https://circleci.com/
The company that owns Bitbucket also has a CI product called Bamboo, though most CI services should work with any Git host that provides a webhook.
According to this blog, it is possible to use Travis CI with Bitbucket:
Clone the GitHub repository:
git clone https://github.com/{github_user}/{github_repository}
cd {github_repository}
Add the Bitbucket repository as a submodule:
git submodule add https://bitbucket.org/{bitbucket_user}/{bitbucket_repository}
Add .travis.yml to the root dir:
git:
  submodules: false
before_install:
  - echo -e "machine bitbucket.org\n login $BITBUCKET_USER_NAME\n password $BITBUCKET_USER_PASSWORD" > ~/.netrc
  - git submodule update --init --recursive
where $BITBUCKET_USER_NAME is the Bitbucket username and $BITBUCKET_USER_PASSWORD is a Bitbucket app password.
Open https://travis-ci.org/{github_user}/{github_repository}
A Semaphore CI user can add a fork of a project to their Semaphore account by following these steps on the documentation page. Also, Semaphore builds fork pull requests, and those builds are visible.
There is also (now) an option to use GitLab as a CI/CD server for repositories hosted on Bitbucket.
See the documentation on the GitLab site.
Hello,
I'm using Bamboo to deploy a Java webapp project, triggered by a push to the Git repo. My requirement is to deploy based on two conditions:
"branch is pushed" and
"new commit is tagged as some value"
Is it possible to do this with an existing plugin? If I have to implement it manually, is that possible, and how?
There may be simpler and more direct ways of doing this with the latest Atlassian Bamboo major release (version 5, see https://www.atlassian.com/software/bamboo/deploy). I would certainly embrace some additional automation/deployment workflows around these types of features, but I implemented something akin to what you're asking for without plugins and have been using it quite successfully for eight months.
Here's how it works:
We merge to a testing branch which executes a suite of unit, integration, functional, and workflow tests, and builds various pieces of documentation. This is like your (1): "branch is pushed"
We run a second manual plan which pulls the latest testing branch, tags it, and pushes the tag.
Our third step is to run a deployment plan which deploys the latest tag.
I think steps (1) and (3) will vary widely between applications. Step (2), however, may hit on what you're after. Here are the details regarding that plan and its associated tasks:
1. Check out the testing branch; Force Clean Build is enabled.
2. Use an inline script to add a remote repository. E.g., a GitHub example: git remote add origin git@github.com:<user>/repo.git || exit 0
3. Use the git executable with arguments pull origin testing to ensure we're consistent with the upstream repository.
4. Use the git executable with arguments fetch --all --tags to get the latest tags from all repositories.
5. Use a bash executable with a custom script to change the version of our codebase to what the tag will be named.
6. Use the git executable with arguments push origin --tags to push the tag created in the previous step.
The custom script mentioned in (5) looks like this:
scripts/version.sh ${DATE}                               # updates the version in the codebase to ${DATE}
git commit -am "bumped version"                          # -a so the files modified by version.sh get committed
git tag -af "${DATE}" -m "Build server tagged ${DATE}"
For completeness, I'm using || exit 1 everywhere within the scripts to make sure they fail fast, but I left these out for brevity.
tl;dr: No plugins support what you're asking for, to my knowledge. It is possible; the how will vary for you, and hopefully what I've set forth shows that.
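As an illustration of condition (2) from the question, a deployment plan could also be gated on the current commit actually carrying a tag, with an inline script along these lines (deploy.sh is a placeholder for the real deployment command):
#!/bin/sh
# Only deploy when HEAD is tagged; otherwise exit quietly.
TAG=$(git describe --exact-match --tags HEAD 2>/dev/null) || {
  echo "HEAD is not tagged; skipping deployment."
  exit 0
}
echo "Deploying tag ${TAG}"
./deploy.sh "${TAG}"   # placeholder for the real deployment step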
The Setup
I've got a Mac mini set up with Jenkins pulling a repo down from GitHub and performing an Xcode build. Because the mini is firewalled off from the internet I am polling GitHub for changes every 15 minutes.
Prior to kicking off the Xcode build I have Jenkins execute the following script to bump my project build number and commit the results to the repo:
#!/bin/sh
agvtool bump -all
/usr/local/git/bin/git commit -a -m "This is Jenkins, updating your build numbers, sir."
After the build I have a post-build action set up to run another script that pulls and then pushes from GitHub, like so:
#!/bin/sh
/usr/local/git/bin/git pull origin develop
/usr/local/git/bin/git push origin develop
The Problem
Because Jenkins is performing a commit and a push to GitHub each time it runs, the next poll will find the changes it pushed up last time, creating a new build whether there were other actual changes or not.
The Question
Is there something wonky in my setup that I should be doing differently? I'd like Jenkins to update my Xcode target's build number, then commit and push that result to GitHub. This configuration obviously accomplishes that, but with the side effect of a build every 15 minutes whether anything else changed or not, instead of a build only when needed.
Jenkins is consulting the SHA1 of the lastSuccessfulBuild to determine if there have been changes, but it's doing that before my commit. Is there a way to set the SHA1 of the current build to the hash resulting from the commit of the version number bump? Something like so:
#!/bin/sh
agvtool bump -all
/usr/local/git/bin/git commit -a -m "This is Jenkins, updating your build numbers, sir."
NEW_SHA=$(/usr/local/git/bin/git rev-parse HEAD)
# Some awesome Jenkins-fu to set the SHA1 of the current build to NEW_SHA
This way when Jenkins compares hashes the next time around it won't build unless there have been changes since its commit to bump version numbers.
Thanks for any and all help.
I think this SO question answers your question, but I can't say for certain, as we're not doing build-version increments on a polled branch. I would also do the git push prior to building; that way you won't have to worry about as many merges that could occur if code is pushed during the build. Thankfully, if you keep it how it is, you won't have much to worry about in the way of Xcode project merges.
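For what it's worth, a sketch of that reordering based on the asker's own scripts; using pull --rebase here to avoid merge commits is my own choice, not part of the original setup:
#!/bin/sh
# Pre-build step: bump, commit, and push before running the Xcode build.
agvtool bump -all
/usr/local/git/bin/git commit -a -m "This is Jenkins, updating your build numbers, sir."
/usr/local/git/bin/git pull --rebase origin develop   # pick up any upstream changes first
/usr/local/git/bin/git push origin develop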