Is there an alternative to gitlab-runner? - continuous-integration

I would like to know if there is an alternative to GitLab runners that is compatible with Bitbucket.
The main functionality I'm looking for is to install an executor on my server and use it to run commands on the server where it is installed.
I've tried CircleCI, Bitbucket Pipelines and Travis, but all of those require connecting to my server over SSH.

If you like GitLab CI, you can use it as a CI for Bitbucket.
Just mirror your Bitbucket repo on GitLab and you will be able to use all of GitLab CI's features.
"The following are some possible use cases for repository mirroring:
You migrated to GitLab but still need to keep your project in another source. In that case, you can simply set it up to mirror to GitLab (pull) and all the essential history of commits, tags, and branches will be available in your GitLab instance."
It works on gitlab.com and also on self-hosted GitLab (CE or Enterprise).
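Once the mirror is in place, the usual way to run commands directly on your own server is still gitlab-runner: register a runner with the shell executor on that server and pin jobs to it with a tag. A minimal .gitlab-ci.yml sketch, assuming a runner registered with the (placeholder) tag my-server and a placeholder deploy script:

stages:
  - deploy

deploy_on_my_server:
  stage: deploy
  tags:
    - my-server             # only runners registered with this tag pick up the job
  script:
    - ./scripts/deploy.sh   # placeholder: these commands run directly on the server hosting the runner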

Related

What is the best way to automate deployment with Bitbucket and a remote server (like a DigitalOcean droplet): git hooks or pipelines?

I'm working on a project that is now at the deployment stage. I have a droplet on DigitalOcean, and I could just clone my git repository from Bitbucket inside this droplet and, every time I do a git push to my remote repository, do a git pull inside the droplet. I really don't want to do this manually every time, so I searched for ways to automate it and found two:
Git hooks
https://macarthur.me/posts/deploying-code-with-a-git-hook
This link shows how to do it (I don't like the fact that, after cloning my git repository from Bitbucket, I also have to add a remote pointing to my droplet).
Pipelines
Following "Using BitBucket Pipelines to Deploy onto VPS via SSH Access", I also found this approach, where I just do my git pull inside the pipeline.
So here is my question: between these two approaches, which one is better? The only thing I don't like about the git hooks approach is that every time I clone my Bitbucket repository on a new machine, I have to add a new remote to automate the deployment.
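For reference, the git hook approach boils down to a bare repository on the droplet with a post-receive hook, roughly like this sketch (paths, user, and branch name are placeholders, not the exact script from the article):

# one-time setup on the droplet
git init --bare /home/deploy/site.git

# /home/deploy/site.git/hooks/post-receive (make it executable with chmod +x)
#!/bin/sh
# every push received by the droplet checks the code out into the web root
GIT_WORK_TREE=/var/www/site git checkout -f main

# on the developer machine: add the droplet as an extra remote and push to deploy
git remote add droplet ssh://deploy@my-droplet/home/deploy/site.git
git push droplet main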

How to update a Windows machine with changes done in a git repository

I am planning to do the following:
Copy from a git repository to a Windows machine each time a commit/update is made, and only for that folder. Maybe something like Jenkins can be used for this, but I am unable to determine how to do it.
Check for commits made to the repo (this I have done).
As soon as a commit is made to the repo, trigger a Jenkins job that will apply the change to a Windows server (how to do this?).
If the repository is local, it would be easier to push directly to the Windows machine, assuming it has an SSH server (recent Windows 10 releases can provide one through the optional OpenSSH Server feature).
If the repository is remote, you can configure a webhook to call a Jenkins server for a specific job.
See for instance "Triggering a Jenkins build every time changes are pushed to a Git branch on GitHub" by David Luet
Or you can define a Jenkins pipeline that GitLab-CI can execute.
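For the pipeline route, a minimal declarative Jenkinsfile sketch, assuming the job is started by the webhook (with SCM polling as a fallback); the repository URL, branch, and copy script are placeholders:

pipeline {
    agent any
    triggers {
        // fallback polling; a push webhook from the Git host can also start the job
        pollSCM('H/5 * * * *')
    }
    stages {
        stage('Checkout') {
            steps {
                // placeholder repository URL and branch
                git url: 'https://github.com/your-org/your-repo.git', branch: 'main'
            }
        }
        stage('Deploy to Windows') {
            steps {
                // placeholder: copy the checked-out files to the Windows server,
                // for example with the git bundle approach sketched further down
                sh './scripts/copy-to-windows.sh'
            }
        }
    }
}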
In both cases, your Jenkins job will have to copy the checked out repository.
I would use git bundle to pack the repository into one file (or a simple tar), copy it over to the Windows server, and unpack it there, roughly as sketched below.
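A sketch of the git bundle approach, with placeholder host names and paths:

# on the machine where the Jenkins job checked out the repository
git bundle create repo.bundle --all   # pack the whole repository into a single file

# copy it to the Windows server (scp to its OpenSSH server, a file share, robocopy, ...)
scp repo.bundle user@windows-host:C:/deploy/repo.bundle

# on the Windows server: a bundle can be cloned from (or fetched from) like a remote
git clone C:/deploy/repo.bundle C:/deploy/repo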

Use git or hg repository tag as version in Azure Pipelines

I want to build a project in Azure Pipelines, but I want to know what the idiomatic way is to obtain the latest tag, latest tag distance, and repo remote path/URL in order to pass those values into the actual build script that is inside the repository.
Previously our build script would invoke hg log -r . --template with a clever template, but when moving to the Continua CI build server we found that the build agent doesn't have access to the actual repository during a build, and we had to find another way.
I'm assuming the same issue would crop up with Azure Pipelines and haven't quite found the relevant docs yet on artifact versioning.
Many thanks in advance.
For git at least, Azure Pipelines does a full clone of the repo by default, unless you explicitly denote that you're doing a shallow clone (source: https://learn.microsoft.com/en-us/azure/devops/pipelines/repos/pipeline-options-for-git?view=azure-devops).
Deriving the version/tag can be done via normal git commands (e.g. git describe --tags or whatever you prefer), which can then be saved as VSO variables to be accessed in later steps of the same job (see https://learn.microsoft.com/en-us/azure/devops/pipelines/process/variables?view=azure-devops&tabs=yaml%2Cbatch#set-variables-using-expressions for more info on how to do that).
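A sketch of that idea in azure-pipelines.yml, assuming a git repository; the variable name and build script are placeholders:

steps:
  - checkout: self
    fetchDepth: 0          # 0 = no shallow fetch, so tags and full history are available
  - bash: |
      TAG=$(git describe --tags --always)
      echo "Detected version: $TAG"
      # make the value available to later steps in this job as $(repoVersion)
      echo "##vso[task.setvariable variable=repoVersion]$TAG"
    displayName: Derive version from tags
  - bash: ./build.sh --version "$(repoVersion)"   # placeholder build script
    displayName: Build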

Is there a CI service for bitbucket.org which allows managing build commands in a VCS file?

Since travis-ci.org doesn't support bitbucket.org, I need another CI service which supports it and allows managing the build commands in a VCS file (like .travis.yml in Travis).
My rather frustrating research results so far are:
semaphoreci.com: projects which are forks aren't listed even after refreshing the project list
app.shippable.com: signing up with both github.com and bitbucket.org doesn't work
codeship.com: doesn't support running commands as the root user (https://codeship.com/documentation/faq/root-level-access/)
www.snap-ci.com: no support for bitbucket.org (http://www.slant.co/topics/186/~hosted-continuous-integration-services)
I don't get why people would not want to keep the CI build commands in the VCS; the chances of good collaboration without such a feature seem small to me. Even if one adds a script file to the VCS, it still needs to be set up in the CI service, which appears to be an unnecessary step.
A few months ago Bitbucket launched Pipelines. Quoting from the link:
Continuous delivery is now seamlessly integrated into your Bitbucket Cloud repositories.
You can use it on free plans, but next year they will reduce the build minutes for free plans from 500 to 50, as described in this link.
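Pipelines is configured from a bitbucket-pipelines.yml committed to the repository, which is exactly the "build commands in a VCS file" model. A minimal sketch; the image and commands are placeholders:

image: node:18            # placeholder build image

pipelines:
  default:                # runs on every push
    - step:
        script:
          - ./build.sh    # placeholder: the build commands live here, in the repo
          - ./run-tests.sh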
Also, CircleCI supports Bitbucket. It has a free plan with 1500 build minutes and can be triggered by a commit or a tag in Bitbucket. https://circleci.com/
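CircleCI is likewise driven by a file kept in the repository (.circleci/config.yml). A minimal sketch with a tag filter, assuming a v-prefixed tag scheme; the image and build command are placeholders:

version: 2.1

jobs:
  build:
    docker:
      - image: cimg/base:stable     # placeholder image
    steps:
      - checkout
      - run: ./build.sh             # placeholder build command

workflows:
  build-on-commit-or-tag:
    jobs:
      - build:
          filters:
            tags:
              only: /^v.*/          # also run the job for v* tags, not just branch pushes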
The company that owns Bitbucket (Atlassian) also has a CI product called Bamboo, though most CI servers should work with any git host that provides a webhook.
According to this blog, it is possible to use Travis-CI for Bitbucket:
Clone github repository:
git clone https://github.com/{github_user}/{github_repository}
cd {github_repository}
Add the Bitbucket repository as a submodule:
git submodule add https://bitbucket.org/{bitbucket_user}/{bitbucket_repository}
Add .travis.yml to root dir:
git:
  submodules: false
before_install:
  - echo -e "machine bitbucket.org\n login $BITBUCKET_USER_NAME\n password $BITBUCKET_USER_PASSWORD" > ~/.netrc
  - git submodule update --init --recursive
$BITBUCKET_USER_NAME is the Bitbucket username
$BITBUCKET_USER_PASSWORD is an app password
Open https://travis-ci.org/{github_user}/{github_repository}
A Semaphore CI user can add a fork of a project to their Semaphore account by following these steps on the documentation page. Semaphore also builds fork pull requests, and those builds are visible.
There is also (now) an option to use GitLab as a CI/CD server for a repository hosted on Bitbucket.
See the documentation on the GitLab site.

TFS 2013 CI build from a remote Git repo

Is it possible to schedule a CI build in TFS to clone a "remote Git repository" (e.g. from GitHub or GitLab) and run the build?
I know that TFS 2013 can work with a self-hosted Git repo, but can I make a "TFS build controller" get the code from a remote repo and build it, i.e. clone/pull from GitHub?
I am only interested in running the continuous integration build, not in using Visual Studio as a git client (I use SourceTree as my git client).
If you create a self-hosted Git repo in TFS and clone your remote repo into it, you will be able to configure a build for that folder. Then, as part of the "pre-build" script, you can run whatever you like to get the right bits to where they need to be.
Ideally, you would customise the build template to run a pre-get script that syncs the repos before TF Build gets the code, as in the sketch below.
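A sketch of such a sync step, assuming the TFS-hosted clone has the external repository configured as a remote named upstream; the URL and branch name are placeholders:

# one-time setup in the TFS-hosted clone: point an extra remote at the external host
git remote add upstream https://github.com/your-org/your-repo.git

# pre-get sync: pull the latest commits from GitHub and push them into the TFS repo
git fetch upstream
git checkout master
git merge --ff-only upstream/master
git push origin master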
