How to use GitLab as local build and deploy tool? - continuous-integration

I want to build and deploy my projects with a GitLab pipeline hosted locally and used solely for build/deploy (the sources are hosted elsewhere). Everything GitLab related is neatly stored within a my_gitlab folder:
my_gitlab
├── config
├── data
├── docker-compose.yaml
└── logs
and it all runs with a single docker-compose up -d command. Runners, users, keys, etc. are all set up and persist between reboots. my_gitlab occupies 764 KB of disk space and can be pushed to a git repo to share the local build/deploy setup.
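For reference, a minimal sketch of what such a docker-compose.yaml could look like (the image tag, hostname, and ports here are assumptions, not necessarily the exact setup):

version: "3"
services:
  gitlab:
    # single GitLab CE container; config/logs/data are bind-mounted into my_gitlab
    image: gitlab/gitlab-ce:latest
    hostname: gitlab.local
    ports:
      - "8080:80"   # web UI / API
      - "2222:22"   # git over SSH
    volumes:
      - ./config:/etc/gitlab
      - ./logs:/var/log/gitlab
      - ./data:/var/opt/gitlab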
The only problem is that you cannot initiate a pipeline by pointing at a sources directory - you have to push the sources, with a .gitlab-ci.yml in them, to this locally hosted GitLab. Each such push causes the my_gitlab dir to grow to 200 MB+ in size.
Is there a way to strip repository data from GitLab, or to initiate a pipeline without pushing code? Is this even a reasonable way to use GitLab?

You can use GitLab's web interface to start a new pipeline without pushing any code.
In your project, go to CI/CD -> Pipelines -> Run Pipeline in the left sidebar and select your branch.
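If you want to avoid the UI as well, GitLab also has a pipeline trigger API. A minimal sketch, assuming you have created a trigger token under the project's Settings -> CI/CD -> Pipeline triggers and know the project's numeric ID (host and port depend on your local setup):

curl -X POST \
  -F token=<trigger-token> \
  -F ref=master \
  http://localhost:8080/api/v4/projects/<project-id>/trigger/pipeline

Note that this only starts a pipeline for code already pushed to that GitLab instance, so it does not remove the need to push the sources.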

Related

GitLab Runner configuration to ignore a folder built on the server

I'm new to GitLab CI. Every time GitLab CI runs, it replaces the old folder on the server. I want to reduce the Gradle build time for a project that includes DL4J (which is very large and takes a long time to build), so I want to keep the build folder from the previous run. I followed this to reduce the build time with Gradle.
Question: Is it possible, via the GitLab CI config, to skip some folders so that they are kept? This is my gitlab-ci config:
stages:
  - build

something_run:
  stage: build
  script:
    - gradle build
    - systemctl restart myproject
  tags:
    - ml
  only:
    - master
When it runs, Gradle builds the project and the build takes quite a long time. I want the next CI run not to delete the previous build output.
Take a look at cache (https://docs.gitlab.com/ee/ci/yaml/#cache)
cache is used to specify a list of files and directories which should be cached between jobs.
GitLab CI/CD provides a caching mechanism that can be used to save time when your jobs are running.
See also https://docs.gitlab.com/ee/ci/caching/index.html
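For example, a minimal sketch of the job above with a cache entry for the Gradle output (the cached path and the key are assumptions; adjust them to your project layout):

stages:
  - build

something_run:
  stage: build
  # keep the Gradle output between pipeline runs instead of starting from scratch
  cache:
    key: "$CI_COMMIT_REF_SLUG"
    paths:
      - build/
  script:
    - gradle build
    - systemctl restart myproject
  tags:
    - ml
  only:
    - master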

Terraform modules cache

I have a .terraform/modules folder generated by Terraform itself.
It's where Terraform keeps modules by default, and I'm fine with that.
When running terraform init, if the .terraform folder is gone, Terraform will pull the modules again. I would like to avoid that step by telling it to use a pre-populated modules folder from a different location - essentially a shared cache folder for Terraform in our CI/CD pipelines: pull only if a new version of a module is specified, otherwise use the cache.
NOTE:
We don't run anything on Jenkins locally; every `Stage` in Jenkins uses ephemeral Docker container agents to run all the `Steps` and to keep Jenkins clean, otherwise I would use a local workspace cache for all that.
Is there a way to do that?
Thank you.
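One possible approach (not from the original thread, and assuming the ephemeral agents can mount a shared volume, here called /shared/tf-cache) is to relocate Terraform's data directory with the TF_DATA_DIR environment variable so that terraform init finds the previously downloaded modules:

# point Terraform's data directory (normally ./.terraform) at a shared, pre-populated location
export TF_DATA_DIR=/shared/tf-cache/my-stack
# init can then reuse what is already in $TF_DATA_DIR/modules;
# whether anything is re-downloaded still depends on the module sources and pinned versions
terraform init -input=false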

Use git or hg repository tag as version in Azure Pipelines

I want to build a project in Azure Pipelines, but I want to know what the idiomatic way is to obtain the latest tag, latest tag distance, and repo remote path/URL in order to pass those values into the actual build script that is inside the repository.
Previously our build script would invoke hg log -r . --template with a clever template, but we found when moving to Continua CI build server that the build agent doesn't have access to the actual repository during a build, and had to find another way.
I'm assuming the same issue would crop up with Azure Pipelines and haven't quite found the relevant docs yet on artifact versioning.
Many thanks in advance.
For git at least, Azure Pipelines does a full clone of the repo by default, unless you explicitly denote that you're doing a shallow clone (source: https://learn.microsoft.com/en-us/azure/devops/pipelines/repos/pipeline-options-for-git?view=azure-devops).
Deriving the version/tag can be done via normal git commands (i.e. git describe --tags or whatever you prefer), which can then be saved as VSO variables to be accessed in later steps in the same job (see https://learn.microsoft.com/en-us/azure/devops/pipelines/process/variables?view=azure-devops&tabs=yaml%2Cbatch#set-variables-using-expressions for more info on how to do that).
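A minimal sketch of that approach in pipeline YAML (the variable name, the describe format, and the build.sh call are assumptions standing in for whatever script lives in the repository):

steps:
  - checkout: self
  - script: |
      # derive a version string from the latest tag and the distance to it
      VERSION=$(git describe --tags --long)
      # expose it to later steps in the same job as $(packageVersion)
      echo "##vso[task.setvariable variable=packageVersion]$VERSION"
    displayName: Derive version from git tags
  - script: ./build.sh "$(packageVersion)"
    displayName: Run the build script with the derived version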

How to discover Jenkinsfile in subfolders

I have a project with two folders that are independent and need separate builds in Jenkins (running v2.74).
My structure is
folder
├── project1
│   └── Jenkinsfile
└── project2
    └── Jenkinsfile
When I click "scan organization" in Jenkins, it doesn't discover the Jenkinsfiles in the subdirectories.
Here is a sample from the "Scan organization log":
Proposing kg-pipeline
Examining my-test-project
Checking branches...
Getting remote branches...
Checking branch jenkins
‘Jenkinsfile’ not found
Does not meet criteria
Checking branch master
‘Jenkinsfile’ not found
Does not meet criteria
2 branches were processed
Finished examining my-test-project
I didn't touch the configuration of the job that scans the organization and finds branches with Jenkinsfiles. Here is the current setting for the project:
My question is: How do I configure Jenkins to see each folder individually? I am also interested in links to example projects set up this way.
There is a simple solution:
Go to JENKINS_URL/job/JOB_NAME/configure.
Under Build Configuration, select "by Jenkinsfile" for Mode.
Under Script Path, set it to DIR_NAME/Jenkinsfile.
For instance, if your Jenkinsfile is inside a src directory, you would set Script Path to src/Jenkinsfile and Jenkins will then be able to find it.
Currently trying to set up the same thing - this might help, even though it's not a complete answer:
According to this, you should be able to set up two Multibranch Pipeline projects, each with the configuration Mode "by Jenkinsfile".
Support for using custom paths for the Jenkinsfile was added in JENKINS-34561.
You can add more than one "Pipeline Jenkinsfile" entry under Project Recognizers, with your paths to the Jenkinsfiles.

Is there a CI service for bitbucket.org which allows managing build commands in a VCS file?

Since travis-ci.org doesn't support bitbucket.org, I need another CI service which supports it and allows managing the build commands in a VCS file (like .travis.yml in Travis).
My rather frustrating research results so far:
semaphoreci.com: projects which are forks aren't listed even after refreshing the project list
app.shippable.com: signing up with both github.com and bitbucket.org doesn't work
codeship.com: doesn't support running commands as the root user (https://codeship.com/documentation/faq/root-level-access/)
www.snap-ci.com: no support for bitbucket.org (http://www.slant.co/topics/186/~hosted-continuous-integration-services)
I don't get why people would not want to share the CI service's build commands in the VCS - the chances of good collaboration without such a feature seem small to me. Even if one adds a script file to the VCS, it still needs to be set up in the CI service, which appears to be an unnecessary step.
A few months ago Bitbucket launched Pipelines. Quoting from the link:
Continuous delivery is now seamlessly integrated into your Bitbucket Cloud repositories.
You can use it on free plans, but next year they will reduce the build minutes for free plans from 500 to 50, as stated in this link.
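Pipelines is configured from a bitbucket-pipelines.yml committed to the repository, which covers the "build commands in a VCS file" requirement. A minimal sketch (the image and the build commands are assumptions):

image: node:18

pipelines:
  default:
    - step:
        name: Build and test
        script:
          - npm ci
          - npm test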
Also, CircleCI supports Bitbucket. It has a free plan with 1500 build minutes, and builds can be triggered by a commit or tag in Bitbucket. https://circleci.com/
The company that owns Bitbucket also has a CI product called Bamboo, though most CI services should work with any git host that provides a webhook.
According to this blog, it is possible to use Travis-CI for Bitbucket:
Clone the GitHub repository:
git clone https://github.com/{github_user}/{github_repository}
cd {github_repository}
Add the Bitbucket repository as a submodule:
git submodule add https://bitbucket.org/{bitbucket_user}/{bitbucket_repository}
Add .travis.yml to root dir:
git:
  submodules: false
before_install:
  - echo -e "machine bitbucket.org\n login $BITBUCKET_USER_NAME\n password $BITBUCKET_USER_PASSWORD" >~/.netrc
  - git submodule update --init --recursive
$BITBUCKET_USER_NAME is the Bitbucket username; $BITBUCKET_USER_PASSWORD is an app password.
Open https://travis-ci.org/{github_user}/{github_repository}
A Semaphore CI user can add a fork of a project to their Semaphore account by following these steps on the documentation page. Semaphore also builds fork pull requests, and those builds are visible.
There is (now) also an option to use GitLab as a CI/CD server for a repository hosted on Bitbucket.
See the documentation here: on the GitLab site
