Is it possible to push to gitlab from a gitlab job - gradle

I'd like to push tags to the gitlab repository for which a job is running.
I'm using the Gradle plugin reckon which is using the grgit/JGit API. Reckon is managing semantic versioning and is able to create and push a tag to a Git repository.
First I want to get this running on GitLab SaaS, and I assume I need some kind of token so that, for security reasons, I don't have to pass my personal credentials.
Later I also have to get it running in a self-hosted GitLab environment, but I would expect it to work the same way in both.
There is something like a deploy key, but I really can't find any references on how to use one for this. Then again, maybe deploy keys aren't made for this kind of operation.
.release-template:
  stage: release
  image: adoptopenjdk:11-jdk-hotspot
  dependencies:
    - deliver
  script:
    - |
      ./gradlew reckonTagPush -Preckon.scope=$scope -Preckon.stage=$stage \
        -Dorg.ajoberstar.grgit.auth.username=${???} \
        -Dorg.ajoberstar.grgit.auth.password=${???}
  artifacts:
    paths:
      - build/
  #only:
  #  - master
  when: manual # ONLY MANUAL RELEASES, ONLY FROM MASTER
release-major:
  extends: .release-template
  variables:
    scope: major
    stage: final

release-minor:
  extends: .release-template
  variables:
    scope: minor
    stage: final

release-patch:
  extends: .release-template
  variables:
    scope: patch
    stage: final

If all you're doing is adding a git tag to the repo, and not adding commits, merging branches, etc., you can simply use the Tags API to create a new tag:
curl --request POST --header "PRIVATE-TOKEN: $CI_JOB_TOKEN" "https://gitlab.example.com/api/v4/projects/:project_id:/repository/tags?tag_name=test&ref=master"
The $CI_JOB_TOKEN variable is a predefined variable automatically provided to running jobs by GitLab. It holds a non-admin API token, which should be fine for the Tags API.
If you were using other APIs that require admin permissions, you'd have to use a personal access token of an admin.
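Following that approach, a minimal sketch of a manual tagging job in .gitlab-ci.yml; the tag name v1.0.0 is illustrative, while $CI_PROJECT_ID and $CI_API_V4_URL are predefined variables:

```yaml
tag_release:
  stage: release
  image: curlimages/curl:latest
  script:
    # create the tag via the Tags API; the job token is supplied by GitLab
    - >
      curl --request POST
      --header "PRIVATE-TOKEN: $CI_JOB_TOKEN"
      "$CI_API_V4_URL/projects/$CI_PROJECT_ID/repository/tags?tag_name=v1.0.0&ref=master"
  when: manual
```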

Related

serverless remove Lambda using GitLab CI

I'm using gitlab CI for deployment.
I'm running into a problem when the review branch is deleted.
stop_review:
  variables:
    GIT_STRATEGY: none
  stage: cleanup
  script:
    - echo "$AWS_REGION"
    - echo "Stopping review branch"
    - serverless config credentials --provider aws --key ${AWS_ACCESS_KEY_ID} --secret ${AWS_SECRET_ACCESS_KEY}
    - echo "$CI_COMMIT_REF_NAME"
    - serverless remove --stage=$CI_COMMIT_REF_NAME --verbose
  only:
    - branches
  except:
    - master
  environment:
    name: review/$CI_COMMIT_REF_NAME
    action: stop
  when: manual
The error is: "This command can only be run in a Serverless service directory. Make sure to reference a valid config file in the current working directory if you're using a custom config file."
I have tried different GIT_STRATEGY values; can someone point me in the right direction?
In order to run serverless remove, you'll need to have the serverless.yml file available, which means the actual repository will need to be cloned (or that file needs to get to GitLab in some way).
A serverless.yml configuration file is required when you run serverless remove because the Serverless Framework lets users provision infrastructure not only through the framework's YML configuration but also through additional resources (like raw CloudFormation in AWS), which may or may not live outside the specified app or stage CF stack entirely.
In fact, you can also provision infrastructure into other providers (AWS, GCP, Azure, OpenWhisk, or any combination of these).
So it's not sufficient to simply identify the stage name when running sls remove; you'll need the full serverless.yml template.
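One way around it, assuming serverless.yml sits at the repository root: let the runner check the repository out instead of using GIT_STRATEGY: none. (Note that if the branch ref has already been deleted, the checkout itself can fail; in that case, passing serverless.yml along as an artifact from an earlier stage is the fallback.)

```yaml
stop_review:
  variables:
    # 'fetch' (the default) checks out the repo, so serverless.yml is present
    GIT_STRATEGY: fetch
  stage: cleanup
  script:
    - serverless config credentials --provider aws --key ${AWS_ACCESS_KEY_ID} --secret ${AWS_SECRET_ACCESS_KEY}
    - serverless remove --stage=$CI_COMMIT_REF_NAME --verbose
  environment:
    name: review/$CI_COMMIT_REF_NAME
    action: stop
  when: manual
```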

Are there any advantages to Gitlab-CI api triggers vs using the trigger keyword?

In 11.8, Gitlab CI introduced the trigger
keyword to trigger a pipeline in another project.
staging:
  stage: deploy
  trigger:
    project: my/deployment
    branch: stable
Before that, the conventional way of triggering another pipeline was to make a POST request to the API.
build_docs:
  stage: deploy
  script:
    - curl --request POST --form "token=$CI_JOB_TOKEN" --form ref=master https://gitlab.example.com/api/v4/projects/9/trigger/pipeline
  only:
    - tags
Are there any reasons to continue using the older api method of triggering multi-project pipelines? Are there any advantages to that method vs the newer trigger keyword?
Trigger jobs can use only a limited set of the GitLab CI/CD configuration keywords.
https://docs.gitlab.com/ee/ci/pipelines/multi_project_pipelines.html#trigger-job-configuration-keywords
If you need before_script, script, or after_script in the trigger job itself, then you need to use the API.
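For instance, a job that must run a command before firing the downstream pipeline has to stay with the API, because a trigger: job cannot carry a script section. The project ID, URL, and prepare step below are illustrative:

```yaml
deploy_and_trigger:
  stage: deploy
  script:
    - ./prepare-release.sh   # hypothetical pre-trigger step; not possible in a trigger job
    - curl --request POST --form "token=$CI_JOB_TOKEN" --form ref=stable "https://gitlab.example.com/api/v4/projects/9/trigger/pipeline"
```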

How to execute gitlab-ci jobs on specific events

I'm learning gitlab-ci and I'm having a difficult time setting up the .yml file to run a specific job only when a certain trigger token is used or when a branch is merged into master.
I've read through gitlab-ci docs and reviewed several examples. Still, I'm not seeing what I'm looking for.
*Edit: answering part of my own question: using only: - master should run the job only for merges and pushes to the master branch.
.build_template: &base_defs
  stage: build_base
  <<: *tags_defs
  variables:
    FILE_VER: "3.4"
  script:
    - docker build -t "${DEV_BASE}:latest" "${VERSION}/devel/base"
      --build-arg FILE_VERSION=${FILE_VER}
  only:
    - master
    - ~ WHEN TRIGGER TOKEN MATCHES = K3K3K3K3 ~
Maybe you can use

only:
  variables:
    - $TOKEN == "..."

(variable expressions must reference variables with a $ prefix) and make it work with one of the predefined GitLab variables?
Reference: GitLab Docs
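A sketch of that idea: pass a custom variable in the trigger request (e.g. --form "variables[DEPLOY_KIND]=base" on the trigger API call) and match it alongside the branch condition. DEPLOY_KIND is an assumed name here, not a predefined variable:

```yaml
build_base:
  stage: build_base
  script:
    - docker build -t "${DEV_BASE}:latest" "${VERSION}/devel/base"
  only:
    variables:
      # either a push to master, or a triggered pipeline carrying DEPLOY_KIND=base
      - $CI_COMMIT_REF_NAME == "master"
      - $DEPLOY_KIND == "base"
```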

Gitlab-CI multi-project-pipeline

currently I'm trying to understand the Gitlab-CI multi-project-pipeline.
I want to run a pipeline once another pipeline has finished.
Example:
I have one project nginx saved in namespace baseimages which contains some configuration like fast-cgi-params. The ci-file looks like this:
stages:
  - release
  - notify

variables:
  DOCKER_HOST: "tcp://localhost:2375"
  DOCKER_REGISTRY: "registry.mydomain.de"
  SERVICE_NAME: "nginx"
  DOCKER_DRIVER: "overlay2"

release:
  stage: release
  image: docker:git
  services:
    - docker:dind
  script:
    - docker build -t $SERVICE_NAME:latest .
    - docker tag $SERVICE_NAME:latest $DOCKER_REGISTRY/$SERVICE_NAME:latest
    - docker push $DOCKER_REGISTRY/$SERVICE_NAME:latest
  only:
    - master

notify:
  stage: notify
  image: appropriate/curl:latest
  script:
    - curl -X POST -F token=$CI_JOB_TOKEN -F ref=master https://gitlab.mydomain.de/api/v4/projects/1/trigger/pipeline
  only:
    - master
Now I want to have multiple projects to rely on this image and let them rebuild if my baseimage changes e.g. new nginx version.
            baseimage
                |
    ---------------------------
    |           |             |
project1    project2     project3
If I add a trigger to the other project and insert the generated token at $GITLAB_CI_TOKEN, the foreign pipeline starts, but there is no combined graph as shown in the documentation (https://docs.gitlab.com/ee/ci/multi_project_pipelines.html).
How is it possible to show the full pipeline graph?
Do I have to add every project which relies on my baseimage to the CI file of the baseimage, or is it possible to subscribe to the baseimage pipeline in each project?
Multi-project pipelines were a paid feature introduced in GitLab Premium 9.3, and could only be accessed on GitLab's Premium or Silver tiers.
(The tier badge is shown to the right of the documentation page title.)
After some more digging into the documentation, I found a sentence stating that GitLab CE provides the features marked as Core features.
We have 50+ Gitlab packages where this is needed. What we used to do was push a commit to a downstream package, wait for the CI to finish, then push another commit to the upstream package, wait for the CI to finish, etc. This was very time consuming.
The other thing you can do is manually trigger builds and you can manually determine the order.
If none of this works for you or you want a better way, I built a tool to help do this called Gitlab Pipes. I used it internally for many months and realized that people need something like this, so I did the work to make it public.
Basically it listens to GitLab notifications, and when it sees a commit to a package, it reads the .gitlab-pipes.yml file to determine that project's dependencies. It is able to construct a dependency graph of your projects and build the consumer packages on downstream commits.
The documentation is here, it sort of tells you how it works. And then the primary app website is here.
If you click the version history link from multi_project_pipelines, it reveals:
"Made available in all tiers in GitLab 12.8."
Multi-project pipeline visualization as of 13.10-pre is marked as Premium; however, in my EE version the visualizations for downstream/upstream links are functional.
So, see Triggering a downstream pipeline using a bridge job:
Before GitLab 11.8, it was necessary to implement a pipeline job that was responsible for making the API request to trigger a pipeline in a different project.
In GitLab 11.8, GitLab provides a new CI/CD configuration syntax to make this task easier, and avoid needing GitLab Runner for triggering cross-project pipelines. The following illustrates configuring a bridge job:
rspec:
  stage: test
  script: bundle exec rspec

staging:
  variables:
    ENVIRONMENT: staging
  stage: deploy
  trigger: my/deployment
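If the upstream pipeline should also wait for, and mirror, the downstream result, the bridge job can add strategy: depend; a sketch against the same my/deployment project:

```yaml
staging:
  stage: deploy
  trigger:
    project: my/deployment
    branch: stable
    strategy: depend  # upstream job status mirrors the downstream pipeline
```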

Is it possible to add CI info in push?

We are using GitLab CE and GitLab Runner for our CI/CD on our stage servers. We have a branch, let's say dev1, where we need to run different tasks for different changes.
E.g. for frontend changes we need to start a compiler, and for backend changes we need to run PHPUnit.
Can I decide in the push what kind of pipeline I want to start? I saw tags, but they seem to mean different things in Git (versioning) and GitLab (runners).
Is there a best practice for this use case, or do I have to use two different branches?
You can define two manual jobs for the dev1 branch and decide on your own which one to invoke.
run-php-unit:
  stage: build
  script:
    - echo "Running php unit"
  when: manual
  only:
    - dev1

start-compiler:
  stage: build
  script:
    - echo "Starting compiler"
  when: manual
  only:
    - dev1
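Alternatively, assuming GitLab 12.5+ with push options enabled, you can decide directly in the push itself: git push -o ci.variable="BUILD_KIND=frontend" sets a pipeline variable (BUILD_KIND is an assumed name) that rules: can match:

```yaml
start-compiler:
  stage: build
  script:
    - echo "Starting compiler"
  rules:
    # runs only when pushed with: git push -o ci.variable="BUILD_KIND=frontend"
    - if: '$CI_COMMIT_BRANCH == "dev1" && $BUILD_KIND == "frontend"'
```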
