How to reference another yml file from the main github action yaml file? - yaml

I'm defining a GitHub Actions workflow that references another YAML file, hoping to organise the configuration in a cleaner way.
Here is my job file, named deploy.yml, located at ./.github/workflows/, where the first . is the root folder of my project.
....
jobs:
  UnitTest:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v3
      - uses: ./.github/workflows/unittest.yml
In the same ./.github/workflows/ folder, I created another file called unittest.yml as below:
name: "UnitTest"
description: "Perform Unit Test"
runs:
# using: "composite"
- name: Dependency
run: |
echo "Dependency setup commands go here"
- name: UnitTest
run: make test.unit
However, when I tried to test the script locally using act with the command act --secret-file .secrets --container-architecture linux/amd64, I received the following error:
[Deploy/UnitTest] ✅ Success - Main actions/checkout@v3
[Deploy/UnitTest] ⭐ Run Main ./.github/workflows/unittest.yml
[Deploy/UnitTest] ❌ Failure - Main ./.github/workflows/unittest.yml
[Deploy/UnitTest] file does not exist
[Deploy/UnitTest] 🏁 Job failed
I have tried using just the file name unittest.yml, or ./unittest.yml, or myrepo_name/.github/workflows/unittest.yml, and putting the file into a subfolder as step 2 of this document illustrates, but no luck.
Based on examples of runs for composite actions, I would imagine this should work.
Would anyone please advise?
P.S. You might have noticed the commented line using: "composite" in unittest.yml. If I uncomment that line, I receive the error:
Error: yaml: line 3: did not find expected key

Composite actions are not referenced by YAML file, but by folder. In that folder, you are expected to have an action.yml describing the action.
This is why you're getting the error with using: "composite": you're defining a workflow (because it's in ./.github/workflows), but you are using action syntax.
I would advise this folder structure:
.github/
|-- workflows/
|   |-- deploy.yml
unittest-action/
|-- action.yml
With this structure, you should be able to reference the action with
- uses: actions/checkout@v3
- uses: ./unittest-action
Please see the docs for more information.
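For reference, a minimal unittest-action/action.yml for that folder could look roughly like this (a sketch based on the steps from the question; note that run steps in composite actions need an explicit shell):
name: "UnitTest"
description: "Perform Unit Test"
runs:
  using: "composite"
  steps:
    - name: Dependency
      shell: bash
      run: |
        echo "Dependency setup commands go here"
    - name: UnitTest
      shell: bash
      run: make test.unit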
Depending on your use-case and setup, you might also want to consider reusable workflows.
You can define a reusable workflow in your .github/workflows directory like so:
# unittest.yml
on: workflow_call

jobs:
  deploy:
    # ...
and then call it like so:
jobs:
  UnitTest:
    uses: ./.github/workflows/unittest.yml
Note how the reusable workflow is an entire job. This means you can't do the checkout from the outside and then just run the unit test in the reusable job; the reusable workflow (unittest.yml) needs to do the checkout itself first.
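Putting that together, a reusable .github/workflows/unittest.yml might look roughly like this (a sketch, assuming the same make test.unit target as in the question):
# .github/workflows/unittest.yml
name: "UnitTest"
on: workflow_call

jobs:
  unittest:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v3
      - name: UnitTest
        run: make test.unit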
Which one to pick?
Here's a blog post summarising some of the differences between composite actions and reusable workflows, like:
reusable workflows can contain several jobs, composite actions only contain steps
reusable workflows have better support for using secrets
composite actions can be nested, but as of Jul '22, reusable workflows can't call other reusable workflows

Related

How can I set gradle/test to work on the same docker environment where other CircleCi jobs are running

I have a CircleCI workflow that has 2 jobs. The second job (gradle/test) is dependent on the first one creating some files for it.
The problem is that the first job runs inside a Docker container while the second job (gradle/test) does not, so gradle/test fails since it cannot find the files the first job created. How can I set gradle/test to work on the same space?
Here is a code of the workflow:
version: 2.1

orbs:
  gradle: circleci/gradle@2.2.0

executors:
  daml-executor:
    docker:
      - image: cimg/openjdk:11.0-node
...
workflows:
  checkout-build-test:
    jobs:
      - daml_test:
          daml_sdk_version: "2.2.0"
          context: refapps
      - gradle/test:
          app_src_directory: prototype
          executor: daml-executor
          requires:
            - daml_test
Can anyone help me configure gradle/test correctly?
CircleCI has a mechanism to share artifacts between jobs called "workspace" (well, they have multiple ones, but workspace is what you want here).
Concretely, you would add this at the end of your daml_test job definition, as an additional step:
- persist_to_workspace:
    root: /path/to/folder
    paths:
      - "*"
and that would add all the files from /path/to/folder to the workspace. On the other side, you can "mount" the workspace in your gradle/test job by adding something like this before the step where you need the files:
- attach_workspace:
    at: /whatever/mountpoint
I like to use /tmp/workspace for the path on both sides, but that's just personal preference.
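Since gradle/test is an orb job rather than one you define yourself, one way to attach the workspace to it is via pre-steps in the workflow definition (a sketch; /tmp/workspace and the persisted paths are assumptions):
workflows:
  checkout-build-test:
    jobs:
      - daml_test:
          daml_sdk_version: "2.2.0"
          context: refapps
      - gradle/test:
          app_src_directory: prototype
          executor: daml-executor
          requires:
            - daml_test
          pre-steps:
            - attach_workspace:
                at: /tmp/workspace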

Bamboo specs YAML issue

I'm trying to build a Bamboo Specs YAML, but I'm getting some weird errors whose messages are not helping much. I followed the documentation here, but it's still not working.
So I have Bamboo 7.2.4 and I'm trying to create a stage:
version:2
stages:
  - run tests:
      jobs:
        - Test
Test:
  tasks:
    - script:whatever
When running this I get
Bamboo YAML import failed: Document structure is incorrect: Tests: Property is required.
No clue what that means nor why it's happening
Bamboo YAML specs are very difficult to troubleshoot. This is one disadvantage of YAML specs compared to Java specs. It looks like you are missing some key and essential properties in your example code. Can you reformat it as below and see?
First, manually create a project (or use an existing project), but make sure to update the project key by replacing <MYKEY> below:
---
version: 2

plan:
  project-key: <MYKEY>
  key: MYPLN
  name: My Plan

stages:
  - run tests:
      jobs:
        - Test

Test:
  key: JB1
  tasks:
    - script:
        - echo 'My Plan'

Having Gitlab Projects calling the same gitlab-ci.yml stored in a central location

I have many GitLab projects that follow the same CI template. Whenever there is a small change in the CI script, I have to manually modify it in each project. Is there a way to store the CI script in a central location and have each project call that script, with some environment variable substitution? For instance:
gitlab-ci.yml in each project
/bin/bash -c "$(curl -fsSL <link_to_the_central_location>.sh)"
gitlab-ci.yml in the central location
stages:
  - build
  - test

build-code-job:
  stage: build
  script:
    - echo "Check the ruby version, then build some Ruby project files:"
    - ruby -v
    - rake

test-code-job1:
  stage: test
  script:
    - echo "If the files are built successfully, test some files with one command:"
    - rake test1

test-code-job2:
  stage: test
  script:
    - echo "If the files are built successfully, test other files with a different command:"
    - rake test2
You do not need curl; GitLab supports this natively via the include directive.
You need a repository where you store your general YAML files (you can choose whether it is a whole CI file or just parts). For this example, let's call this repository ci and assume your GitLab runs at example.com, so the project URL would be example.com/ci. We create two files in there just to show the possibilities.
The first is a whole CI definition, ready to use; let's call the file ci.yml. This approach is not really flexible:
stages:
  - build
  - test

build-code-job:
  stage: build
  script:
    - echo "Check the ruby version, then build some Ruby project files:"
    - ruby -v
    - rake

test-code-job1:
  stage: test
  script:
    - echo "If the files are built successfully, test some files with one command:"
    - rake test1

test-code-job2:
  stage: test
  script:
    - echo "If the files are built successfully, test other files with a different command:"
    - rake test2
The second is a partial CI definition, which is more extendable; let's call this file includes.yml:
.build:
  stage: build
  script:
    - echo "Check the ruby version, then build some Ruby project files:"
    - ruby -v
    - rake

.test:
  stage: test
  script:
    - echo "this script tag will be overwritten"
There is also the option to reference template strings from the YAML itself; please see the GitLab documentation, but it is similar to the second approach.
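On recent GitLab versions this can be done with !reference tags (a hedged sketch, reusing the hidden .build job from includes.yml above):
build-code-job:
  stage: build
  script:
    - !reference [.build, script]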
Then we have our project which wants to use such definitions. So either:
For the whole CI file:
include:
  - project: 'ci'
    ref: master # think about tagging if you need it
    file: 'ci.yml'
As you can see, we are now referencing one YAML file with all the changes in it.
Or with partial extends:
include:
  - project: 'ci'
    ref: master # think about tagging if you need it
    file: 'includes.yml'

stages:
  - build
  - test

build-code-job:
  extends: .build

job1:
  extends: .test
  script:
    - rake test1

job2:
  extends: .test
  script:
    - rake test2
As you can see, you can easily use the includes to get a much more granular setup. Additionally, you could define variables at job1 and job2, e.g. for the test target, and move the script block into includes.yml, as sketched below.
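That could look roughly like this (a sketch; the TEST_TARGET variable name is made up for illustration):
# includes.yml
.test:
  stage: test
  script:
    - rake $TEST_TARGET

# project .gitlab-ci.yml
job1:
  extends: .test
  variables:
    TEST_TARGET: test1

job2:
  extends: .test
  variables:
    TEST_TARGET: test2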
Furthermore, you can also use YAML anchors for the script parts, which looks like this:
includes.yml
.build-scirpt: &build
- echo "Check the ruby version, then build some Ruby project files:"
- ruby -v
- rake
.build:
stage: build
script:
- *build
and you can also use the script anchor within your configuration.
For a deeper explanation you can also take a look at https://docs.gitlab.com/ee/ci/yaml/includes.html

Variable substitution (or overriding) when extending jobs from GitLab templates using include

Using GitLab CI, I have a repo where all my templates are created.
For example, I have a sonar scanner job template named .sonar-scanner.yml:
sonar-analysis:
  stage: quality
  image:
    name: sonar-scanner-ci:latest
    entrypoint: [""]
  script:
    - sonar-scanner
      -D"sonar.projectKey=${SONAR_PROJECT_NAME}"
      -D"sonar.login=${SONAR_LOGIN}"
      -D"sonar.host.url=${SONAR_SERVER}"
      -D"sonar.projectVersion=${CI_COMMIT_SHORT_SHA}"
      -D"sonar.projectBaseDir=${CI_PROJECT_DIR}"
I have included this template as a project like this in the main GitLab CI file:
include:
  - project: 'organization/group/ci-template'
    ref: master
    file: '.sonar-scanner.yml'
So as you can understand, I have a repo named ci-templates where all my templates are created, and in another repo I include and extend these templates.
Finally, when a new merge request is created in a repo, my sonar job runs from another file in my project, test/quality.yml:
sonar:
  stage: quality
  extends:
    - sonar-analysis
  allow_failure: true
  only:
    refs:
      - merge_requests
All is working well except for the substitution (or overriding) of the env vars from my template. I have many Sonar servers and project names, and I would like to know how to override the variables SONAR_SERVER and SONAR_PROJECT_NAME when I extend the job from a template.
In my main .gitlab-ci.yml file I have a variables section, and when I override these variables there, it works.
But that's not really what I want. With many stages and many microservices, the same extended job can be reused in different ways. What I really want is to override these variables directly in the file test/quality.yml.
This, for example, does not work:
sonar:
  stage: quality
  extends:
    - sonar-analysis
  variables:
    SONAR_PROJECT_NAME: foo
    SONAR_SERVER: bar
  allow_failure: true
  only:
    refs:
      - merge_requests
This does not work either:
variables:
  SONAR_PROJECT_NAME: foo
  SONAR_SERVER: bar

sonar:
  stage: quality
  extends:
    - sonar-analysis
  allow_failure: true
  only:
    refs:
      - merge_requests
What is the best way to make this work?
Since this question was asked in February 2020, a new MR, Use non-predefined variables inside CI include blocks, has been merged into GitLab 14.2, which resolves the issue for the overridden job.
The project that does the include can redefine the variables when extending a job:
include:
  - project: 'organization/group/ci-template'
    ref: master
    file: '.sonar-scanner.yml'

sonar:
  stage: quality
  extends:
    - sonar-analysis
  variables:
    SONAR_PROJECT_NAME: foo
    SONAR_SERVER: bar
  allow_failure: true
But in this case you probably want the job in the template to start with a dot (.sonar-analysis instead of sonar-analysis), so that the template doesn't create a real sonar-analysis job (see hidden jobs), as sketched below.
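That could look roughly like this (a sketch of the hidden job in the template and the extending job in the project, reusing names from the question):
# .sonar-scanner.yml in organization/group/ci-template
.sonar-analysis:
  stage: quality
  image:
    name: sonar-scanner-ci:latest
    entrypoint: [""]
  script:
    - sonar-scanner
      -D"sonar.projectKey=${SONAR_PROJECT_NAME}"
      -D"sonar.host.url=${SONAR_SERVER}"

# test/quality.yml in the project
sonar:
  extends:
    - .sonar-analysis
  variables:
    SONAR_PROJECT_NAME: foo
    SONAR_SERVER: bar
  allow_failure: true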
Or you can directly set the variable values (to redefine them) of an existing job in the project that does the include:
include:
  - project: 'organization/group/ci-template'
    ref: master
    file: '.sonar-scanner.yml'

sonar-analysis:
  variables:
    SONAR_PROJECT_NAME: foo
    SONAR_SERVER: bar
I verified this with a test project, which includes a template from a peer test project and, when it runs, results in two jobs. This is the job output for the overriding job:
$ echo sonar.projectKey=${SONAR_PROJECT_NAME}
sonar.projectKey=foo
$ echo sonar.login=${SONAR_LOGIN}
sonar.login=bob
$ echo sonar.host.url=${SONAR_SERVER}
sonar.host.url=bar

Got "ZIP does not support timestamps before 1980" while deploying a Go Cloud Function on GCP via Triggers

Problem:
I am trying to deploy a function with this step in a second-level compilation (second-level-compilation.yaml):
- name: 'gcr.io/cloud-builders/gcloud'
  args: ['beta', 'functions',
         'deploy', '${_FUNCTION_NAME}',
         '--source', 'path/to/function',
         '--runtime', 'go111',
         '--region', '${_GCP_CLOUD_FUNCTION_REGION}',
         '--entry-point', '${_ENTRYPOINT}',
         '--env-vars-file', '${_FUNCTION_PATH}/.env.${_DEPLOY_ENV}.yaml',
         '--trigger-topic', '${_TRIGGER_TOPIC_NAME}',
         '--timeout', '${_FUNCTION_TIMEOUT}',
         '--service-account', '${_SERVICE_ACCOUNT}']
I get this error from Cloud Build in the Console:
Step #1: Step #11: ERROR: (gcloud.beta.functions.deploy) Error creating a ZIP archive with the source code for directory path/to/function: ZIP does not support timestamps before 1980
Here is the global flow:
The following step is in a first-level compilation (first-level-compilation.yaml), which is triggered by Cloud Build from a GitHub repository (via the GitHub Cloud Build application):
- name: 'gcr.io/cloud-builders/gcloud'
  entrypoint: 'bash'
  args: ['-c', 'launch-second-level-compilation.sh ${_MY_VAR}']
The script "launch-second-level-compilation.sh" does specific operations based on ${_MY_VAR} and then launches a second-level compilation passing a lot of substitutions variables with "gcloud builds submit --config=second-level-compilation.yaml --substitutions=_FUNCTION_NAME=val,_GCP_CLOUD_FUNCTION_REGION=val,....."
Then, the "second-level-compilation.yaml" described at the beginning of this question is executed, using the substitutions values generated and passed through the launch-second-level-compilation.sh script.
The main idea here is to have a generic first-level-compilation.yaml in charge of calling a second-level compilation with specific dynamically generated substitutions.
Attempts / Investigations
As described in the issue "Cloud Container Builder, ZIP does not support timestamps before 1980", I tried to ls the files in the /workspace directory, but none of the files at the /workspace root have a strange date.
I changed path/to/function from a relative path to /workspace/path/to/function, but no success, unsurprisingly, since it resolves to the same directory.
Please make sure you don't have folders without files. For example:
|--dir
|   |--subdir1
|   |   |--file1
|   |--subdir2
|       |--file2
In this example, dir doesn't directly contain any files, only subdirectories. During local deployment, the gcloud SDK puts dir into the tarball without copying its last-modified field.
So it is set to 1 Jan 1970, which causes problems with ZIP.
As a possible workaround, just make sure every directory contains at least one file.
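For example, you could locate directories that contain no regular files and drop an empty placeholder into each one (a sketch using standard find; the .gitkeep file name is just a convention):
# Add an empty placeholder to every directory under the function source
# that has no regular file directly inside it
find path/to/function -type d | while read -r dir; do
  if ! find "$dir" -maxdepth 1 -type f | grep -q .; then
    touch "$dir/.gitkeep"
  fi
done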
