Tag release not built with CircleCI

I am using CircleCI to build a project. Everything runs fine, except that my tags are not built when pushed to GitHub.
I don't understand why. I have reduced my whole configuration to a minimal config file that follows the same logic:
version: 2
jobs:
  my_dummy_job_nightly:
    working_directory: ~/build
    docker:
      - image: docker:git
    steps:
      - checkout
      - setup_remote_docker:
          reusable: true
          exclusive: true
      - run:
          name: NIGHTLY BUILD
          command: |
            apk add --update py-pip
            python -m pip install --upgrade pip
  my_dummy_job_deploy:
    working_directory: ~/build
    docker:
      - image: docker:git
    steps:
      - checkout
      - setup_remote_docker:
          reusable: true
          exclusive: true
      - run:
          name: RELEASE BUILD
          command: |
            apk add --update py-pip
            python -m pip install --upgrade pip

###################################################################################
#                              CircleCI WORKFLOWS                                 #
###################################################################################
workflows:
  version: 2
  build-and-deploy:
    jobs:
      ###############################################################################
      #                             NIGHTLY BUILDS                                  #
      ###############################################################################
      - my_dummy_job_nightly:
          filters:
            tags:
              ignore: /.*/
            branches:
              only: master
      ###############################################################################
      #                               TAGS BUILDS                                   #
      ###############################################################################
      - hold:
          type: approval
          filters:
            tags:
              only: /.*/
            branches:
              ignore: /.*/
      - my_dummy_job_deploy:
          requires:
            - hold
          filters:
            tags:
              only: /.*/
            branches:
              ignore: /.*/
I don't understand why the tags don't build ... The regex should let everything through...

I have finally found the issue. It has nothing to do with the configuration: the CircleCI interface does not show tag builds in the Workflows view, so the approval operation blocks the whole process.
To access the workflow and approve the deployment, you have to click on the build and then on the workflow (see below):
Once on the workflow, it is possible to approve the process:
The only solution I have found to make the build appear is to add a dummy, otherwise useless step to the build process that shows up before the approval.
version: 2
jobs:
  init_tag_build:
    working_directory: ~/build
    docker:
      - image: docker:git
    steps:
      - checkout
      - setup_remote_docker:
          reusable: true
          exclusive: true
      - run:
          name: Launch Build OP
          command: |
            echo "start tag workflow"
  my_deploy_job:
    working_directory: ~/build
    docker:
      - image: docker:git
    steps:
      - checkout
      - setup_remote_docker:
          reusable: true
          exclusive: true
      - run:
          name: DEPLOY BUILD
          command: |
            echo "do the deploy work"
workflows:
  version: 2
  build-and-deploy:
    jobs:
      - init_tag_build:
          filters:
            tags:
              only: /.*/
            branches:
              ignore: /.*/
      - hold:
          type: approval
          requires:
            - init_tag_build
          filters:
            tags:
              only: /.*/
            branches:
              ignore: /.*/
      - my_deploy_job:
          requires:
            - hold
          filters:
            tags:
              only: /.*/
            branches:
              ignore: /.*/

TL;DR
In the YAML you ignore every branch. Remove the following part:
branches:
  ignore: /.*/
You probably meant to build only when tags are present, but you also ignored all branches. If you want to build every branch as well as tags, remove those lines. If you only want a specific branch (e.g. dev) plus tags, add branches: only: dev instead.
The connection between the two specifiers is AND, not OR. There is a discussion on the CircleCI forum about adding a feature to change it to OR.
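For illustration, a filter block following that suggestion (the dev branch name is only an example) would look roughly like this:
- my_dummy_job_deploy:
    requires:
      - hold
    filters:
      tags:
        only: /.*/
      branches:
        only: dev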

Related

GitHub Actions Matrix sharing the Same Code CheckOut

I tried to run the actions/checkout@v3 step once across chained jobs, but it seems the "build" job does not get the code. I'm getting a "can't find the project" error.
Can I call actions/checkout@v3 once for two jobs?
It works when I call the code checkout twice:
name: publish-nuget
on:
  push:
    branches:
      - main
jobs:
  prepare:
    runs-on: ubuntu-latest
    steps:
      - name: Checkout code
        uses: actions/checkout@v3
      - name: Get package version
        id: get_package_version
        uses: kzrnm/get-net-sdk-project-versions-action@v1.3.0
        with:
          proj-path: ProjectOne.csproj
  build:
    needs: prepare
    runs-on: ubuntu-latest
    # Add the projects path below
    strategy:
      matrix:
        projects: [
          'ProjectOne.csproj',
          'ProjectTwo.csproj',
        ]
    steps:
      - name: Checkout code
        uses: actions/checkout@v3
      - name: Pack NuGet
        run: dotnet pack ${{ matrix.projects }} -p:PackageVersion=${{ env.PACKAGE_VERSION }} --configuration Release
It does not work when I call the code checkout once (on the 'prepare' job).
name: publish-nuget
on:
  push:
    branches:
      - main
jobs:
  prepare:
    runs-on: ubuntu-latest
    steps:
      - name: Checkout code
        uses: actions/checkout@v3
      - name: Get package version
        id: get_package_version
        uses: kzrnm/get-net-sdk-project-versions-action@v1.3.0
        with:
          proj-path: ProjectOne.csproj
  build:
    needs: prepare
    runs-on: ubuntu-latest
    # Add the projects path below
    strategy:
      matrix:
        projects: [
          'ProjectOne.csproj',
          'ProjectTwo.csproj',
        ]
    steps:
      - name: Pack NuGet
        run: dotnet pack ${{ matrix.projects }} -p:PackageVersion=${{ env.PACKAGE_VERSION }} --configuration Release
Having a job depend on another job is just for ordering purposes, not for sharing state or artifacts. You are actually running the 2 jobs on 2 different agents. If you want to share something from the prepare job, you can use the cache or artifact API, e.g. using the cache API to cache the path 'somePath'. The same step also downloads the cache.
- name: Cached build artifacts
  uses: actions/cache@v2
  id: artifactcache
  with:
    path: somePath
    key: buildArtifacts${{ github.run_number }}
As you are not gaining anything from splitting this up into 2 jobs, I would run everything in a single job instead.
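For completeness, here is a minimal sketch of the artifact API alternative (the artifact name and path are illustrative, not from the original workflow):
# In the prepare job: upload the files to hand over to later jobs
- name: Upload build artifacts
  uses: actions/upload-artifact@v3
  with:
    name: prepare-output
    path: somePath

# In the build job (which has needs: prepare): download them again
- name: Download build artifacts
  uses: actions/download-artifact@v3
  with:
    name: prepare-output
    path: somePath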

GitHub Actions restrict job with tag validation using if condition or github ref

I have an application that runs Maven builds. We create snapshots, then create a tag once the snapshot build is successful, and then we use that tag to run the release build with GitHub Actions workflows.
Now I'm trying to run the release build job only if the tag matches; if the tag does not match, the job should be skipped or fail. The tag is created in the snapshot build and uses the format "<appname>-<app_build_type>-<github-actions-run_number>-<env>-origin/<branch>", so an actual tag looks like "java-maven-44-dev-origin/develop".
I tried the two workflows below, but neither of them works.
1st Method is:
name: maven-release
on:
  workflow_dispatch:
jobs:
  checkout_code:
    runs-on: [ubuntu-latest]
    steps:
      - name: Checkout
        uses: actions/checkout@v3
        with:
          fetch-depth: 0
  maven_build:
    runs-on: [ubuntu-latest]
    if: ${{ github.ref == 'refs/tags/*-origin/develop' }}
    needs: checkout_code
    steps:
      - name: Run maven build
        run: |
          mvn clean install
2nd Method is:
name: maven-release
on:
  workflow_dispatch:
jobs:
  checkout_code:
    runs-on: [ubuntu-latest]
    steps:
      - name: Checkout
        uses: actions/checkout@v3
        with:
          fetch-depth: 0
  maven_build:
    runs-on: [ubuntu-latest]
    if: contains(github.ref, 'refs/tags/*-origin/develop')
    needs: checkout_code
    steps:
      - name: Run maven build
        run: |
          mvn clean install
I want to skip/fail at the job level only, not in the run command. If anyone knows a solution to this or can provide suggestions on how to achieve it, please share.
If it's possible for your use case, you could trigger your workflow on push and then apply your filter to the branch or tag:
name: maven-release
on:
  workflow_dispatch:
  push:
    tags:
      - '*-origin/develop'
jobs:
  checkout_code:
    runs-on: [ubuntu-latest]
    steps:
      - name: Checkout
        uses: actions/checkout@v3
        with:
          fetch-depth: 0
  maven_build:
    runs-on: [ubuntu-latest]
    if: ${{ github.ref == 'refs/tags/*-origin/develop' }}
    needs: checkout_code
    steps:
      - name: Run maven build
        run: |
          mvn clean install
The GitHub filter pattern cheat sheet
For workflows in which each job needs to be executed only for a particular tag, we can make use of the snippet below:
name: maven_build
on:
  workflow_dispatch:
jobs:
  checkout:
    runs-on: [ubuntu-latest]
    steps:
      - name: Checkout
        uses: actions/checkout@v3
        with:
          fetch-depth: 0
  maven_build:
    runs-on: [ubuntu-latest]
    if: startsWith(github.ref, 'refs/tags/') && !contains(github.ref')
    needs: checkout
    steps:
      - name: Run maven build command
        run: |
          mvn clean install
The above snippet will run the job only when the checked-out ref matches a tag.
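As a sketch for the tag format described in the question (java-maven-44-dev-origin/develop), a job-level condition could combine startsWith and endsWith; the exact suffix to match is an assumption based on that format:
maven_build:
  runs-on: [ubuntu-latest]
  needs: checkout_code
  # Run only when the ref is a tag whose name ends with "-origin/develop" (illustrative pattern)
  if: startsWith(github.ref, 'refs/tags/') && endsWith(github.ref, '-origin/develop')
  steps:
    - name: Run maven build
      run: mvn clean install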

Run command taking output from previous step on GitHub actions

I'm trying to make a GitHub Action that builds a Hugo website, deploys it on Pinata, and saves the output hash of that last step to a txt file. I managed to achieve the first and second steps. For the third one, I've been trying to do it by running an echo command, but I get this message: "You have an error in your yaml syntax on line 36".
How do I run the script taking the output from the step identified as "ipfs-pin"?
Here's my code:
name: deploy
on:
  push:
    branches: [ main ]
  pull_request:
    branches: [ main ]
  workflow_dispatch:
jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@master
      - uses: jakejarvis/hugo-build-action@master
        with:
          args: --minify --buildDrafts
      - uses: anantaramdas/ipfs-pinata-deploy-action@v1.6.4
        id: ipfs-pin
        with:
          pin-name: '[my-pin-name]'
          path: './public'
          pinata-api-key: [API Key]
          pinata-secret-api-key: [secret API Key]
          verbose: true
          remove-old: true
  saves-hash-on-file:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v2
      - uses: actions/setup-node@v2
        with:
          node-version: '14'
      - run: echo ${{steps.build.ipfs-pin.hash}} > /.github/ipfs-hash.txt
First
It seems your indentation has a problem. I reproduced the workflow and corrected it so it no longer returns an error when pushing the workflow to the repository:
name: Deploy
on:
  push:
    branches: [ main ]
  pull_request:
    branches: [ main ]
  workflow_dispatch:
jobs:
  build:
    runs-on: ubuntu-latest
    outputs:
      hash: ${{ steps.ipfs-pin.outputs.hash }}
    steps:
      - uses: actions/checkout@master
      - uses: jakejarvis/hugo-build-action@master
        with:
          args: --minify --buildDrafts
      - uses: anantaramdas/ipfs-pinata-deploy-action@v1.6.4
        id: ipfs-pin
        with:
          pin-name: '[my-pin-name]'
          path: './public'
          pinata-api-key: '[API Key]'
          pinata-secret-api-key: '[secret API Key]'
          verbose: true
          remove-old: true
  saves-hash-on-file:
    runs-on: ubuntu-latest
    needs: [build]
    steps:
      - uses: actions/checkout@v2
      - uses: actions/setup-node@v2
        with:
          node-version: '14'
      - run: echo ${{ needs.build.outputs.hash }} > /.github/ipfs-hash.txt
Second
As you can see in the workflow above, I added the outputs field at the first job (build) level; without this you can't share the output with other jobs.
Reference about outputs
Moreover, to share outputs between jobs, you have to add the needs: [build] line at the second job (saves-hash-on-file) level and reference the value through the needs context (needs.build.outputs.hash).
Note: I couldn't run it successfully as I don't have any credentials to test with, but it should work if you copy/paste the workflow I shared using your own credentials.
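For reference, a stripped-down sketch of the outputs/needs wiring used above (job, step, and output names here are illustrative):
jobs:
  build:
    runs-on: ubuntu-latest
    outputs:
      # Map a step output up to a job-level output
      hash: ${{ steps.my-step.outputs.hash }}
    steps:
      - id: my-step
        run: echo "hash=example-value" >> "$GITHUB_OUTPUT"
  consume:
    runs-on: ubuntu-latest
    needs: [build]
    steps:
      # Read the other job's output through the needs context
      - run: echo "${{ needs.build.outputs.hash }}"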

Can't checkout the same repo multiple times in a pipeline

I have self-hosted agents in multiple environments on which I am trying to run the same build/deploy processes. I would like to be able to deploy the same code from a single repo to multiple systems concurrently. So I have created an "overhead" pipeline and several "processes" pipeline templates. Everything seems to be going very well, except when I try to check out the same repo twice in the same pipeline execution. I get the following error:
An error occurred while loading the YAML build pipeline. An item with the same key has already been added.
I would really like to be able to click ONE button to trigger a main pipeline that calls all the required templates and passes the parameters needed to get all my jobs done at once. I could of course define this "overhead" pipeline and then queue up as many instances of it as I need per system to deploy to, but I'm lazy, hence why I'm using pipelines!
As soon as I remove the checkout from Common.yml, the validation succeeds without any issues. If I keep the checkout in there but only call Common.yml once for the entire overhead pipeline, it also succeeds without any issues. But the problem is: I need to pull the contents of the repo to EACH of my agents, which run in completely separate environments that can never talk to each other (I can't pull the information to one agent and have it do some sort of "copy" to all the other agent locations...).
Any assistance is very much welcomed, thank you!
The following is my "overhead" pipeline:
# azure-pipelines.yml
trigger:
  none

parameters:
  - name: vLAN
    type: string
    default: 851
    values:
      - 851
      - 1105

stages:
  - stage: vLAN851
    condition: eq('${{ parameters.vLAN }}', '851')
    pool:
      name: xxxxx
      demands:
        - vLAN -equals 851
    jobs:
      - job: Common_851
        steps:
          - template: Procedures/Common.yml
      - job: Export_851
        dependsOn: Common_851
        steps:
          - template: Procedures/Export.yml
            parameters:
              Server: ABTS-01
  - stage: vLAN1105
    condition: eq('${{ parameters.vLAN }}', '1105')
    pool:
      name: xxxxx
      demands:
        - vLAN -equals 1105
    jobs:
      - job: Common_1105
        steps:
          - template: Procedures/Common.yml
      - job: Export_1105
        dependsOn: Common_1105
        steps:
          - template: Procedures/Export.yml
            parameters:
              Server: OTS-01
And here is the "Procedures/Common.yml":
steps:
  - checkout: git://xxxxx/yyyyy@$(Build.SourceBranchName)
    clean: true
    enabled: true
    timeoutInMinutes: 1
  - task: UsePythonVersion@0
    enabled: true
    timeoutInMinutes: 1
    displayName: Select correct version of Python
    inputs:
      versionSpec: '3.8'
      addToPath: true
      architecture: 'x64'
  - task: CmdLine@2
    enabled: true
    timeoutInMinutes: 5
    displayName: Ensure Python Requirements Installed
    inputs:
      script: |
        python -m pip install GitPython
And here is the "Procedures/Export.yml":
parameters:
- name: Server
type: string
steps:
- task: PythonScript#0
enabled: true
timeoutInMinutes: 3
displayName: xxxxx
inputs:
arguments: --name "xxxxx" --mode True --Server ${{ parameters.Server }}
scriptSource: 'filePath'
scriptPath: 'xxxxx/main.py'
I managed to make checkout work with variable branch names by using template expression variables ${{ ... }} instead of macro syntax $(...) variables.
The difference is that template expressions are processed at compile time, while macros are processed at runtime.
So in my case I have something like:
- checkout: git://xxx/yyy@${{ variables.BRANCH_NAME }}
For more information about variables syntax :
Understand variable syntax
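A minimal sketch of that approach in context (the BRANCH_NAME variable and its value are assumptions, not part of the original pipeline):
variables:
  BRANCH_NAME: main  # set statically, or derive it from a parameter

steps:
  # Template expressions are resolved at compile time, so this works in a checkout step
  - checkout: git://xxx/yyy@${{ variables.BRANCH_NAME }}
    clean: true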
I couldn't get it to work with expressions, but I was able to get it working using repository resources, following the documentation at: https://learn.microsoft.com/en-us/azure/devops/pipelines/repos/multi-repo-checkout?view=azure-devops
resources:
  repositories:
    - repository: MyGitHubRepo # The name used to reference this repository in the checkout step
      type: git
      name: MyAzureProjectName/MyGitRepo
      ref: $(Build.SourceBranch)

trigger:
  - main

pool:
  vmImage: 'ubuntu-latest'

# some job
steps:
  - checkout: MyGitHubRepo

# some other job
steps:
  - checkout: MyGitHubRepo
  - script: dir $(Build.SourcesDirectory)

CircleCI: Create Workflow with separate jobs. One build and different Deploys per environment

Hi everyone. I need some help with some issues I am facing while configuring CircleCI for my Angular project.
The config.yml I am using for the build and deploy process is detailed below. Currently I have decided to create separate jobs for each environment, and each one includes both the build and the deploy. The problem with this approach is that I am repeating myself, and I can't find the correct way to deploy an artifact built in a previous job of the same workflow.
version: 2
jobs:
  build:
    docker:
      - image: circleci/node:8-browsers
    steps:
      - checkout
      - restore_cache:
          key: dependency-cache-{{ checksum "package.json" }}
      - run:
          name: Install dependencies
          command: npm install
      - save_cache:
          key: dependency-cache-{{ checksum "package.json" }}
          paths:
            - .node_modules
      - run:
          name: Build Application (Production mode - aot enabled)
          command: npm run build:prod
      - store_artifacts:
          path: dist
          destination: dist
  deploy_prod:
    docker:
      - image: circleci/node:8-browsers
    environment:
      - FIREBASE_TOKEN: "1/AFF2414141ASdASDAKDA4141421sxscq"
    steps:
      - checkout
      - restore_cache:
          key: dependency-cache-{{ checksum "package.json" }}
      - run:
          name: Install dependencies
          command: npm install
      - save_cache:
          key: dependency-cache-{{ checksum "package.json" }}
          paths:
            - .node_modules
      - run:
          name: Build Application (Production mode - aot enabled)
          command: npm run build:prod
      - store_artifacts:
          path: dist
          destination: dist
      - run:
          command: ./node_modules/.bin/firebase use default
      - deploy:
          command: ./node_modules/.bin/firebase deploy --token=$FIREBASE_TOKEN
  deploy_qa:
    docker:
      - image: circleci/node:8-browsers
    environment:
      - FIREBASE_TOKEN: "1/AFF2414141ASdASDAKDA4141421sxscq"
    steps:
      - checkout
      - restore_cache:
          key: dependency-cache-{{ checksum "package.json" }}
      - run:
          name: Install dependencies
          command: npm install
      - save_cache:
          key: dependency-cache-{{ checksum "package.json" }}
          paths:
            - .node_modules
      - run:
          name: Build Application (Production mode - aot enabled)
          command: npm run build:prod
      - store_artifacts:
          path: dist
          destination: dist
      - run:
          command: ./node_modules/.bin/firebase use qa
      - deploy:
          command: ./node_modules/.bin/firebase deploy --token=$FIREBASE_TOKEN
workflows:
  version: 2
  build-and-deploy:
    jobs:
      - build:
          filters:
            branches:
              only:
                - master
              ignore:
                - /feat-.*/
      - deploy_prod:
          filters:
            branches:
              ignore:
                - /.*/
            tags:
              only:
                - /v[0-9]+(\.[0-9]+){2}/
      - deploy_qa:
          filters:
            branches:
              ignore:
                - /.*/
            tags:
              only:
                - /v[0-9]+(\.[0-9]+){2}-BETA-([0-9]*)/
I understand that each job is using a different docker image, which prevents me from working in the same workspace.
Q: How can I use the same docker image for different jobs in the same workflow?
I included store_artifacts thinking it could help me, but from what I read it only works for accessing artifacts through the dashboard or the API.
Q: Am I able to recover an artifact in a job that requires a different job that stored the artifact?
I know that I am repeating myself; my goal is to have one build job required by a deploy job per environment, depending on the tag name, so my deploy_{env} jobs would contain mainly the firebase commands.
workflows:
  version: 2
  build-and-deploy:
    jobs:
      - build:
          filters:
            branches:
              only:
                - master
              ignore:
                - /feat-.*/
            tags:
              only:
                - /v[0-9]+(\.[0-9]+){2}/
                - /v[0-9]+(\.[0-9]+){2}-BETA-([0-9]*)/
      - deploy_prod:
          requires:
            - build
          filters:
            branches:
              ignore:
                - /.*/
            tags:
              only:
                - /v[0-9]+(\.[0-9]+){2}/
      - deploy_qa:
          requires:
            - build
          filters:
            branches:
              ignore:
                - /.*/
            tags:
              only:
                - /v[0-9]+(\.[0-9]+){2}-BETA-([0-9]*)/
Q: What are the recommended steps (best practices) for this solution?
Q: How can I use the same docker image for different jobs in the same workflow?
There might be two methods of going about this:
1.) Docker Layer Caching: https://circleci.com/docs/2.0/docker-layer-caching/
2.) Caching the .tar file: https://circleci.com/blog/how-to-build-a-docker-image-on-circleci-2-0/
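If the Docker Layer Caching route fits (availability depends on your plan), in CircleCI 2.x it is typically enabled on the remote Docker engine, roughly like this:
- setup_remote_docker:
    docker_layer_caching: true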
Q: Am I able to recover an artifact on a job that requires a different job that stored the artifact?
The persist_to_workspace and attach_workspace keys should be helpful here: https://circleci.com/docs/2.0/workflows/#using-workspaces-to-share-data-among-jobs
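A minimal sketch of that approach for this config, assuming the build output lands in dist as in the build job above:
# In the build job, after "npm run build:prod"
- persist_to_workspace:
    root: .
    paths:
      - dist
      - node_modules

# In deploy_prod / deploy_qa, before the firebase commands
- attach_workspace:
    at: .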
Q: What are the recommended steps (best practices) for this solution?
Not sure here! Whatever works the fastest and cleanest for you. :)
