Azure Pipelines Data Driven Matrix

In GitHub Actions, I can write a matrix job like so:
jobs:
  test:
    name: Test-${{matrix.template}}-${{matrix.os}}
    runs-on: ${{matrix.os}}
    strategy:
      matrix:
        os: [ubuntu-latest, windows-latest, macOS-latest]
        template: ['API', 'GraphQL', 'Orleans', 'NuGet']
    steps:
    #...
This will run every combination of os and template. In Azure Pipelines, you have to specify each combination manually like so:
stages:
- stage: Test
  jobs:
  - job: Test
    strategy:
      matrix:
        Linux:
          os: ubuntu-latest
          template: API
        Mac:
          os: macos-latest
          template: API
        Windows:
          os: windows-latest
          template: API
        # ...continued
    pool:
      vmImage: $(os)
    timeoutInMinutes: 20
    steps:
    #...
Is it possible to create a data driven matrix strategy similar to GitHub Actions?

The answer is yes. This is a known feature request that has already been reported on GitHub:
Add cross-product matrix strategy
In addition, there is a workaround mentioned in the official documentation:
Note: The matrix syntax doesn't support automatic job scaling but you can implement similar functionality using the each keyword. For an example, see nedrebo/parameterized-azure-jobs.
jobs:
- template: azure-pipelines-linux.yml
  parameters:
    images: [ 'archlinux/base', 'ubuntu:16.04', 'ubuntu:18.04', 'fedora:31' ]
    pythonVersions: [ '3.5', '3.6', '3.7' ]
    swVersions: [ '1.0.0', '1.1.0', '1.2.0', '1.3.0' ]
- template: azure-pipelines-windows.yml
  parameters:
    images: [ 'vs2017-win2016', 'windows-2019' ]
    pythonVersions: [ '3.5', '3.6', '3.7' ]
    swVersions: [ '1.0.0', '1.1.0', '1.2.0', '1.3.0' ]
azure-pipelines-windows.yml:
jobs:
- ${{ each image in parameters.images }}:
  - ${{ each pythonVersion in parameters.pythonVersions }}:
    - ${{ each swVersion in parameters.swVersions }}:
      - job:
        displayName: ${{ format('OS:{0} PY:{1} SW:{2}', image, pythonVersion, swVersion) }}
        pool:
          vmImage: ${{ image }}
        steps:
        - script: echo OS version &&
            wmic os get version &&
            echo Lets test SW ${{ swVersion }} on Python ${{ pythonVersion }}
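Applied to the os/template matrix from the original question, a minimal sketch of the same pattern could look like this (the template file name and parameter names are just an illustration):

# azure-pipelines.yml -- file and parameter names are illustrative
jobs:
- template: test-jobs.yml
  parameters:
    oses: [ 'ubuntu-latest', 'windows-latest', 'macOS-latest' ]
    templates: [ 'API', 'GraphQL', 'Orleans', 'NuGet' ]

# test-jobs.yml -- expands one job per os/template combination
parameters:
  oses: []
  templates: []

jobs:
- ${{ each os in parameters.oses }}:
  - ${{ each t in parameters.templates }}:
    - job:
      displayName: Test ${{ t }} on ${{ os }}
      pool:
        vmImage: ${{ os }}
      timeoutInMinutes: 20
      steps:
      - script: echo Testing ${{ t }} on ${{ os }}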

Not an ideal solution, but for now, you can loop over parameters. Write a template like the following, and pass your data to it.
# jobs loop template
parameters:
  jobs: []

jobs:
- ${{ each job in parameters.jobs }}: # Each job
  - ${{ each pair in job }}:          # Insert all properties other than "steps"
      ${{ if ne(pair.key, 'steps') }}:
        ${{ pair.key }}: ${{ pair.value }}
    steps:                            # Wrap the steps
    - task: SetupMyBuildTools@1       # Pre steps
    - ${{ job.steps }}                # User's steps
    - task: PublishMyTelemetry@1      # Post steps
      condition: always()
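For reference, a pipeline consuming that loop template might pass its jobs like this (the file name and job contents are illustrative):

# azure-pipelines.yml -- file name is illustrative
jobs:
- template: jobs-loop.yml   # the loop template above
  parameters:
    jobs:
    - job: Build
      pool:
        vmImage: ubuntu-latest
      steps:
      - script: echo Building
    - job: Test
      pool:
        vmImage: ubuntu-latest
      steps:
      - script: echo Testing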
See here for more examples: https://github.com/Microsoft/azure-pipelines-yaml/blob/master/design/each-expression.md#scenario-wrap-jobs

Related

Automatic new release in GitHub Actions

I have no knowledge of YAML; I am a designer, not a programmer, but I followed a step-by-step guide to create an automation for an automatic release. However, it is returning an error: Input required and not supplied: tag_name
YAML code:
name: Release
on:
  push:
    branches:
      - master
env:
  GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
  FILE_NAME: "Dark-Everywhere"
  FILE_EXTENSION: ".zip"
  BRANCHES: "1.19.3,1.19,1.18"
  PACKAGE_NAME: "assets,pack.mcmeta,pack.png"
  TAG_NAME: "1.0.0"
jobs:
  release:
    runs-on: ubuntu-latest
    steps:
      - name: Checkout code
        uses: actions/checkout@v2
      - name: Create a release
        uses: actions/create-release@v1
        env:
          GITHUB_TOKEN: ${{ env.GITHUB_TOKEN }}
        with:
          tag_name: "v${{ env.TAG_NAME }}"
          release_name: "Dark Everywhere ${{ env.TAG_NAME }}🌙"
          draft: true
          prerelease: false
      - name: Generate a zip archive for each branch
        run: |
          for branch in $(echo $BRANCHES | tr "," "\n"); do
            zip -r "$FILE_NAME-$branch$FILE_EXTENSION" $PACKAGE_NAME
          done
      - name: Upload files
        uses: actions/upload-artifact@v2
        with:
          name: "$FILE_NAME-$branch$FILE_EXTENSION"
          path: "$FILE_NAME-$branch$FILE_EXTENSION"
      - name: Update release
        uses: actions/update-release@v1
        env:
          GITHUB_TOKEN: ${{ env.GITHUB_TOKEN }}
        with:
          release_id: ${{ env.RELEASE_ID }}
          body: "Build description here"
I already tried adding TAG_NAME: "1.0.0" inside env, but the error persists.

Unable to read GitHub Actions job's outputs from another job

I have the following workflow in GitHub Actions, where one job creates some outputs and a dependent job reads those outputs, pretty similar to the example from the docs:
name: Sandbox
on:
  push:
env:
  POSTGRESQL_VERSION: "14.4.0-debian-11-r13"
jobs:
  setup:
    runs-on: ubuntu-latest
    outputs:
      prod_tag: "steps.prod_tag.outputs.prod_tag"
      postgresql_version: "steps.postgresql_version.outputs.postgresql_version"
    steps:
      - uses: actions/checkout@v3
      - id: prod_tag
        run: |
          if [[ ${{ github.ref_type }} == "tag" ]]; then
            echo "::set-output name=prod_tag::${{github.ref_name}}"
          else
            echo "::set-output name=prod_tag::latest"
          fi;
      - id: postgresql_version
        run: echo "::set-output name=postgresql_version::${POSTGRESQL_VERSION}"
      - name: Show output variables
        run: |
          echo "PROD TAG: ${{steps.prod_tag.outputs.prod_tag}}"
          echo "POSTGRESQL_VERSION: ${{steps.postgresql_version.outputs.postgresql_version}}"
  show_outputs:
    runs-on: ubuntu-latest
    needs: setup
    steps:
      - run: |
          echo "PROD TAG: ${{needs.setup.outputs.prod_tag}}"
          echo "POSTGRESQL_VERSION: ${{needs.setup.outputs.postgresql_version}}"
However, in my example it doesn't work as expected: show_outputs shows PROD TAG: steps.prod_tag.outputs.prod_tag and POSTGRESQL_VERSION: steps.postgresql_version.outputs.postgresql_version instead of the values set in the setup job, which should be latest and 14.4.0-debian-11-r13. In the Show output variables step of the setup job I can see that the values are properly set, and I've tried several different approaches (setting the variables from the same step, not taking the value from the environment variable, etc.) with no success.
Any idea what could be wrong with my example?
You should surround the variables with ${{ and }}. Try:
outputs:
  prod_tag: ${{ steps.prod_tag.outputs.prod_tag }}
  postgresql_version: ${{ steps.postgresql_version.outputs.postgresql_version }}
instead of:
outputs:
  prod_tag: "steps.prod_tag.outputs.prod_tag"
  postgresql_version: "steps.postgresql_version.outputs.postgresql_version"

GitHub Actions: merge artifacts after matrix steps

I have a job in a GitHub Actions workflow that runs unit tests and then uploads reports to Jira Xray. The thing is, the tests step takes quite a while to complete, so I want to split task execution into a few smaller chunks using a matrix.
I did it for linting and it works well; however, for unit tests I'm struggling with how to collect and merge all reports so they can be uploaded after all matrix jobs are done.
Here's how the current unit tests job looks:
unit-test:
  runs-on: ubuntu-latest
  needs: setup
  steps:
    - uses: actions/checkout@v3
      with:
        fetch-depth: 0
    - uses: actions/cache@v3
      with:
        path: ${{ env.CACHE_NODE_MODULES_PATH }}
        key: build-${{ hashFiles('**/package-lock.json') }}
    - run: npx nx affected:test --parallel=3 --base=${{ env.BASE_REF}} --head=HEAD # actual unit tests
    - name: Check file existence # checking whether there're reports at all
      if: success() || failure()
      id: check_files
      uses: andstor/file-existence-action@v1
      with:
        # all reports will be placed in this directory
        # for matrix job reports will be separated between agents, so it's required to merge them
        files: 'reports/**/test-*.xml'
    - name: Import results to Xray
      if: (success() || failure()) && steps.check_files.outputs.files_exists == 'true' && github.event_name == 'push'
      uses: mikepenz/xray-action@v2
      with:
        username: ${{ secrets.XRAY_CLIENT_ID }}
        password: ${{ secrets.XRAY_CLIENT_SECRET }}
        testFormat: 'junit'
        testPaths: 'reports/**/test-*.xml' # that's where I need to grab all reports
        projectKey: 'MY_KEY'
        combineInSingleTestExec: true
The matrix job for linting looks like this. I would like to do the same for unit tests, but I also want to collect all the reports, as in the job above:
linting:
  runs-on: ubuntu-latest
  needs: [setup]
  strategy:
    matrix:
      step: ${{ fromJson(needs.setup.outputs.lint-bins) }} # this will be something like [1,2,3,4]
  steps:
    - uses: actions/checkout@v3
      with:
        fetch-depth: 0
    - uses: actions/cache@v3
      with:
        path: ${{ env.CACHE_NODE_MODULES_PATH }}
        key: build-${{ hashFiles('**/package-lock.json') }}
    # some nodejs logic to run few jobs, it uses "execSync" from "child_process" to invoke the task
    - run: node scripts/ci-run-many.mjs --target=lint --outputTarget=execute --partNumber=${{ matrix.step }} --base=${{ env.BASE_REF}} --head=HEAD
Figured it out myself:
unit-test:
  runs-on: ubuntu-latest
  needs: [setup]
  strategy:
    fail-fast: false
    matrix:
      step: ${{ fromJson(needs.setup.outputs.unit-test-bins) }}
  steps:
    - uses: actions/checkout@v3
      with:
        fetch-depth: 0
    - uses: actions/cache@v3
      with:
        path: ${{ env.CACHE_NODE_MODULES_PATH }}
        key: build-${{ hashFiles('**/package-lock.json') }}
    - run: node scripts/ci-run-many.mjs --target=test --outputTarget=execute --partNumber=${{ matrix.step }} --base=${{ env.BASE_REF}} --head=HEAD
    - name: Upload reports' artifacts
      if: success() || failure()
      uses: actions/upload-artifact@v3
      with:
        name: ${{ env.RUN_UNIQUE_ID }}_artifact_${{ matrix.step }}
        if-no-files-found: ignore
        path: reports
        retention-days: 1

process-test-data:
  runs-on: ubuntu-latest
  needs: unit-test
  if: success() || failure()
  steps:
    - uses: actions/checkout@v3
    - name: Download reports' artifacts
      uses: actions/download-artifact@v3
      with:
        path: downloaded_artifacts
    - name: Place reports' artifacts
      run: rsync -av downloaded_artifacts/*/*/ unit_test_reports/
    - name: Check reports existence
      id: check_files
      uses: andstor/file-existence-action@v1
      with:
        files: 'unit_test_reports/**/test-*.xml'
    - name: Import results to Xray
      run: ls -R unit_test_reports/
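To actually push the merged reports to Xray instead of just listing them, the same mikepenz/xray-action step from the original job could presumably be pointed at the merged directory, for example:

    # sketch: reuses the inputs from the question's original Xray step
    - name: Import results to Xray
      if: steps.check_files.outputs.files_exists == 'true' && github.event_name == 'push'
      uses: mikepenz/xray-action@v2
      with:
        username: ${{ secrets.XRAY_CLIENT_ID }}
        password: ${{ secrets.XRAY_CLIENT_SECRET }}
        testFormat: 'junit'
        testPaths: 'unit_test_reports/**/test-*.xml' # merged reports from all matrix jobs
        projectKey: 'MY_KEY'
        combineInSingleTestExec: true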

Can I have a GitHub Actions Workflow without any Jobs inside?

When translating existing Azure DevOps YAML pipelines to GitHub Actions YAML, I noticed some of my Azure pipelines were just templates calling other YAML files.
trigger:
- master

resources:
  repositories:
    - repository: templates
      type: git
      name: 'Cloud Integration\PipelineTemplates'

name: $(Build.SourceBranchName)_$(Build.Reason)_$(rev:r)

variables:
  - group: var-lc-integration-emailservice
  - name: logicapp_workflows
    value: false
  - name: base_resources
    value: false
  - name: functionapp_resources
    value: false
  - name: functionapp
    value: false
  - name: ia_resources
    value: false
  - name: ia_configs
    value: false
  - name: apim_resources
    value: true

stages:
  - template: yml-templates\master.yml@templates
    parameters:
      logicapp_workflows: ${{ variables.logicapp_workflows }}
      base_resources: ${{ variables.base_resources }}
      functionapp_resources: ${{ variables.functionapp_resources }}
      functionapp: ${{ variables.functionapp }}
      ia_resources: ${{ variables.ia_resources }}
      ia_configs: ${{ variables.ia_configs }}
      apim_resources: ${{ variables.apim_resources }}
While writing a GitHub workflow for the above Azure pipeline, can we have a "dummy job" or no job at all within a workflow to solve this?
IIUC yes, see reusing GitHub Actions workflows.
It allows you to call another workflow from your repository and provide inputs/secrets as necessary.
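As a rough sketch (the file names and input are illustrative), the called workflow declares a workflow_call trigger and the caller workflow contains nothing but jobs that point at it:

# .github/workflows/deploy.yml (called workflow) -- names are illustrative
on:
  workflow_call:
    inputs:
      apim_resources:
        type: boolean
        default: false

jobs:
  deploy:
    runs-on: ubuntu-latest
    steps:
      - run: echo "apim_resources=${{ inputs.apim_resources }}"

# .github/workflows/main.yml (caller workflow)
on:
  push:
    branches: [master]

jobs:
  call-deploy:
    uses: ./.github/workflows/deploy.yml
    with:
      apim_resources: true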

Is it possible to give a dynamic value in a strategy matrix in GitHub Actions?

I have the following workflow and want to pass some dynamic values in the strategy matrix:
env:
  FORMULA: testFormula
jobs:
  test:
    name: Test for ${{ matrix.os }}
    runs-on: ${{ matrix.os }}
    strategy:
      fail-fast: false
      matrix:
        include:
          - os: ubuntu-latest
            asset_name: ${{ env.FORMULA }}-${{ steps.get-tag.outputs.TAG_NAME }}
            command_name: echo "matrix1"
          - os: ubuntu-latest
            artifact_name: ${{ env.FORMULA }}_${{ steps.get-tag.outputs.TAG_NAME }}-1_amd64.deb
            asset_name: ${{ env.FORMULA }}-${{ steps.get-tag.outputs.TAG_NAME }}.deb
            command_name: echo "matrix2"
          - os: macos-latest
            artifact_name: ${{ env.FORMULA }}-v${{ steps.get-tag.outputs.TAG_NAME }}-x86.exe
            asset_name: ${{ env.FORMULA }}-${{ steps.get-tag.outputs.TAG_NAME }}.exe
            command_name: echo "matrix3"
          - os: macos-latest
            artifact_name: macos/${{ env.FORMULA }}-v${{ steps.get-tag.outputs.TAG_NAME }}.pkg
            asset_name: ${{ env.FORMULA }}-${{ steps.get-tag.outputs.TAG_NAME }}.pkg
            command_name: echo "matrix4"
    steps:
      - name: Getting latest tag
        id: get-tag
        run: |
          echo "::set-output name=TAG_NAME::2.3.4"
I am using this strategy matrix to run some things, but I am getting the error:
The workflow is not valid. .github/workflows/test.yml : Unrecognized named-value: 'env'. Located at position 1 within expression: env.FORMULA
Please help me understand how to pass dynamic values to the strategy matrix.
Thanks in advance.
This is actually shown in the GitHub Actions documentation.
To gloss what's described there, you construct a JSON object that has the form of what's expected by the matrix key (string names/keys that have array values). Then you set the serialized JSON as an output and refer to it in your matrix job. Like so:
generate-build-matrix:
  # one of each build category must be set
  runs-on: ubuntu-latest
  outputs:
    matrix: ${{ steps.create-matrix.outputs.matrix }}
  steps:
    - name: Create matrix object
      id: create-matrix
      shell: python
      run: |
        from os import getenv
        from subprocess import Popen
        import json

        matrix_obj = {"prod-environment": [], "nice-app": [], "app-platform": []}
        if getenv("ENV_PRODUCTION").lower() == "true":
            matrix_obj["prod-environment"].append("production")
        ...
        if getenv("APP_GA").lower() == "true":
            matrix_obj["nice-app"].append("ga")

        matrix_str = f'::set-output name=matrix::{json.dumps(matrix_obj)}'
        print(f"Final matrix string: {matrix_str}")
        Popen(['echo', matrix_str], bufsize=1)
      env:
        ENV_PRODUCTION: ${{inputs.env-production}}
        APP_GA: ${{inputs.app-ga}}
  ...

matrix-job:
  strategy:
    matrix: ${{ fromJSON(needs.generate-build-matrix.outputs.matrix) }}
  name: My matrix job
  ...
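One detail the elided parts (...) presumably cover: the matrix job must also declare needs: generate-build-matrix, otherwise the needs context used inside fromJSON won't resolve. A minimal sketch:

matrix-job:
  needs: generate-build-matrix
  runs-on: ubuntu-latest
  strategy:
    matrix: ${{ fromJSON(needs.generate-build-matrix.outputs.matrix) }}
  steps:
    - run: echo "nice-app is ${{ matrix['nice-app'] }}"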
