Calling a secondary pipeline from main pipeline - yaml

What I'm trying to achieve: on pull request completion, run a pipeline that asks the user to select one of four options: a pipeline in which I can choose which versions to build/release, one that releases all versions, one that undoes a particular version, or a hotfix pipeline that skips the build stage and merely deploys a particular version.
Sorry in advance for the poor formatting.
name: ADO-self-hosted-pipeline-Master-calls-Specific-Template

trigger: none

parameters:
  - name: pipeline
    displayName: Choose Pipeline
    type: string
    default: All-Versions
    values:
      - All-Versions
      - Cherry-Pick-Versions
      - Hotfix
      - Undo-Versions

stages:
  - stage: CherryPick
    pool: $(AGENT_POOL)
    displayName: 'Cherry pick'
    jobs:
      - job: SelectCherryPick
        displayName: Select the Cherry Pick pipeline
      - template: azure-pipelines_Cherry Pick.yml@self
        condition: and(succeeded(), ${{ eq(parameters['pipeline'], 'Cherry-Pick-Versions') }})
  - stage: Hotfix
    pool: $(AGENT_POOL)
    displayName: 'Hotfix'
    jobs:
      - job: SelectHotfix
        displayName: Select the Hotfix pipeline
      - template: azure-pipelines_Hotfix.yml@self
        condition: and(succeeded(), ${{ eq(parameters['pipeline'], 'Hotfix') }})
  - stage: All-Versions
    pool: $(AGENT_POOL)
    displayName: 'All-Versions'
    jobs:
      - job: SelectAll
        displayName: Select the All Migrations pipeline
      - template: azure-pipelines_All Migrations.yml@self
        condition: and(succeeded(), ${{ eq(parameters['pipeline'], 'All-Versions') }})
  - stage: Undo-Versions
    pool: $(AGENT_POOL)
    displayName: 'Undo-Versions'
    jobs:
      - job: SelectUndo
        displayName: Select the Undo pipeline
      - template: azure-pipelines_undo.yml@self
        condition: and(succeeded(), ${{ eq(parameters['pipeline'], 'Undo-Versions') }})

variables:
  BUILD_NAME: 'Build'
  FLYWAY: 'E:\Agents\$(Agent.Name)\Flyway\flyway -user="$(userName)" -password="$(password)" -licenseKey=$(FLYWAY_LICENSE_KEY) -outOfOrder=true'
  RELEASE_PREVIEW: 'Release-Preview.sql'
  DRIFT_AND_CHANGE_REPORT: 'Drift-And-Change-Report.html'
  DRIFT_AND_CHANGE_REPORT_DISPLAY_NAME: 'Drift And Change Report'
  group: flyway_vars
  #cherryPickVersions: ${{ parameters.cherryPickVersions }}
The pipeline runs fine and shows the parameter picker (screenshot: "Master Pipeline to Select further Pipelines"), so I can choose a pipeline template from there. But when I press Run after selecting cherry pick, I get an error (screenshot: "Run after selecting cherry pick").
The error is: /azure-pipelines_Cherry Pick.yml (Line: 61, Col: 1): Unexpected value 'stages'
The full Cherry Pick YAML Template is:
#name: ADO-self-hosted-pipeline-Cherry-Pick

# This is the default pipeline for a self-hosted Windows agent on Azure DevOps.
# Install flyway cli on agent, add flyway to PATH: https://download.red-gate.com/maven/release/org/flywaydb/enterprise/flyway-commandline
# Install python3 on agent and add pip to PATH if staticCodeAnalysis is set to true
# Make sure this file is in the same directory as the migrations folder of the Flyway Enterprise project.
# Provision a dev, shadow, build databases, as well as any target environments that need to be created: https://documentation.red-gate.com/fd/proof-of-concept-checklist-152109292.html
# Further instructions if needed here: https://documentation.red-gate.com/fd/self-hosted-windows-agent-yaml-pipeline-in-azure-devops-158564470.html
# For video reference, see: https://www.red-gate.com/hub/university/courses/flyway/flyway-desktop/setting-up-a-flyway-desktop-project/basic-flyway-desktop-project-setup-and-configuration

#trigger:
#  branches:
#    include:
#      - master
#  paths:
#    include:
#      - migrations/*

# This is the Cherry-Pick Pipeline
# IMPORTANT: DO NOT ADD DEPLOYMENT STEPS TO THE BUILD STAGE - THE BUILD IS A DESTRUCTIVE ACTION

parameters:
  - name: cherryPickVersions
    displayName: 'Scripts To Deploy: Comma Separated List Of Full Version Numbers'
    default: ''
    type: string
  - name: buildStage
    type: object
    default:
      stage: 'Build'
      displayName: 'Temp Build'
      variableGroupName: 'RCS DB Common' # userName, password, JDBC, Database.Name
  # This is the extensible definition of your target environments.
  # Every entry in deploymentStages corresponds to an environment - here it's Test and Prod.
  # Pay attention to the 'dependsOn' field - this determines the order of operations.
  # IMPORTANT: check_JDBC will have its schema dropped
  - name: deploymentStages
    type: object
    default:
      - stage: 'Test'
        dependsOn: 'Build'
        displayName: 'Deploy Test'
        pauseForCodeReview: false
        generateDriftAndChangeReport: true # requires check database to be provisioned
        staticCodeAnalysis: false          # requires python3 installed on agent and pip on PATH
        variableGroupName: 'RCS DB Test'   # userName, password, JDBC, Database.Name, check_JDBC
      - stage: 'Prod'
        dependsOn: 'Test'
        displayName: 'Deploy Prod'
        pauseForCodeReview: true
        generateDriftAndChangeReport: true # requires check database to be provisioned
        staticCodeAnalysis: false          # requires python3 installed on agent and pip on PATH
        variableGroupName: 'RCS DB Test Prod' # userName, password, JDBC, Database.Name, check_JDBC
stages: # <-- this is line 61, where the error points
  - stage: Build
    pool: $(AGENT_POOL)
    displayName: ${{ parameters.buildStage.displayName }}
    jobs:
      - job: Build
        variables:
          - group: ${{ parameters.buildStage.variableGroupName }}
          - group: flyway_vars
        steps:
          - script: '$(FLYWAY) clean info -url="$(JDBC)"'
            failOnStderr: true
            displayName: 'Clean Build DB'
            env:
              FLYWAY_CLEAN_DISABLED: false
          - script: '$(FLYWAY) migrate info -url="$(JDBC)" -baselineOnMigrate=true -baselineVersion=$(BASELINE_VERSION)'
            failOnStderr: true
            displayName: 'Validate Migrate Scripts'
          - script: '$(FLYWAY) undo info -url="$(JDBC)" -target="$(FIRST_UNDO_SCRIPT)"?'
            continueOnError: true
            displayName: 'Validate Undo Scripts'
          - task: CopyFiles@2
            inputs:
              targetFolder: '$(System.ArtifactsDirectory)'
          - task: PublishBuildArtifacts@1
            displayName: 'Publish Build Artifact'
            inputs:
              ArtifactName: '$(BUILD_NAME)'
              PathtoPublish: '$(System.ArtifactsDirectory)'
  - ${{ each stage in parameters.deploymentStages }}:
      - stage: ${{ stage.stage }}
        pool: $(AGENT_POOL)
        displayName: ${{ stage.displayName }}
        dependsOn: ${{ stage.dependsOn }}
        jobs:
          - job: PreRelease
            displayName: Release Preview
            variables:
              - group: ${{ stage.variableGroupName }}
              - group: flyway_vars
            steps:
              - task: DownloadBuildArtifacts@0
                inputs:
                  buildType: 'current'
                  downloadType: 'single'
                  artifactName: '$(BUILD_NAME)'
                  downloadPath: '$(System.ArtifactsDirectory)'
              - script: '$(FLYWAY) migrate -dryRunOutput="$(System.ArtifactsDirectory)\${{ stage.stage }}-$(RELEASE_PREVIEW)" -url="$(JDBC)" -baselineOnMigrate=true -baselineVersion=$(BASELINE_VERSION)'
                workingDirectory: '$(System.DefaultWorkingDirectory)'
                failOnStderr: true
                displayName: 'Pre-Release Deployment Report'
                env:
                  FLYWAY_CLEAN_DISABLED: true
              - task: PublishBuildArtifacts@1
                displayName: 'Publish Release Preview'
                inputs:
                  ArtifactName: 'Release Preview'
                  PathtoPublish: '$(System.ArtifactsDirectory)\${{ stage.stage }}-$(RELEASE_PREVIEW)'

          - ${{ if eq(stage.staticCodeAnalysis, true) }}:
              - job: ChangeReport
                timeoutInMinutes: 0 # how long to run the job before automatically cancelling
                cancelTimeoutInMinutes: 2 # how much time to give 'run always even if cancelled tasks' before stopping them
                dependsOn: 'PreRelease'
                displayName: Change Report With Code Analysis
                variables:
                  - group: ${{ stage.variableGroupName }}
                  - group: flyway_vars
                steps:
                  - script: 'pip install sqlfluff==1.2.1'
                    displayName: 'Install SQL Fluff'
                    failOnStderr: true
                  - script: '$(FLYWAY) check -changes -drift -code -check.buildUrl=$(check_JDBC) -url="$(JDBC)" -check.reportFilename="$(System.ArtifactsDirectory)\$(Database.Name)-$(Build.BuildId)-$(DRIFT_AND_CHANGE_REPORT)"'
                    workingDirectory: '$(System.DefaultWorkingDirectory)'
                    failOnStderr: true
                    displayName: '$(DRIFT_AND_CHANGE_REPORT_DISPLAY_NAME)'
                    env:
                      FLYWAY_CLEAN_DISABLED: false
                  - task: PublishBuildArtifacts@1
                    displayName: 'Publish $(DRIFT_AND_CHANGE_REPORT_DISPLAY_NAME)'
                    inputs:
                      ArtifactName: '$(DRIFT_AND_CHANGE_REPORT_DISPLAY_NAME)'
                      PathtoPublish: '$(System.ArtifactsDirectory)\$(Database.Name)-$(Build.BuildId)-$(DRIFT_AND_CHANGE_REPORT)'

          - ${{ if and(eq(stage.generateDriftAndChangeReport, true), eq(stage.staticCodeAnalysis, false)) }}:
              - job: ChangeReport
                displayName: Change Report
                timeoutInMinutes: 0
                dependsOn: 'PreRelease'
                variables:
                  - group: ${{ stage.variableGroupName }}
                  - group: flyway_vars
                steps:
                  - script: '$(FLYWAY) check -cherryPick=$(cherryPickVersions) -drift -check.buildUrl=$(check_JDBC) -url="$(JDBC)" -check.reportFilename="$(System.ArtifactsDirectory)\$(Database.Name)-$(Build.BuildId)-$(DRIFT_AND_CHANGE_REPORT)"'
                    workingDirectory: '$(System.DefaultWorkingDirectory)'
                    failOnStderr: true
                    displayName: '$(DRIFT_AND_CHANGE_REPORT_DISPLAY_NAME)'
                    env:
                      FLYWAY_CLEAN_DISABLED: false
                  - task: PublishBuildArtifacts@1
                    displayName: 'Publish $(DRIFT_AND_CHANGE_REPORT_DISPLAY_NAME)'
                    inputs:
                      ArtifactName: '$(DRIFT_AND_CHANGE_REPORT_DISPLAY_NAME)'
                      PathtoPublish: '$(System.ArtifactsDirectory)\$(Database.Name)-$(Build.BuildId)-$(DRIFT_AND_CHANGE_REPORT)'

          - ${{ if and(eq(stage.generateDriftAndChangeReport, false), eq(stage.staticCodeAnalysis, false)) }}:
              - job: ChangeReport
                pool: server
                displayName: Skipping Change Report
                dependsOn: 'PreRelease'

          - ${{ if eq(stage.pauseForCodeReview, true) }}:
              - job: CodeReview
                displayName: Code Review
                dependsOn: 'ChangeReport'
                pool: server
                steps:
                  - task: ManualValidation@0
                    displayName: 'Review Change Report Prior To Release'
                    timeoutInMinutes: 4320 # job times out after 3 days
                    inputs:
                      notifyUsers: |
                        user@email.com
                        example@example.com
                      instructions: 'Review changes'

          - ${{ if eq(stage.pauseForCodeReview, false) }}:
              - job: CodeReview
                pool: server
                displayName: Skipping Code Review
                dependsOn: 'ChangeReport'

          - job: Deploy
            displayName: Deployment
            dependsOn: 'CodeReview'
            variables:
              - group: ${{ stage.variableGroupName }}
              - group: flyway_vars
            steps:
              - script: '$(FLYWAY) -cherryPick=$(cherryPickVersions) info migrate info -url="$(JDBC)" -baselineOnMigrate=true -baselineVersion=$(BASELINE_VERSION)'
                workingDirectory: $(System.DefaultWorkingDirectory)
                displayName: ${{ stage.displayName }}
                failOnStderr: true
                env:
                  FLYWAY_CLEAN_DISABLED: true # clean drops a target DB schema, keep disabled except for the build step
If anyone has tried to do something similar, I'd love to hear from you.
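For what it's worth, the error happens because a template that defines its own root-level stages: is being inserted under jobs:, where only jobs are allowed. One pattern that should resolve it (a sketch under that assumption, not a drop-in fix for this exact repo) is to insert the stage templates at the top level of the master pipeline and select between them with compile-time template expressions, since pipeline is a parameter and is known at compile time:

# Hypothetical master pipeline: pick a stages template at compile time.
# Assumes each referenced template keeps its own root-level 'stages:' list.
stages:
  - ${{ if eq(parameters.pipeline, 'Cherry-Pick-Versions') }}:
      - template: azure-pipelines_Cherry Pick.yml@self
  - ${{ if eq(parameters.pipeline, 'Hotfix') }}:
      - template: azure-pipelines_Hotfix.yml@self

With this shape, the SelectCherryPick-style wrapper jobs and the runtime condition: lines on the template references are no longer needed; the unselected templates are never expanded at all.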

Related

Build pipeline name is not displayed as expected

I have this pipeline file:
# Starter pipeline
# Start with a minimal pipeline that you can customize to build and deploy your code.
# Add steps that build, run tests, deploy, and more:
# https://aka.ms/yaml

trigger:
  branches:
    include:
      - main
      - issues*
      - tasks*
  paths:
    exclude:
      - documentation/*
      - Readme.md

variables:
  - name: majorVersion
    value: 1
  - name: minorVersion
    value: 0
  - name: revision
    value: $[counter(variables['minorVersion'], 0)]
  - name: buildVersion
    value: $(majorVersion).$(minorVersion).$(revision)

name: $(buildVersion)
I expect the pipeline name to be 1.0.0, but instead it is the literal string $(majorVersion).$(minorVersion).$(revision). Where did I get the formatting wrong?
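A likely explanation (an educated guess, not confirmed from the post): variables defined in terms of other variables are not re-expanded when the run name is computed, so $(buildVersion) resolves once to its stored value $(majorVersion).$(minorVersion).$(revision) and stops there. One thing to try is referencing the parts directly in name:, skipping the intermediate variable:

# Sketch: build the number directly in the name field instead of
# going through the intermediate buildVersion variable.
name: $(majorVersion).$(minorVersion).$(revision)

If more logic is needed, the ##vso[build.updatebuildnumber] logging command (used in the next question below) is another way to set the build number from a script at runtime.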

Increment version and update pom.xml <version> in Azure DevOps pipeline

I am trying to increment the version number for my builds and then update the pom.xml file with the new version. My PowerShell that edits and saves pom.xml does not seem to work: it never reaches the $xml.project.version (i.e. the <version> tag) to change it.
Do you have any suggestions for getting it to find the <version> tag and save the updated document?
For additional info, the DevOps pipeline currently runs on windows-latest.
trigger:
  branches:
    include:
      - localdev

variables:
  - name: azureSubscription
    value: 'xxxx'
  - name: webAppName
    value: 'xxx'
  - name: environmentName
    value: 'xxx'
  - name: vmImageName
    value: 'ubuntu-latest'
  - name: version.MajorMinor
    value: '1.0'
  - name: version.Patch
    value: $[counter(variables['version.MajorMinor'], 0)]
  - name: stableVersionNumber
    value: '$(version.MajorMinor).$(version.Patch)'
  - name: prereleaseVersionNumber
    value: 'Set dynamically below in a task'
  - name: versionNumber
    value: 'Set dynamically below in a task'
  - name: isMainBranch
    value: $[eq(variables['Build.SourceBranch'], 'refs/heads/main')]
stages:
  - stage: Build
    displayName: Build stage
    jobs:
      - job: Fixversionnumber
        displayName: Fix Version number
        pool:
          vmImage: $(vmImageName)
        steps:
          - task: PowerShell@2
            displayName: Set the prereleaseVersionNumber variable value
            inputs:
              targetType: 'inline'
              script: |
                [string] $prereleaseVersionNumber = "$(stableVersionNumber)"
                Write-Host "Setting the prerelease version number variable to '$prereleaseVersionNumber'."
                Write-Host "##vso[task.setvariable variable=prereleaseVersionNumber]$prereleaseVersionNumber"
          - task: PowerShell@2
            displayName: Set the versionNumber to the stable or prerelease version number based on whether the 'main' branch is being built
            inputs:
              targetType: 'inline'
              script: |
                [bool] $isMainBranch = $$(isMainBranch)
                [string] $versionNumber = "$(prereleaseVersionNumber)"
                if ($isMainBranch)
                {
                  $versionNumber = "$(stableVersionNumber)"
                }
                Write-Host "Setting the version number to use to '$versionNumber'."
                Write-Host "##vso[task.setvariable variable=versionNumber]$versionNumber"
          - task: PowerShell@2
            displayName: Set the name of the build (i.e. the Build.BuildNumber)
            inputs:
              targetType: 'inline'
              script: |
                [string] $buildName = "$(versionNumber)_$(Build.SourceBranchName)"
                Write-Host "Setting the name of the build to '$buildName'."
                Write-Host "##vso[build.updatebuildnumber]$buildName"
          - task: PowerShell@2
            displayName: Update the <version> element in pom.xml
            inputs:
              targetType: 'inline'
              script: |
                # Get the version to stamp into pom.xml
                $versionNum = "$(versionNumber)"
                # Read the existing file
                [xml]$xml = Get-Content "pom.xml"
                # Set the <version> element (this is the part that doesn't work)
                $xml.project.version = "$versionNum"
                # Save the document back to pom.xml
                $xml.Save("$(System.DefaultWorkingDirectory)\pom.xml")
                # Print the new file content
                Write-Host "Setting the version number to use to '$versionNum'."
                gc $(System.DefaultWorkingDirectory)\pom.xml
          - task: Maven@3
            displayName: 'Maven Package'
            inputs:
              mavenPomFile: 'pom.xml'
          - task: CopyFiles@2
            displayName: 'Copy Files to artifact staging directory'
            inputs:
              SourceFolder: '$(System.DefaultWorkingDirectory)'
              Contents: '**/target/*.?(war|jar)'
              TargetFolder: $(Build.ArtifactStagingDirectory)
          - task: buildDropFile
            inputs:
              targetPath: $(Build.ArtifactStagingDirectory)
              artifactName: drop
You can use the third-party extension Replace Tokens. It searches files for a token pattern (the tokenPattern parameter) and replaces the tokens with pipeline variables. Click "Get it free" and install it to your organization.
In your pom.xml file, replace your version number with a unique token, like #{versionNumber}#:
<version>#{versionNumber}#</version>
Note the #{...}# delimiters; they tell the task which content to replace. versionNumber should be the name of the variable whose value you want substituted.
Then, in your pipeline, search for and add a Replace Tokens task. Here is an example:
steps:
  - task: replacetokens@5
    inputs:
      targetFiles: '**/pom.xml'
      encoding: 'auto'
      tokenPattern: 'default' # the default token pattern is #{...}#
      writeBOM: true
      actionOnMissing: 'continue'
      keepToken: false
      actionOnNoFiles: 'continue'
      enableTransforms: false
      enableRecursion: false
      useLegacyPattern: false
      enableTelemetry: true
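As for why the original PowerShell never reaches the <version> element (my best guess, not verified against the asker's project): pom.xml declares a default XML namespace (http://maven.apache.org/POM/4.0.0), and <version> may also be absent entirely when it is inherited from a parent POM. A namespace-aware XPath lookup makes the failure mode explicit. A sketch, keeping the asker's inline-script style:

- task: PowerShell@2
  displayName: 'Update pom.xml <version> (namespace-aware sketch)'
  inputs:
    targetType: 'inline'
    script: |
      $pomPath = "$(System.DefaultWorkingDirectory)\pom.xml"
      [xml]$xml = Get-Content $pomPath
      # Register the Maven POM default namespace so XPath can address its elements
      $ns = New-Object System.Xml.XmlNamespaceManager($xml.NameTable)
      $ns.AddNamespace('m', 'http://maven.apache.org/POM/4.0.0')
      $node = $xml.SelectSingleNode('/m:project/m:version', $ns)
      if ($null -eq $node) { throw "No <version> element found - it may be inherited from a parent POM" }
      $node.InnerText = "$(versionNumber)"
      $xml.Save($pomPath)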

Github actions: merge artifacts after matrix steps

I have a job in a GitHub Actions workflow that runs unit tests and then uploads reports to Jira Xray. The thing is, the test step takes quite a while to complete, so I want to split execution into a few smaller chunks using a matrix.
I did this for linting and it works well; however, for unit tests I'm struggling with how to collect and merge all the reports so they can be uploaded after all matrix jobs are done.
Here's what the current unit-test job looks like:
unit-test:
  runs-on: ubuntu-latest
  needs: setup
  steps:
    - uses: actions/checkout@v3
      with:
        fetch-depth: 0
    - uses: actions/cache@v3
      with:
        path: ${{ env.CACHE_NODE_MODULES_PATH }}
        key: build-${{ hashFiles('**/package-lock.json') }}
    - run: npx nx affected:test --parallel=3 --base=${{ env.BASE_REF }} --head=HEAD # actual unit tests
    - name: Check file existence # checking whether there are reports at all
      if: success() || failure()
      id: check_files
      uses: andstor/file-existence-action@v1
      with:
        # all reports will be placed in this directory
        # for a matrix job, reports are spread across agents, so they must be merged
        files: 'reports/**/test-*.xml'
    - name: Import results to Xray
      if: (success() || failure()) && steps.check_files.outputs.files_exists == 'true' && github.event_name == 'push'
      uses: mikepenz/xray-action@v2
      with:
        username: ${{ secrets.XRAY_CLIENT_ID }}
        password: ${{ secrets.XRAY_CLIENT_SECRET }}
        testFormat: 'junit'
        testPaths: 'reports/**/test-*.xml' # that's where I need to grab all reports
        projectKey: 'MY_KEY'
        combineInSingleTestExec: true
The matrix job for linting looks like this. I would like to do the same for unit tests, but at the same time I need to collect all the reports, as the job above does:
linting:
  runs-on: ubuntu-latest
  needs: [setup]
  strategy:
    matrix:
      step: ${{ fromJson(needs.setup.outputs.lint-bins) }} # this will be something like [1,2,3,4]
  steps:
    - uses: actions/checkout@v3
      with:
        fetch-depth: 0
    - uses: actions/cache@v3
      with:
        path: ${{ env.CACHE_NODE_MODULES_PATH }}
        key: build-${{ hashFiles('**/package-lock.json') }}
    # some nodejs logic to run a few jobs; it uses "execSync" from "child_process" to invoke the task
    - run: node scripts/ci-run-many.mjs --target=lint --outputTarget=execute --partNumber=${{ matrix.step }} --base=${{ env.BASE_REF }} --head=HEAD
Figured it out myself:
unit-test:
  runs-on: ubuntu-latest
  needs: [setup]
  strategy:
    fail-fast: false
    matrix:
      step: ${{ fromJson(needs.setup.outputs.unit-test-bins) }}
  steps:
    - uses: actions/checkout@v3
      with:
        fetch-depth: 0
    - uses: actions/cache@v3
      with:
        path: ${{ env.CACHE_NODE_MODULES_PATH }}
        key: build-${{ hashFiles('**/package-lock.json') }}
    - run: node scripts/ci-run-many.mjs --target=test --outputTarget=execute --partNumber=${{ matrix.step }} --base=${{ env.BASE_REF }} --head=HEAD
    - name: Upload reports' artifacts
      if: success() || failure()
      uses: actions/upload-artifact@v3
      with:
        name: ${{ env.RUN_UNIQUE_ID }}_artifact_${{ matrix.step }}
        if-no-files-found: ignore
        path: reports
        retention-days: 1

process-test-data:
  runs-on: ubuntu-latest
  needs: unit-test
  if: success() || failure()
  steps:
    - uses: actions/checkout@v3
    - name: Download reports' artifacts
      uses: actions/download-artifact@v3
      with:
        path: downloaded_artifacts
    - name: Place reports' artifacts
      run: rsync -av downloaded_artifacts/*/*/ unit_test_reports/
    - name: Check reports existence
      id: check_files
      uses: andstor/file-existence-action@v1
      with:
        files: 'unit_test_reports/**/test-*.xml'
    - name: Import results to Xray
      run: ls -R unit_test_reports/
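One detail worth calling out (my reading of the actions involved, not stated in the original answer): when actions/download-artifact@v3 is given no name, it downloads every artifact of the run into a separate subdirectory named after that artifact, which is why the rsync step flattens downloaded_artifacts/*/*/ into a single unit_test_reports/ folder before the existence check and the Xray import.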

No package found with specified pattern: D:\a\1\a\func-liststarter-api-seg-dev-001.zip - Check if the package mentioned in the task is published as an artifact

The buildtest.yaml file is below:
parameters:
  - name: solution_filename
    type: string
    default: '**/*.sln'

stages:
  - stage: BuildTest
    displayName: 'Build & Validation Stage'
    variables:
      buildPlatform: 'Any CPU'
      buildConfiguration: 'Release'
      solution: ${{ parameters.solution_filename }}
    jobs:
      - job: Build
        displayName: Build Full Solution
        dependsOn: []
        pool:
          vmImage: $(poolImage)
        steps:
          - template: ./ci-and-pr.yaml
            parameters:
              solution_filename: $(solution)
              solution_build_configuration: $(buildConfiguration)
          - task: PublishPipelineArtifact@1
            inputs:
              path: '$(Build.ArtifactStagingDirectory)'
              artifact: 'drop'
          - task: CopyFiles@2
            inputs:
              targetFolder: '$(Build.ArtifactStagingDirectory)'
          - task: PublishBuildArtifacts@1
            displayName: 'Publish Artifact: drop'
            inputs:
              PathtoPublish: '$(Build.ArtifactStagingDirectory)'
ci-and-pr.yaml is below
parameters:
  - name: solution_filename
    type: string
    default: '**/*.sln'
  # solution_build_configuration - (Optional) The build configuration within the solution that should be invoked.
  # Default is Release but can be overwritten if you want to do say a Debug build
  - name: solution_build_configuration
    type: string
    default: Release
  - name: solution_target_platform
    type: string
    default: 'Any CPU'

steps:
  # Bootstrap the build
  - template: ./bootstrap-build.yaml
  - task: NuGetCommand@2
    displayName: 'NuGet restore'
    inputs:
      command: restore
      restoreSolution: ${{ parameters.solution_filename }}
      feedsToUse: config
      nugetConfigPath: $(Build.SourcesDirectory)/nuget.config
      includeNuGetOrg: true
  - task: DotNetCoreCLI@1
    displayName: Build
    inputs:
      command: build
      projects: ${{ parameters.solution_filename }}
      arguments: '--configuration $(buildConfiguration)'
  - task: DotNetCoreCLI@1
    displayName: 'Publish Solution'
    inputs:
      command: publish
      publishWebProjects: false
      projects: ${{ parameters.solution_filename }}
      arguments: "--configuration ${{ parameters.solution_build_configuration }} --output $(Build.ArtifactStagingDirectory)"
The deploApp.yaml file is below:
parameters:
  - name: AzureServiceConnection
    type: string
  #- name: AzureRegion
  #  type: string
  #- name: SubscriptionId
  #  type: string
  - name: AppName
    type: string

steps:
  - task: AzureFunctionApp@1
    displayName: "Azure Function App Deployment"
    inputs:
      azureSubscription: ${{ parameters.AzureServiceConnection }}
      appName: ${{ parameters.AppName }}
      appType: 'functionApp'
      package: $(Pipeline.Workspace)/**/*.zip
      deploymentMethod: runFromPackage
I am getting this error while running the pipeline: "No package found with specified pattern: D:\a\1\a\func-liststarter-api-seg-dev-001.zip. Check if the package mentioned in the task is published as an artifact in the build or a previous stage and downloaded in the current job."
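Not part of the original post, but a common cause of this error: nothing in the deployment job downloads the artifact before AzureFunctionApp@1 looks for it, so $(Pipeline.Workspace) contains no .zip (deployment jobs download pipeline artifacts automatically; ordinary jobs do not). A minimal sketch, assuming the artifact published in BuildTest is named 'drop':

steps:
  - task: DownloadPipelineArtifact@2
    displayName: 'Download build artifact'
    inputs:
      artifact: 'drop' # hypothetical name; must match the publish step
      path: '$(Pipeline.Workspace)/drop'
  - task: AzureFunctionApp@1
    displayName: "Azure Function App Deployment"
    inputs:
      azureSubscription: ${{ parameters.AzureServiceConnection }}
      appName: ${{ parameters.AppName }}
      appType: 'functionApp'
      package: '$(Pipeline.Workspace)/drop/**/*.zip'
      deploymentMethod: runFromPackage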

Is it possible to achieve such a refactor in YAML

I'm working on a Concourse pipeline and I need to duplicate a lot of code in my YAML, so I'm trying to refactor it to be easily maintainable without ending up with thousands of duplicated lines/blocks.
Following what seems to be the recommended approach, I arrived at the YAML below, but it doesn't fulfill all my needs.
add-rotm-points: &add-rotm-points
  task: add-rotm-points
  config:
    platform: linux
    image_resource:
      type: docker-image
      source:
        repository: ((registre))/polygone/concourse/cf-cli-python3
        tag: 0.0.1
        insecure_registries: [ ((registre)) ]
    run:
      path: source-pipeline/commun/rotm/trigger-rotm.sh
      args: [ "source-pipeline", "source-code-x" ]
    inputs:
      - name: source-pipeline
      - name: source-code-x

jobs:
  - name: test-a
    plan:
      - in_parallel:
          - get: source-pipeline
          - get: source-code-a
            trigger: true
      - <<: *add-rotm-points
  - name: test-b
    plan:
      - in_parallel:
          - get: source-pipeline
          - get: source-code-b
            trigger: true
      - <<: *add-rotm-points
My problem is that both my jobs use the generic task defined at the top, but inside the generic task I need to change source-code-x to the -a or -b version used in each job.
I can't find a way to achieve this without duplicating the anchor in every job, which seems counterproductive. But I may not have fully understood YAML anchors/merges.
All you need to do is map inputs on individual tasks, like this:
add-rotm-points: &add-rotm-points
  task: add-rotm-points
  config:
    platform: linux
    image_resource:
      type: docker-image
      source:
        repository: ((registre))/polygone/concourse/cf-cli-python3
        tag: 0.0.1
        insecure_registries: [ ((registre)) ]
    run:
      path: source-pipeline/commun/rotm/trigger-rotm.sh
      args: [ "source-pipeline", "source-code-x" ]
    inputs:
      - name: source-pipeline
      - name: source-code-x

jobs:
  - name: test-a
    plan:
      - in_parallel:
          - get: source-pipeline
          - get: source-code-a
            trigger: true
      - <<: *add-rotm-points
        input_mapping:
          source-code-x: source-code-a
  - name: test-b
    plan:
      - in_parallel:
          - get: source-pipeline
          - get: source-code-b
            trigger: true
      - <<: *add-rotm-points
        input_mapping:
          source-code-x: source-code-b
See Example Three in this blog: https://blog.concourse-ci.org/introduction-to-task-inputs-and-outputs/
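A note on why the anchor alone can't do this (my summary, not from the answer): YAML anchors are expanded verbatim by the parser before Concourse ever sees the pipeline, so there is no way to parameterize the source-code-x input at the YAML level. input_mapping works because it is a Concourse step modifier applied on top of the merged task: it maps the job's actual artifact (source-code-a or source-code-b) onto the input name the shared task config expects.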
