Is it possible to achieve such a refactor in YAML?

I'm working on a Concourse pipeline and I need to duplicate a lot of code in my YAML, so I'm trying to refactor it to keep it maintainable and avoid ending up with thousands of duplicated lines/blocks.
I arrived at the following YAML after following what seems to be the recommended approach, but it doesn't fulfill all my needs.
add-rotm-points: &add-rotm-points
  task: add-rotm-points
  config:
    platform: linux
    image_resource:
      type: docker-image
      source:
        repository: ((registre))/polygone/concourse/cf-cli-python3
        tag: 0.0.1
        insecure_registries: [ ((registre)) ]
    run:
      path: source-pipeline/commun/rotm/trigger-rotm.sh
      args: [ "source-pipeline", "source-code-x" ]
    inputs:
      - name: source-pipeline
      - name: source-code-x
jobs:
- name: test-a
  plan:
  - in_parallel:
    - get: source-pipeline
    - get: source-code-a
      trigger: true
  - <<: *add-rotm-points
- name: test-b
  plan:
  - in_parallel:
    - get: source-pipeline
    - get: source-code-b
      trigger: true
  - <<: *add-rotm-points
My problem is that both of my jobs use the generic task defined at the top, but inside that generic task I need to change source-code-x to the -a or -b version that the job uses.
I cannot find a way to do this without duplicating my anchor in every job, which seems counterproductive. But I may not have fully understood YAML anchors/merges.

All you need to do is map inputs on individual tasks, like this:
add-rotm-points: &add-rotm-points
  task: add-rotm-points
  config:
    platform: linux
    image_resource:
      type: docker-image
      source:
        repository: ((registre))/polygone/concourse/cf-cli-python3
        tag: 0.0.1
        insecure_registries: [ ((registre)) ]
    run:
      path: source-pipeline/commun/rotm/trigger-rotm.sh
      args: [ "source-pipeline", "source-code-x" ]
    inputs:
      - name: source-pipeline
      - name: source-code-x
jobs:
- name: test-a
  plan:
  - in_parallel:
    - get: source-pipeline
    - get: source-code-a
      trigger: true
  - <<: *add-rotm-points
    input_mapping:
      source-code-x: source-code-a
- name: test-b
  plan:
  - in_parallel:
    - get: source-pipeline
    - get: source-code-b
      trigger: true
  - <<: *add-rotm-points
    input_mapping:
      source-code-x: source-code-b
See Example Three in this blog: https://blog.concourse-ci.org/introduction-to-task-inputs-and-outputs/
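For clarity, the `<<:` merge key simply folds the anchor's mappings into the step, so input_mapping ends up as a sibling of task and config. After expansion, the test-a step is effectively:

```yaml
- task: add-rotm-points
  config:
    # ...same platform/image_resource/run/inputs as in the anchor...
  input_mapping:
    source-code-x: source-code-a
```

Concourse then presents the source-code-a artifact to the task under the name source-code-x, so the anchored config never needs to change.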

Related

Build pipeline name is not displayed as expected

I have this pipeline file:
# Starter pipeline
# Start with a minimal pipeline that you can customize to build and deploy your code.
# Add steps that build, run tests, deploy, and more:
# https://aka.ms/yaml
trigger:
  branches:
    include:
    - main
    - issues*
    - tasks*
  paths:
    exclude:
    - documentation/*
    - Readme.md

variables:
- name: majorVersion
  value: 1
- name: minorVersion
  value: 0
- name: revision
  value: $[counter(variables['minorVersion'],0)]
- name: buildVersion
  value: $(majorVersion).$(minorVersion).$(revision)

name: $(buildVersion)
and I expect the pipeline name to be 1.0.0,
but instead it is the literal string $(majorVersion).$(minorVersion).$(revision).
Where did I get the formatting wrong?
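One workaround I have seen for this kind of problem (an assumption on my part, not from this thread) is to skip the nested buildVersion variable entirely and set the run name from a step instead, using the build.updatebuildnumber logging command, which is evaluated at runtime:

```yaml
steps:
- script: echo "##vso[build.updatebuildnumber]$(majorVersion).$(minorVersion).$(revision)"
  displayName: Set build number  # renames the run once the job starts
```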

Calling a secondary pipeline from main pipeline

What I'm trying to achieve is that on pull request completion, a pipeline runs which asks the user to select one of four options: a pipeline in which I can choose which version to build/release, one that handles all versions, one that undoes a particular version, or a hotfix pipeline that skips the build stage and merely deploys a particular version.
Sorry in advance for the poor formatting.
'''
name: ADO-self-hosted-pipeline-Master-calls-Specific-Template

trigger: none

parameters:
- name: pipeline
  displayName: Choose Pipeline
  type: string
  default: All-Versions
  values:
  - All-Versions
  - Cherry-Pick-Versions
  - Hotfix
  - Undo-Versions

stages:
- stage: CherryPick
  pool: $(AGENT_POOL)
  displayName: 'Cherry pick'
  jobs:
  - job: SelectCherryPick
    displayName: Select the Cherry Pick pipeline
  - template: azure-pipelines_Cherry Pick.yml@self
    condition: and(succeeded(), ${{eq(parameters['pipeline'], 'Cherry-Pick-Versions' ) }}
- stage: Hotfix
  pool: $(AGENT_POOL)
  displayName: 'Hotfix'
  jobs:
  - job: SelectHotfix
    displayName: Select the Hotfix pipeline
  - template: azure-pipelines_Hotfix.yml@self
    condition: and(succeeded(), ${{eq(parameters['pipeline'], 'Hotfix' ) }}
- stage: All-Versions
  pool: $(AGENT_POOL)
  displayName: 'All-Versions'
  jobs:
  - job: SelectAll
    displayName: Select the All Migrations pipeline
  - template: azure-pipelines_All Migrations.yml@self
    condition: and(succeeded(), ${{eq(parameters['pipeline'], 'All-Versions' ) }}
- stage: Undo-Versions
  pool: $(AGENT_POOL)
  displayName: 'Undo-Versions'
  jobs:
  - job: SelectUndo
    displayName: Select the Undo pipeline
  - template: azure-pipelines_undo.yml@self
    condition: and(succeeded(), ${{eq(parameters['pipeline'], 'Undo-Versions' ) }}

variables:
  BUILD_NAME: 'Build'
  FLYWAY: 'E:\Agents\$(Agent.Name)\Flyway\flyway -user="$(userName)" -password="$(password)" -licenseKey=$(FLYWAY_LICENSE_KEY) -outOfOrder=true'
  RELEASE_PREVIEW: 'Release-Preview.sql'
  DRIFT_AND_CHANGE_REPORT: 'Drift-And-Change-Report.html'
  DRIFT_AND_CHANGE_REPORT_DISPLAY_NAME: 'Drift And Change Report'
  group: flyway_vars
  #cherryPickVersions: ${{parameters.cherryPickVersions}}
'''
The pipeline works well, with this result:
Master Pipeline to Select further Pipelines
So I can choose which pipeline template from there, but when I press Run, I get an error:
Run after selecting cherry pick
The error is: /azure-pipelines_Cherry Pick.yml (Line: 61, Col: 1): Unexpected value 'stages'
The full Cherry Pick YAML Template is:
#name: ADO-self-hosted-pipeline-Cherry-Pick
# This is the default pipeline for a self-hosted Windows agent on Azure Devops.
# Install flyway cli on agent, add flyway to PATH: https://download.red-gate.com/maven/release/org/flywaydb/enterprise/flyway-commandline
# Install python3 on agent and add pip to PATH if staticCodeAnalysis is set to true
# Make sure this file is in the same directory as the migrations folder of the Flyway Enterprise project.
# Provision a dev, shadow, build databases, as well as any target environments that need to be created: https://documentation.red-gate.com/fd/proof-of-concept-checklist-152109292.html
# Further instructions if needed here: https://documentation.red-gate.com/fd/self-hosted-windows-agent-yaml-pipeline-in-azure-devops-158564470.html
# For video reference, see: https://www.red-gate.com/hub/university/courses/flyway/flyway-desktop/setting-up-a-flyway-desktop-project/basic-flyway-desktop-project-setup-and-configuration
#trigger:
#  branches:
#    include:
#    - master
#  paths:
#    include:
#    - migrations/*
#This is the Cherry-Pick Pipeline
# IMPORTANT: DO NOT ADD DEPLOYMENT STEPS TO THE BUILD STAGE - THE BUILD IS A DESTRUCTIVE ACTION
parameters:
- name: cherryPickVersions
  displayName: 'Scripts To Deploy: Comma Separated List Of Full Version Numbers'
  default: ''
  type: string
- name: buildStage
  type: object
  default:
    stage: 'Build'
    displayName: 'Temp Build'
    variableGroupName: 'RCS DB Common' #userName, password, JDBC, Database.Name
# This is the extensible definition of your target environments.
# Every parameter in deploymentStages corresponds to an environment - here it's Test and Prod.
# Pay attention to the 'dependsOn' field - this determines order of operations.
# IMPORTANT: check_JDBC will have schema dropped
- name: deploymentStages
  type: object
  default:
  - stage: 'Test'
    dependsOn: 'Build'
    displayName: 'Deploy Test'
    pauseForCodeReview: false
    generateDriftAndChangeReport: true #requires check database to be provisioned
    staticCodeAnalysis: false #requires python3 installed on agent and pip on PATH
    variableGroupName: 'RCS DB Test' #userName, password, JDBC, Database.Name, check_JDBC
  - stage: 'Prod'
    dependsOn: 'Test'
    displayName: 'Deploy Prod'
    pauseForCodeReview: true
    generateDriftAndChangeReport: true #requires check database to be provisioned
    staticCodeAnalysis: false #requires python3 installed on agent and pip on PATH
    variableGroupName: 'RCS DB Test Prod' #userName, password, JDBC, Database.Name, check_JDBC
stages: # This is line 61
- stage: Build
  pool: $(AGENT_POOL)
  displayName: ${{parameters.buildStage.displayName}}
  jobs:
  - job: Build
    variables:
    - group: ${{parameters.buildStage.variableGroupName}}
    - group: flyway_vars
    steps:
    - script: '$(FLYWAY) clean info -url="$(JDBC)"'
      failOnStderr: true
      displayName: 'Clean Build DB'
      env:
        FLYWAY_CLEAN_DISABLED: false
    - script: '$(FLYWAY) migrate info -url="$(JDBC)" -baselineOnMigrate=true -baselineVersion=$(BASELINE_VERSION)'
      failOnStderr: true
      displayName: 'Validate Migrate Scripts'
    - script: '$(FLYWAY) undo info -url="$(JDBC)" -target="$(FIRST_UNDO_SCRIPT)"?'
      continueOnError: true
      displayName: 'Validate Undo Scripts'
    - task: CopyFiles@2
      inputs:
        targetFolder: '$(System.ArtifactsDirectory)'
    - task: PublishBuildArtifacts@1
      displayName: 'Publish Build Artifact'
      inputs:
        ArtifactName: '$(BUILD_NAME)'
        PathtoPublish: '$(System.ArtifactsDirectory)'
- ${{each stage in parameters.deploymentStages}}:
  - stage: ${{stage.stage}}
    pool: $(AGENT_POOL)
    displayName: ${{stage.displayName}}
    dependsOn: ${{stage.dependsOn}}
    jobs:
    - job: PreRelease
      displayName: Release Preview
      variables:
      - group: ${{stage.variableGroupName}}
      - group: flyway_vars
      steps:
      - task: DownloadBuildArtifacts@0
        inputs:
          buildType: 'current'
          downloadType: 'single'
          artifactName: '$(BUILD_NAME)'
          downloadPath: '$(System.ArtifactsDirectory)'
      - script: '$(FLYWAY) migrate -dryRunOutput="$(System.ArtifactsDirectory)\${{stage.stage}}-$(RELEASE_PREVIEW)" -url="$(JDBC)" -baselineOnMigrate=true -baselineVersion=$(BASELINE_VERSION)'
        workingDirectory: '$(System.DefaultWorkingDirectory)'
        failOnStderr: true
        displayName: 'Pre-Release Deployment Report'
        env:
          FLYWAY_CLEAN_DISABLED: true
      - task: PublishBuildArtifacts@1
        displayName: 'Publish Release Preview'
        inputs:
          ArtifactName: 'Release Preview'
          PathtoPublish: '$(System.ArtifactsDirectory)\${{stage.stage}}-$(RELEASE_PREVIEW)'
    - ${{if eq(stage.staticCodeAnalysis, true)}}:
      - job: ChangeReport
        timeoutInMinutes: 0 # how long to run the job before automatically cancelling
        cancelTimeoutInMinutes: 2 # how much time to give 'run always even if cancelled tasks' before stopping them
        dependsOn: 'PreRelease'
        displayName: Change Report With Code Analysis
        variables:
        - group: ${{stage.variableGroupName}}
        - group: flyway_vars
        steps:
        - script: 'pip install sqlfluff==1.2.1'
          displayName: 'Install SQL Fluff'
          failOnStderr: true
        - script: '$(FLYWAY) check -changes -drift -code -check.buildUrl=$(check_JDBC) -url="$(JDBC)" -check.reportFilename="$(System.ArtifactsDirectory)\$(Database.Name)-$(Build.BuildId)-$(DRIFT_AND_CHANGE_REPORT)"'
          workingDirectory: '$(System.DefaultWorkingDirectory)'
          failOnStderr: true
          displayName: '$(DRIFT_AND_CHANGE_REPORT_DISPLAY_NAME)'
          env:
            FLYWAY_CLEAN_DISABLED: false
        - task: PublishBuildArtifacts@1
          displayName: 'Publish $(DRIFT_AND_CHANGE_REPORT_DISPLAY_NAME)'
          inputs:
            ArtifactName: '$(DRIFT_AND_CHANGE_REPORT_DISPLAY_NAME)'
            PathtoPublish: '$(System.ArtifactsDirectory)\$(Database.Name)-$(Build.BuildId)-$(DRIFT_AND_CHANGE_REPORT)'
    - ${{if and(eq( stage.generateDriftAndChangeReport, true), eq( stage.staticCodeAnalysis, false))}}:
      - job: ChangeReport
        displayName: Change Report
        timeoutInMinutes: 0
        dependsOn: 'PreRelease'
        variables:
        - group: ${{stage.variableGroupName}}
        - group: flyway_vars
        steps:
        - script: '$(FLYWAY) check -cherryPick=$(cherryPickVersions) -drift -check.buildUrl=$(check_JDBC) -url="$(JDBC)" -check.reportFilename="$(System.ArtifactsDirectory)\$(Database.Name)-$(Build.BuildId)-$(DRIFT_AND_CHANGE_REPORT)"'
          workingDirectory: '$(System.DefaultWorkingDirectory)'
          failOnStderr: true
          displayName: '$(DRIFT_AND_CHANGE_REPORT_DISPLAY_NAME)'
          env:
            FLYWAY_CLEAN_DISABLED: false
        - task: PublishBuildArtifacts@1
          displayName: 'Publish $(DRIFT_AND_CHANGE_REPORT_DISPLAY_NAME)'
          inputs:
            ArtifactName: '$(DRIFT_AND_CHANGE_REPORT_DISPLAY_NAME)'
            PathtoPublish: '$(System.ArtifactsDirectory)\$(Database.Name)-$(Build.BuildId)-$(DRIFT_AND_CHANGE_REPORT)'
    - ${{if and(eq( stage.generateDriftAndChangeReport, false), eq( stage.staticCodeAnalysis, false))}}:
      - job: ChangeReport
        pool: server
        displayName: Skipping Change Report
        dependsOn: 'PreRelease'
    - ${{if eq(stage.pauseForCodeReview, true)}}:
      - job: CodeReview
        displayName: Code Review
        dependsOn: 'ChangeReport'
        pool: server
        steps:
        - task: ManualValidation@0
          displayName: 'Review Change Report Prior To Release'
          timeoutInMinutes: 4320 # job times out after 3 days
          inputs:
            notifyUsers: |
              user@email.com
              example@example.com
            instructions: 'Review changes'
    - ${{if eq(stage.pauseForCodeReview, false)}}:
      - job: CodeReview
        pool: server
        displayName: Skipping Code Review
        dependsOn: 'ChangeReport'
    - job: Deploy
      displayName: Deployment
      dependsOn: 'CodeReview'
      variables:
      - group: ${{stage.variableGroupName}}
      - group: flyway_vars
      steps:
      - script: '$(FLYWAY) -cherryPick=$(cherryPickVersions) info migrate info -url="$(JDBC)" -baselineOnMigrate=true -baselineVersion=$(BASELINE_VERSION)'
        workingDirectory: $(System.DefaultWorkingDirectory)
        displayName: ${{stage.displayName}}
        failOnStderr: true
        env:
          FLYWAY_CLEAN_DISABLED: true # clean drops a target DB schema, keep disabled except for build step
If anyone has tried to do something similar, I'd love to hear from you.

variable inside variable in yaml files?

Here is a piece of code:
parameters:
- name: Scenario1
  type: object
  default: ['Test1','Test2','Test3','Test4']
- name: Scenario2
  type: object
  default: ['Test5','Test6']

jobs:
- job: Test_Run
  pool:
    vmImage: 'ubuntu-latest'
  steps:
  - template: Test.yml
    parameters:
      tests: ${{ parameters['Scenario1'] }}
For now, the part tests: ${{ parameters['Scenario1'] }} is hardcoded. I would like something like the following, to be able to pick a scenario:
parameters:
- name: Scenario1
  type: object
  default: ['Test1','Test2','Test3','Test4']
- name: Scenario2
  type: object
  default: ['Test5','Test6']

jobs:
- job: Test_Run
  pool:
    vmImage: 'ubuntu-latest'
  steps:
  - template: Test.yml
    parameters:
      tests: ${{ parameters[$(Scenario)] }}
I would like to pass a $(Scenario) variable from the Azure pipeline, but I do not know how to insert a variable inside ${{xxx}}. :|
You are looking for object-typed parameters. It would be something like:
- name: Scenarios
  type: object
  default:
  - ScenarioName: 'Scenario1'
    Tests: ['Test1','Test2','Test3','Test4']
  - ScenarioName: 'Scenario2'
    Tests: ['Test5','Test6']
Then it would be called like:
- ${{ each Scenario in parameters.Scenarios }}:
  - template: test.yml
    parameters:
      ScenarioName: ${{ Scenario.ScenarioName }}
      ScenarioTests: ${{ Scenario.Tests }}
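For completeness, the template being called would declare matching parameters. A minimal sketch of a hypothetical test.yml (its contents are an assumption; only the parameter names come from the call above):

```yaml
# test.yml - hypothetical sketch
parameters:
- name: ScenarioName
  type: string
- name: ScenarioTests
  type: object

steps:
# one step per test in the scenario, expanded at template time
- ${{ each test in parameters.ScenarioTests }}:
  - script: echo "Running ${{ test }} from ${{ parameters.ScenarioName }}"
    displayName: ${{ parameters.ScenarioName }}_${{ test }}
```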

Task with loop in Argo workflow

I want to introduce a for loop in a workflow that consists of 2 individual tasks. The second is dependent on the first, each one uses a different template, and the second should iterate with {{item}}. What I want to know is whether, for each iteration, the default is to execute only the second task, or whether the whole flow is re-executed.
To repeat only the second step, use withItems/withParam (there is no withArtifact, though you can get the same behavior with data). These loops repeat only the specific step they are attached to, once for each of the specified items/parameters.
- name: create-resources
  inputs:
    parameters:
    - name: env
    - name: giturl
    - name: resources
    - name: awssecret
  dag:
    tasks:
    - name: resource
      template: resource-create
      arguments:
        parameters:
        - name: env
          value: "{{inputs.parameters.env}}"
        - name: giturl
          value: "{{inputs.parameters.giturl}}"
        - name: resource
          value: "{{item}}"
        - name: awssecret
          value: "{{inputs.parameters.awssecret}}"
      withParam: "{{inputs.parameters.resources}}"

############# For parallel execution use steps ##############

  steps:
  - - name: resource
      template: resource-create
      arguments:
        parameters:
        - name: env
          value: "{{inputs.parameters.env}}"
        - name: giturl
          value: "{{inputs.parameters.giturl}}"
        - name: resource
          value: "{{item}}"
        - name: awssecret
          value: "{{inputs.parameters.awssecret}}"
      withParam: "{{inputs.parameters.resources}}"
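One detail worth noting about the sketch above: withParam expects a JSON-encoded list, so the resources input parameter would be supplied as a JSON array string (the value below is illustrative, not from the thread):

```yaml
arguments:
  parameters:
  - name: resources
    # JSON array string; {{item}} is bound to each element in turn
    value: '["bucket", "queue", "table"]'
```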

Why am I getting this error in Concourse? Error: No step configured

I am brand new to Concourse and am trying to use it to build a terraform-ci platform, but I cannot figure out why I'm getting this error on my very first pipeline. Can anyone help out?
jobs:
- name: terraform-pipeline
  serial: true
  plan:
  - aggregate:
    - get: master-branch
      trigger: true
    - get: common-tasks
      params: { submodules: [ terraform ] }
      trigger: true
  - task: terraform-plan
    file: common-tasks/terraform/0.12.29.yml
    input_mapping: { source: master-branch }
    params:
      command: plan
      cache: true
      access_key: ((aws-access-key))
      secret_key: ((aws-secret-key))
      directory: master-branch/terraform-poc/dev

resources:
- name: master-branch
  type: git
  source:
    uri: https://github.com/rossrollin/terraform-poc
    branch: master
- name: common-tasks
  type: git
  source:
    uri: https://github.com/telia-oss/concourse-tasks.git
    branch: master
Executing the pipeline like so:
fly -t concourse-poc sp -p terraform-pipeline -c pipeline2.yml -v aws-access-key='' -v aws-secret-key=''
error: error unmarshaling JSON: while decoding JSON: no step configured
The aggregate step was deprecated in Concourse 5.2.0 and removed in 7.0.0.
You need to replace it with the newer in_parallel step.
- - aggregate:
+ - in_parallel:
Removing '- aggregate:' and just running the resource gets inline fixes my issue.
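Applied to the pipeline in the question, the plan would become (same steps, only the aggregate wrapper renamed):

```yaml
jobs:
- name: terraform-pipeline
  serial: true
  plan:
  - in_parallel:
    - get: master-branch
      trigger: true
    - get: common-tasks
      params: { submodules: [ terraform ] }
      trigger: true
  - task: terraform-plan
    file: common-tasks/terraform/0.12.29.yml
```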
