Azure Pipeline pass parameter from one job to another - bash

My ci_pipeline.yml contains the following:
- stage: DeployDEV
  pool:
    vmImage: 'windows-latest'
  variables:
    - group: [REDACTED]
    - name: test_var
      value: firstval
  jobs:
    - job: BuildDeploy
      displayName: "REDACTED"
      steps:
        - script: echo "test_var = $(test_var)"
          displayName: first variable pass
        - bash: echo "##vso[task.setvariable variable=test_var]secondValue"
          displayName: set new variable value
        - script: echo "test_var = $(test_var)"
          displayName: second variable pass
          name: Env
          displayName: "Extract Source Branch Name"
    - template: pipeline_templates/ci_pipeline_templates/build_deploy_dev.yml
      parameters:
        testvar: $(test_var)
and in the build_deploy_dev.yml template:
parameters:
  - name: testvar
jobs:
  - job: Test
    displayName: "Testjob"
    steps:
      - script: echo "test_var=${{ parameters.testvar }}"
        name: TestVar
        displayName: 'test'
I need to be able to modify the variable in the main yml file BEFORE passing it into the template. However, test_var still arrives as firstval. What am I doing wrong? The change seems to succeed in the main yml file: the second variable pass script displays test_var=secondValue. How can I make the change permanent so the template gets it?

As stated above, parameters are evaluated at compile time, so using parameters won't solve your problem. However, you can create a new variable as an output, make your template job depend on the job which generates the new variable, and then use it in your template as follows:
ci_pipeline.yml
- stage: DeployDEV
  pool:
    vmImage: 'windows-latest'
  variables:
    - group: [REDACTED]
    - name: test_var
      value: firstval
  jobs:
    - job: BuildDeploy
      displayName: "REDACTED"
      steps:
        - script: echo "test_var = $(test_var)"
          displayName: first variable pass
        - bash: echo "##vso[task.setvariable variable=new_var;isOutput=true]SomeValue"
          displayName: set new variable value
          name: SetMyVariable
    - template: pipeline_templates/ci_pipeline_templates/build_deploy_dev.yml
Then in your build_deploy_dev.yml you can use the variable you've created previously:
jobs:
  - job: Test
    variables:
      test_var: $[ dependencies.BuildDeploy.outputs['SetMyVariable.new_var'] ]
    displayName: "Testjob"
    dependsOn: BuildDeploy
    steps:
      - script: echo "test_var=$(test_var)"
        name: TestVar
        displayName: 'test'
Note that you can still leverage $(test_var), for instance to check whether it has the value 'firstval'; if it does, you create new_var and use it in the other job. You could even use the value of $(test_var) in new_var, carrying over the value which you previously set:
- bash: echo "##vso[task.setvariable variable=new_var]$(test_var)"
  displayName: set new variable value
In this way you can dynamically 'change' the behavior of your pipeline and keep your template file.
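As a concrete sketch of that conditional pattern (the values and the new_var name here are illustrative), the bash step's script could look like:

```shell
# simulate the macro-expanded value of $(test_var)
test_var="firstval"

# only publish the output variable when the current value matches
if [ "$test_var" = "firstval" ]; then
  line="##vso[task.setvariable variable=new_var;isOutput=true]$test_var"
  echo "$line"   # the agent parses this logging command and creates the output
fi
```

The agent only reacts to the `##vso` logging command when it is printed, so wrapping the echo in a condition is enough to make the output variable optional.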

Unfortunately you cannot use variables set during runtime, i.e. $(test_var), as parameters for templates. Think of templates less like functions and more like snippets: the pipeline essentially swaps the guts of the template in for the reference to the template, and all parameters are evaluated at that time.
So when you set test_var to 'firstval', the template is evaluated at that time and the parameter is set to 'firstval' as well. When you then set test_var later in the YAML, it is too late. See the documentation below:
https://learn.microsoft.com/en-us/azure/devops/pipelines/process/templates?view=azure-devops#context
It may feel dirty to duplicate code, but unfortunately this is the recommended solution.
- stage: DeployDEV
  pool:
    vmImage: 'windows-latest'
  variables:
    - group: [REDACTED]
    - name: test_var
      value: firstval
  jobs:
    - job: BuildDeploy
      displayName: "REDACTED"
      steps:
        - script: echo "test_var = $(test_var)"
          displayName: first variable pass
        - bash: echo "##vso[task.setvariable variable=test_var]secondValue"
          displayName: set new variable value
        - script: echo "test_var = $(test_var)"
          displayName: second variable pass
    - job: Test
      displayName: "Testjob"
      steps:
        - script: echo "test_var=$(test_var)"
          name: TestVar
          displayName: 'test'
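The timing difference both answers rely on can be summarized with the three Azure Pipelines expression syntaxes (a sketch; the variable name is illustrative):

```yaml
variables:
  myVar: firstval

steps:
  # ${{ }} - template expression, expanded at compile time, before any step runs;
  # template parameters use these semantics, which is why setvariable cannot reach them
  - script: echo "${{ variables.myVar }}"
  # $( ) - macro syntax, expanded just before the task executes
  - script: echo "$(myVar)"
  # $[ ] - runtime expression, evaluated at job start; used for job output variables
  - script: echo "conditionally run"
    condition: $[ eq(variables['myVar'], 'firstval') ]
```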

Related

Github composite action output can not be set from within a bash script

For one of my projects, I am setting an action output from within a bash script that is executed inside a composite action. I found that GitHub has excellent documentation on how to create a GitHub composite action output. It states that this can be done using the following action.yml file.
name: 'Hello World'
description: 'Greet someone'
inputs:
  who-to-greet: # id of input
    description: 'Who to greet'
    required: true
    default: 'World'
outputs:
  random-number:
    description: "Random number"
    value: ${{ steps.random-number-generator.outputs.random-number }}
runs:
  using: "composite"
  steps:
    - run: echo Hello ${{ inputs.who-to-greet }}.
      shell: bash
    - id: random-number-generator
      run: echo "random-number=$(echo $RANDOM)" >> $GITHUB_OUTPUT
      shell: bash
    - run: echo "${{ github.action_path }}" >> $GITHUB_PATH
      shell: bash
    - run: goodbye.sh
      shell: bash
I checked the results using the following action workflow, and it works.
on: [push]
jobs:
  hello_world_job:
    runs-on: ubuntu-latest
    name: A job to say hello
    steps:
      - uses: actions/checkout@v3
      - id: foo
        uses: actions/hello-world-composite-action@v1
        with:
          who-to-greet: 'Mona the Octocat'
      - run: echo random-number ${{ steps.foo.outputs.random-number }}
        shell: bash
My use case, however, differs from the example above in that I have to set the output variable inside the goodbye.sh script. According to the documentation, this should be done using the GITHUB_OUTPUT variable:
echo "{name}={value}" >> $GITHUB_OUTPUT
After some testing, this method is not working for composite actions. As this could also be a bug or not supported, I created a bug report at https://github.com/orgs/community/discussions/47775. However, I quickly wanted to double-check if there may be something wrong with my syntax.
Steps to reproduce
Fork this repository.
Enable GitHub actions on the fork.
Push a commit to your fork.
See that only the random-number variable is set, while random-number-bash is not (see this example workflow).
I found my issue using @benjamin-w's comment. The problem was that the goodbye.sh step needed an id key for the created output to be referenced correctly. The correct syntax is:
action.yml
name: 'Hello World'
description: 'Greet someone'
inputs:
  who-to-greet: # id of input
    description: 'Who to greet'
    required: true
    default: 'World'
outputs:
  random-number:
    description: "Random number"
    value: ${{ steps.random-number-generator.outputs.random-number }}
  random-number-bash:
    description: "Random number bash"
    value: ${{ steps.random-number-generator-bash.outputs.random-number-bash }}
runs:
  using: "composite"
  steps:
    - run: echo Hello ${{ inputs.who-to-greet }}.
      shell: bash
    - id: random-number-generator
      run: echo "random-number=$(echo $RANDOM)" >> $GITHUB_OUTPUT
      shell: bash
    - run: echo "${{ github.action_path }}" >> $GITHUB_PATH
      shell: bash
    - run: goodbye.sh
      id: random-number-generator-bash
      shell: bash
And the correct syntax for creating the output in the goodbye.sh script should be:
goodbye.sh
echo "Goodbye"
echo "random-number-bash=$(echo 123)" >> $GITHUB_OUTPUT
Which then can be tested using the following workflow file:
Test workflow
on: [push]
jobs:
  hello_world_job:
    runs-on: ubuntu-latest
    name: A job to say hello
    steps:
      - uses: actions/checkout@v3
      - id: foo
        uses: rickstaa/hello-world-composite-action-output-bug@main
        with:
          who-to-greet: 'Mona the Octocat'
      - run: echo random-number ${{ steps.foo.outputs.random-number }}
        shell: bash
      - run: echo random-number ${{ steps.foo.outputs.random-number-bash }}
        shell: bash
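The mechanics behind GITHUB_OUTPUT can be sketched locally: it is just a file path the runner exports, and each name=value line appended to it becomes an output of the step whose id wrote it (the temp file below stands in for the runner-provided path):

```shell
# stand-in for the file path the runner normally provides to each step
GITHUB_OUTPUT=$(mktemp)
export GITHUB_OUTPUT

# what goodbye.sh does: append a name=value line
echo "random-number-bash=$(echo 123)" >> "$GITHUB_OUTPUT"

# the runner later parses this file into the outputs of the step with that id
cat "$GITHUB_OUTPUT"
```

This is why the missing id mattered: the value was written to the file, but without an id there was no step name to attach the output to.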

Increment version and update pom.xml <version> in Azure DevOps pipeline

I am trying to increment the version number for my builds and then update the pom.xml file with the new version. My PowerShell to edit and save the pom.xml does not seem to work: it never reaches the xml.project.version, i.e. the <version> tag, to change it.
Do you have any suggestion on how to get it to find the <version> tag and save the updated document?
For additional info, the DevOps pipeline currently runs on windows-latest.
trigger:
  branches:
    include:
      - localdev
variables:
  - name: azureSubscription
    value: 'xxxx'
  - name: webAppName
    value: 'xxx'
  - name: environmentName
    value: 'xxx'
  - name: vmImageName
    value: 'ubuntu-latest'
  - name: version.MajorMinor
    value: '1.0'
  - name: version.Patch
    value: $[counter(variables['version.MajorMinor'], 0)]
  - name: stableVersionNumber
    value: '$(version.MajorMinor).$(version.Patch)'
  - name: prereleaseVersionNumber
    value: 'Set dynamically below in a task'
  - name: versionNumber
    value: 'Set dynamically below in a task'
  - name: isMainBranch
    value: $[eq(variables['Build.SourceBranch'], 'refs/heads/main')]
stages:
  - stage: Build
    displayName: Build stage
    jobs:
      - job: Fixversionnumber
        displayName: Fix Version number
        pool:
          vmImage: $(vmImageName)
        steps:
          - task: PowerShell@2
            displayName: Set the prereleaseVersionNumber variable value
            inputs:
              targetType: 'inline'
              script: |
                [string] $prereleaseVersionNumber = "$(stableVersionNumber)"
                Write-Host "Setting the prerelease version number variable to '$prereleaseVersionNumber'."
                Write-Host "##vso[task.setvariable variable=prereleaseVersionNumber]$prereleaseVersionNumber"
          - task: PowerShell@2
            displayName: Set the versionNumber to the stable or prerelease version number based on if the 'main' branch is being built or not
            inputs:
              targetType: 'inline'
              script: |
                [bool] $isMainBranch = $$(isMainBranch)
                [string] $versionNumber = "$(prereleaseVersionNumber)"
                if ($isMainBranch)
                {
                  $versionNumber = "$(stableVersionNumber)"
                }
                Write-Host "Setting the version number to use to '$versionNumber'."
                Write-Host "##vso[task.setvariable variable=versionNumber]$versionNumber"
          - task: PowerShell@2
            displayName: Set the name of the build (i.e. the Build.BuildNumber)
            inputs:
              targetType: 'inline'
              script: |
                [string] $buildName = "$(versionNumber)_$(Build.SourceBranchName)"
                Write-Host "Setting the name of the build to '$buildName'."
                Write-Host "##vso[build.updatebuildnumber]$buildName"
          - task: PowerShell@2
            displayName: Update the version in pom.xml
            inputs:
              targetType: 'inline'
              script: |
                # Get version
                $versionNum = "$(versionNumber)"
                # Specify the file path
                #$xmlFileName = "pom.xml"
                # Read the existing file
                [xml]$xml = Get-Content "pom.xml"
                #[xml]$xmlDoc = Get-Content pom.xml
                # If it was one specific element you can just do like so:
                $xml.project.version = "$versionNum"
                #Remove the old pom.xml
                #Remove-Item $xmlFileName
                # Then you can save that back to the xml file
                $xml.Save("$(System.DefaultWorkingDirectory)\pom.xml")
                # Print new file content
                Write-Host "Setting the version number to use to '$versionNum'."
                gc $(System.DefaultWorkingDirectory)\pom.xml
          - task: Maven@3
            displayName: 'Maven Package'
            inputs:
              mavenPomFile: 'pom.xml'
          - task: CopyFiles@2
            displayName: 'Copy Files to artifact staging directory'
            inputs:
              SourceFolder: '$(System.DefaultWorkingDirectory)'
              Contents: '**/target/*.?(war|jar)'
              TargetFolder: $(Build.ArtifactStagingDirectory)
          - task: buildDropFile
            inputs:
              targetPath: $(Build.ArtifactStagingDirectory)
              artifactName: drop
You can use the third-party extension Replace Tokens. It searches files for tokens matching the tokenPattern parameter and replaces them with the values of pipeline variables. Click "Get it free" and install it to your organization.
In your pom.xml file, replace your version number with a unique token, like #{versionNumber}#:
<version>#{versionNumber}#</version>
Please notice the #{...}#; this helps the task find which content to replace. versionNumber should be the name of the variable whose value you want inserted.
Then, in your pipeline, search and add a replace tokens task. Here is an example:
steps:
  - task: replacetokens@5
    inputs:
      targetFiles: '**/pom.xml'
      encoding: 'auto'
      tokenPattern: 'default' # The default token pattern is #{...}#
      writeBOM: true
      actionOnMissing: 'continue'
      keepToken: false
      actionOnNoFiles: 'continue'
      enableTransforms: false
      enableRecursion: false
      useLegacyPattern: false
      enableTelemetry: true
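Outside the pipeline, the same #{...}# substitution can be sketched with plain sed (the file name and version value here are illustrative):

```shell
# a minimal pom fragment containing the token
echo '<version>#{versionNumber}#</version>' > pom-snippet.xml

# substitute the token with the desired version, as the Replace Tokens task would
versionNumber="1.0.42"
sed -i "s/#{versionNumber}#/${versionNumber}/" pom-snippet.xml

cat pom-snippet.xml
```

Because the token never appears anywhere else in the file, no XML parsing is needed; the task (like this sed sketch) can treat the file as plain text.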

Unable to read Github Actions job's outputs from another job

I have the following workflow in Github Actions, where I have a job that creates some outputs and a dependent job that reads those outputs, pretty similar to the example from the docs:
name: Sandbox
on:
  push:
env:
  POSTGRESQL_VERSION: "14.4.0-debian-11-r13"
jobs:
  setup:
    runs-on: ubuntu-latest
    outputs:
      prod_tag: "steps.prod_tag.outputs.prod_tag"
      postgresql_version: "steps.postgresql_version.outputs.postgresql_version"
    steps:
      - uses: actions/checkout@v3
      - id: prod_tag
        run: |
          if [[ ${{ github.ref_type }} == "tag" ]]; then
            echo "::set-output name=prod_tag::${{ github.ref_name }}"
          else
            echo "::set-output name=prod_tag::latest"
          fi;
      - id: postgresql_version
        run: echo "::set-output name=postgresql_version::${POSTGRESQL_VERSION}"
      - name: Show output variables
        run: |
          echo "PROD TAG: ${{ steps.prod_tag.outputs.prod_tag }}"
          echo "POSTGRESQL_VERSION: ${{ steps.postgresql_version.outputs.postgresql_version }}"
  show_outputs:
    runs-on: ubuntu-latest
    needs: setup
    steps:
      - run: |
          echo "PROD TAG: ${{ needs.setup.outputs.prod_tag }}"
          echo "POSTGRESQL_VERSION: ${{ needs.setup.outputs.postgresql_version }}"
However, in my example it doesn't work as expected: show_outputs shows PROD TAG: steps.prod_tag.outputs.prod_tag and POSTGRESQL_VERSION: steps.postgresql_version.outputs.postgresql_version instead of the values set in the setup job, which should be latest and 14.4.0-debian-11-r13. In the Show output variables step of the setup job I can see that the values are properly set, and I've tried several different approaches (setting the variables from the same step, not taking the value from the environment variable, etc.) without success.
Any idea what can be wrong with my example?
You should surround the variables with ${{ and }}.
Try with:
outputs:
  prod_tag: ${{ steps.prod_tag.outputs.prod_tag }}
  postgresql_version: ${{ steps.postgresql_version.outputs.postgresql_version }}
instead of:
outputs:
  prod_tag: "steps.prod_tag.outputs.prod_tag"
  postgresql_version: "steps.postgresql_version.outputs.postgresql_version"

Azure devops pipeline removing " from parameter value in bash step

I am passing in a parameter to the pipeline as shown below -
parameters:
  - name: ListofValues
    displayName: values for test
    type: object
    default:
      - test:
          name: "test"
          desired: '"abc","xyz"'
When I pass these values to a bash step they come through as "abc,xyz", while I expect "abc","xyz". Is there a way to fix this?
The values may look similar; note the missing " after the end of the first value.
You can use ${{ convertToJson() }} to keep the double quotes.
My sample YAML (please notice the structure of test):
parameters:
  - name: ListofValues
    displayName: values for test
    type: object
    default:
      test:
        name: "test"
        desired: '"abc","xyz"'
trigger: none
pool:
  vmImage: ubuntu-latest
steps:
  - script: |
      echo "${{ convertToJson(parameters.ListofValues.test.desired) }}"
The bash task then prints the value with its double quotes kept.
Or you can add \ before each quote in the value:
My sample YAML:
parameters:
  - name: ListofValues
    displayName: values for test
    type: object
    default:
      test:
        name: "test"
        desired: '\"abc\",\"xyz\"'
trigger: none
pool:
  vmImage: ubuntu-latest
steps:
  - script: |
      echo ${{ parameters.ListofValues.test.desired }}
var="abc,xyz"
IFS="," read -a array <<< $var
var=""
for word in "${array[@]}"; do
  var="$var\"$word\","
done
var="${var%,}"
echo "${var}"
# outputs:
# "abc","xyz"
Read the elements of var delimited by , into an array named array.
Then loop over the elements in the array and concatenate them into var while adding the required ".
At the end remove the extra trailing ,.
This solution works with any number of elements.
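A shorter variant of the same join, using printf instead of an explicit loop (a sketch; the variable names are illustrative):

```shell
var="abc,xyz"
IFS=',' read -r -a array <<< "$var"

# printf repeats its format string once per array element,
# producing "abc","xyz", with one trailing comma
joined=$(printf '"%s",' "${array[@]}")
joined="${joined%,}"   # strip the trailing comma

echo "$joined"
```

This relies on the same trailing-comma trim as the loop version, just with printf doing the per-element work.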

YAML azure piplelines & templates - Iterate through each values splitted from a string passed from main pipeline OR pass array as output with bash

I'm kind of stuck with something I would like to achieve in my build pipeline.
The main goal is to iterate through every value extracted by {{split ',' parameters.PARAMETER_PASSED}} when I'm already in the template.
How it should work:
Build passes a string that changes depending on one script: value1,value2,value3,...,valueX
Template starts
FOR EACH valueN, start a script with a specific displayName (in this case ${{ value }} will be ok)
Below you can find what I've achieved so far:
azure-pipelines.yaml
- stage: AutomateScriptContent
  displayName: 'Some new trick'
  jobs:
    - job: CheckFileChanges
      displayName: Prepare list
      steps:
        - checkout: self
          persistCredentials: true
        - bash: |
            #....Something..Something
            echo "##vso[task.setvariable variable=VALUE_LIST;isOutput=true;]value1,value2,value3"
            fi
          displayName: "Check"
          name: check
    - job: TestTemplate
      variables:
        LISTVAL: $[ dependencies.CheckFileChanges.outputs['check.VALUE_LIST'] ]
      displayName: "Do something with values passed from env variable"
      dependsOn: CheckFileChanges
      steps:
        - template: __templates__/test.template.yml
          parameters:
            MY_LIST: $(LISTVAL)
test.template.yml
parameters:
  MY_LIST: ""
steps:
  - variables:
      website: {{split ',' parameters.MY_LIST}}
  # - script: echo "${{ parameters.MY_LIST }}"
  - ${{ each value in var.MY_LIST }}:
      - script: 'echo ${{ value }}'
        displayName: '${{ value }}'
I know that test.template.yml is not correct, but I cannot understand how to make it work!
Any suggestions? I'm not sure if it's possible to pass a new array from bash/powershell with echo "##vso[task.setvariable variable=VALUE_LIST;isOutput=true;]$MY_ARRAY".
One accepted solution could also be adding every value of the array as a new parameter, outputting it, and then passing ALL the parameters without providing each single name, but I'm not sure that's possible (example below):
variables:
  LISTVAL: $[ dependencies.CheckFileChanges.outputs['check(GET_ALL_PARAMETERS)'] ]
Thank you in advance.
You have a few problems here.
In your test.template.yml:
A steps sequence cannot have a variables section.
The each keyword can only be used on parameters of type object.
There is no 'split' function.
Also, the parameter name in your pipeline ('WEBSITE_LIST') doesn't match the name defined in your template ('MY_LIST').
If you have a finite list of outputs from your first job, you could do something like the below, but this will not be sustainable if the list grows.
pipeline:
stages:
  - stage: AutomateScriptContent
    displayName: 'Some new trick'
    jobs:
      - job: CheckFileChanges
        displayName: Prepare list
        steps:
          - checkout: self
            persistCredentials: true
          - bash: |
              #....Something..Something
              echo "##vso[task.setvariable variable=value1;isOutput=true;]foo"
              echo "##vso[task.setvariable variable=value2;isOutput=true;]bar"
              echo "##vso[task.setvariable variable=value3;isOutput=true;]lorem"
              echo "##vso[task.setvariable variable=value4;isOutput=true;]ipsum"
            displayName: "Check"
            name: check
      - job: TestTemplate
        variables:
          v1: $[ dependencies.CheckFileChanges.outputs['check.value1'] ]
          v2: $[ dependencies.CheckFileChanges.outputs['check.value2'] ]
          v3: $[ dependencies.CheckFileChanges.outputs['check.value3'] ]
          v4: $[ dependencies.CheckFileChanges.outputs['check.value4'] ]
        displayName: "Do something with values passed from env variable"
        dependsOn: CheckFileChanges
        steps:
          - template: __templates__/test.template.yml
            parameters:
              MY_LIST: [$(v1), $(v2), $(v3), $(v4)]
template
parameters:
  - name: MY_LIST
    type: object
steps:
  - ${{ each value in parameters.MY_LIST }}:
      - script: 'echo ${{ value }}'
        displayName: '${{ value }}'
Edit: as you explain that you do not know the number of output values from your first job, this does not seem possible to achieve.
Variables are always strings and, while you could output a csv string variable from the first job, there is no function available to split this into an array for use as the parameter of your second job.
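One partial workaround (a sketch; the variable names are illustrative) is to emit one numbered output per CSV element plus a count, though the consuming template still needs compile-time knowledge of how many slots to expand:

```shell
csv="foo,bar,lorem"
IFS=',' read -r -a parts <<< "$csv"

# emit one logging command per element, as an Azure Pipelines bash step would
i=0
for p in "${parts[@]}"; do
  echo "##vso[task.setvariable variable=value$i;isOutput=true]$p"
  i=$((i+1))
done
echo "##vso[task.setvariable variable=count;isOutput=true]$i"
```

This keeps each value addressable as check.value0, check.value1, and so on, but it does not remove the core limitation: the downstream template's ${{ each }} is expanded at compile time, before any of these runtime outputs exist.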
