Set YAML variable value based on parameters - yaml

I'm kind of stuck at this point and haven't been able to find any clear documentation for this scenario.
I have some parameters to a YAML pipeline, shown below (sonarProjectKey & sonarProjectName). I am trying to set a variable to either true or false depending on whether these params have a value.
I have deliberately set them to initial values of ' ' so that you can run the pipeline without having to provide a value.
The following example doesn't work:
parameters:
- name: sonarProjectKey # Sonar project key parameter
  displayName: Sonar Project Key
  default: ' '
- name: sonarProjectName
  displayName: Sonar Project Name
  default: ' '

variables:
- name: runSonarAnalysis
  value: $[and( not(eq('${{ parameters.sonarProjectKey }}', '')), not(eq('${{ parameters.sonarProjectName }}', '')))]
I have tried so many different ways and can't seem to get it to work.
Someone please help!

Try creating the variable with the syntax below:
parameters:
- name: sonarProjectKey # Sonar project key parameter
  displayName: Sonar Project Key
  default: ' '
- name: sonarProjectName
  displayName: Sonar Project Name
  default: ' '
variables:
- ${{ if eq(parameters.sonarProjectKey, ' ') }}:
  - name: runSonarAnalysis
    value: false
- ${{ else }}:
  - name: runSonarAnalysis
    value: true

Related

YAML loop in Azure pipelines: I want to assign concatenated values

Hi, in my Azure pipeline I am getting stuck and hitting an error while doing this:
parameters:
- name: applications_module
  displayName: Applications
  type: boolean
  default: 'false'
  values: []
- name: management_module
  displayName: Management
  type: boolean
  default: 'false'
  values: []
- name: Creation
  displayName: Creation
  type: boolean
  default: 'false'
  values: []

variables:
- name: FilterValue
  ${{ each item in parameters }}:
    ${{ if contains(item.name, '_module') }}:
      value: item.name # here I want to concatenate all the parameter names that contain '_module', but this statement throws an error
It isn't working; it says "'value' is already defined". Can anyone help me with this?
Using ##vso logging commands we can do that:
steps:
- ${{ each parameter in parameters }}:
  - bash: echo '##vso[task.setvariable variable=allParametersString]$(allParametersString)${{ parameter.Key }}'
- script: |
    echo 'concatenated strings by comma -> $(allParametersString)'
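If only the *_module parameter names should end up in the variable, the compile-time filter from the question can be combined with the same logging command. This is only a sketch, assuming the parameters declared in the question and a hypothetical filterValue variable:
variables:
  filterValue: '' # start empty so the macro below expands cleanly

steps:
- ${{ each parameter in parameters }}:
  - ${{ if contains(parameter.key, '_module') }}:
    - bash: echo '##vso[task.setvariable variable=filterValue]$(filterValue)${{ parameter.key }},'
      displayName: append ${{ parameter.key }}
- script: echo 'filterValue -> $(filterValue)'
Each matching parameter appends its name at runtime, so filterValue ends up as a comma-separated list of the *_module names.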

Azure Pipeline pass parameter from one job to another

My ci_pipeline.yml contains the following:
- stage: DeployDEV
  pool:
    vmImage: 'windows-latest'
  variables:
  - group: [REDACTED]
  - name: test_var
    value: firstval
  jobs:
  - job: BuildDeploy
    displayName: "REDACTED"
    steps:
    - script: echo "test_var = $(test_var)"
      displayName: first variable pass
    - bash: echo "##vso[task.setvariable variable=test_var]secondValue"
      displayName: set new variable value
    - script: echo "test_var = $(test_var)"
      displayName: second variable pass
      name: Env
      displayName: "Extract Source Branch Name"
  - template: pipeline_templates/ci_pipeline_templates/build_deploy_dev.yml
    parameters:
      testvar: $(test_var)
and in the build_deploy_dev.yml template:
parameters:
- name: testvar

jobs:
- job: Test
  displayName: "Testjob"
  steps:
  - script: echo "test_var=${{ parameters.testvar }}"
    name: TestVar
    displayName: 'test'
I need to be able to modify the variable in the main yml file BEFORE passing it into the template. However, test_var still remains firstval. What am I doing wrong? The change seems to be successful in the main yml file: the second variable pass script displays test_var=secondValue. How can I make the change permanent so that the template picks it up?
As stated above, parameters are evaluated at compile time, so using parameters won't solve your problem. However, you can create a new variable as an output, make your template job depend on the job that generates it, and then use it in your template as follows:
ci_pipeline.yml
- stage: DeployDEV
  pool:
    vmImage: 'windows-latest'
  variables:
  - group: [REDACTED]
  - name: test_var
    value: firstval
  jobs:
  - job: BuildDeploy
    displayName: "REDACTED"
    steps:
    - script: echo "test_var = $(test_var)"
      displayName: first variable pass
    - bash: echo "##vso[task.setvariable variable=new_var;isOutput=true]SomeValue"
      displayName: set new variable value
      name: SetMyVariable
      name: Env
      displayName: "Extract Source Branch Name"
  - template: pipeline_templates/ci_pipeline_templates/build_deploy_dev.yml
Then in your build_deploy_dev.yml you can use the variable you've created previously:
jobs:
- job: Test
  variables:
    test_var: $[ dependencies.BuildDeploy.outputs['SetMyVariable.new_var'] ]
  displayName: "Testjob"
  dependsOn: BuildDeploy
  steps:
  - script: echo "test_var=$(test_var)"
    name: TestVar
    displayName: 'test'
Note that you can still leverage $(test_var), for instance to check whether it has the value 'firstval' and, if so, create 'new_var' and use it in the other job. You could even feed the value of $(test_var) into 'new_var' and reuse the value you previously set:
- bash: echo "##vso[task.setvariable variable=new_var]$(test_var)"
  displayName: set new variable value
In this way you can dynamically 'change' the behavior of your pipeline and keep your template file.
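As a usage sketch (assuming the BuildDeploy job and SetMyVariable step names above), the downstream job can even be gated on the output value with a job condition:
- job: Test
  displayName: "Testjob"
  dependsOn: BuildDeploy
  condition: eq(dependencies.BuildDeploy.outputs['SetMyVariable.new_var'], 'SomeValue')
  variables:
    test_var: $[ dependencies.BuildDeploy.outputs['SetMyVariable.new_var'] ]
  steps:
  - script: echo "test_var=$(test_var)"
    displayName: 'test'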
Unfortunately, you cannot use variables set during runtime, i.e. $(test_var), as parameters for templates. Think of templates less like functions and more like snippets. The pipeline essentially swaps the contents of the template in for the template reference, and all parameters are evaluated at that time.
So when you set $(test_var) to 'firstval', the template is evaluated at that point and the parameter is set to 'firstval' as well. When you then set $(test_var) later in the YAML, it is too late. See the documentation below:
https://learn.microsoft.com/en-us/azure/devops/pipelines/process/templates?view=azure-devops#context
It may feel dirty to duplicate code, but unfortunately this is the recommended solution:
- stage: DeployDEV
  pool:
    vmImage: 'windows-latest'
  variables:
  - group: [REDACTED]
  - name: test_var
    value: firstval
  jobs:
  - job: BuildDeploy
    displayName: "REDACTED"
    steps:
    - script: echo "test_var = $(test_var)"
      displayName: first variable pass
    - bash: echo "##vso[task.setvariable variable=test_var]secondValue"
      displayName: set new variable value
    - script: echo "test_var = $(test_var)"
      displayName: second variable pass
  - job: Test
    displayName: "Testjob"
    steps:
    - script: echo "test_var=$(test_var)"
      name: TestVar
      displayName: 'test'

How to access a value from an Object using a dynamic key in YAML

I have a yaml configuration as follows:
parameters:
  group: '$(group)'
  acl:
    certificateFile: AclCertificates.p12
    provisioningProfileFile: AmericashDisProfile.mobileprovision
    keystore: 'acl.jks'
  sail:
    certificateFile: AclCertificates.p12
    provisioningProfileFile: AmericashDisProfile.mobileprovision
    keystore: 'acl.jks'

steps:
- bash: |
    echo ${{ parameters[$(group)]['certificateFile'] }}
I want to access the object value using a dynamic key. Here group: '$(group)' is a dynamic value which comes from another variable file.
I have tried accessing the object value like ${{ parameters[$(group)]['certificateFile'] }}, but it's not working. I'm not able to figure out how I should pass the parameter group into echo ${{ parameters[$(group)]['certificateFile'] }} in order to get the specific object's value.
For example, you have a YAML pipeline A:
parameters:
- name: test
  type: object
  default:
  - name: Name1
    path: Path1
  - name: Name2
    path: Path2

variables:
  sth: ${{ join(';', parameters.test.*.name) }}
And then you can use YAML pipeline B to get the object value:
variables:
- template: azure-pipelines-2.yml # Template reference

steps:
- task: CmdLine@2
  inputs:
    script: 'echo "${{ variables.sth }}"'
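If the group key can be supplied as a template parameter instead of the runtime $(group) variable, the lookup can stay entirely at compile time. A sketch assuming a hypothetical configs object built from the acl/sail entries in the question:
parameters:
- name: group
  type: string
  default: acl
- name: configs
  type: object
  default:
    acl:
      certificateFile: AclCertificates.p12
      keystore: 'acl.jks'
    sail:
      certificateFile: AclCertificates.p12
      keystore: 'acl.jks'

steps:
- bash: |
    # both parameters are resolved when the template is expanded
    echo "${{ parameters.configs[parameters.group]['certificateFile'] }}"
A runtime variable like $(group) cannot be used inside ${{ }} because those expressions are evaluated before any runtime variables exist.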

Azure devops pipeline removing " from parameter value in bash step

I am passing in a parameter to the pipeline as shown below -
parameters:
- name: ListofValues
  displayName: values for test
  type: object
  default:
  - test:
      name: "test"
      desired: '"abc","xyz"'
When I pass these values to the bash step, they come through as "abc,xyz", but I am expecting "abc","xyz". Is there a way I can fix this?
The values may look similar; kindly look for the " after the end of value 1.
You can use ${{ convertToJson() }} to keep the double quotes.
My sample YAML (please notice the structure of test):
parameters:
- name: ListofValues
  displayName: values for test
  type: object
  default:
    test:
      name: "test"
      desired: '"abc","xyz"'

trigger: none

pool:
  vmImage: ubuntu-latest

steps:
- script: |
    echo "${{ convertToJson(parameters.listOfValues.test.desired) }}"
The output value in the bash task then keeps the quotes.
Or you can add \ before each quote in the value:
My sample yaml:
parameters:
- name: ListofValues
  displayName: values for test
  type: object
  default:
    test:
      name: "test"
      desired: '\"abc\",\"xyz\"'

trigger: none

pool:
  vmImage: ubuntu-latest

steps:
- script: |
    echo ${{ parameters.ListofValues.test.desired }}
var="abc,xyz"
IFS="," read -a array <<< $var
var=""
for word in "${array[#]}"; do
var="$var\"$word\","
done
var="${var%,}"
echo "${var}"
# outputs:
# "abc","xyz"
Read the elements of var delimited by , into an array named array.
Then loop over the elements in the array and concatenate them into var while adding the required ".
At the end remove the extra trailing ,.
This solution works with any number of elements.
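As a usage sketch (assuming the ListofValues parameter from the question above), the same re-quoting logic can run directly inside a script step on the expanded parameter value:
steps:
- bash: |
    # the template expression expands to "abc","xyz"; bash quote removal leaves abc,xyz
    var=${{ parameters.ListofValues.test.desired }}
    IFS="," read -a array <<< "$var"
    out=""
    for word in "${array[@]}"; do
      out="$out\"$word\","
    done
    out="${out%,}"
    echo "$out"   # prints "abc","xyz"
  displayName: re-quote comma-separated values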

YAML Azure pipelines & templates - Iterate through each value split from a string passed from the main pipeline OR pass an array as output with bash

I'm kind of stuck with something I would like to achieve in my build pipeline.
The main goal is to iterate through every value extracted by {{split ',' parameters.PARAMETER_PASSED}} once I'm already inside the template.
How it should look:
Build passes a string that changes depending on one script: value1,value2,value3,...,valueX
Template starts
FOR EACH valueN, start a script with a specific displayName (in this case ${{ value }} will be ok)
Below you can find what I've achieved so far:
azure-pipelines.yaml
- stage: AutomateScriptContent
  displayName: 'Some new trick'
  jobs:
  - job: CheckFileChanges
    displayName: Prepare list
    steps:
    - checkout: self
      persistCredentials: true
    - bash: |
        #....Something..Something
        echo "##vso[task.setvariable variable=VALUE_LIST;isOutput=true;]value1,value2,value3"
        fi
      displayName: "Check"
      name: check
  - job: TestTemplate
    variables:
      LISTVAL: $[ dependencies.CheckFileChanges.outputs['check.VALUE_LIST'] ]
    displayName: "Do something with values passed from env variable"
    dependsOn: CheckFileChanges
    steps:
    - template: __templates__/test.template.yml
      parameters:
        MY_LIST: $(LISTVAL)
test.template.yml
parameters:
  MY_LIST: ""

steps:
- variables:
    website: {{split ',' parameters.MY_LIST}}
# - script: echo "${{ parameters.MY_LIST }}"
- ${{ each value in var.MY_LIST }}:
  - script: 'echo ${{ value }}'
    displayName: '${{ value }}'
I know that test.template.yml is not correct, but I cannot understand how to make it work!
Any suggestion? I'm not sure if it's possible to pass a new array from bash/powershell with echo "##vso[task.setvariable variable=VALUE_LIST;isOutput=true;]$MY_ARRAY".
Also, one acceptable solution could be adding every value of the array as a new parameter, outputting it, and then passing ALL the parameters (without providing the name, like below), but I'm not sure if it's possible to pass the parameters without providing each single parameter name (example below).
variables:
  LISTVAL: $[ dependencies.CheckFileChanges.outputs['check(GET_ALL_PARAMETERS)'] ]
Thank you in advance.
You have a few problems here.
In your test.template.yml:
A steps sequence cannot have a variables section.
The each keyword can only be used on parameters of type object.
There is no 'split' function.
Also, the parameter name in your pipeline, 'WEBSITE_LIST', doesn't match the name defined in your template, 'MY_LIST'.
If you have a finite list of outputs from your first job, you could do something like the below, but this will not be sustainable if the list grows.
pipeline:
stages:
- stage: AutomateScriptContent
  displayName: 'Some new trick'
  jobs:
  - job: CheckFileChanges
    displayName: Prepare list
    steps:
    - checkout: self
      persistCredentials: true
    - bash: |
        #....Something..Something
        echo "##vso[task.setvariable variable=value1;isOutput=true;]foo"
        echo "##vso[task.setvariable variable=value2;isOutput=true;]bar"
        echo "##vso[task.setvariable variable=value3;isOutput=true;]lorem"
        echo "##vso[task.setvariable variable=value4;isOutput=true;]ipsum"
      displayName: "Check"
      name: check
  - job: TestTemplate
    variables:
      v1: $[ dependencies.CheckFileChanges.outputs['check.value1'] ]
      v2: $[ dependencies.CheckFileChanges.outputs['check.value2'] ]
      v3: $[ dependencies.CheckFileChanges.outputs['check.value3'] ]
      v4: $[ dependencies.CheckFileChanges.outputs['check.value4'] ]
    displayName: "Do something with values passed from env variable"
    dependsOn: CheckFileChanges
    steps:
    - template: __templates__/test.template.yml
      parameters:
        MY_LIST: [$(v1), $(v2), $(v3), $(v4)]
template
parameters:
- name: MY_LIST
  type: object

steps:
- ${{ each value in parameters.MY_LIST }}:
  - script: 'echo ${{ value }}'
    displayName: '${{ value }}'
Edit: as you explain that you do not know the number of output values from your first job, this does not seem possible to achieve.
Variables are always strings and, while you could output a CSV string variable from the first job, there is no function available to split it into an array for use as the parameter of your second job.
