Hi, I am stuck on an Azure pipeline and getting an error while doing this:
parameters:
  - name: applications_module
    displayName: Applications
    type: boolean
    default: 'false'
    values: []
  - name: management_module
    displayName: Management
    type: boolean
    default: 'false'
    values: []
  - name: Creation
    displayName: Creation
    type: boolean
    default: 'false'
    values: []
variables:
  - name: FilterValue
    ${{ each item in parameters }}:
      ${{ if contains(item.name, '_module') }}:
        value: item.name # here I want to concatenate all the parameter names that contain '_module', but this statement is throwing an error
It is not working; the error says "'value' is already defined". Can anyone help me with this?
Using ##vso logging commands we can do that:
steps:
  - ${{ each parameter in parameters }}:
    - bash: echo '##vso[task.setvariable variable=allParametersString]$(allParametersString)${{ parameter.Key }}'
  - script: |
      echo 'concatenated strings by comma -> $(allParametersString)'
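A minimal sketch of the same idea, filtered so that only the parameter names containing '_module' are appended (the allParametersString variable and the trailing comma are just illustrative choices, not part of the original answer):

variables:
  allParametersString: '' # start empty so the first append expands to an empty string

steps:
  - ${{ each parameter in parameters }}:
    - ${{ if contains(parameter.Key, '_module') }}:
      - bash: echo "##vso[task.setvariable variable=allParametersString]$(allParametersString)${{ parameter.Key }},"
        displayName: 'Append ${{ parameter.Key }}'
  - script: echo "module parameter names -> $(allParametersString)"

Each setvariable only affects the steps that follow it, so the final script sees the fully concatenated string.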
I have the following YAML pipeline that I cannot seem to get working based on the conditional values.
parameters:
  - name: "workloads"
    type: object
    default:
      wkld001: update
      wkld002: "delete"
      wkld003: "update"
      wkld004: "update"
      wkld005: "update"
      wkld006: "update"
  - name: "environment"
    type: string
    values:
      - prd
      - dev
  - name: "landing_zone"
    type: string
    values:
      - private
      - integration
stages:
  - stage:
    jobs:
      - job: create_params
        steps:
          - powershell: |
              write-host "test"
      - ${{ each item in parameters.workloads }}:
        - ${{ if eq(parameters.workloads[item.key].value, 'update') }}:
          - job: "${{ item.key }}"
            dependsOn: create_params
            steps:
              - powershell: |
                  write-host "testing loop - ${{ item.value }}"
What I want to do is have a specific command run based on the value set for each workload in the map.
When I run the above, no conditions are met, so only the pre-job runs.
The expected behaviour is that the loop runs and the right job is spawned based on the map conditions.
The example only shows the "update" condition; I plan to have a few more.
I got there in the end with this.
pool:
  name: "UOLUKSSHPOOL01"

parameters:
  - name: "workloads"
    type: object
    default:
      wkld001: update
      wkld002: "delete"
      wkld003: "update"
      wkld004: "update"
      wkld005: "update"
      wkld006: "update"
  - name: "environment"
    type: string
    values:
      - prd
      - dev
  - name: "landing_zone"
    type: string
    values:
      - private
      - integration

stages:
  - stage:
    jobs:
      - job: create_params
        steps:
          - powershell: |
              write-host "test"
      - ${{ each item in parameters.workloads }}:
        - ${{ if eq(item.value, 'update') }}:
          - job: "${{ item.key }}"
            condition:
            dependsOn: create_params
            steps:
              - powershell: |
                  write-host "testing loop - ${{ item.value }}"
I was kinda stuck at this point and hadn't been able to find any clear documentation for this example.
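If more branches than "update" are needed later, a hedged sketch of the same loop with an extra compile-time branch could look like the following (the "delete" job body is only a placeholder, not something from the original pipeline):

- ${{ each item in parameters.workloads }}:
  - ${{ if eq(item.value, 'update') }}:
    - job: "update_${{ item.key }}"
      dependsOn: create_params
      steps:
        - powershell: |
            write-host "updating ${{ item.key }}"
  - ${{ if eq(item.value, 'delete') }}:
    - job: "delete_${{ item.key }}"
      dependsOn: create_params
      steps:
        - powershell: |
            write-host "deleting ${{ item.key }}"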
I have some parameters to a YAML pipeline shown below (SonarProjectKey & sonarProjectName). I am trying to set a variable to either true or false depending on whether these params have a value.
I have set them to initial values of ' ' deliberately so that you can run the pipeline without having to provide a value.
The following example doesn't work:
parameters:
  - name: sonarProjectKey # Sonar project key parameter
    displayName: Sonar Project Key
    default: ' '
  - name: sonarProjectName
    displayName: Sonar Project Name
    default: ' '

variables:
  - name: runSonarAnalysis
    value: $[and( not(eq('${{ parameters.sonarProjectKey }}', '')), not(eq('${{ parameters.sonarProjectName }}', '')))]
I have tried so many different ways and can't seem to get it to work.
Someone please help!
Try to create the variable with the syntax below:
parameters:
  - name: sonarProjectKey # Sonar project key parameter
    displayName: Sonar Project Key
    default: ' '
  - name: sonarProjectName
    displayName: Sonar Project Name
    default: ' '

variables:
  - ${{ if eq(parameters.sonarProjectKey, ' ') }}:
    - name: runSonarAnalysis
      value: false
  - ${{ else }}:
    - name: runSonarAnalysis
      value: true
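Since the question wants the flag to depend on both parameters, one possible variation (my own assumption, not part of the original answer) is to test both values in one compile-time expression; the comparison uses a single space because that is the parameters' default:

variables:
  - ${{ if and(ne(parameters.sonarProjectKey, ' '), ne(parameters.sonarProjectName, ' ')) }}:
    - name: runSonarAnalysis
      value: true
  - ${{ else }}:
    - name: runSonarAnalysis
      value: false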
I have a yaml configuration as follows:
parameters:
  group: '$(group)'
  acl:
    certificateFile: AclCertificates.p12
    provisioningProfileFile: AmericashDisProfile.mobileprovision
    keystore: 'acl.jks'
  sail:
    certificateFile: AclCertificates.p12
    provisioningProfileFile: AmericashDisProfile.mobileprovision
    keystore: 'acl.jks'

steps:
  - bash: |
      echo ${{ parameters[$(group)]['certificateFile'] }}
I want to access the object value using a dynamic key. Here group: '$(group)' is a dynamic value coming from another variable file.
I have tried accessing the object value with ${{ parameters[$(group)]['certificateFile'] }}, but it's not working. I'm not able to figure out how I should pass the group parameter into echo ${{ parameters[$(group)]['certificateFile'] }} in order to get the specific object's value.
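One thing to keep in mind is that ${{ }} template expressions are resolved when the YAML is expanded, before runtime macros like $(group) have a value, so a macro cannot be used as an index there. A hedged sketch that passes the key as a template parameter instead (the 'acl' default is only an example):

parameters:
  group: 'acl' # the caller passes 'acl' or 'sail'
  acl:
    certificateFile: AclCertificates.p12
    keystore: 'acl.jks'
  sail:
    certificateFile: AclCertificates.p12
    keystore: 'acl.jks'

steps:
  - bash: |
      echo ${{ parameters[parameters.group]['certificateFile'] }}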
For example, you have a YAML pipeline A:
parameters:
  - name: test
    type: object
    default:
      - name: Name1
        path: Path1
      - name: Name2
        path: Path2

variables:
  sth: ${{ join(';', parameters.test.*.name) }}
And then you can use YAML pipeline B to get the object value:
variables:
  - template: azure-pipelines-2.yml # Template reference

steps:
  - task: CmdLine@2
    inputs:
      script: 'echo "${{ variables.sth }}"'
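With the defaults above, the CmdLine task should print Name1;Name2, i.e. the name fields of the object joined with ;.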
I am passing in a parameter to the pipeline as shown below -
parameters:
  - name: ListofValues
    displayName: values for test
    type: object
    default:
      - test:
          name: "test"
          desired: '"abc","xyz"'
When I pass these values to a bash step, the value comes through as "abc,xyz", but I am expecting "abc","xyz". Is there a way I can fix this?
The values may look similar; kindly look for the " after the end of value 1.
You can use ${{ convertToJson() }} to keep the double quotes.
My sample YAML (please notice the structure of test):
parameters:
  - name: ListofValues
    displayName: values for test
    type: object
    default:
      test:
        name: "test"
        desired: '"abc","xyz"'

trigger: none

pool:
  vmImage: ubuntu-latest

steps:
  - script: |
      echo "${{ convertToJson(parameters.listOfValues.test.desired) }}"
The output of the bash task keeps the double quotes.
Or you can add \ before the quotes in the value:
My sample YAML:
parameters:
  - name: ListofValues
    displayName: values for test
    type: object
    default:
      test:
        name: "test"
        desired: '\"abc\",\"xyz\"'

trigger: none

pool:
  vmImage: ubuntu-latest

steps:
  - script: |
      echo ${{ parameters.ListofValues.test.desired }}
var="abc,xyz"
IFS="," read -a array <<< $var
var=""
for word in "${array[#]}"; do
var="$var\"$word\","
done
var="${var%,}"
echo "${var}"
# outputs:
# "abc","xyz"
Read the elements of var delimited by , into an array named array.
Then loop over the elements in the array and concatenate them into var while adding the required ".
At the end remove the extra trailing ,.
This solution works with any number of elements.
I'm kinda stuck on something I would like to achieve in my build pipeline.
The main goal is to iterate through every value extracted by {{split ',' parameters.PARAMETER_PASSED}} once I'm already inside the template.
How it should look:
The build passes a string that changes depending on one script: value1,value2,value3,...,valueX
The template starts
FOR EACH valueN, start a script with a specific displayName (in this case ${{ value }} will be OK)
Below you can find what I've achieved so far:
azure-pipelines.yaml
- stage: AutomateScriptContent
  displayName: 'Some new trick'
  jobs:
    - job: CheckFileChanges
      displayName: Prepare list
      steps:
        - checkout: self
          persistCredentials: true
        - bash: |
            #....Something..Something
            echo "##vso[task.setvariable variable=VALUE_LIST;isOutput=true;]value1,value2,value3"
            fi
          displayName: "Check"
          name: check
    - job: TestTemplate
      variables:
        LISTVAL: $[ dependencies.CheckFileChanges.outputs['check.VALUE_LIST'] ]
      displayName: "Do something with values passed from env variable"
      dependsOn: CheckFileChanges
      steps:
        - template: __templates__/test.template.yml
          parameters:
            MY_LIST: $(LISTVAL)
test.template.yml
parameters:
  MY_LIST: ""

steps:
  - variables:
      website: {{split ',' parameters.MY_LIST}}
  # - script: echo "${{ parameters.MY_LIST }}"
  - ${{ each value in var.MY_LIST }}:
    - script: 'echo ${{ value }}'
      displayName: '${{ value }}'
I know that test.template.yml is not correct but I cannot understand how to make it work!
Any suggestions? I'm not sure if it's possible to pass a new array from bash/powershell with `echo "##vso[task.setvariable variable=VALUE_LIST;isOutput=true;]$MY_ARRAY"`.
Also, one acceptable solution could be adding every value of the array as a new parameter, outputting them, and then passing ALL the parameters at once, but I'm not sure if it's possible to pass the parameters without providing each single parameter name (example below).
variables:
  LISTVAL: $[ dependencies.CheckFileChanges.outputs['check(GET_ALL_PARAMETERS)'] ]
Thank you in advance.
You have a few problems here.
In your test.template.yml
A steps sequence cannot have a variables section
The each keyword can only be used on parameters of type object.
There is no 'split' function.
Also, your parameter name in your pipeline 'WEBSITE_LIST' doesn't match the name defined in your template 'MY_LIST'
If you have a finite list of outputs from your first job you could do something like the below but this will not be sustainable if the list grows.
pipeline:
stages:
  - stage: AutomateScriptContent
    displayName: 'Some new trick'
    jobs:
      - job: CheckFileChanges
        displayName: Prepare list
        steps:
          - checkout: self
            persistCredentials: true
          - bash: |
              #....Something..Something
              echo "##vso[task.setvariable variable=value1;isOutput=true;]foo"
              echo "##vso[task.setvariable variable=value2;isOutput=true;]bar"
              echo "##vso[task.setvariable variable=value3;isOutput=true;]lorem"
              echo "##vso[task.setvariable variable=value4;isOutput=true;]ipsum"
            displayName: "Check"
            name: check
      - job: TestTemplate
        variables:
          v1: $[ dependencies.CheckFileChanges.outputs['check.value1'] ]
          v2: $[ dependencies.CheckFileChanges.outputs['check.value2'] ]
          v3: $[ dependencies.CheckFileChanges.outputs['check.value3'] ]
          v4: $[ dependencies.CheckFileChanges.outputs['check.value4'] ]
        displayName: "Do something with values passed from env variable"
        dependsOn: CheckFileChanges
        steps:
          - template: __templates__/test.template.yml
            parameters:
              MY_LIST: [$(v1), $(v2), $(v3), $(v4)]
template
parameters:
  - name: MY_LIST
    type: object

steps:
  - ${{ each value in parameters.MY_LIST }}:
    - script: 'echo ${{ value }}'
      displayName: '${{ value }}'
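This works because $(v1) … $(v4) survive template expansion as literal macro text, and each generated echo $(v1) script only resolves the macro at runtime inside the TestTemplate job, where the dependency output variables are available.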
Edit: as you explain that you do not know the number of output values from your first job, this does not seem possible to achieve.
Variables are always strings and, while you could output a csv string variable from the first job, there is no function available to split this into an array for use as the parameter of your second job.