Azure DevOps pipeline removing " from parameter value in bash step - bash

I am passing a parameter to the pipeline as shown below:

parameters:
- name: ListofValues
  displayName: values for test
  type: object
  default:
  - test:
      name: "test"
      desired: '"abc","xyz"'

When I pass these values into a bash step, the value comes through as
"abc,xyz", but I expect it to be "abc","xyz". Is there a way I can fix this?
The values may look similar; note the " after the end of the first value.

You can use ${{ convertToJson() }} to keep the double quotes.
My sample YAML (please notice the structure of test):

parameters:
- name: ListofValues
  displayName: values for test
  type: object
  default:
    test:
      name: "test"
      desired: '"abc","xyz"'

trigger: none

pool:
  vmImage: ubuntu-latest

steps:
- script: |
    echo "${{ convertToJson(parameters.listOfValues.test.desired) }}"
The output of the bash task then keeps the double quotes.
Or you can add \ before each quote in the value:
My sample YAML:

parameters:
- name: ListofValues
  displayName: values for test
  type: object
  default:
    test:
      name: "test"
      desired: '\"abc\",\"xyz\"'

trigger: none

pool:
  vmImage: ubuntu-latest

steps:
- script: |
    echo ${{ parameters.ListofValues.test.desired }}
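With the backslashes in place the unquoted echo prints the quotation marks literally, so the expected output of this script step (a sketch of what you should see with the sample value above, not captured from a real run) is:

"abc","xyz"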

var="abc,xyz"
IFS="," read -a array <<< $var
var=""
for word in "${array[#]}"; do
var="$var\"$word\","
done
var="${var%,}"
echo "${var}"
# outputs:
# "abc","xyz"
Read the elements of var, delimited by ,, into an array named array.
Then loop over the elements of the array and concatenate them back into var while adding the required ".
At the end, remove the extra trailing ,.
This solution works with any number of elements.
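For completeness, here is a minimal sketch of the same loop wired into a pipeline bash step, re-quoting the value from the question. It assumes the flattened parameter structure from the first answer, so ${{ parameters.ListofValues.test.desired }} expands to the raw value and the shell strips the original quotes:

steps:
- bash: |
    var=${{ parameters.ListofValues.test.desired }}   # the shell strips the quotes, leaving abc,xyz
    IFS="," read -r -a array <<< "$var"
    var=""
    for word in "${array[@]}"; do
      var="$var\"$word\","
    done
    var="${var%,}"
    echo "$var"   # prints "abc","xyz"
  displayName: re-quote parameter values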

Related

YAML loop in Azure pipelines: I want to assign concatenated values

Hi, in my Azure pipeline I am getting stuck and hitting an error while doing this:

parameters:
- name: applications_module
  displayName: Applications
  type: boolean
  default: 'false'
  values: []
- name: management_module
  displayName: Management
  type: boolean
  default: 'false'
  values: []
- name: Creation
  displayName: Creation
  type: boolean
  default: 'false'
  values: []

variables:
- name: FilterValue
  ${{ each item in parameters }}:
    ${{ if contains(item.name, '_module') }}:
      value: item.name # here I want to concatenate all the parameter names that contain _module, but this statement throws an error

It is not working; it says "'value' is already defined". Can anyone help me with this?
Using ##vso[task.setvariable] we can do that:

steps:
- ${{ each parameter in parameters }}:
  - bash: echo '##vso[task.setvariable variable=allParametersString]$(allParametersString)${{ parameter.Key }}'
- script: |
    echo 'concatenated strings by comma -> $(allParametersString)'
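To keep only the parameters whose name contains _module (which is what the question asks for), a compile-time if can be nested inside the each loop. A sketch, assuming the same allParametersString accumulation pattern as above and adding a trailing comma so the result is comma-separated:

steps:
- ${{ each parameter in parameters }}:
  - ${{ if contains(parameter.Key, '_module') }}:
    - bash: echo '##vso[task.setvariable variable=allParametersString]$(allParametersString)${{ parameter.Key }},'
      displayName: append ${{ parameter.Key }}
- script: echo 'module parameters -> $(allParametersString)'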

Azure Pipeline pass parameter from one job to another

My ci_pipeline.yml contains the following:

- stage: DeployDEV
  pool:
    vmImage: 'windows-latest'
  variables:
  - group: [REDACTED]
  - name: test_var
    value: firstval
  jobs:
  - job: BuildDeploy
    displayName: "REDACTED"
    steps:
    - script: echo "test_var = $(test_var)"
      displayName: first variable pass
    - bash: echo "##vso[task.setvariable variable=test_var]secondValue"
      displayName: set new variable value
    - script: echo "test_var = $(test_var)"
      displayName: second variable pass
      name: Env
      displayName: "Extract Source Branch Name"
  - template: pipeline_templates/ci_pipeline_templates/build_deploy_dev.yml
    parameters:
      testvar: $(test_var)
and in the build_deploy_dev.yml template:
parameters:
- name: testvar

jobs:
- job: Test
  displayName: "Testjob"
  steps:
  - script: echo "test_var=${{ parameters.testvar }}"
    name: TestVar
    displayName: 'test'
I need to be able to modify the variable in the main yml file BEFORE passing it into the template, but test_var still remains firstval. What am I doing wrong? The change does seem to work in the main yml file: the second variable pass script displays test_var=secondValue. How can I make the change stick so that the template receives it?
As stated above, parameters are evaluated at compile time, so using parameters won't solve your problem. However, you can create a new variable as an output, make the template job depend on the job which generates that variable, and then use it in your template as follows:
ci_pipeline.yml
- stage: DeployDEV
  pool:
    vmImage: 'windows-latest'
  variables:
  - group: [REDACTED]
  - name: test_var
    value: firstval
  jobs:
  - job: BuildDeploy
    displayName: "REDACTED"
    steps:
    - script: echo "test_var = $(test_var)"
      displayName: first variable pass
    - bash: echo "##vso[task.setvariable variable=new_var;isOutput=true]SomeValue"
      displayName: set new variable value
      name: SetMyVariable
  - template: pipeline_templates/ci_pipeline_templates/build_deploy_dev.yml
Then in your build_deploy_dev.yml you can use the variable you've created previously:
jobs:
- job: Test
  variables:
    test_var: $[ dependencies.BuildDeploy.outputs['SetMyVariable.new_var'] ]
  displayName: "Testjob"
  dependsOn: BuildDeploy
  steps:
  - script: echo "test_var=$(test_var)"
    name: TestVar
    displayName: 'test'
Note that you can still leverage $(test_var), for instance to check whether it holds 'firstValue' and, if so, create new_var and use it in the other job. You could even carry the value of $(test_var) over into new_var so the value you previously set is reused:
- bash: echo "##vso[task.setvariable variable=new_var]$(test_var)"
displayName: set new variable value
In this way you can dynamically 'change' the behavior of your pipeline and keep your template file.
Unfortunately you cannot use variables set during runtime, i.e. $(test_var), as parameters for templates. Think of templates less like functions and more like snippets. The pipeline essentially swaps the template reference for the guts of the template, and all parameters are evaluated and substituted at that time.
So when you set $(test_var) to 'firstval', the template is evaluated at that time and the parameter is set to 'firstval' as well. When you then set $(test_var) later in the YAML, it is too late. See the documentation below:
https://learn.microsoft.com/en-us/azure/devops/pipelines/process/templates?view=azure-devops#context
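Roughly speaking (a conceptual sketch, not compiler output), after compile-time expansion the template reference from the question becomes a Test job containing the literal macro text, which that job then resolves from its own scope:

- job: Test
  displayName: "Testjob"
  steps:
  - script: echo "test_var=$(test_var)"   # resolved at runtime inside the Test job,
    name: TestVar                         # whose scope still holds 'firstval'
    displayName: 'test'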
It may feel dirty to reuse code, but unfortunately this would be the recommended solution:
- stage: DeployDEV
  pool:
    vmImage: 'windows-latest'
  variables:
  - group: [REDACTED]
  - name: test_var
    value: firstval
  jobs:
  - job: BuildDeploy
    displayName: "REDACTED"
    steps:
    - script: echo "test_var = $(test_var)"
      displayName: first variable pass
    - bash: echo "##vso[task.setvariable variable=test_var]secondValue"
      displayName: set new variable value
    - script: echo "test_var = $(test_var)"
      displayName: second variable pass
  - job: Test
    displayName: "Testjob"
    steps:
    - script: echo "test_var=$(test_var)"
      name: TestVar
      displayName: 'test'

How to access a value from an object using a dynamic key in YAML

I have a YAML configuration as follows:

parameters:
  group: '$(group)'
  acl:
    certificateFile: AclCertificates.p12
    provisioningProfileFile: AmericashDisProfile.mobileprovision
    keystore: 'acl.jks'
  sail:
    certificateFile: AclCertificates.p12
    provisioningProfileFile: AmericashDisProfile.mobileprovision
    keystore: 'acl.jks'

steps:
- bash: |
    echo ${{ parameters[$(group)]['certificateFile'] }}

I want to access the object value using a dynamic key. Here group: '$(group)' is a dynamic value which comes from another variable file.
I have tried accessing the object value like ${{ parameters[$(group)]['certificateFile'] }}, but it is not working. I am not able to figure out how to pass the parameter group into echo ${{ parameters[$(group)]['certificateFile'] }} in order to get a specific object's value.
For example, you have a YAML pipeline A:

parameters:
- name: test
  type: object
  default:
  - name: Name1
    path: Path1
  - name: Name2
    path: Path2

variables:
  sth: ${{ join(';', parameters.test.*.name) }}

And then you can use YAML pipeline B to get the object value:

variables:
- template: azure-pipelines-2.yml # Template reference

steps:
- task: CmdLine@2
  inputs:
    script: 'echo "${{ variables.sth }}"'

YAML Azure pipelines & templates: iterate through each value split from a string passed from the main pipeline OR pass an array as output with bash

I'm kind of stuck with something I would like to have in my build pipeline.
The main goal is to iterate through every value extracted by {{ split ',' parameters.PARAMETER_PASSED }} once I'm already inside the template.
How it should look:
The build passes a string, produced by a script, of the form value1,value2,value3,...,valueX
The template starts
FOR EACH valueN, start a script with a specific displayName (in this case ${{ value }} will be fine)
Below you can find what I've achieved so far:
azure-pipelines.yaml
- stage: AutomateScriptContent
  displayName: 'Some new trick'
  jobs:
  - job: CheckFileChanges
    displayName: Prepare list
    steps:
    - checkout: self
      persistCredentials: true
    - bash: |
        #....Something..Something
        echo "##vso[task.setvariable variable=VALUE_LIST;isOutput=true;]value1,value2,value3"
        fi
      displayName: "Check"
      name: check
  - job: TestTemplate
    variables:
      LISTVAL: $[ dependencies.CheckFileChanges.outputs['check.VALUE_LIST'] ]
    displayName: "Do something with values passed from env variable"
    dependsOn: CheckFileChanges
    steps:
    - template: __templates__/test.template.yml
      parameters:
        MY_LIST: $(LISTVAL)
test.template.yml
parameters:
  MY_LIST: ""

steps:
- variables:
    website: {{ split ',' parameters.MY_LIST }}
# - script: echo "${{ parameters.MY_LIST }}"
- ${{ each value in var.MY_LIST }}:
  - script: 'echo ${{ value }}'
    displayName: '${{ value }}'
I know that test.template.yml is not correct, but I cannot understand how to make it work!
Any suggestions? I'm not sure if it's possible to pass a new array from bash/powershell with echo "##vso[task.setvariable variable=VALUE_LIST;isOutput=true;]$MY_ARRAY".
An acceptable solution could also be adding every value of the array as a separate output variable and then passing ALL of them at once, but I'm not sure it's possible to pass them without providing each single name (example below).

variables:
  LISTVAL: $[ dependencies.CheckFileChanges.outputs['check(GET_ALL_PARAMETERS)'] ]

Thank you in advance.
You have a few problems here.
In your test.template.yml:
A steps sequence cannot have a variables section.
The each keyword can only be used on parameters of type object.
There is no 'split' function.
Also, the parameter name in your pipeline, 'WEBSITE_LIST', doesn't match the name defined in your template, 'MY_LIST'.
If you have a finite list of outputs from your first job you could do something like the below, but this will not be sustainable if the list grows.
pipeline:

stages:
- stage: AutomateScriptContent
  displayName: 'Some new trick'
  jobs:
  - job: CheckFileChanges
    displayName: Prepare list
    steps:
    - checkout: self
      persistCredentials: true
    - bash: |
        #....Something..Something
        echo "##vso[task.setvariable variable=value1;isOutput=true;]foo"
        echo "##vso[task.setvariable variable=value2;isOutput=true;]bar"
        echo "##vso[task.setvariable variable=value3;isOutput=true;]lorem"
        echo "##vso[task.setvariable variable=value4;isOutput=true;]ipsum"
      displayName: "Check"
      name: check
  - job: TestTemplate
    variables:
      v1: $[ dependencies.CheckFileChanges.outputs['check.value1'] ]
      v2: $[ dependencies.CheckFileChanges.outputs['check.value2'] ]
      v3: $[ dependencies.CheckFileChanges.outputs['check.value3'] ]
      v4: $[ dependencies.CheckFileChanges.outputs['check.value4'] ]
    displayName: "Do something with values passed from env variable"
    dependsOn: CheckFileChanges
    steps:
    - template: __templates__/test.template.yml
      parameters:
        MY_LIST: [$(v1), $(v2), $(v3), $(v4)]
template:

parameters:
- name: MY_LIST
  type: object

steps:
- ${{ each value in parameters.MY_LIST }}:
  - script: 'echo ${{ value }}'
    displayName: '${{ value }}'
Edit: as you explain that you do not know the number of output values from your first job, this does not seem possible to achieve.
Variables are always strings and, while you could output a CSV string variable from the first job, there is no function available to split it into an array for use as the parameter of your second job.
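If per-value steps and displayNames are not strictly required, a runtime-only workaround (a sketch reusing the LISTVAL variable from the question, not a way to feed the template an array) is to loop over the CSV inside a single bash step of the TestTemplate job:

- job: TestTemplate
  variables:
    LISTVAL: $[ dependencies.CheckFileChanges.outputs['check.VALUE_LIST'] ]
  dependsOn: CheckFileChanges
  steps:
  - bash: |
      # split the runtime CSV value and handle each entry in one step
      IFS=',' read -r -a values <<< "$(LISTVAL)"
      for v in "${values[@]}"; do
        echo "processing $v"
      done
    displayName: "Loop over VALUE_LIST at runtime"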

How to use an each expression to concatenate a bash script in Azure Pipelines

I have a few places where I need to define a set of K8s secrets during deployment at various stages, so I want to extract the recurring script into a template:

parameters:
- name: secretName
  type: string
  default: ""
- name: secrets
  type: object
  default:
    Foo: Bar

steps:
- task: Bash@3
  displayName: Create generic secret ${{ parameters.secretName }}
  inputs:
    targetType: inline
    script: |
      echo "Creating generic secret ${{ parameters.secretName }}"
      microk8s kubectl delete secret ${{ parameters.secretName }}
      microk8s kubectl create secret generic ${{ parameters.secretName }} ${{ each secret in parameters.secrets }}: --from-literal=${{ secretKey }}="${{ secret.value }}"
I want to call it like this multiple times, to create all necessary secrets for the deployment to each stage:

- job: CreateSecrets
  pool:
    name: $(poolName)
  steps:
  - template: "Templates/template-create-secret.yml"
    parameters:
      secretName: "testSecret"
      secrets:
        username: $(staging-user)
        password: $(staging-password)
        foo: $(bar)

And it should simply execute a script similar to this one:

kubectl create secret generic secretName \
  --from-literal=username=user1 \
  --from-literal=password=pass1 \
  ...etc
With my current approach I am receiving the error:

/Code/BuildScripts/Templates/template-create-secret.yml (Line: 18, Col: 15): The directive 'each' is not allowed in this context. Directives are not supported for expressions that are embedded within a string. Directives are only supported when the entire value is an expression.

How is it possible to iterate over a parameter of type object and use its keys and values to build a string for bash? The alternative would be to use a single key-value pair per secret and create multiple secrets, which I'd like to avoid.
If you concatenate the command arguments into a variable, you can use that variable in a later step or task. This example concatenates all secrets from the key vaults listed in the keyVaultSecretSources parameter into one command. It shouldn't be too hard to adjust it so you can specify which secrets you'd like to include or exclude:
parameters:
- name: environment
  type: string
- name: namespace
  type: string
- name: releaseName
  type: string
# contains an array of Azure key vault names
- name: keyVaultSecretSources
  type: object

stages:
- stage: MountSecrets${{ parameters.releaseName }}
  pool: [Your k8s Pool]
  displayName: Mount Key Vault Secrets ${{ parameters.releaseName }}
  # The key vault arguments will be concatenated into the stage variable secretArgs
  variables:
    secretArgs: ""
  jobs:
  - deployment: [Your Job Deployment Name]
    displayName: [Your Job Display Name]
    strategy:
      runOnce:
        deploy:
          steps:
          # skip artifacts download for this stage
          - download: none
          - ${{ each keyVault in parameters.keyVaultSecretSources }}:
            # 1. obtain all secrets from the keyVault.name key vault
            # 2. remove all JSON formatting, leaving each secret name on one line
            # 3. assign them to the local variable secretNameArray as an array
            # 4. loop through secretNameArray and assign each secret to the local variable kvs
            # 5. construct the --from-literal argument and append it to the local variable mountCommand
            # 6. append mountCommand to the stage variable secretArgs
            - task: AzureCLI@2
              displayName: 'Concatenate Keyvault Secrets'
              inputs:
                azureSubscription: [Your subscription]
                scriptType: 'bash'
                failOnStandardError: true
                scriptLocation: 'inlineScript'
                inlineScript: |
                  secretNameArray=($(az keyvault secret list --vault-name ${{ keyVault.name }} --query "[].name" | tr -d '[:space:][]"' | sed -r 's/,+/ /g'));
                  for i in "${secretNameArray[@]}"; do kvs="$(az keyvault secret show --vault-name ${{ keyVault.name }} --name "$i" --query "value" -o tsv)"; mountCommand="$mountCommand --from-literal=$(echo -e "$i" | sed -r 's/-/_/g')='$kvs'"; done;
                  echo "##vso[task.setvariable variable=secretArgs;issecret=true]$(secretArgs)$mountCommand"
          - task: Kubernetes@1
            displayName: 'Kubectl Login'
            inputs:
              kubernetesServiceEndpoint: [Your Service Connection Name]
              command: login
              namespace: ${{ parameters.namespace }}
          - task: AzureCLI@2
            displayName: 'Delete Secrets'
            inputs:
              azureSubscription: [Your subscription]
              scriptType: 'bash'
              failOnStandardError: false
              scriptLocation: 'inlineScript'
              inlineScript: |
                kubectl delete secret ${{ parameters.releaseName }}-keyvault -n '${{ parameters.namespace }}'
                exit 0
          - task: AzureCLI@2
            displayName: 'Mount Secrets'
            inputs:
              azureSubscription: [Your subscription]
              scriptType: 'bash'
              failOnStandardError: false
              scriptLocation: 'inlineScript'
              inlineScript: |
                kubectl create secret generic ${{ parameters.releaseName }}-keyvault$(secretArgs) -n '${{ parameters.namespace }}'
                exit 0
          - task: Kubernetes@1
            displayName: 'Kubectl Logout'
            inputs:
              command: logout
According to the doc Parameter data types, we need to use the 'each' keyword before the script; the Runtime parameters doc can help you learn more. Here is the demo script:
parameters:
- name: listOfStrings
  type: object
  default:
  - one
  - two

steps:
- ${{ each value in parameters.listOfStrings }}:
  - script: echo ${{ value }}
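Applied to the original question, one way to build the --from-literal arguments without embedding each inside a string is to hand the whole secrets object to bash and assemble the command at runtime. A rough sketch, assuming jq is available on the agent and reusing convertToJson from the first answer on this page (the secret names and values come from the question's parameters and are only illustrative):

steps:
- task: Bash@3
  displayName: Create generic secret ${{ parameters.secretName }}
  inputs:
    targetType: inline
    script: |
      # render the secrets object as JSON at compile time, then build the arguments at runtime
      args=""
      while IFS=$'\t' read -r key value; do
        args="$args --from-literal=$key=$value"
      done < <(echo '${{ convertToJson(parameters.secrets) }}' | jq -r 'to_entries[] | [.key, .value] | @tsv')
      microk8s kubectl delete secret ${{ parameters.secretName }}
      microk8s kubectl create secret generic ${{ parameters.secretName }} $args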
