YAML configuration: replace one property's value with the value of another

I would like to know if it is possible to replace the value of one property with the value of another, e.g.:
booking:
  services:
    dans:
      PRIVATE_KEY: MIIEowIBAAKCAQEAr8nAQCQZ8hL0up8LzItKrBwIWhvbFgTtVEHjQIJ0Yw/F3u82
    mode:
      PRIVATE_KEY: {booking.services.dans.PRIVATE_KEY}

You can use anchors and aliases:
booking:
  services:
    dans:
      PRIVATE_KEY: &a
        MIIEowIBAAKCAQEAr8nAQCQZ8hL0up8LzItKrBwIWhvbFgTtVEHjQIJ0Yw/F3u82
    mode:
      PRIVATE_KEY: *a
This is not a replacement but a reference; both PRIVATE_KEY keys will point to the same value.
YAML does not provide a way to refer to other values via some kind of path. Note that {} does have a special meaning in YAML: it creates a flow mapping. What you wrote is equivalent to this:
booking:
  services:
    dans:
      PRIVATE_KEY: MIIEowIBAAKCAQEAr8nAQCQZ8hL0up8LzItKrBwIWhvbFgTtVEHjQIJ0Yw/F3u82
    mode:
      PRIVATE_KEY:
        booking.services.dans.PRIVATE_KEY:
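To see what the alias buys you: once a loader resolves it, the anchored document above behaves as if it had been written with the value repeated under both keys (conceptually it is still a single shared node):
booking:
  services:
    dans:
      PRIVATE_KEY: MIIEowIBAAKCAQEAr8nAQCQZ8hL0up8LzItKrBwIWhvbFgTtVEHjQIJ0Yw/F3u82
    mode:
      PRIVATE_KEY: MIIEowIBAAKCAQEAr8nAQCQZ8hL0up8LzItKrBwIWhvbFgTtVEHjQIJ0Yw/F3u82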

Related

How to show all parameter names as a comma-separated string and assign them to one variable in YAML Azure Pipelines

How can I show all the parameters' names as a comma-separated (concatenated) string and assign them to one variable in Azure YAML, so that I can use this variable in several places?
I tried using:
parameters:
- name: Var1
  displayName: Testing
  type: string
- name: Var2
  displayName: Coding
  type: string
- name: Var3
  displayName: Marketing
  type: string

variables:
- name: allParametersString
  ${{each item in parameters}}:
    value: $allParametersString + ','+ item.displayName
My desired output is that, upon using $(allParametersString), I should get
Testing,Coding,Marketing
but this raises an error saying 'value' is already defined. Can anyone help me with this? I have been searching for a solution for 2 weeks :(
I found that using bash to assign the values works for this. I did:
variables:
- name: allParametersString
  value: ' '

steps:
- ${{ each parameter in parameters }}:
  # the code below reassigns the variable with bash, appending each parameter name followed by a comma
  - bash: echo '##vso[task.setvariable variable=allParametersString]$(allParametersString)${{ parameter.Key }}, '
- script: echo 'printing all parameter names separated by commas -> $(allParametersString)'
Please let me know if I can improve it further. This helped me understand that reassigning (or assigning twice, or concatenating) a string in pipeline YAML has to be done this way.
Your current approach is not feasible. Several things constrain you here.
1. The first is how DevOps processes YAML template expressions.
See this official document:
https://learn.microsoft.com/en-us/azure/devops/pipelines/process/runs?view=azure-devops#process-the-pipeline
From the first sentence there, we know your YAML will be expanded like this:
parameters:
- name: Var1
  displayName: Testing
  type: string
  default: value1
- name: Var2
  displayName: Coding
  type: string
  default: value2
- name: Var3
  displayName: Marketing
  type: string
  default: value3

variables:
- name: allParametersString
  value: xxx
  value: xxx
  value: xxx
A YAML variable definition does not allow this, which is why you hit the error 'value' is already defined.
2. The second is the structure allowed by DevOps YAML files.
Every section of the YAML definition has a limited set of predefined keys, so at compile time there is no other container in which to accumulate the variable.
3. I am afraid the YAML expression syntax does not support this either.
Refer to:
Each keyword - this describes the standard usage of 'each': you can use the each keyword to loop through parameters with the object type.
Join expression - this is the way the DevOps YAML pipeline flattens data at compile time, but it does not work in your case (see the sketch after this point for where it does apply).
There is also no index or subscript to tell you which iteration of the loop is the last one.
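For reference, join does work when the inputs live in a single object-type (list) parameter. A minimal sketch, assuming you are free to restructure the three inputs that way (the parameter name teams is made up for illustration):
parameters:
- name: teams
  type: object
  default:
  - Testing
  - Coding
  - Marketing

variables:
- name: allParametersString
  value: ${{ join(',', parameters.teams) }}

steps:
- script: echo "$(allParametersString)"   # prints Testing,Coding,Marketing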
4. By the way, the item object does not carry the 'displayName' information:
trigger:
- none

parameters:
- name: Var1
  displayName: Testing
  type: string
  default: value1
- name: Var2
  displayName: Coding
  type: string
  default: value2
- name: Var3
  displayName: Marketing
  type: string
  default: value3

steps:
- ${{each item in parameters}}:
  - task: PowerShell@2
    inputs:
      targetType: 'inline'
      script: |
        # Write your PowerShell commands here.
        Write-Host "${{ convertToJson(item) }}"
Result: the logged item contains only the parameter name and its value, not the displayName.
The DevOps YAML pipeline has no built-in feature for this. If you really need it, a feasible method is to call the API to fetch the content of the YAML file, parse out the parameters section (effectively writing your own parser), combine the acquired parameter names into a string inside a script, and expose it with the logging command set variable (isoutput=true) so that other tasks can use the combined variable.
This can be done, but it is overly complicated; consider whether it is really necessary.
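A minimal sketch of that last step, with the combined string hard-coded as a stand-in for whatever your own parser produced (the step name combine is made up for illustration):
steps:
- bash: |
    # Assume "combined" was built by parsing the pipeline YAML yourself.
    combined="Testing,Coding,Marketing"
    echo "##vso[task.setvariable variable=allParametersString;isoutput=true]$combined"
  name: combine
- script: echo "combined value -> $(combine.allParametersString)"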

Pykwalify - evaluate nested maps

I'm using pykwalify to validate a schema.
Given this YAML:
variables:
  dev:
    options:
      key: value
  uat:
    key: value
    key2: value
  prd:
    key: value
    key2: value
Under variables, any map should be allowed.
Under that second level (dev, uat, prd), any key should be allowed EXCEPT options; "options" should not be allowed there.
I've tried using a regex, but it only evaluates the top level, and I'm not quite sure how to evaluate the level nested under that dev/uat/prd level.
variables:
  type: map
  matching-rule: all
  mapping:
    regex;([^,]+):
      type: any
    regex;(^(?!.*options:).*$):
      type: any
Another potential option: if I have to explicitly list the values that are allowed, that would work too.
I see two issues in your snippet:
The regex to match anything but "options" is wrong.
The schema isn't properly structured for the nested mapping.
The following schema should provide what you need:
variables:
  type: map
  matching-rule: all
  mapping:
    regex;([^,]+):
      type: map
      mapping:
        regex;(^(?!options$).*):
          type: any
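For illustration, under that schema a document with options at the second level is rejected, while any other key name passes (the key names below are made up):
variables:
  dev:
    options:          # rejected: "options" fails the (^(?!options$).*) pattern
      key: value
  uat:
    anything-else:    # accepted: any key other than "options" matches
      key: value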

Chain functions in CloudFormation ImportValue, Join and Ref

I want to combine 3 functions in CloudFormation YAML but am failing to do so.
I've got an exported value, foo-exportedParam, that I want to access.
I want to import it, taking into account that the prefix foo is dynamic and comes from a parameter in the template.
So I want something like
Name: Fn::ImportValue Fn::Join ['', [Fn::Ref prefix, "-exportedParam"]]
If I have the param prefix = foo, then this should translate into
Name: !ImportValue foo-exportedParam
Is this possible?
The documentation gave me a clue about the syntax. This works:
Name:
  Fn::ImportValue:
    !Join ['', [!Ref prefix, "-exportedParam"]]
See the nested example here: https://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/intrinsic-function-reference-importvalue.html
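An equivalent, slightly more compact form substitutes the parameter directly into the import name with Fn::Sub (same prefix parameter assumed):
Name:
  Fn::ImportValue: !Sub '${prefix}-exportedParam'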

The request is invalid - unrecognized type: string when applying kubectl [duplicate]

I want to set a boolean variable in a ConfigMap (or Secret):
apiVersion: v1
kind: ConfigMap
metadata:
  name: env-config
  namespace: mlo-stage
data:
  webpack_dev_server: false
But when I apply it, I get the following error:
The request is invalid: patch: Invalid value: "map[data:map[webpack_dev_server:false] metadata:map[annotations:map[kubectl.kubernetes.io/last-applied-configuration:{ blah blah blah}]]]": unrecognized type: string
I have tried changing the value to Off/No/False, all with the same problem.
It seems the values of the keys in the data map can only be strings. I tried changing the value to "false"; the YAML file is then fine, but the variable becomes a string rather than a boolean.
What should I do if I want to pass a boolean as the value?
Values in a ConfigMap must be strings (key-value pairs or file contents).
Change:
data:
  webpack_dev_server: false
To:
data:
  webpack_dev_server: "false"
To your question:
what should I do if I want to pass a boolean as value?
You can handle this in the application by converting the string to a bool.
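For example, a minimal sketch of how the value might reach a container: it arrives as the string "false", and the application decides how to interpret it (the Pod, container, and env var names here are made up):
apiVersion: v1
kind: Pod
metadata:
  name: env-demo
spec:
  containers:
  - name: app
    image: busybox
    command: ["sh", "-c", "echo webpack_dev_server=$WEBPACK_DEV_SERVER"]
    env:
    - name: WEBPACK_DEV_SERVER
      valueFrom:
        configMapKeyRef:
          name: env-config
          key: webpack_dev_server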

Pass multiple parameters to DBParameterGroup in YAML CloudFormation

I have the following CloudFormation template. The goal is to create a parameter group only if the user wants one, and then populate the RDS parameters in that parameter group from the CloudFormation template parameters.
Parameters:
  UseCustomParameterGroup:
    Description: Toggle to 'Yes' to create a new parameter group.
    Type: String
    AllowedValues: ['Yes', 'No']
    Default: 'No'
  CustomParameters:
    Description: Add custom parameters to your custom parameter group. Creating a custom parameter group without parameters creates a mirror of the default group.
    Type: String

Conditions:
  UseCustomParameterGroup: !Equals [!Ref 'UseCustomParameterGroup', 'Yes']

Resources:
  CustomParameterGroup:
    Type: AWS::RDS::DBParameterGroup
    Condition: 'UseCustomParameterGroup'
    Properties:
      Family: "postgres10"
      Parameters: !Ref "CustomParameters"
If I call this template from another template like so, it fails with the error Value of property Parameters must be an object:
Parameters:
  USECUSTOMPARAMETERGROUP: 'Yes'
  CUSTOMPARAMETERS: '{
    "shared_preload_libraries": "pg_stat_statements",
    "pg_stat_statements.track": "all"
  }'

Resources:
  Postgres:
    Type: AWS::CloudFormation::Stack
    Properties:
      TemplateURL: https://..../rds-postgres-instance.yaml
      TimeoutInMinutes: '60'
      Parameters:
        UseCustomParameterGroup: !Ref USECUSTOMPARAMETERGROUP
        CustomParameters: !Ref CUSTOMPARAMETERS
The documentation for AWS::RDS::DBParameterGroup states the following for the Parameters property:
Type: A JSON object consisting of string key-value pairs, as shown in the following example
"Parameters" : { "Key1" : "Value1", "Key2" : "Value2", "Key3" : "Value3" }
I think this may be out of date for the YAML version of CloudFormation, but there is no documentation on how to pass multiple parameters in this value.
I want the user to be able to define as few or as many RDS parameters as they please, without having to account for any of the thousands of possible parameters available to RDS.
CloudFormation doesn't let you convert string parameters into JSON (or YAML) objects; parameters are meant to be used as values for your definition keys.
Other frameworks, such as Serverless, overcome this limitation by using a different template language that generates a CloudFormation-compatible artifact after some processing; if this feature is critical to your process, I'd advise migrating to one of those frameworks.
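For comparison, this minimal sketch shows what the Parameters property expects when it is written directly in the template as a YAML mapping rather than passed in as a string (values reused from the question):
CustomParameterGroup:
  Type: AWS::RDS::DBParameterGroup
  Properties:
    Description: Custom PostgreSQL parameters defined inline
    Family: postgres10
    Parameters:
      shared_preload_libraries: pg_stat_statements
      pg_stat_statements.track: all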
