Duplicate key in YAML configuration file - spring-boot

I have the following in YAML:
key1
  key2: "value"
key1
  key2
    key3: "value2"
I get an exception about a duplicate key key1:
Caused by: org.yaml.snakeyaml.parser.ParserException: while parsing MappingNode
I tried various combinations but was unable to get it to parse correctly.

Your YAML is syntactically invalid, but I am assuming it actually looks like this:
key1:
  key2: "value"
key1:
  key2:
    key3: "value2"
Your error is that key1 is used twice as a mapping key in the root node. This is illegal as per the YAML spec:
The content of a mapping node is an unordered set of key: value node pairs, with the restriction that each of the keys is unique.
The solution is to make all keys of the same mapping unique:
key11:
  key2: "value"
key12:
  key2:
    key3: "value2"

I faced the same issue, too. Then it struck me: the answer is simple.
From
mapping:
  refresh:
    schedule:
      frequency:
        milli: 86400000
mapping:
  refresh:
    schedule:
      initial:
        delay:
          ms: 30000
to
mapping:
  refresh:
    schedule:
      frequency:
        milli: 86400000
      initial:
        delay:
          ms: 30000

The simple solution below worked for me.
Basically, in the first scenario the 'server' keyword appears as a separate top-level structure,
while in the second scenario it appears as a child structure under eureka.
I simply adjusted the indentation and it worked.
Before :->
server:
port: 8761
eureka:
client:
registerWithEureka: false
fetchRegistry: false
server :
waitTimeInMsWhenSyncEmpty: false
After :->
server:
port: 8761
eureka:
client:
registerWithEureka: false
fetchRegistry: false
server :
waitTimeInMsWhenSyncEmpty: false

You may fix it like this:
key1:
  key2: "value"
  key2.key3: "value2"

Related

Filter yaml in helmfile

Is there a way to filter yaml files used in a helmfile?
I have a values.yaml:
component-a:
  key1: value
  key2: value
component-b:
  key1: value
  key3: value
In the helmfile I load the yaml and want to filter for a component template:
templates:
  component: &component
    chart: oci://myregistry/{{`{{ .Release.Name }}`}}
    values:
      - values.yaml | TODO filter for .Release.Name
releases:
  - name: component-a
    <<: *component
  - name: component-b
    <<: *component
I would like to keep the values for both components in one file and, in the component template, filter for only the ones matching the release name.

What is indexed array in YAML?

In my yaml spring-boot application config I have
additional-properties[auto.register.schemas]: false
additional-properties[use.latest.version]: true
and it works! I haven't found this syntax in the YAML specification. What does it mean? How can it be re-written using standard YAML? Is this the same as
additional-properties:
  - auto.register.schemas: false
  - use.latest.version: true
?
AFAIK:
Every element (separated by a dot) has to be on its own line and indented accordingly.
foo:
  bar:
    name: value
    name2: value2
  fez: value
So your example would be:
additional-properties:
  auto:
    register:
      schemas: false
After experimenting, and after finding this answer, I conclude that (at least in Spring's application.yaml):
camel.component.kafka:
  additional-properties[auto.register.schemas]: false
  additional-properties[use.latest.version]: true
is equivalent to
camel.component.kafka.additional-properties:
  "[auto.register.schemas]": false
  "[use.latest.version]": true
and this is equivalent to
camel:
  component:
    kafka:
      additional-properties:
        "[auto.register.schemas]": false
        "[use.latest.version]": true

Serverless Framework - unrecognized property 'params'

I am trying to create a scheduled lambda function using the Serverless framework and to send it different parameters from different events.
Here is my serverless configuration:
functions:
  profile:
    timeout: 10
    handler: profile.profile
    events:
      - schedule:
          rate: rate(1 minute)
          params:
            hello: world
The issue is that when I run sls deploy, I get the following error:
Serverless: at 'functions.profile.events[0]': unrecognized property 'params'
This is basically copied from the documentation here, so it should work...
Am I missing something?
The documentation you're referencing is for Apache OpenWhisk.
If you're using AWS, you'll need to use input as shown in the AWS documentation:
functions:
  aggregate:
    handler: statistics.handler
    events:
      - schedule:
          rate: rate(10 minutes)
          enabled: false
          input:
            key1: value1
            key2: value2
            stageParams:
              stage: dev
The documentation that you referred to is for OpenWhisk: https://www.serverless.com/framework/docs/providers/openwhisk/events/schedule/#schedule/.
The CloudWatch Events documentation (now rebranded as EventBridge) is at https://www.serverless.com/framework/docs/providers/aws/events/schedule/#enabling--disabling. Sample code for reference:
functions:
  aggregate:
    handler: statistics.handler
    events:
      - schedule:
          rate: rate(10 minutes)
          enabled: false
          input:
            key1: value1
            key2: value2
            stageParams:
              stage: dev
      - schedule:
          rate: cron(0 12 * * ? *)
          enabled: false
          inputPath: '$.stageVariables'
      - schedule:
          rate: rate(2 hours)
          enabled: true
          inputTransformer:
            inputPathsMap:
              eventTime: '$.time'
            inputTemplate: '{"time": <eventTime>, "key1": "value1"}'
Official docs at https://docs.aws.amazon.com/eventbridge/latest/userguide/scheduled-events.html
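To make the difference concrete: whatever you put under input is delivered as the Lambda event payload. A minimal handler sketch in Java (the class name is hypothetical, and it assumes the aws-lambda-java-core library is on the classpath):

import java.util.Map;
import com.amazonaws.services.lambda.runtime.Context;
import com.amazonaws.services.lambda.runtime.RequestHandler;

// Hypothetical handler: with the schedule event above, the `input` block arrives
// as the event argument, e.g. {key1=value1, key2=value2, stageParams={stage=dev}}.
public class AggregateHandler implements RequestHandler<Map<String, Object>, Void> {

    @Override
    public Void handleRequest(Map<String, Object> event, Context context) {
        context.getLogger().log("key1 = " + event.get("key1")); // value1
        return null;
    }
}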
One of my configurations looks something like the example below; there we use parameters instead of params.
functions:
  test_function:
    handler: handler.test_function
    memorySize: 512
    timeout: 60
    events:
      - http:
          path: get-hello
          method: get
          request:
            parameters:
              queryStrings:
                name: true

What is `<<` and `&` in yaml mean?

When I reviewed the cryptogen (a Fabric command) config file, I saw these symbols:
Profiles:
  SampleInsecureSolo:
    Orderer:
      <<: *OrdererDefaults ## what is the `<<`
      Organizations:
        - *ExampleCom ## what is the `*`
    Consortiums:
      SampleConsortium:
        Organizations:
          - *Org1ExampleCom
          - *Org2ExampleCom
Above there are two symbols, << and *.
Application: &ApplicationDefaults # what does the `&` mean
  Organizations:
As you can see, there is another symbol, &.
I don't know what they mean. I didn't find any information even by reviewing the source code (fabric/common/configtx/tool/configtxgen/main.go).
Well, those are elements of the YAML file format, which is used here to provide a configuration file for configtxgen. The "&" sign marks an anchor and "*" is a reference to that anchor; this is basically used to avoid duplication, for example:
person: &person
  name: "John Doe"
employee: &employee
  <<: *person
  salary: 5000
will reuse fields of person and has similar meaning as:
employee: &employee
name : "John Doe"
salary : 5000
Another example is simply reusing a value:
key1: &key some very common value
key2: *key
equivalent to:
key1: some very common value
key2: some very common value
Since fabric/common/configtx/tool/configtxgen/main.go uses an off-the-shelf YAML parser, you won't find any reference to these symbols in configtxgen-related code. I would suggest reading a bit more about the YAML file format.
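If you want to see how an off-the-shelf parser resolves them, here is a minimal sketch using SnakeYAML in Java (any compliant YAML parser behaves the same way; anchors, aliases, and merge keys are resolved during loading, so the consuming code never sees & or *):

import java.util.Map;
import org.yaml.snakeyaml.Yaml;

public class AnchorAliasDemo {
    public static void main(String[] args) {
        String doc =
            "person: &person\n" +
            "  name: \"John Doe\"\n" +
            "employee:\n" +
            "  <<: *person\n" +
            "  salary: 5000\n";

        Map<String, Object> data = new Yaml().load(doc);
        // The merge key (<<) pulls person's fields into employee:
        System.out.println(data.get("employee")); // contains name=John Doe and salary=5000
    }
}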
In YAML, if the data is like
user: &userId '123'
username: *userId
the equivalent YAML is
user: '123'
username: '123'
or the equivalent JSON is
{
  "user": "123",
  "username": "123"
}
So it basically allows you to reuse data; you can also try it with a mapping or an array instead of a single value like 123.
Try converting the YAML below to JSON using any online YAML-to-JSON converter:
users: &users
k1: v1
k2: v2
usernames: *users

spring.profiles is not working as expected in spring boot

My Spring Boot config *.yml file:
server.port: 2222
spring:
  application:
    name: x-service
  data:
    mongodb:
      host: db.x
      database: x
      # userName: ${db.userName}
      # password: ${db.password}
  rabbitmq:
    # port: ${queue.port}
    host: queue.x
    username: ${queue.userName}
    password: ${queue.password}
    listener:
      max-concurrency: 1
      prefetch: 1
      acknowledge-mode: auto
      auto-startup: true
    dynamic: true
###########DEV##############
spring.profiles: dev
#queue.virtual.host: xuser
queue.userName: guest
queue.password: guest
queue.port: 5672
#db.userName:
#db.password:
falconUrl: http://x.y.com
##########DEFAULT###########
spring.profiles: qa
queue.virtual.host: xuser
queue.userName: xuser
queue.password: xpassword
queue.port: 3456
db.userName: xuser
db.password: xpassword
falconUrl: http://x.z.com
It gives me this error:
org.yaml.snakeyaml.parser.ParserException: while parsing MappingNode
 in 'reader', line 1, column 1:
    server.port: 2222
    ^
Duplicate key: spring.profiles
 in 'reader', line 47, column 1:
If I comment out the properties of one of the profiles, it works fine.
Can anyone please suggest what is wrong here?
The error message is actually quite specific and accurate: in the top-level mapping of your YAML file (the one starting with the key-value pair server.port and 2222) you have two identical keys (the scalar spring.profiles). Duplicate keys are not allowed in YAML, as they are required to be unique according to the specification.
The underlying problem is that if you want to change the configuration depending on the environment, you'll have to follow the documented specification, which states that:
A YAML file is actually a sequence of documents separated by --- lines, and each document is parsed separately to a flattened map.
If a YAML document contains a spring.profiles key, then the profiles value (comma-separated list of profiles) is fed into the Spring Environment.acceptsProfiles() and if any of those profiles is active that document is included in the final merge (otherwise not)
Your YAML file is a single implicit YAML document because it lacks the --- marker that starts an explicit YAML document. (The ... marker that indicates end-of-document might not be properly supported by snakeyaml; at least it is not mentioned in the examples.)
Your code should look like:
server.port: 2222
spring:
  application:
    name: x-service
  data:
    mongodb:
      host: db.x
      database: x
      # userName: ${db.userName}
      # password: ${db.password}
  rabbitmq:
    # port: ${queue.port}
    host: queue.x
    username: ${queue.userName}
    password: ${queue.password}
    listener:
      max-concurrency: 1
      prefetch: 1
      acknowledge-mode: auto
      auto-startup: true
    dynamic: true
###########DEV##############
---
spring.profiles: dev
#queue.virtual.host: xuser
queue.userName: guest
queue.password: guest
queue.port: 5672
#db.userName:
#db.password:
falconUrl: http://x.y.com
##########DEFAULT###########
---
spring.profiles: qa
queue.virtual.host: xuser
queue.userName: xuser
queue.password: xpassword
queue.port: 3456
db.userName: xuser
db.password: xpassword
falconUrl: http://x.z.com
The statement in the documentation that "each document is parsed separately to a flattened map" is of course only true if each of the documents has a mapping at the top level. That is what Spring Boot expects, but you could just as easily have a scalar or sequence at the top level of a document, and such documents are certainly not parsed by snakeyaml to a flattened map.
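You can observe the "each document is parsed separately" behaviour directly with the underlying parser. Here is a minimal sketch using SnakeYAML's loadAll (Spring Boot then keeps only the documents whose spring.profiles value matches an active profile and merges them, as described in the quoted documentation):

import org.yaml.snakeyaml.Yaml;

public class MultiDocDemo {
    public static void main(String[] args) {
        String config =
            "server.port: 2222\n" +
            "---\n" +
            "spring.profiles: dev\n" +
            "queue.port: 5672\n" +
            "---\n" +
            "spring.profiles: qa\n" +
            "queue.port: 3456\n";

        // Each document between --- separators is parsed into its own map,
        // so the repeated spring.profiles keys no longer collide.
        for (Object document : new Yaml().loadAll(config)) {
            System.out.println(document);
        }
        // {server.port=2222}
        // {spring.profiles=dev, queue.port=5672}
        // {spring.profiles=qa, queue.port=3456}
    }
}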
