I'm using Symfony 2.6 and I'm following these tutorials on how to use the validation Callback constraint:
http://symfony.com/blog/new-in-symfony-2-4-a-better-callback-constraint
http://symfony.com/doc/current/reference/constraints/Callback.html#external-callbacks-and-closures
To invoke an external validation callback, I'm trying to use the following YAML configuration:
App\APIBundle\Entity\Order:
    properties:
        id:
            - Type:
                type: integer
                message: "Der Wert {{ value }} ist kein gültiger {{ type }}."
        amount:
            - Type:
                type: integer
                message: "Der Wert {{ value }} ist kein gültiger {{ type }}."
                groups: [ "AppOrder", "AppOrderbasket" ]
            - Callback: [App\APIBundle\Validator\Validator, validate]
              groups: [ "AppOrder", "AppOrderbasket" ]
I run into the following problems when trying to validate the amount property with an external callback validation class:
The function "validate" within the validation class App\APIBundle\Validator\Validator doesn't get invoked at all. I've tried to add validation groups by adding the "groups" property to the Callback constraint. This doesn't seem to be valid, as I get this warning: Warning: trim() expects parameter 1 to be string, array given.
If I remove the "groups" property the warning disappears, but the validator is still not invoked.
Any ideas?
Thanks in advance
ninsky
You are now mixing the default option syntax with the normal syntax. That doesn't work.
If you only need to specify the default option (which is the callback option in case of the Callback constraint), you can use Callback: [App\APIBundle\Validator\Validator, validate]. However, if you have to define 2 options (in your case callback and groups), you have to use the normal syntax:
- Callback:
    callback: [App\APIBundle\Validator\Validator, validate]
    groups: [AppOrder, AppOrderbasket]
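Applied to the mapping from the question, the amount property would then look roughly like this (a sketch based on your configuration, with the class and group names taken from the question):

App\APIBundle\Entity\Order:
    properties:
        amount:
            - Type:
                type: integer
                message: "Der Wert {{ value }} ist kein gültiger {{ type }}."
                groups: [ "AppOrder", "AppOrderbasket" ]
            - Callback:
                callback: [App\APIBundle\Validator\Validator, validate]
                groups: [ "AppOrder", "AppOrderbasket" ]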
I have a strange problem that I can't seem to get my head around. I am trying to define some variables for use as part of the job that will deploy Bicep files via the Azure CLI and execute PowerShell tasks.
I get this validation error when I try and execute the pipeline: While parsing a block mapping, did not find expected key
The line that it refers to is: - name: managementResourceDNSPrivateResolverName
From the research I have done on this problem, it sounds like an indentation issue, but on the face of it the YAML looks fine.
jobs:
  - job: 'Deploy_Management_Resources'
    pool:
      vmImage: ${{ parameters.buildAgent }}
    variables:
      - name: managementResourceDNSPrivateResolverName
        value: 'acme-$[ lower(parameters['environmentCode']) ]-$[ lower(variables['resourceLocationShort']) ]-private-dns-resolver'
      - name: managementResourceGroupManagement
        value: 'acme-infrastructure-rg-management'
      - name: managementResourceRouteTableName
        value: 'acme-$[ lower(variables['subscriptionCode']) ]-$[ lower(variables['resourceLocationShort']) ]-route-table'
      - name: managementResourceVirtualNetworkName
        value: 'acme-$[ lower(variables['subscriptionCode']) ]-$[ lower(variables['resourceLocationShort']) ]-vnet-internal-mng'
Thanks!
The error message ...parsing a block mapping, did not find expected key is usually a side-effect of malformed YAML. You'll see it often with variables if you have mixed formats of array elements and property elements:
variables:  # an array of objects
  # variable group reference object
  - group: myvariablegroup
  # variable template reference object
  - template: my-variables.yml
  # variable object
  - name: myVariable
    value: 'value1'
  # variable shorthand syntax
  myVariable: 'value1' # this fails because it's a property instead of an array element
While it doesn't appear that the sample you've provided is malformed, I am curious about the use of $[ ], which is a runtime expression. The expression $[ lower(parameters['environmentcode']) ] refers to parameters, which are only available at compile time.
Change:
$[ lower(parameters['environmentCode']) ] to ${{ lower(parameters.environmentCode) }}
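As a minimal sketch of that change (simplified to show only the parameters reference; the variable-based segments of your values are omitted here):

variables:
  - name: managementResourceDNSPrivateResolverName
    value: 'acme-${{ lower(parameters.environmentCode) }}-private-dns-resolver'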
I am trying to create a Prometheus alert expression that checks if a mountpoint is mounted and sends an alert if the mountpoint is missing. The idea was something like this:
groups:
  - name: mountpoints
    rules:
      - alert: /ghome missing
        expr: absent(node_filesystem_avail_bytes{mountpoint="/ghome", instance="my.machine.org:9100"})
        for: 60s
        labels:
          severity: critical
        annotations:
          summary: "/ghome missing on ({{ $labels.instance }})."
          description: "VALUE = {{ $value }}\n LABELS = {{ $labels }}"
This kind of works, but is there a way of passing a list/vector for the mountpoint(s) and/or instance(s)? With this expression I'll have to write an alert rule for each instance and each mountpoint.
I had the idea of trying regular expressions like
expr: absent(node_filesystem_avail_bytes{mountpoint=~"/ghome|/something|/other", instance=~"my.machine.org:9100|another.machine.org:9100"})
... but this obviously does not work.
Does anybody have an idea how to implement this?
Greetings
Volker
When we pass an expression such as absent(my_metric{label=~"1|2"}), it is evaluated as follows:
my_metric{label=~"1|2"} might return one of four possible results:
No result
my_metric{label="1"}
my_metric{label="2"}
both my_metric{label="1"} and my_metric{label="2"}
The absent function is then called on those results, and it only returns 1 when there are no results at all, which misses the case where just one of them is absent.
Unfortunately there's no one-liner for this; we have to be explicit with absent. We can either have multiple alert rules or use the or operator, such as:
absent(my_metric{label="1"}) or absent(my_metric{label="2"})
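Applied to the rule from the question, that might look roughly like this (a sketch using the mountpoints and instance from the question; add one absent(...) term per mountpoint/instance combination you want covered):

- alert: mountpoint missing
  expr: >
    absent(node_filesystem_avail_bytes{mountpoint="/ghome", instance="my.machine.org:9100"})
    or absent(node_filesystem_avail_bytes{mountpoint="/something", instance="my.machine.org:9100"})
    or absent(node_filesystem_avail_bytes{mountpoint="/other", instance="my.machine.org:9100"})
  for: 60s
  labels:
    severity: critical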
When I run puppet agent --test I get no error output, but the user is not created.
My Puppet hiera.yaml configuration is:
---
version: 5
  datadir: "/etc/puppetlabs/code/environments"
  data_hash: yaml_data
hierarchy:
  - name: "Per-node data (yaml version)"
    path: "%{::environment}/nodes/%{::trusted.certname}.yaml"
  - name: "Common YAML hierarchy levels"
    paths:
      - "defaults/common.yaml"
      - "defaults/users.yaml"
users.yaml is:
accounts::user:
  joed:
    locked: false
    comment: System Operator
    uid: '1700'
    gid: '1700'
    groups:
      - admin
      - sudonopw
    sshkeys:
      - ssh-rsa ...Hw== sysop+moduledevkey#puppetlabs.com
I use the puppetlabs/accounts module.
Nothing in Hiera data itself causes anything to be applied to target nodes. Some kind of declaration is required in a manifest somewhere or in the output of an external node classifier script. Moreover, the puppetlabs/accounts module provides only defined types, not classes. You can store defined-type data in Hiera and read it back, but automated parameter binding via Hiera applies only to classes, not defined types.
In short, then, no user is created (and no error is reported) because no relevant resources are declared into the target node's catalog. You haven't given Puppet anything to do.
If you want to apply the stored user data presented to your nodes, you would want something along these lines:
$user_data = lookup('accounts::user', Hash[String,Hash], 'hash', {})
$user_data.each |$user,$props| {
  accounts::user { $user: * => $props }
}
That would go into the node block matched to your target node, or, better, into a class that is declared by that node block or an equivalent. It's fairly complicated for so few lines, but in brief:
the lookup function looks up key 'accounts::user' in your Hiera data
performing a hash merge of results appearing at different levels of the hierarchy
expecting the result to be a hash with string keys and hash values
and defaulting to an empty hash if no results are found;
the mappings in the result hash are iterated, and for each one, an instance of the accounts::user defined type is declared
using the (outer) hash key as the user name,
and the value associated with that key as a mapping from parameter names to parameter values.
There are a few problems here.
You are missing a line in your hiera.yaml, namely the defaults key. It should be:
---
version: 5
defaults:  ## add this line
  datadir: "/etc/puppetlabs/code/environments"
  data_hash: yaml_data
hierarchy:
  - name: "Per-node data (yaml version)"
    path: "%{::environment}/nodes/%{::trusted.certname}.yaml"
  - name: "Common YAML hierarchy levels"
    paths:
      - "defaults/common.yaml"
      - "defaults/users.yaml"
I detected that using the puppet-syntax gem (included if you use PDK, which is recommended):
▶ bundle exec rake validate
Syntax OK
---> syntax:manifests
---> syntax:templates
---> syntax:hiera:yaml
ERROR: Failed to parse hiera.yaml: (hiera.yaml): mapping values are not allowed in this context at line 3 column 10
Also, in addition to what John mentioned, the simplest class to read in your data would be this:
class test (Hash[String,Hash] $users) {
  create_resources(accounts::user, $users)
}
Or if you want to avoid using create_resources*:
class test (Hash[String,Hash] $users) {
  $users.each |$user,$props| {
    accounts::user { $user: * => $props }
  }
}
Note that I have relied on the Automatic Parameter Lookup feature for that. See the link below.
Then, in your Hiera data, you would have a key named test::users to correspond (class name "test", key name "users"):
---
test::users:  ## Note that this line changed.
  joed:
    locked: false
    comment: System Operator
    uid: '1700'
    gid: '1700'
    groups:
      - admin
      - sudonopw
    sshkeys:
      - ssh-rsa ...Hw== sysop+moduledevkey#puppetlabs.com
Use of automatic parameter lookup is generally the more idiomatic way of writing Puppet code compared to calling the lookup function explicitly.
For more info:
PDK
Automatic Parameter Lookup
create_resources
(*Note that create_resources is "controversial". Many in the Puppet community prefer not to use it.)
I am using a combination of API Blueprint and Dredd to test an API my application is dependent on. I am using attributes in API blueprint to define the structure of the response's body.
Apparently I'm missing something though, because the tests always pass even though I've purposefully defined a fake "required" parameter that I know is missing from the API's response. It seems that Dredd is only testing the type of the response body (array) rather than the type and the parameters within it.
My API Blueprint file:
FORMAT: 1A
HOST: http://somehost.net

# API Title

## Endpoints [GET /endpoint/{date}]

+ Parameters
    + date: `2016-09-01` (string, required) - Date

+ Response 200 (application/json; charset=utf-8)
    + Attributes (array[Data])

## Data Structures

### Data

- realParameter: 2432432 (number)
- realParameter2: `some string` (string, required)
- realParameter3: `Something else` (string, required)
- realParameter4: 1 (number, required)
- fakeParam: 1 (number, required)
The response body:
[
  {
    "realParameter": 31,
    "realParameter2": "some value",
    "realParameter3": "another value",
    "realParameter4": 8908
  },
  {
    "realParameter": 54,
    "realParameter2": "something here",
    "realParameter3": "and here too",
    "realParameter4": 6589
  }
]
And my Dredd config file:
reporter: apiary
custom:
  apiaryApiKey: somekey
  apiaryApiName: somename
dry-run: null
hookfiles: null
language: nodejs
sandbox: false
server: null
server-wait: 3
init: false
names: false
only: []
output: []
header: []
sorted: false
user: null
inline-errors: false
details: false
method: []
color: true
level: info
timestamp: false
silent: false
path: []
blueprint: myApiBlueprintFile.apib
endpoint: 'http://ahost.com'
Does anyone have any idea why Dredd ignores the fact that "fakeParameter" doesn't actually show up in the response body and still allows the test to pass?
You've run into a limitation of MSON, the language API Blueprint uses for describing attributes. In many cases, MSON describes what MAY be present in the data structure rather than what MUST exactly be present.
The most prominent case is arrays, where basically any content of the array is optional, so the underlying generated JSON Schema doesn't put any constraints on array contents. Dredd just respects that, so indirectly it becomes a Dredd issue too; however, there's not much Dredd can do about it.
There's an issue for the problem: apiaryio/mson#66. You can follow and comment on the issue to get updates about this. Dredd is usually very prompt in adopting the latest API Blueprint parser, so once it's implemented in the language itself, it won't take long to appear in Dredd.
An obvious (but tedious) workaround is to specify your own JSON Schema with stricter rules, using the + Schema section alongside the + Attributes section.
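As a rough sketch of that workaround (based on the response and Data structure from the question; the schema body is ordinary JSON Schema and must be indented as a code block under + Schema):

+ Response 200 (application/json; charset=utf-8)
    + Attributes (array[Data])
    + Schema

            {
                "$schema": "http://json-schema.org/draft-04/schema#",
                "type": "array",
                "items": {
                    "type": "object",
                    "required": ["realParameter2", "realParameter3", "realParameter4", "fakeParam"],
                    "properties": {
                        "realParameter":  { "type": "number" },
                        "realParameter2": { "type": "string" },
                        "realParameter3": { "type": "string" },
                        "realParameter4": { "type": "number" },
                        "fakeParam":      { "type": "number" }
                    }
                }
            }

With a schema like this in place, Dredd should fail the test, because fakeParam is required by the schema but missing from the actual response.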
My validation.yml is given below:
task:
    - Email:
        message: The email "{{ value }}" is not a valid email.
    - MinLength: { limit: 50, message: You must be 50 or under to enter. }
My issue is that if I enter "wrong-email" in the task field, it gives two error messages:
The email "wrong-email" is not a valid email.
You must be 50 or under to enter.
Actually, I want to show only one error message at a time.
That means it should check the "MinLength" validation only if the value is a valid email.
Validation sequencing can be done using group sequences. I fixed group sequences for the YAML driver only today, so you might need to wait for the next release of the 2.0 or master branch.
MyEntity:
    group_sequence: [MyEntity, Extra]
    properties:
        task:
            - Email: { message: ... }
            - MinLength: { limit: 50, message: ..., groups: Extra }
Now the constraints in group "Extra" will only be validated if all constraints in group "MyEntity" (i.e. the default group) succeed.
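Adapted to the validation.yml from the question, that might look roughly like this (a sketch; App\Entity\Task is a placeholder for whatever class actually owns the task property, and the first group in the sequence must match that class's short name):

App\Entity\Task:
    group_sequence: [Task, Extra]
    properties:
        task:
            - Email:
                message: The email "{{ value }}" is not a valid email.
            - MinLength:
                limit: 50
                message: You must be 50 or under to enter.
                groups: [Extra]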