How can I validate different schemas based on an enumerated property value with AJV?

I need to validate a JSON document differently depending on the value of one of its properties, specifically an enum property.
This is the JSON document to be validated:
{
"req": {
"user": "",
"company": "",
"dept": "",
"class": "",
"reqType": "account"
}
}
The reqType property can take one of the enumerated values account, dept, or class.
I have tried using anyOf for this, but it does not validate correctly.
For example, I have tried the schema below:
{
"$id": "http://example.com/example.json",
"type": "object",
"definitions": {},
"$schema": "http://json-schema.org/draft-07/schema#",
"properties": {
"req": {
"$id": "/properties/req",
"type": "object",
"properties": {
"user": {
"$id": "/properties/req/properties/user",
"type": "string",
"title": "The User Schema ",
"default": "",
"examples": [
"a"
]
},
"company": {
"$id": "/properties/req/properties/company",
"type": "string",
"title": "The Company Schema ",
"default": "",
"examples": [
"b"
]
},
"dept": {
"$id": "/properties/req/properties/dept",
"type": "string",
"title": "The Dept Schema ",
"default": "",
"examples": [
"c"
]
},
"class": {
"$id": "/properties/req/properties/class",
"type": "string",
"title": "The Class Schema ",
"default": "",
"examples": [
"d"
]
},
"reqType": {
"$id": "/properties/req/properties/reqType",
"type": "string",
"title": "The Reqtype Schema ",
"default": "",
"examples": [
"account"
],
"enum": [
"account",
"dept",
"class"
]
}
},
"required": [
"reqType"
],
"anyOf": [
{
"properties": {
"reqType": {
"enum": [
"account"
]
}
},
"required": [
"user",
"company"
]
},
{
"properties": {
"reqType": {
"enum": [
"dept"
]
}
},
"required": [
"dept"
]
}
]
}
},
"required": [
"req"
]
}
This seems to work fine when the data meets all of the conditions, but in a failing case it reports errors for the other branches as well:
[
{
keyword: 'required',
dataPath: '.req',
schemaPath: '#/properties/req/anyOf/0/required',
params: {
missingProperty: 'user'
},
message: 'should have required property \'user\'',
schema: ['user', 'company'],
parentSchema: {
properties: [Object],
required: [Array]
},
data: {
company: 'b', dept: 'c', class: 'd', reqType: 'account'
}
},
{
keyword: 'enum',
dataPath: '.req.reqType',
schemaPath: '#/properties/req/anyOf/1/properties/reqType/enum',
params: {
allowedValues: [Array]
},
message: 'should be equal to one of the allowed values',
schema: ['dept'],
parentSchema: {
enum: [Array]
},
data: 'account'
},
{
keyword: 'anyOf',
dataPath: '.req',
schemaPath: '#/properties/req/anyOf',
params: {},
message: 'should match some schema in anyOf',
schema: [
[Object],
[Object]
],
parentSchema: {
'$id': '/properties/req',
type: 'object',
properties: [Object],
required: [Array],
anyOf: [Array]
},
data: {
company: 'b', dept: 'c', class: 'd', reqType: 'account'
}
}
]
I expected an error only for the first branch (the missing required property), and the second case should have been irrelevant here. Instead it also reports that it did not get the expected enum value. Am I doing something wrong here?

You're doing it right. The anyOf keyword means that one or more of the given schemas must be valid.
It checks the first and finds that the enum keyword passes, but the required keyword fails. Therefore, this schema fails.
So, it moves on to the next schema and finds that the enum keyword fails and the required keyword passes. Therefore, this schema fails too.
anyOf did not find a valid schema, so it fails and reports that neither schema passes validation.
You see, anyOf doesn't know that enum has a special meaning in this schema. Both schemas have one keyword that passes and one that fails. To anyOf, they are the same.
Here is an alternative that can give you slightly better error messaging. The error messages still end up being quite cryptic, but they are focused where the problem really is.
{
"type": "object",
"properties": {
"req": {
"type": "object",
"properties": {
"reqType": { "enum": ["account", "dept", "class"] }
},
"required": ["reqType"],
"allOf": [
{
"anyOf": [
{
"not": {
"properties": {
"reqType": { "enum": ["account"] }
}
}
},
{ "required": ["user", "company"] }
]
},
{
"anyOf": [
{
"not": {
"properties": {
"reqType": { "enum": ["dept"] }
}
}
},
{ "required": ["dept"] }
]
}
]
}
},
"required": ["req"]
}
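Since the schema already declares draft-07, the same implication can also be written with if/then, which reads a little less cryptically. A sketch of the account branch only (the dept branch follows the same pattern):
{
  "if": {
    "properties": { "reqType": { "const": "account" } },
    "required": ["reqType"]
  },
  "then": { "required": ["user", "company"] }
}
And here is a minimal Node sketch for exercising either variant, assuming AJV v6/v7 (the versions whose errors carry dataPath, as in the output shown in the question):
// `schema` is the alternative schema shown above
const Ajv = require("ajv");
const ajv = new Ajv({ allErrors: true });
const validate = ajv.compile(schema);

const data = {
  req: { company: "b", dept: "c", class: "d", reqType: "account" }
};

if (!validate(data)) {
  // the errors now point at the branch whose implication actually failed
  console.log(validate.errors);
}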

Related

Logicapp Expression to read Dynamic Json path - read child element where parent path may change but hierarchy remaining same

Hope all is well.
I need to create a Logic App expression that reads a child element in a JSON document where the element name and hierarchy stay the same but the parent element names can change.
For example, JSON-1:
{
"root": {
"abc1": {
"abc2": [
{
"element": "value1",
"element2": "value"
},
{
"element": "value2",
"element2": "valu2"
}
]
}
}
}
JSON-2:
{
"root": {
"xyz1": {
"xyz2": [
{
"element": "value1",
"element2": "value"
},
{
"element": "value2",
"element2": "valu2"
}
]
}
}
}
I have tried these, but with no luck:
approach-1: #{body('previous-action')?['']?['']?['element']
approach-2: #{body('previous-action')???['element']
Please let me know if anyone has encountered this situation. Many thanks in advance.
I tend to find that converting the JSON to XML (at least in your case) is the simplest solution. Then, when you've done that, you can use XPath to simply make your selection.
Flow
In basic terms ...
I've defined a variable of type object that contains your JSON.
I then convert that JSON object to XML using this expression xml(variables('JSON Object'))
Next, I initialize a variable called Elements of type array (given you have multiple of them). The expression for setting that variable is where the smarts come in. That expression is xpath(xml(variables('XML')), '//element/text()'), and it gets the inner text of all element nodes in the XML.
Finally, loop through the results.
If you needed to take it up a level and also get the second element, you'd need to make your XPath query a lot more generic so it picks up the element2 nodes (and 3, 4, 5, etc. if they existed) in each array as well; see the sketch after the note below.
Note: I've stuck to your specific question of looking for element.
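For example, here is an untested sketch of a more generic query, assuming the XML shape produced by the conversion above (a single root element with the array items two levels below it), that grabs the text of every leaf node regardless of its name:
xpath(xml(variables('XML')), '/root/*/*/*/text()')
Because it never names the parents, the same expression works for both JSON-1 and JSON-2.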
Result
This definition (which can be loaded directly into your tenant) demonstrates the thinking ...
{
"definition": {
"$schema": "https://schema.management.azure.com/providers/Microsoft.Logic/schemas/2016-06-01/workflowdefinition.json#",
"actions": {
"For_Each_Element": {
"actions": {
"Set_Element": {
"inputs": {
"name": "Element",
"value": "#{item()}"
},
"runAfter": {},
"type": "SetVariable"
}
},
"foreach": "#variables('Elements')",
"runAfter": {
"Initialize_Element": [
"Succeeded"
]
},
"type": "Foreach"
},
"Initialize_Element": {
"inputs": {
"variables": [
{
"name": "Element",
"type": "string"
}
]
},
"runAfter": {
"Initialize_Elements": [
"Succeeded"
]
},
"type": "InitializeVariable"
},
"Initialize_Elements": {
"inputs": {
"variables": [
{
"name": "Elements",
"type": "array",
"value": "#xpath(xml(variables('XML')), '//element/text()')"
}
]
},
"runAfter": {
"Initialize_XML": [
"Succeeded"
]
},
"type": "InitializeVariable"
},
"Initialize_JSON_Object": {
"inputs": {
"variables": [
{
"name": "JSON Object",
"type": "object",
"value": {
"root": {
"abc1": {
"abc2": [
{
"element": "value1",
"element2": "value"
},
{
"element": "value2",
"element2": "valu2"
}
]
}
}
}
}
]
},
"runAfter": {},
"type": "InitializeVariable"
},
"Initialize_XML": {
"inputs": {
"variables": [
{
"name": "XML",
"type": "string",
"value": "#{xml(variables('JSON Object'))}"
}
]
},
"runAfter": {
"Initialize_JSON_Object": [
"Succeeded"
]
},
"type": "InitializeVariable"
}
},
"contentVersion": "1.0.0.0",
"outputs": {},
"parameters": {
"ParameterTest1": {
"defaultValue": "\"\"",
"type": "String"
}
},
"triggers": {
"manual": {
"inputs": {
"method": "GET",
"schema": {}
},
"kind": "Http",
"type": "Request"
}
}
},
"parameters": {}
}

Any idea how to do custom supportedCookingModes in Alexa discovery?

I'm trying to return a Discovery Response, but supportedCookingModes only seems to accept standard values, and only in the format ["OFF","BAKE"], not custom values as indicated by the documentation. Any idea how to specify custom values?
{
"event": {
"header": {
"namespace": "Alexa.Discovery",
"name": "Discover.Response",
"payloadVersion": "3",
"messageId": "asdf"
},
"payload": {
"endpoints": [
{
"endpointId": "asdf",
"capabilities": [
{
"type": "AlexaInterface",
"interface": "Alexa.Cooking",
"version": "3",
"properties": {
"supported": [
{
"name": "cookingMode"
}
],
"proactivelyReported": true,
"retrievable": true,
"nonControllable": false
},
"configuration": {
"supportsRemoteStart": true,
"supportedCookingModes": [
{
"value": "OFF"
},
{
"value": "BAKE"
},
{
"value": "CUSTOM",
"customName": "FANCY_NANCY_MODE"
}
]
}
}
]
}
]
}
}
}
Custom cooking modes are brand-specific, and this functionality is not yet publicly available. I recommend choosing one of the existing cooking modes:
https://developer.amazon.com/en-US/docs/alexa/device-apis/cooking-property-schemas.html#cooking-mode-values
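Until custom modes become available, a discovery response that sticks to the documented standard values should be accepted. Here is a trimmed sketch of just the configuration block from the question (the rest of the payload stays the same), using the plain-string format you observed working:
"configuration": {
  "supportsRemoteStart": true,
  "supportedCookingModes": ["OFF", "BAKE"]
}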

Elastic Search Wildcard query with space failing 7.11

My data is indexed in Elasticsearch 7.11. This is the mapping I got when I added documents directly to my index:
{"properties":{"name":{"type":"text","fields":{"keyword":{"type":"keyword","ignore_above":256}}}}}
I haven't added the keyword part, and I have no idea where it came from.
I am running a wildcard query on this field, but I am unable to get results for terms with spaces:
{
"query": {
"bool":{
"should":[
{"wildcard": {"name":"*hello world*"}}
]
}
}
}
I have seen many answers related to not_analyzed, and I have tried updating {"index": "true"} in the mapping, but with no help. How can I make the wildcard search work in this version of Elasticsearch?
I tried changing the field to the wildcard type:
PUT http://localhost:9001/indexname/_mapping
{
"properties": {
"name": {
"type" :"wildcard"
}
}
}
And got the following response:
{
"error": {
"root_cause": [
{
"type": "illegal_argument_exception",
"reason": "mapper [name] cannot be changed from type [text] to [wildcard]"
}
],
"type": "illegal_argument_exception",
"reason": "mapper [name] cannot be changed from type [text] to [wildcard]"
},
"status": 400
}
Here is a sample document that should match:
{
"_index": "accelerators",
"_type": "_doc",
"_id": "602ec047a70f7f30bcf75dec",
"_score": 1.0,
"_source": {
"acc_id": "602ec047a70f7f30bcf75dec",
"name": "hello world example",
"type": "Accelerator",
"description": "khdkhfk ldsjl klsdkl",
"teamMembers": [
{
"userId": "karthik.r#gmail.com",
"name": "Karthik Ganesh R",
"shortName": "KR",
"isOwner": true
},
{
"userId": "anand.sajan#gmail.com",
"name": "Anand Sajan",
"shortName": "AS",
"isOwner": false
}
],
"sectorObj": [
{
"item_id": 14,
"item_text": "Cross-sector"
}
],
"geographyObj": [
{
"item_id": 4,
"item_text": "Global"
}
],
"technologyObj": [
{
"item_id": 1,
"item_text": "Artificial Intelligence"
}
],
"themeColor": 1,
"mainImage": "assets/images/Graphics/Asset 35.svg",
"features": [
{
"name": "Ideation",
"icon": "Asset 1007.svg"
},
{
"name": "Innovation",
"icon": "Asset 1044.svg"
},
{
"name": "Strategy",
"icon": "Asset 1129.svg"
},
{
"name": "Intuitive",
"icon": "Asset 964.svg"
}
],
"logo": {
"actualFileName": "",
"fileExtension": "",
"fileName": "",
"fileSize": 0,
"fileUrl": ""
},
"customLogo": {
"logoColor": "#B9241C",
"logoText": "EC",
"logoTextColor": "#F6F6FA"
},
"collaborators": [
{
"userId": "muhammed.arif#gmail.com",
"name": "muhammed Arif P T",
"shortName": "MA"
},
{
"userId": "anand.sajan#gmail.com",
"name": "Anand Sajan",
"shortName": "AS"
}
],
"created_date": "2021-02-18T19:30:15.238000Z",
"modified_date": "2021-03-11T11:45:49.583000Z"
}
}
You cannot modify a field mapping once it has been created. (The keyword sub-field you're seeing comes from Elasticsearch's default dynamic mapping for string values.) However, you can add another sub-field of type wildcard, like this:
PUT http://localhost:9001/indexname/_mapping
{
"properties": {
"name": {
"type": "text",
"fields": {
"wildcard": {
"type" :"wildcard"
},
"keyword": {
"type" :"keyword",
"ignore_above":256
}
}
}
}
}
When the mapping is updated, you need to reindex your data so that the new sub-field gets indexed; an empty _update_by_query re-indexes every document in place, like this:
POST http://localhost:9001/indexname/_update_by_query
And then when this finishes, you'll be able to query on this new field like this:
{
"query": {
"bool": {
"should": [
{
"wildcard": {
"name.wildcard": "*hello world*"
}
}
]
}
}
}
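If you want to confirm the new sub-field actually made it into the mapping before querying, you can read the mapping back from the same endpoint used above:
GET http://localhost:9001/indexname/_mapping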

Customizing the Oracle ORDS generated Swagger documentation

I'm writing a REST API using Oracle ORDS.
ORDS generates Swagger 2.0 API documentation at a predefined URL.
I cannot find how to add custom information, like text for the endpoint description or the name and schema of the object returned from an endpoint.
Does anyone here know how to adjust the ORDS generated Swagger documentation?
We recently enhanced ORDS so that you can inject custom comments into the Swagger-style OpenAPI docs.
From the New Features in 18.4.0 release notes:
ENH:28028432 - Echo p_comments value into generated Swagger documentation
Here's an example.
Defining my POST:
BEGIN
ORDS.DEFINE_HANDLER(
p_module_name => 'EXAMPLES',
p_pattern => 'id/',
p_method => 'POST',
p_source_type => 'plsql/block',
p_items_per_page => 0,
p_mimes_allowed => 'application/json',
p_comments => 'This is a bad example, has no error handling',
p_source =>
'begin
insert into identity_table (words) values (:words);
commit;
end;'
);
COMMIT;
END;
/
Now if I go to the OpenAPI endpoint for my module, you can see the Description text for the handler has been 'injected' into the service documentation.
"This is a bad example, has no error handling" -- it's a free text field, so you can basically put anything you want there.
{
"swagger": "2.0",
"info": {
"title": "ORDS generated API for EXAMPLES",
"version": "1.0.0"
},
"host": "localhost:8080",
"basePath": "/ords/pdb2/jeff/examples",
"schemes": [
"http"
],
"produces": [
"application/json"
],
"paths": {
"/id/": {
"get": {
"description": "Retrieve records from EXAMPLES",
"produces": [
"application/json"
],
"responses": {
"200": {
"description": "The queried record.",
"schema": {
"type": "object",
"properties": {
"ID": {
"$ref": "#/definitions/NUMBER"
},
"WORDS": {
"$ref": "#/definitions/VARCHAR2"
}
}
}
}
},
"parameters": []
},
"post": {
"description": "This is a bad example, has no error handling",
"responses": {
"201": {
"description": "The successfully created record.",
"schema": {
"type": "object",
"properties": {}
}
}
},
"consumes": [
"application/json"
],
"parameters": [
{
"name": "payload",
"in": "body",
"required": true,
"schema": {
"$ref": "#/definitions/EXAMPLES_ITEM"
}
}
]
}
},
"/id/{pk}": {
"get": {
"description": "Retrieve records from EXAMPLES",
"produces": [
"application/json"
],
"responses": {
"200": {
"description": "The queried record.",
"schema": {
"type": "object",
"properties": {
"ID": {
"$ref": "#/definitions/NUMBER"
},
"WORDS": {
"$ref": "#/definitions/VARCHAR2"
}
}
}
}
},
"parameters": [
{
"name": "pk",
"in": "path",
"required": true,
"type": "string",
"description": "implicit",
"pattern": "^[^/]+$"
}
]
}
}
},
"definitions": {
"NUMBER": {
"type": "number"
},
"VARCHAR2": {
"type": "string"
},
"EXAMPLES_ITEM": {
"properties": {
"words": {
"type": "string"
}
}
}
}
}
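The same works for the generated GET descriptions ("Retrieve records from EXAMPLES" above). As a hypothetical sketch (the comment text and source below are my assumptions, not part of the original module), redefining the handler with a p_comments value replaces that generated text in the same way:
BEGIN
  ORDS.DEFINE_HANDLER(
    p_module_name => 'EXAMPLES',
    p_pattern     => 'id/',
    p_method      => 'GET',
    p_source_type => 'json/collection',
    -- free text; echoed into the generated Swagger documentation
    p_comments    => 'Returns every row in IDENTITY_TABLE',
    p_source      => 'select * from identity_table'
  );
  COMMIT;
END;
/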

How to force object key name in array

I am using YAML to mark up some formulas and using JSON schema to provide a reference schema.
An example of the YAML might be:
formula: # equates to '5 + (3 - 2)'
add:
- 5
- subtract: [3, 2]
While I have figured out how to make the immediate child object of the formula ("add" in this example) have the right key name and type (using a "oneOf" array of "required"s), I am not sure how to ensure that objects inside an array ("subtract") likewise use specific key names.
So far, I can ensure the type using the following. But with this method, as long as the object matches the subtract type, any key name is allowed; it is not restricted to subtract:
"definitions: {
"add": {
"type": "array",
"minItems": 2,
"items": {
"anyOf": [
{ "$ref": "#/definitions/value"}, # value type is an integer which allows for the shown scalar array elements
{ "$ref": "#/definitions/subtract" }
// other operation types
]
}
},
"subtract": {
"type": "array",
"minItems": 2,
"maxItems": 2,
"items": {
"anyOf": [
{ "$ref": "#/definitions/value"},
{ "$ref": "#/definitions/add" }
// other operation types
]
}
}
// other operation types
}
How can I introduce a restriction such that the keys of objects in the array match specific names, while still also allowing scalar elements?
It sounds like what you want is recursive references.
By creating a new definition that is an anyOf of the operations and the value type, and having the operations' items reference back to that new definition, you get recursive references.
"definitions: {
"add": {
"type": "array",
"minItems": 2,
"items": { "$ref": "#/definitions/operations_or_values"},
},
"subtract": {
"type": "array",
"minItems": 2,
"maxItems": 2,
"items": { "$ref": "#/definitions/operations_or_values"},
}
// other operation types
"operations_or_values": {
"anyOf": [
{ "$ref": "#definitions/add" },
{ "$ref": "#definitions/subtract" },
{ "$ref": "#definitions/value" }, # value type is an integer which allows for the shown scalar array elements
{ "$ref": "#definitions/[OTHERS]" },
]
}
}
I haven't had time to test this, but I believe it will be, or be close to, what you need. Let me know if it doesn't work; I may not have fully understood the question.
What a fascinating problem! This remarkably concise schema can validate any expression of this form.
{
"type": ["object", "number"],
"propertyNames": { "enum": ["add", "subtract", "multiply", "divide"] },
"patternProperties": {
".*": {
"type": "array",
"minItems": 2,
"items": { "$ref": "#" }
}
}
}
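For example, here is my own JSON rendering of the YAML at the top of the question; it validates against that schema. A number terminates the recursion, and propertyNames rejects any object key that is not one of the four operation names:
{
  "add": [
    5,
    { "subtract": [3, 2] }
  ]
}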
So what I ended up doing was extending the idea I had already used, the "oneOf" array of "required"s, by adding an "anyOf".
Thus, an operator schema is now:
"definitions": {
"add": {
"type": "array",
"minItems": 2,
"items": {
"anyOf": [
{ "$ref": "#/definitions/literal" }, // equates to a simple YAML scalar
{ "$ref": "#/definitions/constant" },
{ "$ref": "#/definitions/variable" },
{
"type": "object",
"oneOf": [
{ "required": ["add"] },
{ "required": ["subtract"] }
// more operator names
],
"properties": {
"add": { "$ref": "#/definitions/add" },
"subtract": { "$ref": "#/definitions/subtract" }
// more operator type references
}
}
]
}
},
// more definitions
}
This can be refactored to something that applies more easily across different operators like so:
"definitions": {
"operands": {
"literal": { "type": "number" }, // equates to a simple YAML scalar
"constant": {
"type": "object",
"properties": {
"value": { "type": "number" }
},
"required": [ "value" ]
},
"variable": {
"type": "object",
"properties": {
"name": { type": "string" },
"type": { "type": "string" }
},
"required": [ "name", "type" ]
}
},
"operators": {
"add": {
"type": "array",
"minItems": 2,
"items": { "$ref": "#/definitions/anyOperandsOrOperators" }
},
"subtract": {
"type": "array",
"minItems": 2,
"maxItems": 2,
"items": { "$ref": "#/definitions/anyOperandsOrOperators" }
}
// more operator types
},
"anyOperator": {
"type": "object",
"oneOf": [
{ "required": ["add"] },
{ "required": ["subtract"] }
// more operator names
],
"properties": {
"add": { "$ref": "#/definitions/operators/add" },
"subtract": { "$ref": "#/definitions/operators/subtract" }
// more operator type references
}
},
"anyOperandsOrOperators":
{
"anyOf": [
{ "$ref": "#/definitions/operands/literal" },
{ "$ref": "#/definitions/operands/constant" },
{ "$ref": "#/definitions/operands/variable" },
{ "$ref": "#/definitions/anyOperator"}
]
}
}
And this means the YAML for an operator can look as follows (flow style):
add: [ 5, subtract: [ *constantA, *variableB ] ]
Here add and subtract are mappings with specific key names, 5 is a scalar, and *constantA and *variableB are aliases to operand nodes.
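For completeness, here is a block-style sketch of a formula the refactored schema accepts, assuming a top-level formula property that points at anyOperandsOrOperators, with the aliases replaced by literal operand nodes shaped like the constant and variable definitions:
formula:
  add:
    - 5
    - subtract:
        - value: 3        # a 'constant' operand
        - name: x         # a 'variable' operand
          type: integer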
