API Blueprint Data Structures Enum Output to Schema

I have defined an enum in Data Structures and then compiled with aglio. The output schema somehow adds a null. How can I get rid of that null?
```
## StatusType (enum[string])
+ Verified
+ Unverified
+ VerifiedLikely
+ Invalid
```
But this is what the generated schema contains after compiling with aglio:

```json
"status": {
  "enum": [
    null,
    "Verified",
    "Unverified",
    "VerifiedLikely",
    "Invalid"
  ]
}
```


How to handle where clause in GraphQL Schema

I am new to GraphQL and am creating an API server using Flask and GraphQL. I am facing some issues handling the "where" clause in a GraphQL request. The basic request and response work fine. Please find a short snippet of the schema I have designed:
```
type data {
  edges: [data_edges]
}

type QueryCustom {
  data: data
}

type Query {
  query: QueryCustom
}
```
The basic request below (without the where clause) works fine with this schema:
```
query {
  query {
    data {
      edges { .... }
    }
  }
}
```
But I get an error when I execute the request with the where clause:
```
query dataClosingSoon($month: Long) {
  query {
    data(where: { LastModifiedDate: { CALENDAR_MONTH: { value: { eq: $month } } } }) {
      edges { ....... }
    }
  }
}
```
Following is the response I get:
```json
{
  "errors": [
    {
      "locations": [
        {
          "column": 40,
          "line": 1
        }
      ],
      "message": "Unknown type 'Long'."
    },
    {
      "locations": [
        {
          "column": 9,
          "line": 5
        }
      ],
      "message": "Unknown argument 'where' on field 'QueryCustom.data'."
    }
  ]
}
```
I need to understand how to handle the where condition.
GraphQL is not SQL. You cannot use SQL clauses such as WHERE, LIKE, etc., in a GraphQL query.
You need to look at the schema to check how you can filter your query. These filters are pre-defined in the schema. You cannot create custom filters (at least in a basic sense) in a GraphQL query.
Edit:
If you want to use the query you are trying to send, your schema should look something like this:
```
type data {
  edges: [data_edges]
}

type Query {
  data(where: Filter!): data
}

input Filter {
  lastModifiedDate: # the type of this field
  # Rest of the input fields
}
```
Note that your first query and your second query are totally different. Your second query is clearly wrong for two reasons:
- The Query type does not have a field called data; it only has one field, called query. (I wouldn't add a field named query under the Query type, though.)
- Your data field does not take any arguments, but your document (the GraphQL request) clearly passes one.

fetch attribute type from terraform provider schema

I am trying to find a way to fetch the attribute type of a resource/data source from a Terraform provider's schema (I am currently using GCP, but will be extending to pretty much all providers).
My current flow of setup:
1. I run terraform providers schema -json to fetch the provider's schema.
2. This generates a huge JSON file with the schema structure of the provider.
   ref:
   How to get that list of all the available Terraform Resource types for Google Cloud?
   https://developer.hashicorp.com/terraform/cli/commands/providers/schema
3. From this I am trying to fetch the type of each attribute, e.g. below:
```json
"google_cloud_run_service": {
  "version": 1,
  "block": {
    "attributes": {
      "autogenerate_revision_name": {
        "type": "bool",
        "description_kind": "plain",
        "optional": true
      },
```
4. My end goal is to generate variables.tf from the above schema, for all resources and all attributes supported in each resource, along with the type constraint.
   ref: https://developer.hashicorp.com/terraform/language/values/variables
5. I already got some help on how to generate that.
   ref: Get the type of value using cty in hclwrite
6. Now the challenge is to work with complex structures like the one below, which is one of the attributes of "google_cloud_run_service":
```json
"status": {
  "type": [
    "list",
    [
      "object",
      {
        "conditions": [
          "list",
          [
            "object",
            {
              "message": "string",
              "reason": "string",
              "status": "string",
              "type": "string"
            }
          ]
        ],
        "latest_created_revision_name": "string",
        "latest_ready_revision_name": "string",
        "observed_generation": "number",
        "url": "string"
      }
    ]
  ],
  "description": "The current status of the Service.",
  "description_kind": "plain",
  "computed": true
}
```
7. So, based on the above complex structure type, I want to generate the variables.tf file for this kind of attribute using the code sample from point #5. The desired output should look something like the below in variables.tf:
```hcl
variable "autogenerate_revision_name" {
  type        = string
  default     = ""
  description = "Sample description"
}

variable "status" {
  type = list(object({
    conditions = list(object({
      message = string
      reason  = string
      status  = string
      type    = string
    }))
    latest_created_revision_name = string
    latest_ready_revision_name   = string
    observed_generation          = number
    url                          = string
  }))
  default = "default values in the above type format"
}
```
The above was manually written, so it might not exactly align with the schema, but I hope it makes clear what I am trying to achieve.
The first variable in the above code is from the example in point #3, which is easy to generate; but the second example, in point #6, is a complex type constraint, and I am seeking help to get it generated.
Is this possible to generate using the helper schema SDK (https://pkg.go.dev/github.com/hashicorp/terraform-plugin-sdk/v2#v2.24.0/helper/schema), along with the code example given in point #5?
Summary: I am generating the JSON schema of a Terraform provider using terraform providers schema -json, reading that JSON file, and generating HCL code for each resource, but I am stuck generating type constraints for the attributes/variables, hence seeking help on this.
Any sort of help is really appreciated, as I have been stuck on this for quite a while.
If you've come this far, then I thank you for reading such a lengthy question, and any sort of pointers are welcome.
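Before reaching for the Go SDK, the conversion itself can be prototyped straight from the JSON output. A minimal sketch in Python (the function name to_hcl_type is my own, not from any SDK), assuming the encoding shown above: primitives as plain strings, collections and objects as ["kind", element] pairs:

```python
import json

def to_hcl_type(t):
    """Convert a `terraform providers schema -json` type constraint
    into an HCL type expression string for variables.tf."""
    if isinstance(t, str):          # primitives: "string", "bool", "number"
        return t
    kind, arg = t                   # e.g. ["list", <element>] or ["object", {...}]
    if kind in ("list", "set", "map"):
        return f"{kind}({to_hcl_type(arg)})"
    if kind == "object":
        attrs = ", ".join(f"{k} = {to_hcl_type(v)}" for k, v in arg.items())
        return f"object({{ {attrs} }})"
    raise ValueError(f"unsupported type constraint: {kind}")

# A trimmed version of the "status" attribute type from the question.
status_type = json.loads("""
["list", ["object", {
  "conditions": ["list", ["object", {"message": "string", "status": "string"}]],
  "url": "string"
}]]
""")
print(f"type = {to_hcl_type(status_type)}")
```

The emitted string can then be spliced into a `variable` block via hclwrite or plain templating, and normalized with `terraform fmt`.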

Jackson deserialization with Spring Boot: get the field names present in the request along with their field mapping

I have a requirement to throw a different error in each of the scenarios below, and there are many such fields, not just one. E.g.:
```json
{
  "id": 1,
  "name": "nameWithSpecialChar$"
}
```
Here it should throw an error for the special character.
```json
{
  "id": 1,
  "name": null
}
```
Here it should throw a null-field error.
```json
{
  "id": 1
}
```
Here it should throw a missing-field error.
Handling the 1st and 2nd scenarios is easy, but for the 3rd one, is there any way to get a list of the names of the fields that were passed in the input JSON, at deserialization time, with Jackson?
One way I am able to do it is by mapping the request to a JsonNode, checking whether nodes are present for the required fields, then deserializing that JsonNode manually and validating the rest of the members, as below:
```java
public ResponseEntity<?> myGetRequest(@RequestBody JsonNode requestJsonNode) {
    if (!requestJsonNode.has("name")) {
        // throw some error
    }
    MyRequest request = objectMapper.convertValue(requestJsonNode, MyRequest.class);
    validateIfFieldsAreInvalid(request);
    // ...
}
```
But I do not like this approach, is there any other way of doing it?
You can define a JSON schema and validate your object against it. In your example, your schema may look like this:
```json
{
  "$schema": "https://json-schema.org/draft/2020-12/schema",
  "type": "object",
  "properties": {
    "id": {
      "description": "The identifier",
      "type": "integer"
    },
    "name": {
      "description": "The item name",
      "type": "string",
      "pattern": "^[a-zA-Z]*$"
    }
  },
  "required": [ "id", "name" ]
}
```
To validate your object, you could use the json-schema-validator library. This library is built on Jackson. Since you're using Spring Boot anyway, you already have Jackson imported.
The example code looks more or less like this:
```java
String schema = "<define your schema here>";
String data = "<put your data here>";

JsonSchemaFactory factory = JsonSchemaFactory.byDefault();
ObjectMapper m = new ObjectMapper();
JsonSchema jsonSchema = factory.getJsonSchema(m.readTree(schema));
JsonNode json = m.readTree(data);

ProcessingReport report = jsonSchema.validate(json);
System.out.println(report);
```
The report includes detailed errors for the different input cases. For example, with this input:
```json
{
  "id": 1,
  "name": "nameWithSpecialChar$"
}
```
the following output is printed:
```
--- BEGIN MESSAGES ---
error: ECMA 262 regex "^[a-zA-Z]*$" does not match input string "nameWithSpecialChar$"
    level: "error"
    schema: {"loadingURI":"#","pointer":"/properties/name"}
    instance: {"pointer":"/name"}
    domain: "validation"
    keyword: "pattern"
    regex: "^[a-zA-Z]*$"
    string: "nameWithSpecialChar$"
--- END MESSAGES ---
```
Or, instead of just printing out the report, you can loop through all the errors and apply your own logic:
```java
for (ProcessingMessage message : report) {
    // Add your logic here
}
```
You could check the example code to gain more information about how to use the library.
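The three scenarios the question distinguishes (invalid characters, explicit null, missing field) can also be sketched directly; Python is used here purely for brevity, and the field name and pattern are taken from the schema above, not from any library:

```python
import re

# Letters only, matching the "pattern" in the JSON schema above.
NAME_PATTERN = re.compile(r"^[a-zA-Z]*$")

def check_name(payload: dict) -> str:
    """Return which of the question's three error cases applies."""
    if "name" not in payload:
        return "field missing"       # key absent from the input JSON
    if payload["name"] is None:
        return "field null"          # key present but explicitly null
    if not NAME_PATTERN.fullmatch(payload["name"]):
        return "special character"   # value fails the pattern
    return "ok"
```

A schema validator's report gives you the same distinction via its error keywords (required vs. type vs. pattern), so you can map each keyword to your specific exception.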

Publishing Avro messages using Kafka REST Proxy throws "Conversion of JSON to Avro failed"

I am trying to publish a message which has a union for one field, as:
```json
{
  "name": "somefield",
  "type": [
    "null",
    {
      "type": "array",
      "items": {
        "type": "record",
```
Publishing the message using the Kafka REST Proxy keeps throwing the following error when somefield has a populated array:
```json
{
  "error_code": 42203,
  "message": "Conversion of JSON to Avro failed: Failed to convert JSON to Avro: Expected start-union. Got START_ARRAY"
}
```
The same schema with somefield: null works fine.
The Java classes are built in the Spring Boot project from the Avro schemas using the Gradle plugin. When I use the generated Java classes and publish a message with the array populated via the Spring KafkaTemplate, the message is published correctly with the correct schema (the schema is taken from the generated Avro specific record). When I copy the same JSON value and schema and publish via the REST Proxy, it fails with the above error.
I have these content types in the API call:
```
accept: application/vnd.kafka.v2+json, application/vnd.kafka+json, application/json
content-type: application/vnd.kafka.avro.v2+json
```
What am I missing here? Any pointers to troubleshoot the issue are appreciated.
The messages I tested were:
```json
{
  "somefield": null
}
```
and
```json
{
  "somefield": [
    {"field1": "hello"}
  ]
}
```
However, the populated array should instead be passed as:
```json
{
  "somefield": {
    "array": [
      {"field1": "hello"}
    ]
  }
}
```
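The underlying rule is Avro's JSON encoding: a non-null union value must be wrapped in a single-key object naming the union branch, while null is passed bare. A minimal sketch (the helper name wrap_union is mine, not a library function):

```python
def wrap_union(value, branch):
    """Encode a value for an Avro union field in Avro's JSON encoding.

    Non-null union values are wrapped in an object keyed by the branch
    name (e.g. "array", "string", or a record/enum full name); null is
    passed through unwrapped.
    """
    return value if value is None else {branch: value}

encoded = wrap_union([{"field1": "hello"}], "array")
# encoded == {"array": [{"field1": "hello"}]}
```

This is why the Java/KafkaTemplate path works while the REST Proxy path fails: the serializer built from the generated classes never goes through the JSON encoding, but the REST Proxy's JSON-to-Avro conversion does.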

Issue while including enum type in unions within avro schema

I am working with Apache Kafka to send messages to Kafka topics. I am trying to use unions in Avro schemas, including enum types, for message validation, but I am facing an issue with the usage of enum types within a union. I am using the Kafka REST API through the Postman tool to post a record/message to a topic with schema validation. Below is the request payload, including the schema and records inline:
```json
{
  "key_schema": "{\"type\": \"record\", \"name\": \"key\", \"fields\": [{\"name\": \"keyInput\", \"type\": \"string\"}]}",
  "value_schema": "{\"type\": \"record\", \"name\": \"value\", \"fields\": [{\"name\": \"valueInput1\", \"type\": \"string\"},{\"name\": \"valueInput2\",\"type\":[{\"type\":\"enum\",\"name\":\"actorobjType\",\"symbols\":[\"Agent\",\"Group\"]},\"null\"],\"default\":null}]}",
  "records": [
    {
      "key": {
        "keyInput": "testUser-key"
      },
      "value": {
        "valueInput1": "testUser-value",
        "valueInput2": "Agent"
      }
    }
  ]
}
```
I am getting the following error when trying to insert a record using the above request payload:
```json
{
  "error_code": 42203,
  "message": "Conversion of JSON to Avro failed: Failed to convert JSON to Avro: Expected start-union. Got VALUE_STRING"
}
```
After searching different sites, including Stack Overflow, I came across a suggestion to explicitly specify the type while passing the record, as below:
```json
{
  "key_schema": "{\"type\": \"record\", \"name\": \"key\", \"fields\": [{\"name\": \"keyInput\", \"type\": \"string\"}]}",
  "value_schema": "{\"type\": \"record\", \"name\": \"value\", \"fields\": [{\"name\": \"valueInput1\", \"type\": \"string\"},{\"name\": \"valueInput2\",\"type\":[{\"type\":\"enum\",\"name\":\"actorobjType\",\"symbols\":[\"Agent\",\"Group\"]},\"null\"],\"default\":null}]}",
  "records": [
    {
      "key": {
        "keyInput": "testUser-key"
      },
      "value": {
        "valueInput1": "testUser-value",
        "valueInput2": {
          "enum": "Agent"
        }
      }
    }
  ]
}
```
But then I face the below error:
```json
{
  "error_code": 42203,
  "message": "Conversion of JSON to Avro failed: Failed to convert JSON to Avro: Unknown union branch enum"
}
```
The same suggestion worked fine for unions with other types like string and map, but with unions including an enum it does not seem to work.
I also thought there might be some other key which needs to be used for the enum, hence I tried some other words, like below:
```json
"valueInput2": {
  "string": "Agent"
}
```
and
```json
"valueInput2": {
  "enumeration": "Agent"
}
```
But none of them seem to work. Please help me resolve this.
I ended up here, and davis michael's answer gave a hint, which helped me eventually figure it out.
Within the context of the question,
```json
"valueInput2": {
  "actorobjType": "Agent"
}
```
As the ENUM type does not exist in JSON, the union branch key has to be the enum's full name: namespace + type name.
In your case the schema declares no namespace, so the key is just the enum name, actorobjType, as shown above.
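Putting it together, the corrected records payload for the REST Proxy can be built like this (a sketch; since the value_schema above declares no namespace, the branch key is just the enum name actorobjType):

```python
import json

# The union branch key for an enum is the enum's full name; with no
# namespace in the schema, that is simply "actorobjType".
records = [{
    "key": {"keyInput": "testUser-key"},
    "value": {
        "valueInput1": "testUser-value",
        "valueInput2": {"actorobjType": "Agent"},
    },
}]
body = json.dumps({"records": records})
```

Had the schema declared, say, "namespace": "com.example", the branch key would instead be the full name "com.example.actorobjType".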
