Input validation with directives - graphql

I'm currently working on a server using GraphQL and I'm stuck on implementing input validation with directives. What I'm trying to do is to add directives to input types which allow me to verify the input given before passing it to the actual data fetcher.
schema:
directive @email on INPUT_FIELD_DEFINITION

type Mutation {
    getAccounts(filter: InputAccount): [Account]
}

input InputAccount {
    email: String @email
}
I've done the wiring and schema-building part and the mutation works, but I'm having problems implementing the directive logic that actually validates an email (e.g. the email has to contain "@gmail.com").

Input validation through directives does not work out of the box. I suggest two ways of solving this.
Without an external library - the resolver is in charge of the validation
This is the most basic solution. Your data fetcher (resolver) must return a DataFetcherResult, which may contain one or more GraphQLErrors. In the data fetcher you can implement your validation, populate the DataFetcherResult with GraphQLErrors and, if no error is found, perform your mutation.
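As a minimal sketch of that approach with graphql-java (the getAccounts field and the gmail.com rule are taken from the question; the account lookup itself is just a placeholder):
import graphql.GraphqlErrorBuilder;
import graphql.execution.DataFetcherResult;
import graphql.schema.DataFetcher;
import graphql.schema.DataFetchingEnvironment;
import java.util.List;
import java.util.Map;

// Data fetcher that validates the "filter" input and returns either data or a GraphQLError.
public class GetAccountsDataFetcher implements DataFetcher<DataFetcherResult<List<Map<String, Object>>>> {

    @Override
    public DataFetcherResult<List<Map<String, Object>>> get(DataFetchingEnvironment env) {
        Map<String, Object> filter = env.getArgument("filter");
        String email = filter == null ? null : (String) filter.get("email");

        // Validate the input before touching the data layer.
        if (email == null || !email.endsWith("@gmail.com")) {
            return DataFetcherResult.<List<Map<String, Object>>>newResult()
                    .error(GraphqlErrorBuilder.newError(env)
                            .message("email must end with @gmail.com")
                            .build())
                    .build();
        }

        // Placeholder for the real account lookup.
        List<Map<String, Object>> accounts = List.of(Map.of("email", email));
        return DataFetcherResult.<List<Map<String, Object>>>newResult()
                .data(accounts)
                .build();
    }
}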
This can be improved by mapping your GraphQL input object to a POJO, as graphql-java-tools would do, and using javax.validation annotations and validators to validate your input before processing it.
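For example, a rough sketch of that idea with a hypothetical InputAccount POJO and javax.validation (a JSR-380 implementation such as Hibernate Validator must be on the classpath):
import javax.validation.ConstraintViolation;
import javax.validation.Validation;
import javax.validation.Validator;
import javax.validation.ValidatorFactory;
import javax.validation.constraints.NotNull;
import javax.validation.constraints.Pattern;
import java.util.Set;

public class InputAccountValidationExample {

    // Hypothetical POJO mirroring the InputAccount input type from the schema.
    public static class InputAccount {
        @NotNull
        @Pattern(regexp = ".*@gmail\\.com$", message = "email must be a gmail.com address")
        public String email;
    }

    public static void main(String[] args) {
        ValidatorFactory factory = Validation.buildDefaultValidatorFactory();
        Validator validator = factory.getValidator();

        InputAccount input = new InputAccount();
        input.email = "someone@example.org";

        // Run the javax.validation constraints before handing the input to the mutation logic.
        Set<ConstraintViolation<InputAccount>> violations = validator.validate(input);
        violations.forEach(v -> System.out.println(v.getPropertyPath() + ": " + v.getMessage()));
    }
}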
With the GraphQL-Java Extended Validation library - https://github.com/graphql-java/graphql-java-extended-validation
This library does what you want and provides some basic directives and their associated constraints. Be aware that the library is pretty new (there are a couple of bugs regarding nested input validation).
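As a rough sketch of the wiring, based on the library's README (the ValidationRules/ValidationSchemaWiring class names and the @Pattern directive come from that documentation and may vary between versions):
import graphql.schema.idl.RuntimeWiring;
import graphql.validation.rules.OnValidationErrorStrategy;
import graphql.validation.rules.ValidationRules;
import graphql.validation.schemawiring.ValidationSchemaWiring;

public class ValidationWiringExample {

    public static RuntimeWiring buildWiring() {
        // Collect the library's constraint rules and decide what happens on a violation.
        ValidationRules validationRules = ValidationRules.newValidationRules()
                .onValidationErrorStrategy(OnValidationErrorStrategy.RETURN_NULL)
                .build();

        // Registering the wiring makes the constraint directives run before your data fetchers.
        return RuntimeWiring.newRuntimeWiring()
                .directiveWiring(new ValidationSchemaWiring(validationRules))
                .build();
    }
}
In the SDL you would then annotate the field with one of the library's directives, e.g. email: String @Pattern(regexp: ".*@gmail.com"), instead of a hand-rolled @email directive.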

Related

What's the best way to validate an untagged Go struct

Whenever I create a Go struct, I use struct tags so that I can easily validate it using go-playground/validator - for example:
type DbBackedUser struct {
    Name sql.NullString `validate:"required"`
    Age  sql.NullInt64  `validate:"required"`
}
I'm currently integrating with a third party. They provide a Go SDK, but the structs in the SDK don't contain validate tags, and they are unable/unwilling to add them.
Because of my team's data integrity requirements, all of the data pulled in through this third-party SDK has to be validated before we can use it.
The problem is that there doesn't seem to be a way to validate a struct using the Validator package without defining validation tags on the struct.
Our current proposed solution is to map all of the data from the SDK's structs into our own structs (with validation tags defined on them), perform the validation on those structs, and then, if that validation passes, use the data directly from the SDK's structs (we need to pass the data around our system using the structs defined by the SDK, so the structs with the validation tags would only be used for validation). The problem here is that we would then have to maintain structs that exist only for validation, and the mapping between structs has both time and memory costs.
Is there a better way of handling struct validation in this scenario?
An alternative technique is to use validating constructors: NewFoobar(raw []byte) (Foobar, error)
But I see that go-playground/validator supports your case with "struct-level validation":
https://github.com/go-playground/validator/blob/master/_examples/struct-level/main.go

GraphQL field-level validation in AppSync

I have an AppSync API that's mostly backed by a DynamoDB store. Most of the resolvers are hooked up directly to the DynamoDB sources, not using lambdas.
Some of the fields should have validation constraints, such as length or a regexp. In one particular case I would like to require that a state field contain an ISO 3166-2 value like US-NY. (GraphQL enum values can't contain hyphens, so that isn't an option here.)
Other than replacing some resolvers with lambdas, the only way I can think of to apply these sorts of validation rules is to do it in VTL in the RequestMappingTemplate. That would work, but it would be tedious and likely result in duplicate code. Are there alternatives?
Unfortunately, the only way without a Lambda is VTL. I suggest that instead of writing the validation directly inside the RequestMappingTemplate, you use a pipeline resolver (less duplicated code).
Pipeline Resolvers contain one or more Functions which are executed in order.
Functions allow you to write common logic for reuse across multiple Resolvers in your schema. They are attached directly to a data source and, like a Unit resolver, use the same request and response mapping template format.
You can find a good example here.

validate request input phoenix elixir

I'm struggling to find something in the documentation that seems like it should be there...
In Phoenix I see validation at the point of trying to create an Ecto changeset, but I'm not seeing much prior to that, around validating the actual user input.
I'm not really a fan of exposing my data models across API boundaries, and I would rather just have structs representing the requests and responses, as they are likely very different shapes to my actual data models.
I'd like a way of converting user input to a struct and using some kind of validation framework to determine if the input is valid before I even think about hitting a database.
I've found https://github.com/CargoSense/vex and have gone down the route of converting the input to a struct, and using their validation, but there are a few things that worry me about this approach, namely:
I hear that there are issues with atoms in Elixir, and since structs are basically atom-keyed maps, am I going to run into the atom exhaustion issue by converting user input to these?
I also have some structs that would contain nested structs. I'm currently checking the default value provided and, if it's a struct, doing some magic (based on the answers in "In Elixir how do you initialize a struct with a map variable") to automatically convert a nested map to my nested struct. But again, I'm not sure if this is sensible.
The validations I'm defining in one DSL will be very similar to those in my Ecto models, and I would rather use a single mechanism for both.
Basically, how would you go about validating user input correctly in a Phoenix app? Am I on the right lines, or way off?

grails - I need to define my validation at runtime

I have an idea to read an XML document from the database and generate simple CRUD screens (via Grails) based on the data defined. My application will call RESTFul services to persist the data so I don't need Hibernate on the client side. I have ideas about how to generate the UI but where I'm stumped is in how to perform the validation.
I'll have a single, generic domain/command object that contains only the fields that are common for all instances of this "runtime" data type. All other fields are defined via the XML found in the database. I need something like this:
String xml // defines the fields, constraints, UI information for this data type
def constraints = {
    callMyCustomValidator(obj)
}
and in my callMyCustomValidator method, I'll extract the xml for obj and perform my validation as needed.
Note: We have a working example of this in a different app (written in Java/servlets/JSP), and without any formal "framework" this isn't difficult to do. Why do I need this? We need to add simple data types on the fly (via script) without a release.
You can use the validator constraint to add custom validation to your domain class; just add it to one of your common fields.

Can Content Negotiation Be Used to Control Return Types in the ASP.NET WebApi?

We're building a web api whose GET methods return DTOs. We'd like to build it so that, under certain circumstances, these DTOs are stripped of unnecessary properties in an effort to control the volume of data being sent down to the client. For example, when we return one of our email DTOs we sometimes would like the client to specify that it only needs a subject, date and ID and not the body of the email. In other scenarios, of course, the body of the email is needed.
What's the best way in the MVC WebApi to do this? I've looked into MediaTypeFormatters but they seem focused on the format of the data (JSONP, XML) rather than the content.
It sounds to me like you would like to have a custom media type.
This could be used in combination with a custom MediaTypeFormatter.
For instance, you could define your own media type (admittedly not a great example of a name):
application/vnd.me-shortform
And then, in your code, you can omit filling in the email body and let the default formatter format your result.
Or you could write your own MediaTypeFormatter (subclassing an existing one) and register it for your custom mediatype.
Then, in the MediaTypeFormatter, you could decide (through attributes on your DTO or something similar) that the email body is not necessary and omit it from the result.
Mark Seemann's post on Vendor Media Types should give you a good starting point.

Resources