NestJS GraphQL ensure scalar type - graphql

I have custom scalars for my user IDs, CustomerID and ProviderID, and I would like to validate them when someone calls a mutation, to ensure that the given ID matches a user of the correct type.
We cannot make the CustomerScalar parseValue method asynchronous, so I'm looking for a clean way to deal with this.
Maybe a customerDecorator? I don't know. Any ideas?
I would like to access my repository using dependency injection to ensure that the passed ID exists in the database and really is a Customer.
Field middleware and directives do not seem to support dependency injection.
@InputType()
export class CreateBillInput {
  @Field(() => CustomerIDScalar)
  customerID: CustomerID;

  @Field()
  name: string;

  @Field(() => Int)
  amount: number;
}
What I wanted, which cannot work:
@Injectable()
export class CustomerIDScalar implements CustomScalar<string, CustomerID> {
  constructor(private userRepository: IUserRepository) {}

  parseValue(value: string) {
    return this.getCustomerID(value);
  }

  parseLiteral(ast: ValueNode) {
    if (ast.kind !== Kind.STRING) {
      throw new TypeError('Argument is not a string value.');
    }
    return this.getCustomerID(ast.value);
  }

  serialize(value: CustomerID) {
    return value.value; // value sent to the client
  }

  // TODO: Not usable here -- parseValue/parseLiteral cannot be async
  private async getCustomerID(userID: string): Promise<CustomerID> {
    const user = await this.userRepository.getByID(userID);
    if (!user || !user.isCustomer()) {
      throw new BadRequestException('There is no such customer user for the provided ID.');
    }
    return user.id as CustomerID;
  }
}
Thanks

First of all, GraphQL scalar validation is a technical check that the data is well-formed (a schema check). It is not supposed to contain any business-rule validation.
To achieve the desired result you can use the following:
Nest validation pipes with the @Args decorator:
@Mutation(returns => Group)
async createGroup(
  @Args('group', new ValidationPipe())
  input: CreateGroupInput,
)
class-validator: https://github.com/typestack/class-validator (see the sketch after this list)
Simply use a regular ID/Int scalar for the input type and validate it later in the service class responsible for that operation (I recommend this approach for such things in GraphQL)
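A rough sketch of how the first two points could be combined: an async class-validator constraint that reaches the repository through Nest's DI container. This assumes the input exposes customerID as a plain string/ID (as recommended above), that IUserRepository is resolvable from the container as in the question, and that class-validator is told to use Nest's container via useContainer(...) at bootstrap; the IsCustomerID names and the 'IUserRepository' token are illustrative, not an existing API.

import { Inject, Injectable } from '@nestjs/common';
import {
  registerDecorator,
  ValidationOptions,
  ValidatorConstraint,
  ValidatorConstraintInterface,
} from 'class-validator';

// Repository API (getByID, isCustomer) is assumed as in the question;
// 'IUserRepository' is a hypothetical provider token.
@ValidatorConstraint({ name: 'isCustomerID', async: true })
@Injectable()
export class IsCustomerIDConstraint implements ValidatorConstraintInterface {
  constructor(@Inject('IUserRepository') private readonly userRepository: any) {}

  async validate(value: string): Promise<boolean> {
    const user = await this.userRepository.getByID(value);
    return !!user && user.isCustomer();
  }

  defaultMessage(): string {
    return 'There is no such customer user for the provided ID.';
  }
}

// Property decorator to put on the input-type field.
export function IsCustomerID(validationOptions?: ValidationOptions) {
  return function (object: Object, propertyName: string) {
    registerDecorator({
      target: object.constructor,
      propertyName,
      options: validationOptions,
      constraints: [],
      validator: IsCustomerIDConstraint,
    });
  };
}

// Usage (the input type keeps a plain ID string; the service maps it to CustomerID):
// @InputType()
// export class CreateBillInput {
//   @Field(() => ID)
//   @IsCustomerID()
//   customerID: string;
// }
//
// main.ts: useContainer(app.select(AppModule), { fallbackOnErrors: true });
// and register IsCustomerIDConstraint as a provider so the repository gets injected.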

Related

How to trigger visitInputObject method on custom directive?

I'm building a custom directive in which I'm hoping to validate entire input objects. I'm using the INPUT_OBJECT location with the visitInputObject method on a class extending SchemaDirectiveVisitor.
Every time I run a mutation using the input type, visitInputObject does not run.
I've used the other locations/methods like visitObject and visitFieldDefinition and they work perfectly. But when trying to use input types and their methods, they will not trigger.
I've read all the available documentation I can find. Is this just not supported yet?
Some context code (not actual):
directive @validateThis on INPUT_OBJECT

input MyInputType @validateThis {
  id: ID
  someField: String
}

type Mutation {
  someMutation(myInput: MyInputType!): SomeType
}
class ValidateThisDirective extends SchemaDirectiveVisitor {
  visitInputObject(type) {
    console.log('Not triggering');
  }
}
All the visit methods of a SchemaDirectiveVisitor are run at the same time -- when the schema is built. That includes visitInputObject and visitFieldDefinition. The difference is that when we use visitFieldDefinition, we often do it to modify the resolve function for the visited field. It's this function that's called during execution.
You use each visit method to modify the respective schema element. You can use visitInputObject to modify an input object, for example to add or remove fields from it. You cannot use it to modify the resolution logic of an output object's field; you should use visitFieldDefinition for that.
// This method goes in the directive class; defaultFieldResolver is imported from 'graphql'.
visitFieldDefinition(field, details) {
  const { resolve = defaultFieldResolver } = field
  field.resolve = async function (parent, args, context, info) {
    Object.keys(args).forEach(argName => {
      const argDefinition = field.args.find(a => a.name === argName)
      // Note: you may have to "unwrap" the type if it's a list or non-null
      const argType = argDefinition.type
      if (argType.name === 'InputTypeToValidate') {
        const argValue = args[argName]
        // validate argValue here
      }
    })
    return resolve.apply(this, [parent, args, context, info]);
  }
}
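For completeness, a minimal wiring sketch (assuming graphql-tools v4/v5, where SchemaDirectiveVisitor lives): the directive is declared on FIELD_DEFINITION and attached to the mutation field, so the visitFieldDefinition method above actually runs. Type and field names come from the question; everything else is illustrative.

import { makeExecutableSchema } from 'graphql-tools';
// The directive class with the visitFieldDefinition method from the answer above;
// the path is hypothetical.
import { ValidateThisDirective } from './validate-this.directive';

const typeDefs = /* GraphQL */ `
  directive @validateThis on FIELD_DEFINITION

  input MyInputType {
    id: ID
    someField: String
  }

  type SomeType {
    id: ID
  }

  type Query {
    ping: String
  }

  type Mutation {
    someMutation(myInput: MyInputType!): SomeType @validateThis
  }
`;

export const schema = makeExecutableSchema({
  typeDefs,
  resolvers: {
    Mutation: {
      someMutation: (_root: any, args: any) => ({ id: args.myInput.id }),
    },
  },
  schemaDirectives: {
    validateThis: ValidateThisDirective,
  },
});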

Possible to genericize NGXS actions?

I'd like to write just one action to perform the same CRUD operations on state, just on different slices of it, while preserving type safety.
For example, I'd like to use the following action to apply a set operation to any slice with a generic type T:
export class Set_Entity<T> {
  static readonly type = '[Entity] Set';
  constructor(public payload: T) {}
}
This is problematic because the type will always be the same. Is it possible to somehow decorate this class so a unique type property can be passed in whenever it is used as the @Action?
Something like:
/* action */
class Set_Entity<T> {
  constructor(public entity: string, public payload: T) {}
}

/* state */
@Action(Set_Entity('[Groups] Set Group' /* <-- changes the `type` property */))
set_group(
  context: StateContext<Model>,
  action: Set_Entity<{ entity: string; payload: Group }>,
) {
  const entity = action.entity;
  const data = action.payload;
  context.patchState({ [entity]: data });
}

/* facade or something */
this.store.dispatch([
  new Set_Entity<GroupEntityType>(
    'user', // <-- the state slice
    aRecord,
  ),
]);
Even this solution leaves something to be desired: a generic action still must be written for each state slice, for each CRUD operation. It would be nice to be able to use the same generic action for every CRUD op on every state slice.
I managed to do it beautifully with NgRx via typescript-fsa and typescript-fsa-reducers. I only needed a single generic action plus a single generic reducer for the entire state, all type-safe.
The action looked like this:
function generic_set_action<T>(sliceName: string): ActionCreator<T> {
  const creator = actionCreatorFactory(sliceName);
  const action = creator<T>('set');
  return action; // produces type `sliceName/set`
}

// Create the action
generic_set_action<User>('sliceName')(payload)
The reducer:
export function create_generic_reducer<T>(sliceName: string) {
  const action_set = generic_set_action<T>(sliceName);
  return reducerWithInitialState({} as T)
    .case(action_set, (state, data) => data)
    .build();
}
And finally when creating the reducers:
export const Reducers: ActionReducerMap<State> = {
  coolSlice: create_generic_reducer<MySliceModel>('coolSlice'),
  // repeat for each slice...
};
It would be great to be able to reproduce this with NGXS.
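One possible direction in NGXS (a hedged sketch, not a built-in feature): since @Action only needs a class with a static type, a small factory can stamp out a per-slice "set" action class while keeping the payload generic. The model names and the createSetAction helper below are illustrative.

import { Injectable } from '@angular/core';
import { Action, State, StateContext } from '@ngxs/store';

// Illustrative models -- stand-ins for real state slice types.
interface Group { id: string; name: string; }
interface User { id: string; email: string; }

// Factory that stamps out a slice-specific "set" action class with a unique static type.
export function createSetAction<T>(sliceName: string) {
  return class {
    static readonly type = `[${sliceName}] Set`;
    constructor(public payload: T) {}
  };
}

// One line per slice instead of a full class declaration per slice.
export const SetGroup = createSetAction<Group>('Groups'); // type: '[Groups] Set'
export const SetUser = createSetAction<User>('Users');    // type: '[Users] Set'

@State<Group | null>({ name: 'group', defaults: null })
@Injectable()
export class GroupState {
  @Action(SetGroup)
  setGroup(ctx: StateContext<Group | null>, action: InstanceType<typeof SetGroup>) {
    ctx.setState(action.payload);
  }
}

// Dispatching stays type-safe:
// this.store.dispatch(new SetGroup({ id: '1', name: 'Admins' }));

Each slice still needs its own factory call and its own @Action handler, so this is not quite the single generic action/reducer pair from typescript-fsa, but it does remove the per-slice class boilerplate.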

Clean way to get the same field by a different key

Here is the problem: I can get a member by ID, and my query looks like below:
{
  member(memberId: [1, 2]) {
    firstName
    lastName
    contacts {
    }
  }
}
Now I need to add a few more queries to get a member by name and email, like below:
{
  member(email: ["abc@xy.com", "adc@xy.com"]) {
    firstName
    lastName
    contacts {
    }
  }
}

{
  member(name: ["abc", "adc"]) {
    firstName
    lastName
    contacts {
    }
  }
}
How do I design my GraphQL query and schema? Should my query have just one field with multiple optional arguments, like below?
Field("member", ListType(Member),
arguments = ids :: email :: name,
resolve = (ctx) => {
val id : Seq[Int] = ctx.arg("memberId")
ctx.ctx.getMemberDetails(id)
})
Or should I have multiple queries with different fields under a schema, like below?
Field("memberById", ListType(Member),
arguments = Id :: Nil,
resolve = (ctx) => {
val id : Seq[Int] = ctx.arg("memberId")
ctx.ctx.getMemberDetails(id)
})
Field("memberByEmail", ListType(Member),
arguments = email :: Nil,
resolve = (ctx) => {
val id : Seq[Int] = ctx.arg("memberId")
ctx.ctx.getMemberDetails(id)
})
Field("memberByName", ListType(Member),
arguments = name :: Nil,
resolve = (ctx) => {
val id : Seq[Int] = ctx.arg("memberId")
ctx.ctx.getMemberDetails(id)
})
Thank you in advance. Let me know if you need more details.
You should think about the advantages and disadvantages of both solutions.
If you prepare separate fields, you will get a lot of boilerplate.
On the other hand, you can make all possible inputs OptionalInputType, which keeps the schema to a single field. The disadvantage of this solution is that Sangria cannot validate that at least one of the arguments is provided, so you have to cover that case yourself with a proper response.
The third option is to make a generic solution at the schema level. You can create a query with two arguments, filterName and filterValues: the first would be an EnumType for Id, Email and Name, and the second a list of strings.
Such a solution avoids the disadvantages of both previous ones: the arguments are required, and you don't need to spread a separate field across the schema for every filter. Additionally, if you want to add another filter, you only have to edit the FilterName enum and the resolver function to cover it.
Finally, your schema will look like this:
enum FilterName {
  ID
  EMAIL
  NAME
}

type Query {
  member(filterName: FilterName!, filterValues: [String]!): Member!
}

Validate nested domain class instance in command object

I am trying to validate a nested domain class instance on a command object.
Given the following command object:
package demo

import grails.databinding.BindingFormat

class SaveEventCommand {
    @BindingFormat('yyyy-MM-dd')
    Date date
    Refreshment refreshment

    static constraints = {
        date validator: { date -> date > new Date() + 3 }
        refreshment nullable: true
    }
}
And the following domain class with its own constraints:
package demo

class Refreshment {
    String food
    String drink
    Integer quantity

    static constraints = {
        food inList: ['food1', 'food2', 'food3']
        drink nullable: true, inList: ['drink1', 'drink2', 'drink3']
        quantity min: 1
    }
}
I need that, when refreshment is not null, the command object validates the date property and also checks the corresponding constraints on the refreshment instance.
For now I am trying this code in the controller:
def save(SaveEventCommand command) {
    if (command.hasErrors() || !command.refreshment.validate()) {
        respond([errors: command.errors], view: 'create')
        return
    }
    // Store logic goes here
}
Here, through !command.refreshment.validate(), I try to validate the refreshment instance, but the result says there are no errors, even when passing data that is not correct.
Thanks for any guidance, and thank you for your time.
I typically just include some code that will use a custom validator to kick off validation for any property that is composed of another command object. For example:
thePropertyInQuestion(nullable: true, validator: { val, obj, err ->
    if (val == null) return
    if (!val.validate()) {
        val.errors.allErrors.each { e ->
            err.rejectValue(
                "thePropertyInQuestion.${e.arguments[0]}",
                "${e.objectName}.${e.arguments[0]}.${e.code}",
                e.arguments,
                "${e.objectName}.${e.arguments[0]}.${e.code}"
            )
        }
    }
})
This way it's pretty clear that I want validation to occur. Plus it moves all the errors up into the root errors collection which makes things super easy for me.
Two things I could think of:
Implement grails.validation.Validateable on your command object
What happens when you provide an invalid date? Can you see errors while validating?

Scala unique validation, PlayFramework, Scalaz

I've found some limitations in Play Framework's default validation.
My biggest limitation is uniqueness validation.
Let's say I'm validating a user registration form and I want to check whether the passed login already exists.
To do so, I need to ask the DB to count users by name:
UsersService.countByName(s: String): Future[Long]
Is there a possibility to solve this problem using Scalaz Validation and |@|?
case class RegistrationForm(login: String)

object RegistrationForm {

  def nonEmptyLogin(login: String): ValidationNel[String, String] = {
    if (login.isEmpty)
      "validation.error.blank.login".failureNel
    else
      login.successNel
  }

  def isLoginUnique(login: String): Future[ValidationNel[String, String]] = {
    ???
  }

  def validate(registrationForm: RegistrationForm): Future[ValidationNel[String, RegistrationForm]] = {
    nonEmptyLogin(registrationForm.login) |@|
      isLoginUnique(registrationForm.login) {
        (_) => registrationForm
      }
  }
}
How should I implement the isLoginUnique method?
I'm not sure if I wrote the validate method correctly either; I just wanted to show my vision of the validation.
