How do you handle optional schema/attribute? - normalizr

I would like to know how to handle an optional attribute in a schema.
For instance, I've got something like:
const billSchema = new schema.Entity('bills')
billSchema.define({
  _embedded: {
    currentStep: stepSchema,
    latestText: textSchema,
    steps: [stepSchema],
  }
})
On some requests "steps" exists, but on others it doesn't, and I'd like not to create a separate schema for each type of API request.
Any idea?

Related

Proper way to call RTK Query endpoint.select() with no arguments (and skip options)

I would like to use the endpoint.select() function to create selectors from cached RTK Query data, see RTK Advanced Patterns.
The documentation clearly states that if there is no query argument, you can pass undefined to select() (see the Selecting Users Data section).
However, in my case this does not work unless I trigger the query with the initiate() function. When triggering the query from the query hook, on the other hand, the selector fails to retrieve the cached data.
The non-working setup is pretty simple:
export const productsApi = createApi({
  baseQuery: fetchBaseQuery({ baseUrl: API.ENDPOINTS.PRODUCTS }),
  reducerPath: 'productsApi',
  endpoints: (builder) => ({
    listAllProducts: builder.query({
      query: () => '/list',
    }),
  }),
});
export const { useListAllProductsQuery } = productsApi;
Then in a custom hook I call the useListAllProductsQuery hook:
const {
  data,
} = useListAllProductsQuery({ skip: shouldSkip });
And finally in the selector:
export const selectProducts =
  productsApi.endpoints.listAllProducts.select(); // undefined param, as the docs recommend
Potential fix (or more like a hacky workaround):
Strangely enough, I discovered that if I pass an argument (aka a cache key) into the select function and pass that same cache key into the query hook, all of a sudden the stars align and everything works (although the docs state this is not necessary). The modified code looks like:
// in the selector
export const selectProducts =
  productsApi.endpoints.listAllProducts.select('products');
// in the hook
const {
  data,
} = useListAllProductsQuery('products');
I'm wondering if anyone can shed some wisdom on why this works, or better yet recommend the best practice for using the select function on a query with no cache key (since the docs seem incorrect or outdated?).
I would also like to point out that when calling select() without a parameter, a TypeScript warning surfaces indicating a parameter is required.
I am not certain where you think the docs state that you do not need an argument.
You will need to call select with the same argument as you call your hook to get a selector for that cache key - so if you call useMyQuery(), you can call select() - and if you call useMyQuery(5), you can call select(5) to get a selector for the cache key 5.
Those are individual cache entries and you will need a selector for each of those cache entries.
Also, could you clarify what exactly you mean by "not working"? Using the selector will only give you the cache entry, it will not make a request - you are, after all, just selecting from the store. So before you have used the hook or dispatched initiate, you will get an uninitialized cache entry.
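To make that concrete, here is a minimal sketch (not from the answer above), assuming the productsApi defined earlier and access to the configured Redux store. Note that in the question the hook was called with { skip: shouldSkip } as its query argument, so that object became the cache key, which is likely why select() with no argument found nothing; skip belongs in the hook's second (options) argument.
// 1) reading the cache entry for the "no argument" cache key, outside React
const selectProducts = productsApi.endpoints.listAllProducts.select();
const entry = selectProducts(store.getState());
// before the hook has run (or initiate() has been dispatched):
// entry.isUninitialized === true and entry.data === undefined

// 2) making the hook use that same cache key: pass no query argument
//    and move `skip` into the options object (second parameter)
const { data } = useListAllProductsQuery(undefined, { skip: shouldSkip });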
I think this can solve your problem.
In the component.jsx file:
const state = useSelector((state: RootState) => state);
console.log(getState(state, params));
In the api.js file:
export const getState = (state: RootState, params) =>
  api.endpoints.getApiState.select(params)(state);

Apollo conditional data sources & initialization lifecycle

I have a specific use case where a user's data sources are conditional - e.g. based on the data sources saved in the database for each specific user.
This also means every data source has unique credentials for every user, which is fine for RESTDataSource because I can use willSendRequest to set the Authentication headers before each request.
However, I have custom data sources that have proprietary clients (for example JSForce for Salesforce) - and they have their own fetch mechanism.
As of now, I have a custom transformer directive that fetches the tokens from the database and adds them to the context - however, the directive is run before the dataSource.initialize() method, so I can't use the credentials there because the context doesn't have them yet.
I also don't want to initialize every data source for every user even when it isn't used in the current request - but the dataSources() function doesn't accept any parameters and is not contextual.
Bottom line: is it possible to pass data sources conditionally, based even on the Express request? When is the right time to pass the tokens and credentials to the data source? Maybe add my own custom init function and call it from the directive?
So you have options. Here are 2 choices:
1. Just add your dataSources
If you just initialize all dataSources, each one can internally check whether the user has access. You could have a getClient function that either resolves to the client or throws an UnauthorizedError, depending (a sketch follows below).
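A minimal sketch of that idea, under assumptions not spelled out in the answer: the class name SalesforceAPI, its method names, and the shape of context.user are illustrative, and Apollo's ForbiddenError stands in for the UnauthorizedError mentioned above.
// always construct the data source, but check access lazily when it is used
import { DataSource, DataSourceConfig } from 'apollo-datasource'
import { ForbiddenError } from 'apollo-server-express'
import * as jsforce from 'jsforce'

class SalesforceAPI extends DataSource {
  private context: any

  initialize(config: DataSourceConfig<any>) {
    this.context = config.context
  }

  private getClient() {
    const token = this.context.user?.salesforceToken // hypothetical location of the credentials
    if (!token) throw new ForbiddenError('No Salesforce access for this user')
    return new jsforce.Connection({ accessToken: token })
  }

  async getAccounts() {
    return this.getClient().query('SELECT Id, Name FROM Account')
  }
}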
2. Don't just add your dataSources
So if you really don't want to initialize the dataSources at ALL, you can absolutely do this by adding the "dataSources" yourself, just like Apollo does it.
const server = new ApolloServer({
  // this example uses apollo-server-express
  context: async ({ req, res }) => {
    const accessToken = req.headers?.authorization?.split(' ')[1] || ''
    const user = accessToken && buildUser(accessToken)
    const context = { user }
    // You can't use the name "dataSources" in your config because ApolloServer will puke, so I called them "services"
    await addServices(context)
    return context
  }
})

const addServices = async (context) => {
  const { user } = context;
  const services = {
    userAPI: new UserAPI(),
    postAPI: new PostAPI(),
  }
  if (user.isAdmin) {
    services.adminAPI = new AdminAPI()
  }

  const initializers = [];
  for (const service of Object.values(services)) {
    if (service.initialize) {
      initializers.push(
        service.initialize({
          context,
          cache: null, // or add your own cache
        })
      );
    }
  }
  await Promise.all(initializers);

  /**
   * this is where you have to deviate from Apollo.
   * You can't use the name "dataSources" in your config because ApolloServer will puke
   * with the error 'Please use the dataSources config option instead of putting dataSources on the context yourself.'
   */
  context.services = services;
}
Some notes:
1. You can't call them "dataSources"
If you return a property called "dataSources" on your context object, Apollo will not like it very much [meaning it throws an Error]. In my example, I used the name "services", but you can do whatever you want... except "dataSources".
With the above code, in your resolvers, just reference context.services.whatever instead.
2. This is what Apollo does
This pattern is copied directly from what Apollo already does for dataSources [source]
3. I recommend you still treat them as DataSources
I recommend you stick to the DataSources pattern and that your "services" all extend DataSource. It's going to be easier for everyone involved.
4. Type safety
If you're using TypeScript or something similar, you're going to lose a bit of type safety, since context.services will be one shape or another. Even if you're not, if you're not careful you may end up throwing "Cannot read property users of undefined" errors instead of "Unauthorized" errors. You might be better off creating "dummy services" that have the same object shape but just throw Unauthorized, as sketched below.
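A minimal sketch of that dummy-service idea; the method names are illustrative and simply mirror whatever the real AdminAPI exposes.
// same shape as AdminAPI, but every method rejects instead of hitting the backend
import { ForbiddenError } from 'apollo-server-express'

class UnauthorizedAdminAPI {
  async listUsers(): Promise<never> {
    throw new ForbiddenError('Admin access required')
  }
  async banUser(_id: string): Promise<never> {
    throw new ForbiddenError('Admin access required')
  }
}

// in addServices, always provide the property so the context shape stays stable:
// services.adminAPI = user.isAdmin ? new AdminAPI() : new UnauthorizedAdminAPI()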

How to test a Nest.js parameter decorator that uses graphql execution context

I have created a decorator to retrieve the graphql query context and do some logic with it. It looks like this:
export const GraphQlProjections = (options?: ProjectionOptions) =>
  createParamDecorator<ProjectionOptions, ExecutionContext, string[]>(
    (opts: ProjectionOptions, ctx: ExecutionContext) => {
      const gqlContext = GqlExecutionContext.create(ctx)
      const info = gqlContext.getInfo()
      const fields = Object.keys(fieldsProjection(info, opts))
      return fields
    }
  )(options)
I'd like to write some unit tests for this - but I have no idea how to go about it.
I've found some documentation for getting the decorator factory, but it doesn't help with setting up or mocking the execution context. The Apollo Server docs seem to reference some sort of mocking, but they don't tell me how to achieve my goal.
I basically need to say: 'Given this query, say Query { user { name } }, what will my decorator return?'
To achieve this, it seems I need to somehow mock the execution context to contain a GraphQLResolveInfo object, which I also somehow need to generate. How can I achieve this? Or am I going about this the wrong way?

How to manually test input validation with NestJS and class-validator

TL;DR: I was trying to test DTO validation in the controller spec instead of in e2e specs, which are precisely crafted for that. McDoniel's answer pointed me in the right direction.
I am developing a NestJS entrypoint that looks like this:
@Post()
async doStuff(@Body() dto: MyDto): Promise<string> {
  // some code...
}
I use class-validator so that when my API receives a request, the payload is parsed and turned into a MyDto object, and the validations present as annotations in the MyDto class are performed. Note that MyDto has an array of nested objects of class MySubDto. With the @ValidateNested and @Type annotations, the nested objects are also validated correctly.
This works great.
Now I want to write tests for the performed validations. In my .spec file, I write:
import { validate } from 'class-validator';
// ...
it('should FAIL on invalid DTO', async () => {
  const dto = {
    //...
  };
  const errors = await validate(dto);
  expect(errors.length).not.toBe(0);
});
This fails because the validated dto object is not a MyDto. I can rewrite the test as such:
it('should FAIL on invalid DTO', async () => {
  const dto = new MyDto();
  dto.attribute1 = 1;
  dto.subDto = { name: 'Vincent' };
  const errors = await validate(dto);
  expect(errors.length).not.toBe(0);
});
Validations are now properly performed on the MyDto object, but not on my nested subDto object, which means I would have to instantiate all the objects of my DTO with their corresponding classes, which would be quite inefficient. Also, instantiating the classes means that TypeScript will raise errors if I deliberately omit required properties or provide incorrect values.
So the question is:
How can I use the NestJS built-in request body parser in my tests, so that I can write any JSON I want for the dto, parse it as a MyDto object, and validate it with the class-validator validate function?
Any alternate better-practice ways to tests validations are welcome too!
Although we should test how our validation DTOs work with the ValidationPipe, that's a form of integration or e2e testing. Unit tests are unit tests, right?! Every unit should be testable independently.
The DTOs in Nest.js are perfectly unit-testable. It becomes necessary to unit-test the DTOs when they contain complex regular expressions or sanitation logic.
Creating an object of the DTO for test
The request body parser in Nest.js that you are looking for is the class-transformer package. It has a function plainToInstance() to turn your literal or JSON object into an object of the specified type. In your example the specified type is the type of your DTO:
const myDtoObject = plainToInstance(MyDto, myBodyObject)
Here, myBodyObject is your plain object that you created for test, like:
const myBodyObject = { attribute1: 1, subDto: { name: 'Vincent' } }
The plainToInstance() function also applies all the transformations that you have in your DTO. If you just want to test the transformations, you can assert after this statement. You don't have to call the validate() function to test the transformations.
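For example, a minimal assertion on the transformation alone might look like this (assuming subDto is annotated with @Type(() => MySubDto) in MyDto, as the question implies, and Jest as the test runner):
import { plainToInstance } from 'class-transformer'

const myDtoObject = plainToInstance(MyDto, { attribute1: 1, subDto: { name: 'Vincent' } })
expect(myDtoObject).toBeInstanceOf(MyDto)
expect(myDtoObject.subDto).toBeInstanceOf(MySubDto) // the nested plain object was transformed too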
Validating the object of the DTO in test
To emulate the validation of Nest.js, simply pass myDtoObject to the validate() function of the class-validator package:
const errors = await validate(myDtoObject)
Also, if your DTO or SubDTO object is too big or too complex to create, you have the option to skip the remaining properties or subObjects like your subDto:
const errors = await validate(myDtoObject, { skipMissingProperties: true })
Now your test object could be without the subDto, like:
const myBodyObject = { attribute1: 1 }
Asserting the errors
Apart from asserting that the errors array is not empty, I also like to specify a custom error message for each validation in the DTO:
@IsPositive({ message: `Attribute1 must be a positive number.` })
readonly attribute1: number
One advantage of a custom error message is that we can write it in a user-friendly way instead of the generic messages created by the library. Another big advantage is that I can assert this error message in my tests. This way I can be sure that the errors array is not empty because it contains the error for this particular validation and not something else:
expect(stringified(errors)).toContain(`Attribute1 must be a positive number.`)
Here, stringified() is a simple utility function to convert the errors object to a JSON string, so we can search our error message in it:
export function stringified(errors: ValidationError[]): string {
return JSON.stringify(errors)
}
Your final test code
Instead of the controller.spec.ts file, create a new file specific to your DTO, like my-dto.spec.ts for unit tests of your DTO. A DTO can have plenty of unit tests and they should not be mixed with the controller's tests:
it('should fail on invalid DTO', async () => {
  const myBodyObject = { attribute1: -1, subDto: { name: 'Vincent' } }
  const myDtoObject = plainToInstance(MyDto, myBodyObject)
  const errors = await validate(myDtoObject)
  expect(errors.length).not.toBe(0)
  expect(stringified(errors)).toContain(`Attribute1 must be a positive number.`)
})
Notice how you don't have to assign the values to the properties one by one for creating the myDtoObject. In most cases, the properties of your DTOs should be marked readonly. So, you can't assign the values one by one. The plainToInstance() to the rescue!
That's it! You were almost there, unit testing your DTO. Good efforts! Hope that helps now.
To test input validation with the validation pipes, I think it is agreed that the best place to do this is in e2e tests rather than in unit tests - just make sure that you remember to register your pipes (if you normally use app.useGlobalPipes() instead of relying on dependency injection).
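For reference, a minimal e2e sketch of that approach; MyModule and the POST / route are illustrative names for whatever module declares the doStuff() controller from the question:
import { Test } from '@nestjs/testing'
import { INestApplication, ValidationPipe } from '@nestjs/common'
import * as request from 'supertest'

describe('POST / validation (e2e)', () => {
  let app: INestApplication

  beforeAll(async () => {
    const moduleRef = await Test.createTestingModule({ imports: [MyModule] }).compile()
    app = moduleRef.createNestApplication()
    app.useGlobalPipes(new ValidationPipe()) // register the pipe, just like in main.ts
    await app.init()
  })

  afterAll(async () => {
    await app.close()
  })

  it('rejects an invalid body with 400', () => {
    return request(app.getHttpServer())
      .post('/')
      .send({ attribute1: -1, subDto: { name: 'Vincent' } })
      .expect(400)
  })
})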

Apollo GraphQL - How do I use an RxJS Subject as a variable with Apollo Client?

My type-ahead search was working great with REST, but I'm converting to GraphQL, which has its challenges.
As the user types a last name into a form field, the suggested results display in a data table below. Each letter is handled by the RxJS Subject.
The variable searchTerm$ is a kind of RxJS observable called a Subject and is bound to the HTML. The following is called from the AfterViewInit lifecycle hook in an Angular app. The search is by the database column last_name.
However, this results in a Bad Request 400 error as the view loads, and the search doesn't work. I thought maybe this calls for a subscription, but everything I find on those is about using web sockets to connect to a remote URL and server. Where do I go from here?
I'm using the Angular Apollo client with Apollo Express, but I would be happy with any JS solution and will try to figure it out from there. The server side is NestJS, which just wraps Apollo Server.
const lastNameSearch = gql`
  query ($input: String!) {
    lastNameSearch(input: $input) {
      first_name
      last_name
      user_name
      pitch
      main_skill_title
      skills_comments
      member_status
    }
  }`;
this.apollo
  .watchQuery({
    query: lastNameSearch,
    variables: {
      last_name: searchTerm$, // Trying to use the observable here.
    },
  })
  .valueChanges
  .subscribe(result => {
    console.log('data in lastNameSearch: ', result);
  });
The schema on the server:
lastNameSearch(input: String!): [Member]
The resolver:
@Query()
async lastNameSearch(@Args('input') input: String) {
  const response = await this.membersService.lastNameSearch(input);
  return await response;
}
Edit:
The error from the Network panel in dev tools (the console message was worthless):
{"errors":[{"message":"Variable \"$input\" of required type \"String!\" was not provided.","locations":[{"line":1,"column":8}],"extensions":{"code":"INTERNAL_SERVER_ERROR","exception":{"stacktrace":["GraphQLError: Variable \"$input\" of required type \"String!\" was not provided."," at getVariableValues
And this goes on showing properties and methods in the app for another 300 lines or so.
First, a big thank you to the amazing Daniel Rearden for his help on various questions as I and lots of others on SO learn GraphQL! He has patience!
As Daniel pointed out in the comments, I had made a simple mistake; I point it out in the commented code below. The bigger issue, however, was trying to use an observable, Subject, or similar as a GraphQL variable. Even if the RxJS Subject emits strings, GraphQL will choke on being handed the Subject object itself as a variable. So I had to use a little reactive programming to solve this.
Setup the observable:
public searchTerm$ = new Subject<string>(); // Binds to the html text box element.
Second, let's set this up in a lifecycle hook where we subscribe to the observable so it will emit letters one at a time as they are typed into an input box.
ngAfterViewInit() {
  let nextLetter: string;
  // -------- For Last Name Incremental Query --------- //
  this.searchTerm$.subscribe(result => {
    nextLetter = result;            // Setup a normal variable.
    this.queryLastName(nextLetter); // Call the GraphQL query below.
  });
}
In the last step we have the GraphQL query and the handling of the returned data object. This works perfectly: type a 'p' into the form and you get back from the db all the last names starting with 'p' or 'P'; type 'r' and the results narrow to last names starting with 'pr', and so on.
private queryLastName(nextLetter) {
  const lastNameSearch = gql`
    query ($input: String!) {
      lastNameSearch(input: $input) {
        first_name
        last_name
        user_name
        pitch
        main_skill_title
        skills_comments
        member_status
      }
    }`;

  this.apollo
    .watchQuery({
      query: lastNameSearch,
      variables: {
        input: nextLetter, // Notice I had used last_name here instead of input.
      },
    })
    .valueChanges
    .subscribe(result => {
      // Put the data into some UI in your app, in this case
      // an Angular Material data table.
      // Notice how we get the data from the returned object.
      // This avoids the dreaded "null" error when the shape of the
      // returned data doesn't match the query. This puts an array
      // of objects into the UI.
      this.dataSource.data = result.data['lastNameSearch'];
    });
}
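As a possible refinement (not part of the original answer), the same per-keystroke flow could be written as a single RxJS pipeline so the Subject is debounced and duplicate terms are skipped before each query fires. This is only a sketch, assuming the same this.apollo client and a lastNameSearch document defined at module scope:
// at the top of the component file
import { debounceTime, distinctUntilChanged, switchMap } from 'rxjs/operators';

// inside the component class
ngAfterViewInit() {
  this.searchTerm$
    .pipe(
      debounceTime(300),        // wait for a short pause in typing
      distinctUntilChanged(),   // ignore repeats of the same term
      switchMap(term =>
        this.apollo.watchQuery({
          query: lastNameSearch,
          variables: { input: term },
        }).valueChanges
      )
    )
    .subscribe(result => {
      this.dataSource.data = result.data['lastNameSearch'];
    });
}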
