Check queried subfields at runtime in NestJS GraphQL?

I'm trying to optimize one of my queries, which returns an @ObjectType() like this:
async isFeatureEnabled() {
  // ...
  return {
    isDistributedInboundEnabled: await getIsDistributedInboundEnabled(),
    isBusinessHoursEnabled: await getIsBusinessHoursEnabled(),
  };
}
However, when executing this query:
query GetCompany {
  getCompany {
    isFeatureEnabled {
      isBusinessHoursEnabled
      # isDistributedInboundEnabled <--- Do not query this!
    }
  }
}
both functions are executed (await getIsDistributedInboundEnabled() and await getIsBusinessHoursEnabled()).
Can I check at runtime which subfields are queried? That way only the required functions would be executed on the backend.

OK, I personally used the approach described in this link and found that it works with Nest 8.x.x:
Fortunately, we can access the details of the GraphQL query with the @Info() decorator. The most straightforward way to use it is with the graphql-parse-resolve-info library.
import { Info, Query, Resolver } from '@nestjs/graphql';
import { Post } from './models/post.model';
import PostsService from './posts.service';
import { parseResolveInfo, ResolveTree, simplifyParsedResolveInfoFragmentWithType } from 'graphql-parse-resolve-info';
import { GraphQLResolveInfo } from 'graphql';

@Resolver(() => Post)
export class PostsResolver {
  constructor(
    private postsService: PostsService
  ) {}

  @Query(() => [Post])
  async posts(
    @Info() info: GraphQLResolveInfo
  ) {
    const parsedInfo = parseResolveInfo(info) as ResolveTree;
    const simplifiedInfo = simplifyParsedResolveInfoFragmentWithType(
      parsedInfo,
      info.returnType
    );
    const posts = 'author' in simplifiedInfo.fields
      ? await this.postsService.getPostsWithAuthors()
      : await this.postsService.getPosts();
    return posts.items;
  }
}
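Applied to the original isFeatureEnabled field, a minimal sketch could look like the following. This is illustrative only: Company and FeatureFlags are assumed @ObjectType names, and the getIs... helpers come from the question.

// Sketch only: Company and FeatureFlags are hypothetical @ObjectTypes,
// getIsDistributedInboundEnabled/getIsBusinessHoursEnabled come from the question.
import { Info, ResolveField, Resolver } from '@nestjs/graphql';
import { GraphQLResolveInfo } from 'graphql';
import { parseResolveInfo, ResolveTree, simplifyParsedResolveInfoFragmentWithType } from 'graphql-parse-resolve-info';

@Resolver(() => Company)
export class CompanyResolver {
  @ResolveField(() => FeatureFlags)
  async isFeatureEnabled(@Info() info: GraphQLResolveInfo) {
    const parsedInfo = parseResolveInfo(info) as ResolveTree;
    const { fields } = simplifyParsedResolveInfoFragmentWithType(parsedInfo, info.returnType);
    return {
      // Only call the helpers whose subfields were actually requested
      isDistributedInboundEnabled: 'isDistributedInboundEnabled' in fields
        ? await getIsDistributedInboundEnabled()
        : undefined,
      isBusinessHoursEnabled: 'isBusinessHoursEnabled' in fields
        ? await getIsBusinessHoursEnabled()
        : undefined,
    };
  }
}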

Related

How to react on GraphQL schema directives using graphql-tools

I have the following directive in my GraphQL schema to check if arguments are real slugs:
directive @slug on ARGUMENT_DEFINITION

type Query {
  subject(id: ID! @slug): Subject
}
We had a working solution before using SchemaDirectiveVisitor from apollo-server-express, which to my understanding was just a re-export from graphql-tools. As this was removed with the latest major release, we have to refactor it. As far as I understand, the graphql-tools API has also changed, so to react to directives we have to follow the examples here.
So this is what I have so far, but it doesn't go into the fieldConfig.resolve function at all:
export default function (schema) {
  return mapSchema(schema, {
    [MapperKind.ARGUMENT]: fieldConfig => {
      const slugDirective = getDirective(schema, fieldConfig, 'slug')?.[0]
      if (slugDirective) {
        const { resolve = defaultFieldResolver } = fieldConfig
        fieldConfig.resolve = async function (
          source,
          args,
          context,
          info
        ) {
          const result = await resolve(source, args, context, info)
          console.log('result', result)
        }
        return fieldConfig
      }
    },
  })
}
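For comparison, here is a minimal, untested sketch of an alternative transformer (not from the original post): argument configs carry no resolver of their own, so one common pattern is to visit MapperKind.OBJECT_FIELD, check each argument for the @slug directive via getDirective, and wrap the field's resolver instead.

// Sketch under the assumption above; slugDirectiveTransformer and the validation hook are illustrative names.
import { mapSchema, getDirective, MapperKind } from '@graphql-tools/utils'
import { defaultFieldResolver } from 'graphql'

export default function slugDirectiveTransformer (schema) {
  return mapSchema(schema, {
    [MapperKind.OBJECT_FIELD]: fieldConfig => {
      // Collect the arguments of this field that are annotated with @slug
      const slugArgs = Object.entries(fieldConfig.args || {}).filter(
        ([, argConfig]) => getDirective(schema, argConfig, 'slug')?.[0]
      )
      if (slugArgs.length === 0) return fieldConfig

      const { resolve = defaultFieldResolver } = fieldConfig
      fieldConfig.resolve = async function (source, args, context, info) {
        for (const [argName] of slugArgs) {
          // Hypothetical validation hook; replace with real slug checking
          console.log(`checking @slug argument "${argName}":`, args[argName])
        }
        return resolve(source, args, context, info)
      }
      return fieldConfig
    },
  })
}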

Multiple graphql queries in Gatsby component

I need to run multiple GraphQL queries within a component and within the gatsby-node.js file (because Prismic is limited to 20 entries per response...🙄).
I tried the following, just to see if I could create the GraphQL loop in the default function:
export default () => {
  async function allPosts() {
    let data
    await graphql(`
      query allDitherImages {
        prismic {
          allProjects(sortBy: meta_firstPublicationDate_DESC) {
            totalCount
            pageInfo {
              startCursor
              endCursor
              hasNextPage
              hasPreviousPage
            }
            edges {
              node {
                cover_image
                cover_imageSharp {
                  name
                }
              }
            }
          }
        }
      }
    `).then(initialRes => {
      data = initialRes
    })
    return data
  }
  allPosts().then(result => {
    console.log(result)
  })
  return null
}
But then Gatsby tells me that "Gatsby related 'graphql' calls are supposed to only be evaluated at compile time, and then compiled away. Unfortunately, something went wrong and the query was left in the compiled code."
How can I run multiple graphql queries?
Thank you in advance :)
Michael
The gatsby-source-prismic-graphql package will create pages for all of your Prismic items (more than just the first 20), as it iterates over all items under the hood, so I'd advise looking into that if you want to generate pages for all of those items.
But if you need to get all items and pass them in the pageContext or something, you'll need to do the recursion yourself in gatsby-node.
In gatsby-node, after you have defined the query, you can use something like this to iterate over the results and push them into an array.
let documents = [];

async function getAllDocumentsRecursively (query, prop, endCursor = '') {
  const results = await graphql(query, { after: endCursor })
  const hasNextPage = results.data.prismic[prop].pageInfo.hasNextPage
  endCursor = results.data.prismic[prop].pageInfo.endCursor

  results.data.prismic[prop].edges.forEach(({ node }) => {
    documents.push(node)
  });

  if (hasNextPage) {
    await getAllDocumentsRecursively(query, prop, endCursor)
  }
}

await getAllDocumentsRecursively(documentsQuery, 'allDitherImages');
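For this to work, the query passed in (documentsQuery, which the answer does not show) needs to accept the cursor as a variable; a hypothetical shape, assuming the same Prismic connection as in the question, could be:

// Hypothetical documentsQuery; field names follow the question's allProjects connection
const documentsQuery = `
  query allDitherImages($after: String) {
    prismic {
      allProjects(sortBy: meta_firstPublicationDate_DESC, after: $after) {
        pageInfo {
          hasNextPage
          endCursor
        }
        edges {
          node {
            cover_image
          }
        }
      }
    }
  }
`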
Then in your createPage, pass the array into the context:
createPage({
  path: `/` + node._meta.uid,
  component: allDitherTempate,
  context: {
    documents: documents
  }
})

Split the GraphQL resolvers file into separate files

I'm working with GraphQL and have a resolvers.js file that looks like this:
const User = require("../models/User");
const Post = require("../models/Post");

module.exports = {
  Query: {
    async users() {...},
    async user() {...},
    async posts() {...},
    async post() {...},
  },
  User: {...},
  Post: {...},
  Mutation: {
    createUser() {...},
    login() {...},
    createPost() {...},
  },
}
But if I have more models, queries and mutations, the file is going to get very long. How can I split this into separate files? One for user queries and mutations, one for posts, and so on. Or is that not possible? Maybe there's a way to combine this with the schema.js file, so that I can split the schema too and put the schema/resolvers for User into one file? I'm still a beginner in coding.
I found a very easy way to do it, actually. In schema.js I can use lodash merge to combine multiple resolver files, and for the typeDefs I just use an array. This way I can split everything into separate files.
// makeExecutableSchema comes from graphql-tools (or @graphql-tools/schema in newer versions)
const { makeExecutableSchema } = require("graphql-tools");
const { merge } = require("lodash");

module.exports = makeExecutableSchema({
  typeDefs: [typeDefs, userTypeDefs],
  resolvers: merge(resolvers, userResolvers)
});
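For illustration only (file and model names assumed), one of the split resolver files, e.g. a hypothetical userResolvers.js, could look roughly like this:

// Hypothetical userResolvers.js; assumes a Mongoose-style User model as in the question
const User = require("../models/User");

module.exports = {
  Query: {
    async users() { return User.find(); },
    async user(_, { id }) { return User.findById(id); },
  },
  Mutation: {
    async createUser(_, args) { return User.create(args); },
  },
};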
Just in case somebody is looking for an answer in 2020: I had a similar issue and tried to adapt the method mentioned above, but found an easier way to solve the problem.
I used graphql-tools's mergeResolvers to solve the issue: https://www.graphql-tools.com/docs/merge-resolvers/
Example code would look like this:
const { mergeResolvers } = require('@graphql-tools/merge');
const clientResolver = require('./clientResolver');
const productResolver = require('./productResolver');

const resolvers = [
  clientResolver,
  productResolver,
];

module.exports = mergeResolvers(resolvers);
The lodash merge would not differentiate Query and Mutation, which threw an error in my case.
Here is how I did it. Note that this is in TypeScript.
You would define your resolvers in separate files, such as this:
import { DateBidListResolvers } from "../../types/generated";

export const DateBidList: DateBidListResolvers.Type = {
  ...DateBidListResolvers.defaultResolvers,
  list: (_, __) => { // This is an example resolver of type DateBidList
    throw new Error("Resolver not implemented");
  }
};
Then you would aggregate them together in a single file like this:
import { Resolvers } from "../../types/generated";
import { Query } from "./Query";
import { User } from "./User";
import { DateBid } from "./DateBid";
import { DateItem } from "./DateItem";
import { Match } from "./Match";
import { Mutation } from "./Mutation";
import { Subscription } from "./Subscription";
import { DateBidList } from "./DateBidList";
import { DateList } from "./DateList";
import { Following } from "./Following";
import { MatchList } from "./MatchList";
import { Message } from "./Message";
import { MessageItem } from "./MessageItem";
import { Queue } from "./Queue";
export const resolvers: Resolvers = {
  DateBid,
  DateBidList,
  DateItem,
  DateList,
  Following,
  Match,
  MatchList,
  Message,
  MessageItem,
  Mutation,
  Query,
  Queue,
  Subscription,
  User
};
You could then import that resolvers export into your configuration setup:
import { resolvers } from './resolvers/index';
// ... other imports here
export const server = {
  typeDefs,
  resolvers,
  playground,
  context,
  dataSources,
};

export default new ApolloServer(server);
I hope this helps!

Switching from graphql-js to native graphql schemas?

Currently trying to switch from graphql-js to literal GraphQL types/schemas, I'd like to know if anyone has had any experience with this.
Let's take this really simple one:
// GraphQLObjectType and GraphQLString come from the graphql package
import { GraphQLObjectType, GraphQLString } from 'graphql';

const Person = new GraphQLObjectType({
  name: 'Person',
  fields: () => ({
    name: {
      type: GraphQLString,
      description: 'Person name',
    },
  }),
});
I'd like to switch to the native GraphQL schema syntax, i.e.
type Person {
  # Person name
  name: String
}
However this would have to be incremental, and given the use of graphql-js, the best solution for now would be to parse GraphQL template literals to GraphQLObjectType (or any other type for that matter). Does anyone have experience doing this, I cannot seem to find any library for it unfortunately.
import { printType } from 'graphql';
printType(Person)
output:
type Person {
  """Person name"""
  name: String
}
Here is the demo:
import { expect } from 'chai';
import { printType, printSchema, buildSchema, GraphQLSchema } from 'graphql';
import { logger } from '../util';
import { Person } from './';

describe('test suites', () => {
  it('convert constructor types to string types', () => {
    const stringTypeDefs = printType(Person).replace(/\s/g, '');
    logger.info(printType(Person));
    const expectValue = `
      type Person {
        """Person name"""
        name: String
      }
    `.replace(/\s/g, '');
    expect(stringTypeDefs).to.be.equal(expectValue);
  });

  it('buildSchema', () => {
    const stringTypeDefs = printType(Person);
    const schema = buildSchema(stringTypeDefs);
    expect(schema).to.be.an.instanceof(GraphQLSchema);
  });

  it('printSchema', () => {
    const stringTypeDefs = printType(Person);
    const schema = printSchema(buildSchema(stringTypeDefs));
    logger.info(schema);
    const expectValue = `
      type Person {
        """Person name"""
        name: String
      }
    `.replace(/\s/g, '');
    expect(schema.replace(/\s/g, '')).to.be.eql(expectValue);
  });
});
source code:
https://github.com/mrdulin/nodejs-graphql/blob/master/src/convert-constructor-types-to-string-types/index.spec.ts
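Going in the direction the question actually asks about (from an SDL template literal back to a GraphQLObjectType), a minimal sketch is to build a schema from the type definition and pull the constructed type back out; the PersonType name below is only illustrative:

import { buildSchema, GraphQLObjectType } from 'graphql';

const sdl = `
  type Person {
    """Person name"""
    name: String
  }
`;

// buildSchema parses the SDL; getType returns the constructed GraphQLObjectType
const PersonType = buildSchema(sdl).getType('Person') as GraphQLObjectType;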
You can use graphql-cli to extract a native GraphQL schema from a GraphQL server. All you need to do is:
1. Download the tool: npm i -g graphql-cli
2. Run graphql init in the directory of your project to create a .graphqlconfig file
3. Start your GraphQL server
4. Run graphql get-schema, and this will generate your schema in native GraphQL SDL
SAMPLE .graphqlconfig
{
  "projects": {
    "my_sample_project": {
      "schemaPath": "schema.graphql",
      "extensions": {
        "endpoints": {
          "local": "http://localhost:8080/graphql"
        }
      }
    }
  }
}
We leverage the auto-generation of graphql schema/queries/mutations for our CI workflows.

Emit deprecation warnings with Apollo client

Background
We are working on a fairly large Apollo project. A very simplified version of our api looks like this:
type Operation {
  foo: String
  activity: Activity
}

type Activity {
  bar: String
  # Lots of fields here ...
}
We've realised that splitting Operation and Activity brings no benefit and only adds complexity. We'd like to merge them, but there are a lot of queries that assume this structure in the code base. In order to make the transition gradual, we add @deprecated directives:
type Operation {
  foo: String
  bar: String
  activity: Activity @deprecated
}

type Activity {
  bar: String @deprecated(reason: "Use Operation.bar instead")
  # Lots of fields here ...
}
Actual question
Is there some way to highlight those deprecations going forward? Preferably by printing a warning in the browser console when (in the test environment) running a query that uses a deprecated field?
Coming back to GraphQL two years later, I just found out that schema directives can be customized (nowadays?). So here's a solution:
import { SchemaDirectiveVisitor } from "graphql-tools"
import { defaultFieldResolver } from "graphql"
import { ApolloServer } from "apollo-server"

class DeprecatedDirective extends SchemaDirectiveVisitor {
  public visitFieldDefinition(field) {
    field.isDeprecated = true
    field.deprecationReason = this.args.reason
    const { resolve = defaultFieldResolver } = field
    field.resolve = async function (...args) {
      const [_, __, ___, info] = args
      const { operation } = info
      const queryName = operation.name.value
      // eslint-disable-next-line no-console
      console.warn(
        `Deprecation Warning:
        Query [${queryName}] used field [${field.name}]
        Deprecation reason: [${field.deprecationReason}]`)
      return resolve.apply(this, args)
    }
  }

  public visitEnumValue(value) {
    value.isDeprecated = true
    value.deprecationReason = this.args.reason
  }
}

new ApolloServer({
  typeDefs,
  resolvers,
  schemaDirectives: {
    deprecated: DeprecatedDirective,
  },
}).listen().then(({ url }) => {
  console.log(`🚀 Server ready at ${url}`)
})
This works on the server instead of the client, but it should print all the info needed to track down the offending query on the client. And having it in the server logs seems preferable from a maintenance perspective.
