DataLoader is not working as expected in GraphQL resolver

I'm using a DataLoader alongside my GraphQL resolvers, like the following:
async function testDataLoader(accountNumber, req, args) {
  const dummy = new DataLoader(async accountNumber => {
    return new Promise(async (resolve, reject) => {
      // rest call
      return resolve([<rest result>])
    });
  });
  return dummy.load(accountNumber)
}
export default {
  Friends: {
    query1: async ({ req, args }) => {
      const data = await testDataLoader(["12121"], req, args);
      // do something with data
    },
    query2: async ({ req, args }) => {
      const data = await testDataLoader(["12121"], req, args);
      // do something with data
    }
  }
};
When we query like:
Friends {
  query1
  query2
}
I expect DataLoader to call my REST service only once. However, I can see that the REST service is called twice. I'm not sure where I'm making a mistake.

The issue is that every time you're calling testDataLoader, you're creating a new instance of DataLoader. You should create a single DataLoader instance per request (per resource you're loading). This way every time you call load you're interacting with the same cache.
You could do something like:
const dummy = new DataLoader(...);

async function testDataLoader(accountNumber) {
  return dummy.load(accountNumber)
}
But this would persist the DataLoader between requests, which you don't want to do. What you should do is create the DataLoader instance as part of your context, which is recreated each time a request is executed.
const context = async ({ req }) => {
  return {
    testLoader: new DataLoader(...),
  };
};

const server = new ApolloServer({
  ...
  context,
})
Then just call your loader directly inside your resolver:
query2: async (parent, args, context) => {
  const data = await context.testLoader.load(["12121"]);
  ...
}
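For reference, here is a minimal sketch of what the batch function hidden behind new DataLoader(...) could look like; fetchAccounts is a hypothetical helper that calls the REST service once for the whole batch. DataLoader requires the batch function to return one result per key, in the same order as the keys it receives, which is what makes a single REST call per request possible.

// Sketch only: fetchAccounts is a hypothetical helper that performs one
// REST call for all account numbers collected during this tick.
const accountLoader = new DataLoader(async (accountNumbers) => {
  const accounts = await fetchAccounts(accountNumbers); // single REST call
  // DataLoader expects one result per key, in the same order as the keys
  return accountNumbers.map(
    (num) => accounts.find((account) => account.accountNumber === num) || null
  );
});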

Related

How to access Koa context (and koa-session) from Apollo resolvers?

Given the following Koa application initialization:
const apolloServer = new ApolloServer({
  ...processSchema(schemas),
  ...graphqlConfig,
  cors: false, // already present with Koa
  context: ctx => {
    console.log( "*** 2", ctx.session );
    // -> *** 2 undefined
  }
});
const app = new Koa();
const serverHttp = app
  .use(cors(CORS_CONFIG))
  .use(session(SESSION_CONFIG, app))
  .use(async (ctx, next) => {
    console.log( "*** 1", ctx.session );
    // *** 1 Session { ...session object }
    await next();
    if (!ctx.body) {
      ctx.throw(404);
    }
  })
  .use(koaBody())
  .use(apolloServer.getMiddleware())
  .listen(port)
;
As you can see, making any GraphQL query will output
*** 1 Session { ...session object }
*** 2 undefined
This shows that Apollo receives neither the Koa context nor the session.
Is it possible to have access to the session from context function?
Is it possible to have access to the session from a resolver?
Yes, it is possible to access the session from the context function. Here's how to do it:
const server = new ApolloServer({
  typeDefs,
  resolvers,
  context: (req) => {
    // with apollo-server-koa, the argument passed here carries the Koa context as `ctx`
    const { session } = req.ctx;
    // return an object with whatever properties you
    // need to be accessible inside resolvers as `context`
    return {
      userSession: session
    }
  }
})
Then, inside your resolver, you can access it the following way:
const resolvers = {
  Query: {
    books: (parent, args, context) => {
      const { userSession } = context;
      const books = [...];
      return books;
    }
  }
}
Hope this helps.

Apollo Server - Apply Authentication to Certain Resolvers Only with Passport-JWT

I currently have a Node.js back-end running Express with Passport.js for authentication and am attempting to switch to GraphQL with Apollo Server. My goal is to implement the same authentication I am using currently, but cannot figure out how to leave certain resolvers public while enabling authorization for others. (I have tried researching this question extensively yet have not been able to find a suitable solution thus far.)
Here is my code as it currently stands:
My JWT Strategy:
const opts = {};
opts.jwtFromRequest = ExtractJwt.fromAuthHeaderAsBearerToken();
opts.secretOrKey = JWT_SECRET;

module.exports = passport => {
  passport.use(
    new JwtStrategy(opts, async (payload, done) => {
      try {
        const user = await UserModel.findById(payload.sub);
        if (!user) {
          return done(null, false, { message: "User does not exist!" });
        }
        done(null, user);
      } catch (error) {
        done(error, false);
      }
    })
  );
}
My server.js and Apollo configuration:
(I am currently extracting the bearer token from the HTTP headers and passing it along to my resolvers using the context object):
const apollo = new ApolloServer({
  typeDefs,
  resolvers,
  context: async ({ req }) => {
    let authToken = "";
    try {
      if (req.headers.authorization) {
        authToken = req.headers.authorization.split(" ")[1];
      }
    } catch (e) {
      console.error("Could not fetch user info", e);
    }
    return {
      authToken
    };
  }
});

apollo.applyMiddleware({ app });
And finally, my resolvers:
exports.resolvers = {
  Query: {
    hello() {
      return "Hello world!";
    },
    async getUserInfo(root, args, context) {
      try {
        const { id } = args;
        let user = await UserModel.findById(id);
        return user;
      } catch (error) {
        return "null";
      }
    },
    async events() {
      try {
        const eventsList = await EventModel.find({});
        return eventsList;
      } catch (e) {
        return [];
      }
    }
  }
};
My goal is to leave certain queries, such as the first one ("hello"), public while restricting the others to requests with valid bearer tokens only. However, I am not sure how to implement this authorization in the resolvers using Passport.js and Passport-JWT specifically. Authorization is usually done by adding middleware to certain endpoints, but since I would only have one endpoint (/graphql) in this example, that option would restrict all queries to authenticated users, which is not what I am looking for. I have to perform the authorization in the resolvers somehow, yet I'm not sure how to do this with the tools available in Passport.js.
Any advice is greatly appreciated!
I would create a schema directive to authorize queries on field definitions and then use that directive wherever I want to apply authorization. Sample code:
class authDirective extends SchemaDirectiveVisitor {
  visitObject(type) {
    this.ensureFieldsWrapped(type);
    type._requiredAuthRole = this.args.requires;
  }

  visitFieldDefinition(field, details) {
    this.ensureFieldsWrapped(details.objectType);
    field._requiredAuthRole = this.args.requires;
  }

  ensureFieldsWrapped(objectType) {
    // Mark the GraphQLObjectType object to avoid re-wrapping:
    if (objectType._authFieldsWrapped) return;
    objectType._authFieldsWrapped = true;

    const fields = objectType.getFields();
    Object.keys(fields).forEach(fieldName => {
      const field = fields[fieldName];
      const { resolve = defaultFieldResolver } = field;
      field.resolve = async function (...args) {
        // your authorization code
        return resolve.apply(this, args);
      };
    });
  }
}
And declare the directive in your type definitions:
directive @authorization(requires: String) on OBJECT | FIELD_DEFINITION
Then map the schema directive in your schema:
....
resolvers,
schemaDirectives: {
  authorization: authDirective
}
Then use it on your API endpoint or any object:
type Query {
  hello: String
  getUserInfo: Result @authorization(requires: "authToken")
  events: EventResult @authorization(requires: "authToken")
}
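As a rough illustration of what the "// your authorization code" placeholder could contain with the Passport-JWT setup from the question, here is a sketch that checks the bearer token the Apollo context function put on the context. jsonwebtoken's verify and apollo-server-express's AuthenticationError are assumptions on my part, not part of the original answer.

const { AuthenticationError } = require("apollo-server-express");
const jwt = require("jsonwebtoken");

field.resolve = async function (...args) {
  // resolver arguments are (root, params, context, info)
  const context = args[2];
  if (!context.authToken) {
    throw new AuthenticationError("Missing bearer token");
  }
  try {
    // verify the token that the Apollo context function extracted
    jwt.verify(context.authToken, JWT_SECRET);
  } catch (e) {
    throw new AuthenticationError("Invalid or expired token");
  }
  return resolve.apply(this, args);
};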

How do you do async setup outside of a lambda?

Config makes a call to the parameter store and returns a config object. I need to wait before initialising mysql.
const config = require('./config');
const mysql = require('serverless-mysql')(config);

exports.handler = (event, context) => {
  // mysql stuff
}
I assume you need to wait for this to happen?
const mysql = require('serverless-mysql')(config)??
If so, then do this:
const config = require('./config');

async function mySQLStuff() {
  let mysql;
  try {
    mysql = await require('serverless-mysql')(config);
  } catch (error) {
    // handle error
  }
  return mysql;
};
exports.handler = (event, context) => {
  return mySQLStuff()
    .then((data) => {
      // mysql stuff
    });
};
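Another common sketch of the same idea, assuming ./config exports a promise that resolves with the config object (the question doesn't show its implementation): start the setup once at module scope, cache the promise, and await it inside the handler so warm invocations reuse the initialised client.

const configPromise = require('./config'); // assumed to resolve with the config object

// start the async setup once, at module scope, and cache the promise
const mysqlPromise = configPromise.then((config) =>
  require('serverless-mysql')(config)
);

exports.handler = async (event, context) => {
  // every invocation awaits the same cached promise; after the cold start
  // it resolves immediately
  const mysql = await mysqlPromise;
  // mysql stuff
};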

Apollo GraphQL server; setting context to handle requests triggered by a fired subscription

I understand how to set the context object when creating a GraphQL server e.g.
const app = express();

app.use(GRAPHQL_URL, graphqlExpress({
  schema,
  context: {
    foo: 'bar'
  },
}));
so that the context object is passed to my resolvers when handling an incoming request.
However I'm not seeing this context object when the resolvers are triggered by a subscription (i.e. a client subscribes to a GraphQL subscription, and defines the shape of the data to be sent to them when the subscription fires); in that case the context appears to be an empty Object.
Is there way to ensure that my context object is set correctly when resolvers are called following a PubSub.publish() call?
I guess you are using the package subscriptions-transport-ws. In that case it is possible to add a context value at different execution steps. See the API. There are two possible scenarios:
If you have some kind of authentication, you could add a viewer to the context at the onConnect execution step. This is done at the first connection to the websocket and won't change until the connection is closed and opened again. See example (a rough sketch follows the middleware code below).
If you want to add the context more dynamically, you can add a kind of middleware before the execute step. It could look like this:
const middleware = (args) => new Promise((resolve, reject) => {
  const [schema, document, root, context, variables, operation] = args;
  context.foo = "bar"; // add something to context
  resolve(args);
})

subscriptionServer = SubscriptionServer.create({
  schema: executable.schema,
  subscribe,
  execute: (...args) => middleware(args).then(args => {
    return execute(...args);
  })
}, {
  server: websocketServer,
  path: "/graphql",
});
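For the first scenario, here is a minimal sketch of the onConnect approach, reusing the names from the snippet above; findViewerByToken and the authToken connection parameter are hypothetical stand-ins for your own authentication:

const subscriptionServer = SubscriptionServer.create({
  schema: executable.schema,
  execute,
  subscribe,
  // connectionParams is whatever the client sent when opening the websocket
  onConnect: async (connectionParams) => {
    // findViewerByToken is a hypothetical auth lookup
    const viewer = await findViewerByToken(connectionParams.authToken);
    if (!viewer) {
      throw new Error("Unauthorized");
    }
    // the returned object becomes the base context for operations on this connection
    return { viewer };
  },
}, {
  server: websocketServer,
  path: "/graphql",
});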
Here is my solution. You can pass the context and do the authentication for a GraphQL subscription (WebSocket) like this:
const server = new ApolloServer({
  typeDefs,
  resolvers,
  context: contextFunction,
  introspection: true,
  subscriptions: {
    onConnect: (
      connectionParams: IWebSocketConnectionParams,
      webSocket: WebSocket,
      connectionContext: ConnectionContext,
    ) => {
      console.log('websocket connect');
      console.log('connectionParams: ', connectionParams);
      if (connectionParams.token) {
        const token: string = validateToken(connectionParams.token);
        const userConnector = new UserConnector<IMemoryDB>(memoryDB);
        let user: IUser | undefined;
        try {
          const userType: UserType = UserType[token];
          user = userConnector.findUserByUserType(userType);
        } catch (error) {
          throw error;
        }
        const context: ISubscriptionContext = {
          // pubsub: postgresPubSub,
          pubsub,
          subscribeUser: user,
          userConnector,
          locationConnector: new LocationConnector<IMemoryDB>(memoryDB),
        };
        return context;
      }
      throw new Error('Missing auth token!');
    },
    onDisconnect: (webSocket: WebSocket, connectionContext: ConnectionContext) => {
      console.log('websocket disconnect');
    },
  },
});
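For reference, a minimal sketch of how a client could supply that token, assuming the subscriptions-transport-ws based WebSocketLink that Apollo Server 2's built-in subscriptions expect; the URL is a placeholder:

import { WebSocketLink } from '@apollo/client/link/ws';

// connectionParams is handed to the server's subscriptions.onConnect hook
const wsLink = new WebSocketLink({
  uri: 'ws://localhost:4000/graphql',
  options: {
    reconnect: true,
    connectionParams: {
      token: user.authToken, // read above as connectionParams.token
    },
  },
});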
You can pass the HTTP request context along with the payload via the pubsub.publish method in your mutation resolver, like this:
addTemplate: (
  __,
  { templateInput },
  { templateConnector, userConnector, requestingUser }: IAppContext,
): Omit<ICommonResponse, 'payload'> | undefined => {
  if (userConnector.isAuthrized(requestingUser)) {
    const commonResponse: ICommonResponse = templateConnector.add(templateInput);
    if (commonResponse.payload) {
      const payload = {
        data: commonResponse.payload,
        context: {
          requestingUser,
        },
      };
      templateConnector.publish(payload);
    }
    return _.omit(commonResponse, 'payload');
  }
},
Now we can get both the HTTP request context and the subscription (WebSocket) context in your Subscription resolver's subscribe method, like this:
Subscription: {
  templateAdded: {
    resolve: (
      payload: ISubscriptionPayload<ITemplate, Pick<IAppContext, 'requestingUser'>>,
      args: any,
      subscriptionContext: ISubscriptionContext,
      info: any,
    ): ITemplate => {
      return payload.data;
    },
    subscribe: withFilter(templateIterator, templateFilter),
  },
},
async function templateFilter(
  payload?: ISubscriptionPayload<ITemplate, Pick<IAppContext, 'requestingUser'>>,
  args?: any,
  subscriptionContext?: ISubscriptionContext,
  info?: any,
): Promise<boolean> {
  console.count('templateFilter');
  const NOTIFY: boolean = true;
  const DONT_NOTIFY: boolean = false;
  if (!payload || !subscriptionContext) {
    return DONT_NOTIFY;
  }
  const { userConnector, locationConnector } = subscriptionContext;
  const { data: template, context } = payload;
  if (!subscriptionContext.subscribeUser || !context.requestingUser) {
    return DONT_NOTIFY;
  }
  let results: IUser[];
  try {
    results = await Promise.all([
      userConnector.findByEmail(subscriptionContext.subscribeUser.email),
      userConnector.findByEmail(context.requestingUser.email),
    ]);
  } catch (error) {
    console.error(error);
    return DONT_NOTIFY;
  }
  //...
  return true;
}
As you can see, we now get both the subscribing user (who established the WebSocket connection with the GraphQL server) and the HTTP request user (who sent the mutation to the GraphQL server), from the subscription context and the HTTP request context respectively.
You can then do the rest of the work: if the return value of the templateFilter function is truthy, the WebSocket pushes a message with payload.data to that subscribing user; otherwise it doesn't.
The templateFilter function is executed multiple times, once per subscribing user. Inside it you get each subscribing user and apply your business logic to decide whether or not to push a WebSocket message to that user (client side).
See github example repo
Articles:
GraphQL Subscription part 1
GraphQL Subscription part 2
If you're using Apollo v3 and graphql-ws, here's a docs-inspired way to achieve context resolution:
const wsContext = async (ctx, msg, args) => {
  const token = ctx.connectionParams.authorization;
  const currentUser = await findUser(token);
  if (!currentUser) throw Error("wrong user token");
  return { currentUser, foo: 'bar' };
};
useServer(
  {
    schema,
    context: wsContext,
  },
  wsServer,
);
You could use it like so in your Apollo React client:
import { GraphQLWsLink } from '@apollo/client/link/subscriptions';
import { createClient } from 'graphql-ws';

const wsLink = new GraphQLWsLink(createClient({
  url: 'ws://localhost:4000/subscriptions',
  connectionParams: {
    authorization: user.authToken,
  },
}));
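For completeness, here is a sketch of how the wsServer passed to useServer might be created; httpServer and the /subscriptions path are assumptions chosen to match the client snippet above:

import { WebSocketServer } from 'ws';
import { useServer } from 'graphql-ws/lib/use/ws';

// attach the websocket endpoint to the existing HTTP server
const wsServer = new WebSocketServer({
  server: httpServer,      // the http.Server your Express/Apollo app listens on
  path: '/subscriptions',  // must match the url used by GraphQLWsLink
});

useServer(
  {
    schema,
    context: wsContext, // the context function shown above
  },
  wsServer,
);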

Await and Async callbacks hell

I want to make the UserDataGenerator class work like a traditional SYNC class.
My expectation is that userData.outputStructure can give me the data prepared.
let userData = new UserDataGenerator(dslContent)
userData.outputStructure
getFieldDescribe(this.inputStructure.tableName, field) is an ASYNC call which invokes Axios.get.
Below is my current progress, but it still isn't waiting for the data to be ready when I print out userData.outputStructure:
export default class UserDataGenerator {
  inputStructure = null;
  outputStructure = null;
  fieldDescribeRecords = [];

  constructor(dslContent) {
    this.outputStructure = Object.assign({}, dslContent, initSections)
    this.process()
  }

  async process() {
    await this.processSectionList()
    return this.outputStructure
  }

  async processSectionList() {
    await this.inputStructure.sections.map(section => {
      this.outputStructure.sections.push(this.processSection(section));
    })
  }

  async processSection(section) {
    let outputSection = {
      name: null,
      fields: []
    }
    let outputFields = await section.fields.map(async (inputField) => {
      return await this._processField(inputField).catch(e => {
        throw new SchemaError(e, this.inputStructure.tableName, inputField)
      })
    })
    outputSection.fields.push(outputFields)
    return outputSection
  }

  async _processField(field) {
    let resp = await ai
    switch (typeof field) {
      case 'string':
        let normalizedDescribe = getNormalizedFieldDescribe(resp.data)
        return new FieldGenerator(normalizedDescribe, field).outputFieldStructure
    }
  }
}
You're trying to await arrays, which doesn't work as you expect. When dealing with arrays of promises, you still need to use Promise.all before you can await it - just like you cannot chain .then on the array.
So your methods should look like this:
async processSectionList() {
  const sections = await Promise.all(this.inputStructure.sections.map(section =>
    this.processSection(section)
  ));
  this.outputStructure.sections.push(...sections);
}

async processSection(section) {
  return {
    name: null,
    fields: [await Promise.all(section.fields.map(inputField =>
      this._processField(inputField).catch(e => {
        throw new SchemaError(e, this.inputStructure.tableName, inputField)
      })
    ))]
  };
}
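One more note: constructors can't be awaited, so callers still need to wait for process() to finish before reading outputStructure. A minimal usage sketch, assuming you await the promise that process() returns rather than relying on the constructor:

async function buildUserData(dslContent) {
  const userData = new UserDataGenerator(dslContent);
  // process() resolves once every section and field has been generated
  await userData.process();
  return userData.outputStructure;
}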
