GraphQL: share the context among all the resolvers

I am implementing authorization in GraphQL. I am storing the authorization status in the context and want to share it with every resolver. I know that Apollo Server has a context that is shared across all the resolvers, i.e.:
const server = new ApolloServer({
  schema,
  resolvers,
  context: async ({ event, context, appId, userId }) => {
    ....
    return { event, context, appId, userId };
  },
})
But my problem is that I want to update the context in different files, and the other resolvers should always see the updated context.
Is there any way to create the context such that changes made to the context object are always reflected in the other resolvers?
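A minimal sketch of the idea, with made-up field and resolver names: Apollo calls the context function once per request, and every resolver of that request receives the same object, so mutating a field on it in one resolver is visible to resolvers that run afterwards in the same operation (note that resolvers for sibling fields may run in parallel, so the order of those mutations is not guaranteed, and the object is not shared across requests).
const server = new ApolloServer({
  schema,
  resolvers,
  context: async ({ event, context, appId, userId }) => {
    // one object per request; every resolver of this request gets this same reference
    return { event, context, appId, userId, auth: { status: 'unknown' } };
  },
});

// illustrative resolvers
const resolvers = {
  Mutation: {
    login: async (_, args, ctx) => {
      ctx.auth.status = 'authorized'; // later resolvers of this same request see the change
      return true;
    },
  },
  Query: {
    me: (_, __, ctx) => {
      // reads whatever the current request has written so far; lookupUser is hypothetical
      return ctx.auth.status === 'authorized' ? lookupUser(ctx.userId) : null;
    },
  },
};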

Related

Apollo server accessing all datasources within resolvers before API request is sent

In a GraphQL query with multiple resolvers, I'm looking for a way to count how many times data sources are called before the first data source API request is sent. The project I am working on requires me to either allow or block all the requests if the number of data sources called within the resolvers of a GraphQL query exceeds a certain number.
I am using an instance of RESTDataSource to make API calls, and each of the resolvers calls one or more data sources from the RESTDataSource class. I've been looking into this and, as far as I know, the RESTDataSource class doesn't have a method that shows me all the data sources requested, because it is only called by the resolver and per request.
My problem is that I'm not finding a place where I can access all the data sources that will be called before the request is sent. I found that in the Apollo Server instantiation, the only thing I have access to is the resolvers, not the data sources within each resolver, and, as far as I know, not before the request is made, so I can't stop it if the number of data source calls exceeds a certain threshold. I was hoping I could access that in the willSendRequest method inside the RESTDataSource class since, from what I know, that is the only method that intercepts the request before it is sent, but I don't think it's possible.
I'm pretty new to Apollo and I've been reading about this but didn't find a solution. I'd really appreciate any help.
Here's a simplified snippet of my code (not the original code):
resolvers.ts
export const resolvers: Resolvers = {
  Query: {
    getCompanies: (_, __, { dataSources }) => {
      return dataSources.companyDatasource.getCompanies();
    },
    getCompany: (_, { name }, { dataSources }) => {
      return dataSources.companyDatasource.getCompanyByName(name);
    },
    getCompanyCEOs: async (_, { name }, { dataSources }) => {
      const company = await dataSources.companyDatasource.getCompanyByName(name);
      return dataSources.companyDatasource.getCEOs(company.id);
    },
    ....
company.datasource.ts
export default class CompanyDatasource extends RESTDataSource {
  async willSendRequest(request) {
    // some logic
  }
  async getCompanies() {
    return this.get(`some_api_url`);
  }
  async getCompanyByName(name) {
    return this.get(`some_api_url?companyName=${name}`);
  }
  // other external API endpoints
  ...
}
main.ts
const server = new ApolloServer({
  typeDefs: schema,
  resolvers,
  dataSources,
  cache: 'bounded',
});
await server.start();
Edit: I'm limiting the number of unique data source API calls because the API I'm hitting has a limit. I tried instantiating a counter in the RESTDataSource class and using it in willSendRequest to count how many data source calls there are, but the problem is that this counts request by request and has no access to all the API requests coming from the resolvers. For instance, if the getCompanies API can be called only once and I have two upcoming requests, I'll have to let one of them pass and only stop the second, because at that point I don't know a second request is coming. My team has agreed to stop both requests if the number of upcoming requests exceeds the available limit for the endpoint (this is specified in our database), which is why I need to know beforehand how many API requests there are before even allowing the first one.
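For reference, a minimal sketch of the counter attempt described in the edit (SOME_LIMIT and the error message are placeholders); it also illustrates why counting inside willSendRequest cannot see the calls that haven't happened yet:
export default class CompanyDatasource extends RESTDataSource {
  constructor() {
    super();
    this.callCount = 0; // fresh instance per request, so this only counts calls made so far
  }
  async willSendRequest(request) {
    this.callCount += 1;
    // Only the calls issued *before* this one are known at this point;
    // later data source calls in the same query haven't run yet,
    // which is why the limit can't be enforced up front here.
    if (this.callCount > SOME_LIMIT) { // SOME_LIMIT is a hypothetical constant
      throw new Error('data source call limit exceeded');
    }
  }
}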

apollo-server how to access datasource in koa/express middleware

I'm new to the apollo-server world, so maybe I've gotten something wrong.
In apollo-server, a DataSource is meant to be created and initialized by ApolloServer. But I also want to access some DataSources in the Koa middleware. Let's say I have a DataSource called UserDataSource, and this is how I create it with ApolloServer:
const server = new ApolloServer({
  typeDefs,
  resolvers,
  dataSources: function () {
    return {
      user: new UserDataSource(),
    };
  },
});

// this is my middleware
async function auth(ctx, next) {
  // Here I need to access `UserDataSource`
  // pass control down to apollo-server
  await next();
}

// koa server
const app = new Koa();
app.use(auth);
server.applyMiddleware({ app });
After this, I can access dataSources.user in resolvers. But some other middleware also needs to access UserDataSource. Are there any official guides on how to achieve this?
PS: maybe I could manually create all the data sources in one middleware and assign them to ctx, but I think this is a bit of a hack.
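A minimal sketch of the workaround mentioned in the PS (names are illustrative): create the data source instances in a middleware and attach them to ctx.state so other middleware can reach them. Be aware that a RESTDataSource created outside of Apollo's request pipeline has not had initialize() called on it, so it won't have the per-request context or cache.
// hypothetical sketch, not an official pattern
async function dataSourcesMiddleware(ctx, next) {
  ctx.state.dataSources = {
    user: new UserDataSource(),
  };
  await next();
}

async function auth(ctx, next) {
  // other middleware can now reach the same instances via ctx.state
  const { user } = ctx.state.dataSources;
  // ...use `user` for the auth check...
  await next();
}

const app = new Koa();
app.use(dataSourcesMiddleware);
app.use(auth);
server.applyMiddleware({ app });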

How to use passport-local with graphql

I'm trying to implement GraphQL in my project and I would like to use passport.authenticate('local') in my login Mutation
Code adaptation of what I want:
const typeDefs = gql`
  type Mutation {
    login(userInfo: UserInfo!): User
  }
`
const resolvers = {
  Mutation: {
    login: (parent, args) => {
      passport.authenticate('local')
      return req.user
    }
  }
}
Questions:
Was passport designed mostly for REST/Express?
Can I manipulate the passport.authenticate method (pass username and password to it)?
Is this even a common practice, or should I stick to some JWT library?
Passport.js is an "Express-compatible authentication middleware". authenticate returns an Express middleware function -- it's meant to prevent unauthorized access to particular Express routes. It's not really suitable for use inside a resolver. If you pass your req object to your resolver through the context, you can call req.login to manually log in a user, but you have to verify the credentials and create the user object yourself before passing it to the function. Similarly, you can call req.logout to manually log out a user. See here for the docs.
If you want to use Passport.js, the best thing to do is to create an Express app with an authorization route and a callback route for each identity provider you're using (see this for an example). Then integrate the Express app with your GraphQL service using apollo-server-express. Your client app will use the authorization route to initialize the authentication flow and the callback endpoint will redirect back to your client app. You can then add req.user to your context and check for it inside resolvers, directives, GraphQL middleware, etc.
However, if you are only using local strategy, you might consider dropping Passport altogether and just handling things yourself.
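A minimal sketch of that last suggestion, assuming req is exposed on the context and a made-up User.verifyCredentials helper stands in for your own credential check:
// hypothetical resolvers: handle the local strategy manually
const resolvers = {
  Mutation: {
    login: async (parent, { userInfo }, { req }) => {
      // verify the credentials yourself; User.verifyCredentials is a made-up helper
      const user = await User.verifyCredentials(userInfo.username, userInfo.password);
      if (!user) throw new Error('Invalid credentials');
      // req.login comes from passport.initialize(); it establishes the login session
      await new Promise((resolve, reject) =>
        req.login(user, err => (err ? reject(err) : resolve())),
      );
      return user;
    },
  },
};

// expose req/res so resolvers can call req.login / req.logout
const server = new ApolloServer({
  typeDefs,
  resolvers,
  context: ({ req, res }) => ({ req, res }),
});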
It took me a while to wrap my head around the combination of GraphQL and Passport. Especially when you want to use the local strategy together with a login mutation makes life complicated. That's why I created a small npm package called graphql-passport.
This is what the setup of the server looks like:
import express from 'express';
import session from 'express-session';
import { ApolloServer } from 'apollo-server-express';
import passport from 'passport';
import { GraphQLLocalStrategy, buildContext } from 'graphql-passport';

passport.use(
  new GraphQLLocalStrategy((email, password, done) => {
    // Adjust this callback to your needs
    const users = User.getUsers();
    const matchingUser = users.find(user => email === user.email && password === user.password);
    const error = matchingUser ? null : new Error('no matching user');
    done(error, matchingUser);
  }),
);

const app = express();

app.use(session(options)); // optional
app.use(passport.initialize());
app.use(passport.session()); // if session is used

const server = new ApolloServer({
  typeDefs,
  resolvers,
  context: ({ req, res }) => buildContext({ req, res, User }),
});

server.applyMiddleware({ app, cors: false });

app.listen({ port: PORT }, () => {
  console.log(`🚀 Server ready at http://localhost:${PORT}${server.graphqlPath}`);
});
Now you have access to Passport-specific functions and the user via the GraphQL context. This is how you can write your resolvers:
const resolvers = {
  Query: {
    currentUser: (parent, args, context) => context.getUser(),
  },
  Mutation: {
    login: async (parent, { email, password }, context) => {
      // instead of email you can pass username as well
      const { user } = await context.authenticate('graphql-local', { email, password });
      // only required if express-session is used
      context.login(user);
      return { user }
    },
  },
};
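For completeness, a login operation the client might send against this setup could look roughly like the following (the exact fields on user depend on your schema, which the answer doesn't show):
// hypothetical client-side document; `gql` comes from graphql-tag or @apollo/client
const LOGIN = gql`
  mutation Login($email: String!, $password: String!) {
    login(email: $email, password: $password) {
      user {
        email
      }
    }
  }
`;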
The combination of GraphQL and Passport.js makes sense, especially if you want to add more authentication providers like Facebook, Google, and so on. You can find more detailed information in this blog post if needed.
You should definitely use passport unless your goal is to learn about authentication in depth.
I found the most straightforward way to integrate passport with GraphQL is to:
use a JWT strategy
keep REST endpoints to authenticate and retrieve tokens
send the token to the GraphQL endpoint and validate it on the backend
Why?
If you're using a client-side app, token-based auth is the best practice anyway.
Implementing REST JWT with passport is straightforward. You could try to build this in GraphQL as described by @jkettmann, but it's way more complicated and less supported. I don't see an overwhelming benefit to doing so.
Implementing JWT in GraphQL is straightforward. See, e.g., the examples for Express or NestJS.
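A rough sketch of that approach (the endpoint, secret, and User.verifyCredentials helper are placeholders): a REST endpoint verifies the credentials and issues the token, and the Apollo context verifies the token on every GraphQL request.
import jwt from 'jsonwebtoken';

// REST endpoint (assumes express.json() is applied): issue a token after checking credentials
app.post('/login', async (req, res) => {
  const user = await User.verifyCredentials(req.body.email, req.body.password); // made-up helper
  if (!user) return res.status(401).end();
  res.json({ token: jwt.sign({ sub: user.id }, process.env.JWT_SECRET, { expiresIn: '1h' }) });
});

// GraphQL endpoint: decode the bearer token and expose the payload on the context
const server = new ApolloServer({
  typeDefs,
  resolvers,
  context: ({ req }) => {
    const token = (req.headers.authorization || '').replace(/^Bearer /, '');
    try {
      return { user: jwt.verify(token, process.env.JWT_SECRET) };
    } catch {
      return { user: null }; // resolvers decide what unauthenticated users may do
    }
  },
});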
To your questions:
Was passport designed mostly for REST/Express?
Not in principle, but most of the resources you'll find are about REST and Express.
Is this even a common practice, or should I stick to some JWT library?
Common practice is to stick to JWT.
More details here: OAuth2 in NestJS for Social Login (Google, Facebook, Twitter, etc)
Example project here: https://github.com/thisismydesign/nestjs-starter

Log Query/Mutation actions to database for Auditing

My goal is to run some kind of webhook or cloud function, or more generally to perform some action, after each successful query or mutation in GraphQL.
In other words, I want to log each and every action performed by users (a kind of history of when things were created and updated).
How can this be implemented using some kind of middleware between GraphQL and the DB (say Mongo for now)?
That middleware should be responsible for running the logging action each time a query or mutation is called from the front end.
The tech stack being used is Node, Express, GraphQL, Redis, etc.
Any suggestions would really be appreciated.
Thanks
The solution I came up with was calling a function manually after each query or mutation.
If you're using Apollo, you can utilize the formatResponse and formatError options for logging, as outlined in the docs.
const server = new ApolloServer({
  typeDefs,
  resolvers,
  formatError: error => {
    console.log(error);
    return error;
  },
  formatResponse: response => {
    console.log(response);
    return response;
  },
});
Using an extension allows you to hook into different phases of the GraphQL request and enables more granular logging. A simple example:
const _ = require('lodash')
const { GraphQLExtension } = require('graphql-extensions')
// `logger` is assumed to be your own logging instance (e.g. winston or console)

module.exports = class LoggingExtension extends GraphQLExtension {
  requestDidStart(options) {
    logger.info('Operation: ' + options.operationName)
  }

  willSendResponse(o) {
    const errors = _.get(o, 'graphqlResponse.errors', [])
    for (const error of errors) {
      logger.error(error)
    }
  }
}
There's a more involved example here. You can then add your extension like this:
const server = new ApolloServer({
  typeDefs,
  resolvers,
  extensions: [() => new YourExtension()]
});
If you're using express-graphql to serve your endpoint, your options are a bit more limited. There's still a formatError option, but no formatResponse. There is a way to pass in an extensions array as well, but the API is different from Apollo's. You can take a look at the repo for more info.
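For example, a minimal error-logging setup with express-graphql might look like this (in recent versions the option is named customFormatErrorFn; older releases call it formatError):
const { graphqlHTTP } = require('express-graphql');

app.use('/graphql', graphqlHTTP({
  schema,
  customFormatErrorFn: error => {
    console.log(error); // log the error, then return it to the client as usual
    return error;
  },
}));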

Apollo client: Can @defer be used with client-side resolvers?

For some reason, I had to build a client-side-only GraphQL server; my schema is built as follows:
private buildSchema(): GraphQLSchema {
  const allTypes: string = ...// my types
  const allResolvers: IResolvers[] = ...// my resolvers
  return makeExecutableSchema({
    typeDefs: allTypes,
    resolvers: allResolvers
  });
}
The client is as follows:
this.client = new ApolloClient({
  link: new SchemaLink({ schema: this.buildSchema() }),
  cache: new InMemoryCache({
    addTypename: false
  })
});
And everything works fine except that my queries are not deferred. For instance, if I run:
const gqlQuery: string = `
  {
    user {
      name
      slowResolver @defer {
        text
      }
    }
  }
`
const $result = this.apollo.getClient().watchQuery({
  query: gql(gqlQuery)
});
The $result is emitted only when the whole query has resolved (instead of user first and then slowResolver, as expected).
Any idea what I missed in the workflow?
The @defer directive was actually removed from Apollo, although there's been some work done to reimplement it. Even if it were implemented, though, deferred queries would have to be handled outside of the execution context. In other words, executing the schema can return a deferred execution result, but something else (like Apollo Server itself) has to handle how that response (both the initial payload and the subsequent patches) is actually sent to the client over whatever transport is used.
If you're defining a schema client-side, unfortunately, it's not going to be possible to use the @defer directive.
