GraphQL mutation "Cannot set headers after they are sent to the client" - graphql

I'm implementing a GraphQL login mutation to authenticate user login credentials. The mutation verifies the password with bcrypt and then sends a cookie to the client, which renders the user profile based on whether the cookie identifies a buyer or an owner.
GraphQL Login Mutation Code:
const Mutation = new GraphQLObjectType({
  name: 'Mutation',
  fields: {
    loginUser: {
      type: UserType,
      args: {
        email: { type: GraphQLString },
        password: { type: GraphQLString }
      },
      resolve: function (parent, args, { req, res }) {
        User.findOne({ email: args.email }, (err, user) => {
          if (user) {
            bcrypt.compare(args.password, user.password).then(isMatch => {
              if (isMatch) {
                if (!user.owner) {
                  res.cookie('cookie', "buyer", { maxAge: 900000, httpOnly: false, path: '/' });
                } else {
                  res.cookie('cookie', "owner", { maxAge: 900000, httpOnly: false, path: '/' });
                }
                return res.status(200).json('Successful login');
              } else {
                console.log('Incorrect password');
              }
            });
          }
        });
      }
    }
  }
});
Server.js:
app.use("/graphql",
(req, res) => {
return graphqlHTTP({
schema,
graphiql: true,
context: { req, res },
})(req, res);
});
Error message:
(node:10630) UnhandledPromiseRejectionWarning: Error [ERR_HTTP_HEADERS_SENT]: Cannot set headers after they are sent to the client
[0] at ServerResponse.setHeader (_http_outgoing.js:470:11)
[0] at ServerResponse.header (/Users/xxx/xxx/server/node_modules/express/lib/response.js:771:10)
[0] at ServerResponse.append (/Users/xxx/xxx/server/node_modules/express/lib/response.js:732:15)
[0] at ServerResponse.res.cookie (/Users/xxx/xxx/server/node_modules/express/lib/response.js:857:8)
[0] at bcrypt.compare.then.isMatch (/Users/xxx/xxx/server/schema/schema.js:89:41)
I've done some research on this error, but can't seem to find a relevant answer. The issue seems to be that the response is being sent more than once, hence "cannot set headers after they are sent to the client". Since I'm sending both res.cookie() and res.status(200), how could I fix this problem?

express-graphql already sets the status and sends a response for you -- there's no need to call either res.status or res.json inside your resolver.
GraphQL always returns a status of 200, unless the requested query was invalid, in which case it returns a status of 400. If errors occur while executing the request, they will be included in the response (in an errors array separate from the returned data) but the status will still be 200. This is all by design -- see additional discussion here.
Instead of calling res.json, your resolver should return a value of the appropriate type (in this particular case UserType), or a Promise that will resolve to this value.
Additionally, you shouldn't utilize callbacks inside resolvers since they are not compatible with Promises. If the bcrypt library you're using supports using Promises, use the appropriate API. If it doesn't, switch to a library that does (like bcryptjs) or wrap your callback inside a Promise. Ditto for whatever ORM you're using.
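For example, a callback-only compare could be wrapped in a Promise roughly like this (a minimal sketch; compareAsync is just an illustrative name, and both bcrypt and bcryptjs already return a Promise if you simply omit the callback):
// Minimal sketch: turning a callback-style compare into a Promise
function compareAsync(plainPassword, hash) {
  return new Promise((resolve, reject) => {
    bcrypt.compare(plainPassword, hash, (err, isMatch) => {
      if (err) return reject(err)
      resolve(isMatch)
    })
  })
}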
In the end, your resolver should look something like this:
resolve: async function (parent, args, { req, res }) {
  const user = await User.findOne({ email: args.email })
  if (user) {
    const isMatch = await bcrypt.compare(args.password, user.password)
    if (isMatch) {
      const cookieValue = user.owner ? 'owner' : 'buyer'
      res.cookie('cookie', cookieValue, { maxAge: 900000, httpOnly: false, path: '/' })
      return user
    }
  }
  // If you want an error returned in the response, just throw it
  throw new Error('Invalid credentials')
}
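For what it's worth, a client-side call might then look roughly like this (a sketch only: the /graphql path matches the server.js setup above, while the id selection assumes UserType exposes such a field):
fetch('/graphql', {
  method: 'POST',
  headers: { 'Content-Type': 'application/json' },
  credentials: 'include', // lets the browser store the cookie set via res.cookie
  body: JSON.stringify({
    query: 'mutation { loginUser(email: "a@b.com", password: "secret") { id } }',
  }),
})
  .then(res => res.json())
  .then(({ data, errors }) => {
    // the HTTP status is 200 either way; failures show up in the errors array
    if (errors) console.log(errors[0].message) // e.g. 'Invalid credentials'
    else console.log(data.loginUser)
  })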

Related

Get POST request body in Vite server.proxy["/api"].configure

I am migrating a project from Webpack to Vite and have run into an issue with proxying requests to one of the endpoints in the MVC.Net backend.
Due to the circumstances of the existing project, I need to handle certain calls manually - for example, on the initial load of the login page, checking whether the user is already authenticated and redirecting to the main page.
I am trying to figure out how to use server.proxy.configure to handle these requests. I am managing fine with the GET requests, but I cannot seem to receive the POST request's body data.
Here is what I have at the moment:
server: {
  proxy: {
    "/api": {
      target: "https://my.local.environment/",
      changeOrigin: true,
      configure: (proxy: HttpProxy.Server, options: ProxyOptions) => {
        proxy.on("proxyReq", (proxyReq, req, res, options) => {
          if (req.method === "GET") {
            //handle simple get requests. no problems here
            //...
          } else {
            const buffer = [];
            console.log("received post request");
            proxyReq.on("data", (chunk) => {
              console.log("received chunk");
              buffer.push(chunk);
            });
            proxyReq.on("end", () => {
              console.log("post request completed");
              const body = Buffer.concat(buffer).toString();
              const forwardReq = http.request(
                {
                  host: "https://my.local.environment",
                  port: 443,
                  method: "POST",
                  path: req.url,
                  headers: {
                    "Content-Type": "application/json",
                    "Content-Length": body.length,
                  },
                },
                (result) => {
                  result.on("data", (d) => {
                    res.write(d);
                    res.end();
                  });
                }
              );
              forwardReq.on("error", (error) => {
                console.log(error);
              });
              forwardReq.write(body);
              forwardReq.end();
            });
          }
        });
      },
      secure: false,
    },
  }
}
The problem is that neither the proxyReq.on("data", ...) handler nor the proxyReq.on("end", ...) handler ever actually fires.
Additionally, req.body is undefined.
I have absolutely no idea where I am supposed to be getting the POST request's body.
I ended up finding a different question about the bypass option, which gave me the solution I was looking for. I now handle only the specific GET requests that need to be handled locally instead of forwarding them to my deployed environment, and everything else gets handled automatically by Vite.
"/api": {
target: "https://my.local.environment/",
changeOrigin: true,
agent: new https.Agent({
keepAlive: true,
}),
bypass(req, res, proxyOptions) {
if (req.method === "GET") {
//... here I get what I need and write to the res object
// and of course call res.end()
}
//all other calls are handled automatically
},
secure: false,
},
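For illustration only, a bypass handler following that same pattern might look roughly like this (the /api/session path and the JSON payload are made-up placeholders, not part of the original setup):
bypass(req, res) {
  // Hypothetical example: answer one specific GET locally, let Vite proxy the rest
  if (req.method === "GET" && req.url === "/api/session") {
    res.setHeader("Content-Type", "application/json");
    res.end(JSON.stringify({ authenticated: false }));
    return;
  }
  // returning undefined/null continues processing the request with the proxy
},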

NestJs Timeout issue with HttpService

I am facing a timeout issue with the NestJS HttpService.
The error number is -60 and the error code is 'ETIMEDOUT'.
I am basically trying to call one API after the previous one succeeds.
Here is the first API call:
getUaaToken(): Observable<any> {
  //uaaUrlForClient is defined
  return this.httpService
    .post(
      uaaUrlForClient,
      { withCredentials: true },
      {
        auth: {
          username: this.configService.get('AUTH_USERNAME'),
          password: this.configService.get('AUTH_PASSWORD'),
        },
      },
    )
    .pipe(
      map((axiosResponse: AxiosResponse) => {
        console.log(axiosResponse);
        return this.getJwtToken(axiosResponse.data.access_token).subscribe();
      }),
      catchError((err) => {
        throw new UnauthorizedException('failed to login to uaa');
      }),
    );
}
Here is the second API call:
getJwtToken(uaaToken: string): Observable<any> {
  console.log('inside jwt method', uaaToken);
  const jwtSignInUrl = `${awsBaseUrl}/api/v1/auth`;
  return this.httpService
    .post(
      jwtSignInUrl,
      { token: uaaToken },
      {
        headers: {
          'Access-Control-Allow-Origin': '*',
          'Content-type': 'Application/json',
        },
      },
    )
    .pipe(
      map((axiosResponse: AxiosResponse) => {
        console.log('SUCUSUCSCUSS', axiosResponse);
        return axiosResponse.data;
      }),
      catchError((err) => {
        console.log('ERRRORRRORROR', err);
        // return err;
        throw new UnauthorizedException('failed to login for');
      }),
    );
}
Both methods are in the same service file. Strangely, when I call the second API directly through the controller like below, it works fine:
@Post('/signin')
@Grafana('Get JWT', '[POST] /v1/api/auth')
signin(@Body() tokenBody: { token: string }) {
  return this.authService.getJwtToken(tokenBody.token);
}
When the two APIs are called in sequence, however, the first one works but the second, chained one gives me the timeout issue.
Any ideas?
Two things made it work: changing the HTTP proxy settings and using switchMap.
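As a rough sketch of the switchMap part (assuming the same getUaaToken and getJwtToken methods as above, with switchMap and catchError imported from 'rxjs/operators'), chaining the calls instead of subscribing inside map would look something like this:
getUaaToken(): Observable<any> {
  return this.httpService
    .post(
      uaaUrlForClient,
      { withCredentials: true },
      {
        auth: {
          username: this.configService.get('AUTH_USERNAME'),
          password: this.configService.get('AUTH_PASSWORD'),
        },
      },
    )
    .pipe(
      // switchMap flattens the inner Observable, so the second request runs
      // after the first resolves and its result is what callers receive
      switchMap((axiosResponse: AxiosResponse) =>
        this.getJwtToken(axiosResponse.data.access_token),
      ),
      catchError(() => {
        throw new UnauthorizedException('failed to login to uaa');
      }),
    );
}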

Getting test coverage of GraphQL resolve function

I'm using Mocha, Chai, and SuperTest to test a new GraphQL endpoint set up on our Node/Express server.
I have all the tests running and passing accordingly but when I run the following script:
"test-coverage": "nyc mocha tests/ --recursive",
the users resolve function is not counted in the code coverage.
My graphql query endpoint looks as shown below:
const RootQuery = new GraphQLObjectType({
  name: 'RootQueryType',
  fields: () => ({
    users: {
      type: new GraphQLList(UserType),
      args: {
        searchByName: { type: GraphQLString },
        queryNearbyUsers: { type: GraphQLBoolean },
        skip: { type: GraphQLInt },
        limit: { type: GraphQLInt }
      },
      async resolve(parent, args, req) {
        const { searchByName, queryNearbyUsers, skip = 0, limit = 20 } = args
        // No search criteria was specified so just return an error
        if (!searchByName && !queryNearbyUsers)
          throw new Error('NO_SEARCH_CRITERIA_SPECIFIED')
        ...
      }
    },
    ...
  })
An example of one of my tests:
it('should throw an error (NO_SEARCH_CRITERIA_SPECIFIED) when no params supplied', function(done) {
  request.post('spikeql')
    .set({ Authorization: `Bearer ${token}` })
    .send({ query: '{ users { _id firstName lastName displayName rpr distanceAway avatar { styles { thumb_square }}}}' })
    .expect(200) // TODO: Setup GraphQL to match appropriate HTTP res codes
    .end((err, res) => {
      if (err) return done(err)
      let errors = res.body.errors
      expect(errors).to.be.an('array')
      // Check to make sure error was sent properly
      expect(errors[0]).to.have.property('message', 'NO_SEARCH_CRITERIA_SPECIFIED')
      expect(errors[0].message).to.be.a('string')
      done()
    })
})
I perform 3 other tests with different inputs for the GET_USERS query. All of them pass; they just don't get tracked in the coverage report.
I'm new to GraphQL and unit/integration testing, so any help is appreciated. I can supply additional info if needed.

How to pass a request header to fastify plugin options at register

I can access the request headers in a GET or POST handler:
fastify.get('/route1', (req, res, next) => {
  console.log(req.headers.Authorization)
  ...
})
I am looking for a way to pass it to a plugin register call, specifically fastify-graphql
const { graphqlFastify } = require("fastify-graphql");
fastify.register(graphqlFastify,
  {
    prefix: "/graphql",
    graphql: {
      schema: schema,
      rootValue: resolvers,
      context: { auth: req.headers.Authorization } <-----
    }
  },
  err => {
    if (err) {
      console.log(err);
      throw err;
    }
  }
);
Is there a way to write a wrapper or any ideas?
I think you can't do that.
If you read the code you will find that:
fastify-graphql is calling runHttpQuery
runHttpQuery is calling context without passing the request
So I think that you should authenticate the client with a standard JWT and then use another token server-side.
The final solution could be to check Apollo 2.0 and open an issue on fastify-graphql.
Here is a little snippet that explains the idea:
const fastify = require('fastify')({ logger: true })
const { makeExecutableSchema } = require('graphql-tools')
const { graphiqlFastify, graphqlFastify } = require('fastify-graphql');
const typeDefs = `
  type Query {
    demo: String,
    hello: String
  }
`
const resolvers = {
  Query: {
    demo: (parent, args, context) => {
      console.log({ args, context });
      return 'demo'
    },
    hello: () => 'world'
  }
}
const schema = makeExecutableSchema({ typeDefs, resolvers })
fastify.register(graphqlFastify, {
  prefix: '/gr',
  graphql: {
    schema,
    context: function () {
      return { serverAuth: 'TOKEN' }
    },
  },
});
fastify.listen(3000)
// curl -X POST 'http://localhost:3000/gr' -H 'Content-Type: application/json' -d '{"query": "{ demo }"}'
For anyone who needs to access request headers in the GraphQL context, try
graphql-fastify
Usage
Create a /graphql endpoint like the following:
const graphqlFastify = require("graphql-fastify");
fastify.register(graphqlFastify, {
  prefix: "/graphql",
  graphQLOptions
});
graphQLOptions
graphQLOptions can be provided as an object or a function that returns graphql options
graphQLOptions: {
  schema: schema,
  rootValue: resolver,
  contextValue?: context
}
If it is a function, you have access to the HTTP request and response. This allows you to do authentication and pass authentication scopes to the GraphQL context. See the following pseudo-code:
const graphQLOptions = function (request, reply) {
  const auth = decodeBearerToken(request.headers.Authorization);
  // auth may contain userId, scope permissions
  return {
    schema: schema,
    rootValue: resolver,
    contextValue: { auth }
  }
}
This way, context.auth is accessible to the resolver functions, allowing you to check the user's scopes/permissions before proceeding.
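For instance (a hypothetical resolver, not taken from the plugin's docs, with loadSecretData as a placeholder), a root-value resolver could gate its data on that context like so:
const resolver = {
  // Root-value resolvers receive (args, context, info); context is the contextValue above
  secretData: (args, context) => {
    if (!context.auth || !context.auth.scopes.includes('read:secret')) {
      throw new Error('FORBIDDEN')
    }
    return loadSecretData(context.auth.userId) // placeholder data loader
  }
}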

Relayjs Graphql user authentication

Is it possible to authenticate users with different roles solely through a GraphQL server in combination with Relay and React?
I looked around and couldn't find much info about this topic.
In my current setup, the login features with different roles still go through a traditional REST API... ('secured' with JSON Web Tokens).
I did it in one of my apps: basically you just need a User interface, which returns null on the first root query if nobody is logged in, and you can then update it with a login mutation, passing in the credentials.
The main problem is getting cookies or a session inside the Relay POST request, since it doesn't handle the cookie field in the request.
Here is my client mutation:
export default class LoginMutation extends Relay.Mutation {
  static fragments = {
    user: () => Relay.QL`
      fragment on User {
        id,
        mail
      }
    `,
  };
  getMutation() {
    return Relay.QL`mutation{Login}`;
  }
  getVariables() {
    return {
      mail: this.props.credentials.pseudo,
      password: this.props.credentials.password,
    };
  }
  getConfigs() {
    return [{
      type: 'FIELDS_CHANGE',
      fieldIDs: {
        user: this.props.user.id,
      }
    }];
  }
  getOptimisticResponse() {
    return {
      mail: this.props.credentials.pseudo,
    };
  }
  getFatQuery() {
    return Relay.QL`
      fragment on LoginPayload {
        user {
          userID,
          mail
        }
      }
    `;
  }
}
And here is my schema-side mutation:
var LoginMutation = mutationWithClientMutationId({
  name: 'Login',
  inputFields: {
    mail: {
      type: new GraphQLNonNull(GraphQLString)
    },
    password: {
      type: new GraphQLNonNull(GraphQLString)
    }
  },
  outputFields: {
    user: {
      type: GraphQLUser,
      resolve: (newUser) => newUser
    }
  },
  mutateAndGetPayload: (credentials, {
    rootValue
  }) => co(function*() {
    var newUser = yield getUserByCredentials(credentials, rootValue);
    console.log('schema:loginmutation');
    delete newUser.id;
    return newUser;
  })
});
To keep my users logged in through page refreshes, I send my own request and fill it with a cookie field... This is for now the only way I found to make it work...
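As a rough sketch of that idea in Relay Classic (the '/graphql' endpoint is an assumption), injecting the default network layer with fetch credentials makes the browser send the session cookie along with Relay's POST requests:
import Relay from 'react-relay';
// Minimal sketch: have Relay's fetch include cookies on every GraphQL request
Relay.injectNetworkLayer(
  new Relay.DefaultNetworkLayer('/graphql', {
    credentials: 'same-origin', // forwards the cookie set by the login mutation
  })
);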
