I am currently building a migration solution from one AWS Cognito user pool to another using the Cognito trigger "User Migration".
I have a group I want to assign during migration, but I cannot do it because the user isn't created until the whole trigger context finishes.
How can I solve this? I don't want to create a PostAuthentication lambda because I only need (and want) to run this once per migration, and I want it to happen the instant the migration does, or at most a few minutes later. (Or is it possible to make such a PostAuthentication trigger check whether it is firing for the first time?)
I tried PostConfirmation in the hope that it would fire when the user was created, but it did not trigger.
If someone else runs into this - I solved this using a combination of a User Migration trigger and a Pre Token Generation trigger.
In the User Migration trigger (mostly copied from https://github.com/Collaborne/migrate-cognito-user-pool-lambda), when authentication fails because the user doesn't exist in the new pool yet, look the user up in the old pool and create them in the new one; a trimmed-down sketch of that trigger is below.
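Roughly, such a migration lambda validates the submitted credentials against the old pool and returns the user's attributes so Cognito creates the user in the new pool. This is only a sketch, not the exact Collaborne lambda: the OLD_USER_POOL_ID/OLD_CLIENT_ID environment variables and the attribute mapping are placeholders, and the old pool's app client must allow ADMIN_NO_SRP_AUTH.
// userMigration.ts - trimmed-down sketch
import { UserMigrationTriggerHandler } from 'aws-lambda';
import { CognitoIdentityServiceProvider } from 'aws-sdk';

const cognito = new CognitoIdentityServiceProvider();
const OLD_USER_POOL_ID = process.env.OLD_USER_POOL_ID!; // placeholder
const OLD_CLIENT_ID = process.env.OLD_CLIENT_ID!;       // placeholder: app client of the old pool

export const handler: UserMigrationTriggerHandler = async (event) => {
    if (event.triggerSource === 'UserMigration_Authentication') {
        // Check the submitted credentials against the old pool.
        await cognito.adminInitiateAuth({
            UserPoolId: OLD_USER_POOL_ID,
            ClientId: OLD_CLIENT_ID,
            AuthFlow: 'ADMIN_NO_SRP_AUTH',
            AuthParameters: {
                USERNAME: event.userName,
                PASSWORD: event.request.password,
            },
        }).promise();

        const oldUser = await cognito.adminGetUser({
            UserPoolId: OLD_USER_POOL_ID,
            Username: event.userName,
        }).promise();

        // Copy over the attributes you care about and let Cognito create the
        // user in the new pool, already confirmed and without a welcome email.
        event.response.userAttributes = {
            email: oldUser.UserAttributes?.find(a => a.Name === 'email')?.Value ?? '',
            email_verified: 'true',
        };
        event.response.finalUserStatus = 'CONFIRMED';
        event.response.messageAction = 'SUPPRESS';
        return event;
    }
    throw new Error(`Bad triggerSource ${event.triggerSource}`);
};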
In the Pre Token Generation trigger, if the user hasn't been added to any groups yet, look up their group membership in the old user pool (adminListGroupsForUser) and add them to the same groups in the new pool (adminAddUserToGroup). The crucial part is to override the group membership claims in the response so that they end up in the token on the client side (groupsToOverride is just an array of the group names the user belongs to):
event.response = {
    "claimsOverrideDetails": {
        "claimsToAddOrOverride": {},
        "groupOverrideDetails": {
            "groupsToOverride": groupsToOverride,
        }
    }
};
Thank you @BrokenGlass, I used this approach. For anyone else, here is an example TypeScript preTokenGeneration lambda.
//preTokenGenerations.ts
import { PreTokenGenerationTriggerHandler } from 'aws-lambda';
import { preTokenAuthentication } from '../services/preTokenService';

export const handler: PreTokenGenerationTriggerHandler = async (event, context) => {
    console.log({
        event,
        context,
        request: event.request,
        userAttributes: event.request.userAttributes,
        clientMetadata: event.request.clientMetadata,
        groupConfiguration: event.request.groupConfiguration,
    });

    const OLD_USER_POOL_ID = process.env.OLD_USER_POOL_ID;
    if (!OLD_USER_POOL_ID) {
        throw new Error("OLD_USER_POOL_ID is required for the lambda to work.");
    }

    const {
        userPoolId,
        request: {
            userAttributes: {
                email
            }
        },
        region
    } = event;

    switch (event.triggerSource) {
        case "TokenGeneration_Authentication": {
            const groupsToOverride = await preTokenAuthentication({
                userPoolId,
                oldUserPoolId: OLD_USER_POOL_ID,
                username: email,
                region
            });

            event.response = {
                "claimsOverrideDetails": {
                    "claimsToAddOrOverride": {},
                    "groupOverrideDetails": {
                        "groupsToOverride": groupsToOverride,
                    }
                }
            };
            return event;
        }
        default:
            console.log(`Bad triggerSource ${event.triggerSource}`);
            return event;
    }
}
// preTokenService.ts
import { getUsersGroups, cognitoIdentityServiceProvider, assignUserToGroup } from "./cognito";

interface IPreTokenAuthentication {
    userPoolId: string;
    oldUserPoolId: string;
    username: string;
    region: string;
}

export const preTokenAuthentication = async ({ userPoolId, oldUserPoolId, username, region }: IPreTokenAuthentication): Promise<string[]> => {
    const cognitoISP = cognitoIdentityServiceProvider({ region });

    const newPoolUsersGroups = await getUsersGroups({
        cognitoISP,
        userPoolId,
        username
    });
    // If the user in the new pool already has groups assigned then keep them as-is.
    if (newPoolUsersGroups.length !== 0) {
        console.log("No action required: user already belongs to a group in the new pool");
        return newPoolUsersGroups;
    }

    const oldPoolUsersGroups = await getUsersGroups({
        cognitoISP,
        userPoolId: oldUserPoolId,
        username
    });
    // If the user in the old pool doesn't have any groups there is nothing else to do.
    if (oldPoolUsersGroups.length === 0) {
        console.log("No action required: migrated user didn't belong to any group in the old pool");
        return [];
    }

    console.log({ oldPoolUsersGroups, newPoolUsersGroups });
    await assignUserToGroup({
        cognitoISP,
        userPoolId,
        username,
        groups: oldPoolUsersGroups
    });
    return oldPoolUsersGroups;
}
// cognito.ts
import { AdminAddUserToGroupRequest, AdminListGroupsForUserRequest } from "aws-sdk/clients/cognitoidentityserviceprovider";
import { CognitoIdentityServiceProvider } from 'aws-sdk';
interface ICognitoIdentityServiceProvider {
region: string;
}
export const cognitoIdentityServiceProvider = ({ region }: ICognitoIdentityServiceProvider) => {
const options: CognitoIdentityServiceProvider.Types.ClientConfiguration = {
region,
};
const cognitoIdentityServiceProvider = new CognitoIdentityServiceProvider(options);
return cognitoIdentityServiceProvider;
}
interface IGetUsersGroups {
cognitoISP: CognitoIdentityServiceProvider,
userPoolId: string,
username: string
}
export const getUsersGroups = async ({ cognitoISP, userPoolId, username }: IGetUsersGroups): Promise<string[]> => {
try {
const params: AdminListGroupsForUserRequest = {
UserPoolId: userPoolId,
Username: username,
}
const response = await cognitoISP.adminListGroupsForUser(params).promise();
return response.Groups?.map(group => group.GroupName!) || [];
} catch (err) {
console.error(err)
return [];
}
}
interface IAssignUserToGroup {
cognitoISP: CognitoIdentityServiceProvider,
username: string;
groups: string[];
userPoolId: string;
}
/**
 * Use admin APIs to assign a user to groups.
 *
 * @param cognitoISP the Cognito identity service provider to perform the action on
 * @param userPoolId the user pool in which the user is being modified
 * @param username the username or email for which the action is to be performed
 * @param groups the groups to assign the user to
 */
export const assignUserToGroup = async ({ cognitoISP, userPoolId, username, groups }: IAssignUserToGroup) => {
console.log({ userPoolId, username, groups })
for (const group of groups) {
const params: AdminAddUserToGroupRequest = {
UserPoolId: userPoolId,
Username: username,
GroupName: group
};
try {
const response = await cognitoISP.adminAddUserToGroup(params).promise();
console.log({ response })
} catch (err) {
console.error(err)
}
}
}
Tips: make sure that under the triggers section in Cognito you have both the migration and pre token generation triggers set. You also need to ensure SRP is not used for sign-in (use the USER_PASSWORD_AUTH flow instead) so the lambda can see the password and successfully migrate the user, for example as shown below.
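If the front end uses Amplify, the client can be told to use the password flow explicitly. This is only a sketch: the region, pool id and client id are placeholders, and the new pool's app client must have ALLOW_USER_PASSWORD_AUTH enabled.
// amplifyConfig.ts - client-side sketch with placeholder ids
import { Amplify } from 'aws-amplify';

Amplify.configure({
    Auth: {
        region: 'eu-west-1',                           // placeholder
        userPoolId: 'eu-west-1_xxxxxxxxx',             // placeholder: the new pool
        userPoolWebClientId: 'xxxxxxxxxxxxxxxxxxxxxx', // placeholder
        // USER_PASSWORD_AUTH sends the password to Cognito so the UserMigration
        // lambda receives it; the default USER_SRP_AUTH would hide it.
        authenticationFlowType: 'USER_PASSWORD_AUTH',
    },
});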
Things to test: when the user is first migrated they are assigned their groups, and on subsequent logins they still end up in their groups.
Let me know if anyone has any feedback or questions, happy to help.
I am doing user registration through Node.js; this was already working in React but only with one field. The problem is that when I try to send more than one field in the signup metadata, only username is added to the profiles table and not the others. However, the user information I get on the front end does contain the additional metadata.
I'm using supabase v2.0
const createStripeAccount = async(req, res) => {
const { username, email, password } = req.body;
try {
const account = await stripe.accounts.create({
type: 'express',
country: 'US',
email: email,
capabilities: {
card_payments: {requested: true},
transfers: {requested: true},
}
});
const { id:stripeAccountId } = account;
let newData = null;
if(stripeAccountId) {
console.log(typeof stripeAccountId)
newData = await supabase.auth.signUp(
{
email: email,
password: password,
options: {
data: {
username: username,
stripe_account_user_id: 'stripeAccountId', // additional
is_founder: true, // additional
website: 'www.sdsoso.cl' // additional
}
}
}
)
}
const { data: {user} } = newData;
console.log(newData)
return res.json({
user
});
} catch(error) {
console.log('error', error);
}
}
The response from the server contains the full metadata that was sent, but in the table only username is added and not the other fields.
I found the solution: I had forgotten that when I add new metadata I must also update the function that is fired by the Supabase trigger after registration.
So every time you want to pass more data to the profiles table on signup, update the triggered function. In my case:
BEGIN
INSERT INTO public.profiles(id, username, new_col_name)
VALUES (
NEW.id,
NEW.raw_user_meta_data -> 'username',
NEW.raw_user_meta_data -> 'new_col_name'
);
RETURN NEW;
END;
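For reference, the body above lives inside the trigger function. With the default Supabase naming (a handle_new_user function fired by an on_auth_user_created trigger on auth.users; adjust to your own names), the full definition looks roughly like the sketch below. Note that ->> returns text while -> returns jsonb, so for text columns ->> is usually what you want.
-- Sketch assuming the default Supabase function/trigger names.
create or replace function public.handle_new_user()
returns trigger
language plpgsql
security definer set search_path = public
as $$
begin
  insert into public.profiles(id, username, new_col_name)
  values (
    new.id,
    new.raw_user_meta_data ->> 'username',
    new.raw_user_meta_data ->> 'new_col_name'
  );
  return new;
end;
$$;

-- The trigger itself only has to be created once:
create trigger on_auth_user_created
  after insert on auth.users
  for each row execute procedure public.handle_new_user();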
Good morning. I am using Next.js and get the session in the getServerSideProps block.
I am passing multiple values in the returned props object, however the session does not come through. All of the other key-value pairs work, but when I try to log the session it comes back undefined, and I don't understand why.
const DashboardPage = ({ session, secret }) => {
const [loading, setloading] = useState(false);
//This does not work
console.log('Session: ', session);
return (
<section className="border-4 border-orange-800 max-w-5xl mx-auto">
<CreateListModal userId={session?.userId} loading={loading} setloading={setloading} />
</section>
)
}
export const getServerSideProps = async context => {
// get sessions with added user Id in the session object
const session = await requireAuthentication(context);
// This works
console.log(session);
if (!session) {
return {
redirect: {
destination: '/signup',
permanent: false
}
}
}
else {
return {
props: {
session: session,
secret: 'Pasarika este dulce'
}
}
}
}
export default DashboardPage;
The purpose of the requireAuthentication function is to create a new session object with an extra key-value pair containing the user id; that session is then used across the entire app.
In that function I look the user up by the email returned in the session and get the id from the database. I then return the new session object, which looks like this:
{
user: {
name: 'daniel sas',
email: 'email#gmail.com',
image: 'https://lh3.googldeusercontent.com/a/A6r44ZwMyONqcfJORNnuYtbVv_LYbab-wv5Uyxk=s96-c',
userId: 'clbcpc0hi0002sb1wsiea3q5d'//This is the required thing in my app
},
expires: '2022-12-23T08:04:08.263Z'
}
The following function is used to get the data from the database
import { getSession } from "next-auth/react";
import prisma from './prisma'
// This function get the email and returns a new session object that includes
// the userId
export const requireAuthentication = async context => {
const session = await getSession(context);
// If there is no user or there is an error ret to signup page
if (!session) return null
// If the user is not found return same redirect to signup
else {
try {
const user = await prisma.user.findUnique({where: { email: session.user.email }});
if (!user) return null;
// Must return a new session here that contains the userId...
else {
const newSession = {
user: {
...session.user,
userId: user.id
},
expires: session.expires
};
return newSession;
}
}
catch (error) {
if (error) {
console.log(error);
}
}
}
}
I'm trying to implement Facebook, Google and Twitter authentication. So far, I've set up the apps within the respective developer platforms, added those keys/secrets to my Supabase console, and created this graphql resolver:
/* eslint-disable @typescript-eslint/explicit-module-boundary-types */
import camelcaseKeys from 'camelcase-keys';
import { supabase } from 'lib/supabaseClient';
import { LoginInput, Provider } from 'generated/types';
import { Provider as SupabaseProvider, User } from '@supabase/supabase-js';
import Context from '../../context';
export default async function login(
_: any,
{ input }: { input: LoginInput },
{ res, req }: Context
): Promise<any> {
const { provider } = input;
// base level error object
const errorObject = {
__typename: 'AuthError',
};
// return error object if no provider is given
if (!provider) {
return {
...errorObject,
message: 'Must include provider',
};
}
try {
const { user, session, error } = await supabase.auth.signIn({
// provider can be 'github', 'google', 'gitlab', or 'bitbucket'
provider: 'facebook',
});
console.log({ user });
console.log({ session });
console.log({ error });
if (error) {
return {
...errorObject,
message: error.message,
};
}
const response = camelcaseKeys(user as User, { deep: true });
return {
__typename: 'LoginSuccess',
accessToken: session?.access_token,
refreshToken: session?.refresh_token,
...response,
};
} catch (error) {
return {
...errorObject,
message: error.message,
};
}
}
I have three console logs set up directly underneath the signIn() function, all of which are returning null.
I can also go directly to https://<your-ref>.supabase.co/auth/v1/authorize?provider=<provider> and auth works correctly, so it appears to have been narrowed down specifically to the signIn() function. What would cause the response to return null values?
This is happening because these values are not populated until after the redirect from the OAuth server takes place. If you look at the internal code of supabase/gotrue-js you'll see null being returned explicitly.
private _handleProviderSignIn(
provider: Provider,
options: {
redirectTo?: string
scopes?: string
} = {}
) {
const url: string = this.api.getUrlForProvider(provider, {
redirectTo: options.redirectTo,
scopes: options.scopes,
})
try {
// try to open on the browser
if (isBrowser()) {
window.location.href = url
}
return { provider, url, data: null, session: null, user: null, error: null }
} catch (error) {
// fallback to returning the URL
if (!!url) return { provider, url, data: null, session: null, user: null, error: null }
return { data: null, user: null, session: null, error }
}
}
The flow is something like this:
Call `supabase.auth.signIn({ provider: 'github' })`
User is sent to Github.com where they will be prompted to allow/deny your app access to their data
If they allow your app access, Github.com redirects back to your app
Now, through some Supabase magic, you will have access to the session, user, etc. data
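For completeness, here is roughly how the session can be picked up once the redirect lands back in your app (supabase-js v1 API, which is the version signIn({ provider }) belongs to; the URL and key are placeholders):
// client-side sketch, supabase-js v1
import { createClient } from '@supabase/supabase-js';

const supabase = createClient('https://<your-ref>.supabase.co', '<anon-key>'); // placeholders

// The client parses the tokens from the redirect URL and fires this callback:
supabase.auth.onAuthStateChange((event, session) => {
    if (event === 'SIGNED_IN') {
        console.log(session?.user, session?.access_token);
    }
});

// Or read the current session directly:
const currentSession = supabase.auth.session();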
I currently have a Node.js back-end running Express with Passport.js for authentication and am attempting to switch to GraphQL with Apollo Server. My goal is to implement the same authentication I am using currently, but cannot figure out how to leave certain resolvers public while enabling authorization for others. (I have tried researching this question extensively yet have not been able to find a suitable solution thus far.)
Here is my code as it currently stands:
My JWT Strategy:
const opts = {};
opts.jwtFromRequest = ExtractJwt.fromAuthHeaderAsBearerToken();
opts.secretOrKey = JWT_SECRET;
module.exports = passport => {
passport.use(
new JwtStrategy(opts, async (payload, done) => {
try {
const user = await UserModel.findById(payload.sub);
if (!user) {
return done(null, false, { message: "User does not exist!" });
}
done(null, user);
} catch (error) {
done(error, false);
}
})
);
}
My server.js and Apollo configuration:
(I am currently extracting the bearer token from the HTTP headers and passing it along to my resolvers using the context object):
const apollo = new ApolloServer({
typeDefs,
resolvers,
context: async ({ req }) => {
let authToken = "";
try {
if (req.headers.authorization) {
authToken = req.headers.authorization.split(" ")[1];
}
} catch (e) {
console.error("Could not fetch user info", e);
}
return {
authToken
};
}
});
apollo.applyMiddleware({ app });
And finally, my resolvers:
exports.resolvers = {
Query: {
hello() {
return "Hello world!";
},
async getUserInfo(root, args, context) {
try {
const { id } = args;
let user = await UserModel.findById(id);
return user;
} catch (error) {
return "null";
}
},
async events() {
try {
const eventsList = await EventModel.find({});
return eventsList;
} catch (e) {
return [];
}
}
}
};
My goal is to leave certain queries, such as the first one ("hello"), public while restricting the others to requests with valid bearer tokens only. However, I am not sure how to implement this authorization in the resolvers using Passport.js and Passport-JWT specifically. It is generally done by adding middleware to certain endpoints, but since I would only have one endpoint (/graphql) here, that option would restrict all queries to authenticated users, which is not what I am looking for. I have to perform the authorization in the resolvers somehow, yet I am not sure how to do this with the tools available in Passport.js.
Any advice is greatly appreciated!
I would create a schema directive that authorizes queries at the field-definition level and then use that directive wherever I want to apply authorization. Sample code:
const { SchemaDirectiveVisitor } = require('graphql-tools');
const { defaultFieldResolver } = require('graphql');

class authDirective extends SchemaDirectiveVisitor {
    visitObject(type) {
        this.ensureFieldsWrapped(type);
        type._requiredAuthRole = this.args.requires;
    }
    visitFieldDefinition(field, details) {
        this.ensureFieldsWrapped(details.objectType);
        field._requiredAuthRole = this.args.requires;
    }
    ensureFieldsWrapped(objectType) {
        // Mark the GraphQLObjectType object to avoid re-wrapping:
        if (objectType._authFieldsWrapped) return;
        objectType._authFieldsWrapped = true;
        const fields = objectType.getFields();
        Object.keys(fields).forEach(fieldName => {
            const field = fields[fieldName];
            const { resolve = defaultFieldResolver } = field;
            field.resolve = async function (...args) {
                // your authorization code
                return resolve.apply(this, args);
            };
        });
    }
}
And declare this in type definition
directive @authorization(requires: String) on OBJECT | FIELD_DEFINITION
map schema directive in your schema
....
resolvers,
schemaDirectives: {
authorization: authDirective
}
Then use it on your API endpoints or any object type:
type Query {
  hello: String
  getUserInfo(id: ID!): Result @authorization(requires: "authToken")
  events: EventResult @authorization(requires: "authToken")
}
I'm new to GraphQL and am going to build a solution with it.
Everything looks cool, but I'm concerned about how to implement role-based authorization inside the GraphQL server (I'm considering GraphQL.js / Apollo Server).
I will have a users table which contains all users. Inside the users table there's a roles field which contains the roles of the particular user. The queries and mutations will be granted based on the roles of the user.
How can I implement this structure?
THANKS!
For Apollo Server developers, there have generally been three ways to implement authorization in GraphQL:
Schema-based: Adding a directive to the graphql types and fields you want to protect
Middleware-based: Adding middleware (code that runs before and after your graphql resolvers have executed). This is the approach used by graphql-shield and other authorization libraries built on top of graphql-middleware.
Business logic layer: This is the most primitive but granular approach. Basically, the function that returns data (i.e. a database query, etc) would implement its own permissions/authorization check.
Schema-based
With schema-based authorization, we would define custom schema directives and apply them wherever it is applicable.
Source: https://www.apollographql.com/docs/graphql-tools/schema-directives/
//schema.gql
directive @auth(
  requires: Role = ADMIN,
) on OBJECT | FIELD_DEFINITION

enum Role {
  ADMIN
  REVIEWER
  USER
  UNKNOWN
}

type User @auth(requires: USER) {
  name: String
  banned: Boolean @auth(requires: ADMIN)
  canPost: Boolean @auth(requires: REVIEWER)
}
// main.js
const { makeExecutableSchema, SchemaDirectiveVisitor } = require('graphql-tools');
const { defaultFieldResolver } = require('graphql');

class AuthDirective extends SchemaDirectiveVisitor {
visitObject(type) {
this.ensureFieldsWrapped(type);
type._requiredAuthRole = this.args.requires;
}
visitFieldDefinition(field, details) {
this.ensureFieldsWrapped(details.objectType);
field._requiredAuthRole = this.args.requires;
}
ensureFieldsWrapped(objectType) {
if (objectType._authFieldsWrapped) return;
objectType._authFieldsWrapped = true;
const fields = objectType.getFields();
Object.keys(fields).forEach(fieldName => {
const field = fields[fieldName];
const { resolve = defaultFieldResolver } = field;
field.resolve = async function (...args) {
// Get the required Role from the field first, falling back
// to the objectType if no Role is required by the field:
const requiredRole =
field._requiredAuthRole ||
objectType._requiredAuthRole;
if (! requiredRole) {
return resolve.apply(this, args);
}
const context = args[2];
const user = await getUser(context.headers.authToken);
if (! user.hasRole(requiredRole)) {
throw new Error("not authorized");
}
return resolve.apply(this, args);
};
});
}
}
const schema = makeExecutableSchema({
typeDefs,
schemaDirectives: {
auth: AuthDirective,
authorized: AuthDirective,
authenticated: AuthDirective
}
});
Middleware-based
With middleware-based authorization, most libraries will intercept the resolver execution. The below example is specific to graphql-shield on apollo-server.
Graphql-shield source: https://github.com/maticzav/graphql-shield
Implementation for apollo-server source: https://github.com/apollographql/apollo-server/pull/1799#issuecomment-456840808
// shield.js
import { shield, rule, and, or } from 'graphql-shield'
const isAdmin = rule()(async (parent, args, ctx, info) => {
return ctx.user.role === 'admin'
})
const isEditor = rule()(async (parent, args, ctx, info) => {
return ctx.user.role === 'editor'
})
const isOwner = rule()(async (parent, args, ctx, info) => {
return ctx.user.items.some(id => id === parent.id)
})
const permissions = shield({
Query: {
users: or(isAdmin, isEditor),
},
Mutation: {
createBlogPost: or(isAdmin, and(isOwner, isEditor)),
},
User: {
secret: isOwner,
},
})

export default permissions
// main.js
const { ApolloServer, makeExecutableSchema } = require('apollo-server');
const { applyMiddleware } = require('graphql-middleware');
const shieldMiddleware = require('./shield');
const schema = applyMiddleware(
makeExecutableSchema({ typeDefs: '...', resolvers: {...} }),
shieldMiddleware,
);
const server = new ApolloServer({ schema });
server.listen({ port: 4000 }).then(() => console.log('Ready!'));
Business logic layer
With business logic layer authorization, we would add permission checks inside our resolver logic. It is the most tedious approach because we would have to write authorization checks in every resolver. The link below recommends placing the authorization logic in the business logic layer (sometimes called 'Models', 'Application logic', or the 'data-returning function').
Source: https://graphql.org/learn/authorization/
Option 1: Auth logic in resolver
// resolvers.js
const Query = {
users: function(root, args, context, info){
if (context.permissions.view_users) {
return context.db.query(`SELECT * FROM users`)
}
throw new Error('Not Authorized to view users')
}
}
Option 2 (Recommended): Separating out authorization logic from resolver
// resolver.js
const Authorize = require('./authorization')
const Query = {
users: function(root, args, context, info){
return Authorize.viewUsers(context)
}
}
// authorization.js
const validatePermission = (requiredPermission, context) => {
return context.permissions[requiredPermission] === true
}
const Authorize = {
  viewUsers: function(context){
    const requiredPermission = 'ALLOW_VIEW_USERS'
    if (validatePermission(requiredPermission, context)) {
      return context.db.query('SELECT * FROM users')
    }
    throw new Error('Not Authorized to view users')
  },
  viewCars: function(context){
    const requiredPermission = 'ALLOW_VIEW_CARS';
    if (validatePermission(requiredPermission, context)){
      return context.db.query('SELECT * FROM cars')
    }
    throw new Error('Not Authorized to view cars')
  }
}

module.exports = Authorize
I've recently implemented role-based authorisation using GraphQL Shield; I found that package was the simplest way to do it. Otherwise you could add custom schema directives; here's a good article on how to do that: https://dev-blog.apollodata.com/reusable-graphql-schema-directives-131fb3a177d1.
There are a few steps you need to take to set up GraphQL Shield:
1 - Write an authentication function. Here's a rough example; you'll want to do much more than this in practice, e.g. use JWTs rather than passing the id:
export const isAdmin = async (parent, { id }, ctx) => {
  try {
    const exists = await ctx.db.exists.User({
      id,
      role: 'ADMIN',
    });
    return exists
  } catch (err) {
    console.log(err);
    return false
  }
}
2 - In the file where you export all of your mutations and queries add the check:
const resolvers = {
...your queries and mutations
}
const permissions = {
Query: {
myQuery: isAdmin
}
}
export default shield(resolvers, permissions);
This will now run the isAdmin function every time that query is requested.
I hope that helps