How to refresh variables in react-apollo - graphql

Here is the scenario:
USER 1
1) Goes to the login page
2) Enters an email and password, which are sent to the server by a mutation
3) Authentication OK -> the server returns a token and user information (id, firstName, lastName)
4) The token and each piece of user information are stored under separate keys in local storage
5) The user is redirected to the HomePage
6) The user goes to the Profile Page
7) A query is sent to the server to retrieve all the information about that user (using the user id stored in local storage)
Here is the query:
const PROFILE_QUERY = gql`
  query profileQuery($id: ID!) {
    getUser(id: $id) {
      firstName
      lastName
      email
    }
  }
`;

export default graphql(PROFILE_QUERY, {
  options: {
    variables: {
      id: localStorage.getItem("id")
    },
    errorPolicy: "all"
  }
})(ProfilePage);
8) The server returns the information and the user can see it on the Profile Page
9) The user decides to log out
So everything works for the first user. Now a second user arrives at the same computer and the same browser.
USER 2
Same steps as the previous user, from 1 to 7.
The query sent to the server to retrieve the user information fails, because the id sent by the query is the id of the previous user (and a user is not allowed to retrieve information about another user).
If I do a browser refresh, the query is sent with the correct user id...
So why is the id variable not refreshed on the first attempt (it seems the id value in local storage is not read)? How can I resolve this?
Thank you for your help.

That happens because your options field is static and is evaluated when the file containing the query is first loaded. (I'm assuming somewhere, perhaps steps 3 or 4, the id in local storage is updated correctly for the second user.)
config.options can be an object like the one you are passing now or a function that is evaluated when needed by the query.
So to load the id from the localStorage each time instead of just once, you can do something like this:
options: () => ({
  variables: {
    id: localStorage.getItem("id")
  },
  errorPolicy: "all"
})

Then, when the first user logs out, you also need to reset the Apollo store and clear local storage.
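For example, a minimal logout handler might look like this (just a sketch: withApollo gives access to the client, while the exact local storage keys and the react-router history prop are assumptions based on the question):

import React from "react";
import { withApollo } from "react-apollo";

const LogoutButton = ({ client, history }) => (
  <button
    onClick={() => {
      // remove the cached credentials of the previous user
      ["token", "id", "firstName", "lastName"].forEach(key =>
        localStorage.removeItem(key)
      );
      // clear the Apollo cache so no query reuses the old user's data
      client.resetStore();
      history.push("/login");
    }}
  >
    Logout
  </button>
);

export default withApollo(LogoutButton);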

Related

How can we optimize the adding of multiple parameters to make the feature file generic for running a SQL query in the latest Cypress version?

I am using the feature file below to run a SQL query with cy.task() in the latest Cypress version. This works fine, but is there a cleaner way to run the SQL query that includes more parameters and keeps the feature file line / step definition generic? Can someone please advise?
Cypress: 10.3.0
Cypress-Cucumber-Preprocessor
// Feature file line:
Scenario: Run the sql query to verify the user details in the User table
  Given I run the sql query to get "FirstName, LastName, Email, Address" for the user email "sam@test.com"

// databaseQuery.cy.js
Then('I run the sql query to get {string} for the user email {string}', (query, userEmail) => {
  cy.task('queryDb', `SELECT ${query} FROM Users WHERE Email="${userEmail}"`).then((result) => {
    expect(result[0].FirstName).to.equal("Sam");
    expect(result[0].LastName).to.equal("Thinker");
    expect(result[0].Email).to.equal(userEmail);
    expect(result[0].Address).to.equal("455 Sydney Street");
  });
});
One way I can think of is to have an object with the properties and values you want to validate the response against, and then use the keys of that same object in your query.
Let's assume you have an object as below.
const user = {
  FirstName: "Sam",
  LastName: "Thinker",
  Email: "sam@test.com",
  Address: "455 Sydney Street"
};
Then your code can be altered to something like this:
Scenario: Run the sql query to verify the user details in the User table
  Given I run the sql query to get "userdetails" for the user email "sam@test.com"

Then(
  'I run the sql query to get {string} for the user email {string}',
  (query, userEmail) => {
    const queryParams =
      query == 'userdetails' ? Object.keys(user).join(', ') : '*';
    cy.task(
      'queryDb',
      `SELECT ${queryParams} FROM Users WHERE Email="${userEmail}"`
    ).then((result) => {
      expect(result[0]).to.deep.equal(user);
    });
  }
);
This would also mean the query params are hidden from the cucumber feature file.
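For reference, the queryDb task used above also has to be registered on the Node side. A minimal sketch, assuming a MySQL database reached through the mysql2 package (the driver and the connection settings are assumptions, not part of the question):

// cypress.config.js
const { defineConfig } = require('cypress');
const mysql = require('mysql2/promise');

module.exports = defineConfig({
  e2e: {
    setupNodeEvents(on, config) {
      on('task', {
        // run an arbitrary SQL string and return the rows
        async queryDb(sql) {
          const connection = await mysql.createConnection({
            host: 'localhost',
            user: 'root',
            password: '',
            database: 'testdb'
          });
          const [rows] = await connection.execute(sql);
          await connection.end();
          return rows;
        }
      });
      return config;
    }
  }
});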

Way to limit number of records a user can create in Amplify GraphQL API

I have an app where auth is implemented using Cognito User Pools and the API is a GraphQL API implemented using Amplify. In the schema definitions, is there an easy way to limit the number of records a user can create? For example, in the following schema...
type Product @model @auth(rules: [{ allow: owner }]) {
  id: ID!
  name: String!
  description: String
}
I would like to limit users to a maximum of 100 Products.
One way is via my front end: when I detect that a user has reached the 100 limit, I can make the UI stop giving them the ability to add more. But if someone were to bypass the UI, they could create more than 100, hence I prefer to enforce this limit on the backend.
Is there a way to do this in the schema definition, or elsewhere in AWS / DynamoDB?
Thanks!
There isn't a straightforward way to do this that I'm aware of.
Below is how I would solve this.
Create a @key on Product on the owner property, so that you can query by owner.
Overwrite the CreateProduct mutation. In your custom resolver, before creating a new Product, query the Product table byOwner, using the owner id passed in, to count how many already exist.
Here is the documentation: https://docs.amplify.aws/cli/graphql-transformer/resolvers#add-a-custom-geolocation-search-resolver-that-targets-an-elasticsearch-domain-created-by-searchable
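For example, the key could look something like this with the v1 GraphQL transformer (the byOwner name, the queryField, and the explicit owner field are assumptions):

type Product
  @model
  @auth(rules: [{ allow: owner }])
  @key(name: "byOwner", fields: ["owner"], queryField: "productsByOwner") {
  id: ID!
  name: String!
  description: String
  owner: String
}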
I think the easiest solution would be to process the API request in a Lambda function that validates the request (product count < 100) before writing to the DB. Then you can null out the built-in mutations for the model to prevent unintended access.
Example schema:
type Mutation {
  addProduct(input: ProductAddInput): ProductAddOutput @function(name: "productLambda-${env}")
}

type Product
  @model(queries: null, mutations: null, subscriptions: null) # update these to what you need
  @auth(rules: [{ allow: owner }]) {
  id: ID!
  name: String!
  description: String
}
In Lambda, you can pull the username from the event.identity property, and that should correlate to the owner field in the DB. Since the AWS package is loaded automatically, you should see very fast script execution as long as your DB indexes are set up properly.
For the user product count, I see a couple of options:
1) A secondary index set up on the owner field, so you don't do a ton of scans.
2) If you have a user table, you could add a field that counts the products for each user and just update that table any time you update the product table.
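A minimal sketch of such a Lambda handler, combining the @function mutation above with option 1 (the PRODUCT_TABLE environment variable, the byOwner index name, and the 100-product limit are assumptions based on the question):

// productLambda handler (sketch)
const AWS = require('aws-sdk');
const { v4: uuid } = require('uuid');
const db = new AWS.DynamoDB.DocumentClient();

exports.handler = async (event) => {
  const owner = event.identity.username; // Cognito username, matching the owner field

  // count how many Products this owner already has, via the byOwner GSI
  const { Count } = await db.query({
    TableName: process.env.PRODUCT_TABLE,
    IndexName: 'byOwner',
    KeyConditionExpression: '#o = :owner',
    ExpressionAttributeNames: { '#o': 'owner' },
    ExpressionAttributeValues: { ':owner': owner },
    Select: 'COUNT'
  }).promise();

  if (Count >= 100) {
    throw new Error('Product limit of 100 reached');
  }

  const product = { id: uuid(), owner, ...event.arguments.input };
  await db.put({ TableName: process.env.PRODUCT_TABLE, Item: product }).promise();
  return product;
};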

How to force a filter on server side to a graphql query?

Imagine this situation: I have a query called "users" that returns all the users, and these users can be associated with one or more companies, so I have a type UserCompanies (I need it because it stores some more information beyond the relation). I'm using Prisma, and I need to force a filter that returns only users belonging to the same company as the requester.
I get the company information from the JWT and need to inject it into the query before sending it to Prisma.
So, the query should look like this:
query allUsers {
  users {
    name
    id
    status
    email
    userCompanies {
      id
      role
    }
  }
}
and on the server side I should transform it into the following (the users where clause is fine, it's just a matter of changing the args):
query allUsers {
  users(where: {
    userCompanies_some: {
      companyId: "companyId-from-jwt"
    }
  }) {
    name
    id
    status
    email
    userCompanies(where: {
      companyId: "companyId-from-jwt"
    }) {
      id
      role
    }
  }
}
I can see a few ways to solve this, but I don't know which is best:
1 - Using addFragmentToInfo: it does the job of putting conditions on the query, but if the query already has a userCompanies selection it gives me a conflict. Otherwise, it works fine.
2 - I can use an alias for the query, but after the DB result I will need to edit all the results in the array to overwrite the result.
3 - Don't pass info to Prisma and filter in JS.
4 - Edit info (the 4th resolver parameter), of type GraphQLResolveInfo.
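For the top-level filter, which the question says is already fine to change via args, a minimal resolver sketch in prisma-binding style might look like this (the context property names are assumptions; the nested userCompanies(where: ...) filter still needs one of the options above):

// resolver sketch (prisma-binding / Prisma 1 style)
const users = (parent, args, ctx, info) => {
  // companyId extracted from the verified JWT earlier in the request pipeline
  const companyId = ctx.request.user.companyId;

  return ctx.db.query.users(
    {
      ...args,
      where: {
        ...args.where,
        userCompanies_some: { companyId }
      }
    },
    info
  );
};

module.exports = { Query: { users } };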

Send 2 different types of mails using mailchimp

I have a set of internal users for my project. The admin can activate/deactivate them. I want to send them an email saying "your account has been deactivated" when their account is deactivated by the admin. Similarly, they should receive an email saying "your account has been activated" when the admin activates their account. How can I do this?
I am trying to do it by creating two separate lists in Mailchimp and two separate campaigns, but when I put the Mailchimp credentials in my development.js with two separate list ids and then try to read them in my JavaScript file, I get undefined (checked with console.log).
Is there a way to do it with just a single campaign/list?
Here is the development.js code with the Mailchimp credentials:
mailchimp: {
  api_key: "***************-***",
  list_id1: "*********", // internal users
  list_id2: "*********"  // internal deactivated users
},
My user.helper.js:
const config = require('../../config/environment');
const Mailchimp = require('mailchimp-api-3');
const mailchimp = new Mailchimp(config.mailchimp.api_key);

exports.addToDeactivatedList = function (email, name) {
  console.log(mailchimp.list_id1);
  mailchimp.members.create(config.mailchimp.list_id1, {
    email_address: email,
    merge_fields: {
      FNAME: name
    },
    status: 'subscribed'
  }).then(user => { }).catch(e => {
    console.log("added to the deactivated list");
  });
};

exports.addToActivatedList = function (email, name) {
  console.log(mailchimp.list_id2);
  mailchimp.members.create(config.mailchimp.list_id2, {
    email_address: email,
    merge_fields: {
      FNAME: name
    },
    status: 'subscribed'
  }).then(user => { }).catch(e => {
    console.log("added to the activated list");
  });
};
And my user.controller.js (relevant part only):
var helper = require('./user.helper');
.
.
if (req.body.status != user.status) {
  (req.body.status == "active")
    ? helper.addToActivatedList(user.email, user.name)
    : helper.addToDeactivatedList(user.email, user.name);
}
Any help will be appreciated. Thanks!
I'd try to put everyone in the same list and then create segments based on that list. After that, create a campaign based on that segment.
You could, for instance, create a custom list attribute that records whether or not an account is activated and create a segment based on that attribute. The campaign should then be based on that segment.
Perhaps also record the date an account was activated or deactivated by the admin in another custom attribute and use it to check whether a user has already received an activation/deactivation mail.
MailChimp offers a feature for situations like this called automations. Automations allow you to send individual emails to subscribers when an event is triggered. So instead of creating separate campaigns every time a user is activated or deactivated, you can use just two automations and a single list.
Whether a user is active or not can be tracked with list merge fields. To do this, you'll need to add a new text merge field to your list. Let's name the field label 'Active'. Uncheck the 'Visible' checkbox so the user can't see it, and name your merge field something like 'ACTIVE'. You can use values like yes/no or true/false to identify the users by their active status.
Next, create your automations, one for activated users and one for deactivated users. You can set a trigger to send the email when a list field value is changed. So just make each of your two automations send the emails when the 'Active' list field values change to either 'yes' or 'no'.
Then all you need to do with the API is subscribe users to a single list whenever their accounts are activated or deactivated. Just make sure the new 'ACTIVE' merge field is set to 'yes' or 'no' when you do this, and any addresses already subscribed will be updated with the new value. So your mailchimp.members.create() would look something like this, based on the example from here:
mailchimp.members.create(<list_id>, {
  email_address: <user_email>,
  merge_fields: {
    FNAME: name,
    ACTIVE: 'yes' // or 'no' if being used for deactivated users
  },
  status: 'subscribed'
})

HapiJS Catbox: How to search a key using some value's fields as search criteria?

In my app, when a user is authenticated, I store his session data (including his email) in the server cache, and I create a sessionId that I use as the key.
When a user is deleted from the database, I want to check whether he was logged in, that is, whether there is a session in the cache belonging to his account, so I can drop that entry from the server cache too.
The problem is that the sessionId is not part of the User model, so I have to look up his entry in the cache using his email, get the associated key, and drop the entry. Is that possible?
Thanks in advance.
Catbox is just a key/value store, and it doesn't look like there is a way to iterate through cache items and find a user by another property, as you want. You need to know the key. You can either make the key the user's email or store the sessionId in another table in the database.
You might be doing more work than you have to for session management. Yar is a hapi plugin that provides session management for you. Invalidating a session is pretty simple as well.
When the user logs out use yar.reset() to clear out the session.
Hapi-auth-cookie is another plugin for cookie-based session management.
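If you go with the first suggestion (keying the cache by email instead of sessionId), a minimal sketch could look like this; the sessionsCache name follows the code in the question, while the segment name and TTL are assumptions:

// at server start-up: a cache policy for sessions
server.app.sessionsCache = server.cache({
  segment: 'sessions',
  expiresIn: 24 * 60 * 60 * 1000
});

// on login: store the session under the user's email
server.app.sessionsCache.set(user.email, { account: user, sessionId }, 0, (err) => {
  // handle err
});

// when the user is deleted: the email is enough to drop the session
server.app.sessionsCache.drop(user.email, (err) => {
  // handle err
});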
I finally created a pre that returns all the sessionIds associated with the account to be deleted, so I can delete them normally with server.cache.drop(key, cb). Error handling removed for brevity.
function currentSessionIds(request, reply) {
  const sessionIds = [];
  User.findOne({ _id: request.params.id }, (err, user) => {
    const cacheDB = request.server.app.sessionsCache._cache.connection.settings.partition;
    MongoClient.connect(`mongodb://host:port/${cacheDB}`, (err, db) => {
      db.collection('sessions')
        .find({ 'value.account.email': user.email }, { _id: 1 })
        .toArray((err, sessions) => {
          sessions.forEach(session => sessionIds.push(session._id));
          reply(sessionIds);
        });
    });
  });
}
But of course this solution is tightly tied to MongoDB and the way the catbox-mongodb strategy stores the data. If they change it, my function breaks.
