I'm creating a model that I will use to authenticate users for API access, and I have a secret field where I want to store a Base64-encoded uuid/v4 generated value.
I went through the different field types and options, but I'm still not seeing how I could achieve this.
Is there a way to hook into model instance creation and set the value of my secret field?
Yes, you can use the pre-save hooks.
In your situation, the basics would be:
AuthenticationModel.schema.pre("save", function (next) {
  // set the secret before the document is persisted
  const secretValue = generateSecretValue();
  this.secret = secretValue;
  next();
});
That would go before your final AuthenticationModel.register(); in your model.js file.
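generateSecretValue() is left as a placeholder above; here's a minimal sketch of it, assuming a Node version where crypto.randomUUID() is available (otherwise v4() from the uuid package works the same way):
const crypto = require("crypto");

function generateSecretValue() {
  // generate a UUID v4 and Base64-encode it
  return Buffer.from(crypto.randomUUID()).toString("base64");
}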
This is how I set it up, also with the pre-save hook. My problem before was that I was getting the same random number again until I restarted the server.
Store.schema.pre('save', function (next) {
  if (!this.updateId && this.isNew) {
    // generates a random ID when the item is created
    this.updateId = Math.random().toString(36).slice(-8);
  }
  next();
});
Using this.isNew was also useful in my case.
I am building a Remix app, and wanted to record some user analytics in my database based on what page the user was viewing. I also wanted to do so on a route by route basis, rather than just simply the raw URL.
For example: I wanted to know "user viewed URL /emails/123" as well as "user viewed route /emails/$emailId"
This problem could be generalized as "I want to run a piece of server code once per user navigation"
For my tracking I'm assuming users have JavaScript enabled in their browser.
Solutions I tried:
Record in the loader
This would be something like:
export const loader: LoaderFunction = async ({ request, params }): Promise<LoaderData> => {
  myDb.recordPageVisit(request.url);
  // ...then load and return the page's data as usual
};
This doesn't work because the loader can be called multiple times per page visit (eg. after an action is run)
It's possible that there's some value hidden in the request parameter that tells us whether this is an initial page load or if it's a later visit, but if so I couldn't find it, including when I examined the raw HTTP requests.
It's also annoying to have to put this code inside of every loader.
Record the URL in the node code
I'm using @remix-run/node as my base Remix server, so I have the escape hatch of setting up node middleware, and I thought that might be a good answer:
const app = express();
app.use((req, res, next) => {
  if (req.url.indexOf("_data") == -1) {
    myDb.recordPageVisit(req.url);
  }
  next();
});
I tried ignoring routes with _data in them, but that didn't work: Remix is being efficient, so when the user navigates it makes an ajax-y call that fetches only the loader data rather than requesting the full rendered page from the server. I know this is Remix's behavior, but I hadn't remembered it before I went down this path :facepalm:
As far as I can tell it's impossible to statelessly track unique pageviews (i.e. based purely on the current URL) - you need to see the user's previous page as well.
I wondered if referer would allow this to work statelessly, but it appears the referer header is not behaving how I'd hoped: it is already set to the current page in the first loader request for the page's data. So an initial load and a load-after-mutation look identical based on referer alone. I don't know if this is technically a bug, but it's certainly not the behavior I'd expect.
I ended up solving this by doing the pageview tracking in the client. To support recording this in the DB, I implemented a route that just received the POSTs when the location changed.
The documentation for react-router's useLocation actually includes this exact scenario as an example.
From https://reactrouter.com/docs/en/v6/api#uselocation:
function App() {
  let location = useLocation();
  React.useEffect(() => {
    ga('send', 'pageview');
  }, [location]);
  return (
    // ...
  );
}
However, that doesn't quite work in Remix - the location value changes after actions as well (same text value, but presumably a different ref value), so the effect fires again. I therefore started saving the last location string seen, and only report a new pageview when the location string value has changed.
So after adding that stateful tracking of the current location, I landed on:
export default function App() {
  // ...other setup omitted...
  const [lastLocation, setLastLocation] = useState("");
  let location = useLocation();
  const matches = useMatches();

  useEffect(() => {
    if (lastLocation == location.pathname) {
      return;
    }
    // there are multiple matches for parent route + root route; this
    // will give us the leaf route
    const routeMatch = matches.find((m) => m.pathname == location.pathname);
    setLastLocation(location.pathname);
    fetch("/api/pageview", {
      body: JSON.stringify({
        url: location.pathname,
        // routeMatch.id looks like: "/routes/email/$emailId"
        route: routeMatch?.id,
      }),
      method: "POST",
    }).then((res) => {
      if (res.status != 200) {
        console.error("could not report pageview:", res);
      }
    });
  }, [location]);

  // ...render omitted...
}
The matches code is not necessary for tracking just raw URLs, but I wanted to extract the route form (e.g. /emails/$emailId), and routeMatch.id is a close match to that value - I strip "routes/" serverside. Matches docs: https://remix.run/docs/en/v1/api/remix#usematches
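For reference, a minimal sketch of what that receiving route could look like, assuming a route file at app/routes/api/pageview.js and the same myDb.recordPageVisit helper as in the earlier snippets:
// app/routes/api/pageview.js (hypothetical location)
import { json } from "@remix-run/node";

export const action = async ({ request }) => {
  const { url, route } = await request.json();
  // strip the "routes/" prefix so "routes/email/$emailId" becomes "email/$emailId"
  const routeId = route ? route.replace(/^\/?routes\//, "") : null;
  await myDb.recordPageVisit(url, routeId);
  return json({ ok: true });
};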
Client-side pageview tracking is a bit annoying since clients are flaky, but given the current Remix behavior I believe this is the only real option.
I did it a different way for routes; Remix is funny because of the whole parent-route thing, so I use a route logger:
Beans.io
https://www.npmjs.com/package/beansio
I'm only a few days in (transitioning from Ember) so please pardon my ignorance.
I have an array of objects (profileaccounts) in my store.
I have many different components which are connected to the store and have code like the snippet below. Sometimes the user is the person logged in; sometimes it's a user being passed into the component (when someone looks at someone else's profile).
componentDidMount() {
  let user = this.props.user;
  let account = this.props.profiles.find(function (prof) {
    return prof.profile.id === user.id;
  });
  if (account == null) {
    this.props.dispatch(userActions.getProfile(user.id));
  }
}
This completely works, but I don't want this code replicated over and over again. My gut feeling is that I should always call .getProfile(user.id) and that it's the job of the actions to determine whether the data already exists in the local cache (store) or needs to be added. If it needs to be added, add it; either way, return it.
Alternatively, maybe the user service (which represents the API and is called by the actions to populate the profiles) is supposed to look locally before it calls the API. Regardless, I don't know how (or whether I should) access the store from the actions or the service - only from the connected component.
I haven't seen this scenario in any of the guides I've read about redux/react, so if anyone can include a resource to where I should be looking I'd appreciate it. If I'm totally going about this the wrong way, I'd be happy to know that too.
You can use redux-thunk to access state inside the action
link to thunk: https://github.com/reduxjs/redux-thunk
The code would be like this:
function incrementIfOdd() {
  return (dispatch, getState) => {
    const { counter } = getState();
    if (counter % 2 === 0) {
      return;
    }
    dispatch(increment());
  };
}
Using thunk you'll be able to access state inside the action.
Your approach sounds good. You'll start by dispatching an action, dispatch(getProfile(userId)); inside that action you'll do whatever you want, and when you finally have the data you want to put in the store, dispatch another action, dispatch(storeUserProfile(profile)), which will put the data in the store via a reducer.
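A hedged sketch of that getProfile thunk, assuming redux-thunk is applied as middleware, the profiles array lives at state.profiles, and userService.getProfile is the API call mentioned in the question:
function getProfile(userId) {
  return (dispatch, getState) => {
    const { profiles } = getState();
    const cached = profiles.find((prof) => prof.profile.id === userId);
    if (cached) {
      // already in the store - nothing to fetch
      return Promise.resolve(cached);
    }
    return userService.getProfile(userId).then((profile) => {
      dispatch(storeUserProfile(profile)); // the reducer adds it to the store
      return profile;
    });
  };
}
The component from the question can then simply call this.props.dispatch(userActions.getProfile(user.id)) without repeating the cache check.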
I'm trying to update the values and connections on my current viewer within the Relay store.
So, before calling the signIn mutation, if I print:
console.log(viewer.name) // "Visitor"
console.log(viewer.is_anonymous) // true
Mutations give us the updater method, which exposes the store, so in my mutation I'm doing something like this:
mutation SignInMutation($input: SignInInput!) {
  signIn(input: $input) {
    user {
      id
      name
      email
      is_anonymous
      notifications {
        edges {
          node {
            id
            ...NotificationItem_notification
          }
        }
      }
    }
    token
  }
}
So my updater method has:
const viewer = store.get(viewer_id);
const signIn = store.getRootField('signIn');
viewer.copyFieldsFrom(signIn.getLinkedRecord('user'))
After this update to the store, the name, email, and is_anonymous fields hold the data that just came from the GraphQL endpoint (name is now "Erick" and is_anonymous is now false, which is great). But if I try to read viewer.notifications and render it, the connection's length appears to be 0 even though it has notifications.
How can I update my current viewer and add the notifications from the MutationPayload into the store without the need to force fetch?
I'm using the latest relay-modern and graphql.
PS: Sorry for the bad formatting, but it's just impossible to format the code the way the site wants me to; I indented it to 4 spaces and it still gave me errors.
With some reorganisation of your GraphQL schema it might be possible to remove the need to interact directly with the Relay store after your sign-in mutation. Consider:
viewer {
  id
  currentUser {
    name
    email
  }
}
When a user is not logged in, currentUser would return null.
You could then modify your login mutation to be:
mutation SignInMutation($input: SignInInput!) {
  signIn(input: $input) {
    viewer {
      id
      currentUser {
        name
        email
        token
      }
    }
  }
}
Knowing the 'nullability' of the currentUser field provides an elegant way of determining if the user is logged in or not.
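For example, a component that receives the viewer fragment only needs a null check (prop names here are illustrative):
// currentUser is null when nobody is signed in
const isLoggedIn = props.viewer.currentUser != null;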
The presence of the token field implies that you are using JWT or something similar to track login status. You would need to store this token in local storage and, if it is present, attach it to the headers of outgoing Relay requests to your GraphQL endpoint.
Storing the token itself would have to be done in the onCompleted callback where you make the mutation request (you will have access to the payload returned by the server in the arguments of the callback function).
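A rough sketch of both pieces, assuming a fetch-based network layer and a localStorage key named authToken (both are assumptions rather than anything Relay prescribes):
// 1. Store the token when the sign-in mutation completes
// (commitMutation comes from 'react-relay')
commitMutation(environment, {
  mutation: SignInMutation,
  variables: { input },
  onCompleted: (response) => {
    const token = response.signIn.viewer.currentUser.token;
    if (token) {
      localStorage.setItem("authToken", token);
    }
  },
});

// 2. Attach it to every outgoing GraphQL request in the network layer
// (this is the function you pass to Network.create)
function fetchQuery(operation, variables) {
  const token = localStorage.getItem("authToken");
  return fetch("/graphql", {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      ...(token ? { Authorization: "Bearer " + token } : {}),
    },
    body: JSON.stringify({ query: operation.text, variables }),
  }).then((res) => res.json());
}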
As an alternative to the token, you could also explore using cookies, which would provide the same user experience but likely require less work to implement than JWT tokens.
I need to check a property of my PFUsers in beforeSave triggers for each of my classes to determine if that user should be allowed to edit the piece of data they are attempting to edit.
For example, if a non-admin PFUser is attempting to edit or add to a class they shouldn't be allowed to, I want to prevent that in the beforeSave trigger. I access the keys being edited using dirtyKeys.
Parse-Server doesn't support .currentUser() like the old Parse server used to. How can I access the PFUser who is making the request? Is there a way to do it besides through session tokens?
Parse.Cloud.beforeSave("Class", function(request, response) {
  // Get the keys that are being edited and iterate through them
  var dirtyKeys = request.object.dirtyKeys();
  for (var i = 0; i < dirtyKeys.length; ++i) {
    var dirtyKey = dirtyKeys[i];
    // Allow or don't allow editing of each key
    // (userObject is the requesting PFUser - the part I don't know how to get)
    if (userObject.get("<KEY>")) {
      console.log('Class before save trigger IS key');
      //ADD CLASS SPECIFIC FUNCTIONALITY HERE
    } else {
      console.log('Class before save trigger NOT key');
      //ADD CLASS SPECIFIC FUNCTIONALITY HERE
    }
  }
});
It turns out the answer is much more obvious than I anticipated and was in the docs, but I overlooked it despite my searching.
Since Parse.User.current() isn't working in Parse Server, the replacement is simply request.user. I was able to easily access all the data I needed from this and am good to go.
var user = request.user; // request.user replaces Parse.User.current()
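Applied to the original trigger, a minimal sketch: the isAdmin flag is a made-up example property, and the (request, response) signature matches the older Parse Server style used in the question (newer releases drop the response argument and have you throw instead):
Parse.Cloud.beforeSave("Class", function(request, response) {
  var user = request.user; // the user making the request
  // "isAdmin" is a hypothetical flag on the _User class - use whatever your schema actually has
  if (!user || !user.get("isAdmin")) {
    response.error("You are not allowed to edit this object.");
    return;
  }
  response.success();
});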
We have a two-state field called Primary that is set to either yes or no. When the field is set to no, the user should be able to change it to yes. When the field is set to yes, it should be disabled so the user can no longer change it.
We have code in the onLoad event that handles this; that works just fine. The challenging case is where a user changes the field from no to yes and then saves the form. This should lock the field so the user can't change it back to no. We attempted to solve this by putting the following code in the onSave event:
export function onSave() {
  var primaryControl = Xrm.Page.getControl(
    d.ConstituentAffiliation.AttributeNames.Primary.toLowerCase());
  if (primaryControl) {
    if (primaryControl.getAttribute().getValue()) {
      primaryControl.setDisabled(true);
    } else {
      primaryControl.setDisabled(false);
    }
  }
}
This partly works. It does disable the field so it can no longer be changed. The save doesn't work, however, because Dynamics CRM appears not to send the values of disabled fields back to the server during the save, so the new value does not actually get saved.
Any ideas would be welcome. :)
It seems the following line solves my problem:
Xrm.Page.getAttribute(d.ConstituentAffiliation.AttributeNames.Primary.toLowerCase())
  .setSubmitMode("always");
So the code now reads as follows:
export function onSave() {
  var primaryControl = Xrm.Page.getControl(
    d.ConstituentAffiliation.AttributeNames.Primary.toLowerCase());
  if (primaryControl) {
    if (primaryControl.getAttribute().getValue()) {
      // force the now-disabled field to be submitted with the save
      Xrm.Page.getAttribute(
        d.ConstituentAffiliation.AttributeNames.Primary.toLowerCase()
      ).setSubmitMode("always");
      primaryControl.setDisabled(true);
    } else {
      primaryControl.setDisabled(false);
    }
  }
}
I should credit this blog which was very helpful: http://blogs.msdn.com/b/arpita/archive/2012/02/19/microsoft-dynamics-crm-2011-force-submit-on-a-disabled-field-or-read-only-field.aspx
Looks like you've solved it; however, I was curious: have you tried using a Business Rule? This kind of basic functionality is something Business Rules in CRM 2015 can handle quite well.
For example something like this:-