This is more a question of practices than an issue with the Parse-React library (though I suppose it's an issue in that I don't know whether this is possible). I have a special case where I need to make a query based on info I get from another query, ideally in the same component. Is this possible?
e.g. the way I have my project set up right now, using only the Parse and React libraries (not Parse-React), is:
var QueryOne = new ParseQuery("QueryOne");
QueryOne.find({
  success: function(results) {
    // find() returns an array, so read the attribute off the first result
    var listOfThingsToGetFromQueryTwo = results[0].attributes.listOfThingsToGetFromQueryTwo;
    var QueryTwo = new ParseQuery("QueryTwo");
    QueryTwo.containedIn("id", listOfThingsToGetFromQueryTwo).find({
      success: function(results) { /* do other stuff */ }
    });
  }
});
This works for a one-time query, but I'd like to use some of the more reactive features of Parse-React. So my question is: using Parse-React, can I make a component observe (in this example) QueryOne AND a QueryTwo that depends on QueryOne?
I naively set out to implement this by observing QueryOne in a parent component, passing that data as a prop to a child component, and observing QueryTwo from the child component. But I realize now that the observe function isn't triggered on prop updates, so the query isn't rerun. Is there some way of triggering the query to rerun from componentWillUpdate?
I filed an issue directly on the parse-react github but haven't heard back. (https://github.com/ParsePlatform/ParseReact/issues/124) Would greatly appreciate any help! Thanks.
Got it! It turns out the observe function is passed newProps and newState as parameters, so if I run QueryOne in the parent container and pass the results down as props, I can then use newProps in the containedIn filter on the child, and it works as expected.
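For posterity, here's a minimal sketch of that solution, assuming the standard ParseReact.Mixin API (the component and prop names are illustrative):
var ChildComponent = React.createClass({
  mixins: [ParseReact.Mixin],

  // observe receives the incoming props, so the query is rebuilt
  // whenever the parent passes down fresh results from QueryOne
  observe: function(newProps, newState) {
    return {
      queryTwo: new Parse.Query('QueryTwo')
        .containedIn('id', newProps.listOfThingsToGetFromQueryTwo)
    };
  },

  render: function() {
    // query results are available on this.data.queryTwo
    return null;
  }
});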
Related
I'm using react-redux for a project I'm working on. I noticed that when I grab an object from my store and edit it, the object in state changes without me dispatching the change (but doesn't trigger a re-render on the components attached to that reducer's object). How can I stop state from changing without a dispatch?
For example if I do:
export function changeNeonGreenColourValue(colour) {
  return (dispatch) => {
    // getState() returns a reference into the store, not a copy,
    // so this assignment mutates the store's state directly
    var neonColours = store.getState().colours.neon;
    neonColours.green = colour;
    dispatch(push('./home'));
  };
}
And then in the layoutComponent I log:
console.log(this.props.state.colours.neon.green)
The output is whatever I passed into changeNeonGreenColourValue() as "colour", but the page doesn't re-render to show that change. I know that to get the page to re-render all I have to do is dispatch the appropriate reducer case, but I don't want the state object altered at all unless there's an appropriate dispatch.
Apparently the 'standard' deep-copying technique for solving this is to stringify and parse the JSON, like so: const copiedObj = JSON.parse(JSON.stringify(sourceObj)); Unfortunately, if you use this on large objects that need copying frequently, you're going to run into performance issues in your app, as I did. If anyone has any suggestions for that, I welcome them.
edit: both jQuery and Lodash have their own implementations of deep cloning that are supposed to perform better:
https://lodash.com/docs/#cloneDeep
I personally used Lodash to resolve my issue and it worked fine with little to no performance impact. I highly recommend it over JSON.stringify.
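Applied to the thunk above, that looks something like this (a sketch assuming Lodash is installed):
import cloneDeep from 'lodash/cloneDeep';

export function changeNeonGreenColourValue(colour) {
  return (dispatch) => {
    // clone first, so the edit touches a copy rather than the store's object
    var neonColours = cloneDeep(store.getState().colours.neon);
    neonColours.green = colour;
    dispatch(push('./home'));
  };
}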
Using the Apollo cache as a global store for remote and local data is very convenient.
However, while I've never used Redux, I think the most important thing about it is that it implements Flux: an event-driven architecture on the front end that separates logic and ensures separation of concerns.
I don't know how to implement that with Apollo. The doc says
When a mutation modifies multiple entities, or if it creates or deletes entities, the Apollo Client cache is not automatically updated to reflect the result of the mutation. To resolve this, your call to useMutation can include an update function.
Adding an update function in one part of the application that handles all cache updates, by updating queries and/or fragments for all the other parts of the application, is exactly what we want to avoid in a Flux / event-driven architecture.
To illustrate this, let me give a single simple example. Here, we have at least 3 linked components:
1. InboxCount
Component that shows the number of Inbox items in the SideNav
query getInboxCount {
  inbox {
    id
    count
  }
}
2. Inbox list items
Component that displays items on the Inbox page
query getInbox {
  inbox {
    id
    items {
      ...ItemPreview
      ...ItemDetail
    }
  }
}
Both of these components read data from those GQL queries via auto-generated hooks, e.g. const { data, loading } = useGetInboxItemsQuery()
3. AddItem
Component that creates a new item. Because it creates a new entity, I need to manually update the cache, so I am forced to write this (pseudo-code):
const [addItem, { loading }] = useCreateItemMutation({
  update(cache, { data }) {
    const cachedData = cache.readQuery<GetInboxItemsQuery>({
      query: GetInboxItemsDocument,
    })
    if (cachedData?.inbox) {
      // 1. Update items list GetInboxItemsQuery
      const newItems = cachedData.inbox.items.concat(data.items)
      cache.writeQuery({
        query: GetInboxItemsDocument,
        data: {
          inbox: {
            id: 'me',
            __typename: 'Inbox',
            items: newItems,
          },
        },
      })
      // 2. Update another query wrapped into another reusable method, here
      setInboxCount(cache, newItems.length)
    }
  },
})
Here, my AddItem component must be aware of all the other queries / fragments declared in my application. Moreover, as it's quite verbose, complexity increases very fast in the update method, especially when multiple lists / queries need to be updated, as in the example here.
Does anyone have recommendations for implementing more independent components? Am I wrong in how I created my queries?
The unfortunate truth about update is that it trades simplicity for performance. A truly "dumb" client would only receive data from the server and render it, never manipulating it. By instructing Apollo how to modify our cache after a mutation, we're inevitably duplicating the business logic that already exists on our server. The only way to avoid this is to either:
Have the mutation return a larger section of the graph. For example, if a user creates a post, instead of returning the created post, return the complete user object, including all of the user's posts.
Refetch the affected queries.
Of course, often neither approach is particularly desirable and we opt for injecting business logic into our client apps instead.
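For what it's worth, option 2 needs no update function at all. Here is a sketch using Apollo's refetchQueries option (GetInboxCountDocument is assumed to be the generated document for the getInboxCount query above):
const [addItem, { loading }] = useCreateItemMutation({
  // refetch the affected queries instead of hand-editing the cache;
  // simpler and free of duplicated business logic, at the cost of
  // an extra round trip per query
  refetchQueries: [
    { query: GetInboxItemsDocument },
    { query: GetInboxCountDocument },
  ],
})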
Separating this business logic could be as simple as keeping your update functions in a separate file and importing them as needed. This way, at least you can test the update logic separately. You may also prefer a more elegant solution like utilizing a Link. apollo-link-watched-mutation is a good example of a Link that lets you separate the update logic from your components. It also solves the issue of having to keep track of query variables in order to perform those updates.
I am setting up my React project with Redux for the first time, and am running into an issue.
I have a basic container, mapStateToProps, action creator, and reducer in place, but I'm running into a problem: when I load a certain page, the previous prop values render before the correct values are fetched. It makes sense that this is happening, but I was wondering if there's a way around it, so that upon loading this component / container the values get cleared. (Or would it be better to use React state instead of Redux state for this specific case? Redux makes sense for the rest of the pages; this is the only place it becomes an issue.)
Say I have a page that shows some state of some item: somePage.com/items/
The component and container are set up like this (skeletal for the sake of example):
class SomeComponent extends React.Component<......> {
  constructor(props) {
    super(props);
    props.dispatch(fetchItemDetail(params));
  }

  render() {
    // use the stuff in props to display item's info
  }
}
function mapStateToProps(state) {
  return {
    someItemDetail: state.someItemDetail.X,
    someOtherInfo: state.someDetail.Y
  };
}
export const ItemDetailContainer = connect(mapStateToProps)(SomeComponent)
The action creator involves calling an API and returning the payload, and the reducer is just a standard reducer that sticks the result of the API call into the state.
Basically, this "works", but if I navigate from going into SomeComponent with a parameter for ItemX, and click a link to go to SomeComponent for ItemY, ItemX's information will show until the call to fetch ItemY's info completes, then ItemY's info will appear.
What's the recommended approach for handling this issue? It looks like I can't just clear the props upon construction because they are readonly. Any thoughts?
Self-answer:
I got something to work. Basically, I ended up creating an action creator, clearItemDetail, with a CLEAR_ITEM_DETAIL type. When the reducer sees an action with CLEAR_ITEM_DETAIL, it "clears" that part of the state. (I had multiple things to clear, so my actions and reducers were a little more complicated than this.)
I dispatched the clearItemDetail() action creator inside the component's componentWillUnmount() lifecycle method, and it seems to be working now.
Not sure if this is the best route though.
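In code, the self-answer amounts to roughly this sketch (the action type and state shape are assumed from the description above):
// action creator
function clearItemDetail() {
  return { type: 'CLEAR_ITEM_DETAIL' };
}

// reducer case that "clears" the detail slice
case 'CLEAR_ITEM_DETAIL':
  return Object.assign({}, state, { X: null, Y: null });

// component: clear the stale data when navigating away
componentWillUnmount() {
  this.props.dispatch(clearItemDetail());
}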
It looks like fetchItemDetail is an asynchronous call. That being so, add a reducer case for it that clears ItemY's data as soon as the fetch starts; this reducer can coexist with the middleware that handles the asynchronous call.
case FETCH_ITEM_DETAIL:
  return Object.assign({}, state, { ItemY: null });
You can handle the null value in ItemY in your component to not show any data. Then, upon completion of the asynchronous call, assign the returned value (this sounds like it's already done as the ItemY data does appear when the call is complete):
case FETCH_ITEM_COMPLETE:
  return Object.assign({}, state, { ItemY: /* your item y data */ });
I'm converting an existing state model to Redux and it has been painless for the most part. However the one point I'm having trouble with is converting "observed" state ajax requests. Essentially, I have certain ajax requests "linked" to other pieces of state, so no matter who modifies them they'll always be issued correctly. I can get similar behavior by subscribing to the Redux store updates, but firing actions in the listener feels like a hack.
A possible solution is to move logic to the action creator via the thunk pattern. Problem is that I'd either have to duplicate fetching logic across actions (since multiple actions could modify "observed" state), or pull most reducer logic to the action creator level. The action creator also shouldn't be aware of how the reducers will respond to issued actions.
I could batch "sub-actions" so I only need to place the appropriate fetching logic in each action "block", but this seems to violate the concept of actions producing a valid state. I'd rather have this liability at the action creator level.
Are there any generally accepted rules surrounding this? This is not a simple application where ad hoc ajax requests are made as components are interacted with, most data is shared between multiple components and requests are optimized and fetched in reaction to state change.
TLDR;
I want to fire ajax requests in response to changes in state, not when a specific action happens. Is there a better, "Redux specific" way of organizing action/actionCreators to mock this behavior, other than firing these actions in a subscribe listener?
Using store.subscribe()
The easiest way is to simply use store.subscribe() method:
let prevState = store.getState()
store.subscribe(() => {
  let state = store.getState()
  if (state.something !== prevState.something) {
    store.dispatch(something())
  }
  prevState = state
})
You can write a custom abstraction that lets you register conditions for side effects so they are expressed more declaratively.
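For instance, a hypothetical helper (not a library API) along these lines:
function observeStore(store, select, onChange) {
  let currentState

  function handleChange() {
    const nextState = select(store.getState())
    if (nextState !== currentState) {
      currentState = nextState
      onChange(nextState)
    }
  }

  const unsubscribe = store.subscribe(handleChange)
  handleChange() // run once for the initial state
  return unsubscribe
}

// dispatch something() whenever state.something changes
observeStore(store, state => state.something, () => store.dispatch(something()))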
Using Redux Loop
You might want to look at Redux Loop, which lets you describe effects (such as AJAX calls) together with state updates in your reducers.
This way you can "return" those effects in response to certain actions just like you currently return the next state:
import { loop, Effects } from 'redux-loop';

export default function reducer(state, action) {
  switch (action.type) {
    case 'LOADING_START':
      return loop(
        { ...state, loading: true },
        Effects.promise(fetchDetails, action.payload.id)
      );
    case 'LOADING_SUCCESS':
      return {
        ...state,
        loading: false,
        details: action.payload
      };
    default:
      return state;
  }
}
This approach is inspired by the Elm Architecture.
Using Redux Saga
You can also use Redux Saga, which lets you write long-running processes ("sagas") that can take actions, perform some asynchronous work, and put result actions to the store. Sagas watch specific actions rather than state updates, which is not what you asked for, but I figured I'd still mention them just in case. They work great for complicated async control flow and concurrency.
import { call, put, takeEvery } from 'redux-saga/effects'

function* fetchUser(action) {
  try {
    const user = yield call(Api.fetchUser, action.payload.userId);
    yield put({ type: "USER_FETCH_SUCCEEDED", user: user });
  } catch (e) {
    yield put({ type: "USER_FETCH_FAILED", message: e.message });
  }
}

function* mySaga() {
  yield takeEvery("USER_FETCH_REQUESTED", fetchUser);
}
No One True Way
All these options have different tradeoffs. Sometimes people use one or two, or even all three of them, depending on what turns out to be most convenient for testing and describing the necessary logic. I encourage you to try all three and pick what works best for your use case.
You can use a middleware to fire your remote actions in response to the local action.
Let's say I have a local action:
const updateField = (val) => ({
  type: UPDATE_FIELD,
  val
})
And an input field with:
<input type='text' onChange={this.props.updateField.bind(this.val)}>
So in a nutshell, when you type inside the field it fires your action, which in turn changes the state via the reducer. Let's just forget how this action was passed to the component or what this.val is - we'll assume that has already been solved and is working.
All is fine with this setup, but it only changes your state locally. To update the server you will have to fire another action. Let's build it:
const updateFieldOnServer = (val) => {
  return (dispatch) => {
    MAKE_AJAX.done(
      FIRE_SOME_ACTIONS_ON_SUCCESS
    ).failure(
      FIRE_SOME_ACTIONS_ON_FAILURE
    )
  }
}
This is just a simple async thunk action that somehow makes an AJAX request, returns a promise, and fires other actions on success or failure.
So the problem now is that I want both of these actions to be fired when I change the state of my input, but I can't have onChange take two functions. So I will create a middleware named ServerUpdatesMiddleware:
import _ from 'lodash'
import {
  UPDATE_FIELD,
} from 'actionsPath'
// import the thunk defined above from wherever it lives
import { updateFieldOnServer } from 'actionsPath'

export default ({ dispatch }) => next => action => {
  if (_.includes([UPDATE_FIELD], action.type)) {
    switch (action.type) {
      case UPDATE_FIELD:
        dispatch(updateFieldOnServer(action.val))
    }
  }
  return next(action)
}
I can add it to my stack:
import { createStore, applyMiddleware } from 'redux'
import thunkMiddleware from 'redux-thunk'
import logger from 'redux-logger'
import ServerUpdatesMiddleware from 'pathToMe'

const createStoreWithMiddleware = applyMiddleware(
  ServerUpdatesMiddleware,
  thunkMiddleware,
  logger
)(createStore);
And now, every single time the updateField action is dispatched, it will automatically dispatch the updateFieldOnServer action as well.
This is just an example that I think illustrates the problem simply - it can be solved in many different ways, but I think this nicely fits the requirements. It's just how I do things - hope it helps you.
I use middlewares all the time and have many of them - I've never had any problem with this approach, and it simplifies the application logic: you only have to look in a single place to find out what's going on.
Having modules that subscribe to the state updates and then launch Ajax requests (firing actions as they go) seems fine to me, since it puts the stores/reducers firmly in charge of triggering requests. In my large app, ALL Ajax requests and other async behaviours are done this way, so all actions can be plain payloads, with no concept of 'action creators'.
If possible, avoid cascading sync actions. My async handlers never fire actions synchronously, but only once the request completes.
In my view, this is a much more functional approach than async action creators, which you may or may not prefer!
componentWillReceiveProps in the React lifecycle is the best place to do this. It receives the incoming props, which you can compare against the current this.props to detect the change and dispatch your action, which in turn will fire the AJAX call.
The catch here is that the piece of state you're checking needs to be exposed as one of the component's props via mapStateToProps, so that it gets passed to componentWillReceiveProps. Hope it helps!
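For illustration, a minimal sketch with placeholder prop and action names:
componentWillReceiveProps(nextProps) {
  // `watchedValue` must be mapped in mapStateToProps for this comparison to work
  if (nextProps.watchedValue !== this.props.watchedValue) {
    this.props.dispatch(fetchData(nextProps.watchedValue));
  }
}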
In KnockoutJS, what's the proper way to update an observableArray of JSON data each time an AJAX command is run?
Right now, I'm blanking the array using something like viewmodel.items([]), then repopulating it with the JSON data from the server. Short of using the KnockoutJS mapping plugin (which might be the only way to do this) what is the correct path?
My server logic is going to send some of the same data each time, so I can't just iterate and push the items into the array unless I want duplicates.
//// Adding how I'm doing it today ////
I'm not sure why I'm doing it this way, but it's just how I initially figured out how to update. So basically, like I said before, I get the JSON data, then for each node in the JSON array I pass it to something like this:
_model.addIncident = function (json) {
  var checked = json.UserTouches > 0;
  _model.incidents.push({
    id: ko.observable(json.IncidentIDString),
    lastTouchId: ko.observable(json.UserLastTouchIDString),
    weight: ko.observable(json.Weight),
    title: ko.observable(json.Title),
    checked: ko.observable(checked),
    createdOn: ko.observable(json.IncidentCreatedOn),
    servicename: ko.observable(json.Servicename),
    inEdit: ko.observable(false),
    incidentHistory: ko.observableArray(),
    matchScore: ko.observable()
  });
};
As you can see, I've got some custom observables in there that get built with every passing piece of data. Maybe this is the wrong way to go, but it's worked great up until now.
An observableArray is really just a normal observable with some extra methods for array operations.
So, if you want to set the value of an observableArray to a new array, you can just do:
viewModel.items(myNewArray)
The mapping plugin can help you update the existing items in an array with any updates. In that case, your UI will only be updated where there are differences.
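For example, the mapping plugin's key option lets it match incoming items against existing ones instead of appending duplicates (a sketch; the id field is an assumption about your data):
var mapping = {
    key: function (item) {
        // identify items by id so re-sent items update in place
        return ko.utils.unwrapObservable(item.id);
    }
};
// merge the fresh server data into the existing observableArray
ko.mapping.fromJS(serverItems, mapping, viewModel.items);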
I know I'm way too late on this one, as I found myself stuck in this situation just recently. We can use a simple JavaScript util function as a workaround.
If you have already marked _model.incidents as an observableArray, you can do something like this when binding the returned JSON data:
eval("_model.incidents("+JSON.stringify(json)+");");
It worked for me. Hope you have created your observable like this:
_model.incidents = ko.observableArray([]);