I have a component that takes some data in its props and makes an ajax request with it.
var ItemList = React.createClass({
  propTypes: {
    filters: React.PropTypes.object.isRequired,
  },
  getInitialState: function() {
    return {items: []};
  },
  componentDidMount: function() {
    this.ajaxFetchItems(this.props.filters);
  },
  componentWillReceiveProps: function(nextProps) {
    this.ajaxFetchItems(nextProps.filters);
  },
  ajaxFetchItems: function(filter) {
    // ... fire the ajax request; in its success callback:
    this.setState({items: data});
  }
});
The problem is that the props change almost immediately, and sometimes the ajax call fired in componentDidMount returns slightly later than the one fired in componentWillReceiveProps, so the initial data is written to state after the first update.
How can I prevent a slow request started in componentDidMount from overwriting the result of a faster one started in componentWillReceiveProps?
Are there better ways to handle the lifecycle of a React component that downloads its own data?
You could put a timestamp in state for the latest update processed.
And somehow make sure that the timestamp of the original Ajax request is included in the Ajax results.
And add a shouldComponentUpdate() to check if the received results have a timestamp that is later than the timestamp in state. If not: return false, and your component will ignore the results.
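For illustration, a minimal sketch of that timestamp check, written against the component from the question (the requestedAt field and the fetchItems helper are invented names, and getInitialState would also need to return requestedAt: 0):

ajaxFetchItems: function(filters) {
  var requestedAt = Date.now();            // timestamp of this particular request
  fetchItems(filters, function(data) {     // fetchItems = your ajax helper (invented name)
    // store the request's timestamp alongside its results
    this.setState({items: data, requestedAt: requestedAt});
  }.bind(this));
},
shouldComponentUpdate: function(nextProps, nextState) {
  // only render results that are at least as new as what is already shown
  return nextState.requestedAt >= this.state.requestedAt;
}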
By the way: componentDidMount and componentWillReceiveProps can by definition only run in that order. I suspect that your first Ajax call takes long to return its result while your second call is fast, so you get the Ajax responses back in the wrong order (not because the React lifecycle methods themselves are slow).
UPDATE:
Using shouldComponentUpdate is the React way of dealing with this case: its purpose is to allow you to compare the new state with the old state and, based on that comparison, skip the re-render.
The issue is (most likely) caused by the order in which the ajax responses come in:
Ajax call 1 (fired in componentDidMount in this example)
Ajax call 2 (fired in componentWillReceiveProps, triggered by the parent of the component)
Response from call 2 comes in
Response from call 1 comes in.
So a more generic question/solution would be: how to handle ajax responses that come back in the wrong order.
The timestamp (in shouldComponentUpdate) is one way to do it.
An alternative (described here) is to make the second request (in componentWillReceiveProps) abort the first ajax request.
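A rough sketch of that abort variant, assuming jQuery is used for the request (the pendingRequest property and the /items endpoint are invented for illustration):

ajaxFetchItems: function(filters) {
  // cancel the previous request, if any, so its late response can never win
  if (this.pendingRequest) {
    this.pendingRequest.abort();
  }
  this.pendingRequest = $.getJSON('/items', filters);   // illustrative endpoint
  this.pendingRequest.done(function(data) {
    this.setState({items: data});                        // aborted requests never reach this handler
  }.bind(this));
}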
Revisit:
After giving it some further thought (the calls in componentDidMount and componentWillReceiveProps did not feel right), a more general, React-like way to approach your component would probably be as follows:
Your component's job is basically to:
receive filter via prop,
use filter to fetch list with ajax,
and render the ajax response (= the list).
So it has 2 inputs:
filter (= prop)
list (= ajax response)
and only 1 output = list (which may be empty).
Workings:
The first time the component receives a filter as a prop: it needs to send out an ajax request, and render an empty list or some loading state.
On all subsequent filters: the component should send out a new ajax request (and kill any outstanding old requests), and it should NOT re-render (!).
Whenever it receives an ajax response, it should re-render the list (by updating state).
Setting this up with React would probably look something like this:
getInitialState() {
  this.fetchAjax(this.props.filter);   // initiate the first ajax call here
  return { list: [] };                 // used to maybe display a "loading…" message
}

componentWillReceiveProps(nextProps) {
  this.fetchAjax(nextProps.filter);    // send off an ajax call to get a new list with the new filter
}

shouldComponentUpdate(nextProps, nextState) {
  // only update the component if there is a new list,
  // so when new props (filter) come in there is NO re-render
  return this.state.list !== nextState.list;
}

render() {
  return createChildrenWith(this.state.list);
}

fetchAjax(filter) {
  killOutstandingRequests();           // some procedure to kill old ajax requests
  getListAsync…
    request: filter                    // request the new list with the new filter
    responseHandler: this.responseHandler  // add the response handler
}

responseHandler(data) {
  this.setState({ list: data });       // put the new list in state, triggering a render
}
The original timestamp-in-state suggestion would solve the question posted above, but I thought I'd share the revised React component as a bonus...
Related
I have a list of users that I want to cache so that the different components in my Angular 5 app don't hit the web service and instead get a cached response. To do this I did the following:
getAllUsers() {
  return this.getUncachedUsersList().publishReplay().refCount();
}

getUncachedUsersList() {
  return this.http.get('https://......');
}
In the above code snippet, I have two methods. I call getAllUsers inside all the components that need the users list, except in the case where, say, I am adding a user and then need an updated list. In that case I call getUncachedUsersList to get the latest.
The problem is that when I call getUncachedUsersList, I expect getAllUsers to cache the new list, but instead it returns the same old list that was cached before adding the new user. So I would like to know how I can clear the cached response, save the new response I get from getUncachedUsersList, and return that new response when getAllUsers is called.
Rather than doing it like this, you should consider maintaining a cacheable Subject.
// a BehaviorSubject caches the latest result;
// each subscriber to userList$ gets the latest value
userList$ = new BehaviorSubject([]);

// each time getNewUserList() is called,
// userList$ receives the new list via next()
getNewUserList() {
  this.http.get(`http://...`).subscribe(list => this.userList$.next(list));
}
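On the consuming side, components would then read from the cached subject instead of calling the HTTP method directly; every subscriber sees the refreshed list after getNewUserList() runs (userService here is an assumed injected instance of the service above):

// in any component that needs the list: read from the cached subject
this.userService.userList$.subscribe(list => this.users = list);

// after adding a user, refresh the cache once;
// all existing subscribers receive the new list automatically
this.userService.getNewUserList();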
I'm using RequireJS while prototyping an application. I'm "faking" a real database by loading a json file via ajax.
I have several modules that need this json file, which I noticed results in multiple http requests. Since I'm already using RequireJS, I thought to myself "hey, why not load this json file as another module". Since a module can return an object, it seemed reasonable.
So I tried this:
// data.js
define(function(require){
  const $ = require('jquery')
  var data = {}
  $.getJSON('/path/to/data.json', function(json_data){
    data = json_data
  })
  // this does not work because getJSON is async:
  // the module returns before the response arrives
  return data
})

// some_module.js
define(function(require){
  const my_data = require('data')
  console.log(my_data) // still the empty initial object, but I want it to be the loaded JSON
})
I understand why what I'm doing is not working. I'm not sure what the best way to actually do this would be though.
Things I don't want to do:
Change getJSON to async: false
Add a while (data == null) {} loop before trying to return data
Is there an AMD-y way to accomplish what I'm trying to do? I'm sure there's a better approach here.
Edit
I just tried this. It works, but I'm not sure if this is a good or terrible idea:
// in data.js
return $.getJSON('/path/to/data.json')
// in some_module.js
const my_data = require('data')
my_data.then(function(){
console.log(my_data.responseText)
// do stuff with my_data.responseText
})
My concern is (1) browser support (this is a "promise", right?) and (2) if multiple modules do this at the same time, will it explode.
Because this question specifically refers to using jQuery, you can actually do this without a native promise by using jQuery's deferred.then().
// in data.js
return $.getJSON('/path/to/data.json')

// in some_module.js
const my_data = require('data') // this is a jqXHR object (a jQuery deferred)
// using jQuery's .then(), not a native promise
my_data.then(function(){
  console.log(my_data.responseText)
  // do stuff with my_data.responseText
})
Based on the description of then() in JQuery's docs, it looks like this is using a promise behind the scenes:
As of jQuery 1.8, the deferred.then() method returns a new promise that can filter the status and values of a deferred through a function, replacing the now-deprecated deferred.pipe() method. [...]
Callbacks are executed in the order they were added. Since deferred.then returns a Promise, other methods of the Promise object can be chained to this one, including additional .then() methods.
Since jQuery's .then() does work in IE, I guess jQuery provides the promise behaviour itself behind the scenes rather than relying on a native Promise.
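As a side note, the same jqXHR also passes the parsed JSON to the .then() callback directly, so reading responseText by hand isn't strictly necessary; a small sketch:

// some_module.js
define(function (require) {
  const my_data = require('data')    // the shared jqXHR returned by data.js
  my_data.then(function (json) {     // jQuery passes the parsed JSON as the first argument
    console.log(json)                // same data for every module that requires 'data'
  })
})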
I have a page that lists events, on which admins can delete individual items with an AJAX call. I want to reload the page when an event is deleted, but I am having trouble implementing it with my current understanding of Express's usual req, res, and next.
Here is my current implementation (simplified):
Simple jQuery code:
$(".delete").click(function(e){
$.post("/events/delete",{del:$(this).val()})
})
in my routes file:
function eventCtrl(req, res){
  Event.find({}).exec(function(err, events){
    ...
    var context = {
      events: events,
      ...
    }
    res.render('events', context);
  });
}

function deleteCtrl(req, res, next){
  Event.findById(req.param("del")).exec(function(err, event){
    // delete my event from google calendar
    ...
    event.remove(function(err){
      ...
      return next();
    });
  });
}

app.get('/events', eventCtrl);
app.post('/events/delete', deleteCtrl, eventCtrl);
When I make the post request with AJAX, all the request handlers are called and the event is deleted successfully, but nothing reloads. Am I misunderstanding what res.render() does?
I have also tried using a success handler in my jQuery code when I make the post request, with a res.redirect() from deleteCtrl, but my context is undefined in that case.
On the client side, you are using:
$(".delete").click(function(e){
  $.post("/events/delete", {del: $(this).val()})
})
This code does not instruct the browser to do anything when the response from the post is received, so nothing visible happens in the browser.
Your problem is not located server-side; the server answers with the context object. You are simply not doing anything with that answer.
Try simply adding a success handler.
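For example, a minimal sketch of such a success handler, reloading the page once the server has answered (the full reload is just one option; the route and field name are the ones from the question):

$(".delete").click(function (e) {
  $.post("/events/delete", { del: $(this).val() })
    .done(function () {
      // the delete finished on the server; now refresh what the browser shows
      window.location.reload();
    })
    .fail(function () {
      alert("Could not delete the event.");   // minimal error feedback
    });
});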
Generally speaking, this would not be best practice. What you want to do is reconcile the data: if the delete is successful, just splice the object out of the array it exists in client-side. One alternative would be to actually send back a refreshed data set:
res.json( /* get the refreshed data set */ );
Then client-side, in the callback, you'd actually just set the data source(s) back up based on the result:
function myCallback(res) {
  // refresh the data source(s) from the result
}
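A brief sketch of that client-side reconciliation, assuming deleteCtrl answers with res.json(refreshedEvents) and the list is redrawn by a client-side helper (renderEventList is an invented name):

$(".delete").click(function () {
  $.post("/events/delete", { del: $(this).val() }, function (events) {
    // the server responded with the refreshed data set (res.json),
    // so re-render the list from it instead of reloading the page
    renderEventList(events);
  });
});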
I am attempting to write an Angular page that communicates with my Node.js server, but I have run into a snag.
I need to make multiple Ajax requests that rely on data from previous Ajax requests.
So Ajax request #1 provides data that is used by all other Ajax requests, and Ajax request #2 uses data from request #1 to get the data that request #3 needs.
Since Angular is asynchronous, how can I make my script wait for the data from the first request before making the next Ajax call?
id = ajax()
Wait for data
token = ajax(id)
wait for data
gametoken = ajax(id, token)
wait for data
Chandermani is correct; just remember to make sure the variables you need are available in the scope where you need them.
var id, token, gametoken;

$http.get('http://host.com/first')
  .then(function(result){
    id = result.data;      // $http resolves with a response object; the payload is in result.data
    return $http.get('http://host.com/second/' + id);
  })
  .then(function(result){
    token = result.data;
    return $http.get('http://host.com/third/' + id + '/' + token);
  })
  .then(function(result){
    gametoken = result.data;
    // Do other code here that requires id, token and gametoken
  });
EDIT:
You don't have to chain the promises. If you want to make a call at a later date and you want to make sure the promises have resolved you can use $q.all();
var id, token, gametoken;

var p1 = $http.get('http://host.com/first')
  .then(function(result){
    id = result.data;
  });

// Later on, to make your new second call:
$q.all([p1]).then(function(){
  // Make the second call knowing that the first has finished.
});
$q.all() takes an array so you can put in multiple promises if you want and it will wait until they have all resolved.
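As a small sketch of that, two independent requests can be resolved together before a call that needs both results is made (URLs are placeholders in the same style as above):

var p1 = $http.get('http://host.com/first').then(function(result){ id = result.data; });
var p2 = $http.get('http://host.com/other').then(function(result){ token = result.data; });

// wait until both have resolved before firing the request that needs both values
$q.all([p1, p2]).then(function(){
  return $http.get('http://host.com/third/' + id + '/' + token);
});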
I have a custom event that I want to fire using jQuery's trigger method:
$(wizard).trigger('validatingStepValues');
Then in the wizard's current step code, I subscribe to this event as follow:
$(wizard).bind('validatingStepValues', function (){
  // Validating the step's form data here; returning false on invalid state.
});
Then in my wizard, again, I want to be able to stop the user from going to the next step if a false value is returned from the validation process. I'd like to have something like:
$(wizard).trigger('validatingStepValues', validReturnCallback, invalidReturnCallback)
Have you considered using something like:
function wizardValidator(successCallback, failureCallback) {
  return function() {
    // Validate the step's form data here and set wasValid accordingly
    if (wasValid && successCallback) {
      successCallback();
    }
    else if (!wasValid && failureCallback) {
      failureCallback();
    }
    return wasValid;
  };
}
$(wizard).bind('validatingStepValues', wizardValidator(validReturnCallback, invalidReturnCallback));
This requires that you know the callbacks that you want to use at the time you bind the event listener. If you want to be able to use different callback functions at different times, you could define additional event types, like:
$(wizard).bind('validatingStep2Values', wizardValidator(validStep2ReturnCallback, invalidStep2ReturnCallback));
$(wizard).bind('validatingStep3Values', wizardValidator(validStep3ReturnCallback, invalidStep3ReturnCallback));
Alternately, events that you create by calling trigger() propagate up the DOM hierarchy, and returning false from an event handler cancels this propagation. So you could bind your desired success callback function as an event listener on your wizard's parent node. That won't do anything to allow your failure callback to be executed, however.
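A rough sketch of that propagation-based variant, assuming the wizard lives inside a container element (#wizardContainer, validateStepForm, and goToNextStep are invented names):

// on the wizard itself: validate, and return false on failure,
// which cancels propagation so the parent's handler never runs
$(wizard).bind('validatingStepValues', function () {
  return validateStepForm();
});

// on the parent: only reached when validation succeeded
$('#wizardContainer').bind('validatingStepValues', function () {
  goToNextStep();   // acts as the success callback
});

$(wizard).trigger('validatingStepValues');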