Making an Ajax request using data from previous ajax request - ajax

I am attempting to write an Angular page to communicate with my Nodejs server, but I have run into a snag.
I need to use multiple Ajax requests that rely on the data from previous ajax requests to work.
So Ajax request #1 provides data that is used by all other Ajax requests, and Ajax request #2 uses data from ajax request #1 to get the data that Ajax request #3 needs.
Since Angular is asynchronous, how can I make my script wait for the data from the first request before making the next ajax call?
id = ajax()
Wait for data
token = ajax(id)
wait for data
gametoken = ajax(id, token)
wait for data

Chandermani is correct; just make sure the variables you need are available in the scope where you need them.
var id, token, gametoken;
$http.get('http://host.com/first')
  .then(function(result){
    id = result.data;
    return $http.get('http://host.com/second/' + id);
  })
  .then(function(result){
    token = result.data;
    return $http.get('http://host.com/third/' + id + '/' + token);
  })
  .then(function(result){
    gametoken = result.data;
    // Do other code here that requires id, token and gametoken
  });
EDIT:
You don't have to chain the promises. If you want to make a call later and want to make sure the earlier promises have resolved, you can use $q.all():
var id, token, gametoken;
var p1 = $http.get('http://host.com/first')
  .then(function(result){
    id = result.data;
  });

// Later on, to make your second call
$q.all([p1]).then(function(){
  // Make the second call knowing that the first has finished.
});
$q.all() takes an array so you can put in multiple promises if you want and it will wait until they have all resolved.
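For example, a minimal sketch with two independent requests (the endpoint URLs are only placeholders):
var p1 = $http.get('http://host.com/first');
var p2 = $http.get('http://host.com/second');
$q.all([p1, p2]).then(function(results){
  // results[0] and results[1] are the responses of p1 and p2, in the same order
});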

Related

Multiple Observables waiting for single Observable completion on RetryWhen

I am trying to build an app using OAuth2.
There are multiple (GET) calls to the API from the frontend and I deal with them like so:
// Call API for data
this.apiGet('/foo?id=bar').pipe(
  // If it fails, use the refresh token to obtain a new access token and try again
  retryWhen(x => {
    // If there is an existing request (Observable), use it
    if (this.refreshTokenObservable !== null) {
      return this.refreshTokenObservable;
    }
    // Otherwise make a new request (fills this.refreshTokenObservable)
    return this.getAccessByRefreshToken();
  })
)
This works fine when there is only one apiGet(); however, when making multiple calls one right after another, the refreshTokenObservable variable does not get set quickly enough, causing multiple refresh calls and, subsequently, errors.
Is there any way to prevent this?
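One common way to avoid the duplicate refresh calls is to create and cache the refresh observable synchronously, before anything subscribes to it, so concurrent retries all reuse the same in-flight request. A sketch, assuming share and finalize are imported from rxjs/operators:
getSharedRefresh() {
  if (!this.refreshTokenObservable) {
    this.refreshTokenObservable = this.getAccessByRefreshToken().pipe(
      // once the refresh settles, clear the cache so a later 401 starts a new one
      finalize(() => { this.refreshTokenObservable = null; }),
      // all subscribers share the single underlying refresh request
      share()
    );
  }
  return this.refreshTokenObservable;
}
The retryWhen callback would then return this.getSharedRefresh() instead of reading the property directly.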

mocha: can't use one request for multiple `it` tests

const request = require('supertest');
const server = request('http://localhost:9001');
describe('Get /static/component-list.json', function() {
  const api = server.get('/static/component-list.json');
  it('should response a json', function(done) {
    api.expect('Content-Type', /json/, done);
  });
  it('200', function(done) {
    api.expect(200, done); // This will fail
    // server.get('/static/component-list.json').expect(200, done); // This would succeed
  });
});
When api is reused in the second test case, mocha raises an error (this is what the mocha test/api command reports).
How can I make the request once and use it in multiple it cases?
Solution
You have to create a new request for each test (each it) that you want to run. You cannot reuse the same request for multiple tests. So:
describe('Get /static/component-list.json', function() {
  let api;
  beforeEach(() => {
    api = server.get('/static/component-list.json');
  });
  // ...the `it` tests stay the same, but each now uses a fresh request
});
Or if you want to reduce the number of requests made, then combine all your checks on the request into a single Mocha test.
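For example, a single test can carry both expectations on one request (a sketch, reusing the server from the question):
it('should respond with json and a 200 status', function(done) {
  server.get('/static/component-list.json')
    .expect('Content-Type', /json/)
    .expect(200, done);
});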
Explanation
If you look at the code of supertest, you'll see that when you call an expect method with a callback, expect automatically calls end. So this:
api.expect('Content-Type', /json/, done);
is equivalent to this:
api.expect('Content-Type', /json/).end(done);
The end method is provided by superagent, which is what supertest uses to perform requests. The end method is what kicks off the request. It means you are done setting up the request and want to fire it off now.
The end method calls the request method, which is tasked with using Node's networking machinery to produce a Node request object that is used to perform the network operation. The problem is that request caches the Node request it produces, but this Node request object is not reusable. So ultimately, a superagent or supertest request cannot be ended twice. You have to reissue the request for each test.
(You could manually flush the cached object between tests by doing api.req = undefined. But I strongly advise against this. For one thing, whatever optimization you might think you'd get is minimal because the network request still has to be made anew. Secondly, this amounts to messing with superagent's internals. It may break with a future release. Third, there may be other variables that hold state that might need to be reset together with req.)

Race condition between componentWillReceiveProps and componentDidMount

I have a component that takes some data in the props and make an ajax request with them.
var ItemList = React.createClass({
  propTypes: {
    filters: React.PropTypes.object.isRequired,
  },
  getInitialState: function() {
    return {items: []};
  },
  componentDidMount: function() {
    this.ajaxFetchItems(this.props.filters);
  },
  componentWillReceiveProps: function(nextProps) {
    this.ajaxFetchItems(nextProps.filters);
  },
  ajaxFetchItems: function(filter) {
    // ....
    this.setState({items: data});
  }
});
The problem is that the props are changed almost immediately, and sometimes the ajax call in componentDidMount is slightly slower than the one in componentWillReceiveProps, so the initial state is written after the first update.
How can I avoid that a slow componentDidMount will overwrite a fast componentWillReceiveProps?
Are there better ways to handle the lifecycle of a React component that downloads its own data?
You could put a timestamp in state for the latest update processed.
And somehow make sure that the timestamp of the original Ajax request is included in the Ajax results.
And add a shouldComponentUpdate() to check if the received results have a timestamp that is later than the timestamp in state. If not: return false, and your component will ignore the results.
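A minimal sketch of the timestamp idea, here dropping a stale response in the response handler before it ever reaches setState rather than in shouldComponentUpdate (fetchItems is a hypothetical async helper):
ajaxFetchItems: function(filter) {
  var requestedAt = Date.now();
  this.latestRequest = requestedAt;      // remember the newest request we sent
  fetchItems(filter, function(data) {    // hypothetical async helper
    if (requestedAt !== this.latestRequest) {
      return;                            // a newer request has superseded this one
    }
    this.setState({items: data});
  }.bind(this));
}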
By the way: componentDidMount and componentWillReceiveProps can by definition only run in that order. I suspect that your first Ajax call takes a long time to return its result, and your second call is fast, so you get the Ajax results back in the wrong order (not because of slow React functions).
UPDATE:
Using shouldComponentUpdate is the react-way of dealing with this case: Its purpose is to allow for comparison of the new state with the old state, and based on that comparison, not rerender.
The issue is (most likely) generated by the order in which ajax responses come in:
Ajax call 1 (fired in componentDidMount in this example)
Ajax call 2 (fired in componentWillReceiveProps, triggered by the parent of the component)
Response from call 2 comes in
Response from call 1 comes in.
So a more generic question/solution would be "How to handle ajax responses coming back in the wrong order".
The timestamp (in shouldComponentUpdate) is one way to do it.
An alternative (described here) is to make the second request (in componentWillReceiveProps) abort the first ajax request.
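With jQuery, for instance, that abort approach could look roughly like this (a sketch; the question does not say which ajax library is used, and '/items' is only a placeholder URL):
ajaxFetchItems: function(filter) {
  if (this.pendingRequest) {
    this.pendingRequest.abort();         // kill the outstanding request, if any
  }
  this.pendingRequest = $.getJSON('/items', filter)
    .done(function(data) {
      this.setState({items: data});
    }.bind(this))
    .always(function() {
      this.pendingRequest = null;        // nothing outstanding any more
    }.bind(this));
}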
Revisit:
After giving it some further thought (the calls in componentDidMount and componentWillReceiveProps did not feel right), a more general react-like way to approach your component would probably be as follows:
Your component's job is basically to:
receive filter via prop,
use filter to fetch list with ajax,
and render the ajax response = list.
So it has 2 inputs:
filter (= prop)
list (= ajax response)
and only 1 output = list (which may be empty).
Workings:
The first time the component receives filter as a prop: it needs to send out an ajax request, and render an empty list or some loading state.
For all subsequent filters: the component should send out a new ajax request (and kill possible outstanding old requests), and it should NOT re-render (!).
Whenever it receives an ajax response, it should re-render the list (by updating state).
Setting this up with react would probably look something like this:
getInitialState() {
  this.fetchAjax(this.props.filter);   // initiate first ajax call here
  return { list : [] };                // used to maybe display "loading.." message
},
componentWillReceiveProps(nextProps) {
  this.fetchAjax(nextProps.filter);    // send off ajax call to get new list with new filter
},
shouldComponentUpdate(nextProps, nextState) {
  return (this.state.list != nextState.list); // only update component if there is a new list,
                                              // so when new props (filter) come in there is NO rerender
},
render() {
  createChildrenWith(this.state.list);
},
fetchAjax(filter) {
  killOutStandingRequests();           // some procedure to kill old ajax requests
  getListAsync…
    request: filter                    // request new list with new filter
    responseHandler: this.handleResponse // add response handler
},
handleResponse(data) {
  this.setState({ list : data });      // put the new list in state, triggering render
}
The original timestamp in state would solve the question posted above, but I thought I'd share the revised react component as a bonus...

Reload page with new context in express

I have a page that lists events, on which admins can delete individual items with an AJAX call. I want to reload the page when an event is deleted, but I am having trouble implementing it with my current understanding of express' usual req, res, and next.
Here is my current implementation (simplified):
Simple jQuery code:
$(".delete").click(function(e){
$.post("/events/delete",{del:$(this).val()})
})
in my routes file:
function eventCtrl(req, res){
  Event.find({}).exec(function(err, events){
    ...
    var context = {
      events: events,
      ...
    }
    res.render('events', context);
  });
}

function deleteCtrl(req, res, next){
  Event.findById(req.param("del")).exec(function(err, event){
    // delete my event from google calendar
    ...
    event.remove(function(err){
      ...
      return next();
    });
  });
}
app.get('/events',eventCtrl);
app.post('/events/delete',deleteCtrl,eventCtrl);
When I make a post request with AJAX all the req handlers are called, the events are deleted successfully, but nothing reloads. Am I misunderstanding what res.render() does?
I have also tried using a success handler in my jQuery code when I make the post request, with a res.redirect() from deleteCtrl, but my context is undefined in that case.
On the client side, you are using:
$(".delete").click(function(e){
  $.post("/events/delete", {del: $(this).val()});
});
This code does not instruct the browser to do anything when the response from the post is received, so nothing visible happens in the browser.
Your problem is not located server side; the server answers with the context object. You are simply not doing anything with that answer.
Try simply adding a success handler.
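For example, the simplest version just reloads the page once the server confirms the delete:
$(".delete").click(function(e){
  $.post("/events/delete", {del: $(this).val()}, function(){
    // the delete finished; fetch the page again so the list is re-rendered
    window.location.reload();
  });
});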
Generally speaking this would not be a best practice. What you want to do is reconcile the data. If the delete is successful, then just splice the object out of the array it exists in client-side. One alternative would be to actually send back a refreshed data set:
res.json( /* get the refreshed data set */ );
Then client-side, in the callback, you'd actually just set the data source(s) back up based on the result:
... myCallback(res) {
// refresh the data source(s) from the result
}
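Sketched out with the snippets above (renderEventList is a hypothetical client-side helper, and deleteCtrl is assumed to end with res.json(events) instead of calling next()):
$(".delete").click(function(e){
  $.post("/events/delete", {del: $(this).val()}, function(events){
    // `events` is the refreshed data set returned as JSON
    renderEventList(events);
  });
});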

Debugging Ajax requests in a Symfony environment

Not sure if SFDebug is any help in this situation. I am making an ajax post using jQuery; the action URL retrieves the JSON data and then calls the model method that does the actual work. The part up to my action URL, and the jQuery call to it, work fine, with the data transmitted from the client to the server received correctly and no errors raised.
It is the part where it calls the method on the Model that is failing. My jQuery method looks like this:
$.post(url, jsonData, function(servermsg) { console.log(servermsg); }) ;
My server action is like this
public function executeMyAjaxRequest(sfWebRequest $request)
{
  if($request->isXmlHttpRequest())
  {
    // process whatever
    $servermsg = Doctrine_Core::getTable('table')->addDataToTable($dataArray);
    return $this->renderText($servermsg);
  }
  return false;
}
The method of concern in the Table.class.php file looks like this:
public function addDataToTable($dataArray)
{
  // process $dataArray and retrieve the necessary data
  $data = new Data();
  $data->field = $dataArray['field'];
  .
  .
  .
  $data->save();
  return $data->id;
}
The method fails somewhere up in the model: when the renderText in the action is returned and logged to the console, it contains the HTML for SFDebug, which indicates that it failed.
If this were not an Ajax call, I could debug it by seeing what the model method spat out, but this is a little tedious with Ajax in the mix.
I am not looking for exact answers here, but rather for how I can approach debugging ajax requests in a Symfony environment, so if there are suggestions on how I can debug this, that would be great.
You must send the cookie with the session IDE key via ajax.
(Assuming you have XDEBUG configured on the server)
In order to trigger a debug session from an AJAX request you have to make that request send an additional URL parameter, XDEBUG_SESSION_START=1. For your example:
$.post(url + '?XDEBUG_SESSION_START=1', jsonData, function(servermsg) { console.log(servermsg); }) ;
You can also trigger it via a cookie, but appending the URL parameter is usually easier.
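If you do go the cookie route, setting the XDEBUG_SESSION cookie once from the browser has the same effect for every subsequent request; a one-line sketch (the value should match your configured IDE key, 1 is only a placeholder):
document.cookie = 'XDEBUG_SESSION=1; path=/';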
