AngularJS: Using $q to fire ajax calls synchronously - ajax

Is it possible to use $q to fire ajax requests synchronously in AngularJS?
I have a long list of vehicles; each vehicle has events associated with it, and I need to retrieve the event details of each event when the user expands the listing.
Right now, when the user expands the listing, I am firing up to 15 calls asynchronously, and it seems to be causing issues with the API I'm consuming, so I'd like to see if performance improves when I wait for each request to finish before firing the next.
I'm attempting to use $q to delay the next request until the previous one has finished, but I can't seem to wrap my head around the service. Here is what I currently have:
// On click on the event detail expander
$scope.grabEventDetails = function(dataReady, index) {
    if (dataReady == false) {
        retrieveEventDetails($scope.vehicles[index].events);
    }
}

var retrieveEventDetails = function(events) {
    // events is array
    var deferred = $q.defer();
    var promise = deferred.promise;

    var retrieveData = function(data) {
        return $http({
            url: '/api/eventdetails',
            method: 'POST',
            data: {
                event_number: data.number
            },
            isArray: true
        });
    }

    _.each(events, function(single_event) {
        promise.then(retrieveData(single_event).success(function(data) {
            console.log(data);
        }));
    });
}
This is still firing asynchronously. Where am I going wrong with this?
I understand that firing the requests sequentially isn't the best idea; at the moment I just want to see whether performance with the API improves at all.

You don't need $q to create a promise here, as $http already returns one.
_.each fires all the callbacks immediately, without waiting for the promise.
All you do is attach retrieveData for every event to your promise, and since you never resolve that first promise, it shouldn't even be working.
You could do a recursive call like this:
var retrieveEventDetails = function(events) {
    if (!events.length) {
        return; // nothing left to fetch, stop the recursion
    }
    var evt = events.shift();
    $http({
        url: '/api/eventdetails',
        method: 'POST',
        data: {
            event_number: evt.number
        },
        isArray: true
    }).then(function(response) {
        console.log(response.data);
        retrieveEventDetails(events);
    });
}
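Note that events.shift() consumes the array you pass in, so if the controller still needs $scope.vehicles[index].events afterwards, hand the function a copy. A minimal call-site sketch, reusing the click handler from the question:

$scope.grabEventDetails = function(dataReady, index) {
    if (dataReady == false) {
        // angular.copy keeps the original events array intact,
        // since retrieveEventDetails consumes its argument with shift()
        retrieveEventDetails(angular.copy($scope.vehicles[index].events));
    }
}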

I do think you should use $q, as some other part of your application might need to get a promise.
A good example would be $routeProvider's resolve option.
I made a little demo in Plunker.
Solution:
The retrieveData function should return a function (which returns a promise) instead of just a promise.
That way we can create a promise chain: promise.then(fn).then(fn).then(fn).then(null, errorFn)
We must resolve the first promise to kick off the chain.
var retrieveEventDetails = function(events) {
    // events is array
    var deferred = $q.defer();
    var promise = deferred.promise;

    var retrieveData = function(data) {
        return function() {
            return $http({
                url: '/api/eventdetails',
                method: 'POST',
                data: {
                    event_number: data.number
                },
                isArray: true
            })
        }
    }

    deferred.resolve();

    return events.reduce(function(promise, single_event) {
        return promise.then(retrieveData(single_event));
    }, promise);
}
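Because the reduce returns the tail of the chain, the caller gets back a promise that settles only after the last request has finished, so the requests run strictly one after another. A small usage sketch; the dataReady flag on the vehicle is my own illustration, not part of the original code:

retrieveEventDetails($scope.vehicles[index].events)
    .then(function() {
        // all event detail requests have completed, in order
        $scope.vehicles[index].dataReady = true;
    })
    .then(null, function(error) {
        console.error('One of the event detail requests failed', error);
    });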

I'm not sure you even need $q here. In this example, each piece of data is registered in the controller as soon as it comes back from the call.
Live demo (click).
var app = angular.module('myApp', []);

app.controller('myCtrl', function($scope, myService) {
    $scope.datas = myService.get();
});

app.factory('myService', function($http) {
    var myService = {
        get: function() {
            var datas = {};
            var i = 0;
            var length = 4;
            makeCall(i, length, datas);
            return datas;
        }
    }

    function makeCall(i, length, datas) {
        if (i < length) {
            $http.get('test.text').then(function(resp) {
                datas[i] = resp.data + i;
                ++i;
                makeCall(i, length, datas);
            });
        }
    }

    return myService;
});
Here's a way, using $q.all(), to wait for all of the data to come through before passing it to the controller: Live demo (click).
var app = angular.module('myApp', []);

app.controller('myCtrl', function($scope, myService) {
    myService.get().then(function(datas) {
        $scope.datas = datas;
    })
});

app.factory('myService', function($q, $http) {
    var myService = {
        get: function() {
            var deferred = $q.defer();
            var defs = [];
            var promises = [];
            var i = 0;
            var length = 4;
            for (var j = 0; j < length; ++j) {
                defs[j] = $q.defer();
                promises[j] = defs[j].promise;
            }
            makeCall(i, length, defs);
            $q.all(promises).then(function(datas) {
                deferred.resolve(datas);
            });
            return deferred.promise;
        }
    }

    function makeCall(i, length, defs) {
        if (i < length) {
            $http.get('test.text').then(function(resp) {
                defs[i].resolve(resp.data + i);
                ++i;
                makeCall(i, length, defs);
            })
        }
    }

    return myService;
});
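As a side note, the outer deferred isn't strictly needed: $q.all already returns a promise that get can hand back directly. A slimmed-down sketch of the same get method under that assumption:

get: function() {
    var defs = [];
    var promises = [];
    for (var j = 0; j < 4; ++j) {
        defs[j] = $q.defer();
        promises[j] = defs[j].promise;
    }
    makeCall(0, 4, defs);
    // $q.all itself resolves with the array of results once every deferred settles
    return $q.all(promises);
}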

Related

Parallel asynchronous Ajax calls from the client

I have 20 data packets on the client, and I am pushing them one by one to the server via Ajax POST. Each call takes approximately one minute to yield a response. Is there any way to make a few of these requests run in parallel?
I have used jQuery promises; however, each request still waits for the prior one to complete.
var dataPackets=[{"Data1"},{"Data2"},{"Data3"},{"Data4"},{"Data5"},
{"Data6"},{"Data7"},{"Data8"},{"Data9"},{"Data10"},
{"Data11"},{"Data12"},{"Data13"},{"Data14"},{"Data15"},{"Data16"},
{"Data17"},{"Data18"},{"Data19"},{"Data20"}];
$(dataPackets).each(function(indx, request) {
var req = JSON.stringify(request);
setTimeout({
$.Ajax({
url: "sample/sampleaction",
data: req,
success: function(data) {
UpdateSuccessResponse(data);
}
});
}, 500);
});
The $.when...done construct in jQuery runs the operations in parallel:
$.when(request1(), request2(), request3(), ...)
    .done(function(data1, data2, data3) {});
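Applied to the question's dataPackets, that could look roughly like the sketch below. The URL and success handler are taken from the question; the POST type and JSON content type are assumptions on my part:

// Build one jqXHR promise per packet; $.ajax fires immediately, so the
// requests run in parallel rather than waiting on each other.
var requests = $.map(dataPackets, function(packet) {
    return $.ajax({
        url: "sample/sampleaction",
        type: "POST",
        data: JSON.stringify(packet),
        contentType: "application/json",
        success: function(data) {
            UpdateSuccessResponse(data); // handle each response as it arrives
        }
    });
});

// Resolves once every request has completed.
$.when.apply($, requests).done(function() {
    console.log("All requests finished.");
});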
Here's an example:
http://flummox-engineering.blogspot.com/2015/12/making-your-jquery-ajax-calls-parallel.html
$.when.apply($, functionArray) lets you pass an array of requests (their returned promises) to be run in parallel. This array can be built dynamically. In fact, I'm doing this to export a web page to PDF based on items checked in a radio button list.
Here I create an empty array, var functionArray = [];, then, based on the selected items, I push each request onto the array with f = createPDF(checkedItems[i].value) followed by functionArray.push(f).
$(document).ready(function () {
});

function sleep(milliseconds) {
    var start = new Date().getTime();
    for (var i = 0; i < 1e7; i++) {
        if ((new Date().getTime() - start) > milliseconds) {
            break;
        }
    }
}

function exportPDFCollection() {
    var f = null;
    var x = 0;
    var checkedItems = $("input:checked");
    var count = checkedItems.length;
    var reportList = $(checkedItems).map(
        function () {
            return $(this).next("label").text();
        })
        .get().join(",");
    var functionArray = [];
    var pdf = null;
    for (var i = 0; i < count; i++) {
        f = createPDF(checkedItems[i].value)
            .done(function () {
                pdf = checkedItems[x++].value;
                alert('PDF => ' + pdf + ' created.');
            })
            .fail(function (jqxhr, errorText, errorThrown) {
                alert('ajax call failed');
            });
        functionArray.push(f);
    }
    $.when.apply($, functionArray)
        .done(function () {
            $.get("http://yourserver/ExportPage.aspx", {reports: reportList})
                .done(function () {
                    alert('PDF merge complete.');
                })
                .fail(function (jqxhr, errorText, errorThrown) {
                    alert('PDF merge failed. Please try again.');
                });
            return true;
        });
}

function createPDF(webPage) {
    return $.get(webPage);
}

How to reuse HTTP request with retry logic in AngularJS

Is it possible to execute the same HTTP request more than once in AngularJS? i.e. without re-defining the same request twice?
var retry = false;

var req = $http.get('ajax.php?a=StartSession&ref=' + code);

req.success(function(res) {
    alert(res);
});

req.error(function(res) {
    if (retry == false)
        //run request again req.get(); ?
        retry = true;
});
The previous answer is good in terms of reusing the request as a service, but it looks like you really want to abstract out the retry logic as well. Here is how I would do that.
app.service('SessionService', ['$http', '$q', function($http, $q) {
    var _this = this;
    var _maxRetryCount = 5; // just have a max retry count

    this.StartSession = function (code, retries) {
        // if you don't pass retries, take the max retry count
        retries = angular.isUndefined(retries) ? _maxRetryCount : retries;
        return $http.get('ajax.php?a=StartSession&ref=' + code)
            .then(function(result) {
                // process and return the result
                return result.data;
            }, function (errorResponse) {
                // if there are retries left, decrement the count and make the call again
                if (retries) {
                    return _this.StartSession(code, --retries); // here we are returning the promise
                }
                // all retries are done; now fail or return some data
                return $q.reject('oops, failed after retries');
            });
    }
}]);
And just inject SessionService anywhere, say in your controller:
SessionService.StartSession(code).then(function(result) {
    //handle result
}).catch(function() {
    //handle fail condition
});
Plnkr
It's what services and factories were made for:
app.factory("dataFactory", ["$http", function($http) {
return {
call: function(code) {
return $http.get( 'ajax.php?a=StartSession&ref=' + code )
}
}
}]);
Inject and use
app.controller("myCtrl", ["dataFactory", function(dataFactory) {
var code = "myCode";
dataFactory.call(code).success(function(res) {
//gotcha
});
}]);
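If you're on a newer AngularJS release where the .success shorthand has been deprecated or removed, the same call works with .then; a small sketch of the equivalent handler:

dataFactory.call(code).then(function(response) {
    // response.data holds what .success used to hand you directly
    var res = response.data;
    //gotcha
});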

Angular $http, $q: track progress

Is there a way to track the progress of HTTP requests with Angular $http and $q? I'm making $http calls from a list of URLs and then, using $q.all, I'm returning the result of all the requests. I would like to track the progress of each request (each promise resolved) so that I can show some progress to the user. I'm thinking of emitting an event when a promise gets resolved, but I'm not sure where that should be.
var d = $q.defer();
var promises = [];

for (var i = 0; i < urls.length; i++) {
    var url = urls[i];
    var p = $http.get(url, {responseType: "arraybuffer"});
    promises.push(p);
}

$q.all(promises).then(function(result) {
    d.resolve(result);
}, function(rejection) {
    d.reject(rejection);
});

return d.promise;
EDIT:
OK, after a bit of fiddling, this is what I've come up with
var d = $q.defer();
var promises = [];
var completedCount = 0;

for (var i = 0; i < urls.length; i++) {
    var url = urls[i];
    var p = $http.get(url, {responseType: "arraybuffer"}).then(function(response) {
        completedCount = completedCount + 1;
        var progress = Math.round((completedCount / urls.length) * 100);
        $rootScope.$broadcast('download.completed', {progress: progress});
        return response;
    }, function(error) {
        return error;
    });
    promises.push(p);
}

$q.all(promises).then(function(result) {
    d.resolve(result);
}, function(rejection) {
    d.reject(rejection);
});

return d.promise;
Not sure if it is the right way of doing it.
I see you have already edited your own code, but if you need a more general solution, keep reading.
I once made a progress solution based on all pending HTTP requests (showing an indicator that something is loading, kind of like the progress bar YouTube has at the top).
js:
app.controller("ProgressCtrl", function($http) {
this.loading = function() {
return !!$http.pendingRequests.length;
};
});
html:
<div id="fixedTopBar" ng-controller="ProgressCtrl as Progress">
<div id="loading" ng-if="Progress.loading()">
loading...
</div>
</div>
Hardcore
For my latest project it wasn't enough to cover just request calls. I started to get into sockets, web workers, FileSystem, FileReader, DataChannel and any other asynchronous calls that use $q. So I started looking into how I could get all the pending promises (including $http). It turns out there wasn't any built-in Angular solution, so I kind of monkey patched the $q provider by decorating it.
app.config(function($provide) {
    $provide.decorator("$q", function($delegate) {
        // $delegate == original $q service
        var orgDefer = $delegate.defer;
        $delegate.pendingPromises = 0;

        // override the defer method
        $delegate.defer = function() {
            $delegate.pendingPromises++; // increase
            var defer = orgDefer();
            // decrease, no matter whether it succeeds or fails
            defer.promise['finally'](function() {
                $delegate.pendingPromises--;
            });
            return defer;
        }

        return $delegate
    });
});

app.controller("ProgressCtrl", function($q) {
    this.loading = function() {
        return !!$q.pendingPromises;
    };
});
This may not fit everyone's needs for production, but it can be useful for developers to see whether any unresolved promises have been left behind and never get settled.
Make a small general helper function:
function allWithProgress(promises, progress) {
    var total = promises.length;
    var now = 0;
    promises.forEach(function(p) {
        p.then(function() {
            now++;
            progress(now / total);
        });
    })
    return $q.all(promises);
}
Then use it:
var promises = urls.map(function(url) {
    return $http.get(url, {responseType: "arraybuffer"});
});

allWithProgress(promises, function(progress) {
    progress = Math.round(progress * 100);
    $rootScope.$broadcast('download.completed', {progress: progress});
}).catch(function(error) {
    console.log(error);
});
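Whichever variant you use, something has to listen for that broadcast to actually show progress to the user. A minimal sketch of a controller picking up the download.completed event; the controller name and the progress scope property are my own invention:

app.controller('DownloadProgressCtrl', function($scope) {
    $scope.progress = 0;
    // fired by the code above each time a request resolves
    $scope.$on('download.completed', function(event, data) {
        $scope.progress = data.progress; // e.g. bind this to a progress bar's width
    });
});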

Conditionally pause Javascript to wait for ajax

The variable ajaxdata is modified within the success function; if that hasn't happened yet, I would like to wait 2 seconds and then continue without it.
The use case is a jQuery UI autocomplete field. The autocomplete source is an ajax request, but if the user types quickly and leaves the field before the list loads, the field remains unset. Using the 'change' event on the autocomplete, I check whether the user entered a valid option without selecting it, but this doesn't work if the source hasn't loaded by the time the change event fires. So I would like to put a delay in the change function that waits if the source (stored in the variable 'ajaxdata') is empty.
code:
input.autocomplete({
    source: function (request, response) {
        $.ajax({
            type: "GET",
            url: "/some/url",
            dataType: "json",
            success: function(data) {
                response($.map(data, function(item) {
                    return {
                        label: item.label,
                        value: item.value
                    }
                }));
                ajaxdata = data;
            }
        });
        // ajaxopts = ajaxsource(request,response,ajaxurl,xtraqry)
    },
    change: function(event, ui) {
        if (!ui.item) {
            // user didn't select an option, but what they typed may still match
            var enteredString = $(this).val();
            var stringMatch = false;
            if (ajaxdata.length == 0) {
                /// THIS IS WHERE I NEED A 2 SECOND DELAY
            }
            var opts = ajaxdata;
            for (var i = 0; i < opts.length; i++) {
                if (opts[i].label.toLowerCase() == enteredString.toLowerCase()) {
                    $(this).val(opts[i].label); // corrects any incorrect case
                    stringMatch = true;
                    break;
                }
            }
        }
    },
});
Edit:
To be more specific about the problem: this delay needs to be conditional, meaning that if the data is already loaded (either because it came from a static source or from an earlier ajax call), I do not want any delay.
If I'm understanding you properly, I think you just want to check and see if ajaxdata has been populated; and if it hasn't, wait only two more seconds and then just proceed without it.
Try this:
change: function(event, ui) {
    var that = this; // capture the input element; 'this' inside setTimeout would not point to it
    if (!ui.item) {
        // user didn't select an option, but what they typed may still match
        if (ajaxdata.length == 0) {
            /// THIS IS WHERE I NEED A 2 SECOND DELAY
            // pass in the captured element so that you can use it
            setTimeout(function() { correctCase(that); }, 2000);
        }
    }
}
. . . . .
function correctCase(inThis) {
    // I'm not sure what this variable does. Do you really need it???
    var stringMatch = false;
    var enteredString = $(inThis).val();
    // you still want to be sure that ajaxdata is not empty here
    if (ajaxdata.length > 0) {
        var opts = ajaxdata;
        for (var i = 0; i < opts.length; i++) {
            if (opts[i].label.toLowerCase() == enteredString.toLowerCase()) {
                $(inThis).val(opts[i].label); // corrects any incorrect case
                stringMatch = true; // this variable doesn't seem to do anything after this???
                break;
            }
        }
    }
}
I'm not really sure what it is you're trying to do, but I'm pretty sure something like this would be a better way of doing it:
input.autocomplete({
    source: function(request, response) {
        return $.ajax({
            type: "GET",
            url: "/some/url",
            dataType: "json"
        });
    },
    change: function(event, ui) {
        if (!ui.item) {
            // user didn't select an option, but what they typed may still match
            var enteredString = this.value;
            var stringMatch = false;
            // make sure the ajax is complete
            this.source().done(function(data) {
                var opts = $.map(data, function(item) {
                    return {
                        label: item.label,
                        value: item.value
                    }
                });
                for (var i = 0; i < opts.length; i++) {
                    if (opts[i].label.toLowerCase() == enteredString.toLowerCase()) {
                        $(this).val(opts[i].label); // corrects any incorrect case
                        stringMatch = true;
                    }
                }
            });
        }
    }
});
By default, JavaScript is asynchronous: whenever it encounters an async operation, it queues that work for later.
But if you want to pause JS (an ajax call or anything else), you can do it using promises.
Case 1: outputs hello (will not wait for setTimeout)
https://jsfiddle.net/shashankgpt270/h0vr53qy/
//async
function myFunction() {
    let result1 = 'hello';
    //promise = new Promise((resolve, reject) => {
    setTimeout(function() {
        //resolve("done");   // no promise here, so there is nothing to resolve
        result1 = "done1";
    }, 3000);
    //});
    //result = await promise
    alert(result1);
}
myFunction();
Case 2: outputs done1 (will wait for setTimeout)
https://jsfiddle.net/shashankgpt270/1o79fudt/
async function myFunction() {
    let result1 = 'hello';
    promise = new Promise((resolve, reject) => {
        setTimeout(function() {
            resolve("done");
            result1 = "done1";
        }, 3000);
    });
    result = await promise
    alert(result1);
}
myFunction();
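Tying that back to the original question, here is a hedged sketch of the same idea: a promise that resolves as soon as ajaxdata is populated, or after two seconds, whichever comes first. The helper name and the 100 ms polling interval are my own choices:

// Hypothetical helper: resolves with ajaxdata once it has items,
// or with whatever is there after maxWait milliseconds.
function waitForAjaxData(maxWait) {
    return new Promise(function(resolve) {
        var start = Date.now();
        (function poll() {
            if (ajaxdata.length > 0 || Date.now() - start >= maxWait) {
                resolve(ajaxdata);
            } else {
                setTimeout(poll, 100); // check again shortly
            }
        })();
    });
}

// Inside the autocomplete change handler:
// waitForAjaxData(2000).then(function(data) { /* run the case-correction loop on data */ });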

Node JS - get file via AJAX and then use the data

How do I do this asynchronously?
var getData, myFunc;

getData = function() {
    var data = "";
    $.get("http://somewhere.com/data.xml", function(d) {
        data = $("#selector", d).html();
    });
    return data; // does not work, because async callback not yet fired
};

myFunc = function() {
    var data = getData();
    // do something with data here
};
I am happy to completely re-factor to achieve what I want. I just don't know what design pattern achieves this.
Well, you can't. You can return a promise though:
var getData, myFunc;

getData = function () {
    var d = $.Deferred();
    $.get("http://somewhere.com/data.xml", function (data) {
        d.resolve($("#selector", data).html())
    });
    return d.promise();
};

getData().then(function (data) {
    alert(data);
});
demo http://jsfiddle.net/W75Kt/2/
