$.ajax in a for loop pulling from a php file - ajax

I have a key_gen.php file that contains a function to generate a random key. When executed, the php file gives back one key (tested and it works).
In my javascript file I have a click event on a button (that works), something like this:
$('#kg_btn').click(function(){});
Then, inside my click event I have a function:
var get_key = function(){
    for (var i = 0; i < kg_quantity; i++) {
        $.ajax({
            url: "../keygen/gen_rkey.php",
            type: "GET",
            datatype: "json",
            success: function(data) {
                var j_obj = JSON.parse(data);
                //alert("Success!");
                prs = j_obj.test;
                console.log(prs);
                //add_key();
            },
            error: function(error) {
                alert("Error! Can't send the data");
            }
        }); //ajax call ends
    }
}
When I run this function once (by setting the "kg_quantity" variable to 1), I get the correct behavior every time I click my button: a different key in the console.log per click.
If I set "kg_quantity" to any number other than 1 (for example 3, 9, or 10), I do get the console.log messages back, but the key generated is the same each time.
I was hoping that putting the ajax call inside a for loop would execute it several times. I tried putting the ajax call within a prototype function as well, but I get the same result.
Edit: I tried adding a closure (as Ross suggested), but I get the exact same result.
Thanks.

AJAX is asynchronous. Your loop will most likely finish before the first AJAX response arrives.
I would restructure this so the success handler of each AJAX call triggers the next iteration. Or, if you're lazy, you can just set async: false:
$.ajax({
    url: "../keygen/gen_rkey.php",
    type: "GET",
    async: false,
    ....
Or even better, put the strain on the server and reduce the back-and-forth so you get all your keys in one response:
url: "../keygen/gen_rkey.php?qty=" + kg_quantity,
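If the endpoint is extended that way (the qty parameter is hypothetical; gen_rkey.php would need to be changed to honor it), the single response can carry a JSON array, and the client loops over data instead of looping over requests. A minimal sketch of the client-side shape, with fakeFetchKeys standing in for the server round-trip:

```javascript
// Sketch only: assumes a modified endpoint that returns {"keys": [...]}.
// fakeFetchKeys stands in for the single $.ajax round-trip.
function fakeFetchKeys(qty) {
  var keys = [];
  for (var i = 0; i < qty; i++) {
    keys.push("key-" + i); // the real server would generate random keys
  }
  return JSON.stringify({ keys: keys });
}

// One response, one parse, N keys -- no loop of requests needed.
function handleResponse(responseText) {
  return JSON.parse(responseText).keys;
}
```

Each key can then be handed to add_key() (or logged) in a plain synchronous loop, which sidesteps the async problem entirely.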
UPDATE: Async design method:
function getKeys(max, cur) {
    cur = cur || 1;
    if (cur > max) {
        return;
    }
    $.ajax({
        ....
        success: function(data) {
            // do stuff....
            // recursive iteration
            getKeys(max, cur + 1);
        }
    });
}
getKeys(5);
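The same one-at-a-time ordering can also be expressed with promises instead of explicit recursion. This is a sketch under the assumption that each request can be wrapped in a promise; fakeGetKey is a stand-in for the $.ajax call:

```javascript
// Stand-in for the real request: resolves with a key after a short delay.
function fakeGetKey(i) {
  return new Promise(function (resolve) {
    setTimeout(function () { resolve("key-" + i); }, 5);
  });
}

// Builds a chain so request n+1 only starts after request n resolves.
function getKeysSequentially(max) {
  var keys = [];
  var chain = Promise.resolve();
  for (var i = 1; i <= max; i++) {
    (function (n) { // IIFE freezes the current loop value
      chain = chain.then(function () {
        return fakeGetKey(n).then(function (key) { keys.push(key); });
      });
    })(i);
  }
  return chain.then(function () { return keys; });
}
```
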

Related

Ajax wait on success before next iteration in .each loop

I have an ajax call inside a .each loop wrapped in a setInterval function.
This handles updating of many divs on a dashboard with just a few lines of code on the html page.
I am worried about server lag vs client side speed. What will happen if the server has not responded with the data before the loop moves on to the next iteration?
So, my question is, can the loop be paused until the success is executed?
Ajax call:
setInterval(function() {
    $(".ajax_update").each(function(){
        $.ajax({
            type: "POST",
            dataType: "json",
            url: "ajax/automated_update/confirmed_appointments.php",
            data: "clinic_id=<? echo $clinic_id ?>&tomorrow=<? echo $tomorrow ?>&"+$(this).data('stored'),
            success: function(data)
            {
                $(data[0]).html(data[1]);
            }
        });
    });
}, 5000); //5 seconds
I have looked into .ajaxComplete() but I don't see how to apply it as a solution.
I have also looked at turning the loop into something that calls itself, like:
function doLoop() {
    if (i >= options.length) {
        return;
    }
    $.ajax({
        success: function(data) {
            i++;
            doLoop();
        }
    });
}
But wouldn't that interfere with .each? I don't understand how that would play nicely with .each and looping based on my div class.
I just can't figure it out! Any help would be appreciated.
I was able to get .when working with the ajax call, but I don't understand how to make .when do what I need (stop the loop until the ajax call is done).
$(".ajax_update").each(function(){
    $.ajax({
        type: "POST",
        dataType: "json",
        url: "ajax/automated_update/confirmed_appointments.php",
        data: "clinic_id=<? echo $clinic_id ?>&tomorrow=<? echo $tomorrow ?>&"+$(this).data('stored'),
        success: function(data)
        {
            $(data[0]).html(data[1]);
        }
    });
    $.when( $.ajax() ).done(function() {
        alert("Finished it");
    });
});
After thinking about your question a bit, perhaps a good solution would be to put an event in place that triggers a new set of updates, with a minimum time between dashboard updates. This ensures that all your updates process, that we wait a minimum time between updates, and only then trigger the update cycle again. Thus if you DO encounter any delayed ajax responses, you do not try another until the previous one has fully completed.
I have not fully tested this code, but it should do what I describe:
//create a dashboard object to handle the update deferred
var dashboard = {
    update: function (myquery) {
        var dfr = $.Deferred();
        $.ajax({
            type: "POST",
            dataType: "json",
            url: "ajax/automated_update/confirmed_appointments.php",
            data: "clinic_id=<? echo $clinic_id ?>&tomorrow=<? echo $tomorrow ?>&" + myquery,
            success: dfr.resolve
        });
        return dfr.promise();
    }
};
//create a simple deferred wait timer
$.wait = function (time) {
    return $.Deferred(function (dfd) {
        setTimeout(dfd.resolve, time);
    });
};
//where I hang my dashboardupdate event on and then trigger it
var mydiv = $('#mydiv');
var minimumDashboardUpdate = 5000;
mydiv.on('dashboardupdate', function () {
    // use map instead of your .each to better manage the deferreds;
    // building the array inside the handler means each cycle issues fresh requests
    var mydeferred = $(".ajax_update").map(function (i, elem) {
        return dashboard.update($(this).data('stored')).then(function (data, textStatus, jqXHR) {
            $(data[0]).html(data[1]);
        });
    });
    $.when.apply($, mydeferred.get())
        .then(function () {
            $.when($.wait(minimumDashboardUpdate)).then(function () {
                mydiv.trigger('dashboardupdate');
            });
        });
});
mydiv.trigger('dashboardupdate');

Observable values disappearing after being pushed into observableArray

I'm grabbing data from the server and pushing them into an observable array.
I'm pushing observables into an observable array.
As I push the data into the observables, the observables contain the data.
However, as soon as I push the observables into the observableArray, a few of the observables are missing data.
self.mealFoods([]);
$.ajax({
    url: "/mealsurl/1",
    async: false,
    dataType: 'json',
    success: function(datad) {
        for (var lia = 0; lia < datad.length; lia++) {
            var cats_url = "/catsurl/" + datad[lia].category_id;
            var units_by_food_url = "/unitsurl/" + datad[lia].ndb_no;
            var foodThing = new NewFood();
            foodThing.foodId(parseInt(datad[lia].id)); //works
            foodThing.category(parseInt(datad[lia].category_id)); //works
            $.ajax({
                url: cats_url,
                dataType: 'json',
                success: function(dat) {
                    foodThing.category_foods(dat); //works
                }
            });
            foodThing.food(datad[lia].ndb_no); //works
            $.ajax({
                url: units_by_food_url,
                dataType: 'json',
                success: function(dat) {
                    foodThing.food.units(dat); //works
                }
            });
            foodThing.unit(parseInt(datad[lia].seq)); //works
            foodThing.number_of_unit(datad[lia].this_much); //works
            self.mealFoods.push(foodThing);
            // At this point, looking inside the mealFoods array: self.mealFoods()[0].food(), self.mealFoods()[0].unit(), self.mealFoods()[0].food.units(), self.mealFoods()[0].category_foods() ARE ALL EMPTY
        }
    }
});
You, sir, are having a classic case of async-brain-melt. It is a common symptom in beginners, but never fear, for the recovery rate is nearly 100%. :)
I would wager your experience is with synchronous languages, that is, languages where lines written earlier are always executed earlier.
A normal JavaScript function is synchronous. For example:
console.log(1);
console.log(2);
As expected, this prints 1 and then 2.
However, asynchronous code is not necessarily executed in the order it was declared. Consider this example using a setTimeout function, which schedules a function for later execution:
setTimeout(function(){ console.log(1); }, 1000);
console.log(2);
Now, the output will be 2 and 1, because 1 only ran 1000 millis after the setTimeout call.
So, I imagine you are beginning to understand how this applies to your problem.
Your calls to cats_url and units_by_food_url are asynchronous. Therefore, the following code does not wait for them to finish. So, when you access self.mealFoods()[0].food.units(), the success function has not yet grabbed the data!
What you need to do is to coordinate your asynchronous calls appropriately. There are many ways to achieve that. First, I'll teach you the most simple strategy, using only functions:
Grab the list from the server
When you have the list, iterate over each meal and start two ajax calls (up to here, you are already doing everything right)
Now comes the magic: when you have the results for either ajax call, you call an "itemComplete" function. This function will sync the two calls - it will only proceed if the two calls finished.
Finally, call a "listComplete" function each time any item is complete. This function must also check if all items are complete before proceeding.
So, it would look something like this:
$.ajax({
    url: "/meals/1",
    dataType: 'json',
    success: function(list) {
        var observableArray = ko.observableArray([]); // this will hold your list
        var length = list.length;
        var tries = 0;
        var listComplete = function () {
            tries++;
            if (tries == length) {
                // Hooray!
                // All your items are complete.
                console.log(observableArray());
            }
        };
        list.forEach(function(item){
            var propertyOneUrl = item.propertyOneUrl;
            var propertyTwoUrl = item.propertyTwoUrl;
            var propertyOneComplete = false;
            var propertyTwoComplete = false;
            var food = new Food(item.id);
            var itemComplete = function () {
                if (propertyOneComplete && propertyTwoComplete) {
                    // This item is complete.
                    observableArray.push(food);
                    // Let's warn list complete so it can count us in.
                    listComplete();
                }
            };
            // Start your ajax calls
            $.ajax({
                url: propertyOneUrl,
                dataType: 'json',
                success: function (propertyOne) {
                    food.propertyOne(propertyOne);
                    // Declare that your first property is ready
                    propertyOneComplete = true;
                    // We can't know which property finishes first, so we must call this in both
                    itemComplete();
                }
            });
            $.ajax({
                url: propertyTwoUrl,
                dataType: 'json',
                success: function (propertyTwo) {
                    food.propertyTwo(propertyTwo);
                    // Declare that your second property is ready
                    propertyTwoComplete = true;
                    // We can't know which property finishes first, so we must call this in both
                    itemComplete();
                }
            });
        }); //for each
    } // success
});
Now, you probably realize how tiresome that pattern can be. That's why there are other ways to better solve this problem. One of these is a pattern called "Promises". You can learn more about them in these links:
https://www.promisejs.org/
http://blog.gadr.me/promises-are-not-optional/
And you'll be happy to know that jQuery.ajax() returns a Promise! So, now you can try and solve that problem using Promises. You'll end up with a much cleaner code.
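As a sketch of that cleaner shape: Promise.all resolves once every promise it is given has resolved, which replaces both the itemComplete and listComplete counters above. fakeGet is a stand-in for the ajax call (jQuery's $.ajax return value is promise-like, so the same shape applies):

```javascript
// Stand-in for an ajax call: resolves with fake data after a short delay.
function fakeGet(url) {
  return new Promise(function (resolve) {
    setTimeout(function () { resolve("data-for-" + url); }, 5);
  });
}

// Both properties are fetched in parallel; the item is complete when both are.
function loadItem(item) {
  return Promise.all([
    fakeGet(item.propertyOneUrl),
    fakeGet(item.propertyTwoUrl)
  ]).then(function (results) {
    return { id: item.id, propertyOne: results[0], propertyTwo: results[1] };
  });
}

// The list is complete when every item is complete.
function loadList(list) {
  return Promise.all(list.map(loadItem));
}
```
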
Hope you make it!
It's because you are doing async ajax calls in a loop. Because the loop continues as soon as each ajax call is made, by the time the response comes back the variable foodThing no longer refers to the object it was set to before the call. Because a for loop is so quick, most likely only the last object created in the loop is updated.
If you have a look at this simple loop it has the same problem:
for (var i = 0; i < 10; i++){
    var a = new NewFood(i);
    $.ajax({
        url: "/catsurl/1",
        dataType: 'json',
        success: function(dat) {
            console.debug(a.id);
        }
    });
}
By the time the ajax call comes back a has changed and what ends up happening is only 9 gets written out 10 times: http://jsfiddle.net/r6rwbtb9/
To fix this we would use a closure which is essentially wrapping the ajax call in a function in which we self contain the item we want to do something with:
for (var i = 0; i < 10; i++){
    var a = new NewFood(i);
    (function (a) {
        $.ajax({
            url: "/catsurl/1",
            dataType: 'json',
            success: function(dat) {
                console.debug(a.id);
            }
        });
    })(a);
}
And then you can see that the numbers 0-9 are output to the console: http://jsfiddle.net/r6rwbtb9/1/. It's also interesting to note that you can't ensure each request will come back in the same order. That is why the numbers can come back in a different order than 0-9: some requests are quicker than others.
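The capture problem is reproducible without any ajax at all: any asynchronous callback inside a var-based loop sees the loop variable's final value. A self-contained sketch using setTimeout in place of the request:

```javascript
// Broken shape: every callback shares the same i, which is 3 by the time they run.
function collectWithVar(done) {
  var results = [];
  for (var i = 0; i < 3; i++) {
    setTimeout(function () {
      results.push(i); // i has already reached 3
      if (results.length === 3) done(results);
    }, 0);
  }
}

// Fixed shape: the IIFE freezes the current value of i for each callback.
function collectWithClosure(done) {
  var results = [];
  for (var i = 0; i < 3; i++) {
    (function (n) {
      setTimeout(function () {
        results.push(n); // n is the value i had on this iteration
        if (results.length === 3) done(results);
      }, 0);
    })(i);
  }
}
```

In ES6, declaring the loop variable with let gives each iteration its own binding, which makes the IIFE unnecessary.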
SO back to your code. In order to make sure you are updating the correct item for each callback you need to use a closure for each ajax call. There was also a problem with foodThing.food.units(dat) which needed to be foodThing.food().units(dat) as foodThing.food() is an observable.
So to wrap in closures we need to change the two ajax calls to this:
(function(category_foods){
    $.ajax({
        url: cats_url,
        dataType: 'json',
        success: function(dat) {
            category_foods(dat);
        }
    });
})(foodThing.category_foods);

(function(units){
    $.ajax({
        url: units_by_food_url,
        dataType: 'json',
        success: function(dat) {
            units(dat);
        }
    });
})(foodThing.food().units);

Chain multiple POST requests together using AJAX and jQuery

I received a suggestion on a prior question that I need to amend my code to chain a series of POST requests together, but I don't have any idea how to accomplish this. Specifically, the advice I was given was to:
fire off a post, have its success handler fire off the next post,
etc... and then when all the posts are done, the final post's success
handler fires off the get
This strategy makes sense to me but I do not know how to implement. I am trying to prevent the call to GET before all of the calls to POST have completed. Currently, I have implemented $.when.apply to delay the sending of GET. Here is the code for that:
function(){
    $.when.apply(undefined, InsertTheAPPs()).done(function () {
        $.ajax({
            url: sURL + "fileappeal/send_apps_email",
            success: function() {
                var m = $.msg("my message",
                    {header:'my header', live:10000});
                setTimeout(function(){
                    if(m)m.setBody('...my other message.');
                },3000);
                setTimeout(function(){
                    if(m)m.close(function(){
                        window.location.replace(sURL+'client/view');
                    });
                },6000);
                $('#ajaxShield').fadeOut(1000);
            },
            error: function(){
                $.msg("error message",
                    {header:'error header', live:10000});
            }
        });
    });
}
Here is the code for the jQuery $.each loop. This is the code that needs to not only begin, but must end before the ajax call to fileappeal/send_apps_email above:
function InsertTheAPPs(){
    $('input[name=c_maybe].c_box').each(function(){
        var jqxhrs = [];
        if($(this).prop('checked')){
            var rn = $(this).prop('value');
            jqxhrs.push(
                $.ajax({
                    url: sURL + 'fileappeal/insert_app',
                    type: "POST",
                    dataType: 'text',
                    data: {'rn': rn},
                    error: function(data) {console.log('Error:'+rn+'_'+data);}
                })
            );
            return jqxhrs;
        }
    });
}
Could someone demonstrate how I can modify the code above to implement the strategy of chaining together the multiple POST calls?
Don't return from .each. It doesn't work that way. Instead do this:
var jqxhrs = [];
$(...).each(...
});
return jqxhrs;
Nothing is assigned to the return value of .each, which you can't get anyway. Returning from each allows it to be used like break/continue, which doesn't make sense in your context.
Moreover, the var jqxhrs inside of the each loop causes a new variable to be declared in that context on each iteration of the loop.
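Putting those two fixes together, the corrected shape declares the array once, fills it inside the loop, and returns it after the loop finishes. A sketch with fakePost standing in for the $.ajax call (the real function would keep the .each over the checkboxes):

```javascript
// Stand-in for the POST request: resolves with a confirmation string.
function fakePost(rn) {
  return Promise.resolve("inserted-" + rn);
}

function insertTheApps(checkedValues) {
  var jqxhrs = []; // one array for the whole loop, not one per iteration
  checkedValues.forEach(function (rn) {
    jqxhrs.push(fakePost(rn)); // start the request, keep its promise
  });
  return jqxhrs; // returned once, after every request has been started
}
```

The caller can then wait on the whole array (with $.when.apply or Promise.all) before firing the final GET.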

Wait for Ajax call to finish

I have a situation where I'm doing a number of AJAX calls using jQuery and returning JSON data from those calls into some variables on my page.
The issue is that the AJAX call takes a little time to process, and in the meantime control shifts to the next statement, where I intend to use the output of the AJAX call.
Since the call takes time to return the data, I am left with an empty object that fails my function.
Is there any way I can wait for the AJAX call to finish and proceed only when the result is returned from the call?
So this is my code, where I am trying to return transactionsAtError to some other jQuery file; control shifts to the next statement before this call gets executed:
this.GetTransactionAtErrors = function (callback) {
    var transactionsAtError;
    $.ajax({
        url: ('/Management/GetTransactionsAtError'),
        type: 'POST',
        cache: false,
        success: function (result) {
            if (result && callback) {
                transactionsAtError = (typeof (result) == "object") ? result : $.parseJSON(result);
            }
        }
    });
    return transactionsAtError;
}
Assuming you are using jQuery's $.getJSON() function, you can provide a callback function which will be executed once the data is returned from the server.
example:
$.getJSON("http://example.com/get_json/url", function(data){
    console.log("the json data is:", data);
});
EDIT:
After seeing the code you added, I can see what your problem is.
Your return transactionsAtError; line runs independently of the ajax call, i.e. it will run before the ajax is complete.
You should just call your callback inside your success function.
example:
this.GetTransactionAtErrors = function (callback) {
    $.ajax({
        url: ('/Management/GetTransactionsAtError'),
        type: 'POST',
        cache: false,
        success: function (result) {
            if (result && callback) {
                var transactionsAtError = (typeof (result) == "object") ? result : $.parseJSON(result);
                callback(transactionsAtError);
            }
        }
    });
}
When you have your result in scope, you can wait for ongoing ajax calls to finish by using an ES6 Promise:
function ajaxwait() {
    return new Promise(function(resolve, reject) {
        var i = setInterval(function() {
            if (jQuery.active == 0) {
                resolve();
                clearInterval(i);
            }
        }, 100);
    });
}
You can use this like.
ajaxwait().then(function(){ /* Code gets executed if there are no more ajax calls in progress */ });
Use an ES6 Promise shim such as https://github.com/jakearchibald/es6-promise to make it work in older browsers.
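The polling idea generalizes: resolve a promise once some pending counter (jQuery.active in the answer above) drops to zero. A self-contained sketch with the counter injected as a function, so it can run anywhere:

```javascript
// Polls getPendingCount every intervalMs and resolves once it returns 0.
function waitUntilIdle(getPendingCount, intervalMs) {
  return new Promise(function (resolve) {
    var timer = setInterval(function () {
      if (getPendingCount() === 0) {
        clearInterval(timer); // stop polling before resolving
        resolve();
      }
    }, intervalMs);
  });
}
```

In the jQuery case the counter function would just be function () { return jQuery.active; }.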

jQuery.ajax() sequential calls

Hey. I need some help with jQuery AJAX calls. In JavaScript, I have to generate AJAX calls to the controller, which retrieves a value from the model. I then check the value that is returned and make further AJAX calls if necessary; say, once the value reaches a particular threshold, I can stop the AJAX calls.
This requires AJAX calls that are processed one after the other. I tried using async: false, but it freezes up the browser and any jQuery changes I make at the frontend are not reflected. Is there any way around this?
Thanks in advance.
You should make the next ajax call after the first one has finished like this for example:
function getResult(value) {
    $.ajax({
        url: 'server/url',
        data: { value: value },
        success: function(data) {
            getResult(data.newValue);
        }
    });
}
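The stop-at-a-threshold control flow from the question can be sketched the same way; fakeGetResult is a stand-in for the $.ajax call (here it just pretends the server increments the value):

```javascript
// Stand-in for the server round-trip: returns a promise of the next value.
function fakeGetResult(value) {
  return Promise.resolve(value + 1);
}

// Issues one request at a time; stops as soon as the threshold is reached.
function pollUntilThreshold(value, threshold, done) {
  fakeGetResult(value).then(function (newValue) {
    if (newValue >= threshold) {
      done(newValue); // threshold reached: no further requests
    } else {
      pollUntilThreshold(newValue, threshold, done); // next sequential call
    }
  });
}
```
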
I used an array of steps and a callback function to continue executing where the async started. Works perfectly for me.
var tasks = [];
for (var i = 0; i < 20; i++){
    tasks.push(i); //can be replaced with a list of steps, urls and so on
}
var current = 0;

function doAjax(callback) {
    //check to make sure there are more requests to make
    if (current < tasks.length) {
        var uploadURL = "http://localhost/someSequentialToDo";
        var myData = tasks[current];
        current++;
        //make the AJAX request with the given data
        $.ajax({
            type: 'GET',
            url: uploadURL,
            data: {index: myData},
            dataType: 'json',
            success: function (serverResponse) {
                doAjax(callback);
            }
        });
    }
    else {
        callback();
        console.log("this is end");
    }
}

function sth(){
    var datum = Date();
    doAjax(function(){
        console.log(datum); //displays time when ajax started
        console.log(Date()); //when ajax finished
    });
}
console.log("start");
sth();
In the success callback function, just make another $.ajax request if necessary. (Setting async: false causes the browser to run the request on the same thread as everything else; that's why it freezes up.)
Use a callback function, there are two: success and error.
From the jQuery ajax page:
$.ajax({
    url: "test.html",
    context: document.body,
    success: function(){
        // Do processing, call function for next ajax
    }
});
A (very) simplified example:
function doAjax() {
    // get url and parameters
    var myurl = /* somethingsomething */;
    $.ajax({
        url: myurl,
        context: document.body,
        success: function(data){
            if (data < threshold) {
                doAjax();
            }
        }
    });
}
Try using $.when() (available since jQuery 1.5). You can have a single callback that triggers once all calls are made; it's cleaner and much more elegant. It ends up looking something like this:
$.when($.ajax("/page1.php"), $.ajax("/page2.php")).done(function(a1, a2){
    // a1 and a2 are arguments resolved for the page1 and page2 ajax requests, respectively
    var jqXHR = a1[2]; /* arguments are [ "success", statusText, jqXHR ] */
    alert(jqXHR.responseText);
});
