Conditionally pause JavaScript to wait for ajax

The variable ajaxdata is set inside the ajax success callback. If that hasn't happened yet, I would like to wait 2 seconds and then continue without it.
The use case is a jQuery UI autocomplete field. The autocomplete source is an ajax request, but if the user types quickly and leaves the field before the list loads, the field remains unset. Using the autocomplete's 'change' event I check whether the user entered a valid option without selecting it, but this doesn't work if the source hasn't loaded by the time the change event fires. So I would like to put a delay in the change handler that waits if the source (stored in the variable 'ajaxdata') is still empty.
code:
input.autocomplete({
    source: function (request, response) {
        $.ajax({
            type: "GET",
            url: "/some/url",
            dataType: "json",
            success: function (data) {
                response($.map(data, function (item) {
                    return {
                        label: item.label,
                        value: item.value
                    };
                }));
                ajaxdata = data;
            }
        });
        // ajaxopts = ajaxsource(request,response,ajaxurl,xtraqry)
    },
    change: function (event, ui) {
        if (!ui.item) {
            // user didn't select an option, but what they typed may still match
            var enteredString = $(this).val();
            var stringMatch = false;
            if (ajaxdata.length == 0) {
                /// THIS IS WHERE I NEED A 2 SECOND DELAY
            }
            var opts = ajaxdata;
            for (var i = 0; i < opts.length; i++) {
                if (opts[i].label.toLowerCase() == enteredString.toLowerCase()) {
                    $(this).val(opts[i].label); // corrects any incorrect case
                    stringMatch = true;
                    break;
                }
            }
        }
    }
});
Edit:
To be more specific about the problem: this delay needs to be conditional, meaning that if the data is already loaded (either because it came from a static source or from an earlier ajax call) I do not want any delay.

If I'm understanding you properly, I think you just want to check and see if ajaxdata has been populated; but if it hasn't, only wait two more seconds and then just proceed without it.
Try this:
change: function(event, ui) {
    if (!ui.item) {
        // user didn't select an option, but what they typed may still match
        var self = this; // capture the input; `this` is not the input inside setTimeout
        if (ajaxdata.length == 0) {
            /// THIS IS WHERE YOU NEED A 2 SECOND DELAY
            setTimeout(function() { correctCase(self); }, 2000);
        } else {
            correctCase(self); // data already loaded, no delay needed
        }
    }
}
. . . . .
function correctCase(inThis) {
    // stringMatch isn't used after the loop in the original code; kept for parity
    var stringMatch = false;
    var enteredString = $(inThis).val();
    // you still want to be sure that ajaxdata is not empty here
    if (ajaxdata.length != 0) {
        var opts = ajaxdata;
        for (var i = 0; i < opts.length; i++) {
            if (opts[i].label.toLowerCase() == enteredString.toLowerCase()) {
                $(inThis).val(opts[i].label); // corrects any incorrect case
                stringMatch = true;
                break;
            }
        }
    }
}

I'm not really sure what it is you're trying to do, but I'm pretty sure something like this would be a better way of doing it:
input.autocomplete({
    source: function(request, response) {
        // return the jqXHR so it can be waited on elsewhere
        return $.ajax({
            type: "GET",
            url: "/some/url",
            dataType: "json"
        });
    },
    change: function(event, ui) {
        if (!ui.item) {
            // user didn't select an option, but what they typed may still match
            var input = this; // `this` is not the input inside the done() callback
            var enteredString = this.value;
            var stringMatch = false;
            // make sure the ajax call is complete before checking for a match;
            // the source option is fetched through the widget's option getter
            $(input).autocomplete("option", "source")().done(function(data) {
                var opts = $.map(data, function(item) {
                    return {
                        label: item.label,
                        value: item.value
                    };
                });
                for (var i = 0; i < opts.length; i++) {
                    if (opts[i].label.toLowerCase() == enteredString.toLowerCase()) {
                        $(input).val(opts[i].label); // corrects any incorrect case
                        stringMatch = true;
                    }
                }
            });
        }
    }
});

JavaScript does not block on asynchronous work: when it encounters an async operation, it queues the callback and keeps going. If you want to pause (for an ajax call or anything else), you can do it with promises and await.
Case 1: output is 'hello' (does not wait for the setTimeout)
https://jsfiddle.net/shashankgpt270/h0vr53qy/
//async
function myFunction() {
    let result1 = 'hello';
    //promise = new Promise((resolve, reject) => {
    setTimeout(function() {
        //resolve("done");
        result1 = "done1";
    }, 3000);
    //});
    //result = await promise
    alert(result1); // alerts "hello" because nothing waits for the timeout
}
myFunction();
Case 2: output is 'done1' (waits for the setTimeout)
https://jsfiddle.net/shashankgpt270/1o79fudt/
async function myFunction() {
    let result1 = 'hello';
    let promise = new Promise((resolve, reject) => {
        setTimeout(function() {
            resolve("done");
            result1 = "done1";
        }, 3000);
    });
    let result = await promise; // execution pauses here until the promise resolves
    alert(result1); // alerts "done1"
}
myFunction();
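Applied back to the autocomplete question above, the same promise idea gives a conditional delay: only wait when ajaxdata is still empty, and never longer than the fallback. A rough sketch (assuming ajaxdata is the shared variable from the question; the helper name is chosen here for illustration):
// Sketch only: resolve immediately if the data is already there,
// otherwise resolve after maxMs and let the caller proceed without it.
function waitForAjaxData(maxMs) {
    return new Promise(function (resolve) {
        if (ajaxdata.length !== 0) {
            resolve();                  // already loaded: no delay at all
        } else {
            setTimeout(resolve, maxMs); // not loaded yet: wait, then continue anyway
        }
    });
}

// Hypothetical usage inside the autocomplete change handler:
// waitForAjaxData(2000).then(function () { /* run the case-correcting loop here */ });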

Related

admin-ajax.php does not recognize 'action'. $_REQUEST is empty

After two days of fruitless research, I decided to join the community. I hope to get a solution. I am developing a plugin that, among other things, must implement document uploads via ajax. The problem is that the request completes, but admin-ajax.php reacts as if no action was passed. Outside of WordPress this piece of code works fine, as intended. The problems appear when this code is installed in WordPress. Below is my code.
PHP. This code is in the main class, which is called from the plugin's main module:
class main {
    //other activation methods
    private function register_scripts() {
        add_action('wp_enqueue_scripts', array($this, 're_add_script'));
    }
    public function re_add_script() {
        wp_enqueue_script('re_upload', plugins_url('re' . '/js/re_upload.js'), array('jquery'));
        wp_localize_script('re_upload', "re_ajax", array(
            'ajaxurl' => admin_url("admin-ajax.php")));
        add_action('wp_ajax_upload', 'processingUpload');
    }
} //end of class

//callback function
function processingUpload() {
    $clsUpload = new UploadsDocs();
    $clsUpload->setRequestedData($_FILES, $_POST['doc_id']);
    $clsUpload->checkUploadsFiles();
    $clsUpload->outputFilesList();
    wp_die();
}
jQuery 're_upload.js'
jQuery(document).ready(function (e) {
    jQuery('#bt_upload').on('click', function () {
        var toUpload = getFileListToUpload();
        var form_data = new FormData();
        var ins = input.files.length;
        for (var x = 0; x < ins; x++) {
            if (isFileToUpload(input.files[x], toUpload)) {
                form_data.append("files[]", input.files[x]);
            }
        }
        form_data.append("doc_id", jQuery('#doc_id')[0].value);
        var data_to_sent = {
            action: 'upload',
            datas: form_data
        };
        jQuery.ajax({
            url: re_ajax.ajaxurl, // point to server-side PHP script
            dataType: 'text',     // what to expect back from the PHP script
            cache: false,
            contentType: false,
            processData: false,
            data: data_to_sent,
            type: 'post',
            success: function (response) {
                // do something
            },
            error: function (response) {
                // do something
            },
            xhr: function () {
                //upload progress
                var xhr = jQuery.ajaxSettings.xhr();
                if (xhr.upload) {
                    xhr.upload.addEventListener('progress', function (event) {
                        var percent = 0;
                        var position = event.loaded || event.position;
                        var total = event.total;
                        if (event.lengthComputable) {
                            percent = Math.ceil(position / total * 100);
                        }
                        //update progressbar
                        jQuery('#bt_upload').css("display", "none");
                        jQuery('#progress-wrp').css("display", "block");
                        jQuery('#progress-wrp' + " .progress-bar").css("width", percent + "%");
                        (percent < 50) ? jQuery('#progress-status').addClass('status-less-then-50') : jQuery('.status-less-then-50').removeClass('status-less-then-50').addClass('status-more-then-50');
                        jQuery('#progress-status').text("Uploading..." + percent + "%");
                    }, true);
                }
                return xhr;
            },
            mimeType: "multipart/form-data"
        });
    });
});

function getFileListToUpload() {
    var list = [];
    var elem = document.getElementsByClassName('preview');
    var tag_li = elem[0].querySelectorAll('p');
    for (var i = 0; i < tag_li.length; i++) {
        list[i] = tag_li[i].textContent.split('(')[0];
    }
    return list;
}

function isFileToUpload(input_file, files_toUpload) {
    var res = false;
    for (var i = 0; i < files_toUpload.length; i++) {
        if (input_file.name == files_toUpload[i]) {
            res = true;
            break;
        }
    }
    return res;
}
The problem is
add_action( 'wp_ajax_upload', 'processingUpload');
is not called.
The upload is done in two separate invocations of the server. The first invocation displays the upload page to the user. The second invocation processes the AJAX request. Your call to
add_action( 'wp_ajax_upload', 'processingUpload');
is done in the first invocation, where it is not needed, but not in the second invocation, where it is needed.
Please read https://codex.wordpress.org/AJAX_in_Plugins (observe carefully how the call to add_action('wp_ajax_...', ...) is done). Further, you need to read about nonces.
Try appending the action to your ajax url, like:
url: re_ajax.ajaxurl + '?action=upload'
and
data: form_data
or pass it in form_data instead, like:
form_data.append('action', 'upload')
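Putting both suggestions together with the original request options, a rough sketch of the corrected call could look like this (sketch only; it assumes the rest of the click handler stays as in the question):
// The FormData object itself is passed as `data`, with the action appended to it,
// so admin-ajax.php can see $_POST['action'] (or $_GET['action'] via the url) and route the call.
form_data.append('action', 'upload');

jQuery.ajax({
    url: re_ajax.ajaxurl,      // or: re_ajax.ajaxurl + '?action=upload'
    type: 'post',
    data: form_data,           // not wrapped inside another object
    cache: false,
    contentType: false,        // let the browser set the multipart boundary
    processData: false,        // do not serialize the FormData
    success: function (response) {
        // handle the output of processingUpload()
    }
});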

AngularJS: Using $q to fire ajax calls synchronously

Is it possible to use $q to fire ajax requests synchronously in AngularJS?
I have a long list of vehicles; each vehicle has events associated with it, and I need to retrieve the event details of each event when the user expands the listing.
Right now, if the user expands the listing, I fire up to 15 calls asynchronously, and it seems to be causing issues with the API I'm consuming, so I'd like to see if performance improves when I wait for each request to finish before firing the next.
I'm attempting to implement $q to delay the next request until the previous is finished, however I can't seem to wrap my head around using the service, here is what I currently have:
// On click on the event detail expander
$scope.grabEventDetails = function(dataReady, index) {
    if (dataReady == false) {
        retrieveEventDetails($scope.vehicles[index].events);
    }
}

var retrieveEventDetails = function(events) {
    // events is an array
    var deferred = $q.defer();
    var promise = deferred.promise;
    var retrieveData = function(data) {
        return $http({
            url: '/api/eventdetails',
            method: 'POST',
            data: {
                event_number: data.number
            },
            isArray: true
        });
    }
    _.each(events, function(single_event) {
        promise.then(retrieveData(single_event).success(function(data) {
            console.log(data);
        }));
    });
}
This is still firing asynchronously. Where am I going wrong with this?
I understand firing the requests synchronously isn't the best idea, at the moment I just want to see if performance is improved with the API at all.
You don't need $q to create a promise here, as $http already returns one.
_.each fires all the callbacks immediately, without waiting for any promise.
All you do is call retrieveData for every event once your promise resolves, and since you never resolve it, it shouldn't even be working.
You could use a recursive call like this:
var retrieveEventDetails = function(events) {
    if (!events.length) return; // stop once every event has been processed
    var evt = events.shift();
    $http({
        url: '/api/eventdetails',
        method: 'POST',
        data: {
            event_number: evt.number
        },
        isArray: true
    }).then(function(response) {
        console.log(response.data);
        retrieveEventDetails(events);
    });
}
I do think you should use $q as some other part of your application might need to get a promise.
A good example would be $routeProvider resolve option.
I made a little demo in plunker.
Solution:
The retrieveData function should return a function (which returns a promise) instead of just a promise.
That way we can create a promise chain: promise.then(fn).then(fn).then(fn).then(null, errorFn)
We must resolve the first promise to kick off the chain.
var retrieveEventDetails = function(events) {
    // events is an array
    var deferred = $q.defer();
    var promise = deferred.promise;
    var retrieveData = function(data) {
        return function() {
            return $http({
                url: '/api/eventdetails',
                method: 'POST',
                data: {
                    event_number: data.number
                },
                isArray: true
            });
        };
    };
    deferred.resolve();
    return events.reduce(function(promise, single_event) {
        return promise.then(retrieveData(single_event));
    }, promise);
}
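Because the reduce returns the last promise in the chain, the caller can also react once all of the sequential requests have finished. A small usage sketch (reusing the names from the question):
// Hypothetical usage: the requests run one after another, and the returned
// promise resolves once the last one in the chain has finished.
retrieveEventDetails($scope.vehicles[index].events).then(function () {
    console.log('all event details retrieved sequentially');
});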
I'm not sure you even need $q here. In this example, each piece of data is registered in the controller as soon as it comes back from the call.
Live demo (click).
var app = angular.module('myApp', []);

app.controller('myCtrl', function($scope, myService) {
    $scope.datas = myService.get();
});

app.factory('myService', function($http) {
    var myService = {
        get: function() {
            var datas = {};
            var i = 0;
            var length = 4;
            makeCall(i, length, datas);
            return datas;
        }
    };
    function makeCall(i, length, datas) {
        if (i < length) {
            $http.get('test.text').then(function(resp) {
                datas[i] = resp.data + i;
                ++i;
                makeCall(i, length, datas);
            });
        }
    }
    return myService;
});
Here's a way using $q.all() that you can wait for all of the data to come through before passing it to the controller: Live demo (click).
var app = angular.module('myApp', []);

app.controller('myCtrl', function($scope, myService) {
    myService.get().then(function(datas) {
        $scope.datas = datas;
    });
});

app.factory('myService', function($q, $http) {
    var myService = {
        get: function() {
            var deferred = $q.defer();
            var defs = [];
            var promises = [];
            var i = 0;
            var length = 4;
            for (var j = 0; j < length; ++j) {
                defs[j] = $q.defer();
                promises[j] = defs[j].promise;
            }
            makeCall(i, length, defs);
            $q.all(promises).then(function(datas) {
                deferred.resolve(datas);
            });
            return deferred.promise;
        }
    };
    function makeCall(i, length, defs) {
        if (i < length) {
            $http.get('test.text').then(function(resp) {
                defs[i].resolve(resp.data + i);
                ++i;
                makeCall(i, length, defs);
            });
        }
    }
    return myService;
});

jQuery.ajax() inside a loop [duplicate]

This question already has answers here:
JavaScript closure inside loops – simple practical example
(44 answers)
Closed 6 years ago.
If I call jQuery.ajax() inside a loop, does the call in the current iteration overwrite the previous one, or is a new XHR object assigned to each request?
I have a loop that does this; in the console log I can see every request completes with 200 OK, but only the result data of the last request in the loop ends up stored by the success callback.
the code:
var Ajax = {
    pages: {},
    current_request: null,
    prefetch: function () {
        currentPath = location.pathname.substr(1);
        if (this.pages[currentPath]) {
            var current = this.pages[currentPath];
            delete this.pages[currentPath];
            current['name'] = currentPath;
            current['title'] = $("title").text().replace(' - ' + SITE_NAME, '');
            current['meta_description'] = $("meta[name=description]").attr('content');
            current['meta_keywords'] = $("meta[name=keywords]").attr('content');
        }
        var _Ajax = this;
        //the loop in question *****
        for (var key in this.pages) {
            $.ajax({
                method: 'get',
                url: 'http://' + location.hostname + '/' + key,
                success: function (data) {
                    _Ajax.pages[key] = data;
                }
            });
            console.debug(this.pages);
        }
        if (current) {
            this.pages[currentPath] = current;
        }
    }
}; //Ajax Obj

for (var i in pages) {
    Ajax.pages[pages[i]] = {};
}

$(function () {
    Ajax.prefetch();
}); //doc ready
You'll need a closure for key:
for (var k in this.pages) {
    (function(key) {
        $.ajax({
            method: 'get',
            url: 'http://' + location.hostname + '/' + key,
            success: function(data) {
                _Ajax.pages[key] = data;
            }
        });
        console.debug(_Ajax.pages); // `this` no longer refers to the Ajax object inside the IIFE
    })(k);
}
That way you make sure that key is always the correct one in each ajax success callback.
Other than that, it should work.
I made a small closure demonstration using a timeout instead of ajax, but the principle is the same:
http://jsfiddle.net/KS6q5/
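For reference, the idea in that demonstration is roughly this (a sketch using setTimeout in place of the ajax call, not the fiddle verbatim):
// Without the wrapping function every callback would log the final value of i;
// the immediately-invoked function captures each value as `index`.
for (var i = 0; i < 3; i++) {
    (function (index) {
        setTimeout(function () {
            console.log('callback for iteration ' + index); // logs 0, 1, 2
        }, 1000);
    })(i);
}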
You need to use async: false in your ajax request. It sends the ajax requests synchronously, each one waiting for the previous request to finish before the next is sent.
$.ajax({
    type: 'POST',
    url: 'http://stackoverflow.com',
    data: data,
    async: false,
    success: function(data) {
        //do something
    },
    error: function(jqXHR) {
        //do something
    }
});
I believe what's happening here has to do with closure. In this loop:
for (var key in this.pages) {
    $.ajax({
        method: 'get',
        url: 'http://' + location.hostname + '/' + key,
        success: function(data) {
            _Ajax.pages[key] = data;
        }
    });
    console.debug(this.pages);
}
The variable key is function-scoped rather than scoped to each iteration of the loop, so by the time the callbacks run, its value has probably changed. Try something like this instead:
http://jsfiddle.net/VHWvs/
var pages = ["a", "b", "c"];
for (var key in pages) {
    console.log('before: ' + key);
    (function (thisKey) {
        setTimeout(function () {
            console.log('after: ' + thisKey);
        }, 1000);
    })(key);
}
I was facing the same situation; I solved it by putting the ajax call inside a separate function and then invoking that function from the loop.
It looks like this:
function a() {
    for (var key in this.pages) {
        var paramsOut = {}; // collects the responses, keyed by page
        myAjaxCall(key, paramsOut);
        // .......
    }
}

function myAjaxCall(paramsIn, paramsOut) {
    $.ajax({
        method: 'get',
        url: 'http://' + location.hostname + '/' + paramsIn,
        success: function(data) {
            paramsOut[paramsIn] = data;
        }
    });
}
This is how I always do an ajax loop:
I use a recursive function that gets called once xhr.readyState == 4.
var i = 0;
process();

function process() {
    if (i < 10) {
        var url = "http://some.." + i;
        var xhr = new XMLHttpRequest();
        xhr.open("GET", url, true);
        xhr.onreadystatechange = function () {
            if (xhr.readyState == 4) {
                alert(xhr.responseText);
                i++;
                process(); // fire the next request only after this one has finished
            }
        };
        xhr.send();
    } else {
        alert("done");
    }
}

Bootstrap typeahead suggestions replaced when navigation

I'm using Bootstrap Typeahead to suggest some search results. The results are returned from an ajax resource, and since this resource introduces a delay, I'm experiencing an unfortunate effect.
Example:
When typing a 4-letter word, the suggestions appear after 2 letters. I can then go through the results with the up/down keys, but suddenly the suggestions reload because the last request has finished.
Is there any way to "cancel" any remaining requests if the user is currently using the up/down keys to go through the suggestions?
$('#query').typeahead({
    items: 4,
    source: function (query, process) {
        map = {};
        $.getJSON('/app_dev.php/ajax/autosuggest/' + query, function (data) {
            vehicles = [];
            $.each(data, function (i, vehicle) {
                map[vehicle.full] = vehicle;
                vehicles.push(vehicle.full);
            });
            process(vehicles);
        });
    },
    updater: function (item) {
        // do something here when item is selected
    },
    highlighter: function (item) {
        return item;
    },
    matcher: function (item) {
        return true;
    }
});
I think the following will satisfy your needs (it's hard to reproduce exactly):
There is no easy way to abort a delayed response, but you could extend typeahead as I figured out here (without modifying bootstrap.js).
The concept is to catch keydown, detect whether the event is KEY_UP or KEY_DOWN, set a flag is_browsing, and then abort process if is_browsing is true (that is, if the user has hit KEY_UP or KEY_DOWN and no other keys afterwards).
Extending typeahead:
// save the original function object
var _superTypeahead = $.fn.typeahead;

// add is_browsing as a new flag
$.extend(_superTypeahead.defaults, {
    is_browsing: false
});

// create a new constructor
var Typeahead = function(element, options) {
    _superTypeahead.Constructor.apply(this, arguments)
}

// extend prototype and add a _super function
Typeahead.prototype = $.extend({}, _superTypeahead.Constructor.prototype, {
    constructor: Typeahead

    , _super: function() {
        var args = $.makeArray(arguments)
        // call bootstrap core
        _superTypeahead.Constructor.prototype[args.shift()].apply(this, args)
    }

    //override typeahead original keydown
    , keydown: function(e) {
        this._super('keydown', e)
        this.options.is_browsing = ($.inArray(e.keyCode, [40, 38]) > -1)
    }

    //override process, abort if user is browsing
    , process: function(items) {
        if (this.options.is_browsing) return
        this._super('process', items)
    }
});

// override the old initialization with the new constructor
$.fn.typeahead = $.extend(function(option) {
    var args = $.makeArray(arguments),
        option = args.shift()

    // this is executed every time element.typeahead() is called
    return this.each(function() {
        var $this = $(this)
        var data = $this.data('typeahead'),
            options = $.extend({}, _superTypeahead.defaults, $this.data(), typeof option == 'object' && option)
        if (!data) {
            $this.data('typeahead', (data = new Typeahead(this, options)))
        }
        if (typeof option == 'string') {
            data[option].apply(data, args)
        }
    });
}, $.fn.typeahead);
This typeahead extension can be placed anywhere, e.g. in a <script type="text/javascript"> section.
Testing the extension:
<input type="text" id="test" name="test" placeholder="type some text" data-provide="typeahead">

<script type="text/javascript">
$(document).ready(function() {
    var url = 'typeahead.php';
    $("#test").typeahead({
        items: 10,
        source: function (query, process) {
            return $.get(url, { query: query }, function (data) {
                return process(data.options);
            });
        }
    });
});
</script>
A "serverside" PHP script that returns a lot of randomized options with forced delay, typeahead.php :
<?php
header('Content-type: application/json');
$JSON = '';
sleep(3); //delay execution for 3 seconds
for ($count = 0; $count < 30000; $count++) {
    if ($JSON != '') $JSON .= ',';
    //create random strings
    $s = str_shuffle("abcdefghijklmnopq");
    $JSON .= '"' . $s . '"';
}
$JSON = '{ "options": [' . $JSON . '] }';
echo $JSON;
?>
It really seems to work for me, but I cannot be sure that it will work in your case. Let me know whether you have success or not.

backbone: issue an ajax call before resetting a collection

Right now I have a collection that fetches values, and after that every view attached to the reset event gets rendered again.
The problem is that I also have to issue another query to fetch the total number of records retrieved, and the reset event should be triggered only after that ajax call has completed.
It is clearer with a bit of code:
fetch: function() {
    options = { data: this.getParams() };
    this.fetch_total();
    return Backbone.Collection.prototype.fetch.call(this, options);
},

fetch_total: function() {
    var that = this;
    var options = {
        url: this.url + '/count',
        data: this.getParams(),
        contentType: 'application/json',
        success: function(resp, status, xhr) {
            that.total = parseInt(resp);
            return true;
        }
    };
    return $.ajax(options);
}
As you can see, I have to issue a GET to localhost/myentity/count to get the count of entities.
The thing is, I need the collection.total variable to be updated before refreshing the views; that means I need both requests, the GET to localhost/myentity and the GET to localhost/myentity/count, to be completed before refreshing all the views.
Any idea how I can achieve that?
If your $ of choice is jQuery>1.5, you could take advantage of the deferred object to manually trigger a reset event when both calls have completed. Similar to your answer, but a bit more readable and without chaining the calls:
fetch: function() {
    options = { silent: true, data: this.getParams() };
    var _this = this;
    var dfd_total = this.fetch_total();
    var dfd_fetch = Backbone.Collection.prototype.fetch.call(this, options);
    return $.when(dfd_total, dfd_fetch).then(function() {
        _this.trigger('reset', _this);
    });
},

fetch_total: function() {
    // what you have in your question
}
And a Fiddle simulating these calls http://jsfiddle.net/rkzLn/
Of course, returning the results and the total in one fetch may be more efficient, but I guess that's not an option.
I think #nikoshr's answer is a good one so that you don't have to modify your API. If you think that you want to lessen your calls to the server, then consider returning an object from that endpoint that has paging information.
{
    count: 1243,
    page: 3,
    per_page: 10,
    results: [
        ...
    ]
}
and then overriding the collection's parse functionality
parse: function(res) {
    this.count = res.count;
    this.page = res.page;
    this.per_page = res.per_page;
    // return the array the collection's models are built from
    return res.results;
}
RESOURCES
http://backbonejs.org/#Collection-parse
I think I found a way to do it. What I did was to silently fire the fetch call, without triggering the 'reset' event.
From its callback, I issue the fetch of the total (the GET to localhost/myentity/count),
and from the total's callback, I finally trigger the reset event.
In code it is something like this:
fetch: function() {
    var that = this;
    options = {
        // will manually trigger reset event after fetching the total
        silent: true,
        data: this.getParams(),
        success: function(collection, resp) {
            that.fetch_total();
        }
    };
    return Backbone.Collection.prototype.fetch.call(this, options);
},

fetch_total: function() {
    var that = this;
    var options = {
        url: this.url + '/count',
        data: this.getParams(),
        contentType: 'application/json',
        success: function(resp, status, xhr) {
            that.total = parseInt(resp);
            // manually trigger reset after fetching total
            that.trigger('reset', that);
            return true;
        }
    };
    return $.ajax(options);
}
This is my first attempt, I wonder if there's an easier way
