I am currently implementing a sort of HTTP push using long polling, for browsers that don't support multipart AJAX responses.
I have to admit that while the server side is working fine, I am relatively new to front-end JavaScript development and thus may have made some obvious mistakes.
The problem is as follows: long polling works perfectly on IE 6, 7, 8 and Firefox (even though Firefox uses multipart, I tested it with long polling too), but Safari and Chrome enter the browser's "busy" state during the AJAX requests (they show the Windows wait cursor, and Safari also shows its "Loading" indicator in the title bar).
This is of course not desirable.
Here is my code to do the long poll, based on jQuery 1.4.1:
function MepSubscribeToQueueLongPoll(name, callback) {
    var queueUrl = MepGetQueueUrl(name, "LongPoll");
    MepLongPollStep(queueUrl, callback);
}

function MepLongPollStep(url, callback) {
    $.ajax({
        url: url,
        async: true,
        cache: false,
        success: function (data, status, request) {
            callback(request.responseText);
            MepLongPollStep(url, callback);
        }
    });
}
Note that I am bypassing the data-parsing functionality of jQuery by passing request.responseText directly to the callback, because jQuery does not seem to support multipart AJAX responses and I wanted to be consistent across communication paths.
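For reference, another way to keep jQuery from touching the body at all (a small untested variation, not part of my original code) is to declare the response as plain text via dataType: "text", so the success callback already receives the raw responseText:

function MepLongPollStepText(url, callback) {
    $.ajax({
        url: url,
        async: true,
        cache: false,
        dataType: "text", // tells jQuery to return the raw body without parsing or evaluating it
        success: function (text) {
            callback(text); // equivalent to request.responseText above
            MepLongPollStepText(url, callback);
        }
    });
}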
Since no better answer has stepped forward, I wonder whether a simple timeout would solve the problem. Sorry to give a guess instead of an "I know this to be true" answer, but it might actually fix it: deferring the next request with setTimeout lets the current one finish completely before a new one starts, which may give WebKit a chance to clear its loading state:
function MepLongPollStep(url, callback) {
    $.ajax({
        url: url,
        async: true,
        cache: false,
        success: function (data, status, request) {
            callback(request.responseText);
            window.setTimeout(function () {
                MepLongPollStep(url, callback);
            }, 10);
        }
    });
}
Related: this question is based on an issue I had before but never managed to resolve.
I'm running a userscript via FireMonkey which regularly sends requests to my backend server, using CORS since they are cross-domain. For testing purposes, my response headers are currently set very loosely on the backend side:
header("Access-Control-Allow-Origin: *");
header("Access-Control-Allow-Method: POST");
header("Access-Control-Allow-Headers: *");
On the frontend, my requests are sent via jQuery AJAX and look roughly like this:
$.ajax({
    method: "POST",
    url: this.queryURL,
    timeout: this.timeout,
    data: this.formData,
    processData: false,
    contentType: false,
    success: onSuccess,
    error: onError
});
Nothing special here; requests have been working fine ever since.
I now want to get rid of jQuery and use a native XMLHttpRequest instead. The implementation looks like this:
const xhr = new XMLHttpRequest();
xhr.open("POST", this.queryURL);
xhr.setRequestHeader('X-Requested-With', 'XMLHttpRequest');
xhr.timeout = this.timeout;
xhr.ontimeout = onTimeout;
xhr.onreadystatechange = () => {
    if (xhr.readyState !== 4 || xhr.status === 0)
        return;
    if (xhr.status !== 200)
        onError(xhr);
    else
        onSuccess(xhr.responseText);
};
xhr.send(this.formData);
For some reason the requests no longer seem to be sent on Firefox; at least the network tab doesn't show any. Testing on Chrome it still works, but I figured the issue might have something to do with preflight requests, since I noticed something strange:
With jQuery AJAX: (network screenshot omitted)
With native XHR: (network screenshot omitted)
Using native XHR, Chrome always seems to send a first preflight request that fails, followed by a second one that succeeds. This doesn't happen with jQuery AJAX. I'm not sure whether this is helpful in any way, but could it explain why requests keep failing in Firefox at all, and how can I resolve it?
After some more research I found out that this is actually a Firefox-specific issue with userscripts: https://bugzilla.mozilla.org/show_bug.cgi?id=1715249
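As a side note on the doubled preflight seen in Chrome (my own assumption, not covered by the bug report above): the hand-rolled XHR sets a custom X-Requested-With header, and custom headers are not CORS-safelisted, so cross-origin requests carrying one always trigger an OPTIONS preflight; jQuery skips that header for cross-domain requests, which would explain why its calls showed none. A minimal sketch without the custom header:

// same request as above, minus X-Requested-With; with only safelisted
// headers and FormData as the body this should count as a "simple"
// CORS request and need no preflight at all
const xhr = new XMLHttpRequest();
xhr.open("POST", this.queryURL);
xhr.timeout = this.timeout;
xhr.ontimeout = onTimeout;
xhr.onreadystatechange = () => {
    if (xhr.readyState !== 4 || xhr.status === 0)
        return;
    if (xhr.status !== 200)
        onError(xhr);
    else
        onSuccess(xhr.responseText);
};
xhr.send(this.formData);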
In my project, I have a long-running process which is called by AJAX. Its duration can be 1 to 15 minutes.
While the AJAX request is running, I want to give updates to the user. It should simply show how many rows are left to add to the database.
I found out that there are a few different options to realize this: polling, SSE or WebSockets. I have never worked with WebSockets, and I couldn't find a good example.
I'm now trying SSE, which I understand reasonably well, and it works properly on its own. But when the AJAX request starts running, the connection to the EventSource stays pending, so while the AJAX request is running, no updates are received.
Code:
<script type="text/javascript">
    var es;

    function checkProgress(id) {
        es = new EventSource('checkProgress.php');
        es.addEventListener('message', function (e) {
            console.log(e.data);
        }, false);
    }

    checkProgress(1);

    $(function () {
        $('.submit').on('click', function () {
            var form = $('form')[0];
            var form_data = new FormData(form);
            $.ajax({
                type: 'POST',
                url: 'submit.php',
                contentType: false,
                processData: false,
                data: form_data,
                success: function (response) {
                    console.log(response);
                }
            });
            return false;
        });
    });
</script>
Screenshots: network log (omitted)
Now, I actually still haven't found any reference or example of how to implement SSE while an AJAX process is running; all the examples I found have the progress script itself do the work.
I see you're using PHP. My best guess would be that you're also using the built-in PHP session management. The problem with this is that accessing the session is an exclusive operation: I would guess that your AJAX operation has opened and locked the session, preventing your SSE script from also opening it. You might consider not opening the session in the SSE script, or opening it read-only (the original link is dead; an archived copy exists).
I'm trying to improve the performance of a Monaco editor completion item provider (it currently makes AJAX calls to get the appropriate items; the custom language is very large and complex).
I'm wondering if/how returning a Thenable CompletionList might help with this.
https://microsoft.github.io/monaco-editor/api/interfaces/monaco.languages.completionitemprovider.html
We initially started with a synchronous AJAX call so that we were sure to have results to present, but that caused too much blocking and interruption of the typing flow. Now the AJAX call is asynchronous but does not return fast enough, and we get a "No suggestions" message.
I figured it out. In case anyone else is wondering how to do the same, from the provideCompletionItems function:
return new Promise(function (resolve, reject) {
    $.ajax({
        url: 'someaddress.com',
        dataType: 'json',
        success: function (res) {
            // create your keywords json here
            resolve({ items: keywords, isIncomplete: true });
        },
        error: function (xhr, error) {
            reject({ items: [], isIncomplete: true });
        }
    });
});
The Monaco editor will then display a nice little set of animated dots as a loading indicator while the AJAX call happens.
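For context, here is how that provider might be wired up end to end. This is only a sketch: "mylang" and buildKeywordList() are placeholders for your own language id and JSON-to-CompletionItem mapping, and note that newer Monaco versions expect { suggestions, incomplete } instead of { items, isIncomplete }:

monaco.languages.registerCompletionItemProvider('mylang', {
    provideCompletionItems: function (model, position) {
        return new Promise(function (resolve, reject) {
            $.ajax({
                url: 'someaddress.com',
                dataType: 'json',
                success: function (res) {
                    // map the JSON payload to CompletionItem objects
                    resolve({ items: buildKeywordList(res), isIncomplete: true });
                },
                error: function () {
                    // resolve with an empty list so the editor simply shows no suggestions
                    resolve({ items: [], isIncomplete: true });
                }
            });
        });
    }
});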
Guys, I have a jQuery Mobile multi-page document with a button on #page-index. When it is clicked, it sends an AJAX request to the server and changes the page to #page-column. It runs well on a PC, but when I deploy the multi-page document in PhoneGap, the button click only works twice. The code is below:
function test()
{
    $.mobile.changePage('#page_column');
    $.ajax({
        url: "http://192.168.168.120:8090/fcmobile/getTest",
        dataType: "json"
    }).done(function (data) {
        alert(data.content);
    });
}
I found that if I remove $.mobile.changePage('#page_column');, the AJAX request runs fine any number of times, but with the changePage call added it only runs twice; the third time, the AJAX request isn't even sent. Does anybody know the reason?
AJAX is made to be asynchronous, so no need to set async to false to get it working. Use events instead.
For example:
function test () {
    $.ajax({
        'url': "http://192.168.168.120:8090/fcmobile/getTest",
        'dataType': 'json',
        'success': function (json_data) {
            $(document).trigger('test_json_data_loaded');
            console.log(json_data); // the callback parameter is json_data, not data
        }
    });
}
$(document).on('test_json_data_loaded', function () {
    $.mobile.changePage('#page_column');
});
When you set async to false, you're basically making it so that every time this AJAX request is made, the user has to wait until all the data is fully loaded before the application/website can do anything else... not good.
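If the page you switch to also needs the response, the custom event can carry it along; this is a small variation on the snippet above, not part of the original answer:

// inside the success callback, pass the data as an extra trigger parameter
$(document).trigger('test_json_data_loaded', [json_data]);

// jQuery hands the extra parameters to the handler after the event object
$(document).on('test_json_data_loaded', function (event, json_data) {
    $.mobile.changePage('#page_column');
    alert(json_data.content);
});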
When I use jQuery's $.post AJAX function, the page freezes for 2-3 seconds and then the data is received. The freezing time varies depending on the data received.
How can I prevent this?
EDIT:
The code I am using; it actually receives a very large amount of data:
$.post("../ajax_updates.php", { time: last_update }, function (data) {
    if (data) {
        if (data != "") {
            $("#news_feed").prepend($(data).fadeIn('slow'));
        }
    }
});
If you load a big amount of data through JavaScript, this is normal; the problem is caused by your request being synchronous, which makes your browser wait for the request to end before doing anything else.
You need to make your request asynchronous.
P.S. Use $.get instead of $.post to get information from the server; in some cases, especially if your code runs under Windows IIS, you will get an error about that otherwise.
P.S. 1: And it makes sense: $.get is for getting data from the server and $.post is for sending data.
Try this:
$.ajaxSetup({
    async: true
});

$.get("../ajax_updates.php", { time: last_update }, function (data) {
    if (data && data != "") {
        $("#news_feed").prepend($(data).fadeIn('slow'));
    }
});
When you send the AJAX request, make sure that async is set to true. If it is set to false, the browser will freeze until a response is received.
http://api.jquery.com/jQuery.ajax/