I have a stored procedure that takes about 10 seconds to execute and returns 10,000 records.
Every time I call the API, the call results in a connection timeout.
If I could increase the timeout of the REST request, it would work fine.
My main question is: when a timeout occurs, how do I detect it in the AJAX request so I can tell the user that a timeout happened? At the moment the AJAX call simply fails, and I get no error indicating whether it timed out or the data was not found.
Here is how you can detect a timeout error in the AJAX call and show it to the user:
$.ajax({
    url: "/ajax_json_echo/",
    type: "GET",
    dataType: "json",
    timeout: 1000, // timeout for this request, in milliseconds
    success: function(response) { alert(response); },
    error: function(xmlhttprequest, textstatus, message) {
        // jQuery passes "timeout" as the status text when the request times out
        if (textstatus === "timeout") {
            alert("got timeout");
        } else {
            alert(textstatus);
        }
    }
});
If you are working with a REST API, especially as a client, it is always better to set a timeout. I observed a similar connection-timeout issue, and the problem was solved by adding connection.setConnectTimeout(5000);, which makes the connection time out after 5 seconds.
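On the browser side, jQuery also lets you set a default timeout once for every request it makes, via $.ajaxSetup. A minimal sketch, with the 5-second value chosen only to mirror the example above:
// Sets a default timeout (in milliseconds) for all subsequent jQuery AJAX requests.
// Individual $.ajax calls can still override it with their own timeout option.
$.ajaxSetup({
    timeout: 5000
});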
Related
I'm working on a project using ASP.NET MVC, and I'm trying to read data from SQL Server every 5 seconds and give the user some feedback.
What I've done so far is use an AJAX call combined with setTimeout,
so every 5 seconds an AJAX call checks whether there is any new data, as shown below
(the URL is just an example):
function ajaxRequest() {
    $.ajax({
        url: "https://jsonplaceholder.typicode.com/photos",
        success: function (result) {
            console.log(result);
        },
        complete: function (data) {},
    }).then(function () {
        setTimeout(ajaxRequest, 5000);
    });
}
My question is: is there any downside to using setTimeout or setInterval with AJAX like this,
or is there a better solution for my problem?
It's worth mentioning that the project will be used by at most 4 users at the same time, so I don't think overloading the server with requests will be a problem.
I have created a chat room that uses an AJAX request to check for new messages every second via the setTimeout function. I got this working, but my only question is: is it okay to request data from the server every second, or can it cause problems? Below is my code:
function refresh(){
    setTimeout(function(){
        // Ask the server whether there are any new messages
        $.ajax({
            type: 'POST',
            url: 'checkNewMessage.php',
            data: { sender: $sender, recipient: $recipient },
            success: function(response) {
                $('#newComm').val(response);
                // If the message count went up, fetch and display the new messages
                if($('#newComm').val() > $('#oldComm').val()){
                    $.ajax({
                        type: 'POST',
                        url: 'appendNewMessage.php',
                        data: { sender: $sender, recipient: $recipient },
                        success: function(response) {
                            $("#chatRoom").prepend(response).fadeIn(4000);
                            $('#oldComm').val($('#newComm').val());
                        }
                    });
                }
            }
        });
        // Schedule the next poll
        refresh();
    }, 1000);
}
Well, this is a question with an "it depends" answer.
Polling the server with timed requests is not the best way of achieving what you want. Here I recommend using WebSockets: https://developer.mozilla.org/en-US/docs/Glossary/WebSockets
But back to your question: it depends on your server and the load it is going to take. Let's say you have ten active users. Your server would then take about 10 requests per second - not too much.
You could run a benchmark and see how many requests per second your server can handle. But handling requests is not the same as answering each request.
If you don't have that many users in your chat, you might be OK with this approach. For bigger loads I highly recommend switching to WebSockets.
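For reference, the browser's WebSocket API is small. Here is a minimal sketch of what the client side could look like - the ws://example.com/chat endpoint is a placeholder, and a matching WebSocket server on the backend is assumed:
// Hypothetical endpoint; replace with your own WebSocket server URL.
var socket = new WebSocket("ws://example.com/chat");

socket.onopen = function () {
    // Tell the server which conversation to push updates for.
    socket.send(JSON.stringify({ sender: $sender, recipient: $recipient }));
};

socket.onmessage = function (event) {
    // The server pushes new messages, so no polling is needed.
    $("#chatRoom").prepend(event.data).fadeIn(4000);
};

socket.onerror = function () {
    console.log("WebSocket error - consider falling back to polling");
};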
I am experiencing two issues with my jQuery record-inserting process, and I am hoping that this wonderful SO community can help me to solve at least one of those issues. These are the issues:
Issue 1 - Intermittent server delay
The first issue is that my Ubuntu 10.04 server seems to exhibit intermittent 4.5-second delays when POSTing data to the MySQL database. Most POST commands execute within a few milliseconds, but when a delay occurs, it is always approximately 4.5 seconds. This is not a busy, public server, so server load shouldn't be the problem. These short videos demonstrate what I am trying to explain:
Video 1
Video 2
I have posted a question on serverfault and am awaiting some input from that forum which is probably more appropriate for this Issue 1.
Issue 2 - Timing of jQuery POST and GET Methods
The real issue I am trying to resolve is preventing the GET call from firing before all of the POST calls have completed. Currently, I use $.when.apply to delay sending the GET. Here is the code for that:
function(){
    $.when.apply(undefined, InsertTheAPPs()).done(function (){
        $.ajax({
            url: sURL + "fileappeal/send_apps_email",
            success: function() {
                var m = $.msg("my message",
                    {header:'my header', live:10000});
                setTimeout(function(){
                    if(m)m.setBody('...my other message.');
                },3000);
                setTimeout(function(){
                    if(m)m.close(function(){
                        window.location.replace(sURL+'client/view');
                    });
                },6000);
                $('#ajaxShield').fadeOut(1000);
            },
            error: function(){
                $.msg("error message",
                    {header:'error header', live:10000});
            }
        });
    });
}
My problem arises because of the delay described in Issue 1. The GET is being called after all of the POSTs have begun, but I need it to wait until all of the POSTs have ended. This is the issue I need assistance with. What is going wrong is that my confirmation email is being sent before all of the records have been completely inserted into the MySQL database.
Here is the code for the jQuery $.each loop. This code needs to not only begin, but also end, before the AJAX call to fileappeal/send_apps_email above:
function InsertTheAPPs(){
    $('input[name=c_maybe].c_box').each(function(){
        var jqxhrs = [];
        if($(this).prop('checked')){
            var rn = $(this).prop('value');
            jqxhrs.push(
                $.ajax({
                    url: sURL + 'fileappeal/insert_app',
                    type:"POST",
                    dataType: 'text',
                    data: {'rn': rn},
                    error: function(data) {console.log('Error:'+rn+'_'+data);}
                })
            )
            return jqxhrs;
        }
    });
}
Does anyone have any suggestions for how I can work around the server delay issue and prevent the GET from being called before all of the POST requests have completed? Thanks.
There's a small problem with your post. After you resolve it, this post should help you finish out your code: jQuery Deferred - waiting for multiple AJAX requests to finish
You're returning inside the .each, but the outer function itself doesn't return anything, so your $.when.apply(...) is never given the array of AJAX calls to wait for. Also, since jqxhrs is defined inside the .each, it is scoped to each iteration over each c_box. Your method should look like this:
function InsertTheAPPs(){
    var jqxhrs = [];
    $('input[name=c_maybe].c_box').each(function(){
        if($(this).prop('checked')){
            var rn = $(this).prop('value');
            jqxhrs.push(
                $.ajax({
                    url: sURL + 'fileappeal/insert_app',
                    type:"POST",
                    dataType: 'text',
                    data: {'rn': rn},
                    error: function(data) {console.log('Error:'+rn+'_'+data);}
                })
            )
        }
    });
    return jqxhrs;
}
You can also simplify your code. Since you just want to know whether something is checked, you can use the jQuery pseudo-class filter :checked, such as:
function InsertTheAPPs(){
    var jqxhrs = [];
    $('input[name=c_maybe].c_box').filter(':checked').each(function(){
        var rn = $(this).prop('value');
        jqxhrs.push(
            $.ajax({
                url: sURL + 'fileappeal/insert_app',
                type:"POST",
                dataType: 'text',
                data: {'rn': rn},
                error: function(data) {console.log('Error:'+rn+'_'+data);}
            })
        )
    });
    return jqxhrs;
}
You could combine the filter on :checked into the main filter such as $('input[name=c_maybe].c_box:checked') but I left it in long form to really demonstrate what was going on.
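For reference, the combined-selector version would look like this - the behavior is the same as the code above, just with the :checked test folded into the selector:
function InsertTheAPPs(){
    var jqxhrs = [];
    // ':checked' is part of the selector, so no separate .filter() call is needed
    $('input[name=c_maybe].c_box:checked').each(function(){
        var rn = $(this).prop('value');
        jqxhrs.push(
            $.ajax({
                url: sURL + 'fileappeal/insert_app',
                type: "POST",
                dataType: 'text',
                data: {'rn': rn},
                error: function(data) { console.log('Error:' + rn + '_' + data); }
            })
        );
    });
    return jqxhrs;
}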
I've got a small problem with a webpage that I'm updating using AJAX. When I stay on the page for a while (an hour or so), it stops loading and I can't display any of my pages in the browser. I've only had this problem since I added this (simplified) JavaScript to my page:
var interval;
interval = setInterval('UpdateComs()', 5000);

function UpdateComs() {
    $.post('data.php', { profile: pid }, function(data) {
        $('.holder').html(data);
    });
}
Since my server isn't down, that can't be the problem, so I was wondering whether this might be caused by too many connections. Could the above be opening more than one connection?
And if it does, should I somehow close them?
Sorry for all the questions, but I'm not too familiar with how connections work. Thanks for any help or ideas.
I suspect a random error (the connection gets behind due to network traffic, or maybe a timeout) that stops the setInterval. Always pass an anonymous function (not a string) to setTimeout or setInterval, and only call the next function after the completion of the prior one.
function UpdateComs() {
    $.ajax({
        type: "POST",
        url: 'data.php',
        data: {profile: pid},
        async: true,
        cache: false,
        timeout: 10000,
        success: function(data){
            $('.holder').html(data);
            // Schedule the next poll only after this one has completed
            setTimeout(function(){ UpdateComs(); }, 5000);
        },
        error: function(XMLHttpRequest, textStatus, errorThrown){
            // do what you want with the error
            setTimeout(function(){ UpdateComs(); }, 5000);
        }
    });
}
New to AJAX, so this is a very basic question:
Is there no way to make a synchronous AJAX call (async: false) with a timeout set on it?
http://www.ajaxtoolbox.com/request/
Timeout works perfectly with asynchronous calls in my application,
but for one particular scenario I need a synchronous call (the JavaScript should actually wait until it hears back from the server), and this also works fine. However, I need to handle the case where the server takes too long and an AJAX timeout should fire.
Is there any other piece of standard documentation for AJAX I could refer to?
Thanks
Basically, during a synchronous AJAX request the browser is blocked, and no JavaScript can run while it is blocked. Because jQuery is itself JavaScript, it can't abort the request after a set timeout. This is the primary flaw of synchronous AJAX.
Any time you might want a synchronous request, you should instead use an asynchronous one and put what should happen afterwards in the callback, as shown below:
$.ajax({
    url : 'webservices.php',
    timeout: 200,
    dataType : 'json',
    data : {
        'cmd' : 'ping',
    },
    success : function(data, textStatus) {
        $.ajax({
            url : 'webservices.php',
            async: false,
            dataType : 'json',
            data : {
                'cmd' : 'feedback',
                'data' : data,
                'userinfo' : window.dsuser
            },
            success : function(data, textStatus) {
                // success!
                Status("Thanks for the feedback, "
                    + window.dsuser.user + "!");
            }
        });
    },
    error : function(jqhdr, textStatus, errorThrown) {
        Status("There was trouble sending your feedback. Please try again later");
    }
});
I don't believe it's possible to set a timeout on a synchronous call. When you set async: false, the browser actually locks up while waiting for the response, and for that reason you should only use a synchronous request if you absolutely need to.