We have a system that uses Firefox as the client for a web application. The page uses Dojo's Ajax API to POST to a servlet that performs a long-running process. The problem: when the request exceeds 15 minutes (based on observation), the browser repeats the same HTTP request (same parameters), and a repeat request then arrives every 1 minute 20 seconds thereafter.
11:00:00 First Request
11:15:00 Repeat Request
11:16:20 Repeat Request
11:17:40 Repeat Request
11:19:00 Repeat Request
11:20:20 Repeat Request
My question: is there a setting in the Firefox config, or even on the servlet side, that can prevent the repeat requests? This is a local system, so I can control the browser settings.
Note: I know one solution is to run the long process in a separate thread and poll it periodically via JavaScript, but my boss wants an easier fix, if possible via Firefox settings.
Added based on comment:
The code uses the Dojo toolkit's dojo.xhrPost API to make the request. I am not sure whether the Dojo API itself is doing the reposting. Help from a Dojo expert would be welcome too.
Added source based on Jeremy's comment:
dojo.xhrPost({
    url: servlet,
    content: {
        jobName: 'DoLongProcess',
        FUNCTIONNO: dojo.byId('hdFunctionNo').value
    },
    handleAs: handleAs,
    handle: function (response) {
        cursorStyle(cursor_style_auto);
    }
});
The version of Dojo I'm using is 1.3.1 Rev: 17468. Unfortunately I cannot upgrade Dojo to a higher version, since that would require regression-testing every function.
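One thing that may be worth trying at the Dojo level (a hedged sketch, not a confirmed fix for the reposting): dojo.xhrPost accepts a timeout value in milliseconds plus an error callback, so the client at least fails visibly instead of leaving the connection open for 15+ minutes, and preventCache: true rules out any cache interference. servlet, handleAs, and cursorStyle are carried over from the snippet above:

dojo.xhrPost({
    url: servlet,
    content: {
        jobName: 'DoLongProcess',
        FUNCTIONNO: dojo.byId('hdFunctionNo').value
    },
    handleAs: handleAs,
    timeout: 20 * 60 * 1000,   // fail after 20 minutes instead of hanging open
    preventCache: true,        // appends a dojo.preventCache timestamp parameter
    handle: function (response) {
        cursorStyle(cursor_style_auto);
    },
    error: function (err) {
        // fires on timeout or transport error, so the failure is at least visible
        console.error('xhrPost failed:', err);
    }
});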
Related
How to prevent IE from caching the request sent to the server?
I tried setting Cache-Control: no-cache on the HTTPS response object, but IE is still caching my request data.
My project details are as follows:
In my application I send a login request to the server. After I log in, if I take a memory dump using the WinHex tool, I can find the password details in memory.
I am clearing the dialog reference as well, but the request data is still being cached.
Please suggest a workaround for this.
You could try adding a parameter with a random value to your URL; this prevents the URL from always being the same.
Example:
Normal URL:
www.test.com/test.php
Fake different URL:
www.test.com/test.php?_dc=12353somerandomval
Make sure the _dc parameter always has a different value; you can, for example, use the JavaScript Date object for this (it returns the current time in milliseconds, which will virtually always be different):
params: {
    _dc: new Date().getTime()
}
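The params fragment above assumes a request API that serializes a parameter object into the query string. If you build the URL by hand, the same idea looks like this (a minimal sketch, using the example endpoint from above):

var url = 'http://www.test.com/test.php?_dc=' + new Date().getTime();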
In a project I did a while back, I had the exact same issue. I searched around and saw a few suggestions to add a timestamp to the request, which also works, but the snippet below was the most elegant way that worked for me:
$(document).ready(function () {
    $.ajaxSetup({
        cache: false
    });
});
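With cache: false, jQuery appends a _=<timestamp> parameter to every GET request it sends, which is the same cache-busting trick in disguise. If you only want it for a single call, the option can also be passed per request (a minimal sketch with a hypothetical endpoint):

$.ajax({
    url: '/test.php',  // hypothetical endpoint
    cache: false,      // adds _=<timestamp> to this request only
    success: function (data) {
        console.log(data);
    }
});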
I'm developing an application which is supposed to serve different content for "normal" browser requests and AJAX requests to the same URL.
(In fact, it encapsulates the response HTML in a JSON object if the request is AJAX.)
For this purpose, I detect AJAX requests on the server side and process the response appropriately; see the pseudocode below:
function process_response(request, response)
{
    if request.is_ajax
    {
        response.headers['Content-Type'] = 'application/json';
        response.headers['Cache-Control'] = 'no-cache';
        response.content = JSON( some_data... )
    }
}
The problem is that when the first AJAX request to the currently viewed URL is made, strange things happen in Google Chrome: if, right after the response comes back and is processed via JavaScript, the user clicks some link (a static one, redirecting to another page) and then clicks the back button in the browser, he sees the returned JSON code instead of the rendered website (from the server logs I can tell that no request is made). It seems that Chrome stores the latest response for the specific URL and doesn't take into account that it has a different content type, etc.
Is that a bug in Chrome, or am I misusing the HTTP protocol?
--- update 12 11 2012, 12:38 UTC
Following PatrikAkerstrand's answer, I've found the following Chrome bug: http://code.google.com/p/chromium/issues/detail?id=94369
Any ideas how to avoid this behaviour?
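For reference, here is a concrete version of the question's pseudocode as a hedged sketch, assuming a Node/Express backend (the question doesn't name the framework); req.xhr is Express's shorthand for checking the X-Requested-With request header:

var express = require('express');  // assumption: Node/Express, purely for illustration
var app = express();

app.get('/page', function (req, res) {
    if (req.xhr) {
        // AJAX request (X-Requested-With: XMLHttpRequest): wrap content in JSON
        res.set('Cache-Control', 'no-store');       // keep it out of the history cache
        res.json({ html: '<div>fragment</div>' });
    } else {
        // Normal navigation: serve the full page
        res.send('<html><body>full page</body></html>');
    }
});

app.listen(3000);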
You should also include a Vary header:
response.headers['Vary'] = 'Content-Type'
Vary is the standard way to control caching context in content negotiation. Note that Vary names the request headers a response depends on, so if the AJAX detection is based on the X-Requested-With request header, Vary: X-Requested-With may be the more precise value. Unfortunately Vary also has buggy implementations in some browsers; see Browser cache vary broken.
I would suggest using unique URLs.
Depending on your framework's capabilities, you can redirect (302) the browser to URL + .html to force the response format and make the cache key unique within the browser session. For AJAX requests you can then keep the suffix-less URL; alternatively, you may suffix the AJAX URL with .json instead.
Other options: prefixing AJAX requests with /api, or adding a cache-busting query parameter such as ?rand=1234.
Setting Cache-Control to no-store did the trick in my case, while no-cache didn't. This may have unwanted side effects, though.
no-store: The response may not be stored in any cache. Although other directives may be set, this alone is the only directive you need in preventing cached responses on modern browsers.
Source: Mozilla Developer Network - HTTP Cache-Control
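In the question's pseudocode, that is a one-line change (same notation as above):

response.headers['Cache-Control'] = 'no-store'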
I am working with CakePHP on an app which has to run a time-consuming task via a single AJAX call, with secondary periodic AJAX calls checking on the progress of the task.
The Problem
While the time-consuming task (which takes >30 seconds) is running via its AJAX request to CakePHP, the secondary progress AJAX request appears to be blocked.
To clarify, the secondary progress request does not return an error; it simply returns no response until the original time-consuming request finishes.
Once that original AJAX request finishes, the secondary progress request returns as expected.
It seems that execution of the progress request is queued until the first AJAX call finishes, since the progress it reports is 100%.
What I've Tried
I have tried multiple suggested solutions, including:
Changing the session handler to 'cake' in core.php - no fix
Setting the config security level to 'medium' in core.php - no fix
Disabling user agent checks in core.php - no fix
Testing multiple concurrent AJAX calls to a plain PHP script on the same server - works as expected
Any Ideas?
So it seems as though the issue is caused by CakePHP - has anyone experienced this in their own CakePHP app?
Thanks!
Session handling is set to files in PHP; from php.ini:
[Session]
; Handler used to store/retrieve data.
; http://php.net/session.save-handler
session.save_handler = files
This prevents PHP from serving more than one request per session at a time: with the files handler, the session file is locked until the request that opened it finishes.
To release the lock early, call this in your PHP code once you no longer need to write to the session:
session_write_close();
Just be aware the session is then closed, so writing to the session is no longer an option.
For some clarification: are you using the built-in AJAX helper (based on Prototype) or an external library like jQuery?
Usually, the JavaScript library of choice has an {async: true} option available to force concurrency (in jQuery it is in fact the default). See this example:
$.ajax({
    type: 'GET',
    url: '/fetch.php',
    async: true,
    success: function (data, status) {
        $('#status').html(status);
    }
});
For the built-in CakePHP AJAX helper, this option should do the trick: $options['type'] (see the CakePHP documentation). Do note that the AJAX helper is deprecated as of version 1.3 and the JsHelper should be able to take over; see its request() method, for instance, which also has an option called async.
I want to turn off the cache used when a URL call to a server is made from VBScript running within an application on a Windows machine. What function/method/object do I use to do this?
When the call is made for the first time, my Linux-based Apache server returns a response from the CGI Perl script it runs. However, subsequent runs of the script seem to reuse the response from the first time, so the data is being cached somewhere. My server logs confirm that the server is only hit the first time, not on those subsequent runs.
This is what I am doing. I am using the following code from within a commercial application (I don't wish to name the application; it's probably not relevant to my problem):
With CreateObject("MSXML2.XMLHTTP")
    .open "GET", "http://myserver/cgi-bin/nsr/nsr.cgi?aparam=1", False
    .send
    nsrresponse = .responseText
End With
Is there a function/method on the above object to turn off caching, or should I be turning off caching on the response object before making the request?
I looked here for a solution: http://msdn.microsoft.com/en-us/library/ms535874(VS.85).aspx - not quite helpful enough. And here: http://www.w3.org/TR/XMLHttpRequest/ - very unfriendly and hard to read.
I am also trying to force the cache not to be used, via HTTP header settings and HTML document-head metadata:
Snippet of the server-side Perl CGI script that returns the response to the calling client, setting expiry to 0 seconds:
print $httpGetCGIRequest->header(
    -type    => 'text/html',
    -expires => '+0s',
);
Cache-control meta tag in the HTML response sent back to the client:
<html><head><meta http-equiv="CACHE-CONTROL" content="NO-CACHE"></head>
<body>
response message generated from server
</body>
</html>
The above HTTP header and HTML meta settings haven't worked, hence my question.
I don't think that the XMLHTTP object itself even implements caching.
You send a fresh request as soon as you call .send() on it. The whole point of caching is to avoid sending requests, but that does not happen here (as far as your code sample goes).
But if the object is used inside a browser of some sort, then the browser may implement caching. In that case the common approach is to include a cache-breaker in the request: a random URL parameter that you change every time you make a new request (for example, by appending the current time to the URL).
Alternatively, you can make your server send a Cache-Control: no-cache, no-store HTTP header and see if that helps.
The <meta http-equiv="CACHE-CONTROL" content="NO-CACHE"> is probably useless and you can drop it entirely.
You could use WinHTTP, which does not cache HTTP responses. You should still add the cache-control directive (Cache-Control: no-cache) using the SetRequestHeader method, because it instructs intermediate proxies and servers not to return a previously cached response.
If you have control over the application targeted by the XMLHTTP request (which is true in your case), you could let it send no-cache headers in the response. This solved the issue in my case.
Response.AppendHeader("pragma", "no-cache");
Response.AppendHeader("Cache-Control", "no-cache, no-store");
As an alternative, you could also append a query string containing a random number to each requested URL.
Is there a limit to the number of simultaneous Ajax requests that can be launched toward an Apache server? For example, consider the following function to update div elements on a page (Prototype JS):
function trigger_content_update(cell) {
    // asynchronous: false is required for this to work properly
    $$('.update').each(function (update_item) {
        new Ajax.Request('/neighbouring?.state=update_template&dummy=' + (new Date()).getTime(), {
            asynchronous: false,
            parameters: { divid: update_item.id, source: cell },
            onComplete: function (response) {
                var elm = response.getHeader('Element');
                if ($(elm) !== null) { $(elm).update(response.responseText); }
            }
        });
    });
}
On my HTML page, there are 8 div elements marked with the "update" CSS class, thus launching 8 Ajax requests. The code works fine with the asynchronous property set to false, but as soon as I set asynchronous: true, I can observe (in Firebug) most Ajax requests returning a 500 status (Internal Server Error).
Once this occurs, Apache has to be restarted to recover.
I'd check the server-side code that's handling the requests.
As far as Apache is concerned, your Ajax request is just a POST - the same as if you'd submitted a form. Apache should easily handle 8 simultaneous requests, so this suggests that the server-side code Apache is running is locking up - perhaps it's trying to write to a data file and finding it locked?
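If the server-side handler really cannot take parallel requests (file-locked sessions, as in the CakePHP question above, are a common culprit), one workaround is to keep the calls asynchronous but issue them one at a time, starting each request from the previous one's onComplete. A hedged sketch, reusing the question's own markup and URL:

function trigger_content_update(cell) {
    var items = $$('.update');
    function next(i) {
        if (i >= items.length) { return; }  // all divs updated
        new Ajax.Request('/neighbouring?.state=update_template&dummy=' + (new Date()).getTime(), {
            asynchronous: true,  // the UI stays responsive
            parameters: { divid: items[i].id, source: cell },
            onComplete: function (response) {
                var elm = response.getHeader('Element');
                if ($(elm) !== null) { $(elm).update(response.responseText); }
                next(i + 1);  // only now start the next request
            }
        });
    }
    next(0);
}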
I just wrote a test case that sent 10,000 simultaneous Ajax calls to a service. It works fine on Apache Tomcat; every call came back with a proper answer.
It sounds like your service is having some internal synchronization issues.