Grails: best way to send cache headers with every ajax call

It's well known that Internet Explorer aggressively caches ajax calls, whereas all the other browsers grab the data fresh every time. This is usually bad: I've never encountered a case where I want an ajax call NOT to contact the server. Firefox, Safari and the other browsers know this and don't cache ajax calls.
To prevent IE from caching, you have to do one of the following:
add a cache-busting token to the query string (like ?time=[timestamp])
send an HTTP response header that specifically forbids IE from caching the response
use an ajax POST instead of a GET
I much prefer setting a no-cache header. It's the correct way: it tells all browsers not to cache, which is exactly what you intend. The query string method fills up the browser's cache with stuff that'll never be retrieved, leaving less room for legitimate cache content. And the POST method is a corruption of HTTP: POSTs are for modifying data.
In Grails, what's the best way to automatically send a do-not-cache header for all ajax requests? I don't want to modify any controllers, so I'm thinking there's got to be a cool filter trick or something.
Thanks!

Here's what I finally figured out. Most JavaScript libraries (including jQuery, YUI, MooTools and Prototype) send the X-Requested-With: XMLHttpRequest header on every ajax request.
For any request that sends this header, you can send back a response header that tells the browser not to cache.
Below is a Grails filter that prevents caching of ajax requests that identify themselves with the X-Requested-With: XMLHttpRequest header:
// put this class in grails-app/conf/
class AjaxFilters {
    def filters = {
        all(controller: '*', action: '*') {
            before = {
                // ajax libraries identify themselves with this request header
                if (request.getHeader('X-Requested-With') == 'XMLHttpRequest') {
                    response.setHeader('Expires', '-1')
                }
            }
        }
    }
}
Some people prefer to use the Cache-Control: no-cache header instead of Expires. Here's the difference:
Cache-Control: no-cache - absolutely NO caching
Expires: -1 - the browser "usually" contacts the web server for updates to the page via a conditional If-Modified-Since request. However, the page remains in the disk cache and is used without contacting the remote server in appropriate situations, such as when the BACK and FORWARD buttons are used to move through the navigation history or when the browser is in offline mode.
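(With the filter above, that would simply mean swapping the setHeader line for response.setHeader('Cache-Control', 'no-cache').)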
By adding this filter, you make Internet Explorer's caching consistent with what Firefox and Safari already do.
BTW, I've experienced the caching problem on IE8 and IE9. I assume the problem existed for IE7 and IE6 as well.

We use jQuery for all ajax calls, so we add this block to our main.gsp (top-level layout):
<g:javascript>
    jQuery(document).ready(function() {
        $.ajaxSetup({
            cache: false
        });
    });
</g:javascript>
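(jQuery's cache: false works by appending a _={timestamp} parameter to each GET URL; in other words, it is the query-string method described above.)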
Also answered here

Related

Chrome uses the ajax-cached version when back is pressed

I use ajax requests, alongside pushing history state, to load new content and update the entire page. On the server, the X-Requested-With header is used to decide whether to send the full page or just the content. But it seems Chrome uses the cache regardless of whether the entry was loaded with ajax or a normal request (it doesn't respect headers when checking the cache).
The problem happens when I open a page on the site, click a link that navigates to a new page using ajax, then navigate to another page by entering its URL in the address bar. When I hit back, the cached ajax version (whether it's HTML or JSON) is shown instead of the full page. When the cache is disabled, everything works fine.
Is there any way to force Chrome to respect the request headers when checking the cache?
After some research I found out that browsers cache responses based on the request method and URL alone, so by default they won't consider any request headers when checking the cache. But it is possible to force the browser to take certain headers into account by using the Vary header.
By adding Vary: X-Requested-With to each response that changes based on the X-Requested-With request header, the server tells the browser that the response may differ when X-Requested-With changes, so a cached response made with a different value of that header must not be reused.
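For example, both the full-page response and the ajax response would carry:
Vary: X-Requested-With
With that in place, a response cached for a request that sent X-Requested-With: XMLHttpRequest won't be served for a plain navigation that didn't send the header.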

Browser doesn't distinguish between partial HTML fetched via AJAX and a full page

I've got a page that can be accessed at URL /products. When I visit it in a browser it responds with a full page within a layout. Here is a simplified example of request headers and response body:
Accept: text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8
<layout>
<products />
</layout>
When a user does some search the javascript updates results via AJAX. The results are rendered without a layout since it takes time to render and I don't need it anyway:
Accept: */*;q=0.5, text/javascript, application/javascript, application/ecmascript, application/x-ecmascript
X-Requested-With: XMLHttpRequest
<products />
So, this worked fine until I added caching with Cache-Control: private, max-age=3600. Initially I thought I would add a Vary: X-Requested-With header and the browser would distinguish the two responses. However, when I GET /products via AJAX and then visit /products in the browser, it displays the partial AJAX response.
Is there a simple way to solve this problem?
P.S. I'm using Ruby on Rails and jQuery if that matters.
Have your Ajax call use a different URL, like /products/partial.
You should have a different URL for partial results (e.g. ?partial=yes or something like that)
OR
you can get the whole page via ajax and extract just the part you want using jQuery's .load():
$("#productsContainerHolder").load("/my/products/url #productsContainer", { myParam: "beer", myParam2: "cold" });
.load() will call your server (with GET by default, though jQuery switches to POST when you pass a data object, as here), retrieve the whole page, extract #productsContainer from it and insert it into #productsContainerHolder:
<div id="productsContainerHolder">
    <div id="productsContainer">
        ...
    </div>
</div>
This article by Steve Luscher describes a similar case, though there the problem was more intermittent than what you describe. The suggested solutions are:
Cancel all of the AJAX requests at the time the form is submitted
Use a different URL according to the response you expect
Steve went with #1, calling cancel() on the ajax requests.
You don't mention which browsers you have used; there is a related browser-specific question here
Use Vary: Accept. This should work.
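This works because the two requests shown in the question already differ in their Accept headers (text/html,... for the full page versus */*;q=0.5, text/javascript,... for the AJAX call), so Vary: Accept makes the cache store the two responses as separate entries.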
Probably the easiest fix is to set the cache option of jQuery's ajax method to false. jQuery will then automatically append a timestamp to the URI, preventing it from being cached.
This can be done application-wide with the following snippet:
$.ajaxSetup({
    cache: false
});
If some caching does matter even for dynamic requests, you can generate the timestamp yourself from the date plus the hour, so the cache-busting value only changes once an hour.
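A minimal sketch of that idea (the URL and parameter name here are just placeholders):
// token changes once per hour, so identical requests within the same
// hour can share one cache entry
var hourToken = new Date().toISOString().slice(0, 13); // e.g. "2013-04-05T14"
$.ajax({
    url: '/products',
    data: { v: hourToken }, // appended to the query string on GET requests
    success: function (html) { /* ... */ }
});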
Try sending must-revalidate instead of private (private mainly tells shared proxy caches not to store the response):
Cache-Control: max-age=3600, must-revalidate
I recommend reading this article; it might help: http://www.mnot.net/cache_docs/
Also use Mark's tool http://redbot.org/ to test your results, to rule out your local machine, your ISP and so on.

How to prevent content being displayed from Back-Forward cache in Firefox?

Browser: Firefox 6.0
I've got Page A with the following setup to make sure the content is NOT stored in the browser's bfcache:
1) $(window).unload(function(){});
2) The following HTTP headers:
<meta http-equiv="pragma" content="no-cache" />
<meta http-equiv="expires" content="-1" />
<meta http-equiv="cache-control" content="no-cache"/>
I've also hooked up the pagehide and pageshow events. When I navigate away from the page, pagehide is invoked with the CORRECT value for the event property persisted = false (which is what's needed: no persistence in the cache!)
After navigating a couple of pages, I have a window.history.go(-2); to go back to Page A. At this point, I want Firefox to poll the server for the updated version instead of displaying from the cache. The pageshow of Page A is invoked with the CORRECT value for the event property persisted = false (meaning the page is NOT loaded from the cache). BUT the page content is not the server data; it is the stale content (same as when navigating away from the page initially)! Fiddler also does not show a new request to the server.
Google Chrome also exhibits the same behaviour. IE works as expected (reloads fresh data)!
Any idea what I am missing?
Thanks in advance!
There are multiple caches involved. There's the browser's document cache (bfcache), the browser's HTTP cache, and possibly intermediate HTTP caches.
The <meta> tags you show above have absolutely no effect in current Chrome or Firefox. They may have an effect in IE.
So chances are, your page is just being read from the browser's HTTP cache.
If you really want to send no-cache HTTP headers, you should do that. But they need to be actual HTTP headers: as I said above, the <meta> tag "equivalents" do nothing.
And, importantly, any intermediate caches are not going to be parsing your HTML, so they might cache things if you don't actually send the right HTTP headers.
If you set the Cache-Control: no-cache, no-store, must-revalidate HTTP header, the page won't be stored in the back-forward cache.
Firefox also treats an event handler on the beforeunload event as a signal not to store the page in the bfcache, but Safari ignores such handlers, so it's better to set the correct HTTP headers to indicate the nature of the page content (cacheable or variable).
There are two caches to bear in mind:
The bfcache (back-forward cache)
The bfcache, used by Firefox, Safari and Chrome when pressing back, stores the page in memory, including any dynamic modifications to the DOM. To attempt to ensure that the page is not stored in this cache, you need to run these lines:
window.addEventListener('unload', function(){});
window.addEventListener('beforeunload', function(){});
Note that this seems to work in desktop Firefox and Chrome, but doesn't always work in desktop Safari, Android Chrome, Android Firefox or iOS Safari.
Note that Webkit documentation calls the bfcache the "Page Cache".
The normal browser cache
Pages are cached in the normal browser cache unless you set the proper no-store value in the Cache-Control header. To be extra sure, send this full header:
Cache-Control: max-age=0, no-cache, no-store, must-revalidate, private
Firefox, Safari and Chrome will first check the bfcache when pressing the back button. They will then fall back to the normal cache. So you need to both add an event listener to unload, and set this Cache-Control HTTP header. Note that using <meta> instead of the HTTP header may not work.
References:
Article on back/forward cache by Chrome Developer Relations
The answer below no longer works:
From an answer on SO: adding an unload event to window causes the back/forward cache to be cleared.
UPDATE. POSSIBLE SOLUTION:
The bfcache can bring surprises to developers, because at least in Firefox a page restored by moving back/forward does not refresh even when HTTP headers say it should. So it's better to assume that the page will not refresh.
On the other hand, what is the difference between getting a page with outdated data because of the bfcache, and finding a tab in your browser that you haven't reloaded for ages?
If you care about that kind of thing, write some javascript that checks the server for updates and reloads the sensitive information (see the sketch below). This is a chance to turn your problem into a win :)
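A minimal sketch of that approach, assuming jQuery is available; the /api/version endpoint and the pageVersion value embedded in the page are hypothetical:
window.addEventListener('pageshow', function (event) {
    // runs on normal loads and on bfcache restores (event.persisted === true);
    // checking the server either way also catches stale HTTP-cache copies
    $.get('/api/version', function (serverVersion) {
        if (serverVersion !== window.pageVersion) {
            location.reload(); // our copy is stale; fetch a fresh one
        }
    });
});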

jQuery AJAX request in IE9 not sending Cookie header

I'm using jQuery's ajax .get method to retrieve data from my server. It works perfectly in Chrome, but in IE9 it does not send the Cookie header, and that breaks the app. Any idea why? Here's the jQuery code:
$.get(this.server + 'rest/photo/' + this.profileId + '/count', function(data) {
    $('#imageCount').html(data);
});
I have the same problem here; I can't get the jQuery .ajax() function to work. The only workaround I found is this:
<meta http-equiv="X-UA-Compatible" content="IE=EmulateIE8" />
You can add this meta tag to the top of the page to get it working, but it doesn't feel like a good solution. I think the problem is that the XMLHttpRequest object in IE9 is different, so jQuery cannot find the expected object and the ajax call never fires.
I ran into a similar issue to the OP's many years later with IE9, which, sadly, is still hanging on.
Every browser I tried, including IE10+, seemed fine with passing cookies to my backend, but IE9 would just drop them. It didn't seem to matter what attributes were on the cookies. The main page and the API were on the same domain, the cookies matched, and the schemes were the same. I wasn't doing anything with IFRAMEs, so the P3P 'potato' hack didn't help.
So I started doing some research on what could be different about IE9. This Microsoft post was very enlightening; it outlines all the things IE8 and IE9 did to help lock down CORS security holes:
Must use HTTP(S), and both endpoints must use the same scheme
Must use GET/POST
No custom headers allowed
Only text/plain content-type allowed
More sensitive to Security Zone settings
Cookies will be stripped from the request
That last item about the cookies got me thinking: what if IE9 thought I was making a cross-site request? It certainly looked like my requests were getting shot down in exactly that fashion. I had already checked some of the obvious things like the scheme and domain, but maybe I hadn't checked everything.
The solution? I was using reqwest as my ajax library. It has a cross-origin parameter, which I had left set to true for some reason. Setting it (correctly) to false did the trick: all my cookies were picked up by the server. So it was a dumb mistake, but I learned a thing or two.
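For illustration, the fix looked roughly like this (the URL is a placeholder; reqwest's option is named crossOrigin):
reqwest({
    url: '/rest/photo/123/count',
    method: 'get',
    crossOrigin: false, // was true, which made IE9 strip cookies from the request
    success: function (resp) {
        // cookies now reach the server
    }
});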
Hope this helps someone!

Can headers be sent in an AJAX request?

Can I call the server to set a new cookie with an AJAX request (that is, after the page has already loaded)?
For example, when a visitor hits a link, ajax would open a php file that sets a new cookie like this:
setcookie('cookiename', 'true', time()+3000, "/",'...');
But this is done after the HTML (the page containing the actual <a> tag that was clicked) has been rendered. Is it nevertheless OK to set cookies via ajax? (Maybe it works because the PHP file loaded is separate from the original HTML page.)
You can have the server's response set a cookie, certainly. Remember that cookies are an HTTP thing, not an HTML thing; the fact that your original HTML file is already on the browser is irrelevant. Your ajax request is a separate HTTP request to the server, which (hopefully!) generates an HTTP response back to the browser; and that response can include a new Set-Cookie header.
I'm not a PHP person, so you'll need to check whether there are limitations in the PHP mechanism you're using for setting the cookie (I can't imagine there are). But fundamentally, no, there's no problem doing this. I've done it with both JSPs and classic ASP.
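For example, the setcookie() call from the question would put a header along these lines on the ajax response (expiry date and domain omitted), and the browser applies it just as it would on a full page load:
Set-Cookie: cookiename=true; expires=...; path=/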
I've set cookies in the response to AJAX requests on my site and I haven't had any problems with it yet. (Although I haven't looked for problems.) It could be that some browsers don't set cookies when receiving them in an XmlHttpRequest but so far I've seen it work in IE, Chrome and Firefox.
Why not use javascript to edit cookies? Return the content of the cookie in JSON format and use javascript to store the values.
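A minimal sketch of that suggestion, assuming jQuery; the endpoint name and cookie attributes are illustrative:
$.getJSON('/get-cookie-values.php', function (data) {
    // store each returned value as a cookie on the client side
    document.cookie = 'cookiename=' + encodeURIComponent(data.cookiename) +
                      '; max-age=3000; path=/';
});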
