Prevent IE8's aggressive caching

I am having an issue with IE8 and its caching behaviour.
If I hit the page index.html and then continue to hit it again, I am served the page from the server.
However, if I hit index.html?ui=v2, then index.html, then index.html?ui=v2 again, I am served the page from the cache.
The problem is that the query string ui=v2 is used to set a cookie which determines the view to deliver. Because the page comes from the cache, the cookie's view mode is not updated and I am served the same content as for index.html (with no query string).
This happens in IE8 and below; no other browsers.
Keen for any input. Ideally I do not want to update the meta or response headers.
Thanks in advance.
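As an illustration of the setup described (parameter and cookie names are hypothetical, not from the original post), the server-side step that turns the query string into a view cookie might look like this. The point is that when IE8 serves index.html from its cache, this step never runs, so the cookie keeps its old value:

```javascript
// Hypothetical sketch: the ui query parameter (e.g. index.html?ui=v2)
// decides which view cookie to set. When the page is served from the
// browser cache, this code never executes and the cookie goes stale.
function viewCookieFor(queryString) {
  const params = new URLSearchParams(queryString);
  const ui = params.get("ui") || "v1"; // assumed default view when no ?ui= is given
  return "viewmode=" + ui;             // value for a Set-Cookie response header
}
```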

Related

"Fetch as Google" renders all pages to look like my homepage

I am trying to figure out why my website's posts and pages, such as my resume, are getting a "Complete" status with a green check mark (seemingly no errors or redirects) when fetching and rendering as Google, yet all of them "render" to look like my homepage. The PageSpeed Insights tool appears to use the same rendering engine, as it has the same issue.
Notes:
The HTML served from my website on initial page load is the correct HTML and content. No redirects occur. The initial page load does not fetch content via JS. I mention this because, although my website is not a one-page application (I'm using Wordpress), I do use ajax in combination with a POST variable flag to fetch new page content when the user navigates to the next page (after the initial page load).
I have verified that all of my pages have been indexed using the "site:" trick in Google search. They are indexed properly, but they aren't "rendering" properly.
Should I be worried? Should I just ignore that the pages aren't rendering properly? It doesn't make any sense. Is anyone else having this issue?
Your resume page is served with a Content-Type response header of image/gif, so Google thinks that the page is an image.
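A minimal sketch of the point above: crawlers decide how to handle a URL from its Content-Type response header, not from the body, so a page mislabelled image/gif is treated as an image even if the body is valid HTML (the helper name is made up for illustration):

```javascript
// Does the response header advertise the body as renderable HTML?
// A page served as image/gif fails this check regardless of its content.
function isRenderableHtml(contentType) {
  return /^text\/html\b/i.test(contentType || "");
}
```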

Chrome uses the ajax cached version when back is pressed

I use an ajax request, alongside pushing a history state, to load new content and update the entire page. At the server, the X-Requested-With header is used to decide whether to send the full page or just the content. But it seems Chrome tends to use the cache no matter whether the page was loaded with ajax or a normal request (it doesn't respect headers when checking the cache).
The problem happens when I open a page on the site, click a link to navigate to a new page using ajax, then navigate to another page by entering a URL in the address bar. When I hit back, the ajax cached version (whether HTML or JSON) is shown instead of the full page. When the cache is disabled, everything works fine.
Is there any way to force Chrome to respect the request headers when checking the cache?
After some research I found out that browsers cache responses based on the request method and URL, so by default they won't consider any request headers when checking the cache. But it is possible to force the browser to respect certain headers by using the Vary header.
So by adding the header Vary: X-Requested-With to each response that changes based on the X-Requested-With request header, the server tells the browser that this response may vary when the X-Requested-With header changes, and that it must request a fresh response.
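A sketch of that fix (handler shape and body strings are placeholders, not from the original answer): the same URL returns either a full page or a fragment, so the response declares Vary: X-Requested-With, which makes the browser key its cache on that header as well as the URL:

```javascript
// One URL, two representations: full page for normal navigation,
// fragment for ajax. Vary tells caches to store them separately.
function respond(requestHeaders) {
  const isAjax = requestHeaders["x-requested-with"] === "XMLHttpRequest";
  return {
    headers: {
      "Content-Type": "text/html",
      "Vary": "X-Requested-With", // key line: cache key now includes this header
    },
    body: isAjax ? "<div>content fragment</div>" : "<html>full page</html>",
  };
}
```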

HTTP GET request caching vs asp.net Output caching

I have a very basic question about caching. I sent a request to an aspx page from the browser. Since it is an HTTP GET request, by default it will be cached; I can see that in the browser's about:cache. But if that page is cached, how is my modification (say, to the CSS or JS of that particular aspx page) reflected on the next request? Does that mean it is not taken from the cache?
At that time, the cache expiry shows something like "1970-01-01 05:30:00" in about:cache for that aspx request. All other static resources (external JS) show a future expiry date.
Does a "past expiry" date simply imply that the item should not be served from the cache without checking the server again?
If Output caching is enabled, I know a new modification will not be seen as long as the cache has not expired. But then how do ASP.NET output caching and the default HTTP GET caching mechanism differ? I know output caching can cache on the server or a proxy, so it can serve multiple users. But at the browser level, how do they differ?
Btw I got the answer :)
http://devproconnections.com/aspnet/aspnet-pages-and-browser-caching
Thanks again.
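The distinction the question circles around can be sketched as the browser's freshness check: a future Expires date lets the cache answer without contacting the server at all, while a past date (like 1970) marks the entry stale, forcing a conditional request to which the server can reply 304. A simplified version of that check (function name is illustrative):

```javascript
// Roughly what the browser decides before using a cached response:
// still fresh -> serve from cache silently; stale or no expiry ->
// send a conditional request (If-Modified-Since / If-None-Match).
function mustRevalidate(expiresMs, nowMs) {
  return expiresMs === undefined || expiresMs <= nowMs;
}
```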

Setting up caching so the browser doesn't hit the server on page refresh

I have some JS, CSS, and image files on my page and I'm using caching, so when I revisit my page
the browser doesn't hit my server.
But when I refresh the page, the browser hits the server and my server returns 304 NOT MODIFIED.
I know this is the usual behaviour, but I want to know whether there is any way to prevent the browser from hitting my server and just load those resources from the cache when the user refreshes the page.
For example, when I refresh the Google page, some of their resources are loaded from cache rather than returning 304 NOT MODIFIED.
Can anybody help a web newbie :) ?
This is normal behavior and ordinarily I'd just live with it. As long as you've set the expiration or max-age settings properly then the 304 messages will only happen upon refresh, as you say. And as long as your users aren't hitting refresh regularly, then the 304 will not typically occur as people browse the site.
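The conditional-GET exchange behind those 304s can be sketched like this (ETag values are made up for illustration): on refresh the browser sends If-None-Match with its cached validator, and the server answers 304 with no body when it still matches, which is cheap but still a round trip:

```javascript
// Minimal server-side conditional-GET check: reply 304 when the
// client's cached copy (identified by If-None-Match) is still current.
function conditionalGet(requestHeaders, currentEtag, body) {
  if (requestHeaders["if-none-match"] === currentEtag) {
    return { status: 304, body: "" };             // cached copy still valid, no body sent
  }
  return { status: 200, headers: { ETag: currentEtag }, body };
}
```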
However, there are some situations in which you can prevent the 304 message even when the user refreshed the page.
Setting the background-image of an element through Javascript.
Using an iframe to pull content that has a max-age (note that they can still refresh the iframe by right-clicking inside it and clicking refresh).
Here's a background-image example if you want to try it out...
<script type="text/javascript" src="http://ajax.googleapis.com/ajax/libs/jquery/1.7.1/jquery.min.js"></script>
<div id="example-image" style="height: 250px;"></div>
<script type="text/javascript">
$("#example-image").css("background", "white url(/images/myImage.jpg) no-repeat");
</script>
UPDATE
I was thinking AJAX GETs would work but it appears I was wrong so I've removed that from the list. In any case, the solutions I've listed above do work but they are clunky and unpleasant which is why I normally don't bother with this.
UPDATE #2
Steve Souders has some interesting experiments regarding this issue...
http://www.stevesouders.com/blog/2013/02/26/reloading-post-onload-resources/

How to prevent content being displayed from Back-Forward cache in Firefox?

Browser: Firefox 6.0
I have Page A with the following setup to make sure its content is NOT stored in the browser's bfcache:
1) $(window).unload(function(){});
2) Following HTTP headers:
<meta http-equiv="pragma" content="no-cache" />
<meta http-equiv="expires" content="-1" />
<meta http-equiv="cache-control" content="no-cache"/>
I've also hooked up the pagehide and pageshow events. When I navigate away from the page, pagehide is invoked with the CORRECT value for the event property persisted = false (which is what's needed: no persistence in the cache!)
After navigating through a couple of pages, I have a window.history.go(-2); to go back to Page A. At this point, I want Firefox to poll the server for the updated version instead of displaying from the cache. The pageshow of Page A is invoked with the CORRECT value for the event property persisted = false (meaning the page is NOT loaded from cache). BUT the page content is not the server data; it is the stale content (the same as when navigating away from the page initially)! Fiddler also does not show a new request to the server.
Google Chrome also exhibits the same behaviour. IE works as expected (reloads fresh data)!
Any idea what I am missing?
Thanks in advance!
There are multiple caches involved. There's the browser's document cache (bfcache), the browser's HTTP cache, and possibly intermediate HTTP caches.
The <meta> tags you show above have absolutely no effect in current Chrome or Firefox. They may have an effect in IE.
So chances are, your page is just being read from the browser's HTTP cache.
If you really want to send no-cache HTTP headers, you should do that. But they need to be actual HTTP headers: as I said above, the <meta> tag "equivalents" do nothing.
And, importantly, any other intermediate caches are not going to be parsing your HTML so might cache things if you don't actually send the right HTTP headers.
If you set the HTTP header Cache-Control: no-cache, no-store, must-revalidate, the page won't be stored in the back-forward cache.
Firefox also considers event handlers on the beforeunload event a signal not to store the page in the bfcache, but Safari ignores such handlers, so it's better to set the correct HTTP headers to indicate the nature of the page content (cacheable or variable).
There are two caches to bear in mind:
The bfcache (back-forwards cache)
The bfcache (in Firefox, Safari and Chrome) stores the page in memory, including any dynamic modifications to the DOM, and is consulted when the user presses back. To attempt to ensure that the page is not stored in this cache, you need to run these lines:
window.addEventListener('unload', function(){});
window.addEventListener('beforeunload', function(){});
Note that this seems to work in desktop Firefox and Chrome, but doesn't always work in desktop Safari, or Android Chrome or Android Firefox or iOS Safari.
Note that Webkit documentation calls the bfcache the "Page Cache".
The normal browser cache
Pages are cached in the normal browser cache unless you set the proper no-store value in the Cache-Control header. To be extra sure, send this full header:
Cache-Control: max-age=0, no-cache, no-store, must-revalidate, private
Firefox, Safari and Chrome will first check the bfcache when pressing the back button. They will then fall back to the normal cache. So you need to both add an event listener to unload, and set this Cache-Control HTTP header. Note that using <meta> instead of the HTTP header may not work.
References:
Article on back/forward cache by Chrome Developer Relations
The answer below does not work any more:
From answer on SO, adding an unload event to window causes the back/forward cache to be cleared.
UPDATE. POSSIBLE SOLUTION:
The BFCache can surprise developers because, at least in Firefox, when moving back/forward the page does not refresh even when the HTTP headers tell it to. So it's better to assume that the page will not refresh.
On the other hand, what is the difference between getting a page with outdated data because of the BFCache, and finding a tab in your browser that you haven't reloaded for ages?
If you care about those kinds of things, write some JavaScript that checks the server for updates and reloads sensitive information. This is a chance to turn your problem into a win :).
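One concrete way to do that (a sketch, not from the answers above) is to listen for pageshow and reload when event.persisted is true, i.e. when the page was restored from the bfcache; written as a testable helper here:

```javascript
// Reload the page only when it was restored from the back-forward cache.
// In the browser you would wire it up as:
//   window.addEventListener("pageshow", (e) => onPageShow(e, () => location.reload()));
function onPageShow(event, reload) {
  if (event.persisted) {
    reload();       // page came from the bfcache: fetch fresh content
    return true;
  }
  return false;     // normal load: nothing to do
}
```

This avoids a reload loop: the page that results from location.reload() is a fresh load, so its pageshow event fires with persisted = false.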
