Chrome caching like a mad browser

I've got a web service that, like most others, uses JS and CSS files. I use the old trick of appending a version number to the JS and CSS file URLs, like ?v=123, which gets changed every time we update the service in production.
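For concreteness, the trick amounts to something like this (a minimal sketch; the helper name and version constant are illustrative, not my actual code):

    // Bumped on every production deploy (illustrative value).
    var ASSET_VERSION = '123';

    // Append the version as a cache-busting query parameter.
    function versioned(url) {
        return url + '?v=' + ASSET_VERSION;
    }

    versioned('/js/app.js');    // -> "/js/app.js?v=123"
    versioned('/css/site.css'); // -> "/css/site.css?v=123"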
Now, this works fine in all browsers except Chrome. Chrome seems to prefer its cached version over fetching the new one, and therefore seems to ignore the appended variable. In some cases, forcing a cache refresh (Cmd+R / Ctrl+F5) wasn't enough, so I had to go into the options and clear the cache for it to load the new content.
Has anyone experienced this issue with Chrome? And if so, what was the resolution to the problem?

Chrome should certainly treat requests with varying query strings as different requests; a cached result for style.css?v=123 should never be used for style.css?v=124. If you're seeing different behavior, please file a bug at http://new.crbug.com/ and post the bug ID here.
That said, I'd first check to see whether the page was cached longer than you expected. If a new version of the page itself wasn't downloaded, then it would still be requesting ?v=123 as the HTML wouldn't have changed. If you're sending long-lived cache headers with the page, it's certainly possible that Chrome is caching it more aggressively than you expected. If that's the behavior you're seeing, please star http://crbug.com/8742 for updates.
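For what it's worth, the usual arrangement is to make the HTML revalidate on every load while the versioned assets cache for a long time. A rough sketch of the headers in Node/Express terms (your stack will differ; renderHomePage is a hypothetical placeholder):

    var express = require('express');
    var app = express();

    // The HTML page: revalidate on every load, so a new deploy's
    // ?v=124 references are picked up immediately.
    app.get('/', function (req, res) {
        res.set('Cache-Control', 'no-cache');
        res.send(renderHomePage()); // hypothetical render function
    });

    // Versioned assets: safe to cache for a year, because the URL
    // changes whenever the content does.
    app.use('/static', express.static('public', { maxAge: '365d' }));

    app.listen(3000);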

I have also had the same experience.
You can use Ctrl+Shift+R for a cache-free reload in both Chrome and Firefox.

I have had this experience as well.
I run a membership site which displays content such as "You must be logged in as a Gold member in order to see this content" if they are not logged in or are trying to view content not allowed by their membership level. But even if the user is logged in, the user would still see "You need to log in", due to Google Chrome's aggressive caching. In Firefox, however, it works fine as I test logging in and out of all 5 levels of membership - each displaying the proper content.
While Chrome's caching problem can be solved by clearing the cache every time the user logs in and out, it would be really annoying to take that approach.
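For anyone else hitting this: a standard way to avoid it, rather than clearing the cache, is to mark per-user pages as uncacheable so the browser re-fetches them when the login state changes. A hedged Express-style sketch (any server that can set response headers works the same way; the route is illustrative):

    var express = require('express');
    var app = express();

    // Don't let the browser or any shared cache reuse per-user responses.
    app.use('/members', function (req, res, next) {
        res.set('Cache-Control', 'private, no-store');
        next();
    });

    app.listen(3000);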

Related

How to force the browser to show the most up to date files instead of relying on application cache?

It's very important for the website I'm working on to work offline. I'm using a cache manifest to store all the files in the application cache, so that takes care of that, and all is well and good.
BUT, as I read and noticed myself, the browser first shows the cached version of the site before checking for an update online. Hitting refresh loads the page from the cache again, this time with the newly cached files (or whatever it had time to update for the swift refreshers).
I'm aware of this fix: http://www.html5rocks.com/en/tutorials/appcache/beginner/, where the user is told an update is available and asked to refresh the page. Not a bad method, but still awkward for the user experience.
Is there any other way to force the browser to show the most up-to-date files when online? Would cache-busting all files manually AND using a cache manifest fix this problem, or will the two conflict and cause problems with the offline functionality?
I found something that works well for me:
The URL linking to the web page contains a parameter. If there is ever a change to the page or related files, the URL is changed to something like this: http://www.mywebsite.com/mypage.html?v=3, where v=3 is changed depending on updates.
This is a longer fix to implement (finding every page affected by a change and changing all their cache-busting links), but the pages at least show what they're supposed to on the first load, and the cache manifest still loads the update for offline viewing.
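For reference, the html5rocks fix mentioned in the question boils down to listening for the appcache's updateready event, swapping the new cache in, and offering a reload; roughly:

    // When a fresh application cache has been downloaded, swap it in
    // and ask the user whether to reload with the new files.
    window.addEventListener('load', function () {
        window.applicationCache.addEventListener('updateready', function () {
            if (window.applicationCache.status === window.applicationCache.UPDATEREADY) {
                window.applicationCache.swapCache();
                if (confirm('A new version of this site is available. Load it?')) {
                    window.location.reload();
                }
            }
        }, false);
    }, false);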

MVC 3 with IE, poor bundle performance

We have deployed an MVC 3 website on an IIS6 box.
Everything runs fine, but the performance is abysmal.
Can anyone help me understand:
why I am getting 20-second response times to fetch a script bundle?
why bundled scripts are not cached by IE even though the Expires header is set?
The site is several times faster in Chrome (where I have noticed the caching behaviour is correct), but we cannot force customers to use it.
Any help would be great. I'm kind of wondering if it's a server-side setting that's forcing the bundle to recompile on each request, or if it's just IE acting up as usual.
Edit: as requested in the comments, I'm also including the bundle request headers:
If you have different download times for a full reload between the two browsers, it could be that you are doing intense computations with a client-side framework like AngularJS (I have seen big performance differences between the two browsers in highly complex AngularJS apps).
If both your browsers show the same download time, it is either a network issue, or a server issue.
The IE caching could be a separate issue; break your problem into two parts and look for the cause of the slow downloads first.
All I can do now is suggest an approach to finding the issue.
Summary of what you know
It looks like you have:
The server sends an Expires header set one year in the future.
When you reload the page (i.e. you don't force a full refresh using Ctrl+F5), IE takes no notice of the cache headers, and the new request it sends doesn't use If-Modified-Since or If-None-Match.
Chrome behaves differently and respects the Expires and/or ETag response headers (it doesn't even make the request for the bundle again).
EDIT 1: You also seemed to be saying (though it would be good to see a timeline from Chrome) that Chrome downloads the files faster, implying it is not a server-side problem; however, your latest comment states that Chrome's downloads are also slow. (end edit)
And you also seem to be saying that this behaviour is consistent (i.e. 100 requests in IE, and 100 requests in Chrome show the above behaviour with no deviations).
Approach
You should break this problem into two parts:
Why is the download so slow?
Is there a server-side performance problem? Look for common download times in IE and Chrome, and Firefox (it could be due to bundling/minification/compression on the server).
Is there a network connectivity issue (dropped packets, for instance)? Look for inconsistent download times, Start times, Request times, between requests in a given browser and the same behaviour across all browsers.
Is a script slowing down IE, but not Chrome (this is not uncommon; I maintain legacy sites where the scripts run badly in IE but fine in Chrome)? Compare profiling results between the browsers.
Why is the javascript not being cached in IE? Troubleshoot (1) first, then worry about this.
It is possible that the two are related, but approaching them separately will be a start. Number 1 is far easier to diagnose than number 2; the top web references on caching JavaScript in IE are about preventing it, in order to help with development.
Root cause diagnosis
EDIT 1: The first thing to do is try the site from a browser on the server, or very close to the server, to see if you have a network issue. (end edit)
Tools like Fiddler, the browser developer tools (timeline and script profiler), and YSlow are your friends. Compare each of the following between Chrome and IE (and see what happens in Firefox as well) and spot the difference. Note: you may need to clear the browser cache between tests.
browser developer tools -> script profile: see if you have a slow running script in IE compared to Chrome
similar analysis in a tool like YSlow (look for comparisons between the two browsers, not script improvements)
request and response headers, and timeline from a normal (i.e. not full reload) page load
request and response headers, and timeline from a full page reload (Ctrl+F5)
Start and Request durations for every JS file, for a given browser and between browsers (this may point to network issues). I note that the Start and Request phases alone are taking 0.6s and 1s each in IE - that is very, very poor performance.
5 requests, and 5 full reloads with cache clearing between (that is, don't chase a ghost - be consistent in your test methodology)
Download times should be no different between Chrome and IE when no scripts are actually running, so also add a control test. Assuming that your bundle files don't "do anything" (i.e. they contain functions that the page calls, rather than kicking off long processes by themselves), create a blank page in your site which references exactly the same JavaScript files - not just the bundle, but every single JS reference.
With the control test you can compare pure download times and caching behaviour in IE to Chrome, without any client side javascript running (use the developer tools profiler to verify no scripts are running). If your bundle files do kick off long running things, just temporarily disable those things by putting return statements at the top of the script and concentrate only on the download into the browser.
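If it helps, you can also take the browsers out of the equation entirely and measure raw download times against the server. A crude Node sketch (the bundle URL is a placeholder for your own):

    // Fetch the bundle several times in a row and print elapsed
    // milliseconds, bypassing any browser cache entirely.
    var http = require('http');

    var url = 'http://yoursite.example/bundles/scripts'; // placeholder
    var runs = 5;

    function probe(n) {
        if (n === 0) return;
        var start = Date.now();
        http.get(url, function (res) {
            res.on('data', function () {}); // drain the body
            res.on('end', function () {
                console.log('HTTP ' + res.statusCode + ' in ' +
                            (Date.now() - start) + ' ms');
                probe(n - 1);               // sequential, like repeated reloads
            });
        }).on('error', function (err) {
            console.error(err.message);
        });
    }

    probe(runs);

If these times are consistently slow, the problem is on the server or the network, not in IE.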

Blank page on Azure

I have an application running in Azure (trial account). So far so good, everything has been nice, except for the long deploy times (10-15 minutes).
I've done a deploy recently and got a lot of weird bugs I cannot trace. For example, if I log in and thus a cookie is created (I use FormsAuthentication) all I get from the application is a blank page, as in, absolutely nothing is sent to the browser. The application works fine in the ASP.NET Web Dev Server, IIS Express, even the Azure Emulator!
What could be the issue? Searching the web hasn't been much help, with only a couple of unrelated issues.
I tried logging into the site (if I understood correctly from one of the comments, the URL is versulo.com) and I didn't get any blank page with a 404 status code.
However, there is another problem I spotted: your site seems to be implementing caching inappropriately. The main page, the one from which you trigger the login and which is dynamic in nature, contains an Expires header set to 5 minutes after the page's first load. That means that each call or redirect to that page within 5 minutes of the first load will be served from the browser's cache.
Because of that, after I login into your application I am redirected back to the home page which looks like I am not logged in. If I force a F5 refresh on the browser, then the page will indeed show me as logged in.
If instead of a refresh I try to login again (which is what I did in my first trials, since it looked like the login didn't work in the first time), I am getting an error page with the following message:
Sorry, there has been an error on the server.
500
The page looks like an application error page and even if it displays the 500 number, it is actually served with an HTTP 200.
So, while I am not 100% sure if this is also the cause of the problem described by you, you should remove the Expires headers from the dynamic pages your application is serving.
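You can verify the headers without a browser at all; for example, a quick Node sketch that prints the caching-related response headers of the main page:

    // Print the caching-related response headers for a page, to spot
    // an unwanted Expires on dynamic content.
    var http = require('http');

    http.get('http://versulo.com/', function (res) {
        console.log('Status: ' + res.statusCode);
        ['expires', 'cache-control', 'etag', 'last-modified'].forEach(function (h) {
            if (res.headers[h]) {
                console.log(h + ': ' + res.headers[h]);
            }
        });
        res.resume(); // discard the body
    }).on('error', function (err) {
        console.error(err.message);
    });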
This can be because you're combining Forms Authentication with multiple instances. Are you using multiple instances? If that's the case, could you:
Try to change it to 1 instance. Does this fix the issue?
Try to make the following change to the web.config (configure machineKey): http://msdn.microsoft.com/en-us/library/ff649308.aspx
some partial views are not rendered at all;
Do you mean some pages are working fine, but others are not? It would be better if you could point out a pattern in what's working and what's not. For now, please make sure all referenced assemblies (except for default .NET assemblies and the Windows Azure runtime) have Copy Local set to true. For example, MVC assemblies are considered extensions to .NET, so please set Copy Local to true for them. In addition, you can also use Fiddler to monitor the requests and see what's returned from the server.
Best Regards,
Ming Xu.
Could you provide a link to the application, or perhaps some source code?
When you say 'blank page', what is actually returned, a 404 / 500?
Have you inspected the IIS logs, or added some trace information to your code?
Have you tried accessing the service using its IP address rather than the domain name?

Sporadic page load failure

An issue has started recently in Chrome and reportedly Firefox: pages load fine and browsing is normal, then suddenly a page fails to load (it keeps spinning as if loading). The page that fails is often not the same one.
If I refresh the page or try to go to another page on the domain within the same browser, the browser doesn't even try to resolve the name or make a connection, and is then unable to load the page.
Swapping to another browser, I am back to browsing the domain normally again, while the original browser (in most cases Chrome) will not load the pages until a restart.
This has happened with 3 different people on 3 different machines in both Chrome and Firefox.
The domain it runs on makes a lot of AJAX calls on certain pages; I am not sure if the server is tripping out due to the number of requests from the one client.
I am not sure if this is a server, client, or script functionality issue, as I cannot personally reproduce it, so I can do little to debug or work out what is causing this or how to fix it.
As you can see, I am not sure of a lot with this problem :) so I am throwing it out to Stack Overflow in the hope that someone may have had similar experiences or any directions I could look towards.
Cheers,
Brendan
If the page is making many requests in a short time, your firewall (router) may block it. I've noticed this behavior on my own router, and had to set it to a less restrictive level to make things work.

How do I share Safari's NSURLCache store?

Background
I'm building an app that links recent web pages you've visited together.
To do this, I need to get the HTML for recent URLs using Cocoa.
Right now, I'm using an invisible WebView to do this.
As I understand it, if the URL isn't in the cache for my app, this is hitting web servers.
What I want
The chances are high that the URL I'm grabbing has already been cached by Safari as the page has already been visited.
I want my app to check Safari's cache for the URL first. If it's there, it should just use this data. If not, it should hit the web server and store the page in my app's cache.
I don't really want to have to parse the cache.db file from Safari using sqlite3 - I've no idea if this format will stay the same. I'm after something simpler and more high level.
What I've tried
I know that you can set up your own NSURLCache using the method initWithMemoryCapacity:diskCapacity:diskPath: but I don't want to try pointing this to the Safari cache in case it screws up Safari by writing to it.
Is there an easy, high level way of sharing the Safari cache?
UPDATE
Aha. I've just realised there may be a way to do this I've been missing.
I could make a new instance of NSURLCache with initWithMemoryCapacity:diskCapacity:diskPath:, point it at the Safari cache, then specify a cache policy of NSURLRequestReturnCacheDataDontLoad for the URL Request when loading the page.
When this fails, I could just try and load the page as normal. I'll try this out and update the question when I know more.
To be honest, you just can't do this.
Firstly, I'm pretty certain -[NSURLCache initWithMemoryCapacity:diskCapacity:diskPath:] won't work as you expect. It will instead blow away the old cache file to create its own, potentially highly upsetting Safari.
Secondly, NSURLCache is a composite cache. That is, it caches data first in memory, and then moves it out to disk at some point. So even if you could properly access Safari's cache file (which you can't), you'd only be able to access the older cached data, not the most recent.
