Firefox not re-validating static files from child frames

I'm working with an application that has an iframe - both the outer HTML body and the frame require certain JavaScript and CSS files. To cut down on load times, all of these static files have their expiry set to a year from now, so on normal page hits they should be loaded from cache - which is the behavior I see in IE8 and FF3.6.
However, once I reload/refresh (F5) the page, I expect the browsers to send 'If-Modified-Since' requests to the server for these files. IE8 sends the requests for all the files, used outside as well as within the iframe. But FF3.6 only sends the requests for files used outside the iframe (not for the files used within it - those it just loads from cache!).
The response headers are exactly the same for all files, regardless of whether they are in the iframe or not. Is there a reason for this behavior in FF? Any way to avoid it?
Note: I can append version parameters to the source, add a version folder to the path, etc. But I want to know whether this quirk can be avoided, or whether there is a good reason behind it.

Firefox behaves correctly - the server indicated that the scripts are good for a year, so there is no reason to send pointless requests that waste time, bandwidth and server resources. For debugging purposes you can hold the Shift key while clicking the Reload button; that makes sure all data is refreshed. For end users, however, adding version information to the URL (e.g. http://example.com/.../script.js?version=1.2.3) is probably the best solution. This makes sure that the cached version is used as long as it is valid, and that the new version is downloaded as soon as you update the script.
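As a rough sketch of that approach (the helper name and version scheme are mine, not from the answer), a server-side template helper can stamp every static URL with the current application version:

```javascript
// Hypothetical helper: stamp static asset URLs with the app version so a
// deploy busts the cache, while an unchanged version keeps hitting the cache.
const APP_VERSION = '1.2.3'; // assumed to be bumped on each release

function versioned(url) {
  const sep = url.includes('?') ? '&' : '?';
  return url + sep + 'version=' + APP_VERSION;
}

// In a template: <script src="${versioned('/static/script.js')}"></script>
// renders as:    <script src="/static/script.js?version=1.2.3"></script>
```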

Related

Send an entire web app as 1 HTTP response (html, js, css, images, ...)

Traditionally a browser will parse HTML and then send further requests to the server for all related data. This seems inefficient to me, since it might require a large number of requests, even though my server already knows that a browser that wants to use this web application will need all of its resources.
I know that js and css could be inlined, but that complicates server-side code, and encoding image data as base64 bloats its size... I'm also aware that rendering can start before all assets are downloaded, which would potentially no longer work (depending on the implementation). I still feel that streaming an entire application in one go should be faster on slow connections than making tens of separate requests.
Ideally I would like the server to stream an entire directory into one HTTP response.
Does any model for this exist?
Does the reasoning make sense?
ps: If browser support for this is completely lacking, I'm wondering about a 2-step approach: download a small JavaScript file which downloads a compressed web app file, extracts it and plugs the resources into the page. Is anyone already doing something like this?
Update
I found one: http://blog.another-d-mention.ro/programming/read-load-files-from-zip-in-javascript/
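For illustration, here is a minimal sketch of that two-step idea using the JSZip library (my assumption - the linked post rolls its own unzip code; bundle.zip, app.css and app.js are hypothetical names):

```javascript
// Sketch: fetch one compressed bundle, unpack it client-side with JSZip,
// and plug each resource into the page.
fetch('/bundle.zip')
  .then(function (r) { return r.arrayBuffer(); })
  .then(function (buf) { return JSZip.loadAsync(buf); })
  .then(function (zip) {
    return Promise.all([
      zip.file('app.css').async('string'),
      zip.file('app.js').async('string')
    ]);
  })
  .then(function (results) {
    var style = document.createElement('style');
    style.textContent = results[0];
    document.head.appendChild(style);

    var script = document.createElement('script');
    script.textContent = results[1];
    document.body.appendChild(script);
  });
```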
I started to research related issues in order to find the best results possible without changing web standards, and I wondered about caching. If I could send the last-modified date of every subresource of a page along with the initial HTML page, a browser could avoid sending If-Modified-Since requests once it has loaded every resource at least once. This would in effect be better than sending all resources with the initial request, since that would be beneficial only on the first load, and detrimental on subsequent loads, where it is better for browsers to use their cache (as Barmar pointed out).
Now it turns out that even with a web extension you cannot get hold of the If-Modified-Since header, so you certainly can't tell the browser to use the cached version instead of contacting the server.
I then found a post from Facebook on how they tried to reduce traffic by hashing their static files and giving them a 1-year expiry date. This means that the URL guarantees the content of the file. They still saw plenty of unnecessary If-Modified-Since requests, and they managed to convince Firefox and Chrome to change the behaviour of their reload buttons to no longer revalidate static resources. For Firefox this requires the new Cache-Control: immutable header; for Chrome it doesn't.
I then remembered that I had seen something like that before, and it turns out there is a solution to this problem which is more convenient than hashing the contents of resources and serving them from a database for at least ten years: just put a new version number in the filename. The even more convenient solution would be to just add a version query string, but it turns out that that doesn't always work.
Admittedly, changing your filenames all the time is a nuisance, because the files referencing them also need to change. However, the files themselves don't actually need to change: if you control the server, it can be as simple as writing a redirect rule so that logo.vXXXX.png is served as logo.png (where XXXX is the last-modified timestamp in seconds since the epoch)[1]. Then let your template system generate the timestamp automatically, as WordPress's wp_enqueue_script does - WordPress actually satisfies itself with the query-string technique. Now you can set the expiration date far in the future and use the immutable cache header. If browsers respect the cache control, you can safely ignore ETags and If-Modified-Since headers, since they are now completely redundant.
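As a minimal sketch of that redirect rule, assuming a Node/Express server (the answer itself is server-agnostic, and all paths and names here are illustrative):

```javascript
// Sketch: serve /static/logo.v1700000000.png as static/logo.png, with
// far-future, immutable caching. The .vXXXX segment exists only in URLs.
const express = require('express');
const app = express();

app.get(/^\/static\/(.+)\.v\d+\.(\w+)$/, (req, res) => {
  const realPath = req.params[0] + '.' + req.params[1]; // logo.v1700000000.png -> logo.png
  res.set('Cache-Control', 'public, max-age=31536000, immutable');
  res.sendFile(realPath, { root: 'static' });
});

app.listen(3000);
```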
This solution guarantees that the browser will never ask for cache validation, and yet you will never see a stale resource, without having to decide on the expiry date in advance.
It doesn't answer the original question about how to avoid multiple requests for a page's resources on a clean cache, but ever after (as long as the browser cache doesn't get cleared), you're good! I suppose that's good enough for me.
[1] You can even avoid the server overhead of checking the timestamp on every resource every time a page references it by using the version number of your application. In debug mode, for development, one can use the timestamp to avoid having to bump the version on every modification of the file.

MVC 3 with IE, poor bundle performance

We have deployed an MVC 3 website on an IIS6 box.
Everything runs fine, but the performance is abysmal.
Can anyone help me understand:
why I am getting 20-second response times when fetching a script bundle?
why bundled scripts are not cached by IE even though the Expires header is set?
The site is several times faster in Chrome (where I have noticed the cache behaviour is correct), but we cannot force customers to use it.
Any help would be great. I'm wondering whether a server-side setting is forcing the bundle to be recompiled on each request, or whether it's just IE being IE, as usual.
Edit: as requested in the comments, I'm also including the bundle request headers:
If the two browsers show different download times for a full reload, it could be that you are doing intense computations with a client-side framework like AngularJS (I have seen big performance differences between the two browsers in highly complex AngularJS apps).
If both browsers show the same download time, it is either a network issue or a server issue.
The IE caching could be a separate issue; break your problem into two parts and look for the cause of the slow downloads first.
All I can do now is suggest an approach to finding the issue.
Summary of what you know
It looks like you have:
Server sends an Expires header one year from now
When you reload the page (i.e. you don't force a full refresh using Ctrl+F5)
IE doesn't take any notice of the cache header, and when it sends its new request it doesn't use If-Modified-Since or If-None-Match
Chrome behaves differently and respects the Expires and/or ETag response headers (it doesn't even make the request again for the bundle).
EDIT 1: You also seem to be saying (though it would be good to see a timeline from Chrome) that Chrome downloads the files faster, implying it is not a server-side problem. However, your latest comment states that Chrome's downloads are also slow. (end edit)
And you also seem to be saying that this behaviour is consistent (i.e. 100 requests in IE, and 100 requests in Chrome show the above behaviour with no deviations).
Approach
You should break this problem into two parts:
1. Why is the download so slow?
Is there a server-side performance problem? Look for common download times in IE, Chrome, and Firefox (it could be due to bundling/minification/compression on the server).
Is there a network connectivity issue (dropped packets, for instance)? Look for inconsistent download times, Start times, and Request times between requests in a given browser, and for the same behaviour across all browsers.
Is a script slowing down IE but not Chrome? (This is not uncommon; I maintain legacy sites where the scripts don't run well in IE but do in Chrome.) Look at the differences in profile results between browsers.
2. Why is the JavaScript not being cached in IE? Troubleshoot (1) first, then worry about this.
It is possible that the two are related, but approaching them separately will be a start. Number 1 is far easier to diagnose than number 2; the top references on the web to caching JavaScript in IE are about preventing it, in order to help with development.
Root cause diagnosis
EDIT 1 The first thing to do is try the site from a browser on the server, or very close to the server to see if you have a network issue. (end edit)
Tools like Fiddler, the browser developer tools (timeline and script profiler), and YSlow are your friends. Compare each of the following between Chrome and IE (and see what happens in Firefox as well) and spot the difference. Note: you may need to clear the browser cache between tests.
browser developer tools -> script profiler: see if you have a slow-running script in IE compared to Chrome
similar analysis in a tool like YSlow (look for comparisons between the two browsers, not script improvements)
request and response headers, and timeline, from a normal (i.e. not full reload) page load
request and response headers, and timeline, from a full page reload (Ctrl+F5)
Start and Request durations for every js file, within a given browser and between browsers (these may point to network issues). I note that the Start and Request alone are taking 0.6s and 1s each in IE - that is very, very poor performance.
5 requests and 5 full reloads, clearing the cache in between (that is, don't chase a ghost - be consistent in your test methodology)
Download times should be no different between Chrome and IE when no scripts are actually running, so also add a control test. Assuming that your bundle files don't "do anything" (i.e. they contain functions that the page calls rather than kicking off long processes by themselves), create a blank page in your site which references exactly the same JavaScript files - not just the bundle, but every single js reference.
With the control test you can compare pure download times and caching behaviour in IE to Chrome, without any client-side JavaScript running (use the developer tools profiler to verify that no scripts are running). If your bundle files do kick off long-running things, temporarily disable them by putting return statements at the top of the scripts, and concentrate only on the download into the browser.

How can I detect during file upload if the file has moved or been deleted?

I have an ASP.NET MVC web page that has a file upload control. Under rare conditions the file referenced by the user moves or is deleted on the filesystem prior to the user triggering the post to the page. In IE9 the page successfully posts but the ContentLength is zero (expected) and can be handled server-side. However in Firefox I find that the POST action never reaches the server.
Is there any way to detect that the file reference is still valid prior to posting the page? Or a way to detect that an error occurred client-side during the POST due to the moved/deleted file?
Using just input type="file", you have no way to check whether the file actually exists until an upload attempt is made. Emerging functionality like FileReader (not yet available in all browsers) should make the upload process far smoother as browsers mature, and will make detecting this situation simpler.
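Where FileReader is available, a sketch like the following (element IDs are hypothetical, and this only works in browsers supporting the File API) can try to read a byte of the selected file just before submitting, and catch the error if the file has since moved or been deleted:

```javascript
// Sketch: before posting, attempt to read the first byte of the chosen file.
// If it was moved or deleted after selection, onerror fires instead.
function checkFileStillExists(input, onOk, onMissing) {
  var file = input.files && input.files[0];
  if (!file || !window.FileReader) { onOk(); return; } // can't verify, just proceed

  var reader = new FileReader();
  reader.onload = onOk;
  reader.onerror = onMissing; // NotFoundError/NotReadableError land here
  reader.readAsArrayBuffer(file.slice(0, 1));
}

document.getElementById('uploadForm').onsubmit = function (e) {
  e.preventDefault();
  var form = this;
  checkFileStillExists(form.querySelector('input[type=file]'),
    function () { form.submit(); }, // submit() does not re-fire onsubmit
    function () { alert('The selected file is no longer available.'); });
};
```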
If you use an Ajax style upload process, you could initiate the upload right away to help prevent the issue from occurring in the first place.
Or, a bit hacky: one idea for Firefox would be to add a setTimeout in the onsubmit event that fires after a second and checks whether the upload started (by querying the server via Ajax against a JsonResult action/function that can quickly report whether an upload has started, etc.). It's a bit messy, though, as you'll need to worry about timing issues - and it may be overkill just to handle the cases where this occurs.

Chrome caching like a mad browser

I've got a web service that, like most others, uses js and css files. I use the old trick of appending a version number to the js and css files, like ?v=123, which gets changed every time we update the service in production.
Now, this works fine in all browsers except Chrome. Chrome seems to prefer its cached version over getting the new one, and therefore seems to ignore the appended variable. In some cases, forcing a cache refresh (cmd+r / ctrl+f5) wasn't enough, so I had to go into the options and clear the cache for it to load the new content.
Has anyone experienced this issue with Chrome? And if so, what was the resolution to the problem?
Chrome should certainly treat requests with varying query strings as different requests; a cached result for style.css?v=123 should never be used for style.css?v=124. If you're seeing different behavior, please file a bug at http://new.crbug.com/ and post the bug ID here.
That said, I'd first check to see whether the page was cached longer than you expected. If a new version of the page itself wasn't downloaded, then it would still be requesting ?v=123 as the HTML wouldn't have changed. If you're sending long-lived cache headers with the page, it's certainly possible that Chrome is caching it more aggressively than you expected. If that's the behavior you're seeing, please star http://crbug.com/8742 for updates.
I had the same experience.
You can use Ctrl + Shift + R for a cache-free reload in both Chrome and Firefox.
I have had this experience as well.
I run a membership site which displays content such as "You must be logged in as a Gold member in order to see this content" if the user is not logged in or is trying to view content not allowed by their membership level. But even when the user is logged in, they would still see "You need to log in", due to Google Chrome's aggressive caching. In Firefox, however, it works fine as I test logging in and out of all 5 levels of membership - each displaying the proper content.
While Chrome's caching problem can be worked around by clearing the cache every time the user logs in or out, it would be really annoying to take that approach.
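A less annoying fix is to mark the membership-gated pages themselves as uncacheable. A sketch, assuming an Express-style server (route and handler names here are hypothetical):

```javascript
// Sketch: stop any browser from reusing a cached copy of pages whose
// content depends on the current login state.
const express = require('express');
const app = express();

function noCache(req, res, next) {
  res.set('Cache-Control', 'no-store, no-cache, must-revalidate');
  res.set('Pragma', 'no-cache'); // for old HTTP/1.0 intermediaries
  res.set('Expires', '0');
  next();
}

app.get('/members/*', noCache, renderMemberPage); // renderMemberPage is assumed
```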

Are there any diagnostics tools for troubleshooting content delivery with Opera Mini?

I have an application that targets a wide variety of devices and platforms. The application can render different HTML based on the type of client. However, due to the complexity of the application, it shares a considerable number of JavaScript libraries that rely on async and Ajax method calls.
One of the targets for the application is Opera Mini. This "sort of" works, but it seems that when Opera's servers build up the specialized markup to send down to the Opera Mini JVM client, they sometimes do not wait until the async calls are complete. Are there any techniques or tools for seeing what's going on with the Opera server-side processing of the page (not my application web server), to determine what I can do to make this solid?
It would appear, after further investigation, that the server-side browser is fairly picky when it comes to CSS. I can't remember the exact problem, but as soon as I removed the stylesheet all content was displayed properly. At that point I slowly re-introduced the CSS, and everything came back online and worked as expected.
Your JavaScript will only be allowed a short time before it is aborted:

JavaScript running on the Mini server will only run for a couple of seconds before pausing, for resource constraint reasons. This applies to JavaScript run due to an event firing, e.g. onload, as well as code run because of a user action.

~ http://dev.opera.com/articles/view/opera-mini-web-content-authoring-guidelines/#javascript
So the best approach would be to serve the least JavaScript-heavy version of your site to the Opera Mini user agent.
You can type server:source in the address bar once a page is loaded if you want to see the current DOM tree.
It's also possible to post that source to a script on your server using server:source?post=http://your.server.com/script. It will send three fields as a POST request: url, host and html. You can then make your script save it to a file.
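For example, here is a small Node sketch of such a script (my own illustration; any server-side language works), assuming the three fields arrive as an ordinary form-encoded POST:

```javascript
// Sketch: accept the POST sent by server:source?post=... (fields: url,
// host, html) and save the DOM snapshot to a file.
const express = require('express');
const fs = require('fs');
const app = express();

app.use(express.urlencoded({ extended: false }));

app.post('/script', (req, res) => {
  const name = 'source-' + Date.now() + '.html';
  fs.writeFileSync(name, '<!-- ' + req.body.url + ' (' + req.body.host + ') -->\n' + req.body.html);
  res.send('saved as ' + name);
});

app.listen(3000);
```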
(Answering an old question in case it helps someone.)
