How to measure browser cache efficiency of web assets - performance

Do you (smart guys :) have any idea what the best way is to measure the cache efficiency of the assets used on a website (JS, CSS, fonts, etc.)?
How should I decide whether it's better for my website to (for example) put all my JS into one file or to split it into several smaller files?
Many websites use one, two, or three big files, but Facebook, for example, uses many, many small files. Facebook clearly has a lot of returning visitors, so that strategy works better for them. But how do I measure it?
I know I can check the returning/new visitor ratio in GA, but that's not a very deep check. It still doesn't tell me which users arrived at my website with some of my assets already in cache and which didn't.

My suggestion would be to run an A/B test. Implement several solutions, measure the impact on your users' loading speed, look at the 98th or 99th percentile, and you'll see which works better.
Looking at returning/new visitor data is a good approach, but it isn't reliable if you don't have many users.
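If you want a more direct signal than analytics, the browser's Resource Timing API can tell you, per visitor, which assets came out of the cache: for same-origin resources (or cross-origin ones that send a Timing-Allow-Origin header), a transferSize of 0 together with a non-zero decodedBodySize means the file was served from cache. A minimal sketch; the /analytics/cache-stats endpoint is a hypothetical collection URL:

    // Rough cache-hit detection via the Resource Timing API.
    // Note: transferSize is reported as 0 for cross-origin resources unless the
    // server sends a Timing-Allow-Origin header, so filter accordingly.
    var entries = performance.getEntriesByType('resource');
    var stats = entries
      .filter(function (e) { return /\.(js|css|woff2?)(\?|$)/.test(e.name); })
      .map(function (e) {
        return {
          url: e.name,
          fromCache: e.transferSize === 0 && e.decodedBodySize > 0,
          duration: Math.round(e.duration)
        };
      });
    // Ship the per-visitor stats to your own collection endpoint (hypothetical URL).
    navigator.sendBeacon('/analytics/cache-stats', JSON.stringify(stats));

Aggregated over enough visits, this tells you directly what share of page views hit the cache for each asset, which is exactly the number the new/returning split in GA only approximates.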
Facebook uses many files partly because they also use HTTP/2 (which grew out of SPDY). It solves many of the problems of delivering multiple resources; for example, the protocol compresses HTTP headers, not just bodies as good old HTTP/1.1 does.
Look at the other benefits of HTTP/2.
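As a quick check, you can also ask the browser which protocol your assets were actually delivered over; PerformanceResourceTiming exposes this as nextHopProtocol ('h2' means HTTP/2). A small sketch:

    // List which protocol each resource was fetched over ('h2' = HTTP/2,
    // 'http/1.1' = plain old HTTP/1.1). May be empty for some cross-origin resources.
    performance.getEntriesByType('resource').forEach(function (e) {
      console.log(e.nextHopProtocol || 'unknown', e.name);
    });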

Related

Use Google-hosted jQuery UI or self-host a custom download of jQuery UI?

I'm working on a site where we are using the slide function from jQuery UI.
The Google-hosted minified version of jQuery UI weighs 63KB - this is for the whole library. The custom download of just the slide function weighs 14KB.
Obviously, if a user has already cached the Google-hosted version it's a no-brainer, but if they haven't, it will take longer to load than if I just lumped the custom jQuery UI slide function into my main.js file.
I guess it comes down to how many other sites use jQuery UI (if this were just plain jQuery, the above would be a no-brainer, as loads of sites use jQuery, but I'm less sure about how widely jQuery UI is used)...
I can't work out what the best thing to do is in the above scenario.
I'd say that if the custom selective build is that small, both absolutely and relatively, there's good reason to choose that path.
Loading a JavaScript resource has several implications, in the following order of events:
Loading: request/response communication or, in the case of a cache hit, fetching from the cache. Keep in mind that, CDN or not, the network round trip only affects the first page load. If your site is built in the traditional "full page request" style (as opposed to SPAs and the like), this largely becomes a non-issue.
Parsing: The JS engine needs to parse the entire resource.
Executing: The JS engine executes the entire resource. That means that any initialization / loading code is executed, even if that's initialization for features that aren't used in the hosting page.
Memory usage: The memory footprint depends on the entire resource. That includes static objects as well as functions (which are also objects).
With that in mind, a smaller resource is advantageous in ways that go beyond simple loading. What's more, a request for such a small resource is negligible in terms of communication; you wouldn't think twice about it if it were a small version of the company logo sitting at the bottom of the screen where nobody even notices it.
As a side note and potential optimization, if your site serves any proprietary library, or a group of less common libraries, you can bundle all of these together, including the jQuery UI subset, and your users will only have a single request, again making this advantageous.
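As a rough illustration of that bundling idea, a few lines of Node are enough to concatenate your proprietary code with the jQuery UI subset into a single vendor file. The file names below are hypothetical, and a real build tool would also minify and fingerprint the output:

    // build-vendor.js - naive concatenation of vendor scripts (hypothetical paths).
    const fs = require('fs');

    const parts = [
      'vendor/jquery-ui.slide.custom.min.js', // the 14KB selective build
      'vendor/some-less-common-lib.min.js',
      'src/our-proprietary-lib.js'
    ];

    const bundle = parts
      .map((file) => fs.readFileSync(file, 'utf8'))
      .join(';\n'); // the semicolon guards against files that omit a trailing one

    fs.writeFileSync('dist/vendor.bundle.js', bundle);
    console.log('Wrote dist/vendor.bundle.js (' + bundle.length + ' bytes)');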
Go with the Google hosted version
It is likely that the user has recently visited another website that loads jQuery UI from Google's servers.
It will take load off your server and let other elements load faster.
Browsers only download a limited number of resources concurrently from one domain. Loading jQuery UI from Google's servers ensures it is downloaded in parallel with the resources that live on your own server.
The Yahoo Developer Network recommends using a CDN. Their full reasons are posted here.
https://developer.yahoo.com/performance/rules.html
This quote from their site really seals it in my mind.
"Deploying your content across multiple, geographically dispersed servers will make your pages load faster from the user's perspective."
I am not an expert, but here are my two cents anyway. With a CDN you can count on reduced latency, plus, as mentioned, the user has most likely already picked the file up from some other website that serves it from Google. Also, the thing I always care about: it saves bandwidth.
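If you do go with the Google-hosted file, a common belt-and-braces pattern is to fall back to a local copy when the CDN copy fails to load. This sketch assumes a hypothetical local path /js/jquery-ui.custom.min.js and runs right after the CDN script tag:

    // Fallback loader: if the Google-hosted jQuery UI didn't load (no jQuery.ui),
    // inject a local copy instead. The local path is a hypothetical example.
    (function () {
      if (window.jQuery && window.jQuery.ui) { return; } // CDN copy arrived fine
      var s = document.createElement('script');
      s.src = '/js/jquery-ui.custom.min.js';
      document.head.appendChild(s);
    })();

That way you keep the cross-site caching upside of the CDN without making your slider depend on a third party being reachable.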

Frontend-performance-wise, is it better to combine Twitter Bootstrap and Font Awesome into existing files or load them from a CDN?

Two frontend performance best practices collide:
For example, here: https://developer.yahoo.com/performance/rules.html
Minimize HTTP Requests
If we follow this practice, we combine all CSS and JS into one file each, even the vendor ones, and maybe put them on a CDN, but then we don't benefit from a popular URL that visitors' browsers are likely to have cached already.
Use a Content Delivery Network
If we follow this one, we pull Twitter Bootstrap, Font Awesome (and other libraries I use, like jQuery) from a popular CDN such as http://www.bootstrapcdn.com or Google's, but then we make multiple HTTP requests.
For popular frontend CSS/JS, we cannot follow both rules.
In that case, which rule should take priority?
If many websites link to the same CDN, the extra HTTP requests won't hurt your site's performance much, since the visitor's browser has most probably already cached the CDN-hosted files while visiting another site that uses the same CDN, so it doesn't have to download them again.
The advantages of using a CDN over fewer HTTP requests include:
- Reduced load on your own server
- Faster content delivery
- Very high availability
- Support for more concurrent users
- More control over asset delivery
So, assuming the CDN is "popular" and heavily linked to, then yes, it can lead to better site performance than merging your styles and/or scripts together for fewer HTTP requests.
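If you're worried about depending on a third-party CDN, you can still take the caching benefit and keep a safety net: after the CDN link tags, probe for a rule you know the framework defines and load a local copy only if it's missing. A sketch assuming Bootstrap 3's .sr-only rule and a hypothetical local path; run it once the DOM is ready:

    // Check whether Bootstrap's CSS actually applied; if not, fall back to a
    // local copy (path is a hypothetical example).
    (function () {
      var probe = document.createElement('span');
      probe.className = 'sr-only';           // a class Bootstrap 3 defines
      document.body.appendChild(probe);
      var loaded = window.getComputedStyle(probe).position === 'absolute';
      document.body.removeChild(probe);
      if (!loaded) {
        var link = document.createElement('link');
        link.rel = 'stylesheet';
        link.href = '/css/bootstrap.min.css';
        document.head.appendChild(link);
      }
    })();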

How effective is AJAX crawling for SEO compared to server-side generated websites?

I'm looking for real-world experiences with AJAX crawling:
http://code.google.com/web/ajaxcrawling/index.html
I'm particularly concerned about the infamous Gizmodo failure of late. I know I can find them via Google now, but it's not clear to me how effective this AJAX crawling method is compared to server-side generated sites.
I would like to make a wiki that lives mostly on the client side and is populated with JSON via AJAX. It just feels more fluid, and I think it would be a plus over my competition (Wikipedia, Wikimedia).
Obviously, for a wiki it's incredibly important to have working SEO.
I would be very happy to hear about any experiences you have had with client-side development.
My research shows that the general consensus on the web right now is that you should absolutely avoid AJAX-only sites unless you don't care about SEO (for example, a portfolio site, a corporate site, etc.).
Well, these SEO problems arise when you have a single page that loads content dynamically based on sophisticated client-side behavior. Spiders aren't always smart enough to know when content is being injected with JavaScript, so if they can't follow links to reach your content, most of them won't understand what's going on in a predictable way and thus won't be able to fully index your site.
If you could have the option of unique URLs that lead to static content, even if they all route back to a single page by a URL rewriting scheme, that could solve the problem. Also, it will yield huge benefits down the road when you've got a lot of traffic -- the whole page can be cached at the web server/proxy level, leading to less load on your servers.
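One way to get those unique URLs while still loading content with AJAX is the HTML5 History API (a different approach from the hash-bang scheme in the question): each article gets a real path that your server can also render or rewrite back to the single page, and pushState keeps the address bar in sync. A minimal sketch, assuming a hypothetical /api/article/ JSON endpoint and a #content element:

    // Fetch and render an article from a hypothetical JSON endpoint.
    function renderArticle(slug) {
      return fetch('/api/article/' + encodeURIComponent(slug))
        .then(function (res) { return res.json(); })
        .then(function (article) {
          document.getElementById('content').innerHTML = article.html;
        });
    }

    // Navigate to an article: render it and give it a real, crawlable URL.
    function showArticle(slug) {
      renderArticle(slug).then(function () {
        history.pushState({ slug: slug }, '', '/wiki/' + slug);
      });
    }

    // Re-render (without pushing a new history entry) on back/forward navigation.
    window.addEventListener('popstate', function (e) {
      if (e.state && e.state.slug) { renderArticle(e.state.slug); }
    });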
Hope that helps.

How many rewrite rules should I expect to manage?

I'm dealing with a hosting team that is fairly skittish about managing many rewrite rules. What are your experiences with the number of rules your sites currently manage?
I can see dozens (if not more) coming up as the site grows and contracts, and I need to set the expectation that this isn't out of the norm.
Thanks
It is normal to have a lot of rewrite rules for your site. As the site gets bigger, the number of pages you need to rewrite grows, and depending on what the pages do, you could have multiple rewrites per page. It also depends on how secure you are making your site; more security means more precautions.
To quote one description of mod_rewrite: "The mod_rewrite module gives you the ability to transparently redirect one URL to another, without the user's knowledge. This opens up all sorts of possibilities, from simply redirecting old URLs to new addresses, to cleaning up the 'dirty' URLs coming from a poor publishing system — giving you URLs that are friendlier to both readers and search engines."
So pretty much it's at your discretion: how secure do you want it to be?

Mixing Secure and Non-Secure Content on Web Pages - Is it a good idea?

I'm trying to come up with ways to speed up my secure website. Because there are a lot of CSS images to load, the site can be slow, since secure resources are not cached to disk by the browser and must be retrieved more often than they really need to be.
One thing I was considering is moving style-related images and JavaScript libraries to a non-secure subdomain so that the browser could cache these resources, which don't pose a security risk (a gradient isn't exactly sensitive material).
I wanted to see what other people thought about doing something like this. Is this a feasible idea or should I go about optimizing my site in other ways like using CSS sprite-maps, etc. to reduce requests and bandwidth?
Browsers (especially IE) get jumpy about this and alert users that there's mixed content on the page. We tried it and had a couple of users call in to question the security of our site. I wouldn't recommend it. Having users lose their sense of security when using your site is not worth the added speed.
Do not mix content; there is nothing more annoying than having to go and click the Yes button on that dialog. I wish IE would let me always choose to show mixed-content sites. As Chris said, don't do it.
If you want to optimize your site, there are plenty of other ways; if SSL overhead is the only thing left, buy a hardware accelerator. Hmm, if you load an image over HTTP, will it be served from the cache when you later load it over HTTPS? Just a side question that I need to go find out.
Be aware that in IE 7 there are issues with mixing secure and non-secure items on the same page, so this may result in some users not being able to view all the content of your pages properly. Not that I endorse IE 7, but recently I had to look into this issue, and it's a pain to deal with.
This is not advisable at all. The reason browsers give you so much trouble about insecure content on secure pages is that it exposes information about the current session and leaves you vulnerable to man-in-the-middle attacks. I'll grant there probably isn't much a third party could do to sniff sensitive info if the only insecure content is images, but CSS can contain references to JavaScript/VBScript via behavior files (IE). If your JavaScript is served insecurely, there isn't much that can be done to prevent a rogue script from scraping your web page at an inopportune time.
At best, you might be able to get away with iframing the secure content to keep the look and feel. As a consumer I really don't like it, but as a web developer I've had to do it before when there were no other pragmatic options. Frankly, though, that approach has just as many defects, if not more: after all, you're hoping nothing compromises the integrity of the insecure page so that it keeps hosting the real secure content rather than something substituted.
It's just not a great idea from a security perspective.
