I have a lot of experience with YUI2 and I'm getting up to speed on YUI3. The service I'm writing needs HTTPS, but the vanilla YUI experience loads from Yahoo's HTTP-only CDN, which quietly fails in Chrome and loudly fails in modern IE when the browser tries to mix an HTTPS page with HTTP JavaScript.
My goals are to get all of:
Site uses HTTPS
YUI works in Chrome & IE (so scripts also must be delivered over SSL)
Uses a modern version of YUI 3 (this disqualifies the YUI PHP Loader, which hasn't been updated to support even YUI 3.4, while 3.8 is "current")
Use roll-up combos for speed instead of many separate JS and CSS files (this disqualifies Google's CDN... if YUI 3 is even hosted there, which I couldn't confirm)
Site dynamically loads YUI dependencies (dependencies change regularly as I add functionality, going back to the configurator and saving a new bundle every time is a PITA)
The obvious solution appears to be to give up goal #5 and just self-host combos.
How can I meet all 5 goals?
The easiest way to solve it is to change base URL from
http://yui.yahooapis.com/ to
https://yui-s.yahooapis.com/
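For example, a minimal sketch of a page that loads the seed over HTTPS and then lets YUI combo-load everything else from the secure host (the 3.8.0 version string and the node module are just placeholders):

    <script src="https://yui-s.yahooapis.com/3.8.0/build/yui/yui-min.js"></script>
    <script>
    YUI({
        comboBase: 'https://yui-s.yahooapis.com/combo?',
        root: '3.8.0/build/',
        combine: true
    }).use('node', function (Y) {
        // all further modules arrive as HTTPS combo requests,
        // so dynamic loading and roll-ups both keep working
        Y.one('body').addClass('yui-loaded');
    });
    </script>

That keeps all five goals intact: the only change from the stock setup is the host name.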
Depending upon your server environment, you have a couple of options.
Development
Download the latest YUI library, and upload the yui/build/ folder to your server. The seed file should work fine without modification, though you won't be able to take advantage of combo loading.
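As a sketch, assuming you uploaded the build folder to /yui/build/ on your own server:

    <script src="/yui/build/yui/yui-min.js"></script>
    <script>
    // combine is off because a plain file server has no combo handler
    YUI({ base: '/yui/build/', combine: false }).use('node', function (Y) {
        Y.one('#demo').set('text', 'YUI served from the same HTTPS origin');
    });
    </script>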
Production
Use the YUI Configurator to determine all the files that you will need for each module set, and download them manually from the combo links provided. Rename them to something suitable like yui3.8.0-node-rollup.js and serve these to your users.
Be advised that if you use different module sets for different scripts, you may need to produce multiple sets of files from this process, depending upon how you set it up. There are also existing questions about concatenating JavaScript files together, if you are curious.
As an addendum, in my past research, I discovered that pulling external libraries over a secure connection may not be a safe idea.
Related
I'm working on a site where we are using the slide function from jquery-ui.
The Google-hosted minified version of jquery-ui weighs 63KB - this is for the whole library. The custom download of just the slide function weighs 14KB.
Obviously if a user has already cached the Google-hosted version it's a no-brainer, but if they haven't, it will take longer to load than if I just lumped the custom jquery-ui slide function inside my main.js file.
I guess it comes down to how many other sites use jquery-ui (if this were just for regular jquery, the above would be a no-brainer, as loads of sites use jquery, but I'm unsure about how widely jquery-ui is used)...
I can't work out the best thing to do in the above scenario.
I'd say if the custom selective build is that small, both absolutely and relatively, there are good reasons to choose that path.
Loading a JavaScript resource has several implications, in the following order of events:
Loading: Request/response communication or, in the case of a cache hit, fetching from the cache. Keep in mind that, CDN or not, the communication only affects the first page. If your site is built in a traditional "full page request" style (as opposed to SPAs and the like), this becomes a non-issue.
Parsing: The JS engine needs to parse the entire resource.
Executing: The JS engine executes the entire resource. That means that any initialization / loading code is executed, even if that's initialization for features that aren't used in the hosting page.
Memory usage: The memory usage depends on the entire resource. That includes static objects as well as functions (which are also objects).
With that in mind, having a smaller resource is advantageous in ways beyond simple loading. More so, a request for such a small resource is negligible in terms of communication. You wouldn't even think twice about it had it been a mini version of the company logo somewhere on the bottom of the screen where nobody even notices.
As a side note and potential optimization, if your site serves any proprietary library, or a group of less common libraries, you can bundle all of these together, including the jQuery UI subset, and your users will only have a single request, again making this advantageous.
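A minimal sketch of that bundling step, done with a Node build script (every file name here is hypothetical):

    // build.js - concatenate the jQuery UI subset with your own scripts
    var fs = require('fs');
    var files = [
        'lib/jquery-ui.custom.min.js', // the 14KB selective build
        'lib/our-widgets.js',          // a proprietary library
        'src/main.js'
    ];
    var bundle = files.map(function (f) {
        return fs.readFileSync(f, 'utf8');
    }).join(';\n'); // the semicolon guards against unterminated statements
    fs.writeFileSync('dist/bundle.js', bundle);

Run it as part of your deploy, and the page only ever requests dist/bundle.js.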
Go with the Google hosted version
It is likely that the user would have recently visited a website that loads jQuery-UI hosted on Google servers.
It will take load off from your server and make other elements load faster.
Browsers only open a limited number of concurrent connections per domain. Loading jQuery-UI from Google's servers ensures it is downloaded in parallel with the resources that reside on your own servers.
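For example, with a local fallback in case the CDN is unreachable (assumes jQuery itself is already loaded; the version number and fallback path are placeholders):

    <script src="https://ajax.googleapis.com/ajax/libs/jqueryui/1.10.4/jquery-ui.min.js"></script>
    <script>
    // if the CDN failed, fall back to a copy on our own server
    window.jQuery && jQuery.ui || document.write('<script src="/js/jquery-ui.min.js"><\/script>');
    </script>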
The Yahoo developer network recommends using a CDN. Their full reasons are posted here.
https://developer.yahoo.com/performance/rules.html
This quote from their site really seals it in my mind.
"Deploying your content across multiple, geographically dispersed servers will make your pages load faster from the user's perspective."
I am not an expert, but here are my two cents anyway. With a CDN you can be sure latency is reduced, and, as mentioned, the user has most likely already picked the file up from some other website served by Google. Also, the thing I always care about: it saves bandwidth.
DocPad is described as being comparable to other static site generators, but it is also described as being "not limited to static site generation".
I've been browsing the DocPad website and other documentation and haven't yet been able to find anything that seems to explain how to incorporate dynamic content, and what types of limitations may be involved?
As a relative beginner, I am wondering if anyone can help me better understand the methodology whereby dynamic content would be incorporated into DocPad...? e.g. AJAX, and dynamic server-side scripts for doing things like dynamically loading pictures from Flickr into a webpage when a certain tag is clicked...
Thanks.
So there's a few ways DocPad facilitates dynamic content:
Via the regenerateEvery configuration option. This regenerates your website at whatever interval you specify (see the configuration sketch after this list). It is great when combined with plugins like feedr for pulling in data from remote feed sources (like your latest social activity), as well as repocloner, which clones out and keeps a git repository up to date inside your project. The benefit of this option is that it's really easy to do and provides the illusion of a dynamic website. For instance, the Benjamin Lupton Website applies this method to keep the statistics on the home page, as well as the social data in the sidebar, up to date. Every hour it regenerates with the latest information, making it fast and seemingly dynamic.
Via the dynamic meta-data property. When set to true, this tells the DocPad server to re-render that document on each request, rather than just once. This works great inside the Kitchensink Skeleton for search pages and misc forms. This way is most similar to PHP development.
Via the serverExtend event. This event allows you to hook into and extend the DocPad server, adding extra server-side logic, handling, etc. Common use cases are adding extra routing to your server to handle route aliases, adding form processing such as a contact form, or adding a RESTful interface for a Backbone.js application. The DocPad Website uses this to add extra routing and a regenerate post-receive hook for the documentation. The NodeChat Skeleton uses this to add the Socket.io server-side logic.
Via the API. This way is the most involved but can be quite rewarding if you just wish for DocPad to be a small part of an existing node.js application. With this, you can create a DocPad instance in your code and interact with it. The grunt-docs grunt task utilises this :)
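As a rough sketch of the first and third options in a JavaScript configuration file (DocPad configs are more often written in CoffeeScript, and the contact route below is hypothetical):

    // docpad.js
    module.exports = {
        // option 1: regenerate the whole site every hour
        // (assuming the interval is given in milliseconds)
        regenerateEvery: 1000 * 60 * 60,
        events: {
            // option 3: extend the Express server DocPad runs
            serverExtend: function (opts) {
                var server = opts.server;
                server.post('/contact', function (req, res) {
                    // hypothetical form handler
                    res.send('Thanks for getting in touch!');
                });
            }
        }
    };

Option 2, by comparison, needs no configuration at all: it is just a dynamic: true line in the meta-data section at the top of the document.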
I'd like to speed up my site's loading time in part by ensuring all CSS/JS is being cached by the browser, as recommend by Google's PageSpeed tool. But I'd like to ensure that visitors have the latest CSS/JS files, if they are updated and the cache now contains old code.
From my research so far, appending something like "?459454" to the end of the CSS/JS url is popular. But wouldn't that force the visitor's browser to re-download the CSS/JS file every time?
Is there a way to set the files to be cached by the browser, but ensure the browser knows about updated versions of the cached files?
If you're using Apache, you can use mod_pagespeed (mentioned earlier by symcbean) to do this automatically.
It would work best if you also use the ModPagespeedLoadFromFile directive since that will create a new URL as soon as it detects that the resource has changed on disk, however it will work fine without that (it will use the cache expiry time returned when it fetches the resource to rewrite it).
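For reference, a sketch of that directive in the Apache config (the URL prefix and directory are placeholders):

    ModPagespeedLoadFromFile "http://example.com/static/" "/var/www/static/"

With that mapping in place, mod_pagespeed reads the files straight from disk instead of fetching them over HTTP, so it notices changes immediately.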
If you're using nginx, you could use ngx_pagespeed.
If you're using IIS, you could use IISpeed, which is not a Google product, and I don't know its full feature set.
Version numbers will work, but you can also append a hash of the file to the filename with your web framework or asset build script:
<script src="script-5054a101c8b164cbfa570d97fe23cc0d.js"></script>
That way, once your HTML changes to reflect this new version, browsers will just download and cache the updated version of your script.
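A minimal sketch of producing such a name at build time with Node (file names are placeholders):

    // hash-asset.js
    var fs = require('fs');
    var crypto = require('crypto');
    var source = fs.readFileSync('script.js');
    var hash = crypto.createHash('md5').update(source).digest('hex');
    fs.writeFileSync('script-' + hash + '.js', source);
    // print the name so the HTML template can reference it
    console.log('script-' + hash + '.js');

Because the name only changes when the content does, you can serve these files with a far-future Expires header and still be sure nobody runs stale code.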
As you say, append a query string to the URL of the asset, but only change it if the content is different, or change it when you deploy a new version.
appending something like "?459454" to the end of the CSS/JS url is popular. But wouldn't that force the visitor's browser to re-download the CSS/JS file every time?
No, it won't force them to download it each time. However, there are a lot of intermediate proxies out there which ignore query strings on cacheable content, hence many tools (including mod_pagespeed, which does automatic URL rewriting based on file contents and content merging on the fly, along with lots of other cool tricks) move the version information into the path / filename.
If you've only got .htaccess-type access, then you can strip the version information out to map directly to a file, or use a scripted 404 redirector (but this is probably only a good idea if you're behind a caching reverse proxy).
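A sketch of that .htaccess rewrite, assuming version numbers embedded in names like style.459454.css:

    RewriteEngine On
    # serve style.459454.css (any run of digits) from the real file style.css
    RewriteRule ^(.*)\.[0-9]+\.(css|js)$ $1.$2 [L]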
Are there any disadvantages to using AJAX?
No integration with the browser's history.
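The classic workaround for that era is to mirror Ajax state into location.hash so back/forward keep working; a sketch, with a hypothetical loadContent function:

    window.onhashchange = function () {
        // re-load the view named after the #, so the back button works
        loadContent(location.hash.slice(1)); // loadContent is hypothetical
    };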
If you build a site that requires Ajax to see content and perform tasks, you have several major problems. Ajax-only content/functions are invisible/unavailable to:
search bots
many mobiles
people with Javascript turned off
etc etc.
However, if you build a site using the progressive enhancement principle, those problems are solved, and you still get to serve nice-to-use Ajax to most users.
Progressive enhancement involves first creating your site using bare-bones (X)HTML, on REST-like principles (at least to the extent of requiring POST requests for state changes). Simple semantic markup; forget about CSS and Javascript.
Step one is to get that right, and have your entire site (or as much of it as makes sense) working nicely this way for search bots and Lynx-like user agents.
Then add a visual layer: CSS/graphics/media for visual polish, but don't significantly change your original (X)HTML markup; allow the original text-only site to stay intact and functioning. Keep your markup clean!
Third is to add a behavioural layer: Javascript (Ajax). Offer things that make the experience faster, smoother, nicer for users/browsers with Ajax-capable JS... but only those users.
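A bare-bones sketch of that third layer, assuming a plain HTML form that already posts and works without JavaScript (the form id and serialize helper are hypothetical):

    var form = document.getElementById('comment-form');
    if (form && window.XMLHttpRequest) {
        form.onsubmit = function () {
            var xhr = new XMLHttpRequest();
            xhr.open('POST', form.action);
            xhr.setRequestHeader('Content-Type', 'application/x-www-form-urlencoded');
            xhr.onload = function () {
                // update the page in place instead of reloading it
            };
            xhr.send(serialize(form)); // serialize is a hypothetical helper
            return false; // suppress the full-page POST only when Ajax took over
        };
    }

Users without JavaScript never hit this code path and still get the working form underneath.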
Browser compatibility.
Asynchronous access to data means it's harder to make things behave correctly in every combination of actions.
Dependence on JavaScript makes the site unusable for some. Also, JavaScript performance can be a bottleneck in resource-limited environments.
The user may not be able to tell that an AJAX operation took place, or whether it failed. It can be difficult to recover from client-side errors caused by a failed AJAX call.
Makes it really hard to do functional testing.
Inability to update the client without "polling", which means querying the server every X seconds.
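A sketch of what that polling usually looks like (the endpoint and interval are placeholders):

    setInterval(function () {
        var xhr = new XMLHttpRequest();
        xhr.open('GET', '/updates'); // hypothetical endpoint
        xhr.onload = function () {
            // apply whatever changed on the server
        };
        xhr.send();
    }, 5000); // ask the server every 5 seconds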
It requires javascript. And you have to admit to your friends how "Web 2.0" you are. Instead of being hard core old school: It's all tables for layout and frames for navigation for me.
Yes, Ajax is not supported by old browsers or browsers which don't have JavaScript enabled. Nowadays, most browsers do support Ajax -- even mobile browsers like the one on the iPhone.
The biggest issue for me is that Ajax adds complexity to the project.
There are many Ajax libraries out there which are supposed to make life easier. In most cases, these libraries are easy to use to create a "Hello World" application. One of the main issues, which is most of the time set aside by Ajax libraries, is (client-side) error handling/logging.
For larger projects, the developer has to understand the internals of the library, which adds a new learning discipline to the project.
Some of our big clients, for security reasons, took a corporate decision to have JavaScript switched off. Therefore no AJAX is possible.
If you are going to develop something using AJAX for a given client, be sure that your client is allowed to use JavaScript.
Restrict your application to a reasonable number of browsers and browsers versions.
Cross-browser compatibility can make your life miserable.
Ultimately, the problem it introduces is complexity. Most problems inherent in AJAX sites (bookmarking, browser history, graceful degradation, etc.) can be overcome with a good design, so there are not really any disadvantages to a well-designed AJAX-enabled site. The problem is that creating such a site requires a lot of design work and very good developers who can manage the complexity.
Well... we've developed a J2EE application using Struts2's Ajax capabilities. We find that the Dojo implementation is quite slow. We did the following things:
1. Custom build of the dojo library. (increased dojo.js from 240kb to 350kb)
2. Took all the static stuff out of the struts jar and kept it outside.
The performance was significantly improved, but it is still quite heavy, as you can guess from the 350kb size...
Is Struts2 Ajax supposed to be this heavy? Or is there a lighter implementation available?
Edit: I used Firebug and YSlow with my application. A couple of changes that improved my situation hugely are mentioned below:
Custom build of Dojo (reduced the number of I/Os)
Move the static files out of the Struts jar (helped a great deal)
Tune your server to gzip the response (reduced the response size to 1/3)
Reduce the number of images on your site (this is obvious)
Will keep updating on further changes...
First of all check that you did everything on the server to facilitate caching (e.g., setting right HTTP headers, compression, server-side caching, upstream caches, and so on). See Improving performance… for more details.
The goal is to reduce I/O as much as possible — use Firebug or any other network traffic monitoring tool to see how much is sent back and forth. Try to minimize the number of I/O requests and the total number of bytes.
Don't forget that it applies to your dynamic data too — choose efficient formats, bundle several related requests together, remove all deadwood that is getting sent over and over unchanged.
If the custom build and server-side tweaks didn't help, consider restructuring your web app to be more light-weight. Examples:
Evaluate the splash screen technique discussed in the link above.
If you use a lot of different form widgets, see if it is really necessary, and fall back on regular DOM elements like "input", "button", "textarea", "select".
The same goes for layout widgets. See if simple CSS can help you out.
Evaluate building Dojo in layers instead of one monolithic dojo.js so only the necessary subset is loaded by web pages. See details in The Package System and Custom Builds.
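A rough sketch of such a layered build profile, in the Dojo 1.x build-profile format (the layer name and module list are placeholders):

    dependencies = {
        layers: [
            {
                // everything this app needs, baked into a single file
                name: "../mylayer.js",
                dependencies: [
                    "dijit.form.Button",
                    "dijit.layout.TabContainer"
                ]
            }
        ],
        prefixes: [
            [ "dijit", "../dijit" ],
            [ "dojox", "../dojox" ]
        ]
    };

Pages then load dojo.js plus just their own layer file instead of dozens of individual modules.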
Having built web applications with Dojo for a living for the last 2 years, I have yet to see one that cannot be optimized properly until it is fully accepted and perceived by end users as "fast", "nimble", and "light-weight".
Make sure you follow this FAQ first:
http://struts.apache.org/2.x/docs/performance-tuning.html
I usually write my own theme instead of using the Struts2 Ajax theme, which has Dojo built in. This way I can use whatever toolkit I want (jQuery). I saw the biggest performance improvement when I copied the templates folder from the jar to the root web directory of the webapp.
Last I checked, struts was shipping a release of Dojo (0.4) that's going on 2 years old. Dojo did a rewrite for version 0.9/1.0 that had significant performance gains and reduced code size. You should make sure you're running a recent version of Dojo (current version is 1.2.3) and use the build and tips from Eugene, above.