There are many specific 'display speed' optimisation techniques discussed on Stack Overflow; however, you need to know that a particular option exists before you can search for how to implement it.
Therefore, a guideline strategy would be useful, particularly for those new to website development.
Because this topic covers numerous individual areas, the answer may prove best to be collaborative.
My website involves the usual suspects: JS, CSS, Google Fonts and images, and I have yet to embed three YouTube videos.
As such, it likely represents a standard website, encompassing the needs of many users.
I propose to start an answer that can subsequently be added to, so that a general strategy can be determined.
Web Dev: Strategy to maximise user website display speed
(work in progress)
Step 1. Compression & Caching
Enable gzip or deflate compression, and set cache parameters.
(Typically done in .htaccess if you don't have direct control over the server configuration.)
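A minimal .htaccess sketch of both, assuming an Apache server with mod_deflate and mod_expires enabled (the MIME types and the one-month cache lifetime are illustrative, not a recommendation):

# Compress text-based resources on the fly (requires mod_deflate)
<IfModule mod_deflate.c>
    AddOutputFilterByType DEFLATE text/html text/css application/javascript
</IfModule>
# Let browsers cache static assets for a month (requires mod_expires)
<IfModule mod_expires.c>
    ExpiresActive On
    ExpiresByType image/jpeg "access plus 1 month"
    ExpiresByType image/png "access plus 1 month"
    ExpiresByType text/css "access plus 1 month"
    ExpiresByType application/javascript "access plus 1 month"
</IfModule>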
Step 2. Fonts
The best option is to stick with 'safe fonts'.
(Tested a linked Google 'Roboto' font in Dev Tools: 2.8 seconds to download!… ditched it for a zero-delay safe font.)
Step 3. Images
A single image might be larger than all other resources combined, so images must be first in line for optimisation.
First crop the image to the absolute minimum required content.
Experiment with compression formats: photos (JPG), blocks of flat colour (GIF), sharp-edged or transparent graphics (PNG).
Then resize (scale) the image down to the maximum size at which it will be displayed on the site (controlled using CSS).
A reduced maximum display size allows greater JPG compression.
(If it is a photo site, provide a link to the full-size image, to allow user discretion.)
Consider delayed (lazy) image loading, e.g. LightLazyLoader, where images are only loaded when the user scrolls them into view.
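As an illustration of the technique (not the LightLazyLoader API, which I haven't verified), here is a minimal lazy loader using the modern IntersectionObserver API; holding the real URL in a data-src attribute is an assumed convention:

// Each image keeps its real URL in data-src; src stays empty or points at a tiny placeholder.
var lazyObserver = new IntersectionObserver(function (entries) {
  entries.forEach(function (entry) {
    if (entry.isIntersecting) {
      var img = entry.target;
      img.src = img.dataset.src;    // trigger the real download
      lazyObserver.unobserve(img);  // each image only needs this once
    }
  });
});
document.querySelectorAll('img[data-src]').forEach(function (img) {
  lazyObserver.observe(img);
});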
Step 4. Arrange load sequence to provide initial user display
This requires positioning the linked resources in the right order.
E.g. if using ResponsiveSlides, the script must be loaded before the slide images begin to display; otherwise all of the slide images are displayed at once, stacked down the page, until the script has been read.
I.e. breaking the usual 'scripts last' ground rule, and loading such scripts early, can present a working display whilst other, not-yet-required elements continue to load.
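A sketch of that ordering (the file names are illustrative):

<head>
  <link rel="stylesheet" href="my.min.css">
  <!-- display-critical script loaded early, against the usual rule -->
  <script src="responsiveslides.min.js"></script>
</head>
<body>
  <!-- slider markup here -->
  <!-- non-essential scripts stay at the bottom -->
  <script src="analytics.js"></script>
</body>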
Step 5. Combine CSS and JS resources according to load sequence
Simply open a CSS resource in Notepad++ and copy it into your 'my.css' file, above your own rules (so that your rules still override it)… and repeat for the JS files.
Step 6. Minify the combined resources
Various programs exist to achieve this… Google's tools offer it online: upload the file and download the minified version, with your comments and white space stripped out.
(Remember to version-name the files, so that you keep a copy with your comments.)
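If you prefer a local tool over the online one, Google's Closure Compiler (for JS) and the YUI Compressor (for CSS) are commonly used; a command-line sketch, assuming the jars are downloaded locally and using illustrative file names:

java -jar compiler.jar --js my.js --js_output_file my.min.js
java -jar yuicompressor.jar my.css -o my.min.css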
Notes:
Keep testing the changes that you make, e.g. in Google Dev Tools (Ctrl+Shift+I, Network tab).
This graphically shows each element's load time, and the overall timeline until the page is fully loaded.
Where multiple elements begin loading at the same time… this is ideal.
Typically, though, another group will only begin loading when the previous batch has finished.
It is preferable that all elements are downloaded simultaneously.
We are advised that this can be approached by storing the different elements in different locations (hostnames), to facilitate parallel downloads (where elements cannot be combined).
Comment
Using 2 Mb/s throttling and a clean (uncached) load, my first image display is now ready at ~2 seconds (down from 15 seconds).
The 2 s delay is due to the other download threads being occupied.
My objective is to start the first image's load earlier, whilst ensuring that the second image is ready to be displayed within a further 500 ms.
This is a work in progress.
Related
If you were watching the State of the Union address (http://www.whitehouse.gov/state-of-the-union-2013), you would have seen graphic supplements that appeared alongside the video stream of the President, serving to illustrate his key points.
The video on the site is a composite of the two, but during the live streaming they were handled separately.
My question is: what is the best approach for doing this, especially if one wants very tight control over the timing of the graphics (i.e. appearing right when the point is made, not before and not long after)?
I'm wondering if any tools exist to facilitate this. I've been scouring Google, but I don't think I have the correct technical vocabulary for what I'm describing, because I'm coming up blank.
I imagine AJAX would be a good starting point, but I'm not sure how to achieve the level of control that they had, or how to handle the back end of things.
For anyone who might encounter this challenge we devised two ways to solve it:
The first is a bit Mickey Mouse: it requires that you know beforehand how many images etc. you want to use (which in most cases you would). We wrote a script that repeatedly requests an image and inserts it into the page, and on finding that image, requests the next image in the chain (sketched below).
Ie. Display default image -> request image 1
then, displaying image 1 -> request image 2
etc
From your end you can simply drop the images into a folder on your server when you are ready for them to go live. An advantage of this is that the images can be interactive, with links to other content, etc.
The big disadvantage, of course, is the large number of unnecessary requests to your server. In our case we anticipated enough traffic that it didn't seem wise. Also, there are plenty of opportunities for mistakes, and depending on how frequently your timer fires, there are likely to be timing discrepancies.
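A minimal sketch of that polling chain, assuming numbered file names (supplements/image1.png, image2.png, …) and a container div with id 'supplements'; all of these names are hypothetical:

// When imageN finally exists on the server, display it and start polling for image N+1.
var nextImage = 1;
function pollForImage() {
  var img = new Image();
  img.onload = function () {
    document.getElementById('supplements').appendChild(img);
    nextImage += 1;                   // advance the chain
    setTimeout(pollForImage, 2000);
  };
  img.onerror = function () {
    setTimeout(pollForImage, 2000);   // not uploaded yet; retry shortly
  };
  // Cache-buster so the browser actually re-checks the server on each attempt.
  img.src = 'supplements/image' + nextImage + '.png?t=' + Date.now();
}
pollForImage();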
The second costs money: we found the program Ustream (http://www.ustream.tv/producer), which gives us all the image control we require in terms of timing, with the advantage of supporting media clips etc. It also allows you to record everything streamed.
The disadvantage is that what the user sees is an integrated video on your site, so you have to handle links to related content, and provide the images themselves (if you want your users to have access to them), separately.
Hope this comes in handy for someone.
I would still welcome any suggestions on how to make the first method more effective.
Link to article
A single HTML document can contain one or many 'page' containers simply by stacking multiple divs with a data-role of "page". This allows you to build a small site or application within a single HTML document; jQuery Mobile will simply display the first 'page' it finds in the source order when the page loads.
I'm wondering about the limitations of this.
I'm building a site for a gallery exhibition which contains 340 images and 19 media files (audio and video).
The exhibition is divided into 16 separate galleries, each containing anything from 4 to 90 images.
Is the method described by jQuery Mobile possible here?
Obviously I'll be keeping the image sizes reasonable for this.
Thanks
I haven't built anything that large, even in my demo time playing with jQM, but if your entire page document (images, HTML, CSS, etc.) grows beyond, say, 1 MB in total, you are going to run into longer-than-recommended load times. I recommend checking this with Google's speed tools; they give you a nice graph of all your loaded resources and the total page size. (You will need to be on the dev channel of Google Chrome to use this tool.)
I'm curious to see how usable this ends up being. Let us know. If you do run into issues, you may just have to find some logical points at which to split your document, or possibly look at using JSON and the Flickr API to load some of your image galleries.
This is not a question about a specific programming problem; it's about examining different concepts. If the moderators don't feel this is OK, delete my question.
I need to display around 100 PNG images in table cells, each image being a 75x16 PNG. In order to reduce the number of HTTP requests, I grouped all 166 images (only roughly 100 are shown at one time) into one big sprite sheet, and used an img tag to display the output, one image at a time. This is the code:
CSS:
.sprites {background-image:url('folder/spritesheet.png');background-color:transparent;background-repeat:no-repeat;display:inline;}
#png3 {height:16px;width:75px;background-position:0px 0;}
#png5 {height:16px;width:75px;background-position:-75px 0;}
PHP:
$classy = "png" . $db_field['imageid'];
echo "<td>" , "<img src='transparent.gif' class='sprites' id='$classy' alt='alt' align='absmiddle'/>" , "</td>";
$classy is a variable which selects the correct image based on the SQL query output, and transparent.gif is a 1px transparent GIF. The result: the images are shown correctly inside the table.
The page loading speed increased significantly, maybe 50-60%. In one of my earlier questions, concerns were raised over whether this is good practice or not. But it works.
The only other solution I've found is jar compression, but that concept works in Firefox only. This is the code used to display the same images using jar compression (PHP, no CSS):
$logo = "jar:http://www.website.com/logos.jar!/" . $db_field['imageid'] . ".png";
echo "<tr>" , "<td>" , "<img src='$logo' alt='alt' width='75' height='16'/>";
All 166 images are compressed into a jar archive and uploaded to the server, and as jar is a non-solid archive, only the requested image is extracted, not all of them. This solution is lightning fast, and I've never seen a faster way of displaying that many images. The concept is here and deserves a link. Another advantage over CSS sprites is that with jar, each image can be individually optimized for size (e.g. one reduced to 64 colors, another to 128 or 32, depending on the image), whereas a large sprite sheet cannot be optimized that way, as it contains a lot of colors.
So, does anyone know of a solution that would be as fast as jar? If using CSS sprites to display content is bad practice, what is the good practice that gives the same result? The key here is the loading speed of the website, with as few HTTP requests as possible.
I'm not really an expert on this, but I've had my share of these things too.
HTTP Requests
Ever heard of the "2 concurrent connections" limit? (Recent browsers allow around 6-8 per host.) Loading a lot of separate files means that while 2 are loading at the same time, the others have to wait in line. Loading one big chunk is better, and this is the main reason spriting is used. Aside from the connection limit, you also get to manage all those "same purpose" images in one file.
Cache
Now, "one big chunk", I said, but you might ask, "Doesn't that make it even worse?" Nope, because I have an ace up my sleeve, and that's where the cache comes into play. One page load is all you need, and then, poof! The rest of the pages that need that image load at the speed of light, and it saves you another HTTP request. Never underestimate the power of the cache.
Images
Another thing you can do is optimize your images. I have used Fireworks and I really love its image optimization tools. Here are the personal guidelines I follow, which you can apply to your situation:
GIFs for icons, JPGs for photos, PNGs for transparent stuff.
Remove unused colors. Yes, some tools can do this; it cuts down the file size.
Never resize images in the HTML; serve pre-resized versions instead.
Lose quality. Yes, there is such a thing: lower your image quality to reasonable limits. Losing too much makes the image too "cloudy" or "blocky".
Use progressive-loading images, which "fast-load" a blurred image and then sharpen it up later.
Avoid animated images. They are bloat, not to mention annoying.
Server Tricks
There are connection limits, but that does not prevent you from using other domains or even subdomains! Distribute your content across other domains or subdomains to increase the number of connections. For images, dedicate a subdomain or two, say img1.mysite.com and img2.mysite.com, or another domain such as mysite2.com. Not only is this beneficial to your users; it also helps distribute the server load.
Another method is using a Content Delivery Network (CDN). A CDN has a global network of servers containing cached versions of your website's resources. Say I'm in Asia: when I view your site with CDN-hosted resources, the CDN serves each resource from the server nearest to Asia.
Mark-up
This is not necessarily related to speed, but as a matter of semantics, the use of id should be reserved for more important purposes. If you use an ID to mark images for their styles, what happens when another element needs the same image? IDs must be unique; they can't be used twice. So I suggest using multiple classes instead.
Also, IDs take precedence over classes; to avoid unexpected overrides, use classes. Learn more about CSS specificity.
.sprites {
background-image:url('folder/spritesheet.png');
background-color:transparent;
background-repeat:no-repeat;
display:inline;
height:16px; /*same width and heights? place them here instead*/
width:75px;
}
.png3 {
height:16px; /* in cases you need a different dimension, this will override */
width:75px;
background-position:0px 0;
}
.png5 {
background-position:-75px 0;
}
$classy = "png" . $db_field['imageid'];
echo "<img src='transparent.gif' class='sprites {$classy}' alt='alt' align='absmiddle'/>";
I embed small images/icons in the style sheet:
.someicon{
background-image:url('data:image/png;base64,iVBORw0KGg....');
}
This works with all modern browsers, and does not require me to create sprites (plus it even saves one additional file to load).
In development, the images are defined in the style sheet normally like so:
.someicon{
background-image:url('../images/someicon.png');
}
and I have a system that generates the final style sheet (including consolidating all CSS into one, minifying and replacing the image reference with the data: url) automatically whenever I make a change to a style sheet.
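A minimal sketch of that substitution step in Node.js (the file names and the regex are illustrative, not the author's actual system):

// Replace url('../images/x.png') references in a stylesheet with base64 data: URIs.
const fs = require('fs');
const path = require('path');

const cssFile = 'styles.css';
const css = fs.readFileSync(cssFile, 'utf8');
const out = css.replace(/url\('([^']+\.png)'\)/g, function (match, relPath) {
  const imgPath = path.resolve(path.dirname(cssFile), relPath);
  const b64 = fs.readFileSync(imgPath).toString('base64');
  return "url('data:image/png;base64," + b64 + "')";
});
fs.writeFileSync('styles.out.css', out);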
This works well and saves me a ton of work. When compressed with gzip, the CSS file is not much bigger than the individual files added together. After optimizing the PNG/JPG files, the CSS for my start page is 63K uncompressed. Even with a slightly smaller sprite file, I would probably not save more than a fraction of a second in load time for the average user, so I do not bother with sprites.
In our web application, the users need to review a large number of images. This is my current layout. 20 images will be displayed at a time, with a pagination bar above the thumbnails. Clicking a thumbnail will show the enlarged image to the left. The enlarged image will follow the scrollbar so it's always visible. Quite simple actually.
I was wondering what the best interface would be in this scenario:
One option is to implement an infinite-scroll script which lazy-loads thumbnails as the user scrolls, removing thumbnails that are no longer visible from the DOM. But my concern with this approach is that the number of changes to the DOM will slow down the page.
Another option could be something like Google's Fastflip.
What do you think is the best approach for this application? Radical ideas welcomed.
I think the question you have to ask is: what action is the user supposed to take? What's the purpose of the site?
If "review images" entails rating every image, I'd rather go with a Fastflip-style approach, where the focus is on the single image. A thumbnail gallery will distract from the desired action and might result in fewer pics being rated/reviewed.
If the focus should rather be on comparing a given image against others, I'd say try the gallery approach, although I wouldn't implement an infinite scroll with thumbnails, because the user can quickly get lost in the abundance of choices. I think a standard pagination (whether static or ajaxified) would be better if you choose to go this route.
Just my 2c.
If you paginate the thumbnails, you can pre-generate a single image containing all the thumbnails for each page, then use an image map to handle mouseover text and clicking. This will reduce the number of HTTP requests and possibly lead to fewer bytes sent. The separation distance between thumbnails should be minimized for this to be most efficient. It does have some disadvantages.
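A sketch of the image-map idea; the file names and pixel coordinates are illustrative:

<img src="thumbs-page1.png" usemap="#thumbs" alt="Thumbnails, page 1">
<map name="thumbs">
  <!-- one area per thumbnail; coords are left,top,right,bottom -->
  <area shape="rect" coords="0,0,100,75" href="image1.html" alt="Image 1" title="Image 1">
  <area shape="rect" coords="105,0,205,75" href="image2.html" alt="Image 2" title="Image 2">
</map>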
To reduce image download size at the expense of preprocessing, you can try to save each image in the format (PNG or JPG) most efficient for its contents using an algorithm like the one in ImageGuide. Similarly, if the images are poorly compressed (like JPEGs from a cell phone camera), they can be recompressed at the cost of some quality.
Once the site has some testers, you can analyze patterns in which images tend to be clicked (if a pattern exists) and preload the full-size images, or even pre-load all of them once the thumbs are loaded.
You might play with JPEG 2000 images (you did say "radical ideas welcomed"), which produce thumbnails very cheaply, because the thumbnail and the main image needn't be sent as if they were separate files. This is an advantage of the compression format; it isn't the same as the hack of telling the browser to resize the full-size image to act as its own thumbnail.
You can take a look at Google's WebP image format.
On the server side, consider a separate image server optimized for static content delivery, perhaps using NginX or the Tux web server.
I would show the thumbnails, since the user might want to skip some of the pictures. I would also stay away from pagination of the form
<<first <previous n of x next> last>>
and go for something easier to implement and more efficient:
load x more pictures.
No infinite scroll whatsoever, and, why not, even no scroll at all: just "load x more", "previous x".
Although this answer might be a bit unradical and boring, I'd go with exactly your suggestion of asynchronously loading the thumbnails (and of course the main picture) as they come into view. A similar technique is used by Google+ in the pane for adding people to circles. This way, you spend server resources and bandwidth only on the pictures the client actually needs. As Google+ shows, the operations on the DOM tree are fast enough and don't slow down even a computer from the past few years.
You might also pre-build a few rows of the thumbnail table ahead of the viewport with a dummy image (e.g. an animated "loading circle" GIF) and replace the image later. That way, the table in view is already built and does not need to be re-rendered, as the flow elements following the table would have to be if no images were in there during scrolling.
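A minimal sketch of the swap, assuming the dummy img tags carry the real URL in a data-src attribute (an assumed convention, not a specific library's API):

// Download the real thumbnail in the background, then swap it over the placeholder GIF.
function swapIn(placeholder) {
  var loader = new Image();
  loader.onload = function () { placeholder.src = loader.src; };
  loader.src = placeholder.dataset.src;
}
document.querySelectorAll('img.thumb[data-src]').forEach(swapIn);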
Instead of paginating the thumbnails (as suggested by your layout scheme), you could also think about letting users filter the images by tag, theme, category, size or any other way to find their results faster.
We use large background images (hi-res photos, up to 700 KB) for our page design.
It's part of the experience of the site that as you browse around, you see different images.
At the moment a different (random) image is loaded on each page request, from a pool of ~15 images, which could grow over time.
I'm looking for a sane way to optimize this:
To avoid the user having to download a big image file on every page view
To reduce the load on the server (is this an issue? will the server keep the images in memory?)
The ideas I have so far include:
A timer which loads a different image at set intervals
Progressively loading other images in the background with AJAX (see the sketch after this list)
Associating images with specific content (pages, tags)
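For the background-loading idea, a minimal sketch (the pool of file names is hypothetical); note that this trades extra bandwidth now for cached, instant loads later:

// After the current page has finished loading, quietly pre-fetch the other
// background images so that subsequent page views hit the browser cache.
window.addEventListener('load', function () {
  var pool = ['bg-02.jpg', 'bg-03.jpg', 'bg-04.jpg'];
  pool.forEach(function (src) { new Image().src = src; });
});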
The question is, how to keep it feeling somewhat random, while minimizing page load times and server hit?
I usually avoid sites with huge images; I am very impatient. I would rethink your design.
As a first step you should make sure that the images can be properly cached:
use sane URLs (no session IDs etc.)
set appropriate HTTP caching headers (e.g. ETag)
Firstly, hearing that the background images alone are 700 KB astounds me. In addition to the content on screen… that is a pretty heavy site.
For starters, I would try image compression tools. Two come to mind: ImageMagick and PNGCrush. PNGCrush is excellent at removing all the extraneous metadata attached to photos, without compromising photo quality.
I recommend this because compressing the images means the user downloads less content, which means quicker load times, which… at the end of the day… is what users want.
I would also cache the images, so that when a user revisits the site, the images are already cached on their end. This minimises the HTTP requests made each time a user visits your site.
An example of where this technique is used on a commercial site is www.reactive.com. If you look at the /js/headerImages.js file, they make use of image caching. Funnily enough, you will find the same source code at: http://javascript.internet.com/miscellaneous/random-image.html
Considering that you have mentioned that the images are randomly loaded, I am assuming you are using a JavaScript library such as jQuery to create the effect.
If you are, you can minimize page load times by referencing the library from a CDN, as opposed to a local copy of the jQuery lib stored on your server. I performed performance testing on a site I made for a client, and over an average of 20 hits, saved 1.6 seconds through this technique!
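For example, referencing jQuery from Google's CDN rather than your own server (the version number is illustrative):

<script src="https://ajax.googleapis.com/ajax/libs/jquery/1.9.1/jquery.min.js"></script>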
Hope that helps for now :)