We use large background images (hi-res photos, up to 700 KB) for our page design.
It's part of the experience of the site that as you browse around, you see different images.
At the moment a different (random) image is loaded on each page request, from a pool of ~15 images, which could grow over time.
I'm looking for a sane way to optimize this:
To avoid the user having to download a big image file on every page view
To reduce load on the server (is this an issue, will the server keep the images in memory?)
The ideas I have so far include:
A timer which loads a different image at set intervals
Progressively loading other images in the background with ajax
Associating images with specific content (pages, tags)
The question is: how do I keep it feeling somewhat random while minimizing page load times and server load?
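Here is a rough sketch of what I mean by ideas 1 and 2 combined, in plain JavaScript (the image pool, interval and property names are placeholders, not my real setup):

```javascript
// Pool of background images (placeholder URLs, not my real ones).
var pool = ['/bg/photo-01.jpg', '/bg/photo-02.jpg', '/bg/photo-03.jpg'];
var current = pool[0];

// Pick a random image that differs from the one currently shown.
function pickNext() {
  var candidates = pool.filter(function (url) { return url !== current; });
  return candidates[Math.floor(Math.random() * candidates.length)];
}

// Every 30 seconds, preload the next image in the background and only
// swap it in once it has fully downloaded, so the visitor never waits on it.
setInterval(function () {
  var next = pickNext();
  var img = new Image();
  img.onload = function () {
    document.body.style.backgroundImage = 'url(' + next + ')';
    current = next;
  };
  img.src = next;
}, 30000);
```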
I usually avoid sites with huge images; I am very impatient. I would rethink your design.
As a first step, you should make sure that the images can be properly cached:
use sane URLs (no session IDs etc.)
set appropriate HTTP caching headers (ETag, Cache-Control)
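Purely as an illustration, here is roughly what that looks like if the images are served by a small Node/Express app (an assumption on my part; the same Cache-Control/ETag idea applies whatever server you actually use):

```javascript
const express = require('express');
const app = express();

// Serve the background images from stable URLs (no session IDs) with
// long-lived caching: maxAge sets Cache-Control so the browser reuses its
// local copy, and the ETag lets it revalidate cheaply (304 Not Modified).
app.use('/bg', express.static('public/bg', {
  maxAge: '30d',
  etag: true,
}));

app.listen(3000);
```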
Firstly, hearing that the background images alone are 700 KB astounds me. On top of the content that is actually on screen... that is a pretty heavy site.
For starters, I would try image compression tools. Two come to mind: ImageMagick and pngcrush. pngcrush is excellent at stripping the extraneous metadata attached to photos without compromising photo quality.
I recommend this because compressing the images means the user downloads less content, which means quicker load times, which, at the end of the day, is what users want.
I would also cache the images, such that when a user re-visits the site, the image is already cached on their end. This minimises the HTTP requests that are made each time a user visits your site.
An example of where this technique is used on a commercial site is www.reactive.com. If you look at the /js/headerImages.js file, they make use of image caching. Funnily enough, you will find the same source code at: http://javascript.internet.com/miscellaneous/random-image.html
Considering that you have mentioned that images are randomly loaded, I am assuming you are using a Javascript library such as jQuery to create the effect.
If you are, you can minimize page load times by using a CDN as opposed to referencing a local copy of the jQuery library stored on your server. I performed performance testing on a site I made for a client, and over an average of 20 hits, saved 1.6 seconds through this technique!
Hope that helps for now :)
Recently I made a website for my photography: http://www.simotamas.com
I am a newbie, so it's not the best site, but it works fine for me. I have only one problem: when the site is loaded on a device for the first time, the gallery takes up to 1-2 seconds to load.
Could you guys please check whether I messed up something in the code?
Or should I make the pictures even smaller?
Is there any way I could improve the loading performance?
I would be really thankful for any advice.
Some points you can consider:
Use low-resolution thumbnails for the preview, and load the actual image only when it is clicked.
Load the images in the visible part of the page first, then load the images further down (this may affect the user experience); see the sketch after this list.
If you have the CPU power, use a caching or compression library such as https://nielse63.github.io/php-image-cache/ and benchmark it carefully.
Use gzip if you are not already using gzip compression on your server.
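For the second point, a minimal lazy-loading sketch in plain JavaScript (the data-src attribute is just a convention I'm assuming, not something in your current markup):

```javascript
// Markup assumption: <img data-src="/images/photo-12.jpg" alt="...">
// Images further down start with no src, so nothing downloads up front.
var lazyImages = document.querySelectorAll('img[data-src]');

var observer = new IntersectionObserver(function (entries, obs) {
  entries.forEach(function (entry) {
    if (!entry.isIntersecting) return;
    var img = entry.target;
    img.src = img.getAttribute('data-src'); // start the real download
    img.removeAttribute('data-src');
    obs.unobserve(img);                     // each image loads only once
  });
}, { rootMargin: '200px' }); // start loading slightly before it scrolls in

lazyImages.forEach(function (img) { observer.observe(img); });
```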
The fact your website doesn't wait for the image to load is considered a plus (look into asynchronous web page content loading for a good read). That said, you should compress your images before uploading them; tinypng.com is a nice tool for it. But if it's a photography website, doing so would reduce picture quality, so try playing with Photoshop's save settings to find your ideal compromise between quality and file size. Pictures are heavy; high definition and resolution will obviously result in heavier files to download.
Update: another thing you could do is display a (smaller) thumbnail and only load the full picture on request, i.e. the user clicks and the image opens in a new tab.
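A minimal sketch of that thumbnail idea (the class and attribute names are placeholders I'm assuming, not your actual markup):

```javascript
// Markup assumption:
// <img class="thumb" src="/thumbs/small-01.jpg" data-full="/photos/full-01.jpg">
// The page only ever downloads the small thumbnails; the heavy original is
// requested only when the visitor explicitly asks for it.
document.addEventListener('click', function (event) {
  var thumb = event.target.closest('img.thumb[data-full]');
  if (!thumb) return;
  window.open(thumb.getAttribute('data-full'), '_blank');
});
```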
It would help if you created smaller thumbnail versions of your images, so the browser can initially load those for the overview instead of scaling way-too-big images down while rendering the page. An image should always be downloaded at the dimensions it is going to be presented at.
My website for architectural visualization: http://www.greenshell3d.com
I noticed on the network tab (in incognito mode) that it takes 15 seconds or so to load the above-the-fold content (most notably the image slideshow).
Some of the images in the slideshow load at the very end instead of the beginning of the website load process. Now I understand the browser handles this order, but perhaps there is another way. As it stands, the bounce-rate is too high and I expect it is because of load time.
I've seen a jQuery snippet on GitHub that allows one to control the order of image loads - do you think this is a good option? I'd be glad to hear any opinions before investing the time to fix this.
Any ideas? Thanks!
You said you are interested in any opinions as well, so first some general thoughts: There is no page fold. The web that we produce content for exists in so many different screen sizes + resolutions that it’s impossible to say "The fold is below this big image!". Yes, Google changed the pagespeed insights tool to make people load stuff on top of the page first, but I think their wording there is really bad.
Now to your image loading issue:
The first thing I would recommend is to reduce the size of all the images. They seem to be around 280-300 KB per image, and you have a few of them. Since there is a translucent overlay over them anyway, you can probably get away with reducing the image quality without people noticing (because they don't see the image directly). Play around with the values here.
I would then look into optimizing the code for the slider to load the first image first, then the rest of the page and the other images asynchronously maybe after that. Another trick could be to increase the slide fade time from the first slide to other slides so the slider doesn’t change if the next image isn’t ready yet. You said you found a jQuery script to implement that, that’s where I’d start.
As a general guideline: the position of requests in the source code usually determines the load order of things on the page. If your images are requested by JavaScript at the end of the page, that leads to the images being loaded later than you want.
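A rough sketch of the "first image first, the rest later" idea (the image list is a placeholder for your real slides):

```javascript
// Let the first slide load normally in the HTML, then warm the browser
// cache for the remaining slides only after the page has finished loading.
var remainingSlides = [
  '/slides/slide-02.jpg',
  '/slides/slide-03.jpg',
  '/slides/slide-04.jpg',
];

window.addEventListener('load', function () {
  remainingSlides.forEach(function (url) {
    var img = new Image();
    img.src = url; // downloads in the background, cached for the slider
  });
});
```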
I've been thinking for a while about the best solution, and the more I read, the more confused I get. There are a lot of different libraries and helpers (most of them outdated or for CI 1.x), and I really need your help.
I have a custom CMS based on CodeIgniter 2.1.3 for a news site that has about 40-50 images on the home page. About 80% of them are really small thumbnails in 3 different sizes, and the other 20% of the images on the home page come in 2 sizes; in addition, when I list the news from a category on the inner pages, there is 1 thumbnail size. So in total I will need the original image for the news story, plus 5-6 thumbnail sizes for the home page.
What's the smartest way to deal with this? There will be, let's say, 10-50 new news items per day.
Is it still better to create 5-6 thumbnails per image during the upload?
What about the "on the fly" method? I'm more inclined towards this one; from what I read, only the first visitor triggers the library/helper to generate the thumbnails, and for everyone else the thumbnails will already exist, so it won't waste CPU. Is it good practice?
What caching techniques should I use for this?
Also, I forgot to ask: how do other CMS systems deal with generating thumbnails? I mean WordPress, Drupal, Joomla, etc.
Do they store predefined sizes or generate them on the fly?
I guess their logic should be the best, or maybe not, but I want to implement something smart in my CodeIgniter CMS.
I didn't mention it, but I don't think it matters here: I use Grocery CRUD for the admin panel.
Any help is appreciated.
Your best bet is to create images on the fly and use a CDN like Amazon CloudFront to cache the resized versions of your source image.
I've been using CodeIgniter for a number of years to build websites where lots of different sizes of images are used throughout the site. At the beginning I used to create every size needed out of the original image during the upload process (which could easily end up being more than 5 thumbnails). This proved to deliver the best performance: whenever you need an image of a certain size, you just include it with no additional PHP processing. However, I noticed that I ended up with a huge number of images on the server, where the older ones may not even be used that often (e.g. articles older than a year). Plus, developing this way takes longer.
Then I started creating images on the fly, at first using 3rd-party libraries and later my own interface for CodeIgniter. This saves a lot of time, because during the upload process you save only the original version of the image, without worrying about resized versions. When displaying an image on the front end, all you normally need to do is pass the required dimensions. Done this way, you can get not just 5-6 versions of the image, but as many as you need. It's also a solution for the future, when you redesign your website and differently sized images might be needed. What would you do when none of your 5 thumbnail options is valid any more and you need different sizes?
You're right, resizing an image on the fly can be a really CPU-consuming operation (especially when large images are involved), therefore caching is a must. You can cache images right on your server or put a CDN on top of that.
To keep the server tidy, I normally run a cron job to delete on-the-fly images older than, let's say, a week. That saves space and doesn't cause harm: whenever an image is needed for display, it just gets recreated.
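Your stack is CodeIgniter/PHP, so treat this only as a language-neutral sketch of the pattern (shown here in Node with the sharp library, which is purely my assumption; the paths, sizes and function names are made up): check the cache first, resize once on a miss, and let every later request hit the cached file.

```javascript
const fs = require('fs');
const path = require('path');
const sharp = require('sharp'); // stand-in for whatever resize library you actually use

fs.mkdirSync('cache', { recursive: true }); // make sure the cache folder exists

// Return the path of a cached thumbnail, creating it on the first request only.
async function thumbnail(original, width, height) {
  const name = `${path.parse(original).name}-${width}x${height}.jpg`;
  const cached = path.join('cache', name);

  if (!fs.existsSync(cached)) {
    // Cache miss: the first visitor pays the resize cost once...
    await sharp(path.join('uploads', original))
      .resize(width, height)
      .jpeg({ quality: 80 })
      .toFile(cached);
  }
  // ...and everyone after that gets the already-resized file.
  return cached;
}
```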
Check out timthumb; it's a script that resizes images on the fly and stores them in a cache. It's as simple as including an image tag with parameters in the URL.
Also check this link, which looks promising: http://www.jenssegers.be/blog/31/Codeigniter-resizing-and-cropping-images-on-the-fly-continued
I love the way Drupal manages this. In Drupal 6 there was a module called imagecache (now in core in Drupal 7, with very similar functionality), which basically stores presets for images (image sizes, transformations, effects...); when a visitor asks for an image, the module generates the different images based on the presets and serves them. This way you upload one image but have different images for different purposes.
The module has a really useful feature: if you want to change one preset, you can "flush" all the images related to that preset, so visitors can see the changes.
Of course there are many other Drupal modules related to imagecache or image styles that add other effects, like watermarks...
More information:
http://drupal.org/node/949222
http://drupal.org/node/163561
My website seems to take on this "squeegee"-type load effect, where all of the graphics load from the top down with an ugly top-to-bottom wiping effect. Is there a way to make the way in which your website renders graphics prettier?
I'd be more interested in why your page is taking so long to load and/or render. If it takes several seconds to draw, even on a fast connection, you might want to look into why that is. Tools such as Fiddler, Firebug, IE Developer Tools, etc. can help you look at what resources your page is downloading and how big each resource is.
If you have massive resources on the page (such as BMP or PNG files that are several hundred K), see if you can convert them to other formats or resize them on the server to the size they render at.
If your HTML is massively complex, such as huge nested tables, you might want to look into simplifying that with more modern HTML and CSS styling.
If you do have huge, high-res bitmaps that need to be loaded, you might want to preload them with script and then render them dynamically when they finish loading.
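A small sketch of that preload-then-render idea in plain JavaScript (the URL and container id are placeholders):

```javascript
// Download the big image off-screen first, then attach it only once it has
// fully loaded, so the visitor never watches it paint top-to-bottom.
var hero = new Image();
hero.onload = function () {
  document.getElementById('hero-container').appendChild(hero);
};
hero.src = '/images/hero-highres.jpg';
```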
I'm having a lot of problems with the loading time of my web page:
www.alvaromillan.es.
I've tried to minify the JS and the images, but the problem is, as you can see, that my website is only this one page, so every image and script is in this document...
The loading time is really high, even the smooth scrolling takes a long time, and the first time you use it it doesn't run smoothly...
Could any of you please help me?
I took a quick look at the page just using the Chrome developer tools, and while there are probably several things you can do which YSlow would suggest, I think the biggest gain would come from optimizing and spriting your images. 131 of the 156 requests on your page are for images. That's a lot of images, and many are fairly small. Also, a lot of the images seem quite large in bytes for their dimensions. Here is what I would do:
Combine the images into several sprite sheets of about 50-100 KB per sheet.
Use the PNG format.
Quantize the sprite sheets to 8-bit PNGs. My guess is that you will not experience perceptible quality loss by doing this. You could use something like pngquant to do this.
Use something like optipng to apply lossless compression on the quantized image.
I think this will yield dramatic improvements.
As skaffman suggests, do run YSlow and/or Google's PageSpeed test for more thorough suggestions. I also like using webpagetest.org, which provides great metrics for optimizing pages.
Give the YSlow Firefox plugin a try. It will analyze the load times of your site and advise you on the best course of action to take to fix it.
OK, here are some quick initial thoughts...
Flush the page after the head so that the browser can start downloading those resources sooner.
Remove the iframe
jQuery appears to be loaded twice - once directly and once via google.load.
Can you defer the loading of the JavaScript until later, e.g. put it at the bottom of the page or load it asynchronously? (There's a sketch of that after these points.)
Rather than preloading the slideshow images - why not load them on demand when clicked on, or lazy load them after the page has finished loading?
Also, do you really want IE to emulate IE6?
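For the "load it asynchronously" option, a minimal sketch (the file name is a placeholder for whatever script is non-critical):

```javascript
// Inject the non-critical script only after the page itself has finished
// loading, so it no longer blocks the initial render.
window.addEventListener('load', function () {
  var script = document.createElement('script');
  script.src = '/js/slideshow.js'; // placeholder path
  script.async = true;
  document.body.appendChild(script);
});
```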