Reduce the HTTP Requests of 1000 images?

I know this question might sound a little bit crazy, but I thought that maybe someone could come up with a smart idea:
Imagine you have 1000 thumbnail images on a single HTML page.
The image size is about 5-10 kb.
Is there a way to load all images in a single request? Somehow zip all images into a single file…
Or do you have any other suggestions in the subject?
Other options I already know of:
CSS sprites
Lazy load
Set Expire headers
Downloading images across different hostnames

There are only two other options I can think of given your situation:
Use the "data:" protocol and echo a base64 encoded version of your thumbnails directly into the HTML page. I would not recommend this since you cannot then cache those images on the users browser.
Use HTML5's Web Storage to store all the images as records, with the base64-encoded image data stored as BLOBs in a column. Once the database has downloaded to the user's machine, use JavaScript to loop through all the records and create the thumbnails on the page dynamically, using something like jQuery. With this option you would need to wait until the entire database has finished downloading in the end user's browser, and they will need a fairly modern browser.
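A minimal sketch of the first option; the base64 payload below is truncated and purely illustrative:
<!-- The thumbnail's bytes are inlined into the HTML, so no extra request is made,
     but the markup grows by roughly a third per image and the data is not cached as a separate resource -->
<img src="data:image/jpeg;base64,/9j/4AAQSkZJRgABAQEA..." alt="Thumbnail 1" width="100" height="100">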
I think your best bet is a combination of lazy loading, caching with expires headers and serving images from multiple hostnames.
If the images can be grouped logically, CSS sprites may also work for you in addition to everything above. For example, if your thumbnails are for images uploaded on a certain day, you may be able to create a single sprite file for each day, which could then be cached in the user's browser.

This is done by using what's called a CSS sprite: a single image containing all the other images, with the particular part that's wanted selected by CSS in the HTML.
See one tutorial at http://css-tricks.com/css-sprites
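As a minimal sketch (the file name, tile size and class names are hypothetical), the sprite is set as a background image and background-position selects one 100x100 tile:
<style>
  /* thumbs.png is one large image holding many 100x100 thumbnails side by side */
  .thumb        { width: 100px; height: 100px; background-image: url("thumbs.png"); }
  .thumb-first  { background-position: 0 0; }       /* first tile */
  .thumb-second { background-position: -100px 0; }  /* second tile, shifted one tile to the left */
</style>
<div class="thumb thumb-first"></div>
<div class="thumb thumb-second"></div>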

It sounds like you want something like SPDY's server push. When the client requests the HTML page (or the first image), SPDY allows the server to push the other resources without waiting for more requests.
Of course, this is still experimental.

You could try the montage command of ImageMagick to create a single image.

Related

What is best for SEO, CSS resizing or thumbnails generation?

I have 4 sizes for a single image in a page of my eCommerce website.
600x600px, 350x350px, 220x220px, 110x110px
There are 3 solutions:
1- Loading the big image (600x600px) from server and cache it, then generating thumbnails using the cached one by a client-side plugin.
2- Loading the big image and thumbnails all from server. (in this case, thumbnails are generated in server)
3- Loading the big image and creating thumbnails by resizing the big one using CSS. (Or, for example, we can load the 600x600px and 350x350px ones and create the thumbnails by CSS from the 350x350px one.)
Which solution is the best for SEO ?
Or if there is any other way, I'd appreciate it.
My considerations regarding your solutions, assuming you are building a "classical" client-server eCommerce website (not a SPA application):
1- I believe this solution involves some JavaScript for the resizing, so the images won't be visible to search engine crawlers (or at least their indexing will be more difficult).
2- This seems the best approach. Thumbnails are generated on the server side and rendered in the HTML at the user's/client's request. The page will be crawled by search engines together with your HTML for their indexes. There is also less overhead on the client side (performance), as no dynamic image scaling is required.
3- The big image could potentially slow down the downloading of your page (this depends on many factors) and could make your page score lower in the search engine algorithm. Also consider users who access your page from mobile devices; for them, download speed is very important.
For SEO, please also consider the following (a small example follows this list):
Include a meaningful subject in the image alt text.
Image captions are important because they are one of the most well-read pieces of content.
Use file names containing relevant keywords.
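As a small illustration of those three points (the file name, alt text and caption here are invented for the example):
<figure>
  <!-- descriptive file name and alt text; the caption doubles as well-read, crawlable content -->
  <img src="/images/red-ceramic-coffee-mug-350x350.jpg" alt="Red ceramic coffee mug, 350 ml">
  <figcaption>Red ceramic coffee mug, 350 ml</figcaption>
</figure>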
More from a reputable website:
http://searchenginewatch.com/sew/opinion/2120682/ranking-image-search

How to avoid multiple requests for a site with many images

I have a page with hundreds of images that I want to show only on hover over text.
I did it using CSS, but the page loads very slowly because it has over 700 images.
How can I tweak it to load the images only at the time of hovering, instead of when the page loads?
The page is http://play-well.ro/comp
Thanks!
You should use sprites to merge the small images into a single file (this way you will avoid multiple HTTP requests to fetch the images),
and you should also consider paginating the contents of the page itself to improve performance in general.
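The answer above covers sprites and pagination; if you also want to defer loading until hover, as the question describes, a minimal sketch (jQuery assumed; the data-src attribute and element names are just names picked for this example) looks like:
<span class="preview" data-src="photos/team-42.jpg">Team 42</span>
<img id="preview-img" alt="">
<script>
// Nothing is downloaded at page load; the real URL is copied into src on hover,
// and repeat hovers are served from the browser cache.
$(".preview").on("mouseenter", function () {
    $("#preview-img").attr("src", $(this).data("src"));
});
</script>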

Good techniques for serving several images per page

I have a web application that needs to serve a large amount of small images per page (up to 100). I can use caching to reduce calls to the database/backend, but there is a noticeable impact from having to make so many separate requests for the images themselves, as the images take some time to request and render, especially on slower connections.
What good practices exist for serving several images on a page? I'm aware of using a CDN (e.g. S3 + Cloudfront) to reduce bottlenecking on http requests and serve content from a closer geographical location, as well as potentially loading images/content via Ajax only once they come to the user's view in the browser. Are there other techniques that might provide significant performance gains for image-heavy pages? It doesn't really matter whether they relate to hardware, frontend or something else.
Thanks.
Loading 100 images in one page request increases the page load time, as each image needs time to load in the browser.
A simple technique is to load only one default image; that is, the source of each of the 100 images should be a common default image, and a single image won't take much time to load.
When the page has loaded all of its content, then load each individual image with the help of jQuery.
Use the Lazy Load jQuery plugin to load all images after the page loads,
like this:
<img class="lazy" src="default.jpg" data-original="img1.jpg" >
<img class="lazy" src="default.jpg" data-original="img2.jpg" >
<img class="lazy" src="default.jpg" data-original="img3.jpg" >
.......
<img class="lazy" src="default.jpg" data-original="img100.jpg">
and in your script use the following code:
$(document).ready(function () {
    // Swap each img.lazy's src for its data-original URL as it scrolls into view
    $("img.lazy").lazyload();
});
You may add an Expires header to each image, which allows the browser to cache them rather than requesting them again on the next visit.
Hope it will help you.
You can use a different domain for images - these will be fetched over separate connections from those used for the current domain.
You can also host your images on a web server optimized to serve static content - this will be faster than a dynamic server.
The above can be extended to several such domains - if the browser is limited to, say, 4 parallel connections per hostname, each domain you add gives you an additional 4 (which is also one of the benefits of using a CDN).
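For example (the hostnames are hypothetical), spreading the thumbnails over a few image subdomains that serve the same content lets the browser open extra parallel connections:
<!-- Assign each image to a fixed hostname (e.g. by hashing its ID) so the browser cache still works across pages -->
<img src="http://img1.example.com/thumbs/001.jpg" alt="">
<img src="http://img2.example.com/thumbs/002.jpg" alt="">
<img src="http://img3.example.com/thumbs/003.jpg" alt="">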
Another common technique that may apply is the use of CSS sprites - if you have a bunch of images that are commonly used together, you can put them all in a single image and use CSS to only show the bits that are needed where they are needed.
You can always combine the images into a single image and use CSS to display only parts of it at a time (commonly called CSS sprites)
Google also has a rather in depth article about how they implemented "Instant Previews" that covers some of the optimizations:
http://googlecode.blogspot.com/2010/11/instant-previews-under-hood.html?m=1

ASP.Net MVC. Making Dynamic Images SEO Friendly

I have a website made to provide free web-based tools for making indie games. Currently, it only supports artists contributing to games. The features for helping artists consist of a set of artist community tools that allow artists to upload images based on a description, then we post that image in a gallery page. Other artists can upload their images and each image can have several revisions.
The way I chose to implement the image upload and display feature is by serializing uploaded images to a byte array and storing it in the database. When I need to display an image in the UI I just call a controller action I named "GetScaledGalleryImage" and pass in the image ID. That controller action takes the binary from the database, converts it back into an image, and returns the requested image.
This works very well functionally, but the problem I realized later is that the Google crawler thinks all of my images are named "GetScaledGalleryImage", so if someone searches for "sylph" on Google Images, nothing comes up from my site, but if someone searches for site:watermintstudios.com getscaledgalleryimage, all of my images come up.
Here is an example of the URL that is being output in my HTML http://watermintstudios.com/EarnAMint/GetScaledMedia/68?scale=128
In the past, pre-MVC I would handle 404 errors and return content based on what was requested even if the page didn't actually exist. This would of course allow me to have the images pulled back by the image name (or description).
Is that the best way to do this? Or is there a better option? Something simpler would be better like if I could just do http://watermintstudios.com/EarnAMint/GetScaledMedia/Iris%20Doll?id=68&scale=128, but based on how google indexes images, would that give me what I need? Or do I need to provide image file extensions for maximum indexability?
Thanks all
It is important when doing Search Engine Optimization to always use the alt attribute (e.g. alt="this is a crazy robot") for your images. This will help the crawler identify them. Note: always use alt; don't always name your images "this is a crazy robot".
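Purely as an illustration (the route shape and names below are invented, a variant of what the question proposes rather than something from the answer above), markup that combines a descriptive, extension-bearing URL with meaningful alt text might look like:
<!-- A readable slug and file extension in the URL, plus descriptive alt text, give the image crawler something to index -->
<img src="http://watermintstudios.com/EarnAMint/GetScaledMedia/68/iris-doll.jpg?scale=128" alt="Iris Doll character art, 128px thumbnail">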

What is a good approach for dealing with the uploading of image sequences

I am implementing the back end for an online store. The store receives new products periodically. Each product comes with a sequence of images for a 3D rotation effect on the website. What is a good approach for uploading these images onto the web store? I'm currently using a web form but uploading each image using a separate upload form element feels like a waste of time. These sequences can have anywhere from 12-50 frames. Any suggestions for a better way?
If you want a minimum amount of change from your current solution you can add the multiple attribute to your file input box, and update your back-end to support it.
You can then select multiple files at once with a modern browser, for instance Firefox 3.6. Try:
<input type="file" multiple />
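A slightly fuller sketch, assuming a plain multipart form (the action URL and field name are placeholders); the optional script just reports how many frames were picked before submitting:
<form action="/upload-sequence" method="post" enctype="multipart/form-data">
  <!-- one picker for the whole 12-50 frame sequence -->
  <input type="file" name="frames" multiple accept="image/*">
  <button type="submit">Upload sequence</button>
</form>
<script>
document.querySelector('input[name="frames"]').addEventListener("change", function () {
  // this.files is a FileList of everything selected in one go
  console.log(this.files.length + " frames selected");
});
</script>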
