Good techniques for serving several images per page

I have a web application that needs to serve a large number of small images per page (up to 100). I can use caching to reduce calls to the database/backend, but there is a noticeable impact from having to make so many separate requests for the images themselves, as the images take some time to request and render, especially on slower connections.
What good practices exist for serving several images on a page? I'm aware of using a CDN (e.g. S3 + Cloudfront) to reduce bottlenecking on HTTP requests and serve content from a closer geographical location, as well as potentially loading images/content via Ajax only once they come into the user's view in the browser. Are there other techniques that might provide significant performance gains for image-heavy pages? It doesn't really matter whether they relate to hardware, frontend or something else.
Thanks.

Loading 100 images in one page load increases the page load time, because each image needs its own request and time to render in the browser.
A simple technique is to initially load only one default placeholder image: point the src of all 100 <img> tags at the same default image, so the browser only has to fetch one small file up front.
Once the page has loaded the rest of its content, swap in the real images with jQuery.
The Lazy Load jQuery plugin does this for you, loading each image only as it scrolls into view, like this:
<img class="lazy" src="default.jpg" data-original="img1.jpg" >
<img class="lazy" src="default.jpg" data-original="img2.jpg" >
<img class="lazy" src="default.jpg" data-original="img3.jpg" >
.......
<img class="lazy" src="default.jpg" data-original="img100.jpg">
and in your script use the following code:
$(document).ready(function(){
    $("img.lazy").lazyload();
});
You can also add an Expires header to each image, which lets the browser cache them rather than requesting them again on the next page load.
Hope it helps.
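If the images happen to be served by a Node/Express backend (an assumption; the question doesn't say what the server is), a minimal sketch of adding long-lived caching headers to the image directory could look like this:
// A sketch, assuming a Node/Express backend (not stated in the question).
// express.static adds Cache-Control/max-age headers to everything under /images.
const express = require('express');
const app = express();

app.use('/images', express.static('images', {
    maxAge: '30d',    // let browsers reuse cached copies for 30 days
    immutable: true   // the file at this URL never changes, so no revalidation needed
}));

app.listen(3000);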

You can use a different domain for images - browsers open a separate pool of parallel connections for each hostname, so these requests won't compete with those to the current domain.
You can also host your images on a web server optimized to serve static content - this will be faster than a dynamic application server.
The above can be extended to several such domains - if the browser opens 4 parallel connections per domain, each domain you add parallelizes to an additional 4 (which is also one of the benefits of using a CDN).
Another common technique that may apply is the use of CSS sprites - if you have a bunch of images that are commonly used together, you can put them all in a single image and use CSS to only show the bits that are needed where they are needed.

You can always combine the images into a single image and use CSS to display only parts of it at a time (commonly called CSS sprites).
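As a rough illustration (the file name, sizes and offsets are made up for the example), the CSS and markup look something like this:
/* sprites.png is a hypothetical 300x100 image holding three 100x100 thumbnails side by side */
.thumb {
    width: 100px;
    height: 100px;
    background: url(sprites.png) no-repeat;
}
.thumb-1 { background-position: 0 0; }
.thumb-2 { background-position: -100px 0; }
.thumb-3 { background-position: -200px 0; }
and in the markup:
<div class="thumb thumb-1"></div>
<div class="thumb thumb-2"></div>
<div class="thumb thumb-3"></div>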
Google also has a rather in-depth article about how they implemented "Instant Previews" that covers some of the optimizations:
http://googlecode.blogspot.com/2010/11/instant-previews-under-hood.html?m=1

Related

Why are there unused image resources while using Google Chrome Lighthouse?

I am trying to improve the overall loading performance (especially for images) of this website with the Google Chrome Lighthouse extension: https://muckenthaler.de/
When the performance test is finished I get a list of opportunities; please see this screenshot or test for yourself: https://capture.dropbox.com/YV5ii1vrj0xpfWwK
Under "Serve images in next-gen formats" some image URLs are listed (like this one: https://muckenthaler.de/media/image/54/7c/cb/Sliderbild_Produkte_Steh_SItz_Tische_2.jpg) that don't even appear on the specific page, but somehow seem to get loaded into it and affect the performance.
How could I prevent this and why are these image resources loaded?
Here are the WebPageTest results for this page: https://www.webpagetest.org/result/220529_AiDcZP_759/1/details/#waterfall_view_step1
All you need to know from this thumbnail is that yellow rows indicate HTTP redirects and purple bars represent images. The longer the bar, the longer it took for the resource to load.
So we can tell a few things from this waterfall image:
there are many redirects
there are many images
many images take a very long time to load, relative to other resources
When I look up https://muckenthaler.de/media/image/54/7c/cb/Sliderbild_Produkte_Steh_SItz_Tische_2.jpg in the response headers, I see at request 19 that there's a redirect to that image from https://muckenthaler.de/media/image/Sliderbild_Produkte_Steh_SItz_Tische_2.jpg.
Looking up that image in your source code, I see
<img src="https://muckenthaler.de/media/image/Sliderbild_Produkte_Steh_SItz_Tische_2.jpg"
     alt=""
     loading="eager">
Also note that this content is inside a <div class="hidden-elements">. These elements of class emotion--element are set to display: none so that their contents are not shown on screen, but loading="eager" on the images forces them to be loaded anyway.
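In other words, the problematic pattern looks roughly like this (a simplified, hypothetical reconstruction, not the site's actual markup):
<!-- hidden container: nothing is rendered, but the images still download -->
<div class="hidden-elements" style="display: none;">
    <!-- loading="eager" forces the fetch even though the image is never shown -->
    <img src="some-large-slider-image.jpg" alt="" loading="eager">
</div>
Switching such images to loading="lazy", or removing the hidden block entirely, should keep the browser from fetching images that are never displayed.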
It seems like maybe your CMS (Shopware) is trying to eagerly preload images that will be used on other pages. That's not a terrible idea if you have a small number of lightweight images and users are very likely to navigate to those pages, but in this case it's loading dozens of images totalling over 30 MB. So definitely not recommended.
According to the CWV Tech Report, Shopware websites tend to only load 2 MB of images and have pretty good Core Web Vitals performance compared to other CMS and ecommerce platforms. That leads me to believe that there might be a misconfiguration on your end, or you may have installed a bad plugin.
First things first, a big thanks to Rick Viscomi for the research!
I found the answer, which is basically Shopware 5 hidden elements that can be shown and then removed by clicking the number next to the chain icon.
Here is a screenshot.

What is best for SEO, CSS resizing or thumbnail generation?

I have 4 sizes for a single image in a page of my eCommerce website.
600x600px, 350x350px, 220x220px, 110x110px
There are 3 solutions:
1- Loading the big image (600x600px) from the server and caching it, then generating thumbnails from the cached one with a client-side plugin.
2- Loading the big image and all thumbnails from the server. (In this case, the thumbnails are generated on the server.)
3- Loading the big image and creating thumbnails by resizing it with CSS. (Or, for example, we can load the 600x600px and 350x350px ones and create the smaller thumbnails with CSS from the 350x350px one.)
Which solution is the best for SEO?
Or if there is any other way, I'd appreciate hearing it.
My considerations regarding your solutions, assuming you are building a classical client-server eCommerce website (not a SPA):
1. I believe this solution involves some JavaScript for the resizing, so the images won't be visible to search engine crawlers (or at least will be harder for them to index).
2. This seems the best approach. Thumbnails are generated server side and rendered in the HTML at the user/client request, so the page will be crawled by search engines together with your HTML for their indexes. There is also less overhead on the client side (better performance), as no dynamic image scaling is required. (A rough sketch of server-side thumbnail generation follows this list.)
3. The big image could potentially slow down the download of your page (it depends on many factors) and could make your page score lower in search engine algorithms. Also consider users who access your page from mobile devices; for them, download speed is very important.
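Regarding solution 2, a minimal sketch of generating the thumbnails on the server might look like this (assuming a Node backend with the sharp library; the file names are made up for the example):
// A sketch of server-side thumbnail generation, assuming Node and the sharp library.
// The sizes match the ones listed in the question; file names are hypothetical.
const sharp = require('sharp');

const sizes = [600, 350, 220, 110];

async function generateThumbnails(inputPath) {
    for (const size of sizes) {
        await sharp(inputPath)
            .resize(size, size)                      // size x size thumbnail
            .toFile(`product-${size}x${size}.jpg`);  // e.g. product-350x350.jpg
    }
}

generateThumbnails('product-original.jpg');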
For SEO, please also consider the following:
Include a meaningful subject in the image alt text.
Image captions are important because they are among the most well-read pieces of content.
Use file names containing relevant keywords.
More from a reputable website:
http://searchenginewatch.com/sew/opinion/2120682/ranking-image-search

How to avoid multiple requests for a site with many images

I have a page with hundreds of images that I want to show only on hover over text.
I did it using CSS, but the page loads very slowly because it has over 700 images.
How can I tweak it to load the images only at hover time instead of when the page loads?
The page is http://play-well.ro/comp
Thanks!
You should use sprites to merge the small images into a single file (this way you will avoid multiple HTTP requests to fetch the images), and you should also consider paginating the contents of the page itself to improve performance in general.

Reduce the HTTP Requests of 1000 images?

I know this question might sound a little bit crazy, but I thought that maybe someone could come up with a smart idea:
Imagine you have 1000 thumbnail images on a single HTML page.
The image size is about 5-10 kb.
Is there a way to load all images in a single request? Somehow zip all images into a single file…
Or do you have any other suggestions in the subject?
Other options I already know of:
CSS sprites
Lazy load
Set Expire headers
Download images across different hostnames
There are only two other options I can think of given your situation:
Use the "data:" protocol and echo a base64 encoded version of your thumbnails directly into the HTML page. I would not recommend this since you cannot then cache those images on the users browser.
Use HTML5's Web Storage to store all the images as records with the base64 encoded image data stored as BLOBs in a column. Once the database has downloaded to the users machine, use Javascript to loop through all the records and create the thumbnails on the page dynamically using something like jQuery. With this option you would need to wait till the entire database was done downloading on the end users browser, and they will need a fairly modern browser.
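To illustrate the first option (a sketch only; the file name is hypothetical), a thumbnail can be embedded as a base64 data: URI so that it costs no extra HTTP request:
// A sketch, assuming Node: read a thumbnail and emit it as an inline data: URI.
const fs = require('fs');

const base64 = fs.readFileSync('thumb1.jpg').toString('base64');
console.log(`<img src="data:image/jpeg;base64,${base64}" alt="thumbnail">`);
The resulting tag looks like <img src="data:image/jpeg;base64,..." alt="thumbnail"> and, as noted above, it cannot be cached separately from the page.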
I think your best bet is a combination of lazy loading, caching with expires headers and serving images from multiple hostnames.
If the images can be grouped logically, CSS sprites may also work for you in addition to everything above. For example, if your thumbnails are for images uploaded on a certain day... you may be able to create a single file for each day, which could then be cached in the user's browser.
This is done by using what's called a CSS sprite: a single image with all the other images inside it, with the particular part that's wanted in the HTML selected by CSS.
See one tutorial at http://css-tricks.com/css-sprites
It sounds like you want something like SPDY's server push. When the client requests the HTML page (or the first image), SPDY allows the server to push the other resources without waiting for more requests.
Of course, this is still experimental.
You could try the montage command of ImageMagick to create a single combined image.
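For example (a sketch only; the tile layout and file names are illustrative):
montage thumb*.jpg -tile 10x -geometry +0+0 sprite.jpg
Here -tile 10x lays the thumbnails out in rows of 10 and -geometry +0+0 packs them edge to edge at their original size; the resulting sprite can then be sliced up with CSS background positions as described above.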

AJAX load time - Host and Server issue?

I'm having an issue with slow AJAX calls. This is a common question, but I've done everything suggested in all the research I can find. I'm hoping to get a consensus from people who read this.
Basically, I make an ajax request to a php page, which gets info from a database.
Here is the page.
I've timed all of my javascript, mySQL, and php scripts, requests, and pages.
(If you run firebug you can see my time markers in the console, as well as in the xml)
As an example -
The mysql request takes 20ms
The PHP page takes 50ms
The ajax success script, which processes the small amount of xml (less than 1k) and generates the markers, takes 8ms to run.
Yet, loading the page takes nearly 4 seconds.
So, assuming none of my scripts are lagging, this has to be a problem with the response time from the server, or my own internet connection, right?
I'd appreciate any theories or thoughts.
Thank you
OK, I looked at your page and here are some of the issues I saw that would affect speed:
It takes 4ms to get your data in your getMarkers function, but it takes 892ms to read the XML file. I would recommend falling back to vanilla JavaScript to read your XML file, as the number of find() calls you are performing is really hurting your performance here (see the sketch after this list).
Minify and combine all of the scripts that are local on your server. I was getting some really high response times. You can eliminate 4 HTTP requests by doing this, which, given the response times on your server, will help a bit. (Note: don't combine jQuery or jQuery UI into this.)
Since your server is a bit slow (not your fault, this will vary as you are probably on shared hosting), I would recommend linking jQuery and jQuery UI to the Google CDN hosted versions. Here is a post on that: Jquery CDN
You have 24 images on your page, 23 of which are under 4KB. Combine those 23 into one CSS sprite image, assign a 1px x 1px blank gif as the inline HTML image, and use the CSS sprite image instead. Here is a good article on what this is if you are unfamiliar: CSS Sprites explained. Also, here is a good online CSS sprite generator: CSS Sprite generator
Make sure you need jQuery UI for this page. I didn't see anything that would have required it. If you can remove it, you save yourself 206k. Remember to remove the associated CSS file if it isn't needed. This would save you another 2 calls.
I didn't dig too deep, but if you are not doing so already, kick off the call to set up the Google map in $(document).ready(); that way the rest of your page can load and you can display a loading animation in that area. Users will know something is happening and your page will appear to load a lot quicker.
So you can greatly speed things up by doing the above. You would go from 82 components down to 51 local and 2 more on the Google CDN. If you can improve that XML read time, you can shave nearly another second off the load time as well.
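To illustrate the first point (a sketch only; the tag and attribute names "marker", "lat" and "lng" are assumptions, since I don't know your actual XML format), parsing the XML once with the native DOMParser and reading attributes directly avoids the repeated jQuery find() traversals:
// A sketch of reading the marker XML with vanilla JavaScript instead of jQuery.
// The element and attribute names are hypothetical.
function parseMarkers(xmlText) {
    var doc = new DOMParser().parseFromString(xmlText, "text/xml");
    var nodes = doc.getElementsByTagName("marker");
    var markers = [];
    for (var i = 0; i < nodes.length; i++) {
        markers.push({
            lat: parseFloat(nodes[i].getAttribute("lat")),
            lng: parseFloat(nodes[i].getAttribute("lng"))
        });
    }
    return markers;
}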
