I have 4 sizes of a single image on a page of my eCommerce website:
600x600px, 350x350px, 220x220px, 110x110px
There are 3 solutions:
1- Loading the big image (600x600px) from the server and caching it, then generating the thumbnails from the cached one with a client-side plugin.
2- Loading the big image and the thumbnails all from the server (in this case, the thumbnails are generated on the server).
3- Loading the big image and creating the thumbnails by resizing the big one with CSS (or, for example, loading the 600x600px and 350x350px versions and creating the thumbnails with CSS from the 350x350px one).
Which solution is the best for SEO?
Or if there is any other way, I would appreciate it.
My considerations regarding your solutions, assuming you are building a "classical, client-server paradigm" eCommerce website (not a SPA application):
Solution 1: This involves some JavaScript for the resizing, so the generated images won't be visible to search engine crawlers (or at least their indexation will be more difficult).
Solution 2: This seems the best approach. Thumbnails are generated on the server side and rendered in the HTML in response to the user's/client's request. The page and its images will be crawled by search engines together with your HTML and included in their indexes. There is also less overhead on the client side (better performance), as no dynamic image scaling is required.
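As a concrete sketch of solution 2 (the file names and paths below are assumptions; the widths come from your four sizes): if the server already generates all four versions, a srcset lets the browser pick the appropriate one, and a descriptive file name plus alt text also covers the SEO points further down.
<img src="/images/products/red-running-shoe-600.jpg"
     srcset="/images/products/red-running-shoe-110.jpg 110w,
             /images/products/red-running-shoe-220.jpg 220w,
             /images/products/red-running-shoe-350.jpg 350w,
             /images/products/red-running-shoe-600.jpg 600w"
     sizes="(max-width: 600px) 50vw, 600px"
     alt="Red leather running shoe, side view"
     width="600" height="600">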
Solution 3: The big image could potentially slow down the download of your page (it depends on many factors), which could make your page score lower in search engine ranking algorithms. Also consider users who access your page from mobile devices; for them, download speed is very important.
For SEO, please also consider the following:
Include a meaningful subject in image alt text.
Image captions are important because they are one of the most well-read pieces of content.
Use file names containing relevant keywords.
More from a reputable website:
http://searchenginewatch.com/sew/opinion/2120682/ranking-image-search
I am trying to improve the overall loading performance (especially of images) of this website with the Google Chrome Lighthouse extension: https://muckenthaler.de/
When the performance test is finished I get a list of opportunities; please see this screenshot or test for yourself: https://capture.dropbox.com/YV5ii1vrj0xpfWwK
Under "Serve images in next-gen formats", some image URLs are listed (like this one: https://muckenthaler.de/media/image/54/7c/cb/Sliderbild_Produkte_Steh_SItz_Tische_2.jpg) that don't even appear on the specific page, but they somehow seem to get loaded into it and affect the performance.
How could I prevent this and why are these image resources loaded?
Here are the WebPageTest results for this page: https://www.webpagetest.org/result/220529_AiDcZP_759/1/details/#waterfall_view_step1
All you need to know from the waterfall view is that yellow rows indicate HTTP redirects and purple bars represent images. The longer the bar, the longer the resource took to load.
So we can tell a few things from this waterfall:
there are many redirects
there are many images
many images take a very long time to load, relative to other resources
When I look up https://muckenthaler.de/media/image/54/7c/cb/Sliderbild_Produkte_Steh_SItz_Tische_2.jpg in the response headers, I see at request 19 that there's a redirect to that image from https://muckenthaler.de/media/image/Sliderbild_Produkte_Steh_SItz_Tische_2.jpg.
Looking up that image in your source code, I see
<img src="https://muckenthaler.de/media/image/Sliderbild_Produkte_Steh_SItz_Tische_2.jpg"
alt=""
loading="eager">
Also note that this content is inside of something called <div class="hidden-elements">. These elements of class emotion--element are set to display: none so that the contents are not shown on screen, but loading="eager" on the images forces them to be loaded.
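For illustration only, here is a sketch of the pattern described (the class names and image URL are taken from the page above; the display: none is shown inline for brevity, and the real fix turned out to be removing the hidden elements, see below). With loading="eager" the browser downloads the image even though its container is hidden; loading="lazy" would generally defer the download, because a hidden image never comes into the viewport.
<div class="hidden-elements">
  <div class="emotion--element" style="display: none;">
    <!-- eager: forces the download despite the element being invisible -->
    <img src="https://muckenthaler.de/media/image/Sliderbild_Produkte_Steh_SItz_Tische_2.jpg"
         alt=""
         loading="eager">
    <!-- loading="lazy" would defer it until the element actually becomes visible -->
  </div>
</div>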
It seems like maybe your CMS (Shopware) is trying to eagerly preload images that will be used on other pages. That's not a terrible idea if you have a small number of lightweight images and users are very likely to navigate to those pages, but in this case it's loading dozens of images totalling over 30 MB. So definitely not recommended.
According to the CWV Tech Report, Shopware websites tend to only load 2 MB of images and have pretty good Core Web Vitals performance compared to other CMS and ecommerce platforms. That leads me to believe that there might be a misconfiguration on your end, or you may have installed a bad plugin.
First things first, a big thanks to Rick Viscomi for the research!
I found the answer, which is basically Shopware 5's hidden elements; they can be shown and then removed by clicking the number next to the chain icon.
I have a web application that needs to serve a large amount of small images per page (up to 100). I can use caching to reduce calls to the database/backend, but there is a noticeable impact from having to make so many separate requests for the images themselves, as the images take some time to request and render, especially on slower connections.
What good practices exist for serving several images on a page? I'm aware of using a CDN (e.g. S3 + CloudFront) to reduce bottlenecking on HTTP requests and serve content from a closer geographical location, as well as potentially loading images/content via Ajax only once they come into the user's view in the browser. Are there other techniques that might provide significant performance gains for image-heavy pages? It doesn't really matter whether they relate to hardware, frontend or something else.
Thanks.
Loading 100 images in one page request increases the page load time, as each image needs time to load in the browser.
A simple technique is to initially load only one default image: set the src of each of the 100 images to a common placeholder, and that single image won't take much time to load.
Once the page has loaded all of its content, load each real image with the help of jQuery.
Use the Lazy Load jQuery plugin to load all the images after the page has loaded,
like this:
<img class="lazy" src="default.jpg" data-original="img1.jpg" >
<img class="lazy" src="default.jpg" data-original="img2.jpg" >
<img class="lazy" src="default.jpg" data-original="img3.jpg" >
.......
<img class="lazy" src="default.jpg" data-original="img100.jpg">
and in your script use the following code:
$(document).ready(function(){
    $("img.lazy").lazyload();
});
You may also add an Expires header to each image, which allows the browser to cache them rather than requesting them again on the next page load.
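As a sketch of that caching idea (assuming, purely for illustration, a Node/Express static server; the same effect can be configured with Apache or Nginx expires settings):
// Serve the image directory with far-future caching so the browser
// reuses its cached copies instead of re-requesting each image.
const express = require('express');
const app = express();

app.use('/images', express.static('images', {
  maxAge: '30d', // sends Cache-Control: public, max-age=2592000
  setHeaders: (res) => {
    // Explicit Expires header, as suggested above
    res.setHeader('Expires', new Date(Date.now() + 30 * 24 * 3600 * 1000).toUTCString());
  }
}));

app.listen(3000);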
hope it will help you.
You can use a different domain for images; the browser will download these over separate connections from the ones used for the current domain.
You can also host your images on a web server optimized for serving static content; this will be faster than a dynamic server.
The above can be extended to several such domains: if the browser allows 4 parallel connections per domain, each domain you add gives you an additional 4 (which is also one of the benefits of using a CDN).
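A small sketch of the multi-domain idea (the shard hostnames are hypothetical and would all serve the same image store); picking the shard deterministically from the file name keeps each image at a stable URL so it remains cacheable:
// Hypothetical image shards, e.g. CNAMEs pointing at the same static server or CDN
const shards = ['img1.example.com', 'img2.example.com', 'img3.example.com', 'img4.example.com'];

function imageUrl(filename) {
  // Simple string hash so a given file always maps to the same hostname
  let hash = 0;
  for (const ch of filename) {
    hash = (hash * 31 + ch.charCodeAt(0)) >>> 0;
  }
  return 'https://' + shards[hash % shards.length] + '/thumbs/' + filename;
}

// Usage: build the <img> src with imageUrl('photo-001.jpg')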
Another common technique that may apply is the use of CSS sprites - if you have a bunch of images that are commonly used together, you can put them all in a single image and use CSS to only show the bits that are needed where they are needed.
You can always combine the images into a single image and use CSS to display only parts of it at a time (commonly called CSS sprites)
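A minimal sprite sketch (the file name, grid and sizes are hypothetical): all thumbnails are packed into one image, and each element shows a different region of it via background-position.
/* thumbs-sprite.png: thumbnails packed in a row, each 100x100px */
.thumb {
  width: 100px;
  height: 100px;
  background-image: url('thumbs-sprite.png');
}
.thumb-0 { background-position: 0 0; }
.thumb-1 { background-position: -100px 0; }
.thumb-2 { background-position: -200px 0; }

<!-- In the HTML: -->
<div class="thumb thumb-0"></div>
<div class="thumb thumb-1"></div>
<div class="thumb thumb-2"></div>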
Google also has a rather in depth article about how they implemented "Instant Previews" that covers some of the optimizations:
http://googlecode.blogspot.com/2010/11/instant-previews-under-hood.html?m=1
I have a website made to provide free web-based tools for making indie games. Currently, it only supports artists contributing to games. The features for helping artists consist of a set of artist community tools that allow artists to upload images based on a description; we then post each image on a gallery page. Other artists can upload their own images, and each image can have several revisions.
The way I chose to implement the image upload and display feature is by serializing each uploaded image to a byte array and storing it in the database. When I need to display an image in the UI, I just call a controller action I named "GetScaledGalleryImage" and pass in the image ID. That controller action takes the binary data from the database, converts it back into an image, and returns the requested image.
This works very well functionally, but the problem I realized later is that the Google crawler thinks all of my images are named "GetScaledGalleryImage", so if someone searches for "sylph" on Google Images, nothing comes up from my site, but if someone searches for site:watermintstudios.com getscaledgalleryimage, all of my images come up.
Here is an example of the URL that is being output in my HTML http://watermintstudios.com/EarnAMint/GetScaledMedia/68?scale=128
In the past, pre-MVC I would handle 404 errors and return content based on what was requested even if the page didn't actually exist. This would of course allow me to have the images pulled back by the image name (or description).
Is that the best way to do this? Or is there a better option? Something simpler would be better like if I could just do http://watermintstudios.com/EarnAMint/GetScaledMedia/Iris%20Doll?id=68&scale=128, but based on how google indexes images, would that give me what I need? Or do I need to provide image file extensions for maximum indexability?
Thanks all
It is important when doing search engine optimization to always use alt="this is a crazy robot" for your images. This will help the crawler identify them. Note: always use alt; don't always name your images "this is a crazy robot".
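A sketch combining both points (the route and file name are hypothetical; the "Iris Doll" title comes from the question): give the crawler a descriptive alt attribute, and ideally a URL that ends in a meaningful file name with an image extension rather than the action name.
<img src="http://watermintstudios.com/EarnAMint/Media/iris-doll-68-128.jpg"
     alt="Iris Doll character artwork, 128px thumbnail">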
I have developed PHP code to dynamically load the files contained in a directory into a gallery / slideshow. I have many (40 - 50) of these gallery web pages, which display images grouped by content. With hundreds of images, the dynamic gallery code allows me to add images to a directory without having to write code on each web page each time.
However, I've realized that these files will be invisible to search engines, since there isn't any HTML code to index on (e.g. the 'alt' tag). Does anyone have any suggestions on how to get these images indexed? Two ideas I've had:
1) Write a program to automatically generate a single web page for every jpeg file which will display the image when found with the search engine and contain a link to the gallery page where the user can see more content. The benefit to this method is not having to modify my live web pages. The downside is hundreds of additional files only to be found by a search engine.
2) Write a program to generate hidden links that can be pasted into my gallery html page - using the alt tag. The benefit to this method is that users would find my main gallery page with a search. The downside is having to cut and paste code to my live gallery web pages - defeating somewhat the purpose of a dynamic gallery.
I'm new at this, so any suggestions would be appreciated.
If I understand you correctly:
I would have one page that just lists the thumbnails, and then one page for each of the images that shows a bigger version of it along with all the metadata you have. It would be best if you also added a short, unique snippet of text to each image, describing what is in it.
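A sketch of what one of those per-image pages could contain (all names and paths are hypothetical): a descriptive title, the full-size image with alt text, the short unique snippet of text, and a link back to the gallery page.
<title>Sunset over the lake | Landscape gallery</title>
<h1>Sunset over the lake</h1>
<img src="/images/landscapes/sunset-over-lake.jpg"
     alt="Sunset over a calm lake, with pine trees in silhouette">
<p>Taken from the eastern shore in July; part of the landscape gallery.</p>
<a href="/gallery/landscapes.php">Back to the landscape gallery</a>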
I know this question might sound a little bit crazy, but I thought that maybe someone could come up with a smart idea:
Imagine you have 1000 thumbnail images on a single HTML page.
The image size is about 5-10 kb.
Is there a way to load all images in a single request? Somehow zip all images into a single file…
Or do you have any other suggestions in the subject?
Other options I already know of:
CSS sprites
Lazy load
Set Expires headers
Download images across different hostnames
There are only two other options I can think of given your situation:
Use the "data:" protocol and echo a base64 encoded version of your thumbnails directly into the HTML page. I would not recommend this since you cannot then cache those images on the users browser.
Use HTML5's Web Storage to store all the images as records, with the base64 encoded image data stored as blobs in a column. Once the database has downloaded to the user's machine, use JavaScript to loop through all the records and create the thumbnails on the page dynamically, using something like jQuery. With this option you would need to wait until the entire database has finished downloading in the end user's browser, and they will need a fairly modern browser.
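For illustration, the data: approach looks like this (the payload below is just a 1x1 transparent GIF standing in for a real thumbnail): the image bytes travel inside the HTML itself, so there is no extra request, but also no separate caching.
<img alt="thumbnail 1"
     src="data:image/gif;base64,R0lGODlhAQABAIAAAAAAAP///yH5BAEAAAAALAAAAAABAAEAAAIBRAA7">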
I think your best bet is a combination of lazy loading, caching with Expires headers, and serving images from multiple hostnames.
If the images can be grouped logically, CSS sprites may also work for you in addition to everything above. For example, if your thumbnails are for images uploaded on a certain day, you may be able to create a single sprite file for each day, which could then be cached in the user's browser.
This is done by using what's called a CSS sprite: a single image with all the other images inside it, where the particular part that's wanted is selected in the HTML by CSS.
See one tutorial at http://css-tricks.com/css-sprites
It sounds like you want something like SPDY's server push. When the client requests the HTML page (or the first image), SPDY allows the server to push the other resources without waiting for more requests.
Of course, this is still experimental.
You could try the montage command of ImageMagick to create a single image.