I am benchmarking a custom browser and want to measure the rendering speed of different image formats (GIF, JPG, PNG) at the same file size, to see which format this browser renders the fastest.
My process was just to have a simple separate HTML page for each type of image, and to use a JavaScript counter before and after the image is rendered to measure the browser's rendering speed for that specific image.
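For reference, a minimal sketch of what I mean (the file name is a placeholder, and onload is only a rough proxy for "rendered"):

```html
<script>
  var start = Date.now(); // counter before the image
</script>
<img src="test.png"
     onload="document.title = (Date.now() - start) + ' ms';">
```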
Any thoughts on this process? Any thoughts on how to improve it?
Well, it's difficult to get meaningful generic results that way. You're measuring a combination of loading HTML, JavaScript, and an image. Depending on where you're loading them from, you're also measuring the disk or network cache. The image rendering code will have some startup time, depends on a memory allocator, and is possibly subject to garbage collection. Then there are image size, color depth, amount of compression, number of images on the page, scaling, the influence of style sheets, and the resolution of the JavaScript timer. Oh, and are you rendering to a visible part of a window, in a layer, or off-screen?
But don't worry: you'll be able to come up with a test that is usable for your specific situation. Or the differences might even turn out to be very clear.
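As a starting point, here is a minimal sketch of one way to reduce the noise: time many loads of the same image and compare medians across formats. The file name and run count are placeholders, and onload still measures fetch plus decode rather than actual paint:

```html
<script>
  var times = [], runs = 20;
  function run(i) {
    if (i === runs) {
      times.sort(function (a, b) { return a - b; });
      console.log('median: ' + times[Math.floor(runs / 2)] + ' ms');
      return;
    }
    var img = new Image();
    var start = Date.now();
    img.onload = function () {
      times.push(Date.now() - start);
      run(i + 1);
    };
    // cache-buster so every run includes the same fetch cost
    img.src = 'test.png?nocache=' + Math.random();
  }
  run(0);
</script>
```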
The Firefox Firebug plugin YSlow is pretty good.
Recently I made a website for my photography: http://www.simotamas.com
I am a newbie, so it's not the best site, but it works fine for me. I have only one problem: when the site is loaded on a device for the first time, the gallery takes up to 1-2 seconds to load.
Could you guys please check whether I messed up something in the code?
Or should I make the pictures even smaller?
Is there any way I could improve the loading performance?
I would be really thankful for any advice.
Some points you can consider:
- Use low-resolution thumbnails for the preview, and load the actual image on click.
- Load the images in the visible part of the page first, then load the ones further down; see the lazy-loading sketch after this list. (This may affect the user experience.)
- If you have the CPU power, use libraries such as caching or compression tools like https://nielse63.github.io/php-image-cache/ , and benchmark them carefully.
- Use gzip compression on your server if you are not already doing so.
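For the second point, a minimal lazy-loading sketch using IntersectionObserver; the class name and data-src attribute are placeholders, not anything your site already uses:

```html
<img data-src="photo-1.jpg" class="lazy" alt="">
<script>
  // Swap in the real source once a thumbnail scrolls into view.
  var observer = new IntersectionObserver(function (entries) {
    entries.forEach(function (entry) {
      if (entry.isIntersecting) {
        entry.target.src = entry.target.dataset.src;
        observer.unobserve(entry.target);
      }
    });
  });
  document.querySelectorAll('img.lazy').forEach(function (img) {
    observer.observe(img);
  });
</script>
```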
The fact that your website doesn't wait for the images to load is considered a plus (look into asynchronous web page content loading for a good read). That said, you should compress your images before uploading them; tinypng.com is a nice tool for that. But since it's a photography website, doing so could reduce picture quality, so try playing with Photoshop's save settings to find your ideal compromise between visual quality and file size. Pictures are heavy: high definition and resolution will obviously result in heavier files to download.
Update: another thing you could do is display a (smaller) thumbnail and only load the full picture on request, i.e. the user clicks and the image opens in a new tab.
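In markup terms that is just the following, with placeholder file names:

```html
<a href="photo-full.jpg" target="_blank">
  <img src="photo-thumb.jpg" alt="Preview">
</a>
```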
It would help if you created smaller thumbnail versions of your images, so the browser can initially load those for the overview and doesn't have to scale far-too-big images down while rendering the page. An image should always be downloaded at the dimensions it is going to be presented in.
So far I have figured out that AMP HTML does a lot to speed up JavaScript, CSS, and, generally speaking, the rendering of the page.
However, another important speed topic in my experience is images.
How does AMP HTML tackle this issue, so that images are properly compressed and resized for the current viewing device (tablet, mobile) and bandwidth (WLAN, 3G, EDGE)?
AMP's image element amp-img supports the srcset attribute (including support for the w descriptor, which is not natively available in Safari) and the sizes attribute in all browsers, so you can use modern responsive techniques to select the right image.
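For example (file names and dimensions are placeholders; amp-img requires explicit width and height so the runtime can reserve the layout space):

```html
<amp-img src="photo-800.jpg"
         srcset="photo-1600.jpg 1600w,
                 photo-800.jpg 800w,
                 photo-400.jpg 400w"
         sizes="(max-width: 600px) 100vw, 600px"
         width="800" height="600"
         layout="responsive"
         alt="A placeholder photo"></amp-img>
```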
For now, AMP itself does not do any image optimization. We might potentially start adding srcset attributes on the proxy layer to images that don't have one, but for now this is not happening.
Brief mention on this GitHub page:
The AMP HTML runtime can effectively manage image resources, choosing to delay or prioritize resource loading based on the viewport position, system resources, connection bandwidth, or other factors.
Basically the JavaScript library will be smarter about when it asks for image resources, which is a step up from what browsers do now (i.e. load all images in the background). This makes more efficient use of the available bandwidth.
My website for architectural visualization: http://www.greenshell3d.com
I noticed on the network tab (in incognito mode) that it takes 15 seconds or so to load the above-the-fold content, most notably the image slideshow.
Some of the images in the slideshow load at the very end instead of the beginning of the website load process. Now I understand the browser handles this order, but perhaps there is another way. As it stands, the bounce-rate is too high and I expect it is because of load time.
I've seen a jQuery snippet on GitHub that allows one to control the order of image loads - do you think this is a good option? I'd be glad to hear any opinions before investing the time to fix this.
Any ideas? Thanks!
You said you are interested in any opinions as well, so first some general thoughts: there is no page fold. The web we produce content for exists on so many different screen sizes and resolutions that it's impossible to say "the fold is below this big image!". Yes, Google changed the PageSpeed Insights tool to make people load the stuff at the top of the page first, but I think their wording there is really bad.
Now to your image loading issue:
The first thing I would recommend is to reduce the size of all the images. They seem to be around 280-300 KB each, and you have quite a few of them. Since there is a translucent overlay over them anyway, you can probably get away with reducing the image quality without people noticing (they don't see the images directly). Play around with the quality settings here.
I would then look into optimizing the slider code to load the first image first, and then load the rest of the page and the other images asynchronously after that. Another trick could be to increase the fade time from the first slide to the next, so the slider doesn't change if the next image isn't ready yet. You said you found a jQuery script to implement that; that's where I'd start.
As a general guideline: the position of requests in the source code usually determines the load order of things on the page. If your images are requested by JavaScript at the end of the page, that leads to the images being loaded later than you want them to be.
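A rough sketch of the "first image first" idea; the selector and file names are placeholders, since I don't know your slider's markup:

```html
<script>
  // Once the first slide's image is in, warm the cache for the rest.
  var rest = ['slide-2.jpg', 'slide-3.jpg', 'slide-4.jpg'];
  document.querySelector('.slider img').addEventListener('load', function () {
    rest.forEach(function (src) {
      new Image().src = src; // fires the request; result lands in the cache
    });
  });
</script>
```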
My website seems to have this "squeegee"-type load effect, where all of the graphics load from the top down with an ugly top-to-bottom wiping effect. Is there a way to make the way the website renders its graphics prettier?
I'd be more interested in why your page is taking so long to load and/or render. If it takes several seconds to draw, even on a fast connection, you might want to look into why that is. Tools such as Fiddler, Firebug, and the IE Developer Tools can help you see which resources your page is downloading and how big each resource is.
If you have massive resources on the page (such as BMP or PNG files that are several hundred K), see if you can convert them to other formats or resize them on the server to the size they render at.
If your HTML is massively complex, such as huge nested tables, you might want to look into simplifying that with more modern HTML and CSS styling.
If you do have huge, high-res bitmaps that need to be loaded, you might want to preload them with script and then render them dynamically when they finish loading.
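For instance, a minimal sketch of that preloading approach, with a placeholder image URL and container id:

```html
<script>
  // Fetch the big image up front; attach it only once it's ready.
  var img = new Image();
  img.onload = function () {
    document.getElementById('hero').appendChild(img);
  };
  img.src = 'big-photo.jpg';
</script>
```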
In our web application, the users need to review a large number of images. This is my current layout. 20 images will be displayed at a time, with a pagination bar above the thumbnails. Clicking a thumbnail will show the enlarged image to the left. The enlarged image will follow the scrollbar so it's always visible. Quite simple actually.
I was wondering what the best interface would be in this scenario:
One option is to implement an infinite scroll script which will lazy load thumbnails as the user scrolls. The thumbnails not visible will be removed from the DOM. But my concern with this approach is the number of changes in the DOM slowing down the page.
Another option could be something like Google's Fastflip.
What do you think is the best approach for this application? Radical ideas welcomed.
I think the question you have to ask is: what action is the user supposed to take? What's the purpose of the site?
If "review images" entails rating every image, I'd rather go with a Fastflip approach where the focus is on the single image. A thumbnail gallery will distract from the desired action and might result in a smaller amount of pics rated/reviewed.
If the focus should rather be on comparing a given image against others, I'd say try the gallery approach, although I wouldn't implement an infinite scroll with thumbnails, because users can quickly get lost in the abundance of choices. I think standard pagination (whether static or ajaxified) would be better if you choose to go this route.
Just my 2c.
If you paginate thumbnails, you can pre-generate a single image containing all thumbnails for each page, then use an image map to handle mouseover text and clicking. This will reduce the number of HTTP requests and possibly lead to fewer bytes sent. The separation distance between images should be minimized for this to be most efficient. This would have some disadvantages.
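A sketch of the image-map idea; coordinates, file names, and link targets are placeholders:

```html
<!-- One pre-generated sprite sheet per page of thumbnails. -->
<img src="thumbs-page-1.jpg" usemap="#thumbs" alt="Thumbnails, page 1">
<map name="thumbs">
  <area shape="rect" coords="0,0,150,150"
        href="image-1.html" alt="Image 1" title="Image 1">
  <area shape="rect" coords="150,0,300,150"
        href="image-2.html" alt="Image 2" title="Image 2">
</map>
```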
To reduce image download size at the expense of preprocessing, you can try to save each image in the format (PNG or JPG) most efficient for its contents using an algorithm like the one in ImageGuide. Similarly, if the images are poorly compressed (like JPEGs from a cell phone camera), they can be recompressed at the cost of some quality.
Once the site has some testers, you can analyze patterns in which images tend to be clicked (if a pattern exists) and preload the full-size images, or even pre-load all of them once the thumbs are loaded.
You might play with JPEG2000 images (you did say "radical ideas welcomed"), which thumb very easily, because the thumbnail and main image needn't be sent as if they are separate files. This is an advantage of the compression format -- it isn't the same as the hack of telling the browser to resize the full size image to represent its own thumbnail.
You can take a look at Google's WebP image format.
On the server side, consider a separate image server optimized for static content delivery, perhaps using Nginx or the Tux web server.
I would show the thumbnails, since the user might want to skip some of the pictures. I would also stay away from pagination of the form
<<first <previous n of x next> last>>
and go for something easier to implement and more efficient: a
load x more pictures
button. No infinite scroll whatsoever, and, why not, even no scroll at all: just "load x more" / "show previous x".
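A minimal sketch of such a button; the endpoint and element ids are placeholders, and it assumes the server returns a ready-made HTML fragment per page:

```html
<div id="gallery"></div>
<button id="load-more">Load 20 more pictures</button>
<script>
  var page = 0;
  document.getElementById('load-more').addEventListener('click', function () {
    // Fetch the next page of thumbnails and append them to the gallery.
    fetch('/thumbnails?page=' + (++page))
      .then(function (res) { return res.text(); })
      .then(function (html) {
        document.getElementById('gallery')
                .insertAdjacentHTML('beforeend', html);
      });
  });
</script>
```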
Although this answer might be a bit unradical and boring, I'd go with exactly your suggestion of asynchronously loading the thumbnails (and of course the main picture) as they come into view. A similar technique is used by Google+ in the pane for adding people to circles. This way, you spend server resources and bandwidth only on the pictures the client actually needs. And as Google+ shows, the operations on the DOM tree are fast enough and won't slow down a computer from the past few years.
You might also pre-build a few rows of the thumbnail table ahead of time with a dummy image (e.g. an animated "loading circle" GIF) and swap in each real thumbnail as it arrives. That way, the table in view is already built and doesn't need to be re-rendered, as the flow elements following the table would have to be if no images were in there during scrolling.
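A sketch of that swap; the class name and data-src attribute are placeholders:

```html
<img src="loading.gif" class="pending" data-src="thumb-42.jpg" alt="">
<script>
  // Replace each placeholder once its real thumbnail has downloaded.
  document.querySelectorAll('img.pending').forEach(function (img) {
    var real = new Image();
    real.onload = function () { img.src = real.src; };
    real.src = img.dataset.src;
  });
</script>
```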
Instead of paginating the thumbnails (as suggested by your layout scheme), you could also think about letting users filter the images by tag, theme, category, size or any other way to find their results faster.