Why is Chrome so slow when loading many (≈130) PNG images?

I'm making a page/application that presents a wide range of products. One view contains a lineup of about 130 products, each represented by a PNG image; the sizes vary from 33 KB to 150 KB.
The lineup can be scrolled horizontally with the user's scrollbar or a custom controller, and when you hover over a product I use some CSS transitions to fade out all the other products and to enlarge the hovered one.
It works perfectly smoothly in Safari, decently in Firefox, and in Chrome as long as I keep the image count down. But the more images I add, the slower both the scrolling and the transitions get, until it's almost impossible to work with.
Is this some caching problem in Chrome? Is there any way around it?
I've tried preloading the images, but the problem isn't the loading time of the images; it's the rendering performance that seems to stall due to the sheer number of images.

You can combine all your small images into one big sprite image and load ONLY the big image, so you make only ONE HTTP request. For display you then set the offset (the background-position property) of each image using CSS.
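A minimal sketch of the sprite technique (the file name, class names and offsets are invented for illustration; each product tile here is assumed to be 200 x 200):

.product {
  width: 200px;
  height: 200px;
  background-image: url(products-sprite.png); /* the one big image */
}
/* shift the sprite so each product shows only its own tile */
.product-1 { background-position: 0 0; }
.product-2 { background-position: -200px 0; }
.product-3 { background-position: -400px 0; }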

Related

What is the best way to display and apply filters to RAW images on macOS?

I am creating a simple photo catalogue application for macOS to see whether the latest APIs can significantly improve performance of loading directories with large numbers of images.
So far it looks pretty promising: loading around 600 thumbnails from 45 MB RAW images using QLThumbnailGenerator and CGImageSourceCreateWithURL is super fast, allowing thumbnail images and image metadata to be displayed almost instantly.
Displaying these images in an NSCollectionView, using a CALayer in the NSCollectionViewItem's view, also appears to be extremely fast, and scrolling is very smooth.
I did find that QLThumbnailGenerator seems to start failing after a few hundred images, returning error code 108, if I call the API in a continuous loop - I fixed that by calling CGImageSourceCopyPropertiesAtIndex immediately after the thumbnail generator API call - so maybe there is a timing issue, or not enough file handles, or something if the API is called too quickly and for too long.
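For illustration, a minimal sketch of that call pattern (the function name, sizes and scale are placeholders of mine):

import QuickLookThumbnailing
import ImageIO

func loadThumbnail(for url: URL) {
    // Request a small thumbnail; size and scale are arbitrary here.
    let request = QLThumbnailGenerator.Request(fileAt: url,
                                               size: CGSize(width: 128, height: 128),
                                               scale: 2.0,
                                               representationTypes: .thumbnail)
    QLThumbnailGenerator.shared.generateBestRepresentation(for: request) { thumbnail, error in
        // hand thumbnail?.cgImage to the collection view item here
    }
    // Reading the metadata immediately after the request is the workaround
    // that stopped the error 108 failures on long runs.
    if let source = CGImageSourceCreateWithURL(url as CFURL, nil) {
        _ = CGImageSourceCopyPropertiesAtIndex(source, 0, nil)
    }
}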
However, I am still having trouble rendering a full-sized image to the display - here I am using an NSScrollView with a layer-backed NSView as the documentView. Everything is super fast until the following call:
view.layer.contents = cgImage
And at this point the entire main thread hangs until the image has loaded - and this may take a few seconds.
Once it has loaded it's fine and zooming in and out by changing the documentView frame size is very fast - scrolling around the full size image is also super smooth without any of the typical hiccups.
Is there a way of loading these images without causing the UI to freeze?
I've seen the recent WWDC2020 session where they demonstrate similar scrolling of large numbers of images but I haven't been able to find anything useful on loading large images other than CATiledLayer - but it's not really clear if that is the right answer for this problem.
The old Apple sample RawExpose seemed to be an option, but most of that code is deprecated, and it seems one has to use MetalKit instead of GLKit - unfortunately there is no example of using MetalKit with Core Image that I can find.
FYI - I tried some of the new SwiftUI collection views and List, but they seem to be significantly slower than AppKit, and I found that some of the collection view items never render - of course these could just be bugs in the macOS 11 beta.
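(For reference, one general technique for the hang at view.layer.contents = cgImage is to force the decode on a background queue before the image ever reaches the layer. A generic sketch, not necessarily the right fix for huge RAW files - the function name is invented:)

import Foundation
import ImageIO

func loadDecodedImage(from url: URL, completion: @escaping (CGImage?) -> Void) {
    DispatchQueue.global(qos: .userInitiated).async {
        // kCGImageSourceShouldCacheImmediately forces the decode to happen
        // here, on the background queue, rather than lazily on the main thread.
        let options = [kCGImageSourceShouldCacheImmediately: true] as CFDictionary
        let source = CGImageSourceCreateWithURL(url as CFURL, nil)
        let image = source.flatMap { CGImageSourceCreateImageAtIndex($0, 0, options) }
        DispatchQueue.main.async { completion(image) }
    }
}

The completion handler can then assign view.layer.contents on the main thread without stalling it.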
OK - well, I finally figured it out, and it's complicated but simple. It's complicated because there are so many options to choose from and so many outdated sample apps to look at. In any event, I think I have solved most, if not all, of the issues related to using Metal-backed CALayers and rendering real-time updates of the images as CIFilter adjustments are applied. There are many pieces to the puzzle, and I'm happy to share if anyone is looking for help.
Some key pointers:
I am using CAMetalLayer and NSView
I override the CAMetalLayer.display(layer:) method and call layer.setNeedsDisplay() when the user moves an adjustment slider.
I chain together all the CIFilters, including the RAW filter created with CIFilter(imageURL:options:).
Most importantly, I use the RAW filter's scaleFactor parameter to size the image - I encountered major performance issues using any other method to resize the image for the view's size (see the sketch below).
Don't expect high performance if the image is zoomed right in - 50% seems to be the limit for 45-megapixel RAW images from a Nikon D850.
A short video of the result is here https://youtu.be/5wp0CIWAoIM
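A minimal sketch of that scale-factor point, using the newer CIRAWFilter API (macOS 12+) for illustration; the thread's CIFilter(imageURL:options:) variant takes the same scale via kCIInputScaleFactorKey:

import CoreImage

// Decode a RAW file at a reduced scale. Decoding at the target scale is
// far cheaper than decoding at full resolution and resizing afterwards.
func rawImage(at url: URL, scale: Float) -> CIImage? {
    guard let rawFilter = CIRAWFilter(imageURL: url) else { return nil }
    rawFilter.scaleFactor = scale
    return rawFilter.outputImage
}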

Site speed image caching for image in pop-up

I'm creating a website that loads some images on first load; then, if a user clicks on one of the images, the same image opens bigger in a popup (a lightbox).
My question is: is it better to just reuse the same large image and resize its dimensions, so the user has it cached already, or is it better to first load a smaller thumbnail and then the bigger image once the pop-up opens?
I'm trying to reduce load time as much as possible, as there are a lot of images.
I'm using Masonry for the site and Magnific Popup for the image expand, if that helps.
The thumbnail image size is around 100 KB, whereas the larger image is between 200 and 300 KB.
The main idea of any lightbox-type script is that it lets you display preview images (thumbnails) on your page and load the larger versions only when needed (e.g., when the user clicks a thumbnail). This greatly helps to reduce page weight and load time. Modern scripts (like fancybox) can display the preview image while the larger version gradually appears over it, relieving users from staring at a blank screen.
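With Magnific Popup, which you're already using, the pattern is just a thumbnail whose link points at the large file; the large image is only requested when the popup opens (file names are placeholders, and jQuery plus the Magnific Popup script are assumed to be on the page):

<a class="popup-link" href="photo-large.jpg"><img src="photo-thumb.jpg" alt=""></a>
<script>
  // Tell Magnific Popup to treat the link target as an image.
  $('.popup-link').magnificPopup({ type: 'image' });
</script>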
image = "Your image link / location here"/ZoomService
image.Zoom(100)
--if you use MilkWar web coding use that

Large images don't render in Chrome?

Very large images will not render in Google Chrome (although the scrollbars will still behave as if the image is present). The same images will often render just fine in other browsers.
Here are two sample images. If you're using Google Chrome, you won't see the long red bar:
Short Blue
http://i.stack.imgur.com/ApGfg.png
Long Red
http://i.stack.imgur.com/J2eRf.png
As you can see, the browser thinks the longer image is there, but it simply doesn't render. The image format doesn't seem to matter either: I've tried both PNGs and JPEGs. I've also tested this on two different machines running different operating systems (Windows and OSX). This is obviously a bug, but can anyone think of a workaround that would force Chrome to render large images?
Not that anyone cares or is even looking at this post, but I did find an odd workaround. The problem seems to be with the way Chrome handles zooming. If you set the zoom property to 98.6% or lower, or to 102.6% or higher, the image will render (any value between 98.6% and 102.6% causes the rendering to fail). Note that the zoom property is not officially defined in CSS, so some browsers may ignore it (which is a good thing in this case, since this is a browser-specific hack). As long as you don't mind the image being resized slightly, this may be the best fix.
In short, the following code produces the desired result, as shown here:
<img style="zoom:98.6%" src="http://i.stack.imgur.com/J2eRf.png">
Update:
Actually, this is a good opportunity to kill two birds with one stone. As screens move to higher resolutions (e.g. the Apple Retina display), web developers will want to start serving up images that are twice as large and then scaling them down by 50%, as suggested here. So, instead of using the zoom property as suggested above, you could simply double the size of the image and render it at half the size:
<img style="width:50%;height:50%;" src="http://i.stack.imgur.com/J2eRf.png">
Not only will this solve your rendering problem in Chrome, but it will make the image look nice and crisp on the next generation of high-resolution displays.

Optimize display of a large number of images (1000+) for performance and ease of use in a web application

In our web application, users need to review a large number of images. This is my current layout: 20 images are displayed at a time, with a pagination bar above the thumbnails. Clicking a thumbnail shows the enlarged image to the left. The enlarged image follows the scrollbar so it's always visible. Quite simple, actually.
I was wondering what the best interface would be in this scenario:
One option is to implement an infinite-scroll script which lazy-loads thumbnails as the user scrolls, removing thumbnails that are no longer visible from the DOM. But my concern with this approach is that the number of changes to the DOM would slow down the page.
Another option could be something like Google's Fastflip.
What do you think is the best approach for this application? Radical ideas welcomed.
I think the question you have to ask is: what action is the user supposed to take? What's the purpose of the site?
If "review images" entails rating every image, I'd rather go with a Fastflip-like approach where the focus is on a single image. A thumbnail gallery will distract from the desired action and might result in a smaller number of pics rated/reviewed.
If the focus should rather be on comparing a given image against others, I'd say try the gallery approach, although I wouldn't implement infinite scroll with thumbnails, because the user can quickly get lost in the abundance of choices. I think standard pagination (whether static or Ajaxified) would be better if you choose to go this route.
Just my 2c.
If you paginate thumbnails, you can pre-generate a single image containing all the thumbnails for each page, then use an image map to handle mouseover text and clicking. This reduces the number of HTTP requests and can lead to fewer bytes sent. The separation distance between images should be minimized for this to be most efficient. It does have some disadvantages, though.
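A rough sketch of the image-map idea (file names and coordinates are invented; each thumbnail here is assumed to be 100 x 100):

<!-- One composite image per page of thumbnails, standing in for 20 separate <img> tags. -->
<img src="thumbs-page1.png" usemap="#thumbs" alt="Thumbnails, page 1">
<map name="thumbs">
  <area shape="rect" coords="0,0,100,100" href="image1.html" title="Image 1" alt="Image 1">
  <area shape="rect" coords="100,0,200,100" href="image2.html" title="Image 2" alt="Image 2">
</map>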
To reduce image download size at the expense of preprocessing, you can try to save each image in the format (PNG or JPG) most efficient for its contents using an algorithm like the one in ImageGuide. Similarly, if the images are poorly compressed (like JPEGs from a cell phone camera), they can be recompressed at the cost of some quality.
Once the site has some testers, you can analyze patterns in which images tend to be clicked (if a pattern exists) and preload the full-size images, or even pre-load all of them once the thumbs are loaded.
You might play with JPEG2000 images (you did say "radical ideas welcomed"), which lend themselves to thumbnailing very easily, because the thumbnail and the main image needn't be sent as separate files. This is an advantage of the compression format itself -- it isn't the same as the hack of telling the browser to resize the full-size image to serve as its own thumbnail.
You can take a look at Google's WebP image format.
On the server side, consider a separate image server optimized for static content delivery, perhaps using NginX or the Tux web server.
I would show the thumbnails, since the user might want to skip some of the pictures. I would also stay away from pagination of the form
<<first <previous n of x next> last>>
and go for something easier to implement and more efficient: a
load x more pictures.
No infinite scroll whatsoever and, why not, even no scroll at all. Just "load x more" / "previous x".
Although this answer might be a bit unradical and boring, I'd go with exactly your suggestion of asynchronously loading the thumbnails (and of course the main picture) as they come into view. A similar technique is used by Google+ in the pane for adding people to circles. This way, you spend server resources and bandwidth only on the pictures the client actually needs. As Google+ shows, the operations on the DOM tree are fast enough and don't bog down even a several-year-old computer.
You might also prebuild a few rows of the thumbnail table ahead of time with a dummy image (e.g. a "loading circle" animated GIF) and replace the images as they load. That way, the table in view is already built and does not need to be re-rendered, as the flow elements following the table otherwise would be if no images were in it during scrolling.
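A minimal sketch of that swap, assuming each placeholder <img> carries its real URL in a data-src attribute (file and attribute names are invented):

<img src="spinner.gif" data-src="thumb-001.jpg" alt="">
<script>
  // Load each real thumbnail off-screen, then swap it in; the table keeps
  // its layout because the <img> elements are already in place.
  var imgs = document.querySelectorAll('img[data-src]');
  for (var i = 0; i < imgs.length; i++) {
    (function (img) {
      var real = new Image();
      real.onload = function () { img.src = real.src; };
      real.src = img.getAttribute('data-src');
    })(imgs[i]);
  }
</script>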
Instead of paginating the thumbnails (as suggested by your layout scheme), you could also think about letting users filter the images by tag, theme, category, size or any other way to find their results faster.

Increase page loading speed

In my app, I have a panorama page which contains around 10 panorama items. Each panorama item has some path drawings, a list picker and a few input fields. The problem I'm facing is that whenever I navigate to this page, the navigation is very slow because of the amount of content to initialize. If I comment out InitializeComponent(), the loading becomes fast. I thought of adding the XAML content in code, but the problem is that I have to access the input fields by their names in code, so that didn't work. Any idea how I can speed up navigation to this page? Thanks.
From the UI Guide:
Use either a single color background or an image that spans the entire panorama. If you decide to use an image, any UI image type that is supported by Silverlight is acceptable, but JPEGs are recommended, as they generally have smaller file sizes than other formats.
You can use multiple images as a background, but you should note that only one image should be displayed at any given time.
Background images should be between 480 x 800 pixels and 1024 x 800 pixels (width x height) to ensure good performance, minimal load time, and no scaling.
Consider hiding panorama sections until they have content to display.
Also, 10 PanoramaItems seems like a lot since the recommended maximum is 4. You should either cut down on the number, or hide the content until it's required. Have a read of the best practice guide for Panoramas on MSDN.
I think you could improve page performance by creating UserControls for the individual panorama items, adding an empty Panorama control to your page (with only the headers) and, as picypg suggests, loading these UserControls when they are needed.
Another option could be to load the first panorama item and show the page to the user right away, then start loading the other panorama items in the background.
My suggested approach would be the first one, using the lazy-loading principle; a sketch follows below.
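A minimal sketch of that lazy-loading idea ("MyPanorama" is the panorama's x:Name and BuildItemContent is a hypothetical helper that constructs the heavy controls):

// Declare only the panorama headers in XAML; fill in the heavy content
// after the page has rendered, so navigation completes immediately.
public MainPage()
{
    InitializeComponent();
    Loaded += (s, e) => Dispatcher.BeginInvoke(() =>
    {
        foreach (PanoramaItem item in MyPanorama.Items)
        {
            item.Content = BuildItemContent(item); // hypothetical helper
        }
    });
}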
I would assume that your delays are due to the number of items on the page. This leads to a very large object graph which takes a long time to create. I'd also expect it's using lots of memory, and you have a very high fill rate, which slows down the GPU.
Having input items/fields on PanoramaItems can cause UX issues if you're not careful.
That many PanoramaItems could also cause navigation issues for the user.
