I have a set of images (URL addresses). Some of them are an acceptable size, but some are too big. When I load them, the app crashes because of the big images.
Is there a way to get the image size (in megabytes) before loading it? That way I can remove the large images from the list and avoid loading them.
If the server where the images are located supports it, you can issue a HEAD request. This can be done in Qt using e.g. QNetworkAccessManager::head.
It should also be possible in QML/JavaScript; see "HTTP HEAD Request in Javascript/Ajax?".
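For the QML/JavaScript route, a minimal sketch of the idea, assuming the server answers HEAD requests and reports a Content-Length header (the URL and the 5 MB threshold are placeholders):

```javascript
// Ask only for the headers and read the size the server reports.
async function imageSizeInBytes(url) {
  const response = await fetch(url, { method: 'HEAD' });
  const length = response.headers.get('Content-Length');
  return length !== null ? Number(length) : null; // null if the header is missing
}

// Usage: drop anything larger than ~5 MB before actually downloading it.
imageSizeInBytes('https://example.com/photo.jpg').then((bytes) => {
  if (bytes !== null && bytes > 5 * 1024 * 1024) {
    console.log('Too big, skipping');
  }
});
```

Note that in a browser context this is subject to CORS, and some servers simply don't send Content-Length, so treat a missing value as "unknown" rather than "small".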
I am trying to optimise my website so that I can spend less money on the database, specifically on images. I use Supabase as my DB and storage.
Right now I use the URLs of the images in storage directly, but I am not sure whether it would be better to download the images and then display them, since they would then be in the cache, whereas with the URL they wouldn't, right? (To be clear, there are two separate Supabase functions for this: one for getting the URL and one for downloading.)
I have noticed that when using the URL the first render is much slower and subsequent renders are way faster, but I am unable to find the images in my cache, which is why I was wondering which option is better.
Are there any best practices on this topic?
Thanks!!
If you are trying to save space, you should use the URL, because it saves space on storage; downloading the image, on the other hand, results in faster rendering.
Saving space -> Slower Rendering -> URL
Using more storage -> Faster Rendering -> Downloaded IMG
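For reference, the two Supabase functions the question mentions look roughly like this with the supabase-js v2 client (the project URL, key, bucket, file path and element are placeholders):

```javascript
import { createClient } from '@supabase/supabase-js';

const supabase = createClient('https://your-project.supabase.co', 'public-anon-key');
const imgElement = document.querySelector('#photo');

// Option 1: get a public URL and let the <img> element fetch it.
// The browser's HTTP cache handles reuse, subject to the response's cache headers.
const { data: urlData } = supabase.storage.from('images').getPublicUrl('photos/cat.jpg');
imgElement.src = urlData.publicUrl;

// Option 2: download the file as a Blob and display it via an object URL.
// The Blob lives in page memory, not in the browser's HTTP cache.
const { data: blob, error } = await supabase.storage.from('images').download('photos/cat.jpg');
if (!error) {
  imgElement.src = URL.createObjectURL(blob);
}
```

Option 1 defers the transfer entirely to the browser; option 2 pulls the bytes through your own code first.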
I'm making a browser extension, and I want to get the file size of the images on a page. I searched a bit and found this. However, I think this approach would use a lot of resources if I want to check every image, even though the request is only a HEAD request. Another concern is that the server may be using gzip or some other compression, so the size reported by Content-Length might not match the actual file size. I want to know whether it is possible to get the file size without making a request to the server.
Can I get the file size by accessing devtools?
Is it possible to copy the image to a canvas and find out its file size?
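One possibility worth noting (not part of the original question): for images the page has already fetched, the Resource Timing API reports their sizes without any additional request, and it distinguishes the compressed transfer size from the decoded size:

```javascript
// Inspect resources the page has already loaded; no extra network traffic.
performance.getEntriesByType('resource')
  .filter((entry) => entry.initiatorType === 'img')
  .forEach((entry) => {
    console.log(
      entry.name,            // the image URL
      entry.transferSize,    // bytes over the wire, including headers
      entry.encodedBodySize, // body as sent (e.g. gzip-compressed)
      entry.decodedBodySize  // body after decompression
    );
  });
```

Cross-origin entries report these sizes as 0 unless the server sends a Timing-Allow-Origin header, so this only partially sidesteps the problem.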
For each record I have two images, a small one and a large one. What is the best way to show the appropriately sized image according to the size of the screen?
I know I can use CSS and a media query to show the small or large image by toggling a div between display: none and display: block, but as I understand it the user then downloads both images even though only one is displayed. Is that correct? If so, the page load increases. What is the best way to handle this?
For this problem the most efficient way is to use the User-Agent to detect whether the client is a desktop, tablet or mobile device.
I prefer to use Mobile_Detect or jenssegers/agent (based on Mobile_Detect).
Edit
Recently I installed the Google PageSpeed Module on my server. It has a filter, resize_rendered_image_dimensions, that serves images at the dimensions actually rendered in the client's viewport (exactly what you are looking for).
Another implementation would be lazy loading with JavaScript, adding the viewport dimensions to the requested URL so the server can return an appropriately sized image (sketched below).
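A rough sketch of that lazy-loading idea: images start out with a data-src attribute instead of src, and when they scroll near the viewport the real URL is requested with the viewport width appended, so the server (or an image CDN) can return a suitably sized version. The ?w= query parameter is hypothetical and depends on whatever resizing endpoint you have.

```javascript
// Load each image only when it approaches the viewport, and tell the server
// how wide the client's viewport is so it can pick an appropriate size.
const observer = new IntersectionObserver((entries, obs) => {
  entries.forEach((entry) => {
    if (entry.isIntersecting) {
      const img = entry.target;
      img.src = `${img.dataset.src}?w=${window.innerWidth}`;
      obs.unobserve(img);
    }
  });
});

document.querySelectorAll('img[data-src]').forEach((img) => observer.observe(img));
```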
We would like to display very large (50 MB plus) images in Internet Explorer. We would like to avoid compression, as compression algorithms are not what CSI would have us believe they are and the resulting files are too lossy.
As a result, we have come up with two options: Silverlight Deep Zoom or a Flash based solution (such as Zoomify). The issue is that both of these require conversion to a tiled output and/or conversion to a specific file type (Zoomify supports a single proprietary file type, PFF).
What we are wondering is whether a solution exists that would allow us to view the image without converting it beforehand.
PS: I know that you can write an application to tile the images (as needed or after the load process) and output them; however, we would like to do this without chopping up the file.
The tiled approach really is the right way to do it.
Your users don't want to download a 50 MB file before they can start viewing the image, and you don't want to spend the bandwidth to serve 50 MB to every user who might only view a fraction of your image.
If you serve the whole file, users will eventually be able to load and view it, but it won't run smoothly for most of them.
There is no simple non-tiled way to serve just a portion of an image unless you use a server-side library like ImageMagick or PIL to extract a specific subset of the image for each user. You probably don't want to do that, because it would place a significant load on your server.
Alternatively, you might use something like Google's map tool to provide zooming and scaling. Some comments on doing that are available here:
http://webtide.wordpress.com/2008/08/27/custom-google-maps/
Take a look at OpenSeadragon. To make an image work with OpenSeadragon, you should generate a zoomable image format as mentioned here, then follow the getting started guide here.
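Once the zoomable (Deep Zoom) pyramid exists, wiring up the viewer is short. A sketch, where the element id, script path and .dzi file are placeholders:

```javascript
// Minimal OpenSeadragon viewer; the .dzi pyramid must be generated beforehand
// as described in the linked guides.
const viewer = OpenSeadragon({
  id: 'seadragon-viewer',              // a <div id="seadragon-viewer"> on the page
  prefixUrl: '/openseadragon/images/', // location of the bundled navigation button images
  tileSources: '/tiles/large-image.dzi'
});
```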
The browser isn't going to load a 50 MB file smoothly; if you don't chop it up, there's no reasonable way to keep it from lagging.
If you don't want to tile, you could have the server open the file and render a screen-sized view of the image for display in the browser at the particular zoom level requested. That way you aren't sending 50 MB files across the line when someone only wants an overview of the image. That is, the browser requests a set of coordinates and an output size in pixels; the server opens the large image, creates a smaller image that fits the desired view, and sends that back to the browser.
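A rough sketch of that request/response flow, assuming a Node.js server with express and sharp (neither is specified in the answer; any server-side imaging stack would do, and all parameter names are made up):

```javascript
const express = require('express');
const sharp = require('sharp');
const app = express();

// GET /view?x=0&y=0&w=4096&h=4096&out=1024
// x, y, w, h describe the requested region of the big source image;
// out is the width of the image actually sent back to the browser.
app.get('/view', async (req, res) => {
  const { x = 0, y = 0, w = 4096, h = 4096, out = 1024 } = req.query;
  const jpeg = await sharp('large-source.tif')               // the 50 MB+ original
    .extract({ left: +x, top: +y, width: +w, height: +h })   // crop the requested view
    .resize({ width: +out })                                  // scale to screen size
    .jpeg({ quality: 85 })                                    // quality controls lossiness
    .toBuffer();
  res.type('image/jpeg').send(jpeg);
});

app.listen(3000);
```

The browser then only ever receives a screen-sized JPEG, never the full file.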
As far as compression goes, you say it's too lossy, but if that's what you are seeing, you are probably using the wrong compression algorithm or settings for the type of image you have. The JPEG format has quality settings to control lossiness, and PNG compression is lossless (the pixels you get after decompressing are exactly the values you had before compression). So consider changing the compression you use, and don't just rely on the default settings in an image editor.
We use large background images (hi-res photos, up to 700 KB) for our page design.
It's part of the experience of the site that as you browse around, you see different images.
At the moment a different (random) image is loaded on each page request, from a pool of ~15 images, which could grow over time.
I'm looking for a sane way to optimize this:
To avoid the user having to download a big image file on every page view
To reduce load on the server (is this an issue? will the server keep the images in memory?)
The ideas I have so far include:
A timer which loads a different image at set intervals
Progressively loading other images in the background with Ajax (see the sketch below)
Associating images with specific content (pages, tags)
The question is: how do I keep it feeling somewhat random, while minimizing page load times and server hits?
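As a rough sketch of the progressive-loading idea from the list above (the file names are placeholders), the page could quietly warm the browser cache after it has finished loading:

```javascript
// After the current page has loaded, fetch the other background images in the
// background so that later page views can pull a "random" image from the cache.
window.addEventListener('load', () => {
  const others = ['/bg/photo-02.jpg', '/bg/photo-03.jpg', '/bg/photo-04.jpg'];
  others.forEach((url) => {
    const img = new Image(); // the request warms the HTTP cache; the element is never attached
    img.src = url;
  });
});
```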
I usually avoid sites with huge images; I am very impatient. I would rethink your design.
As a first step you should make sure that the images can be properly cached:
use sane URLs (no session IDs etc.)
set appropriate HTTP headers (ETag)
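As a concrete sketch of those two points, assuming a Node.js/Express static file server (the answer doesn't name a stack, and the paths and max-age are placeholders):

```javascript
const express = require('express');
const app = express();

// express.static sends ETag and Last-Modified headers by default; maxAge adds a
// Cache-Control header so the browser can reuse the images across page views.
app.use('/images', express.static('public/images', {
  etag: true,
  maxAge: '30d', // Cache-Control: public, max-age=2592000
}));

app.listen(8080);
```

The same effect can be had from any web server's cache-header configuration; the important part is that the image URLs stay stable so the cached entries remain valid.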
Firstly, hearing that the background images alone are 700 KB astounds me. In addition to the content on screen... that is a pretty heavy site.
For starters, I would try image compression tools. Two tools come to mind: ImageMagick and PNGCrush. PNGCrush is excellent at stripping all the extraneous metadata attached to photos without compromising photo quality.
I recommend this because compressing the images means the user downloads less content, which means quicker load times, which... at the end of the day... is what users want.
I would also cache the images, so that when a user revisits the site, the images are already cached on their end. This minimises the HTTP requests made each time a user visits your site.
An example of where this technique is used on a commercial site is www.reactive.com. If you look at the /js/headerImages.js file, they make use of image caching. Funnily enough, you will find the same source code at: http://javascript.internet.com/miscellaneous/random-image.html
Considering that you have mentioned the images are loaded randomly, I am assuming you are using a JavaScript library such as jQuery to create the effect.
If you are, you can minimize page load times by using a CDN instead of referencing a local copy of the jQuery library stored on your server. I ran performance tests on a site I made for a client, and over an average of 20 hits this technique saved 1.6 seconds!
Hope that helps for now :)