Automatically decide the final quality for adaptive image compression

I have seen a few websites like compressjpeg, Kraken, TinyJPG and several others that decide on an optimum quality while compressing. If an image is of quality 99, they will sometimes compress it to quality 94 and sometimes down to quality 70.
I tried to study their pattern and found that all of them use ImageMagick, and most probably they have some tables that read the RGB pattern of the images and decide what the optimum compression level should be.
I want the quality to be dynamic for all images, instead of the fixed value in the ImageMagick command I am currently using:
convert -quality 70% input.jpg output.jpg
Here are a few images and their corresponding quality after compression:
Name   R         G         B         Overall   Size (KB)  Width  Height  TinyJPG size (KB)  TinyJPG quality  Original quality
7.jpg  95.0354   120.168   158.313   124.506   266        1920   1200    159.8              70               91
2.jpg  155.466   126.892   121.507   134.622   59         720    378     55.3               92               94
3.jpg  230.791   230.596   230.532   230.64    28.5       720    378     10.3               69               94
1.jpg  74.8786   99.9428   101.71    92.1772   33.5       650    400     32.8               64               69
4.jpg  235.647   52.3033   50.1626   112.704   384        400    250     25.3               95               99
9.jpg  194.461   180.839   183.859   186.386   12.71      300    188     12.9               75               75
6.jpg  170.337   169.707   153.873   164.639   6.69       184    274     6.9                74               74
5.jpg  154.196   130.809   111.683   132.229   8.5        259    194     8.5                74               74
8.jpg  162.161   184.608   194.416   180.395   6.04       126    83      5.9                89               89
Any guidance will be useful.

I was going to put this as a comment, but I decided to put it as an answer so folks can add/remove arguments and conclusions.
I don't believe there is an optimum quality setting. I think it depends on the purpose of the image and the content of the image - maybe other things too.
If the image has lots of smooth gradients, you will need a higher quality setting than if the image has loads of (high-frequency) detail, much of which can be lost without perceptible loss of quality.
If the purpose of the image is as a web preview, it can have a far lower quality setting than if the purpose of the image is to pass a piece of fine art landscape/portrait photography to a printer or a customer who has paid £1,000 for it (I'm looking at you Venture UK).
One thing you can do is set the maximum file size you wish to achieve, but that disregards all the above:
convert -size 2048x2048 xc:gray +noise random -define jpeg:extent=100KB out.jpg
I guess I am saying "it depends".

You can try jpeg-archive. This utility provides dynamic compression using various metrics such as SSIM, MS-SSIM and Smallfry. The command to try is:
jpeg-recompress --accurate -m smallfry --quality high image.jpg compressed.jpg
Note that this method keeps chroma subsampling on by default, and it should be left on for good size reduction.
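Since the point is to handle a large number of images, a simple batch wrapper might look like this (just a sketch; the compressed/ output directory is an assumption):
mkdir -p compressed
for f in *.jpg; do
    # re-encode each image, letting jpeg-recompress pick the quality per image
    jpeg-recompress --accurate --method smallfry --quality high "$f" "compressed/$f"
done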
IMO, Guetzli is not ready for production right now, especially for a large number of images.

The answer is Google's Guetzli.
See explanations here.
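For reference, Guetzli is driven from the command line much like convert; note that it refuses quality settings below 84 (the filenames here are placeholders):
guetzli --quality 85 input.jpg output.jpg
Expect it to be far slower and more memory-hungry than a plain convert -quality pass, which is the production concern mentioned above.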

Related

Restore quality from a WhatsApp-compressed picture

WhatsApp is famous for compressing images and degrading their quality. I have a picture where, unfortunately, the original was lost and the only copy left was sent via WhatsApp.
Details of the picture: dimensions 1200 x 1600 pixels, size 163 KB, 96 dpi. I would like to print it as crisply and as large as possible. I have already found an online tool to increase the dpi from 96 to 300. What other tricks are available to try to restore the original quality as much as possible? I know that getting back to the original quality is impossible.

How does memory usage in browsers work for images - can I do one large sprite?

I currently display 115 (!) different sponsor icons at the bottom of many web pages on my website. They're lazy-loaded, but even so, that's quite a lot.
At present, these icons are loaded separately, and are sized 75x50 (or x2 or x3, depending on the screen of the device).
I'm toying with the idea of making them all into one sprite, rather than 115 separate files. That would mean that, instead of lots of tiny files, I'd have one large PNG or WEBP file. The way I'm considering doing it would mean the smallest file would be 8,625 pixels across, and the x3 version would be 25,875 pixels across, which seems like a very large image (albeit only 225 px high).
Will an image of this pixel size cause a browser to choke?
Is a sprite the right way to achieve a faster-loading page here, or is there something else I should be considering?
115 icons at 75 pixels wide will indeed work out to a very wide 8625-pixel image that is only 50 px high...
but you don't have to use a low-height (50 pixel), very wide (8625 pixel) image.
You can make a more sensibly shaped rectangular image with a grid of icons... say, 12 rows of 10 icons each:
75 x 10 = 750 px, plus about 45 px of spacing (5 px between each of the 10 icons), so roughly 800 px wide.
50 x 12 = 600 px, plus about 55 px of spacing (5 px between each of the 12 rows), so roughly 660 px tall.
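If you go that route, the grid described above can be generated in one go with ImageMagick's montage (a sketch; the icon filenames and the 2 px padding are assumptions):
montage icon-*.png -tile 10x12 -geometry 75x50+2+2 -background none sprite.png
-tile 10x12 lays the icons out 10 per row over 12 rows, and -geometry fits each icon into a 75x50 cell with a 2 px border around it.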

Can a PNG image with more pixels be smaller in size than one with fewer pixels?

I have the following PNG images (both created by cropping screenshots of a desktop screen, using the same software):
280 x 261: 79.4 KB
380 x 354: 3.62 KB
I am confused. Shouldn't it take more bits to store the information for a larger number of pixels than for a smaller number of pixels?
The PNG format uses lossless compression, meaning that the operation is fully reversible.
It relies, among other things, on Huffman coding, so that frequent colors are coded with fewer bits, and on duplicate string elimination. So images with "simpler" content can compress better.
Added by Mark Setchell
Just to illustrate Yves' answer... if you take your urn image and make all the non-white pixels black like this:
convert urn.png -fill black +opaque white blackurn.png
the file is now just 894 bytes:
-rw-r--r--# 1 mark staff 894 19 Feb 11:08 blackurn.png
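The same point can be shown with two synthetic images (a quick sketch; the filenames are made up): a larger single-colour PNG ends up far smaller on disk than a smaller one filled with random noise.
convert -size 380x354 xc:white flat.png
convert -size 280x261 xc:gray +noise Random noisy.png
ls -l flat.png noisy.png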

ImageMagick - get optimal compression

I compared ImageMagick with other tools. In my example I just want to resize and compress images. The goal is an acceptable file size with good quality.
Original file size: 127 KB
Comparison between ImageMagick and Caesium
Unscaled, quality set to 80%
ImageMagick: convert image.jpg -strip -quality 80 optImage.jpg
=> 123 KB
Caesium: 101 KB
Scaled to 640x359, quality set to 80%
ImageMagick: convert image.jpg -strip -resize 640x359! -quality 80 optImage.jpg
=> 48 KB
Caesium: 33.6 KB
So what is wrong with that? Is there any ImageMagick-option I should include? Or is the quality parameter different between these tools?
EDIT: is there any Linux shell tool which is able to resize (and maybe crop) and compress as well as Caesium?
I found out that these quality parameters are not the same; quality 80 in ImageMagick corresponds roughly to 87 in Caesium. To get an acceptable file size, the ImageMagick quality parameter should be set to 80 (not lossless). But I think it's not a bad idea to run an extra lossless compression pass on ImageMagick-resized images, such as jpegtran for JPEGs and OptiPNG for PNGs; they can reduce the file size a bit more.
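For example, an extra lossless pass after the ImageMagick step could look like this (a sketch; the filenames are placeholders):
jpegtran -copy none -optimize -progressive optImage.jpg > optImage-small.jpg
optipng -o2 optImage.png
jpegtran rebuilds the Huffman tables and strips metadata without touching the pixel data, and OptiPNG re-runs the PNG compression at a higher effort level.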
If you have an acceptable file size in mind, you can tell ImageMagick what it is and it will do its best to honour it.
Like this:
convert image.jpg -strip -define jpeg:extent=88kb optImage.jpg
If you want a way to do something similar with Python, I wrote an answer that works pretty well here. It does a binary search for a JPEG quality that satisfies a maximum size requirement.
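The same binary-search idea can be sketched directly in the shell (a rough sketch; the 88000-byte target, the quality bounds and the filenames are assumptions):
target=88000                     # target size in bytes (roughly 88 KB)
lo=40; hi=95                     # quality range to search
while [ $((hi - lo)) -gt 1 ]; do
    q=$(( (lo + hi) / 2 ))
    convert image.jpg -strip -quality "$q" optImage.jpg
    size=$(wc -c < optImage.jpg)
    if [ "$size" -gt "$target" ]; then
        hi=$q                    # too big: search lower qualities
    else
        lo=$q                    # fits: try to keep more quality
    fi
done
convert image.jpg -strip -quality "$lo" optImage.jpg
echo "picked quality $lo"
It converges in a handful of iterations, at the cost of re-encoding the image each time, which is essentially what the -define jpeg:extent option shown above does for you.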

Typical PSNR values

In an image restoration problem (still image, not video), why should the Peak Signal-to-Noise Ratio (PSNR) not be more than 50 or 55 dB? I got 63 dB and was told it is wrong. Why, and how, is it wrong?
If your pixels are represented using 8 bits per sample, the maximum possible pixel value is 255, so PSNR = 20*log10(255) - 10*log10(MSE) starts from 20*log10(255) ≈ 48 dB before the mean squared error (MSE) of the noise is even considered. The typical compression ratio of JPEG is no less than 7; in that case the MSE is around 0.224, and the corresponding PSNR is about 54 dB. So you will probably not get a PSNR as high as 63 dB.
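If you want to check a measured value, ImageMagick can report the PSNR between an original and a processed copy (the filenames are placeholders):
compare -metric PSNR original.png restored.png null:
The figure is printed in dB on standard error; null: simply discards the difference image that compare would otherwise write.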

Resources