ImageMagick - get optimal compression

I compared ImageMagick with other tools. In my example I just want to resize and compress images. The goal is an acceptable file size with good quality.
Original file size: 127 kB
Comparison between ImageMagick and Caesium
Unscaled, quality set to 80%
ImageMagick: convert image.jpg -strip -quality 80 optImage.jpg
=> 123 kB
Caesium: 101 kB
Scaled to 640x359, quality set to 80%
ImageMagick: convert image.jpg -strip -resize 640x359! -quality 80 optImage.jpg
=> 48 kB
Caesium: 33.6 kB
So what is wrong with that? Is there an ImageMagick option I should include? Or does the quality parameter mean something different in these two tools?
EDIT: Is there any Linux shell tool which can resize (and maybe crop) and compress as well as Caesium?

I found out that these quality parameters are not the same; 80% quality in ImageMagick corresponds to roughly 87% in Caesium. To get an acceptable file size, the ImageMagick quality parameter should be set to 80 (not lossless). But I think it's not a bad idea to run an extra lossless compression pass on ImageMagick-resized images, such as jpegtran for JPGs and OptiPNG for PNGs. They can reduce the file size a bit more.
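For example, that extra lossless pass could look roughly like this (a sketch; assumes jpegtran and OptiPNG are installed, and the file names are placeholders):
jpegtran -copy none -optimize -progressive optImage.jpg > optImage-small.jpg   # lossless JPEG recompression, drops metadata
optipng -o2 optImage.png                                                       # lossless PNG recompression in place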

If you have an acceptable file size in mind, you can tell ImageMagick what it is and it will do its best to honour it.
Like this:
convert image.jpg -strip -define jpeg:extent=88kb optImage.jpg
If you want a way to do something similar with Python, I wrote an answer that works pretty well here. It does a binary search for a JPEG quality that satisfies a maximum size requirement.
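A rough shell equivalent of that idea (a sketch, not the linked answer; assumes bash, and the 88 kB target is just the figure from the example above): binary-search the -quality value for the largest quality whose output still fits the size budget.
lo=1; hi=95; target=$((88*1024))
while [ "$lo" -lt "$hi" ]; do
  q=$(( (lo + hi + 1) / 2 ))
  convert image.jpg -strip -quality "$q" optImage.jpg
  size=$(stat -c%s optImage.jpg)          # GNU stat; use stat -f%z on macOS
  if [ "$size" -le "$target" ]; then lo=$q; else hi=$((q - 1)); fi
done
echo "highest quality under the size limit: $lo"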

Related

Compressing an image to make it less than 4KB

I have an image of a person and I want to compress it to less than 4KB. I need to compress it and still have the person's face recognizable, even if the image shrinks.
Here is Theresa May at 142 kB, and here she is resized to 72x72, converted to greyscale, and reduced to 2 kB with ImageMagick at the command line:
convert original.jpg -resize 72x72 -colorspace gray -define jpeg:extent=2kb result.jpg
I can still recognise her.
Here is some other guy reduced to 1 kB, and I can still recognise him too.
ImageMagick is installed on most Linux distros and is available for macOS and Windows. Bindings are available for Python, PHP, Ruby, JavaScript, Perl, etc.
If you had further knowledge about your images, or your recognition algorithm, you may be able to do better. For example, if you knew that the centre of the image was more important than the edges, you could slightly blur, or reduce contrast in relatively unimportant areas and use the available space for more details in the important areas.
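A hedged sketch of that idea (the blur radius and the 50% central region are illustrative choices, not part of the original answer): blur the whole frame, paste back a sharp central crop, then apply the usual resize and size cap.
convert original.jpg -blur 0x2 \
    \( original.jpg -gravity center -crop 50%x50%+0+0 +repage \) \
    -gravity center -composite \
    -resize 72x72 -colorspace gray -define jpeg:extent=2kb result.jpg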
Mark Setchell has the right idea, but I might suggest one potential minor improvement: remove any metadata, including profiles, EXIF data, etc. You can do that either by adding -strip
convert input.jpg -strip -resize 72x72 -colorspace gray -define jpeg:extent=2kb result.jpg
or by using -thumbnail rather than -resize; -thumbnail does the strip automatically.
convert input.jpg -thumbnail 72x72 -colorspace gray -define jpeg:extent=2kb result.jpg

Which Imagemagick convert parameters should be used to create a GPG public key photo?

Per GPG guidelines, photo IDs should be about 240x288 pixels and 4k-6k in size. I used: convert -resize 240x240 -quality 75 original.jpg gpguid.jpg. However, the resulting file is still 17k. Specifically, I am curious what options I have before decreasing the quality level.
This is the command I settled on: convert -resize 240x240 -quality 55 -strip -sampling-factor 4:2:0 original.jpg gpguid.jpg.
The original command yielded a file that was 17,169 bytes. The -strip argument brought the size down to 10,226 bytes by removing profiles and comments. The -sampling-factor argument reduced the size to 8,315 bytes by cutting the chroma channel's resolution in half without affecting the luminance resolution. Finally, turning down the quality to 55 brought me within the recommended 6K.
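If you want to check what each step bought you, something like this works (a sketch; gpguid.jpg is the file name from the question):
stat -c%s gpguid.jpg                                     # file size in bytes (GNU stat; use stat -f%z on macOS)
identify -verbose gpguid.jpg | grep -i sampling-factor   # 2x2,1x1,1x1 indicates 4:2:0 chroma subsampling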
ImageMagick has a feature that allows you to set a maximum output file size, and you may find you get on better letting it do that rather than arbitrarily reducing quality to the point that all photos are within the specified size - that way you should hopefully retain as much quality as possible:
convert input.jpg -strip -resize 240x240 -define jpeg:extent=6kb result.jpg
The above command will result in a file just under 6kB.
If you want a way to do something similar with Python, I wrote an answer that works pretty well here. It does a binary search for a JPEG quality that satisfies a maximum size requirement.

Quicker conversion to WebP

I've tried WebP converter to convert images to WebP format, but it takes about 1-2 seconds to convert one image. I have around 70 images and I'd like to convert them in less than a minute. Is there a quicker way to do it?
'cwebp -lossless -q 0 -m 1' is used in https://developers.google.com/speed/webp/docs/webp_lossless_alpha_study for fast lossless compression. It averages 19 ms per image over a web corpus of 1000 randomly sampled PNG images.
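Applied to a whole directory, that looks roughly like this (a sketch; assumes bash, cwebp on the PATH, and PNG inputs in the current directory):
for f in *.png; do
  cwebp -lossless -q 0 -m 1 "$f" -o "${f%.png}.webp"   # fast lossless settings from the study above
done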
How big are these images? What kind of computer do you use to convert? Do you have an alpha-channel in the images? Do you convert to lossless or lossy? Which version of cwebp are you using?

Imagemagick scale and image quality

I'm scaling an image down to 50% of its original size with the convert command and the -scale parameter. The generated image quality is pretty bad. Is there any extra option I can use to get a better result?
-scale is quick, but I believe -resize or -thumbnail is better, and you can use any filters you like.
Using -sharpen 0x1.2 with -resize x% with -quality 95 produces good results for me.
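Put together, that suggestion looks roughly like this (a sketch; the 50% scale and file names are placeholders):
convert input.jpg -resize 50% -sharpen 0x1.2 -quality 95 output.jpg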
I was trying to create thumbnails of PDFs and was getting poor results until I added -density 400, which is a suggestion I found here. Below is an example of the difference it can make. I got the PDF from here. Look at the bear and also the lines behind the bird.
Without -density 400
Full command: convert -resize 500x500 catalog.pdf[0] catalog-page1-resized.png
File size: 180 KB
With -density 400
Full command: convert -density 400 -resize 500x500 catalog.pdf[0] catalog-page1-resized.png
File size: 185 KB
Using -quality 80 -adaptive-resize is better for larger photos.
If you need to blur the output, use -interpolative-resize instead of -adaptive-resize
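For comparison, the two variants mentioned above look like this (a sketch; the file names and the 50% scale are placeholders):
convert photo.jpg -quality 80 -adaptive-resize 50% out-adaptive.jpg
convert photo.jpg -quality 80 -interpolative-resize 50% out-interpolative.jpg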

imagemagick resizing and quality PNG

In my application I need to resize PNG files and reduce their quality.
At full size the PNGs are 3100x4400 px and use 2.20 MB of disk space.
When running the following command:
convert -resize 1400 -quality 10 input.png output.png
the images are resized to 1400x2000 but use 5.33 MB of disk space.
So my question is: How can I reduce the file size?
You can further reduce the quality of a PNG by using posterization:
https://github.com/pornel/mediancut-posterizer (Mac GUI)
This is a lossy operation that allows zlib to compress better.
Convert image to PNG8 using pngquant.
It reduces images to 256 colors, so quality depends on the type of image, but pngquant makes very good palettes, so you might be surprised how often it works.
Use Zopfli-png or AdvPNG to re-compress images better.
This is lossless and recommended for all images if you have CPU cycles to spare.
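A hedged sketch of that pipeline (exact flags vary by tool version; the quality range is illustrative):
pngquant --quality=60-80 --output small.png input.png   # quantize to a 256-colour palette
advpng -z -4 small.png                                  # lossless recompression (or: zopflipng -y small.png small.png)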
After using imagemagick to resize, you can compress the image using pngquant.
On macOS (with Homebrew), brew install pngquant, then:
pngquant <filename.png>
This will create a new image filename-fs8.png that is normally much smaller in size.
The help page says that the -quality option, when used with PNG, sets the compression level for zlib, where (roughly) 0 is the worst compression and 100 is the best (the default is 75). So try setting -quality to 100, or even removing the option.
Another method is to specify -define png:compression-level=N, -define png:compression-strategy=N and -define png:compression-filter=N to achieve even better results.
http://www.imagemagick.org/script/command-line-options.php#quality
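A sketch combining those defines (the values are illustrative; level 9 is the maximum zlib effort, filter 5 means adaptive filtering):
convert input.png -define png:compression-level=9 \
        -define png:compression-strategy=1 \
        -define png:compression-filter=5 output.png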
For lazy people that arrived here wanting to paste in a one liner:
mogrify -resize 50% -quality 50 *.png && pngquant *.png --ext .png --force
This modifies all of the PNGs in the current directory in place, so make sure you have a backup. Adjust the resize and quality parameters to suit your needs. In a quick experiment, running mogrify first and then pngquant resulted in a significantly smaller image size.
The Ubuntu package for pngquant is called "pngquant", but it was already installed on my 20.04 LTS system, so it may be there by default.
I found that the best way was to use the -density [value] parameter.
