Optimize size of tiled TIFF with ImageMagick

I'm currently creating tiled TIFFs from JPEG files using ImageMagick. My aim is to work with IIPImage server.
I can generate the files easily, but my problem is that I have to deal with a large warehouse of images, so it's crucial to minimize the space occupied by my TIFF files.
Using JPEG compression at quality 45 (and 256x256 tiles) I get acceptable quality, and that's the most optimization I know how to achieve.
With that configuration, my TIFF files end up slightly larger than the original JPEG files.
For example, if a JPEG weighs 10 MB, the resulting TIFF weighs 11.4 MB. That's good but not enough, because if my initial warehouse is 2 TB, I have to plan for at least 4 TB for my project.
So I'd like to know whether there is a way to shrink my TIFF files further without losing more quality than quality 45 already does, using ImageMagick or another tool.
For reference, this is the command I use to generate the TIFFs:
convert <jpeg file> -quality 45 -depth 8 +profile '*' -define tiff:tile-geometry=256x256 -compress jpeg 'ptif:<tiff file>'
Thanks!

I thought I'd just add a note to @mark-setchell's excellent answer, but it came out too long, so I've made a separate one, sorry.
Your problem is that ImageMagick (at least on current Ubuntu) saves pyramidal JPEG TIFFs as RGB rather than YCbCr, so they are huge. For example, wtc.jpg is a 10,000 x 10,000 pixel JPEG image saved with the default Q75:
$ time convert wtc.jpg -quality 45 -depth 8 +profile '*' -define tiff:tile-geometry=256x256 -compress jpeg 'ptif:x-convert.tif'
real 0m27.553s
user 1m10.903s
sys 0m1.129s
$ ls -l wtc.jpg x-convert.tif
-rw-r--r-- 1 john john 15150881 Mar 16 08:55 wtc.jpg
-rw-r--r-- 1 john john 37346722 Mar 30 20:17 x-convert.tif
You can see the compression type like this:
$ tiffinfo x-convert.tif | grep -i interp
Photometric Interpretation: RGB color
Perhaps there's some way to make it use YCbCr instead? I'm not sure how, unfortunately.
I would use libvips instead. It's more than 10x faster (on this laptop anyway), uses much less memory, and it enables YCbCr mode correctly, so you get much smaller files:
$ time vips tiffsave wtc.jpg x-vips.tif --compression=jpeg --tile --tile-width=256 --tile-height=256 --pyramid
real 0m2.180s
user 0m2.595s
sys 0m0.082s
$ ls -l x-vips.tif
-rw-r--r-- 1 john john 21188074 Mar 30 20:27 x-vips.tif
$ tiffinfo x-vips.tif | grep -i interp
Photometric Interpretation: YCbCr
If you set Q lower, you can get the size down more:
$ vips tiffsave wtc.jpg x-vips.tif --compression=jpeg --tile --tile-width=256 --tile-height=256 --pyramid --Q 45
$ ls -l x-vips.tif
-rw-r--r-- 1 john john 12664900 Mar 30 22:01 x-vips.tif
Though I'd stick at the default Q75 myself.
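If you need to apply this across a whole warehouse of JPEGs, a minimal batch sketch (assuming bash, vips on your PATH, and hypothetical warehouse/ and tiled/ directories) could look like this:
#!/bin/bash
# Sketch: convert every JPEG in warehouse/ into a tiled pyramidal TIFF in tiled/
# Directory names are hypothetical; adjust --Q to taste.
mkdir -p tiled
for f in warehouse/*.jpg; do
    vips tiffsave "$f" "tiled/$(basename "${f%.jpg}").tif" \
        --compression=jpeg --Q=75 --tile --tile-width=256 --tile-height=256 --pyramid
done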

I am not familiar with IIPImage server, so my thoughts may be inappropriate. If you store a tiled pyramidal TIFF, you are storing multiple resolutions, and all but the highest resolution are redundant - so could you maybe just store the highest resolution and generate the lower ones on demand?
The "PalaisDuLouvre.tif" image is 2MB as a tiled TIF:
ls -lhr PalaisDuLouvre.tif
-rw-r--r--# 1 mark staff 1.9M 30 Mar 11:24 PalaisDuLouvre.tif
and it contains the same image at 6 different resolutions:
identify PalaisDuLouvre.tif
PalaisDuLouvre.tif[0] TIFF 4000x828 4000x828+0+0 8-bit sRGB 1.88014MiB 0.000u 0:00.000
PalaisDuLouvre.tif[1] TIFF 2000x414 2000x414+0+0 8-bit sRGB 1.88014MiB 0.000u 0:00.000
PalaisDuLouvre.tif[2] TIFF 1000x207 1000x207+0+0 8-bit sRGB 1.88014MiB 0.000u 0:00.000
PalaisDuLouvre.tif[3] TIFF 500x103 500x103+0+0 8-bit sRGB 1.88014MiB 0.000u 0:00.000
PalaisDuLouvre.tif[4] TIFF 250x51 250x51+0+0 8-bit sRGB 1.88014MiB 0.000u 0:00.000
PalaisDuLouvre.tif[5] TIFF 125x25 125x25+0+0 8-bit sRGB 1.88014MiB 0.000u 0:00.000
Yet, I can store it in better quality (90%) than your tiled TIFF like this:
convert PalaisDuLouvre.tif[0] -quality 90 fullsize.jpg
with size 554kB:
-rw-r--r-- 1 mark staff 554K 30 Mar 13:44 fullsize.jpg
and generate a tiled TIF the same as yours in under 1 second on demand with:
convert fullsize.jpg -define tiff:tile-geometry=256x256 -compress jpeg ptif:tiled.tif
Alternatively, you could use vips to make your TIFF pyramid even faster. The following takes 0.2 seconds on my iMac, i.e. nearly 5x faster than ImageMagick:
vips tiffsave fullsize.jpg vips.tif --compression=jpeg --Q=45 --tile --tile-width=256 --tile-height=256 --pyramid
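A minimal sketch of that generate-on-demand idea, assuming bash, vips, and a hypothetical cache directory, might look like this:
# generate-ptif.sh - build (and cache) a pyramidal TIFF for a given JPEG on demand
# The cache location and file layout are hypothetical.
src="$1"                      # e.g. fullsize.jpg
cache="./ptif-cache"          # hypothetical cache directory
out="$cache/$(basename "${src%.*}").tif"
mkdir -p "$cache"
if [ ! -f "$out" ]; then
    vips tiffsave "$src" "$out" --compression=jpeg --Q=75 --tile --tile-width=256 --tile-height=256 --pyramid
fi
echo "$out"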

Related

JPEG compression and progressive JPEG

I have a high-res baseline JPEG which I want to compress from 6 MB to roughly 300 kB for my website and make progressive.
Now I know how to do both: progressive with Photoshop, and compression with an online tool or a gulp/grunt task.
I am wondering what the best order is for the image (best quality):
First, compress the original image and then make it progressive.
First, make it progressive and then compress the image.
doesn't matter :)
As regards the quality, that is a difficult call since it depends on the images - which you don't show. And if you are going for a 20x reduction in size, you must expect some loss of quality. So, I'll leave you to assess quality. As regards the processing...
You can do both at once with ImageMagick which is installed on most Linux distros and is available for macOS and Windows.
Check input image size is 6MB:
ls -lrht input.jpg
-rw-r--r-- 1 mark staff 6.0M 2 Dec 16:09 input.jpg
Check input image is not interlaced:
identify -verbose input.jpg | grep -i interlace
Interlace: None
Convert to progressive/interlaced JPEG and 300kB in size:
convert input.jpg -interlace plane -define jpeg:extent=300KB result.jpg
Check size is now under 300kB:
ls -lhrt result.jpg
-rw-r--r--# 1 mark staff 264K 2 Dec 16:11 result.jpg
Check now interlaced:
identify -verbose result.jpg | grep -i interlace
Interlace: JPEG
You can also use jpegtran, which is lighter weight than ImageMagick, for the progressive step:
jpegtran -copy none -progressive input.jpg output.jpg
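Note that jpegtran's progressive transform is lossless, so by itself it won't hit the 300 kB target; a hedged two-step sketch combining the options shown above would be:
convert input.jpg -define jpeg:extent=300KB smaller.jpg
jpegtran -copy none -progressive smaller.jpg output.jpg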

ImageMagick error with montage command

I'm stitching 8 images of 8k by 8k pixels in a row using the montage command.
This is what I enter in:
montage -mode concatenate -limit area 0 -tile x1 image1.png image2.png image3.png image4.png image5.png image6.png image7.png image8.png out1.png
This is the error I get out:
montage: magick/quantum.c:215: DestroyQuantumInfo: Assertion `quantum_info->signature == 0xabacadabUL' failed.
Abort
Can anyone help? Thanks
You may get on better with this command which does what I think you are trying to do:
convert +append image{1..8}.png out.png
As you can see from the following identify command, the images have been laid out side-by-side to make an image 64k pixels wide as a result of the +append command. Just FYI, use -append to lay them out one above the other in a 64k pixel tall stack.
identify out.png
out.png PNG 64000x8000 64000x8000+0+0 8-bit sRGB 2c 62.4KB 0.000u 0:00.000
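For the vertical stack mentioned above, the equivalent command is simply:
convert -append image{1..8}.png tall.png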
Your originally posted command also works fine on my ImageMagick Version:
ImageMagick 6.8.9-5 Q16 x86_64 2014-07-29

How to compare 2 (image) files in a (os x) bash script ignoring the last changed/modified date headers?

I would like to recursively compare screenshot files in a directory.
I tried using cmp but it always returns a difference - even if the images are not visually different - I guess the difference in the file must be the last changed and last modified dates.
Is there a way I could only compare the pixel content of the image files while ignoring these headers?
How about using compare from the ImageMagick software suite (http://www.imagemagick.org)? It's available for macOS and all modern Linux distributions.
I'm not that familiar with comparing images, but I tried creating some samples and ran the following snippets,
$ compare -identify -metric MAE same1.png same2.png null
>> same1.png[0] PNG 640x400 640x400+0+0 8-bit DirectClass 1.64KB 0.010u 0:00.009
>> same2.png[0] PNG 640x400 640x400+0+0 8-bit DirectClass 1.64KB 0.010u 0:00.000
>> 0 (0)
$ compare -identify -metric MAE same1.png diff.png null
>> same1.png[0] PNG 640x400 640x400+0+0 8-bit DirectClass 1.64KB 0.010u 0:00.020
>> diff.png[0] PNG 640x400 640x400+0+0 8-bit DirectClass 6.01KB 0.000u 0:00.009
>> 209.225 (0.00319257)
And it seems to work as expected.
Hope that helps!
Edit: good point by DigitalTrauma - comparing between different formats/compression algorithms may be a problem,
$ compare -identify -metric MAE same1.png same2.xcf null
>> same1.png[0] PNG 640x400 640x400+0+0 8-bit DirectClass 1.64KB 0.080u 0:00.040
>> same2.xcf[0] XCF 640x400 640x400+0+0 8-bit DirectClass 2.73KB 0.070u 0:00.030
>> 0 (0)
$ compare -identify -metric MAE same1.png same2.bmp null
>> same1.png[0] PNG 640x400 640x400+0+0 8-bit DirectClass 1.64KB 0.010u 0:00.010
>> same2.bmp[0] BMP 640x400 640x400+0+0 8-bit DirectClass 768KB 0.000u 0:00.000
>> 0 (0)
$ compare -identify -metric MAE same1.png same2.jpg null
>> same1.png[0] PNG 640x400 640x400+0+0 8-bit DirectClass 1.64KB 0.010u 0:00.019
>> same2.jpg[0] JPEG 640x400 640x400+0+0 8-bit DirectClass 3.65KB 0.000u 0:00.009
>> 0.196766 (3.00245e-06)
So, when comparing against JPEG we get a difference, even though the pictures "look" the same. This is definitely not my area, but I don't think converting the pictures to the same format would make any difference, since the compression (or whatever makes the pictures different) has already been applied to the image.
$ convert same2.jpg same2-from-jpg.png
$ compare -identify -metric MAE same2.png same2-from-jpg.png null
>> same2.png[0] PNG 640x400 640x400+0+0 8-bit PseudoClass 256c 1.38KB 0.040u 0:00.020
>> same2-from-jpg.png[0] PNG 640x400 640x400+0+0 8-bit DirectClass 1.64KB 0.010u 0:00.000
>> 0.196766 (3.00245e-06)
Above we convert the jpg back to png and then compare it with the original, and it still differs.
Anyway, maybe this will give you some insight. I can definitely recommend ImageMagick when working with pictures.
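To run this recursively from a bash script, a rough sketch (assuming two hypothetical directory trees dirA/ and dirB/ holding the screenshots, and using the AE metric, which counts differing pixels) might be:
find dirA -name '*.png' | while read -r a; do
    b="dirB/${a#dirA/}"                                # counterpart path in the second tree
    [ -f "$b" ] || continue
    diff=$(compare -metric AE "$a" "$b" null: 2>&1)    # compare prints the metric on stderr
    if [ "$diff" = "0" ]; then
        echo "SAME: $a"
    else
        echo "DIFF ($diff pixels): $a"
    fi
done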
I've found the following pieces of OSX software that can identify duplicate images by image content:
pdiff
PixCompare
Duplicate Image Detector
Dupe Guru Image Edition
pdiff will definitely work from a bash script. The rest are more GUI oriented but may also have a command line interface.
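Another lightweight option, if only the headers differ, is ImageMagick's pixel signature (the %# format escape), which hashes the pixel values and ignores file metadata:
identify -format '%#\n' shot1.png shot2.png
If both files print the same hash, the pixel content is identical regardless of timestamps or other header fields.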

Imagemagick convert massively increasing filesize?

I want to resize some PNG images with imagemagick, but it's making the files 5 times larger when I convert them:
$ convert -resize 50% -quality 80 01.png 01_half.png
$ ls -hal 01*.*
-rw-rw-r-- 1 3.3M Sep 9 09:05 01_half.png
-rwxr-xr-x 1 651K Jan 13 2011 01.png
From 651KB to 3.3MB! Can anyone suggest how to stop this happening?
Please note that the behaviour of ImageMagick's -quality option is different for PNGs than it is for, e.g., JPEGs, and 80 is a pretty odd value for a PNG. As explained in the manual, for PNG the quality value is split into two digits: the first (tens) digit sets the zlib compression level, and the second (ones) digit selects the filter type.
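Assuming that interpretation, asking for the highest zlib level with adaptive filtering (quality 95 = compression level 9, filter type 5) will often produce a noticeably smaller PNG than quality 80, which uses level 8 with no filtering:
convert -resize 50% -quality 95 01.png 01_half.png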

Can ImageMagick return the image size?

I'm using ImageMagick from the command line to resize images:
convert -size 320x240 image.jpg
However, I don't know how to determine the size of the final image. Since this is a proportional image scale, it's very possible that new image is 100x240 or 320x90 in size (not 320x240).
Can I call the 'convert' command to resize the image and return the new image dimensions? For example, pseudo code:
convert -size 320x240 -return_new_image_dimension image.jpg // returns the new resized image dimensions
-ping option
This option is also recommended, as it prevents the entire image from being loaded into memory, as mentioned at: https://stackoverflow.com/a/22393926/895245:
identify -ping -format '%w %h' image.jpg
man identify says:
-ping efficiently determine image attributes
We can, for example, test it with one of the humongous images in Wikimedia's "Large image" category, e.g. this ultra-high-resolution image of Van Gogh's Starry Night, which Wikimedia lists as 29,696 × 29,696 pixels with a file size of 175.67 MB:
wget -O image.jpg https://upload.wikimedia.org/wikipedia/commons/e/e8/Van_Gogh_-_Starry_Night_-_Google_Art_Project-x0-y0.jpg
time identify -ping -format '%w %h' image.jpg
time identify -format '%w %h' image.jpg
However, I observed that -ping made no difference to the timing in this case at least; maybe it only matters for other image formats?
Tested on ImageMagick 6.9.10, Ubuntu 20.04.
See also: Fast way to get image dimensions (not filesize)
You could use an extra call to identify:
convert -size 320x240 image.jpg; identify -format "%[fx:w]x%[fx:h]" image.jpg
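If you'd rather avoid the second process, a single-command sketch (assuming a reasonably recent ImageMagick, using -resize for the proportional fit and -write info: to print the post-resize geometry) is:
convert image.jpg -resize 320x240 -format '%wx%h' -write info: resized.jpg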
I'm not sure about the %w and %h format specifiers. While Photoshop says my picture is 2678x3318 (and I really trust Photoshop), identify gives me:
identify -ping -format '=> %w %h' image.jpg
=> 643x796
(so does [fx:w] and [fx:h])
I had to use
identify -ping -format '=> %[width] %[height]' image.jpg
=> 2678x3318
I don't know what's going on here, but you can see both values on standard output (where the width and height before the => are the correct ones)
identify -ping image.jpg
image.jpg PAM 2678x3318=>643x796 643x796+0+0 16-bit ColorSeparation CMYK 2.047MB 0.000u 0:00.000
The documentation says %w is the current width and %[width] is original width. Confusing.
%w and %h may be correct for most uses, but not for every picture.
If you specify option -verbose, convert prints:
original.jpg=>scaled.jpg JPEG 800x600=>100x75 100x75+0+0 8-bit sRGB 4.12KB 0.020u 0:00.009
^^^^^^
