Vips + mozjpeg vs Imagick

I'm looking for a solution to get the best JPEG compression when downscaling an image. I'm comparing Vips + mozjpeg and Imagick (convert).
My original file (pic.jpg) is 6.5 MB.
I run:
vipsthumbnail pic.jpg --size=1920x1280 --delete -o pic-vips-q96.jpg[Q=96,optimize_coding,strip,interlace]
and my output file is 1.7 MB
I run:
convert pic.jpg -resize 1920x1280 -quality 96 -interlace plane -strip pic-imagick-q96.jpg
and my output file is 1.2 MB
Am I doing a fair comparison here? Is Imagick really that much better at compression?
The link to the original image (from unsplash):
https://images.unsplash.com/photo-1545278068-cdca78378350
I'm comparing these two libraries because they both have Go bindings, which I need in my project.
Grateful for any advice!

libvips automatically disables chroma subsampling for Q > 90, so your two compression settings are not quite the same. Try this:
$ vipsthumbnail pic.jpg --size=1920x1280 -o pic-vips-q90.jpg[Q=90,optimize_coding,strip,interlace]
$ ls -l pic-vips-q90.jpg
-rw-r--r-- 1 john john 495764 Dec 20 17:17 pic-vips-q90.jpg
$ convert pic.jpg -resize 1920x1280 -quality 90 -interlace plane -strip pic-imagick-q90.jpg
$ ls -l pic-imagick-q90.jpg
-rw-r--r-- 1 john john 492029 Dec 20 17:17 pic-imagick-q90.jpg
So they are very close. The remaining difference is perhaps just in the downsize algorithm -- maybe libvips is making a very slightly sharper image.
libvips will probably be using libjpeg-turbo by default. If you want to compress with mozjpeg, you'll need to build everything from source.

Related

Page sizes inside PDF are wrong when putting images as separate pages

I'm using ImageMagick 7.1:
$ convert -version
Version: ImageMagick 7.1.0-40 beta Q16-HDRI x86_64 21a5642bc:20220620 https://imagemagick.org
Copyright: (C) 1999 ImageMagick Studio LLC
License: https://imagemagick.org/script/license.php
Features: Cipher DPC HDRI OpenMP(4.5)
Delegates (built-in): fontconfig freetype jng jpeg pangocairo png x zlib
Compiler: gcc (10.2)
When I attempt to place two image files into a PDF, a very strange thing happens: the image sizes are correct, but the page sizes of the resulting PDF are wrong, and the images are stretched to fit the page. (However, if you were to extract those images, you'd see that their sizes are identical.)
I am trying to do so in two different ways:
convert file.jpeg -resize 1795x1287^ -gravity center -extent 1795x1287 /tmp/1.jpeg
convert back.png /tmp/2.jpeg
convert /tmp/1.jpeg /tmp/2.jpeg output-test.pdf
back.png is already at the correct resolution. I can see that the temporary files are the correct resolution, but the PDF is wrong.
I am also trying to do this in a single command:
convert \
file.jpeg -resize 1795x1287^ -gravity center -extent 1795x1287 \
back.png \
-quality 100 \
output.pdf
The resulting file displays incorrectly in the same way.
I have managed to generate the file the way I wanted, but I had to abandon ImageMagick for the last step. I have tried -density and -page, but to no avail. What worked in the end:
convert file.jpeg -resize 1795x1287^ -gravity center -extent 1795x1287 -density 300 -quality 100 /tmp/1.jpeg
convert back.png -density 300 -quality 100 /tmp/2.jpeg
img2pdf /tmp/1.jpeg /tmp/2.jpeg --pagesize 1795x1287 -o output-test.pdf

Convert entire folder to greyscale using image magick?

I am trying to convert an entire folder to grayscale using ImageMagick.
convert *.jpg -colorspace Gray -separate -average
is met with this error:
convert: `-average' @ error/convert.c/ConvertImageCommand/3290.
What is the correct command for this?
If you have lots of files to process, use mogrify:
magick mogrify -colorspace gray *.jpg
If you have tens of thousands of images and a multi-core CPU, you can get them all done in parallel with GNU Parallel:
parallel -X magick mogrify -colorspace gray ::: *.jpg
Also, the following can be used in a script, e.g. for the context menu of file managers like Dolphin, Nautilus, Nemo, Thunar etc.:
for filename in "$@"; do
    name="${filename%.*}"
    ext="${filename##*.}"
    cp "$filename" "$name"-grayscale."$ext"
    mogrify -colorspace gray "$name"-grayscale."$ext"
done
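The ${filename%.*} and ${filename##*.} expansions split the name and extension in pure bash, with no external tools; a quick illustration (the filename is made up):

```shell
filename="photos/bottle.front.jpg"
name="${filename%.*}"     # strip the shortest trailing .suffix -> photos/bottle.front
ext="${filename##*.}"     # keep the text after the last dot    -> jpg
echo "$name"-grayscale."$ext"
```

This prints photos/bottle.front-grayscale.jpg, which is the output filename the loop above constructs.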

Making white background transparent using ImageMagick

I have about 2700 images that I want to:
Convert to .png
Make the white background, transparent
To do this, I downloaded ImageMagick using Homebrew and ran the below command in the relevant directory:
find . -type f -name "*.jpg" -print0 | while IFS= read -r -d $'\0' file; do convert -verbose "$file" -transparent white "$file.png"; done
This worked; however, the images still have a few white specks around them, as per the below image. With off-white bottles it's even harder, because it makes some of the bottle transparent too!
In Photoshop, you can adjust the "tolerance" of the "Magic Wand" tool to make sure this doesn't happen, but I'm not sure how to do that with ImageMagick, and I can't find anything on Google.
Example of Image with white crust around outside
Can anyone help? Is there a way of doing this with ImageMagick? Is there a better way of processing these 2700 images to remove the white background?
Thanks
Use the -fuzz option in ImageMagick:
$ convert img.jpg -fuzz 32% -transparent '#ffffff' out.png
(The color is quoted so the shell doesn't treat the # as the start of a comment.) This will allow you to adjust the tolerance value. Hope this helped.
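ImageMagick interprets a percentage -fuzz as a fraction of the build's quantum range, which is 65535 on the Q16 builds most distributions ship. So -fuzz 32% means colours within roughly this per-channel distance of white get knocked out (a rough sketch, assuming a Q16 build):

```shell
# -fuzz N% is taken as a fraction of QuantumRange,
# which is 65535 (2^16 - 1) on Q16 builds of ImageMagick
quantum=65535
pct=32
echo $(( quantum * pct / 100 ))   # -> 20971
```

Raising or lowering the percentage moves that distance threshold, which is the ImageMagick equivalent of the Magic Wand tolerance slider.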

Simple way to resize large number of image files

I have a folder which contains about 45,000 JPEG images. Most of them are 10 KB - 20 KB.
Now I want to write a script to resize all of them to a fixed size of 256x256. I wonder if there is any simple way to do that, like for a in *.jpg; do .... I am using an 8-core CPU machine with 8 GB of RAM running Ubuntu 14.04, so it is fine if the process requires many resources.
I would use GNU Parallel, like this to make the most of all those cores:
find . -name \*.jpg | parallel -j 16 convert {} -resize 256x256 {}
If you had fewer files, you could do it like this, but the commandline would be too long for 45,000 files:
parallel -j 16 convert {} -resize 256x256 {} ::: *.jpg
Also, note that if you want the files to become EXACTLY 256x256 regardless of input dimensions and aspect ratio, you must add ! after the geometry, like this: -resize '256x256!' (quote it so an interactive shell doesn't history-expand the !).
As Tom says, make a backup first!
Here is a little benchmark...
# Create 1,000 files of noisy junk at 1024x1024 pixels
seq 1 1000 | parallel convert -size 1024x1024 xc:gray +noise random {}.jpg
# Resize all 1,000 files using mogrify
time mogrify -resize 256x256 *.jpg
real 1m23.324s
# Create all 1,000 input files afresh
seq 1 1000 | parallel convert -size 1024x1024 xc:gray +noise random {}.jpg
# Resize all 1,000 files using GNU Parallel
time parallel convert -resize 256x256 {} {} ::: *.jpg
real 0m22.541s
You can see that GNU Parallel is considerably faster for this example. To be fair, though, it is also wasteful of resources, because a new process has to be created for each input file, whereas mogrify uses one process that does all the files. If you knew that the files were named in a particular fashion, you might be able to optimise things further...
Finally, you may find xargs and mogrify in concert work well for you, like this:
time find . -name \*.jpg -print0 | xargs -0 -n 100 -P 8 mogrify -resize 256x256
real 0m20.840s
which allows up to 8 mogrify processes to run in parallel (-P 8), and each one processes up to 100 input images (-n 100) thereby amortizing the cost of starting a process over a larger number of files.
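The batching behaviour of -n is easy to see with echo standing in for mogrify (no images needed):

```shell
# xargs collects up to 3 arguments per echo invocation,
# just as -n 100 hands mogrify up to 100 filenames at a time
printf '%s\n' 1.jpg 2.jpg 3.jpg 4.jpg 5.jpg 6.jpg 7.jpg \
  | xargs -n 3 echo batch:
```

This prints three lines, "batch: 1.jpg 2.jpg 3.jpg", "batch: 4.jpg 5.jpg 6.jpg" and "batch: 7.jpg", i.e. three invocations instead of seven.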
You could use the mogrify tool provided by ImageMagick:
mogrify -resize 256x256 *.jpg
This modifies all files in place, resizing them to 256x256px. Make sure to take a backup of your originals before using this command.

Improving process for using ImageMagick to batch convert TIFF to PNG and resample

I have several folders of 600 dpi TIFFs (CCITT Group IV, so black & white) that I need to convert to screen resolution PNGs - so in ImageMagick terms, I need to convert the format and resample the images to ~80 dpi. My first approach was to perform this in a single mogrify command (this is in bash on Mac OS X):
for folder in $(find * -maxdepth 0 -type d ); \
do mogrify -path "$folder/medium" -format png -resample 31.5% "$folder/tiff/*.tif"; \
done
But the result was awful. The text in the resulting image was completely illegible. So I changed this to a two step process, (1) converting the TIFF to PNG at original resolution, then (2) downsizing the resolution:
for folder in $(find * -maxdepth 0 -type d ); \
do mogrify -path "$folder/medium" -format png "$folder/tiff/*.tif"; \
mogrify -resample 31.5% "$folder/medium/*.png"; \
done
While this process resulted in nice and crisp results at 80 dpi, the process was much slower, since I'm now writing the full resolution file to disk before downsizing the resolution.
Does anyone have a suggestion for the best way to accomplish a conversion and downsizing of resolution in a single step?
The sips tool can be used as follows:
sips -s format png -s dpiHeight 80 -s dpiWidth 80 -z 1200 1600 test.tiff --out test.png
Having said that, in the resulting .png the DPI settings don't seem to have been changed.
Also when resizing, it looks like you can only specify absolute pixel dimensions of the output image, and not a percentage of the input image. So you would have to grab the dimensions of the input image and calculate the new size explicitly:
#!/bin/bash
infile=test.tiff
outfile=test.png
pct=31 # only whole numbers for bash arithmetic
height=$(sips -g pixelHeight $infile | tail -1 | cut -d: -f2)
width=$(sips -g pixelWidth $infile | tail -1 | cut -d: -f2)
sips -s format png -s dpiHeight 80 -s dpiWidth 80 -z $((height*pct/100)) $((width*pct/100)) "$infile" --out "$outfile"
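Since bash $(( )) arithmetic is integer-only, the percentage must be a whole number, which is why the script uses 31 rather than 31.5. For example, with made-up input dimensions of 4961x7016 (the numbers are purely illustrative):

```shell
pct=31          # whole numbers only: bash $(( )) is integer arithmetic
width=4961      # hypothetical input dimensions for illustration
height=7016
echo "$(( width * pct / 100 ))x$(( height * pct / 100 ))"
```

This prints 1537x2174; any fractional remainder is simply truncated.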
I know I am late to the party, but I was looking at this and wondered why you get poor quality when doing both steps in one go. I wondered if it was maybe down to using mogrify rather than convert, so I set about trying to improve it. So, this would be my first and best attempt:
#!/bin/bash
for f in */tiff/*.tif; do
out="${f%tif}png" # replace "tif" suffix with "png"
out=${out/tiff/medium} # replace "tiff" directory with "medium"
convert "$f" -resample 31.5% "$out"
done
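The two parameter expansions in that loop do the path rewriting without any external tools; traced through for a hypothetical file:

```shell
f="album/tiff/page-001.tif"
out="${f%tif}png"         # swap the tif suffix   -> album/tiff/page-001.png
out="${out/tiff/medium}"  # swap the directory    -> album/medium/page-001.png
echo "$out"
```

This prints album/medium/page-001.png. Note that ${out/tiff/medium} replaces only the first occurrence of "tiff", which here is the directory name, since the suffix has already been rewritten to png.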
And, if that still doesn't work, I could go for a second attempt which avoids writing the file to disk and then resampling it, and instead writes a PNG to stdout and then pipes that to a second convert that resamples and writes to disk - thereby avoiding the writing to disk of the big, intermediate PNG.
#!/bin/bash
for f in */tiff/*.tif; do
out="${f%tif}png" # replace "tif" suffix with "png"
out=${out/tiff/medium} # replace "tiff" directory with "medium"
convert "$f" PNG:- | convert - -resample 31.5% "$out"
done
