Using ImageMagick, I'm trying to resize a JPEG's dimensions and reduce the file size.
The catch is that I don't want to degrade the image quality.
I've tried the following commands:
convert -resize 170x80 -resample 100x100 image1.jpg image2.jpg
=> A resized picture but with bad quality.
convert -resize 170x80 -quality JPEG image1.jpg image2.jpg
=> A resized image with good quality, but the same file size.
convert -density 600 -resize 170x80 image1.jpg image2.jpg
=> A resized image but very bad quality.
I don't know what option I should use.
The -quality parameter takes a numeric value. From the -quality documentation:
For the JPEG and MPEG image formats, quality is 1 (lowest image quality and highest compression) to 100 (best quality but least effective compression). The default is to use the estimated quality of your input image if it can be determined, otherwise 92.
You can use a quality lower than the default 92 to reduce the file size, e.g. 70:
convert -resize 170x80 -quality 70 image1.jpg image2.jpg
I've managed to solve this issue using convert and mogrify (note that -quality takes a numeric value, e.g. 80):
convert -flatten -colorspace RGB myImage.jpg myImage.jpg &&
mogrify -quality 80 -geometry 170x80 myImage.jpg
There is a comic book PDF file which has a lot of white space at the bottom.
The content is almost half the length of the page size.
How can I crop all pages in the PDF file?
I have tried ImageMagick, but the quality is poor.
convert -verbose -density 300 -interlace none -quality 100 input.pdf output.pdf
In ImageMagick, try using a larger density and then resize by the inverse amount in percent. Here, I use density = 4*72 = 288 and then resize by 25% (1/4).
convert -density 288 image.pdf -resize 25% -fuzz 15% -trim +repage result.pdf
I'm trying to decode aztec codes from images using zxing library.
The images look more or less like this:
https://imgur.com/a/5ExPy6q
So far my results are quite random.
I've tried a few image-processing operations using ImageMagick, such as:
convert -brightness-contrast 50x20 in.png out.png
convert -colorspace Gray in.png out.png
There was some improvement, but most of the codes still fail to decode.
What specific image preprocessing steps should I apply for such barcodes?
You can try -lat (local area threshold) in ImageMagick. For example:
convert barcode.png -colorspace gray -negate -lat 20x20+10% -negate result.png
You can improve that a little by adding -morphology open:
convert barcode.png -colorspace gray -negate -lat 20x20+10% -negate -morphology open diamond:1 result2.png
I have many product images on my local drive that I got from different sources and some of them are messed up a bit. I'm talking about images that are large in resolution but it is apparent that this resolution has been achieved by resizing the image from a very small source.
Is there a software tool or something that could find these usually high-res but low-quality images? Thanks for any ideas.
I have a few ideas, and I'll show what I am getting at with ImageMagick which is installed on most Linux distros and is available (for free) for macOS and Windows.
Just to clarify what I am talking about: it is the lack of high-frequency information (detail) in images when they are upsized (up-rezzed) from smaller images.
The technique goes like this. Take an image, copy it, scale it down to a percentage of its original size and then scale it back up and measure how much it differs from the original. Here is an example:
magick start.jpg -set option:geom "%G" \( +clone -resize 50% -resize "%[geom]"\! \) -metric MSE -compare -format "%[distortion]" info:
0.00220709
If I now do that in a loop, I can get the MSE ("Mean Squared Error") for resizing an image down to 10% and back up, down to 20% and back up, down to 30% and back up like this:
for ((size=10;size<100;size+=10)); do
distortion=$(magick start.jpg -set option:geom "%G" \( +clone -resize "${size}%" -resize "%[geom]"\! \) -metric MSE -compare -format "%[distortion]" info: 2>&1)
echo $size $distortion
done
Sample Output
10 0.00641669
20 0.00461728
30 0.00351362
40 0.0027639
50 0.00220709
60 0.00173019
70 0.00130171
80 0.000935031
90 0.000637741
If you run that again, but redirect the output to a file called "data", you can plot it with gnuplot:
gnuplot --persist -e "set yrange [0:0.01];set title '10: MSE vs Resize Percentage';plot 'data'"
Now, we come to the actual point. If I run the plot for a file that was up-rezzed from 75% of its original size, then again for a file that was up-rezzed from 50% of its original size, and again for 25% and 15%, I can put them together in an animation.
Hopefully, you can see that the purple points depart from the x-axis (where the MSE error is low) immediately at the point corresponding to the percentage of the original size from which the image was up-rezzed.
So, I am suggesting that you look at your images and find a threshold for the error that would correspond to the degree of up-rezzing likely to be present and then test the error for any individual image against that threshold.
This works just the same on Windows; all the code above is only for generating the plots and numbers for the animation. You just need to get the MSE with one line:
magick YOURIMAGE -set option:geom "%G" \( +clone -resize 50% -resize "%[geom]"\! \) -metric MSE -compare -format "%[distortion]" info:
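Once you can get that number for one image, the whole check can be scripted. Here is a minimal sketch of the threshold test; the single 50% probe size and the 0.002 cut-off are assumptions you would tune against images you know to be genuine:
threshold=0.002
for f in *.jpg; do
  mse=$(magick "$f" -set option:geom "%G" \( +clone -resize 50% -resize "%[geom]"\! \) -metric MSE -compare -format "%[distortion]" info: 2>&1)
  # a low MSE after the 50% round-trip means little detail was lost,
  # which suggests the image was up-rezzed from half size or smaller
  if (( $(echo "$mse < $threshold" | bc -l) )); then
    echo "$f: likely up-rezzed (MSE=$mse)"
  fi
done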
Per GPG guidelines, photo IDs should be about 240x288 pixels and 4k-6k in size. I used: convert -resize 240x240 -quality 75 original.jpg gpguid.jpg. However, the resulting file is still 17k. Specifically, I am curious what options I have before decreasing the quality level.
This is the command I settled on: convert -resize 240x240 -quality 55 -strip -sampling-factor 4:2:0 original.jpg gpguid.jpg.
The original command yielded a file that was 17,169 bytes. The -strip argument brought the size down to 10,226 bytes by removing profiles and comments. The -sampling-factor argument reduced the size to 8,315 bytes by cutting the chroma channel's resolution in half without affecting the luminance resolution. Finally, turning down the quality to 55 brought me within the recommended 6K.
ImageMagick has a feature that allows you to set a maximum output file size, and you may find you get on better letting it do that, rather than arbitrarily reducing the quality until every photo fits within the specified size; that way you should retain as much quality as possible:
convert input.jpg -strip -resize 240x240 -define jpeg:extent=6kb result.jpg
The above command will result in a file just under 6kB.
If you want a way to do something similar with Python, I wrote an answer that works pretty well here. It does a binary search for a JPEG quality that satisfies a maximum size requirement.
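That linked answer is in Python, but the same binary-search idea can be sketched directly in the shell; the 6 kB target, the 240x240 size and the quality bounds below are assumptions taken from this question:
target=6144                               # 6 kB, per the GPG guideline above
lo=1 hi=95 best=1
while [ "$lo" -le "$hi" ]; do
  mid=$(( (lo + hi) / 2 ))
  convert original.jpg -strip -resize 240x240 -quality "$mid" gpguid.jpg
  size=$(( $(wc -c < gpguid.jpg) ))       # output size in bytes
  if [ "$size" -le "$target" ]; then
    best=$mid; lo=$(( mid + 1 ))          # fits: try a higher quality
  else
    hi=$(( mid - 1 ))                     # too big: try a lower quality
  fi
done
convert original.jpg -strip -resize 240x240 -quality "$best" gpguid.jpg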
I'm scaling an image down to 50% of its original size with the convert command and the -scale parameter. The generated image quality is pretty bad. Is there any extra option I can use to get a better result?
-scale is quick, but I believe -resize or -thumbnail is better, and you can use any filter you like.
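For example, something along these lines (Lanczos is just one filter choice, and the filenames are placeholders):
convert input.jpg -filter Lanczos -resize 50% output.jpg
convert input.jpg -thumbnail 50% output.jpg
-thumbnail works like -resize but also strips any profiles and comments, which keeps thumbnails small.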
Using -sharpen 0x1.2 together with -resize x% and -quality 95 produces good results for me.
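In full, that could look like this (the filenames and the 50% figure are placeholders matching the question):
convert input.jpg -resize 50% -sharpen 0x1.2 -quality 95 output.jpg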
I was trying to create thumbnails of PDFs and was getting poor results until I added -density 400, which is a suggestion I found here. Below is an example of the difference it can make. I got the PDF from here. Look at the bear and also the lines behind the bird.
Without -density 400
Full command: convert -resize 500x500 catalog.pdf[0] catalog-page1-resized.png
File size: 180 KB
With -density 400
Full command: convert -density 400 -resize 500x500 catalog.pdf[0] catalog-page1-resized.png
File size: 185 KB
Using -quality 80 -adaptive-resize is better for larger photos.
If you need to blur the output, use -interpolative-resize instead of -adaptive-resize.
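For example (the 50% geometry and the filenames are placeholders):
convert photo.jpg -adaptive-resize 50% -quality 80 smaller.jpg
convert photo.jpg -interpolative-resize 50% -quality 80 smaller.jpg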