In the Fine Uploader scaling feature, how can I define the height and width, not just the full width (the maxSize option)? - fine-uploader

How can I set height and width in scaling, and can I depend on the image generated (quality and professional scale generation)?

how can I set height and width in scaling
You can't. Specify a maxSize for each scaling.sizes entry and Fine Uploader will proportionally scale the image.
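For example, a scaling.sizes setup might look like the sketch below (the entry names and pixel values are arbitrary illustrations); each entry produces one scaled upload whose longest edge will not exceed its maxSize:

scaling: {
    sizes: [
        { name: "small", maxSize: 100 },
        { name: "large", maxSize: 800 }
    ]
}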
can I depend on the image generated (quality
Quality will be limited if you rely on the browser only. There is an entire section in the documentation that explains how you can generate higher-quality resizes by integrating a third-party resize library. I also discuss why you may or may not want to do this. From the documentation:
Fine Uploader's internal image resize code delegates to the drawImage
method on the browser's native CanvasRenderingContext2D object. This
object is used to manipulate a <canvas> element, which represents a
submitted image File or Blob. Most browsers use linear interpolation
when resizing images. This can lead to extreme aliasing and moire
patterns which is a deal breaker for anyone resizing images for
art/photo galleries, albums, etc. These kinds of artifacts are
impossible to remove after the fact.
If speed is most important, and precise scaled image generation is not
paramount, you should continue to use Fine Uploader's internal scaling
implementation. However, if you want to generate higher quality scaled
images for upload, you should instead use a third-party library to
resize submitted image files, such as pica or limby-resize. As of
version 5.10 of Fine Uploader, it is extremely easy to integrate such
a plug-in into this library. In fact, Fine Uploader will continue to
properly orient the submitted image file and then pass a properly
sized <canvas> to the image scaling library of your choice to receive
the resized image file, along with the original full-sized image file
drawn onto a <canvas> for reference. The only caveat is that, due to
issues with scaling larger images in iOS, you may need to continue to
use Fine Uploader's internal scaling algorithm for that particular OS,
as other third-party scaling libraries most likely do not contain
logic to handle this complex case. Luckily, that is easy to account
for as well.
If you'd like to, for example, use pica to generate higher-quality
scaled images, simply pull pica into your project, and contribute a
scaling.customResizer function, like so:
scaling: {
    customResizer: !qq.ios() && function(resizeInfo) {
        return new Promise(function(resolve, reject) {
            pica.resizeCanvas(resizeInfo.sourceCanvas, resizeInfo.targetCanvas, {}, resolve)
        })
    },
    ...
}
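Note the !qq.ios() guard in the snippet above: on iOS the customResizer expression evaluates to false, so Fine Uploader falls back to its internal scaling implementation, which accounts for the iOS canvas limitations described in the quoted documentation.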

Related

Handling image size on multiple device displays in a cordova-ionic-angular app

I'm building a new app with this great tool and I have a question to solve.
What is the best way to handle image size for multiple screens and multiple devices?
Apple = retina and non-retina
Android = ldpi, mdpi, hdpi, xhdpi, xxhdpi and tablets (all of this with multiple screen resolutions)
BlackBerry10 = one resolution (but not equal to the others)
WindowsPhone8 = one resolution (but not equal to the others)
For this case, what is the best way?
Use SVG images (optimization, performance, app size)?
Dedicated CSS rules for device pixel ratio (CSS image replacement) (the designer can kill me :smile: lol); see the list at http://bjango.com/articles/min-device-pixel-ratio/
CSS sprite sheet?
Another way?
Before deciding, think about what is best to apply across all devices.
Thanks in advance
There really isn't a single way to do this since each implementation of an image will require a different approach depending on its purpose. SVGs are great but not everything works as an SVG.
Media queries will be your ally.
See this: supporting multiple resolution and density of images in phonegap
and this for an alternate approach: Angular.js data-bind background images using media queries
There are also some nice polyfills for the HTML5 picture element which you might find useful.
See: https://github.com/scottjehl/picturefill
...and its angularjs directive implementation https://github.com/tinacious/angular-picturefill
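If you need to make the same media-query decision from script rather than CSS, window.matchMedia can evaluate a resolution query directly. A minimal sketch (the file names and element id are invented for illustration):

var highDpi = window.matchMedia(
    '(-webkit-min-device-pixel-ratio: 2), (min-resolution: 192dpi)'
);
var logo = document.getElementById('logo');
// Pick the @2x asset only when any of the high-density queries matches.
logo.src = highDpi.matches ? 'img/logo@2x.png' : 'img/logo.png';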

Is there a good practice for serving images to mobile and HIDPI/retina devices yet?

I'm planning to create a blog and I'm thinking about how to serve images in the best possible quality without affecting loading times for mobile devices too much.
My content column is approx. 768 display points wide and I'm posting screenshots using a Retina MacBook, so the images will be at least 1500px wide. Other Retina/HiDPI users should see the version best for them.
Is there a recommended best practice for this case? I'm talking only about content images (<img />); no CSS backgrounds. A true WordPress solution that serves images based on the uploaded hi-res images would be great.
I have used adaptive images in the past, but it does not distinguish between HiDPI and normal clients. Other solutions serve for Retina but not for different screen sizes.
No, there is no good method for doing this. This isn't a fault of WordPress, but of the state of the web in general. HiDPI on the web is currently a big hack, basically, and the web has not yet adapted to it properly.
Basically, there's no good way to specify multiple resolution images in the HTML in a way that the browser can then use to determine the proper image to show. CSS works, but only if you're using images as backgrounds. If you're dealing with IMGs, then you have to rely on Javascript to do the job, and that has various downsides depending on your methodology.
In the future, when browsers have started to adopt methods for specifying multiple resolution images, then this won't be as much of a problem.
Edit: This deserves a bit more explanation.
The basic problem with using JS to do this is that it relies on a technique of image-replacement. The JS code, through whatever logic it has, finds the IMGs on the page, then replaces them with higher resolution versions from some other file. No matter what method is used to do this, it has two major downsides:
First, the initial image is loaded in low resolution, then loaded again in high resolution when the image is replaced. This means extra bandwidth usage, and considering that most HiDPI devices are currently mobile ones, it doesn't make much sense to waste that sort of bandwidth on those devices.
Second, the image is generally first displayed in low resolution and then replaced with the high resolution image. This means that there's a second or two of showing a fuzzy image on HiDPI screens, before it "pops" into focus as the replacement occurs. This is a really crappy effect to have.
Other methods exist by which you can simply serve the proper image in the first place, but they have downsides as well.
Server-side browser detection by User-Agent is a pretty crappy way to do things in the first place, and best avoided simply because there's so many different agents out there in the wild that maintaining a list of them is virtually impossible. However, if you want to go through the pain of it, this method can work.
Alternatively, it's possible to have Javascript in the browser detect the device-pixel-ratio of the browser and set that in a Cookie to return to the server. The server can then use this cookie to serve the proper resolution image. This works, but is unreliable because many mobile user agents fail to properly set and return the cookie, and even on those that do, the high-resolution images would only be shown on the second visit to the page. First impressions are low-res, which isn't good in general.
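As a rough sketch of the client half of that cookie approach (the cookie name and lifetime are my own choices; the server-side read is not shown):

if (window.devicePixelRatio !== undefined) {
    // Persist the ratio so the server can pick the right image on later requests.
    document.cookie = 'device_pixel_ratio=' + window.devicePixelRatio +
        '; path=/; max-age=31536000';
}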
As you can see: it's just hacks all the way down. Until the browser can be told, for a specific IMG, that multiple versions exist along with their parameters, and is then allowed to choose for itself, these are the only ways to do it. But if things like the proposed HTML5 "srcset" attribute or the PICTURE tag are implemented, we'll have better choices.
After thinking about this for a while, I can offer two ways to approach parts of this problem. Still: read the answer from @Otto carefully. It has a lot of in-depth details for a reason.
Javascript detection
I wrote an answer about »How to add default images« for attachments. You can alter the filter callback to always add a simple png/gif (that has just the background color) instead of the normal image or thumbnail. This would help you serve an extremely small image (maybe below 1kB - Smush.it to the rescue) that the user agent/browser would cache on first load and then simply insert for every other call to an image.
Then simply detect the display pixel density with…
var dpr = 1;
if ( window.devicePixelRatio !== undefined ) {
    dpr = window.devicePixelRatio;
}
if ( dpr > 1 ) {
    // High pixel density displays
}
…exchange the images when the DOM has fully loaded.
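One minimal sketch of that exchange step, reusing the dpr check above and assuming the markup carries the high-resolution URL in a data attribute (the data-hires name is my invention, not part of the answer):

document.addEventListener('DOMContentLoaded', function () {
    if (dpr <= 1) {
        return; // standard-density displays keep the small placeholder images
    }
    var images = document.querySelectorAll('img[data-hires]');
    for (var i = 0; i < images.length; i++) {
        // Swap the tiny placeholder for the high-resolution version.
        images[i].src = images[i].getAttribute('data-hires');
    }
});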
CSS routing
Adding stylesheets for different pixel density displays isn't that hard:
// For default/low density pixel displays only:
wp_enqueue_style(
    'low_density_styles',
    trailingslashit( get_stylesheet_directory_uri() ) . 'low_density.css',
    array(),
    filemtime( locate_template( 'low_density.css' ) ),
    'screen'
);

// For high density pixel displays only:
$media  = 'only screen and (min--moz-device-pixel-ratio: 2), ';
$media .= 'only screen and (-o-min-device-pixel-ratio: 2/1), ';
$media .= 'only screen and (-webkit-min-device-pixel-ratio: 2), ';
$media .= 'only screen and (min-device-pixel-ratio: 2)';
wp_enqueue_style(
    'high_density_styles',
    trailingslashit( get_stylesheet_directory_uri() ) . 'high_density.css',
    array(),
    filemtime( locate_template( 'high_density.css' ) ),
    $media
);
Now you can simply switch between high/low density images for your default images (header, logo, etc.). This doesn't work for post type attachments (again: refer to @Otto's answer).

Is it better to use CSS / HTML to adjust an image size, or save multiple versions of the image?

Up until now, when a user has uploaded an image, I have been saving several different versions of it for use throughout my site. As the site has grown, so has the number of sizes needed.
At the moment each uploaded image is resized into about 6 new images and saved on the server.
The downside is that every time I need to create a new size (right now, for instance, I'm making a new size for an image gallery), I have to cycle through all the thousands of images and re-cut a new size for each.
Whereas when I started it was a nice, quick way to avoid resizing images on the fly, it's now starting to turn into a nightmare.
Is it better to continue saving different sizes, and just deal with the overhead, or is it better at this point to get maybe 3 general sizes, and resize them on the fly as needed?
"Resizing" images using html/css (e.g., specifying height & width) is generally not what you want to do - it results in poorly scaled images with artifacts from the resize, and is inefficient as the user is potentially downloading a much larger file than they actually need.
Rather, some kind of server-side solution that allows on-the-fly resizing is probably what you want. I'd recommend using ImageMagick - combined with the implementation for your favorite language and some web-server voodoo (e.g., using .htaccess for Apache), you can easily have /path/to/yourimage.png?50x50 fire a call to a resize script that resizes the image, saves it in a cache folder, and outputs the resized file to the browser. This is better all around - you get proper resizing, your user only downloads the exact file they need, and the end result is cached so the resize action only occurs once. Check out Image::Magick::Thumbnail for an example (in Perl).
Edit - if you respond to this with what server-side language/framework you are using, I would be happy to point you in the direction of a thumbnail/resizing implementation of ImageMagick or something else for your platform.
Multiple versions.
Some browsers simply don't scale these things well, and you end up with choppy nastiness in the image, bad pixelation, etc.
The exception could be if you know all the images are photographic: then keeping versions for your larger sizes but shrinking in the browser can be OK. But if the images contain illustration or text, the effect will be noticeable.
/* Scale proportionally to a fixed width */
.resize {
    width: 200px;
    height: auto;
}

/* Scale proportionally to a fixed height */
.resize {
    width: auto;
    height: 300px;
}

Very large images in web browser

We would like to display very large (50 MB plus) images in Internet Explorer. We would like to avoid compression, as compression algorithms are not what CSI would have us believe, and the resulting files are too lossy.
As a result, we have come up with two options: Silverlight Deep Zoom or a Flash based solution (such as Zoomify). The issue is that both of these require conversion to a tiled output and/or conversion to a specific file type (Zoomify supports a single proprietary file type, PFF).
What we are wondering is if a solution exists which will allow us to view the image without a conversion beforehand.
PS: I know that you can write an application to tile the images (as needed or after the load process) and output them; however, we would like to do this without chopping up the file.
The tiled approach really is the right way to do it.
Your users don't want to download a 50mb file before they can start viewing the image. You don't want to spend the bandwidth to serve 50 megs to every user who might only view a fraction of your image.
If you serve the whole file, users will eventually be able to load and view it, but it won't run smoothly for most of them.
There is no simple non-tiled way to serve just a portion of an image unless you want to use a server-side library like ImageMagick or PIL to extract a specific subset of the image for each user. You probably don't want to do that, because it would place a significant load on your server.
Alternatively, you might use something like Google's map tool to provide zooming and scaling. Some comments on doing that are available here:
http://webtide.wordpress.com/2008/08/27/custom-google-maps/
Take a look at OpenSeadragon. To make an image work with OpenSeadragon, you need to generate a zoomable image format, as mentioned here. Then follow the getting-started guide here.
The browser isn't going to smoothly load a 50 meg file; if you don't chop it up, there's no reasonable way to make it not lag.
If you don't want to tile, you could have the server open the file and render a screen-sized view of the image for display in the browser at the particular zoom resolution requested. This way you aren't sending 50 meg files across the line when someone only wants to get an overview of the image. That is, the browser requests a set of coordinates and an output size in pixels, the server opens the larger image and creates a smaller image that fits the desired view, and sends that back to the web browser.
As far as compression, you say it's too lossy, but if that's what you are seeing, you are probably using the wrong compression algorithm or settings for the type of image you have. The JPG format has quality settings to control lossiness, and PNG compression is lossless (the pixels you get after decompressing are the exact values you had prior to compression). So consider changing the compression you are using, and don't just rely on the default settings in an image editor.

Serving Images with on-the-fly resize

My company has recently started to get problems with the image handling for our websites.
We have several websites (adult entertainment) that display images like DVD covers, snapshots and similar. We have about 100'000 movies, and for each movie we have an average of 30 snapshots + covers. Almost every image has an additional version with blurring and overlay for non-members; this results in about 50 images per movie, or a total of 5 million base images. Each of the images is available in several versions, depending on where it's placed on the page (thumbnail, original, small preview, not-so-small preview, small image in the top list, etc.), which results in more images than I cared to count.
Now I had the idea to use a server to generate the images on the fly, since it became quite clumsy to generate all the different images for all the different pages (as different pages sometimes even need different image sizes for basically the same task).
Does anyone know of an image processing server that can scale down images on-the-fly so we only need to provide the original images and the web guys can just request whatever size they need?
Requirements:
Very high performance (several thousand users per day)
On-the-fly blurring and overlay creation
On-the-fly resize (with and without keeping aspect ratio)
Can handle millions of images
Must be able to read JPG, GIF, PNG and BMP and convert between them
Security is not that much of a concern, as e.g. the unblurred images can already be reached by URL manipulation; more security would be nice, but it's not required, and frankly I've stopped caring (after failing to get into my coworkers' heads why it's a bad idea (for our small reseller page) to use http://example.com/view_image.php?filename=/data/images/01020304.jpg to display the images).
We tried PHP scripts to do this but the performance was too slow for this many users.
Thanks in advance for any suggestions you have.
I suggest you set up a dedicated web server to handle image resize and serve the final result. I have done something similar, although on a much smaller scale. It basically eliminates the process of checking for the cache.
It works like this:
you request the image appending the required size to the filename like http://imageserver/someimage.150x120.jpg
if the image exists, it will be returned with no other processing (this is the main point, the cache check is implicit)
if the image does not exist, handle the 404 not found via .htaccess and reroute the request to the script that generates the image of the required size
in the script specify the list of allowed sizes to avoid attacks like scripts requesting every possible size to shut your server down
keep this on a cookieless domain to minimize unnecessary traffic
EDIT: I don't think that PHP itself would slow the process much, as PHP scripting in this case is reduced to a minimum: the image scaling is done by a built-in library written in C. Whatever you do, you'll have to use a library like this (GD or libmagick or so), so that's unavoidable. With my system you at least totally skip the overhead of checking the cache, thus further reducing PHP interaction. You can implement this on your existing server, so I guess it's a solution well suited to your budget.
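The flow above is web-server-agnostic. For illustration only (the answer's actual setup is .htaccess plus a script on the same server; input validation beyond the size whitelist is omitted here), a compact sketch of the same implicit-cache idea in Node with Express and the sharp library, assuming URLs like /someimage.150x120.jpg:

const express = require('express');
const fs = require('fs');
const path = require('path');
const sharp = require('sharp');

// Whitelist of allowed sizes, as the answer advises, to block size-enumeration abuse.
const ALLOWED_SIZES = new Set(['150x120', '300x240']);
const app = express();

app.get('/:name.:size.jpg', async (req, res) => {
    const { name, size } = req.params;
    if (!ALLOWED_SIZES.has(size)) return res.status(404).end();

    const cached = path.resolve('cache', `${name}.${size}.jpg`);
    if (!fs.existsSync(cached)) {
        // Cache miss: generate the requested size from the original exactly once.
        const [width, height] = size.split('x').map(Number);
        await sharp(path.resolve('originals', `${name}.jpg`))
            .resize(width, height)
            .toFile(cached);
    }
    res.sendFile(cached); // every later request for this size is a direct file send
});

app.listen(8080);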
Based on
We tried PHP scripts to do this but the performance was too slow for this many users.
I'm going to assume you weren't caching the results. I'd recommend caching the resulting images for a day or two (i.e. have your script check whether the thumbnail has already been generated; if so, use it; if it hasn't, generate it on the fly).
This would improve performance dramatically as I'd imagine the main/start page probably has a lot more hits than random video X, thus when viewing the main page no images have to be created as they're cached. When User Y views Movie X, they won't notice the delay as much since it just has to generate that one page.
For the "On-the-fly resize" aspect - how important is bandwidth to you? I'd want to assume you're going through so much with movies that a few extra kb in images per request wouldn't do too much harm. If that's the case, you could just use larger images and set the width and height and let the browser do the scaling for you.
The ImageCache and Image Exact Sizes solutions from the Drupal community might do this, and like most OSS solutions they use the ImageMagick libraries.
There are some AMI images for Amazon's EC2 service to do image scaling. They use Amazon S3 for image storage, originals and scales, and can feed them through to Amazon's CDN service (CloudFront). Check the EC2 site for what's available.
Another option is Google. Google Docs now supports all file types, so you can load the images up to a Google Docs folder and share the folder for public access. The URLs are kind of long, e.g.
http://lh6.ggpht.com/VMLEHAa3kSHEoRr7AchhQ6HEzHVTn1b7Mf-whpxmPlpdrRfPW216UhYdQy3pzIe4f8Q7PKXN79AD4eRqu1obC7I
Add the =s parameter to scale the image - cool! E.g. for 200 pixels wide:
http://lh6.ggpht.com/VMLEHAa3kSHEoRr7AchhQ6HEzHVTn1b7Mf-whpxmPlpdrRfPW216UhYdQy3pzIe4f8Q7PKXN79AD4eRqu1obC7I=s200
Google only charges USD 5/year for 20 GB. There is a full API for uploading docs etc.
Other answers on SO
How best to resize images off-server
OK, the first problem is that resizing an image in any language takes a little processing time. So how do you support thousands of clients? Well, you cache the result so you only have to generate the image once. The next time someone asks for that image, check whether it has already been generated; if it has, just return that. If you have multiple app servers, you'll want to cache to a central file system to increase your cache-hit ratio and reduce the amount of space you will need.
In order to cache properly, you need to use a predictable naming convention that takes into account all the different ways that you want your image displayed, i.e. use something like myimage_blurred_320x200.jpg to save a JPEG that has been blurred and resized to 320 width and 200 height, etc.
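As a tiny sketch, keeping that convention in one helper means every caller builds identical cache keys (the helper name is my own):

function cacheName(base, variant, width, height, ext) {
    // cacheName('myimage', 'blurred', 320, 200, 'jpg') -> 'myimage_blurred_320x200.jpg'
    return base + '_' + variant + '_' + width + 'x' + height + '.' + ext;
}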
Another approach is to sit your image server behind a proxy server; that way all the caching logic is done automatically for you, and your images are served by a fast, native web server.
You're not going to be able to serve millions of resized images any other way. That's how Google and Bing Maps do it: they pre-generate all the images they need for the world at different preset extents so they can provide adequate performance and return pre-generated static images.
If PHP is too slow, you should consider using the 2D graphics libraries from Java or .NET, as they are very rich and can support all your requirements. To get a flavour of the graphics API, here is a method in .NET that will resize any image to the new width or height specified. If you omit the height or width, it will resize while maintaining the right aspect ratio. Note that the Image can be created from a JPG, GIF, PNG or BMP:
using System;
using System.Drawing;
using System.Drawing.Drawing2D;

// Creates a resized image from the source image provided.
// - If either the width or the height is not provided, the missing dimension is
//   calculated from the one given, preserving the source image's aspect ratio.
// - If both the width and the height are provided, a centered square region is
//   clipped from the source and scaled to the dimensions provided.
public static Image ResizeImage(Image sourceImage, int? newWidth, int? newHeight)
{
    if (newWidth == null && newHeight == null)
    {
        throw new ArgumentException("At least one of newWidth or newHeight must be provided.");
    }
    bool doNotClip = newWidth == null || newHeight == null;
    if (newWidth == null)
    {
        newWidth = (int)(sourceImage.Width * ((float)newHeight / sourceImage.Height));
    }
    else if (newHeight == null)
    {
        newHeight = (int)(sourceImage.Height * ((float)newWidth / sourceImage.Width));
    }
    var targetImage = new Bitmap(newWidth.Value, newHeight.Value);
    Rectangle srcRect;
    var desRect = new Rectangle(0, 0, newWidth.Value, newHeight.Value);
    if (doNotClip)
    {
        srcRect = new Rectangle(0, 0, sourceImage.Width, sourceImage.Height);
    }
    else
    {
        if (sourceImage.Height > sourceImage.Width)
        {
            // Clip the height: take a width-sized square from the vertical center.
            int delta = sourceImage.Height - sourceImage.Width;
            srcRect = new Rectangle(0, delta / 2, sourceImage.Width, sourceImage.Width);
        }
        else
        {
            // Clip the width: take a height-sized square from the horizontal center.
            int delta = sourceImage.Width - sourceImage.Height;
            srcRect = new Rectangle(delta / 2, 0, sourceImage.Height, sourceImage.Height);
        }
    }
    using (var g = Graphics.FromImage(targetImage))
    {
        g.SmoothingMode = SmoothingMode.HighQuality;
        g.InterpolationMode = InterpolationMode.HighQualityBicubic;
        g.DrawImage(sourceImage, desRect, srcRect, GraphicsUnit.Pixel);
    }
    return targetImage;
}
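For example, ResizeImage(photo, 300, null) returns a 300-pixel-wide image whose height is computed from the source's aspect ratio, while ResizeImage(photo, 300, 300) clips a centered square from the source before scaling it to 300x300.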
In the time since this question was asked, a few companies have sprung up to deal with this exact issue. It is not an issue isolated to you or your company. Many companies reach the point where they need a more permanent solution for their image processing needs.
Services like imgix serve as a proxy and CDN for image operations like resizing and applying overlays. By manipulating the URL, you can apply different transformations to each image. imgix serves billions of requests per day.
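For instance, a URL along the lines of https://example.imgix.net/photo.jpg?w=300&h=200&fit=crop (an illustrative URL, not one from the answer) asks the service to resize and crop photo.jpg to 300x200 at the edge.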
You can also stand up services on your own and put them behind a CDN. Open source projects like imageproxy are good for this. This puts the burden of maintenance on your operations team.
(Disclaimer: I work for imgix.)
What you are looking for is best matched by Thumbor (http://thumbor.readthedocs.org/en/latest/index.html), which is open source, backed by a huge company (meaning it will not disappear tomorrow), and ships with a lot of nice features, like detecting what is important in an image when cropping.
For low cost plus a CDN, I'd suggest combining it with CloudFront and AWS storage, or a comparable solution with a free CDN like Cloudflare. These might not be the best-performing CDN providers, but they still perform better than a single server and offload your image server on the cheap. Plus, it will save you a ton of bandwidth cost.
If each different image is uniquely identifiable by a single URL, then I'd simply use a CDN such as Akamai. Let your PHP script do the job and let Akamai handle the load.
Since this kind of business doesn't usually have budget problems, that'd be the only place I'd look at.
Edit: that works only if you do find a CDN that will serve this kind of content for you.
This exact problem is now being solved by image resize services dedicated to the task. They provide the following features:
Built-in CDN - you need not worry about image distribution
Image resize on the fly - any size needed is available
No storage needed - you just store the base image, and all variants are handled by the service
Ecosystem libraries - you can just include the JavaScript and your job is done for all devices and all browsers
One such service is Gumlet. You can also try an open-source alternative, like an nginx plugin that can also resize images on the fly.
(I work for Gumlet.)
