Is there a good practice for serving images to mobile and HIDPI/retina devices yet?

I'm planning to create a blog and I'm thinking about how to serve images in the best possible quality without affecting loading times for mobile devices too much.
My content column is approx. 768 display points wide and I'm posting screenshots taken on a Retina MacBook, so the images will be at least 1500px wide. Other Retina/HiDPI users should see the version best suited for them.
Is there a recommended best practice for this case? I'm talking only about content images (<img />); no CSS backgrounds. A true WordPress solution that serves images based on the uploaded hi-res originals would be great.
I have used adaptive images in the past but it does not distinguish between HIDPI and normal clients. Other solutions serve for retina but not for different screen sizes.

No, there is no good method for doing this. This isn't a fault of WordPress, but of the state of the web in general. HiDPI on the web is currently a big hack, basically, and the web has not yet adapted to it properly.
Basically, there's no good way to specify multiple resolution images in the HTML in a way that the browser can then use to determine the proper image to show. CSS works, but only if you're using images as backgrounds. If you're dealing with IMGs, then you have to rely on Javascript to do the job, and that has various downsides depending on your methodology.
In the future, when browsers have started to adopt methods for specifying multiple resolution images, then this won't be as much of a problem.
Edit: This deserves a bit more explanation.
The basic problem with using JS to do this is that it relies on a technique of image-replacement. The JS code, through whatever logic it has, finds the IMGs on the page, then replaces them with higher resolution versions from some other file. No matter what method is used to do this, it has two major downsides:
First, the initial image is loaded in low resolution, then loaded again in high resolution when the image is replaced. This means extra bandwidth usage, and considering that most HiDPI devices are currently mobile ones, it makes little sense to waste that sort of bandwidth on exactly those devices.
Second, the image is generally first displayed in low resolution and then replaced with the high resolution image. This means that there's a second or two of showing a fuzzy image on HiDPI screens, before it "pops" into focus as the replacement occurs. This is a really crappy effect to have.
Other methods exist by which you can simply serve the proper image in the first place, but they have downsides as well.
Server-side browser detection by User-Agent is a pretty crappy way to do things in the first place, and best avoided simply because there are so many different agents out there in the wild that maintaining a list of them is virtually impossible. However, if you want to go through the pain of it, this method can work.
Alternatively, it's possible to have JavaScript in the browser detect the device-pixel-ratio and set it in a cookie to return to the server. The server can then use this cookie to serve the proper resolution image. This works, but is unreliable because many mobile user agents fail to properly set and return the cookie, and even on those that do, the high-resolution images are only shown on the second visit to the page. First impressions are low-res, which isn't good in general.
As you can see, it's just hacks all the way down. Until the browser can be told, for a specific IMG, that multiple versions exist (and with what parameters) and then be allowed to choose for itself, these are the only ways to do it. But if things like the proposed HTML5 "srcset" attribute or the PICTURE element are implemented, then we'll have better choices.
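For illustration, here's a rough, hedged sketch of what such markup could eventually look like with srcset, emitted from PHP; the file names, sizes and helper function are hypothetical:
<?php
// Hypothetical: emit one <img> with the proposed srcset attribute so the
// browser itself can pick the 1x or 2x candidate. File names are made up.
function print_responsive_img( $base, $alt ) {
    $src    = $base . '.png';                             // e.g. screenshot-768.png for 1x displays
    $srcset = $base . '.png 1x, ' . $base . '@2x.png 2x'; // 2x candidate for HiDPI screens
    printf(
        '<img src="%s" srcset="%s" alt="%s" />',
        htmlspecialchars( $src ),
        htmlspecialchars( $srcset ),
        htmlspecialchars( $alt )
    );
}
print_responsive_img( 'screenshot-768', 'Example screenshot' );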

After thinking about this for a while, I can offer two ways to approach parts of this problem. Still: read the answer from @Otto carefully. It has a lot of in-depth detail for a reason.
Javascript detection
I wrote an answer about »How to add default images« for attachments. You can alter the filter callback to always add a simple png/gif (that has just the background color) instead of the normal image or thumbnail. This would help you serve an extremely small image (maybe below 1 kB; Smush.it to the rescue) that the user agent/browser would cache on first load and then simply reuse for every other image call.
Then simply detect the display pixel density with…
var dpr = 1;
if ( window.devicePixelRatio !== undefined ) {
    dpr = window.devicePixelRatio;
}
if ( dpr > 1 ) {
    // High pixel density displays
}
…exchange the images when the DOM has fully loaded.
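As a minimal, hedged sketch of the placeholder idea: hook WordPress's attachment-image-attributes filter, serve the tiny placeholder as src, and keep the real URL in a data attribute for the JS swap to use (the placeholder path is hypothetical):
<?php
// Hedged sketch: hand the browser a ~1 kB background-colored placeholder and
// stash the real file in data-src so the DOM-ready JavaScript can swap it in.
add_filter( 'wp_get_attachment_image_attributes', function ( $attr ) {
    $attr['data-src'] = $attr['src'];   // real image, used by the JS swap
    $attr['src']      = get_stylesheet_directory_uri() . '/images/placeholder.png';
    return $attr;
} );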
CSS routing
Adding stylesheets for different pixel density displays isn't that hard:
// For default/low density pixel displays only:
wp_enqueue_style(
    'low_density_styles'
    ,trailingslashit( get_stylesheet_directory_uri() ).'low_density.css'
    ,array()
    ,filemtime( locate_template( 'low_density.css' ) )
    ,'screen'
);

// For high density pixel displays only:
$media  = 'only screen and (min--moz-device-pixel-ratio: 2), ';
$media .= 'only screen and (-o-min-device-pixel-ratio: 2/1), ';
$media .= 'only screen and (-webkit-min-device-pixel-ratio: 2), ';
$media .= 'only screen and (min-device-pixel-ratio: 2)';
wp_enqueue_style(
    'high_density_styles'
    ,trailingslashit( get_stylesheet_directory_uri() ).'high_density.css'
    ,array()
    ,filemtime( locate_template( 'high_density.css' ) )
    ,$media
);
Now you can simply switch between high/low density images for your default images (header, logo, etc.). This doesn't work for post type attachments (again: refer to @Otto's answer).

Related

Large images don't render in Chrome?

Very large images will not render in Google Chrome (although the scrollbars will still behave as if the image is present). The same images will often render just fine in other browsers.
Here are two sample images. If you're using Google Chrome, you won't see the long red bar:
Short Blue
http://i.stack.imgur.com/ApGfg.png
Long Red
http://i.stack.imgur.com/J2eRf.png
As you can see, the browser thinks the longer image is there, but it simply doesn't render. The image format doesn't seem to matter either: I've tried both PNGs and JPEGs. I've also tested this on two different machines running different operating systems (Windows and OSX). This is obviously a bug, but can anyone think of a workaround that would force Chrome to render large images?
Not that anyone cares or is even looking at this post, but I did find an odd workaround. The problem seems to be with the way Chrome handles zooming. If you set the zoom property to 98.6% and lower or 102.6% and higher, the image will render (setting the zoom property to any value between 98.6% and 102.6% will cause the rendering to fail). Note that the zoom property is not officially defined in CSS, so some browsers may ignore it (which is a good thing in this case since this is a browser-specific hack). As long as you don't mind the image being resized slightly, I suppose this may be the best fix.
In short, the following code produces the desired result:
<img style="zoom:98.6%" src="http://i.stack.imgur.com/J2eRf.png">
Update:
Actually, this is a good opportunity to kill two birds with one stone. As screens move to higher resolutions (e.g. the Apple Retina display), web developers will want to start serving up images that are twice as large and then scaling them down by 50%, as suggested here. So, instead of using the zoom property as suggested above, you could simply double the size of the image and render it at half the size:
<img style="width:50%;height:50%;" src="http://i.stack.imgur.com/J2eRf.png">
Not only will this solve your rendering problem in Chrome, but it will make the image look nice and crisp on the next generation of high-resolution displays.

Is there a better solution than CSS sprites?

This is not a question about a specific programming problem; it's about examining different concepts. If the moderators don't feel this is OK, delete my question.
I need to display 100 png images in a table td, and the images are 75x16 PNGs. In order to reduce the number of HTTP requests, I grouped all 166 images (only roughly 100 are shown at one time) in a big spritesheet, and have used the IMG tag to display the output, one image at a time. This is the code:
CSS:
.sprites {background-image:url('folder/spritesheet.png');background-color:transparent;background-repeat:no-repeat;display:inline;}
#png3 {height:16px;width:75px;background-position:0px 0;}
#png5 {height:16px;width:75px;background-position:-75px 0;}
PHP:
$classy = "png" . $db_field['imageid'];
echo "<td>" , "<img src='transparent.gif' class='sprites' id='$classy' alt='alt' align='absmiddle'/>" , "</td>";
$classy is a variable which selects the correct image based on the SQL query output. transparent.gif is a 1px transparent gif. The result: images are shown correctly inside a table.
The page loading speed increased significantly, maybe 50-60%. In one of my earlier questions some concerns were raised over this being good practice or not. But it works.
The only other solution I've found is using jar compression, but that concept works for Firefox only. This is the code which is used for displaying these same images using jar compression (PHP, no CSS):
$logo = "jar:http://www.website.com/logos.jar!/" . $db_field['imageid'] . ".png";
echo "<tr>" , "<td>" , "<img src='$logo' alt='alt' width='75' height='16'/>";
All of the 166 images are compressed in a jar archive and uploaded to the server, and since jar is a non-solid archive, only the requested image is extracted, not all of them. This solution is lightning fast and I've never seen a faster way of displaying that many images. The concept is here and deserves a link. Another advantage over CSS sprites is that with jar each image can be individually optimized for size (e.g. one optimized to 64 colors, another to 128, 32... depending on the image), whereas a large spritesheet cannot be optimized as aggressively because it contains a lot of colors.
So, does anyone know of a solution which would be equally fast as jar? If using CSS sprites to display content is bad practice - what is good practice which gives the same result? The key here is the loading speed of the website with as few HTTP requests as possible.
I'm not really an expert on this, but I've had my share of experience with these things too.
HTTP Requests
Ever heard of the "2 concurrent connections" limit? (Recent browsers allow around 6-8.) Loading a lot of resources means that while two are loading, the others have to wait in line. Loading everything in one big chunk is better. This is the main reason spriting is used. Aside from the connection limit, you also get to manage all those "same purpose" images in one image.
Cache
Now, one big chunk, I say, but you might ask "Doesn't that make it even worse?". Nope, because I have an ace up my sleeve, and that's where the cache comes into play. One page load is all you need; after that, every other page that needs that image loads it at the speed of light and saves you another HTTP request. Never underestimate the power of the cache.
Images
Another thing you can do is optimize your images. I have used Fireworks and I really love its image optimization tools. Here are the personal guidelines I follow, which you can use in your situation:
GIFs for icons, JPGs for images, PNGs for transparent stuff.
Remove unused colors. Yes, you can do this in some tools; it cuts down size.
Never resize images in the HTML; have resized versions instead.
Lose quality. Yes, there is such a thing: lower your image quality to reasonable limits. Losing too much makes your image too "cloudy" or "blocky".
Use progressive loading for images. This "fast-loads" a blurred image and then sharpens it later.
Avoid animated images. They are bloat, not to mention annoying.
Server Tricks
There are connection limits - but that does not prevent you from using other domains or even subdomains! Distribute your content across other domains or subdomains to increase the number of connections. For images, dedicate a subdomain or two, like say img1.mysite.com and img2.mysite.com, or another domain like mysite2.com. Not only is it beneficial for your user, it also helps distribute server load.
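A hedged sketch of deterministic sharding in PHP (the hostnames are hypothetical; hashing the path keeps a given image on the same subdomain, so browser caching still works):
<?php
// Hypothetical helper: map an image path to one of N image subdomains.
function image_url( $path, $shards = 2 ) {
    $n = ( abs( crc32( $path ) ) % $shards ) + 1;
    return 'http://img' . $n . '.mysite.com/' . ltrim( $path, '/' );
}

echo '<img src="' . image_url( '/logos/3.png' ) . '" width="75" height="16" alt="" />';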
Another method is using a Content Delivery Network (CDN). A CDN has a global network of servers that hold "cached" versions of your website resources. Say I'm in Asia: when I view your site with CDN'ed resources, it fetches them from the server nearest to Asia.
Mark-up
Not necessarily related to speed, but semantically the use of id should be reserved for more important purposes. If you use an ID to mark images for their styles, what happens when another element needs the same image? IDs need to be unique; they can't be used twice. So I suggest using classes instead.
Also, IDs take precedence over classes; to avoid unexpected overrides, use classes. Learn more about CSS specificity.
.sprites {
background-image:url('folder/spritesheet.png');
background-color:transparent;
background-repeat:no-repeat;
display:inline;
height:16px; /*same width and heights? place them here instead*/
width:75px;
}
.png3 {
height:16px; /* in cases you need a different dimension, this will override */
width:75px;
background-position:0px 0;
}
.png5 {
background-position:-75px 0;
}
$classy = "png" . $db_field['imageid'];
echo "<img src='transparent.gif' class='sprites {$classy}' alt='alt' align='absmiddle'/>";
I embed small images/icons in the style sheet:
.someicon{
background-image:url('data:image/png;base64,iVBORw0KGg....');
}
This works with all modern browsers, and does not require me to create sprites (plus it even saves one additional file to load).
In development, the images are defined in the style sheet normally like so:
.someicon{
background-image:url('../images/someicon.png');
}
and I have a system that generates the final style sheet (consolidating all CSS into one file, minifying, and replacing the image references with data: URLs) automatically whenever I make a change to a style sheet.
This works well and saves me a ton of work. When compressed with gzip, the CSS file is not much bigger than the individual files added together. After optimizing the PNG/JPG files, the CSS for my start page is 63K uncompressed. Even with a slightly smaller sprite file, I would probably not save more than a fraction of a second in load time for the average user, so I do not bother with sprites.
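As a rough, hedged sketch of such a build step (assumptions: the dev stylesheet references PNGs under ../images/, and the file names are made up):
<?php
// Hypothetical build step: replace url('../images/x.png') references in the
// dev stylesheet with base64 data: URIs and write out the production CSS.
function inline_images( $css_in, $css_out ) {
    $css = file_get_contents( $css_in );
    $css = preg_replace_callback(
        "~url\\('(\\.\\./images/[^']+\\.png)'\\)~",
        function ( $m ) use ( $css_in ) {
            $file = dirname( $css_in ) . '/' . $m[1];
            return "url('data:image/png;base64," . base64_encode( file_get_contents( $file ) ) . "')";
        },
        $css
    );
    file_put_contents( $css_out, $css );
}

inline_images( 'css/dev.css', 'css/production.css' );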

Optimize display of a large number of images (1000+) for performance and ease of use in a web application

In our web application, the users need to review a large number of images. This is my current layout. 20 images will be displayed at a time, with a pagination bar above the thumbnails. Clicking a thumbnail will show the enlarged image to the left. The enlarged image will follow the scrollbar so it's always visible. Quite simple actually.
I was wondering what the best interface would be in this scenario:
One option is to implement an infinite scroll script which will lazy load thumbnails as the user scrolls. The thumbnails not visible will be removed from the DOM. But my concern with this approach is the number of changes in the DOM slowing down the page.
Another option could be something like Google's Fastflip.
What do you think is the best approach for this application? Radical ideas welcomed.
I think the question you have to ask is: what action is the user supposed to take? What's the purpose of the site?
If "review images" entails rating every image, I'd rather go with a Fastflip approach where the focus is on the single image. A thumbnail gallery will distract from the desired action and might result in fewer pics being rated/reviewed.
If the focus should rather be on comparing a given image against others, I'd say try the gallery approach, although I wouldn't implement an infinite scroll with thumbnails because the user can quickly get lost in the abundance of choices. I think a standard pagination (whether static or ajaxified) would be better if you choose to go this route.
Just my 2c.
If you paginate thumbnails, you can pre-generate a single image containing all thumbnails for each page, then use an image map to handle mouseover text and clicking. This will reduce the number of HTTP requests and possibly lead to fewer bytes sent. The separation distance between images should be minimized for this to be most efficient. This would have some disadvantages.
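A hedged sketch of that idea in PHP (the thumbnail sheet, grid dimensions and link targets are hypothetical):
<?php
// Hypothetical: one pre-generated sheet of 20 thumbnails (4 columns x 5 rows,
// 150x100 px each) plus an image map so every cell stays clickable.
$cols = 4; $rows = 5; $w = 150; $h = 100;

echo '<img src="thumbs-page-1.jpg" usemap="#thumbs" alt="Thumbnails, page 1" />';
echo '<map name="thumbs">';
for ( $i = 0; $i < $cols * $rows; $i++ ) {
    $x = ( $i % $cols ) * $w;
    $y = (int) ( $i / $cols ) * $h;
    printf(
        '<area shape="rect" coords="%d,%d,%d,%d" href="image.php?id=%d" alt="Image %d" />',
        $x, $y, $x + $w, $y + $h, $i + 1, $i + 1
    );
}
echo '</map>';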
To reduce image download size at the expense of preprocessing, you can try to save each image in the format (PNG or JPG) most efficient for its contents using an algorithm like the one in ImageGuide. Similarly, if the images are poorly compressed (like JPEGs from a cell phone camera), they can be recompressed at the cost of some quality.
Once the site has some testers, you can analyze patterns in which images tend to be clicked (if a pattern exists) and preload the full-size images, or even pre-load all of them once the thumbs are loaded.
You might play with JPEG2000 images (you did say "radical ideas welcomed"), which thumb very easily, because the thumbnail and main image needn't be sent as if they are separate files. This is an advantage of the compression format -- it isn't the same as the hack of telling the browser to resize the full size image to represent its own thumbnail.
You can take a look at Google's WebP image format.
On the server side, use a separate image server optimized for static content delivery, perhaps NginX or the Tux web server.
I would show the thumbnails, since the user might want to skip some of the pictures. I would also stay away from pagination of the form
<<first <previous n of x next> last>>
and go for something easier to implement and more efficient: a
"load x more pictures" button.
No infinite scroll whatsoever, and why not, even no scroll at all: just "load x more" / "previous x".
Although this answer might be a bit unradical and boring, I'd go with exactly your suggestion of asynchronously loading the thumbnails (and of course the main picture) as they come into view. A similar technique is used by Google+ in the pane for adding people to circles. This way, you spend server resources and bandwidth only on the pictures the client actually needs. As Google+ shows, the operations on the DOM tree are fast enough and don't slow down a computer from the past few years.
You might also prebuild a few rows of the thumbnail table ahead of time with a dummy image (e.g. a "loading circle" animated gif) and then replace the image. That way, the table in view is already built and does not need to be re-rendered, as the flow elements following the table otherwise would be if no images were in there during scrolling.
Instead of paginating the thumbnails (as suggested by your layout scheme), you could also think about letting users filter the images by tag, theme, category, size or any other way to find their results faster.

Web Design: opacity on images for my website

I'm building a website and I want to create a translucent menu. I know that .gif image types enable transparency, but from my experience not translucency (anything between transparent and opaque) - by default it seems to set the opacity to 100%, i.e. a solid image without any translucency/transparency.
I'm not sure if the issue is with the file type, or with how I'm exporting my menu. If it's worth anything, I'm using Fireworks to create and export my menu.
As is, I'm exporting the separate files for my menu as .pngs, which seem to support translucent images, but I know that I'll be wanting to reduce the file size of these images soon, so is there a better alternative for getting a semi-transparent image other than using the .png file type?
Thanks,
Patrick
I'd say PNG is probably the best bet. The more modern browsers (read: not IE6) understand the 8-bit alpha channel it provides, whereas GIFs just have the transparency key.
Often these days, the bottleneck on sites isn't the size of the image (either in dimensions or in data) but rather the number of requests that it takes to load a page. More modern website designs try to pack as many images into one using techniques like CSS Spriting (woot.com, most of google). The other bottleneck is often not setting caching up correctly, forcing return visitors to reload a bunch of stuff.
You'll see Google's various pages caching everything they can and reducing the number of things a single page needs to download (combining all JavaScript into one file, all CSS stylesheets into one) so that the browser makes 2 or 3 requests instead of 15-20.
I'd go with PNGs, and look into CSS sprites and caching as an alternative optimization.
See here for an example of an image sprite used on google's homepage.

Serving Images with on-the-fly resize

My company has recently started to have problems with the image handling for our websites.
We have several websites (adult entertainment) that display images like DVD covers, snapshots and similar. We have about 100'000 movies and for each movie we have an average of 30 snapshots plus covers. Almost every image has an additional version with blurring and overlay for non-members, which results in about 50 images per movie, or a total of 5 million base images. Each of the images is available in several versions, depending on where it's placed on the page (thumbnail, original, small preview, not-so-small preview, small image in the top list, etc.), which results in more images than I cared to count.
Now I had the idea to use a server to generate the images on the fly, since it has become quite clumsy to generate all the different images for all the different pages (as different pages sometimes even need different image sizes for basically the same task).
Does anyone know of an image processing server that can scale down images on-the-fly so we only need to provide the original images and the web guys can just request whatever size they need?
Requirements:
Very High performance (Several thousand users per day)
On-the-fly blurring and overlay creation
On-the-fly resize (with and without keeping aspect ratio)
Can handle millions of images
Must be able to read JPG, GIF, PNG and BMP and convert between them
Security is not that much of a concern, as e.g. the unblurred images can already be reached by URL manipulation; more security would be nice but it's not required, and frankly I stopped caring (after failing to get into my coworkers' heads why, for our small reseller page, it's a bad idea to use http://example.com/view_image.php?filename=/data/images/01020304.jpg to display the images).
We tried PHP scripts to do this but the performance was too slow for this many users.
Thanks in advance for any suggestions you have.
I suggest you set up a dedicated web server to handle image resize and serve the final result. I have done something similar, although on a much smaller scale. It basically eliminates the process of checking for the cache.
It works like this:
you request the image appending the required size to the filename like http://imageserver/someimage.150x120.jpg
if the image exists, it will be returned with no other processing (this is the main point, the cache check is implicit)
if the image does not exist, handle the 404 not found via .htaccess and reroute the request to the script that generates the image of the required size
in the script specify the list of allowed sizes to avoid attacks like scripts requesting every possible size to shut your server down
keep this on a cookieless domain to minimize unnecessary traffic
EDIT: I don't think that PHP itself would slow the process down much, as PHP scripting in this case is reduced to a minimum: the image scaling is done by a built-in library written in C. Whatever you do, you'll have to use a library like this (GD, ImageMagick or so), so that's unavoidable. With my system you at least totally skip the overhead of checking the cache, thus further reducing PHP interaction. You can implement this on your existing server, so I guess it's a solution well suited for your budget.
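A hedged sketch of that rerouted 404 handler with GD (the .htaccess rule, directory layout and allowed-size whitelist are assumptions; only JPEG is handled, for brevity):
<?php
// Hypothetical 404 handler: /images/someimage.150x120.jpg does not exist yet,
// so Apache reroutes here (e.g. "ErrorDocument 404 /resize.php"). Parse the
// requested size, check it against a whitelist, generate the file once with GD,
// and serve it; the next request hits the static file directly.
$allowed = array( '150x120', '320x240', '640x480' );

if ( ! preg_match( '~^/images/([\w-]+)\.(\d+)x(\d+)\.jpg$~', $_SERVER['REQUEST_URI'], $m ) ) {
    header( 'HTTP/1.1 404 Not Found' ); exit;
}
list( , $name, $w, $h ) = $m;
if ( ! in_array( "{$w}x{$h}", $allowed, true ) ) {
    header( 'HTTP/1.1 403 Forbidden' ); exit;
}

$source = __DIR__ . "/originals/{$name}.jpg";
$target = __DIR__ . "/images/{$name}.{$w}x{$h}.jpg";

$src = imagecreatefromjpeg( $source );
$dst = imagecreatetruecolor( (int) $w, (int) $h );
imagecopyresampled( $dst, $src, 0, 0, 0, 0, $w, $h, imagesx( $src ), imagesy( $src ) );
imagejpeg( $dst, $target, 85 ); // cache to disk for subsequent requests

header( 'Content-Type: image/jpeg' );
readfile( $target );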
Based on
We tried PHP scripts to do this but the performance was too slow for this many users.
I'm going to assume you weren't caching the results. I'd recommend caching the resulting images for a day or two (i.e. have your script check whether the thumbnail has already been generated; if so, use it; if not, generate it on the fly).
This would improve performance dramatically as I'd imagine the main/start page probably has a lot more hits than random video X, thus when viewing the main page no images have to be created as they're cached. When User Y views Movie X, they won't notice the delay as much since it just has to generate that one page.
For the "On-the-fly resize" aspect - how important is bandwidth to you? I'd want to assume you're going through so much with movies that a few extra kb in images per request wouldn't do too much harm. If that's the case, you could just use larger images and set the width and height and let the browser do the scaling for you.
The ImageCache and Image Exact Sizes solutions from the Drupal community might do this, and like most OSS solutions they use the libraries from ImageMagick.
There are some AMI images for Amazon's EC2 service to do image scaling. They use Amazon S3 for image storage (originals and scaled versions) and can feed them through Amazon's CDN service (CloudFront). Check the EC2 site for what's available.
Another option is Google. Google Docs now supports all file types, so you can upload the images to a Google Docs folder and share the folder for public access. The URLs are kind of long, e.g.
http://lh6.ggpht.com/VMLEHAa3kSHEoRr7AchhQ6HEzHVTn1b7Mf-whpxmPlpdrRfPW216UhYdQy3pzIe4f8Q7PKXN79AD4eRqu1obC7I
Add the =s parameter to scale the image - cool! E.g. for 200 pixels wide:
http://lh6.ggpht.com/VMLEHAa3kSHEoRr7AchhQ6HEzHVTn1b7Mf-whpxmPlpdrRfPW216UhYdQy3pzIe4f8Q7PKXN79AD4eRqu1obC7I=s200
Google only charges USD 5/year for 20 GB. There is a full API for uploading docs etc.
Other answers on SO
How best to resize images off-server
OK, the first problem is that resizing an image in any language takes a little processing time. So how do you support thousands of clients? Well, you cache it so you only have to generate the image once. The next time someone asks for that image, check to see if it has already been generated; if it has, just return that. If you have multiple app servers then you'll want to cache to a central file system to increase your cache-hit ratio and reduce the amount of space you will need.
In order to cache properly you need to use a predictable naming convention that takes into account all the different ways that you want your image displayed, i.e. use something like myimage_blurred_320x200.jpg to save a JPEG that has been blurred and resized to 320 width and 200 height, etc.
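For example, a small hedged helper for such a naming convention (the function and parameter names are made up):
<?php
// Hypothetical: build a predictable cache filename from the transformation
// parameters, so every variant of an image maps to exactly one cached file.
function cache_name( $id, $width, $height, $blurred = false ) {
    return sprintf( '%s%s_%dx%d.jpg', $id, $blurred ? '_blurred' : '', $width, $height );
}

echo cache_name( 'myimage', 320, 200, true ); // myimage_blurred_320x200.jpg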
Another approach is to sit your image server behind a proxy server; that way all the caching logic is done automatically for you and your images are served by a fast, native web server.
You're not going to be able to serve millions of resized images any other way. That's how Google and Bing maps do it: they pre-generate all the images they need for the world at different pre-set extents so they can provide adequate performance and return pre-generated static images.
If PHP is too slow you should consider using the 2D graphics libraries from Java or .NET, as they are very rich and can support all your requirements. To get a flavour of the Graphics API, here is a method in .NET that will resize any image to the new width or height specified. If you omit a height or width, it will resize maintaining the right aspect ratio. Note that the Image can be created from a JPG, GIF, PNG or BMP:
// Creates a re-sized image from the SourceImage provided that retains the same aspect ratio as the SourceImage.
// - If either the width or height dimension is not provided then the resized image will use the
//   proportion of the provided dimension to calculate the missing one.
// - If both the width and height are provided then the resized image will have the dimensions provided
//   with the sides of the excess portions clipped from the center of the image.
public static Image ResizeImage(Image sourceImage, int? newWidth, int? newHeight)
{
    bool doNotScale = newWidth == null || newHeight == null;

    if (newWidth == null)
    {
        newWidth = (int)(sourceImage.Width * ((float)newHeight / sourceImage.Height));
    }
    else if (newHeight == null)
    {
        newHeight = (int)(sourceImage.Height * ((float)newWidth / sourceImage.Width));
    }

    var targetImage = new Bitmap(newWidth.Value, newHeight.Value);
    Rectangle srcRect;
    var desRect = new Rectangle(0, 0, newWidth.Value, newHeight.Value);

    if (doNotScale)
    {
        srcRect = new Rectangle(0, 0, sourceImage.Width, sourceImage.Height);
    }
    else
    {
        if (sourceImage.Height > sourceImage.Width)
        {
            // clip the height
            int delta = sourceImage.Height - sourceImage.Width;
            srcRect = new Rectangle(0, delta / 2, sourceImage.Width, sourceImage.Width);
        }
        else
        {
            // clip the width
            int delta = sourceImage.Width - sourceImage.Height;
            srcRect = new Rectangle(delta / 2, 0, sourceImage.Height, sourceImage.Height);
        }
    }

    using (var g = Graphics.FromImage(targetImage))
    {
        g.SmoothingMode = SmoothingMode.HighQuality;
        g.InterpolationMode = InterpolationMode.HighQualityBicubic;
        g.DrawImage(sourceImage, desRect, srcRect, GraphicsUnit.Pixel);
    }

    return targetImage;
}
In the time that this question has been asked, a few companies have sprung up to deal with this exact issue. It is not an issue that's isolated to you or your company. Many companies reach the point where they need to look for a more permanent solution for their image processing needs.
Services like imgix serve as a proxy and CDN for image operations like resizing and applying overlays. By manipulating the URL, you can apply different transformations to each image. imgix serves billions of requests per day.
You can also stand up services on your own and put them behind a CDN. Open source projects like imageproxy are good for this. This puts the burden of maintenance on your operations team.
(Disclaimer: I work for imgix.)
What you are looking for is best matched by Thumbor http://thumbor.readthedocs.org/en/latest/index.html, which is open source, backed by a huge company (meaning it will not disappear tomorrow), and ships with a lot of nice features like detecting what is important in an image when cropping.
For low cost plus CDN I'd suggest combining it with CloudFront and AWS storage, or a comparable solution with a free CDN like Cloudflare. These might not be the best-performing CDN providers, but they still perform better than a single server and also offload your image server on the cheap. Plus, it will save you a TON of bandwidth cost.
If each different image is uniquely identifiable by a single URL then I'd simply use a CDN such as AKAMAI. Let your PHP script do the job and let AKAMAI handle the load.
Since this kind of business doesn't usually have budget problems, that'd be the only place I'd look at.
Edit: that works only if you do find a CDN that will serve this kind of content for you.
This exact same problem is now being solved by image resize services dedicated to this task. They provide the following features:
In built CDN - you need not worry about image distribution
Image resize on the fly - any size needed is available
No storage needed - you just store base image and all variants are handled by service
Ecosystem libraries - you can just include a JavaScript snippet and your job is done for all devices and all browsers.
One such service is Gumlet. You can also try an open-source alternative like an nginx plugin that can resize images on the fly.
(I work for Gumlet.)
