Does serving images as MIME-typed data URIs improve loading speed, and what are the benefits?

I am wondering whether embedding images as base64-encoded data URIs (with their MIME types), instead of serving the image files themselves, helps improve the loading speed of the page.
What are the advantages of serving base64-encoded data URIs instead of the original files?
Thank you,

Related

Why is it possible to show a base64-encoded PNG with an "image/jpeg" data URL?

Here's an example of a base64-encoded PNG data URL (using the "image/png" media type, of course):
data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAAABIAAAASCAYAAABWzo5XAAAAAXNSR0IArs4c6QAAAARnQU1BAACxjwv8YQUAAAAJcEhZcwAADsMAAA7DAcdvqGQAAAAZdEVYdFNvZnR3YXJlAHBhaW50Lm5ldCA0LjAuMTczbp9jAAAA1UlEQVQ4T62SsQ0CMQxFUzAEA1BQMgIlI1zJCIxAT0HJCBSUlFdQMAQlJWOE78hnObYDIuJLTzr/+/5FkpRz/guh2UM9pLQBWzDXvgVacG4lngncAH08QFgGLcEL0LAX34ROHCBcGaRLiEH+meAMXDhESBlkS3bVrh6KEZetQbOk7FmjmL5M40rKTmQSEJVdeXlCDtcSmgRkz4Ro32Zo+pK7+g7LqqEYjduBjsrzT6Mavl3xhzIJcXDkEBHfTl3WfNln8ARhyQR0sDkX6iU0ewjN38npDdYczGIKuRnZAAAAAElFTkSuQmCC
I noticed that (in Firefox and Chrome) things still work even if the media type is set to "image/jpeg" (leaving all the rest untouched), like this:
data:image/jpeg;base64,iVBORw0KGgoAAAANSUhEUgAAABIAAAASCAYAAABWzo5XAAAAAXNSR0IArs4c6QAAAARnQU1BAACxjwv8YQUAAAAJcEhZcwAADsMAAA7DAcdvqGQAAAAZdEVYdFNvZnR3YXJlAHBhaW50Lm5ldCA0LjAuMTczbp9jAAAA1UlEQVQ4T62SsQ0CMQxFUzAEA1BQMgIlI1zJCIxAT0HJCBSUlFdQMAQlJWOE78hnObYDIuJLTzr/+/5FkpRz/guh2UM9pLQBWzDXvgVacG4lngncAH08QFgGLcEL0LAX34ROHCBcGaRLiEH+meAMXDhESBlkS3bVrh6KEZetQbOk7FmjmL5M40rKTmQSEJVdeXlCDtcSmgRkz4Ro32Zo+pK7+g7LqqEYjduBjsrzT6Mavl3xhzIJcXDkEBHfTl3WfNln8ARhyQR0sDkX6iU0ewjN38npDdYczGIKuRnZAAAAAElFTkSuQmCC
But... Why?
They're both handled by the image-handling subsystem, which ignores the MIME type and just goes with the actual format of the image data.
Specifically, most browsers translate the viewing of a standalone image into the viewing of an HTML page with an <img> tag in it. Since servers lie, and browsers are supposed to be able to show even badly-configured websites, the part of the browser that deals with images will in most cases completely ignore any file extensions or MIME types and sniff the actual format from the image bytes. There was no point programming in an exception for data: URIs.
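For illustration, here is a minimal HTML sketch of the behaviour described above (the shortened data URIs stand in for the full string from the question):

    <!-- Both tags carry the same PNG bytes; only the declared media type differs. -->
    <img src="data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAA..." alt="declared as PNG">
    <img src="data:image/jpeg;base64,iVBORw0KGgoAAAANSUhEUgAA..." alt="declared as JPEG">
    <!-- Firefox and Chrome sniff the PNG signature in the decoded bytes and render both. -->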

Make mod_pagespeed rewrite all images into data:image format

I'm using mod_pagespeed and I'm searching for a way to rewrite all images into data:image format. Currently only small images are being inlined. I can't find anything in the documentation.
You can use the ModPagespeedImageInlineMaxBytes option to configure the size threshold used by the inline_images filter. See the image optimization documentation for more details.
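A sketch of what that could look like in the Apache configuration (directive names follow the Apache mod_pagespeed convention; check the documentation for your version, and note the byte value here is only illustrative):

    ModPagespeed on
    # inline_images is part of the default filter set; enabling it explicitly does no harm.
    ModPagespeedEnableFilters inline_images
    # Inline any image up to 512 KB as a data: URI (the default threshold is much smaller).
    ModPagespeedImageInlineMaxBytes 524288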
However, you should consider carefully whether you actually want to inline all images. While you'll save a request for each inlined image, you'll also miss out on benefits such as cache extension, and pay a penalty if you reference the same image multiple times on a page.

Image size guidelines

This may well be a bit of an open-ended question.
The site I am working on needs to be optimised for performance. One of the key areas is optimising the file sizes of the images used on the site.
Unfortunately these images are being created by employees who do not have the required knowledge for creating images for the web, and it is my job to produce a set of guidelines for them to use.
I was wondering whether there is any resource/guideline/literature regarding typical file sizes for images of different dimensions, as I would like to include something like this to help them ensure their images are being created properly.
Any info would be greatly appreciated.
Thanks in advance
I can't answer the opinion question, but I can suggest some guidelines that will keep your images smaller.
First off, if they're using Photoshop to edit their images, it's likely they're storing a whole bunch of crap in the headers (digital papertrail, EXIF data, and such). Also, folks will frequently save in too high a bit depth.
For novice users, trying to explain why they need to use "save for web" is more likely to confuse them. Instead, just point them at:
http://www.smushit.com/ysmush.it/
This site is rather handy - it will compress all the images on a page you specify, or you can upload the images.
You should strongly consider writing some guidelines about where images are stored as well. It's frequently very beneficial to serve your static image content from several hostnames, separate from your dynamic content. Most browsers will only download a limited number of files at a time from any given hostname (historically 2; modern browsers typically allow around 6).
Unless there's a good reason not to, all your images should be cached using one of the HTTP caching techniques (Expires headers, ETags, etc.).
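For example, a far-future cache policy on an image response might look something like this (all values are illustrative):

    HTTP/1.1 200 OK
    Content-Type: image/png
    Cache-Control: public, max-age=31536000
    Expires: Thu, 31 Dec 2026 23:59:59 GMT
    ETag: "5d8c72a5edda8d6a"
    Last-Modified: Mon, 01 Mar 2010 00:00:00 GMT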
Good luck.
72 dpi as a resolution, and either JPEG or PNG formats, work best.
Try to use images at the exact pixel size they will end up being displayed at. This is specified by the image's height and width attributes.
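For example (the filename and dimensions are placeholders):

    <!-- Image saved at exactly 640x480 pixels and displayed at that size, so the
         browser never has to scale it and no bytes are wasted on unused pixels. -->
    <img src="team-photo.jpg" width="640" height="480" alt="Team photo">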
You can set the output quality of a JPEG image, which will also save file size, although there is a trade-off against image quality.
I hope this is of use.

Image Rendering Test

I am benchmarking a custom browser and want to benchmark the rendering speeds of different types of images (GIF, JPG, PNG) of the same file size, to see which of the image formats this browser renders the fastest.
My process was simply to have a separate HTML page for each type of image, and to use a JavaScript timer before and after the image is rendered to measure the browser's rendering speed for that specific image.
Any thoughts on this process? Any thoughts on how to improve it?
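A minimal sketch of that kind of timing harness (names are placeholders; note it captures fetch-plus-decode time rather than pure rendering time, which is part of the critique below):

    <img id="test" style="display:none" alt="">
    <script>
      var img = document.getElementById('test');
      var start = new Date().getTime();
      img.onload = function () {
        // Fires once the image has finished loading; it does not separate
        // network/cache time from decode and paint time.
        var elapsed = new Date().getTime() - start;
        document.title = 'image took ' + elapsed + ' ms';
      };
      // Swap in a same-sized GIF/JPG/PNG on each test page.
      img.src = 'test-image.png';
      img.style.display = 'block';
    </script>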
Well, it's difficult to get meaningful generic results that way. You're measuring a combination of loading HTML, JavaScript and an image. Depending on where you're loading them from, you're also measuring the disk or network cache. The image rendering code is going to have some startup time, is dependent on a memory allocator and possibly garbage collection. Then there is image size, color depth, amount of compression, number of images on the page, scaling, the influence of style sheets, and the resolution of the JavaScript timer. Oh, and are you rendering to a visible part of a window, in a layer, or off-screen?
But don't worry, you'll be able to come up with a usable test for your specific situation. Or the differences might even turn out to be very clear.
The Firefox Firebug plugin YSlow is pretty good.

How would you optimise/simulate 'random' loading of large image files?

We use large background images (hi-res photos, up to 700 KB) for our page design.
It's part of the experience of the site that as you browse around, you see different images.
At the moment a different (random) image is loaded on each page request, from a pool of ~15 images, which could grow over time.
I'm looking for a sane way to optimize this:
To avoid the user having to download a big image file on every page view
To reduce load on the server (is this an issue? Will the server keep the images in memory?)
The ideas I have so far include:
A timer which loads a different image at set intervals
Progressively loading other images in the background with ajax
Associating images with specific content (pages, tags)
The question is, how to keep it feeling somewhat random, while minimizing page load times and server hits?
I usually avoid sites with huge images; I am very impatient. I would rethink your design.
As a first step you should make sure that the images can be properly cached:
use sane URLs (no session IDs, etc.)
set appropriate HTTP caching headers (e.g. ETag)
Firstly, hearing that the background images alone are up to 700 KB astounds me. In addition to the content ON screen... that is a pretty heavy site.
For starters, I would try to use image compression tools. Two tools come to mind: ImageMagick and PNGCrush. PNGCrush is excellent at stripping all the extraneous metadata attached to photos without compromising photo quality.
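Hedged examples of the kind of commands involved (exact flags vary by version, so check each tool's documentation; filenames are placeholders):

    # ImageMagick: strip profiles/comments and re-save a JPEG at quality 85
    convert original.jpg -strip -quality 85 optimised.jpg
    # PNGCrush: remove ancillary metadata chunks and try for better compression
    pngcrush -rem alla original.png optimised.png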
I only recommend this because compressing the images lets the user download less content, which means quicker load times, which... at the end of the day... is what users want.
I would also cache the images, such that when a user re-visits the site, the image is already cached on their end. This minimises the HTTP requests that are made each time a user visits your site.
An example of where this technique is used on a commercial site is www.reactive.com. If you look at the /js/headerImages.js file, they make use of image caching. Funnily enough, you will find the same source code at: http://javascript.internet.com/miscellaneous/random-image.html
Considering that you have mentioned that images are randomly loaded, I am assuming you are using a JavaScript library such as jQuery to create the effect.
If you are, you can minimize page load times by using a CDN, as opposed to referencing a local copy of the jQuery library stored on your server. I performed performance testing on a site I made for a client and, over an average of 20 hits, saved 1.6 seconds through this technique!
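A sketch of how the random selection and background preloading could fit together (the image paths and the simple rotation rule are made up for the example):

    // Run this at the end of <body> so document.body exists.
    // Pick a random background for this page view.
    var backgrounds = ['/img/bg-01.jpg', '/img/bg-02.jpg', '/img/bg-03.jpg'];
    var chosen = backgrounds[Math.floor(Math.random() * backgrounds.length)];
    document.body.style.backgroundImage = 'url(' + chosen + ')';

    // Once the page has loaded, quietly warm the browser cache with one more
    // image, so a later page view can show a "new" background without a big download.
    window.onload = function () {
      var next = backgrounds[(backgrounds.indexOf(chosen) + 1) % backgrounds.length];
      new Image().src = next;
    };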
Hope that helps for now :)
