Using HTML5 LocalStorage for Fonts / Images / Popular Plugins

Has anyone used localStorage successfully to store font files referenced by CSS? How can I implement this?
For images, I know I can encode the images as base64 strings in a script and save them into localStorage, but that would mean a very large script. If the bloated script is loaded every time, I don't see any real benefit. If the script is cached as a separate file, it would be the same as caching the image file in the first place. Am I missing something, or is there really no benefit, in normal circumstances, to putting images in localStorage?
Has anyone successfully cached the popular plugins from Facebook/Google/Twitter in localStorage, and is willing to share which ones are most useful/applicable for caching?

I have no idea what your browser support requirements are like.
localStorage works in IE8+.
@font-face has varying support, but IE9+ supports WOFF fonts.
I think you'd need to base64-encode your fonts for them to work with localStorage.
Data URIs (required for base64'd fonts) have size limitations in IE8 (I believe it's 32 KB).
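If you do go the base64 route, the plumbing is small. A minimal sketch, assuming the font was base64-encoded at build time into a little CSS file whose @font-face src is a data: URI (the URL and storage key here are made up):

    // Minimal sketch: /fonts/myfont.css (a made-up URL) contains an @font-face
    // rule whose src is a base64 data: URI, so the whole thing caches as text.
    function loadFontCss(url, key) {
      var css = null;
      try { css = localStorage.getItem(key); } catch (e) {}
      if (css) { injectCss(css); return; } // cache hit: no network request at all

      var xhr = new XMLHttpRequest();
      xhr.open('GET', url, true);
      xhr.onload = function () { // onload needs IE9+, which matches WOFF support anyway
        if (xhr.status === 200) {
          try { localStorage.setItem(key, xhr.responseText); }
          catch (e) { /* quota exceeded or private mode: skip caching */ }
          injectCss(xhr.responseText);
        }
      };
      xhr.send();
    }

    function injectCss(css) {
      var style = document.createElement('style');
      var head = document.getElementsByTagName('head')[0];
      if (style.styleSheet) { style.styleSheet.cssText = css; } // old IE
      else { style.textContent = css; }
      head.appendChild(style);
    }

    loadFontCss('/fonts/myfont.css', 'fontcss-v1');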
Cufon seriously sucks if you ask me. In my case it's WOFF support or nothing.
This article talks about storing images in local storage. Of course it's insanely fast:
http://www.sencha.com/learn/taking-sencha-touch-apps-offline/
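While I'm at it, here's a rough sketch of what that article does for images, assuming same-origin images (cross-origin pixels taint the canvas); note that toDataURL re-encodes, so the stored copy can end up larger than the original file:

    // Draw a loaded image to a canvas, export a data: URI, keep it in
    // localStorage. 'logo' is a made-up element id.
    function cacheImage(img, key) {
      var canvas = document.createElement('canvas');
      canvas.width = img.naturalWidth;
      canvas.height = img.naturalHeight;
      canvas.getContext('2d').drawImage(img, 0, 0);
      try { localStorage.setItem(key, canvas.toDataURL('image/png')); }
      catch (e) { /* quota exceeded: fall back to normal HTTP caching */ }
    }

    // On a later visit, skip the network entirely:
    var cached = localStorage.getItem('logo');
    if (cached) { document.getElementById('logo').src = cached; }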
As for fonts, well I'm looking at doing that myself. I'll let you know how it works out :)
Also, check out caniuse.com for browser support:
http://caniuse.com/#search=localstorage
http://caniuse.com/#search=datauri

Hope this helps:
From experience, I can tell you: don't do it. You can use CSS3 @font-face, but that will still give you some issues. The most stable solution I have found is to use Cufon: http://cufon.shoqolate.com/generate/.
I agree with your assessment and have explored it in detail on my own. Cached images beat encoded images in most cases.
No, but I can only imagine that this would give you much grief in cross-browser compatibility.
Thanks,
Matt

Related

What is the best practice for image compression?

I am receiving control of a website and I need to take care of an image compression process.
Right now, when uploading an image, it gets stored on the server with high quality and when the website's being cached, the image is getting compressed for the cache. So the cache has a compressed copy of the image while the original, high quality image, is still stored on the server.
The tool responsible for doing what I have just described was developed by the current owner of the website, and since I am not getting that tool, I will need another one. The site is currently using Pydio and I have not seen any compression option there.
Since it seems I need a new tool for the image compression process, I first want to know what the best practice is, performance-wise, for handling the compression, and I know there are some good, experienced developers here.
I thought about some options:
1. Keep it the way it is now: store the original image on the server and compress it for the cache when caching (best compatibility with the website, since this is what the current tool does).
2. Compress all images the moment they are uploaded, so only the compressed images are stored on the server and used for the cache (saves storage space, but I don't know how to combine it with Pydio).
3. Have a cron job that compresses all images which are not already compressed (lets me upload images freely without worrying about compressing them, though the images will not be compressed immediately).
4. Upload the image to a website which compresses it, then take the output image and upload that (really, it sounds stupid and is a lot of messing around just to upload an image).
What do you think would be the best practice, and why? Also, is there a better practice for compressing the images?
Plus, if you know of any tool which has an API for this, I would be thankful to hear about it.
The website is built using PHP.
Since the question you're asking is a general-approach one, I will put in my two cents.
On your approaches:
Option 4 - You could use some offline software or an external site for compression, but it seems like tedious work. If I needed to upload one image per day, I would probably choose this option.
Option 2 - I would rather not do compression on upload since you lose the original image. Image compression can ruin some images very badly.
As for options 1&3 - I think it depends on the resources of your server, the number of images, the traffic of your site, etc. Generally, I prefer compressing/caching on request, not upload, but for a smaller site, it shouldn't make much difference.
As for the API - generally, you have two options: do the work on your server/site or use an external service.
When it comes to services, we use CloudImage; it has a very simple API and it helps a lot with the compression process (and resizing, if you need it). You also get the benefits of a CDN, which will boost performance. Since you are using Pydio, I assume you need data security and privacy, so CloudImage may be a good option for you, since they take privacy really seriously.
If you prefer to do this yourself, and given that you use PHP, I would recommend ImageMagick and the PHP library IMagick. You can control every parameter of the compression, and the documentation is pretty good. The only downside is that achieving the best compression without losing quality is a bit of trial and error at first.
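To make the trial and error concrete, here is a sketch. Purely as an illustration it drives ImageMagick's convert binary from Node rather than from PHP; IMagick exposes the same knobs as methods (e.g. stripImage() and setImageCompressionQuality()), and the paths and quality value below are made up:

    // -strip drops metadata; -quality is the JPEG quality you tune by hand.
    var execFileSync = require('child_process').execFileSync;

    function compress(src, dest, quality) {
      execFileSync('convert', [src, '-strip', '-quality', String(quality), dest]);
    }

    compress('uploads/photo.jpg', 'cache/photo.jpg', 82); // ~80-85 is a common starting point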
Good luck!
Send your image to someone on WhatsApp; the received image will be compressed to a significantly smaller size.

Dynamic image resizing in the Cloud for Responsive website

I have a responsive (RWD) website which works OK on mobile devices. My problem is that pictures are sort of "heavy" on smartphones and uselessly large on older phones.
I know there are plenty of tools either offline or online (such as: http://www.resizeyourimage.com/) to resize pictures and I know I could roll my own image resizer with GD and the like (PHP here), but I was wondering if someone here is aware of a way to have images automatically resized.
For example by piping them through a proxy of some kind, such as:
<img src="http://cloudservice/w_320/http://myserver/mypic.jpg" />
A free service would be highly preferable.
This way I wouldn't have to retrofit old pictures nor is it necessary to provide multiple versions of the same picture.
I hope my question makes sense...
There are many such services, and a similar question has been asked before.
All reliable solutions will also require a tiny bit of client-side JavaScript. Cookies don't work on the first page load (which is most of them), and sniffing gives useless data if you're doing RWD with breakpoints. Excepting Slimmage (and solutions with <noscript> tags), most will download two copies of each image (or worse, fail accessibility and SEO requirements).
I favor the DRY & CSS-friendly Slimmage.js, as its author, but there is also Picturefill for those who want art direction support (and are willing to handle the resulting markup complexity). Both can be used with any RIAPI-compliant server-side module, such as ImageResizer (disclaimer of authorship applies here too).
If you have access to a Windows (or linux/mono) server, consider self-hosting.
Dynamic imaging SaaS products appear and fail on a regular basis, so have a backup plan in place to replace the URLs if your SaaS isn't RIAPI-compliant. If your HTML isn't dynamic or can't be post-processed, you're going to have... fun.
A few services (free or in beta):
CDNConnect (RIAPI-compliant third-party service based on ImageResizer)
BoxResizer (free, but uptime not guaranteed)
Sqish
Resizor
Some non-free (and non-compliant) services
http://www.resrc.it/pricing/us
https://responsive.io/plans#pricing-list
https://www.maikoapp.com/
http://www.thumbr.io/plans_and_prices
You should check out WURFL Image Tailor.
It works pretty much as you describe: you refer to the images through a proxy like this:
<img src="//wit.wurfl.io/[full-url-to-your-image]">
The proxy will then detect the screen size of the user agent and resize the image accordingly. This service also takes some arguments that allow you to explicitly set height, width, and percentage of screen size.
One image resizing service you can use is https://gumlet.com. You can use any image source with it and resize images exactly as you need.
For example, to get an image width of 300 px, you can write
https://subdomain.gumlet.com/image.jpg?width=300
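For instance, pairing that width parameter with srcset lets the browser pick the size it needs (a sketch; the element id is made up):

    // Build a srcset from the ?width= parameter shown above, so the browser
    // picks an appropriately sized image for the viewport.
    var widths = [300, 600, 1200];
    var srcset = widths.map(function (w) {
      return 'https://subdomain.gumlet.com/image.jpg?width=' + w + ' ' + w + 'w';
    }).join(', ');
    document.getElementById('hero').srcset = srcset; // 'hero' is a made-up id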
P.S. I work at Gumlet.

Would it be good for performance if we avoid using @font-face on a high-traffic site?

Would it be good for performance if we avoid using @font-face on a high-traffic site, or will it not make much difference?
The client wants to use custom fonts from http://www.google.com/webfonts, but if they can have a bad impact on performance, then we will go with web-safe fonts.
Using a third-party font foundry will slow down the initial load of your page: the browser has to do a DNS lookup and create a new TCP connection, and many font foundries rely on a piece of JS to determine which font format to send to the browser. All of these add delay.
Some foundries also don't compress fonts when they could e.g. Typekit doesn't compress .eot fonts, they also tend to have very short expiry times for fonts.
There's a good article with a comparison between Typekit and Google Fonts here - http://www.artzstudio.com/2012/02/web-font-performance-weighing-fontface-options-and-alternatives/
On a recent site I was reviewing, Typekit was adding 0.5 s to the initial page load, but as with everything, 'your mileage may vary', so you should test and measure.
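If you want to measure this on your own pages, the Resource Timing API makes it a one-liner in the console (the domain patterns below are just examples):

    // List every request to a font service and how long each took.
    performance.getEntriesByType('resource')
      .filter(function (r) { return /typekit|fonts\.(googleapis|gstatic)/.test(r.name); })
      .forEach(function (r) {
        console.log(r.name + ': ' + Math.round(r.duration) + ' ms');
      });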
Well, it's definitely some extra 'burden', but with proper handling (e.g. font compression) it'll probably matter... less. It's still gonna be something extra, huh?
If you're going to use the web fonts API and include fonts hosted by Google, then Google is going to feel the heat, not you. You'll only have to worry about how long it takes to load these fonts. If you use ten web fonts from Google, that will add considerably to your load times. But it still won't cost you any extra bandwidth, of course, because Google is paying for all of that.

What is the maximum size of local storage available for Firefox add-ons and/or Chrome extensions?

I'm designing some extensions that will continuously update a dataset that is used to render additional information on webpages.
This isn't standardized, but a Chrome extension is limited to 5 MB unless you ask for extra space: https://developer.chrome.com/extensions/manifest#permissions
You could use this as a rough guide to see if the problem you were trying to solve is appropriate for localStorage. If it's a lot less than 5MB, you're probably okay (though note many browsers don't have full support yet), but if it's more, it's probably the wrong solution.
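For reference, "asking for extra space" is a one-line manifest permission (documented on the page linked above). As far as I know, it lifts the quota for chrome.storage.local and IndexedDB, not for window.localStorage itself:

    // manifest.json (excerpt): "permissions": ["storage", "unlimitedStorage"]
    // With unlimitedStorage granted, chrome.storage.local is no longer bound
    // by its default 5 MB quota.
    var dataset = { /* the continuously updated data */ };
    chrome.storage.local.set({ dataset: dataset }, function () {
      if (chrome.runtime.lastError) {
        console.error('Save failed: ' + chrome.runtime.lastError.message);
      }
    });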

What is Progressive Enhancement?

Jeff mentioned the concept of 'progressive enhancement' when talking about using jQuery to write Stack Overflow.
After a quick Google, I found a couple of high-level discussions about it.
Can anyone recommend a good place to start, as a programmer?
Specifically, I have been writing web apps in PHP and would like to use YUI to improve the pages I am writing, but a lot of them seem very JavaScript-based, with most of the donkey work being done using JavaScript. To me, that seems a bit overkill, since viewing the site without JavaScript will probably break most of it.
Does anyone have some good places to start using this idea? I don't really care about the language.
Ideally, I would like to see how you create the static HTML first, and then add YUI (or whatever Ajax framework) to it, so that you get the benefits of a richer client.
As you've said:
To me, that seems a bit overkill, since viewing the site without JavaScript will probably break most of it.
This isn't progressive enhancement. Progressive enhancement means the site works perfectly without JavaScript or CSS; you then add (layer on) these extra technologies to increase the usability and functionality of the website.
The best example I can give is the tag input box on this website. With JavaScript turned off, it would still work allowing you to enter tags separated with a space. With JavaScript turned on, you get a drop down with suggestions of previous entries.
This is progressive enhancement.
See also unobtrusive JavaScript, which is the bedrock on which progressive enhancement is built.
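A sketch of that tag-box pattern (the markup and endpoint are made up): the plain form works on its own, and the script only layers suggestions on top when it can run.

    // The underlying HTML is a plain form that works without any script:
    //   <form action="/ask" method="post"><input id="tags" name="tags"></form>
    // This script enhances it with suggestions via a <datalist>, and does
    // nothing if it can't run. The /tag-suggestions endpoint is made up.
    document.addEventListener('DOMContentLoaded', function () {
      var input = document.getElementById('tags');
      if (!input) return;

      var list = document.createElement('datalist');
      list.id = 'tag-suggestions';
      input.setAttribute('list', list.id);
      document.body.appendChild(list);

      input.addEventListener('input', function () {
        var xhr = new XMLHttpRequest();
        xhr.open('GET', '/tag-suggestions?q=' + encodeURIComponent(input.value));
        xhr.onload = function () {
          list.innerHTML = JSON.parse(xhr.responseText)
            .map(function (s) { return '<option value="' + s + '"></option>'; })
            .join('');
        };
        xhr.send();
      });
    });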
Going at it from the other direction is sometimes referred to as graceful degradation. This is usually needed when the site is built first with the enhanced functionality afforded by the various technologies, then modified to degrade gracefully for browsers where those technologies are not available.
It is also graceful degradation when designing to work with older browsers (ancient, in Internet terminology) such as IE 5.5, Netscape, etc.
In my opinion, it is much more work to gracefully degrade an application. Progressively enhancing it tends to be much more efficient; however, sometimes the need arises to take an existing app and make it accessible in these lacking environments.
Basically, if your site still works with JavaScript turned off, then anything you add with JavaScript can be considered progressive enhancement.
Some people may think that this is unnecessary, but plenty of people browse with add-ons like NoScript (or with JavaScript simply turned off in their browser settings). In addition, many mobile web browsers may or may not support JavaScript. So it's always a good idea to test your site completely with and without JavaScript.
Progressive enhancement is a development technique that stresses the primacy of semantic HTML, then tests for browser capability and conditionally "layers on" JavaScript and/or CSS enhancements for the browsers that can utilize them.
One of the keys is understanding that we're testing for what the browser can do, as opposed to browser-sniffing. Modernizr is a very popular browser-capability test suite.
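Hand-rolled, a capability test looks like this; Modernizr packages the same kind of check (e.g. Modernizr.localstorage):

    // Test what the browser can actually do, rather than sniffing what it
    // claims to be.
    function supportsLocalStorage() {
      try {
        localStorage.setItem('__test__', '1');
        localStorage.removeItem('__test__');
        return true;
      } catch (e) {
        return false; // throws in old browsers and some private-browsing modes
      }
    }

    if (supportsLocalStorage()) {
      // layer on the enhancement
    }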
Progressive enhancement is inherently accessible (Section 508); it provides for meeting both the letter of the law and the spirit of the rule.
The Filament Group wrote the excellent "Designing With Progressive Enhancement" book on the subject. (I am not affiliated with Filament Group, though they are so freaking smart I wish I were.)
This is such an important concept and it saddens me that so few web developers understand it.
Basically, start by building a site/framework in Plain Old HTML: structural elements, links, and forms. Then add on some style, and then the shiny stuff (Ajax or what have you).
It's not very difficult. Like palehorse says, graceful degradation is more work.
Websites should work in any user agent: not look the same (and not even look, but sound, if you're vision-impaired), just work.
Progressive Enhancement:
The plain HTML/CSS site is awesome (fully working and user-friendly).
Adding JavaScript defines a new level of awesome.
