Optimal delivery method for a large quantity of images

I have a website centered around an online chat application where each user can have up to several hundred contacts. Each contact has their own profile image. I want the contact's profile image to load next to their name. However, having the user download 100+ images every time they load the site seems intensive (studies have shown that as many as 40% of users don't utilize their cache). Each image is around 60x60 pixels.
When I search on Google or sign on to Facebook, dozens of images are served nearly instantaneously. Beyond just having fast servers and a good connection, what are the optimal methods for delivering so many images to the user?
Possible approaches I have come up with are:
Storing each user's profile image in a database, constructing one combined image in a PHP file, having the user download that, and then using CSS to display each profile image. However, this seems extremely intensive for the server, and referencing such a large file so many times might take a toll on the user's browser.
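Roughly, the CSS side of what I have in mind would look something like this (the sprite file name and class names are just placeholders); each contact gets a 60x60 slice of one combined image:

    /* one combined sprite containing all of this user's contact avatars;
       the file name stands in for whatever the PHP script generates */
    .avatar {
        width: 60px;
        height: 60px;
        background: url("/sprites/contacts-of-user-123.png") no-repeat;
    }
    /* each contact's tile is selected by shifting the background */
    .avatar.c0 { background-position: 0 0; }
    .avatar.c1 { background-position: -60px 0; }
    .avatar.c2 { background-position: -120px 0; }

and then each contact in the list is just <div class="avatar c1"></div> or similar.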
Using nginx rather than Apache to serve the images (nginx generally works better for serving static content such as this). However, this seems more like an optimization of a solution, rather than a solution in itself.
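For reference, the nginx side could be as simple as something like this (the domain and paths are placeholders):

    server {
        listen 80;
        server_name static.example.com;

        location /avatars/ {
            root /var/www;                      # images live in /var/www/avatars/
            expires 30d;                        # long-lived cache headers
            add_header Cache-Control "public";
        }
    }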
I am also aware that data can be delivered across persistent HTTP connections, so multiple requests do not have to be made to the server for multiple files. However, exactly how many files can be delivered across one persistent connection? Would this persistent model mean that just having the images load as separate files would not necessarily be a bad idea?
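From what I can tell, that per-connection limit is configurable on the server; in Apache the relevant settings look something like this (the values shown are, I believe, the defaults):

    # httpd.conf - let many requests reuse one TCP connection
    KeepAlive On
    MaxKeepAliveRequests 100   # requests allowed per persistent connection (0 = unlimited)
    KeepAliveTimeout 5         # seconds to keep an idle connection open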
Any suggestions, solutions, and/or notes on personal experiences with relevant matters would be greatly appreciated. Scalability is extremely important here, as is cross-browser support (IE7+, Opera, Firefox, Chrome, Safari).
EDIT: I AM NOT USING JQUERY.

Here's a jQuery plugin that delays loading images until they're actually needed (i.e., it only loads images "above the fold"):
http://www.appelsiini.net/2007/9/lazy-load-images-jquery-plugin
An alternative may be to use Flash to display just the images. The advantage is that Flash has a much stronger local cache that you have programmatic control over.

Related

Store images locally vs Cloudinary vs S3

The settings:
Blog with posts, built with Laravel, where:
Every post can have max of 1 image (nullable).
Max posts in the blog is 1000. Let's assume there are 1000 posts for the discussion.
Every post has a comment section where registered users can comment and include an image in their comment. Let's assume every post has 2 images in its comments.
So in total that comes to 3000 images* that need to be stored (and resized, I guess), presented, etc.
This is the ideal amount in the long run; I'm not looking for a "scalable" solution, since there is not going to be crazy exponential growth.
*In reality it is less for the time being, and I assume that for these amounts of media files it doesn't really matter if it's 1000/1500/2000 or 3000. Correct me if that's wrong.
A few extra things to note:
I'm hosting it on shared hosting (I can store up to 300k files).
I want it to be secure, so no malicious file can be uploaded under the cover of an image file.
I'm looking for a budget solution (so if S3 starts charging heavily after 12 months, it's not relevant), preferably free.
So the dilemma is between storing all images locally in the Storage folder (manipulating images with some Laravel package). The other possibility is Cloudinary, which I don't know much about, just that it lets me store/manipulate/back up images and use their API to present the images I stored there.
If I choose to do it locally: is it safe to store a user-uploaded image locally? How do I make sure it's not malware in disguise of an image file?
With this amount of images/content, could storing locally cause performance issues on the shared hosting?
What would be the advantages of using cloudinary for me?
Thanks.
Cloudinary can actually help a lot in this case.
Instead of storing the resources locally and writing something up to manipulate them, you could integrate Cloudinary in the project.
This would free up server space. Storing images locally may or may not impact performance, depending on the architecture, but freeing server resources is always a good practice.
Also, manipulation and delivery of images can be done on the fly when first requested (or eagerly, before they are requested, if you want) with a simple API call, so you don't need to build something new; you can leverage an already existing API.
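For example, a resized version can be requested simply by describing the transformation in the delivery URL, roughly like this (the cloud name, folder and file name below are placeholders):

    https://res.cloudinary.com/<your-cloud-name>/image/upload/w_600,h_400,c_fill/posts/cover_42.jpg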
Cloudinary also has a fully-featured free tier that you could use. If you don't expect exponential growth at the moment, that tier would be more than enough for the project.
Full disclosure: I'm currently working at Cloudinary, (but the above still holds :) ).

Load times with @font-face vs. Google Fonts, or local files vs. CDNs

Is loading fonts by storing them on your server and using @font-face slower than loading them from Google's font API? Or does it always depend on the font and vary from situation to situation?
And the same for JavaScript and other similar files: is it faster or slower to load them from CDNs than to store the files on your server and serve them locally?
Or are there too many variables involved from situation to situation to generalize to a single answer? I would imagine that it depends on which CDN you're accessing and/or your own server settings and the size/nature of the files you're loading, etc., but I was curious whether there might be a general rule or strategy for knowing which is faster.
A CDN might be faster, on the basis that it is built with speed in mind (high-performance, tuned web servers, good caching...) and is usually composed of a network of geographically distributed servers, lowering latency both because they are nearer and because they share the load. Also, they may be placed directly on backbones, which allows for much faster transfer rates than a low- to mid-priced server will ever achieve.
That said, for a low-traffic website mostly visited from one specific country, in turn near the server location, the difference in load time is irrelevant.
The reason for using the Google or jQuery CDN is both to save bandwidth on your server (if the respective owner allows you to use theirs, of course) and to be sure you do not miss urgent patches, as they will push fixed versions to the CDN as soon as possible, while you have to get notified, download the new version, and then load it onto your server (although I guess this is not a great issue in modern, sandboxed browsers).
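To make the comparison concrete, the two setups look roughly like this (the Google endpoints are real, but the font family, jQuery version and local paths are just examples):

    <!-- loaded from Google's CDN -->
    <link rel="stylesheet" href="https://fonts.googleapis.com/css?family=Open+Sans">
    <script src="https://ajax.googleapis.com/ajax/libs/jquery/3.6.0/jquery.min.js"></script>

    <!-- self-hosted: the script comes from your own box, the font is declared via @font-face -->
    <script src="/js/jquery.min.js"></script>
    <style>
        @font-face {
            font-family: "Open Sans";
            src: url("/fonts/OpenSans-Regular.woff") format("woff");
        }
    </style>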

Embedding code (CSS, JS) into a document on high-profile sites

Is there an advantage of some sort (speed- or performance-wise) to embedding your CSS and JS in your web page, as opposed to keeping the code in separate files? I was raised to believe that keeping code in separate files makes things easier to maintain. However, on high-profile websites like Amazon, Google, or even Facebook, I see a lot of embedded code. Is there a performance reason they choose to do so, or is it just an old/new way of doing things? I suppose my question is similar to this one: Should I inline CSS & JS in mobile sites to save bandwidth?
But I would like to hear from experts, most notably from people who have worked on high-profile websites and have done so, if any.
P.S.
Bonus question: the last HTML comment on Amazon web pages is <!-- MEOW -->. Does it mean anything, or is it just a funny prank?
There are good reasons to inline resources, but as with most things, it also has its tradeoffs. The simplest case for inlining is where the cost of an HTTP request is much higher than the cost of the resource itself; for example, if you have a 10x10 icon you need to show, a dedicated request for it may not be worth it vs. inlining the data via a data URI.
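For example (the base64 payload is truncated here for brevity):

    <!-- dedicated request for a tiny icon -->
    <img src="/img/star-10x10.png" alt="star">

    <!-- the same icon inlined as a data URI: no extra request,
         but the bytes are re-sent with every page that uses it -->
    <img src="data:image/png;base64,iVBORw0KGgoAAAANSUhEUg..." alt="star">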
This is especially true when and if you have many small resources that need to be fetched. Most browsers limit themselves to a maximum of 6 connections per host, so if you have 60 resources which need to be fetched, you'll be blocked for a significant chunk of time.
To work around these cases we've invented other workarounds: domain sharding to go over the 6-connection limit, and "spriting" to fetch one resource instead of many.
If you take a look at mod_pagespeed (Apache module), which does many of these optimizations on the fly for you, then the recommended setting we provide is to inline any resource that's below 2kb. That's a pretty good rule of thumb for today's stack.
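In Apache configuration that looks roughly like the following (directive names from memory, so double-check them against the mod_pagespeed docs):

    # pagespeed.conf - inline small resources instead of fetching them separately
    ModPagespeed on
    ModPagespeedEnableFilters inline_css,inline_javascript,inline_images
    ModPagespeedCssInlineMaxBytes 2048
    ModPagespeedJsInlineMaxBytes 2048
    ModPagespeedImageInlineMaxBytes 2048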
Once SPDY is more widely deployed, many of these workarounds can be eliminated: no need to do domain sharding, cost of extra requests is much less, etc.
Stoyan did an experiment that you might find interesting http://www.phpied.com/style-tag-to-inline-style-attrrib/
External CSS/JS files typically get cached on the user's hard drive under that user's browser profile. So unless you change the code frequently, you won't really be doing yourself a favor by putting it inline.
Keeping the code separate definitely saves you maintenance time, but you can easily read in a JavaScript/CSS file and embed its contents in the page you're generating on the server side; that also means you're making your server do additional work.
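For example, in PHP that could be as simple as (the path is a placeholder):

    <?php
    // read the external stylesheet at render time and emit it inline
    $css = file_get_contents(__DIR__ . '/assets/critical.css');
    ?>
    <style><?php echo $css; ?></style>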
As for the MEOW - yeah, them trying to be funny, or it's code... for... cat...

How to save images on a server (structure)

I'm programming an application with MVC2. Users should be able to upload images to their profile.
The best way to save the images might be to save them in a database, but I think it is also the most expensive one. (I'm using MSSQL.)
I thought the best way would be to save them on the server: a user uploads images, the server resizes them, saves the image on the server, and stores the image path in a database.
But what if I run out of capacity on my server and have to use a second or third one?
My question: what is the best way to handle images on a server? What is the best way to be flexible?
Thanks for your answers!!!
There are a number of factors to consider.
The best way isn't necessarily storing images in a database. That can be a good choice, especially if you want to implement access control on the images. However, this comes at the cost of having to pull the image from the database and loading it into memory so that it can be streamed out by a server.
However, as these are profile images, and presumably visible to anyone who visits that user's profile, I'd advocate storing the file as a file on the server and storing a reference to that file in your database.
When it comes down to it, web servers are very good at serving files efficiently. If you can make use of that, you should.
Finally, you have concerns about space. In the first instance, you can prevent images over a certain size from being uploaded in the first place. You can also, as you suggest, auto-crop to a selected size.
If the sheer volume of users becomes a problem, you can always store your media on a separate server, storing a fully qualified link to each resource, e.g.:
<img src="http://images2.mydomain.com/image/profile_123.png" alt="A profile pic" />

Google Analytics for mobile and server-side caching

I need to implement Google Analytics for a mobile site that is behind a CDN. This means that every page of content (including any tracking pixel references) is going to be cached for anywhere between 15 seconds and 5 minutes. I see that the GA tracking code includes a random number (utmn) when constructing the pixel, possibly to distinguish separate requests, separate users, or simply to bust their own cache.
Does anyone know if it is safe to leave the page content cached? (I assume it is) Will we lose much tracking data?
Also, is it safe to serve the ga.aspx pixel itself from the CDN (where it would get cached) or does each GA pixel URL need to be uniquely addressed?
Does anyone have any recommendations on how best to implement GA? Server load from traffic is a huge concern for us, but we also want accurate numbers.
The mobile tracking code will not work properly unless the utmn is generated for each call, so if you can serve that part of the content so that it won't get cached (on the server side), that would solve the problem.
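As a rough sketch, generating the pixel reference per request could look like this in PHP (the account ID and parameter set are placeholders to check against Google's mobile snippet, and it only helps if this output itself is excluded from the CDN cache):

    <?php
    // build a GA mobile pixel URL with a fresh utmn on every request
    $utmn  = mt_rand(0, 0x7fffffff);              // unique random number per hit
    $utmac = 'MO-XXXXXXX-1';                      // placeholder mobile account ID
    $utmp  = urlencode($_SERVER['REQUEST_URI']);  // page being tracked
    $src   = "/ga.aspx?utmac={$utmac}&utmn={$utmn}&utmp={$utmp}";
    ?>
    <img src="<?php echo htmlspecialchars($src); ?>" width="1" height="1" alt="" />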
