How to compress images on the server

I have a website, which is also a social networking site, where users can upload profile, cover, and article pictures.
There are currently around 25,000+ images, many of them 2 MB to 10 MB in size, and I want to compress them all at once.
Is that possible? If so, please tell me how, because I'm currently using a huge amount of bandwidth and the site is also pretty slow.
Thanks!
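One possible approach, as a rough sketch only: assuming you have shell access and PHP's GD extension on the server, a one-off script can walk the upload directory and re-encode oversized JPEGs at a capped width and lower quality (the uploads/ path, size threshold, and quality below are placeholders):

```php
<?php
// One-off batch re-compression sketch. Assumes PHP's GD extension and an
// "uploads/" directory; adjust the path, threshold, width, and quality.
$maxWidth  = 1920;
$quality   = 80;
$threshold = 500 * 1024; // only touch files larger than ~500 KB

$files = new RecursiveIteratorIterator(
    new RecursiveDirectoryIterator('uploads', FilesystemIterator::SKIP_DOTS)
);

foreach ($files as $file) {
    $path = $file->getPathname();

    if ($file->getSize() < $threshold) {
        continue;
    }

    $info = @getimagesize($path);
    if ($info === false || $info[2] !== IMAGETYPE_JPEG) {
        continue; // only re-encode JPEGs in this sketch
    }

    $img = imagecreatefromjpeg($path);
    if (imagesx($img) > $maxWidth) {
        $img = imagescale($img, $maxWidth); // keeps the aspect ratio
    }

    imagejpeg($img, $path, $quality); // overwrites in place - back up first!
    imagedestroy($img);
}
```

Run it against a copy of the files first, since re-encoding overwrites the originals in place.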

Related

Store images locally vs Cloudinary vs S3

The settings:
A blog with posts, built with Laravel, where:
Every post can have a max of one image (nullable).
The max number of posts in the blog is 1000. Let's assume there are 1000 posts for the discussion.
Every post has a comment section where registered users can comment and include an image in their comment. Let's assume every post has 2 images in the comments.
So in total that comes to 3000 images* that need to be stored (and resized, I guess), presented, etc.
This is the ideal amount in the long run; I'm not looking for a "scalable" solution, since there is not going to be crazy exponential growth.
*In reality it is currently less, and I assume that for these amounts of media files it doesn't really matter whether it's 1000/1500/2000 or 3000. Correct me if that's wrong.
A few extra things to note:
I'm hosting it on shared hosting (I can store up to 300k files).
I want it to be secure, so no malicious file is uploaded under the cover of an image file.
I'm looking for a budget solution (so if S3 starts charging hard after 12 months, that makes it irrelevant), preferably free.
So the dilemma is between storing all images locally in the storage folder (manipulating images with some Laravel package) and Cloudinary, which I don't know much about, other than that it lets me store/manipulate/back up images and use their API to present the images I store there.
If I choose to do it locally: is it safe to store a user-uploaded image locally? How do I make sure it's not malware in disguise as an image file?
With this amount of images/content, can storing locally cause performance issues on the shared hosting?
What would be the advantages of using Cloudinary for me?
Thanks.
Cloudinary can actually help a lot in this case.
Instead of storing the resources locally and writing something up to manipulate them, you could integrate Cloudinary in the project.
This would free up server space. Storing images locally may or may not impact performance, depending on the architecture, but freeing server resources is always a good practice.
Also, manipulation and delivery of images can be done on the fly when an image is first requested (or eagerly, before it is requested, if you want) with a simple API call. So you don't need to build something new; you can leverage the existing API.
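For example, a delivery URL with on-the-fly resizing and automatic quality/format selection looks roughly like this (the cloud name and public ID below are placeholders):

```php
<?php
// Hypothetical values; substitute your own cloud name and public ID.
$cloudName = 'demo';
$publicId  = 'profile/avatar_123';

// w_300,h_300,c_fill resizes and crops; q_auto,f_auto let Cloudinary pick
// the quality and format best suited to the requesting browser.
$url = "https://res.cloudinary.com/{$cloudName}/image/upload/w_300,h_300,c_fill,q_auto,f_auto/{$publicId}.jpg";

echo $url;
```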
Cloudinary also has a fully featured free tier that you could use. If you don't expect exponential growth at the moment, that tier would be more than enough for the project.
Full disclosure: I'm currently working at Cloudinary, (but the above still holds :) ).
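If you do decide to store locally instead, here is a minimal sketch, assuming Laravel's built-in validation plus the Intervention Image package (v2 API), of accepting an upload, checking that it really is an image, and re-encoding it so that a payload disguised as an image does not survive:

```php
<?php

namespace App\Http\Controllers;

use Illuminate\Http\Request;
use Illuminate\Support\Facades\Storage;
use Intervention\Image\Facades\Image; // assumes intervention/image v2 is installed

class ImageUploadController extends Controller
{
    public function store(Request $request)
    {
        // Reject anything that isn't a real image up front.
        $request->validate([
            'image' => 'required|image|mimes:jpeg,png|max:4096', // max is in KB
        ]);

        // Re-encoding strips anything that isn't valid image data.
        $clean = Image::make($request->file('image'))
            ->resize(1600, null, function ($constraint) {
                $constraint->aspectRatio();
                $constraint->upsize();
            })
            ->encode('jpg', 80);

        $name = uniqid('img_') . '.jpg';
        Storage::disk('public')->put('posts/' . $name, (string) $clean);

        return back()->with('path', 'storage/posts/' . $name);
    }
}
```

Validation rejects files that aren't parseable images, and re-encoding through the image library means a file merely pretending to be an image does not come out the other side intact.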

Best Way to Reduce Bandwidth Usage with Magento eCommerce

Currently I am working on a website that is using the Magento eCommerce platform.
Although the website has only seen about 1,000 visitors over a 30-day period, it is using over 70 GB of bandwidth. The website has caching enabled to help reduce the resources it takes to load each page, but it does not appear to be helping. I was hoping to find some pro tips on how to reduce bandwidth usage and avoid costly overage fees with the hosting provider.
The website is http://fantasyfootballdraftboard.net if you would like to review the site. The primary purpose of the website is to sell fantasy football draft boards online, so I would prefer not to remove the large image on the home page. I've used the Pingdom speed test, and it claims the site only uses roughly 2.5 MB of bandwidth to load each page. After a pretty in-depth analysis of Google Analytics, page views, and the amount of bandwidth it takes to load each page (2.5 MB according to Pingdom), the numbers just do not add up.
Does anybody have any suggestions or ideas for me? Does Magento use a lot more resources and bandwidth than other eCommerce websites?
Thanks in advance,
I ran your site through WebPageTest and there are a few recommendations from there. You should certainly enable transfer compression. Personally, I would recommend you avoid PNG files for images unless you really need them (e.g. for transparency effects). JPEGs compress photographic images much more efficiently.
Go through WebPageTest and I'd bet you could knock off at least a megabyte!
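If you do swap PNGs for JPEGs, a rough one-off conversion with PHP's GD extension could look like this (the path and quality are placeholders; transparency is flattened onto white):

```php
<?php
// Convert a PNG (where transparency isn't needed) into a flattened JPEG.
$src = 'media/catalog/product/draft-board.png'; // placeholder path
$dst = preg_replace('/\.png$/i', '.jpg', $src);

$png = imagecreatefrompng($src);

// Flatten onto a white background so transparent areas don't turn black.
$jpg = imagecreatetruecolor(imagesx($png), imagesy($png));
imagefill($jpg, 0, 0, imagecolorallocate($jpg, 255, 255, 255));
imagecopy($jpg, $png, 0, 0, 0, 0, imagesx($png), imagesy($png));

imagejpeg($jpg, $dst, 85);
imagedestroy($png);
imagedestroy($jpg);
```

Remember to update any references in your templates to point at the new .jpg files afterwards.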
I checked your website, but I didn't find any problem.
The bandwidth usage has no direct relation to the Magento system itself.
Maybe your images are being hotlinked by other sites, or your FTP password was stolen by a cracker.
Check HTTP_REFERER in the Apache config (or .htaccess file):
http://www.webmasterworld.com/apache/4515652.htm
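The referer check described there might look roughly like this in an .htaccess file (using this site's own domain; adjust the pattern and extensions to your needs):

```apache
# Block image requests whose Referer is another site; empty referers are allowed.
RewriteEngine On
RewriteCond %{HTTP_REFERER} !^$
RewriteCond %{HTTP_REFERER} !^https?://(www\.)?fantasyfootballdraftboard\.net/ [NC]
RewriteRule \.(gif|jpe?g|png)$ - [F,NC,L]
```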

Speed up loading time of web page with huge gifs

I have a website with lots of huge GIF images. I have limited each page to 5 images, but the loading time is still very high (60+ seconds). The images are around 2 MB in size.
Is there a way to speed up loading? Because of the nature of the images, I think they cannot be compressed (again), because that would decrease quality significantly. The images are "soundless mini videos" of funny situations.
I also thought about creating multiple connections to download images faster (as many download accelerators do), but I doubt that is possible on the client side.
I also tried loading images one by one (i.e., waiting for the first image to finish downloading and then adding the next through the DOM), but the total time increased (fewer connections = slower total download speed).
Do you have any ideas?
UPDATE: Solved by using Cloudflare (see answer).
I solved the problem by using Cloudflare:
CloudFlare protects and accelerates any website online. Once your
website is a part of the CloudFlare community, its web traffic is
routed through our intelligent global network. We automatically
optimize the delivery of your web pages so your visitors get the
fastest page load times and best performance.
Now my website is loading in seconds instead of minutes; it looks like my hosting service was poor.

Optimal delivery method for a large quantity of images

I have a website centered around an online chat application where each user can have up to several hundred contacts. Each contact has their own profile image. I want the contact's profile image to be loaded next to their name. However, having the user download 100+ images every time they load the site seems intensive (studies have shown that as much as 40% of users don't utilize their cache). Each image is around 60x60 pixels.
When I search on google or sign on to facebook, dozens of images are served nearly instantaneously. Beyond just having fast servers and a good connection, what are the optimal methods for delivering so many images to the user?
Possible approaches I have come up with are:
Storing each user's profile image in a database, constructing one combined image in a PHP file, then having the user download that, and then using CSS to display each profile image (a sprite sheet; see the sketch below). However, this seems extremely intensive on the server, and referencing such a large file so many times might take a toll on the user's browser.
Using nginx rather than Apache to serve the images (nginx generally performs better serving static content such as this). However, this seems more like an optimization of a solution, rather than a solution in itself.
I am also aware that data can be delivered across persistent HTTP connections so multiple requests do not have to be made to the server for multiple files. However, exactly how many files can be delivered across one persistent connection? Would this persistent model mean that just having the images load as separate files would not necessarily be a bad idea?
Any suggestions, solutions, and/or notes on personal experiences with relevant matters would be greatly appreciated. Scalability is extremely important here, as well as cross-browser support (IE7+, Opera, Firefox, Chrome, Safari).
EDIT: I AM NOT USING JQUERY.
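A rough sketch of the sprite idea from the first approach above (building one combined image server-side with PHP's GD; the directory, tile size, and output name are placeholders):

```php
<?php
// Stitch 60x60 avatars into a single horizontal sprite sheet with GD.
$avatars = glob('avatars/*.jpg'); // placeholder source directory
$size    = 60;

if (count($avatars) === 0) {
    exit("no avatars found\n");
}

$sprite = imagecreatetruecolor($size * count($avatars), $size);

foreach ($avatars as $i => $file) {
    $tile = imagecreatefromjpeg($file);
    // Normalise every tile to 60x60 and copy it into its slot.
    imagecopyresampled(
        $sprite, $tile,
        $i * $size, 0, 0, 0,
        $size, $size, imagesx($tile), imagesy($tile)
    );
    imagedestroy($tile);
}

imagejpeg($sprite, 'avatars/sprite.jpg', 85);
imagedestroy($sprite);

// Each contact is then shown via CSS background positioning, e.g.
// .avatar-3 { background: url(avatars/sprite.jpg) -180px 0; width: 60px; height: 60px; }
```

The trade-off is that the sprite has to be rebuilt whenever a contact changes their picture, which is why sprites are usually reserved for images that change rarely.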
Here's a jQuery plugin that delays loading images until they're actually needed (i.e., it only loads images "above the fold"):
http://www.appelsiini.net/2007/9/lazy-load-images-jquery-plugin
An alternative may be to use Flash to display just the images. The advantage is that Flash has a much stronger local cache that you have programmatic control over.

What are the best practices for image serving?

What techniques do people commonly use for uploading, storing and presenting images with a CMS?
Do you store them in the database or on the file system?
Do you generate thumbnails on upload? Or on the fly, then maybe cache them for reuse? Or rely on browser scaling?
Typically, most content management systems will store the actual data of image uploads on the file system and then add a link to the file within the database. Thumbnails can either be generated on upload or on first request (generating them on the fly for every request is considered inefficient, especially given the cheap cost of storage). Browser scaling is a bad idea (images may be uploaded as multi-megabyte uncompressed files) but is done by some systems.
I agree with Kevin. I can't think of any CMS that doesn't store images in the file system. The only issue that comes up with that technique is if you are planning on clustering multiple web servers to run your CMS. If that's the case, then you have to plan for it and have the ability to point all the web servers to the same file storage location.
The technique I've used for years is: on upload, resize the image to something practical for the web, then generate the thumbnail, then write both to the file system and record the pointers in the database.
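That workflow, as a rough sketch in PHP with GD and PDO (the table and column names are invented for illustration):

```php
<?php
// On upload: resize to a web-friendly size, make a thumbnail, write both to
// the file system, and record only the pointers (paths) in the database.
function handleUpload(string $tmpPath, PDO $db): void
{
    $id  = uniqid('img_');
    $src = imagecreatefromjpeg($tmpPath);

    $web   = imagescale($src, 1200); // practical display size
    $thumb = imagescale($src, 200);  // listing thumbnail

    imagejpeg($web, "uploads/{$id}.jpg", 85);
    imagejpeg($thumb, "uploads/{$id}_thumb.jpg", 80);

    imagedestroy($src);
    imagedestroy($web);
    imagedestroy($thumb);

    $stmt = $db->prepare(
        'INSERT INTO images (id, path, thumb_path) VALUES (?, ?, ?)'
    );
    $stmt->execute([$id, "uploads/{$id}.jpg", "uploads/{$id}_thumb.jpg"]);
}
```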
If the site is huge, then you need to serve the images from cache servers, because file systems are very slow in comparison to network I/O. Take Facebook, for example: they have billions of images on their site, and last I heard, 80% were held in RAM in cache servers around the world. Their file storage array is more or less a backup to the cache servers.

Resources