What is the best practice for image compression and caching?

I am taking over a website and need to take care of an image compression process.
Right now, when an image is uploaded, it gets stored on the server at high quality, and when the website is cached, the image is compressed for the cache. So the cache holds a compressed copy of the image while the original, high-quality image is still stored on the server.
The tool responsible for doing what I have just described was developed by the current owner of the website, and since I am not getting that tool, I will need another one. The site currently uses Pydio, and I have not seen any compression option there.
Since it seems I need a new tool for the image compression process, I first want to know what the best practice is, performance-wise, for handling the compression, and I know there are some good, experienced developers here.
I thought about some options:
1) Keep it the way it is now: store the original image on the server and, when caching, compress it for the cache (best compatibility with the website, since this is what the current tool does).
2) Compress all images the moment they are uploaded, so only the compressed images are stored on the server and used for the cache (saves storage space, but I don't know how to combine it with Pydio).
3) Run a cron job that compresses all images which are not already compressed (gives me the ability to upload images freely without worrying about compressing them, though the images will not be compressed immediately).
4) Upload the image to a website that compresses it, then take the output image and upload that (really, it sounds stupid and like a lot of messing around just to upload an image).
What do you think would be the best practice, and why? Also, is there a better practice for compressing the images?
Plus, if you know of any tool with an API for this, I will be thankful to hear about it.
The website is built using PHP.

Since the question you're asking is a general-approach one, I will put in my two cents.
On your approaches:
Option 4 - You could use some offline software or an external site for compression, but it seems like tedious work. If I needed to upload one image per day, I would probably choose this option.
Option 2 - I would rather not compress on upload, since you lose the original image. Image compression can ruin some images very badly.
As for options 1 & 3 - I think it depends on the resources of your server, the number of images, the traffic of your site, etc. Generally, I prefer compressing/caching on request, not on upload, but for a smaller site it shouldn't make much difference.
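To make the compress-on-request idea concrete, here is a minimal PHP sketch; the directory layout, the JPEG-only handling, and the quality value of 75 are assumptions you would adapt to your setup:

<?php
// Sketch of option 1: compress on first request, then serve from the cache.
$name   = basename($_GET['img']);            // basename() blocks path traversal
$source = __DIR__ . '/uploads/' . $name;     // original, high-quality upload
$cached = __DIR__ . '/cache/' . $name;       // compressed copy

// (Re)build the cached copy if it is missing or older than the original.
if (!file_exists($cached) || filemtime($cached) < filemtime($source)) {
    $image = imagecreatefromjpeg($source);   // decode the original
    imagejpeg($image, $cached, 75);          // re-encode at lower quality
    imagedestroy($image);
}

header('Content-Type: image/jpeg');
readfile($cached);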
As for the API - generally, you have two options: do the work on your own server/site, or use an external service.
When it comes to services, we use CloudImage. It has a very simple API and it helps a lot with the compression process (and resizing, if you need it). You also get the benefits of a CDN, which will boost performance. Since you are using Pydio, I assume you need data security and privacy, so CloudImage may be a good option for you, since they take privacy really seriously.
If you prefer to do this yourself, and given that you use PHP, I would recommend ImageMagick and the PHP extension Imagick. You can control every parameter of the compression, and the documentation is pretty good. The only downside is that achieving the best compression without losing quality takes a bit of trial and error at first.
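For reference, a minimal Imagick sketch; the file names and the quality value of 82 are just assumed starting points for that trial and error:

<?php
// Compress a JPEG with Imagick; 82 is an assumed starting quality.
$img = new Imagick('original.jpg');
$img->setImageCompression(Imagick::COMPRESSION_JPEG);
$img->setImageCompressionQuality(82);   // 0-100, lower = smaller file
$img->stripImage();                     // drop EXIF metadata to save a few KB
$img->writeImage('compressed.jpg');
$img->destroy();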
Good luck!

Send your image over WhatsApp to someone else; the received image will be compressed to a significantly smaller size.

Related

Dynamic image resizing in the Cloud for Responsive website

I have a responsive (RWD) website which works OK on mobile devices. My problem is that pictures are sort of "heavy" on smartphones and uselessly large on older phones.
I know there are plenty of tools, either offline or online (such as http://www.resizeyourimage.com/), to resize pictures, and I know I could roll my own image resizer with GD and the like (PHP here), but I was wondering if someone here is aware of a way to have images resized automatically,
for example by piping them through a proxy of some kind, such as:
http://cloudservice/w_320/http://myserver/mypic.jpg
A free service is highly preferable.
This way I wouldn't have to retrofit old pictures, nor would it be necessary to provide multiple versions of the same picture.
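For reference, the roll-my-own GD route I mentioned would look roughly like this (the parameter names, the local image directory, and the quality value are placeholders):

<?php
// resize.php?w=320&src=mypic.jpg -- rough sketch of a self-hosted resize proxy
$width  = max(1, (int) $_GET['w']);
$source = __DIR__ . '/images/' . basename($_GET['src']); // local files only

[$origW, $origH] = getimagesize($source);
$height = (int) round($origH * $width / $origW);         // keep aspect ratio

$src = imagecreatefromjpeg($source);
$dst = imagecreatetruecolor($width, $height);
imagecopyresampled($dst, $src, 0, 0, 0, 0, $width, $height, $origW, $origH);

header('Content-Type: image/jpeg');
imagejpeg($dst, null, 80);

But I would rather not build and maintain that myself, hence the question.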
I hope my question makes sense...
There are many such services, and a similar question has been asked before.
All reliable solutions will also require a tiny bit of client-side JavaScript. Cookies don't work on the first page load (which is most of them), and user-agent sniffing gives useless data if you're doing RWD with breakpoints. Excepting Slimmage (and solutions with <noscript> tags), most will download two copies of each image (or, worse, fail accessibility and SEO requirements).
I favor the DRY and CSS-friendly Slimmage.js, as I am its author, but there is also Picturefill for those who want art-direction support (and are willing to handle the resulting markup complexity). Both can be used with any RIAPI-compliant server-side module, such as ImageResizer (disclaimer of authorship applies here too).
If you have access to a Windows (or Linux/Mono) server, consider self-hosting.
Dynamic imaging SaaS products appear and fail on a regular basis, so have a backup plan in place to replace the URLs if your SaaS isn't RIAPI-compliant. If your HTML isn't dynamic or can't be post-processed, you're going to have... fun.
A few services (free or in beta):
CDNConnect (RIAPI-compliant third-party service based on ImageResizer)
BoxResizer (free, but uptime not guaranteed)
Sqish
Resizor
Some non-free (and non-compliant) services:
http://www.resrc.it/pricing/us
https://responsive.io/plans#pricing-list
https://www.maikoapp.com/
http://www.thumbr.io/plans_and_prices
You should check out WURFL Image Tailor.
It works pretty much as you describe. You refer to the images through a proxy like this:
<img src="//wit.wurfl.io/[full-url-to-your-image]">
The proxy will then detect the screen size of the user agent and resize the image accordingly. The service also takes some arguments that allow you to explicitly set the height, the width, or a percentage of the screen size.
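For example, to request a fixed width it would be something like this (I am quoting the parameter syntax from memory, so check the WIT documentation for the exact form):
<img src="//wit.wurfl.io/w_320/http://yoursite.com/image.jpg">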
One image resizing service you can use is https://gumlet.com. You can use any image source with it and resize images exactly as you need.
For example, to get an image 300 px wide, you can write
https://subdomain.gumlet.com/image.jpg?width=300
P.S. I work at Gumlet.

Is slicing a bigger image (2 MB~5 MB) into tiles helpful for better appeal when loading on the web?

Well, I am struggling between best practice and a nice-to-have feature, and I need your opinion before embarking on a fruit(less/ful) endeavor.
To improve server performance, it has been suggested to make fewer server calls. But then I dislike the part where a big file takes a long time to load. I would rather load the file in chunks for better appeal (like Google Maps loads in layers/tiles).
What is the take of the community on this?
Thanks
It's better to use a CDN (Content Delivery Network) for rendering the static images in your webpage; it should be hosted on another server or in a cloud environment.
You'll surely see the performance improvement.
Thanks

Unity 3D: Asset Bundles vs. Resources folder vs. WWW.texture

So, I've done a bit of reading around the forums about AssetBundles and the Resources folder in Unity 3D, and I can't figure out the optimal solution for the problem I'm facing. Here's the problem:
I've got a program designed for standalone that loads "books" full of .png and .jpg images. The pages are, at the moment, the same every time the program starts. At the start of the scene for any "book", it loads all those images at once using WWW.texture and a path. I'm realizing now, however, that this is possibly a non-performant method for accessing things at runtime -- it's slow! Which means the user can't do anything for 5-20 seconds while the scene starts and the book's page images load (on non-legendary computers). So, I can't figure out which of the three options would be the fastest:
1) Loading one asset bundle per book (say, 20 textures at 1 MB each).
2) Loading one asset bundle per page (1 MB each).
3) Either of the first two options, but loaded from the Resources folder.
Which one would be faster, and why? I understand that asset bundles are packaged by Unity, but does this mean that the textures inside will be pre-compressed and easier on memory at load time? Does the Resources folder cause less load time? What gives? As I understand it, the Resources folder loads into a cache -- but is it the same cache that the standalone player uses normally? Or is this extra, unused space? I guess another issue is that I'm not sure what the difference is between loading things from memory and storing them in the cache.
Cheers, folks...
The Resources folder holds bundled, managed assets. That means they will be compressed by Unity, following the settings you apply in the IDE. They are therefore efficient to load at runtime. You can tailor the compression for each platform, which should further optimize performance.
We make extensive use of Resources.Load() to pull assets, and it performs well on both desktop and mobile.
There is also a special folder, called StreamingAssets, that you can use for bundled, un-managed assets. This is where we put the videos we want to play at runtime but don't want Unity to convert to the default Ogg codec. On mobile these play in the native video player. You can also put images in there, and loading them is like using the WWW class: slow, because Unity needs to sanitize and compress the images at load time.
Loading via WWW is slower due to the overhead of processing the asset, as mentioned above. But you can pull data from a server or from outside the application "sandbox".
Only load what you need to display, and implement a background process to fetch additional content while the user is busy going through the first pages of each book. This avoids blocking the UI for too long.
Optimize the images to reduce their file size. Use TinyPNG if you need transparent images, or stick to compressed JPGs.
Try using power-of-2 images where possible. This should speed up the runtime processing a little.
Great answer from Jerome about Resources. To add some additional info for future searches regarding AssetBundles, here are two scenarios:
Your game is too big
You have a ton of textures, say, and your iOS game is above 100 MB -- meaning Apple will show a warning to users and prevent them from downloading over cellular. Resources won't help, because everything in that folder is bundled with the app.
Solution: Move the artwork you don't absolutely need on first run into asset bundles. Build the bundles, upload them to a server somewhere, then download them at runtime as needed. Now your game is much smaller and won't have any scary warnings.
You need different versions of artwork for different platforms
Alternative scenario: you're developing for iPhone and iPad. For the same reasons as above, you shrink your artwork as much as possible to stay under the 100 MB limit for iPhone. But now the game looks terrible on iPad. What do?
Solution: You create an asset bundle with two variants: one for phones with low-res artwork, and one for tablets with high-res artwork. In this case the asset bundles can be shipped with the game or hosted on a server. At run time you pick the correct variant and load from the asset bundle, getting the appropriate artwork without having to if/else everywhere.
With all that being said, asset bundles are more complicated to use, poorly documented, and Unity's demos don't work properly at times. So seriously evaluate whether you need them.

Using HTML5 LocalStorage for Fonts / Images / Popular Plugins

Has anyone used localStorage successfully to store font files referenced by CSS? How can I implement this?
For images, I know I can binary-encode the images in script and save them into localStorage, but that would mean very large script code. If the bloated script is loaded every time, I don't see any real benefit. If the script is cached as a separate file, it would be the same as caching the image file in the first place. Am I missing something, or is there really no benefit in normal circumstances to putting images in localStorage?
Has anyone successfully put the popular plugins from Facebook/Google/Twitter into localStorage and is willing to share which ones are the most useful/applicable for caching?
I have no idea what your browser support requirements are like.
localStorage works in IE8+.
@font-face has varying support, but IE9+ supports WOFF fonts.
I think you'd need to base64-encode your fonts for them to work with localStorage (see the sketch at the end of this answer).
Data URIs (required for base64'd fonts) have size limitations in IE8 (I believe it's 32k).
Cufon seriously sucks, if you ask me. In my case it's WOFF support or nothing.
This article talks about storing images in local storage. Of course it's insanely fast:
http://www.sencha.com/learn/taking-sencha-touch-apps-offline/
As for fonts, well I'm looking at doing that myself. I'll let you know how it works out :)
Also, check out caniuse.com for browser support:
http://caniuse.com/#search=localstorage
http://caniuse.com/#search=datauri
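As a rough sketch of the base64 route mentioned above (server-side PHP here; the font path and family name are placeholders): emit the @font-face CSS with the font embedded as a data URI, and on the client you can then keep that CSS string in localStorage and inject it into a <style> tag.

<?php
// Sketch: build a @font-face rule with the WOFF embedded as a base64 data URI.
// The resulting CSS string is what the client would cache in localStorage.
$woff = base64_encode(file_get_contents(__DIR__ . '/fonts/myfont.woff'));

header('Content-Type: text/css');
echo "@font-face {\n"
   . "  font-family: 'MyFont';\n"
   . "  src: url(data:font/woff;base64,{$woff}) format('woff');\n"
   . "}\n";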
Hope this helps:
1. From experience, I can tell you: don't do it. You can use CSS3 @font-face, but that will still give you some issues. The most stable solution I have found is to use Cufon: http://cufon.shoqolate.com/generate/.
2. I agree with your assessment and have explored it in detail on my own. Cached images beat encoded images in most cases.
3. No, but I can only imagine that this will give you much grief in cross-browser compatibility.
Thanks,
Matt

Free web image management

I am looking for a free web image management system/script/...
I was using, and still use, the Photobucket service, but my account is free and has limited space and bandwidth, and now I am approaching the limits. On the other hand, I have a web hosting account and want to use it for image hosting instead of, or in addition to, Photobucket. Sounds good. I can use FTP to upload my images and I am fine with that. What I miss is Photobucket's web interface to my images. I am talking about photo galleries or something portfolio-like: a basic list of thumbnails, so I can see my images and easily get a link to a specific image in different formats to paste into forum posts or into other web pages referring to that specific picture. Besides, I need an easy way to organize pictures in albums/sub-albums (like in a file system). I see a gazillion image gallery systems, but have a hard time finding one that does what I need. Oh, and I do not want a database, just flat files/directories.
Anything come to mind?
There's Coppermine Gallery.
I was just looking for something similar.
I know Coppermine pretty well (I run a site powered by it): it is very flexible, with tons of options, and relatively easy to mod to your needs if you know a bit of PHP. It also reads and displays EXIF data if you configure it to do so. There is a lively community of developers around Coppermine. There is also a plugin that displays BBCode (http://forum.coppermine-gallery.net/index.php/topic,74043.msg356623.html#msg356623), unfortunately only image by image (not in bulk like the ImageShack uploader). The drawback is that Coppermine is a pretty bulky script and does not perform super fast, especially on slow servers.
Bravenet seems to be a service more than a script.
I'm also checking out Lightbox 2 (http://lokeshdhakar.com/projects/lightbox2/#example), which seems nice and tiny, but there is no chance of getting the EXIF data into the displayed image or getting the BBCode.
Will keep an eye on this thread
