Prestashop img folder is too big (almost 40GB)

I have a Prestashop store which keeps getting bigger and is now around 40 GB. Recently I realized that the img/ folder is taking up around 95% of the disk space. After counting my products and estimating the average number of images per product and the average size per image, I am sure it should not exceed 5 GB. Please help.

There are a number of things you can try:
1. Set write permissions on the folders and files in the img/ directory. This can be done through your hosting cPanel, by direct SSH access, or any other way you are comfortable with.
2. In your Prestashop back office, go to Preferences -> Images and check whether the "Generate high resolution images" option is active. If so, disable it: it takes a huge amount of space (it roughly doubles the size of the image folder) and does not help that much, since people mostly use laptops and mobile phones to browse online stores nowadays.
3. On the same page, at the bottom, click Regenerate thumbnails. This will clear the old generated images and create new ones.
I believe after following these steps you should be able to save a lot of space.
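If you want to confirm where the space actually goes before deleting anything, a quick diagnostic sketch like the one below can report the size of each subfolder under img/ (run it from the shop root; it only reads the filesystem and is not part of Prestashop):

    <?php
    // Report the size of each subfolder of img/ to see what is eating the disk.
    // Purely a diagnostic sketch; run it from the Prestashop root directory.
    foreach (glob(__DIR__ . '/img/*', GLOB_ONLYDIR) as $dir) {
        $bytes = 0;
        $files = new RecursiveIteratorIterator(
            new RecursiveDirectoryIterator($dir, FilesystemIterator::SKIP_DOTS)
        );
        foreach ($files as $file) {
            $bytes += $file->getSize();
        }
        printf("%-15s %10.2f MB\n", basename($dir), $bytes / 1048576);
    }

In most cases the bulk of the space turns out to be the generated thumbnails, which is exactly what regenerating (or disabling high-resolution images) will reclaim.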
Also, check the Prestashop Addons marketplace for image-cleaning modules, for example: http://addons.prestashop.com/en/22308-redundant-image-cleaner.html
This module will help you delete images that are no longer used.
I hope this helps.

Related

Website loads images with a 1-2 second delay. Could I improve the performance somehow?

Recently I made a website for my photography: http://www.simotamas.com
I am a newbie, so it's not the best site, but it works fine for me. I have only one problem: when the site is loaded on a device for the first time, the gallery takes up to 1-2 seconds to load.
Could you guys please check if I messed up something with the code?
Or should I make the pictures even smaller?
Is there any way I could improve the loading performance?
I would be really thankful for any advice.
Some points you can consider:
- Use low-resolution thumbnails for the preview and load the actual image only when it is clicked (see the sketch after this list).
- Load the images in the visible part of the page first, then load the ones further down (this may affect the user experience).
- If you have CPU power to spare, use a caching or compression library such as https://nielse63.github.io/php-image-cache/ and benchmark it carefully.
- Use gzip compression if your server is not already using it.
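For the thumbnail point, assuming you can run PHP on the server (the linked php-image-cache suggests that is likely), a rough sketch along these lines can generate the preview versions with the GD extension; the paths, the 400px width and the JPEG quality are placeholders:

    <?php
    // Rough sketch: create a 400px-wide JPEG thumbnail with the GD extension.
    // Paths, width and quality are placeholders - adjust for your gallery.
    function make_thumbnail(string $src, string $dest, int $thumbWidth = 400): void
    {
        [$width, $height] = getimagesize($src);
        $thumbHeight = (int) round($height * $thumbWidth / $width);

        $original = imagecreatefromjpeg($src);
        $thumb    = imagecreatetruecolor($thumbWidth, $thumbHeight);

        imagecopyresampled($thumb, $original, 0, 0, 0, 0,
                           $thumbWidth, $thumbHeight, $width, $height);

        imagejpeg($thumb, $dest, 80);   // 80 = JPEG quality

        imagedestroy($original);
        imagedestroy($thumb);
    }

    make_thumbnail('gallery/photo.jpg', 'gallery/thumbs/photo.jpg');

The thumbnails only need to be generated once after upload; the full-size photo is then fetched only when the visitor clicks through.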
The fact that your website doesn't wait for the image to load is considered a plus (look into asynchronous loading of web page content for a good read). That said, you should compress your images before uploading them; tinypng.com is a nice tool for it. But since it's a photography website, doing so would reduce picture quality, so try playing with Photoshop's save settings to find your ideal compromise between quality and file size. Pictures are heavy: high definition and resolution will obviously result in heavier files to download.
Update: another thing you could do is display a (smaller) thumbnail and only load the full picture on request, i.e. the user clicks and the image opens in a new tab.
It would also help if you created smaller thumbnail versions of your images, so the browser can initially load those for the overview instead of scaling far-too-big images down while rendering the page. An image should always be downloaded at the dimensions it is going to be presented in.

How to reduce Xcode project file size

So I am trying to submit my WWDC scholarship app, but the file size limit is 100 MB and mine is currently 132 MB. I have spent the past few hours reducing the size of the images and compressing them, but I only saved about 10 MB...
So now I am trying to figure out what is taking up all of the space, and what I can delete to get it under 100 MB.
I noticed that when I go into the 'Developer' folder, right-click on my project and choose 'Get Info', it shows that the project is 132 MB; however, if I check the three folders inside individually, they only add up to about 40 MB.
If I go to ~/Library/Developer/DerivedData, the folder for this project is about 250 MB, so is there something in there that can be deleted?
Thanks in advance!
~/Library/Developer/DerivedData can be deleted in its entirety at any time. However, doing so will not affect the size of the app.
Look first at any assets that you have included: images, videos, sounds, data, and fonts. Figure out what can be eliminated, reduced, or hosted externally. Many times you can replace large images with drawing code.
Select your project target > Build Settings and check VALID_ARCHS:
Remove the architectures you don't need and keep only the ones your project actually requires; the more architectures you build for, the bigger the final archived binary gets. In this case I think you can keep only x86_64.
For more reading about architectures, take a look at THIS ANSWER (it's about iOS).
I finally found the problem. I used a terminal command to reveal hidden files and found that there was a .git folder in the project which was taking up 93 MB. I have now deleted it, which brought the size down to 37 MB.

How to Remove Specific Dimension Images in WordPress

I have been searching for a solution to this issue for 3 hours. My theme is generating images that I don't need. I am trying to get support, but no answer is coming. Can someone please tell me:
How can I stop the generation of (one or more) specific image dimensions to save my bandwidth? I am not asking about the default WordPress sizes like medium, full or small, and I am not planning to edit the theme's code.
I just need a snippet that I can put on my site (probably in the functions.php file) which can stop generating images of specific dimensions.
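For reference, a minimal sketch of the kind of functions.php snippet being asked for, assuming the theme registers its extra sizes with add_image_size(); the size names 'theme-banner' and 'theme-square' are placeholders, and the real names depend on the theme:

    <?php
    // In the (child) theme's functions.php.
    // Option 1: unregister sizes the theme added with add_image_size().
    add_action( 'init', function () {
        remove_image_size( 'theme-banner' );   // placeholder size name
        remove_image_size( 'theme-square' );   // placeholder size name
    }, 99 );

    // Option 2: filter the size list right before WordPress generates the
    // files, so the unwanted dimensions are never created on upload.
    add_filter( 'intermediate_image_sizes_advanced', function ( $sizes ) {
        unset( $sizes['theme-banner'], $sizes['theme-square'] );
        return $sizes;
    } );

Note that this only affects newly uploaded images; files that were already generated would need to be cleaned up separately.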

Best way to deal with image thumbnails on a news website (custom CMS based on CodeIgniter)?

I've been thinking for a while about the best solution, and the more I read, the more confused I get. There are a lot of different libraries and helpers (most of them outdated or for CI 1.x), and I really need your help.
I have a custom CMS based on CodeIgniter 2.1.3, a news site with about 40-50 images on the home page; 80% of them are really small thumbnails in 3 different sizes, and the other 20% of the home page images come in 2 sizes. In addition, on the inner pages where I list the news from a category there is 1 thumbnail size. So in total I will need the original image for the news story, plus 5-6 thumbnail sizes for the home page.
What's the smartest way to deal with this? There will be, let's say, 10-50 new news stories per day.
Is it still better to create 5-6 thumbnails per image during the upload?
What about the "on the fly" method? I'm more inclined towards it: as I read, only the first visitor will trigger the library/helper to generate the thumbnails, and for the others the thumbnails will already be created, so it won't waste CPU. Is it good practice?
What caching techniques should I use for this?
Also, I forgot to ask: how do other CMS systems deal with generating thumbnails? I mean WordPress, Drupal, Joomla, etc.
Do they store predefined sizes or generate them on the fly?
I guess their logic should be the best, or maybe not, but I want to implement something smart in my CodeIgniter CMS.
I didn't mention it, but I don't think it matters here: I use Grocery CRUD for the admin panel.
Any help is appreciated.
Your best bet is to create images on the fly + use a CDN like Amazon CloudFront to cache the resized versions of your source image.
I've been using CodeIgniter for a number of years to build websites where lots of different image sizes are used throughout the site. At the beginning I used to create every size needed out of the original image during the upload process (which could easily end up being more than 5 thumbnails). This proved to deliver the best performance: whenever you need an image of a certain size you just include it, with no additional PHP processing. However, I noticed that I end up with a huge number of images on the server, where the older ones may not even be used that often (e.g. articles older than a year). Plus, developing this way takes longer.
Then I started creating images on the fly, first using 3rd-party libraries and later developing my own interface for CodeIgniter. This saves a lot of time, because during the upload process you only store the original version of the image and don't worry about resized versions. When displaying an image in the front end, all you normally need to do is pass the required dimensions. This way you can get not just 5-6 versions of the image, but as many as you need. It is also a solution for the future, when you redesign your website and differently sized images might be needed: what would you do when none of your 5 thumbnail options is valid any more and you need different sizes?
You're right that resizing an image on the fly can be a really CPU-consuming operation (especially when large images are involved), therefore caching is a must. You can cache images right on your server or put a CDN on top of that.
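A rough sketch of that approach, using CodeIgniter's Image Manipulation class (the uploads/ and uploads/cache/ paths are placeholders, the URL helper is assumed to be loaded, and error handling is omitted):

    // Serve a cached resized copy if it exists, otherwise create it once.
    // Rough sketch for a CodeIgniter 2.x helper; paths and naming are placeholders.
    function thumb_url($filename, $width, $height)
    {
        $CI =& get_instance();

        $source = FCPATH . 'uploads/' . $filename;
        $cached = FCPATH . 'uploads/cache/' . $width . 'x' . $height . '_' . $filename;

        if ( ! file_exists($cached))
        {
            $CI->load->library('image_lib');
            $CI->image_lib->initialize(array(
                'image_library'  => 'gd2',
                'source_image'   => $source,
                'new_image'      => $cached,
                'maintain_ratio' => TRUE,
                'width'          => $width,
                'height'         => $height,
            ));
            $CI->image_lib->resize();
            $CI->image_lib->clear();
        }

        return base_url() . 'uploads/cache/' . $width . 'x' . $height . '_' . $filename;
    }

In a view you would then just call something like thumb_url('photo.jpg', 200, 150) wherever that size is needed; the resize only happens on the first request for that size.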
To keep the server tidy I normally run a cron job to delete on-the-fly images older than, let's say, a week. That saves space and doesn't cause harm: whenever an image is needed for display, it will just get recreated.
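The cleanup job can be as simple as something like this (again a sketch; the cache path and the one-week threshold are assumptions):

    <?php
    // Delete on-the-fly generated images older than one week.
    // Run from cron, e.g.:  0 3 * * * php /path/to/cleanup_cache.php
    $cache_dir = '/var/www/site/uploads/cache';   // placeholder path
    $max_age   = 7 * 24 * 3600;                   // one week, in seconds

    foreach (glob($cache_dir . '/*.{jpg,jpeg,png,gif}', GLOB_BRACE) as $file) {
        if (time() - filemtime($file) > $max_age) {
            unlink($file);   // it will simply be recreated on the next request
        }
    }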
Check out timthumb, it's a script that resizes images on the fly and stores them in a cache. It's as simple as including an image tag with parameters in the URL.
Also check this link, which looks promising: http://www.jenssegers.be/blog/31/Codeigniter-resizing-and-cropping-images-on-the-fly-continued
I love the way Drupal manages this. In Drupal 6 there was a module called Imagecache (it is now in core in Drupal 7, with very similar functionality) which basically stores presets for images (sizes, transformations, effects...). When a visitor asks for an image, the module generates the different versions based on the presets and serves them. This way you upload one image but have different images for different purposes.
The module has a really useful feature: if you want to change a preset, you can "flush" all the images related to that preset, so visitors can see the changes.
Of course there are many other Drupal modules related to Imagecache or image styles that add other effects like watermarks...
More information:
http://drupal.org/node/949222
http://drupal.org/node/163561

How would you optimise/simulate 'random' loading of large image files?

We use large background images (hi-res photos, up to 700 KB) for our page design.
It's part of the experience of the site that as you browse around, you see different images.
At the moment a different (random) image is loaded on each page request, from a pool of ~15 images, which could grow over time.
I'm looking for a sane way to optimize this:
To avoid the user having to download a big image file on every page view
To reduce load on the server (is this an issue, will the server keep the images in memory?)
The ideas I have so far include:
A timer which loads a different image at set intervals
Progressively loading other images in the background with ajax
Associating images with specific content (pages, tags)
The question is, how to keep it feeling somewhat random, while minimizing page load times and server hit?
I usually avoid sites with huge images; I am very impatient. I would rethink your design.
As a first step you should make sure that the images can be properly cached:
- use sane URLs (no session IDs etc.)
- set appropriate HTTP headers (e.g. ETag)
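If the images are served through a PHP script rather than directly by the web server, a minimal sketch of that header handling could look like this (the file path and the one-day max-age are placeholders):

    <?php
    // Serve an image with ETag / Last-Modified headers so browsers can cache
    // it and revalidate cheaply via 304 responses. Path is a placeholder.
    $file  = __DIR__ . '/backgrounds/photo1.jpg';
    $etag  = '"' . md5_file($file) . '"';
    $mtime = filemtime($file);

    header('Cache-Control: public, max-age=86400');   // cache for one day
    header('ETag: ' . $etag);
    header('Last-Modified: ' . gmdate('D, d M Y H:i:s', $mtime) . ' GMT');

    if (isset($_SERVER['HTTP_IF_NONE_MATCH']) &&
        trim($_SERVER['HTTP_IF_NONE_MATCH']) === $etag) {
        header('HTTP/1.1 304 Not Modified');
        exit;
    }

    header('Content-Type: image/jpeg');
    header('Content-Length: ' . filesize($file));
    readfile($file);

If the files are served directly by Apache or nginx, the same caching headers can of course be configured there instead of in PHP.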
Firstly, hearing that the background images alone are 700 KB astounds me. In addition to the content on screen, that is a pretty heavy site.
For starters, I would try image compression tools. Two come to mind: ImageMagick and PNGCrush. PNGCrush is excellent at stripping all the extraneous metadata attached to photos without compromising photo quality.
I recommend this because compressing the images lets the user download less content, which means quicker load times, which, at the end of the day, is what users want.
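For the JPEG backgrounds, a rough sketch of that kind of compression using PHP's Imagick extension (the paths and the quality value are placeholders; ImageMagick's command-line tools do the same job):

    <?php
    // Strip metadata and recompress a JPEG with PHP's Imagick extension.
    // Input/output paths and the quality setting are placeholders.
    $image = new Imagick('backgrounds/photo1.jpg');
    $image->stripImage();                       // drop EXIF and other metadata
    $image->setImageCompressionQuality(80);     // tune quality vs. file size
    $image->writeImage('backgrounds/photo1-optimized.jpg');
    $image->clear();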
I would also cache the images, such that when a user re-visits the site, the image is already cached on their end. This minimises the HTTP requests that are made each time a user visits your site.
An example of where this technique is used on a commercial site is www.reactive.com. If you look at the /js/headerImages.js file, they make use of image caching. Funnily enough, you will find the same source code at: http://javascript.internet.com/miscellaneous/random-image.html
Considering that you have mentioned that images are randomly loaded, I am assuming you are using a Javascript library such as jQuery to create the effect.
If you are, you can minimize page load times by using a CDN as opposed to referencing a local copy of the jQuery library stored on your server. I performed performance testing on a site I made for a client and, over an average of 20 hits, saved 1.6 seconds through this technique!
Hope that helps for now :)
