The setup:
A blog with posts, built with Laravel, where:
Every post can have a maximum of 1 image (nullable).
The maximum number of posts in the blog is 1000. Let's assume there are 1000 posts for the sake of discussion.
Every post has a comment section, where registered users can comment and include an image in their comment. Let's assume every post has 2 images in the comments.
So in total that comes to 3000 images* that need to be stored (and resized, I guess), presented, etc.
This is the expected amount in the long run; I'm not looking for a "scalable" solution, since there isn't going to be crazy exponential growth.
*In reality it is currently less, and I assume that at these amounts of media files it doesn't really matter whether it's 1000/1500/2000 or 3000. Correct me if that's wrong.
A few extra things to note:
I'm hosting it on shared hosting (I can store up to 300k files).
I want it to be secure, so no malicious file can be uploaded under the cover of an image file.
I'm looking for a budget solution (so if S3 starts charging heavily after 12 months, it's not relevant), preferably free.
So the dilemma is between storing all images locally in the Storage folder (manipulating images with some Laravel package), or using Cloudinary, which I don't know much about, only that it lets me store/manipulate/back up images and use their API to present the images stored there.
If I choose to do it locally: is it safe to store a user-uploaded image locally? How do I make sure it's not malware in the disguise of an image file?
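For context, the kind of server-side check I have in mind is roughly Laravel's built-in upload validation; a rough sketch (the field name, route, and disk are just examples):

```php
<?php
// Rough sketch of Laravel's built-in upload validation (field name, route,
// and disk are only examples, not a prescription).
use Illuminate\Http\Request;
use Illuminate\Support\Facades\Route;

Route::post('/comments', function (Request $request) {
    $request->validate([
        // 'image' checks that the file actually decodes as an image,
        // 'mimes' whitelists types, 'max' caps the size in kilobytes.
        'image' => ['nullable', 'image', 'mimes:jpeg,png,webp', 'max:2048'],
    ]);

    // store() gives the file a random hashed name (so nothing like evil.php
    // survives) under storage/app/public/comment-images.
    $path = $request->file('image')?->store('comment-images', 'public');

    // ... create the comment and save $path on it ...
});
```

As far as I understand, this mainly verifies the file really decodes as an image, whitelists its type, and caps its size; I'm not sure whether that alone is enough.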
With this amount of images/content, could storing locally cause performance issues on the shared hosting?
What would be the advantages of using cloudinary for me?
Thanks.
Cloudinary can actually help a lot in this case.
Instead of storing the resources locally and writing something up to manipulate them, you could integrate Cloudinary in the project.
This would free up server space. Storing images locally may or may not impact performance, depending on the architecture, but freeing server resources is always a good practice.
Also, manipulation and delivery of images can be done on the fly when first requested (or eagerly, before they are requested, if you want) with a simple API call. So you don't need to build something new; you can leverage an existing API.
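For example, a resized variant is usually requested just by putting transformation parameters into the delivery URL; a minimal sketch (the cloud name and public ID are placeholders):

```php
<?php
// Minimal sketch of building a Cloudinary delivery URL with on-the-fly
// transformations. "my-cloud" and "posts/cover_123" are placeholders.
function cloudinaryUrl(string $publicId, int $width): string
{
    $cloudName = 'my-cloud';
    // w_<px> resizes, c_fill crops to fit, q_auto/f_auto pick quality and format.
    return "https://res.cloudinary.com/{$cloudName}/image/upload"
         . "/w_{$width},c_fill,q_auto,f_auto/{$publicId}";
}

echo cloudinaryUrl('posts/cover_123', 400);
```

The 400px variant is generated the first time that URL is hit and is served from the CDN cache afterwards.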
Cloudinary also has a fully featured free tier that you could use. If you don't expect exponential growth at the moment, that tier would be more than enough for the project.
Full disclosure: I'm currently working at Cloudinary, (but the above still holds :) ).
Related
I am developing an eCommerce website in ASP.NET MVC 3 in C#, using SQL Server 2008 R2. My question is: if I have 5 images that I want to show in a GridView with thumbnails (e.g. something like the Amazon website, which gives customers a couple of pictures to view), would it be advisable for the images to come from the database, or should they reside in the Content\Images folder? There are quite a few sub-categories within sub-categories in my db design. What is the most common practice for a professional developer to follow? I know there are a few options for third-party tools like jQuery & Telerik Extensions, so I will use them.
Thanks
From my experience and research, it is better to put them in a folder/content structure. Yes, there are security concerns with opening directories to the public, but if you instead upload the files dynamically via FTP, those problems are solved. I have heard horror stories about storing files in a database, and have seen the issues come up, but have also resolved them. Basically, writing to the database is easier and avoids the security issues of opening a directory to the public; just make sure to regularly check your backups so the files are not corrupt, or keep the data on a failover cluster where that will never be a problem.
So, in summary: the database is fine, just regularly verify your backups by restoring them to confirm they are not corrupt, or run a failover cluster. Otherwise, go with the typical folder/content structure, but use FTP to upload the files so there are no directories open to the public.
For me, the best answer to this question is this paper: To BLOB or Not To BLOB: Large Object Storage in a Database or a Filesystem.
Summary: Application designers often face the question of whether to store large objects in a filesystem or in a database. Often this decision is made for application design simplicity. Sometimes, performance measurements are also used. This paper looks at the question of fragmentation – one of the operational issues that can affect the performance and/or manageability of the system as deployed long term. As expected from the common wisdom, objects smaller than 256K are best stored in a database while objects larger than 1M are best stored in the filesystem. Between 256K and 1M, the read:write ratio and rate of object overwrite or replacement are important factors. We used the notion of “storage age” or number of object overwrites as a way of normalizing wall clock time. Storage age allows our results or similar such results to be applied across a number of read:write ratios and object replacement rates.
I'm not sure if this is the right place for this question, and will be happy to remove the Q if needed.
When a site grows from a just-for-fun project to a site with a bigger load of visitors, and you want to enable them to upload videos, you might find yourself in need of better hosting, including a dedicated server and unlimited web traffic (or some reasonable limit).
So, if people can upload their videos, and if the page has around 1000-10000 visitors per day, what kind of hosting is there to choose from? What is needed in that case?
Thx
You are looking for a scalable solution.
The term cloud hosting comes to mind. Hosting your site in full or in part (only the large media, perhaps) at a cloud provider resolves the problem of server storage limits in the easiest (and cheapest) manner.
I have a website centered around an online chat application where each user can have up to several hundred contacts. Each contact has their own profile image. I want to make it so that the contact's profile image is loaded next to their name. However, having the user download 100+ images every time they load the site seems intensive (studies have shown that as many as 40% of users don't utilize their cache). Each image is around 60x60 pixels in dimension.
When I search on google or sign on to facebook, dozens of images are served nearly instantaneously. Beyond just having fast servers and a good connection, what are the optimal methods for delivering so many images to the user?
Possible approaches I have come up with are:
Storing each user's profile image in a database, constructing one combined image in a PHP file, then having the user download that, and using CSS to display each profile image (a rough sketch of this idea follows this list). However, this seems extremely taxing on the server, and referencing such a large file so many times might take a toll on the user's browser.
Using nginx rather than Apache to serve the images (nginx generally works better for serving static content such as this). However, this seems more like an optimization of a solution, rather than a solution in itself.
I am also aware that data can be delivered across persistent HTTP connections, so multiple requests do not have to be made to the server for multiple files. However, exactly how many files can be delivered across one persistent connection? Would this persistent model mean that just having the images load as separate files would not necessarily be a bad idea?
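To make the first approach concrete, here is a rough GD sketch of what I mean by constructing one combined image in a PHP file (paths and sizes are made up for illustration):

```php
<?php
// Illustrative sketch of the "one combined image" (sprite) idea above.
// Assumes 60x60 avatars whose paths come from the database; values are made up.
$avatarPaths = ['avatars/1.jpg', 'avatars/2.jpg', 'avatars/3.jpg'];
$size   = 60;
$sprite = imagecreatetruecolor($size * count($avatarPaths), $size);

foreach ($avatarPaths as $i => $path) {
    $avatar = imagecreatefromjpeg($path);
    // Copy each avatar into its horizontal slot, resampled to 60x60.
    imagecopyresampled(
        $sprite, $avatar,
        $i * $size, 0, 0, 0,
        $size, $size, imagesx($avatar), imagesy($avatar)
    );
}

header('Content-Type: image/png');
imagepng($sprite);
```

The client would then show contact N by shifting the sprite with CSS background-position, so all avatars arrive in a single request.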
Any suggestions, solutions, and/or notes on personal experiences with relevant matters would be greatly appreciated. Scalability is extremely important here, as well as cross-browser support (IE7+, Opera, Firefox, Chrome, Safari)
EDIT: I AM NOT USING JQUERY.
Here's a jQuery plugin that delays loading images until they're actually needed (i.e., it only loads images "above the fold"):
http://www.appelsiini.net/2007/9/lazy-load-images-jquery-plugin
An alternative may be to use Flash to display just the images. The advantage is that Flash has a much stronger local cache that you can control programmatically.
What techniques do people commonly use for uploading, storing and presenting images with a CMS?
Do you store them in the database or on the file system?
Do you generate thumbnails on upload? Or on the fly, then maybe cache them for reuse? Or rely on browser scaling?
Typically, most content management systems will store the actual data of image uploads on the file system and then add a link to the file within the database. Thumbnails can be generated either on upload or on first request (regenerating them on the fly for every request is considered inefficient, especially given the cheap cost of storage). Relying on browser scaling is a bad idea (images may be uploaded as multi-megabyte uncompressed files), but it is done by some systems.
I agree with Kevin. I can't think of any CMS that doesn't store in the file system. The only issue that comes up with that technique is if you are planning on clustering multiple web servers to run your CMS. If that's the case, then you have to plan for it and have the ability to point all the web servers to the same file storage location.
The technique I've used for years is: on upload, resize the image to something practical for the web, then generate the thumbnail, then write both to the file system and record the pointer in the database.
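As a rough sketch of that flow in plain PHP/GD (the form field, directories, and the images table columns are illustrative only):

```php
<?php
// Rough sketch of the upload flow described above. The 'photo' field,
// the uploads/ directories, and the `images` table are placeholders.
$src = imagecreatefromjpeg($_FILES['photo']['tmp_name']);

// Resize the original down to a practical web width (here capped at 1200px).
$resized = imagescale($src, min(1200, imagesx($src)));

// Generate a small thumbnail as well.
$thumb = imagescale($src, 200);

$name = uniqid('img_', true);
imagejpeg($resized, __DIR__ . "/uploads/{$name}.jpg", 85);
imagejpeg($thumb, __DIR__ . "/uploads/thumbs/{$name}.jpg", 85);

// Only the pointers go into the database, not the image bytes.
$pdo  = new PDO('mysql:host=localhost;dbname=cms', 'user', 'pass');
$stmt = $pdo->prepare('INSERT INTO images (path, thumb_path) VALUES (?, ?)');
$stmt->execute(["uploads/{$name}.jpg", "uploads/thumbs/{$name}.jpg"]);
```

From there, the templates only ever read the stored paths, never the image data itself.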
If the site is a huge site, then you need to serve the images from cache servers, because file systems are very slow in comparison to network IO. Take Facebook, for example: they have billions of images on their site, and last I heard, 80% were held in RAM on cache servers around the world. The file storage array they have is more or less a backup to the cache servers.
My question is about displaying thumbnails and storage.
Let's say I have a website where users can upload photos and view them in albums.
How are the photos usually stored in this scenario? Are the images themselves or are the file paths usually stored in the database?
If the photos are large and you want to display thumbnails, is it better to:
save a copy of the image and a reduced size image, only displaying the larger if requested?
use HTML to reduce the size?
It's almost always a bad idea to store images in a database. BLOBs can really slow down a database something fierce. It also limits your ability to spread storage around different drives. When the files are separate, you can even have one or more separate image servers to reduce the load on the main dynamic server. My recommendations are:
In your database table, have columns for both the directory the image resides in and the image name. That way you are free to change where images are stored, round-robin drives, add more storage later and put new images in the new storage, or whatever you want. Storing the path and the filename in separate fields makes it trivial to move images from one directory to another.
You definitely want to generate thumbnail images to reduce your network bandwidth and make your application run faster. However, you can generate the thumbnails on demand, or when the system load is low. If you're on Linux, ImageMagick is wonderful at automated batch resizing of images. It can even resize by a percentage instead of an absolute amount.
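If you prefer to stay in PHP rather than shelling out, the Imagick extension wraps the same library; a small sketch of a percentage-style resize (the filename and factor are placeholders):

```php
<?php
// Sketch: resize an image to 25% of its original dimensions with Imagick.
// "photo.jpg" and the 0.25 factor are only placeholders.
$img = new Imagick('photo.jpg');

$w = (int) round($img->getImageWidth()  * 0.25);
$h = (int) round($img->getImageHeight() * 0.25);
$img->resizeImage($w, $h, Imagick::FILTER_LANCZOS, 1);

$img->writeImage('photo_thumb.jpg');
```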
Some software such as TikiWiki stores the photos in a database. It then also caches thumbnail sized photos in the database.
Other software stores them in a directory. This is the way Gallery2 operates. I find the directory approach more scalable. If a size different from the original is requested, the app will typically use ImageMagick to resize the photo, and then store a copy of the resized version.
Another alternative is to re-upload the photo to a service like S3, and not store the photo locally at all.
This is a common question and the basic answer is that it depends. You need to give more information. What database are you planning on using? SQL Server 2008 has some good new features for handling this scenario with its FILESTREAM functionality. Generally I prefer to put them in the database, but if you just stuff them in there without thinking about design and access requirements, you could see poor performance as the number of photos increases.
IF you are absolutely positively sure that your web server will always have access to the file system hosting the images, then go that route. Maybe.
However, if at any time you think you might need to, I don't know, create an image server because the hard drive on your web server is running out of space, OR you need to run multiple web servers, then save yourself the trouble and store them in a database. The hard part of storing on a file system is the security requirements of crossing the network.
Also, bear in mind that not all database servers are created equal in this regard. SQL 2008 introduced a FILESTREAM data type which actually stores the images on the local file system while allowing all read / write access through the db server. This has the added benefit of allowing you to run virus scanners on the incoming files while in storage.
Oracle has had some nice file storage facilities for a while now. MySQL? I don't think I'd want to try, but you might be okay.
As to the second question: save a thumbnail along with the image. This process occurs only once per image and saves on presentation bandwidth. Using HTML to size an image down really does nothing for the client.