Handling image content in a GAE web application

I am building a new web application using Google App Engine, JSF 2, PrimeFaces, and Java. I have to build a dynamic image gallery within this web application, i.e. registered users should be able to load images into a gallery, and the gallery should then be available for viewing by the public.
My issues are these:
1) App Engine allows only a maximum 1 MB file to be written to its blobstore at a time.
2) App Engine doesn't allow writing to the server file system.
3) Should I store each image as a blob in the GAE database? But if I do that, the whole application will be very slow, as there can be a lot of images, so reading the images from the blobstore will make it slow and will cost a lot of processing power.
I am really confused about a proper solution and couldn't find any proper recommendation on the web. I am pretty sure there is a decent solution available!
It would be a great help if someone with prior experience in building web applications that deal with a lot of image content could advise me on a good solution.

The solution is to use upload URLs when users are uploading images, so that they get uploaded directly to the Blobstore:
https://developers.google.com/appengine/docs/java/blobstore/#Java_Uploading_a_blob
Then just reference the blobs directly via getServingUrl() rather than having your application try to read them into memory.
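For illustration, here is a minimal sketch of that flow, assuming a servlet mapped to /upload and a form field named "image" (both names are hypothetical, not fixed by the API):

    // Sketch of the Blobstore upload-URL flow; the /upload path and the
    // "image" field name are assumptions for this example.
    import java.io.IOException;
    import java.util.List;
    import java.util.Map;
    import javax.servlet.http.HttpServlet;
    import javax.servlet.http.HttpServletRequest;
    import javax.servlet.http.HttpServletResponse;
    import com.google.appengine.api.blobstore.BlobKey;
    import com.google.appengine.api.blobstore.BlobstoreService;
    import com.google.appengine.api.blobstore.BlobstoreServiceFactory;
    import com.google.appengine.api.images.ImagesService;
    import com.google.appengine.api.images.ImagesServiceFactory;
    import com.google.appengine.api.images.ServingUrlOptions;

    public class UploadServlet extends HttpServlet {
        private final BlobstoreService blobstore =
                BlobstoreServiceFactory.getBlobstoreService();
        private final ImagesService images =
                ImagesServiceFactory.getImagesService();

        @Override
        protected void doGet(HttpServletRequest req, HttpServletResponse resp)
                throws IOException {
            // One-time upload URL: the browser posts the file straight to the
            // Blobstore, so the 1 MB per-call limit never applies to your code.
            String uploadUrl = blobstore.createUploadUrl("/upload");
            resp.setContentType("text/html");
            resp.getWriter().println("<form action='" + uploadUrl
                    + "' method='post' enctype='multipart/form-data'>"
                    + "<input type='file' name='image'>"
                    + "<input type='submit' value='Upload'></form>");
        }

        @Override
        protected void doPost(HttpServletRequest req, HttpServletResponse resp)
                throws IOException {
            // After the upload, Blobstore redirects here with the blob keys.
            Map<String, List<BlobKey>> uploads = blobstore.getUploads(req);
            BlobKey key = uploads.get("image").get(0);
            // Hand out a serving URL instead of reading the blob into memory;
            // the image service serves (and can resize) the bytes for you.
            String servingUrl = images.getServingUrl(
                    ServingUrlOptions.Builder.withBlobKey(key));
            resp.sendRedirect(servingUrl);
        }
    }

You can compute the serving URL once at upload time and store it with the gallery entry, so page rendering never touches the blob itself.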

You really should use GCS (Google Cloud Storage), and not the Blobstore. The call from a user's browser to an image stored in GCS goes directly there and is not charged to your app.
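For what that looks like in code, here is a hedged sketch using the App Engine Java images API; the bucket and object names are hypothetical:

    // Two ways to point a browser at a GCS-hosted image; the bucket and
    // object names are hypothetical examples.
    import com.google.appengine.api.images.ImagesService;
    import com.google.appengine.api.images.ImagesServiceFactory;
    import com.google.appengine.api.images.ServingUrlOptions;

    public class GcsLinks {
        // Direct link: works if the object is publicly readable, and the
        // bytes are served by GCS rather than by your app.
        public static String publicUrl(String bucket, String object) {
            return "https://storage.googleapis.com/" + bucket + "/" + object;
        }

        // Serving URL via the images service; "/gs/<bucket>/<object>" is
        // App Engine's path form for files stored in GCS.
        public static String servingUrl(String bucket, String object) {
            ImagesService images = ImagesServiceFactory.getImagesService();
            return images.getServingUrl(ServingUrlOptions.Builder
                    .withGoogleStorageFileName("/gs/" + bucket + "/" + object));
        }
    }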

Related

Storing files on a web server

I have a project using the MEAN stack that uploads image files to a server and the names of the images to a database. The images are then shown to users of the application, kind of like an image gallery.
I have been trying to figure out an efficient way of storing the image files. At the moment I'm storing them under the Angular application in the folder /var/www/app/files.
What are the usual ways of storing them on a cloud server like DigitalOcean, Heroku and many others?
I'm a bit thrown off by the fact that they offer many options for data storage.
Let's say that hundreds of thousands of images were uploaded through the application to the server.
Saving all of them inside your front-end app in a subfolder might not be the best solution? Or am I wrong about this?
I am very new to these cloud web services and how they actually operate.
Can someone clarify what the optimal solution would be?
Thanks!
Saving all of them inside your front-end app in a subfolder might not be the best solution?
You're very right about this. Over time this will get cluttered, and unless you use some very convoluted logic, will slow down your server.
If you're using Angular and this is in the public folder sent to every user, this is even worse.
The best solution to this is using something like an AWS S3 bucket (DigitalOcean has Block Storage, and I believe Heroku offers something a bit different). These services offer file storage and essentially act as logic-less servers. You can set some privacy policies and other settings, but there's no runtime like Node.js that can run logic for you.
Your Node server (or any other server setup) interfaces with this storage server and handles most of the fetching and storing of files. You can optionally lock the storage service down so it can only communicate with your Node server, so that all file traffic goes through your Node server, as sketched below.
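A rough sketch of that interface, using the AWS SDK for Java v1 for illustration (the bucket name, key, and expiry are assumptions; the Node SDK exposes equivalent putObject and getSignedUrl calls):

    // Sketch: the server stores an upload in S3 and returns a time-limited
    // pre-signed URL so the browser can fetch the file directly from S3.
    // The bucket name and the 15-minute expiry are arbitrary choices.
    import java.io.File;
    import java.net.URL;
    import java.util.Date;
    import com.amazonaws.HttpMethod;
    import com.amazonaws.services.s3.AmazonS3;
    import com.amazonaws.services.s3.AmazonS3ClientBuilder;

    public class ImageStore {
        private static final String BUCKET = "my-image-bucket"; // hypothetical
        private final AmazonS3 s3 = AmazonS3ClientBuilder.defaultClient();

        public void store(String key, File image) {
            // Only the server holds write credentials; clients never write
            // to the bucket directly.
            s3.putObject(BUCKET, key, image);
        }

        public URL urlFor(String key) {
            // Pre-signed GET, valid for 15 minutes; after that the link dies.
            Date expiry = new Date(System.currentTimeMillis() + 15 * 60 * 1000);
            return s3.generatePresignedUrl(BUCKET, key, expiry, HttpMethod.GET);
        }
    }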

Caching on web player

I'm creating a runtime OBJ importer in Unity, but I need to create a cache system for my Web Player build. I tried PlayerPrefs, but it is too small to store all the information. Is there another way to store information on the client side, close to 10 MB?
Client-side caching is usually done in memory for speed. If your data can't fit in memory, then you'll need to write it to the local drive or send it to the back-end.
Since local file storage is unavailable in the Unity3D Web Player, you'll have to either send it to some sort of back-end or develop an in-memory store by hand.
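If you go the in-memory route, the usual shape is a least-recently-used cache with a byte budget. Unity scripting is C#, so the following is only a language-agnostic sketch (written in Java here); the 10 MB cap and class name are arbitrary:

    // Minimal byte-budgeted LRU cache sketch; e.g. new ByteLruCache(10L * 1024 * 1024).
    import java.util.LinkedHashMap;
    import java.util.Map;

    public class ByteLruCache extends LinkedHashMap<String, byte[]> {
        private final long maxBytes;
        private long currentBytes = 0;

        public ByteLruCache(long maxBytes) {
            super(16, 0.75f, true); // access-order iteration = LRU order
            this.maxBytes = maxBytes;
        }

        @Override
        public byte[] put(String key, byte[] value) {
            byte[] old = super.put(key, value);
            currentBytes += value.length - (old == null ? 0 : old.length);
            // Evict least-recently-used entries until within budget; a single
            // oversized entry is kept rather than looping forever.
            while (currentBytes > maxBytes && size() > 1) {
                Map.Entry<String, byte[]> eldest = entrySet().iterator().next();
                currentBytes -= eldest.getValue().length;
                remove(eldest.getKey());
            }
            return old;
        }
    }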
Hope it helps.

Amazon AWS and usage model for S3 storage

There is this example from Amazon of a high-traffic web application. I noticed that they are using S3 as their content delivery method. I was wondering whether I need to have a web server for the content delivery and a web app for my application. I don't understand why they have two web servers and two web apps in the diagram.
Also, what is the best way to set up a website that serves images and static content through S3 and the rest of the content through regular storage?
My last question is: can I consider S3 a main storage, reliable enough that I can keep only my static content there and not have normal storage as a backup?
That is a very general diagram; specific diagrams will vary depending on the specifics of the overall architecture.
Having said that, I believe the Web Server represents something like Apache or Nginx, and the App Server represents something like Rails, Rack Server, Unicorn, Gunicorn, Django, Sinatra, Flask, Jetty, Tomcat, etc. In some cases you can merge the Web Server and the App Server together, for example by deploying Apache with Python's mod_wsgi to run your Django app. (So it depends on the architecture.)
what is the best way to set up a website that serves images and static content through S3 and the rest of the content through regular storage?
There's no single best way beyond pointing your dynamic content to your databases (SQL and NoSQL) and pointing your static files to an S3 bucket (images, CSS, jQuery code, etc.). You can also use third-party modules depending on your application stack; for example, you can accomplish this in Django with the django-storages module, and you can find similar modules for other app stacks like Rails.
My last question is: can I consider S3 a main storage, reliable enough that I can keep only my static content there and not have normal storage as a backup?
S3 is pretty reliable: Amazon quotes 99.999999999% durability for your data. That goes down if you use RRS (Reduced Redundancy Storage), but if you use RRS you probably want to back your data up in a non-RRS bucket anyway. In any case, if it's extremely critical data, you are more than free to back it up somewhere else just in case.
Notice in the diagram that they also recommend using CloudFront for your static files and this is especially useful if your users will be accessing your application from different geographical areas.
Hope this helps.

Storing Images Externally

I have a page without much bandwidth, so I want to store the images externally on another server that offers unlimited bandwidth. Any suggestions on how to do this, or maybe a better solution?
Image storage on different server
Check out the similar post linked above: Facebook does it, Google does it, so storing images on another server is a proven approach. You can link to the images on the external server dynamically or statically, and that's all you need to do. You do need to decide on the hierarchy in which the images will be stored on the external server.

What are the best practices for image serving?

What techniques do people commonly use for uploading, storing and presenting images with a CMS?
Do you store them in the database or on the file system?
Do you generate thumbnails on upload? Or on the fly, then maybe cache them for reuse? Or rely on browser scaling?
Typically, most content management systems will store the actual data of image uploads on the file system and then add a link to the file within the database. Thumbnails can be generated either on upload or on first request (generating on the fly for every request is considered inefficient, especially given the cheap cost of storage). Relying on browser scaling is a bad idea (images may be uploaded as multi-megabyte uncompressed files) but is done by some systems.
I agree with Kevin. I can't think of any CMS that doesn't store images in the file system. The only issue that comes up with that technique is if you are planning on clustering multiple web servers to run your CMS; if that's the case, you have to plan for it and be able to point all the web servers to the same file storage location.
The technique I've used for years is: on upload, resize the image to something practical for the web, then generate the thumbnail, then write both to the file system and record the pointers in the database.
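A minimal sketch of that upload-time pipeline, assuming plain javax.imageio (the target widths, output names, and directory are arbitrary examples):

    // Resize on upload, then write a web-sized copy and a thumbnail to disk;
    // store the resulting file paths in the database, not the image bytes.
    import java.awt.Graphics2D;
    import java.awt.Image;
    import java.awt.image.BufferedImage;
    import java.io.File;
    import java.io.IOException;
    import javax.imageio.ImageIO;

    public class UploadResizer {
        static BufferedImage scaleToWidth(BufferedImage src, int width) {
            int height = Math.max(1, src.getHeight() * width / src.getWidth());
            BufferedImage out = new BufferedImage(width, height,
                    BufferedImage.TYPE_INT_RGB);
            Graphics2D g = out.createGraphics();
            g.drawImage(src.getScaledInstance(width, height, Image.SCALE_SMOOTH),
                    0, 0, null);
            g.dispose();
            return out;
        }

        public static void process(File upload, File outDir) throws IOException {
            BufferedImage original = ImageIO.read(upload);
            ImageIO.write(scaleToWidth(original, 1024), "jpg",
                    new File(outDir, "web_" + upload.getName()));
            ImageIO.write(scaleToWidth(original, 160), "jpg",
                    new File(outDir, "thumb_" + upload.getName()));
        }
    }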
If the site is huge, then you need to serve the images from cache servers, because file systems are very slow in comparison to network I/O. Take Facebook, for example: they have billions of images on their site, and last I heard, 80% were held in RAM on cache servers around the world. Their file storage array is more or less a backup to the cache servers.
