How to upload huge files to a web server - FTP

I have a virtual machine on Google Cloud and I created a webserver on this machine (Ubuntu 12.04). I will serve my website from this machine.
My website shows huge images in the JPEG 2000 format. My website also lets users upload their own images and share them with other people.
The problem is that the images are about 1-3 GB each, and I cannot use standard file upload methods (PHP file upload), because when the connection drops, the upload has to start over from the beginning. So I need a better way.
I am thinking about the Google Drive API. Suppose I create a common Google Drive account and users upload to this account from my website using the Google Drive API. Would that be a good approach?

Since you're uploading files to Drive, you can use the Upload API with uploadType=resumable.
Resumable upload (uploadType=resumable) is for reliable transfer, which is especially important with larger files. With this method, you use a session-initiating request, which can optionally include metadata. This is a good strategy for most applications, since it also works for smaller files at the cost of one additional HTTP request per upload.
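As a rough sketch of that flow (not a drop-in implementation), assuming a valid OAuth 2.0 access token is already at hand and using the Drive v3 upload endpoint:

    // Start a Drive v3 resumable upload session (Node.js 18+, built-in fetch).
    // ACCESS_TOKEN is assumed to be a valid OAuth 2.0 token obtained elsewhere.
    const ACCESS_TOKEN = process.env.ACCESS_TOKEN;

    async function startResumableSession(fileName, mimeType) {
      // Step 1: initiate the session; Drive returns the session's upload URL
      // in the Location header. Interrupted transfers resume against that URL.
      const res = await fetch(
        'https://www.googleapis.com/upload/drive/v3/files?uploadType=resumable',
        {
          method: 'POST',
          headers: {
            'Authorization': 'Bearer ' + ACCESS_TOKEN,
            'Content-Type': 'application/json; charset=UTF-8',
            'X-Upload-Content-Type': mimeType,
          },
          body: JSON.stringify({ name: fileName }), // optional file metadata
        }
      );
      return res.headers.get('Location');
    }

The client then PUTs the file bytes to that URL (optionally in chunks with a Content-Range header); after a dropped connection it asks the same URL how many bytes arrived and resumes from there.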
However, do note that there's a storage limit for the account. If you want more capacity, you'll have to purchase it.


How to pipe upload stream of large files to Azure Blob storage via Node.js app on Azure Websites?

I am building a video service using Azure Media Services and Node.js.
Everything went semi-fine till now; however, when I tried to deploy the app to Azure Web Apps for hosting, any large file fails with a 404.13 error.
Yes, I know about maxAllowedContentLength, and no, that is NOT a solution. It only goes up to 4 GB, which is pathetic; even HDR environment maps can easily exceed that amount these days. I need to enable users to upload files up to 150 GB in size. However, when Azure Web Apps receives a multipart request, it appears to buffer it into memory until it hits a certain threshold of either bytes or seconds (upon which it returns a 404.13, or a 502 if my connection is slow) BEFORE running any of my server logic.
I tried the Transfer-Encoding: chunked header in the server code, but even if that would help, it doesn't actually matter, since Web Apps doesn't let my code run in the first place.
For the record: I am using Sails.js on the backend, and Skipper is handling the stream piping to the Azure Blob service. Localhost obviously works just fine regardless of file size. I made a duplicate of this question on the MSDN forums, but those are as slow as always; you can go there to see what I have found so far.
Client-side, I am using Ajax FormData to serialize the fields (one text field and one file) and send them, using the progress event to track upload progress.
Is there ANY way to make this work? I just want it to let my server-side logic handle the data stream, without buffering the bloody thing.
Rather than running all this data through your web application, you would be better off having your clients upload directly to a container in your Azure Blob Storage account.
You will need to enable CORS on your Azure Storage account to support this. Then, in your web application, when a user needs to upload data, you would instead generate a SAS (shared access signature) token for the storage account container you want the client to upload to, and return that token to the client. The client would then use the SAS token to upload the file into your storage account.
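A minimal server-side sketch of the SAS step, using the azure-storage Node package; the container/blob names are placeholders, and the account credentials are assumed to live in environment variables:

    // Generate a short-lived, write-only SAS URL for a single blob.
    const azure = require('azure-storage');

    function getUploadUrl(containerName, blobName) {
      // Reads AZURE_STORAGE_ACCOUNT / AZURE_STORAGE_ACCESS_KEY from the environment.
      const blobService = azure.createBlobService();
      const start = new Date();
      const expiry = new Date(start.getTime() + 60 * 60 * 1000); // valid for one hour

      const sasToken = blobService.generateSharedAccessSignature(containerName, blobName, {
        AccessPolicy: {
          Permissions: azure.BlobUtilities.SharedAccessPermissions.WRITE,
          Start: start,
          Expiry: expiry,
        },
      });

      // The client uploads the file directly to this URL, bypassing the web app.
      return blobService.getUrl(containerName, blobName, sasToken);
    }

Scoping the token to write-only access and a short expiry limits what a leaked URL can do.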
On the back-end, you could fire off a WebJob to do whatever processing you need on the file after it's been uploaded.
Further details and sample Ajax code to do this are available in this blog post from the Azure Storage team.
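In the same spirit, a bare-bones client-side sketch (a single PUT like this only works up to the single-call blob size limit; 150 GB files would use the Put Block / Put Block List pattern, uploading the file in blocks and then committing the block list):

    // Upload straight to Blob Storage using the SAS URL returned by the server.
    async function uploadWithSas(sasUrl, file) {
      const res = await fetch(sasUrl, {
        method: 'PUT',
        headers: { 'x-ms-blob-type': 'BlockBlob' }, // required by the Put Blob operation
        body: file, // a File object from an <input type="file">
      });
      if (!res.ok) throw new Error('Upload failed with status ' + res.status);
    }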

OSX: How to easily view thumbnails of lots of images in an S3 bucket

In my company we need to share images with clients. This is done via S3 buckets, using Transmit to upload the images.
The administrators want to be able to see thumbnails of all the images. I thought mounting the bucket and viewing it in Finder would work (I imagined that Finder would cache the thumbnails after visiting for the first time), but it only works with small numbers of images, not thousands. It doesn't seem to cache the thumbnails and takes a long time to generate them. (I also tried this using ExpanDrive, which had the same issues.)
Anyone got a solution to this?
There is no automated capability in Amazon S3 to display thumbnails of files within S3 buckets, but there are a few workarounds.
One option, as you have done, is to mount an Amazon S3 bucket as a drive and let the operating system display the pictures, but this requires a lot of communication between your computer and S3, because it actually needs to download each picture.
Or, you could download all the pictures to a local drive using the aws s3 sync command from the AWS Command-Line Interface (CLI), e.g. aws s3 sync s3://your-bucket ./local-folder. That way, things will work faster (but with the downside of having to download all the files).
An alternative would be to write a small web app (possibly just a JavaScript page) that obtains a list of all the pictures and then displays them in the browser. You can create thumbnails on the fly using services like Cloudinary or Imgix.
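A bare-bones sketch of that last idea with the AWS SDK for JavaScript v3; the bucket name and region are placeholders, and it assumes the objects are publicly readable (otherwise swap in pre-signed URLs):

    // List every object in a bucket and emit <img> tags pointing at the originals.
    const { S3Client, paginateListObjectsV2 } = require('@aws-sdk/client-s3');

    async function buildGallery(bucket, region) {
      const client = new S3Client({ region });
      const tags = [];
      // paginateListObjectsV2 transparently follows continuation tokens,
      // so buckets with thousands of objects are handled in one loop.
      for await (const page of paginateListObjectsV2({ client }, { Bucket: bucket })) {
        for (const obj of page.Contents ?? []) {
          const url = `https://${bucket}.s3.${region}.amazonaws.com/${encodeURIComponent(obj.Key)}`;
          tags.push(`<img src="${url}" width="150" loading="lazy">`);
        }
      }
      return tags.join('\n');
    }

With loading="lazy" the browser only fetches pictures as they scroll into view, which keeps galleries of thousands of images workable.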
Usually, we don't like tool requests on Stack Overflow. But here you have a nice list:
Cyberduck (supports S3 connections; also available as a paid version in the App Store)
MCSTools-s3 (App Store)
3hub (App Store)
Bucket Explorer
Also, make sure to do a Google search for "s3 browser mac".

Create Google Drive upload ability

I created a little web system, written in PHP. Now we want to allow our clients to access our little system. On each job, they must upload one or more files (100-200 MB max size per file). Until now, I uploaded them via a PHP component to our server, but we have a lot of trouble with our ISP, so I decided to use a free Google Drive account. So, I read tutorials, but I cannot understand clearly:
Is there a way to upload a file from the client browser directly to Google Drive, without uploading it to our server first? As far as I can see, I can use the PHP library to operate on my Google Drive and upload files, but - unfortunately - the files must be on our server first, which is my big problem.
Big thanks in advance to everyone who can help us.
Direct upload from JavaScript to Drive is very easy. Check out the Drive Picker at https://developers.google.com/picker/docs/ which does everything for you.
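A short sketch of what that looks like in the browser; the oauthToken and developerKey names are placeholders that come from your own auth setup:

    // Open the Google Picker with an upload view, so files go from the user's
    // browser straight into Drive without touching your PHP server.
    // Assumes https://apis.google.com/js/api.js is already loaded on the page.
    gapi.load('picker', function () {
      var picker = new google.picker.PickerBuilder()
          .addView(new google.picker.DocsUploadView())
          .setOAuthToken(oauthToken)     // OAuth 2.0 token from your auth flow
          .setDeveloperKey(developerKey) // your API key
          .setCallback(function (data) {
            if (data.action === google.picker.Action.PICKED) {
              console.log('Uploaded file id: ' + data.docs[0].id);
            }
          })
          .build();
      picker.setVisible(true);
    });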

Photo and Video hosting on a shared server?

I need to have several videos and photos on my website.
Primarily, videos will number more than 100, and photos might number more than 10,000.
Since I am using shared server hosting, I can't have enough space to upload them to my server, nor will the performance be any good.
Hence I decided I can upload the videos to YouTube and embed them in my web site.
However, the problem is with the photographs. Which would be the best photo-storage web service that would:
A) Have a web API
B) Have no need to create a badge, or in any way not make it obvious that the photo comes from another source
C) Offer unlimited storage
D) Offer high-performance retrieval
Picasa fits all your requirements except unlimited storage, but you get a lot of space and can buy extra storage for little money.
I don't know if you want a cloud-based service or a self-hosted service?
If you want to self-host an image hosting service, you can check out this project, ImageS3: web admin console, REST API, and of course no usage limitations on the APIs.
If you wish to use a cloud-based image hosting service, you can use imgur.com, imageshack.com, or www.imgix.com.

Hosting web site images: Flickr PRO, Amazon S3 or...?

I'd like to save some of my site's monthly bandwidth allocation, and I'm wondering if I can use Flickr PRO or whether I should rely on Amazon S3 as a hosting service for my web site's images. (My web application allows users to upload their own pictures, and at the moment it's managing around 40 GB of data.)
I've never used Amazon's services, and I like the idea of using the Flickr REST API to dynamically upload images from my web app.
I also like the idea of having virtually unlimited space to store images on Flickr for only $25/year, but I'm not sure if I can use their service on my web site.
I think my account could be banned if I use Flickr's services to store images (uploaded by users of my website) that are not only for 'personal use'.
What's your experience? Would you suggest other services rather than Amazon's S3, or is this the only available option at the moment?
Thanks
edit: Flickr explicitly says 'Don’t use Flickr for commercial purposes'. You could always contact them and ask them to evaluate your request, but it sounds to me like I can't use their services to achieve what I want. S3 looks like the way to go then...
Even though a rough estimate of what I'm going to spend every month is still scary:
5,000 visits/day
* 400 img/user (avg 50 kB/image)
* 30 days
= ~3 TB of traffic
* $0.15/GB (Amazon S3)
= ~$429/month
Is there any cheaper place to host my images?
400 images per user seems high? Is that figure from actual stats?
Amazon S3 is great and it just works!
A possible cheaper option is Google. Google Docs now supports all file types, so you can upload the images to a Google Docs folder and share the folder for public access. The URLs are kind of long, e.g.:
http://lh6.ggpht.com/VMLEHAa3kSHEoRr7AchhQ6HEzHVTn1b7Mf-whpxmPlpdrRfPW216UhYdQy3pzIe4f8Q7PKXN79AD4eRqu1obC7I
Add the =s parameter to scale the image, cool! E.g., for 200 pixels wide:
http://lh6.ggpht.com/VMLEHAa3kSHEoRr7AchhQ6HEzHVTn1b7Mf-whpxmPlpdrRfPW216UhYdQy3pzIe4f8Q7PKXN79AD4eRqu1obC7I=s200
Google only charges USD 5/year for 20 GB. There is a full API for uploading docs, etc.
I love Amazon S3. There are so many great code libraries (LitS3), browser plugins (S3Fox), and upload widgets (Flajaxian) that make it really easy to use.
And you only pay for what you use. I use it a lot and have only ever experienced downtime on one occasion.
Nirvanix is an S3 competitor. I haven't used them, but they have a bit more functionality (image resizing, etc.).
Edit: The link about Nirvanix is dead now (2015/07/21), because Nirvanix is dead.
