create google drive upload ability - google-api

I created a little web system written in PHP. Now we want to allow our clients to access our little system. For each job, they must upload one or more files (100-200 MB max size per file). Until now, I have uploaded them via a PHP component to our server, but we have a lot of trouble with our ISP, so I decided to use a free Google Drive account. I have read the tutorials, but one thing is not clear to me:
Is there a way to upload a file from the client's browser directly to Google Drive, without uploading it to our server first? As far as I can see, I can use the PHP library to work with my Google Drive and upload files, but, unfortunately, the files must be on our server first, which is my big problem.
Big thanks in advance to everyone who can help us.

Direct upload from JavaScript to Drive is very easy. Check out the Drive Picker at https://developers.google.com/picker/docs/, which does everything for you.
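For smaller files, the direct browser-to-Drive upload can also be done by hand against the Drive v3 multipart endpoint. A minimal sketch, assuming you have already obtained an OAuth access token (how you get `accessToken` is outside this sketch): the file goes straight from the browser to Drive and never touches your own server.

```javascript
// Pure helper: build a multipart/related body (JSON metadata + file bytes),
// as expected by the Drive v3 uploadType=multipart endpoint.
function buildMultipartBody(metadata, fileContent, boundary) {
  return [
    `--${boundary}`,
    'Content-Type: application/json; charset=UTF-8',
    '',
    JSON.stringify(metadata),
    `--${boundary}`,
    'Content-Type: application/octet-stream',
    '',
    fileContent,
    `--${boundary}--`,
    ''
  ].join('\r\n');
}

// Browser-side upload: accessToken is an assumption (obtained elsewhere,
// e.g. via Google's OAuth flow); `file` is a File from an <input type="file">.
async function uploadToDrive(accessToken, file) {
  const boundary = 'drive_upload_boundary';
  const body = buildMultipartBody({ name: file.name }, await file.text(), boundary);
  const res = await fetch(
    'https://www.googleapis.com/upload/drive/v3/files?uploadType=multipart',
    {
      method: 'POST',
      headers: {
        Authorization: `Bearer ${accessToken}`,
        'Content-Type': `multipart/related; boundary=${boundary}`
      },
      body
    }
  );
  return res.json(); // Drive file resource (id, name, ...)
}
```

Note that the multipart endpoint is only sensible for small files; for the 100-200 MB files in the question, a resumable upload session is the better fit.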

Related

Is it possible to transfer a file from Google Drive to a Windows Server running IIS?

I generate some files daily in Google Apps Script, such as Sheets, Docs, and reports, and I want to know how to transfer these files directly to an IIS server. I don't expect a ready-made solution, but if you can show me where to start or which technology I should use, I will be grateful.
I am thinking about something like the fetch API in Google Apps Script, but I don't know how to "tell the server" to collect and save the file.
You can try the Google Drive API: download the file from Google Drive first, and then transfer it to the Windows Server IIS machine.
For more information, you can refer to this link: https://developers.google.com/drive.
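A minimal Node-style sketch of that download-then-push flow, assuming global `fetch` (Node 18+), a valid OAuth access token, and a hypothetical upload handler on the IIS side that saves the request body (`iisUploadUrl` is a placeholder you would implement in ASP.NET or similar). Note that Google-native files (Sheets, Docs) cannot be fetched with `alt=media`; they must be exported to a concrete format instead.

```javascript
// Pure helper: Drive v3 "download media" URL for a binary file id.
function driveMediaUrl(fileId) {
  return `https://www.googleapis.com/drive/v3/files/${encodeURIComponent(fileId)}?alt=media`;
}

// Pure helper: Drive v3 export URL for Google-native files (Sheets, Docs),
// which must be converted to a concrete MIME type, e.g. application/pdf.
function driveExportUrl(fileId, mimeType) {
  return `https://www.googleapis.com/drive/v3/files/${encodeURIComponent(fileId)}/export?mimeType=${encodeURIComponent(mimeType)}`;
}

// Download from Drive, then POST the bytes to the IIS endpoint.
// accessToken and iisUploadUrl are assumptions supplied by the caller.
async function transferToIis(accessToken, fileId, iisUploadUrl) {
  const driveRes = await fetch(driveMediaUrl(fileId), {
    headers: { Authorization: `Bearer ${accessToken}` }
  });
  const content = await driveRes.arrayBuffer();

  // The receiving endpoint is hypothetical: a handler on the IIS box
  // that reads the request body and writes it to disk.
  const iisRes = await fetch(iisUploadUrl, {
    method: 'POST',
    headers: { 'Content-Type': 'application/octet-stream' },
    body: content
  });
  return iisRes.status;
}
```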

Is there a way to create a custom download link for a Google Drive file with my own domain?

Sorry for my poor knowledge about this. I would appreciate it if anybody could explain how I can create a custom download link for a Google Drive file. I got a domain from freenom.com and I want to change the download link of a Google Drive file to use my domain name in the URL.
What I'm trying to do is avoid the final docs.googleusercontent.com/docs/... URL by using a custom link with my domain.
Is that possible?
I know it sounds crazy, but in my workplace they block docs.google URLs with a proxy filter, along with other cloud services like multcloud.com and koofr.eu. Besides, using a web proxy or VPN looks very suspicious to them.
So I need a way to download my files with my own URL without leaving a suspicious trace.
I know it's a nightmare, but you would never want to live here. The only internet connection where I can download anything is at my workplace.
Any solution with Google Colab may help me too. Maybe I could create a temporary server via Python in Google Colab and assign my domain name to it. With flask-ngrok I created one in Google Colab, but the link I obtain goes through the ngrok domain and is blocked too.
The objective is to create a custom download link to one file that doesn't leave a suspicious trace, and maybe later replace the file in Google Drive, renaming it so that it keeps the same path.
Any help will be appreciated.
Thanks for your time.
Answer
I did it, finally I did it. I'm a beginner, so for me, finding a method to circumvent the stupid proxy filters of my workplace, without leaving anything suspicious, in order to download files from the internet is great!!! Wonderful!!! I also have to say that in my country the price of internet access is madness. Imagine having to pay the equivalent of 250 dollars of your salary for only 2 gigabytes of data usage on your cell phone. And also imagine having to pay the equivalent of 375 dollars of your salary monthly for only 30 hours per month over an ADSL modem connection at the ridiculous speed of 1 Mbit/s. Only at my workplace is the internet free and fast (at least for me). Add to that the fact that they watch every move you make in your internet traffic.
As a communist country they control everything, and at my workplace they have a man whose only job is censoring the internet, blocking any suspicious page they find, and if they catch you using a VPN or web proxy you get into big trouble. They block Google Drive, Dropbox, and every other cloud storage service. They also block any site you could use to download files if that site is not related to your work.
But hey, I finally found a way.
First I got some free domains from freenom.com and gave them names related to my work, in order not to leave any suspicious trace in my web traffic.
Then I got a free hosting service and associated each domain with a WordPress site, retouched to look like something related to my work.
Then I used the internet on my cell phone to download files from the internet to my Google Drive account using Google Colab, in order to store my files.
After that I used Google Colab to upload my files from Google Drive to the FTP account of one of the WordPress sites I had created, and then I created download links for all my files with my own domain name.
And finally, at my workplace, I prepared a curl download script to download my files intermittently, so as not to give the appearance of a large, traffic-consuming download.
In the end I download the equivalent of 1 GB per day. Sad but true. I'm now working on a way to create a free SSL connection for my downloads, with an authentication system that won't let the censor access my links. Maybe with a WordPress plugin.

How to upload huge files to a webserver

I have a virtual machine on Google Cloud and I created a webserver on this machine (Ubuntu 12.04). I will serve my website from this machine.
My website shows huge images in the JPEG 2000 format. My website also lets users upload their images and share them with other people.
The problem is that the images are about 1-3 GB in size, and I cannot use standard file upload methods (PHP file upload), because when the connection is lost, the upload starts again from the beginning. So I need a better way.
I am thinking about the Google Drive API. If I create a common Google Drive account and users upload to this account from my website using the Google Drive API, will that be a good way?
Since you're uploading files to Drive, you can use the Upload API with uploadType=resumable.
Resumable upload (uploadType=resumable): for reliable transfer, which is especially important with larger files. With this method, you use a session-initiating request, which can optionally include metadata. This is a good strategy for most applications, since it also works for smaller files, at the cost of one additional HTTP request per upload.
However, do note that there's a storage limit on the account. If you want more capacity, you'll have to purchase it.
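A minimal sketch of that flow in browser JavaScript, assuming you already have an OAuth access token (`accessToken` is a placeholder): a session-initiating POST against the Drive v3 upload endpoint returns the session URI in the Location header, and the file is then sent in Content-Range-labelled chunks. If the connection drops, the same session URI can be used to resume instead of starting over.

```javascript
// Pure helper: Content-Range header for one chunk.
// Per the Drive docs, chunk sizes should be multiples of 256 KiB
// (except the final chunk).
function contentRange(start, chunk, total) {
  const end = Math.min(start + chunk.byteLength, total) - 1;
  return `bytes ${start}-${end}/${total}`;
}

// `file` is a File from an <input type="file">; chunkSize defaults to 2 MiB.
async function resumableUpload(accessToken, file, chunkSize = 8 * 256 * 1024) {
  // 1. Initiate the session; metadata is optional here.
  const init = await fetch(
    'https://www.googleapis.com/upload/drive/v3/files?uploadType=resumable',
    {
      method: 'POST',
      headers: {
        Authorization: `Bearer ${accessToken}`,
        'Content-Type': 'application/json; charset=UTF-8'
      },
      body: JSON.stringify({ name: file.name })
    }
  );
  const sessionUri = init.headers.get('Location');

  // 2. Send the file chunk by chunk against the session URI.
  //    (Error handling / resume-after-interruption is omitted in this sketch:
  //    on failure you would query the session URI for the last received byte.)
  for (let start = 0; start < file.size; start += chunkSize) {
    const chunk = await file.slice(start, start + chunkSize).arrayBuffer();
    await fetch(sessionUri, {
      method: 'PUT',
      headers: { 'Content-Range': contentRange(start, chunk, file.size) },
      body: chunk
    });
  }
}
```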

Using your own FTP server to receive uploaded files on my website

I'm looking for an out-of-the-box solution to add an upload form so that my users can upload large files from my website onto my own FTP server.
Has anyone found a good service to accomplish this? Again, I want to be able to use my own server in my office, and I also need a form attached to the uploaded file.
I run a graphics printing company and need to be able to receive large files that my designers send to me.
I want the user experience to be as painless and uncomplicated as possible, so I would prefer that they not have to download any FTP client like FileZilla or Transmit.
I just want them to fill out the form,
upload their files,
click send,
and then I receive it on my server.
If there is an off-the-shelf solution for this, that would be amazing.
Thank you!
I guess this is an "out of the box" web app. It allows you to brand the app to look like your own website by modifying a couple of files. All the functionality is built in. It is called Simple2FTP and can be found at www.Simple2ftp.com.
Maintaining an FTP server is not trivial. There are various Dropbox-type services on the web that are very easy to use.

Backing up Isolated storage

Some of my users have asked to have their data backed up (off the phone). I'm using isolated storage to store their data. I'm wondering if there is a way I can package up that information and send it to an off-phone location.
Is there a way to do this? What is the best way to accomplish it?
I don't know if SkyDrive will handle backing up the type of data you have, but there is an API now:
http://msdn.microsoft.com/en-us/windowslive/default.aspx
Examples on GitHub:
https://github.com/liveservices/LiveSDK
Someone posted a question on how to back up a SQL CE database to SkyDrive, and there didn't seem to be an API to back up arbitrary data, only specific things like pictures or music. I'm sure something could be figured out by storing some data from a browser and watching the traffic in Fiddler.
Edit: as a side note, there's an AWS beta API for WP7 now.
If you don't know what the data is, you can search for it like this:
Find Existing Files and Directories in Isolated Storage
Then you can use SkyDrive to save the users' data.
You can use the Live SDK.
