Add files from Google Cloud Storage to Drive - google-api

I'm developing a backend service in Node.js. It processes images from Google Cloud Storage by requesting a temporary link and sending that link to a third-party analysis service. I'd also like the images to be added to a shared Google Drive folder. Is there an easy way to do this (e.g. by using the Drive API and posting a link to the file, instead of downloading the file and subsequently uploading it)? In other words: does the Drive API accept links to files instead of uploads? Or is there some other clever way of getting files from Google Cloud Storage into Drive easily, given that both are Google services?
Thanks

The best way to do this right now would be to use a Google Colab notebook:
Create a Google Colab notebook, connect it to your GCP project, and mount Google Drive:
# authenticate
from google.colab import auth
auth.authenticate_user()
# set your gcp project
!gcloud config set project my-project
# mount your drive
from google.colab import drive
drive.mount('/content/drive')
Run a gsutil command to copy your Cloud Storage data to the mounted Drive:
!gsutil -q -m cp -r gs://my-bucket-name /content/drive/My\ Drive/

I have done some searching, but I don't see a way to add files from Google Cloud Storage to Google Drive other than downloading and re-uploading them. Also, AFAIK, gsutil, the command-line tool for interacting with the Google Cloud Storage APIs, likewise works by downloading and uploading files when moving data to other storage.
And, as answered in the Cloud Storage Frequently Asked Questions, Google Drive and Google Cloud Storage are two different storage services; both allow programmatic access to their functionality, but the goals of the two APIs are quite different.
However, you may want to try using Request Endpoints: you can access Google Cloud Storage through three request endpoints (URIs), and which one you use depends on the operation you are performing.
IMPORTANT NOTE: The Google Cloud Storage URIs described on this page are subject to change.
You can use the following URLs to access an object:
XML API
storage.googleapis.com/<bucket>/<object>
<bucket>.storage.googleapis.com/<object>
JSON API
www.googleapis.com/download/storage/v1/b/<bucket>/o/<object-encoded-as-URL-path-segment>?alt=media
These URLs support secure sockets layer (SSL) encryption, which means you can use either HTTP or HTTPS. If you authenticate to the Google Cloud Storage API using OAuth 2.0, you should use HTTPS.
You may want to also check Sharing and Collaboration for more information.
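To make the round trip concrete, here is a minimal sketch in Python of the download-then-upload transfer, assuming credentials that can read the bucket and write to the shared Drive folder (my-bucket, photo.jpg and FOLDER_ID are placeholder names):
import io
from google.cloud import storage
from googleapiclient.discovery import build
from googleapiclient.http import MediaIoBaseUpload
# download the object from Cloud Storage into memory
data = storage.Client().bucket("my-bucket").blob("photo.jpg").download_as_bytes()
# upload the bytes to the shared Drive folder via the Drive API
drive = build("drive", "v3")  # picks up application-default credentials
media = MediaIoBaseUpload(io.BytesIO(data), mimetype="image/jpeg")
drive.files().create(
    body={"name": "photo.jpg", "parents": ["FOLDER_ID"]},
    media_body=media,
    fields="id",
).execute()
Note that the bytes still pass through your machine either way; the Drive API offers no server-side copy from Cloud Storage.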

Related

How to connect Spring MVC application and cloud storage

I'm building an application with Spring MVC where I need to store users' photos. There are several ways to store files, but they have disadvantages:
in local storage - limited by the host's storage
in the DB - caching issues, DB size limits, and the slow process of converting images for storage in the DB
I want to ask: is there some way to upload images (or any files) to a cloud service, for example https://i.onthe.io/ or Google Drive, and then load them into my application (on a JSP page)?
There are two steps to uploading to Google Drive from a Spring application.
1. Implement OAuth2 authorization, either with the Google APIs or with Spring OAuth2.
2. Use the Drive API Client Library for Java to upload/download files to and from Google Drive.
Refer to the Google Drive API JavaDoc here.
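For step 1, here is a minimal sketch of the OAuth2 consent flow, shown with the Python client for brevity (the Java flow via GoogleAuthorizationCodeFlow is analogous); client_secret.json is a placeholder for the OAuth client file downloaded from the Google API Console:
from google_auth_oauthlib.flow import InstalledAppFlow
from googleapiclient.discovery import build
# scope that allows the app to upload files it creates
SCOPES = ["https://www.googleapis.com/auth/drive.file"]
# run the local consent flow and obtain user credentials
flow = InstalledAppFlow.from_client_secrets_file("client_secret.json", SCOPES)
creds = flow.run_local_server(port=0)
# step 2: build the Drive service with the authorized credentials
drive = build("drive", "v3", credentials=creds)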
Spring Cloud AWS may also be able to help you.

How to upload huge files to a web server

I have a virtual machine on Google Cloud, and I have set up a web server on this machine (Ubuntu 12.04). I will serve my website from it.
My website shows huge images in the JPEG 2000 format. The website also lets users upload their images and share them with other people.
But the problem is that the images are about 1-3 GB in size, and I cannot use standard file upload methods (PHP file upload), because when the connection drops, the upload starts over from the beginning. So I need a better way.
I am thinking about the Google Drive API. If I create a shared Google Drive account and users upload to this account from my website using the Drive API, would that be a good approach?
Since you're uploading files to Drive, you can use the Upload API with uploadType=resumable.
Resumable upload: uploadType=resumable. For reliable transfer, especially important with larger files. With this method, you use a session initiating request, which optionally can include metadata. This is a good strategy to use for most applications, since it also works for smaller files at the cost of one additional HTTP request per upload.
However, do note that there's a storage limit for the account. If you want more capacity, you'll have to purchase it.
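For a sense of what the resumable flow looks like in code, here is a minimal sketch with the Drive API Python client, which uses uploadType=resumable under the hood when resumable=True (huge.jp2 is a placeholder file name, and credentials are assumed to be set up already):
from googleapiclient.discovery import build
from googleapiclient.http import MediaFileUpload
drive = build("drive", "v3")
# resumable=True uploads the file in chunks so an interrupted
# transfer can be resumed instead of starting over
media = MediaFileUpload("huge.jp2", mimetype="image/jp2",
                        resumable=True, chunksize=10 * 1024 * 1024)
request = drive.files().create(body={"name": "huge.jp2"}, media_body=media)
response = None
while response is None:
    status, response = request.next_chunk()
    if status:
        print("uploaded %d%%" % int(status.progress() * 100))
print("file id:", response.get("id"))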

Laravel: Google Cloud Storage and Google Compute Engine

I need to store user data in my Google Cloud Storage bucket.
I'm running Laravel on Google Compute Engine.
Could someone assist with this?
I've found only tutorials for Google App Engine so far.
The Google Cloud Storage JSON API is a simple, JSON-backed interface for accessing and manipulating Google Cloud Storage projects in a programmatic way.
You will need to use the Google APIs Client Library for PHP to send requests to the Google Cloud Storage JSON API. See the links below for more info and examples:
https://cloud.google.com/storage/docs/json_api/
https://cloud.google.com/storage/docs/json_api/v1/json-api-php-samples
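As a quick illustration of the flow, here is a minimal upload sketch, shown with the Python client for brevity since the PHP client wraps the same JSON API (my-bucket and the file paths are placeholders):
from google.cloud import storage
# assumes the Compute Engine instance's service account has
# permission to write to the bucket
client = storage.Client()
bucket = client.bucket("my-bucket")
# upload a local file; the client speaks the JSON API under the hood
bucket.blob("uploads/avatar.jpg").upload_from_filename("/tmp/avatar.jpg")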

Google Cloud Storage to serve website images

We will be launching a Google campaign for our website and are expecting a high number of visitors.
Hence, I did some pre-calculations and figured out that serving images from cloud servers would be the best approach; they are currently served from a dedicated server.
I don't have a clue how Google Cloud Storage or any similar service works. So can someone please guide me through the relevant steps for hosting all our images on Google Cloud Storage, serving them from Europe, and mapping a subdomain?
Currently I am following this guide.
Edit:
Before going with the cloud I compared the purposes of a CDN and cloud storage, and this is what I figured out:
CDN: used for serving content from multiple regions; speed is the purpose
Cloud: used for serving content under high bandwidth usage; high availability is the purpose
And my main purpose is high availability. I hope I have gained correct information from my dear friend Google.
Are you looking for this: https://developers.google.com/storage/docs/bucketnaming
You need to verify your domain name so you can use a CNAME with Google Cloud Storage:
https://groups.google.com/forum/#!topic/gs-discussion/V-nLULNRQLI
On second thought, are you sure you need Google Cloud Storage?
It sounds like you just need a CDN, or something like Amazon S3.
If memory serves, you need to do the following to use Google Cloud Storage with a custom domain (a minimal upload sketch in Python follows the list):
verify your own domain name, such as example.com
upload images to Google Cloud Storage, using tools such as gsutil
serve those images from your own subdomain, such as images.example.com
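Here is that upload step sketched with the Python client; the CNAME approach requires the bucket to be named after the verified subdomain, and the objects must be publicly readable (names below are placeholders):
from google.cloud import storage
# the bucket must be named after the verified subdomain
bucket = storage.Client().bucket("images.example.com")
blob = bucket.blob("logo.png")
blob.upload_from_filename("logo.png")
# make the object world-readable so it can be served at
# http://images.example.com/logo.png
blob.make_public()
print(blob.public_url)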
You can serve images from Google Cloud Storage. The nice benefit is that Google does the serving for you, and when you use the Images API to create serving URLs you can crop and resize the images while serving.
Look at this gist for details.
If you'd like to use your own domain as part of the image URL, you cannot use HTTPS!
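As a sketch of that serving-URL flow on the legacy first-generation App Engine Python runtime (my-bucket and photo.jpg are placeholders):
from google.appengine.api import images
from google.appengine.ext import blobstore
# reference a Cloud Storage object through the Blobstore API
gs_key = blobstore.create_gs_key("/gs/my-bucket/photo.jpg")
# get_serving_url returns a Google-served URL; size and crop
# can also be adjusted later via URL parameters such as =s200-c
url = images.get_serving_url(gs_key, size=200, crop=True)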
Serving Static Files
Applications often need to serve static files such as JavaScript, images, and CSS in addition to handling dynamic requests. Apps in the flexible environment can serve static files from a Google Cloud option like Cloud Storage.
https://cloud.google.com/appengine/docs/flexible/nodejs/serving-static-files

Mapping a Windows drive path to a RESTful API for central file storage

Is this possible? Is there a program that will allow this to happen? I have a program that needs to access a lot of data from central storage, but the likes of Amazon S3 only allow access via a RESTful API, which is no good for this program. Only UNC paths or drive letters are acceptable...
Help!
Bernard
I believe that you can map a Windows drive letter to a WebDAV store. There are plenty of online WebDAV storage providers.
