Is there a way to export Google Cloud Storage files to Google Drive using Python?
I was doing the Google Dataflow tutorial in the Google Cloud Shell, which boils down to a single apache_beam command. I noticed that the command takes an output destination, which is a Google Cloud Storage location. I wanted to know whether, after the command is run, Google provides a way to take the output and export it to Google Drive.
Google does not provide a built-in mechanism for moving files between Cloud Storage and Drive.
You will need to download your file from Google Cloud Storage to your own machine and then upload it to Google Drive.
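A minimal sketch of that round trip in Python, assuming the google-cloud-storage and google-api-python-client packages and credentials authorized for both services; the bucket, object, and file names below are hypothetical placeholders:

```python
# Sketch: copy a Dataflow output object from Cloud Storage to Google Drive.
# Requires: pip install google-cloud-storage google-api-python-client
# Assumes default credentials that can reach both GCS and the Drive API;
# all bucket/object/file names below are hypothetical.
from google.cloud import storage
from googleapiclient.discovery import build
from googleapiclient.http import MediaFileUpload

LOCAL_PATH = "/tmp/output.txt"  # temporary local copy

# 1. Download the object from Cloud Storage.
gcs = storage.Client()
bucket = gcs.bucket("my-dataflow-bucket")            # hypothetical bucket
blob = bucket.blob("results/output-00000-of-00001")  # hypothetical object
blob.download_to_filename(LOCAL_PATH)

# 2. Upload the local copy to Google Drive.
drive = build("drive", "v3")  # picks up application-default credentials
media = MediaFileUpload(LOCAL_PATH, mimetype="text/plain")
created = drive.files().create(
    body={"name": "dataflow-output.txt"}, media_body=media, fields="id"
).execute()
print("Drive file id:", created["id"])
```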
We have found various solutions for uploading a file to a Google Cloud Storage bucket from the local system. However, I am wondering whether there is a way to upload a file to a bucket from a public URL or link.
https://googleapis.dev/ruby/google-cloud-storage/latest/index.html
I want to upload a file from a remote URL to a GCS bucket via Ruby code. Any suggestions would be appreciated.
Your code sits between the remote URL and the Google Cloud Storage (GCS) bucket.
You have two alternatives:
(As you describe) Download the file behind the remote URL to a file system accessible to your code and then upload it to GCS;
Stream the file from the remote location into memory (you'll need to write this) and then, using the GCS client library, stream it into a GCS object (sketched below).
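Here is a minimal sketch of the streaming alternative. It is in Python to keep these notes in one language; with the Ruby google-cloud-storage gem the analogous call is Bucket#create_file, which also accepts an IO object. The URL and bucket name are hypothetical:

```python
# Sketch of alternative 2: stream the remote file straight into a GCS object
# without writing it to local disk.
# Requires: pip install google-cloud-storage requests
import shutil

import requests
from google.cloud import storage

SOURCE_URL = "https://example.com/big-file.zip"  # hypothetical remote file

client = storage.Client()
bucket = client.bucket("my-destination-bucket")  # hypothetical bucket
blob = bucket.blob("big-file.zip")

# Copy the HTTP response body into a GCS writer chunk by chunk, so the
# whole file never has to fit in memory at once.
with requests.get(SOURCE_URL, stream=True) as resp:
    resp.raise_for_status()
    with blob.open("wb") as gcs_file:
        shutil.copyfileobj(resp.raw, gcs_file)
```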
You tagged the question with ruby-on-rails-3.
Older Rails versions use uploaders like CarrierWave, and it's possible to use it to upload files to GCS.
With this gem you can upload not only local files but also files from a remote URL: just set the uploader's remote_<attribute>_url attribute (for example, remote_avatar_url) and CarrierWave downloads the file for you.
I've finally been able to upload two videos as blobs into a storage account container. I can see the blobs from the portal when I drill down into Storage > storage account name > Container > container name.
I can also see them from the CLI with the command "storage blob list".
However, when I attempt to upload the content into my Media Services account (I select upload content from storage, select the account, then the container...), I get the erroneous message that there are no blobs.
Clearly there are, but they are not showing up. Any clues?
(see attached screenshots)
Did you try Azure Media Services Explorer? It's a very nice tool for working with Azure Media Services without writing a line of code!
You can download it directly from GitHub: https://github.com/Azure/Azure-Media-Services-Explorer
EDIT: OK, I think I have found why the blob list is empty. I had not noticed that your two files have no extensions. I have just reproduced your issue with a file without an extension.
To work with Azure Media Services encoders, your files need to have valid extensions.
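If the blobs are already uploaded, one workaround (my suggestion, not part of the original answer) is a copy-and-delete rename that adds an extension. A rough sketch with the azure-storage-blob Python package; the connection string, container, and blob names are hypothetical:

```python
# Sketch: give an extensionless blob a valid extension so Media Services
# can use it. Blobs cannot be renamed in place, so copy to a new name and
# delete the original. Requires: pip install azure-storage-blob
from azure.storage.blob import BlobServiceClient

service = BlobServiceClient.from_connection_string("<your-connection-string>")
container = service.get_container_client("videos")  # hypothetical container

source = container.get_blob_client("myvideo")      # the extensionless original
target = container.get_blob_client("myvideo.mp4")  # same content, valid extension

# Server-side copy within the same account, then remove the old blob.
target.start_copy_from_url(source.url)
# For large blobs, poll target.get_blob_properties().copy.status until it
# reports "success" before deleting the source.
source.delete_blob()
```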
Hope this helps,
Julien
I uploaded files, mostly media files, to Azure File Storage, and I can see them in Azure Storage Explorer as well. But when I view a file as an anonymous user, I am not able to view it. I tried the permissions settings as well, but to no avail.
Any help would be welcomed :)
Azure Files supports Shared Access Signatures (SAS). A SAS is a token that you compute from the storage account key and that grants access to a particular URL. Here is an example (the storage account name is obfuscated here):
https://mystorageaccount.file.core.windows.net/sampleshare/2.png?sv=2015-04-05&sr=f&si=sampleread&sig=Zq%2BfflhhbAU4CkCuz9q%2BnUFEM%2Fsg2PbXe3L4MeCC9Bo%3D&sip=0.0.0.0-255.255.255.255
There is sample code showing how to create a SAS for Azure Files at https://azure.microsoft.com/en-us/documentation/articles/storage-dotnet-how-to-use-files/, section "Generate a shared access signature for a file or file share".
You can also do it interactively with a number of tools. For instance, CloudXPlorer has this feature.
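The linked article shows the .NET version; to keep these notes in one language, here is a rough Python equivalent using the azure-storage-file-share package. The account name, key, share, and file path are hypothetical:

```python
# Sketch: build a read-only SAS URL for a file in Azure Files.
# Requires: pip install azure-storage-file-share
from datetime import datetime, timedelta

from azure.storage.fileshare import FileSasPermissions, generate_file_sas

account_name = "mystorageaccount"
account_key = "<your-account-key>"

sas_token = generate_file_sas(
    account_name=account_name,
    share_name="sampleshare",
    file_path=["2.png"],                       # path components within the share
    account_key=account_key,
    permission=FileSasPermissions(read=True),  # read-only
    expiry=datetime.utcnow() + timedelta(hours=1),
)

url = f"https://{account_name}.file.core.windows.net/sampleshare/2.png?{sas_token}"
print(url)  # anyone holding this URL can read the file until it expires
```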
I want to try AppHarbor, but I have an application which stores uploaded files in a certain place on the filesystem. Is this compatible with AppHarbor? Can I store files in the file system and access them later?
(What kind of path can I expect, something like c:\blabla, or what?)
Thank you.
You can store files on the local filesystem, but the application directory is wiped on each new deployment, so it's not recommended to rely on it for file storage.
Instead we recommend that you use a cloud storage service such as Amazon S3, Google Cloud Storage or similar. There are .NET libraries for both services.
We recently wrote a blog post about uploading files directly to S3 and GCS from the browser that you might want to read.
If you are using a background worker, you need to 'Enable File System Write Access' in the settings of your application.
Then you are permitted to write to the path returned by Path.GetTempPath().
Sourced from this support question: http://support.appharbor.com/discussions/problems/5868-create-directory-in-background-worker
I have an app on Heroku and I need simple file storage for uploaded images; for this I used send_data with the attachment_fu plugin.
After that I wrote the files to the tmp/ directory and want to display them in the browser, but the files are not displayed.
How can I display these images in the browser?
What is an alternative solution for storing and retrieving images?
Thanks!
You cannot persist uploaded files on Heroku; its filesystem is ephemeral.
You must use an alternative strategy. A couple of alternative strategies:
Store uploaded files in your database on Heroku. Because database systems are not specifically optimized for storing files, you should test out how well your application performs under load before you use this strategy.
Store uploaded files in an external storage service. Amazon S3 is a popular cloud storage service that anyone can use, and it has pay-as-you-go pricing just like Heroku. (In fact, Heroku itself runs on top of another of Amazon's cloud services, EC2.) A sketch of this strategy follows.
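In Rails you would typically reach for a gem here, but to keep these notes in one language, here is the idea sketched in Python with boto3; the bucket and key names are hypothetical:

```python
# Sketch: push an uploaded image to Amazon S3 and serve it from there
# instead of Heroku's ephemeral filesystem. Requires: pip install boto3
# (credentials come from the environment); bucket/key names are hypothetical.
import boto3

s3 = boto3.client("s3")

def store_upload(local_path: str, key: str) -> str:
    """Upload a file and return its URL. Assumes the bucket allows public
    reads; otherwise use s3.generate_presigned_url() instead."""
    bucket = "my-app-uploads"  # hypothetical bucket
    s3.upload_file(local_path, bucket, key)
    return f"https://{bucket}.s3.amazonaws.com/{key}"

# Usage: store the temp file from the upload and keep only the URL in the DB.
# url = store_upload("/tmp/avatar.png", "avatars/user-42.png")
```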