I'm trying to integrate Google Cloud Storage for file uploads and serving in a Laravel app that is hosted on Google Compute Engine.
What would be the best practice for that?
Is this the right way of using Google Cloud Storage?
Should I use the Storage facade for that, or a Job that copies local file uploads to the cloud?
Thanks
You can use the Google APIs Client Library for PHP to send requests to the Google Cloud Storage JSON API. Visit this link for an example.
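For illustration, a direct JSON API upload with that library could look like this minimal sketch, assuming the google/apiclient package, a bucket named my-bucket, and default service account credentials (picked up automatically on Compute Engine):

require_once 'vendor/autoload.php'; // composer require google/apiclient

$client = new Google_Client();
$client->useApplicationDefaultCredentials(); // uses the GCE service account
$client->addScope(Google_Service_Storage::DEVSTORAGE_READ_WRITE);

$storage = new Google_Service_Storage($client);

// Describe the object, then insert it together with its data as a media upload.
$object = new Google_Service_Storage_StorageObject();
$object->setName('uploads/avatar.png');

$storage->objects->insert('my-bucket', $object, [
    'uploadType' => 'media',
    'mimeType'   => 'image/png',
    'data'       => file_get_contents('/tmp/avatar.png'),
]);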
A better approach is to upload directly to Cloud Storage.
With a direct upload the file data is transferred only once. In the other approach you would first upload the file to some temp location and then move it from there to Cloud Storage, so more data is transferred, it is slower, and the image can also be unavailable until the job that moves it from the temp location to Cloud Storage has run.
Now let's address how to upload directly to Cloud Storage from Laravel. The following steps achieve it:
1. Import CloudStorageTools:
use google\appengine\api\cloud_storage\CloudStorageTools;
2. Create a bucket URL for uploading the file directly:
$bucket_options = ['gs_bucket_name' => $my_bucket];
$cloud_storage_upload_url = CloudStorageTools::createUploadUrl('/upload/handler', $bucket_options);
3. Use the generated $cloud_storage_upload_url in the action attribute of a form to upload the file directly.
Example:
<form action="{{ $cloud_storage_upload_url }}" enctype="multipart/form-data" method="post">
Footnote: the generated upload URL is only valid for 10 minutes after its creation.
The documentation for this can be found here.
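On the handler side (/upload/handler in the example above), App Engine has already written the upload to Cloud Storage by the time your code runs, so tmp_name is a gs:// path rather than a local temp file. A minimal handler sketch, assuming the form's file field is named uploaded_file (a placeholder):

// /upload/handler — the upload already lives in Cloud Storage;
// tmp_name is a gs:// path, not a local file.
$file_name = $_FILES['uploaded_file']['name'];
$temp_name = $_FILES['uploaded_file']['tmp_name'];

// Move the object to its final name inside the bucket.
move_uploaded_file($temp_name, 'gs://' . $my_bucket . '/' . $file_name);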
Related
The project I am currently working on needs a search engine to search tens of thousands of PDF files. When the user searches the website for a certain keyword, the search engine will return a snippet of the PDF files matching the search criteria. The user then has the option to click a button to view the entire PDF file.
I figured the best way to do this was Elasticsearch + FSCrawler (https://fscrawler.readthedocs.io/en/fscrawler-2.7/). Running some tests today, I was able to crawl a folder on my local machine.
For serving the PDF files (via a website), I figured I could store them in a Google Cloud Storage bucket and then use the bucket links to let users view the PDF files. However, FSCrawler does not seem to be able to access the bucket. Any tips or ideas on how to solve this? Feel free to criticize the work method described above. If there are better ways to let users of the website access the PDF files, I would love to hear them.
Thanks in advance and kind regards!
You can use s3fs-fuse to mount an S3 bucket into your file system and then use the normal local FS crawler.
Alternatively, you can fork FSCrawler and implement a crawler for S3 similar to crawler-ftp.
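For illustration, the mount is a single command, e.g. s3fs my-pdf-bucket /mnt/pdfs -o passwd_file=${HOME}/.passwd-s3fs (bucket name and mount point are placeholders), after which FSCrawler can be pointed at /mnt/pdfs like any local directory. For the Google Cloud Storage bucket in the question, gcsfuse provides the equivalent mount.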
I have AWS S3 buckets full of images and was wondering how I could browse them without needing to download the whole batch first. Is there a way I can pipe them into feh through the AWS CLI or some other method?
Amazon S3 can act as a web server for your images. However, it simply serves the image files when they are requested. You will need to write an application or web page that incorporates those images into a form suitable for viewing.
For example, you could list the files in the Amazon S3 bucket and convert the listing into an HTML page with lots of <img src=... /> tags. The web browser would then download the images from S3 and insert them into the page.
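A minimal sketch of that idea, assuming the aws/aws-sdk-php package and a bucket named my-image-bucket whose objects are publicly readable:

require 'vendor/autoload.php'; // composer require aws/aws-sdk-php

use Aws\S3\S3Client;

$bucket = 'my-image-bucket';
$s3 = new S3Client(['version' => 'latest', 'region' => 'us-east-1']);

// List the objects and emit an <img> tag for each one.
$result = $s3->listObjectsV2(['Bucket' => $bucket]);
foreach ($result['Contents'] as $object) {
    $url = $s3->getObjectUrl($bucket, $object['Key']);
    echo '<img src="' . htmlspecialchars($url) . '" />' . PHP_EOL;
}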
If you are looking for a full-featured photo management app, try services like Prime Photos from Amazon or SmugMug. They've done all the hard work for you.
I am trying to implement Dropbox on my website, and so far I've been able to upload, fetch file metadata and user details, and also download files to my local machine (using the Dropbox API v2).
However, I would like to import a file directly from Dropbox and upload it to the server to be processed further. I'm able to generate the link for the chosen file using the "Chooser".
The Dropbox API Explorer lists all the APIs Dropbox provides.
To build the website I'm using Laravel 5.6.17.
Your help would be much appreciated. Thanks in advance.
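For what it's worth, once the Chooser is configured with linkType "direct", the import can be as simple as fetching that URL server-side and storing the bytes. A minimal Laravel sketch (the controller action, the link field name, and the storage path are hypothetical):

use Illuminate\Http\Request;
use Illuminate\Support\Facades\Storage;

// Hypothetical controller action: receives the direct link produced by
// the Dropbox Chooser and copies the file into local storage.
public function importFromDropbox(Request $request)
{
    $url  = $request->input('link');
    $name = basename(parse_url($url, PHP_URL_PATH));

    // Fetch the file contents and store them for further processing.
    Storage::put('dropbox-imports/' . $name, file_get_contents($url));

    return response()->json(['stored_as' => 'dropbox-imports/' . $name]);
}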
I want to store an image on Firebase and use it somewhere else. I went through this SO post and tried this demo.
It stores images in data: URL format. But I want to upload and store images as physical files so I can use them further, via a URL like http://example.com/some_image.png.
Can I achieve this with Firebase?
No. The Firebase API currently does not support storing files outside of its database.
Update: at Google I/O 2016 Firebase added Firebase Storage to its offering, which is a service dedicated to storing files.
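Since Firebase Storage files live in a regular Google Cloud Storage bucket, a server-side upload from PHP could look like this minimal sketch (the project ID and the default <project-id>.appspot.com bucket name are assumptions):

require 'vendor/autoload.php'; // composer require google/cloud-storage

use Google\Cloud\Storage\StorageClient;

$storage = new StorageClient(['projectId' => 'my-project']);
$bucket = $storage->bucket('my-project.appspot.com'); // default Firebase bucket

// Upload the image as a real object instead of a data: URL.
$bucket->upload(fopen('/path/to/some_image.png', 'r'), [
    'name' => 'images/some_image.png',
]);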
Is it possible to create a file in my Dropbox/Google Drive (or any cloud service) via HTTP POST and GET only?
I want to send an HTTP POST with text content in it, and whatever the content is, it will be written to a file in my storage.
I was able to do this on free web hosting using PHP, but can it be done on the cloud, or is it possible to host PHP on the cloud so I can reuse the same PHP code I used on the site?
Yes, this is possible. For Dropbox, see https://www.dropbox.com/developers/core/docs#files_put.
For Drive, see Uploading files; it walks through the steps for uploading file content over HTTP.
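Note that files_put belongs to the old v1 Core API; the v2 equivalent is the /2/files/upload endpoint, which is still a single HTTP POST. A minimal PHP sketch (the access token and target path are placeholders):

// Write $content to /hello.txt in the app's Dropbox via one HTTP POST.
$token   = 'YOUR_ACCESS_TOKEN'; // placeholder
$content = 'whatever text should end up in the file';

$ch = curl_init('https://content.dropboxapi.com/2/files/upload');
curl_setopt_array($ch, [
    CURLOPT_POST           => true,
    CURLOPT_POSTFIELDS     => $content,
    CURLOPT_RETURNTRANSFER => true,
    CURLOPT_HTTPHEADER     => [
        'Authorization: Bearer ' . $token,
        'Dropbox-API-Arg: ' . json_encode(['path' => '/hello.txt', 'mode' => 'overwrite']),
        'Content-Type: application/octet-stream',
    ],
]);
$response = curl_exec($ch);
curl_close($ch);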