Laravel download from google drive (filesystem) or (hide api key from link) - laravel

First way to download:
https://www.googleapis.com/drive/v3/files/1YIH4zfM0P1xa-_mfZNGiIY8qZrIEt-rF/?key=API_KEY&alt=media
Putting my API key in the link gives me the ability to download the file directly from my Google Drive, but I don't want to give my API_KEY to the users.
Second way: I can also access my drive another way (using nao-pon/flysystem-google-drive):
Route::get('/download/{rest?}', function ($rest) {
    $metadata = Storage::cloud()->getMetadata($rest);
    $readStream = Storage::cloud()->getDriver()->readStream($rest);

    return response()->stream(function () use ($readStream) {
        fpassthru($readStream);
    }, 200, [
        'Content-Type' => $metadata['mimetype'],
        'Content-Disposition' => 'attachment; filename="'.$metadata['name'].'"', // force download?
    ]);
})->where('rest', '(.*)');
This way I don't have to use the API key, but the server has to pass the whole file through itself; it's a stream, but it still uses server resources.
On the other hand, https://www.googleapis.com/drive/v3/files/1YIH4zfM0P1xa-_mfZNGiIY8qZrIEt-rF/?key=API_KEY&alt=media needs a Content-Type and file name, since the response doesn't carry them, and I also have no idea how to hide the API key.
So what do you suggest? Is there any way to avoid response()->stream, so the whole file isn't pulled through the server before being sent to the user? Multiple users downloading files would use up all the bandwidth, so the download speed drops very fast.
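For reference, this is roughly what I mean by hiding the key behind my own route. It is only a sketch: it assumes the key sits in config('services.google.key') (Guzzle usage and the route name are my own), and it still pushes every byte through my server, which is exactly the cost I want to avoid.
use GuzzleHttp\Client;
use Illuminate\Support\Facades\Route;

Route::get('/drive/{fileId}', function (string $fileId) {
    // The key is read from config and never appears in a client-visible URL.
    $client = new Client();

    $response = $client->get("https://www.googleapis.com/drive/v3/files/{$fileId}", [
        'query'  => ['key' => config('services.google.key'), 'alt' => 'media'],
        'stream' => true, // avoid buffering the whole file in memory
    ]);

    $body = $response->getBody();

    // Every byte still passes through this server (same bandwidth cost as the
    // flysystem route above).
    return response()->stream(function () use ($body) {
        while (! $body->eof()) {
            echo $body->read(8192);
        }
    }, 200, [
        'Content-Type'        => $response->getHeaderLine('Content-Type') ?: 'application/octet-stream',
        'Content-Disposition' => 'attachment',
    ]);
});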

Related

Laravel Minio Temporary URL

I'm looking for a hacky way to create temporary URLs with Minio
I see on the Laravel docs it says: Generating temporary storage URLs via the temporaryUrl method is not supported when using MinIO.
However, from some digging I noticed that I can upload images successfully using:
AWS_ENDPOINT=http://minio:9000
I can't view them, though, because the temporary URL is on http://minio:9000/xxx.
If I change the AWS endpoint to
AWS_ENDPOINT=http://localhost:9000
The temporary URL is then on http://localhost:9000/xxx, the signature validates, and the file can be viewed.
The issue is in this call that builds the command. The $command needs to have its host changed, but I don't know whether I can do that just by passing in an option.
$command = $this->client->getCommand('GetObject', array_merge([
    'Bucket' => $this->config['bucket'],
    'Key' => $this->prefixer->prefixPath($path),
], $options));
There is also the option to just change the base URL by providing a temporary_url entry in the filesystem config; however, because the URL has changed, the signature is invalid.
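For context, the config-level option I'm referring to is just an extra key on the disk definition. A rough sketch (the env variable name is my own) looks like this, and as said, the rewritten host then no longer matches the signature:
// config/filesystems.php (sketch) - temporary_url rewrites the host of
// generated temporary URLs; as noted above, MinIO then rejects the signature.
's3_private' => [
    'driver' => 's3',
    'key' => env('AWS_ACCESS_KEY_ID'),
    'secret' => env('AWS_SECRET_ACCESS_KEY'),
    'region' => env('AWS_DEFAULT_REGION'),
    'bucket' => env('AWS_BUCKET'),
    'endpoint' => env('AWS_ENDPOINT'),           // e.g. http://minio:9000
    'use_path_style_endpoint' => true,
    'temporary_url' => env('AWS_TEMPORARY_URL'), // e.g. http://localhost:9000
],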
Is there a way to make the S3Client use a different host, either by passing an option to the getCommand function or by giving the AWS adapter a new S3Client that uses the correct host?
A very hacky solution I've found is to re-create the AwsS3Adapter:
if (is_development()) {
    $manager = app()->make(FilesystemManager::class);

    $adapter = $manager->createS3Driver([
        ...config("filesystems.disks.s3_private"),
        "endpoint" => "http://localhost:9000",
    ]);

    return $adapter->temporaryUrl(
        $this->getPathRelativeToRoot(),
        now()->addMinutes(30)
    );
}

How can I save an image via API in Laravel Server from React Native

I am trying to save some info and an image sent through an API from a React Native app. This is the request that I get if I log it:
[2022-08-05 10:11:09] local.ERROR: array ('{
"data":{
"_parts":' => array ('[
"image", {
"modificationDate":"1659694264000","size":2883040,"mime":"image/jpeg","height":4160,"width":3120,"path":"file:///storage/emulated/0/Android/data/com.wheeloffortune/files/Pictures/image-6ab1ad33-2580-416b-be7d-09a0662739218182247041115654861.jpg"
}' => NULL,
),)
When I tested the API from Postman I got the image as a file and saved it easily without breaking a sweat, but from the app the developer sends it in the format pasted above. Can I somehow save the image from that request?
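For reference, this is roughly how I save it when Postman sends the image as a real file part. The field and disk names are just what I use, and it only works if the app also sends multipart/form-data with an "image" part (as the _parts structure above suggests it is trying to do):
use Illuminate\Http\Request;

// Controller method sketch: stores an uploaded image plus some extra info.
// Assumes the client sends multipart/form-data with an "image" file part.
public function store(Request $request)
{
    $request->validate([
        'image' => ['required', 'image'],
    ]);

    // Store the file on the public disk and keep the generated path.
    $path = $request->file('image')->store('uploads', 'public');

    return response()->json(['path' => $path], 201);
}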

Unable to get content of file using google drive api

I am working with a React application where you can access Google Drive files using react-google-picker, and I am using the get API of googleapis, where you can pass "alt=media" as a URL parameter to get the file content. It works as expected for text files, but for PDF/Doc files it returns metadata instead of the actual content of the file.
Is there any other way to get the content using the Google APIs?
This is the code for getting the content of the file:
axios.get(`https://www.googleapis.com/drive/v2/files/${data.docs[0].id}?key=${developerKey}&alt=media`, {
    headers: {
        Authorization: `Bearer ${this.state.token}`
    }
})

How to control access to files at another server in Laravel

I have a host for my Laravel website and another (non-Laravel) host for stored files. Direct access to the files is blocked completely by default, and I want to control access to them by creating temporary links on my Laravel site. I know how to code; I just want to know the idea of how to do it (not the details).
From the Laravel docs:
Temporary URLs: For files stored using the s3 or rackspace driver, you may create a temporary URL to a given file using the temporaryUrl method. This method accepts a path and a DateTime instance specifying when the URL should expire:
$url = Storage::temporaryUrl(
    'file.jpg', now()->addMinutes(5)
);
You could also make your own solution by directing all image requests through your own server and making sure the file visibility is set to private.
Here is an example of how a controller could return an image from your storage:
public function get($path)
{
    $file = Storage::disk('s3')->get($path);

    // Do your temp link solution here

    return response($file, 200)->header('Content-Type', 'image/png');
}
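As a rough sketch of the "temp link solution" comment above, Laravel's signed URLs could play that role when the backing store has no native temporary URLs. The route and parameter names here are only illustrative:
use Illuminate\Http\Request;
use Illuminate\Support\Facades\Route;
use Illuminate\Support\Facades\Storage;
use Illuminate\Support\Facades\URL;

// Issue a link that stops working after five minutes.
$url = URL::temporarySignedRoute('files.show', now()->addMinutes(5), ['path' => 'file.jpg']);

// Route that actually serves the file, rejecting expired or tampered links.
Route::get('/files/{path}', function (Request $request, string $path) {
    abort_unless($request->hasValidSignature(), 403);

    return response(Storage::disk('s3')->get($path), 200)
        ->header('Content-Type', 'image/png');
})->name('files.show');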
What I am using right now is the Flysystem integration provided in Laravel. Laravel's Flysystem integration ships simple drivers for working with local filesystems, Amazon S3 and a few other providers, so it doesn't matter whether the file server is a Laravel server or not.
Even better, it's very simple to switch between servers by just changing the disk configuration.
As far as I know you can also create temporary URLs for s3 and rackspace this way by calling the temporaryUrl method, and caching is already built in.
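As a sketch of that configuration-only switch (disk names are illustrative), the application code only ever refers to a disk name:
// config/filesystems.php (sketch) - moving files from the local server to S3
// is then a configuration change, not a code change.
'disks' => [
    'local' => [
        'driver' => 'local',
        'root' => storage_path('app'),
    ],
    's3' => [
        'driver' => 's3',
        'key' => env('AWS_ACCESS_KEY_ID'),
        'secret' => env('AWS_SECRET_ACCESS_KEY'),
        'region' => env('AWS_DEFAULT_REGION'),
        'bucket' => env('AWS_BUCKET'),
    ],
],

// Anywhere in the application:
Storage::disk(env('FILESYSTEM_DISK', 'local'))->put('file.jpg', $contents);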
If your files are uploaded to an AWS S3 server, then:
use Carbon\Carbon;
use Storage;

$file_path = "4/1563454594.mp4";

if (Storage::disk('s3')->exists($file_path)) {
    // link expiration time
    $urlExpires = Carbon::now()->addMinutes(1);

    try {
        $tempUrl = Storage::disk('s3')->temporaryUrl($file_path, $urlExpires);
    } catch (\Exception $e) {
        // Unable to generate temporaryUrl; the driver reports it does not support it.
        return response($e->getMessage());
    }
}
Your temporary URL will be generated; after the given expiration time (1 minute) it will expire.

Google Drive Ruby Client Returning Different Responses

So I'm working with the Google Drive Ruby client API to export some data into spreadsheets. The problem is, when I try to get the exportLink to download the spreadsheets in CSV format, the response I get in the Google API Explorer and the response I get in my application don't match up. Specifically, the entire exportLinks section of the response is missing.
Has this way of downloading the spreadsheets been deprecated? Is there any other way to export spreadsheets into a CSV format? Or any other workable format (perhaps a multidimensional array?)
client = Google::APIClient.new(:application_name => APPLICATION_NAME)
client.authorization = authorize
drive_api = client.discovered_api('drive', 'v2')

result = client.execute!(
  :api_method => drive_api.files.get,
  :parameters => {:fileId => "1MBP9Q9Q-9ZgLoYnY8ExS-EcxHLESI_vcK4J91ngp6-Q"})

file = result.data
puts("Fetched #{file["title"]}")
puts("Getting downloadURL")

exportLinks = file['exportLinks']
From there, my understanding is that I should be able to call exportLinks['text/csv'], but the response does not include the exportLinks section like it does in the API Explorer.
Edit: I use the same authorization that Google uses in its quick-start guide, and that all works fine, as do other methods, so I know I'm connecting to the Drive API correctly; I'm just not sure why my responses aren't matching the API Explorer.