Laravel how to upload/download files to Amazon S3 using KMS - laravel

I am trying to upload/download files from Amazon S3 using KMS. I have already implemented this without KMS using the following code:
$filePath = "users/" . $user_id . "/". $name;
Storage::disk('s3')->put($filePath, file_get_contents($file));
$fileName = Storage::disk('s3')->url($filePath);
Now I have enabled KMS on my bucket, created my key, and implemented the bucket policy. Still, I can't find the syntax for uploading/downloading files with KMS in Laravel/PHP; I am a newbie with S3.
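One way to do this without new syntax: the flysystem S3 adapter recognizes `ServerSideEncryption` and `SSEKMSKeyId` among its write options and forwards them to the underlying `PutObject` call, so SSE-KMS can be requested per upload through the same `put` method. A minimal sketch, assuming the `league/flysystem-aws-s3-v3` adapter; the key ARN below is a placeholder:

```php
<?php
// Sketch: uploading with SSE-KMS via Laravel's Storage facade.
// The flysystem AWS S3 adapter forwards these options to the
// underlying S3 PutObject call. The key ID is a placeholder --
// substitute your own KMS key ARN.
use Illuminate\Support\Facades\Storage;

function uploadWithKms(string $filePath, string $contents): bool
{
    return Storage::disk('s3')->put($filePath, $contents, [
        'ServerSideEncryption' => 'aws:kms',
        'SSEKMSKeyId'          => 'arn:aws:kms:us-east-1:111122223333:key/example-key-id',
    ]);
}

// Downloads need no extra syntax: if the IAM user/role is allowed to call
// kms:Decrypt, Storage::disk('s3')->get($filePath) transparently returns
// the decrypted contents.
```

If every object in the disk should be encrypted, the same options can instead be set once under the disk's `options` key in `config/filesystems.php`.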

Related

Heroku file upload success, but error when download

Hi, I am building a Laravel Livewire app on Heroku.
The app requires file upload, and I'm using Livewire's FileUpload. It works fine locally, but on Heroku the upload reports success, yet when I download the file I get a "No file" message. I don't know where the error lies.
here is my source code:
_ In controller:
public function updatedFile()
{
    // $this->validate();
    $fileUpload = new File();
    $fileUpload->url = $this->file->storeAs('public/files/' . auth()->id(), $this->file->getFilename());
    $fileUpload->size_file = $this->getFileSize($this->file);
    $fileUpload->file_name = $this->file->getClientOriginalName();
    $fileUpload->model_name = $this->model_name;
    $fileUpload->model_id = $this->model_id;
    $fileUpload->admin_id = auth()->check() ? auth()->id() : null;
    $fileUpload->save();
    if ($this->model_id == null) {
        $this->list[] = $fileUpload->id;
    }
}
_ In view
<a href="{{ $canDownload ? asset('storage/' . substr($val['url'], 7, strlen($val['url']) - 7)) : '#' }}" download>
    <span class="d-block mb-0" style="word-break: break-all;">{{ $val['file_name'] }}</span>
    <small class="kb">{{ $val['size_file'] }}</small>
</a>
The immediate issue may just be with relative vs. absolute paths.
But even once you resolve that you'll find that your uploads disappear frequently and unpredictably. This is due to Heroku's ephemeral filesystem.
To store uploads long-term you'll need to use a third-party service like Amazon S3 or Azure Blob Storage. It looks like Livewire supports this directly:
The previous example demonstrates the most basic storage scenario: Moving the temporarily uploaded file to the "photos" directory on the app's default filesystem disk.
However, you may want to customize the file name of the stored file, or even specify a specific storage "disk" to store the file on (maybe in an S3 bucket for example).
It also provides the following example:
// Store in the "photos" directory in a configured "s3" bucket.
$this->photo->store('photos', 's3');
And links to the relevant Laravel documentation, saying that Livewire uses the same API. Just make sure to configure an S3 bucket.
You can also completely bypass your server, having uploads go directly from the user's browser to your S3 bucket. This is particularly useful with large uploads.
Make sure to use the correct disk when building your download URL.
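Putting the pieces together, a sketch of what the switch to S3 could look like for the code in the question (field names follow the question's `File` model; `temporaryUrl` assumes the bucket is kept private):

```php
<?php
// Sketch: storing the Livewire upload on S3 and generating a matching
// download URL, instead of the local disk that Heroku wipes on restart.
use Illuminate\Support\Facades\Storage;

function storeOnS3($livewireFile, int $userId): string
{
    // The third argument selects the "s3" disk instead of the local default.
    return $livewireFile->storeAs(
        'files/' . $userId,
        $livewireFile->getClientOriginalName(),
        's3'
    );
}

function downloadUrl(string $path): string
{
    // For a private bucket, hand out a short-lived signed URL rather than
    // asset('storage/...'), which only works for the local public disk.
    return Storage::disk('s3')->temporaryUrl($path, now()->addMinutes(30));
}
```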

Php laravel Upload file directly to AWS S3 bucket

Can anyone help me upload a file to an AWS S3 bucket using PHP/Laravel? The file should be uploaded directly to S3 using a pre-signed URL.
I will try to answer this question. So, there are two ways to do this:
You send the pre-signed URL to Frontend Client and let them upload the file to S3 directly, and once uploaded they notify your server of the same.
You receive the file directly on the server and upload it to S3, in this case, you won't need any pre-signed URL, as you would have already configured the AWS access inside the project.
Since solution 1 is self-explanatory, I will try to explain solution 2.
Laravel provides the Storage facade for handling filesystem operations. It follows a multiple-driver philosophy - public, local disk, Amazon S3, FTP - plus the option of extending it with custom drivers.
Step 1: Configure your .env file with AWS keys, you will need the following values to start using Amazon S3 as the driver:
AWS Key
AWS Secret
AWS Bucket Name
AWS Bucket Region
Step 2: Assuming you already have the file uploaded to your server, we will now upload it to S3.
If you have set s3 as the default disk, the following snippet will do the upload for you:
Storage::put('avatars/1', $fileContents);
If you are using multiple disks, you can upload the file by:
Storage::disk('s3')->put('avatars/1', $fileContents);
We are done! Your file is now uploaded to your S3 bucket. Double-check it inside your S3 bucket.
If you wish to learn more about Laravel Storage, click here.
use Storage;
use Config;

$client = Storage::disk('s3')->getDriver()->getAdapter()->getClient();
$bucket = Config::get('filesystems.disks.s3.bucket');

$command = $client->getCommand('PutObject', [
    'Bucket' => $bucket,
    'Key'    => '344772707_360.mp4', // object key (file name) in the S3 bucket
]);

$request = $client->createPresignedRequest($command, '+20 minutes');

// Get the actual pre-signed URL
return (string) $request->getUri();
We can use 'PutObject' to generate a signed URL for uploading files to S3.
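Once the pre-signed PUT URL is handed to the client, the upload itself is a plain HTTP PUT; no AWS credentials are needed on that side because the signature in the URL authorizes the request. A minimal sketch using PHP's curl extension (the function name is ours):

```php
<?php
// Sketch: uploading a local file to a pre-signed S3 PUT URL with curl.
// Works from any client that can make an HTTP PUT request.
function uploadViaPresignedUrl(string $presignedUrl, string $localPath): bool
{
    $stream = fopen($localPath, 'rb');
    $ch = curl_init($presignedUrl);
    curl_setopt_array($ch, [
        CURLOPT_CUSTOMREQUEST  => 'PUT',
        CURLOPT_UPLOAD         => true,
        CURLOPT_INFILE         => $stream,
        CURLOPT_INFILESIZE     => filesize($localPath),
        CURLOPT_RETURNTRANSFER => true,
    ]);
    curl_exec($ch);
    $status = curl_getinfo($ch, CURLINFO_HTTP_CODE);
    curl_close($ch);
    fclose($stream);

    return $status === 200; // S3 answers 200 OK on a successful PUT
}
```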
Make sure this package is installed:
composer require league/flysystem-aws-s3-v3 "^1.0"
Create access credentials on AWS and set these variables in .env file
AWS_ACCESS_KEY_ID=ORJATNRFO7SDSMJESWMW
AWS_SECRET_ACCESS_KEY=xnzuPuatfZu09103/BXorsO4H/xxxxxxxxxx
AWS_DEFAULT_REGION=ap-south-1
AWS_BUCKET=xxxxxxx
AWS_URL=http://xxxxx.s3.ap-south-1.amazonaws.com/
public function uploadToS3(Request $request)
{
    $file = $request->file('file');

    \Storage::disk('s3')->put(
        'path/in/s3/filename.jpg',
        file_get_contents($file->getRealPath())
    );
}

Laravel AWS S3 Storage image cache

I have a Laravel-based web and mobile application that stores images on AWS S3, and I want to add cache support because even a small number of app users produces hundreds and sometimes thousands of GET requests to AWS S3.
To get image from mobile app I use GET request that is being handled by code like this
public function showImage(....) {
    ...
    return Storage::disk('s3')->response("images/".$image->filename);
}
In the response headers I receive, Cache-Control shows no-cache, so I assume the mobile app won't cache this image.
How can I add cache support for this request? Should I do it?
I know that the Laravel documentation suggests caching for file storage - should I implement it for S3? Can it help decrease the GET request count on AWS S3? Where can I find more info about it?
I would suggest using a temporary URL as described here: https://laravel.com/docs/7.x/filesystem#file-urls
Then use the Cache to store it until it is expired:
$value = Cache::remember('my-cache-key', 3600 * $hours, function () use ($hours, $image) {
    // Return the URL so Cache::remember actually stores it
    return Storage::disk('s3')->temporaryUrl(
        "images/" . $image->filename,
        now()->addMinutes(60 * $hours + 1)
    );
});
Whenever you update the object in S3, do this to delete the cached URL:
Cache::forget('my-cache-key');
... and you will get a new URL for the new object.
You could use a CDN service like CloudFlare and set a cache header to let CloudFlare keep the cache for a certain amount of time.
// Using the legacy standalone S3 PHP class (not the AWS SDK):
$s3->putObject(file_get_contents($path), $bucket, $url, S3::ACL_PUBLIC_READ, array(), array('Cache-Control' => 'max-age=31536000, public'));
This way, files would be fetched once by CloudFlare, stored at their servers, and served to users without requesting images from S3 for every single request.
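If you are on Laravel's Storage facade rather than the legacy standalone S3 class, the same header can be set through `put()` options - `CacheControl` is one of the write options the flysystem S3 adapter forwards to S3. A sketch, assuming a configured `s3` disk:

```php
<?php
// Sketch: setting a long-lived Cache-Control header at upload time via
// the Storage facade, so S3 (and any CDN in front of it) serves
// cacheable responses for this object.
use Illuminate\Support\Facades\Storage;

function putCacheable(string $path, string $contents): bool
{
    return Storage::disk('s3')->put($path, $contents, [
        'CacheControl' => 'max-age=31536000, public',
        'visibility'   => 'public',
    ]);
}
```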
See also:
How can I reduce my data transfer cost? Amazon S3 --> Cloudflare --> Visitor
How to set the Expires and Cache-Control headers for all objects in an AWS S3 bucket with a PHP script

How to control access to files at another server in Laravel

I have a host for my Laravel website and another (non-Laravel) host for stored files. Direct access to my files is blocked completely by default, and I want to control access to them by creating temporary links on my Laravel site. I know how to code; I just want to know the idea of how to do it (not the details).
From the Laravel docs:

Temporary URLs: For files stored using the s3 or rackspace driver, you may create a temporary URL to a given file using the temporaryUrl method. This method accepts a path and a DateTime instance specifying when the URL should expire:

$url = Storage::temporaryUrl(
    'file.jpg', now()->addMinutes(5)
);
You could also make your own solution by directing all image request through your own server and making sure the file visibility is set to private.
Here is an example of how a controller could return image from your storage
public function get($path)
{
    $file = Storage::disk('s3')->get($path);

    // Do your temp link solution here

    return response($file, 200)->header('Content-Type', 'image/png');
}
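For the "temp link solution" placeholder above, Laravel's signed routes are one way to issue expiring links that doesn't depend on S3-specific features, which also fits the case where the files sit on a non-Laravel host. A sketch; the route name `files.show` is a placeholder for your own route:

```php
<?php
// Sketch: issuing an expiring link with Laravel's signed routes, then
// letting the "signed" middleware reject expired or tampered links
// before the controller proxies the private file.
use Illuminate\Support\Facades\URL;

function makeTemporaryLink(string $path): string
{
    return URL::temporarySignedRoute(
        'files.show',
        now()->addMinutes(5),
        ['path' => $path]
    );
}

// Register the route with the built-in middleware so invalid links 403:
// Route::get('/files', ...)->name('files.show')->middleware('signed');
```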
What I am using right now is the Flysystem integration provided in Laravel. It uses simple drivers for working with local filesystems, Amazon S3, and several other storage providers, so it doesn't matter whether the file server is a Laravel server or not.
Even better, it's very simple to switch between servers by just changing the disk configuration.
As far as I know, we can also create temporary URLs for s3 and rackspace with this, by calling the temporaryUrl method. Caching is already built in.
If your files are uploaded to an AWS S3 server, then:
use Storage;
use Carbon\Carbon;

$file_path = "4/1563454594.mp4";

if (Storage::disk('s3')->exists($file_path)) {
    // link expiration time
    $urlExpires = Carbon::now()->addMinutes(1);
    try {
        $tempUrl = Storage::disk('s3')->temporaryUrl($file_path, $urlExpires);
    } catch (\Exception $e) {
        // temporaryUrl throws if the disk's driver doesn't support it
        return response($e->getMessage());
    }
}
Your temporary URL will be generated; after the given expiration time (1 minute), it will expire.

Laravel 5.5 displaying Images in AWS S3 Bucket with Private permission

I am developing a portal using Laravel 5.5 deployed on an AWS server, using an S3 bucket for image and file storage. The front end is developed with Angular.
I can successfully store images in the S3 bucket with "private" permissions using the following piece of code:
$filename = pathinfo($filenamewithextension, PATHINFO_FILENAME);
$extension = $request->file('avatar_image')->getClientOriginalExtension();
$filenametostore = $uuid.'_'.time().'.'.$extension;
Storage::disk('s3')->put($filenametostore, fopen($request->file('avatar_image'), 'r+'));
My problem is: how can I send the URL of the uploaded image to the front end so it can be displayed while the image keeps its private permissions?
Does leaving the images set to public pose a security issue?
Your feedback is much appreciated.
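Keeping the objects private is generally the safer default, and the pattern from the answers above applies here too: return a short-lived signed URL that the Angular front end can drop straight into an `<img>` tag. A sketch, reusing the `$filenametostore` key from the upload code:

```php
<?php
// Sketch: returning a short-lived signed URL for a privately stored
// avatar, so the object itself never needs public permissions.
use Illuminate\Support\Facades\Storage;

function avatarUrl(string $filenametostore): string
{
    return Storage::disk('s3')->temporaryUrl(
        $filenametostore,
        now()->addMinutes(10)
    );
}
```

The front end should request a fresh URL when one expires, rather than persisting the signed URL.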
