Laravel 5.5: displaying images in an AWS S3 bucket with private permissions

I am developing a portal using Laravel 5.5, deployed on an AWS server, with an S3 bucket for image and file storage. The front end is built with Angular.
I can successfully store images in the S3 bucket with PRIVATE permissions using the following piece of code:
// Build a unique object key from a UUID, a timestamp, and the original extension
$filename = pathinfo($filenamewithextension, PATHINFO_FILENAME);
$extension = $request->file('avatar_image')->getClientOriginalExtension();
$filenametostore = $uuid.'_'.time().'.'.$extension;
// Stream the uploaded file to the S3 disk
Storage::disk('s3')->put($filenametostore, fopen($request->file('avatar_image'), 'r+'));
My problem is: how can I send the URL of the uploaded image to the front end so it can be displayed, given that the image has PRIVATE permissions?
Does setting the images to PUBLIC pose a security issue?
Your feedback is much appreciated.
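One common approach for private objects (a minimal sketch, not necessarily the only way; it assumes the same s3 disk and the $filenametostore key from above) is to have a controller hand the front end a short-lived signed URL:

use Illuminate\Support\Facades\Storage;

public function getAvatarUrl(string $filenametostore)
{
    // Pre-signed URL that expires after 30 minutes; the object itself
    // stays private in the bucket the whole time.
    return Storage::disk('s3')->temporaryUrl($filenametostore, now()->addMinutes(30));
}

The Angular front end can drop that URL straight into an img tag until it expires, so the objects never need to be public.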

Related

Laravel: how to upload/download files to Amazon S3 using KMS

I am trying to upload/download files from Amazon S3 using KMS. I have already implemented this without KMS using the following code:
// Store the uploaded file under a per-user path, then grab its URL
$filePath = "users/" . $user_id . "/" . $name;
Storage::disk('s3')->put($filePath, file_get_contents($file));
$fileName = Storage::disk('s3')->url($filePath);
Now I am trying to add KMS to my bucket: I have created my key and applied the bucket policy, but I can't find the right syntax for uploading/downloading files with KMS from Laravel/PHP. I am a newbie with S3.
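For reference, a minimal sketch of one way to do this, assuming the Flysystem AWS S3 adapter (which forwards the ServerSideEncryption and SSEKMSKeyId options to S3) and a placeholder key ARN:

use Illuminate\Support\Facades\Storage;

// Upload with server-side encryption under a specific KMS key;
// replace the ARN below with your own key's ARN or ID.
Storage::disk('s3')->put($filePath, file_get_contents($file), [
    'ServerSideEncryption' => 'aws:kms',
    'SSEKMSKeyId'          => 'arn:aws:kms:us-east-1:123456789012:key/your-key-id',
]);

Downloads need no extra parameters: as long as your IAM credentials are allowed to use the key, S3 decrypts the object transparently when you read it back.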

Heroku file upload succeeds, but download fails

Hi, I am building a Laravel Livewire app on Heroku.
The app requires file uploads, and I'm using Livewire's file upload. It works fine locally, but on Heroku the upload reports success, and when I download the file I get a "No file" message. I don't know where the error lies.
Here is my source code:
In the controller:
public function updatedFile()
{
    // $this->validate();
    $fileUpload = new File();
    $fileUpload->url = $this->file->storeAs('public/files/' . auth()->id(), $this->file->getFilename());
    $fileUpload->size_file = $this->getFileSize($this->file);
    $fileUpload->file_name = $this->file->getClientOriginalName();
    $fileUpload->model_name = $this->model_name;
    $fileUpload->model_id = $this->model_id;
    $fileUpload->admin_id = auth()->check() ? auth()->id() : null;
    $fileUpload->save();

    if ($this->model_id == null) {
        $this->list[] = $fileUpload->id;
    }
}
In the view:
<a href="{{ $canDownload ? asset('storage/' . substr($val['url'], 7)) : '#' }}" download>
    <span class="d-block mb-0" style="word-break: break-all;">{{ $val['file_name'] }}</span>
    <small class="kb">{{ $val['size_file'] }}</small>
</a>
The immediate issue may just be with relative vs. absolute paths.
But even once you resolve that you'll find that your uploads disappear frequently and unpredictably. This is due to Heroku's ephemeral filesystem.
To store uploads long-term you'll need to use a third-party service like Amazon S3 or Azure Blob Storage. It looks like Livewire supports this directly:
The previous example demonstrates the most basic storage scenario: Moving the temporarily uploaded file to the "photos" directory on the app's default filesystem disk.
However, you may want to customize the file name of the stored file, or even specify a specific storage "disk" to store the file on (maybe in an S3 bucket for example).
It also provides the following example:
// Store in the "photos" directory in a configured "s3" bucket.
$this->photo->store('photos', 's3');
And links to the relevant Laravel documentation, saying that Livewire uses the same API. Just make sure to configure an S3 bucket.
You can also completely bypass your server, having uploads go directly from the user's browser to your S3 bucket. This is particularly useful with large uploads.
Make sure to use the correct disk when building your download URL.
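As a sketch, assuming the file was stored on the s3 disk and the bucket allows public reads (for a private bucket, temporaryUrl() would be the equivalent):

use Illuminate\Support\Facades\Storage;

// Store the upload on S3 and keep the returned path.
$path = $this->file->store('files', 's3');

// Build the download link from the same disk the file was stored on,
// instead of asset('storage/...'), which points at the local disk.
$url = Storage::disk('s3')->url($path);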

PHP Laravel: upload a file directly to an AWS S3 bucket

Can anyone help me with how to upload a file to an AWS S3 bucket using PHP Laravel? The file should be uploaded directly to S3 using a pre-signed URL.
I will try to answer this question. There are two ways to do this:
You send the pre-signed URL to the frontend client and let it upload the file to S3 directly; once uploaded, it notifies your server.
You receive the file on your server and upload it to S3 from there; in this case you won't need a pre-signed URL, as you will already have configured AWS access inside the project.
Since solution 1 is self-explanatory, I will try to explain solution 2.
Laravel provides the Storage facade for handling filesystem operations. It follows a multi-driver philosophy: public, local disk, Amazon S3, and FTP, plus the option of extending it with custom drivers.
Step 1: Configure your .env file with AWS keys. You will need the following values to start using Amazon S3 as the driver:
AWS Key
AWS Secret
AWS Bucket Name
AWS Bucket Region
Step 2: Assuming you already have the file uploaded to your server, we will now push it to S3.
If you have set s3 as the default disk, the following snippet will do the upload for you:
Storage::put('avatars/1', $fileContents);
If you are using multiple disks, you can upload the file with:
Storage::disk('s3')->put('avatars/1', $fileContents);
We are done! Your file is now uploaded to S3; double-check it inside your S3 bucket.
If you wish to learn more about Laravel Storage, see the Laravel filesystem documentation.
use Storage;
use Config;

// Get the underlying AWS SDK S3 client from the Flysystem adapter
$client = Storage::disk('s3')->getDriver()->getAdapter()->getClient();
$bucket = Config::get('filesystems.disks.s3.bucket');

$command = $client->getCommand('PutObject', [
    'Bucket' => $bucket,
    'Key'    => '344772707_360.mp4', // object key the client will upload to
]);

// Pre-sign the request so it remains valid for 20 minutes
$request = $client->createPresignedRequest($command, '+20 minutes');

// Return the actual pre-signed URL as a string
return (string) $request->getUri();
We can use 'PutObject' here to generate a signed URL for uploading files to S3.
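As an illustration of the client side (simulated here with Guzzle purely for testing; the local file path is an assumption), the holder of the URL simply PUTs the raw bytes to it before it expires:

use GuzzleHttp\Client;

// $presignedUrl comes from the snippet above.
$response = (new Client())->put($presignedUrl, [
    'body' => fopen('/path/to/344772707_360.mp4', 'r'),
]);

// A 200 status means S3 accepted the upload.
echo $response->getStatusCode();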
Make sure this package is installed:
composer require league/flysystem-aws-s3-v3 "^1.0"
Create access credentials on AWS and set these variables in the .env file:
AWS_ACCESS_KEY_ID=ORJATNRFO7SDSMJESWMW
AWS_SECRET_ACCESS_KEY=xnzuPuatfZu09103/BXorsO4H/xxxxxxxxxx
AWS_DEFAULT_REGION=ap-south-1
AWS_BUCKET=xxxxxxx
AWS_URL=http://xxxxx.s3.ap-south-1.amazonaws.com/
public function uploadToS3(Request $request)
{
    $file = $request->file('file');

    // Read the uploaded file from its temporary path and push it to S3
    \Storage::disk('s3')->put(
        'path/in/s3/filename.jpg',
        file_get_contents($file->getRealPath())
    );
}
You can create the access credentials in the AWS IAM console.

Laravel AWS S3 Storage image cache

I have a Laravel-based web and mobile application that stores images on AWS S3, and I want to add cache support because even a small number of app users produces hundreds and sometimes thousands of GET requests to AWS S3.
To get an image, the mobile app sends a GET request that is handled by code like this:
public function showImage(....) {
    ...
    return Storage::disk('s3')->response("images/".$image->filename);
}
The response headers I receive show Cache-Control: no-cache, so I assume the mobile app won't cache this image.
How can I add cache support for this request? Should I do it?
I know that the Laravel documentation suggests caching for file storage. Should I implement it for S3? Can it help decrease the number of GET requests to AWS S3? Where can I find more info about it?
I would suggest using a temporary URL as described here: https://laravel.com/docs/7.x/filesystem#file-urls
Then use the cache to store it until it expires:
$url = Cache::remember('my-cache-key', 3600 * $hours, function () use ($hours, $image) {
    // Return the signed URL so it is what actually gets cached;
    // it is signed for slightly longer than the cache TTL
    return Storage::disk('s3')->temporaryUrl(
        "images/".$image->filename, now()->addMinutes(60 * $hours + 1)
    );
});
Whenever you update the object in S3, do this to delete the cached URL:
Cache::forget('my-cache-key');
... and you will get a new URL for the new object.
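A possible way to wire this into a showImage() action like the one above (a sketch; the Image model and cache key are assumptions, and it redirects to S3 instead of streaming the bytes through Laravel):

public function showImage($imageId)
{
    $image = Image::findOrFail($imageId); // hypothetical Eloquent model
    $hours = 24;

    $url = Cache::remember("image-url-{$image->id}", 3600 * $hours, function () use ($hours, $image) {
        return Storage::disk('s3')->temporaryUrl(
            "images/".$image->filename, now()->addMinutes(60 * $hours + 1)
        );
    });

    // Repeated requests within the TTL reuse the same signed URL,
    // and the client fetches the image straight from S3.
    return redirect()->away($url);
}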
You could use a CDN service like CloudFlare and set a cache header to let CloudFlare keep the cache for a certain amount of time.
// Standalone amazon-s3-php-class style API: set a long-lived
// Cache-Control header on the object at upload time
$s3->putObject(file_get_contents($path), $bucket, $url, S3::ACL_PUBLIC_READ, array(), array('Cache-Control' => 'max-age=31536000, public'));
This way, files would be fetched once by CloudFlare, stored at their servers, and served to users without requesting images from S3 for every single request.
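For comparison, a sketch of the same idea with Laravel's Storage facade (assuming the Flysystem S3 adapter, which forwards CacheControl to S3):

use Illuminate\Support\Facades\Storage;

// Upload publicly with a one-year Cache-Control header so CloudFlare
// (or any CDN or browser) can cache the object aggressively.
Storage::disk('s3')->put($path, file_get_contents($localFile), [
    'visibility'   => 'public',
    'CacheControl' => 'max-age=31536000, public',
]);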
See also:
How can I reduce my data transfer cost? Amazon S3 --> Cloudflare --> Visitor
How to set the Expires and Cache-Control headers for all objects in an AWS S3 bucket with a PHP script

laravel XMLHttpRequest cannot load https://website.com/images/1554690945.png. No 'Access-Control-Allow-Origin' header is present

I'm having a problem with CORS in my Laravel filesystem. I'm trying to cache an image from the URL (which is also my website) in my Ionic application, but it's failing because of the error above. I tried the image from https://reqres.in/api/users/1 and there was no problem caching it in my Ionic application, so I guess the problem is in my Laravel website.
In one of my current projects I have to save 200+ images in my Ionic app from a request to my server.
The way I handled this problem was to convert the image to Base64 using Intervention Image, respond to the request with it, and then save the Base64 in Ionic Storage, like so.
Laravel Controller
public function grabImages(Request $request)
{
    // Encode the image as a Base64 data URL using Intervention Image
    $image = (string) Image::make('public/bar.png')->encode('data-url');

    $data = [
        'base64'    => $image,
        'file_name' => 'test',
    ];

    return $data;
}
Ionic
After receiving the data you can just store it in Ionic Storage and access it wherever you would like, even offline.
To display it, all you have to do is set the image source to the Base64 string.
Using this method also solves a few other problems: the user cannot see the images in the device's gallery, and you can keep them for offline use as long as you would like and remove them whenever.
As ImJT said, I am using barryvdh's laravel-cors package as well.
Hope this answered your question, good luck!
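For reference, a minimal sketch of the package's config/cors.php for a case like this (key names differ between package versions, and the paths and origins below are assumptions):

// config/cors.php (barryvdh/laravel-cors, continued as fruitcake/laravel-cors)
return [
    'paths' => ['images/*'],      // only attach CORS headers to image routes
    'allowed_methods' => ['GET'],
    'allowed_origins' => ['*'],   // or, more strictly, the Ionic app's origin
    'allowed_headers' => ['*'],
    'exposed_headers' => [],
    'max_age' => 0,
    'supports_credentials' => false,
];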
