Laravel get image from S3 instead of local

I have a Laravel 5 app which was previously storing images locally. I have now modified it so that images are stored in an S3 bucket. I previously retrieved the local images like this:
$image_contents = Storage::get('myimages/logos/' . $image->filename);
Now that I have moved to S3 storage, how can I get the image from the S3 bucket instead?

It's the same call, just through the s3 disk; pass the path of the file within your bucket:
$image_contents = Storage::disk('s3')->get('myimages/logos/' . $image->filename);
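For Storage::disk('s3') to work, the s3 disk has to be configured in config/filesystems.php and the league/flysystem-aws-s3-v3 package installed. A minimal sketch of the disk entry, assuming the conventional environment variable names:
// config/filesystems.php: the s3 disk reads its credentials from .env;
// adjust the variable names to match your environment.
'disks' => [
    's3' => [
        'driver' => 's3',
        'key'    => env('AWS_ACCESS_KEY_ID'),
        'secret' => env('AWS_SECRET_ACCESS_KEY'),
        'region' => env('AWS_DEFAULT_REGION'),
        'bucket' => env('AWS_BUCKET'),
    ],
],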

Related

How to add multiple Media from Disk in Laravel Spatie Media Library?

I'm saving images from the local disk to a cloud (DO Spaces) disk with the following code in my controller:
$claim->addMediaFromDisk($front_image, 'public')->usingFileName("front-image")->toMediaCollection('claim-images', 'do_spaces');
$claim->addMediaFromDisk($right_image, 'public')->usingFileName("right-image")->toMediaCollection('claim-images', 'do_spaces');
$claim->addMediaFromDisk($left_image, 'public')->usingFileName("left-image")->toMediaCollection('claim-images', 'do_spaces');
This works, but it saves the images in three different directories in cloud storage, and I want all three images in the same directory.
I see there is a built-in method for adding multiple media from a request, but how can I do it from a disk? I was expecting something like addMultipleMediaFromDisk(!). Is there any solution?
Laravel version: 7.30
Spatie media library version: 7.20
// Save the original file on the local disk and the conversions on S3.
$media = $claim->addMedia($pathToImage)
    ->storingConversionsOnDisk('s3')
    ->toMediaCollection('claim-images', 'local');
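The three separate directories come from the default path generator, which creates one directory per media ID. If you want a claim's images grouped together, you can register a custom path generator instead; a minimal sketch for media library v7 (the class name and directory layout here are my own assumptions):
// app/MediaLibrary/ClaimPathGenerator.php: store all media belonging to
// one model in a single directory keyed by the model id.
namespace App\MediaLibrary;

use Spatie\MediaLibrary\Models\Media;
use Spatie\MediaLibrary\PathGenerator\PathGenerator;

class ClaimPathGenerator implements PathGenerator
{
    public function getPath(Media $media): string
    {
        return 'claim-images/' . $media->model_id . '/';
    }

    public function getPathForConversions(Media $media): string
    {
        return $this->getPath($media) . 'conversions/';
    }

    public function getPathForResponsiveImages(Media $media): string
    {
        return $this->getPath($media) . 'responsive-images/';
    }
}
Point the path_generator key in config/medialibrary.php at this class. File names then have to be unique within the shared directory, which your front-image/right-image/left-image naming already guarantees per claim.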

How to store images with heroku laravel app

With a store route in Laravel I pass data (text) to MySQL and an image to storage. However, I can't store images on local storage: after each new deploy the changes in the file system are reset, so I need to do it with external storage. What's the easiest way to store images (low size) and to have permission to fetch them (with JS)?
You need to store them somewhere persistent, like Amazon S3.
You can use Laravel's Storage system for this. Once you've set up an S3 disk with the right AWS credentials, it's as simple as:
$path = $request->file('your-field')->store('your-folder', 's3');
or:
$path = Storage::disk('s3')->putFile('path/to/folder', new File('/path/to/temporary/file')); // File is Illuminate\Http\File
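To let the browser fetch the stored images, upload them with public visibility and hand the resulting S3 URL to your JavaScript. A minimal sketch, assuming the s3 disk is configured as above:
// Store the upload with public visibility, then resolve a URL the browser can fetch.
$path = $request->file('your-field')->storePublicly('your-folder', 's3');
$url  = Storage::disk('s3')->url($path); // e.g. https://your-bucket.s3.amazonaws.com/your-folder/...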
I encountered the same problem recently and here is what I figured out for a solution:
Heroku does not have persistent storage for images, so you need an external service to store them (some offer free tiers).
OR
You can just move them to the public folder instead of the storage folder, and use this in your view code:
<img src="{{ secure_asset($whereEverYouStoredYourImage->theImageAttribute) }}" alt="example">

upload image bytes to cloudinary upload API

I am using the Cloudinary API to get different resolutions of image/video files. I am able to call the upload API with the following Java code and get the required resolutions:
Map uploadResult = cloudinary.uploader().upload(file, options);
Now I need to do the same from AWS Lambda in Java, since I need to get the resolutions of content stored in an S3 bucket. I can think of 2 possible ways of doing this: 1) point Cloudinary at the S3 URLs, which requires setup; 2) pass a byte array buffer or an IO input stream. Is there any sample Java code available for option 2?
I am referring to the upload API here:
https://cloudinary.com/documentation/image_upload_api_reference#upload
There is a related reference for Python:
Correct way for uploading image bytes to cloudinary
Please check this example:
import com.cloudinary.utils.ObjectUtils;
import java.io.File;
import java.nio.file.Files;

File file = new File("<image_path>");
byte[] fileContent = Files.readAllBytes(file.toPath());
cloudinary.uploader().upload(fileContent, ObjectUtils.emptyMap());
--Yakir
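For option 2 from a Lambda, you can read the object's bytes with the AWS SDK for Java and hand them to the same upload call. A minimal sketch, assuming the v1 SDK and placeholder bucket/key names:
// Read the object from S3 as a byte array, then upload those bytes to Cloudinary.
import com.amazonaws.services.s3.AmazonS3;
import com.amazonaws.services.s3.AmazonS3ClientBuilder;
import com.amazonaws.services.s3.model.S3Object;
import com.amazonaws.util.IOUtils;
import com.cloudinary.utils.ObjectUtils;
import java.util.Map;

AmazonS3 s3 = AmazonS3ClientBuilder.defaultClient(); // uses the Lambda role's credentials
try (S3Object object = s3.getObject("my-bucket", "images/my-content.jpg")) {
    byte[] bytes = IOUtils.toByteArray(object.getObjectContent());
    Map uploadResult = cloudinary.uploader().upload(bytes, ObjectUtils.emptyMap());
}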

Parse Files Migration

I have migrated Parse Server and pointed all the client applications to the new standalone Parse Server. I used the parse-files-utils tool to migrate existing files from Parse to AWS S3. The migration completed properly and I can see the images in my S3 bucket. There is an option to add a prefix to the migrated files, which I have done.
Now, when I check the URLs of the images on the client website, they still start with 'tfss', which means they are still being served from the Parse-hosted S3 bucket. What steps do I need to take to make sure the images are served from my S3 bucket?
Do I need to remove the fileKey from parse server or what?
The config that I used for file migration is as follows
module.exports = {
  applicationId: <APPLICATION ID>,
  masterKey: <MASTER KEY>,
  mongoURL: <NEW MONGODB URL>,
  serverURL: "https://api.parse.com/1",
  filesToTransfer: 'all',
  renameInDatabase: false,
  renameFiles: false,
  aws_accessKeyId: <NEW S3 BUCKET ACCESS KEY>,
  aws_secretAccessKey: <NEW S3 BUCKET SECRET>,
  aws_bucket: <BUCKET NAME>,
  aws_bucketPrefix: "prod_migrated_"
};
Thanks in advance. Please help with further steps.
Without having your Parse-Server configuration, it is a bit hard to know how you have it setup, but here are a few things to check:
If you have all your files in S3 and all the clients are pointing to your new Parse Server then you can remove the fileKey parameter from your Parse Server configuration. This will prevent Parse Server from formatting the file URL with the hosted hostname and fileKey.
Verify that in your Parse Server filesAdapter configuration for S3 you have set proper baseUrl, bucketPrefix, and directAccess parameters, as stated in the documentation. The baseUrl should be something similar to https://<BUCKET_NAME>.s3.amazonaws.com.
Verify that you have also setup the proper bucket policy to grant read privileges to be able to get the URL (see S3 adapter documentation). You can verify this by trying to access one of the images in your S3 bucket in the browser.
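For reference, a minimal sketch of such a filesAdapter block in the Parse Server configuration (the adapter module name and placeholder values are assumptions; match them to your deployment):
// Parse Server config: serve files directly from your own S3 bucket.
filesAdapter: {
  module: 'parse-server-s3-adapter',
  options: {
    bucket: '<BUCKET NAME>',
    bucketPrefix: 'prod_migrated_',
    baseUrl: 'https://<BUCKET_NAME>.s3.amazonaws.com',
    directAccess: true
  }
}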

Uploading an image to S3 using aws-sdk v2

I'm having a hell of a time working with the aws-sdk documentation; all of the links I follow seem outdated and unusable.
I'm looking for a straightforward implementation example of uploading an image file to an S3 bucket in Ruby.
Let's say the image path is screenshots/image.png and I want to upload it to the bucket my_bucket. My AWS creds live in my ENV.
Any advice is much appreciated.
Here is how you can upload a file from disk to the named bucket and key:
s3 = Aws::S3::Resource.new
s3.bucket('my_bucket').object('key').upload_file('screenshots/image.png')
That is the simplest method. You should replace 'key' with the key you want it stored with in Amazon S3. This will automatically upload large files for you using the multipart upload APIs and will retry failed parts.
If you prefer to upload always using PUT object, you can call #put or use an Aws::S3::Client:
# using put
s3 = Aws::S3::Resource.new
File.open('screenshots/image.png', 'rb') do |file|
  s3.bucket('my_bucket').object('key').put(body: file)
end

# using a client
s3 = Aws::S3::Client.new
File.open('screenshots/image.png', 'rb') do |file|
  s3.put_object(bucket: 'my_bucket', key: 'key', body: file)
end
Also, the API reference documentation for the v2 SDK is here: http://docs.aws.amazon.com/sdkforruby/api/index.html
