How to prevent files from being public on DigitalOcean Spaces? - laravel

Hi everyone, I'm stuck on DigitalOcean Spaces: I want to keep my files from being publicly accessible.
First of all, I set the .env file like this:
DO_SPACES_KEY= THE KEY
DO_SPACES_SECRET= THE SECRET
DO_SPACES_ENDPOINT=https://sgp1.digitaloceanspaces.com
DO_SPACES_REGION=sgp1
DO_SPACES_BUCKET= MY BUCKET NAME
DO_SPACES_URL=https://mydomain.sgp1.digitaloceanspaces.com
Then I set config/filesystems.php:
'do_spaces' => [
    'driver' => 's3',
    'key' => env('DO_SPACES_KEY'),
    'secret' => env('DO_SPACES_SECRET'),
    'region' => env('DO_SPACES_REGION'),
    'bucket' => env('DO_SPACES_BUCKET'),
    'url' => env('DO_SPACES_URL'),
    'endpoint' => env('DO_SPACES_ENDPOINT'),
    'visibility' => 'public',
],
After that, I made the controller store the file:
// convert image name
$stringImageReFormat = base64_encode('_'.time());
$ext = $request->file('image')->getClientOriginalExtension();
$imageName = $stringImageReFormat.".".$ext;
$imageEncoded = File::get($request->image);
// upload & insert
Storage::disk('do_spaces')->put('public/user_image/'.$imageName, $imageEncoded);
// insert data into table
$user = new User();
$user->image = $imageName;
$user->save();
In my Blade template, I retrieve the file like this:
{{ Storage::disk('do_spaces')->url('public/user_image/'.$user->image) }}
This is what I get when I don't set the visibility to public:
This XML file does not appear to have any style information associated with it. The document tree is shown below.
<Error>
<Code>AccessDenied</Code>
<BucketName>mybucket</BucketName>
<RequestId>tx0000000000000088d0617-00607228ae-13200e4-sgp1b</RequestId>
<HostId>13200e4-sgp1b-sgp1-zg02</HostId>
</Error>
If I set the visibility in filesystems.php to public, I can see the files without authentication.
Thank you in advance for any help or advice.
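For what it's worth, a common pattern for this (a sketch, not from the original post; `put` with a visibility option and `temporaryUrl` are standard Laravel Storage API calls for S3-compatible disks) is to keep the objects private and serve them through short-lived signed URLs:

```php
// Sketch: store the object privately, then hand out expiring signed URLs.
// Assumes the 'do_spaces' disk from the question, with its
// 'visibility' set to 'private' instead of 'public'.

// In the controller, store without a public-read ACL:
Storage::disk('do_spaces')->put(
    'user_image/'.$imageName,
    $imageEncoded,
    ['visibility' => 'private']
);

// In the Blade template, replace ->url() with a temporary signed URL:
// {{ Storage::disk('do_spaces')->temporaryUrl('user_image/'.$user->image, now()->addMinutes(5)) }}
```

The signed URL expires after the interval you pass, so the file is never reachable without going through your application first.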

Related

File upload to s3 not working after setting visibility to public

Trying to configure Laravel to upload to my AWS S3 bucket. It works fine until I change the visibility to public; then it seems to work (at least it does not show any error), but nothing gets uploaded to AWS.
Here is the part in my register controller where I am uploading a profile picture
if ($request->hasFile('avatar')) {
    $file = $request->file('avatar');
    $filename = $file->getClientOriginalName();
    $file->storeAs('avatars/' . $user->id, $filename, 's3');
    $user->update([
        'avatar' => $filename,
    ]);
}
And here is the configuration for s3 in filesystems.php
's3' => [
    'driver' => 's3',
    'key' => env('AWS_ACCESS_KEY_ID'),
    'secret' => env('AWS_SECRET_ACCESS_KEY'),
    'region' => env('AWS_DEFAULT_REGION'),
    'bucket' => env('AWS_BUCKET'),
    'url' => env('AWS_URL'),
    'endpoint' => env('AWS_ENDPOINT'),
    'use_path_style_endpoint' => env('AWS_USE_PATH_STYLE_ENDPOINT', false),
    'throw' => false,
    'visibility' => 'public',
],
Without the 'visibility' => 'public' line it works fine, but as soon as I add it, nothing gets uploaded anymore.
I had this before; you need to edit Object Ownership so that ACLs are enabled. By default they are disabled, which means every object in the bucket is owned by you and private, but with ACLs enabled you can choose per object whether it is public or private.
If you are creating a new S3 bucket, enable ACLs under Object Ownership.
If you have already created a bucket, go to Permissions and edit Object Ownership to enable ACLs.
What worked for me was enabling public access to my bucket and adding a bucket policy that makes the files in question publicly readable. You can read the official docs here.
Also, make sure to remove the 'visibility' => 'public' line from the Laravel config, as this prevents the upload from taking place.
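For reference, a minimal bucket policy of the kind described above could look like this (my-bucket is a placeholder for your actual bucket name):

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "PublicReadGetObject",
      "Effect": "Allow",
      "Principal": "*",
      "Action": "s3:GetObject",
      "Resource": "arn:aws:s3:::my-bucket/*"
    }
  ]
}
```

This grants anonymous read access to every object in the bucket, which is why Laravel no longer needs to set a per-object public ACL at upload time.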

Laravel and React Upload image via Storage Facades fail

I want to upload a logo for my reports.
This is a snippet from my uploadLogo function
$file = $request->file;
Storage::disk('logo')->put('logo.png', $file);
I've created a 'logo' disk in filesystems.php like this:
'logo' => [
    'driver' => 'local',
    'root' => public_path() . '/img',
    'url' => env('APP_URL').'/public',
    'visibility' => 'public',
],
But it ended up creating the file in a 'random' (or misunderstood) location, with a random name:
public\img\logo.png\M4FGLpZzAsyxn8NHiJLxo95EoP7I3CkIWvqkiQsv.png
What am I missing in my setup here?
You can store the file directly from the request's file (UploadedFile) object, and use storeAs to save it under the name you supply. The Storage::put and UploadedFile::store methods generate random names for the files being stored.
$path = $request->file->storeAs('img', 'logo.png', 'logo');
More info https://laravel.com/docs/8.x/filesystem#storing-files and https://laravel.com/docs/8.x/requests#storing-uploaded-files
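A short sketch of the difference, using the 'logo' disk from the question (the paths in the comments are illustrative, not exact):

```php
// Storage::put with an UploadedFile treats 'logo.png' as a directory and
// stores the file under a generated hash name inside it:
Storage::disk('logo')->put('logo.png', $request->file('file'));
// e.g. logo.png/M4FGLpZzAsyxn8NHiJLxo95EoP7I3CkIWvqkiQsv.png

// storeAs keeps the exact name you supply:
$path = $request->file('file')->storeAs('img', 'logo.png', 'logo');
// e.g. img/logo.png
```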

Download files stored in AWS S3 from Laravel Nova results in bug

We are using Nova package for our admin backend inside our Laravel App. All files and images are stored in the AWS S3 bucket.
After trying to download a file from Nova, the download begins with the name download.json and a 'Server Error' message.
Files are stored correctly in S3 (I can check this manually), and the path to them inside S3 is correctly stored in the database.
Here is the code we use to create a download field in Nova
->download(function () {
    return Storage::disk('s3')->download($this->name);
})
->onlyOnDetail()
$this->name holds the path inside the s3 bucket.
config/filesystems.php is also defined:
'disks' => [
    ...
    's3' => [
        'driver' => 's3',
        'key' => env('AWS_ACCESS_KEY_ID'),
        'secret' => env('AWS_SECRET_ACCESS_KEY'),
        'region' => env('AWS_DEFAULT_REGION'),
        'bucket' => env('AWS_BUCKET'),
        'url' => env('AWS_URL'),
    ],
The Nova documentation did not help me with this problem. Any input would be really helpful.
UPDATE:
The problem was not in code but in Configuration.
Without changing the configuration, the following code did help:
Text::make('File/Document', function () {
    $linkToFile = Storage::disk('s3')->temporaryUrl($this->name, now()->addMinutes(1));
    return '<a href="'.$linkToFile.'">Download file</a>';
})
->asHtml(),
Hard to see any issue without seeing the complete function, but make sure your name property $this->name has the same value as your remote file 'key' as shown in your Amazon S3 bucket.
Also, make sure your .env file is correct and contains the following values:
AWS_ACCESS_KEY_ID=your_access_key
AWS_SECRET_ACCESS_KEY=your_secret_access_key
AWS_DEFAULT_REGION=your_default_region
AWS_BUCKET=your_bucket_name
AWS_URL=your_url #if applicable
Hope this makes sense.
Edit: One more thing, in filesystems.php, this line:
'url' => env('AWS_URL'),
was changed in Laravel 6.x according to this bug and became:
'endpoint' => env('AWS_URL'),
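Under that change, the s3 disk from the question would be defined like this (a sketch; env variable names as above):

```php
's3' => [
    'driver' => 's3',
    'key' => env('AWS_ACCESS_KEY_ID'),
    'secret' => env('AWS_SECRET_ACCESS_KEY'),
    'region' => env('AWS_DEFAULT_REGION'),
    'bucket' => env('AWS_BUCKET'),
    'endpoint' => env('AWS_URL'), // formerly 'url' => env('AWS_URL')
],
```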

No such file, Storing file into the local storage is not working on production

I have a Laravel application deployed on Elastic Beanstalk. I'm working on a feature where I need to get a zip file from an S3 bucket and store it in local storage, in order to use laravel-zip to remove a PDF file from that zip.
The code works locally, but I'm getting a 'No such file' error after testing in production:
// get the file from s3 and store it into local storage
$contents = Storage::disk('s3')->get($file_name);
$zip_local_name = 'my_file.zip';
Storage::disk('local')->put($zip_local_name, $contents);
// use laravel-zip to remove the unwanted pdf file from the result
$manager = new ZipManager();
$file_path = storage_path('app').'\\'.$zip_local_name; // register existing zips
$manager->addZip(Zip::open($file_path));
$zip = $manager->getZip(0);
$zip->delete($data["Iso_Bus"]["field_name"].'.pdf');
$zip->close();
I made sure that the file exists on S3, so I think my main problem is that the file is not being stored in local storage.
Any help is appreciated.
Edit: filesystems configuration:
'disks' => [
    'local' => [
        'driver' => 'local',
        'root' => storage_path('app'),
    ],
    'public' => [
        'driver' => 'local',
        'root' => storage_path('app/public'),
        'url' => env('APP_URL').'/storage',
        'visibility' => 'public',
    ],
    's3' => [
        'driver' => 's3',
        'key' => '***',
        'secret' => '***',
        'region' => '***',
        'bucket' => '****',
        'url' => '****',
    ],
],
You're building the full path for the file incorrectly; try this one instead:
$file_path = Storage::disk('local')->path($zip_local_name);
Note: it's better to check whether Storage::put was successful before continuing:
// get the file from s3 and store it into local storage
$contents = Storage::disk('s3')->get($file_name);
$zip_local_name = 'my_file.zip';
if (Storage::disk('local')->put($zip_local_name, $contents)) {
    // `Storage::put` returns `true` on success, `false` on failure.
    // use laravel-zip to remove the unwanted pdf file from the result
    $manager = new ZipManager();
    $file_path = Storage::disk('local')->path($zip_local_name);
    $manager->addZip(Zip::open($file_path));
    $zip = $manager->getZip(0);
    $zip->delete($data["Iso_Bus"]["field_name"].'.pdf');
    $zip->close();
}

Storing files outside the Laravel 5 Root Folder

I am developing a Laravel 5 project and storing image files using Imagine. I want to store my image files in a folder outside the project's root folder, and I am stuck at the moment. The external folder where the image files are stored should be accessible via a subdomain, something like http://cdn.example.com. Looking forward to your solutions.
The Laravel documentation could give you a helping hand.
Otherwise, you could go to config/filesystems.php and add your own custom storage path for both local and production:
return [
    'default' => 'custom',
    'cloud' => 's3',
    'disks' => [
        'local' => [
            'driver' => 'local',
            'root' => storage_path().'/app',
        ],
        'custom' => [
            // the built-in 'local' driver also accepts a root outside the project
            'driver' => 'local',
            'root' => '../path/to/your/new/storage/folder',
        ],
        's3' => [
            'driver' => 's3',
            'key' => 'your-key',
            'secret' => 'your-secret',
            'region' => 'your-region',
            'bucket' => 'your-bucket',
        ],
        'rackspace' => [
            'driver' => 'rackspace',
            'username' => 'your-username',
            'key' => 'your-key',
            'container' => 'your-container',
            'endpoint' => 'https://identity.api.rackspacecloud.com/v2.0/',
            'region' => 'IAD',
        ],
    ],
];
Get your path from the base_path() function, then append your desired folder location to the string.
Suppose your
base_path() = '/home/user/user-folder/your-laravel-project-folder/'
Then your desired path should be like this:
$path = '/home/user/user-folder/your-target-folder/'.$imageName;
Make sure you have read and write permissions.
You can move all or part of the storage folder to any folder on your server. You must then put a link from the old folder to the new one:
ln -s new_folder_path old_folder_path
You can also make a new virtual host to serve the new folder path.
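As a sketch of that last step (assuming nginx; the server name and folder path are placeholders taken from the examples above), a virtual host serving the external folder could look like:

```nginx
server {
    listen 80;
    server_name cdn.example.com;

    # serve files straight from the external storage folder
    root /home/user/user-folder/your-target-folder;

    location / {
        try_files $uri =404;
    }
}
```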
