How can I download files from an S3 bucket to a local directory using Laravel?

I am currently confused about why my exec command for downloading files from S3 to a local directory is not working in Laravel, even though the command I built works well in the AWS command line interface. Please see the reference code below.
Here is the sample code from my public function.
Function:

    public function send_zip_files_to_store()
    {
        $source = 's3://compexp/11-10-2019/01790exp.zip';
        $destination = 'C:\xampp\htdocs';
        exec('aws cp sync ' . $source . ' ' . $destination);
    }
Route:

    Route::get('/send_zip_files_to_store', 'DashController@send_zip_files_to_store')->name('send_zip_files_to_store');
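Note that the AWS CLI has no cp sync subcommand: aws s3 cp copies a single object and aws s3 sync mirrors a directory, so the string built above is invalid in any environment. A minimal sketch of the corrected call for a single zip (the bucket path and destination are the asker's own; also bear in mind that the web server's PATH may not include the aws binary, so a full path to it can be needed under exec):

    public function send_zip_files_to_store()
    {
        $source = 's3://compexp/11-10-2019/01790exp.zip';
        $destination = 'C:\xampp\htdocs';

        // "aws s3 cp" copies one object; escapeshellarg() guards both paths.
        exec('aws s3 cp ' . escapeshellarg($source) . ' ' . escapeshellarg($destination), $output, $status);
    }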

Related

Laravel 5.5 - Store image to S3

I am currently writing images created by ImageMagick to local storage in a Laravel 5.5 app like this:

    $imagick->writeImages(storage_path('app/files/' . $tempfoldername . '_' . $title . '/' . $title . '_page.jpg'), false);

I have now set up an S3 bucket on AWS to store the images instead. How can I modify the above statement to store them in the bucket?
I have already configured Laravel with the S3 details and can successfully read and write to the S3 bucket.
Should I keep doing what I am doing and move the files afterwards, or can I write to S3 directly from that ImageMagick statement?
Since you're processing the image with ImageMagick, you have two options.
First option
Store the image in a local folder, then upload it, then unlink it:

    Storage::disk('s3')->put($title . '_page.jpg', new File($filePath)); // Illuminate\Http\File
    unlink($filePath);

Second option
Write the image blob directly to S3:

    Storage::disk('s3')->put($title . '_page.jpg', $imagick->getImageBlob());
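Putting the direct option together with the original multi-page writeImages() call, a minimal sketch (the 's3' disk and the $tempfoldername/$title variables come from the question; the per-page filename pattern is an assumption):

    use Illuminate\Support\Facades\Storage;

    // Imagick implements Iterator, so each page can be streamed to S3
    // without ever being written to local disk.
    foreach ($imagick as $i => $page) {
        $page->setImageFormat('jpeg'); // match the .jpg extension

        Storage::disk('s3')->put(
            $tempfoldername . '_' . $title . '/' . $title . '_page-' . $i . '.jpg',
            $page->getImageBlob()
        );
    }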

Laravel Jobs: Can't delete file after queue processed

I just can't figure out why this is happening. When I dispatch a job which uploads a file to my Amazon S3, I want to delete the file afterwards and update a relation (channel). Everything works except deleting the file. I can't delete the file even manually; Windows says some process is still using it, and to delete it manually I have to end the queue worker first. The weird part is that the file is uploaded to my S3, the relation is updated in the database, and the queue worker in the terminal says the job is processed. There are no failed or running jobs in the jobs table. Anyone know what's going on?
My job's handle method is below.
    public function handle()
    {
        $path = storage_path() . "/uploads/" . $this->fileId;
        $fileName = $this->fileId . ".png";

        if (Storage::disk("s3images")->put("profile/" . $fileName, fopen($path, "r+"))) {
            File::delete($path);
        }

        $this->channel->image_filename = $fileName;
        $this->channel->save();
    }
Try unlink($path) instead of File::delete(). It worked for me.
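The likelier root cause, though, is the fopen() handle in the job: it is never closed, so on Windows the open handle keeps the file locked until the worker process exits. A sketch of the handle-closing variant (disk name and paths are from the question; the is_resource() guard is there because some Flysystem adapters close the stream themselves):

    $stream = fopen($path, 'r'); // read-only is enough for an upload
    $uploaded = Storage::disk('s3images')->put('profile/' . $fileName, $stream);

    if (is_resource($stream)) {
        fclose($stream); // release the handle so Windows will allow deletion
    }

    if ($uploaded) {
        File::delete($path);
    }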

How to download a PDF file from Laravel storage

I just saw a few tutorials about Laravel storage, but I can't understand how to create a link for the user to download the actual file that was uploaded.
I managed to upload a PDF file to:

    storage/app/public/ccf/b443e9db8dc05f503ede6e670c34bf92.pdf

I ran the artisan command php artisan storage:link, but I can't figure out what URL I should put in the link for the user to download the file.
I tried:

    <?php
    $path1 = asset('storage/public/ccf/b443e9db8dc05f503ede6e670c34bf92.pdf');
    $path2 = storage_path('ccf/b443e9db8dc05f503ede6e670c34bf92.pdf');
    $path3 = Storage::url('b443e9db8dc05f503ede6e670c34bf92.pdf');
    ?>

I rendered each of these as a link (Path1, Path2, Path3), but none of them work.
This works:

    $path5 = Storage::url('public/ccf/b443e9db8dc05f503ede6e670c34bf92.pdf');
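If the file should not be publicly reachable at all, another option (Laravel 5.5+) is to stream it through a route with Storage::download(). A minimal sketch, assuming the default local disk and the path from the question (the route URI and download filename are illustrative):

    use Illuminate\Support\Facades\Storage;

    Route::get('/download-ccf', function () {
        // First argument: the path relative to the disk root;
        // second: the filename the browser should save the file as.
        return Storage::download('public/ccf/b443e9db8dc05f503ede6e670c34bf92.pdf', 'document.pdf');
    });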

Laravel 5 can't save image

When trying to save an image on my server, in my Laravel 5 project with the Intervention Image class, with the following code:

    $pathFull = public_path('images/original/brand/' . $filename);
    $img = Image::make($image->getRealPath());
    $img->encode('jpg')->save($pathFull);
I get the error:

    NotWritableException in Image.php line 138:
    Can't write image data to path
    (/var/www/mydomain.com/public/images/original/brand/nanan.jpg)
So I've changed permissions on the folder (which already exists) with:

    sudo chmod -R 775 /var/www/mydomain.com/public/images

I've checked the permissions and they are 775, so that command works. I tried it locally (XAMPP) and it worked fine, and the directory paths are fine. I keep getting this error unless I use 777, which works but is dangerous.
What else can I try to keep the server safe and not use 777?
Maybe the paths are not the same; check them. Notice the directory in the error message. If somewhere your code builds the path as:

    $pathFull = public_path('images/brand/thumb/' . $filename);

while the error says:

    Can't write image data to path
    (/var/www/mydomain.com/public/images/original/brand/nanan.jpg)

then make sure the chmod targets the directory from the error:

    sudo chmod -R 775 /var/www/mydomain.com/public/images/original/brand
I recently made a method like yours, and it works fine for me. Here it is:

    $image = Input::file('image');
    $filename = Input::file('image')->getClientOriginalName();
    $path = public_path('images/' . Auth::user()->email . '/' . $filename);
    $img = Image::make($image->getRealPath());
    $img->encode('jpg')->save($path);
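When 775 on the correct directory still fails, the usual culprit on Linux is ownership: the PHP/web-server user (e.g. www-data) is neither the owner of the directory nor in its group, so the group write bit never applies. A minimal pre-flight sketch under that assumption ($img and $filename come from the question's code):

    $dir = public_path('images/original/brand');

    if (! is_dir($dir)) {
        mkdir($dir, 0775, true); // create the full path if any segment is missing
    }

    if (! is_writable($dir)) {
        // 775 only helps if the PHP user owns the directory or is in its group;
        // otherwise fix ownership, e.g.:
        //   sudo chown -R www-data:www-data /var/www/mydomain.com/public/images
        throw new \RuntimeException("{$dir} is not writable by the PHP process");
    }

    $img->encode('jpg')->save($dir . '/' . $filename);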

Laravel: create an upload directory on the fly

I have got this:

    $destinationPath = public_path() . '/img/' . $username;

which I thought would create the directory, but it came up with:

    Symfony \ Component \ HttpFoundation \ File \ Exception \ FileException
    Unable to create the "/img/alvarito" directory

Any idea what is wrong? It seems to me that it is also trying to create /img/, which already exists. What I naturally want is for subdirectories to be created inside that img directory for each user as they upload their files.
Thanks a lot.
You may try this:

    $destinationPath = public_path() . '/img/' . $username;

    if (! file_exists($destinationPath)) {
        File::makeDirectory($destinationPath);
    }

This will create the directory if it doesn't exist.
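For the recursive case (several missing path segments, or concurrent requests racing to create the same folder), File::makeDirectory() also accepts mode, recursive, and force arguments. A fuller sketch, with the $request/move() upload step as an illustrative assumption:

    use Illuminate\Support\Facades\File;

    $destinationPath = public_path() . '/img/' . $username;

    // mode 0755; recursive = true creates /img too if it is missing;
    // force = true suppresses the error if another request created it first
    File::makeDirectory($destinationPath, 0755, true, true);

    $request->file('image')->move($destinationPath, $filename);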
