Is there a way to zip and download files and folders which are in an Amazon S3 bucket, together, in Laravel? I want to zip the three folders and one file in the picture together and download them.
Here's a half-baked solution in a route file. Hope it helps.
https://flysystem.thephpleague.com/docs/adapter/zip-archive/
composer require league/flysystem-ziparchive
I put this in routes/web.php just to play with.
<?php

use Illuminate\Support\Facades\Storage;
use League\Flysystem\Filesystem;
use League\Flysystem\ZipArchive\ZipArchiveAdapter;

Route::get('zip', function () {
    // see Laravel's config/filesystems.php for the source disk
    $source_disk = 's3';
    $source_path = '';

    $file_names = Storage::disk($source_disk)->files($source_path);

    $zip = new Filesystem(new ZipArchiveAdapter(public_path('archive.zip')));

    foreach ($file_names as $file_name) {
        $file_content = Storage::disk($source_disk)->get($file_name);
        $zip->put($file_name, $file_content);
    }

    $zip->getAdapter()->getArchive()->close();

    return redirect('archive.zip');
});
You'll definitely want to do something different than just plopping it in the public dir. Maybe stream it straight out as a download, or save it somewhere better. Feel free to post comments/questions and we can discuss.
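For example, here's a minimal sketch of that idea using the same flysystem-ziparchive setup as above: swap files() for allFiles() so the sub-folders are included, build the temporary archive under storage/app, and send it as a download that is deleted afterwards. The 'zip-download' route name and the archive location are just placeholders.
Route::get('zip-download', function () {
    $source_disk = 's3';
    $source_path = '';

    // allFiles() walks sub-folders, so the folders end up inside the zip too
    $file_names = Storage::disk($source_disk)->allFiles($source_path);

    $zip_path = storage_path('app/archive.zip'); // assumed temp location, not public
    $zip = new Filesystem(new ZipArchiveAdapter($zip_path));

    foreach ($file_names as $file_name) {
        $zip->put($file_name, Storage::disk($source_disk)->get($file_name));
    }

    $zip->getAdapter()->getArchive()->close();

    // stream it out as a download and clean up the temp file afterwards
    return response()->download($zip_path)->deleteFileAfterSend(true);
});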
I did it the following way, after looking at some solutions, by streaming the zip directly to the client using https://github.com/maennchen/ZipStream-PHP :
// Assumes the ZipStream-PHP v2 API, i.e.:
// use ZipStream\ZipStream;
// use ZipStream\Option\Archive as ArchiveOptions;
if ($uploads) {
    return response()->streamDownload(function () use ($uploads) {
        $opt = new ArchiveOptions();
        $opt->setContentType('application/octet-stream');

        $zip = new ZipStream("uploads.zip", $opt);

        foreach ($uploads as $upload) {
            try {
                $file = Storage::readStream($upload->path);
                $zip->addFileFromStream($upload->filename, $file);
            }
            catch (Exception $e) {
                \Log::error("unable to read the file at storage path: $upload->path and output to zip stream. Exception is " . $e->getMessage());
            }
        }

        $zip->finish();
    }, 'uploads.zip');
}
I am doing CRUD operations for images. When I do the update operation, the image gets updated but the old image doesn't get deleted. The following code updates only the file name in the database, but I also need to remove the old image from the destination folder while updating, or the folder size will grow too large. Any ideas would be great. Here is my code.
<input type="file" name="profileimage" value="{{ isset($editabout) ? $editabout->profileimage : '' }}" class="custom-file-input" id="exampleInputFile">
Here is my controller
public function update(updateAboutusProfile $request, AboutUs $about)
{
    $data = $request->only(['profiletext']);

    if ($request->hasFile('profileimage')) {
        $profileimage = $request->profileimage->store('aboutus', 'public');
        $oldimage = $about->profileimage;
        Storage::delete($oldimage);
        $data['profileimage'] = $profileimage;
    }

    $about->update($data);

    toastr()->success('Profile updated successfully');

    return redirect(route('about.index'));
}
What is the error I need to resolve?
Thank you
Try setting the disk you used to store the file:
Storage::disk('public')->delete($oldimage);
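In context, the whole update method from the question might then look like this (a sketch with only the delete call changed to target the 'public' disk, since that is the disk store() was called with):
public function update(updateAboutusProfile $request, AboutUs $about)
{
    $data = $request->only(['profiletext']);

    if ($request->hasFile('profileimage')) {
        // stored on the 'public' disk, so delete from the same disk
        $profileimage = $request->profileimage->store('aboutus', 'public');

        if ($about->profileimage) {
            Storage::disk('public')->delete($about->profileimage);
        }

        $data['profileimage'] = $profileimage;
    }

    $about->update($data);

    toastr()->success('Profile updated successfully');

    return redirect(route('about.index'));
}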
To delete an image from your server, you have to reference the file's location in the server's directory; you cannot reference it by its URL to delete it.
Laravel's publicly served files are located in the public folder.
Example: your files are located in public/images
$image_path = public_path('images/filename.ext'); // a filesystem path, not a URL
if (File::exists($image_path)) {
    File::delete($image_path);
}
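For files that were saved through the Storage facade instead of written directly into public/, the same check could be done against the disk (a sketch assuming the 'public' disk; the path is relative to the disk root):
$image_path = 'images/filename.ext'; // relative to the disk root, not a URL
if (Storage::disk('public')->exists($image_path)) {
    Storage::disk('public')->delete($image_path);
}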
Enjoy ;)
You should save the path of the file to the database and then simply remove it with the \Storage facade's delete() method.
Store image using hash name
if ($request->hasFile('profileimage')) {
    $store_path = 'aboutus';
    $profileimage = $request->profileimage->store($store_path, 'public');
    $oldimage = $about->profileimage;
    $file_address = $store_path . '/' . $request->profileimage->hashName();

    \Storage::disk('public')->delete($oldimage);

    $data['profileimage'] = $file_address;
}
Store image using original name
if ($request->hasFile('profileimage')) {
    $store_path = 'aboutus';
    $profileimage = $request->profileimage->storeAs($store_path, $request->profileimage->getClientOriginalName(), 'public');
    $oldimage = $about->profileimage;
    $file_address = $store_path . '/' . $request->profileimage->getClientOriginalName();

    \Storage::disk('public')->delete($oldimage);

    $data['profileimage'] = $file_address;
}
Delete the old image while updating
public function update(Request $request, $id)
{
    // other updates

    if ($request->hasFile('image')) {
        // delete the old image if it exists
        if (File::exists(public_path($oldImage))) {
            File::delete(public_path($oldImage));
        }

        // process the new image
    }
}
I am uploading files to storage like this:
/storage/uploads/contract/19/12199/document.pdf
Now I need to allow only authenticated users to see those documents, so I use this route:
Route::get('/storage/{pathToFile}', function ($pathToFile) {
    if (auth()->user()) {
        return response()->file($pathToFile);
    } else {
        return 'Nope, sorry bro, access denied!';
    }
});
This didn't work; all files can still be accessed even if the user is not logged in.
Any ideas?
Thanks
Did you symlink the public folder to the storage folder? If so, the files would still be accessible, because the default public entry point is the "public" folder, so "[host]/storage" is served straight from that folder.
What I did in the past was use S3 driver and set file visibility to private then use:
public function get($path, $image)
{
    $file = Storage::disk('s3')->get("private/images/" . $path . "/" . $image);

    return response($file, 200)->header('Content-Type', 'image/png');
}
In your case this would be changed to:
if (auth()->user()) {
    // response()->file() expects a path on disk, so return the raw contents instead
    $file = Storage::disk('s3')->get($pathToFile);
    return response($file, 200);
} else {
    return 'Nope, sorry bro, access denied!';
}
Note: S3 driver supports multiple storage solutions: https://flysystem.thephpleague.com/v1/docs/adapter/aws-s3-v2/
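Alternatively, if the files need to stay on the local disk rather than S3, a rough sketch (assuming the public/storage symlink is removed and the uploads live under storage/app, so nothing is publicly reachable) would be to serve them through a route protected by the auth middleware:
Route::get('/storage/{pathToFile}', function ($pathToFile) {
    // storage_path('app/...') points at files outside the public web root
    return response()->file(storage_path('app/' . $pathToFile));
})->where('pathToFile', '.*')->middleware('auth');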
For some reason I am not able to delete a file I uploaded with $request->file->store().
I tried \File::delete, \File::Delete, \Storage::Delete, and \Storage::delete. I want to avoid using unlink as that is not the Laravel way.
May you please help me?
This is my code:
if ($user->avatar_path) {
    // delete the old one
    \File::delete(app_path() . $avatar_path);
}

$avatar_path = $request->file('avatar')->store('uploads/users/' . $user->id . '/avatar', 'public');

$user->update($request->except('avatar') + ['avatar_path' => $avatar_path]);
We see here that I check whether there is an old avatar and then try to delete it. Then I upload the new one and update the database with the new path.
Finally solved this. I had to use Storage::disk('public')->delete($old_avatar_path);. Final code:
if ($request->has('avatar')) {
    $old_avatar_path = $user->avatar_path;

    $avatar_path = $request->file('avatar')->store('avatars', 'public');

    $user->update($request->except('avatar') + ['avatar_path' => $avatar_path]);

    if ($old_avatar_path) {
        // delete the old one
        Storage::disk('public')->delete($old_avatar_path);
    }
}
I am performing SSH in Laravel whereby I connect to another server and download a file. I am using Laravel Collective https://laravelcollective.com/docs/5.4/ssh
So, the suggested way to do it is something like this:
$result = \SSH::into('scripts')->get('/srv/somelocation/' . $fileName, $path);

if ($result) {
    return $path;
} else {
    return 401;
}
Now, that successfully downloads the file and moves it to my local server. However, I am always returned 401 because $result seems to be null.
I can't find much about getting the result back from the SSH call. I have also tried:
$result = \SSH::into('scripts')->get('/srv/somelocation/' . $fileName, $path, function ($line) {
    dd($line . PHP_EOL);
});
But that never gets into the inner function.
Is there any way I can get the result back from the SSH? I just want to handle it properly if there is an error.
Thanks
Rather than rely on $result to give you true / false / error, you can check if the file was downloaded successfully in another way:
// download the file
$result = \SSH::into('scripts')->get('/srv/somelocation/' . $fileName, $path);

// see if the downloaded file exists
if (file_exists($path)) {
    return $path;
} else {
    return 401;
}
You need to pass the file name as well, like this, in the get and put methods:
$fileName = "example.txt";
$get = \SSH::into('scripts')->get('/remote/somelocation/'.$fileName, base_path($fileName));
And for uploading with the put method:
$put = \SSH::into('scripts')->put(base_path($fileName), '/remote/location/' . $fileName);
And for listing files:
$command = \SSH::into('scripts')->run(['ls -lsa'], function ($output) {
    dd($output);
});
I have been going round and round with this. I have uploads working with the following:
public function store(Tool $tool)
{
    if (Input::hasFile('file')) {
        $file = Input::file('file');
        $name = $file->getClientOriginalName();
        $path = Storage::put('public', $file); // Storage::disk('local')->put($name, $file, 'public');

        $file = new File;
        $file->tool_id = $tool->id;
        $file->file_name = $name;
        $file->path_to_file = $path;
        $file->name_on_disk = basename($path);
        $file->user_name = \Auth::user()->name;
        $file->save();

        return back();
    }
}
However, when I try to download with:
public function show($filename)
{
    $url = Storage::disk('public')->url($filename);
    // $file = Storage::disk('public')->get($filename);

    return response()->download($url);
}
I get a FileNotFoundException from Laravel.
However, if I use this instead:
$file = Storage::disk('public')->get($filename);
return response()->download($file);
I get
FileNotFoundException in File.php line 37: The file "use calib;
insert into
notes(tool_id,user_id,note,created_at,updated_at)
VALUES(1,1,'windows server 2008 sucks',now(),now());" does not exist
which is the actual content of the file...
It can obviously find the file, so why won't it download?
Try this:
return response()->download(storage_path("app/public/{$filename}"));
Replace:
$file = Storage::disk('public')->get($filename);
return response()->download($file);
With:
return response()->download(storage_path('app/public/' . $filename));
response()->download() takes a path to a file, not a file content. More information here: https://laravel.com/docs/5.4/responses#file-downloads
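For reference, response()->download() also accepts an optional display name and an array of headers; a quick sketch (the file name and content type below are only examples):
return response()->download(
    storage_path('app/public/' . $filename),
    'report.pdf',                          // name shown to the browser (example)
    ['Content-Type' => 'application/pdf']  // optional headers (example)
);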
If anyone still cannot find their file even though it clearly exists, then try:
return response()->file(storage_path('/app/' . $filename), $headers);
It could be due to a missing directory separator, or the file isn't stored inside the public folder.