How to queue upload to s3 using Laravel?

I'm dispatching a job to queue the upload of my video files, which are being stored on s3.
Everything is working, except that if I upload a video file that's, for example, 20MB, the bucket says the file is 120B. This makes me think I'm uploading the path and filename as a string instead of the file object.
And for some reason, when I try getting the file using Storage::get() or File::get() and dd() the result, it shows a bunch of random characters.
It seems like I can only get these weird characters or a string; I can't get the file object for some reason.
In my controller I'm also storing it in the public disk (I will delete the file later in my Jobs/UploadVideos.php file).
CandidateProfileController.php:
$candidateProfile = new CandidateProfile();
$candidateProfile->disk = config('site.upload_disk');
// Video One
if($file = $request->file('video_one')) {
$file_path = $file->getPathname();
$name = time() . $file->getClientOriginalName();
$name = preg_replace('/\s+/', '-', $name);
$file->storePubliclyAs('videos', $name, 'public');
$candidateProfile->video_one = $name;
}
if($candidateProfile->save()) {
// dispatch a job to handle the image manipulation
$this->dispatch(new UploadVideos($candidateProfile));
return response()->json($candidateProfile, 200);
} else {
return response()->json([
'message' => 'Some error occurred, please try again.',
'status' => 500
], 500);
}
Jobs/UploadVideos.php:
use Dispatchable, InteractsWithQueue, Queueable, SerializesModels;
protected $candidateprofile;
public $timeout = 120;
public $tries = 5;
/**
* Create a new job instance.
*
* @param CandidateProfile $candidateProfile
*/
public function __construct(CandidateProfile $candidateProfile)
{
$this->candidateprofile = $candidateProfile;
}
/**
* Execute the job.
*
* @return void
*/
public function handle()
{
$disk = $this->candidateprofile->disk;
$filename = $this->candidateprofile->video_one;
$original_file = storage_path() . '/videos/' . $filename;
try {
// Video One
Storage::disk($disk)
->put('videos/'.$filename, $original_file, 'public');
// Update the database record with successful flag
$this->candidateprofile->update([
'upload_successful' => true
]);
} catch(\Exception $e){
Log::error($e->getMessage());
}
}

File Storage docs
The 2nd parameter for put() should be the contents of the file, not the path to the file. Also, unless you've updated the public disk in your config/filesystems.php, the video isn't going to be stored in storage_path() . '/videos/...'.
To get this to work you should just need to update your Job code:
$filename = 'videos/' . $this->candidateprofile->video_one;
Storage::disk($this->candidateprofile->disk)
->put($filename, Storage::disk('public')->get($filename), 'public');
$this->candidateprofile->update([
'upload_successful' => true,
]);
Also, wrapping your code in a try/catch will mean that the Job won't retry as it will technically never fail.
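For completeness, a minimal sketch of the corrected handle() method, assuming the model fields from the question (disk, video_one, upload_successful) and that the temporary copy on the public disk should be removed after a successful upload:
public function handle()
{
    $disk = $this->candidateprofile->disk;
    $filename = 'videos/' . $this->candidateprofile->video_one;

    // Read the real file contents from the public disk and push them to s3
    Storage::disk($disk)->put($filename, Storage::disk('public')->get($filename), 'public');

    $this->candidateprofile->update([
        'upload_successful' => true,
    ]);

    // Clean up the temporary local copy once the upload has succeeded
    Storage::disk('public')->delete($filename);
}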

Related

Resize image and store it to s3

The default image upload process in my app is like this.
I get the image from the request and store it on s3, keeping the path in a local variable.
$path = $request->file("image")->store("images", "s3");
After this I make it public.
Storage::disk("s3")->setVisibility($path, 'public');
And I store it in the DB like this.
$variable = ModelName::create([
"image" => basename($path),
"image_url" => Storage::disk("s3")->url($path),
]);
But how do I resize the image before storing it on s3?
I tried writing it like this:
$extension = $request->file('image')->getClientOriginalExtension();
$normal = Image::make($request->file('image'))->resize(160, 160)->encode($extension);
$filename = md5(time()).'_'.$request->file('image')->getClientOriginalName();
$img = Storage::disk('s3')->put('/images/'.$filename, (string)$normal, 'public');
And then
"image" => basename($filename ),
"image_url" => Storage::disk("s3")->url($img),
This works except for one thing: I can't get the URL (to store in the DB) for the uploaded image.
How do I get the correct public URL for the uploaded image?
Note: I use the Intervention Image package.
Storage::put() only returns the path if the contents you pass is an instance of File or UploadedFile (source). In this case $normal isn't, so put() returns a boolean instead. Also, using getClientOriginalExtension() probably isn't a good idea, since it's not considered a safe value (source).
So here's a bit improved version:
$filename = $request->file('file')->hashName();
$image = Image::make($request->file('file'))->resize(160, 160);
Storage::disk('s3')->put('/images/'.$filename, $image->stream(), 'public');
$url = Storage::disk('s3')->url('/images/'.$filename);
You can now save $url and $filename into your db.
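Putting the pieces together in the controller, a rough sketch (the ModelName and its columns are just the ones from the question; the 'image' input name and the Intervention Image 2 facade are assumptions):
use Illuminate\Support\Facades\Storage;
use Intervention\Image\Facades\Image;

$filename = $request->file('image')->hashName();
$image = Image::make($request->file('image'))->resize(160, 160);

// Upload the resized image and build its public URL from the same path
Storage::disk('s3')->put('images/'.$filename, $image->stream(), 'public');
$url = Storage::disk('s3')->url('images/'.$filename);

$variable = ModelName::create([
    'image'     => $filename,
    'image_url' => $url,
]);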
You can try my code snippet.
$file = $request->file('image');
$fileName = md5(time()).'.'.$file->getClientOriginalExtension();
/** @var Intervention $image */
$image = Image::make($file);
if ($image->width() > ($maxWidth ?? 500)) {
$image = $image->resize(500, null, function ($constraint) {$constraint->aspectRatio();});
}
$image = $image->stream();
try {
/** @var Storage $path */
Storage::disk('s3')->put(
$dir . DIRECTORY_SEPARATOR . $fileName, $image->__toString()
);
} catch (\Exception $exception) {
Log::debug($exception->getMessage());
}
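If you also need the public URL for this variant (the part the original question was stuck on), it can be built from the same path afterwards; a short sketch assuming the $dir and $fileName variables above:
// Optionally make the object public first, then build the URL from the same key
Storage::disk('s3')->setVisibility($dir . DIRECTORY_SEPARATOR . $fileName, 'public');
$url = Storage::disk('s3')->url($dir . DIRECTORY_SEPARATOR . $fileName);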

Image source not readable : Image Upload in Laravel

I'm trying to upload an image to our system. It works well when I'm using it on localhost, but when I test it on the server it throws the error Image source not readable. I'm trying to upload a 2.2MB photo, and my max image upload size is 3MB. Uploading images less than or equal to 2MB on the server works fine, while locally it accepts up to 3MB, which is the expected behavior. I'm using Docker for my local development and my server is CentOS 7. Are there some configurations I should touch so that the server can accept a 3MB image, or is there something I have to change in my code below for processing the image?
REQUEST
public function rules()
{
return [
'photo' => 'image|mimes:jpeg,png,jpg|max:3000',
];
}
Controller
$photo = $request->file('photo');
$server_dir = storage_path(config('const.upload_local_temp_path'));
FileHelper::addDirectory($server_dir, 0777);
$file_name = FileHelper::makeUniqFileName($photo->getClientOriginalExtension(), $server_dir);
$filepath = $server_dir . $file_name;
$img_path = FileHelper::storeResizeImg($photo->path(), $filepath, null, 300);
$img_url = FileHelper::getPublicPath($img_path);
return $img_url;
FILEHELPER
/**
* Create the directory (recursively) with the given permissions if it does not already exist.
*/
public static function addDirectory($directory, $mod)
{
if (!File::exists($directory)) {
File::makeDirectory($directory, $mod, true);
}
}
/**
* Fix the EXIF orientation, resize (keeping the aspect ratio) and save the image to the given path.
*/
public static function storeResizeImg($file, $filepath, $width, $height)
{
$org_img = Image::make($file);
$org_img->orientate();
$org_img = $org_img->resize($width, $height, function ($constraint) {
$constraint->aspectRatio();
});
$org_img->save($filepath);
return $filepath;
}
public static function makeUniqFileName($ext, $path)
{
$file_name = '';
while (1) {
$file_name = sha1(rand().microtime()).'.'.$ext;
if (!File::exists($path. $file_name)) {
break;
}
}
return $file_name;
}
UPDATE
The validation error message for images greater than 3MB works locally, but on the server it just says: The given data was invalid.
Please let me know of your thoughts.
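A likely culprit, assuming the server runs stock PHP settings, is upload_max_filesize: its default is 2M, which matches the observed 2MB cut-off, and once PHP discards the oversized upload the image rule fails with only the generic The given data was invalid message. A quick sketch for checking and (illustratively) raising the limits:
// Run on the server (e.g. in tinker or a temporary route) and compare with the 3MB rule
var_dump(ini_get('upload_max_filesize')); // PHP's default is "2M"
var_dump(ini_get('post_max_size'));       // PHP's default is "8M"

// If upload_max_filesize is still 2M, raise both values in php.ini
// (or the Docker/CentOS override), for example:
// upload_max_filesize = 5M
// post_max_size = 8M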

File upload using foreach in Laravel [duplicate]

I've been working on this problem for 2 days and still cannot figure it out. I am trying to upload multiple files into storage in my Laravel project. I know my code works up to the foreach, as I tested this with dd().
My controller:
$files = $request->file('current_plan_year_claims_data_file_1');
$folder = public_path(). "\storage\\$id";
if (!File::exists($folder)) {
File::makeDirectory($folder, 0775, true, true);
}
if (!empty($files)) {
foreach($files as $file) {
Storage::disk(['driver' => 'local', 'root' => $folder])->put($file->getClientOriginalName(), file_get_contents($file));
}
}
I see that you are trying to store the files directly in the public folder, but why not use Laravel's Storage API and the public disk? You can do something like this to upload the files to the public directory:
$id = 123;
$files = $request->file();
$folder = $id;
if (count($files) > 0) {
foreach ($files as $file) {
$file->store($folder, ['disk' => 'public']);
}
}
And be sure that you have linked the storage path to public:
php artisan storage:link
Focus on the $files = $request->file(); line. When you don't pass an argument to the file() method, all uploaded file instances are returned. When you loop over the $files array, you get access to the individual uploaded files.
You can then store each file using your own logic, i.e. you can use the original name or whatever else. You can even use the Storage facade to process the file instance.
E.g. if you want to store the files with their original names, I find this cleaner than what you are doing:
$id = 123;
$files = $request->file();
$folder = $id;
if (count($files) > 0) {
foreach ($files as $file) {
Storage::disk('public')->putFileAs(
$folder,
$file,
$file->getClientOriginalName()
);
}
}
And as suggested by @cbaconnier, you can use the allFiles() method too, which is more descriptive:
$files = $request->allFiles();
I hope this helps.
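As a small aside (not from the original answer), allFiles() returns the uploads keyed by their input names, so the loop can also make use of those keys; a sketch assuming single-file inputs and the same public disk:
foreach ($request->allFiles() as $inputName => $file) {
    // e.g. 'current_plan_year_claims_data_file_1' => UploadedFile instance
    Storage::disk('public')->putFileAs($folder, $file, $file->getClientOriginalName());
}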
You're trying to iterate over $files, but $files is just a reference to $request->file('current_plan_year_claims_data_file_1'), which is a SINGLE UploadedFile object.
As indicated by your comment, you have multiple file inputs with different name attributes, so you can't easily loop over them with one statement. E.g. if all the files were uploaded under an "attachments[]" input name, you could get them all with $request->file('attachments'). However, if you want to keep the input names as they are, this should be close to what you want:
public function foo(Request $request, $id){
$folder = public_path(). "\storage\\$id";
if (!File::exists($folder)) {
File::makeDirectory($folder, 0775, true, true);
}
$files = array();
$files[] = $request->file('current_plan_year_claims_data_file_1');
$files[] = $request->file('prior_plan_year_claims_data_file_1');
$files[] = $request->file('etc_file_whatever');
foreach($files as $file) {
Storage::disk(['driver' => 'local', 'root' => $folder])->put($file->getClientOriginalName(), file_get_contents($file));
}
}
Side note: I'm not sure what you're doing with File and public_path(), but if your goal is just to put something in your app storage, something like this should work fine:
public function foo(Request $request, $id){
if(!\Storage::exists($id)){
\Storage::makeDirectory($id);
}
$files = array();
$files[] = $request->file('current_plan_year_claims_data_file_1');
$files[] = $request->file('prior_plan_year_claims_data_file_1');
$files[] = $request->file('etc_file_whatever');
foreach($files as $file) {
\Storage::put("$id/" . $file->getClientOriginalFileName(), $file);
}
}

How to upload an image using Laravel?

The problem:
I want to upload an image to a MySQL database using Laravel.
What I have tried:
I looked for other Stack Overflow questions, but they weren't helpful.
The result I am expecting:
To have the image name or path saved to a column in my table in the database, so I can retrieve and display it later as a post in a blog.
First you need the form on your view (don't forget the csrf token):
<form action="/image-upload" method="POST" enctype="multipart/form-data">
@csrf
<input type="file" name="image">
<button type="submit">Upload</button>
</form>
And on your routes file add the route for POST method:
Route::post('image-upload', 'ImageUploadController@imageUploadPost');
Then on your Controller create the function that will validate and move your image to the 'public/images' folder.
public function imageUploadPost()
{
request()->validate([
'image' => 'required|image|mimes:jpeg,png,jpg,gif,svg|max:2048',
]);
$imageName = time().'.'.request()->image->getClientOriginalExtension();
request()->image->move(public_path('images'), $imageName);
}
For a better solution, please read this: Laravel File Storage
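Since the question also asks for the name/path to end up in a table, a rough follow-up sketch (not part of the original answer) that saves the stored file name on a hypothetical Post model with an image column:
public function imageUploadPost()
{
    request()->validate([
        'image' => 'required|image|mimes:jpeg,png,jpg,gif,svg|max:2048',
    ]);

    $imageName = time().'.'.request()->image->getClientOriginalExtension();
    request()->image->move(public_path('images'), $imageName);

    // Post and its image column are assumptions; adjust to your own schema
    $post = new Post();
    $post->image = 'images/'.$imageName;
    $post->save();

    return back()->with('success', 'Image uploaded successfully.');
}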
Actually with Laravel it only involves a few lines of code. Let's say you have a user that has an avatar which is stored in the database. Here's how you would store and retrieve the avatar from the database:
1. First you'll need to have an avatar column in the users table that can store binary data. Depending on how large you want to allow the avatar image to be, the data type of the column can be one of the following:
BLOB up to 64KB
MEDIUMBLOB up to 16MB
LONGBLOB up to 4GB
2. To store the uploaded image in the database you can do this:
Route::post('user/{id}', function (Request $request, $id) {
// Get the file from the request
$file = $request->file('image');
// Get the contents of the file
$contents = $file->openFile()->fread($file->getSize());
// Store the contents to the database
$user = App\User::find($id);
$user->avatar = $contents;
$user->save();
});
3. To fetch and output the avatar you can do the following:
Route::get('user/{id}/avatar', function ($id) {
// Find the user
$user = App\User::find($id);
// Return the image in the response with the correct MIME type
return response()->make($user->avatar, 200, array(
'Content-Type' => (new finfo(FILEINFO_MIME))->buffer($user->avatar)
));
});
NOTE: Please keep in mind that MySQL isn't an ideal solution for storing BLOBs. You may want to use an object storage service like Amazon S3 instead.
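Following up on that note, a minimal sketch (not from the answer above) that keeps the avatar on s3 and stores only its path in MySQL, assuming a configured s3 disk and a string avatar_path column in place of the BLOB:
Route::post('user/{id}/avatar', function (Request $request, $id) {
    // Store the upload on s3 and keep only the object path in the database
    $path = $request->file('image')->store('avatars', 's3');
    Storage::disk('s3')->setVisibility($path, 'public');

    $user = App\User::find($id);
    $user->avatar_path = $path; // hypothetical column replacing the BLOB
    $user->save();

    return Storage::disk('s3')->url($path);
});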
Use this to upload an image:
/**
* Store a newly created resource in storage.
*
* @param \Illuminate\Http\Request $request
* @return \Illuminate\Http\Response
*/
public function store(Request $request)
{
// $this->validate($request,[//'movie_name'=>'required',
// // 'description'=>'required',
// //'video_url'=>'required',
// 'image'=>'required|mimes:jpeg,jpg,png,gif|required|max:10000',
// ]);
if ($request->hasFile('image') && $request->hasFile('image2')) {
$file = $request->file('image');
//$image=$file->getClientOriginalName();
$image = time().'.'.$file->getClientOriginalExtension();
$destinationPath ='assets/admin/uploads/image/';
$file->move($destinationPath,$image);
//echo $destinationPath;exit();
//echo $image."<br/>";
$file2 = $request->file('image2');
$bg_images = time().'.'.$file2->getClientOriginalExtension();
//$bg_images=$file2->getClientOriginalName();
$destinationPath ='assets/admin/uploads/bg_images/';
$file2->move($destinationPath,$bg_images);
$insert_data=array('movie_name'=>$request->movie_name,
'description'=>$request->description,
'video_url'=>$request->video_url,
'image'=>$image,
'bg_images'=>$bg_images,
'created_at'=>now(),
'updated_at'=>now()
);
//print_r($insert_data);exit();
}
else
{
if ( $request->hasFile('image2')) {
$file2 = $request->file('image2');
$bg_images = time().'.'.$file2->getClientOriginalExtension();
//$bg_images=$file2->getClientOriginalName();
$destinationPath ='assets/admin/uploads/bg_images/';
$file2->move($destinationPath,$bg_images);
//echo $destinationPath;exit();
//echo $bg_images;
$insert_data=array('movie_name'=>$request->movie_name,
'description'=>$request->description,
'video_url'=>$request->video_url,
//'image'=>$image,
'bg_images'=>$bg_images,
'created_at'=>now(),
'updated_at'=>now()
);
//print_r($insert_data);exit();
}
if ($request->hasFile('image') ) {
$file = $request->file('image');
//$image=$file->getClientOriginalName();
$image = time().'.'.$file->getClientOriginalExtension();
$destinationPath ='assets/admin/uploads/image/';
$file->move($destinationPath,$image);
//echo $destinationPath;exit();
//echo $image."<br/>";
$insert_data=array('movie_name'=>$request->movie_name,
'description'=>$request->description,
'video_url'=>$request->video_url,
'image'=>$image,
//'bg_images'=>$bg_images,
'created_at'=>now(),
'updated_at'=>now()
);
// print_r($insert_data);exit();
}
if ( ! $request->hasFile('image2') && ! $request->hasFile('image') ) {
$insert_data=array('movie_name'=>$request->movie_name,
'description'=>$request->description,
'video_url'=>$request->video_url,
//'image'=>$image,
// 'bg_images'=>$bg_images,
'updated_at'=>now()
);
// print_r($update_data);exit();
}
}
//exit();
// echo $image;
//exit();
//print_r($insert_data);exit();
$insert=DB::table('movies')->insert($insert_data);
if ($insert) {
return redirect()->route('admin.list_movies')->withSuccess('Record saved');
}
else {
return redirect()->route('admin.list_movies')->withError('Record not saved');
}
}

fine-uploader PHP Server Side Merge

I've been experimenting with Fine Uploader. I am really interested in the chunking and resume features, but I'm having difficulty putting the files back together server side.
What I've found is that I have to allow a blank file extension on the server side for the chunks to upload, otherwise the upload fails with an unknown file type. It uploads the chunks fine with file names such as "blob" and "blob63" (no file extension), however it does not merge them back together when the upload completes.
Any help or pointers would be appreciated.
$('#edit-file-uploader').fineUploader({
request: {
endpoint: 'upload.php'
},
multiple: false,
validation:{
allowedExtensions: ['stl', 'obj', '3ds', 'zpr', 'zip'],
sizeLimit: 104857600 // 100 MB = 100 * 1024 * 1024 bytes
},
text: {
uploadButton: 'Select File'
},
autoUpload: false,
chunking: {
enabled: true
},
callbacks: {
onComplete: function(id, fileName, responseJSON) {
if (responseJSON.success) {
/** some code here **/
}
}
});
And this is the server side script (PHP):
// list of valid extensions, ex. array("stl", "xml", "bmp")
$allowedExtensions = array("stl", "");
// max file size in bytes
$sizeLimit = null;
$uploader = new qqFileUploader($allowedExtensions, $sizeLimit);
// Call handleUpload() with the name of the folder, relative to PHP's getcwd()
$result = $uploader->handleUpload('uploads/');
// to pass data through iframe you will need to encode all html tags
echo htmlspecialchars(json_encode($result), ENT_NOQUOTES);
/******************************************/
/**
* Handle file uploads via XMLHttpRequest
*/
class qqUploadedFileXhr {
/**
* Save the file to the specified path
* @return boolean TRUE on success
*/
public function save($path) {
$input = fopen("php://input", "r");
$temp = tmpfile();
$realSize = stream_copy_to_stream($input, $temp);
fclose($input);
if ($realSize != $this->getSize()){
return false;
}
$target = fopen($path, "w");
fseek($temp, 0, SEEK_SET);
stream_copy_to_stream($temp, $target);
fclose($target);
return true;
}
/**
* Get the original filename
* @return string filename
*/
public function getName() {
return $_GET['qqfile'];
}
/**
* Get the file size
* @return integer file-size in byte
*/
public function getSize() {
if (isset($_SERVER["CONTENT_LENGTH"])){
return (int)$_SERVER["CONTENT_LENGTH"];
} else {
throw new Exception('Getting content length is not supported.');
}
}
}
/**
* Handle file uploads via regular form post (uses the $_FILES array)
*/
class qqUploadedFileForm {
/**
* Save the file to the specified path
* @return boolean TRUE on success
*/
public function save($path) {
return move_uploaded_file($_FILES['qqfile']['tmp_name'], $path);
}
/**
* Get the original filename
* @return string filename
*/
public function getName() {
return $_FILES['qqfile']['name'];
}
/**
* Get the file size
* @return integer file-size in byte
*/
public function getSize() {
return $_FILES['qqfile']['size'];
}
}
/**
* Class that encapsulates the file-upload internals
*/
class qqFileUploader {
private $allowedExtensions;
private $sizeLimit;
private $file;
private $uploadName;
/**
* @param array $allowedExtensions; defaults to an empty array
* @param int $sizeLimit; defaults to the server's upload_max_filesize setting
*/
function __construct(array $allowedExtensions = null, $sizeLimit = null){
if($allowedExtensions===null) {
$allowedExtensions = array();
}
if($sizeLimit===null) {
$sizeLimit = $this->toBytes(ini_get('upload_max_filesize'));
}
$allowedExtensions = array_map("strtolower", $allowedExtensions);
$this->allowedExtensions = $allowedExtensions;
$this->sizeLimit = $sizeLimit;
$this->checkServerSettings();
if(!isset($_SERVER['CONTENT_TYPE'])) {
$this->file = false;
} else if (strpos(strtolower($_SERVER['CONTENT_TYPE']), 'multipart/') === 0) {
$this->file = new qqUploadedFileForm();
} else {
$this->file = new qqUploadedFileXhr();
}
}
/**
* Get the name of the uploaded file
* #return string
*/
public function getUploadName(){
if( isset( $this->uploadName ) )
return $this->uploadName;
}
/**
* Get the original filename
* #return string filename
*/
public function getName(){
if ($this->file)
return $this->file->getName();
}
/**
* Internal function that checks if the server's max sizes match the
* object's maximum size for uploads
*/
private function checkServerSettings(){
$postSize = $this->toBytes(ini_get('post_max_size'));
$uploadSize = $this->toBytes(ini_get('upload_max_filesize'));
if ($postSize < $this->sizeLimit || $uploadSize < $this->sizeLimit){
$size = max(1, $this->sizeLimit / 1024 / 1024) . 'M';
die(json_encode(array('error'=>'increase post_max_size and upload_max_filesize to ' . $size)));
}
}
/**
* Convert a given size with units to bytes
* @param string $str
*/
private function toBytes($str){
$val = trim($str);
$last = strtolower($str[strlen($str)-1]);
switch($last) {
case 'g': $val *= 1024;
case 'm': $val *= 1024;
case 'k': $val *= 1024;
}
return $val;
}
/**
* Handle the uploaded file
* @param string $uploadDirectory
* @param string $replaceOldFile=true
* @returns array('success'=>true) or array('error'=>'error message')
*/
function handleUpload($uploadDirectory, $replaceOldFile = FALSE){
if (!is_writable($uploadDirectory)){
return array('error' => "Server error. Upload directory isn't writable.");
}
if (!$this->file){
return array('error' => 'No files were uploaded.');
}
$size = $this->file->getSize();
if ($size == 0) {
return array('error' => 'File is empty');
}
if ($size > $this->sizeLimit) {
return array('error' => 'File is too large');
}
$pathinfo = pathinfo($this->file->getName());
$filename = $pathinfo['filename'];
//$filename = md5(uniqid());
$ext = @$pathinfo['extension']; // hide notices if extension is empty
if($this->allowedExtensions && !in_array(strtolower($ext), $this->allowedExtensions)){
$these = implode(', ', $this->allowedExtensions);
return array('error' => 'File has an invalid extension, it should be one of '. $these . '.');
}
$ext = ($ext == '') ? $ext : '.' . $ext;
if(!$replaceOldFile){
/// don't overwrite previous files that were uploaded
while (file_exists($uploadDirectory . DIRECTORY_SEPARATOR . $filename . $ext)) {
$filename .= rand(10, 99);
}
}
$this->uploadName = $filename . $ext;
if ($this->file->save($uploadDirectory . DIRECTORY_SEPARATOR . $filename . $ext)){
return array('success'=>true);
} else {
return array('error'=> 'Could not save uploaded file.' .
'The upload was cancelled, or server error encountered');
}
}
}
In order to handle chunked requests, you MUST store each chunk separately in your filesystem.
How you name these chunks or where you store them is up to you, but I suggest you name them using the UUID provided by Fine Uploader and append the part number parameter included with each chunked request. After the last chunk has been sent, combine all chunks into one file, with the proper name, and return a standard success response as described in the Fine Uploader documentation. The original name of the file is, by default, passed in a qqfilename parameter with each request. This is also discussed in the docs and the blog.
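As an illustration of that approach (a sketch under assumptions, not taken from the official example), the endpoint could save each part under its UUID and stitch the parts together when the final one arrives; qquuid, qqpartindex, qqtotalparts, qqfilename and qqfile are Fine Uploader's default parameter names, while the chunks/ and uploads/ directories are assumptions and error handling/locking are left out:
$uuid       = $_REQUEST['qquuid'];
$partIndex  = (int) $_REQUEST['qqpartindex'];
$totalParts = (int) $_REQUEST['qqtotalparts'];
$origName   = $_REQUEST['qqfilename'];

$chunkDir = 'chunks/' . $uuid;
if (!is_dir($chunkDir)) {
    mkdir($chunkDir, 0755, true);
}

// Save this chunk (multipart uploads arrive in $_FILES, XHR uploads in php://input)
if (isset($_FILES['qqfile'])) {
    move_uploaded_file($_FILES['qqfile']['tmp_name'], $chunkDir . '/' . $partIndex);
} else {
    file_put_contents($chunkDir . '/' . $partIndex, fopen('php://input', 'r'));
}

// After the last part, concatenate all chunks in order into the final file
if ($partIndex === $totalParts - 1) {
    $target = fopen('uploads/' . $origName, 'wb');
    for ($i = 0; $i < $totalParts; $i++) {
        fwrite($target, file_get_contents($chunkDir . '/' . $i));
        unlink($chunkDir . '/' . $i);
    }
    fclose($target);
    rmdir($chunkDir);
}

echo htmlspecialchars(json_encode(array('success' => true)), ENT_NOQUOTES);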
It doesn't look like you've made any attempt to handle chunks server-side. There is a PHP example in the Widen/fine-uploader-server repo that you can use. Also, the documentation has a "server-side" section that explains how to handle chunking in detail. I'm guessing you did not read this. Have a look.
Note that, starting with Fine Uploader 3.8 (set to release VERY soon) you will be able to delegate all server-side upload handling to Amazon S3, as Fine Uploader will provide tight integration with S3 that sends all of your files directly to your bucket from the browser without you having to worry about constructing a policy document, making REST API calls, handling responses from S3, etc. I mention this as using S3 means that you never have to worry about handling things like chunked requests on your server again.
