I get the following error after having deployed my Laravel app to AWS ElasticBeanstalk for the first time. The health check is green. I've changed the environment and database settings to match those on AWS, as instructed by AWS docs.
ErrorException in Filesystem.php line 109:
file_put_contents(/var/app/current/myapp-master/bootstrap/cache/services.php): failed to open stream: No such file or directory
This is line 109:

    public function put($path, $contents, $lock = false)
    {
        return file_put_contents($path, $contents, $lock ? LOCK_EX : 0);
    }
Try creating the bootstrap/cache directory and setting the correct permissions on it:

    chmod -R 777 bootstrap/cache
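A minimal sketch of a slightly safer variant, assuming the application root is the current directory (775 grants write access to the owner and the web-server group without opening the directory to every user):

```shell
# Recreate the cache directory Laravel writes compiled service files into;
# the original error means it did not exist after deployment
mkdir -p bootstrap/cache
# 775: owner and group may write; others may only read/traverse
chmod -R 775 bootstrap/cache
# verify the permission bits
stat -c '%a' bootstrap/cache
```

On Elastic Beanstalk you would additionally make sure the directory is group-owned by the user the web server runs as, so that group write actually applies.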
    // controller
    $file = $request->file('video');
    $uri = Vimeo::upload($file, [
        'name' => 'test',
        'description' => 'easy upload'
    ]);
    return $uri;
file_put_contents(): Write of 207 bytes failed with errno=13 Permission denied
The process running the app needs write permission to the temp directory (which defaults to /tmp) for TusUpload to work.
For Laravel's $request->file() to work, the process also needs write permission to the upload directory (this defaults to public/uploads in the root directory of the app, but can be configured).
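A quick shell check for the first requirement, assuming the default temp directory (adjust TMP_DIR if TusUpload is configured to use a different location):

```shell
# Verify the temp directory is writable by the current process
# before blaming the upload code itself
TMP_DIR="${TMPDIR:-/tmp}"
if [ -w "$TMP_DIR" ]; then
    echo "temp dir $TMP_DIR is writable"
else
    echo "temp dir $TMP_DIR is NOT writable"
fi
```

Run this as the same user the PHP process runs as (e.g. via sudo -u www-data), since permissions differ per user.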
I am getting the error below when I try to upload a file larger than 25 MB to Amazon S3 using the Laravel AWS SDK; files below 25 MB upload successfully. I have everything set up correctly in my .env file and have no idea why this is happening.
Any help would be appreciated.
Error:
Error executing "ListObjects" on
"bucketdomain/?prefix=b6d767d2f8ed5d21a44b0e5886680cb9%filename%2F&max-keys=1&encoding-type=url";
AWS HTTP error: cURL error 7: Failed to connect to bucketdomain
port 443: Network unreachable (see
http://curl.haxx.se/libcurl/c/libcurl-errors.html)
Save function in Laravel:

    $v = Storage::disk('s3')->put($path, file_get_contents($file), 'public');
    unlink($file->getPathname());
    return response()->json(["message" => "File uploaded successfully!"]);
Upload function in Laravel:

    if ($receiver->isUploaded() === false) {
        throw new UploadMissingFileException();
    }

    $save = $receiver->receive();

    if ($save->isFinished()) {
        // ... database entries ...
        return $this->saveChunkFile($file, $userFolderName, $path, $fileName);
    }

    $handler = $save->handler();

    return response()->json([
        "Percentage" => $handler->getPercentageDone()
    ]);
I am using resumable.js on the client side to upload files in chunks; the code above handles the chunks, merges them when done, and passes the result to the saveChunkFile function.
Picture:
The file should be stored in the second folder from the top, but no file is there. I think that is why the error is thrown by the size function, and these chunk files keep being generated without stopping.
I am trying to upload an image via Laravel, then retrieve its URL to save in the database and return to the front-end application. The image upload works fine on localhost but not on the EC2 instance.
The image is uploaded successfully and can also be downloaded via FileZilla.
I have set chmod -R 777 on the storage directory in Laravel, but it is still not working.
    public static function upload_files($type, $file, $id)
    {
        if ($type == 'profile_pic') {
            $image = $file->store('profile_pic', 'public');
            $image = asset('storage/'.$image);
            if ($image) {
                return $image;
            } else {
                return false;
            }
        }
    }
On localhost it returns http://localhost/trip/public/storage/profile_pic/MaHskQD2VcLSlC11VV3agbBNdh7j7k82liewYBw3.png, and when I click the link the image loads successfully,
while on my server it returns
http://mydomain/storage/app/profile_pic/MaHskQD2VcLSlC11VV3agbBNdh7j7k82liewYBw3.png and throws a 404 Not Found error.
The result of ls -lrt is "-rwxrwxrwx 1 apache ec2-user 190916 Jan 23 10:06 MaHskQD2VcLSlC11VV3agbBNdh7j7k82liewYBw3.png"
Please run this command in a terminal on your server:

    php artisan storage:link

This creates a symbolic link from public/storage to storage/app/public, so files stored on the public disk become reachable from the web.
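A sketch of what that command does under the hood, assuming a standard Laravel directory layout (the artisan command should be preferred in practice; it creates the link with an absolute path, while this illustration uses a relative one):

```shell
# Recreate a minimal Laravel-like layout to show the link storage:link creates
mkdir -p storage/app/public public
# public/storage becomes a symlink pointing at storage/app/public
ln -sfn ../storage/app/public public/storage
# a file written to the public disk is now reachable under public/storage
echo demo > storage/app/public/demo.txt
cat public/storage/demo.txt
```

Without this link, URLs of the form http://yourdomain/storage/... have nothing to resolve to, which is exactly the 404 described above.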
I'm trying to get S3 set up with my Laravel app (which is running on my local machine) but I'm getting the following error when trying to upload an image.
    $my_file = 'file.txt';
    $handle = fopen($my_file, 'w') or die('Cannot open file: '.$my_file);
    $data = 'Test data to see if this works!';
    fwrite($handle, $data);
    $storagePath = Storage::disk('s3')->put("uploads", $my_file, 'public');
This is the error I'm getting:
Error executing "PutObject" on "https://landlord-files.s3.eu-west-2.amazonaws.com/uploads"; AWS HTTP error: cURL error 60: SSL certificate problem: unable to get local issuer certificate (see http://curl.haxx.se/libcurl/c/libcurl-errors.html)
I am trying to create a folder in the app directory, but for whatever reason there is a lock symbol on the folder icon, and its user and group are www-data.
    $folder_path = app_path('core/'.$module);

    if (!file_exists($folder_path)) {
        File::makeDirectory($folder_path, $mode = 0755, $recursive = false, $force = false);
        //mkdir($folder_path, 0777);
        exit;
    } else {
        //
    }
When I tried to create the folder there was a permission-denied error, so I ran chmod -R 777 app/core/. After this, when I create the folder, the user associated with it is www-data.
Why is it www-data and not my username, and why is there a lock symbol?
NOTE: Using the nginx web server.
By default, nginx and php-fpm run as the user www-data. You can check this in /etc/nginx/nginx.conf and /etc/php/{version}/fpm/pool.d/www.conf.
See here if you want to configure your web app to run as your username.
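The ownership and the lock symbol follow from a general rule: a file or directory belongs to the user of the process that created it, so anything PHP-FPM creates belongs to www-data. A minimal sketch demonstrating the rule with the current user (www-data is only the web server's case of it):

```shell
# Whatever user creates a directory becomes its owner;
# for folders made by PHP code, that user is the one PHP-FPM runs as
mkdir -p demo_core
stat -c '%U' demo_core    # prints the owner: the user that ran mkdir
```

The lock symbol in the file manager simply indicates that your desktop user lacks write permission on a directory owned by another user.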