For almost 48 hours I've been facing an issue with uploading files to DigitalOcean Spaces from Laravel, and I can't make it work.
I've successfully built a Livewire component that handles multiple image uploads; each image is stored in Laravel's local storage.
Since we plan to host thousands of images on this website, we have now decided to use DigitalOcean Spaces as the storage disk.
To do so, I first installed the league/flysystem-aws-s3-v3 Composer package, as required.
Then, in my config/filesystems.php, I've added the following:
'do_spaces' => [
    'driver' => 's3',
    'key' => env('DO_SPACES_KEY'),
    'secret' => env('DO_SPACES_SECRET'),
    'endpoint' => env('DO_SPACES_ENDPOINT'),
    'region' => env('DO_SPACES_REGION'),
    'bucket' => env('DO_SPACES_BUCKET'),
],
And in my .env the following:
DO_SPACES_KEY=key
DO_SPACES_SECRET=secret
DO_SPACES_ENDPOINT=https://ams3.digitaloceanspaces.com
DO_SPACES_REGION=AMS3
DO_SPACES_BUCKET=bucket_name
In my Livewire component, after uploading the images to local storage, I dispatch a job for each image (dispatch(new ResizeUploadToDo($pic, $extension));) which is supposed to:
Retrieve the image from local storage
Pass it to the Intervention Image library in order to resize it and apply a watermark.
Upload the manipulated image to DigitalOcean Spaces
Remove the old image from local storage
This is the code I've written so far in my job's handle() method:
public function handle()
{
    // Retrieve the locally stored image
    $path = Storage::disk('local')->path($this->pic->storage_file_path);

    // Resize it with Intervention
    $img = Image::make($path);
    $img->resize(600, 600);

    // Upload it to DigitalOcean Spaces under a Y/m/d folder structure
    $folderStructure = Carbon::now()->format('Y') . '/' . Carbon::now()->format('m') . '/' . Carbon::now()->format('d') . '/';
    $filename = time() . '.' . $this->extension;
    Storage::disk('do_spaces')->put($folderStructure . $filename, $img, 'public');
}
The issues I'm now facing are the following:
If I try to dd($img) right after instantiating it with the Intervention library, I get the following:
Intervention\Image\Image {#1570 ▼ // app\Jobs\Images\ResizeUploadToDo.php:49
#driver: Intervention\Image\Gd\Driver {#1578 ▼
+decoder: Intervention\Image\Gd\Decoder {#1598 ▼
-data: null
}
+encoder: Intervention\Image\Gd\Encoder {#363 ▼
+result: null
+image: null
+format: null
+quality: null
}
}
#core: GdImage {#394 ▼
+size: "600x600"
+trueColor: true
}
#backups: []
+encoded: ""
+mime: "image/png"
+dirname: "C:\Users\Gianmarco\wa\hotelista\storage\app22/12/12"
+basename: "SPcNL5FD3OZV4heHWA103J4n5YU8xOCG1SU7pyMd.png"
+extension: "png"
+filename: "SPcNL5FD3OZV4heHWA103J4n5YU8xOCG1SU7pyMd"
}
To me it seems like the retrieved image is empty. Is that correct? Or how should I correctly retrieve the image from local storage?
I've noticed that if the job runs with the sync queue driver, files get uploaded to DigitalOcean Spaces but they are empty (file size: 0 MB), while if the job runs with the database queue driver, files are not uploaded to DigitalOcean Spaces at all.
Does anybody know how to solve this?
Hope somebody can help me with it.
Thank you
Add the variable 'url' => env('DO_SPACES_URL') below 'bucket' => env('DO_SPACES_BUCKET') in filesystems.php.
In your .env, add the variable DO_SPACES_URL=https://{your bucket name}.ams3.digitaloceanspaces.com
Upload file like this:
Storage::disk('do_spaces')->put('uploads', $request->file('file'), 'public');
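Regarding the empty uploads in the question above: the dump shows the image itself was loaded fine (core is a 600x600 GdImage), but encoded is still an empty string, so passing the un-encoded Intervention Image object straight to Storage::put() writes empty contents. A minimal sketch of the handle() method that encodes the image before uploading (assuming Intervention Image 2.x and the same folder/filename logic as in the question):
public function handle()
{
    // Absolute path of the locally stored upload
    $path = Storage::disk('local')->path($this->pic->storage_file_path);

    // Resize with Intervention
    $img = Image::make($path)->resize(600, 600);

    $folderStructure = Carbon::now()->format('Y/m/d') . '/';
    $filename = time() . '.' . $this->extension;

    // Encode to a binary string before handing it to the Storage facade;
    // an un-encoded Image object stringifies to "" and produces a 0-byte file
    Storage::disk('do_spaces')->put(
        $folderStructure . $filename,
        (string) $img->encode($this->extension),
        'public'
    );

    // Remove the original from local storage
    Storage::disk('local')->delete($this->pic->storage_file_path);
}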
Related
I am trying to manage DO's Spaces with Laravel 8's Storage, but I am getting errors which seem to come from Laravel's side.
To start, I ran this line in the terminal, as instructed in Laravel's documentation:
composer require league/flysystem-aws-s3-v3 "~1.0"
Afterwards I edited my environment variables:
DO_SPACES_KEY=*KEY*
DO_SPACES_SECRET=*SECRET*
DO_SPACES_ENDPOINT=ams3.digitaloceanspaces.com
DO_SPACES_REGION=AMS3
DO_SPACES_BUCKET=test-name
I also made changes in config/filesystems.php:
'do_spaces' => [
    'driver' => 's3',
    'key' => env('DO_SPACES_KEY'),
    'secret' => env('DO_SPACES_SECRET'),
    'endpoint' => env('DO_SPACES_ENDPOINT'),
    'region' => env('DO_SPACES_REGION'),
    'bucket' => env('DO_SPACES_BUCKET'),
],
After visiting this test route:
Route::get('/test', function (Request $request) {
    Storage::disk('do_spaces')->put('test.txt', 'hello world');
});
I get this error:
Error executing "PutObject" on "//test-name./test-name/test.txt"; AWS HTTP error: cURL error 6: Couldn't resolve host 'test-name' (see https://curl.haxx.se/libcurl/c/libcurl-errors.html) for http://test-name./test-name/test.txt
It seems that the problem occurs while Laravel is building the URL, which should not look like it does here (wrong: http://test-name./test-name/test.txt). However, I have no clue how to fix this issue or what I am doing wrong, since I followed all the steps just as many tutorials and documentation pages describe.
I had the same problem. I solved it the following way:
Add https:// to DO_SPACES_ENDPOINT (https://ams3.digitaloceanspaces.com)
In the put method, use the path to test.txt:
Storage::disk('do_spaces')->put('YOUR_SPACE_NAME/YOUR_FOLDER_NAME(if you have)/test.txt', 'hello world');
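Putting both changes together, a minimal sketch (the bucket name is taken from the question and the folder name is a placeholder):
// .env — the endpoint must include the scheme:
// DO_SPACES_ENDPOINT=https://ams3.digitaloceanspaces.com

// Write the file using a path prefixed as described above:
Storage::disk('do_spaces')->put('test-name/uploads/test.txt', 'hello world');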
So I have an UploadedFile that I want to store on the specific disk 'local'.
// $storedFilePath == "kml/uploadedfile.xml";
$storedFilePath = $request->file->store("kml", "local");
The file is actually uploaded into storage/app/kml/uploadedfile.xml - that is CORRECT, following the config for local disk:
'local' => [
    'driver' => 'local',
    'root' => storage_path('app'),
],
On the other hand, I need to get the full path (relative to the project root) to this file. I tried:
Storage::disk("local")->url($storedFilePath);
But this gets me /storage/kml/uploadedfile.xml - note that the path is missing the app/ folder.
One line in Laravel's FilesystemAdapter.php that determines the path is here (function getLocalUrl):
$path = '/storage/'.$path;
Given that $path starts with kml/, it makes perfect sense that it doesn't work. My question is: why? Why don't these correspond, and how do I get the full internal (i.e. not public) path?
OK, so it was actually quite simple, yet I believe Laravel's documentation about this is quite confusing. Laravel simply expects everyone to have a symlink from public/storage to storage/app/public.
It also assumes that everyone using the local disk and the url method wants the public URL, which is really only true for the public disk.
In config/filesystems.php I added 'url':
'local' => [
    'driver' => 'local',
    'root' => storage_path('app'),
    'url' => storage_path('app')
],
It works correctly now.
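As a side note (not part of the original answer): if what you need is the absolute path on disk rather than a URL, the filesystem's path() method may already be enough, depending on your Laravel version:
// Returns the absolute path, e.g. /var/www/project/storage/app/kml/uploadedfile.xml
$absolutePath = Storage::disk('local')->path($storedFilePath);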
I'm trying to enable this library in my localhost environment:
http://glide.thephpleague.com/1.0/config/integrations/laravel/
Current Laravel version: 5.5
gd2 is enabled in the WAMP extensions.
I can't seem to find where the problem is.
The path is OK, and the image exists at it.
See the following code for the server config:
$server = ServerFactory::create([
    'response' => new LaravelResponseFactory(app('request')),
    'source' => $source,
    //'cache' => new Filesystem(new Adapter('../storage/app/cache/')),
    'cache' => $cache,
    'cache_path_prefix' => '.cache',
    'base_url' => 'transform-img',
]);
Now I use this:
return $server->getImageResponse($path, request()->all());
It does not give any error.
When I dd() this, I get the following response:
StreamedResponse {#1151 ▼
#callback: Closure {#1177 ▶}
#streamed: false
-headersSent: false
+headers: ResponseHeaderBag {#1176 ▶}
#content: null
#version: "1.0"
#statusCode: 200
#statusText: "OK"
#charset: null
}
Callback Closure:
#callback: Closure {#1252 ▼
class: "League\Glide\Responses\SymfonyResponseFactory"
this: LaravelResponseFactory {#1231 …}
use: {▼
$stream: stream resource #543 ▼
timed_out: false
blocked: true
eof: false
wrapper_type: "plainfile"
stream_type: "STDIO"
mode: "rb"
unread_bytes: 0
seekable: true
uri: "D:\wamp\www\Bankrolla\storage\app/public\.cache/img/logo_no_text.png/32c8e67d979eab40a7ef6d1854f1f7cc"
options: []
}
}
file: "D:\wamp\www\Bankrolla\vendor\league\glide-symfony\src\Responses\SymfonyResponseFactory.php"
line: "48 to 54"
}
The statusCode shows 200 and there is no file-not-found error, yet it does not load any image; it just shows a placeholder in the browser when I navigate to it.
What can the issue be? If I replace the image name with any other random string, I get an image-not-found error, so it does find the image, though it fails to render it.
I have googled and searched through their GitHub comments, but could not find any problem similar to mine.
I only get a blank page/image if I load it directly.
I also looked into the cache directory; it includes the files, and those files' dimensions are resized, so I am not sure where it goes wrong even though it generates the cache files.
Maybe I am missing something here; if anyone can point me in the right direction, it would be very helpful.
Update:
Value of $source variable:
Filesystem {#1225 ▼
#adapter: Local {#1226 ▼
#pathSeparator: "\"
#permissionMap: array:2 [▼
"file" => array:2 [▼
"public" => 420
"private" => 384
]
"dir" => array:2 [▼
"public" => 493
"private" => 448
]
]
#writeFlags: 2
-linkHandling: 2
#pathPrefix: "D:\wamp\www\Bankrolla\storage\app/public\"
}
#plugins: []
#config: Config {#1229 ▼
#settings: []
#fallback: null
}
}
Storage directory in my public directory (it's a symbolic link to the original storage)
Storage directory of Laravel
The URL I am calling this from:
{localhostDomainHere}/image/img/logo_no_text.png?w=100&h=100&fit=crop-center
I don't know if you already fixed it, but we encountered the same problem a few days ago. After a long search, we found out that the error was caused by a newline in a config file.
So check all your config files for a space or newline before the opening tag. Otherwise, your response is no longer a valid response and you will get the empty box.
If it is not a config file, you need to check all the files that are loaded during the request.
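To automate that check, a rough sketch (a hypothetical helper, not part of Glide or Laravel) that flags config files with stray bytes before the opening tag could look like this:
// Drop this in a scratch file at the project root and run it with php.
// It reports any config file whose very first bytes are not exactly "<?php";
// a leading space or newline there gets flushed before the image data.
foreach (glob(__DIR__ . '/config/*.php') as $file) {
    if (strncmp(file_get_contents($file), '<?php', 5) !== 0) {
        echo "Stray bytes before <?php in: {$file}\n";
    }
}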
Put your images in the path referenced by $source. If the image is 1.jpg, then call the URL server_url/img/1.jpg (img comes from the route for the function you posted in the question). In my case both $source and $cache were /storage/app; I put an image in there and called the route for that image. Hope this helps you.
Check this.
This is my code:
<?php

namespace App\Http\Controllers;

use Illuminate\Http\Request;
use Illuminate\Contracts\Filesystem\Filesystem;
use League\Glide\Responses\LaravelResponseFactory;
use League\Glide\ServerFactory;

class GlideController extends Controller
{
    public function show(Filesystem $filesystem, $path)
    {
        $server = ServerFactory::create([
            'response' => new LaravelResponseFactory(app('request')),
            'source' => $filesystem->getDriver(),
            'cache' => $filesystem->getDriver(),
            'cache_path_prefix' => '.cache',
            'base_url' => 'img',
        ]);

        return $server->getImageResponse($path, request()->all());
    }
}
Route:
Route::get('/img/{path}', 'GlideController#show')->where('path', '.*');
This is the content of my storage/app:
[I run the script from localhost.]
I'm trying to upload files to an AWS S3 bucket using Laravel 5.4, but I get this error:
Error executing "PutObject" on "https://bucket_name.s3.amazonaws.com/1520719994357906.png"; AWS HTTP error: cURL error 60: SSL certificate problem: unable to get local issuer certificate (see http://curl.haxx.se/libcurl/c/libcurl-errors.html)
In filesystems.php:
's3' => [
    'driver' => 's3',
    'key' => 'KEY_HERE',
    'secret' => 'SECRET_HERE',
    'region' => 'us-east-1',
    'bucket' => 'bucket_name', // has global read access to files
],
In the controller:
Storage::disk('s3')->put($imageName, file_get_contents(public_path('galleries/').$imageName));
How do I solve this? If I upload the app to an EC2 instance, does it require SSL to be installed in order to upload files to the S3 bucket? Thanks in advance.
Uploading from the server worked fine, with no need to install SSL; it just doesn't work from localhost.
It just doesn't work from localhost. If you want to get it working on localhost, you have to make some changes in the vendor directory (for your local use only):
vendor/guzzle/src/handler/CurlFactory.php
Around line 350, comment out these two lines and add two new lines, or simply replace the two lines (as you wish):
if ($options['verify'] === false) {
    unset($conf[\CURLOPT_CAINFO]);
    $conf[\CURLOPT_SSL_VERIFYHOST] = 0;
    $conf[\CURLOPT_SSL_VERIFYPEER] = false;
} else {
    /* $conf[\CURLOPT_SSL_VERIFYHOST] = 2;
    $conf[\CURLOPT_SSL_VERIFYPEER] = true; */ // comment out these two lines
    $conf[\CURLOPT_SSL_VERIFYHOST] = 0;
    $conf[\CURLOPT_SSL_VERIFYPEER] = false;
}
Now it works fine.
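A less invasive alternative (a sketch, not part of the answer above): Laravel passes extra disk-config keys through to the underlying S3 client, so the AWS SDK's http option can usually be set there instead of editing vendor code, either pointing cURL at a downloaded CA bundle or, strictly for local development, disabling verification:
's3' => [
    'driver' => 's3',
    'key' => 'KEY_HERE',
    'secret' => 'SECRET_HERE',
    'region' => 'us-east-1',
    'bucket' => 'bucket_name',
    // AWS SDK HTTP option: point cURL at a CA bundle you downloaded
    // (e.g. cacert.pem), or use 'verify' => false for local development only.
    'http' => ['verify' => storage_path('cacert.pem')],
],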
Installation process
I followed this tutorial to install the AWS package in Laravel 5.3.
My code is below:
$s3 = \App::make('aws')->createClient('s3');
$s3->putObject(array(
    'Bucket' => 'Bucket_Name',
    'Key' => 'AWS_ACCESS_KEY_ID',
    'SourceFile' => 'http://domainname/sample.txt',
));
I am trying a .txt file with around 50 bytes of content and got the error below:
A sha256 checksum could not be calculated for the provided upload
body, because it was not seekable. To prevent this error you can
either 1) include the ContentMD5 or ContentSHA256 parameters with your
request, 2) use a seekable stream for the body, or 3) wrap the
non-seekable stream in a GuzzleHttp\Psr7\CachingStream object. You
should be careful though and remember that the CachingStream utilizes
PHP temp streams. This means that the stream will be temporarily
stored on the local disk.
Am I missing something?
SourceFile must be a local file path. The Body parameter accepts a stream, so you should be able to make a request with Guzzle and pass the body to it:
$client = new GuzzleHttp\Client();
$response = $client->get('http://domainname/sample.txt');

$s3->putObject([
    'Bucket' => 'Bucket_Name',
    'Key' => 'AWS_ACCESS_KEY_ID',
    'Body' => $response->getBody(),
]);