Installation Process
I followed this tutorial to install the AWS package in Laravel 5.3.
My code is below:
$s3 = \App::make('aws')->createClient('s3');
$s3->putObject(array(
    'Bucket'     => 'Bucket_Name',
    'Key'        => 'AWS_ACCESS_KEY_ID',
    'SourceFile' => 'http://domainname/sample.txt',
));
I am trying to upload a txt file with around 50 bytes of content and got the error below.
A sha256 checksum could not be calculated for the provided upload body, because it was not seekable. To prevent this error you can either 1) include the ContentMD5 or ContentSHA256 parameters with your request, 2) use a seekable stream for the body, or 3) wrap the non-seekable stream in a GuzzleHttp\Psr7\CachingStream object. You should be careful though and remember that the CachingStream utilizes PHP temp streams. This means that the stream will be temporarily stored on the local disk.
Am I missing something?
SourceFile must be a local file path. The Body parameter accepts a stream, so you can make the request with Guzzle and pass its body through:
$client = new GuzzleHttp\Client();
$response = $client->get('http://domainname/sample.txt');

$s3->putObject([
    'Bucket' => 'Bucket_Name',
    'Key'    => 'AWS_ACCESS_KEY_ID',
    'Body'   => $response->getBody(),
]);
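Alternatively, option 3 from the error message works too: wrap the non-seekable remote stream in a GuzzleHttp\Psr7\CachingStream so the SDK can rewind it to compute the checksum. A sketch, assuming allow_url_fopen is enabled (on older guzzlehttp/psr7 releases the helper is the \GuzzleHttp\Psr7\stream_for() function rather than Utils::streamFor()):
use GuzzleHttp\Psr7\CachingStream;
use GuzzleHttp\Psr7\Utils;

// Open the remote file as a (non-seekable) read stream...
$remote = Utils::streamFor(fopen('http://domainname/sample.txt', 'r'));

// ...and wrap it so the SDK can seek; note that CachingStream buffers
// the data in a PHP temp stream (i.e. temporarily on local disk).
$s3->putObject([
    'Bucket' => 'Bucket_Name',
    'Key'    => 'AWS_ACCESS_KEY_ID',
    'Body'   => new CachingStream($remote),
]);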
For almost 48 hours I've been facing an issue with file uploads to DigitalOcean Spaces from Laravel, and I can't make it work.
I've successfully built a Livewire component that handles the multiple upload of images; each image is stored in Laravel's local storage.
As we plan to host thousands of different images on this website, we have now decided to use DigitalOcean Spaces as the storage disk.
To do so, I first installed the league/flysystem-aws-s3-v3 composer package, as required.
Then in my config/filesystems.php I added the following:
'do_spaces' => [
    'driver' => 's3',
    'key' => env('DO_SPACES_KEY'),
    'secret' => env('DO_SPACES_SECRET'),
    'endpoint' => env('DO_SPACES_ENDPOINT'),
    'region' => env('DO_SPACES_REGION'),
    'bucket' => env('DO_SPACES_BUCKET'),
],
And in my .env the following:
DO_SPACES_KEY=key
DO_SPACES_SECRET=secret
DO_SPACES_ENDPOINT=https://ams3.digitaloceanspaces.com
DO_SPACES_REGION=AMS3
DO_SPACES_BUCKET=bucket_name
In my Livewire component, after uploading the images to local storage, for each image I'm dispatching a job (dispatch(new ResizeUploadToDo($pic, $extension));) which is supposed to:
Retrieve the image from local storage
Pass it to the Intervention library in order to resize it and apply a watermark
Upload the manipulated image to DigitalOcean Spaces
Remove the old image from local storage
This is the code I've written so far in my job's handle() method:
public function handle()
{
    $path = Storage::disk('local')->path($this->pic->storage_file_path);

    $img = Image::make($path);
    $img->resize(600, 600);

    $folderStructure = Carbon::now()->format('Y') . '/' . Carbon::now()->format('m') . '/' . Carbon::now()->format('d') . '/';
    $filename = time() . '.' . $this->extension;

    Storage::disk('do_spaces')->put($folderStructure . $filename, $img, 'public');
}
The issues that I'm now facing are the following:
If I try to dd($img) right after instantiating it with the Intervention library, I get the following:
Intervention\Image\Image {#1570 // app\Jobs\Images\ResizeUploadToDo.php:49
  #driver: Intervention\Image\Gd\Driver {#1578
    +decoder: Intervention\Image\Gd\Decoder {#1598
      -data: null
    }
    +encoder: Intervention\Image\Gd\Encoder {#363
      +result: null
      +image: null
      +format: null
      +quality: null
    }
  }
  #core: GdImage {#394
    +size: "600x600"
    +trueColor: true
  }
  #backups: []
  +encoded: ""
  +mime: "image/png"
  +dirname: "C:\Users\Gianmarco\wa\hotelista\storage\app22/12/12"
  +basename: "SPcNL5FD3OZV4heHWA103J4n5YU8xOCG1SU7pyMd.png"
  +extension: "png"
  +filename: "SPcNL5FD3OZV4heHWA103J4n5YU8xOCG1SU7pyMd"
}
To me it seems like the retrieved image is empty. Is that correct? If so, how do I correctly retrieve the image from local storage?
I've noticed that if the job runs with the sync queue driver, files get uploaded to DigitalOcean Spaces but they are empty (file size: 0 MB), while if the job runs with the database queue driver, files are not uploaded at all.
Does anybody know how to solve this matter?
Hope somebody can help me with it.
Thank you
Add the entry 'url' => env('DO_SPACES_URL') below 'bucket' => env('DO_SPACES_BUCKET'), in filesystems.php.
In your .env put the variable DO_SPACES_URL=https://{your bucket name}.ams3.digitaloceanspaces.com
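With both in place, the tail of the do_spaces disk entry would read (a sketch of just the last lines of the array):
    'bucket' => env('DO_SPACES_BUCKET'),
    'url' => env('DO_SPACES_URL'),
],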
Then upload the file like this:
Storage::disk('do_spaces')->put('uploads', $request->file('file'), 'public');
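Note also that the empty files in the job above have a more direct cause: the Intervention Image object is passed to put() before it has ever been encoded (the dump shows +encoded: ""), and its string conversion yields exactly that empty encoded value. A minimal sketch of the upload step, assuming Intervention Image 2.x, encodes first:
$img = Image::make($path)->resize(600, 600);

// Encode to binary before handing the contents to the Storage facade;
// put() expects a string or stream, not an Image object.
Storage::disk('do_spaces')->put(
    $folderStructure . $filename,
    (string) $img->encode($this->extension),
    'public'
);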
I'm trying to generate and then download a csv file in Laravel 9.
The generation works fine at the correct location (public/files/OUT/), but I'm getting the content in the response and no download occurs.
Here is my code:
$path = storage_path('app\public\files\OUT\\');
$filename = 'dataTemplate';

$f = fopen($path.$filename.'.csv', 'wb');
// => PUT DATA IN CSV => no problem
fclose($f);

// file is generated successfully at path location
$headers = [
    'Content-Type' => 'text/csv',
];

return response()->download($path.'dataTemplate.csv', 'dataTemplate.csv', $headers);
Thanks in advance
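If the route is called via XHR/AJAX, that would explain the symptom: the CSV bytes arrive in the response, but the browser only shows a save dialog for a regular navigation (a plain link or form submit), so response()->download() appears to do nothing. As an aside, the intermediate file can be skipped entirely with response()->streamDownload(), which has been available since Laravel 5.6 and thus works in Laravel 9. A minimal sketch (the fputcsv() row is illustrative):
return response()->streamDownload(function () {
    $out = fopen('php://output', 'wb');
    // => PUT DATA IN CSV => same generation code as above, e.g.:
    fputcsv($out, ['col1', 'col2']); // illustrative row
    fclose($out);
}, 'dataTemplate.csv', ['Content-Type' => 'text/csv']);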
I am trying to manage DO's Spaces with Laravel 8's Storage facade, however I am getting errors which seem to come from Laravel's side.
To start, I ran this line in the terminal, as instructed in Laravel's documentation:
composer require league/flysystem-aws-s3-v3 "~1.0"
Afterwards I edited my environment variables:
DO_SPACES_KEY=*KEY*
DO_SPACES_SECRET=*SECRET*
DO_SPACES_ENDPOINT=ams3.digitaloceanspaces.com
DO_SPACES_REGION=AMS3
DO_SPACES_BUCKET=test-name
I also added this disk to config/filesystems.php:
'do_spaces' => [
    'driver' => 's3',
    'key' => env('DO_SPACES_KEY'),
    'secret' => env('DO_SPACES_SECRET'),
    'endpoint' => env('DO_SPACES_ENDPOINT'),
    'region' => env('DO_SPACES_REGION'),
    'bucket' => env('DO_SPACES_BUCKET'),
],
After visiting this test route:
Route::get('/test', function (Request $request) {
    Storage::disk('do_spaces')->put('test.txt', 'hello world');
});
I am getting this error:
Error executing "PutObject" on "//test-name./test-name/test.txt"; AWS HTTP error: cURL error 6: Couldn't resolve host 'test-name' (see https://curl.haxx.se/libcurl/c/libcurl-errors.html) for http://test-name./test-name/test.txt
It seems the problem occurs while Laravel is building the URL, which should not look the way it does here (wrong: http://test-name./test-name/test.txt). However, I have no clue how to fix this or what I am doing wrong, since I followed all the steps exactly as many tutorials and docs describe.
I had the same problem. I solved it the following way:
Add https:// to DO_SPACES_ENDPOINT (https://ams3.digitaloceanspaces.com)
In the put method, use the path to test.txt:
Storage::disk('do_spaces')->put('YOUR_SPACE_NAME/YOUR_FOLDER_NAME(if you have)/test.txt', 'hello world');
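Related to why the space name ends up in the path at all: the S3 client behind Laravel's s3 driver also accepts a 'use_path_style_endpoint' flag, which controls whether the bucket goes into the hostname or into the URL path. A sketch of the disk entry with it spelled out (an assumption that this is relevant to your setup; false is the S3 client's default):
'do_spaces' => [
    'driver' => 's3',
    'key' => env('DO_SPACES_KEY'),
    'secret' => env('DO_SPACES_SECRET'),
    'endpoint' => env('DO_SPACES_ENDPOINT'), // must include the https:// scheme
    'region' => env('DO_SPACES_REGION'),
    'bucket' => env('DO_SPACES_BUCKET'),
    'use_path_style_endpoint' => false, // bucket in the hostname, not the path
],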
I am using Guzzle to download a file from a URL and save it into my storage.
So I have code that looks like this:
$response = $this->client->request('GET', $model->url, [
    'stream' => true,
]);

$body = $response->getBody();

while (!$body->eof()) {
    Storage::append($this->filePath, $body->read(1024));
}
But when I open the folder where the file is located, I see that the file size keeps changing and sometimes it is zero, so in the end I get an invalid file.
How can I solve this problem?
[I run the script from localhost]
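A likely culprit (an assumption, but consistent with the fluctuating size): Laravel's Storage::append() re-reads the whole file and writes it back with a separator (PHP_EOL by default) between every appended chunk, which both corrupts binary data and makes the size bounce around while the loop runs. A minimal sketch that avoids append() by letting Guzzle stream the body straight to the target file via its 'sink' request option (assuming $this->filePath lives on the default local disk):
// Guzzle writes the response body directly to the file; no manual chunk loop.
$this->client->request('GET', $model->url, [
    'sink' => Storage::path($this->filePath),
]);

// Alternatively, hand the PSR-7 body to the filesystem as a stream:
// $response = $this->client->request('GET', $model->url, ['stream' => true]);
// Storage::writeStream($this->filePath, $response->getBody()->detach());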
I'm trying to upload files using Laravel 5.4 to AWS S3 bucket but I get this error:
Error executing "PutObject" on "https://bucket_name.s3.amazonaws.com/1520719994357906.png"; AWS HTTP error: cURL error 60: SSL certificate problem: unable to get local issuer certificate (see http://curl.haxx.se/libcurl/c/libcurl-errors.html)
In filesystems.php:
's3' => [
    'driver' => 's3',
    'key' => 'KEY_HERE',
    'secret' => 'SECRET_HERE',
    'region' => 'us-east-1',
    'bucket' => 'bucket_name', // has global access to read files
],
In the controller:
Storage::disk('s3')->put($imageName, file_get_contents(public_path('galleries/').$imageName));
How do I solve this? If I deploy the app to an EC2 instance, does it require SSL installed to upload files to the S3 bucket? Thanks in advance.
Uploading from the server worked fine, no need to install SSL; it just doesn't work from localhost.
It just doesn't work from localhost; if you want it to work on localhost, you have to make some changes in the vendor directory (for your local use only):
vendor/guzzlehttp/guzzle/src/Handler/CurlFactory.php
Around line 350, comment out these two lines and add two new ones in their place (or simply replace them):
if ($options['verify'] === false) {
    unset($conf[\CURLOPT_CAINFO]);
    $conf[\CURLOPT_SSL_VERIFYHOST] = 0;
    $conf[\CURLOPT_SSL_VERIFYPEER] = false;
} else {
    /* $conf[\CURLOPT_SSL_VERIFYHOST] = 2;
       $conf[\CURLOPT_SSL_VERIFYPEER] = true; */ // commented out
    $conf[\CURLOPT_SSL_VERIFYHOST] = 0;
    $conf[\CURLOPT_SSL_VERIFYPEER] = false;
}
Now it works fine.
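A less invasive alternative than editing vendor code (which the next composer install or update will overwrite): point PHP at an up-to-date CA bundle in php.ini so certificate verification actually succeeds on localhost. A sketch; the cacert.pem path is illustrative:
; in php.ini: download cacert.pem from https://curl.se/docs/caextract.html,
; point both cURL and OpenSSL at it, then restart the web server
curl.cainfo = "C:\php\extras\ssl\cacert.pem"
openssl.cafile = "C:\php\extras\ssl\cacert.pem"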