Error uploading from Laravel 5.4 to S3 bucket - laravel

[I run the script from localhost]
I'm trying to upload files using Laravel 5.4 to an AWS S3 bucket, but I get this error:
Error executing "PutObject" on "https://bucket_name.s3.amazonaws.com/1520719994357906.png"; AWS HTTP error: cURL error 60: SSL certificate problem: unable to get local issuer certificate (see http://curl.haxx.se/libcurl/c/libcurl-errors.html)
In filesystems.php:
's3' => [
    'driver' => 's3',
    'key' => 'KEY_HERE',
    'secret' => 'SECRET_HERE',
    'region' => 'us-east-1',
    'bucket' => 'bucket_name', // has global read access
],
In the controller:
Storage::disk('s3')->put($imageName, file_get_contents(public_path('galleries/').$imageName));
How do I solve this? If I upload the app to an EC2 instance, does it need SSL installed to upload files to the S3 bucket? Thanks in advance.

Uploading from the server worked fine with no need to install SSL; it just doesn't work from localhost.

It just doesn't work from localhost; if you want it to work on localhost, you have to make a change in the vendor directory (for your local use only):
vendor/guzzlehttp/guzzle/src/Handler/CurlFactory.php
Around line 350, comment out the two lines shown and add the two new ones below them (or simply replace them):
if ($options['verify'] === false) {
    unset($conf[\CURLOPT_CAINFO]);
    $conf[\CURLOPT_SSL_VERIFYHOST] = 0;
    $conf[\CURLOPT_SSL_VERIFYPEER] = false;
} else {
    /* $conf[\CURLOPT_SSL_VERIFYHOST] = 2;
    $conf[\CURLOPT_SSL_VERIFYPEER] = true; */ // comment out these two lines
    $conf[\CURLOPT_SSL_VERIFYHOST] = 0;
    $conf[\CURLOPT_SSL_VERIFYPEER] = false;
}
Now it works fine.
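A less invasive alternative (a sketch, not verified against every Laravel 5.4 setup): Laravel forwards the disk config array to the underlying AWS S3Client, so you can pass Guzzle's verify option there instead of patching vendor code. Point it at a CA bundle such as the cacert.pem from the curl site (also settable globally via curl.cainfo in php.ini), or disable verification on localhost only:
// config/filesystems.php -- for local development only
's3' => [
    'driver' => 's3',
    'key' => 'KEY_HERE',
    'secret' => 'SECRET_HERE',
    'region' => 'us-east-1',
    'bucket' => 'bucket_name',
    // Either point cURL at a downloaded CA bundle (example path)...
    'http' => ['verify' => 'C:\path\to\cacert.pem'],
    // ...or, as a last resort on localhost, disable verification:
    // 'http' => ['verify' => false],
],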

Related

Issues in uploading files to digital ocean spaces from laravel

It's been almost 48 hours that I've been facing an issue with file uploads to Digital Ocean Spaces from Laravel, and I can't make it work.
I've successfully built a Livewire component that handles the multiple upload of images; each image is stored in Laravel's local storage.
As we plan to host thousands of different images on this website, we have now decided to use Digital Ocean Spaces as the storage disk.
To do so, I first installed the league/flysystem-aws-s3-v3 composer package, as required.
Then in my config/filesystems.php I've done the following:
'do_spaces' => [
    'driver' => 's3',
    'key' => env('DO_SPACES_KEY'),
    'secret' => env('DO_SPACES_SECRET'),
    'endpoint' => env('DO_SPACES_ENDPOINT'),
    'region' => env('DO_SPACES_REGION'),
    'bucket' => env('DO_SPACES_BUCKET'),
],
And in my .env the following :
DO_SPACES_KEY=key
DO_SPACES_SECRET=secret
DO_SPACES_ENDPOINT=https://ams3.digitaloceanspaces.com
DO_SPACES_REGION=AMS3
DO_SPACES_BUCKET=bucket_name
In my Livewire component, after uploading the image to local storage, for each image I'm dispatching a job (dispatch(new ResizeUploadToDo($pic, $extension));) which is supposed to:
Retrieve the image from local storage
Pass it to the Intervention library in order to resize it and apply a watermark
Upload the manipulated image to Digital Ocean Spaces
Remove the old image from local storage
This is the code I've written so far in my job's handle() method:
public function handle()
{
    $path = Storage::disk('local')->path($this->pic->storage_file_path);
    $img = Image::make($path);
    $img->resize(600, 600);
    $folderStructure = Carbon::now()->format('Y') . '/' . Carbon::now()->format('m') . '/' . Carbon::now()->format('d') . '/';
    $filename = time() . '.' . $this->extension;
    Storage::disk('do_spaces')->put($folderStructure . $filename, $img, 'public');
}
The issues that I'm now facing are the following:
If I try to dd($img) right after instantiating it with the Intervention library, I get the following:
Intervention\Image\Image {#1570 // app\Jobs\Images\ResizeUploadToDo.php:49
  #driver: Intervention\Image\Gd\Driver {#1578
    +decoder: Intervention\Image\Gd\Decoder {#1598
      -data: null
    }
    +encoder: Intervention\Image\Gd\Encoder {#363
      +result: null
      +image: null
      +format: null
      +quality: null
    }
  }
  #core: GdImage {#394
    +size: "600x600"
    +trueColor: true
  }
  #backups: []
  +encoded: ""
  +mime: "image/png"
  +dirname: "C:\Users\Gianmarco\wa\hotelista\storage\app22/12/12"
  +basename: "SPcNL5FD3OZV4heHWA103J4n5YU8xOCG1SU7pyMd.png"
  +extension: "png"
  +filename: "SPcNL5FD3OZV4heHWA103J4n5YU8xOCG1SU7pyMd"
}
To me it seems like the retrieved image is empty, is that correct? Or how should I correctly retrieve the image from local storage?
I've noticed that if the job runs with the sync queue driver, files get uploaded to Digital Ocean Spaces but they are empty (file size: 0 MB), while if the job runs with the database queue driver, files are not uploaded to Digital Ocean Spaces at all.
Does anybody know how to solve this matter?
Hope somebody can help me with it.
Thank you
Add the variable 'url' => env('DO_SPACES_URL') below 'bucket' => env('DO_SPACES_BUCKET'), in filesystems.php.
In your .env put the variable DO_SPACES_URL=https://{your bucket name}.ams3.digitaloceanspaces.com
Upload the file like this:
Storage::disk('do_spaces')->put('uploads', $request->file('file'), 'public');
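As for the empty files described above, a likely cause (an assumption; the answer does not address it) is that the Intervention Image object is passed straight to put(), which expects a string or stream. Encoding the image first yields actual file contents; a sketch of the job's handle() method with that change:
public function handle()
{
    $path = Storage::disk('local')->path($this->pic->storage_file_path);

    $img = Image::make($path);
    $img->resize(600, 600);

    $folderStructure = Carbon::now()->format('Y/m/d') . '/';
    $filename = time() . '.' . $this->extension;

    // encode() renders the image to binary; casting to string gives the raw bytes
    Storage::disk('do_spaces')->put(
        $folderStructure . $filename,
        (string) $img->encode($this->extension),
        'public'
    );

    // step 4 from the list above: drop the local copy once uploaded
    Storage::disk('local')->delete($this->pic->storage_file_path);
}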

How to Fix Client Error: file_get_contents(): in Cpanel with Laravel Project inside

I have a problem when running a seeder in my Laravel project on cPanel.
This is the error:
Client Error: file_get_contents(): https:// wrapper is disabled in the server configuration by allow_url_fopen=0
at vendor/kavist/rajaongkir/src/HttpClients/BasicClient.php:74
70▕
71▕ private function executeRequest(string $url): array
72▕ {
73▕ set_error_handler(function ($severity, $message) {
➜ 74▕ throw new BasicHttpClientException('Client Error: '.$message, $severity);
75▕ });
76▕
77▕ $rawResponse = file_get_contents($url, false, $this->context);
Please, can someone help me?
This is my LocationsTableSeeder.php:
public function run()
{
    $daftarProvinsi = RajaOngkir::provinsi()->all();
    foreach ($daftarProvinsi as $provinceRow) {
        Province::create([
            'province_id' => $provinceRow['province_id'],
            'nama' => $provinceRow['province'],
        ]);
        $daftarKota = RajaOngkir::kota()->dariProvinsi($provinceRow['province_id'])->get();
        foreach ($daftarKota as $cityRow) {
            Kabupaten::create([
                'province_id' => $provinceRow['province_id'],
                'city_id' => $cityRow['city_id'],
                'nama' => $cityRow['city_name'],
                'type' => $cityRow['type'],
                'postal_code' => $cityRow['postal_code'],
            ]);
        }
    }
}
It's good practice to disable file_get_contents' ability to open remote URLs (the ones starting with http(s)://) on shared servers (which frequently use cPanel), to avoid the download/injection of malicious scripts on your server.
Go to the cPanel PHP options and enable allow_url_fopen, as pointed out by apokryfos; it's usually under the Switch To PHP Options menu. Some providers will not allow this change via cPanel, and you might need to open a support ticket.
Usually this option cannot be changed by ini_set or from within the PHP script itself in any other way.
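Since the setting only changes via the server configuration, it helps to verify it before re-running the seeder. A minimal sketch (runnable in php artisan tinker), with a cURL-based fetch as a fallback that bypasses the https:// wrapper entirely; the URL is a placeholder, and the rajaongkir package would still need its own HTTP client swapped out to benefit from this:
// confirm whether allow_url_fopen is on for the SAPI you are using
var_dump((bool) ini_get('allow_url_fopen'));

// fallback: fetch over cURL, which ignores allow_url_fopen
$ch = curl_init('https://example.com/data.json'); // placeholder URL
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);   // return the body instead of printing it
$rawResponse = curl_exec($ch);
curl_close($ch);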

How to manage data at DO Spaces with Laravel Storage

I am trying to manage DO's Spaces with Laravel 8's Storage, however I am getting errors which seem to come from Laravel's side.
To start, I ran this line in the terminal, as instructed in Laravel's documentation:
composer require league/flysystem-aws-s3-v3 "~1.0"
Afterwards I edited my environment variables:
DO_SPACES_KEY=*KEY*
DO_SPACES_SECRET=*SECRET*
DO_SPACES_ENDPOINT=ams3.digitaloceanspaces.com
DO_SPACES_REGION=AMS3
DO_SPACES_BUCKET=test-name
I also made these changes in config/filesystems.php:
'do_spaces' => [
    'driver' => 's3',
    'key' => env('DO_SPACES_KEY'),
    'secret' => env('DO_SPACES_SECRET'),
    'endpoint' => env('DO_SPACES_ENDPOINT'),
    'region' => env('DO_SPACES_REGION'),
    'bucket' => env('DO_SPACES_BUCKET'),
],
After visiting this test route
Route::get('/test', function (Request $request) {
    Storage::disk('do_spaces')->put('test.txt', 'hello world');
});
I am getting this error
Error executing "PutObject" on "//test-name./test-name/test.txt"; AWS HTTP error: cURL error 6: Couldn't resolve host 'test-name' (see https://curl.haxx.se/libcurl/c/libcurl-errors.html) for http://test-name./test-name/test.txt
It seems that the problem occurs while Laravel is building the request URL, which should not look like it does here (wrong: http://test-name./test-name/test.txt). However, I have no clue how to fix this issue or what I am doing wrong, since I followed all the steps exactly as many tutorials and the documentation describe.
I had the same problem. I solved it the following way:
Add https:// to DO_SPACES_ENDPOINT (https://ams3.digitaloceanspaces.com)
In the put method, use the path to test.txt:
Storage::disk('do_spaces')->put('YOUR_SPACE_NAME/YOUR_FOLDER_NAME(if you have one)/test.txt', 'hello world');
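Putting the fix together, a working .env would look roughly like this (a sketch based on the answer above; note that the region for the AMS3 endpoint is conventionally written lowercase as ams3):
DO_SPACES_KEY=*KEY*
DO_SPACES_SECRET=*SECRET*
DO_SPACES_ENDPOINT=https://ams3.digitaloceanspaces.com
DO_SPACES_REGION=ams3
DO_SPACES_BUCKET=test-name
With the scheme present, the S3 client can resolve the endpoint host correctly instead of producing the malformed http://test-name./ URL.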

Issue with PEM file while building a query using Laravel and Goutte

I'm building a website using Laravel 8 and Goutte 4. I'm trying to send a request to a website that requires authentication with a .pem file. To do so, I include the 'local_cert' param when creating the HTTP client instance, like this:
use Goutte\Client;
use Symfony\Component\HttpClient\HttpClient;

class Query
{
    protected $client;

    public function __construct()
    {
        $this->client = new Client(
            HttpClient::create([
                'timeout' => 30,
                'local_cert' => /* absolute path to the .pem file */,
            ])
        );
    }

    public function query()
    {
        $this->client->request('GET', "https://palena.sii.cl/cvc_cgi/dte/of_solicita_folios");
    }
}
This works perfectly well on my local server, but when I try it on the production server, I get an exception: "Problem with the local SSL certificate". The .pem file being used is the same in development and production, and PHP can read the file on the production server.
The full error in the laravel Log is:
[2020-12-02 21:32:30] production.ERROR: Problem with the local SSL certificate for "https://palena.sii.cl/cvc_cgi/dte/of_solicita_folios". {"userId":1,"exception":"[object] (Symfony\Component\HttpClient\Exception\TransportException(code: 0): Problem with the local SSL certificate for "https://palena.sii.cl/cvc_cgi/dte/of_solicita_folios". at /my/path/vendor/symfony/http-client/Chunk/ErrorChunk.php:65)
Did you ever fix it? I could only make this work with GuzzleHttp:
$requestAuthentication = $clientAuth->request('GET', $urlAuthentication, [
    'cert' => [$rutaPem, 'clave'], // [path to the .pem file, its passphrase]
    'cookies' => $cookiesAuth,
]);
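If the .pem requires a passphrase, that may be the missing piece on production: Symfony's HttpClient (which Goutte wraps) takes it as a separate option. A sketch using option names from Symfony's documented option defaults; the paths and passphrase are placeholders:
$this->client = new Client(
    HttpClient::create([
        'timeout' => 30,
        'local_cert' => '/absolute/path/to/cert.pem', // placeholder path
        'local_pk' => '/absolute/path/to/key.pem',    // private key, if stored separately
        'passphrase' => 'secret',                     // passphrase protecting the key
    ])
);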

Upload file in S3 using Laravel 5.3

Installation Process
I followed this tutorial to install the AWS package in Laravel 5.3.
My code is below:
$s3 = \App::make('aws')->createClient('s3');
$s3->putObject(array(
    'Bucket' => 'Bucket_Name',
    'Key' => 'AWS_ACCESS_KEY_ID',
    'SourceFile' => 'http://domainname/sample.txt',
));
I am trying to upload a txt file with around 50 bytes of content and got the error below.
A sha256 checksum could not be calculated for the provided upload
body, because it was not seekable. To prevent this error you can
either 1) include the ContentMD5 or ContentSHA256 parameters with your
request, 2) use a seekable stream for the body, or 3) wrap the
non-seekable stream in a GuzzleHttp\Psr7\CachingStream object. You
should be careful though and remember that the CachingStream utilizes
PHP temp streams. This means that the stream will be temporarily
stored on the local disk.
Am I missing something?
SourceFile must be a local file path. The Body parameter accepts a stream, so you should be able to make a request with Guzzle and pass its body through. Note also that Key is the object's destination path in the bucket, not your AWS access key.
$client = new GuzzleHttp\Client();
$response = $client->get('http://domainname/sample.txt');
$s3->putObject([
    'Bucket' => 'Bucket_Name',
    'Key' => 'sample.txt', // where the object will live in the bucket
    'Body' => $response->getBody(),
]);
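Alternatively, the error message itself points at its option 3: wrapping the non-seekable stream in GuzzleHttp\Psr7\CachingStream so the SDK can seek back and compute the checksum. A minimal sketch, assuming guzzlehttp/psr7 v1 (newer versions expose Utils::streamFor() instead of stream_for()):
use GuzzleHttp\Psr7\CachingStream;

$s3->putObject([
    'Bucket' => 'Bucket_Name',
    'Key' => 'sample.txt',
    // CachingStream buffers the remote stream into a local temp stream,
    // making it seekable at the cost of temporary disk usage
    'Body' => new CachingStream(
        \GuzzleHttp\Psr7\stream_for(fopen('http://domainname/sample.txt', 'r'))
    ),
]);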
