Uploading image file to Google Cloud using Laravel

Hi, I am trying to upload an image file to Google Cloud Storage using a Laravel API.
I have integrated the Google SDK via Composer. When I hit the endpoint with Postman I get the URL back and it is stored in my database, but the image file itself is not uploaded to the folder in Google Cloud. I created a folder named 'avatars' in my bucket.
Here is my code.
This is my controller:
public function updateAvatar(AvatarUploadRequest $request) {
    $me = Auth::user();
    $disk = Storage::disk('gcs');
    $url = $disk->url('avatars' . "/" . $me->uuid . ".jpg");
    $me->avatar = $url;
    $me->save();
    return $this->prepareItem($me);
}
This is my filesystems.php file:
'gcs' => [
    'driver' => 'gcs',
    'project_id' => env('GOOGLE_CLOUD_PROJECT_ID', 'my-project-id'),
    'key_file' => env('GOOGLE_CLOUD_KEY_FILE', null),
    'bucket' => env('GOOGLE_CLOUD_STORAGE_BUCKET', 'my-bucket-name'),
    'path_prefix' => env('GOOGLE_CLOUD_STORAGE_PATH_PREFIX', null),
    'storage_api_uri' => env('GOOGLE_CLOUD_STORAGE_API_URI', 'https://console.cloud.google.com/storage/browser/my-project-id/'),
],
This is all I have done. Did I miss anything? Is any additional configuration needed?

This is because you are only generating the URL but never storing the file on the disk.
Here is a code example.
First, get the uploaded file from the request:
$avatar = $request->file('avatar');
Second, save it into storage:
Storage::disk('gcs')->put('avatars/' . $me->uuid, $avatar);
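Putting the two steps together with the code from the question, a minimal sketch of the full method could look like this (assuming the upload field is named avatar and the avatars are stored as JPEGs, as in the question):
public function updateAvatar(AvatarUploadRequest $request) {
    $me = Auth::user();

    // store the file contents first, then build the public URL from the same path
    $avatar = $request->file('avatar');
    $path = 'avatars/' . $me->uuid . '.jpg';
    Storage::disk('gcs')->put($path, file_get_contents($avatar));

    $me->avatar = Storage::disk('gcs')->url($path);
    $me->save();
    return $this->prepareItem($me);
}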

Step 1: First create an account on Google Cloud. You will need credit card details for that, but you won't be charged as long as you don't go for "Upgrade"; the card is only used to verify that you are not a robot.
Step 2: Create a project in Google Cloud. Here, for example, the project name is "My-project".
Step 3: Create a bucket in the project "My-project". For example, here I created "My-buckets".
Step 4: Create a folder in the bucket. For example, here I created a folder named "avatars".
Step 5: Go to IAM & Admin => Service accounts => Create Service account. The service account name should be the bucket name "my-buckets". Check "Furnish a new private key" and save; a new JSON file will be downloaded. Put that file in the project. Here I renamed it to my-buckets.json.
Go to the project in XAMPP.
Step 6: Go to userController.php
[app->http->controllers->api->v2->userController.php]
public function updateAvatar(AvatarUploadRequest $request) {
    $me = Auth::user();
    $file = $request->file('avatar');
    $name = $me->uuid . "." . $file->getClientOriginalExtension();
    $filePath = 'avatars/' . $name;
    Storage::disk('gcs')->put($filePath, file_get_contents($file));

    $gcs = Storage::disk('gcs');
    $url = $gcs->url($filePath);
    $me->avatar = $url;
    $me->save();
    return $this->prepareItem($me);
}
Step 7: Go to filesystems.php
[config->filesystems.php]
Set a driver for Google Cloud:
'gcs' => [
    'driver' => 'gcs',
    'project_id' => env('GOOGLE_CLOUD_PROJECT_ID', 'my-project-209405'),
    'key_file' => env('GOOGLE_APPLICATION_CREDENTIALS', './my-buckets.json'), // optional: /path/to/service-account.json
    'bucket' => env('GOOGLE_CLOUD_STORAGE_BUCKET', 'my-buckets'),
    'path_prefix' => env('GOOGLE_CLOUD_STORAGE_PATH_PREFIX', null), // optional: /default/path/to/apply/in/bucket
    'storage_api_uri' => env('GOOGLE_CLOUD_STORAGE_API_URI', 'https://storage.googleapis.com/my-buckets/'), // see: Public URLs below
],
Add the path of the my-buckets.json file we got at Step 5 to the key_file entry.
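If you prefer to keep these values out of the config file, the matching .env entries could look something like this (the values are just the example project ID, key file and bucket name used in these steps):
GOOGLE_CLOUD_PROJECT_ID=my-project-209405
GOOGLE_APPLICATION_CREDENTIALS=./my-buckets.json
GOOGLE_CLOUD_STORAGE_BUCKET=my-buckets
GOOGLE_CLOUD_STORAGE_API_URI=https://storage.googleapis.com/my-buckets/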
Step 8: Download the Google Cloud SDK console.
Url - https://cloud.google.com/sdk/
Step 9: At first we don't have access to the account under which the Google Cloud project was created. To get access we need to run a gcloud command in the Google Cloud SDK console.
Run : gcloud auth login
It will open the browser and ask for the Gmail account with which we created the Google Cloud Storage, and to allow the Google SDK permission to access it. Then it will show the current project in the console.
Step 10: Run the command below to make the objects publicly accessible. The URL that we are getting and storing in the database does not have public access by default.
Run : gsutil iam ch allUsers:objectViewer gs://my-buckets
Hope you can now upload your files from the project to Cloud Storage.
Thank you,
Harisankar.H

Related

Laravel: how to upload pdf file directly to Google Cloud Storage bucket without first saving it locally

I am using the google/cloud-storage package in an API and successfully uploading pdf files to a Google Cloud Storage bucket. However, the pdf files are first saved locally before they are uploaded to the Google Cloud Storage bucket.
How can I skip saving them locally and instead upload them directly to the Google Cloud Storage bucket? I am planning to host the API on Google App Engine.
This is what I am doing currently:
$filename = $request['firstname'] . '.pdf';
$fileStoragePath = '/storage/pdf/' . $filename;
$publicPath = public_path($fileStoragePath);
$pdf = App::make('dompdf.wrapper');
$pdf->loadView('pdfdocument', $validatedData);
$pdf->save($publicPath);
$googleConfigFile = file_get_contents(config_path('googlecloud.json'));
$storage = new StorageClient([
'keyFile' => json_decode($googleConfigFile, true)
]);
$storageBucketName = config('googlecloud.storage_bucket');
$bucket = $storage->bucket($storageBucketName);
$fileSource = fopen($publicPath, 'r');
$newFolderName = $request['firstname'].'_'.date("Y-m-d").'_'.date("H:i:s");
$googleCloudStoragePath = $newFolderName.'/'.$filename;
/*
* Upload a file to the bucket.
* Using Predefined ACLs to manage object permissions, you may
* upload a file and give read access to anyone with the URL.
*/
$bucket->upload($fileSource, [
'predefinedAcl' => 'publicRead',
'name' => $googleCloudStoragePath
]);
Is it possible to upload files to a Google Cloud Storage bucket without first saving them locally?
Thank you for your help.
I have not verified this code, but the PDF class's output() method returns a string.
$pdf = App::make('dompdf.wrapper');
$pdf->loadView('pdfdocument', $validatedData);
...
$bucket->upload($pdf->output(), [
'predefinedAcl' => 'publicRead',
'name' => $googleCloudStoragePath
]);
You can also simplify the client code. Replace:
$googleConfigFile = file_get_contents(config_path('googlecloud.json'));
$storage = new StorageClient([
    'keyFile' => json_decode($googleConfigFile, true)
]);
with
$storage = new StorageClient([
    'keyFilePath' => config_path('googlecloud.json')
]);
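Putting both suggestions together, a rough sketch of the upload without any local save could look like this (not verified; the view name, config keys and folder naming are taken from the question):
use Google\Cloud\Storage\StorageClient;

$pdf = App::make('dompdf.wrapper');
$pdf->loadView('pdfdocument', $validatedData);

$storage = new StorageClient([
    'keyFilePath' => config_path('googlecloud.json')
]);
$bucket = $storage->bucket(config('googlecloud.storage_bucket'));

$filename = $request['firstname'] . '.pdf';
$newFolderName = $request['firstname'] . '_' . date("Y-m-d") . '_' . date("H:i:s");

// stream the rendered PDF straight from memory into the bucket
$bucket->upload($pdf->output(), [
    'predefinedAcl' => 'publicRead',
    'name' => $newFolderName . '/' . $filename
]);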

laravel s3 signed url does not work with pdf

I have recently added S3 as a storage to my Laravel application. I use signed URLs, which work perfectly with uploaded images, but they do not work with PDFs; I receive Access Denied for the PDFs. If I make the files public via the S3 console, I can retrieve them.
I am uploading these files with this method:
Storage::disk('s3')->put();
I have tried signing the url with these two methods:
$url = Storage::disk('s3')->temporaryUrl(
    $path, Carbon::now()->addMinutes(5)
);
$s3 = \Storage::disk('s3');
$client = $s3->getDriver()->getAdapter()->getClient();
$expiry = "+10 minutes";
$command = $client->getCommand('GetObject', [
    'Bucket' => \Config::get('filesystems.disks.s3.bucket'),
    'Key' => $path,
]);
$request = $client->createPresignedRequest($command, $expiry);
return (string) $request->getUri();
Any help would be appreciated!
I found out the solution.
The problem was that when I gave the URL to pdf.js, the "&" characters in the URL were automatically changed to "&amp;", and S3 didn't recognize this URL. I solved it using the JS string replace method:
var pdfUrl = '{{\App\Models\DataHelper::getImgUrl($note->file)}}'; // getting the signed URL
pdfUrl = pdfUrl.replaceAll("&amp;", "&");
showPDF(pdfUrl); // pdf.js
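The "&amp;" entities most likely come from Blade's {{ }} escaping the signed URL. Assuming that is the cause, using Blade's unescaped output syntax would avoid the client-side replace (a sketch, not verified against this project):
var pdfUrl = '{!! \App\Models\DataHelper::getImgUrl($note->file) !!}'; // signed URL, not HTML-escaped
showPDF(pdfUrl); // pdf.js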

Laravel 6 Unit Test file upload unable to find file at path

I'm writing a unit test in Laravel 6 to test file uploading. I have a form that creates a group, and there is a file upload for covers.
My unit test is as follows:
$this->WithoutExceptionHandling();
//login as admin
$admin = $this->signInAsAdmin();
Storage::persistentFake('public');
$attributes = [
    'group_name' => $this->faker->name,
    'group_description' => $this->faker->paragraph,
    'group_privacy' => '0',
    'group_cover' => $file = UploadedFile::fake()->image('random.jpg')
];
//create new group
$response = $this->post('/admin/groups', $attributes);
unset($attributes['group_cover']);
Storage::disk('public')->assertExists($file);
//check the data exists in database
$this->assertDatabaseHas('groups', $attributes);
//how is the uploaded file written to database
//check the data exists in database
$this->assertDatabaseHas('media', ['file_name' => $file]);
//make sure the title appears on the group list
$this->get('/admin/groups')->assertSee($attributes['group_name']);
The error I am getting is:
Unable to find a file at path [php624B.tmp].
Failed asserting that false is true.
I have also set in my php.ini file the upload_tmp_dir
upload_tmp_dir = C:\path_to_laravel_app\storage\tmp\uploads
What seems to be happening is that the faker library is not creating a temporary image, no matter what folder I set as the tmp location.
Any help would be appreciated
Danny
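For comparison, the file-upload testing pattern from the Laravel docs fakes the disk and asserts against the stored file name (hashName()) rather than against the UploadedFile object itself. A minimal sketch, assuming the controller stores the cover with store() into a hypothetical 'covers' directory on the public disk:
Storage::fake('public');

$file = UploadedFile::fake()->image('random.jpg');

$response = $this->post('/admin/groups', [
    'group_name' => 'Test group',
    'group_description' => 'Test description',
    'group_privacy' => '0',
    'group_cover' => $file,
]);

// assert on the generated file name, not on the UploadedFile object
Storage::disk('public')->assertExists('covers/' . $file->hashName());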

Laravel - IBM-Cloud Object Storage

I am trying to set up my Laravel application to use the Object Storage service from IBM Cloud. What I want is to upload a file and get a static public URL for each file, but I am currently having some trouble accessing the files after upload.
Installed package:
league/flysystem-aws-s3-v3
Created a new service provider for the bluemix storage suggested in this post:
How to connect Laravel 5 app with object-storage bucket?
Within my controller I use the following call to upload the file:
Storage::disk('object-storage')->put($full_name,$file);
Upload works fine, and I can see the file in the bucket. The problem appears when I am trying to access the file.
According to the IBM documentation I need to set the ACL to public-read to be able to access the file publicly. After some research I modified the Filesystem call:
Storage::extend('object-storage', function($app, $config) {
    $client = S3Client::factory([
        'credentials' => [
            'key' => $config['key'],
            'secret' => $config['secret'],
        ],
        'region' => $config['region'],
        'version' => $config['version'],
        'endpoint' => $config['endpoint'],
    ]);
    $adapter = new AwsS3Adapter($client, $config['bucket_name']);
    return new Filesystem($adapter, ['ACL' => 'public-read']);
});
I have also tried to set the visibility through the Storage call in the controller:
Storage::disk('object-storage')->setVisibility($full_name,'public-read');
Then I tried to access the file and read the visibility by using getVisibility:
Storage::disk('object-storage')->getVisibility($full_name);
This gives me a 404 error on getObjectAcl with the message:
The specified key does not exist on https://bucket-name.s3.eu-gb.objectstorage.softlayer.net/sApQNtdUvJYg7YWsL8IbCe26U6EK8v.png?acl
If I try to copy the URL address and paste it in my browser I get Access Denied error.
The authentication credentials used within the calls are set as Manager.
Does anyone have a solution for this problem, or is there any guide for uploading and reading the files using Laravel?
I did it like this:
$response = $filesystem->put($new_name, file_get_contents($file), ['ACL' => 'public-read']);
Try this:
Storage::disk('object-storage')->put($full_name, $file, ['ACL' => 'public-read']);
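One more note, which is an assumption on my part rather than something from the answers above: Laravel's Storage visibility helpers expect the Flysystem values 'public' / 'private' rather than the raw S3 ACL string, so the setVisibility call from the question would normally be written as:
Storage::disk('object-storage')->setVisibility($full_name, 'public');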

Laravel, getting uploaded file's url

I'm currently working on a Laravel 5.5 project where I want to upload files and then get the URL of the uploaded file back (I have to use it client side).
Now my code looks like this:
public function pdfUploader(Request $request)
{
    Log::debug('pdfUploader is called. ' . $request);
    if ($request->hasFile('file') && $request->file('file')->isValid()) {
        $extension = $request->file->extension();
        $fileName = 'tmp' . round(microtime(true) * 1000) . '.' . $extension;
        $path = $request->file->storeAs('temp', $fileName);
        return ['status' => 'OK', 'path' => URL::asset($path)];
    }
    return ['status' => 'NOT_SAVED'];
}
It works fine; I get back the OK status and the path, but when I want to use the path I get an HTTP 404. I checked, and the file is uploaded fine.
My thought is that I should register the new URL in the routes. If I have to, how can I do it dynamically, and if it's not necessary, what is wrong with my function?
Thanks for the answers in advance.
By default Laravel stores all uploaded files in the storage directory. For example, if you call $request->file->storeAs('temp', 'file.txt'); Laravel will create a temp folder in storage/app/ and put your file there:
$request->file->storeAs('temp', 'file.txt'); => storage/app/temp/file.txt
$request->file->storeAs('public', 'file.txt'); => storage/app/public/file.txt
However, if you want to make your uploaded files accessible from the web, there are 2 ways to do that:
Move your uploaded file into the public directory
$request->file->move(public_path('temp'), $fileName); // => public/temp/file.txt
URL::asset('temp/'.$fileName); // http://example.com/temp/file.txt
NOTE: make sure that your web server has permissions to write to the public directory
Create a symbolic link from storage directory to public directory
php artisan storage:link
This command will create a symbolic link from public/storage to storage/app/public, in this case we can store our files into storage/app/public and make them accessible from the web via symlinks:
$request->file->storeAs('public', $fileName); // => storage/app/public/file.txt
URL::asset('storage/'.$fileName); // => http://example.com/storage/file.txt
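Applied to the pdfUploader from the question, the symlink approach could look roughly like this (a sketch, assuming php artisan storage:link has already been run):
public function pdfUploader(Request $request)
{
    if ($request->hasFile('file') && $request->file('file')->isValid()) {
        $extension = $request->file->extension();
        $fileName = 'tmp' . round(microtime(true) * 1000) . '.' . $extension;

        // stored in storage/app/public/temp, served from public/storage/temp via the symlink
        $request->file->storeAs('public/temp', $fileName);

        return ['status' => 'OK', 'path' => URL::asset('storage/temp/' . $fileName)];
    }
    return ['status' => 'NOT_SAVED'];
}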
