I'm absolutely new to Laravel and I'm learning how to upload an image to Amazon S3, which I have working. The problem is how to display the image in my Laravel Blade view.
This solved my problem:
$imagePath = request()->file('image');
$setImageName = time().'.'.$imagePath->getClientOriginalExtension();
$filePath = 'upload/'.$setImageName; // note: the 'upload' folder will be created inside your S3 bucket
$storage = Storage::disk('s3');
$storage->put($filePath, fopen($imagePath, 'r+'), 'public');
// note: before you use Storage::disk('s3'), you must import the Storage facade in your controller.
To save the path to your database:
$posts->image = $filePath;
$posts->save();
To display it in your view:
<img src="{{ Storage::disk('s3')->url($post->image) }}">
// Note: you must loop through your posts collection to get each $post.
Hope it helps!
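Putting the pieces together, a minimal controller sketch might look like this (assumptions not in the original answer: an `s3` disk configured in `config/filesystems.php`, a `Post` model with an `image` column, and a redirect at the end):

```php
use Illuminate\Http\Request;
use Illuminate\Support\Facades\Storage;

public function store(Request $request)
{
    $image = $request->file('image');
    $filePath = 'upload/' . time() . '.' . $image->getClientOriginalExtension();

    // Stream the uploaded file to S3 with public visibility
    Storage::disk('s3')->put($filePath, fopen($image->getRealPath(), 'r'), 'public');

    // Persist the relative path; build the full URL at display time
    $post = new Post();
    $post->image = $filePath;
    $post->save();

    return redirect()->back();
}
```

In the Blade view, `Storage::disk('s3')->url($post->image)` then turns the stored relative path into a full S3 URL.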
I am using the google/cloud-storage package in an API and successfully uploading pdf files to a Google Cloud Storage bucket. However, the pdf files are first saved locally before they are uploaded to the Google Cloud Storage bucket.
How can I skip saving them locally and instead upload them directly to the Google Cloud Storage bucket? I am planning to host the API on Google App Engine.
This is what I am doing currently:
$filename = $request['firstname'] . '.pdf';
$fileStoragePath = '/storage/pdf/' . $filename;
$publicPath = public_path($fileStoragePath);
$pdf = App::make('dompdf.wrapper');
$pdf->loadView('pdfdocument', $validatedData);
$pdf->save($publicPath);
$googleConfigFile = file_get_contents(config_path('googlecloud.json'));
$storage = new StorageClient([
'keyFile' => json_decode($googleConfigFile, true)
]);
$storageBucketName = config('googlecloud.storage_bucket');
$bucket = $storage->bucket($storageBucketName);
$fileSource = fopen($publicPath, 'r');
$newFolderName = $request['firstname'].'_'.date("Y-m-d").'_'.date("H:i:s");
$googleCloudStoragePath = $newFolderName.'/'.$filename;
/*
* Upload a file to the bucket.
* Using Predefined ACLs to manage object permissions, you may
* upload a file and give read access to anyone with the URL.
*/
$bucket->upload($fileSource, [
'predefinedAcl' => 'publicRead',
'name' => $googleCloudStoragePath
]);
Is it possible to upload files to a Google Cloud Storage bucket without first saving them locally?
Thank you for your help.
I have not verified this code, but the PDF wrapper's output() method returns the rendered PDF as a string, so you can pass it straight to upload() without touching the local filesystem:
$pdf = App::make('dompdf.wrapper');
$pdf->loadView('pdfdocument', $validatedData);
...
$bucket->upload($pdf->output(), [
'predefinedAcl' => 'publicRead',
'name' => $googleCloudStoragePath
]);
You can also simplify the client setup. Replace:
$googleConfigFile = file_get_contents(config_path('googlecloud.json'));
$storage = new StorageClient([
'keyFile' => json_decode($googleConfigFile, true)
]);
with
$storage = new StorageClient([
'keyFilePath' => config_path('googlecloud.json')
]);
Good evening, I hosted a Laravel project and found a bug.
First of all, here is my code in my controller:
$request->file('FotoHT')->storeAs('FotoHT', $filenameToStore); // this saves the file
$path = asset('storage/app/FotoHT/'. $filenameToStore);
$width = Image::make($path)->width();
$height = Image::make($path)->height();
I save a picture and it works, but when I try using
Image::make($path)->width();
it gives me the wrong URL:
https://myweb/public/storage/app/FotoHT/_MG_9549_1578913993.jpg
while the image should be accessed from
https://myweb/storage/app/FotoHT/_MG_9549_1578913993.jpg
Can anyone offer a solution?
You need to save the encoded image to the public disk:
$fieldFile = $request->file('FotoHT');
$image = Image::make($fieldFile)->width();
Storage::disk('public')->put("FotoHT/".$filenameToStore, (string) $image->encode());
$path=$request->file('FotoHT')->storeAs('FotoHT',$filenameToStore);
$width = Image::make($path)->width();
$height = Image::make($path)->height();
Try this instead. You can then use the asset() helper for the img tag in your view.
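A related approach (a sketch, assuming the default `local` disk rooted at `storage/app`) is to resolve the file's absolute filesystem path with `Storage::path()` instead of building a URL with `asset()`, since Intervention reads from the filesystem:

```php
use Illuminate\Support\Facades\Storage;
use Intervention\Image\Facades\Image;

// storeAs() returns a path relative to the disk root, e.g. "FotoHT/photo.jpg"
$relativePath = $request->file('FotoHT')->storeAs('FotoHT', $filenameToStore);

// Resolve the absolute path on disk (storage/app/FotoHT/photo.jpg)
$absolutePath = Storage::path($relativePath);

$width  = Image::make($absolutePath)->width();
$height = Image::make($absolutePath)->height();
```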
I have a test asserting that images can be uploaded. Here is the code...
// Test
$file = UploadedFile::fake()->image('image_one.jpg');
Storage::fake('public');
$response = $this->post('/api/images', [
'images' => $file
]);
Then in the controller I am doing something simple:
$file->store('images', 'public');
and asserting a couple of things, and it works like a charm.
But now I need to resize the image using the Intervention Image package. For that I have the following code:
Image::make($file)
->resize(1200, null)
->save(storage_path('app/public/images/' . $file->hashName()));
And in case the directory does not exist, I check first and create one:
if (! Storage::exists('public/images')) {
    Storage::makeDirectory('public/images');
}
The test is green, but the issue is that every time I run the tests a real file is uploaded into the storage directory, which I don't want. I just need to fake the upload, not perform a real one.
Any Solution ?
Thanks in advance :)
You need to store your file using the Storage facade. Storage::putFileAs does not work because it does not accept an Intervention Image instance. However, you can use Storage::put:
$file = UploadedFile::fake()->image('image_one.jpg');
Storage::fake('public');
// Somewhere in your controller
$image = Image::make($file)
->resize(1200, null)
->encode('jpg', 80);
Storage::disk('public')->put('images/' . $file->hashName(), $image);
// back in your test
Storage::disk('public')->assertExists('images/' . $file->hashName());
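Put together, the test might look like this (a sketch; the route, the field name, and the test method name are assumptions based on the question):

```php
public function test_uploaded_image_is_resized_and_stored()
{
    // Swap the public disk for a fake so nothing hits the real filesystem
    Storage::fake('public');

    $file = UploadedFile::fake()->image('image_one.jpg');

    $this->post('/api/images', ['images' => $file]);

    // The fake disk records the write made by the controller
    Storage::disk('public')->assertExists('images/' . $file->hashName());
}
```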
I have this code to upload pictures to an S3 bucket on AWS:
$image = $picture;
$ext = explode(";", explode("/",explode(",", $image)[0])[1])[0];
$image = str_replace('data:image/'.$ext.';base64,', '', $image);
$image = str_replace(' ', '+', $image);
$imageName = str_random(10) . '.' . $ext;
$fullImagePath = 'datasheets/' . $imageName;
Storage::disk('s3')->put($fullImagePath, base64_decode($image));
$DataSheetPicture = new DataSheetPicture();
$DataSheetPicture->data_sheet_id = $DataSheet->id;
$DataSheetPicture->picture = Storage::disk('s3')->url($fullImagePath);
$DataSheetPicture->save();
The above code works fine; it uploads the pictures to the bucket successfully. But on this line
$DataSheetPicture->picture = Storage::disk('s3')->url($fullImagePath);
it saves the URL in the database like this:
/datasheets/6GcfzgUPrA.jpeg
/datasheets/AuqHmu8p0W.jpeg
But I need to get the URL like this:
https://s3.REGION.amazonaws.com/BUCKET-NAME/FULL-IMAGE-PATH
I don't want to concatenate the region or the bucket name because it could be dynamic
The following will give you the proper URL:
return Storage::disk('s3')->url($filename);
Since Laravel 5.2 you're also able to use cloud()
return Storage::cloud()->url($filename);
I don't want to concatenate the region or the bucket name because it could be dynamic
Then you must set these values in your config rather than hard-coding them, for example:
config([
    'filesystems.disks.s3.bucket' => 'my_bucket',
    'filesystems.disks.s3.region' => 'my_region',
]);
If you remove the AWS_URL setting from your .env file, Storage::disk('s3')->url($fullImagePath) should give you the full URL you need.
See discussion also here: https://laracasts.com/discuss/channels/laravel/storage-url-from-s3?page=1&replyId=482913
OK, so when I want to upload an image, I usually do something like:
$file = Input::file('image');
$destinationPath = 'whereEver';
$filename = $file->getClientOriginalName();
$uploadSuccess = Input::file('image')->move($destinationPath, $filename);
if( $uploadSuccess ) {
// save the url
}
This works fine when the user uploads the image. But how do I save an image from a URL?
If I try something like:
$url = 'http://www.whereEver.com/some/image';
$file = file_get_contents($url);
and then:
$filename = $file->getClientOriginalName();
$uploadSuccess = Input::file('image')->move($destinationPath, $filename);
I get the following error:
Call to a member function move() on a non-object
So, how do I upload an image from a URL with Laravel 4?
Any help greatly appreciated.
I don't know if this will help you a lot, but you might want to look at the Intervention library. It's primarily intended as an image manipulation library, but it also supports saving an image from a URL:
$image = Image::make('http://someurl.com/image.jpg')->save('/path/saveAsImageName.jpg');
$url = "http://example.com/123.jpg";
$url_arr = explode ('/', $url);
$ct = count($url_arr);
$name = $url_arr[$ct-1];
$name_div = explode('.', $name);
$ct_dot = count($name_div);
$img_type = $name_div[$ct_dot -1];
$destinationPath = public_path().'/img/'.$name;
file_put_contents($destinationPath, file_get_contents($url));
this will save the image to your /public/img, filename will be the original file name which is 123.jpg for the above case.
The image-name extraction was adapted from here.
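The explode() chain above can be shortened with PHP's built-in path helpers; `basename()`, `pathinfo()`, and `parse_url()` are standard PHP, so this is a drop-in sketch of the same logic:

```php
$url = "http://example.com/123.jpg";

// basename() returns everything after the last slash of the URL path
$name = basename(parse_url($url, PHP_URL_PATH)); // "123.jpg"

// pathinfo() extracts the extension
$imgType = pathinfo($name, PATHINFO_EXTENSION);  // "jpg"

// Download and save under public/img, as in the snippet above
$destinationPath = public_path() . '/img/' . $name;
file_put_contents($destinationPath, file_get_contents($url));
```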
Laravel's Input::file method is only used when you upload files via a POST request. The error you get is because file_get_contents returns a string, not Laravel's uploaded-file class. You also don't need the move() method or any analog of it, because a file fetched from a URL is never placed in your tmp upload folder.
Instead, I think you should use plain PHP to fetch the image through its URL, as described here.
Like:
// Your file
$file = 'http://....';
// Open the file to get existing content
$data = file_get_contents($file);
// New file (note: must be a full file path, not just a directory)
$new = '/var/www/uploads/image.jpg';
// Write the contents back to a new file
file_put_contents($new, $data);
I can't check it right now, but it seems like a decent solution: just get the data from the URL and then save it wherever you want.
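To make the destination dynamic, the filename can be derived from the URL itself (a sketch; the uploads directory and the example URL are assumptions):

```php
$url = 'http://example.com/some/image.jpg';

// Derive a filename from the URL path
$filename = basename(parse_url($url, PHP_URL_PATH));

// The destination must include the filename, not just the directory
$destination = '/var/www/uploads/' . $filename;

// Download and write in one step; no tmp upload file is involved
file_put_contents($destination, file_get_contents($url));
```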