Upload image bytes to the Cloudinary upload API from AWS Lambda

I am using the Cloudinary API to get different resolutions of image/video files. I am able to call the upload API with the following Java code and get the required resolutions:
Map uploadResult = cloudinary.uploader().upload(file, options);
Now I need to do the same from AWS Lambda using Java, since I need the resolutions of content stored in an S3 bucket. I can think of two possible ways of doing this:
1) point Cloudinary at the S3 URLs (this requires extra setup);
2) pass a byte array buffer or an IO input stream.
Is there any sample Java code available for option 2?
I am referring to the upload API here:
https://cloudinary.com/documentation/image_upload_api_reference#upload
This related question has a Python reference:
Correct way for uploading image bytes to cloudinary

Please check this example:
File file = new File("<image_path>");
byte[] fileContent = Files.readAllBytes(file.toPath());
cloudinary.uploader().upload(fileContent, ObjectUtils.emptyMap());
--Yakir
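For the Lambda/S3 case (option 2), here is a minimal sketch, assuming the AWS SDK for Java v1 and an already-configured cloudinary instance; the bucket and key names are placeholders:

import com.amazonaws.services.s3.AmazonS3;
import com.amazonaws.services.s3.AmazonS3ClientBuilder;
import com.amazonaws.services.s3.model.S3Object;
import com.amazonaws.util.IOUtils;
import com.cloudinary.utils.ObjectUtils;
import java.io.InputStream;
import java.util.Map;

AmazonS3 s3 = AmazonS3ClientBuilder.defaultClient();

// Read the object's bytes out of S3 (bucket and key are placeholders)
S3Object s3Object = s3.getObject("my-bucket", "images/photo.jpg");
byte[] bytes;
try (InputStream in = s3Object.getObjectContent()) {
    bytes = IOUtils.toByteArray(in);
}

// Same uploader call as above, fed the raw bytes; for video content,
// pass ObjectUtils.asMap("resource_type", "video") instead of the empty map
Map uploadResult = cloudinary.uploader().upload(bytes, ObjectUtils.emptyMap());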

Related

How to add multiple Media from Disk in Laravel Spatie Media Library?

I'm saving images from the local disk to a cloud (DO Spaces) disk with the following code in a controller:
$claim->addMediaFromDisk($front_image, 'public')->usingFileName("front-image")->toMediaCollection('claim-images', 'do_spaces');
$claim->addMediaFromDisk($right_image, 'public')->usingFileName("right-image")->toMediaCollection('claim-images', 'do_spaces');
$claim->addMediaFromDisk($left_image, 'public')->usingFileName("left-image")->toMediaCollection('claim-images', 'do_spaces');
This works, but it saves the images in three different directories in cloud storage, and I want all three images in the same directory.
I see there is a built-in method for adding multiple media from a request, but how can I do it from disk? I was expecting something like addMultipleMediaFromDisk(). Is there any solution?
Laravel version: 7.30
Spatie media library version: 7.20
You can choose the disks per media item, e.g. where the original file is saved on the local disk and the conversions on S3:
$media = $claim->addMedia($pathToImage)
    ->storingConversionsOnDisk('s3')
    ->toMediaCollection('claim-images', 'local');
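As for getting everything into one directory: the separate directories come from the media library's default path generator, which uses each media item's ID as its directory name. The documented way to change that is a custom PathGenerator registered in config/medialibrary.php. Below is a minimal sketch; the v7 namespaces are an assumption worth double-checking against your installed version.

use Spatie\MediaLibrary\Models\Media;
use Spatie\MediaLibrary\PathGenerator\PathGenerator;

// Sketch: store every media item in one shared directory instead of
// the default per-media-ID directories (v7 namespaces assumed).
class SharedDirectoryPathGenerator implements PathGenerator
{
    public function getPath(Media $media): string
    {
        return 'claim-images/';
    }

    public function getPathForConversions(Media $media): string
    {
        return 'claim-images/conversions/';
    }

    public function getPathForResponsiveImages(Media $media): string
    {
        return 'claim-images/responsive/';
    }
}

Note that a shared directory requires unique file names per media item; the usingFileName() calls above already take care of that.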

Uploading images to Spring Boot and S3 all In-Memory

I have an Angular webapp that uses a Spring Boot REST service as its backing web service.
I am adding a "Profiles" feature for users, and as part of this I want to stand up an endpoint that allows users to upload profile images for themselves and immediately upload those files to S3 (where I will host all the images).
Looking at several Spring Boot file-upload tutorials:
http://www.mkyong.com/spring-boot/spring-boot-file-upload-example/
I update avatar image and display it but the avatar does not change in Spring Boot, why?
and many others,
it seems that the standard way of handling such a file upload is to expose a controller endpoint that accepts MultipartFiles, like so:
@RestController
@RequestMapping("/v1/profiles")
public class ProfileController {

    @PostMapping("/photo")
    public ResponseEntity uploadProfilePhoto(@RequestParam("mpf") MultipartFile mpf) {
        // ...
    }
}
Looking at all this code, I can't tell if the MultipartFile instance is in-memory or if Spring sets its location somewhere (perhaps under /tmp?) on the disk.
Looking at the AWS S3 Java SDK tutorial, it seems the standard way to upload a disk-based File is like so:
File file = new File(uploadFileName);
s3client.putObject(new PutObjectRequest(bucketName, keyName, file));
So it looks like I must have a File on disk in order to upload to S3.
I'm wondering if there is a way to keep everything in memory, or whether this is a bad idea and I should stick to disks/File instances!
Is there a way to keep the entire profile image (MultipartFile) in-memory inside the controller method?
Is there a way to feed (maybe via serialization?!) a MultipartFile instance to S3's PutObjectRequest?
Or is this all a terrible idea (if so, why?!)?
Is there a way to keep the entire profile image (MultipartFile) in-memory inside the controller method?
No, there is no way to keep an image File in memory, because a File object in Java represents a path in the file system.
Is there a way to feed (maybe via serialization?!) a MultipartFile instance to S3's PutObjectRequest?
No; going by S3's API documentation, there is no way for S3 to deserialize the image file for you after/during the upload.
Or is this all a terrible idea (if so, why?!)?
It depends on your specific case, but it is generally not preferred.
If there are not many users uploading images at the same time, your memory is probably enough to handle it.
Otherwise, you can easily run into out-of-memory problems.
If you insist on doing so, the S3 API can upload an InputStream (if I remember correctly), and you can get an InputStream from your MultipartFile; see the sketch below.
This SO thread talks about uploading to S3 with InputStream
You can also take a look at File.createTempFile() to create a temp file.
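For completeness, a minimal sketch of that InputStream route, assuming the AWS SDK for Java v1; s3client, bucketName, and keyName stand in for whatever your configuration provides:

import com.amazonaws.services.s3.model.ObjectMetadata;
import com.amazonaws.services.s3.model.PutObjectRequest;
import java.io.IOException;
import java.io.InputStream;

@PostMapping("/photo")
public ResponseEntity<Void> uploadProfilePhoto(@RequestParam("mpf") MultipartFile mpf) throws IOException {
    ObjectMetadata metadata = new ObjectMetadata();
    metadata.setContentLength(mpf.getSize());      // lets the SDK skip buffering the stream to learn its length
    metadata.setContentType(mpf.getContentType());

    try (InputStream in = mpf.getInputStream()) {
        // s3client, bucketName and keyName are assumed to be configured elsewhere
        s3client.putObject(new PutObjectRequest(bucketName, keyName, in, metadata));
    }
    return ResponseEntity.ok().build();
}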
I have been looking at the same thing. Basically you want a user to be able to upload a photo album, have those photos served from S3, and probably have them secured so only that user can upload/delete/etc.
I believe the simpler answer in Spring Boot is to get a pre-signed URL from S3: https://docs.aws.amazon.com/AmazonS3/latest/dev/PresignedUrlUploadObjectJavaSDK.html
This basically gives you a token defining the bucket, the object key ("/bobs_profile/smiling_bob.jpg"), and a time limit for that image to be uploaded.
Give that to your Angular app (or Ionic app) to upload the image to that location (a sketch follows below).
That should do it, but someone let me know if I'm wrong.
The only issue that I see is if Bob wants to upload "bobs_nude_photo.jpg" and only wants people logged in via Spring Security to be able to see it... well, I'm sure there is an S3 solution for that?
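Here is a minimal sketch of generating that pre-signed PUT URL with the AWS SDK for Java v1, following the linked doc; the bucket and key are placeholders:

import com.amazonaws.HttpMethod;
import com.amazonaws.services.s3.AmazonS3;
import com.amazonaws.services.s3.AmazonS3ClientBuilder;
import com.amazonaws.services.s3.model.GeneratePresignedUrlRequest;
import java.net.URL;
import java.util.Date;

AmazonS3 s3 = AmazonS3ClientBuilder.defaultClient();

// the URL stops working after 15 minutes
Date expiration = new Date(System.currentTimeMillis() + 15 * 60 * 1000);

GeneratePresignedUrlRequest request =
        new GeneratePresignedUrlRequest("my-bucket", "bobs_profile/smiling_bob.jpg")
                .withMethod(HttpMethod.PUT)
                .withExpiration(expiration);

// Hand this URL to the Angular/Ionic client, which PUTs the image bytes to it directly
URL url = s3.generatePresignedUrl(request);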

How to send a local image file via an API to parse.com?

I am trying to build a web application with JavaScript and Node.js. In order to hide the application ID/key, I use an API to handle the data saving.
For example, if I need to save an object on Parse, instead of using the Parse JavaScript SDK on the client side, I send the object to the server and then ask the server to save it to Parse.
Everything was working fine until I tried to upload images to the server. It turns out I need to somehow upload the images to the server before I can save them to a Parse class, because PFFile needs URLs to upload the images, but at that point the image is still local. I was thinking of converting the image to a base64 string so that the server could convert it back to image data and save it to Parse. However, I didn't succeed with this approach. Can anyone provide some insight? Thanks
When uploading an image, you can just use:
uploadImg(photo): void {
    var parseFile = new Parse.File("image.png", { base64: photo });
    parseFile.save();
}
The parameter photo is a base64 string.
Are you using AWS S3 to store your images?
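For the server side, a minimal sketch assuming Node.js with Express and the Parse JS SDK; the route and field names are illustrative:

const express = require('express');
const Parse = require('parse/node');
// Parse.initialize(...) and Parse.serverURL are assumed to be configured elsewhere

const app = express();
app.use(express.json({ limit: '10mb' })); // base64-encoded images can be large

app.post('/upload', async (req, res) => {
  try {
    // req.body.photo is the base64 string sent by the client
    const parseFile = new Parse.File('image.png', { base64: req.body.photo });
    await parseFile.save();

    const photo = new Parse.Object('Photo');
    photo.set('image', parseFile);
    await photo.save();

    res.json({ url: parseFile.url() });
  } catch (err) {
    res.status(500).json({ error: err.message });
  }
});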

Uploading an image to S3 using aws-sdk v2

I'm having a hell of a time working with the aws-sdk documentation; all of the links I follow seem outdated and unusable.
I'm looking for a straightforward implementation example of uploading an image file to an S3 bucket in Ruby.
Let's say the image path is screenshots/image.png
and I want to upload it to the bucket my_bucket.
AWS creds live in my ENV.
Any advice is much appreciated.
Here is how you can upload a file from disk to the named bucket and key:
s3 = Aws::S3::Resource.new
s3.bucket('my_bucket').object('key').upload_file('screenshots/image.png')
That is the simplest method. You should replace 'key' with the key you want it stored with in Amazon S3. This will automatically upload large files for you using the multipart upload APIs and will retry failed parts.
If you prefer to always upload using PUT Object, you can call #put or use an Aws::S3::Client:
# using put
s3 = Aws::S3::Resource.new
File.open('screenshots/image.png', 'rb') do |file|
  s3.bucket('my_bucket').object('key').put(body: file)
end

# using a client
s3 = Aws::S3::Client.new
File.open('screenshots/image.png', 'rb') do |file|
  s3.put_object(bucket: 'my_bucket', key: 'key', body: file)
end
Also, the API reference documentation for the v2 SDK is here: http://docs.aws.amazon.com/sdkforruby/api/index.html

In which format should I send an image via Node.js to store it with GridFS?

I have a client written in Xcode and I would like to upload the user's picture to be stored on the server.
The server runs Node.js, and I store the uploaded files with GridFS.
How should I send the picture in the Node.js request?
Is it supposed to be the binary format of the pic?
If so, does this mean:
- the client should create a binary representation of the image in Xcode,
- the client should send that binary data as a string appended to the URL request for Node,
- the server stores the string in GridFS, and
- the client retrieves the image and parses/presents it as a JPG/PNG image?
I would just use an HTTP POST, similar to how you would do the same in a web form. Check this post for an example of how to do this with iOS/Cocoa:
ios Upload Image and Text using HTTP POST
Once the file gets to your Node.js server, it's up to you how you want to handle it. If you're using the Express framework, set up the middleware like so:
/** Form Handling */
app.use(express.bodyParser({
  uploadDir: app.settings.tmpFolder,
  keepExtensions: true
}))
app.use(express.limit('5mb'))
Then you can access the uploaded image via req.files, pre-process it, and store it in GridFS with your MongoDB module of choice; a sketch follows below.
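Here is a sketch of that last step, streaming the uploaded temp file into GridFS with the official mongodb driver's GridFSBucket API (a newer API than the GridStore/bodyParser era this answer was written in); the connection string, database, and field names are illustrative:

const fs = require('fs');
const { MongoClient, GridFSBucket } = require('mongodb');

async function storeUpload(tmpPath, filename) {
  const client = await MongoClient.connect('mongodb://localhost:27017');
  const bucket = new GridFSBucket(client.db('mydb'));

  // Pipe the temp file Express wrote to uploadDir into GridFS
  await new Promise((resolve, reject) => {
    fs.createReadStream(tmpPath)
      .pipe(bucket.openUploadStream(filename, { contentType: 'image/png' }))
      .on('finish', resolve)
      .on('error', reject);
  });

  await client.close();
}

// e.g. inside the route handler:
// await storeUpload(req.files.pic.path, req.files.pic.name);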
