Google App Engine: wrong serving URL for image

I have created a Google App Engine project where it's possible to upload photos. The uploading part works fine and all the photos are uploaded at the proper size. But when I call images.get_serving_url, it returns a serving_url on lh3.googleusercontent.com, whereas according to the Google App Engine documentation it should return a serving_url on something like lh3.ggpht.com. The other problem is that the photos at that serving_url are 4-6 times smaller than the uploaded ones, even though in the Google App Engine console all those photos have the same size as the uploads. I don't know why Google App Engine is not returning the actual-sized images.

Try specifying size=0 in the images.get_serving_url method call, e.g.:
images.get_serving_url(blob_key, size=0)
By default the serving URL serves a resized version of the image (capped at 512 pixels on the longest side), which would explain why your photos come out smaller; size=0 serves the image at its original resolution.

Related

How to detect a photo has been modified through Google Photos API?

The Google Photos API lets you get the list of MediaItems, as well as each MediaItem's metadata and the media item itself.
What if the picture was modified online (e.g. brightness/contrast), then saved?
The MediaItem does not contain anything like a hash code.
How to detect if the photo has been modified?
Does Google Photos API support this use case and how?
There is currently no ability to see if a file was changed in the Google Photos API.
There is, however, an open feature request for this, Provide Modified Date in metadata, which may be along the lines of what you are looking for.
As suggested in a comment, you could probably do this yourself using MD5, but it's not going to help you if you want the API to tell you there has been a change; you are going to have to compute and compare the MD5 yourself.
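As a rough sketch of that MD5 comparison in Java (the class and method names here are mine, and it assumes you have already downloaded the item's bytes):
import java.math.BigInteger;
import java.security.MessageDigest;
import java.security.NoSuchAlgorithmException;

public class MediaItemFingerprint {
    // Compute a hex MD5 fingerprint of the downloaded media bytes.
    static String md5Hex(byte[] mediaBytes) throws NoSuchAlgorithmException {
        byte[] digest = MessageDigest.getInstance("MD5").digest(mediaBytes);
        return String.format("%032x", new BigInteger(1, digest));
    }

    // Compare a previously stored fingerprint against a fresh download.
    static boolean hasChanged(String storedMd5, byte[] freshBytes) throws NoSuchAlgorithmException {
        return !storedMd5.equals(md5Hex(freshBytes));
    }
}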
Even though DalmTo's answer is true, there is a workaround for this issue.
The HTTP Content-Length header is set on every response when downloading a media item, so you can "probe" the actual media item by reading the response headers and then abort the download.
The Content-Length value is the file size the item would have after downloading. Assuming a change doesn't end up producing the same file size, this value will differ if the file has been modified (cropped, rotated, etc.).
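Here is a minimal sketch of that probe in Java. It assumes you already have the media item's download URL (for Google Photos that would be the item's baseUrl with =d appended, which is an assumption on my part); it reads only the response headers and disconnects before the body is consumed:
import java.net.HttpURLConnection;
import java.net.URL;

public class MediaItemProbe {
    // Read only the response headers to learn the would-be file size,
    // then disconnect without downloading the body.
    static long probeContentLength(String mediaDownloadUrl) throws Exception {
        HttpURLConnection conn = (HttpURLConnection) new URL(mediaDownloadUrl).openConnection();
        try {
            conn.connect();
            return conn.getContentLengthLong(); // -1 if the header is absent
        } finally {
            conn.disconnect(); // abort before the body is read
        }
    }
}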

Uploading images to Spring Boot and S3 all In-Memory

I have an Angular webapp that uses a Spring Boot REST service as its backing web service.
I am adding a "Profiles" feature for users, and as part of this I want to stand up an endpoint that allows users to upload profile images for themselves and immediately upload those files to S3 (where I will host all the images from).
Looking at several Spring Boot file-upload tutorials:
http://www.mkyong.com/spring-boot/spring-boot-file-upload-example/
I update avatar image and display it but the avatar does not change in Spring Boot, why?
Many others
It seems that the standard way of handling such file upload is exposing a controller endpoint that accepts MultipartFiles like so:
@RestController
@RequestMapping("/v1/profiles")
public class ProfileController {

    @PostMapping("/photo")
    public ResponseEntity uploadProfilePhoto(@RequestParam("mpf") MultipartFile mpf) {
        // ...
    }
}
Looking at all this code, I can't tell if the MultipartFile instance is in-memory or if Spring sets its location somewhere (perhaps under /tmp?) on the disk.
Looking at the AWS S3 Java SDK tutorial, it seems the standard way to upload a disk-based File is like so:
File file = new File(uploadFileName);
s3client.putObject(new PutObjectRequest(bucketName, keyName, file));
So it looks like I must have a File on disk in order to upload to S3.
I'm wondering if there is a way to keep everything in memory, or whether this is a bad idea and I should stick to disks/File instances!
Is there a way to keep the entire profile image (MultipartFile) in-memory inside the controller method?
Is there a way to feed (maybe via serialization?!) a MultipartFile instance to S3's PutObjectRequest?
Or is this all a terrible idea (if so, why?!)?
Is there a way to keep the entire profile image (MultipartFile) in-memory inside the controller method?
No, there is no way to keep an image in a File in memory, because a File object in Java represents a path in the file system, not the file's contents.
Is there a way to feed (maybe via serialization?!) a MultipartFile instance to S3's PutObjectRequest?
No; going by S3's API documentation, there is no way for S3 to deserialize the image file for you during or after the upload.
Or is this all a terrible idea (if so, why?!)?
It depends on your specific case, but it is generally not preferred.
If there are not many users uploading images at the same time, your memory is probably enough to handle it.
Otherwise, you can easily run into out-of-memory problems.
If you insist on doing so, the S3 API can upload from an InputStream (if I remember correctly), and you can get an InputStream from your MultipartFile.
This SO thread talks about uploading to S3 with an InputStream; a sketch of the idea follows.
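A minimal sketch of that approach with the AWS SDK for Java (v1), reusing the bucketName/keyName variables from the snippet above. Setting the content length on the ObjectMetadata matters, since otherwise the SDK has to buffer the stream in memory to compute it:
import java.io.IOException;

import org.springframework.web.multipart.MultipartFile;

import com.amazonaws.services.s3.AmazonS3;
import com.amazonaws.services.s3.model.ObjectMetadata;
import com.amazonaws.services.s3.model.PutObjectRequest;

public class S3StreamUploader {
    // Stream the uploaded file straight from the multipart request to S3,
    // without ever writing it to local disk.
    static void uploadInMemory(AmazonS3 s3client, String bucketName,
                               String keyName, MultipartFile mpf) throws IOException {
        ObjectMetadata metadata = new ObjectMetadata();
        metadata.setContentLength(mpf.getSize());   // avoids the SDK buffering the whole stream
        metadata.setContentType(mpf.getContentType());
        s3client.putObject(new PutObjectRequest(bucketName, keyName,
                mpf.getInputStream(), metadata));
    }
}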
You can also take a look at File.createTempFile() to create a temp file.
I have been looking at the same thing. Basically you want a user to be able to upload a photo album and have those photos served from S3, probably secured so only that user can upload/delete/etc.
I believe the simpler answer is to get a pre-signed URL from S3 in Spring Boot: https://docs.aws.amazon.com/AmazonS3/latest/dev/PresignedUrlUploadObjectJavaSDK.html
This basically gives you a token defining the bucket, the object key ("/bobs_profile/smiling_bob.jpg"), and a time limit within which that image must be uploaded.
Give that to your Angular app (or Ionic app) to upload the image to that location, as in the sketch below.
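For what it's worth, a short sketch of generating such a pre-signed PUT URL with the AWS SDK for Java (v1); the bucket name and object key here are illustrative:
import java.net.URL;
import java.util.Date;

import com.amazonaws.HttpMethod;
import com.amazonaws.services.s3.AmazonS3;
import com.amazonaws.services.s3.model.GeneratePresignedUrlRequest;

public class PresignedUploadUrl {
    // Create a URL the browser can PUT the image to directly; it expires in 15 minutes.
    static URL presignedPutUrl(AmazonS3 s3Client, String bucketName, String objectKey) {
        Date expiration = new Date(System.currentTimeMillis() + 15 * 60 * 1000);
        GeneratePresignedUrlRequest request =
                new GeneratePresignedUrlRequest(bucketName, objectKey)
                        .withMethod(HttpMethod.PUT)
                        .withExpiration(expiration);
        return s3Client.generatePresignedUrl(request);
    }
}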
That should do it, but someone let me know if I'm wrong.
The only issue that I see is if Bob wants to upload "bobs_nude_photo.jpg" and only wants Spring Security logged-in people to be able to see it... well, I'm sure there is an S3 solution for that.

How to send a local image file via an API to parse.com?

I am trying to build a web application with JavaScript and Node.js. In order to hide the application ID/key, I use an API to handle the data saving.
For example, if I need to save an object to Parse, instead of using the Parse JavaScript SDK on the client side, I send the object to the server and then ask the server to save it to Parse.
Everything was working fine until I tried to upload images to the server. It turns out I need to somehow upload the images to the server before I can save them to a Parse class, because a PFFile needs a URL to upload the image, but at this point the image is still local. I was thinking of converting the image to a base64 string so the server could convert it back to image data and save it to Parse. However, I didn't succeed with this approach. Can anyone provide some insight? Thanks.
When uploading an image, you can just use:
uploadImg(photo: string): void {
    // photo is a base64-encoded string
    var parseFile = new Parse.File("image.png", { base64: photo });
    parseFile.save();
}
The photo parameter is the base64-encoded image.
Are you using AWS S3 to store your images?

How to fetch ooyala image?

I've been looking around the web all day but I'm not able to find a way to get the URL for a video image that's hosted on Ooyala. I read that the URLs vary from video to video, but I wonder if there isn't ANY way to get the image from the embed code.
In case you're wondering, I'm not uploading the vids to Ooyala myself, I'm simply running a site that embeds some Ooyala vids (as well as videos from other sites). Does anyone have a solution, or is there simply no way for me to get a preview image?
Thanks!
I don't know which technology you are working with, but I have used Ruby on Rails with Ooyala V2. Ooyala has a Ruby on Rails module, which can be downloaded from their site.
You also need an OOYALA_API_KEY and OOYALA_V2_SECRET_CODE, which you can get by logging into your Ooyala account under the developer section.
After getting the Ruby on Rails module from Ooyala, the code is as simple as follows:
ooyala_obj = Ooyala::API.new(OOYALA_API_KEY, OOYALA_V2_SECRET_CODE)
thumbnail = ''
response = ooyala_obj.get("/v2/assets/#{embed_code}/generated_preview_images")
response.each do |attribute|
  if attribute["url"].present?
    thumbnail = attribute["url"]
    break
  end
end
return thumbnail
The embed code alone will not work; you need a verified signature to get the generated preview images from Ooyala, and that is only possible with the API key and secret code.
I hope I helped you.

Download large file from SkyDrive to Windows Phone 7

I'm having some trouble with the SkyDrive download process and hoping you can help me.
Following the standard SkyDrive API & examples, I've set up a page that browses the SkyDrive folder structure, lets the user click on a file, prompts to download, and it all works correctly.
Where I'm having trouble is when the downloaded file is large: I get an OutOfMemoryException thrown at around the 100 MB mark.
Dennis speaks about this problem here http://dotnet.dzone.com/articles/2-things-you-should-consider but it relates to a direct URL download, not one via the SkyDrive architecture.
I've tried extracting the URL from SkyDrive and doing the direct download that way but haven't had any success.
Here is the code I'm using - the "item" object is of type SkyDriveItem, having iterated through a folder's contents and selected this file.
LiveConnectClient downloadClient = new LiveConnectClient(App.Session);
try
{
    downloadClient.DownloadCompleted += new EventHandler<LiveDownloadCompletedEventArgs>(downloadClient_DownloadCompleted);
    downloadClient.DownloadProgressChanged += new EventHandler<LiveDownloadProgressChangedEventArgs>(downloadClient_DownloadProgressChanged);
    downloadClient.DownloadAsync(item.ID + "/content", item);
}
catch (Exception ex)
{
    // handle the download failure
}
This works fine when the file isn't too large but, as mentioned, select a big file (>100 MB) and it dies with the OutOfMemoryException.
Any pointers?
Thanks in advance
Resolved: while I was never able to use the downloadClient.DownloadAsync() method to download large files, playing with downloadClient.getAsync() and using the pre-authenticated URL via a regular stream downloader does the trick.
