Efficiently serve images from App Engine with a readable URL

What's a fast and/or cheap way to serve an image from a readable URL on Google App Engine?
<img src="http://mycustomdomain.com/image-server/my-readable-url">
(These URLs can't be changed, so I can't use get_serving_url without the cost of a redirect.)
In the documentation on Serving a Blob:
Note: If you are serving images, a more efficient and potentially less-expensive method is to use get_serving_url using the App Engine Images API rather than send_blob. The get_serving_url function lets you serve the image directly, without having to go through your App Engine instances.
Here are five options off the top of my head that I'm considering, based on the size of the image and how quickly it needs to be returned (each option would hopefully use the edge cache):
1. Datastore lookup for a serving URL (precomputed by get_serving_url) and redirect to that serving URL.
2. Datastore lookup for the BlobKey and send_blob.
3. Datastore lookup for a BlobProperty and send it out (increased storage cost, but maybe OK for icons etc.).
4. Somehow bake the URL into an object name in a Google Cloud Storage bucket to avoid the datastore lookup, and simply redirect to that object (assuming this isn't possible?).
5. Some app.yaml hack that queues up these images and deploys them to App Engine as static files, falling back to options 1-4 if the static file is not found (assuming this isn't possible?).
(Datastore costs $0.18/GB/month; static files and Blobstore cost only $0.026/GB/month.)
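For concreteness, here's roughly what I picture option 2 looking like (a Python/webapp2 sketch; ImageRecord is a hypothetical model keyed on the readable slug):

from google.appengine.ext import ndb
from google.appengine.ext.webapp import blobstore_handlers
import webapp2

class ImageRecord(ndb.Model):
    # Keyed by the readable URL slug; stores the Blobstore key.
    blob_key = ndb.BlobKeyProperty(required=True)

class ImageServer(blobstore_handlers.BlobstoreDownloadHandler):
    def get(self, slug):
        record = ndb.Key(ImageRecord, slug).get()
        if record is None:
            self.error(404)
            return
        # Cache aggressively so the edge cache can absorb repeat hits.
        self.response.headers['Cache-Control'] = 'public, max-age=86400'
        self.send_blob(record.blob_key)

app = webapp2.WSGIApplication([(r'/image-server/(.+)', ImageServer)])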
Are there any other options I haven't considered? Is option 2 the best?

I suggest using Google Cloud Storage with a custom domain; see here:
https://cloud.google.com/storage/docs/website-configuration
You can upload to Cloud Storage from your app (that link is for Java):
https://cloud.google.com/appengine/docs/java/googlestorage/
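For the Python runtime, the upload side might look like this minimal sketch using the App Engine GCS client library (appengine-gcs-client); the bucket name, path, and public-read ACL are assumptions:

import cloudstorage as gcs

def save_image(data, object_name, content_type='image/jpeg'):
    # Paths are written as /<bucket>/<object>; a public-read ACL lets the
    # object be served directly from the bucket's website/custom-domain URL.
    filename = '/my-bucket/image-server/' + object_name
    with gcs.open(filename, 'w', content_type=content_type,
                  options={'x-goog-acl': 'public-read'}) as f:
        f.write(data)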

Related

How to load private Amazon S3 files with JavaScript?

I have a difficult problem:
I am using OpenSeadragon for viewing large photos on my private Laravel web application.
The photos and tiles are stored in my private Amazon S3 bucket.
But how can I access these private photos in my JavaScript OpenSeadragon component in a safe way?
What I have done: I created a router function in my Laravel application that redirects to Amazon S3:
function getTiles($tile) {
    // validation && authorisation
    return redirect()->to(
        \Storage::disk('s3')->temporaryUrl($tile, now()->addMinutes(5))
    );
}
And I have configured my OpenSeadragon component (following https://openseadragon.github.io/examples/tilesource-custom/) so it loads the tiles from my router function.
This works, but the problem is that it is very slow, because OpenSeadragon loads over 100 tiles per second.
I am searching for a good, fast and safe solution to this problem...
I could change the visibility of my S3 tiles folder to "public" with a hard-to-guess random folder name, but anyone who knows that folder name can download the photo. That's not a good solution...
The best way to achieve this is likely to place CloudFront in front of your S3 bucket and switch from presigned URLs to presigned cookies.
The Cloudfront documentation for presigned cookies is here: https://docs.aws.amazon.com/AmazonCloudFront/latest/DeveloperGuide/private-content-signed-cookies.html
It should allow you to pre-authorize access to all of a photo's tiles at once, as long as you use a wildcard character in the policy's resource path.
The AWS docs on choosing between presigned URLs and presigned cookies: https://docs.aws.amazon.com/AmazonCloudFront/latest/DeveloperGuide/private-content-choosing-signed-urls-cookies.html
Note that it specifically mentions "You want to provide access to multiple restricted files, for example, all of the files for a video in HLS format or all of the files in the subscribers' area of a website."
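For illustration, here is a hedged Python sketch of the cookie-signing step (the Laravel side mirrors the same structure); the key-pair ID, private key, and tile domain are placeholders:

import base64
import json
import time

import rsa  # pip install rsa

def _cloudfront_b64(data):
    # CloudFront's modified base64: '+', '=', '/' become '-', '_', '~'.
    return base64.b64encode(data).decode('ascii') \
        .replace('+', '-').replace('=', '_').replace('/', '~')

def signed_cookies_for(resource_pattern, key_pair_id, private_key_pem,
                       ttl_seconds=300):
    # A custom policy with a wildcard resource, e.g.
    # 'https://tiles.example.com/photo-123/*', pre-authorizes every tile.
    policy = json.dumps({
        'Statement': [{
            'Resource': resource_pattern,
            'Condition': {
                'DateLessThan': {'AWS:EpochTime': int(time.time()) + ttl_seconds}
            },
        }]
    }, separators=(',', ':'))
    # private_key_pem: PEM bytes of the CloudFront key pair's private key.
    key = rsa.PrivateKey.load_pkcs1(private_key_pem)
    signature = rsa.sign(policy.encode('utf-8'), key, 'SHA-1')
    return {
        'CloudFront-Policy': _cloudfront_b64(policy.encode('utf-8')),
        'CloudFront-Signature': _cloudfront_b64(signature),
        'CloudFront-Key-Pair-Id': key_pair_id,
    }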

Google App Engine & Images server

I'm having difficulty understanding whether my idea for an image gallery will work, as I can't seem to get it working.
What I have:
A Google App Engine app running a simple website that serves products, where each product can have images
A Google Storage bucket with millions of images
What I planned to do:
Add a CDN & load balancer in front of the Google Storage bucket to serve the images fast worldwide on a subdomain.
Status: This works. At least it serves the images.
Problems:
But I have the feeling that the architecture is not right, as the App Engine app can't be put behind the same load balancer & CDN to serve all the static content through that CDN. And I see no way to add content-caching headers. Google's documentation says I should be able to add cache keys in the load balancer config, but I've been through that config and the backend-bucket config ten times without finding any. You also can't set this in the App Engine app.yaml, since the images are not served via App Engine...
So, questions:
Is it logical in this setup to have a GAE app and a separate load balancer with a storage bucket for the images?
How do I add Cache-Control headers to the CDN/bucket config of Google Cloud CDN?
Assuming that the GCS bucket setup you already have in place lets you serve an image via the CDN & load balancer as you desire, say on a URL like https://www.example.com/img.png, then handling such a request will already include all the required cache control.
If so, then in your GAE app-provided pages, instead of referencing an image via a relative path to your site, like <img src="/static/img.png">, which would indeed require handling its own cache management inside the GAE app code, you could simply reference the image via its corresponding URL in the existing CDN setup: <img src="https://www.example.com/img.png">, with all cache control already included.
Note: I didn't actually try it (I don't have such GCS CDN setup), but I see no reason for which this wouldn't work.
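On the Cache-Control question specifically: Cloud CDN honors the Cache-Control metadata stored on the GCS objects themselves, so one way to control caching is to set it per object. A small sketch with the google-cloud-storage Python client (bucket and object names are placeholders):

from google.cloud import storage  # pip install google-cloud-storage

client = storage.Client()
bucket = client.bucket('my-images-bucket')
blob = bucket.blob('products/img.png')

# GCS serves this value as the Cache-Control response header, and the
# CDN in front of the backend bucket caches accordingly.
blob.cache_control = 'public, max-age=86400'
blob.patch()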

What bits of info are stored in Pinterest image URLs?

The first bit before the _ is the id of the pin... what is the HZtfjmFQ used for? I'm assuming the _c is probably something to do with size.
http://media-cache-lt0.pinterest.com/upload/33284484717557666_HZtfjmFQ_c.jpg
I'm building an image upload service in node.js and was curious what other sites do to store the image.
Final images are served from a CDN, as is evident from the subdomain in the URL. The first bit, as you pointed out, is the id of the image; the second bit is a UID that gets around cache limitations for image versions; and the last bit is the image size.
A limitation of CDNs is the inability to process the image after upload. To get around this, my service uploads the files to my Node.js server, where I then serve the image back to the client. I use a jQuery script that lets the user crop the image and sends the crop coordinates back to the server, where I use ImageMagick to create the various sizes of the uploaded image. You can obviously eliminate the crop step and just use aspect ratios to automatically create the needed image sizes. I then upload the final images to the CDN for hosting to end users.
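To illustrate that resize step, here is a rough sketch in Python with Pillow (the size table and naming are made up for illustration; my own service uses ImageMagick as described above):

from PIL import Image  # pip install Pillow

# Hypothetical size variants; '_c' mirrors the suffix seen in the URL above.
SIZES = {'_t': 75, '_b': 192, '_c': 554}

def make_variants(src_path, image_id, uid):
    # Emits one file per size, named <id>_<uid><suffix>.jpg.
    original = Image.open(src_path)
    for suffix, width in SIZES.items():
        height = int(original.height * width / original.width)  # keep aspect ratio
        resized = original.resize((width, height), Image.LANCZOS)
        resized.save('%s_%s%s.jpg' % (image_id, uid, suffix), 'JPEG', quality=85)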
When a user needs to update a photo already in the CDN, the user uploads to the Node.js server, the images are processed and sized, the UID hash is updated, and the result is uploaded to the CDN. If you want to keep things clean (and cut CDN costs) you can delete the old "version" at this step as well. However, in my service I give the option to backtrack to an older version if needed.
The rest of the CRUD actions are pretty self-explanatory. You can read a list of images available from the CDN using the id (my service has a user id as well as an image id to allow more robust query operations), and deleting is as simple as identifying the image you want to delete.

How to cache images and html files in PhoneGap

I need a way to cache images and HTML files from my site in PhoneGap. I'm planning for users to be able to see the site without an internet connection just as they would with one. But I only see information about storing SQL data; how can I store images (and use them later)?
To cache images, check out imgcache.js, a library I created. It's designed for the very purpose of caching images using the local filesystem. If you check out the examples, you will see that it can also detect when an image fails to load (because you're offline or you have a very bad connection) and then replaces it automatically with the cached image. The user of the webapp doesn't even notice being offline.
As for HTML pages: if they're static HTML files, they can be stored locally in the web app (file:// in PhoneGap).
If they're dynamically generated pages, check the localStorage API if you have a small amount of data; otherwise use the filesystem API.
For my web app I retrieve only JSON data from my server (and process/render it using Backbone + Underscore). The JSON payload is stored in localStorage. If the application goes offline, it fetches the JSON data from localStorage instead of the server (a home-baked fork of Backbone.dualStorage).
You then get the full offline experience: pages+images.
Caching like you might need for simple offline operation is not exactly that easy.
Your first option is the cache manifest. It has some limitations (like the size of the cache) but might work for you, since it was designed to do what you want.
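For reference, a minimal manifest looks like this (file names are placeholders); it must be served with the text/cache-manifest MIME type and referenced from the html tag, as in <html manifest="offline.appcache">:

CACHE MANIFEST
# v1 - change this comment to force clients to re-download the cache

CACHE:
index.html
css/site.css
img/logo.png

NETWORK:
*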
Another option is to store content on the device's disk using the filesystem APIs. This has some drawbacks, like security and the fact that you have to load the file from a path/URL that is different from the one you would normally load it from on the web. Check out the Hydra plugin for an example of this.
One final option might be to store stuff in localStorage (which has the benefit of being private on all platforms) and then pull it out of there when needed... that means base64-encoding all your images though, so it's a pretty big departure from standard caching.
Caching is very much possible on Android, but on iOS, as stated above, there are limitations on image sizes, cache size, etc.
If you are willing to integrate and allow caching on iOS, you can use the cache manifest to do so, but keep the drawbacks and limitations in mind.
Also note that if you save files to the Documents folder of your app, Apple will reject your app. The reason is that since iOS 6 the system backs up all data under the Documents folder to iCloud, so Apple does not allow large data, like images or JSON files that could simply be re-synced from your server, to be kept in this folder.
A good workaround is to use LocalFileSystem.TEMPORARY instead. It does not save the data to Library/Caches but to the app's temp folder, which is neither auto-backed-up to iCloud nor auto-deleted.

Resize large images in App Engine

I've got an app on Google App Engine that will accept image uploads from users. The problem I envision is that users will upload these images directly from their cameras, and file sizes are often greater than 1MB, which is the limit for the Images API (which would be used to resize the images).
What's the best way to accept the upload of, say, a 1.5MB image file and resize it to under 1MB?
While this is not clear in the App Engine documentation, it is possible using a combination of the Blobstore and the Image Manipulation Service.
You must:
1. Upload the image into the Blobstore.
2. Retrieve the image from the Blobstore.
3. Perform the image manipulation, producing an image of less than 1MB in size.
I've written up a post about this -> http://socialappdev.com/uploading-and-re-sizing-large-images-on-app-engine-11-2010.
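In Python, that sequence is roughly the following (an untested sketch; the key point is that images.Image accepts a blob_key, so the full-size data never has to be passed in by value):

from google.appengine.api import images

def shrink(blob_key):
    # Works on the Blobstore object directly, sidestepping the 1MB limit
    # that applies to image data passed in as raw bytes.
    img = images.Image(blob_key=blob_key)
    img.resize(width=800)  # pick dimensions that land under 1MB
    return img.execute_transforms(output_encoding=images.JPEG)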
Here are two (similar) ways to solve this:
If you want to keep everything under your own control, you can put a resize script on a server of yours. It takes the URL of the raw uploaded image (which can be up to 10MB due to the HTTP response size limit, though you would have to store it as 1MB chunks in the datastore; a sketch of that chunking follows below), downloads it from your application, resizes it, and then POSTs it back to your application. All this interaction would need some kind of authorization, of course, so that it can't be abused. Alternatively, POST the image directly to your external server, but then you have to either send the other form data back to your application or use a separate form for the image upload.
Use an external imaging service. I would recommend Picnik; see their API documentation. As you can see, it lets you make a form that posts the image directly to their servers; the user can then edit (and resize) the image, and it is posted back to your server. With this solution you have to upload the image in a separate form, since Picnik receives all your POST data.
I recommend option 2, because it doesn't require you to work around Google App Engine's limitations, and since your users are uploading images straight from the camera, they will probably want to do something with them anyway (such as cropping).
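If you do go the first route, the "1MB chunks in the datastore" bookkeeping could look something like this hypothetical sketch (old db API; the model and names are made up):

from google.appengine.ext import db

CHUNK_SIZE = 1000000  # stay under the ~1MB entity size limit

class ImageChunk(db.Model):
    image_id = db.StringProperty(required=True)
    index = db.IntegerProperty(required=True)
    data = db.BlobProperty(required=True)

def store_image(image_id, raw_bytes):
    for i in range(0, len(raw_bytes), CHUNK_SIZE):
        ImageChunk(image_id=image_id, index=i // CHUNK_SIZE,
                   data=db.Blob(raw_bytes[i:i + CHUNK_SIZE])).put()

def load_image(image_id):
    chunks = ImageChunk.all().filter('image_id =', image_id).order('index')
    return ''.join(chunk.data for chunk in chunks)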
That's a conundrum. The "obvious" answer, using google.appengine.api.images.resize, won't work because the image is too big. :) So you will have to use third-party software, either on the server (which will be tricky because of App Engine's limitations) or on the client (e.g. a Java uploader).
