How to load private Amazon S3 files with JavaScript? - laravel

I have a difficult problem:
I am using OpenSeadragon to view large photos in my private Laravel web application.
The photos and tiles are stored in my private Amazon S3 bucket.
But how can I access these private photos from my JavaScript OpenSeadragon component in a safe way?
What I have done: I created a route in my Laravel application that validates the request and redirects to Amazon S3:
function getTiles($tile) {
    // validate the request and authorise the current user for this tile
    return redirect()->to(
        \Storage::disk('s3')->temporaryUrl($tile, now()->addMinutes(5))
    );
}
And I have configured my OpenSeadragon component (following https://openseadragon.github.io/examples/tilesource-custom/) so that it loads the tiles through this route.
This works, but it is very slow, because OpenSeadragon requests more than 100 tiles per second and every single tile goes through a Laravel request plus a redirect.
I am searching for a good, fast and safe solution to this problem...
I could change the visibility of my S3 tiles folder to "public" with a hard-to-guess random folder name, but anyone who knows that folder name can download the photo. That's not a good solution...
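For reference, a custom tile source wired to such a route looks roughly like this; the dimensions and the /tiles URL pattern are illustrative, not taken from the actual setup:

```javascript
// Sketch of an OpenSeadragon custom tile source. Every tile URL points at
// the Laravel route, which authorises the request and then redirects to a
// short-lived S3 URL. Width/height/tileSize and the /tiles path are
// made-up example values.
const tileSource = {
  width: 8192,
  height: 8192,
  tileSize: 256,
  minLevel: 8,
  maxLevel: 13,
  // OpenSeadragon calls this once for every tile it wants to display.
  getTileUrl: function (level, x, y) {
    return `/tiles/${level}/${x}_${y}.jpg`;
  },
};
```

The viewer is then created with OpenSeadragon({ id: "viewer", tileSources: tileSource }); since every displayed tile triggers one call to getTileUrl, every tile costs a full Laravel round trip plus a redirect.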

The best way to achieve this is likely to place CloudFront in front of your S3 bucket and switch from presigned URLs to CloudFront signed cookies.
The CloudFront documentation for signed cookies is here: https://docs.aws.amazon.com/AmazonCloudFront/latest/DeveloperGuide/private-content-signed-cookies.html
It should let you pre-authorise access to all of a photo's tiles at once, as long as you use a wildcard character in the policy's resource path.
The AWS docs on choosing between signed URLs and signed cookies: https://docs.aws.amazon.com/AmazonCloudFront/latest/DeveloperGuide/private-content-choosing-signed-urls-cookies.html
Note that it specifically mentions "You want to provide access to multiple restricted files, for example, all of the files for a video in HLS format or all of the files in the subscribers' area of a website."
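To illustrate the wildcard idea, here is a sketch of the custom policy behind a CloudFront-Policy cookie. The domain and path are hypothetical, and the matching CloudFront-Signature cookie (signed with your CloudFront key pair) is omitted:

```javascript
// Build the value of the CloudFront-Policy cookie: a custom policy whose
// Resource uses a wildcard, so one cookie covers every tile of one photo.
// tiles.example.com and the /photos/... path are made-up examples.
function makeTilePolicy(photoId, expiresEpochSeconds) {
  const policy = {
    Statement: [
      {
        Resource: `https://tiles.example.com/photos/${photoId}/*`,
        Condition: {
          DateLessThan: { "AWS:EpochTime": expiresEpochSeconds },
        },
      },
    ],
  };
  // CloudFront cookies use base64 with '+', '=', '/' replaced by '-', '_', '~'.
  return Buffer.from(JSON.stringify(policy))
    .toString("base64")
    .replace(/\+/g, "-")
    .replace(/=/g, "_")
    .replace(/\//g, "~");
}
```

With such a cookie set on the tiles domain, OpenSeadragon can fetch the tiles directly from CloudFront with no per-tile round trip through Laravel.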

Related

How to integrate an image proxy server with a caching proxy?

I have an imgproxy server that can transform an image (resize, crop) using URL parameters.
Now I would like to add a caching proxy, so that an image is only processed (resized, cropped) if it doesn't already exist in the cache.
I've read that AWS CloudFront, Cloudflare or maybe Google Cloud CDN could act as the caching proxy, which would be great. But unfortunately I didn't find any example of how to do that. I'd appreciate it if anyone can help me.
On Cloudflare, you can leverage the following services for your use case:
Cloudflare Images: for CDN, storage and resizing services
Cloudflare Image Resizing, combined with the Cloudflare CDN via reverse proxy, to allow on-the-fly resizing and optimization of images (differences from Images listed here)
You can also store and resize images on your site, using the Cloudflare CDN (DNS-based reverse proxy) and putting it in front of your image resizing and storage stack. This is explained here in detail.
You can find here the steps to create a Cloudflare account and add your domain to it.
Hi Paolo, I had the same doubt about how to cache, but in my case I wanted to use my own server.
I created a project that caches images based on the requested URL and parameters: an MD5 hash is generated from them, and when an image is requested again with the same parameters, the proxy looks the MD5 up in the cache; otherwise it generates a new resized image.
You can consider using it with S3, Cloudflare or anything else.
You can take a look at the project:
https://github.com/sefirosweb/Imgproxy-With-Cache

Google App Engine & Images server

I'm having difficulties understanding whether my idea for an image gallery will work, as I can't seem to get it working.
What I have:
A Google App Engine app running a simple website that serves products, where each product can have images
A Google Cloud Storage bucket with millions of images
What I planned to do:
Add a CDN & Load balancer to the Google Storage bucket to serve the images worldwide fast on a subdomain.
Status: This works. At least it serves the images.
Problems:
But I have the feeling that the architecture is not right, as the Google App Engine app can't be put behind the same load balancer & CDN to serve all the static content through that CDN. And I see no way to add content-caching headers. Google's documentation says I should be able to add cache keys in the load balancer config, but I've been through that config and the backend-bucket config ten times without finding any. You also can't set this in the app.yaml of the App Engine app, since the images are not served via App Engine...
So questions:
Is it logical in this setup to have a GAE app and a separate load balancer with a storage bucket for the images?
How do I add Cache-Control headers to the CDN/bucket config of Google Cloud CDN?
Assuming that the GCS bucket setup you already have in place lets you serve an image via the CDN & load balancer as you desire, let's say on a URL like https://www.example.com/img.png, then handling such a request will already include all the required cache control.
If so, then in your GAE app-provided pages, instead of referencing an image via a relative path to your site, like <img src="/static/img.png">, which would indeed require handling its own cache management inside the GAE app code, you could simply reference the image via its corresponding URL in the existing CDN setup: <img src="https://www.example.com/img.png">, with all cache control already included.
Note: I didn't actually try it (I don't have such GCS CDN setup), but I see no reason for which this wouldn't work.
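On the second question (Cache-Control headers): Cloud CDN honours the Cache-Control metadata stored on the GCS objects themselves, so instead of hunting for a load-balancer setting you can set it per object. A sketch with gsutil, assuming a made-up bucket name:

```shell
# Bulk-set Cache-Control metadata on every object under a prefix; Cloud CDN
# and browsers will honour it when serving through the backend bucket.
gsutil -m setmeta -h "Cache-Control: public, max-age=86400" \
    "gs://my-images-bucket/images/**"
```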

efficiently serve images from app engine with readable URL

What's a fast and/or cheap way to serve an image from a readable URL on Google App Engine?
<img src="http://mycustomdomain.com/image-server/my-readable-url">
(These URLs can't be changed, so I can't use get_serving_url without the cost of a redirect.)
In the documentation on Serving a Blob:
"Note: If you are serving images, a more efficient and potentially less-expensive method is to use get_serving_url using the App Engine Images API rather than send_blob. The get_serving_url function lets you serve the image directly, without having to go through your App Engine instances."
Here are five options off the top of my head that I'm considering, based on the size of the image and how quickly it needs to be returned (each option would hopefully use edge cache):
1. Datastore lookup for the serving URL (precomputed by get_serving_url) & redirect to that serving URL.
2. Datastore lookup for the blob key & send_blob.
3. Datastore lookup for a BlobProperty & send it out (increased storage cost, but maybe OK for icons etc.).
4. Somehow bake the URL into a Google Cloud Storage object name to avoid the datastore lookup, and simply redirect to that object (assuming this isn't possible?).
5. Some app.yaml hack that queues up these images and deploys them to App Engine as static files, falling back to options 1-4 if the static file is not found (assuming option 5 isn't possible?).
(Datastore costs $0.18/GB/month; static files and Blobstore cost only $0.026/GB/month.)
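For what it's worth, the static-file half of option 5 can be expressed in app.yaml; a hypothetical fragment (paths invented) would look like:

```yaml
# Serve pre-deployed images as static files straight from App Engine's
# static-file infrastructure, bypassing the app's instances.
handlers:
- url: /image-server/(.*)
  static_files: static/images/\1
  upload: static/images/.*
```

Note, though, that when a static_files handler matches a URL but the file was never deployed, App Engine returns a 404 rather than falling through to the application, so the "fall back to options 1-4" part would not work as written.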
Are there any other options I haven't considered? Is option 2 the best?
I suggest using Google Cloud Storage with a custom domain.
See here:
https://cloud.google.com/storage/docs/website-configuration
You can upload to Cloud Storage from your app:
https://cloud.google.com/appengine/docs/java/googlestorage/
(that's for Java)

Best way to store 1000s of small images (<5k) either in MongoDb or S3?

I'm planning on storing thousands (hopefully even millions some day) of profile images from Facebook and Twitter. Their usual size is less than 5 KB.
What is the best way to do this, either in MongoDB or on Amazon S3, while avoiding disk fragmentation or similar issues?
Any pointers/tips on the do's and don'ts would be very helpful as well.
Yes, serve profile images from the associated social site (Facebook, Twitter, etc.) where you can. But if you have to store uploaded images on S3, then rather than reading each file from S3 and re-streaming it to your user, you can enable the "Website" feature and link your images to S3 directly.
So your HTML image tag will look like:
<img src="http://<amazon-s3-website-endpoint>/<image-filename>" title="something">
Why not just store the usernames instead? The profile image can be accessed via the Facebook Graph API (just replace "username" below with any Facebook user's username). You'll also save yourself the work of keeping the profile pictures updated.
<img src="http://graph.facebook.com/username/picture" />

can I use CDN with images?

Can I use a CDN with images? And if so, how do I upload images from my website to the CDN server?
Seems like there are a few options to accomplish this.
The first one would be using the CDN as Origin. In which case, there is already an answer with some advice.
The second option would be using your current website as Origin for the images. In which case you will need to do some DNS work that would look something like this:
Published URL -> CDN -> Public Origin
Step 1 - images.yoursite.com IN CNAME images.yoursite.com.edgesuite.net --- this entry sends all traffic for the images subdomain to Akamai's CDN edge network.
Step 2 - origin-images.yoursite.com IN A (or IN CNAME) the public front end for the images.
So the way it works is: in step one you get a request for one of your images, which is sent via DNS to the CDN's edge network (in this case Akamai, HTTP only). If the CDN does not already have the image in cache, or its cache TTL has expired, it forwards the request to the public origin you have set up, pulls the file, applies any custom behavior rules (rewrites, cache-control overrides, etc.), caches the content if it is marked as cacheable, and then serves the file to the client.
There is a lot of customization that can be done when serving static content via CDN. The example above is very superficial and it is that way to easily illustrate the logic at a very high level.
Yes, and you can check with your CDN provider on the methods they allow for uploading, such as:
pull (the CDN server downloads the files from your website/server)
push (files are sent from your website/server to the CDN server)
Example : automatic push to CDN deployment strategy
Do you mean you want to use a CDN to host images? And you want to upload images from your website to the CDN or use the website run by the company hosting the CDN to upload the images?
OK, firstly: yes, you can use a CDN with images. In fact it's advisable to do so.
Amazon CloudFront and Rackspace's Cloud Files are the two that immediately spring to mind. With Cloud Files you can upload either via their API or through their website; with CloudFront you upload to Amazon's S3 storage, which then hooks into the CloudFront CDN.
In common CDN setups you actually don't upload images to the CDN at all. Instead, you access your images via the CDN, much like accessing resources via an online proxy. The CDN, in turn, caches your images according to your HTTP cache headers and makes sure that subsequent requests for the same image are served from the closest CDN edge.
Some recommended CDNs - AWS CloudFront, Edgecast, MaxCDN, Akamai.
Specifically for images, you might want to take a look at Cloudinary, http://cloudinary.com (the company I work at). It does all of this for you: you upload images to Cloudinary, request on-the-fly image transformations, and get the results delivered via Akamai's high-end CDN.