Update picture on my Windows Azure Storage (Refresh) - caching

I have a Silverlight project, and I uploaded an image to Windows Azure Storage here:
https://**.blob.core.windows.net/profilepicture/3d5978a1-3e51-4212-b129-9ff401149bc0
I can see my picture, but when I update it I still see the old one (I think it's because of caching). When I check with Azure Storage Explorer, the picture has actually been changed...
How can I force a refresh in my Silverlight application so that it shows the latest version?
Thank you very much.
If you need more details, just ask.

You can force the cache expiration for the blob; read this doc:
http://msdn.microsoft.com/en-us/library/windowsazure/gg680306.aspx
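For reference, a minimal sketch of what that doc covers, written against the current @azure/storage-blob package rather than the SDK the article targets; the container and blob names are taken from the question, and the connection string is assumed:

    import { BlobServiceClient } from "@azure/storage-blob";

    // Shorten the cache lifetime of the existing profile-picture blob.
    async function shortenCacheTtl(connectionString: string): Promise<void> {
      const service = BlobServiceClient.fromConnectionString(connectionString);
      const blob = service
        .getContainerClient("profilepicture")
        .getBlobClient("3d5978a1-3e51-4212-b129-9ff401149bc0");

      // Note: setHTTPHeaders replaces all HTTP headers on the blob, so re-supply
      // values such as the content type if they matter to you.
      await blob.setHTTPHeaders({
        blobCacheControl: "public, max-age=60", // ask clients to revalidate after 60s
        blobContentType: "image/jpeg",          // assumed content type
      });
    }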

The best way to force a refresh is to use a new filename when replacing a blob in Azure Storage. Write your app so it resolves the current name dynamically, so it is always pulling the latest one.
There are cache-control headers you can set on the blob, but you cannot guarantee that every intermediate proxy will honor them, so changing the filename (the GUID in your URL above) is the only solution guaranteed to work.
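A sketch of that approach, assuming a Node backend with @azure/storage-blob; the container name comes from the question, and the step that records which name is current is left as a placeholder:

    import { randomUUID } from "crypto";
    import { BlobServiceClient } from "@azure/storage-blob";

    // Upload the replacement image under a brand-new name so no cached copy
    // of the old URL can ever be served in its place.
    async function replaceProfilePicture(
      service: BlobServiceClient,
      image: Buffer,
    ): Promise<string> {
      const container = service.getContainerClient("profilepicture");
      const blobName = randomUUID(); // fresh GUID-style name per upload
      await container.getBlockBlobClient(blobName).uploadData(image, {
        blobHTTPHeaders: { blobContentType: "image/jpeg" },
      });
      // Persist blobName wherever the Silverlight client looks up the current
      // picture URL (user record, settings store, ...) -- not shown here.
      return blobName;
    }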

Related

Azure storage and CDN and getting blob

I have a problem with CDN caching for Azure Storage. I set up storage, added a CDN endpoint for it, and mapped a custom domain.
Everything worked fine until recently, when one image I uploaded got stuck in the CDN. When I access it without the CDN everything is fine, but over the CDN it always shows the old image. I've tried everything: I set a custom cache expiration, I deleted the image, I moved it... but nothing works. I even waited a day in case Azure would fix it automatically or some cache would expire, but nothing changed.
Has anybody had a similar problem before? How can I fix it?
All other images (blobs) in the same container work fine.
You have to be really careful when planning CDN usage. When you enable the CDN, you have to be in full control of each and every blob that will be served through it.
Do this by explicitly setting the x-ms-blob-cache-control property on a Put Blob, Put Block List, or Set Blob Properties request, or by using the Azure managed library to set the BlobProperties.CacheControl property.
If you forget to set this property before the file is first accessed through the CDN, the CDN assumes a 7-day TTL (time-to-live) for that file. Any subsequent change to the blob's cache-control property will not take effect until that 7-day TTL elapses. I believe you have accidentally fallen into this default 7-day TTL (hoping it is not the worse case: a wrongly set cache-expiry header with an even longer period).
You can read more on best practices for controlling CDN content here. And I warmly ask you to give your three votes to this feature request.
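As a rough illustration of setting that property at upload time, here is a sketch using the current @azure/storage-blob package instead of the classic managed library; container and blob names are placeholders:

    import { BlobServiceClient } from "@azure/storage-blob";

    // Put Blob with an explicit cache-control so the CDN never falls back to
    // its 7-day default TTL for this file.
    async function uploadForCdn(service: BlobServiceClient, png: Buffer): Promise<void> {
      const blob = service
        .getContainerClient("cdn-content")
        .getBlockBlobClient("images/logo.png");
      await blob.uploadData(png, {
        blobHTTPHeaders: {
          blobContentType: "image/png",
          blobCacheControl: "public, max-age=3600", // one-hour TTL
        },
      });
    }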

Is there a way to overwrite existing Blobs in the blobstore

I'm using the high-performance image serving feature in App Engine to serve images from the blobstore. However, I'd like users to be able to modify those images (e.g. rotate, crop, etc.) and then write those changes back to the blobstore, overwriting the original blob. I know I can write new blobs to the blobstore, as documented here: http://code.google.com/appengine/docs/python/blobstore/overview.html#Writing_Files_to_the_Blobstore
but I don't see a way to overwrite existing blobs. Is this possible in App Engine?
My use case is as follows:
- The user uploads an image, and App Engine generates a link via get_serving_url.
- The user may then use that link outside of my app, e.g. link to it on their blog to display the image.
- If that image is changed later in my app (rotation, etc.), I'd like their image link to reflect those changes.
Files stored in the blobstore are immutable: once they have been written they cannot be changed, only served or deleted.
I think you should build your own controller for generating the serving URL:
- In the Datastore, give each blob-file record its own ID (which you manage) and a version field.
- On first upload, set a new ID and version.
- When the user changes the image, save a new blob to the blobstore, keep the same ID, and set a new version.
In the serving controller, generate the link from the ID; when the user requests it, serve the newest version.
It's just my opinion, I hope it helps!
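A language-agnostic sketch of that ID + version idea, shown here in TypeScript with an in-memory map standing in for the Datastore; all names are invented:

    interface ImageRecord {
      id: string;      // stable public ID the outside world links to
      version: number; // bumped on every edit
      blobKey: string; // key of the immutable blob holding this version
    }

    const images = new Map<string, ImageRecord>(); // stand-in for the Datastore

    // After writing a new blob for an edited image: keep the ID, bump the version.
    function recordNewVersion(id: string, blobKey: string): void {
      const current = images.get(id);
      const version = current ? current.version + 1 : 1;
      images.set(id, { id, version, blobKey });
    }

    // The serving controller resolves the stable ID to the newest blob, so links
    // handed out earlier keep pointing at the latest image.
    function currentBlobKey(id: string): string | undefined {
      return images.get(id)?.blobKey;
    }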

How to cache images and html files in PhoneGap

I need a way to cache images and HTML files from my site in PhoneGap. The plan is for users to see the site without an internet connection just as they would with one. But I only find information about storing SQL data; how can I store images (and use them later)?
To cache images, check out this library (of which I'm the creator): imgcache.js. It's designed for the very purpose of caching images using the local filesystem. If you look at the examples, you'll see that it can also detect when an image fails to load (because you're offline or have a very bad connection) and then automatically replaces it with the cached copy. The user of the web app doesn't even notice they're offline.
As for HTML pages: if they're static HTML files, they can be stored locally in the web app (file:// in PhoneGap). If they're dynamically generated pages, look at the localStorage API if you have a small amount of data, otherwise the filesystem API.
For my web app I retrieve only JSON data from my server (and process/render it using Backbone + Underscore). The JSON payload is stored in localStorage. If the application goes offline, it fetches the JSON from localStorage instead of the server (a home-baked fork of Backbone.dualStorage).
You then get the full offline experience: pages + images.
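For what it's worth, a minimal sketch of the "localStorage as offline JSON cache" part described above; the key prefix is made up, and a real app would also handle storage-quota errors:

    // Fetch JSON from the server and cache it; when offline, fall back to the cache.
    async function fetchJsonWithOfflineFallback(url: string): Promise<unknown> {
      try {
        const resp = await fetch(url);
        if (!resp.ok) throw new Error(`HTTP ${resp.status}`);
        const json = await resp.json();
        localStorage.setItem(`cache:${url}`, JSON.stringify(json)); // refresh the cache
        return json;
      } catch {
        const cached = localStorage.getItem(`cache:${url}`); // offline or request failed
        if (cached === null) throw new Error(`Offline and nothing cached for ${url}`);
        return JSON.parse(cached);
      }
    }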
Caching like you might need for simple offline operation is not exactly easy.
Your first option is the cache manifest. It has some limitations (like the size of the cache), but it might work for you since it was designed to do what you want.
Another option is to store content on the device's disk using the filesystem APIs. This has some drawbacks, like security and the fact that you have to load the file from a path/URL that is different from the one you would normally load it from on the web. Check out the hydra plugin for an example of this.
One final option is to store things in localStorage (which has the benefit of being private on all platforms) and then pull them out of there when needed... that means base64-encoding all your images, though, so it's a pretty big departure from standard caching.
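A rough sketch of that last option, caching an image as a base64 data URL in localStorage (the helper names are invented, and localStorage quota limits make this practical only for small images):

    // Download an image and store it as a base64 data URL.
    async function cacheImage(url: string): Promise<void> {
      const blob = await (await fetch(url)).blob();
      const dataUrl: string = await new Promise((resolve, reject) => {
        const reader = new FileReader();
        reader.onloadend = () => resolve(reader.result as string);
        reader.onerror = () => reject(reader.error);
        reader.readAsDataURL(blob);
      });
      localStorage.setItem(`img:${url}`, dataUrl); // may throw if the quota is exceeded
    }

    // When rendering, prefer the cached copy if one exists.
    function imageSrc(url: string): string {
      return localStorage.getItem(`img:${url}`) ?? url;
    }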
Caching is very much possible on Android, but on iOS, as stated above, there are limitations on image size, cache size, etc.
If you are willing to allow caching on iOS, you can use the cache manifest to do so, but keep the drawbacks and limitations in mind.
Also, if you save files to your app's Documents folder, Apple will reject your app. The reason is that since iOS 6 the system backs up everything under the Documents folder to iCloud, so Apple does not allow large data such as images or JSON files that could simply be re-synced from your server to be kept there.
There is another workaround that works well: use LocalFileSystem.TEMPORARY instead. It does not save the data to Library/Caches; it saves it to the app's temp folder, which is neither auto-backed-up to iCloud nor auto-deleted.
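A hedged sketch of that LocalFileSystem.TEMPORARY approach using the Cordova/PhoneGap File plugin's callback API; the plugin must be installed, and the file name is illustrative:

    // Globals provided by the Cordova File plugin, declared here for TypeScript.
    declare const LocalFileSystem: { TEMPORARY: number; PERSISTENT: number };

    // Write downloaded data into the app's temporary folder (which, per the answer
    // above, is not backed up to iCloud).
    function saveToTemp(fileName: string, data: Blob): void {
      const fail = (err: unknown) => console.error("File API error", err);
      (window as any).requestFileSystem(
        LocalFileSystem.TEMPORARY,
        0, // requested size in bytes; 0 lets the plugin decide
        (fs: any) => {
          fs.root.getFile(fileName, { create: true }, (entry: any) => {
            entry.createWriter((writer: any) => writer.write(data), fail);
          }, fail);
        },
        fail,
      );
    }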

Azure Blob storage, CDN and cache expiring

We're using the Azure CDN, but we've stumbled upon a problem. Previously, content was never updated, but we've now added the option for users to crop their picture, which changes the thumbnails. The image is not created as a new blob; instead we just overwrite the existing blob's stream.
There doesn't seem to be any way to clear the cache, update headers, or anything else.
Is the only answer here to create a new blob and delete the old one?
Thanks.
The CDN will keep serving the cached content until the cache expiry passes or the file name changes.
A CDN is best for static content with a high cache-hit ratio. Using a CDN for dynamic content is not recommended, because it makes the user wait for a double hop, from storage to the CDN and from the CDN to the user, and you also pay twice the bandwidth on the initial load.
I guess the only workaround right now is to pass a dummy parameter in the request from the client to force the file to be downloaded every time:
http://resourceurl?dummy=dummyval
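In code, that workaround amounts to appending a throwaway parameter per request; the parameter name is arbitrary:

    // Make each request look like a new URL so the CDN cache is bypassed.
    function bustCache(url: string): string {
      const separator = url.includes("?") ? "&" : "?";
      return `${url}${separator}dummy=${Date.now()}`;
    }

    // e.g. bustCache("https://mycdn.example.net/thumbs/user42.jpg")
    // => "https://mycdn.example.net/thumbs/user42.jpg?dummy=1700000000000"

Note that this effectively disables CDN caching for that asset, so it's only a stopgap.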

Reload Document into Google Docs Viewer (Clear Cache)

Google Docs Viewer (http://docs.google.com/viewer) creates a cache of a document after the first viewing. To see what I mean, try the following:
1. Upload file.pdf to your server (i.e., http://example.com).
2. Visit http://docs.google.com/viewer?url=http://example.com/file.pdf
3. Upload a new file to replace file.pdf (but use the same name).
4. Revisit http://docs.google.com/viewer?url=http://example.com/file.pdf.
Google Docs Viewer still shows the old file.pdf.
Anyone know how to correct this?
(I have already tried clearing browser cache, switching browsers, and logging in with a different google account to view the link.)
It appears there is no way to clear the cache, although in my experience Google tends to do it automatically about once a day.
Maybe if you append a dynamic query-string parameter to the file URL, the cached copy won't be used, e.g.:
http://docs.google.com/viewer?url=http://example.com/file.pdf?time=3454354
I added ?time=0 and it seemed to work.
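For completeness, a small sketch of building that link, URL-encoding the target so the time parameter is applied to the PDF rather than to the viewer itself:

    // Build a Google Docs Viewer link whose target URL changes on every call.
    function viewerUrl(fileUrl: string): string {
      const separator = fileUrl.includes("?") ? "&" : "?";
      const busted = `${fileUrl}${separator}time=${Date.now()}`;
      return `http://docs.google.com/viewer?url=${encodeURIComponent(busted)}`;
    }

    // viewerUrl("http://example.com/file.pdf")
    // => "http://docs.google.com/viewer?url=http%3A%2F%2Fexample.com%2Ffile.pdf%3Ftime%3D..."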
