Google Drive SDK thumbnailLink provides a cached file

Using the Google Drive SDK, I retrieve the thumbnailLink property for a Google document and then use it to download the generated image, which I cache on a file server. However, I often get a thumbnail of an older version of my document; it may be a version cached by Google Drive.
This thumbnail link has this form:
https://docs.google.com/...&sz=s220
You can get different thumbnail sizes based on the sz argument. The interesting thing is that I see different versions of the thumbnail (older or newer renderings of my document) depending on the value of the sz argument.
Is there a way to get a fresh thumbnail when a Google document has been updated?

There does appear to be some caching going on with the thumbnail URLs, such that certain ranges of sizes are cached together. In my experience those caches do expire, although not in a way you can completely control (as you've noted, requesting a different size can sometimes cause a cache miss). The team is working on moving thumbnail serving to a new backend that should resolve the issue, but I don't have a timeline for that change.
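In the meantime, a possible workaround is to re-request the file's metadata each time rather than reusing a stored thumbnail URL, and to nudge the sz value so a stale cache entry is more likely to be missed. A minimal sketch, assuming the Node.js googleapis client and Drive API v3 (the file ID is a placeholder):

    // Fetch a fresh thumbnailLink from the Drive API instead of reusing a cached URL.
    // Bumping the size value follows the observation above that a different size can
    // cause a cache miss; this is a workaround, not a guaranteed fix.
    const { google } = require('googleapis');

    async function freshThumbnailUrl(auth, fileId) {
      const drive = google.drive({ version: 'v3', auth });
      const { data } = await drive.files.get({ fileId, fields: 'thumbnailLink,modifiedTime' });
      // e.g. turn ...s220 into ...s221 to request a slightly different size
      return data.thumbnailLink && data.thumbnailLink.replace('s220', 's221');
    }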

Related

Retrieve older cached versions of a webpage

Is it possible to retrieve a cached version of a webpage older than the most recent cache provided by Google search? Something like a history of the webpage?
Google's cached link only provides the most recent cache. Is there a way to get older versions, either via Google or maybe another similar website?
You can use the Wayback Machine (archive.org) for that, or a cache-viewer website like cachearchive.com.
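If you want to query archived versions programmatically, the Wayback Machine exposes a simple availability API; a minimal sketch (the target URL and timestamp are placeholders):

    // Ask the Wayback Machine for the archived snapshot closest to a given date.
    // 'example.com' and the timestamp (YYYYMMDD) are placeholders.
    fetch('https://archive.org/wayback/available?url=example.com&timestamp=20100601')
      .then(res => res.json())
      .then(({ archived_snapshots }) => {
        const closest = archived_snapshots.closest;   // undefined if nothing is archived
        console.log(closest ? closest.url : 'no snapshot found');
      });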

Is there a conventional way to store selections within an image (for example, for tagging many users, as on Facebook)?

Suppose you have hundreds of group photos, all in different formats (GIF, PNG, JPG, etc.). You run a social network where users may identify people in the group photos, so on the backend the server needs to store "rectangles" within each photo. Is there a conventional "rectangle storage" format for such regions within an image, as well as software to display pictures with these "rectangles" superimposed?
There are three standards for regions in an image: the Metadata Working Group (MWG) standard, the Microsoft standard, and, just recently, the IPTC standard. The MWG standard was the one supported by the old Google Picasa program. The Microsoft standard was the one supported by the old Windows Photo Gallery. The IPTC standard is really new and I'm not sure if there is any software that supports it yet.
For the MWG standard, I would point you to their website, but it has been offline for more than a year. You can find a copy of the PDF that describes it attached to this post in the Exiftool forums. I'm not sure but I think that is the standard supported by Lightroom.
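Whichever standard you pick, the common idea is to store the rectangle in coordinates normalized to the image dimensions, so the same region works at any rendered size or after format conversion. A rough sketch of such a record (field names are illustrative and only loosely modeled on the MWG region structure):

    // One tagged region, stored as fractions of the image width/height.
    // Field names are illustrative, not taken verbatim from any of the three standards.
    const region = {
      name: 'Jane Doe',        // person identified by a user
      type: 'Face',
      x: 0.42, y: 0.18,        // top-left corner as fractions of width/height
      w: 0.10, h: 0.15         // extent as fractions of width/height
    };

    // Convert to pixels for an image rendered at displayWidth x displayHeight.
    function toPixels(r, displayWidth, displayHeight) {
      return {
        left: r.x * displayWidth, top: r.y * displayHeight,
        width: r.w * displayWidth, height: r.h * displayHeight
      };
    }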

Magento image loading takes too much time

I have a problem with a Magento application: the site takes too much time to load. The problem is with the category page; the category images are not loading as they should.
Can anybody help me find a solution? The hosting server is Bluehost.
Everything was working fine previously, but now I am facing this issue.
Thanks.
Other than caching the images, there isn't much you can do except maybe upgrading the hosting account. If you are on a shared hosting account and loading 10+ product images on a page, it may take 6-7 seconds to load.
You should check the size of the images. If you are uploading images straight from your digital camera, they can often be several megabytes in size.
I would suggest using JPG images no larger than 800 x 800 px, and make sure the Magento cache is enabled. Each full-resolution image should be under 250 KB in file size (a small script to flag oversized files is sketched after this answer).
You could also put your Magento store behind a cloud service that caches images and pages, which can greatly improve speed.
One more thing you can do: open a ticket with Bluehost and ask them to check the speed of the site; they often have diagnostic tools they can run. Maybe it's something other than the images slowing things down, such as JavaScript getting hung up (are any installed extensions related to the images?).
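As a quick way to act on the size advice above, here is a rough sketch (plain Node.js, not Magento-specific) that walks a media folder and flags files over 250 KB; the 'media/catalog/product' path is an assumption for a typical Magento 1 install:

    // Flag catalog images over ~250 KB so they can be resized or recompressed.
    // The starting path is a guess at a typical Magento 1 layout; adjust as needed.
    const fs = require('fs');
    const path = require('path');

    function flagLargeImages(dir) {
      for (const entry of fs.readdirSync(dir, { withFileTypes: true })) {
        const full = path.join(dir, entry.name);
        if (entry.isDirectory()) {
          flagLargeImages(full);
        } else if (/\.(jpe?g|png|gif)$/i.test(entry.name)) {
          const kb = fs.statSync(full).size / 1024;
          if (kb > 250) console.log(`${full}: ${kb.toFixed(0)} KB`);
        }
      }
    }

    flagLargeImages('media/catalog/product');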

Map Tile Caching for Offline Viewing

I'm trying to build an application that will use open source maps from Open Street Maps (though the concept should be applicable to any map provider). The application will enable the user to specify a number of waypoints along a route prior to departure.
Because I don't have a data plan for my cell phone (and because rambling in the countryside rarely gives you a good connection), I want to be able to pre-load the relevant map tiles for the waypoints and/or route before departure so that maps can continue to be used without a data connection.
My initial thoughts are to download the required tiles from the map provider and store them in isolated storage. However, the Bing Maps control implementation, which uses the TileSource class, relies on returning an absolute URI from which it can download the tile(s), which clearly won't work with data stored in isolated storage.
This question has already been asked (Windows Phone 7 Map Control with custom layer in offline mode) but wasn't answered, and I'm wondering whether anyone has since cracked the problem.
I've seen this done with a custom layer placed over the map. Tiles are then loaded from anywhere you like (IsolatedStorage, online, somewhere else?) into the custom layer.
Sorry, I don't have any code I can share which demonstrates this at the moment, but I am currently doing something very similar.
I built a small prototype using OpenStreetMap for Android. It might be interesting to look at the repository and find a solution similar to mine. I downloaded the maps beforehand, but maybe you can use an online solution for this. This is the repo: https://github.com/kikofernandez/OpenStreetMapExample and here is a video of how it can look: https://vimeo.com/40619538.
For this prototype I used OpenLayers, OpenStreetMap, JavaScript and a WebView in Android. I would like to give you further details, but it was just a prototype.
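For reference, a minimal OpenLayers 2-style setup of the kind described, as it could be loaded in an Android WebView (a generic sketch, not code from the repository above; the div id and the centre coordinates are placeholders):

    // Minimal OpenLayers 2 map with an OSM tile layer.
    // 'map' is the id of a placeholder <div>; the coordinates are arbitrary.
    var map = new OpenLayers.Map('map');
    map.addLayer(new OpenLayers.Layer.OSM());
    var centre = new OpenLayers.LonLat(-0.1278, 51.5074).transform(
      new OpenLayers.Projection('EPSG:4326'),   // from lon/lat
      map.getProjectionObject()                 // to the map's Web Mercator projection
    );
    map.setCenter(centre, 12);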
If you can store the data locally (embed it in the XAP), you can reference it via an absolute URI. Chris Walshie talks about it here.
So, for example, once you have the app's installation path, you can reference the embedded resource with an absolute URI like this (the GUID identifies your app's install folder and EMBEDDED_RESOURCE stands for the file you embedded):
    Uri toResource = new Uri("file:///Applications/Install/4FFA38B5-00AF-4760-A7EB-7C0C0BC1D31A/Install/EMBEDDED_RESOURCE", UriKind.Absolute);
Have you set the Build Action on your image(s) to Content?
If your app is running on WP8, use the built-in maps control in the Windows Phone 8 SDK, as it already supports offline maps out of the box. If you are targeting WP7, it is possible to get offline maps to work, but it takes a lot of work. I created this for a customer a few years ago and I believe it took me a little over 3000 lines of code. Mind you, they also wanted a framework for adding tiles from various sources, such as downloading over an area and downloading zipped files. The way I got the rendering to work was to add a canvas to the map without setting its position. By default this makes it a child of the map, but it will not move. I then made the canvas the same size as the map and used the resize event to resize the canvas should the map be resized. I then used the view-change event to trigger a method that renders the tiles. When this event fired, I first calculated all the tiles in view using the tile math described here: http://msdn.microsoft.com/en-us/library/bb259689.aspx (a sketch of that math follows this answer).
I would then pull the tiles from isolated storage and draw them on the canvas. For performance, I kept track of which tiles I had added to the canvas, so that if a tile was still in view I simply changed its position rather than reloading it from isolated storage. I also removed any images that were no longer in view. Overall this works fine, but there were some minor issues, such as not having the smooth transition between zoom levels. If you really want that, it is possible to get it to work, but it requires a lot more math. Also, if you zoom into an area where there are no tiles, you end up with an empty map. You can create a custom map mode to prevent the user from going into areas where you don't have tiles.
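For reference, the core of that tile math (the standard Web Mercator / slippy-map scheme used by the linked article and by OSM tiles) fits in a few lines; a sketch:

    // Convert a lat/long and zoom level to the X/Y indices of the tile containing it
    // (standard Web Mercator / slippy-map tile scheme).
    function latLonToTileXY(lat, lon, zoom) {
      var n = Math.pow(2, zoom);
      var x = Math.floor((lon + 180) / 360 * n);
      var latRad = lat * Math.PI / 180;
      var y = Math.floor(
        (1 - Math.log(Math.tan(latRad) + 1 / Math.cos(latRad)) / Math.PI) / 2 * n
      );
      return { x: x, y: y };
    }
    // Apply this to two corners of the current view to enumerate every tile index
    // in range, then load each tile from isolated storage (or wherever it is cached).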
A solution
The question is a bit old, but there's a solution for anyone who can use Qt.
The solution is not limited to the Windows Phone platform; I've done it targeting Android, and it also works on my desktop.
In Qt, you'll want to patch the OSM Plugin used by QtLocation. It's simple, quick and easy.
How to do it?
A quick implementation could modify the QGeoTiledMappingManagerEngineOsm class to make it call your own QGeoTileFetcher instead of QGeoTileFetcherOsm.
There may be better ways to achieve this, but at least it works for me.
Basically, you make a fetcher that reads tiles from the filesystem instead of the network.
You build your filesystem database once, for instance from an online resource (see below), and you deploy it with your application for offline use.
Where do I get tiles from?
Information on how to get the tiles into your offline implementation is available here (a minimal downloader sketch follows the list of sources below):
http://wiki.openstreetmap.org/wiki/Slippy_map_tilenames
Here are two sources for tiles that can be used for free:
Open Street Maps project servers
Mapquest Open Tiles servers
Take care of the licensing and terms of use.
Open Street Map
Project: wiki.openstreetmap.org/wiki/Main_Page
License: www.openstreetmap.org/copyright
Terms of use: wiki.openstreetmap.org/wiki/Tile_usage_policy
Servers are currently named like *.tile.openstreetmap.org
MapQuest-OSM Tiles
Project: developer.mapquest.com/web/products/open/map
License: opendatacommons.org/licenses/odbl/
Terms of use: developer.mapquest.com/web/info/terms-of-use
Servers are currently named like otile*.mqcdn.com
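A minimal downloader sketch following the z/x/y naming scheme from the slippy-map page above (Node.js; the server URL, zoom level and tile ranges are placeholders, and the usage policies listed above still apply):

    // Pre-fetch a small block of tiles into a tiles/z/x/y.png folder layout so an
    // offline fetcher can later read them from disk. Zoom level and x/y ranges are
    // placeholders; respect the tile usage policy of whichever server you use.
    const fs = require('fs');
    const path = require('path');
    const https = require('https');

    function downloadTile(z, x, y) {
      const dest = path.join('tiles', String(z), String(x), `${y}.png`);
      fs.mkdirSync(path.dirname(dest), { recursive: true });
      https.get(`https://tile.openstreetmap.org/${z}/${x}/${y}.png`,
                { headers: { 'User-Agent': 'offline-map-prefetch-example' } },  // OSM asks for a UA
                res => res.pipe(fs.createWriteStream(dest)));
    }

    for (let x = 2045; x <= 2048; x++) {      // example tile range at zoom 12
      for (let y = 1360; y <= 1363; y++) {
        downloadTile(12, x, y);
      }
    }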

Is it safe to use code from code.jquery.com for long-term application?

I am using Ajax/jQuery on a webpage I am designing. In order for it to function, I include (at the top of my page) the JavaScript at: http://code.jquery.com/jquery-1.4.4.js
This works great and all, but I have a fear that:
1) the code might get changed without me knowing, and then I'd encounter problems and spend hours or days debugging before finding that the code at this site had changed
2) years from now, the website might no longer be in use, or this specific file might no longer be hosted
So would it be safer to save that JavaScript file to my own server and access it from there?
You should use either a Microsoft or Google CDN. It will be much faster, it will be cached for a lot of your users and it's guaranteed to be there, as opposed to the jQuery link you include.
http://code.jquery.com is jQuery's CDN (provided by Media Temple). The code at http://code.jquery.com/jquery-1.4.4.js will never change; jQuery will release a new version (which will be at a different URL) if anything needs to change (which happens all the time; version 1.5b was released today).
The jQuery guys know what they're doing, and they set up a CDN so people can easily link to jQuery. They're just as (un)likely to bring down their CDN as Google and Microsoft are to bring down theirs.
See http://docs.jquery.com/Downloading_jQuery for more information.
Having said that, it would seem the Google-hosted version (http://ajax.googleapis.com/ajax/libs/jquery/1.4.4/jquery.min.js) is referenced by more websites; this gives a small performance advantage for your users, as the file has a better chance of already being cached.
It's safe; notice the version number? As jQuery is updated, that version number will change.
Of course, using a CDN always means it's possible for the content delivery network to go out of business, but that's the case with any server you don't directly control (see the fallback sketch below).
You could, of course, use the Google CDN for jQuery; I highly recommend it.
Relevant:
http://code.google.com/apis/libraries/devguide.html#jquery
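If the worry is the CDN disappearing or being unreachable, a common belt-and-braces pattern is to reference the CDN first and fall back to a copy on your own server; a sketch, where the local path is an assumption:

    <script src="https://ajax.googleapis.com/ajax/libs/jquery/1.4.4/jquery.min.js"></script>
    <script>
      // If the CDN request failed, window.jQuery is undefined: load a local copy instead.
      // "/js/jquery-1.4.4.min.js" is a placeholder path on your own server.
      window.jQuery || document.write('<script src="/js/jquery-1.4.4.min.js"><\/script>');
    </script>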
