When I log in to my Gmail inbox, it starts caching the mails one by one in JavaScript.
When I click on a mail in the inbox, it doesn't send an Ajax request at that point to fetch the mail contents.
Instead, it serves them from an already cached JavaScript array.
Is there any good jQuery plugin to implement this?
I came across a few, but they don't seem to be under active development.
http://plugins.jquery.com/project/jCache
http://plugins.jquery.com/project/jCacher
Any better plugin?
Edit1:
My requirement is exactly the same as what Gmail is doing.
There is a ticket management system which shows a list of open tickets (say 100 tickets on a page), and once you click on a ticket, its details are displayed. I want to cache the details of all 100 tickets displayed on the page.
I am planning to implement the cache as an object of key-value pairs. But I am looking for a plugin which takes care of tasks like setting/getting values from the cache, auto-updating the cache periodically, etc.
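Roughly, what I have in mind is something like this minimal sketch (the endpoint name is just a placeholder):

    // Minimal sketch only - a plugin would ideally wrap something like this.
    var ticketCache = {
        data: {},

        set: function (id, details) {
            this.data[id] = details;
        },

        get: function (id) {
            return this.data.hasOwnProperty(id) ? this.data[id] : null;
        },

        // Periodically re-fetch all tickets shown on the page (hypothetical endpoint).
        startAutoRefresh: function (intervalMs) {
            var self = this;
            setInterval(function () {
                $.getJSON('/tickets/details', function (tickets) {
                    $.each(tickets, function (i, ticket) {
                        self.set(ticket.id, ticket);
                    });
                });
            }, intervalMs);
        }
    };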
Storing the data in a JS object should be enough for me. I don't see any advantage in using HTML5 local storage, as:
* no offline browsing is required,
* I want to load fresh data every time a new window is opened, and
* I won't need a huge amount of memory.
You could use some of the new HTML5 localStorage: http://diveintohtml5.ep.io/storage.html
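A minimal sketch of that approach (the key naming is just illustrative):

    // Cache a ticket's details in localStorage (details must be JSON-serializable).
    function cacheTicket(id, details) {
        localStorage.setItem('ticket:' + id, JSON.stringify(details));
    }

    // Return the cached details, or null if the ticket isn't cached yet.
    function getCachedTicket(id) {
        var raw = localStorage.getItem('ticket:' + id);
        return raw ? JSON.parse(raw) : null;
    }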
I think Google is using HTML5 local storage rather than an in-memory cache. They seem to be big fans of HTML5 and adopt features as soon as they're available. If you must use cookies, I'd recommend this one.
As Pointy suggested, the cache implementation was indeed heavily dependent on my application. Hence I have written my own code to handle this requirement. Thanks to all.
Related
I'm working on a site where we are using the slide function from jquery-ui.
The Google-hosted minified version of jquery-ui weighs 63KB - this is for the whole library. The custom download of just the slide function weighs 14KB.
Obviously, if a user has cached the Google-hosted version it's a no-brainer, but if they haven't, it will take longer to load than if I just lumped the custom jquery-ui slide function into my main.js file.
I guess it comes down to how many other sites are using jquery-ui (if this were just for the normal jquery, the above would be a no-brainer, as loads of sites use jquery, but I'm a bit unsure about the usage of jquery-ui)...
I can't work out what the best thing to do is in the above scenario.
I'd say if the custom selective build is that small, both absolutely and relatively, there's good reason to choose that path.
Loading a JavaScript resource has several implications, in the following order of events:
Loading: request/response communication or, in the case of a cache hit, fetching from the cache. Keep in mind that, CDN or not, the communication only affects the first page load. If your site is built in a traditional "full page request" style (as opposed to SPAs and the like), this literally becomes a non-issue.
Parsing: The JS engine needs to parse the entire resource.
Executing: The JS engine executes the entire resource. That means that any initialization / loading code is executed, even if that's initialization for features that aren't used in the hosting page.
Memory usage: The memory usage depends on the entire resource. That includes static objects as well as functions (which are also objects).
With that in mind, having a smaller resource is advantageous in ways beyond simple loading. More so, a request for such a small resource is negligible in terms of communication. You wouldn't even think twice about it had it been a mini version of the company logo somewhere on the bottom of the screen where nobody even notices.
As a side note and potential optimization, if your site serves any proprietary library, or a group of less common libraries, you can bundle all of these together, including the jQuery UI subset, and your users will only have a single request, again making this advantageous.
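For example, a rough Node.js sketch of such a bundling step (the file names are purely illustrative):

    var fs = require('fs');

    // Concatenate the jQuery UI subset and the site's own scripts into one file,
    // so the page only makes a single request for all of them.
    var parts = [
        'js/jquery-ui.custom.min.js',
        'js/proprietary-lib.js',
        'js/site-main.js'
    ];

    var bundle = parts.map(function (path) {
        return fs.readFileSync(path, 'utf8');
    }).join(';\n');

    fs.writeFileSync('js/bundle.js', bundle);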
Go with the Google hosted version
It is likely that the user would have recently visited a website that loads jQuery-UI hosted on Google servers.
It will take load off your server and make other elements load faster.
Browsers only download a limited number of resources from one domain at a time. Loading jQuery-UI from Google's servers ensures it is downloaded concurrently with other resources that reside on your servers.
The Yahoo developer network recommends using a CDN. Their full reasons are posted here.
https://developer.yahoo.com/performance/rules.html
This quote from their site really seals it in my mind.
"Deploying your content across multiple, geographically dispersed servers will make your pages load faster from the user's perspective."
I am not an expert, but here are my two cents anyway. With a CDN you can be sure of reduced latency, plus, as mentioned, the user is most likely to have already picked it up from some other website that uses the Google-hosted copy. Also, the thing I always care about: it saves bandwidth.
Two popular WordPress plugins, Super Cache and Contact Form 7 - there are still some issues expected around the _nonce AJAX calls. How is it possible to make them work together smoothly? There are definitely situations where you want to keep ALL your posts (pages) super-cached and not served dynamically without caching, while preserving and providing the contact form functionality.
With "Cache rebuild" unchecked in the settings, Super Cache behaves as follows:
* standard pages will NOT force a new cache file to be generated (i.e. an existing supercache file, if there is one, really is served without triggering the generation of a new cache file);
* pages with a wpcf7 form included WILL force/trigger regeneration of the supercache file while serving the initial one for the request. I think that is the point where things go wrong.
Question: how can I stop the unwanted regeneration of pages with a wpcf7 form included? Based on the docs on the wpcf7 plugin homepage, my expectation is that the wpcf7 form has evolved to meet these AJAX-call requirements, such as the _nonce calls. Any idea how to resolve this? Should it be:
* hard-coded (e.g. excluding the _nonce call and/or URL parameter) within the wpcf7 form (this is against WP standards), or
* handled via settings in Super Cache?
* Any other idea, solution or alternative?
In this thread wpcf7 author says:
Contact Form 7 uses a WP nonce as the secret key. As the nonce life is 24 hours by default, if the cache clears in less than 24 hours, the two plugins should work without problems, even if the page is cached.
I'd like to implement a visual indicator, for various sections of my app, of items whose status is pending, similar to Facebook's / Google Plus's unread notification indicator... I have written an API to fetch the count to be displayed, but I am stuck on updating it every time an item gets added or deleted. I can think of two approaches, neither of which I am satisfied with: the first is making an API call for the count whenever a POST or DELETE operation is performed; the second is refreshing the page after some time span...
I think there should be a much better way of doing this from the server side. Any suggestions, or any gem to do so?
Even in Gmail it is refreshed on client request. The server calculates the number of new items, and the client initiates a request (probably with AJAX). This requires an almost negligible amount of data and processing time, so you can probably get away with it. Various cache gems can even store the refreshed part of the page if no data has changed since the last request, which ensures you only recalculate when something has actually changed.
UPDATE:
You can solve the problem in basically two ways: server-side push, or a client-side query. Push is problematic for various reasons and rarely used in a web environment, at least as far as I know. Most of these pages (if not all) use a timed query to refresh such information. You can check this with the right tool, like Firebug for Firefox; you will see the individual requests initiated towards the server.
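As a rough illustration of that timed-query approach (the endpoint, element id and interval are assumptions, not from any particular gem):

    // Ask the server only for the count of pending items and update the badge.
    function refreshPendingCount() {
        $.getJSON('/pending_items/count', function (response) {
            $('#pending-count').text(response.count);
        });
    }

    setInterval(refreshPendingCount, 30000); // e.g. poll every 30 seconds
    refreshPendingCount();                   // and once on page load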
When you fire a request through AJAX, the server replies. Normally it generates a page fragment to replace the old content with the new, but a cache mechanism can intervene, and if nothing has changed, you may get the previously stored cache fragment. See the tutorials here for various gems; one of them may fit your needs.
If you would prefer a complete solution, check out Faye (tutorial here). I haven't used it, but it may be worth a try; it seems simple enough.
Looking at a lot of web applications (websites/services/whatever) that have a 'streaming' component (typically this is a 'Social' app): think Facebook's 'Wall', Twitter's 'Feed', LinkedIn's 'News Feed'.
They have a pretty similar characteristic: a notice of new items is added to the page (automatically, presumably via a background Ajax call), but the new HTML representing the newest feed items isn't loaded into the page until the user clicks this update link.
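(In rough code terms, the pattern boils down to something like this sketch; the endpoint names are just placeholders.)

    var lastSeenId = 0; // assumed to be tracked as items are rendered

    // Background poll: fetch only the *count* of new items, not the items.
    setInterval(function () {
        $.getJSON('/feed/new_count', { since: lastSeenId }, function (res) {
            if (res.count > 0) {
                $('#new-items-notice').text(res.count + ' new items').show();
            }
        });
    }, 60000);

    // The heavier HTML is only requested when the user clicks the notice.
    $('#new-items-notice').click(function () {
        $.get('/feed/items', { since: lastSeenId }, function (html) {
            $('#feed').prepend(html);
            $('#new-items-notice').hide();
        });
    });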
I guess I'm curious whether this design decision is for any of the following reasons, and if so: could anyone who has worked on one of these types of apps explain the reasoning they found for doing it this way:
* User experience: updates for a large number of 'Facebook Friends' or 'Pages' or 'Tweets' would move too quickly for anyone to absorb and read with any real intent, so the page isn't refreshed automatically.
* Client-side performance: fetching a simple 'count' of updates requires less bandwidth (less load time) and less JS running to update the page for anyone who has the site open, and thus a lighter-weight feel on the client side.
* Server-side performance: fewer requests coming into the server to gather more information about recent updates (less outgoing bandwidth, more free cycles to spend grabbing information for those who do request it by clicking the link). While I'm sure the owners of these websites aren't 'short on resources', if everyone who had Twitter or Facebook open in the browser got a full update fetched from the server every time one was created, I'm sure it would be a much more significant drag on resources.
* They are actually trying to save resources (it takes a cup of coffee to perform a Google search (haha)), and sending a few bytes of data to the page representing the count of new updates is a lot lighter of a load on applications that are being used simultaneously in hundreds and thousands of browser windows (not to mention API requests).
I have a few more questions depending on the answer to this first question as well...so I'll probably add those here or ask another question!!
Thanks
P.S. This question got trolled off of the 'Web Applications' site -- so I brought my questions here, where they're not too 'broad' or 'off-topic' (-8
Until the recent UI changes to Facebook, they did auto-load new content. It was extremely frustrating from a user perspective, as you'd be reading through the list of your friends' posts and all of a sudden everything would shift and you'd have no idea where the post you were just reading went.
I'd imagine this is the main reason.
I'm still pretty new to AJAX and JavaScript, but I'm getting there slowly.
I have a web-based application that relies heavily on MySQL; there are individual user accounts that are accessed, and the UI is populated with user-specific data.
I'm working on getting rid of a tabbed navigation bar that currently loads new pages because all that changes from page to page is information within one box.
The thing is that box needs to reload info from the database, etc.
I have had great help from users here showing me that I need to query the database within the PHP page that Ajax is calling.
OK, so pardon the lengthy intro. What I'm wondering is: are there any specific limitations to what Ajax can call that I need to know about? For example, someone mentioned that it's best not to call script files, and that I should remove scripts from the PHP page being called and keep those in the 'parent' page. Are there any other things like this I need to keep in mind?
To clarify: I'm not looking to discuss the merits/drawbacks of the technology. I'm wondering about specific coding implementation details that I need to be aware of (for example, I didn't realize until yesterday that even if I had established a MySQL connection on the page, I would need to re-establish that connection in my called page as well... makes perfect sense now).
XMLHttpRequest, which powers Ajax, has a number of limitations. I recommend brushing up on the same-origin policy. This is a pivotal rule because it limits where AJAX calls can be made.
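For example (the URLs are illustrative): a request to your own domain works, but a request to a different origin is blocked by the browser unless the remote server opts in (CORS) or you use a workaround such as JSONP:

    // Same origin: allowed.
    $.get('/user/details.php', function (html) {
        $('#info-box').html(html);
    });

    // Different origin: the same-origin policy stops a plain XMLHttpRequest,
    // so this success callback will never run without CORS or JSONP.
    $.get('http://other-domain.example/data.php', function (html) {
        $('#info-box').html(html);
    });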
First, be careful about JavaScript embedded in the HTTP response to an AJAX call: depending on how you insert the response, it may not be executed at all, and evaluating returned code yourself raises security issues.
No mention of the dynamics of the database, but if the data to be displayed in tabs doesn't have to be real-time, why not cache it server-side?
I find that, like any other technique, Ajax works best under tightly controlled conditions. It wouldn't make much sense for updating nearly the whole page, unless you find that the user experience is improved with an on-page 'loader'. Without going into workarounds, the disadvantages include losing the browser back button / history, issues such as the one your friend mentioned, problems with embedded resources and other rich content, and simply having an extra layer of complexity to deal with in your app. Don't treat it as magic sauce for your app - make sure every use delivers specific results that benefit your client / audience.
IMHO, it's best to put your client-side JavaScript in a separate file and then import it - a neater container. One thing I've faced before is getting XML back that contains code to run, such as more JavaScript - it's worth checking early on whether this is likely and avoiding it, rather than having to resort to evals.