I tried to use the HTML5 Application Cache to improve the performance of online sites (all the tutorials on the web only cover usage with offline apps).
I created a manifest listing all the JS, CSS, and images, and the performance was really exciting, until I found that even the page's HTML was cached, despite not being listed in the manifest.
The pages of the site are in PHP, so I don't want them to be cached.
From http://www.whatwg.org/specs/web-apps/current-work/multipage/offline.html :
Authors are encouraged to include the main page in the manifest also, but in practice the page that referenced the manifest is automatically cached even if it isn't explicitly mentioned.
Is there a way to have this automatic caching disabled?
Note:
I know that caching can be controlled via HTTP headers, but I just wanted to try this way as it looks quite reliable, clean and powerful.
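A workaround often suggested for exactly this situation (a sketch; the loader file name is an assumption) is to keep the manifest attribute off the PHP pages entirely and reference it from a stub page loaded in a hidden iframe. Everything listed in the manifest still gets cached, but the outer PHP page never becomes an implicitly cached master entry:

```html
<!-- In each PHP page: no manifest attribute on <html>, so the page itself
     is never implicitly added to the application cache -->
<iframe src="/cache-loader.html" style="display:none" width="0" height="0"></iframe>

<!-- cache-loader.html: its only job is to reference the manifest; only this
     stub (plus everything listed in the manifest) ends up in the appcache -->
<!DOCTYPE html>
<html manifest="/cache.manifest">
  <head><title>cache loader</title></head>
  <body></body>
</html>
```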
Related
On The Net Ninja YouTube channel, I see the Ninja has disabled the cache in his Laravel 6 tutorial. Just curious: what's the benefit over enabling it when coding?
So that changes made to resources loaded into the page, such as images, stylesheets, and scripts, are always refreshed and reloaded when the page itself is reloaded (so you don't need to remember to press Shift+F5 or Ctrl+Shift+R).
When caching is enabled, the browser may prefer its cached versions, which may be stale, instead of always using the latest-built assets.
However, disabling caching is unnecessary if you use a content-addressing scheme for off-page resources, i.e., the URI of a script file or stylesheet includes a hash of its content (SHA-256, etc.), as sketched below.
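As a rough sketch of what that looks like at build time (Node, with hypothetical file paths): hash the asset's bytes and put the hash in the file name, so the URL changes exactly when the content does and a cached copy can never be stale:

```js
// build-step sketch: content-address a built script so its URL changes
// whenever its bytes change (paths are illustrative)
const crypto = require('crypto');
const fs = require('fs');

const source = fs.readFileSync('dist/app.js');
const hash = crypto.createHash('sha256').update(source).digest('hex').slice(0, 16);
fs.copyFileSync('dist/app.js', `dist/app.${hash}.js`);

// the page then links <script src="/dist/app.<hash>.js">; that URL can be
// cached forever, because a changed file is published under a new name
console.log(`dist/app.${hash}.js`);
```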
Serving uncached ("fresh") pages to crawlers like Googlebot is a default DNN behavior, according to DNN. Many of our website's pages are heavy and we utilize DB caching extensively.
What setting can I tweak to serve cached pages to crawlers? I could not find anything in web.config or rules.config that seemed related to that. Is it something I need to add?
I found no documentation when googling for it. Please help. Thanks!
There is nothing in the DNN Platform that serves content to crawlers differently from regular visitors, so crawlers should be seeing the same thing everyone else sees.
I'm working on a site where we are using the slide function from jquery-ui.
The Google-hosted minified version of jquery-ui weighs 63KB - this is for the whole library. The custom download of just the slide function weighs 14KB.
Obviously if a user has already cached the Google-hosted version it's a no-brainer, but if they haven't, it will take longer to load than if I just lumped the custom jQuery UI slide function into my main.js file.
I guess it comes down to how many other sites are using jQuery UI (if this were just plain jQuery, the above would be a no-brainer, as loads of sites use jQuery, but I'm a bit unsure about the usage of jQuery UI)...
I can't work out what the best thing to do is in the above scenario.
I'd say if the custom selective build is that small, both absolutely and relatively, there's good reason to choose that path.
Loading a JavaScript resource has several implications, in the following order of events:
Loading: request/response communication or, in the case of a cache hit, a fetch from cache. Keep in mind that, CDN or not, the communication only affects the first page load. If your site is built in a traditional "full page request" style (as opposed to SPAs and the like), this literally becomes a non-issue.
Parsing: The JS engine needs to parse the entire resource.
Executing: The JS engine executes the entire resource. That means that any initialization / loading code is executed, even if that's initialization for features that aren't used in the hosting page.
Memory usage: The memory footprint depends on the entire resource. That includes static objects as well as functions (which are also objects).
With that in mind, having a smaller resource is advantageous in ways beyond simple loading. Moreover, a request for such a small resource is negligible in terms of communication. You wouldn't even think twice about it had it been a mini version of the company logo somewhere at the bottom of the screen where nobody even notices.
As a side note and potential optimization: if your site serves any proprietary library, or a group of less common libraries, you can bundle all of these together, including the jQuery UI subset, and your users will only make a single request, again making this advantageous (see the sketch below).
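A minimal sketch of that bundling step (Node, file names hypothetical): concatenate the custom jQuery UI build with your own scripts so the page fetches all of them in one request:

```js
// bundling sketch: join the custom jQuery UI subset and the site's own
// scripts into one file; the ';' guards against scripts that omit a
// trailing semicolon
const fs = require('fs');

const parts = ['js/jquery-ui.slider.min.js', 'js/main.js'];
const bundle = parts.map((p) => fs.readFileSync(p, 'utf8')).join('\n;\n');
fs.writeFileSync('js/bundle.js', bundle);
```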
Go with the Google hosted version
It is likely that the user would have recently visited a website that loads jQuery-UI hosted on Google servers.
It will take load off your server and make other elements load faster.
Browsers only load a fixed number of resources concurrently from one domain. Loading jQuery UI from Google's servers makes sure it is downloaded in parallel with the resources that reside on your own servers.
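If you go this route, a common belt-and-braces pattern (a sketch; the version number and local path are assumptions, and jQuery itself is assumed to be loaded earlier in the page) is to load from the Google CDN and fall back to your own custom build if that fails:

```html
<script src="https://ajax.googleapis.com/ajax/libs/jqueryui/1.12.1/jquery-ui.min.js"></script>
<script>
  // if the CDN copy failed to load, jQuery.ui won't exist; fall back to
  // the locally hosted custom build (path is illustrative)
  if (!window.jQuery || !window.jQuery.ui) {
    document.write('<script src="/js/jquery-ui.custom.min.js"><\/script>');
  }
</script>
```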
The Yahoo developer network recommends using a CDN. Their full reasons are posted here.
https://developer.yahoo.com/performance/rules.html
This quote from their site really seals it in my mind.
"Deploying your content across multiple, geographically dispersed servers will make your pages load faster from the user's perspective."
I am not an expert, but here are my two cents anyway. With a CDN you can be sure of reduced latency, plus, as mentioned, the user has most likely already picked it up from some other website that loads it from Google. Also, the thing I always care about: it saves bandwidth.
Now that I've successfully cached my web page, how do I uncache it after making a change?
My users can't download the latest version, even after I've changed a comment in my cache.manifest file.
My server is an IIS server.
The thing with caching is, well, stuff gets cached. Browsers won't, in general, try to download anything you've told them to cache until the cached items expire.
If you set everything to cache for a certain time span, the browser won't try to download any of the cached items until the end of it, which includes the cache.manifest file itself, by the sound of it.
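On IIS 7 and later, a sketch of the usual fix (the path and settings below are illustrative) is to make sure the manifest is served with the cache-manifest MIME type and is itself never cached, so the browser re-checks it on every visit and picks up your comment change:

```xml
<!-- web.config sketch: register the cache-manifest MIME type and disable
     client caching for the manifest file itself -->
<configuration>
  <location path="cache.manifest">
    <system.webServer>
      <staticContent>
        <remove fileExtension=".manifest" />
        <mimeMap fileExtension=".manifest" mimeType="text/cache-manifest" />
        <clientCache cacheControlMode="DisableCache" />
      </staticContent>
    </system.webServer>
  </location>
</configuration>
```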
Typically, you don't want to cache the content of the website, because that makes it hard to change. Instead, you want to cache the various pieces, like images, CSS, and JavaScript, that the various pages of your site need. If you do this right, you can get a huge benefit for your users and still keep control over those resources, since you can always link to a different version of a particular resource from the content of the pages.
That said, if you do need to cache some portions of your pages, you can use server-side caching to reuse portions that are expensive to put together.
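As a minimal sketch of that server-side idea (JavaScript here, since the question doesn't say what the server runs; all names are hypothetical): rebuild an expensive page fragment at most once per time window and reuse the stored HTML otherwise:

```js
// in-memory fragment cache sketch: reuse an expensive page fragment for
// up to ttlMs milliseconds instead of rebuilding it on every request
const cache = new Map();

function cachedFragment(key, ttlMs, build) {
  const hit = cache.get(key);
  if (hit && Date.now() - hit.at < ttlMs) return hit.html; // fresh enough: reuse
  const html = build();                                    // the expensive work
  cache.set(key, { html, at: Date.now() });
  return html;
}

// usage: embed the fragment while rendering the page
// const sidebar = cachedFragment('sidebar', 5 * 60 * 1000, renderSidebar);
```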
I'm trying to come up with ways to speed up my secure web site. Because there are a lot of CSS images that need to be loaded, it can slow down the site, since secure resources are not cached to disk by the browser and must be retrieved more often than they really need to be.
One thing I was considering is perhaps moving style-based images and javascript libraries to a non-secure sub-domain so that the browser could cache these resources that don't pose a security risk (a gradient isn't exactly sensitive material).
I wanted to see what other people thought about doing something like this. Is this a feasible idea or should I go about optimizing my site in other ways like using CSS sprite-maps, etc. to reduce requests and bandwidth?
Browsers (especially IE) get jumpy about this and alert users that there's mixed content on the page. We tried it and had a couple of users call in to question the security of our site. I wouldn't recommend it. Having users lose their sense of security when using your site is not worth the added speed.
Do not mix content; there is nothing more annoying than having to go and click the Yes button on that dialog. I wish IE would let me always select "show mixed content" for sites. As Chris said, don't do it.
If you want to optimize your site, there are plenty of ways; if SSL is the only bottleneck left, buy a hardware accelerator. Hmm, if you load an image using HTTP, will it be cached when you load it with HTTPS? Just a side question that I need to go find out.
Be aware that in IE 7 there are issues with mixing secure and non-secure items on the same page, so this may result in some users not being able to view all the content of your pages properly. Not that I endorse IE 7, but recently I had to look into this issue, and it's a pain to deal with.
This is not advisable at all. The reason browsers give you such trouble about insecure content on secure pages is that it exposes information about the current session and leaves you vulnerable to man-in-the-middle attacks. I'll grant there probably isn't much a third party could do to sniff sensitive info if the only insecure content is images, but CSS can contain references to JavaScript/VBScript via behavior files (IE). If your JavaScript is served insecurely, there isn't much that can be done to prevent a rogue script from scraping your web page at an inopportune time.
At best, you might be able to get away with iframing the secure content to keep the look and feel. As a consumer I really don't like it, but as a web developer I've had to do that before, having no other pragmatic options. But, frankly, there are just as many defects with that approach, if not more: after all, you're hoping that nothing violates the integrity of the insecure content, so that it hosts the genuine secure content and not something substituted.
It's just not a great idea from a security perspective.
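If the underlying problem is hard-coded http:// asset URLs on an https page, scheme-relative references are one alternative worth noting (a sketch, not from the answers above; the host name is hypothetical): the browser fetches the asset over whatever scheme the page itself used, so a secure page never pulls in insecure content:

```html
<!-- scheme-relative URLs: fetched over https when the page is https,
     over http when it isn't, so no mixed-content warning either way -->
<link rel="stylesheet" href="//static.example.com/site.css">
<img src="//static.example.com/gradient.png" alt="">
```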