Google AMP Cache - how to force loading index.html from cache?

Is there any way to force the main homepage (index.html) to load from the AMP Cache?
I have all images loading from the cache according to the manual: https://developers.google.com/amp/cache/overview
But in the DevTools audit there is still an error for the homepage (it is not being served over HTTP/2, i.e. from the cache).

I'm not sure exactly what you mean, but I think you may be misunderstanding the point of the AMP cache.
The Google AMP Cache is not like a CDN (Content Delivery Network) that always sits in front of your site, though in certain instances it acts like one.
The Google AMP Cache is automatically populated by Google when it crawls your site. Any searches on Google while on mobile will then serve your AMP pages, rather than your normal pages, and will also serve them from the Google AMP cache rather than from your domain. This is done for a number of reasons, but primarily to create the “instant loading” effect that AMP gives when loaded from Google Search results (aka Search Engine Results Page or SERP). In this case the whole page including the index page is served from the Google AMP Cache.
Other sites and domains can also decide to display AMP pages instead of your HTML pages if they want, and can decide to serve them from the Google AMP cache, from their own AMP cache (though, other than Google, only Cloudflare have implemented their own AMP Cache AFAIK) or directly from your home page (in which case there is no cache used). Twitter for example automatically replaces links with their AMP equivalents but loads from the real domain so is fast (due to AMP) but not “instant” (like it is in the Google Search Results).
So you, as a site owner, don't decide when to use the AMP Cache - the calling application (e.g. Google SERPs, Twitter) decides that. And if the calling app/page doesn't use an AMP Cache, then the page is served directly from your domain, and therefore over whatever technology your domain supports (e.g. HTTP/1.1 or HTTP/2). You can of course give out the AMP Cache URL instead of your real one if you want.
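For reference, a Google AMP Cache URL generally looks like this (example.com is a placeholder; /c/ marks an AMP document and the extra /s/ means the origin is served over HTTPS):

```
https://example-com.cdn.ampproject.org/c/s/example.com/index.html
```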
You seem to suggest you have altered your page to replace all images and the like with references to the AMP cache - is that so? If so, that sounds like a bad idea, as the cache is populated from your site, which now depends on the cache, which is populated from your site, which is... etc.

Related

How to use Google AMP Cache for my AMP Website

How do I use or set up a Google AMP cache for my AMP website? I have seen many sites serving their requests from google.com servers. How is that possible for me?
To make Google serve your pages as AMP pages, the pages must be rewritten or modified to fit the AMP requirements.
AMP pages are stripped-down versions of web pages, with some restrictions compared to ordinary websites. The HTML markup is slightly different, and there are other restrictions, such as custom CSS being limited in size (50KB at the time of writing). Those pages are separately hosted, and as soon as Google has crawled them, they may be included in the AMP mobile search results.
See the AMP documentation for the details on how to implement it.
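For illustration, a minimal AMP page looks roughly like this (the canonical URL is a placeholder; the cdn.ampproject.org script is the required AMP runtime):

```html
<!doctype html>
<html ⚡>
<head>
  <meta charset="utf-8">
  <meta name="viewport" content="width=device-width,minimum-scale=1,initial-scale=1">
  <link rel="canonical" href="https://example.com/article.html">
  <title>Hello AMP</title>
  <!-- the mandatory amp-boilerplate <style> is omitted here for brevity;
       copy it verbatim from the AMP documentation -->
  <script async src="https://cdn.ampproject.org/v0.js"></script>
</head>
<body>
  <h1>Hello, AMP world</h1>
</body>
</html>
```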

How can we speed up a Joomla webpage

Using the Joomla System - Page Cache plugin, my webpage now loads in around 4-5 seconds.
But I have a few pages which are shown only to registered users, and I just checked that they take around 10-15 seconds. When I inspected with Chrome I could see a few things: I have a live chat widget, which takes around 2 seconds, among other things. But the live chat also appears on the homepage, and that page is fast.
I wanted to know whether the Joomla System Cache plugin does not work for pages visible to registered users, or whether there is any other plugin I can use to speed up this type of page.
Joomla has a plugin called JCH Optimize that will decrease your website's load time.
It compresses all CSS and JS files into one file, and that file is stored in the cache, so the website speeds up.
This plugin should be helpful to you.
Thanks
Are you using Joomla? Some components have problems with the page cache, so you may need to clear the cache.
To speed up the Joomla site, follow these basic steps:
Enable Gzip compression
Using the Gzip compression feature, you can compress your website pages before sending them to the user; they are then decompressed by the user's browser, and this takes less time than transferring the uncompressed pages (a sample server configuration is shown after this list).
Enable the cache system
Optimization settings (images, CSS, JavaScript...)
Now check your Joomla website speed
These steps may be useful to speed up your site.
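Gzip can be enabled in Joomla itself (Global Configuration > Server > Gzip Page Compression), or at the web server level. A minimal sketch for Apache, assuming mod_deflate is available (the MIME type list is illustrative):

```
# .htaccess - compress common text-based responses with mod_deflate
<IfModule mod_deflate.c>
  AddOutputFilterByType DEFLATE text/html text/plain text/css
  AddOutputFilterByType DEFLATE application/javascript application/json
</IfModule>
```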
"Wanted to know whether the Joomla System Cache plugin does not work for pages visible to registered users."
Per the Joomla! Documentation, Page Caching:
Only caches pages for guest visitors (not for logged in visitors)
"Or is there any other plugin I can use to speed up this type of page?"
Aside from JCH Optimize (which was already mentioned), another component I recommend is JotCache, which is far better than just the Joomla! default cache.
You may also use GTmetrix to analyze your site against both Google PageSpeed and Yahoo! YSlow.
Finally, you may try using a CDN to speed up resource delivery. Here are a few:
MaxCDN
Amazon CloudFront
Azure CDN
CDN77
CDNetworks
CDNlion
CacheFly
EdgeCast Networks
KeyCDN
SkyparkCDN
You can use CDN for Joomla! to incorporate the CDN technology.
Overall, your best bet is going to be a combination of the CDN and JCH settings to trim down the overall weight of the site, using GTmetrix to compare the site after each change.
Further reading: Joomla Performance & Speed

Use Google-hosted jQuery UI or self-host a custom download of jQuery UI?

I'm working on a site where we are using the slide function from jQuery UI.
The Google-hosted minified version of jQuery UI weighs 63KB; this is for the whole library. The custom download of just the slide function weighs 14KB.
Obviously if a user has already cached the Google-hosted version it's a no-brainer, but if they haven't it will take longer to load than if I just lumped the custom jQuery UI slide function inside my main.js file.
I guess it comes down to how many other sites use jQuery UI (if this were just for normal jQuery, the above would be a no-brainer, as loads of sites use jQuery, but I'm a bit unsure about the usage of jQuery UI)...
I can't work out what's the best thing to do in the above scenario.
I'd say if the custom selective build is that small, both absolutely and relatively, there's a good reason to choose that path.
Loading a JavaScript resource has several implications, in the following order of events:
Loading: Request / response communication or, in the case of a cache hit, fetching from the cache. Keep in mind that, CDN or not, the communication only affects the first page. If your site is built in a traditional "full page request" style (as opposed to SPAs and the like), this literally becomes a non-issue.
Parsing: The JS engine needs to parse the entire resource.
Executing: The JS engine executes the entire resource. That means that any initialization / loading code is executed, even if that's initialization for features that aren't used in the hosting page.
Memory usage: The memory usage depends on the entire resource. That includes static objects as well as functions (which are also objects).
With that in mind, having a smaller resource is advantageous in ways beyond simple loading. More so, a request for such a small resource is negligible in terms of communication. You wouldn't even think twice about it had it been a mini version of the company logo somewhere on the bottom of the screen where nobody even notices.
As a side note and potential optimization, if your site serves any proprietary library, or a group of less common libraries, you can bundle all of these together, including the jQuery UI subset, and your users will only have a single request, again making this advantageous.
Go with the Google hosted version
It is likely that the user would have recently visited a website that loads jQuery-UI hosted on Google servers.
It will take load off your server and make other elements load faster.
Browsers only load a fixed number of resources concurrently from one domain. Loading jQuery UI from Google's servers will make sure it is downloaded in parallel with the other resources that reside on your servers.
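If you do go the hosted route, a common defensive pattern (a sketch; the version numbers and local fallback paths are placeholders) is to load from Google's CDN and fall back to a self-hosted copy if the CDN is unreachable:

```html
<script src="https://ajax.googleapis.com/ajax/libs/jquery/1.11.1/jquery.min.js"></script>
<script src="https://ajax.googleapis.com/ajax/libs/jqueryui/1.11.1/jquery-ui.min.js"></script>
<script>
  // If the CDN copies failed to load, fall back to local copies
  window.jQuery || document.write('<script src="/js/jquery.min.js"><\/script>');
  (window.jQuery && jQuery.ui) || document.write('<script src="/js/jquery-ui.min.js"><\/script>');
</script>
```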
The Yahoo developer network recommends using a CDN. Their full reasons are posted here.
https://developer.yahoo.com/performance/rules.html
This quote from their site really seals it in my mind.
"Deploying your content across multiple, geographically dispersed servers will make your pages load faster from the user's perspective."
I am not an expert, but here are my two cents anyway. With a CDN you can be sure that there is reduced latency; plus, as mentioned, the user is most likely to have already picked it up from some other website that uses the Google-hosted copy. Also, the thing I always care about: it saves bandwidth.

Using Application Cache with online sites

I tried to use the HTML5 Application Cache to improve the performance of online sites (all the tutorials on the web only cover usage with offline apps).
I created the manifest listing all the JS, CSS and images, and the performance was really exciting, until I found that even the HTML page was cached, despite not being listed in the manifest.
The pages of the site are in PHP, so I don't want them to be cached.
From http://www.whatwg.org/specs/web-apps/current-work/multipage/offline.html :
Authors are encouraged to include the main page in the manifest also, but in practice the page that referenced the manifest is automatically cached even if it isn't explicitly mentioned.
Is there a way to have this automatic caching disabled?
Note:
I know that caching can be controlled via HTTP headers, but I just wanted to try this way as it looks quite reliable, clean and powerful.
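There is no manifest setting that disables this: the page that references the manifest always becomes a "master entry". A workaround that is often suggested (a sketch, with hypothetical file names) is to reference the manifest only from a static stub page loaded in a hidden iframe, so that the stub, not your PHP pages, becomes the master entry:

```html
<!-- in your PHP pages: no manifest attribute, so they are not auto-cached -->
<iframe src="/cache-loader.html" style="display:none"></iframe>

<!-- /cache-loader.html: a static stub whose only job is to reference the manifest -->
<!doctype html>
<html manifest="/site.appcache">
<head><title>cache loader</title></head>
<body></body>
</html>
```

/site.appcache would then list only the static assets:

```
CACHE MANIFEST
# static assets only; the HTML/PHP pages stay out of the cache
/css/main.css
/js/main.js
/img/logo.png

NETWORK:
*
```

Note that, per the spec's networking model, only documents associated with a cache load their subresources from it, so test in your target browsers whether this actually speeds up the PHP pages.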

Google AdSense bot's algorithm and behavior

I am interested in the Google AdSense bot's algorithm and its behavior with a web site. I have not worked with AdSense and I do not have an account, so I need your help to understand:
1) Googlebot downloads all pages from the web site from time to time. Am I right?
2) Googlebot does not understand dynamic content (loaded by AJAX), so I must generate static content and return it within the HTML page, and these pages must show identical content to all users and to Googlebot?
3) Because of (1) and (2) I cannot use only the root path http://example.com with some "main" widget; I must generate unique pages, for example http://example.com/thread?id=101?
4) Googlebot downloads pages (1) to grab (index) keywords from them and then stores this information on its servers, for example as key/value pairs (where the key is the page path and the value is a tag cloud). Am I right?
5) When the web site is opened in a browser by a user, the integrated AdSense HTML code loads some JavaScript. As I understand from googling, this JavaScript does not index the page but makes a call (with some parameter, key == page_path) to Google's server and gets the appropriate ad links, then shows these ad links in its frame. Is that the right behavior? Or does the JavaScript do some local indexing of the page's content?
6) How do Googlebot and the AdSense JavaScript work with cookies? As I understand it, AdSense can use cookies to show appropriate ad links. If that is right, please give me some use cases ;)
I know that the "true" algorithm is known only to engineers at Google, but some of you have had experience with AdSense and the AdSense HTML/JavaScript, so please correct my picture of it ;)
Thank you very much for any advice!
P.S. This question is very important to me. It is not just a question for fun, so please do not close it ;)
1) Yes, if Googlebot can access the pages and if it knows about them through a link, XML Sitemaps, Google +1, etc.
2) Googlebot will now make AJAX / XHR requests to understand AJAX content (http://googlewebmastercentral.blogspot.com/2011/11/get-post-and-safely-surfacing-more-of.html).
Yes, you should show the same content to Googlebot as you would to users; otherwise this would be considered cloaking, which is against their guidelines.
3) This question isn't clear, but basically it's preferable to have the URL change, because Google will then know how to index the content separately. If you're using AJAX then you might want to consider permalinks like you suggested, or you can use the HTML5 History API (pushState), as sketched below.
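A minimal sketch of that approach (loadThread, the #main container and the /thread?id= URL scheme are all illustrative; the server must also render those URLs as full HTML so the crawler can index them):

```html
<script>
  // Fetch thread content and inject it into the page
  function loadThread(id) {
    var xhr = new XMLHttpRequest();
    xhr.open('GET', '/thread?id=' + id);
    xhr.onload = function () {
      document.getElementById('main').innerHTML = xhr.responseText;
    };
    xhr.send();
  }

  // Navigate: load the content AND give it its own crawlable URL
  function showThread(id) {
    loadThread(id);
    history.pushState({ id: id }, '', '/thread?id=' + id);
  }

  // The back/forward buttons fire popstate; reload the matching content
  window.addEventListener('popstate', function (e) {
    if (e.state && e.state.id) loadThread(e.state.id);
  });
</script>
```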
4) Yes, Google will index the words on the page. I'm not certain they store it as a key/value pair. I'm not even sure if they're still using Bigtable (http://labs.google.com/papers/bigtable.html)... but it's likely they use Bigtable or a similar system to store the inverted index.
5) The AdSense code is embedded JavaScript. For new webpages that Google hasn't seen before, it tries to deliver the most relevant ads based on the information it has found on the web about the site, or possibly through the anchor text of links pointing to that page. However, to get a more accurate understanding of the content of the page, Google will send an AdSense-specific bot to crawl your page; sometimes you'll see it come very fast, even as soon as you load the page for the first time. It uses a different user agent than the traditional Googlebot; you can find all the user agents from Google here: http://www.google.com/support/webmasters/bin/answer.py?answer=1061943
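The AdSense crawler identifies itself as Mediapartners-Google, so it can be addressed separately in robots.txt. A sketch (the /private/ path is illustrative):

```
# block general crawlers from /private/
User-agent: *
Disallow: /private/

# ...but let the AdSense crawler read it so ads there can still be targeted
User-agent: Mediapartners-Google
Disallow:
```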
6) Google's crawlers don't accept cookies and won't pass cookies back to your server. This has to do with the massively distributed nature of Google's crawlers, which makes maintaining cookies or sessions extremely difficult.
