How to cache all pages before users visit the website in CakePHP - caching

How can I cache every page before a user visits my website?
Can you suggest how to do that?
Please be specific, because I don't know much about caching; I only know how to use it in CakePHP.

This might be just what you are looking for:
https://github.com/mcurry/html_cache
It will serve up an HTML-cached version of a page without hitting CakePHP or even PHP, which can make it thousands of times faster than going through Cake. See this link as well, particularly item #4:
http://www.pseudocoder.com/archives/2009/03/17/8-ways-to-speed-up-cakephp-apps/
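Conceptually, the trick is to write the finished HTML to disk so the web server can serve it as a static file. A rough sketch of the idea in plain PHP (this is not the plugin's actual code; render_page() is a hypothetical stand-in for the framework's rendering step):

<?php
// After the framework renders a page, write the finished HTML into the
// public webroot so the web server can serve the static file directly on
// the next request, without invoking PHP at all.
$html = render_page(); // hypothetical: the framework's rendered output
$path = parse_url($_SERVER['REQUEST_URI'], PHP_URL_PATH);
file_put_contents(__DIR__ . '/webroot' . $path . '.html', $html);
// A web-server rewrite rule then serves the .html file whenever it exists,
// and falls back to this PHP front controller on a cache miss.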

Related

Is there any way to load/visit a full website (all the subpages) using a bot?

Currently I have a website that works with Varnish caching. However, for the cache to work, a page must be visited before it starts being served from the cache. My problem is that I have thousands of pages on this site, and I can't visit them one by one because that would take too long. Is there a web crawler or something similar that can do this task? Just visiting a page generates the cache entry, and after that the page loads much faster.
Just generate a sitemap with an online sitemap generator; to build the sitemap, it will crawl the whole site, visiting (and thereby caching) every page.
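If you already have a sitemap, you can also warm the cache yourself. A rough sketch in PHP (the sitemap URL is a placeholder, and this assumes a standard sitemap.xml format):

<?php
// warm_cache.php: request each URL listed in the sitemap once, so that
// Varnish stores a cached copy before any real visitor arrives.
$sitemap = simplexml_load_file('https://example.com/sitemap.xml');

foreach ($sitemap->url as $entry) {
    $ch = curl_init((string)$entry->loc);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true); // fetch and discard body
    curl_exec($ch);
    echo $entry->loc . ' -> HTTP ' . curl_getinfo($ch, CURLINFO_HTTP_CODE) . "\n";
    curl_close($ch);
}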

Can we identify hits from Googlebot-like search engines on a particular URL?

My Problem:
My client's site displays many products, which adds a lot of page weight. So I decided to load the additional products via Ajax, and that works well. But it hurts SEO: no products or deals have been indexed. (I suggested that the client submit products via Google Base, but the client doesn't like that idea; he wants Google to crawl the site directly, and he also wants a short page load time.)
Question:
Can we tell whether a request to the server comes from the Googlebot crawler or from a normal browser user agent such as Mozilla?
Suggestion I have
I tried to identify the user agent from the requests, but that isn't working (or I might be missing something?). Does anyone have a correct solution to this problem, so that I can reduce the page load time using Ajax and still get Googlebot to crawl the website?
You should just search stackoverflow for "Google AJAX SEO". There are a number of questions around this.
In short, Google has a specification to make AJAX sites crawlable: https://developers.google.com/webmasters/ajax-crawling/docs/getting-started?hl=sv-SE
You can also look into pushState as an SEO option.
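For reference, that (now deprecated) scheme works by mapping #! URLs to a query parameter the crawler can request. A minimal sketch in PHP (render_snapshot() and render_app() are hypothetical stand-ins):

<?php
// A URL like /products#!page=2 is requested by the crawler as
// /products?_escaped_fragment_=page=2, so we can return a static
// HTML snapshot of that application state.
if (isset($_GET['_escaped_fragment_'])) {
    parse_str($_GET['_escaped_fragment_'], $state);
    echo render_snapshot($state); // hypothetical server-side renderer
} else {
    echo render_app();            // normal page that boots the JS app
}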
One tactic used to solve this is to harness the pagination function of whatever framework or CMS you are using: load one page of content and display pagination links in your view, then use JavaScript to hide the pagination links, fetch the content of the linked page via Ajax, and append it to the current page. Take a look at how infinite scroll works for inspiration:
http://www.infinite-scroll.com/
Basically, you need to at least render links to the pages that hold the rest of the content, so that search engines can crawl it all, but you can hide those links from users who have JavaScript enabled; a sketch of the server-side half follows below.
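As a rough sketch of that server-side half in PHP (render_products() is a hypothetical helper, not part of any particular framework):

<?php
// products.php: one crawlable URL per page of products.
$page = max(1, (int)($_GET['page'] ?? 1));

// Hypothetical helper that prints one page's worth of product markup.
echo render_products($page);

// Real pagination links that search engines can follow. Your JavaScript
// can hide this <nav> and instead fetch ?page=N via Ajax, appending the
// results to the current page, infinite-scroll style.
echo '<nav class="pagination">';
if ($page > 1) {
    echo '<a href="?page=' . ($page - 1) . '">Previous</a> ';
}
echo '<a href="?page=' . ($page + 1) . '">Next</a>';
echo '</nav>';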
But to better answer your question, it is possible to redirect robots using htaccess:
redirect all bots using htaccess apache
But as far as I understand it, it is better for SEO to have the content, or links to it, actually available on the page.
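If you do still want to detect crawler requests server-side, here is a minimal sketch based on the User-Agent header (only a heuristic: user agents can be spoofed, and Google recommends verifying crawlers via reverse DNS lookup):

<?php
// Very rough crawler check: Googlebot and most major bots identify
// themselves in the User-Agent header. Treat the result as a hint only.
function is_crawler()
{
    $ua = $_SERVER['HTTP_USER_AGENT'] ?? '';
    return (bool)preg_match('/googlebot|bingbot|slurp|duckduckbot/i', $ua);
}

if (is_crawler()) {
    // e.g. render the full, non-Ajax product list for the bot here
}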

How to serve cached pages to first-time visitors?

Is there a way to cache pages from previous visitors and then share that cache with first-time visitors?
I know this can't be done on the client side, but I'm not sure about the server side of things.
I'm hoping you can point me in the right direction, and maybe to some resources, as I can't find much on this.
Sure, that's how page caching works in general. Your site code will do something like this (a minimal PHP sketch using a file cache; generate_page() stands in for your own rendering code):
<?php
// Key the cache entry on the requested URL.
$cacheFile = '/tmp/page_cache_' . md5($_SERVER['REQUEST_URI']) . '.html';

if (is_file($cacheFile)) {
    // Cache hit: serve the stored copy directly.
    readfile($cacheFile);
} else {
    // Cache miss: generate the page, store it, then serve it.
    $html = generate_page(); // hypothetical: your page-rendering code
    file_put_contents($cacheFile, $html);
    echo $html;
}
So the very first visitor to a page causes it to be cached, and all subsequent visitors get the cached version. This can be done at the application level (i.e., via code written by you, or perhaps by a library you're using) or at the server level, for example with Squid.

When to use AJAX, and when not to, in a web application

We have web applications, elgifto.com and roadbrake.com, in which we used AJAX in many places, especially to update major portions of a page. All of the important functionality of elgifto.com was implemented using AJAX. Now we've realized a few issues with the AJAX implementation:

1. The content loaded via AJAX is not available to SEO bots, and that is hurting the page rank of our site.
2. Users cannot bookmark some of the pages, as those pages are only ever reachable through AJAX.
3. When we want to send the user from one page, via an anchor link, to another page built with AJAX, we find it difficult.

So now we are thinking of removing AJAX from these pages and using it only for small pieces of functionality, such as something similar to marking a question as a favorite on SO. Before going ahead and removing it, we'd like an expert's opinion. Thanks.
The problem is not AJAX per se, but your implementation of it. For instance, you can fix the bookmarking problem (2) the way Google Maps does: provide a generated link for each state of your web app.
SEO (1) can be fixed by supplying a variety of these state links to the crawlers, either organically through links in your site or by supplying a list (a sitemap).
If you implement 2, you can fix 1 and 3 with those links.
In the end you must figure out whether the effort is worth it, and whether you are overusing AJAX, of course; but the statements you've made are not set in stone at all.
I constantly develop AJAX-based websites with no SEO problems at all. You just have to use AJAX in the best possible way.
For example, I have a website with normal links pointing to normal web pages (PHP pages); this gives normal navigation when a user doesn't have JS enabled. But when a user does have JS enabled, a script changes the links' behavior so that only the content of the requested page is fetched.
This way you still have physically separate web pages with all of their content, which will be indexed as normal; a minimal sketch of the server side of this approach follows.
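Something like this (render_content(), header.php, and footer.php are hypothetical stand-ins, and it assumes the client script sets the conventional X-Requested-With header, as jQuery does):

<?php
// article.php: a normal, indexable page. When our JS fetches it via Ajax,
// skip the layout and return only the content fragment.
$isAjax = ($_SERVER['HTTP_X_REQUESTED_WITH'] ?? '') === 'XMLHttpRequest';

if (!$isAjax) {
    include 'header.php'; // full layout for direct visits and crawlers
}

echo render_content();    // hypothetical function producing the page body

if (!$isAjax) {
    include 'footer.php';
}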

So many caches when coding in ASP.NET MVC 3, and I don't know what to use

I'm writing an application in MVC3. It has features like login, a simple forum, news, and pages that get their main content from the DB.
I'm looking into caching right now.
First I tried the simple [OutputCache] attribute, but I noticed that it caches the same content for every user. Normally that wouldn't be much of a problem, but, for example, the login box is cached too, and therefore it shows the same content to every user (and everybody just sees that they are logged in as admin). Even if I set Location = OutputCacheLocation.Client, after a logout the cached page still shows that I'm logged in.
No matter, I thought: I can always try Response.WriteSubstitution. But for some reason it seems to be broken in MVC3.
I'm now reading about the "ASP.NET MVC Result Cache", and it seems interesting, but is it a proper way to handle caching?
Also, am I able to cache child actions or partial views in an otherwise very dynamic page?
There are so many options, and I don't know what I should use or when.
Sorry that my question is so vague, but I don't even know what to ask in this case.
I think this post may solve your problem:
MVC3 custom outputcache
Good luck!
