Should I show non-AJAX versions of my page to Googlebot?

I have a site that, let's say, has a paged list of products.
The pagination is AJAX-based, and it degrades gracefully, working 100% without JavaScript.
With JavaScript turned on or off, the URL ~/products/list/4 (where 4 = page number) will show the same content: page 4 of my list of products. The same applies to any filtering and ordering options.
When Googlebot lands on page 1 of my products, it won't be able to page through them, because the pagination is AJAX-based. So if I turn off the AJAX pagination and fall back to server-side pagination when the user agent is Googlebot, it can index all my URLs, which will have the same content as the AJAX-enabled pages.
I have read about using #!, but my site already has the same functionality: my URLs are the same with or without JS enabled.
Hope that makes sense.

Please do not show different content just because the visitor is Googlebot, because that may be considered black-hat SEO. The best way to define which version you prefer is to use the canonical meta tag.
If you are not familiar with it, please also read this: Making AJAX Applications Crawlable
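Since your URLs already work without JavaScript, here is a minimal sketch of what that progressive enhancement could look like on the client, so that no user-agent sniffing is needed at all. The a.page-link class and the #product-list container are assumptions for illustration, not taken from the question:

    // Assumed markup: plain pagination links such as
    // <a class="page-link" href="/products/list/4">4</a> that the server
    // already handles on its own when JS is off.
    $(document).on('click', 'a.page-link', function (e) {
        e.preventDefault();                      // only runs when JS is available
        var url = $(this).attr('href');          // same URL the server renders without JS

        // Pull just the product-list fragment out of the server-rendered page.
        $('#product-list').load(url + ' #product-list > *', function () {
            // Keep the address bar in sync so the AJAX view and the
            // server-rendered view share one crawlable URL.
            if (window.history && history.pushState) {
                history.pushState({}, '', url);
            }
        });
    });

Googlebot then simply follows the plain hrefs and indexes each server-rendered page, and each of those pages can carry its own canonical tag.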

Related

Difficulty finding out how dynamic ajax content is loaded

So I want to crawl information about the following 2 products:
1. http://www.dollargeneral.com/product/index.jsp?productId=86154836
2. http://www.dollargeneral.com/product/index.jsp?productId=58607576
Product No.2 offers "save 5% on all order with Auto delivery"
Product No.1 doesn't
When I crawl this page, the div containing the offer doesn't show up, so I think this information is loaded by AJAX.
Now my question is: after using the Chrome dev tools to check the Network tab with the XHR filter, both URLs send only one request, and it contains seemingly useless information.
So where is the offer information coming from? Thank you!
These are two different products. One simply has the offer attached to it, while the other does not; some products include the offer and others don't.
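If you want to confirm that the offer is part of the initial page rather than an XHR response, a quick hedged check is to search the raw HTML for the offer text (Node 18+ with its built-in fetch is assumed, and the exact offer wording is a guess you should replace with what the page actually shows):

    // Assumed: Node 18+, so fetch is available globally.
    const url = 'http://www.dollargeneral.com/product/index.jsp?productId=58607576';

    fetch(url)
      .then((res) => res.text())
      .then((html) => {
        // If the offer text is in the raw response, it is not loaded by AJAX.
        const hasOffer = html.indexOf('Auto delivery') !== -1;
        console.log(hasOffer
          ? 'Offer found in the initial HTML (server-rendered).'
          : 'Offer not found in the initial HTML.');
      });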

Does using AJAX on your website drop your page views while ranking?

Since this is related to AJAX, I thought this is the best place to ask.
I display 5 articles at a time on my website, and when the user clicks 'Next' I load the next 5 articles using AJAX without reloading the entire page. The result is that the user always stays on the same page.
One of my friends told me that website ranking depends on the number of page views, and this approach obviously reduces my page views.
Should I not use AJAX then?
(This might be a stupid question, but I seriously have no idea about ranking and SEO, so please help.)
If you load your content dynamically, Google will not see the entire page, only the part that is loaded initially. So if Google ranking is important to you, it's better not to use an infinite loader.
Actually, it is not a good idea to navigate a page using AJAX alone. Consider the scenario: display 5 articles first, then clicking the Next button loads the next 5 items, and so on. Done this way, the page will not be search-engine friendly; the search engine can't locate your content exactly and will crawl only the initial articles.
With some effort, though, you can make AJAX navigation search-engine friendly (see the sketch after the links below).
Loading page content purely dynamically is currently not a good idea for an SEO-friendly web page, but consider other AJAX navigation schemes that can keep the page dynamic as well as search-engine friendly.
Some suggested AJAX navigation schemes are listed below:
http://nickjohnson.com/b/how-to-make-ajax-search-engine-friendly-seo
http://ajax.rswebanalytics.com/
http://www.symatix.co.uk/articles/ajax/search-engine-friendly-ajax-navigation
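As a rough sketch of that kind of crawlable AJAX navigation, assume the Next button is a real link the server can handle on its own (the #next and #articles selectors and the /articles?page=2 URL pattern are illustrative assumptions, not from the question):

    // Assumed markup: <a id="next" href="/articles?page=2">Next</a>,
    // where the server renders that URL as a normal page when JS is off.
    $('#next').on('click', function (e) {
        e.preventDefault();
        var nextUrl = $(this).attr('href');

        $.get(nextUrl, function (html) {
            var $page = $('<div>').html(html);   // rough parse of the returned page
            // Append the next 5 articles instead of replacing the page.
            $('#articles').append($page.find('#articles').children());
            // Point the Next link at the following page so users and crawlers can keep paging.
            $('#next').attr('href', $page.find('#next').attr('href'));
            // Give the loaded state its own URL.
            if (window.history && history.pushState) {
                history.pushState({}, '', nextUrl);
            }
        });
    });

Because every page of articles still has a plain URL and a plain link pointing to it, crawlers can index all of the articles even though users with JS never leave the page.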

Ajax Load: opinion request

The business web application that I need to build for our company will use an accordion menu (such as the jQuery UI accordion).
I would like to link jQuery click events to the accordion menu and load the content dynamically with the AJAX .load() shorthand.
Currently I'm dealing with an internal conversation where someone mentions that Ajax is slower than a regular browser request.
Now my question is: is that true, taking into account that:
The loading time of the accordion-content should be faster (all scripts / css / accordion / header loaded only once at the beginning);
The Ajax request does not influence the performance of any server actions;
Should I really use a browser request - and will it really be faster?
It seems very unintuitive to use an accordion as a static element that is simply re-rendered on every page refresh.
I built a prototype before, using the AJAX .load() method to fill div containers with HTML and JavaScript, and it loaded as fast as a normal browser request would.
As for an answer: an AJAX request is indeed smaller than a regular request for a full page.
You have a few options. You could pre-fetch the data, store it in an array or object, and insert it on click of the accordion panel, or simply use AJAX; the latter, however, creates another request for every click, which could potentially slow down your website if you have a lot of users.
Pre-loading the data into your HTML in hidden divs, or via a js.php directly into an array / object, would be the most efficient way, depending on the size of the contents.
Do note that AJAX-loaded content is not indexed by search engines and will be less SEO-friendly.
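As one hedged illustration of the "request it at most once" option (the data-url attribute and the .panel-header / .panel-body selectors are assumptions for the sketch, not part of the original setup), each panel could be fetched on first click and reused from a plain cache object afterwards:

    var panelCache = {};                          // url -> html fragment, filled on first use

    $('#accordion').on('click', '.panel-header', function () {
        var url = $(this).data('url');
        var $body = $(this).next('.panel-body');

        if (panelCache[url]) {
            $body.html(panelCache[url]);          // reuse the cached fragment, no extra request
            return;
        }
        $.get(url, function (html) {
            panelCache[url] = html;               // the first click populates the cache
            $body.html(html);
        });
    });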

jQuery AJAX Load Method - Delay

I'll admit that I'm pretty new to web development (I've only been coding for about a year) and especially green when it comes to JS / jQuery.
A specific web page I've built loads different data based on hovering over certain categories: country clubs, resorts, hotels, etc. When I built the site on my local machine, the javascript function was super quick. However, on the live site, it has a long delay before the data swap happens.
The URL is: http://preferredparkingsolutions.com/client_list.html
Which links to a javascript function at: http://preferredparkingsolutions.com/scripts/clientHover.js
Which replaces the display div (#client_list) by pulling data from a text file.
Is there a better / faster way of doing this?
Yes, this could be optimised by loading the content up front and caching it. Currently you are doing an HTTP request for each and every hover, even if the user has hovered over that element before, since the AJAX responses aren't being cached. Fixing this would be your quickest win.
However, I can't see any case at all for having the content live externally. Is there any reason you're against having the content physically in the page and just using show/hide methods? There are various benefits to this - SEO, for one thing, since Google will find the content.
This is the external page you are loading: http://preferredparkingsolutions.com/client_list.inc.html. The content is small and looks like a static page, so why not just load everything up front and then hide and show divs? As Utkanos suggested, you will also get an SEO benefit, and you avoid an HTTP request for each and every hover. If you still want to load it externally, at least load it once, cache it, and use the cached version to hide and show the divs.
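A rough sketch of the "load it once, then only show/hide" suggestion (the .category hover targets with a data-category attribute, and the structure of client_list.inc.html, are assumptions for illustration):

    $(function () {
        // Fetch the client list a single time; after this, hovering only
        // toggles divs that are already in the DOM, so there are no further requests.
        $('#client_list').load('/client_list.inc.html', function () {
            var $sections = $('#client_list > div');
            $sections.hide();

            $('.category').on('mouseenter', function () {
                var target = $(this).data('category');   // e.g. "resorts", "hotels"
                $sections.hide()
                    .filter('[data-category="' + target + '"]').show();
            });
        });
    });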

Progressive Rendering and SEO

The website I'm working on has a typical e-commerce product page, with the top part of the page containing the title, images and pricing, while the bottom part of the page has the tabs section, with tabs for Features, Specs, Accessories, Reviews and so on.
Naturally, this HTML document is heavy, so I'm thinking about splitting the page in two:
The HTML document will contain only the top part of the page
JavaScript will then asynchronously request another page, which contains a JSON object with the content of all the tabs; when the request succeeds, JavaScript will populate each tab with its content
The question is:
Will the Search Engines crawl the content that is loaded by JavaScript?
if not - then Progressive Rendering = Loss of SEO?
if yes - must I somehow ensure that all the tabs are populated prior to the Load event, or does this not matter?
I think that this question could be asked differently:
With SEO in mind, do the search engines crawl the HTML document only, or do they crawl the content of the page as it stands when the Load event takes place?
Any known best practices for this? any useful links?
Please advise.
Crawlers don't use JS. Turn off JS in your browser to see what the crawler sees. If you have links to these content pages, it will crawl them. If the SEO is important, make sure the content is in the page.
The search engines crawl the HTML document only, as you describe it. Don't use the JS solution you propose; diverse but relevant content in your bottom tabs is important for SEO.
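One hedged sketch of the crawler-safe alternative both answers point at: ship all tab content in the initial HTML and let JavaScript only toggle visibility. The .tab links pointing at #tab-… panes with a .tab-pane class are illustrative assumptions, not markup from the question:

    // Assumed markup: <a class="tab" href="#tab-specs">Specs</a> and
    // <div id="tab-specs" class="tab-pane">…</div> already present in the HTML.
    $('.tab').on('click', function (e) {
        e.preventDefault();
        var target = $(this).attr('href');        // e.g. "#tab-specs"

        // The content is already in the document, so crawlers can read it
        // even though users only see one pane at a time.
        $('.tab-pane').hide();
        $(target).show();
    });

If the document weight itself is the concern, the heavy part is usually images rather than the tab text, so keeping the tab markup in the page typically costs little.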
