Progressive Rendering and SEO

The website I'm working on has a typical e-commerce product page, with the top part of the page containing the title, images and pricing, while the bottom part of the page has the tabs section, with tabs for Features, Specs, Accessories, Reviews and so on.
Naturally, this HTML document is heavy. I am thinking about splitting the page in two:
The HTML Document will contain only the top part of the page
Then JavaScript will asynchronously request another URL, which returns a JSON object with the content of all the tabs; on success, JavaScript will populate each tab with its content, as sketched below.
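To illustrate, a minimal sketch of what I have in mind (the URL and element IDs are made up):

    // Fetch one JSON object holding the content of all the tabs,
    // then populate each tab with its content.
    $.getJSON("/products/123/tabs.json", function(tabs) {
        $("#features").html(tabs.features);
        $("#specs").html(tabs.specs);
        $("#accessories").html(tabs.accessories);
        $("#reviews").html(tabs.reviews);
    });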
The question is:
Will the Search Engines crawl the content that is loaded by JavaScript?
If not, does progressive rendering mean a loss of SEO?
If yes, must I somehow ensure that all the tabs are populated prior to the load event, or does this not matter?
I think that this question could be asked differently:
With SEO in mind, do the search engines crawl the HTML document only, or do they crawl the content of the page as it stands when the load event takes place?
Any known best practices for this? Any useful links?
Please advise.

Crawlers don't use JS. Turn off JavaScript in your browser to see what the crawler sees. If you have plain links to these content pages, it will crawl them. If the SEO is important, make sure the content is in the page.

The search engines crawl the HTML document only, as you describe it, so don't use the JS solution you propose. Diverse but appropriate content in your bottom tabs is important for SEO.
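For instance, a rough sketch of the keep-it-in-the-page approach (markup and selectors are illustrative): all the tab content ships in the HTML document, where crawlers can read it, and JavaScript only toggles visibility:

    <!-- All tab content is in the document, so crawlers see it -->
    <a class="tab-link" href="#features">Features</a>
    <a class="tab-link" href="#specs">Specs</a>
    <div class="tab" id="features">...</div>
    <div class="tab" id="specs">...</div>

    // Show the clicked tab, hide the rest; no extra HTTP request needed.
    $(".tab-link").click(function() {
        $(".tab").hide();
        $($(this).attr("href")).show();
        return false;
    });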

Related

Does using AJAX on your website drop your page views while ranking?

Since this is related to AJAX, I thought this was the best place to ask.
I am displaying 5 articles at a time on my website, and when the user clicks 'Next' I load the next 5 articles using AJAX without reloading the entire page. The result is that the user always stays on the same page.
One of my friends told me that website ranking depends on the number of page views, and this approach obviously reduces my page views.
Should I not use AJAX then?
(This might be a stupid question, but I seriously have no idea about ranking and SEO, so please help.)
If you load your content dynamically, Google will not see the entire page, only the part that is loaded initially. So if Google ranking is important to you, it's better not to use an infinite loader.
Actually, it is not a good idea to paginate using AJAX alone. Consider the scenario:
display 5 articles first, then load the next 5 when the Next button is clicked, and so on. Done this way, the page will not be search engine friendly:
the search engine can't locate your content exactly and will crawl only the initial articles.
But with some effort you can make AJAX navigation search engine friendly; see the example here, and the sketch after the links below.
Loading page content dynamically like this is currently not a good idea for an SEO-friendly web page, but consider other AJAX navigation schemes that keep the page dynamic as well as search engine friendly.
Some suggested AJAX navigation resources are listed below:
http://nickjohnson.com/b/how-to-make-ajax-search-engine-friendly-seo
http://ajax.rswebanalytics.com/
http://www.symatix.co.uk/articles/ajax/search-engine-friendly-ajax-navigation
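The general pattern in those links is progressive enhancement: keep real, crawlable hrefs and let JavaScript intercept them. A rough sketch (URLs and selectors are assumptions):

    <!-- A plain link a crawler can follow -->
    <a id="next" href="/articles?page=2">Next</a>

    // For JS users, fetch just the article list from the next page
    // instead of reloading everything.
    $("#next").click(function() {
        $("#articles").load($(this).attr("href") + " #articles > *");
        return false;
    });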

Is there an SEO penalty for writing URLs with segments that have no index content?

I am working on a Bootstrap-based responsive website. The dropdown menus in the main website navigation open on a click rather than a hover. There is no index content for each section, only links to specific pages in the dropdown.
Is there any SEO penalty for having content located at:
www.mysite.com/books/moby-dick
when
www.mysite.com/books
results in a 404 error?
I could generate index pages with links to all children if I had to, but I'd rather avoid creating any content that isn't meant to be viewed directly.
I would like to organize the pages by "folder" using mod_rewrite, which I have a pretty good handle on at this point.
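For reference, this is roughly the kind of rule I mean (the paths and script name are hypothetical):

    # .htaccess: map /books/moby-dick to a handler script
    RewriteEngine On
    RewriteRule ^books/([a-z0-9-]+)/?$ book.php?slug=$1 [L,QSA]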
The way I understand it, search engines place no relevance on one page's URL in relation to another page's URL. Are you looking for documentation? Which search engine do you want documentation for? I'm not even sure that type of info is "documented", but if you think about it, the order of segments in a URL only has meaning to us as humans. An engine doesn't place importance on links further up or down the URL hierarchy; it wouldn't make sense.
I don't think your moby-dick page would have a positive or negative impact on the domain's home page either. Google, at least, treats every URL as a unique page, hence the "PageRank" algorithm, not a site-hierarchy algorithm.

jQuery AJAX Load Method - Delay

I'll admit that I'm pretty new to web development (I've only been coding for about a year) and especially green when it comes to JS/jQuery.
A specific web page I've built loads different data based on hovering over certain categories: country clubs, resorts, hotels, etc. When I built the site on my local machine, the JavaScript function was super quick. However, on the live site there is a long delay before the data swap happens.
The URL is: http://preferredparkingsolutions.com/client_list.html
Which links to a javascript function at: http://preferredparkingsolutions.com/scripts/clientHover.js
Which replaces the display div (#client_list) by pulling data from a text file.
Is there a better / faster way of doing this?
Yes, this could be optimised by loading the content up-front and caching it. Currently you are making an HTTP request for each and every hover, even if the user has hovered over that element before, since the AJAX responses aren't being cached. Fixing this would be your quickest win.
However, I can't see any case at all for having the content live externally. Is there any reason you're against having the content physically in the page and just using show/hide methods? There are various benefits to this: SEO, for one thing, since Google will find the content.
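If you do keep the external load, a minimal sketch of the caching idea (the data attribute and class name are assumptions; only #client_list comes from your page):

    // Remember each response so a given category is fetched at most once.
    var cache = {};
    $(".category").mouseenter(function() {
        var file = $(this).data("file"); // assumed attribute naming the source file
        if (cache[file]) {
            $("#client_list").html(cache[file]);
        } else {
            $.get(file, function(data) {
                cache[file] = data;
                $("#client_list").html(data);
            });
        }
    });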
The external page you are loading is http://preferredparkingsolutions.com/client_list.inc.html, and its content is small and looks static, so why not just load everything up-front and show/hide divs? As Utkanos suggested, you will also get an SEO benefit, and you avoid an HTTP request for each and every hover. If you still want to load it externally, at least load it once, cache it, and use the cached version to show and hide the divs.

SEO for Dynamically Loaded Images

I have developed PHP code to dynamically load the files contained in a directory into a gallery/slideshow. I have many (40-50) of these gallery web pages, which display images grouped by content. With hundreds of images, the dynamic gallery code lets me add images to a directory without having to edit each web page every time.
However, I've realized that these files will be invisible to search engines, since there isn't any HTML to index (e.g. an 'alt' attribute). Does anyone have suggestions on how to get these images indexed? Two ideas I've had:
1) Write a program to automatically generate a single web page for every JPEG file, which will display the image when found by a search engine and contain a link to the gallery page where the user can see more content. The benefit of this method is not having to modify my live web pages. The downside is hundreds of additional files that exist only to be found by a search engine.
2) Write a program to generate hidden links that can be pasted into my gallery HTML page, using the alt attribute. The benefit of this method is that users would find my main gallery page with a search. The downside is having to cut and paste code into my live gallery web pages, somewhat defeating the purpose of a dynamic gallery.
I'm new at this, so any suggestions would be appreciated.
If I understand you correctly:
I would have one page that just lists thumbnails of the images, and then one page for each image that shows a bigger version along with all the metadata you have. Best of all would be to add a short, unique snippet of text to each image describing what is in it.
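As a sketch, each generated image page might contain little more than this (names and paths are made up):

    <title>Sunset over the marina - My Gallery</title>
    <img src="/galleries/boats/sunset-marina.jpg"
         alt="Sunset over the marina">
    <p>A short, unique description of what is in the picture.</p>
    <a href="/galleries/boats/">See the full boats gallery</a>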

Should I load an entire html page with AJAX?

My designer thought it was a good idea to create a transition between different pages. Essentially only the content part will reload (header and footer stay intact), and only the content div should get a transitional effect (a fade of some sort). Creating this kind of effect isn't really the problem; making Google (and Analytics) happy is...
Solutions I didn't like and why;
Load only the content div with AJAX: Google won't see any content, meaning the site will never be found, or only the parts which are retrieved by AJAX, which aren't full pages at all.
Show the transitional effect, then 'redirect' the user to the designated page (capturing the click event of <a> elements): the effect is pretty much the same as just linking to another page, i.e. the user will still see the page being reloaded.
I thought of one possible solution:
When a visitor clicks a link, capture the event, load the target with ajax, show the transitional effect in the meantime, then just rewrite the entire document with the content fetched with the ajax request.
At least this will work, and it has some advantages: the page reload will look seamless no matter how slow the connection is; Google won't really mind, because the AJAX content is a full HTML page itself and can be crawled as is; and even non-JavaScript browsers (mobile phones et al.) won't mind, since they just reload the page.
My hesitation about implementing this method is that I would be reloading an entire page using AJAX. I'm wondering whether this is what AJAX is meant to do, and whether it would slow things down. Most of all, is there a better solution, e.g. my first 'bad' solution but slightly altered so that Google (and Analytics) would like it?
Thanks for your thoughts on this!
Short answer: I would not recommend loading an entire page in this manner.
Long answer: not recommended. Whilst possible, this is not really the intent of XHR/AJAX. Essentially what you're doing is replicating the native behaviour of the browser. Some of the problems you'll encounter:
- Support for the Back/Forward button. You'll need a URI # scheme to solve this (see the sketch below).
- The browser must parse the entire page through AJAX. This'll slow things down: e.g. if you load a block of HTML into the browser, then replace the DOM with it, only then will any scripts, CSS or images contained therein begin downloading.
- Memory: the browser's not changing pages, so over time (depending on the browser) I'd expect the memory usage to increase.
- Accessibility: screen readers will need to be notified whenever the page content is updated. Might not be a concern for you, but worth mentioning.
- Caching: the browser would not know which page to cache (beyond the initial load).
- Separation of concerns: your view is essentially broken into server-side pieces that render the page's content, the static HTML for the page framework, and lastly the JS to combine the server piece with the browser piece. This'll make maintenance over time problematic and complex.
- Integration with other components: you're already seeing problems with Google Analytics. You may encounter issues with other components related to the timing of when the DOM is constructed.
Whether it's worth it for the page transition effect is your call but I hope I've answered your question.
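On the Back/Forward point, a minimal sketch of a URI # scheme (loadContent is a hypothetical function that fetches and injects a page's content):

    // Re-render whenever the user navigates with Back/Forward,
    // treating the hash fragment as the page name.
    $(window).on("hashchange", function() {
        var page = location.hash.slice(1) || "home"; // e.g. "#about" -> "about"
        loadContent(page);
    });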
You can have AJAX and SEO: see Google's proposal.
I think you can learn something from Gmail's design.
This may be a bit strange, but I have an idea for this.
Prepare your pages to accept an 'iframe' GET parameter.
When the 'iframe' parameter is present, load the page with some JavaScript that triggers show_iframe_content() in the parent.
When the parameter is absent, just load the page normally, with a hidden iframe element called 'preloader'.
Your goal is to make sure every one of your links opens in the 'preloader' with the additional 'iframe' GET parameter; when the iframe finishes loading and calls show_iframe_content(), you copy the content into the parent page.
Like this: Link clicked -> transition to loading phase -> iframe loaded -> show_iframe_content() called -> iframe content copied back to parent -> transition back to normal phase
The whole thing works because when a crawler visits any of your pages, it does so without the 'iframe' GET parameter, so it can go through all your pages as normal; but when the site is used in a browser, your links do the magic above.
This is just a sketch of it, but I'm sure it can be made right.
EDIT: Actually you can do it with plain AJAX, without the iframe; the point is that you modify the page after it has loaded in the browser, so that linked content is fetched with AJAX. Crawlers will still see the links.
Example script:
$.fn.initLinks = function() {
    // Intercept clicks on all links inside this element.
    $("a", this).click(function() {
        var url = $(this).attr("href");
        // transition to loading phase ...
        // The ajax=1 POST parameter tells the server to return only the
        // content, without the header/footer.
        $.post(url, "ajax=1", function(data) {
            // Inject the fetched content and re-bind the link handlers.
            $("#content").html(data).initLinks();
            // transition back to normal phase ...
        });
        return false;
    });
};
$(function() {
    $("body").initLinks();
});
Google Analytics can track JavaScript events as if they were pageviews; check here for implementation:
http://www.google.com/support/googleanalytics/bin/answer.py?hl=en-GB&answer=55521
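For example, with the classic asynchronous snippet, you can report each AJAX load as a virtual pageview (the path prefix is whatever you choose):

    // After injecting the AJAX content, record it as a pageview;
    // 'url' is the href that was just loaded (see the script above).
    _gaq.push(['_trackPageview', '/ajax' + url]);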
