Does using AJAX on your website drop your page views while ranking?

Since it's related to AJAX technology, I thought this was the best place to ask.
I display 5 articles at a time on my website, and when the user clicks 'Next' I load the next 5 articles via AJAX without reloading the entire page. The result is that he always stays on the same page.
A friend told me that website ranking depends on the number of page views, and I think this approach obviously reduces my page views.
Should I not use AJAX then?
(This might be a stupid question, but I seriously have no idea about ranking and SEO, so please help.)

By loading your content dynamically, Google will not see the entire page, only the part that is loaded initially. So if Google rank is important to you, it's better not to use an infinite loader.

Actually, it is not a good idea to navigate a page using AJAX alone. Consider the scenario: you display 5 articles first, then clicking the Next button loads the next 5 items, and so on. Used this way, the page is not search-engine friendly; the search engine can't locate your content exactly and will crawl only the initial articles.
With some effort, though, you can make AJAX navigation search-engine friendly. Loading a page's content dynamically is currently not a good idea for an SEO-friendly web page, but consider AJAX navigation schemes that keep the page dynamic as well as search-engine friendly (a minimal sketch follows the links below).
Some suggested AJAX navigation schemes are listed here:
http://nickjohnson.com/b/how-to-make-ajax-search-engine-friendly-seo
http://ajax.rswebanalytics.com/
http://www.symatix.co.uk/articles/ajax/search-engine-friendly-ajax-navigation
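
The common thread in those guides is to give every "page" of results a real URL that the server can also render on its own. Here is a minimal sketch of that idea; the /articles/page/N URL scheme and the #article-list container are my assumptions, not part of the original question:

// Intercept clicks on the Next link, but keep a real href as the
// crawlable, no-JS fallback (the server renders the same markup
// for direct visits to /articles/page/2, /articles/page/3, ...).
$(document).on('click', 'a.next-page', function (e) {
    e.preventDefault();
    var url = $(this).attr('href');
    $.get(url, function (html) {
        // Lift the article list out of the full-page response.
        var fragment = $('<div>').html(html).find('#article-list').html();
        $('#article-list').html(fragment);
        window.history.pushState(null, '', url); // keep the URL in sync
    });
});

// Back/Forward: re-fetch the list for whatever URL was restored.
window.onpopstate = function () {
    $.get(location.pathname, function (html) {
        var fragment = $('<div>').html(html).find('#article-list').html();
        $('#article-list').html(fragment);
    });
};

This way each batch of articles is still a crawlable page of its own, while JS-enabled visitors get the no-reload behaviour.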

Related

How do big sites prevent the loading circle on tabs from showing?

Okay, I do not know how to explain this well. It may just be my internet, or maybe my site is slower, or they really have a technique for doing this.
If you visit Facebook, Reddit, YouTube, or Twitter, and you click on links or take other actions on those websites, the URL changes but the browser tab doesn't show any loading circle.
How do they do that?
I am pretty sure my website is fast enough, and at times it loads even faster than the bigger sites, but mine shows the loading circle on the browser tab.
Okay, so I found the answer. Here is the technique for changing the URL without reloading the page:
Updating address bar with new URL without hash or reloading the page
How do I modify the URL without reloading the page?
I am still trying to figure out how to swap the actual page without reloading the entire page. I am guessing they load it via ajax or something similar upon URL change. I'll update this once I figure it out.
Edit: I am currently working on this feature for my site. The technique is to use ajax to load the content based on the URL. I'll update this thread as I update my site with this feature.
Edit 2: Damn, you will probably face the same problem I had: detecting the URL change without using onhashchange. If so, here you go:
How to detect URL change in JavaScript
That one alone literally took me 4 hours to figure out.
Edit 3: I have now integrated this feature on my site. You can check it at
Grandweb
It is quite simple, but there's a lot of work in appending the content once it's retrieved via ajax. So here is the process:
I am using pushState() to change the URL without reloading the page:
var url = $(this).attr('href');
// Strip the domain so we push a relative path, e.g. '/write'
var new_url = url.replace('https://grandweb.net/', '/');
window.history.pushState(null, '', new_url);
(Note: using 'mouseup' turned out to be a bad idea; I changed my mind, as explained below.)
I originally triggered the first function using 'mouseup' to retrieve the content via ajax, and then listened for the succeeding onpopstate() events for the next ones, because some mouse buttons such as Mouse 4 or Mouse 5 are bound to the browser's Back and Forward buttons and do not trigger 'mouseup'.
// First attempt: fetch content on any mouse release (later abandoned)
$(window).on('mouseup', function (evt) {
    get_content();
});

// Back/Forward navigation: re-fetch content for the restored URL
window.onpopstate = function (event) {
    get_content();
};
The first handler is responsible for triggering the function on the first try, because onpopstate only fires once the browser's history API has been populated.
Using mouseup was a bad idea; basically, don't use it unless you really want to detect mouse actions from anywhere on the document.
Instead, I use the anchor tags/links to trigger the first function for retrieving content.
example:
<a class="dynamic_btn" href="website.com/post">Home</a>
then
$(document).on('click', '.dynamic_btn', function (e) {
    e.preventDefault();
    get_content();
});
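
For reference, here is a minimal sketch of how a get_content()-style function could tie these pieces together; the #content container and the fragment extraction are my assumptions, not the exact Grandweb code:

function get_content(url) {
    // Default to the current address, which covers the onpopstate case.
    url = url || window.location.pathname;
    $.get(url, function (html) {
        // Pull the matching container out of the full-page response
        // and swap it into the live document.
        var fragment = $('<div>').html(html).find('#content').html();
        $('#content').html(fragment);
    });
}

$(document).on('click', '.dynamic_btn', function (e) {
    e.preventDefault();
    var url = $(this).attr('href');
    window.history.pushState(null, '', url); // update the address bar first
    get_content(url);                        // then fetch and inject
});

window.onpopstate = function () {
    get_content(); // Back/Forward: reload content for the restored URL
};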
Using onhashchange is possible IF you have hashes in your URLs. I do not use hashes, so onhashchange is useless in my case, unless there's something I'm missing.
After retrieving the content, I append it by creating DOM elements inside existing containers on the page.
This is much easier if you only plan to change a few elements or containers per page. If you plan on doing this to change a full page layout, good luck. It's doable, but it's a real pain in the *ss.
From observing Facebook, I learned that they do not implement this technique on all of their links/features. That makes sense, because it is harder to maintain, especially since most of the work is done client-side. It is very nice, though, because the page doesn't reload.
I have implemented it on a few 'essential' functions of my website, such as viewing posts and returning to the homepage. I could implement it across the whole site, but I am still deciding on that. That is all, thank you very much for reading, internet stranger.

Is there an SEO penalty for writing URLs with segments that have no index content?

I am working on a Bootstrap-based responsive website. The dropdown menus in the main website navigation open with a click rather than a hover. There is no index content for each section, only specific page links in the dropdown.
Is there any SEO penalty for having content located at:
www.mysite.com/books/moby-dick
when
www.mysite.com/books
results in a 404 error?
I could generate index pages with links to all children if I had to, but I'd rather avoid creating content that isn't meant to be viewed directly.
I would like to organize the pages by "folder" using mod_rewrite, which I have a pretty good handle on at this point.
The way I understand it, search engines place no relevance on one page's URL in relation to another page's URL. Are you looking for documentation, and if so, for which search engine? I'm not even sure that type of info is "documented", but if you think about it, the order of words in a URL only has meaning to us as humans. An engine doesn't place importance on URLs further up or down the hierarchy; it wouldn't make sense.
I don't even think your moby-dick page would have a positive or negative impact on the domain's home page. Google, at least, treats every URL as a unique page, hence the "PageRank" algorithm, not a site-hierarchy algorithm.

best practices with single page websites to decrease page load time without hurting SEO

I'm wondering what some best practices are for decreasing the page load time of single-page websites, in a way that won't hurt SEO.
I'm leaning toward an ajax solution with "hijax linking", but I'm wondering about best practices for the load order of a page. So, for instance, say I have a simple webpage with Home, About, Pictures of My Cat, Contact, etc., and I'm planning to have it all show up on the homepage via vertical scrolling, allotting one "screen" worth of content per item.
I'm coding this in WordPress, so my main idea is to first load the first "screen", i.e. the hero section of the homepage, as part of home.php, so the user doesn't have to wait for the whole thing (and for SEO). Then, once that has finished loading, load the next four via ajax in the background. I'm wondering what the best strategy might be to go about that. Someone provided this answer elsewhere:
"Build a standard 5 page site using php with proper separation of header, footer, content. Then use javascript to redirect to a single (separate) page with all content include()ed on the page."
In WordPress I'd take this to mean: create a separate page with a loop that grabs the other four "screens" as posts, and then load this page after home.php has loaded. Does anyone see any issues with this approach or, as the question asks, have any better or best practices to accomplish this? I'd appreciate them. Thanks.
There are several things you can do:
Improve the performance of your back-end code, if there is any.
Pagination: split the page into smaller pages.
Caching.
Decrease the size of the content: shrink background images and compress JS.
Compress the content itself (e.g. gzip).
Most of the time the right optimization depends on your situation; to start with, one of the above will do it for you.
Your question is tagged "wordpress", so I am assuming that you use WordPress.
If so, a logical starting point is to use one of the WordPress caching plugins. I use Quick Cache for my website and it makes a significant difference.
But you shouldn't stop with the plugin. Consider the quality of the theme you are using; you must be sure the theme is good quality. Poorly designed themes can make inefficient database calls and slow your website down.
Delaying and loading parts of the page with ajax shouldn't be your first optimization; try all the other options first.
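
If, after trying those options, you still want to defer the lower "screens" as the question proposes, here is a minimal sketch of the hijax-style idea; the /sections/... URLs and the container IDs are my assumptions, not WordPress functions:

// Once the hero section has rendered, pull the remaining "screens"
// in from real, crawlable URLs in the background.
$(function () {
    var sections = ['about', 'cats', 'contact', 'blog']; // hypothetical slugs
    $.each(sections, function (i, name) {
        // .load() fetches the URL and injects only the #main fragment.
        $('#' + name).load('/sections/' + name + '/ #main');
    });
});

Because each section also exists at its own URL, crawlers and JS-less visitors can still reach the content directly.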

jQuery AJAX Load Method - Delay

I'll admit that I'm pretty new to web development (I've only been coding for about a year) and especially green when it comes to JS/jQuery.
A specific web page I've built loads different data based on hovering over certain categories: country clubs, resorts, hotels, etc. When I built the site on my local machine, the JavaScript function was super quick. However, on the live site there is a long delay before the data swap happens.
The URL is: http://preferredparkingsolutions.com/client_list.html
Which links to a javascript function at: http://preferredparkingsolutions.com/scripts/clientHover.js
Which replaces the display div (#client_list) by pulling data from a text file.
Is there a better / faster way of doing this?
Yes, this could be optimised by loading the content up-front and caching it. Currently you are doing an HTTP request for each and every hover, even if the user has hovered over that element before, since the AJAX responses aren't being cached. Fixing that would be your quickest win.
However, I can't see any case at all for having the content live externally. Is there any reason you're against having the content physically in the page and just using show/hide methods? There are various benefits to this: SEO, for one thing, since Google will find the content.
The external page you are loading, http://preferredparkingsolutions.com/client_list.inc.html, has little content and looks like a static page, so why not just load everything up-front and then just hide and show divs? As Utkanos suggested, you'll also get an SEO benefit, and you'll avoid an HTTP request on each and every hover. If you still want to load it externally, load it once, cache it, and use the cached version to hide and show divs.
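
To illustrate the caching both answers suggest, here is a minimal sketch; the .category selector, the data-file attribute, and the #client_list target are assumptions based on the page described:

var cache = {}; // responses keyed by URL, so each file is fetched only once

$('.category').on('mouseenter', function () {
    var url = $(this).data('file'); // hypothetical attribute holding the file URL
    if (cache[url]) {
        $('#client_list').html(cache[url]); // instant: served from memory
        return;
    }
    $.get(url, function (html) {
        cache[url] = html; // remember it for subsequent hovers
        $('#client_list').html(html);
    });
});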

Progressive Rendering and SEO

The website I'm working on has a typical e-commerce product page, with the top part of the page containing the title, images and pricing, while the bottom part of the page has the tabs section, with tabs for Features, Specs, Accessories, Reviews and so on.
Naturally, this HTML document is heavy. I'm thinking about splitting the page in two:
The HTML document will contain only the top part of the page.
Then JavaScript will asynchronously call another page, which contains a JSON object with the content of all the tabs; on success, JavaScript will populate each tab with its content.
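For concreteness, a minimal sketch of that second step, assuming a hypothetical /product/123/tabs.json endpoint and tab container IDs:

// One request returns every tab's content; each key holds a tab's HTML.
$.getJSON('/product/123/tabs.json', function (tabs) {
    $('#features').html(tabs.features);
    $('#specs').html(tabs.specs);
    $('#accessories').html(tabs.accessories);
    $('#reviews').html(tabs.reviews);
});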
The question is:
Will search engines crawl the content that is loaded by JavaScript?
If not, does progressive rendering mean a loss of SEO?
If yes, must I somehow ensure that all the tabs are populated prior to the Load event, or does this not matter?
I think this question could be asked differently:
With SEO in mind, do search engines crawl the HTML document only, or do they crawl the content of the page as it stands when the Load event takes place?
Any known best practices for this? Any useful links?
Please advise.
Crawlers don't use JS. Turn off JS in your browser to see what the crawler sees. If you have links to these content pages, it will crawl them. If the SEO is important, make sure the content is in the page.
Search engines crawl the HTML document only, as you describe it, so don't use the JS solution you propose: diverse but appropriate content in your bottom tabs is important for SEO.
