I've created an AJAX-enabled web application. In my application, all content that I want to appear in search results is loaded using AJAX. However, I've observed that despite submitting a valid sitemap to Google, my page ranking is very poor.
What do I need to do, and what should I avoid, in order to improve my page ranking?
Thanks in advance.
You probably want to make the application bookmark- and history-friendly. There are many ways to do this; one of them is the jQuery history plugin: https://github.com/tkyk/jquery-history-plugin
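For illustration only (this is not the plugin's API), here is a minimal sketch of the same idea using the browser's native hashchange event, so that AJAX states end up in the URL and therefore in bookmarks and the back button. The loadContent function and the fetch_content parameter are assumptions borrowed from the example URL below.

    // Minimal sketch (not the plugin's API): make AJAX states bookmarkable and
    // back-button friendly by keying them off the URL hash.
    // loadContent() is a hypothetical function that fetches and injects content.
    function loadContent(state) {
      // e.g. request '/fetch_content.php?item=' + encodeURIComponent(state)
      // and insert the response into the page.
    }

    // Navigate by changing the hash instead of silently swapping content:
    //   window.location.hash = 'fetch_content=xyz';
    window.addEventListener('hashchange', function () {
      loadContent(window.location.hash.slice(1)); // back/forward and new hashes
    });

    window.addEventListener('DOMContentLoaded', function () {
      if (window.location.hash) {
        loadContent(window.location.hash.slice(1)); // restore a bookmarked state
      }
    });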
You probably also want to create a page that lets search engines crawl your website through links like http://www.mysite.com/foobar.php#!fetch_content=xyz. The #! (hashbang) is a convention recognized by Google for crawling and indexing AJAX content.
reference: http://googlewebmastercentral.blogspot.com/2007/11/spiders-view-of-web-20.html
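For context, under that scheme a crawler that sees http://www.mysite.com/foobar.php#!fetch_content=xyz requests http://www.mysite.com/foobar.php?_escaped_fragment_=fetch_content=xyz instead, and the server is expected to return an HTML snapshot of the AJAX content. Here is a rough sketch of that server side, written for Node.js/Express rather than the PHP in the example URL, with renderSnapshotFor as a hypothetical helper:

    // Rough sketch of the server side of the AJAX crawling scheme.
    // Assumes Node.js with Express; the site above uses PHP, so adapt the idea.
    var express = require('express');
    var app = express();

    app.get('/foobar', function (req, res) {
      var fragment = req.query._escaped_fragment_;
      if (fragment !== undefined) {
        // Crawler request for .../foobar?_escaped_fragment_=fetch_content=xyz:
        // answer with a static HTML snapshot of what the AJAX call would render.
        res.send(renderSnapshotFor(fragment)); // hypothetical helper
      } else {
        // Normal visitors get the regular page, which then runs the AJAX code.
        res.sendFile(__dirname + '/foobar.html');
      }
    });

    // Hypothetical helper that builds plain HTML for a given fragment value.
    function renderSnapshotFor(fragment) {
      return '<html><body><p>Snapshot for ' + fragment + '</p></body></html>';
    }

    app.listen(3000);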
The don'ts would be interesting, but here's a do, which applies to all of JS as well.
Make sure that all links degrade gracefully. This can easily be achieved by giving the links real URLs that lead to the same content that would otherwise be loaded via AJAX, so the site still works when JS is not enabled. This also makes crawling your website possible.
You would also have to prevent the default navigation behavior of all the affected links (for example with preventDefault).
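As a rough illustration of that pattern (not from the original answer), here is a minimal sketch in plain JavaScript: the links carry real, crawlable URLs, and when JS is available the default navigation is prevented and the content is fetched via AJAX instead. The a.ajax selector and the #content container are assumptions made for the example.

    // Sketch: links have real URLs (crawlable and usable without JS); when JS runs,
    // the default full-page navigation is prevented and content is loaded via AJAX.
    // Assumes links marked with class="ajax" and a <div id="content"> container.
    document.addEventListener('click', function (event) {
      var link = event.target.closest('a.ajax');
      if (!link) return;

      event.preventDefault(); // stop the normal full page load

      fetch(link.href, { headers: { 'X-Requested-With': 'XMLHttpRequest' } })
        .then(function (response) { return response.text(); })
        .then(function (html) {
          document.getElementById('content').innerHTML = html;
          history.pushState(null, '', link.href); // keep the URL and history in sync
        });
    });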
I have developed a website using AngularJS and ASP.NET Web API.
The problem is that the AJAX-rendered content is not crawlable by Google, and no one can find the website using Google search.
After reading many articles on this issue, including:
one with many outgoing links to further explanations, Google's AJAX crawling protocol, and a Stack Overflow question, I still couldn't find a proper solution. The articles that mention ASP.NET solutions talk about MVC, while I need only simple REST via Web API; the other articles don't cover ASP.NET at all.
Is there any simple explanation?
I'm the one who asked this same question long ago, so I will answer from my experience:
Firstly, if all your content is accessible via unique URIs (including the hashbang, if you use it), modern search engines should index it just fine. In fact, Google can index JavaScript-generated content now. You can try that via Google Webmaster Tools and see how your site is indexed.
Secondly, there are libraries that help you serve pre-rendered content to search engines if you need to, but in my case I didn't bother much with it, since Google indexes JS content nicely.
I've seen others ask this question, and maybe I'm missing something or this is outdated, but I don't see why AngularJS needs to be an issue with SEO.
Say you have a landing page and it has a bunch of links. Assuming you're using HTML5 mode in AngularJS (and I'm not sure that's 100% necessary) and something like ngRoute, then the links on the landing page can work both as "Angular" (JavaScript) links and as "old school" (full page load) links.
If you're a human user, you can click a link and it will do its Angular magic and adjust the content without loading the full page. OK, all fine.
But if you instead copy the link and paste it in a new tab or new browser, it will still work - assuming you've set up routes correctly.
I'm not an SEO expert by any stretch of the imagination, but as I understand it, having links that load pages, and having those pages contain real and useful content, is the core of SEO, and done this way AngularJS should work fine. The key thing to check is that if you copy and paste a link (rather than just clicking it), it still works.
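To make that concrete, here is a minimal sketch of the kind of setup described above, for AngularJS 1.x with ngRoute. The route path, template URL and controller name are made-up examples, and the server must also be configured to serve the app for deep links (plus a <base href="/"> tag in the page) for HTML5 mode to work.

    // Sketch of the setup described above: AngularJS 1.x, ngRoute and HTML5 mode,
    // so the same URLs work as in-app links and as full page loads.
    // Route path, templateUrl and controller name are made-up examples.
    angular.module('app', ['ngRoute'])
      .config(['$locationProvider', '$routeProvider',
        function ($locationProvider, $routeProvider) {
          // Real-looking URLs (/products/42) instead of hash URLs (#/products/42).
          // Requires <base href="/"> in index.html and a server rewrite rule so a
          // pasted deep link still returns the Angular application.
          $locationProvider.html5Mode(true);

          $routeProvider
            .when('/products/:productId', {
              templateUrl: 'partials/product.html',
              controller: 'ProductCtrl'
            })
            .otherwise({ redirectTo: '/' });
        }]);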
I am making a website which shows all the latest information on the front page. The website works entirely through AJAX, so all the data lives in the server's databases. If someone searches through a search engine for particular data that is available on my site, will the search engine search the database?
By default, no: content that is only accessible via AJAX will not be crawled by Google. There is a convention you can use, however, to ensure that it does get crawled: the hashbang.
When Google (and other search engines) encounter a link that starts with "#!", they will crawl the dynamic content returned by the AJAX call. To take advantage of this, rather than having something like:
<a href="#">trigger ajax</a>
...you will want to use something like:
<a href="#!my_content">trigger ajax</a>
There is some more info on the technique here (and on lots of other sites; just Google it): http://www.seomoz.org/blog/how-to-allow-google-to-crawl-ajax-content
Some insight: http://coding.smashingmagazine.com/2011/09/27/searchable-dynamic-content-with-ajax-crawling/
Also, you might want to look into HTML caching, as it will boost your SEO performance.
We have web applications, elgifto.com and roadbrake.com, in which we used AJAX in many places, especially to update major portions of a page. All the important functionality of elgifto.com was implemented using AJAX. Now we realize there are a few issues caused by the AJAX implementation:
1. All the content implemented using AJAX is not available to the SEO bots, and it is hurting the page rank of our site.
2. Users are not able to bookmark some of the pages, as they are only available through AJAX.
3. When we want to direct the user from one page, through an anchor link, to another page that uses AJAX, we find it difficult.
So now we are thinking of removing AJAX from these pages and using it only for small pieces of functionality, such as marking a question as a favorite on SO. Before going ahead and removing it, we want to know experts' opinions on this. Thanks.
The problem is not "AJAX" per se, but your implementation of it. Just as a for instance, you can fix the 'bookmark' problem like google maps does it: provide a generated link for each state of your webapp.
SEO can befixed by supplying various of these state-links to the crawlers, either organically trough links in your site, or by supplying a list (sitemap).
If you implement the fix for 2, you can fix 1 and 3 with those same links.
In the end, you must figure out whether the effort is worth it, and whether you are not overusing AJAX, of course, but the statements you've made are not set in stone at all.
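As an illustration of the "link for each state" idea (a sketch, not the answer's own code): whenever the application changes state, record that state as a real URL with the History API so it can be bookmarked, linked to from elsewhere on the site, and listed in a sitemap. applyState and stateFromPath are hypothetical helpers, and the /app/ path is made up for the example.

    // Sketch: give each significant state of the web app its own URL, the way
    // Google Maps does. applyState() and stateFromPath() are hypothetical helpers.
    function applyState(state) {
      // Fetch and display whatever this state represents (category, item, filters...).
    }

    function stateFromPath(pathname) {
      // Derive a state object from a path like /app/xyz.
      return { id: decodeURIComponent(pathname.split('/').pop()) };
    }

    // When the user changes state inside the app, record it as a real URL.
    function goToState(state) {
      history.pushState(state, '', '/app/' + encodeURIComponent(state.id));
      applyState(state);
    }

    // Back/forward buttons and bookmarked state links restore the state from the URL.
    window.addEventListener('popstate', function (event) {
      applyState(event.state || stateFromPath(window.location.pathname));
    });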
I'm constantly developing AJAX-based websites, with no SEO problems at all. You just have to use it in the best possible way.
For example, I have a website with normal links pointing to normal web pages (PHP pages); this handles normal navigation when a user doesn't have JS enabled. But if a user has JS enabled, a script changes the links' behavior so that only the needed content of the target page is fetched.
This way you still have physically separate web pages with all their content, which will be indexed as normal.
My question is: why don't more websites use AJAX to load their page content?
Is it because JS can be switched off, or is there some concern about a security problem?
Probably for two reasons:
Users with Javascript disabled won't see anything.
Pages loaded through AJAX aren't crawlable by search engines. You want your content to be as accessible as possible, so that people searching the web will find your application.
Because in most cases it doesn't make the site any more comfortable to use (often the effect would be the opposite). "Ajax" shouldn't be used to load entire pages unless you have a very good reason for it.
One word: SEO. Search engines execute no JavaScript -> they do not see the content -> they do not index the page.
I am planning to start a fully AJAX-based site project, and I was wondering about SEO.
The site will have URLs like www.mysite.gr/#/category1 etc.
Can Google crawl the site?
Is there anything I need to be aware of regarding full-AJAX sites and SEO?
Any reading suggestions are welcome
Thanks
https://stackoverflow.com/questions/768233/do-hashes-in-urls-affect-seo
You might want to read about so-called progressive enhancement.
Google supports indexing of AJAX sites, but unfortunately it involves extra work for the developer. See http://code.google.com/web/ajaxcrawling/docs/getting-started.html
I don't think Google is capable of doing so (yet)
http://googlewebmastercentral.blogspot.com/2009/10/proposal-for-making-ajax-crawlable.html
However, you can of course make your site usable with or without JavaScript. That way, browsers get all the eye candy, and Google (and text browsers) can still navigate your site.
In addition to SEO, you also need to think about usability standards here. A site that is that reliant on AJAX isn't going to work for things like screen readers, as well as spiders. You need a system for graceful degradation. A website that can't function without JavaScript isn't really a functioning website.
The search engines will spider the initial page load - what happens to the page (via AJAX) after that is irrelevant to listings.
Google itself doesn't crawl AJAX content, but it recommends a mechanism for making it crawlable. For this you first need to change # to #!.
The whole process of making AJAX content SEO-friendly is explained here, along with simple ASP.NET code to start working with it.
Imagine having to hit the refresh button in your browser to update your Twitter feed, rather than just hitting the button on the page itself and having it update instantly. These are the types of problems that AJAX solves, although it does come with its pitfalls. Google might claim it's able to crawl and parse AJAX websites, yet it's risky to just take its word for it and leave your website's organic traffic up to chance. Even though Google can usually index dynamic AJAX content, it's not always that simple. This guide covers some of the things that can go wrong and how you can make sure your AJAX website is crawlable: https://prerender.io/ajax-seo/
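For what it's worth, the core idea behind such prerendering services can be sketched in a few lines: render the page in a headless browser so that the AJAX-loaded content ends up as plain HTML a crawler can read. The sketch below assumes Node.js and the Puppeteer library, neither of which is mentioned in the article.

    // Minimal sketch of prerendering: load the page in a headless browser and
    // capture the fully rendered HTML, AJAX content included.
    // Assumes Node.js and the Puppeteer library.
    var puppeteer = require('puppeteer');

    async function renderSnapshot(url) {
      var browser = await puppeteer.launch();
      var page = await browser.newPage();
      // Wait until network activity settles so AJAX-loaded content is present.
      await page.goto(url, { waitUntil: 'networkidle0' });
      var html = await page.content();
      await browser.close();
      return html;
    }

    // Usage: serve this HTML to requests identified as coming from crawlers.
    renderSnapshot('https://www.example.com/').then(function (html) {
      console.log(html.length + ' characters of rendered HTML');
    });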