Indexing an AJAX website - are parallel pages still needed?

I am building a single-page website that pulls in text/HTML fragments via AJAX. The fragments are static HTML files stored on the server. Do I still have to provide parallel static pages for SEO indexing? I am hoping this technology has improved.

Please be more specific, so we can help you better.
Escaped fragments are the recommended way to get Google to index your AJAX site. Read more about them here.
I don't know what you mean by parallel, but ?_escaped_fragment_=key=value should result in an HTML snapshot of the page #!key=value
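For illustration, here is a minimal sketch of serving such snapshots, assuming a Node.js/Express server; the snapshots/ directory and file naming are hypothetical and not part of Google's specification:

    // When Google sees a #!key=value URL it requests ?_escaped_fragment_=key=value
    // instead; the server can then answer with a pre-rendered HTML snapshot.
    var express = require('express');
    var fs = require('fs');
    var app = express();

    app.get('/', function (req, res) {
      var fragment = req.query._escaped_fragment_;
      if (fragment !== undefined) {
        // e.g. #!page=about  is requested as  ?_escaped_fragment_=page=about
        var file = 'snapshots/' + fragment.replace(/[^\w=-]/g, '') + '.html';
        fs.readFile(file, 'utf8', function (err, html) {
          if (err) return res.status(404).send('No snapshot available');
          res.send(html);
        });
      } else {
        // Normal visitors get the AJAX single-page application.
        res.sendFile(__dirname + '/index.html');
      }
    });

    app.listen(3000);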

Related

Cached HTML pages and dynamic content

I have a caching plugin that creates static HTML pages from my PHP/MySQL driven site.
On the homepage I have a listing of content (<ul><li>content</li></ul>).
I have a drop-down (select) that loads a different set of content.
Obviously this isn't playing well with the caching plugin.
I don't have the website yet; I'm in the thinking phase and trying to understand what problems I might face. Could you help me with this little part I explained above?
Not 100% sure what you mean, but you should be able to stop browser caching by adding a random GET variable to the URL, like time() for example; your browser should then see it as a different page.
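For example (a rough sketch assuming jQuery; the /content.php endpoint, the set parameter, and the element IDs are made up), the dynamic request can carry a timestamp so that neither the browser nor an intermediate cache reuses a stale response:

    $('#content-select').on('change', function () {
      $.ajax({
        url: '/content.php',
        data: { set: $(this).val(), _ts: new Date().getTime() }, // cache-busting value, like time() in PHP
        cache: false, // jQuery also appends its own "_" timestamp parameter to GET requests
        success: function (html) {
          $('#content-list').html(html); // swap in the new <ul> content
        }
      });
    });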

How to do Search Engine Optimization for Web Applications

I am currently developing a single-page web application that is focused on functionality. It doesn't really have or need long paragraphs of text, and those that are there are loaded dynamically via JavaScript and AJAX.
Normally, search engine optimization tips revolve around getting the right word count percentages, etc. But what are the best practices for SEO when your application is heavily reliant on AJAX? A landing page with descriptive text is not an option - it's important that users can immediately start using the application, and it's rather obvious what it does once it's loaded.
With meta tags fading in importance in modern search engines, is link-building the only solution, or are there tricks to help search engines know what an AJAX-based web application is about?
Google has published a specification suggesting how you might make an AJAX web application crawlable by their robots.
The fundamental principle is that you make a static HTML version of key pages, and let the crawler know these pages exist, and the relationships between them, using the #! URL fragment syntax.
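On the client side, the same #! fragment can drive which static file is pulled in; a minimal sketch, assuming jQuery and a hypothetical fragments/ directory of static HTML files:

    // A URL like http://example.com/#!page=pricing is handled in the browser,
    // while Google requests ?_escaped_fragment_=page=pricing and is served a
    // static snapshot of the same content.
    function loadFromHash() {
      var hash = window.location.hash;              // e.g. "#!page=pricing"
      if (hash.indexOf('#!') !== 0) { return; }
      var key = hash.slice(2);                      // "page=pricing"
      $('#main').load('fragments/' + encodeURIComponent(key) + '.html');
    }

    $(window).on('hashchange', loadFromHash);
    $(loadFromHash); // also run once when the page first loads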
Somewhere you'll have to explain:
What's so great about your app
How your app works ("for dummies" style)
Who you are and why you did it
etc
You can use all this content to do SEO (no AJAX is needed for that).
Forget about making AJAX crawlable if you don't have any text inside your app anyway.

Search engine optimization dos and don'ts for AJAX

I've created an AJAX-enabled web application. In my application, all content [that I want to appear in search results] is loaded using AJAX. However, I have observed that despite a valid sitemap submitted to Google, my page ranking is very poor.
What do I need to do, and what should I avoid, in order to improve my page ranking?
Thanks in advance.
You probably want to make the application bookmarkable and history-aware. There are many ways; one of them is jQuery's history plugin: https://github.com/tkyk/jquery-history-plugin (a rough sketch of doing this by hand follows after these points).
You probably also want to create a page that lets search engines crawl your website through links like http://www.mysite.com/foobar.php#!fetch_content=xyz. The #! is a convention recognized by Google for crawling and indexing AJAX content.
reference: http://googlewebmastercentral.blogspot.com/2007/11/spiders-view-of-web-20.html
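If you would rather not use a plugin, a minimal hand-rolled sketch looks like this (assuming jQuery and the HTML5 History API; the fetch_content parameter and /fragment.php endpoint are just placeholders):

    function showContent(id, push) {
      // Load the fragment that the crawlable URL would also serve.
      $.get('/fragment.php', { fetch_content: id }, function (html) {
        $('#content').html(html);
      });
      if (push && window.history && history.pushState) {
        // Put a real, bookmarkable URL into the address bar.
        history.pushState({ id: id }, '', '?fetch_content=' + encodeURIComponent(id));
      }
    }

    // Restore the right content when the user presses back/forward.
    window.onpopstate = function (event) {
      if (event.state && event.state.id) {
        showContent(event.state.id, false);
      }
    };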
Don'ts would be interesting, but here's a do, which applies to all of JS as well.
Make sure that all links degrade gracefully. This can easily be achieved by giving the links real URLs that lead to the same content that would otherwise be loaded via AJAX, so the site still works when JS is not enabled. This also makes crawling your website possible.
You would also have to prevent the default behaviour for all the affected links, as in the sketch below.
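A minimal sketch of that pattern, assuming jQuery (the ajax-nav class and #content IDs are made up): the links keep real hrefs that a crawler or a no-JS visitor can follow, and JS-enabled browsers intercept the click and load the same URL via AJAX:

    // Markup keeps real, crawlable URLs, e.g.:
    //   <a class="ajax-nav" href="/articles/seo-basics.html">SEO basics</a>
    $('a.ajax-nav').on('click', function (event) {
      event.preventDefault();                  // disable the default full-page navigation
      var url = $(this).attr('href');          // same URL a non-JS visitor would follow
      $('#content').load(url + ' #content');   // fetch the page and insert only the matching fragment
    });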

Why do only a few webpages use AJAX to load content?

My question is: why don't more webpages use AJAX to load their content?
Is it because users can switch off JS, or is there some concern about a security problem?
Probably for two reasons:
Users with JavaScript disabled won't see anything.
Pages loaded through AJAX aren't crawlable by search engines. You want your content to be as accessible as possible, so people searching the Web will find your application.
Because in most cases it doesn't make the site any more comfortable to use (often the effect would be the opposite). "Ajax" shouldn't be used to load entire pages unless you have a very good reason for it.
One word: SEO. Search engines execute no JavaScript -> do not see the content -> do not index the page.

Does Wicket hamper SEO or search engines' ability to crawl?

We're coming from GWT projects, and because of problems with SEO we're not keen on GWT for our next project, so we're going to move clear of it (mainly because SEO is a high priority for this next project). In choosing a new framework, I'm looking at Wicket and liking what I've seen so far. I've only done a few tutorials, but from the WAR layout in those tutorials it looks like most of the HTML pages are in the WEB-INF folder.
Is this going to cause problems for SEO and for search engines crawling the site's files?
Ideally, I'd like to use Wicket with some AJAX and deploy to Google App Engine.
It does not matter if your .jsps (or whatever) are stored in /WEB-INF. It just means they cannot be accessed directly by going to http://webapp/path/to/jsp.
For SEO think about:
Meaningful URLs and link text (i.e. URLs should be similar to expected search engine queries)
Crawlable pages (make sure all your content can be reached by a non-JS-enabled bot, i.e. don't make content available only through AJAX). A sitemap might help
Look into Wicket's bookmarkable page links and UrlCodingStrategies for a very powerful combination to use in SEO. Basically all your links and parameters can be encoded as /a/static/url, regardless of (changing) implementation on the backend.
If SEO is really important for your project, then you might reconsider using a lot of AJAX, since crawlers won't execute JavaScript and so won't read what your AJAX calls return. That being said, the SEO quality of your site is not really based on the framework you use. Just always think about img alt attributes, links, meta tags, the title, and h1 headings on every page and you should be fine. Also, always try to post links to your site on other websites to gain visibility and importance with crawlers.
