Loading AJAX Response Data with AdSense Codes Inside

I'm 10000000% sure that this question has been asked before; however, most of the responses I came across were from back in 2005, 2006, and so on, and almost all of the questions themselves were too general. I'm asking this so that anyone else who needs to figure this out won't have to dig through about 50 webpages to get an idea.
My question is simple: I have a webpage with Google Ads embedded in its HTML. The site was first developed as a static HTML site where each link reloaded a new page. Never mind the backend technology; the site itself produces purely dynamic content. It is now close to completion, and a fully-AJAX listener has been added to all the links. When any link is clicked, JavaScript takes over, parses the link, and sets the URL using pushState or the hashbang. The page is then requested from the server via AJAX, and the content is updated with document.getElementById('container').innerHTML = ajax.responseText; This way, any content replaced by AJAX is still reachable by its URL.
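In outline, the listener works like this (a simplified sketch; loadPage stands in for the real function):

// Intercept internal links, update the URL, and swap the page content in.
document.addEventListener('click', function (e) {
  var link = e.target.closest('a');
  if (!link) return;
  e.preventDefault();
  history.pushState(null, '', link.href); // or: location.hash = '#!' + path;
  loadPage(link.href);
});

function loadPage(url) {
  var ajax = new XMLHttpRequest();
  ajax.open('GET', url, true);
  ajax.onload = function () {
    document.getElementById('container').innerHTML = ajax.responseText;
  };
  ajax.send();
}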
This all works fine, but the responseText may, and in fact will, contain Google Ads, and I was wondering how to display them as if this were a static page. Clearly this doesn't work as-is. Here are the options I've come across:
Use an IFrame:
An IFrame seems to be an effective way to load the content: just put the AdSense code into a simple adsense.html file, load that file in an iframe, and let the browser handle the rest.

Putting the ad code directly into the page isn't possible:
it's against their TOS
document.write(), which the ad script relies on, does nothing when the script arrives via an AJAX request
Your best option is:
Create a simple iframe:
<iframe src="advert.html"></iframe>
and in advert.html, add your ad code.
It then loads fine, without problems.
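For example, advert.html can be nothing more than the stock AdSense snippet of the time (a sketch; swap in your own publisher and slot ids):

<html>
<body>
<script type="text/javascript">
  google_ad_client = "ca-pub-XXXXXXXXXXXXXXXX"; // your publisher id
  google_ad_slot = "XXXXXXXXXX";                // your ad unit id
  google_ad_width = 728;
  google_ad_height = 90;
</script>
<script type="text/javascript" src="http://pagead2.googlesyndication.com/pagead/show_ads.js"></script>
</body>
</html>

Because the iframe is a separate document, the document.write() calls inside show_ads.js run normally there.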
Good luck

Related

Why is my AJAX content not being indexed by Google?

I have tried to set my site up (http://www.diablo3values.com) according to the guidelines set out here: https://developers.google.com/webmasters/ajax-crawling/ However, it appears that Google has updated its index (I can see the revisions to the meta description tags), but the AJAX content does not show up in the index.
I am trying to use the “Handle pages without hash fragments” option.
If you view either of the following:
http://www.diablo3values.com/?_escaped_fragment_=
http://www.diablo3values.com/about?_escaped_fragment_=
you will correctly see the HTML snapshot with my content. (Those are the two pages I am most concerned about.)
Any ideas? Am I doing something wrong? How do you get Google to correctly recognize the tag?
I'm typing this as an answer, since it got a little too long to be a comment.
First of all, your links seem to point to localhost:8080/about, and not /about, which is probably why Google doesn't index them in the first place.
Second, here's my experience with pushState URLs and Google's AJAX crawling:
My experience is that AJAX crawling with pushState URLs is handled a little differently by Google than with hashbang URLs. Since Google won't know that your URL is a pushState URL (it looks just like a regular URL), you need to add <meta name="fragment" content="!"> to all your pages, not only the "root" page. And Google doesn't seem to know that the pages are part of the same application, so it treats every page as a separate AJAX application. So the Googlebot will never actually build a navigation structure inside _escaped_fragment_, like _escaped_fragment_=/about, as it would with a hashbang URL (#!/about). Instead, it will request /about?_escaped_fragment_= (which you apparently already have set up). This goes for all your "deep links": instead of /?_escaped_fragment_=/thelink, Google will always request /thelink?_escaped_fragment_=.
But as I said initially, the reason it doesn't work for you is probably that you have localhost:8080 URLs in your _escaped_fragment_-generated HTML.
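For reference, that opt-in tag goes in the <head> of every page served to browsers:

<head>
  <meta name="fragment" content="!">
</head>

With it in place, Googlebot re-requests each page as /path?_escaped_fragment_= and indexes whatever HTML that returns.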
Googlebot only knows to crawl the escaped fragment if your URLs conform to the hashbang standard. As users navigate your site, your URLs need to be:
http://www.diablo3values.com/
http://www.diablo3values.com/#!contact
http://www.diablo3values.com/#!about
Googlebot actually needs to see these URLs in the source code so that it can follow them. Then it knows to download the following URLs:
http://www.diablo3values.com/?_escaped_fragment_=contact
http://www.diablo3values.com/?_escaped_fragment_=about
On your site you appear to be loading a new page on each click, and then loading the content of each page via AJAX too. This is not how I would expect an AJAX site to work. Usually the purpose of using AJAX is so that the user never has to load a whole new page. When the user clicks, the new content section is loaded and inserted into the page. You serve the navigation once and then you only serve escaped fragments of the content.
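A rough server-side sketch of that split, in Node (renderSnapshot and renderShell are hypothetical helpers, not real APIs):

var http = require('http');
var url = require('url');

function renderSnapshot(fragment) {
  // Hypothetical: return plain, fully-rendered HTML for the bot.
  return '<html><body>Static snapshot for "' + (fragment || 'home') + '"</body></html>';
}

function renderShell() {
  // Hypothetical: return the normal page shell; JavaScript fills in the content.
  return '<html><body><div id="container"></div>' +
         '<script src="/app.js"></script></body></html>';
}

http.createServer(function (req, res) {
  var q = url.parse(req.url, true).query;
  if ('_escaped_fragment_' in q) {
    res.end(renderSnapshot(q._escaped_fragment_)); // Googlebot's request
  } else {
    res.end(renderShell()); // normal visitors
  }
}).listen(8080);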

How does USA Today display URIs for news docs?

I am developing a web app/message board in AJAX. I've come to the part where I need to decide how to display threads.
Should I refresh a completely new page for each thread, or load it via AJAX? Obviously, I want each thread to be crawlable, linkable, and saveable as a browser favorite.
Then I saw USA Today's website (www.usatoday.com/news). It's very interesting how they load the page through a popup window, change the URI, and keep the data in the background.
This is exactly what I want, but I don't know what they are doing.
Can anyone else decipher this or lead me down the right path?
My impeccable googling skills have led me to believe that the answer lies in pushState.
http://www.seomoz.org/blog/create-crawlable-link-friendly-ajax-websites-using-pushstate
Essentially, it appears they are...
using the HREF of the clicked link to change the URI via pushState;
using AJAX to load the contents of the page the link points to;
on close, most likely using data from the newly loaded page to figure out which section it was under (sports, entertainment, etc.) and reloading that page.
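A rough sketch of that flow (the selectors, loadArticle, and the overlay id are guesses for illustration, not USA Today's actual code):

// 1. Click: change the URI without a reload and fetch the article.
document.addEventListener('click', function (e) {
  var link = e.target.closest('a.article-link');
  if (!link) return;
  e.preventDefault();
  history.pushState({ article: link.href }, '', link.href);
  loadArticle(link.href);
});

// 2. Back/close: popstate fires and the section page is restored.
window.addEventListener('popstate', function (e) {
  if (e.state && e.state.article) {
    loadArticle(e.state.article);
  } else {
    location.reload(); // back on the section page (sports, entertainment, ...)
  }
});

function loadArticle(href) {
  var xhr = new XMLHttpRequest();
  xhr.open('GET', href, true);
  xhr.onload = function () {
    document.getElementById('overlay').innerHTML = xhr.responseText;
  };
  xhr.send();
}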

Links in pages won't load using Ajax

I am using an AJAX script from Dynamic Drive on my site to load content into my div. It has worked great for me until I created a page where I want links. For some reason, if a page contains even a single link, it will not load; I can click all I want and nothing appears. If the page is purely text content, it loads fine. Is this a flaw in the script, or am I doing something wrong? My intention is to have a "Store" section so I can use Amazon Affiliates, but I can't get that page to load even with a simple link pointing to Amazon.com. Unfortunately, this AJAX script has been the only way I've managed to get content to load into my main div. Oddly, the links section on my site does appear and that page loads, but not my "store" page.
My site is: http://veterinarycare.atspace.cc
I'm not asking for a direct code, but just a step in the right direction.
'store.html' gives a 404 Not Found... does this file exist? That is probably your problem. Your links.html page, for example, has a link to the ASPCA and that works fine.
You may also want to look into jQuery, as it is a bit neater for doing AJAX and other JavaScript effects. You could probably get all that JavaScript mumbo jumbo down to five lines or so.
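For example, a basic jQuery version of the loader (assuming your links live in a nav element and the target div is #maindiv) is just:

$('nav a').on('click', function (e) {
  e.preventDefault();
  $('#maindiv').load(this.href); // fetch the page and insert it into the div
});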
Also remember that your site isn't going to be particularly Google-friendly with all the content being loaded in via JavaScript.

AJAX-generated content, crawling, and blacklisting

My website uses AJAX.
I've got a user list page which lists users in an AJAX table (with paging, extra information, and so on).
The URL of this page is:
/user-list
The user list is created by AJAX. When the user clicks on a user, he is redirected to a page whose URL is /member/memberName.
So we can see that AJAX is used here to generate content, not to manage navigation (with the # character).
I want to detect bots so that all pages can be indexed.
So, in AJAX mode I want to display a table with paging and cool AJAX effects (more info...), and when I detect a bot I want to display all users (without paging) with a link to each member page, like this:
JohnBob...
Do you think I could get blacklisted with this technique? If so, could you please suggest an alternative that keeps these clean URLs and doesn't require redeveloping the user list without AJAX?
Google supports a specification for making AJAX crawlable:
http://code.google.com/web/ajaxcrawling/docs/specification.html
I did an experiment and it works:
http://seo-website-designer.com/SEO-Ajax-Google-Solution
As this is a Google specification, you won't get penalised (unless you abuse it).
That said, only Google supports it at the moment (AFAIK).
Also, I believe following the concept of Progressive Enhancement is a better approach: create a working HTML website first, then let the JavaScript enhance it.
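A sketch of that enhancement for a paged user list (selectors are illustrative): the server renders real paging links, and the script upgrades them in place, so bots still follow the plain hrefs:

// Server already renders: <a class="page" href="/user-list?page=2">2</a>
$('#user-list').on('click', 'a.page', function (e) {
  e.preventDefault();
  // jQuery's 'url selector' form keeps only the matching part of the response
  $('#user-list').load(this.href + ' #user-list > *');
});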
Maybe use the URLs with an onclick to trigger your AJAX scripting? Something like this (the handler name is illustrative):
<a href="/user-list" onclick="loadUserList(); return false;">Some URL</a>
I don't think Google would punish you for this: you primarily use JavaScript, but you do provide a fallback for their bot, so your site doesn't get any less accessible.
EDIT
OK, I misunderstood. Then my guess would be that you basically have two options:
1. Write a different part of your site where bots end up, or
2. Rewrite your current site to, for example, always serve a 'full' page, with an option to fetch only, say, the content div (see the sketch below). Then you can fetch just the content with JavaScript, while bots always get a complete page.
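A sketch of option 2 in plain JavaScript (assuming the content sits in <div id="content"> on every page):

// Every URL returns a full page; the script extracts just the content div.
function loadContentOnly(href) {
  var xhr = new XMLHttpRequest();
  xhr.open('GET', href, true);
  xhr.onload = function () {
    // DOMParser with 'text/html' needs a reasonably modern browser
    var doc = new DOMParser().parseFromString(xhr.responseText, 'text/html');
    document.getElementById('content').innerHTML =
        doc.getElementById('content').innerHTML;
  };
  xhr.send();
}

Bots and no-JS visitors always get the complete page; scripted visitors swap only the div.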

External HTML page inside an AJAX iframe?

Hi, masters of web development.
I have a not-so-simple question this time. I have a simple external HTML page that I want to include in my site. The page contains a submit form or something like that, and I want to send the form data without reloading the whole page. So I decided to put the HTML page inside an iframe. But some people said that this is older technology, that Google doesn't like iframes, etc. So I want to use something like AJAX or jQuery to load that external HTML page and submit the form without reloading the whole page. :)
Any suggestions on how to do this?
Thanks in advance :)
Do you really need Google to index that iframe form? If that's the only iframe you have throughout the site, it ain't going to be a problem in terms of Google indexing.
About using the iframe: if you are not comfortable learning and building an AJAX-style form, you'll still be fine (like what Frankie commented). Just make sure the form works, is usable, and is compatible with popular browsers.
You want to use the jQuery Form Plugin. It's very straightforward, and it makes it easy to turn any normal HTML form into an AJAX form.
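A minimal example with that plugin (assuming your form has id="myForm" and the plugin file is jquery.form.js):

<script src="jquery.js"></script>
<script src="jquery.form.js"></script>
<script>
  // Turn the normal HTML form into an AJAX form: it now submits in the background.
  $(function () {
    $('#myForm').ajaxForm(function (responseText) {
      alert('Thanks! Submitted without a page reload.');
    });
  });
</script>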
