Pros and cons of AJAX-loaded content

It's pretty nice to sort a dataset by a number of filters and get the results shown instantly, right?
My solution would be to POST the "filter" (i.e. form) parameters to a page called dataset.php, which returns the appropriate dataset as compiled HTML that can be loaded straight into my page.
So, besides this being a total no-no for SEO and for users who have disabled JavaScript, it seems like a good solution that is easy to build on in the future.
However, I don't yet have the experience to judge whether it is a good or bad solution overall. What should our concerns be with an AJAX-fetched dataset?

So, besides this being a total no-no for SEO and for users who have disabled JavaScript, it seems like a good solution that is easy to build on in the future.
Not entirely true: there are solutions out there like jQuery Ajaxy that enable AJAX content with history tracking while remaining friendly to SEO and to users with JavaScript disabled. You can see this in action on my own site, Balupton.com, with evidence that it's still SEO friendly.
However, I don't yet have the experience to judge whether it is a good or bad solution overall. What should our concerns be with an AJAX-fetched dataset?
Having AJAX-loaded content is great for the user experience: it's fast, responsive, and just nice to look at. If you don't have history tracking, though, it can be quite confusing, especially if you use AJAX-loaded content for things like whole pages rather than just sidebar content, because then you break away from the behavior users are accustomed to. Another caveat is Google Analytics tracking for the AJAX pages. These shortcomings, the ones you've already mentioned, and some others mentioned elsewhere are all quite difficult problems.
jQuery Ajaxy (as mentioned before) provides a nice high-level solution for nearly all of these problems, though it can involve a learning curve if you haven't worked with a controller architecture before; most people pick it up rather quickly.
For instance, to enable history-trackable AJAX content for changing a set of results with jQuery Ajaxy, you don't actually need any server-side changes. You could do something like this at the bottom of your page:
$('#results ul.pages li.page a').addClass('ajaxy ajaxy-resultset').ajaxify();
Then set up an Ajaxy controller like so, to fetch just the content we want from the response:
'resultset': {
    selector: '.ajaxy-resultset',
    request: function () {
        // Hide the current content ($result is assumed to be defined
        // elsewhere, e.g. var $result = $('#result');)
        $result.stop(true, true).fadeOut(400);
        return true;
    },
    response: function () {
        // Prepare
        var data = this.State.Response.data;
        var Action = this;
        // Extract the new results from the response and show them
        var newResultContent = $(data.content).find('#result').html();
        $result.html(newResultContent).fadeIn(400, function () {
            Action.documentReady($result);
        });
        return true;
    }
}
And that's all there is to it; most of the above is just copied and pasted from the demonstration page. Of course this isn't ideal, as we return the entire page in our AJAX responses, but that would have happened anyway. You can always upgrade the script a bit more and, on the server side, check for the XHR header: if it is set (meaning this is an AJAX request), render just the results part rather than the whole page.
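As a sketch of that server-side check (the function name is hypothetical; jQuery sends an X-Requested-With: XMLHttpRequest header with its AJAX requests):

```javascript
// Hypothetical sketch: decide whether to render the full page or just the
// results fragment. jQuery sets X-Requested-With: XMLHttpRequest on its
// AJAX requests, so the server can key off that header.
function wantsFragment(headers) {
  return headers['x-requested-with'] === 'XMLHttpRequest';
}
```

In a request handler you would pass in the (lowercased) request headers and render only the results partial when this returns true, and the full page otherwise.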

You already named the two big ones. Now all you need to do is make sure all the functionality works without JavaScript (reload the page with the requested dataset), and use AJAX to improve it (load the requested dataset without reloading the page).
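One way to keep the two paths in sync is to build the same URL for both; a minimal sketch (dataset.php is from the question, the helper name is hypothetical):

```javascript
// Sketch: build the dataset.php query string from the filter values, so the
// same URL serves both the no-JS form submission and the AJAX request.
function datasetUrl(filters) {
  return 'dataset.php?' + new URLSearchParams(filters).toString();
}
```

For example, `datasetUrl({ sort: 'price', color: 'red' })` gives `'dataset.php?sort=price&color=red'`, which the form can POST to directly and the AJAX path can fetch.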

This largely depends on the context. In some cases users today may expect the results to be delivered instantly, without the page refreshing itself. It can also improve the overall user experience; again, this largely depends on the context.
However, it also has its pitfalls. Will the user need to return to previous pages after the AJAX content is delivered? That may not be as simple as pressing the Back button in the browser.

Related

Using setInterval and AJAX to detect a server-side change

I have a server-side "cart" variable that gets updated via an AJAX call when a button is clicked (I'm using Shopify, if it matters, but I feel that this is a general question). I am using AJAX to reload a div once the cart changes. The problem I encountered was this:
I submit the "update cart" AJAX call
Immediately after I try to reload the div
Depending on the exact timing, maybe 1 out of every 10 times the reload would use the old cart data, since the cart change hadn't registered on the server yet.
I came up with a solution to use setInterval, but I think there are some serious problems with this method. I'll show the code first, then share my concerns.
function addToCart(prodid, prodHandle, sizeString) {
    var oldSpaces = getNumSpaces(); // gets the number of "free spaces" to display
    // the actual call to update the cart
    push_to_queue(prodid, 1, {}, Shopify.moveAlong);
    // now wait for the number of items to change (ignore the possibility of
    // cart update failure; that's handled elsewhere)
    var timerVar = setInterval(function () {
        var newSpaces = getNumSpaces();
        if (newSpaces != oldSpaces) {
            $("#toybox").load(" #toybox > *");
            clearInterval(timerVar);
        }
    }, 200);
}
My concern is that this feels extremely hacky: I'm running my check once every 200 ms. Is there a way (in general preferably, but Shopify-specific if need be) to have the server itself let me know when something has changed?
This seems like a strange question. The server does not change the cart; the client changes the cart. So when you ask for a preferable way to have the server let you know when something has changed, the answer will always be: that is never going to happen.
If you want to know when the cart changes, you will always know, since you can listen to all cart events client-side. Since you are coding things up client-side, you need not trouble yourself with server events.
That is how the Shopify cart works, and you are asking for advice with that in mind, so I hope this helps. Polling every 200 ms, or every N ms or N seconds, is a pointless waste of browser cycles.
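Concretely, the polling can be replaced by chaining the reload on the update's own completion. A hedged sketch with hypothetical stand-ins (the real cart-update call takes a callback or returns a promise you can hook instead):

```javascript
// Sketch: run the div reload only after the cart update has completed,
// instead of polling every 200 ms. updateCart and reloadToybox are
// hypothetical stand-ins for the AJAX update call and for
// $("#toybox").load(" #toybox > *") respectively.
function addToCart(updateCart, reloadToybox) {
  // updateCart must resolve once the server has registered the change;
  // only then do we reload, so we never read stale cart data.
  return updateCart().then(reloadToybox);
}
```

The key design point is that the server's acknowledgement of the update, not a timer, triggers the reload.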

Getting AJAX Loaded content indexed by Google

Ok, first question on stackoverflow!
I'm in a bit of a dilemma. I've recently created an ecommerce site and set it up to load everything very nicely with AJAX for speed, UX and UI. However, it appears that search engines (Google) are not indexing the content. Doing a bit of research, the popular answer/fix that comes back is to implement Google's guide here: https://developers.google.com/webmasters/ajax-crawling/
To cut a long story short, I don't really want to be using 'hash bangs' across the site; the URLs I currently use are simple and clean, and littering them with #! just doesn't look nice! Before I start down this route, are there any alternatives? I currently use history.pushState on the category pages, but I need something primarily for the product detail page that doesn't involve changing the URL.
This is the current setup for loading the product details:
$(document).ready(function () {
    LoadContent();
});

function LoadContent() {
    Services.Content.GetProduct(productid, OnSuccess, OnFailure);
}

function OnSuccess(result) {
    $('#ContentLoading').css("display", "none");
    $('#Content').hide();
    $('#Content').html(result);
    $('#Content').delay(200).fadeIn();
}

Crawlable Ajax content. SEO-ing without hashbang. Is my way ok?

I'm going to build my application based on ajax, and my URLs are something like:
http://server.com/module/#function_name,param1,param2...etc
After reading some discussions of Google's suggestion, the hashbang (#!), it wasn't hard for me to realize that it is not the best solution. There are several reasons:
The URL is pretty ugly, anyway.
It would be terrible if someday Google (or another search engine) suggests a better solution than the hashbang. I would have to keep my ugly hashbang URLs, or write some JS code to keep links to my pages alive.
HTML5 pushState will be popular someday.
For all the reasons above, I've decided to do it my own way: my navigation links will be like this:
<a href="http://server.com/module/for-crawler/function-name/param1/param2/...">
Some text </a>
And some jQuery code will make them load AJAX content instead of causing a page change like a normal link:
$(function () {
    $('a').live('click', function (e) {
        var realURL = translateURL($(this).attr('href'));
        loadContent(realURL);
        e.preventDefault();
        return false;
    });
});
/* -- the function translateURL will turn a URL like:
   http://server.com/module/for-crawler/function-name/param1/param2/...
   into:
   http://server.com/module/#function-name/param1/param2/...
   That's the real URL I think all ajaxers are used to dealing with.
*/
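The question doesn't show translateURL itself; under the URL scheme described in the comment above, it could be as small as this hedged sketch:

```javascript
// Hypothetical implementation of translateURL: swap the crawler-friendly
// path segment for the hash form described above.
function translateURL(href) {
  return href.replace('/for-crawler/', '/#');
}
```

For example, `translateURL('http://server.com/module/for-crawler/function-name/param1/param2/')` yields `'http://server.com/module/#function-name/param1/param2/'`.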
When a crawler reads my page, it will follow the URL in the "href" attribute, and I will serve it a static, non-JS version of the page just for Google to read. After some days my page is indexed, and users will see it in Google's results with URLs like this:
http://server.com/module/for-crawler/function-name/param1/param2/...
Then I'll use JS again to redirect the user to my normal AJAX version, i.e. to the real URL:
http://server.com/module/#function-name/param1/param2/...
That's the best approach I can think of at this time. Please give me advice: should I do it that way, or can I make it better? Thanks, everyone!
Depending on your audience, I would suggest using HTML5 pushState anyway.
If the client does not support HTML5 pushState, simply let them use the same version of your app as the crawlers do. In my opinion, a page reload is not as bad as a hashed URL. Since users share URLs, your hashed URL gets exposed to other users, and such a URL wouldn't work for, say, Facebook's link-sharing previews or any other client that doesn't support JavaScript.
Instead, I would use the crawler-friendly app only in combination with HTML5 pushState. With pushState you always expose a single URL, independent of the JavaScript support of your client.
First, detect whether PushState is supported:
function supports_history_api() {
    return !!(window.history && history.pushState);
}
Then your click-handler would look something like this:
$('a').live('click', function (e) {
    var url = $(this).attr('href');
    e.preventDefault();
    loadContent(url);
    history.pushState({ "url": url }, $(this).attr('title'), url);
    return false;
});

AJAX: If you set the async to false will the whole page reload?

Does the whole page reload when this is set to false?
My main question is what the asynchronous flag does. Yes, I know what the word means, but what does it do in code?
xmlhttp.open("GET","ajax_info.txt",true);
In this context, "asynchronous" is best described as "done in the background". It means that if you set this parameter to true, the request will be sent in the background and the user will be able to continue interacting with the page. If you set it to false, the page will block and the user won't be able to do anything until the request returns.
Note that this is different from reloading the whole page. The amount of traffic going over the wire is still much smaller than a full page reload, so many of the AJAX benefits are preserved.
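The ordering difference is easy to see without any network at all. This sketch (illustrative only, simulating the "background" work with a promise rather than a real request) shows that asynchronous work completes after the current code has returned control to the page:

```javascript
// Illustrative only: asynchronous work (simulated here with a resolved
// promise) runs after the current script finishes, so the page stays
// responsive in the meantime.
const log = [];
Promise.resolve('response').then(function (r) { log.push('got ' + r); });
log.push('page stays interactive');
// at this point log is ['page stays interactive']; 'got response' is
// appended only after the current script yields
```

A synchronous request would be the opposite: the "got response" step would finish before the next line of your code ran, freezing the page in between.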
One reason why you might want to use synchronous (blocking) AJAX requests is when there's nothing to really do on the page while the request is loading.
BTW, since we're already on this subject: I encourage you to use a JavaScript framework for your AJAX needs. jQuery is fantastic; don't use the XMLHttpRequest object directly.
Having used jQuery's ajax, I found some issues with IE compatibility, so if you have to support IE6 it may be a good idea to avoid it and use straight JS.
Here's a good tutorial on it:
http://daniel.lorch.cc/docs/ajax_simple/

Use Drupal7 AJAX goodness programmatically

Cross-post from http://drupal.org/node/953016
The Drupal 7 AJAX system is great, it works very smoothly for forms and even for links.
What I can't work out how to do in a sane way is call it from JavaScript. I may want to have a dynamic page without a form and, as part of that, make a Drupal AJAX call, specifically so that the AJAX commands get run on return.
The most effective way I have found to do this so far is:
dummy_link = $('Loading Vars');
$(vars_div).append(dummy_link);
Drupal.attachBehaviors(vars_div);
dummy_link.click();
This is effective, but a huge hack. I haven't found a way to perform an AJAX call and have the Drupal AJAX framework handle it, rather than the standard jQuery framework.
I would have thought it was possible to invoke the Drupal AJAX API directly; does anyone know how?
The short short answer is you'll want to get yourself to something like:
$.ajax(ajax.options);
That is the jQuery part, but with a set of options that help you hook into the Drupal goodness in terms of success handling, effects, etc. This is effectively what's happening for you in your "huge hack" example.
Creating a new Drupal.ajax object programmatically still requires a synthetic element:
var base = 'someid';
var element = $('Loading Vars');
var element_settings = { 'url': uri, 'event': 'click' };
var myAjax = new Drupal.ajax(base, element, element_settings);
But you can at least trigger it without simulating a click in the UI:
myAjax.eventResponse(element, 'click');
It feels like there should be a better way to do this, but it would require another way to set up the initial ajax prototype that doesn't require a DOM element. Because so much of the interaction set hinges on how to move data back into the DOM, I don't think this use case is well supported yet.
It may also be possible to go directly to jQuery with a proper set of options and get the effect you want, but the Drupal.ajax prototype functions self-refer quite a lot, so doing it without a Drupal.ajax object seems dicey.
