Gmail/Facebook chat - iframe, ajax, url anchors, but what about indexing?

I'm still not sure how it works (but that's not the point :D). As far as I've noticed, the whole content (well, almost :D) is in the iframe, and the chat window is outside the iframe. Requests are probably made via ajax, and the URLs change like this: const_part_of_url#something - so only the URL anchors (or whatever they're called) are changing.
Two things are bothering me:
What about Googlebot: is it able to index those pages correctly (not Gmail, but say some web page with a similar "technology" used), first because of the iframe, and second because only the anchors change in the URLs?
Is it possible to make some other part of the URL change, not only the anchors?
The thing is, I have an mp3 search engine where you can listen to these mp3s too, and this kind of floating, "not-reloading" player with a playlist would be kinda cool :D But I'm very concerned about proper page indexing and other SEO blah blah... so I don't really know if it's worth trying :D
Cheers

you can detect robots and not feed them with user-eyes-only content ...
Edit: you can also load it on demand (JavaScript)... bots won't load it
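For the on-demand idea, a minimal sketch; the #chat_container element and the /chat-panel URL are made-up names for illustration, not anything Gmail actually uses:

// fetch the chat panel only after the page is up; crawlers that
// don't execute JavaScript will never request this content
$(function() {
    $('#chat_container').load('/chat-panel');
});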

Related

How do Big sites prevent the loading circle on tabs from showing?

Okay, I don't know how to explain this to you. It may be just my internet, or maybe my site is slower, or they really have a technique for doing this.
If you visit Facebook, Reddit, YouTube, or Twitter, and you click on links or perform any actions on those websites, the URL changes but the browser tab doesn't show any loading circle.
How do they do that?
I am pretty sure my website is fast enough, and at times it loads even faster than the bigger sites, but mine shows the loading circle on the browser tab.
Okay so I found the answer. Here is the technique for changing the url without reloading the page.
Updating address bar with new URL without hash or reloading the page
How do I modify the URL without reloading the page?
I am still trying to figure out, though, how to change the actual page content without reloading the entire page. I am guessing they are loading it via ajax or something similar upon URL change. I'll update this once I figure it out.
Edit: I am currently working on this feature for my site. The technique is to use ajax to load the content based on the url. I'll update this thread more as I update my site with this feature.
Edit 2: Damn, you will probably face the same problem I had trying to detect the url change without using onhashchange. If so, here you go:
How to detect URL change in JavaScript
This literally took me 4 hours just to figure that one out.....
Edit 3: I have now integrated this feature on my site. You can check it at
Grandweb
It is quite simple, but there's a lot of work in appending the content once it's retrieved via ajax. So here is the process:
I am using pushState() to change the URL without reloading the page.
var url = $(this).attr('href');
// strip the domain so we push a relative path, e.g. "/write"
var new_url = '/' + url.replace('https://grandweb.net/', '');
window.history.pushState("object or string", "Title", new_url);
(Using 'mouseup' was a bad idea; I changed my mind later, see below.)
I then have to trigger the first function using 'mouseup' to retrieve the content via ajax, and then listen for succeeding onpopstate() events for the next ones, because some mouse actions, such as Mouse 4 or Mouse 5, are bound to the browser's Back and Forward buttons and do not trigger via 'mouseup'.
$(window).on('mouseup', function(evt) {
    get_content();
});

window.onpopstate = function(event) {
    get_content();
};
The first one is responsible for triggering the function on the first try, because onpopstate only fires once the browser's history API has been populated.
Using mouseup was a bad idea; basically, don't use it unless you really want to detect mouse actions from anywhere on the document.
I instead use the anchor tags/links to trigger the first function for retrieving content.
example:
<a class="dynamic_btn" href="website.com/post">Home</a>
then
$(document).on('click', '.dynamic_btn', function(e) {
    e.preventDefault();
    get_content();
});
Using onhashchange is possible IF you have hashes in your URL. I do not use hashes in my URLs, so onhashchange is useless in my use case, unless there's something I'm missing.
After retrieving the contents, I append them by creating DOM elements inside existing containers on the page.
This is much easier to do if you are only planning to change a few elements or containers on your pages. If you plan on doing this to change a full page layout, good luck. It's doable, but it's a real pain in the *ss.
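As a rough illustration, a get_content() along these lines might look like this; the ajax=1 flag and the #content container are assumptions for the sketch, not necessarily what the real site does:

// sketch: fetch the fragment for the current URL and swap it
// into the page container
function get_content() {
    var url = window.location.pathname;
    // ajax=1 is an assumed flag telling the server to return only
    // the inner content, without header/footer
    $.get(url, { ajax: 1 }, function(data) {
        $('#content').html(data);
    });
}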
Upon observing Facebook, I learned that they do not use this technique on all of their links/features. It makes sense, because this is harder to maintain, especially since most of the work here is being done client side. It is very nice, though, because the page doesn't reload.
I have implemented it on a few 'essential' functions of my website such as the viewing of posts and returning to the homepage. I can implement it on the whole site, but I am still deciding on that. That is all, thank you very much for reading internet stranger.

jQuery AJAX Load Method - Delay

I'll admit that I'm pretty new to web development (I've only been coding for about a year) and especially green when it comes to JS/jQuery.
A specific web page I've built loads different data based on hovering over certain categories: country clubs, resorts, hotels, etc. When I built the site on my local machine, the JavaScript function was super quick. However, on the live site, there is a long delay before the data swap happens.
The URL is: http://preferredparkingsolutions.com/client_list.html
Which links to a javascript function at: http://preferredparkingsolutions.com/scripts/clientHover.js
Which replaces the display div (#client_list) by pulling data from a text file.
Is there a better / faster way of doing this?
Yes, this could be optimised by loading the content in up-front and caching it. Currently you are making an HTTP request for each and every hover, even if the user has hovered over that element before, since the AJAX responses aren't being cached. Fixing this would be your quickest win.
However, I can't see any case at all for having the content live externally. Is there any reason you're against having the content physically in the page and just using show/hide methods? There are various benefits to this: SEO, for one thing, since Google will find the content.
This is the external page you are loading: http://preferredparkingsolutions.com/client_list.inc.html. The content is small and looks like a static page, so why not just load everything up-front and then just hide and show divs? As Utkanos suggested, you will also get an SEO benefit, and you avoid an HTTP request for each and every hover. If you still want to load it externally, at least load it once, cache it, and use the cached version to hide and show the divs.
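A rough sketch of the load-once-then-show/hide idea both answers describe; the fragment URL and #client_list come from the question, while the .category triggers and the .client_group/data-group markup hooks are assumptions:

// load the fragment once at page load; afterwards every hover is
// answered by hiding and showing the already-present markup
$(function() {
    $.get('/client_list.inc.html', function(data) {
        $('#client_list').html(data);
        $('#client_list .client_group').hide();
        $('.category').on('mouseenter', function() {
            var group = $(this).data('group'); // assumed attribute
            $('#client_list .client_group').hide()
                .filter('[data-group="' + group + '"]').show();
        });
    });
});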

Progressive Rendering and SEO

The website I'm working on has a typical e-commerce product page, with the top part of the page containing the title, images and pricing, while the bottom part of the page has the tabs section, with tabs for Features, Specs, Accessories, Reviews and so on.
Naturally, this HTML document is heavy. I'm thinking about splitting the page in two:
The HTML Document will contain only the top part of the page
Then JavaScript will asynchronously call another page, which contains a JSON object with the content of all the tabs; when successful, JavaScript will populate each tab with its content (sketched just below)
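As a minimal sketch of that split, assuming a /product/123/tabs endpoint returning JSON shaped like { "features": "...", "specs": "..." } (both the URL and the shape are made up):

// after the top half of the page has rendered, fetch the tab
// content as JSON and inject each section into its container
$(function() {
    $.getJSON('/product/123/tabs', function(tabs) {
        $('#tab-features').html(tabs.features);
        $('#tab-specs').html(tabs.specs);
        $('#tab-accessories').html(tabs.accessories);
        $('#tab-reviews').html(tabs.reviews);
    });
});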
The question is:
Will the Search Engines crawl the content that is loaded by JavaScript?
if not - then does Progressive Rendering = loss of SEO?
if yes - must I somehow ensure that all the tabs are populated prior to the Load event, or does this not matter?
I think that this question could be asked differently:
With SEO in mind, do the Search Engines crawl the HTML document only, or do they crawl the content of the page as it is when the Load event takes place?
Any known best practices for this? Any useful links?
Please advise.
Crawlers don't use JS. Turn off JS in your browser to see what the crawler sees. If you have links to these content pages, it will crawl to them. If the SEO is important, make sure the content is in the page.
The search engines crawl the HTML document only, as you describe it, so don't use the JS solution you propose; diverse but appropriate content in your bottom tabs is important for SEO.

Should I load an entire html page with AJAX?

My designer thought it was a good idea to create a transition between different pages. Essentially only the content part will reload (header and footer stay intact), and only the content div should have a transitional effect (a fade of some sort). Creating this sort of effect isn't really the problem; making Google (Analytics) happy is...
Solutions I didn't like, and why:
Load only the content div with ajax: Google won't see any content, meaning the site will never be found, or only the parts which are retrieved by ajax will be, which aren't full pages at all
Show the transitional effect, then after that 'redirect' the user to the designated page (capturing the click event of a elements): the effect is pretty much the same as just linking to another page, i.e. the user will still see a page being reloaded
I thought of one possible solution:
When a visitor clicks a link, capture the event, load the target with ajax, show the transitional effect in the meantime, then just rewrite the entire document with the content fetched with the ajax request.
At least this will work and has some advantages: the page reload will look seamless, no matter how slow your internet connection is; Google won't really mind, because the ajax content is a full HTML page itself and can be crawled as-is; even non-JavaScript browsers (mobile phones et al.) won't mind, they'll just reload the page.
My hesitation about this method is that I would be reloading an entire page using ajax. I'm wondering if this is what ajax is meant to do, and whether it would slow things down. Most of all, is there a better solution, e.g. my first 'bad' solution but slightly different, so Google (and Analytics) would like it?
Thanks for your thoughts on this!
Short answer: I would not recommend loading an entire page in this manner.
Long answer: not recommended. Whilst possible, this is not really the intent of XHR/Ajax. Essentially what you're doing is replicating the native behaviour of the browser. Some of the problems you'll encounter:
Support for the Back/Forward buttons. You'll need a URI # scheme to solve this.
The browser must parse the entire page through AJAX. This'll slow things down. E.g. if you load a block of HTML into the browser and then replace the DOM with it, only then will any scripts, CSS or images contained therein begin downloading.
Memory. The browser's not changing pages, so over time (depending on the browser) I'd expect the memory usage to increase.
Accessibility. Screen readers will need to be notified whenever the page content is updated. Might not be a concern for you, but worth mentioning.
Caching. The browser would not know which page to cache (beyond the initial load).
Separation of concerns. Your View is essentially broken into server-side pieces to render the page's content, the static HTML for the page framework, and lastly the JS to combine the server piece with the browser piece. This'll make maintenance over time problematic and complex.
Integration with other components. You're already seeing problems with Google Analytics; you may encounter issues with other components related to the timing of when the DOM is constructed.
Whether it's worth it for the page transition effect is your call but I hope I've answered your question.
You can have AJAX and SEO: see Google's proposal.
I think you can learn something from Gmail's design.
This may be a bit strange, but I have an idea for this.
Prepare your pages to load with an 'iframe' GET parameter.
When 'iframe' is present, load the page with some JavaScript that triggers the parent's show_iframe_content().
When there is no 'iframe', just load the page, with a hidden iframe element called 'preloader'.
Your goal is to make sure every one of your links is opened in the 'preloader' with the additional 'iframe' GET parameter; when the loading of the iframe finishes and it calls show_iframe_content(), you copy the content to your parent page.
Like this: Link clicked -> transition to loading phase -> iframe loaded -> show_iframe_content() called -> iframe content copied back to parent -> transition back to normal phase
The whole thing is good since, if a crawler visits any of your pages, it will do so without the 'iframe' GET parameter, so it can go through all your pages as normal; but when you use it in a browser, your links do the magic above.
This is just a sketch of it, but I'm sure it can be made right.
EDIT: Actually you can do it with simple ajax, without the iframe; the thing is, you have to modify the page after it has been loaded in the browser, so that the linked content is loaded with ajax. Crawlers will still see the plain links.
Example script:
$.fn.initLinks = function() {
    $("a", this).click(function() {
        var url = $(this).attr("href");
        // transition to loading phase ...
        // The ajax=1 post parameter tells the site to load only the
        // content, without header/footer
        $.post(url, "ajax=1", function(data) {
            $("#content").html(data).initLinks();
            // transition back to normal phase ...
        });
        return false;
    });
};
$(function() {
    $("body").initLinks();
});
Google Analytics can track JavaScript events as if they were pageviews; check here for implementation:
http://www.google.com/support/googleanalytics/bin/answer.py?hl=en-GB&answer=55521
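For example, with the classic asynchronous Analytics snippet (assuming the standard _gaq queue is already set up on the page), each successful ajax "page" load can record a virtual pageview:

// report the new URL to Analytics after the ajax content swap
_gaq.push(['_trackPageview', window.location.pathname]);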

Ajax - How to change URL by content

I'll explain:
I have a picture gallery, the first page is display.php.
Users can flip through pictures using arrows; when you click an arrow, it sends an Ajax request to retrieve the next picture from the db. Now I want the URL to change according to the picture displayed.
So if the first picture is:
www.mydomain.com/display.php?picture=Paris at night
I'll flip to the next one and the URL would be
www.mydomain.com/display.php?picture=The Big Ben
How do I do this?
The trick here is URIs with an anchor fragment.
The part before '#' points to a resource on the internet, and the part after it normally designates an anchor on the page.
The browser does not refresh if the resource is the same, but moves to the anchor's position when one is present.
This way you can keep the convenience of browser history, from a usability point of view, while replacing certain parts of the page with ajax for a fast and responsive user interface.
Using a plugin like jQuery History (as suggested by others) is really easy: you decorate certain elements with a rel attribute, and the plugin takes care of the rest.
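If you'd rather skip the plugin, a rough equivalent with the native hashchange event could look like this; the .arrow elements, the data-picture attribute, the /get_picture.php endpoint and the #gallery container are all made-up names:

// flipping an arrow only changes the hash; the hashchange handler
// does the actual ajax load, so Back/Forward work for free
$('.arrow').on('click', function() {
    window.location.hash = $(this).data('picture');
    return false;
});

$(window).on('hashchange', function() {
    var picture = window.location.hash.substring(1);
    $.get('/get_picture.php', { picture: picture }, function(data) {
        $('#gallery').html(data);
    });
}).trigger('hashchange'); // also handle a hash present on first load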
Also kinda related to this topic is something called 'hijax', and it's something I really like.
It means generating HTML just like you would in the old days before ajax. Then you hijack certain behaviour, like links, and request the content with ajax, replacing only the necessary parts. This, in combination with the above technique, allows really SEO-friendly and accessible webpages.
You can use the jQuery history plugin for example.
Changing the search (query) part of the URL will load the changed URL.
See also: stackoverflow, javascript changing the get parameter without redirecting
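In browsers that support the HTML5 History API, you can change the query string without triggering that reload; a minimal sketch (the picture names are the ones from the question):

// update the address bar to match the picture just loaded via ajax,
// without causing a page load
function update_url(picture_name) {
    if (window.history && window.history.pushState) {
        history.pushState(null, '', 'display.php?picture=' + encodeURIComponent(picture_name));
    }
}

update_url('The Big Ben');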
Do you really want to use AJAX here?
A traditional web request would work like this...
User navigates to display.php
User clicks "next" and location is updated to "display.php?picture=Big-Ben"
Big Ben is shown to the user, along with a link to "display.php?picture=Parliament"
User clicks "next" and the location is updated to "display.php?picture=Parliament"
And so on.
With AJAX, you essentially replace the GET with a "behind the scenes" GET that just replaces a portion of your page. You would do this to make things faster... for example...
User navigates to display.php
User clicks "next" and the next image location is obtained using an AJAX request
The image (and image description) is changed to the next image
What you are suggesting is that you retrieve the "next URL" using AJAX and then also perform a GET on the whole page. You would be much better off sending the "next" image link along with each page and not using AJAX at all.
This best describes everything, I think: http://ajaxpatterns.org/Unique_URLs
