Look at the profile pages of users that have asked more than 10 questions.
(e.g., https://webapps.stackexchange.com/users/2496/tobeannounced)
Now try skipping to questions 10-20, or page 2 of the questions they have asked.
The load for the new page is almost instantaneous.
How is this accomplished?
Is it simply by loading all the questions when the first page is loaded, so that any additional pages that are called up load very fast? In other words, are the additional pages pre-loaded?
Using the Network tab of Firebug for Firefox, you can see all HTTP requests being made. With this turned on, you can see that clicking the next link fires off an HTTP request that grabs the next page of questions (i.e. it is not preloading all the questions with the initial page load). It's a small request with a small response, and the server replies very quickly, which is why it feels almost instantaneous.
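As a rough sketch of the pattern (the endpoint, element ID and class name below are made up for illustration, not Stack Exchange's real markup), the next link simply fetches the requested page of questions and swaps it into the list:

// Hypothetical endpoint and element names, for illustration only.
async function loadQuestionPage(userId: number, page: number): Promise<void> {
  // Small request: only the HTML fragment for that page of questions.
  const response = await fetch(`/users/${userId}/questions?page=${page}`);
  const fragment = await response.text();
  // Small response: replace just the question list, not the whole page.
  const list = document.getElementById("question-list");
  if (list) {
    list.innerHTML = fragment;
  }
}

// Make the pager links fire the request instead of a full navigation.
document.querySelectorAll<HTMLAnchorElement>("a.page-link").forEach((link) => {
  link.addEventListener("click", (event) => {
    event.preventDefault();
    void loadQuestionPage(2496, Number(link.dataset.page ?? "1"));
  });
});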
Related
I am just curious how these websites were made to load only once. If you go to http://fueled.com/ or http://ecap.co.nz/, the browser shows the spinning wheel only the first time the website is loaded. When you navigate to other pages from the navigation menu, like About or Contact or Team, the browser doesn't show the spinning wheel while those pages load.
How do they make them work like this?
It is because clicking those links does not trigger a page load. Instead, an asynchronous (AJAX) request is fired and its response is used to update the page. Further navigation also feels quicker because scripts, styles and pictures are cached, that is, saved locally on your computer.
You can check what happens using the Network tab of the browser's developer tools. Note the last request in the log, then click such a link. You will see that the request log is not cleared; new requests are simply added to it. That means there was no page load in the meantime.
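As an illustrative sketch of what such a site might do (the /content path and the element ID are assumptions, not taken from fueled.com or ecap.co.nz): navigation links are intercepted, the new content is fetched in the background, and the address bar is updated with the History API, so the browser never performs a full page load.

// Hypothetical single-page navigation; paths and IDs are illustrative.
async function navigate(path: string): Promise<void> {
  // Fetch only the content for the requested section (About, Contact, Team, ...).
  const response = await fetch(`/content${path}`);
  const html = await response.text();
  // Swap the main content area; cached scripts, styles and images are reused.
  const main = document.getElementById("content");
  if (main) {
    main.innerHTML = html;
  }
  // Update the address bar without triggering a page load.
  history.pushState({}, "", path);
}

document.querySelectorAll<HTMLAnchorElement>("nav a").forEach((link) => {
  link.addEventListener("click", (event) => {
    event.preventDefault();
    void navigate(link.getAttribute("href") ?? "/");
  });
});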
I'm optimizing my site's speed. One of the main issues I'm facing is the homepage.
On the homepage, each article has FB/TW share buttons.
I only inserted the scripts in the footer once, but I'm getting a bunch of FB/TW share button requests.
Is this normal, or is there something I need to do?
For every Like/Share button that you have, your browser needs to make a request to get its content. These requests are only made once the browser has received the page from your server, so they do not affect the initial load time.
As CBroe mentions, each button is displayed in an iframe. These are loaded either all at the same time or one after another, depending on your browser. During this time your browser is not blocked, so your user can already interact with the page.
If you want to reduce load, the only option is to remove the buttons. I think you have some index/home page where you load all the articles, each with its own button? If you are really concerned about this, you could consider showing the buttons only on the article pages themselves.
But since this is normal behaviour and your page is not blocked while the iframes load, it is not a big issue, and there is not much you can optimise yourself.
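If you do go that route, a rough sketch of the idea (the /articles/ URL check and the plugin URL are illustrative, not your site's actual embed code) is to inject the share-button iframes only on article pages, so the homepage never fires one request per article:

// Only add share-button iframes on article pages (hypothetical /articles/ URL scheme).
function addShareButtons(): void {
  if (!location.pathname.startsWith("/articles/")) {
    return; // homepage and other pages: no share buttons, no extra requests
  }
  const pageUrl = encodeURIComponent(location.href);
  // Each button is its own iframe, i.e. one extra request per button shown.
  const frame = document.createElement("iframe");
  frame.src = `https://www.facebook.com/plugins/like.php?href=${pageUrl}`;
  frame.width = "150";
  frame.height = "35";
  frame.style.border = "none";
  document.getElementById("share")?.appendChild(frame);
}

addShareButtons();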
I am using Opera, and sometimes a page keeps on loading even though all the content has already been presented. How do I find out which elements are still being loaded, or what is causing the ongoing loading?
Even though all content seems to be 'presented', the page may still be loading images, JavaScript, CSS, or other resources referenced by it. This process performed by the browser isn't referred to as "AJAX", as you have tagged your question; AJAX is the asynchronous invocation of JavaScript to retrieve or submit data without requiring a page refresh.
As for examining which resources are causing your page to appear to be still "loading"...
I use Firebug's Network tab in Firefox to look at pending requests for resources. It shows every resource your browser requests, how long each takes to retrieve, and the full request and response headers and bodies.
Google Chrome has something similar built in; just hit F12 to bring up the Developer Tools.
I would assume Opera has something similar, although I am not sure of its name.
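In browsers that support the Resource Timing API, you can also ask the page itself from the console. A snippet like the one below lists the resources the page has fetched so far, slowest first, which usually points at the culprits (anything still outstanding will only show up in the network panel):

// List requested resources, slowest first (Resource Timing API).
performance
  .getEntriesByType("resource")
  .sort((a, b) => b.duration - a.duration)
  .slice(0, 10)
  .forEach((entry) => {
    // duration = time from the start of the request to the end of the response, in ms
    console.log(`${Math.round(entry.duration)} ms  ${entry.name}`);
  });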
I am working on a new website. While testing some of the functionality I had a number of debug statements and was watching the logs. It seems that Firefox (at least) loads the "next" page in the menu as well as the page I have clicked on. If I have menu items A B C D E and click on B then I see a request for mysite.com/B and then a request for mysite.com/C in the logs, and so on.
Is this some kind of look-ahead performance thing? Is there any way to avoid it (setting an attribute on the link, maybe)? The problem is that the second page in my menu is somewhat heavier, as it loads a whole lot of data from a web service. I'm happy for people to do that if they want to use the functionality, but I would rather not have every visitor to the front page load it unnecessarily. Is this behaviour consistent across browsers?
Yes, Firefox will prefetch links to improve the perceived performance for the user. You can read more about the functionality in Firefox here: https://developer.mozilla.org/en-US/docs/Link_prefetching_FAQ
You can't disable this on behalf of your visitors' browsers; however, the request should include the header X-moz: prefetch, which you can use to determine whether it is in fact a prefetch request, and potentially return a blank page for prefetch requests. You can then use Cache-Control: must-revalidate to make sure the page loads appropriately when it is actually requested by the user.
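As a rough illustration (sketched with Node's built-in HTTP server, not whatever stack your site actually uses), the check boils down to looking at that header and short-circuiting the expensive work:

import { createServer } from "node:http";

// Serve a near-empty, must-revalidate response to Firefox prefetch requests
// so the heavy web-service call only runs when a real visitor asks for the page.
createServer((req, res) => {
  if (req.headers["x-moz"] === "prefetch") {
    res.writeHead(200, { "Cache-Control": "must-revalidate" });
    res.end(""); // blank page for the prefetch
    return;
  }
  // Normal request: build the real page (placeholder here).
  res.writeHead(200, { "Content-Type": "text/html" });
  res.end("<html><body>real page content</body></html>");
}).listen(8080);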
If you happen to be using WordPress for your site, you can remove the rel=next link tags that trigger the prefetching by using:
WordPress 3.0+
//remove auto loading rel=next post link in header
remove_action('wp_head', 'adjacent_posts_rel_link_wp_head');
Older versions:
//remove auto loading rel=next post link in header
remove_action('wp_head', 'adjacent_posts_rel_link');
Yes, it's called link prefetching. It can be turned off in the client; see the FAQ:
https://developer.mozilla.org/en-US/docs/Link_prefetching_FAQ
I'm not aware of a way to turn it off via the server.
I've got a site with a basic login/logout system.
When I display pages, I check when the page was last modified and whether the browser has sent an If-Modified-Since header. If the page hasn't been modified since then, I send a 304 header and exit.
This obviously loads the page a lot quicker and means less memory and processing power for me, as I don't have to build the page's content.
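Roughly, the logic looks like this (a simplified sketch, not my actual code; getLastModified stands in for however the modification time is looked up):

import { createServer } from "node:http";

// Simplified conditional-GET handling: compare If-Modified-Since against
// the page's last-modified time and skip the page build when nothing changed.
function getLastModified(path: string): Date {
  return new Date(0); // placeholder: real code would look up a file mtime or DB timestamp
}

createServer((req, res) => {
  const lastModified = getLastModified(req.url ?? "/");
  const ifModifiedSince = req.headers["if-modified-since"];

  if (ifModifiedSince && new Date(ifModifiedSince).getTime() >= lastModified.getTime()) {
    res.writeHead(304); // unchanged: no body, no page build
    res.end();
    return;
  }

  res.writeHead(200, {
    "Content-Type": "text/html",
    "Last-Modified": lastModified.toUTCString(),
  });
  res.end("<html><body>...page content, including the sign in / log out button...</body></html>");
}).listen(8080);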
However, I find that if a user logs in and then views an unmodified page, he still sees the "sign in" button; the same is true if he logs out and views an unmodified page: he'll still see the option to log out.
This seems like it would be a common occurrence for anybody wishing to use 304 Not Modified headers, yet I'm struggling to find any discussion on how to solve it.
Any help would be much appreciated. (I realise I haven't given much info, but I'm not sure what else to say; it's quite self-explanatory.)