Best practices for single-page websites to decrease page load time without hurting SEO (AJAX)

I'm wondering what some best practices are for decreasing the page load time of single-page websites, in a way that won't hurt SEO.
I'm leaning toward an AJAX solution with "hijax linking", but I'm wondering what best practices exist for the load order of a page. For instance, say I have a simple website with home, about, pictures of my cat, contact, etc., and I'm planning to have it all show up on the homepage via vertical scrolling, allotting one "screen" worth of content per item.
I'm coding this in WordPress, so my main idea would be to first load the first "screen", i.e. the hero section of the homepage, as part of home.php, so the user doesn't have to wait for the whole thing (and for SEO). Then, once that has finished loading, load the next four screens via AJAX in the background. I'm wondering what the best strategy might be to go about that. Someone provided this answer elsewhere:
"Build a standard 5 page site using php with proper separation of header, footer, content. Then use javascript to redirect to a single (separate) page with all content include()ed on the page."
In WordPress I'd take this to mean: create a separate page with a loop that grabs the other four "screens" as posts, and then load that page after home.php has loaded. Does anyone see any issues with this approach, or, as the question asks, have any better or best practices to accomplish this? I'd appreciate them. Thanks.
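To make that concrete, here's a rough jQuery sketch of what I have in mind; the /screens/ URL and the #hero selector are just placeholders for whatever page and markup WordPress actually outputs:

jQuery(function ($) {
  // Once the initial document (with the hero section) is ready, fetch the
  // remaining "screens" in the background and append them below the hero.
  // '/screens/' would be the separate page whose loop outputs the other
  // four sections; '#hero' is the container rendered by home.php.
  $.get('/screens/', function (html) {
    $('#hero').after(html);
  });
});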

There are several things you can do:
Improve the performance of your back-end code, if there is any.
Pagination: split the page into smaller pages.
Caching.
Decrease the size of the content and of background images, and minify your JS.
Compress the content (e.g. gzip).
Most of the time the right optimization will depend on your situation; to start with, one of the above will do it for you.

Your question is tagged with "wordpress", so I am assuming that you use WordPress.
If so, a logical starting point is one of the WordPress caching plugins. I use Quick Cache for my website and it makes a significant difference.
But you shouldn't stop with the plugin. Consider the quality of the theme you are using; poorly designed themes can make inefficient database calls and slow your website down.
Delaying and loading parts of the page with AJAX shouldn't be your first optimization step. Try all the other options first.

Related

Does using AJAX on your website drop your page views and hurt ranking?

Since it's related to AJAX technology, I thought this was the best place to ask.
I am displaying 5 articles at a time to the user on my website, and when he clicks 'Next' I load the next 5 articles using AJAX without reloading the entire page. The result is that he always stays on the same page.
One of my friends told me that website ranking depends on the number of page views, and this obviously reduces my page views.
Should I not use AJAX then?
(This might be a stupid question, but I seriously have no idea about ranking and SEO, so please help.)
If you load your content dynamically, Google will not see the entire page, only the part that is loaded initially. So, if Google ranking is important to you, it's better not to use an infinite loader.
Actually, it is not a good idea to navigate pages using AJAX alone. Consider a scenario:
display 5 articles first, then, when the Next button is clicked, load the next 5 items, and so on. Done this way, the page will not be search-engine friendly;
the search engine can't locate your content exactly and will crawl only the initial content.
But with some effort you can make AJAX navigation search-engine friendly; see the examples below.
Loading page content dynamically is, by itself, not great for an SEO-friendly web page, but consider other AJAX page navigation schemes that can keep the page dynamic as well as search-engine friendly.
Some suggested AJAX navigation schemes are listed below, followed by a rough sketch of the general idea:
http://nickjohnson.com/b/how-to-make-ajax-search-engine-friendly-seo
http://ajax.rswebanalytics.com/
http://www.symatix.co.uk/articles/ajax/search-engine-friendly-ajax-navigation
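A rough sketch of the general hijax idea (not necessarily the exact scheme described at those links): keep real, crawlable links for each page of articles and let jQuery progressively enhance them. The URL pattern, the a.next-page class and the #articles container are assumptions for illustration only.

$(document).on('click', 'a.next-page', function (e) {
  e.preventDefault();
  var url = this.href;  // a real URL that a crawler (and a no-JS visitor) can follow

  // Fetch the target page and swap in only its article list,
  // using jQuery's "url selector" form of .load().
  $('#articles').load(url + ' #articles > *', function () {
    // Keep the address bar and back button in sync where the browser supports it.
    if (window.history && history.pushState) {
      history.pushState({ url: url }, '', url);
    }
  });
});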

One-page design

What if I use just one HTML page with blocks inside, like the following:
<div id="page1" style="display:block">...</div>
<div id="page2" style="display:none">...</div>
<div id="page3" style="display:none">...</div>
The active block has style="display:block" and the others have "display:none"; when a block becomes active it gets "display:block" and the previously active one goes back to "display:none".
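A minimal sketch of that switching logic (the showBlock helper name is mine):

function showBlock(id) {
  $('div[id^="page"]').hide();  // set display:none on every block
  $('#' + id).show();           // set display:block on the active one
}

showBlock('page2');  // e.g. switch from page1 to page2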
AJAX server communication, for instance
$.post("json", { "name": $("#name").val(), "sex": $("#sex").val() },
function(data) { console.log("server responded", data); });
What are disadvantages of this approach?
This is fast and dynamic, but the approach lacks bookmarking: a user can't save a link to a particular view because the content is dynamic. It is also not search-engine friendly, and the browser's back and forward history buttons will not work.
There are no inherent disadvantages to using pure AJAX; in fact, in a lot of ways it's advantageous.
But if you don't plan for it correctly you can have some issues. For example, you're used to having CSS affect one page at a time, but with pure AJAX the CSS will affect every single "page".
Also, some things become slightly harder. Linking to another page becomes harder, as you have to use JavaScript.
Lastly, your app will now require JavaScript, so I'd have a decent fallback.
This approach is used in some mobile Web frameworks, such as jQuery Mobile, and is intended to make a Web application feel more native. This is more Web 2.0 than traditional websites or web applications where each page transition involves a trip to the server.
I'm sure you know the advantages already, so let's move on to the disadvantages.
Slightly Greater Initial Latency:
The main disadvantage of this approach is that it will take slightly longer to load the page content due to the fact that you're getting all of the HTML from the server in one single trip. Thus, the initial load time may involve more latency than in a traditional Web 1.0 application. However, with just a few pages, in my experience, the latency is not significant enough for it to be a problem.
Loss of Back Button - More Complexity in Maintaining History:
Another potential disadvantage is that, as a developer, you'll need to approach the development of your site differently. Because you're transitioning pages by hiding one DIV block and unhiding another, you'll lose native back button functionality. This can be mitigated by using the hash in the URL to record the history of page transitions. You'd then need to register an event to watch the hash and reload old content as the user navigates backwards (a minimal sketch follows). You'd also need to change the state of JavaScript objects and variables to reflect the old state, which may add complexity to your app. There are of course APIs and libraries to make this easier to implement and help ensure that you write good, maintainable code.
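As a minimal sketch of that hash-watching idea, assuming the page1/page2/page3 blocks from the question:

function showBlockFromHash() {
  var id = window.location.hash.replace('#', '') || 'page1';
  $('div[id^="page"]').hide();
  $('#' + id).show();
}

// Links simply change the hash, e.g. <a href="#page2">Page 2</a>,
// so every transition is recorded in the browser history.
$(window).on('hashchange', showBlockFromHash);
$(showBlockFromHash);  // apply the current hash once on initial load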
More Stateful Scope Involves Rethinking Approach and Possible Learning Curve:
Lastly, you'll need to remember that the scope of each page doesn't reset after each transition. While this could actually be an advantage in that your app is more stateful, you'll need to untrain yourself in the way of thinking that each page loaded will cause all of the JavaScript variables and data you've set to be cleared out.
Summary:
My suggestion, if you're going to go this route, is to use a library. Don't reinvent the wheel unless you have a good reason to. Libraries like jQuery Mobile help ensure that there is a good fallback for older browsers, and some even make sure that your site will still load using Web 1.0 techniques in cases where JavaScript is disabled.

How do you prevent gaming of page views?

Say I have a site with pages. Pages are ranked based on the number of times they have been viewed. It is good for a page to be highly ranked because it will make it show up higher in my search results. Hence, the author of a page may try to game the system to increase that particular page's views.
So how do you prevent that while still keeping a quasi-accurate count?
I have come up with the following "scheme":
A user can only affect the page view count once per session. This is what I would normally expect: if a user returns to the site later and views the page again, it should count as another page view.
The problem is that this makes the page view increment vulnerable to a script that clears its cookies before each request. The easiest solution would be to save the IP address and only allow the same IP address to increment the page count once. This, however, has several major drawbacks: first, it would potentially take up a lot of storage, and second, it would prevent users on big LANs from incrementing the page count. Lastly, a user cannot revisit a page and increment the page view count more than once from the same IP. I can live with that, but would rather live without it.
The best method I can come up with off the top of my head would be to save the last X IP addresses and not let anyone from those addresses affect the page view count (a rough sketch follows). This would effectively stop any simple script from raising the page view count. Furthermore, it would probably be a good idea to add a delay to the display of the actual view count (basically keeping two counts and a datetime field for when the "display" count was last updated with the "actual" count, something I believe is done on the SE sites).
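A rough sketch of that last-X-IPs idea in plain JavaScript (the names are mine, and in practice this would live server-side with persistent storage):

var MAX_RECENT_IPS = 50;
var recentIps = {};  // pageId -> array of recently seen IP addresses

function shouldCountView(pageId, ip) {
  var recent = recentIps[pageId] || (recentIps[pageId] = []);
  if (recent.indexOf(ip) !== -1) {
    return false;  // this IP viewed the page recently: don't count it again
  }
  recent.push(ip);
  if (recent.length > MAX_RECENT_IPS) {
    recent.shift();  // forget the oldest IP so the window stays small
  }
  return true;
}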
This is not a perfect solution, so I would be happy to hear your suggestions and/or comments.
Don't prevent: monitor and handle.
I would use a very different approach. Let the page views stay the same, but have reporting in place to look for view-gaming. If a page gets gamed, you can find out who is responsible, give them a warning and a page-view penalty. If it continues, ban them.
I think that you should consider the reported characteristics of the browser as well. Browser fingerprinting has been done before and is well publicized. You can then build some pretty advanced heuristics for determining whether the same user is trying to game you. But don't publicize that you're using browser fingerprinting, of course. It won't stop incognito mode, but I'm just trying to give you one more avenue of thought to follow, in addition to your current IP-oriented strategies.
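As a purely illustrative sketch of the kind of characteristics you could combine (real fingerprinting libraries use far more signals, and the function name is mine):

function roughFingerprint() {
  var parts = [
    navigator.userAgent,
    navigator.language,
    screen.width + 'x' + screen.height,
    screen.colorDepth,
    new Date().getTimezoneOffset()
  ];
  return parts.join('|');  // hash this server-side before storing or comparing
}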

Getting content: AJAX vs. "Regular" HTTP call

I like that, these days, we have an option for how we get our web content from the server: we can make an old-style HTTP request (with its own URL in the browser) or we can make an AJAX call and replace parts of the DOM on the fly.
My question is this: how do you decide which method to use when there's an option to use either?
In the "old days" we'd have to redraw the entire page (including the parts that didn't change) if we wanted to show updated content. Now that AJAX has matured we don't need to do that any more; we could, conceivably, render a "page" once and just update the changing parts as needed. But what would be the consequences of doing so? Is there a good rule of thumb for doing a full-page reload vs a partial-page reload via AJAX?
If you want people to be able to bookmark individual pages, use HTTP requests.
If you are changing context, use HTTP requests.
If you are dividing functionality between different pages for better maintainability, use HTTP requests.
If you want to maximize your page views, use HTTP requests.
Lots of good reasons to still use HTTP requests: Stack Overflow is a wonderful example of the division between AJAX and HTTP requests. Figure out why each function is HTTP or AJAX and I'm sure you will derive lots more reasons for when to use each.
My simple rule:
Do everything with AJAX, especially if this is an application and not just pages of content, unless people are likely to want to link to specific content, as in a blog; then it is easier to do regular full pages.
Of course there are many blended combinations; it doesn't have to be entirely one or the other. A sketch of one such blended approach follows.
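In this sketch the markup uses ordinary links, so bookmarking, SEO and no-JS visitors all get regular HTTP requests, while JavaScript upgrades the links to AJAX where it can. The a.ajax-enhanced class and #content container are assumptions, not a fixed convention.

$('a.ajax-enhanced').on('click', function (e) {
  e.preventDefault();
  // Fetch the linked page and swap in just its main content region;
  // without JavaScript the link still works as a normal HTTP request.
  $('#content').load(this.href + ' #content > *');
});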
A fair amount of development overhead goes into partial-page reloads with AJAX. You need to create additional JavaScript handlers for all returned data. If you return full HTML blocks, you still need to specify where the content should go and whether or not it replaces other content. You would potentially have to re-render header tags to reflect content changes, and you would have to implement a history solution to make sure search engines can index each page (using the SWFAddress jQuery plugin, for example). If you return JSON-encoded data, you have an additional processing step.
The bandwidth saved by not doing a full page refresh is offset by an increase in JS code and event bindings, which can affect page rendering speed as well as visual effects.
It all really depends on your target audience and the overall feel you are trying to go for on your page. AJAX and preloaders are flashy, and people love flashy things. If you believe the end-user experience will improve by adding partial page loads by all means implement them.

Why does my website need so much time to render?

When cached, my starting page only needs to load one element (the "root document") - but then it needs some time until it's rendered completely:
(Screenshot of the Firebug Net panel: http://www.walkner.biz/_temp/firebug_net.png)
The elements that follow are things loaded asynchronously via JavaScript.
Two questions:
Why does it take so "long" from loading the root document until the DOMContentLoaded event?
Does it make sense to load some not-so-important things asynchronously? Is it important to have the DOMContentLoaded event fire as early as possible? Unfortunately there's not much documentation about that event, but I don't think it's the moment when the page is displayed, is it?
I'm not sure YSlow is going to help him, as that will download all elements for a page and run performance tests on them, whereas swalkner's problem is how long it takes to render the HTML page itself when all other elements (images, CSS, etc.) are cached.
At least that's what I think he's saying.
In the original question you said, "The elements that follow are things loaded asynchronously via JavaScript," but then listed nothing. What is loaded?
I would suggest checking for JavaScript errors first. Then try removing your asynchronous loading calls one by one until you hit the bottleneck. In fact, remove them all: how long does the downloaded HTML take to render? Take that time and work from there (the timing sketch below may help).
Is your HTML document very big? Does it use lots of inline styles that could be in the CSS file?
Perhaps if you posted a link to the site then people would have a look at it.
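As a small sketch, here is the kind of timing snippet you could drop near the top of the page to compare when parsing finishes against when everything has loaded (it measures from the moment the script runs, which is close enough for this purpose):

var scriptStart = Date.now();

document.addEventListener('DOMContentLoaded', function () {
  // Fires once the HTML has been parsed and the DOM is ready; images,
  // stylesheets and asynchronously loaded content may still be pending.
  console.log('DOMContentLoaded after', Date.now() - scriptStart, 'ms');
});

window.addEventListener('load', function () {
  // Fires once the document and all of its resources have finished loading.
  console.log('load after', Date.now() - scriptStart, 'ms');
});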
