I have some complex pages that make use of many RenderPartial calls (from 13 up to 20 per page).
I've found that those pages take 400-800 ms to render (that's just the view rendering time; action time is not included and is not an issue).
There are no queries or anything else on the pages; they just render a pre-loaded model. If I "glue" all the partials together on each page, I can cut those times to a quarter. That's a lot of time.
Precompiling the pages does not solve anything, as that only impacts the page's startup time; the generated code is the same.
My question then is: Is there a way, via some external tool, to "replace" those RenderPartial calls with the actual code from the partial? Maybe I could even run that tool only on deploy. After all, those partials are only used on specific pages and don't need to be accessible as standalone views from other actions...
Related
I'm wondering what some best practices are to decrease page load time of single-page websites, in a way that won't hurt SEO.
I'm leaning toward an AJAX solution with "hijax linking", but I'm wondering what the best practices are in terms of the load order for a page. So for instance, say I have a simple webpage with home, about, pictures of my cat, contact, etc., and I'm planning to have it all show up on the homepage via vertical scrolling, allotting one "screen" worth of content per item.
I'm coding this in WordPress, so my main idea would be to first load the first "screen", i.e. the hero section of the homepage, as part of home.php, so the user doesn't have to wait for the whole thing (and for SEO). Then, once that has finished loading, load the next four via AJAX in the background. So I'm wondering what the best strategy might be to go about that. Someone provided this answer elsewhere:
"Build a standard 5 page site using php with proper separation of header, footer, content. Then use javascript to redirect to a single (separate) page with all content include()ed on the page."
In WordPress I'd take this to mean: create a separate page with a loop that grabs the other four "screens" as posts, and then load that page after home.php has loaded (see the sketch below). Does anyone see any issues with this approach, or, as the question asks, have any better or best practices to accomplish this? I'd appreciate them. Thanks.
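A minimal sketch of that deferred loading, assuming jQuery plus a hypothetical /screens/ WordPress page that runs the loop over the other four posts, and a hypothetical #screens container sitting below the hero:

// Once the hero page has fully rendered, pull in the remaining screens
$(window).load(function () {
    $.get('/screens/', function (html) {  // hypothetical page that outputs the four sections
        $('#screens').append(html);       // hypothetical container below the hero
    });
});

Note that crawlers which don't execute JavaScript will only ever see the hero section, which is exactly the SEO trade-off the question raises.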
There are several things you can do:
Improve the performance of your back-end code, if there is any.
Pagination: split the page into smaller pages.
Caching
Decrease the size of the content: shrink background images, compress JS.
Compress content on the wire (e.g. gzip).
Most of the time the right optimization will depend on your situation. To start with, one of the above will do it for you.
Your question is tagged with "wordpress", so I am assuming that you use WordPress.
If so, a logical starting point is to use one of the WordPress caching plugins. I use Quick Cache for my website and it makes a significant difference.
But you shouldn't stop with the plugin. Consider the quality of the theme you are using: poorly designed themes may make inefficient database calls and slow your website down.
Delaying and loading parts of the page with AJAX shouldn't be your first optimization; try all the other options first.
I'll admit that I'm pretty new to web development (I've only been coding for about a year) and especially green when it comes to JS/jQuery.
A specific web page I've built loads different data based on hovering over certain categories: country clubs, resorts, hotels, etc. When I built the site on my local machine, the JavaScript function was super quick. However, on the live site, there is a long delay before the data swap happens.
The URL is: http://preferredparkingsolutions.com/client_list.html
Which links to a javascript function at: http://preferredparkingsolutions.com/scripts/clientHover.js
Which replaces the display div (#client_list) by pulling data from a text file.
Is there a better / faster way of doing this?
Yes, this could be optimised by loading the content up front and caching it. Currently you are making an HTTP request for each and every hover, even if the user has hovered over that element before, since the AJAX responses aren't being cached. Fixing this would be your quickest win.
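A minimal sketch of that caching idea, assuming a hypothetical .category selector for the hover targets and a hypothetical data-url attribute holding each category's content URL:

var cache = {};
$('.category').mouseenter(function () {
    var url = $(this).data('url');          // hypothetical per-category content URL
    if (cache[url]) {
        $('#client_list').html(cache[url]); // reuse the cached response
        return;
    }
    $.get(url, function (html) {
        cache[url] = html;                  // later hovers skip the HTTP request
        $('#client_list').html(html);
    });
});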
However, I can't see any case at all for having the content live externally. Is there any reason you're against having the content physically in the page and just using show/hide methods? There are various benefits to this: SEO, for one thing, since Google will find the content.
This is the external page you are loading: http://preferredparkingsolutions.com/client_list.inc.html. The content is small and looks like a static page, so why not just load everything up front and then just hide and show divs? As Utkanos suggested, you will also get an SEO benefit, and you avoid an HTTP request for each and every hover. If you still want to load it externally, at least load it once, cache it, and use the cached version to hide and show the divs.
What if you use just one HTML page with blocks inside, like the following:
<div id="page1" style="display:block">...</div>
<div id="page2" style="display:none">...</div>
<div id="page3" style="display:none">...</div>
The active block has style="display:block" and the others "display:none"; when a block becomes active it gets "display:block" and the previous one goes to "display:none" (a sketch of the switch follows).
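A minimal sketch of that switch, using the ids from the markup above:

function showPage(id) {
    $('#page1, #page2, #page3').hide(); // previous active block goes display:none
    $('#' + id).show();                 // new active block gets display:block
}
showPage('page2'); // e.g. switch to the second block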
AJAX server communication could look like this, for instance:
$.post("json", { "name": $("#name").val(), "sex": $("#sex").val() },
function(data) { console.log("server responded", data); });
What are disadvantages of this approach?
This is fast and dynamic, but the approach lacks a bookmarking facility: a user can't save a link to a particular block because the content is dynamic. It is also not search-engine friendly, and another disadvantage is that the browser's back and forward buttons will not work.
There are no disadvantages to using pure AJAX. In fact, in a lot of ways, it's advantageous.
If you don't plan for it correctly you can have some issues. For example, you're used to having CSS affect one page at a time, but with pure AJAX the CSS will affect every single "page".
Also, everything becomes slightly harder; linking to another page, for instance, now requires JavaScript.
Lastly, your APP will now REQUIRE JavaScript, so I'd have a decent fallback.
This approach is used in some mobile Web frameworks, such as jQuery Mobile, and is intended to make a Web application feel more native. It is more Web 2.0 than traditional websites or web applications, where each page transition involves a trip to the server.
I'm sure you know the advantages already, so let's move on to the disadvantages.
Slightly Greater Initial Latency:
The main disadvantage of this approach is that it will take slightly longer to load the page content due to the fact that you're getting all of the HTML from the server in one single trip. Thus, the initial load time may involve more latency than in a traditional Web 1.0 application. However, with just a few pages, in my experience, the latency is not significant enough for it to be a problem.
Loss of Back Button - More Complexity in Maintaining History:
Another potential disadvantage is that, as a developer, you'll need to approach the development of your site differently. Because you're transitioning pages by hiding one DIV block and unhiding another, you'll lose native back button functionality. This can be mitigated by using the hash in the URL to record the history of page transitions. You'd then need to register an event to watch the hash and reload old content as the user navigates backwards. You'd also need to change the state of JavaScript objects and variables to reflect the old state, which may add complexity to your app. There are of course APIs and libraries to make this easier to implement and help ensure that you write good, maintainable code.
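A minimal sketch of that hash-watching idea, reusing the block ids from the question and assuming a browser and jQuery version with hashchange support:

// Record a page transition as a history entry
function goTo(id) {
    location.hash = id; // e.g. "#page2"
}

// Restore the matching block as the user navigates back/forward
$(window).bind('hashchange', function () {
    var id = location.hash.slice(1) || 'page1'; // default block
    $('#page1, #page2, #page3').hide();
    $('#' + id).show();
});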
More Stateful Scope Involves Rethinking Approach and Possible Learning Curve:
Lastly, you'll need to remember that the scope of each page doesn't reset after each transition. While this could actually be an advantage in that your app is more stateful, you'll need to untrain yourself in the way of thinking that each page loaded will cause all of the JavaScript variables and data you've set to be cleared out.
Summary:
My suggestion, if you're going to go this route, is to use a library. Don't reinvent the wheel unless you have a good reason to. Libraries like jQuery Mobile help ensure that there is a good fallback for older browsers, and some even make sure that your site will still load using Web 1.0 techniques when JavaScript is disabled.
I'm having an issue with slow AJAX calls. This is a common question, but I've done everything suggested in all the research I can find. I'm hoping to get a consensus from people who read this.
Basically, I make an AJAX request to a PHP page, which gets info from a database.
Here is the page.
I've timed all of my JavaScript, MySQL, and PHP scripts, requests, and pages.
(If you run Firebug you can see my time markers in the console, as well as in the XML.)
As an example:
The MySQL request takes 20 ms.
The PHP page takes 50 ms.
The AJAX success handler, which processes the small amount of XML (less than 1 KB) and generates the markers, takes 8 ms to run.
Yet, loading the page takes nearly 4 seconds.
So, assuming none of my scripts are lagging, this has to be a problem with the response time from the server, or my own internet connection, right?
I'd appreciate any theories or thoughts.
Thank you
OK, I looked at your page and here are some of the issues I saw that would affect speed:
It takes 4 ms to get your data in your getMarkers function, but it takes 892 ms to read the XML file. I would recommend falling back to vanilla JavaScript to read your XML file, as the number of find() calls you are performing is really hurting your performance here (a sketch follows).
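A minimal sketch of that idea; the marker element and its lat/lng attributes are hypothetical stand-ins for whatever your XML actually contains:

// Walk the XML DOM directly instead of making repeated jQuery find() calls
function readMarkers(xml) {
    var markers = [];
    var nodes = xml.getElementsByTagName('marker'); // hypothetical element name
    for (var i = 0; i < nodes.length; i++) {
        markers.push({
            lat: parseFloat(nodes[i].getAttribute('lat')), // hypothetical attributes
            lng: parseFloat(nodes[i].getAttribute('lng'))
        });
    }
    return markers;
}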
Minify and combine all of the scripts that are local on your server; I was getting some really high response times. You can eliminate four HTTP requests by doing this, which, given the response times on your server, will help a bit. (Note: don't combine jQuery or jQuery UI into this.)
Since your server is a bit slow (not your fault; this will vary, as you are probably on shared hosting), I would recommend linking jQuery and jQuery UI to the Google CDN hosted versions. Here is a post on that: jQuery CDN.
You have 24 images on your page, 23 of which are under 4 KB. Combine those 23 into one CSS sprite image, assign a 1px x 1px blank GIF as the inline HTML image, and use the CSS sprite image instead. Here is a good article on what this is if you are unfamiliar: CSS Sprites explained. Also, here is a good online CSS sprite generator: CSS Sprite generator.
Make sure you need jQuery UI for this page; I didn't see anything that would have required it. If you can remove it, you save yourself 206 KB. Remember to remove the associated CSS file if it isn't needed. This would save you another two calls.
I didn't dig too deep, but if you aren't already, kick off the call that sets up the Google map in a $(document).ready() handler (sketched below). That way the rest of your page can load, and you can display a loading animation in that area, so users know something is happening and your page will appear to load a lot quicker.
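A minimal sketch of that deferred setup; initMap() and the #map container are hypothetical names for your own map bootstrap code:

$(document).ready(function () {
    $('#map').addClass('loading'); // hypothetical class that shows a spinner
    initMap(function () {          // hypothetical map setup with a done callback
        $('#map').removeClass('loading');
    });
});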
Doing the above will greatly speed things up. You would go from 82 components down to 51 local ones, plus 2 more on the Google CDN. If you can improve that XML read time, you can shave nearly another second off the load time as well.
When cached, my starting page only needs to load one element (the "root document") - but then it needs some time until it's rendered completely:
[Screenshot of the Firebug Net panel: http://www.walkner.biz/_temp/firebug_net.png]
The elements that follow are things loaded asynchronously via JavaScript.
Two questions:
Why does it take so "long" from loading the root document until the DOMContentLoaded event?
Does it make sense to load some not-so-important things asynchronously? Is it important to have the DOMContentLoaded event fire as early as possible? Unfortunately there's not much documentation about that event, but I don't think it's the moment when the page is displayed, is it?
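For reference, a minimal sketch that logs when the two events fire; DOMContentLoaded fires once the HTML is parsed and the DOM is ready, while load also waits for sub-resources (times are measured from when this inline script runs, not from navigation start):

var start = new Date().getTime();
document.addEventListener('DOMContentLoaded', function () {
    console.log('DOMContentLoaded after', new Date().getTime() - start, 'ms');
}, false);
window.addEventListener('load', function () {
    console.log('load after', new Date().getTime() - start, 'ms');
}, false);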
I'm not sure YSlow is going to help him, as that will download all elements for a page and run performance tests on them, whereas swalkner's problem is how long it takes to render the HTML page itself when all the other elements (images, CSS, etc.) are cached.
At least that's what I think he's saying.
In the original question you said, "The elements that follow are things loaded asynchronously via JavaScript," but then listed nothing. What is loaded?
I would suggest checking for JavaScript errors in the first instance. Then try removing your asynchronous loading calls one by one until you hit the bottleneck. In fact, remove them all: how long does the downloaded HTML take to render? Take that time and work from there.
Is your HTML document very big? Does it use lots of inline styles that could be in the CSS file?
Perhaps if you posted a link to the site then people would have a look at it.