Kohana execution time is fast, but overall response time is slow. Why?

I use Kohana 3's Profiler class and its profiler/stats template to time my website. On a very simple page (no AJAX, no jQuery, etc., just loading a template and showing a short text message, with no database access), it shows the request time as 0.070682 s (the "Requests" item in the profiler/stats template). I then used two microtime() calls to measure the duration from the first line of index.php to the last line, and the result is also very fast: 0.12622809410095 s. Very nice.
But if I time the request from the browser's point of view, it's totally different. Using Firefox with the Tamper Data add-on, the duration of the request is 3.345 s! And I noticed that from the time I click the link to the website (Firefox starts the animated loading icon) to when the browser finishes its work (the icon animation stops), it really takes 3-4 seconds!
On another website of mine, built with WikkaWiki, the time measured by Tamper Data is only 2190-2432 ms, including several accesses to a MySQL database.
I tried a clean installation of Kohana, and the default plain hello-world page also takes 3025 ms to load.
All the websites mentioned here were tested on the same localhost PC with the same settings; they are simply hosted in different directories on the same machine. Only the Database module is enabled in bootstrap.php for the Kohana website.
I'm wondering why the Kohana website's overall response is so slow while the PHP execution time is just 0.126 seconds. Is there anything I should look into?
Edit: additional information
The result for a standard phpinfo() page is 1100-1200 ms (Tamper Data).

The Profiler shows you the execution time from Kohana initialization to the Profiler render call, so it's not the full Kohana time. Some actions (Kohana::shutdown_handler(), Session::_destroy(), etc.) may take a long time.
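As a rough illustration of that gap, here is a minimal sketch (assuming PHP 5.3+ for the closure; the log path is an assumption) of how you could time the whole request, including the shutdown work the Profiler never sees:

    <?php
    // index.php (very top): record the real start of the request.
    $request_start = microtime(TRUE);

    // index.php (bottom, after the Kohana bootstrap has run): shutdown functions
    // run in registration order, so registering this one last means it also covers
    // Kohana::shutdown_handler(), session writes, etc.
    register_shutdown_function(function () use ($request_start) {
        $elapsed = microtime(TRUE) - $request_start;
        // Log instead of echoing, since the output has already been sent by now.
        // APPPATH is Kohana's application path; the file name is an assumption.
        file_put_contents(APPPATH.'logs/request_time.log',
            date('c').' '.number_format($elapsed, 6)." s\n", FILE_APPEND);
    });

Comparing that number with what Tamper Data reports tells you how much time is spent outside PHP entirely (DNS, the web server, keep-alive behaviour, and so on).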

Since your post confirms Kohana is finishing in a tenth of a second or less, it's probably something else:
Have you tested anything other than Kohana? It sounds like the server is at fault, but you can't be sure unless you compare the response times with something else. Try a plain HTML page and a pure PHP page (see the sketch below).
The Firefox measurement could be taking external media into account. So if you have a slow connection and you load something like Google Analytics, that could be another problem.
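A minimal sketch of such a baseline page, assuming you drop it into the same web root as the Kohana site (the file name is arbitrary):

    <?php
    // test.php: no framework, no database - just how fast the server itself answers.
    $start = microtime(TRUE);
    echo 'Hello from plain PHP';
    echo '<!-- generated in '.number_format(microtime(TRUE) - $start, 6).' s -->';

If Tamper Data still reports multiple seconds for this page, the problem is in the web server or network setup, not in Kohana.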

Maybe this is related: Firefox and Chrome slow on localhost; known fix doesn't work on Windows 7.
Although that issue occurs on Windows 7, it may still help...

Related

2 instances of 1 image on a page

I am not really sure how to google this, so I thought I could ask here.
I have the same image posted on a page twice. Will that slow down the page load, or will it remain the same since I am using the same resource?
The browser should be able to cache the image the first time it is requested from the source server; most popular browsers have this implemented. It should not have to load it twice: just once on the initial request to the server, then it uses the cached version the second time.
This also assumes the end user has browser caching enabled. If it is turned off (even if the browser supports it), the browser will make that extra request for the image, since there is no cache to pull from.

MVC 3 with IE, poor bundle performance

We have deployed an MVC 3 website on an IIS6 box.
Everything runs fine, but the performance is abysmal.
Can anyone help me understand:
why I am getting 20-second response times for a script bundle?
why the bundled scripts are not cached by IE even though the Expires header is set?
The site is several times faster in Chrome (I have noticed the cache behaviour is correct), but we cannot force customers to use it.
Any help would be great. I'm wondering if it's a server-side setting that forces the bundle to be recompiled on each request, or if it's just IE behaving as usual.
Edit: as requested in the comments, I'm also including the bundle request headers:
If you have different download times for a full reload between the two browsers, it could be that you are doing intense computations with a client-side framework like AngularJS (I have seen big performance differences between the two browsers with highly complex AngularJS apps).
If both browsers show the same download time, it is either a network issue or a server issue.
The IE caching could be a separate issue; break your problem into two parts and look for the cause of the slow downloads first.
All I can do now is suggest an approach to finding the issue.
Summary of what you know
It looks like you have:
The server sends an Expires header one year in the future.
When you reload the page (i.e. you don't force a full refresh using Ctrl+F5):
IE takes no notice of the cache header, and when it sends its new request it doesn't use If-Modified-Since or If-None-Match.
Chrome behaves differently and respects the Expires and/or ETag response headers (it doesn't even make the request again for the bundle); an example of the expected exchange is sketched below.
EDIT 1: You also seemed to be saying (though it would be good to see a timeline from Chrome) that Chrome downloads the files faster, implying it is not a server-side problem; however, your latest comment states that Chrome's downloads are also slow. (end edit)
And you also seem to be saying that this behaviour is consistent (i.e. 100 requests in IE, and 100 requests in Chrome show the above behaviour with no deviations).
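For reference, a sketch of what a well-behaved exchange for the bundle would look like; the URL, dates, and ETag value are made up:

    # Initial response for the bundle (illustrative values only):
    HTTP/1.1 200 OK
    Cache-Control: public
    Expires: Fri, 13 Jun 2014 10:00:00 GMT
    ETag: "abc123"

    # What the browser should send when it needs to revalidate (a conditional request):
    GET /bundles/scripts HTTP/1.1
    If-None-Match: "abc123"
    If-Modified-Since: Thu, 13 Jun 2013 10:00:00 GMT

    # Expected answer when nothing changed:
    HTTP/1.1 304 Not Modified

Your description suggests IE is skipping the conditional request entirely and re-downloading the full bundle every time.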
Approach
You should break this problem into two parts:
1. Why is the download so slow?
Is there a server-side performance problem? Look for similar download times in IE, Chrome, and Firefox (it could be due to bundling/minification/compression on the server).
Is there a network connectivity issue (dropped packets, for instance)? Look for inconsistent download times, Start times, and Request times between requests in a given browser, and the same behaviour across all browsers.
Is a script slowing down IE but not Chrome? (This is not uncommon; I maintain legacy sites where the scripts don't run well in IE but do in Chrome.) Compare the profile results between browsers.
2. Why is the JavaScript not being cached in IE? Troubleshoot (1) first, then worry about this.
It is possible that the two are related, but approaching them separately will be a start. Number 1 is far easier to diagnose than 2; the top references on the web to caching JavaScript in IE are about preventing it in order to help with development.
Root cause diagnosis
EDIT 1: The first thing to do is try the site from a browser on the server, or very close to the server, to see if you have a network issue. (end edit)
Tools like Fiddler, the browser developer tools (timeline and script profiler), and YSlow are your friends. Compare each of the following between Chrome and IE (and see what happens in Firefox as well) and spot the difference. Note: you may need to clear the browser cache between tests.
browser developer tools -> script profile: see if you have a slow running script in IE compared to Chrome
similar analysis in a tool like YSlow (look for comparisons between the two browsers, not script improvements)
request and response headers, and timeline from a normal (i.e. not full reload) page load
request and response headers, and timeline from a full page reload (Ctrl+F5)
Start and Request durations for every JS file, within a given browser and between browsers (this may point to network issues). I note that the Start and Request alone are taking 0.6 s and 1 s each in IE; that is very, very poor performance.
5 requests, and 5 full reloads with cache clearing between (that is, don't chase a ghost - be consistent in your test methodology)
Download times should be no different between Chrome and IE with no scripts actually running, so also add a control test. Assuming that your bundle files don't "do anything" (i.e. they contain functions that the page calls rather than kicking off long processes by themselves), create a blank page in your site which references exactly the same JavaScript files - not just the bundle, but every single JS reference.
With the control test you can compare pure download times and caching behaviour in IE and Chrome without any client-side JavaScript running (use the developer tools profiler to verify no scripts are running). If your bundle files do kick off long-running things, temporarily disable them by putting return statements at the top of the scripts and concentrate only on the download into the browser.
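A minimal sketch of such a control page; the bundle URL and script name are placeholders for whatever the real page references:

    <!-- control.html: the same script references as the real page, but nothing
         that calls them, so only download and caching behaviour is measured. -->
    <!DOCTYPE html>
    <html>
      <head>
        <script src="/bundles/scripts"></script>
        <script src="/Scripts/page-specific.js"></script>
      </head>
      <body>Control page - intentionally empty.</body>
    </html>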

Website optimization / decrease loading time in asp.net mvc 3

I have built a test website using the nopCommerce open-source platform. Everything is working fine, but I need to know why my website's loading time is greater than 6 seconds: the homepage works fine, but the category pages take 6-10 seconds when clicked. How can I check the HTTP requests and the calls to the DB so that I can track which function is taking a long time?
Test website is test website
Thanks
Things I would try, in that order:
MvcMiniProfiler.
Analyze my code for possible performance bottlenecks using a .NET profiler.
Finally, submit a bug to nopCommerce support if the previous approaches didn't yield anything fruitful that would implicate my own code.
In between, I might also check with my hosting provider whether it is the cause of the slowness.
As a quick and dirty check, you can add the time it takes to generate the response as a column in the IIS logs - that will give you some idea as to whether the server is being slow to serve the pages or you need to do some front-end optimisation work.
On the front-end side, the first thing you need to do is merge all the CSS files for a theme into one to save on round trips - the browser can't render the page until it has the CSS.
All the .js files you have in the head will also block the page; can you merge them and load them later?
The performance of imagegen.ashx looks on the slow side - do you need to generate the banners on the fly, or could they be pre-generated?
If the back-end side of generating the page is slow, there are some scripts around the web that show which queries are using the most CPU, making the most IO operations, etc.
Below is a list of things you can improve:
1. Combine your JS.
There are a few tools you can use, for example JSMin; you can read this post: http://encosia.com/automatically-minify-and-combine-javascript-in-visual-studio/. However, JSMin doesn't seem to compress the combined JS.
Another option is jMerge: http://demo.lateralcode.com/jmerge/ It does this after the fact, in the sense that you need to have the site up before combining with jMerge, since it only takes an HTTP link.
The best one I've found so far is the bundling and minification feature of MVC 4. It's part of MVC 4, but you can get a NuGet package for your MVC 3 app.
Word of advice: bundling every JS file you have is not necessarily a good idea; it even backfires sometimes, since you end up with one big JS file that the browser has to download sequentially instead of several smaller ones (you might want to look into head.js to make JS downloads parallel). So the trick here is to keep a balance. I ended up loading jQuery from the Google CDN and bundling the rest of my JS into one file.
2. Put JS at the bottom of the page so the browser doesn't have to load the JS before it starts to render the page. You need to be careful with this one, though: normally you will have jQuery functions doing things on document.ready() in the header of the page, and I advise moving those to the bottom of the page as well, if possible.
If you move the JS references and script blocks in your layout page to the bottom, you will most likely run into problems with nested JS references and script blocks in your individual views. No worries: you then need to look into using @section in your views (probably suitable for discussion in another thread) and rendering it in your layout page, so that the references and script blocks inside your views get rendered at the bottom of the page at run time; a rough sketch follows.
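A rough sketch of how the bundling package and @section can fit together; the bundle name, file paths, and section name are assumptions, not taken from the question:

    // App_Start/BundleConfig.cs - requires the Microsoft.AspNet.Web.Optimization
    // NuGet package when used from an MVC 3 project; call RegisterBundles from
    // Application_Start, passing BundleTable.Bundles.
    using System.Web.Optimization;

    public class BundleConfig
    {
        public static void RegisterBundles(BundleCollection bundles)
        {
            // One bundle for everything except jQuery, which stays on the Google CDN.
            bundles.Add(new ScriptBundle("~/bundles/site").Include(
                "~/Scripts/site.js",
                "~/Scripts/plugins.js"));
        }
    }

In the layout page, render the bundle and the per-view section just before the closing body tag, e.g. @Scripts.Render("~/bundles/site") followed by @RenderSection("scripts", required: false), and wrap each view's page-specific code in @section scripts { ... }.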
3. Use a CDN.
Pretty straightforward.
4. Combine CSS.
Combine your stylesheets into one with the same tool you use for combining JS, but reference the result in the page header instead of at the bottom.
5. Enable static content caching with something like the snippet below in your web.config file.
It won't help with the first-time load, but it will definitely make things a lot faster for returning users.
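A minimal sketch of what that web.config section might look like; the 30-day max-age is an arbitrary assumption:

    <!-- web.config -->
    <system.webServer>
      <staticContent>
        <!-- Send a far-future cache header for static files (IIS 7+). -->
        <clientCache cacheControlMode="UseMaxAge" cacheControlMaxAge="30.00:00:00" />
      </staticContent>
    </system.webServer>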
6. Enable URL compression.
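Again as a sketch, assuming IIS 7 or later, this is the kind of web.config element that controls it:

    <system.webServer>
      <!-- Compress both static and dynamic responses. -->
      <urlCompression doStaticCompression="true" doDynamicCompression="true" />
    </system.webServer>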
Time to first byte
This is one of the metrics used by webpagetest.org. Don't bang your head against this one too much, though, as it basically tells you how fast your web server can serve the content, so there's probably not much you can do here from the software end.
Hope that helps!
nopCommerce is dreadfully slow, and the developers don't seem to take the performance issues seriously. I have seen a lot of performance-related forum threads left unanswered, so best of luck.

Display images and not allow hot linking

We need to display ~40 images on a page and not allow users to hotlink those images. We are currently using <img src="..."> pointing to a handler that checks cgi.http_referer and serves the images using cfcontent. However, some images fail to load (~6 out of 40), and if I refresh the page, different images fail to load.
This problem seems to appear when I have to display more than 10 images. I suppose this is because I'm using cfcontent? If so, what should I use instead?
To find out exactly why those images are failing, you'll need to do a little more work. Use something like Firebug in Firefox, or the console in Safari or Chrome, to find out what's happening with the requests that are failing. You can also use something like Fiddler on Windows for IE, or Charles on the Mac, Windows, or Linux, to see the full HTTP requests that are happening in the background, along with the full return values from your ColdFusion app server. Until you know exactly why they're failing, we can't come up with any sort of solution.
The other thing to remember is that if you do this via ColdFusion, then for every page load you're hitting your CF server with 40 more requests, so one page results in 41 hits to your CF server for processing. Make sure that code is as tight as it can possibly be.
If I were going to go this route, I'd do it at the server level (IIS or Apache) using some sort of server-level filter to prevent the hotlinking. But remember that there will always be a way around it.
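For example, a rough sketch of the Apache variant in an .htaccess file; the domain is a placeholder:

    # Block image requests whose Referer is set but does not point at this site.
    RewriteEngine On
    RewriteCond %{HTTP_REFERER} !^$
    RewriteCond %{HTTP_REFERER} !^https?://(www\.)?example\.com/ [NC]
    RewriteRule \.(gif|jpe?g|png)$ - [F,NC]

IIS can do the equivalent with the URL Rewrite module, and either way the images are then served as plain static files rather than through cfcontent.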

Sporadic page load failure

An issue has started recently in Chrome and reportedly Firefox: pages load fine and browsing is normal, and then suddenly a page fails to load (it keeps spinning as if loading). The page that fails is often not the same one.
If I refresh the page or try to go to another page on the domain within the same browser, the browser doesn't even try to resolve the name or make a connection, and is then unable to load the page.
Swapping to another browser, I am back to browsing the domain normally again, while the original browser (in most cases Chrome) will not load the pages until it is restarted.
This has happened with 3 different people on 3 different machines in both Chrome and Firefox.
The domain it is running on has a lot of AJAX calls on certain pages; I am not sure if the server is tripping up due to the number of requests from the one client.
I am not sure if this is a server, client, or script issue, as I cannot personally reproduce it, so I can do little to debug it or work out what is causing it or how to fix it...
As you can see, I am not sure of a lot with this problem :) so I am throwing it out to Stack Overflow in the hope that someone may have had similar experiences or any directions I could look towards.
Cheers,
Brendan
If the page is making many requests in a short time, your firewall (router) may block it. I've noticed this behavior on my own router, and had to set it to a less restrictive level to make things work.
