25-second load time on initial network request to my website - performance

When debugging with Chrome's performance profiler, I recorded a reload of my own page. The following is the result.
chrome performance profiler screenshot
After reloading the page, there is a 25-second network request that happens first; after that, the website loads really fast. I've been searching the internet for how to debug this further or why this could happen, but honestly I can't figure it out.
I've tried using another network, my 4G on mobile, flushing DNS, using other browsers/incognito, and clearing the cache. Nothing works.
My hosting company says this is a problem on my end, and GTmetrix also shows very good results. But I'm not sure I trust that, since it happens on both my phone and my computer (4G roaming on the phone).
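One way to see where that time actually goes (DNS, connect, or server wait) is the Navigation Timing data the browser already records. A minimal sketch you can paste into the DevTools console after the page has loaded (values in milliseconds):
// Break the initial document request down into phases
const nav = performance.getEntriesByType('navigation')[0];
console.table({
  dnsLookup: nav.domainLookupEnd - nav.domainLookupStart,
  tcpConnect: nav.connectEnd - nav.connectStart,
  serverWait: nav.responseStart - nav.requestStart, // the TTFB portion
  download: nav.responseEnd - nav.responseStart
});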

Related

Laravel slow on 1st time load

I have a Laravel app that loads fine on localhost, but when deployed on a shared 1&1 host (for demo purposes) the first page load is very slow (up to 12 s!). It only occurs on the first page load; after that it works perfectly fine, as if the site had gone into a "sleep" mode after not being used for a while.
It does sound like a cache issue, though I have activated all the Laravel caches (views, config, routes...).
Someone mentioned a similar problem on a GoDaddy shared host; their solution was to have a cron job ping the site every minute to keep it alive. That probably works, but it's not a very satisfactory solution.
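For what it's worth, that keep-alive workaround is just something requesting the site on a schedule. A rough sketch as a small Node.js script rather than a cron entry (the URL is a placeholder, and the built-in fetch assumes Node 18+):
// Hit the site every minute so the host doesn't put the app to "sleep"
const SITE_URL = 'https://example.com/'; // placeholder for your site
setInterval(async () => {
  try {
    const res = await fetch(SITE_URL, { method: 'HEAD' });
    console.log(new Date().toISOString(), res.status);
  } catch (err) {
    console.error('ping failed:', err.message);
  }
}, 60 * 1000);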
The debugger/console are not showing much:
On first page load:
Queries: 343 ms
Route request: 12.46 s
The console shows a 12.69 s TTFB waiting time.
After reload:
Queries: 39.47 ms
Route request: 238 ms
Console: 334 ms
Has anyone come across a similar issue before?
The issue seems to be related to the host. The same install on a different host works perfectly; no real answer here.

Sometimes pages on my website load very slowly

Across all browsers and devices, I find that random pages, at random times, are very slow to load or don't load at all. The browser gets stuck on 'Waiting for website.com'. I will wait 20 seconds and nothing happens until I manually refresh the page. As I realise this is very vague, can you suggest a) the most likely issues to look for first, or b) some diagnostic tools I could use to start debugging the issue, so that my hosts/developers can solve it? Here are some results of recent speed tests.
One thing to add is that it seems to get stuck more often on particular pages, namely the pages where users take practice tests. Each time the user clicks 'Next', their selected answer is inserted into the database. My speculation is that it's potentially an issue with the DB itself, or with the process that inserts into the database. It's when clicking 'Next' that the whole website sometimes just dies, as described above.
Results from Google Speed Test
Waterfall image
A wait time of 20 seconds at random times on random pages could possibly be due to stop-the-world garbage collection, so GC logs are probably a good starting point.
A thread sampler such as Djigger, which a colleague of mine wrote, might also help you figure out what the machine is doing during those 20 seconds.
If that doesn't help, I suggest using a profiler or, better, an APM tool to monitor what's going on in your system. Those tools give you broader insight into the internals.
You need to run a few page speed tests and look at the waterfall images.
It is very common on shared servers for the server to be too busy to get to your request. 20 seconds would indicate a serious issue with the server.
Another common reason is that the page has a link to a third-party resource and that resource is too often unavailable.
In your case the culprit is website.com and I assume that is your site.
Use something like webpagetest.org to run the tests.
In the waterfall image below:
Dark green is DNS lookup time.
Orange is the time for the browser to connect to the server.
Green is the wait time for the server to put the content into the output buffer.
Blue is the time for the server to transmit to the browser.
The problem with the sample waterfall page is that the index page took 4 seconds to be generated or retrieved. Most likely this is a WordPress site with plugins.
I suspect yours may be 20 seconds. But given the randomness, it is also quite possible that a page resource is stalling.
If it is the index page, then you likely have a poor ISP, and/or one of the other users of the server is hogging the CPU.
Keep running the tests until you see the problem occur.
It will be very obvious where the problem is located.
You can post the waterfall image and send me a message if you have any questions.
Waterfall from webpagetest.org

Web app initial load time

I am using a shared hosting plan at Bluehost to host a golf tournament live scoring mobile web app. I am caching everything I can on Cloudflare, and I've spent quite some time optimizing the initial download and rendering times. There might be more I could do, but without question my single biggest issue is the initial call to my website: www.spanishpointscup.org. Sometimes this seems to be related to DNS lookup and other times to Waiting (TTFB).
Below are two screenshots of the network calls that show variations in accessing my index.html. Sometimes this initial file load can take even longer. Very rarely do any of the other downloaded files create a long delay, so my only focus now is the initial file load. I think that even if I had server-side rendering, I would still have this issue.
Does anyone have specific recommendations that they are confident will help me? Switch to a VPS or another host? Thank you.
This is typical when you use a shared server.
DNS has nothing to do with the issue. DNS has to do with the request, not the response. It is the browser that must resolve the domain name to an IP address.
The delay you are seeing is due to the server being busy; your page is sitting in a queue waiting behind other processes. Possibly you have a CPU-grabbing neighbor on your shared server, or Bluehost has some performance issues.
You will likely notice some image files take an excessively long time to transmit. Which image is slow will appear to be random with each fresh (not in cache) page load.
UPDATE
After further review I noticed the "wait" times are excessive. Wait time is shown in green on your waterfall; it is the time it takes the server to retrieve the page from disk and put it into the transmit buffer. Notice how the transmit time (blue) is short by comparison. 300-400 milliseconds of wait is excessive.
Find a new service provider.

Proxy could not connect to the destination in time

I have been having issues with loading pages from the website I created. The strange thing is that after I reload a page (e.g. Ctrl+R), especially if I do it multiple times in a row, sometimes the page loads flawlessly fast, sometimes it takes 10-20 seconds, and sometimes it doesn't load at all because, based on the Network tab in the developer panel, some files are 'pending' but never load. I then get a "proxy could not connect to the destination in time" error if the very first index.php doesn't load, or the page loads only partially and I get an error in the console: "could not connect to [filename]".
A few facts to keep in mind:
This issue occurs on all browsers I tested (Chrome, Firefox, IE, and Opera)
I am not focused on loading the stylesheets or script files asynchronously at this point. As a result, my pages are not rendered until all files are loaded, but at this stage of development this is not a top priority
I'm using GoDaddy shared hosting for this website
I am accessing the website through a corporate proxy server (I believe they use McAfee Enterprise)
If I test the loading speed using external tools (tools.pingdom.com or google developers tool), there appears to be no issue with the loading speed
Even under the corporate proxy server, I have never had this issue with any other websites
My question is whether anybody else has had this issue and knows how to mitigate it. If it's the proxy, I'm not sure why other sites work just fine. Any thoughts?

Does Google Analytics have performance overhead?

To what extent does Google Analytics impact performance?
I'm looking for the following:
Benchmarks (including response times/page load times, etc.)
Links or results to similar benchmarks
One (possible) method of testing Google Analytics (GA) on your site:
Serve ga.js (the Google Analytics JavaScript file) from your own server.
Update it from Google daily (test 1) and weekly (test 2).
I would be interested to see how this reduces the communication between the client webserver and the GA server.
Has anyone conducted any of these tests? If so, can you provide your results? If not, does anyone have a better method for testing the performance hit (or lack thereof) for using GA?
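For the self-hosting idea above, the refresh step could be as small as a script run on a schedule. A sketch (Node.js 18+ for the built-in fetch; the output path is hypothetical):
// Download the current ga.js from Google and save a local copy to serve yourself
const { writeFile } = require('node:fs/promises');
async function mirrorGaJs() {
  const res = await fetch('https://www.google-analytics.com/ga.js');
  if (!res.ok) throw new Error('unexpected status ' + res.status);
  await writeFile('./public/ga.js', await res.text()); // hypothetical path
  console.log('ga.js refreshed at', new Date().toISOString());
}
mirrorGaJs().catch(console.error);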
2018 update: Where and how you mount Analytics has changed over and over and over again. The current gtag.js code does a few things:
Load the gtag script asynchronously (non-blocking). This means it doesn't slow your page down in any way other than bandwidth and processing.
Create an array on the page called window.dataLayer
Define a little gtag() function that just pushes whatever you throw at it into that array.
Call that with a page-load event.
Once the main gtag script loads, it syncs this array with Google and monitors it for changes. It's a good system, and unlike the previous approaches (e.g. stuffing code in just before </body>) it means you can fire events before the DOM has rendered, and script order doesn't really matter, as long as you define gtag() first.
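For reference, the standard snippet looks roughly like this (GA_MEASUREMENT_ID is a placeholder for your property ID):
<!-- load gtag.js asynchronously so it never blocks parsing -->
<script async src="https://www.googletagmanager.com/gtag/js?id=GA_MEASUREMENT_ID"></script>
<script>
  window.dataLayer = window.dataLayer || []; // the queue described above
  function gtag(){dataLayer.push(arguments);} // just pushes onto the queue
  gtag('js', new Date()); // the initial page-load event
  gtag('config', 'GA_MEASUREMENT_ID');
</script>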
That's not to say there isn't a performance overhead here. We're still using bandwidth on loading up the script (it's cached locally for 15 minutes), and it's not a small pile of scripts that they throw at you, so there's some CPU time processing it.
But it's all negligible compared to (eg) modern frontend frameworks.
If you're going for the absolute, most cut-down website possible, avoid it completely. If you're trying to protect the privacy of your users, don't use any third party scripts... But if we're talking about an average modern website, there is much lower hanging fruit than gtag.js if you're hitting performance issues.
There are some great slides by Steve Souders (client-side performance expert) about:
Different techniques to load external JavaScript files in parallel
their effect on loading time and page rendering
what kind of "in progress" indicators the browser displays (e.g. 'loading' in the status bar, hourglass mouse cursor).
I haven't done any fancy automated testing or programmatic number crunching, but using good old Firefox with the Firebug plugin and a pair of JS variables to measure the time difference before and after all the GA code is executed, here is what I found.
Two things are downloaded:
ga.js is the JavaScript file containing the code. This is 9 KB, so the initial download is negligible, and the filename isn't dynamic, so it's cached after the first request.
A 35-byte GIF file with a dynamic URL (via query string args), so this is requested every time. 35 bytes is a negligible download as well (Firebug says it took 70 ms to download).
As far as execution time goes, my first request with a clean browser cache averaged about 330 ms, and subsequent requests were between 35 and 130 ms.
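A rough sketch of that kind of manual timing, with the GA block itself elided (the variable names and console call are just for illustration):
<script>
  var gaStart = new Date().getTime(); // timestamp before the GA block
</script>
<!-- ...the ga.js include and pageTracker._trackPageview() call go here... -->
<script>
  var gaEnd = new Date().getTime(); // timestamp after the GA block has run
  console.log('GA block took ' + (gaEnd - gaStart) + ' ms');
</script>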
From my own experience, adding Google Analytics has not changed the load times.
According to Firebug it loads in less than a second (648 ms on average), and according to some of my other tests, roughly 60-80% of that time was spent transferring the data from the server, which of course will vary from user to user. I don't particularly think that caching the analytics code locally will change the load times much, for the above reasons.
I use Google Analytics on more than 40 websites without it ever being the cause of any slowdown, even a small one; most of the time is spent fetching the images, which, given their typical sizes, is understandable.
You can host the ga.js on your servers with no problems whatsoever, but the idea is that your users will have the ga.js cached from some other site they may have visited. So downloading ga.js, because it's so popular, adds very little overhead in many cases (i.e., it's already been cached).
Plus, DNS lookups do not cost the same in different places due to network topology. Caching behavior would change depending on whether users use other sites that include ga.js or not.
Once the JavaScript has been loaded, the ga.js does communicate with Google servers, but that is an asynchronous process.
There's no/minimal site overhead on the server side.
The HTML for Google Analytics is three lines of JavaScript that you place at the bottom of your web page. It's nothing, really, and doesn't consume any more server resources than a copyright notice.
On the client side, the page can take a little bit of time (up to a couple of seconds) to finish displaying. However, in my experience the only bit of the page not yet loaded is the Google stuff, so users can see your page perfectly fine; you just get the throbber at the top of the page spinning for a little longer.
(Note: you need to place your Google Analytics code block at the bottom of any served pages for this to be the case. I don't know what happens if the code block is placed at the top of your HTML.)
The traditional instructions from Google on how to include ga.js use document.write(). So, even if a browser could somehow load external JavaScript libraries asynchronously until some of their code actually has to be executed, the document.write() would still block page loading. The later asynchronous instructions do not use document.write() directly, but maybe insertBefore also blocks page loading?
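The traditional (synchronous) snippet looked roughly like this, which is why the parser has to stop until ga.js arrives (UA-XXXXXX-X is a placeholder property ID):
<script type="text/javascript">
  var gaJsHost = (("https:" == document.location.protocol) ? "https://ssl." : "http://www.");
  // document.write inserts a blocking script tag for ga.js
  document.write(unescape("%3Cscript src='" + gaJsHost + "google-analytics.com/ga.js' type='text/javascript'%3E%3C/script%3E"));
</script>
<script type="text/javascript">
  var pageTracker = _gat._getTracker("UA-XXXXXX-X"); // placeholder property ID
  pageTracker._trackPageview();
</script>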
However, Google sets the cache's max-age on ga.js to 604,800 seconds (one week, as in the response below, and even set to public, so also applicable to proxies). So, as many sites load the very same Google script, the JavaScript will often be fetched from the cache. Still, even when ga.js is cached, simply clicking the reload button will often make the browser ask Google about any changes. And then, just like when ga.js was not cached yet, the browser has to await the response before continuing:
GET /ga.js HTTP/1.1
Host: www.google-analytics.com
...
If-Modified-Since: Mon, 22 Jun 2009 20:00:33 GMT
Cache-Control: max-age=0
HTTP/1.x 304 Not Modified
Last-Modified: Mon, 22 Jun 2009 20:00:33 GMT
Date: Sun, 26 Jul 2009 12:08:27 GMT
Cache-Control: max-age=604800, public
Server: Golfe
Note that many users click reload for news sites, forums and blogs they already have open in a browser window, making many browsers block until a response from Google is received. How often do you reload the SO home page? When the Google Analytics response is slow, such users will notice right away. (There are many solutions published on the net for loading the ga.js script asynchronously, especially useful for these kinds of sites, but maybe no longer better than Google's updated instructions.)
Once the JavaScript has loaded and executed, the actual loading of the web bug (the tracking image) should be asynchronous. So, the loading of the tracking image should not block anything else, unless the page uses body.onload(). In this case, if the web bug fails to load promptly then clicking reload actually makes things worse because clicking reload will also make the browser request the script again, with the If-Modified-Since described above. Before the reload the browser was only awaiting the web bug, while after clicking reload it also needs the response for the ga.js script.
So, sites using Google Analytics should not use body.onload(). Instead, one should use something like jQuery's $(document).ready() or MooTools' domready event.
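A minimal sketch of the difference (plain DOMContentLoaded shown next to the jQuery form; the handlers just log):
// Fires once the HTML is parsed, without waiting for images or the tracking pixel
document.addEventListener('DOMContentLoaded', function () {
  console.log('DOM ready; safe to start page scripts');
});
// jQuery equivalent of the above
$(document).ready(function () {
  console.log('DOM ready (jQuery)');
});
// By contrast, this waits for every external resource, including the web bug
window.onload = function () {
  console.log('everything, including images, has loaded');
};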
See also Google's Functional Overview, explaining How Does Google Analytics Collect Data?, including How the Tracking Code Works. (This also makes it official that Google collects the contents of first-party cookies. That is: the cookies from the site you're visiting.)
Update: in December 2009, Google released an asynchronous version. The above should tell everyone to upgrade, just to be sure, though upgrading does not solve everything.
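That asynchronous snippet looked roughly like this (again with a placeholder property ID): commands are queued in _gaq, and the script tag is injected with async set, so parsing is not blocked:
var _gaq = _gaq || [];
_gaq.push(['_setAccount', 'UA-XXXXXX-X']); // placeholder property ID
_gaq.push(['_trackPageview']);
(function() {
  var ga = document.createElement('script');
  ga.type = 'text/javascript';
  ga.async = true; // non-blocking download
  ga.src = ('https:' == document.location.protocol ? 'https://ssl' : 'http://www') + '.google-analytics.com/ga.js';
  var s = document.getElementsByTagName('script')[0];
  s.parentNode.insertBefore(ga, s);
})();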
It really depends on the day. I'm just adding this to a blog. I'm in California, very close to their main data centers, on fast, low-latency business DSL, on an overclocked i5 with plenty of RAM, running a recent Linux kernel and stable Firefox.
Here's a sample page load:
Google Analytics alone added 5 seconds of network download time... to fetch 15 KB!
You can see blogger.com served 34 KB in 300 milliseconds. That's 32x faster!
Also, look at the red line (which represents the onLoad event, meaning there's no more script executing on the page, so the browser can finally stop the loading indicators/spinners/etc.); look how far to the right it is. That's probably 3 seconds of garbage JavaScript processing happening there. It's very uncommon for that line to be so far away from the end of the resource download bars. I'm done debugging this, and it's 1/3 analytics' fault, 2/3 Blogger's fault. ...one would think Google stuff was fast.
Edit:
Some more data. Here's a request with everything cached; the one above was a first visit.
I've removed the Google+ stuff from the page above for two reasons: I was trying to see if it was playing some part in the slow onLoad event (it isn't), and because it is mostly useless anyway.
So, with this we can see that network time is the least of your worries. Even on a fast computer with modern software, the toll Google Analytics + Blogger take on processing time will still push your page load past 7 s. Without Blogger (just check this very site), I'm seeing 0.5 s of delay between the resources finishing loading and the red line kicking in.
Loading any extra JavaScript on your page is going to increase download time from the client's perspective. You can ameliorate this by loading it at the bottom of your page so that your page renders even if GA has not loaded yet. I would avoid hosting it yourself, because you would lose the advantage of the client's cache: if the client already has it cached from some other page, your page's request is filled from the client itself, whereas if you change it to load from your site, it will require a download even if the client already has the code (which is likely). Adding a task to your software processes just to avoid loading the file from Google seems unwarranted for what may be an unnecessary optimization. It would be hard to test this, since it would always serve up faster locally, but what really matters is how fast it works for your customers. If you decide to evaluate keeping it locally, make sure you test it from your home internet connection, not the machine sitting next to the server in your rack.
Use Firebug and YSlow to check for yourself. What you will discover, however, is that GA is about 9 KB in size (which is actually quite substantial for what it does) and that it also sometimes does NOT load very fast (for what reason I don't know; I think it might be the servers "choking" sometimes).
We removed it due to performance issues on our Ajax samples, but then again, for us, being ultra-fast and responsive was priorities 1, 2 and 3.
Nothing noticeable.
The call to Google (including the DNS lookup, loading the JavaScript if not already cached, and the actual tracer calls themselves) should be done by the client's browser separately from actually loading your page. Certainly the DNS lookup will be done by the underlying system and will not, to my knowledge, count as a lookup within the browser (browsers have a limit on the number of request threads they will use per site).
Beyond that, the browser will load the Google script in parallel along with all other embedded resources, so in the worst case you will potentially get an extremely slight increase in the time it takes to download everything (we're talking on the order of milliseconds, unnoticeable). If the Google script is loaded last by the browser, or you don't have many external resources on your page, or your page's external resources are cached by the browser, or Google's script is cached by the browser (extremely likely), then you won't see any difference. It's absolutely trivial overall, roughly the same effect as sticking an extra tiny picture on your page.
About the only time it might make a concrete difference is if you have some behaviour that fires on the onLoad event (which waits for external resources to load) and the Google servers are down or slow. The latter is unlikely to happen often, but if it did, the onLoad event wouldn't fire until the script is downloaded. You can work around this anyway by using various "when DOM loaded" events, which are generally more responsive, since you don't have to wait for your own scripts/images to load this way either.
If you're really that worried about the effects on page load time, have a look at the "Net" panel of Firebug, which will quantify this and draw you a pretty graph. I would encourage you to do this for yourself anyway; even if other people give you the figures and benchmarks you request, it will be completely different for your own site.
Well, I have searched, researched and explored the net extensively, but I have not found any statistical data that argues either for or against the premise.
However, this excerpt from http://www.ga-experts.com claims that it's a myth that GA slows down your website:
Err, well okay, maybe slightly, but we're talking about milliseconds. GA works by page tagging, and any time you add more content to a web page, it will increase loading times. However, if you follow best practice (adding the tag before the </body> tag) then your page will load first. Also, bear in mind that any page-tag-based web analytics package (which is the majority) will work the same way.
From the answers above and all other sources, what I feel is that whatever slowdown it causes is not perceived by the user, as the script is included at the bottom of the page. But if we talk about complete page loads, we might say that it slows down the page-load time.
Please post more info, and data, if you have any.
I don't think this is what you're looking for, but what are you worried about performance for?
If it's your server, then there's obviously no impact, as the script resides on Google's servers.
If it's your users you're worried about, then there is no impact either. As long as you place it just above the closing body tag, your users will not receive anything more slowly than they would before: the script is loaded last and has no effect on what the user sees. So they are essentially not waiting on anything, and they can even continue to browse the page without noticing that it's still loading.
The question was whether Google Analytics will cause your site to slow down, and the answer is yes. Right now, at the time of writing, google-analytics.com is not responding, so sites that include it in their pages won't finish loading; so yes, it can slow down your site and even keep it from loading at all. It's uncommon for google-analytics.com to be down this long (over 10 minutes at this point), but it shows that it is possible.
There are two aspects to it.
Download of the analytics scripts (and a GIF)
Execution of the downloaded scripts
Download time is almost always less than 100 ms, which is acceptable.
Here comes the twist.
analytics.js execution: 250 ms
re-marketing (if enabled): 300 ms
demographics (if enabled): 200 ms
So analytics with re-marketing and demographics enabled takes about 750 ms on average. I feel that this is a huge number when it comes to performance overhead.
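If you want to sanity-check numbers like these on your own pages, one option is the Long Tasks API (supported in Chromium-based browsers). A sketch that logs any script work blocking the main thread for more than 50 ms:
// Report long main-thread tasks as they happen
const observer = new PerformanceObserver((list) => {
  for (const entry of list.getEntries()) {
    console.log('long task:', Math.round(entry.duration), 'ms', entry.attribution);
  }
});
observer.observe({ entryTypes: ['longtask'] });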
I noticed frequent I/O and CPU overload in cPanel, resulting in:
Site unreachable error
That stopped after I disabled the WP Analytics plugin, so I reckon it does have some impact.
