.woff causes 8-12 second load time on optimized site - performance

I have optimized my site as far as I can. When running a page speed test in waterfall mode, I may score 90/100, but the page loads slowly: 6-9 seconds at best for under 400 KB. Everything loads in 1 second or less; the culprits are fonts with 6-8 second wait times, mainly .woff files.
The problem fonts are 2, 19, and 20 KB. Files that are 60 KB load in under 1 second, so it's not a size issue??
The theme developer blames the web host's server, and the web host blames the developer. I tried disabling the .woff files, and then the .ttf fallbacks take over the slow load time.
I can't add a MIME type for .woff; the server (Bluehost) doesn't support it.
Absolutely stumped. Any ideas appreciated! http://www.joyjournist.com

I had a look at your web page and there are a few things you could do to improve performance. Firstly, it would make sense to move your JavaScript references to the bottom of the page; where they are at the moment, they block page loading and rendering. Check out the results on Google's PageSpeed Insights.
In terms of your fonts and their performance, I recommend reading this article by the Filament Group which talks about how they improved the performance of their font loading using the FontFaceObserver script.
Other than that - see if you can combine those 3 stylesheets into one. You will reduce the number of HTTP requests and hopefully speed up the load time.
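On the font side specifically, newer browsers also let you control loading behaviour in CSS alone, without a script. A minimal sketch using the font-display descriptor (the family name and file path here are placeholders, not from the site):

```css
/* Hedged sketch: font-display: swap tells the browser to show fallback
   text immediately and swap in the web font when it arrives, so a slow
   .woff no longer blocks rendering. Name and path are examples only. */
@font-face {
  font-family: "BodyFont";
  src: url("/fonts/bodyfont.woff") format("woff");
  font-display: swap;
}
```

The trade-off is a brief flash of unstyled text in exchange for a much faster first render.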

Your fonts report X-Powered-By: W3 Total Cache/0.9.4.1. Try not using that plugin and instead serving the fonts without cache interference. All three of them are tiny and should load in hundreds of milliseconds instead of the 5 seconds that W3 Total Cache needs to serve them up. Stick the fonts in your media library/downloads library instead, and link to those. I strongly suspect they'll then load in about a quarter of a second each, if that.
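On the missing MIME type mentioned in the question: if the host runs Apache and honours .htaccess overrides (an assumption; many shared hosts restrict this), you can declare the woff type yourself rather than waiting for the host. A sketch:

```apache
# Hedged sketch: declare the MIME type for .woff files and let browsers
# cache them for a week. Requires mod_mime / mod_expires and
# AllowOverride - check with the host before relying on this.
AddType application/font-woff .woff

<IfModule mod_expires.c>
  ExpiresActive On
  ExpiresByType application/font-woff "access plus 1 week"
</IfModule>
```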

Related

Random slow content download, while TTFB remains low

I'm developing a Magento2 website, everything is running well for now, except a random issue I can't figure out how to solve.
The website is using Varnish, and all pages get a Varnish cache HIT, with a very low TTFB of about 30 ms and a low content download time too (about 40 ms).
But randomly, the download timing increases to more than 2 seconds, and on the next page visit it goes back down to the normal 40 ms.
It's the same issue in all browsers, so it's not a browser issue.
Would anyone have an idea?
Many thanks
There are so many factors at play. The most important thing is whether or not you can simulate the issue and predict when the issue will appear again.
Once you're able to simulate, you need to determine what type of resource is slow:
Is it the graphical rendering of the page in the browser?
Is there a 3rd party JS resource that is slow?
Is it the actual TTFB of the main page that is slow for that resource?
Please use your browser's development tools and check the breakdown in the networking tab. With enough attempts, you'll be able to spot the issue. When you do, please add a screenshot of the breakdown in this question.
In parallel, you can keep the following command running to identify requests that took more than 2 seconds to process:
varnishlog -g request -q "Timestamp:Resp[2] > 2.0"
You can also add the output to your question.
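From the client side, you can reproduce the browser's TTFB-vs-download split with curl's -w timings and a small filter for slow samples. A sketch; the input lines are made-up examples of what curl would print, and the 2-second threshold mirrors the symptom above:

```python
# Hedged sketch: flag samples whose content-download time exceeds 2 s,
# matching the symptom described (low TTFB, occasional slow download).
# Input format assumed: "time_starttransfer time_total" per line, as
# printed by: curl -o /dev/null -s \
#   -w '%{time_starttransfer} %{time_total}\n' <url>
samples = """\
0.031 0.072
0.029 2.410
0.033 0.069
"""

slow = []
for line in samples.splitlines():
    ttfb, total = map(float, line.split())
    download = total - ttfb  # body download time, excluding TTFB
    if download > 2.0:
        slow.append((ttfb, round(download, 3)))

print(slow)  # [(0.029, 2.381)]
```

Run in a loop against the affected page, this makes the intermittent slow download easy to catch and correlate with the varnishlog output.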

Slow Web site speeds

I am currently trying to fix some speed issues on a site for a client (not built by myself), but can't figure out how to increase the speed of the site. It is built on WordPress and I have tried multiple things to get it working - to no avail. Any help would be appreciated!
There is plenty of information here.
It's the WP Loop: the 7.69-second load time for index.php.
It's the WP plug-ins running in the WP Loop that are causing the long load time.
If the site is static, then just save the pages and serve them as static files.
It appears the waterfall is cut off. Somewhere below it there is very likely a CSS or font file being loaded that is delaying the Start Render.
Correction: this page is a basket case and should be redesigned from the ground up.

CKEditor takes around 7 seconds to load in Firefox - how can I reduce this time?

I am using CKEditor. It works fine in Chrome, but it takes too long to load in Firefox, around 7 seconds. I want to reduce that time so it opens quickly.
Sounds like you need to profile your page, meaning you need to collect data on what specifically is taking so long to load and then try to improve those areas. Here is a nice Firefox profiling link. Here's a small checklist off the top of my dome that might help. Whenever you make changes, measure the impact they have on loading times using some kind of tool.
Check that FireFox (FF) is loading the minified version of CKEditor (cke)
Check that the cache is being used - this helps with load times greatly, except for the first request
Benchmark cke performance for FF and Chrome using a barebones test to rule out your customizations and configuration.
This can help tons if you have a lot of custom code around cke like I do. I take a copy of my site, strip away my code, config, plugins, DOM elements etc until I find what is making the page load slowly. Or the other way around, start with nothing but a page with an editor and profile that against your site.
Disable all features and plugins you don't need
Make sure you use the latest version of cke and FF
Make sure you don't get any errors or bad looking warnings in the console
Check with a few different versions of FF and cke
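For the "disable what you don't need" step, CKEditor 4's configuration lets you strip plugins and toolbar entries at load time. A sketch of such a configuration (browser-only, so shown as a fragment; "editor1" and the plugin names are examples - check which plugins your build actually includes):

```javascript
// Hedged sketch of CKEditor 4 configuration. Fewer plugins and a
// smaller toolbar mean less code to parse and initialize on load.
CKEDITOR.replace('editor1', {
  removePlugins: 'scayt,wsc,specialchar',  // drop features you don't use
  toolbar: [
    { name: 'basicstyles', items: ['Bold', 'Italic', 'Underline'] },
    { name: 'links', items: ['Link', 'Unlink'] }
  ]
});
```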

How can I figure out why my Wordpress pages load so slowly?

Yet my site pages load very slowly. Usually there's a 2-3 second lag before the page renders, and I cannot figure out why.
My site is powered by Wordpress v3.4.2.
I'm on a dedicated virtual server with plenty of resources and
bandwidth.
There are no huge images loading.
My CSS files load before JS scripts.
I've spent a lot of time trying to optimize the site within the constraints of the platform (Wordpress + plugins, etc). I don't expect my site to be SUPER fast, but I need it to not be SO slow.
I'm using Chrome's developer tools to audit my site, but the suggestions do not appear to explain the long load time (unused CSS rules, etc.). When I look at the timeline, I see roughly a 2.7-second load time initially, but I can't figure out why. Can anyone help me get to the bottom of this?
My site is located here. The homepage has some extra scripts, so it may be more helpful to look at this page.
I found this superb guide which really helped me fight through the mire of optimising Apache for use with WordPress:
http://thethemefoundry.com/blog/optimize-apache-wordpress/
You said you have a virtual server so chances are it's currently set up to load EVERY module - you'll find a great speed boost here if you eliminate unnecessary modules. Keep a backup of your config file in case you screw it up.
Also - use the TOP command through SSH to see how much memory PHP is using. Probably a lot currently. This will all be improved through eliminating modules as per above link. You don't mention how much memory you have on your VPS but there's a good chance your performance issues are coming from memory thrashing which will be mitigated significantly by reducing how much memory each PHP instance consumes using the link above.
Also, it matters to find out where your performance issues are actually coming from. A great little plugin called WP Tuner helps me locate performance bottlenecks. The original plugin is incompatible, but someone else has written an upgrade:
http://www.wwvalue.com/tuts/tut-wp/wordpress-profiler-tuner-revised.html
That will help you identify which specific parts of the page are taking the longest to load so you will immediately find your performance bottleneck.
In addition, a cool plugin called Debug Queries is useful for tracking down performance issues although the wordpress profiler above actually does track queries too.
Finally – I can’t recommend highly enough this WordPress.org discussion on performance, and specifically on W3 Total Cache vs Super Cache (both are excellent).
It’s a fantastic read for anyone looking for split-second response times:
http://wordpress.org/support/topic/wp-super-cache-vs-w3-total-cache
I use W3 Total Cache on one of my sites and WP Super Cache on another. Both are great; I used both so I could learn about both. I would say use WP Super Cache plus all the other tools the author at the link above recommends if you're looking for extreme performance, but if you're looking for immediate gains, W3 Total Cache is more comprehensive in its initial setup.
Hope that helps.
use caching plugin,
put JS files at the bottom,
try different webhost (DB server may be slow sometimes)
minify css and JS,
make fewer HTTP requests
make sure external services (like FB and others) are not slowing things down (remove them and see if it helps)
run Yslow or similar test
try to use typekit or google font instead of cufon
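The last point, swapping cufon for a hosted font service, is usually a one-line change. A sketch using Google Fonts ("Open Sans" is just an example family, not taken from the site in question):

```html
<!-- Hedged sketch: load a hosted web font instead of cufon's
     JavaScript-based text rendering. -->
<link rel="stylesheet"
      href="https://fonts.googleapis.com/css?family=Open+Sans">
<style>
  body { font-family: "Open Sans", sans-serif; }
</style>
```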
Have you tried http://wordpress.org/extend/plugins/wp-super-cache/ or a similar caching plugin?

How do you measure page load speed? [closed]

As it currently stands, this question is not a good fit for our Q&A format. We expect answers to be supported by facts, references, or expertise, but this question will likely solicit debate, arguments, polling, or extended discussion. If you feel that this question can be improved and possibly reopened, visit the help center for guidance.
Closed 10 years ago.
I am trying to quantify "site slowness". In the olden days you just made sure that your HTML was lightweight, images optimized and servers not overloaded. In high end sites built on top of modern content management systems there are a lot more variables: third party advertising, trackers and various other callouts, the performance of CDN (interestingly enough sometimes content delivery networks make things worse), javascript execution, css overload, as well as all kinds of server side issues like long queries.
The obvious answer is for every developer to clear the cache and continuously look at the "net" section of the Firebug plugin. What other ways to measure "site dragging ass" have you used?
Yslow is a tool (browser extension) that should help you.
YSlow analyzes web pages and why they're slow based on Yahoo!'s rules for high performance web sites.
Firebug, the must-have Firefox extension for web developers, can measure the loading time of different elements on your webpage. At least you can rule out CSS, JavaScript, and other elements taking too much time to load.
If you do need to shrink JavaScript and CSS loading times, there are various JavaScript and CSS compressors out there on the web that simply strip unnecessary text from them, like newline characters and comments. Of course, keep an ordinary version on the side for development's sake.
If you use PNGs, I recently came across a PNG optimizer that can shrink PNG sizes called OptiPNG.
"Page Load time" is really not easy to define in general.
It depends on the browser you use, because different browsers may do more requests in parallel, because JavaScript has different speeds in different browsers, and because rendering time differs.
Therefore you can only really measure your true page load time using the browser you are interested in.
The end of the page load can also be difficult to define, because there might be an Ajax request after everything is visible on the page. Does that count toward the page load or not?
And last but not least, the real page load time might not matter that much, because the "perceived performance" is what matters. For the user, what matters is when they have enough information to proceed.
Markus
I'm not aware of any way (at least none I could tell you :]) to automatically measure your page's perceived load time.
Use AOL Pagetest for IE and YSlow for Firefox (links above) to get a "feeling" for your load time.
Get yourself a proper debugging proxy installed (I thoroughly recommend Charles)
Not only will you be able to see a full breakdown of response times / sizes, you can save the data for later analysis / comparison, as well as fiddle with the requests / responses etc.
(Edit: Charles' support for debugging SOAP requests is worth the pittance of its shareware fee - it's saved me a good half a day of hair-loss this week alone!)
I routinely use webpagetest.org, which lets you run performance tests from different locations, in different browsers (although only MSIE 7-9), with different settings (number of iterations, connection speed, first run vs. 2nd visit, excluding specific requests if you want, credentials if needed, ...).
The result is a very detailed report of page loading time, which also provides advice on how to optimize.
It really is a great (free) tool!
Last time I worked on a high-volume website, we did several things, including:
We used Yslow to get an analysis of the individual factors affecting page load: https://addons.mozilla.org/en-US/firefox/addon/5369
performance monitoring using an external, commercial tool called Gomez - http://www.gomez.com/instant-test-pro/
We stress tested using a continuous integration build, using Apache JMeter. http://jmeter.apache.org/
If you want a quick look, say a first approximation, I'd go with YSlow and see what the major factors affecting page load time in your app are.
Well, call me old fashioned but..
time curl -L http://www.example.com/path
in linux :) Other than that, I'm a big fan of YSlow as previously mentioned.
PageSpeed is an online checking tool by Google, which is very accurate and reliable:
https://developers.google.com/pagespeed/
If it's ASP.NET you can use Trace.axd.
Yahoo provides YSlow, which can be great for checking JavaScript.
YSlow as mentioned above.
And combine this with Fiddler. It is good if you want to see which page objects are taking the most bandwidth, which are being compressed at the server, unexpected round trips, and what is being cached. It can also give you a general idea of processing time in the client web browser compared to the time taken between server and client.
Apache Benchmark. Use
ab -c <number of CPUs on server> -n 1000 url
to get a good approximation of how fast your page is.
In Safari, the Network Timeline (available under the Develop menu, which you have to specifically enable) gives useful information about loading time of individual page components, as well as showing when each component started loading.
YSlow is good, and HttpWatch for IE is great as well. However, both miss the most important metric to a user: "When is the page, above the fold, ready for use by the user?" I don't think that one has been solved yet...
There are obviously several ways to identify the response time, but the challenge has always been how to measure the rendering time that is spent in browser.
We have a controlled test phase in which we use several automated tools for testing the application. One of the outputs we generate from this test is a Fiddler trace for each transaction (a click). We can then analyse the Fiddler trace to find the time to last byte and subtract it from the overall time the page took.
Something like this
1. A = total response time as measured by an automated tool (in our case we use QTPro)
2. B = time to last byte (server + network time, from the Fiddler trace)
3. C = A - B (approximate rendering time, or the time spent in the browser)
Everything explained above can be made a standard test process, and at the end of the test we can generate a breakdown of time spent at each layer, e.g. rendering time, network time, database calls, etc.
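The A/B/C breakdown above can be sketched as a few lines of code. The sample numbers are made up; "total" stands for what the automation tool (e.g. QTPro) reports and "ttlb" for the time-to-last-byte from the Fiddler trace:

```python
# Hedged sketch of the A - B = C breakdown described above.
samples = [
    {"total": 5.2, "ttlb": 3.1},  # seconds, per transaction (made-up)
    {"total": 4.0, "ttlb": 2.6},
]

for s in samples:
    # C = A - B: the remainder approximates in-browser rendering time
    s["render"] = round(s["total"] - s["ttlb"], 2)

print([s["render"] for s in samples])  # [2.1, 1.4]
```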