I'm using Google's Lighthouse to check my PWA's performance. However, I always get extreme numbers for First Meaningful Paint. The last test gave me 14445.7ms. But when I use the website myself, it never takes more than 2 seconds to load. Am I doing something wrong here?
Lighthouse throttles your browser connection to test how your application would run on a slower 3G connection (a simulation of someone on a slow mobile connection). So if you are just loading your web application in your desktop browser (or even your mobile browser, because your connection is most likely superior to the simulated one), it will be significantly faster than in the Lighthouse simulation.
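You can verify this yourself from Node. Here's a minimal sketch using Lighthouse's programmatic API, assuming the lighthouse and chrome-launcher packages are installed and a Lighthouse version that still reports first-meaningful-paint; throttlingMethod: 'provided' runs the audit over your real connection instead of the simulated slow one:

    // compare.js - audit a page without simulated throttling
    const chromeLauncher = require('chrome-launcher');
    const lighthouse = require('lighthouse');

    (async () => {
      const chrome = await chromeLauncher.launch({ chromeFlags: ['--headless'] });
      // 'provided' = no throttling (your real connection); the default simulates a slow mobile link
      const { lhr } = await lighthouse('https://example.com', {
        port: chrome.port,
        throttlingMethod: 'provided',
        onlyCategories: ['performance'],
      });
      console.log('FMP:', lhr.audits['first-meaningful-paint'].displayValue);
      await chrome.kill();
    })();

Run it once as shown and once with the default throttlingMethod of 'simulate', and you should see something close to your 2-second experience versus the 14-second report.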
As far as first paint goes, the first place I would look is the PRPL pattern (https://developers.google.com/web/fundamentals/performance/prpl-pattern/), along with an in-depth read of the Lighthouse report itself, because it tells you what could be better optimized in your app when your score is not ideal.
Related
I have built a web application, in which I display a map and I use both WMS and WFS requests in order to show network (lines) and points of interest on the map. The application has several filters in order to send queries to database (such as date filters etc.).
The app runs in a remote server. The speed when accessing the app from my browser (firefox or chrome) is satisfying. Everything runs quite smoothly.
My issue is that I get complaints about speed and performance. So my question is: what does the performance of a web app depend on, and how is it possible that I have a great experience while others don't?
One hypothesis is that the other clients' computers are too underpowered (which is not the case).
Another one is that the server doesn't have enough resources (which is also not the case).
What are other parameters that affect the performance of a web app?
Connection speed is a very important factor, as is response time (which mostly depends on the client-server distance).
I would like to load test pages AND render JavaScript. It seems there are 3 categories of applications that might solve this problem, and all three miss the mark.
1) JMeter, ApacheBench, Tsung, The Grinder, Iago
Why they won't work: they don't render JavaScript. These are handy tools, but they won't work for this purpose.
2) Watir, Selenium
These tools are excellent; they use real browsers and render JavaScript/AJAX. But alas, they are designed mostly for functional testing, not performance testing. You could build your own app around them to run a load test, but it would be a massive pain in the ass to collect and aggregate all of the performance metrics.
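For what it's worth, the raw mechanics of the roll-your-own route are simple enough; the pain is scaling and aggregating it. A sketch using the selenium-webdriver Node package (example.com is a placeholder):

    // timing.js - collect load metrics from one real browser session
    const { Builder } = require('selenium-webdriver');

    (async () => {
      const driver = await new Builder().forBrowser('chrome').build();
      try {
        await driver.get('https://example.com');
        // serialize the browser's Navigation Timing data back to the test script
        const t = await driver.executeScript('return window.performance.timing.toJSON()');
        console.log('page load:', t.loadEventEnd - t.navigationStart, 'ms');
      } finally {
        await driver.quit();
      }
    })();

Now multiply that across hundreds of concurrent browser sessions, plus statistics collection, and you see the problem.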
If only there were a combination of the first two types of web performance tests, that would be great.
The third option solves the problem, but unfortunately you have to pay for it.
3) Web load testing services
Services like Keynote Load Pro, BrowserMob, and others are great, and they solve the problem of using real browsers and rendering JavaScript. The only problem is, I don't want to pay $300 every time I run a damn load test (this is an exaggeration, but not by much, depending on how many virtual users you use).
So that option won't work, unless I want to hemorrhage money.
Isn't there a test harness out there that solves this problem? It seems like a gaping hole: there is a clear need for an open source tool here that no one has filled yet. The commercial companies dominate this space (along with those who can afford to employ a full-time developer to write a Selenium performance test framework).
The good thing about JMeter, ab, Tsung, etc. is that they can give you aggregated metrics in a relatively short time, but they can't execute JavaScript. They can't measure how fast the DOM is built, and they won't load external resources such as images, scripts, and stylesheets. They are good for stress testing, which is not the case here.
When measuring web performance you need to know:
what do you really want to measure? - domLoading, domInteractive, domContentLoaded or maybe something else from Performance Timing Interface,
which statistical measure do you want to use? - p95 or the median; the average is not the best choice,
when? - do it always at the same time to make tests comparable,
from which location? - the best you can do is to spawn a dedicated server located close to your production environment,
how confident do you want to be about the results? - choose the sample size; for example, if you accept a 3% margin of error at a 95% confidence level, then collect 1067 samples (see the sketch below).
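To make those last two points concrete, a minimal sketch; the 1067 figure is just the standard sample-size formula n = z^2 * p(1-p) / e^2, maximized at p = 0.5, with z = 1.96 for 95% confidence and e = 0.03:

    // percentile helper: p in [0, 1], e.g. percentile(samples, 0.95) for p95
    function percentile(samples, p) {
      const sorted = [...samples].sort((a, b) => a - b);
      return sorted[Math.min(sorted.length - 1, Math.ceil(p * sorted.length) - 1)];
    }

    // n = z^2 * p(1-p) / e^2 with z = 1.96, p = 0.5 (worst case), e = 0.03
    const n = Math.round((1.96 ** 2 * 0.25) / (0.03 ** 2));
    console.log(n); // 1067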
For such synthetic performance measurements you can use https://github.com/msn0/sweter. It measures timeToFirstByte, domInteractive, and domComplete timings. It can be scheduled to run tests always at the same time, and it can report raw data to ElasticSearch, the console, a JSON file, or wherever you want.
Real User Measurements are a different topic; there are already good tools for that, e.g. New Relic.
Well, you haven't mentioned which metrics you really want to measure, but I think you can test both server-side and browser performance using two simple tools.
PhantomJS and HAR files: PhantomJS is a WebKit-based headless browser with a JavaScript API. You can script it to do pretty much anything (including rendering JavaScript). Use it to generate a HAR file so that you can analyze the performance bottlenecks across all the requests.
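A rough sketch of the PhantomJS side (the netsniff.js example that ships with PhantomJS produces a full HAR; this stripped-down version just logs per-resource timings):

    // timings.js - run with: phantomjs timings.js https://example.com
    var system = require('system');
    var page = require('webpage').create();
    var start = Date.now();

    page.onResourceReceived = function (res) {
      if (res.stage === 'end') {
        console.log((Date.now() - start) + 'ms  ' + res.url);
      }
    };

    page.open(system.args[1], function (status) {
      console.log('page ' + status + ' after ' + (Date.now() - start) + 'ms');
      phantom.exit();
    });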
Navigation Timing API: modern browsers expose navigation timing data, which gives you details about browser rendering timings, including all the important metrics like domLoading, domInteractive, etc.
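For example, pasted into the browser console after a page load (the raw properties are epoch-millisecond timestamps, so subtract navigationStart to get durations):

    var t = window.performance.timing;
    console.log('time to first byte:', t.responseStart - t.navigationStart, 'ms');
    console.log('domInteractive:    ', t.domInteractive - t.navigationStart, 'ms');
    console.log('domContentLoaded:  ', t.domContentLoadedEventEnd - t.navigationStart, 'ms');
    console.log('full load:         ', t.loadEventEnd - t.navigationStart, 'ms');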
There are already plenty of hosted services for tracking server response times, but we're looking for something to track page render/load times (i.e. in the browser renderer).
The problem with PageSpeed, YSlow, etc. is that they're on-demand, whereas we want something that runs constantly and takes a reading every 15 minutes, for example.
Recent browsers have a window.performance.timing property that contains the timestamps at which certain events occurred (such as domainLookupStart, domLoading, domInteractive, ...).
You may want to send a sample of those numbers to your servers.
See https://developer.mozilla.org/en/API/navigationTiming
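A minimal sketch of the "send a sample to your servers" idea, assuming a browser with navigator.sendBeacon; the /perf collection endpoint is hypothetical and something your server would have to implement:

    window.addEventListener('load', function () {
      // loadEventEnd is only populated after the load event finishes,
      // so sample on the next tick
      setTimeout(function () {
        var t = window.performance.timing;
        navigator.sendBeacon('/perf', JSON.stringify({
          ttfb: t.responseStart - t.navigationStart,
          domInteractive: t.domInteractive - t.navigationStart,
          loadTime: t.loadEventEnd - t.navigationStart
        }));
      }, 0);
    });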
Google Analytics has a page speed metric; you may want to look at it too.
Atatus monitors page load time and thereby provides Real User Monitoring (RUM); it also monitors JavaScript errors and captures the user actions that lead to each error.
It also gives you various views of how your performance breaks down across geographic locations and browsers.
https://www.atatus.com/
https://www.atatus.com/blog/announcing-real-user-monitoring/
Disclaimer: I’m a web developer at Atatus.
We use GTmetrix to get daily Page Speed and YSlow scores. If you need a higher frequency, you can automate YSlow or Page Speed using Selenium or ShowSlow.
Try a free tool like dynaTrace's AJAX Edition 3. From the website: "Speed your page load times, optimize rendering, tune DOM execution, and compare to competition. Even integrate with Selenium, Watir or QTP to begin automating your performance tests. It's free, it's easy and it's now for both Firefox and IE."
I do work for dynaTrace, in full disclosure. But this simple, free tool should be very useful to you.
Best
GP
My website is http://secretpassagesbooks.com/. It runs on the latest version of wordpress and is hosted via GoDaddy on a shared web server.
My website takes anywhere from ten seconds to one minute to load, and I don't understand why. I have tested in IE, Firefox, and Chrome, and the page speed is the same. I ran several tests at various online speed-test sites and averaged a load time of 5-6 seconds. Yet when I click a link to my URL or enter it directly, the index page takes in excess of 30 seconds (sometimes more than a minute) to load.
Here is what I have done so far to troubleshoot the issue:
I have the YSlow and Page Speed extensions installed in Firebug
Yslow test gives me a "Grade A -Overall performance score 90"
My Page Speed score is 94/100
I have the W3 Total Cache WordPress plugin installed and am using page, browser, and database object caching
I've tried minifying as much CSS and JavaScript as possible
The site is using HTTP compression
Is there anything more I can do with this design, or is it a case of my shared web server being overloaded? Thanks in advance for all your help.
YSlow and similar tools detect problems in the HTML, JavaScript, and CSS, and those parts are probably fine here. It looks like your hosting is to blame.
If those plugin results are correct (and I've no reason to doubt they are), then it's most likely a case of your virtual server simply being overloaded.
I presume you have no such issues running an identical site in a local environment; if you've not already tried this, it's worth doing to confirm.
Incidentally, a tell-tale sign of an overloaded VPS/shared-hosting setup is a first page load that is incredibly slow while subsequent loads are "normal" - a common reason being that your "dedicated" sandbox is being woken from a sleep/low-resource state. (This also seems to be the case as far as your site is concerned.) As such, it's possible (I don't know the details of this server, such as whether you have a "guaranteed" resource level for CPU, memory, etc.) that other sites on this particular server are using more than their fair share of bandwidth until your site kicks in.
Based on some tests with a tool I built (the Performance Grader at JoomlaPerformance.com), wow, is it bad...
Notice that the HTML took approximately 21.83 seconds to download (from the initial request to the last object being downloaded). Not to mention that the page is nearly 300 KB (which is fairly large for having only 7 images)...
This is where the issue is. Notice that the connection and DNS phases are fine, but the generation phase is really, REALLY slow. That's where your problem is: it's server-side. So you need to debug why it's slow. Some areas to look at are the SQL queries being executed (and whether they are slow), any slow plugins, etc. Try disabling things one at a time to see whether each makes a measurable difference.
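If you want to reproduce that phase breakdown yourself without my tool, recent browsers expose the same numbers; a quick console sketch ("generation" here is server think-time, from request sent to first response byte):

    var t = window.performance.timing;
    console.log('DNS:       ', t.domainLookupEnd - t.domainLookupStart, 'ms');
    console.log('connect:   ', t.connectEnd - t.connectStart, 'ms');
    console.log('generation:', t.responseStart - t.requestStart, 'ms');
    console.log('download:  ', t.responseEnd - t.responseStart, 'ms');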
My "hunch" is that your database is either overloaded, or your queries are very expensive. So in short, you can try another host to see if that helps (which is the solution more than you'd think)...
As most of you pointed out, the issue seemed to be with the server. I contacted GoDaddy and explained the situation. It turns out that my site was hosted on one of their legacy servers, which was most likely overloaded. They switched me over to one of their grid servers (at no cost) and now everything loads quickly. Thanks for all the responses. I spent a lot of time tweaking the design, removing plugins one by one, reducing as many HTTP requests as possible, and generally going crazy trying to figure out how to best optimize my site. After a few days and a lot of tests, I could not accept that the problem was client-side, especially since all the optimization tests I ran showed my site was fine. Good to have it settled... for now, at least.
GoDaddy's web hosting is the bottleneck for your website; you should probably go for a VPS if you have an advanced site with lots of database lookups!
Are there any open source (or, I guess, commercial) packages that you can plug into your site for monitoring purposes? I'd like something we can hook up to our ASP.NET site and use to report on things like:
performance over time
current load
page traffic
SQL performance
CPU time monitoring
Ideally in c# :)
With some sexy graphs.
Edit: I'd also be happy with a package that I can feed statistics and views of data into, and that would analyse trends, spot abnormal behaviour (e.g. "no one has logged in for the last hour - is this OK?", "high traffic levels detected", "low number of API calls detected"), and generally be very useful indeed. Does such a thing exist?
At my last office we had a big screen showing loads and loads of performance counters over a couple of time ranges, and we could spot weird things happening, but the data was not stored and there was no way to report on it. It's a package for doing this that I'm after.
It should be noted that Google Analytics is not an accurate representation of web site usage. This is because the web beacon (web bug) used on the page does not always load, for these reasons:
Google Analytics' servers are hit by millions of pages every second and cannot always process the requests in a timely fashion.
Users often browse away from a page before it has fully loaded, leaving no time for Google's web beacon to record a hit.
Google Analytics requires JavaScript, which can be disabled.
Quite a few people (though not a substantial share) block google-analytics.com in their browsers, myself included.
The physical log files are the best "real" representation of site usage, as they record every request. Alternatively, there are far better "professional" packages, of which Omniture is my favourite; they have much better response times, alternative methods for recording actions, and more functionality.
If you're after things like server data, would RRDTool fit the bill?
It's not really a web-server stats program, though; I have no idea how it would scale.
Edit:
I've also just found Splunk Swarm, if you're interested in something that looks "cool".
Google Analytics is free (up to 50,000 hits per month, I think), is easy to set up with just a little JavaScript snippet inserted into your header or footer, and has great detailed reports with some very nice graphs.
Google Analytics is quick to set up and provides more sexy graphs than you can shake a stick at.
http://www.google.com/analytics/
Not invented here, but it's on my to-do list to set up.
http://awstats.sourceforge.net/
@Ian
Looks like they've raised the limit. Not very surprising - it is Google, after all ;)
This free version is limited to 5 million pageviews a month - however, users with an active Google AdWords account are given unlimited pageview tracking.
http://www.google.com/support/googleanalytics/bin/answer.py?hl=en&answer=55543
http://www.serverdensity.com/
One option is to use external monitoring tools that monitor web performance from outside the firewall by simulating end-user activity.
Catchpoint Systems has an interesting approach that requires very little coding and gives you performance stats both from outside the datacenter and from inside ASP.NET (processing time, etc.).
http://www.catchpoint.com/products.html