PageSpeed Insights and Measure versus GTmetrix different results - performance

My agency is showing our performance as great in GTmetrix. In PageSpeed Insights and Measure, I am seeing a different picture. What does this mean?

GTmetrix tests your page on desktop without any throttling applied.
PageSpeed Insights throttles the network (to simulate 4G) and the CPU (to simulate a mid-tier mobile phone with less processing power), and it tests in a mobile viewport.
Also, on PageSpeed Insights the "Origin Summary" is real-world data from actual users' devices, so that is the more reliable and important information.
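If you want to compare the two data sources yourself, the PageSpeed Insights v5 API returns both in a single response: lighthouseResult carries the throttled lab run, while loadingExperience and originLoadingExperience carry the real-world Chrome UX Report field data. A minimal sketch (the URL is a placeholder; an API key is only needed for heavy use):

```javascript
// Query the PageSpeed Insights v5 API for both lab and field data.
// 'https://example.com' is a placeholder URL.
const endpoint = 'https://www.googleapis.com/pagespeedonline/v5/runPagespeed';
const params = new URLSearchParams({
  url: 'https://example.com',
  strategy: 'mobile', // applies the mobile CPU/network throttling described above
  category: 'performance',
});

fetch(`${endpoint}?${params}`)
  .then((res) => res.json())
  .then((data) => {
    // Lab data: one throttled Lighthouse run.
    console.log('Lab score:',
      data.lighthouseResult.categories.performance.score);
    // Field data: aggregated real-user metrics for the whole origin.
    console.log('Origin field data:', data.originLoadingExperience);
  });
```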
See these answers I have given for more info:
throttling applied
difference between mobile and desktop scores

Related

How can I measure the response time of a user interaction with my web app?

I have a mobile web app and I'd like a method of measuring the time it takes for my app to respond to a user interaction. An example would be measuring the time it takes for a button to change appearance (to show the user that it is clicked) after the initial tap of the button.
I'm not aware of any Chrome developer tools that can do this, but if you know of any tools or methods for measuring UI response times, it would be greatly appreciated!
Thanks!
You can analyze end-to-end UI latency during page transitions with the Navigation Timing API (W3C specification, more documentation, browser support matrix). These latency measurements could be recorded for a wide swath of users and analyzed in aggregate to get a decent overview of performance for all users.
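As a sketch of that idea, the snippet below reads the browser's navigation timing entry once the page has loaded and beacons a few durations back for aggregation; the '/perf' collection endpoint is an assumption:

```javascript
// Record end-to-end navigation latency and beacon it for aggregation.
// The '/perf' collection endpoint is a placeholder.
window.addEventListener('load', () => {
  // Defer one task so loadEventEnd has been filled in.
  setTimeout(() => {
    const [nav] = performance.getEntriesByType('navigation');
    if (!nav) return; // older browsers lack this entry type
    // sendBeacon survives page unload, unlike a plain XHR/fetch.
    navigator.sendBeacon('/perf', JSON.stringify({
      ttfb: nav.responseStart - nav.requestStart, // server + network latency
      domInteractive: nav.domInteractive,         // ms from navigation start
      load: nav.loadEventEnd,                     // full page load, ms
    }));
  }, 0);
});
```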
On the web, the most common choice is probably Selenium, thanks to the large number of language bindings and its array of testing tools. In fact, you should be able to build UI performance checks into a more general suite of regression tests.
You can use System.currentTimeMillis() to take a timestamp at the start and at the end of the operation.
End time minus start time = response time of the app.
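For the button-tap case in the original question, a browser-side version of that same timestamp-diff idea (my own sketch, not from the answer above) is to compare the tap's event timestamp against a requestAnimationFrame callback that fires after the style change has been painted; the '#myButton' selector is a placeholder:

```javascript
// Approximate tap-to-paint latency for a button state change.
// '#myButton' is a placeholder selector.
const button = document.querySelector('#myButton');

button.addEventListener('click', (event) => {
  button.classList.add('pressed'); // the visual change being measured

  // Two nested rAF callbacks run after the changed frame has been painted.
  requestAnimationFrame(() => {
    requestAnimationFrame(() => {
      const latency = performance.now() - event.timeStamp;
      console.log(`tap-to-paint: ${latency.toFixed(1)} ms`);
    });
  });
});
```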

Client rendering time

I know that in order to measure end-to-end response time for any application scenario, we need to compute: server time + network time + client time
While I know for sure that server and network time are impacted by load, I want to know whether client time is impacted by load too.
If client rendering time isn't impacted by load, would it be appropriate to run a test with 100 users and measure server time with a performance testing tool (like HP LoadRunner, JMeter, etc.), then measure client rendering time with a single user, and finally present end-to-end time by adding the client time to the server time?
Any views on this will be appreciated.
Regards,
What you are describing is a very old concept, termed a GUI Virtual User. LoadRunner, and other classical tools such as SilkPerformer, QALoad and Rational Performance Tester, have always had the ability to run one or two graphical virtual users, created with the functional automation test tools from the vendor in question, to address the question of the user-perceived "weight" of the GUI.
This capability went out of vogue for a while with the advent of the thin client web, but now that web clients are growing in thickness with complex client side code this question is being asked more often.
Don't worry about actual "rendering time" (the time taken to draw the screen elements), since you cannot control it anyway. It will vary from workstation to workstation depending upon what is running on the host, and most development shops don't have a reconciliation path to Microsoft, Mozilla, Opera, Google, or Apple to ask them to tune up rendering if someone finds a problem in the browser's actual rendering engine.

What service(s) can be used to monitor frontend performance?

There are already plenty of hosted services for tracking server response times but we're looking for something to track page render/load times (i.e. browser renderer).
The problem with PageSpeed, YSlow, etc. is that they run on request, whereas we want something that runs constantly and takes a reading every 15 minutes, for example.
Recent browsers have a window.performance.timing property, which contains the timestamps at which certain events occurred (such as domainLookupStart, domLoading, domInteractive, ...).
You may want to send a sample of those numbers to your servers.
See https://developer.mozilla.org/en/API/navigationTiming
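A sketch of that sampling idea, using the legacy window.performance.timing properties named above; the 1% sample rate and the '/timing' endpoint are assumptions:

```javascript
// Sample ~1% of page loads and report legacy Navigation Timing numbers.
// The sample rate and '/timing' endpoint are placeholders.
window.addEventListener('load', function () {
  if (Math.random() >= 0.01) return; // sample roughly 1% of visits

  var t = window.performance.timing;
  var sample = {
    dns: t.domainLookupEnd - t.domainLookupStart,
    domLoading: t.domLoading - t.navigationStart,
    domInteractive: t.domInteractive - t.navigationStart,
  };

  // An image beacon works even in browsers that predate fetch().
  var query = Object.keys(sample)
    .map(function (k) { return k + '=' + sample[k]; })
    .join('&');
  new Image().src = '/timing?' + query;
});
```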
Google Analytics has a page speed metric, you may want to look at it too.
Atatus monitors page load time and thereby does Real User Monitoring (RUM); it also monitors for JavaScript errors, capturing the user actions that led to each error.
It also gives various views of your performance across geographical locations and browsers.
https://www.atatus.com/
https://www.atatus.com/blog/announcing-real-user-monitoring/
Disclaimer: I’m a web developer at Atatus.
We use GTmetrix to get daily Page Speed and YSlow scores. If you need a higher frequency, you can automate YSlow or Page Speed using Selenium or ShowSlow.
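A minimal sketch of that automation, using the selenium-webdriver Node bindings to load a page on a schedule and pull timing numbers out of the browser; the URL and the 15-minute interval are assumptions to match the question:

```javascript
// Load a page in a real browser every 15 minutes and log its load time.
// Requires: npm install selenium-webdriver, plus chromedriver on PATH.
const { Builder } = require('selenium-webdriver');

async function measure(url) {
  const driver = await new Builder().forBrowser('chrome').build();
  try {
    await driver.get(url);
    // Read Navigation Timing from inside the page.
    const loadMs = await driver.executeScript(
      'var t = window.performance.timing;' +
      'return t.loadEventEnd - t.navigationStart;'
    );
    console.log(new Date().toISOString() + ' ' + url + ': ' + loadMs + ' ms');
  } finally {
    await driver.quit();
  }
}

// 'https://example.com' is a placeholder.
setInterval(() => measure('https://example.com'), 15 * 60 * 1000);
```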
Try a free tool like dynaTrace's AJAX Edition 3. From the website: "Speed your page load times, optimize rendering, tune DOM execution, and compare to competition. Even integrate with Selenium, Watir or QTP to begin automating your performance tests. It's free, it's easy and it's now for both Firefox and IE."
I do work for dynaTrace, in full disclosure. But this simple, free tool should be very useful to you.
Best
GP

How do you achieve acceptable performance metrics for your web application?

What analysis do you currently perform to achieve performance metrics that are acceptable? Metrics such as page weight, response time, etc. What are the acceptable metrics that are currently recommended?
This is performance related, so 'it depends' :)
Do you have existing metrics (an existing application) to compare against? Do you have users that are complaining - can you figure out why?
The other major factor will depend on what network sits between the application and the users. On a LAN, page weight probably doesn't matter. On a very slow WAN, page size (especially with respect to TCP windowing) is going to dwarf the impact of server time.
As far as analysis:
1. Server response time, measured by a load test tool on the same network as the app
2. Client response time, as measured by a browser / client on either a real or simulated network
The workload for 1) follows the 80/20 rule in terms of transaction mix. For 2), I look at some subset of pages for a browser app and run empty-cache and full-cache cases to simulate new vs. returning users.
Use webpagetest.org to get waterfall charts.
Do RUM (Real User Monitoring) using the Google Analytics snippet with Site Speed JavaScript enabled.
These are the bare minimum to do.
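For reference, this is roughly what that setup looked like with the classic asynchronous ga.js snippet; _trackPageLoadTime was the opt-in that enabled Site Speed timing collection (it later became the default at a 1% sample). The 'UA-XXXXX-Y' property ID is a placeholder:

```javascript
// Classic asynchronous ga.js snippet with Site Speed collection enabled.
// 'UA-XXXXX-Y' is a placeholder web property ID.
var _gaq = _gaq || [];
_gaq.push(['_setAccount', 'UA-XXXXX-Y']);
_gaq.push(['_trackPageview']);
_gaq.push(['_trackPageLoadTime']); // opt in to Site Speed timing samples

(function () {
  var ga = document.createElement('script');
  ga.type = 'text/javascript';
  ga.async = true;
  ga.src = ('https:' === document.location.protocol
    ? 'https://ssl' : 'http://www') + '.google-analytics.com/ga.js';
  var s = document.getElementsByTagName('script')[0];
  s.parentNode.insertBefore(ga, s);
})();
```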

Best traffic / performance / usage monitoring module?

Are there any open source (or I guess commercial) packages that you can plug into your site for monitoring purposes? I'd like something that we can hook up to our ASP.NET site and use to provide reporting on things like:
performance over time
current load
page traffic
SQL performance
CPU time monitoring
Ideally in C# :)
With some sexy graphs.
Edit: I'd also be happy with a package that I can feed statistics and views of data to, and it would analyse trends, spot abnormal behaviour (e.g. "no one has logged in for the last hour. is this Ok?", "high traffic levels detected", "low number of API calls detected") and generally be very useful indeed. Does such a thing exist?
At my last office we had a big screen which showed us loads and loads of performance counters over a couple of time ranges, and we could spot weird stuff happening, but the data was not stored and there was no way to report on it. It's a package for doing this that I'm after.
It should be noted that Google Analytics is not an accurate representation of website usage. This is because the web beacon (web bug) used on the page does not always load, for these reasons:
Google Analytics servers are called by millions of pages every second and cannot always process the requests in a timely fashion.
Users often browse away from a page before it has fully loaded, so there is not enough time for Google's web beacon to load and record a hit.
Google Analytics requires JavaScript, which can be disabled.
Quite a few people (though not a substantial proportion) block google-analytics.com in their browsers, myself included.
The physical log files are the best 'real' representation of site usage, as they record every request. Alternatively, there are far better 'professional' packages (Omniture is my favourite) with much better response times, alternative methods for recording actions, and more functionality.
If you're after things like server data, RRDTool might be what you're looking for.
It's not really a web-server stats program, though, and I have no idea how it would scale.
Edit:
I've also just found Splunk Swarm, if you're interested in something that looks "cool".
Google Analytics is free (up to 50,000 hits per month, I think) and is easy to set up: just insert a little JavaScript snippet into your header or footer, and it gives you great detailed reports with some very nice graphs.
Google Analytics is quick to set up and provides more sexy graphs than you can shake a stick at.
http://www.google.com/analytics/
Not invented here, but it's on my todo list to set up.
http://awstats.sourceforge.net/
@Ian
Looks like they've raised the limit. Not very surprising; it is Google, after all ;)
This free version is limited to 5 million pageviews a month - however, users with an active Google AdWords account are given unlimited pageview tracking.
http://www.google.com/support/googleanalytics/bin/answer.py?hl=en&answer=55543
http://www.serverdensity.com/
One option is to use external monitoring tools, which will monitor the web performance from outside the firewall by simulating end user activities.
Catchpoint Systems has an interesting approach that requires very little coding and gives you performance stats both from outside the datacenter and from inside ASP.NET (processing time, etc.).
http://www.catchpoint.com/products.html
