My manager wants to know our website's speed and load times in different locations around the world, as measured by standard speed-testing websites/tools.
What are the standard tools / procedures for this?
WebPageTest is a pretty awesome tool, way more detailed than Gomez: waterfall charts, repeat loads, and even videos of how a page loads. It has a few locations and connection speeds you can choose from. It seems to be down right now, but will probably be back up soon.
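If you want to script it rather than click through the UI, WebPageTest also exposes a REST API. A minimal sketch (the API key is a placeholder, and the location/connectivity string is just one example; valid values depend on the instance you use):

```typescript
// Sketch: start a WebPageTest run via the public REST API.
// WPT_API_KEY is a placeholder; valid location strings depend on the instance.
const WPT_API_KEY = "YOUR_API_KEY";

async function startTest(url: string): Promise<string> {
  const params = new URLSearchParams({
    url,
    k: WPT_API_KEY,
    f: "json",                       // ask for a JSON response instead of XML
    location: "Dulles:Chrome.Cable", // test node + connection profile (example)
  });
  const res = await fetch(`https://www.webpagetest.org/runtest.php?${params}`);
  const body = await res.json();
  // Poll jsonResult.php?test=<testId> afterwards to fetch the results.
  return body.data.testId;
}

startTest("https://example.com").then((id) => console.log("test id:", id));
```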
Gomez is a tool that I've used at a couple of organizations. It will monitor your site at timed intervals from a list of nodes that you select, for either an individual page or an entire transaction (think of a click-path through your website).
The reporting capabilities are really good, and you can drill down into individual page requests to identify performance bottlenecks or other issues. There are also alerting options, and the list of nodes and the update frequency are excellent.
I've never been involved in the actual purchasing or account management however, so I'm not sure how expensive it is.
We have started to use Lighthouse to track the improvements we make to our sites. While this seems to work quite well for desktop sites (i.e., we see the values improve over time as we make changes), for mobile sites the values remain consistently low. We do repeat the tests and use the best of three, but the scores stay low.
Below are the results for the New York Times mobile site, which appears to perform badly compared with its desktop site; the other two results are for our own sites.
When actually browsing our sites (and the NYT, of course), this apparently bad performance cannot be felt at all.
The test procedure:
- run the same test three times for each site
- mobile
- no PWA
- incognito mode
Now, while we were initially enthusiastic about Lighthouse's ability to evaluate a site with aggregated figures that are easy for management to digest, we have the impression that they are not actually useful: they don't correspond to users' reality and don't change even when we make changes.
Also, since this is a Single Page Application, the first page load may take somewhat longer, but any further navigation is quasi-instantaneous. We could not find a Lighthouse feature that takes this into account.
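For reference, here is a minimal sketch of how the procedure above could be automated with Lighthouse's Node API (assuming the lighthouse and chrome-launcher npm packages; the URL is a placeholder, and mobile emulation is Lighthouse's default):

```typescript
// Sketch: run Lighthouse three times and keep the best performance score,
// mirroring the test procedure above. Assumes the lighthouse and
// chrome-launcher npm packages; the target URL is a placeholder.
import lighthouse from "lighthouse";
import * as chromeLauncher from "chrome-launcher";

async function bestOfThree(url: string): Promise<number> {
  const chrome = await chromeLauncher.launch({ chromeFlags: ["--headless"] });
  let best = 0;
  try {
    for (let i = 0; i < 3; i++) {
      // Mobile emulation is the default; scores are reported in 0..1.
      const result = await lighthouse(url, {
        port: chrome.port,
        onlyCategories: ["performance"],
      });
      best = Math.max(best, result?.lhr.categories.performance.score ?? 0);
    }
  } finally {
    await chrome.kill();
  }
  return best * 100; // the familiar 0..100 scale
}

bestOfThree("https://example.com").then((s) => console.log("best of three:", s));
```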
No, you can't really rely on Lighthouse. As you've observed, well-known fast websites can perform badly in its tests. While there are reasons for this, they won't help you measure the actual loading speed. Caching is an important factor, and lazy-loaded content is sometimes mistaken for content that hasn't loaded yet: even if your website is fully loaded, Lighthouse might detect the "missing" pieces and deem it not yet loaded.
Pingdom is great for that, and it provides options to test from different regions, which I believe is more realistic than a one-server-fits-all approach.
The legacy version of GTmetrix is also great because it points you directly to the improvements you can make, but it tests only from Canada (unless you buy the PRO version). It does take caching into account.
I have been using PageSpeed Insights (PSI) for mobile on our sites, and it has worked well. At least the mobile field score was always better than the lab data, and my motivation was that its reports were consistent on some external sites like https://covid19.ca.gov/.
As for the tool itself, it works well for the initial load but falls short for single-page apps, since CLS is a continuous evaluation: as the user scrolls, CLS changes, and that is not simulated by the tool. That is where field data differs.
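If it helps, the lab-versus-field gap can be seen directly in the PageSpeed Insights v5 API response, which returns both in one call. A rough sketch (no API key is needed for occasional use; the field block may be absent for low-traffic sites):

```typescript
// Sketch: fetch both lab and field data for a URL from the PSI v5 API.
async function comparePsi(url: string): Promise<void> {
  const endpoint = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed";
  const params = new URLSearchParams({ url, strategy: "mobile" });
  const body = await (await fetch(`${endpoint}?${params}`)).json();

  // Lab data: one simulated initial load (a Lighthouse run).
  const labScore = body.lighthouseResult?.categories?.performance?.score;
  // Field data (CrUX): real users, so it includes CLS accumulated on scroll.
  // The API reports the CLS percentile scaled by 100 (e.g. 5 means 0.05).
  const fieldCls = body.loadingExperience?.metrics
    ?.CUMULATIVE_LAYOUT_SHIFT_SCORE?.percentile;

  console.log("lab performance score (0..1):", labScore);
  console.log("field CLS, 75th percentile:", fieldCls);
}

comparePsi("https://covid19.ca.gov/");
```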
I have to access various Atlassian tools (Confluence, Jira, etc.) on three different sites. I feel like one is significantly slower than the others. I want to record the actual page load speeds I encounter in my day-to-day work to verify this.
Is there a place where my browser logs the page load speed of every page I visit (for example, with my history)? If not, is there an extension that will passively log this info as I go about my work?
(I have tried Firefox's built-in logging, and it is too verbose. Most of the performance-testing extensions seem to be focused on actively testing a site or page, not gathering info passively in the background.)
Solution for Mac required, Firefox preferred, Chrome will do.
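For context, the number I want is what the standard Navigation Timing API already exposes, so presumably a tiny extension content script (or even a console snippet) could log it passively on every page. A sketch of that idea:

```typescript
// Sketch: passively log how long the current page took to load, using the
// standard Navigation Timing Level 2 API. Dropped into an extension content
// script, this would run on every page visited.
window.addEventListener("load", () => {
  // There is one "navigation" entry per page load.
  const [nav] = performance.getEntriesByType(
    "navigation"
  ) as PerformanceNavigationTiming[];
  if (nav) {
    // duration = navigation start to loadEventEnd, in milliseconds.
    console.log(`${location.href} loaded in ${Math.round(nav.duration)} ms`);
  }
});
```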
Our Sitecore 7.2 website runs quite fast; the problem starts when we want to change a lot of items on the backend. Even just opening an item takes 5-6 seconds, and saving it takes the same.
I have event handlers for when an item is created or renamed, but nothing more, so I can't fathom why clicking on an item and seeing its details takes so much time. Does debug mode work for the backend too? How can I start figuring out where the bottleneck is?
To troubleshoot performance issues in Sitecore (including the content editor) your best bet is to start with the CMS Performance Tuning Guide.
There's a companion to it, the CMS Diagnostics Guide.
Depending on what you find, you may need to read up on indexing, caching, browser configuration - again, lots of the information is on SDN.
For the content editor, for instance, there are some application settings that can make a huge difference to loading items - e.g. whether collapsed sections are prefetched, whether fields from standard templates are shown, and which warnings you show in the gutter.
The Sitecore Log Analyzer will almost certainly be very useful to you, if you add some performance counters to the log (though these can themselves impact performance).
You can also monitor caches at admin/cache.aspx - if your cache deltas go haywire or cache sizes hit their maximums, you'll take a performance hit.
But I'd start by simply monitoring your server resources while you perform one of your troublesome item updates - that should at least let you know if your bottleneck is memory, CPU, SQL connectivity etc.
As time goes by, not only the Sitecore framework but every other technology, and the hardware underneath, evolves.
Advice or a solution that was current a while ago might not be applicable today.
It is far more important to be capable of finding the reasons why a certain operation is slow in your exact solution.
No random changes: every change has to be based on facts. Wall-clock response times are to be investigated with performance profiles:
Introduction to wall clock investigation
Wall clock investigations in ASP.NET
Collecting PerfView profiles in Sitecore
Analyzing Sitecore PerfView profile
Here is a blog dedicated to Sitecore performance investigations, with real-life case studies.
Right now I'm paying 5 dollars a month to godaddy.com for hosting. Although no users are registered yet (the site is closed in maintenance mode while I'm testing and building it), it's slower than, e.g., Facebook. Does anyone have experience with BuddyPress? What happens if my site blows up and draws a lot of users very fast? I guess I can get more expensive, better-quality hosting, but is there a limit for BuddyPress-based sites, especially when I'm using quite a few plugins?
BuddyPress scales quite well, so the code itself won't be a problem, even with tens of thousands of users. Your problems will more likely be imposed by your host (limits on database transactions or table sizes) or by specific themes taking a long time to render.
Firebug can be a great tool if you want to identify which component is making a site slow. Instructions on using Firebug
I don't expect a straightforward silver bullet answer to this, but what are the best practices for ensuring good performance for SharePoint 2007 sites?
We have a few sites on our intranet, and they are generally thought to run slowly. There's plenty of memory and processor power in the servers, but the pages just don't 'snap' like you'd expect from a website running on powerful servers.
We've done what we can to tweak setup, but is there anything we could be missing?
There is a known issue where, once an IIS application pool has unloaded the SharePoint resources or recycled itself, the spin-up on the next request is very slow.
Details about why that happens and how to fix it can be found here: SharePoint 2007 Quirks - Solving painfully slow spin-up times
Andrew Connell's latest book (Professional SharePoint 2007 Web Content Management Development) has an entire chapter dedicated to improving the performance of SharePoint sites.
Key topics it covers are Caching, Limiting page load (particularly how to remove CORE.js if it's not needed), working with Disposable objects and how to work with SharePoint querying.
Two really good tricks I've got are to use the CSS Friendly Control Adapters to generate smaller HTML for the common components (menus, etc.) and to set up a server "wake up", so when IIS sleeps the app pool due to inactivity you can reawaken it before someone hits your site (a sketch follows below).
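Here is a minimal sketch of that wake-up trick: a scheduled script that requests each site often enough that IIS never idles the app pool out. The URLs are placeholders, and the 20-minute idle timeout mentioned in the comments is just the IIS default; check your own app-pool settings:

```typescript
// Sketch: keep IIS app pools warm by pinging each site on a schedule.
// The URLs are placeholders; adjust the interval to sit under your
// app pool's idle timeout (20 minutes by default).
const sites = ["https://intranet.example.com/", "https://portal.example.com/"];

async function warmUp(): Promise<void> {
  for (const url of sites) {
    const started = Date.now();
    try {
      const res = await fetch(url);
      console.log(`${url} -> HTTP ${res.status} in ${Date.now() - started} ms`);
    } catch (err) {
      console.error(`${url} failed:`, err);
    }
  }
}

warmUp();                            // once at startup
setInterval(warmUp, 10 * 60 * 1000); // then every 10 minutes
```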
Microsoft has released a white paper on this very issue.
How Microsoft IT Increases Availability and Decreases Rendering Time of SharePoint Sites (technical white paper, published September 2008)
Download it from here.
SharePoint has a lot of limitations that contribute to poor performance; we may call them performance bottlenecks. SharePoint performance problems occur primarily for the following reasons:
BLOBs overwhelm SQL Server
Too many database trips for lists
You can dramatically improve SharePoint performance if you use a few intelligent techniques:
Externalize Documents (BLOBs)
Cache Lists and BLOBs
Microsoft Office SharePoint Server (MOSS) is an extremely popular product that improves organizational effectiveness through content management and enterprise search, shared business processes, and information sharing across boundaries for better business insight. StorageEdge is a product that enhances SharePoint performance in exactly these ways; by using it, SharePoint's performance can easily be improved.
Just a few ideas...
Is displaying your pages as slow from the server itself as from a client? If it's slower from a client, do check your network.
Are your pages very "heavy" (i.e., many elements, web parts, and so on)? Then maybe the slowness is normal.
Have you noticed them loading more slowly since you added one specific web part? Maybe there's an issue with that web part (for example, it's accessing a document library that has many - thousands of - documents). If that's the case, try deactivating that specific web part and see if performance improves.
I've noticed that SharePoint loves to add a ton of JavaScript. In a browser with a slow JavaScript engine (say, Internet Explorer), it sometimes doesn't "feel" fast.
Also, if you are running custom code on it: make sure to dispose of your SPWeb objects after use; that can speed things up a lot!
Are you running on virtual or physical servers? We found SharePoint to be significantly faster on physical servers. Also, check the disk performance - if you are running the servers from a SAN, poor disk performance might be a sign that your SAN is over-utilised.
To investigate SharePoint performance issues, I would try these things first, in this order:
Run the SQL profiler for the non-performing pages. The SharePoint API excels at hiding what's going on behind the scenes with respect to database roundtrips. There are single API calls that, without the developer's knowledge, generate many roundtrips, which hurts performance.
Profile the w3wp.exe process serving your SharePoint site. That will tell you relative API usage. Focus on ticks, not time, and do a top-down inclusive-time analysis to see which calls are taking up most of the time. See here for instructions.
Run Fiddler or Microsoft NetMon to spot potential excessive client roundtrips (i.e. between browser and web front end server) and redirections (301's).
The three major components of a SharePoint setup are the SharePoint server (the one that runs the WSS/SPS services), the SQL Server database, and IIS.
You said you have decent power for your SharePoint servers, and I assume IIS is on a good machine too.
Usually it's the SQL Server setup hosting the SharePoint-related databases that slows down page loads. Take a look at all your SQL Server-related performance counters; you might want to tune those databases too (which includes the OS, stored procedures, network, etc.).
Hope this adds to your checklist of things you want to take a look at.