How to profile a classic ASP web site?

I have a classic ASP (JScript) web site that is running slowly. Are there any profilers that can help me identify what is taking the time?
Other hints on how to optimize or debug ASP performance issues would also be helpful.

Assuming you're looking for free solutions, here are some suggestions I used in a past (very old) project of mine:
ASP Profiler component. This is a line-level performance profiler for Active Server Pages (VBScript) code. It shows how your ASP page runs, how many times each line is executed, and how many milliseconds each one takes. Especially for heavy data-driven pages, you can see exactly which lines slow down the page and optimize where necessary.
Googling around I have also found a couple of very old articles on timing/profiling ASP execution code: have a look here and here.
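Along the same lines, a hand-rolled approach is to drop timestamps around suspect blocks of the page and write the deltas out in an HTML comment. A minimal sketch for a JScript page; the mark() helper and the labels are just illustrative:

```
<%@ Language="JScript" %>
<%
// Poor man's block profiler for a classic ASP (JScript) page.
// mark() records a label and a timestamp; the deltas are emitted as an
// HTML comment at the end of the page, so the visible output is unchanged.
var marks = [];
function mark(label) {
    marks[marks.length] = { label: label, t: new Date().getTime() };
}

mark("start");
// ... includes, database calls, business logic ...
mark("after data access");
// ... HTML generation ...
mark("end");

Response.Write("<!-- timings: ");
for (var i = 1; i < marks.length; i++) {
    Response.Write(marks[i].label + "=" + (marks[i].t - marks[i - 1].t) + "ms ");
}
Response.Write("-->");
%>
```

View the page source in the browser to read the timings without disturbing the layout.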
If you have an issue with server-side code being slow, I have found it is almost always the database causing it. Check for SQL that is slow to return results; if you find any, look at adding indexes to your tables. If your app is too chatty with the database, look at reducing the number of calls it makes. To find these problems you can always use SQL Server Profiler, which comes bundled with the SQL Server 2005/2008 Developer edition.
You can also use the free SQL profiler available at xsqlsoftware.com.
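Before firing up a SQL-side profiler, you can also get a rough picture from the ASP side by routing queries through a wrapper that times and counts them. A sketch, assuming JScript pages and an open ADODB connection; timedQuery and the 200 ms threshold are made-up names/values for illustration:

```
<%
// Time and count every database call made by the page, so you can tell
// whether one bad query or sheer chattiness is the problem.
var dbCalls = 0, dbMillis = 0;

function timedQuery(conn, sql) {
    var t0 = new Date().getTime();
    var rs = conn.Execute(sql);            // conn is an open ADODB.Connection
    var elapsed = new Date().getTime() - t0;
    dbCalls++;
    dbMillis += elapsed;
    if (elapsed > 200) {                   // arbitrary threshold, tune to taste
        Response.Write("<!-- slow query (" + elapsed + "ms): " +
                       Server.HTMLEncode(sql) + " -->");
    }
    return rs;
}

// ... page logic calling timedQuery(conn, "SELECT ...") instead of conn.Execute ...

Response.Write("<!-- " + dbCalls + " db calls, " + dbMillis + "ms in the database -->");
%>
```

If the totals show dozens or hundreds of calls per page, reducing round trips will usually help more than tuning any single query.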

Related

Profiling a VBScript ASP website?

I recently started working on an E-Commerce site that is written in VBscript/Classic ASP. Coming from a PHP background... this is... less than pleasant.
The largest issue I have right now is that there is a ton of extra and unnecessary code. I am looking for a way to get a stack trace... see what functions get called on a particular page... how long the calls take... things like that.
Another thing that has to be slowing things down is the obscene and unnecessary amount of Dims at the top of all the documents. There have to be a bunch of those that are legacy and not being used. Getting rid of extra Dims should free up memory and make things faster... I hope.
I have a copy of Visual Studio 2010 on my workstation... but I have no idea how to import the site into that... or if I can even accomplish what I'm looking for with VS2010.
Any suggestions as to how I can profile ASP/VBscript is greatly appreciated.
I suggest you learn what the Dim statements do BEFORE you declare them useless and the cause of your slowdowns: http://www.htmlgoodies.com/primers/asp/article.php/3477891/ASP-Primer-Some-Basic-VBScript.htm

How to benchmark BIRT report performance?

I have a BIRT report with performance problems: it takes approximately 5 minutes to run.
At the beginning I thought the problem was the database: this report uses a quite complex SQL Server stored procedure to retrieve data. After a lot of SQL optimization this procedure now takes ~20 seconds to run (in the management console).
However, the report itself still takes too much time (several minutes). How do I identify the other bottlenecks in BIRT report generation? Is there a way to profile the entire process? I'm running it using the www viewer (running inside Tomcat 5.5), and I don't have any Java event handlers, everything is done using standard SQL and JavaScript.
I watched the webinar "Designing High Performance BIRT Reports"; it has some interesting considerations but it didn't help much...
As I write this answer the question is getting close to 2 years old, so presumably you found a way around the problem. No one has offered a profiler for the entire process, so here are some ways of identifying bottlenecks.
Start-up time - about a minute can be spent here.
Running a couple of reports one after the other, or starting a second while the first is running, can help diagnose these issues.
SQL query run time - good solutions are mentioned in the question.
Any SQL trace and performance testing will identify issues.
Building the report - this is where I notice the lion's share of the time being taken. Run a SQL trace while the report is being created. Even a relatively simple table with lots of data can take around a minute to configure and display (HTML via Apache Tomcat) after the SQL trace indicates the query is done.
Simplify the report, or create a clone with fewer graphs or tables; run it with and without pieces to see if any of them make a notable difference.
Modify the query to bring back fewer records; fewer records are easier to display.
Delivery method - PDF, Excel and HTML can each have different issues.
Try rendering the report to different formats;
if one is significantly slower, try different emitters.
For anyone else having problems with BIRT performance, here are a few more hints.
Profiling a BIRT report can be done using any Java profiler - write a simple Java test that runs your report and then profile that.
As an example I use the unit tests from the SpudSoft BIRT Excel Emitters and run JProfiler from within Eclipse.
The problem isn't with the difficulty in profiling it, it's in understanding the data produced :)
Scripts associated with DataSources can absolutely kill performance. Even a script that looks as though it should only have an impact up front can really slow things down. This is the biggest performance killer I've found (so big I rewrote quite a chunk of the Excel Emitters to make it unnecessary).
The emitter you use has an impact.
If you are trying to narrow down performance problems, always run separate Run and Render tasks so you can easily see where to concentrate your efforts (a rough in-report alternative is sketched after these hints).
Different emitter options can impact performance, particularly with the third party emitters (the SpudSoft emitters now have a few options for making large reports faster).
The difference between Fixed-Layout and Auto-Layout is significant, try both.
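If everything goes through the web viewer and splitting Run and Render tasks isn't convenient, a rough alternative is to time the data phase from inside the report using the data set script events. A minimal sketch, assuming the reportContext global-variable API is available in those events; the variable name and the use of System.out are just illustrative (the output lands in Tomcat's console/log):

```
// beforeOpen event of the data set: remember when the query starts.
reportContext.setGlobalVariable("dsStart", new Date().getTime());

// afterClose event of the same data set: every row has been fetched by now.
var elapsed = new Date().getTime() - reportContext.getGlobalVariable("dsStart");
// Goes to Tomcat's stdout; swap in your own logging if you prefer.
java.lang.System.out.println("Data set open + fetch took " + elapsed + " ms");
```

If that number is small compared with the total report time, the bottleneck is in building and rendering the report rather than in the data.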
Have you checked how much memory you are using in Tomcat? You may not be assigning enough memory. A quick test is to launch the BIRT Designer and assign it additional memory. Then, within the BIRT Designer software, run the report.

Terrible DotNetNuke performance

I'm involved with a project using DotNetNuke version 05.01.04 Community Edition. We are building our new Intranet using it, but performance is terrible.
We have five people adding pages and content to it, and every 15-30 seconds they experience a pause of 10 seconds or longer before the system continues and the next screen loads.
The server is Windows 2003, 3.8GHz with 1GB of RAM. I'm told by our server admin that the CPU and memory performance don't appear to be the bottleneck.
We currently have 350 pages in the system and we plan to add 1000. We need to resolve this performance problem so that we can enter content and go live.
I just can't see where the bottleneck is. Is there a good way to determine the bottleneck when using DotNetNuke?
Modules installed
Publish:Engage (not currently in use)
Page Blaster (doesn't appear to provide caching when users are logged in using Integrated Authentication)
SimpleGallery
XMod
Content Manager
IIS Setup
Application recycling completely disabled (Apart from a 2am recycle)
New findings: 18th March 2010
The main bottleneck was due to version 5.1.4 having a bug which caused 1300 database roundtrips on an average page, due to broken database in-memory caching. We've upgraded to 5.2.4 which has resolved this bottleneck.
Now the next biggest bottleneck is the navigation. We've used both DDR:Menu and DDN:Nav, but both have a major impact on performance.
Is there a navigation interface out there that doesn't drain performance so badly?
I think you need to start investigating this using performance profiling tools. For the DNN application itself I'd grab something like JetBrains DotTrace or Red Gate's ANTS Performance Profiler.
For the database SQL Server Profiler would be the first choice or a tool such as Red Gate's SQL Response.
Without profiling the application you're just grasping at straws.
And as Tim pointed out in his comment, install Firebug in Firefox with the YSlow add-in to see which resources are taking longest to serve to the browser.
Mitchel Sellers has some good tutorials and checklists to go through with regards to performance in DNN. Start with Explaining High Performance DotNetNuke Configuration and Management (which points to some of his earlier articles).
I have several years of DNN development and maintenance experience; when I have this kind of problem, I start with database clean-up. The next thing is to look for missing indexes and/or rebuild all the indexes periodically (a scheduled SQL job works for that), but the major performance gain usually comes from cleaning up the tables.
Other good steps are disabling tracing, setting debug mode to false, and turning off DNN features that you don't use (the scheduler is the first one to turn off).
Edit: consider keep alive as well
Hope this helps
Is your database on that server? If so, just throw in some more RAM, or get a faster disk array...
Have you considered creating that many pages directly through T-SQL? It's not hard to do and may save you a lot of time.

IIS5, 6, and 7 Speed Issues After Upgrade

I will apologize in advance as this post is born out of severe frustration.
I have a classic asp website that has been running on Windows 2000/IIS5 for years, and another ASP.NET 2.0 site that we've recently started running on the same servers. So far, everything is running well.
Last year, I tried upgrading (fresh install) to Windows 2003/IIS6. The classic ASP site was much slower, about 50% slower based on logs/stats averaged over weeks of use. I tried everything to find out what was slow. Network tweaks. Integrated. Classic IIS5 mode. In process. Out of process. Nothing ever made things better, and I soon rolled back to IIS5/2000. The very day I rolled back, performance went right back to where it was. This happened on more than one server. Eventually, I gave up and chalked it up to 2003 TCP issues of some sort.
I recently installed a Windows 2008/IIS server on a similar, but more powerful, machine in hopes that things would be better. Much to my happiness, my classic ASP app is faster under Windows 2008. Unfortunately, my ASP.NET app is 50-75% slower for no apparent reason. All of its content loads. It's on the same network as the 2000 machine. The site was copied directly from the other machine, and it's a precompiled web app from Visual Studio 2005.
While the page does hit the database and another server for initial data, it caches that for quite a while. It also uses the same DB servers as the classic site, which is fast, so I know it's not necessarily a connection issue.
I've tried the default app pool and the classic .NET pool; it made no difference. I've upped/checked the max threads and max per CPU in all the usual locations, tried with and without a web garden; nothing seems to matter. I've double-checked that compilation debug=false is still set in the web.config.
For a quick benchmark, I used ab.exe (Apache Bench) to send 10 requests, 1 at a time. Even if I use IE or Firefox to hit the site, it's clearly slower than under 2000, even according to Firebug.
At this point, I'm frustrated and at a complete loss as to where to start. Has anyone been through this sort of mess before?
Speed depends on many factors. You do need to measure performance just on the server to understand whether this is a server issue. Enable tracing for your web site in web.config and see which part/function is slowing it down. You can add your own tracing after each operation to see which block of code is slowest. I'm sure you will find things you can improve/optimize once you know which part of the page is the slowest.
In my case, the answer turned out to be simple once I fired up Wireshark. There was one external resource request that could not be resolved, since the test machine had no direct access to the internet like the live machine did.
It's always the little things.

SharePoint 2007 performance issues

I don't expect a straightforward silver bullet answer to this, but what are the best practices for ensuring good performance for SharePoint 2007 sites?
We've a few sites for our intranet, and they are generally thought to run slowly. There's plenty of memory and processor power in the servers, but the pages just don't 'snap' like you'd expect from a web site running on powerful servers.
We've done what we can to tweak setup, but is there anything we could be missing?
There is a known issue with initial requests once an IIS application pool has unloaded the SharePoint resources or recycled itself where the spin-up on a new request is very slow.
Details about why that happens and how to fix it can be found here: SharePoint 2007 Quirks - Solving painfully slow spin-up times
Andrew Connell's latest book (Professional SharePoint 2007 Web Content Management Development) has an entire chapter dedicated to improving the performance of SharePoint sites.
Key topics it covers are Caching, Limiting page load (particularly how to remove CORE.js if it's not needed), working with Disposable objects and how to work with SharePoint querying.
Two really good tricks I've got are to use the CSS Friendly Control Adapters to generate smaller HTML for the common components (menus, etc.) and to set up a server "wake up", so when IIS sleeps the app pool due to inactivity you can reawaken it before someone hits your site.
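The "wake up" can be as simple as a scheduled script that requests a page every few minutes. A sketch as a Windows Script Host JScript file, run with cscript from Task Scheduler; the URL is a placeholder for a lightweight page on your own site:

```
// wakeup.js - request one page so IIS spins the app pool back up
// before a real user has to pay the start-up cost.
var url = "http://intranet.example.com/Pages/Default.aspx";   // placeholder URL
var http = new ActiveXObject("MSXML2.ServerXMLHTTP.6.0");
http.open("GET", url, false);   // synchronous is fine for a scheduled task
http.send();
WScript.Echo("Warm-up request returned HTTP " + http.status);
```

Schedule it to run just inside the app pool's idle timeout so the pool never goes cold.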
Microsoft has released a white paper on this very issue.
How Microsoft IT Increases Availability and Decreases Rendering Time of SharePoint Sites - Technical White Paper, published September 2008
Download it from here.
SharePoint has a lot of limitations that contribute to performance problems; we may call them performance bottlenecks. SharePoint performance problems occur primarily for the following reasons:
BLOBs overwhelm SQL Server
Too many database trips for lists
You can dramatically improve SharePoint performance if you use a few intelligent techniques:
Externalize Documents (BLOBs)
Cache Lists and BLOBs
Microsoft Office SharePoint Server (MOSS) is an extremely popular product that improves the effectiveness of organizations through content management and enterprise search, shared business processes, and information-sharing across boundaries for better business insight. And StorageEdge is an extremely fine product that enhances/improves SharePoint performance.
By using StorageEdge, SharePoint's performance can easily be enhanced.
Just a few ideas...
Is displaying your pages as slow when done from the server as from a client? If it's slower from a client, do check your network.
Are your pages very "heavy"? (meaning many elements, web parts and so on?) Then maybe it's normal.
Have you noticed them loading more slowly since you've added one specific web part? Maybe there's an issue with that specific web part (for example, it's accessing a document library that has many - thousands of - documents). If that's the case, try deactivating that specific web part and see if performance improves.
I noticed that SharePoint loves to add a ton of JavaScript. If you run a browser with slow JavaScript (say, Internet Explorer), I notice that it sometimes does not "feel" fast.
Also, if you are running custom code on it: make sure to dispose of your SPWebs after use; that can help a lot!
Are you running on virtual or physical servers? We found that it is significantly faster on physical servers. Also, check the disk performance - if you are running the servers from a SAN it might be a sign that your SAN is over utilised.
To investigate SharePoint performance issues, I would try these things first, in that order:
Run SQL profiler for those non-performing pages. SharePoint API excels at hiding what's going on behind the scenes in respect to database roundtrips. There are single API calls that, without the knowledge of the developer, generate many roundtrips, which hurt performance.
Profile the w3wp.exe process serving your SharePoint site. That's going to tell you relative API usage. Focus on ticks, not time, and do a top-down inclusive time analysis to see which calls are taking up most of the time. See here for instructions.
Run Fiddler or Microsoft NetMon to spot potential excessive client roundtrips (i.e. between browser and web front end server) and redirections (301's).
The three major components of a SharePoint setup are the SharePoint server (the one which runs the WSS/SPS services), the SQL Server DB and IIS.
You said you have decent power for your SharePoint services and I assume IIS would definitely be on a good machine.
Usually it's the SQL Server setup hosting the SP-related DBs that slows down page loads. I would take a look at all your SQL Server related performance counters, and you might want to tune those DBs too (which includes the OS, server, stored procedures, network, etc.).
Hope this adds to your checklist of things you want to take a look at.
