Code Profiling in Visual Studio 2005 - visual-studio

I have a Visual Studio 2005 solution workspace which in turn has 8 projects included in it. I want to profile the complete code (all the projects) and get some measure of the absolute cycles taken by each function to execute, or at least percentage cycle consumption.
I checked the help for VS 2005, and also the project settings options, but could not find any pointers on how to get the profile info.
Any help regarding this would be beneficial.
-AD.

If your application is not particularly processor intensive, Red Gate's ANTS Profiler is a good choice - the line-by-line stats can come in quite handy, and the whole product is clean and well-designed.
If your app needs a lot of CPU to operate normally, however, most of the .NET profilers on the market won't be able to handle it. The only two that I have ever found that will work for a really heavy-weight application are JetBrains dotTrace and YourKit. The two are very similar, which is not surprising, given that YourKit seems to have been started by a former JetBrains employee. I personally prefer dotTrace, but that may just be because that is what I used first, and there has never been any good reason to switch.
I have tested ANTS, AQTime, DevPartner, GlowCode, Borland OptimizeIt and Intel VTune, and all of them have too much overhead to handle a demanding application. (VTune is a possible exception, but it is so horribly complex to configure and use that I was never able to figure out exactly what it could handle. It is also very expensive.)

I guess the inbuilt profiler of Visual Studio 2005 comes only with the Developer Edition and Team Edition. I have the Professional edition, which, it seems, does not have the inbuilt profiler tool.
-AD

I've used both the profiler in Compuware's DevPartner (I still like to call it "TrueTime") and Rational's Quantify. I always liked Quantify better, but as I've moved between companies, DevPartner is usually already the "standard".
Both are expensive, but they (seem to) add so much value that any commercial shop should have no problem investing in some seats.
Quantify didn't require special rebuilds of the project - which was GREAT. It also crashed less (that's not saying much; it had its own issues). DevPartner also tended to break as each new version of Visual Studio was released (maybe this is better now?). Buy the yearly maintenance agreement if you go this way.
That said, I've often just written a class that remembers the time at construction and spits out the elapsed time (to a log file) in its destructor. I used QueryPerformanceCounter. I'd stick this class at the top of the function I wanted to time. You could get fancy by making it a macro, or use the preprocessor to include this class only under a special build…
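For illustration, here is a rough sketch of that kind of scope timer, assuming nothing beyond the Win32 QueryPerformanceCounter API; the class and function names are made up for the example.

// Sketch of an RAII scope timer as described above (illustrative names).
#include <windows.h>
#include <cstdio>

class ScopeTimer
{
public:
    explicit ScopeTimer(const char* label) : m_label(label)
    {
        QueryPerformanceFrequency(&m_frequency);
        QueryPerformanceCounter(&m_start);
    }

    ~ScopeTimer()
    {
        LARGE_INTEGER end;
        QueryPerformanceCounter(&end);
        double ms = (end.QuadPart - m_start.QuadPart) * 1000.0
                    / static_cast<double>(m_frequency.QuadPart);
        // A real version would append this to a log file instead of stdout.
        std::printf("%s: %.3f ms\n", m_label, ms);
    }

private:
    const char*   m_label;
    LARGE_INTEGER m_frequency;
    LARGE_INTEGER m_start;
};

void SomeExpensiveFunction()
{
    ScopeTimer timer("SomeExpensiveFunction"); // placed at the top of the function being timed
    // ... work you want to measure ...
}

Declaring one of these at the top of a function gives you the elapsed wall-clock time for that call without touching any other code.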

I recommend the EQATEC profiler, which also includes a tracer on its site.
It's also free and easy to use.

We use DevPartner with Visual Studio 2005. It gives you performance analysis of the specific projects in your solution that you want to look at. We also use it for memory management analysis and error analysis. It is a commercial tool, so it's not free.

Red Gate's Profiler is great for this.

I use the JetBrains profiler; it is very easy to use and performs very well too.

If your app needs a lot of CPU to operate normally, however, most of the .NET profilers on the market won't be able to handle it.
I have used a trial version of Red Gate's ANTS profiler on an optimizing algorithm that normally uses up to 100% CPU on a single-core machine, and though slow, it managed to get through and report everything I needed to know. An extremely helpful tool. I wonder what kind of algorithms you have run through the ANTS profiler.
Has anyone used the VS profiler?

Free .NET Profiler for .NET 4.0 mixed code

I checked out some of the performance profilers mentioned here. But...
EQATEC didn't work for me because I have many assemblies I want to profile, and it has a limit on the number of assemblies it will profile. How much of a hassle is getting a free license? I'd go for it if someone guaranteed me that EQATEC can profile both managed and unmanaged code.
SlimTune only profiled my managed code, even when I set "Profile native functions" to "True".
XTE Profiler is no longer free
We have a copy of AQTime 6 we bought before, but it doesn't seem to support .NET 4.0 apps (it can't even start my app)
We use Visual Studio 2010 Professional SP1, so we don't have the Visual Studio profiler
I tried the "poor man's profiling" (halting the program many times and seeing where it is), but I get way too random results and I'm more used to traditional profiling
(I've spent the whole day stumped on this, sorry if I was too negative)
UPDATE: After I cleaned my solution, rebuilt it, and checked that all debug info (.pdb) was copied to the same directory as the executable, I tried AQTime again and it worked! It showed me routine timing info for both managed and unmanaged code, so my problem is solved. However, I'm using a paid profiler, so the question will remain open until I take a look at xperf or someone comes up with something else.
AQTime has a free version of their latest profiler (http://smartbear.com/products/free-tools/aqtime-standard/). It supports .NET 4, but I doubt it can do a mixed profile of native and managed code.
If you are really serious about it, you might look into the Microsoft xperf tools (http://msdn.microsoft.com/en-us/performance). They have a steep learning curve, but they are free, and I doubt any commercial profiler can do what xperf can (the instrumentation is in the OS, not in a separate process, so Vista, Win7 or Win2K8 is required). I'm waiting for someone to write a nice GUI around it, but it's taking a bit long... ;-)
xperf will profile your native code, and you can load your symbols into the result viewer. I don't think it will go down to per-line granularity, though. It also has a .NET CLR provider (http://msdn.microsoft.com/en-us/library/dd264809.aspx). The cool thing about xperf is that it can also show other processes that may be influencing your performance (you are free to switch that off and only profile your own process). For example, it is capable of revealing that your IO is slow due to a badly written USB driver, virus scanner or firewall software. A traditional profiler would only show the slow IO, causing you to focus on a non-bottleneck.
By the way, there is also an ICorProfilerCallback interface you can use to write your own profiler (http://msdn.microsoft.com/en-us/library/s5ec0es1.aspx).
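As a rough illustration of how a profiler built on that interface gets loaded (the interface implementation itself is omitted): the CLR checks the COR_ENABLE_PROFILING and COR_PROFILER environment variables when a process starts, and .NET 4 can also take the DLL location from COR_PROFILER_PATH. The CLSID, DLL path and target executable below are hypothetical placeholders.

// Sketch: launch a target process with a custom CLR profiler enabled.
// The profiler DLL, its CLSID and the target exe are placeholders.
#include <windows.h>

int main()
{
    // Ask the CLR in the child process to load a profiler at startup.
    SetEnvironmentVariableA("COR_ENABLE_PROFILING", "1");
    // CLSID of the COM class implementing ICorProfilerCallback (placeholder).
    SetEnvironmentVariableA("COR_PROFILER", "{11111111-2222-3333-4444-555555555555}");
    // .NET 4 can locate the DLL directly via this path, without COM registration.
    SetEnvironmentVariableA("COR_PROFILER_PATH", "C:\\profilers\\MyProfiler.dll");

    STARTUPINFOA si = { sizeof(si) };
    PROCESS_INFORMATION pi = { 0 };
    char cmdLine[] = "MyManagedApp.exe"; // hypothetical application to profile

    // The child process inherits the environment variables set above.
    if (CreateProcessA(NULL, cmdLine, NULL, NULL, FALSE, 0, NULL, NULL, &si, &pi))
    {
        WaitForSingleObject(pi.hProcess, INFINITE);
        CloseHandle(pi.hThread);
        CloseHandle(pi.hProcess);
    }
    return 0;
}

A real profiler would of course also need the COM class itself, implementing the dozens of ICorProfilerCallback methods (most of them typically just returning E_NOTIMPL).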
I'm pretty sure the answer to your question is "there isn't one".
In comparing a whole bunch of .NET profilers a few months ago, I found that only very few could do mixed .NET/native profiling: AQTime ($599) and GlowCode ($499) could. Or so they say - I didn't try it.
EQATEC, Visual Studio, ANTS, JetBrains dotTrace, YourKit, XteProfiler, SlimTune etc. could not, so I doubt you'll find a free profiler anytime soon that can.
CLR Profiler 4 by Microsoft is free. Have you tried it?
What do you look for in the unmanaged part of the profiler?
Your concern about the EQATEC Profiler is easily resolved: it only does managed .NET profiling, not any kind of unmanaged profiling at all.
As shown in the pricing, the actual profiling functionality differs only in the number of assemblies that can be profiled at once. So a $0 Free edition profiles a single-assembly WP7 app just as well as a $999 Corporate edition does. For the extra price tag you get to profile more assemblies at once, plus a handful of secondary features, like print, compare, min/max etc.
Getting a free license by trying out EQATEC Analytics is said to be easy. Going for the unlimited Corporate license is quite a popular choice, and many have earned it in just a couple of hours. Getting a free $99 Standard license shouldn't take more than 10 minutes or so, if you're good. Please note: I work at EQATEC, and we actually hand out so many free licenses every day now that it's almost become a burden, because each is manually processed (yes, seriously!), so this particular offer may not go on forever.
OP: "I'd go for it if someone guaranteed me that XXXX can profile both managed and unmanaged code"
Our C# Timing Profiler is not dependent on how your C# code is compiled (managed or unmanaged, or mixed). It should work fine for this.

Is it worth upgrading to VS2010 Ultimate to take advantage of the advanced debugging features?

I am currently using VS2010 Premium, and have heard that the debugging in Ultimate is so much better. Is it really worth upgrading to Ultimate to take advantage of the new debugging features they've included? Or is it marketing hype and not really usable for everyday development scenarios?
I believe the feature is called IntelliTrace.
Do you work in a team? Do you get bugs reported to you by a tester, and then you can't repro them? Would it save you a lot of time to be able to "debug" through the actual setup the tester had - see their values and execution path? Or perhaps you work with another developer. Would you like to be able to set a bunch of breakpoints and leave "notes" in the code (pinned data tips) and then export them and give them to the other developer, saying "the bug we're looking for is in your part of the code".
If those scenarios cause you pain now, you want Ultimate. If you work alone, it's possible that IntelliTrace alone will make you want Ultimate. It is cool to "time travel" in the debugger.
If you work heavily with LINQ to SQL / LINQ to Entities, then the IntelliTrace feature will surely be helpful.
Then you should upgrade.
They list quite a few advantages on this site: http://c.ittoolbox.com/groups/technical-functional/csharp-l/whats-new-in-vs2010-and-is-it-worth-upgrading-3531098#M3669289
IntelliTrace is the only way to download the crash and other event logs from Windows Azure. The dev fabric isn't quite a 100% representation of the Azure deployment environment. I have used IntelliTrace to find deployment flubs quickly. When everything takes 20 minutes to deploy, having a tool that saves you 1 or 2 deployment cycles pays for itself quickly.

Visual Studio performance and add-ins

Do the useful add-ins for Visual Studio (ReSharper, StyleCop, etc.) speed up your work? Or do the tools need too many resources, so that you have to wait while each add-in completes execution?
[Update]: By the way, has anybody noticed whether performance of the IDE + ReSharper is better for solutions that contain web sites or web applications?
I can say very strongly that ReSharper definitely speeds up my work greatly. Past versions of ReSharper have had some bad performance issues with the IDE, but I have had no issues with the most recent version.
I use some add-ins as long as they don't affect the performance of Visual Studio. To that end, tools like StyleCop, MZ-Tools, and Visual Studio Commands are the clear winners.
The problems I have with tools like Refactor! and ReSharper are that:
They degrade performance, particularly for large solutions.
You become dependent on the shortcut keys, etc. they provide and become completely useless when working in another environment that doesn't have them installed.
Yes, tools like Refactor! and ReSharper are excellent at what they do and can increase your typing productivity, but I don't think the gain is worth the dependence. This, of course, depends largely on how you use them. For things like refactoring method parameters, changing fields into properties, etc., they can be very useful and potentially save a lot of time. Again, while they can save a lot of time, it is still important to know what these tools are actually doing for you so you can still be productive without them.
ReSharper definitely puts a demand on hardware resources, particularly when using solution-wide analysis on a large project. Having said that, the extent of the performance hit is highly dependent on the host machine. On my work laptop (32-bit XP, 3GB RAM, 7200 RPM HDD, 2.2 GHz dual core) it suffers, but on my home PC (64-bit Win 7, 8GB RAM, 7200 RPM HDD, 2.9 GHz quad core) it flies and I barely notice the performance hit. That said, I still couldn't live without it even on the lower-specced hardware. The productivity gain still outweighs the time spent waiting for slower processes.
I use Refactor! all the time. Just the time it saves me in encapsulating private variables into properties is worth it, in my opinion.
That being said... a lot of the "benefits" of these programs are negated if you program it correctly to begin with.
For example, if you already habitually use "WITH" statements properly, you probably do not need something to clean up your style.
However, in corporate America (and elsewhere, I am sure), coding practices are not always followed by everyone, and rework and modifications are always coming in, so you will usually end up needing them eventually.
I personally have not experienced any noticeable difference in performance with these types of tools.
I have ReSharper, ReSharper Scout and Team Explorer + TFS Power Tools. My Visual Studio definitely feels a little sluggish compared to barebones, but if you want super speed over features, why not work in Notepad? For me, ReSharper is definitely worth the viscosity.

If I'm a solo dev, should I bother with VS Team System?

I have an MSDN subscription and I'm wondering which edition of Visual Studio 2008 to get. I recall reading that Team System has a lot of bonus features, like high-level system architecture stuff and specialized things related to database work. As a solo dev, I wear many hats, including database developer and architect - should I use Visual Studio Team Suite to get all of these things, or are they major overkill for a single guy?
EDIT: I have a "special" MSDN license (via the MS BizSpark program for startups) that gives me access to the FULL version of Team Suite for 3 years, for myself and any developers in my startup. After that I have to pay if I want upgrades but I'm free to use it for development indefinitely if I'm okay with not upgrading (per BizSpark licensing).
With that in mind, should I look at Team Suite or stick with Pro? I don't plan to use Team Foundation Server at all.
Well, the "test" stuff is now available in "pro" (but not profiling) so that removes one major comparator. In many ways, the MSDN subscription is a bigger factor than the VS product suite, assuming you don't need the full bredth of tools.
The VS feature list here; the MSDN feature list is here.
I used to use pro, and I never felt I missed much. Of course, you could always get pro plus something like dotTrace for profiling, ReSharper for code analysis/refactoring, and maybe TestDriven.NET for testing - you'd probably still have change left over.
I now have a team suite license (which is very nice), but if I had to pay for it I'd have to think very carefully; I'd probably get developer edition + MSDN.
I'd say that VS Team System is overkill for a single-developer sweatshop, but your situation may prove otherwise. Team System is great when you're working on a project where all things are Microsoft, but all the extra features (database, architect, etc.) become useless when you start working with Oracle and MySQL databases. Don't put too much stress on the tools; VS Pro is good enough if you want to save money. I'd rather spend more money on extra tools such as third-party components and refactoring tools than on the shiny VS Team System.
But since you joined the BizSpark program, which I think is really great for startups, I think you should go and try VSTS. You basically pay nothing for the extra features. By the time you need to pay full price for the licenses, I think you will have gathered enough experience with VSTS to decide whether to stick with it or roll back to Pro.
It never hurts to have as many toys as possible in your toy box. Sure, you may only play with some of them once in a blue moon but the point is that you have them there to play with when you want to.
I run on a Mac, so I have to run all of my stuff off a VM, and I got to thinking that all I needed was VS installed; then I could use the underlying OS to handle all of my other functions (Dreamweaver, Photoshop, Office, web browsing), or in other words my general day-to-day computing life. Thanks to VMware, the transition between the VM and the host OS is easy, but then you get attachments in your email that you want in your VM, or you work on a programming doc on the host OS... the list goes on and on and on.
My point is this... you'll never regret putting more into your development system, you will regret not having that one tool that you wanted to have but just didn't think you'd need.
First, you should definitely use a version control product. Being able to go back in time and recall previous builds will save you tons of time and effort. Nothing is worse than having it work one day, then realizing that a change you made but can't remember broke everything.
Second, if it's just you (or even a couple of other people), you should probably go with Subversion. Easy to set up, manage, and interact with is the name of the game here. Not to mention free, fully supported, reliable, and easy to learn.
I have recently started using VisualSVN Server and the VisualSVN client for Visual Studio. The server is free, and the client is $45 for a license you can use on every one of your development machines. Add TortoiseSVN and you can use version control from the Windows shell.
I tried the TFS and VSS products from Microsoft and found Subversion much easier to deal with.
If you are serious about unit testing your code (you should be) then I'd definitely recommend using the Development Edition, as it provides code coverage, which Professional Edition doesn't.
Sure, you can get most of the functionality difference between Professional & Development Edition from free/cheap 3rd party tools, but IMO these come at a price that is usually higher than what their tag says. Since you may use the even better Team Suite for 3 years I wouldn't even bother looking at the 3rd party tools.
I believe that the Team Developer Edition will now include the Database edition. This is probably all that you would require. From memory, the full Team Suite edition (Developer, Database, Architect and Test all together) is quite an expensive purchase.
One feature from team system which I like is the ability to profile the performance of your application. That might not merit an upgrade in itself if you have to pay for it, but it's very handy in some cases.
I agree with theBadDawg.
I thought it was a travesty when the unit testing features were only available in the most expensive editions of Visual Studio; unit testing is something everyone should have access to, because it benefits us all by instilling good habits in us and helps us write far better software, especially if we're new to the game.
Fortunately, it's now in the Pro edition.
If you can get Team Suite and enjoy its tools to be more productive and produce better-quality software, do it.
I would agree with Marc Gravell. You can probably approximate the value of Team System with add-ons, but you also need to factor in the cost of maintaining the add-ons. There is some pain associated with maintaining several third-party tools to get the functionality that you could get in an integrated package. Depending on who is spending the money (you or your employer), the amount of pain you are willing to deal with to get all the functionality may differ.
I've been very happy with Team System, although I have added in TestDriven.NET as a test runner. We switched to this when TS came out with baked-in unit testing, coverage analysis, and source code control. I'm very happy with the choice, but if I had had to pay for it personally, I probably would have gone with NUnit, NCover, SVN, etc. and kept the leftover money. I do feel that it has made me more productive, but I just wouldn't have had that much money to spend.

Why is Visual Studio 2005 so slow?

It is slow to load anything other than a small project. It is slow to quit; it can sometimes take minutes. It can be slow to open new files. The record macro feature used to be useful. It is now so slow to start up it's almost always quicker to do it manually!
More info would be helpful. How big are your solutions? What platform are you on? What 3rd party plugins are you running? What else is running on your PC?
3.2GHz P4 Hyperthreaded, 2GB RAM. Running Outlook, Perforce, IE7, directory browsers. I usually have 1-3 instances of VS running. It's much slower than VC6, say. It seems to take a long time to load projects and close down. I'm interested in whether people know reasons why this happens, because of the way VS is written. Is it using .NET internally, and does GC slow it down?
One of the biggest culprits for Visual Studio 2005 slowness is Intellisense. This has been brought up on the MSDN forums again and again and again. What I frequently experience is that Intellisense is working nearly non-stop to "re-index" symbols (or whatever you call it). But developers at Microsoft have not been deaf to the complaints and some outgoing folks there have come up with some workarounds that have helped me and might help you:
Check out this link to get a better understanding of Intellisense:
Intellisense Info
Then check out this link for some macros that I've had a lot of success with:
Intellisense Macros
With those macros, you can turn off Intellisense (without renaming any DLLs), restart it, delete the .ncb file (which you can do manually, but this is a convenience), and get the status of Intellisense.
It might be that you have a plugin that is misbehaving. Try the /SafeMode switch to see if this improves performance.
One of the biggest culprits for Visual Studio 2005 slowness is Intellisense. This has been brought up on the MSDN forums again and again and again. What I frequently experience is that Intellisense is working nearly non-stop to "re-index" symbols [...]
I agree. I use Visual Assist. It is much better. There is no real way to turn Intellisense off either. The only way I have found is to rename the DLL so that when you restart VS it is not found. This works and makes VS faster.
I tend to agree that VS is a heavyweight. Back in the day I coded in DOS using Boxer text editor and makefiles. Boxer didn't have the heavy intellisense and refactoring features, but it did better in the text editing department, had good syntax highlighting and startup/closing were instantaneous, even on a 486. ...those were the days.
I would say it would be really nice to customize VS to remove all the overhead you're never going to use anyway, but I don't see that happening.
here's ya problem:
3.2GHz P4 Hyperthreaded, 2GB RAM
Hyperthreaded means "doesn't actually have two CPUs, but it fakes it". If you have a process with just one thread running, then you get bad performance. It was a good short-term measure, but compared to having two REAL CPUs, it's a slow hack.
I don't think that's the problem at all. The machine has a high enough spec to be a professional C++ development machine for large projects. I can run Eclipse (which is Java, which is memory hungry and slower than native code) and it is still way faster than VS 2005.
I doubled the amount of RAM from 1GB to 2GB. This helps a lot with linking large applications. We also use Incredibuild to accelerate compilation. But it's the VS app that is slow.
And if you think I'm a grumpy anti-MS zealot, ask yourself why people aren't buying Vista! :)
I am seeing mixed results with faster machines. Sure, a faster machine hides some of VS2005's poor performance, but not all of it.
Simply take your obligatory "hello world" C/C++ program and just compile it (cl /c helloworld.cpp):
#include <stdio.h>
#include <windows.h>

int main(int argc, char *argv[])
{
    printf("Hello World\n");
    return 0;
}
I see a 1 second compile under VC6 and a 6 second compile under VS2005.
Using Depends to profile the two, I see 3 areas where the 5 seconds of delay and the time difference are taking place:
~2.5 secs with ADVAPI32.DLL, CryptGetHashParam()
~1.5 secs with OLE2.DLL, StringFromGUID2()
~1.0 secs with C2.DLL, _AbortCompilerPass()
Again, this is just a compile, not a link. The VC8+ compiler executables/DLLs are referencing subsystems like the crypto API and the Registry for no transparent reason, and that adds a tremendous amount of overhead to straight, pure compiles.
While a faster machine may hide some of the slowdown, one can only wonder whether Microsoft could optimize the compiler by offering options that disable unnecessary overhead. I understand that the better compiler comes with some overhead, but what I see is a 300-500% compile-time degradation - that's awful.
Hector Santos, CTO
Santronics Software
This is a purely subjective thing, I'm afraid.
Maybe it is because of your low-end system configuration.
Maybe VS is trying to get updates from the net?
Maybe you are running too many applications in the background.
Maybe you are trying to open a huge solution.
Recently I've had both Visual Studio 2008 and Visual Studio 2005 on my machine, and I agree that VS2005 is really heavy. They improved upon it in VS2008, although I'm not sure if you'll consider the performance improvements enough.
Could you maybe time some operations and post them so we get an idea of what you mean by "slow"? On my machine I wouldn't call VS 2005 slow, but if you compare it to Notepad or my web browser it seems slow. Here are some things that might help people figure out what's going on:
Turn off any features that could affect load time. This includes uninstalling all add-ons and making sure that VS isn't configured to automatically open a project.
Reboot your machine.
Time how long it takes VS 2005 to start, from the time you click the icon until the program starts.
Create a program that you're willing to post here that seems to compile slowly (this might not be possible depending on what it takes to make a slow compile); post the program and how long it takes your computer to build it.
Do you know anyone else with the same kind of machine that has VS 2005 installed? Does it seem slower or faster than yours?
I believe Lord Kelvin said the best thing that can be said about situations like this:
When you can measure what you are speaking about, and express it in numbers, you know something about it; but when you cannot measure it, when you cannot express it in numbers, your knowledge of it is of a meager and unsatisfactory kind; it may be the beginning of knowledge, but you have scarcely, in your thoughts, advanced it to the stage of science.
Until you give us some measurements to look at, we can't tell you if your machine is genuinely slow or if you are expecting more out of your machine than it can give. Your HT CPU might be the problem; I have roughly equivalent machines at work and at home, but my dual-core work machine runs circles around my single-core home machine when it comes to running VS.
VS 2005 is slower than VS 6 because it is less well optimized for speed. The developers of VS 6 had slower machines than the developers of VS 2005. They made it fast "enough" back then. On a modern machine VS 6 is now "pleasantly fast", whereas VS 2005 is only just fast enough.
What annoys me is that they decided to scrap VS 6 and start again for VS 2005, when VS 6 was an awesome piece of software that just needed updating.
I noticed above that you mentioned you are using Perforce too. Do projects load faster when not in Perforce? I am willing to bet that some of the delay you are seeing is related to Perforce during load. The latest version of Perforce seems a lot slower too.
Change your solution platform to the "Any CPU" option, which is given at the top of Visual Studio; then your program's build speed will definitely increase.
here's ya problem:
3.2GHz P4 Hyperthreaded, 2GB RAM
Hyperthreaded means "doesn't actually have two CPUs, but it fakes it". If you have a process with just one thread running, then you get bad performance. It was a good short-term measure, but compared to having two REAL CPUs, it's a slow hack.
2GB of RAM would be an issue too, based on what you said you run. If you have a basic 5400RPM disk, then it's going to make it all worse.
I'd recommend, based on what you posted:
A good Core 2 machine, maybe a quad if you have the budget.
3GB of RAM if you are running a 32-bit OS, 4+GB if you are running x64. 4GB means you waste 1GB under 32-bit.
Get 7200 RPM disks, or better. If you can, RAID 0 them (stripe), or RAID 0+1 (stripe+mirror) if you can get 4 drives (stripe = split content over the two disks, so you can read from both at the same time; stripe+mirror = the safe version of striping, so your code is on TWO disks at all times).
I have a 2.0GHz Core 2 (so roughly 3-4x the performance of your P4, if you count 2 CPUs (cores) as 2x) with 2GB, and the most I can run well is 2 instances of VS.NET 2008. This is normal - nothing wrong with VS.NET; it's just a huge app.
More RAM. More CPU. More Screen. More. More. More :)
