Visual Studio Performance Analysis - visual-studio-2010

I have an OpenGL application which runs very well. But when I create a menu and make it visible in the scene, the startup of the application becomes very slow. I would like to run a performance analysis on that function to check where all that time is being spent. Is it possible to get that kind of deep detail on performance?
Thanks in advance,
John

I would recommend using the ANTS Performance Profiler 8 - I know it can zero in on the function/line to show you where the bottleneck is. They do provide a free trial and it is easy to use.
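If you just want a quick sanity check before reaching for a profiler, you can also time the suspect call directly. Below is a minimal sketch, assuming a C++ OpenGL app; createMenu() is a hypothetical stand-in for whatever routine builds the menu, not a name taken from the question.

#include <windows.h>
#include <cstdio>

// Hypothetical stand-in for the menu-construction routine suspected of being slow.
static void createMenu()
{
    Sleep(500);   // placeholder for the real work
}

int main()
{
    DWORD start = GetTickCount();   // millisecond wall clock; coarse, but fine for a multi-second stall

    createMenu();

    DWORD elapsedMs = GetTickCount() - start;
    printf("createMenu took %lu ms\n", (unsigned long)elapsedMs);
    return 0;
}

This only confirms that the call is where the time goes; a profiler will then tell you what happens inside it.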

Related

XNA game runs faster with debugging

I have a game that I've made in XNA 4.0 using C# (Visual Studio 2010 Express). My issue is that it runs significantly faster with debugging than when I run the executable directly. I did try switching to Release mode and saw the same performance. Any ideas on what could cause this?
Thanks in advance.
Make sure the XNA game is full screen. In my experience, I notice frame-rate drops in windowed mode.
You also need to check what is happening using Debug -> Start Performance Analysis. This will let you see which methods are using the most time and memory.
If your game has advanced collision detection, be sure to multithread that part. When I make games, I use a ThreadPool to handle the collision work.
I hope some of this helps.

Is it worth upgrading to VS2010 Ultimate to take advantage of the advanced debugging features?

I am currently using VS2010 Premium, and have heard that the debugging in Ultimate is so much better. Is it really worth upgrading to Ultimate to take advantage of the new debugging features they've included? Or is it marketing hype and not really usable for everyday development scenarios?
I believe the feature is called IntelliTrace.
Do you work in a team? Do you get bugs reported to you by a tester, and then you can't repro them? Would it save you a lot of time to be able to "debug" through the actual setup the tester had - see their values and execution path? Or perhaps you work with another developer. Would you like to be able to set a bunch of breakpoints and leave "notes" in the code (pinned data tips) and then export them and give them to the other developer, saying "the bug we're looking for is in your part of the code".
If those scenarios cause you pain now, you want Ultimate. If you work alone, it's possible that IntelliTrace alone will make you want Ultimate. It is cool to "time travel" in the debugger.
If you are working heavily with LINQ to SQL / LINQ to Entities, then the IntelliTrace feature will surely be helpful.
Then you should upgrade.
They name quite a few advantages on this site: http://c.ittoolbox.com/groups/technical-functional/csharp-l/whats-new-in-vs2010-and-is-it-worth-upgrading-3531098#M3669289
IntelliTrace is the only way to download the crash and other event logs from Windows Azure. The dev fabric isn't quite a 100% representation of the Azure deployment environment. I have used IntelliTrace to find deployment flubs quickly. When everything takes 20 minutes to deploy, having a tool that saves you one or two deployment cycles pays for itself quickly.

Redgate Visual Studio addin

I realise that this may be subjective (and would appreciate not being voted down on this one XD), but I would like some advice from other developers out there who have used RedGate's .NET productivity add-ins - ANTS Performance Profiler Pro, ANTS Memory Profiler, and Exception Hunter. It's quite pricey, and basically, does anyone recommend it? And do the ANTS products do what they say they can (respectively):
Identify bottlenecks and ensure code is performing optimally
Zero in fast on common causes of memory leaks
Anticipating your input on this. Many thanks!
I have evaluated the ANTS Performance Profiler, and it's a great tool in my opinion, well worth the price. If you ever discover (and solve) a single annoying performance blocker with its help, it's more than worth its price - at least for professional devs (it is rather pricey for single home / hobby devs, I agree).
I have both the RedGate performance and memory profilers, and both are good. I used the trial of Exception Hunter when it first came out, but didn't see a need for it so I don't have a licence for that.
ANTS Performance Profiler - this is very good and I have used it many times to identify bottlenecks in code. The user interface is intuitive and easily shows slow/inefficient areas to focus on.
ANTS Memory Profiler - I've had less success with this, as I find it harder to use. I also have a licence for the SciTech Memory Profiler, which I find a better tool for memory profiling, allowing you to see more detailed information and drill down into it more easily.
My biggest niggle with the RedGate tools (and this applies to all of their tools) is that they do not work through authenticating proxies and there is no way to configure them to (this doesn't stop them from running though).
If cost is an issue, EQATEC makes a free performance profiler. I've never used it, though, so I cannot comment on how good it is.
If you are looking to solve a specific memory/performance issue, the cost of these tools will pay for itself in saved time. If you are just curious about your application then it would be a harder cost to justify.
Good tools cost more money than lousy ones. From everything I've heard, seen and personally observed, RedGate produces good tools. Using lousy tools takes more of your time. How much that time is worth to you or your employer is something we cannot judge from the information you provided. In the Western world, a good tool pays for itself in only a few hours. That's an ROI that's hard to beat.
Do make sure you adjust that ROI by the amount of time you'll need to learn how to use the tool. You'll get a quick insight into that from spending an hour on the trial version.

Code Profiling in Visual Studio 2005

I have a Visual Studio 2005 solution workspace which in turn has 8 projects included in it. I want to profile the complete code (all the projects) and get some measure of the absolute cycles taken by each function to execute, or at least the percentage of cycles consumed.
I checked the help for VS 2005, and also the project settings options, but could not find any pointers on how to get the profiling info.
Any help regarding this would be beneficial.
-AD.
If your application is not particularly processor intensive, redgate ANTS Profiler is a good choice - the line-by-line stats can come in quite handy, and the whole product is clean and well-designed.
If your app needs a lot of CPU to operate normally, however, most of the .NET profilers on the market won't be able to handle it. The only two that I have ever found that will work for a really heavy-weight application are JetBrains dotTrace and YourKit. The two are very similar, which is not surprising, given that YourKit seems to have been started by a former JetBrains employee. I personally prefer dotTrace, but that may just be because that is what I used first, and there has never been any good reason to switch.
I have tested ANTS, AQTime, DevPartner, GlowCode, Borland OptimizeIt and Intel VTune, and all of them have too much overhead to handle a demanding application. (VTune is a possible exception, but it is so horribly complex to configure and use that I was never able to figure out exactly what it could handle. It is also very expensive.)
I guess the built-in profiler of Visual Studio 2005 comes only with the Developer Edition and Team Edition. I have the Professional edition, which, it seems, does not have the built-in profiler tool.
-AD
I've used both the profiler in Compuware’s DevPartner (I like to still call it “TrueTime”) and Rational's Quantify. I always liked Quantify better, but as I've moved between companies DevPartner is usually already the “standard”.
Both are expensive, but they (seem to) add so much value that any commercial shop should have no problem investing in some seats.
Quantify didn't require special rebuilds of the project - which was GREAT. It also crashed less (that's not saying much; it had its own issues). DevPartner also tended to break as each new version of Visual Studio was released (maybe this is better now?). Buy the yearly maintenance agreement if you go this way.
That said, I've often just written a class that remembers the time at construction and spits out (to a log file) the elapsed time in its destructor. I used QueryPerformanceCounter. I'd stick an instance of this class at the top of the function I wanted to time. You could get fancy by making it a macro, or use the preprocessor to include this class only under a special build…
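A minimal sketch of that idea - my own reconstruction, not the poster's original class - using QueryPerformanceCounter as described; SomeSlowFunction is just a placeholder:

#include <windows.h>
#include <cstdio>

// Logs the elapsed time between construction and destruction.
class ScopedTimer
{
public:
    explicit ScopedTimer(const char* label) : label_(label)
    {
        QueryPerformanceFrequency(&freq_);
        QueryPerformanceCounter(&start_);
    }
    ~ScopedTimer()
    {
        LARGE_INTEGER end;
        QueryPerformanceCounter(&end);
        double ms = (end.QuadPart - start_.QuadPart) * 1000.0 / freq_.QuadPart;
        fprintf(stderr, "%s: %.3f ms\n", label_, ms);   // or append to a log file instead
    }
private:
    const char*   label_;
    LARGE_INTEGER freq_;
    LARGE_INTEGER start_;
};

// Usage: drop one at the top of any function you want to time.
void SomeSlowFunction()
{
    ScopedTimer timer("SomeSlowFunction");
    Sleep(100);   // placeholder for the real work
}

int main()
{
    SomeSlowFunction();
    return 0;
}

Because the destructor fires on every exit path, early returns are timed too.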
I recommend the EQATEC profiler, whose site also includes a tracer.
It's also free and easy to use.
We use DevPartner with Visual Studio 2005. It gives you performance analysis of the specific projects in your solution you want to look at. We also use it for memory management analysis and error analysis. It is a commercial tool, so it's not free.
Red Gate's profiler is great for this.
I use the JetBrains profiler; it is very easy to use and performs very well too.
If your app needs a lot of CPU to operate normally, however, most of the .NET profilers on the market won't be able to handle it.
I have used a trial version of RedGate's ANTS profiler on an optimizing algorithm that normally uses up to 100% CPU on a single-core machine, and though slow, it managed to get through and report all I needed to know. Extremely helpful tool. I wonder what kind of algorithms you have run through the ANTS profiler.
Has anyone used the VS profiler?

Why is Visual Studio 2005 so slow?

It is slow to load anything other than a small project. It is slow to quit; it can sometimes take minutes. It can be slow to open new files. The record macro feature used to be useful. It is now so slow to start up it's almost always quicker to do it manually!
More info would be helpful. How big are your solutions? What platform are you on? What third-party plugins are you running? What else is running on your PC?
3.2GHz P4 Hyperthreaded, 2GB RAM. Running Outlook, Perforce, IE7, directory browsers. Usually have 1-3 instances of VS running. It's much slower than VC6, say. It seems to take a long time to load projects and close down. I'm interested in whether people know reasons why this happens, because of the way VS is written. Is it using .NET internally, with GC slowing it down?
One of the biggest culprits for Visual Studio 2005 slowness is Intellisense. This has been brought up on the MSDN forums again and again and again. What I frequently experience is that Intellisense is working nearly non-stop to "re-index" symbols (or whatever you call it). But developers at Microsoft have not been deaf to the complaints and some outgoing folks there have come up with some workarounds that have helped me and might help you:
Check out this link to get a better understanding of Intellisense:
Intellisense Info
Then check out this link for some macros that I've had a lot of success with:
Intellisense Macros
With those macros, you can turn off intellisense (without renaming any DLLs), restart it, delete the ncb file (which you can do manually, but this is a convenience) and it can give you the status of Intellisense.
It might be that you have a plugin that is misbehaving. Try the safe-mode switch to see if this improves performance.
One of the biggest culprits for Visual Studio 2005 slowness is Intellisense. This has been brought up on the MSDN forums again and again and again. What I frequently experience is that Intellisense is working nearly non-stop to "re-index" symbols [...]
I agree. I use Visual Assist. It is much better. There is no real way to turn "Intellisense" off either. The only way I have found is to rename the DLL so when you restart VS it is not found. This works and makes VS faster.
I tend to agree that VS is a heavyweight. Back in the day I coded in DOS using Boxer text editor and makefiles. Boxer didn't have the heavy intellisense and refactoring features, but it did better in the text editing department, had good syntax highlighting and startup/closing were instantaneous, even on a 486. ...those were the days.
I would say it would be really nice to customize VS to remove all the overhead you're never going to use anyway, but I don't see that happening.
here's ya problem:
3.2GHz P4 Hyperthreaded, 2GB RAM
Hyperthreaded means "it doesn't actually have two CPUs, but it fakes it". If you have a process with just one thread running, then you get bad performance. It was a good short-term measure, but compared to having two REAL CPUs, it's a slow hack.
I don't think that's the problem at all. The machine is a high enough spec to be a professional C++ development machine for large projects. I can run Eclipse (which is Java, which is memory-hungry and slower than native code) and it is still way faster than VS 2005.
I doubled the amount of RAM from 1GB to 2GB. This helps a lot with linking large applications. We also use Incredibuild to accelerate compilation. But it's the VS app that is slow.
And if you think I'm a grumpy anti-MS zealot, ask yourself why people aren't buying Vista! :)
I am seeing mixed results with faster machines. Sure, a faster machine hides the poor performance of VS2005, but not all of it.
Simply take your obligatory "hello world" C/C++ program and just compile it (CL /c helloworld.cpp):
#include <stdio.h>
#include <windows.h>

int main(int argc, char *argv[])
{
    printf("Hello World\n");
    return 0;
}
I see a 1-second compile under VC6 and a 6-second compile under VS2005.
Using DEPENDS to profile the two, I see three areas where the 5 seconds of delay and time difference are taking place:
~2.5 secs with ADVAPI32.DLL, CryptGetHashParam()
~1.5 secs with OLE2.DLL, StringFromGUID2()
~1.0 secs with C2.DLL, _AbortCompilerPass()
Again, this is just a compile, not a link. The VC8+ compiler executables/DLLs are referencing subsystems like the crypto API and the Registry for some reason, and that adds a tremendous amount of overhead to straight, pure compiles.
While a faster machine may hide some of the slowdown, one can only wonder whether Microsoft could optimize the compiler by offering options that disable unnecessary overhead references. I understand the better compiler comes with some overhead, but what I see is a 300-500% compile-time degradation - that's awful.
Hector Santos, CTO
Santronics Software
This is a purely subjective thing, I'm afraid.
Maybe it is because of your low-end system configuration.
Maybe VS is trying to get updates from the net?
Maybe you are running too many applications in the background.
Maybe you are trying to open a huge solution.
Recently I've had both Visual Studio 2008 and Visual Studio 2005 on my machine, and I agree that VS2005 is really heavy. They improved upon it in VS2008, although I'm not sure if you'll consider the performance improvements enough.
Could you maybe time some operations and post them so we get an idea of what you mean by "slow"? On my machine, I wouldn't call VS 2005 slow, but if you compare it to Notepad or my web browser it seems slow. Here are some things that might help people figure out what's going on:
Turn off any features that could affect load time. This includes uninstalling all add-ons and making sure that VS isn't configured to automatically open a project.
Reboot your machine.
Time how long it takes VS 2005 to start, from the time you click the icon until the program is usable.
Create a program that you're willing to post here that seems to compile slowly (this might not be possible depending on what it takes to make a slow compile); post the program and how long it takes your computer to build it.
Do you know anyone else with the same kind of machine that has VS 2005 installed? Does it seem slower or faster than yours?
I believe Lord Kelvin said the best thing that can be said about situations like this:
When you can measure what you are speaking about, and express it in numbers, you know something about it; but when you cannot measure it, when you cannot express it in numbers, your knowledge of it is of a meager and unsatisfactory kind; it may be the beginning of knowledge, but you have scarcely, in your thoughts, advanced it to the stage of science.
Until you give us some measurements to look at, we can't tell you if your machine is genuinely slow or if you are expecting more out of your machine than it can give. Your HT CPU might be the problem; I have roughly equivalent machines at work and at home, but my dual-core work machine runs circles around my single-core home machine when it comes to running VS.
VS 2005 is slower than VS 6 because it is less well optimized for speed. The developers of VS 6 had slower machines than the developers of VS 2005. They made it fast "enough" back then. On a modern machine VS 6 is now "pleasantly fast", whereas VS 2005 is only just fast enough.
What annoys me is that they decided to scrap VS 6 and start again for VS 2005, when VS 6 was an awesome piece of software that just needed updating.
I noticed above that you mentioned you are using Perforce too. Do projects load faster when not in Perforce? I am willing to bet that some of the delay you are seeing is related to Perforce during load. The latest version of Perforce seems a lot slower too.
Change your Solution Platform to the "Any CPU" option, given at the top of Visual Studio; then your program's build speed will definitely increase.
here's ya problem:
3.2GHz P4 Hyperthreaded, 2GB RAM
Hyperthreaded means "it doesn't actually have two CPUs, but it fakes it". If you have a process with just one thread running, then you get bad performance. It was a good short-term measure, but compared to having two REAL CPUs, it's a slow hack.
2GB of RAM would be an issue too, based on what you said you run. If you have a basic 5400RPM disk, then it's going to make it all worse.
I'd recommend, based on what you posted:
A good core2 machine, maybe a quad if you have the budget.
3GB of RAM if you are running a 32-bit OS, 4+GB if you are running x64. 4GB means you waste 1GB under 32-bit.
Get 7200RPM disks, or better. If you can, RAID0 them (stripe) or RAID0+1 (stripe+mirror) if you can get 4 drives (stripe == split content over the two disks, so you can read from both at the same time. stripe+mirror == the safe version of striping, so your code is on TWO disks at all times)
I have a 2.0GHz Core2 (so roughly 3-4x the performance of your P4, if you count 2 CPUs (cores) as 2x) with 2GB, and the most I can run well is 2 instances of VS.NET 2008. This is normal - nothing wrong with VS.NET; it's just a huge app.
More RAM. More CPU. More Screen. More. More. More :)
