Why is Visual Studio 2005 so slow? - visual-studio

It is slow to load anything other than a small project. It is slow to quit; it can sometimes take minutes. It can be slow to open new files. The record macro feature used to be useful. It is now so slow to start up it's almost always quicker to do it manually!
More info would be helpful. How big are your solutions? What platform are you on? What 3rd party plugins are you running? What else is running on your PC?
3.2GHz P4 Hyperthreaded, 2GB RAM. Running Outlook, Perforce, IE7, directory browsers. Usually have 1-3 instances of VS running. It's much slower than VC6, say. It seems to take a long time to load projects and close down. I'm interested in whether people know why this happens, because of the way VS is written. Is it using .NET internally, with the GC slowing it down?

One of the biggest culprits for Visual Studio 2005 slowness is Intellisense. This has been brought up on the MSDN forums again and again and again. What I frequently experience is that Intellisense is working nearly non-stop to "re-index" symbols (or whatever you call it). But developers at Microsoft have not been deaf to the complaints and some outgoing folks there have come up with some workarounds that have helped me and might help you:
Check out this link to get a better understanding of Intellisense:
Intellisense Info
Then check out this link for some macros that I've had a lot of success with:
Intellisense Macros
With those macros, you can turn off Intellisense (without renaming any DLLs), restart it, and delete the .ncb file (which you can do manually, but the macro is a convenience); they can also report the status of Intellisense.
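If you prefer the manual route for the .ncb cleanup, here is a rough sketch from a command prompt (close Visual Studio first; the solution path is an assumption for illustration):
REM delete the IntelliSense browse database(s); VS rebuilds them the next time the solution loads
cd /d C:\path\to\your\solution
del /s *.ncb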

It might be that you have a plugin that is misbehaving. Try the /SafeMode switch to see if this improves performance.
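A minimal sketch of that, assuming devenv.exe is on your PATH (e.g. from a Visual Studio command prompt):
REM start Visual Studio without loading third-party packages or add-ins
devenv.exe /SafeMode
If it is fast in safe mode, re-enable your add-ins one at a time to find the culprit.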

One of the biggest culprits for Visual Studio 2005 slowness is Intellisense. This has been brought up on the MSDN forums again and again and again. What I frequently experience is that Intellisense is working nearly non-stop to "re-index" symbols [...]
I agree. I use Visual Assist. It is much better. There is no real way to turn "Intellisense" off either. The only way I have found is to rename the DLL so when you restart VS it is not found. This works and makes VS faster.

I tend to agree that VS is a heavyweight. Back in the day I coded in DOS using Boxer text editor and makefiles. Boxer didn't have the heavy intellisense and refactoring features, but it did better in the text editing department, had good syntax highlighting and startup/closing were instantaneous, even on a 486. ...those were the days.
I would say it would be really nice to customize VS to remove all the overhead you're never going to use anyway, but I don't see that happening.

here's ya problem:
3.2GHz P4 Hyperthreaded, 2GB RAM
Hyperthreaded means "doesn't actually have two CPUs, but it fakes it". If you have a process with just one thread running, then you get bad performance. It was a good short-term measure, but compared to having two REAL CPUs, it's a slow hack.
I don't think that's the problem at all. The machine is plenty high-spec enough to be a professional C++ development machine for large projects. I can run Eclipse (which is Java, which is memory hungry and slower than native code) and this is still way faster than VS 2005.
I doubled the amount of RAM from 1GB to 2GB. This helps a lot with linking large applications. We also use Incredibuild to accelerate compilation. But it's the VS app that is slow.
And if you think I'm a grumpy anti-MS zealot, ask yourself why people aren't buying Vista! :)

I am seeing mixed results with faster machines. Sure, a faster machine hides the poor performance of VS2005, but not all of it.
Simply take the obligatory "hello world" C/C++ program and just compile it (CL /c helloworld.cpp):
#include <stdio.h>
#include <windows.h>
int main(int argc, char *argv[])
{
    printf("Hello World\n");
    return 0;
}
I see a 1-second compile under VC6 and a 6-second compile under VS2005.
Using DEPENDS to profile the two, I see 3 areas where the extra 5 seconds of delay are spent:
~2.5 secs with ADVAPI32.DLL, CryptGetHashParam()
~1.5 secs with OLE2.DLL, StringFromGUID2()
~1.0 secs with C2.DLL, _AbortCompilerPass()
Again, this is just a compile, not a link. The VC8+ compiler executables/DLLs are referencing subsystems like the crypto API and the Registry for reasons that are not transparent, and it is adding a tremendous amount of overhead to straight, pure compiles.
While a faster machine may hide some of the slowdown, one can only wonder whether Microsoft could optimize the compiler by offering options that disable unnecessary overhead. I understand the better compiler comes with some overhead, but what I see is a 300-500% compile-time degradation - that's awful.
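If you want to reproduce the comparison yourself, here is a rough sketch from a command prompt (the install paths are assumptions; adjust them for your machine, and time each cl invocation however you like):
REM set up the VC6 environment and compile
call "C:\Program Files\Microsoft Visual Studio\VC98\Bin\vcvars32.bat"
cl /nologo /c helloworld.cpp
REM set up the VS2005 (VC8) environment and compile the same file
call "C:\Program Files\Microsoft Visual Studio 8\VC\vcvarsall.bat" x86
cl /nologo /c helloworld.cpp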
Hector Santos, CTO
Santronics Software

This is a purely subjective thing, I am afraid.
Maybe it is because of your low-end system configuration.
Maybe VS is trying to get updates from the net?
Maybe you are running too many applications in the background.
Maybe you are trying to open a huge solution.

Recently I've had both Visual Studio 2008 and Visual Studio 2005 on my machine, and I agree that VS2005 is really heavy. They improved upon it in VS2008, although I'm not sure if you'll consider the performance improvements enough.

Could you maybe time some operations and post them so we get an idea what you mean by "slow"? On my machine, I wouldn't call VS 2005 slow, but if you compare it to Notepad or my web browser it seems slow. Here are some things that might help people figure out what's going on:
Turn off any features that could affect load time. This includes uninstalling all add-ons and making sure that VS isn't configured to automatically open a project.
Reboot your machine.
Time how long it takes VS 2005 to start, from the time you click the icon until the program starts.
Create a program that you're willing to post here that seems to compile slowly (this might not be possible depending on what it takes to make a slow compile); post the program and how long it takes your computer to build it.
Do you know anyone else with the same kind of machine that has VS 2005 installed? Does it seem slower or faster than yours?
I believe Lord Kelvin said the best thing that can be said about situations like this:
When you can measure what you are speaking about, and express it in numbers, you know something about it; but when you cannot measure it, when you cannot express it in numbers, your knowledge of it is of a meager and unsatisfactory kind; it may be the beginning of knowledge, but you have scarcely, in your thoughts, advanced it to the stage of science.
Until you give us some measurements to look at, we can't tell you if your machine is genuinely slow or if you are expecting more out of your machine than it can give. Your HT CPU might be the problem; I have roughly equivalent machines at work and at home, but my dual-core work machine runs circles around my single-core home machine when it comes to running VS.

VS 2005 is slower than VS 6 because it is less well optimized for speed. The developers of VS 6 had slower machines than the developers of VS 2005. They made it fast "enough" back then. On a modern machine VS 6 is now "pleasantly fast", whereas VS 2005 is only just fast enough.
What annoys me is that they decided to scrap VS 6 and start again for VS 2005, when VS 6 was an awesome piece of software that just needed updating.

I noticed above that you mentioned you are using Perforce too. Do projects load faster when they are not in Perforce? I am willing to bet that some of the delay you are seeing is related to Perforce during load. The latest version of Perforce seems a lot slower too.

Change your Solution Platform to the "Any CPU" option in the dropdown at the top of Visual Studio; your program's build speed should definitely increase.

here's ya problem:
3.2GHz P4 Hyperthreaded, 2GB RAM
Hyperthreaded means "doesn't actually have two CPUs, but it fakes it". If you have a process with just one thread running, then you get bad performance. It was a good short-term measure, but compared to having two REAL CPUs, it's a slow hack.
2GB of RAM would be an issue too, based on what you said you run. If you have a basic 5400RPM disk, then it's going to make it all worse.
I'd recommend, based on what you posted:
A good core2 machine, maybe a quad if you have the budget.
3GB of ram if you are running a 32bit OS, 4+GB if you are running x64. 4GB means you waste 1GB under 32bit.
Get 7200RPM disks, or better. If you can, RAID0 them (stripe) or RAID0+1 (stripe+mirror) if you can get 4 drives (stripe == split content over the two disks, so you can read from both at the same time. stripe+mirror == the safe version of striping, so your code is on TWO disks at all times)
I have a 2.0GHz Core2 (so roughly 3-4x the performance of your P4, if you count 2 CPUs (cores) as 2x) with 2GB, and the most I can run well is 2 instances of VS.NET 2008. This is normal - nothing wrong with VS.NET, it's just a huge app.
More RAM. More CPU. More Screen. More. More. More :)

Related

How to make Visual Studio 2010 use more than 600Mb of memory

I am tired of how slow VS2010 is. I know there are a lot of topics here about tuning the settings and I've read/applied them all without much luck. Namely, the things I've already done:
removed all the extensions
never had ReSharper
tuned the settings to get maximum performance
tried SSD and RAM disks
Nothing helped; it is still unacceptably slow. I know what I am saying because with VS2008 I never had such problems.
Now, I am working on a quite big C# solution with about 20 projects in it. Visual Studio works quite fast when just opened, but as time goes on it starts lagging and eventually gets so slow that I have to restart it. The resource monitor shows that the amount of memory consumed by it is about 200 MB in the beginning, goes up to ~600 MB, and then doesn't go any higher. I have 8 GB of total RAM on an x64 laptop with about 4 GB always free. I find it weird how little memory VS uses; common sense tells me that the more memory it uses, the faster the app should work. So I believe my question is how to make VS use more of the available memory.
PS
I tried a recipe from "Configure Visual Studio to use more RAM". It didn't work out.
There is no way to make Visual Studio use more memory. The application itself has no preset limitation. It will simply use the amount of memory that is granted it by the operating system (just like other apps).
The reason you see it increase to 600MB and then stop is just a side effect of how the managed GC works. As it performs operations like displaying intellisense, performing edits, etc ... more managed objects will be created. Eventually the GC is triggered and it reclaims all of the free objects and the longer lived ones are promoted. Overall though memory usage will be lowered but not as much as before you started editing. Then you edit some more and this process continues until it reaches the appearance of a steady state. If you deeply analyze it you'll see that it's actually more of a saw tooth graph of memory usage.
Why your particular instance of Visual Studio is slow, though, is hard to determine remotely. 20 projects is a larger solution, but performance should still be acceptable even with that many. A couple of things to try in order to isolate the problem:
Try editing a smaller solution. It's possible there is one project in particular which is giving VS a problem. Breaking the project down into smaller solutions could help isolate it.
Try disabling Aero on your computer. It's possible that WPF is a problem here

Performance tips for making Visual Studio 2010 faster? [closed]

I don't know if anybody else has had an issue with the performance of Visual Studio 2010, but I close it daily and reopen it, and within an hour it starts to really bog down and can't even keep up with my typing. Is there some obvious setting I am missing that would help to speed it up?
I am also using ReSharper, but even if I remove that, it only marginally increases the speed.
Since a couple of people have asked for my machine specs:
Intel Q9550 @ 2.83 GHz
4 Cores
8GB Physical RAM
2x 60GB SSD in RAID0 combination for solution/project
VS2010 RTM Ultimate
Windows Server 2008 x64 R2 (Performance set for Applications)
Although it is sad to hear that the answer is "buy faster hardware" when my hardware is actually pretty good.
EDIT: Including a link to the Visual Studio Performance Diagnostics tool suggested by TimothyP
I recommend you consider installing some hotfixes from http://connect.microsoft.com/VisualStudio/Downloads. I also had heavy performance problems before and saw messages like "insufficient memory" during cut & paste operations. This problem and some others (including various memory leak problems) are already fixed. After installing some hotfixes from http://connect.microsoft.com/VisualStudio/Downloads the performance of Visual Studio on my computer is much better.
Adding my own answer here. I really didn't think ReSharper did that much. After trying every option and only loading one project, it was still lagging even from simply joining 2 lines together (a couple of backspace presses).
I uninstalled ReSharper and along with all of the other tweaks I have done, the thing is blazingly fast.
One note re RAID0 SSDs. You should make sure your RAID controller (and driver!) supports TRIM for RAIDed SSDs. Most RAID controllers - especially the Intel chipset controllers - do NOT do this. The consequence is that I/O performance will quickly degenerate very significantly in SSD-based RAID arrays.
Windows 7 and Windows 2008R2 support the TRIM command - when your controller and driver implement it.
There is lots on this subject at sites like tomshardware.com or anandtech.com. If all else fails, you might consider either using SSDs in non-RAID, or using SSD for OS and normal, large HD for databases etc. You could also look up your specific SSDs on one of the aforementioned sites; performance varies much more widely than you might think.
Try turning off IntelliTrace. I've had numerous problems related to slowness and instability because of this feature (it could just be me). The setting is under Tools > Options > IntelliTrace > Enable IntelliTrace.
Of course, RAM is always very important for a large development environment like Visual Studio, especially the 2010 version and especially if you're using the Ultimate edition which includes such fairly memory intensive features as IntelliTrace and the Architecture and Modelling Diagrams.
However, one of the main things that is often overlooked, but can make a big difference to the overall performance of Visual Studio, is Hard-Drive Speed.
Scott Guthrie (Microsoft's Corporate Vice President of the .NET Developer Platform) wrote a very interesting article about this exact subject.
It's a few years old, and was written around the time of Visual Studio 2005, however, it is still very relevant today since the way that Visual Studio continues to work (specifically the way the compilers work) has not changed so much over that time.
Scott writes:
People often ask me at conferences for PC hardware recommendations. Specifically - "what type of machine do you recommend I get for doing development with Visual Studio?" and/or "your laptop seems really fast, what type is it?"
Some of my recommendations on this topic are fairly standard and obvious: Ideally you want to get a dual core or better CPU. I also always recommend getting at least 2GB or more of RAM.
The recommendation I make that often seems to take people a little by surprise is to make sure you always get the fastest possible hard-drive when buying a new machine - and where necessary trade off purchasing additional CPU processor speed in favor of investing in a faster disk instead.
Also:
Why does hard drive speed matter?
Multi-core CPUs on machines have gotten fast enough over the past few years that in most common application scenarios you don't usually end up blocking on available processor capacity in your machine.
What you are much more likely to block on is the Seek and I/O speed capacity with which your computer accesses your hard drive. If you are using an application that needs to read/write a lot of files, it is not atypical for your CPU processor utilization to be really low - since the application might be spending most of its time just waiting for the disk operations to complete.
When you are doing development with Visual Studio you end up reading/writing a lot of files, and spend a large amount of time doing disk I/O activity. Large projects and solutions might have hundreds (or thousands) of source files (including images, css, pages, user controls, etc). When you open a project Visual Studio needs to read and parse all source files in it so as to provide intellisense. When you are enlisted in source control and check out a file you are updating files and timestamps on disk. When you do a compilation of a solution, Visual Studio will check for updated assemblies from multiple disk path locations, write out multiple new assemblies to disk when the compilation is done, as well as persist .pdb debugger symbol files on disk with them (all as separate file save operations). When you attach a debugger to a process (the default behavior when you press F5 to run an application), Visual Studio then needs to search and load the debugger symbols of all assemblies and DLLs for the application so as to setup breakpoints.
If you have a slow hard-drive, Visual Studio will end up being blocked as it waits for it to complete these read/write operations - which can really slow down your overall development experience.
You can read the full article here:
Tip/Trick: Hard Drive Speed and Visual Studio Performance
Do you have the Desktop Experience component enabled in your Server 2008 R2 install? Unlike prior versions, Visual Studio 2010 makes heavy use of WPF and its performance benefits greatly from hardware acceleration. Enabling Desktop Experience will enable the Desktop Window Manager, which improves overall WPF performance.
For the same reason, you should ensure you are using the newest video drivers available.
If you are building many projects together in your solution, I recommend setting the ones you are not working on to NOT BUILD in the project properties configuration. This is what I do to speed up mine. The difference is most evident in the compile time...
With that kind of hardware, IMHO you shouldn't have any kind of trouble with performance almost no matter what you do. (2 x SSD in RAID-0? -- you're a maniac!!)
It looks like you've already solved this problem (is there anything specific that you did that you could share?), but another thing to check is to make sure your video drivers are up to date. It's surprising, but they can affect a lot of things you wouldn't suspect.
I suppose another culprit could be a hyperactive anti-virus package, too...
I love the suggestion of upgrades when clearly the machine is blazing. My suggestion, if possible, would be to try out 2008 and see how it runs. I had several problems myself with 2010, not least of which was the performance issue. For the sake of productivity I switched back to 2008.
If you can confirm that the problem does or doesn't occur on an older version we can have more of an idea where the problem lies.
The brand/controller of your SSDs is more important than the fact that they are SSD. Don't buy a cheap/budget SSD - you'd be better off with a good platter drive. Splurge on the high end SSDs and you'll experience major gains.
If you're editing XAML, you can just use the source code editor instead of the XAML editor. The performance difference is phenomenal:
http://msdn.microsoft.com/en-us/library/bb907321(v=vs.90).aspx
It's worth noting that if you open the xaml editor at any point, then you'll need to restart Visual Studio to get the performance back to normal. It's not sufficient to close the xaml editor.
The only way to get an SSD in RAID0 with TRIM support is with the new OCZ Revo 3 (1.5GB/s read, 1.25GB/s write). A more affordable and stable alternative to RAIDed SSDs with no TRIM is to buy an SSD on SATA 3 (550MB/s).
As Matt mentioned, you might want to add more RAM to your machine, but if it really "bogs down" every time you leave it open for an hour, you might want to get in contact with the VS team (http://connect.microsoft.com), file a bug report, and run the performance diagnostics tools they will send you. Those guys have really helped me a lot in the past and I'm sure they'd be willing to help you track down the real reason behind the slowdowns.
Aside from that, I can tell you that my main development machine has 8 Xeon cores and 12GB of RAM. On that machine large solutions compile in just a few seconds, while they can take up to a minute to compile on my dual core MacBook. But since the RTM version I have not experienced any slowdowns like the ones you are describing.
I'm assuming you are using the RTM version here; can you give us the specs of your machine? Hardware + software? It's very hard to help you based on limited information.
I'm assuming you're using C#, but if you're using C++, maybe you could try turning off intellisense? I thought it was supposed to be better in 2010, but the previous versions always got a speed boost when I hacked out intellisense.
If it's the same as 2005, you can disable intellisense by renaming feacp.dll in [vs root dir]\vc\vcpackages.
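A minimal sketch of that rename from a command prompt, run with Visual Studio closed (the path is the default VS 2005 install location and is an assumption for your setup; rename the file back to restore Intellisense):
cd /d "C:\Program Files\Microsoft Visual Studio 8\VC\vcpackages"
ren feacp.dll feacp.dll.disabled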

Visual Studio performance and add-ins

Do the useful add-ins for Visual Studio (ReSharper, StyleCop, etc.) speed up your work? Or do the tools need too many resources, so you have to wait until each add-in completes execution?
[Update]: By the way, has anybody noticed whether the performance of the IDE + ReSharper is better for solutions that contain web sites or web applications?
I can say very strongly that ReSharper definitely speeds up my productivity greatly. Past versions of ReSharper have had some bad performance issues with the IDE, but I have had no issues with the most recent version.
I use some add-ins as long as they don't affect the performance of Visual Studio. To that end, tools like StyleCop, MZ-Tools, and Visual Studio Commands are the clear winners.
The problem I have with tools like Refactor! and Resharper are that
They degrade performance, particularly for large solutions.
You become dependent on the shortcut keys, etc. they provide and become completely useless when working on another environment that doesn't have them installed.
Yes, tools like Refactor! and Resharper are excellent for what they do and can increase your typing productivity but I don't think the gain is worth the dependence. This, of course, depends largely on how you use them. For things like refactoring method parameters, changing fields into properties, etc. they can be very useful and potentially save a lot of time. Again, while it can save a lot of time it is still important to know what these tools are actually doing for you so you can still be productive without them.
ReSharper definitely puts a demand on hardware resources, particularly when using site wide analysis on a large project. Having said that, the extent of the performance hit is highly dependent on the host machine. On my work laptop (32 bit XP, 3Gb RAM, 7200 RPM HDD, 2.2 GHz dual core) it suffers but on my home PC (64 bit Win 7, 8Gb RAM, 7200 RPM HDD, 2.9 GHz quad core) it flies and I barely notice the performance hit. That said, I still couldn’t live without it even on the lower specced hardware. The productivity gain still outweighs the downtime in waiting for slower processes.
I use Refactor! all the time. Just the time it saves me to encapsulate private variables into properties is worth it in my opinion.
That being said... a lot of the "benefits" of these programs are negated if you program it correctly to begin with.
For example, if you already habitually use "WITH" statements properly, you probably do not need something to clean up your style.
However in corporate America (and elsewhere I am sure), coding practices are not always followed by everyone, and rework and modifications are always coming in, so usually you will end up needing them eventually.
I personally have not experienced any noticeable difference in performance with these type of tools.
I have Resharper, Resharper Scout and Team Explorer + TFS Power tools. My Visual Studio definitely feels a little sluggish comparing to barebones, but if you want superspeed over features why not work in Notepad? For me, Resharper is definitely worth the viscosity.

Is Visual Studio 2010 beta 1 usable?

I saw that Beta 1 of VS2010 was publicly available.
My question to those of you who have tried it is: does it work well?
Will it cause my computer to blow up in tiny pieces? Will it crash randomly? Will it work with some minor glitches? Or is it just perfect from bottom up?
I'm only coding school- and hobby-stuff, so nothing that someone's life depends upon, but I still want software that works. How close to a final product is it? Is it worth trying?
It's a bit slow, and there's no offline MSDN, but it's worth trying IMO. Having said that it's slow, I still use it on my NC10 netbook, so it's clearly not that bad :)
I've got it side-by-side VS2008, and that hasn't caused any problems.
I've seen a couple of glitches (once the keyboard handling went completely wonky) but it's certainly usable. The main question is what you want to get out of trying it - in my case I absolutely need to code against C# 4 to explore the new features. I do most of that from the command line in fact, where the speed of VS obviously isn't an issue, but it's nice to see the VS-specific features as well (like the debug threading views for Parallel Extensions).
It seems more or less usable on the .NET side. The C++ side is a bit more sketchy. On one hand, they've added support for some very nice new C++0x features, on the other, they've broken some absolute fundamentals.
Your plain old main function won't compile in 32-bit with unicode enabled. (Workarounds: Either compile as 64-bit, disable unicode, or rename the function to wmain).
This seems to me to be a strong hint that the C++ side of things is nowhere near release-worthy. I'd probably wait for beta2 before doing any serious work with that.
I would say it is great, but the performance hurts a bit.
Here is an idea for you: Install it into a VirtualPC. Then you can play and not care what it does. You don't like it, delete the VPC image and keep on trucking. That is how I play with Microsoft betas now. I never install them on any real machine - too risky.
Usable: Yes.
Recommended: Not if you're a touchpad addict or dislike crashing apps.
I've been trying it for 2 weeks now, coding small C# projects, and these are my impressions.
Reasons to use 2010:
Looks good
Multi monitor support
I can see myself using the code templating, but right now I couldn't find any really useful stuff except for reducing the font size of comments.
Zoom in the editor
Select a variable and then press shift+up/down to go to next usage of this variable
Ctrl+, brings up instant search of classes and functions in the entire project (I've become really addicted to this).
Floating watches for single objects
Reasons to not use 2010:
TOUCHPAD SCROLL DOESN'T WORK IN THE EDITOR!!! (this is reason enough to not upgrade if you are using it on a laptop)
I've had some random app-crashes in the middle of just writing code, once or twice per day maybe.
UI sometimes freezes randomly for about 30 seconds and then returns to normal.
It started to use 100% CPU power from one of my cores once when it was minimized in basic editing mode and I was doing other stuff in other programs; I only noticed it because the fan started to go wild.
Otherwise it's pretty similar to 2008. I haven't noticed any difference in speed like other people say.
You need to ask yourself: what is the advantage for you in using VS2010 over VS2008? I would suggest that there is no advantage if all you are doing is "school- and hobby-stuff".
I'm still using VS2008 for business related stuff (and, indeed, VC6 for some stuff). I prefer to wait until all the early adopters have tested it (and Microsoft has released at least one service pack after the real product release) before I do their testing for them.
It seems to co-exist with other versions of VS without causing any problems.
Regarding the slowness - it seems to be the UI that is slow, rather than building. Once it's going it doesn't seem much slower on my fast quadcore. I've yet to try it on my laptop.
It's usable enough; the small glitches that I've encountered weren't that bad. However, certain VS extensions (like XNA) don't work in VS2010 at the moment.
It's fun to toy with. Not usable for me, cause re#er does not support it yet (had to install TestDriven .NET which works through keyboard shortcuts only to run my tests).
Gave me an insight into how addicted I am. :/
Btw, on Win7, without virtual pc it seemed even faster than vs2008 for me.
VS2010 doesn't yet support mobile device projects, which might or might not matter to you.
VC++ wise - VS2010 has a built-in 64-bit compiler, VS2008 does not.
You can supposedly add 64-bit support to VS2008, but it takes some effort.
I've been using VS 2010 beta (with .NET 4.0 beta) on Windows 7 RC. I've been trying to rewrite parts of a large-scale business application in it to see what can be done with it.
The UI freezes frequently. I'm talking 1-10 minutes between freezes. The UI does not come back, so I'm forced to kill devenv.exe every time it happens. Microsoft probably puts my error reports in their spam folder by now.
For me, VS 2010 beta 1 classifies as unusable. However, it's fast, the new IDE functions are very handy, and it's pretty. I keep coming back to it despite my resolutions to wait for a stable build.

ReSharper sluggishness

I like ReSharper, but it is a total memory hog. It can quickly swell up and consume a half-gig of RAM without too much effort and bog down the IDE. Does anybody know of any way to configure it to be not as slow?
Turn off the on-the-fly compilation (which, unfortunately, is one of its best features)
The next release, 4.5, is going to be based around performance and memory footprint.
see Ilya Ryzhenkov's blog
Resharper 4.5 has been released
From my experience it is less of a memory hog, but I can still run out of memory.
I had an issue where it was taking upwards of 10 minutes to load a solution of 100+ projects. Once loaded VS performance would be ok, though it would oddly flutter back and forth between ok and really bad.
The short answer: Eliminating Resharper warnings seems to improve overall VS/R# performance.
The biggest problem ultimately was that we had a number of files of binary data (encrypted stuff) being included as embedded resources, which happened to have .xml extensions. Resharper was trying really really hard to analyze those files. Eventually it'd get through but would generate 100K+ errors in the process. Changing the extension to one Resharper did not automatically analyze (.bin in this case) solved the problem.
We still have about 10 files which when they or a file they depend on is edited performance tanks for a while. These files are the partial parts of a single class definition where each file averages 3000 LOC. Yes, that's right, it's about a 30K line class. It also happens to be rather poor code for other reasons, many of which Resharper flags making the right hand gutter bar practically a solid orange line. Editing often causes Resharper to reanalyze the whole thing. While that analysis runs, performance is noticeably affected.
I've come to the conclusion that the less errors/warnings there are for R# to identify, the better it performs. My anecdotal evidence gathered while cleaning up/refactoring this project seems to support it.
A lot of folks complain of perf problems with Resharper. If you have even a few big ugly code files with lots of Resharper warnings, then a little time spent cleaning that code up might yield better performance overall. It has for us.
Not sure how big your solutions are, but I stopped using 4.5 for the same reasons I stopped using all previous versions, memory usage.
Code analysis and unit test support was the main reason I bought it, turning it off means the rationale for using it is gone.
Workstation has 4GB of memory, and I can easily kill it with ReSharper when running our end-to-end stack in debuggers.
You can look at how much memory ReSharper uses:
ReSharper -> General -> Show managed memory usage in status bar.
If you are working on large source files, Resharper does get sluggish (I'm working on version 5.0 at the time of writing this).
You can view the memory usage of Resharper by clicking on Resharper options -> General -> Show memory use in status bar.
When I first did this, I noticed ReSharper had clocked up hundreds of megabytes of memory usage! However, the next step worked for me in (temporarily) fixing the sluggishness:
Right-click the memory usage and select "Collect garbage" - this seemed to fix the sluggishness for me straight away.
Regarding memory hogging - I've found that my VS2008 memory footprint grows every time I close one solution and open another. This is true even if I close a solution and re-open that same solution.
The new ReSharper 4.5 works a lot better than the previous 4.x releases. I would recommend you try that one.
In previous versions I had the same problem; when 4.0 came out these problems seemed to have gone away. Now with 4.1 I do not feel the huge slowdown I used to have. My IDE does not freeze up anymore.
Have you tried upgrading?
Try the 4.5 beta. 4.1 was killing my 2GB dev machine, but it's back to running incredibly smoothly with the beta. Others have had the opposite experience, though, so YMMV.
Yes, 4.5 works much better. My understanding is that 4.5 was to address the performance issues.
My colleagues and I are also having huge performance issues with ReSharper; just now my ReSharper took 1.1GB of memory. Visual Studio slows down especially when writing JavaScript; it's unbearable. You can turn off the on-the-fly compilation, but it's the best feature it has...
edit: Everybody in this thread seems to have ReSharper 4.x; my version is 6.0.
