ReSharper sluggishness - visual-studio

I like ReSharper, but it is a total memory hog. It can quickly swell up and consume a half-gig of RAM without too much effort and bog down the IDE. Does anybody know of a way to configure it so it isn't as slow?

Turn off the on-the-fly compilation (which, unfortunately, is one of its best features)

The next release, 4.5, is going to be focused on performance and memory footprint.
See Ilya Ryzhenkov's blog.
Update: ReSharper 4.5 has been released.
In my experience it is less of a memory hog, but I can still run out of memory.

I had an issue where it was taking upwards of 10 minutes to load a solution of 100+ projects. Once loaded, VS performance would be OK, though it would oddly flutter back and forth between OK and really bad.
The short answer: Eliminating Resharper warnings seems to improve overall VS/R# performance.
The biggest problem ultimately was that we had a number of files of binary data (encrypted stuff) being included as embedded resources, which happened to have .xml extensions. ReSharper was trying really hard to analyze those files. Eventually it would get through, but it generated 100K+ errors in the process. Changing the extension to one ReSharper does not automatically analyze (.bin in this case) solved the problem.
We still have about 10 files which, when they or a file they depend on is edited, tank performance for a while. These files are the partial parts of a single class definition, where each file averages 3000 LOC. Yes, that's right, it's about a 30K-line class. It also happens to be rather poor code for other reasons, many of which ReSharper flags, making the right-hand gutter bar practically a solid orange line. Editing often causes ReSharper to reanalyze the whole thing, and while that analysis runs, performance is noticeably affected.
I've come to the conclusion that the fewer errors/warnings there are for R# to identify, the better it performs. My anecdotal evidence, gathered while cleaning up/refactoring this project, seems to support that.
A lot of folks complain of perf problems with ReSharper. If you have even a few big ugly code files with lots of ReSharper warnings, a little time spent cleaning that code up might yield better performance overall. It has for us.

Not sure how big your solutions are, but I stopped using 4.5 for the same reason I stopped using all previous versions: memory usage.
Code analysis and unit test support were the main reasons I bought it; turning them off means the rationale for using it is gone.
Workstation has 4GB of memory, and I can easily kill it with ReSharper when running our end-to-end stack in debuggers.

You can check how much memory ReSharper uses:
ReSharper -> General -> Show managed memory usage in status bar.

If you are working on large source files, ReSharper does get sluggish (I'm working with version 5.0 at the time of writing this).
You can view ReSharper's memory usage by clicking ReSharper options -> General -> Show memory use in status bar.
When I first did this, I noticed ReSharper had clocked up hundreds of megabytes of memory usage! However, the next step worked for me in (temporarily) fixing the sluggishness:
Right-click the memory usage and select "Collect garbage" - this seemed to fix the sluggishness for me straight away.

Regarding memory hogging - I've found that my VS2008 memory footprint grows every time I close one solution and open another. This is true even if I close a solution and re-open that same solution.

The new ReSharper 4.5 works a lot better than the previous 4.x releases. I would recommend you try that one.

I had the same problem with previous versions; when 4.0 came out, those problems seemed to go away. Now with 4.1 I don't feel the huge slowdown I used to have, and my IDE does not freeze up anymore.
Have you tried upgrading?

Try the 4.5 beta. 4.1 was killing my 2GB dev machine, but it's back to running incredibly smoothly with the beta. Others have had the opposite experience, though, so YMMV.

Yes, 4.5 works much better. My understanding is that 4.5 was to address the performance issues.

My colleagues and I are also having huge performance issues with ReSharper; just now my ReSharper took 1.1GB of memory. Visual Studio slows down especially when writing JavaScript - it's unbearable. You can turn off the on-the-fly compilation, but it's the best feature it has...
Edit: everybody in this thread seems to have ReSharper 4.x; my version is 6.0.

Related

How to work with a Visual Studio 2017 Solution with 650+ projects?

I must work with a Visual Studio (2017) Solution that contains 654 projects, and counting. The projects are a mixture of C++ and C# projects - possibly 2/3 C++.
The problem is, VS2017 (we're already running 15.8) is highly unstable at this project count, but for some tasks I need to open the whole solution.
One can (and should) question the design, but please not here. Are there any viable tricks to make working with such a sln bearable?
The problems we have are:
After having fully loaded, it's sluggish as hell, even on our beefy dev machines. It hangs often.
It will crash many times a day. (We isolated a few cases that reliably crash it, like opening the C++ settings dialog, but it's still unstable.)
Crashes are often observed when VS peaks at ~2.6GB of RAM.
Not Problems:
Solution load times: The solution is loaded in a decent amount of time. We don't need to optimize for this at the moment.
Compilation times: Devs don't do full-solution builds anyway. (But some tasks require working on your corner of the code within the full context of the whole solution.)
I already tried disabling VS Intellisense, but it didn't help. Disabling our VisualAssistX plugin also didn't really help.
Historically the VS team has always said that they're going to fix the problem of VS loading too much at some indefinite point in time. That's also one reason why they haven't made it 64-bit yet. Now that they've disabled the selective loading API, you're pretty much at their mercy.
For older VS versions, there's Funnel.
It allows you to selectively load a subset of the projects, automatically loading dependencies. The added benefit is that refactoring, search etc. only works in context of loaded projects, making it much faster. You can also save and organize your filters, making it easier to switch between different subsets.

Memory leak in Visual Studio 2012 using ReSharper on 'huge' Solution

For some major code cleanup I created a solution including all project files to make things easier.
That's roughly 620 .csproj files with about 12k source files.
Running the cleanup on this solution will surely take ages, but that was planned. Unplanned, however, was the System.OutOfMemoryException during the process.
I'm not sure whether this is ReSharper's fault or Visual Studio's itself (I noticed similar problems with e.g. CodeMaid).
I monitored Task Manager, and right before it threw the exception, memory usage was at ~2.6GB. It grew constantly during the process, so this must be some kind of "not freeing resources" thing.
Is there anything that can be configured to get rid of this problem? Like some option that disables any kind of caching, or whatever?
I know splitting it up into smaller solutions would work...
We also have an extremely large solution file that, unfortunately, I have to deal with daily. There are some settings you can change in ReSharper and VS to speed things up that should also help save on memory usage. The link below helped me some:
https://resharper-support.jetbrains.com/hc/en-us/articles/206546919

How to speed up VS2010?

The process of compiling, creating the exe, and running it is very slow on my machine (and so is stopping the exe with the stop button). It's a Windows Forms app with a very simple form. I see that in Release mode it works faster, but not fast enough.
The IDE also slows down right after I hit the stop button; it really needs to think about something for at least 10 seconds (I understand that I'm killing the app, but why can't VS just accept that and not think about it?).
Maybe uninstalling something or disabling something?
P.S. It's only slow after a few runs, but I think my machine is just too old. I would rather not upgrade it right now.
I have 2GB of RAM.
I think the accepted solution is to upgrade to VS 2008.
Release mode will make the compiler slower, if anything. It generally makes the finished application smaller/faster, though.
VS2010 needs an enormous amount of memory, and if you have less than 3-4G, you're almost certainly being hit by that - that might be a cheaper upgrade than a new machine.
But all versions of the 'Visual' dev tools, right back to VC1.0, have been beasts that have required reasonably up-to-date computer specifications. I'm afraid that's just the way things are.
The beta service pack worked wonders for me - lots of bits of VS2010 that were broken (like macros) started working, and it's significantly faster for me now - YMMV though :)
Microsoft VS2010 SP1 Beta Link
This is a duplicate question, but I can't find the duplicate. So in summary here's what the other one said :)
1) Get more memory, 4 gigs as a minimum.
2) Disable extensions.
3) A list of changes to speed things up.
4) Might have found the original.
I love all my extensions, so memory is key to me.

Is Visual Studio 2010 beta 1 usable?

I saw that Beta 1 of VS2010 was publicly available.
My question to those of you who have tried it: does it work well?
Will it cause my computer to blow up in tiny pieces? Will it crash randomly? Will it work with some minor glitches? Or is it just perfect from the bottom up?
I'm only coding school and hobby stuff, so nothing that someone's life depends upon, but I still want software that works. How close to a final product is it? Is it worth trying?
It's a bit slow, and there's no offline MSDN, but it's worth trying IMO. Having said that it's slow, I still use it on my NC10 netbook, so it's clearly not that bad :)
I've got it side-by-side VS2008, and that hasn't caused any problems.
I've seen a couple of glitches (once the keyboard handling went completely wonky) but it's certainly usable. The main question is what you want to get out of trying it - in my case I absolutely need to code against C# 4 to explore the new features. I do most of that from the command line in fact, where the speed of VS obviously isn't an issue, but it's nice to see the VS-specific features as well (like the debug threading views for Parallel Extensions).
It seems more or less usable on the .NET side. The C++ side is a bit more sketchy. On one hand, they've added support for some very nice new C++0x features, on the other, they've broken some absolute fundamentals.
Your plain old main function won't compile in 32-bit with Unicode enabled. (Workarounds: either compile as 64-bit, disable Unicode, or rename the function to wmain - see the sketch below.)
This seems to me to be a strong hint that the C++ side of things is nowhere near release-worthy. I'd probably wait for beta2 before doing any serious work with that.
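For reference, here is a minimal sketch of that last workaround (my own illustration, assuming a plain MSVC console project built with Unicode enabled):

#include <wchar.h>

// wmain is the Microsoft-specific wide-character entry point;
// it receives its arguments as wchar_t strings.
int wmain(int argc, wchar_t *argv[])
{
    wprintf(L"Hello, wide world\n");
    return 0;
}

wmain isn't standard C++, but since the problem above is specific to the MSVC toolchain, that shouldn't matter here.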
I would say it is great, but the performance hurts a bit.
Here is an idea for you: install it into a Virtual PC. Then you can play and not care what it does. If you don't like it, delete the VPC image and keep on trucking. That is how I play with Microsoft betas now. I never install them on any real machine - too risky.
Usable: Yes.
Recommended: Not if you're a touchpad addict or dislike crashing apps.
I've been trying it for two weeks now, coding small C# projects, and these are my impressions.
Reasons to use 2010:
Looks good
Multi monitor support
I can see myself using the code templating, but right now I couldn't find anything really useful except for reducing the font size of comments.
Zoom in the editor
Select a variable and then press shift+up/down to go to the next usage of that variable
Ctrl+, brings up instant search of classes and functions in the entire project (I've become really addicted to this)
Floating watches for single objects
Reasons to not use 2010:
TOUCHPAD SCROLL DOESN'T WORK IN THE EDITOR!!! (this is reason enough to not upgrade if you are using it on a laptop)
I've had some random app-crashes in the middle of just writing code, once or twice per day maybe.
The UI sometimes freezes randomly for about 30 seconds and then returns to normal.
It once started using 100% of one of my CPU cores while it was minimized in basic editing mode and I was doing other stuff in other programs; I only noticed because the fan started to go wild.
Otherwise it's pretty similar to 2008. I haven't noticed any difference in speed like other people say.
You need to ask yourself: what is the advantage for you in using VS2010 over VS2008? I would suggest that there is no advantage if all you are doing is "school- and hobby-stuff".
I'm still using VS2008 for business related stuff (and, indeed, VC6 for some stuff). I prefer to wait until all the early adopters have tested it (and Microsoft has released at least one service pack after the real product release) before I do their testing for them.
It seems to co-exist with other versions of VS without causing any problems.
Regarding the slowness - it seems to be the UI that is slow, rather than building. Once it's going it doesn't seem much slower on my fast quadcore. I've yet to try it on my laptop.
It's usable enough; the small glitches I've encountered weren't that bad. However, certain VS extensions (like XNA) don't work in VS2010 at the moment.
It's fun to toy with. Not usable for me, because R# does not support it yet (I had to install TestDriven.NET, which works through keyboard shortcuts only, to run my tests).
Gave me an insight into how addicted I am. :/
Btw, on Win7 without Virtual PC it seemed even faster than VS2008 to me.
VS2010 doesn't yet support mobile device projects, which might or might not matter to you.
VC++-wise, VS2010 has a built-in 64-bit compiler; VS2008 does not.
You can supposedly add 64-bit support to VS2008, but it takes some effort.
I've been using VS 2010 beta (with .NET 4.0 beta) on Windows 7 RC. I've been trying to rewrite parts of a large-scale business application in it to see what can be done with it.
The UI freezes frequently. I'm talking 1-10 minutes between freezes. The UI does not come back, so I'm forced to kill devenv.exe every time it happens. Microsoft probably puts my error reports in their spam folder by now.
For me, VS 2010 beta 1 classifies as unusable. However, it's fast, the new IDE functions are very handy, and it's pretty. I keep coming back to it despite my resolutions to wait for a stable build.

Why is Visual Studio 2005 so slow?

It is slow to load anything other than a small project. It is slow to quit; it can sometimes take minutes. It can be slow to open new files. The record macro feature used to be useful. It is now so slow to start up it's almost always quicker to do it manually!
More info would be helpful. How big are your solutions? What platform are you on. What 3rd party plugins are you running? What else is running on your pc?
3.2GHz P4 Hyperthreaded, 2GB RAM. Running Outlook, Perforce, IE7, directory browsers. Usually have 1-3 instances of VS running. It's much slower than VC6, say. It seems to take a long time to load projects and close down. I'm interested in whether people know reasons why this happens, given the way VS is written. Is it using .NET internally, and does GC slow it down?
One of the biggest culprits for Visual Studio 2005 slowness is Intellisense. This has been brought up on the MSDN forums again and again and again. What I frequently experience is that Intellisense is working nearly non-stop to "re-index" symbols (or whatever you call it). But developers at Microsoft have not been deaf to the complaints and some outgoing folks there have come up with some workarounds that have helped me and might help you:
Check out this link to get a better understanding of Intellisense:
Intellisense Info
Then check out this link for some macros that I've had a lot of success with:
Intellisense Macros
With those macros, you can turn off IntelliSense (without renaming any DLLs), restart it, delete the .ncb file (which you can do manually, but this is a convenience), and get the status of IntelliSense.
It might be that you have a plugin that is misbehaving. Try the safe mode switch to see if this improves performance.
One of the biggest culprits for Visual Studio 2005 slowness is Intellisense. This has been brought up on the MSDN forums again and again and again. What I frequently experience is that Intellisense is working nearly non-stop to "re-index" symbols [...]
I agree. I use Visual Assist. It is much better. There is no real way to turn "Intellisense" off either. The only way I have found is to rename the DLL so when you restart VS it is not found. This works and makes VS faster.
I tend to agree that VS is a heavyweight. Back in the day I coded in DOS using the Boxer text editor and makefiles. Boxer didn't have the heavy IntelliSense and refactoring features, but it did better in the text editing department, had good syntax highlighting, and startup/closing was instantaneous, even on a 486. ...those were the days.
I would say it would be really nice to customize VS to remove all the overhead you're never going to use anyway, but I don't see that happening.
here's ya problem:
3.2GHz P4 Hyperthreaded, 2GB RAM
Hyperthreaded means "doesn't actually have two CPUs, but it fakes it". If you have a process with just one thread running, then you get bad performance. It was a good short-term measure, but compared to having two REAL CPUs, it's a slow hack.
I don't think that's the problem at all. The machine is plenty high-spec enough to be a professional C++ development machine for large projects. I can run Eclipse (which is Java, which is memory hungry and slower than native code) and it is still way faster than VS 2005.
I doubled the amount of RAM from 1GB to 2GB. This helps a lot with linking large applications. We also use Incredibuild to accelerate compilation. But it's the VS app that is slow.
And if you think I'm a grumpy anti-MS zealot, ask yourself why people aren't buying Vista! :)
I am seeing mixed results with faster machines. Sure, a faster machine hides the poor performance of VS2005, but not all of it.
Simply take your obligatory "hello world" C/C++ program and just compile it (CL /c helloworld.cpp):
#include <stdio.h>
#include <windows.h>

int main(int argc, char *argv[])
{
    printf("Hello World\n");
    return 0;
}
I see a 1-second compile under VC6 and a 6-second compile under VS2005.
Using DEPENDS to profile the two, I see 3 areas where the 5 seconds of delay and the time difference come from:
~2.5 secs with ADVAPI32.DLL, CryptGetHashParam()
~1.5 secs with OLE2.DLL, StringFromGUID2()
~1.0 secs with C2.DLL, _AbortCompilerPass()
Again, this is just a compile, not a link. The VC8+ compiler executables/DLLs reference sub-systems like the crypto API and the Registry for some non-obvious reason, and that adds a tremendous amount of overhead to straight, pure compiles.
While a faster machine may hide some of the slowdown, one can only wonder whether Microsoft could optimize the compiler by offering options that disable these unnecessary overhead references. I understand the better compiler comes with some overhead, but what I see is a 300-500% compile-time degradation - that's awful.
Hector Santos, CTO
Santronics Software
This is a purely subjective thing, I'm afraid.
Maybe it is because of your low-end system configuration.
Maybe VS is trying to get updates from the net?
Maybe you are running too many applications in the background.
Maybe you are trying to open a huge solution.
Recently I've had both Visual Studio 2008 and Visual Studio 2005 on my machine, and I agree that VS2005 is really heavy. They improved upon it in VS2008, although I'm not sure if you'll consider the performance improvements enough.
Could you maybe time some operations and post them so we get an idea what you mean by "slow"? On my machine, I wouldn't call VS 2005 slow, but if you compare it to notepad or my web browser it seems slow. Here's some things that might help people figure out what's going on:
Turn off any features that could affect load time. This includes uninstalling all add-ons and making sure that VS isn't configured to automatically open a project.
Reboot your machine.
Time how long VS 2005 takes to start, from the time you click the icon until the program is ready.
Create a program that you're willing to post here that seems to compile slowly (this might not be possible depending on what it takes to make a slow compile); post the program and how long it takes your computer to build it.
Do you know anyone else with the same kind of machine that has VS 2005 installed? Does it seem slower or faster than yours?
I believe Lord Kelvin said the best thing that can be said about situations like this:
When you can measure what you are speaking about, and express it in numbers, you know something about it; but when you cannot measure it, when you cannot express it in numbers, your knowledge of it is of a meager and unsatisfactory kind; it may be the beginning of knowledge, but you have scarcely, in your thoughts, advanced it to the stage of science.
Until you give us some measurements to look at, we can't tell you if your machine is genuinely slow or if you are expecting more out of your machine than it can give. Your HT CPU might be the problem; I have roughly equivalent machines at work and at home, but my dual-core work machine runs circles around my single-core home machine when it comes to running VS.
VS 2005 is slower than VS 6 because it is less well optimized for speed. The developers of VS 6 had slower machines than the developers of VS 2005, so they made it fast "enough" back then. On a modern machine VS 6 is now "pleasantly fast", whereas VS 2005 is only just fast enough.
What annoys me is that they decided to scrap VS 6 and start again for VS 2005, when VS 6 was an awesome piece of software that just needed updating.
I noticed above you mentioned you are using Perforce too. Do projects load faster when not in Perforce? I am willing to bet that some of the delay you are seeing is related to Perforce during load. The latest version of Perforce seems a lot slower too.
Change your Solution Platform to the "Any CPU" option at the top of Visual Studio; then your program's build speed will definitely increase.
here's ya problem:
3.2GHz P4 Hyperthreaded, 2GB RAM
Hyperthreaded means "doesn't actually have two CPUs, but it fakes it". If you have a process with just one thread running, then you get bad performance. It was a good short-term measure, but compared to having two REAL CPUs, it's a slow hack.
2GB of RAM would be an issue too, based on what you said you run. If you have a basic 5400RPM disk, then it's going to make it all worse.
I'd recommend, based on what you posted:
A good Core2 machine, maybe a quad if you have the budget.
3GB of RAM if you are running a 32-bit OS, 4+GB if you are running x64. 4GB means you waste 1GB under 32-bit.
Get 7200RPM disks, or better. If you can, RAID0 them (stripe), or RAID0+1 (stripe+mirror) if you can get 4 drives. (Stripe == split content over the two disks, so you can read from both at the same time; stripe+mirror == the safe version of striping, so your code is on TWO disks at all times.)
I have a 2.0GHz Core2 (so roughly 3-4x the performance of your P4, if you count 2 CPUs (cores) as 2x) with 2GB of RAM, and the most I can run well is 2 instances of VS.NET 2008. This is normal - nothing wrong with VS.NET, it's just a huge app.
More RAM. More CPU. More Screen. More. More. More :)
