For a major code cleanup I created a solution that includes all of our project files, to make things easier.
That comes to roughly 620 .csproj files with about 12k source files.
Running the cleanup on this solution will surely take ages, but that was planned. What wasn't planned was the System.OutOfMemoryException thrown during the process.
I'm not sure whether this is ReSharper's fault or Visual Studio's (I've noticed similar problems with other extensions, e.g. CodeMaid).
I monitored Task Manager, and right before the exception was thrown, memory usage was at ~2.6 GB. It grew steadily during the process, so something is apparently not freeing resources.
Is there anything that can be configured to get rid of this problem, like an option that disables any kind of caching?
I know splitting things up into smaller solutions would work...
We also have an extremely large solution file that, unfortunately, I have to deal with daily. There are some settings you can change in ReSharper and VS to speed things up, which should also help save on memory usage. The link below helped me somewhat:
https://resharper-support.jetbrains.com/hc/en-us/articles/206546919
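If it does come down to splitting into smaller solutions (as the asker already suspects), the splitting itself can be scripted rather than done by hand. A rough C# sketch, not a definitive tool: the source root and the batch size of 50 are assumptions to adjust, and ideally each entry would reuse the project's own <ProjectGuid>, though a fresh GUID also works since VS rewrites it on save.

    // Sketch: split one huge solution's projects into smaller batch solutions
    // so code cleanup can run on a fraction of the 620 projects at a time.
    // The source root and the batch size of 50 are assumptions to adjust.
    using System;
    using System.IO;
    using System.Linq;
    using System.Text;

    class SolutionSplitter
    {
        // Standard project-type GUID for C# projects in .sln files.
        const string CsProjType = "{FAE04EC0-301F-11D3-BF4C-00C04F79EFBC}";

        static void Main()
        {
            string root = @"C:\src";                      // assumed source root
            string[] projects = Directory.GetFiles(root, "*.csproj",
                                                   SearchOption.AllDirectories);
            const int batchSize = 50;                     // tune until cleanup fits in memory

            for (int batch = 0; batch * batchSize < projects.Length; batch++)
            {
                var sln = new StringBuilder();
                sln.AppendLine("Microsoft Visual Studio Solution File, Format Version 11.00");
                sln.AppendLine("# Visual Studio 2010");

                foreach (string proj in projects.Skip(batch * batchSize).Take(batchSize))
                {
                    string relative = proj.Substring(root.Length + 1); // path relative to the .sln
                    string name = Path.GetFileNameWithoutExtension(proj);
                    // Ideally reuse each project's own <ProjectGuid>; a fresh GUID
                    // also works, VS just rewrites it when the solution is saved.
                    string guid = Guid.NewGuid().ToString("B").ToUpper();
                    sln.AppendLine(string.Format("Project(\"{0}\") = \"{1}\", \"{2}\", \"{3}\"",
                                                 CsProjType, name, relative, guid));
                    sln.AppendLine("EndProject");
                }

                File.WriteAllText(Path.Combine(root, "CleanupBatch" + batch + ".sln"),
                                  sln.ToString());
            }
        }
    }

Running the cleanup batch by batch keeps each run's working set bounded, at the cost of a few more clicks.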
Related
My Visual Studio (2010 SP1) has a serious memory leak, and it seems to be caused by one or more of the installed extensions.
I have tried to narrow it down by turning extensions and add-ons on and off, but it takes a while of work to build up a significant leak, and the results aren't totally conclusive. It rarely takes more than a few hours before the devenv.exe process uses 2 GB of memory and starts thrashing, which is becoming a bit of a pain to work with.
Is there some way to make VS emit information on memory allocated for extensions?
Are managed extensions running in their own AppDomain? Maybe that would allow me to turn on some performance counters that could help pinpoint the problem.
Anything else I can do to troubleshoot this apart from disabling stuff one by one until the problem disappears?
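On the performance-counter idea: as far as I know, extensions are loaded into the main AppDomain, so per-extension attribution is hard, but the standard ".NET CLR Memory" counters will at least show how devenv.exe's managed heap grows over a session. A rough C# sketch (the instance name is usually the process name, but may be "devenv#1" etc. if several copies are running):

    // Sketch: sample the managed-heap size of a running devenv.exe via the
    // standard ".NET CLR Memory" performance counters. This shows overall
    // managed growth but cannot attribute it to individual extensions.
    using System;
    using System.Diagnostics;
    using System.Threading;

    class DevenvMemoryWatch
    {
        static void Main()
        {
            // The instance name may need adjusting (check perfmon) if more
            // than one devenv.exe is running.
            using (var heap = new PerformanceCounter(".NET CLR Memory",
                                                     "# Bytes in all Heaps",
                                                     "devenv"))
            {
                while (true)
                {
                    Console.WriteLine("{0:T}  managed heap: {1:N1} MB",
                                      DateTime.Now,
                                      heap.NextValue() / (1024.0 * 1024.0));
                    Thread.Sleep(TimeSpan.FromSeconds(30));
                }
            }
        }
    }

If the managed heap stays flat while the process's total memory climbs, the leak is more likely in native allocations, which points away from managed extensions.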
According to the JetBrains FAQ page, you should be able to attach to an already-running process when it starts to display memory-leak issues. Perhaps you can give that a shot. Note that the FAQ states that you need to be using version 5.
My installation of Visual Studio eats up memory like a starved pig. Depending on what I'm doing, after a while it gets too slow and I have to restart it. Typically devenv.exe will reach around 700 MB before that point.
I would expect that it's slowing down because of excessive page swapping or something similar.
I read somewhere that Ctrl-Alt-Shift-F12 helps, but it does nothing.
Are there any fixes for this, or at least anything that will let me run it longer before it eats all my RAM, i.e. starts to run slow at 4 GB rather than 700 MB?
I have Windows 7 x64 with 8 GB of RAM, using the Virgin antivirus package.
I have lots of add-ons running, and I have a suspicion that it might be ReSharper that's causing the slowdown.
The number of projects or solutions is irrelevant: I can run a single WinForms project with about 50 lines of code, and after a few dozen debug runs it'll still be like trying to flog a dead horse.
Ta.
IntelliSense has been known to cause problems; you may want to try disabling it.
Also, I had BIG problems with my antivirus (Kaspersky); the newer versions use a new "signature" technique that brought my machine to a crawl.
Hope this helps
I am tired of how slow VS2010 is. I know there are a lot of topics here about tuning the settings, and I've read/applied them all, though with not much luck. Namely, the things I've already done:
removed all the extensions
never had ReSharper installed
tuned the settings to get maximum performance
tried SSD and RAM disks
Nothing helped; it is still unacceptably slow. I know what I am saying because with VS2008 I never had such problems.
Now, I am working on a fairly big C# solution with about 20 projects in it. Visual Studio works quite fast when just opened, but as time goes on it starts lagging and eventually gets so slow that I have to restart it. Resource Monitor shows that the amount of memory it consumes is about 200 MB at the beginning, goes up to ~600 MB, and then doesn't climb any higher. I have 8 GB of total RAM on an x64 laptop, with about 4 GB always free. I find it strange how little memory VS uses, since common sense tells me that the more memory an app uses, the faster it should work. So I believe my question is: how do I make VS use more of the available memory?
PS
I tried the recipe from "Configure Visual Studio to use more ram"; it didn't work out.
There is no way to make Visual Studio use more memory. The application itself has no preset limitation; it will simply use the amount of memory the operating system grants it (just like other apps).
The reason you see it climb to 600 MB and then stop is just a side effect of how the managed GC works. As VS performs operations like displaying IntelliSense and applying edits, more managed objects are created. Eventually the GC is triggered; it reclaims all of the dead objects, and the longer-lived ones are promoted. Overall memory usage is lowered, but not to as low a point as before you started editing. Then you edit some more, and this process continues until it reaches the appearance of a steady state. If you analyze it closely you'll see that it's actually more of a sawtooth graph of memory usage.
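You can reproduce that sawtooth in a few lines outside Visual Studio; a minimal console sketch (the allocation sizes are arbitrary):

    using System;

    class GcSawtooth
    {
        static void Main()
        {
            var rng = new Random();
            for (int i = 0; i < 100; i++)
            {
                // Allocate a short-lived chunk, much as an editor does per edit.
                var garbage = new byte[rng.Next(1, 4) * 1024 * 1024];
                garbage[0] = 1; // touch it so the allocation isn't optimized away

                // Heap size climbs between collections and drops at each one:
                // plotted, this is the sawtooth, not monotonic growth.
                Console.WriteLine("iteration {0,3}: {1,12:N0} bytes in heap, {2} gen-0 collections",
                                  i, GC.GetTotalMemory(false), GC.CollectionCount(0));
            }
        }
    }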
As to why your particular instance of Visual Studio is slow, though: that is hard to determine remotely. Twenty projects makes for a larger solution, but performance should still be acceptable even with that many. A couple of things to try in order to isolate the problem:
Try editing a smaller solution. It's possible that one project in particular is giving VS a problem; breaking the solution down into smaller ones could help isolate it.
Try disabling Aero on your computer. It's possible that WPF is the problem here.
We use SourceSafe 6.0d and have a DB that is about 1.6 GB. We haven't had any problems yet, and there is no plan to change source control programs right now, but how big can a SourceSafe database get before it becomes an issue?
Thanks
I've had VSS problems start as low as 1.5-2.0 gigs.
The meta-answer is, don't use it. VSS is far inferior to a half-dozen alternatives that you have at your fingertips. Part of source control is supposed to be ensuring the integrity of your repository. If one of the fundamental assumptions of your source control tool is that you never know when it will start degrading data integrity, then you have a tool that invalidates its own purpose.
I have not seen a professional software house using VSS in almost a decade.
1 byte!
:-)
Sorry, dude, you set me up.
Do you run the built-in ssarchive utility to make backups? If so, 2 GB is the maximum size that can be restored. (http://social.msdn.microsoft.com/Forums/en-US/vssourcecontrol/thread/6e01e116-06fe-4621-abd9-ceb8e349f884/)
NOTE: the ssarchive program won't tell you this; it's just that if you try to restore a DB over 2 GB, it will fail. Beware! All these guys telling you that they are running fine with larger DBs are either using another archive program or haven't tested the restore feature.
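Since ssarchive itself won't warn you, it may be worth scripting a periodic size check before the database crosses the restorable limit. A hypothetical C# sketch; the database path and the 1.8 GB warning threshold are assumptions:

    using System;
    using System.IO;

    class VssSizeCheck
    {
        static void Main()
        {
            // Hypothetical database location: point this at the folder
            // that holds srcsafe.ini.
            string vssRoot = @"\\server\vss";

            long bytes = 0;
            foreach (FileInfo f in new DirectoryInfo(vssRoot)
                                       .GetFiles("*", SearchOption.AllDirectories))
                bytes += f.Length;

            double gb = bytes / (1024.0 * 1024.0 * 1024.0);
            Console.WriteLine("VSS database size: {0:F2} GB", gb);

            // Warn with some headroom below the reported 2 GB restore limit.
            if (gb > 1.8)
                Console.WriteLine("WARNING: approaching the ssarchive restore limit!");
        }
    }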
I've actually run a VSS DB that was around 40 GB. I don't recommend it, but it is possible. Really, the larger you let it get, the more you're playing with fire. I've heard of instances where the DB got corrupted and the items in source control were unrecoverable. I would definitely back it up on a daily basis and start looking to change source control systems. Having been in the position of the guy they call when it fails, I can tell you that it really starts to get stressful when you realize that it could just go down and never come back.
Considering the amount of problems SourceSafe can generate on its own, I would say the size has to be in the category "Present on disk" for it to develop problems.
I've administered a VSS DB over twice that size. As long as you are vigilant about running Analyze, you should be OK.
SourceSafe recommends 3-5 GB, with a "don't ever go over 13 GB".
In practice, however, ours is over 20 GB and seems to be running fine.
The larger you get, the more problems Analyze will find, including lost files, etc.
EDIT: Here is the official word: http://msdn.microsoft.com/en-us/library/bb509342(VS.80).aspx
I have found that Analyze/Fix starts getting annoyingly slow at around 2 GB on a reasonably powerful server. We run Analyze once per month on databases used by 20 or so developers. The utility finds occasional fixes to perform, but actual use has been basically problem-free for years at my workplace.
The main thing, according to Microsoft, is to make sure you never run out of disk space, whatever the size of the database.
http://msdn.microsoft.com/en-us/library/bb509342(VS.80).aspx
quote:
Do not allow Visual SourceSafe or the Analyze tool to run out of disk space while running. Running out of disk space in the middle of a complex operation can create serious database corruption.
I like ReSharper, but it is a total memory hog. It can quickly swell up and consume half a gig of RAM without too much effort, bogging down the IDE. Does anybody know of any way to configure it so it isn't as slow?
Turn off the on-the-fly compilation (which, unfortunately, is one of its best features).
The next release, 4.5, is going to be based around improving performance and reducing the memory footprint.
See Ilya Ryzhenkov's blog.
ReSharper 4.5 has been released.
In my experience it is less of a memory hog, but I can still run out of memory.
I had an issue where it was taking upwards of 10 minutes to load a solution of 100+ projects. Once loaded, VS performance would be OK, though it would oddly flutter back and forth between OK and really bad.
The short answer: Eliminating Resharper warnings seems to improve overall VS/R# performance.
The biggest problem ultimately was that we had a number of files of binary data (encrypted stuff) being included as embedded resources, which happened to have .xml extensions. ReSharper was trying really, really hard to analyze those files. Eventually it would get through, but it would generate 100K+ errors in the process. Changing the extension to one ReSharper does not automatically analyze (.bin in this case) solved the problem.
We still have about 10 files where, whenever they or a file they depend on is edited, performance tanks for a while. These files are the partial parts of a single class definition, with each file averaging 3,000 LOC. Yes, that's right: it's about a 30K-line class. It also happens to be rather poor code for other reasons, many of which ReSharper flags, making the right-hand gutter bar practically a solid orange line. Editing often causes ReSharper to reanalyze the whole thing, and while that analysis runs, performance is noticeably affected.
I've come to the conclusion that the fewer errors/warnings there are for R# to identify, the better it performs. The anecdotal evidence I gathered while cleaning up/refactoring this project seems to support that.
A lot of folks complain of perf problems with ReSharper. If you have even a few big, ugly code files with lots of ReSharper warnings, then a little time spent cleaning that code up might yield better performance overall. It has for us.
Not sure how big your solutions are, but I stopped using 4.5 for the same reason I stopped using all previous versions: memory usage.
Code analysis and unit-test support were the main reasons I bought it; turning them off means the rationale for using it is gone.
My workstation has 4 GB of memory, and I can easily kill it with ReSharper when running our end-to-end stack in debuggers.
You can check how much memory ReSharper is using:
ReSharper -> General -> Show managed memory usage in status bar.
If you are working on large source files, ReSharper does get sluggish (I'm on version 5.0 at the time of writing).
You can view ReSharper's memory usage via ReSharper options -> General -> Show memory use in status bar.
When I first did this, I noticed ReSharper had clocked up hundreds of megabytes of memory usage! However, the next step worked for me in (temporarily) fixing the sluggishness:
Right-click the memory usage and select "Collect garbage" - this seemed to fix the sluggishness for me straight away.
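Presumably that "Collect garbage" action just forces a full managed collection; the equivalent from your own code would be something like this sketch:

    using System;

    class ForceCollect
    {
        static void Main()
        {
            long before = GC.GetTotalMemory(false);

            GC.Collect();                    // collect all generations
            GC.WaitForPendingFinalizers();   // let pending finalizers run
            GC.Collect();                    // sweep whatever they released

            long after = GC.GetTotalMemory(false);
            Console.WriteLine("Reclaimed {0:N0} bytes", before - after);
        }
    }

Note this only trims the managed heap; it won't fix a leak where objects are still rooted, which is why the fix is temporary.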
Regarding memory hogging - I've found that my VS2008 memory footprint grows every time I close one solution and open another. This is true even if I close a solution and re-open that same solution.
The new ReSharper 4.5 works a lot better than the previous 4.x releases. I would recommend you try that one.
In previous versions I had the same problem; when 4.0 came out, those problems seemed to go away. Now with 4.1 I don't feel the huge slowdown I used to have. My IDE does not freeze up anymore.
Have you tried upgrading?
Try the 4.5 beta. 4.1 was killing my 2GB dev machine, but it's back to running incredibly smoothly with the beta. Others have had the opposite experience, though, so YMMV.
Yes, 4.5 works much better. My understanding is that 4.5 was to address the performance issues.
My colleagues and I are also having huge performance issues with ReSharper; just now my ReSharper took 1.1 GB of memory. Visual Studio slows down especially when writing JavaScript; it's unbearable. You can turn off the on-the-fly compilation, but that's the best feature it has...
Edit: Everybody in this thread seems to have ReSharper 4.x; my version is 6.0.