Visual Studio 2008 crawling after long idle time - visual-studio

What's up people.
Something's been bothering me for a while now... and I was wondering if any of you might know of a workaround for this.
The C# solution I'm working on is huge: it contains about 20 projects and almost the same number of unit test projects, and each project contains hundreds of files. So opening and closing the solution takes a while... but once it's opened, everything is fine.
But, if I leave my computer up for the night (with my solution still opened in VS) and come back the next morning, everything I'll do in VS will be very slow for the next half hour or so.
I know why this happens: Windows pages out the memory of idle processes. When I do something in VS, the data has to be read back from the pagefile into RAM, which slows down every single operation I do until the process' working set has been fully restored.
So my question is, is there a way to tell Windows that VS is a high priority process/application and to leave that process' memory in RAM?
Thanks in advance,
-Oli

I don't think this is possible. OTOH, you could put your computer in suspend-to-disk mode. That would pretty much freeze its state as it is when you leave (that is: VS in RAM) and restore it to the same when you start working. As an additional bonus, you would help to conserve energy and thus might save the earth.

You could alter your VS shortcut according to this article to boost the priority, but I don't know whether it would do what you describe for the process' memory.
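If you'd rather try the priority approach programmatically than through a shortcut, here is a minimal sketch of a launcher that starts devenv at a higher CPU priority class. The install path is an assumption; adjust it for your machine. Note, as pointed out below, that priority governs CPU scheduling and likely won't stop Windows from trimming the process' memory:
#include <windows.h>

// Minimal sketch: launch Visual Studio with HIGH_PRIORITY_CLASS.
// The path is an assumption; adjust it for your installation.
int main()
{
    STARTUPINFOA si = { sizeof(si) };
    PROCESS_INFORMATION pi = { 0 };
    char cmd[] = "\"C:\\Program Files\\Microsoft Visual Studio 9.0\\Common7\\IDE\\devenv.exe\"";

    // HIGH_PRIORITY_CLASS affects CPU scheduling only; it does not prevent
    // Windows from paging the process out while it sits idle overnight.
    if (CreateProcessA(NULL, cmd, NULL, NULL, FALSE,
                       HIGH_PRIORITY_CLASS, NULL, NULL, &si, &pi))
    {
        CloseHandle(pi.hThread);
        CloseHandle(pi.hProcess);
    }
    return 0;
}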
Also, solely for performance's sake, you could consider getting an SSD to replace your hard drive, if you haven't already. A friend of mine showed me his new laptop with an SSD onboard: it booted into Windows in under a minute and opened VS in less than 5 seconds.
Granted, that was opening VS straight from the Start menu with no solution; still, opening a solution that huge should at least be significantly faster.

AFAIK, changing the process priority won't solve the problem, as the bottleneck seems to be I/O rather than CPU time. If the problem hurts your productivity, it would be well worth it to just buy a few more GB of RAM (how much depends on your OS and budget). With about 3-4 GB of RAM you can even eliminate the swap file (or come close to eliminating it), which will prevent VS from being paged out while idle.
Another option would be to create a tool that walks VS's heap, forcing it back into main memory. This can be done by writing an add-in or by code injection. Have it run before you get to work, and you'll have VS up and about once you get to it. It will, however, require some work, and you might pull more into memory than you actually need (some of VS's memory is in the swap file even while you work as usual, as with every other process).
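As a rough illustration of the external-tool variant (a sketch of my own, not a tested tool): walk the target process' committed regions with VirtualQueryEx and touch them with ReadProcessMemory, which faults the pages back into RAM. The PID is a placeholder you'd look up for devenv.exe:
#include <windows.h>
#include <vector>

// Sketch: touch every committed, readable page of a target process so the
// OS faults it back into physical memory. Pass devenv.exe's PID.
void TouchProcessPages(DWORD pid)
{
    HANDLE hProc = OpenProcess(PROCESS_QUERY_INFORMATION | PROCESS_VM_READ,
                               FALSE, pid);
    if (!hProc) return;

    MEMORY_BASIC_INFORMATION mbi;
    std::vector<char> buf;
    const char* addr = 0;

    while (VirtualQueryEx(hProc, addr, &mbi, sizeof(mbi)) == sizeof(mbi))
    {
        if (mbi.State == MEM_COMMIT &&
            !(mbi.Protect & (PAGE_NOACCESS | PAGE_GUARD)))
        {
            buf.resize(mbi.RegionSize);
            SIZE_T bytesRead = 0;
            // Reading the region forces the kernel to page it in.
            ReadProcessMemory(hProc, mbi.BaseAddress, &buf[0],
                              mbi.RegionSize, &bytesRead);
        }
        addr = (const char*)mbi.BaseAddress + mbi.RegionSize;
    }
    CloseHandle(hProc);
}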

Related

Visual Studio 2010 - how do I speed up loading of a large solution?

The solution I typically work with contains a couple of dozen projects. When I load this solution, the status line displays something like "searching #includes for additional files", with a count running well over one thousand. This can take 15-30 seconds (the machine has a quad-core i7, 8 GB RAM and an SSD, Windows 7 Pro, SP1). It then spends another 15 seconds or so "updating intellisense". In spite of all this preparation, if I right-click on a function or method and select Go To Definition, I'll frequently get a dialog box with "Please wait". This can take 10-15 seconds, though usually after the first few times the search is instant. Others working on this solution (all local copies managed by git and CMake, no shared disk or anything) have the same experience.
Are there settings or something that will remove or lessen these problems? Or is this just what happens when a solution gets to this size?
Thanks
I don't think you can do much about it. I've worked on large solutions on quad-core i7 machines with 16/32 GB RAM on Win7/XP. They tend to load slow, build slow, start slow...
I do not recall messages about Additional Includes though.
Check out these pages, they may help: msdn, SO
Tinkering with settings might help, but be careful: sometimes they lead to big problems and you have to restart your project.

Visual Studio slows down excessively

My installation of Visual Studio eats up memory like a starved pig. Depending on what I'm doing, after a while it's too slow to use. Typically, devenv.exe will get to around 700 MB before I have to reload it.
I would expect that it's slowing down because of some excessive page swaps or something.
I read somewhere that Ctrl-Alt-Shift-F12 helps, but it does nothing.
Are there any fixes for this, or at least anything that will allow me to run it for longer before it explodes my RAM, i.e. starts to run slow at 4 GB rather than 700 MB?
I have Windows 7 x64 with 8 GB RAM. Using Virgin antivirus stuff.
I have lots of add-ons running; I have a suspicion that it might be ReSharper that's causing the slowdown.
The number of projects or solutions is irrelevant, as I can run a single WinForms project with about 50 lines of code and, after a few dozen debug runs etc., it'll still be like trying to flog a dead horse.
Ta.
IntelliSense has been known to cause problems; you may want to try disabling it.
Also, I had BIG problems with my antivirus (Kaspersky); the newer versions use a new 'signature' technique that brought mine to a crawl.
Hope this helps

How to make Visual Studio 2010 use more than 600 MB of memory

I am tired of how slow VS2010 is. I know there are a lot of topics here about tuning the settings, and I've read/applied them all, with not much luck though. Namely, the things I've already done:
removed all the extensions
never had ReSharper
tuned the settings to get maximum performance
tried SSD and RAM disks
Nothing helped; it is still unacceptably slow. I know what I am saying because with VS2008 I never had such problems.
Now, I am working on a pretty big C# solution with about 20 projects in it. Visual Studio works quite fast when just opened, but as time goes on it starts lagging and eventually gets so slow that I have to restart it. Resource Monitor shows that the amount of memory it consumes is about 200 MB in the beginning, goes up to ~600 MB, and then doesn't go any higher. I have 8 GB of total RAM on an x64 laptop, with about 4 GB always free. I find it weird how little memory VS uses; common sense tells me that the more memory an app uses, the faster it should work. So I believe my question is: how do I make VS use more of the available memory?
PS
I tried the recipe from "Configure Visual Studio to use more RAM"; it didn't work out.
There is no way to make Visual Studio use more memory. The application itself has no preset limitation. It will simply use the amount of memory that is granted to it by the operating system (just like other apps).
The reason you see it increase to 600 MB and then stop is just a side effect of how the managed GC works. As VS performs operations like displaying IntelliSense or applying edits, more managed objects are created. Eventually the GC is triggered: it reclaims all of the dead objects and promotes the longer-lived ones. Overall memory usage is lowered, but not all the way back to where it was before you started editing. Then you edit some more, and this process continues until it reaches the appearance of a steady state. If you analyze it closely, you'll see that it's actually more of a sawtooth graph of memory usage.
As to why your particular instance of Visual Studio is slow, that is hard to determine remotely. 20 projects is a larger solution, but performance should still be acceptable even with that many. A couple of things to try in order to isolate the problem:
Try editing a smaller solution. It's possible there is one project in particular which is giving VS a problem. Breaking the project down into smaller solutions could help isolate it.
Try disabling Aero on your computer. It's possible that WPF is the problem here.

What could cause the application as well as the system to slow down?

I am debugging an application which slows down the system very badly. The application loads a large amount of data (some 1,000 files, each about half a MB) from the local hard disk. The files are loaded as memory-mapped files and are mapped only when needed. This means that at any given point in time the virtual memory usage does not exceed 300 MB.
I also checked the handle count using handle.exe from Sysinternals and found that there are at most some 8000-odd handles open. When the data is unloaded it drops to around 400. There are no handle leaks after each load and unload operation.
After 2-3 load/unload cycles, during one load, the system becomes very slow. I checked the virtual memory usage of the application as well as the handle counts at this point, and both were well within limits (VM about 460 MB with not much fragmentation either; handle count 3200).
I want to know how an application could make the system this slow to respond. What other tools can I use to debug this scenario?
Let me be more specific: when I say "system", it is all of Windows that slows down. Task Manager itself takes 2 minutes to come up, and most often a hard reboot is required.
The fact that the whole system slows down is very annoying; it means you cannot easily attach a profiler, and it would even be difficult to stop the profiling session in order to view the results (since you said it requires a hard reboot).
The tools best suited for the job in this situation are the ETW (Event Tracing for Windows) ones; they are great and will give you the exact answer you are looking for.
Check them out here
http://msdn.microsoft.com/en-us/library/cc305210.aspx
and
http://msdn.microsoft.com/en-us/library/cc305221.aspx
and
http://msdn.microsoft.com/en-us/performance/default.aspx
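As a hypothetical starting point once the Windows Performance Toolkit from those links is installed, a capture session looks roughly like this (the trace file name is arbitrary):
xperf -on DiagEasy
(reproduce the slowdown, e.g. one load/unload cycle)
xperf -d slowdown.etl
xperf slowdown.etl
The DiagEasy kernel group includes disk I/O and hard-fault events, which is exactly what you want to look at when the whole machine stalls.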
Hope this works.
Thanks
Tools you can use at this point:
Perfmon
Event Viewer
In my experience, when things happen to a system that prevent Task Manager from popping up, they're usually of the hardware variety; the System log in Event Viewer is sometimes just full of warnings or errors that some hardware device is timing out.
If Event Viewer doesn't indicate that any kind of loggable hardware error is causing the slowdown, then try Perfmon: add counters for system objects to track file reads, exceptions, context switches etc. per second and see if there's something obvious there.
Frankly, the sort of behavior demonstrated is meant to be impossible - by design - for user-mode code to cause. WinNT goes to a lot of effort to insulate applications from each other and prevent rogue applications from making the system unusable. So my suspicion is that some kind of hardware fault is to blame. Is there any chance you can simply run the same test on a different PC?
If you don't have profilers, you may have to do the same work by hand...
Have you tried commenting out all read/write operations, just to check whether the slowdown disappears?
"Divide and conquer" strategies will help you find where the problem lies.
If you run it under an IDE, run it until it gets really slow, then hit the "pause" button. You will catch it in the act of doing whatever takes so much time.
You can use tools like "IBM Rational Quantify" or "Intel VTune" to detect performance issues.
[EDIT]
Like Benoît did, one good approach is measuring task times to identify which one is eating CPU.
But remember: since you are working with many files, it is likely page misses that cause the memory-to-disk swapping.
When Task Manager is taking 2 minutes to come up, are you getting a lot of disk activity? Or is it CPU-bound?
I would try Process Explorer from Sysinternals. When your system is in the slowed-down state and you try running, say, Notepad, pay attention to the page fault deltas.
Windows is very greedy about caching file data. I would try removing file I/O as someone suggested, and also making sure you close the file mapping as soon as you are done with a file.
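For what it's worth, here is a minimal sketch of that pattern, assuming plain Win32 memory-mapped files: map a view only while the file's data is actually needed, and release everything immediately afterwards (the function and path are placeholders):
#include <windows.h>

// Sketch: map a file, use it, and release the view and handles immediately,
// so the memory manager can reclaim the pages as soon as possible.
void ProcessOneFile(const char* path)  // path is a placeholder argument
{
    HANDLE hFile = CreateFileA(path, GENERIC_READ, FILE_SHARE_READ, NULL,
                               OPEN_EXISTING, FILE_ATTRIBUTE_NORMAL, NULL);
    if (hFile == INVALID_HANDLE_VALUE) return;

    HANDLE hMap = CreateFileMappingA(hFile, NULL, PAGE_READONLY, 0, 0, NULL);
    if (hMap)
    {
        const char* view = (const char*)MapViewOfFile(hMap, FILE_MAP_READ,
                                                      0, 0, 0);
        if (view)
        {
            // ... read the data through 'view' here ...
            UnmapViewOfFile(view);   // release the view as soon as done
        }
        CloseHandle(hMap);           // then the mapping object
    }
    CloseHandle(hFile);              // and the file handle itself
}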
I/O is probably causing your slowdown, especially if your files are on the same disk as the OS. Another way to test that would be to move your files to another disk and see if that alleviates the problem.

Comparing cold-start to warm start

Our application takes significantly more time to launch after a reboot (cold start) than if it has already been opened once (warm start).
Most (if not all) of the difference seems to come from loading DLLs; when the DLLs are in cached memory pages, they load much faster. We tried using ClearMem to simulate rebooting (since it's much less time consuming than actually rebooting) and got mixed results: on some machines it seemed to simulate a reboot very consistently, and on some it did not.
To sum up my questions are:
Have you experienced differences in launch time between cold and warm starts?
How have you dealt with such differences?
Do you know of a way to dependably simulate a reboot?
Edit:
Clarifications for comments:
The application is mostly native C++ with some .NET (the first .NET assembly that's loaded pays for the CLR).
We're looking to improve load time; obviously we did our share of profiling and improved the hotspots in our code.
Something I forgot to mention is that we got some improvement by rebasing all our binaries (giving each one its own preferred base address, e.g. with the linker's /BASE option or editbin /REBASE), so the loader doesn't have to relocate them at load time.
As for simulating reboots, have you considered running your app from a virtual PC? Using virtualization you can conveniently replicate a set of conditions over and over again.
I would also consider some type of profiling app to spot the bit of code causing the time lag, and then making the judgement call about how much of that code is really necessary, or if it could be achieved in a different way.
It would be hard to truly simulate a reboot in software. When you reboot, all devices in your machine get their reset bit asserted, which should cause all memory system-wide to be lost.
In a modern machine you've got memory and caches everywhere: there's the VM subsystem which is storing pages of memory for the program, then you've got the OS caching the contents of files in memory, then you've got the on-disk buffer of sectors on the hard drive itself. You can probably get the OS caches to be reset, but the on-disk buffer on the drive? I don't know of a way.
How did you profile your code? Not all profiling methods are equal and some find hotspots better than others. Are you loading lots of files? If so, disk fragmentation and seek time might come into play.
Maybe even sticking basic timing information into the code, writing it out to a log file, and examining the files on cold/warm start will help identify where the app is spending its time.
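For instance, a crude scoped timer like this sketch (all names are mine, not from any particular library) can be dropped around each suspect startup phase, and the resulting log compared between cold and warm starts:
#include <chrono>
#include <fstream>
#include <string>

// Sketch: RAII timer that appends elapsed milliseconds to a log file,
// so you can diff where the time goes on cold vs. warm starts.
struct ScopedTimer
{
    std::string label;
    std::chrono::steady_clock::time_point start;

    explicit ScopedTimer(std::string l)
        : label(std::move(l)), start(std::chrono::steady_clock::now()) {}

    ~ScopedTimer()
    {
        auto ms = std::chrono::duration_cast<std::chrono::milliseconds>(
            std::chrono::steady_clock::now() - start).count();
        std::ofstream("startup_timing.log", std::ios::app)
            << label << ": " << ms << " ms\n";
    }
};

// Usage: put 'ScopedTimer t("LoadConfigFiles");' at the top of each phase.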
Without more information, I would lean towards filesystem/disk cache as the likely difference between the two environments. If that's the case, then you either need to spend less time loading files upfront, or find faster ways to load files.
Example: if you are loading lots of binary data files, speed up loading by combining them into a single file, then slurp the whole file into memory in one read and parse its contents. Fewer disk seeks and less time spent reading off of disk. Again, maybe that doesn't apply.
I don't know offhand of any tools to clear the disk/filesystem cache, but you could write a quick application to read a bunch of unrelated files off of disk to cause the filesystem/disk cache to be loaded with different info.
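A sketch of such a cache-churning tool, assuming you point it at a few gigabytes of files unrelated to your application:
#include <fstream>
#include <string>
#include <vector>

// Sketch: read a set of large, unrelated files end to end so the filesystem
// cache is repopulated with their contents instead of your app's files.
// The file list is a placeholder.
void ChurnFileCache(const std::vector<std::string>& bigFiles)
{
    std::vector<char> buf(1 << 20);  // 1 MB read buffer
    for (size_t i = 0; i < bigFiles.size(); ++i)
    {
        std::ifstream f(bigFiles[i].c_str(), std::ios::binary);
        while (f.read(&buf[0], buf.size()) || f.gcount() > 0)
        {
            // Nothing to do here: the read itself pulls the file's
            // data through the OS cache, evicting older entries.
        }
    }
}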
@Morten Christiansen said:
One way to make apps cold-start faster (sort of) is the trick used by e.g. Adobe Reader: loading some of its files at system startup, thereby hiding the cold start from the users. This is only usable if the program is not supposed to start up immediately.
That makes the customer pay for initializing our app at every boot even when it isn't used, I really don't like that option (neither does Raymond).
One successful way to speed up application startup is to switch DLLs to delay-load. This is a low-cost change (some fiddling with project settings) but can make startup significantly faster. Afterwards, run depends.exe in profiling mode to figure out which DLLs load during startup anyway, and revert the delay-load on those. Remember that you can also delay-load most Windows DLLs you need.
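A minimal sketch of what that looks like, with a hypothetical DLL and function name; the real change is in the linker settings, not the code:
// Project settings: Linker > Input > "Delay Loaded DLLs": report_engine.dll
// (command-line equivalent: link ... delayimp.lib /DELAYLOAD:report_engine.dll)
#include "report_engine.h"  // hypothetical header for the delay-loaded DLL

void OnExportReport()
{
    // report_engine.dll is resolved here, on the first call into it,
    // instead of during application startup.
    GenerateReport();       // hypothetical function exported by that DLL
}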
A very effective technique for improving application cold launch time is optimizing function link ordering.
The Visual Studio linker lets you pass in a file listing all the functions in the module being linked (or just some of them; it doesn't have to be all of them), and the linker will place those functions next to each other in memory.
When your application is starting up, there are typically calls to init functions throughout your application. Many of these calls will be to a page that isn't in memory yet, resulting in a page fault and a disk seek. That's where slow startup comes from.
Optimizing your application so all these functions are together can be a big win.
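Concretely, an order file for the linker's /ORDER option might look like the following sketch; it lists decorated symbol names, one per line, and requires the code to be compiled with /Gy (function-level linking). The symbol names here are hypothetical:
; startup_order.txt - passed to the linker as /ORDER:@startup_order.txt
; (decorated names below are hypothetical; substitute your own)
?InitLogging@@YAXXZ
?InitGraphics@@YAXXZ
?InitPlugins@@YAXXZ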
Check out Profile Guided Optimization in Visual Studio 2005 or later. One of the things that PGO does for you is function link ordering.
It's a bit difficult to work into a build process, because with PGO you need to link, run your application, and then re-link with the output from the profile run. This means your build process needs to have a runtime environment, deal with cleaning up after bad builds, and all that, but the payoff is typically a 10% or more faster cold launch with no code changes.
There's some more info on PGO here:
http://msdn.microsoft.com/en-us/library/e7k32f4k.aspx
As an alternative to a function order list, just group the code that will be called during startup into the same sections:
#pragma code_seg(".startUp")
// ... functions defined here are emitted into the .startUp section,
// so they end up next to each other in the image ...
#pragma code_seg
#pragma data_seg(".startUp")
// ... likewise for global data defined here ...
#pragma data_seg
It should be easy to maintain as your code changes, but has the same benefit as the function order list.
I am not sure whether a function order list can specify global variables as well, but using #pragma data_seg simply works.
One way to make apps cold-start faster (sort of) is the trick used by e.g. Adobe Reader: loading some of its files at system startup, thereby hiding the cold start from the users. This is only usable if the program is not supposed to start up immediately.
On another note, .NET 3.5 SP1 supposedly has much-improved cold-start speed, though by how much I cannot say.
It could be the NICs (LAN cards): your app may depend on certain other services that require the network to come up. So profiling your application alone may not quite tell you this; you should also examine the dependencies of your application.
If your application is not very complicated, you can just copy all the executables to another directory; it should be similar to a reboot. (Cut and paste doesn't seem to work: Windows is smart enough to know that files moved to another folder are still cached in memory.)
