So my laptop hard drive reported a bad cluster last week, which is never a good sign.
I'm going to be shopping for a hard disk, and I may as well plump for the upgrade to Windows 7, which means a reinstallation of Visual Studio and everything else.
This particular laptop has space for two hard disks, so I was thinking about an SSD drive in one and a larger fastish (7.2k) drive in the other.
Where should Visual Studio best go in this arrangement? And what about "special" folders like %TEMP%? Does it make sense to use a ReadyBoost USB stick when your pagefile is already on an SSD? Should the database server and files live on the hard drive? Should I get concerned about the SSD wearing out?
Thanks all...
Do you have an antivirus with on-access scanning activated? If so, deactivate it for the directory where the compiler is installed and for your source code, and maybe for other areas too (have a look at the on-access scan statistics during compilation). That was the main slow-down on my laptop.
A bit more RAM might also help.
I had a look at the exorbitant prices of SSDs. I would think twice before investing a large amount of money in something that might not help in the end (that's why you asked the question here, right? ;-) )
If you really need speed, I would buy a desktop and set up a RAID-0 array. Laptops are quite slow. Of course, that only works if you can accept the loss of mobility...
Make sure you have 2GB+ of RAM. The more the better, as Win7 will use any spare RAM as a disk cache which will probably negate most of the advantage of an SSD. (We have a solution that took 6 minutes to load the first time in VS2005, and 20 seconds thereafter, due to the disk cache).
If you have enough RAM, stick your temp & intermediate folders in a RAMdisk.
Then split your remaining files over the two drives (e.g. apps and pagefile on one, source/object files on the other) to spread the I/O load across both drives. If using SSD, try to use it as a read-only device as much as possible.
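On the RAM-disk suggestion for the temp folder: below is a minimal C# sketch of one way to repoint the per-user TEMP/TMP variables. The R:\Temp path is just a placeholder for wherever your RAM disk is mounted, and you could of course do the same thing from the System control panel instead; note that a RAM disk is usually empty again after a reboot, so the folder has to be recreated.

    // Minimal sketch, assuming a RAM disk is already mounted as R: (placeholder).
    // Repoints the per-user TEMP/TMP variables; processes started afterwards
    // (including Visual Studio) will pick up the new location.
    using System;
    using System.IO;

    class MoveTempToRamDisk
    {
        static void Main()
        {
            const string ramTemp = @"R:\Temp";  // hypothetical RAM disk path
            Directory.CreateDirectory(ramTemp);
            Environment.SetEnvironmentVariable("TEMP", ramTemp, EnvironmentVariableTarget.User);
            Environment.SetEnvironmentVariable("TMP", ramTemp, EnvironmentVariableTarget.User);
            Console.WriteLine("User TEMP/TMP now point to " + ramTemp);
        }
    }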
I would put the OS and apps on the SSD (lots of reading, little writing), and put data on the other drive.
Visual Studio uses the disk intensively while compiling. Putting the temp and project folders on a high-performance disk can speed up compilation considerably.
My opinion is based on tests using a RAM disk.
The solution I typically work with contains a couple of dozen projects. When I load this solution, the status line displays something like "searching #includes for additional files", with a counter going well over one thousand. This can take 15-30 seconds (the machine has a quad-core i7, 8GB RAM and an SSD, Windows 7 Pro, SP1). It then spends another 15 seconds or so "updating intellisense". In spite of all this preparation, if I right-click on a function or method and select Go To Definition, I'll frequently get a dialog box with "Please wait". This can take 10-15 seconds, though usually after the first few times the search is instant. Others working on this solution (all are local copies managed by git and CMake, no shared disk or anything) have the same experience.
Are there settings or something else that will remove or lessen these problems? Or is this just what happens when a solution gets to this size?
Thanks
I don't think you can do much about it. I've worked on large solutions on a quad-core i7 with 16/32 GB RAM on Win7/XP. They tend to load slowly, build slowly, start slowly...
I do not recall messages about Additional Includes though.
Check out these pages; they may help: msdn, SO
Tinkering with settings might help, but be careful, as sometimes they lead to big problems and you have to restart your project.
What I Want
I want to simulate the performance of a normal hard drive on my SSD based development machine.
Background
I'm developing a Mac application on a Macbook with an SSD. It's gloriously fast.
If someone has a standard platter hard drive, my app will be slower for them. My app is heavy on Core Data too, so the disk access speed will be a significant factor.
I worry that the performance measurements I take with Instruments look fine, but when a customer runs my app on their normal hard drive it will be achingly slow.
What I've Tried
Before I installed my SSD, I measured the performance of my app in Instruments. After the install, I measured it again and the two benchmarks were identical.
This doesn't make sense to me. I'm convinced I was doing something wrong here. Instruments probably measures CPU time rather than wall-clock time. But still, surely the speed of the hard drive should affect the benchmark I took? Or does Instruments somehow compensate for this?
Kudos to @PaulR above, who suggested using an external USB hard drive to test performance. Thanks!
You can use a virtual machine and throttle its disk access. That way you have some control over disk speed, though it's still not possible to limit only writes or only reads.
Here are some tips on how to do it, in section 5.8 of the VirtualBox manual, "Limiting bandwidth for disk images": https://www.virtualbox.org/manual/ch05.html#storage-bandwidth-limit
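For reference, that manual section boils down to a couple of VBoxManage commands. In this sketch the VM name ("Dev VM"), the bandwidth-group name ("SlowDisk"), the controller name ("SATA"), the disk image file and the limits are all placeholders; substitute your own values:

    VBoxManage bandwidthctl "Dev VM" add SlowDisk --type disk --limit 20M
    VBoxManage storageattach "Dev VM" --storagectl "SATA" --port 0 --device 0 \
        --type hdd --medium disk1.vdi --bandwidthgroup SlowDisk
    VBoxManage bandwidthctl "Dev VM" set SlowDisk --limit 10M

The last command tightens the limit while the VM is running, which is handy for trying out different "hard drive speeds" without restarting it.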
I'm planning on buying a new PC for programming under Visual Studio 2010. My other main uses are:
Programming under Microsoft Visual Studio.
Running VMWare Virtual Machines.
Probably multi-monitor (if my budget lets me buy an extra one)
Here are my questions:
Do I need to purchase a high-performance display adapter for the usage described above, or will a mid-range one suffice? In general, I'd like to know how much the display adapter affects these kinds of workloads.
Which CPU would perform better: Core i7, Core 2 Quad, or AMD? I have a limited budget, but I really need good performance, and buying a good CPU/MB/RAM is my first priority.
A good video card is not a must-have unless you want to develop advanced 3D applications in Visual Studio (which is an option, after all). WPF and multi-monitor setups will work on any video card you would buy nowadays.
What is an absolute requirement is 4GB of RAM, just for Visual Studio 2010 alone under Win7 (x64, obviously, since the x86 version cannot use the full 4GB of RAM). Adding virtual machines raises that need. There is no upper limit here, since it really depends on how many VMs you plan to run at the same time and what applications will run on them. Add 1GB minimum per VM running Win7, and a lot more if they are supposed to run databases, source control or any other heavy-load application.
Also, for the VMs, it is almost mandatory to have them use separate physical hard drives if they are going to run simultaneously; if you don't, you will experience stone-age disk performance for both the host and the VMs (unless it's all on an SSD, which I have never tried).
If I were buying a computer for programming now, I would definitely buy an SSD to host Win7, VS and the projects; it would be really comfortable (my current desktop takes several minutes to boot and load my projects, and anything that improves loading is good).
On the CPU side, you might want to spend money on the number of cores rather than the actual speed (frequency) of the processor. All CPUs have decent performance, but your computer may slow down a lot if you're running several VMs on a 2-core CPU.
The i7 chip is a really good one, but I don't think you would gain a lot by spending big amounts of money on high-end Intel chips. Go for a good price/performance ratio with lots of cores, which for your budget will be a 4-core i5 or a 6-core Phenom II X6 (I personally would prefer the X6, but I don't want to sound partial).
More generally, if your host or your VMs are meant to run things like databases, continuous integration builds, or source control servers that are accessible to a lot of people, you might want to use a separate computer as your development machine, since availability will be important (that means no reboots, and avoiding hardware and software failures). You might want to buy a good motherboard and an excellent power supply, plus a good tower with enough fans. And you might want to think about what you're going to use for backups.
Edit: this last point almost rules out pre-built computers, since AFAIK computer makers will almost always include a cheap power supply and motherboard even in high-end computers, because those components are not advertised.
Another thing to look for is drive speed. Visual Studio does a lot of writing and reading to disk so get the fastest you can. SSD is ideal.
With the exception of the top-end graphics card, the same rules that apply to gaming setups apply to development environments. The more resources (RAM) the better; move your default Windows page file to a drive other than the C: drive; and use an SSD, or if you cannot afford one, try a hybrid 7200rpm / 4GB SSD drive such as the Momentus made by Seagate, which will not break the bank.
A lot of people agree that in the 64-bit era, memory is the new disk. 48GB costs around $700 at the moment, but this will drop rapidly over the coming months as 64-bit machines become more widely adopted.
Oh, and your graphics card, whilst not needing to be a monster, should still be a well-made one (from a decent manufacturer) with the most RAM you can afford. 2GB of graphics RAM means you can drive high resolutions on multiple monitors without eating into the host machine's RAM.
Best thing for a good Visual Studio setup? Money.
i7 or Core 2, whichever. I'd go quad-core if possible, and I'd spend as much money as I could on RAM.
The quad-core AMD processors are also quite good now.
Finally, considering VS2010 is WPF-based, a fast video card would also help; maybe not as much as more RAM, but I'd go with something better than onboard video.
I'm running VS2008/VS2010 on a triple monitor setup with a really awful graphics card -- ATI Radeon HD3450. Graphics performance hasn't affected me one bit since I'm just doing simple WPF applications. Your needs will vary if you're doing game development or something more demanding.
I would spend your money on RAM, especially if you're using VMs. Not only do the VMs need memory to run well, they will also be competing for the same disk. So either put them on a different hard disk or go SSD. VS20xx thrashes the drive during compiling, and a fast disk will help you out a lot.
You can really get a great developer machine if you're willing to build it yourself.
Scott Hanselman says:
Jedis build their own lightsabers, so you should build your own computer at least once!
He describes how he built GOM (God's Own Machine) for under $3K here, and talks about it in a podcast here.
If building your own is a bit beyond your aspirations, you can get some good ideas there about the most important features for a developer, from a Microsoft guru who really knows.
If you can afford it, go for a solid state drive.
I would consider getting a better-than-average video card because you'll need some horsepower to run multiple monitors, since you'll want to take advantage of the new tab tear-off ability in vs 2010 to display code files in separate windows.
I would definitely recommend a 10,000 RPM Velociraptor hard-drive or a pair of them striped because VS is a bit of a hog on IO resources.
If it was me, I'd go with a 6-core AMD Phenom processor and 6GB of Triple-channel RAM to maximize performance. If you're an Intel fan, go i7.
A good read on the importance of hard drive speed from ScottGu's Blog.
Tip/Trick: Hard Drive Speed and Visual Studio Performance
When you are doing development with Visual Studio you end up reading/writing a lot of files, and spend a large amount of time doing disk I/O activity.
At work, my PC is slow. I feel that I can be way more productive if I just wasn't waiting for Visual Studio and everything else to respond. My PC isn't bad (dual-core, 3GB of RAM), but there is a lot of corporate software and whatnot to slow everything down and sometimes lock it up.
Now, some developers have begun getting Windows 7 machines with 8 GB of RAM. Of course, I start salivating at this. However, I was told that I "had to justify" why I should get a new machine.
I can think of a lot of different things, but I am curious as to what every one else on SO would have to say.
NOTE: Ideally, these reasons should be specifically related to .NET development in Visual Studio on a Windows machine. This isn't a "how can I make my machine faster" question.
I would ask myself "What am I waiting on?" And then let the answer to that question drive whether or not I felt like I could justify it.
For example, right now, I'm dealing with 90 minute compiles of the project I'm working on. Would a faster machine help that? A little. But, more impactful would be sane configuration management. So, I'm pushing that way (to no avail) rather than to the hardware route.
Bring in a chess clock. When you are waiting, start the clock; when you aren't, stop the clock. At the end of the day, total up the time, multiply it by your pay rate, and multiply it by 2000: that is a reasonable upper limit on the amount of money the company is squandering on you per year due to a slow machine.
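As a worked example with made-up numbers: if the clock shows 45 minutes (0.75 hours) of waiting at the end of a day and your rate is $50/hour, that's about $37.50 a day, or roughly $9,400 over a 250-day working year; the ×2000 in the recipe gives $75,000, which is why it is described as an upper limit rather than a precise figure.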
Most useful metric: How much time do you spend reading The Onion (or, these days, StackOverflow)?
This is item #9 on The Joel Test:
9. Do you use the best tools money can buy?
Writing code in a compiled language is one of the last things that still can't be done instantly on a garden variety home computer. If your compilation process takes more than a few seconds, getting the latest and greatest computer is going to save you time. If compiling takes even 15 seconds, programmers will get bored while the compiler runs and switch over to reading The Onion, which will suck them in and kill hours of productivity.
I agree with the "what is holding me up?" approach.
I start with improving my workflow by looking at repetitive things I do that could be automated or fixed by a little helper tool. Helper tools don't take long to write and add a lot of productivity. Purchasing tools is also a good return on your time - a lot of things you could write yourself, but you shouldn't bother; concentrate on your core activity and let the tool makers concentrate on theirs, whether it is help-authoring software, screen grabbing, SEO tools, debugging tools, whatever.
If you can't improve things by changing your workflow (and I'd be surprised if you can't), then look at your hardware.
Increase memory if you can. If you're at 3GB with a 32-bit OS, there's no point going any further.
Add independent disks: one disk for the OS, another for your builds. That way there is less contention for disk access between the OS and the compiler. It makes a difference.
A better CPU is only worth it if you are doing the kind of work that justifies it.
Example: What do I use?
Dual Xeon Quad Core (8 cores, total)
8 GB RAM
Dual Monitors
VMWare virtual machines
What are the benefits?
Dual Monitor is great, much better than a single 1920x1200 screen.
Having lots of memory when using virtual machines is great because you can give the VM a realistic amount of memory (2GB) without starving the host machine.
Having 8 cores means I can do a build and mess about in a VM doing a build or a debug at the same time, no problems.
I've had this machine for some time. It's old hat compared to a Core i7 machine, but it's more than fast enough for any developer. Very rarely have I seen all the cores close to maxing out (with that much CPU power you're pretty much going to be held back by I/O, which is why I commented on multiple disks).
For me (working in a totally different environment, where JBoss, Eclipse and Firefox are the main resource sinks), it was simple enough:
"I have 2GBs of RAM available. I'm spending most of my time with 1GB of swap in use: imagine what task switching and application building looks like there. Another 2GB of RAM costs 50 euro. Ignoring the fact that it's frustrating working like this, you do the productivity math."
I could have shown CPU load figures and application build times as well, but it didn't come to that. It took them a month or two, but boy is development a joy since then! Oh, and for performance, it's likely you'd do best with Windows XP, but you probably already know that. ;]
Use a performance monitor to determine the cause.
For me it's the antivirus: it has some kind of critical resource leak that slows down I/O after a few days and requires a reboot, and no hardware upgrade will help much with that.
The justification will need hard data to back it up. If their business software is causing the problem, then "this is industry standard" obviously doesn't fly anymore. Maybe they'll realize their business software sucks and fix that instead.
What's up people.
Something's been bothering me for a while now... and I was wondering if any of you might know of a workaround for this.
The C# solution I'm working on is huge: it contains about 20 projects and almost the same number of unit test projects. Each project contains hundreds of files. So opening and closing the solution takes a while... but once it's open, everything is fine.
But, if I leave my computer up for the night (with my solution still opened in VS) and come back the next morning, everything I'll do in VS will be very slow for the next half hour or so.
I know why this happens... it's because Windows seems to page idle processes out of memory (RAM). When I then do something in VS, it has to pull the data back from the pagefile into memory, which slows down every single operation I do until the process's memory has been fully restored to RAM.
So my question is, is there a way to tell Windows that VS is a high priority process/application and to leave that process' memory in RAM?
Thanks in advance,
-Oli
I don't think this is possible. OTOH, you could put your computer in suspend-to-disk (hibernate) mode. That would pretty much freeze its state as it is when you leave (that is, VS in RAM) and restore the same state when you start working again. As an additional bonus, you would help to conserve energy and thus might save the earth.
You could alter your VS shortcut according to this article to boost the priority, but I don't know whether it would do what you describe for the process' memory.
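If you'd rather do it programmatically once VS is already running (instead of via the shortcut), here is a minimal C# sketch; it assumes the Visual Studio process is named devenv and that your account has permission to change its priority. Note that this only affects CPU scheduling, not whether Windows keeps the process's pages in RAM.

    // Minimal sketch: bump any running Visual Studio (devenv) processes to
    // AboveNormal priority. This changes CPU scheduling only; it does not stop
    // Windows from paging the process out while it is idle.
    using System;
    using System.Diagnostics;

    class BoostVsPriority
    {
        static void Main()
        {
            foreach (Process vs in Process.GetProcessesByName("devenv"))
            {
                vs.PriorityClass = ProcessPriorityClass.AboveNormal;
                Console.WriteLine("Set devenv (PID " + vs.Id + ") to AboveNormal.");
            }
        }
    }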
Also, purely for performance's sake, you could consider getting an SSD to replace your hard drive, if you haven't already. A friend of mine showed me his new laptop with an SSD on board, and it booted into Windows in under a minute and opened VS in less than 5 seconds.
Granted, that was opening VS straight from the Start menu; opening a project that huge would hopefully at least be significantly faster.
AFAIK, changing the process priority won't solve the problem, as the bottleneck seems to be I/O rather than CPU time. If the problem hurts your productivity, it would be well worth it to just buy a few more GB of RAM (how much depends on your OS and budget). If you can get to about 3-4GB of RAM, you can even eliminate the swap file (or come close to eliminating it). This will prevent VS from being paged out when idle.
Another option would be to create a tool that will walk VS's heap, forcing it into the main memory. This can be done by writing an add-in or by code injection. Have it run before you get to work, and you'll have VS up and about once you get to it. It will, however, require some work, and you might get more than you actually need in memory (some of VS's memory is in the swap file even when you work as usual, as with every other process).
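As a rough illustration of the idea, here is a hedged sketch of an external variant: rather than an add-in or code injection, a separate little program enumerates devenv's committed memory regions and reads through them, which forces the paged-out pages back into RAM before you sit down. It assumes devenv is running under your own account with readable memory, and it is a sketch rather than production code (failed reads, e.g. on guard pages, are simply skipped).

    // Sketch: page Visual Studio's memory back into RAM by reading it from outside.
    // Assumes devenv is running under the same user account.
    using System;
    using System.Diagnostics;
    using System.Runtime.InteropServices;

    class WarmUpVisualStudio
    {
        [StructLayout(LayoutKind.Sequential)]
        struct MEMORY_BASIC_INFORMATION
        {
            public IntPtr BaseAddress;
            public IntPtr AllocationBase;
            public uint AllocationProtect;
            public IntPtr RegionSize;
            public uint State;
            public uint Protect;
            public uint Type;
        }

        [DllImport("kernel32.dll", SetLastError = true)]
        static extern IntPtr VirtualQueryEx(IntPtr hProcess, IntPtr lpAddress,
            out MEMORY_BASIC_INFORMATION lpBuffer, IntPtr dwLength);

        [DllImport("kernel32.dll", SetLastError = true)]
        static extern bool ReadProcessMemory(IntPtr hProcess, IntPtr lpBaseAddress,
            [Out] byte[] lpBuffer, IntPtr nSize, out IntPtr lpNumberOfBytesRead);

        const uint MEM_COMMIT = 0x1000;

        static void Main()
        {
            Process[] instances = Process.GetProcessesByName("devenv");
            if (instances.Length == 0) { Console.WriteLine("Visual Studio is not running."); return; }

            IntPtr handle = instances[0].Handle;   // first VS instance
            IntPtr mbiSize = (IntPtr)Marshal.SizeOf(typeof(MEMORY_BASIC_INFORMATION));
            byte[] buffer = new byte[64 * 1024];
            IntPtr address = IntPtr.Zero;
            MEMORY_BASIC_INFORMATION mbi;

            // Walk the target's address space region by region.
            while (VirtualQueryEx(handle, address, out mbi, mbiSize) != IntPtr.Zero)
            {
                long regionSize = mbi.RegionSize.ToInt64();
                if (mbi.State == MEM_COMMIT)
                {
                    // Reading a committed region forces its pages back into physical memory.
                    for (long offset = 0; offset < regionSize; offset += buffer.Length)
                    {
                        long chunk = Math.Min(buffer.Length, regionSize - offset);
                        IntPtr bytesRead;
                        ReadProcessMemory(handle, new IntPtr(mbi.BaseAddress.ToInt64() + offset),
                            buffer, new IntPtr(chunk), out bytesRead);   // failures are ignored
                    }
                }
                address = new IntPtr(mbi.BaseAddress.ToInt64() + regionSize);
            }
            Console.WriteLine("Done touching devenv's committed memory.");
        }
    }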