Building a dedicated Visual Studio 2010 virtual machine, which path has least resistance? - visual-studio-2010

I'd like to ask anybody who has built a virtualized VS2010 environment in VirtualBox or VMware: which one worked out of the box without too much tweaking? Or do both need workarounds to get things working?

Both are fine as long as you install the respective tools and drivers provided for the guest OS.
If you're using VMWare Workstation, you can get even more out of the environment by installing Visual Studio on the host PC and using the guest VM for debugging: if your application crashes, you can actually rewind to before the crash and step through your code with the same heap and stack it had when it crashed!
Basically, I suggest going with VMWare Workstation. It's pretty cheap (assuming you get paid to program) and has many, many awesome features that you'll come to love. If you're a hobbyist/student programmer however, you'll likely find VirtualBox to be a little more functional than the free VMWare Player.
As far as performance goes, Intel and AMD have both shipped chips with hardware virtualization since 2005 and 2006 respectively. This is called VT-x or AMD-V, and it often has to be enabled in the BIOS on older machines.
Basically this means the CPU handles memory and I/O virtualization in hardware, while specialist drivers (e.g. VMWare Tools) are installed to improve graphics and mouse performance - effectively the resulting VM has near-native performance with minimal overhead.
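If you want to check whether your chip and BIOS actually expose it, here is a hedged sketch from the Windows side (exact output depends on your Windows version, and Coreinfo is a separate Sysinternals download):

    rem Sysinternals Coreinfo, run from an elevated prompt:
    rem -v dumps only virtualization-related features (VT-x/AMD-V, EPT, etc.)
    coreinfo -v

    rem On Windows 8 / Server 2012 and later, systeminfo also prints a
    rem "Hyper-V Requirements" section that includes
    rem "Virtualization Enabled In Firmware"
    systeminfo | findstr /i "virtualization"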
Hope that helps!

You can work with a virtualized VS2010/Windows environment with no problems.
I've worked with that combination myself and had no problems. Both VMWare and VirtualBox have been stable for years now, and Windows virtualizes properly.
Obviously you can see some performance loss, because a virtualized OS has more bottlenecked access to resources than the host does, but current CPUs from Intel and AMD have virtualization instruction extensions that accelerate virtualization operations.
So... Just go ahead!

I don't know your requirements, but there is also a great alternative if you're on Windows 7.
You can create a VHD file and boot directly from it (native VHD boot).
With a few more steps, you can create a base VHD with everything you need, mark it read-only, and create as many differencing disks from it as you want.
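If it helps, here is a rough sketch of the diskpart/bcdedit steps involved - paths, sizes and the "dev1" name are made-up examples, so double-check the exact syntax for your Windows version:

    rem From the Windows 7 setup console (Shift+F10) or an elevated prompt:
    diskpart
    rem -- inside diskpart --
    create vdisk file="C:\VHDs\base.vhd" maximum=40960 type=expandable
    rem ...install Windows 7 + VS2010 into base.vhd, mark the file read-only,
    rem then create a differencing disk that points back at it:
    create vdisk file="C:\VHDs\dev1.vhd" parent="C:\VHDs\base.vhd"
    exit

    rem Add a boot entry that boots from the differencing disk
    rem (replace {guid} with the GUID printed by the /copy command):
    bcdedit /copy {current} /d "Win7 + VS2010 (VHD)"
    bcdedit /set {guid} device vhd=[C:]\VHDs\dev1.vhd
    bcdedit /set {guid} osdevice vhd=[C:]\VHDs\dev1.vhd
    bcdedit /set {guid} detecthal on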
The drawbacks of this method are:
it's a bit tricky to create the base and differencing disks, because you have to do it from the Windows Setup console (but Google can help you)
there is a small performance impact on disk I/O (but lower than in a virtualization environment)
you can run only one system at a time (although nothing stops you from installing virtualization software inside it)
you can't have your "host" and its usual tools (corporate email, etc.) alongside
but at least the performance will be much better than with virtualization software.

Related

How to limit PC performance to test software

I am developing a .NET application, and have the luxury of doing this on a fairly powerful desktop PC. I want to ensure it runs okay on PCs with much lower spec, but I don't have spare machines kicking around and can't really afford to buy them. Is there any way to simulate a lower-spec PC on my current PC, to get a feel for how the software might run?
Any help or advice would be very much appreciated.
My PC is Windows 7 Ultimate 64-bit with an 8-core Intel i7 and 16GB of RAM.
You could install VMWare and then install any OS you want, with any hardware specs you want, provided they don't exceed your actual hardware of course.
Keep in mind that VMWare is just a virtualization layer. It presents virtual hardware to the guest, but you are still running your code on the same i7.
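As a sketch, if you go with VirtualBox instead (also mentioned in other answers here), you can dial the virtual hardware down from the command line - the VM name and the numbers are just examples:

    rem one CPU, 1GB of RAM, and throttle the virtual CPU to ~40% of a host core
    VBoxManage modifyvm "LowSpecTest" --cpus 1 --memory 1024 --cpuexecutioncap 40

The CPU cap is only a rough approximation of a slower machine (it throttles cycles rather than lowering the clock), so treat the results as indicative rather than exact.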
Here is an example of several Visual Studio plug-ins that emulate devices: http://www.asp.net/mobile/device-simulators. You can also install Windows 8 and run Hyper-V; it's great for this kind of thing.

Coding in Linux through a virtual machine on Windows vs. partitioning

I already have experience with setting up virtual machines, running them and other minor tasks. I'm a gamer, so I won't get rid of Windows (for now at least...) but I do want to be a great programmer and to be involved with the open-source community.
I'd like to know whether it's a good idea to do my programming in Linux through a virtual machine, versus giving it a partitioned section of the HDD. I'd like to know about performance pros and cons and functionality.
All responses are appreciated, thanks in advance.
The types of programming I intend to dive into:
Android Dev, Web Dev, Desktop Dev...More Android and Web right now though.
So I'm looking at C#, C, C++, Java, PHP, HTML, MySQL... off the top of the dome.
I do web design as well, so Dreamweaver is added as an "essential". But I'm sure I can do Dreamweaver files and upload them to the server after programming in Linux... right?
Any info on IDEs in Linux for the above is appreciated, but I would prefer going the coding route and understanding the essence of what's happening "under the covers".
Thanks to all for reading, I appreciate it.
Hope this isn't confusing :S
There is an easier solution.
I still have to use Windows for Symbian programming, so I use Wubi and Ubuntu to provide my dual boot into Linux. Wubi installs Ubuntu into a single large file on your Windows drive, so there's no need to worry about or mess with creating a partition.
I have used it for 18 months with no data loss and no worries.
There is also another tool called andLinux:
http://www.andlinux.org/
It uses coLinux to run Linux as a program inside Windows.
A couple things:
If you're using an IDE, there's no point in coding on Linux. Linux is nice for programming because the command-line tools are awesome. NetBeans and Eclipse both work fine on Windows. All you'd be missing is makefiles (which IDEs don't use anyway).
Using a virtual machine would be annoying (working with the window and stuff) and slow. Try AndLinux if you want to have Linux running in Windows. It sets up X and Pulseaudio for you, so all of your programs will appear to be native. It's basically a way to run Ubuntu as a Windows service (all Ubuntu packages for your architecture are installable).
If you just want the fun of Linux command line programs without access to all of Ubuntu, cygwin is smaller and might be faster.
If by "Dreamweaver files", you mean HTML/PHP/CSS, then yes, you can just upload them to the server. As far as I know, the only ASP or ASP.net compatible server is Microsoft's, but why use that anyway?
EDIT: SO didn't give me enough space in the comments to answer your question..
AndLinux and Cygwin are basically just better ways to do your "virtual machine" idea.
Cygwin adds a posix layer to Windows (basically everything you need to compile Unix/Linux/BSD programs). This means that you can generally take a Linux program and just compile it on Windows and have it work. They also have repositories, but in my experience, the cygwin installer is slow and hard to use.
AndLinux runs the Linux kernel as a Windows service, giving you a similar experience to running it in VirtualBox or other virtualization programs. However, it also sets up X (the graphics layer for Linux) and PulseAudio (a sound system that lets you run sound over a network), so that when you run Linux programs they act and sound like native programs. I also like AndLinux better because you have access to all of Ubuntu's programs, and apt-get is easier to use than cygwin's installer. Also, if you use AndLinux and later decide to go 100% Linux, you're basically already using it that way.
What I'm getting at is: If you want to run Linux in a virtual machine, don't. Just install AndLinux. It will be faster and it's much easier to work with (since everything is just a normal window).
Here's an example of the difference:
Screenshot of AndLinux: The program in the bottom right corner is running in AndLinux. Notice how it just looks like a badly themed Windows program? Compare that to something like this, where you have another desktop in a Window.
And still... there's no reason to virtualize NetBeans. It runs natively on Windows, so you gain nothing and lose a lot of speed.
If you're interested in Android development and you want to use Linux, then I would recommend you do your development in Eclipse. Eclipse is available for Linux and if you get Ubuntu then Eclipse is amazingly easy to install. I used VirtualBox + Ubuntu + Eclipse for several projects I worked on. If you decide that Linux is not for you and your project was in Eclipse then you will have no problem switching back to Windows since Eclipse is available for both operating systems.
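For example, on Ubuntu of that era installing Eclipse was typically a single package install (the package name and what it pulls in can vary by release, so treat this as a sketch):

    # installs Eclipse and a suitable JRE from the Ubuntu repositories
    sudo apt-get install eclipse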
The ONLY problem I had was the screen size on the virtual machine... if you have a big screen and you use a virtual machine then you might get limited to a fraction of your actual screen resolution. It's very easy to install Linux on a second partition, so I would just recommend you go with a second partition if you want to fully utilize the size of your monitor.
My setup is sort of the opposite: I run Linux as my main OS, both at work and at home, and I have Windows in a virtual machine. On a modern computer with adequate memory the performance of development tools is not a problem. I work with Visual Studio in the virtual machine, and I have seen few performance issues. (But note that this is on a fast computer, and that you may need more memory than otherwise, since you are running two OSes at the same time. On an old computer with less memory it can become unbearable.)
Dual-boot, where you have to restart the computer to switch OS, doesn't work well for me. It takes way too much time to switch, and I really need to switch back and forth. Having Windows in a window works much better for me, and you can maximize that "Windows window", so it looks like you're just running Windows.
One thing you may want to look at is to have Linux running in a VM, then configuring Samba to allow the host to network-mount pieces of the Linux filesystem so that you can operate using Windows tools, and have Linux running the server processes (e.g., httpd). Alternatively, I'm sure that there are shell extensions for using FTP, NFS, or SSH/SFTP servers from within Explorer, but I've not looked at any for a long time.
If you should happen to need to use graphical Linux tools then you can use the X server found in cygwin for that.
The downside of this plan is that Samba can be a bit tricky to configure, but you get to use the Windows tools you're already familiar with.
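As a rough sketch (the path, user and share name are placeholders), a share exposing a Linux code directory to the Windows host might look something like this in /etc/samba/smb.conf, followed by a restart of the Samba service:

    # /etc/samba/smb.conf - minimal share for the dev tree
    [code]
        path = /home/dev/projects
        valid users = dev
        read only = no
        browseable = yes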
I had no issues running Ubuntu via VMWare. You can easily switch to full-screen mode anytime. Strongly recommended. One shortcoming is that Linux will not be exposed to the full potential of your hardware; Compiz Fusion failed to work as a result.
Given that you're a gamer, I'm thinking your machine should be fast enough to run Linux in a VM. Best to try out the VM before messing with disk partitions.
I use physically separate machines to run Linux and Windows (and MacOS X). This means that I don't have to reboot to do something different, and each system gets the full power of the hardware.
Disadvantages: more desk space used, more time and money spent maintaining hardware (though if you do a rolling upgrade, this is mitigated - Linux runs most happily on not-quite-new machines). Doesn't work so well if you like carrying laptops around.
Be aware that VMs universally don't give you full graphics acceleration. This can be a non-issue (many programs must cope with Intel GMA anyway), or it can be a showstopper. Your choice.

Windows Virtual PC Development Setup?

After having a dev PC's hard drive corrupt itself, I'm considering the idea of making my development environment fully Virtual PC-based.
The core items would be:
- XP Pro 32
- IIS
- VS2003
- VS2008
- SQL Server 2005
- Office 2003
Primary source would reside on a server in SVN with only a local copy on the VPC.
This would be for Windows based web and desktop development.
Assuming that the host machine has decent performance and provides hardware virtualization, are there any known gotchas with such a setup, i.e. main pros and cons? Any performance issues or other issues that make this a good or bad idea?
I'd like to go this route so I can create a full backup VPC that can be put on a new PC if one fails and is replaced, or copied to a laptop as needed for offsite work, etc. With the new Virtual PC features of Win7 this seems like it may be even better going forward too.
Would like to get some feedback on this before we go down that road...
I wouldn't recommend Virtual PC because the performance is pretty disappointing compared to VMWare.
I've used a virtual development machine inside VMWare Workstation and VMWare Fusion on Mac for quite a while, and it works very well. It feels as if you're running on a dedicated machine.
My recommendations are:
Use a 64-bit OS as your host OS (Vista x64, Windows 7 64-bit, Mac OS X Leopard)
Have at least 6GB of RAM on your physical machine
Allocate 3GB of RAM to your VM for 32-bit, or more for a 64-bit guest OS
Pre-allocate the diskspace for your guest OS (no auto-grow)
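For instance, with VMWare Workstation's bundled vmware-vdiskmanager you can create the preallocated disk up front - the size, adapter and filename are just examples, and the flags may differ slightly between versions:

    rem -c create, -s size, -a adapter type, -t 2 = preallocated single-file disk
    vmware-vdiskmanager -c -s 40GB -a lsilogic -t 2 dev-vm.vmdk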
Another advantage is that you can take your VM from a Windows-based VMWare Workstation to a Mac-based VMWare Fusion (and the other way around) without any problems.
I have been running multiple virtual development environments in MS Virtual PC and VirtualBox for 2 years now. I am doing mostly ASP.NET applications; some of the solutions are relatively large and use large databases, which I also run inside the VM.
My observations based on this:
It is a good idea for exactly the reasons you mention and it works fine. Go for it!
768MB of RAM for the VM is enough, but more is better.
Have a Multi-core CPU.
Install the virtual machine additions for the guest OS. (This is basically like installing the proper drivers for your "virtual" hardware, and seems to be more important for performance than having hardware virtualisation support).
If possible, have the VM disk image on a separate physical disk from the host OS.
Use VirtualBox. It's free, and being developed rapidly. It might already be the best.
If you can satisfy the above, performance is no issue. Multiple Visual Studio instances, IIS, SQL, Office - it all works just fine.
Running multiple copies of the same guest OS when it is a member of a domain/AD is tricky. If you need to do this you should read up on the sysprep.exe tool. Basically you can't just make a copy of the virtual disk, you need to take some special precautions.
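If you do go down that road, the usual pattern on Vista/7-era guests is to generalize the template VM before copying its disk (on XP, sysprep ships separately in deploy.cab and uses different switches, so treat this as a sketch):

    rem run inside the template guest before cloning the virtual disk;
    rem /generalize strips the machine SID, /oobe runs mini-setup on next boot
    C:\Windows\System32\sysprep\sysprep.exe /generalize /oobe /shutdown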
Virtual PC is very convenient and it was what I used for starters, but I have to say that VirtualBox seems to have overtaken it now. It was a bit rough in the beginning, but the last few versions have really gotten there.
VirtualBox is fully free, and it has better features than VPC2007 - the main one that made me switch was the support for high resolutions. VirtualBox runs fullscreen on my 1920x1080 monitor no problem.
It can also run virtual PC images, so switching was just a matter of installing virtualbox and adding my existing virtual PC disks to it.
An added benefit is that I can run the virtual images just as easily on my new mac as on the old pc.
The commercial options are no longer worth what they cost, IMHO.
One thing you might have to consider is the lack of support for multiple monitors within the VM. I really like using multiple monitors, one for my source, the rest for all the rest. As far as I know, this is not possible in Virtual PC. Aside from that I can't think of anything that should hold you back, it's something I have been considering as well.
VirtualBox from Sun is also a good choice. I am writing this from a Vista laptop with a virtualised Ubuntu dev environment.
One thing that VirtualBox is great for is its seamless mode, in which the guest OS application windows are presented as just windows on the host system, with a single common background (you get 2 status bars - one for Windows and one for Linux).
The Z-orders don't interleave (i.e. all guest windows appear on the same Z plane in the host window system, with their own Z-order within that plane), which can make it a bit odd, but you get used to it.
It is particularly useful if you need to build across many environments. VirtualBox is getting better and I now have an OpenSolaris environment and a FreeBSD one as well.
It is free as in beer which can be handy.
I actually run three development environments (and many test environments) as Windows guest virtual machines under an Ubuntu host - it's very good for keeping things separated and for being able to restore test environments to a known point. It's also handy since the backup is a simple directory copy on the host and you don't have to worry about recovering settings or re-installing applications, etc.
I prefer VMWare over Virtual PC for both performance and usability (keep in mind that's my opinion). You don't need the VMWare Workstation product to create a VM - check out EasyVMX here for a way to create easy VMs.
The one thing you'll miss though is VMWare Tools, which only comes with the Workstation product, not the Player. But VMWare has it for download here - I'm unsure of the legality of this even though it's an official download from VMWare; you may only be able to use it if you have the paid product.
I actually have a license for Workstation, it's just an earlier version and I prefer the latest Player.

Quick creation of fresh OS install for software testing

What do you recommend for quickly creating images for testing a software product (that needs hardware access - full USB port access)? Does virtualization cover this? I need to be able to quickly re-image the system to test from scratch again, and need good options for Windows and Mac OS.
Virtualization may work for you as long as it is only USB access.
VirtualBox is available with USB support either for "private use or evaluation" or commercially, and works on Win, Mac and Linux. USB support on Linux and Mac is somewhat sporadic though and does not work with all devices. VBox supports snapshots.
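If you go the VirtualBox route, USB devices can be captured per-VM either in the GUI or from the command line; a hedged example where the VM name, filter name and IDs are placeholders:

    rem list host USB devices and their vendor/product IDs
    VBoxManage list usbhost
    rem attach matching devices to the VM "TestVM" via a permanent filter
    VBoxManage usbfilter add 0 --target "TestVM" --name "scanner" --vendorid 04b8 --productid 0110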
VMWare has one free product called VMWare Server for Win and Linux, but I'm not sure to what extent USB support is included in their server products. For Mac there is VMWare Fusion, but that's not available for free; Fusion should work with most USB devices. Workstation products for Windows are more expensive. I think there is a trial version for all of them. All do snapshots.
I don't know how far Parallels (Mac) supports USB devices or snapshots.
You don't need snapshot functionality if you can afford some short downtime between re-imaging. You can shut down the VM and then just copy the disk image (which is nothing else but one or multiple regular files) and start the VM again. Snapshots can be reverted to a lot faster (without rebooting).
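For what it's worth, with VirtualBox the snapshot round trip is just two commands (the VM and snapshot names are examples; restoring requires the VM to be powered off first):

    rem take a snapshot of the clean install
    VBoxManage snapshot "TestVM" take "clean-install"
    rem ...run your tests, then roll back to the clean state
    VBoxManage snapshot "TestVM" restore "clean-install"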
If virtual machines will work for you, you can choose between Virtual PC, VMWare and VirtualBox.
Virtual PC supports a Windows host and Windows/Linux guests, although there are some caveats with regard to X resolution support.
VMWare supports Win, OS X and Linux host. It supports Win and Linux guests.
VirtualBox supports same hosts and guests as VMWare.
None of the three supports OS X as guest officially. The reason is that OS X is licensed only for Apple machines. However, there are some hacks that allow installing OS X under VMWare. It might be also possible to install it under VirtualBox or Virtual PC, although I have not seen specific instructions.
If virtualization is not good enough for you, you can use pre-created installation images or a disk imaging program.
For pre-created installation images for Windows, you can use the sysprep tool (search for sysprep or "system preparation tool"). I don't know if there are equivalent tools for OS X from Apple.
For disk imaging programs, I know quite a lot of people swear by Symantec Ghost. I personally have not used it, so I can't give you much info about it. There's also a list of disk imaging programs on Wikipedia, so you can evaluate those as well.
Hope that helps.
Whether virtualization is right for you depends on how much direct hardware access you actually need.
But if virtualization works for you, then VMWare offers products for Windows and Mac that support a snapshot feature.
Or there's also VirtualBox which works on Linux, Windows and Mac, also supports snapshots and is free.
I use VMWare Player for this sort of stuff. I've not tested it with the sort of access you discuss (since I mostly do apps rather than driver-level stuff) but the advantages are many, specifically being able to copy the VM when it's shut down for later restore to a specific point (sort of a poor man's snapshot) and being able to have lots of configurations without blowing the hardware budget.
It certainly provides USB virtualization and I would say it's the best bet for providing the full device access. I would suggest testing it since, if it provides the hardware support you need, it's a very good solution for the other reasons given. The only other (non-VM) suggestion I can think of which would match it would be hard disk image backups which can be restored at will.
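The "poor man's snapshot" really is just copying the VM's folder while it is shut down; something like this on the host (the paths are examples, and robocopy ships with Vista and later):

    rem mirror the powered-off VM's folder to a backup location
    robocopy "C:\VMs\TestVM" "D:\VM-Backups\TestVM-clean" /MIR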
I've used Virtual PC heavily for this kind of thing in Windows, without ever hitting any issues. It's free, which is always a bonus ;o)
Edit: Just re-read the question - not sure that it has USB support. Should tick all the other boxes though
CloneZilla is a great, free way to reimage machines.
I once worked for a company where we needed to test our software against various combinations of OS versions, service packs and other libraries our application depended on. For each identified combination we had a separate partition image created with the help of Norton Ghost (DOS version). All images were put on a server. Whenever a tester got the next version of the system core to test, they would just methodically restore each applicable image, install the application, test it and report on it.
This approach, though a straightforward one, allows full access to the hardware and provides you with a 100% native installation.
Nowadays I still use this approach for my private PC. I'm sure you can also try newer options like Hyper-V; we use it where I work now. When we tried to install Team Foundation Server (the process is far from easy) we had to abandon the install at some point and just restore a virtual machine from an image, because we realized we had made a few mistakes during installation. It's conceptually the same approach, and it saves a great deal of time. I'm not really sure, though, how much hardware access a virtual PC gives you.
You can try both approaches.
P.S. Today there are two Ghost products, Symantec Ghost (good old one) for corporate use and Norton Ghost for home use (bloatware in my opinion). If you decide to try this option, I would recommend the Symantec Ghost (part of Ghost Solution Suite).
If you can't just use a virtual machine and take snapshots of the fresh install then do a fresh install onto real hardware and use a disk imaging tool (Ghost comes to mind).
If cost is a factor then there are a few Linux live CDs that will do what you want. This comes to mind. Put a second disk in the machine and image to/from the second disk unless you have fast networks and network storage; it's way too slow to go to and from the network regularly. If you're using a Linux live CD then you can actually format the second disk as ext3 so Windows won't detect it and assign a drive letter.
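From a Linux live CD the actual imaging step is typically a dd of the Windows partition onto the second (ext3) disk; the device names below are examples, so check them with fdisk -l first, because dd will happily overwrite the wrong disk:

    # mount the ext3 scratch disk, then image the Windows partition to a file
    mount /dev/sdb1 /mnt/backup
    dd if=/dev/sda1 of=/mnt/backup/winxp-clean.img bs=4M
    # restoring is the same command with if= and of= swapped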
If you have a dedicated workstation for testing then I would highly recommend Symantec Ghost. Simply get the workstation to the clean state, reboot into Ghost and 'take a snapshot' of the HD or partition. You can then restore the HD or partition from the image, say from CD or multicast over a network connection from another PC.
I have used it for years now, even to the point of automating the build of 60 test workstations (at the same time).

Virtual PC high CPU usage problems

I run VPC 2007 on my Vista Business laptop with 4GB of RAM. I use VPC to run Windows XP and maintain a VS2003 web project. At first everything was great. I assigned the VPC 512MB and did my work as usual. I also run ReSharper and VisualSVN. Lately, the act of scrolling in a page causes the CPU to spike above 50%, sometimes near 100%. This freezes my machine occasionally and is frustrating. Typing code sometimes does the same thing.
I have experimented with changing allocated memory, disk space, turning on/off the paging file, uninstalling ReSharper and Visual SVN. There should be no reason this thing is slow with all the memory I have on this laptop! I don't have anything running on it but VPC at any one time.
I'm wondering if I should just install VS2003 on my Vista machine and deal with any incompatibility problems.
Any suggestions?
Try VirtualBox.
VirtualBox is a family of powerful x86 virtualization products for enterprise as well as home use. Not only is VirtualBox an extremely feature rich, high performance product for enterprise customers, it is also the only professional solution that is freely available as Open Source Software under the terms of the GNU General Public License (GPL).
If it were me, I'd run the VS.NET 2003 IDE on Vista natively. Just check out this page with the problems you might have:
http://msdn.microsoft.com/en-us/vs2005/bb188244.aspx
As far as your CPU goes, it could be a video driver/display issue. Have you tried turning Aero Glass on/off on your Vista machine to see if that changes things? Is the color depth of your desktop the same both in the VPC and on your host? Have you updated your video drivers recently?
I recommend VirtualBox. Every time I use VPC I soon give up because the performance is terrible. I run VirtualBox with a Vista virtual PC allocated 1.5GB of RAM and it runs really well. In fact I don't really notice much slowdown compared to running natively.
First thing I'd suggest doing is running Process Explorer and Process Monitor to find out what's really eating the CPU. If it used to run fine, switching to another VM might not fix anything.
I'd bet VisualSVN is the problem. I had the same problem on a dual-core system with 6GB of RAM. I eventually just uninstalled it because it kept crashing the IDE.
BTW, I'm running Server 2003 64-bit.
You probably have VPC 2007 running the active virtual machine at maximum speed. Go to Options in the console menu and change this setting to divide CPU time equally among all virtual machines, and your problems will disappear!
