What do you recommend for quickly creating images for testing a software product (that needs hardware access - full USB port access)? Does virtualization cover this? I need to be able to quickly re-image the system to test from scratch again, and need good options for Windows and Mac OS.
Virtualization may work for you as long as it is only USB access.
VirtualBox is available with USB support either for "private use or evaluation" or commercially, and works on Windows, Mac and Linux. USB support on Linux and Mac is somewhat spotty, though, and does not work with all devices. VirtualBox supports snapshots.
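Since the question is specifically about USB, here's a minimal sketch of capturing a device into a VirtualBox VM from the command line (Python driving VBoxManage; the VM name and the vendor/product IDs are placeholders, so list your real devices with "VBoxManage list usbhost" first):

    # Minimal sketch: capture a specific USB device into a VirtualBox VM via
    # VBoxManage. The VM name and the vendor/product IDs are placeholders;
    # find your real IDs with "VBoxManage list usbhost".
    import subprocess

    VM = "TestRig"  # hypothetical VM name

    subprocess.run([
        "VBoxManage", "usbfilter", "add", "0",
        "--target", VM,
        "--name", "device-under-test",
        "--vendorid", "1234",   # placeholder vendor ID
        "--productid", "abcd",  # placeholder product ID
    ], check=True)

With a filter like that in place, the device is grabbed by the VM whenever it's plugged in.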
VMware has one free product, VMware Server, for Windows and Linux, but I'm not sure to what extent USB is supported in their server products. For the Mac there is VMware Fusion, but that's not available for free; Fusion should work with most USB devices. The Workstation products for Windows are more expensive. I think there is a trial version of all of them. All of them do snapshots.
I don't know to what extent Parallels (Mac) supports USB devices or snapshots.
You don't need snapshot functionality if you can afford a short downtime between re-imagings: shut down the VM, copy the disk image (which is nothing but one or more regular files), and start the VM again. Snapshots can be reverted a lot faster, though (no reboot needed).
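A rough sketch of that copy-the-image approach (VM name and paths are hypothetical; a VMware .vmdk works the same way, since the disk is just a regular file):

    # Rough sketch of the "poor man's snapshot" with VirtualBox: power the
    # VM off, copy its disk image aside, and later copy it back to reset to
    # a clean state. Names and paths are placeholders.
    import shutil
    import subprocess

    VM = "TestRig"
    DISK = "/vms/TestRig/TestRig.vdi"           # hypothetical disk path
    CLEAN = "/vms/backups/TestRig-clean.vdi"    # pristine copy kept aside

    subprocess.run(["VBoxManage", "controlvm", VM, "poweroff"], check=True)
    shutil.copy2(DISK, CLEAN)    # save the clean state while the VM is down
    # ...run your destructive tests, then reset:
    shutil.copy2(CLEAN, DISK)
    subprocess.run(["VBoxManage", "startvm", VM], check=True)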
If virtual machines will work for you, you can choose between Virtual PC, VMWare and VirtualBox.
Virtual PC supports a Windows host and Windows/Linux guests, although there are some caveats with regard to X resolution support.
VMware supports Windows, OS X and Linux hosts, with Windows and Linux guests.
VirtualBox supports the same hosts and guests as VMware.
None of the three officially supports OS X as a guest. The reason is that OS X is licensed only for Apple machines. However, there are some hacks that allow installing OS X under VMware. It might also be possible to install it under VirtualBox or Virtual PC, although I have not seen specific instructions.
If virtualization is not good enough for you, you can use pre-created installation images or a disk-imaging program.
For pre-created installation images for Windows, you can use the sysprep tool (search for sysprep or "system preparation tool"). I don't know if there are equivalent tools for OS X from Apple.
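As a hedged sketch, generalizing a reference install before capturing it looks roughly like this on Vista and later (on XP, sysprep ships separately in deploy.cab on the install CD):

    # Hedged sketch: generalize a Windows reference install before capturing
    # it as an image. On Vista and later, sysprep.exe lives under System32
    # as shown. This strips machine-specific state (SID, activation, driver
    # bindings) and shuts the machine down, ready for imaging.
    import subprocess

    subprocess.run([
        r"C:\Windows\System32\Sysprep\sysprep.exe",
        "/generalize", "/oobe", "/shutdown",
    ], check=True)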
For disk-imaging programs, I know quite a lot of people swear by Symantec Ghost. I personally have not used it, so I can't give you much info about it. There's also a list of disk-imaging programs on Wikipedia, so you can evaluate those as well.
Hope that helps.
Whether virtualization is right for you depends on how much direct access you actually need.
If virtualization works for you, though, VMware offers products for Windows and Mac that support a snapshot feature.
There's also VirtualBox, which works on Linux, Windows and Mac, also supports snapshots, and is free.
I use VMWare Player for this sort of stuff. I've not tested it with the sort of access you discuss (since I mostly do apps rather than driver-level stuff) but the advantages are many, specifically being able to copy the VM when it's shut down for later restore to a specific point (sort of a poor man's snapshot) and being able to have lots of configurations without blowing the hardware budget.
It certainly provides USB virtualization and I would say it's the best bet for providing the full device access. I would suggest testing it since, if it provides the hardware support you need, it's a very good solution for the other reasons given. The only other (non-VM) suggestion I can think of which would match it would be hard disk image backups which can be restored at will.
I've used Virtual PC heavily for this kind of thing in Windows, without ever hitting any issues. It's free, which is always a bonus ;o)
Edit: Just re-read the question - not sure that it has USB support. Should tick all the other boxes though
CloneZilla is a great, free way to reimage machines.
Once I worked for a company where we needed to test our software against various combinations of OS versions, service packs and other libraries our application depended on. For each identified combination we had a separate partition image, created with the help of Norton Ghost (the DOS version). All images were put on a server. Whenever a tester got the next version of the system core to test, they would methodically restore each applicable image, install the application, test it and report.
This approach, though a straightforward one, allows full access to the hardware and gives you a 100% native installation.
Nowadays I still use this approach for my private PC. You could also try newer technology like Hyper-V; we use it where I work now. When we tried to install Team Foundation Server (the process is far from easy) we had to abandon the installation at some point and just restore a virtual machine from an image, because we realized we had made a few mistakes along the way. It's conceptually the same approach, and it saves a great deal of time. I'm not really sure, though, how compatible a virtual PC is in terms of hardware access.
You can try both approaches.
P.S. Today there are two Ghost products: Symantec Ghost (the good old one) for corporate use, and Norton Ghost for home use (bloatware, in my opinion). If you decide to try this option, I would recommend Symantec Ghost (part of the Ghost Solution Suite).
If you can't just use a virtual machine and take snapshots of the fresh install then do a fresh install onto real hardware and use a disk imaging tool (Ghost comes to mind).
If cost is a factor then there are a few Linux live CDs that will do what you want. This comes to mind. Put a second disk in the machine and image to and from the second disk unless you have fast networks and network storage; it's way too slow to go to and from the network regularly. If you're using a Linux live CD, you can also format the second disk as EXT3 so Windows won't detect it and assign it a drive letter.
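For the imaging itself, a sketch with plain dd from the live CD (device and mount paths are assumptions; double-check which disk is which with "fdisk -l" before running anything like this):

    # Sketch of raw imaging from a Linux live CD to the second disk with dd.
    # Device and mount paths are placeholders.
    import subprocess

    SOURCE = "/dev/sda"                       # disk with the clean install
    IMAGE = "/mnt/second-disk/clean.img"      # file on the EXT3 second disk

    # Take the image; swap if= and of= to restore it later.
    subprocess.run([
        "dd", f"if={SOURCE}", f"of={IMAGE}", "bs=4M", "conv=sync,noerror",
    ], check=True)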
If you have a dedicated workstation for testing then I would highly recommend Symantec Ghost. Simply get the workstation to the clean state, reboot into Ghost and 'take a snapshot' of the HD or partition. You can then restore the HD or partition from the image, say from CD, or multicast it over a network connection from another PC.
I have used it for years now, even to the point of automating the build of 60 test workstations (at the same time).
I'd like to ask anybody who has built a virtualized VS2010 environment in VirtualBox or VMware, which one was able to work out of the box without too much tweaking? Or both need workarounds to get stuff working?
Both are fine, as long as you install the respective tools and drivers provided for the guest OS.
If you're using VMware Workstation, you can get even more out of the environment by installing Visual Studio on the host PC and using the guest VM for debugging: if your application crashes, you can actually rewind to before the crash and step through your code with the same heap and stack it had before it crashed!
Basically, I suggest going with VMWare Workstation. It's pretty cheap (assuming you get paid to program) and has many, many awesome features that you'll come to love. If you're a hobbyist/student programmer however, you'll likely find VirtualBox to be a little more functional than the free VMWare Player.
As far as performance goes, Intel and AMD have both shipped chips with hardware virtualization since 2005/2006 respectively. This is called VT-x (Intel) or AMD-V (AMD), and it often has to be enabled in the BIOS on older machines.
Basically this means the CPU handles memory and I/O virtualization in hardware, while specialist drivers (e.g. VMware Tools) are installed to improve graphics and mouse performance; effectively the resulting VM has near-native performance with minimal overhead.
Hope that helps!
You can work with a VS2010/Windows virtualized environment with no problems.
I've worked with that combination and had no problems. Both VMware and VirtualBox have been stable for years now, and Windows OS virtualization works properly.
Obviously you can see some performance loss, because a virtualized OS has more bottlenecked access to resources than the host does, but current CPUs from Intel and AMD have virtualization instruction extensions which accelerate virtualization operations.
So... Just go ahead!
I don't know your requirements, but there is also a great alternative if you're using Windows 7.
You can create a VHD file and boot from it.
With a few more steps, you can create a base VHD with everything you need, mark it read-only, and create as many differencing disks as you want (see the sketch after the list below).
The drawbacks of this method are:
- it's a bit tricky to create the base and differencing disks, because you have to do it from the Windows Setup console (but Google can help you)
- there is a small performance impact on disk I/O (but lower than in a virtualization environment)
- you can run only one system at a time (although nothing stops you from installing virtualization software inside it)
- you can't have your "host" and its usual tools (corporate email, etc.) alongside
but at least the performance will be much better than with virtualization software.
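As a hedged sketch of the disk-creation step (paths and sizes are placeholders; you still need to install Windows into the base VHD and add a bcdedit boot entry per differencing disk):

    # Hedged sketch of the base + differencing VHD setup on Windows 7, driven
    # through a diskpart script (run elevated). Paths and the size in MB are
    # placeholders. Boot entries are added afterwards with bcdedit, e.g.
    # "bcdedit /copy {current}" then "bcdedit /set {guid} device vhd=[C:]\vhds\dev1.vhd"
    # (and the same for osdevice).
    import subprocess
    import tempfile

    SCRIPT = (
        'create vdisk file="C:\\vhds\\base.vhd" maximum=40960 type=fixed\n'
        'create vdisk file="C:\\vhds\\dev1.vhd" parent="C:\\vhds\\base.vhd"\n'
    )

    with tempfile.NamedTemporaryFile("w", suffix=".txt", delete=False) as f:
        f.write(SCRIPT)
        script_path = f.name

    subprocess.run(["diskpart", "/s", script_path], check=True)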
I already have experience with setting up virtual machines, running them and other minor tasks. I'm a gamer, so I won't get rid of Windows (for now at least...), but I do want to be a great programmer and to be involved with the open-source community.
I'd like to know whether it's a good idea to do my programming in Linux through a virtual machine, versus giving it a partitioned section of the HDD. I'd like to know about performance pros and cons, and functionality.
All responses are appreciated, thanks in advance.
The type of programming I intend to dive into:
Android dev, web dev, desktop dev... more Android and web right now though.
So I'm looking at C#, C, C++, Java, PHP, HTML, MySQL... off the top of the dome.
I do web design as well, so Dreamweaver is added as an "essential". But I'm sure I can do the Dreamweaver files and upload them to the server after programming in Linux... right?
And any info on IDEs in Linux for the above-mentioned languages is appreciated, but I would prefer going the coding route and understanding the essence of what's happening "under the covers".
Thanks to all for reading, I appreciate it.
Hope this isn't confusing :S
There is an easier solution.
I still have to use Windows for Symbian programming, so I use Wubi and Ubuntu to provide my dual boot into Linux. Wubi uses a single large file, so there's no need to worry about or mess with creating a partition.
I have used it for 18 months with no data loss and no worries.
There is also another tool called andlinux:
http://www.andlinux.org/
It uses coLinux to run Linux as a program inside Windows.
A couple things:
If you're using an IDE, there's no point to coding on Linux. Linux is nice for programming because the command line tools are awesome. Netbeans and Eclipse both work fine on Windows. All you'd be missing is makefiles (which IDEs don't use anyway).
Using a virtual machine would be annoying (working with the window and stuff) and slow. Try AndLinux if you want to have Linux running in Windows. It sets up X and Pulseaudio for you, so all of your programs will appear to be native. It's basically a way to run Ubuntu as a Windows service (all Ubuntu packages for your architecture are installable).
If you just want the fun of Linux command line programs without access to all of Ubuntu, cygwin is smaller and might be faster.
If by "Dreamweaver files", you mean HTML/PHP/CSS, then yes, you can just upload them to the server. As far as I know, the only ASP or ASP.net compatible server is Microsoft's, but why use that anyway?
EDIT: SO didn't give me enough space in the comments to answer your question...
AndLinux and Cygwin are basically just better ways to do your "virtual machine" idea.
Cygwin adds a posix layer to Windows (basically everything you need to compile Unix/Linux/BSD programs). This means that you can generally take a Linux program and just compile it on Windows and have it work. They also have repositories, but in my experience, the cygwin installer is slow and hard to use.
AndLinux runs the Linux kernel as a Windows service, giving you an experience similar to running it in VirtualBox or another virtualization program. However, it also sets up X (the graphics layer for Linux) and PulseAudio (a sound system that lets you run sound over a network), so that when you run Linux programs they act and sound like native programs. I also like AndLinux better because you have access to all of Ubuntu's programs, and apt-get is easier to use than cygwin's installer. Also, if you use AndLinux and later decide to go 100% Linux, you're basically already using it that way.
What I'm getting at is: If you want to run Linux in a virtual machine, don't. Just install AndLinux. It will be faster and it's much easier to work with (since everything is just a normal window).
Here's an example of the difference:
Screenshot of AndLinux: the program in the bottom-right corner is running in AndLinux. Notice how it just looks like a badly themed Windows program? Compare that to something like this, where you have another desktop in a window.
And still, there's no reason to virtualize NetBeans. It runs natively on Windows, so you gain nothing and lose a lot of speed.
If you're interested in Android development and you want to use Linux, then I would recommend you do your development in Eclipse. Eclipse is available for Linux and if you get Ubuntu then Eclipse is amazingly easy to install. I used VirtualBox + Ubuntu + Eclipse for several projects I worked on. If you decide that Linux is not for you and your project was in Eclipse then you will have no problem switching back to Windows since Eclipse is available for both operating systems.
The ONLY problem I had was the screen size on the virtual machine... if you have a big screen and you use a virtual machine then you might get limited to a fraction of your actual screen resolution. It's very easy to install Linux on a second partition, so I would just recommend you go with a second partition if you want to fully utilize the size of your monitor.
My setup is sort of the opposite: I run Linux as my main OS, both at work and at home, and I have Windows in a virtual machine. On a modern computer with adequate memory, the performance of development tools is not a problem. I work with Visual Studio in the virtual machine, and I have seen few performance issues. (But note that this is on a fast computer, and that you may need more memory than otherwise, since you are running two OSes at the same time. On an old computer with less memory it can become unbearable.)
Dual-boot, where you have to restart the computer to switch OS, doesn't work well for me. It takes way too much time to switch, and I really need to switch back and forth. Having Windows in a window works much better for me, and you can maximize that "Windows window", so it looks like you're just running Windows.
One thing you may want to look at is to have Linux running in a VM, then configuring Samba to allow the host to network-mount pieces of the Linux filesystem so that you can operate using Windows tools, and have Linux running the server processes (e.g., httpd). Alternatively, I'm sure that there are shell extensions for using FTP, NFS, or SSH/SFTP servers from within Explorer, but I've not looked at any for a long time.
If you should happen to need to use graphical Linux tools then you can use the X server found in cygwin for that.
The downside of this plan is that Samba can be a bit tricky to configure, but you get to use the Windows tools you're already familiar with.
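Once the Samba side is up, the Windows side is a one-liner; here's a hedged sketch with hypothetical host, share and user names:

    # Hedged sketch of the Windows side, once Samba exports the Linux source
    # tree: map the share to a drive letter so your Windows tools can edit
    # the files the Linux VM serves. All names here are placeholders.
    import subprocess

    subprocess.run([
        "net", "use", "Z:", r"\\linux-vm\src",
        "/user:dev",           # Samba account on the Linux box
        "/persistent:yes",     # re-map the drive at every logon
    ], check=True)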
I had no issues running Ubuntu via VMware. You can easily switch to full-screen mode at any time. Strongly recommended. One shortcoming is that Linux will not be exposed to the full potential of your hardware; Compiz Fusion failed to work as a result.
Given that you're a gamer, I'm thinking your machine should be fast enough to run Linux in a VM. Best to try out the VM before messing with disk partitions.
I use physically separate machines to run Linux and Windows (and Mac OS X). This means that I don't have to reboot to do something different, and each system gets the full power of the hardware.
Disadvantages: more desk space used, more time and money spent maintaining hardware (though if you do a rolling upgrade, this is mitigated - Linux runs most happily on not-quite-new machines). Doesn't work so well if you like carrying laptops around.
Be aware that VMs universally don't give you full graphics acceleration. This can be a non-issue (many programs must cope with Intel GMA anyway), or it can be a showstopper. Your choice.
After having a dev PC's hard drive corrupt itself, I'm considering the idea of making my development environment fully Virtual PC based.
The core items would be:
- XP Pro 32
- IIS
- VS2003
- VS2008
- SQL Server 2005
- Office 2003
The primary source would reside on a server in SVN, with only a local copy on the VPC.
This would be for Windows based web and desktop development.
Assuming that the host machine has decent performance and provides hardware virtualization, are there any known gotchas with such a setup, i.e. main pros and cons? Any performance issues or other issues that make this a good or bad idea?
I'd like to go this route so I can keep a full backup VPC that can be put on a new PC if one fails and is replaced, or copied to a laptop as needed for offsite work, etc. With the new Virtual PC features of Windows 7 this seems like it may be even better going forward, too.
Would like to get some feedback on this before we go down that road...
I wouldn't recommend Virtual PC because the performance is pretty disappointing compared to VMWare.
I've used a virtual development machine inside VMWare Workstation and VMWare Fusion on Mac for quite a while, and it works very well. It feels as if you're running on a dedicated machine.
My recommendations are:
Use a 64-bit OS as your host OS (Vista x64, Windows 7 64-bit, Mac OS X Leopard)
Have at least 6GB of RAM on your physical machine
Allocate 3GB of RAM to your VM for 32-bit, or more for a 64-bit guest OS
Pre-allocate the disk space for your guest OS (no auto-grow; see the sketch below)
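On that last point, a hedged sketch of pre-allocating a disk with vmware-vdiskmanager, which ships with Workstation (the path, size and adapter type are placeholders):

    # Hedged sketch: create a fully pre-allocated guest disk instead of an
    # auto-growing one. Path, size and adapter type are placeholders.
    import subprocess

    subprocess.run([
        "vmware-vdiskmanager",
        "-c",                  # create a new virtual disk
        "-s", "40GB",          # allocate the full size up front
        "-a", "lsilogic",      # virtual SCSI adapter type
        "-t", "2",             # type 2 = preallocated, single file
        r"D:\vms\dev\dev.vmdk",
    ], check=True)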
Another advantage is that you can take your VM from a Windows-based VMWare Workstation to a Mac-based VMWare Fusion (and the other way around) without any problems.
I have been running multiple virtual development environments in MS Virtual PC and VirtualBox for two years now. I am doing mostly ASP.NET applications; some of the solutions are relatively large and use large databases, which I also run inside the VM.
My observations based on this:
It is a good idea for exactly the reasons you mention and it works fine. Go for it!
768 MB of RAM for the VM is enough, but more is better.
Have a multi-core CPU.
Install the virtual machine additions for the guest OS. (This is basically like installing the proper drivers for your "virtual" hardware, and seems to be more important for performance than having hardware virtualisation support).
If possible, have the VM disk image on a separate physical disk from the host OS.
Use Virtualbox. It's free, and being developed rapidly. It might already be the best.
If you can satisfy the above, performance is no issue. Multiple Visual studio instances, IIS, SQL, Office, works just fine.
Running multiple copies of the same guest OS when it is a member of a domain/AD is tricky. If you need to do this you should read up on the sysprep.exe tool. Basically, you can't just make a copy of the virtual disk; you need to take some special precautions.
Virtual PC is very convenient and it was what I used for starters, but I have to say that VirtualBox seems to have overtaken it now. It was a bit rough in the beginning, but the last few versions have really gotten there.
VirtualBox is fully free, and it has better features than VPC 2007. The main one that made me switch was the support for high resolutions: VirtualBox runs fullscreen on my 1920x1080 display with no problem.
It can also run Virtual PC images, so switching was just a matter of installing VirtualBox and adding my existing Virtual PC disks to it.
An added benefit is that I can run the virtual images just as easily on my new Mac as on the old PC.
The commercial options are not (anymore) worth what they cost, IMHO.
One thing you might have to consider is the lack of support for multiple monitors within the VM. I really like using multiple monitors, one for my source, the rest for all the rest. As far as I know, this is not possible in Virtual PC. Aside from that I can't think of anything that should hold you back, it's something I have been considering as well.
Regards,
Sebastiaan
VirtualBox from Sun is also a good choice. I am writing this from a Vista laptop with a virtualised Ubuntu dev environment.
One thing that VirtualBox is great for is its seamless mode, in which the guest OS application windows are presented as just windows on the host system, with a single common background (you get two status bars, one for Windows and one for Linux).
The Z-orders don't interleave (i.e. all guest windows appear on the same Z plane in the host window system, with their own Z-order within that plane), which can make it a bit odd, but you get used to it.
It is particularly useful if you need to build across many environments. VirtualBox is getting better and I now have an OpenSolaris environment and a FreeBSD one as well.
It is free as in beer which can be handy.
I actually run three development environments (and many test environments) as Windows guest virtual machines under an Ubuntu host. It's very good for keeping things separated and for being able to restore test environments to a known point. It's also handy since the backup is a simple directory copy on the host, and you don't have to worry about recovering settings or re-installing applications, etc.
I prefer VMWare over Virtual PC for both performance and usability (keep in mind that's my opinion). You don't need the VMWare Workstation product to create a VM - check out EasyVMX here for a way to create easy VMs.
The one thing you'll miss, though, is VMware Tools, which only comes with the Workstation product, not the Player. But VMware has this for download here. I'm unsure of the legality of this even though it's an official download from VMware; you may only be able to use it if you have the paid product.
I actually have a license for Workstation, it's just an earlier version and I prefer the latest Player.
Recently the buzz of virtualization has reached my workplace, where developers are trying out virtual machines on their computers. Earlier I'd been hearing from several different developers about setting up virtual machines on their desktop computers for the sake of keeping their development environments clean.
There are plenty of Virtual Machine software products in the market:
Microsoft Virtual PC
Sun VirtualBox
VMWare Workstation or Player
Parallels Inc.'s Parallels Desktop
I'm interested to know how you use virtualization effectively in your work. My question is how do you use Virtual Machines for day-to-day development and for what reason?
I just built a real beefy machine at home so that I could run multiple VMs at once. My case is probably extreme though, but here is my logic for doing so.
Testing
When I test, particularly a desktop app, I typically create multiple VMs, one for each platform that my software should run on (Windows 2000/XP/Vista etc). If 32- and 64-bit flavors are available, I also build one of each. I also play with the VM hardware settings (e.g. lots of RAM, little RAM, 1 core, 2 cores, etc). I've found plenty of little bugs this way that definitely would have made it into the wild had I not used this approach.
This approach also makes it easy to play with different software scenarios (what happens if the user installing the program doesn't have .NET 3.5 SP1? What happens if they don't have component XXX? etc.).
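A sketch of how that restore-and-test cycle can be scripted against VirtualBox (the VM names, the "clean" snapshot and the test hook are all hypothetical):

    # Sketch of driving a test matrix with VirtualBox: revert each VM to a
    # clean snapshot, boot it headless, run the tests, power it off.
    # VM names, the "clean" snapshot and run_tests() are placeholders;
    # plug in your own harness.
    import subprocess

    VMS = ["Win2000", "WinXP-32", "WinXP-64", "Vista-32", "Vista-64"]

    def vbox(*args):
        subprocess.run(["VBoxManage", *args], check=True)

    def run_tests(vm):
        """Placeholder: point your installer/test harness at the VM here."""

    for vm in VMS:
        vbox("snapshot", vm, "restore", "clean")    # back to the pristine state
        vbox("startvm", vm, "--type", "headless")   # no GUI window needed
        run_tests(vm)
        vbox("controlvm", vm, "poweroff")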
Development
When I develop, I have one VM running my database servers (SQL 2000/2005/2008). This is for two reasons. First, it is more realistic: in a production environment your app is probably not running on the same box as the DB, so why not replicate that when you develop? Also, when I'm not developing (remember, this is also my home machine), do I really need all of those database services running? Yes, I could turn them on and off manually, but it's so much easier to switch a VM on.
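That convenience scripts nicely too; a small sketch with VirtualBox (the VM name is a placeholder):

    # "Switch it on when I need it", sketched with VirtualBox: savestate
    # suspends the database VM to disk, and the next start resumes it with
    # all the database services already warm. The VM name is hypothetical.
    import subprocess

    DB_VM = "SQL-Servers"

    # Done for the day: suspend the whole database box.
    subprocess.run(["VBoxManage", "controlvm", DB_VM, "savestate"], check=True)

    # Next session: resume exactly where it left off.
    subprocess.run(["VBoxManage", "startvm", DB_VM, "--type", "headless"], check=True)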
Clients
If I want to show a client some web work I've done, I can put just a single VM into the DMZ and he can log into the VM and play with the web project, while the rest of my network/computer is safe.
Compatibility
Vista 64 is now my main machine. Any older hardware/software I own will not play nicely with that OS. My solution is to have Windows XP 32 as a VM for all of those items.
Here's something that hasn't been mentioned yet.
Whenever a project enters maintenance mode (aka abandoned), I create a VM with all the tools, libraries, and source code necessary to build the project. That way, if I have to come back to it a year later, I won't get bitten in the ass by any upgraded tools or libraries on my workstation.
When I started at my current company, most support/dev/PM staff would run Virtual PC with 1-3 VMs on their desktop for testing.
After a few months I put together a proposal, and now we use a VMware ESXi server running a pool of virtual machines (all on 24/7) with different environments for our support staff to test customer problems and reproduce issues on. We have VMs of Windows 2000/XP/Vista, each with Office 2000/2002/2003/2007 installed (so that's 12 VMs), plus some more general test VMs, and some Server 2003/2008 machines running Citrix, Terminal Services, etc. Basically, whenever we hit a new customer configuration that we need to debug, and it's likely other customers also have that configuration, I'll set up a VM for it. (For example, we're only using three 64-bit VMs at the moment; mostly it's 32-bit.)
On top of that, the same server runs an XP VM that I use for building installers (InstallShield, WiX), debugging (VS 2005) and localization (Lingobit), as well as a second VM that our developers use for automated testing (TestComplete).
The development and installer VMs have been allocated higher priority and are both configured as dual-CPU VMs with 1 GB of memory. The remaining VMs have equal priority and 256 MB to 1 GB of RAM.
Everything runs on a dual quad-core Xeon with 8 GB of RAM, running ESXi with hardware RAID (4x1 TB in RAID 10).
For little more than a US$2.5k investment we've improved productivity tenfold (imagine the downtime while a support lackey installs an older version of Office on their desktop to replicate a customer problem, or the time when I can't use my desktop because we're building installers). The next step will be to double the RAM to 16 GB as we add more memory-hungry Server 2008 and Vista VMs.
We still have the odd VM on our desktops (I've got localized versions of Windows, Ubuntu and Windows 7 running under VMware Workstation for example) but the commonly/heavily used configurations have been offloaded to a dedicated server that we can all remotely connect into. Much, much easier.
Virtualisation (with snapshots or non-persistent disks) is really useful for testing software installation in a known clean configuration (i.e. nothing left over from previous buggy installs of your software).
Having your development box in a single file (as a virtual machine) makes it much easier to back up and restore if an issue occurs.
Other than that, you can also carry your portable development box around between different machines, since you aren't restricted to the single particular machine you usually work on.
Not only that, but you can test on different operating systems at once, with a single OS installed in each virtual machine file you have.
Believe me, this will save you quite a lot of hassle when doing the jobs I mentioned above.
Another nice use case for VMs is to create a virtual network of machines. For example you can bring up machines running the different tiers of your application stack, each running in its own VM. Think of it as a poor man's datacentre.
These VMs can also appear available on your physical network, so you can use RDP or similar to get a remote terminal session with them.
You can have a beefy machine (lots of memory) running these VMs, while you access them remotely from another machine such as a laptop, or whichever machine you have with the best screen.
I use a VM under Windows to run Linux. Even though there's already a version of emacs for windows, using it in Linux just feels more gratifying for some reason.
Maintaining shelved computers
I have the situation where schools in my region are closed down but their finance system has to be maintained for up to 2 years to ensure all outstanding bills are paid. This used to be handled by maintaining the hardware from the mothballed schools which had some problems:
This wasted scarce hardware resources and took up a lot of physical space.
Finance officers had to be physically present at the hardware to work on each system.
Today I host each mothballed school in its own VirtualBox VM inside a single physical host. Each individual system is accessed by RDP on the host's IP address, but with its own port number, and the original security of each school is maintained.
Finance officers can now work on the mothballed schools without having to travel to where they are physically located, there is more physical space in the server room and backup of all the mothballed schools at once is a simple automated process.
With each mothballed school in its own VirtualBox VM there is no way for cross-contamination of data between systems. Many thousands of dollars' worth of hardware has also been freed up for redeployment.
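One hedged way to wire up the "one host IP, one port per school" arrangement, if the guests sit behind VirtualBox NAT (names and ports are placeholders):

    # Sketch of per-VM RDP port forwarding with VirtualBox NAT: each guest's
    # internal RDP port 3389 is exposed on its own host port. VM names and
    # ports are placeholders; the VMs must be powered off while modified.
    import subprocess

    SCHOOLS = {"school-a": 3390, "school-b": 3391, "school-c": 3392}

    for vm, host_port in SCHOOLS.items():
        subprocess.run([
            "VBoxManage", "modifyvm", vm,
            "--natpf1", f"rdp-{vm},tcp,,{host_port},,3389",
        ], check=True)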
Virtualisation appears to be the perfect solution to this problem.
I used the virtualization approach with VMware Server when the task in front of me was to test a clustered WebSphere Application Server environment. After setting up VMware Server I created a new virtual machine and did all the software installations I would need (WebSphere App Server, Oracle, WebSphere Commerce, etc.), after which I shut down the VM and copied the virtual hard disk image over to two different files, one as a clone VM and another as a backup.
I then created a new VM and assigned it one of the copied disk images, so I had two systems up and running, which allowed me to test the clustered-environment scenario. I took a snapshot of the VM through VMware, and if I goofed up any activity I would revert to the snapshot, going back to the previous state and increasing my productivity instead of having to work out what to undo. The backup disk image could also be used if I needed to revert to a very old state, instead of having to start from scratch.
The snapshot functionality that exists in both VMware and Microsoft's Virtual PC/Server is reason enough to consider virtualization for scenarios where you think you might make breaking changes that would not be easy to revert.
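For reference, the same checkpoint-and-revert cycle can be scripted with VMware's vmrun tool; a hedged sketch (the .vmx path and snapshot name are placeholders, and "-T ws" targets Workstation):

    # Sketch of the snapshot/revert cycle with VMware's vmrun CLI. The .vmx
    # path and the snapshot name are placeholders.
    import subprocess

    VMX = "/vms/websphere-node1/node1.vmx"

    def vmrun(*args):
        subprocess.run(["vmrun", "-T", "ws", *args], check=True)

    vmrun("snapshot", VMX, "pre-install")           # checkpoint before risky work
    # ...install, configure, goof something up...
    vmrun("revertToSnapshot", VMX, "pre-install")   # back to the known-good state
    vmrun("start", VMX)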
From what I know, there is nothing like Parallels on the Mac, though more for work than for testing.
The integration (with "Coherence", your VM is not running "in a window" of your host system; all programs in the guest system get their own proper window in the host system) is splendid and lets you fill all (ALL!) the gaps:
My coworker has it configured so that Outlook (there is nothing like Outlook for Mac OS X) in Windows pops up when he clicks a "mailto:" link on a web page browsed with Firefox on the Mac!
In the other direction, if he gets sent a PDF, he double-clicks the attachment in Outlook (in Windows), which opens the PDF file in the Mac's built-in PDF viewer.
VirtualBox also offers this window-separation feature (at least when Windows is running in the VM on Linux), which is really useful for work.
For testing etc. of course, there is nothing like a cleanly separated environment.
We have a physical server dedicated to hosting virtual machines in our development environment. The virtual machines are brought up and torn down on a regular basis and are used for testing software on known Standard Operating Environments.
It is also really helpful when we want an application to run on a domain that is different from the development environment's.
Also, the organisation I am working for is in the planning stage of creating a large virtual testing ground. This will be a large grid of machines, sitting on its own network, and all of the organisation's internal staff, contractors and third-party vendors will be able to stage their software for testing purposes prior to implementing it in the production environment. The virtual machines will mirror the physical machines in the production environment.
It sounds great, but everyone's a bit skeptical: This is a Government organisation... Bureaucracy and red-tape will probably turn this into a big waste of time and money.
If we use virtual machines (VPC 2007, Virtual Server 2005, VMware products, etc.):
1. We can run multiple operating systems (Windows 98/2000/XP/Vista, Windows Server 2003/2008, Windows 7, Linux, Solaris) on a single server.
2. We can reduce hardware costs and data-center space.
3. We can reduce power and AC cooling costs.
4. We can reduce admin resources.
5. We can reduce application costs.
6. We can run AD/DNS/DHCP/Exchange/SQL/SharePoint Server/file servers, etc.
I am doing .NET programming in addition to C and C++ development and want more flexibility on my home machine. I want to be able to have both Linux (probably Ubuntu) and Windows Vista on my home computer. Is there a way I can install both and be prompted at boot for which one to start? Is there a way to set Windows as the default?
I have seen this before in CS labs in undergrad.
Also, I assume there would be no problem if I were to use Windows 32-bit along with Ubuntu 64-bit. Any advice?
The latest versions of Ubuntu include an installer called Wubi, which installs Ubuntu as a Windows application (i.e. it can be uninstalled from Add/Remove Programs) and sets up the dual boot for you! It's great for those who want to give Linux a try without a system overhaul!
You can dual boot, but I would recommend using a Virtual Machine for what you want to do.
Look at VMWare and Virtual PC.
For more information on Virtual PC: http://en.wikipedia.org/wiki/Microsoft_Virtual_PC
For more information on VMWare: http://en.wikipedia.org/wiki/VMware_Workstation
You should note that dual booting Windows and Linux can be a little risky and is a bit permanent. Running in a Virtual Machine means that you can run the Linux install in a window and not worry about it affecting your development machine at all. The software will not know the difference, so your testing is not affected.
Consider that the Virtual Machine is like a sandbox, where you can try new and different things out, without fear of consequences.
Virtual machines do run with a bit of overhead, and therefore you should not expect to be playing games or anything through them. I would say it is very much like logging into a machine through Remote Desktop (good LAN connection) as far as performance goes.
EDIT: There is also VirtualBox that you could check out. Thanks for the helpers in my comments for that one.
I, too, recommend using a virtual machine for this purpose.
I've had problems with Virtual PC on some Linux distros (Fedora Core comes to mind), but no problems with VMware or VirtualBox.
Think very hard before installing another operating system, even as dual boot. It is rarely simple, even with installers like Ubuntu's that don't require you to mess around on a command line. There is a real risk you'll spend days trying to get your usual OS back to normal, especially if you're using Vista.
VMWare and Virtual PC are both good options. Do a test install on one of these and use the OS for a while before making the decision to install.
One other great thing about using a virtual machine is that you only have to worry about getting your network settings sorted on your main OS, because VMWare (etc) will borrow those.
Also, try using the operating system on Live CD or DVD to start with if at all possible. You may also find that you can run an OS from a USB stick. This is obviously good for portability - but note that you can also carry your virtual machines on a removable USB drive.
All you have to do is go to http://www.ubuntu.com/getubuntu/download and follow the directions. I downloaded Ubuntu, burned it to CD, and rebooted with the CD in the drive. I did not have to get a second hard drive or worry about it messing with my Vista Home Premium installation.
With Ubuntu (as with most distros with a Live CD install) all you need to do is pop in a disc, boot, and click through the menus. The dual boot is set up perfectly by default, you don't even have to think about it. I've done this with Ubuntu, Debian, PC Linux OS, Freespire, and Xandros on my Vista Home Premium machine and they all worked that way.
If you are paranoid, then you should back up your PC. As cheap as hard drives (USB or internal) are these days, there really is no excuse to not have a full back up of your system. It's too easy. I use Acronis True Image, but I've heard good things about Norton Ghost as well.
Regardless, you don't need Wubi or VMware, or any virtual anything; a straight install with a dual-boot setup is the default on a typical live CD Linux install, and it works even with Vista.
I've done it different ways over the years, and I'd say using a virtual machine is the one I like best. I've tried both VMware and VirtualBox, both free, and I like VirtualBox a little better because you can use it straight from the .iso. You don't need somebody to have created a virtual machine image for you.
Another option is to actually run Linux as an application on Windows so you get Linux running at almost full speed but also the ability to run Windows applications along side it. Check it out at http://www.colinux.org/.
I haven't had a chance to play with it yet, but an option that looks promising to me is a tool in Ubuntu to create a bootable USB drive with Ubuntu on it. It has the benefits of a live CD (no effect on your system), better performance than a live CD, and the ability to persist your data from session to session. I've used Wubi before, but I can't remember why I uninstalled it.
Have a look at "cygwin".
This istalls a "linux like" windowing application within your windows
environment. It has good support for gcc and most of the standard
gnu/linux development tools.
You dont have to mess with dual boot. Its especially good for testing
windows to/from unix communictions as you can get everything up and
running in one box.
What you're looking for is called 'dual booting'. It allows you to choose which operating system to boot at startup. It's well supported in Linux, especially Ubuntu: just install Ubuntu and it will set up dual booting by default.
You could go either way: dual-boot or use a VM. I think it depends on whether you'll want to use any Windows apps while developing in the Linux environment. If so, I'd go with a VM; otherwise, here's a tutorial for setting up a dual-boot computer. It has a part on installing both OSes, and a part on what to do if you already have Windows installed.
Wubi is a great (Ubuntu-specific) solution. The only problem I've found was installing Wubi on a FAT-formatted Windows partition; I had serious problems then. Also, it might run slightly slower, as there is another layer when doing disk access, but I can't say I've noticed.
I dual boot Vista Ultimate 32-bit and Ubuntu 8.10 beta 64-bit with no problems. The key thing, in my opinion, is to have a completely separate hard drive to install Ubuntu on. That removes a lot of the risk since you don't have to fuss around partitioning your primary HDD and makes removing Ubuntu very straightforward as well if you decide you don't want it.
Just be careful and pay attention on which drive you select when you do the install. It's easy for me to tell them apart since my Linux drive is a different size than my main Vista and data storage drives.
If you'd rather go the VM route, VMware Player works well, and I've heard good things about VirtualBox.
Try a live CD install of Ubuntu :D
Creating a bootable flash disk is easy with UNetbootin from sourceforge.net.
I have dual-booted Ubuntu and XP many times with absolutely no problems. I doubt you could do the virtual-machine approach with one OS 32-bit and the other 64-bit; that would not be a problem with a dual boot.
I have had problems using Wubi, and my boot into Windows 7 is now unstable at best, so given the choice I would favour a VM solution in hindsight. However, on other machines I have run Ubuntu live on USB (set up using pendrivelinux.com) by picking the "Try Ubuntu" option at boot; that has worked well and was quite quick to get going.