How can I integrate a virtual machine into my automated unit tests in Visual Studio?

I've got some legacy software that I'd like to involve in an automated unit test (for testing network protocol compatibility), and because this software is old and runs in an outdated environment, I'd like to encapsulate it in a virtual machine. What is the best way to control a virtual machine from a Visual Studio unit test? Once I have the VM configured and have saved its state appropriately, I will need to be able to start and stop the VM and possibly launch some programs inside the VM on command.
One consideration I do have is that I'd like developers not to have to download the VM image if they aren't planning to run this test. The unit test may therefore also have to handle downloading the latest VM image from some location. Our convention is to tag long-running tests with a special description so developers can exclude this test during active development.

The major virtual machine platforms provide scripting APIs that let you control VMs from the command line; the VMware Server docs and a video on Hyper-V scripting are available.
You will need to include some logic in your build scripts to decide whether to execute the VM code, or just check for the presence of the VM image on developers' machines.
You may want to check out some of the NAnt and MSBuild task repositories for VM-related tasks to make this easier.
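As a rough illustration of the scripting approach, here is a minimal sketch; the vmrun path, image location, snapshot name, guest credentials and the network share are all placeholders, and the exact flags vary between VMware products and versions. A test fixture can invoke the same commands via Process.Start, and the image is only fetched when the test actually runs:

# Sketch: drive the legacy VM around a protocol test (all paths and names are hypothetical)
$vmrun = 'C:\Program Files (x86)\VMware\VMware VIX\vmrun.exe'
$vmx   = 'C:\VMs\legacy\legacy.vmx'

# Download the image only when this test is actually being run, so other developers never pay for it
if (-not (Test-Path $vmx)) {
    Copy-Item '\\buildshare\vm-images\legacy' 'C:\VMs\legacy' -Recurse
}

& $vmrun -T ws revertToSnapshot $vmx 'CleanState'     # back to the saved, known-good state
& $vmrun -T ws start $vmx nogui                       # boot without a console window
& $vmrun -T ws -gu guestUser -gp guestPass runProgramInGuest $vmx 'C:\legacy\server.exe'
# ... run the protocol assertions against the VM's IP address here ...
& $vmrun -T ws stop $vmx soft                         # shut the guest down cleanly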

Related

Automated integration testing of a client/server Windows desktop application

My team is developing a desktop application (mixed C++/Tcl) that is used in a client-server setup. Currently it is Windows-only, but soon we will need to port it to Linux. CruiseControl.NET builds it every night from the source code in SVN and packages it into an NSIS installer, but we have no automated tests to run.
It is nearly impossible to add any unit tests, but integration testing of the application is easy, because it is already heavily script-based.
The main task is to install the app onto 3 PCs, configure it (which involves copying some files around), run it, monitor for a possible crash, wait until integration testing is done, collect a summary, and send emails. It could be done with a bunch of custom PowerShell scripts, but:
- In the future we will want to add more features and more testing, and what used to be a simple script soon blows up (as usual), so I want to minimize custom scripting; if I do need to script something, I prefer bash/cygwin (I am not familiar with Python or Ruby).
- I want a web dashboard that will report current progress and, if something failed, show logs.
- I need some supervisor that will monitor the app under test and report if it hangs or crashes.
- We will also need to test it on Linux.
- Ideally I would like to orchestrate some test steps between the PCs (e.g. run test X on PC1 and test Y on PC2 in parallel, wait till they both finish, then run test Z on PC1, while monitoring that nothing crashes on PC2, etc.); see the sketch right after this list for the kind of orchestration I mean.
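As a concrete illustration of that last point, the parallel step could be expressed with plain PowerShell remoting. This is only a sketch: it assumes PS remoting is enabled on the test PCs, and the machine names, script paths and process name are placeholders.

# Run test X on PC1 and test Y on PC2 in parallel, wait for both, then run test Z while watching PC2
$jobX = Invoke-Command -ComputerName PC1 -ScriptBlock { & 'C:\tests\testX.ps1' } -AsJob
$jobY = Invoke-Command -ComputerName PC2 -ScriptBlock { & 'C:\tests\testY.ps1' } -AsJob
Wait-Job $jobX, $jobY | Out-Null
Receive-Job $jobX, $jobY                  # collect output for the summary

$jobZ = Invoke-Command -ComputerName PC1 -ScriptBlock { & 'C:\tests\testZ.ps1' } -AsJob
while ($jobZ.State -eq 'Running') {
    $alive = Invoke-Command -ComputerName PC2 -ScriptBlock { Get-Process AppUnderTest -ErrorAction SilentlyContinue }
    if (-not $alive) { Write-Warning 'The app on PC2 is no longer running' }
    Start-Sleep -Seconds 10
}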
So, I am looking for a COTS tool/set of tools that will help me do this and doesn't have a steep learning curve. Ideally, for free, but if it is really good and has fair pricing, my company may purchase a license.
The process should be triggered from CruiseControl.NET when the NSIS installer is ready, and then perform everything described above. Basically, it should allow at least remote installation of software, running custom scripts and have a web dashboard.
Apparently, configuration management tools like Chef could be used, but so far they do not support running the server on Windows, only the nodes. I would like to avoid setting up a Linux VM just for that, although I can do it if I have no other choice. Also, Chef seems to be a bit of an overkill: good for 10k machines, but I have only 3... maybe 5 in the future. And I am particularly curious about the chances of orchestrating a distributed test.
Most of the similar questions here on StackOverflow and elsewhere on the internet are about web apps, Java containers, Maven, etc., and there are just so many tools and plugins for these tools to evaluate.
Thanks in advance.
Install ccnet on your test machines. Have those ccnet projects listen to a file that gets edited when a new installer is ready. Have the test machines install that new installer and run tests. There you go. ccnet sends emails so there's your basic reporting.
Have the test results reported into a database via web services using gSOAP (that's what we did). For Linux you can run the Java CruiseControl if you must. Write a gSOAP-enabled test controller program to report the test results from the test machines; a little C++ app will do. Then write a website (we use ASP.NET) to query the database (PostgreSQL) and show results. Have the test machines auto-update themselves via SVN to get the latest changes to the configuration. Use NAnt: NAnt is far superior to just using ccnet to run tasks, and it works through ccnet. Use XML, XSL and CSS with ccnet to make the test emails show the information you want (new passes, new failures, SVN differences to the code bases, etc.).
Our latest development is putting a big TV in the kitchen with a summary of test results so people can know more readily what they broke!
The first thing I'd get working is a test machine listening for the new installer, installing it, running some basic tests and emailing the results back. Put the ccnet and NAnt configuration in version control and get that auto-updating on the test machine, so you don't have to log into every test machine and do an update every time you make a change.
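A minimal sketch of that first step, assuming the installer lands on a share the test machine can see, that the NSIS installer accepts the standard /S silent switch, and that the drop path, build file and target name below are placeholders:

# Run by the test machine's ccnet/NAnt task when a new installer appears (paths are hypothetical)
$installer = '\\buildserver\drops\latest\MyApp-Setup.exe'
svn update C:\testconfig                              # pull the latest test configuration
Start-Process $installer -ArgumentList '/S' -Wait     # silent NSIS install
& nant -buildfile:C:\testconfig\integration.build run-tests
# ccnet picks up the NAnt output and emails the results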
This is hugely broad and pretty close to opinion-based. Chef can handle steps like deploying the application to the test machines, but it isn't a GUI test framework, so you would need something else to handle that. Jenkins supports distributing tests to Windows hosts, so that seems like a good choice on that side of things, but it isn't that great at multi-node tests or orchestration between them. I suspect you'll need to write most of this yourself given the requirements.

Using continuous integration to deploy to a virtual machine to run integration tests

Does anyone have experience of setting up a CI server (TeamCity for preference) to manage the creation of a virtual machine, deploying a package to the machine, getting the database to a known configuration, then running integration tests, tearing the whole thing down, and reporting the test status back to TeamCity?
We do something like this. We have three types of tests: unit tests, which I am sure you are aware of, but we also run a number of acceptance tests and integration tests, and it is the latter two that are relevant here.
In our integration tests we run a series of WatiN tests against our QA environment, which is known to have a deployment already running on it; these are usually run after TeamCity runs a deployment build to the QA environment. These tests do a full integration against all our external third parties.
What you might be more interested in is our acceptance tests, but note that we do not spin up a virtual environment (more on that later). We have a series of acceptance tests that spin up all the services in their own application domains and deploy the databases using Visual Studio database projects. Because these are acceptance tests, all third-party interfaces are mocked. Since the services are spun up in process, the only thing to clean up at the end are the databases.
This works for us, but I have been considering taking it to the next level with our integration tests and spinning up a virtual environment that looks exactly like our live environment, down to domain names and IP addresses. This is entirely feasible but will be time-consuming, and how you go about it will depend on the flavor of virtual environment you plan on using.
Here is an answered SO question on how to spin up Hyper-V servers using MSBuild but I am sure there are other examples using Ant/Nant/Rake for Hyper-V/VMWare etc.
How can I create virtual machines as part of a build process using MSBuild and MS Virtual Server and/or Hyper-V Server Virtualization?
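As a rough sketch of what such a build step can look like on a host where the Hyper-V PowerShell module is available (the VM and checkpoint names are placeholders; older hosts would script this through WMI or the MSBuild tasks from the linked question instead):

# TeamCity build step: reset the test VM to a known state, run the integration tests, tear it down
Restore-VMSnapshot -VMName 'IntegrationTestVM' -Name 'CleanDeploy' -Confirm:$false
Start-VM -Name 'IntegrationTestVM'
# ... wait for the guest, deploy the package, restore the database, run the tests, report to TeamCity ...
Stop-VM -Name 'IntegrationTestVM' -TurnOff            # the next run restores the snapshot again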

Running and debugging program from Visual Studio in virtual machine like VirtualPC or VirtualBox

I have a program that I want to test on a clean Windows installation. For now I have an image in VirtualBox and I start the program from a shared folder, but this is not convenient and I can't debug.
For debugging I found that I can use the Remote Debugging Monitor, but I still want to automate the whole process, especially uploading the application to the virtual machine.
I thought that Virtual PC would be better than VirtualBox, because that application was created by Microsoft. Unfortunately I can't find any info on how to connect them.
EDIT:
After research: the only possibility is to treat the virtual machine as a remote computer. There is no easier way. The project needs to be published to the VM using shared folders. After setting up a new build configuration for remote debugging in Visual Studio, everything triggers automatically and works.
I would:
1. Place the program in a pre-defined shared directory, so that it is immediately visible to the virtual machine after redeployment.
2. Automate the remote debugger invocation; all the parameters, such as the users allowed to debug, can be passed on the command line.
VirtualBox is quite OK for this task, as it allows you to replace only the disk image with a clean one while leaving the setup, including shared directories, intact. I am sure Virtual PC also allows such a thing, but choosing it just because it's also written by Microsoft does not seem like a valid consideration here.
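A minimal sketch of those two steps, assuming the VM is called "CleanWin" and msvsmon.exe from the Visual Studio Remote Tools has already been copied into the guest (names, paths and the timeout are placeholders; running without authentication is only acceptable on an isolated test VM):

# Host side: expose the build output to the guest as a shared folder
VBoxManage sharedfolder add "CleanWin" --name builds --hostpath "C:\dev\MyApp\bin\Release"
# Guest side: start the Remote Debugging Monitor with no authentication
& 'C:\Tools\msvsmon\msvsmon.exe' /noauth /anyuser /nosecuritywarn /timeout 7200
# Then attach from Visual Studio (Debug -> Attach to Process) using the no-authentication transport
# and the VM's IP address as the qualifier.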

Code, Build and Run on Separate Machines. Possible?

I know that I can code on one machine and have it build on a different machine (i.e. a build server). Now I have also heard that you can have Visual Studio run a build on a virtual machine (I think it requires Virtual PC). My question is whether anyone has been able to code on machine A, have it compile on machine B, and run a debugging session on machine C.
This is pretty common in enterprise development and just about the de facto standard way of doing things.
Typically, a dev works locally. Once s/he is happy with their changes, they'll check it into a source control system.
From that point there are a couple of options ranging from automated building to having someone push the button to cause the remote build.
Once the build is complete there are a host of options available for deploying the app to one or more other servers. And yet other options for kicking off automated test suites.
Concerning remote debugging, you can do that independently of whether you are using a build/deployment/automated testing. It's just a matter of getting the right stuff installed and configured (see ho1's answer for a link).
All of that said, I highly recommend you never enable remote debugging on a production server. Some people might disagree with me but I personally think it's dangerous for security reasons and can certainly lead to site outages.
Finally, the only reason you would need a virtual machine is if the servers aren't available or if you just want to sandbox everything.
You can do remote debugging, so if you had an automated process to copy the compiled code from B to C, I suppose you could do what you're asking.
See this MSDN article for more details: How to: Set Up Remote Debugging
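As a sketch of the automated copy from B to C (the machine and share names are placeholders; the remote debugger on C is then set up as described in the linked article):

# On build machine B: mirror the fresh build output to the run/debug machine C
robocopy 'C:\builds\MyApp\Release' '\\machineC\apps\MyApp' /MIR /R:2 /W:5
# On machine C, run msvsmon.exe (the Remote Debugging Monitor) and attach from Visual Studio on machine A.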

Creating a virtual machine image as a continuous integration artifact?

I'm currently working on a server-side product which is a bit complex to deploy on a new server, which makes it an ideal candidate for testing out in a VM. We are already using Hudson as our CI system, and I would really like to be able to deploy a virtual machine image with the latest and greatest software as a build artifact.
So, how does one go about doing this exactly? What VM software is recommended for this purpose? How much scripting needs to be done to accomplish this? Are there any issues in particular when using Windows 2003 Server as the OS here?
Sorry to deny anyone an accepted answer here, but based on further research (thanks to your answers!), I've found a better solution and wanted to summarize what I've found.
First, both VirtualBox and VMware Server are great products, and since both are free, each is worth evaluating. We've decided to go with VMware Server, since it is a more established product and we can get support for it should we need it. This is especially important since we are also considering distributing our software to clients as a VM instead of as a special server installation, assuming that the overhead of the VMware Player is not too high. Also, there is a VMware scripting interface called VIX which one can use to install files directly to the VM without needing to install SSH or SFTP, which is a big advantage.
So our solution is basically as follows... first we create a "vanilla" VM image with the OS and nothing else, and check it into the repository. Then we write a script which acts as our installer, putting the artifacts created by Hudson onto the VM. This script should have interfaces to copy files directly, over SFTP, and through VIX. This will allow us to continue distributing software directly on the target machine, or through a VM of our choice. The resulting image is then compressed and distributed as an artifact of the CI server.
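For example, the VIX command-line front end (vmrun) can push the Hudson artifacts into the guest and launch the installer there. This is a sketch only: the product type, guest credentials and paths are placeholders, and the exact flags differ between VMware products and versions.

# Copy the build artifact into the guest and run the installer via VIX (vmrun)
$vmrun = 'C:\Program Files (x86)\VMware\VMware VIX\vmrun.exe'
$vmx   = 'C:\VMs\product\product.vmx'
& $vmrun -T server -gu Administrator -gp secret copyFileFromHostToGuest $vmx 'C:\hudson\artifacts\installer.exe' 'C:\temp\installer.exe'
& $vmrun -T server -gu Administrator -gp secret runProgramInGuest $vmx 'C:\temp\installer.exe' '/S'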
Regardless of the VM software (I can recommend VirtualBox, too), I think you are looking at the following scenario:
- The build is done.
- CI launches the virtual machine (or it is always running).
- CI uses scp/sftp to upload the build into the VM over the network.
- CI uses ssh (if available on the target OS running in the VM) or another remote command execution facility to trigger the installation in the VM environment.
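A minimal sketch of the last two steps, assuming an SSH server inside the guest (host names, file names and the install script are placeholders):

# Upload the freshly built package into the VM, then trigger the installation remotely
scp build/product.tar.gz ci@testvm:/tmp/
ssh ci@testvm 'tar -xzf /tmp/product.tar.gz -C /tmp && sudo /tmp/product/install.sh'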
VMware Server is free and a very stable product. It also gives you the ability to create snapshots of the VM slice and roll back to a previous version of your virtual machine when needed. It will run fine on Win 2003.
In terms of provisioning new VM slices for your builds, you can simply copy and paste the folder that contains the VMware files, change the SID and IP of the new VM, and you have a new machine. It takes 15 minutes depending on the size of your VM slice. No scripting required.
If you use VirtualBox, you'll want to look into running it headless, since it'll be on your server. Normally, VirtualBox runs as a desktop app, but it's possible to start VMs from the commandline and access the virtual machine over RDP.
VBoxManage startvm "Windows 2003 Server" -type vrdp
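For what it's worth, newer VirtualBox releases spell this slightly differently (remote display is then controlled by the separate VRDE setting):
VBoxManage startvm "Windows 2003 Server" --type headless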
We are using Jenkins + Vagrant + Chef for this scenario.
So you can do the following process:
1. Version control your VM environment using Vagrant provisioning scripts (Chef or Puppet)
2. Build your system using Jenkins/Hudson
3. Run your Vagrant script to fetch the last stable release from the CI output
4. Save the VM state to reuse in future.
Reference:
vagrantup.com
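A sketch of the command sequence such a Jenkins/Hudson job might run, assuming the Vagrantfile's provisioning already pulls in the latest CI output (the output box name is a placeholder):

# Bring the VM up, provision it (Chef/Puppet recipes run here), then export it for reuse
vagrant up
vagrant provision                          # re-run provisioning to pick up the latest build
vagrant package --output product-ci.box    # save the provisioned machine as an artifact
vagrant halt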
I'd recommend VirtualBox. It is free and has a well-defined programming interface, although I haven't personally used it in automated build situations.
Choosing VMware is currently NOT a bad choice. However, just as VMware gives support for VMware Server, Sun gives support for VirtualBox.
You can also accomplish this task using VMWare Studio, which is also free.
The basic workflow is this:
1. Create an XML file that describes your virtual machine.
2. Use Studio to create the shell.
3. Use VMware Server to provision the virtual machine.
