Is there an option to run some .exe when starting MSTest tests? - visual-studio-2010

I'm writing tests using MSTest. I need to run an executable from another project (in the same solution) before the tests start.
Is there any option besides invoking the exe directly under [ClassInitialize]? In that case I lose debugging capabilities.
Something like the "Multiple startup projects" option in the solution properties.

If it is in the same solution, can't you add a reference to it in your test project and call it directly from [ClassInitialize]? This shouldn't cause you to lose debugging capabilities. Do you not have access to the source code of the executable?
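For example, if the referenced project exposes a public entry point, something along these lines should keep everything in-process and debuggable (the MyOtherProject.Server type and its Start/Stop methods are placeholders, not from the question):

// using Microsoft.VisualStudio.TestTools.UnitTesting;
[TestClass]
public class IntegrationTests
{
    // ClassInitialize must be static and take a TestContext parameter.
    [ClassInitialize]
    public static void StartDependency(TestContext context)
    {
        // Hypothetical entry point of the referenced project.
        MyOtherProject.Server.Start();
    }

    [ClassCleanup]
    public static void StopDependency()
    {
        MyOtherProject.Server.Stop();
    }
}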

Related

Can I get automated MSTest testing with VS2010?

I would like to have MSTest (honestly, I'd take whatever at this point) run my tests post-build, and then if any tests fail, produce the standard file(lineno): message output that Visual Studio recognizes and allows me to go straight to the failure point.
Or, if possible, then after a successful build I'd like the "Run All Tests in Solution" Visual Studio command to fire automatically.
As far as I can tell, this isn't possible. The command-line version of MSTest does not produce correctly formatted output (and as far as I can tell, neither does NUnit or xUnit), so when I add MSTest to my project's post-build step, I still have to scroll around looking for the test that failed. The in-IDE version works OK, as long as I remember to hit Ctrl-R, A when the build finishes.
You can get what you are looking for by making a custom target in your project file instead of using the pre-baked PostBuild target. Add something like this right into your project file, editing it as XML:
<Target Name="RunUnitTests" AfterTargets="CoreBuild">
  <Exec Command="mstest /testcontainer:$(OutDir)\PathToUnitTests\UnitTests.dll"
        CustomErrorRegularExpression="^Failed" />
</Target>
You'll need to set the path to your test assembly properly (or whatever other method you are using - test config file, etc.), and you may need to tweak the regex; I'm going from memory here...
This will run MSTest much like you are probably doing in the PostBuild, but adds the ability for MSBuild (which is what drives the C# build system) to detect output strings that it should consider errors. There is a similar CustomWarningRegularExpression parameter as well.
If you want to share this among multiple projects, look up "MSBuild Imports".
I've switched to xUnit, and with a minor change to their MSBuild unit test runner, I get what I want.

Brief "run-down" on setting up Unit Tests in Visual Studio 2008

I am forcing myself to learn test-driven development, and so far I'm enjoying myself. There are a few quirks in Visual Studio Unit Testing that are driving me batty, though. A bit of background information: my project folder looks like this:
[Root] BitFlex
BitFlex\Code
BitFlex\Debug
BitFlex\Documents
BitFlex\Release
Now of course all the source code is stored in the code folder, and on a build the project output goes to either the debug or release folder depending on the current configuration. Now for my unit testing, I have it set up so the test project is output to either:
BitFlex\Debug\Unit Tests\
BitFlex\Release\Unit Tests\
1) At this point, everything is fine and dandy. There are two problems with this, though. The first is that when I run a test it cannot find the assembly; it gives me this error:
Error AssignDefaultProgramTest BitFlex.UnitTests The test assembly 'D:\src\DCOM Productions\BitFlex\Code\TestResults\David Anderson_DCOMPRODUCTIONS 2009-07-31 23_21_00\Out\BitFlex.UnitTests.dll' cannot be loaded. Error details: Could not find file 'D:\src\DCOM Productions\BitFlex\Code\TestResults\David Anderson_DCOMPRODUCTIONS 2009-07-31 23_21_00\Out\BitFlex.UnitTests.dll'.
I cannot seem to find information on this error, or how to resolve it, so I suppose that's where everyone's expertise around here comes into play.
2) My other beef is that Visual Studio generates the "Test Results" folder in my code directory; I would prefer to move that to my Unit Tests folder in either output configuration. Is there a way to do this, or a better practice for setting up a well organized unit test using my folder hierarchy?
By default the MSTest framework runs all tests from an 'isolated' location, not from the binaries directory.
To fix this you can do one of these two things:
1. Go to the test run configuration file and, under Deployment, uncheck deployment of the tests.
2. Don't use paths when looking for external files; instead use the deployment attribute or the test configuration to deploy the needed files along with your tests (see the sketch below).
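For option 2, the deployment attribute in MSTest is [DeploymentItem]; a minimal sketch (the file name is just an example):

// using Microsoft.VisualStudio.TestTools.UnitTesting;
[TestClass]
public class ParserTests
{
    [TestMethod]
    [DeploymentItem(@"TestData\input.xml")] // copied next to the test assembly at run time
    public void Parse_SampleFile_Succeeds()
    {
        Assert.IsTrue(System.IO.File.Exists("input.xml"));
    }
}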
For doing TDD with MSTest, turn off deployment. You shouldn't need it for "unit testing."
Also, never ever EVER have VS automatically generate tests for you. What's generated may be fine for some types of functional testing, but it usually makes for very poor unit tests.

Visual Studio unit testing - how to access external files?

I have data files used as input to my unit tests. These files are quite big and I don't want to copy them each time the unit tests are executed. Tests are executed without deployment. So I can just put them into a folder under my solution, and... how do I obtain the path to my solution (or to the test project source code) while a unit test is executing?
Because you can run a test project in different ways (TD.NET, Visual Studio, R# etc.), the path used to reference the tests can change.
For this reason, I embed the files the tests need in my test assembly and read them out from there.
You can use:
Assembly.GetExecutingAssembly().Location
in your tests to get the path of the assembly containing the unit tests.
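A sketch of both approaches, assuming the data file has been added to the test project (either copied to the output directory or marked as an embedded resource); the file and resource names are illustrative:

// using System.IO; using System.Reflection;

// 1) Resolve a file relative to the test assembly (file copied to the output directory).
string assemblyDir = Path.GetDirectoryName(Assembly.GetExecutingAssembly().Location);
string dataFile = Path.Combine(assemblyDir, "TestData", "input.xml");

// 2) Read a file embedded in the test assembly (Build Action = Embedded Resource).
// The resource name follows the DefaultNamespace.Folder.FileName convention.
using (var stream = Assembly.GetExecutingAssembly()
    .GetManifestResourceStream("MyTests.TestData.input.xml"))
using (var reader = new StreamReader(stream))
{
    string content = reader.ReadToEnd();
}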
Simple: make the location of the files configurable (and testable).
Then either set it in the unit test code or set it through some config file.
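For instance, with a plain app.config setting (the "TestDataDirectory" key and the fallback path are made up):

// using System.Configuration; using System.IO;
// (System.Configuration.dll must be referenced by the test project.)
string dataDir = ConfigurationManager.AppSettings["TestDataDirectory"]
                 ?? @"..\..\TestData"; // fall back to a relative default
string inputFile = Path.Combine(dataDir, "large-input.bin");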

Good techniques to use Makefiles in VisualStudio?

I know the ideal way to build projects is without requiring IDE-based project files, since they theoretically cause all sorts of trouble with automation and whatnot. But I've yet to work on a project that compiles on Windows that doesn't depend on the Visual Studio project (OK, obviously some open source stuff gets done with Cygwin, but I'm being general here).
On the other hand, if we just use VS to run a makefile, we lose all the benefits of the compile options window, and it becomes a pain to maintain the external makefile.
So how do people that use VS actually handle external makefiles? I have yet to find a painless system to do this...
Or, in reality, do most people not do this, although it's preached as good practice?
Take a look at MSBuild!
MSBuild can work with the sln/csproj files from VS, so for simple projects you can just call them directly.
If you need more control, wrap the projects in your own build process, add your own tasks, etc. - it is very extensible!
(I wanted to add a sample but this editor totally messed up the XML... sorry)
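Since the sample got lost, here is a rough sketch of what a custom MSBuild task can look like in C# (the task name and property are invented; it would be registered in a project file with a UsingTask element and invoked from a target):

using Microsoft.Build.Framework;
using Microsoft.Build.Utilities;

public class PrintBuildInfo : Task
{
    // Set from the project file, e.g. Configuration="$(Configuration)".
    [Required]
    public string Configuration { get; set; }

    public override bool Execute()
    {
        Log.LogMessage(MessageImportance.High,
            "Building configuration: " + Configuration);
        return true; // returning false fails the build
    }
}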
Ideally perhaps, in practice no.
Makefiles would be my preference as the build master; however, the developers spend all their time inside the Visual Studio IDE, and when they make a change, it's to the vcproj file, not the makefile. So if I'm doing the global builds with makefiles, they're too easily put out of sync with the project/solution files in play by 8 or 10 others.
The only way I can stay in step with the whole team is to run devenv.exe on the solution files directly in my build process scripts.
There are very few makefiles in my builds, where there are they are in the pre-build or custom build sections or a separate utility project.
One possibility is to use CMake - you describe with a script how your project is to be built, and CMake generates the Visual Studio solution/project files for you.
And if you need to build your project from the command line, or in a continuous integration tool, you use CMake to generate a Makefile for NMake.
And if your project is a cross-platform one, you can run CMake to generate the makefiles for the toolchain of your choice.
A simple CMake script looks like this:
project(hello)
add_executable(hello hello.cpp)
Compare these two lines with a makefile or the way you setup a simple project in your favorite IDE.
In a nutshell, CMake not only makes your project cross-platform, it also makes it cross-IDE. If you'd like to just test your project with Eclipse, KDevelop, or Code::Blocks, just run CMake to generate the corresponding project files.
Well, in practice it is not always so easy, but the CMake idea just rocks.
For example, if you consider using CMake with Visual Studio, some tweaking is required to obtain the familiar VS project feeling; the main obstacle is organizing your header and source files, but it is possible - check the CMake wiki (and by writing a short script you might even simplify this task).
We use a NAnt script, which at the compile step calls MSBuild. Using NAnt allows us to perform both pre- and post-build tasks, such as setting version numbers to match source control revision numbers, collating code coverage information, assembling and zipping deployment sources. But still, at the heart of it, it's MSBuild that's actually doing the compiling.
You can integrate a NAnt build as a custom tool into the IDE, so that it can be used both on a build or continuous integration server and by the developers in the same way.
Personally, I use Rake to call msbuild on my solution or project. For regular development I use the IDE and all the benefits that provides.
Rake is set up so that I can just compile, compile and run tests, or compile, run tests, and create deployable artifacts.
Once you have a build script it is really easy to start doing things like setting up continuous integration and using it to automate deployment.
You can also use most build tools from within the IDE if you follow these steps to set it up.
We use the devenv.exe (same exe that launches the IDE) to build our projects from the build scripts (or the command line). When specifying the /Build option the IDE is not displayed and everything is written back to the console (or the log file if you specify the /Out option)
See http://msdn.microsoft.com/en-us/library/xee0c8y7(VS.80).aspx for more information
Example:
devenv.exe [solution-file-name] /Build "Release|Win32" /Out solution.log
where "Release|Win32" is the configuration as defined in the solution, and solution.log is the file that gets the compiler output (which is quite handy when you need to figure out what went wrong in the compile).
We have a program that parses the vcproj files and generates makefile fragments from that. (These include the list of files and the #defines, and there is some limited support for custom build steps.) These fragments are then included by a master makefile which does the usual GNU make stuff.
(This is all for one of the systems we target; its tools have no native support for Visual Studio.)
This didn't require a huge amount of work. A day to set it up, then maybe a day or two in total to beat out some problems that weren't obvious immediately. And it works fairly well: the compiler settings are controlled by the master makefile (no more fiddling with those tiny text boxes), and yet anybody can add new files and defines to the build in the usual way.
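As a rough illustration of the approach (not the actual in-house tool described above), a small C# program along these lines could pull the source list out of a VS2008-era .vcproj and write a makefile fragment:

using System;
using System.IO;
using System.Linq;
using System.Xml.Linq;

class VcprojToMake
{
    static void Main(string[] args)
    {
        // The old .vcproj format stores sources as <File RelativePath="..."> elements.
        var doc = XDocument.Load(args[0]);
        var sources = doc.Descendants("File")
            .Select(f => (string)f.Attribute("RelativePath"))
            .Where(p => p != null &&
                        p.EndsWith(".cpp", StringComparison.OrdinalIgnoreCase));

        // Emit a fragment to be included by the master makefile.
        using (var writer = new StreamWriter("sources.mk"))
        {
            writer.WriteLine("SOURCES = \\");
            foreach (var src in sources)
                writer.WriteLine("\t" + src + " \\");
            writer.WriteLine();
        }
    }
}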
That said, the combinatorial problems inherent to Visual Studio's treatment of build configurations remain.
Why would you want to have a project that "compiles on Windows that doesn't depend on the VisualStudio project"? You already have a solution file - you can just use it with a console build.
I'd advise you to use msbuild in conjunction with a makefile, NAnt, or even a simple batch file if your build system is not as convoluted as ours...
Is there something I'm missing?
How about this code?
public TRunner CleanOutput()
{
    ScriptExecutionEnvironment.LogTaskStarted("Cleaning solution outputs");

    solution.ForEachProject(
        delegate(VSProjectInfo projectInfo)
        {
            string projectOutputPath = GetProjectOutputPath(projectInfo.ProjectName);

            if (projectOutputPath == null)
                return;

            projectOutputPath = Path.Combine(projectInfo.ProjectDirectoryPath, projectOutputPath);
            DeleteDirectory(projectOutputPath, false);

            string projectObjPath = String.Format(
                CultureInfo.InvariantCulture,
                @"{0}\obj\{1}",
                projectInfo.ProjectName,
                buildConfiguration);
            projectObjPath = Path.Combine(productRootDir, projectObjPath);
            DeleteDirectory(projectObjPath, false);
        });

    ScriptExecutionEnvironment.LogTaskFinished();
    return ReturnThisTRunner();
}

public TRunner CompileSolution()
{
    ScriptExecutionEnvironment.LogTaskStarted("Compiling the solution");

    ProgramRunner
        .AddArgument(MakePathFromRootDir(productId) + ".sln")
        .AddArgument("/p:Configuration={0}", buildConfiguration)
        .AddArgument("/p:Platform=Any CPU")
        .AddArgument("/consoleloggerparameters:NoSummary")
        .Run(@"C:\Windows\Microsoft.NET\Framework\v3.5\msbuild.exe");

    ScriptExecutionEnvironment.LogTaskFinished();
    return ReturnThisTRunner();
}
You can find the rest of it here: http://code.google.com/p/projectpilot/source/browse/trunk/Flubu/Builds/BuildRunner.cs
I haven't tried it myself yet, but Microsoft has a Make implementation called NMake which seems to have a Visual Studio integration:
NMake
Creating NMake Projects
Visual Studio, since VS2005, uses "msbuild" to define and run builds. When you fiddle with project settings in the Visual Studio designer - let's say you turn XML doc generation on or off, or you add a new dependency, or you add a new project or assembly reference - Visual Studio will update the .csproj (or .vbproj, etc.) file, which is an msbuild file.
Like Java's ant or Nant before it, msbuild uses an XML schema to describe the project and build. It is run from VS when you do a "F6" build, and you can also run it from the command line, without ever opening VS or running devenv.exe.
So, use the VS tool for development and command-line msbuild for automated builds - same build, and same project structure.

How do I run (unit) tests in different folders/projects separately in Visual Studio?

I need some advice on how I can easily separate test runs for unit tests and integration tests in Visual Studio. Often, or always, I structure the solution as presented in the above picture: separate projects for unit tests and integration tests. The unit tests are run very frequently, while the integration tests naturally are run when the context is correctly aligned.
My goal is to somehow be able to configure which tests (or test folders) to run when I use a keyboard shortcut. The tests should preferably be run by a graphical test runner (ReSharper's). So for example:
Alt+1 runs the tests in project BLL.Test,
Alt+2 runs the tests in project DAL.Tests,
Alt+3 runs them both (i.e. all the tests in the [Tests] folder), and
Alt+4 runs the tests in folder [Tests.Integration].
TestDriven.net has an option of running just the tests in the selected folder or project by right-clicking it and selecting Run Test(s). Being able to do this, but via a keyboard command and with a graphical test runner, would be awesome.
Currently I use VS2008, ReSharper 4 and NUnit. But advice for a setup in general is of course also appreciated.
I actually found kind of a solution for this on my own by using a keyboard command bound to a macro. The macro was recorded from the menu Tools > Macros > Record TemporaryMacro. While recording I selected my [Tests] folder and ran ReSharper's UnitTest.ContextRun. This resulted in the following macro,
Sub TemporaryMacro()
    DTE.Windows.Item(Constants.vsWindowKindSolutionExplorer).Activate
    DTE.ActiveWindow.Object.GetItem("TestUnitTest\Tests").Select(vsUISelectionType.vsUISelectionTypeSelect)
    DTE.ExecuteCommand("ReSharper.UnitTest_ContextRun")
End Sub
which was then bound to its own keyboard command in Tools > Options > Environment > Keyboard.
However, what would be even more awesome is a more general solution where I can configure exactly which projects/folders/classes to run and when, for example by means of an XML file. This could then easily be checked into version control and distributed to everyone who works on the project.
This is a bit of a fiddly solution, but you could configure an external tool for each group of tests you want to run. I'm not sure if you'll be able to launch the ReSharper test runner this way, but you can run the console version of NUnit. Once you have those tools set up, you can assign keyboard shortcuts to the commands "Tools.ExternalCommand1", "Tools.ExternalCommand2", etc.
This won't really scale very well, and it's awkward to change - but it will give you keyboard shortcuts for running your tests. It does feel like there should be a much simpler way of doing this.
You can use a VS macro to parse the XML file and then call nunit.exe with the /fixture command-line argument to specify which classes to run, or generate a selection save file and run NUnit using that.
I have never used this but maybe it could help....
http://www.codeplex.com/VS2008UnitTestGUI
"Project Description
This project is about running all unit test inside multiple .NET Unit tests assembly coded with Visual Studio 2008."

Resources