How to exclude non-unit tests from runs across multiple installations of Visual Studio

I am aware of multiple methods one can use to keep certain types of tests out of a test playlist: filtering by TestCategory, class name, and so on.
I am also aware that one can instruct TFS builds to only run certain categories or classes of test.
However, I find it quick and convenient to be able to check my unit test runs just by going into Test Explorer and clicking "run all". It's good practice to do this regularly and prior to check-in to ensure the build is likely to pass.
Is there a straightforward way I can configure my tests to ensure that "run all" just picks up the quick unit tests, and leaves the slower system and regression tests alone? Ideally, I'd like a method that can easily be applied to everyone else working on the code at the same time.

This is a little puzzling, since the MSDN documentation is quite clear: "Run all" simply means run all tests, and there are filters and other settings that let you run only unit tests.
Also, as you mentioned, you can run unit tests as part of your builds directly, so why do you need this behaviour? If you just want to make sure a check-in will not break the build, you can use a Gated Check-in; that applies to everyone else working on the code as well.
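For reference, the filtering the answer mentions usually works by tagging test methods with the TestCategory attribute and then filtering on that category, both in Test Explorer and on the command line. A minimal sketch (class, method, and category names are illustrative):

    using Microsoft.VisualStudio.TestTools.UnitTesting;

    [TestClass]
    public class CalculatorTests
    {
        // Fast unit test: tagged so it can be included in everyday runs.
        [TestMethod, TestCategory("Unit")]
        public void Add_ReturnsSum()
        {
            Assert.AreEqual(4, 2 + 2);
        }

        // Slow regression test: tagged so it can be filtered out.
        [TestMethod, TestCategory("Regression")]
        public void EndToEnd_FullScenario()
        {
            // ...exercise the slow path here...
        }
    }

In Test Explorer you can then type Trait:"Unit" into the search box and run the filtered set, or run the same subset from the command line with vstest.console.exe MyTests.dll /TestCaseFilter:"TestCategory=Unit". It is not quite "run all", but it is a single convention every team member can apply the same way.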

Related

Running Visual Studio Load Test From Build Definition

I created a build definition that runs automated tests using MTM build environments and test suites. I recently created a Visual Studio Load Test, which can be added to a test suite just like any test method marked with the [TestMethod] attribute. However, when I run the build, I get no errors and it appears the aggregate tests don't run. Is there a way to make this work?
I found this article: https://blogs.msdn.microsoft.com/testingspot/2013/01/22/how-to-automatically-run-a-load-test-as-part-of-a-build/ which describes a way to do it, but I can't find a build template that matches what he describes, and it appears this only allows you to run a single load test.
Also, when you configure a test controller, there is an option to configure it for load testing, but to do this, you must unregister it from the Team Project Collection. If this is done, it appears the controller can no longer be used in an environment to run project automated tests. This defeats the purpose of what I want to do and makes it seem that Load Tests and Team Projects are mutually exclusive. Is this the case? If so, this is a big oversight. Load tests are the kind of thing you would like to run automatically. Thanks for the help.
You are unfortunately right. A test controller used for load testing cannot be used for other automated test execution at the same time. In your scenario, I would recommend that you set up a different test controller and agent for load testing; you could then queue it as part of your build to achieve what you are looking for.
There is no special build process template for this case.

Run ignored test in Resharper 2016.1 in VS2015

I upgraded Resharper to 2016.1 and I'm not able to run NUnit tests marked with the Ignore attribute. This was possible before by right-clicking the test and running it. Is this a change, or am I missing something?
It is very frustrating: I have many tests that I ignore on the build machine but would like to run locally.
Resharper has been using NUnit itself to run its tests for a long time, so it seems you must have done a really big upgrade, from an early version of Resharper that executed the test methods itself. NUnit simply will not run an ignored test, even if you programmatically tell it to do so. Or, to put it differently, NUnit "runs" ignored tests by reporting that they are ignored.
This is actually the definition of "Ignored" in NUnit. It was designed a long time ago to deal with tests that should not be run, generally for a very short period, while the developer is doing other work. It shows up as a warning in any test runs because, in the ideal world, Ignored tests should not make it into your CI runs. It may be the wrong attribute for you to use for your purposes, especially if you want to be able to run it some of the time. If you want a test that is only run when expressly selected, we have the "Explicit" attribute instead. Other behavior is possible, but it would take a feature request.
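To illustrate the distinction the answer draws, here is a minimal NUnit sketch (fixture and method names are illustrative):

    using NUnit.Framework;

    [TestFixture]
    public class BuildMachineTests
    {
        // Reported as ignored on every run, even when selected directly.
        [Test, Ignore("Disabled on the build machine")]
        public void IgnoredTest() { /* ... */ }

        // Skipped by "run all", but runs when you explicitly select it.
        [Test, Explicit]
        public void OnDemandTest() { /* ... */ }
    }

Switching such tests from Ignore to Explicit gives the behaviour the question asks for: they stay out of build-machine runs yet remain runnable locally on demand.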

TDD: refactoring and global regressions

While the refactoring step of test-driven development should always involve another full run of tests for the given functionality, what is your approach to preventing possible regressions beyond the functionality itself?
My professional experience makes me want to retest the whole functional module after any code change. Is it what TDD recommends?
Thank you.
While the refactoring step of test-driven development should always involve another full run of tests for the given functionality, what is your approach to preventing possible regressions beyond the functionality itself?
When you are working on a specific feature, it is enough to run tests for the given functionality only. There is no need to do a full regression run.
My professional experience makes me want to retest the whole functional module after any code change.
You do not need to do a full regression run, but you can, since unit tests are small, simple, and fast.
Also, there are several tools used for "continuous testing" in different languages:
in Ruby (e.g. Watchr)
in PHP (e.g. Sismo)
in .NET (e.g. NCrunch)
All these tools are used to run tests automatically on your local machine to get fast feedback.
Only when you are about to finish implementing the feature is it time to do a full run of all your tests.
Running tests on a continuous integration (CI) server is essential, especially when you have lots of integration tests.
TDD is just a methodology for writing new code or modifying old code. Your entire test base should be run every time a modification is made to any code file (new feature or refactoring). That's how you ensure no regression has taken place. We're talking about automated testing here (unit tests, system tests, acceptance tests, and sometimes performance tests as well).
Continuous integration (CI) will help you achieve that: a CI server (Jenkins, Hudson, TeamCity, CruiseControl...) will have all your tests and run them automatically when you commit a change to source control. It can also calculate test coverage and indicate where your code is insufficiently tested (note that if you do proper TDD, your test coverage should always be 100%).
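As a sketch of what that full run can look like with the Visual Studio toolchain (the assembly names are illustrative), the CI server and each developer can invoke the same command, so local runs and CI runs cannot drift apart:

    vstest.console.exe UnitTests.dll IntegrationTests.dll /Logger:trx

vstest.console.exe accepts multiple test containers in one invocation, and /Logger:trx writes a results file the CI server can archive.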

Is there a way to disable/ignore a Load Test in Visual Studio 2010 without using Test Lists?

I'm new to load testing in Visual Studio/MSTest, and I created a new Load Test recently to validate some high-traffic scenarios for a WCF service. I want to add this to the tests project for the service, but I don't want the test to be executed whenever I "Run All Tests in Solution" nor as part of our Continuous Integration build-verification process because a) it takes 5 minutes to run, and b) the service call that it is testing generates many thousands of email messages. Basically, I'd like to do the equivalent of adding the [Ignore] attribute to a unit test so that the load test is only executed when I explicitly choose to run it.
This MSDN article ("How to: Disable and Enable Tests") suggests that the only way to disable the test is to use Test Lists (.vsmdi files), but I don't have much experience with them, they seem like a hassle to manage, I don't want to have to modify our CI Build Definition, and this blog post says that Test Lists are deprecated in VS2012. Any other ideas?
Edit: I accepted Mauricio's answer, which was to put the load tests into a separate project and maintain separate solutions, one with the load tests and one without. This enables you to run the (faster-running) unit tests during development and also include the (slower-running) load tests during build verification without using test lists.
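For context, the [Ignore] attribute the question alludes to applies to individual test methods, while a .loadtest is an XML file rather than a method, which is why there is no direct equivalent. A minimal MSTest sketch of the unit-test case (names are illustrative):

    using Microsoft.VisualStudio.TestTools.UnitTesting;

    [TestClass]
    public class ServiceTests
    {
        // [Ignore] skips this method during "Run All Tests in Solution";
        // a .loadtest file cannot be tagged this way.
        [Ignore]
        [TestMethod]
        public void HighTrafficScenario() { /* ... */ }
    }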
This should not be an issue for your CI Build Definition. Why?
To run unit tests as part of your build process you need to configure the build definition to point to a test container (usually a .dll file containing your test classes and methods). Load tests do not work this way: they are defined in .loadtest files (which are just XML files) that are consumed by the MSTest engine.
If you do not make any further changes to your CI Build definition the load test will be ignored.
If you want to run the test as part of a build, then you need to configure the build definition to use the .loadtest file.
Stay away from test lists. Like you said, they are deprecated in VS11 (Visual Studio 2012).
Edit: The simplest way to avoid running the load test as part of Visual Studio "Run All" tests is to create a different solution for your load tests.
Why don't you want to use Test Lists? I think they are the best way to do this. Create a different Test List for each test type (unit tests, load tests, ...) and then, in your MSTest command, run the Test List(s) you want:
MSTest /testmetadata:testlists.vsmdi /testlist:UnitTests (only UnitTests)
MSTest /testmetadata:testlists.vsmdi /testlist:LoadTests (only LoadTests)
MSTest /testmetadata:testlists.vsmdi /testlist:UnitTests /testlist:LoadTests (UnitTests & LoadTests)

How do you separate unit tests from integration tests in Visual Studio?

I've been using Visual Studio 2008 Test projects to store my tests. Lately I've realized that a lot of my unit tests are in fact integration tests because they rely on external sources (e.g. file system, SQL server, registry).
My question is, what is a good approach to separating out integration tests from unit tests?
Ideally I want only the unit tests to show up in the Test View, because I run them frequently during development. The integration tests on the other hand I don't want in the Test View because I will only run them infrequently, e.g. when I'm about to make a build drop.
Keep them in separate projects, and keep the integration testing projects out of your day-to-day Visual Studio solutions.
When you wish to run the integration tests, you can use a different solution that includes them. If you don't want to wait for a second instance of VS to load, you can run them from the command-line.
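A sketch of such a command-line run with the MSTest runner of that era (the container name is illustrative):

    MSTest /testcontainer:IntegrationTests.dll

/testcontainer points the runner at the compiled integration test assembly, so the integration solution never has to be loaded into a second Visual Studio instance just to execute its tests.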
I put them in a separate project named IntegrationTests or something similar.
EDIT:
With Test View you can create lists & filter them:
http://msdn.microsoft.com/en-us/library/ms182452.aspx
And then run them:
http://msdn.microsoft.com/en-us/library/ms182470.aspx
