DataSource attribute causing Unit Test to be skipped - visual-studio

Just out of curiosity, has anyone ever had the issue of a test being skipped only when the DataSource attribute is enabled on that specific test? When I try to run this specific test, Test Explorer consistently shows ExecuteSproc_Test as ignored, with no reason or explanation. Examples below.
Works
public TestContext testContext { get; set; }
[TestMethod]
//[DataSource("SqlClient", "ConnectionString", "SqlTable", DataAccessMethod.Sequential)]
public void ExecuteSproc_Test()
{
Assert.IsNotNull(testContext.DataRow["Row"]);
}
Ignored
public TestContext testContext { get; set; }
[TestMethod]
[DataSource("SqlClient", "ConnectionString", "SqlTable", DataAccessMethod.Sequential)]
public void ExecuteSproc_Test()
{
Assert.IsNotNull(testContext.DataRow["Row"]);
}

There should be no problem doing that. For example, Microsoft's documentation does this in an example on the page How To: Create a Data-Driven Unit Test.
So my guess would be that your DataSource isn't returning any rows.
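If you want to rule that out, here is a quick sanity check you could run first (just a sketch; it assumes the literal connection string and table name from your attribute point at a reachable SQL Server database):
using System.Data.SqlClient;
using Microsoft.VisualStudio.TestTools.UnitTesting;

[TestClass]
public class DataSourceSanityChecks
{
    [TestMethod]
    public void SqlTable_ShouldContainRows()
    {
        // Placeholders copied from the DataSource attribute in the question.
        using (var connection = new SqlConnection("ConnectionString"))
        using (var command = new SqlCommand("SELECT COUNT(*) FROM SqlTable", connection))
        {
            connection.Open();
            var rowCount = (int)command.ExecuteScalar();

            // If this assert fails, the data-driven test has no rows to iterate,
            // and MSTest reports it as ignored rather than failed.
            Assert.IsTrue(rowCount > 0, "SqlTable returned no rows.");
        }
    }
}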

Related

Mvvmcross Testing different view models fails when running together

I've come across an interesting error. I have two test files for my Xamarin mobile application, both testing view models:
public class TestFirstViewModel : MvxIoCSupportingTest
{
protected override void AdditionalSetup() {
//Register services and dependencies here.
}
[Fact]
public void TestMethod1() {
// Successful test code here.
}
}
That's in one file. In another file, I have:
public class TestSecondViewModel : MvxIoCSupportingTest
{
protected override void AdditionalSetup() {
//Register services and dependencies here, slightly different from first
}
[Fact]
public void TestMethod2() {
// Successful test code here.
}
}
When I run these files individually (I'm using xunit), they work just fine. However, when I run them together, I get the following error on one of the test cases:
Result Message: Cirrious.CrossCore.Exceptions.MvxException : You cannot create more than one instance of MvxSingleton
Result StackTrace:
at Cirrious.CrossCore.Core.MvxSingleton`1..ctor()
at Cirrious.CrossCore.IoC.MvxSimpleIoCContainer..ctor(IMvxIocOptions options)
at Cirrious.CrossCore.IoC.MvxSimpleIoCContainer.Initialize(IMvxIocOptions options)
at Cirrious.MvvmCross.Test.Core.MvxIoCSupportingTest.ClearAll()
at Cirrious.MvvmCross.Test.Core.MvxIoCSupportingTest.Setup()
at Project.Test.TestFirstViewModel.TestMethod1() in ...
Can anyone tell me what's going on here?
The issue stems from the parallelization of xUnit without the option to do proper tear-down. You can disable parallelization in the AssemblyInfo.cs file of your test project by adding:
[assembly: CollectionBehavior(DisableTestParallelization = true)]
I ended up solving this by changing testing frameworks. I had different IoC singleton initializations because, well, they're different test cases and needed different inputs/mocks. Instead of using xUnit, I resorted to NUnit, where the cache clearing is much more clearly defined: xUnit doesn't exactly believe in setup and tear-down, so it made a test environment like this more difficult.
I fixed the issue by using the collection attribute.
[Collection("ViewModels")]
public class ViewModelATest : BaseViewModelTest {
...
}
[Collection("ViewModels")]
public class ViewModelBTest : BaseViewModelTest {
...
}
The base view model test class has the mock dispatcher and performs the singleton registrations in the additional setup method.
Each of my tests calls ClearAll() at the beginning.
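For illustration, a sketch of what such a base class could look like (MockMvxViewDispatcher comes from the stock MvvmCross test helpers; the exact registration shown here is an assumption, not the code from this answer):
public abstract class BaseViewModelTest : MvxIoCSupportingTest
{
    protected MockMvxViewDispatcher MockDispatcher { get; private set; }

    protected override void AdditionalSetup()
    {
        // Register the mock dispatcher and any shared singletons the view models need.
        MockDispatcher = new MockMvxViewDispatcher();
        Ioc.RegisterSingleton<IMvxViewDispatcher>(MockDispatcher);
    }
}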
I had some success with setting things up in a constructor and adding this check:
public PaymentRepositoryTests()
{
if (MvxSingletonCache.Instance == null)
{
Setup();
}
//other registerings.
}
I also implemented the IDisposable interface:
public void Dispose()
{
ClearAll();
}
But to be honest, I'm not sure how much impact that had.
It works OK with xUnit.
Copy MvxIoCSupportingTest and MvxTest into your xUnit PCL project.
Modify MvxTest to remove the attributes and use a simple constructor:
public class MvxTest : MvxIoCSupportingTest
{
protected MockMvxViewDispatcher MockDispatcher { get; private set; }
public MvxTest()
{
Setup();
}
...
And in each of your tests, derive from IClassFixture<MvxTest>:
public class TestRadiosApi : IClassFixture<MvxTest>
{
[Fact]
public async Task TestToken()
{
...
xUnit will create the MvxTest fixture only once for all the tests in the class.

Entity Framework CodeFirst KeyNotFoundException in MemberDomainMap

Trying to run down an error in my EF data context implementation that is yielding a fairly cryptic message.
Test Name: Nodes_can_be_saved
Test FullName: MyProj.Test.Integration.AFDataContextTest.Nodes_can_be_saved
Test Source: c:\Users\pvencill. \Documents\Visual Studio 2012\Projects\MyProj\MyProj.Test\Integration\AFDataContextTest.cs : line 49
Test Outcome: Failed
Test Duration: 0:00:01.4192808
Result Message:
Test method MyProj.Test.Integration.AFDataContextTest.Nodes_can_be_saved threw exception:
System.Data.Entity.Infrastructure.DbUpdateException: Error retrieving values from ObjectStateEntry. See inner exception for details. ---> System.Data.UpdateException: Error retrieving values from ObjectStateEntry. See inner exception for details. ---> System.Collections.Generic.KeyNotFoundException: The given key was not present in the dictionary.
Result StackTrace:
at System.Collections.Generic.Dictionary`2.get_Item(TKey key)
at System.Data.Mapping.ViewGeneration.Structures.MemberDomainMap.GetDomainInternal(MemberPath path)
at System.Data.Mapping.ViewGeneration.QueryRewriting.FragmentQueryKB.CreateIsOfTypeCondition(MemberPath currentPath, IEnumerable`1 derivedTypes, MemberDomainMap domainMap)
at System.Data.Mapping.ViewGeneration.QueryRewriting.FragmentQueryKB.CreateVariableConstraintsRecursion(EdmType edmType, MemberPath currentPath, MemberDomainMap domainMap, EdmItemCollection edmItemCollection)
at System.Data.Mapping.ViewGeneration.QueryRewriting.FragmentQueryKB.CreateVariableConstraintsRecursion(EdmType edmType, MemberPath currentPath, MemberDomainMap domainMap, EdmItemCollection edmItemCollection)
at System.Data.Mapping.ViewGeneration.ViewgenContext..ctor(ViewTarget viewTarget, EntitySetBase extent, IEnumerable`1 extentCells, CqlIdentifiers identifiers, ConfigViewGenerator config, MemberDomainMap queryDomainMap, MemberDomainMap updateDomainMap, StorageEntityContainerMapping entityContainerMapping)
at System.Data.Mapping.ViewGeneration.ViewGenerator.CreateViewgenContext(EntitySetBase extent, ViewTarget viewTarget, CqlIdentifiers identifiers)
at System.Data.Mapping.ViewGeneration.ViewGenerator.GenerateDirectionalViewsForExtent(ViewTarget viewTarget, EntitySetBase extent, CqlIdentifiers identifiers, KeyToListMap`2 views)
at System.Data.Mapping.ViewGeneration.ViewGenerator.GenerateDirectionalViews(ViewTarget viewTarget, CqlIdentifiers identifiers, KeyToListMap`2 views)
at System.Data.Mapping.ViewGeneration.ViewGenerator.GenerateAllBidirectionalViews(KeyToListMap`2 views, CqlIdentifiers identifiers)
at System.Data.Mapping.ViewGeneration.ViewgenGatekeeper.GenerateViewsFromCells(List`1 cells, ConfigViewGenerator config, CqlIdentifiers identifiers, StorageEntityContainerMapping containerMapping)
at System.Data.Mapping.ViewGeneration.ViewgenGatekeeper.GenerateViewsFromMapping(StorageEntityContainerMapping containerMapping, ConfigViewGenerator config)
at System.Data.Mapping.StorageMappingItemCollection.ViewDictionary.SerializedGenerateViews(StorageEntityContainerMapping entityContainerMap, Dictionary`2 resultDictionary)
at System.Data.Mapping.StorageMappingItemCollection.ViewDictionary.SerializedGetGeneratedViews(EntityContainer container)
at System.Data.Common.Utils.Memoizer`2.<>c__DisplayClass2.<Evaluate>b__0()
at System.Data.Common.Utils.Memoizer`2.Result.GetValue()
at System.Data.Common.Utils.Memoizer`2.Evaluate(TArg arg)
at System.Data.Mapping.StorageMappingItemCollection.ViewDictionary.GetGeneratedView(EntitySetBase extent, MetadataWorkspace workspace, StorageMappingItemCollection storageMappingItemCollection)
at System.Data.Mapping.Update.Internal.ViewLoader.InitializeEntitySet(EntitySetBase entitySetBase, MetadataWorkspace workspace)
at System.Data.Mapping.Update.Internal.ViewLoader.SyncInitializeEntitySet[TArg,TResult](EntitySetBase entitySetBase, MetadataWorkspace workspace, Func`2 evaluate, TArg arg)
at System.Data.Mapping.Update.Internal.ViewLoader.SyncContains[T_Element](EntitySetBase entitySetBase, MetadataWorkspace workspace, Set`1 set, T_Element element)
at System.Data.Mapping.Update.Internal.ExtractorMetadata..ctor(EntitySetBase entitySetBase, StructuralType type, UpdateTranslator translator)
at System.Data.Mapping.Update.Internal.UpdateTranslator.GetExtractorMetadata(EntitySetBase entitySetBase, StructuralType type)
at System.Data.Mapping.Update.Internal.ExtractorMetadata.ExtractResultFromRecord(IEntityStateEntry stateEntry, Boolean isModified, IExtendedDataRecord record, Boolean useCurrentValues, UpdateTranslator translator, ModifiedPropertiesBehavior modifiedPropertiesBehavior)
at System.Data.Mapping.Update.Internal.RecordConverter.ConvertStateEntryToPropagatorResult(IEntityStateEntry stateEntry, Boolean useCurrentValues, ModifiedPropertiesBehavior modifiedPropertiesBehavior)
--- End of inner exception stack trace ---
at System.Data.Mapping.Update.Internal.RecordConverter.ConvertStateEntryToPropagatorResult(IEntityStateEntry stateEntry, Boolean useCurrentValues, ModifiedPropertiesBehavior modifiedPropertiesBehavior)
at System.Data.Mapping.Update.Internal.ExtractedStateEntry..ctor(UpdateTranslator translator, IEntityStateEntry stateEntry)
at System.Data.Mapping.Update.Internal.UpdateTranslator.LoadStateEntry(IEntityStateEntry stateEntry)
at System.Data.Mapping.Update.Internal.UpdateTranslator.PullModifiedEntriesFromStateManager()
at System.Data.Mapping.Update.Internal.UpdateTranslator.ProduceCommands()
at System.Data.Mapping.Update.Internal.UpdateTranslator.Update(IEntityStateManager stateManager, IEntityAdapter adapter)
at System.Data.EntityClient.EntityAdapter.Update(IEntityStateManager entityCache)
at System.Data.Objects.ObjectContext.SaveChanges(SaveOptions options)
at System.Data.Entity.Internal.InternalContext.SaveChanges()
--- End of inner exception stack trace ---
at System.Data.Entity.Internal.InternalContext.SaveChanges()
at System.Data.Entity.Internal.LazyInternalContext.SaveChanges()
at System.Data.Entity.DbContext.SaveChanges()
at MyProj.Data.MyProjDataContext.SaveChanges() in c:\Users\pvencill. \Documents\Visual Studio 2012\Projects\MyProj\MyProj.Data\MyProjDataContext.cs:line 44
at MyProj.Test.Integration.AFDataContextTest.Nodes_can_be_saved() in c:\Users\pvencill. \Documents\Visual Studio 2012\Projects\MyProj\MyProj.Test\Integration\AFDataContextTest.cs:line 55
Researching the error led to few hits on Google, but the ones I found suggested that it's something to do with my model relationships, though looking at the DB that the migrations generated, all seems in order to my eyes. My relevant models are as follows.
My data context DbSets and OnModelCreating definition:
public DbSet<Blip> Blips { get; set; }
public DbSet<SensorAdapter> Sensors { get; set; }
public DbSet<NodeReport> NodeReports { get; set; }
public DbSet<Node> Nodes { get; set; }
protected override void OnModelCreating(DbModelBuilder modelBuilder)
{
modelBuilder.Entity<Blip>().Property(b => b.TimeStamp).HasColumnType("datetime2");
modelBuilder.Entity<Node>().HasMany<NodeReport>(n => n.NodeReports).WithRequired(nr=>nr.Node);
modelBuilder.Entity<Blip>().HasMany<NodeReport>(b => b.NodeReports);
base.OnModelCreating(modelBuilder);
}
The Blips and SensorAdapter objects worked fine prior to adding NodeReports to them, so I suspect it's in there that the problem lies.
I have a base Entity object that all my stuff inherits from, which just defines an Id property of type T; that was working fine.
NodeReport inherits from Report, whose definition is here:
public abstract class Report : Entity<long>
{
public Report()
{
Status = Status.Unknown;
}
public DateTime TimeStamp { get; set; }
public Status Status { get; set; }
public String Raw { get; set; }
}
NodeReport in turn is defined thus:
public class NodeReport : Report
{
public virtual Node Node { get; set; }
//public virtual Blip Blip { get; set; }
}
I tried it both with and without the Blip on there; it's commented out at the moment as I try to narrow down the problem.
A Node is a fairly sparse class at the moment too:
public class Node : Entity<long>
{
public Node ()
{
NodeReports = new List<NodeReport>();
}
public String HostName { get;set; }
public String Description { get; set; }
public virtual IList<NodeReport> NodeReports { get; set; }
}
Any suggestions would be greatly appreciated; I've been beating myself up trying to figure it out.
Well, after much searching through my code and rebuilding from scratch I found that the problem was actually that I had a derived class of Node that had a Uri as a property, which obviously failed mapping since it doesn't have a default constructor (and possibly other reasons). I solved it for now by simply changing the property to a String which I validate as a Uri internally, though I would have preferred a more elegant solution. I tried mapping Uri and even a custom subclass (w/ default constructor) of Uri to a complextype, but that didn't help.
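For what it's worth, a minimal sketch of that workaround (the class and property names here are invented for illustration):
public class RemoteNode : Node
{
    // EF maps the plain string column; it can be validated as a Uri internally.
    public string EndpointAddress { get; set; }

    // The Uri view of the value stays out of the model. Uri has no default
    // constructor, which is what broke the mapping in the first place.
    // [NotMapped] lives in System.ComponentModel.DataAnnotations.Schema in EF5+.
    [NotMapped]
    public Uri Endpoint
    {
        get { return new Uri(EndpointAddress); }
        set { EndpointAddress = value.ToString(); }
    }
}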
Still, the question above is answered.
With Paul's answer I could finally figure out my problem.
I am using EF with TPT (Table per Type) inheritance.
The source code
To make it easier, I'll use the same classes described in this tutorial.
public abstract class BillingDetail
{
public int BillingDetailId { get; set; }
public string Owner { get; set; }
public string Number { get; set; }
}
[Table("BankAccounts")]
public class BankAccount : BillingDetail
{
public string BankName { get; set; }
public string Swift { get; set; }
public Agency Agency { get; set; } /* I added it */
}
public class InheritanceMappingContext : DbContext
{
public DbSet<BillingDetail> BillingDetails { get; set; }
}
protected override void OnModelCreating(DbModelBuilder modelBuilder)
{
modelBuilder.Entity<BankAccount>().ToTable("BankAccounts");
modelBuilder.Entity<CreditCard>().ToTable("CreditCards");
}
The problem
Note that I've added a new property called Agency inside the BankAccount. Given it is a complex type, if you do not map it you'll get this annoying error at runtime!
Solution
What I did was simply ignore this property Agency, but you can also map it so EF knows what to do. Either will stop the error.
The weirdest thing is that the problem occurs even if you do not map the derived entity (BankAccount) at all. It seems that EF somehow knows that you created the derivation. So, if you're trying to run EF without mapping some derivation, you will probably get this error too.
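For illustration, here are both options expressed against the tutorial's model (a sketch; pick one, you don't need both):
protected override void OnModelCreating(DbModelBuilder modelBuilder)
{
    modelBuilder.Entity<BankAccount>().ToTable("BankAccounts");
    modelBuilder.Entity<CreditCard>().ToTable("CreditCards");

    // Option 1: tell EF to skip the property entirely.
    modelBuilder.Entity<BankAccount>().Ignore(b => b.Agency);

    // Option 2: map Agency explicitly as a complex type instead.
    // modelBuilder.ComplexType<Agency>();
}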
I had the same issue; unfortunately, none of the solutions here on Stack Overflow worked for me aside from the answers on other questions related to this issue.
But I found my own fix, and you can check whether it applies in your case as well. What happens is that some class inherits from the entity that I am using in my DbSet:
public DbSet<Employee> Employees { get; set; }
In this case, if some other class inherits from my POCO Employee, that triggers this particular error. I removed all inheritance from this class and that fixed the issue.
Take note that this inheritance problem, which triggers the same error:
The given key was not present in the dictionary.
only happens if the inheritance is in the same project. I tried inheriting the POCO in a different project and that turned out to be fine.

EF and repository pattern - ending up with multiple DbContexts in one controller - any issues (performance, data integrity)?

Most of my knowledge of ASP.NET MVC 3 comes from reading through the book Pro ASP.NET MVC 3 Framework by Adam Freeman and Steven Sanderson. For my test application I have tried to stick to their examples very closely. I am using the repository pattern plus Ninject and Moq, which means that unit testing works quite well (i.e. without needing to pull data from the database).
In the book repositories are used like this:
public class EFDbTestChildRepository
{
private EFDbContext context = new EFDbContext();
public IQueryable<TestChild> TestChildren
{
get { return context.TestChildren; }
}
public void SaveTestChild(TestChild testChild)
{
if (testChild.TestChildID == 0)
{
context.TestChildren.Add(testChild);
}
else
{
context.Entry(testChild).State = EntityState.Modified;
}
context.SaveChanges();
}
}
And here is the DbContext that goes with it:
public class EFDbContext : DbContext
{
public DbSet<TestParent> TestParents { get; set; }
public DbSet<TestChild> TestChildren { get; set; }
}
Please note: to keep things simple in this extracted example I have left out the interface ITestChildRepository here which Ninject would then use.
In other sources I have seen a more general approach where one single repository is enough for the whole application. Obviously in my case I end up with quite a list of repositories - basically one for each entity in my domain model. I'm not sure about the pros and cons of the two approaches - I just followed the book to be on the safe side.
To finally get to my question: each repository has its own DbContext - private EFDbContext context = new EFDbContext();. Do I risk ending up with multiple DbContexts within one request? And would that lead to any significant performance overhead? How about a potential for conflicts between the contexts and any consequences to the data integrity?
Here is an example where I ended up with more than one repository within a controller.
My two database tables are linked with a foreign key relationship. My domain model classes:
public class TestParent
{
public int TestParentID { get; set; }
public string Name { get; set; }
public string Comment { get; set; }
public virtual ICollection<TestChild> TestChildren { get; set; }
}
public class TestChild
{
public int TestChildID { get; set; }
public int TestParentID { get; set; }
public string Name { get; set; }
public string Comment { get; set; }
public virtual TestParent TestParent { get; set; }
}
The web application contains a page that allows the user to create a new TestChild. On it there is a selectbox that contains a list of available TestParents to pick from. This is what my controller looks like:
public class ChildController : Controller
{
private EFDbTestParentRepository testParentRepository = new EFDbTestParentRepository();
private EFDbTestChildRepository testChildRepository = new EFDbTestChildRepository();
public ActionResult List()
{
return View(testChildRepository.TestChildren);
}
public ViewResult Edit(int testChildID)
{
ChildViewModel cvm = new ChildViewModel();
cvm.TestChild = testChildRepository.TestChildren.First(tc => tc.TestChildID == testChildID);
cvm.TestParents = testParentRepository.TestParents;
return View(cvm);
}
public ViewResult Create()
{
ChildViewModel cvm = new ChildViewModel();
cvm.TestChild = new TestChild();
cvm.TestParents = testParentRepository.TestParents;
return View("Edit", cvm);
}
[HttpPost]
public ActionResult Edit(TestChild testChild)
{
try
{
if (ModelState.IsValid)
{
testChildRepository.SaveTestChild(testChild);
TempData["message"] = string.Format("Changes to test child have been saved: {0} (ID = {1})",
testChild.Name,
testChild.TestChildID);
return RedirectToAction("List");
}
}
catch (DataException)
{
//Log the error (add a variable name after DataException)
ModelState.AddModelError("", "Unable to save changes. Try again, and if the problem persists see your system administrator.");
}
// something wrong with the data values
return View(testChild);
}
}
It's not enough to have an EFDbTestChildRepository available but I also need an EFDbTestParentRepository. Both of them are assigned to private variables of the controller - and voila, it seems to me that two DbContexts have been created. Or is that not correct?
To avoid the issue I tried using EFDbTestChildRepository to get to the TestParents. But that obviously will only bring up those that are already hooked up to at least one TestChild - so not what I want.
Here is the code for the view model:
public class ChildViewModel
{
public TestChild TestChild { get; set; }
public IQueryable<TestParent> TestParents { get; set; }
}
Please let me know if I forgot to include some relevant code. Thanks so much for your advice!
There won't be a performance problem (unless we are talking about nanoseconds, instantiating a context is very cheap) and you won't have damaged your data integrity (before that happens you'll get exceptions).
But the approach is very limited and will work only in very simple situations. Multiple contexts will lead to problems in many scenarios. As an example: Suppose you want to create a new child for an existing parent and would try it with the following code:
var parent = parentRepo.TestParents.Single(p => p.Id == 1);
var child = new TestChild { TestParent = parent };
childrenRepo.SaveTestChild(child);
This simple code won't work because parent is already attached to the context inside parentRepo, but childrenRepo.SaveTestChild will try to attach it to the context inside childrenRepo, which causes an exception: an entity must not be attached to more than one context. (There is actually a workaround here, because you could set the FK property instead of loading the parent: child.TestParentID = 1. But without an FK property it would be a problem.)
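A sketch of that FK-property workaround using the question's classes (it assumes a TestParent with ID 1 already exists in the database):
// No parent entity is loaded, so nothing gets attached to a second context.
var child = new TestChild
{
    TestParentID = 1,   // set the foreign key directly instead of the navigation property
    Name = "New child",
    Comment = "Created without loading the parent"
};
childrenRepo.SaveTestChild(child);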
How to solve such a problem?
One approach could be to extend the EFDbTestChildRepository by a new property:
public IQueryable<TestParent> TestParents
{
get { return context.TestParents; }
}
In the example code above you could then use only one repository and the code would work. But as you can see, the name "EFDbTestChildRepository" doesn't really fit the purpose of the new repository anymore. It should now be "EFDbTestParentAndChildRepository".
I would call this the Aggregate Root approach which means that you create one repository not for only one entity but for a few entities which are closely related to each other and have navigation properties between them.
An alternative solution is to inject the context into the repositories (instead of creating it in the repositories) to make sure that every repository uses the same context. (The context is often abstracted into a IUnitOfWork interface.) Example:
public class MyController : Controller
{
private readonly MyContext _context;
public MyController()
{
_context = new MyContext();
}
public ActionResult SomeAction(...)
{
var parentRepo = new EFDbTestParentRepository(_context);
var childRepo = new EFDbTestChildRepository(_context);
//...
}
protected override void Dispose(bool disposing)
{
_context.Dispose();
base.Dispose(disposing);
}
}
This gives you a single context per controller you can use in multiple repositories.
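The repositories would then take the shared context through their constructors instead of newing one up, e.g. (a sketch based on the question's EFDbTestChildRepository; the question calls the context type EFDbContext, the controller above calls it MyContext):
public class EFDbTestChildRepository
{
    private readonly EFDbContext context;

    // The controller (or IoC container) passes in the one context for this request.
    public EFDbTestChildRepository(EFDbContext context)
    {
        this.context = context;
    }

    public IQueryable<TestChild> TestChildren
    {
        get { return context.TestChildren; }
    }

    public void SaveTestChild(TestChild testChild)
    {
        if (testChild.TestChildID == 0)
        {
            context.TestChildren.Add(testChild);
        }
        else
        {
            context.Entry(testChild).State = EntityState.Modified;
        }
        context.SaveChanges();
    }
}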
The next step might be to create a single context per request by dependency injection, like...
private readonly MyContext _context;
public MyController(MyContext context)
{
_context = context;
}
...and then configuring the IoC container to create a single context instance per request, which gets injected into perhaps multiple controllers.
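With Ninject, which the question already uses, that per-request wiring might look like this (just a sketch, assuming the Ninject.Web.Common package is installed):
// In NinjectWebCommon.RegisterServices (or wherever the bindings are set up):
kernel.Bind<MyContext>().ToSelf().InRequestScope();   // one DbContext per HTTP request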
Do I risk ending up with multiple DbContexts within one request?
Yes. Each instance of a repository is going to instantiate its own DbContext instance. Depending on the size and use of the application, this may not be a problem, although it is not a very scalable approach. There are several ways of handling this, though. In my web projects I add the DbContext(s) to the request's HttpContext.Items collection; this way it is available to all classes that require it. I use Autofac (similar to Ninject) to control what DbContexts are created within specific scenarios and how they are stored, e.g. I have a different 'session manager' for a WCF context than the one for an HTTP context.
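A rough sketch of that HttpContext.Items approach (the class and key names are mine, not from a specific library):
using System.Web;

public static class RequestDbContext
{
    private const string Key = "Request.EFDbContext";

    // One context per HTTP request, created lazily on first access.
    public static EFDbContext Current
    {
        get
        {
            var items = HttpContext.Current.Items;
            if (items[Key] == null)
            {
                items[Key] = new EFDbContext();
            }
            return (EFDbContext)items[Key];
        }
    }

    // Call this from Application_EndRequest so the context is disposed with the request.
    public static void DisposeCurrent()
    {
        var context = HttpContext.Current.Items[Key] as EFDbContext;
        if (context != null)
        {
            context.Dispose();
        }
    }
}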
And would that lead to any significant performance overhead?
Yes, but again not massively if the application is relatively small. As it grows though, you may notice the overhead.
How about a potential for conflicts between the contexts and any consequences to the data integrity?
One of the reasons for using an ORM like this is so that changes can be maintained within the DbContext. If you are instantiating multiple context instances per request you lose this benefit. You wouldn't notice conflicts or any impact on data integrity per se unless you were handling a lot of updates asynchronously.
As promised, I'm posting my solution.
I came across your question because I was having trouble with the IIS application pool memory growing beyond limits, and having multiple DbContexts was one of my suspects. In retrospect it is fair to say that there were other causes for my trouble. However, it challenged me to find a better layer-based design for my repository.
I found this excellent blog: Correct use of Repository and Unit Of Work patterns in ASP.NET MVC, which pointed me in the right direction. The redesign is based on the Unit of Work pattern. It enables me to have just one constructor parameter for all my controllers instead of never-ending constructor parameters. And after that, I was able to introduce proactive caching as well, which solved a great deal of the trouble mentioned earlier.
Now I only have these classes:
IUnitOfWork
EFUnitOfWork
IGenericRepository
EFGenericRepository
See the blog referred to above for complete information and the implementation of these classes. Just to give an example, IUnitOfWork contains repository definitions for all the entities that I need, like:
namespace MyWebApp.Domain.Abstract
{
public interface IUnitOfWork : IDisposable
{
IGenericRepository<AAAAA> AAAAARepository { get; }
IGenericRepository<BBBBB> BBBBBRepository { get; }
IGenericRepository<CCCCC> CCCCCRepository { get; }
IGenericRepository<DDDDD> DDDDDRepository { get; }
// etc.
string Commit();
}
}
The Dependency Injection (DI) is just one statement (I use Ninject):
ninjectKernel.Bind<IUnitOfWork>().To<EFUnitOfWork>();
The controller constructors stay maintainable:
public class MyController : BaseController
{
private MyModel mdl = new MyModel();
private IUnitOfWork _context;
public MyController(IUnitOfWork unitOfWork)
{
_context = unitOfWork;
// initialize whatever needs to be exposed to the View:
mdl.whatever = unitOfWork.SomeRepository.AsQueryable();
}
// etc.
Within the controller I can use _context to access all repositories, if needed. The nice part is that it takes just a single Commit() call to save changed data across all repositories:
_context.Commit();
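For orientation, here is a compressed sketch of what an EFUnitOfWork along those lines can look like (the linked blog post has the real implementation; this is not a copy of it, and it assumes EFGenericRepository takes the shared context in its constructor):
public class EFUnitOfWork : IUnitOfWork
{
    private readonly EFDbContext context = new EFDbContext();

    private IGenericRepository<AAAAA> aaaaaRepository;
    private IGenericRepository<BBBBB> bbbbbRepository;
    private IGenericRepository<CCCCC> cccccRepository;
    private IGenericRepository<DDDDD> dddddRepository;

    // Each repository is created lazily against the one shared context.
    public IGenericRepository<AAAAA> AAAAARepository
    {
        get { return aaaaaRepository ?? (aaaaaRepository = new EFGenericRepository<AAAAA>(context)); }
    }

    public IGenericRepository<BBBBB> BBBBBRepository
    {
        get { return bbbbbRepository ?? (bbbbbRepository = new EFGenericRepository<BBBBB>(context)); }
    }

    public IGenericRepository<CCCCC> CCCCCRepository
    {
        get { return cccccRepository ?? (cccccRepository = new EFGenericRepository<CCCCC>(context)); }
    }

    public IGenericRepository<DDDDD> DDDDDRepository
    {
        get { return dddddRepository ?? (dddddRepository = new EFGenericRepository<DDDDD>(context)); }
    }

    // A single SaveChanges covers everything done through any of the repositories.
    public string Commit()
    {
        context.SaveChanges();
        return null;   // or an error/status message, depending on how callers use it
    }

    public void Dispose()
    {
        context.Dispose();
    }
}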

Access TestContext in SpecFlow Step Binding class

Is it possible to access the MSTest TestContext from within a SpecFlow (1.7.1) step binding class?
In the generated code of a feature file there is a method FeatureSetup which takes the TestContext as an argument but apparently doesn't do anything with it.
I found a way to pass parameters through TestContext and then access them from SpecFlow.
I added a [TestClass] which has a TestContext property and marked its AssemblyInit() method as [AssemblyInitialize], so it gets initialized early, before the tests run, and MSTest is able to populate the TestContext.
[TestClass]
public class InitializeTestContext
{
public static TestContext Context { get; private set; }
[AssemblyInitialize]
public static void AssemblyInit(TestContext context)
{
Context = context;
}
}
And then I can access it from my BaseSteps class:
public abstract class BaseSteps : TechTalk.SpecFlow.Steps
{
public string GetTestEnvironment()
{
TestContext testContext = InitializeTestContext.Context;
string testEnvironment = testContext.Properties["Environment"].ToString();
return testEnvironment;
}
}
Gáspár Nagy answered on SpecFlow google group: https://groups.google.com/group/specflow/browse_thread/thread/5b038e3e283fdbfe#
By default, no. We have a test-provider-independent ScenarioContext.Current that can be used for similar purposes.
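For example, step bindings can share values through it without touching MSTest at all (a sketch; the keys and step texts are arbitrary):
using System;
using TechTalk.SpecFlow;

[Binding]
public class EnvironmentSteps
{
    [Given(@"the test environment is ""(.*)""")]
    public void GivenTheTestEnvironmentIs(string environment)
    {
        // Store a value for the current scenario.
        ScenarioContext.Current["Environment"] = environment;
    }

    [When(@"a later step needs the environment")]
    public void WhenALaterStepNeedsTheEnvironment()
    {
        // Read it back anywhere within the same scenario.
        var environment = (string)ScenarioContext.Current["Environment"];
        Console.WriteLine("Running against: " + environment);
    }
}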
Further to Valentin's answer, here is an example of a test generator that will add in the test context. It's from the same Google group.
Gáspár Nagy said it may be added to the provider that ships with SpecFlow.
So to answer the OP's question, yes it is possible.

Forcing MSTest to use a single thread

Given this test fixture:
[TestClass]
public class MSTestThreads
{
[TestMethod]
public void Test1()
{
Trace.WriteLine(Thread.CurrentThread.ManagedThreadId);
}
[TestMethod]
public void Test2()
{
Trace.WriteLine(Thread.CurrentThread.ManagedThreadId);
}
}
Running the tests with MSTest through Visual Studio or the command line prints two different thread numbers (yet they are run sequentially anyway).
Is there a way to force MSTest to run them using a single thread?
I solved this problem with locking:
public static class IntegrationTestsSynchronization
{
public static readonly object LockObject = new object();
}
[TestClass]
public class ATestCaseClass
{
[TestInitialize]
public void TestInitialize()
{
Monitor.Enter(IntegrationTestsSynchronization.LockObject);
}
[TestCleanup]
public void TestCleanup()
{
Monitor.Exit(IntegrationTestsSynchronization.LockObject);
}
//test methods
}
// possibly other test cases
This can of course be extracted to a base test class and reused.
I've fought for endless hours to make MSTest run in single-threaded mode on a large project that made heavy use of NHibernate and its not-thread-safe ISession (not a complaint, it's just not thread-safe).
We ended up spending more time writing code to support the multi-threaded nature of MSTest because - to the best of my and my team's knowledge - it is not possible to run MSTest in single-threaded mode.
You can derive your test class from:
public class LinearTest
{
private static readonly object SyncRoot = new object();
[TestInitialize]
public void Initialize()
{
Monitor.Enter(SyncRoot);
}
[TestCleanup]
public void Cleanup()
{
Monitor.Exit(SyncRoot);
}
}
We try hard to keep our tests isolated from each other. Many of them achieve this by setting up the state of a database, then restoring it afterwards. Although the tests mostly set up different data, with some 10,000 in a run there is a fair chance of a collision unless the author of a test takes care to ensure its initial data is unique (i.e. doesn't use the same primary keys as another test doing something similar). This is, frankly, unmanageable, and we do get occasional test failures that pass the second time around. I am fairly sure this is caused by collisions that would be avoided by running tests strictly sequentially.
The way to make an MSTest method run in single-threaded mode:
NuGet:
Install-Package MSTest.TestAdapter
Install-Package MSTest.TestFramework
In your test source, on those methods that need to run while no other tests are running:
[TestMethod]
[DoNotParallelize]
public void myTest(){
//
}
Whilst it is a cop-out answer, I would actually encourage you to make your code thread-safe. The behaviour of MSTest is to ensure isolation, as Richard has pointed out. By encountering problems with your unit tests you are proving that there could be some problems in the future.
You could ignore them, use NUnit, or deal with them and continue to use MSTest.
I tried a slightly different approach, because the underlying problem is the naming of the pipes. So I made a fake pipe, PipeDummy, derived from the one I use in the program, and named the pipe after the test's name.
[TestClass]
public class PipeCommunicationContractTests {
private PipeDummy pipe;
/// <summary>
///Gets or sets the test context which provides
///information about and functionality for the current test run.
///</summary>
public TestContext TestContext { get; set; }
[TestInitialize]
public void TestInitialize() {
pipe = new PipeDummy(TestContext.TestName);
pipe.Start();
}
[TestCleanup]
public void TestCleanup() {
pipe.Stop();
pipe = null;
}
...
[TestMethod]
public void CallXxOnPipeExpectResult(){
var result = pipe.Xx();
Assert.AreEqual("Result",result);
}
}
It appears to be a bit faster, since we can run on multiple cores and threads...
