A test class with the following test is discovered as expected:
[Theory]
[AutoData]
public void MyDiscoveredTest() { }
However, the following test is missing:
[Theory]
[AutoNSubstituteData]
public void MyMissingTest() { }
Interestingly, if I put MyDiscoveredTest after MyMissingTest, then MyDiscoveredTest is now also missing. I have tried both the xUnit Visual Studio runner and the xUnit console runner, with the same results.
My AutoNSubstituteData attribute is defined here:
internal class AutoNSubstituteDataAttribute : AutoDataAttribute
{
    internal AutoNSubstituteDataAttribute()
        : base(new Fixture().Customize(new AutoNSubstituteCustomization()))
    {
    }
}
A related question: since the AutoNSubstituteDataAttribute above seems like a fairly common attribute, I'm wondering why it's not included with AutoFixture.AutoNSubstitute. Similarly useful would be an InlineAutoNSubstituteDataAttribute. Should I submit a pull request for these?
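For what it's worth, the InlineAutoNSubstituteDataAttribute I have in mind would presumably just compose the existing attributes; here is a rough sketch (assuming AutoFixture's CompositeDataAttribute and xUnit's InlineDataAttribute, mirroring how InlineAutoDataAttribute is built):

// Requires the Xunit and Ploeh.AutoFixture.Xunit2 namespaces.
public class InlineAutoNSubstituteDataAttribute : CompositeDataAttribute
{
    public InlineAutoNSubstituteDataAttribute(params object[] values)
        : base(new InlineDataAttribute(values), new AutoNSubstituteDataAttribute())
    {
    }
}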
Nuget package versions used:
AutoFixture 3.30.8
AutoFixture.Xunit2 3.30.8
AutoFixture.AutoNSubstitute 3.30.8
xunit 2.0.0
xunit.runner.visualstudio 2.0.0
xunit.runner.console 2.0.0
NSubstitute 1.8.2.0
I am using Visual Studio 2013 Update 4 and targeting the .NET 4.5.1 Framework
Update: As recommended, I tried TestDriven.NET-3.9.2897 Beta 2. The missing test now runs; however, it still seems there is some bug. New example:
[Theory]
[AutoData]
public void MyWorkingTest(string s)
{
    Assert.NotNull(s); // Pass
}

[Theory]
[AutoNSubstituteData]
public void MyBrokenTest(string s)
{
    Assert.NotNull(s); // Fail
}

[Theory]
[AutoData]
public void MyWorkingTestThatIsNowBroken(string s)
{
    Assert.NotNull(s); // Fail even though identical to MyWorkingTest above!
}
Both MyBrokenTest and MyWorkingTestThatIsNowBroken fail at Assert.NotNull, while MyWorkingTest passes even though it is identical to MyWorkingTestThatIsNowBroken. So not only does the AutoNSubstituteData attribute not work properly, but it is causing the downstream test to misbehave!
Update2: Changing the definition of AutoNSubstituteDataAttribute to public instead of internal fixes everything. The xUnit runner now discovers and passes all the tests, as does TestDriven.Net. Any idea about this behavior? Is it expected?
Both the xUnit Visual Studio runner and the TestDriven.Net runner exhibit these weird issues because the AutoNSubstituteDataAttribute class and its constructor are internal. Changing both to public resolves all the issues. If the attribute were simply being ignored, I would expect an error like this: System.InvalidOperationException : No data found for ...
This still doesn't explain why the downstream tests are affected by the offending AutoNSubstituteData attribute on a completely different test. It seems like the unit test runners should be more robust in this case.
For completeness here is the working implementation of AutoNSubstituteDataAttribute:
public class AutoNSubstituteDataAttribute : AutoDataAttribute
{
    public AutoNSubstituteDataAttribute()
        : base(new Fixture().Customize(new AutoNSubstituteCustomization()))
    {
    }
}
I'm coding in IntelliJ IDEA. When debugging my application, I can't use some default method implementations in Watches.
Here is a condensed example:
public class Friendship {
    interface Friend {
        default void sayHiTo(Friend friend) {
            System.out.println("Hi, " + friend.hashCode());
        }

        default int amountOfHands() {
            return 2;
        }
    }

    public static class BasicFriend implements Friend {
        int numberOfFaces() {
            return 1;
        }
    }

    public static void main(String[] args) {
        System.out.println("Put a breakpoint here");
    }
}
In the main() method, I put a breakpoint and set up three watches:
// Default interface method with dependency
new BasicFriend().sayHiTo(new BasicFriend())
// Default interface method without dependency
new BasicFriend().amountOfHands()
// Class method
new BasicFriend().numberOfFaces()
The first watch throws NoSuchMethodException complaining that method Friendship$BasicFriend.sayHiTo() doesn't exist.
The second watch runs successfully, but strangely it reports a boxed object {java.lang.Integer#537} "2" instead of just a primitive 2.
The third watch reports a primitive 1, just as expected.
Why is the first watch not working? Is this a bug? Is this actually IDE related? Is it because of some conceptual flaw of default methods? Should it be working as I want it to in the first place? Is the strange result of the second watch somehow related to the issue in the first watch?
Prior to JDK 8u40, default and static interface methods were not supported by JDI (Java Debugger Interface), JDWP (Java Debugger Wire Protocol) and JDB (the standard Java debugger). This is bug JDK-8042123, which is recorded as fixed in 8u40 and a corresponding blurb appears in the 8u40 release notes.
Update to 8u40 or later to fix this issue, at least on the JDK side.
From the bug description, it looks like debugger-side changes are also required: rather than casting com.sun.jdi.InterfaceType objects to com.sun.jdi.ClassType, the debugger should call InterfaceType.invokeMethod() directly.
In the specific case of IntelliJ, Suseika confirmed in a comment that 14.1.2 has mostly fixed the issue (except the unexpected boxing), though Mike Kobit still experiences this problem on that version with a ClassCastException suggestive of the incorrect cast above.
I'm trying to get parallel tests to work in NUnit v3; however, the tests don't seem to run in parallel.
Considering the following test class:
namespace NUnitAlpha3Experimental
{
    [TestFixture]
    [Parallelizable(ParallelScope.Children)]
    class DummyTests
    {
        [Test]
        public void MustSuccess()
        {
            Assert.IsTrue(true);
            FileIO.appendToFile("output.txt", Reflexion.GetCurrentMethodName());
        }

        [Test]
        public void MustFail()
        {
            Thread.Sleep(500);
            FileIO.appendToFile("output.txt", Reflexion.GetCurrentMethodName());
            Assert.IsFalse(true);
        }
    }
}
Whenever I run my tests, "MustFail" is always output before "MustSuccess". "MustSuccess" should appear first if the tests were run in parallel. Maybe there's something wrong with my attributes; I don't know.
Please help. Thank you.
edit: I added -workers=8 to my command line:
[...] \NUnit3\nunit-console NUnitAlpha3Experimental.exe /framework:net-4.5 -workers=8
but still, my tests don't seem to run in parallel.
More info here: https://groups.google.com/forum/#!topic/nunit-discuss/_Zcd3EjiJGo
From the author of NUnit, parallel test cases are not yet implemented: https://groups.google.com/forum/#!topic/nunit-discuss/_Zcd3EjiJGo
Parallel execution of test fixtures is implemented, though.
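As a workaround until then, splitting the tests into separate fixtures marked [Parallelizable] should let the fixtures run concurrently. A rough sketch (the fixture split and names are my own illustration, not code from the question):

using System.Threading;
using NUnit.Framework;

namespace NUnitAlpha3Experimental
{
    // Each fixture is parallelizable, so the two fixtures can run in parallel
    // even though individual test cases within a fixture cannot yet.
    [TestFixture]
    [Parallelizable]
    public class FastTests
    {
        [Test]
        public void MustSucceed()
        {
            Assert.IsTrue(true);
        }
    }

    [TestFixture]
    [Parallelizable]
    public class SlowTests
    {
        [Test]
        public void MustFail()
        {
            Thread.Sleep(500);
            Assert.IsFalse(true);
        }
    }
}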
I'm getting an error when trying to run the EF 4.3.1 add-migration command:
"The model backing the ... context has changed since the database was created".
Here's one sequence that gets the error (although I've tried probably a dozen variants which also all fail)...
1) Start with a database that was created by EF Code First (ie, already contains a _MigrationHistory table with only the InitialCreate row).
2) The app's code data model and database are in-sync at this point (the database was created by CF when the app was started).
3) Because I have four DBContexts in my "Services" project, I didn't run the 'enable-migrations' command (it doesn't handle multiple contexts). Instead, I manually created the Migrations folder in the Services project and the Configuration.cs file (included at the end of this post). [I think I read this approach in a post somewhere]
4) With the database not yet changed, and the app stopped, I use the VS EDM editor to make a trivial change to my data model (add one property to an existing entity), and have it generate the new classes (but not modify the database, obviously). I then rebuild the solution and all looks OK (but don't delete the database or restart the app, of course).
5) I run the following PMC command (where "App" is the name of one of the classes in Configuration.cs):
PM> add-migration App_AddTrivial -conf App -project Services -startup Services -verbose
... which fails with the "The model ... has changed. Consider using Code First Migrations..." error.
What am I doing wrong? And does anyone else see the irony in the tool telling me to use what I'm already trying to use ;-)
What are the correct steps for setting up a solution starting with a database that was created by EF CF? I've seen posts saying to run an initial migration with -ignorechanges, but I've tried that and it doesn't help (my attempt is shown below). Actually, I've spent all DAY testing various permutations, and nothing works!
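Roughly, that baseline attempt looked like this (the migration name is arbitrary; the switches mirror the add-migration command above):
PM> add-migration Initial -conf App -project Services -startup Services -ignorechanges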
I must be doing something really stupid, but I don't know what!
Thanks,
DadCat
Configuration.cs:
namespace mynamespace
{
    internal sealed class App : DbMigrationsConfiguration<Services.App.Repository.ModelContainer>
    {
        public App()
        {
            AutomaticMigrationsEnabled = false;
            MigrationsNamespace = "Services.App.Repository.Migrations";
        }

        protected override void Seed(Services.App.Repository.ModelContainer context)
        {
        }
    }

    internal sealed class Catalog : DbMigrationsConfiguration<Services.Catalog.Repository.ModelContainer>
    {
        public Catalog()
        {
            AutomaticMigrationsEnabled = false;
            MigrationsNamespace = "Services.Catalog.Repository.Migrations";
        }

        protected override void Seed(Services.Catalog.Repository.ModelContainer context)
        {
        }
    }

    internal sealed class Portfolio : DbMigrationsConfiguration<Services.PortfolioManagement.Repository.ModelContainer>
    {
        public Portfolio()
        {
            AutomaticMigrationsEnabled = false;
            MigrationsNamespace = "Services.PortfolioManagement.Repository.Migrations";
        }

        protected override void Seed(Services.PortfolioManagement.Repository.ModelContainer context)
        {
        }
    }

    internal sealed class Scheduler : DbMigrationsConfiguration<Services.Scheduler.Repository.ModelContainer>
    {
        public Scheduler()
        {
            AutomaticMigrationsEnabled = false;
            MigrationsNamespace = "Services.Scheduler.Repository.Migrations";
        }

        protected override void Seed(Services.Scheduler.Repository.ModelContainer context)
        {
        }
    }
}
When using EF Migrations, you should have one data context per database. I know it can grow really large, but by trying to split it you will run into several problems. One is the migrations issue you are experiencing. Later on, you will probably face problems when trying to write queries that join tables from the different contexts. Don't go that way; it's against how EF is designed.
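As a rough illustration only (the entity and class names below are hypothetical, not taken from the question), consolidating into a single context means a single migrations configuration, so only one model is tracked by the _MigrationHistory table:

using System.Data.Entity;
using System.Data.Entity.Migrations;

// Hypothetical entities standing in for the four separate models.
public class AppItem { public int Id { get; set; } }
public class CatalogItem { public int Id { get; set; } }
public class PortfolioItem { public int Id { get; set; } }
public class ScheduledJob { public int Id { get; set; } }

public class ServicesContext : DbContext
{
    public DbSet<AppItem> AppItems { get; set; }
    public DbSet<CatalogItem> CatalogItems { get; set; }
    public DbSet<PortfolioItem> PortfolioItems { get; set; }
    public DbSet<ScheduledJob> ScheduledJobs { get; set; }
}

internal sealed class ServicesConfiguration : DbMigrationsConfiguration<ServicesContext>
{
    public ServicesConfiguration()
    {
        AutomaticMigrationsEnabled = false;
    }

    protected override void Seed(ServicesContext context)
    {
    }
}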
Here is the output:
Loading C:\TEMP\BankDemo_mstest\Test_BankDemo\bin\Debug\Test_BankDemo.dll...
Starting execution...
Results Top Level Tests
------- ---------------
Error Test.BankDemo.AccountTest.CreditTest
Error Test.BankDemo.AccountTest.DebitTest
Error Test.BankDemo.AccountTest.FreezeTest
0/3 test(s) Passed, 3 Error
Summary
-------
Test Run Error.
Error 3
--------
Total 3
This is the command I used:
OpenCover\OpenCover.Console.exe -register:user
  -output:"Codecoverage.xml"
  -mergebyhash
  -target:"C:\Program Files\Microsoft Visual Studio 10.0\Common7\IDE\MSTest.exe"
  -targetargs:"/testcontainer:\"C:\TEMP\BankDemo_mstest\Test_BankDemo\bin\Debug\Test_BankDemo.dll\" /noisolation"
  -filter:"-[Bank.*]* +[Bank*]* +[Bank.Accounts*]* -[Test.BankDemo*]*"

ReportGenerator\bin\ReportGenerator.exe Codecoverage.xml Coverage HTML
(I have even tried regsvr32 to register the profiler, and I am using XP.)
Actually, I am a beginner with NUnit, MSTest and OpenCover. I found a sample unit test case at http://www.nunit.org/index.php?p=quickStart&r=2.4, so the NUnit test class is as below:
private TestContext testContextInstance;

public TestContext TestContext
{
    get { return testContextInstance; }
    set { testContextInstance = value; }
}

private int store;

[TestInitialize()]
public void TestFixtureSetUp()
{
    store = 1;
}
The above class works fine with NUnit, and OpenCover also shows accurate data, but the same class didn't work after I replaced the attributes with their MSTest-specific equivalents. After posting this question I figured out that this method has to be static and take a TestContext argument, so I made the code changes shown below and the above command worked fine.
MSTest class:
private TestContext testContextInstance;

public TestContext TestContext
{
    get { return testContextInstance; }
    set { testContextInstance = value; }
}

[ClassInitialize()]
public static void ClassInit(TestContext context)
{
}
Your tests aren't failing; they're erroring, meaning there appears to be a problem compiling the test project. It stands to reason that you'll get no coverage if the tests cannot be built and executed.
There could be two reasons for this; however, I suspect your filters are wrong. As described in the usage, the filters are
(+/-)[assembly/module filter]namespace.typefilter
and exclusion filters take precedence over inclusion filters
So your -[Bank.*]* is excluding those types before the +[Bank.Accounts*]* (and probably +[Bank*]*) filters can take effect. Since the default filter +[*]* is only added if you supply no extra filters beyond the default ones, you should only need to add filters for the modules you wish to profile, i.e. +[Bank.*]*.
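In other words, drop the -[Bank.*]* exclusion and keep only the filters for the modules you actually want covered; the option might then look like this (the module names are a guess based on your original filter):

-filter:"+[Bank*]* +[Bank.Accounts*]* -[Test.BankDemo*]*"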
If you open up the XML output then if a class is filtered out then a reason is supplied via the skippedDueTo attribute.
The other reason could be missing PDB files that are not in the same folder as the assembly (some test harnesses copy assemblies to other folders, but I see you are using the /noisolation switch, so this shouldn't be it).
Please feel free to discuss, or if you think there is a bug, raise an issue on the OpenCover GitHub site.
I've inherited a web application written in ASP.NET that has an incomplete implementation of a localization scheme (not using resource files). Here's a micro version:
public class Useful
{
    public string DoSomething()
    {
        return Localizations.Do_Something_Message_vx7Hds8i;
    }
}

public class Localizations
{
    public const string Do_Something_Message_vx7Hds8i = "Some text!";
}
In almost all cases, these localized strings aren't even used in more than one place. I'd like to factor out this annoying localization layer before properly localizing the app.
The end result I want is just:
public class Useful
{
    public string DoSomething()
    {
        return "Some text!";
    }
}
This is proving tediously slow, and I have over 1000 of these localized constants in this app.
What would be awesome would be a one-click way of selecting the reference and have it automatically suck in the string contents. I'm using Visual Studio 2008 and ReSharper 5.1.
Does anyone know if there's a way to accomplish this? It seems like there should be a proper name for what I'm trying to do (anti-modularization?), but I'm a little stumped about where to start.
The default key command in ReSharper for the Inline refactoring is Ctrl+Alt+N.