This is my project structure. I have a DataAccess project that houses the edmx file and related classes generated by EF4. This project gets called by my DataAccessLibrary.dll file. The public methods in DataAccessLibrary.dll get called by my Nunit Test project's test methods.
When I attach VS2010 to the NUnit (v2.5.10) process, I can run my tests and the methods connect to the database fine. But when I right-click the project file and use TestDriven.Net to 'run tests', I get an error when attempting to connect to the database. The DataAccess project, DataAccessLibrary.dll, and the NUnit test project all have the config file containing the connection string, but I still get this error: "The specified named connection is either not found in the configuration, not intended to be used with the EntityClient provider, or not valid."
What do I need to do for this?
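For reference, an EF4 named connection in app.config must use the EntityClient format (metadata plus a wrapped provider connection string), not a plain ADO.NET connection string, and it must be present in the config file of the process that actually hosts the tests. A minimal sketch; the entity container name, model resource names, and server are assumptions to be matched to your .edmx:

```xml
<connectionStrings>
  <!-- "MyEntities", the Model.* resource names, and the Data Source are
       placeholders; copy the real values from the DataAccess project's config -->
  <add name="MyEntities"
       connectionString="metadata=res://*/Model.csdl|res://*/Model.ssdl|res://*/Model.msl;provider=System.Data.SqlClient;provider connection string=&quot;Data Source=.;Initial Catalog=MyDb;Integrated Security=True&quot;"
       providerName="System.Data.EntityClient" />
</connectionStrings>
```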
Related
We tried to implement the dependency mapping explained in this Microsoft post, but it doesn't work.
Microsoft dependency mapping implementation example
We have enabled "Run only impacted tests" in our build and set the "Settings file" and TIA.usermapfile options on the Visual Studio Test task.
After that, we created the XML file at the root of the solution and set its parameters so that specific tests (matched by FullyQualifiedName) are executed for the files contained in the check-in's changes.
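For context, a sketch of the shape of the mapping file we created, following the linked post; the element names should be checked against that post, and the filter and path here are placeholders, not our real values:

```xml
<TestImpactMap>
  <Tests>
    <!-- Run tests whose FullyQualifiedName matches the filter whenever
         files under the dependency path change; names are illustrative -->
    <Test Name="FullyQualifiedName~MyCompany.Tests.Repositories">
      <Dependencies>
        <Dependency Name="src/Repositories/**/*" />
      </Dependencies>
    </Test>
  </Tests>
</TestImpactMap>
```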
We have verified that the XML file is correctly located at the $(System.DefaultWorkingDirectory) in the build agent server.
We have checked in some modifications which impact the repositories classes.
As you can see, there is no information about the TIAmap.xml file in the Test assemblies log, and the tests were not started, whereas the code changes impact the repository folder.
We have more than two thousand tests, and none of them are launched.
Do you have any idea why no tests are run at all, whether they impact the repositories folder or not?
We are using VSTS for build and release management, and using CI/CD. Typically, our solutions consist of a web application project, and a database project.
Our current release tasks take the application offline (using app_offline.htm), publish the database, then publish the web application. Publishing the database project often results in no changes, as due to CI/CD we are much more frequently updating code on the web app than changing the db schema.
Is there a way to only run the database publish task (using WinRM) when it detects a change in the database project code, in our git repository?
EDIT: This in itself isn't a problem, as typically when the DACPAC gets published, there will be no activity. HOWEVER, I've been requesting that the database is backed up using the /p:BackupDatabaseBeforeChanges=true flag - which seems to back up the database even if there are no changes. This is an issue for large databases.
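One way to sketch the detection (an assumption, not a built-in VSTS feature): a small script step can ask git whether the last commit touched the database project folder and set a pipeline variable, which later tasks can use in a custom condition. The folder name `Database/` and the variable name `DbChanged` are placeholders; the demo repository below only exists to make the snippet self-contained.

```shell
#!/bin/sh
# Sketch: set a pipeline variable only when the last commit touched the
# database project. "Database/" and "DbChanged" are assumed names.
set -e

# --- demo repo so the snippet is self-contained ---
repo=$(mktemp -d)
cd "$repo"
git init -q
git -c user.email=ci@example.com -c user.name=ci commit -q --allow-empty -m "base"
mkdir -p Database
echo "CREATE TABLE t (id int);" > Database/t.sql
git add Database
git -c user.email=ci@example.com -c user.name=ci commit -q -m "schema change"

# --- the actual check: did HEAD touch anything under Database/ ? ---
if [ -n "$(git diff --name-only HEAD~1 HEAD -- Database/)" ]; then
  # VSTS logging command: later tasks can condition on DbChanged
  echo "##vso[task.setvariable variable=DbChanged]true"
  echo "database changed"
else
  echo "no database change"
fi
```

A release task (the backup-enabled DACPAC publish, for instance) could then run only when `DbChanged` is true.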
The simple way is to separate the web project and the database project into two build definitions.
Create a new build definition
Enable Continuous Integration in Triggers tab
Specify Path filter to include database project
Modify the Visual Studio Build task: specify the /t:[database project name] argument in the MSBuild Arguments box so that only the database project is built
The same steps for web project
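For the MSBuild Arguments box above, the target is the project name as it appears in the solution; note that when building a single project through a .sln file, MSBuild replaces dots in project names with underscores. A hypothetical command-line equivalent (solution and project names are placeholders):

```shell
msbuild MySolution.sln /t:MyCompany_Database /p:Configuration=Release
```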
Create a new release definition
Add artifacts for previous two build definitions and enable Continuous deployment trigger
Add two environments (e.g. database, web)
Open Pre-deployment conditions of an environment (e.g. database)
Enable Artifact filters, select the corresponding artifact (e.g. the database build artifact), and specify the build branch (* means all branches)
Add tasks to just deploy database in this environment
The same steps for web environment
The answer is - exactly what I want isn't possible.
I want to make sure that I can run my unit tests, developed using SpecFlow/C#, from MTM on my local machine.
Background info:
- Developed the unit test methods using SpecFlow-C#
- Checked in my project to TFS
Problem:
- Can anyone guide me, step by step, through integrating these unit test methods?
I have gone through various MSDN articles, but somehow I get lost in there.
I have learnt that I need to create a build definition (but how?) and set up a test controller and test agent (again, how?).
Please guide me through this.
Thanks in advance.
MTM needs three things to execute tests:
A build with which to associate the test run. It can only read from TFS. If you're not using TFS for build/deployment you can run the TFSBuild.exe tool to create a dummy build in TFS to point MTM to.
A test lab in which to run the test. Install both the controller and agent on your local machine. Open the agent configuration tool and register it to the controller. Open the controller configuration tool and register it with TFS Team Project Collection. Once this is done, you should be able to see the controller when setting up the lab environment in MTM.
A test case in TFS which is associated with a unit test. As far as I know, this association must be made in Visual Studio. In Team Explorer, open up the work items. Find your test case and click on the Associated Automation tab. In the Test Name field you select a unit test method to tie to the testcase. This is the test that gets run when the testcase is executed in MTM.
I work with Coded UI Tests and Visual Studio 2013. Now I have to test whether images from a folder are shown correctly inside the application. That's why I have created a folder containing the images, set their build action to None, and set their deployment to Always.
Unfortunately all tests are executed in an own test results folder and my images are not deployed correctly. I know I could do this by using the DeploymentItem attribute or a testsettings file but I don't want to do this. I want to avoid the "test results" folder and run the tests from within the output folder of my test project.
I do this with the unit tests for instance. They are written with XUnit.Net which works just fine. I thought it would also work with MS Test but this only seems to work for Unit Tests but not for Coded UI Tests.
So, to sum it up again: How can I get rid of the “Test Results” folder when using Coded UI Tests with Visual Studio 2013 and run my tests simply from within the output directory of the project?
You can do this using a custom entry in the .runsettings file I think.
DeploymentItemAttribute Class
Consider running your unit tests directly in the build output directory, so that testing runs more rapidly. This is especially useful on the build server after you have checked in your tests. To do this, add a .runsettings file to your solution, include <DeploymentEnabled>False</DeploymentEnabled>, and select the file in the Test, Test Settings menu. The same effect occurs in any test run in which DeploymentItemAttribute is not used at all. However, you might prefer not to do this if you want to be able to inspect the data files after a failed run. You cannot avoid using a deployment folder if you are using a .testsettings file, which is required for web and load tests, coded UI tests, and any test in which you deploy an application to remote machines.
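A minimal .runsettings along those lines; the MSTest/DeploymentEnabled element is the documented switch, though whether it applies to Coded UI tests is exactly the open question here:

```xml
<?xml version="1.0" encoding="utf-8"?>
<RunSettings>
  <MSTest>
    <!-- Run tests from the build output directory instead of a
         per-run TestResults deployment folder -->
    <DeploymentEnabled>False</DeploymentEnabled>
  </MSTest>
</RunSettings>
```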
As for DeploymentItemAttribute itself, I would steer clear of it unless the following issues have been fixed: Gotchas: MSTest's [DeploymentItem] attribute. +1 for xUnit in TestDriven.NET.
You can try deleting the LocalTestRun.testrunconfig file from the Solution Items folder (directly under the solution). When we did that it started using the \bin\debug\ folder instead of the TestResults folder when running our unit tests using MSTest.
I am using NHibernate against an Oracle database with the NHibernate.Driver.OracleDataClientDriver driver class. I have an integration test that pulls back expected data properly when executed through the IDE using TestDriven.net. However, when I run the unit test through the NUnit GUI or Console, NHibernate throws an exception saying it cannot find the Oracle.DataAccess assembly. Obviously, this prevents me from running my integration tests as part of my CI process.
NHibernate.HibernateException : The IDbCommand and IDbConnection implementation in the assembly Oracle.DataAccess could not be found. Ensure that the assembly Oracle.DataAccess is located in the application directory or in the Global Assembly Cache. If the assembly is in the GAC, use the <qualifyAssembly> element in the application configuration file to specify the full name of the assembly.
I have tried making the assembly available in two ways: by copying it into the bin\debug folder, and by adding the <qualifyAssembly> element in the config file. Again, both methods work when executing through TestDriven.net in the IDE; neither works when executing through the NUnit GUI/Console.
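For completeness, the config change referred to looks like this (the version and public key token are taken from the error below; adjust them to the ODP.NET build actually installed):

```xml
<configuration>
  <runtime>
    <assemblyBinding xmlns="urn:schemas-microsoft-com:asm.v1">
      <!-- Map the partial name NHibernate asks for to the exact
           GAC identity of the Oracle provider assembly -->
      <qualifyAssembly partialName="Oracle.DataAccess"
                       fullName="Oracle.DataAccess, Version=2.111.7.20, Culture=neutral, PublicKeyToken=89b483f429c47342" />
    </assemblyBinding>
  </runtime>
</configuration>
```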
The NUnit Gui log displays the following message.
21:42:26,377 ERROR [TestRunnerThread] ReflectHelper [(null)] - Could not load type Oracle.DataAccess.Client.OracleConnection, Oracle.DataAccess.
System.BadImageFormatException: Could not load file or assembly 'Oracle.DataAccess, Version=2.111.7.20, Culture=neutral, PublicKeyToken=89b483f429c47342' or one of its dependencies. An attempt was made to load a program with an incorrect format.
File name: 'Oracle.DataAccess, Version=2.111.7.20, Culture=neutral, PublicKeyToken=89b483f429c47342' ---> System.BadImageFormatException: Could not load file or assembly 'Oracle.DataAccess' or one of its dependencies. An attempt was made to load a program with an incorrect format.
File name: 'Oracle.DataAccess'
I am running NUnit 2.4.8, TestDriven.net 2.24 and VS2008sp1 on Windows 7 64bit. Oracle Data Provider v2.111.7.20, NHibernate v2.1.0.4.
Has anyone run into this issue, better yet, fixed it?
Sly, thanks for the reply. However, the NUnit test runner was using the correct configuration file; I verified this by pulling a known value out of the expected configuration file.
I am assuming this has something to do with my configuration, specifically with either Windows 7 in general or the 64bit version. I went ahead and installed/configured the Oracle client on the build server (W2k3 Server). I moved the test onto the build server and ran the same scenarios described above and the tests worked as expected in all cases.
I followed this up by running through the scenarios on two of the other developer workstations (Win XP 32bit with same toolset versions) and the tests worked as expected in all cases.
I'm perplexed but satisfied for now. I can run my integration tests through the IDE and can execute them on the build server through our CI automation. Only problem now is I can't test the automation build project on my development workstation.
I was getting this error when I tried to run an application which used Oracle.DataAccess.dll (ODP.NET version 2.111.7.20). I was shipping the Oracle 11g instant client alongside the application, and on 64-bit servers it would fail. The client assemblies I was shipping were 32-bit, so I compiled a version of the app with the platform target set to 32-bit and now it works fine. This is because the server runs the entire component inside the WOW64 emulator when you explicitly tell it the app is 32-bit.
I had a similar problem when configuring NHibernate. The thing is that most test runners use the app.config that sits next to your test DLL, but some versions of NUnit don't. That's why your system stays unconfigured for the tests. You can try configuring NHibernate manually from the test. Hope it helps.
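A sketch of configuring NHibernate in code rather than from app.config; the APIs are standard NHibernate 2.x, but the dialect, connection string, and entity type are assumptions to be replaced with your own:

```csharp
using NHibernate.Cfg;

// Build the configuration explicitly so the test does not depend on
// which app.config the runner happens to load.
var cfg = new Configuration()
    .SetProperty(NHibernate.Cfg.Environment.ConnectionDriver,
                 "NHibernate.Driver.OracleDataClientDriver")
    .SetProperty(NHibernate.Cfg.Environment.Dialect,
                 "NHibernate.Dialect.Oracle9iDialect")
    .SetProperty(NHibernate.Cfg.Environment.ConnectionString,
                 "Data Source=MYDB;User Id=scott;Password=tiger;");

cfg.AddAssembly(typeof(SomeMappedEntity).Assembly); // your mappings assembly
var sessionFactory = cfg.BuildSessionFactory();
```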
I may have just solved a similar/same problem on my local machine by going into the build settings for the test project and changing the platform target from "Any CPU" to "x86".
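Two related workarounds in the same spirit, sketched from the command line (hedged: check that your NUnit version actually ships an x86 runner before relying on it):

```shell
# Use the 32-bit NUnit console runner, if your NUnit version includes one
nunit-console-x86.exe MyTests.dll

# Or force an AnyCPU test assembly to load as a 32-bit process
corflags MyTests.dll /32BIT+
```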