Organization of Unit Tests in Visual Studio

I'm currently creating a paired unit test assembly for every assembly in my project; the two sit side by side in the same folder:
MyProject/MyProject.csproj
MyProject.Test/MyProject.Test.csproj
Looking at open source projects, I've seen some smaller projects put all their tests in one assembly, and others split them out like mine. I'm dealing with a large solution, so it would be pretty crazy to put all the tests in one project.
I currently have MSBuild logic to run tests on all *.Test.csproj files. If all my tests lived under a single separate folder I wouldn't need to do this.
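Roughly, that logic looks something like this (a simplified sketch only; the item and target names are illustrative, not my exact script):

    <!-- Collect every test project in the tree and build it. -->
    <ItemGroup>
      <TestProjects Include="**\*.Test.csproj" />
    </ItemGroup>
    <Target Name="BuildAndRunTests">
      <!-- Build each discovered test project; a test runner step would follow. -->
      <MSBuild Projects="@(TestProjects)" Targets="Build" />
    </Target>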
Just wondering if there are any good arguments to do things a certain way.
Thanks

I do it the same way but I change the default namespace for each test project to match the namespace of the production project. So the tests for class X.Y.Foo are in X.Y.FooTest rather than X.Y.Test.FooTest - it means you need fewer using directives, and generally makes things simpler.
My main reason for wanting to keep the two in separate projects is to avoid either including the tests in the production library or having to ship an untested library. With the separate project structure, you can run unit tests against anything you build. It also makes it easier to look through just the production classes without having twice as many files to look at (when getting the "feel" of a library).
Finally, don't forget that if you need to access internal members when testing, there's always [InternalsVisibleTo].
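A minimal example of that attribute, assuming the test assembly is called MyProject.Test (it goes in the production project, typically in AssemblyInfo.cs):

    // In the production assembly (e.g. Properties/AssemblyInfo.cs):
    // exposes internal members to the named test assembly only.
    using System.Runtime.CompilerServices;

    [assembly: InternalsVisibleTo("MyProject.Test")]

Note that if the production assembly is strongly named, the test assembly must be strongly named too, and the attribute argument then has to include the test assembly's public key.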

I suggest making as few unit test projects as possible. The reason is that each one you create adds at least ten seconds of compile time. In a big project, that starts to add up.
Here's the directory structure I use:
projectName/branches/trunk/projects/code/codeproject1
projectName/branches/trunk/projects/code/codeproject2
projectName/branches/trunk/projects/code/codeproject3
projectName/branches/trunk/projects/Tests/testproject1
projectName/branches/trunk/Dependencies
projectName/prototypes
projectName/...
and within testproject1, the following directory structure:
codeproject1/
codeproject2/
codeproject2/web
codeproject2/web/mvc
codeproject3/
codeproject3/support

I do the same thing, except each project is in its own folder under the same root folder.
Something along the following lines:
Solution Folder
  ProjectA folder
  ProjectA.Test folder
  ProjectB folder
  ProjectB.Test folder

I always have a separate test project for each project. Part of it is simply that I like the organization of it, but I've also often run into situations where I've decided to break a library out into its own solution so that it can be reused by other solutions. In those cases, having the library project come with its own separate test project (rather than all the tests living in a single project) makes it much easier to break that library out.

Related

Avoiding building twice when using a shared project together with build-generated code

I have a Visual Studio solution with multiple projects. One generates code files as part of a pre-build step (gRPC classes via Grpc.Tools). There is also a shared project that extends the partial classes produced by that pre-build step.
However, sometimes, for one reason or another - such as compiling only the client half of this (the client uses the shared project to extend its own classes) - compilation will fail because the shared project can't find the generated classes yet; presumably they don't exist. It's easily fixed by compiling the project twice.
Is there something I can do in this scenario? Is it possible to somehow move validating/compiling the shared project further down the compilation pipeline? Or could I even set that particular project to try to compile twice if there's an error? Or is this the kind of thing that, realistically, I should just live with given what I'm doing? I haven't found any other references to this problem. It's not that big of an issue and it wouldn't happen very often, but I'd like to handle it sensibly if I can.
Edit
If I wasn't clear, this is a shared project, as in a .shproj, a project that is not compiled separately. The project that references it includes it and builds it all together as one.
If project B depends on project A, then project A must be built before project B. Visual Studio is smart enough to figure out the build order this way. Incidentally, this is also one of the reasons (among many) why circular dependencies simply cannot work.
I suspect that your projects are currently not linked via a dependency, as this issue wouldn't occur if there were such a link. Perhaps your second project is accessing the first project's files via the file system? That's just a guess though.
You can use this "A before B which depends on A" behavior of the build process to your advantage. Have project B (i.e. the project you need to go second) add project A (i.e. the project you need to go first) as a dependency. This forces VS to build them in the appropriate order.
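For ordinary (compiled) projects, that dependency is usually expressed as a project reference in B's .csproj - a sketch, with an illustrative path:

    <!-- In ProjectB.csproj: referencing ProjectA forces it to build first. -->
    <ItemGroup>
      <ProjectReference Include="..\ProjectA\ProjectA.csproj" />
    </ItemGroup>

Alternatively, Visual Studio lets you declare a build-order-only dependency (without adding a reference) via the solution's Project Dependencies dialog.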
Some remarks:
I am unsure whether VS will optimize away dependencies that you add but never actually use (i.e. you never reference their content). I can't find any confirmation on this point (but absence of proof is not proof of absence!). Even if it does, that can easily be worked around by adding a dummy class in B that actually references and uses something from project A.
Keep in mind that during a regular build, VS does not rebuild projects that have not changed since the last build. If this is an issue for you (I'm not sure it is; you didn't add enough context), make sure to rebuild or clean so that a fresh build is triggered.
However, sometimes, for one reason or another - such as compiling only the client half of this (the client uses the shared project to extend its own classes) - compilation will fail because the shared project can't find the generated classes yet; presumably they don't exist. It's easily fixed by compiling the project twice.
The fact that it only happens sometimes and can be fixed by "trying again" points at one thing: you have a race condition. But a race condition during compilation is not something I have heard of or encountered before.
I have a few possible culprits, but in the end, race conditions are notoriously hard to debug:
- Maybe the compiler that deals with the shared project returns before it is finished (which should be impossible).
- Maybe something causes the main projects to compile before the shared project's files are ready.
- Maybe a third-party tool - like a virus scanner or automatic backup tool - interferes?
- Maybe the shared project's compiled files are hosted on a network drive, and there is sometimes just the slightest delay between "compiled" and "visible to all other computers on the network"?
Usually the proper mechanisms for dependent compilation should deal with such issues, which indicates that what you have there is probably not the most stable setup.

Mono for Android - One Solution for many clients

I have created three different solutions for three different clients, but those solutions are for an app that has the same features, classes, methods, and resolution; only the images, XML resource files, and a web service reference are specific to each one.
I would like to have just one solution for all those apps, one that I could open in the VS2010 IDE for editing without errors. Then, when I need to build or publish a specific app, I just select which client I need and go ahead with building or publishing.
It is important to consider that the XML file names will be the same, as will the class and image names. The difference will be in the content, but the names will always be the same.
My intention is to reduce the effort of maintaining many solutions by having just one solution to work with.
In my company, we will have more than these three clients soon, so I am worried about how to maintain that. The best approach would be to have just one solution, so that when I need to generate a new app for a new client, I only have to change/include a few things (like some resources and images) and compile to a new client folder.
Is it possible? If so how?
One option would be to have a master solution which contains the following:
- A "Template" project that contains your actual application and all of the shared code
- A project for each of your clients
In the projects for your clients, you could add links to the files that come from your Template project. Then, in each of those projects, you add the files that are specific to that client alone.
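In a client's .csproj, a linked file looks something like this (the file and folder names are only placeholders):

    <!-- This file physically lives in the Template project but is compiled into the client project. -->
    <ItemGroup>
      <Compile Include="..\Template\Services\DataService.cs">
        <Link>Services\DataService.cs</Link>
      </Compile>
    </ItemGroup>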
With this kind of structure, whenever you made a change to your Template project, all of the client projects would be updated as well because they just have pointers back to the Template project.
A good reference for this kind of setup is the Json.NET code base. There, the author has a solution and a project for each of the different configurations, but they all share the same files.
In terms of ensuring that the XML files are named properly, you might just want to put some checks into your main application to verify that it has all of the files it needs, or potentially add a check to your build process.
There are many ways you could look to tackle this.
My favorite would be to run some sort of pre-build step - probably outside of Visual Studio - which simply replaces the files with the correct ones before you do a build. This would be easy to automate and easy to scale.
If you are going to be building for many more than three customers, then I think you should look to switch from building in Visual Studio to some other automated build system - e.g. MSBuild from the command line, or something like TeamCity or CruiseControl. You'll find it much easier to scale if your build is automated (and robust).
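As a sketch of what that could look like when driven by MSBuild from the command line (every name here - the script, the ClientName property, the folder layout - is invented for illustration):

    <!-- clients.proj: copy one client's files over the shared ones, then build the app. -->
    <Project xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
      <Target Name="BuildForClient">
        <ItemGroup>
          <ClientFiles Include="Clients\$(ClientName)\**\*.*" />
        </ItemGroup>
        <Copy SourceFiles="@(ClientFiles)"
              DestinationFiles="@(ClientFiles->'App\Resources\%(RecursiveDir)%(Filename)%(Extension)')" />
        <MSBuild Projects="App\App.csproj" Targets="Build" />
      </Target>
    </Project>

You would then call something like msbuild clients.proj /t:BuildForClient /p:ClientName=ClientA from your build server, once per client.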
If you don't like the file idea, then there are plenty of other things you could try:
- You could do a similar step to the above, but inside VS, using a pre-build step.
- You could use conditional nodes within the .csproj file to switch files via a project configuration (see the sketch after this list).
- You could shift the client-specific resources into another assembly and then use GetResourceStream (or similar) at runtime to extract the resources.
But none of these feel as nice to me!
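For what it's worth, the conditional-nodes option might look roughly like this inside the .csproj (the configuration names and file paths are invented; AndroidResource is the Mono for Android build action for resource files):

    <!-- Pull in a different resource file depending on which configuration is being built. -->
    <ItemGroup Condition=" '$(Configuration)' == 'ClientA' ">
      <AndroidResource Include="Clients\ClientA\Resources\values\strings.xml" />
    </ItemGroup>
    <ItemGroup Condition=" '$(Configuration)' == 'ClientB' ">
      <AndroidResource Include="Clients\ClientB\Resources\values\strings.xml" />
    </ItemGroup>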

Unit/Integration Test Organization in a Large Visual Studio Solution

I'm starting to develop and organize tests for a very large Visual Studio solution. (Yes, I know that tests should have been developed along with the code rather than when the project is nearly complete, but things are what they are.)
I've seen similar questions about organizing unit tests in Visual Studio solutions, but I didn't see any that address integration tests as well. I would appreciate some guidance about where to place test projects so that they don't clutter up the already large code base.
Here's the basic hierarchy of things within the solution. (All items not ending in .proj are folders within a project or Solution Folders.)
HardwareServices
  HardwareService1
    HardwareService1.Core.proj
    HardwareService1.Host.proj
    HardwareService1.Service.proj
  HardwareService2
    HardwareService2.Core.proj
    HardwareService2.Host.proj
    HardwareService2.Service.proj
Infrastructure
  MyApp.Database.proj
  MyApp.Infrastructure.proj
  MyApp.ReportViewer.proj
  MyApp.SettingsManager.proj
AppModules
  AppModule1.proj
    Common
    Reports
    Services
    ViewModels
    Views
  AppModule2.proj (similar structure to other AppModules)
  AppModule3.proj (similar structure to other AppModules)
Modules
  ComputeEngine.proj
  Footer.proj
  Header.proj
CommonServices.proj
My thought was to make a Solution Folder called "Tests" and then mimic the hierarchy above, making one test project for every production code project. Within each test project, I would make folders called "UnitTests" and "IntegrationTests".
My focus is to create a consistent naming/organization scheme so that there's no ambiguity about where new tests should go and where to find existing tests. Given the large size of this project/application, I'd like to get the structure pretty solid right out of the gate so that it's not a pain later.
Thank you for your time and advice.
The naming convention that our company adopted was the use of projectName.Tests.Unit and projectName.Tests.Integration.
With your existing structure you would have something like this:
HardwareService1
  HardwareService1.Core.proj
  HardwareService1.Host.proj
  HardwareService1.Service.proj
  Tests
    HardwareService1.Core.Tests.Unit
    HardwareService1.Core.Tests.Integration
If you keep the Tests folder alongside the root folder of each project, you don't have to mimic the complete structure again, as the tests sit right next to the respective project.
Side note
Having a consistent Tests.Unit suffix in the project names helps when running unit tests from your build script, as you can pick up the test assemblies with a wildcard search like **\*tests.unit*.dll
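For example, a build script target that consumes that convention might look something like this (the runner path is a placeholder for whichever test runner you use):

    <!-- Gather every built assembly that follows the Tests.Unit naming convention -->
    <!-- and hand the whole list to the test runner in one go. -->
    <ItemGroup>
      <UnitTestAssemblies Include="**\bin\**\*Tests.Unit*.dll" />
    </ItemGroup>
    <Target Name="RunUnitTests">
      <Exec Command="tools\test-runner.exe @(UnitTestAssemblies, ' ')" />
    </Target>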
At the end of the day, project structure can be very subjective so do what makes sense in your environment and makes sense to your team.

Add all projects to same solution or not?

I am the intranet developer for the company I work for, and I have been doing this for the last 5 years. My projects are divided into two solutions, the "Intranet" solution itself and the "Library" solution. The "Library" solution itself has several projects containing the DAL, BLL, etc. The reason I kept them in a separate solution is that I thought that "maybe", one day, my library solution could be used in other projects as well - you know, reuse the code that I already wrote :) Well, that never happened. Now, since it's so much easier to have all the projects in the same .sln, I am thinking of just doing that. Is that a wise decision? What would you do if you were in my shoes?
In the past I've used and reused the same 'project' in multiple solutions - I really just see a solution as a 'particular' instance of a collection of projects.
For example, we might have different solutions for the same overall piece of software depending on whether we want to be doing unit testing (in its own project) and/or integration testing (in a separate project), and we'd open the right solution for whatever it is we're about to do. That way, if you're doing normal coding with unit testing, you don't have to build the integration test code every time, and vice versa.
The only thing to watch out for is bringing a project into a solution when it is a dependency of lots of other projects/solutions, and then "accidentally" changing its code without realising it's a side project rather than your main code. Then you can start breaking loads of other projects that depend on it without realising!
Yes, you can do it! You can still reuse your DAL and BLL, as the project settings are stored in the individual project files (csproj, vbproj, ...). Dependencies are stored there as well, so there's no problem and you're good to go. I have an add-in infrastructure, and for each and every add-in package I need the add-in host, which is included in several solution files. I have never experienced any problems with this. Open up your *.sln file in a text editor to see its contents - it's just links to the projects.
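For example, each project entry in a .sln looks roughly like this (the project name, path, and second GUID below are placeholders; the first GUID is simply Visual Studio's marker for a C# project):

    Project("{FAE04EC0-301F-11D3-BF4B-00C04F79EFBC}") = "Library.DAL", "Library.DAL\Library.DAL.csproj", "{00000000-0000-0000-0000-000000000000}"
    EndProject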
Simply add your library projects to your intranet sln. Keep your library solution as is.
I would personally add them all to the same solution, yes. It doesn't matter if you plan on using some of the libraries in the solution in other projects: you can still add the compiled DLL to those solutions, or you have the option of adding them as an existing project to the new solutions.
So yes, I add everything to the same solution: GUI projects, libraries, even unit tests. Maybe if your solution becomes incredibly large (20+ projects, or larger) it would be better to split them up, but I've never worked on such large projects.
I prefer to have two solutions. They both have an identical structure (more or less), but the second contains only reusable infrastructure code that isn't tied to a particular project. The thing is, your project code shouldn't contain framework-like stuff (e.g. an IsNumeric() string extension method) next to serious banking business logic (even if you never end up reusing your "Library"). Keeping them apart just makes things much more readable and maintainable.
For example:
Solution1:
ProjectName.Core
ProjectName.Infrastructure
ProjectName.Application
ProjectName.UI
ProjectName.IntegrationTests
ProjectName.UnitTests
And
Solution2:
CompanyName.Core
CompanyName.Infrastructure
CompanyName.Application
CompanyName.UI
CompanyName.Tests
And I try not to have more than 10 projects in one solution. Otherwise it leads to endless toggling between "unload project" and "reload project".
I, for my part, have separated the solutions and the projects, leaving me with a big bunch of projects and only a few solution files. I add whichever projects I need to new solutions.
The upside is that my workspace contains only the projects I really need, while changes still flow into all the other solutions.
The downside is that changes flow into all the other solutions too, which means that if you change the API of a widely used library, you'll have to check all your other solutions for incompatibilities.

Visual Studio solution structure using Codesmith frameworks (NetTiers / Plinqo)

I have been using the Codesmith framework NetTiers to generate a DAL etc., into a folder called, say, 'NetTiers', outside my main project's folder, and referencing the DLLs within that folder from my main project.
I've started using the Plinqo framework, and want to use the generated files from that framework within the same project as the one I'm using with NetTiers. (The reason I'm using both frameworks is that I want to get/learn the newer LINQ goodness from Plinqo, yet also have the familiar NetTiers DAL and BLL syntax available, for compatibility.)
My question is: what's the best Visual Studio solution and file structure to use when using Codesmith templates like these? Should the frameworks' generated code be contained outside the main project and added as projects to the overall solution? Or should each template's generated code have its own solution? Should the generated files be within the main project's file structure?
I've tried combinations of each of these, and they each have their pros and cons. I'd like to know if there's a tried and tested pattern.
When it comes to .netTiers, I always compile the generated solution and add the assemblies as references to my project. This makes it much easier to upgrade/diff and regen.
However, there are going to be some cases where you would want to add your custom logic so keep this in mind.
Thanks
-Blake Niemyjski
I tend to just keep the .csp file and the generated folder outside of my main app's folder. When adding a reference, Visual Studio copies in the DLLs from the built generated code. All of the generated projects sit under a main folder such as D:\CodeSmith Projects\
If you want to version control the .csp file, it might be beneficial to move it in with the rest of your version-controlled app files to tie it all together.
We put the generated projects inside our solution. In fact, on my current project I generated the NetTiers files to the location where I wanted them to be and started adding my own project files to that... But we have always kept the files in the solution; that way, if I need to add something to the code in the concrete classes, I can do it without having to open a whole new project.
We have tried both scenarios. We settled on including the assemblies in a dependencies folder, which was shared by multiple projects.
We had problems with TFS when the projects were included in the solution. The downside is that you can't as easily step into the .NetTiers-generated code when debugging, though after a while you get used to this and accept that whatever is in .NetTiers stays within .NetTiers!
