Managing Dependencies of Reflected Dependencies - visual-studio

I presently work with a large solution, containing about 100 projects. At least 10 of the projects are executable applications. Some of the library projects are imported as plugins via MEF and reflection rather than with direct references. If a needed plugin's own dependencies are not copied to the output or plugin directory of the executable project using it, we'll get reflection errors at runtime.
We've already tried or discussed the following solutions, but none of them seem like a good fit:
"Hard" References: Originally, we had the executable projects reference other projects they needed, even if they were going to ultimately be imported as optional plugins. This quickly fell out of favor with team members who needed to make builds that excluded certain plugins and liked to unload those projects to begin with. This also made it difficult to use Resharper or other tools to clean unused references and remove obsolete third party libraries without accidentally blowing away the "unused" references to the needed plugins own dependencies.
Post-build copying (with pre-build "pull"): For a brief period, a senior team member set all the plugin projects to xcopy their outputs to a known "DependencyInjection" folder as post-build events. Projects that needed those plugins had pre-build events that xcopied each desired plugin into their own output directories. While this meant the plugin projects "rightly" had no knowledge of where they might be used, it caused two major headaches. First, any time one made a change in a plugin project, one had to separately build (in sequence) the plugin project and then the executable project used to test it (to get the files to copy over). Rebuild All would be more convenient but far too slow. Second, the continuous integration build would have had to be reconfigured, since it compiled everything into one directory and only cared whether everything built successfully.
Post-build copying (push): The present solution started with xcopy and now mostly uses robocopy in post-build events of the plugin projects to copy needed files directly to the plugin folders of the executable projects that use them. This works fairly well in that if one makes a change in a plugin, one can go straight to running with the debugger. Also, the CI build doesn't break, and users disabling certain "optional" plugin projects for various builds don't get build errors from missing references. Still, this seems hackish and is cumbersome to maintain across all the separate post-build event windows, which are rather small and can't be expanded. When executable projects get moved or renamed during a restructure, we don't find out about the broken copy steps until the next day, after hearing the results of the overnight automated testing.
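For illustration, one of those post-build events might look roughly like the following in a plugin project's .csproj (the executable name, Plugins folder and robocopy switches here are made-up examples rather than our exact setup):

<!-- Illustrative post-build event on a plugin project, pushing its output into an
     executable's plugin folder. The trailing dot works around TargetDir's trailing
     backslash, and the errorlevel check keeps Visual Studio from treating robocopy's
     non-zero "success" codes (1-7) as a build failure. -->
<PropertyGroup>
  <PostBuildEvent>
robocopy "$(TargetDir)." "$(SolutionDir)MyExecutable\bin\$(ConfigurationName)\Plugins" /NP
if errorlevel 8 exit 1
exit 0
  </PostBuildEvent>
</PropertyGroup>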
"Dummy" projects with references: One idea that was briefly tossed about involved making empty projects for each of the different executable build configurations and going back to the hard references method on those. Each would use its own references to gather up the plugins and their dependencies. They would also have a reference to the actual executable and copy it over. Then, if one wanted to run a particular executable in a particular configuration, you'd run its dummy project. This one seemed particularly bloated and was never attempted.
NuGet: In my limited familiarity with NuGet, this seems like a good fit for using packages, except I wouldn't know how to implement that internal to one solution. We've talked about breaking up the solution, but many members of the team are strongly opposed to that. Is using NuGet with packages coming from within the same solution possible?
What are best practices for a situation like this? Is there a better solution to managing dependencies of reflected dependencies like this than any of the above, or is a refinement of one of the above the best choice?

OK, so I assume in this answer that each developer needs to constantly have all 100 assemblies (Debug mode) locally to do their job (develop, compile, smoke test, run automated tests).
You mention that Rebuild All takes a long time. Generally this symptom is caused by too many assemblies plus a build process that hasn't been rationalized. So the first thing to do is to try to merge the 100 assemblies into as few assemblies as possible and to avoid things like Copy Local = true. The effect will be a much faster (like 10x) Rebuild All. Keep in mind that assemblies are physical artefacts and are only useful for physical concerns (like plug-ins, loading on demand, test/app separation...). I wrote a white-book that details my thoughts on the topic: http://www.ndepend.com/WhiteBooks.aspx
Partitioning code base through .NET assemblies and Visual Studio projects (8 pages)
Common valid and invalid reasons to create an assembly
Increase Visual Studio solution compilation performance (up to x10 faster)
Organize the development environment
Among the white-book's advice, one idea is to avoid referencing projects and to reference assemblies instead. This way it becomes your responsibility to fill in Project > right click > Project Dependencies, which defines Project > right click > Project Build Order. If you decide to keep dealing with 100 assemblies, defining this setting represents some effort, but as a bonus a high-level (executable) project can depend on a library that is only used through reflection, and this solves your problem.
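To make that concrete, here is a minimal sketch of what such an assembly reference could look like inside a .csproj (the assembly name and path are hypothetical):

<ItemGroup>
  <!-- Reference the compiled assembly rather than the Visual Studio project;
       Private = False is the MSBuild form of Copy Local = false, so the DLL is
       not duplicated into every consumer's output directory. -->
  <Reference Include="MyPluginLibrary">
    <HintPath>..\..\bin\Debug\MyPluginLibrary.dll</HintPath>
    <Private>False</Private>
  </Reference>
</ItemGroup>

The build order that project references would normally give you is then declared once through Project Dependencies, as described above.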
Did you measure the lines of code in terms of the number of PDB sequence points? I estimate that up to a limit of 200K to 300K, doing a Rebuild All (with the optimizations described in the white-book) should take 5 to 10 seconds (on a decent laptop), which remains acceptable. If your code base is very large and goes beyond this limit, you'll need to break my first assumption and find a way for a developer not to need all assemblies to do their job (in which case we can talk about this further).
Disclaimer: This answer references resources from the site of the NDepend tool, which I created and whose development I now manage.

I have been in a situation like yours. We had almost 100 projects. We too were using MEF and System.AddIn. In the beginning we had a few solutions. I was working on the core solution, which included the core assemblies and their tests. Each plug-in category lived in a separate solution that included contracts, implementations (some plug-ins had more than one implementation) and tests, plus a test host as well as the core assemblies. At some later point we added a solution that included all projects, and after trying a few of the approaches you mention we decided to do the following:
1. Keep the references that are mandatory.
2. All executable projects were set to output to common locations (one for debug and one for release configurations).
3. All projects that should not be referenced were set to output to these common locations.
4. All projects that were referenced by others were left unchanged, and each reference was set with Copy Local = true.
5. Tests were left unchanged.
Although building everything was slow, we didn't have any other problems. Of course, having almost 100 projects is a sign that the design is probably too modular and, as Patrick advises, we should have tried to compact it.
Anyway, you could try this approach in a couple of hours, and perhaps instead of setting Copy Local = true, try setting the output folder of all the projects mentioned in 4 to the common locations. We didn't know that this setting would slow down the build process, as Patrick mentions.
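If you do try it, the change boils down to pointing each project's OutputPath at the shared folder, roughly like this (folder names are hypothetical):

<!-- Every project emits to the same common location, one folder per configuration -->
<PropertyGroup Condition=" '$(Configuration)|$(Platform)' == 'Debug|AnyCPU' ">
  <OutputPath>..\..\build\Debug\</OutputPath>
</PropertyGroup>
<PropertyGroup Condition=" '$(Configuration)|$(Platform)' == 'Release|AnyCPU' ">
  <OutputPath>..\..\build\Release\</OutputPath>
</PropertyGroup>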
PS. We never tried using NuGet because we didn't have enough resources and time to experiment with it. It looked promising though.

We are starting up a new project and I am looking for the "best practices" solution to this similar problem. For us, the projects can be divided into two categories: 1) the Platform assemblies, which provide a common set of services across the board, and 2) Verticals, which perform business-specific functions.
In the past we have used a Visual Studio plug-in with a simple UI that allows developers to specify a common assemblies path to copy the output assemblies to, and then reference all assemblies (wherever they reside in a different solution) from the common assemblies folder.
I am looking at NuGet, but the sheer amount of work you have to do to create and maintain NuGet packages is punitive.
It's a very common scenario, and I would be really interested to see how others have addressed it.

Related

Cruise Control .NET: two projects, same working directory

In the ccnet wiki, under the project block's workingDirectory element, I read: "Make sure this folder is unique per project to prevent problems with the build." I want to have two projects that share the same working directory... what are the "problems with the build" that can occur, and how can I overcome them?
Edit:
My situation: I have two applications in the same trunk that share some common code. If I commit to one of the applications, I don't want the other application to build and increase its version number, but if I make a change to the common code, I want both of them to trigger a build. My source control is SVN, and I use a Filtered source control block to include only the files I want to trigger a build.
Option 1
Have a single project that builds just the common code. This should emit its built assembly to a known location outside its working directory.
Have two other projects that build the other parts of the solution. Each only listens to changes on its particular source control paths. Each project can incorporate/reference the built assembly from the known location.
The two other projects can be forced from the common project using a forceBuildPublisher (see the sketch after this list).
These projects should be in the same queue to prevent the common project rewriting the built common assembly whilst it is being referenced by the other two projects.
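A rough sketch of the common project's ccnet.config, with the filtered trigger and the forceBuildPublisher wired together (element names are as I recall them from the CCNet documentation, and the URLs, paths and project names are placeholders, so verify them against your CCNet version):

<project name="CommonCode">
  <!-- only commits under the common code path trigger this project -->
  <sourcecontrol type="filtered">
    <sourceControlProvider type="svn">
      <trunkUrl>http://svnserver/repo/trunk</trunkUrl>
      <workingDirectory>C:\Builds\CommonCode</workingDirectory>
    </sourceControlProvider>
    <inclusionFilters>
      <pathFilter>
        <pattern>/trunk/Common/**/*.*</pattern>
      </pathFilter>
    </inclusionFilters>
  </sourcecontrol>
  <publishers>
    <xmllogger />
    <!-- force the two dependent projects once the common assembly has been rebuilt -->
    <forcebuild>
      <project>ApplicationA</project>
      <serverUri>tcp://localhost:21234/CruiseManager.rem</serverUri>
    </forcebuild>
    <forcebuild>
      <project>ApplicationB</project>
      <serverUri>tcp://localhost:21234/CruiseManager.rem</serverUri>
    </forcebuild>
  </publishers>
</project>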
Option 2
Have two individual projects that each build the common source code and their specific code together, say by building a solution file which contains both sets of projects.
This is the simpler option, but it does lose you the neatness of having a 'common assembly version number'.
Pros and Cons
Option 1
You have a single version number for each version of the common code.
It's more prone to issues due to the additional complexity.
You need to maintain a known location outside of the working directory so that it is not deleted/cleaned by the build process.
Option 2
Simpler solution.
The common code's version is lost in the version of the dependent assembly.
If I had to suggest one, I would opt for option 2, purely because its simplicity reduces the chance of other issues.
The working directory is the place where Cruise Control puts the source code of your project, and it is also where the build process happens. If you point two projects to the same working directory, you can end up with any kind of conflict you can imagine. Source files of project A and project B can get mixed up, the build process might break because of the unknown state of the build folder, and so on.
It's quite natural to separate unrelated things, and in this case it's simply a matter of common sense. Besides, I can hardly imagine a situation where you have to put two projects into the same working directory.

Managing internal 3rd Party Dependencies

We have a lot of different solutions/projects which are managed by different teams. Our solution needs to reference several projects that another team owns. We don't want to add these dependencies as project references because we do not intend to modify that code, we just want to use it. Also, we already have quite a few projects in our solution and don't want to add a bunch more, since that will slow down Visual Studio. So we are building these projects in a separate solution and adding them as file references to our solution.
My question is, how do people manage these types of dependencies? Should I just have some automated process that looks for changes to those projects, builds them, and checks the DLLs into our source control, after which we treat them like other third-party dependencies? Is there a recommended way of doing this?
One solution, although it may not necessarily be what you are looking for, is to have each dependent sub-system perform a release. This release could be in the form of an MSI install, or just a network share of assemblies. When a significant change is made, that team could let you know, and you could run the install or a script to copy the files.
Once you get the release, you could put the assemblies into the GAC; that way you would not have to worry about copying them to your project bin folders.
Another solution, assuming you are using a build server or continuous integration of some kind, is to have a post-build step or process stage the files. Then, at any given moment, the developers on the other teams could grab the new files, or have a script or bat file pull them down locally.
EDIT - ANOTHER SOLUTION
It might be best to ask why you have these dependencies. Do you really need them locally when building your part of the application? Could you mock out the dependencies in your solution, allowing you to code, build, and run unit tests? The actual application would then wire these up in your DEV/Test/Prod environments. Keeping your solution decoupled and dependency-free may be a better solution for the individual team. Leave the integration and coupling for when the application runs in a real setting.
(Not a complete answer, but still:)
Any delivery is better stored in a file/binary repository, as opposed to a VCS, which is meant to manage source history.
We prefer managing those deliveries in a repository like Nexus, and we use Maven to retrieve the right dependencies.
Even if those tools can be more Java-oriented, Nexus can store anything, and Maven is only there to read the pom.xml of each artifact and compute the right dependencies.
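As a rough idea of what that looks like on the consumer side, pulling a delivery then comes down to an ordinary dependency declaration in the consumer's pom.xml (the coordinates below are invented):

<!-- resolved from the internal Nexus repository configured in settings.xml -->
<dependency>
  <groupId>com.mycompany.framework</groupId>
  <artifactId>myframework</artifactId>
  <version>1.4.2</version>
</dependency>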

How to work on a Cocoa app and plugins in parallel?

I have a relatively simple goal: I want to create a Cocoa application which doesn't have much functionality itself, but is extendable through plugins. In addition I want to work on a few plugins to supply users with real functionality (and working examples).
As I am planning to make the application and each plugin separate open-source projects (and Git repositories), I'm now searching for the best way to organize my files and the Xcode projects. I'm not very experienced with Xcode and right now I don't see a simple way to get it working without copying files after building.
This is the simple monolithic setup I used for development up until now:
There's only one Xcode project with multiple products:
The main application
A framework for plugin development
Several plugin bundles
What I'm searching for is a comfortable way to split these into several Xcode projects (one for the application and framework) and one for each plugin. As my application is still in an early stage of development, I'm still changing lots of things in both the application and the plugins. So what I mean by "comfortable" is, that I don't want to copy files manually or similar inconvenience.
What I need is for the plugin projects to know where they can find the current development framework, and for the application to know where it can find the development plugins. The best would be something like an inter-project dependency, but I couldn't find a way to set up something like that in Xcode.
One possible solution I have in mind is to copy both (the plugins and the framework) in a "Copy Files Build Phase" to a known location, e.g. /tmp/development, so production and development files aren't mixed up.
I think that my solution would be enough, but I'm curious if there's a better way to achieve what I want. So any suggestions are welcome.
First, don't use a static "known location" like you mention. I've worked on this kind of project; it's a royal pain. As soon as you get to the point of needing a couple of different copies of the project around (for fixing bugs in parallel, for testing a "clean" build versus your latest changes, for working on multiple branches), the builds start trashing each other and you find yourself having to do completely clean builds much more often than you'd want.
You can create inter-project dependencies by adding the dependent project (Add File), right-clicking the Target and choosing "Get Info," and then adding a Direct Dependency on the General pane.
In terms of structure, you can either put the main app and framework together, or put them in separate projects. In either case, I recommend a directory tree like:
/MyProject
/Framework
/Application
/Plugins
/Plugin1
/Plugin2
Projects should then refer to each other by relative paths. This means you can easily work on multiple copies of the project in parallel.
You can also look at a top-level build script that changes into each directory and runs "xcodebuild". I dislike complex build scripts (we have one; it's called Xcode), but if all it does is call "xcodebuild" with parameters if needed, then a simple build script is useful.

What is the best practice for sharing a Visual Studio Project (assembly) among solutions

Suppose I have a project "MyFramework" that has some code, which is used across quite a few solutions. Each solution has its own source control management (SVN).
MyFramework is an internal product and doesn't have a formal release schedule, and same goes for the solutions.
I'd prefer not having to build and copy the DLLs to all 12 projects, i.e. new developers should be able to just do an svn checkout and get to work.
What is the best way to share MyFramework across all these solutions?
Since you mention SVN, you could use externals to "import" the framework project into the working copy of each solution that uses it. This would lead to a layout like this:
C:\Projects
MyFramework
MyFramework.csproj
<MyFramework files>
SolutionA
SolutionA.sln
ProjectA1
<ProjectA1 files>
MyFramework <-- this is a svn:externals definition to "import" MyFramework
MyFramework.csproj
<MyFramework files>
With this solution, you have the source code of MyFramework available in each solution that uses it. The advantage is that you can change the source code of MyFramework from within each of these solutions (without having to switch to a different project).
BUT: at the same time this is also a huge disadvantage, since it makes it very easy to break MyFramework for some solutions when modifying it for another.
For this reason, I have recently dropped that approach and am now treating our framework projects as a completely separate solution/product (with their own release-schedule). All other solutions then include a specific version of the binaries of the framework projects.
This ensures that a change made to the framework libraries does not break any solution that is reusing a library. For each solution, I can now decide when I want to update to a newer version of the framework libraries.
That sounds like a disaster... how do you cope with developers undoing/breaking the work of others...
If I were you, I'd put MyFramework in a completely separate solution. When a developer wants to develop one of the 12 projects, he opens that project's solution in one IDE and opens MyFramework in a separate IDE.
If you strong-name your MyFramework assembly and GAC it, and reference it in your other projects, then the "copying DLLs" won't be an issue.
You just build MyFramework (and a post-build event can run GacUtil to put it in the assembly cache) and then build your other project.
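That post-build event can be as small as the following sketch (gacutil normally has to be called with its full Windows SDK path, which varies per machine, so it is left out here):

<PropertyGroup>
  <!-- gacutil /i installs the freshly built, strong-named assembly into the GAC -->
  <PostBuildEvent>gacutil.exe /i "$(TargetPath)"</PostBuildEvent>
</PropertyGroup>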
The "best way" will depend on your environment. I worked in a TFS-based, continuous integration environment, where the nightly build deployed the binaries to a share. All the dependent projects referred to the share. When this got slow, I built some tools to permit developers to have a local copy of the shared binaries, without changing the project files.
Does work in any of the 12 solutions regularly require changes to the "framework" code?
If so, your framework is probably new and still being created, so I'd just include the framework project in all of the solutions. After all, if work dictates that you have to change the framework code, it should be easy to do so.
Since changes in the framework made from one solution will affect all the other solutions, breaks will happen, and you will have to deal with them.
Once you rarely have to change the framework as you work in the solutions (this should be your goal) then I'd include a reference to a framework dll instead, and update the dll in each solution only as needed.
svn:externals will take care of this nicely if you follow a few rules.
First, it's safer if you use relative URIs (starting with a ^ character) for svn:externals definitions and put the projects in the same repository if possible. This way the definitions will remain valid even if the subversion server is moved to a new URL.
Second, make sure you follow this hint from the SVN book: use peg revisions in your svn:externals definitions to avoid random breakage and unstable tags:
You should seriously consider using explicit revision numbers in all of your externals definitions. Doing so means that you get to decide when to pull down a different snapshot of external information, and exactly which snapshot to pull. Besides avoiding the surprise of getting changes to third-party repositories that you might not have any control over, using explicit revision numbers also means that as you backdate your working copy to a previous revision, your externals definitions will also revert to the way they looked in that previous revision ...
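Putting both hints together, a definition set on a solution's root folder could look like this (the repository layout and revision number are made up); the @2150 is the peg revision that pins the snapshot:

^/frameworks/MyFramework/trunk@2150 MyFramework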
I agree with another poster - that sounds like trouble. But if you don't want to do it the "right way", I can think of two other ways to do it. We used something similar to number 1 below (for a native C++ app).
A script, batch file, or other process that is run to do a get and a build of the dependency (just once). This is built/executed only if there are no changes in the repo. You will need to know what tag/branch/version to get. You can use a bat file as a pre-build step in your project files.
Keep the binaries in the repo (not a good idea). Even in this case the dependent projects have to do a get and have to know which version to get.
Eventually what we tried to do for our project(s) was mimic how we use and refer to 3rd party libraries.
What you can do is create a release package for the dependency that sets up a path environment variable to itself. I would allow multiple versions of it to exist on the machine, and then have the dependent projects link/reference specific versions.
Something like
$(PROJ_A_ROOT) = c:\mystuff\libraryA
$(PROJ_A_VER_X) = %PROJ_A_ROOT%\VER_X
and then reference the version you want in the dependent solutions either by specific name, or using the version env var.
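In a Visual Studio project, that can translate into a reference whose HintPath goes through the environment variable, something like this sketch (using the hypothetical names above):

<ItemGroup>
  <!-- MSBuild exposes environment variables as properties, so this reference
       follows whatever version PROJ_A_VER_X points to on the local machine -->
  <Reference Include="LibraryA">
    <HintPath>$(PROJ_A_VER_X)\LibraryA.dll</HintPath>
  </Reference>
</ItemGroup>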
Not pretty, but it works.
A scalable solution is to apply svn:externals at the solution directory level so that your imported projects appear parallel to your other projects. The reasons for this are given below.
Using a separate sub-directory for "imported" projects, e.g. externals, via svn:externals seems like a good idea until you have non-trivial dependencies between projects. For example, suppose project A depends on project B, and project B on project C. If you then have a solution S with project A, you'll end up with the following directory structure:
# BAD SOLUTION #
S
+---S.sln
+---A
| \---A.csproj
\---externals
+---B <--- A's dependency
| \---B.csproj
\---externals
\---C <--- B's dependency
\---C.csproj
Using this technique, you may even end up having multiple copies of a single project in your tree. This is clearly not what you want.
Furthermore, if your projects use NuGet dependencies, those packages normally get restored into the top-level packages directory. This means that NuGet references of projects within the externals sub-directory will be broken.
Also, if you use Git in addition to SVN, a recommended way of tracking changes is to have a separate Git repository for each project, and then a separate Git repository for the solution that uses git submodules for the projects within it. If a Git submodule is not an immediate sub-directory of the parent module, then the git submodule command will make a clone that is an immediate sub-directory.
Another benefit of having all projects on the same layer is that you can then create a "super-solution", which contains projects from all of your solutions (tracked via Git or svn-external), which in turn allows you to check with a single Solution-rebuild that any change you made to a single project is consistent with all other projects.
# GOOD SOLUTION #
S
+---S.sln
+---A
| \---A.csproj
+---B <--- A's dependency
| \---B.csproj
\---C <--- B's dependency
\---C.csproj

Recommended number of projects in Visual Studio Solution

We are starting to develop a new application that will include something like 30-50 projects, developed by about a dozen developers in C# with MS Visual Studio.
I am working on componentizing the application modules in order to support the architecture and enable parallel work.
We have an argument: how many solutions should we have?
Some claim that we should have 1-2 solutions with 15-30 projects each. Some claim that we need a solution per component that means about 10-12 solutions with about 3-6 projects each.
I would be happy to hear pros/cons and experience with each direction (or another direction altogether).
I've worked on products on both extremes: one with ~100 projects in a single solution, and one with >20 solutions, of 4-5 projects each (Test, Business Layer, API, etc).
Each approach has its advantages and disadvantages.
A single solution is very useful when making changes - it's easier to work with dependencies, and it allows refactoring tools to work well. It does, however, result in longer load times and longer build times.
Multiple solutions can help enforce separation of concerns, keep build/load times low, and may be well suited to having multiple teams with a narrower focus and well-defined service boundaries. They do, however, have a large drawback when it comes to refactoring, since many references are file references, not project references.
Maybe there's room for a hybrid approach: use smaller solutions for the most part, but create a single solution including all projects for times when larger-scale changes are required. Of course, you then have to maintain two separate solutions...
Finally, the structure of your projects and dependencies will have some influence on how you organize your solutions.
And keep in mind, in the long run RAM is cheaper than programmer time...
Solutions are really there for dependency management, so you can have a project in more than one solution if more than one thing depends on it. The number of solutions should really depend on your dependency graph.
Edit: This means you shouldn't be sticking projects that are not dependent on each other into the same solution, as it creates the illusion of dependency which means someone could create a real dependency when two projects should really be independent.
I've worked on a solution with close to 200 projects. It's not a big deal if you have enough RAM :).
One important thing to remember is that if projects depend on each other (be it with Dependencies or References), they should probably be in the same solution. Otherwise you get strange behavior when different projects have different dependencies in different solutions.
You want to maintain project references. If you can safely break up your solution with two or more discrete sets of projects that depend on each other, then do it. If you can't, then they all belong together.
We have a solution that has approximately 130 projects. About 3 years ago, when we were using VS.NET 2003, it was a terrible problem. Sometimes the solution and VSS would crash.
But now with VS.NET 2005 it's OK. Only loading takes a lot of time. Some of my coworkers unload the projects they don't use; that's another option to speed things up.
Changing the build type to Release is another problem. But we have MSBuild scripts now; we no longer use the Release build from within VS.NET.
I think you should not exaggerate your number of projects/solutions. Componentize what can and will be reused, otherwise don't componentize!
It will only make things less transparent and increase build times. Partitioning can also be done within a project using folders or a logical class structure.
When deciding how many projects vs. solutions you need, you have to consider some questions:
the logical layers of your application;
dependencies between projects;
how projects are built;
who works with which projects.
Currently we have 1 solution with 70 projects.
For our continuous integration we created 5 MSBuild projects, so CI does not build our development solution.
Previously, we had a separate solution for the presentation layer (web and related projects) in a separate git repository. This solution was used by outsourced and freelance web developers.
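For a sense of scale, one of those CI MSBuild projects does not need to be much more than the following sketch (project names and paths are invented):

<Project DefaultTargets="Build" xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
  <ItemGroup>
    <!-- only the projects the CI server cares about, not the whole development solution -->
    <ProjectsToBuild Include="..\src\Core\Core.csproj" />
    <ProjectsToBuild Include="..\src\Services\Services.csproj" />
    <ProjectsToBuild Include="..\src\Web\Web.csproj" />
  </ItemGroup>
  <Target Name="Build">
    <MSBuild Projects="@(ProjectsToBuild)" Targets="Build" Properties="Configuration=Release" />
  </Target>
</Project>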
I am working with a solution that has 405 projects currently. On a really fast machine this is doable, but only with current Visual Studio 2017 or 2012. Other versions crash frequently.
I don't think the actual number of solutions matters. Much more important is that you break the thing up along functional lines. As a silly, contrived example if you have a clutch of libraries that handles interop with foreign web services, that would be a solution; an EXE with the DLLs it needs to work would be another.
The only thing about having so many projects in one solution is that the references and build order start to get confusing.
As a general rule I'd gravitate toward decreasing the number of projects (make the project a little more generic) and have devs share source control on those projects, but separate the functionality within those projects by sub-namespaces.
You should have as many as you need. There is no hard limit or best practice. Practice and read about what projects and solutions are and then make the proper engineering decisions about what you need from there.
It has been said in other answers, but it is probably worth saying again.
Project dependencies are good in that they can rebuild dependent projects if the dependency has a change.
If you use an assembly file dependency, there is no way for VS or MSBuild to know that a chain of projects needs to be built. What will happen is that on the first build, only the project that has changed will be rebuilt. If you have put the dependency on the build output, then at least on the second build the project dependent on it will build. But then you need an unknown number of builds to get to the end of the chain.
With project dependencies it will sort it all out for you.
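The difference is visible directly in the project file; only the first form gives MSBuild the build-order information (the paths are illustrative):

<!-- Either a project reference: MSBuild knows Common must be built (and rebuilt) first -->
<ItemGroup>
  <ProjectReference Include="..\Common\Common.csproj" />
</ItemGroup>

<!-- ...or a plain file reference: the build just picks up whatever DLL is currently
     there, with no idea that Common.csproj might need rebuilding first -->
<ItemGroup>
  <Reference Include="Common">
    <HintPath>..\Common\bin\Debug\Common.dll</HintPath>
  </Reference>
</ItemGroup>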
So the answer is to have as many (or as few) as needed to ensure project dependencies are used.
If your team is broken down into functional areas that have a more formal release mechanism rather than a plain check-in of source code, then splitting along those lines would be the way to go; otherwise the dependency map is your friend.
