What is the best practice for sharing a Visual Studio Project (assembly) among solutions?

Suppose I have a project "MyFramework" that has some code, which is used across quite a few solutions. Each solution has its own source control management (SVN).
MyFramework is an internal product and doesn't have a formal release schedule, and same goes for the solutions.
I'd prefer not having to build and copy the DLLs to all 12 projects; i.e., new developers should be able to just do an svn checkout and get to work.
What is the best way to share MyFramework across all these solutions?

Since you mention SVN, you could use externals to "import" the framework project into the working copy of each solution that uses it. This would lead to a layout like this:
C:\Projects
    MyFramework
        MyFramework.csproj
        <MyFramework files>
    SolutionA
        SolutionA.sln
        ProjectA1
            <ProjectA1 files>
        MyFramework    <-- an svn:externals definition that "imports" MyFramework
            MyFramework.csproj
            <MyFramework files>
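For illustration, that layout could be produced with an svn:externals definition like the following (the repository URL is hypothetical):

svn propset svn:externals "https://svn.example.com/repos/MyFramework/trunk MyFramework" SolutionA
svn update SolutionA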
With this solution, you have the source code of MyFramework available in each solution that uses it. The advantage is that you can change the source code of MyFramework from within each of these solutions (without having to switch to a different project).
BUT: at the same time this is also a huge disadvantage, since it makes it very easy to break MyFramework for some solutions when modifying it for another.
For this reason, I have recently dropped that approach and am now treating our framework projects as a completely separate solution/product (with their own release-schedule). All other solutions then include a specific version of the binaries of the framework projects.
This ensures that a change made to the framework libraries does not break any solution that is reusing a library. For each solution, I can now decide when I want to update to a newer version of the framework libraries.

That sounds like a disaster... how do you cope with developers undoing/breaking the work of others?
If I were you, I'd put MyFramework in a completely separate solution. When a developer wants to work on one of the 12 projects, he opens that project's solution in one IDE and opens MyFramework in a separate IDE.
If you strong-name your MyFramework assembly, GAC it, and reference it in your other projects, then the "copying DLLs" won't be an issue.
You just build MyFramework (a post-build event can run gacutil to put it in the assembly cache) and then build your other project.
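As a minimal sketch (the gacutil path below is an assumption; it varies with the installed Windows SDK), the post-build event in MyFramework's .csproj could look like this:

<!-- In MyFramework.csproj; the SDK path is machine-specific and assumed here -->
<PropertyGroup>
  <PostBuildEvent>"C:\Program Files (x86)\Microsoft SDKs\Windows\v7.0A\Bin\gacutil.exe" /i "$(TargetPath)"</PostBuildEvent>
</PropertyGroup>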

The "best way" will depend on your environment. I worked in a TFS-based, continuous integration environment, where the nightly build deployed the binaries to a share. All the dependent projects referred to the share. When this got slow, I built some tools to permit developers to have a local copy of the shared binaries, without changing the project files.

Does work in any of the 12 solutions regularly require changes to the "framework" code?
If so, your framework is probably new and still taking shape, so I'd just include the framework project in all of the solutions. After all, if the work dictates that you have to change the framework code, it should be easy to do so.
Since changes in the framework made from one solution will affect all the other solutions, breaks will happen, and you will have to deal with them.
Once you rarely have to change the framework as you work in the solutions (this should be your goal) then I'd include a reference to a framework dll instead, and update the dll in each solution only as needed.

svn:externals will take care of this nicely if you follow a few rules.
First, it's safer to use relative URIs (starting with a ^ character) in svn:externals definitions and to put the projects in the same repository if possible. That way the definitions remain valid even if the Subversion server is moved to a new URL.
Second, follow this hint from the SVN book and use peg revisions (PEG-REVs) in your svn:externals definitions to avoid random breakage and unstable tags:
You should seriously consider using explicit revision numbers in all of your externals definitions. Doing so means that you get to decide when to pull down a different snapshot of external information, and exactly which snapshot to pull. Besides avoiding the surprise of getting changes to third-party repositories that you might not have any control over, using explicit revision numbers also means that as you backdate your working copy to a previous revision, your externals definitions will also revert to the way they looked in that previous revision ...
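Putting both rules together, a pinned externals definition might look like this (the repository path and revision number are illustrative):

# Relative to the repository root (^), pinned to revision 1234 via a peg revision
svn propset svn:externals "^/frameworks/MyFramework/trunk@1234 MyFramework" .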

I agree with another poster - that sounds like trouble. But if you can't or don't want to do it the "right way", I can think of two other ways to do it. We used something similar to number 1 below (for a native C++ app).
1. A script, batch file, or other process that is run to do a get and a build of the dependency (just once); the get/build can be skipped when there are no changes in the repo. You will need to know which tag/branch/version to get. You can use a bat file as a prebuild step in your project files; a sketch follows this list.
2. Keep the binaries in the repo (not a good idea). Even in this case the dependent projects have to do a get and have to know which version to get.
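A minimal sketch of such a prebuild batch file, assuming an SVN URL and tag name that are purely illustrative:

@echo off
rem Hypothetical prebuild step: fetch and build the dependency at a known tag.
if not exist ..\MyFramework (
    svn checkout https://svn.example.com/repos/MyFramework/tags/1.2 ..\MyFramework
)
msbuild ..\MyFramework\MyFramework.csproj /p:Configuration=Release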
Eventually what we tried to do for our project(s) was mimic how we use and refer to 3rd party libraries.
What you can do is create a release package for the dependency that sets up a path env variable pointing to itself. I would allow multiple versions of it to exist on the machine, with the dependent projects linking/referencing specific versions.
Something like
$(PROJ_A_ROOT) = c:\mystuff\libraryA
$(PROJ_A_VER_X) = %PROJ_A_ROOT%\VER_X
and then reference the version you want in the dependent solutions, either by specific name or using the version env var.
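For example (names are hypothetical; MSBuild resolves environment variables as properties), a dependent project's .csproj could reference a specific version like this:

<Reference Include="LibraryA">
  <HintPath>$(PROJ_A_VER_X)\LibraryA.dll</HintPath>
</Reference>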
Not pretty, but it works.

A scalable solution is to apply svn:externals at the solution directory so that your imported projects appear parallel to your other projects. Reasons for this are given below.
Using a separate sub-directory for "imported" projects, e.g. externals, via svn:externals seems like a good idea until you have non-trivial dependencies between projects. For example, suppose project A depends on project B, and project B on project C. If you then have a solution S with project A, you'll end up with the following directory structure:
# BAD SOLUTION #
S
+---S.sln
+---A
|   \---A.csproj
\---externals
    +---B                 <--- A's dependency
    |   \---B.csproj
    \---externals
        \---C             <--- B's dependency
            \---C.csproj
Using this technique, you may even end up having multiple copies of a single project in your tree. This is clearly not what you want.
Furthermore, if your projects use NuGet dependencies, they normally get restored into a top-level packages directory. This means that the NuGet references of projects within the externals sub-directory will be broken.
Also, if you use Git in addition to SVN, a recommended way of tracking changes is to have a separate Git repository for each project, and then a separate Git repository for the solution that pulls in the projects via git submodule. If a Git submodule is not an immediate sub-directory of the parent module, the git submodule command will make a clone that is an immediate sub-directory.
Another benefit of having all projects on the same layer is that you can then create a "super-solution", which contains projects from all of your solutions (tracked via Git or svn-external), which in turn allows you to check with a single Solution-rebuild that any change you made to a single project is consistent with all other projects.
# GOOD SOLUTION #
S
+---S.sln
+---A
|   \---A.csproj
+---B                     <--- A's dependency
|   \---B.csproj
\---C                     <--- B's dependency
    \---C.csproj

Related

Build dependencies and local builds with continuous integration

Our company currently uses TFS for source control and build server. Most of our projects are written in C/C++, but we also have some .NET projects and wouldn't want to be limited if we need to use other languages in the future.
We'd like to use Git for our source control and we're trying to understand what would be the best choice for a build server. We have started looking into TeamCity, but there are some issues we're having trouble with which will probably be relevant regardless of our choice of build server:
Build dependencies - We'd like to be able to control the build dependencies for each <project, branch>. For example, have <MyProj, feature_branch> depend on <InfraProj1, feature_branch> and <InfraProj2, master>.
From what we’ve seen, to do that we might need to use Gradle or something similar to build our projects instead of plain MSBuild. Is this correct? Are there simpler ways of achieving this?
Local builds - Obviously we'd like to be able to build projects locally as well. This becomes somewhat of a problem when project dependencies are introduced, as we need a way to reference these resources or copy them locally for the build to succeed. How is this usually solved?
I'd appreciate any input, but a sample setup which covers these issues will also be a great help.
IMHO both issues you mention really fall into the configuration management category and thus, as you say, are unrelated to the choice of build server.
A workspace for a project build (doesn't matter if centralized or local) should really contain all necessary resources for the build.
How can you achieve that? Have a project "metadata" git repo with a "content" file listing all your project components and their dependencies (each with its own git/other repo) and their exact versions - effectively tying them together coherently. (You may find it useful to store other metadata in this file down the road as well, such as component-specific SCM info if you're using a mix of SCMs across the workspace.)
A workspace pull wrapper script would first pull this metadata git repo, parse the content file, and then pull all the other project components and their dependencies according to the content file info. Any build in such a workspace would have all the parts it needs.
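A minimal sketch of such a wrapper, assuming a simple whitespace-separated content file format (component name, repo URL, pinned version) with purely illustrative entries:

#!/usr/bin/env python3
# Hypothetical workspace pull wrapper. Assumed "content" file format, one
# component per line:  <name> <git-url> <tag-or-commit>
# e.g.:  infra1 https://git.example.com/InfraProj1.git v2.3.1
import os
import subprocess

with open("content") as f:
    for line in f:
        if not line.strip() or line.startswith("#"):
            continue  # skip blanks and comments
        name, url, version = line.split()
        if not os.path.isdir(name):
            subprocess.run(["git", "clone", url, name], check=True)
        subprocess.run(["git", "-C", name, "fetch", "--tags", "origin"], check=True)
        subprocess.run(["git", "-C", name, "checkout", version], check=True)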
When the time comes to modify either the code in a project component or the version of one of its dependencies, you'll also need to update the content file in the metadata git repo to reflect the change and commit it - this is how your project makes progress coherently, as a whole.
Of course, actually managing dependencies is another matter. Tons of opinions out there, some even conflicting.

Cruise Control .NET: two projects, same working directory

In the CCNet wiki, in the project block under workingDirectory, I read: "Make sure this folder is unique per project to prevent problems with the build." I want two projects that share the same working directory... what are the "problems with the build" that can occur, and how can I overcome them?
Edit:
My situation: I have two applications in the same trunk that share some common code. If I commit to one of the applications I don't want the other application to build and increase its version number, but if I change the common code I want both of them to trigger a build. My source control is SVN and I use a Filtered block to include only the files I want to trigger a build.
Option 1
Have a single project that builds just the common code. This should emit its built assembly to a known location outside its working directory.
Have 2 other projects that build the other parts of the solution. Each listens only to changes on its particular source control paths. Each project can incorporate/reference the built assembly from the known location.
The 2 other projects can be forced from the common project using a forceBuildPublisher, as sketched below.
These projects should be in the same queue to prevent the common project rewriting the built common assembly while it is being referenced by the other 2 projects.
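A sketch of the common project's publisher section in ccnet.config (the project names AppA and AppB are hypothetical):

<!-- In the common project's <project> block in ccnet.config -->
<publishers>
  <forcebuild>
    <project>AppA</project>
  </forcebuild>
  <forcebuild>
    <project>AppB</project>
  </forcebuild>
</publishers>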
Option 2
Have 2 individual projects that each build the common source code and the specific code together - say, by building a solution file which contains both sets of projects.
This is the simpler option, but it does lose you the neatness of having a "common assembly version number".
Pros and Cons
Option 1
You have a single version number for each version of the common code.
It's more prone to issues due to the additional complexity.
You need to maintain a known location outside of the working directory so that it is not deleted/cleaned by the build process.
Option 2
Simpler solution.
The common code's version is lost in the version of the dependent assembly.
If I had to suggest one, I would opt for option 2, purely because its simplicity reduces the chance of other issues.
The working directory is the place where Cruise Control puts the source code of your project, and it is also where the build process happens. If you point two projects to the same working directory, you can end up with every kind of conflict you can imagine: source files of project A and project B can mix, the build process might break because of the unknown state of the build folder, etc.
It's quite natural to separate unrelated things, and in this case it's a matter of common sense. Besides, I can hardly imagine a situation where you have to put 2 projects into the same working directory.

What is the recommended way to setup projects like this?

We are working on a large project. The project has multiple external sites and multiple internal sites all stored in Subversion.
The external sites allow a customer to make requests of various things we provide, pay utility bills, and more. We decided to break many of these functions apart because most work completely differently from the others. So this is one Visual Studio solution with the WebUI and the database layer broken into two projects each. For instance, utility billing has a Utility.WebUI project and a Utility.Domain project. All DB/business logic is kept in the domain project.
The internal sites bridge the gap between the back-office system (IBM i) and the web database, and will also replace/enhance some of our older RPG programs. In theory they should use the exact same database logic that the external sites use, because they access the same database, right? What is the best way to reference these projects from a different solution? Should I just add a reference to the dll, or should I import that project from the external application solution into the internal application solution?
It comes down to the fact that we have two developers working on this project. I do most of the back-end coding, while the other developer does most of the GUI coding. So we need to make sure that this project works on multiple workstations.
Does this make sense? Any thoughts?
Use the svn:externals property to reference the shared project into your project(s).
You have to choose between 1) referencing the directory containing the shared project's source code (i.e. where the csproj and cs files are located) or 2) referencing the directory containing the shared project's build output (assembly / dll).
I normally prefer method 1) since it makes modifications to the shared project's source code easier (you can make changes without having to open the shared project's solution in a second instance of Visual Studio). If you don't intend to make changes to the shared project often then method 2) might be better. It reduces compile time and prevents accidental modifications of the shared project's source code. Both methods are fine - matter of taste.
It is recommended for both methods that you version your shared project. i.e. create tags with version numbers and reference the tags, not the trunk. When a new version of the shared project comes out you can update the svn:externals property of your other project(s) with the new version number, run "svn update" to download the new version of the shared project, and recompile. This works especially well if you have a build server for the shared project that does the tagging for you automatically.
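For example (the repository path and tag names are illustrative), moving a consuming solution to a newer tagged version is just a property change plus an update:

svn propset svn:externals "^/shared/SharedProject/tags/1.3 SharedProject" .
svn update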
I think you can use a sort of "commons" solution that contains the common projects, and then refer to these projects in your main solutions using an SVN external pointing to the project folder in the SVN trunk.
The commons SVN repository must follow the suggested repository structure (trunk, branches, tags) so that you always have stable commons projects.
In this scenario you can consider using a dependency management tool, such as NPanday or NDepend, where you declare which version of which assemblies every project depends on; using these tools you can have a local repository (such as Artifactory or Nexus) of binary assemblies to refer to, or choose to use SVN externals to refer directly to source code.

How to deal with Git Submodules in Visual Studio solutions with different layout?

We develop with Visual Studio 2010 (in C#) and migrated a while ago from SVN to Git. Now we are trying to split up our repository (which is quite big - ~30,000 files) into many Git repositories - one for each solution.
The solutions share some projects, mostly libraries we develop in-house and like to add to from all the solutions.
The new repositories have a flat layout: one subdirectory for each project (shared projects are submodules).
In the big old repo, the projects are in a tree structure.
The problem occurs with external references in the submodules. In the old tree, the path to a referenced project may be "..\..\..\libs\someproject", while in the new layout the correct path would be "..\someproject".
We already had some edit wars concerning this and are not keen on more.
Half-baked Solutions I could think of:
use "Reference Paths" in ...csproj.user and exclude this file from version control (has to be redone for each developer and after each reopsitory cleanup)
use branches for each situation and try to teach everyone where "real" commits should go and where "environment-change" commits should go (submodules are already not the simplest concept...)
embed binaries instead of the submodules (but what about developing changes to the submodules? what about different log4net versions?)
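For the first option, this is roughly what the ignored .user file would carry (the path is hypothetical; Visual Studio writes this when you fill in the project's Reference Paths tab):

<!-- SomeProject.csproj.user - kept out of version control -->
<Project xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
  <PropertyGroup>
    <ReferencePath>C:\src\libs\someproject\bin\Debug\</ReferencePath>
  </PropertyGroup>
</Project>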
Does anyone know of a sane solution?
Since you are asking for a sane solution, I can only advise you to look into setting up your own NuGet service (look at http://www.MyGet.org for inspiration)
http://nuget.codeplex.com/
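Once such a feed exists, consuming solutions only need a NuGet.config pointing at it; a minimal sketch (the feed URL is hypothetical):

<configuration>
  <packageSources>
    <add key="internal" value="https://nuget.example.com/api/v2/" />
  </packageSources>
</configuration>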
If you go down the route of package management, consider OpenWrap. However, embedding the package management artefacts in source code is a bad idea. You can use such tools to update what is actually stored in submodules, but don't rely on them at build time; expect the binaries to be there from the point of view of your build scripts.
So if I understand you correctly, the problem is with Visual Studio and not with Git? If that's the case, use the old tree structure that worked with Visual Studio. Make your submodule structure a tree too: the top of the tree would be one super-repo whose submodules (the branches) would have submodules of their own, until you get down to the leaves of your tree. It would be a pain to set up at first, but it should just work.
Use one submodule to house all "common libraries", just one level deep. But you should turn the common libraries into services with well-defined contracts; that way you can incrementally roll out new versions with no downtime, and each solution only needs a submodule holding the contracts. These could be interfaces or messages.
I have a similar problem using VS 2013.
I want to use git-svn instead of SVN directly. SVN has a gigantic set of directories, and I could not create a single git repository that would contain all of our trunk folder; git-svn always exited with an error and the repository was corrupted. I worked around the problem as follows:
1. Using git-svn, I cloned the subset of folders off SVN/trunk that I needed, creating one git repository per folder.
2. I created a local parent git repository that contains all my git-svn-cloned folders.
3. Each git repository was added as a submodule to the parent git repository.
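A sketch of those steps, with purely illustrative URLs and local paths:

git svn clone https://svn.example.com/repos/trunk/LibA C:/src/LibA
git svn clone https://svn.example.com/repos/trunk/AppMain C:/src/AppMain

mkdir C:/src/parent && cd C:/src/parent && git init
git submodule add C:/src/LibA LibA
git submodule add C:/src/AppMain AppMain
git commit -m "Add git-svn-cloned folders as submodules"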
The problem with Visual Studio is that it does not recognize the multiple projects outside the main project where I opened the solution. That solution is in a folder containing the only files Visual Studio recognizes as being under git source control.
I tried setting the git preferences to use the upper-level parent directory as the location of the git repository, without noticing any difference.

Managing internal 3rd Party Dependencies

We have a lot of different solutions/projects which are managed by different teams. Our solution needs to reference several projects that another team owns. We don't want to add these dependencies as project references because we do not intend to modify that code, we just want to use it. Also, we already have quite a few projects in our solution and don't want to add a bunch more, since that would slow down Visual Studio. So we are building these projects in a separate solution and adding them as file references to our solution.
My question is: how do people manage these types of dependencies? Should I just have some automated process that looks for changes to those projects, builds them, and checks the dlls into our source control, after which we treat them like other 3rd party dependencies? Is there a recommended way of doing this?
One solution, although it may not necessarily be what you are looking for, is to have each dependent sub-system perform a release. This release could be in the form of an MSI install, or just a network share of assemblies. When a significant change is made, that team could let you know, and you could run the install or a script to copy the files.
Once you get the release, you could put the assemblies into the GAC; that way you would not have to worry about copying them to your project bin folders.
Another solution, assuming you are using a build server or continuous integration of some kind, is to have a post-build step or process stage the files. Then at any given moment, the developers of the other teams could grab the new files, or have a script or bat file pull them down locally.
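A sketch of such a staging step as a post-build script (the share path and variable name are hypothetical):

rem Post-build step on the build server: mirror fresh binaries to a share
robocopy "%BUILD_OUTPUT%\Release" "\\buildserver\drops\TeamA\latest" *.dll /MIR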
EDIT - ANOTHER SOLUTION
It might be best to ask why you have these dependencies. Do you really need them locally when building your part of the application? Could you mock out the dependencies in your solution, allowing you to code, build, and run unit tests? The actual application would wire these up in your DEV/Test/Prod environments. Keeping your solution decoupled and dependency-free may be a better approach for the individual team. Leave the integration and coupling to when the application runs in a real setting.
(Not a complete answer, but still:)
Any delivery is better stored in a file/binary repository, as opposed to a VCS, which is meant for managing source history.
We prefer managing those deliveries in a repository like Nexus, and we use Maven to retrieve the right dependencies.
Even if those tools are more Java-oriented, Nexus can store anything, and Maven is only there to read the pom.xml of each artifact and compute the right dependencies.
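As an illustration (the coordinates are hypothetical), a consuming project's pom.xml would declare the assembly like any other artifact; Maven's type element can carry non-jar types such as dll:

<dependency>
  <groupId>com.example.shared</groupId>
  <artifactId>shared-lib</artifactId>
  <version>1.2.0</version>
  <type>dll</type>
</dependency>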
