HowTo: Reference external SLN files with TeamCity - teamcity

I'm new to TeamCity and we have a number of common projects under source control that are then referenced directly within relevant projects i.e.:
Common
  branches
  tags
  trunk
    CommonProject
      CommonProject.csproj
    Common.sln
ProjectX
  branches
  tags
  trunk
    ProjectX.sln
As a result, the reference to "CommonProject.csproj" in "ProjectX.sln" is something along the lines of ..\..\Common\trunk\CommonProject\CommonProject.csproj, which is fine within our development environments, but when it comes to TeamCity it falls over saying it can't find the path "..\..\Common\trunk\CommonProject\CommonProject.csproj".
What's the best way around this problem? I've tried adding CommonProject to TeamCity as a dependency but it still doesn't seem to want to play ball...
Thanks
Tim

We address this by using externals in Subversion, which allow you to pull in content from a different (part of the) repository.
Then, when we're building the solutions, we have those common projects grouped into the same folder as the project-specific solution - i.e. when we check stuff out we have:
Solution1
+---Project1
+---Project2
+---Project3
+---Common1
+---Common2
Then, separately:
Solution2
+---ProjectA
+---ProjectB
+---ProjectC
+---Common1
+---Common2
Because we have the externals and the directory/folder structure set up this way, you should, in theory, be able to check out (or export) a "solution" to an empty directory and have it build successfully from scratch (subject to all the necessary tools being installed), and therefore TeamCity (or whatever your continuous integration server is) should also be able to build it from scratch. In fact, even before we started using TeamCity I had this as policy, but the value is clearer once you start doing continuous integration.
The appropriate bit of the Subversion Red Book is here: Externals Definitions
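For reference, a minimal sketch of what such an externals definition could look like for the Common/ProjectX layout above (the ^/ repository-relative URL is an assumption about where Common lives in the repository), run against a working copy of ProjectX/trunk:

  svn propset svn:externals "^/Common/trunk/CommonProject CommonProject" .
  svn commit -m "Pull CommonProject in via svn:externals"

ProjectX.sln can then reference CommonProject\CommonProject.csproj as a sibling folder, and a clean checkout of ProjectX/trunk - by a developer or by TeamCity, provided the Subversion VCS root is configured to check out externals - contains everything the build needs.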

Related

TFS2015 vNext missing features

I have been working with TeamCity, Jenkins, and Bamboo for the last 8 years. For the latest 2 years I've been heavily involved in continuous integration factory setup and maintenance for my team, with very good results, which has given me a lot of habits around how to deal with builds, artifacts, and pipelines.
Now I'm at a new company, new team, new CI: TFS 2015, a first for me.
Just one month before I arrived, this team was still on TFS 2012 with XAML builds, so I took on the migration to vNext builds.
At first look, vNext builds give you the classic build definition, i.e. adding steps as individual tasks, instead of a monolithic XAML file.
But over time, as I tried to create more complex builds, like TeamCity build chains, I found that this is not possible; strike one...
Then I tried to deal with multiple branches: one continuous build for each branch (we are on TFVC), creating packages from each branch. I found I was duplicating my builds just to change repository paths and a few details, so I moved the builds to templates to try to reuse build definitions, introducing variables to generate paths (for repos and branches) and versions, expecting to change the build in only one place and have the change reflected in all builds derived from the templates... but that was not the case:
Variables are not accepted everywhere, e.g. in repository paths.
You can't change templates after creating them, only replace them, and builds created from a template are not affected when the template changes.
Strike two...
I'm wondering if maybe I'm doing things wrong with TFS; maybe it's a different kind of system and I can't do things the way I would in other CI servers.
Any advice on how to approach TFS to get a good, dynamic, and reusable set of builds?
There isn't a feature where changes to a build definition or template affect other existing build definitions.
If the build steps are the same for each branch, you only need one build definition with a trigger filter for each branch (Triggers > Continuous integration (CI)). After that, each CI build uses the corresponding source; for example, a change on the develop branch automatically triggers a build with the develop branch source.
On the other hand, you can change the branch and source version when queuing a build manually or through the REST API, as sketched below.
If the build steps are different for each branch, you need separate definitions and have to modify them independently.
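For the manual / REST route, a rough sketch of queuing an existing definition against a different branch with the TFS 2015 build REST API (the collection URL, project name, definition id, and branch path are placeholders, not taken from the question):

  POST http://tfsserver:8080/tfs/DefaultCollection/MyProject/_apis/build/builds?api-version=2.0
  Content-Type: application/json

  {
    "definition": { "id": 42 },
    "sourceBranch": "$/MyProject/branches/develop"
  }

A "sourceVersion" (changeset or commit) can be supplied in the same body, so one definition can be queued for any branch or version without being duplicated.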

Build dependencies and local builds with continuous integration

Our company currently uses TFS for source control and build server. Most of our projects are written in C/C++, but we also have some .NET projects and wouldn't want to be limited if we need to use other languages in the future.
We'd like to use Git for our source control and we're trying to understand what would be the best choice for a build server. We have started looking into TeamCity, but there are some issues we're having trouble with which will probably be relevant regardless of our choice of build server:
Build dependencies - We'd like to be able to control the build dependencies for each <project, branch>. For example, have <MyProj, feature_branch> depend on <InfraProj1, feature_branch> and <InfraProj2, master>.
From what we’ve seen, to do that we might need to use Gradle or something similar to build our projects instead of plain MSBuild. Is this correct? Are there simpler ways of achieving this?
Local builds - Obviously we'd like to be able to build projects locally as well. This becomes somewhat of a problem when project dependencies are introduced, as we need a way to reference these resources or copy them locally for the build to succeed. How is this usually solved?
I'd appreciate any input, but a sample setup which covers these issues will also be a great help.
IMHO both issues you mention really fall into the configuration management category and are thus, as you say, unrelated to the choice of build server.
A workspace for a project build (doesn't matter if centralized or local) should really contain all necessary resources for the build.
How can you achieve that? Have a project "metadata" git repo with a "content" file containing all your project components and their dependencies (each with its own git/other repo) and their exact versions - effectively tying them together coherently (you may find it useful to store other metadata in this component down the road as well, like component specific SCM info if using a mix of SCMs across the workspace).
A workspace pull wrapper script would first pull this metadata git repo, parse the content file, and then pull all the other project components and their dependencies according to the content file. Any build in such a workspace would have all the parts it needs.
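A minimal sketch of what the content file and the pull wrapper could look like (component names, URLs, and the file format are all made up for illustration):

  content file in the metadata repo (name, repo, version per line):
    MyProj       git@server:MyProj.git       feature_branch
    InfraProj1   git@server:InfraProj1.git   feature_branch
    InfraProj2   git@server:InfraProj2.git   master

  pull wrapper (shell sketch):
    git clone git@server:myproj-metadata.git metadata
    while read name repo ref; do
        git clone --branch "$ref" "$repo" "$name"
    done < metadata/content

This also covers the <project, branch> dependency question: each dependency is pinned to whatever branch, tag, or commit the content file names, and a local build just runs the same wrapper.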
When time comes to modify either the code in a project component or the version of one of the dependencies you'll need to also update this content file in the metadata git repo to reflect the update and commit it - this is how your project makes progress coherently, as a whole.
Of course, actually managing dependencies is another matter. Tons of opinions out there, some even conflicting.

How to deal with Git Submodules in Visual Studio solutions with different layout?

We develop with Visual Studio 2010 (in C#) and migrated a while ago from SVN to Git. Now we are trying to split up our repository (which is quite big - ~30,000 files) into many Git repositories - one for each solution.
The solutions share some projects, mostly libraries we develop in-house and like to add to from all the solutions.
The new repositories have a flat layout. One subdirectory for each project (shared projects are submodules).
In the big old repo, the projects are in a tree structure.
The problem occurs with external references in the submodules. In the new repos, the path to a referenced project may still be "..\..\..\libs\someproject", while in the new layout the correct path would be "..\someproject".
We already had some edit wars concerning this and are not keen on more.
Half-baked Solutions I could think of:
use "Reference Paths" in ...csproj.user and exclude this file from version control (has to be redone for each developer and after each reopsitory cleanup)
use branches for each situation and try to teach everyone where "real" commits should go and where "environment-change" commits should go (submodules are already not the simplest concept...)
embed binaries instead of the submodules (but what about developing changes to the submodules? what about different log4net versions?)
Does anyone know of a sane solution?
Since you are asking for a sane solution, I can only advise you to look into setting up your own NuGet service (look at http://www.MyGet.org for inspiration)
http://nuget.codeplex.com/
If you go down the route of package management, consider OpenWrap. However, embedding the package management artefacts in source code is a bad idea. You can use such tools to update what is actually stored in submodules, but don't rely on them at build time. Expect the binaries to be there from the point of view of your build scripts.
So if I understand you correctly, the problem is with Visual Studio and not with Git? If that's the case, use the old tree structure that worked with Visual Studio. Make your submodule structure a tree structure too. So the top of the tree would be one super repo whose submodules (the branches) would have submodules of their own, until you get down to the leaves of your tree. It would be a pain to set up at first, but it should just work.
Use one submodule to house all "common libraries". Just one level. But you should move the common libraries as services with well defined contracts. This way you can incrementally rollout new versions with no down time. This way you only have a submodule in each that holds the contracts. These could be interfaces or messages.
I have a similar problem using VS 2013.
I want to use git-svn instead of SVN directly. SVN has a gigantic set of directories. I could not create a single git repository that would contain all of our trunk folder: git-svn always exited with an error and the repository was corrupted. I worked around the problem as follows:
Using git-svn, I cloned the subset of folders off SVN/trunk that I needed by creating one git-repository per folder.
Created a local parent git repository that contains all my git-svn-cloned folders.
Each git-repository was added as a sub-module to the parent git-repository.
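Roughly, the commands for that workaround look like this (server URL, folder names, and local paths are placeholders):

  cd /c/repos
  git svn clone https://svnserver/svn/trunk/FolderA FolderA
  git svn clone https://svnserver/svn/trunk/FolderB FolderB
  mkdir parent
  cd parent
  git init
  git submodule add /c/repos/FolderA FolderA
  git submodule add /c/repos/FolderB FolderB
  git commit -m "track the git-svn clones as submodules"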
The problem with Visual Studio is that it does not recognize the multiple projects outside the main project where I opened the solution. This solution is in a folder that contains the only files recognized by Visual Studio as being under git source control.
I tried setting the Git preferences to use the upper-level parent directory as the location of the git repository, without noticing any difference.

Migrating from SVN to TFS (what to do about externals)

So we are trying out TFS. For the most part you can pretty much figure it out; we created a test project and added it, etc. All went well. The thing I haven't really grokked is what to do about externals: we have two large projects that both reference a shared core project, and some of the other parts of those two solutions are separate projects themselves, which of course are externals as well.
We don't care about the history (we are at a point where we can make a clean break) so I was planning to go the route of 1) create new TFS 2) add projects exported from SVN.
Again, my question is how to handle the externals - if someone could point me in the right direction that would be great.
Thanks.
Unlike Subversion, TFS branches exist in "path space." So, you could check in your "externals" and create a branch for each distinct version of them you want to reference. Then, you can configure your workspace to reference the appropriate version from the corresponding branch path. Alternatively, you could consider managing these components via NuGet, setting up a private NuGet feed (can be as simple as a UNC path).
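If you go the NuGet route, the feed really can be just a folder: drop the .nupkg files on a share and point a NuGet.config at it. A minimal sketch (the share path is an example, not a real location):

  <configuration>
    <packageSources>
      <add key="InternalPackages" value="\\fileserver\nuget-packages" />
    </packageSources>
  </configuration>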

What is the best practice for sharing a Visual Studio Project (assembly) among solutions

Suppose I have a project "MyFramework" that has some code, which is used across quite a few solutions. Each solution has its own source control management (SVN).
MyFramework is an internal product and doesn't have a formal release schedule, and same goes for the solutions.
I'd prefer not having to build and copy the DLLs to all 12 projects, i.e. new developers should be able to just do an svn checkout and get to work.
What is the best way to share MyFramework across all these solutions?
Since you mention SVN, you could use externals to "import" the framework project into the working copy of each solution that uses it. This would lead to a layout like this:
C:\Projects
  MyFramework
    MyFramework.csproj
    <MyFramework files>
  SolutionA
    SolutionA.sln
    ProjectA1
      <ProjectA1 files>
    MyFramework   <-- this is a svn:externals definition to "import" MyFramework
      MyFramework.csproj
      <MyFramework files>
With this solution, you have the source code of MyFramework available in each solution that uses it. The advantage is that you can change the source code of MyFramework from within each of these solutions (without having to switch to a different project).
BUT: at the same time this is also a huge disadvantage, since it makes it very easy to break MyFramework for some solutions when modifying it for another.
For this reason, I have recently dropped that approach and am now treating our framework projects as a completely separate solution/product (with their own release-schedule). All other solutions then include a specific version of the binaries of the framework projects.
This ensures that a change made to the framework libraries does not break any solution that is reusing a library. For each solution, I can now decide when I want to update to a newer version of the framework libraries.
That sounds like a disaster... how do you cope with developers undoing/breaking the work of others...
If I were you, I'd put MyFramework in a completely separate solution. When a developer wants to develop one of the 12 projects, he opens that project's solution in one IDE and opens MyFramework in a separate IDE.
If you strong-name your MyFramework assembly, GAC it, and reference it in your other projects, then the "copying DLLs" won't be an issue.
You just build MyFramework (a post-build event can run gacutil to put it in the assembly cache) and then build your other project.
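A rough sketch of that setup (sn.exe and gacutil.exe ship with the Windows SDK; the key file name is invented):

  sn -k MyFramework.snk           generate a key pair and sign MyFramework with it (project properties > Signing)
  gacutil /i "$(TargetPath)"      MyFramework post-build event: install the freshly built assembly into the GAC

Dependent projects then reference the strong-named assembly and pick up the GACed copy at run time.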
The "best way" will depend on your environment. I worked in a TFS-based, continuous integration environment, where the nightly build deployed the binaries to a share. All the dependent projects referred to the share. When this got slow, I built some tools to permit developers to have a local copy of the shared binaries, without changing the project files.
Does work in any of the 12 solutions regularly require changes to the "framework" code?
If so your framework is probably new and just being created, so I'd just include the framework project in all of the solutions. After all, if work dictates that you have to change the framework code, it should be easy to do so.
Since changes in the framework made from one solution will affect all the other solutions, breaks will happen, and you will have to deal with them.
Once you rarely have to change the framework as you work in the solutions (this should be your goal) then I'd include a reference to a framework dll instead, and update the dll in each solution only as needed.
svn:externals will take care of this nicely if you follow a few rules.
First, it's safer if you use relative URIs (starting with a ^ character) for svn:externals definitions and put the projects in the same repository if possible. This way the definitions will remain valid even if the subversion server is moved to a new URL.
Second, make sure you follow the following hint from the SVN book. Use PEG-REVs in your svn:externals definitions to avoid random breakage and unstable tags:
You should seriously consider using explicit revision numbers in all of your externals definitions. Doing so means that you get to decide when to pull down a different snapshot of external information, and exactly which snapshot to pull. Besides avoiding the surprise of getting changes to third-party repositories that you might not have any control over, using explicit revision numbers also means that as you backdate your working copy to a previous revision, your externals definitions will also revert to the way they looked in that previous revision ...
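Concretely, that means the externals definition carries an explicit revision, e.g. (repository path invented):

  ^/Common/trunk/CommonProject@1234 CommonProject

which pins the external to revision 1234 until you deliberately bump it.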
I agree with another poster - that sounds like trouble. But if you don't want to do it the "right way", I can think of two other ways to do it. We used something similar to number 1 below (for a native C++ app).
A script, batch file, or other process that does a get and a build of the dependency (just once). This is built/executed only if there are no changes in the repo. You will need to know which tag/branch/version to get. You can use a .bat file as a pre-build step in your project files (a rough sketch follows below).
Keep the binaries in the repo (not a good idea). Even in this case the dependent projects have to do a get and have to know what version to get.
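A pre-build step for option 1 might be as small as this (server URL, tag, and solution name are placeholders):

  svn checkout https://svnserver/svn/Common/tags/1.2 ..\Common --quiet
  msbuild ..\Common\Common.sln /p:Configuration=Release

Checking out a tag (rather than trunk) is what pins the dependency to a known version.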
Eventually what we tried to do for our project(s) was mimic how we use and refer to 3rd party libraries.
What you can do is create a release package for the dependency that sets up a path env variable to itself. I would allow multiple versions of it to exist on the machine and then the dependent projects link/reference specific versions.
Something like
$(PROJ_A_ROOT) = c:\mystuff\libraryA
$(PROJ_A_VER_X) = %PROJ_A_ROOT%\VER_X
and then reference the version you want in the dependent solutions either by specific name, or using the version env var.
Not pretty, but it works.
A scalable solution is to do svn-external on the solution directory so that your imported projects appear parallel to your other projects. Reasons for this are given below.
Using a separate sub-directory for "imported" projects, e.g. externals, via svn:externals seems like a good idea until you have non-trivial dependencies between projects. For example, suppose project A depends on project B, and project B on project C. If you then have a solution S with project A, you'll end up with the following directory structure:
# BAD SOLUTION #
S
+---S.sln
+---A
| \---A.csproj
\---externals
+---B <--- A's dependency
| \---B.csproj
\---externals
\---C <--- B's dependency
\---C.csproj
Using this technique, you may even end up having multiple copies of a single project in your tree. This is clearly not what you want.
Furthermore, if your projects use NuGet dependencies, the packages normally get installed into a top-level packages directory. This means that the NuGet references of projects under the externals sub-directory will be broken.
Also, if you use Git in addition to SVN, a recommended way of tracking changes is to have a separate Git repository for each project, and then a separate Git repository for the solution that uses git submodule for the projects within. If a Git submodule is not an immediate sub-directory of the parent module, then Git submodule command will make a clone that is an immediate sub-directory.
Another benefit of having all projects on the same layer is that you can then create a "super-solution", which contains projects from all of your solutions (tracked via Git or svn-external), which in turn allows you to check with a single Solution-rebuild that any change you made to a single project is consistent with all other projects.
# GOOD SOLUTION #
S
+---S.sln
+---A
| \---A.csproj
+---B <--- A's dependency
| \---B.csproj
\---C <--- B's dependency
\---C.csproj
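The svn:externals definition on S that produces this layout could simply list the imported projects side by side (repository paths assumed), e.g. a multi-line property on the S directory containing:

  ^/B/trunk B
  ^/C/trunk C

A is S's own project, so only the dependencies B and C are pulled in as externals.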
