Good Directory Layout for .NET Projects with libraries used across applications and using Mercurial

I've been using Mercurial for a bunch of standalone projects, but now I'm looking at converting a Subversion repository to Mercurial that's a lot busier and more complicated.
There are about 40 library projects and about 20 applications (various web, console, WPF, etc.). Various apps make use of various libs. All of this is structured under one trunk in Subversion, so there's a directory where all the libs live and a directory where all the apps live. This makes it very easy to find and reference the libs when creating a new Visual Studio project.
simplified....
--trunk-|-- libs
        |-- apps
Now, moving to Mercurial, this is less ideal. It seems the way to handle this is one repository for each app, with subrepositories for each lib you want to use?
--app repository-|-- libs
                 |-- app
Is this right?
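For reference, a subrepository setup like the one sketched above is driven by an .hgsub file in the app repository; the paths and URLs below are hypothetical, just to show the shape of it:

libs/CoreUtils  = https://hg.example.com/libs/CoreUtils
libs/DataAccess = https://hg.example.com/libs/DataAccess

You clone each library into the listed path, hg add the .hgsub file, and commit; Mercurial then records the pinned subrepo revisions in .hgsubstate.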
If so, when starting a new application in Visual Studio and you want to add various libs, what's the best/most efficient way to go about it?
I'm getting the feeling the initial setup is a bit painful, as opposed to the Subversion layout where you effectively don't have to do anything other than reference the library in your Visual Studio project.
So, hence this question: I want to know a good directory structure, and how to quickly set up a new project using this structure.

Ideally (and this is based on my own opinion and experience working with larger, distinct applications that have dependencies), you want to have a repository per distinct, unrelated project, and keep related, possibly dependent projects within the same repo. I'm not a big fan of subrepositories, but that might just be due to lack of exposure.
The reason for this is that you should want to version related projects together, as changing one may affect the other. In reality, anything that can be pulled into a single solution with project references you definitely want to keep together.
Now, there are some exceptions where you may have a library project that you can't necessarily have as part of a solution, but that is a reference for a set of projects. This is where I'd keep a lib folder versioned alongside the rest of my applications in the same repo, but the lib folder holds pre-built assemblies. It can also hold 3rd-party vendor assemblies. These are also important to version along with the project that uses them, since you can then treat a library update as a minor release of the main project.
For other projects that are truly independent, create another repository for it, as it will have its own version life and you do not want changes to it to affect the graph of changes for your other, completely unrelated projects.
Example layout with several related projects and lib folder:
[-] Big Product Repo
--[-] Big Product 1
----[+] Dal
----[+] Services
----[-] Web
------[+] Controllers
------[+] Models
------[+] Views
--[+] Big Product 2
--[-] lib
----[+] iTextSharp
----[+] nHibernate
Example layout with another unrelated project in it (for sake of argument, a Windows services project):
[-] Small Product Repo
--[-] Windows Services
----[+] Emailer
----[+] Task Runner
In reality, though, your folder structure isn't as important as making sure projects that are being treated as one logical unit (a product) are kept together to ensure control over what is built and released. That is my definition of what a repository should contain and what I use to think about how to split things up if there's more than one versionable product.

Related

How to Structure Projects for Multiple Xamarin Apps

My team is working on translating several legacy mobile applications to Xamarin Forms apps. Currently each application is in its own solution, which is not ideal given that they all use a common set of backend software libraries. We were planning to consolidate all the smaller solutions into a single solution containing the apps as well as the common libraries.
However, one of my teammates brought up a valid concern: with a single Xamarin Forms app, several projects could get generated (core, Android, iOS, etc.), with the eventual result of a generally unwieldy solution. I agree with him that the current setup probably would not scale too well as we add more apps -- even if we group projects in solution folders, Visual Studio will eventually slow to a crawl after a certain number of projects exist in the solution.
So we are considering just going back to having each app in its own solution, each solution containing the few Xamarin Forms projects for that app, as mentioned above. But this brings us back to the question of how to reasonably manage the shared library code. My current thought would be to just use shared project(s) for the libraries, or maybe assemble them into NuGet package(s) the app solutions would consume. Am I on the right track here, or does anyone know of a better way to do this?
There are several different ways to manage a shared code project using subtrees, submodules, NuGet packages, etc. There are pros and cons to each so it's best to decide based on the expected use case for that project.
Subtrees essentially take a copy of the remote repo and pull it into the parent repo. This makes it easy to pull in changes from the remote repo, but if changes are expected to be pushed back it can be significantly more difficult, since the parent has no knowledge of the remote repo. While it is possible to push changes back, it can take a significant amount of time depending on the amount of history in the repos.
Submodules are similar to subtrees except that instead of taking a copy it tracks the remote repo based on a specific commit it's pointed to. This essentially can be thought of as another repo inside of the parent that makes pushing changes back to the remote repo much easier but at the cost of making pulling/updating from it a little bit more difficult.
NuGet packages are extremely convenient to install, update, and release to others without having to make the source code public, but that comes with a bit more initial setup to generate each package version, and at the cost of making debugging more difficult than with the actual source code. This is a particularly great option if the shared code library will be distributed to others.
For most projects, if changes are expected to be made to the shared project from a consuming one, I'd recommend a repo for each project and setting up the shared one as a submodule in each. It does take a bit of learning to get used to the different processes of checking out and updating a submodule, but it actually isn't all that difficult, and the few git commands required are worth learning. The docs provide a great example of how to get started using submodules.
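As a rough sketch (repository URLs and paths here are hypothetical), the day-to-day submodule workflow looks something like this:

# add the shared library to an app repo as a submodule
git submodule add https://github.com/example/shared-lib.git libs/shared-lib
git commit -m "Add shared-lib as a submodule"

# fresh clones need the submodule fetched as well
git clone --recurse-submodules https://github.com/example/app.git
# ...or, in an existing clone:
git submodule update --init --recursive

# pull a newer commit of the shared library and record it in the app repo
git -C libs/shared-lib pull origin main
git add libs/shared-lib
git commit -m "Update shared-lib to latest"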

What is the recommended way to setup projects like this?

We are working on a large project. The project has multiple external sites and multiple internal sites all stored in Subversion.
The external sites allow a customer to make requests of various things we provide, pay utility bills and more. We decided to break many of these functions apart because most work completely different than the others. So this is one Visual Studio solution with the WebUI and the database layer broken into two projects each. For instance, utility billing has a Utility.WebUI project and a Utility.Domain project. All DB/business logic is kept in the domain project.
The internal sites bridge the gap between the back-office system (IBM i) and the web database. They will also replace/enhance some of our older RPG programs. In theory they should use the exact same database logic that the external sites use, because they access the same database, right? What is the best way to reference these projects from a different solution? Should I just add a reference to the DLL, or should I import that project from the external application solution into the internal application solution?
This comes down to that we have two developers working on this project. Myself, I do most of the back-end coding. The other developer does most of the GUI coding. So we need to make sure that this project works on multiple workstations.
Does this make sense? Any thoughts?
Use the svn:externals property to reference the shared project into your project(s).
You have to choose between 1) referencing the directory containing the shared project's source code (i.e. where the csproj and cs files are located) or 2) referencing the directory containing the shared project's build output (assembly / dll).
I normally prefer method 1) since it makes modifications to the shared project's source code easier (you can make changes without having to open the shared project's solution in a second instance of Visual Studio). If you don't intend to make changes to the shared project often then method 2) might be better. It reduces compile time and prevents accidental modifications of the shared project's source code. Both methods are fine - matter of taste.
It is recommended for both methods that you version your shared project. i.e. create tags with version numbers and reference the tags, not the trunk. When a new version of the shared project comes out you can update the svn:externals property of your other project(s) with the new version number, run "svn update" to download the new version of the shared project, and recompile. This works especially well if you have a build server for the shared project that does the tagging for you automatically.
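For example (repository layout and version numbers here are hypothetical), pinning the external to a tag looks roughly like this; the ^/ prefix makes the URL relative to the repository root:

# run inside the working copy of the consuming project
svn propset svn:externals '^/SharedProject/tags/1.4.0 SharedProject' .
svn commit -m "Reference SharedProject 1.4.0"
svn update    # pulls the tagged source into ./SharedProject

When a new version of the shared project is tagged, you edit the property to point at the new tag and run svn update again.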
I think you can use a sort of "commons" solution that contains the common projects, and then refer to these projects in your main solutions using SVN externals pointing to the project folder in the SVN trunk.
The commons SVN repository should follow the suggested repository structure (trunk, branches, tags) so that you always have stable commons projects.
In this scenario you can also consider using a dependency management tool, such as NPanday or NDepend, where you declare which version of which assemblies each project depends on; using these tools you can have a local repository of binary assemblies (such as Artifactory or Nexus) to refer to, or choose to use SVN externals to refer directly to source code.
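A minimal sketch of the binary-repository route, assuming the shared library is packaged with NuGet and published to an internal feed (project name, version, and feed URL are all made up):

# package the shared assemblies after a release build
nuget pack MyCommons.csproj -Version 1.2.0 -Properties Configuration=Release

# publish to the internal feed that the consuming solutions point at
nuget push MyCommons.1.2.0.nupkg -Source https://nexus.example.com/repository/nuget-internal/ -ApiKey YOUR_API_KEY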

What is the purpose of Xcode 4's workspaces?

I don't quite understand the utility of Xcode 4's workspaces. What are they used for, and how do they aid with development in Xcode?
E.g. you have a library that you use in two applications. You will most likely have a separate project for this library, correct? Now, you are free to treat this library as an independent project with versioning and regular releases; but this can be very cumbersome if you need to change the library code often and all these changes are directly driven by changes to the two applications using that library.
Instead, you can create two projects, one for each application, and then two workspaces: one consisting of the library project and app 1, the other of the library project and app 2. Opening a workspace always opens both relevant projects, workspace build settings automatically apply to both of them, and they both build to the same build directory (which is actually chosen by Xcode automatically, but it is chosen per workspace, not per project). When you do global searches, search for symbols, and so on, Xcode will always do so in both projects. Further, if you have to change build settings in the library project, the changes are also correctly in place when you open up the other workspace, which is an advantage over directly importing the library files into two different projects.
And now think of 50 libraries, 20 apps, and each app using several of those 50 libraries.
This may not be the idea Apple had in mind, it may not be the perfect use case for workspaces and other people may have better ideas, but this is one use case I can think of.
A workspace is mainly used to manage multiple projects in one logical space. This facilitates the management of dependencies between multiple projects. Very useful when you are involved with open source development.
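As a small illustration (workspace, scheme, and configuration names here are hypothetical), the workspace is the unit you open and build, and the scheme decides which of its projects get built:

open MyApp.xcworkspace
xcodebuild -workspace MyApp.xcworkspace -scheme MyApp -configuration Debug build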

How to work on a Cocoa app and plugins in parallel?

I have a relatively simple goal: I want to create a Cocoa application which doesn't have much functionality itself, but is extendable through plugins. In addition I want to work on a few plugins to supply users with real functionality (and working examples).
As I am planning to make the application and each plugin separate open-source projects (and Git repositories), I'm now searching for the best way to organize my files and the Xcode projects. I'm not very experienced with Xcode and right now I don't see a simple way to get it working without copying files after building.
This is the simple monolithic setup I used for development up until now:
There's only one Xcode project with multiple products:
The main application
A framework for plugin development
Several plugin bundles
What I'm searching for is a comfortable way to split these into several Xcode projects (one for the application and framework, and one for each plugin). As my application is still in an early stage of development, I'm still changing lots of things in both the application and the plugins. So what I mean by "comfortable" is that I don't want to copy files manually or deal with similar inconveniences.
What I need is for the plugin projects to know where they can find the current development framework, and for the application to know where it can find the development plugins. The best would be something like an inter-project dependency, but I couldn't find a way to set up something like that in Xcode.
One possible solution I have in mind is to copy both (the plugins and the framework) in a "Copy Files Build Phase" to a known location, e.g. /tmp/development, so production and development files aren't mixed up.
I think that my solution would be enough, but I'm curious if there's a better way to achieve what I want. So any suggestions are welcome.
First, don't use a static "known location" like you mention. I've worked in this kind of project; it's a royal pain. As soon as you get to the point of needing a couple of different copies of the project around (for fixing bugs in parallel, for testing a "clean" build versus your latest changes, for working on multiple branches), the builds start trashing each other and you find yourself having to do completely clean builds much more often than you'd want.
You can create inter-project dependencies by adding the dependent project (Add File), right-clicking the Target, choosing "Get Info," and then adding a Direct Dependency on the General pane.
In terms of structure, you can either put the main app and framework together, or put them in separate projects. In either case, I recommend a directory tree like:
/MyProject
    /Framework
    /Application
    /Plugins
        /Plugin1
        /Plugin2
Projects should then refer to each other by relative paths. This means you can easily work on multiple copies of the project in parallel.
You can also look at a top-level build script that changes into each directory and runs "xcodebuild". I dislike complex build scripts (we have one; it's called Xcode), but if all it does is call "xcodebuild" with parameters if needed, then a simple build script is useful.
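A minimal sketch of such a script, assuming the directory layout above (the configuration name is just an example; real projects would pass whatever scheme/configuration they use):

#!/bin/sh
set -e
for dir in Framework Application Plugins/Plugin1 Plugins/Plugin2; do
    (cd "$dir" && xcodebuild -configuration Release)
done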

What is the best practice for sharing a Visual Studio Project (assembly) among solutions

Suppose I have a project "MyFramework" that has some code, which is used across quite a few solutions. Each solution has its own source control management (SVN).
MyFramework is an internal product and doesn't have a formal release schedule, and same goes for the solutions.
I'd prefer not having to build and copy the DLLs to all 12 projects, i.e. new developers should be able to just do an svn checkout and get to work.
What is the best way to share MyFramework across all these solutions?
Since you mention SVN, you could use externals to "import" the framework project into the working copy of each solution that uses it. This would lead to a layout like this:
C:\Projects
    MyFramework
        MyFramework.csproj
        <MyFramework files>
    SolutionA
        SolutionA.sln
        ProjectA1
            <ProjectA1 files>
        MyFramework      <-- this is a svn:externals definition to "import" MyFramework
            MyFramework.csproj
            <MyFramework files>
With this solution, you have the source code of MyFramework available in each solution that uses it. The advantage is that you can change the source code of MyFramework from within each of these solutions (without having to switch to a different project).
BUT: at the same time this is also a huge disadvantage, since it makes it very easy to break MyFramework for some solutions when modifying it for another.
For this reason, I have recently dropped that approach and am now treating our framework projects as a completely separate solution/product (with their own release-schedule). All other solutions then include a specific version of the binaries of the framework projects.
This ensures that a change made to the framework libraries does not break any solution that is reusing a library. For each solution, I can now decide when I want to update to a newer version of the framework libraries.
That sounds like a disaster... how do you cope with developers undoing/breaking the work of others...
If I were you, I'd put MyFramework in a completely separate solution. When a developer wants to develop one of the 12 projects, he opens that project's solution in one IDE and opens MyFramework in a separate IDE.
If you strong-name your MyFramework assembly, GAC it, and reference it in your other projects, then the "copying DLLs" won't be an issue.
You just build MyFramework (a post-build event can run gacutil to put it in the assembly cache) and then build your other project.
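As a sketch, the post-build event on the MyFramework project can be a single command like the one below; the SDK path varies by machine and .NET SDK version, so treat it as a placeholder:

"C:\Program Files (x86)\Microsoft SDKs\Windows\v10.0A\bin\NETFX 4.8 Tools\gacutil.exe" /i "$(TargetPath)"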
The "best way" will depend on your environment. I worked in a TFS-based, continuous integration environment, where the nightly build deployed the binaries to a share. All the dependent projects referred to the share. When this got slow, I built some tools to permit developers to have a local copy of the shared binaries, without changing the project files.
Does work in any of the 12 solutions regularly require changes to the "framework" code?
If so your framework is probably new and just being created, so I'd just include the framework project in all of the solutions. After all, if work dictates that you have to change the framework code, it should be easy to do so.
Since changes in the framework made from one solution will affect all the other solutions, breaks will happen, and you will have to deal with them.
Once you rarely have to change the framework as you work in the solutions (this should be your goal) then I'd include a reference to a framework dll instead, and update the dll in each solution only as needed.
svn:externals will take care of this nicely if you follow a few rules.
First, it's safer if you use relative URIs (starting with a ^ character) for svn:externals definitions and put the projects in the same repository if possible. This way the definitions will remain valid even if the subversion server is moved to a new URL.
Second, make sure you follow this hint from the SVN book: use peg revisions (PEG-REVs) in your svn:externals definitions to avoid random breakage and unstable tags:
You should seriously consider using explicit revision numbers in all of your externals definitions. Doing so means that you get to decide when to pull down a different snapshot of external information, and exactly which snapshot to pull. Besides avoiding the surprise of getting changes to third-party repositories that you might not have any control over, using explicit revision numbers also means that as you backdate your working copy to a previous revision, your externals definitions will also revert to the way they looked in that previous revision ...
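Put together, a pinned externals definition might look like this (repository path, revision number, and directory name are hypothetical):

# new-style syntax: URL@PEG LOCALDIR, with ^/ relative to the repository root
svn propset svn:externals '^/shared/MyFramework@2500 MyFramework' .
svn commit -m "Pin the MyFramework external to r2500"
svn update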
I agree with another poster - that sounds like trouble. But if you don't want to do it the "right way", I can think of two other ways to do it. We used something similar to number 1 below (for a native C++ app).
1. A script, batch file, or other process that does a get and a build of the dependency (just once). This is built/executed only if there are no changes in the repo. You will need to know what tag/branch/version to get. You can use a bat file as a pre-build step in your project files.
2. Keep the binaries in the repo (not a good idea). Even in this case the dependent projects have to do a get and have to know what version to get.
Eventually what we tried to do for our project(s) was mimic how we use and refer to 3rd party libraries.
What you can do is create a release package for the dependency that sets up a path env variable to itself. I would allow multiple versions of it to exist on the machine and then the dependent projects link/reference specific versions.
Something like
$(PROJ_A_ROOT) = c:\mystuff\libraryA
$(PROJ_A_VER_X) = %PROJ_A_ROOT%\VER_X
and then reference the version you want in the dependent solutions either by specific name, or using the version env var.
Not pretty, but it works.
A scalable solution is to apply the svn:externals definitions on the solution directory, so that your imported projects appear parallel to your other projects. The reasons for this are given below.
Using a separate sub-directory for "imported" projects (e.g. externals) via svn:externals seems like a good idea until you have non-trivial dependencies between projects. For example, suppose project A depends on project B, and project B on project C. If you then have a solution S with project A, you'll end up with the following directory structure:
# BAD SOLUTION #
S
+---S.sln
+---A
|   \---A.csproj
\---externals
    +---B                <--- A's dependency
    |   \---B.csproj
    \---externals
        \---C            <--- B's dependency
            \---C.csproj
Using this technique, you may even end up having multiple copies of a single project in your tree. This is clearly not what you want.
Furthermore, if your projects use NuGet dependencies, they normally get restored into the top-level packages directory. This means that NuGet references of projects within the externals sub-directory will be broken.
Also, if you use Git in addition to SVN, a recommended way of tracking changes is to have a separate Git repository for each project, and then a separate Git repository for the solution that uses git submodules for the projects within it. If a Git submodule is not an immediate sub-directory of the parent module, then the git submodule command will make a clone that is an immediate sub-directory.
Another benefit of having all projects on the same layer is that you can then create a "super-solution", which contains projects from all of your solutions (tracked via Git or svn-external), which in turn allows you to check with a single Solution-rebuild that any change you made to a single project is consistent with all other projects.
# GOOD SOLUTION #
S
+---S.sln
+---A
|   \---A.csproj
+---B                <--- A's dependency
|   \---B.csproj
\---C                <--- B's dependency
    \---C.csproj
