In the CCNet wiki, in the project block's workingDirectory entry, I read: "Make sure this folder is unique per project to prevent problems with the build." I want two projects that share the same working directory... what are the "problems with the build" that can occur, and how can I overcome them?
Edit:
My situation: I have two applications in the same trunk that share some common code. If I commit to one of the applications, I don't want the other application to build and increase its version number, but if I change the common code I want both of them to trigger a build. My source control is SVN and I use the Filtered block to include only the files that should trigger a build.
Option 1
Have a single project that builds just the common code. This should emit its built assembly to a known location outside its working directory.
Have 2 other projects that build the other parts of the solution. Each only listens to changes to its particular source control paths. Each project can incorporate/reference the built assembly from the known location.
The 2 other projects can be forced from the common project using a forceBuildPublisher.
These projects should be in the same queue to prevent the common project rewriting the built common assembly whilst it is being referenced by the other 2 projects. A sketch of this setup follows below.
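A minimal ccnet.config sketch of this setup (project names, paths and element details are hypothetical and may need adjusting for your CruiseControl.NET version):

<!-- Common project: publishes its assembly to a known location, then forces the app builds. -->
<project name="Common" queue="SharedQueue" queuePriority="1">
  <workingDirectory>C:\Builds\Common</workingDirectory>
  <sourcecontrol type="svn">
    <trunkUrl>http://svn.example.com/repo/trunk/Common</trunkUrl>
  </sourcecontrol>
  <publishers>
    <!-- copy the built assembly outside the working directory so it survives cleans -->
    <buildpublisher>
      <sourceDir>C:\Builds\Common\bin\Release</sourceDir>
      <publishDir>C:\SharedBin\Common</publishDir>
    </buildpublisher>
    <forcebuild>
      <project>AppA</project>
    </forcebuild>
    <forcebuild>
      <project>AppB</project>
    </forcebuild>
    <xmllogger />
  </publishers>
</project>

<!-- AppA and AppB share the same queue so they never build while Common is republishing. -->
<project name="AppA" queue="SharedQueue" queuePriority="2">
  <workingDirectory>C:\Builds\AppA</workingDirectory>
  ...
</project>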
Option 2
Have 2 individual projects that both build the common source code and the specific code together. Say by building a solution file which contains both sets of projects.
This is the simpler option, but you do lose the neatness of having a common assembly version number.
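With Option 2, each application's project can use the Filtered source control block the question mentions, including both its own paths and the common paths, roughly like this (the repository URL and patterns are placeholders):

<sourcecontrol type="filtered">
  <sourceControlProvider type="svn">
    <trunkUrl>http://svn.example.com/repo/trunk</trunkUrl>
    <workingDirectory>C:\Builds\AppA\src</workingDirectory>
  </sourceControlProvider>
  <inclusionFilters>
    <pathFilter>
      <pattern>/trunk/AppA/**/*.*</pattern>
    </pathFilter>
    <pathFilter>
      <pattern>/trunk/Common/**/*.*</pattern>
    </pathFilter>
  </inclusionFilters>
</sourcecontrol>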
Pros and Cons
Option 1
You have a single version number for each version of the common code.
It's more prone to issues due to the additional complexity.
You need to maintain a known location outside of the working directory so that it is not deleted/cleaned by the build process.
Option 2
Simpler solution.
The common code's version is lost in the version of the dependent assembly.
If I had to suggest one, I would opt for option 2, purely because its simplicity reduces the chance of other issues.
The working directory is the place where CruiseControl.NET puts your project's source code, and it is also where the build process happens. If you point two projects at the same working directory, you can end up with all kinds of conflicts: source files of project A and project B can get mixed up, the build process might break because the build folder is in an unknown state, and so on.
It's quite natural to separate unrelated things, and in this case it's simply common sense. Besides, I can hardly imagine a situation in which you have to put 2 projects into the same working directory.
Related
I presently work with a large solution, containing about 100 projects. At least 10 of the projects are executable applications. Some of the library projects are imported as plugins via MEF and reflection rather than with direct references. If a needed plugin's own dependencies are not copied to the output or plugin directory of the executable project using it, we'll get reflection errors at runtime.
We've already tried or discussed the following solutions, but none of them seem like a good fit:
"Hard" References: Originally, we had the executable projects reference other projects they needed, even if they were going to ultimately be imported as optional plugins. This quickly fell out of favor with team members who needed to make builds that excluded certain plugins and liked to unload those projects to begin with. This also made it difficult to use Resharper or other tools to clean unused references and remove obsolete third party libraries without accidentally blowing away the "unused" references to the needed plugins own dependencies.
Post-build copying (with pre-build "pull"): For a brief period of time, a senior team member set all the plugin projects to xcopy their outputs to a known "DependencyInjection" folder as post-build events. Projects that needed those plugins would have pre-build events, xcopying each desired plugin to their own output directories. While this meant that the plugin projects "rightly" had no knowledge of where they might be used, it caused two major headaches. First, any time one made a change in a plugin project, they had to separately build (in sequence) the plugin project and then the executable project they would test it in (to get the files to copy over). Rebuild All would be more convenient but far too slow. Second, the continuous integration build would have had to be reconfigured, since it compiled everything into one directory and only cared whether everything built successfully.
Post-build copying (push): The present solution started with xcopy and now mostly uses robocopy in post-build events of the plugin projects to copy needed files directly to the plugin folders of the executable projects that use them. This works fairly well in that if one makes a change in a plugin, one can go straight to running with the debugger. Also, the CI build doesn't break, and users disabling certain "optional" plugin projects for various builds don't get build errors from missing references. It still seems hackish, though, and is cumbersome to maintain in all the separate post-build windows, which are rather small and can't be expanded. When executable projects get moved or renamed during a restructure, we don't find out about broken references until the next day, after hearing results from the overnight automated testing.
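For reference, a "push"-style post-build event along these lines might look as follows; the host project name, Plugins folder and file pattern are assumptions about the layout, not the actual configuration:

rem Hypothetical post-build event on a plugin project: push the plugin output
rem into the host executable's Plugins folder.
robocopy "$(TargetDir)." "$(SolutionDir)HostApp\bin\$(ConfigurationName)\Plugins" "$(TargetName).*" /XO
rem robocopy returns 1-7 for success; map those to 0 so Visual Studio does not
rem treat a successful copy as a failed build step.
if %ERRORLEVEL% LSS 8 exit /B 0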
"Dummy" projects with references: One idea that was briefly tossed about involved making empty projects for each of the different executable build configurations and going back to the hard references method on those. Each would use its own references to gather up the plugins and their dependencies. They would also have a reference to the actual executable and copy it over. Then, if one wanted to run a particular executable in a particular configuration, you'd run its dummy project. This one seemed particularly bloated and was never attempted.
NuGet: In my limited familiarity with NuGet, this seems like a good fit for using packages, except I wouldn't know how to implement that internal to one solution. We've talked about breaking up the solution, but many members of the team are strongly opposed to that. Is using NuGet with packages coming from within the same solution possible?
What are best practices for a situation like this? Is there a better solution to managing dependencies of reflected dependencies like this than any of the above, or is a refinement of one of the above the best choice?
OK, so I assume in this answer that each developer needs to constantly have all 100 assemblies (Debug mode) locally to do their job (develop, compile, smoke test, run automated tests).
You mention that Rebuild All takes a long time. Generally this symptom is caused by having too many assemblies plus a build process that hasn't been rationalized. So the first thing to do is to try to merge the 100 assemblies into as few assemblies as possible and avoid using things like Copy Local = true. The effect will be a much faster (like 10x) Rebuild All. Keep in mind that assemblies are physical artifacts and that they are useful only for physical concerns (plug-ins, on-demand loading, test/app separation...). I wrote a white book that details my thoughts on the topic: http://www.ndepend.com/WhiteBooks.aspx
Partitioning code base through .NET assemblies and Visual Studio projects (8 pages)
Common valid and invalid reasons to create an assembly
Increase Visual Studio solution compilation performance (up to x10 faster)
Organize the development environment
One of the pieces of advice in the white book is to avoid referencing projects and to reference assemblies instead. This way it becomes your responsibility to fill in Project > right click > Project Dependencies, which defines the Project > right click > Project Build Order. If you decide to keep dealing with 100 assemblies, defining this setting represents an effort, but as a bonus a high-level (executable) project can depend on a library used only by reflection, which solves your problem.
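Concretely, a file reference with Copy Local turned off looks roughly like this in a .csproj (assembly name and path are placeholders):

<ItemGroup>
  <!-- Reference the already-built assembly instead of the project.
       Private=False is the .csproj spelling of Copy Local = false. -->
  <Reference Include="MyCompany.Common">
    <HintPath>..\..\SharedBin\MyCompany.Common.dll</HintPath>
    <Private>False</Private>
  </Reference>
</ItemGroup>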
Did you measure the lines of code in terms of the number of PDB sequence points? I estimate that up to a limit of 200K to 300K, doing a Rebuild All (with the optimizations described in the white book) should take 5 to 10 seconds (on a decent laptop), which remains acceptable. If your code base is very large and goes beyond this limit, you'll need to break my first assumption and find a way for a developer not to need all assemblies to do their job (in which case we can talk about this further).
Disclaimer: This answer references resources from the site of the tool NDepend, which I created and whose development I now manage.
I have been in a situation like yours. We had almost 100 projects. We too were using MEF and System.AddIn. In the beginning we had a few solutions. I was working on the core solution that included the core assemblies and their tests. Each plug-in category was in a separate solution that included contracts, implementations (some plug-ins had more than one implementation) and tests, plus some test host as well as the core assemblies. At some later point we added a solution that included all projects, and after trying a few of the approaches you mention we decided to do the following:
1. Keep the references that are mandatory.
2. All executable projects were set to output to common locations (one for debug and one for release configurations).
3. All projects that should not be referenced were set to output to these common locations.
4. All projects that were referenced by others were left unchanged, and each reference was set with Copy Local = true.
5. Tests were left unchanged.
Although building all was slow, we didn't have any other problems. Of course, having almost 100 projects is a sign that the design is probably too modular and, as Patrick advises, we should have tried to compact it.
Anyway, you could try this approach in a couple of hours; perhaps instead of setting Copy Local = true, try to set the output folder of all the projects mentioned in 4 to the common locations. We didn't know that this setting would slow down the build process, as Patrick mentions.
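As a sketch, redirecting a project's output to a common location is a per-configuration property in the .csproj (the relative path here is made up):

<!-- Send Debug output to a solution-wide bin folder instead of the project-local bin\Debug. -->
<PropertyGroup Condition=" '$(Configuration)|$(Platform)' == 'Debug|AnyCPU' ">
  <OutputPath>..\..\bin\Debug\</OutputPath>
</PropertyGroup>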
PS. We never tried using NuGet because we didn't have enough resources and time to experiment with it. It looked promising though.
We are starting up a new project and I am looking for the "best practices" solution to this similar problem. For us, the projects can be divided into two categories: 1) the Platform assemblies, which provide a common set of services across the board, and 2) the Verticals, which perform business-specific functions.
In the past we have used a Visual Studio plug-in with a simple UI that allows developers to specify a common assemblies path to copy the output assemblies to, and then reference all assemblies (wherever they reside, even in a different solution) from the common assemblies folder.
I am looking at NuGet, but the sheer amount of work you have to do to create and maintain NuGet packages is punitive.
It's a very common scenario and I would be really interested to see how others have addressed it.
I don't quite understand the utility of Xcode 4's workspaces. What are they used for, and how do they aid with development in Xcode?
E.g. you have a library that you use in two applications. You will most likely have a separate project for this library, correct? Now, you are free to treat this library as an independent project with versioning and regular releases; but this can be very cumbersome if you need to change the library code fairly often and all those changes are directly driven by changes to the two applications using the library. Instead you can create two projects, one for each application, and then two workspaces: one consisting of the library project and app 1, the other of the library project and app 2. Opening a workspace always opens both relevant projects, workspace build settings automatically apply to both of them, both build to the same build directory (which is actually chosen by Xcode automatically, but it is chosen per workspace, not per project), and when you do global searches, search for symbols, etc., Xcode will always do so across both projects. Further, if you have to change the library project's build settings, the changes are also correctly picked up when you open the other workspace, which is an advantage over directly importing the library files into two different projects. Now think of 50 libraries and 20 apps, with each app using several of those 50 libraries.
This may not be the idea Apple had in mind, it may not be the perfect use case for workspaces and other people may have better ideas, but this is one use case I can think of.
A workspace is mainly used to manage multiple projects in one logical space. This facilitates the management of dependencies between multiple projects. Very useful when you are involved with open source development.
I have a relatively simple goal: I want to create a Cocoa application which doesn't have much functionality itself, but is extendable through plugins. In addition I want to work on a few plugins to supply users with real functionality (and working examples).
As I am planning to make the application and each plugin separate open-source projects (and Git repositories), I'm now searching for the best way to organize my files and the Xcode projects. I'm not very experienced with Xcode and right now I don't see a simple way to get it working without copying files after building.
This is the simple monolithic setup I used for development up until now:
There's only one Xcode project with multiple products:
The main application
A framework for plugin development
Several plugin bundles
What I'm searching for is a comfortable way to split these into several Xcode projects: one for the application and framework, and one for each plugin. As my application is still in an early stage of development, I'm still changing lots of things in both the application and the plugins. So what I mean by "comfortable" is that I don't want to copy files manually or suffer similar inconveniences.
What I need is for the plugin projects to know where they can find the current development framework, and for the application to know where it can find the development plugins. The best would be something like an inter-project dependency, but I couldn't find a way to set up something like that in Xcode.
One possible solution I have in mind is to copy both (the plugins and the framework) in a "Copy Files Build Phase" to a known location, e.g. /tmp/development, so production and development files aren't mixed up.
I think that my solution would be enough, but I'm curious if there's a better way to achieve what I want. So any suggestions are welcome.
First, don't use a static "known location" like you mention. I've worked in this kind of project; it's a royal pain. As soon as you get to the point of needing a couple of different copies of the project around (for fixing bugs in parallel, for testing a "clean" build versus your latest changes, for working on multiple branches), the builds start trashing each other and you find yourself having to do completely clean builds much more often than you'd want.
You can create inter-project dependencies by adding the dependent project (Add File), right-clicking the Target, choosing "Get Info", and then adding a Direct Dependency on the General pane.
In terms of structure, you can either put the main app and framework together, or put them in separate projects. In either case, I recommend a directory tree like:
/MyProject
/Framework
/Application
/Plugins
/Plugin1
/Plugin2
Projects should then refer to each other by relative paths. This means you can easily work on multiple copies of the project in parallel.
You can also look at a top-level build script that changes into each directory and runs "xcodebuild". I dislike complex build scripts (we have one; it's called Xcode), but if all it does is call "xcodebuild" with parameters if needed, then a simple build script is useful.
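A minimal script of that kind, assuming the directory tree above and that each directory contains a single Xcode project whose default target is the one you want, could be:

#!/bin/sh
# Build each sub-project in dependency order; stop at the first failure.
set -e
for dir in Framework Plugins/Plugin1 Plugins/Plugin2 Application; do
    (cd "$dir" && xcodebuild -configuration Debug)
done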
I'm new to TeamCity and we have a number of common projects under source control that are then referenced directly within relevant projects i.e.:
Common
branches
tags
trunk
CommonProject
CommonProject.csproj
Common.sln
ProjectX
branches
tags
trunk
ProjectX.sln
As a result, the reference to "CommonProject.csproj" in "ProjectX.sln" is something along the lines of ..\..\Common\trunk\CommonProject\CommonProject.csproj, which is fine within our development environments, but when it comes to TeamCity it falls over saying it can't find the path "..\..\Common\trunk\CommonProject\CommonProject.csproj".
What's the best way around this problem? I've tried adding CommonProject to TeamCity as a dependency but it still doesn't seem to want to play ball...
Thanks
Tim
We address this by using Externals in Subversion which allows you to pull in stuff from a different (bit of the) repository.
Then, when we're building the solutions, we have those common projects grouped into the same folder as the project-specific solution - i.e. when we check stuff out we have:
Solution1
+---Project1
+---Project2
+---Project3
+---Common1
+---Common2
Then, separately:
Solution2
+---ProjectA
+---ProjectB
+---ProjectC
+---Common1
+---Common2
Because we have the externals and the directory/folder structure set up this way, you should, in theory, be able to check out (or export) a "solution" to an empty directory and have it build successfully from scratch (subject to all the necessary tools being installed), and therefore TeamCity (or whatever your continuous integration server is) should also be able to build it from scratch. In fact, even before we started using TeamCity I had this as a policy, but its value is clearer once you start doing continuous integration.
The appropriate bit of the Subversion Red Book is here: Externals Definitions
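Assuming the common projects live in the same repository as the solutions, the externals definition for Solution1 could be set roughly like this (Subversion 1.5+ syntax; the repository paths are guesses based on the layout above, and ^/ makes them repository-relative so they survive a server move):

svn propset svn:externals "^/Common/trunk/Common1 Common1
^/Common/trunk/Common2 Common2" Solution1/trunk
svn commit -m "Pull common projects into Solution1 via svn:externals" Solution1/trunk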
Suppose I have a project "MyFramework" that has some code, which is used across quite a few solutions. Each solution has its own source control management (SVN).
MyFramework is an internal product and doesn't have a formal release schedule, and same goes for the solutions.
I'd prefer not having to build and copy the DLLs to all 12 projects, i.e. new developers should be able to just do an svn checkout and get to work.
What is the best way to share MyFramework across all these solutions?
Since you mention SVN, you could use externals to "import" the framework project into the working copy of each solution that uses it. This would lead to a layout like this:
C:\Projects
MyFramework
MyFramework.csproj
<MyFramework files>
SolutionA
SolutionA.sln
ProjectA1
<ProjectA1 files>
MyFramework <-- this is a svn:externals definition to "import" MyFramework
MyFramework.csproj
<MyFramework files>
With this solution, you have the source code of MyFramework available in each solution that uses it. The advantage is, that you can change the source code of MyFramework from within each of these solutions (without having to switch to a different project).
BUT: at the same time this is also a huge disadvantage, since it makes it very easy to break MyFramework for some solutions when modifying it for another.
For this reason, I have recently dropped that approach and am now treating our framework projects as a completely separate solution/product (with their own release-schedule). All other solutions then include a specific version of the binaries of the framework projects.
This ensures that a change made to the framework libraries does not break any solution that is reusing a library. For each solution, I can now decide when I want to update to a newer version of the framework libraries.
That sounds like a disaster... how do you cope with developers undoing/breaking the work of others...
If I were you, I'd put MyFramework in a completely separate solution. When a developer wants to work on one of the 12 projects, he opens that project's solution in one IDE and opens MyFramework in a separate IDE.
If you strong-name your MyFramework assembly and GAC it, and reference it in your other projects, then "copying DLLs" won't be an issue.
You just build MyFramework (a post-build event can run gacutil to put it in the assembly cache) and then build your other project.
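If you go the GAC route, the post-build event can be as simple as the following; it assumes gacutil.exe (which ships with the Windows SDK, not with Windows itself) is on the build machine's PATH and that the build runs with sufficient privileges:

rem Hypothetical post-build event: force-reinstall the freshly built,
rem strong-named assembly into the global assembly cache.
gacutil /if "$(TargetPath)"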
The "best way" will depend on your environment. I worked in a TFS-based, continuous integration environment, where the nightly build deployed the binaries to a share. All the dependent projects referred to the share. When this got slow, I built some tools to permit developers to have a local copy of the shared binaries, without changing the project files.
Does work in any of the 12 solutions regularly require changes to the "framework" code?
If so your framework is probably new and just being created, so I'd just include the framework project in all of the solutions. After all, if work dictates that you have to change the framework code, it should be easy to do so.
Since changes in the framework made from one solution will affect all the other solutions, breaks will happen, and you will have to deal with them.
Once you rarely have to change the framework as you work in the solutions (this should be your goal) then I'd include a reference to a framework dll instead, and update the dll in each solution only as needed.
svn:externals will take care of this nicely if you follow a few rules.
First, it's safer if you use relative URIs (starting with a ^ character) for svn:externals definitions and put the projects in the same repository if possible. This way the definitions will remain valid even if the subversion server is moved to a new URL.
Second, make sure you follow the following hint from the SVN book. Use PEG-REVs in your svn:externals definitions to avoid random breakage and unstable tags:
You should seriously consider using explicit revision numbers in all of your externals definitions. Doing so means that you get to decide when to pull down a different snapshot of external information, and exactly which snapshot to pull. Besides avoiding the surprise of getting changes to third-party repositories that you might not have any control over, using explicit revision numbers also means that as you backdate your working copy to a previous revision, your externals definitions will also revert to the way they looked in that previous revision ...
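In practice that means pinning each external, e.g. an svn:externals line like the following (the revision number and paths are placeholders):

^/MyFramework/trunk@1234 MyFramework

The -r form (-r 1234 ^/MyFramework/trunk MyFramework) is the operative-revision equivalent; either way, the solution keeps building against the same snapshot of MyFramework until you deliberately bump the revision.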
I agree with another poster - that sounds like trouble. But if you don't want to do it the "right way", I can think of two other ways to do it. We used something similar to number 1 below (for a native C++ app).
A script, batch file, or other process that does a get and a build of the dependency (just once); it is only re-run when the dependency changes in the repo. You will need to know what tag/branch/version to get. You can use a bat file as a pre-build step in your project files (a sketch follows below).
Keep the binaries in the repo (not a good idea). Even in this case the dependent projects have to do a get and have to know what version to get.
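A sketch of option 1 as a pre-build batch step (repository URL, tag and folder names are made up):

rem Hypothetical pre-build step: fetch a known tag of the dependency and build it,
rem but only if it is not already present locally.
if not exist "..\deps\LibraryA" (
  svn checkout https://svn.example.com/repo/LibraryA/tags/1.2 ..\deps\LibraryA
  msbuild ..\deps\LibraryA\LibraryA.sln /p:Configuration=Release
)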
Eventually what we tried to do for our project(s) was mimic how we use and refer to 3rd party libraries.
What you can do is create a release package for the dependency that sets up a path env variable to itself. I would allow multiple versions of it to exist on the machine and then the dependent projects link/reference specific versions.
Something like
set PROJ_A_ROOT=c:\mystuff\libraryA
set PROJ_A_VER_X=%PROJ_A_ROOT%\VER_X
and then reference the version you want in the dependent solutions either by specific name, or using the version env var.
Not pretty, but it works.
A scalable solution is to apply the svn:external on the solution directory, so that your imported projects appear parallel to your other projects. Reasons for this are given below.
Using a separate sub-directory for "imported" projects, e.g. externals, via svn:externals seems like a good idea until you have non-trivial dependencies between projects. For example, suppose project A depends on project B, and project B on project C. If you then have a solution S with project A, you'll end up with the following directory structure:
# BAD SOLUTION #
S
+---S.sln
+---A
| \---A.csproj
\---externals
+---B <--- A's dependency
| \---B.csproj
\---externals
\---C <--- B's dependency
\---C.csproj
Using this technique, you may even end up having multiple copies of a single project in your tree. This is clearly not what you want.
Furthermore, if your projects use NuGet dependencies, those normally get restored into a top-level packages directory. This means that NuGet references of projects within the externals sub-directory will be broken.
Also, if you use Git in addition to SVN, a recommended way of tracking changes is to have a separate Git repository for each project, and then a separate Git repository for the solution that uses git submodule for the projects within it. If a Git submodule is not an immediate sub-directory of the parent module, then the git submodule command will make a clone that is an immediate sub-directory.
Another benefit of having all projects on the same layer is that you can then create a "super-solution", which contains projects from all of your solutions (tracked via Git or svn-external), which in turn allows you to check with a single Solution-rebuild that any change you made to a single project is consistent with all other projects.
# GOOD SOLUTION #
S
+---S.sln
+---A
| \---A.csproj
+---B <--- A's dependency
| \---B.csproj
\---C <--- B's dependency
\---C.csproj