I have a few (3) core projects I want to share across many solutions (12+).
So, say I have 12 websites and they use some shared back end core code (in this case I'm not talking about shared js, css or views - I'm talking about business objects, entity stuff, etc.).
I need to be able to identify which site has which version of the shared code in dev, test, prod, etc. so a developer can get the website code and get the right version of the shared code to develop or patch the website.
And then the MS build server needs to know which version of the shared code to get for the deployment.
To solve this, I'm seeing people branch that core code - which seems absurd to do 12+ times. (I do expect to branch the core code sometimes for things like hot fixes and long running projects.)
I'm also seeing people copy DLLs of the core code and check those in.
I would think I would list the dependencies for my solutions, based on TFS label names, somewhere so developers can easily get the apps running with the right code, and so that, given a TFS label, the build server can get the code for the website plus the proper version of the core code. I'm using TFS & VS 2013 at the moment too, so there's that.
So, is there a way to do this that's straightforward, supportable/scalable and intuitive? Thanks - Peter
Labels in TFS are very limited. For example, once a label is created you can't change or update it. If one of your core projects is updated, you need to create a new label for it and point your solution at that new label. If you then find bugs in that update, you need a newer update of the core project to fix them, which means yet another label, and you end up maintaining all of these dependencies by hand - not an easy job.
Moreover, how would you list the dependencies for your solutions based on TFS label names? TFS doesn't have a built-in option for this; the only way seems to be to store the mapping in a txt or some other file and check it in to source control. Every time a developer opens a website application, they would need to check that file first, get the labeled version from the server into their workspace, and then work on it.
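For what it's worth, a minimal sketch of such a check-in file (the file name, solution names, and label names below are invented purely for illustration) could be one line per dependency, listing the solution, the shared core project, and the TFS label of the version it uses:

    Dependencies.txt (hypothetical), checked in at the root of each website solution:
    Website01.sln    Core.Business    Core-Business-v1.4.2
    Website01.sln    Core.Entities    Core-Entities-v1.4.2
    Website01.sln    Core.Services    Core-Services-v1.3.9

Both the developer and the build server would read this file to decide which label of the core code to fetch, and it would have to be updated by hand on every core change - which is exactly the maintenance burden described above.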
Usually the purpose of sharing code between projects is to reduce maintenance. There are two main code-sharing paths: source and binary. For the difference between them, take a look at this blog: Code Sharing in Team Foundation Server
Sharing source code between products is a primary cause of quality erosion and elevated bug counts. I would recommend building the core projects separately and sharing the binary output through NuGet, which is the preferable approach.
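As a rough sketch of that workflow (the project name, version number, and feed path below are examples, not anything from the question), the core team packages its release build with nuget.exe and each website pins the exact version it consumes:

    rem Hypothetical names - package the core project's release output and push it to an internal feed
    nuget pack Core.Business.csproj -Version 1.4.2 -Properties Configuration=Release
    nuget push Core.Business.1.4.2.nupkg -Source \\fileshare\NuGetFeed

    rem In each website solution, install/pin the exact version that site depends on
    nuget install Core.Business -Version 1.4.2 -OutputDirectory packages

The version recorded in each website's packages.config then answers the "which site uses which version of the core code" question for both developers and the build server, without branching the core code or checking DLLs into source control.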
Also take a look at these similar questions:
Sharing code between solutions in TFS
TFS 2010 Branch Across Team Projects - Best Practices
We started programming in a project that uses Agile Work Item Templates. Now, there is some history of the code that we want to keep.
Also, we want to change to a customized CMMI template, so it is close to CMMI, but customized, with slightly different work items, also some new/removed ones (for testing purposes, we set it up in a different project).
How can we now merge the source (and history) from the one project with the work items from another project?
From my understanding, we could simply export/import the work item types, but then the reports and queries, as well as the dashboard, would not get updated properly either? So all the scenarios we can come up with so far result in a loss of version history (simply importing the current state of the source into a newly created project using CMMI and then updating the work items).
Is there a better solution?
(using TFS 2010 and VS 2010)
edit: some useful information to be found here: http://blogs.msdn.com/b/willy-peter_schaub/archive/2011/05/17/tfs-integration-tools-where-does-one-start-part-3-dust-has-settled-did-it-work.aspx - like me, you will probably especially run into trouble with the ProcessBuildTemplates
Have you considered using the TFS Integration Tools? I'm not sure about the successful migration of work items for Team Projects using different templates, but I've been able to successfully migrate code with its history between Team Projects.
We have Team Foundation Server 2008 deployed as our source control management system. A team that is responsible for multiple products is asking for all their products to be put under a single TFS project. Their reason is because the products are all in a similar domain.
Here are my reasons against:
The workspace mappings will get weird, since projects will be mapped to subfolders
Continuous Integration may be a problem, since a single project can't be referenced
Tracking history of source control activity could be problematic
This just feels like an overall bad idea, but I would like some concrete reasons against it. If I'm completely off-base and this is a good approach to take, I'd like to hear that as well.
What are the pros/cons?
I have experience storing multiple Visual Studio solutions (separate products) under one TFS Team Project in both TFS2008 and TFS2010. Here is my take.
In both versions we create a folder for the product, then a folder for the branches (Main, etc.). This makes it easy to see which product we are working on, and we can see the history of each product separately from the other products. Continuous integration works just fine with multiple build definitions, one for each product. We only create one workspace mapping for the entire TFS Team Project.
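As a sketch (the product names are invented), the layout looks something like this:

    $/TeamProject
        ProductA
            Main
            Release-1.0
        ProductB
            Main

Each build definition points at one product's branch folder, and the single workspace mapping covers the whole tree.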
The shortfall in TFS2008 is that it can be difficult to manage work items for each Product. In TFS2008 the work items apply to the entire Team Project and it is not as easy as it should be to figure out which work item belongs to which product.
In TFS2010 the work items have an Areas and Iterations section. We use the Area to define the Product. So each Work Item gets an Area that matches the Product name. This has worked very well for us.
If you are not using work items heavily in TFS2008, then I don't think you need to avoid putting multiple products in one TFS Team Project - certainly not for the reasons you listed above.
Using one Team Project does have some advantages:
1. There is only one Team Project to manage and there is only one SharePoint site.
2. You can see history across the entire Team Project easily.
My thoughts:
If there are assemblies shared amongst the projects, it makes sense to lump them together, otherwise you will run into the same problems that many people have discussed here, on how to handle shared assemblies.
You shouldn't encounter any problems with workspace mappings. Within our organization, we simply map $/ to a folder and go from there. Otherwise you could very easily map individual source control folders to different areas on disk. The only recommendation I would have is to put that mapping in a batch file, so that new members can run the batch and be consistent.
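A minimal sketch of such a batch file, assuming the TFS 2010-era tf.exe is on the path (older clients use /server: instead of /collection:) and using made-up server, workspace, and folder names:

    @echo off
    rem Hypothetical one-time setup for a new team member
    set COLLECTION=http://tfsserver:8080/tfs/DefaultCollection
    set WS=%COMPUTERNAME%_Main

    rem Create a workspace and map everything under $/ to a single local folder
    tf workspace /new %WS% /collection:%COLLECTION% /noprompt
    tf workfold /map "$/" "C:\Source" /workspace:%WS% /collection:%COLLECTION%

    rem Pull everything down into the mapped folder
    md "C:\Source" 2>nul
    cd /d "C:\Source"
    tf get /recursive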
The only thing that you might lose out on a bit by lumping these all together is quick and easy reporting. If everything is in its own Team Project, the built-in reporting works "out of the box." If you put things together, you'll need to set up additional areas and iterations in order to do the reporting and tracking.
In our organization we have upward of 15 separate team projects, but every single one of them has more than one "product" underneath. We've been running this way for two years and really haven't had any problem with it, with the exception of the reporting.
Using a single Team Project for more than one piece of software is a perfectly acceptable solution if you don't use separate process templates for them. Martin Hinshelwood has a detailed blog post on the subject.
http://blog.hinshelwood.com/when-should-i-use-areas-in-tfs-instead-of-team-projects-in-team-foundation-server-2010/
The following SO thread covers managing DLL references in multiple projects across different solutions. Additionally, I want to know how to manage version control for these dependency DLLs.
Should the DLL (which is published to other external projects) also be committed every time a code change happens?
Team Development with TFS Guide (Final Release)
This guide has proven to be invaluable to my team. They describe several scenarios and what the pros and cons are of each. For anyone managing a TFS environment this is a must read.
With regards to the DLLs from external projects: we keep a copy of the source in TFS if we can get access to it - use your best judgement there. We always keep a copy of the DLL in source control. The DLL goes into a "SharedBinaries" folder next to the source code so that they can be branched together.
It is critical that you be able to branch these DLLs along with the source. It is also critical that they be in TFS so that you can do an automated build with little or no build machine configuration. My own personal goal while managing TFS is to be ready for a new developer to join the team and, with a single get of the source code, be able to execute a successful build for local debugging.
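A sketch of that layout (folder names are just illustrative):

    $/TeamProject/ProductX
        Main
            Source
            SharedBinaries
        Release-2.1          (branched from Main - source and DLLs travel together)
            Source
            SharedBinaries

Because the binaries live beside the source, branching Main for a release automatically captures the exact DLL versions that release was built and debugged against.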
EDIT: Different department builds DLL
Like all good IT answers, I have to start with "it depends". If the other department is truly segregated from yours, you have little or no knowledge of what they are working on or when they will be working on it, and they just occasionally tell you that they have done some things you should now incorporate, then I would lean towards committing the DLL to the repository every time the consuming department wants to pick up a change.
If, on the other hand, we are really just talking about different teams in the same department, where there is lots of cross-talk and water-cooler communication, then I would expect you could make something work with just project references.
It sounds to me like it is the former and not the latter situation that you find yourself in today. I would try to get the department that creates the shared code to "release" it the way Microsoft releases the .NET Framework: have them build the API and give you some DLLs and some documentation. Then the groups that incorporate those DLLs into their products can check them in separately, into a repository under their own control, and isolate themselves from code churn while the department working on these reused DLLs works on the next version.
You should take this all with a grain of salt. This is just one guy rambling on about what might be a good idea. There are many more ways to solve these problems, and they are all correct given different circumstances. If you asked 5 people and got 5 different responses, I wouldn't be surprised.
We currently have a local TFS server here in our company, and we are about to make a subset of our projects open source (via Codeplex), but we are having problems mixing two Team Foundation Servers in the same solution. It looks like Visual Studio can't be connected to multiple TFS servers at the same time. What's the best way to deal with that?
Solution 1: Bind open source projects to Codeplex only and proprietary projects to local only, binding and unbinding projects depending on where you are connected --> VS doesn't seem to like the idea. Projects lose bindings and start to behave strangely.
Solution 2: Bind all to local and use another solution for the open source subset --> The Team Explorer workspace manager prevents you from using overlapping local folder trees, even on different servers, so it is not an option.
Solution 3: Bind all to local using TFS and use another source control system, like SVN, for the open source subset --> It looks like it will get messy quickly, but we don't have a lot of options.
Has anyone with open source projects faced a problem like this?
I would stick to one single authoritative repository or you'll end up in version hell at some point.
If you intend to have external developers contributing code on the Codeplex side, you will need to merge your changes with theirs and also integrate their changes into your own internal TFS server.
It's safer to have one single authoritative repository and just create snapshots for milestone releases on the other.
You could do your fine-grained check-ins and modifications in your internal repository and periodically integrate/merge them into the Codeplex code tree. However, what works on one codebase may not work so well on the other after integrating, so the sooner you integrate changes the better (don't work on your own isolated branch too long).
We're making the switch from SourceGear Vault to TortoiseSVN with VisualSVN for Visual Studio integration - absolutely love it. However, there are multiple class libraries that we reference in multiple different applications that aren't a part of the working copy root in any of the applications. What's the best way to deal with this so that we can continue to utilize Visual Studio integration, but still keep various class libraries located outside of each project/application's root? SourceGear doesn't have an issue with this.
It is possible to add class libraries separately just using TortoiseSVN in Explorer, but there's no ability to commit changes to anything outside of the working copy from within Visual Studio; nor are there the VisualSVN "traffic lights" indicating status for these outside-of-working-copy class libraries.
By the way, we're also going with the "one repository with many projects" route as opposed to multiple repositories, especially as that is how we have worked for years to this point.
UPDATE:
I re-read some things that I had looked at before and discovered that svn:externals isn't just for referencing code in different repositories; it can also be used to set up multiple working copies for use with VisualSVN.
See http://www.visualsvn.com/support/topic/00007/ and http://svnbook.red-bean.com/en/1.2/svn.advanced.externals.html
However, is this the best way to deal with this issue? There's a good thread that goes through the options, but it doesn't completely resolve the question.
Therefore, use svn:externals or not? Use multiple repositories or not? Again, for years we have referenced the code in shared class libraries amongst multiple solutions/applications and this works for us. Now how best to make this work with VisualSVN?
Found the best answers here:
Referenced Projects
Sometimes it is useful to construct a working copy that is made out of a number of different checkouts. For example, you may want different subdirectories to come from different locations in a repository, or perhaps from different repositories altogether. If you want every user to have the same layout, you can define the svn:externals properties.
And here:
Include a common sub-project
Sometimes you will want to include another project within your working copy, perhaps some library code. You don't want to make a duplicate of this code in your repository because then you would lose connection with the original (and maintained) code. Or maybe you have several projects which share core code. There are at least 3 ways of dealing with this.
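For the "common sub-project" case, a minimal svn:externals sketch (the repository URL and folder names are invented) looks like this, run from the root of the application's working copy:

    svn propset svn:externals "SharedLibraries https://svn.example.com/repos/common/trunk/SharedLibraries" .
    svn commit -m "Reference shared class libraries via svn:externals"
    svn update

After the update, the shared code appears as a nested working copy under the application folder, so everyone who checks out the application gets the same layout.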
I understand it's been more than ten years since you asked this question, but I am glad to tell you that there was progress in implementing support for multiple working copies in the VisualSVN plug-in.
VisualSVN 7.1 and 6.5 support multiple working copies within a single solution. The new functionality is available to Visual Studio 2019 and 2017 users.
Download the latest VisualSVN builds from the main download page. Please also see the article KB7: Using Multiple Working Copies in VisualSVN.