Scenario:
In my organisation, we develop multiple applications separately. In the end, however, many of the applications (and their databases) are deployed to the same SQL Server instance, so they share the same master database.
We use Visual Studio 2010 database and server projects to source control said databases.
To try to standardise some things, I want to do the following:
Create a 'Core' server project which holds all the server settings, core logins etc.: things like SET TRUSTWORTHY ON, server-level ANSI settings and so on.
Have each application's own Server.dbproj specify the logins, roles etc. specific to that application.
Have each application's own ApplicationDatabase.dbproj reference the ApplicationX.Server.dbproj.
In theory, each application in source control would then only contain the items specific to it, rather than keeping server-related settings or configuration synchronised across many projects.
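As a rough sketch of the intended reference chain (note: the item and element names below are only my recollection of what VS 2010 database/server projects emit for database references; treat them as assumptions rather than the exact .dbproj schema):

    <!-- ApplicationX.Server.dbproj (fragment): pulls in the shared Core schema.
         Element names here are assumptions, not verified against a real project file. -->
    <ItemGroup>
      <ArtifactReference Include="..\Core.Server\sql\debug\Core.Server.dbschema">
        <HintPath>..\Core.Server\sql\debug\Core.Server.dbschema</HintPath>
      </ArtifactReference>
    </ItemGroup>

    <!-- ApplicationDatabase.dbproj (fragment): references only the application's own
         Server project, which should carry the Core settings along with it. -->
    <ItemGroup>
      <ProjectReference Include="..\ApplicationX.Server\ApplicationX.Server.dbproj">
        <Name>ApplicationX.Server</Name>
      </ProjectReference>
    </ItemGroup>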
Problem
However, in practice I can only get this far:
Done. Building Core produces a .dbschema file which I reference in the later steps.
Done. Server.dbproj happily references Core.dbschema and 'extends' it with its own logins, roles etc., and it is happy to deploy anywhere I point it.
Nada. I add a reference from ApplicationDatabase.dbproj to Server.dbproj (assuming Server would pull in the items from Core) and it complains about any logins that are actually defined in Core.
So I then added both Server and Core as references to ApplicationDatabase, which settled it down. It compiles fine.
However, when you deploy, you get the same problem described here: http://social.msdn.microsoft.com/Forums/uk/vstsdb/thread/23cb9132-00d4-42ed-b34c-ab49027cddf7
Error TSD01234: The source model contains 2 server option elements.
Only one element can be contained in a model that can be deployed
The problem, I think, is that ApplicationDatabase essentially knows about two Server projects, and therefore has duplicate server settings.
Microsoft's documentation makes no mention of using partial projects in Server projects, but neither is it listed as a limitation.
So the question is...
Has anybody used partial projects successfully for Server projects, or is there a way you can see to achieve the same thing?
I'll be honest and say I won't just 'remove the Server projects' to make the problem disappear - we had it working very well up until I tried to improve things!
Related
I have a few (3) core projects I want to share across many solutions (12+).
So, say I have 12 websites and they use some shared back end core code (in this case I'm not talking about shared js, css or views - I'm talking about business objects, entity stuff, etc.).
I need to be able to identify which site has which version of the shared code in dev, test, prod, etc. so a developer can get the website code and get the right version of the shared code to develop or patch the website.
And then the MS build server needs to know which version of the shared code to get for the deployment.
To solve this, I'm seeing people branch that core code - which seems absurd to do 12+ times. (I do expect to branch the core code sometimes for things like hot fixes and long running projects.)
I'm also seeing people copy DLLs of the core code and check those in.
I would think I would list the dependencies for my solutions somewhere, based on TFS label names, so that developers can easily get an app running with the right code, and so that, given a TFS label, the build server can get the code for the website plus the proper version of the core code. I'm using TFS and VS 2013 at the moment too, so there's that.
So, is there a way to do this that's straightforward, supportable/scale-able and intuitive? Thanks - Peter
Labels in TFS are very limited. For example, once a label is created you can't change or update it. If one of your core projects is updated, you need to create a new label for it and point the relevant solution at that new label. If you then find bugs in that update, you need a newer version of the core project to fix them, which means yet another label, and you end up maintaining the dependencies by hand, which is not an easy job.
Moreover, how would you list the dependencies for your solutions based on TFS label names? TFS has no built-in option for this; it seems the only way is to store them in a text file (or some other file) checked into source control. Every time a developer opens a website application, they would need to check that file first, get the labelled version from the server into their workspace, and then work on it.
Usually the purpose of sharing code between projects is to reduce maintenance. There are two main code-sharing paths: source and binary. For the difference between them, take a look at this blog post: Code Sharing in Team Foundation Server.
Sharing code between products is a primary cause of quality erosion and elevated bug counts. I would recommend building the core projects separately and sharing the binary output through NuGet, which is the preferable approach.
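For example, with binary sharing each website would simply pin the exact version of the shared packages it was built against in its packages.config (the package ids and versions below are made up for illustration):

    <?xml version="1.0" encoding="utf-8"?>
    <!-- packages.config for one of the 12 websites: records the exact version of the
         shared core binaries this site was built and tested against.
         Package ids and versions are illustrative placeholders. -->
    <packages>
      <package id="MyCompany.Core.BusinessObjects" version="2.3.1" targetFramework="net45" />
      <package id="MyCompany.Core.Data" version="2.3.1" targetFramework="net45" />
    </packages>

The version checked in with each site then serves as the dependency record you were hoping to keep with labels, and the build server restores exactly those versions during the build.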
Also take a look at these similar questions:
Sharing code between solutions in TFS
TFS 2010 Branch Across Team Projects - Best Practices
We have a website application that stores data and pictures for a specific customer. We are about to release the same application for use by another customer. The second application will eventually be customized for the second customer. Eventually we hope to have several customers using their own versions of the application.
We are using ASP.NET in Visual Studio 2012. Should we:
clone the existing application and maintain separate code bases?
add a project to the existing solution for the new customer?
We have searched for an answer, but this seems to be a rare situation.
Thanks.
I don't think it's rare at all. SAP and Maximo use this as a business model: same core, but each package customized to the client's specifications. I have done this (on a much, much smaller scale) with some of the programs that we have.
We always start a new project rather than just copying the old one. There's no telling what is in the old one that references the old client. It's sort of embarrassing when an About window you forgot about is for someone else's company.
All the code, forms and reports that are customizable should be in the project for that customer. All of the code, forms and reports that are standard should be in a library.
It really depends on the scope of the application. I've had to do this internally with the company I'm working for; I wrote one solution for one company, then the sister company found out, wanted the same thing, and I had to implement it there.
I had a fairly small project to work on, so it was easy to make it universal (while also keeping things rooted in the same code base). All I did was:
break out the unique settings [page title?] using appSettings or similar.
add a new configuration to your solution, then take advantage of the *.config transformations (there's a sketch after this list) to:
set connectionStrings
specify appSettings values
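A minimal sketch of what such a transformation might look like, assuming a per-customer build configuration called "CustomerB" (the keys, names and values are placeholders, not taken from the original setup):

    <?xml version="1.0"?>
    <!-- web.CustomerB.config: applied on top of web.config when building/publishing the
         hypothetical "CustomerB" configuration. Keys and connection names are placeholders. -->
    <configuration xmlns:xdt="http://schemas.microsoft.com/XML-Document-Transform">
      <appSettings>
        <!-- Override the unique, per-customer setting (e.g. the page title). -->
        <add key="PageTitle" value="Customer B Portal"
             xdt:Transform="SetAttributes" xdt:Locator="Match(key)" />
      </appSettings>
      <connectionStrings>
        <!-- Point this customer's build at their own database. -->
        <add name="DefaultConnection"
             connectionString="Server=customerb-sql;Database=CustomerB;Integrated Security=True"
             xdt:Transform="SetAttributes" xdt:Locator="Match(name)" />
      </connectionStrings>
    </configuration>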
When it comes to unique business logic, I had the luxury of handling it with the *.config transformations as well (most of the data I gathered came from WCF endpoints of services local to the company), so I lucked out. However, you could also define generic interfaces within the app and then break out the implementation for each company into separate libraries.
Here's the scenario: I have multiple developers on an asp.net mvc 4 project. Each developer has a local database. The source control system is TFS at http://tfs.visualstudio.com. We're using Azure websites to host the site. We have also configured Azure websites for continuous deployment.
The source control system could be git, mercurial, TFS, etc. Honestly, I don't think it matters.
My question is how to accomplish these four things:
1. Each developer has his/her own connection string(s) locally (without them being in source control)
2. Azure has its own connection string(s) (without them being in source control)
3. Source control doesn't show any connection information
4. Each developer can hit F5 and run/debug/test the app locally
We accomplished #1 by adding our individual connection strings to our machine.config so that there's no conflict between developer workstation setups.
I originally removed the connectionStrings section from web.config and configured the connection strings in the Azure website instead (using the management portal, under Configure). After watching a Scott Hanselman video I was under the impression that these would be 'dynamagically' merged into my web.config upon deployment, but that doesn't appear to happen: whenever I go to any page that hits the db, I get an error saying the connection string can't be found (or some other db error related to the connection).
If I put the Azure connection string directly in web.config, things work on Azure, but then the connection details are in source control, visible to everybody.
After reading some more posts from Scott and David Ebbo, it seems that I could put a blank connection string in web.config (with the correct name) and Azure would then overwrite the values correctly. The developers would then have to put their own connection strings in their web.debug.config and install the Slow Cheetah plugin so that they could F5 and test locally. They would also have to avoid checking web.debug.config into source control (not that easy with TFS). This seems like a seriously burdensome kludge that's bound to fail somewhere along the line.
I have to believe that this isn't that rare of a problem. How do other teams accomplish this?
After looking around, it appears that what I was asking for isn't actually supported without a bunch of command-line hacks in the pre-/post-build process. What we ended up doing is having every developer create their own local database, use trusted authentication, and set up a common SQL alias that the checked-in web.config refers to. That way it works locally for everybody, it doesn't expose any user names/passwords in source control, and Azure can still overwrite it when the site is automatically deployed from source control.
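For what it's worth, the checked-in connection string entry then looks roughly like this (the alias "AppDb" and the names below are placeholders): locally the alias resolves to each developer's own instance, and on Azure the value is replaced at runtime by the connection string of the same name configured in the portal.

    <!-- web.config fragment checked into source control: no credentials, just the shared SQL alias.
         "AppDb" (the alias) and "DefaultConnection" are placeholder names. -->
    <connectionStrings>
      <add name="DefaultConnection"
           connectionString="Server=AppDb;Database=MyApp;Integrated Security=True"
           providerName="System.Data.SqlClient" />
    </connectionStrings>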
Slow Cheetah is actually a nice solution. It's an extension to web.config transformations. Those transformations let you keep one web.config file and then specify, for each deployment scenario, the changes you want applied to it. For example, your Release configuration will probably remove the debug attribute.
This can also be used to change connection strings. The transformations are applied during the deployment of your project to Azure.
What I've done in the past to make this also work on local development machines is use a web.config with an externalized connections.config file. Each developer created a connections.machinename.config file that was copied to connections.config in a post-build step. Those files don't have to be checked in, and they can never cause conflicts because each machine name is unique.
The release/staging/... configurations used a web.config transformation to replace the connection string element with a specific connection string for that deployment (thereby removing the dependency on the external config file).
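A minimal sketch of that setup, assuming the file names used above (the configSource attribute is standard .NET configuration; the post-build copy command shown in the comment is just one possible way to do the per-machine copy):

    <!-- web.config: the connectionStrings section is pulled from an external file
         that is not checked into source control on developer machines. -->
    <connectionStrings configSource="connections.config" />

    <!-- connections.MYMACHINE.config: one per developer, named after the machine, and
         copied to connections.config by a post-build event such as
           copy "$(ProjectDir)connections.%COMPUTERNAME%.config" "$(ProjectDir)connections.config"
         (shown purely as an illustration). The external file must contain the whole section. -->
    <connectionStrings>
      <add name="DefaultConnection"
           connectionString="Server=.\SQLEXPRESS;Database=MyApp;Integrated Security=True"
           providerName="System.Data.SqlClient" />
    </connectionStrings>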
Slow Cheetah offers some nice helpers for checking the result of these transformations at design time.
We started programming in a project that uses Agile Work Item Templates. Now, there is some history of the code that we want to keep.
Also, we want to change to a customized CMMI template: close to CMMI, but with slightly different work items, and with some new/removed ones (for testing purposes, we set it up in a different project).
How can we now merge the source (and history) from the one project with the work items from another project?
From my understanding, you could simply export/import the work item types, but then the reports and queries, as well as the dashboard, would not get updated properly either? So all the scenarios we can come up with result in a loss of version history (e.g. simply importing the current state of the source into a newly created project that uses CMMI and then updating the work items).
Is there a better solution?
(using TFS 2010 and VS 2010)
edit: some useful information can be found here: http://blogs.msdn.com/b/willy-peter_schaub/archive/2011/05/17/tfs-integration-tools-where-does-one-start-part-3-dust-has-settled-did-it-work.aspx - like me, you will probably run into trouble with the ProcessBuildTemplates in particular.
Have you considered using the TFS Integration Tools? I'm not sure about migrating work items between Team Projects that use different templates, but I've been able to successfully migrate code with its history between Team Projects.
My company is working on a SharePoint site that we are developing using Visual Studio. The actual installation at the customer is performed by scripts that deploy the produced .wsp files. During normal development I mostly deploy directly from inside Visual Studio. Unfortunately, I often run into problems when trying to deploy my solutions. We are using a server-farm setup, but each developer has their own virtual server, database instance and so on.
We have one project that defines the basic content type used by the different departments. This content type typically defines things like what period a list item covers. Each department has its own project that combines the content type with department-specific fields to form the final list.
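The declarative definitions look roughly like the following (the IDs, names and fields are invented for illustration and are not our real definitions):

    <!-- Base project, Elements.xml: the shared department content type.
         The content type ID, field GUID and names are placeholders. -->
    <Elements xmlns="http://schemas.microsoft.com/sharepoint/">
      <Field ID="{11111111-1111-1111-1111-111111111111}"
             Name="CoveredPeriod" DisplayName="Covered Period" Type="Text" />
      <ContentType ID="0x0100A0B1C2D3E4F50617283940516273849A"
                   Name="DepartmentItemBase" Group="Company Content Types">
        <FieldRefs>
          <FieldRef ID="{11111111-1111-1111-1111-111111111111}" Name="CoveredPeriod" />
        </FieldRefs>
      </ContentType>
    </Elements>

Each department project then declares its own fields and a content type that inherits from this one by appending to its ID.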
One of my current problems is that when I make edits to the content type and deploy it, the changes do not seem to propagate. Even though I rebuild the solution and successfully deploy both the base project and the department project, I still see the old version of the content type's fields when I create a new department list. Sometimes it helps to retract the projects, but often I literally have to restart everything before it works.
My question is whether this problem is caused by Visual Studio not really deploying my new definitions, or whether there is some architectural aspect of SharePoint 2010 that might prevent the change from propagating. What steps can I take to lessen the likelihood of the problem occurring?
Have you tried deleting the content type in Central Administration before doing a new deployment? I've found that SharePoint doesn't update/create content types when it finds an existing one with the same name.