TFS 2010 version control

We want to start using TFS version control on our project. I read the tutorial and noticed that TFS creates tables in the SQL database. My questions are:
What are these tables for?
Where is the VS solution actually stored?
How can more than one developer use an instance of our solution from another computer?

TFS stores pretty much all of its data in a few SQL databases: source control, work items, build definitions, build results, etc.
Everything lives in the SQL database for the Team Project Collection; the specifics of which tables are used should not matter to you. Each user sets up a workspace, which maps the directory structure in source control to a place on their local disk.
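A workspace mapping can also be created from the command line with tf.exe; here is a minimal sketch, where the collection URL, server path and local path are hypothetical examples:

```powershell
# Create a workspace on this machine and map a server folder to a local folder.
# The collection URL, server path and local path are hypothetical examples.
tf workspace /new MyWorkspace /collection:http://tfsserver:8080/tfs/DefaultCollection
tf workfold /map '$/MyProject' 'C:\src\MyProject' /workspace:MyWorkspace

# Download the latest version of the mapped sources.
Set-Location 'C:\src\MyProject'
tf get
```

Each developer creates a workspace like this on their own machine.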
I'm not sure what you're asking here; can you clarify your question?

Related

How to migrate work items from TFS to Visual Studio Team Services

My team currently works with an on-premises TFS 2012 server. I am migrating everything to Visual Studio Team Services, formerly Visual Studio Online. I am starting with a test project and was able to easily get all the code migrated, but can't figure out how to do the same for the work items.
Are there any good guides out there?
New options as of March 29th 2018:
TFS to VSTS migration - The official import option, which will import one project collection into one VSTS account. It automatically imports everything stored in the backup. At the time of writing, the TFS server must be upgraded to TFS 2018 and some work item template customizations must be removed (there are a few well-documented features unavailable on VSTS).
VSTS Sync Migrator - Martin Hinshelwood, the uncrowned king of TFS and VSTS migrations, has built his own little tool that can migrate work items from one server/account to another. It can even migrate from one Team Project to another, switching between process templates as it goes.
VSTS Work Item Migrator - Microsoft has also open sourced a project that they used internally to migrate work items. It's less powerful, but it was made by Microsoft.
Previous answer:
At the moment there isn't a really good story. Your options are:
Start over - easiest :).
Start over and manually recreate items of value - It's a pain, but it's what some teams have done in the past. Keep the old TFS server available in read-only mode, and each time you use a work item from the old system, manually create it in the new one, set all the fields and upload the attachments. Depending on the number of items, it'll take you a few sprints to migrate the most important stuff over.
Wait a while longer - Microsoft is currently working on a full fidelity import option which will allow you to upload a Project Collection and it will be exposed as a new VSTS Account (it's not going to be possible to import a project collection into an existing account).
Use Excel for import/export - Will work for most work items, but you lose attachments and all work item links other than parent/child. The trick is to extract from one Project Collection, then copy all fields except the ID to an Excel sheet bound to the target project collection. You will need to fix all identity fields (works best when users have the exact same display name on-premises as in VSTS), and you'll have to import once with state New, then paste the current state/reason over the just-imported values and sync again. Test Cases, Plans, Suites and Shared Steps will not be imported with their relations intact. The approach would be very similar to this one.
Use the TFS Integration Tools - Will work for most work item types, though it will lose custom kanban states and tags. Test Cases, Shared Steps and their relations will not be imported. This option will allow you to import work items and source code with their relationships intact.
Use a 3rd party solution - Of the currently available options, OpsHub offers the most complete solution. For test case and source control link migration you're looking at the commercial edition, which comes at a steep price. It still has a long list of known issues, and the last time I tried it I ran into numerous issues which required their support to resolve.
There are specialized TFS consultants who live off these kinds of migrations. If the current state of your work items is precious to you, you could reach out to them.
See also:
https://www.visualstudio.com/en-us/articles/adopting-vsts

How can I perform a data compare on a VS 2013 SSDT project programmatically?

Visual Studio 2013 has a feature that allows for performing a data compare between your SSDT project and a target database.
According to another post here on SO, there are certain requirements with regards to performing such a compare.
Those requirements taken into consideration, I want to do something like this as a part of our build and deployment process:
Publish any DB schema changes to the target database(s) to make sure that source and target have exactly the same tables, columns, SPs, etc., to comply with the requirements mentioned in the link above
Run a data compare and generate an update script, or publish any changes in the source DB directly to the target DB
Currently, I have a script which takes care of bullet no. 1 by doing a schema compare, using a DACPAC, via sqlpackage.exe. It does not look like it is possible to perform a data compare using sqlpackage, though, and I have not found any other alternatives yet. In VS 2010 it was possible to run a data compare via the command window, but I have not seen any documentation regarding this in VS 2013...
Thus, my question is whether there exists an API and/or other tools that allow a data compare to be run programmatically, e.g. through a PowerShell script.
It appears you are correct: for schema diff there is command-line support as long as SSDT is installed on disk (more details here), but there is no programmatic interface yet for data compare and update.
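For the schema side (bullet no. 1), the sqlpackage.exe step described in the question might look roughly like the sketch below; the server, database and file names are hypothetical examples:

```powershell
# Publish schema changes from a DACPAC to a target database so that source
# and target end up with the same tables, columns, SPs, etc.
# Server, database and file path are hypothetical examples.
& SqlPackage.exe /Action:Publish `
    "/SourceFile:C:\build\MyDatabase.dacpac" `
    "/TargetServerName:targetserver" `
    "/TargetDatabaseName:MyDatabase"

# /Action:Script would generate the update script without applying it,
# but there is no equivalent action for a data compare.
```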

Is it possible to update a SSDT DB project from a database?

We have two software projects that both communicate with a single database. Right now SQL updates are all done on the database and it's relying on developers to make sure to update both sets of projects independently to use the latest database model. Making these matters worse both projects are in separate solutions in separate source control repositories.
While I acknowledge this is a terrible situation to be in, I inherited it. My long-term goal is to consolidate the (lots of) duplicated logic into one common project shared by both applications, but for various reasons it is not feasible to jump right into that now: critical deadlines are coming up, and the consolidation needs to happen iteratively and be scheduled with other developers so as not to disrupt work too much.
Keeping that in mind, I really want to use SSDT to at least start bringing the database structure under source control and make it easier to manage, as there are quite a few database changes that I'm about to do.
The problem with SSDT in this scenario is that you can only import from a database once. After that the option is greyed out and unavailable, which is apparently a design decision of SSDT, since it's explicitly listed in the MSDN documentation.
Is there any easy way to update my SSDT project without nuking the current project and recreating it each time someone makes a change to the database structure?
Firstly, you're right, it is a horrible situation, so work on improving it in the long term!
There are two things you can do. First, you could use the SSMS "Generate Scripts" feature to export all the objects and then use the import-from-scripts option in SSDT - that one isn't greyed out.
The second thing you can do is bring the changes in manually using the schema compare in SSDT: set the database as the source and the project as the destination, and choose what to drop, update and import.
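If you want a scripted starting point for that comparison, one option (a sketch; the server, database and output path are hypothetical) is to extract the live database into a .dacpac, which you can then diff against the project:

```powershell
# Capture the current state of the live database as a .dacpac.
# Server, database and output path are hypothetical examples.
& SqlPackage.exe /Action:Extract `
    "/SourceServerName:devserver" `
    "/SourceDatabaseName:SharedDb" `
    "/TargetFile:C:\temp\SharedDb.dacpac"
```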
Ed
A late answer: I am using a VS 2017 database project, and I achieved this by comparing a local database with the database project; once the comparison is done you can apply the changes with the Update button.
Step 1: Right-click on the database project and choose the Schema Compare item.
Step 2: Select Target -> Select Database Connection.
Step 3: Swap the source and target.
I am going with the compare solution:
Choose schema compare, make your database the source and the database project the target, then compare and update.
See this answer.
Make a new temp database project (outside of TFS) and import all the objects.
Check out the database project (inside TFS), copy and paste all the folders (excluding the bin and obj folders) from the new temp database project into the database project in TFS, and check in. This way you get the latest DB objects into TFS without duplicating them.
If the copy/paste operation brings in new files, they should be included in the DB project.
Delete the temp database project folder.
You will need to repeat this process whenever you want to update all DB objects in TFS.
This is a workaround which worked for me for this file-duplicating issue.
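For what it's worth, the copy step can be scripted; a minimal sketch, assuming hypothetical paths for the temp and TFS-tracked projects:

```powershell
# Copy every folder except bin/obj from the temp project into the
# TFS-tracked project. Both paths are hypothetical examples.
$source = 'C:\Temp\TempDbProject'
$target = 'C:\Tfs\DbProject'

Get-ChildItem -Path $source -Directory |
    Where-Object { $_.Name -notin @('bin', 'obj') } |
    ForEach-Object { Copy-Item -Path $_.FullName -Destination $target -Recurse -Force }
```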

Do VS Database projects and Entity Framework work together?

I've always been intrigued by Visual Studio Database Projects, and while they seem to be quite capable, I've never used them to any great degree outside of simplistic proof-of-concept work. I want to try this for a new project, and I'm also interested in using an EF layer on top of it, but in past test projects this has involved some decent effort.
I'm curious: has Visual Studio matured its product integration to support a single workflow that builds the database project, builds the EF layer on top of it, and finally builds the code, without intermediate steps involved?
We are a small team and we don't have dedicated SQL developers. Our primary goal is to bring the database into Visual Studio, get it nicely under source control (TFS), and achieve strong integration from end to end. We're interested in growing into EF, and will probably start simple by treating it like a plain ORM tool to begin with, if possible.
Has anyone actually done this that can provide insight into the process?
We have used VS 2014; the tooling seems much the same as in earlier versions, and I don't think there have been many changes over the years.
We have an EDMX model and a DB project in the same solution.
It does mean that you need to keep the DB project up to date, but this is easy to do: you publish your EDMX to a local box/target, then import the changes into the project with a schema compare of the local database against the project.
That way you can still have model-driven DB design, use the DB project to deploy changes to the Dev/Stage/Live boxes, and publish with automated deployments as well.
The DB project has a post-deployment scripts option which you can use to seed data, and also a pre-deployment script where you can manipulate data if you need to change the structure or types of fields while data is on a live DB.
The schema compare tools in Visual Studio are rather good: they can compare DB to DB, DB to project, or a schema file to either.
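For the automated deployment part, publishing the DB project's build output can be scripted as well; a sketch, where the .dacpac path and publish profile name are hypothetical:

```powershell
# Publish the DB project's compiled .dacpac to an environment using a
# publish profile. Paths and profile name are hypothetical examples.
& SqlPackage.exe /Action:Publish `
    "/SourceFile:.\bin\Release\MyDatabase.dacpac" `
    "/Profile:.\Profiles\Dev.publish.xml"
```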

Managing database scripts in your solutions

I usually create a solution folder in Visual Studio and put my DB scripts in them. I always use at least this set of scripts:
Drop model
Create model script
User functions
Stored procedures
Static data (lookup tables)
Test data (not deployed)
Then I simply combine these scripts into a single one and execute it against SQL Server, so I'm able to recreate the whole DB in a single step.
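That combine-and-run step is easy to script; a minimal sketch with hypothetical file, server and database names (the test data script is left out because it isn't deployed):

```powershell
# Concatenate the scripts in dependency order and execute the result.
# File names, server and database are hypothetical examples.
$scripts = @(
    'drop-model.sql',
    'create-model.sql',
    'user-functions.sql',
    'stored-procedures.sql',
    'static-data.sql'
)

Get-Content -Path $scripts | Set-Content -Path 'combined.sql'
sqlcmd -S 'localhost' -d 'MyDatabase' -i 'combined.sql'
```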
Anyway. I've never used projects in either:
Visual Studio or
SQL Management Studio
I've tried creating a SQL Server 2008 Database Project in Visual Studio 2010, but I'm somewhat overwhelmed by all the possible server settings (which I prefer to leave at the server defaults anyway). So I'm a bit confused: should I use this project template, or should I just keep doing what I've always done?
What do you use and why? What are advantages I may benefit from by using either?
If I were you I would continue to do it the way you are doing it. In fact, I do! In my opinion, the advantages of having the actual .sql files right there in a folder for you to use/edit/look at far outweigh the advantages you get from a DB project. A DB project would be useful for something like storage reports, where you have to communicate with eight databases, compare them to eight different databases, save result sets, etc. Now don't get me wrong, there are advantages to database projects; I just don't think they actually help much when you have such a simple setup that already works.
Advantages of the SQL Server 2008 Database Project in VS 2010:
Not having to switch back and forth from the client you currently use to communicate with your SQL server.
Decent data and schema compare tools.
A one-click way to reverse engineer a database into source control and keep it up to date.
You can compare projects to physical databases and vice versa. (This makes it pretty easy to keep your database up to date, no matter where you make the change: in the file-system database project or in the physical database itself.)
If the tool you currently use is not specifically tailored to SQL Server, this one is.
Extremely helpful if you need to do unit tests directly on the database without using abstractions.
If you're looking for something a little less complicated, you might want to try SQL Source Control. This won't even require you to maintain scripts, as it does that for you behind the scenes. It will, however, only work as a solution for you if you use either TFS or SVN. And it costs $295...
It has a 28-day trial period, so if you're happy to try it out, I'd be interested in your feedback.
