DBGhost with TFS - visual-studio-2010

We are currently using DBGhost with our SQL server database. The current version control tool is VSS along with which we are using DBGhost. Now, we want to migrate to TFS and have some questions about using DBGhost with TFS.
Can DBGhost work with TFS?
If yes, what kind of output does it expect from TFS?
a. Just the checked-in database files (table and stored procedure .sql files, etc.), the same as with VSS?
b. Or does it need TFS to run a build after the files are checked in, with DBGhost then looking at the results of that build?
What I mean to ask is: to use DB Ghost with TFS, is it okay to put all the code from VSS into TFS as-is, or should it first be converted to a Visual Studio database solution using SSDT, checked into TFS, with builds then defined in TFS that run after every check-in?

DB Ghost works fine with TFS and we have many customers using it. In essence, all you have to do is a command-line pull from the TFS repository to a local folder and then point DB Ghost at that set of folders; the rest is just the same.
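A minimal sketch of that command-line pull, assuming tf.exe (installed with Team Explorer) is on the PATH. The collection URL and server path below are placeholders, and the workspace must already be mapped to the local folder DB Ghost reads from:

```shell
#!/bin/sh
# Sketch only: compose the "tf get" pull that refreshes DB Ghost's input folder.
# All names below are placeholders -- substitute your own.
COLLECTION="http://tfs.example.com:8080/tfs/DefaultCollection"
SERVER_PATH='$/MyDatabase/Scripts'   # folder tree holding the .sql files

# /recursive pulls the whole tree; /force overwrites stale local copies.
# The mapped local folder is whatever DB Ghost is already pointed at.
TF_GET="tf get \"$SERVER_PATH\" /recursive /force /collection:$COLLECTION"
echo "$TF_GET"
```

After the pull, DB Ghost is pointed at the mapped local folder exactly as it was with VSS; nothing else in the DB Ghost configuration needs to change.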

How to force TFS server to take all of my local files?

I have a couple of ASP.NET Core based projects being developed using Visual Studio 2019.
I am having issues where my workspace and the TFS server on Azure DevOps are out of sync. My PC contains the most recent code, and I want to push everything I have to the server. I don't really care about the state of the TFS server, as it is wrong; I just want to force everything to be pushed so that my PC and TFS are in sync again.
How can I force the TFS on Azure DevOps to take all my files? I wouldn't even mind removing the project from Azure DevOps altogether and pushing all files as if this were a new project.
According to your description, it sounds like there is something wrong with your source control binding, or perhaps changes made to files outside of Visual Studio are not being detected by the TFS server, which has caused your workspace and the TFS server to fall out of sync.
If you want the TFS server to detect changes made to files outside of Visual Studio, the simplest way is to use a local workspace.
Then, whenever anything changes files outside Visual Studio, your workspace detects the changes automatically.
It also detects adds and deletes, but you have to include them in your Pending Changes manually with the link under `Excluded Changes`.
If you are using a server workspace, it is a bit like being offline: you cannot work with your local files, because they are read-only until you check them out. So I highly recommend you switch to a local workspace; just make sure you open the files in VS from the same path as your TFS local workspace. Changes will then sync automatically in Visual Studio and show up in Pending Changes.
For more detailed information on the pros and cons of local and server workspaces, please refer to the official documentation.
In your situation, we suggest you first back up all of your local code/files. Then delete your old workspace and create a new local workspace.
Get latest from the server, then copy your backup into the workspace folder, letting the file system replace the files downloaded from the server with your backed-up local versions.
Your local workspace should now contain the latest version of your code/files; Visual Studio will detect the changes automatically and list them in Pending Changes. If anything lands in the Excluded list, promote it manually.
Finally, check in/push all pending changes to the TFS server. Now everything is back on track again.
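The same recovery sequence can be sketched as tf.exe commands (a sketch only; the workspace name and collection URL are placeholders, exact switches vary slightly between TFS versions, and the commands are printed here rather than executed):

```shell
#!/bin/sh
# Sketch of the backup / recreate / check-in sequence described above.
# Placeholder names throughout; run from a Developer Command Prompt.
COLLECTION="https://dev.azure.com/yourorg"
WS="MyLocalWorkspace"

# 1) delete the old workspace (after backing up!), 2) create a *local* workspace,
# 3) get latest and copy your backup over it, 4) check in the resulting edits.
STEPS="tf workspace /delete $WS /collection:$COLLECTION
tf workspace /new $WS /location:local /collection:$COLLECTION
tf get /recursive
tf checkin /recursive /comment:\"Force local state onto the server\""
printf '%s\n' "$STEPS"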
Hope this helps.

Is there any method to perform check-in and check-out operations of TFS code repository from Jenkins build?

My ultimate goal of this exercise is to update TFS code repository from the contents (i.e., files and folders) which are getting copied from another source.
Following is the exact scenario in my project:
There exists a code repository in VSTS Online
Have setup Jenkins on my local computer.
Configured Jenkins to create a workspace for TFS code.
Written powershell scripts in the build step to copy files from another source to the workspace folder configured for TFS.
Up to this point everything is working fine. In the next step, I want to update the TFS repository with whatever is in the workspace.
Any idea of how I can achieve this?
Thanks,
Nirman
You can try to use the tf command with a script during the build, just as Stefan suggested.
If you want to use Windows Explorer to manage TFS version control files, you can also try the Microsoft Visual Studio Team Foundation Server 2015 Power Tools, which allow you to check in and check out through folders.
You need to first create a workspace folder and put your project in it.
Then use the Windows Shell Extensions that come with the TFS 2015 Power Tools to check in/check out files in the local workspace folder.
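As a rough sketch, the tf-based route could be a Jenkins build step like the following (a hypothetical sequence, printed rather than executed: tf.exe must be on the agent's PATH and the folder must already be a mapped workspace):

```shell
#!/bin/sh
# Hypothetical tf.exe sequence for pushing externally copied files back to TFS.
# Check out existing files *before* the PowerShell copy overwrites them,
# pick up any brand-new files with "tf add", then check everything in.
STEPS='tf checkout . /recursive
tf add . /recursive /noprompt
tf checkin /recursive /comment:"Sync from external source" /noprompt'
printf '%s\n' "$STEPS"
```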

Not able to migrate DefaultCollection from TFS 2010 to TFS 2012

I am trying to migrate projects from TFS 2010 to TFS 2012. I am following the steps given below:
Detach Team Project Collection
Back up the collection database and restore it in target machine.
Some of the projects are in the DefaultCollection collection. When I try to attach the collection in Team Foundation Server, I do not get the option to restore DefaultCollection, and I am not able to connect to this collection from Visual Studio.
I am not sure what I am doing wrong here. Have I missed a step during the migration? Any help would be greatly appreciated.
Have you configured the application tier on the target server using the Team Foundation Server Administration Console?
If that install/configuration left you with an empty DefaultCollection already in place, you need to either delete it or rename your original collection to avoid a naming collision.
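For reference, the attach step can also be driven from the command line with TFSConfig on the application tier (a sketch, printed rather than executed; the SQL instance and collection database names are placeholders, and switch names vary slightly between TFS versions):

```shell
#!/bin/sh
# Sketch: attach a restored collection database with TFSConfig.
# "SQLSERVER\Instance" and "Tfs_DefaultCollection" are placeholders.
STEPS='TFSConfig Collection /attach /collectiondb:"SQLSERVER\Instance;Tfs_DefaultCollection"'
printf '%s\n' "$STEPS"
```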

How can I manage SQL Server jobs using a Database project

Does Visual Studio 2010 support managing SQL Server jobs in the Database Project?
I am working with a database project in Visual Studio 2010. I would like to manage my database scheduler jobs in the database project, but it seems that I cannot create any server-level objects there.
What we do at my company is:
Script out your jobs to be re-runnable (either drop/create or skip if exists)
Place the scripts in your Post-Deployment folder (and include the reference in your Script.PostDeployment.sql file as necessary)
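For example, a skip-if-exists job script of the kind described in step 1 might look like the file generated below (the job name is a placeholder, and a real script would follow sp_add_job with sp_add_jobstep, sp_add_jobschedule, and sp_add_jobserver calls):

```shell
#!/bin/sh
# Write a re-runnable (skip-if-exists) SQL Agent job script suitable for the
# Post-Deployment folder. The job name is a placeholder.
cat > CreateNightlyJob.sql <<'SQL'
IF NOT EXISTS (SELECT 1 FROM msdb.dbo.sysjobs WHERE name = N'NightlyMaintenance')
BEGIN
    EXEC msdb.dbo.sp_add_job @job_name = N'NightlyMaintenance';
    -- sp_add_jobstep / sp_add_jobschedule / sp_add_jobserver calls go here
END
SQL
echo "wrote CreateNightlyJob.sql"
```

The generated file is then referenced from Script.PostDeployment.sql with a `:r` include so it runs on every deployment.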
No, you won't be able to do that. If you want to use Visual Studio to manage database projects you can use the database and server projects (what used to be called Data Dude).
You might also want to take a look at Red Gate SQL Connect. It works with databases and source control through Visual Studio.

Database in version control using Visual Studio 2010 Professional

I've added a SQL Server 2008 database project to my Visual Studio 2010 Professional Edition solution in the hope that it might allow me to include my database in version control.
I can commit the schema files for each database object into version control; however, these schema files all script objects as create rather than alter, so they are not good for colleagues who get my changes and need to update their databases.
Is this a good way to get my database into source control?
And what would the workflow be for actually using it to update databases to a given revision without losing all the data associated with dropping and re-creating all the tables?
Update: on Premium and Ultimate versions, there is a schema compare tool which makes this easy. This does not exist on Professional. Is there any straightforward manual workaround?
I'm not sure if you can do this in VS 2010 Professional, but in VS 2010 Premium, you can do a schema comparison (Data -> Schema Compare -> New Schema Comparison) between your project and database, and update changes in either direction.
When going from project to database, VS generates a script that copies existing data into a temporary table before dropping the existing one.
The database project has a Deploy step (which is present in my Professional copy of VS 2010) that will generate a SQL script containing your SQL objects.
The key thing here is that if you right-click the project, open Properties, go to Deploy, and change the target database settings to point at a specific database, then when you deploy it will generate a change script for that specific database so that it matches the objects in the project (and, in theory, keeps existing data).
You can have it either generate a SQL script or update the database directly. Generating a script is probably the better idea :)
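The same deploy step can also be run outside Visual Studio with VSDBCMD.EXE, which ships with the VS 2010 database tooling (a sketch, printed rather than executed; the model file name, connection string, and target database are placeholders):

```shell
#!/bin/sh
# Sketch: deploy a VS2010 database project model from the command line.
# /dd:- writes the change script only; /dd:+ would also apply it to the database.
STEPS='vsdbcmd /a:Deploy /dd:- /model:MyDatabase.dbschema /cs:"Data Source=.;Integrated Security=True" /p:TargetDatabase=MyDatabase'
printf '%s\n' "$STEPS"
```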
