Using different source control providers with different instances of Visual Studio - visual-studio-2010

I'm in a situation where I need to merge multiple branches of a codebase, one from TFS and one held in Git, into Git. But it seems that two instances of VS share a single setting for the source control provider, so, for example, if I want to check the history of a file in TFS prior to merging, I have to manually switch providers under Tools -> Options, do my checking, and switch back.
Having two instances of VS running concurrently is hard enough in terms of remembering what is what, having multiple branches is worse, and having multiple source control bindings is the pits.
Is there a work-around?

You could use git-tf to create a temporary Git working copy of your TFS data and do the merge entirely in Git. If you need to update TFS as well, you can then commit the changes back to TFS using git-tf.
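For illustration, a rough git-tf sketch of that round trip (the collection URL, server path, and remote name are placeholders):
// clone the TFS branch into a local Git repository
git tf clone http://tfsserver:8080/tfs/DefaultCollection $/TeamProject/Main
// fetch the existing Git branch and merge it locally
git remote add other https://example.com/other-repo.git
git fetch other
git merge other/master
// if TFS needs the result too, check the merged history back in (--deep keeps individual commits)
git tf checkin --deep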

Related

Visual Studio creating multiple/additional workspaces

I have been using TFS since VS2010. By that time I had already created my own TFS server (myname.visualstudio.com). My problem is that I created a new project on my TFS website (the one with the dashboards). Then, when I go to VS2017 and connect to that new project, it asks me to map and get it as expected. Instead of just clicking the "Map & Get" button, I clicked Advanced so that I could configure everything.
On the workspace configuration dialog, I noticed that VS names it "MYPCBLABLA_1". If I try to remove the "_1", VS says that "the workspace blablabla already exists on computer blablabla" and does not let me use my existing workspace name.
Why does it do so? Can I not use just one workspace? From what I understand, a workspace is the container of my projects, so different workspace, different set of projects. But what are they really?
Additional info:
I don't know if this helps, but in the past I formatted my PC many times. I'm not sure if that affects the mappings or workspace names when I use VS after reformatting.
Workspaces are perhaps the least well-understood feature in TFVC. You are right in saying they're a way to isolate different sets of files from a TFVC repository.
A lot of people configure a new workspace for a specific project or set of solutions, but let's look at some of the ways workspaces can be used in detail:
Hotfixes: you may need to create a hotfix for something happening now, but you have pending changes in your existing workspace. Instead of shelving those changes and performing a "Get Specific Version" on the buggy version, you can create a new workspace in which to solve this particular problem (see the command sketch after this list). After completing the fix you can then continue working in the other workspace without needing to do anything.
Experiments: you may want to do some major refactoring, restructure source control or some other highly impactful operation. Doing this in a new (temporary) workspace helps you prevent messing up your normal work area.
Reviewing other people's changes: when performing a review on another person's changes, you may want to have a local copy so you can run, annotate, and play with the other person's code. Instead of taking these changes into your own workspace, you can easily bring them into a temporary workspace, which you can safely delete afterwards.
Performing a merge while you are working on other changes: you may be working on a new feature and already have some changes merged back to another branch when a release needs to be shipped. To prepare this release without picking up changes from, or overwriting, work in progress in your current workspace, it's often easier to perform these kinds of release activities in a temporary workspace; that way you know the work is always done on the exact version in source control.
Preventing accidental changes to important branches: by putting your production branch in a separate workspace, you can't accidentally combine changes from, say, Development and Main into a single check-in. Since Visual Studio often auto-selects all pending changes in the workspace, this may cause unintended changes to your master/main branch. I've written a check-in policy to prevent these issues, but having separate workspaces is a much safer solution.
Working with multiple developers on the same workstation/server: in some organisations, developers use a remote desktop connection to a central, powerful server to make their changes. To ensure each developer has their own set of files, each developer gets their own workspace. An alternative is to make the workspace public, which allows multiple developers to use the same workspace folder, but this often leads to all kinds of unexpected issues.
Browsing an old version of the code: if you need to review/compare an older version to a new one, you can often get away with the folder diff view in Visual Studio, but if you need to do more thorough comparisons, you may want to have 2 copies of the same folder in your TFVC repo. Creating two workspaces will allow you to have two different versions of the same folder on your local disk.
Prepare a special version for merges or labels: You can merge and label the workspace version of a set of files. You can create a workspace and then use Get Specific Version to fetch specific versions of specific files, these can all come from different changeset versions. Once you're satisfied, you can perform the label or merge or branch action to store this specific workspace version configuration on the server.
As you can see, workspaces allow you to do parallel development on one machine, isolate changes, and so on.
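As a rough illustration of the temporary-workspace scenarios above, a command-line sketch (the collection URL, workspace name, and paths are placeholders):
// create a throwaway workspace
tf vc workspace /new Hotfix_Temp /collection:http://tfsserver:8080/tfs/DefaultCollection /noprompt
// map only the branch you need and get it
tf vc workfold /map $/TeamProject/Release C:\src\Hotfix_Temp /workspace:Hotfix_Temp /collection:http://tfsserver:8080/tfs/DefaultCollection
tf vc get C:\src\Hotfix_Temp /recursive
// fix, check in, and then throw the workspace away
tf vc workspace /delete:Hotfix_Temp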
Be creative
As you can see, workspaces are a very powerful concept, usable for a lot of operations, but you need to understand the concept thoroughly. Many developers don't understand exactly what workspaces are and how they work, and they're missing out on some of the most powerful concepts of TFVC.
Consolidating and cleaning up
In your case you now have two workspaces. In order to consolidate these (if you want to), you can unmap the folders from your _1 workspace and then map those same folders in your original workspace. You can also delete the _1 workspace from the TFS server and then update the mappings of the original workspace.
Remember that workspaces are stored on your local machine, but that the TFS server also has a registry of who mapped which TFVC folders to which workstations. So simply deleting files from your local disk is not sufficient. You need to save these changes to the TFS server (this happens automatically after performing a get operation after changing the mappings).
To check which workspaces are registered to your workstation on the TFS server, use:
tf vc workspaces /computer:YOURWORKSTATIONNAME
Then delete old workspaces with
// delete the workspace itself (this removes it from the TFS server as well)
tf vc workspace /delete:WORKSPACENAME
// remove a stale workspace entry from this computer's local cache
tf vc workspaces /remove:WORKSPACENAME
To prevent the creation of a new workspace by VS, I:
Create a local folder to which I’ll map the content of the remote repository;
In VS, connect to the remote repository;
In VS, open Source Control Explorer and navigate to the content I need; VS will show a "not mapped" message.
Click on that message and map locally.
This guarantees that no other workspace will be created, and the current one will be used.
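If you prefer the command line, a roughly equivalent sketch maps the server folder into the existing workspace yourself (the workspace name and paths are placeholders):
// add a mapping to the workspace you already have, then populate it
tf vc workfold /map $/NewTeamProject C:\src\NewTeamProject /workspace:MYPCBLABLA
tf vc get C:\src\NewTeamProject /recursive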

How can I keep a local copy of modified files in Visual Studio?

How can I keep a local copy of modified files in Visual Studio? I am using TFS for source control. I have modified two files but do not want them in TFS yet; I want to check them in later, once the changes are approved by my manager.
You can use shelvesets to keep local versions of your code.
You can read up on how to use them here
Next to that, I would suggest you look into branching and branching strategies. This way you can still check in your code, but on a different branch. This is a much more robust approach for working with multiple versions of your code while still keeping traceability and availability. You can find some information here, but there are many sources for this if you google around a bit.
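For reference, shelving can also be done from the command line; a minimal sketch (the shelveset name and file names are placeholders):
// park the two pending edits on the server as a shelveset, without checking them in
tf vc shelve WaitingForApproval file1.cs file2.cs /comment:"Hold until approved"
// bring them back once the manager approves
tf vc unshelve WaitingForApproval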
Use shelvesets in TFS. They let you keep a copy of your changes and publish them so that you, or other users, can retrieve them later.
Reference URL : http://codespower.com/2017/04/27/using-shelvesets-in-tfs/
You can use shelvesets to keep local versions of your code and then undo the changes so your workspace matches the server version. When you need that code again, you can get it back from the shelveset.

TFS 2012 not detecting deleted files in pending changes

We have multiple developers on our team. The process below works for everyone except one developer, but we cannot seem to find the reason it does not work for this individual. We all have VS Premium+ and the TFS 2012 Power Tools installed.
We have a branch. We get the latest version from the branch, go to Windows Explorer, and delete all files in the folder "sdk" (there are no subdirectories in sdk/). Then we copy a bunch of files into it. (This effectively leaves some files as new, updated, identical, or removed when compared with what was deleted.)
When we go to pending changes, these changes show up under "Excluded Changes - Add(s) 51, Deletes(3)".
Except for one developer. His system does not recognize these changes. What might cause this to not work for him?
If it helps troubleshoot, he is also the only developer that if he were to delete these files via power tools delete option in windows explorer, his .dll files get locked. This does not happen for anyone else either.
This is what we've checked so far:
EDIT: Solution found - thank you all for the responses! It was indeed the local vs. server workspace option. Setting his workspace to local solved these and a few other issues he was apparently having.
Make sure that the developer is using a "Local Workspace" as opposed to the "Server Workspace".
Local workspaces are a concept introduced in TFS 2012 that helps developers work offline, something the server workspaces of earlier versions did not allow. TFS 2012 changes up the workspace options: server workspaces are still available and work exactly as they have in previous versions, but TFS 2012 also contains a new type of workspace, called a local workspace. Again, this is an oversimplification, but in a local workspace all the files are read/write, not read-only. The metadata about the files is stored in a hidden folder in the root of the workspace, which allows edits, renames, and deletes to be done locally without any communication with the server.
This improves the offline story with TFS significantly, as you no longer encounter issues with editing read-only files. It also makes it easier to work with other tools (such as Notepad) to edit code files. Making a change to a code file using Notepad will still mark that file as edited, which will be picked up by TFS the next time you connect.
LINK
This only ever happens when a user tampers with a local view of source control (be it a local workspace or not). If all you ever did was get latest from TFS, this would never occur; instead, the local view of what is in TFS would always be properly managed.
It also sounds like a bad merge, e.g. getting latest (where the files no longer exist) and then copying in old content (introducing untracked files). One thing you might try to correct the issue is a forced fetch from TFS after deleting the local workspace contents BEFORE attempting a merge. This will ensure that the local workspace is up to date and accurate with what the TFS server believes is truth; if the problem still occurs after merging in content, then it is almost certainly within the merge process the user is going through (i.e. PEBKAC, or a knowledge gap about what they are doing).
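A sketch of that forced fetch (run from the workspace root; the server path is a placeholder):
// re-download everything, overwriting local files so the workspace matches the server
tf vc get $/TeamProject/Branch /recursive /force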
If you unshelve old content (pre-deletion) into the local workspace (where the deletions have already been performed, according to the SCC, and thus locally because of a sync/get-latest) then the unshelved files will effectively become untracked and it's up to the user to clean up the mess. This is identical to a user having copied loose files into their workspace that TFS never had any knowledge of. TFS isn't going to prune untracked files for you, I believe some other source control tools might do this as a configurable default, TFS does not.
That this is only happening to one developer in the team suggests that the other developers, one at a time, should sit with this developer and drive using "their process" to see if it still occurs for them. More often than not this comes down to a bad process a user has adopted, and putting a different person in the chair can help highlight why it has been occurring and help end it. A disciplined build/source manager and/or developer should not experience this problem.
Very interested in knowing what the problem turns out to be.

Multiple Team Foundation Server

We currently have a local TF Server here in our company, and we are about to make a subset of our projects open source (via CodePlex), but we are having problems mixing two Team Foundation Servers in the same solution. It looks like Visual Studio can't be connected to multiple TF Servers at the same time. What's the best way to deal with that?
Solution 1: Bind open-source projects to CodePlex only, and proprietary projects to local only, binding and unbinding projects depending on where you are connected --> It looks like VS doesn't like the idea. Projects lose bindings and start to behave strangely.
Solution 2: Bind all to local and use another solution for the open-source subset --> The Team Explorer workspace manager prevents you from using overlapping local folder trees, even on different servers, so this is not an option.
Solution 3: Bind all to local using TFS, and use another source control system like SVN for the open-source subset --> It looks like it will become messy easily, but we don't have a lot of options.
Has anyone with open-source projects faced a problem like this?
I would stick to one single authoritative repository or you'll end up in version hell at some point.
If you intend to have external developers contributing code on the CodePlex side, you will need to merge your changes with theirs and also integrate their changes into your own internal TFS server.
It's safer to have one single authoritative repository and just create snapshots for milestone releases on the other.
You could do your fine-grained check-ins and modifications on your internal repository and periodically integrate/merge them into the CodePlex code tree. However, what works on one codebase may not work so well on the other after integrating; the sooner you integrate changes, the better (don't work on your own isolated branch for too long).

Perforce integration with visual studio without project files being checked in to perforce

I am working on a large source base (approx. 15K files) decomposed into about 25 projects. I want to keep the source in Perforce (and am evaluating Perforce to that end), but due to complications in the setup it isn't possible for me to keep the Visual Studio projects in source control. I know that in theory the answer is to check the projects in, but that isn't feasible: we would end up with projects for several versions of VS checked in, plus several variants of each of these. Instead, the projects are generated automatically, and this setup works very well.
Is there a way to get VS to check out files for editing as it goes without adding the project to Perforce, so the user doesn't have to go to the Perforce client and manually check out each file? Alternatively (and even better), is there a way to get VS to recognise that the files in a project are under source control without having to add the project to source control as well?
I know we could also take the tack of having every user check out for editing all files they might potentially want to edit ahead of time, then revert unmodified files before submitting their changes. Is there a performance penalty in Perforce for taking this approach?
In your case, I'd suggest not using the visual studio integration for Perforce.
You can either add Perforce commands to the Tools Menu, or try Nifty Perforce from Google:
http://code.google.com/p/niftyplugins/
One option is to use Perforce as if you were disconnected from the server and reconcile your changes later, rather than telling Perforce everything you do before you do it. (This is roughly equivalent to the workflow in CVS or Subversion.) You would synchronize your working copy, go off and develop, and then ask Perforce to figure out what you did while it wasn't watching.
Perforce has a nice document describing the process: Working Disconnected From The Perforce Server
One thing the document doesn't mention is the allwrite clientspec attribute, which marks all files in your working directory as writable instead of only the files you have checked out.
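A sketch of that disconnected workflow with an allwrite client (the depot path is a placeholder; p4 reconcile needs a 2012.1 or newer server, and on older servers the document above describes the manual alternative):
// in the client spec (p4 client), switch on allwrite so synced files stay writable:
//   Options: allwrite noclobber nocompress unlocked nomodtime normdir
// after working offline, detect adds, edits and deletes and open them in a changelist
p4 reconcile //depot/project/...
// review what was opened, then submit
p4 opened
p4 submit -d "Reconciled offline work"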
For the sake of completeness: there is a new tool for this called P4VS. I like it better than P4SCC, which never worked for me the way I wanted.

Resources