I have an issue with the TFS cache folder.
I have configured TFS to save cached data in a dedicated hard drive (F:\TFSCache) and I see this in the TFS Admin Console.
If I open the F:\ drive, I can see the cached data under the "proxy" folder.
All seems to work fine, but I noticed that TFS is still using space on my C:\ drive to save cached data under "C:\Program Files\Microsoft Team Foundation Server 12.0\Application Tier\Web Services\_tfs_data\1e68059b-3328-4ab3-af6e-3a068be57a6d\Proxy".
If I compare the two folders on the two drives, they contain almost the same subfolders, and all of these have their "date modified" attribute updated with the latest date and time.
Why?
I want TFS to use only the F:\ drive, as specified in the console.
Thanks in advance.
First, make sure you have followed the procedure To specify a different cache root folder.
The last step is Delete the old cache root folder. Otherwise, TFS will still write cache data to the old folder. After the deletion, TFS will use only F:\TFSCache, as specified in the console.
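A minimal Unix-style sketch of that sequence, with stand-in paths (the real steps use the TFS Administration Console and the Windows paths above; old_root/new_root are illustrative names, not real TFS paths):

```shell
# old_root stands in for the old C:\...\Proxy cache root, new_root for F:\TFSCache.
old_root=./old_cache_root
new_root=./new_cache_root
mkdir -p "$old_root" "$new_root"
touch "$old_root/stale.cache"   # leftover cache data in the old root

# 1. Point TFS at the new cache root (done in the Admin Console, not here).
# 2. Delete the old cache root so TFS can no longer write to it.
rm -rf "$old_root"
[ ! -d "$old_root" ] && echo "old cache root removed"
```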
Moreover, please also pay attention to the security note:
The cache folder stores sensitive information that is not encrypted.
Therefore, you should make sure that only the service account of the
application tier (TFSService) has Modify permissions to this
folder.
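On a Unix filesystem the equivalent restriction would look like the sketch below; on Windows you would instead grant Modify only to the TFSService account via the folder's Security tab or icacls (the path here is an illustrative stand-in for the cache root):

```shell
cache_root=./tfs_cache   # stand-in for F:\TFSCache
mkdir -p "$cache_root"
chmod 700 "$cache_root"  # only the owning (service) account may read or modify it
ls -ld "$cache_root"
```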
I fixed the problem by updating TFS to the latest build (TFS 2015 Update 4).
Can I put my Visual Studio solution in OneDrive and work from there or can that end up in data loss somehow? I want to take my projects with me wherever I am and not worry about putting them on a flash drive every time.
I wouldn't recommend it.
OneDrive and other cloud services tend to synchronize your files whenever they are created, modified, or deleted. And Visual Studio does that a lot, to put it mildly.
OneDrive will not just sync (copy to the cloud) all your source files, but also all the binaries Visual Studio generates, which take up quite a lot of space unnecessarily on your OneDrive and generate a lot of sync traffic. It is probably better to use VSTS (Visual Studio Team Services) and Git to keep the sources synchronized across your various PCs and backed up in the cloud.
Remove the Visual Studio folder from your OneDrive path. In general, it is better not to let OneDrive sync everything stored in Documents.
How to do that: first make a copy of Documents outside of OneDrive, for example on C:. Then, in OneDrive Online (https://onedrive.live.com), remove Documents from syncing and delete Documents there. On your PC, copy the contents back from your copy into Documents. Finally, add a new folder inside the OneDrive folder (C:\Users\Xxx\OneDrive) and move all the folders you want OneDrive to sync from Documents into that new folder.
It may also be possible simply to tell OneDrive to stop syncing Documents and then delete everything from Documents in OneDrive. But I recommend keeping a copy of your documents somewhere, in case anything goes wrong.
I'm using MS Team Foundation Server 2012 with a server workspace, and the local files are on a Linux server (accessed via Samba). Accessing TFS from Visual Studio 2017.
When I check in foo.txt, TFS successfully sets foo.txt to read-only as expected (r-xr--r--). But TFS does NOT set the permissions on the directory in which foo.txt is stored. So although I can't modify foo.txt, I still have write permission on the directory it's in, which means I can delete, rename, or overwrite foo.txt.
Is there a way I can tell TFS to manage the permissions of the directory a file is stored in (in addition to the file itself), so that I could NOT delete/rename/overwrite a file (outside of Visual Studio) without first checking it out of TFS? I'd be happy if, when I checked out a file, the directory it is stored in became writable (u+w), and when all files in the directory were checked in, the directory became read-only again (ugo-w).
As a side note, I thought this might be a complication of having the files stored on a Linux box. But I tried it with a local file stored on my Win 7 PC and got the same result: TFS sets the read-only file attribute, but even with that attribute set, I can still delete/overwrite/rename the file, presumably because I'm an administrator of my PC (the Security tab of the file properties shows I have full control).
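For the Linux/Samba side, this behavior is by design: in POSIX, delete and rename are operations on the directory, not on the file, so the read-only bit on the file does not block them. A quick shell sketch (ws/ is just a scratch directory):

```shell
mkdir -p ws
echo "hello" > ws/foo.txt
chmod a-w ws/foo.txt                    # what a check-in does to the file
mv ws/foo.txt ws/bar.txt && echo "rename succeeded"
rm -f ws/bar.txt && echo "delete succeeded"

# Blocking delete/rename would require toggling the *directory* bit too:
echo "hello" > ws/foo.txt
chmod a-w ws    # "all files checked in": read-only directory blocks both
chmod u+w ws    # "file checked out": directory writable again
```

TFS does not manage that directory bit, which is exactly the gap described above.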
So I think it's a generic TFS question. Since TFS uses the read-only attribute to prevent files from being modified outside of TFS without being checked out, is there a good way for TFS to prevent them from being deleted / overwritten / renamed outside of TFS?
Without that, I think I'm at risk of my local files accidentally getting out of sync with the repository, and that doesn't seem like a good thing.
Just as you said, a local workspace is more appropriate for your situation.
And this is also available in TFS 2012.
A local workspace caches the unmodified version of each of your files, which lets you edit, compare, and do other things without being connected to the server, much like working offline. Moreover, when you add or delete files outside of Visual Studio, Visual Studio automatically detects these changes.
Even though you have permissions on the directory in which foo.txt is stored (for example, to delete foo.txt), you still won't be able to check the change in to TFS source control without sufficient permissions. And if you accidentally delete a file, it is also easy to restore it locally: just get your files again.
Moreover, if you are an administrator, you can specify which type of workspace Visual Studio creates for your team members by default: Local or Server.
Take a look at this excellent blog post: Server workspaces vs. local workspaces. It helps you clearly understand the differences between the two.
I deleted a local copy of a TFS source-code branch (actually, I renamed the branch and had to delete the old-named version), but the Source Control Explorer window in Visual Studio says I still have the latest version, so whenever I double-click a file I get an error that the file doesn't exist.
Is TFS supposed to notice when I delete a local working copy, i.e. is this a glitch?
How can I address it? Get the latest version and then delete it?
Is TFS supposed to notice when I delete a local working copy...?
No. TFVC expects that it controls your working directories, at least with a server workspace. When you start doing things without telling it, it has no idea what you've done.
If you want to remove files from your local drive, do a get of changeset 0 on that path (where the files won't exist), and/or delete your working folder mapping, or delete the TFS workspace.
Why does it work this way? Performance. If you have 10+ GB of sources, you can't afford to have your version control system scanning your filesystem to try to figure out what you've done. That's why TFVC Server Workspaces work this way.
Change your workspace to a local workspace if you have only a small amount of source code and you want the filesystem scanned for changes. Or switch to Git in TFS if you want a completely distributed experience.
I'm using TFS with VS2008 and VS2010 and in the TFS collection I have several projects.
I've mapped the TFS root to a local drive to preserve the TFS folder structure, and I've done a Get Latest on several subfolders.
I also downloaded an unwanted folder, so I deleted the local folder contents, but TFS still shows that folder in black with "Latest" set to Yes. How can I tell TFS that I've locally deleted a folder that I previously downloaded?
I think the problem is that you and TFS do not agree on what "latest" means.
"Latest" in TFS means that nothing has changed on the server since you did a Get Latest. It does not mean that what is on your hard drive is identical to the latest version on the server.
So TFS shows what it is supposed to, see this question for more: Why doesn't TFS get latest get the latest?
The intended solution for folders on the server that you do not want on your local hard drive is "Cloak", as MBulava mentioned (right-click the folder -> Cloak). If you do not wish to have a folder on your hard drive, I recommend this solution, as the folder will never be downloaded until you uncloak it and will show as greyed out and "not downloaded".
If you want to look at the differences between the contents of your hard drive and the server version, you can use the "Compare folders" feature. It will show that the folder has been deleted on the hard drive.
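A Unix-style analogue of what the folder comparison reports (folder names here are illustrative stand-ins for the two sides): diff the server-side contents against the local copy to spot items that exist on the server but were deleted locally.

```shell
mkdir -p server_copy/unwanted local_copy   # stand-ins for server and local trees
touch server_copy/unwanted/readme.txt
diff -rq server_copy local_copy || true    # diff exits non-zero on differences
# prints: Only in server_copy: unwanted
```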
If you want the folder mapped and grey (like the other folders you did not download), you can cloak it and then uncloak it, answering "No" when asked whether to download it now. This is equivalent to never having downloaded the folder in the first place.
Martin Woodward has a handy blog post, TFS Top Tip #11 - Removing source control files from your local file system, that answers this problem without resorting to cloaking.
You need to do a Get Specific Version, change the Version Type to Changeset, and specify 1 as the changeset number. His blog post goes into detail about why this works. I've verified this behavior in Team Explorer for Visual Studio 2013, 2015, and 2017.
Update:
If you have a bunch of folders to process, you can use the command line as follows, replacing folderName with the relative folder as the client itemspec (or the equivalent server itemspec):
tf.exe get folderName /v:1 /recursive
Mbulava's suggestion to use "Cloak" will leave the files and/or folders on the TFS server but will remove them from your local copy and from "Get Latest Version" requests.
If you have deleted files from your local directory and you want those changes to appear on the server, you need to go to the TFS directory in Source Control Explorer, select the files or folders you want to delete, right-click, and delete them. Then you need to check in the pending deletion changes; the TFS server will then delete the files/folders.
I use VS 2005 and VSS 2005. Every time I close VS I get the error "ss.ini not found". But apart from this, VSS works fine: no problems when I open VS and check files in and out. ss.ini is present, and the VSS repository is specified as a network path. I just worry that I might have problems later.
Ss.ini keeps track of your preferences, project locations, dialog settings, and so on. The ss.ini needs to be in the user's folder for the database. In my case, it is C:\DEV_\VSS\SourceSafe\users\amissico\ss.ini.
If the file exists, check the permissions of the folder and the file. If the file doesn't exist, check the permissions of the folder.
Copy the SS.ini file from another user's folder into the folder of the user who is getting the issue, like so:
Source: Project\users\ajit\SS.ini
Destination: Project\users\Sujit\SS.ini
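A sketch of that copy with Unix-style stand-in paths and a fabricated ss.ini (the real paths are the Project\users\... ones above):

```shell
mkdir -p users/ajit users/Sujit
printf '[Settings]\n' > users/ajit/ss.ini   # stand-in for a known-good ss.ini
cp users/ajit/ss.ini users/Sujit/ss.ini
[ -f users/Sujit/ss.ini ] && echo "ss.ini restored"
```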
VSS provides a Deploy command which allows you to publish files to your web server. You can check the following link for more info:
http://msdn.microsoft.com/en-us/library/bb509340(v=vs.80).aspx