"Access to the path is denied." when attempting to check in files to TFS - visual-studio-2013

I have suddenly begun encountering an error similar to "E:\Websites\Stage\mywebsite\somefile.ascx: Access to the path is denied" on a multitude of my local files when attempting to check them in. The files it fails on are of all sorts: PNG, ASPX, CONFIG, etc.
I am using Visual Studio 2013 for Web (Update 4) and the visualstudioonline.com TFS.
The files are stored on a network location and I have a drive mapped to that location. I can manually open, manipulate, and save any of the files that error, so I do not believe it is truly a permissions issue.
This setup has worked for months but suddenly it is giving me problems.
I ran the following PowerShell command on the folder:
Get-ChildItem -Include *.* -Recurse -Path 'E:\Websites\Stage' | select fullname,isreadonly
All of the files return 'False' in the IsReadOnly column, and no errors are returned.
I am in need of some further ideas.

I found a workaround in another StackOverflow question.
Essentially, you shelve the pending changes and then check them in; there is no need to unshelve them.
I would only suggest using that to check in your changes until you set up another workspace locally (or someone fixes the issue).
Like many others, I was using Visual Studio 2013 from within a VM, with a local workspace located on the host computer and mapped through a shared drive; this worked well before updating to VS2013 Update 4.
That setup was suggested to me with the reasoning that if the VM crashes, I wouldn't lose my changes.
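If you prefer the command line, the same workaround can be roughly reproduced with tf.exe from a Developer Command Prompt; the shelveset name and comments below are just placeholders:
tf shelve BeforeCheckin /comment:"Shelve pending changes first" /noprompt
tf checkin /comment:"Check in after shelving" /noprompt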

Storing your local workspace on a network location is not supported and should never be done.
Have a 'local' (physically on your local machine) workspace where you edit the files and check in. Then have an automated build that publishes the files to a location of your choice.
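As a rough sketch of that publish step (the paths and the choice of robocopy here are just one way to do it, not something TFS requires), the build could simply mirror the checked-in site to the network share:
robocopy C:\Workspaces\mywebsite E:\Websites\Stage\mywebsite /MIR /R:2 /W:5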

I ran Windows/Visual Studio in Parallels on a Mac and had a project saved to my desktop (yes, shame on me). Internally this path is handled as \\psf\Home\Desktop even though it is stored locally and not on the network. It still gives the same exception, and the problem is solved by moving the project to your regular drive (c:\...).

Related

Visual Studio 2015 can't open project.exe for writing. Access to path denied

I am developing a VB.NET (4.5 framework) solution in Visual Studio 2015, Win10 OS, and have been able to run the builds uninhibited for several months, but now I am receiving the following error upon starting the build:
vbc : error BC2012: can't open
'C:\MyProject\ProjR5\ProjR5\obj\Debug\ProjR5.exe' for writing: Access
to the path 'C:\MyProj\ProjR5\ProjR5\obj\Debug\GenTagR5.exe' is
denied.
At first, VS2015 would give me the option to run the last successful build, but even that is no longer an option. After exhaustive internet searches on this problem, none of the dozen or so given solutions are solving my issue.
Here is what I have tried in order to resolve the error so far:
Ran sfc /scannow (elevated prompt)
Using Process Explorer, used Find Handle or DLL to search for any substring that included my project name
Made sure there were no hanging procs (including procs with my project name, devenv.exe, [project].exe, [myproject].vhost.exe, etc.)
Restarted VS2015
Restarted VS2015, running "as Administrator"
Restarted Computer
Full Shutdown of computer
Complete Rebuild of Solution
Build->Clean Solution
Build->Clean Solution, then Build->Build Solution
Build->Rebuild Solution
Uninstalled and Reinstalled VS2015
Disabled all indexing
Removed "Read Only" attribute from entire project folder and files within
Checked startup scripts for like- or identical processes
Disabled all AV apps
Disabled all antispyware apps
Disabled all firewalls
Verified that Application Experience (services.msc) wasn't disabled (I'm using Win10 ... it isn't even in the list of services)
Set Tools->Options->Projects and Solutions->Build and Run->Max. parallel builds to 1
Reran aspnet_regiis.exe (under .NET\Framework)
Checked Local Security Policies and verified my account was listed under "Impersonate a client after authentication"
Removed \bin and \obj folders
Put \bin and \obj back when removing them didn't help
Removed \bin and \obj folders, then Rebuilt
None of these have worked. Any suggestions?
The problem ended up being Samsung Magician's Rapid Mode losing data during its write-caching phase to my solid state drive. I turned off Rapid Mode, and now the project builds without any problems.
Sorry for coming in late, but I had this problem and I wanted to show how I fixed it for the next devs who need a solution:
It's quite simple, just change your project's assembly name: 1) In Solution Explorer, right-click your project.
2) Properties >> Application >> Assembly name >> change it.
3) Compile and run to test it.
4) Change the name back again if you want the original name.
(Screenshot: changing the assembly name.)
I'm new to programming in VS, but I had the same problem of not being able to access or write the exe file on build.
The problem came out of nowhere. I hadn't used or made changes to the exe file in months; I had built it, used it now and then, and forgotten about it.
Then after a few months I wanted to start the exe, but there was no icon on the desktop. I tried everything, lost three days searching inside the code for an error in VS, and then turned to Google.
I read the comment above that mentioned Bitdefender, opened it, and found that Bitdefender had blocked and isolated the exe files. I tried excluding the problematic files and folders inside Bitdefender, but that didn't help.
So I went back to VS.
While debugging I got an x86 processor warning that didn't stop the build (the free component's name in the description helped me), the kind of warning you can ignore but that shows up on every build.
So I made one last move before starting all over again: I removed the component from the application, deleted it from the PC, restarted VS, and everything was OK.
So in my case it was all about a free component I used in the app inside VS: Bitdefender found some adware/virus in it and blocked the build.
Bitdefender had been deleting or blocking the exe file from the start.
Hope this helps anyone with a similar problem!
The cause of this error for me was that Team Foundation Server had pulled a bunch of files into my workspace as read-only. I'm not sure why it pulled them down from the server with the read-only attribute set, but all I had to do was clear it.
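If there are a lot of affected files, the flag can be cleared in bulk with a PowerShell one-liner along these lines (the workspace path is just an example):
Get-ChildItem -Path 'C:\MyProject' -Recurse -File | ForEach-Object { $_.IsReadOnly = $false }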
OK. Create a new solution and add its directories to your antivirus exception list, then copy all your work, except for the '.vbproj' or '.csproj' file, into the directory of the new solution. I have tried that and it works; since I have Bitdefender, it was the only way to sort out the issue. After doing so, try to build the app again. If it does not work, then I am definitely out of ideas.

Visual Studio, TFS and file system links

My Visual Studio projects are located in C:\Users\MyName\ProjectName.
To make life easier (I thought), I created a file system link in the root called TFS.
(i.e. C:\TFS points to C:\Users\MyName\Projects)
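For reference, a link like that is typically created from an elevated command prompt with mklink; whether it is a directory symbolic link (/D) or a junction (/J) is an assumption here, but either form reproduces the setup:
mklink /D C:\TFS C:\Users\MyName\Projects
rem or, as an NTFS junction:
mklink /J C:\TFS C:\Users\MyName\Projects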
I always open my projects from the link (i.e. C:\Tfs\ProjectName\ProjectName.sln) and my TFS local paths use the link.
This works fine most of the time, but sometimes Visual Studio and TFS think files are in C:\Users\MyName...
i.e. If I look at the properties of projects in a solution, one can be in C:\Tfs and another in C:\Users. I have verified that there are no absolute paths in any solution or project files.
When this happens and I add a new file to a project, TFS becomes a real mess. TFS thinks the new file is in C:\Users and is not version controlled, but at the same time there is a file with the same name in the C:\TFS folder, so I need to resolve a conflict. I can resolve the conflict, but TFS starts versioning the C:\Users file; i.e. the local folder for the project is C:\TFS... but according to TFS (and Pending Changes) the new files live in C:\Users.
I have not found a way to change the local name of a file, only of a folder.
Is there a way to resolve this or should I just get rid of the link?
(It works slightly better with a TFS local workspace but the problem is still there)
tl;dr
Symlinks are funny things and because TFVC stores binding information outside of the source control folder, it may get very confused when your repository is stored in or includes them.
Details
Unlike Git, Mercurial and Subversion, TFVC doesn't just keep the binding of disk to repository in a subfolder of the repository (in the case of a server workspace it doesn't keep this data on disk with the repository at all). It also stores it in a number of other places, namely the TFS server and your user profile.
When you look at a subversion or git repository you'll find the .svn or .git folders which contain the information of which folders on disk map to the repository.
With TFVC this information is not only stored on disk (in the case of local workspaces), but also on the server (machine name, server path, local path) and in your user profile (under AppData\Local\Microsoft\TeamFoundation\). These configurations store the full paths, which are used to determine whether a file is under version control or not. The reason why local workspaces improve things is that they add the $tf folder with some of the binding information.
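If you want to inspect that cached binding data yourself, it lives in your profile's Team Foundation cache. The exact folder layout varies per VS/TFS version, so the sketch below simply searches for the cache file rather than assuming a path:
Get-ChildItem "$env:LOCALAPPDATA\Microsoft" -Recurse -Filter VersionControl.config -ErrorAction SilentlyContinue | Select-Object FullName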
Since a workspace mapping can map a folder in your repository to only one folder on disk, the use of symlinks confuses the TFVC client. You might consider this a bug, since Microsoft should be able to resolve the link (depending on the link type), but Visual Studio assumes you are not using links. Another reason why this (potentially) confuses Visual Studio is that file size and other attribute changes are not always signaled when links are used (date changed and file size may not be picked up until Visual Studio is told to refresh the solution). The read-only bit (which TFVC uses in the case of server workspaces) also behaves specially when links are used and may cause issues with undetected checkouts.
More on the strange edge cases caused by links can be found here.
I'm not sure why you'd want to use a link in this case; the sources are already stored in TFS, so a backup of your profile doesn't add much and only makes your system slower if you have a roaming profile. Plus, workspaces are machine-bound and should never "move between machines" magically anyway.
You can submit a suggestion on the Visual Studio User Voice, or file a bug on Connect if you want to see this behavior changed, but to solve your problem, use normal folders and map your files to a unique location.
Just keep in mind that very few applications in Windows are built to handle symlinks, and those that are may still cause strange behaviors. Windows Explorer (file open dialogs and drag & drop) may report the original file location instead of the link location for certain actions, and changes to attributes in one location may not be visible in the linked location.
Systems are able to see the difference between symlinks and real directories, and thus may act on that knowledge.
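To illustrate (this is a generic check, not the original screenshot): in PowerShell 5 or later a link reports a LinkType and Target, while a real directory leaves both empty. The paths are the ones from the question:
Get-Item C:\TFS | Select-Object FullName, LinkType, Target
Get-Item C:\Users\MyName\Projects | Select-Object FullName, LinkType, Target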

Visual Studio 2013 Preview Not Responding frequently

I've installed the Visual Studio 2013 Preview to try out and I'm having some very bad performance issues. Every time I open a file and immediately try to close it, edit a file, save a file, etc. the IDE will stop responding for about 15 seconds.
I've gone through every performance tweak I could find through stackoverflow, blogs, web search, etc but none have worked (for example, clean up temp folders, disable add-ins and extensions, delete .suo file, etc.).
Using /safemode, the performance problems go away but I can't find what could be different since I have no add-ins, nuget packages, or extensions installed.
Using SysInternals Process Explorer, I can only see the process for devenv.exe peg the core it's using at 100% when it stops responding. I am not seeing any network or hard drive activity during this time and no other processes become active.
I've reinstalled with no luck, and I've installed it on another development machine where it seems to work just fine.
Anyone have any ideas?
Thanks!
UPDATE: In Process Explorer 'Other I/O Delta' shows ~200,000/sec when it locks up on the devenv.exe process. Still looking...
UPDATE 2: I guess I should add that this PC is a Dell Vostro 460, i7-2600 @ 3.4GHz, 8GB RAM, Windows 7, 1TB HDD with 550GB free, plenty of power for what I'm doing. Closed all other apps while debugging, including VIPRE A/V and Malwarebytes.
UPDATE 3: Maybe getting closer... using Process Monitor (love SysInternals stuff!) for some reason my entire C:\Projects\ folder is being parsed/searched by devenv.exe. I keep all my project folders under C:\Projects\ where there are about 20 projects each with their own sub-folder. Here's where it gets weird. In /safemode, devenv only parses the current project's folder, not the entire parent folder. Projects has 6,271 folders with 29,914 files. I tried creating a new c:\Projects2013\ folder, created a new test project, and devenv is trying to parse the full parent Projects2013 folder, yet in /safemode only parses Projects2013\Sample.
Obviously though the new project in Projects2013 runs full speed because it's parsing far fewer files. The other computer runs fine because I left the default Projects path and there were no other projects in that folder. Now what in the world could be doing this and why the different folder path between regular and safe mode? Time to dig through Tools, Options... ugh!
(Screen grab from ProcMon omitted.)
RESOLVED! It was git causing the problem. I had a local repository set at c:\Projects\, which contained all my various project sub-folders. The debug dump file I created for the VS programmers allowed them to narrow it down to git. Removing the local repository fixed my performance issue; VS 2013 is at least usable now. The VS programming team still needs to resolve the continuous re-parsing of the folder, though. Anyone with a very large repository will end up with this issue.
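If you suspect the same thing, a quick way to check whether a parent folder like c:\Projects\ has a repository sitting above your individual solutions is a recursive search for .git folders (the path below is just my layout):
Get-ChildItem -Path C:\Projects -Directory -Recurse -Force -Filter .git | Select-Object FullName
A .git folder found at the parent level, rather than inside a single project folder, is what was triggering the continuous re-parsing for me.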

How can I force TFS to let me download a folder (other than methods listed)?

I have a seemingly common problem, but cannot find a common solution that will work for me. I recently had my computer re-imaged and am now in the process of redownloading a solution from TFS. One of the solution folders contains 2 folders that list "Not downloaded" in the "Latest" column of the Source Control Explorer. When trying to open the solution, I get the error "The project file could not be loaded. Could not find file x". I've tried the methods listed below, to no avail:
Get Specific Version, checking Overwrite options
Deleting the .suo file, restarting VS2010
tf get /force
Remove mapping, deleting local files, remapping entire TFS project to local folder
tfpt rollback /changeset where the last changeset for the .csproj listed a branch and a merge as pending changes by me
File -> Source Control -> Open from Source Control, Navigate to TFS project, try to open .csproj in undownloaded folder, receive error "The selected file cannot be opened. The project file has been moved, renamed or is not on your computer."
I may be missing other things I've tried, I'll be sure to update this list if I can think of anything.
Besides those listed above, is there any other way to get those 2 folders and their content from TFS?
Try browsing to the directory via the Visual Studio command line and do a:
tf get . /force /recursive
This should forcibly recurse down from the current directory.
You have tried most of the things that I would suggest. A force-get-latest should work if it's a simple case of TFS being confused about what is on your pc.
Are the "folders" in tfs, or in your solution explorer? Folders in the solution explorer typically mirror the real disk structure, but it is possible to get files and folders in a different location in the SE than on disk. This coild mean that the files the solution explorer is referencing are not mapped into your tfs workspace.
I would check that the workspace mapping is as simple as possible (no branches or extra unneeded folders etc.), close the solution, force-get the latest version of the disk structure from the source control view, and then load the .csproj file in a text editor to check exactly what the project is referencing, to be sure that all the files exist and are in sensible places on disk.
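To see exactly what the workspace maps (and to spot stray or overlapping mappings), you can run tf workfold from a Visual Studio command prompt inside a mapped folder, or name the workspace and collection explicitly; the workspace name and collection URL below are placeholders:
tf workfold
tf workfold /workspace:MyWorkspace /collection:http://yourserver:8080/tfs/DefaultCollection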
I found the problem. I recently added a certain domain group to the TFSProject/Readers TFS group, then explicitly denied access to all rights in those two folders. It seems that although I am in the Contributor TFS group, I'm also a "Reader", so I denied access to myself.

How do you force the deletion of a TFS 2010 workspace on a client when the TFS Server no longer exists?

I currently have a TFS 2010 Server running on SERVER-1. On my client (MY-CLIENT) I have VS2010 running and have a workspace associating SERVER-1 with \MY-CLIENT\Development. All is good.
I was playing around with setting up a different instance of TFS on SERVER-2. On my client, I deleted the original SERVER-1 workspace and created a new workspace associating SERVER-2 with \MY-CLIENT\Development. All is good.
Having finished my experiments with TFS on SERVER-2, I re-imaged the machine (deleting the TFS Server on SERVER-2).
I then went back to my client machine, reconnected to TFS on SERVER-1, and attempted to remap source control to my Development folder. However, I am now receiving the error "The path \MY-CLIENT\Development is already mapped in workspace MY-CLIENT;SERVER-2\Steve." Now I have a problem.
So, I gather from this that I should have first deleted the SERVER-2 workspace BEFORE re-imaging the machine. Unfortunately, I did not do that.
Poking around in some forums, I realized that I could perhaps use a command-line tool to delete it:
tf workspace /delete MY-CLIENT;SERVER-2\Steve
However, when I run this, I get a message indicating that "Team Foundation services are not available from server http://SERVER-2:8080/tfs/development."
So the question, then, is how do I force deletion of the SERVER-2 workspace on my client so that I can re-create my old SERVER-1 workspace?
The working folder mappings for all local workspaces are stored in the version control cache file. This allows TFS clients to bootstrap, locating the server information for a given local folder. In addition, it provides the information for the check you're seeing, which prevents a local folder from being mapped to two different servers.
In order to clean this up (without trying to connect to the server), you can use the tf workspaces command (note the pluralization: the workspaces command operates on the list of workspaces, while the workspace command operates on a single workspace and generally requires connectivity to the server that the workspace is located on).
To delete all workspaces for your deleted project collection, you can do:
tf workspaces /remove:* /collection:http://server-2:8080/tfs/DefaultCollection
(Obviously replacing the project collection URI with the URI for your deleted server.)
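Afterwards, a reasonable check (and it also confirms the removal didn't touch your SERVER-1 workspace) is to list the workspaces the client can see against the collection that still exists; the collection path below is a placeholder:
tf workspaces /computer:MY-CLIENT /owner:* /collection:http://SERVER-1:8080/tfs/DefaultCollection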
I had exactly the same issue: after moving the TFS server to another machine, I couldn't map to a local folder in VS2012 on the old machine because it was still associated with an old workspace that TFS denied all existence of. After many hours (and days) of searching Google and trying different things, none of which worked (including all the "tf" commands, deleting the local cache, etc.), this is how I eventually solved it:
Edit the actual TFS collection database on the TFS server using SQL Management Studio Express (e.g. "Tfs_DefaultCollection")
Look for the "dbo.tbl_Workspace" table and edit it
You should see your "ghost" workspace(s) in here
Delete the rows
All is right in the world
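For reference, roughly the same inspection can be scripted from PowerShell with Invoke-Sqlcmd (this requires the SQL Server PowerShell module; the server name and database name are placeholders, and editing the collection database directly is unsupported, so back it up and verify the rows in Management Studio before deleting anything):
Invoke-Sqlcmd -ServerInstance "TFS-SQL-SERVER" -Database "Tfs_DefaultCollection" -Query "SELECT * FROM dbo.tbl_Workspace"
Identify the ghost workspace row(s) in that output and delete only those rows.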
The workspaceowner parameter on the delete command is optional. Can you issue the delete without that parameter, or will that damage another MY-CLIENT workspace?
