Is there any way to retrieve a VS project I opened on my computer from a USB stick?
I don't have the project on my laptop and I lost my USB stick. I still see the project in the recent projects list on VS's start page.
Opening a project from a remote disk does not make a local copy of it; you would have had to do that manually. If you didn't make a copy and you have lost the disk, then you are unfortunately out of luck.
The shortcut in the recent projects list is just that—a shortcut. It points to the file on the remote disk. If you clicked on it, it would try to load the solution from the disk, fail to find the disk, and present you with an error message.
This is a good reason to use source control and back up your code to multiple locations.
This is a long shot, but if you have the compiled DLL and/or EXE files on your local machine, you can recover the source code (or most of it) from them using a tool like Telerik JustDecompile.
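If you'd rather script the recovery, the open-source ILSpy decompiler has a command-line front end that does the same job; a minimal sketch, with the assembly and output paths as placeholders (ilspycmd is a stand-in for JustDecompile here, not part of the original suggestion):

    # Install the ILSpy command-line decompiler as a .NET global tool,
    # then decompile the assembly into a folder of C# source files.
    dotnet tool install --global ilspycmd
    ilspycmd 'C:\path\to\MyProject.dll' -o 'C:\Recovered\MyProject'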
Related
My Visual Studio projects are located in C:\Users\MyName\Projects\ProjectName.
To make life easier (I thought), I created a file system link in the root called TFS.
(i.e. C:\TFS points to C:\Users\MyName\Projects)
I always open my projects from the link (i.e. C:\Tfs\ProjectName\ProjectName.sln) and my TFS local paths use the link.
This works fine most of the time, but sometimes Visual Studio and TFS think files are in C:\Users\MyName...
E.g. if I look at the properties of the projects in a solution, one can be in C:\Tfs and another in C:\Users. I have verified that there are no absolute paths in any solution or project files.
When this happens and I add a new file to a project, TFS becomes a real mess. TFS thinks the new file is in C:\Users and is not version controlled, but at the same time there is a file with the same name in the C:\TFS folder, so I need to resolve a conflict. I can resolve the conflict, but TFS then starts versioning the C:\Users file. I.e., the local folder for the project is C:\TFS..., but according to TFS (and Pending Changes) the new files live in C:\Users.
I have not found a way to change the local name of a file, only of a folder.
Is there a way to resolve this or should I just get rid of the link?
(It works slightly better with a TFS local workspace but the problem is still there)
tl;dr
Symlinks are funny things and because TFVC stores binding information outside of the source control folder, it may get very confused when your repository is stored in or includes them.
Details
Unlike Git, Mercurial, and Subversion, TFVC doesn't just keep the binding of disk to repository in a subfolder of the repository (in the case of a server workspace it doesn't keep this data on disk with the repository at all). It also stores it in a number of other places, namely the TFS server and your user profile.
When you look at a Subversion or Git repository, you'll find the .svn or .git folders, which contain the information about which folders on disk map to the repository.
With TFVC this information is not only stored on disk (in the case of local workspaces), but also on the server (machine name, server path, local path) and in your user profile (under AppData\Local\Microsoft\TeamFoundation\). These configurations store the full paths, and these are used to determine whether a file is under version control or not. The reason why local workspaces improve things is that they add the hidden $tf folder with some of the binding information.
Since a workspace mapping can map a folder in your repository to only one folder on disk, the use of symlinks confuses the TFVC client. You might consider this a bug, since Microsoft should be able to resolve the link (depending on the link type), but Visual Studio assumes you are not using links. Another reason why this (potentially) confuses Visual Studio is that file size and other attribute changes are not always signaled when links are used (date changed and file size may not be picked up until Visual Studio is told to refresh the solution). The read-only bit (which TFVC uses in the case of server workspaces) also behaves specially when links are used and may cause issues of undetected checkouts.
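If you want to see exactly which local paths TFVC has recorded for you, the tf.exe client can print them; a quick check from a Developer Command Prompt or PowerShell, with 'MyWorkspace' standing in for your workspace name:

    # List the workspaces the server knows for this machine, including
    # the cached local paths (the data kept in your user profile).
    tf workspaces /format:detailed

    # Show the folder mappings of one workspace; the local paths printed
    # here are what TFVC compares file locations against.
    tf workfold /workspace:MyWorkspace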
More on the strange edge cases caused by links can be found here.
I'm not sure why you'd want to use a link in this case; the sources are already stored in TFS, so a backup of your profile doesn't add much and only makes your system slower if you have a roaming profile. Plus, workspaces are machine-bound and should never "move between machines" magically anyway.
You can submit a suggestion on the Visual Studio User Voice, or file a bug on Connect if you want to see this behavior changed, but to solve your problem, use normal folders and map your files to a unique location.
Just keep in mind that very few applications in Windows are built to handle symlinks, and even those that are may behave strangely. Windows Explorer (file open dialogs and drag & drop) may report the original file location instead of the link location for certain actions, and changes to attributes in one location may not be visible in the linked location.
As you can see here, systems are able to tell the difference between symlinks and real directories, and thus may act on that knowledge.
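You can see this from PowerShell yourself; a small demo, assuming a link like the one in the question (C:\TFS pointing at C:\Users\MyName\Projects):

    # Create the directory symlink (usually requires an elevated prompt).
    New-Item -ItemType SymbolicLink -Path 'C:\TFS' -Target 'C:\Users\MyName\Projects'

    # The link carries the ReparsePoint attribute, unlike the real
    # directory, so any application that checks attributes can tell
    # them apart and act differently.
    (Get-Item 'C:\TFS').Attributes      # Directory, ReparsePoint
    (Get-Item 'C:\TFS').LinkType        # SymbolicLink
    (Get-Item 'C:\Users\MyName\Projects').Attributes    # Directory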
Related
I have suddenly begun encountering an error similar to "E:\Websites\Stage\mywebsite\somefile.ascx: Access to the path is denied" on a multitude of my local files when attempting to check them in. The files it is failing on are all sorts: PNG, ASPX, CONFIG, etc.
I am using Visual Studio 2013 for Web (Update 4) and the visualstudioonline.com TFS.
The files are stored on a network location and I have a drive mapped to that location. I can manually open, manipulate, and save any of the files that error, so I do not believe it is truly a permissions issue.
This setup has worked for months but suddenly it is giving me problems.
I ran a PowerShell command on the folder:

    Get-ChildItem -Include *.* -Recurse -Path 'E:\Websites\Stage' | select fullname,isreadonly

and all the files return 'False' in the IsReadOnly column. No errors are returned.
I am in need of some further ideas.
I found a workaround in another StackOverflow question.
Essentially, you shelve the pending changes and then check them in; there is no need to unshelve them first.
I would only suggest using that to check in your changes until you set up another workspace locally (or someone fixes the issue).
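The same workaround can be driven from the command line if that's easier to repeat; a sketch, assuming tf.exe is on your PATH and the shelveset name is arbitrary:

    # Shelve the pending changes (local changes are kept by default)...
    tf shelve TempWorkaround /noprompt

    # ...then check in as usual.
    tf checkin /noprompt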
Like many others, I was using Visual Studio 2013 from within a VM, with a local workspace located on the host computer and mapped through a shared drive; this worked well before updating to VS2013 Update 4.
That setup was suggested to me with the reasoning that if the VM crashes, I wouldn't lose my changes.
Storing your local workspace on a network location is not supported and should never be done.
Have a 'local' (physically on your local machine) workspace where you edit the files and check in. Then have an automated build that publishes the files to a location of your choice.
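A minimal sketch of that publish step, assuming a plain file-copy deployment; both paths are placeholders for your own:

    # Mirror the local workspace output to the staging share, skipping
    # version-control metadata folders.
    robocopy 'C:\Work\mywebsite' 'E:\Websites\Stage\mywebsite' /MIR /XD '$tf' .git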
I ran Windows/Visual Studio in Parallels on a Mac and had a project saved to my desktop (yes, shame on me). Internally this path is handled as \\psf\Home\Desktop even though it is stored locally and not on the network. It still gives the same exception, and the fix is to move the project to your regular drive (c:\...).
Related
I've installed the Visual Studio 2013 Preview to try it out, and I'm having some very bad performance issues. Every time I open a file and immediately try to close it, edit a file, save a file, etc., the IDE stops responding for about 15 seconds.
I've gone through every performance tweak I could find on Stack Overflow, blogs, web searches, etc., but none have worked (for example, cleaning up temp folders, disabling add-ins and extensions, deleting the .suo file).
Using /safemode, the performance problems go away, but I can't find what could be different, since I have no add-ins, NuGet packages, or extensions installed.
Using SysInternals Process Explorer, I can only see the process for devenv.exe peg the core it's using at 100% when it stops responding. I am not seeing any network or hard drive activity during this time and no other processes become active.
I've reinstalled with no luck, and I've installed it on another development machine where it seems to work just fine.
Anyone have any ideas?
Thanks!
UPDATE: In Process Explorer 'Other I/O Delta' shows ~200,000/sec when it locks up on the devenv.exe process. Still looking...
UPDATE 2: I guess I should add that this PC is a Dell Vostro 460, i7-2600 @ 3.4GHz, 8GB RAM, Windows 7, 1TB HDD with 550GB free, plenty of power for what I'm doing. I closed all other apps while debugging, including VIPRE A/V and Malwarebytes.
UPDATE 3: Maybe getting closer... using Process Monitor (love SysInternals stuff!) for some reason my entire C:\Projects\ folder is being parsed/searched by devenv.exe. I keep all my project folders under C:\Projects\ where there are about 20 projects each with their own sub-folder. Here's where it gets weird. In /safemode, devenv only parses the current project's folder, not the entire parent folder. Projects has 6,271 folders with 29,914 files. I tried creating a new c:\Projects2013\ folder, created a new test project, and devenv is trying to parse the full parent Projects2013 folder, yet in /safemode only parses Projects2013\Sample.
Obviously though the new project in Projects2013 runs full speed because it's parsing far fewer files. The other computer runs fine because I left the default Projects path and there were no other projects in that folder. Now what in the world could be doing this and why the different folder path between regular and safe mode? Time to dig through Tools, Options... ugh!
Final Update - Resolved! It was git causing the problem. I had a local repository at c:\Projects\ which contained all my various project sub-folders. The debug dump file I created for the VS programmers allowed them to narrow it down to git. Removing the local repository fixed my performance issue, and VS 2013 is at least usable now. The VS programming team still needs to resolve the continuous re-parsing of the folder, though; anyone with a very large repository will end up with this issue.
Related
Is it possible to automatically check a file into Visual SourceSafe after the local (working) copy has been changed? Our current process involves editing our code on Windows computers that are running VSS Explorer; after a check-in, VSS shadow-copies the files to the development Linux server. We're spending a lot of time manually checking the files back in through VSS and would like to have it push every time we save the files.
Thanks,
Why not write a small app that monitors the folder for updated files and, when they are updated, opens your VSS connection and checks them in via a specified or hard-coded path? Just an idea, though. I believe you can use something such as FileSystemWatcher: http://msdn.microsoft.com/en-us/library/system.io.filesystemwatcher.aspx
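A rough sketch of that idea in PowerShell, assuming the VSS command-line client (ss.exe) is on your PATH; the local folder, VSS project path, and check-in flags shown are placeholders you would adapt:

    # Watch the local working folder for saves.
    $watcher = New-Object System.IO.FileSystemWatcher 'C:\Work\MyProject'
    $watcher.IncludeSubdirectories = $true
    $watcher.EnableRaisingEvents = $true

    # On every change, shell out to ss.exe to check the file in. Note
    # that editors often raise several Changed events per save, so a
    # real version would debounce these.
    Register-ObjectEvent $watcher Changed -Action {
        $relative = $Event.SourceEventArgs.Name -replace '\\', '/'
        & ss.exe Checkin "`$/MyProject/$relative" -I-Y "-CAuto check-in on save"
    } | Out-Null

    Write-Host 'Watching for changes; press Ctrl+C to stop.'
    while ($true) { Start-Sleep -Seconds 1 }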
Related
I made some code changes (in VS 2010) and did not check in or shelve my latest changes. Unfortunately, my laptop had a heating problem and I can't use it for another week (it's going in for service). I have the hard drive with me, though, so I can probably access the code from another machine. But how do I do it without messing up the state in TFS? If I copy-paste all my code from the hard drive onto the other machine, VS will check out all my files, right? Is there a better way to do this?
Thanks!
I would check out the files that you think have changed (you can use a tool like WinMerge to see which files differ), copy the updated files from the old drive, and then shelve or check them back in.
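If you don't have WinMerge handy, a few lines of PowerShell can produce the same list; a sketch, with both roots as placeholders (the old drive and a fresh "get latest" from TFS):

    # Compare each file on the old drive against its counterpart in a
    # clean workspace and print the ones that differ or are new.
    $oldRoot = 'E:\OldDrive\Project'
    $newRoot = 'C:\Work\Project'
    Get-ChildItem -Recurse -File $oldRoot | ForEach-Object {
        $rel  = $_.FullName.Substring($oldRoot.Length)
        $peer = Join-Path $newRoot $rel
        if (-not (Test-Path $peer) -or
            (Get-FileHash $_.FullName).Hash -ne (Get-FileHash $peer).Hash) {
            $rel
        }
    }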
If the files are in a solution and you have the complete structure on your disk, try the "Go Online" feature when you open the solution.
Use "File => Source Control => Go Online", or go to another machine and map your workspace to the disk with the changed source code.
(I would create a temporary private workspace for that situation and delete it after a successful merge/check-in; see the sketch below.)
I suppose that if you open the solution and it has a valid source control binding, you get a message that your solution is ready to go online.
If this dialog appears, select "Yes"; all changes are correctly detected and you can merge/check in the normal way.
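A sketch of that temporary-workspace route with tf.exe; the collection URL, workspace name, and paths are all placeholders:

    # Create a throw-away workspace and map it onto the folder that
    # holds the recovered files.
    tf workspace /new TempRecovery /noprompt /collection:http://tfs:8080/tfs/DefaultCollection
    tf workfold /map '$/MyProject' 'C:\RecoveredCode\MyProject' /workspace:TempRecovery

    # After merging and checking in, remove it again.
    tf workspace /delete TempRecovery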
On the new machine, get the latest code from the branch (the version you previously had checked out from TFS), then copy the files from the hard disk over the TFS version.