We just moved from storing all files locally to a network drive. The problem is that my VS projects are now stored there too. (No versioning system yet; working on that.) I know I've heard of problems with doing this in the past, but never heard of a work-around. Is there one?
So my VS is installed locally. The files are on a network drive. How can I get this to work?
EDIT: I know what SHOULD be done, but is there a band-aid I can put on right now to fix this and maintain the network drive?
EDIT 2: I am sure I am not understanding something, but Bob King has the right idea. I'll work with the lead web developer when he gets back into the office to figure out a temporary solution until we get some sort of version control setup. Thanks for the ideas.
While we do use source control, we also run all our projects from network drives (not shared directories; private directories on network drives). The network drives are backed up nightly and also use Volume Shadow Copy, so if you need to revert to something before it made its way to source control, you can.
To get projects to run correctly with the right permission, follow these steps.
Basically, you've just got to map the shared directory to a drive letter and then grant permission, based on that URL, to all code. Say you map to "N:\"; then use "N:\*" as your URL pattern. It isn't obvious that you need the wildcard, but you do.
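For example, with CasPol (a sketch, assuming the share is mapped to N: and the .NET 2.0 runtime; adjust the code group and framework version to your setup, as discussed further down):
C:\Windows\Microsoft.NET\Framework\v2.0.50727\CasPol -m -ag 1 -url "file://N:/*" FullTrust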
The question is rather generic so I'll give an answer to one issue I was facing.
I run Visual Studio 2010 in a Parallels virtual machine on my Mac while keeping all my projects on the Mac side via a network share. Visual Studio, however, wouldn't load the projects' assembly files from there. Setting the rights using "caspol" alone didn't help in my case.
What finally worked for me to allow Visual Studio to load assemblies from a network share was to edit the file
"C:\Program Files (x86)\Microsoft Visual Studio 10.0\Common7\IDE\devenv.exe.config" (assuming a default installation).
In the XML <runtime> section, you have to add:
<loadFromRemoteSources enabled="true"/>
You may have to change the permissions on that file to allow write access. Save the file. Restart Visual Studio.
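For reference, the element sits inside the existing <runtime> section, roughly like this (a trimmed sketch; the real file contains many other entries):
<configuration>
  <runtime>
    <!-- existing entries... -->
    <loadFromRemoteSources enabled="true"/>
  </runtime>
</configuration>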
In the interests of actually answering the question, I copied this comment from jcarle.com:
Trusting Network Shares with Visual Studio 2010 / .NET Framework v4.0
January 20, 2011, 4:10 pm
If you are like me and you store all your code on a server, you will have likely learned about trusting a network share using CasPol.exe. However, when moving from Visual Studio 2008 (.NET Framework 2.0/3.0/3.5) over to Visual Studio 2010 (.NET Framework 4.0), you may find yourself scratching your head.
If you are used to using the Visual Studio Command Prompt to quickly get to CasPol, you may find that some of your projects will not seem to respect your new FullTrust settings. The reason is that, unless you are carefully paying attention, the Visual Studio Command Prompt defaults to adding the .NET Framework 4.0 folder to its path. If your project is still running under .NET Framework 2.0/3.0/3.5, it will require setting CasPol for those versions as well. Just a note, I have also personally had more success with using 1 as a code group instead of 1.2.
To trust a network share for all versions of the .NET Framework, simply call CasPol for each version using the full path as below:
C:\Windows\Microsoft.NET\Framework\v2.0.50727\CasPol -m -ag 1 -url file://YourSharePath* FullTrust
C:\Windows\Microsoft.NET\Framework\v4.0.30319\CasPol -m -ag 1 -url file://YourSharePath* FullTrust
I would not recommend doing that if you have (or even if you don't have) multiple people who are working on the projects. You're just asking for trouble.
If you're the only one working on it, on the other hand, you'll avoid much of the trouble. Performance is going to go out the window, though. As far as how to get it to work: you just open the solution file from VS. You'll likely run into security issues, but you can correct those using CASPOL. As I said, though, performance is going to be terrible. Again, not recommended at all.
Do yourself and your team a favor and install SVN or some other form of source control and put the code in there ASAP.
EDIT: I'll partially retract my comments. Bob King explains below the reason they run VS projects from a network drive, and it makes sense. I would say unless you're doing it for a specific reason like Bob, stay away from it. Otherwise, get your ducks in a row before setting up such a development environment.
So I was having a similar issue. Visual Studio wouldn't recognize a network location I had mapped to a drive letter, no matter what I tried. The funny thing is, it worked for a day. I set up my project and began working on it and had no issues. Then, I shut down, and the next day nothing worked. I couldn't read/write files in code, output my executables or anything. My project is local, but my output was intended to be thrown up on the network.
Anyway, the problem is probably related to the administrator context, but one way to fix it, which I found while digging around online, is to get Visual Studio to browse to the drive in question somehow. There are plenty of ways to do this, and afterwards VS will magically recognize mapped drive letters. My solution was to go to the debug output location in the project properties, click Browse, and navigate to my previously created output location on my network drive. Voila!!!
I wanted to put this up because I spent half a day trying to figure this out and figured it might save someone else some time. Thanks much and good luck!!!
Erik
I understand this is an older thread, but it was the best one I found when looking to solve a similar issue: I had Visual Studio 2013 on a virtual machine (running Win 8.1) and the code on the host machine (Win 7). Although I could open the solution, I could not compile. All of the other answers here relate to older software, so I am adding this answer to update this frequently found question with the solution that worked for me.
Here's what I did: I made a registry entry that allows a UNC path to be used as the current directory.
WARNING: Using Registry Editor incorrectly can cause serious, system-wide problems that may require you to reinstall Windows NT to correct them. Microsoft cannot guarantee that any problems resulting from the use of Registry Editor can be solved. Use this tool at your own risk.
Under the registry path HKEY_CURRENT_USER\Software\Microsoft\Command Processor, add the value DisableUNCCheck as a REG_DWORD and set it to 0x1 (hex).
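Equivalently, from a command prompt (a sketch using the standard reg tool; it creates the value if it doesn't already exist):
reg add "HKCU\Software\Microsoft\Command Processor" /v DisableUNCCheck /t REG_DWORD /d 1 /f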
WARNING: If you enable this feature and start a Console that has a current directory of an UNC name, start applications from that Console, and then close the Console, it could cause problems in the applications started from that Console.
Found this information at link: http://support.microsoft.com/kb/156276
How about we rephrase this into a question that everyone can answer? I have the exact same problem as the initial poster.
I have a copy of VB 2008 (recently upgraded from VB6). If I store my solutions on the backed-up network drive, it won't run a single thing. It gives "partially trusted caller" errors for accessing a module, even when "AllowPartiallyTrustedCallers" is set in the assembly. If I store the files on my (not backed-up) C:, it runs wonderfully, until I put it on the share drive for everyone to use, and I'm back to the same problem.
This isn't a big request. I just want to be able to put a solution and executable on the share drive and run it without an absurd amount of nonsense about security. I shouldn't have to cram all my work into form files.
-Edit: I found out why it was ignoring the AllowPartiallyTrustedCallers attribute. I'm trying to reference ADODB, which doesn't allow partially trusted callers. So, no network executable can access a database? What does Microsoft have against intranets anyway?
I was facing the same issue just recently, so this answer is more for the sake of keeping track of my own knowledge. Anyway, should someone find it useful, below are the issue and the solution.
Issue:
.NET 4.0 projects, SVN repo, checkout folders on local drives, referenced assemblies built by a build server and available on a network drive. Visual Studio on Windows 7 is able to add the reference but unable to build the projects.
Solution:
Since .NET 4.0 no longer automatically sandboxes network assemblies, you have to make them fully trusted via a machine.config update: http://msdn.microsoft.com/en-us/library/dd409252.aspx
I had a similar problem with opening Visual Studio projects on a network drive, and I fixed it by creating a symbolic link on my local C:\ drive that points to the UNC directory
e.g.
mklink /D "C:\Users\Self\Documents" "\\domain.net\users\self\My Documents"
then you can just open the project using the C:\Users\Self\Documents\ path, instead of the UNC path
(You have to be careful, because Visual Studio will automatically redirect you to the '\\domain.net..' path if you double-click the symlink when you're browsing for the project. I had to copy-paste the 'C:\Users\' path to get it to open with the drive-letter path.)
Don't do it. If you have source control (versioning), you do not want your files on a network drive. It totally bypasses what you want to achieve by using source control, because once your files are on a network drive, anyone can modify them... even while you're currently building your project. Ka-boooom!
PS: this sounds like a typical case of over-engineering to me.
Are you having any specific problems?
If you allow more than one person to open the solution, your first problem will be that the .NCB file (Intellisense) will be locked exclusively and only one user will be able to browse the class tree. And of course you have the potential for one user's changes to overwrite the other user's changes.
You should be warned that some features in Visual Studio will refuse to work with network drives.
For example, the .mdf file of a SQL Express user instance must be located on a local drive.
For another example, if you use UNC paths, you have to make sure they are short enough.
I found this helpful while trying to use VC11 with Parallels running on a Mac:
http://social.msdn.microsoft.com/Forums/en-US/toolsforwinapps/thread/2ffdcb01-c511-4961-834b-afd5f2fbb8e1, and specifically:
1) You can switch from local debugging to remote debugging and set the machine name as 'localhost'. This will do a remote deployment on your local machine (thus not using the project's directory). You don't need to install the Remote Debugger tools, nor start msvsmon for this to work on localhost.
In case this helps anyone else, I had to do the steps outlined here to add the network share location to the Windows intranet zone. In particular, I was having trouble with Visual Studio hanging on load when opening a solution on a network share (i.e. using VMware Fusion and opening a solution from my Mac's hard drive). I also had problems with PostSharp running in this scenario.
If I understand you correctly, your Visual Studio project files are stored on the network drive and you are running them from there. This is what I do, and I don't have any problems. You will need to make sure that you have set the security policy. You can use Caspol to do this, or do it via the Control Panel > Administrative Tools menu.
"How can I get this to work?"
You have a couple choices:
Choice A:
1. Move all files back to your local hard drive
2. Implement some type of backup software on your machine
3. Test said backup solution
4. Keep on coding
Choice B:
1. Get a copy of one of the FREE source control products and implement it.
2. Make sure it's being backed up
3. Test it
Choice C:
Use one of the many ONLINE source control repositories available. Google, SourceForge, CodePlex, something.
Well, my question would be why you are asking this. Is it not working when you store it on a network drive? I haven't tried this myself, and one problem I could envision would be that .NET code running from a network drive (i.e. from the bin\Debug directory, also located on the network drive) would be running in sandbox mode, unless you mess around with CASPOL (or use 3.5 SP1, which I hear has removed that obstacle).
If you have specific problems, ask about them; don't just ask a generic "Why is doing X not working?".
You don't say whether one person or multiple people are accessing the same remote drive, but I'm assuming just one per network directory. Is that correct? If not, there is no band-aid: get version control and move the files back to a local disk.
I was just shocked when I saw this dialog pop up, because I thought I was a victim of an encryption virus.
Using Process Explorer to see which process opened the dialog, I found efsui.exe and I verified the signature to make sure it was an official Microsoft product.
Next, I googled about this and I found the command cipher /u to find encrypted files on my disk. The result was
C:\Windows\system32>cipher /u
C:\Users\XXX\Documents\Visual Studio 2015\Backup Files\Form1.Designer.vb\~AutoRecover.Form1.Designer.vb.sln: Encryption updated.
and that's confirmed when opening the folder in Explorer.
I've been using Visual Studio for a long time, but this is the first time I've opened a VB form.
Is it common for Visual Studio to encrypt files? Does it only happen for VB, or is this due to some other VS update that was installed recently? How do I turn that feature off?
I tried:
enter "encrypt" and "encryption" in Visual Studio's quick find box.
finding an answer like this one about removable drives but my drive is C: and not removable.
I suspect this is a feature that project templates/extensions can enable:
From: http://www.visualmicro.com/forums/YaBB.pl?num=1362001590/0
In summary, a number of machinations of Win32 code produced .sln recovery files, none encrypted. Creating an independent Arduino project created an encrypted .sln recovery file within a minute or so
In this case the Arduino template seems to have this particular flag (perhaps because it can store credentials: App Store IDs, private keys, etc.).
In my mind it's an excellent sign that Microsoft Visual Studio devs have thought of the possibility of leaking sensitive/confidential information in the "temporary" files that might even go roaming via the domain controller and profile replication.
As such, there doesn't seem to be a way to disable the feature (and rightly so). But you can always store your projects/application data on a filesystem that doesn't support NTFS encryption. As the name implies, this would likely be any other filesystem.¹
¹ it's not impossible that any POSIX filesystems supporting extended attributes could also support this feature on Windows, but I haven't heard of it before
Current versions of Visual Studio (e.g.: 2019) still do this, and in my case, it decided to encrypt a backup of a single icon file. There wasn't even a project open, it just decided to do this all on its own for no apparent reason. I'd definitely classify this as a bug.
What I ended up doing is disabling EFS altogether, since I don't need VS deciding to encrypt files whenever it gets the urge. You can do this using the Group Policy editor (gpedit.msc): select Computer Configuration > Administrative Templates > System > File System > NTFS and enable the "Do not allow encryption on all NTFS volumes" option.
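If you prefer the command line, the same thing can reportedly be done with fsutil (a sketch; run it from an elevated prompt, and note it takes effect after a reboot):
fsutil behavior set disableencryption 1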
Important note: If you decide to do this, make sure you decrypt any encrypted files using the cipher /d command before you disable EFS. Otherwise, you won't be able to decrypt them.
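For example, to decrypt everything under the backup folder reported by cipher /u above (a sketch; /s recurses into subdirectories):
cipher /d /s:"C:\Users\XXX\Documents\Visual Studio 2015\Backup Files"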
Not really sure of my exact question, but here is the situation:
I have an application (WinForms, C# .Net) that I am developing in Visual Studio 2012. It does a lot of things but the important bit is that it needs to read files from a certain location.
In this case, the location of the files is on a server and my machine has a mapped network drive setup for accessing the files. I can manually navigate to the files with Windows Explorer fine.
I have the following line in my code which is highlighting the issue:
System.IO.File.Exists("X:\\A Folder\\a_file.txt");
And that file does exist in that location. Here is where the problem occurs: if I build the solution and run the .exe directly from the "bin" folder (double-click), the code is fine and it finds the file. But if I run it from Visual Studio, I get a "file not found" exception.
I am putting this down to the fact that Visual Studio is running in "Administrator" mode (I forget why I needed this, but I do). Now this makes sense if you consider that the "administrator" account does not have the "X:\" drive mapped. However, this had never been a problem until I upgraded to Windows 10 last week.
So my question is:
Does Visual Studio Administrator mode work differently in Windows 10? In this case, does it handle mapped network drives differently?
It's worth noting I upgraded from Windows 7, so I cannot confirm if this issue is also present in 8 and 8.1 or not.
And before anyone asks, let's just say it has to be a mapped drive. No UNC paths allowed!
So I have found a solution/workaround. It kind of seems like a wasted bounty now, so if someone has better suggestions, please post them and I will review them and award as applicable. Or, if somebody can make a more detailed version of my solution, I will award that one.
The issue is probably not specific to Visual Studio, but would occur with any application running with elevated privileges. Anyway, the solution I found is to add a registry key that enables the same shared drives to be accessible when running in administrator mode.
The registry key location is:
HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Windows\CurrentVersion\Policies\System
And the key to add is called:
EnableLinkedConnections
It should be created as a DWORD with a value of 1 (0x00000001).
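Or, as a one-liner from an elevated command prompt (a sketch using the standard reg tool):
reg add "HKLM\SOFTWARE\Microsoft\Windows\CurrentVersion\Policies\System" /v EnableLinkedConnections /t REG_DWORD /d 1 /f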
I checked with the machines running Windows 7 and they do NOT have this key, yet they still work fine. So I expect this isn't the only solution, but it does seem to work (no side effects noted yet). I would assume that Windows 10 has a specific setting somewhere that by default prevents mapped drives from automatically being available with "run as administrator".
For reference, I found this information here.
In fact, here is a more "official" recommendation for using this reg key.
This is unlikely to have anything to do with Windows 10, just with the configuration of your machine. What you describe is normal and covered by this KB article. Nothing I can check for myself, so just try the recommended workarounds and follow up at superuser.com if necessary.
Different users/system tasks may be running. As such, you have the X: drive mapped, but others do not. You could do the drive mapping for additional users on your Windows installation as well. As you stated, this should not be a Windows 10-only issue; it also applies to Windows 7+ with elevated privileges.
Maybe you could use a configured parameter for the X: path and load it at runtime, or even try using UNC paths, which resolve at runtime and don't need the drive to be mapped:
\\ServerNameOrIP\A Folder\a_file.txt
In the code, you would need:
System.IO.File.Exists("\\\\ServerName\\A Folder\\a_file.txt");
On occasion I telecommute and need to work on the project files on my box at the office via my laptop. I bounce back and forth between two methods of doing this:
I remote into the machine at my office and work on the instance of Visual Studio on that machine through RDP.
I have the folder containing the solution files set up as a network share and load the solution on my local (laptop) install of Visual studio
These options are far from perfect. Loading the solution via network share makes for a responsive experience unless VS needs to read/write files, which happens with more actions than one might realize. The RDP method avoids the network latency of reading and writing files, but makes for a laggy experience, plus I have to reconfigure VS's layout every time I log in to accommodate a single monitor.
What I'd like to do is continue to work on my local/laptop copy of VS but figure out a way to speed it up substantially. For a variety of reasons, I do not want to have to commit to and pull from source control to share across machines.
Any suggestions on how to speed up working with Visual Studio when loading remote project files?
Storing your local files under a Dropbox installation can help with synchronizing your files, but it comes with a couple caveats:
On each machine that you synchronize, you must be able to store the files under your /Dropbox directory. This may be feasible on your laptop but not on your office box, for example.
Dropbox synchronizes at its own casual pace. This is most noticeable on its initial sync, but also imposes a short delay on subsequent synchronizations. When you open your laptop at home, you'll need to keep an eye on the taskbar icon to ensure it is "up to date" before you begin working.
Given those caveats, I love using Dropbox for my personal projects for a couple reasons:
Since I don't use source control on those projects, it is relatively easy to overwrite something you want to keep. Dropbox gives you fast access to your entire revision history, allowing you to roll back whenever needed. This does not count against my free quota.
Even when I delete or remove files, Dropbox continues to store them and allows me, years later, to undelete and recover them. Again, all of this happens without impacting my free quota.
As the OP mentioned in the question's comments, Google Drive offers a solution similar to Dropbox.
I develop on both my desktop and laptop, and I am frequently switching between them. Are there any problems that could arise from keeping a project folder in my Dropbox and always accessing/editing it from there? I'm running VS2010 on both, but W7 on one and W8 on the other.
I'm using it often, but I do experience some issues. It seems that sometimes VS and Dropbox conflict. This shows up as leftover temporary source files or as compilation errors about files being locked.
In fact, I came here while looking for how to solve them. But still, they are only a minor issue, and I have kept using it that way for a long time.
EDIT: It is not just me. See the "Visual studio 2012 and dropbox don't play nice together" question on Super User.
I'm using Dropbox to host my project, I edit and build directly on there, and I have experienced no problems, ever. Win7, VS2010, CPP. I find Dropbox simpler than, and as robust as, version control software. I'm a big fan. I should say Microsoft OneDrive once failed me, horribly, and I no longer trust it. With Dropbox, I always check the icon in the systray carefully to make sure it has finished updating before I turn my computer off.
I use both git and Dropbox, as I also switch which machine I'm working on. This way I can use source control with the rest of my team while also being able to pick up where I left off. The two PCs I sync are the ones at work and at home: both desktops, both almost always on and running Dropbox.
I rarely get conflicts, usually when a machine has been offline or something. The solution 99% of the time is to simply delete any conflicting files. Because I'm constantly up to date with git, it's fine if I ever have to delete all my local code, since I can always get it back.
So it's really for nothing other than being able to run out of work on an urgent task and then resume where I left off when I get home.
We have tremendous problems with Visual Studio (2008, if that matters) locking up and slowing down when accessing projects over a network drive. It can take several minutes to open a large Web site project through a mapped drive, and saving even a single file can take a minute or more.
I fired up Wireshark and watched the traffic. VS, it seems, requests massive amounts of files from the network -- there's an enormous amount of SMB traffic. I've done some research, and this traffic seems to stem from two situations.
VS has to have everything in its own process to provide Intellisense.
VS needs to have all the source in order to compile the project.
All the advice I've read seems to boil down to the same thing: work locally, not on a remote machine, then push your code to an integration server via source control.
This would sure solve our problems (VS is quite fast working locally), but what if you can't work locally? What if the project and the infrastructure required to run it is too large and complicated to be replicated on everyone's individual machines?
We've gone 'round this problem a couple times, and the only way we can figure to work on these projects is direct access via a mapped drive. However, the VS slowness and lockups are really becoming a problem.
One solution: we installed VS on the server and work on the projects directly on the servers via RDP. Seriously.
So, I ask:
What does everyone else do? Do you work via the network, or do you replicate projects locally? If remotely, do you suffer from VS performance issues?
We work locally and use SVN to keep all our code on the server.
I find VS 2008 quite slow working locally sometimes so I wouldn't fancy working on a network share.
Trying to compile over a network share is horribly slow in Visual Studio. Your start times will be bad as the IntelliSense database is regenerated. Each compilation has to go over the network multiple times. Linking takes forever.
If you need the output of your compilation on the network, I'd recommend compiling locally and defining a post-build command to copy the results to your share.
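For example, a post-build event along these lines would do it (a sketch using Visual Studio's built-in macros; the N:\Builds target path is hypothetical):
xcopy "$(TargetDir)*" "N:\Builds\$(ProjectName)" /Y /I /E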
If, as you say, you cannot pull everything locally, then I'd suggest your project is too big and needs to be broken up into more manageable chunks. For a multi-tier application, break it up by tier and invest in some form of continuous integration (e.g. CruiseControl) to automatically build individual pieces. That way you can work locally on a particular piece and pull the pre-built portions from CI for the other pieces of the application.
I'm not terribly surprised that using VS to load projects over a network share has performance issues. VS (in any language) is constantly getting information from files in the project. Once you start loading this over a network you're at the mercy of the underlying network connection. All lags and access issues will directly translate into VS having an issue loading file contents.
I would advise copying the solution locally and using some form of source code control to sync the project on the share.
If the code is too complicated to install on everyone's machine, then don't put it on everyone's machine. Does everyone need to have everything in order to do productive work?
I have 79 projects in my solution that I work with. Several hundred thousand lines of code. I pull my source down everyday from TFS and build it; it's a lot of code, but it's a far better solution than trying to work over a network share.
A more legitimate situation for having the source code on a share is when you have a non-Windows host on which a number of virtual Windows machines are running.
I have this exact situation: my desktop machine (the host) runs Debian, and I use VMware to run various virtual Windows machines (the guests), including one that has Visual Studio installed so that I can target Windows OSes. Having the source code on a Samba share on the host machine has the following pros:
The source is not duplicated, so there is no way to confuse different copies while working on several virtual machines at the same time.
I have full control over the source from my preferred OS.
I can turn on and off any of the virtual machines, or roll back to a snapshot, without the risk of losing changes.
I can build (etc.) from the same source on several machines without having to commit changes before the source is fully tested (reason: I have to use Subversion < 1.5).
The only problem with this setup is that Visual Studio (6,7,8,9) is painfully slow.
I have mounted the partition (on which the share lies) with "relatime", and this helps insofar as the disk activity on the share is moderate, but Visual Studio keeps the (virtual) network card occupied all the time.
Any solutions to this would be very appreciated.
I encountered similar problems every time I worked (where "work" means anything other than just copying/pasting files) over a network drive. The problem occurred with Zend Studio and Eclipse as well.
Why not use any kind of source control?
When working on Windows-based projects, I've always worked locally.
Once, at a Unix shop (AIX, IIRC), developers would work via NFS mounts and check in/check out via RCS...
I'm using VS2005 against a network share and not having any performance issues. However, it is a new server (Windows Server 2008). I don't have any other data points for VS since using it at work is relatively new for me.
However, some data points from using NetBeans for previous projects on a network share: local build time for my project was 2 minutes on Vista, on a fast dual-core AMD 64-bit machine. For a network share project, on a Server 2003 box, it was 20 minutes. Building that same project from an ancient Tablet PC (1 GHz, single core) running XP locally was around 5 minutes. Interestingly enough, the Tablet PC could build on the Server 2003 box in the same 5 minutes.
For those asking "why" on the network share. The network share is automatically backed up, archived, etc. Also, that way I can very easily look at the same projects from multiple machines without having to worry about pushing back into the repository, etc. Once you've gone to having your dev stuff on a device where you can get to it from anywhere/anything, you'll never want to do local storage again!
I have performance problems with anything over the network; networks just aren't good enough yet.
I thought it was common knowledge that disk-speed is one of the major "slowness" factors when it comes to using VS in Windows. Most dev machines I've built have had projects located on 10k RPM RAID0 drives, or at least a single 10k RPM drive. And even then it seems slow sometimes. Just the way it is, I suppose, until VS2009/VS2010 fixes it? :)
In my experience, this lag when working on a network share is 99% due to IntelliSense. Disable it and you'll see.
Disabling IntelliSense indeed dramatically speeds up saving and opening files through a UNC share:
http://blogs.msdn.com/saraford/archive/2007/12/03/did-you-know-how-to-turn-off-intellisense-by-default.aspx
But then again, as stated in other comments, you might as well use a good text editor.
I've also experienced the problems with performance mentioned above. It seems to vary from project to project, but I did find one way of speeding up performance significantly for some project types.
Following the advice in this article made a previously unusable project on a network location (it would take minutes to open one file) perform almost like a local project. The basic gist is that you need to grant FULL TRUST to the network location:
To grant permission to all your projects in your Visual Studio Projects folder located on the network, follow these steps:
1. Open the Microsoft .NET Framework 1.1 (or 2.0) Configuration tool, which you'll find under Administrative Tools in the Control Panel.
2. Expand Runtime Security Policy | Machine | Code Groups | All_Code | LocalIntranet_Zone. In the right-hand pane, click Add a Child Code Group.
3. In the dialog that follows, choose Create a new code group and fill in a Name like Visual Studio Projects.
4. Optionally, provide a Description for the Code Group. (You'll see the description when you click a Code Group in the left tree, helping you identify the various Code Groups you may have.)
5. In the Condition Type drop-down, choose URL.
6. For the URL field, type something like this: file://YourServer/My Documents/Visual Studio Projects/*
7. Under Use existing permission set, choose FullTrust (that is, if you trust your own applications; if you don't, choose a different permission set or create a new one).
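For what it's worth, the command-line equivalent of those steps should look something like this (a sketch; the -n value is just a label for the new code group):
caspol -m -ag 1.2 -url "file://YourServer/My Documents/Visual Studio Projects/*" FullTrust -n "Visual Studio Projects"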
Not sure why this works, but it made a previously unusable .NET 2.0 project perform significantly better.
Original article: http://imar.spaanjaars.com/364/how-do-i-allow-my-visual-studio-net-projects-to-run-from-a-network-location
I was having the same problem. I have a local copy of our build system, which expects certain drive letters, and was also experiencing slowness.
I have solved the problem by adding the following registry keys:
[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\Session Manager\DOS Devices]
"R:"="\\DosDevices\\D:\\devel\\build"
"S:"="\\DosDevices\\D:\\devel\\src"
Note that the double '\'s above are part of the .reg file format. When using regedit, use a single '\' throughout.
My build times were divided by 3. :)
I found the info in the Wikipedia article on the SUBST command.
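For reference, the non-persistent equivalent uses SUBST directly (a sketch; unlike the registry entries above, these mappings don't survive a reboot):
subst R: D:\devel\build
subst S: D:\devel\src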