The app I work on is written mainly in VB6.
Some users report that when they start up my app, a different MSI installer automatically runs and tries to repair its own installation. Often this is AutoCAD's installer, but sometimes it is another program's.
Usually this occurs every time they start the app.
What procedure can we use to diagnose why this occurs? Since it is a third party's installer that is running, we have no visibility into what it is doing.
Autodesk does have some info published on this:
Unexpected installer launches
Windows Installer displayed unexpectedly
but these do not directly provide enough information. Ideally I want to be able to completely prevent this from occurring to my end users, rather than just telling them how to avoid it or clean it up.
Your installer is acting on a directory, file, or registry key that Windows Installer knows is part of the AutoCAD installation.
First, I would turn on global Windows Installer logging. This means that any Windows Installer activity - including AutoCAD's installer - is written to an external log file (in %temp%).
How to Enable Windows Installer Logging
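For reference, if I remember the linked KB correctly, global logging boils down to a single policy value (verify against the article before relying on this sketch):
HKEY_LOCAL_MACHINE\Software\Policies\Microsoft\Windows\Installer
Logging (REG_SZ) = "voicewarmupx"
Remember to remove the value once you are done diagnosing, since global verbose logging slows down every installation on the machine.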
Next, run your installer, and let the AutoCAD installer run.
Now go to %temp% and you should find files named MSIXXXX.LOG - one for your installer, one for AutoCAD. Open these and work your way through them to identify which file or registry key the AutoCAD MSI finds missing or changed. The Application event log is also worth checking: Windows Installer records MsiInstaller warnings (event IDs 1001 and 1004) that name the product, feature, and component whose missing resource triggered the repair.
You may find WiLogUtl.exe helpful for this:
Wilogutl.exe
With any luck you will find that the directory, file, or registry key triggering the auto-repair is also in your installer. If you're really lucky, it will turn out to be an item you should not be installing anyway - perhaps a system component that would be present regardless, or something protected by Windows File Protection.
If not, you will have to look at something like RegFree COM to move files out of shared directories into your private directory and reduce registry conflicts. Also, if you are using (consuming) the Visual C++ Runtime MSMs to make your MSI, consider using the Microsoft EXE installer instead or (best of all) placing the DLLs directly in your program folder, since I've found that the MSMs can cause just this sort of problem.
With regard to Peter Cooper Jr's comment on VB6 causing self-repair: check out the heat.exe documentation for WiX. There is a special switch the tool supports to suppress extracting certain registry values that are owned by the VB6 runtime itself (and hence shouldn't be touched or updated by any other MSI): http://wixtoolset.org/documentation/manual/v3/overview/heat.html
Go down the list to the switch -svb6 and read the description to the right. (Reproduced here:)
When registering a COM component created in VB6 it adds registry
entries that are part of the VB6 runtime component:
CLSID\{D5DE8D20-5BB8-11D1-A1E3-00A0C90F2731}
Typelib\{EA544A21-C82D-11D1-A3E4-00A0C90AEA82}
Typelib\{000204EF-0000-0000-C000-000000000046}
[as well as] Any Interfaces that reference these two type libraries
Does your installer write to these keys? If so, try to exclude them - this is worth doing even if it isn't the culprit in this particular case.
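For illustration, a heat.exe command line using that switch might look like the following; the DLL name is hypothetical, and the other options (-ag, -template, -out) should be checked against the heat documentation linked above:
heat.exe file MyVb6Component.dll -svb6 -ag -template fragment -out MyVb6Component.wxs
The generated .wxs fragment should then contain the component's own COM registration but none of the VB6 runtime keys listed above.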
Other than that there is a lengthy description of what can cause Windows Installer self-repair here: How can I determine what causes repeated Windows Installer self-repair?. It is a long article because there are so many different ways self-repair can occur. The common denominator is that different installers on your system are fighting over a shared setting that they keep updating with their own values on each application launch in an endless loop.
Not really sure of my exact question, but here is the situation:
I have an application (WinForms, C# .Net) that I am developing in Visual Studio 2012. It does a lot of things but the important bit is that it needs to read files from a certain location.
In this case, the location of the files is on a server and my machine has a mapped network drive setup for accessing the files. I can manually navigate to the files with Windows Explorer fine.
I have the following line in my code which is highlighting the issue:
System.IO.File.Exists("X:\\A Folder\\a_file.txt");
And that file does exist in that location. However, this is where the problem occurs: if I build the solution and run the .exe directly from the "bin" folder (double-click), the code is fine and it finds the file. But if I run it from Visual Studio, I get a "file not found" exception.
I am putting this down to the fact that Visual Studio is running in "Administrator" mode (I forget why I needed this, but I do). This makes sense if you consider that the "administrator" account does not have the "X:\" drive mapped. However, this was never a problem until I upgraded to Windows 10 last week.
So my question is:
Does Visual Studio Administrator mode work differently in Windows 10? In this case, does it handle mapped network drives differently?
It's worth noting I upgraded from Windows 7, so I cannot confirm if this issue is also present in 8 and 8.1 or not.
And before anyone asks, let's just say it has to be a mapped drive. No UNC paths allowed!
So I have found a solution/workaround. Kind of seems like a wasted bounty now, so if someone has other suggestions that are better then please post and I will review them and award as applicable. Or even if somebody can make a more detailed version of my solution then I will award that one.
The issue is probably not specific to Visual Studio, but would occur with any application running with elevated privileges. Anyway, the solution I found is to add a registry key that enables the same shared drives to be accessible when running in administrator mode.
The registry key location is:
HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Windows\CurrentVersion\Policies\System
And the value to add is called:
EnableLinkedConnections
It should be created as a DWORD with a value of 1 (0x00000001).
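If it helps, the same value can be added from an elevated command prompt (a sketch; the usual caveats about editing HKLM apply, and a reboot, or at least logging off and on again, is typically needed before it takes effect):
reg add "HKLM\SOFTWARE\Microsoft\Windows\CurrentVersion\Policies\System" /v EnableLinkedConnections /t REG_DWORD /d 1 /f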
I checked with the machines running Windows 7 and they do NOT have this key, yet they still work fine. So I expect this isn't the only solution, but it does seem to work (no side effects noted yet). I would assume that Windows 10 has a specific setting somewhere that by default prevents mapped drives from automatically being available with "run as administrator".
For reference, I found this information here.
In fact, here is a more "official" recommendation for using this reg key.
This is unlikely to have anything to do with Windows 10, just with the configuration of your machine. What you describe is normal and covered by this KB article. There is nothing I can check for myself, so just try the recommended workarounds and follow up at superuser.com if necessary.
Different users/system tasks may be running; as such, you have the X: drive mapped, but other accounts do not. You could set up the drive mapping for the other users on your Windows installation as well. As you stated, this should not be a Windows 10-only issue; it applies to Windows 7 and later whenever elevated privileges are involved.
Maybe you could make the X: path a configurable parameter and load it at runtime, or even try using UNC paths, which resolve at runtime and do not need the drive to be mapped:
\\ServerNameOrIP\A Folder\a_file.txt
In the code, you would need:
System.IO.File.Exists("\\\\ServerName\\A Folder\\a_file.txt");
Starting Visual Studio 2015 also launches two other executables:
VsHub.exe
and
Microsoft.VsHub.Server.HttpHost.exe
Both of these take up considerable space in Task Manager.
How can I remove this "Visual Studio Hub" feature? I don't use any of Visual Studio's 'top-notch' features, including the Visual Studio Hub.
I thought I’d try to shed some light on the VS Hub and what it’s intended for. I work for Microsoft.
As sraboy mentions, the VS Hub is an out-of-proc services host that Visual Studio (and other VS shell-based products such as Blend) use in order to support multi-tool communication, better responsiveness within devenv (VS), and enable certain services to extend past the lifetime of the spawning process. The set of services currently hosted in the VS Hub includes many of the items called out in the other answers, such as roaming settings, processing of large swaths of ETL data that is rendered in the diagnostics tooling, some telemetry reporting, and extension auto update and notifications. That set of services is very likely to grow in the future though, so even if none of those services seem necessary at the moment, additional services will be hosted there in the future (i.e. it’s a pretty big hammer to disable the vshub.exe through the renaming recommendation :-).
In terms of lifetime, the vshub and host processes (i.e. Microsoft.VsHub.Server.HttpHost(64).exe) can stay running after devenv.exe closes. However, they should not keep running indefinitely. In most cases these processes will terminate within ~5 minutes of the last instance of a VS-based shell closing. So if you have an instance of VS running (devenv.exe) and an instance of Blend running (blend.exe), and you shut down devenv.exe, vshub and the associated host processes will keep running. If you then shut down blend.exe, vshub and the associated host process will still be running. After about 5 minutes from then, however, those additional processes will shut down. If you start another instance of devenv.exe within that 5 minute window, then vshub and the associated host processes will not terminate, and will keep running (basically the host processes terminate whenever they don’t receive any requests within 5 minutes, and after all of the host processes terminate, the vshub.exe process itself terminates).
Resource-wise, the vshub.exe process itself should always be relatively lithe. If it ever gets large, then that’s a bug and I’d love to know about it so we can fix it :-) The host processes, on the other hand, may get very large depending on the service that is being hosted. In particular, the diagnostics tooling works by processing ETL. ETL can be very, very, large, and as such, the host may use a lot of resources. The diagnostics team is looking at ways to reduce that, but for the moment, closing the diagnostics tool window when you don’t need it should help mitigate the problem.
In terms of online connectivity, there are three main sources in the current set of hosted services at the moment (note, this will change over time). First, as user3345048 mentions, the service that detects and auto-updates extensions runs in that process. The options that control that communication are in Tools | Options | Environment | Extensions and Updates (see the first two checkboxes). Second, roaming settings runs as a service in the VS Hub. The setting that controls this behavior is in Tools | Options | Environment | Synchronized settings (or more holistically, if you do not sign into the personalization account in the upper right hand corner of VS). Finally, the VS Hub does report telemetry. The volume of this data can be significantly reduced via the Help | Customer Feedback Options | Settings… menu item. You can also read about the kind of telemetry that Microsoft collects and how it’s used in that location.
Something no one's mentioned above...
According to my firewall log, VsHub.exe, Microsoft.VsHub.Server.HttpHost.exe, and Microsoft.VsHub.Server.HttpHostx64.exe all try to communicate online.
Addresses I saw to which there were outgoing connection attempts included 191.236.194.164 (Microsoft Azure, Wichita Kansas) and 23.102.160.172 (Microsoft Azure, Redmond Washington).
I realize "modern" software is supposed to be cloud-integrated, but...
As one who does not require anything from Microsoft Azure servers, and who is legitimately concerned with privacy and not leaking any part of what I'm working on to the outside world, I'd really like A) to have a way to choose not to run these programs, or B) be provided with settings to limit their chattiness online. Yes, the firewall blocks the connections, but that's a last resort.
Just a simple checkbox, "[ ] Contact Microsoft Azure Servers" would be nice. Whether that would mean not running the programs in question or just having them not make the online connections isn't of consequence to me. I guess from a resource perspective the former would be better as it would use fewer resources.
As a rule I wouldn't propose to change the files in an installed application's suite of files, but as I have a virtual machine environment within which I can test changes to Visual Studio 2015 without much consequence (snapshots are wonderful), I tried altering the permissions (to remove inheritance then disallow Read and Execute for Users) on these three files.
Voila, no more VsHub applications running, trying to contact remote systems.
Visual Studio comes right up. I'm not seeing a downside here.
-Noel
I am using Windows 7 x64 with Visual Studio Express 2015. I terminated the annoying processes with Task Manager, then deleted the C:\Program Files (x86)\Common Files\microsoft shared\VsHub folder. This solves the problem, but requires administrator rights.
As xakepp35 mentioned, you can delete the C:\Program Files (x86)\Common Files\microsoft shared\VsHub folder. However, I suspect that updates or other installers will likely try to re-create it.
What I did was shut down all the VS processes, take ownership of the folder (as admin), archive the folder (RAR/ZIP) as a backup in case I ever need the files back, and finally delete it. Mine is on an SSD so I want to conserve space; otherwise you could simply rename it and leave it in place.
Then, to prevent it being created again, I used an old Win 3.1 trick. Create a text file named VsHub.txt in the C:\Program Files (x86)\Common Files\microsoft shared folder, then rename it to drop the .txt extension, leaving a file named VsHub. Since the OS can't create a folder and a file of the same name in the same location, poof - the folder is inaccessible to VS and to future installers/updaters alike. If you need to allow access again in the future, simply add the .txt back onto the file and away you go.
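Once the VsHub folder is gone, the blocking file can also be created directly from an elevated command prompt, without the rename step (a sketch of the same trick):
cd /d "C:\Program Files (x86)\Common Files\microsoft shared"
type nul > VsHub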
This seems to be a communication Swiss Army knife for Visual Studio, as per sraboy's answer. It is used during debugging to display performance information about the running process, but also to send telemetry to Microsoft about the project you're working on. You can build and step through code fine with it disabled (at first glance).
Removing, renaming or blocking the vshub process creation with AV will break the performance tracing I mentioned. Losing vshub improves privacy while using Visual Studio, as it communicates with vortex.data.microsoft.com, passing information such as solution and project GUIDs along with your account ID (visible when intercepting the HTTPS data with Fiddler).
Blocking access at network level helps with privacy, but it will not address your resource usage issue. I would consider the latter as a normal overhead of running Visual Studio.
For your use case, you can probably get away with some form of disabling (blocking instantiation with your antivirus software is probably the cleanest approach), but it may support additional functionality I haven't figured out yet.
For those of you who want to preserve VSHub and still be able to use Fiddler you can setup a Filter in Fiddler with the following setting:
Request Headers > Hide if URL contains =
REGEX:localhost:\d+\/vshub\/
EDIT - you probably want to add this too:
Hosts > Show only the following Hosts: =
localhost;
in order to omit vortex.data.microsoft.com etc. requests
According to a Microsoft Program Manager commenting on the Visual Studio Blog, it's used to support multi-tool communication across the VS suite. Given how complicated Visual Studio is, I wouldn't recommend anything as harsh as xakepp35's answer (deleting it).
On my Win10 x64 machine with VS2015 running, while debugging, there are three processes with a total RAM usage of less than 150 MB. Unless you're page-thrashing on a machine with minimal RAM, that's not much to be concerned about. Given that you're running VS2015, I'd guess you have 150 MB to spare.
Until or unless you find documentation showing explicitly what the Hub is supporting, I'd recommend leaving it be. In my experience, Visual Studio installs are far too easy to break.
One of the reasons Visual Studio tries to connect online seems to be that, by default, it searches online for updates to both Visual Studio and its extensions.
Also, Visual Studio includes a version of Internet Explorer within its core so that webpages (and extensions) can be downloaded live. In other words, it acts as a browser as well, and as we all know... Microsoft is pretty keen on checking its users' data and usage of its software.
There are plenty of online functions in the menu Tools → Options.
(To be honest, I do prefer MonoDevelop even with its flaws.)
It's needed for Browser Link, the Diagnostics window, and IntelliTrace.
I sometimes need these features, but only have 8 GB of RAM. I'm usually at 90-95% usage so I created a batch file to toggle VSHub on and off by renaming the folder and creating a symlink to an empty folder with dummy files.
Shut down Visual Studio before running it.
@echo off
goto CheckVsHubRunning
:KillVsHub
echo Killing VsHub Process
taskkill /IM VsHub.exe /T /F
TIMEOUT /T 3 /NOBREAK
:CheckVsHubRunning
ver > nul
tasklist /FI "IMAGENAME eq VsHub.exe" | find /I /N "VsHub.exe"
if "%ERRORLEVEL%"=="0" goto KillVsHub
if "%ERRORLEVEL%"=="1" echo VsHub is not running.
echo.
PUSHD "C:\Program Files (x86)\Common Files\microsoft shared"
IF NOT EXIST "VsHub.original" (
echo Renaming Original VsHub folder.
RENAME "VsHub" "VsHub.original"
)
IF NOT EXIST "VsHub.dummy" (
echo Creating Dummy Folder and Contents
mkdir "VsHub.dummy"
copy NUL > "VsHub.dummy\1.0.0.0"
copy NUL > "VsHub.dummy\ServiceModules"
mkdir "VsHub.dummy\dummy"
)
IF EXIST "VsHub\dummy" (
echo ENABLING VsHub
echo.
rmdir VsHub
mklink /d VsHub VsHub.original
) ELSE (
echo DISABLING VsHub
echo.
rmdir VsHub
mklink /d VsHub VsHub.dummy
)
echo.
pause
On my machine VSHub and its cronies usually use:
VsHub.exe: 50 MB initially. 250-350 MB after 2+ hrs
Microsoft.VsHub.Server.HttpHost.exe: 200 MB initially. 350+MB after 1+ hrs
Microsoft.VsHub.Server.HttpHostx64.exe: 320 MB initially. 550+MB after 1+ hrs
This frees up over 1 GB of RAM with hardly any functionality lost.
"VsHub" should be renamed to "SmartMobileCloud";
that's how stupidTrendy it is. I dumped it;
my VisualC editing/debugging wasn't harmed.
After installing VisualStudio, remove unUsed extensions, do the
"C:\Program Files (x86)\Common Files\Microsoft Shared\ - Deleted - VsHub"
fix... and put a "VsHub" text file there ( no ".TXT" ),
so nothing can recreate the folder.
As a developer, tools that store configuration/options in the registry are the bane of my life. I can't easily track changes to those options, can't easily port them from machine to machine, and it all makes me really yearn for the good old days of .INI files...
When writing my own applications, what - if anything - should I choose to put in the registry rather than in old-fashioned configuration files, and why?
Originally (Win 3.0) configuration was stored in the WIN.INI file in the Windows directory.
Problem: WIN.INI grew too big.
Solution (Win 3.1): individual INI files in the same directory as the program.
Problem: That program may be installed on a network and shared by many people.
Solution (Win 3.11): individual INI files in the user's Windows directory.
Problem: Many people may share a Windows folder, and it should be read-only anyway.
Solution (Win95): Registry with separate sections for each user.
Problem: Registry grew too big.
Solution (WinXP): Large blocks of individual data moved to user's own Application Data folder.
Problem: Good for large amounts of data, but rather complex for small amounts.
Solution (.NET): small amounts of fixed, read-only data stored in .config (XML) files in the same folder as the application, with an API to read them. (Read/write or user-specific data stays in the registry.)
Coming at this from both a user perspective and a programmer's perspective, I would have to say there really isn't a good excuse to put something in the registry unless it is something like file associations or machine-specific settings.
I come from the school of thought that says that a program should be runnable from wherever it is installed, that the installation should be completely movable within a machine, or even to another machine and not affect the running of it.
Any configurable options, or required dlls etc, if they are not shared should reside in a subdirectory of the installation directory, so that the whole installation is easily moved.
I use a lot of smaller utility-like programs, so if it can't be installed on a USB stick, plugged into another machine and just run, then it's not for me.
When - You are forced to due to legacy integration or because your customer's sysadmin says "it shall be so" or because you're developing in an older language that makes it more difficult to use XML.
Why - Primarily because the registry is not as portable as copying a config file that is sitting next to the application (and is called almost the same).
If you're using .NET 2.0+ you've got App.config and user.config files, and you don't need to register DLLs in the registry, so stay away from it.
Config files have their own issues (see below), but these can be coded around and you can alter your architecture.
Problem: Applications needed configurable settings.
Solution: Store settings in a file (WIN.INI) in the Windows folder - use section headings to group data (Win3.0).
Problem: WIN.INI file grew too big (and got messy).
Solution: Store settings in INI files in the same folder as the application (Win3.1).
Problem: Need user-specific settings.
Solution: Store user settings in user-specific INI files in the user's Windows directory (Win 3.11) or in user-specific sections in the application INI file.
Problem: Security - some application settings need to be read-only.
Solution: Registry with security as well as user-specific and machine-wide sections (Win95).
Problem: Registry grew too big.
Solution: User-specific registry moved to user.dat in the user's own "Application Data" folder and only loaded at login (WinNT).
Problem: In large corporate environments you log onto multiple machines and have to set EACH ONE up.
Solution: Differentiate between local (Local Settings) and roaming (Application Data) profiles (WinXP).
Problem: Cannot xcopy deploy or move applications like the rest of .Net.
Solution: APP.CONFIG XML file in the same folder as the application - easy to read, easy to manipulate, easy to move, can track if changed (.Net1).
Problem: Still need to store user-specific data in a similar (i.e. xcopy deploy) manner.
Solution: USER.CONFIG XML file in user's local or roaming folder and strongly-typed (.Net2).
Problem: CONFIG files are case-sensitive (not intuitive to humans), require very specific open/close "tags", connection strings cannot be set at run-time, setup projects cannot write settings (as easily as registry), cannot easily determine user.config file and user settings are blown with each new revision installed.
Solution: Use the ITEM member to set connection strings at runtime, write code in an Installer class to change the App.Config during install and use the application settings as defaults if a user setting is not found.
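As a rough illustration of the run-time connection string part, here is a minimal C# sketch using System.Configuration; "Main" is a hypothetical connectionStrings entry in App.config. It uses the OpenExeConfiguration route, which is one way to persist a run-time override (not necessarily what the "ITEM member" above refers to), so treat it as a starting point rather than the only approach:
using System;
using System.Configuration; // add a reference to System.Configuration.dll

class ConfigDemo
{
    static void Main()
    {
        // Open the MyApp.exe.config that sits next to the executable.
        Configuration config =
            ConfigurationManager.OpenExeConfiguration(ConfigurationUserLevel.None);

        // "Main" is a hypothetical <connectionStrings> entry defined in App.config.
        ConnectionStringSettings main = config.ConnectionStrings.ConnectionStrings["Main"];
        Console.WriteLine("Current: " + main.ConnectionString);

        // Override it at run time and persist the change back to the file
        // (writing under Program Files may require elevation).
        main.ConnectionString = "Data Source=OTHERSERVER;Initial Catalog=AppDb;Integrated Security=True";
        config.Save(ConfigurationSaveMode.Modified);
        ConfigurationManager.RefreshSection("connectionStrings");
    }
}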
Microsoft policy:
Before Windows 95, we used INI files for application data.
In the Windows 95 - XP era, we used the registry.
From Windows Vista on, we use files again, although they are now XML-based.
The registry is machine-dependent. I have never liked it because it gets slow and it is almost impossible to find the thing you need. That's why I like simple INI or other settings files. You know where they are (the application folder or a user folder), so they are easily portable and human-readable.
Is the world going to end if you store a few window positions and a list of most recently used items in the Windows registry? It's worked okay for me so far.
HKEY_CURRENT_USER is a great place to store trivial user data in small quantities. That's what it's for. It seems silly not to use it for its intended purpose just because others have abused it.
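For what it's worth, storing that sort of trivial per-user data from C# is only a few lines with Microsoft.Win32.Registry; the subkey and value names below are just examples:
using Microsoft.Win32;
using System.Drawing;

static class WindowPlacementStore
{
    const string KeyPath = @"Software\MyCompany\MyApp"; // hypothetical vendor/app names

    public static void Save(Point location)
    {
        // Creates the key under HKEY_CURRENT_USER if it doesn't exist yet.
        using (RegistryKey key = Registry.CurrentUser.CreateSubKey(KeyPath))
        {
            key.SetValue("WindowLeft", location.X, RegistryValueKind.DWord);
            key.SetValue("WindowTop", location.Y, RegistryValueKind.DWord);
        }
    }

    public static Point Load()
    {
        using (RegistryKey key = Registry.CurrentUser.OpenSubKey(KeyPath))
        {
            if (key == null)
                return new Point(100, 100); // defaults when nothing has been saved yet

            return new Point((int)key.GetValue("WindowLeft", 100),
                             (int)key.GetValue("WindowTop", 100));
        }
    }
}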
Registry reads and writes are thread-safe but files are not. So it depends on whether or not your program is single-threaded.
Settings that you want to have available in a user's roaming profile should probably go in the registry, unless you actually want to go to the effort of looking for the user's Application Data folder by hand. :-)
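For reference, locating those folders from .NET is straightforward; a minimal sketch (the subfolder names are hypothetical):
using System;
using System.IO;

class AppDataPaths
{
    static void Main()
    {
        // Roaming profile folder (follows the user between machines on a domain).
        string roaming = Environment.GetFolderPath(Environment.SpecialFolder.ApplicationData);
        // Machine-local equivalent that does not roam.
        string local = Environment.GetFolderPath(Environment.SpecialFolder.LocalApplicationData);

        // "MyCompany\MyApp" is a hypothetical subfolder for your own settings files.
        Console.WriteLine(Path.Combine(roaming, @"MyCompany\MyApp"));
        Console.WriteLine(local);
    }
}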
If you are developing a new app and you care about portability, you should NEVER store data in the Windows registry, since other operating systems don't have a (Windows) registry (duh note - this may be obvious, but it often gets overlooked).
If you're only developing for Windows platforms... try to avoid it as much as possible. Config files (possibly encrypted) are a far better solution. There's no gain in storing data in the registry - isolated storage is a much better solution, for example, if you're using .NET.
Slightly off-topic, but since I see people concerned about portability, the best approach I've ever used is Qt's QSettings class. It abstracts the storage of the settings (registry on Windows, XML preference file on Mac OS and Ini files on Unix). As a client of the class, I don't have to spend a brain cycle wondering about the registry or anything else, it Just Works (tm).
http://doc.trolltech.com/4.4/qsettings.html#details
Personally I have used the registry to store install paths for use by the (un)install scripts. I'm not sure if this is the only possible option, but seemed like a sensible solution. This was for an app that was solely in use on Windows of course.
Usually, if you don't put settings in the registry, you use it mostly to read current Windows settings, change file associations, etc.
Now, if you need to detect whether your software is already installed, you can make a minimal entry in the registry - that's a location you can reliably find again from any configuration. Or search for a folder of a given name in Application Data.
If I look at my Documents and Settings folder, I see a lot of software using the Unix dot notation for settings folders:
.p4qt
.sqlworkbench
.squirrel-sql
.SunDownloadManager
.xngr
.antexplorer
.assistant
.CodeBlocks
.dbvis
.gimp-2.4
.jdictionary
.jindent
.jogl_ext (etc.)
and in Application Data, various folders with editor or software names. This looks like the current trend, at least among portable applications...
WinMerge uses a slightly different approach, storing data in registry, but offering Import and Export of options in the config dialog.
I believe the Windows registry was a good idea, but because of great abuse from application developers, and because Microsoft did not encourage or mandate standard policies, it grew into an unmanageable beast. I hate using it for the reasons you've mentioned; there are, however, some occasions where it makes sense:
Leaving a trace of your application after your application has been uninstalled (e.g. remember user's preferences in case the application is installed again)
Share configuration settings between different applications - components
In .NET there really is NOT ever a need.
Here are 2 examples that show how to use project properties to do this.
These examples use user-scoped project properties (settings), but the same can be done with application-scoped settings as well.
More here:
http://code.msdn.microsoft.com/TheNotifyIconExample
http://code.msdn.microsoft.com/SEHE
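In outline, the approach looks something like this. It assumes a user-scoped setting named WindowLocation (type System.Drawing.Point) has been added on the project's Settings tab; Properties.Settings is the class Visual Studio generates from that designer, so this is a sketch rather than standalone code:
using System.Drawing;
using System.Windows.Forms;

class MainForm : Form
{
    public MainForm()
    {
        // Restore the value saved last time; user-scoped settings live in user.config.
        StartPosition = FormStartPosition.Manual;
        Location = Properties.Settings.Default.WindowLocation;
    }

    protected override void OnFormClosing(FormClosingEventArgs e)
    {
        base.OnFormClosing(e);

        Properties.Settings.Default.WindowLocation = Location;
        Properties.Settings.Default.Save(); // persisted per user, no registry involved
    }
}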
(late to the discussion but) Short Answer: Group Policy.
If your customer's IT department wants to enforce settings related to Windows or the component(s) you're writing or bundling in, such as a link speed, or a custom error message, or a database server to connect to, this is still typically done via Group Policy, which makes its ultimate manifestation as settings stored in the registry. Such policies are enforced from the time Windows starts up or the user logs in.
There are tools to create custom ADMX templates that can map your components' settings to registry locations, and give the administrator a common interface to enforce policies (s)he needs to enforce while showing them only those settings that are meaningful to enforce this way.
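To make that concrete, the consuming application typically just reads the Policies branch of the registry at startup and falls back to its ordinary preference when no policy is set. A hedged C# sketch, with entirely hypothetical key and value names:
using Microsoft.Win32;

static class PolicyAwareSettings
{
    public static string GetDatabaseServer()
    {
        // A machine policy pushed by the IT department (via an ADMX template) wins...
        string policyValue = Registry.GetValue(
            @"HKEY_LOCAL_MACHINE\SOFTWARE\Policies\MyCompany\MyComponent", // hypothetical
            "DatabaseServer",
            null) as string;
        if (!string.IsNullOrEmpty(policyValue))
            return policyValue;

        // ...otherwise fall back to the ordinary per-user preference.
        return Registry.GetValue(
            @"HKEY_CURRENT_USER\Software\MyCompany\MyComponent",
            "DatabaseServer",
            null) as string ?? "localhost";
    }
}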
We just moved from storing all files locally to a network drive. The problem is that that is also where my VS projects are now stored. (No versioning system yet; we're working on that.) I know I've heard of problems with doing this in the past, but never heard of a workaround. Is there a workaround?
So my VS is installed locally. The files are on a network drive. How can I get this to work?
EDIT: I know what SHOULD be done, but is there a band-aid I can put on right now to fix this and maintain the network drive?
EDIT 2: I am sure I am not understanding something, but Bob King has the right idea. I'll work with the lead web developer when he gets back into the office to figure out a temporary solution until we get some sort of version control setup. Thanks for the ideas.
While we do use source control, we also run all our projects from network drives (not shared directories - private directories on network drives). The network drives are backed up nightly, and also use Volume Shadow Copy, so if you need to revert to something before it made its way to source control, you can.
To get projects to run correctly with the right permission, follow these steps.
Basically, you've just got to map the shared directory to a drive, and then grant permission, based on that URL, to all code. Say you map it to "N:\"; then use "N:\*" as your URL pattern. It isn't obvious that you need the wildcard, but you do.
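Combining that with the CasPol syntax quoted further down in this thread, the command would look something like this (a sketch, assuming the share is mapped to N: and a .NET 2.0 project; adjust the framework folder for other versions):
C:\Windows\Microsoft.NET\Framework\v2.0.50727\CasPol -m -ag 1 -url file://N:/* FullTrust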
The question is rather generic so I'll give an answer to one issue I was facing.
I run Visual Studio 2010 in a Parallels virtual machine on my Mac while keeping all my projects on the Mac side via a network share. Visual Studio, however, wouldn't load the projects' assembly files from there. Trying to set the rights using "caspol" alone didn't help in my case.
What finally worked for me to allow Visual Studio to load assemblies from a network share was to edit the file
"C:\Program Files (x86)\Microsoft Visual Studio 10.0\Common7\IDE\devenv.exe.config" (assuming a default installation).
In the XML "<runtime>" section you have to add:
<loadFromRemoteSources enabled="true"/>
You may have to change the permissions on that file to allow write access. Save the file. Restart Visual Studio.
In the interests of actually answering the question, I copied this comment from jcarle.com:
Trusting Network Shares with Visual Studio 2010 / .NET Framework v4.0
January 20, 2011, 4:10 pm
If you are like me and you store all your code on a server, you will have likely learned about trusting a network share using CasPol.exe. However, when moving from Visual Studio 2008 (.NET Framework 2.0/3.0/3.5) over to Visual Studio 2010 (.NET Framework 4.0), you may find yourself scratching your head.
If you are used to using the Visual Studio Command Prompt to quickly get to CasPol, you may find that some of your projects will not seem to respect your new FullTrust settings. The reason is that, unless you are carefully paying attention, the Visual Studio Command Prompt defaults to adding the .NET Framework 4.0 folder to its path. If your project is still running under .NET Framework 2.0/3.0/3.5, it will require setting CasPol for those versions as well. Just a note, I have also personally had more success with using 1 as a code group instead of 1.2.
To trust a network share for all versions of the .NET Framework, simply call CasPol for each version using the full path as below:
C:\Windows\Microsoft.NET\Framework\v2.0.50727\CasPol -m -ag 1 -url file://YourSharePath* FullTrust
C:\Windows\Microsoft.NET\Framework\v4.0.30319\CasPol -m -ag 1 -url file://YourSharePath* FullTrust
I would not recommend doing that if you have (or even if you don't have) multiple people who are working on the projects. You're just asking for trouble.
If you're the only one working on it, on the other hand, you'll avoid much of the trouble. Performance is going to go out the window, though. As far as how to get it to work, you just open the solution file from VS. You'll likely run into security issues, but you can correct those using CASPOL. As I said, though, performance is going to be terrible. Again, not recommended at all.
Do yourself and your team a favor and install SVN or some other form of source control and put the code in there ASAP.
EDIT: I'll partially retract my comments. Bob King explains below the reason they run VS projects from a network drive and it makes sense. I would say unless you're doing it for a specific reason like Bob, stay away from it. Otherwise, get your ducks in a row before setting up such a development environment.
So I was having a similar issue. Visual Studio wouldn't recognize a network location I had mapped for a drive letter for anything. The funny thing is, it worked for a day. I set up my project and began working on it and had no issues. Then, I shut down and the next day nothing works. I couldn't read/write files in code, output my executables or anything. My project is local but my output was intended to be thrown up on the network.
Anyway, the problem is probably related to the administrator context, but one way to fix it, which I found while digging around online, is to get Visual Studio to browse to the drive in question somehow. There are plenty of ways to do this, after which VS will magically be able to recognize mapped drive letters. My solution was to go to the debug output location in the project properties, click Browse and navigate to my previously created output location on my network drive, and voila!
I wanted to put this up because I spent half a day trying to figure this out and figured it might save someone else some time. Thanks much and good luck!!!
Erik
I understand this is an older thread, but it was the best thread I found when looking to solve a similar issue: I had Visual Studio 2013 on a virtual box (running Win 8.1) and the code on the host machine (Win 7). Although I could open the solution, I could not compile. All of the other answers here relate to older software, so I am adding this answer to update this frequently found question with the solution that worked for me.
Here's what I did: I made a registry entry to be able to use a UNC path as the current directory.
WARNING: Using Registry Editor incorrectly can cause serious, system-wide problems that may require you to reinstall Windows NT to correct them. Microsoft cannot guarantee that any problems resulting from the use of Registry Editor can be solved. Use this tool at your own risk.
Under the registry path:
HKEY_CURRENT_USER\Software\Microsoft\Command Processor
add a value named DisableUNCCheck of type REG_DWORD and set it to 1 (0x1).
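Equivalently, from a command prompt (a sketch of the same change):
reg add "HKCU\Software\Microsoft\Command Processor" /v DisableUNCCheck /t REG_DWORD /d 1 /f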
WARNING: If you enable this feature and start a Console that has a current directory of an UNC name, start applications from that Console, and then close the Console, it could cause problems in the applications started from that Console.
Found this information at link: http://support.microsoft.com/kb/156276
How about we rephrase this into a question that everyone can answer? I have the exact same problem as the initial poster.
I have a copy of VB 2008 (recently upgraded from VB6). If I store my solutions on the backed up network drive, then it won't run a single thing ever. It gives "partially trusted caller" errors for accessing a module, even when "allowpartiallytrustedcallers" is set in the assembly. If I store the files on my (not backed up) C:, then it will run wonderfully, until I put it on the share drive for everyone to use, and I'm back to my same problem.
This isn't a big request. I just want to be able to put a solution and executable on the share drive and run it without an absurd amount of nonsense about security. I shouldn't have to cram all my work into form files.
Edit: I found out why it was ignoring the AllowPartiallyTrustedCallers attribute. I'm trying to reference ADODB, which doesn't allow partially trusted callers. So, no network executable can access a database? What does Microsoft have against intranets anyway?
I was facing the same issue just recently, so this answer is more for the sake of keeping track of my own knowledge. Anyway, should someone find it useful, below is the issue and the solution.
Issue:
.NET 4.0 projects, SVN repo, checkout folders on local drives, referenced assemblies built by the build server and available on a network drive. Visual Studio on Windows 7 is able to add the reference but unable to build the projects.
Solution:
Since .NET 4.0 no longer automatically provides a sandbox for network assemblies, you have to make them fully trusted via a machine.config update. http://msdn.microsoft.com/en-us/library/dd409252.aspx
I had a similar problem with opening Visual Studio projects on a network drive, and I fixed it by creating a symbolic link on my local C:\ drive that points to the UNC directory
e.g.
mklink /D "C:\Users\Self\Documents" "\\domain.net\users\self\My Documents"
then you can just open the project using the C:\Users\Self\Documents\ path, instead of the UNC path
(You have to be careful, because Visual Studio will automatically redirect you to the '\\domain.net..' path if you double click the symlink when you're browsing for the project. I had to copy paste the 'C:\Users\' path to get it to open with the drive letter path)
Don't do it. If you have source control (versioning), you do not want your files on a network drive. It totally bypasses all you want to achieve by using source control, because once your files are on a network drive, anyone can modify them .... even while you're currently building your project. Ka-boooom!
PS: this sounds like a typical case of over-engineering to me.
Are you having any specific problems?
If you allow more than one person to open the solution, your first problem will be that the .NCB file (Intellisense) will be locked exclusively and only one user will be able to browse the class tree. And of course you have the potential for one user's changes to overwrite the other user's changes.
You should be warned that some features in Visual Studio will refuse to work with a network drive.
For example, the .mdf file of a SQL Express user instance must be located on a local drive.
For another example, if you use UNC paths, you have to make sure they are short enough.
I found this helpful while trying to use VC11 with Parallels running on a Mac:
http://social.msdn.microsoft.com/Forums/en-US/toolsforwinapps/thread/2ffdcb01-c511-4961-834b-afd5f2fbb8e1, and specifically:
1) You can switch from local debugging to remote debugging and set the machine name as 'localhost'. This will do a remote deployment on your local machine (thus not using the project's directory). You don't need to install the Remote Debugger tools, nor start msvsmon for this to work on localhost.
In case this helps anyone else, I had to do the steps outlined here to add the network share location to Windows intranet zone. In particular, I was having trouble with Visual Studio hanging on load when opening a solution on a network share (i.e. using VMware Fusion and opening a solution from my Mac's hard drive). I also had problems with PostSharp running in this scenario.
If I understand you correctly, your Visual Studio project files are stored on the network drive and you are running them from there. This is what I do, and I don't have any problems. You will need to make sure that you have set the security policy. You can use CasPol to do this, or do it via the Control Panel > Administrative Tools menu.
"How can I get this to work?"
You have a couple choices:
Choice A:
1. Move all files back to your local hard drive
2. Implement some type of backup software on your machine
3. Test said backup solution
4. keep on coding
Choice B:
1. Get a copy of one of the FREE source control products and implement it.
2. Make sure it's being backed up
3. Test it
Choice C:
Use one of the many ONLINE source control repositories available. Google, SourceForge, CodePlex, something.
Well, my question would be why you are asking this. Is it not working when you are storing it on a network drive? I haven't tried this myself, and one problem I could envision would be that .NET code running from a network drive (ie. from the bin\Debug directory, also located on the network drive) would be running in a sandbox mode, unless you mess around with CASPOL (or use 3.5 SP1 which I hear has removed that obstacle).
If you have specific problems, ask about them. Never ask "Why is doing X not working?".
You're not saying if you're just one person or multiple persons accessing the same remote drive, but I'm assuming you're just one for each network directory. Is this correct? If not, no, there is no band-aid. Get version control, move the files back to a local disk.