I've recently installed ClojureBox on a Windows 7 machine after using it on a different, XP machine for a while. When I created and saved a file, it wasn't being saved where I expected, but to the \Users\xxxx\AppData\Local\VirtualStore directory. This happened whenever I wasn't running Emacs as the local administrator.
A Google search returned only a couple of hits, with nothing I could really apply other than running Emacs as a local admin.
Is there any other way to get around this? Is there a Windows setting, or something I could configure in Emacs?
Thanks.
You can right-click Emacs and choose "Run as administrator", which I expect will get annoying quickly. Further, if you launch other apps from inside it, you might be misled about the behaviour of those apps under normal circumstances. A better approach would be to save your files somewhere other than under Program Files or the root of C:, thus avoiding virtualization.
Not really sure of my exact question, but here is the situation:
I have an application (WinForms, C# .NET) that I am developing in Visual Studio 2012. It does a lot of things, but the important bit is that it needs to read files from a certain location.
In this case, the location of the files is on a server, and my machine has a mapped network drive set up for accessing the files. I can manually navigate to the files with Windows Explorer fine.
I have the following line in my code which is highlighting the issue:
System.IO.File.Exists("X:\\A Folder\\a_file.txt");
And that file does exist in that location. However, this is where the problem occurs: if I build the solution and run the .exe directly from the "bin" folder (double-click), the code is fine and it finds the file. But if I run it from Visual Studio, I get a "file not found" exception.
I am putting this down to the fact that Visual Studio is running in "Administrator" mode (I forget why I needed this, but I do). This makes sense if you consider that the "administrator" account does not have the "X:\" drive mapped. However, it was never a problem until I upgraded to Windows 10 last week.
So my question is:
Does Visual Studio Administrator mode work differently in Windows 10? In this case, does it handle mapped network drives differently?
It's worth noting that I upgraded from Windows 7, so I cannot confirm whether this issue is also present in 8 and 8.1.
And before anyone asks, let's just say it has to be a mapped drive. No UNC paths allowed!
So I have found a solution/workaround. It kind of seems like a wasted bounty now, so if someone has better suggestions, please post them and I will review them and award as applicable. Or, if somebody writes a more detailed version of my solution, I will award that one.
The issue is probably not specific to Visual Studio, but would occur with any application running with elevated privileges. Anyway, the solution I found is to add a registry value that makes the same mapped drives accessible when running in administrator mode.
The registry key location is:
HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Windows\CurrentVersion\Policies\System
And the value to add is called:
EnableLinkedConnections
It should be created as a DWORD set to 1 (0x00000001).
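If you prefer to set it from code rather than with regedit, here is a minimal sketch in C# (my own illustration, not something from the linked technote; it has to run elevated because it writes under HKEY_LOCAL_MACHINE):

using Microsoft.Win32;

// Create the EnableLinkedConnections DWORD value (requires elevation).
using (RegistryKey key = Registry.LocalMachine.OpenSubKey(
    @"SOFTWARE\Microsoft\Windows\CurrentVersion\Policies\System", writable: true))
{
    key.SetValue("EnableLinkedConnections", 1, RegistryValueKind.DWord);
}

You may need to log off or reboot before the change takes effect.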
I checked the machines running Windows 7 and they do NOT have this value, yet they still work fine. So I expect this isn't the only solution, but it does seem to work (no side effects noted yet). I would assume that Windows 10 has a specific setting somewhere that by default prevents mapped drives from automatically being available with "run as administrator".
For reference, I found this information here.
In fact, here is a more "official" recommendation for using this reg key.
This is unlikely to have anything to do with Windows 10, just with the configuration of your machine. What you describe is normal and covered by this KB article. There's nothing I can check for myself, so just try the recommended workarounds and follow up at superuser.com if necessary.
Different users/system tasks may be running. As such, you have the X: drive mapped, but others do not. You could set up the drive mapping for those additional users on your Windows installation as well. As you stated, this should not be a Windows 10-only issue; it also applies to Windows 7 and later with elevated privileges.
Alternatively, you could use a configured parameter for the X: path and load it at runtime, or even use UNC paths, which resolve at runtime and do not need the drive to be mapped, for example:
\\ServerNameOrIP\A Folder\a_file.txt
In the code, you would need:
System.IO.File.Exists("\\\\ServerName\\A Folder\\a_file.txt");
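If it has to stay flexible, a rough sketch (the "DataFilePath" setting name here is just an assumption, not something from your project) would be to read the location from App.config and fall back to the UNC path:

using System.Configuration; // reference System.Configuration.dll

// Hypothetical "DataFilePath" appSetting; fall back to the UNC path if it's not set.
string path = ConfigurationManager.AppSettings["DataFilePath"];
if (string.IsNullOrEmpty(path))
{
    path = "\\\\ServerName\\A Folder\\a_file.txt";
}

if (System.IO.File.Exists(path))
{
    // read the file here
}

That way you can point it at X:\ on machines where the mapping is visible, and at the UNC path everywhere else, without recompiling.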
I have a very old application which I've moved from computer to computer over the years. I was probably running NT or maybe even Windows 95 when I got it. It still runs fine, but I recently tried to back up some of the files I created using it and found that they are hidden. When I run the app I can read them or write them, but when I try to access them via the command line or Windows Explorer they are not found. I can see them from the Cygwin command line, but I would really prefer Explorer.
My theory is that this is because my app is so old that it is putting user data in C:\Program Files (x86)\MyApp\data rather than in some User\AppData directory, which is what more recent versions of Windows are happier with.
What I've tried:
Using attrib to remove the hidden attribute (failed with a permission issue)
Same, but running attrib in a cmd window with admin privileges (no permission error message, but the files still do not show up)
Copying using the Cygwin command line (got the unhelpful message "omitting directory `data'")
Any suggestions what I could try next? I am running Windows 7.
I would be happy with a fix that I could do once and would fix it for good (setting permissions somehow?); I would also be satisfied with a workaround like "run the following command every time you want to back files up".
Edit: I noticed something strange which may be a clue for someone more knowledgeable than I am: for files which have been modified recently (as opposed to created), doing a dir shows the file information for the old version, even though Cygwin shows the new information, and that is what I see when I read the file using the app.
This is really just an annoyance, but I'd like to see if someone has solved it.
I really love and need to have the Windows PowerToy "Command Prompt Here".
I installed it on day one on the XP laptop my new client gave me.
It works well in all folders EXCEPT those under the control of Base ClearCase.
If I create a junction (symbolic link) to such a folder and navigate to that, it does show in the pop-up menu.
I am thinking the ClearCase registry entry is overriding "Command Prompt Here", but I don't need this enough to muck about in the registry on my client's machine.
I have seen it working on snapshot views (which are really just plain files/directories on the C: drive), but not on dynamic views (which are directories mounted on the M: drive).
The ClearCase submenu shouldn't matter, except that it doesn't show up on 64-bit Windows.
From technote swg1PK36107:
ClearCase is a 32-bit application, therefore, the ClearCase and Windows Explorer integration will only work in a 32-bit Windows Explorer.
Maybe that limitation has a side effect on your own "command prompt here" plugin?
I found an alternative method that seems to work, using file associations.
These are set under Tools / Folder Options / File Types in Windows Explorer.
Reference Method #3 in the following link:
http://www.petri.co.il/add_command_prompt_here_shortcut_to_windows_explorer.htm
Now on to find the next annoyance.
We have a program whose installer checks for the existence of a config file and, if it exists, does not copy that file over (it assumes the user has modified their config file and wants to keep those modifications). Unfortunately, this is a pre-Vista application and it keeps the config file in Program Files. The problem is, if you manually wipe out the directory, then when the program re-installs, certain APIs still think the directory is there. VB6, for example, and its Browse For File dialog see the folder; however, Explorer, the cmd shell, etc. cannot see the folder. Writing over the file still leaves the old file there (to some APIs, but not to Explorer), and it cannot be removed except from within the Browse For File dialog.
What is going on with these phantom folders, and how do we delete the file so that all APIs see the same thing? Maybe it has something to do with TxF, or the indexer for search, but both the installer we use (InnoSetup) and parts of the application (the parts written in VB6) are seeing the old version of the file, and everything else sees the current version.
As Oskar Duveborn said, it's very likely that what you're seeing is Vista's virtualization behaviour.
When a machine has User Account Control (UAC) enabled, standard users and non-elevated programs aren't allowed to write to the Program Files folder. Windows instead silently redirects the writes to the appropriate subfolder of %LocalAppData%\VirtualStore (for example, C:\Users\MyUser\AppData\Local\VirtualStore).
If you browse the real folder in Explorer, you'll see the 'Compatibility Files' toolbar button, which you can use to browse the virtual store instead.
Note that this is only compatibility behaviour from Windows - your program should write to its own subfolder of %AppData%.
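As a rough illustration of that last point (the "MyApp" folder and file name below are placeholders, not anything from your program), per-user config handling would look something like this:

using System;
using System.IO;

// Keep per-user config under %AppData% instead of Program Files.
string configDir = Path.Combine(
    Environment.GetFolderPath(Environment.SpecialFolder.ApplicationData), "MyApp");
Directory.CreateDirectory(configDir);            // no-op if it already exists
string configFile = Path.Combine(configDir, "settings.xml");
File.WriteAllText(configFile, "<config />");     // writes without needing elevation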
For more information, see this TechNet Magazine article.
Dunno if I'm on the right track, but doesn't Vista virtualize %programfiles% for applications that try to write to it or are otherwise flagged as "not going about this the right way" (and hence move the writes somewhere into the user part of the filesystem instead, without telling the legacy app about it, making it kinda transparent)?
Redirected Virtual Store files are stored somewhere in %appdata% - you can also find the location by clicking the "Compatibility Files" button in Explorer when at the aliased location. You need to stop writing to %programfiles% to get rid of this behaviour, as far as I know.
Do you mean the AppData folder (C:\Documents and Settings\UserName\AppData)? I'm not on my Vista machine, but I think that's the path, and AFAIK it's not wiped after an uninstall.
The TechNet link by Ant above (accepted answer) is no longer valid. The new link is:
http://support.microsoft.com/kb/927387 - Common file and registry virtualization issues in Windows Vista
I have a setup project created by Visual Studio 2005 which consists of both a C# .NET 2.0 project and a C++ MFC project, plus the C++ runtime. It works properly when run from the main console, but when run over a Terminal Server session on a Windows XP target, the install fails in the following way:
When the Setup.exe is invoked, it immediately crashes before the first welcome screen is displayed. When invoked over a physical console, the setup runs normally.
I figured I could go back to a lab machine to debug, but it runs fine on a lab machine over Terminal Server.
I see other descriptions of setup problems over Terminal Server sessions, but I don't see a definite solution. Both machines have a nearly identical configuration except that the one that is failing also has the GoToMyPC Host installed.
Has anyone else seen these problems, and how can I troubleshoot this?
Thanks,
I had LOTS of issues with developing installers (and software in general) for Terminal Server. I hate that damn thing.
Anyway, VS Setup Projects are just .msi files and run using the Windows Installer framework.
Windows Installer drops a log file when it errors out; they're called MSIc183.LOG (swap the c183 for some random numbers and letters), and they go in your logged-in user account's temp directory.
The easiest way to find that is to type %TEMP% into the Windows Explorer address bar - once you're there, have a look for these log files; they might give you a clue.
Note - Under Terminal Server, sometimes the logs don't go directly into %TEMP% but under numbered subdirectories. If you can't find any MSIXYZ.LOG files in there, look for directories called 1, 2, and so on, and look in those.
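If digging through those folders by hand gets tedious, here is a small, hypothetical helper (my own snippet, not part of any tool mentioned here) that lists every MSI*.LOG under %TEMP%, including those numbered subdirectories:

using System;
using System.IO;

// List MSI*.LOG files under %TEMP%, including numbered per-session subdirectories.
string temp = Path.GetTempPath();
foreach (string log in Directory.GetFiles(temp, "MSI*.LOG", SearchOption.AllDirectories))
{
    Console.WriteLine("{0}  {1}", File.GetLastWriteTime(log), log);
}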
If you find a log file but can't get any clues from it, post it here. I've looked at more than I care to think about, so I may be able to help.
Before installing, drop to a command prompt and type
CHANGE USER /INSTALL
Then install your software. Once the install has completed, drop back to the command prompt and type:
CHANGE USER /EXECUTE
Alternatively, don't start the installation with a double-click; instead, go to Add/Remove Programs and select "install software" from there.
Good luck!