I've tracked down an error in my logs, and am trying to reproduce it. My theory is that a file sometimes gets locked in a specific folder, and when the application (ASP.NET) tries to delete that folder it hangs.
I don't have the application running on my own machine so I'm debugging this on a remote server. But for the life of me, I can't seem to figure out a way to lock a file that prevents it from being deleted by the process.
My first thought was to map the network path to a local drive and just leave a command prompt open to that folder. Locally that always fouls up my folder deletes, but apparently SMB is a bit more robust and doesn't grant me a lock.
After that I created an infinite-loop VBScript in the folder and executed it remotely. The file was deleted out from underneath the executing code. Man!
I then tried creating a file on the server in that folder and removing all permissions. That didn't do the trick. I don't have access to the IIS settings so perhaps it's running under a privileged system account.
So: what's a free program I can quickly grab to create an exclusive lock on a file, so I can test my delete theory? Like a really, really bad Notepad clone or something.
:-)
Can't you just create a text file in the network folder and open it with MS Word, Visual Studio, or a similar program that locks it while editing?
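Alternatively, if you have a compiler handy, a few lines of C# will hold an exclusive lock until you press Enter. A minimal sketch (pass the file path as the first argument):

using System;
using System.IO;

class HoldLock
{
    static void Main(string[] args)
    {
        // FileShare.None refuses every other handle on the file,
        // including the one needed to delete it (or its folder).
        using (FileStream fs = new FileStream(args[0], FileMode.OpenOrCreate,
            FileAccess.ReadWrite, FileShare.None))
        {
            Console.WriteLine("Holding exclusive lock on " + args[0]);
            Console.ReadLine();  // keep the handle open until Enter is pressed
        }
    }
}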
I am using C# with .NET 2.0.
I am saving my program data in a file under: C:\ProgramData\MyProgramName\fileName.xml
After installing and running my application once, I uninstalled it (the uninstaller removes all the files from ProgramData), then reinstalled the application and ran it.
The strange thing is that my application started as if the files in ProgramData still existed; that is, my app had the old data even though the data file had been deleted.
When running:
File.Exists("C:\ProgramData\MyProgramName\fileName.xml")
I got "true" even though I knew for sure that the file does not exist.
It got stranger: when I ran the application as admin, the file didn't exist.
After some research, I found out that when my application runs without admin privileges, instead of getting:
C:\ProgramData\MyProgramName\fileName.xml
I get
C:\Users\userName\AppData\Local\VirtualStore\ProgramData\MyProgramName\fileName.xml
and indeed there was a file there from the previous installation (which I obviously hadn't deleted, because I didn't know it existed).
So please guide me: how can I stop this when the app runs without admin rights? I do not want any file to be created automatically in the VirtualStore folder. Please discuss all the possible ways to stop this.
First, ask yourself: does this need to be saved globally for all users?
If it doesn't have to be, save the file under Application Data instead. You can get the path with Environment.GetFolderPath(Environment.SpecialFolder.ApplicationData); it should always reliably expand to C:\Users\Username\AppData\Roaming\. Do note that this path is unique for each user, though.
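For example, a minimal sketch reusing the file name from the question:

using System;
using System.IO;

class DataPath
{
    static void Main()
    {
        // Per-user location, e.g. C:\Users\<username>\AppData\Roaming\MyProgramName
        string dir = Path.Combine(
            Environment.GetFolderPath(Environment.SpecialFolder.ApplicationData),
            "MyProgramName");
        Directory.CreateDirectory(dir);  // no-op if it already exists
        string dataFile = Path.Combine(dir, "fileName.xml");
        Console.WriteLine(dataFile);
    }
}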
If you have to, you're out of luck. There is no reliable way to store application data for all users without admin rights (or UAC) on any Windows post-XP that's not extremely hacky, like storing your data in the Public user (which may or may not be possible, I can't check right now).
An approach to solving this is to use the Environment.SpecialFolder.CommonApplicationData location, but with some very important caveats & setup.
CommonApplicationData is "the directory that serves as a common repository for application-specific data that is used by all users."
This location is described further in the MSDN documentation.
Important requirements and restrictions are given in another SO answer: https://stackoverflow.com/a/22107884/3195477
which said in part:
"The recommended solution is for your installer to create a subdirectory of C:\ProgramData for your shared storage. And that subdirectory must be given a permissive ACL by the installation program. That is what grants the desired access to all standard users."
Otherwise, programs running with standard user permissions will not all be equally able to read and write files in that location.
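As a rough sketch of that installer step (the folder name is a placeholder; this must run elevated, once, at install time, and works on .NET 2.0):

using System;
using System.IO;
using System.Security.AccessControl;
using System.Security.Principal;

class InstallStep
{
    static void Main()
    {
        string shared = Path.Combine(
            Environment.GetFolderPath(Environment.SpecialFolder.CommonApplicationData),
            "MyProgramName");  // placeholder

        // Grant Modify to BUILTIN\Users so every standard user can
        // read and write files here later, inheriting down the tree.
        DirectorySecurity acl = new DirectorySecurity();
        acl.AddAccessRule(new FileSystemAccessRule(
            new SecurityIdentifier(WellKnownSidType.BuiltinUsersSid, null),
            FileSystemRights.Modify,
            InheritanceFlags.ContainerInherit | InheritanceFlags.ObjectInherit,
            PropagationFlags.None,
            AccessControlType.Allow));

        new DirectoryInfo(shared).Create(acl);
    }
}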
I found a workaround for this issue when moving a very old Win32 app to Windows 7 and 10. The program wrote to a database on C:\Program Files..., but the OS automatically redirected the path to the VirtualStore. However, the database was required globally. By changing the compatibility mode to Windows 95 or XP SP2 and always running as administrator, the database was worked on directly in C:\Program Files\etc.
There are security implications to this, so the box was removed from all networks, its adapters disabled, etc.
I have some source code on my Mac, and in order to test I'm interested in synchronizing it with a VM containing a similar web server setup to the production environment. Therefore I need to be able to automatically copy files over to the VM every time there are changes.
I know I can use rsync to do this manually whenever a script is run but I need some way of getting it to run in the background every single time a file in a particular directory or one of its sub-directories is modified.
I know inotifywait exists on Linux machines and could solve this problem. I've also read about the FSEvents API and kqueue. However, none of these seem to be accessible from the command line and I really don't want to spend a long time making something to do this...
I guess I could use a cronjob but a minute is a pretty long time to wait to see changes on a website...
Any ideas?
I do this all the time, developing on a Windows/Linux/Mac workstation, and saving changes to a remote Linux server where they're immediately served back to my workstation's browser for testing. You've got a couple options:
You could mount the remote files locally (e.g. with sshfs: sshfs user@server:/var/www ~/remote-www) and make changes directly to them. That is, your Mac thinks the files are local, so you can edit them with your GUI editor, but when you File->Save, it actually saves the file remotely. The main downside is that you can't work when disconnected from the server.
Mount the local files remotely. This would allow you to work locally while disconnected but won't allow the test site to work when disconnected -- which may not be a big deal. This option might not be doable if you don't have the right tools/access on the remote server.
(My preference.) Use NetBeans IDE, which has a very nice "copy to remote" feature. You maintain a full copy of all files locally, and edit them directly. When you hit File->Save on a file, NetBeans will save it locally and transparently scp/ftp it to your remote server.
How about using a DVCS like git or mercurial, and having the local repo run a post-commit hook (in git, an executable .git/hooks/post-commit script) that runs the rsync and then the test itself?
I'm a bit confused about why you can't just run rsync from the same script that runs the test. If you use rsync -e ssh (e.g. rsync -az -e ssh ~/src/ user@vm:/var/www/src/) you can set up public key authentication between the VM and the Mac, and there won't be anything manual about the rsync at all.
You might be able to set up a launchd agent to do what you want for a simple setup; see the launchd.plist man page for more information about the WatchPaths key. But it looks like WatchPaths may not fire for changes within sub-directories.
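If you happen to have Mono installed on the Mac, another option (my own suggestion, not something the other answers mention) is a tiny watcher that shells out to rsync whenever anything changes. A rough sketch, with paths, user, and host as placeholders:

using System;
using System.Diagnostics;
using System.IO;

class SyncWatcher
{
    static void Main()
    {
        FileSystemWatcher w = new FileSystemWatcher("/Users/me/src");  // placeholder
        w.IncludeSubdirectories = true;
        w.Changed += delegate { Sync(); };
        w.Created += delegate { Sync(); };
        w.Deleted += delegate { Sync(); };
        w.Renamed += delegate { Sync(); };
        w.EnableRaisingEvents = true;
        Console.WriteLine("Watching; press Enter to quit.");
        Console.ReadLine();
    }

    static void Sync()
    {
        // Naive: re-syncs on every event, which is fine for a dev loop.
        // Assumes passwordless SSH keys are already set up.
        Process.Start("rsync",
            "-az --delete -e ssh /Users/me/src/ user@vm:/var/www/src/").WaitForExit();
    }
}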
Are there any FTP programs which can automatically copy (or rather 'move') the contents of a folder to a remote server? I have of course googled this but only really found one or two ancient products which look really clunky and unmaintained. I was wondering if there's a way to do this from the command line or any better solution to the base problem.
In more detail: new files get written to a folder every few hours. These new files need to be FTP'd elsewhere and then deleted. Mirroring or synchronisation systems are probably out of the picture, as we need to delete the source files once they've been successfully transferred.
If it's easier, the 'solution' could pull the files off the server (rather than the server pushing them to the client). The computers will both be Windows OS.
You could use any off-the-shelf FTP program that supports the command line, and schedule a task in Windows Task Scheduler to run every 10 minutes: check the folder, and move any files to the FTP site.
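If you'd rather not depend on a third-party client, the same job fits in a few lines of C# run from a scheduled task. A rough sketch (server, credentials, and folder are placeholders), which only deletes a file after its upload has completed without error:

using System;
using System.IO;
using System.Net;

class FtpMove
{
    static void Main()
    {
        string localDir = @"C:\Outgoing";                // placeholder
        string ftpBase = "ftp://example.com/incoming/";  // placeholder

        foreach (string path in Directory.GetFiles(localDir))
        {
            FtpWebRequest req = (FtpWebRequest)WebRequest.Create(
                ftpBase + Path.GetFileName(path));
            req.Method = WebRequestMethods.Ftp.UploadFile;
            req.Credentials = new NetworkCredential("user", "pass");  // placeholder

            byte[] data = File.ReadAllBytes(path);
            using (Stream s = req.GetRequestStream())
                s.Write(data, 0, data.Length);

            // GetResponse throws on failure, so the delete below is
            // only reached after a successful transfer.
            using (FtpWebResponse resp = (FtpWebResponse)req.GetResponse())
                Console.WriteLine("{0}: {1}", path, resp.StatusDescription);

            File.Delete(path);
        }
    }
}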
In the end I used a program called FTP Auto Sync: http://ftp-auto-sync.com/
How do I list files with edit locks on a network drive using a shell and associated tools?
I think net file has to be run on the server, and I'm looking to do this from any box on the drive.
"Display all the open shared files on a server and the lock-id NET FILE
Close a shared file (disconnect other users and remove file locks) NET FILE id /CLOSE"
And this was tested on an XP workstation. It operates fine, but I am not sure how UAC would affect it.
This is a strange one to me. Let me list the setup:
Application with a manifest (i.e. won't get pushed to the Virtual Store)
UAC is turned on (can't write to other Program Files directories or other protected areas, and the UAC prompt appears)
Can write to "c:\program files\%app_name%\%directory%\" both from within my application (not run as admin) and from a non-admin command prompt
Cannot write to "c:\program files\%app_name%\%directory%\%subdirectory%\".
Any ideas? Are there hidden permissions or registry settings somewhere? Could it be that this directory was created when UAC was off, so now it's fair game? Could it be that this directory was created back in the XP days, and that makes it fair game?
It makes sense to me why I can't write to the other Program Files directories and the subdirectory. However, I have no idea why I am actually allowed to write to %directory%.
Side note: if I move %directory% to another area (AppData), I still cannot write to the subdirectory (confused).
Let me know any ideas you may have or anything I can check.
Thanks
EDIT: Arr, sorry, I skimmed your post a little too fast, looks like this is a non-issue!
Have you looked where the written files are actually going?
Vista has a feature where files written into Program Files folders by applications get redirected to a local per-user store. This store is located at %userprofile%\AppData\Local\VirtualStore
This is to allow legacy applications which wrote per user settings to Program Files to still operate correctly, also allowing multiple users to use the program without conflict.
There's a button in Explorer called 'Compatibility Files' which will take you to this user store... perhaps your writes are ending up there?
I'm not sure why you can't write to the subdirectory, though. Security permissions?
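One quick way to check is a probe that writes a file and then looks in both places. A sketch, with the paths as placeholders for your %app_name%\%directory% layout:

using System;
using System.IO;

class VirtualStoreProbe
{
    static void Main()
    {
        string real = @"C:\Program Files\MyApp\MyDir\probe.txt";  // placeholder
        string store = Path.Combine(
            Environment.GetFolderPath(Environment.SpecialFolder.LocalApplicationData),
            @"VirtualStore\Program Files\MyApp\MyDir\probe.txt");

        File.WriteAllText(real, "probe");  // may be silently redirected

        // If the process is virtualized, the first check reports true
        // either way, so the second line is the telling one.
        Console.WriteLine("real path:     " + File.Exists(real));
        Console.WriteLine("virtual store: " + File.Exists(store));
    }
}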