Command-line tool for finding out who is locking a file - Windows

I would like to know who is locking a file (win32). I know about WhoLockMe, but I would like a command-line tool which does more or less the same thing.
I also looked at this question, but it seems only applicable for files opened remotely.

Handle should do the trick.
Ever wondered which program has a particular file or directory open?
Now you can find out. Handle is a utility that displays information
about open handles for any process in the system. You can use it to
see the programs that have a file open, or to see the object types and
names of all the handles of a program.

handle.exe
http://technet.microsoft.com/en-us/sysinternals/bb896655.aspx
This has helped me so many times...

Download Handle.
https://technet.microsoft.com/en-us/sysinternals/bb896655.aspx
If you want to find out what program has a handle on a certain file, run this from the directory Handle.exe was extracted to (unless you've added Handle.exe to the PATH environment variable). Assuming the file path is C:\path\path\file.txt, run this:
handle "C:\path\path\file.txt"
This will tell you what process(es) have the file (or folder) locked.
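For example, assuming handle.exe is somewhere on the PATH and you're in an elevated prompt (per the Sysinternals documentation, -u adds the owning user name to each match and -nobanner suppresses the startup banner):

# Show the process, PID and owning user for every handle matching the path
handle.exe -u -nobanner "C:\path\path\file.txt"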

In my case Handle.exe did not help.
Process Explorer, another official Microsoft (Sysinternals) tool, was what worked.
Just open it as administrator and press Ctrl+F, then type part of the file name; it will show the process using the file.

Handle didn't find that WhatsApp was holding a lock on a .tmp.node file in the temp folder.
Process Explorer's Find works better.
Look at this answer: https://superuser.com/a/399660

Computer Management->Shared Folders->Open Files
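Note that this only shows files opened over the network (SMB shares), not local locks. On Windows 8 / Server 2012 and later, the same information is available from PowerShell via the SmbShare module:

# Equivalent of Computer Management -> Shared Folders -> Open Files:
# lists files opened through SMB shares and who has them open (run elevated)
Get-SmbOpenFile | Select-Object ClientUserName, ClientComputerName, Path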

I have used Unlocker for years and really like it. It not only identifies the programs holding a lock and offers to unlock the folder\file, it also lets you kill the process that holds the lock.
Additionally, it offers actions to perform on the locked file in question, such as deleting it.
Unlocker helps delete locked files with error messages including "cannot delete file," and "access is denied." Video tutorial available.
Some errors you might get that Unlocker can help with include:
Cannot delete file: Access is denied.
There has been a sharing violation.
The source or destination file may be in use.
The file is in use by another program or user.
Make sure the disk is not full or write-protected and that the file is not currently in use.

Related

Powershell script to remove files that are not in use

I'm trying to write a script that removes all files from a given directory, however I don't want to remove files that are currently in use (being viewed, edited, etc.). What I'm finding is that for some file types (.docx, .xlsx, etc.) this works just fine: the .ps1 script fails to delete them, as expected, and moves on. However, some files (.bmp, .txt) can be open and get deleted anyway. It seems certain files are not locked when in use. I guess my question is a few smaller questions:
Is there a way to tell if files are in-use (other than seeing if it's locked)?
Is there any definitive way to tell which file types are locked when in-use?
Is there a better cmdlet than Remove-Item to use for what I am trying to achieve?
Thanks in advance!
Some applications, like Word, lock the file while reading it to keep it from being modified while it's open (usually because you might want to edit it, e.g. in Word). Other applications, like Notepad, don't.
This is not specific to a file type (the extension is just a naming attribute that has nothing to do with the data inside); it's the application that decides whether it wants to lock a file. For example:
Open a docx file in Word: Access denied (file in use)
Open a docx file in Wordpad: Success (no lock)
AFAIK it's impossible to reliably detect files that are in use by an application like Notepad, which doesn't lock them.
There are many ways to delete files, but Remove-Item is as good as any of them.
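Here's a minimal sketch of that "see if it's locked" test: try to open each file with no sharing allowed and treat a sharing violation as "in use". The function name and the C:\target path are placeholders, and as noted above this still won't catch files merely loaded into memory by programs like Notepad.

# Hypothetical helper: returns $true if another process holds the file open
# in a conflicting mode, $false if we could briefly open it exclusively.
function Test-FileLocked {
    param([Parameter(Mandatory)][string]$Path)
    try {
        $fs = [System.IO.File]::Open($Path,
            [System.IO.FileMode]::Open,
            [System.IO.FileAccess]::ReadWrite,
            [System.IO.FileShare]::None)
        $fs.Close()
        return $false
    }
    catch [System.IO.IOException] {
        return $true   # sharing violation: the file is in use
    }
}

# Example: delete only the files that aren't locked
Get-ChildItem -Path C:\target -File |
    Where-Object { -not (Test-FileLocked $_.FullName) } |
    Remove-Item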
Adding to Frode's point, there's nothing wrong with Remove-Item. A normal directory doesn't offer you much to go on; files hosted on a Windows file server, however, provide session information that can help you identify whether a file is in use. If possible, you could implement logic that excludes the files that are in use.
In general, I'd recommend a workaround: have your PowerShell script check each file's LastAccessTime and, say, delete everything that hasn't been accessed for 24 hours. (Check that your volumes actually update last-access times; NTFS often has those updates disabled by default.)
Get-ChildItem -Path c:\ps | Where-Object { $_.LastAccessTime -lt (Get-Date).AddDays(-1) } | Remove-Item
The extreme way of protecting files is to use source control and not delete anything that's checked out. However, this is not a friendly solution for average users.

How to let Windows know that a file is "being used" by my application?

I'm making a simple VB.net application, which basically asks the user for multiple files and later it will need to access the selected files and modify them.
Right now, I'm saving the full paths of the selected files, and in the future, the application will iterate through each path, open the file from such path, and modify it.
The problem with that is that the user could select a file (so the full path is saved) and then they delete or move the file before my application modifies it.
Normally, I'd throw an error saying "File not found", but I'm under the impression that Windows has a feature that disallows deleting/moving/renaming a file because "a program is using it" - a feature that would fit my application much better.
I'm not very advanced with VB.NET, but I suppose that if I "open" a file using my application (with some IO thing), the feature I mentioned earlier would indeed trigger and the user would be unable to modify the file because it is "opened" by my application.
However, since my only desire is to "reserve" files, it seems to be quite wasteful to actually open them when I don't really need to (yet). Is there a way to tell Windows I need a certain file to be intact?
Opening the files (specifying the desired sharing mode) is the way to do that.
I don't believe there is anything really wrong with opening multiple files (though you still won't be able to do anything about cases like removal of a removable drive). In the old days there were restrictions on the number of open files per process, but it's no longer a practical limitation - see Pushing the Limits of Windows: Handles.
There is an easy solution: open each file in exclusive mode.
It should look like this:
Sub Test()
    ' FileShare.None denies other processes read, write, and delete/rename access
    ' for as long as this handle stays open.
    Dim FS = System.IO.File.Open("path", IO.FileMode.Open, IO.FileAccess.ReadWrite, IO.FileShare.None)
End Sub
But beware: you have opened a file handle, and if the code responsible for closing the files fails without terminating the application, the files will stay locked for a very long time (until the app shuts down).
You can use a Using block or a Try/Catch/Finally clause - I don't know enough about your program to recommend one over the other.

Hiding Files in Windows

Currently, I'm developing a system which will extract some files from an SFX archive (files that will be used by another app). I want to make the extracted files hidden, so that a person who finds the location of the exe can't get at the files sitting in the same directory as it. I know I can apply attrib +h to the files, but if the user turns on the "show hidden and system files" option in Windows, the files will be visible.
Isn't there any method to overcome this? Any suggestion is welcomed.
Thanks.
If you're writing to the disk, a user can find and read your file. There's no way around that: one could monitor what happens when your application runs, find which files it writes to, or just intercept the files while they're being written. Consider why you don't want the user to find your files.
Is it because there's sensitive data, or things you'd rather they didn't change? Consider encrypting it, or verifying its integrity with a checksum or hash.
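If you go the hash route, PowerShell's built-in Get-FileHash (v4 and later) is enough for a quick integrity check; the path and recorded hash below are made-up placeholders:

# Compare the file's current SHA256 against a value recorded when it was extracted
$expected = '9F86D081884C7D659A2FEAA0C55AD015A3BF4F1B2B0B822CD15D6C15B0F00A08'   # placeholder
$actual = (Get-FileHash -Path 'C:\MyApp\data.bin' -Algorithm SHA256).Hash
if ($actual -ne $expected) {
    Write-Warning 'data.bin has been modified or replaced.'
}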
I guess you could play around with user rights. However, you'd then need to ask for administrator rights at install time to hide the files from the given user, who is an admin on the PC.

Get a look at the temporary files a process creates

I'm trying to reverse-engineer a program that does some basic parsing: text in, text out. I've got an executable "reference implementation" and the source code to what must be a different version, since the compiled source output != executable output.
The process creates and deletes temporary files very quickly in a multi-step parsing process. If I could take a look at the individual temporary files, I could get some great diagnostic data to narrow down where my source differs from the binary.
Is there any way to do any of the following?
Freeze a directory so that file creation will work but file deletion will fail silently?
Run a program in "slow motion" so that I can look at the files that it creates?
Log everything that a program does, including any data written out to files?
Running a tool like NTFS Undelete should give you a chance to recover the temporary files the program creates and then deletes. Combine this with ProcMon from Sysinternals to get the right filenames.
You didn't mention what OS you're doing this on, but assuming you're using Windows...
You might be able to make use of SysInternals tools like Process Explorer and Process Monitor to get a better idea of the files being accessed. As far as I know, there's no "write-only" option on folders. For "slowing down" the program, you'd just need to use a slower computer. For logging, the SysInternals tools will help out quite a bit. Once you have the name(s) of the files being created, you could try preventing their deletion by opening them from another process; that would prevent the system from being able to delete them.
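As a sketch of that last idea (the temp-file name below is made up - get the real one from Process Monitor - and you'd have to grab the handle in the short window between creation and deletion): open the file yourself while allowing the program to keep reading and writing, but without granting delete sharing. The program's delete call then fails with a sharing violation for as long as your handle stays open.

# Hold a read handle without FileShare.Delete so the file can't be deleted
$tmp = 'C:\Users\me\AppData\Local\Temp\parse_step1.tmp'   # hypothetical name
$fs = [System.IO.File]::Open($tmp,
    [System.IO.FileMode]::Open,
    [System.IO.FileAccess]::Read,
    [System.IO.FileShare]::ReadWrite)   # the program can still read/write, not delete
Copy-Item $tmp 'C:\inspect\step1.tmp'   # snapshot the contents for inspection
$fs.Close()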
There are two ways to attack this:
Run various small test cases through both systems and notice the differences. Since the test cases are small, you should be able to figure out why your code works differently than the executable.
Disassemble the executable and remove all the "delete temp file" instructions. Depending on how this works, this could be a very complex task (say when there is no central place where it happens).

Renaming A Running Process' File Image On Windows

I have a Windows service application on Vista SP1 and I've found that users are renaming its executable file (while it's running) and then rebooting, thus causing it to fail to start on next bootup because the service manager can no longer find the exe file since it's been renamed.
I seem to recall that with older versions of Windows you couldn't do this because the OS placed a lock on the file. Even with Vista SP1 I still cannot copy over the existing file when it's running - Windows reports that the file is in use - which makes sense. So why should I be allowed to rename it? What happens if Windows needs to page in a new code page from the exe but the file has been renamed since it was started? I ran Process Monitor while renaming the exe file, etc., but it didn't report anything strange and just logged the rename like any other file.
Does anyone know what's going on here behind the scenes? It's seem counter intuitive that Windows would allow a running process' filename (or its dependent DLLs) to be changed. What am I missing here?
Your concept is wrong: the filename is not the center of the file-I/O universe - the handle to the open file is. The file is not moved to a different section of the disk when you rename it; it's still in the same place, and the internal data structure for the open file is still pointing to the same place. The bottom line is that your observations are correct: you can rename a running program without causing problems, and you can create a new file with the same name as the running program once you've renamed it. This is actually useful behavior if you want to update software while it's running.
As long as the file is still there, Windows can still read from it - it's the underlying file that matters, not its name.
I can happily rename running executables on my XP machine.
The OS keeps an open handle to the .exe file. Renaming the file simply changes some filesystem metadata about the file, without invalidating open handles. So when the OS goes to page in more code, it just uses the file handle it already has open.
Replacing the file (writing over its contents) is another matter entirely, and I'm guessing the OS opens it with the FILE_SHARE_WRITE flag unset, so no other processes can write to the .exe file.
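You can reproduce this with an ordinary file. A sketch (the path is arbitrary): open the file sharing read and delete but not write, which is roughly how a running image behaves, and a rename succeeds while an overwrite fails.

# Create a scratch file, then hold it open with write sharing denied
Set-Content 'C:\Temp\demo.bin' 'original'
$fs = [System.IO.File]::Open('C:\Temp\demo.bin',
    [System.IO.FileMode]::Open,
    [System.IO.FileAccess]::Read,
    [System.IO.FileShare]'Read, Delete')

Rename-Item 'C:\Temp\demo.bin' 'demo_renamed.bin'    # succeeds: only directory metadata changes
Set-Content 'C:\Temp\demo_renamed.bin' 'overwrite'   # fails: write sharing was denied
$fs.Close()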
Might be a stupid question, but why do users have access to rename the file if they are not supposed to rename it? In any case, it's allowed because, as the good answers point out, the open handle to the file isn't lost until the application exits. There are some uses for it as well, even though I'm not convinced that updating an application by renaming its file is good practice.
You might consider having your service listen to changes to the directory that your service is installed in. If it detects a rename, then it could rename itself back to what it's supposed to be.
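A sketch of that idea using FileSystemWatcher (the directory and exe name are placeholders; a real service would do this in its own code rather than a script):

# Watch the install directory and undo any rename of the service executable
$dir = 'C:\Program Files\MyService'   # hypothetical install directory
$watcher = New-Object System.IO.FileSystemWatcher $dir
$watcher.EnableRaisingEvents = $true
Register-ObjectEvent -InputObject $watcher -EventName Renamed -Action {
    if ($Event.SourceEventArgs.OldName -eq 'MyService.exe') {
        # Someone renamed the exe; put the name back (renaming works even while it runs)
        Rename-Item -Path $Event.SourceEventArgs.FullPath -NewName 'MyService.exe'
    }
}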
There are two aspects to the notion of file here:
The data on the disk - that's the actual file.
The file names (there could be several, or none) that you can give that data - these are called directory entries.
What you are renaming is the directory entry, which still references the same data. Windows doesn't care about your doing so, as it still can access the data when it needs to. The running process is mapped to the data, not the name.
