My Recycle Bin in Windows 10 is very slow. It contains 60k files and I have 10k files to restore. I think the excruciating slowness (and inevitable crashes!) is down to how long it takes to refresh the Recycle Bin view rather than the operations themselves (delete, restore, etc.).
Is there a way to circumvent the Recycle Bin GUI and perform the operations from e.g. Command Prompt or PowerShell?
What would the command or script for the following operation look like?
Restore all files deleted on DD-MM-YYYY to original location
There are some discussions on using Windows PowerShell to operate on the Recycle Bin here and here.
...I am not a coder, so I am unable to use these solutions ... the most I can do is put a script into Notepad, save it as a PS1 file, and run it with PowerShell!
To be clear, I am not looking to recover files using their hidden system names but rather the human-readable names displayed in the GUI, so I am assuming I have to work with the Recycle Bin in some way?
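For reference, those discussions boil down to driving the Shell.Application COM object from PowerShell. Here is a minimal sketch of that approach; the 0xA folder constant, the GetDetailsOf column number, and the "Restore" verb name all vary by Windows version and language, and the date string can contain invisible formatting characters, so verify all of them on your own machine before trusting it:

$shell = New-Object -ComObject Shell.Application
$recycleBin = $shell.NameSpace(0xA)                    # 0xA = the Recycle Bin virtual folder
foreach ($item in $recycleBin.Items()) {
    $dateDeleted = $recycleBin.GetDetailsOf($item, 2)  # column 2 is "Date deleted" on most English systems
    if ($dateDeleted -like '*DD-MM-YYYY*') {           # replace with the date exactly as the GUI shows it
        # invoke the item's "Restore" verb, which sends it back to its original location
        $restore = $item.Verbs() | Where-Object { $_.Name -like '*estore*' }
        if ($restore) { $restore.DoIt() }
    }
}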
Here's the scenario: We have a computer running Windows 10 which has a directory that's backed up nightly. The backups are done with a batch file utilizing Robocopy and scheduled via Windows. The parameters are such that the backup will always add any new files or existing file edits into the destination, but it will never delete files from the destination that have been deleted in the source. It essentially archives all files which are in the source directory at the end of each day.
Here's the tricky part. The source directory is very large, and occasionally someone finds a duplicate file (or several duplicates of a file) in it. When that happens, we need to delete all but one copy of the file, and then we need to access the backup directory manually, locate the file there, and do the same. This is tedious and time-consuming as it's not rare for someone to notice an entire subdirectory full of files that exist 5+ times each.
What we're looking for is a way to scan the source directory and all subdirectories inside for duplicate files and remove all but one copy of them, and then a way to reflect that into the destination. I've assumed that we will not be able to use Robocopy to reflect the changes in the destination due to the nature of the backup script it's running, but we do have the ability to run any third-party software on the destination directory as well, essentially running an action in both directories to clean each of them of duplicate files.
On that note, I'm not against using third-party tools to make this cleaner or more efficient, I'm just not aware of any.
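One rough sketch of the "scan for duplicates and keep one copy" half, assuming PowerShell 4.0+ for Get-FileHash (the path is a placeholder, and -WhatIf keeps it read-only until the preview output looks right):

$Source = 'D:\Data'    # placeholder: the directory to scan
# Group files by content hash, keep the first file in each group of identical
# hashes, and delete the rest. Drop -WhatIf only once the preview looks right.
Get-ChildItem -Path $Source -Recurse -File |
    Get-FileHash -Algorithm SHA256 |
    Group-Object -Property Hash |
    Where-Object { $_.Count -gt 1 } |
    ForEach-Object { $_.Group | Select-Object -Skip 1 } |
    Remove-Item -WhatIf

The same scan could then be run a second time against the backup directory to bring it in line with the cleaned source.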
There is one way to solve this problem. I was also suffering from it, but I found a way to do it with a batch file.
There are mainly two commands:
xcopy
robocopy
For your needs here, xcopy will be helpful.
xcopy will back up the specified file or folder: even if you change only a few megabytes of data, it will copy the new data rather than replacing the previous data; it will make a new copy.
How to do it:
Open Notepad and type:
xcopy "source file" "destination" /y /e /d /c /f /h /i /z /j
Then save the file as a ".bat" file.
For more details, see the URL below:
https://learn.microsoft.com/en-us/windows-server/administration/windows-commands/xcopy
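For reference, this is roughly what those switches do (the command is the same whether it runs from a .bat file or a PowerShell prompt; the source and destination paths below are placeholders):

# /y  overwrite existing destination files without prompting
# /e  copy directories and subdirectories, including empty ones
# /d  copy only source files newer than the existing destination copies
# /c  continue copying even if errors occur
# /f  display full source and destination file names while copying
# /h  also copy hidden and system files
# /i  if the destination does not exist, assume it is a directory
# /z  copy in restartable mode (useful over a network)
# /j  copy using unbuffered I/O (helps with very large files)
xcopy "C:\Source" "D:\Backup" /y /e /d /c /f /h /i /z /j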
Our application collects data from an external device. This data primarily resides in memory, but is spooled to disk in temporary files until the user explicitly saves the data. This is to provide some recovery chance if the application crashes for some reason. Generally speaking, it works just fine.
Lately we've discovered, thanks to Windows becoming more forceful about automatic updates, that these files get deleted automatically during a reboot. So if Windows kills our application to automatically apply updates, the temporary files that would have allowed recovery are gone after the reboot.
I've tested the issue by killing the application on purpose and rebooting; indeed, the temporary files have vanished after the reboot.
The files are created using the Win32 API call GetTempFileName, along with GetTempPath. Everything I've read on the subject says these files are not automatically deleted ever, but they clearly are being deleted.
What can I do to stop this? Or should I just change where our safety data is stored?
What you are seeing is a new "Storage Sense" feature added in Windows 10.
How to Clear Temporary Files Automatically in Windows 10.
Windows 10 got the ability to clear temporary files automatically in a recent build. Starting with build 15014, a new option appeared in Settings.
When enabled, it can be set to clear items like temporary files, Recycle Bin, etc. You can turn them off individually.
Alternatively, you could change your app to save its temporary files in a non-system temp folder that you control, rather than using GetTempPath(). And maybe also use something other than GetTempFileName() to create your temporary file names (such as date/times or GUIDs), so Windows can't possibly track the temporary files you create. Then perhaps your files won't be deleted automatically by Storage Sense anymore.
The best solution, IMO, is not to use the temporary folder, which contains (as the name suggests) temporary files that can be deleted without any consequences.
Instead you should store them somewhere in the LocalAppData folder.
Use the SHGetFolderPath function to retrieve the actual location of the LocalAppData folder.
In LocalAppData, create a folder named after your company and/or product (or some combination of both) and store all your pseudo-temporary files there.
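As an illustration only (PowerShell, with placeholder company/product names): this is the folder layout being suggested, and the base path is the same one a native app gets back from SHGetFolderPath with CSIDL_LOCAL_APPDATA:

$localAppData = [Environment]::GetFolderPath('LocalApplicationData')
$spoolDir = Join-Path $localAppData 'MyCompany\MyProduct\Spool'        # placeholder vendor/product names
New-Item -Path $spoolDir -ItemType Directory -Force | Out-Null         # create the folder tree if it's missing
$spoolFile = Join-Path $spoolDir ("recovery-{0}.tmp" -f [guid]::NewGuid())
Set-Content -Path $spoolFile -Value 'spooled data goes here'           # pseudo-temporary file lives outside %TEMP%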
In Windows command prompt, how do I list out the latest created files in a system drive (e.g. C:)?
I would like to test a piece of software that creates unknown temporary files in various places within the system drive. I'd like to know when each file is created at runtime, and I want to make sure that uninstallation does indeed remove these files.
My idea is to start the program and repeatedly search from the command prompt. Or is there a simpler way to track this?
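If PowerShell is an option rather than plain cmd, a rough sketch of the "repeatedly search" idea might look like this (the drive and the 10-minute window are assumptions to adjust; access-denied folders are simply skipped):

Get-ChildItem -Path C:\ -Recurse -File -Force -ErrorAction SilentlyContinue |
    Where-Object { $_.CreationTime -gt (Get-Date).AddMinutes(-10) } |
    Sort-Object CreationTime -Descending |
    Select-Object CreationTime, FullName

Re-running this while the program is running, and again after uninstalling, makes it easy to compare what appeared and what was left behind.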
I'm trying to write a script that removes all files from a given directory; however, I don't want to remove files that are currently in use (being viewed, edited, etc.). What I'm finding is that for some file types (.docx, .xlsx, etc.) this works just fine: the .ps1 script fails to delete the in-use file, as expected, and moves on. However, some files (.bmp, .txt) can be open and still get deleted. It looks like certain files are not locked when in use. I guess my question is really a few smaller questions:
Is there a way to tell if files are in-use (other than seeing if it's locked)?
Is there any definitive way to tell which file types are locked when in-use?
Is there a better cmdlet than Remove-Item to use for what I am trying to achieve?
Thanks in advance!
Some applications, like Word, lock the file while reading it to prevent it from being modified while it's open (usually so that you can edit it in, e.g., Word itself). Other applications, like Notepad, don't.
This is not specific to a file type (the file type shown in the GUI is just an attribute that has nothing to do with the data inside). It's the application that decides whether or not it wants to lock a file. For example:
Open a .docx file in Word: Access denied (file in use)
Open a .docx file in WordPad: Success (no lock)
AFAIK it's impossible to reliably detect files that are in use by applications such as Notepad that don't lock them.
There are many ways to delete files, but Remove-Item is as good as any of them.
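If you do want a best-effort lock check before deleting, one common sketch is to try opening the file with no sharing allowed and treat a failure as "in use". Note this still won't catch files open in non-locking programs like Notepad, and the folder path below is a placeholder:

function Test-FileLocked {
    param([string]$Path)
    try {
        # ask for exclusive access; this fails if another process holds the file open with a lock
        $stream = [System.IO.File]::Open($Path, 'Open', 'ReadWrite', 'None')
        $stream.Close()
        return $false
    }
    catch [System.IO.IOException] {
        return $true
    }
}

Get-ChildItem -Path 'C:\SomeFolder' -File |
    Where-Object { -not (Test-FileLocked $_.FullName) } |
    Remove-Item -WhatIf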
Adding to Frode's point, there's nothing wrong with Remove-Item. A normal directory doesn't offer you much; files hosted via a Windows file server provide session information that might help you identify whether a file is being used. If possible, you could implement logic that excludes files that are in use.
In general, I'd recommend a workaround: have your PowerShell script check each file's LastAccessTime and, say, delete everything that hasn't been accessed for 24 hours:
Get-ChildItem -path c:\ps | Where-Object {$_.LastAccessTime -lt (get-date).addDays(-1)}
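To actually act on that filter, pipe the result to Remove-Item; a sketch with -WhatIf first so nothing is deleted until the list looks right (keep in mind NTFS may not update LastAccessTime on every read, depending on the fsutil disablelastaccess setting):

Get-ChildItem -Path c:\ps -File |
    Where-Object { $_.LastAccessTime -lt (Get-Date).AddDays(-1) } |
    Remove-Item -WhatIf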
The extreme way of protecting files is to use SCM (source control) and never delete anything that is checked out. However, this is not a friendly solution for average users.
I'm trying to reverse-engineer a program that does some basic parsing: text in, text out. I've got an executable "reference implementation" and the source code to what must be a different version, since the compiled source output != executable output.
The process creates and deletes temporary files very quickly in a multi-step parsing process. If I could take a look at the individual temporary files, I could get some great diagnostic data to narrow down where my source differs from the binary.
Is there any way to do any of the following?
Freeze a directory so that file creation will work but file deletion will fail silently?
Run a program in "slow motion" so that I can look at the files that it creates?
Log everything that a program does, including any data written out to files?
Running a tool like NTFS Undelete should give you the chance to recover the temporary files it's creating then deleting. Combine this with ProcMon from Sysinternals to get the right filenames.
You didn't mention what OS you're doing this on, but assuming you're using Windows...
You might be able to make use of SysInternals tools like Process Explorer and Process Monitor to get a better idea of the files being accessed. As far as I know, there's no "write-only" option on folders. For "slowing down" the program, you'd just need to use a slower computer. For logging, the SysInternals tools will help out quite a bit. Once you have the name(s) of the files being created, you could try preventing their deletion by opening the files in a stream from another process and keeping that stream open. That would prevent the system from being able to delete them.
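A rough sketch of that last idea: a file opened from PowerShell without delete sharing can't be removed by the other process for as long as the handle stays open (the path is a placeholder):

$path = 'C:\Temp\target.tmp'                                      # placeholder: the temp file to pin
$stream = [System.IO.File]::Open($path, 'Open', 'Read', 'Read')   # read sharing only, no Delete -> deletion is blocked
# ...examine or copy the pinned file here, then release it...
$stream.Close()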
There are two ways to attack this:
Run various small test cases through both systems and notice the differences. Since the test cases are small, you should be able to figure out why your code works differently than the executable.
Disassemble the executable and remove all the "delete temp file" instructions. Depending on how this works, this could be a very complex task (say when there is no central place where it happens).