I am creating a virtual file system using the Cloud Files API, and I am trying to implement rename/move and delete operations for folders. The behavior of the rename-with-overwrite scenario (when the target folder already exists) is confusing. In Windows Explorer I am trying to rename Z:\Folder1 to Z:\Folder2 while Z:\Folder2 already exists. Before the rename operation is performed, the source folder is deleted via the CF_CALLBACK_TYPE_NOTIFY_DELETE callback.
Here is the sequence of callbacks I get from the Cloud Filter API:
CF_CALLBACK_TYPE_NOTIFY_DELETE is called on the source folder.
Each file from the source folder is moved to the target folder.
CF_CALLBACK_TYPE_NOTIFY_DELETE_COMPLETION is called on the source folder.
As a result, there is no way to delete the folder in my storage inside the CF_CALLBACK_TYPE_NOTIFY_DELETE callback, because that would delete all of its files before the move operation takes place. There is also no way to distinguish between delete and move operations inside CF_CALLBACK_TYPE_NOTIFY_DELETE (so that I could ignore it for a move).
How do I properly implement the delete and rename/move callbacks in Cloud Files API?
This behavior is specific to Windows Explorer. I tried the move operation using PowerShell's Move-Item with the -Force option, and I got the following sequence of callbacks (without any delete callbacks):
CF_CALLBACK_TYPE_NOTIFY_RENAME
CF_CALLBACK_TYPE_NOTIFY_RENAME_COMPLETION
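For context, here is a trimmed sketch of how I register and acknowledge these callbacks. Deferring my remote delete to the completion callback (by which point, per the sequence above, the children have already been moved) is the workaround I am currently testing; DeleteInRemoteStorage is a stand-in for my storage call.

#include <windows.h>
#include <cfapi.h>

#pragma comment(lib, "CldApi.lib")

// Called when the platform wants to delete a placeholder.
void CALLBACK OnNotifyDelete(
    const CF_CALLBACK_INFO* info,
    const CF_CALLBACK_PARAMETERS* params)
{
    // Acknowledge the delete so Explorer can proceed, but do NOT touch
    // remote storage yet: for a rename-with-overwrite, Explorer deletes
    // the source folder while its children are still being moved.
    CF_OPERATION_INFO opInfo = {};
    opInfo.StructSize = sizeof(opInfo);
    opInfo.Type = CF_OPERATION_TYPE_ACK_DELETE;
    opInfo.ConnectionKey = info->ConnectionKey;
    opInfo.TransferKey = info->TransferKey;

    CF_OPERATION_PARAMETERS opParams = {};
    opParams.ParamSize = CF_SIZE_OF_OP_PARAM(AckDelete);
    opParams.AckDelete.Flags = CF_OPERATION_ACK_DELETE_FLAG_NONE;
    opParams.AckDelete.CompletionStatus = STATUS_SUCCESS; // from ntstatus.h

    CfExecute(&opInfo, &opParams);
}

void CALLBACK OnNotifyDeleteCompletion(
    const CF_CALLBACK_INFO* info,
    const CF_CALLBACK_PARAMETERS* params)
{
    // The local delete has succeeded; only now remove the item from
    // remote storage. For the Explorer rename-with-overwrite sequence,
    // the children have already been renamed into the target folder,
    // so only the (now empty) source folder gets deleted here.
    // DeleteInRemoteStorage(info->NormalizedPath);
}

CF_CALLBACK_REGISTRATION callbacks[] = {
    { CF_CALLBACK_TYPE_NOTIFY_DELETE,            OnNotifyDelete },
    { CF_CALLBACK_TYPE_NOTIFY_DELETE_COMPLETION, OnNotifyDeleteCompletion },
    CF_CALLBACK_REGISTRATION_END
};

// CfConnectSyncRoot(L"Z:\\", callbacks, nullptr, CF_CONNECT_FLAG_NONE, &key);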
I'm looking for a solution that checks for duplicate filenames when I'm downloading files, specifically through Firefox on Windows 10. I know this check comes standard for files in the same directory, but as the volume of files scales up, it's getting harder and harder to find what I'm looking for among the files I've downloaded.
Since Firefox doesn't have an option to scan subdirectories when saving files (nor can I find a Firefox add-on that does something like it), I'm looking for any alternative that achieves the same end: something that notifies me when I'm attempting to download (or have just downloaded) a file whose name already exists in a subdirectory of a given folder, whether via an add-on or some kind of application or script that can run in the background. Preferably, I would like it to check the folders inside those subdirectories as well.
My memory is terrible, so I opted to keep everything in the same folder so I would immediately get a warning when attempting to download a file I'd already downloaded. But that folder now contains far too many files for me to realistically comb through to find a particular file.
I would like to sort these files into subfolders of the folder where I currently store my downloads, while keeping the ability to tell immediately whether I'm about to download something I've already downloaded. All I need is a check for an existing file of the same name upon creating a file (which is already a feature), extended to the subdirectories as well. I do not need any functionality to actually view all the files of every subfolder in one window.
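The core check itself is small enough to script. Here is a minimal sketch in C++ (dupcheck is a made-up name, and the idea of invoking it with each new file's name, e.g. from a folder-watcher script, is an assumption rather than an existing tool):

#include <filesystem>
#include <iostream>
#include <string>

namespace fs = std::filesystem;

// Recursively checks whether 'name' already exists anywhere under 'root'.
bool nameExistsUnder(const fs::path& root, const std::string& name) {
    for (const auto& entry : fs::recursive_directory_iterator(
             root, fs::directory_options::skip_permission_denied)) {
        if (entry.is_regular_file() && entry.path().filename() == name) {
            std::cout << "Already downloaded: " << entry.path() << '\n';
            return true;
        }
    }
    return false;
}

int main(int argc, char* argv[]) {
    if (argc != 3) {
        std::cerr << "Usage: dupcheck <downloads-root> <filename>\n";
        return 2;
    }
    // Exit code 1 signals a duplicate, so a wrapper can pop up a warning.
    return nameExistsUnder(argv[1], argv[2]) ? 1 : 0;
}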
I'm creating a macOS FileProviderExtension for a remote document storage system (kind of like Google Drive), where it is possible to share a single document with multiple folders.
For example, Document1.pdf can simultaneously exist in Folder A and Folder B because it's shared with both folders. In my FileProvider extension, this means the file should be accessible in both folders:
Folder A/Document1.pdf
Folder B/Document1.pdf
But the file provider extension treats those as two completely separate files. That is, if you download one of them and then try to open the other, it will redownload the other one, effectively doubling the space used on the user's disk and consuming network bandwidth.
I'm looking for a way to tell the FileProvider extension what the backing data for a given file is, and thus solve problems such as:
If the user downloads a file in one location, ideally I would tell the FileProvider extension that the same document in all other locations is now downloaded as well (the cloud icon should disappear from all of them).
Some approaches I considered:
1. I thought of using symbolic links as part of a solution, but I don't really think that's possible.
2. When the user tries to open a non-downloaded file, the fetchContents(for itemIdentifier) callback is invoked. Once the file is downloaded, I would ideally notify all the other items for the same document that they are downloaded too, e.g. by updating the isDownloaded property of NSFileProviderItem, but that doesn't seem to work. And even if it did, I still couldn't tell an item what its backing data file should be.
3. By turning off the Sandbox capability, I could, when the user tries to download/open a file that has already been downloaded in another location, immediately report it as downloaded and provide a copy of the already-downloaded file as its data. But there are two drawbacks:
3.1. I would have to turn off the Sandbox capability, because I want to access the files under the FileProvider path directly.
3.2. The system would still use disk space for each copy. So if I have the same document in multiple folders, the extension keeps all those copies, with no way to tell it that they all share the same backing data file somewhere in the extension's container.
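For what it's worth, a sandbox-friendly variant of approach 3 that I am sketching below (for an NSFileProviderReplicatedExtension) would keep a content-addressed cache inside the extension's own container and serve fetchContents from it. backingDocumentID(for:), item(for:), cacheDirectory and downloadBackingDocument are hypothetical helpers; the hope is that on APFS copyItem produces a clone, so handing the system a copy of the cached file costs little extra disk:

import FileProvider
import Foundation

func fetchContents(for itemIdentifier: NSFileProviderItemIdentifier,
                   version requestedVersion: NSFileProviderItemVersion?,
                   request: NSFileProviderRequest,
                   completionHandler: @escaping (URL?, NSFileProviderItem?, Error?) -> Void) -> Progress {
    let progress = Progress(totalUnitCount: 1)
    let docID = backingDocumentID(for: itemIdentifier)   // hypothetical helper
    let cached = cacheDirectory.appendingPathComponent(docID)

    func deliver(_ source: URL) {
        // The system takes ownership of the returned file, so hand it a
        // throwaway copy; on APFS this is a clone sharing the same blocks.
        let tmp = FileManager.default.temporaryDirectory
            .appendingPathComponent(UUID().uuidString)
        do {
            try FileManager.default.copyItem(at: source, to: tmp)
            completionHandler(tmp, item(for: itemIdentifier), nil)  // hypothetical item(for:)
        } catch {
            completionHandler(nil, nil, error)
        }
        progress.completedUnitCount = 1
    }

    if FileManager.default.fileExists(atPath: cached.path) {
        deliver(cached)  // same document already fetched via another folder
    } else {
        downloadBackingDocument(docID, to: cached) { error in  // hypothetical
            if let error = error { completionHandler(nil, nil, error); return }
            deliver(cached)
        }
    }
    return progress
}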
Here's the scenario: we have a computer running Windows 10 with a directory that's backed up nightly. The backups are done with a batch file utilizing Robocopy and scheduled via Windows. The parameters are such that the backup always adds new files and edited files to the destination, but never deletes files from the destination that have been deleted in the source. It essentially archives every file that is in the source directory at the end of each day.
Here's the tricky part. The source directory is very large, and occasionally someone finds a duplicate file (or several duplicates of a file) in it. When that happens, we need to delete all but one copy of the file, and then we need to access the backup directory manually, locate the file there, and do the same. This is tedious and time-consuming, as it's not rare for someone to notice an entire subdirectory full of files that exist five or more times each.
What we're looking for is a way to scan the source directory and all its subdirectories for duplicate files and remove all but one copy of each, and then a way to reflect that in the destination. I've assumed we won't be able to use Robocopy to reflect the changes in the destination, given the nature of the backup script it's running, but we do have the ability to run third-party software on the destination directory as well, essentially running the cleanup in both directories.
On that note, I'm not against using third-party tools to make this cleaner or more efficient; I'm just not aware of any.
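For what it's worth, the duplicate scan itself fits in a small console tool. Below is a hedged sketch in C++ (dedupe is a made-up name): files are grouped by size first, confirmed duplicates byte-by-byte, and only then removed; running it once against the source and once against the backup destination keeps both trees consistent:

#include <algorithm>
#include <cstdint>
#include <filesystem>
#include <fstream>
#include <iostream>
#include <iterator>
#include <map>
#include <vector>

namespace fs = std::filesystem;

// True if both files have identical contents.
bool sameContents(const fs::path& a, const fs::path& b) {
    std::ifstream fa(a, std::ios::binary), fb(b, std::ios::binary);
    std::istreambuf_iterator<char> ia(fa), ib(fb), end;
    return std::equal(ia, end, ib, end);
}

// Removes all but one copy of each set of identical files under 'root'.
void dedupe(const fs::path& root, bool dryRun) {
    std::map<std::uintmax_t, std::vector<fs::path>> bySize;
    for (const auto& e : fs::recursive_directory_iterator(
             root, fs::directory_options::skip_permission_denied))
        if (e.is_regular_file())
            bySize[e.file_size()].push_back(e.path());

    for (auto& kv : bySize) {
        auto& files = kv.second;             // same-size candidates
        for (size_t i = 0; i < files.size(); ++i) {
            if (files[i].empty()) continue;  // already removed
            for (size_t j = i + 1; j < files.size(); ++j) {
                if (files[j].empty() || !sameContents(files[i], files[j]))
                    continue;
                std::cout << "duplicate: " << files[j]
                          << " (keeping " << files[i] << ")\n";
                if (!dryRun) fs::remove(files[j]);
                files[j].clear();            // mark as handled
            }
        }
    }
}

int main(int argc, char* argv[]) {
    if (argc < 2) {
        std::cerr << "Usage: dedupe <dir> [--delete]\n";  // default: dry run
        return 2;
    }
    dedupe(argv[1], argc < 3);
}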
There is one way to solve this problem. I was also suffering from it, and I found a way to do it with a batch file.
There are mainly two commands:
xcopy
robocopy
For your needs here, xcopy will be helpful.
xcopy will back up a specific file or folder; with the /D switch it copies only files that are new or have changed since the last run, instead of re-copying everything.
How to do it:
Open Notepad and type:
xcopy "source file" "destination" /Y /E /D /C /F /H /I /Z /J
Then save the file with a ".bat" extension.
For more details on the switches, see:
https://learn.microsoft.com/en-us/windows-server/administration/windows-commands/xcopy
Here's what I want to achieve in my VCL application:
I extract a ZIP file to a directory in %TEMP%. If everything went okay, I move the directory to the target dir.
For the move operation I use JclFileUtils.FileMove (from the JCL; it resolves to MoveFileEx), because I need a real move operation and I want to overwrite files in the target dir.
This works so far, but the moved files keep per-user file permissions (inherited from %TEMP%). I want the moved files to inherit permissions from the target folder.
Remarks:
Copy-and-delete is a workaround I am aware of, but I want to avoid it (because of the file sizes).
System.IOUtils.TFile.Move does not work for me, because it is implemented as copy-and-delete (in XE4).
A similar problem is described for .NET, but I do not know whether an equivalent of GetAccessControl/SetAccessControl exists in Delphi.
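The Win32 counterpart of .NET's GetAccessControl/SetAccessControl is GetNamedSecurityInfo/SetNamedSecurityInfo, which Delphi can call through the Winapi.AclAPI unit. A minimal sketch of the usual recipe, shown here in C++ terms and untested against this exact setup: applying an empty DACL together with the UNPROTECTED flag strips the explicit ACEs carried over from %TEMP% and re-enables inheritance from the new parent folder.

#include <windows.h>
#include <aclapi.h>

#pragma comment(lib, "advapi32.lib")

// Call this on each file (or on the moved directory) after MoveFileEx.
bool ResetToInheritedAcl(const wchar_t* path)
{
    // An empty DACL (not a NULL one) removes all explicit ACEs...
    ACL emptyAcl;
    if (!InitializeAcl(&emptyAcl, sizeof(emptyAcl), ACL_REVISION))
        return false;

    // ...and UNPROTECTED_DACL_SECURITY_INFORMATION turns inheritance
    // back on, so the file picks up the target folder's permissions.
    DWORD err = SetNamedSecurityInfoW(
        const_cast<wchar_t*>(path), SE_FILE_OBJECT,
        DACL_SECURITY_INFORMATION | UNPROTECTED_DACL_SECURITY_INFORMATION,
        nullptr, nullptr, &emptyAcl, nullptr);
    return err == ERROR_SUCCESS;
}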
Our application collects data from an external device. This data primarily resides in memory, but is spooled to disk in temporary files until the user explicitly saves the data. This is to provide some recovery chance if the application crashes for some reason. Generally speaking, it works just fine.
Lately we've discovered, thanks to Windows becoming more forceful about automatic updates, that these files get deleted automatically during a reboot. So if Windows kills our application to automatically apply updates, the temporary files that would have allowed recovery are gone after the reboot.
I've tested the issue by killing the application on purpose and rebooting; indeed, the temporary files have vanished after the reboot.
The files are created using the Win32 API calls GetTempFileName and GetTempPath. Everything I've read on the subject says these files are never deleted automatically, but they clearly are being deleted.
What can I do to stop this? Or should I just change where our safety data is stored?
What you are seeing is a new "Storage Sense" feature added in Windows 10.
See: "How to Clear Temporary Files Automatically in Windows 10".
Windows 10 gained the ability to clear temporary files automatically in a recent build. Starting with build 15014, a new option appeared in Settings.
When enabled, it can be set to clear items like temporary files, Recycle Bin, etc. You can turn them off individually.
Alternatively, you could change your app to save its temporary files in a non-system temp folder that you control, rather than using GetTempPath(). You could also use something other than GetTempFileName() to create your temporary file names (date/times or GUIDs, for example), so Windows can't track the temporary files you create. Then your files should no longer be deleted automatically by Storage Sense.
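A minimal sketch of that idea, assuming a spool directory the app already owns (the .spool extension is an arbitrary placeholder):

#include <windows.h>
#include <objbase.h>   // CoCreateGuid, StringFromGUID2
#include <string>

#pragma comment(lib, "ole32.lib")

// Builds a unique file name in a folder the app controls, instead of
// relying on GetTempPath()/GetTempFileName().
std::wstring MakeSpoolFileName(const std::wstring& baseDir)
{
    GUID g;
    if (FAILED(CoCreateGuid(&g)))
        return L"";
    wchar_t guidStr[39];               // "{xxxxxxxx-...}" plus the NUL
    StringFromGUID2(g, guidStr, 39);
    return baseDir + L"\\" + guidStr + L".spool";
}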
The best solution, IMO, is not to use the temporary folder, which contains (as the name suggests) temporary files that can be deleted without any consequences.
Instead, you should store them somewhere in the LocalAppData folder.
Use the SHGetFolderPath function to retrieve the actual location of the LocalAppData folder.
In LocalAppData, create a folder named after your company and/or product (or some combination of both) and store all your pseudo-temporary files there, as sketched below.
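A sketch of the lookup and folder creation, with placeholder company/product names; on newer SDKs, SHGetKnownFolderPath(FOLDERID_LocalAppData, ...) is the preferred replacement for SHGetFolderPath:

#include <windows.h>
#include <shlobj.h>    // SHGetFolderPathW
#include <string>

#pragma comment(lib, "shell32.lib")

// Resolves %LOCALAPPDATA%\MyCompany\MyProduct\Spool and ensures it exists.
// "MyCompany", "MyProduct" and "Spool" are placeholders for your own names.
std::wstring GetSpoolDirectory()
{
    wchar_t appData[MAX_PATH];
    if (FAILED(SHGetFolderPathW(nullptr, CSIDL_LOCAL_APPDATA, nullptr,
                                SHGFP_TYPE_CURRENT, appData)))
        return L"";

    std::wstring dir(appData);
    for (const wchar_t* part : { L"\\MyCompany", L"\\MyProduct", L"\\Spool" }) {
        dir += part;
        CreateDirectoryW(dir.c_str(), nullptr);   // fine if it already exists
    }
    return dir;
}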