File watcher in shell

I am trying to keep two directories synchronized with the same files in them.
Files are dropped into Directory A throughout the day. I would like to create a file watcher script that will copy files from Directory A to Directory B as soon as they are dropped.
My thought was to run the job every minute and simply copy everything that dropped in the last minute, but I am wondering if there is a better solution out there.
I'm running MKS toolkit under Windows. Different servers, same operating system.
Thanks for your help!

If you use Linux, you can hook into the kernel using the inotify API to get notified when something in a folder changes. There are command-line tools such as inotifywait(1) and inotifywatch(1) as well.
To copy the files, I suggest using rsync(1): it is clever, knows how to clean up after itself, and it writes new files under hidden temporary names while they are being copied, so users and programs are less likely to pick them up before they are complete.
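A minimal sketch combining the two, assuming the inotify-tools package and rsync are installed; the two directory paths are placeholders for your own:

#!/bin/sh
# Watch Directory A and copy each finished file to Directory B.
SRC=/path/to/directory_a       # placeholder
DEST=/path/to/directory_b      # placeholder

# -m keeps monitoring; close_write/moved_to fire once a file is complete,
# so partially written files are not picked up.
inotifywait -m -e close_write -e moved_to --format '%f' "$SRC" |
while read -r file; do
    rsync -a "$SRC/$file" "$DEST/"
done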

Related

How can I mirror deleted duplicates from a source into a destination?

Here's the scenario: We have a computer running Windows 10 which has a directory that's backed up nightly. The backups are done with a batch file utilizing Robocopy and scheduled via Windows. The parameters are set such that the backup will always add any new files or existing file edits into the destination, but it will never delete files from the destination that have been deleted in the source. It essentially archives all files which are in the source directory at the end of each day.
Here's the tricky part. The source directory is very large, and occasionally someone finds a duplicate file (or several duplicates of a file) in it. When that happens, we need to delete all but one copy of the file, and then we need to access the backup directory manually, locate the file there, and do the same. This is tedious and time-consuming as it's not rare for someone to notice an entire subdirectory full of files that exist 5+ times each.
What we're looking for is a way to scan the source directory and all subdirectories inside for duplicate files and remove all but one copy of them, and then a way to reflect that into the destination. I've assumed that we will not be able to use Robocopy to reflect the changes in the destination due to the nature of the backup script it's running, but we do have the ability to run any third-party software on the destination directory as well, essentially running an action in both directories to clean each of them of duplicate files.
On that note, I'm not against using third-party tools to make this cleaner or more efficient, I'm just not aware of any.
There is one way to solve this. I was also struggling with this problem, but I found out how to do it with a batch file.
There are mainly two commands:
xcopy
robocopy
For your needs here, xcopy will be helpful.
xcopy will back up your specific file or folder; even if you have changed only some megabytes of data, it will copy the new data and, rather than replacing the previous data, it will make a new copy.
HOW TO DO IT
Open Notepad and type
xcopy "source file" "destination" /y/e/d/c/f/h/i/z/j
Then save the file with a ".bat" extension.
For more options, see the documentation below:
https://learn.microsoft.com/en-us/windows-server/administration/windows-commands/xcopy

Running an AppleScript on a folder action

I have attached a script to a folder so it runs whenever something is added to the folder. The problem is that if multiple files are added one after another, the process gets queued. How can I make the script run even if it's already running for another file?
You can't stop the process from queuing. The handler "on adding folder items to xxfolder after receiving xxfiles" gets all the files dropped at the same time (xxfiles is a list). When the system takes too much time to add all the files (a copy over a slow network, ...), it splits the list of added files into sub-lists, each one calling the script. That's the way it has been built! But what is the trouble with having several calls instead of only one? As long as all added files are processed, it should be OK.
FYI, the system makes periodic calls to check folder actions, using a launchd process, but it is not documented where the period is set.

How does one add a single file to a Perforce repository?

I have a large shared drive (~500 GB, 20k files, Samba/AFS). I would like to add all the files in there to a Perforce repository.
I imagine adding/committing them all in one fell swoop is not a good idea.
How would I go about doing that? Add/commit them one by one? And would that ensure that the files on the shared drive are NOT locked?
I am comfortable with bash or perl, and this would have to happen under Mac OS X.
Bonus question: would the method also allow checking in the same files if they get changed on the shared drive via a cron job?
Thanks.
What Perforce can handle will depend on your hardware. You likely won't have to add the files one at a time, however. This article here shows how to add whole directories at once:
Regarding your bonus question: yes, you can easily handle files that change on the shared drive by running the reconcile method from your cron job. See the section 'Reconcile through the Command Line' in this article.
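As a rough sketch of both parts, assuming a Perforce client workspace is already set up and mapped to the shared drive, and that p4 is on your PATH (the mount path below is a placeholder):

cd /Volumes/shared_drive       # placeholder for the mounted share

# Initial import: feed the file list to p4 on stdin so ~20k files
# don't exceed the shell's argument-length limit.
find . -type f | p4 -x - add
p4 submit -d "Initial import of shared drive"

# Later, e.g. from a cron job: reconcile picks up files added, edited,
# or deleted on the share outside of Perforce; then submit.
p4 reconcile -a -e -d ...
p4 submit -d "Scheduled reconcile of shared drive"

The trailing ... on the reconcile line is Perforce's recursive wildcard, not an omission.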

Windows: File Monitoring Script (Batch/VBS)

I'm currently working on a custom backup script; the only piece I'm missing is a file monitor. I need some form of a script that will monitor a folder for file changes and then run a command with the file that's changed.
So, for example, if the file changes, it'll execute "c:/syncbatch.bat %Location_Of_File%"
In VBScript, you can monitor a folder for file changes by subscribing to the WMI __InstanceModificationEvent event. These articles contain sample code that you can learn from and adapt to your specific needs:
WMI and File System Monitoring
How Can I Monitor for Different Types of Events With Just One Script?
Calling WMI is fairly cryptic, and it causes the WMI service to start running, which can contribute to bloat since it's fairly large, and you really can't cancel the file-change notifications you've requested from it without rebooting. Some people experimenting with remote printing from a Dropbox folder found that a simple VBScript program that ran an endless loop with a 10-second WScript.Sleep call in the loop used far fewer resources. Of course, to stop it you have to kill that script's task or program into it some exit trigger it can find, like a specifically named empty file in the watched folder, but that is still easier to do than messing with WMI.
The Folder Spy http://venussoftcorporation.blogspot.com/2010/05/thefolderspy.html
is a free, lightweight .NET-based file/folder-watching GUI application I've used before to run scripts based on file changes. It looks like the new version can pass the event filename to the launched command. The old version I had didn't yet support file event info, so when launched, my script had to instantiate a FileSystemObject and scan the watched folder to locate the new files based on criteria like datestamps and sizes.
This newer version appears to let you pass the file name to the script if you say myscript.vbs "*f" in the optional script call entry. Quotes can be important when passing file paths that have spaces in folder names. Just remember that if you are watching change events you will get a lot of them as a file grows or is edited; usually you just want notification of file adds or deletes.
Another trick your script can do is put the file size in a variable, sleep for a few seconds, and check the file again to see if it's changed. If it hasn't changed in a few seconds, you can usually assume whatever created it is done writing it to disk. If it keeps changing, just loop until it's stable.
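For illustration, here is that size-polling trick sketched in shell; a VBScript version would follow the same pattern with FileSystemObject and WScript.Sleep. The file path is whatever your watcher hands the script:

# Wait until a file's size stops changing before processing it.
FILE="$1"
size=$(stat -c %s "$FILE")            # GNU stat; on BSD/macOS use: stat -f %z
while sleep 5; do
    newsize=$(stat -c %s "$FILE")
    [ "$newsize" -eq "$size" ] && break   # unchanged for 5 seconds: assume the writer is done
    size=$newsize
done
echo "$FILE looks stable; safe to process"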

Cocoa + skip OS-generated files

My app goes through different folders, takes each file into account, reads it, does a lot of processing on it, and marks the folder it has processed as done. But this is not happening, because the system immediately generates files like .DS_Store, .localized, and .Trash. So is there any mechanism to skip processing hidden files, or to stop the OS from generating these files programmatically?
Thanks
Couldn't you change your app to just ignore files that start with "."? You've tagged this Cocoa, so using something like NSFileManager's contentsOfDirectoryAtURL:includingPropertiesForKeys:options:error: seems appropriate. One of the options you can specify is NSDirectoryEnumerationSkipsHiddenFiles, which will skip hidden files.
Check the documentation for more details.
I'm not aware of any option that disables the generation of .DS_Store files locally. There is an option for remote (network) volumes, here.
Another way to do it could be to create a Unix user just for that job and let it own the directories, so that the Finder can never go there. Either start the job manually using sudo, make it a setuid job, or use launchd.
