I want to modify *.ica files (used to launch Citrix apps) as they are downloaded, to add a transparent Key Passthrough option for remote desktop. I settled on using entr to monitor the directory and then call another script (which invokes sed) to update all of the .ica files.
while true; do
    ls *.ica | entr -d ~/Downloads/./transparentKeyPassthrough-CitrixIca.sh
done
However, this only works when there is already an .ica file in the directory. If the directory has no *.ica files when first executed, entr errors with:
entr: No regular files to match
Putting a dummy .ica file in the directory works around this: the new (real) .ica file is then detected by entr and acted on.
Is there a better way to do this?
The only alternative I can think of is to use entr to watch the whole directory for any change, run ls -l *.ica to check whether the change produced a new .ica file, and if so run the above script.
Nesting entr that way seems inelegant and complicated, so I wanted to know whether there is some simple option I am missing.
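For reference, here is a sketch of that directory-watching alternative using inotifywait instead of nested entr. It is untested, assumes Linux with inotify-tools installed, and reuses the script path from above:

#!/bin/bash
# Run the modifier script whenever a new .ica file finishes downloading.
# close_write catches files written in place; moved_to catches browsers that
# download to a temporary name and then rename the file to *.ica.
inotifywait -m -e close_write -e moved_to --format '%w%f' ~/Downloads |
while IFS= read -r file; do
    case $file in
        *.ica) ~/Downloads/transparentKeyPassthrough-CitrixIca.sh ;;
    esac
done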
I am new to bash scripting and I have to create a script that will run on all computers within my group at work (so it's not just checking one computer). We have a spreadsheet that keeps certain file information, and I am working to automate the updating of that spreadsheet. I already have an existing python script that gathers the information needed and writes to the spreadsheet.
What I need is a bash script (cron job, maybe?) that is activated anytime a user deletes a file that matches a certain extension within the specified file path. The script should hold on to the file name before it is completely deleted. I don't need any other information besides the name.
Does anyone have any suggestions for where I should begin with this? I've searched a bit but not found anything useful yet.
It would be something like:
for folders and files in path:
    if file ends in .txt and is being deleted:
        save file name
To save the name of every .txt file deleted in some directory path or any of its subdirectories, run:
inotifywait -m -e delete --format "%w%f" -r "path" 2>stderr.log | grep '\.txt$' >>logfile
Explanation:
-m tells inotifywait to keep running; the default is to exit after the first event.
-e delete tells inotifywait to report only file delete events.
--format "%w%f" tells inotifywait to print only the path of the deleted file (the watched directory followed by the file name).
path is the target directory to watch.
-r tells inotifywait to monitor subdirectories of path recursively.
2>stderr.log tells the shell to save stderr output to a file named stderr.log. As long as things are working properly, you may ignore this file.
>>logfile tells the shell to redirect all output to the file logfile. If you leave this part off, output will be directed to stdout and you can watch in real time as files are deleted.
grep '\.txt$' limits the output to files with .txt extensions.
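If each deleted name needs to be handed to another program as it happens (for example, your existing Python script), here is a sketch along the same lines; the name update_spreadsheet.py is only a placeholder for your own script:

#!/bin/bash
# Watch "path" recursively and process each deleted .txt file name as it arrives.
inotifywait -m -e delete --format "%w%f" -r "path" 2>stderr.log |
grep --line-buffered '\.txt$' |
while IFS= read -r deleted; do
    printf '%s\n' "$deleted" >> logfile
    # python3 update_spreadsheet.py "$deleted"   # placeholder hook
done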
Mac OSX
Similar programs are available for OSX. See "Is there a command like “watch” or “inotifywait” on the Mac?".
I need to find a solution at work to backup specific folders daily, hopefully to a RAR or ZIP file.
If it were on a PC, I would have done it already, but I don't have any idea how to approach it on a Mac.
What I basically want to achieve is an automated task, run from an executable, that does the following:
Compress a specific directory (/Volumes/Audio/Shoko) to a RAR or ZIP file.
(In the archive, exclude all *.wav files in all subdirectories and a directory named "Videos".)
Move it to a network share (/Volumes/Post Shared/Backup From Sound).
(Or compress directly to this folder.)
Name the archive with a dynamic date and time, so there are no duplicate file names.
Shut down the Mac when finished.
I want to say again that I don't usually use a Mac, so things like what kind of file the script should go in are not yet trivial for me.
I have tried putting Mark's bash lines (from the first answer, below) in a txt file and executing it, but it had errors and didn't work.
I also tried to use Automator, but it's too plain, no advanced options.
How can I accomplish this?
I would love a working example :)
Thank You,
Dave
You can just make a bash script that does the backup, and then you can either double-click it or run it on a schedule (see the cron sketch further down). I don't know your paths and/or tools of choice, but something along these lines:
#!/bin/bash
FILENAME=$(date +"/Volumes/path/to/network/share/Backup/%Y-%m-%d.tgz")
cd /directory/to/backup || exit 1
tar -cvzf "$FILENAME" .
You can save that on your Desktop as backup and then go into Terminal and type:
chmod +x ~/Desktop/backup
to make it executable. Then you can just double-click it - obviously after changing the paths to reflect what you want to back up and where to.
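If you would rather run it on a schedule than double-click it, one option is a crontab entry pointing at the same script (a sketch; launchd is the more native macOS scheduler, but cron also works here):

# Edit your crontab with "crontab -e" and add a line like this
# to run the backup every day at 02:00:
0 2 * * * ~/Desktop/backup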
Also, you may prefer to use some other tools - such as rsync - but the method is the same.
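To cover the remaining points from the question - skipping *.wav files and the "Videos" directory, adding a time as well as a date so names never collide, and shutting down afterwards - here is a sketch building on the same tar script. The paths are the ones from the question; exclude-pattern matching differs slightly between the macOS tar and GNU tar, so test with a listing first, and the shutdown prompts for an administrator password:

#!/bin/bash
# Back up /Volumes/Audio/Shoko to the network share, excluding *.wav and Videos/.
FILENAME="/Volumes/Post Shared/Backup From Sound/Shoko-$(date +"%Y-%m-%d_%H-%M-%S").tgz"
cd /Volumes/Audio/Shoko || exit 1
tar -cvzf "$FILENAME" --exclude='*.wav' --exclude='Videos' . || exit 1
# Shut down only if the archive was written successfully.
sudo shutdown -h now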
I have a lastaction script I'm trying to run for my "log" folder, as I want to move all files and folders in the log folder into the log/archive folder. So I simply added
mv log/* log/archive/2014
Obviously enough, I get an error saying the archive folder cannot be moved into a subdirectory of itself, so I tried adding an extra pattern to the move command to move everything except the archive folder.
mv !(archive) log/* log/archive/2014
This exact command, if executed from the CLI, works just fine, but when added inside the lastaction/endscript block, it throws the following message:
logrotate_script: 2: logrotate_script: Syntax error: "(" unexpected
Does anybody have any clue why this happens?
You are using bash as your shell. You also have the extglob setting enabled.
When logrotate is executing that shell script, one or the other of those is not true.
Also, that mv command looks odd to me. If archive is under log, then I don't see why !(archive) doesn't have log/ as a prefix, and the log/* that follows will still match log/archive regardless of the extglob pattern before it. That is, I would think you wanted mv log/!(archive) log/archive/2014, assuming you don't feel like simply ignoring the warning from mv in the first place.
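If you do want the extended glob inside the logrotate script, one sketch (using the paths from the question) is to invoke bash explicitly with extglob enabled, since the option has to be set before the command line is parsed:

lastaction
    bash -O extglob -c 'mv log/!(archive) log/archive/2014'
endscript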
I take delivery of files from multiple places as part of a publishing aggregation service. I need a way to move files that have been delivered to me from one location to another without losing the directory structure, which I rely on for sorting purposes.
Example:
Filepath of delivery: Server/Vendor/To_Company/Customer_Name/**
Filepath of processing: ~/Desktop/MM-DD-YYYY/Returned_Files/Customer_Name/**
I know I can move all of the directories by doing something such as:
find Server/Vendor/To_Company/* -exec mv -n {} ~/Desktop/MM-DD-YYYY/Returned_Files \;
but using that I can only run the script one time per day and there are times when I might need to run it multiple times.
It seems like ideally I should be able to create a copycat directory in my daily processing folder and then move the files from one to the other.
You can use the rsync command with the --remove-source-files option; you can run it as many times as needed.
# Trial run, without making any actual transfer.
rsync --dry-run -rv --remove-source-files Server/Vendor/To_Company/ ~/Desktop/MM-DD-YYYY/Returned_Files/
# Actual transfer.
rsync -rv --remove-source-files Server/Vendor/To_Company/ ~/Desktop/MM-DD-YYYY/Returned_Files/
reference:
http://www.cyberciti.biz/faq/linux-unix-bsd-appleosx-rsync-delete-file-after-transfer/
You could use rsync to do this for you:
rsync -a --remove-source-files /Server/Vendor/To_Company/Customer_Name ~/Desktop/$(date +"%y-%m-%d")/Returned_files/
Add -n to do a dry run to make sure it does what you want.
From the manual page:
--remove-source-files
    This tells rsync to remove from the sending side the files (meaning non-directories) that are a part of the transfer and have been successfully duplicated on the receiving side.

    Note that you should only use this option on source files that are quiescent. If you are using this to move files that show up in a particular directory over to another host, make sure that the finished files get renamed into the source directory, not directly written into it, so that rsync can't possibly transfer a file that is not yet fully written. If you can't first write the files into a different directory, you should use a naming idiom that lets rsync avoid transferring files that are not yet finished (e.g. name the file "foo.new" when it is written, rename it to "foo" when it is done, and then use the option --exclude='*.new' for the rsync transfer).
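For example, if deliveries arrive under a temporary suffix as the man page describes, the command from this answer could become (a sketch, assuming the ".new" naming convention from the quote above):

rsync -a --remove-source-files --exclude='*.new' /Server/Vendor/To_Company/Customer_Name ~/Desktop/$(date +"%y-%m-%d")/Returned_files/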
I used to use Ubuntu, and most of the time the commands are the same as on Ubuntu.
But there is one command that annoys me:
cp (the copy command).
I often type cp -a FOLDER/ target
to copy a folder; however, this copies everything under FOLDER/ but not FOLDER/ itself.
I know I can get what I want by using cp -a FOLDER target (without the trailing slash),
but it is easy to misuse the command.
Is there any way to change this behavior? Thanks.
The command cp is a program in /bin, so you could rename the real cp to another file, such as main_cp.exe, and create a script of your own called cp which rewrites the call from what you type into what is actually required and then runs main_cp.exe.
However, if you're going to be using OSX a lot, and perhaps on other people's machines, training yourself not to type the trailing slash is probably going to be better in the long term.
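A less invasive variant of the same wrapper idea (a sketch, assuming bash or zsh) is a shell function in your ~/.bashrc or ~/.zshrc that strips trailing slashes before calling the real cp:

# Wrapper so that "cp -a FOLDER/ target" behaves like "cp -a FOLDER target".
cp() {
    local -a args
    local a
    args=()
    for a in "$@"; do
        if [[ $a == */ && $a != / ]]; then
            args+=("${a%/}")    # drop a single trailing slash
        else
            args+=("$a")
        fi
    done
    command cp "${args[@]}"
}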