I have some software that exports a file called My Library.bib to a folder called thesis. Assume that the name of the exported file is fixed. Every time I export such a file, I want to:
Delete any old files called MyLibrary.bib if they exist.
Remove the space from the new file's name so that it becomes the up-to-date MyLibrary.bib.
I've tried making an Automator 'folder action' as follows:
... However, while the shell script works perfectly if run manually, the folder action itself never appears to trigger.
Folder actions are nonetheless enabled (see below settings), and other folder actions do seem to work.
In summary, I just want any file named My Library.bib that enters the thesis folder (at any time, automatically) to be renamed to MyLibrary.bib, replacing any existing MyLibrary.bib. Any ideas what's going wrong, or how else to achieve this? Thanks in advance.
When you use the "Run Shell Script" action, the current directory is the Home folder, not the "thesis" folder.
So you must use the cd command to change the current directory first.
Information:
The "Get Folder Contents" action is not needed for what you want to do; you can remove it.
The rm command is not necessary; you can use mv -f to overwrite an existing file.
read firstLine                    ### get the path of the first added item
myDir=$(dirname "$firstLine")     ### get the parent folder (the folder this action is attached to)
cd "$myDir" && if [ -f "My Library.bib" ]; then
    mv -f "My Library.bib" "MyLibrary.bib"
fi
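A variant, in case the action's "Pass input" pop-up is set to "as arguments" rather than to stdin (an assumption about your workflow's settings); it loops over every added item instead of only the first:
for f in "$@"; do                                      ### each added item's full path
    if [ "$(basename "$f")" = "My Library.bib" ]; then
        mv -f "$f" "$(dirname "$f")/MyLibrary.bib"     ### overwrite any old MyLibrary.bib
    fi
done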
I have a folder path stored in a variable ${PROJECT_DIR}.
I want to navigate up into its parent folder and back down into a folder called "Texture Packer", i.e. ${PROJECT_DIR} and "Texture Packer" are siblings.
How do I specify this path in a shell script?
So far I have:
TP=/usr/local/bin/TexturePacker
# create all assets from tps files
${TP} "${PROJECT_DIR}/../Texture Packer/*.tps"
But this is incorrect, since Texture Packer can't detect the files in the path. The error message displays:
TexturePacker:: error: Can't open file /Users/john/Documents/MyProj/proj.ios_mac/../Texture Packer/*.tps for reading: No such file or directory
EDIT: The following seems to work but isn't clean:
#! /bin/sh
TP=/usr/local/bin/TexturePacker
if [ "${ACTION}" = "clean" ]
then
# remove sheets - please add a matching expression here
# Some unrelated stuff
else
cd ${PROJECT_DIR}
cd ..
cd "Texture Packer"
# create all assets from tps files
${TP} *.tps
fi
exit 0
You're on the right track; the problem is that wildcards (like *.tps) don't get expanded when they're in quotes. The solution is to leave that part of the path outside of the quotes:
${TP} "${PROJECT_DIR}/../Texture Packer"/*.tps
BTW, I almost always recommend against using cd in scripts. It's too easy to lose track of where the current directory will be at various points in the script, or to have an error occur and the rest of the script run in the wrong place, or... Also, any relative paths you're using (e.g. those supplied by the user as arguments) change meaning every time you cd. Basically, it's an opportunity for things to go weirdly wrong.
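For completeness, here is a rough rewrite of the script from the question using that quoting fix, so no cd is needed at all (ACTION and PROJECT_DIR are assumed to be provided by Xcode, as in the original):
#!/bin/sh
TP=/usr/local/bin/TexturePacker

if [ "${ACTION}" = "clean" ]; then
    # remove sheets - please add a matching expression here
    :   # placeholder for the unrelated clean-up steps
else
    # create all assets from tps files; the wildcard stays outside the quotes
    ${TP} "${PROJECT_DIR}/../Texture Packer"/*.tps
fi
exit 0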
New at this. I need to use a postinstall script to move a file and a folder to the user's Application Support folder on a Mac. For the file, I only want to move it if the file doesn't already exist; I do not want to overwrite it if it does exist. Here is my script. It runs, but nothing gets copied. I'm using the Packages app, btw, and this script is loaded into the postinstall script tab.
#!/bin/sh
if ! "/Library/Application Support/MyApp/MyApp user dict"; then
mv "$1/Contents/Resources/MyApp user dict" "/Library/Application Support/MyApp/.";
fi
mv "$1/Contents/Resources/Spellcheck Dictionary" "/Library/Application Support/MyApp/.";
exit 0
User-specific tasks generally do not belong in installer scripts -- remember that there may be multiple users on a machine, and that some of them may not be accessible when your installer is running. (For example, users may have encrypted home directories, or may not exist until after your installer is run.) If your application needs to copy files to the user's home directory, it should probably do this when it is first launched.
Nevertheless, I see several specific issues with this script:
Your script refers to $1 in several places. Are you sure that your script has an argument passed to it on the command line?
The correct syntax to test if a file does not exist is:
if [ ! -f "/path/to/file" ] ; then …
Your script is missing the square brackets and -f condition. (For details, see man test.)
Assuming that $1 is supposed to be the path to the current user's home directory, you have the arguments to mv backwards. The destination comes last, not first. (The syntax is essentially mv from to.)
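Putting those fixes together, a corrected sketch of the postinstall script might look like this (assuming $1 really does point at the installed application bundle, which you should verify):
#!/bin/sh
DEST="/Library/Application Support/MyApp"

# only copy the user dictionary if one isn't already there
if [ ! -f "$DEST/MyApp user dict" ]; then
    mv "$1/Contents/Resources/MyApp user dict" "$DEST/"
fi

mv "$1/Contents/Resources/Spellcheck Dictionary" "$DEST/"
exit 0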
I've recently started to learn bash scripting and have started to create a file repository system. I have gotten pretty far and am able to add and remove files. When I remove a file from the repository it actually leaves the file in place but changes its permissions so only the user who removed it can use it; it then sends a copy to their home area and renames the file left behind to "$fileNameOUT".
I now plan to add a feature to my add function which checks, after a file has been added, whether there is a file with the same name but with "OUT" at the end; if it finds one, the old file will be sent to a backup folder so files can be restored. I know I have to loop through the directory using a for loop, but the problem I'm having is that I don't know how I can compare the file I have just added to all of the files in the directory.
I hope someone can make sense of what I just wrote.
If you know the name of the file you are interested in, you can use the -e test to check if it exists.
if [ -e fooOUT ]
then
    echo File exists
fi
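To apply that to your add function, a minimal sketch might look like this (the variable names and the backup folder location are only placeholders, not part of your existing script):
newFile="report.txt"                 ### the file you have just added (placeholder name)
repoDir="$HOME/repository"           ### hypothetical repository folder
backupDir="$repoDir/backup"          ### hypothetical backup folder

if [ -e "$repoDir/${newFile}OUT" ]; then
    mkdir -p "$backupDir"
    mv "$repoDir/${newFile}OUT" "$backupDir/"    ### stash the old copy so it can be restored later
fi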
I have a folder where some XML files keep getting updated, and I want those files copied to another folder, from which a piece of software collects the XML. I want to watch the folder and automatically copy only new files to the other folder as they arrive.
Is there any ready-made solution, or can it be done using AutoIt, a shell script, or VB.NET (last choice)?
Main concerns:
Know when a file is added.
Copy them to the folder.
Make sure it's not copied a second time.
If I schedule this code, it might also work
cd /home/webapps/project1/folder1
for f in *.csv
do
    cp -v "$f" /home/webapps/project1/folder2/"${f%.csv}$(date +%m%d%y)".csv
done
But again, it doesn't have a way to avoid copying files that have already been copied.
rsync. Put it into cron to run every few minutes.
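For example, something along these lines (paths borrowed from the snippet above; --ignore-existing makes rsync skip files already present in the destination, which covers the "don't copy twice" concern):
# e.g. run from cron every 5 minutes:
# */5 * * * * rsync -a --ignore-existing /home/webapps/project1/folder1/ /home/webapps/project1/folder2/
rsync -a --ignore-existing /home/webapps/project1/folder1/ /home/webapps/project1/folder2/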
Does a folder action for when a folder item is changed not exist? I want my script to run when and if I update a file. I don't see any reference to it in the documentation. Is there some sort of alternative I am missing? Because this seems pretty crazy not to have.
on adding folder items to this_folder after receiving added_files
do shell script "anything"
end adding folder items to
on removing folder items from this_folder after losing removed_files
do shell script "anything"
end removing folder items from
-- does not exist?!?
on changing folder items in this_folder after updating changed_files
do shell script "anything"
end changing folder items in
Nope, doesn't exist directly. However, something similar could be accomplished with an idle handler that watches the files in the folder to see if their modification date has changed and perform an action on files where that's true.
There is an alternative to folder actions: you can use launchd and set up a watch path. With a watch path, any time something changes in the folder you are watching, your code runs. The biggest difference between folder actions and the launchd approach is that with launchd you don't know which files changed; you just know that something changed. So your code has to figure out what the change actually was, but that shouldn't be too difficult in your case, because if you're looking for an updated file you just check the modification dates of the files.
You can google for launchd and watch paths if you want to try it.
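As a rough sketch of what that looks like (the label, script path, and watched folder below are placeholders, not something from your setup):
#!/bin/sh
PLIST="$HOME/Library/LaunchAgents/com.example.watchfolder.plist"

# write a minimal launch agent whose WatchPaths entry points at the folder to monitor
cat > "$PLIST" <<'EOF'
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN" "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>
    <key>Label</key>
    <string>com.example.watchfolder</string>
    <key>ProgramArguments</key>
    <array>
        <string>/Users/you/Scripts/handle-change.sh</string>
    </array>
    <key>WatchPaths</key>
    <array>
        <string>/Users/you/WatchedFolder</string>
    </array>
</dict>
</plist>
EOF

launchctl load "$PLIST"    ### the script listed in ProgramArguments runs whenever the folder changes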
What about rsync -va '/source/path/' '/destination/path/'? Using Lingon, you could have this simple command set up as a user daemon to run, say, every 10 seconds.
I found a tricky way to do this easily with Automator. It only works in some cases, so try it and see if it helps. When you create or modify a folder's contents in OS X, a hidden OS file called .DS_Store gets written to the folder; it's a useless file to a user, but it can trigger the folder action. With that said, I use rsync in a Run Shell Script action in my folder action. Once the action is done syncing, I then remove the .DS_Store file.
Here is my example:
rsync -r /Users/path/to/source/* /Users/path/to/destination
rm -f /Users/path/to/source/.DS_Store
Then the next time you modify files/folders in that directory, the folder action kicks in and the process would repeat.
I hope this helps...