I have a folder where some XML files keep getting updated, and I want those files copied to another folder, from which a piece of software collects the XML. I want to watch the folder and automatically copy only the new files to the other folder as they arrive.
Is there any ready-made solution, or can it be done using AutoIt, a shell script, or VB.NET (last choice)?
Main concerns:
Know when a file is added.
Copy them to the folder.
Make sure it's not copied a second time.
If I schedule this code, it might also work:
cd /home/webapps/project1/folder1
for f in *.csv
do
    cp -v "$f" /home/webapps/project1/folder2/"${f%.csv}"$(date +%m%d%y).csv
done
But again, it doesn't have a way to skip files that have already been copied.
rsync. Put it into cron to run every few minutes.
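A minimal sketch of that approach, using the folder1/folder2 paths from the question; rsync's --ignore-existing flag skips anything already present in the destination, which covers the "don't copy it twice" concern for files that never change once written:

# copy new files only; files already in folder2 are left alone
rsync -av --ignore-existing /home/webapps/project1/folder1/ /home/webapps/project1/folder2/

And a crontab entry to run it every five minutes:

*/5 * * * * rsync -a --ignore-existing /home/webapps/project1/folder1/ /home/webapps/project1/folder2/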
I have two directories:
/dev/
/www/
The www directory is a copy of the dev directory. I copy the files across from dev to www when they're ready to go live, using a script that deletes all of the files inside the www directory and then copies the dev files into it. I'm losing the last-modified time, though, as the new copies are essentially new files.
How can I copy the last-modified date too?
It was only a particular subdirectory that I was concerned about, so I did it with a for loop in my shell script:
DIR_DEV="/dev"
DIR_LIVE="/www"
for i in "$DIR_DEV"/demos/*.html
do
    DEMO_FILENAME=$(basename "$i")
    touch -d "$(stat --format=%y "$DIR_DEV/demos/$DEMO_FILENAME")" "$DIR_LIVE/demos/$DEMO_FILENAME"
done
OOPSY: As I write this I've realised that the copy command has a --preserve option... Could've saved a few hours. :-/
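For reference, a one-liner along those lines, assuming GNU cp and the demos paths from the script above:

# copy the tree and keep each file's modification time (-p or -a would also work)
cp -r --preserve=timestamps /dev/demos/. /www/demos/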
Due to the way the Plesk Extension on my web server works, I am trying to write a shell command that fires after a deployment. This simply needs to copy the contents of one folder to another.
Currently, I am using this:
cp -r /deployed-site/public/ /httpdocs/
However, this only seems to work if the destination folder is empty. How can I get the contents of the first folder copied into the second every time a deployment occurs?
I would say it's better to clean the destination folder before copying files:
rm -rf /httpdocs
cp -r /deployed-site/public/ /httpdocs/
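If you'd rather empty /httpdocs without deleting and recreating the directory itself (its ownership or permissions may matter to Plesk), here is a sketch of an equivalent post-deployment step, assuming GNU find and cp are available on the server:

# remove everything inside /httpdocs, including dotfiles, but keep the directory itself
find /httpdocs -mindepth 1 -delete
# copy the deployed contents (including dotfiles) into it, preserving attributes
cp -a /deployed-site/public/. /httpdocs/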
I have some software that exports a file called My Library.bib to a folder called thesis. Assume that the name of the exported file is fixed. Every time I export such a file, I want to:
Delete any old files called MyLibrary.bib if they exist.
Remove the space from the new file's name so that it becomes the up-to-date MyLibrary.bib.
I've tried making an Automator 'folder action' as follows:
... However, while the shell script works perfectly if run manually, the folder action itself never appears to trigger.
Folder actions are nonetheless enabled (see the settings below), and other folder actions do seem to work.
In short, I just want any file named My Library.bib entering the thesis folder (at any time, automatically) to be renamed to MyLibrary.bib, replacing any existing MyLibrary.bib. Any ideas what's going wrong, or how else to achieve this? Thanks in advance.
When you use the "Run Shell Script" action, the current directory is the Home folder, not the "thesis" folder.
So you must use the cd command to change the current directory.
Notes:
The "Get Folder Contents" action is not needed for what you want to do; you can remove it.
The rm command is not necessary; you can use mv -f to overwrite an existing file.
read firstLine                    ### get the path of the first dropped item
myDir=$(dirname "$firstLine")     ### get its parent folder (the folder this action is attached to)
cd "$myDir" && if [ -f "My Library.bib" ]; then
    mv -f "My Library.bib" "MyLibrary.bib"
fi
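If you instead set the "Run Shell Script" action to "Pass input: as arguments", a variant along these lines (a sketch, not tested in Automator) handles every dropped item rather than only the first line of stdin:

for f in "$@"; do                                      ### each dropped path arrives as an argument
    if [ "$(basename "$f")" = "My Library.bib" ]; then
        mv -f "$f" "$(dirname "$f")/MyLibrary.bib"     ### overwrite any existing MyLibrary.bib
    fi
done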
I need to find a solution at work to backup specific folders daily, hopefully to a RAR or ZIP file.
If it were on a PC, I would have done it already, but I don't have any idea how to approach it on a Mac.
What I basically want to achieve is an automated task that can be run as an executable and that does the following:
Compress a specific directory (/Volumes/Audio/Shoko) to a RAR or ZIP file
(in the ZIP file, exclude all *.wav files in all subdirectories and a directory named "Videos").
Move it to a network share (/Volumes/Post Shared/Backup From Sound)
(or compress directly to this folder).
Automate the file name of the ZIP file with a dynamic date and time (so there are no duplicate file names).
Shut down the Mac when finished.
I want to say again that I don't usually use a Mac, so things like what kind of file to use for the script are not yet trivial for me.
I have tried putting Mark's bash lines (from the first answer, below) in a text file and executing it, but it had errors and didn't work.
I also tried using Automator, but it's too basic, with no advanced options.
How can I accomplish this?
I would love a working example :)
Thank You,
Dave
You can just make a bash script that does the backup, and then you can either double-click it or run it on a schedule. I don't know your paths and/or tools of choice, but something along these lines:
#!/bin/bash
FILENAME=$(date +"/Volumes/path/to/network/share/Backup/%Y-%m-%d.tgz")
cd /directory/to/backup || exit 1
tar -cvzf "$FILENAME" .
You can save that on your Desktop as backup and then go in Terminal and type:
chmod +x ~/Desktop/backup
to make it executable. Then you can just double-click it - obviously after changing the paths to reflect what you want to back up and where to.
Also, you may prefer to use other tools - such as rsync - but the method is the same.
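For completeness, here is a sketch that tries to cover the extra requirements from the question (the *.wav and Videos exclusions, a timestamped file name, and the shutdown). Treat it as a starting point rather than a tested solution; the paths come from the question, and the zip exclusion patterns may need tweaking:

#!/bin/bash
SRC="/Volumes/Audio/Shoko"
DEST="/Volumes/Post Shared/Backup From Sound"
STAMP=$(date +"%Y-%m-%d_%H-%M-%S")          # date and time, so file names never collide
ARCHIVE="$DEST/Shoko_$STAMP.zip"

cd "$SRC" || exit 1
# -r recurses; -x excludes *.wav everywhere and the Videos directory
zip -r "$ARCHIVE" . -x "*.wav" "Videos/*"

# shutting down needs administrator rights, so run the script with sudo if you want this step
sudo shutdown -h now

Save it and make it executable the same way (chmod +x), and schedule it with cron or launchd if you don't want to double-click it each day.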
My backup.zip has the following structure.
OverallFolder
lots of files and subfolders inside
I used this:
unzip backup.zip -d ~/public_html/demo
So I end up with ~/public_html/demo/OverallFolder/my other files.
How do I extract so that all the files INSIDE OverallFolder go DIRECTLY into ~/public_html/demo?
~/public_html/demo/my other files
Like this?
If you can't find any option to do that, this is the last resort:
mv ~/public_html/demo/OverallFolder/* ~/public_html/demo/
(cd ~/public_html/demo; unzip "$OLDPWD/backup.zip")
This, in a subshell, changes to your destination directory, unzips the file from your source directory, and when the subshell exits, leaves you back in your source directory.
That, or something similar, should work in most shells.
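If you'd rather avoid the post-extraction mv, bsdtar (shipped with macOS and many Linux distributions) reads zip archives and can drop the leading directory during extraction; a sketch, using the paths from the question:

# extract backup.zip into the demo directory, removing the leading OverallFolder/ component
bsdtar -xf backup.zip --strip-components=1 -C ~/public_html/demo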