Dealing with spaces in rsync path arguments - macOS

So I'm currently trying to write a few small scripts that let me manage my iTunes library, which I keep cloned on multiple OS X machines.
The basic idea is that I have a NAS holding a copy of the library that is used as an intermediate "master copy" since the machines holding the actually used copies aren't available all the time. If I want to update my old copy on machine B with the newer version from machine A, I'd then update the NAS copy based on machine A's current state, then update machine B from the updated NAS copy possibly at a later time.
The script files are located on the NAS within the same folder that also houses the iTunes directory. Since I'm mounting the NAS as a volume via AFP, I simply open a Finder window with the directory containing the scripts and drag'n'drop the script I want to use to a Terminal window for easy execution.
This is my attempt at the "update NAS from local copy" script:
rsync -avz --compress-level 1 --exclude 'Mobile Applications/*.ipa' --delete --delay-updates -n "$(echo $HOME | sed 's/ /\\ /g')/Music/iTunes" "$(dirname $0 | sed 's/ /\\ /g')"
(The -n option is of course only there for testing the script.)
Since there will be spaces in the paths I supply to rsync, I already figured out that I'd need to escape those somehow. I also know that the standard way to do that on OS X is to prepend each space with a backslash, at least when typing paths manually in Terminal. But the code above still won't work: rsync complains that it cannot change to the directory I supplied, although the path it prints in the error message looks perfectly fine and can be cd'd to once you remove the double quotes around it:
[...]
building file list ... rsync: change_dir "/Volumes/Macintosh\ HD/Users/Julian/Music" failed: No such file or directory (2)
done
[...]
If I remove the surrounding double quotes in the script itself, rsync doesn't honor the escaping backslashes at all and still treats the space following a backslash as a path separator:
[...]
building file list ... rsync: link_stat "/Volumes/Macintosh\" failed: No such file or directory (2)
rsync: change_dir "/Volumes/Macintosh HD/Users/Julian/HD/Users/Julian/Music" failed: No such file or directory (2)
done
[...]
And no, I can't work around the issue by shortening /Volumes/Macintosh\ HD/Users/Julian/Music to /Users/Julian/Music since this machine has multiple HDDs and / is not the same disk/partition as /Volumes/Macintosh\ HD. So I need to find a solution for this specific problem.
I'm seriously lost now.
Can anyone please explain to me what I need to change in order to have rsync recognize the paths correctly?

After messing around quite a bit more and finding this question, I managed to develop a working solution. The original attempt was effectively double-escaping: inside double quotes the shell already preserves spaces, so the backslashes inserted by sed became literal characters in the paths rsync received. Passing the unmodified paths in double quotes is all that's needed:
localpath="$HOME/Music/iTunes"
remotepath="$(dirname "$0")/"
rsync -avz \
    --compress-level 1 \
    --exclude 'Mobile Applications/*.ipa' \
    --delete \
    --delay-updates \
    -n \
    "$localpath" \
    "$remotepath"

Related

How to remove /Volumes from rsync destination path

I am trying to back up similarly to Time Machine; many examples on the web are incomplete. I am using -R for relative paths, but the destination then includes /Volumes as a hidden folder, so I cannot see the backup until I manually unhide it. I am also using -R so that non-existing directories get created. The destination is a USB flash drive containing a sparse bundle.
Source: /Volumes/Drive/Folder
Destination: /Volumes/USBBackup
I have tried putting in my filter.txt file:
H /Volumes/*
with and without this, /Volumes is included in the destination and hidden.
rsync -avHAXNR --fileflags --force-change --numeric-ids --protect-args --stats --progress --filter="._$FILTER" --exclude-from="$EXC" --link-dest="/Volumes/USBBackup/$PREVDIR" "/Volumes/Drive/Folder" "/Volumes/USBBackup/name-timestamp/" 2> ~/Desktop/rsync-errors.txt
What gets backed up is /Volumes/USBBackup/name-timestamp/Volumes/Drive/Folder, with the Volumes folder hidden.
What I want instead is /Volumes/USBBackup/name-timestamp/Drive/Folder, not hidden.
Have you considered changing your working directory to /Volumes while running that task?
You should be able to do something like:
(cd /Volumes && rsync -aR Drive/Folder /Volumes/USBBackup/name-timestamp/)
Or you could:
oldpwd=$(pwd)
cd /Volumes
rsync Drive/Folder /Volumes/USBBackup....
cd "$oldpwd"
(Note that OLDPWD is a variable the shell maintains itself, so it's safer to use your own variable name, or simply to finish with cd -.)
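Alternatively, rsync (since 2.6.7) lets you limit the implied path directly: insert a dot anchor /./ into the source path, and everything before it is stripped from the path that gets recreated on the receiving side, so no directory change is needed at all. A minimal sketch using the paths from the question:

rsync -aR /Volumes/./Drive/Folder /Volumes/USBBackup/name-timestamp/

This should produce /Volumes/USBBackup/name-timestamp/Drive/Folder rather than repeating the whole /Volumes prefix.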

Taking a backup of files while copying & pasting via the command line

I am using the command cp -a <source>/* <destination> to copy files into one particular destination. In the destination, the above command only replaces files that are also present in the source; any other files already in the destination are left as they are. Before pasting, I want to take a backup of the files that are about to be replaced by the copy. Is there an option in the cp command that does this?
There is no such option in the cp command; you need to create a shell script. First execute ls in your destination directory and store the output in a file such as history.txt. Then, just before each cp, grep the history file for the file you want to copy, to check whether it is already present in the destination. If it is, back up the destination copy first with today's datestamp, and then copy the file from source to destination.
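A minimal sketch of that idea, simplified to test the destination directly instead of going through a history file (the paths and the datestamp suffix are just examples):

#!/bin/bash
src=/path/to/source
dst=/path/to/destination
for file in "$src"/*; do
    name=$(basename "$file")
    # back up any file that is about to be replaced, with today's datestamp
    if [ -e "$dst/$name" ]; then
        cp -a "$dst/$name" "$dst/$name.$(date +%Y%m%d)"
    fi
    cp -a "$file" "$dst/"
done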
If you want to back up the files that are about to be overwritten, use the -b option, available in GNU cp:
cp -ab <source>/* <destination>
There are two caveats you should know about. First, to my knowledge this option is not available on non-GNU systems (such as BSD systems). Second, it will ask for confirmation for each existing file in the target; you can reduce the problem with the -u option, but this remains unusable in a script.
It appears to me that you are trying to make a backup (copy files to another location without erasing them or overwriting those already there), so you probably want to take a look at the rsync command. The same command would be written:
rsync -ab --suffix=".bak" <source>/ <destination>
and rsync is much more flexible at handling this sort of thing.
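If you would rather collect everything that gets replaced into a single dated folder instead of scattering .bak files around, rsync also has a --backup-dir option (a relative path here is created inside the destination):

rsync -ab --backup-dir="backup-$(date +%F)" <source>/ <destination>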

Rsync including all files

After some reading and trying the approach from "rsync copy over only certain types of files using include option", I can't seem to get it to work.
I run the following command:
rsync -zarv -vvv -e ssh --prune-empty-dirs --delete \
    --include="*/" --include="*.csv" --include="*.hdf5" --include="*.pickle" \
    --include="*.tar.gz" --include="*.bin" --include="*.zip" --include="*.npz" \
    --exclude="*" . user@host.com:/rsync
But at the target it backs up every file I have in the directory and its subdirectories. --delete-before and --delete-after do not delete files like .txt or .py either. I have also tried placing --exclude="*" before the includes, but I am running 2.6.9, so as far as I understand it belongs after them.
Deleting files on the host machine just makes them sync over again, for reasons I don't know.
Your command looks fine, although try using --delete-excluded instead of --delete.
--delete-excluded - also delete excluded files from destination dirs
It should remove from the destination any files that are excluded and not included.
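That is, using the command from the question (same hypothetical host as before):

rsync -zarv -e ssh --prune-empty-dirs --delete-excluded \
    --include="*/" --include="*.csv" --include="*.hdf5" --include="*.pickle" \
    --include="*.tar.gz" --include="*.bin" --include="*.zip" --include="*.npz" \
    --exclude="*" . user@host.com:/rsync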
Sorry to have bothered; this turned out to be a bash issue, not an rsync issue.
As I was using the command:
exec $COMMAND
instead of
eval $COMMAND
This produced who knows what as a result, but printing the command and executing it manually in bash worked correctly. Deleting items still seems flaky, but that I can experiment with.
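The difference matters because a bare variable expansion only word-splits the string; it does not re-apply the shell's quoting rules, so rsync receives the quote characters in --include="*/" literally. eval re-parses the whole string, and exec additionally replaces the shell process with the command, so nothing after it runs. A minimal illustration (the command string is just an example):

COMMAND='rsync -av --include="*/" --exclude="*" . /tmp/dest/'
$COMMAND          # rsync sees the literal double quotes inside the patterns
eval "$COMMAND"   # the string is re-parsed, so the embedded quotes work as intended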

Is it safe to alias cp -R to cp?

When I copy something, I always forget the -R, and then I have to go all the way back to add it right after cp.
I want to add this to bash config files.
alias cp="cp -R"
I have not seen anything bad happen. Is it safe to do this?
The only thing I can think of that would cause unexpected behavior with the -R flag is that it doesn't do what you might expect with wildcards. For example, say you want to copy all mp3 files in a directory and every subdirectory with cp -R /path/*.mp3. Although -R is given, it will not copy the mp3s in the subdirectories of path, if there are any, because the wildcard is expanded by the shell to the matching entries directly inside /path before cp ever runs.
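If you actually want the recursive behaviour for a wildcard, a find-based command does it (a sketch; adjust the paths to taste):

find /path -name '*.mp3' -exec cp {} /destination/ \;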
I wouldn't use aliases to change the behaviour of standard commands. When you are in a different shell or on another computer, the alias will be missing, and when a friend wants to help you, they will not know what you did.
Once I had an alias rm="rm -i", and I ran rm * right after switching shells with su, where the alias did not apply.
And sometimes you will want cp without the -R option (copy all files in the current directory to another location, but not the subdirectories); will you remember to use /bin/cp in those cases?
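For what it's worth, if you do keep the alias, there are two standard ways to bypass it for a single invocation without spelling out /bin/cp:

\cp file1 file2         # a leading backslash suppresses alias expansion
command cp file1 file2  # the "command" builtin bypasses aliases and shell functions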

cp: missing destination file operand after

I'm trying to do a full backup and copy all files from one directory into another directory.
#!/bin/bash
#getting files from this directory
PROJECTDIRECTORY=/../../Project3
#Copied to this directory
FILEBACKUPLOCATION= /../.../FMonday
for FILENAME in $PROJECTDIRECTORY/*
do
cp $FILENAME $FILEBACKUPLOCATION
done
But I keep getting this error
./fullbackup: line 6: /../../FMonday: Is a directory
cp: missing destination file operand after
Variable assignments in bash require no space between the variable name and the value, and no unquoted spaces within the value. Because of the space in the line FILEBACKUPLOCATION= /../.../FMonday, the shell executed /../.../FMonday as a command (which caused the first error) with FILEBACKUPLOCATION set to an empty string in that command's environment only. The variable was therefore still unset when the cp further down tried to use it (accounting for the second error).
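A minimal illustration of the three cases (the path is just an example):

dest=/backups/FMonday      # correct: assigns the value to dest
dest =/backups/FMonday     # wrong: tries to run a command named "dest"
dest= /backups/FMonday     # wrong: runs /backups/FMonday with dest empty in its environment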
I think what you need is FILEBACKUPLOCATION=/FMonday/
In my case, I just used a space and a dot after the file name, and it worked perfectly. For example, darthVader is a file inside the F drive, so I used the command "mv /f/darthVader ."
As you have already found, the variable needs to be FILEBACKUPLOCATION=/location with no space. Also, if possible, try to make your script a little more portable by using something like:
FILEBACKUPLOCATION=$HOME/backup/FMonday
In this case, the location is relative to your user $HOME environment variable.
Going further, if you want to keep two directories in sync, you could probably do better with rsync; one of its advantages is that it only copies files that don't exist yet or have changed.
You indeed could create an alias to copy/sync files on demand, for example:
alias cpr="rsync --delete --archive --numeric-ids --human-readable --no-compress --whole-file --verbose --info=progress2"
Then if you would like to sync contents of /dir/foo into /dir/bar, could simply do:
$ cpr /dir/foo/ /dir/bar/
Your script could then be something like:
#!/bin/sh
ORIGIN=/path/to/foo/
DESTINATION=/path/to/bar/
rsync --delete \
--archive \
--numeric-ids \
--human-readable \
--no-compress \
--whole-file \
--verbose \
--info=progress2 \
"$ORIGIN" "$DESTINATION"
Notice that in this case the options --no-compress and --whole-file are used, mainly because the files are copied locally; if this were a remote server, the options could be different. For that, check this post: The fastest remote directory rsync over ssh archival I can muster (40MB/s over 1gb NICs).
