Rsync create symbolic links only - shell

I currently have rsync working well. It copies all my files from one directory to another. The only problem is that it physically copies the files.
I have a lot of large files and don't want duplicates of all of them. I just want to create symbolic links in the new directory so that I can serve the data on a webpage. The source directory has some scripts and files I don't want the public to see, so I'm moving the safe data to the web root (destination).
What I would like is for rsync to create links in the destination for any new files in the source directory, so that I am not using up hard drive space the way I currently am. What I have works perfectly except for the symbolic-link aspect. Is there a way to have rsync track files and create symbolic links?
rsync -aP --exclude="file.sql" --exclude="*~" --exclude=".*" --exclude="*.sh" . ${destination}

It's not a symlink, but you might be able to work with --link-dest=DIR. It creates a hard link, which is a new name for the same file. This will behave similarly to a symlink (see the sketch after this list) as long as:
Both files are on the same filesystem
You don't plan to delete the original while keeping the copy (the symlink would break, but a hard link won't)
You don't have anything explicitly checking whether it's a symlink
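A minimal sketch of that approach, reusing the command from the question: pointing --link-dest at the source directory itself makes every matching file a hard link into the destination instead of a copy.
# Hard-link matching files from the source tree instead of copying them.
# --link-dest must be an absolute path; a relative one is interpreted
# relative to the destination.
rsync -aP --link-dest="$PWD" --exclude="file.sql" --exclude="*~" --exclude=".*" --exclude="*.sh" . ${destination}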

You could use cp -aR -s (Linux or FreeBSD) or cp --archive --recursive --symbolic-link (Linux) to create symbolic links to the source files in the destination directory instead of copies. Note that -s is non-standard.
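Roughly, for the question's setup, run from inside the source directory (GNU cp refuses to make relative symlinks outside the current directory, so the source paths must be absolute):
# -s creates symlinks instead of copies; the glob skips dotfiles,
# which the question excluded anyway.
cp -aR -s "$PWD"/* ${destination}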

lndir might be useful to you. According to its manual, it creates a shadow directory of symbolic links to another directory tree.
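For example, reusing the question's destination variable (lndir ships with the X11 utilities and expects an absolute source path):
# Build a shadow tree of symlinks mirroring the source directory.
lndir "$PWD" ${destination}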

I think master_delivery is probably the best tool for this. With rsync's --link-dest option introduced above, files which are not identical will be copied. If you don't mind copies and hard links being mixed together, you can use rsync, but if you want to eliminate duplicates completely, use master_delivery.
Usage is:
gem install master_delivery
master_delivery -m <path_to_master> -d <path_to_delivery_root>

Related

recursive rm -rf of a symlink?

I think I know the answer to this one but I'll ask anyway.
I created a symlink to a dir on a different disk. A script (which I have no control over) will "rm -rf *" in the dir that contains this symlink. It deletes the symlink fine, but leaves the target dir on the other disk. I expected this, but want to make sure there's no way to create the symlink so that it behaves like a hard link, making "rm -rf" recursively delete the dir on the other disk. -T looked kind of promising, but didn't pan out. Again, I have no control over the rm command execution, but I do create the target dir on the other disk, and I create the symlink to it.
Thanks in advance!
What you are essentially asking is can a hard link be created across file systems. The answer is, they cannot. This tutorial confirms this:
An important thing to note about hard links is that they only work on the current file system. You can not create a hard link to a file on a different file system. To do that you need to use symbolic links, Section 1.4.3.
As you seem to already understand, removing a symlink has no effect on the thing it links to. This is true for hard links as well.
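A quick illustration, assuming /mnt/otherdisk sits on a different filesystem (the paths are made up):
ln /mnt/otherdisk/target ./link      # fails: Invalid cross-device link
ln -s /mnt/otherdisk/target ./link   # works, but "rm -rf" removes only the link itself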

how to copy broken symlinks in ruby script

I want to copy all the contents from one directory to another (including broken symlinks) in my Ruby script. I am using FileUtils.cp_r 'src/.', 'dest' but it is complaining about the broken symlinks. Can someone please help me with this? It is a show-stopper for me right now.
FileUtils.cp_r internally copies the src folder recursively to dest. When it finds a symlink, it creates a symlink using the File#symlink method (see line 1369 of fileutils.rb).
The documentation of File#symlink states that:
Creates a symbolic link called new_name for the existing file old_name. Raises a NotImplementedError exception on platforms that do not support symbolic links.
So it seems it may not be possible to use FileUtils.cp_r to copy a directory if one of the symlinks in it is broken, pointing to a non-existent file.
Workaround
You can execute the shell command cp -r from your Ruby script. It may not be platform-independent and may not be easy to debug, but it will get you past the scenario you consider a show-stopper.
src = "/path/to/src/dir"
dest = "/path/to/dest/dir"
# cp -r recreates symlinks (broken ones included) instead of following them.
# Passing the arguments separately avoids shell-quoting problems.
system("cp", "-r", src, dest)

rsync using --delete but want to not delete symlinks at destination

Using rsync, my source directory has a number of files and directories. My destination already has been synced, so it mirrors those files and directories.
However, I have manually created a symlink in my destination that does not exist in my source.
I need to use the --delete operation in rsync. Is there a way to get rsync to not remove the symlink?
There is no option to achieve this the way you suggested, but the simplest solution to the described problem is to add the filenames of the symlinks to the rsync exclude pattern,
like: --exclude="folder/symlinkname1" --exclude="folder/symlinkname2"
If there are many symlinks, you can keep a list of them in an exclude file. This file may be autogenerated by a little script or a bash one-liner, as sketched below.
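A sketch of such a one-liner, assuming GNU find ($SRC and $DEST are placeholders; %P prints each path relative to the starting directory):
# Files matched by an exclude pattern are protected from --delete.
find "$DEST" -type l -printf '%P\n' > symlink-excludes.txt
rsync -a --delete --exclude-from=symlink-excludes.txt "$SRC/" "$DEST/"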
Does the existing symlink at your destination point to a directory or a file? If a directory, you may be able to use --keep-dirlinks.
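For example (paths are placeholders):
# Treat a destination symlink that points to a directory as the directory
# itself, instead of deleting and replacing it.
rsync -a --delete --keep-dirlinks /path/to/src/ /path/to/dest/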

Rsync bash script and hard linking files

I am creating a bash script to backup my files with rsync.
Backups all come from a single directory.
I only want new or modified files to be backed up.
Currently, I am telling rsync to backup the dir, and to check the files compared to the last backup.
The way I am doing this is
THE_TIME=`date "+%Y-%m-%dT%H:%M:%S"`
rsync -aP --link-dest=/Backup/Current /usr/home/user/backup /Backup/Backup-$THE_TIME
rm -f /Backup/Current
ln -s /Backup/Backup-$THE_TIME /Backup/Current
I am pretty sure I have the syntax correct. Each backup will check against the "Current" folder and upload only as necessary. The script then deletes the Current symlink and re-creates it, pointing at the newest backup it just made.
I am getting an error when I run the script:
rsync: link "/Backup/Backup-2010-08-04-12:21:15/dgs1200series_manual_310.pdf"
=> /Backup/Current/dgs1200series_manual_310.pdf
failed: Operation not supported (45)
The host OS is running the HFS filesystem, which supports hard linking. I am trying to figure out whether something else is not supporting this, or whether I have a problem in my code.
Thanks for any help
Edit:
I am able to create a hard link on my local machine.
I am also able to create a hard link on the remote server (when logged in locally)
I am NOT able to create a hard link on the remote server when it is mounted via AFP, even if both files exist on the server.
I am guessing this is a limitation of AFP.
Just in case your command line is only an example: Be sure to always specify the link-dest directory with an absolute pathname! That’s something which took me quite some time to figure out …
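Something like this, adapting the script from the question:
# A relative --link-dest is interpreted relative to the destination,
# so resolve the symlink to a physical absolute path first.
LINK_DEST="$(cd /Backup/Current && pwd -P)"
rsync -aP --link-dest="$LINK_DEST" /usr/home/user/backup "/Backup/Backup-$THE_TIME"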
Two things from the man page stand out that are worth checking:
If files aren't linking, double-check their attributes. Also check if some attributes are getting forced outside of rsync's control, such as a mount option that squishes root to a single user, or one that mounts a removable drive with generic ownership (such as OS X's "Ignore ownership on this volume" option).
and
Note that rsync versions prior to 2.6.1 had a bug that could prevent --link-dest from working properly for a non-super-user when -o was specified (or implied by -a). You can work-around this bug by avoiding the -o option when sending to an old rsync.
Do you have the "ignore ownership" option turned on? What version of rsync do you have?
Also, have you tried manually creating a similar hardlink using ln at the command line?
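For example, with the backup volume mounted (the paths and filename are illustrative):
# If this fails with "Operation not supported", rsync's --link-dest
# cannot work over this mount either.
ln /Volumes/Backup/Current/somefile.pdf /Volumes/Backup/test-hardlink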
I don't know if this is the same issue, but I know that rsync can't sync a file when the destination is a FAT32 partition and the filename has a ":" (colon) in it. [The source filesystem is ext3, and the destination is FAT32]
Try reconfiguring the date command so that it doesn't use a colon and see if that makes a difference.
e.g.
THE_TIME=`date "+%Y-%m-%dT%H_%M_%S"`

RSync copies only folder directory structure not files

I am using RSync to copy tar balls to an external hard drive on a Windows XP machine.
My files are tar.gz files (perms 600) in a directory (perms 711).
However, when I do a dry-run, only the folders are returned, the files are ignored.
I use RSync a lot, so I presume there is no issue with my installation.
I have tried changing permissions of the files but this makes no difference
The owner of the files is root, which is also the user which the script logs in as
I am not using Rsync's CVS option
The command I am using is:
rsync^
-azvr^
--stats^
--progress^
-e 'ssh -p 222' root@servername:/home/directory/ ./
Is there something I am missing to get my files copied over?
I can think of only a single possibility: in my experience, rsync creates the directory structure before copying files in. Rsync may be terminating prematurely, but only after this directory step has completed.
Update0
You mentioned that you were running a dry run. By default, rsync only shows the directory name when the directory and all of its contents are absent from the receiver.
After a lot of experimentation, I'm only able to reproduce the behaviour you describe if the directories on the source have later modification dates than on the receiver. In this instance, the modification times are adjusted on the receiver.
I had this problem too, and it turns out that when backing up to a Windows drive from Linux, rsync's temporary files don't get moved into place after they are transferred.
Try adding the --inplace flag when rsyncing to Windows drives.
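Adapting the command from the question (--inplace writes updated data directly to the destination file instead of creating a temporary file and renaming it into place):
rsync -azv --stats --progress --inplace -e 'ssh -p 222' root@servername:/home/directory/ ./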
