rsync subset of directories - include

I am trying to use the include and exclude options in rsync to copy a directory structure, excluding most but not all of the subdirectories based on a pattern in the directory names. It isn't working: rsync tries to copy everything over instead of just the subfolders I want. Is my syntax wrong?
I have tried:
rsync -am --include='*/*/*MPRAGE*/' --exclude='*' /parent_directory/ /destination
Also:
rsync -am --include='*/' --include='*/*/*MPRAGE*/' --exclude='*' /parent/ /dest
MPRAGE is the pattern in the name of each folder I want copied. These folders are three levels deep in the structure, and I want to keep the directory structure above them intact in the destination.
Thanks in advance for any tips.
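One filter combination that should do this (a sketch, not verified against this exact tree): include every directory so rsync can descend, include any *MPRAGE* directory and everything inside it via the '***' suffix (available in rsync 2.6.7 and newer), exclude everything else, and let -m prune the empty directories that are left over:
rsync -am --include='*/' --include='*MPRAGE*/***' --exclude='*' /parent_directory/ /destination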

Related

rsync: Exclude specific filetype in only one directory

I back up my data with rsync and would like to exclude a specific filetype in only one directory (and its subdirectories). For example, I have:
$ ls Source/
Folder1/a.tar
Folder1/b.dat
Folder2/c.tar
Folder2/d.dat
Folder2/Subfolder3/e.tar
Folder2/Subfolder3/f.dat
Folder2/Subfolder3/g.pdf
Now I would like to sync all files except for the .tar files in Folder2 and its subfolder. At the end it should look like this:
$ ls Target/
Folder1/a.tar
Folder1/b.dat
Folder2/d.dat
Folder2/Subfolder3/f.dat
Folder2/Subfolder3/g.pdf
Does someone know how to do that? I played around with the --exclude option, but without luck.
rsync manual says
INCLUDE/EXCLUDE PATTERN RULES
...
o use '**' to match anything, including slashes.
so you can do
rsync -a --exclude='Folder2/**.tar' Source/ Target
Note this is different from bash's globstar option where you would use Folder/**/*.tar.
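To double-check the effect before the real transfer, a dry run with -n (plus -v to list the files) is a cheap sanity check, for example:
rsync -avn --exclude='Folder2/**.tar' Source/ Target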

Unix/Mac OS X: Use file list to copy files and folder keeping the directory structure

I have a plain text file containing the names of hundreds of files, with paths relative to a home directory (they can be made absolute if needed), spread across various sub-directories. The home directory contains multiple directories and thousands of files. I need to create another directory, copying the files in the list while maintaining their directory structure in the destination.
Example:
Source folder:
/home/a/
file1.jpg
file2.jpg
file3.jpg
/home/b/
file4.jpg
file5.jpg
file6.jpg
File List: (plain text, in /home/)
./a/file2.jpg
./b/file5.jpg
Expected Result:
/home/dest/a/
file2.jpg
/home/dest/b/
file5.jpg
I tried cp with various modifications from various questions on Stack Overflow, but got a flat folder structure in the result every time.
I am using bash in the OS X Terminal.
How can this be done?
You can use rsync:
rsync --relative --files-from file-list.txt /home /home/dest
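If I remember the man page correctly, --files-from implies --relative, so it can be omitted on recent versions. Either way, a dry run with -n is a quick way to confirm the paths land where you expect before copying for real:
rsync -avn --files-from=file-list.txt /home /home/dest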

rsync with folder and file name pattern matching to copy files

Right now I'm successfully running:
rsync -uvma --include="*/" --include="*.css" --exclude="*" $spec_dir $css_spec_dir
In a shell script which copies all of the files in the source directory, that are .css files, into a target directory.
I want to do the same for HTML files, but only where they are in a subfolder with the name 'template'.
So I'm in directory ~/foo, and I want to rsync where the --include="*/" only matches on subfolders with the name 'template'. So ~/foo/bar/template/baz/somefile.html would match, and so would ~/foo/bar/baz/qux/template/someotherfile.html, but NOT ~/foo/bar/thirdfile.html
Although it looks a little bit strange, this works for me:
rsync -uvma --include="*/" --include="*/template/*/*.html" --include="*/template/*.html" --include="template/*.html" --include="template/*/*.html" --exclude="*" $spec_dir $html_spec_dir
This one works for me:
rsync -umva --include="**/templates/**/*.html" --exclude="*.html" source/ target
Were you looking for **? Here you have to be careful about choosing your exclude pattern: * won't work, because it matches directories along the way. If rsync finds foo/templates/some.html, it will first copy foo, then foo/templates, and then foo/templates/some.html; but before it gets there, * has already matched foo and nothing gets copied.
Here's what worked:
rsync -uvma --include="*/" --include="templates/**.html" --exclude="*" $html_all_dir $html_dir
My guess is, your format and mine probably accomplish the same thing. I know I tried about 20 different patterns before this one, and this is the only one that worked properly. I don't think I tried your format though :)
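A tip for anyone doing this kind of trial and error: adding -n (--dry-run) to any of these commands prints what would be transferred without copying anything, which makes testing filter chains much quicker, e.g.:
rsync -n -uvma --include="*/" --include="templates/**.html" --exclude="*" $html_all_dir $html_dir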

How to exclude particular files from being copied by rsync?

I am reading rsync docs, INCLUDE/EXCLUDE PATTERN RULES section. Following the rules explained there I would like to exclude the following folders and files:
all .metadata folders
all *.DS_Store* files
So, I am creating rules like:
- .DS_Store
.metadata/
But files and folders are not excluded. What am I doing incorrectly?
The following skips .DS_Store files and .metadata directories (along with everything inside them), and works with the rsync distributed with OS X Mavericks: rsync --exclude='.DS_Store' --exclude='.metadata' <your_source_dir> <your_destination_dir>.
The --exclude=<pattern> is actually just a shorthand for --filter='- <pattern>'. This means --exclude='.DS_Store' and --filter='- .DS_Store' are equivalent. The same goes for --include=<pattern>, which is just a shorthand for --filter='+ <pattern>'.
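As a concrete illustration of that equivalence (src/ and dest/ here are placeholder paths), these two commands should produce identical transfers:
rsync -a --exclude='.DS_Store' --exclude='.metadata' src/ dest/
rsync -a --filter='- .DS_Store' --filter='- .metadata' src/ dest/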

How to backup with s3cmd, ignoring multiple directories and file types

I've been trying to figure out how to back up the contents of my file server's user folder (CentOS, shared via SMB), ignoring certain file types and directories. It seems like this should be easy, but I'm not getting anywhere with ignoring multiple directories.
I'd like to ignore the following:
all files and directories starting with a . or a _
all MS Office temp files (e.g. ~$*)
lock files (e.g. .lock)
I've tried a bunch of different combinations of the --exclude flag, but can't get any to work right.
This is the command that makes the most sense, but it's not excluding anything:
s3cmd sync --dry-run --verbose --delete-removed --exclude '.*' '_*' '~$*' '*.lock' /home/user-folder s3://bucket-name/
If you are already using .gitignore, you can do something like
s3cmd sync --exclude '.git/*' --exclude-from .gitignore <local_dir> s3://<bucket>/
as described in the documentation for --exclude-from in the official s3cmd docs.
It works great, with one minor drawback: if you exclude a folder in your .gitignore, you must also exclude its contents, or s3cmd will still grab them. That is easy to handle: just add a line like <foldername>/* to the .gitignore and everything will be fine.
EDIT:
Better still, set up a .s3ignore file and just refer to it from the sync command:
s3cmd sync --exclude-from .s3ignore <local_dir> s3://<bucket>/
.s3ignore example:
.git
.git/*
.gitignore
node_modules
node_modules/*
*.swo
*.swp
*.pyo
*.pyc
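To preview what the ignore file will skip before uploading anything, s3cmd sync also accepts --dry-run:
s3cmd sync --dry-run --exclude-from .s3ignore <local_dir> s3://<bucket>/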
I've done something similar. The key is to use --exclude before each pattern you want to match:
s3cmd -v --recursive --exclude ".ts" --exclude ".aac" --exclude "/thumbnails" put /var/www/folder s3://bucket/
Also, I managed to use .ts without the wildcard symbol and it worked in my case.
Other answers mention passing --exclude <pattern> for each pattern, or packing all the patterns into a file to pass with --exclude-from <file>.
Using regex:
You can also pack all the patterns into a single regular expression and pass it with the --rexclude option. Assuming s3cmd applies the regex anywhere in the relative path, something like the following should cover the exclusions in the question (any path component starting with ., _ or ~$, plus any name ending in .lock):
s3cmd sync --dry-run --verbose --delete-removed --rexclude '(^|/)(\.|_|~\$)|\.lock$' /home/user-folder s3://bucket-name/
