rsync : Recursively sync all files while ignoring the directory structure - bash

I am trying to create a bash script for syncing music from my desktop to a mobile device. The desktop is the source.
Is there a way to make rsync recursively sync files but ignore the directory structure? If a file was deleted from the desktop, I want it to be deleted on the device as well.
The directory structure on my desktop is something like this.
Artist1/
Artist1/art1_track1.mp3
Artist1/art1_track2.mp3
Artist1/art1_track3.mp3
Artist2/
Artist2/art2_track1.mp3
Artist2/art2_track2.mp3
Artist2/art2_track3.mp3
...
The directory structure that I want on the device is:
Music/
art1_track1.mp3
art1_track2.mp3
art1_track3.mp3
art2_track1.mp3
art2_track2.mp3
art2_track3.mp3
...

Simply:
rsync -a --delete --include='*.mp3' --exclude='*' \
pathToSongs/Theme*/Artist*/. destuser@desthost:Music/.
would do the job if your path hierarchy has a fixed number of levels.
WARNING: if two song files have exactly the same name, they will collide in the same destination directory and your backup will miss one of them!
Otherwise, and to answer your question strictly (ignoring the directory structure), you could use bash's shopt -s globstar feature:
shopt -s globstar
rsync -a --delete --include='*.mp3' --exclude='*' \
pathToSongsRoot/**/. destuser@desthost:Music/.
Either way, there is no need to fork a find command.
Recursively sync all files while ignoring the directory structure
To answer the question strictly, it must not be limited to one extension:
shopt -s globstar
rsync -d --delete sourceRoot/**/. destuser@desthost:destRoot/.
With this, directories will be copied too, but without their content. All files and directories end up at the same level under destRoot/.
WARNING: If different files with the same name exist in different directories, they simply overwrite one another at the destination during the rsync, and in the end only one of them (effectively chosen at random) is kept.

Maybe this is a recent option, but I saw the --no-relative option mentioned in the documentation for --files-from and it worked great.
find SourceDir -name \*.mp3 | rsync -av --files-from - --no-relative . DestinationDir/

The answer to your question: No, rsync cannot do this alone. But with some help from other tools, we can get there... After a few tries I came up with this:
rsync -d --delete $(find . -type d|while read d ; do echo $d/ ; done) /targetDirectory && rmdir /targetDirectory/* 2>&-
The difficulty is this: To enable deletion of files at the target position, you need to:
specify directories as sources for rsync (it doesn't delete if the source is a list of files).
give it the complete list of sources at once (rsync within a loop will give you the contents of the last directory only at the target).
end the directory names with a slash (otherwise it creates the directories at the target directory)
So the command substitution (the stuff enclosed with the $( )) does this: It finds all directories and adds a slash (/) at the end of the directory names. Now rsync sees a list of source directories, all terminated with a slash, and so copies their contents to the target directory. The option -d tells it not to copy recursively.
The second trick is the rmdir /targetDirectory/* which removes the empty directories which rsync created (although we didn't ask it to do that).
I tested that here, and deletion of files removed in the source tree worked just fine.

If you can make a list of files, you've already solved the problem.
Try:
find /path/to/src/ -name \*.mp3 > list.txt
rsync -avi --no-relative --progress --files-from=list.txt / user@server:/path/to/dest
If you run the script again for new files, it will only copy the missing files.
If you don't like the list, then try a one-liner (but the logic is different):
find /path/to/src/ -name \*.mp3 -type f \
-exec rsync -avi --progress {} user@server:/path/to/dest/ \;
In this case, rsync is invoked once per file, every time, since with this kind of command you cannot build the file list beforehand.

Related

How to exclude a list of files and folders while using tar? [duplicate]

Is there a simple shell command/script that supports excluding certain files/folders from being archived?
I have a directory that needs to be archived, with a subdirectory that contains a number of very large files I do not need to back up.
Not quite solutions:
The tar --exclude=PATTERN command matches the given pattern and excludes those files, but I need specific files & folders to be ignored (full file path), otherwise valid files might be excluded.
I could also use the find command to create a list of files and exclude the ones I don't want to archive and pass the list to tar, but that only works for a small number of files. I have tens of thousands.
I'm beginning to think the only solution is to create a file with a list of files/folders to be excluded, then use rsync with --exclude-from=file to copy all the files to a tmp directory, and then use tar to archive that directory.
Can anybody think of a better/more efficient solution?
EDIT: Charles Ma's solution works well. The big gotcha is that the --exclude='./folder' MUST be at the beginning of the tar command. Full command (cd first, so backup is relative to that directory):
cd /folder_to_backup
tar --exclude='./folder' --exclude='./upload/folder2' -zcvf /backup/filename.tgz .
You can have multiple exclude options for tar so
$ tar --exclude='./folder' --exclude='./upload/folder2' -zcvf /backup/filename.tgz .
etc will work. Make sure to put --exclude before the source and destination items.
You can exclude directories with --exclude for tar.
If you want to archive everything except /usr you can use:
tar -zcvf /all.tgz / --exclude=/usr
In your case perhaps something like
tar -zcvf archive.tgz arc_dir --exclude=dir/ignore_this_dir
Possible options to exclude files/directories from backup using tar:
Exclude files using multiple patterns
tar -czf backup.tar.gz --exclude=PATTERN1 --exclude=PATTERN2 ... /path/to/backup
Exclude files using an exclude file filled with a list of patterns (a sample exclude file is sketched after this list)
tar -czf backup.tar.gz -X /path/to/exclude.txt /path/to/backup
Exclude files using tags by placing a tag file in any directory that should be skipped
tar -czf backup.tar.gz --exclude-tag-all=exclude.tag /path/to/backup
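For illustration, a hypothetical /path/to/exclude.txt for the -X variant could contain one pattern per line, using the same syntax as --exclude:
*.log
*.tmp
./cache
./node_modules
The file and pattern names above are made up; each line is matched like a --exclude pattern, so anchored paths and globs both work.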
Old question with many answers, but I found that none were quite clear enough for me, so I would like to add my attempt.
If you have the following structure:
/home/ftp/mysite/
with the following files/folders:
/home/ftp/mysite/file1
/home/ftp/mysite/file2
/home/ftp/mysite/file3
/home/ftp/mysite/folder1
/home/ftp/mysite/folder2
/home/ftp/mysite/folder3
So, you want to make a tar file that contains everything inside /home/ftp/mysite (to move the site to a new server), but file3 is just junk, and everything in folder3 is also not needed, so we will skip those two.
We use the format:
tar -czvf <name of tar file> <what to tar> <any excludes>
where c = create, z = gzip compression, and v = verbose (you can see the files as they are added, useful to make sure none of the files you exclude are being added), and f = file.
So, my command would look like this:
cd /home/ftp/
tar -czvf mysite.tar.gz mysite --exclude='file3' --exclude='folder3'
Note that the excluded files/folders are relative to the root of your tar (I have tried full paths here, relative to /, but I could not make that work).
Hope this will help someone (and me the next time I google it).
You can use standard "ant notation" to exclude relative directories.
This works for me and excludes any .git or node_module directories:
tar -cvf myFile.tar --exclude=**/.git/* --exclude=**/node_modules/* -T /data/txt/myInputFile.txt 2> /data/txt/myTarLogFile.txt
myInputFile.txt contains:
/dev2/java
/dev2/javascript
This exclude pattern handles filename suffixes like png or mp3 as well as directory names like .git and node_modules:
tar --exclude={*.png,*.mp3,*.wav,.git,node_modules} -Jcf ${target_tarball} ${source_dirname}
I've experienced that, at least with the Cygwin version of tar I'm using ("CYGWIN_NT-5.1 1.7.17(0.262/5/3) 2012-10-19 14:39 i686 Cygwin" on a Windows XP Home Edition SP3 machine), the order of options is important.
While this construction worked for me:
tar cfvz target.tgz --exclude='<dir1>' --exclude='<dir2>' target_dir
that one didn't work:
tar cfvz --exclude='<dir1>' --exclude='<dir2>' target.tgz target_dir
This, even though tar --help reveals the following:
tar [OPTION...] [FILE]
So the second command should also work, but apparently it doesn't; presumably, with old-style bundled options the argument for f is taken from the next word on the command line, so --exclude='<dir1>' gets read as the archive name.
Best regards,
I found this somewhere else so I won't take credit, but it worked better than any of the solutions above for my mac specific issues (even though this is closed):
tar zc --exclude __MACOSX --exclude .DS_Store -f <archive> <source(s)>
After reading all these good answers for different versions, and having solved the problem for myself, I think there are very small details that are very important, unusual in general GNU/Linux use, that aren't stressed enough and deserve more than comments.
So I'm not going to try to answer the question for every case, but instead try to note where to look when things don't work.
IT IS VERY IMPORTANT TO NOTICE:
THE ORDER OF THE OPTIONS MATTERS: it is not the same to put the --exclude before as after the file option and the directories to back up. This is unexpected, at least to me, because in my experience with GNU/Linux commands the order of the options usually doesn't matter.
Different tar versions expect these options in a different order: for instance, @Andrew's answer indicates that in GNU tar v1.26 and 1.28 the excludes come last, whereas in my case, with GNU tar 1.29, it's the other way around.
THE TRAILING SLASHES MATTER: at least in GNU tar 1.29, there shouldn't be any.
In my case, for GNU tar 1.29 on Debian stretch, the command that worked was
tar --exclude="/home/user/.config/chromium" --exclude="/home/user/.cache" -cf file.tar /dir1/ /home/ /dir3/
The quotes didn't matter; it worked with or without them.
I hope this will be useful to someone.
If you are trying to exclude Version Control System (VCS) files, tar already supports two interesting options about it! :)
Option : --exclude-vcs
This option excludes files and directories used by the following version control systems: CVS, RCS, SCCS, SVN, Arch, Bazaar, Mercurial, and Darcs.
As of version 1.32, the following files are excluded:
CVS/, and everything under it
RCS/, and everything under it
SCCS/, and everything under it
.git/, and everything under it
.gitignore
.gitmodules
.gitattributes
.cvsignore
.svn/, and everything under it
.arch-ids/, and everything under it
{arch}/, and everything under it
=RELEASE-ID
=meta-update
=update
.bzr
.bzrignore
.bzrtags
.hg
.hgignore
.hgtags
_darcs
Option : --exclude-vcs-ignores
When archiving directories that are under some version control system (VCS), it is often convenient to read exclusion patterns from this VCS' ignore files (e.g. .cvsignore, .gitignore, etc.). This option provides that possibility.
Before archiving a directory, tar checks whether it contains any of the following files: .cvsignore, .gitignore, .bzrignore, or .hgignore. If so, it reads ignore patterns from these files.
The patterns are treated much as the corresponding VCS would treat them, i.e.:
.cvsignore
Contains shell-style globbing patterns that apply only to the directory where this file resides. No comments are allowed in the file. Empty lines are ignored.
.gitignore
Contains shell-style globbing patterns. Applies to the directory where the .gitignore file is located and all its subdirectories.
Any line beginning with a # is a comment. Backslash escapes the comment character.
.bzrignore
Contains shell globbing patterns and regular expressions (if prefixed with RE:). Patterns affect the directory and all its subdirectories.
Any line beginning with a # is a comment.
.hgignore
Contains POSIX regular expressions. The line syntax: glob switches to shell globbing patterns. The line syntax: regexp switches back. Comments begin with a #. Patterns affect the directory and all its subdirectories.
Example
tar -czv --exclude-vcs --exclude-vcs-ignores -f path/to/my-tar-file.tar.gz path/to/my/project/
I'd like to show another option I used to get the same result as the answers above. I had a similar case where I wanted to back up Android Studio projects all together in a tar file to upload to MediaFire. Using the du command to find the large files, I found that I didn't need some directories like:
build, linux and .dart_tool
Using the first answer from Charles Ma, I modified it a little so I could run the command from the parent directory of my Android directory.
tar --exclude='*/build' --exclude='*/linux' --exclude='*/.dart_tool' -zcvf androidProjects.tar Android/
It worked like a charm.
P.S. Sorry if this kind of answer is not allowed; if that is the case, I will remove it.
For Mac OSX I had to do
tar -zcv --exclude='folder' -f theOutputTarFile.tar folderToTar
Note the -f after the --exclude=
For those who have issues with it, some versions of tar would only work properly without the './' in the exclude value.
tar --version
tar (GNU tar) 1.27.1
Command syntax that works:
tar -czvf ../allfiles-butsome.tar.gz * --exclude=acme/foo
These will not work:
$ tar -czvf ../allfiles-butsome.tar.gz * --exclude=./acme/foo
$ tar -czvf ../allfiles-butsome.tar.gz * --exclude='./acme/foo'
$ tar --exclude=./acme/foo -czvf ../allfiles-butsome.tar.gz *
$ tar --exclude='./acme/foo' -czvf ../allfiles-butsome.tar.gz *
$ tar -czvf ../allfiles-butsome.tar.gz * --exclude=/full/path/acme/foo
$ tar -czvf ../allfiles-butsome.tar.gz * --exclude='/full/path/acme/foo'
$ tar --exclude=/full/path/acme/foo -czvf ../allfiles-butsome.tar.gz *
$ tar --exclude='/full/path/acme/foo' -czvf ../allfiles-butsome.tar.gz *
I agree the --exclude flag is the right approach.
$ tar --exclude='./folder_or_file' --exclude='file_pattern' --exclude='fileA'
A word of warning for a side effect that I did not find immediately obvious:
The exclusion of 'fileA' in this example will search for 'fileA' RECURSIVELY!
Example: a directory with a single subdirectory containing a file of the same name (data.txt):
data.txt
config.txt
--+dirA
| data.txt
| config.docx
If using --exclude='data.txt' the archive will not contain EITHER data.txt file. This can cause unexpected results if archiving third party libraries, such as a node_modules directory.
To avoid this issue make sure to give the entire path, like --exclude='./dirA/data.txt'
After reading this thread, I did a little testing on RHEL 5 and here are my results for tarring up the abc directory:
This will exclude the directories error and logs and all files under the directories:
tar cvpzf abc.tgz abc/ --exclude='abc/error' --exclude='abc/logs'
Adding a wildcard after the excluded directory will exclude the files but preserve the directories:
tar cvpzf abc.tgz abc/ --exclude='abc/error/*' --exclude='abc/logs/*'
To avoid possible 'xargs: Argument list too long' errors due to the use of find ... | xargs ... when processing tens of thousands of files, you can pipe the output of find directly to tar using find ... -print0 | tar --null ....
# archive a given directory, but exclude various files & directories
# specified by their full file paths
find "$(pwd -P)" -type d \( -path '/path/to/dir1' -or -path '/path/to/dir2' \) -prune \
-or -not \( -path '/path/to/file1' -or -path '/path/to/file2' \) -print0 |
gnutar --null --no-recursion -czf archive.tar.gz --files-from -
#bsdtar --null -n -czf archive.tar.gz -T -
You can also use one of the "--exclude-tag" options depending on your needs:
--exclude-tag=FILE
--exclude-tag-all=FILE
--exclude-tag-under=FILE
A directory containing the specified FILE will be excluded; the variants differ in how much is kept: --exclude-tag keeps the directory entry and the tag file itself, --exclude-tag-under drops everything under the directory, and --exclude-tag-all omits the directory entirely.
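For illustration, a minimal sketch using a hypothetical tag file named SKIP.tag to prune a large cache directory:
touch /path/to/backup/huge_cache/SKIP.tag
tar -czf backup.tar.gz --exclude-tag-all=SKIP.tag /path/to/backup
Here huge_cache and SKIP.tag are made-up names; any directory that contains a file with the given name is skipped.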
Use the find command in conjunction with the tar append (-r) option. This way you can add files to an existing tar in a single step, instead of a two pass solution (create list of files, create tar).
find /dir/dir -prune ... -o etc etc.... -exec tar rvf ~/tarfile.tar {} \;
You can use cpio(1) to create tar files. cpio takes the files to archive on stdin, so if you've already figured out the find command you want to use to select the files the archive, pipe it into cpio to create the tar file:
find ... | cpio -o -H ustar | gzip -c > archive.tar.gz
With GNU tar v1.26, the --exclude needs to come after the archive file and backup directory arguments, should have no leading or trailing slashes, and prefers no quotes (single or double). So, relative to the PARENT directory to be backed up, it's:
tar cvfz /path_to/mytar.tgz ./dir_to_backup --exclude=some_path/to_exclude
tar -cvzf destination_folder source_folder -X /home/folder/excludes.txt
-X indicates a file which contains a list of patterns to be excluded from the backup. For instance, you can specify *~ in this file to exclude any filenames ending with ~ from the backup.
Success Case:
1) If you give a full path for the backup, the excludes must also use full paths.
tar -zcvf /opt/ABC/BKP_27032020/backup_27032020.tar.gz --exclude='/opt/ABC/csv/' --exclude='/opt/ABC/log/' /opt/ABC
2) If you give a relative (current) path for the backup, the excludes must also use relative paths.
tar -zcvf backup_27032020.tar.gz --exclude='ABC/csv/' --exclude='ABC/log/' ABC
Failure Case:
If you give a relative path for the backup directory but a full path for the exclude, it won't work:
tar -zcvf /opt/ABC/BKP_27032020/backup_27032020.tar.gz --exclude='/opt/ABC/csv/' --exclude='/opt/ABC/log/' ABC
Note: mentioning exclude before/after backup directory is fine.
It seems to be impossible to exclude directories with absolute paths.
As soon as ANY of the paths is absolute (source and/or exclude), the exclude will not work. That's my experience after trying all possible combinations.
Check it out
tar cvpzf zip_folder.tgz . --exclude=./public --exclude=./tmp --exclude=./log --exclude=fileName
I want to have a fresh front-end version (the angular folder) on localhost.
Also, the git folder is huge in my case, and I want to exclude it.
I need to download it from the server and unpack it in order to run the application.
Compress the angular folder from /var/lib/tomcat7/webapps and move it to the /tmp folder with the name angular.23.12.19.tar.gz.
Command:
tar --exclude='.git' -zcvf /tmp/angular.23.12.19.tar.gz /var/lib/tomcat7/webapps/angular/
Your best bet is to use find with tar, via xargs (to handle the large number of arguments). For example:
find / -print0 | xargs -0 tar cjf tarfile.tar.bz2
Note that if the list is long enough for xargs to split it into several tar invocations, each invocation overwrites the archive; piping find -print0 directly into tar --null --files-from - (as shown in an earlier answer) avoids that.
Possible redundant answer but since I found it useful, here it is:
While root on FreeBSD (i.e. using csh), I wanted to copy my whole root filesystem to /mnt but without /usr and (obviously) /mnt. This is what worked (I am at /):
tar --exclude ./usr --exclude ./mnt --create --file - . | (cd /mnt && tar xvf -)
My whole point is that it was necessary (by putting the ./) to specify to tar that the excluded directories were part of the greater directory being copied.
My €0.02
I had no luck getting tar to exclude a 5 Gigabyte subdirectory a few levels deep. In the end, I just used the Unix zip command. It was a lot easier for me.
So for this particular example from the original post
(tar --exclude='./folder' --exclude='./upload/folder2' -zcvf /backup/filename.tgz . )
The equivalent would be:
zip -r /backup/filename.zip . -x upload/folder/**\* upload/folder2/**\*
(NOTE: Here is the post I originally used that helped me https://superuser.com/questions/312301/unix-zip-directory-but-excluded-specific-subdirectories-and-everything-within-t)
The following bash script should do the trick. It uses the answer given here by Marcus Sundman.
#!/bin/bash
echo -n "Please enter the name of the tar file you wish to create (without extension): "
read nam
echo -n "Please enter the path to the directories to tar: "
read pathin
excludes=$(find "$pathin" -iname "*.CC" -exec echo "--exclude \'{}\'" \; | xargs)
echo tar -czvf "$nam.tar.gz" $excludes "$pathin"
This will print out the command you need and you can just copy and paste it back in. There is probably a more elegant way to provide it directly to the command line.
Just change *.CC for any other common extension, file name or pattern you want to exclude and this should still work.
EDIT
Just to add a little explanation: find generates a list of files matching the chosen pattern (in this case *.CC). This list is passed via xargs to the echo command. This prints --exclude 'one entry from the list'. The backslashes (\) are escape characters for the ' marks.

cp -r * except dont copy any .pdf files - copy a directory subtree while excluding files with a given extension

Editor's note: In the original form of the question the aspect of copying an entire subtree was not readily obvious.
How do I copy all the files from one directory subtree to another but omit all files of one type?
Does bash handle regex?
Something like: cp -r !*.pdf /var/www/ .?
EDIT 1
I have a find expression: find /var/www/ -not -iname "*.pdf"
This lists all the files that I want to copy. How do I pipe this to a copy command?
EDIT 2
This works so long as the argument list is not too long:
sudo cp `find /var/www/ -not -iname "*.pdf"` .
EDIT 3
One issue though is that I am running into issues with losing the directory structure.
Bash can't help here, unfortunately.
Many people use either tar or rsync for this type of task because each of them is capable of recursively copying files, and each provides an --exclude argument for excluding certain filename patterns. tar is more likely to be installed on a given machine, so I'll show you that.
Assuming you are currently in the destination directory, the shell command:
tar -cC /var/www . | tar -x
will copy all files from /var/www into the current directory recursively.
To filter out the PDF files, use:
tar -cC /var/www --exclude '*.pdf' . | tar -x
Multiple --exclude arguments can be given, so:
tar -cC /var/www --exclude '*.pdf' --exclude '*.txt' . | tar -x
would exclude .txt files as well.
K. A. Buhr's helpful answer is a concise solution that reflects the intent well and is easily extensible if multiple extensions should be excluded.
Trying to do it with POSIX utilities and POSIX-compliant options alone requires a slightly different approach:
cp -pR /var/www/. . && find . -name '*.pdf' -exec rm {} +
In other words: copy the whole subtree first, then remove all *.pdf files from the destination subtree.
Note:
-p preserves the original files' attributes in terms of file timestamps, ownership, and permission bits (tar appears to do that by default); without -p, the copies will be owned by the current user and receive new timestamps (though the permission bits are preserved).
Using cp has one advantage over tar: you get more control over how symlinks among the source files are handled, via the -H, -L, and -P options (see the short sketch after these notes, and the POSIX spec. for cp).
tar invariably seems to copy symlinks as-is.
-R supersedes the legacy -r option for cp, as the latter's behavior with non-regular files is ill-defined - see the RATIONALE section in the POSIX spec. for cp
Neither -iname for case-insensitive matching nor -delete are part of the POSIX spec. for find, but both GNU find and BSD/macOS find support them.
Note how source path /var/www/. ends in /. to ensure that its contents are copied to the destination path (as opposed to putting everything into a www subfolder).
With BSD cp, /var/www/ (trailing /) would work too, but GNU cp treats /var/www and /var/www/ the same.
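For illustration, a short sketch of those symlink-handling options, reusing the question's source path:
cp -pR -P /var/www/. .   # -P: copy symlinks as symlinks (do not follow them)
cp -pR -L /var/www/. .   # -L: follow every symlink and copy what it points to
cp -pR -H /var/www/. .   # -H: follow only symlinks named on the command line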
As for your questions and solution attempts:
Does bash handle regex?
In the context of filename expansion (globbing), Bash only understands patterns, not regexes (Bash does have the =~ regex-matching operator for string matching inside [[ ... ]] conditionals, however).
As a nonstandard extension, Bash implements the extglob shell option, which adds additional constructs to the pattern-matching notation to allow for more sophisticated matching, such as !(...) for negating matchings, which is what you're looking for.
If you combine that with another nonstandard shell option, globstar (**, Bash v4+), you can construct a single pattern that matches all items except a given sub-pattern across an entire subtree:
/var/www/**/!(*.pdf)
does find all non-PDF filesystem items in the subtree of /var/www/.
However, combining that pattern with cp won't work as intended: with -R, any subdirs. are still copied in full; without -R, subdirs. are ignored altogether.
Caveats:
By default, patterns (globs) ignore hidden items unless explicitly matched (* will only match non-hidden items). To include them, set shell option dotglob first.
Matching is case-sensitive by default; turn on shell option nocaseglob to make it case-insensitive.
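For illustration, a minimal sketch that combines these shell options with the pattern above to list the matching items:
shopt -s extglob globstar dotglob    # enable extended globs, **, and matching of hidden items
printf '%s\n' /var/www/**/!(*.pdf)   # every non-PDF item in the subtree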
find /var/www/ -not -iname "*.pdf" in essence yields the same as the extended glob above, except with case-insensitive matching, hidden items invariably included, and the output paths (generally) not in the same order.
However, copying the output paths to their intended destination is the nontrivial part: you'd have to construct analogous subdirs. in the destination dir. on the fly, and you'd have to do so for each input path separately, which will also be quite slow.
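For illustration only, a hedged sketch of that per-file approach (slow, and it breaks on unusual filenames; paths follow the question):
find /var/www/ -type f ! -iname '*.pdf' | while IFS= read -r f; do
  rel=${f#/var/www/}                # path relative to the source root
  mkdir -p "./$(dirname "$rel")"    # recreate the subdirectory on the fly
  cp -p "$f" "./$rel"               # copy the single file into place
done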
Your own attempt, sudo cp `find /var/www/ -not -iname "*.pdf"` ., falls short in several respects:
As you've discovered yourself, this copies all matching items into a single destination directory.
The output of the command substitution, `...`, is subject to shell expansions, namely word-splitting and filename expansion, which may break the command, notably with filenames with embedded spaces.
Note: As written, all destination items will be owned by the root user.
Edit: As per @mklement0's comment below, these solutions are not suitable for directory tree recursion; they will only work on one directory, as per the OP's original form of the question.
@rorschach: Yes, you can do this.
Using cp:
Set your Bash shell's extglob option and type:
shopt -s extglob #You can set this in your shell startup to enable it by default
cp /var/www/!(*.pdf) .
If you wish to turn off (unset) this (or any other) shell option, use:
shopt -u extglob #or whatever shell option you wish to unset
Using find
If you prefer using find, you can use xargs to execute the operation you would like Bash to perform:
find /var/www/ -maxdepth 1 ! -iname "*.pdf" | xargs -I{} cp {} .

BASH: Copy all files and directories into another directory in the same parent directory

I'm trying to make a simple script that copies all of my $HOME into another folder in $HOME called Backup/. This includes all hidden files and folders, and excludes Backup/ itself. What I have right now for the copying part is the following:
shopt -s dotglob
for file in $HOME/*
do
cp -r $file $HOME/Backup/
done
Bash tells me that it cannot copy Backup/ into itself. However, when I check the contents of $HOME/Backup/ I see that $HOME/Backup/Backup/ exists.
The copy of Backup/ in itself is useless. How can I get bash to copy over all the folders except Backup/. I tried using extglob and using cp -r $HOME/!(Backup)/ but it didn't copy over the hidden files that I need.
Try rsync. You can exclude files/directories.
This is a good reference:
http://www.maclife.com/article/columns/terminal_101_using_rsync_locally
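A minimal sketch of what that could look like here, assuming the backup lives in $HOME/Backup (the leading / anchors the exclude to the root of the transfer):
rsync -a --exclude='/Backup' "$HOME/" "$HOME/Backup/"
Hidden files are included automatically, and Backup/ itself is skipped, which sidesteps the "cannot copy into itself" problem.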
Hugo,
A script like this is good, but you could try this:
cp -r * Backup/;
cp -r .* Backup/;
Another tool used with backups is tar. This compresses your backup to save disk space.
Also note that * does not cover hidden (dot) files, which is why the second command uses .* (beware that .* also matches . and .., so that cp may complain or even recurse into the parent directory).
I agree that using rsync would be a better solution, but there is an easy way to skip a directory in bash:
for file in "$HOME/"*
do
[[ $file = $HOME/Backup ]] && continue
cp -r "$file" "$HOME/Backup/"
done
This doesn't answer your question directly (the other answers already did that), but try cp -ua when you want to use cp to make a backup. This recurses directories, copies rather than follows links, preserves permissions and only copies a file if it is newer than the copy at the destination.

What's the best way to store files on a remote host as they are created and remove the originals?

Basically I want to move files to another server on creation, preserving the directory structure. I have a solution but it lacks elegance. Also I feel like I'm missing the obvious answer, so thanks in advance for your help, and I totally understand if this bores you.
The situation
I have a server with limited disk space (let's call it 'Tiny') and a storage server. Tiny creates files every once in a while. I want to store them automatically on the storage server and remove the originals when it's safe. I have to retain the directory structure of Tiny. I don't know in advance what the dir structure looks like. That is, all files are created in the directory /some/dir/ but subdirectories of it are created on the fly. They should be stored in /other/fold/ on the storage server, preserving the substructure under /some/dir. E.g.:
/some/dir/bla/foo/bar/snap_001a on Tiny ---> becomes /other/fold/bla/foo/bar/snap_001a on the storage server. They are all called snap_xxxx where xxxx is a four-character alphanumeric string.
My old solution
Now I was thinking of looping over the files and scp-ing them. Once scp finishes and returns without error, the files on Tiny are removed with rm.
#!/bin/bash
# This is invoked by a cronjob ever once in a while.
files=$(find /some/dir/ -name snap_*)
IFS='
'
for current in $files; do
name=$(basename $current) # Get base name (i.e. strip directory)
dir=$(dirname $current) # Get the directory name of the curent file on tiny
dir=${dir/\/some\/dir/\/other\/fold} # Replace the directory root on tiny with the root on the storage server
ssh -i keyfile myuser@storage.server.net \
mkdir -p $dir # create the directory on the storage server and all parents if needed
scp -i keyfile $current myuser@storage.server.net:$dir/$name \
&& rm $current # remove files on success
done
This however strikes me as unnecessarily complicated and maybe error prone. I thought of rsync, but when copying single files there is no option to create a directory and its parents if they don't exist. Does anyone have an idea better than mine?
What I ended up using after this thread
rsync -av --remove-sent-files --prune-empty-dirs \
-e 'ssh -i /full/path/to/keyfile' \
--include="*/" --include="snap_*" --exclude="*" \
/some/dir/ myuser@storage.server.com:/other/fold/
More recent versions than the one I was using take --remove-source-files instead of --remove-sent-files, the former being a more telling name in that it's clearer which files are deleted. Also --dry-run is a good option to test your parameters BEFORE actually using rsync.
Thanks to Alex Howansky for the solution and to Douglas Leeder for caring!
How do I tell rsync just to copy the snap_xxxx files?
See the --include option.
How do I change the direcotry root?
Just specify it on the command line.
rsync [options] source_dir dest_host:dest_dir
How do I delete the originals on Tiny after transfer to the storage server?
See the --remove-source-files option.
Maybe something like:
touch /tmp/start
rsync -va /some/dir/ /other/fold
find /some/dir -type f -not -newer /tmp/start | xargs rm

Copy files from one directory into an existing directory

In bash I need to do this:
take all files in a directory
copy them into an existing directory
How do I do this? I tried cp -r t1 t2 (both t1 and t2 are existing directories, t1 has files in it) but it created a directory called t1 inside t2, I don't want that, I need the files in t1 to go directly inside t2. How do I do this?
What you want is:
cp -R t1/. t2/
The dot at the end tells it to copy the contents of the current directory, not the directory itself. This method also includes hidden files and folders.
cp dir1/* dir2
Or if you have directories inside dir1 that you'd want to copy as well
cp -r dir1/* dir2
If you want to copy something from one directory into the current directory, do this:
cp dir1/* .
This assumes you're not trying to copy hidden files.
Assuming t1 is the folder with files in it, and t2 is the empty directory. What you want is something like this:
sudo cp -R t1/* t2/
Bear in mind, for the first example, t1 and t2 have to be the full paths, or relative paths (based on where you are). If you want, you can navigate to the empty folder (t2) and do this:
sudo cp -R t1/* ./
Or you can navigate to the folder with files (t1) and do this:
sudo cp -R ./* t2/
Note: The * sign (or wildcard) stands for all files and folders. The -R flag means recursively (everything inside everything).
cp -R t1/ t2
The trailing slash on the source directory changes the semantics slightly (with BSD/macOS cp; GNU cp treats the path with and without the trailing slash the same, as noted in an earlier answer), so it copies the contents but not the directory itself. It also avoids the problems with globbing and invisible files that Bertrand's answer has (copying t1/* misses invisible files, copying t1/* t1/.* copies t1/. and t1/.., which you don't want).
Inside some directory, this will be useful, as it copies all contents from "folder1" into the directory "folder2" in the same place.
$(pwd) will get the path of the current directory.
Notice the dot (.) after folder1, to get all contents inside folder1:
cp -r $(pwd)/folder1/. $(pwd)/folder2
Nov, 2021 Update:
This code with Flag "-R" copies perfectly all the contents of "folder1" to existing "folder2":
cp -R folder1/. folder2
Flag "-R" copies symbolic links as well but Flag "-r" skips symbolic links so Flag "-R" is better than Flag "-r".
The latest GNU Grep 3.7:
-R, --dereference-recursive
For each directory operand, read and process all files in that directory,
recursively, following all symbolic links.
-r, --recursive
For each directory operand, read and process all files in that directory,
recursively. Follow symbolic links on the command line, but skip symlinks
that are encountered recursively. Note that if no file operand is given,
grep searches the working directory. This is the same as the
‘--directories=recurse’ option.
Depending on some details you might need to do something like this:
r=$(pwd)
case "$TARG" in
/*) p="$TARG";;      # absolute target: use it as-is
*) p="$r/$TARG";;    # relative target: resolve it against the original directory
esac
cd "$SRC" && cp -r . "$p"
cd "$r"
... this basically changes to the SRC directory and copies it to the target, then returns to wherever you started.
The extra fussing is to handle relative or absolute targets.
(This doesn't rely on subtle semantics of the cp command itself ... about how it handles source specifications with or without a trailing / ... since I'm not sure those are stable, portable, and reliable beyond just GNU cp and I don't know if they'll continue to be so in the future).
The correct option should be -T, used with -r to copy recursively.
$ cp -r -T t1 t2

Resources