BASH: Copy all files and directories into another directory in the same parent directory

I'm trying to make a simple script that copies all of my $HOME into another folder in $HOME called Backup/. This includes all hidden files and folders, and excludes Backup/ itself. What I have right now for the copying part is the following:
shopt -s dotglob
for file in $HOME/*
do
cp -r $file $HOME/Backup/
done
Bash tells me that it cannot copy Backup/ into itself. However, when I check the contents of $HOME/Backup/ I see that $HOME/Backup/Backup/ exists.
The copy of Backup/ inside itself is useless. How can I get bash to copy over all the folders except Backup/? I tried using extglob with cp -r $HOME/!(Backup)/ but it didn't copy over the hidden files that I need.

Try rsync; it lets you exclude files and directories.
This is a good reference:
http://www.maclife.com/article/columns/terminal_101_using_rsync_locally
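For the layout described in the question, a minimal rsync sketch (assuming Backup/ already exists under $HOME; the exclude pattern is anchored so only the top-level Backup/ is skipped) could be:
rsync -a --exclude='/Backup' "$HOME/" "$HOME/Backup/"
The -a flag copies recursively, preserves permissions and includes hidden files, so no dotglob tricks are needed.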

Hugo,
A script like this is good, but you could try this:
cp -r * Backup/
cp -r .* Backup/
Another tool used with backups is tar, which can also compress your backup to save disk space.
Also note that * does not match hidden (dot) files, which is why the second cp with .* is there. Be careful, though: .* also matches . and .., so that line will try to copy the current and parent directories as well.
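A sketch that avoids the .* pitfall entirely, using the shell options already mentioned in the question (dotglob so hidden entries are matched, extglob so Backup/ itself can be excluded):
shopt -s dotglob extglob
cp -r "$HOME"/!(Backup) "$HOME/Backup/"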

I agree that using rsync would be a better solution, but there is an easy way to skip a directory in bash:
shopt -s dotglob   # as in the question, so hidden files are matched too
for file in "$HOME"/*
do
    [[ $file = "$HOME/Backup" ]] && continue
    cp -r "$file" "$HOME/Backup/"
done

This doesn't answer your question directly (the other answers already did that), but try cp -ua when you want to use cp to make a backup. This recurses directories, copies rather than follows links, preserves permissions and only copies a file if it is newer than the copy at the destination.
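For example, a purely illustrative sketch (the Documents folder is just a placeholder, not from the original post):
cp -ua "$HOME/Documents" "$HOME/Backup/"
On a second run, only files newer than their copies under Backup/ are copied again.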

Related

Deleting specific files in a directory using bash

I have a txt file with a list of files (approximately 500) for example:
file_0_hard.msOut
file_1_hard.msOut
file_10_hard.msOut
.
.
.
file_1000_hard.msOut
I want to delete every file in that directory whose name is not in the txt file. All of these files are in the same directory. How can I do this in bash: read the text file, then delete all files in the directory that are not listed in it? Help would be appreciated.
Along the lines of user1934428:
There is something to say for that solution. But since we have Linux at our disposal, with (I hope) a capable filesystem in use, we can make hardlinks; the only requirement is that the destination is on the same filesystem.
So along those lines (see the sketch below):
make a directory to store the files you want to keep;
hardlink each file into it (ln {file} {target}); this costs no extra disk space, since it only adds another directory entry pointing at the same inode;
remove all files;
move the kept files back to where they came from.
And actually this would be about the same as:
mv {files} {save spot}
remove all files
mv {save spot}/{files} back
which does pretty much the same thing. Then again, it is a nice way to learn about the power of a hardlink.
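A minimal sketch of the hardlink variant, assuming the .msOut files live in the current directory and the names to keep are listed in list.txt (one per line, as in the answer further down):
mkdir keep
while IFS= read -r f; do
    ln "$f" keep/          # hardlink: costs no extra disk space
done < list.txt
rm -f ./*.msOut            # remove everything
mv keep/* . && rmdir keep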
You may try this:
cd path/dir
for f in *; do
    if ! grep -Fxq "$f" pathToFile/file.txt; then
        rm -r "$f"
    else
        printf "exists-- %s\n" "$f"
    fi
done
In case you are wondering (as I did) what -Fxq means in plain English:
F: Affects how PATTERN is interpreted (fixed string instead of a regex)
x: Match whole line
q: Shhhhh... minimal printing
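If you want to be careful, a dry-run sketch along the same lines only prints what would be removed (same hypothetical paths as above):
cd path/dir
for f in *; do
    grep -Fxq "$f" pathToFile/file.txt || echo "would delete: $f"
done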
Assuming the directory in question is mydir:
set -e
cd mydir
tmpdir=/tmp/x$$   # adapt this to your taste
mkdir "$tmpdir"   # the temporary directory has to exist before we move files into it
mv $(<list.txt) "$tmpdir"
cd ..
rm -r mydir
mkdir mydir
mv "$tmpdir"/* mydir
rm -r "$tmpdir"
Basically, instead of deleting the files you want to keep, you save them, then delete everything, and then restore them. For your case, this is probably faster than doing it the other way around.
UPDATE:
As Michiel commented, it is advisable to place your tmpdir on the same filesystem as mydir.
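One way to do that, as a sketch (run from mydir's parent directory, before the cd mydir above; assumes GNU mktemp):
tmpdir=$(mktemp -d "$PWD/keep.XXXXXX")
An absolute path next to mydir stays on the same filesystem (unless mydir is a mount point), so the mv calls remain cheap rename operations instead of copies.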

Copy multiple files with wildcard in bash

Using Ubuntu 18.04. Say we have a file called debug.log. You can create a copy called debug_BACKUP.log with either of these commands:
cp debug.log debug_BACKUP.log
cp debug{,_BACKUP}.log
Alternatively, substitute cp with mv to rename the file.
Now suppose we have debug1.log and debug2.log. We would like to create copies called debug1_BACKUP.log and debug2_BACKUP.log. Is there a single command to achieve this?
When I tried either of the following:
cp debug*.log debug*_BACKUP.log
cp debug*{,_BACKUP}.log
the error is cp: target 'debug*_BACKUP.log' is not a directory.
Brace expansions are an instruction for the shell about how to rewrite your command before glob expansion takes place. They aren't passed to the command itself -- cp has no idea if a brace expansion was used. For that matter, cp doesn't even have any idea if a wildcard is used; when you run cp *.txt dir/, the shell generates an array of C strings corresponding to something like cp foo.txt bar.txt baz.txt dir/ before running it.
This means that if you want to rewrite content after wildcard expansion takes place, you need to do it by hand.
for f in debug*.log; do
    [[ $f = *_BACKUP.log ]] && continue  # skip things that are already backup files
    cp "$f" "${f%.log}_BACKUP.log"
done
There are a few excellent bulk rename programs, including the Perl-based file-rename. You can achieve your bulk copy in 3 steps (see the sketch after this list):
Copy the files to a tmp subfolder
Perform the bulk rename, moving the files back into the current folder
Remove the tmp folder
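A sketch in that spirit, assuming the Perl-based rename is installed under the name rename and the files match debug*.log as in the question (here the rename happens inside tmp/ and the final mv brings the copies back):
mkdir tmp
cp debug*.log tmp/
rename 's/\.log$/_BACKUP.log/' tmp/*.log
mv tmp/* . && rmdir tmp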

Move files to the correct folder in Bash

I have a few files with names in the format ReportsBackup-20140309-04-00, and I would like to move files sharing the same year/month part into a matching folder, e.g. the example above into a folder named 201403.
I can already create the folders based on the filenames; I would just like to move the files into their correct folder based on the name.
I use this to create the directories:
old="directory where are the files" &&
year_month=`ls ${old} | cut -c 15-20` &&
for i in ${year_month}; do
    if [ ! -d ${old}/$i ]
    then
        mkdir ${old}/$i
    fi
done
You can use find:
find /path/to/files -name "*201403*" -exec mv {} /path/to/destination/ \;
Here’s how I’d do it. It’s a little verbose, but hopefully it’s clear what the program is doing:
#!/bin/bash
SRCDIR=~/tmp
DSTDIR=~/backups
for bkfile in $SRCDIR/ReportsBackup*; do
    # Get just the filename, and read the year/month variable
    filename=$(basename $bkfile)
    yearmonth=${filename:14:6}
    # Create the folder for storing this year/month combination. The '-p' flag
    # means that:
    #   1) We create $DSTDIR if it doesn't already exist (this flag actually
    #      creates all intermediate directories).
    #   2) If the folder already exists, continue silently.
    mkdir -p $DSTDIR/$yearmonth
    # Then we move the report backup to the directory. The '.' at the end of the
    # mv command means that we keep the original filename
    mv $bkfile $DSTDIR/$yearmonth/.
done
A few changes I’ve made to your original script:
I’m not trying to parse the output of ls. This is generally not a good idea. Parsing ls will make it difficult to get the individual files, which you need for copying them to their new directory.
I’ve simplified your if ... mkdir line: the -p flag is useful for “create this folder if it doesn’t exist, or carry on”.
I’ve slightly changed the slicing command which gets the year/month string from the filename.
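Putting those pieces together, a compact sketch along the same lines (still assuming the ReportsBackup-YYYYMMDD-... naming from the question):
for f in "$SRCDIR"/ReportsBackup-*; do
    name=${f##*/}            # strip the leading path
    yearmonth=${name:14:6}   # e.g. 201403
    mkdir -p "$DSTDIR/$yearmonth"
    mv "$f" "$DSTDIR/$yearmonth/"
done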

rsync : Recursively sync all files while ignoring the directory structure

I am trying to create a bash script for syncing music from my desktop to a mobile device. The desktop is the source.
Is there a way to make rsync recursively sync files but ignore the directory structure? If a file was deleted from the desktop, I want it to be deleted on the device as well.
The directory structure on my desktop is something like this.
Artist1/
Artist1/art1_track1.mp3
Artist1/art1_track2.mp3
Artist1/art1_track3.mp3
Artist2/
Artist2/art2_track1.mp3
Artist2/art2_track2.mp3
Artist2/art2_track3.mp3
...
The directory structure that I want on the device is:
Music/
art1_track1.mp3
art1_track2.mp3
art1_track3.mp3
art2_track1.mp3
art2_track2.mp3
art2_track3.mp3
...
Simply:
rsync -a --delete --include=*.mp3 --exclude=* \
    pathToSongs/Theme*/Artist*/. destuser@desthost:Music/.
would do the job if your path hierarchy has a fixed number of levels.
WARNING: if two song files have exactly the same name, they will land in the same destination directory and your backup will miss one of them!
Otherwise, and to answer strictly your request to ignore the directory structure, you could use bash's shopt -s globstar feature:
shopt -s globstar
rsync -a --delete --include=*.mp3 --exclude=* \
    pathToSongsRoot/**/. destuser@desthost:Music/.
Either way, there is no need to fork a find command.
Recursively sync all files while ignoring the directory structure
To answer the question strictly, this must not be limited to one extension:
shopt -s globstar
rsync -d --delete sourceRoot/**/. destuser@desthost:destRoot/.
With this, directories will be copied too, but without their content. All files and directories end up on the same level under destRoot/.
WARNING: if different files with the same name exist in different directories, they will simply overwrite one another at the destination during the rsync, and only one of them (effectively at random) will end up being stored.
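Since duplicate basenames overwrite each other when the tree is flattened, a quick pre-check can list the names that occur more than once (a sketch; -printf is GNU find):
find pathToSongsRoot -type f -name '*.mp3' -printf '%f\n' | sort | uniq -d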
Maybe this is a recent option, but I see --no-relative mentioned in the documentation for --files-from, and it worked great.
find SourceDir -name \*.mp3 | rsync -av --files-from - --no-relative . DestinationDir/
The answer to your question: no, rsync cannot do this alone. But with some help from other tools, we can get there... After a few tries I came up with this:
rsync -d --delete $(find . -type d|while read d ; do echo $d/ ; done) /targetDirectory && rmdir /targetDirectory/* 2>&-
The difficulty is this: To enable deletion of files at the target position, you need to:
specify directories as sources for rsync (it doesn't delete if the source is a list of files).
give it the complete list of sources at once (rsync within a loop will give you the contents of the last directory only at the target).
end the directory names with a slash (otherwise it creates the directories at the target directory)
So the command substitution (the stuff enclosed with the $( )) does this: It finds all directories and adds a slash (/) at the end of the directory names. Now rsync sees a list of source directories, all terminated with a slash and so copies their contents to the target directory. The option -d tells it, not to copy recursively.
The second trick is the rmdir /targetDirectory/* which removes the empty directories which rsync created (although we didn't ask it to do that).
I tested that here, and deletion of files removed in the source tree worked just fine.
If you can make a list of files, you've already solved the problem.
Try:
find /path/to/src/ -name \*.mp3 > list.txt
rsync -avi --no-relative --progress --files-from=list.txt / user@server:/path/to/dest
If you run the script again for new files, it will only copy the missing files.
If you don't like using a list, then try a single command (but it follows a different logic):
find /path/to/src/ -name \*.mp3 -type f \
    -exec rsync -avi --progress {} user@server:/path/to/dest/ \;
In this case rsync is invoked once per file, every time, since with this kind of command it cannot build the file list beforehand.

Copy files from one directory into an existing directory

In bash I need to do this:
take all files in a directory
copy them into an existing directory
How do I do this? I tried cp -r t1 t2 (both t1 and t2 are existing directories, and t1 has files in it), but it created a directory called t1 inside t2. I don't want that; I need the files in t1 to go directly inside t2.
What you want is:
cp -R t1/. t2/
The dot at the end tells it to copy the contents of the current directory, not the directory itself. This method also includes hidden files and folders.
cp dir1/* dir2
Or if you have directories inside dir1 that you'd want to copy as well
cp -r dir1/* dir2
If you want to copy something from one directory into the current directory, do this:
cp dir1/* .
This assumes you're not trying to copy hidden files.
Assuming t1 is the folder with files in it and t2 is an empty directory, what you want is something like this:
sudo cp -R t1/* t2/
Bear in mind, for the first example, t1 and t2 have to be the full paths, or relative paths (based on where you are). If you want, you can navigate to the empty folder (t2) and do this:
sudo cp -R t1/* ./
Or you can navigate to the folder with files (t1) and do this:
sudo cp -R ./* t2/
Note: The * sign (or wildcard) stands for all files and folders. The -R flag means recursively (everything inside everything).
cp -R t1/ t2
The trailing slash on the source directory changes the semantics slightly, so it copies the contents but not the directory itself (note that this is BSD/macOS cp behaviour; GNU cp ignores a trailing slash on the source, where t1/. achieves the same effect). It also avoids the problems with globbing and invisible files that Bertrand's answer has (copying t1/* misses invisible files, and copying t1/* t1/.* copies t1/. and t1/.., which you don't want).
Inside some directory, this will be useful, as it copies all contents from "folder1" into a directory "folder2" inside that same directory.
$(pwd) gives the path of the current directory.
Notice the dot (.) after folder1, which selects all the contents inside folder1:
cp -r $(pwd)/folder1/. $(pwd)/folder2
Nov, 2021 Update:
This code with Flag "-R" copies perfectly all the contents of "folder1" to existing "folder2":
cp -R folder1/. folder2
Note, however, that in GNU cp the flags "-R" and "-r" are synonyms (the man page lists them together as "-r, -R, --recursive"), so for cp neither is better than the other. The distinction below, where -R follows symbolic links and -r skips the ones encountered recursively, is quoted from the GNU grep 3.7 manual, not from cp:
-R, --dereference-recursive
For each directory operand, read and process all files in that directory,
recursively, following all symbolic links.
-r, --recursive
For each directory operand, read and process all files in that directory,
recursively. Follow symbolic links on the command line, but skip symlinks
that are encountered recursively. Note that if no file operand is given,
grep searches the working directory. This is the same as the
‘--directories=recurse’ option.
Depending on some details you might need to do something like this:
r=$(pwd)
case "$TARG" in
    /*) p="";;   # absolute target: use it as-is
    *)  p=$r;;   # relative target: resolve it against where we started
esac
cd "$SRC" && cp -r . "$p/$TARG"
cd "$r"
... this basically changes to the SRC directory and copies it to the target, then returns to wherever you started.
The extra fussing is to handle relative or absolute targets.
(This doesn't rely on subtle semantics of the cp command itself ... about how it handles source specifications with or without a trailing / ... since I'm not sure those are stable, portable, and reliable beyond just GNU cp and I don't know if they'll continue to be so in the future).
The correct option should be -T, used with -r to copy recursively:
$ cp -r -T t1 t2
