How to emulate cp and mv --parents on OS X / macOS

OS X's mv and cp do not have the --parents option, so how does one emulate it?
I.e. mv x/y/a.txt s/x/y/a.txt when s is empty gives a "no such directory" error unless one does a mkdir first, which is rather cumbersome when trying to do this with thousands of files.
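The manual workaround I want to avoid looks roughly like this (dest is just an illustrative target root):
mkdir -p "s/x/y" && mv "x/y/a.txt" "s/x/y/a.txt"
# or, for many files (paths relative to the current directory):
find . -name '*.mp3' | while IFS= read -r f; do
    mkdir -p "dest/$(dirname "$f")" && mv "$f" "dest/$f"
done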

The solution (which works on any platform that has rsync) is:
Use find or some other tool to create a file listing the files you want moved/copied, e.g.
find . -name '*.mp3' > files.txt
Then point rsync at it with --files-from; with --remove-source-files it behaves like mv --parents, and without it like cp --parents:
rsync --files-from=files.txt --remove-source-files src dest
Slower than a native mv/cp, but it's resumable, and rsync has a lot more options that can also help with cleaning up your files.
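If you only need the behaviour for a handful of paths, rsync's -R (--relative) flag also emulates --parents directly; a sketch with illustrative paths:
# copies to /path/to/dest/x/y/a.txt, creating x/y at the destination as needed;
# add --remove-source-files to get the mv --parents behaviour instead
rsync -aR x/y/a.txt /path/to/dest/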

Related

Operating on multiple specific folders at once with cp and rm commands

I'm new to Linux (using bash) and I want to ask about something I do often while I work. I'll give two examples.
Deleting multiple specific folders inside a certain directory.
Copying multiple specific folders into a certain directory.
I successfully did this with files, using find with some regex and then -exec and -delete. But for folders it was more problematic, because I had trouble piping the list of folders to the cp/rm command, each time getting a "No such file or directory" error.
Looking online I found the following command (in my case for copying all folders starting with a Z):
cp -r $(ls -A | grep "Z*") destination
But when I execute it, it prints nothing and the prompt doesn't come back until I hit Ctrl+C, and nothing is copied.
How can I achieve what I'm looking for? For both cp and rm.
Thanks in advance!
First of all, you are grepping for "Z*", but that pattern means "zero or more Z characters", so it matches every line; for names that start with Z you want "^Z".
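A quick illustration of the difference:
printf 'Zebra\nalpha\n' | grep 'Z*'    # prints both lines (the empty match succeeds everywhere)
printf 'Zebra\nalpha\n' | grep '^Z'    # prints only "Zebra"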
Also, try executing ls -A on its own: you will get multiple columns. I think you need at least ls -1A to print one result per line.
So for your command try something like:
cp -r $(ls -1A | grep "^Z") destination
or
cp -r $(ls -1A | grep "^Z") -t destination
But all the above is just to correct syntax of your example.
It is much better to use find. Just in case, put the target directory in quotes, like:
find <PATH_FROM> -type d -exec cp -r "{}" -t target \;
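For the original goal (top-level directories whose names start with Z), a sketch along those lines, with the destination path as a placeholder:
# copy every top-level directory starting with "Z" into the destination
find . -maxdepth 1 -type d -name 'Z*' -exec cp -r {} /path/to/destination/ \;
# ...or delete them instead
find . -maxdepth 1 -type d -name 'Z*' -exec rm -r {} +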

How to `scp` directory preserving structure but only pick certain files?

I need to secure copy (scp) a directory, with its sub-structure preserved, to a remote host from the UNIX command line. The subdirectories have identically named files that I WANT and a bunch of other stuff that I don't. Here is what the structure looks like.
directorytocopy
    subdir1
        1.wanted
        2.wanted
        ...
        1.unwanted
        2.notwanted
    subdir2
        1.wanted
        2.wanted
        ...
        1.unwanted
        2.notwanted
    ..
I just want the .wanted files preserving the directory structure. I realize that it is possible to write a shell (I am using bash) script to do this. Is it possible to do this in a less brute force way? I cannot copy the whole thing and delete the unwanted files because I do not have enough space.
Adrian has the best idea to use rsync. You can also use tar to bundle the wanted files:
cd directorytocopy
shopt -s nullglob globstar
tar -cf - **/*.wanted | ssh destination 'cd dirToPaste && tar -xvf -'
Here, tar's -f option is used with the filename - so that stdin/stdout serves as the archive file.
This is untested, and may fail because the archive may not contain the actual subdirectories that hold the "wanted" files.
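One way to sanity-check what would end up in the archive before shipping it over ssh is to list it locally (a quick sketch, with globstar still enabled as above):
# list the archive contents instead of extracting them
tar -cf - **/*.wanted | tar -tvf -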
Assuming GNU tar on the source machine, and assuming that filenames of the wanted files won't contain newlines and they are short enough to fit the tar headers:
find /some/directory -type f -name '*.wanted' | \
    tar cf - --files-from - | \
    ssh user@host 'cd /some/other/dir && tar xvpf -'
rsync with an --include/--exclude list, following @Adrian Frühwirth's suggestion, would be another way to do this.
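A sketch of that rsync approach (the remote host and destination path are placeholders):
# copy only the *.wanted files, keeping the directory structure and pruning empty dirs
rsync -am --include='*/' --include='*.wanted' --exclude='*' \
    directorytocopy/ user@remotehost:/path/to/dest/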

BASH: Copy all files and directories into another directory in the same parent directory

I'm trying to make a simple script that copies all of my $HOME into another folder in $HOME called Backup/. This includes all hidden files and folders, and excludes Backup/ itself. What I have right now for the copying part is the following:
shopt -s dotglob
for file in $HOME/*
do
    cp -r $file $HOME/Backup/
done
Bash tells me that it cannot copy Backup/ into itself. However, when I check the contents of $HOME/Backup/ I see that $HOME/Backup/Backup/ exists.
The copy of Backup/ inside itself is useless. How can I get bash to copy over all the folders except Backup/? I tried using extglob with cp -r $HOME/!(Backup)/ but it didn't copy over the hidden files that I need.
Try rsync. You can exclude files/directories.
This is a good reference:
http://www.maclife.com/article/columns/terminal_101_using_rsync_locally
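For this particular case that might look something like the following (a sketch, assuming Backup/ sits directly under $HOME):
# copy everything in $HOME, hidden files included, into Backup/,
# but never descend into Backup/ itself
rsync -a --exclude='/Backup/' "$HOME/" "$HOME/Backup/"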
Hugo,
A script like this is good, but you could try this:
cp -r * Backup/;
cp -r .* Backup/;
Another tool used with backups is tar. This compresses your backup to save disk space.
Also note that * does not match hidden dot files.
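If you go the tar route, a rough sketch might be the following (assuming GNU tar; the archive name is arbitrary, and excluding Backup keeps the archive from including the folder it is written into):
tar -czf "$HOME/Backup/home-backup.tar.gz" --exclude='./Backup' -C "$HOME" .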
I agree that using rsync would be a better solution, but there is an easy way to skip a directory in bash:
for file in "$HOME/"*
do
    [[ $file = $HOME/Backup ]] && continue
    cp -r "$file" "$HOME/Backup/"
done
This doesn't answer your question directly (the other answers already did that), but try cp -ua when you want to use cp to make a backup. This recurses directories, copies rather than follows links, preserves permissions and only copies a file if it is newer than the copy at the destination.
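Dropped into the loop above, that might look like this (a sketch assuming GNU cp, since BSD/macOS cp lacks -u):
shopt -s dotglob                                 # include hidden files and folders
for file in "$HOME/"*; do
    [[ $file = "$HOME/Backup" ]] && continue     # skip the backup folder itself
    cp -ua "$file" "$HOME/Backup/"               # archive copy, only when the source is newer
done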

What's the best way to store files on a remote host as they are created and remove the originals?

Basically I want to move files to another server as they are created, preserving the directory structure. I have a solution but it lacks elegance. Also, I feel like I'm missing the obvious answer, so thanks in advance for your help, and I totally understand if this bores you.
The situation
I have a server with limited disk space (let's call it 'Tiny') and a storage server. Tiny creates files every once in a while. I want to store them automatically on the storage server and remove the originals when it's safe. I have to retain the directory structure of Tiny. I don't know in advance what the dir structure looks like. That is, all files are created in the directory /some/dir/, but subdirectories of this are created on the fly. They should be stored in /other/fold/ on the storage server, preserving the substructure under /some/dir. E.g.:
/some/dir/bla/foo/bar/snap_001a on Tiny becomes /other/fold/bla/foo/bar/snap_001a on the storage server. The files are all called snap_xxxx, where xxxx is a four-character alphanumeric string.
My old solution
Now, my idea was to loop over the files and scp them. Once scp finishes and returns without error, the files on Tiny are removed with rm.
#!/bin/bash
# This is invoked by a cronjob every once in a while.
files=$(find /some/dir/ -name 'snap_*')
IFS='
'
for current in $files; do
    name=$(basename $current)                # Get the base name (i.e. strip the directory)
    dir=$(dirname $current)                  # Get the directory of the current file on Tiny
    dir=${dir/\/some\/dir/\/other\/fold}     # Replace the directory root on Tiny with the root on the storage server
    ssh -i keyfile myuser@storage.server.net \
        mkdir -p $dir                        # Create the directory (and any missing parents) on the storage server
    scp -i keyfile $current myuser@storage.server.net:$dir/$name \
        && rm $current                       # Remove the file on success
done
This, however, strikes me as unnecessarily complicated and maybe error-prone. I thought of rsync, but when copying single files there is no option to create a directory and its parents if they don't exist. Does anyone have a better idea than mine?
What I ended up using after this thread
rsync -av --remove-sent-files --prune-empty-dirs \
    -e 'ssh -i /full/path/to/keyfile' \
    --include="*/" --include="snap_*" --exclude="*" \
    /some/dir/ myuser@storage.server.com:/other/fold/
More recent versions than the one I was using take --remove-source-files instead of --remove-sent-files; the former is the more telling name, since it's clearer which files get deleted. Also, --dry-run is a good option for testing your parameters BEFORE actually using rsync.
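For example, the same command can be previewed first; with --dry-run nothing is transferred or deleted:
rsync -av --dry-run --remove-source-files --prune-empty-dirs \
    -e 'ssh -i /full/path/to/keyfile' \
    --include='*/' --include='snap_*' --exclude='*' \
    /some/dir/ myuser@storage.server.com:/other/fold/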
Thanks to Alex Howansky for the solution and to Douglas Leeder for caring!
How do I tell rsync just to copy the snap_xxxx files?
See the --include option.
How do I change the directory root?
Just specify it on the command line.
rsync [options] source_dir dest_host:dest_dir
How do I delete the originals on Tiny after transfer to the storage server?
See the --remove-source-files option.
Maybe something like:
touch /tmp/start
rsync -va /some/dir/ /other/fold
find /some/dir -type f -not -newer /tmp/start | xargs rm

install command in OSX and directory structure

I need to copy a few files to another directory. The source structure is as follows
src/foo1.h
src/foo2.h
src/bar/foobar.h
I need to copy these so they end up here
/usr/include/foo/foo1.h
/usr/include/foo/foo2.h
/usr/include/foo/bar/foobar.h
In Linux I use cp -u --parents *.h bar/*.h /usr/include/foo from src, which works great. However, I can't find a suitable replacement in Mac OS X: cp doesn't support --parents or an equivalent option, and install supports -d, which is supposed to preserve the structure, but gives me the following error: install: foo1.h exists but it's not a directory
I'm stuck. Any ideas?
You could use rsync, e.g.
rsync -a ./src/ /usr/include/foo/ --include \*/ --include \*.h --exclude \*
BTW, you probably don't want to install stuff to /usr/include, as it may well get clobbered by system updates. Consider using e.g. /usr/local/include instead.
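Adapted to that suggestion, the command might look like this (a sketch; use sudo if your user cannot write to /usr/local/include):
rsync -a --include='*/' --include='*.h' --exclude='*' ./src/ /usr/local/include/foo/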
