Problems using find and cp to copy just .jpg files from a LOT of directories to one new path - macOS

I tried the search, but couldn't find the answer to my specific problem.
When I use,
find /recovered_files "*.jpg" -type f -exec cp {} /out \;
to copy all .jpg files from directories within the /recovered_files directory, the /out directory gets filled with every single file (jpg, txt, xml etc etc) from within the source directories.
Can anyone please explain wherein my stupidity lies, pleeeeeease???
Many thanks, Mark.

Two things are going on. First, your find command is missing the -name test: find treats "*.jpg" as just another starting path rather than a filename filter, so every regular file under /recovered_files matches -type f and gets passed to cp. Second, what you're doing is equivalent to calling cp /dir/dir/dir/file.jpg /out for each file, which copies the file into /out itself, so all of the files end up flattened into the same directory.
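If a flat /out containing only the .jpg files is what you want, the smallest fix is to add that missing test:
find /recovered_files -type f -name "*.jpg" -exec cp {} /out \;
Bear in mind that source files sharing the same name will overwrite one another in /out.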
Alternatively, rsync allows filters to select only certain files to be copied while keeping the directory structure. Change from and to to the appropriate directories in the following (quote the pattern so the shell doesn't expand it):
rsync -r --prune-empty-dirs --include='*.jpg' --filter='-! */' from/ to
The rules apply in order: *.jpg files are included, the filter then excludes anything else that isn't a directory, and --prune-empty-dirs drops directories left with nothing to copy.
Credit to this post for this solution.
Edit: changed to rsync solution. Original as follows:
find from -name "*.jpg" -type f -exec cp --parents {} to \;
You should replace from and to with the appropriate locations. Note that --parents is a GNU cp option which the stock macOS cp lacks (a portable sketch follows below), and this form won't quite work if from begins with /; just cd to / first if you need to. Also, you'll end up with all the files inside to underneath the entire directory structure of from, but you can just move them back out again.
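On stock macOS (BSD) tools, a rough equivalent is to let a small shell snippet create each directory and copy the file; this is a sketch under the same from/to placeholders (the _ just fills sh's $0 slot):
find from -name "*.jpg" -type f -exec sh -c 'mkdir -p "to/$(dirname "$1")" && cp "$1" "to/$1"' _ {} \;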

Related

How to remove everything in a directory except one file in a subdirectory?

I need to delete everything in directory d1, except the file d1/d2/f1.txt. How can I do that in bash?
This works. It will delete everything except the directories on the path to f1.txt and, of course, the file itself.
find d1/ ! -iregex '\(d1/\|d1/d2\|d1/d2/f1.txt\)' -delete
However, I would strongly advise against using -delete, as it is permanent and mistyping a character could be disastrous...
You should try something like this instead, moving files and directories into a trash folder first, so that if you delete something you didn't mean to, you can still recover it.
mkdir -p ~/.Trash
find d1/ ! -iregex '\(d1/\|d1/d2\|d1/d2/f1.txt\)' -exec mv {} ~/.Trash \;
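Whichever variant you run, it's worth previewing the match list first; with no action given, find simply prints everything it matched:
find d1/ ! -iregex '\(d1/\|d1/d2\|d1/d2/f1.txt\)'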
Find the files to delete, excluding (!) the specific one by its full path; note that -name only ever matches the base name, so a pattern containing slashes would match nothing and everything would be deleted:
find d1/ -type f ! -path 'd1/d2/f1.txt' -delete
This removes files only, so any directories left empty will remain.

Copy files recursively to all sub-domain folders

On my server, I have x number of sub-domains
Their folder names are like this:
example1.mydomain.com, example2.mydomain.com, examplex.mydomain.com
and they all exist in one location.
I am trying to write a simple bash script to copy all folders and files in folder SOURCE, for example, to all of those sub-domain folders (and replace existing files).
In other words, I want to copy my files from source to any folder with the name *.mydomain.com
I tried rsync but couldn't work out the *.mydomain.com part.
I suggest:
for i in *.mydomain.com; do rsync -aSv "SOURCE/" "$i"; done
The trailing / after SOURCE is important: it tells rsync to copy the contents of SOURCE into each folder rather than creating a SOURCE subdirectory inside it.
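To preview what would be transferred before committing, rsync's -n (--dry-run) flag can be added to the same loop:
for i in *.mydomain.com; do rsync -aSvn "SOURCE/" "$i"; done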
You can also use the find command to locate the sub-domain folders and copy into each of them, e.g. assuming they all live under /home and your files are in SOURCE:
find /home -name "*.mydomain.com" -exec cp -r SOURCE/. {} \;
One problem with the above solution: it might also match plain files with such names, and cp would then fail or misbehave on them. If you are only looking for folders, add -type d:
find /home -name "*.mydomain.com" -type d -exec cp -r SOURCE/. {} \;

Get all content from a folder with subfolders copied to another directory with bash

I have a directory containing many folders, which in turn have subfolders, and then there are some files. Can I write a bash script that copies all the files from the given directory tree into one single folder? That way I would not have to navigate through every single folder and copy the contents to another folder by hand.
In the topmost dir under which you want the files to be copied:
find . -type f -exec cp {} /some/new/location \;
This finds all the regular files and then copies them to /some/new/location.
You could use find to list all files inside the folder:
find ./source -type f
and then use the output as arguments for cp; it would look like this:
cp $(find ./source -type f) destination
There would be a problem if some files within the original directory tree have conflicting names (note also that this command-substitution form breaks if any file names contain spaces; see the find -exec variant at the end). In that case cp would refuse to copy additional files with the same name, with an error like:
cp: will not overwrite just-created `destination/t22' with `./source/test/t2/t22'
To make copies of files with the same name, you can use cp's --backup option, like this:
cp --backup=numbered $(find ./source -type f) destination
If you want to see what is happening, add the -v (verbose) option:
cp -v --backup=numbered $(find ./source -type f) destination
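If file names with spaces are a concern, a find -exec form sidesteps the word-splitting problem entirely; this still assumes GNU cp, for --backup and the -t (target directory) option:
find ./source -type f -exec cp -v --backup=numbered -t destination {} +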

Mac Terminal command move files from subdirectory to parent

On my Mac I am trying to move hundreds of files on my NAS drive, from a parent directory with a load of subdirectories (and possibly directories inside those), and put all of the files into one folder.
They don't have the same file extension for all the files.
Is anyone able to help with the terminal command I need to do this? So far I know that find . -type f will list all the files in the directory and its subdirectories, but I'm unsure how to tell it to move them all into another folder.
For anyone else who may have this same issue:
I've managed to extract just the .jpgs and put them in the parent folder.
find . -mindepth 2 -type f -iname '*.jpg' -print0 | xargs -0 -I{} mv -n '{}' .
Not quite what I wanted - I was hoping to get every single file and put it into a completely different folder if possible, but this has got me further than before.
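Generalizing that command a little: dropping the -iname filter matches every file type, and pointing mv at another folder (the path below is only a placeholder) sends the files there instead of to the parent; -n still stops anything from being overwritten when names collide:
find . -mindepth 2 -type f -print0 | xargs -0 -I{} mv -n '{}' /path/to/destination/
If the destination lies outside the source tree, -mindepth 2 can be dropped so files sitting directly in the parent move as well.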
Go inside the source parent directory and use:
find . -type f -exec mv "$PWD"/{} <destination directory> \;
If you want to move all the files to the parent directory itself, use it as the destination directory.

Find files, rename in place unix bash

This should be relatively trivial but I have been trying for some time without much luck.
I have a directory, with many sub-directories, each with their own structure and files.
I am looking to find all .java files within any directory under the working directory, and rename them to a particular name.
For example, I would like to name all of the java files test.java.
If the directory structure is as follows:
./files/abc/src/abc.java
./files/eee/src/foo.java
./files/roo/src/jam.java
I want to simply rename to:
./files/abc/src/test.java
./files/eee/src/test.java
./files/roo/src/test.java
Part of my problem is that the paths may have spaces in them.
I don't need to worry about renaming classes or anything inside the files, just the file names in place.
If there is more than one .java file in a directory, I don't mind if it is overwritten, or if a prompt is given to choose what to do (either is OK; it is unlikely that there is more than one in each directory).
What I have tried:
I have looked into mv and find, but when I pipe them together I seem to be doing it wrong. I want to make sure to keep the files in their current location and rename them, not move them.
The GNU version of find has an -execdir action which changes directory to wherever the file is.
find . -name '*.java' -execdir mv {} test.java \;
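If you'd like to check the result before committing, prefixing the command with echo prints each mv instead of running it:
find . -name '*.java' -execdir echo mv {} test.java \;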
If your version of find doesn't support -execdir then you can get the job done with:
find . -name '*.java' -exec bash -c 'mv "$1" "${1%/*}"/test.java' -- {} \;
Here the -- fills bash's $0 slot so the found path lands in $1, and ${1%/*} strips the final path component, leaving the file's directory.
If your find command (like mine) doesn't support -execdir, a dirname-based variant also works; passing the path as an argument rather than splicing {} into the script keeps file names containing quotes or spaces safe:
find . -name "*.java" -exec bash -c 'mv "$0" "$(dirname "$0")"/test.java' {} \;
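And since a prompt was mentioned as acceptable when a directory holds more than one .java file, adding mv's -i flag makes it ask before overwriting (shown here with the -execdir form from above):
find . -name '*.java' -execdir mv -i {} test.java \;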
