I'd like to use the find command while excluding a directory.
I have this:
GLUE="$GLUE -not -iwholename */dir3/*"
And I want to use the GLUE variable in a find command (find $GLUE [...]). Unfortunately, instead of -not -iwholename */dir3/* in GLUE, I get -not -iwholename dir3/file1 dir3/file2 dir3/file3, i.e., */dir3/* turns into the names of the files that match the pattern. And, of course, find doesn't work because of it. How do I stop that?
find ./ -iname <file name> ! -iname <name of file to exclude from the results>
find ./ -iname <file name> ! -path "./dir to be excluded/*"
I hope one of these helps.
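For example, a sketch of the second form with hypothetical names (dir3 and *.conf stand in for your own directory and pattern):

# find any .conf file, skipping everything under ./dir3
find ./ -iname '*.conf' ! -path './dir3/*'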
Let me answer this question first:
How can I use find to search recursively while excluding a directory from the search? My find command supports the -prune action and the common extensions (e.g., provided by GNU find).
You're very lucky that your find supports the -prune action. Congratulations!
In this case, just add:
\! \( -name 'dir_to_exclude' -prune \)
This will be false if the name currently processed by find is dir_to_exclude, and at the same time it will prune that branch (i.e., cut it) from the search tree [1].
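(For comparison, a minimal sketch of the classic idiom you will often see elsewhere, which combines -prune with -o instead of \!; equivalent in spirit:)

# prune the directory, otherwise print
find . -name 'dir_to_exclude' -prune -o -print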
Let's try it (in a scratch directory):
$ mkdir -p {a,b,c}/{1,2}
$ touch {a,b,c}{,/1,/2}/file{1,2}
$ tree
.
|-- a
|   |-- 1
|   |   |-- file1
|   |   `-- file2
|   |-- 2
|   |   |-- file1
|   |   `-- file2
|   |-- file1
|   `-- file2
|-- b
|   |-- 1
|   |   |-- file1
|   |   `-- file2
|   |-- 2
|   |   |-- file1
|   |   `-- file2
|   |-- file1
|   `-- file2
`-- c
    |-- 1
    |   |-- file1
    |   `-- file2
    |-- 2
    |   |-- file1
    |   `-- file2
    |-- file1
    `-- file2

9 directories, 18 files
$ find \! \( -name a -prune \)
.
./b
./b/file2
./b/1
./b/1/file2
./b/1/file1
./b/file1
./b/2
./b/2/file2
./b/2/file1
./c
./c/file2
./c/1
./c/1/file2
./c/1/file1
./c/file1
./c/2
./c/2/file2
./c/2/file1
Looks good!
Now, you shouldn't put your arguments to find (or, more generally, your commands and arguments) in a variable, but in an array! If you don't, you'll very soon run into problems with arguments containing spaces or, as in your OP, wildcards.
Put your glue stuff in an array: for example, to find all the files that have a 1 in their name and are empty [2]:
glue=( '-name' '*1*' '-type' 'f' '-empty' )
Then an array for excluding the directory:
exclude_dir=( '!' '(' '-name' 'a' '-prune' ')' )
Then find all files that have a 1 in their name and are empty, while excluding directory a (in the same scratch directory as before):
$ find "${exclude_dir[#]}" "${glue[#]}"
./b/1/file1
./b/file1
./b/2/file1
./c/1/file1
./c/file1
./c/2/file1
Looks really good! (and observe the quotes!).
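And to build the glue up incrementally, as the string concatenation in the question attempted, append to the array with += instead (a sketch reusing the names above):

glue=( '-type' 'f' '-empty' )
glue+=( '-name' '*1*' )    # appending this way stays safe from word splitting
find "${exclude_dir[@]}" "${glue[@]}"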
[1]
If you really want to be sure that you're only excluding the directory named dir_to_exclude, in case you also have a regular file called dir_to_exclude, you can specify it thus:
\! \( -name 'dir_to_exclude' -type d -prune \)
[2]
I'm using a lot of quotes here. It's just a good habit, and here it actually saves me: both for the wildcard pattern *1* and for the parentheses!
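(To see why the quotes matter, try this in subdirectory a of the scratch directory, which contains entries named 1 and file1; echo shows what find would actually receive:)

$ cd a
$ echo find . -name *1*      # unquoted: the shell expands the glob first
find . -name 1 file1
$ echo find . -name '*1*'    # quoted: find receives the pattern itself
find . -name *1*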
Let's assume I have the following directory tree:
.
|-- foo
`-- foodir
    |-- bardir
    |   |-- bar
    |   `-- foo
    |-- foo -> bardir/foo
    `-- foodir
        |-- bar
        `-- foo

3 directories, 6 files
How can I rename every foo to buz, including the symlink and its target, like this:
.
|-- buz
`-- buzdir
    |-- bardir
    |   |-- bar
    |   `-- buz
    |-- buz -> bardir/buz
    `-- buzdir
        |-- bar
        `-- buz

3 directories, 6 files
I thought it would be relatively easy at first glance, but it turned out to be unexpectedly tough.
First, I tried to mv all the files around using git ls-files:
$ for file in $(git ls-files '*foo*'); do mv "$file" "${file//foo/buz}"; done
This gave me a bunch of errors saying that I have to create the new directories before doing so:
mv: cannot move 'foodir/bardir/bar' to 'buzdir/bardir/bar': No such file or directory
mv: cannot move 'foodir/bardir/foo' to 'buzdir/bardir/buz': No such file or directory
mv: cannot move 'foodir/foo' to 'buzdir/buz': No such file or directory
mv: cannot move 'foodir/foodir/bar' to 'buzdir/buzdir/bar': No such file or directory
mv: cannot move 'foodir/foodir/foo' to 'buzdir/buzdir/buz': No such file or directory
I didn't want to deal with cleaning up empty directories after copying, so I tried find -exec, expecting it to handle the renaming while finding files based on their names.
$ find . -path ./.git -prune -o -name '*foo*' -exec bash -c 'mv "$0" "${0//foo/buz}"' "{}" \;
But find still seemed to try renaming files under the already-renamed path:
find: ./foodir: No such file or directory
My final solution was to find just the first matching file/directory for every single mv command:
#!/bin/bash
# Rename file paths recursively
while :; do
    path=$(find . -path ./.git -prune -o -name '*foo*' -print -quit)
    if [ -z "$path" ]; then
        break
    fi
    if ! mv "$path" "${path/foo/buz}"; then
        break
    fi
done

# Change symlink targets as well
find . -path ./.git -prune -o -type l -exec bash -c '
    target=$(readlink "$0")
    if [ "$target" != "${target//foo/buz}" ]; then
        # point the link at the renamed target
        ln -sfn "${target//foo/buz}" "$0"
    fi
' "{}" \;
This is kinda lame, but it works as I expected. So my questions are:
Can I assume find always outputs directories before their subdirectories/files?
Is there any way to avoid running find multiple times?
Thank you.
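(For what it's worth, a hedged sketch of one possible single-pass approach: by default find does print directories before their contents, and with -depth it visits children before their parents, so renaming only the basename of each match never invalidates a path that is still to be visited. Note that -prune has no effect when -depth is active, hence the ! -path tests. Untested against the exact tree above, and the symlink targets would still need the separate readlink pass:)

find . -depth ! -path ./.git ! -path './.git/*' -name '*foo*' -exec bash -c '
    for p in "$@"; do
        base=${p##*/}                       # basename of the match
        mv "$p" "${p%/*}/${base//foo/buz}"  # rename only the last component
    done
' bash {} +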
I have some directories with the following structure:
DAY1/                # Files under this directory should have DAY1 in the name.
|-- Date
|   |-- dir1         # Something wrong here: there are files with DAY2 and files with DAY1.
|   |-- dir2
|   |-- dir3
|   `-- dir4
DAY2/                # Files under this directory should all have DAY2 in the name.
|-- Date
|   |-- dir1
|   |-- dir2         # Something wrong here: there are files with DAY2 and files with DAY1.
|   |-- dir3
|   `-- dir4
In each dir there are hundreds of thousands of files whose names contain DAY, for example 0.0000.DAY1.01927492. Files with DAY1 in the name should only appear under the parent directory DAY1.
Something went wrong when copying files around, so that I now have mixed files with DAY1 and DAY2 in some of the dir directories.
I wrote a script to find folders that contain mixed files, so I can then look at them more closely. My script is the following:
for directory in */; do
    if ls $directory | grep -q DAY2; then
        if ls $directory | grep -q DAY1; then
            echo "mixed files in $directory"
        fi
    fi
done
The problem here is that I'm going through all the files twice, which doesn't make sense considering that I'd only have to look through them once.
What would be a more efficient way to achieve what I want?
If I understand you correctly, you need to find the files under the DAY1 directory recursively that have DAY2 in their names, and similarly, for the DAY2 directory, the files that have DAY1 in their names.
If so, for DAY1 directory:
find DAY1/ -type f -name '*DAY2*'
This will get you the files under the DAY1 directory that have DAY2 in their names. Similarly, for the DAY2 directory:
find DAY2/ -type f -name '*DAY1*'
Both are recursive operations.
To get the directory names only:
find DAY1/ -type f -name '*DAY2*' -exec dirname {} +
Note that $PWD will be shown as a dot (.) in the output.
To get uniqueness, pass the output to sort -u:
find DAY1/ -type f -name '*DAY2*' -exec dirname {} + | sort -u
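(With GNU find, a sketch that skips the extra dirname processes by printing each file's parent directory directly; -printf is a GNU extension:)

find DAY1/ -type f -name '*DAY2*' -printf '%h\n' | sort -u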
Given that the difference between going through them once and going through them twice is just a factor of two, changing to an approach that goes through them only once might actually not be a win, since the new approach might easily take twice as long per file.
So you'll definitely want to experiment; it's not necessarily something that you can confidently reason about.
However, I will say that in addition to going through the files twice, the ls version also sorts the files, which probably has a more-than-linear cost (unless it's doing some kind of bucket sort). Eliminating that, by writing ls --sort=none instead of plain ls, will actually improve your algorithmic complexity, and is almost certain to give a tangible improvement.
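(As a concrete sketch, assuming GNU ls, here is the original loop with sorting disabled; -U is the short form of --sort=none:)

for directory in */; do
    # same two-pass check as before, but ls no longer sorts the listing
    if ls -U "$directory" | grep -q DAY2 && ls -U "$directory" | grep -q DAY1; then
        echo "mixed files in $directory"
    fi
done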
But FWIW, here's a version that only goes through the files once, that you can try:
for directory in */; do
    find "$directory" -maxdepth 1 \( -name '*DAY1*' -or -name '*DAY2*' \) -print0 \
        | { saw_day1=
            saw_day2=
            while IFS= read -r -d '' file; do
                if [[ "$file" == *DAY1* ]]; then
                    saw_day1=1
                fi
                if [[ "$file" == *DAY2* ]]; then
                    saw_day2=1
                fi
                if [[ "$saw_day1" ]] && [[ "$saw_day2" ]]; then
                    echo "mixed files in $directory"
                    break
                fi
            done
          }
done
I have a directory full of files. The tree looks something like this:
|-- test1a
|   `-- test1b
|       |-- foo.txt
|       `-- bar.txt
`-- test2a
    `-- test2b
Where the directory names match the regular expression test[1-9][ab].
Using find in bash, I'm trying to create blank files in test2b with the same filenames and extensions as those in test1b.
So far, I've tried the following:
find test1a/test1b -type f -exec touch test2a/test2b {} \;
This, however, does not work. I don't have much experience with bash, so I'm not sure where to go from here. Where am I going wrong?
I was able to solve this problem using the following:
$ cd test2a/test2b
$ find ../../test1a/test1b -type f -exec sh -c 'touch $(basename {})' \;
I believe the problem resulted from {} giving the full path rather than just the filename. touch was then being pointed at a file that already existed, so it left it alone and did nothing.
Here is a second approach:
find test1a/test1b -type f -execdir echo touch test2a/test2b/{} \; > adhoc.sh
sh adhoc.sh
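(A more direct sketch with the same effect, run from the directory that contains test1a and test2a; ${f##*/} strips the leading directories, like basename:)

find test1a/test1b -type f -exec sh -c '
    for f in "$@"; do
        touch "test2a/test2b/${f##*/}"   # create an empty file with the same name
    done
' sh {} +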
Good morning!
This is a simple one I believe, but I am still a noob :)
I am trying to find all folders with a certain name. I am able to do this with the command
find /path/to/look/in/ -type d | grep .texturedata
The output gives me lots of folders like this:
/path/to/look/in/.texturedata/v037/animBMP
But I would like it to stop at .texturedata:
/path/to/look/in/.texturedata/
I have hundreds of these paths and would like to lock them down by piping the output of grep into chmod 000.
I was given a command with the argument -dpe once, but I have no idea what it does, and the Internet has not been able to help me determine its usage.
Thank you very much for your help!
I am trying to find all folders with a certain name. I am able to do this with the command find /path/to/look/in/ -type d | grep .texturedata
There's no need to grep the output of find to look for a specific directory name; the -name option of find will do the same job:
find /path/to/look/in/ -type d -name '.texturedata'
I would like it to stop at .texturedata
The -prune option is quite suitable for this requirement:
find /path/to/look/in/ -type d -name '.texturedata' -prune
I have hundreds of these paths and would like to lock them down by piping the output of grep into chmod 000
Try using find with the -exec option:
find /path/to/look/in/ -type d -name '.texturedata' -exec chmod 000 {} \; -prune
A more efficient approach would be to pipe the output of find to xargs:
find /path/to/look/in/ -type d -name '.texturedata' -prune -print0 | xargs -0 chmod 000
TEST
$ tree -pa
.
|-- [drwxrwxrwx]  .texturedata
|   `-- [drwxrwxrwx]  .texturedata
|-- [drwxrwxrwx]  dir1
|   |-- [drwxrwxrwx]  .texturedata
|   |   `-- [-rwxrwxrwx]  file2
|   `-- [drwxrwxrwx]  dir11
|       `-- [-rwxrwxrwx]  file111
|-- [drwxrwxrwx]  dir2
|   `-- [drwxrwxrwx]  .texturedata
|       `-- [-rwxrwxrwx]  file3
|-- [drwxrwxrwx]  dir3
|   `-- [-rwxrwxrwx]  file4
`-- [-rwxrwxrwx]  file1

8 directories, 5 files
$ find . -type d -name '.texturedata' -prune -print0 | xargs -0 chmod 000
$ tree -pa
.
|-- [d---------]  .texturedata [error opening dir]
|-- [drwxrwxrwx]  dir1
|   |-- [d---------]  .texturedata [error opening dir]
|   `-- [drwxrwxrwx]  dir11
|       `-- [-rwxrwxrwx]  file111
|-- [drwxrwxrwx]  dir2
|   `-- [d---------]  .texturedata [error opening dir]
|-- [drwxrwxrwx]  dir3
|   `-- [-rwxrwxrwx]  file4
`-- [-rwxrwxrwx]  file1

7 directories, 3 files
Try
find /path/to/look/in/ -type d -name .texturedata -print0 | xargs -0 chmod 000
or
find /path/to/look/in/ -type d -name .texturedata -exec chmod 000 {} \;
No need to use grep. The above will only change the permissions of each .texturedata directory, not of its children, provided no directory inside a .texturedata is itself named .texturedata. And it will find every .texturedata inside /path/to/look/in.
You can use the -quit option of GNU find with -exec if you only want to process the first match:
find /path/to/look/in/ -type d -name ".texturedata" -exec chmod 000 '{}' \; -quit
I have a WordPress upload folder that is structured using subfolders for months.
wolfr2:uploads wolfr$ tree .
.
|-- 2007
|   |-- 08
|   |   |-- beautifulkatamari.jpg
|   |   |-- beautifulkatamari.thumbnail.jpg
|   |   |-- beetle.jpg
|   |   `-- beetle.thumbnail.jpg
How do I use the terminal to copy all the images recursively into another folder? I can't seem to wildcard folders the way you can wildcard filenames (e.g., *.jpg or *). (I'm on Mac OS X.)
cp -R ./*.jpg .
?
This will copy all *.jpg files from the current folder to a new folder and preserve the directory structure:
tar cf - `find . -name "*.jpg"` | (cd <newfolder>; tar xfp -)
To copy without preserving the directory structure:
cp `find . -name "*.jpg"` <newfolder>
Off the top of my head:
find . -type f -name \*.jpg -exec cp \{\} "$TARGETFOLDER" \;
If that doesn't work, comment and I'll try again, but find is definitely the way to go.
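(If GNU cp is available, its -t option takes the target directory first, which lets find batch many files into each cp invocation; a sketch, with $TARGETFOLDER as a placeholder as above. Note that the stock macOS cp lacks -t:)

find . -type f -name '*.jpg' -exec cp -t "$TARGETFOLDER" {} +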
None of the above commands worked for me as such on macOS 10.15. Here's the one that worked:
find . -name "*.jpg" -exec cp {} [target folder path] \;