Why is my `find` command giving me errors relating to ignored directories? - bash

I have this find command:
find . -type f -not -path '**/.git/**' -not -path '**/node_modules/**' | xargs sed -i '' s/typescript-library-skeleton/xxx/g;
for some reason it's giving me these warnings/errors:
find: ./.git/objects/3c: No such file or directory
find: ./.git/objects/3f: No such file or directory
find: ./.git/objects/41: No such file or directory
I even tried using:
-not -path '**/.git/objects/**'
and got the same thing. Anybody know why the find is searching in the .git directory? Seems weird.

why is the find searching in the .git directory?
GNU find is clever and supports several optimizations over a naive implementation:
It can flip the order of -size +512b -name '*.txt' and check the name first, because querying the size will require a second syscall.
It can count the hard links of a directory to determine the number of subdirectories, and once it has seen them all, it no longer needs to check the remaining entries for -type d or recurse into them.
It can even rewrite (-B -or -C) -and -A so that if the checks are equally costly and free of side effects, the -A will be evaluated first, hoping to reject the file after 1 test instead of 2.
However, it is not yet clever enough to realize that -not -path '*/.git/*' means that if you find a directory .git then you don't even need to recurse into it because all files inside will fail to match.
Instead, it dutifully recurses, finds each file and matches it against the pattern as if it was a black box.
To explicitly tell it to skip a directory entirely, you can instead use -prune. See How to exclude a directory in find . command

Both more efficient and more correct would be to avoid the default -print action, change -not -path ... to -prune, and ensure that xargs is only used with NUL-delimited input:
find . -name .git -prune -o \
-name node_modules -prune -o \
-type f -print0 | xargs -0 sed -i '' s/typescript-library-skeleton/xxx/g
Note the following points:
We use -prune to tell find to not even recurse down the undesired directories, rather than -not -path ... to tell it to discard names in those directories after they were found.
We put the -prunes before the -type f, so we're able to match directories for pruning.
We have an explicit action, not depending on the default -print. This is important because the default -print effectively has a set of parentheses: find ... behaves like find '(' ... ')' -print, not like find ... -print, if no explicit action is given.
We use xargs only with the -0 argument enabling NUL-delimited input, and the -print0 action on the find side to generate a NUL-delimited list of names. NUL is the only character which cannot be present in an arbitrary file path (yes, newlines can be present) -- and thus the only character which is safe to use to separate paths. (If the -0 extension to xargs and the -print0 extension to find are not guaranteed to be available, use -exec sed -i '' ... {} + instead).
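For reference, a sketch of that portable fallback, using -exec with the {} + terminator to batch arguments much like xargs does (sed -i '' is the BSD/macOS spelling used in the question; GNU sed wants plain -i):
find . -name .git -prune -o \
-name node_modules -prune -o \
-type f -exec sed -i '' s/typescript-library-skeleton/xxx/g {} +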

Related

Script to find recursively the number of files with a certain extension

We have a highly nested directory structure, where we have a directory, let's call it 'my Dir', appearing many times in our hierarchy. I am interested in counting the number of "*.csv" files in all directories named 'my Dir' (yes, there is a whitespace in the name). How can I go about it?
I tried something like this, but it does not work:
find . -type d -name "my Dir" -exec ls "{}/*.csv" \; | wc -l
If you want to count the number of files matching the pattern '*.csv' under "my Dir", then:
don't ask for -type d; ask for -type f
don't ask for -name "my Dir" if you really want -name '*.csv'
don't try to ls "{}/*.csv" on each match: find runs ls directly, with no shell to expand the glob, so ls is handed the literal string path/*.csv and fails; and counting ls output lines is fragile anyway, since a filename containing newlines would be counted multiple times
also beware of embedding {} in -exec code!
For counting files from find, I like to use a trick I learned from Stéphane Chazelas on U&L; for example, from: Counting files in Linux:
find "my Dir" -type f -name '*.csv' -printf . | wc -c
This requires GNU find, as -printf is a GNU extension to the POSIX standard.
It works by looking within "my Dir" (from the current working directory) for files that match the pattern; for each matching file, it prints a single dot (period); that output is piped to wc, which counts the number of characters (periods) that find produced -- the number of matching files.
You would exclude all paths that are not under my Dir:
find . -type f -not '(' -not -path '*/my Dir/*' -prune ')' -name '*.csv'
Another solution is to use the -path predicate to select your files.
find . -path '*/my Dir/*.csv'
Counting the number of occurrences could be a simple matter of piping to wc -l, though this will obviously produce the wrong result if some of the files contain newlines in their names. (This is slightly pathological, but definitely something you want to cover in production code.) A common arrangement is to just print a newline for every found file, instead of its name.
find . -path '*/my Dir/*.csv' -printf '.\n' | wc -l
(The -printf predicate is not in POSIX but it's not hard to replace with an -exec or similar.)
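For instance, a portable sketch that replaces -printf with a per-file printf invocation (slower, since it spawns one process per match, but needs only POSIX find):
find . -path '*/my Dir/*.csv' -exec printf '.\n' \; | wc -l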

In Bash, how do you delete all files with same name, except the one located in a specific folder?

I have a specific file which is found in several directories. Usually I delete all of them by using the syntax:
find . -name "<Filename>" -delete
However, I want to retain one file from a specific folder, say FOLDER1.
How do I do this using find? (I want to use find because I use -print before -delete to check what files I am deleting. I am apprehensive about using rm since there is a danger of deleting files I want to keep.)
Thanks in advance.
You can do it with
find . -name "filename" -and -not -path "./path/to/filename" -delete
You will want either to make sure that the path expression is a relative one, including the initial ./, so that it matches the pathnames find constructs, or else to use wildcards. So if you know that it's in a folder named myfolder, but you don't know the full path to it, you can use
find . -name "filename" -and -not -path "*/myfolder/filename" -delete
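To preview what would be deleted first -- the workflow you describe -- swap -delete for an explicit -print:
find . -name "filename" -and -not -path "*/myfolder/filename" -print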
If you don't want to delete anything under any directory named FOLDER1, you can tell find not to recurse down any directory so named at all, using -prune:
find . -name FOLDER1 -prune -o -name filename -delete
This is more efficient than recursing down that directory and then filtering out results that include it later.
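One important caveat: in both GNU and BSD find, -delete implies -depth, and with -depth in effect -prune no longer stops descent, so the command above would end up deleting filename inside FOLDER1 as well. When combining -prune with deletion, use -exec instead:
find . -name FOLDER1 -prune -o -name filename -exec rm -- {} +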
Side note: When testing this, be sure you use the explicit -print:
find . -name FOLDER1 -prune -o -name filename -print
...whereas an implicit one won't behave as you expect:
# not what you want: equivalent to the below, not the above:
find . -name FOLDER1 -prune -o -name filename
...will behave as:
find . '(' -name FOLDER1 -prune -o -name filename ')' -print
...which thus applies the -print action to matches from either side of the -o operator.

Bash - Excluding subdirectories using the find command [duplicate]

I'm using the find command to get a list of folders where certain files are located. But because of a permission denied error for certain subdirectories, I want to exclude a certain subdirectory name.
I already tried these solutions I found here:
find /path/to/folders -path "*/noDuplicates" -prune -type f -name "fileName.txt"
find /path/to/folders ! -path "*/noDuplicates" -type f -name "fileName.txt"
And some variations for these commands (variations on the path name for example).
In the first case it won't find a folder at all, in the second case I get the error again, so I guess it still tries to access this directory. Does anyone know what I'm doing wrong or does anyone have a different solution for this?
To complement olivm's helpful answer and address the OP's puzzlement at the need for -o:
-prune, as every find primary (action or test, in GNU speak), returns a Boolean, and that Boolean is always true in the case of -prune.
Without explicit operators, primaries are implicitly connected with -a (-and), which, like its counterpart -o (-or), performs short-circuiting Boolean logic.
-a has higher precedence than -o.
For a summary of all find concepts, see https://stackoverflow.com/a/29592349/45375
Thus, the accepted answer,
find . -path ./ignored_directory -prune -o -name fileName.txt -print
is equivalent to (parentheses are used to make the evaluation precedence explicit):
find . \( -path ./ignored_directory -a -prune \) \
-o \
\( -name fileName.txt -a -print \)
Since short-circuiting applies, this is evaluated as follows:
an input path matching ./ignored_directory causes -prune to be evaluated; since -prune always returns true, short-circuiting prevents the right side of the -o operator from being evaluated; in effect, nothing happens (the input path is ignored)
an input path NOT matching ./ignored_directory, instantly - again due to short-circuiting - continues evaluation on the right side of -o:
only if the filename part of the input path matches fileName.txt is the -print primary evaluated; in effect, only input paths whose filename matches fileName.txt are printed.
Edit: In spite of what I originally claimed here, -print IS needed on the right-hand side of -o here; without it, the implied -print would apply to the entire expression and thus also print for left-hand side matches; see below for background information.
By contrast, let's consider what mistakenly NOT using -o does:
find . -path ./ignored_directory -prune -name fileName.txt -print
This is equivalent to:
find . -path ./ignored_directory -a -prune -a -name fileName.txt -a -print
This will only print pruned paths (that also match the -name filter), because the -name and -print primaries are (implicitly) connected with logical ANDs;
in this specific case, since ./ignored_directory cannot also match fileName.txt, nothing is printed, but if -path's argument is a glob, it is possible to get output.
A word on find's implicit use of -print:
POSIX mandates that if a find command's expression as a WHOLE does NOT contain either
output-producing primaries, such as -print itself
primaries that execute something, such as -exec and -ok
(the example primaries given are exhaustive for the POSIX spec. of find, but real-world implementations such as GNU find and BSD find add others, such as the output-producing -print0 primary, and the executing -execdir primary)
that -print be applied implicitly, as if the expression had been specified as:
\( expression \) -print
This is convenient, because it allows you to write commands such as find ., without needing to append -print.
However, in certain situations an explicit -print is needed, as is the case here:
Let's say we didn't specify -print at the end of the accepted answer:
find . -path ./ignored_directory -prune -o -name fileName.txt
Since there's now no output-producing or executing primary in the expression, it is evaluated as:
find . \( -path ./ignored_directory -prune -o -name fileName.txt \) -print
This will NOT work as intended, as it will print paths if the entire parenthesized expression evaluates to true, which in this case mistakenly includes the pruned directory.
By contrast, by explicitly appending -print to the -o branch, paths are only printed if the right-hand side of the -o expression evaluates to true; using parentheses to make the logic clearer:
find . -path ./ignored_directory -prune -o \( -name fileName.txt -print \)
If, by contrast, the left-hand side is true, only -prune is executed, which produces no output (and since the overall expression contains a -print, -print is NOT implicitly applied).
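A quick way to see the difference, using a throwaway layout with fileName.txt both inside and outside the pruned directory:
mkdir -p ignored_directory other
touch ignored_directory/fileName.txt other/fileName.txt
# implicit -print applies to the whole expression, so this also prints ./ignored_directory:
find . -path ./ignored_directory -prune -o -name fileName.txt
# explicit -print applies only to the right-hand branch, so this prints only ./other/fileName.txt:
find . -path ./ignored_directory -prune -o -name fileName.txt -print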
Following my previous comment, this works on my Debian:
find . -path ./ignored_directory -prune -o -name fileName.txt -print
or
find /path/to/folder -path "*/ignored_directory" -prune -o -name fileName.txt -print
or
find /path/to/folder -name fileName.txt -not -path "*/ignored_directory/*"
The differences are nicely debated here. Note that the last, -not -path form still descends into the ignored directory and merely filters its output afterwards, so it will not avoid permission-denied errors.
Edit (added behavior specification details)
Pruning all permission denied directories in find
Using GNU find.
Behavior specification details: in this solution we want to:
exclude unreadable directories' contents (prune them),
avoid "permission denied" errors coming from unreadable directories,
keep the other errors and return states, but
process all files (even unreadable files, if we can read their names)
The basic design pattern is:
find ... \( -readable -o -prune \) ...
Example
find /var/log/ \( -readable -o -prune \) -name "*.1"
Thanks, mklement0.
The problem is in the way find evaluates the expression you are passing to the -path option.
Instead, you should try something like:
find /path/to/folders ! -path "*noDuplicates*" -type f -name "fileName.txt"

Suppress error message from bash loop

I've got the following line in my bash script:
for i in $(find ./TTDD* -type f)
do
It works when there's files in the directory, but when it's empty I get the following:
find ... No such file or directory
How can I suppress that exact error message? I'm logging output and don't care about that specific error.
The problem is that globs that don't have any matches expand to themselves, and since there's no file named TTDD*, you get this error.
You can rewrite it in different ways. The most straightforward is:
find . -path './TTDD*' -type f
This will show the same files.
If there are other directories in the current dir, it will waste some time going through their files even if they'll never match. If required, you can short-circuit such directories with a less readable find . -path . -o -not -path './TTDD*' -prune -o -type f -print.
NB: iterating over these files with a for loops will break for files with spaces and various other special characters. You can combine this with anubhava's answer to safely read all filenames while also not suppressing all of find's other potentially useful error messages.
while IFS= read -rd '' f; do
printf "Processing [%s]\n" "$f"
done < <(find . -path './TTDD*' -type f -print0 2>/dev/null)
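Alternatively, if bash is guaranteed, you can sidestep the unmatched-glob problem itself with nullglob (a sketch, using the ./TTDD* pattern from the question):
# with nullglob, an unmatched glob expands to nothing instead of itself
shopt -s nullglob
dirs=(./TTDD*)
if ((${#dirs[@]})); then
  find "${dirs[@]}" -type f
fi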
How about you use "-exec"?
This way you can pass the results of your find command directly as arguments to some other command, e.g., let's say that I want to find all the files within a given folder that are owned by the root user and I want to change the ownership of these files; here is the code:
find ${DataFS}/Data -user root -exec chown someuser:someuser {} \;
The approach you are using is similar to
find ${DataFS}/Data -user root -print0 | xargs -0 chown someuser:someuser
This is not ideal because it will fail when find matches nothing: it prints an empty string, and xargs then runs chown without any file operands, producing chown: missing operand after `someuser:someuser'
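If you do want to keep the pipeline, GNU xargs offers -r (--no-run-if-empty) to skip running the command on empty input; FreeBSD xargs already behaves that way by default and accepts -r as a compatibility no-op:
find ${DataFS}/Data -user root -print0 | xargs -0 -r chown someuser:someuser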

Gnu find: apply -prune to directories which match a pattern in external file

I wonder if there is a more efficient way to obtain directory patterns for use with -prune from an external file:
find . \( -type d -a -exec sh -c "echo \"{}\" | grep -qEx -f patterns.prune" \; \) -prune -o \( <further checks> \)
this works but is of course very slow due to the use of a shell/pipe for every directory encountered. So is there a more elegant way than the above, or do I really have to chain the lines of the pattern file together as command-line switches for find?
Thanks.
You could try to pipe to grep at the end of the run, to only invoke it once, i.e. something like:
find . <your_other_conditions> | grep -v -f patterns.prune
This may not apply to your particular case, since it will now A) find everything under the pruned directories as well (though you can fix that by tweaking patterns.prune) and B) relieve control from find, so that you can't use find's builtins (e.g. -exec) on the results.
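If the patterns can be expressed as shell globs rather than extended regexes (an assumption -- -path matches glob patterns, not -E regexes), another option is to build the -prune clauses from the file at run time; a minimal bash sketch:
# read one glob per line from patterns.prune, e.g. ./.git or */node_modules
args=()
while IFS= read -r pat; do
  args+=( -path "$pat" -prune -o )
done < patterns.prune
find . "${args[@]}" -type f -print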
