find - suppress "No such file or directory" errors - bash

I can use -s in grep to suppress errors, but I don't see an equivalent for the find command in its man page... Is the only option to redirect stderr with 2>/dev/null?
Or is there an option that handles this? (open to fancy awk and perl solutions if needed)
Example:
$ for dir in `ls /mnt/16_c/`; do find /mnt/16_c/$dir/data/ -mtime +180 -type f -exec echo {} \;; done
find: `/mnt/16_c/test_container/dat/': No such file or directory

You can redirect stderr with 2>/dev/null, for example:
find /mnt/16_c/$dir/data/ -mtime +180 -type f -exec echo {} \; 2>/dev/null
Btw, the code in your question can be replaced with:
find /mnt/16_c/*/data/ -mtime +180 -type f 2>/dev/null
And if there is at least one matching directory,
then you don't even need to suppress stderr,
because find will only search in directories that match this pattern.
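A quick way to see that glob behavior, using throwaway directories rather than the /mnt/16_c paths from the question (the -mtime filter is dropped here since freshly created files would never match it):

```shell
# The glob only expands to directories that actually exist,
# so find never sees a missing path and prints no error.
tmp=$(mktemp -d)
mkdir -p "$tmp/a/data" "$tmp/b/data"
touch "$tmp/a/data/stale.txt"
# Expands to the two existing data/ dirs only:
find "$tmp"/*/data/ -type f
rm -rf "$tmp"
```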

I know this isn't exactly what you asked, but my typical approach is to start find from a path that definitely exists, then use the -path flag to filter.
So instead of find /home/not-a-path, which raises an error, I would do find /home -path "/home/not-a-path/*", which doesn't raise an error.
I had to do this because when I redirected a failing find to /dev/null in a makefile, the non-zero exit status would still cause the command to fail. The approach described above avoids that.
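A minimal sketch of the difference, using a temp directory in place of /home (the not-a-path name is made up, as in the answer):

```shell
tmp=$(mktemp -d)                         # an existing root to search from
# Searching the missing path directly fails with a non-zero exit status:
find "$tmp/not-a-path" 2>/dev/null || echo "find exited non-zero"
# Anchoring at the existing root and filtering with -path does not:
find "$tmp" -path "$tmp/not-a-path/*"    # no output, exit status 0
rm -rf "$tmp"
```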

Related

bash, delete all files with a pattern name

I need to delete all files with a pattern name: 2020*.js
Inside a specific directory: server/db/migrations/
And then show what has been deleted: `| xargs`
I'm trying this:
find . -name 'server/db/migrations/2020*.js' #-delete | xargs
But nothing is deleted, and shows nothing.
What am I doing wrong?
The immediate problem is that -name only looks at the last component of the file name (so 2020xxx.js) and cannot match anything with a slash in it. You can use the -path predicate but the correct solution is to simply delete these files directly:
rm -v server/db/migrations/2020*.js
The find command is useful when you need to traverse subdirectories.
Also, piping the output from find to xargs does not do anything useful here. If find prints the names by itself, xargs adds no value; and if it doesn't print anything, xargs has nothing to work with.
If indeed you want to traverse subdirectories, try
find server/db/migrations/ -type f -name '2020*.js' -print -delete
If your shell supports ** you could equally use
rm -v server/db/migrations/**/2020*.js
which however has a robustness problem if there can be very many matching files (you get an "Argument list too long" error). In that scenario, probably fall back to find after all.
You're looking for something like this:
find server/db/migrations -type f -name '2020*.js' -delete -print
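A throwaway-directory sketch of that -print/-delete pairing (the migrations layout here is made up; -print must come before -delete so each name is echoed as it is removed):

```shell
tmp=$(mktemp -d)
mkdir -p "$tmp/migrations/sub"
touch "$tmp/migrations/2020-a.js" "$tmp/migrations/sub/2020-b.js" \
      "$tmp/migrations/keep.js"
# Prints each matching file, then deletes it; keep.js is untouched:
find "$tmp/migrations" -type f -name '2020*.js' -print -delete
ls "$tmp/migrations"
rm -rf "$tmp"
```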
You can try this:
find server/db/migrations -name '2020*.js' | xargs rm
(Note that -name matches only the last path component, so the directory belongs in find's starting-point argument, not in the pattern.)

Find file and rename it BASH

Any idea why this is not working? I checked many links and I can't figure it out why I think the syntax is correct.
I want to find the file maplist.txt.old and then rename it to maplist.txt in the same folder. I got no errors.
find ~ -type f -name csgo/maplist.txt.old -execdir mv csgo/maplist.txt.old maplist.txt \;
Lots of ways to handle this. Since you are looking in ~/csgo you can go directly to the directory in the find. The -execdir option will run the command in the directory. So without changing your example much:
find ~/csgo -type f -name maplist.txt.old -execdir mv maplist.txt.old maplist.txt \;
To automate this a bit further, you may want to handle it with a bash for loop, for example:
for file in $( find ~/csgo -type f -name maplist.txt.old ) ; do
mv "$file" "${file%.old}"
done
(The ${file%.old} expansion strips the trailing .old without spawning sed; note the loop still assumes filenames without whitespace.)
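If filenames may contain spaces, a NUL-delimited loop is safer; here is a sketch against a scratch directory standing in for ~/csgo (assumes bash and a find that supports -print0):

```shell
tmp=$(mktemp -d)            # scratch dir standing in for ~/csgo
mkdir -p "$tmp/csgo"
touch "$tmp/csgo/maplist.txt.old"
# NUL-delimited names survive spaces and other odd characters:
find "$tmp" -type f -name maplist.txt.old -print0 |
while IFS= read -rd '' f; do
  mv "$f" "${f%.old}"       # ${f%.old} strips the trailing .old
done
ls "$tmp/csgo"
rm -rf "$tmp"
```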

Suppress error message from bash loop

I've got the following line in my bash script:
for i in $(find ./TTDD* -type f)
do
It works when there's files in the directory, but when it's empty I get the following:
find ... No such file or directory
How can I suppress that exact error message? I'm logging the output and don't care about that specific error.
The problem is that globs that don't have any matches expand to themselves, and since there's no file named TTDD*, you get this error.
You can rewrite it in different ways. The most straight forward is:
find . -path './TTDD*' -type f
This will show the same files.
If there are other directories in the current dir, it will waste some time going through their files even if they'll never match. If required, you can short-circuit such directories with a less readable find . -path . -o -not -path './TTDD*' -prune -o -type f -print.
NB: iterating over these files with a for loops will break for files with spaces and various other special characters. You can combine this with anubhava's answer to safely read all filenames while also not suppressing all of find's other potentially useful error messages.
while IFS= read -rd '' f; do
printf "Processing [%s]\n" "$f"
done < <(find . -path './TTDD*' -type f -print0 2>/dev/null)
How about using -exec?
That way you can hand each result of your find command directly to another command. For example, say I want to find all the files within a given folder that are owned by the root user and change their ownership; here is the code:
find ${DataFS}/Data -user root -exec chown someuser:someuser {} \;
The approach you are using is similar to
find ${DataFS}/Data -user root -print0 | xargs -0 chown someuser:someuser
Which is not ideal because it will fail when find matches nothing (xargs then invokes chown with no file arguments): chown: missing operand after `someuser:someuser'
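With GNU xargs you can also pass -r (--no-run-if-empty) so the command is skipped entirely on empty input. A sketch with an empty scratch directory and echo standing in for chown (the would-chown label is just illustrative):

```shell
tmp=$(mktemp -d)    # empty dir: find matches nothing
# With -r, xargs runs nothing and exits 0 instead of invoking
# the command once with no file arguments:
find "$tmp" -type f -print0 | xargs -0 -r echo would-chown
rm -rf "$tmp"
```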

Error = find: -exec: no terminating ";" or "+"

I am looking for some help trying to get a command working. I want to find some files only and move them, but when I enter this command:
find /Volumes/NEXSAN/Engine\ Folders/Input/DTO_Proxy/* -type f -mtime +7 -exec mv -v {} /Volumes/NEXSAN/.2BeDeleted4realz/
I get this error
find: -exec: no terminating ";" or "+"
I know I probably have it wrong, but I can't figure out what's missing?
Just terminate the find command with \;, making sure to include the space before the \;.
find /Volumes/NEXSAN/Engine\ Folders/Input/DTO_Proxy/* -type f -mtime +7 -exec mv -v {} /Volumes/NEXSAN/.2BeDeleted4realz/ \;
If you want to correct the find command that you had, it should look like this:
find . -name '*.xml' -exec SetFile -t TEXT {} \;
The *.xml needs to be quoted so it's passed as a parameter to find instead of expanded by the shell. The ; also needs to be escaped so it's passed as part of the parameter to find and not interpreted by the shell.
Keep in mind this will only work for files within the current directory (and subdirectories) and for any new files created, you would need to run the command again.

How to return the absolute path of recursively matched arguments? (BASH)

OK, so simple enough.. I want to recursively search a directory for files with a specific extension - and then perform an action on those files.
# pwd
/dir
# ls -R | grep .txt | xargs -I {} open {}
The file /dir/reallyinsubfolder.txt does not exist. ⬅ fails (bad)
/dir/fileinthisfolder.txt ⬅ opens silently, no output (good)
This does find ALL the files I am interested in… but only opens those which happen to be one level deep. In this case, the attempt to open /dir/reallyinsubfolder.txt fails, because reallyinsubfolder.txt is actually /dir/sub/reallyinsubfolder.txt.
I understand that grep is simply returning the matched filename… which then chokes (in this case), the open command, as it fails to reach down to the correct sub-directory to execute the file..
How do I get grep to return the full path of a match?
How about using the find command -
find /path/to/dir -type f -iname "*.txt" -exec action to perform {} \;
find . -name '*.txt' -exec open {} \;
(The pattern needs quoting and the semicolon needs a backslash so the shell passes them through to find.)
I believe you're asking the wrong question; parsing ls(1) output in this fashion is far more trouble than it is worth.
What would work far more reliably:
find /dir -name '*.txt' -print0 | xargs -0 open
or
find /dir -name '*.txt' -exec open {} \;
find(1) does not mangle names nearly as much as ls(1) and makes executing programs on matched files far more reliable.
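A quick check that the -print0/-0 pairing preserves names with spaces, using a scratch directory and printf in place of open (open is macOS-specific, so this sketch just prints each path):

```shell
tmp=$(mktemp -d)
mkdir -p "$tmp/sub"
touch "$tmp/top file.txt" "$tmp/sub/nested file.txt"
# Each full path arrives intact, one per invocation, spaces and all:
find "$tmp" -name '*.txt' -print0 | xargs -0 -n1 printf '%s\n'
rm -rf "$tmp"
```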