find outputs result but -exec ignores it - bash

I wrote a simple command to find folders older than a month and delete them.
Here is the command:
find . -type d -mtime +31 -exec bash -c 'rm -rfv "$0"' {} \;
It works fine in most cases, but sometimes -exec "ignores" a result.
After running the command once, if I run the find without -exec, it still finds some folders older than a month that have not been removed.
I then tried with a simple echo and got no output:
$ find . -type d -mtime +31
./folder_A
./Folder_B
$ find . -type d -mtime +31 -exec bash -c 'echo "$0"' {} \;
<No output>
I found a workaround using grep, but I'm wondering why -exec ignores some results.
Does anyone know?
Here is the workaround:
find . -type d -mtime +31 | grep . --color=never | while read line ; do rm -rvf "$line" ; done

Your find commands seem a bit strange to me. I'd suggest changing:
find . -type d -mtime +31 -exec bash -c 'rm -rfv "$0"' {} \;
to:
find . -type d -mtime +31 -exec rm -rfv {} \;
or even better (increases performance):
find . -type d -mtime +31 -exec rm -rfv {} +
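You can see the difference between \; and + for yourself by substituting a harmless echo (a demo, run in any directory with a few subdirectories):
find . -type d -exec echo {} \;   # one echo invocation per directory, one path each
find . -type d -exec echo {} +    # one echo invocation, all paths passed at once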
If you insist on calling Bash explicitly (although I can't think of any reason why you should, as rm is not a Bash built-in), I'd suggest changing:
find . -type d -mtime +31 -exec bash -c 'rm -rfv "$0"' {} \;
to:
find . -type d -mtime +31 -exec bash -c 'rm -rfv {}' \;
Same goes for the find command containing echo.
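As an aside (this is not from the original answer, just a widely used pattern): if you do need an explicit shell, it is more robust to pass each path as a positional parameter than to splice {} into the command string, since a spliced name containing shell metacharacters could be interpreted as code:
find . -type d -mtime +31 -exec bash -c 'rm -rfv "$1"' bash {} \;
Here the second bash fills the $0 slot (used in error messages) and each found path arrives safely quoted as "$1".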

Related

GNU find following symbolic links and name pattern fails to execute remove (rm)

I tried to remove files that might be located in symlinked directories. I noticed that, for certain basenames, find fails with the error "No such file or directory".
Here is what I tried. Am I doing something wrong? Thanks!
touch a_b.c.d
touch a_b.d
find -L . -type f -name '*.c.d' -exec bash -c 'rm $(basename {} .c.d).d' \;
This fails, too
touch a_b.c.d
touch a_b.d
find -L . -type f -name '*.c.d' -exec rm a_b.d \;
But this works
touch a_x_b.c.d
touch a_x_b.d
find -L . -type f -name '*.c.d' -exec bash -c 'rm $(basename {} .c.d).d' \;
as does this:
touch a_x_b.c.d
touch a_x_b.d
find -L . -type f -name '*.c.d' -exec rm a_x_b.d \;
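As an aside (this does not address the failure described above): splicing {} unquoted into the bash -c string also breaks on paths containing spaces or shell metacharacters. A more defensive form of the same command, as a sketch:
find -L . -type f -name '*.c.d' -exec bash -c 'rm -- "$(basename "$1" .c.d).d"' bash {} \;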

find exec and strip extension from filenames

Any idea why this command is not working? By the way, I'm trying to strip the extensions from all CSV files in the current directory.
find -type f -iname "*.csv" -exec mv {} $(basename {} ".csv") \;
I tried many variants, including parameter expansion and xargs, but to no avail.
This should do:
find ./ -type f -iname "*.csv" -exec sh -c 'mv {} $(basename {} .csv)' \;
find is able to substitute {} with each of its findings because the single quotes prevent your interactive shell from evaluating the command substitution up front; find fills in {}, and only then does sh execute the -exec part, including the basename.
The reason yours is not working is that $(basename {} ".csv") is a command substitution ($()), which your interactive shell evaluates before find ever runs. If we look at the command execution step by step, you will see what happens:
find -type f -iname "*.csv" -exec mv {} $(basename {} ".csv") \; - your command
find -type f -iname "*.csv" -exec mv {} {} \; - the command substitution has been evaluated ($(basename {} ".csv") returns {}, since basename treats {} as a literal name with no .csv suffix to strip)
find -type f -iname "*.csv" -exec mv {} {} \; - as you can see, the mv now does nothing: it moves each file onto itself
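One way to watch this premature expansion happen (a demo, not from the original answer) is to prefix the whole command with echo and let your shell show what find would actually receive:
$ echo find -type f -iname "*.csv" -exec mv {} $(basename {} ".csv") \;
find -type f -iname *.csv -exec mv {} {} ;
The $(basename {} ".csv") has already collapsed to {} before find even starts.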
First, take care that you have no subdirectories; find, without extra arguments, will automatically recurse into any directories below.
Simple approach: if you have a small enough number of files, just use the glob (*) operator, and take advantage of rename:
$ rename 's/\.csv$//' *.csv
If you have too many files, use find, and perhaps xargs:
$ find . -maxdepth 1 -type f -name "*.csv" | xargs rename 's/.csv$//'
If you want to be really safe, tell find and xargs to delimit with null bytes, so that weird filenames (e.g., with spaces or newlines) don't mess up the process:
$ find . -maxdepth 1 -type f -name "*.csv" -print0 | xargs -0 rename 's/.csv$//'

Using + instead of \; in find -exec

I understand that using + rather than \; in a find command with -exec can speed things up because with \; the target of -exec is executed once for each result of the find command, whereas with + the target of -exec is executed "as needed."
This code works as expected and processes all subdirectories:
find "${directory}" -iname "*.jpg" -type d -prune -exec bash -c 'myscript "{}"' \;
But this code does NOT work:
find "${directory}" -iname "*.jpg" -type d -prune -exec bash -c 'myscript "$#"' bash {} +
It processes only one directory rather than all of them.
I'm obviously missing something about the proper syntax of using + when calling a function.
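One common explanation: with +, the inline script receives all the found directories at once as "$1", "$2", and so on, so if myscript only looks at its first argument, only one directory gets processed. The usual fix is to loop over the batch inside the script; a sketch, assuming myscript is an executable on the PATH:
find "${directory}" -iname "*.jpg" -type d -prune -exec bash -c '
    for dir in "$@"; do myscript "$dir"; done
' bash {} +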

I am getting an error "arg list too long" in unix

I am using the following command and getting the error "arg list too long". Help needed.
find ./* \
-prune \
-name "*.dat" \
-type f \
-cmin +60 \
-exec basename {} \;
Here is the fix:
find . ! -name . -prune -name "*.dat" -type f -cmin +60 | xargs -I{} basename {}
To only find files in the current directory, use -maxdepth 1.
find . -maxdepth 1 -name '*.dat' -type f -cmin +60 -exec basename {} \;
On all *nix systems there is a maximum length for the argument list that can be passed to a command. This limit applies after the shell has expanded any filename patterns on the command line.
The syntax of find is find location_to_find_from arguments..., so when you run this command the shell expands your ./* into a list of all files in the current directory. This turns your command line into find file1 file2 file3 etc., which is probably not what you want, as find is recursive anyway. I expect that you are running this command in a large directory and blowing your command-length limit.
Try running the command as follows:
find . -name "*.dat" -type f -cmin +60 -exec basename {} \;
This will prevent the filename expansion that is probably causing your issue.
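You can check the actual limit on your system like this (the value is ARG_MAX, the maximum combined size of the argument list and environment for exec; it varies by kernel and configuration):
$ getconf ARG_MAX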
Without find, and only checking the current directory:
now=$(date +%s)
for file in *.dat; do
    if (( $now - $(stat -c %Y "$file") > 3600 )); then
        echo "$file"
    fi
done
This works on my GNU system. You may need to alter the date and stat formats for other operating systems.
If you want to show the .dat filenames in the whole ./ tree, execute it without the -prune option and with just the path:
find ./ -name "*.dat" -type f -cmin +60 -exec basename {} \;
To find all the .dat files older than 60 minutes in the present directory only, do as follows:
find . -iregex "./[^/]+\.dat" -type f -cmin +60 -exec basename {} \;
And if you have a cut-down version of the find tool (for example, on AIX), do as follows:
find . -name "*.dat" -type f -cmin +60 | grep "^\./[^/]*\.dat$" | sed "s/^\.\///"

Performance difference in find command

Is there any performance difference in the below shell commands:
find . -type f -empty -exec rm '{}' \;
find . -type f -empty -exec sh -c "/bin/rm {}" \;
Your 2nd command is going to be slower, since it spawns an extra shell process for each entry found by find.
At the same time, the 2nd command is more flexible if you want to do some variable assignment etc., like this:
find . -type f -empty -exec sh -c "x=1; /bin/rm {}" \;
