Why does my find command seem to execute twice? [duplicate]

This is the contents of the directory I'm working with:
misha@hp-laptop:~/work/c/5$ ls -l
total 8
-rw-rw-r-- 1 misha misha 219 May 20 15:37 demo.c
drwxrwxr-x 2 misha misha 4096 May 20 16:07 folder
-rw-rw-r-- 1 misha misha 0 May 20 16:06 test
Now I would like to remove everything from this directory except for the file demo.c. Here's the command I've come up with:
find . ! \( -name demo.c -o -name . \) -exec rm -Rf {} \;
It does exactly what you'd think it would do (meaning, the file test and the directory folder are gone), but at the same time it also displays the following error message:
find: `./folder': No such file or directory
Why do you think that is?

it also displays this error message:
find: `./folder': No such file or directory
Why is that?
Because find recognizes ./folder as a directory when it first reads directory ., before considering whether it matches the find criteria or performing any action on it. It does not recognize that the action will remove that directory, so after performing the action, it attempts to descend into that directory to scan its contents. By the time it does that, however, the directory no longer exists.
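A minimal reproduction of that sequence, assuming a fresh scratch directory laid out like the listing above:
mkdir -p scratch/folder && touch scratch/demo.c scratch/test && cd scratch
find . ! \( -name demo.c -o -name . \) -exec rm -Rf {} \;
# rm -Rf ./folder succeeds, then find tries to descend into ./folder:
# find: './folder': No such file or directory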
There are multiple ways to address the problem. One not yet mentioned is to use the -prune action. This tells find not to descend into directories that match the tests:
find . ! \( -name demo.c -o -name . \) -exec rm -Rf {} \; -prune
That will serve nicely here, and it also has applications in areas where you are not deleting the directory and you do not want to limit the search depth.
Another way to avoid affecting . is to make use of the fact that find accepts multiple starting paths, that these can designate regular files if you wish, and that during pathname expansion a leading . in a filename must be matched explicitly. If, as in your case, there are no dotfiles in the target directory (other than . and ..), then you can accomplish your objective like this:
find * ! -name demo.c -exec rm -Rf {} \; -prune
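One caveat with the find * form: if dotfiles might ever appear in the directory, the shell's pathname expansion will skip them. Assuming your shell is bash, the dotglob option makes * match them too (it still never expands to . or ..):
shopt -s dotglob
find * ! -name demo.c -exec rm -Rf {} \; -prune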

You can change your find command to this:
find . -mindepth 1 -not -name demo.c -delete
-mindepth 1 ensures that you don't select the starting directory (.) itself
-delete (a GNU find extension) deletes every matched file and directory
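In GNU find, -delete also implies -depth, so each directory's contents are processed before the directory itself and find never tries to descend into something it has already removed. A cautious sketch is to dry-run the expression with -print first, then swap in -delete:
find . -mindepth 1 -not -name demo.c -print    # review what would be removed
find . -mindepth 1 -not -name demo.c -delete   # then actually remove it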

# Before
ls -lrt
total 4
-rw-rw-r-- 1 user super    0 May 20 09:14 demo.c
drwxrwxr-x 2 user super 4096 May 20 09:14 folder/
-rw-rw-r-- 1 user super    0 May 20 09:14 test
# Command
ls -1 | grep -v demo.c | xargs rm -rf
# After
ls -lrt
total 0
-rw-rw-r-- 1 user super 0 May 20 09:14 demo.c
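Note that parsing ls output breaks on filenames containing spaces or newlines. A safer sketch, assuming your shell is bash, is extended globbing, which hands rm the real filenames:
shopt -s extglob
rm -rf !(demo.c)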

Related

List the files that I can read

I would like to list any files that can be read by my current user in bash.
I'm not sure what the best way to check for that would be. I'm thinking of something along the lines of ls -l | grep "<myusername>|<mygroupname>" or find ., but that doesn't deal with the "other" permission bits.
Also, I'm working on a NetBSD box.
Consider the two files below, where one can be read by the user and the other can't:
[fsilveir@fsilveir tmp]$ ls -l ./test_dir/can_read.txt ./test_dir/cant_read.txt
-rw-r--r--. 1 root root 861784 May 29 20:34 ./test_dir/can_read.txt
-rwx------. 1 root root 0 May 29 20:30 ./test_dir/cant_read.txt
You can use find with the -perm option. For example, -perm -g+r matches files whose group-read bit is set, and negating the test with ! finds the ones where it isn't, as shown below. Note that this checks permission bits only; it doesn't take file ownership into account:
[fsilveir@fsilveir tmp]$ find . -name "*.txt" -perm -g+r 2>/dev/null
./test_dir/can_read.txt
[fsilveir@fsilveir tmp]$
Another approach is to use find's -readable test (a GNU extension that checks real access for the current user, though it may not be available in NetBSD's stock find), as shown below:
[fsilveir@fsilveir tmp]$ find . -name "*.txt" -readable
./test_dir/can_read.txt
[fsilveir@fsilveir tmp]$ find . -name "*.txt" ! -readable
./test_dir/cant_read.txt
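On a system whose find lacks -readable (the question mentions NetBSD), a portable if slower sketch is to ask test(1) about each file; test -r checks readability as the current user, taking ownership into account:
find . -name "*.txt" -type f -exec test -r {} \; -print     # readable
find . -name "*.txt" -type f ! -exec test -r {} \; -print   # not readable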

List all files older than x days only in current directory

I am new to unix and couldn't get the appropriate outcome from other questions.
I want to list only the files in the current directory which are older than x days. My restrictions:
- List only files in the current folder which are older than 30 days
- The output shouldn't include directories or subdirectories
- It should list files much as the "ls" command does, i.e. the output should look like: file1 file2 file3 ...
I used find . -mtime +30, but this gives files in sub-directories as well. I would like to restrict the search so it does not recurse into sub-directories.
Thanks a lot in advance!
You can do this:
find ./ -maxdepth 1 -type f -mtime +30 -print
If that gives problems, some BSD finds also accept -depth 1 (a test that the file is exactly one level below the starting point):
find ./ -depth 1 -type f -mtime +30 -print
To add to @Richasantos's answer:
This works perfectly fine
$ find . -maxdepth 1 -type f -mtime +30
Prints:
./file1
./file2
./file3
You can now pipe this to anything you want. Let's say you want to remove all those old files:
$ find . -maxdepth 1 -type f -mtime +30 -print | xargs /bin/rm -f
From man find:
If you are piping the output of find into another program and there is the faintest possibility that the files which you are searching for might contain a newline, then you should seriously consider using the -print0 option instead of -print.
So, using -print0:
$ find . -maxdepth 1 -type f -mtime +30 -print0
Prints (with null characters in between):
./file1./file2./file3
And is used like this to remove those old files:
$ find . -maxdepth 1 -type f -mtime +30 -print0 | xargs -0 /bin/rm -f
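With GNU find you can also skip the pipe entirely and let find do the deleting itself (-delete is a GNU extension, not POSIX):
$ find . -maxdepth 1 -type f -mtime +30 -delete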
You can use find . -maxdepth 1 to exclude subdirectories.
A slightly different spin on this: find is incredibly versatile, and you can combine size and time tests as follows.
This finds you all the logs that are 4 months (120 days) or older and bigger than 1 MiB.
If you remove the + sign, -size matches files of exactly that size (rounded up to the unit) rather than larger ones.
find /var/log -type f -mtime +120 -size +1M
/var/log/anaconda/journal.log
/var/log/ambari-agent/ambari-alerts.log.23
/var/log/ambari-agent/ambari-alerts.log.22
/var/log/ambari-agent/ambari-alerts.log.24
/var/log/ambari-agent/ambari-alerts.log.25
/var/log/ambari-agent/ambari-alerts.log.21
/var/log/ambari-agent/ambari-alerts.log.20
/var/log/ambari-agent/ambari-alerts.log.19
What's even better, you can feed this into an ls:
find /var/log -type f -mtime +120 -size +1M -print0 | xargs -0 ls -lh
-rw-r--r--. 1 root root 9.6M Oct 1 13:24 /var/log/ambari-agent/ambari-alerts.log.19
-rw-r--r--. 1 root root 9.6M Sep 27 07:44 /var/log/ambari-agent/ambari-alerts.log.20
-rw-r--r--. 1 root root 9.6M Sep 22 03:32 /var/log/ambari-agent/ambari-alerts.log.21
-rw-r--r--. 1 root root 9.6M Sep 16 23:23 /var/log/ambari-agent/ambari-alerts.log.22
-rw-r--r--. 1 root root 9.6M Sep 11 19:12 /var/log/ambari-agent/ambari-alerts.log.23
-rw-r--r--. 1 root root 9.6M Sep 6 15:02 /var/log/ambari-agent/ambari-alerts.log.24
-rw-r--r--. 1 root root 9.6M Sep 1 10:51 /var/log/ambari-agent/ambari-alerts.log.25
-rw-------. 1 root root 1.8M Mar 11 2019 /var/log/anaconda/journal.log
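The sign on -size matters more than it looks. A quick sketch of the three forms with GNU find, where sizes are rounded up to the given unit:
find /var/log -type f -size +1M   # strictly larger than 1 MiB
find /var/log -type f -size 1M    # rounds up to exactly 1 MiB (1 byte through 1 MiB)
find /var/log -type f -size -1M   # rounds up to less than 1 MiB, i.e. only empty files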

Delete Directory Older Than n Days Using Find

My requirements are pretty much the same as this question: Shell script to delete directories older than n days
I've got directories that look like this:
Jul 24 05:46 2013_07_24
Jul 31 22:30 2013_08_01
Sep 18 05:43 2013_09_18
Oct 07 08:41 2013_10_07
I want to remove anything older than 90 days. Based on the solution given in the aforementioned thread, I used the following in my script:
find $BASE_DIR -type d -ctime +90 -exec rm -rf {} \;
The script is successfully deleting the directories, but it is also failing with this error:
find: 0652-081 cannot change directory to <actual_path>:
: A file or directory in the path name does not exist.
The only wrinkle here is that $BASE_DIR points to a virtual location, while the actual_path in the error message points to the real location; there are soft links in the environment.
Try
find $BASE_DIR -mindepth 1 -maxdepth 1 -type d -ctime +90 -exec rm -rf {} \;
This will only cover directories directly under $BASE_DIR, but it should avoid the error message: the error appears because find removes a directory with rm -rf and then tries to descend into the now-missing directory, and limiting the depth stops find from descending at all.
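A minor efficiency tweak, assuming a find that supports the POSIX {} + terminator: it batches many directories into a single rm invocation instead of forking rm once per directory:
find $BASE_DIR -mindepth 1 -maxdepth 1 -type d -ctime +90 -exec rm -rf {} +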
find $BASE_DIR -type d -ctime +90 | sort -r | xargs rm -rf
sort -r sorts the directories in reverse order, so deeper paths come first and we won't try to delete a parent directory before its children.
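A sketch of the same child-before-parent ordering without the sort, using find's -depth option, which visits a directory's contents before the directory itself:
find $BASE_DIR -depth -type d -ctime +90 -exec rm -rf {} \;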

How to only find files in a given directory, and ignore subdirectories using bash

I'm running the find command to find certain files, but some files in sub-directories have the same name which I want to ignore.
I'm interested in files/patterns like this:
/dev/abc-scanner, /dev/abc-cash ....
The command:
find /dev/ -name 'abc-*'
What's being returned:
/dev/abc-scanner
/dev/abc-cash
...
...
...
/dev/.udev/names/abc-scanner
/dev/.udev/names/abc-cash
I want to ignore the latter files: /dev/.udev/...
If you just want to limit the find to the first level you can do:
find /dev -maxdepth 1 -name 'abc-*'
... or if you particularly want to exclude the .udev directory, you can do:
find /dev -name '.udev' -prune -o -name 'abc-*' -print
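The same shape generalizes to excluding several directories at once; .static here is just a hypothetical second directory to exclude:
find /dev \( -name '.udev' -o -name '.static' \) -prune -o -name 'abc-*' -print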
Is there any particular reason that you need to use find? You can just use ls to find files that match a pattern in a directory.
ls /dev/abc-*
If you do need to use find, you can use the -maxdepth 1 switch to only apply to the specified directory.
This may do what you want; note that -name only ever matches the last component of a path, so the starting directory has to be recognized with -path instead:
find /dev \( ! -path /dev -prune \) -type f -print
I got here with a somewhat more general problem: I wanted to find files in directories matching a pattern, but not in their subdirectories.
My solution (assuming we're looking for all cpp files living directly in arch directories):
find . -path "*/arch/*/*" -prune -o -path "*/arch/*.cpp" -print
I couldn't use -maxdepth since it would have limited the search for the arch directories themselves, and I didn't know the names of the subdirectories I wanted to exclude.
There is an alternative to find called rawhide (rh) and it's much easier to use. Instead of:
find /dev -maxdepth 1 -name 'abc-*'
You can do:
rh -r /dev '"abc-*"'
The -r is the same as "-m1 -M1" which is the same as find's "-mindepth 1 -maxdepth 1", just a lot shorter.
Rawhide (rh) is available from https://raf.org/rawhide or https://github.com/raforg/rawhide. It works at least on Linux, FreeBSD, OpenBSD, NetBSD, Solaris, macOS, and Cygwin.
Disclaimer: I am the current author of rawhide
find /dev -maxdepth 1 -name 'abc-*'
does not work for me: it returns nothing. If I just use '.', it gives me all the files in the directories below the one I'm working in.
find /dev -maxdepth 1 -name "*.root" -type 'f' -size +100k -ls
also returns nothing. With '.' instead, I get a list of all the 'big' files in my directory as well as the rootfiles/ directory where I store old ones.
Continuing: this works.
find ./ -maxdepth 1 -name "*.root" -type 'f' -size +100k -ls
564751 71 -rw-r--r-- 1 snyder bfactory 115739 May 21 12:39 ./R24eTightPiPi771052-55.root
565197 105 -rw-r--r-- 1 snyder bfactory 150719 May 21 14:27 ./R24eTightPiPi771106-2.root
565023 94 -rw-r--r-- 1 snyder bfactory 134180 May 21 12:59 ./R24eTightPiPi77999-109.root
719678 82 -rw-r--r-- 1 snyder bfactory 121149 May 21 12:42 ./R24eTightPiPi771098-10.root
564029 140 -rw-r--r-- 1 snyder bfactory 170181 May 21 14:14 ./combo77v.root
Apparently /dev stands for the directory of interest. But ./ is needed, not just '.'. The need for the / was not obvious even after I figured out, more or less, what /dev meant.
I couldn't respond as a comment because I have no 'reputation'.

Cron Job - Command to delete all .flv files everyday

I have this command that I run everyday via cron:
find /home/get/public_html/videos -daystart -maxdepth 0 -mtime +1 -type f -name "*.flv" | xargs rm -f
The problem is that it doesn't delete the .flv files in a directory that are 1 or more days old.
How can I correct the above command?
EDIT: Paul - the command "ls -l /home/get/public_html/videos" results in 2000+ files, but here are 2 of them that should be deleted:
-rw-r--r-- 1 get get 3501188 Jan 4 15:24 f486cf0a2b6bb40e4c50c991785084131231104229.flv
-rw-r--r-- 1 get get 10657314 Jan 4 17:51 f5f1490ddaa11a663686f9d06fb37d981231112941.flv
It's better to use -print0 with find and -0 with xargs in case a file has an uncommon name.
Also, you want to use -maxdepth 1 to actually find something inside the specified directory:
-maxdepth 0 means find only examines the paths listed on the command line; it never looks at the contents of those directories.
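A quick way to see the difference, using the paths from the question:
find /home/get/public_html/videos -maxdepth 0                 # prints only the directory itself
find /home/get/public_html/videos -maxdepth 1 -name '*.flv'   # examines the files inside it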
Do you mean that if you have a directory /home/get/public_html/videos/foo, it doesn't delete the files in it? That would be because you have the -maxdepth 0 argument set, which prevents find from descending into directories at all.
-maxdepth 1
