Delete Directory Older Than n Days Using Find - shell

My requirements are pretty much the same as this question: Shell script to delete directories older than n days
I've got directories that look like this:
Jul 24 05:46 2013_07_24
Jul 31 22:30 2013_08_01
Sep 18 05:43 2013_09_18
Oct 07 08:41 2013_10_07
I want to remove anything older than 90 days. Based on the solution given in the aforementioned thread, I used the following in my script:
find $BASE_DIR -type d -ctime +90 -exec rm -rf {} \;
The script is successfully deleting the directories, but it is also failing with this error:
find: 0652-081 cannot change directory to <actual_path>:
: A file or directory in the path name does not exist.
The only complication is that $BASE_DIR points to a virtual location, while actual_path in the error message points to the real location; there are soft links in the environment.

Try
find $BASE_DIR -mindepth 1 -maxdepth 1 -type d -ctime +90 -exec rm -rf {} \;
This will only cover directories directly under $BASE_DIR, but it should avoid generating that error message.
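To preview what would be removed before running the destructive version, you can swap -exec for -print (a harmless dry run):
find $BASE_DIR -mindepth 1 -maxdepth 1 -type d -ctime +90 -print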

find $BASE_DIR -type d -ctime +90 | sort -r | xargs rm -rf
sort -r sorts the directories in reverse order, so we won't try to delete an outer directory before its inner ones.
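If directory names may contain spaces or newlines, a null-delimited variant is safer; this sketch assumes GNU find, sort and xargs, which support -print0, -z and -0:
find $BASE_DIR -type d -ctime +90 -print0 | sort -rz | xargs -0 rm -rf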

Related

Delete folders older than 1 day not working with "find" cmd

I'm trying to delete backup folders older than 1 day (creation date) with the find command, but it's not working.
ls -l of the folder:
drwxrws---+ 2 root data 42 May 15 16:46 15-05-2019
drwxrws---+ 2 root data 89 May 16 14:19 16-05-2019
The creation date is May 15.
This cmd should work:
find /data/backup/VMs/centos/ -type d -mtime +1 -exec rm {} \;
I tried with this first to see what happens before the remove:
find /data/backup/VMs/centos/ -type d -mtime +1 -exec ls {} \; >> find_test.txt
It should write the folders to delete to the file, but the txt file is empty.
Besides using find, how can I remove these folders using the date in the name?
rm normally doesn't print anything on standard output; if an error occurs, it prints to standard error, which can be redirected to another file, or to the same one by duplicating the file descriptor with 2>&1:
find /data/backup/VMs/centos/ -type d -mtime +1 -exec ls {} \; >> find_test.txt 2>&1
To print each name, the -print action can be used. find also has the actions -delete and -ls (which is not exactly the same as ls), which avoid executing a command on each file:
find /data/backup/VMs/centos/ -type d -mtime +1 -print -delete >> find_test.txt 2>&1
Be careful before using -delete, to avoid losing files you wanted to keep.
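The question also asks how to remove the folders using the date in the name instead. Here is a minimal sketch, assuming the DD-MM-YYYY naming from the listing above and GNU date for the -d option (the path is the one from the question):
#!/bin/sh
# Sketch: delete folders named DD-MM-YYYY once the date in the name is
# more than one day old; assumes GNU date.
cutoff=$(date -d '1 day ago' +%s)
for dir in /data/backup/VMs/centos/*-*-*; do
  [ -d "$dir" ] || continue
  name=$(basename "$dir")
  day=${name%%-*}; rest=${name#*-}
  month=${rest%%-*}; year=${rest#*-}
  ts=$(date -d "$year-$month-$day" +%s 2>/dev/null) || continue
  # echo makes this a dry run; remove it once the output looks right
  [ "$ts" -lt "$cutoff" ] && echo rm -rf "$dir"
done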

Why does my find command seem to execute twice? [duplicate]

This is the contents of the directory I'm working with:
misha#hp-laptop:~/work/c/5$ ls -l
total 8
-rw-rw-r-- 1 misha misha 219 May 20 15:37 demo.c
drwxrwxr-x 2 misha misha 4096 May 20 16:07 folder
-rw-rw-r-- 1 misha misha 0 May 20 16:06 test
Now I would like to remove everything from this directory except for the file demo.c. Here's the command I've come up with:
find . ! \( -name demo.c -o -name . \) -exec rm -Rf {} \;
It does exactly what you'd think it would do (meaning, the file test and the directory folder are gone), but at the same time it also displays the following error message:
find: `./folder': No such file or directory
Why do you think that is?
it also displays this error message:
find: `./folder': No such file or directory
Why is that?
Because find recognizes ./folder as a directory when it first reads directory ., before considering whether it matches the find criteria or performing any action on it. It does not recognize that the action will remove that directory, so after performing the action, it attempts to descend into that directory to scan its contents. By the time it does that, however, the directory no longer exists.
There are multiple ways to address the problem. One not yet mentioned is to use the -prune action. This tells find not to descend into directories that match the tests:
find . ! \( -name demo.c -o -name . \) -exec rm -Rf {} \; -prune
That will serve nicely here, and it also has applications in areas where you are not deleting the directory and you do not want to limit the search depth.
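For example, a common non-deleting use of -prune is skipping an entire directory while searching; node_modules is just a hypothetical name here:
find . -name node_modules -prune -o -type f -name '*.js' -print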
Another way to avoid affecting . is to make use of the fact that find accepts multiple base paths to test, that these can designate regular files if you wish, and that during pathname expansion any leading . in a filename must be matched explicitly. If, as in your case, there are no dotfiles in the target directory (other than . and ..), then you can accomplish your objective like this:
find * ! -name demo.c -exec rm -Rf {} \; -prune
You can change your find command to this:
find . -mindepth 1 -not -name demo.c -delete
-mindepth 1 ensures that you don't select the starting directory . itself
-delete will delete all matching files and directories
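As with any destructive find, you can replace -delete with -print first to preview what will go:
find . -mindepth 1 -not -name demo.c -print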
#before
ls -lrt
total 4
-rw-rw-r-- 1 user super 0 May 20 09:14 demo.c
drwxrwxr-x 2 user super 4096 May 20 09:14 folder/
-rw-rw-r-- 1 user super 0 May 20 09:14 test
#Command
ls -1 | grep -v demo.c | xargs rm -rf
#After
ls -lrt
total 0
-rw-rw-r-- 1 user super 0 May 20 09:14 demo.c

Deleting files from an AIX system

We have an AIX system which receives files on a daily basis, so we delete the previous day's files manually. Is it possible to write a script which will find the files from 15 or 20 days before today and delete them from the folder?
Or you can use the native AIX find command:
find /dir/to/files -type f -mtime +15 -exec rm {} \;
where:
-type f - Find only files, not directories
-mtime +15 - Find files whose modification time is more than 15 days ago
-exec rm {} \; - Run the command rm on each matched file
You can first run this command with -exec ls -l {} \; to check that the files found correspond to your criteria.
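Wrapped in a small script suitable for cron, it might look like this (a sketch; the directory and the 15-day threshold are placeholders from the command above):
#!/bin/sh
# Sketch: remove files older than $DAYS days from $DIR
DIR=/dir/to/files
DAYS=15
find "$DIR" -type f -mtime +"$DAYS" -exec rm {} \;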
If you can/may install GNU find, then it's simple, e.g.:
#!/bin/sh
cd /var/log/apache
gfind . -name '*log*Z' -mtime +30 -delete
this script is run by cron; a line from crontab:
02 23 1 * * /root/cmd/httpd.logdelete >/dev/null 2>&1
Edit:
-mtime +<n> means files whose last modification date is earlier than now minus <n> days (here +30, i.e. older than 30 days)
-delete means deleting the files that match the criteria

Shell script to delete directories older than n days

I have directories named as:
2012-12-12
2012-10-12
2012-08-08
How would I delete the directories that are older than 10 days with a bash shell script?
This will do it recursively for you:
find /path/to/base/dir/* -type d -ctime +10 -exec rm -rf {} \;
Explanation:
find: the unix command for finding files / directories / links etc.
/path/to/base/dir: the directory to start your search in.
-type d: only find directories
-ctime +10: only consider the ones whose status was last changed more than 10 days ago (ctime is the inode change time; use -mtime if you want the content modification time)
-exec ... \;: for each such result found, run the command given in ...
rm -rf {}: recursively force-remove the directory; the {} part is where each find result from the previous part gets substituted.
Alternatively, use:
find /path/to/base/dir/* -type d -ctime +10 | xargs rm -rf
Which is a bit more efficient, because it amounts to:
rm -rf dir1 dir2 dir3 ...
as opposed to:
rm -rf dir1; rm -rf dir2; rm -rf dir3; ...
as in the -exec method.
With modern versions of find, you can replace the ; with + and it will do the equivalent of the xargs call for you, passing as many files as will fit on each exec system call:
find . -type d -ctime +10 -exec rm -rf {} +
If you want to delete all subdirectories under /path/to/base, for example
/path/to/base/dir1
/path/to/base/dir2
/path/to/base/dir3
but you don't want to delete the root /path/to/base, you have to add -mindepth 1 and -maxdepth 1 options, which will access only the subdirectories under /path/to/base
-mindepth 1 excludes the root /path/to/base from the matches.
-maxdepth 1 will ONLY match subdirectories immediately under /path/to/base such as /path/to/base/dir1, /path/to/base/dir2 and /path/to/base/dir3 but it will not list subdirectories of these in a recursive manner. So these example subdirectories will not be listed:
/path/to/base/dir1/dir1
/path/to/base/dir2/dir1
/path/to/base/dir3/dir1
and so forth.
So, to delete all the subdirectories under /path/to/base which are older than 10 days:
find /path/to/base -mindepth 1 -maxdepth 1 -type d -ctime +10 | xargs rm -rf
find supports the -delete action, so:
find /base/dir/* -ctime +10 -delete
I think there's a catch that the files need to be 10+ days old too. Haven't tried; someone may confirm in the comments.
The most voted solution here is missing -maxdepth 0, so find will still try to descend into each directory after rm -rf has already deleted it, producing errors. That doesn't make sense, so I suggest:
find /base/dir/* -maxdepth 0 -type d -ctime +10 -exec rm -rf {} \;
The -delete solution above doesn't use -maxdepth 0 because find would complain the dir is not empty. Instead, it implies -depth and deletes from the bottom up.
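You can see the bottom-up order that -depth (and therefore -delete) implies by printing a small tree; /tmp/demo here is just a hypothetical example:
find /tmp/demo -depth -print
# would print /tmp/demo/sub/file before /tmp/demo/sub, and /tmp/demo last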
I was struggling to get this right using the scripts provided above and some other scripts, especially when file and folder names had newlines or spaces.
Finally I stumbled on tmpreaper, and it has worked pretty well for us so far.
tmpreaper -t 5d ~/Downloads
tmpreaper --protect '*.c' -t 5h ~/my_prg
It has a test mode, which checks the directories recursively and lists them.
It can delete symlinks, files or directories, and also has a protection mode to exclude a certain pattern while deleting.
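For example, to preview what would be removed without deleting anything, its man page documents a --test flag:
tmpreaper --test -t 10d /path/to/base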
OR
rm -rf `find /path/to/base/dir/* -type d -mtime +10`
Updated, faster version of it:
find /path/to/base/dir/* -type d -mtime +10 -print0 | xargs -0 rm -rf

Cron Job - Command to delete all .flv files everyday

I have this command that I run everyday via cron:
find /home/get/public_html/videos -daystart -maxdepth 0 -mtime +1 -type f -name "*.flv" | xargs rm -f
The problem is that it doesn't delete the .flv files in a directory that are 1 or more days old.
How can I correct the above command?
EDIT: Paul - the command "ls -l /home/get/public_html/videos" lists 2000+ files, but here are 2 of them that should be deleted:
-rw-r--r-- 1 get get 3501188 Jan 4 15:24 f486cf0a2b6bb40e4c50c991785084131231104229.flv
-rw-r--r-- 1 get get 10657314 Jan 4 17:51 f5f1490ddaa11a663686f9d06fb37d981231112941.flv
It's better to use -print0 on find and -0 in xargs in case one file has an uncommon name.
Also, you want to use -maxdepth 1 to actually find something in the specified directory.
-maxdepth 0 means it'll only find in the directories listed in the command line, it won't check the contents of those directories.
Do you mean, if you have a directory /home/get/public_html/videos/foo it doesn't delete the files in them? That would be because you have the -maxdepth 0 argument set, which prevents find from descending into subdirectories.
Use -maxdepth 1 instead.
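Putting the two corrections together, the cron command might look like this (a sketch combining -maxdepth 1 with the null-safe xargs suggestion above):
find /home/get/public_html/videos -daystart -maxdepth 1 -mtime +1 -type f -name '*.flv' -print0 | xargs -0 rm -f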
