How do I recursively delete files in a directory that were modified more than 6 hours ago?
This example works for 1 day:
find /data2/input -type f -mtime +1 -delete -print
Use -mmin instead of -mtime. It lets you specify the number of minutes since the file was last modified. So for files older than 6 hours:
find /data2/input -type f -mmin +360 -delete -print
Check the flags -cmin or -mmin in the manual page.
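If you want to see what would be removed before committing to -delete, you can run the same predicates with just -print first (a dry-run sketch; adjust the path to your own tree):
find /data2/input -type f -mmin +360 -print
Once the list looks right, add -delete back as in the command above.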
Related
I'm able to extract the last 60 days' files from the current date, but I want the last 60 days' files counted from yesterday.
Below is the command I'm using to fetch the last 60 days' files:
find . -name $val\* -mtime -60 -print
I can probably pipe the output to another find, something like this:
find . -name $val\* -mtime -60|find . -name $val\* -mtime 1 -print
But that would only produce the files modified exactly one day ago from the list of the last 60 days' files.
Please help me work out how to achieve this.
You can provide multiple predicates to a single find command to filter the list of files being returned. In this case, combining -mtime +1 with your first command will return all the files that have been modified less than 60 days ago and (logical AND is implicit) more than one day ago:
find . -name $val\* -mtime -60 -mtime +1 -print
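If "counted from yesterday" should follow calendar days rather than 24-hour blocks measured from now, GNU find's -daystart option (it also appears in an answer further down) shifts the time tests to day boundaries. A possible variant, assuming the same $val pattern:
find . -daystart -name $val\* -mtime +0 -mtime -61 -print
Here -mtime +0 excludes anything modified today, and -mtime -61 limits the result to the 60 calendar days before that.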
Is this command useful?
find . -name $val\* \( -mtime -61 -and -not -mtime -1 \) -print
It will list all files from the last 61 days but exclude files that were modified in the last 24 hours.
The following command finds all files older than 60 minutes, but it also searches subdirectories.
find . -type f -mmin +60 -print
How can we restrict it to find files only in the given directory?
I have archive folders in subdirectories which contain older files, and that is causing the problem.
Thanks in advance :)
Use the -maxdepth 1 argument to find to limit results to the current directory.
So your full command would be find . -type f -mmin +60 -maxdepth 1 -print
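Note that GNU find treats -maxdepth as a global option and prints a warning when it appears after tests such as -type; putting it first keeps the output clean:
find . -maxdepth 1 -type f -mmin +60 -print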
find ./your-directory -daystart -maxdepth 1 -mmin +10 -type f -delete
I've got a script that finds files within folders older than 30 days:
find /my/path/*/README.txt -mtime +30
that'll then produce a result such as
/my/path/jobs1/README.txt
/my/path/job2/README.txt
/my/path/job3/README.txt
Now the part I'm stuck on: I'd like to remove the folders and their files that are older than 30 days.
find /my/path/*/README.txt -mtime +30 -exec rm -r {} \;
doesn't seem to work; it only removes the README.txt file,
so ideally I'd like to just remove /job1, /job2, /job3 and any nested files
Can anyone point me in the right direction?
This would be a safer way:
find /my/path/ -mindepth 2 -maxdepth 2 -type f -name 'README.txt' -mtime +30 -printf '%h\n' | xargs echo rm -r
Remove the echo once you've checked the output and it looks correct.
Here, -printf '%h\n' prints the directory of each matching file, and xargs processes the result.
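If any of the job directories might contain spaces or other unusual characters in their names, a null-delimited variant of the same idea is safer (this assumes GNU find and GNU xargs; keep the echo for a dry run):
find /my/path/ -mindepth 2 -maxdepth 2 -type f -name 'README.txt' -mtime +30 -printf '%h\0' | xargs -0 echo rm -r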
You can just run the following command in order to recursively remove directories modified more than 30 days ago.
find /my/path/ -type d -mtime +30 -exec rm -rf {} \;
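A slightly more cautious variant of the same idea limits the search to the top-level job directories and previews what would be removed first (a sketch; drop the echo once the list looks right):
find /my/path/ -mindepth 1 -maxdepth 1 -type d -mtime +30 -exec echo rm -rf {} +
Keep in mind that a directory's mtime changes when entries are added or removed inside it, which is not necessarily the same as the age of its README.txt.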
I'm trying to find all files in a given folder that were modified within a certain time frame, say between 5 and 15 minutes ago.
Currently I can find anything modified up to, say, 15 minutes ago by using find -cmin:
#!/bin/bash
minutes="15"
FILETYPES=`find . *PATTERN*.txt* -maxdepth 0 -type f -cmin -$minutes`
How do I give it a time frame?
Try this :
find . -name '*pattern.txt' -maxdepth 1 -type f \( -mmin -15 -a -mmin +5 \)
Notes
the parentheses are not mandatory here with and (-a), but they are necessary with or (-o); see the example after these notes
always use single quotes around the pattern to prevent shell expansion of the wildcard
to give a pattern, use -name or -iname
for the date/hour, -mmin is the way to go for minutes and -mtime for days.
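For example (a sketch with made-up patterns), with -o the grouping determines what -type f and the time test apply to:
find . -maxdepth 1 -type f \( -name '*.txt' -o -name '*.log' \) -mmin -15
Without the parentheses, find's precedence would parse this as ( -type f -and -name '*.txt' ) -or ( -name '*.log' -and -mmin -15 ), which is usually not what you want.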
Using find, you can add additional conditions to create the range. Each condition is implied as "and" unless -o is used. You also want -mmin instead of -cmin for modified time (but they are often the same).
find . -maxdepth 1 -type f -name '*PATTERN*.txt*' -mmin -15 -mmin +5
I'm sure this is straightforward and answered somewhere, but I didn't manage to find what I was looking for. Basically, I'm trying to run a cron script to clear the contents of a given directory every 7 days. So far I have tried the following:
find /myDir -mtime 7 -exec rm -rf {} \;
This however also deletes the parent directory myDir, which I do not want. I also tried,
find /myDir -type f -type d -mtime 7 -delete
which appeared to do nothing. I also tried,
find /myDir -type d -delete
which deleted all but the parent directory just as I need. However, a warning message came up reading,
relative path potentially not safe
I'd appreciate it if anyone could rectify my script so that it safely deletes all subdirectories in the folder.
Many thanks. =)
UPDATE: I decided to go for the following,
find /myDir -mindepth 1 -mtime 7 -delete
Based upon what I learned from all who replied. Again, many thanks to you all.
Try:
find /myDir -mindepth 1 -mtime 7 -exec rm -rf {} \;
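One thing to watch for: because find will try to descend into directories that rm has just removed, this form can print "No such file or directory" messages. Limiting the depth avoids that, for example (a sketch along the same lines):
find /myDir -mindepth 1 -maxdepth 1 -mtime 7 -exec rm -rf {} +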
What about
cd myDir/ ; find . -type d -delete
assuming that you run this from myDir's parent directory.
If you can't guarantee myDir exists, then this is safer:
cd myDir/ && find . -type d -delete
find /myDir -mindepth 1 -mtime 7 -delete
should probably be
find /myDir -mindepth 1 -mtime +7 -delete
(or maybe -mtime +6). The + means strictly more than the given number of days rather than exactly that many: -mtime +7 matches files more than 7 full days old (effectively 8 days and older once find truncates to whole days), while -mtime +6 matches files 7 days old or older.
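As a quick way to see the rounding (a sketch using GNU touch; the file names are made up):
touch -d '7 days ago' exactly-seven-days   # roughly 168 hours old
touch -d '8 days ago' just-over-seven      # roughly 192 hours old
find . -maxdepth 1 -type f -mtime +7       # lists only just-over-seven
find . -maxdepth 1 -type f -mtime +6       # lists both files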