find -mtime files older than 1 hour [duplicate] - bash

This question already has answers here:
How to delete files older than X hours
I have this command that I run every 24 hours currently.
find /var/www/html/audio -daystart -maxdepth 1 -mtime +1 -type f -name "*.mp3" -exec rm -f {} \;
I would like to run it every 1 hour and delete files that are older than 1 hour. Is this correct:
find /var/www/html/audio -daystart -maxdepth 1 -mtime +0.04 -type f -name "*.mp3" -exec rm -f {} \;
I am not sure about my use of the decimal number.
Thanks for any corrections.
EDIT
OR could I just use -mmin 60? Is this correct?
EDIT2
I tried your test; good thing you suggested it. I got an empty result, but I want all files OLDER than 60 minutes to be deleted. How can I do this? Does my command actually do that?

What about -mmin?
find /var/www/html/audio -daystart -maxdepth 1 -mmin +59 -type f -name "*.mp3" \
-exec rm -f {} \;
From man find:
-mmin n
File's data was last modified n minutes ago.
Also, make sure to test this first!
... -exec echo rm -f '{}' \;
^^^^ Add the 'echo' so you just see the commands that would be
run, instead of actually running them.
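To make the `+`/`-` semantics concrete, here is a throwaway experiment you can run (GNU `touch -d` assumed; the directory and file names are just examples):

```shell
# Scratch directory with one back-dated file and one fresh file.
dir=$(mktemp -d)
touch -d '2 hours ago' "$dir/old.mp3"
touch "$dir/new.mp3"

# -mmin +60 -> modified MORE than 60 minutes ago: prints old.mp3 only.
find "$dir" -maxdepth 1 -type f -name '*.mp3' -mmin +60

# -mmin -60 -> modified LESS than 60 minutes ago: prints new.mp3 only.
find "$dir" -maxdepth 1 -type f -name '*.mp3' -mmin -60

rm -rf "$dir"
```

A bare `-mmin 60` would match only files modified exactly 60 minutes ago, which is almost never what you want for a cleanup job.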

Related

Find and tar files which are X hours old

I am trying to tar all the log files in a specific folder which are X hours old. I have done it with X days and I need it for X hours.
find $DEST_DIRECTORY/*.log -type f ! -name "*.tar.gz" -mtime +$hours -exec mv '{}' ${DEST_DIRECTORY}/${TAR_DIR_NAME}/ \;
The above code is not working for hours.
Use:
-mmin n to filter for files that were modified n minutes ago.
So in your example you should replace -mtime +$hours with -mmin +$[$hours * 60].
Full docs are here: http://man7.org/linux/man-pages/man1/find.1.html
You could also try:
find $DEST_DIRECTORY/*.log -type f ! -name "*.tar.gz" -mmin +180
Here 180 minutes corresponds to 3 hours; tweak it to your needs. Note the sign: -mmin -180 would instead match files modified within the last 3 hours, the opposite of what you want here.
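A side note on the arithmetic: `$[ ]` is a deprecated bash form; the POSIX spelling is `$(( ))`. A minimal sketch of the hours-to-minutes conversion (the `hours` value is a placeholder):

```shell
hours=3
minutes=$((hours * 60))   # POSIX arithmetic expansion; 3 hours -> 180 minutes
echo "$minutes"

# The predicate for "older than $hours hours" would then be:
#   -mmin +"$minutes"
```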

Find files created 1 hour ago

I want to list files on AIX, with a long listing, that were created within the last hour. I am trying
find . -cmin -60, but it only shows the file names. I also tried find . -cmin -60 -exec ls -l {} \;, but it displayed all the files in the directory.
Thank you
I think what you want is
find . -cmin +60 -exec ls -al {} \;
It will list all the files in the current directory created more than 60 minutes ago.
The '+' in '+60' means more than 60 minutes ago, while the '-' in '-60' means less than 60 minutes ago.
Some options:
find . -cmin -60 -exec ls -ld {} \;
find . -cmin -60 -type f -exec ls -l {} \;
find . -cmin -60 -print0 | xargs -0 ls -ld
find . -cmin -60 -type f -print0 | xargs -0 ls -l
The last two batch many files into each ls invocation instead of spawning one process per file, but they require GNU findutils.
Edit: As noted by others, -cmin -60 means recently modified files, -cmin +60 means the not-recently modified files.
When your find lacks a -cmin option, you can touch a reference file with a timestamp of one hour ago and use find with -newer (or ! -newer for the opposite sense).
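The reference-file trick can be sketched like this (GNU `touch -d` is used to back-date files for the demo; on AIX you would build the timestamp with `touch -t` instead, and the file names are illustrative):

```shell
dir=$(mktemp -d)
touch -d '2 hours ago' "$dir/old.txt"
touch "$dir/new.txt"

# Reference file stamped exactly one hour ago.
touch -d '1 hour ago' "$dir/.ref"

# -newer   -> files newer than the reference (modified within the last hour).
find "$dir" -type f ! -name '.ref' -newer "$dir/.ref"

# ! -newer -> the opposite: files at least one hour old.
find "$dir" -type f ! -name '.ref' ! -newer "$dir/.ref"

rm -rf "$dir"
```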

Log rotation (zipping/deleting) script in Solaris

I had the same log rotation scripts in a Linux environment and there they work. On Solaris I have problems running those scripts:
The main purpose of the scripts is to delete all logs older than 30 days and zip all logs older than 5 days. -not -name is used because I want to operate only on rotated log files, for example something.log.20181102, because .log files are the current ones and I don't want to touch them.
#!/bin/bash
find ./logs -mindepth 1 -mtime +30 -type f -not -name "*.log" -delete
find ./logs -mtime +5 -not -name "*.log" -exec gzip {} \;
Problems occur with -mindepth and -not because it gives errors:
find: bad option -not
find: [-H | -L] path-list predicate-list
Based on search I have to use -prune somehow in the find, but I am not too sure how to.
If you look at the man page for find(1) on Linux (or gfind(1) on Solaris), you'll see
-not expr
Same as ! expr, but not POSIX compliant.
So you should be able to replace -not with !, though you'll need to escape it from the shell, either with a backslash or with single quotes:
find ... \! -name "*.log" ...
Note that on Solaris, there's a command called logadm which is intended to help you take care of things like this, and may be worth investigating, unless you want to have the exact same behavior on both Solaris and Linux.
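Both escaping styles behave identically; a quick demonstration (the file names are made up):

```shell
dir=$(mktemp -d)
touch "$dir/app.log" "$dir/app.log.20181102"

# Backslash-escaped: matches only the rotated file, not *.log.
find "$dir" -type f \! -name '*.log'

# Single-quoted: same result.
find "$dir" -type f '!' -name '*.log'

rm -rf "$dir"
```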
With the help of @Danek Duvall and some searching I got it working:
find ./logs -mtime +30 -type f ! -name "*.log" -exec rm -f {} \;
find ./logs -mtime +5 ! -name "*.log" -exec gzip {} \;
It deletes all log files that are older than 30 days and then zips the ones that are older than 5 days.

Find files created over a certain number of seconds ago then copy it

So I'm trying to make a script that watches for files that haven't been modified for at least 10 seconds and then runs rsync on them. My original line was this:
find "/Volumes/Media/" -type f -size +2G -cmin +1 -cmin -60 -exec rsync -aq --update {} /Volumes/LocalMedia/ \;
But that does 1 minute, which is too long. So far I've gotten down to this:
find "/Volumes/Media/" -type f -size +2G -exec bash -c 'echo $(( $(date +%s) - $(stat -f%c "{}") ))' \;
which gives me each file's age in seconds. But I'm having trouble evaluating that and performing the aforementioned rsync. This is macOS, so it's BSD find, not GNU find.
Any thoughts/help would be lovely.
Thanks,
-N
You should be able to accomplish this using -mtime:
find "/Volumes/Media/" -type f -size +2G -mtime +10s -exec rsync -aq --update {} /Volumes/LocalMedia/ \;
Using -mtime +10s matches only files modified more than 10 seconds ago; the s unit suffix is a BSD find extension, so it works on macOS.
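For completeness, the questioner's `stat` arithmetic can also be finished into a working age filter. This sketch uses GNU `stat -c %Y` so it runs on Linux; on macOS the equivalent spelling is `stat -f %m`, and the 10-second threshold and file names are just examples:

```shell
dir=$(mktemp -d)
touch -d '30 seconds ago' "$dir/settled.bin"
touch "$dir/fresh.bin"

now=$(date +%s)
for f in "$dir"/*.bin; do
    age=$(( now - $(stat -c %Y "$f") ))   # GNU stat; on macOS: stat -f %m "$f"
    if [ "$age" -ge 10 ]; then
        echo "older than 10s: $f"         # here you would invoke rsync instead
    fi
done

rm -rf "$dir"
```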

Trying to remove a file and its parent directories

I've got a script that finds files within folders older than 30 days:
find /my/path/*/README.txt -mtime +30
that'll then produce a result such as
/my/path/job1/README.txt
/my/path/job2/README.txt
/my/path/job3/README.txt
Now the part I'm stuck at is I'd like to remove the folder + files that are older than 30 days.
find /my/path/*/README.txt -mtime +30 -exec rm -r {} \;
doesn't seem to work. It's only removing the README.txt file,
so ideally I'd like to just remove /job1, /job2, /job3 and any nested files
Can anyone point me in the right direction ?
This would be a safer way:
find /my/path/ -mindepth 2 -maxdepth 2 -type f -name 'README.txt' -mtime +30 -printf '%h\n' | xargs echo rm -r
Remove the echo once the output looks correct.
Here printf '%h\n' prints the directory containing each matching file, and xargs passes those directories on to rm -r.
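If a job directory name could ever contain spaces or newlines, a NUL-delimited variant of the same idea is safer (GNU find assumed; the tree below is a throwaway example):

```shell
base=$(mktemp -d)
mkdir -p "$base/job 1" "$base/job2"
touch -d '40 days ago' "$base/job 1/README.txt"
touch "$base/job2/README.txt"

# -printf '%h\0' emits each matching file's parent directory, NUL-terminated;
# xargs -0 then handles any whitespace in the names. Drop 'echo' to delete.
find "$base" -mindepth 2 -maxdepth 2 -type f -name 'README.txt' -mtime +30 \
    -printf '%h\0' | xargs -0 echo rm -r

rm -rf "$base"
```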
You can just run the following command to recursively remove directories modified more than 30 days ago:
find /my/path/ -type d -mtime +30 -exec rm -rf {} \;
Note that this matches on each directory's own modification time, not on the age of the README.txt inside it.
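One caveat with that command: find will try to descend into directories it has just deleted and print "No such file or directory" errors. Adding -prune stops the descent at each match; a sketch with a throwaway tree:

```shell
base=$(mktemp -d)
mkdir -p "$base/old/sub" "$base/new"
touch -d '40 days ago' "$base/old"   # back-date the directory itself

# -prune keeps find from walking into a directory it just matched,
# so -exec rm -rf never races against the traversal.
find "$base" -mindepth 1 -type d -mtime +30 -prune -exec rm -rf {} \;

ls "$base"   # only 'new' remains
rm -rf "$base"
```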