What does -mtime mean in a bash script?

I've got a working script:
#!/bin/bash
find ~/.backups/ -type f -name '*.tgz' -mtime +0.5 -exec rm {} \;
Nothing wrong with it. Just wondering what -mtime is and how it's calculated. Can't seem to get a hit on Google.

From man find:
-mtime n
File's data was last modified n*24 hours ago. See the
comments for -atime to understand how rounding affects the
interpretation of file modification times.
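In other words, the age is measured in whole 24-hour units (the fractional part of the age is discarded before the comparison, which is the rounding the man page refers to), and a leading + or - changes the comparison. A quick sketch using the same path and pattern as the script above:
find ~/.backups/ -type f -name '*.tgz' -mtime +1   # strictly more than 1 full day old (48 hours or older)
find ~/.backups/ -type f -name '*.tgz' -mtime 1    # exactly 1 full day old (between 24 and 48 hours)
find ~/.backups/ -type f -name '*.tgz' -mtime -1   # less than 1 full day old (modified within the last 24 hours)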

Related

Delete files that are 5 days old using bash script

I currently use a command that searches a directory and deletes 5 day old files.
find /path/to/files* -mtime +5 -exec rm {} \;
I run it from the command line and it works fine. But when I put it in a .sh file it says find /path/to/files*: No such file or directory.
There are only two lines in the shell script:
#! /usr/bin/env bash
find /path/to/files* -mtime +5 -exec rm {} \;
How can I rewrite the script so that it works?
The error happens when there are currently no files matching the wildcard: the shell then passes the unexpanded pattern /path/to/files* to find, which complains that no such path exists. Presumably none have been created since you deleted them previously.
The argument to find should be the directory containing the files, not the filenames themselves, since find will automatically search the directory. If you want to restrict the filenames, use the -name option to specify the wildcard.
And if you don't want to go into subdirectories, use the -maxdepth option.
find /path/to -maxdepth 1 -type f -name 'files*' -mtime +5 -delete
This works:
#! /usr/bin/env bash
find /home/ubuntu/directory -type f -name 'filename*' -mtime +4 -delete
Here is an example:
find /home/ubuntu/processed -type f -name 'output*' -mtime +4 -delete
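If you want to see what would be removed before committing to -delete, one option (a sketch using the same example path as above) is to run the identical expression with -print first:
find /home/ubuntu/processed -type f -name 'output*' -mtime +4 -print    # dry run: just list the matches
find /home/ubuntu/processed -type f -name 'output*' -mtime +4 -delete   # then delete them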

Log rotation (zipping/deleting) script in Solaris

I had the same log rotation scripts in a Linux environment and they worked there. On Solaris I have problems running them:
The main purpose of the scripts is to delete all logs older than 30 days and zip all logs older than 5 days. -not -name is used because I want to operate only on rotated log files, for example something.log.20181102; the .log files are the current ones and I don't want to touch them.
#!/bin/bash
find ./logs -mindepth 1 -mtime +30 -type f -not -name "*.log" -delete
find ./logs -mtime +5 -not -name "*.log" -exec gzip {} \;
Problems occur with -mindepth and -not because it gives errors:
find: bad option -not
find: [-H | -L] path-list predicate-list
Based on my searching it seems I have to use -prune somehow in the find, but I am not sure how.
If you look at the man page for find(1) on Linux (or gfind(1) on Solaris), you'll see
-not expr
Same as ! expr, but not POSIX compliant.
So you should be able to replace -not with !, though you'll need to escape it from the shell, either with a backslash or with single quotes:
find ... \! -name "*.log" ...
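or the same thing with single quotes instead of the backslash:
find ... '!' -name "*.log" ...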
Note that on Solaris, there's a command called logadm which is intended to help you take care of things like this, and may be worth investigating, unless you want to have the exact same behavior on both Solaris and Linux.
With the help of @Danek Duvall's answer and some searching I got it working:
find ./logs -mtime +30 -type f ! -name "*.log" -exec rm -f {} \;
find ./logs -mtime +5 ! -name "*.log" -exec gzip {} \;
It deletes all log files that are older than 30 days and then zips the ones that are older than 5 days.

find files older than X days in bash and delete

I have a directory with a few TB of files. I'd like to delete every file in it that is older than 14 days.
I thought I would use find . -mtime +13 -delete. To make sure the command works as expected, I ran find . -mtime +13 -exec /bin/ls -lh '{}' \; | grep '<today>'. The latter should return nothing, since files that were created/modified today should not be matched by -mtime +13. To my surprise, however, find spewed out a list of all the files modified/created today!
find your/folder -type f -mtime +13 -exec rm {} \;
This works for me.
$ find ./folder_name/* -type f -mtime +13 -print | xargs rm -rf
The simplest solution is the one in @navid's and @gniourf_gniourf's comments. Because it's buried in the comments, I'd like to bring it up here to make it more visible.
find your/folder -type f -mtime +13 -delete
This avoids any possible issues with spaces and whatnot in the filenames and it doesn't spin up another executable to do the deleting so it should be faster too.
I tried and tested this.
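If you do need to hand the file list to another program instead of using -delete, a NUL-delimited pipeline is the usual space-safe alternative; here is a sketch with the same placeholder path as above:
find your/folder -type f -mtime +13 -print0 | xargs -0 rm -f    # -print0 and -0 keep names with spaces intact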

Trying to remove a file and its parent directories

I've got a script that finds files within folders older than 30 days:
find /my/path/*/README.txt -mtime +30
that'll then produce a result such as
/my/path/jobs1/README.txt
/my/path/job2/README.txt
/my/path/job3/README.txt
Now the part I'm stuck at is I'd like to remove the folder + files that are older than 30 days.
find /my/path/*/README.txt -mtime +30 -exec rm -r {} \;
doesn't seem to work; it only removes the README.txt file.
Ideally I'd like to remove /job1, /job2, /job3 and any nested files.
Can anyone point me in the right direction ?
This would be a safer way:
find /my/path/ -mindepth 2 -maxdepth 2 -type f -name 'README.txt' -mtime +30 -printf '%h\n' | xargs echo rm -r
Remove the echo once the output looks correct.
This uses -printf '%h\n' to print the directory containing each matching file, and xargs then processes those directories.
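If the job directories themselves might contain spaces, the same idea works NUL-delimited (assuming GNU find and xargs); the echo dry run is kept in this sketch as well:
find /my/path/ -mindepth 2 -maxdepth 2 -type f -name 'README.txt' -mtime +30 -printf '%h\0' | xargs -0 echo rm -r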
You can just run the following command in order to recursively remove directories modified more than 30 days ago.
find /my/path/ -type d -mtime +30 -exec rm -rf {} \;
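One caveat not mentioned above: since rm -rf removes directories that find still intends to descend into, GNU find may print "No such file or directory" warnings as it walks the tree. Adding -prune, as in this sketch, stops find from descending into the directories it matches:
find /my/path/ -type d -mtime +30 -prune -exec rm -rf {} \;    # -prune: don't descend into the dirs being removed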

find -mtime files older than 1 hour [duplicate]

This question already has answers here:
How to delete files older than X hours
(9 answers)
Closed 6 years ago.
I have this command that I run every 24 hours currently.
find /var/www/html/audio -daystart -maxdepth 1 -mtime +1 -type f -name "*.mp3" -exec rm -f {} \;
I would like to run it every 1 hour and delete files that are older than 1 hour. Is this correct:
find /var/www/html/audio -daystart -maxdepth 1 -mtime +0.04 -type f -name "*.mp3" -exec rm -f {} \;
I am not sure about my use of the decimal number.
Thanks for any corrections.
EDIT
OR could I just use -mmin 60? Is this correct?
EDIT2
I tried your test (good thing you suggested it) and got an empty result. I want all files OLDER than 60 minutes to be deleted! How can I do this? Does my command actually do that?
What about -mmin?
find /var/www/html/audio -daystart -maxdepth 1 -mmin +59 -type f -name "*.mp3" \
-exec rm -f {} \;
From man find:
-mmin n
File's data was last modified n minutes ago.
Also, make sure to test this first!
... -exec echo rm -f '{}' \;
          ^^^^ Add the 'echo' so you just see the commands that would be run,
               instead of actually running them first.
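For what it's worth, -mmin +59 matches files whose age exceeds 59 whole minutes, i.e. anything 60 minutes old or more, whereas a bare -mmin 60 would only match files that are exactly 60 whole minutes old. Combining the answer's command with the echo dry run gives this sketch:
find /var/www/html/audio -daystart -maxdepth 1 -mmin +59 -type f -name "*.mp3" -exec echo rm -f {} \;    # drop 'echo' once the output looks right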
