SHELL: Exit the script if DIRECTORY was modified within last 10min - macOS

I am trying to simply check:
If Directory was modified within the last 10min then stop the script.
If Directory was not modified within the last 10min then Continue.
The below always returns true:
if "`find "$DirName" -type d -mindepth 1 -mmin -10`"; then
exit
fi
OR
find "$DirName" -type d -mindepth 1 -mmin -10 && exit
Also tried comparing stat output and date +%y%m%d%s, but that doesn't seem to get me anywhere.
Could someone point me in the right direction?
macOS Mojave

Are you trying to recursively search the directory and determine if any file within the tree has been modified (which seems to be the only reason to use find), or do you just want something like:
eval "$(stat -s "$DirName")" # Set st_mtime to the mtime of the dir
if test "$(( $(date +%s) - $st_mtime ))" -lt 600; then
exit # mtime less than 10 minutes ago
fi
Although it's probably cleaner to skip the eval and write:
if test "$(( $(date +%s) - $(stat -f %m "$DirName") ))" -lt 600; then ...

Related

How to delete all files in ~/Downloads that have not been touched, added, or opened in the last 30 days?

I'm trying to create an automator workflow or application that, when activated, deletes all the files and subfolders in my Downloads folder that have not been created, modified, added, opened, or accessed in any way in the last 30 days.
I tried filtering in Automator, but that doesn't really do the job like I want it to. First of all, there's no option to filter by "date added", which I would really like. Secondly, I would prefer it to prioritize a subfolder over that subfolder's contents. For example, I have a folder which I added today, but the file inside that folder has a "date added" of much longer ago. My preference would be that that folder, including its contents, is ignored and therefore not deleted.
Then I read in another Stack Overflow thread (or at least some Stack Exchange site) that someone recommended using a bash script instead, something like this for example:
$ find "$HOME/Downloads" -type fd -mtime +30d -atime +30d -iname '*.*'
But even that doesn't seem to filter out the exact items that I want to filter out.
So just to be clear, I want to delete everything in my Downloads folder that has not been added, opened, created or modified in the last 30 days. And if there's any subtree where any of the folders or files within that subtree has been added, opened, created or modified within the last 30 days, then I would like that entire subtree to be ignored and left alone. Can anyone help me out here?
This is another approach if you don't want to use find. It works with both folders and files; comment out the rm -rf line first to confirm the behavior.
#!/bin/bash
compareDate=$(date -d "30 days ago" '+%Y%m%d')   # GNU date syntax
for f in ~/Downloads/*; do
    fileDate=$(date -r "$f" -u "+%Y%m%d")        # GNU date: mtime of "$f"
    if [ ! "$fileDate" -gt "$compareDate" ]; then
        echo "Deleting - $f"
        rm -rf "$f"
    else
        echo "Keeping - $f"
    fi
done
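Note that date -d "30 days ago" and date -r FILE are GNU date features. On macOS, which this question is about, the same idea can be sketched with BSD date and stat (same layout, an untested adaptation):
#!/bin/bash
compareDate=$(date -v-30d '+%Y%m%d')        # BSD date: 30 days ago
for f in ~/Downloads/*; do
    fileDate=$(stat -f %Sm -t %Y%m%d "$f")  # BSD stat: mtime, formatted
    if [ ! "$fileDate" -gt "$compareDate" ]; then
        echo "Deleting - $f"
        # rm -rf "$f"                       # uncomment after verifying
    else
        echo "Keeping - $f"
    fi
done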
You can do so with find and the -newerXY option (which you negate), where XY is mt for modification time, at for access time, and ct for inode change time. You simply pass -delete to remove the matching filenames. You can do:
d=$(date -d "30 days ago" '+%F %T') # get date and time 30 days ago
find ~/Downloads -type f ! -newermt "$d" ! -newerat "$d" ! -newerct "$d" -delete
(the order of the options is important, as they are evaluated as an expression; if you put -delete first it will delete all files under the ~/Downloads path, since nothing filters the list of files before -delete is encountered)
Note: test without -delete to make sure it returns the list you expect it to and then add the option back in to actually remove the files.
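On macOS, where date -d is unavailable, the same cutoff can be built with BSD date's -v adjustment flag; a sketch of the equivalent dry run:
d=$(date -v-30d '+%F %T')   # BSD date: 30 days ago
find ~/Downloads -type f ! -newermt "$d" ! -newerat "$d" ! -newerct "$d" -print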
Thoughts on the edited question: if any file is newer in mod, access, or change time, keep everything in that directory
After your edit, where any one recent file in a subdirectory should prevent removal of everything in that subdirectory, a single call to find is no longer helpful, since find processes one file at a time with no knowledge of how the tests on other files went.
My thought here is instead to loop over the directories under ~/Downloads one at a time, relying on globstar being set. The script changes to your "$HOME" directory so that the paths generated by the **/ search are relative to "$HOME", without extraneous path components like /home/user prepended to them.
Create a short function that loops over each file in the directory being processed; if any one file is newer in modification, access, or change time, do nothing with that directory, and all of its files are safe.
For a quick implementation using stat and date, comparing modification, access, and change times in seconds since the epoch, you could do:
#!/bin/bash

shopt -s globstar                    # enable globstar globbing
dt=$(date -d "30 days ago" '+%s')    # 30 days ago in seconds since epoch
cd "$HOME" || exit 1                 # change to home dir so paths are relative
dld="Downloads"                      # set Downloads variable

# returns 0 if no file in dir has an access, mod or change time within
# the last 30 days, returns 1 otherwise (don't remove)
nonenewerthan30 () {
    local dir="$1"
    [ -d "$dir" ] || return 1                           # validate it is a dir
    for f in "$dir"/*; do                               # loop over files in dir
        [ -d "$f" ] && continue                         # skip any directories in dir
        [ "$(stat -c %X "$f")" -gt "$dt" ] && return 1  # access time since epoch
        [ "$(stat -c %Y "$f")" -gt "$dt" ] && return 1  # mod time since epoch
        [ "$(stat -c %Z "$f")" -gt "$dt" ] && return 1  # change time since epoch
    done
    return 0                                            # directory can be removed
}

for d in "$dld"/**/; do              # loop over Downloads and all subdirs
    d="${d%/}"                       # remove trailing '/'
    [ "$d" = "$dld" ] && continue    # skip Downloads until subs processed
    printf "\nprocessing: %s\n" "$d"
    nonenewerthan30 "$d" && {        # call func; on 0 return, remove sub
        echo "  can remove $d"
        # rm -r "$d"                 # uncomment after you verify the behavior
    }
done
Currently it skips processing the files directly in Downloads until all the subdirectories are done. You will need to keep track of whether files are retained at any level to know whether removing them from Downloads is even an option. Adding that logic I leave to you.
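As with the other scripts here, stat -c is GNU syntax. If you need this on macOS/BSD, the three tests map onto stat -f format codes; a sketch using the same $dt variable:
[ "$(stat -f %a "$f")" -gt "$dt" ] && return 1   # access time since epoch
[ "$(stat -f %m "$f")" -gt "$dt" ] && return 1   # mod time since epoch
[ "$(stat -f %c "$f")" -gt "$dt" ] && return 1   # change time since epoch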

Remove files older than the start of the current day

I want logic that uses the find command to find all files older than today's date.
The below measures 24-hour periods from the current time rather than from midnight:
find /home/test/ -mtime +1
I am trying to achieve a solution where, no matter what time it executes from cron, it checks for all files older than the start of the day at 00:00. I believe this can be achieved using the epoch, but I'm struggling to find the best logic for this.
#!/bin/ksh
touch -t "$(date +%Y%m%d0000.00)" fence
find /home/test/ ! -newer fence -exec ksh -c '
    for f in "$@"; do
        # -ot is strictly older-than, so files whose mtime equals the
        # fence time are excluded
        [[ $f -ot fence ]] && printf "%s\n" "$f"
    done
' ksh {} +
rm fence
Why find(1) has no -older expression. :-(
UNIX find: opposite of -newer option exists?
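For what it's worth, if GNU find is available the fence file can be skipped entirely, since -newermt accepts a date string directly; a minimal sketch, assuming GNU find and date:
# Print files last modified before the start of today (midnight)
find /home/test/ -type f ! -newermt "$(date +%F)"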

Boolean check if a file has been opened in the past hour

I am trying to write a crontab job that looks inside a specified directory and checks whether the files are more than an hour old.
#!/bin/bash
for F in /My/Path/*.txt; do
    if [ ***TEST IF FILE WAS OPENED IN THE PAST HOUR*** ]
    then
        echo "$F"
    fi
done
Thanks for any help!
This can be done with a simple find:
find /path/to/directory -type f -newermt "1 hours ago"
Any files modified within the past hour will print to stdout (-newermt matches on modification time; use -newerat for access time). No need to loop and print.
#!/bin/bash
RECENT_FILES=$(find /path/to/directory -type f -newermt "1 hour ago")
if [[ -n $RECENT_FILES ]]; then
    echo "$RECENT_FILES"
else
    echo "No recently modified files found in dir"
fi
You can always redirect the results to a log file if you're trying to compile a list as well:
find /path/to/directory -type f -newermt "1 hour ago" >> "$yourLogFile"
A more rigorous approach using GNU date, which has an option -r
-r, --reference=FILE
display the last modification time of FILE
Using the above, incorporated into your script:
#!/bin/bash
for filename in /My/Path/*.txt; do
    if (( ( $(date +%s) - $(date -r "$filename" +%s) ) / 60 <= 60 )); then
        echo "$filename"
    fi
done
The logic is straightforward: we compute the file's age in minutes by subtracting the file's modification epoch from the current time. If the file was modified within the last 60 minutes, it is printed.
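One caveat: date -r reports modification time only. If "opened" really means accessed, a variant using GNU stat's %X (last access time in seconds since the epoch) would look like the sketch below, assuming your filesystem updates atime at all:
#!/bin/bash
for filename in /My/Path/*.txt; do
    # %X = last access time in seconds since the epoch (GNU stat)
    if (( ( $(date +%s) - $(stat -c %X "$filename") ) / 60 <= 60 )); then
        echo "$filename"
    fi
done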

How to check if a file is older than 30 minutes in Unix

I've written a script to iterate through a directory in Solaris. The script looks for files which are older than 30 minutes and echoes them. However, my if condition always returns true regardless of how old the file is. Could someone please help me fix this?
for f in `ls -1`; do
    # Take action on each file. $f stores the current file name
    if [ -f "$f" ]; then
        # Checks that it is a file, not a directory
        if test 'find "$f" -mmin +30'
        # Check if the file is older than 30 minutes since modification
        then
            echo $f is older than 30 mins
        fi
    fi
done
You should not parse the output of ls.
You invoke find for every file, which is unnecessarily slow.
You can replace your whole script with
find . -maxdepth 1 -type f -mmin +30 | while IFS= read -r file; do
    [ -e "${file}" ] && echo "${file} is older than 30 mins"
done
or, if your default shell on Solaris supports process substitution
while IFS= read -r file; do
    [ -e "${file}" ] && echo "${file} is older than 30 mins"
done < <(find . -maxdepth 1 -type f -mmin +30)
If you have GNU find available on your system the whole thing can be done in one line:
find . -maxdepth 1 -type f -mmin +30 -printf "%p is older than 30 mins\n"
Another option would be to use stat to check the time. Something like below should work.
for f in *; do
    # Take action on each file. $f stores the current file name
    if [ -f "$f" ]; then
        # Checks that it is a file, not a directory
        fileTime=$(stat --printf "%Y" "$f")   # mtime in seconds since epoch (GNU stat)
        curTime=$(date +%s)
        if (( (curTime - fileTime) / 60 < 30 )); then
            echo "$f is less than 30 mins old"
        else
            echo "$f is older than 30 mins"
        fi
    fi
done
Since you are iterating through a directory, you could try the command below, which finds all files whose names end in .log that were edited in the past 30 minutes. Using:
-mmin +30 would give all files edited before 30 minutes ago
-mmin -30 would give all files that have changed within the last 30 minutes
find ./ -type f -name "*.log" -mmin -30 -exec ls -l {} \;

Removing old folders in bash backup script

I have a bash script that rsyncs files onto my NAS to the directory below:
mkdir /backup/folder_`date +%F`
How would I go about writing a cleanup script that removes directories older than 7 days, based upon the date in the directory name?
#!/bin/bash
shopt -s extglob
OLD=$(exec date -d "now - 7 days" '+%s')
cd /backup || exit 1  ## If necessary.
while read DIR; do
    if read DATE < <(exec date -d "${DIR#*folder_}" '+%s') && [[ $DATE == +([[:digit:]]) && DATE -lt OLD ]]; then
        echo "Removing $DIR."  ## Just an example message. Or we could exclude this and add -v to rm.
        rm -ir "$DIR"          ## Change to -fr to skip confirmation.
    fi
done < <(exec find -maxdepth 1 -type d -name 'folder_*')
exit 0
We could actually use more careful approaches like read -rd $'\0', find -print0, and IFS=, but I don't think they are really necessary this time.
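For reference, the null-delimited variant mentioned above would look roughly like this (a sketch, reusing the same OLD value and extglob setting):
while IFS= read -rd '' DIR; do
    if DATE=$(date -d "${DIR#*folder_}" '+%s') && [[ $DATE == +([[:digit:]]) && DATE -lt OLD ]]; then
        rm -ir "$DIR"
    fi
done < <(find -maxdepth 1 -type d -name 'folder_*' -print0)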
Create a list of folders with the pattern you want to remove, remove the folders you want to keep from the list, delete everything else.
How about a simple find:
find /backup -name 'folder_*' -type d -ctime +7 -exec rm -rf {} \;
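Note that +7 matches things strictly older than 7 days, and if the backup folders can ever nest, adding -prune keeps find from trying to descend into a directory it has just removed:
find /backup -name 'folder_*' -type d -ctime +7 -prune -exec rm -rf {} \;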
