Move all the files 23 hrs after they were first created - shell

I am using the command below to pick files 23 hrs after they were first created, but it is not picking them up. Can you tell me where I am going wrong?
find /test/files -maxdepth 1 -type f -mtime +0.9
-mtime +1 means more than 24 hours old, so I used +0.9 expecting it to pick files roughly 23 hours old, but it is not picking anything.

I am afraid fractional values will not work with -mtime in many find implementations. What you can do is create a file with a timestamp of 23 hours ago and have find select the files older than it:
touch -d '23 hours ago' /tmp/tmp_file
find /test/files -maxdepth 1 -type f ! -newer /tmp/tmp_file
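To actually move the matches rather than just list them, the same reference-file trick can feed -exec mv. A minimal sketch, assuming GNU touch/mv; the reference-file name and the destination directory /test/archive are hypothetical:
touch -d '23 hours ago' /tmp/ref_23h
find /test/files -maxdepth 1 -type f ! -newer /tmp/ref_23h \
    -exec mv -t /test/archive {} +    # -t is GNU mv; use -exec mv {} /test/archive \; elsewhere
rm -f /tmp/ref_23h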

Bash: files older than a certain date [duplicate]

How do I use the UNIX command find to search for files created on a specific date?
As pointed out by Max, you can't search by creation date, but checking when files were modified or accessed is not all that hard. I wrote a tutorial about this just today. The essence of it is to use -newerXY and ! -newerXY:
Example: To find all files modified on the 7th of June, 2007:
$ find . -type f -newermt 2007-06-07 ! -newermt 2007-06-08
To find all files accessed on the 29th of September, 2008:
$ find . -type f -newerat 2008-09-29 ! -newerat 2008-09-30
Or, files whose status (for example permissions) was changed on the same day:
$ find . -type f -newerct 2008-09-29 ! -newerct 2008-09-30
If you haven't changed the file's permissions or ownership since creating it, the 'c' (status change) time usually coincides with the creation date, though.
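On GNU find, the -newerXY tests also accept a time of day, which helps when a whole calendar day is too coarse. A small sketch (the date and hours are arbitrary):
find . -type f -newermt '2007-06-07 09:00' ! -newermt '2007-06-07 17:00'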
Use this command to search for files and folders under /home/, adjusting the time period to your needs (a combined example follows the list below):
find /home/ -ctime time_period
Examples of time_period:
More than 30 days ago: -ctime +30
Less than 30 days ago: -ctime -30
Exactly 30 days ago: -ctime 30
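For instance, to combine two of these and list only regular files whose status changed between roughly 30 and 60 days ago (a sketch):
find /home/ -type f -ctime +30 -ctime -60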
It's two steps but I like to do it this way:
First create a file with a particular date/time. In this case, the file is 2008-10-01 at midnight
touch -t 0810010000 /tmp/t
Now we can find all files that are newer or older than the above file (going by file modified date). You can also use -anewer for accessed and -cnewer file status changed.
find / -newer /tmp/t
find / -not -newer /tmp/t
You could also look at files between certain dates by creating two files with touch
touch -t 0810010000 /tmp/t1
touch -t 0810011000 /tmp/t2
This will find files between the two dates & times
find / -newer /tmp/t1 -and -not -newer /tmp/t2
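On a GNU find you can get the same window without the temporary files, using the -newermt test mentioned earlier (a sketch covering the same 2008-10-01 range):
find / -newermt '2008-10-01 00:00' ! -newermt '2008-10-01 10:00'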
You could do this:
find ./ -type f -ls | grep 'Sep 10'
Example:
[root@pbx etc]# find /var/ -type f -ls | grep "Dec 24"
791235 4 -rw-r--r-- 1 root root 29 Dec 24 03:24 /var/lib/prelink/full
798227 288 -rw-r--r-- 1 root root 292323 Dec 24 23:53 /var/log/sa/sar24
797244 320 -rw-r--r-- 1 root root 321300 Dec 24 23:50 /var/log/sa/sa24
You can't. -ctime tests when the file's status (for example its permissions) was last changed, -atime tests the most recent access time, and -mtime tests the modification time. The filesystem used by most flavors of Linux (ext3) doesn't support a "creation time" record. Sorry!
@Max is right about the creation time.
However, if you want to calculate the elapsed days argument for one of the -atime, -ctime, -mtime parameters, you can use the following expression
ELAPSED_DAYS=$(( ( $(date +%s) - $(date -d '2008-09-24' +%s) ) / 60 / 60 / 24 - 1 ))
Replace "2008-09-24" with whatever date you want and ELAPSED_DAYS will be set to the number of days between then and today. (Update: subtract one from the result to align with find's date rounding.)
So, to find any file modified on September 24th, 2008, the command would be:
find . -type f -mtime $(( ( $(date +%s) - $(date -d '2008-09-24' +%s) ) / 60 / 60 / 24 - 1 ))
This will work if your version of find doesn't support the -newerXY predicates mentioned in @Arve's answer.
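For readability, the same command can be split into two steps (a sketch, assuming GNU date; 86400 is just 60 * 60 * 24):
ELAPSED_DAYS=$(( ( $(date +%s) - $(date -d '2008-09-24' +%s) ) / 86400 - 1 ))
find . -type f -mtime "$ELAPSED_DAYS"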
With the -atime, -ctime, and -mtime switches to find, you can get close to what you want to achieve.
cp `ls -ltr | grep 'Jun 14' | perl -wne 's/^.*\s+(\S+)$/$1/; print $1 . "\n";'` /some_destination_dir
I found this scriptlet in a script that deletes all files older than 14 days:
CNT=0
for i in $(find . -type f -ctime +14); do
  ((CNT = CNT + 1))
  echo -n "." >> "$PROGRESS"
  rm -f "$i"
done
echo "deleted $CNT files, done at $(date "+%H:%M:%S")" >> "$LOG"
I think a little additional "man find" and looking for the -ctime / -atime etc. parameters will help you here.
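If you only need the deletion and not the progress dots, the same 14-day clean-up can be pushed entirely into find, which also copes with spaces in file names (a sketch):
find . -type f -ctime +14 -exec rm -f {} +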

I have a find command used in my script: find . -type f -name "*.txt" -mtime +30. How does this command work? Does it consider the day alone?

find . -type f -name "*.txt" -mtime +30
Does this command consider the day alone, or the day and time?
If I execute the command at 12:00 pm, will it fetch files older than 30 days and created before 12:00 pm?
I don't have a Unix machine at hand to test, but hours are not considered as units of their own: find counts whole 24-hour periods back from the moment you run it and discards any fractional remainder, so the boundary does not fall at midnight unless you also pass -daystart.
Also, -mtime is not creation time, but modify time.
find
. (= current directory)
-type f (= {and} files only)
-name "*.txt" (= {and} all files with .txt extension)
-mtime +30 (= {and} files last modified more than 30 whole 24-hour periods ago, i.e. 31 or more days before the moment the command runs)
Executing that command at noon on Jan 31 will therefore return files last modified at or before noon on Dec 31, provided they also match the other criteria.
https://man7.org/linux/man-pages/man1/find.1.html
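A quick way to convince yourself of the whole-24-hour-period counting is to plant files with known ages (a sketch; assumes GNU find and touch, and the file names are made up):
touch -d '30 days ago' thirty.txt      # exactly 720 hours old: NOT matched by -mtime +30
touch -d '31 days ago' thirtyone.txt   # 744 hours old: matched
find . -maxdepth 1 -name '*.txt' -mtime +30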

Using find to locate files that haven't been modified since the last time it was 6 AM

I want to have a cron job that deletes files that haven't been changed since the last time it was 6 AM. It might not be clear so here is an example:
If it is 8 AM on Monday, I want to delete every file last modified before 6 AM that same day.
But if it is 4 AM on Monday, I want to delete every file last modified before 6 AM on Sunday.
That is why I can't just use
find /path/ -type f ! -newermt '06:00:00' -delete
How can I do this?
Check if it's past 6 AM first and modify the argument to -newermt accordingly.
if (( $(date +%-H) < 6 )); then
    when='yesterday 6'
else
    when='6'
fi
find /path/ -type f ! -newermt "$when" -print
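Once the output looks right, -print can be swapped for -delete and the whole thing dropped into the cron job. A sketch of the complete script (the path and the 06:00 cut-off are taken from the question):
#!/bin/bash
# delete files not modified since the most recent 6 AM
if (( $(date +%-H) < 6 )); then
    when='yesterday 6'
else
    when='6'
fi
find /path/ -type f ! -newermt "$when" -delete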

find -daystart argument explanation

So I understand that a line such as:
find /var/log/ -mtime +60 -type f -exec ls -l {} \;
will list all files in /var/log which were modified more than 60 days ago.
After reading through the find man page though I noticed:
-daystart
    Measure times (for -amin, -atime, -cmin, -ctime, -mmin, and -mtime) from the beginning of today rather than from 24 hours ago. This option only affects tests which appear later on the command line.
Can someone explain the rest (-amin, -atime, -cmin, -ctime, -mmin, -mtime)? The man page itself does not seem to really explain what each of these does.
Some example questions which might help me understand:
Find files modified an hour or more ago?
Find files modified between 60 minutes and 10 minutes ago?
Find files modified 2 weeks ago?
Find files created in the last 5 minutes?
Find files modified an hour or more ago?
-mmin +60
Find files modified between 60 minutes and 10 minutes ago?
-mmin -60 -mmin +10
Find files modified 2 weeks ago?
-mtime 14 (or, equivalently, -mtime +13 -mtime -15)
Find files created in the last 5 minutes?
Can't be done. POSIX has no specification for creation time.
These options are explained in the TESTS subsection of the EXPRESSIONS section of the find(1) man page.
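To see what -daystart actually changes, compare the same file against -mtime with and without it (a sketch; assumes GNU find and touch, run in the afternoon, and the file name is made up):
touch -d 'yesterday 20:00' demo.txt                    # e.g. 19 hours old when run at 15:00
find . -maxdepth 1 -name demo.txt -mtime 0             # matches: within the last 24 hours
find . -maxdepth 1 -name demo.txt -daystart -mtime 1   # matches: modified during yesterday
find . -maxdepth 1 -name demo.txt -daystart -mtime 0   # no match: not modified since today began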

How to delete files older than X hours

I'm writing a bash script that needs to delete old files.
It's currently implemented using :
find $LOCATION -name $REQUIRED_FILES -type f -mtime +1 -delete
This will delete files older than 1 day.
However, what if I need a finer resolution than 1 day, say 6 hours old? Is there a nice, clean way to do it, like there is using find and -mtime?
Does your find have the -mmin option? That can let you test the number of mins since last modification:
find $LOCATION -name $REQUIRED_FILES -type f -mmin +360 -delete
Or maybe look at using tmpwatch to do the same job. phjr also recommended tmpreaper in the comments.
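If you prefer to express the cut-off in hours, the shell can do the conversion to minutes (a sketch reusing the question's variables; HOURS is just an illustrative name):
HOURS=6
find $LOCATION -name $REQUIRED_FILES -type f -mmin +$((HOURS * 60)) -delete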
Here is the approach that worked for me (and I don't see it being used above)
$ find /path/to/the/folder -type f -name '*.*' -mmin +59 -delete
This deletes all the files older than 59 minutes while leaving the folders intact (-type f makes sure directories are never touched).
You could do this trick: create a file with a timestamp of 1 hour ago, and use the -newer file argument.
(Or use touch -t to create such a file).
-mmin is for minutes.
Try looking at the man page.
man find
for more types.
For SunOS 5.10:
Example 6: Selecting a File Using 24-hour Mode
The descriptions of -atime, -ctime, and -mtime use the terminology n ``24-hour periods''. For example, a file accessed at 23:59 is selected by:
example% find . -atime -1 -print
at 00:01 the next day (less than 24 hours later, not more than one day ago). The midnight boundary between days has no effect on the 24-hour calculation.
If you do not have "-mmin" in your version of "find", but it does accept fractional values for "-mtime", then "-mtime -0.041667" gets pretty close to "within the last hour" (0.041667 is 1/24). So in your case, use:
-mtime +(X * 0.041667)
so, if X means 6 hours, then:
find . -mtime +0.25 -ls
works because 24 hours * 0.25 = 6 hours
If one's find does not have -mmin and if one also is stuck with a find that accepts only integer values for -mtime, then all is not necessarily lost if one considers that "older than" is similar to "not newer than".
If we were able to create a file that has an mtime of our cut-off time, we can ask find to locate the files that are "not newer than" our reference file.
Creating a file with the correct time stamp is a bit involved, because a system that doesn't have an adequate find probably also has a less-than-capable date command that cannot do things like: date +%Y%m%d%H%M%S -d "6 hours ago".
Fortunately, other old tools can manage this, albeit in a more unwieldy way.
To begin finding a way to delete files that are over six hours old, we first have to find the time that is six hours ago. Consider that six hours is 21600 seconds:
$ date && perl -e '@d=localtime time()-21600; \
> printf "%4d%02d%02d%02d%02d.%02d\n", $d[5]+1900,$d[4]+1,$d[3],$d[2],$d[1],$d[0]'
Thu Apr 16 04:50:57 CDT 2020
202004152250.57
Since the perl statement produces the date/time information we need, use it to create a reference file that is exactly six hours old:
$ date && touch -t `perl -e '@d=localtime time()-21600; \
> printf "%4d%02d%02d%02d%02d.%02d\n", \
> $d[5]+1900,$d[4]+1,$d[3],$d[2],$d[1],$d[0]'` ref_file && ls -l ref_file
Thu Apr 16 04:53:54 CDT 2020
-rw-rw-rw- 1 root sys 0 Apr 15 22:53 ref_file
Now that we have a reference file exactly six hours old, the "old UNIX" solution for "delete all files older than six hours" becomes something along the lines of:
$ find . -type f ! -newer ref_file -a ! -name ref_file -exec rm -f "{}" \;
It might also be a good idea to clean up our reference file...
$ rm -f ref_file
Here is what one can do to go about it the way @iconoclast was wondering about in their comment on another answer.
Use a user crontab or /etc/crontab to recreate the file /tmp/hour at the top of every hour:
# m h dom mon dow user command
0 * * * * root /usr/bin/touch /tmp/hour > /dev/null 2>&1
and then use this to run your command:
find /tmp/ -daystart -maxdepth 1 -not -newer /tmp/hour -type f -name "for_one_hour_files*" -exec do_something {} \;
find "$PATH" -name "${log_prefix}*${log_ext}" -mmin +"$num_mins" -exec rm -f {} \;
