I have a backup system on my Ubuntu server that makes a database backup every day and saves it in a folder named after the day: $(date +%d%m%y)
But in the script, when I try to find and delete the folders from the last week, the command doesn't find any directory.
I'm trying with: find -name $(date +%d%m%y) -type d -mtime +7 -exec rm -r {};
It never finds a directory. I tried changing the -mtime value to 1 or 2 days, but it doesn't find anything.
I think you made a small mistake:
When you back up on the 7th of May, you create a folder named 070515. When you search a week later, you look for a folder named 140515 that was modified more than 7 days ago. However, that folder was only created today.
You may not need the name of the folder; just use
find /backup/path -type d -mtime +7
to find all folders older than 7 days.
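If you really wanted to match by name, you would have to compute last week's date rather than today's. A sketch, assuming GNU date (standard on Ubuntu) and /backup/path as a placeholder:
find /backup/path -type d -name "$(date -d '7 days ago' +%d%m%y)" -exec rm -r {} +
Note that this removes only the folder from exactly 7 days ago, which is why matching on -mtime alone is simpler.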
I suspect at least two errors in your find command:
The path where to search is missing: find /where/to/search -name ...
$(date +%d%m%y) always gives the current date. It stands to reason that directories named with today's date don't have a modification time of +1 or +7 days. Instead, try the following:
find /where/to/search -type d -mtime +7
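Once the listing looks right, deletion is a small extension. A sketch, assuming the daily folders sit directly under /backup/path:
find /backup/path -mindepth 1 -maxdepth 1 -type d -mtime +7 -print
find /backup/path -mindepth 1 -maxdepth 1 -type d -mtime +7 -exec rm -r {} +
The first command only prints what matches, so you can verify the list before letting the second one delete anything; -mindepth 1 keeps find from ever matching /backup/path itself.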
I have a cron job that backs up my MySQL database every 5 minutes to files ending in .sql.gz. But this produces hundreds of files a day. So I searched the internet and found this bash script, which I expected to work only on the files in the specified /backup folder, and only on .sql.gz files. But I soon found that it deleted everything in my root folder. :-) I was able to FTP the files back and have my site back up in half an hour, but I still need the script to work as intended. I'm new to bash scripting, so I'm asking: what did I do wrong in editing the script I found on the internet to fit my needs? What would work?
Here is the rogue script. DO NOT run this as is; it's broken, that's why I'm here:
find /home/user/backups/*.gz * -mmin +60 -exec rm {} \;
I suspect that the last backslash should be /home/user/backups/, and that I should also remove the * before -mmin.
So what I need should be:
find /home/user/backups/*.gz -mmin +60 -exec rm {} /home/user/backups/;
Am I correct? Or still missing something?
BTW, I'm running this via cron on DreamHost shared hosting. Their support doesn't really want to help with Bash questions; I tried.
The path arguments to find should be the directories from which the recursive search starts. Then use -name and other options to filter down to the files that match the criteria you want.
find /home/user/backups -type f -name '*.sql.gz' -mmin +60 -exec rm {} +
-type f means only select ordinary files
-name '*.sql.gz' means only filenames ending in .sql.gz
-mmin +60 means files more than 60 minutes old
And using + instead of \; at the end of -exec means that it should just run the command once with all the selected filenames, rather than separately for each filename; this is a minor efficiency improvement.
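Given the earlier accident, it may be worth running the same expression with -print first, and only swapping in the deleting action once the output looks right:
find /home/user/backups -type f -name '*.sql.gz' -mmin +60 -print
This is a pure dry run: it lists what would be matched without touching anything.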
I have files with names such as
filename_201702200800.tar.bz2
filename_201702201800.tar.bz2
and so on
I am looking to remove files that are 5 days old or older.
The date in the files is of the format %Y%m%d%H%M.
As the timestamps correspond to the names, just use find:
find /path/to/files -type f -ctime +5 -exec rm {} +
(Strictly speaking, -ctime checks the inode change time, not a creation time; for files that are written once and then left alone, the two usually coincide.)
From the man page:
-exec command {} +
This variant of the -exec action runs the specified command on the selected files, but the command line is built by appending each selected file name at the end; the total number of invocations of the command will be much less than the number of matched files. The command line is built in much the same way that xargs builds its command lines. Only one instance of ‘{}’ is allowed within the command. The command is executed in the starting directory.
Not enough rep to comment.
Can you use the mtime of the files?
find /path/to/files -type f -mtime +5 -delete
Otherwise, you could calculate the dates to find with:
date -d "5 days ago" +%Y%m%d%H%M
A cron action saves database files on an hourly basis and assigns a file name based on year, month, day and hour
/$(date +\%m)/$(date +\%y\%m\%d\%H)_thedb.sql
This leads to archive bloat, and the goal is to keep only the last file of the day (i.e. delete all those with names less than 15050923*) in a separate cron action.
What is an effective way of achieving this?
Before you start with complex bash string substitutions, I suggest you try to go by the file date. find can help you with that.
For example, to delete all files in a directory that are older than 5 days, you could try something like this:
find <DIR> -mtime +5 -exec rm {} \;
Now if there are subdirectories in <DIR>, you might also want to include the option -type f to limit the matches to files, and -maxdepth 1 to not search subdirectories.
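Combined, that would look something like:
find <DIR> -maxdepth 1 -type f -mtime +5 -exec rm {} \;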
If you have a file and want to delete everything older than that, you could slightly modify this:
find <DIR> -not -newer <FILE> -not -name <FILE> -exec rm {} \;
I simply don't know why there is no -older search term in find; it seems so obvious.
Warning: I strongly recommend first leaving out -exec and everything after it, to check whether the files it finds are all safe to delete.
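For the stated goal of keeping only each day's last file, another option is to group the names by day and drop all but the newest in each group. A sketch, assuming names like 15050923_thedb.sql (yymmddHH) all in one directory, and GNU coreutils for head -n -1 and xargs -r:
cd /path/to/archive || exit 1
for day in $(ls *_thedb.sql | cut -c1-6 | sort -u); do
    # List that day's dumps in hour order and delete all but the last one.
    ls "${day}"??_thedb.sql | head -n -1 | xargs -r rm --
done
You would want to run this only after a day is over (or skip $(date +%y%m%d)), so the current day's newest-so-far file isn't treated as final prematurely.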
I have a directory where many files are created daily, and I need to copy the newly generated files. All the files are created with names starting with abc_
Ex: I have a file abc_0520123.pdf; the next day two files were created, abc_0521234.pdf and abc_0521254.pdf, and now I want to copy only these two newly created files.
Please help me with how I can compare the old files with the new ones and copy them.
You can use find.
find /my_directory -mtime -1 # Finds everything modified less than one day ago.
find /my_directory -ctime -1 # Finds everything whose status changed less than one day ago.
find /my_directory -ctime +5 # Finds things whose status changed more than 5 days ago.
(Note: -ctime is the inode change time, which is not the same as a creation time, and GNU find takes plain numbers of days here; a d suffix is a BSD extension.)
If you want to move the files, you can use -exec
find /my_directory -mtime -1 -type f -exec mv {} /new_dir/ \;
This finds only regular files under /my_directory that are less than 1 day old and moves them to /new_dir.
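Since the question asks about copying rather than moving, the same pattern works with cp. A sketch, assuming GNU cp for the -t option:
find /my_directory -type f -name 'abc_*' -mtime -1 -exec cp -t /new_dir/ {} +
With a non-GNU cp, the -exec mv form above works the same way with cp in its place, one file at a time.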
Find is one of the most useful commands you can ever learn!
Could someone please give me the command to delete log files dated before today and yesterday (i.e. keep today's and yesterday's files)?
You can use find with the -mtime option to get a list of files modified more than N days ago.
For example:
find . -maxdepth 1 -name '*.txt' -mtime +2
will give you all the *.txt files in the current directory last modified more than 2 full days ago (find rounds ages down to whole 24-hour periods, so in practice this means at least 72 hours old).
You can add -delete to actually get rid of them.
find /path/to/files* -mtime +2 -delete
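Since the requirement is calendar-based ("except today and yesterday") rather than a rolling window, GNU find can also compare against an absolute date. A sketch, with the path and pattern as placeholders:
find /path/to/logs -maxdepth 1 -name '*.log' -not -newermt 'yesterday 00:00' -delete
-not -newermt 'yesterday 00:00' matches files last modified before the start of yesterday, which keeps today's and yesterday's logs regardless of what time the job runs.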