Delete *.ZIP files older than 15 days - bash

I'm currently using a PHP script to back up my databases daily and it works like a charm. After the backup, I am using the shell script below to zip the backup file:
find /home/root/BACKUPS/backup.sql | xargs zip -czvPf /home/root/BACKUPS/$(date +%F)_backup.sql.zip
I need to add a piece of code that will also scan previously created "$(date +%F)_backup.sql.zip" and delete any that are older than 15 days.
Does anyone have any recommendations on how to make this work?
UPDATE 10/16/2019 1601HRS EST
find /home/root/BACKUPS/backup.sql | xargs zip -czvPf /home/root/BACKUPS/$(date +%F)_backup.sql.zip
find /home/root/BACKUPS/ -mtime +14 -type f -iname '*.backup.sql.zip' -exec rm {} \;
This did not remove the files that should have been removed. I'm not sure what I'm missing; maybe a ';' is needed after the first line. The first line runs properly, zipping and naming the SQL file, but the second line is not working.

I cannot comment yet, so posting this as an answer.
I guess you want something like this shell command:
find /home/root/BACKUPS/ -mtime +15 -type f -iname '*.backup.sql.zip' -exec rm {} \;
Edit:
Some explanation: this finds and deletes (-exec rm {} \;) all regular files (-type f) whose names end in "backup.sql.zip" and whose modification time is older than 15 days (-mtime +15).
Hope it helps.
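If you want to see what would go before anything is actually deleted, the same expression with -print in place of -exec rm works as a dry run (a sketch using the path and pattern from above):
# list the matches first; switch back to -exec rm {} \; once the list looks right
find /home/root/BACKUPS/ -mtime +15 -type f -iname '*.backup.sql.zip' -print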

This worked perfectly for me.
Backup.php
#!/usr/bin/php
<?php
$file = "/home/root/BACKUPS/backup.sql";
$command = "mysqldump -uroot_admin -pkeypw --all-databases > $file";
system($command);
?>
Backup.bat
find /home/root/BACKUPS/backup.sql | xargs zip -czvPf /home/root/BACKUPS/$(date +%F)_backup.sql.zip
find /home/root/BACKUPS -name "*.zip" -type f -mtime +15 -exec rm -f {} \;
Reference: https://tecadmin.net/delete-files-older-x-days/
Basically, @Sebastian, the .backup.sql part of the file name reference did not have to be included; instead of *.backup.sql.zip it needed to simply be *.zip. Thank you @Sebastian for the lead.
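For anyone hitting the same mismatch: the archives are named like 2019-10-16_backup.sql.zip, with an underscore before "backup", so a pattern with a literal dot there never matches. A quick way to compare the two patterns before wiring either into rm (a sketch, the date is illustrative):
# no output: the files have an underscore, not a dot, before "backup"
find /home/root/BACKUPS/ -type f -iname '*.backup.sql.zip'
# lists the date-stamped archives, e.g. 2019-10-16_backup.sql.zip
find /home/root/BACKUPS/ -type f -iname '*_backup.sql.zip'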

Related

execdir and rename commands together?

I asked a question about finding, copying and renaming, which can be found here: Find, copy, rename within the same directory.
The answer was great and solved the issue I had in that thread, but it brought up another question: how can I rename just part of the file? For example, when running this command:
find /home/ian/Desktop/TEST/ -type f -mmin -1 -execdir echo cp \{} \{}_backup \;
and the file is called TEST_MASTER, how can you run the above and have the new file called TEST_BACKUP as opposed to TEST_MASTER_BACKUP?
I can solve this by running a new rename command straight after, like below:
find /home/ian/Desktop/TEST/ -type f -mmin -1 -execdir cp \{} \{}_backup \; ;
rename __MASTER_backup _backup *MASTER_backup ;
but there must be a way to do this in one go?
All the best,
Ian
You can use this find command:
find /home/ian/Desktop/TEST/ -type f -mmin -1 -execdir bash -c 'cp "$1" "${1%%_*}_BACKUP"' - '{}' \;
I came up with almost the same answer as anubhava:
find /home/ian/Desktop/TEST/ -type f -name '*_MASTER' -mmin -1 -execdir \
bash -c 'mv "$1" "${1/MASTER/BACKUP}"' - \{} \;
This will only back up *_MASTER files. If you need to back up the other files as well (and add an extra _BACKUP at the end), vote for anubhava!
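For anyone new to the ${...} syntax both answers lean on, here is what the two expansions do on a sample name (just an illustration at an interactive prompt):
f=TEST_MASTER
echo "${f%%_*}_BACKUP"     # strip everything from the first '_' onward, then append _BACKUP -> TEST_BACKUP
echo "${f/MASTER/BACKUP}"  # replace the first occurrence of MASTER -> TEST_BACKUP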

Trying to remove a file and its parent directories

I've got a script that finds files within folders older than 30 days:
find /my/path/*/README.txt -mtime +30
that'll then produce a result such as
/my/path/jobs1/README.txt
/my/path/job2/README.txt
/my/path/job3/README.txt
Now the part I'm stuck at is I'd like to remove the folder + files that are older than 30 days.
find /my/path/*/README.txt -mtime +30 -exec rm -r {} \;
doesn't seem to work; it's only removing the README.txt file.
So ideally I'd like to just remove /job1, /job2, /job3 and any nested files.
Can anyone point me in the right direction?
This would be a safer way:
find /my/path/ -mindepth 2 -maxdepth 2 -type f -name 'README.txt' -mtime +30 -printf '%h\n' | xargs echo rm -r
Remove echo if you find it already correct after seeing the output.
With that, -printf '%h\n' prints the directory of each matching file, and xargs then processes those directories.
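If any of the job directories contain spaces, a null-delimited variant of the same idea is safer (a sketch; keep the echo until the output looks right):
find /my/path/ -mindepth 2 -maxdepth 2 -type f -name 'README.txt' -mtime +30 -printf '%h\0' | xargs -0 echo rm -r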
You can just run the following command in order to recursively remove directories modified more than 30 days ago.
find /my/path/ -type d -mtime +30 -exec rm -rf {} \;
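One caveat: with -exec rm -rf {} \; find will try to descend into directories it has just removed and print "No such file or directory" warnings. Limiting the search to the top-level job folders avoids that (a sketch, assuming the job folders sit directly under /my/path/):
find /my/path/ -mindepth 1 -maxdepth 1 -type d -mtime +30 -exec rm -rf {} +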

bash: How can I use find to check for backup file and delete (if exists)?

I have an onsite backup folder /backup/ that has an offsite rsynced copy mounted locally as /mnt/offsite/backup/. My onsite backup drive is getting full, so I'd like to delete files older than 365 days, but first check to see if that file exists offsite, and log to file the filenames that were removed (to exclude from rsync).
I've come close with this:
cd /mnt/offsite/backup && find . -type f -mtime +365 -exec rm /backup/{} \; | >> file.lst
However the redirection isn't working. I've tried placing the >> in different places, and can't get it to work with exec in there. I've also tried using xargs rm, and can get the redirect working, but can't get xargs to delete from the second path:
cd /mnt/offsite/backup && find . -type f -mtime +365 >> file.lst | xargs rm /backup/
What's the best approach?
Hope this helps
find /mnt/offsite/backup -type f -mtime +365 -exec rm {} \; -print >> file.lst
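If you also want the explicit "delete the onsite copy only when it exists, and log what went" behaviour from the question, something along these lines should do it (a sketch built on the question's paths; the loop and logging are my additions, and file.lst ends up under /mnt/offsite/backup because of the cd):
cd /mnt/offsite/backup && find . -type f -mtime +365 -exec bash -c '
  for f; do
    # the offsite copy was found by find, so the matching onsite file can go
    if [ -f "/backup/$f" ]; then
      rm "/backup/$f" && printf "%s\n" "$f" >> file.lst
    fi
  done' _ {} +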

script to move folder based on modified date

first post.
Background: my wife is a photographer. She takes a lot of pictures and saves them to a drive (/mnt/STStorage) while she is editing them, but never cleans up afterward. I have a drive (/mnt/LTStorage) that I would like to move the folders to, based on modified date.
I need help with a script that I can add to a cron job to run once a day (30 1 * * *).
I would like for the script to:
- Check /mnt/STStorage/ root folders for last modified date and, if older than 14 days, move each such folder and everything in it to /mnt/LTStorage/ while keeping the same folder name.
- Write what was moved to /mnt/STStorage/ so that we know what was moved, and email me a log of the folders moved.
OS CentOS 6.4
Here is what I have; I think this may work for now. It could be cleaner.
#!/bin/bash
dt=$(date +%Y.%m.%d)
From="/mnt/STStorage/"
To="/mnt/LTStorage/"
if [[ ! -d "$To" ]]; then
mkdir -p "$To"
fi
cd "$From" || exit 1
# mv -v reports each move so the log file actually has content
find . -type d -mtime +14 -exec mv -v "{}" "$To" \; > "$From"/Moved."$dt".txt
uuencode "$From"/Moved."$dt".txt "$From"/Moved."$dt".txt | mail -s "Files Moved" me@me.com
Then I will add this to crontab to run once a day.
You can use -exec along with find. Something like:
find /mnt/STStorage/ -type d -mtime +14 -exec mv {} /mnt/LTStorage/ \;
-type d will ensure only directories are moved.
Another option is to use xargs
find /mnt/STStorage/ -type d -mtime +14 | xargs -I '{}' mv {} /mnt/LTStorage/
Update:
To add what is being moved, you can set the verbose mode option for mv
find /mnt/STStorage/ -type d -mtime +14 -exec mv -v {} /mnt/LTStorage/ \;
Since this prints everything on standard out, you can redirect it to a log file:
find /mnt/STStorage/ -type d -mtime +14 -exec mv -v {} /mnt/LTStorage/ \; > /mnt/STStorage/log.file
For emailing, you can do something like:
uuencode /mnt/STStorage/log.file /mnt/STStorage/log.file | mail -s "this is my subject line" chip@email.com
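Putting those pieces together for the 30 1 * * * cron job, the whole thing can be two lines in one script (a sketch; -mindepth 1 -maxdepth 1 is my addition so only the top-level photo folders are considered, and the subject and address are placeholders):
# move top-level folders older than 14 days and keep a log of what moved
find /mnt/STStorage/ -mindepth 1 -maxdepth 1 -type d -mtime +14 -exec mv -v {} /mnt/LTStorage/ \; > /mnt/STStorage/log.file
# mail the log
uuencode /mnt/STStorage/log.file /mnt/STStorage/log.file | mail -s "Folders moved to LTStorage" chip@email.com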

Recursively unzip files and then delete original file, leaving unzipped files in place from shell

I've so far figured out how to use find to recursively unzip all the files:
find . -depth -name `*.zip` -exec /usr/bin/unzip -n {} \;
But I can't figure out how to remove the zip files one at a time after the extraction. Adding rm *.zip in an -a -exec ends up deleting most of the zip files in each directory before they are extracted. Piping through a script containing the rm command (with -i enabled for testing) causes find to not find any *.zip files (or at least that's what it complains about). There is, of course, whitespace in many of the filenames, but at this point working out a sed command to replace it with underscores is a bit beyond me. Thanks for your help!
have you tried:
find . -depth -name '*.zip' -exec /usr/bin/unzip -n {} \; -exec rm {} \;
or
find . -depth -name '*.zip' -exec /usr/bin/unzip -n {} \; -delete
or running a second find after the unzip one
find . -depth -name '*.zip' -exec rm {} \;
Thanks for the 2nd command with -delete! It helped me a lot.
Just two (maybe helpful) remarks from my side:
- I had to use '*.zip' instead of `*.zip` on my debian system.
- Use -execdir instead of -exec: this extracts each zip file within its own folder; otherwise you end up with all the extracted content in the directory you invoked the find command from.
find . -depth -name '*.zip' -execdir /usr/bin/unzip -n {} \; -delete
Thanks & regards,
Nord
As mentioned above, this should work.
find . -depth -name '*.zip' -execdir unzip -n {} \; -delete
However, note two things:
The -n option instructs unzip not to overwrite existing files. You may not know whether the zip files differ from the similarly named target files; even so, -delete will remove the zip file.
If unzip can't unzip the file, say because of an error, it might still get deleted. The command will certainly remove it if -exec rm {} \; is used in place of -delete.
A safer solution might be to move the files following the unzip to a separate directory that you can trash when you're sure you have extracted all the files successfully.
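Something like this implements that idea (a sketch; the holding directory is just an example, and it sits outside the tree so find doesn't revisit the moved archives):
mkdir -p /tmp/processed-zips
# extract next to each archive, then park the archive instead of deleting it
find . -depth -name '*.zip' -execdir unzip -n {} \; -exec mv -n {} /tmp/processed-zips/ \;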
Unzip archives into a subdir based on the file name (../file.zip -> ../file/..):
for F in $(find . -depth -name '*.zip'); do unzip "$F" -d "${F%.*}/" && rm "$F"; done
I have a directory filling up with zipped csv files. External processes are writing new zipped files to it often. I wish to bulk unzip and remove the originals as you do.
To do that I use:
unzip '*.zip'
find . -name '*.csv' | sed 's/$/.zip/' | xargs -n 1 rm
It works by matching and extracting all the zip files present in the directory. By the time it finishes, there may be newer, not-yet-unzipped files mixed in there too that must not be deleted yet.
So I delete by finding the successfully unzipped *.csv files and using sed to regenerate the original archive filenames, which are then fed to rm via xargs.
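An alternative that avoids the sed step, assuming the archives are named <name>.csv.zip and each one extracts to <name>.csv, is to remove an archive only when its extracted csv is actually present (a sketch):
for z in *.zip; do
  # remove the archive only if the matching extracted file exists
  [ -f "${z%.zip}" ] && rm "$z"
done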
