Delete all files on an FTP server created more than 10 days ago - bash

Is there a command to delete all files with a certain name that were created more than a couple of days ago from my FTP server, running Ubuntu 14.04?
Here is what I have:
find /path/to/files* -mtime +10 -exec rm {} \;

Try this on the FTP server (with mydir changed to your directory).
find mydir -type f ! -mtime -5 -exec echo rm {} \;
When the echo shows what you like, remove the echo.
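For the original 10-day case, the final command would then presumably be (using the directory itself rather than the /path/to/files* shell glob):
find /path/to/files -type f -mtime +10 -exec rm {} \;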

Delete *.ZIP files older than 15 days

I'm currently using a PHP script to back up my databases daily and it works like a charm. After the backup, I am using the shell script below to zip the backup file:
find /home/root/BACKUPS/backup.sql | xargs zip -czvPf /home/root/BACKUPS/$(date +%F)_backup.sql.zip
I need to add a piece of code that will also scan previously created "$(date +%F)_backup.sql.zip" and delete any that are older than 15 days.
Does anyone have any recommendations on how to make this work?
UPDATE 10/16/2019 1601HRS EST
find /home/root/BACKUPS/backup.sql | xargs zip -czvPf /home/root/BACKUPS/$(date +%F)_backup.sql.zip
find /home/root/BACKUPS/ -mtime +14 -type f -iname '*.backup.sql.zip' -exec rm {} \;
This did not remove the files that should have been removed. I'm not sure what I'm missing; maybe a ';' after the first line. Although the first line is running properly by zipping and naming the SQL file, the second line is not working.
I cannot comment yet, so I am posting this as an answer.
I guess you want something like this shell command:
find /home/root/BACKUPS/ -mtime +15 -type f -iname '*.backup.sql.zip' -exec rm {} \;
Edit:
Some explanation: this finds and deletes (-exec rm {} \;) all regular files (-type f) whose names end in "backup.sql.zip" (-iname '*.backup.sql.zip') and whose modification time is more than 15 days ago (-mtime +15).
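To preview what would be deleted before committing to rm, a common pattern is to swap the -exec for a plain -print:
find /home/root/BACKUPS/ -mtime +15 -type f -iname '*.backup.sql.zip' -print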
Hope it helps.
This worked perfectly for me.
Backup.php
#!/usr/bin/php
<?php
// Destination file for the raw SQL dump
$file = "/home/root/BACKUPS/backup.sql";
// Dump all databases with mysqldump and write them to $file
$command = "mysqldump -uroot_admin -pkeypw --all-databases > $file";
system($command);
?>
Backup.bat
find /home/root/BACKUPS/backup.sql | xargs zip -czvPf /home/root/BACKUPS/$(date +%F)_backup.sql.zip
find /home/root/BACKUPS -name "*.zip" -type f -mtime +15 -exec rm -f {} \;
Reference: https://tecadmin.net/delete-files-older-x-days/
Basically, @Sebastian, the .backup.sql part of the filename reference did not have to be included: the generated names look like 2019-10-16_backup.sql.zip, with an underscore rather than a dot before "backup", so the pattern *.backup.sql.zip never matched anything. Instead of *.backup.sql.zip it simply needed to be *.zip. Thank you @Sebastian for the lead.
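If a pattern tighter than *.zip is wanted, one that matches the generated names (with the underscore before "backup") should also work; a sketch, assuming the $(date +%F)_backup.sql.zip naming from above:
find /home/root/BACKUPS -type f -iname '*_backup.sql.zip' -mtime +15 -exec rm -f {} \;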

Deleting files from an AIX system

We have an AIX system which receives files on a daily basis, and we manually delete the previous day's files. Is it possible to write a script that takes the files from 15 or 20 days before today and deletes them from the folder?
Or you can use the native AIX find command:
find /dir/to/files -type f -mtime +15 -exec rm {} \;
where:
-type f - Find only files, not directories
-mtime +15 - Find files whose modification time is more than 15 days ago
-exec rm {} \; - Run the rm command on each matched file
You can first run this command with -exec ls -l {} \; to verify that the files found match your criteria.
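For example:
find /dir/to/files -type f -mtime +15 -exec ls -l {} \;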
If you can install GNU find, then it's simple, e.g.:
#!/bin/sh
cd /var/log/apache
gfind . -name '*log*Z' -mtime +30 -delete
this script is run by cron; a line from crontab:
02 23 1 * * /root/cmd/httpd.logdelete >/dev/null 2>&1
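For reference, the five leading fields are minute, hour, day of month, month, and day of week, so this entry runs at 23:02 on the first day of every month:
# m  h   dom mon dow  command
02   23  1   *   *    /root/cmd/httpd.logdelete >/dev/null 2>&1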
Edit:
-mtime +<n> matches files whose last modification date is earlier than now minus <n> days
-delete deletes the files that match the criteria
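Here too, replacing -delete with -print first gives a safe preview of what would be removed:
gfind . -name '*log*Z' -mtime +30 -print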

Trying to remove a file and its parent directories

I've got a script that finds files within folders older than 30 days:
find /my/path/*/README.txt -mtime +30
that'll then produce a result such as
/my/path/job1/README.txt
/my/path/job2/README.txt
/my/path/job3/README.txt
Now the part I'm stuck at is I'd like to remove the folder + files that are older than 30 days.
find /my/path/*/README.txt -mtime +30 -exec rm -r {} \;
doesn't seem to work; it only removes the README.txt file,
so ideally I'd like to remove /job1, /job2, /job3 and any nested files.
Can anyone point me in the right direction?
This would be a safer way:
find /my/path/ -mindepth 2 -maxdepth 2 -type f -name 'README.txt' -mtime +30 -printf '%h\n' | xargs echo rm -r
Remove the echo once the output looks correct.
This uses -printf '%h\n' to print the directory of each matched file, then xargs to process the list.
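If the job directories can contain spaces or other special characters, a null-delimited variant (assuming GNU find and xargs) avoids word-splitting issues:
find /my/path/ -mindepth 2 -maxdepth 2 -type f -name 'README.txt' -mtime +30 -printf '%h\0' | xargs -0 echo rm -r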
Alternatively, you can run the following command to recursively remove directories modified more than 30 days ago (note that this tests each directory's own modification time, not the README's):
find /my/path/ -type d -mtime +30 -exec rm -rf {} \;
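Because the deletion happens while find is still walking the tree, this can also print "No such file or directory" warnings for directories it has already removed. A sketch that limits the match to the top-level job folders, assuming the /my/path/jobN layout from the question:
find /my/path/ -mindepth 1 -maxdepth 1 -type d -mtime +30 -exec rm -rf {} +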

bash: How can I use find to check for backup file and delete (if exists)?

I have an onsite backup folder /backup/ that has an offsite rsynced copy mounted locally as /mnt/offsite/backup/. My onsite backup drive is getting full, so I'd like to delete files older than 365 days, but first check to see if that file exists offsite, and log to file the filenames that were removed (to exclude from rsync).
I've come close with this:
cd /mnt/offsite/backup && find . -type f -mtime +365 -exec rm /backup/{} \; | >> file.lst
However the redirection isn't working. I've tried placing the >> in different places, and can't get it to work with exec in there. I've also tried using xargs rm, and can get the redirect working, but can't get xargs to delete from the second path:
cd /mnt/offsite/backup && find . -type f -mtime +365 >> file.lst | xargs rm /backup/
What's the best approach?
Hope this helps
cd /mnt/offsite/backup && find . -type f -mtime +365 -exec rm /backup/{} \; -print >> file.lst
Because -print comes after the -exec, a filename is only logged when the rm succeeded, and since find walks the offsite tree, only files that exist offsite are considered for deletion onsite.
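If the log is meant to feed the rsync exclusion mentioned in the question, rsync's --exclude-from option reads one pattern per line; a sketch, assuming the leading "./" that find emits is stripped first and that the sync runs onsite-to-offsite with --delete:
sed 's|^\./||' file.lst > exclude.lst
rsync -av --delete --exclude-from=exclude.lst /backup/ /mnt/offsite/backup/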

script to move folder based on modified date

First post.
Background: my wife is a photographer; she takes a lot of pictures and saves them to a drive (/mnt/STStorage) while she is editing them, but never cleans up afterward. I have a drive (/mnt/LTStorage) that I would like to move the folders to based on modified date.
I need help with a script that I can add to a cron job to run once a day (30 1 * * *).
I would like the script to:
Check the /mnt/STStorage/ root folders for their last modified date and, if older than 14 days, move each such folder and everything in it to /mnt/LTStorage/ while keeping the same folder name.
Then write what was moved to /mnt/STStorage/ so that we know what was moved, and email me a log of the folders moved.
OS: CentOS 6.4
Here is what I have; I think this may work for now. It could be cleaner.
#!/bin/bash
dt=$(date +%Y.%m.%d)
From="/mnt/STStorage/"
To="/mnt/LTStorage/"
# Create the destination if it does not exist yet
if [[ ! -d "$To" ]]; then
mkdir -p "$To"
fi
cd "$From" || exit 1
# Move the top-level folders not modified in the last 14 days; -v logs each move
find . -mindepth 1 -maxdepth 1 -type d -mtime +14 -exec mv -v "{}" "$To" \; > "$From"/Moved."$dt".txt
# Mail the log of moved folders
uuencode "$From"/Moved."$dt".txt "$From"/Moved."$dt".txt | mail -s "Files Moved" me@me.com
Then I will add this to crontab to run once a day.
You can use -exec along with find. Something like:
find /mnt/STStorage/ -type d -mtime +14 -exec mv {} /mnt/LTStorage/ \;
-type d will ensure only directories are moved.
Another option is to use xargs
find /mnt/STStorage/ -type d -mtime +14 | xargs -I '{}' mv {} /mnt/LTStorage/
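Since photo folder names often contain spaces, a null-delimited variant (GNU find and xargs assumed) is safer; -mindepth 1 -maxdepth 1 also keeps the match to the top-level folders and stops find from trying to move /mnt/STStorage/ itself:
find /mnt/STStorage/ -mindepth 1 -maxdepth 1 -type d -mtime +14 -print0 | xargs -0 -I '{}' mv {} /mnt/LTStorage/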
Update:
To log what is being moved, you can turn on mv's verbose option:
find /mnt/STStorage/ -type d -mtime +14 -exec mv -v {} /mnt/LTStorage/ \;
Since this prints everything to standard out, you can redirect it to a log file:
find /mnt/STStorage/ -type d -mtime +14 -exec mv -v {} /mnt/LTStorage/ \; > /mnt/STStorage/log.file
For emailing, you can do something like:
uuencode /mnt/STStorage/log.file /mnt/STStorage/log.file | mail -s "this is my subject line" chip@email.com
