Deleting files from an AIX system - shell

We have an AIX system which receives files on a daily basis, so we manually delete the previous day's files. Is it possible to write a script that will find the files from 15 or 20 days before today and delete them from the folder?

Or you can use the native AIX find command:
find /dir/to/files -type f -mtime +15 -exec rm {} \;
where:
-type f - find only files, not directories
-mtime +15 - find files whose modification time is more than 15 days ago
-exec rm {} \; - run rm on each matched file
You can first run this command with -exec ls -l {} \; instead, to check that the matched files really are the ones you want to delete.
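A minimal sketch of that test-then-delete workflow, using the same placeholder path as above:
# dry run: list what matches the age criteria
find /dir/to/files -type f -mtime +15 -exec ls -l {} \;
# once the listing looks right, delete
find /dir/to/files -type f -mtime +15 -exec rm {} \;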

If you can (or are allowed to) install GNU find, then it's simple, e.g.:
#!/bin/sh
cd /var/log/apache
gfind . -name '*log*Z' -mtime +30 -delete
this script is run by cron; a line from crontab:
02 23 1 * * /root/cmd/httpd.logdelete >/dev/null 2>&1
Edit:
-mtime +N matches files whose last modification time is earlier than N days before now (here N is 30)
-delete deletes the files that match the criteria
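For reference, a dry-run sketch of the same script (same assumed /var/log/apache layout) that only prints what would be deleted instead of removing it:
#!/bin/sh
cd /var/log/apache || exit 1
# list the candidates; swap -print back to -delete once the output looks right
gfind . -name '*log*Z' -mtime +30 -print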

Related

Delete *.ZIP files older than 15 days

I'm currently using a PHP script to back up my databases daily and it works like a charm. After the backup, I am using the shell script below to zip the backup file:
find /home/root/BACKUPS/backup.sql | xargs zip -czvPf /home/root/BACKUPS/$(date +%F)_backup.sql.zip
I need to add a piece of code that will also scan previously created "$(date +%F)_backup.sql.zip" and delete any that are older than 15 days.
Does anyone have any recommendations on how to make this work?
UPDATE 10/16/2019 1601HRS EST
find /home/root/BACKUPS/backup.sql | xargs zip -czvPf /home/root/BACKUPS/$(date +%F)_backup.sql.zip
find /home/root/BACKUPS/ -mtime +14 -type f -iname '*.backup.sql.zip' -exec rm {} \;
This did not remove the files that should have been removed. I'm not sure what I'm missing; maybe a ';' after the first line. Although the first line is running properly by zipping and naming the SQL file, the second line is not working.
I cannot comment yet, so I'm posting this as an answer.
I guess you want something like this shell-command:
find /home/root/BACKUPS/ -mtime +15 -type f -iname '*.backup.sql.zip' -exec rm {} \;
Edit:
Some explanation: this finds and deletes (-exec rm {} \;) all regular files (-type f) whose name ends in backup.sql.zip (-iname '*.backup.sql.zip') and whose modification time is more than 15 days ago (-mtime +15).
Hope it helps.
This worked perfectly for me.
Backup.php
#!/usr/bin/php
<?php
$file = "/home/root/BACKUPS/backup.sql";
$command = "mysqldump -uroot_admin -pkeypw --all-databases > $file";
system($command);
?>
Backup.bat
find /home/root/BACKUPS/backup.sql | xargs zip -czvPf /home/root/BACKUPS/$(date +%F)_backup.sql.zip
find /home/root/BACKUPS -name "*.zip" -type f -mtime +15 -exec rm -f {} \;
Reference: https://tecadmin.net/delete-files-older-x-days/
Basically, @Sebastian, the .backup.sql part of the file name did not have to be included. So instead of *.backup.sql.zip it needed to simply be *.zip. Thank you @Sebastian for the lead.
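If a tighter match than *.zip is preferred, a sketch of a pattern that fits the names generated by the zip line above (YYYY-MM-DD_backup.sql.zip, since date +%F produces YYYY-MM-DD) would be:
# matches 2019-10-16_backup.sql.zip etc., but not unrelated .zip files
find /home/root/BACKUPS -name '*_backup.sql.zip' -type f -mtime +15 -exec rm -f {} \;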

Delete folders older than 1 day not working with "find" cmd

I'm trying to delete backup folders older than 1 day (creation date) with the find command, but it's not working.
Folder ls -l:
drwxrws---+ 2 root data 42 Mai 15 16:46 15-05-2019
drwxrws---+ 2 root data 89 Mai 16 14:19 16-05-2019
The creation date is May 15.
This cmd should work:
find /data/backup/VMs/centos/ -type d -mtime +1 -exec rm {} \;
I tried with this first to see what happens before the remove:
find /data/backup/VMs/centos/ -type d -mtime +1 -exec ls {} \; >> find_test.txt
It should write the folders to delete to the file, but the txt file is empty.
Besides using find, how can I remove these folders using the date in the name?
rm normally doesn't print to standard output; however, if an error occurs it prints to standard error, which can also be redirected to another file, or to the same one by duplicating the file descriptor with 2>&1:
find /data/backup/VMs/centos/ -type d -mtime +1 -exec ls {} \; >> find_test.txt 2>&1
To print the names, find's -print action can be used. find also has the -delete and -ls actions (the latter is not exactly the same as ls), which avoid executing a command for each file:
find /data/backup/VMs/centos/ -type d -mtime +1 -print -delete >> find_test.txt 2>&1
Be careful before using -delete, to avoid losing files you did not intend to delete.
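To answer the side question about using the date in the folder name instead of mtime, here is a sketch, assuming GNU date (for the -d option) and the DD-MM-YYYY naming shown above:
#!/bin/sh
# Delete backup folders whose DD-MM-YYYY name is older than one day
base=/data/backup/VMs/centos
cutoff=$(date -d '1 day ago' +%s)
for dir in "$base"/*/; do
    name=$(basename "$dir")                  # e.g. 15-05-2019
    day=${name%%-*}
    rest=${name#*-}
    month=${rest%%-*}
    year=${rest#*-}
    dirdate=$(date -d "$year-$month-$day" +%s) || continue
    if [ "$dirdate" -lt "$cutoff" ]; then
        echo rm -r "$dir"                    # drop echo once the output looks right
    fi
done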

Unix - arg list too long

Using bash 3.2. Trying to delete some log files older than 7 days... anyway, this command works on another server but not on the current one.
Wondering if anyone can fix the syntax for me as I'm no Unix expert:
find /export/home1/dir1/dir2/sync/logs/* -mtime +7 -exec rm -f {} \;
Remove the * from the find path:
find /export/home1/dir1/dir2/sync/logs/ -mtime +7 -exec rm -f {} \;
or, if you have a newer find version:
find /export/home1/dir1/dir2/sync/logs/ -mtime +7 -delete
With * in the path, the shell expands it to every entry in the directory before find even runs, and that expanded argument list is what triggers the "arg list too long" error.
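For very large directories, a batched sketch that also avoids one rm per file (assuming find and xargs support -print0/-0, as GNU findutils does):
find /export/home1/dir1/dir2/sync/logs/ -type f -mtime +7 -print0 | xargs -0 rm -f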

Trying to remove a file and its parent directories

I've got a script that finds files within folders older than 30 days:
find /my/path/*/README.txt -mtime +30
that'll then produce a result such as
/my/path/job1/README.txt
/my/path/job2/README.txt
/my/path/job3/README.txt
Now the part I'm stuck at is I'd like to remove the folder + files that are older than 30 days.
find /my/path/*/README.txt -mtime +30 -exec rm -r {} \;
doesn't seem to work; it only removes the README.txt file.
So ideally I'd like to just remove /job1, /job2, /job3 and any nested files.
Can anyone point me in the right direction?
This would be a safer way:
find /my/path/ -mindepth 2 -maxdepth 2 -type f -name 'README.txt' -mtime +30 -printf '%h\n' | xargs echo rm -r
Remove echo if you find it already correct after seeing the output.
Here -printf '%h\n' prints the directory containing each matched file, and xargs then processes those directories.
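If the directory names may contain spaces or newlines, a null-delimited sketch of the same idea (relies on GNU find's -printf and xargs -0):
find /my/path/ -mindepth 2 -maxdepth 2 -type f -name 'README.txt' -mtime +30 -printf '%h\0' | xargs -0 echo rm -r
Again, drop echo once the printed list looks right.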
You can just run the following command in order to recursively remove directories modified more than 30 days ago.
find /my/path/ -type d -mtime +30 -exec rm -rf {} \;

Shell script to delete directories older than n days

I have directories named as:
2012-12-12
2012-10-12
2012-08-08
How would I delete the directories that are older than 10 days with a bash shell script?
This will do it recursively for you:
find /path/to/base/dir/* -type d -ctime +10 -exec rm -rf {} \;
Explanation:
find: the unix command for finding files / directories / links etc.
/path/to/base/dir: the directory to start your search in.
-type d: only find directories
-ctime +10: only consider those whose inode change time (ctime) is more than 10 days ago; note that ctime is change time, not modification time; use -mtime if you want modification time
-exec ... \;: for each such result found, do the following command in ...
rm -rf {}: recursively force remove the directory; the {} part is where the find result gets substituted into from the previous part.
Alternatively, use:
find /path/to/base/dir/* -type d -ctime +10 | xargs rm -rf
Which is a bit more efficient, because it amounts to:
rm -rf dir1 dir2 dir3 ...
as opposed to:
rm -rf dir1; rm -rf dir2; rm -rf dir3; ...
as in the -exec method.
With modern versions of find, you can replace the ; with + and it will do the equivalent of the xargs call for you, passing as many files as will fit on each exec system call:
find . -type d -ctime +10 -exec rm -rf {} +
If you want to delete all subdirectories under /path/to/base, for example
/path/to/base/dir1
/path/to/base/dir2
/path/to/base/dir3
but you don't want to delete the root /path/to/base, you have to add -mindepth 1 and -maxdepth 1 options, which will access only the subdirectories under /path/to/base
-mindepth 1 excludes the root /path/to/base from the matches.
-maxdepth 1 will ONLY match subdirectories immediately under /path/to/base such as /path/to/base/dir1, /path/to/base/dir2 and /path/to/base/dir3 but it will not list subdirectories of these in a recursive manner. So these example subdirectories will not be listed:
/path/to/base/dir1/dir1
/path/to/base/dir2/dir1
/path/to/base/dir3/dir1
and so forth.
So, to delete all the subdirectories under /path/to/base which are older than 10 days:
find /path/to/base -mindepth 1 -maxdepth 1 -type d -ctime +10 | xargs rm -rf
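As a side note, a plain xargs pipe like the one above will mis-handle directory names containing spaces; a null-delimited sketch (assuming GNU find and xargs) is safer:
find /path/to/base -mindepth 1 -maxdepth 1 -type d -ctime +10 -print0 | xargs -0 rm -rf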
find supports the -delete action, so:
find /base/dir/* -ctime +10 -delete;
I think there's a catch that the files inside need to be 10+ days old too. I haven't tried it; someone may confirm in the comments.
The most voted solution here is missing -maxdepth 0 so it will call rm -rf for every subdirectory, after deleting it. That doesn't make sense, so I suggest:
find /base/dir/* -maxdepth 0 -type d -ctime +10 -exec rm -rf {} \;
The -delete solution above doesn't use -maxdepth 0 because find would complain the dir is not empty. Instead, it implies -depth and deletes from the bottom up.
I was struggling to get this right using the scripts provided above and some other scripts, especially when file and folder names had newlines or spaces.
I finally stumbled on tmpreaper and it has worked pretty well for us so far.
tmpreaper -t 5d ~/Downloads
tmpreaper --protect '*.c' -t 5h ~/my_prg
It has features like a test mode, which checks the directories recursively and lists them.
It can delete symlinks, files or directories, and also offers a protection mode for a given pattern while deleting.
OR
rm -rf `find /path/to/base/dir/* -type d -mtime +10`
Updated, faster version of it:
find /path/to/base/dir/* -mtime +10 -print0 | xargs -0 rm -f
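Whichever variant you pick, a cautious habit is to print first and delete only after checking the list; a sketch using the same placeholder path and a find that supports -mindepth/-maxdepth, as used above:
find /path/to/base/dir -mindepth 1 -maxdepth 1 -type d -mtime +10 -print
# once the listing looks right, run the delete step
find /path/to/base/dir -mindepth 1 -maxdepth 1 -type d -mtime +10 -exec rm -rf {} +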
