Unix shell script to delete older log files - shell

Could someone please give me the command to delete log files dated before yesterday (i.e., keeping today's and yesterday's files)?

You can use find with the -mtime option to get a list of files modified more than N days ago.
For example:
find . -maxdepth 1 -name '*.txt' -mtime +2
will match the *.txt files in the current directory last modified more than two full days ago (find counts age in whole 24-hour blocks, so in practice that means at least 72 hours).
You can add -delete to actually get rid of them.

find /path/to/files* -mtime +2 -delete
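For the original question (keep today's and yesterday's logs, delete the rest), a minimal sketch, assuming the logs live in a single directory and end in .log (both placeholders):

```shell
# Sketch: delete *.log files last modified 48 hours or more ago.
# find counts age in whole 24-hour blocks, so -mtime +1 means "2 full
# days or older", which keeps today's and yesterday's files.
prune_logs() {
    find "$1" -maxdepth 1 -name '*.log' -mtime +1 -delete
}

# Preview first by replacing -delete with -print:
# find /var/log/myapp -maxdepth 1 -name '*.log' -mtime +1 -print
```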

Related

Remove old files with date in the name

I have files with names like
filename_201702200800.tar.bz2
filename_201702201800.tar.bz2
and so on.
I am looking to remove files which are 5 days old or older.
The date in the filenames is in the format %Y%m%d%H%M.
As the files' timestamps correspond to the dates in their names, just use find (note that despite the name, -ctime is the inode change time, not creation time):
find /path/to/files -type f -ctime +5 -exec rm {} +
From man page:
-exec command {} +
This variant of the -exec action runs the specified command on the selected files, but the command line is built by appending each selected file name at the end; the total number of invocations of the command will be much less than the number of matched files. The command line is built in much the same way that xargs builds its command lines. Only one instance of '{}' is allowed within the command. The command is executed in the starting directory.
Not enough rep to comment.
Can you use the mtime of the files?
find /path/to/files -type f -mtime +5 -delete
Otherwise, you could calculate the dates to find with:
date -d "5 days ago" +%Y%m%d%H%M
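Putting that together, a sketch that compares the stamp embedded in each filename against that computed cutoff, rather than relying on filesystem times. GNU date (for -d) and the question's naming scheme are assumed:

```shell
# Sketch: remove files whose embedded %Y%m%d%H%M stamp is 5 days old or
# older, comparing names rather than mtimes. Assumes GNU date (-d) and
# names of the form filename_YYYYMMDDHHMM.tar.bz2.
prune_by_name() {
    cutoff=$(date -d '5 days ago' +%Y%m%d%H%M)
    for f in "$1"/filename_*.tar.bz2; do
        [ -e "$f" ] || continue
        stamp=${f##*_}       # drop everything up to the last underscore
        stamp=${stamp%%.*}   # drop the .tar.bz2 suffix
        # the format is fixed-width, so numeric comparison is safe
        if [ "$stamp" -le "$cutoff" ]; then
            rm -- "$f"
        fi
    done
}

# Usage: prune_by_name /path/to/files
```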

Find & delete folder (ubuntu server)

I have a backup system on my Ubuntu server that makes a database backup every day and saves it in a folder named after the day: $(date +%d%m%y)
But in the script, when I try to find and delete the folders from the last week, the command doesn't find any directory.
I'm trying with: find -name $(date +%d%m%y) -type d -mtime +7 -exec rm -r {} \;
It never finds a directory. I tried changing the -mtime value to 1 day or 2, but it finds nothing.
I think you made a small mistake:
When you back up on the 7th of May, you create a folder named 070515. When you search a week later, you look for a folder named 140515 that was modified more than 7 days ago. However, that folder was only created today.
You may not need the name of the folder, just use
find /backup/path -type d -mtime +7
to find all folders older than 7 days.
I suspect at least two errors in your find command:
The path where to search is missing: find /where/to/search -name ...
$(date +%d%m%y) always gives today's date. Naturally, directories named with today's date won't have a modification time of +1 or +7 days. Instead, try the following:
find /where/to/search -type d -mtime +7
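For a cron-friendly version of that, a sketch, with the backup path as a placeholder:

```shell
# Sketch: remove day-named backup directories older than 7 days without
# touching the parent. -mindepth 1 protects the starting directory
# itself; -maxdepth 1 keeps find from descending into the directories
# it is about to remove.
prune_old_dirs() {
    find "$1" -mindepth 1 -maxdepth 1 -type d -mtime +7 -exec rm -r {} +
}

# Usage: prune_old_dirs /backup/path
```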

Deleting oldest files with shell

I have a folder /var/backup where a cronjob saves a backup of a database/filesystem. It contains a latest.gz.zip and lots of older dumps named timestamp.gz.zip.
The folder is getting bigger and bigger, and I would like to create a bash script that does the following:
Keep latest.gz.zip
Keep the youngest 10 files
Delete all other files
Unfortunately, I'm not a good bash scripter so I have no idea where to start. Thanks for your help.
In zsh you can do most of it with glob qualifiers:
files=(*(.Om))
rm $files[1,-11]
Here *(.Om) matches plain files sorted oldest first, and [1,-11] selects everything except the 10 newest. Be careful with this command; you can check which files would be removed first with:
print -rl -- $files[1,-11]
You should learn to use the find command, possibly with xargs; something similar to
find /var/backup -type f -name 'foo' -mtime +20 -delete
or, if your find doesn't have -delete:
find /var/backup -type f -name 'foo' -mtime +20 -print0 | xargs -0 rm -f
(Note the +20: -mtime +20 matches files more than 20 days old, while -20 would match newer ones.) Of course you'll need to adapt this a lot; it's just to give you ideas.
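Combining the requirements into one script, a sketch, not the poster's finished solution; it assumes the filenames contain no newlines, since ls output is parsed line by line:

```shell
# Sketch: keep latest.gz.zip plus the 10 newest timestamped dumps in a
# directory and delete everything else. Runs in a subshell so the
# caller's working directory is untouched.
prune_backups() (
    cd "$1" || exit 1
    # ls -t sorts newest first; drop latest.gz.zip from the list, skip
    # the 10 newest remaining dumps, and delete the rest.
    ls -t -- *.gz.zip 2>/dev/null |
        grep -Fxv 'latest.gz.zip' |
        tail -n +11 |
        while IFS= read -r f; do
            rm -- "$f"
        done
)

# Usage: prune_backups /var/backup
```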

Recursively delete files in directory

How do I recursively delete files in directory that were modified more than 6 hours ago?
This example work for 1 day:
find /data2/input -type f -mtime +1 -delete -print
Use -mmin instead of -mtime. It lets you specify the number of minutes since the file was last modified. So, for files older than 6 hours:
find /data2/input -type f -mmin +360 -delete -print
Check the flags -cmin or -mmin in the manual page.
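Wrapped up with the arithmetic spelled out (6 hours × 60 = 360 minutes), a sketch using the question's path as a placeholder:

```shell
# Sketch: recursively delete files modified more than 6 hours
# (6 x 60 = 360 minutes) ago; -print echoes each path as it is deleted.
prune_recent() {
    find "$1" -type f -mmin +360 -delete -print
}

# Usage: prune_recent /data2/input
```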

Using find to delete all subdirectories (and their files)

I'm sure this is straight forward and answered somewhere, but I didn't manage to find what I was looking for. Basically, I'm trying to run a cron script to clear the contents of a given directory every 7 days. So far I have tried the following,
find /myDir -mtime 7 -exec rm -rf {} \;
This however also deletes the parent directory myDir, which I do not want. I also tried,
find /myDir -type f -type d -mtime 7 -delete
which appeared to do nothing. I also tried,
find /myDir -type d -delete
which deleted all but the parent directory just as I need. However, a warning message came up reading,
relative path potentially not safe
I'd appreciate if anyone can rectify my script so that it safely deletes all subdirectories in folder.
Many thanks. =)
UPDATE: I decided to go for the following,
find /myDir -mindepth 1 -mtime 7 -delete
Based upon what I learned from all who replied. Again, many thanks to you all.
Try:
find /myDir -mindepth 1 -mtime 7 -exec rm -rf {} \;
What about
cd myDir/ ; find . -type d -delete
assuming that you run this from myDir parent directory.
If you can't guarantee myDir exists, then this is safer:
cd myDir/ && find . -type d -delete
find /myDir -mindepth 1 -mtime 7 -delete
should probably be
find /myDir -mindepth 1 -mtime +7 -delete
(or maybe -mtime +6). The + means strictly more than 7 full days old, rather than exactly 7; if you want everything 7 days old or older, use +6.
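With that fix applied, a cron-friendly sketch; it uses -exec rm -rf rather than -delete, because deleting a directory's contents refreshes the directory's own mtime, which can leave the directory behind when -delete re-tests it:

```shell
# Sketch: clear everything directly under a directory that is more than
# 7 full days old, keeping the directory itself. rm -rf handles
# non-empty subdirectories, which plain -delete would not.
prune_tree() {
    find "$1" -mindepth 1 -maxdepth 1 -mtime +7 -exec rm -rf {} +
}

# Usage: prune_tree /myDir
```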
