How to delete specific files in unix - bash

We have a few files on our server instance under the /wslogs/instance_name directory, and these are all log files created on a daily basis.
I am looking for a script to automatically delete those files based on date.
So let's say delete files older than 10 days. The problem is that the filename is not purely in date format; rather, it is
hostname_%m%d%Y_access.log and hostname_%m%d%Y_error.log
For example, ra70960708_12042016_access.log and ra70960708_12042016_error.log (where ra70960708 is the server name or hostname).
I'm trying to use the rm command, but I am unable to figure out how to specify the files here if, say, I have to delete those which are more than 10 days older than the current date.
Any help would be greatly appreciated.
Cheers,
Ashley

Forget about the name, and use the modification time instead:
The command below will list files in the current directory that match the glob hostname_*_error.log and which were last modified more than 10 days ago:
find . -maxdepth 1 -mindepth 1 \
-type f -name 'hostname_*_error.log' \
-mtime +10
They can then be deleted with -delete.
. is the directory to search in.
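For example, a minimal sketch adapted to the directory and both filename patterns from the question (the path and the 10-day threshold come from the question; run it with -print first to preview the matches before switching to -delete):
# Delete access and error logs older than 10 days (GNU find).
find /wslogs/instance_name -maxdepth 1 -type f \
    \( -name '*_access.log' -o -name '*_error.log' \) \
    -mtime +10 -delete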

Related

Delete files in a directory older than the current month in shell

I want to know how to delete the files in a directory that are older than the current month.
I have tried to work this out myself without success.
Assuming 30 days per month, use find -mtime +30. From the manual:
-mtime n
File's data was last modified n*24 hours ago. See the comments
for -atime to understand how rounding affects the interpretation
of file modification times.
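Note that "older than the current month" is not exactly 30 days. A more precise sketch uses GNU find's -newermt test with the first day of the current month (the path is a placeholder; preview with -print before adding -delete):
# Delete files last modified before the first of the current month.
find /path/to/dir -type f ! -newermt "$(date +%Y-%m-01)" -delete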

Shell - Delete files in directory after 3 months

All,
I need to delete files within a directory that are older than a certain date. I have a function of the form:
Example file: my_test_file_2015_04_01.log
Example function
rem_files $my_directory my_test_file_*.log
How would I remove all files older than 3 months?
function rem_files
{
?????????
}
If you can trust the file's modification time, you can use find -name 'my_test_file_*.log' -mtime +90 to find files older than 90 days (quote the pattern so the shell does not expand it before find sees it).
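A minimal sketch of rem_files built on that approach (the 90-day threshold and -maxdepth 1 are assumptions; replace -delete with -print to preview first):
function rem_files
{
    # $1 = directory, $2 = filename pattern (quoted by the caller)
    local dir=$1 pattern=$2
    find "$dir" -maxdepth 1 -type f -name "$pattern" -mtime +90 -delete
}

# Usage: quote the pattern so the shell passes it to find unexpanded.
rem_files "$my_directory" 'my_test_file_*.log'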

Bash Script to Check That Files Are Being Created

We have an Amazon EC2 instance where we upload output from our security cameras. Every now and then, the cameras have an issue, stop uploading, and need to be rebooted. The easy way for us to determine this is by seeing whether the files are being created. The problem is it creates lots and lots of files. If I use find with -ctime, it takes a very long time for this script to run. Is there a faster way to check whether files have been created since yesterday? I just need to capture the result (yes there are some files, or no there are not) and email a message, but it would be nice to have something that didn't take half an hour to run.
#!/bin/bash
find /vol/security_ftp/West -ctime -1
find /vol/security_ftp/BackEntrance -ctime -1
find /vol/security_ftp/BoardroomDoor -ctime -1
find /vol/security_ftp/MainEntrance -ctime -1
find /vol/security_ftp/North -ctime -1
find /vol/security_ftp/South -ctime -1
Using find is a natural solution, but if you really must avoid it, you can see the newest file in a directory by using ls and sorting the output by ctime, e.g.:
ls -clt /vol/security_ftp/West | head --lines=2 | tail --lines=1
(Two lines of head are needed because the first line of ls -l output is the "total" summary.)
This would be enough if you want to see the date.
If you need better formatted output (or only ctime to process it further) you can feed the filename to stat:
stat --format="%z" "/vol/security_ftp/West/$(ls -ct /vol/security_ftp/West | head --lines=1)"
This does not answer automatically if any file was created recently, though.
The simple solution (recommended by man find) is:
find /vol/security_ftp/ -mtime 0
To find files in /vol/security_ftp modified within the last 24 hours. Give it a try and see if it will meet your time requirements. We can look for another solution if the default can't do it quickly enough. If the delay is due to numerous subdirectories under /vol/security_ftp, then limit the depth and type with:
find /vol/security_ftp/ -maxdepth 1 -type f -mtime 0
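If a yes/no answer plus an email is all that is needed, GNU find's -print -quit stops at the first recent file, which keeps the scan fast even in large trees. A sketch (the directory list, the mail command, and the recipient address are illustrative assumptions):
#!/bin/bash
# Alert when a camera directory has had no uploads in the last 24 hours.
for dir in /vol/security_ftp/West /vol/security_ftp/North; do
    # -print -quit stops the scan at the first recent file found.
    if [ -z "$(find "$dir" -maxdepth 1 -type f -mtime 0 -print -quit)" ]; then
        echo "No files uploaded to $dir in the last 24 hours" |
            mail -s "Camera upload alert: $dir" admin@example.com
    fi
done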

Bash script to find file older than X days, then subsequently delete it, and any files with the same base name?

I am trying to figure out a way to search a directory for a file older than 365 days. If it finds a match, I'd like it to both delete the file and locate any other files in the directory that have the same basename, and delete those as well.
File name examples: 12345.pdf (Search for) then delete, 12345_a.pdf, 12345_xyz.pdf (delete if exist).
Thanks! I am very new to BASH scripting, so patience is appreciated ;-))
I doubt this can be done cleanly in a single pass.
Your best bet is to use -mtime or a variant to collect names and then use another find command to delete files matching those names.
UPDATE
With respect to your comment, I mean something like:
# find basenames of old files
find .... -printf '%f\n' | sort -u > oldfiles
for file in $(<oldfiles); do find . -name "$file" -exec rm {} \; ; done
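A fuller sketch of that two-pass idea, adapted to the naming scheme in the question (the directory, the 365-day threshold, and the rule that siblings look like <base>_*.pdf are taken from the question; swap rm for echo to preview):
# Pass 1: find base files (no underscore) older than 365 days;
# pass 2: delete each base file and its "<base>_*.pdf" siblings.
find /path/to/dir -maxdepth 1 -type f -name '*.pdf' ! -name '*_*' -mtime +365 -printf '%f\n' |
while read -r name; do
    base=${name%.pdf}    # 12345.pdf -> 12345
    rm -f "/path/to/dir/$base.pdf" /path/to/dir/"$base"_*.pdf
done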

Need to find files where the file is named for an API-generated serial number + YYYYMMDD

I need to find files named with an API-generated serial number + YYYYMMDD, followed by API timestamp data.
Examples:
send0167663112011110414150180.xml --Created 20111104
send0148402292010121812300296.xml --Created 20101218
send0152858682009032013000173.xml --Created 20090320
Many thanks in advance.
You did not specify which language you are trying this in.
If you are using C#, use DirectoryInfo.GetFiles on the directory of interest, get the file info of the files inside the directory using FileInfo, and compare the Name property with what you want.
This will find files whose names contain the date and meet the other criteria (in the examples the date is followed by further timestamp data, so it is not at the end of the name):
date=20111104
find . -type f -name "*${date}*.xml" -mtime +142 -exec ls -ltr {} \;
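If the layout is as regular as the examples suggest (send, then a nine-digit serial, then YYYYMMDD, then trailing time data), the date can be anchored by position instead; the nine-digit assumption is inferred from the three sample names only:
date=20111104
# "send" + exactly nine serial digits + the date + anything + .xml
find . -maxdepth 1 -type f -name "send?????????${date}*.xml"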
