I have a script that backs up my SVN repo to another server (set up as a cron job to run daily):
#!/bin/bash
svnadmin dump /path/to/repo | gzip > /backups/`date +%F`_repo.svn.gz
scp /backups/`date +%F`_repo.svn.gz user@ip:/backups/svn/
So example filenames:
2014-04-30_repo.svn.gz, 2014-04-29_repo.svn.gz, 2014-04-28_repo.svn.gz
Using bash, how would I go about removing backups older than 7 days?
This should work:
find /path/to/files -name '*_repo.svn.gz' -mtime +7 | xargs rm
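If your find is GNU find, the -delete action avoids the pipe to xargs entirely. A minimal sketch, assuming the backups live in /backups as in the script above:
find /backups -maxdepth 1 -name '*_repo.svn.gz' -mtime +7 -delete
-maxdepth 1 keeps it from descending into subdirectories, and the name filter guards against deleting anything other than the dated dumps.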
If you're trying to rely totally on the file name for the date, then something like this:
TODAY=$(date '+%s')
for f in /backups/*_repo.svn.gz ; do
    # Extract the YYYY-MM-DD portion of the filename
    DATESTR=$(echo "$f" | sed 's/^\/backups\/\(.*\)_repo\.svn\.gz/\1/')
    FILEDATE=$(date -d "$DATESTR" '+%s')
    # Delete if the filename date is more than 7 days in the past
    if ((FILEDATE + 7*24*60*60 < TODAY)) ; then
        rm "$f"
    fi
done
I am trying to rename all the files (folders not included) in a directory and its subdirectories by adding a timestamp at the front, with a bash script.
From this
/home/user/folder/A.log
/home/user/folder/folder1/B.txt
to
/home/user/folder/2018_5_26_12_10_38_A.log
/home/user/folder/folder1/2018_5_26_12_10_38_B.txt
This is the bash script that I have tried so far. I want the script to enter subdirectories; my current script won't enter sub-dirs:
cd /home/yolo/filename_test
for i in $(ls -al | grep '^-')
do
    mv -T "$i" "$(date -r "$i" +"%Y %m %d %H %M %S" | sed -e 's/ /_/g')_$i"
done
I would use:
find . -type f -execdir bash -c '
    echo mv {,$(date +"%Y_%m_%d_%H_%M_%S_")}"${1##*/}" ' _ {} \;
This finds only files; each time -execdir runs, it changes to the directory containing the matched file and renames that file by prepending today's date, one file at a time.
With ${1##*/} we remove the leading ./ from the filename; the brace expansion {,$(...)}"${1##*/}" expands to "${1##*/}" $(...)"${1##*/}", which is passed to mv to rename the file. The echo makes this a dry run.
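Once the dry-run output looks right, drop the echo to actually perform the renames:
find . -type f -execdir bash -c '
    mv {,$(date +"%Y_%m_%d_%H_%M_%S_")}"${1##*/}" ' _ {} \;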
If you need only files, recursively, you can use find (recursive by default) with -type f.
Something like this should suffice:
cd /home/yolo/filename_test
find . -type f -print0 | while IFS= read -r -d '' f; do
    mv -T "$f" "${f%/*}/$(date -r "$f" +"%Y %m %d %H %M %S" | sed -e 's/ /_/g')_${f##*/}"
done
I didn't touch the mv part; replace it with whatever you want.
Personally I would use something like:
mv "$f" "${f%/*}/$(date -r "$f" +"%Y%m%d_%H%M%S")_${f##*/}"
Keep in mind that you shouldn't use ls to list files in a loop, because a filename can contain any character (including *, \n, etc.) except NUL.
I have a folder that contains database backups, and I want to automate deleting the old ones using cron.
So I created the following script:
#Get the current year
YEAR=$(date +'%Y')
#Get the current month
MONTH=$(date +'%m')
#Delete data from previous months
deleteOldData() { ls /root/copy/dbbackup/smpp_credits/ | awk -F "-" -v m="$MONTH" '$2 < m' | xargs -d "\n" rm -rf ;}
#Delete data from previous years ( if any )
deletePrevYearData() { ls /root/copy/dbbackup/smpp_credits/ | awk -F "-" -v y="$YEAR" '$3 < y' | xargs -d "\n" rm -rf ;}
deleteOldData
deletePrevYearData
Executing ls /root/copy/dbbackup/smpp_credits/ | awk -F "-" -v m="$MONTH" '$2 < m' in the terminal works as expected (it lists the required files),
but upon appending | xargs -d "\n" rm -rf, the script runs and returns without any output, and checking the directory reveals that the files are still there. By the way, this code is stored in and executed from a .sh file.
Assuming GNU find and date, -newermt can be used to compare a file's modification time against a specific date given as an argument:
delete_older_than_date="$(date +'%Y-%m-01')"
find /root/copy/dbbackup/smpp_credits \
    -maxdepth 1 \
    -type f \
    '!' -newermt "$delete_older_than_date" \
    -exec rm -rf -- '{}' +
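To preview what would be removed before committing to rm, the same expression can end in -print instead:
find /root/copy/dbbackup/smpp_credits \
    -maxdepth 1 \
    -type f \
    '!' -newermt "$delete_older_than_date" \
    -print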
Parsing ls output is widely considered to be a bad idea. It is also why nothing is deleted here: ls prints bare filenames, so rm receives paths relative to the script's working directory rather than the backup directory, and -f silently suppresses the resulting errors. I would try a find command, which should be cleaner:
find /root/copy/dbbackup/smpp_credits/ -maxdepth 1 -type f -mtime +365 -exec rm -rf {} \;
You can use -mtime +30 for files that are older than one month.
I have CSV files that get updated every day; we process the files and delete those older than 30 days, based on the date in the filename.
Example filenames:
XXXXXXXXXXX_xx00xx_20171001.000000_0.csv
I would like to schedule a job in crontab to delete files older than 30 days daily.
Path could be /mount/store/
XXXXXXXXXXX_xx00xx_20171001.000000_0.csv
if [ $(date -d '-30 days' +%Y%m%d) -gt $D ]; then
    rm -rf $D
fi
The script above doesn't seem to help me. Kindly help me with this.
I have been trying this for the last two days.
Using CentOS 7.
Thanks.
For all files:
Extract the date
touch the file with that date
delete files with the -mtime option
Do this in the desired dir for all files:
f=XXXXXXXXXXX_xx00xx_20171001.000000_0.csv
d=$(echo "$f" | sed -r 's/[^_]+_[^_]+_(20[0-9]{6})\.[0-9]{6}_.\.csv/\1/')
touch -d "$d" "$f"
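A minimal sketch of that per-file step wrapped in a loop over the whole directory; the /mount/store/ path comes from the question, and the leading .* in the sed pattern is an adjustment so it also matches when a directory prefix is present:
for f in /mount/store/*.csv; do
    # Pull the 8-digit date out of the filename and stamp it onto the file's mtime
    d=$(echo "$f" | sed -r 's/.*_(20[0-9]{6})\.[0-9]{6}_.\.csv/\1/')
    touch -d "$d" "$f"
done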
After performing that for the whole dir, delete the older-thans:
find YourDir -type f -mtime +30 -name "*.csv" -delete
GNU find has the -delete option. Other finds might need -exec rm ... instead.
Test before running. Another pitfall is the different kinds of file dates affected by touch (mtime, ctime, atime).
Test, manipulating the date with touch:
touch XXXXXXXXXXX_xx00xx_20171001.000000_0.csv
f=XXXXXXXXXXX_xx00xx_20171001.000000_0.csv; d=$(echo $f | sed -r 's/[^_]+_[^_]+_(20[0-9]{6})\.[0-9]{6}_.\.csv/\1/'); touch -d $d $f
ls -l $f
-rw-rw-r-- 1 stefan stefan 0 Okt 1 00:00 XXXXXXXXXXX_xx00xx_20171001.000000_0.csv
An efficient way to extract the date from the filename is to use parameter expansions:
f=XXXXXXXXXXX_xx00xx_20171001.000000_0.csv
d=${f%%.*} # removes largest suffix .*
d=${d##*_} # removes largest prefix *_
Or use a bash-specific regex:
if [[ $f =~ [0-9]{8} ]]; then echo "$BASH_REMATCH"; fi
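A minimal sketch combining that extraction with the 30-day cutoff; the /mount/store/ path comes from the question and the glob is an assumption:
cutoff=$(date -d '-30 days' +%Y%m%d)
for f in /mount/store/*.csv; do
    # BASH_REMATCH holds the 8-digit date matched out of the filename
    if [[ $f =~ [0-9]{8} ]] && (( BASH_REMATCH < cutoff )); then
        rm -- "$f"    # swap rm for echo to preview first
    fi
done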
Here is a solution if you have dgrep from dateutils.
ls *.csv | dateutils.dgrep -i '%Y%m%d' --le $(date -d "-30 day" +%F) | xargs -d '\n' rm
First we can use either ls or find to obtain the list of filenames. We then pipe the results to dgrep to filter the filenames containing a date string that matches our condition (in this case, older than 30 days). Finally, we pipe the result to xargs rm to remove all the matched files.
-i '%Y%m%d' input date format as specified in your filename
--le $(date -d "-30 day" +%F) filter dates that are older than 30 days
You can change rm to printf "%s\n" to test the command before actually deleting it.
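For instance, the dry-run form of the same pipeline:
ls *.csv | dateutils.dgrep -i '%Y%m%d' --le $(date -d "-30 day" +%F) | xargs -d '\n' printf '%s\n'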
The following approach does not look at any timestamp information of the file; it assumes the date in the filename is unrelated to the day the file was created.
#!/usr/bin/env bash
d=$(date -d "-30 days" "+%Y%m%d")
for file in /yourdir/*csv; do
    # The 8-digit date starts 21 characters from the end of the name
    date=${file:$((${#file}-21)):8}
    (( date < d )) && rm "$file"
done
I have a bash script that rsyncs files onto my NAS to the directory below:
mkdir /backup/folder_`date +%F`
How would I go about writing a cleanup script that removes directories older than 7 days, based upon the date in the directory name?
#!/bin/bash
shopt -s extglob
OLD=$(exec date -d "now - 7 days" '+%s')
cd /backup || exit 1 ## If necessary.
while read DIR; do
    if read DATE < <(exec date -d "${DIR#*folder_}" '+%s') && [[ $DATE == +([[:digit:]]) && DATE -lt OLD ]]; then
        echo "Removing $DIR." ## Just an example message. Or we could just exclude this and add -v option to rm.
        rm -ir "$DIR" ## Change to -fr to skip confirmation.
    fi
done < <(exec find -maxdepth 1 -type d -name 'folder_*')
exit 0
We could actually use more careful approaches like -rd $'\0', -print0 and IFS= but I don't think they are really necessary this time.
Create a list of folders with the pattern you want to remove, remove the folders you want to keep from the list, delete everything else.
How about a simple find:
find /backup -name 'folder_*' -type d -ctime +7 -exec rm -rf {} \;
Is it possible to get the modification date and time of a folder?
I know you can use stat -f "%m" folder, but it doesn't reflect sub-files/folders changes.
Things that don't work:
ls -l folder - doesn't reflect changes inside the folder
stat -f "%m" folder - same as above
date -r folder - same again
find foo bar baz -printf - the printf option doesn't exist on my version of find
Versions of things:
OS: Mac OS X 10.7.1
Bash: GNU bash, version 3.2.48(1)-release (x86_64-apple-darwin11)
Solution:
find . -exec stat -f "%m" {} \; | sort -n -r | head -1
Explanation:
The find command traverses the current directory (.) and, for each file encountered, executes (-exec) the command stat -f "%m", which prints the file's last-modification Unix timestamp.
sort -n -r sorts the output of the find command numerically (-n) in reverse order (-r). This will list the latest modification timestamp first.
head -1 then extracts the first line of the output from sort. This is the latest modification unix timestamp of all the files.
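The question is about Mac OS X, where stat takes -f; on Linux with GNU coreutils the equivalent format flag is -c and the mtime field is %Y, so a sketch of the same pipeline there would be:
find . -exec stat -c '%Y' {} \; | sort -n -r | head -1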
You could try date -r folder to get the date last modified.
You could always get it from ls :
ls -ld mydir | awk -F' ' '{ print $6 " "$7 }'
If you need to clear the cache after a build, you can check the age of the last change and delete stale folders like this:
sh("find ~/Library/Developer/Xcode/DerivedData/ -type d -maxdepth 1 -mmin +360 -name 'Cache-folder-*' -print0 | xargs -0 -I {} /bin/rm -rf '{}' || echo 'There is no such folder! Start script execution' ; exit 0")
sh("find ~/Library/Developer/Xcode/DerivedData/ -type d -maxdepth 1 -mtime 0 -name 'Cache-folder-*' -ls -exec rm -r {} \\;")