First post.
Background: My wife is a photographer. She takes a lot of pictures and saves them to a drive (/mnt/STStorage) while she is editing them, but never cleans up afterward. I have another drive (/mnt/LTStorage) that I would like to move the folders to, based on modified date.
I need help with a script that I can add to a cron job to run once a day (30 1 * * *).
I would like the script to:
Check the root folders of /mnt/STStorage/ for their last modified date and, if older than 14 days, move each folder and everything in it to /mnt/LTStorage/ while keeping the same folder name.
Then write what was moved to a file in /mnt/STStorage/ so that we know what was moved, and email me a log of the folders moved.
OS: CentOS 6.4
Here is what I have; I think this may work for now, but it could be cleaner.
#!/bin/bash
# date stamp for the log file name
dt=$(date +%Y.%m.%d)
From="/mnt/STStorage/"
To="/mnt/LTStorage/"
if [[ ! -d "$To" ]]; then
    mkdir -p "$To"
fi
cd "$From" || exit 1
# move only the top-level folders older than 14 days; mv -v logs each move
find . -mindepth 1 -maxdepth 1 -type d -mtime +14 -exec mv -v "{}" "$To" \; > "$From/Moved.$dt.txt"
# mail the log as a uuencoded attachment
uuencode "$From/Moved.$dt.txt" "$From/Moved.$dt.txt" | mail -s "Files Moved" me@me.com
Then I will add this to crontab to run once a day.
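For reference, a sketch of the crontab entry for the 30 1 * * * schedule mentioned above (the script path /usr/local/bin/move-old-folders.sh is my assumption; use wherever you saved the script):

# m h dom mon dow  command -- runs daily at 01:30
30 1 * * * /usr/local/bin/move-old-folders.sh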
You can use -exec along with find. Something like:
find /mnt/STStorage/ -mindepth 1 -maxdepth 1 -type d -mtime +14 -exec mv {} /mnt/LTStorage/ \;
-type d will ensure only directories are moved, and -mindepth 1 -maxdepth 1 restricts the match to the top-level folders (without it, find would also match /mnt/STStorage/ itself and every nested subdirectory).
Another option is to use xargs:
find /mnt/STStorage/ -mindepth 1 -maxdepth 1 -type d -mtime +14 | xargs -I '{}' mv {} /mnt/LTStorage/
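If the folder names can contain spaces (likely for photo shoots), a null-delimited sketch of the same pipeline is safer:

find /mnt/STStorage/ -mindepth 1 -maxdepth 1 -type d -mtime +14 -print0 | xargs -0 -I '{}' mv '{}' /mnt/LTStorage/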
Update:
To log what is being moved, you can use mv's verbose option:
find /mnt/STStorage/ -mindepth 1 -maxdepth 1 -type d -mtime +14 -exec mv -v {} /mnt/LTStorage/ \;
Since this prints everything to standard output, you can redirect it to a log file:
find /mnt/STStorage/ -mindepth 1 -maxdepth 1 -type d -mtime +14 -exec mv -v {} /mnt/LTStorage/ \; > /mnt/STStorage/log.file
For emailing, you can do something like:
uuencode /mnt/STStorage/log.file /mnt/STStorage/log.file | mail -s "this is my subject line" chip@email.com
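If you would rather have the log inline in the message body instead of uuencoded as an attachment, something like this should also work:

mail -s "Files Moved" chip@email.com < /mnt/STStorage/log.file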
Related
I'm currently using a PHP script to back up my databases daily and it works like a charm. After the backup, I am using the shell script below to zip the backup file:
find /home/root/BACKUPS/backup.sql | xargs zip -czvPf /home/root/BACKUPS/$(date +%F)_backup.sql.zip
I need to add a piece of code that will also scan previously created "$(date +%F)_backup.sql.zip" and delete any that are older than 15 days.
Does anyone have any recommendations on how to make this work?
UPDATE 10/16/2019 1601HRS EST
find /home/root/BACKUPS/backup.sql | xargs zip -czvPf /home/root/BACKUPS/$(date +%F)_backup.sql.zip
find /home/root/BACKUPS/ -mtime +14 -type f -iname '*.backup.sql.zip' -exec rm {} \;
This did not remove the files that should have been removed. I'm not sure what I'm missing; maybe a ';' after the first line. Although the first line is running properly by zipping and naming the SQL file, the second line is not working.
I cannot comment yet.
I guess you want something like this shell command:
find /home/root/BACKUPS/ -mtime +15 -type f -iname '*.backup.sql.zip' -exec rm {} \;
Edit:
Some explanation: this finds and deletes (-exec rm {} \;) all files (-type f) whose names end in "backup.sql.zip" and whose modification time is more than 15 days old (-mtime +15).
Hope it helps.
This worked perfectly for me.
Backup.php
#!/usr/bin/php
<?php
$file = "/home/root/BACKUPS/backup.sql";
$command = "mysqldump -uroot_admin -pkeypw --all-databases > $file";
system($command);
?>
Backup.bat
find /home/root/BACKUPS/backup.sql | xargs zip -czvPf /home/root/BACKUPS/$(date +%F)_backup.sql.zip
find /home/root/BACKUPS -name "*.zip" -type f -mtime +15 -exec rm -f {} \;
Reference: https://tecadmin.net/delete-files-older-x-days/
Basically, @Sebastian, the .backup.sql part of the file name pattern did not have to be included. The backups are named like 2019-10-16_backup.sql.zip, so the glob *.backup.sql.zip (which requires a literal dot before "backup") never matches, while plain *.zip does. So instead of *.backup.sql.zip it needed to simply be *.zip. Thank you @Sebastian for the lead.
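If you want to keep the match tight so unrelated zip files in the same folder are never deleted, a pattern keyed to the underscore these backups actually use should also work (a sketch based on the file names above):

find /home/root/BACKUPS -name '*_backup.sql.zip' -type f -mtime +15 -exec rm -f {} \;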
I am a total noob, but I figured out this script for doing the following:
I have a folder called "unrar"; inside it are subfolders with unknown names, each with a rar file inside.
The script enters the unknown subfolder, finds the rar file, and unrars it in that subfolder.
After that, it finds the new file and renames it to the unknown subfolder's name, then grabs the file and moves it to ./unrar.
#!/bin/bash
cd /home/user/unrar/ || exit 1
# extract every rar archive inside each subfolder
for dir in /home/user/unrar/*; do
    (cd "$dir" && find . -name "*.rar" -execdir unrar e -r '{}' \;)
done
echo "$(tput setaf 2)-> unrar done!$(tput sgr0)"
# rename each extracted file to its subfolder's name, keeping the extension
for dir in /home/user/unrar/*; do
    (cd "$dir" && find . -name "*.mkv" -exec mv '{}' "${PWD##*\/}.mkv" \;)
done
for dir in /home/user/unrar/*; do
    (cd "$dir" && find . -name "*.mp4" -exec mv '{}' "${PWD##*\/}.mp4" \;)
done
for dir in /home/user/unrar/*; do
    (cd "$dir" && find . -name "*.avi" -exec mv '{}' "${PWD##*\/}.avi" \;)
done
# collect the renamed files into ./unrar
cd /home/user/unrar || exit 1
find . -name "*.mkv" -exec mv '{}' /home/user/unrar \;
find . -name "*.mp4" -exec mv '{}' /home/user/unrar \;
find . -name "*.avi" -exec mv '{}' /home/user/unrar \;
This works fine with most files, but in some cases it doesn't.
I want to find *.rar in DIR and unrar it. The new file (.mkv|.avi|.mp4) should be renamed to DIR(.mkv|.avi|.mp4) and moved to ./unrar.
This is my file structure:
./unrar/
- unknownsubfolder/
    - file.rar
    - file.r00
    - ...
- unknownsubfolder1/
    - s01/
        - file.rar
        - file.r00
        - ...
    - s02/
        - file.rar
        - file.r00
        - ...
    - ...
In case 1, the script unrars "/unknownsubfolder/file.rar" and gets "x.mkv". The file is renamed from "x.mkv" to "unknownsubfolder.mkv" and moved to "./unrar/unknownsubfolder.mkv"
(same with *.avi and *.mp4). == perfect
In case 2, my script unrars unknownsubfolder1/s01/file.rar, but the result is not renamed to s01.mkv; instead it becomes unknownsubfolder1.mkv.
(If there are more, like s02, s03, s04, ..., I always end up with a single unknownsubfolder1.mkv file in ./unrar.) == wrong output
So I guess I have 3 questions:
1. How do I get the right DIR name for renaming the file? Or how do I enter unknownsubfolder1/s01, and so on?
2. Is there a way to exclude a word from the find? Sometimes "unknownsubfolder" contains another folder and file called "sample(.mkv|.avi|.mp4)". I would like to exclude that, to prevent the original file from being overwritten by the sample file, which happens sometimes.
3. I am sure some of the code can be combined to make it even shorter; could someone explain how? For example, how do I combine the mkv, avi, and mp4 handling in one line?
regards, wombat
(EDIT: for better understanding)
UPDATE:
I adjusted the solution to work with unrar. Since I did not have unrar installed previously, I used gunzip to construct the solution and then simply replaced it with unrar. The problem with this approach was that, by default, unrar extracts to the current working directory. Another difference is that the name of the extracted file can be completely different from the archive's name; it is not just a matter of different extensions. The original archive is also not deleted after extraction.
Here is the solution specifically tailored to work with unrar with respect to the aforementioned behavior:
#!/bin/bash
path="$1"    # location of the unrar folder, e.g. /home/user/unrar
omit="$2"    # name of the folder to skip, e.g. sample

# extract every .rar next to its archive, skipping the omitted folders
while IFS= read -r f; do
    unrar e -r "${f}" "${f%/*}" > /dev/null
done < <(find "${path}" -type d -name "${omit}" -prune -o -type f -name '*.rar' -print)

# rename each extracted file to its parent folder's name (keeping its
# extension) and move it up into ${path}; skip the .rar/.r00 archive parts
while IFS= read -r f; do
    new="${f%/*}"       # strip the file name, leaving the parent directory
    new="${new##*/}"    # keep only the last path component
    mv "${f}" "${path}/${new}.${f##*.}"
done < <(find "${path}" -type d -name "${omit}" -prune -o -type f \! -name '*.rar' \! -name '*.r[0-9][0-9]' -print)
You can save the script, e.g., as rename-script (do not forget to make it executable), and then call it like
./rename-script /path/to/unrar omitfolder
Notice that inside the script there is no cd. You have to provide at least the location of the unrar folder as the first parameter, otherwise you will get an error. In OP's case this would be /home/user/unrar. The omitfolder is not a path; it is just the name of the folder that you want to omit. So in OP's case this would be sample.
./rename-script /home/user/unrar sample
As requested by OP in the comments, you can read about the bash read builtin and process substitution to understand how the while loop works and how it assigns the file names returned by find to the variable f.
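For illustration, here is the skeleton of that pattern on its own, a minimal sketch that just prints what find returns:

while IFS= read -r f; do
    printf 'found: %s\n' "$f"    # each line find prints lands in $f
done < <(find . -type f -print)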
I've got a script that finds files within folders older than 30 days:
find /my/path/*/README.txt -mtime +30
that'll then produce a result such as
/my/path/job1/README.txt
/my/path/job2/README.txt
/my/path/job3/README.txt
Now the part I'm stuck on: I'd like to remove the folders and files that are older than 30 days.
find /my/path/*/README.txt -mtime +30 -exec rm -r {} \;
doesn't seem to work; it only removes the README.txt file.
So ideally I'd like to just remove /job1, /job2, /job3 and any nested files.
Can anyone point me in the right direction?
This would be a safer way:
find /my/path/ -mindepth 2 -maxdepth 2 -type f -name 'README.txt' -mtime +30 -printf '%h\n' | xargs echo rm -r
Remove the echo once you have seen the output and it looks correct.
Here printf '%h\n' prints the directory containing each matching file, and xargs then processes that list.
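If the job directories can contain spaces, a null-delimited sketch of the same idea:

find /my/path/ -mindepth 2 -maxdepth 2 -type f -name 'README.txt' -mtime +30 -printf '%h\0' | xargs -0 echo rm -r

(Again, drop the echo once the output looks right.)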
You can just run the following command to recursively remove directories modified more than 30 days ago:
find /my/path/ -type d -mtime +30 -exec rm -rf {} \;
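Note that rm -rf here will make find complain as it tries to descend into directories it has just deleted; a sketch that sidesteps this by only looking at the top level:

find /my/path/ -mindepth 1 -maxdepth 1 -type d -mtime +30 -exec rm -rf {} +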
I have an onsite backup folder /backup/ that has an offsite rsynced copy mounted locally as /mnt/offsite/backup/. My onsite backup drive is getting full, so I'd like to delete files older than 365 days, but first check to see if that file exists offsite, and log to file the filenames that were removed (to exclude from rsync).
I've come close with this:
cd /mnt/offsite/backup && find . -type f -mtime +365 -exec rm /backup/{} \; | >> file.lst
However, the redirection isn't working. I've tried placing the >> in different places and can't get it to work with -exec in there. I've also tried using xargs rm, and can get the redirect working, but can't get xargs to delete from the second path:
cd /mnt/offsite/backup && find . -type f -mtime +365 >> file.lst | xargs rm /backup/
What's the best approach?
Hope this helps:
cd /mnt/offsite/backup && find . -type f -mtime +365 -exec rm /backup/{} \; -print >> file.lst
The -print only runs when the preceding -exec succeeded, so file.lst ends up listing exactly the files that were removed from /backup.
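If you want the offsite-existence check spelled out, a sketch with an explicit loop (paths as in the question):

cd /mnt/offsite/backup || exit 1
find . -type f -mtime +365 | while IFS= read -r f; do
    # find just listed "$f" under the offsite mount, so the offsite copy
    # exists; remove the onsite copy and log the relative path
    rm "/backup/$f" && printf '%s\n' "$f" >> /backup/file.lst
done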
I have the following simple script for backing up my website files and db. The script is run each day via a cron job.
#!/bin/sh
NOW=$(date +"%Y-%m-%d")
mysqldump --opt -h localhost -u username -p'password' dbname > /path/to/folder/backup/db-backup-$NOW.sql
gzip -f /path/to/folder/backup/db-backup-$NOW.sql
tar czf /path/to/folder/backup/web-backup-$NOW.tgz /path/to/folder/web/content/
It works great, but I don't want loads of old backups clogging my system. How can I modify the script to remove any backups older than a week when the script is run?
How about adding something like this:
find -ctime +7 -print0 | xargs -0 rm -v
find -ctime +7 -print0 finds all files whose status changed (the c) more than 7 days ago (+7) and prints them as a \0-separated list (-print0), which xargs -0 passes to rm -v as arguments.
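Be aware that with no starting path, this runs against the current directory and will delete every old file there, not just backups. A sketch anchored to the backup folder and its file names (the path comes from the script above):

find /path/to/folder/backup -type f \( -name 'db-backup-*.sql.gz' -o -name 'web-backup-*.tgz' \) -ctime +7 -print0 | xargs -0 rm -v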
With GNU find, you can use -delete:
find /path -type f -iname "*backup*gz" -mtime +7 -delete
Or you can terminate -exec with + instead of \; so find batches the arguments, much like xargs:
find /path -type f -iname "*backup*gz" -mtime +7 -exec rm -f "{}" +
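Putting it together, a sketch of the original script with the cleanup step appended (the 7-day window and the -delete approach are taken from the answers above):

#!/bin/sh
NOW=$(date +"%Y-%m-%d")
mysqldump --opt -h localhost -u username -p'password' dbname > /path/to/folder/backup/db-backup-$NOW.sql
gzip -f /path/to/folder/backup/db-backup-$NOW.sql
tar czf /path/to/folder/backup/web-backup-$NOW.tgz /path/to/folder/web/content/
# prune backups older than 7 days, restricted to this script's own file names
find /path/to/folder/backup -type f \( -name 'db-backup-*.sql.gz' -o -name 'web-backup-*.tgz' \) -mtime +7 -delete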