Shell command to remove older files

I am making backups of a client's website to a remote FTP location. I have a script (usable without root access on cPanel) which makes backups on a given cron schedule and transfers them to the remote FTP location. Now the real problem starts: since we can't have unlimited gigabytes of disk space on any server, we have to limit the backups. I was looking for a shell command (one that can be added to a cronjob directly, or by creating a bash script and calling that script from cron). I want to keep one week's worth of daily backups, so any backup in that directory older than one week should be deleted. I found the following command, which looks promising:
find /path/to/files -mtime +30 -exec rm {}\;
But when I ran this command (for testing I replaced 'rm' with 'ls -l'), I got the following error:
find: missing argument to `-exec'
Can anybody help resolve this little issue?
I am running CentOS + cPanel
Thank you.

Maybe you just have to put a space after the closing brace:
find /path/to/files -mtime +30 -exec rm {} \;
I couldn't test on CentOS, but on my system it doesn't work if I don't put spaces around the braces.

The semi-colon must be a separate argument (and a week is 7 days):
find /path/to/files -mtime +7 -exec rm {} ';'
Actually, you would probably do better to use the notation + in place of ;, as that will combine as many file names as convenient into a single command execution, rather like xargs does but without invoking xargs. Hence:
find /path/to/files -mtime +7 -exec rm {} +
One other advantage of this is that there are no characters that must be protected from the shell.
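Putting those pieces together for the question above, a cron-ready sketch could look like the following; /path/to/backups and the '*.tar.gz' name pattern are assumptions, so adjust them to your actual backup directory and file names:
# delete backup archives older than 7 days; -type f and -name keep the
# command from touching directories or unrelated files
find /path/to/backups -type f -name '*.tar.gz' -mtime +7 -exec rm {} +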


"find: missing argument to -exec" when executing a shell script

I am trying to execute a shell script which has the following within it:
find /hana/shared/directory -type d -mtime +2 -exec rm -rf {} \;
This works on other SUSE Linux servers, but not on this one. It keeps returning the following:
find: missing argument to -exec
If, however, I place the same syntax into a terminal and run it manually, it runs without issue.
I can see this is a common issue, but I believe I have tried many of the suggestions to no avail and I'm a bit stuck now.
Very carefully read find(1), proc(5), and the GNU Bash documentation.
You might want to run (this is dangerous; see below):
find / -type d -mtime +2 -exec /bin/rm -rf '{}' \;
(use at least -ok instead of -exec)
And you probably want to clean just your $HOME.
But you should avoid removing files from /proc/, /sys/, /dev/, /lib/, /usr/, /bin/, and /sbin/. See hier(7) and environ(7).
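As suggested above, -ok behaves like -exec but asks for confirmation before running each command, so a safer trial run might look like the following (the directory under $HOME is only an illustrative assumption, and -ok requires the \; form rather than +):
# prompts before every removal; answer y or n for each matched directory
find "$HOME/old-dumps" -type d -mtime +2 -ok rm -rf '{}' \;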

How to make this bash script NOT delete all my files?

I have a cron job, every 5 minutes, backing up my MySQL to files ending in .sql.gz. But this is hundreds of files a day. So I searched the internet and found this bash script, which I expected to work only on the .sql.gz files in the /backup folder specified, but I soon found that it deleted everything in my root folder. :-) I was able to FTP the files back and have my site back up in half an hour, but I still need the script to work as intended. I'm new to bash scripting, so I'm asking: what did I do wrong in editing the script I found on the internet to my needs? What would work?
Here is the rogue script. DO NOT run this as is; it's broken, that's why I'm here:
find /home/user/backups/*.gz * -mmin +60 -exec rm {} \;
I'm suspecting that the last backslash should be /home/user/backups/
and also that I should remove the * before -mmin,
so what I need should be:
find /home/user/backups/*.gz -mmin +60 -exec rm {} /home/user/backups/;
Am I correct, or am I still missing something?
BTW, I'm running this on Dreamhost shared hosting cron. Their support doesn't really want to help with bash questions; I tried.
The filename arguments to find should be the directories to start the recursive search. Then use -name and other options to filter down to the files that match the criteria you want.
find /home/user/backups -type f -name '*.sql.gz' -mmin +60 -exec rm {} +
-type f means only select ordinary files
-name '*.sql.gz' means only filenames ending in .sql.gz
-mmin +60 means files more than 60 minutes old
And using + instead of \; at the end of -exec means that it should just run the command once with all the selected filenames, rather than separately for each filename; this is a minor efficiency improvement.
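Before putting that into cron, it is worth previewing what would be removed; swapping the -exec action for -print (which is also find's default action) lists the matches without deleting anything:
# dry run: prints the files the rm version would delete
find /home/user/backups -type f -name '*.sql.gz' -mmin +60 -print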

Linux command to copy recently created/updated files?

I want to copy recently created/updated files to another folder. Say, for example, the files created in the last 3 days should be copied to another folder (/tmp). How do I do that? Is it possible?
You can use the find command's -mtime test to find files that were last modified within a certain time, and then use its -exec action to copy them somewhere.
For example, this command will find files modified within three days in your current directory and copy them to your /tmp directory:
find . -mtime -3 -type f -exec cp "{}" /tmp \;
From find(1):
-mtime n
File's data was last modified n*24 hours ago. See the comments for -atime to understand how rounding affects the interpretation of file modification times.
-exec command ;
Execute command; true if 0 status is returned. All following arguments to find are taken to be arguments to the command until an argument consisting of ';' is encountered. The string '{}' is replaced by the current file name being processed everywhere it occurs in the arguments to the command, not just in arguments where it is alone, as in some versions of find. Both of these constructions might need to be escaped (with a '\') or quoted to protect them from expansion by the shell. See the EXAMPLES section for examples of the use of the -exec option. The specified command is run once for each matched file. The command is executed in the starting directory. There are unavoidable security problems surrounding use of the -exec action; you should use the -execdir option instead.
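Following the manual's advice to prefer -execdir, the same copy can be written as below; -execdir runs cp from the directory containing each matched file, so {} expands to ./filename:
# same selection as above, but cp is invoked from each file's own directory
find . -mtime -3 -type f -execdir cp "{}" /tmp \;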

Shell says find: missing argument to `-exec' and no alternatives working

A backup program I used recently made duplicates of a whole bunch of files throughout my computer because of some setting that I've since changed.
When the backup program made a copy, it renamed the old one to original1.thefilename.extension. I'm trying to automatically delete all of these unnecessary files with a simple shell command.
find -type f -name 'original1*' -exec rm {} \;
However, when I try to run this I get
find: missing argument to `-exec'
I've looked all over the web for a solution. I've found suggestions that I should try exec rm +, -exec rm {} +, -exec rm {} \;, -exec rm + etc. but none of them work. I am using Windows 8.1
I would really appreciate any help!
In Windows command shell, you don't need to escape the semicolon.
find -type f -name 'original1*' -exec rm {} ;
Your version of the command should work in a bash shell (like Cygwin).
It's interesting that you get GNU find to execute, because on my Windows 8.1 machine, I get Microsoft's find.
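If you are unsure which find is actually being picked up, you can ask the shell; from a bash shell such as Cygwin (an assumption about your setup) the lookup is:
# lists every 'find' bash would resolve, in search order
type -a find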

Deleting oldest files with shell

I have a folder /var/backup where a cronjob saves a backup of a database/filesystem. It contains a latest.gz.zip and lots of older dumps which are named timestamp.gz.zip.
The folder is getting bigger and bigger, and I would like to create a bash script that does the following:
Keep latest.gz.zip
Keep the youngest 10 files
Delete all other files
Unfortunately, I'm not a good bash scripter so I have no idea where to start. Thanks for your help.
In zsh you can do most of it with glob qualifiers:
files=(*(.Om))
rm $files[1,-11]
Here (.Om) expands to plain files sorted oldest first, and the slice 1,-11 covers everything except the 10 newest, so latest.gz.zip survives as long as it is among the 10 most recent files.
Be careful with this command; you can check what would be removed first with:
print -rl -- $files[1,-11]
You should learn to use the find command, possibly with xargs; that is, something similar to:
find /var/backup -type f -name 'foo' -mtime +20 -delete
or, if your find doesn't have -delete:
find /var/backup -type f -name 'foo' -mtime +20 -print0 | xargs -0 rm -f
Of course you'll need to improve this a lot; it is just to give you ideas.
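Since find alone cannot count files, a plain-bash sketch of the full requirement (keep latest.gz.zip plus the 10 newest timestamped dumps, delete the rest) might look like this; only the /var/backup path and the latest.gz.zip name come from the question, the rest is an assumption, and the pipeline would break on file names containing newlines:
#!/bin/bash
# list the dumps newest first, drop latest.gz.zip from the list so it is
# never touched, skip the 10 newest, and delete whatever is left
cd /var/backup || exit 1
ls -t -- *.gz.zip | grep -vxF 'latest.gz.zip' | tail -n +11 |
  while IFS= read -r f; do rm -- "$f"; done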
