PM2 reports errors when running a bash script

I keep receiving an error when running a bash script that deletes files over a certain number of days old.
However, this particular process's log file doesn't contain any entries. How do I confirm what went wrong with this process?
My bash script is the following:
#!/usr/bin/bash
find /home/collector/cloudtrail/ -mtime +3 -exec rm {} +
find /home/collector/stackdriver/ -mtime +3 -exec rm {} +
I tried looking at the process's logs with pm2 logs 8, but it doesn't show me anything.
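One way to see what the script itself is failing on, independent of what PM2 captures, is to redirect the script's own stderr to a file. A minimal sketch, with a hypothetical log path, and with -type f added so rm is never handed the starting directories themselves:
#!/usr/bin/bash
# Send the script's own error output to a log file we control
# (this log path is an assumption, not from the original post).
exec 2>>/home/collector/cleanup.log

# -type f keeps rm from failing on the starting directories.
find /home/collector/cloudtrail/ -type f -mtime +3 -exec rm {} +
find /home/collector/stackdriver/ -type f -mtime +3 -exec rm {} +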

Related

"find: missing argument to -exec" when executing a shell script

I am trying to execute a shell script which has the following within it:
find /hana/shared/directory -type d -mtime +2 -exec rm -rf {} \;
This works on other SUSE Linux servers, but not on this one. It keeps returning the following:
find: missing argument to -exec
If, however, I paste the same command into a terminal and run it manually, it runs without issue.
I can see this is a common issue, but I believe I have tried many of the suggestions to no avail, and I'm a bit stuck now.
Very carefully read find(1), proc(5), and the GNU Bash documentation.
You might want to run (this is dangerous; see below):
find / -type d -mtime +2 -exec /bin/rm -rf '{}' \;
(use at least -ok instead of -exec)
And you probably want to clean just your $HOME.
But you should avoid removing files from /proc/, /sys/, /dev/, /lib/, /usr/, /bin/, and /sbin/. See hier(7) and environ(7).
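Following that answer's own advice (clean only your $HOME, and confirm each removal), a safer variant might look like this sketch:
# -ok prompts before each rm; note that -ok only accepts \; (not +)
find "$HOME" -type d -mtime +2 -ok rm -rf '{}' \;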

How to make this bash script NOT delete all my files?

I have a cron job, every 5 minutes, backing up my MySQL to files ending in .sql.gz. But this is hundreds of files a day, so I searched the internet and found this bash script, which I expected to work only on the .sql.gz files in the /backup folder specified. Instead, I soon found that it deleted everything in my root folder. :-) I was able to FTP the files back and have my site back up in half an hour, but I still need the script to work as intended.
I'm new to bash scripting, so I'm asking: what did I do wrong in editing the script I found on the internet to my needs? What would work?
Here is the rogue script. DO NOT run this as is; it's broken, that's why I'm here:
find /home/user/backups/*.gz * -mmin +60 -exec rm {} \;
I'm suspecting that the last argument should be /home/user/backups/, and also that I should remove the * before -mmin, so what I need should be:
find /home/user/backups/*.gz -mmin +60 -exec rm {} /home/user/backups/;
Am I correct? Or am I still missing something?
BTW, I'm running this from cron on Dreamhost shared hosting. Their support doesn't really want to help with bash questions; I tried.
The filename arguments to find should be the directories to start the recursive search. Then use -name and other options to filter down to the files that match the criteria you want.
find /home/user/backups -type f -name '*.sql.gz' -mmin +60 -exec rm {} +
-type f means only select ordinary files
-name '*.sql.gz' means only filenames ending in .sql.gz
-mmin +60 means files more than 60 minutes old
And using + instead of \; at the end of -exec means that it should just run the command once with all the selected filenames, rather than separately for each filename; this is a minor efficiency improvement.
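To see that difference without deleting anything, echo can stand in for rm as a dry run; both lines below select the same files:
# with \; -- one rm invocation per file
find /home/user/backups -type f -name '*.sql.gz' -mmin +60 -exec echo rm {} \;
# with + -- filenames batched into as few rm invocations as possible
find /home/user/backups -type f -name '*.sql.gz' -mmin +60 -exec echo rm {} +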

Error with Unix shell script Find and delete based on hours

I'm trying to delete a folder's contents in Unix based on the date created, using the following statement in a shell script.
find /mypath -mmin +$((60*24*1)) -exec rm -rf {} \;
I have configured this script to run from Control-M. It deletes the folder contents, but the script ends with exit code 1. How can I avoid the error so that my job does not fail?
find: '/mypath/Xdb/20170802_001028': No such file or directory
find: '/mypath/Xdb/20170802_001027': No such file or directory
find: '/mypath/Xdb/20170801_142539': No such file or directory
I don't understand why you use
-mmin +$((60*24*1))
Use this instead:
find /mypath -mtime +1 -exec rm -rf {} \;
or this, which removes directories without checking whether they are empty:
find /mypath -mtime +7 -type d -print0 | xargs -0 rm -rf
+7 removes files older than 7 days; -7 matches files modified within the last 7 days.
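As for the "No such file or directory" messages: they most likely appear because rm -rf deletes a directory and find then tries to descend into it. Restricting find to the top level avoids that; a minimal sketch, assuming the dated folders sit directly under /mypath:
# -maxdepth 1 stops find from descending into directories rm has already removed
find /mypath -mindepth 1 -maxdepth 1 -type d -mtime +1 -exec rm -rf {} +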
In addition to the answer from SamOl, if you want to stick with your original command, you can tell Control-M to accept the string No such file or directory as "OK".
To do this, simply add an On/Do Action in the last tab of the job definition:
On Do Actions
Specific statement output
Statement = *
Code = *No such file or directory*
Do = Set job to OK
There is a YouTube clip here - https://www.youtube.com/watch?v=Y3S7GdAwjQ8

find and remove only if the file is present

I am working on a shell script to clear out files from a folder if they are older than 60 days, using the find command. I have one more requirement: if such files exist, the script should remove them, but if no files are older than 60 days, I don't want the command to print any error messages.
find AA* -mtime +60 -exec rm {} \;
I am using the command given above. It gives the error message "No such file or directory" if no files are older than 60 days, and also if no files start with "AA". I don't want any such messages reported for my command.
Please help.
TIA.
Perhaps you are missing the target, i.e. there are no AA* files in the current directory?
How about changing it to
find . -name 'AA*' -mtime +60 -exec rm {} \;
or, if you need to match only files or directories whose names start with "AA", you could use -regex:
find . -regex './AA.*' -mtime +60 -exec rm {} \;
I don't want any such messages reported for my command.
Redirect the command's standard error to /dev/null:
... 2>/dev/null
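Putting both suggestions together, a sketch that deletes the matching files and discards any remaining noise:
# 2>/dev/null hides errors such as unreadable directories or a non-matching pattern
find . -name 'AA*' -type f -mtime +60 -exec rm {} \; 2>/dev/null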

Shell Command to remove older files

I am making backups of a client's website to a remote FTP location. I have a script (usable without root access on cPanel) which makes backups on a given cron schedule and transfers them to the remote FTP location. Now the real problem starts: since we can't have unlimited gigabytes of disk space on any server, we have to limit the backups. I was looking for a shell command which can be added to the cron job directly, or put in a bash script called from cron. I want to keep one week's daily backups and delete any backup in that directory which is older than one week. I found the following command, which looks promising:
find /path/to/files -mtime +30 -exec rm {}\;
But when I ran this command (for testing I replaced 'rm' with 'ls -l'), I got the following error:
find: missing argument to `-exec'
Can anybody help resolve this little issue?
I am running CentOS + cPanel.
Thank you!
Maybe you just have to put a space after the closing brace:
find /path/to/files -mtime +30 -exec rm {} \;
I couldn't test on CentOS, but on my system it doesn't work unless I put spaces around the braces.
The semi-colon must be a separate argument (and a week is 7 days):
find /path/to/files -mtime +7 -exec rm {} ';'
Actually, you would probably do better to use the notation + in place of ';', as that will combine as many file names as convenient into a single command execution, rather like xargs does but without invoking xargs. Hence:
find /path/to/files -mtime +7 -exec rm {} +
One other advantage of this is that there are no characters that must be protected from the shell.
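For comparison, the same batching can be had by piping to xargs; a sketch using NUL separators so filenames containing spaces or newlines survive the trip:
# -print0 / -0 keep odd filenames intact; -r (GNU) skips the run if nothing matches
find /path/to/files -type f -mtime +7 -print0 | xargs -0r rm --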
