I had the same log rotation scripts in a Linux environment, and there they work. On Solaris I have problems running those scripts:
The main purpose of the scripts is to delete all logs older than 30 days and gzip all logs older than 5 days. -not -name is used because I want to operate only on rotated log files, for example something.log.20181102; the .log files are the current ones and I don't want to touch them.
#!/bin/bash
find ./logs -mindepth 1 -mtime +30 -type f -not -name "*.log" -delete
find ./logs -mtime +5 -not -name "*.log" -exec gzip {} \;
The problems occur with -mindepth and -not, which produce these errors:
find: bad option -not
find: [-H | -L] path-list predicate-list
Based on my searching I apparently have to use -prune somehow in the find, but I'm not sure how.
If you look at the man page for find(1) on Linux (or gfind(1) on Solaris), you'll see
-not expr
Same as ! expr, but not POSIX compliant.
So you should be able to replace -not with !, though you'll need to escape it from the shell, either with a backslash or with single quotes:
find ... \! -name "*.log" ...
Note that on Solaris, there's a command called logadm which is intended to help you take care of things like this, and may be worth investigating, unless you want to have the exact same behavior on both Solaris and Linux.
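For reference, a logadm setup for this job might look like the following /etc/logadm.conf entry. Treat it as a sketch: the path is the invented one from the question, and the flags (-p rotation period, -C copies kept, -z compress-after count) should be checked against logadm(1M) on your system:

```
# Hypothetical /etc/logadm.conf entry: rotate daily, keep 30 old copies,
# gzip copies older than 5 rotations.
/path/to/logs/something.log -p 1d -C 30 -z 5
```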
With the help of @Danek Duvall and some searching I got it working:
find ./logs -mtime +30 -type f ! -name "*.log" -exec rm -f {} \;
find ./logs -mtime +5 ! -name "*.log" -exec gzip {} \;
It deletes all log files that are older than 30 days and then zips the ones that are older than 5 days.
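As a sanity check, the two commands can be exercised in a throwaway directory. The file names and ages below are invented for the demo, and touch -d is a GNU extension, so this sketch runs on the Linux side rather than Solaris:

```shell
#!/bin/bash
# Demo only: invented file names; "touch -d" is GNU-specific.
set -e
tmp=$(mktemp -d)
mkdir "$tmp/logs"
touch "$tmp/logs/app.log"                            # current log: must not be touched
touch -d '10 days ago' "$tmp/logs/app.log.20181102"  # rotated, >5 days old: gets gzipped
touch -d '40 days ago' "$tmp/logs/app.log.20181001"  # rotated, >30 days old: gets deleted

cd "$tmp"
find ./logs -mtime +30 -type f ! -name "*.log" -exec rm -f {} \;
find ./logs -mtime +5 ! -name "*.log" -exec gzip {} \;
cd - >/dev/null
```

The current app.log survives, the 40-day-old file is deleted before the gzip pass, and the 10-day-old file ends up as app.log.20181102.gz.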
Related
I currently use a command that searches a directory and deletes 5 day old files.
find /path/to/files* -mtime +5 -exec rm {} \;
I run it from the command line and it works fine. But when I put it in a .sh file it says find /path/to/files*: No such file or directory.
There are only two lines in the shell script:
#! /usr/bin/env bash
find /path/to/files* -mtime +5 -exec rm {} \;
How can I rewrite the script so that it works?
The error happens if there are currently no files matching the wildcard, presumably because none have been created since you deleted them previously.
The argument to find should be the directory containing the files, not the filenames themselves, since find will automatically search the directory. If you want to restrict the filenames, use the -name option to specify the wildcard.
And if you don't want to go into subdirectories, use the -maxdepth option.
find /path/to -maxdepth 1 -type f -name 'files*' -mtime +5 -delete
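The failure mode is easy to reproduce in a scratch directory (paths invented for the demo): when nothing matches the glob, the shell passes the literal pattern through to find, which then fails; pointing find at the directory and filtering with -name works whether or not anything matches:

```shell
set -e
tmp=$(mktemp -d)

# Empty directory: "files*" matches nothing, so find receives the literal
# string and exits with "No such file or directory".
if find "$tmp"/files* -mtime +5 2>/dev/null; then glob_failed=no; else glob_failed=yes; fi

# Filtering with -name instead always works.
touch "$tmp/files_old" "$tmp/other"
matched=$(find "$tmp" -maxdepth 1 -type f -name 'files*')
```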
This works:
#! /usr/bin/env bash
find /home/ubuntu/directory -type f -name 'filename*' -mtime +4 -delete
Here is an example:
find /home/ubuntu/processed -type f -name 'output*' -mtime +4 -delete
I have a folder with 20k plus Images and most gui filemanagers (like dolphin) aren't able to manage this amount of data.
So I decided to use the bash instead. My problem is the following:
most of the files are *.IMG or *.LBL files
I am not interested in those files; I'm looking for the others
with find . -type f -not -name "*.LBL" I can see all files except the *.LBL ones
with find . -type f -not -name "*.IMG" I can see all files except the *.IMG ones
neither is very helpful on its own, since the output still fills my terminal
and combining both doesn't seem to work either:
find . -type f -not -name "*.LBL" -o -not -name "*.IMG"
What is the correct way to list the files inside a folder while excluding multiple file suffixes?
Group the conditions: without parentheses, the -o splits the expression in two, and a *.LBL file still matches the right-hand side (it isn't an *.IMG), so everything gets printed. Try this:
find . -type f -not \( -name "*.LBL" -o -name "*.IMG" \)
You can use bash's extended pattern matching (it might have to be turned on in a script with shopt -s extglob; it's usually enabled by default in an interactive shell):
printf "%s\n" !(*.LBL|*.IMG)
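Both approaches can be checked on a scratch directory (file names invented; -printf is GNU find, and the extglob pattern is run via bash -O extglob so the option is set before the pattern is parsed):

```shell
set -e
tmp=$(mktemp -d)
touch "$tmp/a.IMG" "$tmp/b.LBL" "$tmp/notes.txt"

# Grouped negation: everything that is neither *.LBL nor *.IMG.
via_find=$(find "$tmp" -type f -not \( -name "*.LBL" -o -name "*.IMG" \) -printf '%f\n')

# Same filter using extglob pattern matching.
via_glob=$(cd "$tmp" && bash -O extglob -c 'printf "%s\n" !(*.LBL|*.IMG)')
```

Both variables end up holding just notes.txt.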
I would like to delete old files from multiple directories, but one component of the path contains a wildcard, so I'm trying to loop through those directories without listing each one explicitly. I think I'm almost there, but I'm not sure how to cd into each specific directory and delete the relevant files.
#! /bin/bash
DELETE_SEARCH_DIR=/apps/super/userprojects/elasticsearch/v131/node*/elasticsearch-1.3.1/logs/
for entry in `ls $DELETE_SEARCH_DIR`; do
find $path -name "*super*" -type f -mtime +10 -print -delete
#find . -type f -name $entry -exec rm -f {} \;
done
Any ideas on how to get into the specific directory and apply the delete?
find can search in multiple directories. You can do it like this:
DELETE_SEARCH_DIR=/apps/super/userprojects/elasticsearch/v131/node*/elasticsearch-1.3.1/logs
find $DELETE_SEARCH_DIR -type f -name '*super*' -mtime +10 -print -delete
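For example, on invented paths (touch -d is GNU-specific): the variable is deliberately left unquoted so the shell expands node* into every matching logs directory before find runs:

```shell
set -e
tmp=$(mktemp -d)
mkdir -p "$tmp/node1/logs" "$tmp/node2/logs"
touch -d '20 days ago' "$tmp/node1/logs/super.2018.log" "$tmp/node2/logs/super.2018.log"
touch "$tmp/node1/logs/super.log"   # too recent: must survive

DELETE_SEARCH_DIR="$tmp/node*/logs"
# $DELETE_SEARCH_DIR is unquoted on purpose so the glob expands.
find $DELETE_SEARCH_DIR -type f -name '*super*' -mtime +10 -print -delete
```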
I have a directory with a few TB of files. I'd like to delete every file in it that is older than 14 days.
I thought I would use find . -mtime +13 -delete. To make sure the command works as expected, I ran find . -mtime +13 -exec /bin/ls -lh '{}' \; | grep '<today>'. The latter should return nothing, since files created or modified today should not be matched by find with -mtime +13. To my surprise, however, find just spewed out a list of all the files modified or created today!
find your/folder -type f -mtime +13 -exec rm {} \;
This works for me.
$ find ./folder_name/* -type f -mtime +13 -print | xargs rm -rf
The simplest solution is in @navid's and @gniourf_gniourf's comments. Because it's buried in the comments, I'd like to bring it up to make it more visible.
find your/folder -type f -mtime +13 -delete
This avoids any possible issues with spaces and other special characters in the filenames, and it doesn't spawn another process for each file, so it should be faster too.
I tried and tested this.
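A quick check on a scratch directory (names invented; touch -d is a GNU extension) also shows that -delete copes with a filename containing a space, which would trip up an unquoted xargs pipeline:

```shell
set -e
tmp=$(mktemp -d)
touch -d '20 days ago' "$tmp/old report.log"   # note the space in the name
touch "$tmp/new.log"                           # recent: must survive

find "$tmp" -type f -mtime +13 -delete
```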
I've got a script that finds files within folders older than 30 days:
find /my/path/*/README.txt -mtime +30
that'll then produce a result such as
/my/path/jobs1/README.txt
/my/path/job2/README.txt
/my/path/job3/README.txt
Now the part I'm stuck at is I'd like to remove the folder + files that are older than 30 days.
find /my/path/*/README.txt -mtime +30 -exec rm -r {} \;
doesn't seem to work; it only removes the README.txt file.
Ideally I'd like to just remove /job1, /job2, /job3 and any nested files.
Can anyone point me in the right direction?
This would be a safer way:
find /my/path/ -mindepth 2 -maxdepth 2 -type f -name 'README.txt' -mtime +30 -printf '%h\n' | xargs echo rm -r
Remove the echo once you've inspected the output and confirmed it's correct.
Here printf '%h\n' prints the directory containing each matched file, and xargs then processes those directories.
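With echo removed, the pipeline behaves like this on throwaway directories (names invented; -printf and xargs -r are GNU extensions, with -r keeping xargs from running rm with no arguments when nothing matches):

```shell
set -e
tmp=$(mktemp -d)
mkdir -p "$tmp/job1" "$tmp/job2"
touch -d '40 days ago' "$tmp/job1/README.txt"   # old: job1 gets removed
touch "$tmp/job2/README.txt"                    # recent: job2 survives

find "$tmp" -mindepth 2 -maxdepth 2 -type f -name 'README.txt' -mtime +30 \
    -printf '%h\n' | xargs -r rm -r
```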
You can just run the following command in order to recursively remove directories modified more than 30 days ago.
find /my/path/ -type d -mtime +30 -exec rm -rf {} \;