Cron Job - Command to delete all .flv files every day - shell

I have this command that I run every day via cron:
find /home/get/public_html/videos -daystart -maxdepth 0 -mtime +1 -type f -name "*.flv" | xargs rm -f
The problem is that it doesn't delete the .flv files in a directory that are 1 or more days old.
How can I correct the above command?
EDIT: Paul - the command "ls -l /home/get/public_html/videos" results in 2000+ files, but here are two of them that should be deleted:
-rw-r--r-- 1 get get 3501188 Jan 4 15:24 f486cf0a2b6bb40e4c50c991785084131231104229.flv
-rw-r--r-- 1 get get 10657314 Jan 4 17:51 f5f1490ddaa11a663686f9d06fb37d981231112941.flv

It's better to use -print0 on find and -0 in xargs in case one file has an uncommon name.
Also, you want to use -maxdepth 1 to actually find something in the specified directory.
-maxdepth 0 means find only examines the paths listed on the command line; it won't check the contents of those directories.
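Putting those two fixes together, the cron command could look something like this (an untested sketch, keeping the same path and age as in the question):
find /home/get/public_html/videos -daystart -maxdepth 1 -mtime +1 -type f -name "*.flv" -print0 | xargs -0 rm -f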

Do you mean that if you have a directory /home/get/public_html/videos/foo, it doesn't delete the files in it? That would be because you have the -maxdepth 0 argument set, which prevents find from descending into subdirectories.

-maxdepth 1

Related

Check which file in a directory is the most recent in a Bash shell script

I am making a bash script to run in a directory with files generated every day and copy the most recent file to another directory.
I am using this now
for [FILE in directory]
do
if [ls -Art | tail -n 1]
something...
else
something...
fi
done
I know this is not right. I would like to compare the modification date of the files with the current date and, if they are equal, copy that file.
How would that work or is there an easier method to do it?
We could use find:
find . -maxdepth 1 -daystart -type f -mtime -1 -exec cp -f {} dest \;
Explanation:
-maxdepth 1 limits the search to the current directory.
-daystart sets the reference time of -mtime to the beginning of today.
-type f limits the search to files.
-mtime -1 limits the search to files that have been modified less than 1 day from reference time.
-exec cp -f {} dest \; copies the found files to directory dest.
Note that -daystart -mtime -1 means any time after today at 00:00 (included), but also tomorrow or any time in the future. So if you have files with a last modification time in the year 2042, they will be copied too. Use -mtime 0 if you prefer copying files that have been modified between today at 00:00 (excluded) and tomorrow at 00:00 (included).
Note also that all this could be impacted by irregularities like daylight saving time or leap seconds (not tested).
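As a concrete sketch of that stricter variant, using the same dest directory as above:
find . -maxdepth 1 -daystart -type f -mtime 0 -exec cp -f {} dest \;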
The newest file is different from file(s) modified today.
Using ls is actually a pretty simple and portable approach. The stdout output format is defined by POSIX (if not printing to a terminal), and ls -A is also in newer POSIX standards.
It should look more like this though:
newest=$(ls -At | head -n 1)
You could add -1, but AFAIK it shouldn't be required, as it's not printing to a terminal.
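If the goal is then to copy that newest entry to another directory (dest is just a placeholder here), a minimal sketch could be:
newest=$(ls -At | head -n 1)
[ -n "$newest" ] && cp -- "$newest" dest/
The test on $newest skips the copy entirely if the directory is empty.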
If you don't want to use ls, you can use this on Linux:
find . -mindepth 1 -maxdepth 1 -type f -exec stat -c '%Y:%n' {} + |
sort -n |
tail -n 1 |
cut -d : -f 2-
Note the use of 2- rather than 2 with cut, in case a filename contains a colon (:).
Also, the resulting file name will be a relative path (./file), or an empty string if no files exist.

Delete folders older than 1 day not working with "find" cmd

I'm trying to delete backup folders older than 1 day (creation date) with the find command, but it's not working.
Folder ls -l:
drwxrws---+ 2 root data 42 Mai 15 16:46 15-05-2019
drwxrws---+ 2 root data 89 Mai 16 14:19 16-05-2019
The creation date is May 15.
This cmd should work:
find /data/backup/VMs/centos/ -type d -mtime +1 -exec rm {} \;
I tried with this first to see what happens before the remove:
find /data/backup/VMs/centos/ -type d -mtime +1 -exec ls {} \; >> find_test.txt
It should write to the file the folder to delete, but the txt file is empty.
Besides using find, how can I remove these folders using the date in the name?
rm normally doesn't print to standard output; however, if an error occurs it prints to standard error, which can also be redirected to another file, or to the same file by duplicating the file descriptor with 2>&1:
find /data/backup/VMs/centos/ -type d -mtime +1 -exec ls {} \; >> find_test.txt 2>&1
To print the names, the find -print action can be used. find also has the -delete and -ls actions (the latter is not exactly the same as ls), which avoid executing a separate command for each file:
find /data/backup/VMs/centos/ -type d -mtime +1 -print -delete >> find_test.txt 2>&1
Be careful before using -delete, to avoid losing files you did not intend to delete.
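As a safety check, you can run the same match with just -print first and review the list before adding -delete:
find /data/backup/VMs/centos/ -type d -mtime +1 -print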

Why does my find command seem to execute twice? [duplicate]

This is the contents of the directory I'm working with:
misha#hp-laptop:~/work/c/5$ ls -l
total 8
-rw-rw-r-- 1 misha misha 219 May 20 15:37 demo.c
drwxrwxr-x 2 misha misha 4096 May 20 16:07 folder
-rw-rw-r-- 1 misha misha 0 May 20 16:06 test
Now I would like to remove everything from this directory except for the file demo.c. Here's the command I've come up with:
find . ! \( -name demo.c -o -name . \) -exec rm -Rf {} \;
It does exactly what you'd think it would do (meaning, the file test and the directory folder are gone), but at the same time it also displays the following error message:
find: `./folder': No such file or directory
Why do you think that is?
it also displays this error message:
find: `./folder': No such file or directory
Why is that?
Because find recognizes ./folder as a directory when it first reads directory ., before considering whether it matches the find criteria or performing any action on it. It does not recognize that the action will remove that directory, so after performing the action, it attempts to descend into that directory to scan its contents. By the time it does that, however, the directory no longer exists.
There are multiple ways to address the problem. One not yet mentioned is to use the -prune action. This tells find not to descend into directories that match the tests:
find . ! \( -name demo.c -o -name . \) -exec rm -Rf {} \; -prune
That will serve nicely here, and it also has applications in areas where you are not deleting the directory and you do not want to limit the search depth.
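For example, outside of a deletion scenario, -prune is commonly used to skip an entire subtree while searching; a sketch with a hypothetical .git directory:
find . -name .git -prune -o -type f -print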
Additionally, another way to avoid affecting . would be to make use of the fact that find accepts multiple base paths to test, that these can designate regular files if you wish, and that during pathname expansion any leading . in a filename must be matched explicitly. If, as in your case, there are no dotfiles in the target directory (other than . and ..), then you can accomplish your objective like this:
find * ! -name demo.c -exec rm -Rf {} \; -prune
You can change your find command to this:
find . -mindepth 1 -not -name demo.c -delete
-mindepth 1 ensures that you don't select . (the starting directory)
-delete will delete all matched files and directories
#before
ls -lrt
total 4
-rw-rw-r-- 1 user super 0 May 20 09:14 demo.c
drwxrwxr-x 2 user super 4096 May 20 09:14 folder/
-rw-rw-r-- 1 user super 0 May 20 09:14 test
#Command
ls -1 | grep -v demo.c | xargs rm -rf
#After
ls -lrt
total 0
-rw-rw-r-- 1 user super 0 May 20 09:14 demo.c

Delete Directory Older Than n Days Using Find

My requirements are pretty much the same as this question: Shell script to delete directories older than n days
I've got directories that look like this:
Jul 24 05:46 2013_07_24
Jul 31 22:30 2013_08_01
Sep 18 05:43 2013_09_18
Oct 07 08:41 2013_10_07
I want to remove anything older than 90 days. Based on the solution given in the aforementioned thread, I used the following in my script:
find $BASE_DIR -type d -ctime +90 -exec rm -rf {} \;
The script is successfully deleting the directories, but it is also failing with this error:
find: 0652-081 cannot change directory to <actual_path>:
: A file or directory in the path name does not exist.
The only thing here is that $BASE_DIR points to a virtual location, while the actual_path in the error message points to the actual location; there are soft links in the environment.
Try
find $BASE_DIR -mindepth 1 -maxdepth 1 -type d -ctime +90 -exec rm -rf {} \;
This will only cover directories directly under $BASE_DIR, but it should avoid generating that error message.
find "$BASE_DIR" -type d -ctime +90 | sort -r | xargs rm -rf
sort -r will sort the directories in reverse order, so we won't try to delete outer directories before their inner ones.

How to view file date of result of find command in bash

I use a find command to find some kinds of files in bash. Everything goes fine except that the result shown to me contains only the file name, not the (last modification) date of the file. I tried to pipe it into ls or ls -ltr, but it just does not show the file date column in the result. I also tried this:
ls -ltr | find . -ctime 1
but it didn't work.
Can you please guide me on how I can view the file date of the files returned by a find command?
You need either xargs or -exec for this:
find . -ctime 1 -exec ls -l {} \;
find . -ctime 1 | xargs ls -l
(The first executes ls on every found file individually; the second bunches them up into one or more big ls invocations, so that they may be formatted slightly better.)
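If your find supports the + terminator for -exec (POSIX does), you can get a similar batching effect without xargs:
find . -ctime 1 -exec ls -l {} +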
If all you want is to display an ls-like output, you can use the -ls option of find:
$ find . -name resolv.conf -ls
1048592 8 -rw-r--r-- 1 root root 126 Dec 9 10:12 ./resolv.conf
If you want only the timestamp you'll need to look at the -printf option
$ find . -name resolv.conf -printf "%a\n"
Mon May 21 09:15:24 2012
find . -ctime 1 -printf '%t\t%p\n'
prints the modification date/time and the file path, separated by a tab character.
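If you prefer a sortable timestamp, GNU find's -printf also has per-field modification-time directives; as a sketch (not part of the original answer):
find . -ctime 1 -printf '%TY-%Tm-%Td %TT %p\n'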
