Is there some elegant/simple way to delete the folder's contents in such a way there's no error output if it is empty?
The following command
$ rm -r $dir/*
doesn't work if the directory is empty, since in that case the wildcard * is not expanded, and you get an error saying that rm cannot find the file *.
Of course, the standard way is to check whether it is empty (with ls "$dir" | wc -w, find "$dir" -links 2, or any other related command), and to delete its contents otherwise.
Is there an alternative way not to check folder contents and only "truncate" the directory instead?
Bash
Simply,
$ rm -rf dir/*
By default, Bash doesn't complain about a glob that matches nothing. It just passes your literal glob through to your command:
$ echo dir/*
dir/*
When rm doesn't find a filename that has the literal glob character, it complains about not finding the file it's been asked to delete:
$ rm "dir/*"
rm: cannot remove ‘dir/*’: No such file or directory
$ echo $?
1
But if you force it, it won't complain:
$ rm -f "dir/*"
$ echo $?
0
This refrain-from-complaining is in fact required by POSIX: with -f, rm must not write a diagnostic or change its exit status for operands that do not exist.
Do note, however, that if you don't have the shell option "dotglob" set, you'll miss files that start with a dot, AKA "hidden" files.
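For example (a sketch; dir and its contents are illustrative):
$ touch dir/.hidden dir/visible
$ shopt -s dotglob
$ echo dir/*
dir/.hidden dir/visible
$ rm -rf dir/*
$ shopt -u dotglob   # restore the default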
Generally
Zsh doesn't pass the literal glob through by default. You have to ask for it with "set -o nonomatch".
$ echo dir/*
zsh: no matches found: dir/*
$ echo $?
1
$ set -o nonomatch
$ echo dir/*
dir/*
For compatibility, I wouldn't use the above modern-Bash-specific "rm -rf dir/*", but would use the more general, widely-compatible solution:
$ find dir -mindepth 1 -delete
Find all files in "dir" at a minimum depth of 1 ("dir" itself is at depth 0), and delete them.
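A quick check that this also removes dot-files and subdirectories (the paths here are illustrative):
$ mkdir -p dir/sub
$ touch dir/.hidden dir/sub/file
$ find dir -mindepth 1 -delete
$ ls -A dir | wc -l
0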
You can use rm -rf:
rm -rf "$dir"/*
As per man rm:
-f, --force
ignore nonexistent files and arguments, never prompt
rm -rf dir/*
does not delete hidden files whose names start with a dot.
This can be surprising: when bash expands the *, it does not include .* files.
mkdir -p dir
touch dir/.a
rm -fr dir/*
ls dir/.a && echo I am not deleted
output is
dir/.a
I am not deleted
Besides, rm -fr dir/* has another disadvantage: when there are too many files in the directory, the shell expands the glob into more arguments than the system allows, and the rm command fails with "Argument list too long". It is also very slow in that case.
The most reliable and fastest way seems to be
find dir -mindepth 1 -delete
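A sketch of the failure mode; the exact threshold depends on the system's ARG_MAX:
$ mkdir -p dir
$ ( cd dir && seq 1 200000 | xargs touch )   # xargs batches, so creation succeeds
$ rm -fr dir/*
bash: /bin/rm: Argument list too long
$ find dir -mindepth 1 -delete               # unaffected by the number of entries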
Related
I'm trying to remove all .js and .js.map files from any sub-directory of src called __tests__.
$ find . -path './src/**' -name __tests__ | # find subdirectories
> sed -E 's/([^ ]+__tests__)/\1\/*.js \1\/*.js.map/g' | # for each subdirectory, concat *.js and *.js.map
> xargs rm # remove files
This fails with the following errors:
rm: cannot remove './src/game/__tests__/*.js': No such file or directory
rm: cannot remove './src/game/__tests__/*.js.map': No such file or directory
rm: cannot remove './src/helpers/__tests__/*.js': No such file or directory
rm: cannot remove './src/helpers/__tests__/*.js.map': No such file or directory
However, if I change my xargs rm to xargs echo rm, copy and paste the output, and run it, it works.
$ find . -path './src/**' -name __tests__ | sed -E 's/([^ ]+__tests__)/\1\/*.js \1\/*.js.map/g' |
> xargs echo rm # echo command to remove files
rm ./src/game/__tests__/*.js ./src/game/__tests__/*.js.map ./src/helpers/__tests__/*.js ./src/helpers/__tests__/*.js.map
$ rm ./src/game/__tests__/*.js ./src/game/__tests__/*.js.map ./src/helpers/__tests__/*.js ./src/helpers/__tests__/*.js.map
Wrapping the output of my echo in $(...) and prepending rm results in the same error as before.
$ rm $(find . -path './src/**' -name __tests__ | sed -E 's/([^ ]+__tests__)/\1\/*.js \1\/*.js.map/g' | xargs echo rm)
rm: cannot remove './src/game/__tests__/*.js': No such file or directory
rm: cannot remove './src/game/__tests__/*.js.map': No such file or directory
rm: cannot remove './src/helpers/__tests__/*.js': No such file or directory
rm: cannot remove './src/helpers/__tests__/*.js.map': No such file or directory
What am I doing wrong?
I doubt it matters, but I'm using GitBash on Windows.
First, to explain the issue: In find | sed | xargs rm, the shell only sets up communication between those programs, but it doesn't actually process the results in any way. That's a problem here because *.js needs to be expanded by a shell to replace it with a list of filenames; rm treats every argument it's given as a literal name. (This is unlike Windows, where programs do their own command-line parsing and glob expansion).
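A minimal illustration of the difference (dir and its contents are hypothetical):
$ ls dir
a.js  b.js
$ echo dir/*.js                  # the shell expands the glob
dir/a.js dir/b.js
$ echo 'dir/*.js' | xargs echo   # xargs hands the program the literal string
dir/*.js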
Arguably, you don't need find here at all. Consider:
shopt -s globstar # enable ** as a recursion operator
rm ./src/**/__tests__/*.js{,.map} # delete *.js and *.js.map in any __tests__ directory under src
...or, if you do want to use find, let it do the work of coming up with a list of individual files matching *.js, instead of leaving that work to happen later:
find src -regextype posix-egrep -regex '.*/__tests__/[^/]*[.]js([.]map)?' -delete
You need to have your globs (*) expanded. File name expansion is performed by the shell on UNIX, not by rm or other programs. Try:
.... | xargs -d $'\n' sh -c 'IFS=; for f; do rm -- $f; done' sh
...to explain this:
The -d $'\n' ensures that xargs splits only on newlines (not spaces!), and also stops it from treating backslashes and quotes as special.
sh -c '...' sh runs ... as a script, with sh as $0, and subsequent arguments in $1, etc; for f; will thus iterate over those arguments.
Clearing IFS with IFS= prevents string-splitting from happening when $f is used unquoted, so only glob expansion happens.
Using the -- argument to rm ensures that it treats subsequent arguments as filenames, not options, even if they start with dashes.
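To see the idiom in action without deleting anything, substitute echo rm for rm (the directory name 'dir a', containing a space, is hypothetical):
$ mkdir -p 'dir a'; touch 'dir a/x.js' 'dir a/y.js'
$ printf '%s\n' 'dir a/*.js' | xargs -d $'\n' sh -c 'IFS=; for f; do echo rm -- $f; done' sh
rm -- dir a/x.js dir a/y.js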
That said, if you really have a lot of files for each pattern, you might run into "Argument list too long" even though you are using xargs: the glob inside the sh script still expands into a single huge rm command line.
Another caveat is that filenames containing newlines can potentially be split into multiple names (depending on the details of the version of find you're using). A way to solve this that will work with all POSIX-compliant versions of find might be:
find ./src -type d -name __tests__ -exec sh -c '
  for d; do
    rm -- "$d"/*.js "$d"/*.js.map
  done
' sh {} +
(POSIX sh does not perform brace expansion, so the two suffixes are spelled out.)
rm -fr *
won't delete .files (hidden dot-files).
On the other hand,
rm -fr * .*
will delete too much!
Is there a reliable way to recursively delete all contents of a directory in Bash?
One way I can think of is:
rm -fr "$PWD"
mkdir "$PWD"
cd "$PWD"
This has the side-effect of deleting $PWD temporarily; the final cd "$PWD" is needed because the shell is still in the old, now-deleted directory.
I suggest to use first:
shopt -s dotglob
dotglob: If set, bash includes filenames beginning with a . in the results of pathname expansion
rm -fr * .*
is relatively "safe". rm is forbidden by POSIX from acting on . and ...
rm -rf . ..
will be a no-op, though it will return 1. If you don't want the error return, you can do:
rm -rf .[!.]*
which is POSIX standardized, no bash extension required. (Note that .[!.]* misses names beginning with two dots, such as ..foo; the ..?* pattern discussed further below covers those.)
You can also use find:
find . -mindepth 1 -delete
(Without -mindepth 1, GNU find would also try to delete . itself and print an error.)
You could use find with -delete and -maxdepth:
find . -maxdepth 2 -name "*" -delete
(With GNU find, -maxdepth should come before the other tests, otherwise it prints a warning.)
So let's say you are in the directory temp which looks like this:
./temp
 |_____dir1
 |     |_____subdir1
 |     |      |_file
 |    X|_file
X|_file
X|_____dir2
      X|_file
Looking at the tree, the files and directories with an X next to them would be deleted by the command above. subdir1 is spared because the maximum depth at which find will delete is 2, and a file still resides within it at depth 3. find will delete files starting with a dot as well; note from the man page below, however, that -delete is incompatible with following symbolic links.
-delete
Delete found files and/or directories. Always returns true.
This executes from the current working directory as find recurses
down the tree. It will not attempt to delete a filename with a
``/'' character in its pathname relative to ``.'' for security
reasons. Depth-first traversal processing is implied by this
option. Following symlinks is incompatible with this option.
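A reproducible sketch of that tree (all names are illustrative); the spared directories only survive because -delete's complaints about non-empty directories go to stderr:
$ mkdir -p temp/dir1/subdir1 temp/dir2
$ touch temp/file temp/dir1/file temp/dir1/subdir1/file temp/dir2/file
$ cd temp
$ find . -maxdepth 2 -name "*" -delete 2>/dev/null
$ find .
.
./dir1
./dir1/subdir1
./dir1/subdir1/file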
The usual wisdom for UNIX is to use something like:
rm -rf * .[!.]* ..?*
That matches every entry, including those whose names start with a dot or even a double dot, while excluding the special entries . and .. themselves.
But glob expansion passes each pattern through literally if no file of that type exists.
Let's test:
$ mkdir temp5; cd temp5
$ touch {,.,..}{aa,bb,cc}
$ echo $(find .)
. ./aa ./cc ./..bb ./..aa ./.cc ./.bb ./..cc ./.aa ./bb
And, as indicated, this will include all files:
$ echo * .[!.]* ..?*
aa bb cc .aa .bb .cc ..aa ..bb ..cc
But if one of the types doesn't exist, the asterisk will stay:
$ rm ..?*
$ echo * .[!.]* ..?*
aa bb cc .aa .bb .cc ..?*
We need to avoid arguments that contain an asterisk to work around this issue.
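In bash, one way to do that is nullglob, which makes non-matching patterns expand to nothing; with -f, rm exits 0 even if it ends up with no operands at all (a sketch):
$ shopt -s nullglob
$ rm -rf -- * .[!.]* ..?*
$ shopt -u nullglob   # restore the default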
I have a small bash script which goes through some directories and deletes contents within them like so:
#!/bin/bash
echo "Start job at: $(date)"
rm -rf /my/location/log_folder1/*
exit 0
Unfortunately, I have these log_folders numbered from 1 to 10: log_folder1, log_folder2, log_folder3, etc.
How can I efficiently loop through this rather than write a separate line for each rm?
I do not want to use the find command to do this - I would like to learn how to loop through.
You can do it with brace expansion:
rm -rf /my/location/log_folder{1..10}/*
See man bash - Brace Expansion for details.
Brace Expansion Example:
$ echo "asd"{1..10}
asd1 asd2 asd3 asd4 asd5 asd6 asd7 asd8 asd9 asd10
Given:
$ find . -type d
.
./log_folder1
./log_folder1/sub_log_folder1
./log_folder1/sub_log_folder2
./log_folder1/sub_log_folder3
./log_folder10
./log_folder2
./log_folder3
./log_folder4
./log_folder5
./log_folder6
./log_folder7
./log_folder8
./log_folder9
Use a filename expansion (a 'glob'), and a for loop:
$ for fn in log_folder*/*; do
> rm -rf "$fn"
> done
Resulting in:
$ find . -type d
.
./log_folder1
./log_folder10
./log_folder2
./log_folder3
./log_folder4
./log_folder5
./log_folder6
./log_folder7
./log_folder8
./log_folder9
Be sure to use quotes around the file name, as in "$fn", so you actually delete what you think you are deleting and not the unintended result of word splitting.
Or, you can just use a glob that targets subdirectories:
$ rm -rf log_folder*/*
Or, an expansion that only targets some of the sub directories, not all:
$ rm -rf log_folder{1..5}/*
Which can also be used in the for loop.
(I suggest being careful with rm -rf .... if you are just getting started with scripting and expansions however. Perhaps practice with something more reversible...)
Use find:
find log_folder* -mindepth 1 -delete
(Without -mindepth 1, the log_folder directories themselves are removed too, not just their contents.)
You can use this loop:
# using a common base directory makes it a little safer with 'rm -rf'
fbase=/my/location/log_folder
for n in {1..10}; do
f="$fbase$n"
echo "Cleaning up folder $f"
rm -rf "$f"/* # this will empty everything under $f, but not $f itself will remain as an empty folder
# use 'rm -rf "$f"' to remove everything including the top folder
done
When using sudo rm -r, how can I delete all files, with the exception of the following:
textfile.txt
backup.tar.gz
script.php
database.sql
info.txt
find [path] -type f -not -name 'textfile.txt' -not -name 'backup.tar.gz' -delete
If you don't specify -type f find will also list directories, which you may not want.
Or a more general solution using the very useful combination find | xargs:
find [path] -type f -not -name 'EXPR' -print0 | xargs -0 rm --
for example, to delete all non-txt files in the current directory:
find . -type f -not -name '*txt' -print0 | xargs -0 rm --
The -print0 and -0 combination is needed if there are spaces in any of the filenames that should be deleted.
rm !(textfile.txt|backup.tar.gz|script.php|database.sql|info.txt)
The extglob (Extended Pattern Matching) option needs to be enabled in Bash (if it isn't already):
shopt -s extglob
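You can dry-run the pattern with echo first to see exactly what would be removed (using the question's filenames):
$ shopt -s extglob
$ echo !(textfile.txt|backup.tar.gz|script.php|database.sql|info.txt)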
find . | grep -v "excluded files criteria" | xargs rm
This lists all files in the current directory, filters out the ones that match your keep-criteria (beware of the pattern also matching directory names), and then removes the rest.
Update: based on your edit, if you really want to delete everything from the current directory except the files you listed, this can be used:
mkdir /tmp_backup && mv textfile.txt backup.tar.gz script.php database.sql info.txt /tmp_backup/ && rm -r * && mv /tmp_backup/* . && rmdir /tmp_backup
It will create a backup directory /tmp_backup (you've got root privileges, right?), move the files you listed to that directory, recursively delete everything in the current directory (you know that you're in the right directory, right?), move everything from /tmp_backup back into the current directory, and finally delete /tmp_backup.
I chose the backup directory to be in the root, because if you're trying to recursively delete everything from root, your system has big problems anyway.
Surely there are more elegant ways to do this, but this one is pretty straightforward.
I prefer to use a subquery list:
rm -r `ls | grep -v "textfile.txt\|backup.tar.gz\|script.php\|database.sql\|info.txt"`
-v, --invert-match select non-matching lines
\| Separator
Assuming that files with those names exist in multiple places in the directory tree and you want to preserve all of them:
find . -type f ! -regex ".*/\(textfile.txt\|backup.tar.gz\|script.php\|database.sql\|info.txt\)" -delete
You can use GLOBIGNORE environment variable in Bash.
Suppose you want to delete all files except php and sql, then you can do the following -
export GLOBIGNORE=*.php:*.sql
rm *
export GLOBIGNORE=
Setting GLOBIGNORE like this makes wildcards such as ls * or rm * skip the php and sql files, so rm * after setting the variable deletes only the txt and tar.gz files. Note that setting GLOBIGNORE to a non-null value also implicitly enables dotglob-style matching, so * will match hidden files as well (. and .. are always ignored).
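A quick check of the effect on expansion, reusing the question's filenames:
$ ls
backup.tar.gz  database.sql  info.txt  script.php  textfile.txt
$ GLOBIGNORE='*.php:*.sql'
$ echo *
backup.tar.gz info.txt textfile.txt
$ unset GLOBIGNORE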
Since nobody mentioned it:
copy the files you don't want to delete in a safe place
delete all the files
move the copied files back in place
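A minimal sketch of that sequence; mktemp and the keep list stand in for whatever you actually need to preserve:
keep=(textfile.txt backup.tar.gz script.php database.sql info.txt)
tmpdir=$(mktemp -d)            # scratch space outside the current directory
mv -- "${keep[@]}" "$tmpdir"/
rm -rf -- *                    # delete everything that is left (dot-files excepted)
mv -- "$tmpdir"/* .
rmdir "$tmpdir"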
You can write a for loop for this... %)
for x in *
do
if [ "$x" != "exclude_criteria" ]
then
rm -f "$x"
fi
done;
A little late for the OP, but hopefully useful for anyone who gets here much later by google...
I found the answer by @awi and the comment on -delete by @Jamie Bullock really useful. A simple utility so you can do this in different directories, ignoring different file names/types each time, with minimal typing:
rm_except (or whatever you want to name it)
#!/bin/bash
ignore=()
for fignore in "$@"; do
    ignore+=( -not -name "$fignore" )
done
find . -type f "${ignore[@]}" -delete
e.g. to delete everything except for text files and foo.bar (quote the glob so the calling shell doesn't expand it first):
rm_except "*.txt" foo.bar
Similar to @mishunika, but without the if clause.
If you're using zsh, which I highly recommend:
rm -rf ^pattern-to-keep
With extended_glob
setopt extended_glob
rm -- ^*.txt
rm -- ^*.(sql|txt)
Trying it worked with:
rm -r !(Applications|"Virtualbox VMs"|Downloads|Documents|Desktop|Public)
but names with spaces are (as always) tough. I also tried Virtualbox\ VMs instead of the quotes; either way, it always deletes that directory (Virtualbox VMs).
Just:
rm $(ls -I "*.txt")                # deletes all files except *.txt
Or:
rm $(ls -I "*.txt" -I "*.pdf")     # deletes all files except *.txt and *.pdf
(Note that ls -I is GNU ls's --ignore option, and this command-substitution approach breaks on filenames that contain whitespace.)
Make the files immutable. Not even root will be allowed to delete them.
chattr +i textfile.txt backup.tar.gz script.php database.sql info.txt
rm *
All other files have been deleted.
Afterwards, you can make them mutable again.
chattr -i *
I believe you can use
rm -v !(filename)
Except for filename, all the other files in the directory will be deleted. Make sure you are running it in Bash with extglob enabled (shopt -s extglob).
This is similar to the comment from @siwei-shen, but you need the -o flag to combine multiple patterns. The -o flag stands for 'or', and the negation has to apply to the whole group, otherwise the expression matches every file:
find . -type f -not \( -name '*ignore1' -o -name '*ignore2' \) | xargs rm
You can do this with two command sequences.
First define an array with the names of the files you do not want to delete:
files=( backup.tar.gz script.php database.sql info.txt )
After that, loop through all files in the directory, checking whether each filename is in the array of files to keep; if it's not, delete the file.
for file in *; do
    if [[ ! " ${files[*]} " =~ " ${file} " ]]; then
        rm "$file"
    fi
done
The answer I was looking for was to run a script but avoid deleting the script itself. So in case someone is looking for a similar answer, do the following.
Create a .sh file and write the following code:
cp my_run_build.sh ../../
rm -rf *
cp ../../my_run_build.sh .
# amend the rest of the script as needed
Since no one yet mentioned this, in one particular case:
OLD_FILES=`echo *`
... create new files ...
rm -r $OLD_FILES
(or just rm $OLD_FILES)
or
OLD_FILES=`ls *`
... create new files ...
rm -r $OLD_FILES
You may need to use shopt -s nullglob if some of the files may or may not exist:
SET_OLD_NULLGLOB=`shopt -p nullglob`
shopt -s nullglob
FILES=`echo *.sh *.bash`
$SET_OLD_NULLGLOB
without nullglob, echo *.sh *.bash may give you "a.sh b.sh *.bash".
(Having said all that, I myself prefer this answer, even though it does not work in OSX)
Rather than going for a direct command, move the required files to a temp dir outside the current dir. Then delete all files using rm * or rm -r *.
Then move the required files back into the current dir.
Remove everything except file.name:
ls -d /path/to/your/files/* |grep -v file.name|xargs rm -rf
I have some files on my Unix machine that start with
--
e.g. --testings.html
If I try to remove it I get the following error:
cb0$ rm --testings.html
rm: illegal option -- -
usage: rm [-f | -i] [-dPRrvW] file ...
unlink file
I tried
rm "--testings.html" || rm '--testings.html'
but neither works.
How can I remove such files on terminal?
rm -- --testings.html
The -- option tells rm to treat all further arguments as file names, not as options, even if they start with -.
This isn't particular to the rm command. The getopt function implements it, and many (all?) UNIX-style commands treat it the same way: -- terminates option processing, and anything after it is a regular argument.
http://www.gnu.org/software/hello/manual/libc/Using-Getopt.html#Using-Getopt
rm -- --somefile
While that works, it's a solution that relies on rm using getopt to parse its options. There are applications out there that do their own parsing and will choke on that too (because they don't necessarily implement the "-- means end of options" logic).
Because of that, the solution you should drive through your skull is this one:
rm ./--somefile
It will always work, because this way your arguments never begin with a -.
Moreover, if you're trying to write really robust shell scripts, you should technically put ./ in front of all your filename parameter expansions to keep your scripts from breaking on funky filename input, and to keep them from being abused or exploited. For instance, rm will delete files but skip over directories, while rm -rf * will delete everything; passing a filename of -rf to a script, or someone running touch ~victim/-rf, could in this way change its behaviour with really bad consequences.
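A small illustration of that defensive habit (a sketch):
for f in *; do
    rm "./$f"   # the ./ prefix guarantees the argument can never be parsed as an option
done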
Either rm -- --testings.html or rm ./--testings.html.
rm -- --testings.html
Yet another way to do it is to use find ... -name "--*" -delete
touch -- --file
find -x . -mindepth 1 -maxdepth 1 -name "--*" -delete
For a more generalised approach for deleting files with impossible characters in the filename, one option is to use the inode of the file.
It can be obtained via ls -i. For example:
$ ls -lai | grep -i test
452998712 -rw-r--r-- 1 dim dim 6 2009-05-22 21:50 --testings.html
And to erase it, with the help of find:
$ find ./ -inum 452998712 -exec rm \{\} \;
This process can be beneficial when dealing with lots of files with filename peculiarities, as it can be easily scripted.
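Since inode numbers are only unique within a single filesystem, a scripted version might pin the search with -xdev (the function name here is illustrative):
rm_by_inode() {
    find . -xdev -inum "$1" -exec rm -- {} +
}
rm_by_inode 452998712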
rm ./--testings.html
or
rm -- --testings.html