I run the following command on my Macbook:
mkdir ~/tmp/~
Now I want to delete this ~/tmp/~ directory.
How can I do it? It is not a symlink. When I cd into ~/tmp and run rm -rf ~, all of my home files get deleted.
Interesting one. This can be done safely like this:
# This form is safe and functional.
rm -rf ~/tmp/~
But if you try to do this, your home data is going to be lost:
# THIS FORM IS DANGEROUS; DO NOT USE IT
cd ~/tmp
rm -rf ~
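To see the safe behaviour without risking anything, here is a sketch in a throwaway directory (the mktemp scratch area is just for illustration; any disposable location works):

```shell
#!/bin/sh
# Create and remove a directory literally named ~ in a scratch area,
# never touching $HOME.
base=$(mktemp -d)
mkdir -p "$base/tmp/~"    # quoted, so the shell performs no expansion
ls "$base/tmp"            # shows the single entry: ~
rm -rf "$base/tmp/~"      # full, quoted path: unambiguous and safe
rm -rf "$base"            # clean up the scratch area
```

Quoting the whole path is belt-and-braces here; as the answers below explain, the unquoted ~/tmp/~ is already safe because only a tilde at the start of a word is expanded.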
I agree with Ivan X's comment: the ~ in this context does not cause any particular problem, and rmdir ~/tmp/~ removes the directory without issue (see #1 below).
However, what you describe is a classic Unix/BSD pitfall. To remove a file or directory whose name contains special characters, you have to make sure your shell does not expand them. There are two ways to achieve this:
You can use a full path, e.g. rmdir ~/tmp/~ or...
In the case of a file or directory whose name starts with -, you can use -- to tell the command that no options follow, as in rmdir -- -foo-
Quite so: rm -rf ~/tmp/~ is safe while rm -rf ~ wipes your home directory, and quite simply because (see Tilde Expansion in the bash manual):
If a word begins with an unquoted tilde character
(‘~’), all of the characters up to the first
unquoted slash (or all characters, if there is no unquoted slash) are
considered a tilde-prefix. If none of the characters in the
tilde-prefix are quoted, the characters in the tilde-prefix following
the tilde are treated as a possible login name. If this
login name is the null string, the tilde is replaced with the value of
the HOME shell variable.
Thus, rm -rf ~ is expanded as rm -rf $HOME.
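You can check the rule directly: only an unquoted tilde at the start of a word (or of an assignment value) is expanded, so the second tilde below survives untouched.

```shell
#!/bin/sh
# The leading tilde becomes $HOME; the one after the slash stays literal.
p=~/tmp/~
echo "$p"          # prints $HOME/tmp/~
# Quoting suppresses tilde expansion entirely:
echo '~/tmp/~'     # prints ~/tmp/~ verbatim
```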
Deleting a file with a funny name is often easiest and safest with a file browser.
If you don't want to use Finder, use any other text-based file browser, e.g. Emacs dired mode.
[...]
In case you insist in doing this using a shell...
ls -i # will show you the inode number of the files you have
Once you have the right inode, say 123456789, you can use find to delete it.
find . -maxdepth 1 -type f -inum 123456789 -delete
Related
I want to rm or cp a bunch of files with common extensions, some of them start with a - and so unix complains about unknown options. What can I do?
rm *csv
man rm:
To remove a file whose name starts with a '-', for example '-foo',
use one of these commands:
rm -- -foo
rm ./-foo
So:
$ touch -- -test test
$ rm -- *test
rm: remove regular empty file 'test'? y
rm: remove regular empty file '-test'? y
$
Also, quoting works to inhibit globbing if there's, say, a literal asterisk in the name:
rm "*csv"
Sometimes it might be useful to use the interactive option and confirm the file you want to delete:
rm -i -- *
This is handy if filenames have characters hard to type on your keyboard.
This question states:
It is amazing how many users don't know about the rm ./-rf or rm -- -rf tricks.
I am afraid to try these, but curious as to what they do. They are also very difficult to search...
Can someone enlighten me?
rm ./-rf and/or rm -- -rf would attempt to remove a file named, specifically, -rf
The only trick here is that you normally can't delete a file that starts with a "-" because the command will assume it's a command argument. By preceding the file with a full path, or using the -- option (which means, end all options) the command will no longer assume it's an argument.
It should be noted that the -- form of this trick may not work with every command (not all of them implement the convention), so the ./-rf form is the safer bet.
If you have a file named -rf in your directory, it is difficult to remove that file if you don't know the trick. That's because:
rm -rf
supplies two command line options (-r and -f) as a single argument, and tells rm to recursively and forcibly remove things.
If you write:
rm ./-rf
the argument does not start with a dash any more, so it is simply a file name. Similarly, by common (but not universal) convention, -- marks the end of the option arguments and anything afterwards is no longer an option (which usually means it is a file name). So:
rm -- -rf
removes the file because rm knows that the arguments that follow the -- are file names, not options for it to process.
The file -rf is even more dangerous if you use wildcards:
rm *rf*
Suddenly, this will remove directories as well as files (but won't harm the file -rf).
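A harmless way to watch this happen is to let echo show what the shell actually hands to rm; the file names below are made up for the demonstration:

```shell
#!/bin/sh
# The shell expands the glob BEFORE rm runs; -rf sorts first in the
# expansion and would be parsed as the options -r and -f.
cd "$(mktemp -d)"
touch -- -rf surf.txt
echo rm *rf*            # prints: rm -rf surf.txt
rm -- -rf surf.txt      # remove both files the safe way
```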
Not a complete answer, as I think the other answers give good explanations.
When I'm unsure what a given rm invocation is going to delete, I try to remember to simply ls the file list first to make sure it is actually what I want to delete:
$ ls -rf
-rf .. .
$
OK, clearly that's not right, let's try again:
$ ls ./-rf
./-rf
$
That's better. Let's do a history replacement of ls with rm -v (-v just for extra paranoia/checking) and do the actual delete:
$ rm -v !!:*
rm -v ./-rf
removed `./-rf'
$
This also works nicely with wildcards, brace expansions, etc, when you're not sure what the expansion will be exactly.
Also, if you're wondering how files like -rf get created in the first place, it's astonishingly easy if you mess up a redirection a bit:
$ ls
$ echo > -rf
$ ls
-rf
$
I have directory a that is symlinked somewhere. I want to copy its contents to directory b. Doesn't the following simple solution break in some corner cases (e.g. hidden files, exotic characters in filenames, etc.)?
mkdir b
cp -rt b a/*
Simply adding a trailing '/' will follow the symlink and copy the contents rather than the link itself.
cp -a symlink/ dest
Bash globbing does not choke on special characters in filenames. This is the reason to use globbing, rather than parsing the output of a command such as ls. The following would also be fine.
shopt -s dotglob
mkdir -p dest
cp -a symlink/* dest/
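A self-contained sketch of that behaviour, using bash since it relies on dotglob; all path names here are throwaway examples:

```shell
#!/bin/bash
# Copy the CONTENTS of a symlinked directory, dot files included.
cd "$(mktemp -d)"
mkdir real dest
echo hi > real/file
touch real/.hidden
ln -s real link
shopt -s dotglob          # let * match dot files as well
cp -a link/* dest/
ls -A dest                # file and .hidden were copied, not the link itself
```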
There are often times that I want to execute a command on all files (including hidden files) in a directory. When I try using
chmod g+w * .*
it changes the permissions on all the files I want (in the directory) and all the files in the parent directory (that I want left alone).
Is there a wildcard that does the right thing or do I need to start using find?
You will need two glob patterns to cover all the potential “dot files”: .[^.]* and ..?*.
The first matches all directory entries with two or more characters where the first character is a dot and the second character is not a dot. The second picks up entries with three or more characters that start with .. (this excludes .. because it only has two characters and starts with a ., but includes (unlikely) entries like ..foo).
chmod g+w .[^.]* ..?*
This works in most shells and is suitable for scripts; for strictly POSIX shells, write the first pattern as .[!.]*, since ! is the portable negation character inside brackets.
For regular interactive use, the patterns may be too difficult to remember. For those cases, your shell might have a more convenient way to skip . and ...
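The claim about the two patterns is easy to verify in a scratch directory; the file names below are arbitrary examples:

```shell
#!/bin/bash
# .[^.]* catches .a and .ab; ..?* catches ..c and ...d;
# neither pattern can ever match . or .. themselves.
cd "$(mktemp -d)"
touch .a .ab ..c ...d plain
printf '%s\n' .[^.]* ..?*
# output: the four dot entries, one per line;
# "plain", ".", and ".." never appear
```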
zsh always excludes . and .. from patterns like .*.
With bash, you have to use the GLOBIGNORE shell variable.
# bash
GLOBIGNORE=.:..
echo .*
You might consider setting GLOBIGNORE in one of your bash customization files (e.g. .bash_profile/.bash_login or .bashrc).
Beware, however, becoming accustomed to this customization if you often use other environments.
If you run a command like chmod g+w .* in an environment that is missing your customization, then you will unexpectedly end up including . and .. in your command.
Additionally, you can configure the shells to include “dot files” in patterns that do not start with an explicit dot (e.g. *).
# zsh
setopt glob_dots
# bash
shopt -s dotglob
# show all files, even “dot files”
echo *
Usually I would just use . .[a-zA-Z0-9]* since my file names tend to follow certain rules, but that won't catch all possible cases.
You can use:
chmod g+w $(ls -1a | grep -v '^\.\.$')
which will basically list all the files and directories, strip out the parent directory then process the rest. Beware of spaces in file names though, it'll treat them as separate files.
Of course, if you just want to do files, you can use:
find . -maxdepth 1 -type f -exec chmod g+w {} ';'
or, yet another solution, which should do all files and directories except the .. one:
for i in * .* ; do if [[ "${i}" != ".." ]] ; then chmod g+w "$i"; fi; done
but now you're getting into territory where scripts or aliases may be necessary.
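Since the question asks whether find is needed: here is a find-based sketch that handles every name (spaces, leading dashes, dot files) with no glob gymnastics at all:

```shell
#!/bin/sh
# Apply the change to every entry directly inside the directory;
# . itself is excluded by -mindepth 1, and .. is never visited.
find . -mindepth 1 -maxdepth 1 -exec chmod g+w {} +
```

Add -type f if you only want regular files; unlike the glob patterns above, this never pulls in the parent directory.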
What I did was
tar --directory my_directory --file my_directory.tar --create `ls -A my_directory/`
Works just fine: the ls -A my_directory/ expands to everything in the directory except . and ... No weird globs, and all on one line.
PS: Perhaps someone will tell me why this is not a good idea. :p
How about:
shopt -s dotglob
chmod g+w ./*
Since you may not want dotglob set for the rest of your bash session, you can enable it for a single command by running it in a subshell, like so:
$ (shopt -s dotglob; chmod g+w ./*)
If you are sure that two-character hidden file names will never be used, then the simplest option is just to do:
chmod g+w * .??*
I have some files on my Unix machine that start with
--
e.g. --testings.html
If I try to remove it I get the following error:
cb0$ rm --testings.html
rm: illegal option -- -
usage: rm [-f | -i] [-dPRrvW] file ...
unlink file
I tried
rm "--testings.html" || rm '--testings.html'
but nothing works.
How can I remove such files on terminal?
rm -- --testings.html
The -- option tells rm to treat all further arguments as file names, not as options, even if they start with -.
This isn't particular to the rm command. The getopt function implements it, and many (all?) UNIX-style commands treat it the same way: -- terminates option processing, and anything after it is a regular argument.
http://www.gnu.org/software/hello/manual/libc/Using-Getopt.html#Using-Getopt
rm -- --somefile
While that works, it's a solution that relies on rm using getopt (or honoring the same convention) to parse its options. There are applications out there that do their own parsing and will puke on that too, because they don't necessarily implement the "-- means end of options" logic.
Because of that, the solution you should drive through your skull is this one:
rm ./--somefile
It will always work, because this way your arguments never begin with a -.
Moreover, if you're trying to write really robust shell scripts, you should technically put ./ in front of all your filename parameter expansions. This prevents your scripts from breaking on funky filename input, and from being abused or exploited to do things they're not supposed to do. For instance, rm will delete files but skip over directories, while rm -rf * will delete everything; passing a filename of -rf to a script (or someone running touch ~victim/-rf) could in this way change its behaviour with really bad consequences.
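As a sketch of that advice, here is a hypothetical cleanup loop where the ./ prefix guarantees rm always sees a path, no matter what the glob expands to (the file names are made up):

```shell
#!/bin/sh
# Deleting whatever a glob matches, robust against names like -rf:
cd "$(mktemp -d)"
touch -- -rf --force normal.txt
for f in *; do
    rm "./$f"       # ./-rf is a plain path, never an option bundle
done
ls -A               # nothing left
```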
Either rm -- --testings.html or rm ./--testings.html.
rm -- --testings.html
Yet another way to do it is to use find ... -name "--*" -delete
touch -- --file
find -x . -mindepth 1 -maxdepth 1 -name "--*" -delete
For a more generalised approach for deleting files with impossible characters in the filename, one option is to use the inode of the file.
It can be obtained via ls -i.
e.g.
$ ls -lai | grep -i test
452998712 -rw-r--r-- 1 dim dim 6 2009-05-22 21:50 --testings.html
And to erase it, with the help of find:
$ find ./ -inum 452998712 -exec rm \{\} \;
This process can be beneficial when dealing with lots of files with filename peculiarities, as it can be easily scripted.
rm ./--testings.html
or
rm -- --testings.html