I'd like to do this in OS X:
ls -rt | xargs rm -i
However, this chokes because some of the filenames contain whitespace: xargs splits those names apart, so rm is handed partial names.
I mention OS X because BSD's version of ls does not have a -Q flag.
Is there a way to do this without having to use find -print0?
[sgeorge@sgeorge-ld staCK]$ touch "file name"{1..5}.txt
[sgeorge@sgeorge-ld staCK]$ ls -1
file name1.txt
file name2.txt
file name3.txt
file name4.txt
file name5.txt
[sgeorge@sgeorge-ld staCK]$ ls -rt | xargs -I {} rm -v {}
removed `file name5.txt'
removed `file name4.txt'
removed `file name3.txt'
removed `file name2.txt'
removed `file name1.txt'
OR
[sgeorge@sgeorge-ld staCK]$ ls -1
file a
file b
file c
file d
[sgeorge@sgeorge-ld staCK]$ OLDIFS=$IFS; IFS=$'\n'; for i in `ls -1`; do rm -i "$i"; done; IFS=$OLDIFS
rm: remove regular empty file `file a'? y
rm: remove regular empty file `file b'? y
rm: remove regular empty file `file c'? y
rm: remove regular empty file `file d'? y
You have two options. You can either call xargs with the -0 option, which splits its input into arguments at NUL characters (\0); since ls separates names with newlines, tr converts those newlines to NULs first:
ls -rt | tr '\n' '\0' | xargs -0 rm -i
or you can use the -I option to split the input on newlines only (\n) and call the desired command once for each line of the input:
ls -rt | xargs -I_ rm -i _
The difference is that the first version only calls rm once, with all the arguments provided as a single list, while the second one calls rm individually for each line in the input.
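To see the difference without deleting anything, you can stand echo in for rm (the two file names here are just placeholders):
$ printf 'file a\nfile b\n' | tr '\n' '\0' | xargs -0 echo rm -i
rm -i file a file b
$ printf 'file a\nfile b\n' | xargs -I_ echo rm -i _
rm -i file a
rm -i file b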
Just have find do the deleting for you.
find . -depth 1 -print -exec rm -i {} \;
It's more flexible, and it handles spaces in filenames without any trouble.
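Note that -depth 1 is the BSD find primary, which is what OS X ships. On GNU find, a rough equivalent (just a sketch) would be:
find . -mindepth 1 -maxdepth 1 -exec rm -i {} \;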
If you want to delete all files, what's wrong with rm -i *?
Don't parse ls
Try this
while IFS= read -r -u 3    # -r keeps backslashes literal, IFS= preserves leading/trailing blanks
do
    rm -i "$REPLY"
done 3< <(ls -rt)          # feed ls on fd 3 so stdin stays free for rm -i's prompts
Bash on a Mac, installed by brew:
λ brew list | grep bash
bash
λ which bash
/usr/local/bin/bash
λ rm !("shorturl.api")
-bash: !: event not found
λ ls -1 | grep -v shorturl.api | xargs rm
rm: cannot remove ''$'\033''[0m'$'\033''[01;32mapi'$'\033''[0m': No such file or directory
rm: cannot remove ''$'\033''[01;34metc'$'\033''[0m': No such file or directory
rm: cannot remove ''$'\033''[01;34minternal'$'\033''[0m': No such file or directory
rm: cannot remove ''$'\033''[00mshorturl.go'$'\033''[0m': No such file or directory
The !(pattern-list) globbing pattern only works when extended globbing is enabled. See the extglob section in glob - Greg's Wiki. In this case you need:
shopt -s extglob
rm -- !(shorturl.api)
The -- with rm is to prevent files whose names begin with - from being treated as options.
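If you want to check what the pattern matches before deleting anything, a quick sanity check is:
shopt -s extglob
printf '%s\n' !(shorturl.api)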
One way to do it without extended globbing is:
find . -maxdepth 1 -type f ! -name shorturl.api -delete
The ls -1 | grep -v shorturl.api | xargs rm attempt in the question is broken in several ways, including:
The output of ls is intended for reading by humans. It is not suitable for automatic processing. See Why you shouldn't parse the output of ls(1).
The grep -v shorturl.api will exclude files other than the intended one. For instance, old-shorturl.api would be excluded.
xargs by default uses blanks and newlines to split its input into arguments (and also treats quotes and backslashes specially). xargs rm therefore won't correctly delete files that have such characters in their names, as demonstrated below.
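For instance, a single name containing a space is split into two separate arguments (echo stands in for rm; the file name is just an example):
$ printf 'file name1.txt\n' | xargs -n1 echo
file
name1.txt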
Thanks @GordonDavisson.
Using ls with --color=never before piping to xargs works:
ls -1 --color=never | grep -v shorturl.api | xargs rm -rf
I'm trying to remove all .js and .js.map files from any sub-directory of src called __tests__.
$ find . -path './src/**' -name __tests__ | # find subdirectories
> sed -E 's/([^ ]+__tests__)/\1\/*.js \1\/*.js.map/g' | # for each subdirectory, concat *.js and *.js.map
> xargs rm # remove files
This fails with the following errors:
rm: cannot remove './src/game/__tests__/*.js': No such file or directory
rm: cannot remove './src/game/__tests__/*.js.map': No such file or directory
rm: cannot remove './src/helpers/__tests__/*.js': No such file or directory
rm: cannot remove './src/helpers/__tests__/*.js.map': No such file or directory
However, if I change my xargs rm to xargs echo rm, copy and paste the output, and run it, it works.
$ find . -path './src/**' -name __tests__ | sed -E 's/([^ ]+__tests__)/\1\/*.js \1\/*.js.map/g' |
> xargs echo rm # echo command to remove files
rm ./src/game/__tests__/*.js ./src/game/__tests__/*.js.map ./src/helpers/__tests__/*.js ./src/helpers/__tests__/*.js.map
$ rm ./src/game/__tests__/*.js ./src/game/__tests__/*.js.map ./src/helpers/__tests__/*.js ./src/helpers/__tests__/*.js.map
Wrapping the output of my echo in $(...) and prepending rm results in the same error as before.
$ rm $(find . -path './src/**' -name __tests__ | sed -E 's/([^ ]+__tests__)/\1\/*.js \1\/*.js.map/g' | xargs echo)
rm: cannot remove './src/game/__tests__/*.js': No such file or directory
rm: cannot remove './src/game/__tests__/*.js.map': No such file or directory
rm: cannot remove './src/helpers/__tests__/*.js': No such file or directory
rm: cannot remove './src/helpers/__tests__/*.js.map': No such file or directory
What am I doing wrong?
I doubt it matters, but I'm using GitBash on Windows.
First, to explain the issue: In find | sed | xargs rm, the shell only sets up communication between those programs, but it doesn't actually process the results in any way. That's a problem here because *.js needs to be expanded by a shell to replace it with a list of filenames; rm treats every argument it's given as a literal name. (This is unlike Windows, where programs do their own command-line parsing and glob expansion).
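You can see the literal pass-through by putting echo in front of rm; nothing in the pipeline ever expands the *:
$ echo './src/game/__tests__/*.js' | xargs echo rm
rm ./src/game/__tests__/*.js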
Arguably, you don't need find here at all. Consider:
shopt -s globstar # enable ** as a recursion operator
rm ./src/**/__tests__/*.js{,.map} # delete *.js and *.js.map in any __tests__ directory under src
...or, if you do want to use find, let it do the work of coming up with a list of individual files matching *.js, instead of leaving that work to happen later:
find src -regextype posix-egrep -regex '.*/__tests__/[^/]*[.]js([.]map)?' -delete
You need to have your globs (*) expanded. File name expansion is performed by the shell on UNIX, not by rm or other programs. Try:
.... | xargs -d $'\n' sh -c 'IFS=; for f; do rm -- $f; done' sh
...to explain this (a dry-run variant is sketched after these notes):
The -d $'\n' ensures that xargs splits only on newlines (not spaces!), and also stops it from treating backslashes and quotes as special.
sh -c '...' sh runs ... as a script, with sh as $0, and subsequent arguments in $1, etc; for f; will thus iterate over those arguments.
Clearing IFS with IFS= prevents string-splitting from happening when $f is used unquoted, so only glob expansion happens.
Using the -- argument to rm ensures that it treats subsequent arguments as filenames, not options, even if they start with dashes.
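As the promised dry run, substitute echo for rm to see exactly which files the globs expand to before deleting anything (same structure as above, purely illustrative):
.... | xargs -d $'\n' sh -c 'IFS=; for f; do echo rm -- $f; done' sh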
That said, if you have really a lot of files for each pattern, you might run into an "argument list too long", even though you are using xargs.
Another caveat is that filenames containing newlines can potentially be split into multiple names (depending on the details of the version of find you're using). A way to solve this that will work with all POSIX-compliant versions of find might be:
find ./src -type d -name __tests__ -exec sh -c '
  for d; do
    # both globs spelled out: brace expansion is not available in POSIX sh
    rm -- "$d"/*.js "$d"/*.js.map
  done
' sh {} +
Below is the command I am using for moving files from dir a to dir b
ls /<someloc>/a/* | tail -2000 | xargs -I{} mv {} /<someloc>/b/
-bash: /usr/bin/ls: Argument list too long
Folder a has millions of files in it.
How can I fix this?
If the locations of both directories are on the same disk/partition and folder b is originally empty, you can do the following
$ rmdir /path/to/b
$ mv /other/path/to/a /path/to/b
$ mkdir /other/path/to/a
If folder b is not empty, then you can do something like this:
find /path/to/a/ -type f -exec mv -t /path/to/b {} +
If you just want to move 2000 files, you can do
find /path/to/a/ -type f -print | tail -2000 | xargs mv -t /path/to/b
But this can be problematic with some filenames (for example, names containing newlines). A cleaner way is to use find's -print0, but classic head and tail can't process NUL-delimited input (GNU versions have -z/--zero-terminated, though you can't rely on it everywhere), so you can use awk for this instead.
# first 2000 files (mimics head)
find /path/to/a -type f -print0 \
  | awk 'BEGIN{RS=ORS="\0"}(NR<=2000)' \
  | xargs -0 mv -t /path/to/b
# last 2000 files (mimics tail); a ring buffer keeps only the last 2000 names
find /path/to/a -type f -print0 \
  | awk 'BEGIN{RS=ORS="\0"}{a[NR%2000]=$0}END{for(i=NR+1;i<=NR+2000;++i) if((i%2000) in a) print a[i%2000]}' \
  | xargs -0 mv -t /path/to/b
The ls in the code in the question does nothing useful. The glob (/<someloc>/a/*) produces a sorted list of files, and ls just copies it (after re-sorting it), if it works at all. See “Argument list too long”: How do I deal with it, without changing my command? for the reason why ls is failing.
One way to make the code work is to replace ls with printf:
printf '%s\n' /<someloc>/a/* | tail -2000 | xargs -I{} mv {} /<someloc>/b/
printf is a Bash builtin, so the shell never has to exec an external program with that huge argument list, and the kernel limit behind "Argument list too long" doesn't apply.
This code will still fail if any of the files contains a newline character in its name. See the answer by kvantour for alternatives that are not vulnerable to this problem.
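If your tools are GNU (coreutils for tail -z and mv -t, plus an xargs with -0), a NUL-delimited variant of the same printf idea sidesteps the newline problem as well. This is only a sketch under that assumption; -z is not available everywhere:
printf '%s\0' /<someloc>/a/* | tail -z -n 2000 | xargs -0 mv -t /<someloc>/b/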
I have managed to do this separately using
grep -r "zone 19" path
mkdir zone19
find . -name "ListOfFilesfromGrep" -exec mv -i {} zone19 \;
I just don't know how to combine the two, that is, how to input the list of files I get from grep into the find command.
You should use grep from within find:
find /path/to/dir -type f -exec grep -q "zone 19" {} \; -exec mv -i {} zone19 \;
You could try
grep -lr "zone 19" path | while read in ; do mv -i "$in" zone19; done
-l prints the filenames with matched string; while ... done move the files one by one.
Using GNU versions of the standard tools:
grep -l will give you the filenames.
mv -t will move to a given directory.
xargs -r will invoke a command using arguments from stdin, but only if there's at least one.
Combine them like this:
grep -l -r -e 'zone 19' path | xargs -r mv -i -t 'zone19'
Or (if your filenames might contain newlines etc):
grep -lZr -e 'zone 19' path | xargs -0r mv -it 'zone19'
You can pipe the result from grep and use xargs:
grep -lr "zone 19" path | xargs <command>
<command> is run with the file names printed by grep appended as its arguments. Note that the -l flag tells grep to print only the names of files that contain a match, rather than the matching lines themselves.
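For this question, that could look like the following sketch (-I makes xargs substitute each name whole, so names with spaces survive; names containing newlines would still break it):
grep -lr "zone 19" path | xargs -I{} mv -i {} zone19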
Below is a command to move all files containing the string "Hello" to the folder zone19.
grep Hello * | cut -f1 -d":" | sort -u | xargs -I {} mv {} zone19
grep -n magenta *| rm *
grep: a.txt: No such file or directory
grep: b: No such file or directory
The above command removes all the files in the directory (except . and ..).
It should remove only the files that contain the word "magenta".
I also tried grep magenta * -exec rm '{}' \; but had no luck.
Any idea?
Use xargs:
grep -l --null magenta ./* | xargs -0 rm
The purpose of xargs is to take input on stdin and place it on the command line of the command it is given (rm in this case).
What the options do:
The -l option tells grep not to print the matching text and instead just print the names of the files that contain matching text.
The --null option tells grep to separate the filenames with NUL characters. This allows all manner of filenames to be handled safely.
The -0 option tells xargs to treat its input as NUL-separated.
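To preview which files would be removed, two dry-run variations of the same pipeline: swap rm for echo, or add xargs's -p flag so it prompts before running the command:
grep -l --null magenta ./* | xargs -0 echo rm
grep -l --null magenta ./* | xargs -0 -p rm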
Here is a safe way:
grep -lrZ magenta . | xargs -0 rm -f --
-l prints file names of files matching the search pattern.
-r performs a recursive search for the pattern magenta in the given directory (. here). If this doesn't work, try -R.
-Z prints a NUL byte after each file name instead of a newline, so that names containing spaces or newlines are not misinterpreted (i.e., as multiple names instead of one).
xargs -0 reads the NUL-separated names from grep and feeds them to rm -f.
-- is often forgotten but it is very important to mark the end of options and allow for removal of files whose names begin with -.
If you would like to see which files are about to be deleted, simply remove the | xargs -0 rm -f -- part (and drop the Z so that the names print one per line).