A coworker showed me a nifty way of using rm and xargs for deleting filenames listed in a .txt - but I can't remember what he did.
I ran
echo | xargs -a file.txt
where file.txt contained
1
2
3
4
And it printed
1 2 3 4
My logic says that
rm | xargs -a file.txt
should delete the files I created titled 1 and 2 and 3 and 4.
But that is not the behavior I get.
How do I form this simple command?
I believe you want:
xargs -a file.txt rm
The command you want xargs to run on the items in the file goes last, after xargs' own options; xargs appends the items it reads from file.txt as arguments to that command.
The solution proposed by Lynch is also valid and equivalent to this one.
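To see exactly what xargs -a file.txt rm will run, you can add -t, which makes xargs echo each command before executing it (-a and -t are GNU xargs options). A small sketch using the same file.txt and files as in the question:
$ printf '%s\n' 1 2 3 4 > file.txt
$ touch 1 2 3 4
$ xargs -a file.txt -t rm
rm 1 2 3 4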
Try this command:
xargs rm < file.txt
xargs takes every item it reads on its input and appends it to the command you specify.
so if file.txt contains:
a
b
then xargs will execute rm a b
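To try this without removing anything, you can put echo in front of rm as a dry run. A small sketch, recreating the two-line file.txt from this example:
$ printf '%s\n' a b > file.txt
$ xargs echo rm < file.txt
rm a b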
Unless file.txt is really large, xargs is unnecessary and this is equivalent:
rm $(<file.txt)
Note that $(<file.txt) is a bash/ksh shortcut; the strictly portable (POSIX) spelling is rm $(cat file.txt).
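For example, with the four-line file.txt from the question, the command substitution expands to the same argument list xargs would build (echo used here as a dry run):
$ echo rm $(<file.txt)
rm 1 2 3 4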
Related
Say I have two files t1 and t2, both with the same content: abc.
Now I want to delete all files that contain the string abc.
So I tried to run the command grep -rl abc . | rm, but it doesn't work.
Then I add xargs: grep -rl abc . | xargs rm and it works.
I can't understand clearly what xargs did.
grep writes its results to stdout, but rm does not read data from stdin (which is what the pipe connects).
What you want instead is for the output of grep to become arguments of rm. That is what xargs does: it "converts" its stdin into arguments for the command you give it (its first argument) and then calls that command.
As an alternative, you could do
rm `grep -rl abc .`
or
rm $(grep -rl abc .)
But xargs also handles the case where there are too many arguments for a single invocation of the command. With enough matches, the command-substitution forms above will fail with an "Argument list too long" error.
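To see the batching on its own, here is a small sketch with dummy input; -n caps the number of arguments per invocation, so xargs ends up calling echo three times:
$ seq 5 | xargs -n 2 echo
1 2
3 4
5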
rm doesn't read from standard input (except when prompting, like with -i) but takes its arguments on the command line. That's what xargs does for you: read things from standard input and give them to rm as arguments.
Example with echo:
$ (echo a; echo b; date) | xargs echo
a b tor 12 apr 2018 14:18:50 CEST
Linux/bash, taking the list of lines on input and using xargs to work on each line:
% ls -1 --color=never | xargs -I{} echo {}
a
b
c
Cygwin, take 1:
$ ls -1 --color=never | xargs -I{} echo {}
xargs: invalid option -- I
Usage: xargs [-0prtx] [-e[eof-str]] [-i[replace-str]] [-l[max-lines]]
[-n max-args] [-s max-chars] [-P max-procs] [--null] [--eof[=eof-str]]
[--replace[=replace-str]] [--max-lines[=max-lines]] [--interactive]
[--max-chars=max-chars] [--verbose] [--exit] [--max-procs=max-procs]
[--max-args=max-args] [--no-run-if-empty] [--version] [--help]
[command [initial-arguments]]
Cygwin, take 2:
$ ls -1 --color=never | xargs echo
a b c
(yes, I know there's a universal method, ls -1 --color=never | while read X; do echo ${X}; done, which I have tested and it works in Cygwin too, but I'm looking for a way to make xargs work correctly in Cygwin)
damienfrancois's answer is correct. You probably want to use -n to force echo to print one file name at a time.
However, if you really want to take each file and run the command on it one at a time, you may be better off using find:
$ find . -maxdepth 1 -exec echo {} \;
A few things:
This will pick up file names that begin with a period (including '.')
This will put a ./ in front of your file names.
The echo being used is /bin/echo and not the shell's built-in echo.
However, it doesn't depend upon the shell executing ls *, which can cause issues (such as coloring file names, or printing out files in sub-directories, which your command will do).
The purpose of xargs is to minimize the number of times a particular command is executed:
$ find . -type f | xargs foo
In this case, xargs will execute foo only a minimal number of times: foo runs only when the command-line buffer fills up or there are no more file names. However, if you are forcing an execution after each name, you're probably better off using find. It's a lot more flexible, and you're not depending upon shell behavior.
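As a rough illustration of the difference, assuming a directory containing only the files a and b (echo stands in for foo, and find's output order may vary):
$ find . -type f -exec echo {} \;
./a
./b
$ find . -type f | xargs echo
./a ./b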
Use the -n option of xargs, which is really the one you should be using here; -I is an option that gives the argument a 'name' so you can make it appear anywhere in the command line:
$ ls -1 --color=never | xargs echo
a b c
$ ls -1 --color=never | xargs -n 1 echo
a
b
c
From the manpage:
-n max-args
Use at most max-args arguments per command line
-I replace-str
Replace occurrences of replace-str in the initial-arguments with names read from standard input.
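For instance, -I comes in handy when the name has to appear somewhere other than the end of the command line. A sketch with the same a, b, c files; backup/ is a hypothetical destination, and the leading echo keeps it a dry run:
$ ls -1 --color=never | xargs -I{} echo cp {} backup/{}
cp a backup/a
cp b backup/b
cp c backup/c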
I have found the 5 most recent core files.
I need to delete all core files except these 5 files.
ls -t /u01/1/bin/core.siebprocmw.* | head -n 5
is the command to find the 5 most recent files by time.
ls -t /u01/1/bin/core.siebprocmw.* | head -n 5 | xargs rm -r
is the command that removes those 5 files.
I need to delete all files except these last 5 files. Any ideas?
You could use sed to drop the first five lines (the five newest files) from the list, then delete the rest:
ls -t /u01/1/bin/core.siebprocmw.* | sed '1,5d' | xargs rm -r
You could also try
ls -tr /u01/1/bin/core.siebprocmw.* | head -n -5 | xargs rm -r
head -n -5 (a GNU head extension) prints everything except the last 5 lines of its input; since ls -tr lists the oldest files first, the last 5 lines are the 5 newest files, so those are the ones kept.
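A quick way to see which lines each filter keeps, using numbered lines as a stand-in for the file list:
$ seq 8 | sed '1,5d'
6
7
8
$ seq 8 | head -n -5
1
2
3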
Problem
In a directory there are files of the format: *-foo-bar.txt.
Example directory:
$ ls *-*
asdf-foo-bar.txt ghjk-foo-bar.txt l-foo-bar.txt tyui-foo-bar.txt
bnm-foo-bar.txt iop-foo-bar.txt qwer-foo-bar.txt zxcv-foo-bar.txt
Desired directory:
$ ls *.txt
asdf.txt bnm.txt ghjk.txt iop.txt l.txt qwer.txt tyui.txt zxcv.txt
Solution 1
The first solution that came to my mind looks somewhat like this ugly hack:
ls *-* | cut -d- -f1 | sed 's/.*/mv "\0-foo-bar.txt" "\0.txt"/' > rename.sh && sh rename.sh
The above solution creates a script on the fly to rename the files. It also parses the output of ls, which is not a good thing to do, as explained at http://mywiki.wooledge.org/ParsingLs.
Solution 2
This problem can be solved more elegantly with a shell script like this:
for i in *-*
do
mv "$i" "`echo $i | cut -f1 -d-`.txt"
done
The above solution uses a loop to rename the files.
Question
Is there a way to solve this problem in a single line such that we do not have to explicitly script a loop, or generate a script, or invoke a new or the current shell (i.e. avoid sh, bash, ., etc. commands)?
Have you tried the rename command?
For example:
rename 's/-foo-bar//' *-foo-bar.txt
If you don't have that available, I would use find, sed, and xargs:
find . -maxdepth 1 -mindepth 1 -type f -name '*-foo-bar.txt' | sed 's/-foo-bar.txt//' | xargs -I{} mv {}-foo-bar.txt {}.txt
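If you want to preview the renames before doing them for real, one option is to put echo in front of mv, which prints each mv command instead of executing it; remove the echo once the output looks right:
find . -maxdepth 1 -mindepth 1 -type f -name '*-foo-bar.txt' | sed 's/-foo-bar.txt//' | xargs -I{} echo mv {}-foo-bar.txt {}.txt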
I use the below command to delete changed files sometimes when using hg.
hg status -n | xargs rm
I have come across a problem: if the output of
hg status -n
contains any file paths with spaces, those files will not be found. Usually I would quote or escape spaces in file names, but I'm not sure how to do this with piped output. Any help would be great, thanks :)
Tell both commands to use NUL as the delimiter:
hg status -n0 | xargs -0 rm
Also be careful: hg status -n will list even files Mercurial doesn't know about (untracked files), so those would be deleted too.
Maybe you want this instead?
hg status -mn0 | xargs -0 rm
Also, don't forget about hg revert or hg purge. Maybe they do what you want, e.g.
hg revert --all --no-backup
or, with the purge extension enabled in your .hgrc:
[extensions]
hgext.purge=
then, in the shell:
hg purge
I don't have hg installed. So I will do it with ls:
$ touch 'file A' 'file B'
$ ls -1
file A
file B
$ ls | xargs rm
rm: cannot remove `file': No such file or directory
rm: cannot remove `A': No such file or directory
rm: cannot remove `file': No such file or directory
rm: cannot remove `B': No such file or directory
$ ls | tr '\n' '\0' | xargs -0 rm
$ ls
Let xargs handle that with the -I option:
hg status -n | xargs -I FileName rm FileName
-I increases safety but reduces efficiency, as only one filename at a time will be passed to rm.
An example:
$ printf "%s\n" one "2 two" "three 3 3" | xargs printf "%s\n"
one
2
two
three
3
3
$ printf "%s\n" one "2 two" "three 3 3" | xargs -I X printf "%s\n" X
one
2 two
three 3 3
Besides -0, newer versions of xargs have a -d option, which can help with this kind of thing:
<command returning \n-separated paths> | xargs -d \\n rm -v
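For example (the file names are made up, and the leading echo turns it into a dry run):
$ printf '%s\n' 'file A' 'file B' | xargs -d '\n' echo rm -v
rm -v file A file B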