To find some files with grep and delete them with rm, I tried the following command:
$ ls | grep 2019 | xargs -i rm \"{}\"
That did not work; I got the following error message:
rm: cannot remove '"2019-05-10 00:00:00-TO-2019-05-10 23:59:59_PDT_disconnection_info.csv"': No such file or directory
It looks like xargs is taking the quotes literally. So I tried echoing instead of passing the name directly:
ls | grep 2019 | xargs -i echo \"{}\" | xargs rm
This worked.
Why doesn't it work without echoing?
The proper quoting is done by xargs; there is no need to quote it again. Just:
... | xargs -i rm {}
Or better, because rm accepts multiple arguments, just do:
... | xargs rm
Why doesn't it work without echoing?
When not used with -i, -I, -d or similar, the xargs utility handles quoting in its input: blanks can be protected with double or single quotes or escaped with a backslash. The quotes are removed by the second xargs, and rm is passed the unquoted string. From man xargs:
.... xargs reads
items from the standard input, delimited by blanks (which can be
protected with double or single quotes or a backslash)
Compare:
$ echo "\e\r\t\q\e" | xargs -t echo
echo ertqe
ertqe
Also see Why you shouldn't parse the output of ls(1).
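As a side note, since the goal here is just to delete files whose names contain 2019, a find-based sketch avoids parsing ls entirely (assuming the files sit in the current directory):
find . -maxdepth 1 -type f -name '*2019*' -exec rm -- {} +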
I can't find posts that help with this exact problem:
On Mac Terminal I want to read a txt file (example.txt) containing file names such as:
20130815 144129 865 000000 0172 0780.bmp
20130815 144221 511 000003 1068 0408.bmp
....100 more
And I want to search for them in a certain folder/subfolders (example_folder). After each find, the file should be copied to a new folder x (new_destination).
Your help would be much appreciated!
Cheers,
Mo
You could use a piped command with a combination of ls, grep, xargs and cp.
So basically you start by getting the list of files:
ls
then you filter them with egrep -e, grep -e or whatever flavor of grep the Mac terminal uses. If you want to find all files ending with .txt you can use the regex .txt$ (which means "ends with '.txt'").
ls | egrep -e "yourRegexExpression"
After that you get an output stream, but cp doesn't read from a stream; it only takes arguments, which is why we use xargs to convert the stream into arguments. The final step is to add cp's -t flag to specify the target directory.
ls | egrep -e "yourRegexExpression" | xargs cp -t DIRECTORY
I hope this helps!
Edit
Sorry, I didn't read the question well enough; I have updated the answer to match your problem. Here the egrep command builds a rather large regex string containing all the file names in the form (filename1|filename2|...|filenameN). The $() evaluates the command inside, which uses tr to translate newlines to "|" for the regex.
ls | egrep -e "("$(cat yourtextfile.txt | tr "\n" "|")")" | xargs cp -t DIRECTORY
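One caveat, since the question is about the Mac Terminal: BSD cp there has no -t option. A sketch of the same pipeline that uses xargs' -I placeholder to put the target directory last instead (DIRECTORY is still a placeholder), which also keeps each file name intact as a single argument even if it contains spaces:
ls | egrep -e "yourRegexExpression" | xargs -I % cp % DIRECTORY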
You could do something like:
$ for i in `cat example.txt`; do
    find /search/path -type f -name "$i" -exec cp "{}" /new/path \;
  done
This is how it works. For every line within example.txt:
for i in `cat example.txt`
it will try to find a file matching the line $i in the defined path:
find /search/path -type f -name "$i"
And if found it will copy it to the desired location:
-exec cp "{}" /new/path \;
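Note that the file names in the question contain spaces, and an unquoted cat inside a for loop will split them into separate words. A sketch that reads the list line by line instead (using example.txt, example_folder and new_destination as named in the question):
while IFS= read -r name; do
    find example_folder -type f -name "$name" -exec cp "{}" new_destination/ \;
done < example.txt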
Linux/bash, taking the list of lines on input and using xargs to work on each line:
% ls -1 --color=never | xargs -I{} echo {}
a
b
c
Cygwin, take 1:
$ ls -1 --color=never | xargs -I{} echo {}
xargs: invalid option -- I
Usage: xargs [-0prtx] [-e[eof-str]] [-i[replace-str]] [-l[max-lines]]
[-n max-args] [-s max-chars] [-P max-procs] [--null] [--eof[=eof-str]]
[--replace[=replace-str]] [--max-lines[=max-lines]] [--interactive]
[--max-chars=max-chars] [--verbose] [--exit] [--max-procs=max-procs]
[--max-args=max-args] [--no-run-if-empty] [--version] [--help]
[command [initial-arguments]]
Cygwin, take 2:
$ ls -1 --color=never | xargs echo
a b c
(yes, I know there's the universal method of ls -1 --color=never | while read X; do echo ${X}; done, and I have tested that it works in Cygwin too, but I'm looking for a way to make xargs work correctly in Cygwin)
damienfrancois's answer is correct. You probably want to use -n to force echo to print one file name at a time.
However, if you are really interested in taking each file and running the command on it one at a time, you may be better off using find:
$ find . -maxdepth 1 -exec echo {} \;
A few things:
This will pick up file names that begin with a period (including '.')
This will put a ./ in front of your file names.
The echo being used is from /bin/echo and not the built in shell version of echo.
However, it doesn't depend upon the shell expanding ls * and possibly causing issues (such as coloring file names, or printing out files in sub-directories, which your command will do).
The purpose of xargs is to minimize the number of times a particular command is executed:
$ find . -type f | xargs foo
In this case, xargs will execute foo only a minimal number of times. foo will only execute when the command line buffer gets full, or there are no more file names. However, if you are forcing an execution after each name, you're probably better off using find. It's a lot more flexible and you're not depending upon shell behavior.
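A quick way to see that batching is xargs' -t flag, which prints each command line before running it (a, b, c, d are just placeholder words):
$ printf '%s\n' a b c d | xargs -t echo
echo a b c d
a b c d
$ printf '%s\n' a b c d | xargs -t -n 2 echo
echo a b
a b
echo c d
c d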
Use the -n option of xargs, which is really the one you should be using here; -I is an option that gives the argument a 'name' so you can make it appear anywhere in the command line:
$ ls -1 --color=never | xargs echo
a b c
$ ls -1 --color=never | xargs -n 1 echo
a
b
c
From the manpage:
-n max-args
Use at most max-args arguments per command line
-I replace-str
Replace occurrences of replace-str in the initial-arguments with names read from standard input.
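To make the contrast concrete, here is a small sketch of what -I buys you: the replacement string can appear anywhere on the command line, even more than once (the backup/ path is only illustrative):
$ printf '%s\n' a b c | xargs -I {} echo "name={} copy-to=backup/{}"
name=a copy-to=backup/a
name=b copy-to=backup/b
name=c copy-to=backup/c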
How do I pipe commands and their results in Ubuntu when writing them in the terminal? I would write the following commands in sequence:
$ ls | grep ab
abc.pdf
cde.pdf
$ cp abc.pdf cde.pdf files/
I would like to pipe the results of the first command into the second command, and write it all on one line. How do I do that?
something like
$ cp "ls | grep ab" files/
(the above is a contrived example and can be written as cp *.pdf files/)
Use the following:
cp `ls | grep ab` files/
Well, since the xargs person gave up, I'll offer my xargs solution:
ls | grep ab | xargs echo | while read f; do cp $f files/; done
Of course, this solution suffers from an obvious flaw: files with spaces in them will cause chaos.
An xargs solution without this flaw? Hmm...
ls | grep ab | xargs '-d\n' bash -c 'docp() { cp "$@" files/; }; docp "$@"' _
Seems a bit clunky, but it works; the trailing _ fills bash -c's $0 slot so the first file name isn't swallowed. Unless you have files with newlines in them, I mean. However, anyone who does that deserves what they get. Even that is solvable:
find . -mindepth 1 -maxdepth 1 -name '*ab*' -print0 | xargs -0 bash -c 'docp() { cp "$@" files/; }; docp "$@"' _
To use xargs, you need to ensure that the filename arguments are the last arguments passed to the cp command. You can accomplish this with the -t option to cp to specify the target directory:
ls | grep ab | xargs cp -t files/
Of course, even though this is a contrived example, you should not parse the output of ls.
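And because it is contrived, a glob sidesteps the whole issue; a minimal sketch assuming the files to copy match *ab* in the current directory:
cp -t files/ ./*ab*     # GNU cp
cp ./*ab* files/        # portable equivalent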
How can I make xargs execute the command exactly once for each line of input given?
Its default behavior is to chunk the lines and execute the command once, passing multiple lines to each instance.
From http://en.wikipedia.org/wiki/Xargs:
find /path -type f -print0 | xargs -0 rm
In this example, find feeds the input of xargs with a long list of file names. xargs then splits this list into sublists and calls rm once for every sublist. This is more efficient than this functionally equivalent version:
find /path -type f -exec rm '{}' \;
I know that find has the "exec" flag. I am just quoting an illustrative example from another resource.
The following will only work if you do not have spaces in your input:
xargs -L 1
xargs --max-lines=1 # synonym for the -L option
from the man page:
-L max-lines
Use at most max-lines nonblank input lines per command line.
Trailing blanks cause an input line to be logically continued on
the next input line. Implies -x.
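A quick sketch of why spaces are a problem: each input line is still word-split into separate arguments (printf here just brackets every argument it receives):
$ printf 'one two\nthree\n' | xargs -L 1 printf '[%s]\n'
[one]
[two]
[three]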
It seems to me all existing answers on this page are wrong, including the one marked as correct. That stems from the fact that the question is ambiguously worded.
Summary: If you want to execute the command "exactly once for each line of input," passing the entire line (without newline) to the command as a single argument, then this is the best UNIX-compatible way to do it:
... | tr '\n' '\0' | xargs -0 -n1 ...
If you are using GNU xargs and don't need to be compatible with all other UNIXes (FreeBSD, Mac OS X, etc.), then you can use the GNU-specific option -d:
... | xargs -d\\n -n1 ...
Now for the long explanation…
There are two issues to take into account when using xargs:
how does it split the input into "arguments"; and
how many arguments to pass the child command at a time.
To test xargs' behavior, we need a utility that shows how many times it's being executed and with how many arguments. I don't know if there is a standard utility to do that, but we can code it quite easily in bash:
#!/bin/bash
echo -n "-> "; for a in "$@"; do echo -n "\"$a\" "; done; echo
Assuming you save it as show in your current directory and make it executable, here is how it works:
$ ./show one two 'three and four'
-> "one" "two" "three and four"
Now, if the original question is really about point 2 above (as I think it is, after reading it a few times over) and it is to be read like this (changes in bold):
How can I make xargs execute the command exactly once for each argument of input given? Its default behavior is to chunk the input into arguments and execute the command as few times as possible, passing multiple arguments to each instance.
then the answer is -n 1.
Let's compare xargs' default behavior, which splits the input around whitespace and calls the command as few times as possible:
$ echo one two 'three and four' | xargs ./show
-> "one" "two" "three" "and" "four"
and its behavior with -n 1:
$ echo one two 'three and four' | xargs -n 1 ./show
-> "one"
-> "two"
-> "three"
-> "and"
-> "four"
If, on the other hand, the original question was about point 1, input splitting, and it was to be read like this (many people coming here seem to think that's the case, or are confusing the two issues):
How can I make xargs execute the command with exactly one argument for each line of input given? Its default behavior is to chunk the lines around whitespace.
then the answer is more subtle.
One would think that -L 1 could be of help, but it turns out it doesn't change argument parsing. It only executes the command once for each input line, with as many arguments as there were on that input line:
$ echo $'one\ntwo\nthree and four' | xargs -L 1 ./show
-> "one"
-> "two"
-> "three" "and" "four"
Not only that, but if a line ends with whitespace, it is joined with the next one:
$ echo $'one \ntwo\nthree and four' | xargs -L 1 ./show
-> "one" "two"
-> "three" "and" "four"
Clearly, -L is not about changing the way xargs splits the input into arguments.
The only option that does so in a cross-platform fashion (excluding GNU extensions) is -0, which splits the input around NUL bytes.
Then, it's just a matter of translating newlines to NUL with the help of tr:
$ echo $'one \ntwo\nthree and four' | tr '\n' '\0' | xargs -0 ./show
-> "one " "two" "three and four"
Now the argument parsing looks all right, including the trailing whitespace.
Finally, if you combine this technique with -n 1, you get exactly one command execution per input line, whatever input you have, which may be yet another way to look at the original question (possibly the most intuitive, given the title):
$ echo $'one \ntwo\nthree and four' | tr '\n' '\0' | xargs -0 -n1 ./show
-> "one "
-> "two"
-> "three and four"
As mentioned above, if you are using GNU xargs you can replace the tr with the GNU-specific option -d:
$ echo $'one \ntwo\nthree and four' | xargs -d\\n -n1 ./show
-> "one "
-> "two"
-> "three and four"
If you want to run the command for every line (i.e. result) coming from find, then what do you need the xargs for?
Try:
find path -type f -exec your-command {} \;
where the literal {} gets substituted by the filename and the literal \; is needed for find to know that the custom command ends there.
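If what you actually want is the xargs-style batching, find can do that by itself: terminating -exec with + instead of \; makes find append as many file names as fit onto each invocation of the command, for example:
find path -type f -exec your-command {} +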
EDIT:
(after the edit of your question clarifying that you know about -exec)
From man xargs:
-L max-lines
Use at most max-lines nonblank input lines per command line. Trailing
blanks cause an input line to be logically continued on the next input line.
Implies -x.
Note that filenames ending in blanks would cause you trouble if you use xargs:
$ mkdir /tmp/bax; cd /tmp/bax
$ touch 'a ' b 'c ' c
$ find . -type f -print | xargs -L1 wc -l
0 ./c
0 ./c
0 total
0 ./b
wc: ./a: No such file or directory
So if you don't care about the -exec option, you'd better use -print0 and -0:
$ find . -type f -print0 | xargs -0L1 wc -l
0 ./c
0 ./c
0 ./b
0 ./a
How can I make xargs execute the command exactly once for each line of input given?
-L 1 is the simple solution, but it does not work if any of the files contain spaces. This is a key function of find's -print0 argument: it separates the arguments with the '\0' character instead of whitespace. Here's an example:
echo "file with space.txt" | xargs -L 1 ls
ls: file: No such file or directory
ls: with: No such file or directory
ls: space.txt: No such file or directory
A better solution is to use tr to convert newlines to null (\0) characters, and then use the xargs -0 argument. Here's an example:
echo "file with space.txt" | tr '\n' '\0' | xargs -0 ls
file with space.txt
If you then need to limit the number of calls you can use the -n 1 argument to make one call to the program for each input:
echo "file with space.txt" | tr '\n' '\0' | xargs -0 -n 1 ls
This also allows you to filter the output of find before converting the line breaks into nulls.
find . -name \*.xml | grep -v /target/ | tr '\n' '\0' | xargs -0 tar -cf xml.tar
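If the names might also contain newlines, GNU grep's -z option (null-terminated lines) keeps the stream null-delimited all the way through; a sketch assuming GNU find and grep:
find . -name '*.xml' -print0 | grep -vz /target/ | xargs -0 tar -cf xml.tar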
These two ways also work, and they also work with commands other than find!
xargs -I '{}' rm '{}'
xargs -i rm '{}'
example use case:
find . -name "*.pyc" | xargs -i rm '{}'
will delete all .pyc files under the current directory, even if the .pyc file names contain spaces.
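Spaces are handled because -i/-I switch xargs to line-oriented input, but a newline in a file name would still break it; a null-delimited sketch covers that case as well:
find . -name '*.pyc' -print0 | xargs -0 -I {} rm '{}'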
Another alternative...
find /path -type f | while read ln; do echo "processing $ln"; done
find path -type f | xargs -L1 command
is all you need.
The following command will find all the files (-type f) in /path and then copy them using cp to the current folder. Note the use of -I % to specify a placeholder character in the cp command line so that arguments can be placed after the file name.
find /path -type f -print0 | xargs -0 -I % cp % .
Tested with xargs (GNU findutils) 4.4.0
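With GNU cp you can also use -t to put the destination first, which lets xargs batch many files into each cp call instead of running cp once per file; a sketch under that assumption:
find /path -type f -print0 | xargs -0 cp -t .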
You can limit the number of lines, or of arguments (when the arguments are separated by spaces), using the --max-lines or --max-args flags, respectively.
-L max-lines
Use at most max-lines nonblank input lines per command line. Trailing blanks cause an input line to be logically continued on the next input
line. Implies -x.
--max-lines[=max-lines], -l[max-lines]
Synonym for the -L option. Unlike -L, the max-lines argument is optional. If max-lines is not specified, it defaults to one. The -l option
is deprecated since the POSIX standard specifies -L instead.
--max-args=max-args, -n max-args
Use at most max-args arguments per command line. Fewer than max-args arguments will be used if the size (see the -s option) is exceeded,
unless the -x option is given, in which case xargs will exit.
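A short sketch of the difference between the two (a, b and c are placeholder words):
$ printf 'a b\nc\n' | xargs -n 1 echo
a
b
c
$ printf 'a b\nc\n' | xargs -L 1 echo
a b
c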
@Draemon's answer seems to be right about "-0", even with spaces in the file names.
I was trying the xargs command and found that "-0" works perfectly with "-L"; even the spaces are handled (if the input was null-terminated). The following is an example:
#touch "file with space"
#touch "file1"
#touch "file2"
The following will split on the nulls and execute the command on each argument in the list:
#find . -name 'file*' -print0 | xargs -0 -L1
./file with space
./file1
./file2
So -L1, when used with "-0", will execute the command once for each null-terminated argument. To see the difference, try:
#find . -name 'file*' -print0 | xargs -0 | xargs -L1
./file with space ./file1 ./file2
Even this will execute only once:
#find . -name 'file*' -print0 | xargs -0 | xargs -0 -L1
./file with space ./file1 ./file2
The command will execute only once because, after the first xargs -0 joins everything onto one line, the second xargs sees no null bytes to split on. You need to give "-0" and "-L" to the same xargs invocation for this to work.
It seems I don't have enough reputation to add a comment to Tobia's answer above, so I am adding this "answer" to help those of us wanting to experiment with xargs the same way on Windows platforms.
Here is a Windows batch file that does the same thing as Tobia's quickly coded "show" script:
@echo off
REM
REM cool trick of using "set" to echo without new line
REM (from: http://www.psteiner.com/2012/05/windows-batch-echo-without-new-line.html)
REM
if "%~1" == "" (
exit /b
)
<nul set /p=Args: "%~1"
shift
:start
if not "%~1" == "" (
<nul set /p=, "%~1"
shift
goto start
)
echo.
In your example, the point of piping the output of find to xargs is that the standard behavior of find's -exec option is to execute the command once for each found file. If you're using find, and you want its standard behavior, then the answer is simple - don't use xargs to begin with.
Execute the ant task clean-all on every build.xml in the current folder or any sub-folder:
find . -name 'build.xml' -exec ant -f {} clean-all \;