Why does xargs -t not display quotes?

Suppose you create two directories, then open the ones you find with ls -Q and grep:
$ mkdir "example 1"
$ mkdir "example 2"
$ ls -Q | grep example | xargs -t nautilus
Then the option -t shows nautilus example 1 example 2 without quotes. However, the folders are opened correctly.
$ ls -Q | grep example | xargs -t echo
echo example 1 example 2
example 1 example 2
And to be totally complete, let me show the input for xargs:
$ ls -Q | grep example
"example 1"
"example 2"
So the quotes were there...
What is going on here? Where did the quotes go?

xargs treats quotes and backslashes as special characters. If you want it to emit quotes, you'll need to escape them. Pass the grep output through sed to escape the quotes for you:
$ ls -Q | grep example | sed 's/"/\\"/g' | xargs echo
"example 1" "example 2"

Related

delimit grep output from xargs

Is there a way to delimit the outputs from running grep with each arg from xargs?
I'm trying to run this:
echo 'pattern1\npattern2' | xargs -I{} grep -r '{}' *
For context, I'm trying to test existence of each pattern within a directory; something like this:
#pattern exit_status
pattern1 0
pattern2 1
I don't exactly need the delimiter, but want one as it would allow me to count per pattern. If there's a better way to do this I'd be open to that, too!
You could run something like this:
printf '%s\n' pattern1 pattern2 | xargs -I{} sh -c 'grep -rq "$1"; echo "$1 $?"' sh {}
This echoes each pattern along with the exit status of the recursive grep command for that pattern.
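A hypothetical run, assuming pattern1 occurs somewhere under the current directory and pattern2 does not, would print something like:
$ printf '%s\n' pattern1 pattern2 | xargs -I{} sh -c 'grep -rq "$1"; echo "$1 $?"' sh {}
pattern1 0
pattern2 1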

file and number of lines matching

I am trying to count lines in the files which match a given pattern. The problem is that it gives me only the number of lines. How can I get the file location or name along with the number of matched lines?
The command I am using now is:
for i in $(find . -name 'foo.txt' | sed 's/\.\///g');
do
grep -l && -c '^>' $i;
done
So the output I am expecting is like "file location/name number of lines matching".
grep can show you the file name if you specify it as a command-line argument. You can use xargs to invoke grep for each batch of filenames. It'll read the names from standard input and use them as command line arguments for grep.
find . | xargs grep -cH '^>'
Using your find command:
find . -name 'foo.txt' | sed 's/\.\///g' | xargs grep -cH '^>'
You can capture the number of matching lines in a variable and test it:
n=$(grep -c '^>' "$i")
(( n > 0 )) && echo "$i:$n"
Actually, you don't even need the test: grep exits unsuccessfully if no matches are found, so:
n=$(grep -c '^>' "$i") && echo "$i:$n"
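Putting that together with the original find command, the whole loop could look something like this (a sketch; it assumes file names without embedded newlines):
find . -name 'foo.txt' | while IFS= read -r i; do
    n=$(grep -c '^>' "$i") && echo "$i:$n"
done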
Actually, that's too much work. With GNU grep at least, use the -H option. A demo with a file I have lying around:
$ grep -c zero note.xml
16
$ grep -Hc zero note.xml
note.xml:16
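Combining -H with the original find pipeline and then filtering out the files with zero matches could look like this (just a sketch, assuming GNU grep and findutils):
find . -name 'foo.txt' -print0 | xargs -0 grep -Hc '^>' | grep -v ':0$'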

bash: Script arguments ${1} interfering with xargs' ${1} inside script

Long story short, I have a script chunk that looks like this:
cd ${1} # via command-line argument
find . -name "output*.txt" | xargs cat ${1} | grep -v ${filter} > temp.txt
I essentially got to this point by building the find ... line on the command line, then pasting it into my script, then adding the cd command to make it easy to reuse this script in a wrapper that will run it on a large set of directories. Anyway ...
The problem is that cd and xargs use the same ${1} variable, which sort of makes sense.
I know that I can drop the ${1} argument from xargs, and I can probably rewrite the find command to not need xargs at all, but my question remains:
Is there a way to "reset" ${1} after I use it for cd so that xargs doesn't pick it up?
I'm not familiar with a version of xargs that uses ${1} as a default replacement string, but the following should work:
find . -name "output*.txt" | xargs -I '{}' cat '{}' | grep -v ${filter} > temp.txt
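As a small illustration of how -I substitution behaves: each whole input line replaces {}, spaces included, and the command runs once per line. Here echo just shows what would be executed, with made-up file names:
$ printf '%s\n' output1.txt 'output 2.txt' | xargs -I '{}' echo cat '{}'
cat output1.txt
cat output 2.txt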
Your use of find + xargs suffers from the separator problem: https://en.wikipedia.org/wiki/Xargs#Separator_problem
Here is a solution that does not have that problem. It uses GNU Parallel:
find . -name "output*.txt" |
parallel cat {} |
grep -v ${filter} > temp.txt
It takes literally 10 seconds to install GNU Parallel:
$ (wget -O - pi.dk/3 || lynx -source pi.dk/3 || curl pi.dk/3/ || \
fetch -o - http://pi.dk/3 ) > install.sh
$ sha1sum install.sh | grep 883c667e01eed62f975ad28b6d50e22a
12345678 883c667e 01eed62f 975ad28b 6d50e22a
$ md5sum install.sh | grep cc21b4c943fd03e93ae1ae49e28573c0
cc21b4c9 43fd03e9 3ae1ae49 e28573c0
$ sha512sum install.sh | grep da012ec113b49a54e705f86d51e784ebced224fdf
79945d9d 250b42a4 2067bb00 99da012e c113b49a 54e705f8 6d51e784 ebced224
fdff3f52 ca588d64 e75f6033 61bd543f d631f592 2f87ceb2 ab034149 6df84a35
$ bash install.sh
Watch the intro videos to learn more: https://www.youtube.com/playlist?list=PL284C9FF2488BC6D1
The xargs utility just manages the number of arguments, and the braces {} act as a placeholder for the arguments or filenames.
I simulated the wrapper you have and had the same issue: the pathname passed as ${1} to my function was picked up by xargs. But this is how the shell works; the shell substitutes/expands all the variables to complete the command before execution.
The utilities aren't coded to take care of this themselves; they just don't do that.
Example:
echo * ;
Here the shell expands * and replaces it with all the filenames in the current directory, passing them to the echo utility as arguments, before the command echo * is executed.
Likewise, in your case the shell expands ${1} to the pathname value you passed before executing the command. That's why you got what you got.
Solution: you could just use the braces (empty, of course) or, better, just drop them. It works fine for me:
find . -iname "*${filename}*" | xargs ls -l | more commands ....;
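For what it's worth, here is a quick way to see where xargs places the arguments it reads; echo stands in for ls -l, and the file names are made up:
$ printf '%s\n' a.txt b.txt | xargs echo ls -l
ls -l a.txt b.txt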
Hope this helps.

how to pipe commands in ubuntu

How do I pipe commands and their results in Ubuntu when writing them in the terminal? I would write the following commands in sequence:
$ ls | grep ab
abc.pdf
cde.pdf
$ cp abc.pdf cde.pdf files/
I would like to pipe the results of the first command into the second command, and write it all on one line. How do I do that?
Something like:
$ cp "ls | grep ab" files/
(the above is a contrived example and can be written as cp *.pdf files/)
Use the following:
cp `ls | grep ab` files/
Well, since the xargs person gave up, I'll offer my xargs solution:
ls | grep ab | xargs echo | while read f; do cp $f files/; done
Of course, this solution suffers from an obvious flaw: files with spaces in them will cause chaos.
An xargs solution without this flaw? Hmm...
ls | grep ab | xargs '-d\n' bash -c 'docp() { cp "$@" files/; }; docp "$@"' bash
Seems a bit clunky, but it works. Unless you have files with newlines in their names, I mean. However, anyone who does that deserves what they get. Even that is solvable:
find . -mindepth 1 -maxdepth 1 -name '*ab*' -print0 | xargs -0 bash -c 'docp() { cp "$@" files/; }; docp "$@"' bash
To use xargs, you need to ensure that the filename arguments are the last arguments passed to the cp command. You can accomplish this with the -t option to cp to specify the target directory:
ls | grep ab | xargs cp -t files/
Of course, even though this is a contrived example, you should not parse the output of ls.
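To see why cp's -t option matters here: xargs appends the file names it reads at the end of the command line, so the target directory has to be given before them. Previewing the constructed command with echo:
$ printf '%s\n' abc.pdf cde.pdf | xargs echo cp -t files/
cp -t files/ abc.pdf cde.pdf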

Escaping piped output between commands

I sometimes use the command below to delete changed files when using hg.
hg status -n | xargs rm
I have come across a problem where, if the output of
hg status -n
contains any file paths with spaces, the files will not be found. Usually I would quote or escape spaces in file names, but I'm not sure how to do this with piped output. Any help would be great, thanks :)
Tell both commands to use NUL as the delimiter:
hg status -n0 | xargs -0 rm
Also be careful: the -n option will print even files Mercurial doesn't know about.
Maybe you want this instead?
hg status -mn0 | xargs -0 rm
Also, don't forget about hg revert or hg purge. Maybe they do what you want, e.g.
hg revert --all --no-backup
or, with the purge extension enabled in your .hgrc:
[extensions]
hgext.purge=
and then:
hg purge
I don't have hg installed, so I will demonstrate with ls:
$ touch 'file A' 'file B'
$ ls -1
file A
file B
$ ls | xargs rm
rm: cannot remove `file': No such file or directory
rm: cannot remove `A': No such file or directory
rm: cannot remove `file': No such file or directory
rm: cannot remove `B': No such file or directory
$ ls | tr '\n' '\0' | xargs -0 rm
$ ls
Let xargs handle that with the -I option:
hg status -n | xargs -I FileName rm FileName
-I increases safety but reduces efficiency, as only one filename at a time will be passed to rm.
An example:
$ printf "%s\n" one "2 two" "three 3 3" | xargs printf "%s\n"
one
2
two
three
3
3
$ printf "%s\n" one "2 two" "three 3 3" | xargs -I X printf "%s\n" X
one
2 two
three 3 3
Besides -0, newer versions of xargs have a -d option, which can help with such things:
<command returning \n-separated paths> | xargs -d \\n rm -v
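For example, assuming GNU xargs (which accepts C-style escapes such as \n for the -d delimiter), printf makes the resulting argument boundaries visible:
$ printf '%s\n' 'file A' 'file B' | xargs -d '\n' printf '<%s>\n'
<file A>
<file B>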

Resources