xargs into different files - shell

I have a bash 'for loop' that does what I want
for i in *.data
do
./prog $i >dir/$i.bck
done
Can I turn this into an xargs construct?
I've tried something like
ls *.data|xargs -n1 -I FILE ./prog FILE >dir/FILE.bck
But I have problems with the FILE on the right side of '>'.
thanks

Give this a try (you can use FILE instead of % if you prefer):
find -maxdepth 1 -name '*.data' -print0 | xargs -0 -n1 -I % sh -c './prog % > dir/%.bck'
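If a filename might contain quotes or other shell metacharacters, substituting it straight into the sh -c script can misfire; a variant that hands the name over as a positional parameter instead is safer (a sketch under the same assumptions as above):
find -maxdepth 1 -name '*.data' -print0 | xargs -0 -n1 sh -c './prog "$1" > "dir/$1.bck"' sh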

GNU Parallel http://www.gnu.org/software/parallel/ is designed for this kind of task:
ls *.data | parallel ./prog {} '>'dir/{}.bck
IMHO this is more readable than the xargs solution provided.
Watch the intro video to learn more: http://www.youtube.com/watch?v=OpaiGYxkSuQ
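If quoting the redirection separately looks fragile, the whole command can also be given to parallel as a single quoted string (a sketch, equivalent behavior assumed):
ls *.data | parallel './prog {} > dir/{}.bck'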

Related

xargs: how can I replace the {} with N inputs?

I would like to do something like the following:
find ./testsrc -type f -print0 | xargs -0 -P4 -n10 -I{} cp --parents {} <dest>/
The cp is just an example of a command that expects something after the input it gets from xargs. I know that in this case I could do | xargs -0 -P4 -n10 cp --parents -t <dest>/, but there are commands that cannot do this.
Here -n conflicts with -I
How can I achieve same effect with -I{}?
Looks like you're using GNU xargs, which gives the warning below (--max-args is the same as -n):
xargs: warning: options --max-args and --replace/-I/-i are mutually exclusive, ignoring previous --max-args value
As indeed -I will only consume a single element from the input, to keep -n10 you'll need a different approach. The command line below works by (ab)using the shell's "$@", letting -n10 pass all these args to an sh that wraps the cp run:
find ./testsrc -type f -print0 | xargs -0 -P4 -n10 sh -c 'cp --parents "$@" <dest>/' --
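A throwaway way to watch the batching in action (the filenames here are made up, and the brace expansion assumes bash): printf emits twelve NUL-terminated names, and each sh invocation receives at most ten of them in "$@":
printf '%s\0' file{01..12} |
xargs -0 -P4 -n10 sh -c 'echo "this batch has $# args:" "$@"' --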
Hope it helps :)
xargs cannot do this directly. GNU Parallel, however, can:
find ./testsrc -type f -print0 |
parallel -0 -P4 -n10 -I{} cp --parents {} <dest>/
(-I{} is redundant, and -P4 is the default if you have 4 CPU threads.)

xargs -I % command -option1 % -option2 % under cygwin

I learned from answers under this question:
Making xargs work in Cygwin
that the xargs -I option does not work properly under Cygwin. There were some workarounds, but unfortunately they do not help in my case.
My question is: how can I achieve the same result as:
..something that produces multiple lines.. | xargs -I % command -option1 % -option2 %
under Cygwin environment?
Edit:
To clarify,
I would like to get some values from stdin and invoke the "command", putting them into two places as its "%" arguments. I would like to invoke my command multiple times on the data produced by the "something".
Example 1: (I haven't programmed in C++ for a long time, so please forgive any mistakes)
find -name *.cpp | cut -d. -f1 | xargs -I % gcc -o %.o -I %.h %.cpp
Example 2:
cat songs_to_process.txt | xargs -I % convert --format=mp3 --source=%.avi --output=%.mp3
You don't need xargs for this job at all -- not in any of your examples.
find . -name '*.cpp' -print0 | while IFS= read -r -d '' filename; do
basename=${filename%.*}
gcc -o "${basename}.o" -I "${basename}.h" "${basename}.cpp"
done
...or:
while IFS= read -r song; do
song=${song%$'\r'} # repair if your input file is in DOS (CRLF) format
convert --format=mp3 --source="$song".avi --output="$song".mp3
done <songs_to_process.txt
See BashFAQ #1 for an introduction to best practices for processing inputs line-by-line in bash.

how to pipe commands in ubuntu

How do I pipe commands and their results in Ubuntu when writing them in the terminal? I would write the following commands in sequence:
$ ls | grep ab
abc.pdf
cde.pdf
$ cp abc.pdf cde.pdf files/
I would like to pipe the results of the first command into the second command, and write it all on one line. How do I do that?
something like
$ cp "ls | grep ab" files/
(the above is a contrived example and can be written as cp *.pdf files/)
Use the following:
cp `ls | grep ab` files/
Well, since the xargs person gave up, I'll offer my xargs solution:
ls | grep ab | xargs echo | while read f; do cp $f files/; done
Of course, this solution suffers from an obvious flaw: files with spaces in them will cause chaos.
An xargs solution without this flaw? Hmm...
ls | grep ab | xargs '-d\n' bash -c 'docp() { cp "$@" files/; }; docp "$@"'
Seems a bit clunky, but it works. Unless you have files with newlines in them, I mean. However, anyone who does that deserves what they get. Even that is solvable:
find . -mindepth 1 -maxdepth 1 -name '*ab*' -print0 | xargs -0 bash -c 'docp() { cp "$@" files/; }; docp "$@"'
To use xargs, you need to ensure that the filename arguments are the last arguments passed to the cp command. You can accomplish this with the -t option to cp to specify the target directory:
ls | grep ab | xargs cp -t files/
Of course, even though this is a contrived example, you should not parse the output of ls.
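If you want to keep the -t trick but avoid parsing ls, the same filter can be expressed with find's NUL-separated output (a sketch; -maxdepth, -print0, and cp -t assume GNU tools):
find . -maxdepth 1 -name '*ab*' -print0 | xargs -0 cp -t files/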

How to execute multiple commands after xargs -0?

find . -name "filename including space" -print0 | xargs -0 ls -aldF > log.txt
find . -name "filename including space" -print0 | xargs -0 rm -rdf
Is it possible to combine these two commands into one so that only 1 find will be done instead of 2?
I know there may be ways to do it with xargs -I, but those may lead to errors when processing filenames that include spaces. Any guidance is much appreciated.
find . -name "filename including space" -print0 |
xargs -0 -I '{}' sh -c 'ls -aldF {} >> log.txt; rm -rdf {}'
Ran across this just now, and we can invoke the shell less often:
find . -name "filename including space" -print0 |
xargs -0 sh -c '
for file; do
ls -aldF "$file" >> log.txt
rm -rdf "$file"
done
' sh
The trailing "sh" becomes $0 in the shell. xargs provides the files (returned from find) as command-line parameters to the shell; we iterate over them with the for loop.
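The $0 convention is easy to check in isolation (a throwaway sketch; the names are made up):
printf '%s\0' a b c | xargs -0 sh -c 'echo "arg0 is $0; files are: $@"' sh
# prints: arg0 is sh; files are: a b c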
If you're just wanting to avoid doing the find multiple times, you could do a tee right after the find, saving the find output to a file, then execute the two commands as:
find . -name "filename including space" -print0 | tee my_teed_file | xargs -0 ls -aldF > log.txt
cat my_teed_file | xargs -0 rm -rdf
Another way to accomplish the same thing (if indeed it's what you're wanting to accomplish) is to store the output of the find in a variable (supposing it's not TB of data). Note that a shell variable cannot hold NUL bytes, so this version falls back to newline delimiters and will break on filenames that contain newlines:
founddata=$(find . -name "filename including space")
echo "$founddata" | xargs -d '\n' ls -aldF > log.txt
echo "$founddata" | xargs -d '\n' rm -rdf
I believe all these answers by now have given the right ways to solve this problem. I tried Jonathan's two solutions and Glenn's approach, all of which worked great on my Mac OS X. mouviciel's method did not work on my OS, maybe for configuration reasons, and I think it's similar to Jonathan's second method (I may be wrong).
As mentioned in the comments to Glenn's method, a little tweak is needed. So here is the command I tried which worked perfectly FYI:
find . -name "filename including space" -print0 |
xargs -0 -I '{}' sh -c 'ls -aldF {} | tee -a log.txt ; rm -rdf {}'
Or better as suggested by Glenn:
find . -name "filename including space" -print0 |
xargs -0 -I '{}' sh -c 'ls -aldF {} >> log.txt ; rm -rdf {}'
As long as you do not have newlines in your filenames, you do not need -print0 for GNU Parallel:
find . -name "My brother's 12\" records" | parallel ls {}\; rm -rdf {} >log.txt
Watch the intro video to learn more: http://www.youtube.com/watch?v=OpaiGYxkSuQ
Just a variation of the xargs approach without that horrible -print0 and xargs -0, this is how I would do it:
ls -1 *.txt | xargs --delimiter "\n" --max-args 1 --replace={} sh -c 'cat {}; echo "\n"'
Footnotes:
Yes, I know newlines can appear in filenames, but who in their right mind would do that?
There are short options for xargs but for the reader's understanding I've used the long ones.
I would use ls -1 when I want non-recursive behavior rather than find -maxdepth 1 -iname "*.txt" which is a bit more verbose.
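For reference, the short-option spelling of the same pipeline; the redundant --max-args is dropped here, since -I already consumes one argument per invocation (GNU xargs warns that --max-args and --replace are mutually exclusive):
ls -1 *.txt | xargs -d '\n' -I {} sh -c 'cat {}; echo'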
You can execute multiple commands after find using for instead of xargs:
IFS=$'\n'
for F in `find . -name "filename including space"`
do
ls -aldF "$F" >> log.txt
rm -rdf "$F"
done
The IFS defines the Internal Field Separator, which defaults to <space><tab><newline>. If your filenames may contain spaces, it is better to redefine it as above.
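Since the change to IFS persists for the rest of the session, it is worth saving and restoring it around the loop (a small sketch of the same loop):
OLDIFS=$IFS
IFS=$'\n'
for F in `find . -name "filename including space"`
do
ls -aldF "$F" >> log.txt
rm -rdf "$F"
done
IFS=$OLDIFS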
I'm late to the party, but there is one more solution that wasn't covered here: user-defined functions. Putting multiple instructions on one line is unwieldy, and can be hard to read/maintain. The for loop above avoids that, but there is the possibility of exceeding the command line length.
Here's another way (untested).
function processFiles {
ls -aldF "$@"
rm -rdf "$@"
}
export -f processFiles
find . -name "filename including space" -print0 \
| xargs -0 bash -c processFiles dummyArg > log.txt
This is pretty straightforward except for the "dummyArg" which gave me plenty of grief. When running bash in this way, the arguments are read into
"$0" "$1" "$2" ....
instead of the expected
"$1" "$2" "$3" ....
Since processFiles() expects the first argument to be "$1", we have to insert a dummy value into "$0".
Footnotes:
I am using some elements of bash syntax (e.g. "export -f"), but I believe this will adapt to other shells.
The first time I tried this, I didn't add a dummy argument. Instead I added "$0" to the argument lines inside my function (e.g. ls -aldF "$0" "$@"). Bad idea.
Aside from stylistic issues, it breaks when the "find" command returns nothing; in that case, $0 is set to "bash". Using the dummy argument instead avoids all of this.
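The grief is easy to reproduce (a throwaway sketch of what happens without the dummy; the names are made up):
printf '%s\0' a b | xargs -0 bash -c 'echo "0=$0 rest=$*"'
# prints: 0=a rest=b -- the first file silently lands in $0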
Another solution:
find . -name "filename including space" -print0 \
| xargs -0 -I FOUND sh -c 'ls -aldF FOUND >> log.txt ; rm -rdf FOUND'

xargs to execute a string - what am I doing wrong?

I'm trying to rename all files in current directory such that upper case name is converted to lower. I'm trying to do it like this:
ls -1|gawk '{print "`mv "$0" "tolower($0)"`"}'|xargs -i -t eval {}
I have two files in the directory, Y and YY
-t added for debugging, and output is:
eval `mv Y y`
xargs: eval: No such file or directory
if I execute the eval on its own, it works and moves Y to y.
I know there are other ways to achieve this, but I'd like to get this working if I can!
Cheers
eval is a shell builtin command, not a standalone executable. Thus, xargs cannot run it directly. You probably want:
ls -1 | gawk '{print "`mv "$0" "tolower($0)"`"}' | xargs -i -t sh -c "{}"
Although you're looking for an xargs solution, the same thing can also be done with tr (assuming sh/bash/ksh syntax):
for i in *; do mv "$i" "$(echo "$i" | tr '[A-Z]' '[a-z]')"; done
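With bash 4 or newer, the tr call can be replaced by the ${i,,} case-conversion expansion (a sketch; it also skips names that are already lowercase, so mv never complains about moving a file onto itself):
for i in *; do [ "$i" = "${i,,}" ] || mv -- "$i" "${i,,}"; done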
If your files are created by creative users, you will see files like:
My brother's 12" records
The solutions so far do not work on files like that. If you have GNU Parallel installed, this will work (even on files with creative names):
ls | parallel 'mv {} "$(echo {} | tr "[:upper:]" "[:lower:]")"'
Watch the intro video to learn more: http://www.youtube.com/watch?v=OpaiGYxkSuQ
You can use eval with xargs as in the examples below.
Note: I only tested this in bash shell
ls -1| gawk '{print "mv "$0" /tmp/"toupper($0)""}'| xargs -I {} sh -c "eval {}"
or
ls -1| gawk '{print "mv "$0" /tmp/"toupper($0)""}'| xargs -I random_var_name sh -c "eval random_var_name"
I generally use this approach when I want to avoid a one-liner for loop.
e.g.
for file in $(find /some/path | grep "pattern"); do somecmd "$file"; done
The same can be written as below:
find /some/path | grep "pattern" | xargs -I {} sh -c "somecmd {}"
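When the pattern is something find itself can match, the grep (and the word-splitting question) disappears entirely (a sketch; -path is assumed to express the same filter):
find /some/path -path '*pattern*' -exec somecmd {} \;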
