launch several "while read" commands with xargs - bash

I have a file that contains a list of commands like this
while read line;do tabix ftp://.../myfile.gz. >> output.vcf; done < input.txt
and I would like to pass this list of 45 commands to xargs.
I'm trying to call:
cat mycommands.txt | xargs -P45 -n10 bash
but I'm not sure whether bash understands > or >> when they are passed as arguments, and it is not working.
Does anyone see something I'm not seeing? A mistake...
Thank you very much in advance!

Did you try using the -I flag?
Like this
cat mycommands.txt | xargs -P45 -n10 -I {} bash -c {}
As it appears in the xargs man page:
Replace occurrences of replace-str in the initial-arguments with names
read from standard input.
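With GNU xargs, -I implies one input line per invocation (which makes -n10 effectively redundant). A minimal sketch of the whole pipeline, assuming mycommands.txt holds one complete command per line:
# Each line becomes the script that bash -c runs, so >> is handled by that
# bash instead of being passed to tabix as a literal argument;
# -P45 keeps up to 45 of these bash processes running in parallel.
xargs -P45 -I {} bash -c '{}' < mycommands.txt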
kind regards

Related

Pass a list of files to sed to delete a line in them all

I am trying to do a one-liner command that would delete the first line from a bunch of files. The list of files will be generated by a grep command.
grep -l 'hsv,vcv,tro,ztk' ${OUTPUT_DIR}/*.csv | tr -s "\n" " " | xargs /usr/bin/sed -i '1d'
The problem is that sed can't see the list of files to act on. I'm not able to work out what is wrong with the command. Can someone please point me to my mistake?
Line numbers in sed are counted across all input files. So the address 1 only matches once per sed invocation.
In your example, only the first file in the list will get edited.
You can complete your task with a loop such as this:
grep -l 'hsv,vcv,tro,ztk' "${OUTPUT_DIR}/"*.csv |
while IFS= read -r file; do
    sed -i '1d' "$file"
done
This might work for you (GNU sed and grep):
grep -l 'hsv,vcv,tro,ztk' ${OUTPUT_DIR}/*.csv | xargs sed -i '1d'
The -l outputs the file names, which are received as arguments by xargs.
The -i edits each file in place, removing its first line.
N.B. The -i option in sed works at a per-file level; to use line numbers for each file within a stream, use the -s option.
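For instance, a small sketch of the stream variant (GNU sed; same pattern and directory as in the question), which prints every matching file with its first line removed instead of editing in place:
# -s restarts line numbering for each file argument, so '1d' applies per file
grep -l 'hsv,vcv,tro,ztk' "${OUTPUT_DIR}"/*.csv | xargs sed -s '1d'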
The only solution that worked for me, apart from the one posted by Dan above, is this:
for k in $(grep -l 'hsv,vcv,tro,ztk' ${OUTPUT_DIR}/*.csv | tr -s "\n" " ")
do
    /usr/bin/sed -i '1d' "${k}"
done

How can I use each line of a file as an input switch in bash?

I have a file that contains a list of filenames on each line.
myfile.txt:
somepath/Documents/a.txt
somepath/Documents/b.txt
somepath/Documents/c.txt
This file can contain any number of lines. What I want to do is then run a command that runs cat with each line as an input, such as:
cat <line 1> <line 2> <line 3> > new_combination_file.txt
I was looking this up and I think I should be using xargs to do this, but when I looked up examples and the man page, it didn't make sense to me. Can someone help?
Use a while read loop to read each line into a variable
while read -r filename
do
    cat "$filename"
done < myfile.txt > new_combination_file.txt
You can also use xargs:
xargs cat < myfile.txt > new_combination_file.txt
However, this won't work if any filenames contain spaces.
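If the filenames may contain spaces, one workaround is to make newline the only delimiter — a sketch, assuming GNU xargs:
# -d '\n' splits the input on newlines only, so spaces inside a filename
# survive; quotes and backslashes in the input are also taken literally.
xargs -d '\n' cat < myfile.txt > new_combination_file.txt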
You can use xargs to do this like so:
xargs -t -a myfile.txt cat >new_combination_file.txt
If you want to use the xargs command, just use:
cat myfile.txt | xargs -I {} cat {}
-I specifies that the command which follows is executed once for each line of the input (here, the output of cat myfile.txt), and
{} is replaced with the value of each line.
If you want to combine the output produced, just use a regular > redirect followed by the filename.
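For instance, the combined one-liner might look like this (same myfile.txt and output name as in the question):
# One cat per input line; the single > collects all of their output.
cat myfile.txt | xargs -I {} cat {} > new_combination_file.txt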
Hope this helps.

XARGS with for loop pr

Hi, I am working in a bash shell with a file of file names that lists multiple files for the same sample on different lines.
file.txt
Filename1_1 SampleName1
Filename1_2 SampleName1
Filename2_1 SampleName2
Filename2_2 SampleName2
I am trying to use xargs with a for loop to pass the filenames into one argument (i.e. print Filename1_1 Filename1_2), which would be the effect of:
cat file.txt | xargs bash -c 'echo ${0} ${2}'
Since it is quite a long file, I cannot use this repeatedly, so I thought a for loop would help, but it isn't producing the output I expected.
Here is what I thought would be simple to do:
for (( i=0, j=2; i<=63; i=i+4, j=j+4 ))
do
    cat file.txt | xargs bash -c 'echo ${i} ${j}'
done
However, running this loops through and prints a bunch of blank lines. Does anyone have an idea how to get this to work the way I want?
I am looking for output like the below, so that each line can be passed to another function:
Filename1_1 Filename1_2
Filename2_1 Filename2_2
Filename3_1 Filename3_2
Filename4_1 Filename4_2
Just use -n2 to set the maximum number of arguments per command line:
<file.txt xargs -n2 bash -c 'echo $1 $2' _
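If the goal is instead to pair the two filenames that belong to the same sample, as in the desired output above, a variant of the same idea is to take four fields per call — a sketch, assuming the two-column layout shown in file.txt:
# Each call gets $1=Filename1_1 $2=SampleName1 $3=Filename1_2 $4=SampleName1,
# so printing $1 and $3 pairs the two filenames of one sample.
<file.txt xargs -n4 bash -c 'echo "$1" "$3"' _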

how to read a value from filename and insert/replace it in the file?

I have to run many python scripts which differ by just one parameter. I name them runv1.py, runv2.py, ..., runv20.py. I have the original script, say runv1.py. Then I make all the copies that I need with
cat runv1.py | tee runv{2..20..1}.py
So I have runv1.py, ..., runv20.py, but the parameter is still v=1 in all of them.
Q: how can I also set the v parameter from the file name, so that e.g. in runv4.py we get v=4? I would like to know if there is a one-line shell command or combination of commands. Thank you!
PS: editing each file directly is not a proper solution when there are too many files.
The below for loop will serve your purpose, I think:
for i in `ls | grep "runv[0-9][0-9]*\.py"`
do
    # keep only the number from the file name
    l=`echo $i | tr -d 'a-z.'`
    # assumes each copy still contains the original parameter line v=1
    sed -i 's/v=1/v='"$l"'/' "runv$l.py"
done
The below command passes the parameter, extracted from the filename itself, to each script:
ls | grep "runv[0-9][0-9]*\.py" | tr -d 'a-z.' | awk '{print "./runv"$0".py "$0}' | xargs -L1 sh
At the end, instead of sh you can use python, bash, or ksh.
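For completeness, the copying and the parameter edit can also be done in one loop — a sketch, assuming runv1.py contains the line v=1 as described above:
# Create runv2.py .. runv20.py from runv1.py, setting v=N in each copy.
for n in {2..20}
do
    sed 's/v=1/v='"$n"'/' runv1.py > "runv${n}.py"
done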

How to apply shell command to each line of a command output?

Suppose I have some output from a command (such as ls -1):
a
b
c
d
e
...
I want to apply a command (say echo) to each one, in turn. E.g.
echo a
echo b
echo c
echo d
echo e
...
What's the easiest way to do that in bash?
It's probably easiest to use xargs. In your case:
ls -1 | xargs -L1 echo
The -L flag ensures the input is read properly. From the man page of xargs:
-L number
Call utility for every number non-empty lines read.
A line ending with a space continues to the next non-empty line. [...]
You can use a basic prepend operation on each line:
ls -1 | while read line ; do echo $line ; done
Or you can pipe the output to sed for more complex operations:
ls -1 | sed 's/^\(.*\)$/echo \1/'
for s in `cmd`; do echo $s; done
If cmd has a large output:
cmd | xargs -L1 echo
You can use a for loop:
for file in * ; do
    echo "$file"
done
Note that if the command in question accepts multiple arguments, then using xargs is almost always more efficient as it only has to spawn the utility in question once instead of multiple times.
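A quick illustration of that difference, using printf to stand in for the command output:
printf '%s\n' a b c | xargs echo       # one echo process, prints "a b c"
printf '%s\n' a b c | xargs -n1 echo   # one echo per item, prints a, b, c on separate lines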
You actually can use sed to do it, provided it is GNU sed.
... | sed 's/match/command \0/e'
How it works:
Substitute match with command match
On substitution execute command
Replace substituted line with command output.
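For example, a small sketch of the same idea applied to the ls -1 case from the question (GNU sed; & refers to the whole matched line):
# Rewrite each line as "echo <line>" and execute it via the e flag;
# the command's output replaces the line in the stream.
ls -1 | sed 's/.*/echo &/e'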
A solution that works with filenames that have spaces in them, is:
ls -1 | xargs -I %s echo %s
The following is equivalent, but has a clearer divide between the precursor and what you actually want to do:
ls -1 | xargs -I %s -- echo %s
Where echo is whatever it is you want to run, and the subsequent %s is the filename.
Thanks to Chris Jester-Young's answer on a duplicate question.
xargs fails with backslashes and quotes. It needs to be something like:
ls -1 |tr \\n \\0 |xargs -0 -iTHIS echo "THIS is a file."
xargs -0 option:
-0, --null
Input items are terminated by a null character instead of by whitespace, and the quotes and backslash are not special (every character is taken literally). Disables the end of file string, which is treated like any other argument. Useful when input items might contain white space, quote marks, or backslashes. The GNU find -print0 option produces input suitable for this mode.
ls -1 terminates the items with newline characters, so tr translates them into null characters.
This approach is about 50 times slower than iterating manually with for ... (see Michael Aaron Safyan's answer) (3.55 s vs. 0.066 s). But for other input commands like locate, find, reading from a file (tr \\n \\0 <file), or similar, you have to work with xargs like this.
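With find, the null-delimited input can come straight from -print0, so the tr step is not needed — a sketch:
# find emits NUL-terminated names, which xargs -0 reads safely even when the
# names contain spaces, quotes, or backslashes.
find . -maxdepth 1 -type f -print0 | xargs -0 -I THIS echo "THIS is a file."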
I like to use gawk for running multiple commands on a list, for instance:
ls -1 | gawk '{system("/path/to/cmd.sh "$1)}'
However, the escaping of the escapable characters can get a little hairy.
Better result for me:
ls -1 | xargs -L1 -d "\n" CMD
