I have this two-step bash command:
L=`wc -l testfile | cut -d' ' -f1`
myprogram testfile $L testfile.out
Long story short, myprogram needs the line count as an input.
I want to combine this into one line.
This does not work, because piping into - hands myprogram the previous command's output as a stream on standard input, not as a string argument:
wc -l testfile | cut -d' ' -f1 | myprogram testfile - testfile.out
Is there a way to combine this into one line?
Use command substitution:
myprogram testfile $(wc -l < testfile) testfile.out
                   ^^^^^^^^^^^^^^^^^^^
This way, wc -l < testfile is evaluated first and its output is substituted into the program call, so both commands are combined into one line.
Note that wc -l < file prints just the number, so you don't need cut or anything else to clean up the output.
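As a quick sanity check, here is what the substitution expands to (a minimal sketch; the three-line testfile is just an illustration):
$ printf 'one\ntwo\nthree\n' > testfile
$ echo $(wc -l < testfile)
3
$ myprogram testfile $(wc -l < testfile) testfile.out   # effectively runs: myprogram testfile 3 testfile.out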
Related
I want to apply a command to each line of piped stdin like so:
cat file.txt | grep ... | ./filter | wc -l
the problem is that ./filter accepts only a single line of input and gives a single line of output. I've tried xargs, but it spawns a subshell and I can't capture its output to continue working with the result. Is there an easy way to do that?
If it accepts only a single line, then you need to put it in a loop to process multiple lines:
cat file.txt |
grep ... |
while read line ; do
echo "$line" | ./filter
done |
wc -l
To call a command for each line, you can read each line into a variable and feed that variable to the command's standard input. (Also, let's avoid UUOC.)
grep ... < file.txt |
while IFS= read -r line; do
./filter <<< "$line"
done |
wc -l
In this case, things may get much easier if you write the whole filter in awk instead: it gives you the wc -l count for free (NR), plus record and field splitting, and filtering that goes beyond what grep can do.
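A minimal sketch of that idea, assuming (purely for illustration) that the grep pattern is ERROR and that ./filter's per-line logic can be expressed as an awk condition such as a length check:
awk '/ERROR/ && length($0) > 80 { n++ } END { print n+0 }' file.txt   # match, filter, and count in one pass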
I am having an issue with filtering a log file that is actively being written to, and writing the filtered output to another file (if possible using tee, so I can see it working as it goes).
I can get it to output on stdout, but not write to a file, either using tee or >>.
I can also get it to write to the file, but only if I drop the -f option from tail, which I need.
So, here is an overview of the commands:
1. tail -f without writing to a file: tail -f test.log | sed 's/a/b/' works.
2. tail without -f, writing to a file: tail test.log | sed 's/a/b/' | tee -a a.txt works.
3. tail -f writing to a file: tail -f test.log | sed 's/a/b/' | tee -a a.txt neither outputs on stdout nor writes to the file.
I would like 3. to work.
It's the sed buffering. Use sed -u. From man sed:
-u, --unbuffered
load minimal amounts of data from the input files and flush the
output buffers more often
And here's a test for it (creates files foo and bar):
$ for i in {1..3} ; do echo a $i ; sleep 1; done >> foo &
[1] 12218
$ tail -f foo | sed -u 's/a/b/' | tee -a bar
b 1
b 2
b 3
Be quick or increase the {1..3} to suit your skillz.
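Applied to the pipeline from the question, the only change needed is the -u flag:
tail -f test.log | sed -u 's/a/b/' | tee -a a.txt
Now the substituted lines show up on stdout and in a.txt as they are appended to test.log.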
I'm counting the number of lines in a big file using
wc -l myFile.txt
Result is
110 myFile.txt
But I want only the number
110
How can I do that?
(I want the number of lines as an input argument in a bash script)
There are lots of ways to do this. Here are two:
wc -l myFile.txt | cut -f1 -d' '
wc -l < myFile.txt
Cut is an old Unix tool to
print selected parts of lines from each FILE to standard output.
You can use cat and pipe it into wc -l:
cat myFile.txt | wc -l
Or, if you insist that wc -l be the first command, you can use awk:
wc -l myFile.txt | awk '{print $1}'
You can try
wc -l file | awk '{print $1}'
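Since the count is meant to be used inside a bash script, here is a short sketch of how it might be wired up (the script name and argument handling are only an illustration):
#!/usr/bin/env bash
# hypothetical wrapper script: count lines first, then use the number
file="$1"
lines=$(wc -l < "$file")      # just the number, no filename
echo "Processing $lines lines from $file"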
I was wondering how to output the number of lines in a text file to the screen and then store it in a variable.
I have a file called stats.txt and when I run wc -l stats.txt it outputs 8 stats.txt
I tried doing x = wc -l stats.txt, thinking it would store only the number and that the rest was just for display, but it does not work :(
Thanks for the help
There are two POSIX-standard syntaxes for doing this:
x=`cat stats.txt | wc -l`
or
x=$(cat stats.txt | wc -l)
They both run the command and replace the invocation in the script with its standard output, in this case assigning it to the variable x. However, be aware that both trim trailing newlines (this is actually what you want here, but it can occasionally bite you when you expect a newline to be preserved).
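For instance, with the stats.txt from the question (8 lines), either form assigns just the number:
x=$(cat stats.txt | wc -l)
echo $x    # prints: 8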
Also, the second case can be easily nested (example: $(cat $(ls | head -n 1) | wc -l)). You can also do it with the first case, but it is more complex:
`cat \`ls | head -n 1\` | wc -l`
There are also quotation issues. You can include these expressions inside double quotes, but with the back-ticks you must escape the quotes inside the command, while the parentheses allow you to "start a new quoting" group:
"`echo \"My string\"`"
"$(echo "My string")"
Hope this helps =)
You may try:
x=`cat stats.txt | wc -l`
or (from another.anon.coward's comment):
x=`wc -l < stats.txt`
I have the following three constructs in a bash script:
NUMOFLINES=$(wc -l $JAVA_TAGS_FILE)
echo $NUMOFLINES" lines"
echo $(wc -l $JAVA_TAGS_FILE)" lines"
echo "$(wc -l $JAVA_TAGS_FILE) lines"
And they all produce identical output when the script is run:
121711 /home/slash/.java_base.tag lines
121711 /home/slash/.java_base.tag lines
121711 /home/slash/.java_base.tag lines
I.e. the name of the file is also echoed (which I don't want). Why do these scriptlets fail, and how can I output a clean:
121711 lines
?
An Example Using Your Own Data
You can avoid having your filename embedded in the NUMOFLINES variable by using redirection from JAVA_TAGS_FILE, rather than passing the filename as an argument to wc. For example:
NUMOFLINES=$(wc -l < "$JAVA_TAGS_FILE")
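With that assignment, the echo from the question then prints only the count:
echo "$NUMOFLINES lines"    # prints: 121711 lines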
Explanation: Use Pipes or Redirection to Avoid Filenames in Output
The wc utility will not print the name of the file in its output if input is taken from a pipe or redirection operator. Consider these various examples:
# wc shows filename when the file is an argument
$ wc -l /etc/passwd
41 /etc/passwd
# filename is ignored when piped in on standard input
$ cat /etc/passwd | wc -l
41
# unusual redirection, but wc still ignores the filename
$ < /etc/passwd wc -l
41
# typical redirection, taking standard input from a file
$ wc -l < /etc/passwd
41
As you can see, the only time wc will print the filename is when it's passed as an argument, rather than as data on standard input. In some cases you may want the filename to be printed, so it's useful to understand when it will be displayed.
wc can't get the filename if you don't give it one.
wc -l < "$JAVA_TAGS_FILE"
You can also use awk:
awk 'END {print NR,"lines"}' filename
Or
awk 'END {print NR}' filename
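Either awk form can be captured into a variable just like the wc version, for example:
NUMOFLINES=$(awk 'END {print NR}' "$JAVA_TAGS_FILE")
echo "$NUMOFLINES lines"    # prints: 121711 lines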
(applies on macOS, and probably other Unixes)
Actually, there is a problem with the wc approach: it does not count the last line if it does not end with a newline character.
Use this instead:
nbLines=$(cat -n file.txt | tail -n 1 | cut -f1 | xargs)
or even better (thanks gniourf_gniourf):
nblines=$(grep -c '' file.txt)
Note: The awk approach by chilicuil also works.
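A quick way to see the difference (the printf here just stands in for a file whose last line lacks a trailing newline):
$ printf 'a\nb\nc' | wc -l        # counts newline characters only
2
$ printf 'a\nb\nc' | grep -c ''   # also counts the final, unterminated line
3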
It's very simple:
NUMOFLINES=$(cat $JAVA_TAGS_FILE | wc -l )
or
NUMOFLINES=$(wc -l $JAVA_TAGS_FILE | awk '{print $1}')
I normally use the 'back tick' feature of bash:
export NUM_LINES=`wc -l < filename`
Note the 'tick' is the 'back tick' (`), not the normal single quote.