How do I count lines in terminal output - terminal

I need the total number of lines of some terminal output.
For example, cat lista_de_compras.txt prints:
arroz
feijão
leite
I want to get 3

Use grep:
grep -c ^ lista_de_compras.txt
The output will be 3
Or pipe cat into wc:
cat lista_de_compras.txt | wc -l
The output will be 3
Or use wc on its own:
wc -l lista_de_compras.txt (suggested)
The output will be 3

A way to do it is as follows:
cat lista_de_compras.txt | wc -l
See: https://ss64.com/bash/wc.html
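All three approaches can be checked end to end with a throwaway copy of the shopping list (the /tmp path here is just for illustration):

```shell
# Recreate the 3-line example file in a temp location
printf 'arroz\nfeijão\nleite\n' > /tmp/lista_de_compras.txt

grep -c ^ /tmp/lista_de_compras.txt     # counts lines matching start-of-line: 3
cat /tmp/lista_de_compras.txt | wc -l   # counts newline characters: 3
wc -l < /tmp/lista_de_compras.txt       # same, without the extra cat: 3
```

One subtlety: wc -l counts newline characters, so a final line with no trailing newline is not counted, while grep -c ^ still counts it.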

Related

Using grep to count total number of occurrences and output found line

I am using grep inside a Bash loop to count how many of a set of unique values appear in a population file, but I would also like to output each matched line.
Currently I am running the following command inside my loop:
n=$(grep -o -i ${uniqueVal} ${population} | wc -l)
total=$((total + n))
This however only outputs the number of occurrences found it does not output the matched line. Is there a way to count the number of occurrences while outputting only the matches to the screen?
uniqueVal:
192.168.1.1
192.168.1.2
192.168.1.3
192.168.1.4
192.168.1.5
population:
192.168.1.4
192.168.1.1
192.168.1.88
192.168.1.77
192.168.1.72
192.168.1.66
192.168.1.55
192.168.1.32
192.168.1.22
192.168.1.24
192.168.1.98
192.168.1.99
Current output:
Total Unique Found: 2
I want the desired output to display each line in population that was found, followed by the total number found:
Found: 192.168.1.4
Found: 192.168.1.1
Total Unique Found: 2
Assuming uniqueVal and population are files:
join uniqueVal population | sed 's/.*/Found: &/' | tee /dev/stderr | wc -l
Or if they are variables:
join <(echo $uniqueVal) <(echo $population) | sed 's/.*/Found: &/' | tee /dev/stderr | wc -l
Add the -n flag to get line numbers, along with tee /dev/stderr to print the matches while still counting them:
grep -o -i -n ${uniqueVal} ${population} | tee /dev/stderr | wc -l
If you want to print the lines matching ${uniqueVal} inside the ${population} file, along with the number of lines found, you can use awk (double quotes so the shell expands the variable):
awk "/${uniqueVal}/" ${population} | tee /dev/stderr | wc -l
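Pulling the pieces together, the whole loop can print each match as it is found and keep a running total. This is a sketch assuming uniqueVal and population are files, as in the question; the -Fx flags (fixed-string, whole-line match) are an addition so that 192.168.1.1 does not also match 192.168.1.10:

```shell
#!/bin/sh
total=0
while IFS= read -r val; do
    # Print each matching population line, and count the matches
    n=$(grep -Fx -- "$val" population | sed 's/^/Found: /' | tee /dev/stderr | wc -l)
    total=$((total + n))   # shell arithmetic; plain total+=n would append strings
done < uniqueVal
echo "Total Unique Found: $total"
```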

Find unique URLs in a file

Situation
I have many URLs in a file, and I need to find out how many unique URLs exist.
I would like to run either a bash script or a command.
myfile.log
/home/myfiles/www/wp-content/als/xm-sf0ab5df9c1262f2130a9b313192deca4-f0ab5df9c1262f2130a9b313192deca4-c23c5fbca96e8d641d148bac41017635|https://public.rgfl.org/HS/PowerPoint%20Presentations/Health%20and%20Safety%20Law.ppt,18,17
/home/myfiles/www/wp-content/als/xm-s4bf050d47df5bfaf0486a50a8528cb16-4bf050d47df5bfaf0486a50a8528cb16-c23c5fbca96e8d641d148bac41017635|https://public.rgfl.org/HS/PowerPoint%20Presentations/Health%20and%20Safety%20Law.ppt,15,14
/home/myfiles/www/wp-content/als/xm-sad122bf22152ba4823a520cc2fe59f40-ad122bf22152ba4823a520cc2fe59f40-c23c5fbca96e8d641d148bac41017635|https://public.rgfl.org/HS/PowerPoint%20Presentations/Health%20and%20Safety%20Law.ppt,17,16
/home/myfiles/www/wp-content/als/xm-s3c0f031eebceb0fd5c4334ecef15292d-3c0f031eebceb0fd5c4334ecef15292d-c23c5fbca96e8d641d148bac41017635|https://public.rgfl.org/HS/PowerPoint%20Presentations/Health%20and%20Safety%20Law.ppt,12,11
/home/myfiles/www/wp-content/als/xm-sff661e8c3b4f94957926d5434d0ad549-ff661e8c3b4f94957926d5434d0ad549-c23c5fbca96e8d641d148bac41017635|https://quality.gha.org/Portals/2/documents/HEN/Meetings/nursesinstitute/062013/nursesroleineliminatingharm_moddydunning.pptx,17,16
/home/myfiles/www/wp-content/als/xm-s32c41ec2a5440ad220008b9abfe9add2-32c41ec2a5440ad220008b9abfe9add2-c23c5fbca96e8d641d148bac41017635|https://quality.gha.org/Portals/2/documents/HEN/Meetings/nursesinstitute/062013/nursesroleineliminatingharm_moddydunning.pptx,19,18
/home/myfiles/www/wp-content/als/xm-s28787ca2f4372ddb3616d3fd53c161ab-28787ca2f4372ddb3616d3fd53c161ab-c23c5fbca96e8d641d148bac41017635|https://quality.gha.org/Portals/2/documents/HEN/Meetings/nursesinstitute/062013/nursesroleineliminatingharm_moddydunning.pptx,22,21
/home/myfiles/www/wp-content/als/xm-s89a7b68158e38391da9f0de1e636c0d5-89a7b68158e38391da9f0de1e636c0d5-c23c5fbca96e8d641d148bac41017635|https://quality.gha.org/Portals/2/documents/HEN/Meetings/nursesinstitute/062013/nursesroleineliminatingharm_moddydunning.pptx,13,12
/home/myfiles/www/wp-content/als/xm-sc4b14e10f6151995f21334061ff1d139-c4b14e10f6151995f21334061ff1d139-c23c5fbca96e8d641d148bac41017635|https://royalmechanical.files.wordpress.com/2011/06/hy-wire-car-2.pptx,13,12
/home/myfiles/www/wp-content/als/xm-se589d47d163e43fa0c0d68e824e2c286-e589d47d163e43fa0c0d68e824e2c286-c23c5fbca96e8d641d148bac41017635|https://royalmechanical.files.wordpress.com/2011/06/hy-wire-car-2.pptx,19,18
/home/myfiles/www/wp-content/als/xm-s52f897a623c539d09bfb988bfb153888-52f897a623c539d09bfb988bfb153888-c23c5fbca96e8d641d148bac41017635|https://royalmechanical.files.wordpress.com/2011/06/hy-wire-car-2.pptx,14,13
/home/myfiles/www/wp-content/als/xm-sccf27a904c5b88e96a3522b2e1180fed-ccf27a904c5b88e96a3522b2e1180fed-c23c5fbca96e8d641d148bac41017635|https://royalmechanical.files.wordpress.com/2011/06/hy-wire-car-2.pptx,18,17
/home/myfiles/www/wp-content/als/xm-s6874bf9d589708764dab754e5af06ddf-6874bf9d589708764dab754e5af06ddf-c23c5fbca96e8d641d148bac41017635|https://royalmechanical.files.wordpress.com/2011/06/hy-wire-car-2.pptx,17,16
/home/myfiles/www/wp-content/als/xm-s46c55ec8387dbdedd7a83b3ad541cdc1-46c55ec8387dbdedd7a83b3ad541cdc1-c23c5fbca96e8d641d148bac41017635|https://royalmechanical.files.wordpress.com/2011/06/hy-wire-car-2.pptx,19,18
/home/myfiles/www/wp-content/als/xm-s08cfdc15f5935b947bbaa93c7193d496-08cfdc15f5935b947bbaa93c7193d496-c23c5fbca96e8d641d148bac41017635|https://royalmechanical.files.wordpress.com/2011/06/hydro-power-plant.ppt,9,8
/home/myfiles/www/wp-content/als/xm-s86e267bd359c12de262c0279cee0c941-86e267bd359c12de262c0279cee0c941-c23c5fbca96e8d641d148bac41017635|https://royalmechanical.files.wordpress.com/2011/06/hydro-power-plant.ppt,15,14
/home/myfiles/www/wp-content/als/xm-s5aa60354d134b87842918d760ec8bc30-5aa60354d134b87842918d760ec8bc30-c23c5fbca96e8d641d148bac41017635|https://royalmechanical.files.wordpress.com/2011/06/hydro-power-plant.ppt,14,13
Desired Result:
Unique URLs: 4
cut -d "|" -f 2 file | cut -d "," -f 1 | sort -u | wc -l
Output:
4
See: man cut, man sort
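A minimal check of that pipeline on fabricated records in the same path|URL,count,count layout (the paths and URLs here are made up):

```shell
printf '%s\n' \
  '/path/a|https://example.com/one.ppt,18,17' \
  '/path/b|https://example.com/two.ppt,15,14' \
  '/path/c|https://example.com/one.ppt,12,11' > /tmp/sample.log

cut -d "|" -f 2 /tmp/sample.log | cut -d "," -f 1 | sort -u | wc -l   # prints 2
```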
An awk solution would be
awk '{sub(/^[^|]*\|/,"");gsub(/,[^,]*/,"");i+=a[$0]++?0:1}END{print i}' file
4
If you happen to use GNU awk, then the following would also give you the same result:
awk '{i+=a[gensub(/.*(http[^,]*).*/,"\\1",1)]++?0:1}END{print i}' file
4
Or, even shorter, as pointed out in a comment by @Cyrus:
awk -F '[|,]' '{i+=!a[$2]++} END{print i}' file
4
which uses awk's multiple-field-separator functionality and is more idiomatic awk.
Note: see the awk manual for more info.
Parse with sed, and since the file appears to be already sorted with respect to URLs, just run uniq and count:
echo Unique URLs: $(sed 's/^.*|\([^,]*\),.*$/\1/' file | uniq | wc -l)
Use GNU grep to extract URLs:
echo Unique URLs: $(grep -o 'ht[^|,]*' file | uniq | wc -l)
Output (either method):
Unique URLs: 4
tr , '|' < myfile.log | sort -u -t '|' -k 2,2 | wc -l
tr , '|' < myfile.log translates all commas into pipe characters
sort -u -t '|' -k 2,2 sorts unique (-u), pipe delimited (-t '|'), in the second field only (-k 2,2)
wc -l counts the unique lines
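Sketched on the same kind of fabricated records: after tr, both the path and the trailing counts are pipe-delimited, so field 2 is exactly the URL:

```shell
printf '%s\n' \
  '/path/a|https://example.com/one.ppt,18,17' \
  '/path/b|https://example.com/two.ppt,15,14' \
  '/path/c|https://example.com/one.ppt,12,11' > /tmp/sample.log

# commas -> pipes, then keep one line per distinct second field
tr , '|' < /tmp/sample.log | sort -u -t '|' -k 2,2 | wc -l   # prints 2
```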

Parsing functionality in shell script

If I am trying to look up which host bus the hard drive is attached to, I would use
ls -ld /sys/block/sd*/device
it returns
lrwxrwxrwx 1 root root 0 Oct 18 14:52 /sys/block/sda/device -> ../../../1:0:0:0
Now if I want to parse out that "1" at the end of the above string, what would be the quickest way?
Sorry, I am very new to shell scripting and can't yet make full use of this powerful language.
Thanks!
Split on slashes, select the last field, split that on colons, and select the first part:
ls -ld /sys/block/sd*/device | awk -F'/' '{ split( $NF, arr, /:/ ); print arr[1] }'
It yields:
1
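Since the interesting part is pure text parsing, the awk step can be sketched against a captured sample line instead of a live /sys tree:

```shell
# The symlink line from the question, stored in a variable
line='lrwxrwxrwx 1 root root 0 Oct 18 14:52 /sys/block/sda/device -> ../../../1:0:0:0'

# Last /-separated field is "1:0:0:0"; its first :-separated part is the bus number
echo "$line" | awk -F'/' '{ split($NF, arr, /:/); print arr[1] }'   # prints 1
```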
Try doing this:
$ ls -ld /sys/block/sd*/device | grep -oP '\d+(?=:\d+:\d:\d+)'
0
2
3
or
$ printf '%s\n' /sys/block/sd*/device |
xargs readlink -f |
grep -oP '\d+(?=:\d+:\d:\d+)'
and if you want only the first occurrence, add -m 1 to the grep command.
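The lookahead variant can be sketched the same way on sample symlink targets (grep -P requires GNU grep built with PCRE support; the targets below are fabricated stand-ins for readlink output):

```shell
printf '%s\n' '../../../1:0:0:0' '../../../3:0:0:0' |
  grep -oP '\d+(?=:\d+:\d:\d+)'
# prints 1, then 3, one per line
```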

Is it possible to append a stdout to another stdout?

I'm trying to run a command like:
gunzip -dc file.gz | tail -c +5
So this will output the binary file contents minus the first 4 bytes to stdout, and it works. Now I need to append 3 extra bytes to the end of the stream, but only using stdout, never a file.
Imagine the file contains:
1234567890
With the current command, I get:
567890
But I need:
567890000
So... any idea?
Try this:
{ gunzip -dc file.gz | tail -c +5 | tr -d '\n'; echo 000; }
Ok, so based on the answers, the final solution was:
{ gzcat file.gz | tail -c +5; printf '000'; }
I didn't need to, and actually shouldn't, use the tr -d '\n', as it will remove the newlines in the middle of the file.
May something like
$ echo "`gunzip -dc file.gz | tail -c +5`BBB"
(where BBB are your three extra bytes) work for you?
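With the example contents, the grouping trick can be checked in isolation; printf stands in for the gunzip -dc stream so the sketch is self-contained:

```shell
# "1234567890" stands in for the decompressed file contents
{ printf '1234567890' | tail -c +5; printf '000'; }   # prints 567890000
```

Grouping with { ...; } sends both commands' output to the same stdout, so nothing ever touches a file.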

Heads and Tails - Trying to get first line and last ten lines of each file

I've got a directory of output files, and I'd like to display the first line and the last ten lines of each file, in order.
I've got part of the command down:
ls output/*Response | sort -t_ --key=2 -g | xargs tail | less
Which gives me something like this:
==> output/Acdb_18_Response <==
150707,"SOVO","Other","","","","","","160x600",0,0,1432,0,0,1432
167493,"Asper","Other","","","","","","160x600",143200,0,0,1432,0,0
269774,"AIKA","Other","","","","","","160x600",0,1432,0,0,1432,0
342275,"Lorrum","Other","","","","","","160x600",0,0,1432,0,0,1432
347954,"Game","Other","","","","","","160x600",0,1432,0,0,1432,0
418858,"Technologies","Other","","","","","","160x600",0,1432,0,0,1432,0
24576,"Media ","Other","","","","","","300x600",0,0,1432,0,0,1432
23351," Plus","Other","","","","","","425x600",0,4296,0,0,4296,0
#rowcount=79
which is nice but I'd like to include the first line to get the header. I tried tee'ing the output to head but so far I haven't been able to figure out how to arrange the pipes.
Any suggestions?
ls output/*Response | sort -t_ --key=2 -g \
| xargs -I {} sh -c 'head -1 {}; tail {}' | less
You can also try the following:
for f in $(ls output/*Response | sort -t_ --key=2 -g); do head -n 1 "$f"; tail -n 10 "$f"; done | less
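A self-contained sketch of the per-file head/tail combination, using two throwaway files in place of the output/*Response set (filenames fabricated to match the _N_ sort key):

```shell
cd "$(mktemp -d)"
printf 'header1\na\nb\n' > run_1_Response
printf 'header2\nc\nd\n' > run_2_Response

# For each file in numeric order: first line, then up to the last ten lines
ls *Response | sort -t_ --key=2 -g |
  xargs -I {} sh -c 'head -n 1 {}; tail -n 10 {}'
```

With only three lines per file here, tail -n 10 prints each whole file, header included; on real files longer than ten lines only the tail is repeated.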
