Using Apache Bench (ab) to load the URLs from a text file - apachebench

I need to test the performance of APIs with variable parameters (x and y),
e.g. http://myapiurl?x=1&y=2&z=6
http://myapiurl?x=3&y=3&z=6
http://myapiurl?x=5&y=2&z=6
..........
While benchmarking, I want to hit the URLs randomly from a text file:
$ ab -c 100 -n 500 urls.txt

Patch - https://github.com/philipgloyne/apachebench-for-multi-url/blob/master/README.md
gcc -I /usr/include/apr-1.0 -I /usr/include/apache2 ab.c -o ab -lm -lapr-1 -laprutil-1
ab -c 100 -v 4 -n 2000 -L urls.txt > results.txt
Stock ab takes only a single URL; this patched build adds the -L flag, which loads the target URLs from the text file.
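A urls.txt with randomized x/y values can be generated up front; here is a minimal sketch (the endpoint and the 0-9 parameter ranges are illustrative):

```shell
# Generate 100 URLs with random x and y values for the patched ab's -L flag.
awk 'BEGIN {
  srand()
  for (i = 0; i < 100; i++)
    printf "http://myapiurl?x=%d&y=%d&z=6\n", int(rand()*10), int(rand()*10)
}' > urls.txt
head -n 3 urls.txt
```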

Related

How to run compression in gnu parallel?

Hi, I am trying to compress a file with the bgzip command:
bgzip -c 001DD.txt > 001DD.txt.gz
I want to run this command in parallel. I tried:
parallel ::: bgzip -c 001DD.txt > 001DD.txt.gz
but it gives me this error:
parallel: Error: Cannot open input file 'bgzip': No such file or directory
You need to chop the big file into smaller chunks and compress these. It can be done this way:
parallel --pipepart -a 001DD.txt --block -1 -k bgzip > 001DD.txt.gz
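If GNU parallel (or bgzip) is not at hand, the same chunk-and-compress idea can be sketched with split and plain gzip, relying on the fact that concatenated gzip streams form a valid gzip file (file names here mirror the question; the 4-chunk count is arbitrary):

```shell
seq 1 1000 > 001DD.txt                     # sample input stand-in
split -n 4 -d 001DD.txt chunk.             # cut into 4 chunks (GNU split)
for c in chunk.*; do gzip -9 "$c" & done   # compress the chunks concurrently
wait
cat chunk.*.gz > 001DD.txt.gz              # gzip streams concatenate cleanly
rm chunk.*.gz
gunzip -c 001DD.txt.gz | cmp -s - 001DD.txt && echo "round-trip OK"
```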

Issue with downloading multiple files with names in Bash

I'm trying to download multiple files in parallel using xargs. Things work well if I just download each file without giving it a name: echo ${links[@]} | xargs -P 8 -n 1 wget. Is there any way to download with a filename, like wget -O [filename] [URL], but in parallel?
Below is my work. Thank you.
links=(
"https://apod.nasa.gov/apod/image/1901/sombrero_spitzer_3000.jpg"
"https://apod.nasa.gov/apod/image/1901/orionred_WISEantonucci_1824.jpg"
"https://apod.nasa.gov/apod/image/1901/20190102UltimaThule-pr.png"
"https://apod.nasa.gov/apod/image/1901/UT-blink_3d_a.gif"
"https://apod.nasa.gov/apod/image/1901/Jan3yutu2CNSA.jpg"
)
names=(
"file1.jpg"
"file2.jpg"
"file3.jpg"
"file4.jpg"
"file5.jpg"
)
echo ${links[@]} ${names[@]} | xargs -P 8 -n 1 wget
With GNU Parallel you can do:
parallel wget -O {2} {1} ::: "${links[@]}" :::+ "${names[@]}"
If a download fails, GNU Parallel can also retry commands with --retries 3.
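Sticking with xargs, the two arrays can also be zipped into "name URL" pairs and consumed two arguments at a time. A sketch (echo is left in front of wget so it runs without touching the network; remove it to actually download):

```shell
#!/usr/bin/env bash
links=(
  "https://apod.nasa.gov/apod/image/1901/sombrero_spitzer_3000.jpg"
  "https://apod.nasa.gov/apod/image/1901/orionred_WISEantonucci_1824.jpg"
)
names=( "file1.jpg" "file2.jpg" )

# One "name URL" pair per line; xargs hands 2 args to each wget, 8 jobs at once.
paste <(printf '%s\n' "${names[@]}") <(printf '%s\n' "${links[@]}") |
  xargs -P 8 -n 2 echo wget -O
```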

Different results with MACS2 when Peakcalling with .bed or .bam

I got the following problem:
I use MACS2 (2.1.0.20140616) with the following short command line:
macs2 callpeak -t file.bam -f bam -g 650000000 -n Test -B --nomodel -q 0.01
It seems to work as I want, but when I convert the .bam file into .bed via
bedtools bamtobed -i file.bam > file.bed
and use MACS2 on this, I get a lot more peaks. As far as I understand, the .bed file should contain the same information as the .bam file, so that's kinda odd.
Any suggestions what's the problem?
Thanks!

wget to parse a webpage in shell

I am trying to extract URLS from a webpage using wget. I tried this
wget -r -l2 --reject=gif -O out.html www.google.com | sed -n 's/.*href="\([^"]*\).*/\1/p'
It displays FINISHED
Downloaded: 18,472 bytes in 1 files
but not the web links. If I try to do it separately,
wget -r -l2 --reject=gif -O out.html www.google.com
sed -n 's/.*href="\([^"]*\).*/\1/p' < out.html
Output
http://www.google.com/intl/en/options/
/intl/en/policies/terms/
It is not displaying all the links
http://www.google.com
http://maps.google.com
https://play.google.com
http://www.youtube.com
http://news.google.com
https://mail.google.com
https://drive.google.com
http://www.google.com
http://www.google.com
http://www.google.com
https://www.google.com
https://plus.google.com
Moreover, I want to get links from the 2nd level and deeper. Can anyone give a solution for this?
Thanks in advance
The -O file option captures the output of wget and writes it to the specified file, so there is no output going through the pipe to sed.
You can say -O - to direct wget output to standard output.
If you don't want to use grep, you can try
sed -n "/href/ s/.*href=['\"]\([^'\"]*\)['\"].*/\1/gp"
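Putting both fixes together, -O - lets the page flow straight into sed with no intermediate file. The snippet below demonstrates the filter on an inline HTML sample so it runs offline; against a live page you would feed it via wget -q -O - URL | instead of printf:

```shell
# Sample HTML, one link per line (the sed pattern extracts one href per line).
html='<a href="http://maps.google.com">Maps</a>
<a href="/intl/en/policies/terms/">Terms</a>'
printf '%s\n' "$html" |
  sed -n "s/.*href=['\"]\([^'\"]*\)['\"].*/\1/p"
# prints:
# http://maps.google.com
# /intl/en/policies/terms/
```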

Setting a variable in make

This is my makefile
file1:
	uglifyjs myfile1.js -c | gzip -c -9 > myfile1.min.js
file2:
	uglifyjs myfile2.js -c | gzip -c -9 > myfile2.min.js
How can I change my makefile to remove duplicate code:
file1:
	FILE=myfile1
	#How to call build target?
file2:
	FILE=myfile2
build:
	uglifyjs $(FILE).js -c | gzip -c -9 > $(FILE).min.js
I know I can use make build but is there another way to do this without invoking make recursively?
Use automatic variables:
file1 file2:
	uglifyjs my$@.js -c | gzip -c -9 > my$@.min.js
I don't know why you're using targets like file1 when the file you're actually building is myfile1.min.js. That's not a good makefile.
But, that's not the question you asked.
Use a pattern rule to run the command, and then make your targets depend on the files you want:
file1: myfile1.min.js
file2: myfile2.min.js
%.min.js: %.js
	uglifyjs $< -c | gzip -c -9 > $@
The pattern rule tells make how to build a .min.js file from a .js file, and the other rules tell it to build specific files.
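Here is a runnable sketch of that pattern-rule answer, with cat standing in for uglifyjs (an assumption, since uglifyjs may not be installed) and the Makefile written via printf so the recipe line gets the literal tab that make requires:

```shell
#!/bin/sh
dir=$(mktemp -d) && cd "$dir"
echo 'var x = 1;' > myfile1.js
echo 'var y = 2;' > myfile2.js
# \t below becomes the real tab that must start every make recipe line;
# 'cat' is a stand-in for uglifyjs so the sketch runs anywhere.
printf 'file1: myfile1.min.js\nfile2: myfile2.min.js\n\n%%.min.js: %%.js\n\tcat $< | gzip -c -9 > $@\n' > Makefile
make file1 file2
gunzip -c < myfile1.min.js   # round-trips back to the source
```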