How do I get a server's response time? - ruby

How can I get a server's response time with Ruby?
I'm using Net::HTTP.get_response(URI.parse(url)) to get the response code and response time of the URL. My code is:
start_time = Time.now
req = Net::HTTP.get_response(URI.parse(url))
end_time = (Time.now - start_time) * 1000
puts ("%.2f" % end_time + "ms")
It works perfectly, but I'm getting response times that seem too high, e.g. Twitter.com (630.52ms). If I ping twitter.com instead, I get 70–120ms.
Is this the best way to calculate the response time of this server?

What you implemented does not show the server's response time; it shows the sum of:
the time spent in ruby to send the request
the network time of the request
the server's response time
the network time of the response
the time spent in ruby to process the response
If you need to see only the time that the server took to process the request, you need to measure it another way. You can use the HTTP response headers; the Date header in particular can help you with that.
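If you do keep measuring on the client side, one improvement is worth making: Time.now is wall-clock time and can jump (e.g. on NTP adjustments), so a monotonic clock is the safer choice for elapsed time. A minimal sketch of that idea (the method name is mine, not from the question):

```ruby
require 'net/http'
require 'uri'

# Wall time of a full request/response round trip, in milliseconds.
# Process::CLOCK_MONOTONIC cannot jump the way Time.now can, so it is
# the safer clock for measuring elapsed time.
def elapsed_ms(url)
  start = Process.clock_gettime(Process::CLOCK_MONOTONIC)
  Net::HTTP.get_response(URI.parse(url))
  (Process.clock_gettime(Process::CLOCK_MONOTONIC) - start) * 1000
end
```

Note this still measures the whole round trip, network time included; it just does so more robustly.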

As @xlembouras said, Ruby processing time is included in your version of the code.
I would suggest using curl for what you are looking for:
response_time_total = `curl -w \"%{time_total}\" google.com -o /dev/null -s`
You can try it out in terminal:
curl -w "%{time_total}\n" google.com -o /dev/null -s
There are different metrics you can get like time_namelookup, time_connect, time_starttransfer, time_redirect, etc. Example:
response_times = `curl -w \"%{time_connect}:%{time_starttransfer}:%{time_total}\" google.com -o /dev/null -s`
time_connect, time_starttransfer, time_total = response_times.split(':')
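Backticks return the command's output as a String, so each field should be converted before doing arithmetic with it. A small sketch, with a hard-coded sample standing in for the real curl output:

```ruby
# Sample of what curl -w "%{time_connect}:%{time_starttransfer}:%{time_total}"
# prints; in real use this string would come from the backtick call.
response_times = "0.015:0.094:0.127"

time_connect, time_starttransfer, time_total =
  response_times.split(':').map(&:to_f)

# Time spent transferring the response body, in seconds.
transfer_time = time_total - time_starttransfer
puts format("%.3f", transfer_time)
```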
All available metrics and detailed information can be found in the cURL man page.
Refs
How do I measure request and response times at once using cURL?

Related

Jmeter - How to capture curl manpage parameters for http request

I am trying to create a report in JMeter where I fetch a file from a URL, and in the report I need to capture the following information:
DNS resolution (time_namelookup)
Connect (time_connect)
TLS handshake (time_appconnect)
HTTP request start (time_pretransfer)
HTTP response start (time_starttransfer)
HTTP response completed (time_total)
I am able to capture these using the following command from my Mac's terminal.
curl -w "@curl-format.txt" -o /dev/null -s "https:///large5-1663346153.bmp"
Here's the content of the curl-format.txt file:
time_namelookup: %{time_namelookup}s\n
time_connect: %{time_connect}s\n
time_appconnect: %{time_appconnect}s\n
time_pretransfer: %{time_pretransfer}s\n
time_redirect: %{time_redirect}s\n
time_starttransfer: %{time_starttransfer}s\n
----------\n
time_total: %{time_total}s\n
As per the JMeter Glossary, JMeter can report the following metrics:
Connect time
Latency (time to first byte)
Elapsed time (time to last byte)
The only way to get the metrics you're looking for in JMeter is to execute the curl command via the OS Process Sampler or a JSR223 Sampler and extract the values from the response using suitable Post-Processors.
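For the OS Process Sampler route, the format file can also be collapsed into a single -w argument, so there is nothing to deploy next to the test plan. A sketch of one way to do this; the file:// target is a placeholder that keeps it runnable without network access, so swap in the real URL:

```shell
# One-line equivalent of curl-format.txt: colon-separated fields are
# easy to pull apart afterwards with a Regular Expression Extractor.
curl -s -o /dev/null \
  -w "%{time_namelookup}:%{time_connect}:%{time_appconnect}:%{time_pretransfer}:%{time_starttransfer}:%{time_total}\n" \
  "file:///dev/null"
```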

regular expression inside a cURL call

I have a cURL call like this:
curl --silent --max-filesize 500 --write-out "%{http_code}\t%{url_effective}\n" 'http://fmdl.filemaker.com/maint/107-85rel/fmpa_17.0.2.[200-210].dmg' -o /dev/null
This call generates a list of URLs with their HTTP codes (normally 200 or 404), like this:
404 http://fmdl.filemaker.com/maint/107-85rel/fmpa_17.0.2.203.dmg
404 http://fmdl.filemaker.com/maint/107-85rel/fmpa_17.0.2.204.dmg
200 http://fmdl.filemaker.com/maint/107-85rel/fmpa_17.0.2.205.dmg
404 http://fmdl.filemaker.com/maint/107-85rel/fmpa_17.0.2.206.dmg
The only valid URLs are the ones preceded by the 200 HTTP code, so I would like to put a regular expression in the cURL call so that it only downloads the lines that start with 200.
Any ideas on how to do this without using a bash script?
Thank you in advance
You can use the following:
curl --silent -f --max-filesize 500 --write-out "%{http_code}\t%{url_effective}\n" -o '#1.dmg' 'http://fmdl.filemaker.com/maint/107-85rel/fmpa_17.0.2.[200-210].dmg'
This will try to reach every URL and, when the response is neither a 404 nor too large, download it into a file whose name is based on the index in the URL.
The -f flag avoids outputting the content of the response when the HTTP code isn't a success code, while the -o flag specifies an output file, where #1 corresponds to the effective value of your [200-210] range (adding other [] or {} groups would let you refer to other parts of the URL by their index).
Note that during my tests, the --max-filesize 500 flag prevented the download of the only url which didn't end up in a 404, fmpa_17.0.2.205.dmg

Curl taking too long to send https request using command line

I have implemented a shell script which sends an HTTPS request to a proxy with an authorization header, using a GET request.
Here is my command :
curl -s -o /dev/null -w "%{http_code}" -X GET -H "Authorization: 123456:admin05" "https://www.mywebpage/api/request/india/?ID=123456&Number=9456123789&Code=01"
It waits around 12 seconds before sending the request to the proxy, which then comes back with a code like 200, 400, 500, etc.
Is it possible to reduce this time and make the request faster using curl?
Please advise me for such a case.
Thanks.
Use the option -v or --verbose along with --trace-time.
It gives details of the actions being taken, along with timings.
This includes DNS resolution, the SSL handshake, etc. A line starting with '>' means a header/body being sent; '<' means one being received.
Based on the gaps between steps in the sequence, you can decipher whether the server is taking time to respond (the time between request and response), or whether the delay is network latency or bandwidth (response transfer) time.
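A sketch of such a call; the file:// target is a stand-in that keeps it runnable without a network, so substitute the real URL from the question:

```shell
# --trace-time prefixes every verbose (-v) line with a timestamp, so gaps
# between the DNS, connect, TLS-handshake and first "<" (response) lines
# become visible, showing where the 12 seconds go.
curl -v --trace-time -s -o /dev/null "file:///dev/null"
```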

How to verify AB responses?

Is there a way to make sure that AB gets proper responses from server? For example:
To force it to output the response of a single request to STDOUT OR
To ask it to check that some text fragment is included into the response body
I want to make sure that authentication worked properly and that I am measuring the response time of the target page, not the login form.
Currently I just replace ab -n 100 -c 1 -C "$MY_COOKIE" $MY_REQUEST with curl -b "$MY_COOKIE" $MY_REQUEST | lynx -stdin.
If it's not possible, is there an alternative more comprehensive tool that can do that?
You can use the -v option as listed in the manual:
-v verbosity
Set verbosity level - 4 and above prints information on headers, 3 and above prints response codes (404, 200, etc.), 2 and above prints warnings and info.
https://httpd.apache.org/docs/2.4/programs/ab.html
So it would be:
ab -n 100 -c 1 -C "$MY_COOKIE" -v 4 $MY_REQUEST
This will spit out the response headers and HTML content. A verbosity of 3 is enough to check for a redirect header.
I didn't try piping it to Lynx but grep worked fine.
Apache Benchmark is good for a cursory glance at your system but is not very sophisticated. I am currently attempting to tune a web service and am finding that AB does not measure the complete response time when considering the transfer of the body. Also, as you mention, you cannot verify what is returned.
My current recommendation is Apache JMeter. http://jmeter.apache.org/
I am having much better success with it. You may find the Response Assertion useful for your situation. http://jmeter.apache.org/usermanual/component_reference.html#Response_Assertion

Ruby Curb not following redirects

I'm using Curb to get various URLs, and if the response is 200, I get what I need. However, if the response is a redirect, Curb doesn't seem to follow the redirects, even though I ask it to, e.g.:
easy = Curl::Easy.new
easy.follow_location = true
easy.max_redirects = 3
easy.url = "http://stats.berr.gov.uk/ed/vat/VATStatsTables2a2d2007.xls"
easy.perform
=> Curl::Err::GotNothingError: Curl::Err::GotNothingError
from /Users/stuart/.rvm/gems/ruby-2.0.0-p0#datakitten/gems/curb-0.8.4/lib/curl/easy.rb:60:in `perform'
However, if I do curl -L http://stats.berr.gov.uk/ed/vat/VATStatsTables2a2d2007.xls on the command line, I get the expected response. What am I doing wrong?
It sounds as if this server returns an empty reply [1] if you do not provide a user agent.
To solve your problem, just set one:
...
easy.useragent = "curb"
easy.perform
[1]: curl -A '' -L http://stats.berr.gov.uk/... gives (52) Empty reply from server.
