Large Waiting time for an HTTP request - ajax

I'm working on developing a web site using CakePHP. I'm analyzing the website now using Firebug + YSlow and the Google Chrome developer tools. In an Ajax request I get a large waiting time of about 6 s, while the receiving time is very small (66 ms), which causes great latency in the request. Does anybody know why the waiting time is so large?

Waiting time - from the time of the request to the time the first byte is received, which involves a round trip. There can be latency if your server is far away from your machine. Usually a fresh request requires 3 round trips: one for the DNS lookup, one for establishing the TCP connection, and one for the request/response pair.
Receiving time - it will be small if only a small amount of data is being downloaded from the server to the client.
For further reference: http://www.webperformancematters.com/journal/2007/7/24/latency-bandwidth-and-response-times.html
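Those three round trips can be timed individually. Below is a minimal Python sketch (not CakePHP-specific) that spins up a throwaway local HTTP server with an artificial 200 ms delay standing in for slow server-side work, then measures the DNS lookup, connecting, and waiting phases separately; the same timing code works against any remote host:

```python
import socket
import threading
import time
from http.server import BaseHTTPRequestHandler, HTTPServer

class SlowHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        time.sleep(0.2)                      # simulated server-side processing time
        self.send_response(200)
        self.end_headers()
        self.wfile.write(b"ok")
    def log_message(self, *args):            # silence per-request logging
        pass

# Throwaway local server so the sketch runs without network access.
server = HTTPServer(("127.0.0.1", 0), SlowHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()
host, port = server.server_address

t0 = time.perf_counter()
ip = socket.gethostbyname(host)              # round trip 1: DNS lookup
t1 = time.perf_counter()
sock = socket.create_connection((ip, port))  # round trip 2: TCP handshake
t2 = time.perf_counter()
sock.sendall(b"GET / HTTP/1.1\r\nHost: %b\r\nConnection: close\r\n\r\n" % host.encode())
sock.recv(1)                                 # blocks until the first byte: "waiting" ends here
t3 = time.perf_counter()
sock.close()
server.shutdown()

print(f"DNS lookup : {(t1 - t0) * 1000:.1f} ms")
print(f"Connecting : {(t2 - t1) * 1000:.1f} ms")
print(f"Waiting    : {(t3 - t2) * 1000:.1f} ms")  # dominated by the 200 ms server delay
```

The "waiting" number is where the original poster's 6 s is spent: it includes the request's trip to the server, all server-side work (for example a slow SQL query), and the first byte's trip back.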

My guess is that you might be performing a SQL query as part of the resource that you are calling via Ajax. If this is the case, you may need to tune your query or indexes to improve the speed of the query. Can you post some code so we may review?

Related

JMeter Sampler Result: Understanding Load time, Connect time and Latency

First off, I'm new to JMeter and wanted to clear some doubts regarding the relationship between Load time, Connect time, and Latency.
I found some resources that explain the relationship between these metrics:
Latency time – Connect time = Server Processing Time
Elapsed time – Latency time = Download Time
And then another resource says this:
Response Time = Latency + Processing Time
Given below is one of the sampler results I got. If you take this into consideration, can we really comment on how long it took for the server to process the request?
NOTE: In this scenario, my plan is to analyze how much of a load the server had to withstand. I don't really care about the delay of connection establishing and passing around of data packets.
Basically, I want to know the connection between the 3 aforementioned metrics: Load time, Connect time, and Latency. Any help is greatly appreciated. Thanks in advance :)
You cannot say "how long it took for the server to process the request" by looking at JMeter results because:
Latency is time to first byte
Elapsed time is time to last byte
The request lifecycle looks like:
JMeter establishes the connection (connect time)
JMeter sends request body to the server (unknown)
Server processes the request (unknown)
Server sends the response to JMeter (unknown)
JMeter receives the first byte of the response (Latency)
JMeter receives the last byte of the response (Elapsed time)
So you cannot determine the server processing time, even with millisecond precision, because JMeter can only get high-level network metrics. If you want to enrich your report with server processing time, you need to use an APM or a profiler tool, or at least something like the JMeter PerfMon Plugin, to get this kind of information directly from the application under test.
This documentation explains the metrics:
https://jmeter.apache.org/usermanual/glossary.html
Latency:
JMeter measures the latency from just before sending the request to just after the first response has been received. Thus the time includes all the processing needed to assemble the request as well as assembling the first part of the response, which in general will be longer than one byte. Protocol analysers (such as Wireshark) measure the time when bytes are actually sent/received over the interface. The JMeter time should be closer to that which is experienced by a browser or other application client.
Connect Time:
JMeter measures the time it took to establish the connection, including SSL handshake. Note that connect time is not automatically subtracted from latency. In case of connection error, the metric will be equal to the time it took to face the error, for example in case of Timeout, it should be equal to connection timeout.
Load time or Elapsed time:
JMeter measures the elapsed time from just before sending the request to just after the last response has been received. JMeter does not include the time needed to render the response, nor does JMeter process any client code, for example Javascript.
In layman's terms I would describe these metrics as follows:
Load time: total time taken by the request, from the first request packet to the final response packet.
Connect time: time taken to establish the connection to the server.
Latency: time taken until the first byte of the response arrives (if the response is small, this can be the same as the load time).
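To make those definitions concrete, here is the arithmetic on a hypothetical sampler result (all numbers invented). Note that subtracting Connect time from Latency still leaves the request upload and one network trip bundled in with the server's work, so the result is only an upper bound on server processing time:

```python
# Hypothetical values from one JMeter sampler result, in milliseconds.
elapsed = 1230   # Load time: just before sending -> last response byte
latency = 950    # just before sending -> first response byte
connect = 180    # TCP + SSL handshake (already included in latency)

download_time = elapsed - latency   # time spent receiving the response body
# Latency minus Connect still bundles request upload, server processing,
# and one network trip back, so it only bounds the server time from above.
server_time_upper_bound = latency - connect

print(f"Download time: {download_time} ms")
print(f"Server processing (at most): {server_time_upper_bound} ms")
```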

Why is the JMeter Result different to the User Experience Result?

We are currently conducting performance tests on both web apps that we have; one is running within a private network and the other is accessible to all. For both apps, a single page load of the landing or initial page only takes 2-3 seconds from a user's point of view, but when we use blaze and JMeter, the results are 15-20 seconds. Am I missing something? The 15-20 second result comes from the Load time/Sample time in JMeter, and from the Elapsed column if extracted to .csv. Please help, as I'm stuck.
We have tried conducting tests on multiple PCs within the office premises, along with a PC remotely accessed on another site, and we still get the same results. The number of threads and the ramp-up period are both set to 1 to imitate a single user only.
Where a delta exists, it is certain to mean that two different items are being timed. It would help to understand, on your front end, whether you are timing to a standard metric, such as W3C domComplete, Time to Interactive, or First Contentful Paint, or to some other point, and then compare where this comes into play in the drill-down on the Performance tab of Chrome. Odds are that a lot is occurring that is not visible but is being captured by JMeter.
You might also look for other threads on here on how JMeter operates as compared to a "real browser". There are differences which could come into play affecting your page comparisons, particularly if you have dozens or hundreds of elements that need to be downloaded to complete your page. Also, pay attention to third-party components where you do not have permission to test their servers.
I can think of 2 possible causes:
Clear your browser history, especially the browser cache. It might be the case that you're getting HTTP status 304 for all requests in the browser because responses are being returned from the browser cache and no actual requests are being made, while JMeter always uses a "clean" session.
Pay attention to Connect Time and Latency metrics as it might be the case the server response time is low but the time for network packets to travel back and forth is very high.
Connect Time. JMeter measures the time it took to establish the connection, including SSL handshake. Note that connect time is not automatically subtracted from latency. In case of connection error, the metric will be equal to the time it took to face the error, for example in case of Timeout, it should be equal to connection timeout.
Latency. JMeter measures the latency from just before sending the request to just after the first response has been received. Thus the time includes all the processing needed to assemble the request as well as assembling the first part of the response, which in general will be longer than one byte. Protocol analysers (such as Wireshark) measure the time when bytes are actually sent/received over the interface. The JMeter time should be closer to that which is experienced by a browser or other application client.
So basically: Elapsed time = Latency + response download time, where Latency already includes the Connect time and the server processing time (remember that connect time is not automatically subtracted from latency).
In general given:
the same machine
clean browser session
and JMeter configured to behave like a real browser
you should get similar or equal timings for the same page

JMeter server response time calculation

I'm basically trying to calculate, in JMeter 5.1, the server processing time for an HTTP request. I've read the JMeter documentation (especially https://jmeter.apache.org/usermanual/glossary.html) to learn more about Elapsed time, Latency and Connect time.
Let's say I have a test plan with one thread which successively makes 3 identical HTTP requests to one server. The thing is that for the first request the Connect time is (obviously) not equal to 0, but it is for the second and third requests.
However, from my understanding, Latency includes Connect time; hence for my first request the Latency is always (much) larger than for the second and third requests, and it does not reflect the time spent waiting (the server processing time) for this first request.
Can I assume that, if I subtract the Connect time from the Latency (Latency - Connect time), it gives me a meaningful value for the server processing time (plus the content download time, maybe)?
See the W3C time-taken HTTP request log field. Just turn this on and post-process the HTTP request logs at the end of your test. You will have the complete processing time for each individual request.
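As a sketch of that post-processing step, here is a minimal Python parser for a W3C extended log fragment with a time-taken field. The log lines and field list below are invented for illustration (on IIS, time-taken is reported in milliseconds):

```python
# Parse a W3C extended log fragment and post-process the time-taken field.
# The log content here is made up for illustration.
log = """\
#Fields: date time cs-method cs-uri-stem sc-status time-taken
2023-05-01 10:00:01 GET /api/items 200 42
2023-05-01 10:00:02 GET /api/items 200 38
2023-05-01 10:00:03 GET /api/items 200 51
"""

fields, rows = [], []
for line in log.splitlines():
    if line.startswith("#Fields:"):
        fields = line.split()[1:]          # column names follow the directive
    elif line and not line.startswith("#"):
        rows.append(dict(zip(fields, line.split())))

times = [int(r["time-taken"]) for r in rows]
print(f"{len(times)} requests, avg time-taken {sum(times) / len(times):.1f} ms")
```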

Why JMeter response time is not accurate when compared with the server time verified from logs

My query is: when I finish my performance testing and get the result file, I can see that there is a difference between the JMeter response time and the server response time.
I verified the server response time by checking the server logs. I am not writing any extra elements to the result file either.
Can I get an explanation for why the response time shown by JMeter is always higher than the actual response time?
Have you thought about the network? According to JMeter glossary:
Elapsed time. JMeter measures the elapsed time from just before sending the request to just after the last response has been received. JMeter does not include the time needed to render the response, nor does JMeter process any client code, for example Javascript.
Latency. JMeter measures the latency from just before sending the request to just after the first response has been received. Thus the time includes all the processing needed to assemble the request as well as assembling the first part of the response, which in general will be longer than one byte. Protocol analysers (such as Wireshark) measure the time when bytes are actually sent/received over the interface. The JMeter time should be closer to that which is experienced by a browser or other application client.
Connect Time. JMeter measures the time it took to establish the connection, including SSL handshake. Note that connect time is not automatically subtracted from latency. In case of connection error, the metric will be equal to the time it took to face the error, for example in case of Timeout, it should be equal to connection timeout.
So my expectation is that the server measures only the time required to process the request and respond, while JMeter measures the whole end-to-end transaction, to wit:
Establishing the connection (in particular, the initial SSL handshake can be very long)
Sending packets to the server
here the server starts measuring
Processing the request by the server
here the server stops measuring
Waiting for the first packet to come (Latency)
Waiting for the last packet to come (Elapsed time)
The time needed for the request to travel back and forth can really matter; for example, if you have a faulty router or a misconfigured load balancer, even if the actual server response time is low, the user experience won't be smooth.
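A toy breakdown (all numbers hypothetical) makes it obvious why the JMeter figure always exceeds what the server log reports:

```python
# Hypothetical millisecond breakdown of a single request.
connect_and_ssl = 120   # connection establishment, incl. SSL handshake
send_request    = 15    # request packets travelling to the server
server_process  = 200   # the only part the server log ever sees
first_byte_back = 30    # first response packet travelling back
rest_of_body    = 85    # remaining response packets

server_reported = server_process
jmeter_elapsed = (connect_and_ssl + send_request + server_process
                  + first_byte_back + rest_of_body)

print(f"server log : {server_reported} ms")
print(f"JMeter     : {jmeter_elapsed} ms")
print(f"difference : {jmeter_elapsed - server_reported} ms of network/connection overhead")
```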

How can I decrease "Connecting" and "Waiting" times from AJAX requests to the Server?

My actual script execution time is less than a microsecond, and yet the total time the response takes is about 250 ms (1000 times more) on a typical AJAX call. Even in environments where I have a reliable T1 connection, the responses still take 50-100 ms.
Background info:
Calls are being made via POST/GET through AJAX (jQuery).
The backend is PHP/MySQL on the Joyent servers.
The information shown below comes from Firebug's Net tab:
DNS Lookup = 0
Connecting = 46ms
Sending = 0ms
Waiting = 172ms
Receiving = 0ms
You need to move closer to the servers. :) Sounds like the speed of light is your bottleneck.
Have a look at a traceroute of your network packets to the server.
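One quick way to estimate that network round trip from code is to time a bare TCP connect, since the handshake costs roughly one RTT. The sketch below connects to a throwaway local listener so it runs anywhere; point HOST/PORT at your real API host instead to see how distance inflates the number:

```python
import socket
import statistics
import time

# A bare TCP connect costs roughly one network round trip (SYN/SYN-ACK),
# so timing it approximates the RTT every uncached AJAX call pays before
# the server does any work. A throwaway local listener is used here so
# the sketch is self-contained; substitute your real host and port.
listener = socket.socket()
listener.bind(("127.0.0.1", 0))
listener.listen(5)
HOST, PORT = listener.getsockname()

samples = []
for _ in range(5):
    t0 = time.perf_counter()
    socket.create_connection((HOST, PORT), timeout=5).close()
    samples.append((time.perf_counter() - t0) * 1000)
listener.close()

print(f"median connect RTT: {statistics.median(samples):.2f} ms")
```

Against localhost this prints a fraction of a millisecond; against a server on another continent the same code typically reports 100 ms or more, which is the floor under your "Connecting" and "Waiting" numbers.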
