How to calculate Average time, Standard deviation, Throughput to measure Performance - JMeter

Summary Report
I have set 1500 threads with a ramp-up period of 100 seconds. Could anyone please tell me how I should verify the report generated by JMeter?
Calculation of Average, Standard deviation and throughput.

If you want the formulas - they're all available in the Calculator class.
Average response time: the sum of all Sample Results' elapsed times divided by their number, in other words the arithmetic mean (see the sketch below)
Standard Deviation: the standard statistical measure of how widely the result set's values are spread around that mean
Throughput: the number of requests divided by the total test duration
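As an illustration only, here is a minimal standalone Groovy sketch (with made-up elapsed times and an assumed total test duration) that applies roughly the same math the Calculator class does:
// five hypothetical Sample Result elapsed times, in milliseconds
def elapsed = [120L, 150L, 130L, 900L, 200L]
// assumed total test duration (first sample start to last sample end), in milliseconds
long totalDurationMs = 10000L
// Average: arithmetic mean of the elapsed times (exactly 300 ms here)
double average = elapsed.sum() / (double) elapsed.size()
// Standard Deviation: population standard deviation, like the STDEVP spreadsheet function (~301 ms here)
double stdDev = Math.sqrt(elapsed.collect { (it - average) * (it - average) }.sum() / elapsed.size())
// Throughput: number of requests divided by the total test duration (0.5 requests/second here)
double throughputPerSec = elapsed.size() / (totalDurationMs / 1000.0)
println "Average: ${average} ms, StdDev: ${stdDev} ms, Throughput: ${throughputPerSec}/s"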
It is not clear what you mean by "verify". If you want to fail certain samplers when their response time exceeds certain threshold values, you can use a Duration Assertion or define reasonable timeouts using HTTP Request Defaults.
If you want to act on the calculated values, for example mark the test as failed if the average response time is greater than X milliseconds, the throughput is lower than Y requests for the specified duration, or the standard deviation of the response time is greater than Z milliseconds, unfortunately as of JMeter 5.1.1 this is not possible out of the box. However, you can consider using the Taurus tool as a wrapper for your JMeter test: Taurus is capable of "understanding" JMeter tests, and it lets you add custom pass/fail criteria including, but not limited to, the metrics you listed.

Related

JMeter deviation is high but the report has zero errors

Ramp-up: 400
Threads: 100
Loop count: 10
Deviation is more than the average value. As per my knowledge, the deviation should be less than, or about half of, the average, and the report has 0 errors.
Can anyone tell me what it means if the deviation is higher, and whether developers need to fix this?
And am I giving the ramp-up time correctly? What should the ramp-up period be in general for 100 users? When I give a ramp-up of 100 for the same input, I get timeout errors in my report.
As per JMeter Glossary:
Standard Deviation is a measure of the variability of a data set. This is a standard statistical measure. See, for example: Standard Deviation entry at Wikipedia. JMeter calculates the population standard deviation (e.g. STDEVP function in spreadsheets), not the sample standard deviation (e.g. STDEV).
As per Understanding Your Reports: Part 3 - Key Statistics Performance Testers Need to Understand
Standard Deviations
The standard deviation is the measurement of the density of the cluster of the data around the sought value (mean). Low standard deviation means that points are closer to the mean. High standard deviation means the points are farther away. This parameter can help determine how reliable the data is. If the standard deviation is high, this means that results vary very much, and the analysis should be conducted accordingly.
If your standard deviation is higher than the average response time, it basically means that your response times are spread very widely around the average, usually because some samplers take much longer than others. There isn't necessarily anything to fix there; it may be expected that some samplers last longer than others, for example a "Logout" operation is normally very quick while "search" operations can take longer, so if your user does multiple searches and only one logout, the deviation can end up higher than the average. You can look at the 90%, 95% and 99% lines of the Aggregate Report listener to see what response times the corresponding percentage of users get for each and every action (and overall), compare the values with your NFRs or SLAs, and raise issues if necessary.
Per se, a deviation higher than the average doesn't necessarily mean that there is a performance problem; you need to correlate the other metrics with the business requirements.
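As a quick, made-up illustration of how this happens, three fast "Logout"-style samples and one slow "search"-style sample (Groovy, invented numbers) already push the deviation above the mean:
def times = [100.0, 100.0, 100.0, 2000.0]   // three fast samples and one slow one, in ms (invented values)
double mean = times.sum() / times.size()    // 575 ms
double sigma = Math.sqrt(times.collect { (it - mean) * (it - mean) }.sum() / times.size())
println "mean = ${mean} ms, population std dev = ${Math.round(sigma)} ms"   // ~823 ms, higher than the mean
// nothing failed here; the spread alone makes the deviation exceed the average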

How is the throughput value calculated in the Summary Report when we have more than 1 sample?

I have created a test plan setting no. of threads = 1, ramp-up period = 1 and loop count = 1.
If I want to verify the throughput value of the 2nd label, I am using the formula 2/5 (no. of samples / average time), which gives 0.4, but the value JMeter shows is 4.9/min. And how are the last two rows of the summary report calculated, which include the labels Test (it is my transaction controller) and Total? Please explain with the formula. The image of my summary report is in the given link.
You're using the wrong formula; according to the JMeter Glossary:
Throughput is calculated as requests/unit of time. The time is calculated from the start of the first sample to the end of the last sample. This includes any intervals between samples, as it is supposed to represent the load on the server.
The formula is: Throughput = (number of requests) / (total time).
So you should be dividing the number of requests not by average response time, but by the whole duration of the test.
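To tie this back to the numbers in the question: a reported throughput of 4.9/min for 2 samples simply implies that the span from the start of the first of those samples to the end of the last one was roughly 2 / (4.9/60) ≈ 24.5 seconds; the average response time does not appear in the calculation at all.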
If you want to exclude some Samplers which are "not interesting", use the Filter Results Tool, where you can specify the label(s) of the sampler(s) you would like to get metrics for.
The Filter Results Tool can be installed using the JMeter Plugins Manager.

Throughput or Standard Deviation in JMeter

Once execution is completed in JMeter, should I consider the throughput value or the standard deviation values in the Summary Report for result analysis purposes?
You must consider both of the values in order to analyze the results.
In the Summary Report:
Throughput: the number of requests per unit of time (seconds, minutes, hours) that are sent to your server during the test.
The throughput is the real load processed by your server during a run but it does not tell you anything about the performance of your server during this same run. This is the reason why you need both measures in order to get a real idea about your server’s performance during a run. The response time tells you how fast your server is handling a given load.
Response time: the elapsed time from the moment a given request is sent to the server until the moment the last bit of information has returned to the client.
Average: This is the Average (Arithmetic mean μ = 1/n * Σi=1…n xi) Response time of your total samples.
Min and Max are the minimum and maximum response time.
Now, an important thing to understand is that the mean value can be very misleading, as it does not show you how close (or far) your values are to the average. For this purpose we need the Deviation value, since the Average can be the same for quite different sets of sample response times!
Deviation: The standard deviation (σ) measures the mean distance of the values from their average (μ). It gives you a good idea of the dispersion, or variability, of the measures around their mean value.
The following equation shows how the standard deviation (σ) is calculated:
σ = √( 1/n * Σi=1…n (xi − μ)² )
For details, see here.
So, if the deviation value is low compared to the mean value, it indicates that your measures are not widely dispersed (they are mostly close to the mean value) and that the mean value is meaningful.
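As a small made-up illustration of that point, here is a Groovy sketch of two result sets with the same mean but very different dispersion:
def steady = [500.0, 500.0, 500.0, 500.0]   // every sample takes 500 ms (invented values)
def spiky  = [100.0, 900.0, 100.0, 900.0]   // same 500 ms mean, but half the samples take 900 ms
[steady, spiky].each { times ->
    double mean = times.sum() / times.size()
    double sigma = Math.sqrt(times.collect { (it - mean) * (it - mean) }.sum() / times.size())
    println "mean = ${mean} ms, sigma = ${sigma} ms"   // sigma is 0 for the first set and 400 for the second
}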

Throughput calculation using response time and number of requests

I received a requirement where I need to display the response time, the number of running threads, latency and throughput in one report. I used the below code in a Beanshell PostProcessor to display throughput, response time and the number of threads:
long repons=prev.getTime();
vars.put("responseTime",String.valueOf(recons));
//print("res" +responseTime);
log.info("Response time" + repons);
long thread=prev.getAllThreads();
vars.put("threads", Integer.toString(prev.getAllThreads()));
log.info("Thread number is"+thread);
float throughput=thread/repons;
log.info("Through put"+throughput);
I guess it is wrong. Can anyone help on this?
You have a syntax error in your script: you have repons in the first line and recons in the second; they should be the same.
It is better to use JSR223 Elements and Groovy language for scripting.
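For illustration, the same per-sample logging with the typo fixed might look like this in a JSR223 PostProcessor with Groovy (a sketch only; note it still cannot produce a real throughput figure):
// JSR223 PostProcessor, language: groovy
long elapsed = prev.getTime()              // elapsed time of the previous sampler, in ms
int activeThreads = prev.getAllThreads()   // total number of active threads at this point
vars.put("responseTime", String.valueOf(elapsed))
vars.put("threads", String.valueOf(activeThreads))
log.info("Response time: " + elapsed + " ms, active threads: " + activeThreads)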
And finally, your approach is wrong, according to JMeter glossary:
Throughput is calculated as requests/unit of time. The time is calculated from the start of the first sample to the end of the last sample. This includes any intervals between samples, as it is supposed to represent the load on the server.
The formula is: Throughput = (number of requests) / (total time).
So you need to divide the total number of requests by the total time taken to execute these requests; your "code" will most likely return zero throughput, because it divides the thread count by the elapsed time in milliseconds using integer division.
You can consider the following workarounds:
Use a Backend Listener and a 3rd-party visualisation tool; see the Real-time results article for details.
Run your JMeter test via the Taurus framework, which has an Interactive Reporting feature.

JMeter output result interpretation help required

I would like to understand the JMeter output in depth.
I am confused by the 'throughput rate' concept. Does it mean that the server can only handle 48.1 requests/min at the given load, or does it mean something else? What is the difference between the total throughput rate and the throughput rate shown for individual requests? In my case 8 requests are sent, and each individual request shows a throughput rate of 6.1/min. Please explain.
I need to suggest changes to the server side and explain the JMeter report. Please suggest how I can explain what needs to be done.
The total summary report is as below:
Total Users: 100
Ramp-up time: 1000s
Total Samples: 800
Min: 325
Max: 20353
Std. Dev: 4524.91
Throughput: 48.1/min
Error: 0.38%
Thanks in advance.
As per JMeter Glossary
Throughput is calculated as requests/unit of time. The time is calculated from the start of the first sample to the end of the last sample. This includes any intervals between samples, as it is supposed to represent the load on the server.
The formula is: Throughput = (number of requests) / (total time).
So you are providing a "load" of about 0.8 requests per second, which is quite low.
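That figure is also consistent with the per-request numbers in the question: 8 requests each showing roughly 6.1/min add up to about 48.8/min, close to the 48.1/min Total, because the Total row counts every sample across all labels over the same overall time span (and 48.1/min divided by 60 is the ≈ 0.8 requests per second mentioned above).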
JMeter provides a test element which controls this "Throughput" value so you can choose whether you will be simulating "N" concurrent users or sending "N" requests per second. Take a look at How to use JMeter's Throughput Constant Timer guide for more details on goal-oriented load test scenario implementation with JMeter.
