JMeter throughput values in CSV file

In the Summary Report, the throughput value is shown per second or per minute, e.g. 21.0/min or 15.0/sec, but when I convert the results to a CSV file the throughput is written as a decimal, e.g. 0.28032 or 0.338292. How can I solve this issue?

When running a JMeter test, throughput is measured in requests per second, minute, or hour. The time unit is chosen so that the displayed rate is at least 1.0, which makes the data easier to graph. When throughput is saved to a CSV file, however, it is always expressed in requests per second.
This means, for example, that 30 requests per minute is saved as 0.5 (because 30 / 60 = 0.5).
So, taking the value 0.28032 from your question:
0.28032 = x / 60
0.28032 = y / 3600
x = 16.82
y = 1009.15
This means that your load test performed 0.28032 requests per second, i.e. 16.82 requests per minute, or 1009.15 requests per hour.
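The decimal values in the CSV are therefore not wrong, just expressed per second. A small Python sketch of the conversion, using the 0.28032 figure from the question:

```python
# Convert a per-second throughput value from a JMeter CSV into the
# per-minute and per-hour figures shown by the Summary Report.
def expand_throughput(per_second):
    """Return (per second, per minute, per hour), rounded for display."""
    return (round(per_second, 5),
            round(per_second * 60, 2),
            round(per_second * 3600, 2))

print(expand_throughput(0.28032))  # (0.28032, 16.82, 1009.15)
```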

Related

How do I achieve the expected throughput in JMeter for a given scenario?

I have about 300 users (configured in the Thread Group) who each perform an activity (e.g. run an e-learning course) twice. That means I should expect about 600 iterations, i.e. 300 users performing the activity twice.
My thread group contains the following transaction controllers:
Login
Dashboard
Launch Course
Complete Course
Logout
As I need 600 iterations in 5400 seconds, i.e. 3600 + 900 + 900 seconds (1 hour steady state + 15 min ramp-up + 15 min ramp-down), and the thread group contains 18 sampler requests in total, would I be correct to say I need about 2 RPS?
Total number of iterations * number of requests per iteration = Total number of requests
600 * 18 = 10800
Total number of requests / Total test duration in seconds = Requests per second
10800 / 5400 = 2
Are my calculations correct?
In addition, what is the best approach to achieve the expected throughput?
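The arithmetic in the question can be checked in a couple of lines of Python (all numbers are taken from the question itself):

```python
# 300 users each running the activity twice -> 600 iterations,
# 18 sampler requests per iteration, over a 5400-second test window.
iterations = 300 * 2
requests_per_iteration = 18
duration_seconds = 3600 + 900 + 900  # steady state + ramp-up + ramp-down

total_requests = iterations * requests_per_iteration
rps = total_requests / duration_seconds
print(total_requests, rps)  # 10800 2.0
```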
Your calculation looks more or less correct. If you need to limit your test throughput to 2 RPS, you can do it using the Constant Throughput Timer or the Throughput Shaping Timer.
However, 2 RPS is little more than statistical noise; my expectation is that you need a much higher load to really test your application's performance, e.g.:
Simulate the anticipated number of users for a short period. Don't worry about iterations, just let the test run (e.g. for an hour) with the number of users you expect. This is called load testing.
Do the same, but for a longer period of time (e.g. overnight or over a weekend). This is called soak testing.
Gradually increase the number of users until you see errors or response times start exceeding acceptable thresholds. This is called stress testing.

How is throughput calculated and displayed in seconds, minutes, and hours in JMeter?

I have an observation and want to understand the throughput calculation. Sometimes throughput is displayed in seconds, sometimes in minutes, and sometimes in hours. Can anyone explain exactly how throughput is calculated, and when it is displayed in seconds, minutes, or hours in the JMeter Summary Report?
From JMeter Docs:
Throughput is calculated as requests/unit of time. The time is calculated from the start of the first sample to the end of the last sample. This includes any intervals between samples, as it is supposed to represent the load on the server. The formula is: Throughput = (number of requests) / (total time).
The unit of time varies based on the throughput value.
Examples:
In 10 seconds, 10 requests are sent, so throughput is 10/10 = 1/sec.
In 10 seconds, 1 request is sent, so throughput is 1/10 = 0.1/sec = 6/min (a rate that would show as a small decimal per second is automatically shown in the next higher unit of time).
As you can see, this avoids small values (like 0.1, 0.001, etc.). In such cases a higher unit of time is easier to understand, while all units of time are equally correct. It is a matter of usability.
So,
1/sec = 60/min = 3600/hour = the SAME rate.

JMeter: different number of labels every day

I am executing the same test plan on two consecutive days:
On the first day, the number of labels (column A) is more than 1400.
On the second day, the number of labels is only 968.
First day:
Second day:
I see that the first day has 12 samples, throughput of almost zero, and KB/sec of 0.1.
The second day has better performance. Please help me understand:
What is the difference between Label/Samples/Requests?
Does the number of labels depend on Throughput and KB/sec, i.e. columns K and L?
Label is basically the request name that you are hitting from JMeter; in your case it is the name of the thread group.
Samples is the number of times that particular request was executed.
e.g. if you have a request called login and the number of samples for login is 5, it means the login request was executed 5 times during the test.
The number of samples varies based on test settings such as the number of users, iterations, or the duration of the test.
The number of labels equals the number of samples, and it is related to throughput:
Throughput = number of requests per second or minute, and
KB/sec = (Throughput * Average Bytes) / 1024
So the two are correlated.
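As a rough sketch of that relationship in Python (the throughput and byte figures below are made-up illustrative values, not taken from the question):

```python
# KB/sec derived from throughput (requests/sec) and average response
# size in bytes, per the formula quoted above.
def kb_per_sec(throughput, average_bytes):
    return (throughput * average_bytes) / 1024

# e.g. 1.4 requests/sec with ~7300-byte responses (hypothetical values):
print(round(kb_per_sec(1.4, 7300), 1))  # 10.0
```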
I hope this helps.

Throughput calculation in JMeter

Attached is the Summary Report for my tests.
Please help me understand how the throughput value is calculated by JMeter:
For example, the throughput of the very first line is 53.1/min. How was this figure calculated by JMeter, and with which formula?
I also wanted to know how the throughput values of the subsequent tests are divided into minutes or seconds. For example, the second line has a throughput of 1.6/sec, so how does JMeter choose the unit of time for these throughput values?
I have tried many websites and keep getting the same reply, that throughput is the number of requests per unit of time (seconds, minutes, hours) sent to your server during the test, but that straightforward explanation didn't obviously match the results I see in my report.
Documentation defines Throughput as
requests/unit of time. The time is calculated from the start of the first sample to the end of the last sample. This includes any intervals between samples, as it is supposed to represent the load on the server.
The formula is: Throughput = (number of requests) / (total time).
So in your case you had 1 request, which took 1129ms, so
Throughput = 1 / 1129ms = 0.00088573959/ms
= 0.00088573959 * 1000/sec = 0.88573959/sec
= 0.88573959 * 60/min = 53.1443754/min, rounded to 53.1/min
For a single request, the total time (or elapsed time) is the same as the duration of that one operation. For a request executed multiple times back to back, the total time is approximately average * number of requests, so it reduces to:
Throughput = (number of requests) / (average * number of requests) = 1 / average
For instance if you take the last line in your screenshot (with 21 requests), it has an average of 695, so throughput is:
Throughput = 1 / 695ms = 0.0014388489/ms = 1.4388489/sec, rounded to 1.4/sec
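Both figures from the screenshot can be reproduced with the 1/average shortcut, using the averages quoted above (1129 ms and 695 ms):

```python
# Throughput from the average elapsed time, assuming back-to-back
# requests so that total time = average * number of requests.
def throughput_per_sec(average_ms):
    return 1000.0 / average_ms

print(round(throughput_per_sec(1129) * 60, 1))  # 53.1 (per minute)
print(round(throughput_per_sec(695), 1))        # 1.4 (per second)
```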
In terms of units (sec/min/hour), Summary report does this:
By default it displays throughput in seconds
But if throughput in seconds < 1.0, it will convert it to minutes
If it's still < 1.0, it will convert it to hours
It rounds the value to 1 decimal digit afterwards.
This is why some values are displayed in sec, some in min, and some could be in hours. Some may even show as 0.0, which simply means the throughput was less than 0.05/hour.
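The unit-selection rules above can be sketched as a small Python function (an approximation of the Summary Report's behaviour, not JMeter's actual code):

```python
def format_throughput(per_second):
    # Start in requests/sec; escalate to /min and then /hour while the
    # value is below 1.0; finally round to one decimal place.
    value, unit = per_second, "sec"
    if value < 1.0:
        value, unit = value * 60, "min"
    if value < 1.0:
        value, unit = value * 60, "hour"
    return f"{round(value, 1)}/{unit}"

print(format_throughput(0.885739))   # 53.1/min
print(format_throughput(1.438849))   # 1.4/sec
print(format_throughput(0.000005))   # 0.0/hour
```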
I have been messing with this for a while, and here is what I had to do in order for my numbers to match what JMeter says:
Loop through the lines in the CSV file, gather the LOWEST start time for each of the labels you have, and also grab the HIGHEST (timestamp + elapsed time).
Calculate the difference between those in seconds,
then do number of samples / that difference.
In Excel, the easiest way to do it is to take the CSV file and add a column for timestamp + elapsed.
First sort the block by timestamp, lowest to highest, then find the first instance of each label and grab that time.
Then sort by your new column, highest to lowest, and grab the first time again for each label.
For each label, gather both of these times in a new sheet:
A would be the label
B would be the start time
C would be the end timestamp + elapsed time
D would then be (C-B)/1000 (diff in seconds)
E would be the number of samples for each label
F would be E/D (samples per second)
G would be F*60 (samples per minute)
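The same per-label calculation can be scripted against the raw CSV (JTL) file instead of Excel. This sketch assumes JMeter's default CSV column names (timeStamp, elapsed, label) with timestamps in milliseconds:

```python
import csv
from collections import defaultdict

def throughput_by_label(path):
    # Per label: earliest timeStamp, latest timeStamp + elapsed, and
    # the sample count, mirroring the spreadsheet steps above.
    start, end, count = {}, {}, defaultdict(int)
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            label = row["label"]
            ts, elapsed = int(row["timeStamp"]), int(row["elapsed"])
            start[label] = min(start.get(label, ts), ts)
            end[label] = max(end.get(label, 0), ts + elapsed)
            count[label] += 1
    # The window is in milliseconds; divide by 1000 for samples/second.
    return {label: count[label] / ((end[label] - start[label]) / 1000.0)
            for label in count}
```

Multiply each value by 60 for samples per minute, as in column G above.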

What does this mean in a JMeter load test? 100 in 13.2s = 7.4/s

I have checked with a load of 100 and got the result 100 in 13.2/s = 7.4/s.
So what is the meaning of 100 in 13.2/s = 7.4/s?
It means the number of executed samples (requests) is 100, the test duration is 13.2 seconds, and the throughput is 7.4/s. So your application handled on average 7.4 requests per second during those 13.2 seconds, for a total of 100 requests.
Throughput is calculated as requests/unit of time. The time is calculated from the start of the first sample to the end of the last sample. This includes any intervals between samples, as it is supposed to represent the load on the server.
The formula is: Throughput = (number of requests) / (total time).
In fact, there is a mistake in the question: it should be "100 in 13.2s", not "100 in 13.2/s"!
For further detail, go through the Apache JMeter User Manual: Glossary & Elements of a Test Plan.