How to get the response time of web pages in JMeter?

How do I generate a CSV file and load it using the Response Time Graph listener?
Can anyone explain in detail how to find the response time in JMeter?

If you run JMeter in command-line non-GUI mode as follows:
jmeter -n -t /path/to/your/test_plan.jmx -l /path/to/results_file.jtl
your results_file.jtl content will look like:
1409124780902,182,Logon,200,OK,Thread Group 1-1,text,true,214,0
1409124781219,153,Logout,200,OK,Thread Group 1-1,text,true,110,0
where the second column is the page response time in milliseconds.
Other values are:
"1409124780902" - current time stamp in ms
"182" - page response time
"Logon" - sampler name
"200" - Response Code
"OK" - Response Message
"Thread Group 1-1" - Parent Thread Group name, thread number and iteration.
"text" - response data type
"214" - response data size in bytes
"0" - latency
Once your test run is done, you can open the JMeter GUI and load this results_file.jtl into the listener of your choice.
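Since the .jtl above is plain CSV, you can also compute basic statistics without opening the GUI at all. A minimal sketch in Python, assuming the headerless comma-separated format shown above (the file name and column position are taken from this example):
import csv

# Headerless CSV .jtl: column 0 is the timestamp,
# column 1 is the elapsed (response) time in milliseconds.
elapsed = []
with open("results_file.jtl", newline="") as f:
    for row in csv.reader(f):
        elapsed.append(int(row[1]))

print("samples:", len(elapsed))
print("average response time (ms):", sum(elapsed) / len(elapsed))
print("min/max (ms):", min(elapsed), max(elapsed))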
You might also be interested in the JMeter Plugins Extras Set, which can generate nice-looking, easily understandable response-time graphs, namely:
Response Times vs Threads
Response Times Distribution
Response Times Percentiles

You can get it by adding Reporters (i.e. Listeners).
Please keep in mind that Reporters are CPU- and memory-intensive components and thus should not be used during an actual load test.
For a sample run you can use them; for a load test run you can still get the response time, average, throughput etc. by saving the output to a .jtl file in JMeter.
For a normal/sample run:
the Aggregate Report gives the average response time, min, max, median, etc.;
the Summary Report gives the same with less detail.
During an actual run, save the output of these reporters to a .jtl file; after the test, the results can be analyzed from the .jtl files.
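What ends up in the .jtl file is controlled by the jmeter.save.saveservice.* properties in jmeter.properties (or user.properties). A short illustration with a few of the standard properties, shown here with values matching the usual defaults:
# write CSV rather than XML results
jmeter.save.saveservice.output_format=csv
# per-sample fields to record
jmeter.save.saveservice.response_code=true
jmeter.save.saveservice.latency=true
# full response bodies are expensive; keep them off during load tests
jmeter.save.saveservice.response_data=false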

Related

Can someone share the reason why the results of Locust testing differ from JMeter testing?

I have run a JMeter test with 400 concurrent users and a loop count of 125 for the following GET request.
https://example.com/rest_api?from_pt={from_point}
I have run the same request in Locust with 500 concurrent users and a spawn rate of 100 per second, and got the following charts.
The Locust file contains this code:
import csv

from locust import HttpUser, task


class WebsiteUser(HttpUser):
    def on_start(self):
        pass

    @task(1)
    def request_testing(self):
        # Note: the CSV file is re-read on every task iteration.
        with open("test.csv", "rt") as f:
            reader = csv.DictReader(f)
            for row in reader:
                from_point = row.get("from_point")
                to_point = row.get("to_point")
                headers = {"content-type": "application/json", "q": "{{api_key_value}}"}
                self.client.get("rest_api?from_pt=%s" % from_point, headers=headers)
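For context, a headless run with the settings described above would look roughly like this with a Locust 1.x+ CLI (the file name and host are assumptions):
locust -f locustfile.py --host https://example.com --headless -u 500 -r 100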
I need to find where I went wrong in the Locust testing.
I fail to see any JMeter test results; you're looking at PerfMon metrics (CPU, RAM usage, etc.). Moreover, it's not clear how you configured and ran the JMeter test (Thread Group settings, etc.).
Generate the HTML Reporting Dashboard from the JMeter .jtl results file and compare it with the Locust results.
In general, comparing the output of two load testing tools doesn't make a lot of sense; if you're running the same test (in terms of the number of concurrent users, sending the same request payload at the same rate, etc.), you should be getting the same results.
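For reference, the dashboard can be generated from an existing results file without re-running the test (JMeter 3.0 or later); the paths are placeholders:
jmeter -g /path/to/results_file.jtl -o /path/to/dashboard_folder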

JMeter 5.1.1 - performance metrics appear different during execution in the non-GUI console and later in a GUI Listener for the same test

Look at the results in Non-GUI Console
Look at the results in GUI Listener
It's very strange; I see different results for the same test.
The average response time on the non-GUI console displays as 368 ms, whereas the Listener shows 578 ms.
Likewise, the maximum response time on the non-GUI console displays as 4524 ms, whereas the Listener shows 9999 ms.
This appears to be happening on JMeter 5.1.1. Can someone help me out?
In the Summariser:
summary = 9377
In the Summary Report:
TOTAL = 11941
My expectation is that the inconsistency is caused by the ~2500 extra sample results in the .jtl results file, i.e. you're appending the results of the current test run to the results of a previous test run. JMeter's Summariser considers only the current session, while the Listener, when you load the .jtl file into it, calculates the average from all entries in the file.
Consider passing the -f command-line argument to your JMeter startup command, like:
jmeter -f -n -t test.jmx -l result.jtl
This way you should get "clean" results and the Summariser output will be in line with the Listener output.
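The same behaviour can also be made permanent in jmeter.properties instead of passing -f on every run; a sketch, assuming a reasonably recent JMeter version:
# what to do if the results file already exists: NONE, ASK, APPEND or DELETE
resultcollector.action_if_file_exists=DELETE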
Thanks @Dmitri for pointing out the deviation in sample size between the Summariser and the Listener, which was skewing the other performance metrics.
It all worked once I edited the corresponding configuration in the jmeter.properties file.
Thanks

JMeter test works differently from CLI than GUI - why?

I'm creating a small test using JMeter. So far I have one Thread Group that executes an HTTP request, waits for 10 seconds, then executes another HTTP request and checks what was returned. If I start 100 such threads with a 1-second ramp-up period from the JMeter GUI, it works fine: I get the expected values and the whole test finishes in 22 seconds. However, when I start the very same .jmx file from the command line, the test runs for more than 120 seconds and some threads (at the last run, 36 out of 100) don't get the expected value. This might indicate a bug in the system I'm testing, but I don't understand why the test takes so long from the CLI and why I get errors from the CLI. What is the difference between running the test from the GUI and from the CLI? Does the CLI run the tests "more in parallel"? By the way, this is the command line I'm using:
/home/nar/apache-jmeter-3.3/bin/jmeter -n -t test_transactions.jmx -l test_transactions.out
I'm afraid I cannot share the test plan, but I can share the "outline":
+ Thread Group
+ CSV Data Set Config
+ HTTP Request
| + JSON Extractor
+ Constant timer
+ HTTP Request
| + JSON Extractor
| + Response Assertion
+ View Results Tree
+ Save Responses to a file
+ View Results in Table
+ Summary Report
The Constant Timer waits for 10 seconds. The first HTTP request sends in some data and initiates a computation; the second checks the result.
I think you should disable the following listeners in a non-GUI test:
View Results Tree
Save Responses to a file
View Results in Table
Summary Report
After disabling them you will still have results via -l test_transactions.out, which you can later view in GUI mode using the Browse button in your Listener.
In non-GUI mode you can also generate a dashboard report if you want by adding -e -o /path/dashboardfolder
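Put together, a non-GUI run that writes the results file and generates the dashboard in one go might look like this (the dashboard path is a placeholder):
/home/nar/apache-jmeter-3.3/bin/jmeter -n -t test_transactions.jmx -l test_transactions.out -e -o /path/dashboardfolder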
It actually does indicate a bug in the system under test. The reason is that you must run JMeter in non-GUI mode, as the GUI creates huge overhead in terms of resource consumption, especially when you're using Listeners, and especially if one of them is View Results Tree.
So my expectation is that in non-GUI mode you're simply creating a higher load, which your application cannot handle. You can check this using, for example, the Active Threads Over Time and Transactions Per Second listeners.

How to configure the JMeter Report Dashboard average response time from milliseconds to seconds

In the JMeter manual, http://jmeter.apache.org/usermanual/generating-dashboard.html, the transaction response time example is shown in seconds, not milliseconds, but in my test result the report is in milliseconds, not seconds. How can I change milliseconds to seconds?
The report is in milliseconds, not seconds, and it's not configurable.
So check that you are not receiving error pages with a 200 response code, and add assertions:
http://www.ubik-ingenierie.com/blog/best-practice-using-jmeter-assertions/
Also check that you are following the prerequisites for the report:
http://jmeter.apache.org/usermanual/generating-dashboard.html#configuration_requirements
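If you only need the figures in seconds, one workaround is to post-process the .jtl yourself rather than the dashboard. A sketch, assuming a CSV results file with a header row (the default in recent JMeter versions), where the elapsed column holds milliseconds:
import csv

# Average elapsed time per sampler label, converted from ms to seconds.
totals = {}
with open("result.jtl", newline="") as f:
    for row in csv.DictReader(f):
        n, total = totals.get(row["label"], (0, 0))
        totals[row["label"]] = (n + 1, total + int(row["elapsed"]))

for label, (n, total) in totals.items():
    print("%s: %.2f s" % (label, total / n / 1000.0))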

The way of JMeter result analysis

No Of Requests - 2113 ;
Average Response time (s) - 123.5 ;
Response time/Sec (90% of Requests) - 142.9
Minimum Response time (s) - 2.4
Maximum response time (s) - 14.9
Error% -0.0
My question: for 2113 requests the average response time is 123.5 s. I need to know what the response time of a single average request out of those 2113 requests would be.
The average response time of a single request (1 out of 2113) would be that value itself, but I'm sure this isn't your question.
Are you simply trying to locate the response time of each request after a given test plan has fully executed, that is, to see each of the 2113 response times? If so, just add a Summary Report to your Thread Group. You'll need to specify an output file (which will be created if it doesn't already exist); it will show you in detail each of the requests sent to the server, along with the HTTP response code, response time and other goodies.
UPDATE
Per the question posed in the comments by Ripon Al Wasim, the default format of the results file is CSV; however, this is configurable in /bin/jmeter.properties:
# legitimate values: xml, csv, db. Only xml and csv are currently supported.
#jmeter.save.saveservice.output_format=csv
As we can see, JMeter only appears to support XML and CSV.
