Should average response time include failed transactions or not? - performance

In the LoadRunner report, failed transactions are excluded when calculating average response time, but JMeter includes failed transactions in the calculation as well. I am a bit confused here. What is the best way to calculate average response time? Should it include failed transactions or not? Detailed explanations will be highly appreciated.

It depends on where exactly your "transaction" failed.
If it reached the server, made a "hit" (or several hits), kicked off request processing, and failed with a non-successful status code, I believe it should be included: your load testing tool triggered the request, and it's the application under test which failed to respond properly or on time.
If the "transaction" didn't start due to missing test data or incorrect configuration of the load testing tool, it shouldn't be included. However, that means your test is not correct and needs to be fixed.
So for well-behaved tests I would include everything in the report and perhaps prepare 3 views:
Everything (with passed and failed transactions)
Successes only
Failures only
In JMeter you can use the Filter Results Tool to remove failed transactions from the final report; the tool can be installed using the JMeter Plugins Manager.
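The three views above amount to a simple calculation over the raw sample data. This is a minimal sketch for illustration, not JMeter's implementation; the tuples stand in for the `elapsed` and `success` columns of a results file:

```python
# Sketch: average response time computed three ways over one test run.
# Each sample is (elapsed_ms, success); values are made up for illustration.
samples = [
    (120, True), (150, True), (90, True),   # passed transactions
    (15, False), (12, False),               # fast failures (e.g. 5xx)
]

def average(values):
    """Arithmetic mean, 0.0 for an empty list."""
    return sum(values) / len(values) if values else 0.0

everything = average([t for t, _ in samples])           # passed + failed
successes  = average([t for t, ok in samples if ok])    # passed only
failures   = average([t for t, ok in samples if not ok])  # failed only

print(everything, successes, failures)  # 77.4 120.0 13.5
```

Note how the two fast failures pull the overall average (77.4 ms) well below the average of the successful transactions (120 ms), which is exactly why the separate views are worth reporting.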

A failed transaction can be faster than one which passes. For example, a 4xx or 5xx status message may arrive back at the client almost instantaneously. Get enough of these errors and your average response time will drop considerably. In fact, if I were an unscrupulous tester, castigated for the level of failure in my tests, I might include a lot of "fast responses" in my data set to deliberately skew the response time so my stakeholders don't yell at me anymore.
Not that this ever happens.

Related

The request run in while loop controller on Jmeter captured in report

I am trying to generate a test report for a thread group.
Since I run an HTTP request in a While Loop Controller, the request shows up in the report with too many failures, as I also have an assertion attached to the request.
These are not failures but retries; how do I make sure they don't get captured in the report?
I am using the View Results in Table listener for the report. I tried removing the assertion from this HTTP request so that the failures are not captured in the report. Is this the right approach?
As per the 9 Easy Solutions for a JMeter Load Test “Out of Memory” Failure article:
Use Assertions Sparingly
Every test element added to the test plan will be processed - and this takes CPU time and memory. This also applies to all your Assertions, and is especially true for Compare Assertions - which consumes a lot of resources and memory. Only use the assertions you need and, even then, just use the amount that are absolutely required.
So removing the assertion seems to be the right approach to me.
Also, if it's a Response Assertion you can modify it to accept 2 (or more) criteria so you can configure it not to fail on "retries". An example setup accepts HTTP status codes 200 and 501 and fails for everything else.
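The accept-list behaviour described above boils down to a simple membership check. This is a hypothetical sketch of the assertion's logic, not JMeter code:

```python
# Sketch: the pass/fail logic of a Response Assertion configured to
# accept two response codes and fail everything else.
ACCEPTED_CODES = {"200", "501"}

def assertion_passes(response_code: str) -> bool:
    """True if the sampler's response code is one of the accepted values."""
    return response_code in ACCEPTED_CODES

print(assertion_passes("200"), assertion_passes("501"), assertion_passes("404"))
```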

Response code 500 in JMeter when running with threads

Getting the following error in JMeter while running a list of APIs (number of threads: 1-140, ramp-up period: 1).
Response code: 500
Response message: Internal Server Error
How should I overcome this error response code in order to get an accurate response?
What should I do to decrease the number of responses with this response code?
In general, a 500 is an unhandled exception on the part of a developer, usually on the backend but sometimes in the performance testing tool front end.
Ask yourself: are you validating the responses that come back from the server for appropriate content? I am not just suggesting that an HTTP 200 is valid. You need to check the response content to ensure it is what you expect for the business process, because you can have a completely valid HTTP 200 page which contains a response that will send your business process off the rails. If you do not handle that unexpected response, then one or two steps down the road in the business process you are pretty much guaranteed to find a 500, as your request is completely out of context with the state of the application at that point.
Testing 101: for every step there is an expected, positive result which allows the business process to continue. Check for that result and branch your code when you do not find that the result is true.
Or, if this is a single-step business process, then you are likely handing the service poor data and the developer has not fully fleshed out the graceful handling of your poor data.
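The point about validating content, not just status codes, can be sketched like this (`validate` and `expected_marker` are hypothetical names introduced for illustration):

```python
# Sketch: a response check that looks at business content, not just
# the HTTP status code.
def validate(status: int, body: str, expected_marker: str) -> bool:
    """A 200 with the wrong content is still a failure for the business process."""
    if status != 200:
        return False
    return expected_marker in body

# A perfectly "green" HTTP 200 page that actually reports an application error:
print(validate(200, "Order rejected: session expired", "Order confirmed"))  # False
print(validate(200, "Order confirmed #123", "Order confirmed"))             # True
```

Branching on the result of such a check, rather than blindly issuing the next request, is what keeps the later steps of the business process from producing out-of-context 500s.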
The general advice in JMeter is Ramp-up = number of threads, in your case 140.
Start with Ramp-up = number of threads and adjust up or down as needed.
Currently you are starting a new thread every 1/140 of a second, which is almost simultaneous. The reason for the change is:
Ramp-up needs to be long enough to avoid too large a work-load at the start of a test
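The arithmetic behind this advice, assuming JMeter's documented behaviour of spreading thread starts evenly across the ramp-up period:

```python
# Sketch: how the ramp-up period spreads out thread start times.
threads = 140

def start_interval(ramp_up_seconds: float, thread_count: int) -> float:
    """Seconds between consecutive thread starts."""
    return ramp_up_seconds / thread_count

print(start_interval(1, threads))    # ramp-up 1s  -> a new thread every ~7 ms
print(start_interval(140, threads))  # ramp-up 140s -> one new thread per second
```

With a 1-second ramp-up, all 140 threads hammer the server within the first second of the test; stretching the ramp-up to 140 seconds gives the server a chance to warm up under a steadily growing load.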
Status code 500 comes from the server/API; it's not a JMeter issue. Sometimes concurrent requests are rejected by the server because it is too weak to handle that number of requests. In my case, I asked my server team to scale up the servers so that we could test the underlying API. It's worth mentioning that sometimes JMeter itself also runs out of memory; you can tweak the set HEAP=-Xms512m -Xmx512m property in the JMeter executable file. Also, listeners consume a lot of resources, so try not to use them during the load test.

JMeter: Will using assertions in JMeter scripts run on BlazeMeter cause slowness/an increase in response time?

I need clarification on whether using assertions in our scripts will cause any slowness.
I am using 10 to 15 assertions in my scripts, of the Response and Duration assertion types. Will there be any impact that could cause slowness/an increase in response time and other metrics?
Please help me with this; answers from those with expertise would be appreciated.
According to what BlazeMeter says about assertions, they only consume CPU and memory:
All assertions come with a cost, in terms of CPU or memory consumption. However, some assertions carry a greater cost than others. According to the JMeter Performance and Tuning Tips guide, the Response Assertion and the Duration Assertion are typically lower-impact choices, whereas Compare Assertion and other XML-based ones like XPath Assertion consume more CPU and memory.
Assertions are evaluated on the machine running the test, not on the server where the application is running, so there is no need to worry about slower response times from the server. The only things assertions will tax are your load generator's processor and, eventually, its RAM.
I see that you already got the answer to your question. I just wanted to share additional info: it is not only assertions.
The same issue can arise from the type of response data extractors you have in your test plan. Check this link for a simple comparison and to get an idea.
Each and every element in the test plan affects the execution, so you need to be very careful about what you add to the test plan.
For the above question I got a reply from the BlazeMeter team; here is the message:
"Some assertions are consuming a lot of system resources and using a lot of assertions can cause slowness and out of memory errors.
The duration assertion is not very resource consuming, however the Response assertion can consume a lot of memory as it has to fetch the whole page full of data.
I highly recommend reading this short blog post(https://www.blazemeter.com/blog/why-you-must-use-jmeter-assertions-your-load-tests-0) about how to use assertion in JMeter and also which assertions you should avoid."

What is meant by Error % in JMeter (Summary Result)?

1) I cannot understand the Error % in the Summary Result listener. 2) For example, the first time I run a test plan its Error % is 90%, and when I run the same test plan again it shows 100% error. The Error % varies each time I run my test plan.
Error % denotes the percentage of requests with errors.
100% error means all the requests sent from JMeter have failed.
You should add a View Results Tree listener and then check the individual requests and responses. Such a high percentage of errors means that either your server is not available or all of your requests are invalid.
So you should use the View Results Tree listener in order to identify the actual issue.
Error % means how many requests failed or resulted in an error throughout the test duration. It's calculated based on the # Samples field.
Regarding your other points: can you please give more details about your test plan, such as number of threads, ramp-up, and duration?
Such a high error percentage needs further analysis. Check whether you have missed correlating some requests (i.e. any dynamic values that are passed from one request to another), or check the resource utilization of your target system to see whether it can handle the load you are generating.
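The calculation behind the Error % column is straightforward; this is a sketch, not JMeter's source:

```python
# Sketch: Error % as shown in JMeter's Summary Result listener is
# failed samples divided by total samples, as a percentage.
def error_percent(failed: int, total: int) -> float:
    """Percentage of failed samples; 0.0 when no samples were recorded."""
    return 100.0 * failed / total if total else 0.0

print(error_percent(90, 100))   # 90.0  -> the "90% error" run
print(error_percent(140, 140))  # 100.0 -> every request failed
```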

Jmeter Sample Increment vs Throughput

We're running a load test right now in JMeter, viewing the Aggregate Report page. While we watch, Samples are increasing by nearly 500/second; the number is going up very fast. However, Throughput on the same page stays pegged at 18/second and our error rate is not increasing.
How can JMeter be sending so many samples if our server is only handling 18/second and the number of errors is not increasing (we only have 20 errors out of millions of samples)?
Do requests equate to samples (they seem to)? Are we missing something?
If you add a View Results Tree listener you can see EACH request and response, and you should check whether the responses are what you actually want.
And in the View Results in Table listener, compare the Bytes for each response. Does it match the expected size in all cases?
In cases of errors or incorrect responses, these will differ.
Requests DO equal samples.
Throughput is the number of requests per unit of time (seconds, minutes, or hours) sent to your server during the test; JMeter chooses the time unit so that the displayed rate is at least 1.0.
Remember that almost all errors are user-defined. Following JoseK's recommendation, install the View Results Tree listener to see what your responses actually are. If they are green but fail your own criteria, add assertions to turn them into errors.
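The relationship between the ever-growing Samples count and the steady Throughput figure can be shown with a toy calculation (the numbers here are made up for illustration):

```python
# Sketch: samples accumulate over the whole test, while throughput is
# samples per unit of time. Many concurrent threads can keep issuing
# requests even when the server only completes ~18 per second.
def throughput_per_second(total_samples: int, elapsed_seconds: float) -> float:
    """Average completed requests per second over the elapsed test time."""
    return total_samples / elapsed_seconds

# 64,800 samples completed over a one-hour test:
print(throughput_per_second(64_800, 3600))  # 18.0
```

So the Samples column can climb rapidly (every issued request becomes a sample) while the throughput stays flat at whatever rate the server can actually sustain.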