Stop JMeter test execution only after n assertion errors

Problem
I am modelling stress tests in JMeter 2.13. My idea is to stop the test once a certain response time cap is reached, which I check with a Duration Assertion node.
I do not want, however, to stop the test execution after the first such failure - it could be a single event in an otherwise stable situation. I would like the execution to fail after n such assertion errors, so I can be relatively sure the system is stressed and the average response time is around the cap I defined, which is where I want to stop the whole thing.
What I tried
I am using the Stepping Thread Group from JMeter Plugins. It has a checkbox to stop the test after an error, but it does that on the first occurrence. I found no other node in the documentation that could model this, so I'm guessing there's a workaround I'm not seeing right now.

I would recommend switching to a Beanshell Assertion as it is more flexible and allows you to put custom code in there.
For instance, you have 3 User Defined Variables:
threshold - maximum sampler execution time; any value exceeding the threshold will be counted
maxErrors - maximum number of errors; the test will be stopped once this is reached or exceeded
failures - variable holding the assertion failure count; should be zero at the beginning
Example Assertion code:
long elapsed = SampleResult.getTime();
long threshold = Long.parseLong(vars.get("threshold"));
if (elapsed > threshold) {
    // this sample exceeded the cap, so bump the per-thread failure counter
    int failureCount = Integer.parseInt(vars.get("failures"));
    failureCount++;
    int maxErrors = Integer.parseInt(vars.get("maxErrors"));
    if (failureCount >= maxErrors) {
        // too many slow samples: fail this sampler and stop the whole test
        SampleResult.setSuccessful(false);
        SampleResult.setResponseMessage(failureCount + " requests failed to finish in " + threshold + " ms");
        SampleResult.setStopTest(true);
    } else {
        // not at the limit yet, just store the updated counter
        vars.put("failures", String.valueOf(failureCount));
    }
}
See How to use BeanShell: JMeter's favorite built-in component guide to learn more about extending your JMeter tests with scripting.

Close, but not exactly what you're asking for: the AutoStop JMeter plugin. See the documentation here. You can configure it to stop your test if there are n% failures within a certain amount of time.
If you want a specific number of errors, you can use a Test Action sampler combined with an If Controller: if errorCount reaches n, the Test Action stops the test.
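A rough JSR223/Groovy sketch of that second idea (the property name errorCount and the limit of 100 are assumptions for illustration):
// JSR223 PostProcessor placed at Thread Group level so it runs after every sampler
if (!prev.isSuccessful()) {
    // keep the counter in a property, because properties are shared across all threads
    synchronized (props) {
        int errors = (props.getProperty('errorCount') ?: '0') as int
        props.put('errorCount', String.valueOf(errors + 1))
    }
}
// If Controller condition (with "Interpret Condition as Variable Expression?" checked):
//   ${__groovy(((props.getProperty('errorCount') ?: '0') as int) >= 100)}
// with a Test Action / Flow Control Action sampler set to "Stop" as its child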

Related

How to add timer in jmeter script, which we can start at first call, poll the status & stop once the first request is completed & add assertions

I am doing load testing on report generation, and the requirement is that the report should be generated within 10 minutes.
It includes one HTTP POST request for report generation, and then there is a status check call which keeps checking the status of the first request. Once the status of the first request changes to complete, the report generation is successful.
Basically, I want to start a timer at the beginning of the first request, stop the timer once the status is complete, and add an assertion: if the time is less than 10 minutes the test passes, otherwise it fails.
I tried multiple approaches, like using a Transaction Controller and adding all requests under it, but this doesn't give the sum, only the average response time of the requests under it.
Also, I tried a Beanshell listener, extracting the response time for every request and adding them all...
var responseTime;
props.put("responseTime", sampleResult.getTime());
log.info(" responseTime :::" + props.get("responseTime"));
log.info("time: "+ sampleResult.getTime());
props.put("responseTime", (sampleResult.getTime()+props.get("responseTime")));
log.info("new responseTime :::" + props.get("responseTime"));
However, I am not interested in adding up the response times of these requests; I just need to know the time elapsed from when the report is triggered until the status comes back as complete.
All the JMeter timers add delays; I don't wish to add a delay, I need an actual timer.
Any help is highly appreciated.
Thank you
Since JMeter 3.1 it's recommended to use JSR223 Test Elements and the Groovy language for scripting, mainly for performance reasons, so I'll provide one possible solution in Groovy.
Add a JSR223 PostProcessor as a child of the HTTP Request which kicks off the report generation and put the following code there:
vars.putObject('start', System.currentTimeMillis())
Add a JSR223 Sampler after the status check and put the following code there:
def now = System.currentTimeMillis()
def start = vars.getObject('start')    // stored by the PostProcessor above
def elapsed = now - start
if (elapsed >= 600000) {               // 10 minutes in milliseconds
    SampleResult.setSuccessful(false)
    SampleResult.setResponseMessage('Report generation took: ' + (elapsed / 1000 / 60) + ' minutes instead of 10')
}

JSR223 Timer strange(?) behavior

I'm using a JSR223 Timer (JMeter 5.4.1) with the Groovy language, trying to add delays/pauses to my threads.
I'm following the instructions from BlazeMeter (How to Easily Implement Pacing).
The strange(?) behavior is that the actual delay is double the required value.
The script is as follows:
Long pacing = 5000 - prev.getTime();
Integer iPacing = pacing != null ? pacing.intValue() : null;
log.info("Transaction Pacing: " +String.valueOf(iPacing));
vars.put("myDelay", String.valueOf(iPacing));
return iPacing;
I get the duration of the sampler action, then calculate "myDelay" as the difference from a base duration of 5,000 ms. myDelay is a variable I use in the Flow Control Action sampler.
Now the strange result:
The actual delay I get is TWICE the calculated value. In this example, the calculated delay is 5,000 ms, but the actual delay is 10,000 ms.
Now here is the really strange issue:
If I comment out the return iPacing, the delay is 5,000 ms as required (with a warning message in the log file).
Why does the Flow Control Action sampler add the myDelay and iPacing values together?
In the first case, iPacing is returned and the overall pause is myDelay + iPacing.
In the second case, return iPacing is commented out and the delay is myDelay only.
Your delay is twice as long simply because you're setting it twice.
This statement:
return iPacing;
will create a delay before each sampler in the JSR223 Timer's scope.
So there is no need to use the Flow Control Action sampler, because you're already creating the delay in the JSR223 Timer.
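So a pacing script that relies on the timer alone could look something like this sketch (the 5,000 ms target is taken from the question):
// JSR223 Timer (Groovy): the value returned by the script is the delay JMeter applies, in milliseconds
long pacing = 5000 - prev.getTime()   // prev.getTime() = elapsed time of the previous sampler
return pacing > 0 ? pacing : 0        // never return a negative delay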
In general, pacing is not implemented in JMeter because there is an easier way of creating load in terms of X requests per second: the Constant Throughput Timer and friends.

JMeter - Disable/Ignore assertion only for first request

I have a Thread Group with 10+ requests at the same hierarchy level.
I added a Duration Assertion for all of them and it's working fine, except in one case:
if the server has just been brought up before the test, the first request fails due to the long duration caused by the server startup delay.
How can I ignore the assertion for the first request on the first execution?
You can add a JSR223 Listener to your Test Plan and use code like:
if (prev.getSampleLabel().equals('First Sampler') && vars.getIteration() == 1) {
    prev.setSuccessful(true)
}
It will mark the sampler labelled First Sampler as successful if it fails during the first Thread Group iteration.
prev stands for the SampleResult class instance; check out its JavaDoc for the available functions and properties. For example, you might be interested in getAssertionResults().
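For example, if you would rather clear only Duration Assertion failures instead of blanket-marking the sampler as successful, a sketch along these lines could work (the assertion name 'Duration Assertion' is whatever label you gave the element):
if (prev.getSampleLabel().equals('First Sampler') && vars.getIteration() == 1) {
    // check whether the failure actually came from the Duration Assertion
    def durationFailed = prev.getAssertionResults().any { ar ->
        ar.getName() == 'Duration Assertion' && (ar.isFailure() || ar.isError())
    }
    if (durationFailed) {
        prev.setSuccessful(true)   // ignore the slow first request on the first iteration only
    }
}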

JMeter Test Plan summary report PASS/FAIL

I'm stuck finding a solution to one problem with JMeter. I need to put some logic into my Test Plan that can give a simple PASS/FAIL report, calculated from the test case execution results and put into the generated JTL report afterwards. For instance:
All tests passed - Test Plan result = PASS
One or more tests failed - Test Plan result = FAIL
The majority of suitable options assume using third-party tools, to wit:
you can run the JMeter test in Jenkins and use the Performance plugin, which allows you to conditionally fail the build if the number of failed requests exceeds a specified threshold
you can run the JMeter test using the Taurus tool as a wrapper; it has a flexible and powerful Pass/Fail Criteria subsystem allowing you to set different criteria definitions to mark the test as passed or failed. If the build fails, the Taurus process returns a non-zero exit code.
If the above approaches are not suitable for any reason, please elaborate your question and explain how and where you would like to see this "FAIL" or "PASS" result.
Add one BeanShell Listener and one BeanShell Sampler at the end of your Thread Group and put this in the Listener:
// flag the run as failed as soon as any HTTP sampler returns a non-OK response code
if (sampleEvent.getResult() instanceof org.apache.jmeter.protocol.http.sampler.HTTPSampleResult) {
    if (!sampleEvent.getResult().isResponseCodeOK()) {
        vars.put("res", "-1");    // JMeter variables hold Strings
    }
}
And in the BeanShell Sampler put:
if you want to store the result as a property:
props.put("testPlanResult", !"-1".equals(vars.get("res")) ? "PASS" : "FAIL");
if you want to store the result in a file:
f = new FileOutputStream("/path/to/file.txt", false);
p = new PrintStream(f);
p.println("Result: " + (!"-1".equals(vars.get("res")) ? "PASS" : "FAIL"));
p.close();
f.close();
From here you can do whatever you need with the created property or the file containing the result...
Hope this helps you!
EDIT:
You will need to add this import if writing the result to a file:
import org.apache.jmeter.services.FileServer;
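For reference, FileServer is handy if you want to resolve the output path relative to the .jmx location instead of hard-coding /path/to/file.txt; a JSR223/Groovy sketch of the same sampler (the file name result.txt is arbitrary):
import org.apache.jmeter.services.FileServer

// getBaseDir() returns the directory the current test plan was loaded from
def result = !"-1".equals(vars.get("res")) ? "PASS" : "FAIL"
new File(FileServer.getFileServer().getBaseDir(), "result.txt").text = "Result: " + result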

How to stop a JMeter test after a specified number of samples have failed

I have one Transaction Controller containing one HTTP request in my JMeter test plan. The transaction name and URL come from a CSV file. In the end, the total execution is divided into 5 different transactions.
Testplan:
Testplan
-Thread Group
- User defined variable
The total sample execution will be 8000-10000. Now, what I want is: if the total sample failures reach 100, my JMeter test should stop execution.
I have added a User Defined Variable named "thread" with the value "0" and added the code below in a Beanshell Post-Processor:
int count = Integer.parseInt(vars.get("thread"));
if (prev.getErrorCount() == 1) {
    count++;
    System.out.println(count);
    vars.put("thread", Integer.toString(count));
}
if (count == 100) {
    System.out.println("Reached to max number of errors in load test, stopping test");
    log.info("Reached to max number of errors in load test, stopping test");
    prev.setStopTestNow(true);
}
Somehow the code is not working as expected. When the error count reaches 100, the JMeter test does not stop; it only stops when the error count reaches about 130. I am not sure how to fix the above code.
Can someone please let me know what the issue in the above code is?
Variables are specific to 1 thread while Properties are shared by all threads.
See:
https://jmeter.apache.org/api/org/apache/jmeter/util/JMeterUtils.html#getProperty(java.lang.String)
https://jmeter.apache.org/api/org/apache/jmeter/util/JMeterUtils.html#setProperty(java.lang.String,%20java.lang.String)
Ensure you synchronize access to the property when you read, increment and write it.
Another option is to use a custom Java class as a singleton and increment its value, for example:
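A minimal sketch of that singleton idea (the class name ErrorCounter is arbitrary; it would need to be compiled and put on JMeter's classpath, e.g. in lib/):
import java.util.concurrent.atomic.AtomicInteger

// Thread-safe shared counter, callable from any JSR223/Beanshell element
class ErrorCounter {
    private static final AtomicInteger COUNT = new AtomicInteger(0)

    static void reset() { COUNT.set(0) }
    static int increment() { return COUNT.incrementAndGet() }
    static int current() { return COUNT.get() }
}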
Here is an example implementation using Beanshell (prefer JSR223 + Groovy for performance):
A setUp Thread Group element that resets the counter on test start:
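A minimal JSR223/Groovy version of such a reset (the property name failureCount is an assumption):
// JSR223 Sampler inside a setUp Thread Group: reset the shared counter before the test starts
props.put('failureCount', '0')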
A PostProcessor that updates the counter and stops the test once the limit is reached:
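A JSR223/Groovy sketch of such a PostProcessor (again, failureCount and the limit of 100 are assumptions):
// runs after every sampler in its scope
if (prev.getErrorCount() > 0) {
    int failures
    // properties are shared by all threads, so guard the read-increment-write
    synchronized (props) {
        failures = ((props.getProperty('failureCount') ?: '0') as int) + 1
        props.put('failureCount', String.valueOf(failures))
    }
    if (failures >= 100) {
        log.info('Reached the maximum number of errors in the load test, stopping the test')
        prev.setStopTestNow(true)
    }
}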
Note that when you call prev.setStopTestNow(true), test threads are interrupted but do not stop at exactly that moment unless you have some timer in place (which is usually the case).
