Organize Functional Tests in JMeter - jmeter

I am using JMeter to create a functional automation suite for our application under test. (Right now it is the only tool I can think of that supports interaction with ActiveMQ, a database, and both REST and SOAP APIs, which are our needs.)
Down the line I will have different test sets and configuration files for the application under test.
Below is the process I will follow to test:
1) Stop the application
2) Load a particular configuration file
3) Start the application
4) Run the tests that match the loaded configuration
Repeat the same for the other configurations.
Now every test case comes with steps, like:
1) Call a Rest API
2) Call a Rest API
3) Call DB
4) Validate the result from step 2
See the attached image for more details on how my test case is organized.
Problem:
When the report is generated, it is not generated at the thread group level but at the sampler level, i.e. the report contains one line per sampler and there is no way to distinguish which TC (or thread group) and test set they belong to.
Can someone please suggest how I can achieve this?
Please keep this in mind:
1) Down the line I will have multiple test sets.
2) I will also need to merge the reports from multiple test sets and create one single report that provides a clear picture of what failed/passed and, ideally, the error message received.
Existing Report:
timeStamp,elapsed,label,responseCode,responseMessage,threadName,dataType,success,failureMessage,bytes,sentBytes,grpThreads,allThreads,URL,Latency,IdleTime,Connect
1565180794011,2067,DeactiveExistingActiveScenario,Non HTTP response code: org.apache.http.conn.HttpHostConnectException,"Non HTTP response message: Connect to localhost:1 [localhost/127.0.0.1, localhost/0:0:0:0:0:0:0:1] failed: Connection refused: connect",TC1_Probe_MbaWmcOutboundHappyFlowScenario 1-1,text,false,Test failed: code expected to contain /200/,2738,0,1,1,http://localhost:1/XXX/XXX/XXXX,0,0,2067
1565180796093,2007,ActiveMbaWmcOutboundHappyFlowScenario,Non HTTP response code: org.apache.http.conn.HttpHostConnectException,"Non HTTP response message: Connect to localhost:1 [localhost/127.0.0.1, localhost/0:0:0:0:0:0:0:1] failed: Connection refused: connect",parallel bzm - Parallel Controller,text,false,Test failed: text expected to contain /All 25 invocations validated successful./,3104,0,2,1,http://localhost:1/XXX/XXX/XXX?awaitSeconds=30,0,0,2007
1565180796092,2479,Call DB Procedure,200,OK,parallel bzm - Parallel Controller,text,true,,42,0,1,1,null,2478,0,390
Expected Report:
Probably the same report in a different format, like:
Test Set 1:
  TC1:
    Step 1:
    Step 2:
    Step 3:
  TC2:
    Step 1:
    Step 2:
    Step 3:
Current Test Set Structure :
https://ibb.co/F4SVHxq

Two approaches I can think of:
1) Use a Transaction Controller. Put all the requests of one test case under one Transaction Controller; it will be shown in the report after its children, so you get the steps first and then the test case name at the end, as shown below:
2) Use a Dummy Sampler for the test set to produce the extra label, as shown below:
Here TC1 and TC2 are Dummy Samplers. Based on the above, you can use the test set and test case labels according to your needs: Test Set 1 -- dummy, TC1 -- dummy, Step 1, Step 2 and so on.
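If you prefer to keep the labelling automatic, a hedged third option is a JSR223 Listener that prefixes every sample label with its thread group name, so each JTL row carries its TC. This is only a minimal sketch, assuming one thread group per test case and the standard JSR223 bindings (prev, ctx):

// hedged sketch: prepend the thread group (test case) name to each sample label
// so rows in the CSV/JTL report can be grouped per TC and test set
def tcName = ctx.getThreadGroup().getName()
prev.setSampleLabel(tcName + " / " + prev.getSampleLabel())

With that in place, a merged report from several test sets can be grouped on the label prefix alone.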
This assumes a functional test with a single thread.
Hope this helps.

Related

JMeter - Disable/Ignore assertion only for first request

I have a Thread Group with 10+ requests at the same hierarchy level.
I added a Duration Assertion to all of them and it's working fine, except in one case:
If the server has just been brought up before the test, the first request fails due to the long duration caused by the server startup delay.
How can I ignore the assertion for the first request on the first execution?
You can add a JSR223 Listener to your Test Plan and use code like:
if (prev.getSampleLabel().equals('First Sampler') && vars.getIteration() == 1) {
    prev.setSuccessful(true)
}
It will mark the sampler labelled First Sampler as successful if it fails during the first Thread Group iteration.
prev stands for the SampleResult class instance; check out its JavaDoc for the available functions and properties. For example, you might be interested in getAssertionResults()
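For instance, a hedged extension of the listener above that also clears the failure flags via getAssertionResults(), so downstream listeners and reports don't count them (Groovy, standard JSR223 bindings assumed):

// hedged sketch: on the first iteration, mark the first sampler as passed
// and clear the failure/error flags on its assertion results as well
if (prev.getSampleLabel().equals('First Sampler') && vars.getIteration() == 1) {
    prev.setSuccessful(true)
    prev.getAssertionResults().each { assertionResult ->
        assertionResult.setFailure(false)
        assertionResult.setError(false)
    }
}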

Running JMeter in command-line mode does not create any log for anything under the While loop in the .jmx file

I have a .jmx file with two thread groups. The first thread group is for data comparison (DB vs API): it has a JDBC request where I plug in my SQL script and save the results to a tab-delimited file, followed by a While loop containing an HTTP request.
The second thread group is for negative scenario validation.
Below is the structure of the .jmx file:
-- Thread Group Name - FX-Rates
   -- JDBC Request Name - FX-SQL
   -- While loop
      -- HTTP Request - FX Rates - API
-- Thread Group Name - Negative Testing
   -- Error Codes
I am running JMeter in non-GUI mode using the command below:
jmeter -n -t "F:\MY DOCUMENTS\PSM\PSM_Automation\bin\Non_GUI_FX_Rates_Validation.jmx" -l "F:\MY DOCUMENTS\PSM\PSM_Automation\log\Non_GUI_FX_Rates_Validation.jtl"
I can see that it creates log entries for each individual sampler, i.e. the SQL request and the negative scenarios, but nothing for the samplers under the While loop. Below is the log it created.
Logfile:
timeStamp,elapsed,label,responseCode,responseMessage,threadName,dataType,success,failureMessage,bytes,sentBytes,grpThreads,allThreads,Latency,IdleTime,Connect
1492185939615,12140,FX - SQL,200,OK,FX Rates 1-1,text,true,,18549,0,1,1,12017,0,1566
1492185951933,0,Error 400: Invalid Date Format,Non HTTP response code: java.net.URISyntaxException,Non HTTP response message: Illegal character in query at index 80: https://sys-fxrt-v0.apps.system.pcf.ntrs.com/foreign-exchange-rates?as-of-date=${D_EXCH_RT_EFF},Negative Testing - Error Codes 2-1,text,false,,1105,0,1,1,0,0,0
1492185951935,190,Error 404: No Account,404,Not Found,Negative Testing - Error Codes 2-1,text,true,,354,232,1,1,189,0,170
1492185952127,20,Error 404: Incorrect URL,404,Not Found,Negative Testing - Error Codes 2-1,text,false,,354,241,1,1,19,0,12
1492185952147,19,Error 204: No Data ,404,Not Found,Negative Testing - Error Codes 2-1,text,true,,354,260,1,1,19,0,12
Obviously, this kind of sampler result gives it away:
Error 400: Invalid Date Format,Non HTTP response code: java.net.URISyntaxException,Non HTTP response message: Illegal
character in query at index 80:
https://sys-fxrt-v0.apps.system.pcf.ntrs.com/foreign-exchange-rates?as-of-date=${D_EXCH_RT_EFF}
Obviously that is because ${D_EXCH_RT_EFF} is not set/resolved in your URL.
That's the problem you're going to have to debug.
Obviously.
If you're already doing that (debugging) and run into a problem during it, then please describe it.
And show two things: how you set (initialize) the variable, and how you use it; that means literally showing your HTTP sampler (yes, a screenshot, at least).
The reasons for not entering the While Loop can be:
The While condition is returning false
The While condition is wrong
If your While condition depends on a variable coming from the JDBC Request Name - FX-SQL sampler, double-check this variable's value using the Debug Sampler and View Results Tree listener combination.
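A hedged example of what such a condition could look like, assuming the JDBC Request's "Variable names" field is set to D_EXCH_RT_EFF (so JMeter creates D_EXCH_RT_EFF_1, D_EXCH_RT_EFF_2, and so on) and a Counter variable named rowIndex starts at 1 and increments on every loop pass:

${__groovy(vars.get("D_EXCH_RT_EFF_" + vars.get("rowIndex")) != null)}

Inside the HTTP Request you would then reference the current row as ${__V(D_EXCH_RT_EFF_${rowIndex})} instead of the bare ${D_EXCH_RT_EFF}, which is never populated and is exactly what produces the URISyntaxException in the log above.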
General "good practice" is running your test with 1-2 virtual users in GUI mode and inspect requests and responses details in the View Results Tree listener. Also pay attention to any suspicious entries in jmeter.log file. Don't run your JMeter test with the full load until you are totally sure that it is doing what it is supposed to be doing. See How to Debug your Apache JMeter Script article for more information on JMeter tests troubleshooting techniques.

JMeter Test Plan summary report PASS/FAIL

I'm stuck finding a solution to one problem with JMeter. I need to put some logic into my Test Plan that can give a simple PASS/FAIL report, calculated from the test case execution results and put into the generated JTL report afterwards. For instance:
All tests passed - Test Plan result = PASS
One or more tests failed - Test Plan result = FAIL
The majority of suitable options assume using third-party tools, to wit:
You can run the JMeter test in Jenkins and use the Performance plugin, which allows you to conditionally fail the build if the number of failed requests exceeds a specified threshold
You can run the JMeter test using the Taurus tool as a wrapper; it has a flexible and powerful Pass/Fail Criteria Subsystem allowing you to set different criteria definitions to mark the test as passed or failed. If the build fails, the Taurus process returns a non-zero exit code.
If the above approaches are not suitable for any reason, please elaborate on your question and explain how and where you would like to see this "FAIL" or "PASS" result.
Add one BeanShell Listener and one BeanShell Sampler at the end of your Thread Group and put this in the Listener:
if (sampleEvent.getResult() instanceof org.apache.jmeter.protocol.http.sampler.HTTPSampleResult) {
    if (!sampleEvent.getResult().isResponseCodeOK()) {
        vars.put("res", "-1"); // JMeter variables hold Strings, so store the flag as "-1"
    }
}
And in the BeanShell Sampler put the following.
If you want to store the result as a property:
props.put("testPlanResult", !"-1".equals(vars.get("res")) ? "PASS" : "FAIL");
If you want to store the result in a file:
f = new FileOutputStream("/path/to/file.txt", false);
p = new PrintStream(f);
p.println("Result: " + (!"-1".equals(vars.get("res")) ? "PASS" : "FAIL"));
p.close();
f.close();
From here you can do whatever you need with the created property or the file containing the result...
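For example, a hedged sketch of a JSR223 Sampler placed in a tearDown Thread Group that reads the property back once all Thread Groups have finished (the property name testPlanResult is the one set above):

// hedged sketch: log the overall verdict after the main Thread Groups are done
log.info("Test Plan result: " + props.get("testPlanResult"))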
Hope this helps you!
EDIT:
You will need to add this import if writing the result to a file:
import org.apache.jmeter.services.FileServer;

Beanshell script launched once (start and end of test plan) in JMeter

Good afternoon!
I will try to explain my problem clearly.
The context
I have a JMeter Test Plan which sends HTTP requests to a server. I have a Beanshell script to assert each different error case returned:
302 response code -> OK
200 response code -> ?
For each 200 error, I check the response data string to see whether it is an error or a correct case. (A user error like "User doesn't have the correct rights" is OK, but "Server is unavailable" is an ERROR, and both have 200 as the response code.)
Here is my test plan:
The goal
As I have several errors returned by only one assertion script, I am not able to differentiate each error, except by expanding the assertion in a View Results Tree. But I disable that listener when launching my test, and I will launch my Test Plan remotely.
I had the idea of manually counting each error. All my samples go through my assertion script and into the correct if block according to their content. I increment some variables (JMeter properties, in fact) in each block:
int test = Integer.parseInt(props.getProperty("302"));
test++;
props.setProperty("302", ""+test);
I want to display all those variables in a JFrame at the end of my test plan, like this:
The problem
My problem is that I don't know how to launch a Beanshell script before and after the Test Plan.
I want a first script to be started before any sample is sent, just to initialize all my property variables to 0 (otherwise, they keep the values from the last Test Plan).
And I want a second one to display my frame with all the variables after the test plan has finished. (Currently it is a JFrame, but it will not stay like this.)
Tested solutions
1) For my first script, I set a Counter (JMeter > Config Element > Counter) at the beginning of my test plan to 0.
I use it to check whether my test has already started or not with an If Controller:
I have a Beanshell Pre-Processor with props.set("302","0"); where "302" is my property counting all 302 response codes.
It works correctly, but I want to know if there is a proper way to do this.
2) Then, for my second script, I tried to use ${JMeterThread.last_sample_ok} in an If Controller as well, but it doesn't work as I expected. If I put it after my sample, it runs after every OK assertion, and if I put it at the end of the test plan, it is never called.
How can I run my Beanshell script once, after all my threads have stopped (i.e. all samples have finished)?
Thank you in advance, I hope you understood everything!
JMeter's setUp Thread Group and tearDown Thread Group are meant for exactly this.
Add your Beanshell component to the setUp Thread Group to do setup activities before your actual test starts. Similarly, the tearDown Thread Group runs after your test execution is complete.
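A minimal hedged sketch of both pieces, reusing the "302" property from the question (JSR223/Groovy syntax shown; the same props and log bindings are available in Beanshell):

// JSR223 Sampler in the setUp Thread Group: reset the counters before any test sample runs
props.put("302", "0")
props.put("200_user_error", "0")   // hypothetical name for the 200-with-error counter

// JSR223 Sampler in the tearDown Thread Group: report the totals once all threads have stopped
log.info("302 responses: " + props.get("302"))
log.info("200 user errors: " + props.get("200_user_error"))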

Thrift API load test

I am new to Apache JMeter. Basically I want to load test a couple of our Thrift APIs but have no clue where to start. They are in Java, where the API takes 2 parameters and then sends a Java object as the response.
Any pointers would be very helpful.
JMeter isn't built especially for this, but it's flexible enough to support your use case.
There is an extensibility mechanism which uses BeanShell. JMeter provides a BeanShell Sampler which is capable of invoking Java code, including code from external jars.
Simple usage:
Start with an empty JMeter project
Create a Thread Group with all defaults (you can play with the number of threads, ramp-up, etc.)
Add a BeanShell Sampler with the following code:
Thread.sleep(2000L);
Add a View Results Tree listener
Save and run
You should see a green triangle (or triangles, depending on your number of threads and loops) with output like the following:
Thread Name: Thread Group 1-1
Sample Start: 2013-11-02 14:48:11 GMT+03:00
Load time: 5030
Latency: 0
Size in bytes: 0
Headers size in bytes: 0
Body size in bytes: 0
Sample Count: 1
Error Count: 0
Response code: 200
Response message: OK
If you use any of the techniques for analyzing results, e.g.:
JMeter's embedded listeners like Aggregate Report, Summary Report, Graph Results, etc.
Storing results to a CSV file and opening them with Excel or an equivalent (see the jmeter.properties file under the /bin directory of your JMeter installation; the property prefix is "jmeter.save.saveservice.")
The JMeter Ant Task (see Test.jmx and build.xml in the /extras folder under your JMeter installation)
The JMeter Results Analysis Plugin
you'll see your request(s) success rate, min/max/average times (something like 2 seconds, I guess) and some more information (depending on your configuration).
Your particular use case assumes:
IMPORTANT: placing the Thrift (or whatever) jars under the lib/ext folder (otherwise you won't be able to access your APIs)
Importing the classes you need to test somewhere in the BeanShell Sampler:
import yourpackage.YourClass;
Invoking the methods you want to test from the BeanShell Sampler (see the sketch after this list)
(optional) Doing some assertions on the responses, e.g.:
if (yourresponse != yourexpectedresponse) {
    IsSuccess = false;
    ResponseMessage = "Test Failed";
}
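Putting it together, a hedged sketch of what a JSR223/Groovy sampler body could look like; YourService, the host/port, the method name and the parameter values are placeholders for your generated Thrift client, not real API names:

// hedged sketch: open a Thrift connection, call one API method, log the response
import org.apache.thrift.transport.TSocket
import org.apache.thrift.protocol.TBinaryProtocol
import yourpackage.YourService   // placeholder: your generated Thrift service class

def transport = new TSocket("localhost", 9090)   // assumed host and port
transport.open()
try {
    def client = new YourService.Client(new TBinaryProtocol(transport))
    def param1 = "someKey"   // placeholder parameters; your API takes two
    def param2 = 42
    def response = client.yourMethod(param1, param2)   // placeholder method name
    log.info("Thrift response: " + response)
} finally {
    transport.close()
}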
Hope this helps
You can use a JSR223 Sampler + Groovy (add groovy-all.jar to jmeter/lib) and look at this client example; see the NonblockingClient code for an example:
http://www.javacodegeeks.com/2012/03/apache-thrift-quickstart-tutorial.html
Make your Groovy code call at least the following at the end:
SampleResult.setSuccessful(true/false)
SampleResult.setResponseCode("code")
SampleResult.setResponseMessage("message")
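For instance, a hedged sketch of how those calls might be wrapped around the client invocation in a JSR223 Sampler (callThriftApi here is a stand-in closure, not a real API):

// hedged sketch: report success or failure back to JMeter from a JSR223 Sampler
def callThriftApi = { -> "stub response" }   // placeholder: replace with your generated-client call
try {
    def response = callThriftApi()
    SampleResult.setResponseData(response.toString(), "UTF-8")
    SampleResult.setSuccessful(true)
    SampleResult.setResponseCode("200")
    SampleResult.setResponseMessage("OK")
} catch (Exception e) {
    SampleResult.setSuccessful(false)
    SampleResult.setResponseCode("500")
    SampleResult.setResponseMessage(e.getMessage())
}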
See:
http://jmeter.apache.org/usermanual/component_reference.html#JSR223_Sampler
And of course, ensure you add the required dependencies in jmeter/lib.
I have written a CustomThriftSampler for JMeter to load test HBase through the Thrift service. You can get the details about it on my blog - http://1-st.blogspot.in/2013/12/load-testing-thrift-services-custom.html . I couldn't create generalized code; anyway, it's simple and straightforward Java code and anyone could try it. If time permits, I shall write a generalized version and commit it to GitHub!
