I would like to use ExtentReports in JMeter for functional testing. Please suggest a sample script, the language and the library to do this. I explored the page but was unable to find the library and the steps to implement it.
I followed the "Using extentreports for jmeter test results" article.
However, I am getting the error message:
Typed variable declaration : Class: ExtentReports not found in namespace
I am using extentreports-3.1.2.jar and have kept it inside the JMeter lib folder. I have also imported the classes in the script.
Have you tried looking at the ExtentReports documentation? It seems to be comprehensive enough.
I don't know what you mean by "use Extent Report in Jmeter for functional testing", but given you're asking for sampler code, here is a minimal snippet suitable for use in any JSR223 Test Element, assuming Groovy as the language:
import com.aventstack.extentreports.ExtentReports
import com.aventstack.extentreports.Status
import com.aventstack.extentreports.reporter.ExtentSparkReporter
import com.aventstack.extentreports.reporter.configuration.ViewStyle
def extentReports = new ExtentReports()
def reporter = new ExtentSparkReporter("report", ViewStyle.SPA)
extentReports.attachReporter(reporter)
def firstTest = extentReports.createTest("SomeTest")
firstTest.log(Status.PASS, 'Everything ok')
extentReports.flush()
You will need to:
Have extentreports-4.1.7.jar in the JMeter Classpath (as well as all its dependencies if you will be using MongoDB as the reporting backend)
Restart JMeter to pick the .jar up
Related
I am using WebDriver Sampler to test the web application's client-side performance in JMeter.
NOTE: Javascript is used as Groovy was not working. I need to switch to Groovy.
Everything works fine, and page load time is displayed in listeners.
A few common functions are used across the WebDriver Samplers.
Is there a way to define the JavaScript functions globally and use them within each WebDriver Sampler?
var pkg = JavaImporter(org.openqa.selenium)
var time = JavaImporter(java.time)
var support_ui = JavaImporter(org.openqa.selenium.support.ui);
var wait = new support_ui.WebDriverWait(WDS.browser, time.Duration.ofSeconds(10));
WDS.sampleResult.sampleStart();
WDS.log.info("Start navigating to new bus incident creation page");
waitAndClick(pkg.By.cssSelector("#New"));
waitAndClick(pkg.By.cssSelector("#Bus_3"));
waitAndClick(pkg.By.cssSelector("#Bus_3_0"));
waitUntilLoadingIsCompleted();
WDS.log.info("Accessed the new bus incident creation page");
WDS.sampleResult.sampleEnd();
function waitUntilLoadingIsCompleted() {
wait.until(support_ui.ExpectedConditions.presenceOfElementLocated(pkg.By.cssSelector(".k-loading-text")));
wait.until(support_ui.ExpectedConditions.invisibilityOfElementLocated(pkg.By.cssSelector(".k-loading-text")));
}
function waitAndClick(element){
wait.until(support_ui.ExpectedConditions.elementToBeClickable(element));
WDS.browser.findElement(element).click()
}
I can only think of storing your functions in a separate .js file and reading them using the load() function. This way you will be able to refer to the "common" functions without having to copy and paste them again and again.
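As a rough sketch (not tested against your pages), assuming the pkg/support_ui/wait declarations and the waitAndClick()/waitUntilLoadingIsCompleted() functions above are moved into a hypothetical C:/jmeter/scripts/common.js file, each WebDriver Sampler would then shrink to something like:
load('C:/jmeter/scripts/common.js');    // hypothetical location of the shared helpers

WDS.sampleResult.sampleStart();
WDS.log.info("Start navigating to new bus incident creation page");
waitAndClick(pkg.By.cssSelector("#New"));        // pkg, wait and the helper functions come from common.js
waitUntilLoadingIsCompleted();                   // defined in common.js
WDS.sampleResult.sampleEnd();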
Going forward I would like to remind you that the Nashorn scripting engine has been removed in OpenJDK 15, so you won't be able to use the javascript engine with newer JVMs; it is therefore worth migrating to Groovy as soon as possible. Moreover, Groovy has been the recommended scripting option since JMeter 3.1 and has much better performance compared to the other engines.
This is really the wrong model to test client performance. Use the built-in tools of your browser, along with the JavaScript profiling tools. If you want larger sample sets, then include a RUM agent in your code and collect all of the W3C Navigation Timing metrics in your choice of tool (Splunk, Datadog, Dynatrace, Elastic Stack, ...) for statistical analysis.
We are using JMeter to execute load tests and our APIs expect an encrypted value which we have generated using the crypto library of Node.js.
Use the Rhino/Nashorn load() function like:
load('/path/to/crypto.js')
For example I have the following file hello.js
function hello(name) {
return 'Hello, ' + name;
}
and it can then be called in the JSR223 Sampler (with the language set to javascript) along these lines:
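load('/path/to/hello.js');           // placeholder path, pointing at the file above
var greeting = hello('JMeter');      // hello() is now available to the sampler script
log.info(greeting);                  // writes "Hello, JMeter" to jmeter.log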
In the same way you can load any JS library of your choice.
Just be aware that starting from JMeter 3.1 you're supposed to use JSR223 Test Elements and the Groovy language for scripting, as the Groovy engine provides the best performance.
I'm working on regression API testing automation using JMeter.
I'm searching for some basic organic solution to validate JSON schema using the built-in JMeter tools.
CI is built with TeamCity, so a basic solution will be faster.
Out of the box JMeter doesn't provide JSON Schema validation functionality; however, you can use a 3rd-party library like JSON Schema Validator to add it to JMeter:
Get the latest version of org.everit.json.schema.jar
Get the appropriate version of JSON in Java
Get the appropriate version of Handy URI Templates
Drop the 3 above .jar files into the "lib" folder of your JMeter installation (or any other place in the JMeter Classpath)
Add a JSR223 Assertion as a child of the Sampler which returns the JSON you need to check against the schema (or place it according to the JMeter Scoping Rules)
Put the following code into the "Script" area:
def schemaPath = '/path/to/your/schema.json'
def rawSchema = new org.json.JSONObject(new org.json.JSONTokener(org.apache.commons.io.FileUtils.readFileToString(new File(schemaPath), 'UTF-8')))
def schema = org.everit.json.schema.loader.SchemaLoader.load(rawSchema)
schema.validate(new org.json.JSONObject(prev.getResponseDataAsString()))
That's it: if schema validation fails, the affected Sampler(s) will be marked as failed.
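If you would rather see a readable failure message than a raw exception stacktrace, one possible variation (a sketch only, not part of the original recipe) is to catch the ValidationException yourself and report all violations via the AssertionResult; the schema path is still a placeholder:
def schemaPath = '/path/to/your/schema.json'
def rawSchema = new org.json.JSONObject(new org.json.JSONTokener(org.apache.commons.io.FileUtils.readFileToString(new File(schemaPath), 'UTF-8')))
def schema = org.everit.json.schema.loader.SchemaLoader.load(rawSchema)
try {
    schema.validate(new org.json.JSONObject(prev.getResponseDataAsString()))
} catch (org.everit.json.schema.ValidationException e) {
    // mark the parent sampler as failed and list every schema violation
    AssertionResult.setFailure(true)
    AssertionResult.setFailureMessage(e.getAllMessages().join(System.lineSeparator()))
}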
Initially all the imports were working fine, until I closed and re-opened the script, after which the color of a few import statements changed and they now give an error when running the script.
See above: some classes are showing in black and some in golden. The ones in black are giving the Beanshell exception.
It was working, but closing and re-opening the script has suddenly created this chaos.
Can someone explain this weird behaviour...?
As per the Beanshell User Manual:
Default Imports
By default, common Java core and extension packages are imported for you. They are, in the order in which they are imported:
javax.swing.event
javax.swing
java.awt.event
java.awt
java.net
java.util
java.io
java.lang
Two BeanShell package classes are also imported by default:
bsh.EvalError
bsh.Interpreter
So basically you don't need to import these "in black" packages.
It is also possible to use "super import" to load the entire classpath like:
import *;
In order to get to the bottom of your script failure, either add the debug() directive to the beginning of your script (this way you will get comprehensive debugging information in stdout) or put your code inside a try block like:
try {
    //your code here
}
catch (Exception ex) {
    log.error("Beanshell failure", ex);
}
This way you will get a "normal" stacktrace in the jmeter.log file.
See How to Use BeanShell: JMeter's Favorite Built-in Component article for more details.
Also be aware that since JMeter 3.1 it is recommended to use JSR223 Test Elements and the Groovy language for scripting, so I would suggest considering moving to Groovy: it is more Java-compliant, has nice SDK enhancements, and its performance is much higher.
As anyone who uses JMeter for API functional testing knows, the reporting is, eh, not that great. Has anyone used something like http://extentreports.com/ for displaying their test results? Any ideas on other ways to display test results better? We are trying to use a tool that was mainly focused on performance testing and performance test results, and that does not work as well when we are testing REST API calls and their results. For example, it would be nice to be able to capture data that is getting created during the test runs, but none of the reports that are built into JMeter do this. Any advice is appreciated.
Looking into the ExtentReports API, it should be possible to integrate it with JMeter; see the How to Write a plugin for JMeter guide to get started.
You can generate an HTML Reporting Dashboard at the end of your test run; it is not that bad.
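For example, assuming a non-GUI run, passing the -e and -o options produces the dashboard right after the test finishes (paths are placeholders):
jmeter -n -t /path/to/testplan.jmx -l /path/to/result.jtl -e -o /path/to/dashboard-folder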
You can use a Backend Listener to store JMeter test results somewhere and visualise them in the form most suitable for you. See the Real-time results JMeter User Manual chapter for more details.
And last but not least you can use 3rd-party analysis services like JAnalyzer or BM.Sense
I was able to use extentreports 2.41.1 and posted this code into a JSR223 PreProcessor. This seems to be the wrong element to put it in, because I need results for each endpoint to show up in the report, and this only shows that the script ran. Anyway, sharing in hopes it helps others:
import com.relevantcodes.extentreports.ExtentReports;
import com.relevantcodes.extentreports.ExtentTest;
import com.relevantcodes.extentreports.LogStatus;
import java.io.File;
import java.io.IOException;
import java.util.HashMap;
import java.util.Map;
import java.util.Properties;
ExtentReports report;
ExtentTest testLogger;
String resultsPath = "C:/portalQA/Automation/Results";
String configPath = "C:/jmeter/apache-jmeter-3.2/lib";
String reportPath;
reportPath = resultsPath+"/test2/testing_extRpts_${myTimeinMills}.html";
report = new ExtentReports(reportPath, true);
report.loadConfig( new File(configPath+"/extent-config.xml"));
testLogger = report.startTest("Entire Script");
testLogger.log(LogStatus.INFO, "This is the API test script for Login");
report.flush();
//Thread.sleep(2000);
//report.close();
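For what it's worth, one way to get an entry per endpoint (a rough sketch only, assuming a single-threaded functional test and an example report path) is to move the logging into a JSR223 Listener scoped to the whole Test Plan, so it runs once for every sampler that finishes:
import com.relevantcodes.extentreports.ExtentReports;
import com.relevantcodes.extentreports.ExtentTest;
import com.relevantcodes.extentreports.LogStatus;

// create the report once and share it between listener invocations via JMeter properties
def report = props.get("extentReport");
if (report == null) {
    report = new ExtentReports("C:/portalQA/Automation/Results/test2/endpoints.html", true);   // example path
    props.put("extentReport", report);
}

// 'prev' holds the SampleResult of the sampler which has just completed
def entry = report.startTest(prev.getSampleLabel());
entry.log(prev.isSuccessful() ? LogStatus.PASS : LogStatus.FAIL, "Response code: " + prev.getResponseCode());
report.endTest(entry);
report.flush();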