Passing -J parameters programmatically to JMeter

I'm running JMeter programmatically, as described in step 4 of this post.
The code looks as follows:
final StandardJMeterEngine jmeter = new StandardJMeterEngine();
JMeterUtils.setJMeterHome(getAbsolutePath("/jmeter"));
JMeterUtils.loadJMeterProperties(getAbsolutePath("/jmeter/bin/jmeter.properties"));
JMeterUtils.initLocale();
try {
    SaveService.loadProperties();
    final File jmeterConfig = new File(getAbsolutePath(pathToJmx));
    final HashTree testPlanTree = SaveService.loadTree(jmeterConfig);
    jmeter.configure(testPlanTree);
} catch (final IOException e) {
    throw new JMeterConfigurationException(e);
}
jmeter.run();
I want to provide values for the ${__P(parameter_name)} parameters I specified in the .jmx file; on the command line this can be done with the -J option.
How can I pass values for these parameters in the code above?

Given that you already use the JMeterUtils class, you should be able to call the JMeterUtils.setProperty() function like:
JMeterUtils.setProperty("parameter_name", "foo");
And then refer to the property in your script using the __P() function as ${__P(parameter_name,)}
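Putting that together with the code from the question, the call can go right after the JMeter properties are loaded and before the test plan is run. A minimal sketch, reusing the question's own helpers (getAbsolutePath, pathToJmx and JMeterConfigurationException come from the original snippet):
final StandardJMeterEngine jmeter = new StandardJMeterEngine();
JMeterUtils.setJMeterHome(getAbsolutePath("/jmeter"));
JMeterUtils.loadJMeterProperties(getAbsolutePath("/jmeter/bin/jmeter.properties"));
JMeterUtils.initLocale();
// equivalent of "-Jparameter_name=foo" on the command line
JMeterUtils.setProperty("parameter_name", "foo");
try {
    SaveService.loadProperties();
    final HashTree testPlanTree = SaveService.loadTree(new File(getAbsolutePath(pathToJmx)));
    jmeter.configure(testPlanTree);
} catch (final IOException e) {
    throw new JMeterConfigurationException(e);
}
jmeter.run();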
You can also add the next line:
parameter_name=foo
to the jmeter.properties file which you're loading with JMeterUtils.loadJMeterProperties function.
Don't forget to add ApacheJMeter_functions.jar to your project classpath, otherwise the __P() function will not be resolved.
More information: Apache JMeter Properties Customization Guide

Related

Set P function property for jmeter API in Java

I have a JMeter test that is already defined in the GUI.
I am automating running this JMeter test from Java, and I want to set ${__P(threads)} from within the Java code.
The relevant code is :
public List<String> runJmxTest(String jmxFile, String jtlFile) throws IOException {
    HashTree testPlanTree;
    List<String> resultSet = new ArrayList<>();
    // Initialize JMeter SaveService
    SaveService.loadProperties();
    JMeterVariables j = new JMeterVariables();
    j.put("threads", "10");
    // Load existing .jmx Test Plan
    File in = new File(jmeterHome.getPath() + "/bin/testPlans/" + jmxFile);
    try {
        testPlanTree = SaveService.loadTree(in);
    } catch (FileNotFoundException e) {
        resultSet.add("fail");
        resultSet.add(e.toString());
        return resultSet;
    }
    // set up custom result collector with summariser
    Summariser summer = new Summariser("caos-mbm summariser");
    collector = new myResultCollector(summer);
    if (jtlFile != null) {
        if (!jtlFile.contains(".jtl")) {
            String jtlTmp = jtlFile.concat(".jtl");
            collector.setFilename(jmeterHome.getPath() + "/bin/testPlans/Output/" + jtlTmp);
        } else {
            collector.setFilename(jmeterHome.getPath() + "/bin/testPlans/Output/" + jtlFile);
        }
    }
    testPlanTree.add(testPlanTree.getArray()[0], collector);
    // Run Test Plan
    jm.configure(testPlanTree);
    jm.run();
    resultSet.add("success");
    resultSet.add(Double.toString(collector.getErrorPercent()));
    return resultSet;
}
I have tried setting the property through props, adding it to the test plan tree, and adding JMeter properties to the JMeterContext, but I can't get it to pick up the variable.
Any advice would be appreciated. I have also looked through quite a few posts on here that seem similar, but the solutions didn't work for me or the implementation was off.
You're using the wrong class, remove these lines:
JMeterVariables j = new JMeterVariables();
j.put("threads", "10");
and add the following instead:
org.apache.jmeter.util.JMeterUtils.setProperty("threads", "10");
You need to do this after loading the Test Plan and before running the test.
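In the context of the question's runJmxTest() method that would look roughly like this (a sketch showing only the relevant lines; the surrounding fields jm, jmeterHome and collector are assumed to be initialized as in the original code):
// Load existing .jmx Test Plan
testPlanTree = SaveService.loadTree(in);
// equivalent of passing -Jthreads=10 on the command line;
// must run before jm.run() so that ${__P(threads)} resolves to 10
org.apache.jmeter.util.JMeterUtils.setProperty("threads", "10");
// ... result collector setup unchanged ...
jm.configure(testPlanTree);
jm.run();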
Also make sure to add ApacheJMeter_functions.jar to your project CLASSPATH
More information on running JMeter test using JMeter API: Five Ways To Launch a JMeter Test without Using the JMeter GUI

How to update Atlassian Confluence Wiki using JMeter and the REST API

I wanted a way to update a wiki status page and upload a file after a JMeter test was done running. This is something that you could conditionally kick off depending on the results of your Jenkins job.
I did this with these steps:
1. In a setUp Thread Group, I added a BeanShell Sampler to locate the most recent report file in my results folder:
import org.apache.commons.io.FileUtils;
import org.apache.commons.io.filefilter.WildcardFileFilter;
import org.apache.commons.io.comparator.LastModifiedFileComparator;

log.info("GET MOST RECENT RESULTS REPORT FOR THE APP TESTED");
String dir_path = props.get("test_results_path");
File theNewestFile = null;
try {
    File dir = new File(dir_path);
    FileFilter fileFilter = new WildcardFileFilter("Results_${testApp}*.*");
    File[] files = dir.listFiles(fileFilter);
    if (files.length > 0) {
        /** The newest file comes first **/
        Arrays.sort(files, LastModifiedFileComparator.LASTMODIFIED_REVERSE);
        theNewestFile = files[0];
        String fileName = files[0].getName().toString();
        log.info("fileName: " + fileName);
        print("fileName: " + fileName);
        props.put("varResultsReportFile", fileName);
    }
    return theNewestFile;
} catch (Throwable ex) {
    log.error("Failed in Beanshell", ex);
    throw ex;
}
2. Log in with a wiki/Confluence system account.
3. GET rest/api/content?title=${testApp}&spaceKey=${testSpaceKey}&expand=version,history
4. Use JSON Extractors to extract the page version number (results..version.number) and the page id (results..id).
5. Use a BeanShell PostProcessor to add 1 to the page version number and store that value in a variable. You will need this when you PUT your update into the wiki (see the sketch after these steps).
6. GET rest/api/content?title=${testApp}&spaceKey=${testSpaceKey}&expand=body.storage
7. Use a JSON Extractor to extract the page body value (results..body.storage.value).
8. Using a CSS/JQuery Extractor on the JMeter variable you created in step 7, extract all the table values. For example, CSS/JQuery Expression = td and Match No = 1 extracts the first column value.
9. PUT rest/api/content/${varPageId} and, in the JSON body, update the single table value that you need to change while restoring the values you extracted that don't need updating.
10. POST rest/api/content/${varResultsPageId}/child/attachment. On the Files Upload tab: File Path = ${__P(test_results_path)}${__P(varResultsReportFile)}, Parameter Name = file, MIME Type = text/csv.
11. Log out.
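For step 5, the BeanShell PostProcessor body can be as small as the sketch below. The variable names varPageVersion and varNewPageVersion are only examples; use whatever reference names your JSON Extractors actually store:
// read the page version extracted in step 4 and store version + 1 for the PUT in step 9
int currentVersion = Integer.parseInt(vars.get("varPageVersion"));
vars.put("varNewPageVersion", String.valueOf(currentVersion + 1));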

jmeter - how to make a groovy script easier to maintain for extentreports

Below is a script that helps me build an ExtentReports report for JMeter. It is a JSR223 PostProcessor element. It's working nicely; however, the problem is that I have it duplicated after every HTTP Request in the script. I have several scripts with hundreds of HTTP requests, and each would need essentially a copy of the same PostProcessor Groovy script. This is hard to maintain!
I have tried splitting common parts into an external Groovy script that I tried calling from the JSR223 PostProcessor. I also tried chunking up bits of the script and putting the values into a CSV so that I could just update the CSV values if anything changed.
I'm sure there's a cleaner/better way to do this, but I'm still learning, so I'm not sure of the best way to make this easier to maintain. Here's the JSR223 PostProcessor; the only bit that changes with each HTTP request is the "//test result" section:
import com.relevantcodes.extentreports.ExtentReports;
import com.relevantcodes.extentreports.ExtentTest;
import com.relevantcodes.extentreports.LogStatus;
import java.io.File;
import java.io.IOException;
import java.util.HashMap;
import java.util.Map;
import java.util.Properties;

//configure object for response data
def response = prev.getResponseDataAsString();
//configure extentreports objects
ExtentReports report;
ExtentTest testLogger;
//set location for file and report config
String resultsPath = "C:/portalQA/Automation/Results/";
String configPath = "C:/portalQA/Automation/JMeter/TestReportConfig/";
String reportPath = resultsPath + "Login_Results_${reportDate}_${currentTime}_${testenv}.html";
File file = new File(reportPath);
if (!file.exists()) {
    //if file does not exist, create it
    report = new ExtentReports(reportPath, true);
    report.loadConfig(new File(configPath + "extent-config.xml"));
} else {
    //else append to existing report
    report = new ExtentReports(reportPath, false);
    report.loadConfig(new File(configPath + "extent-config.xml"));
}
//test result
testLogger = report.startTest("Authenticate");
testLogger.assignCategory("Initialize Session");
if (response.contains("com.blah.portal.model.User")) {
    testLogger.log(LogStatus.PASS, "Logged in with: ${username}");
    testLogger.log(LogStatus.INFO, response);
} else {
    testLogger.log(LogStatus.FAIL, "Could not authenticate session");
    testLogger.log(LogStatus.INFO, response);
}
log.info("Authenticate");
print("Authenticate print");
report.endTest(testLogger);
report.flush();
I see two options:
I suggest using a JSR223 Listener instead. First of all, that way you will only have one listener in your script, which resolves your original problem, but it is also a better option for writing to a file in general: a listener has only one instance for all running threads, so you won't create a race condition when writing to the file.
If you would rather have a post-processor, you can put it at a higher level (not under any particular sampler), which will cause it to run after each request within the same scope or below.
For example, a configuration like
Thread Group
Post-processor
Sampler 1
...
Sampler N
will cause the Post-processor to run after each of Sampler 1...Sampler N.
In both cases you may need to check which sampler you are processing and skip those you don't want to add to your report (the easiest way is to come up with a naming convention for excluded samplers), as in the sketch below.
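For example, a shared post-processor could start with a guard like this (a sketch; the "NOREPORT_" prefix is only an example convention, not anything built into JMeter):
// skip samplers that should not be added to the report, based on a naming convention
if (sampler.getName().startsWith("NOREPORT_")) {
    return;
}
// ... reporting code for all remaining samplers goes here ...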
I also faced the same challenge. In my case I needed to check whether the JSON response from a REST service was correct. I solved it in the following way.
I've created a JSR223 PreProcessor under the script root. It contains my custom class to handle JSON parsing and asserts:
import groovy.json.JsonSlurper
import org.apache.jmeter.assertions.AssertionResult

class CustomAssert {
    def parseResponse(json) {
        def jsonSlurper = new JsonSlurper()
        return jsonSlurper.parseText(json)
    }

    def assertResult(assertionResult, expectedResult, actualResult) {
        if (!expectedResult.equals(actualResult)) {
            assertionResult.setFailure(true);
            assertionResult.setFailureMessage("Expected ${expectedResult} but was ${actualResult}");
        }
    }
}
vars.putObject('customAssert', new CustomAssert())
Note the last line:
vars.putObject('customAssert', new CustomAssert())
I put an instance of my CustomAssert to vars.
Then under my HTTP Requests I've added a JSR223 Assertion:
def a = vars.getObject('customAssert')
def response = a.parseResponse(prev.getResponseDataAsString())
a.assertResult(AssertionResult, 'DRY', response.sensorResultHolderUIs[0].result.toString())
a.assertResult(AssertionResult, 'DRY', response.sensorResultHolderUIs[1].result.toString())
a.assertResult(AssertionResult, 'DRY', response.sensorResultHolderUIs[2].result.toString())
It basically retrieves the instance of CustomAssert from vars and calls its methods. I can put as many JSR223 Assertions as I want. The only code that is copied is these two lines at the top:
def a = vars.getObject('customAssert')
def response = a.parseResponse(prev.getResponseDataAsString())
To sum up:
1. Take the common part of your code (the part that doesn't have to be copied).
2. Wrap it in a class.
3. Put the class in a JSR223 PreProcessor under the root and export an instance of it via vars.
4. Take the rest of your code and adjust it to use the class defined in 2.
5. Put that code in as many JSR223 Assertions as you want, remembering to retrieve the instance created in 3. from vars.
Thank you user1053510. Your advice led me to build my own JSR223 Listener that renders the report. Below is the code in my JSR223 Listener:
import com.aventstack.extentreports.*;
import com.aventstack.extentreports.reporter.*;
import com.aventstack.extentreports.markuputils.*;
ExtentHtmlReporter htmlReporter;
ExtentReports extent;
ExtentTest test;
// create the HtmlReporter
htmlReporter = new ExtentHtmlReporter("C:/AUTO_Results/Results_${testApp}_${reportDate}_${currentTime}_${testenv}.html");
//configure report
htmlReporter.config().setCreateOfflineReport(true);
htmlReporter.config().setChartVisibilityOnOpen(true);
htmlReporter.config().setDocumentTitle("${testApp} Results");
htmlReporter.config().setEncoding("utf-8");
htmlReporter.config().setReportName("${testApp} Results ${reportDate}_${currentTime}_${testenv}");
htmlReporter.setAppendExisting(true);
// create ExtentReports
extent = new ExtentReports();
// attach reporter to ExtentReports
extent.attachReporter(htmlReporter);
extent.setReportUsesManualConfiguration(true);
// Show Env section and set data on dashboard
extent.setSystemInfo("Tool","JMeter");
extent.setSystemInfo("Test Env","${testenv}");
extent.setSystemInfo("Test Date","${reportDate}");
extent.setSystemInfo("Test Time","${currentTime}");
//stringify test info
String threadName = sampler.getThreadName();
String samplerName = sampler.getName();
String requestData = props.get("propRequestData");
String respCode = props.get("propRespCode");
String respMessage = props.get("propRespMessage");
String responseData = props.get("propResponse");
// create test
test = extent.createTest(threadName+" - "+samplerName);
//test.assignCategory("API Testing");
// analyze sampler result
if (vars.get("JMeterThread.last_sample_ok") == "false") {
    log.error("FAILED: " + samplerName);
    print("FAILED: " + samplerName);
    test.fail(MarkupHelper.createLabel("FAILED: " + sampler.getName(), ExtentColor.RED));
} else if (vars.get("JMeterThread.last_sample_ok") == "true") {
    if (responseData.contains("#error")) {
        log.info("FAILED: " + sampler.getName());
        print("FAILED: " + sampler.getName());
        test.fail(MarkupHelper.createLabel("FAILED: " + sampler.getName(), ExtentColor.RED));
    } else if (responseData.contains("{")) {
        log.info("Passed: " + sampler.getName());
        print("Passed: " + sampler.getName());
        test.pass(MarkupHelper.createLabel("Passed: " + sampler.getName(), ExtentColor.GREEN));
    }
} else {
    log.error("Something is really wonky");
    print("Something is really wonky");
    test.fatal("Something is really wonky");
}
//info messages
test.info("RequestData: "+requestData);
test.info("Response Code and Message: "+respCode+" "+respMessage);
test.info("ResponseData: "+responseData);
//playing around
//markupify json into code blocks
//Markup m = MarkupHelper.createCodeBlock(requestData);
//test.info(MarkupHelper.createModal("Modal text"));
//Markup mCard = MarkupHelper.createCard(requestData, ExtentColor.CYAN);
// test.info("Request "+m);
// test.info(mCard);
// test.info("Response Data: "+MarkupHelper.createCodeBlock(props.get("propResponse")));
// test.info("ASSERTION MESSAGE: "+props.get("propAssertion"));
// end the reporting and save the file
extent.flush();
Then in each Thread Group I have a BeanShell Assertion with these lines:
//request data
String requestData = new String(prev.SamplerData);
//String requestData = new String(requestData);
props.put("propRequestData", requestData);
//response data
String respData = new String(prev.ResponseData);
//String respData = new String(prev.getResponseDataAsString());
props.put("propResponse", respData);
//response code
String respCode = new String(prev.ResponseCode);
props.put("propRespCode",respCode);
//response message
String respMessage = new String(prev.ResponseMessage);
props.put("propRespMessage",respMessage);

Getting an exception in a Beanshell Assertion for org/json/simple/JSONArray

Error: jmeter.util.BeanShellInterpreter: Error invoking bsh method: eval org/json/simple/JSONArray
I'm importing Java classes from a project jar file placed in the lib folder.
Here is the source code, written to pass JSON values into ArrayLists:
import jsonresponse.common.JsonResponseProcessor;
import assertions.AssertResponse;
import databaseresponse.common.DbResponseProcessors;
import org.json.simple.JSONArray;
String resJson = prev.getResponseDataAsString();
String res = resJson.get("queueId");
log.info("----->>>>"+resJson);
ArrayList list1 = new ArrayList();
list1.add("queueId");
list1.add("name");
list1.add("faxNumber");
list1.add("description");
list1.add("type");
ArrayList list2 = new ArrayList();
list2.add("userId");
ArrayList list3 = new ArrayList();
list3.add("agencyId");
ArrayList list4 = new ArrayList();
list4.add("usertypeId");
JsonResponseProcessor obj = new JsonResponseProcessor();
System.out.println("%%%%%%%%%%%%%%fgfggfffgf%%%%%%%%%%%%%%%");
ArrayList Jsoin = obj.getvalueofsubmapoflist(resJson,".result[0]",list1);
System.out.println("%%%%%%%%%%%%%%%%%%%%%%%%%%%%%" +Jsoin);
log.info(">>>"+Jsoin);
Make sure you have all referenced .jar files in the JMeter classpath (i.e. copy them to JMeter's "lib" folder). Don't forget to restart JMeter to pick the jars up.
If you still experience issues, try putting your code inside a try block like:
try {
    //your code here
} catch (Throwable ex) {
    log.error("Problem in Beanshell", ex);
    throw ex;
}
This way you'll have an informative stacktrace printed to the jmeter.log file.
One more way to get more information about your Beanshell script is putting the debug(); directive at the beginning of your code. It will trigger debugging output to stdout.
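For example, at the very top of the Beanshell Assertion script:
debug(); // print BeanShell debugging information to stdout
// ... the rest of the assertion script follows ...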
See How to Use BeanShell: JMeter's Favorite Built-in Component article for more information on using Java and JMeter APIs from Beanshell test elements in JMeter tests.

How to pass the output of a BSF Sampler as an input to Beanshell scripting

Hi, I need to pass the output of a BSF Sampler as input to a Beanshell PreProcessor. Below are my programs.
BSF Sampler:
function makeid() {
    var ts = new Date().getTime();
    var digits = 10e10;
    //var timestamp = ts.toString() + Math.floor(Math.random() * digits).toString();
    var timestamp = new Date().getTime().toString() + Math.floor(Math.random() * digits).toString();
    return timestamp;
}

function test() {
    var uniqueId = makeid();
    var numCopies = 20,
        status = 'done printing',
        timestampDonePrintingAttr = "timestamp done printing",
        Title = 'demo.jpg',
        Username = 'keshavka',
        date = '1429036296',
        printstatus = 'OK',
        time = '1429036296';
    var joblog = {};
    joblog[uniqueId] = {};
    joblog[uniqueId]['NumCopies'] = numCopies;
    joblog[uniqueId]['Status'] = status;
    joblog[uniqueId]['Title'] = Title;
    joblog[uniqueId]['Username'] = Username;
    joblog[uniqueId]['date'] = date;
    joblog[uniqueId]['print status'] = printstatus;
    joblog[uniqueId]['time'] = time;
    joblog[uniqueId]['timestampDonePrintingAttr'] = new Date().getTime();
    //console.log(i);
    var json = JSON.stringify(joblog);
    return json;
}
Beanshell PreProcessor:
FileWriter fstream = new FileWriter("C:\\apache-jmeter-2.13\\detail_log7.txt",false);
BufferedWriter out = new BufferedWriter(fstream);
out.write(${test});
out.close();
The BSF Sampler gives me JSON, and I need to create a file using Beanshell scripting and write the content to it.
Please help me through this.
Approaches may differ depending on where the BSF Sampler and the Beanshell PreProcessor live.
If the Beanshell PreProcessor is a child of the BSF Sampler it won't work, as PreProcessors are executed before samplers.
For other cases:
If you look at the top of the "Script" input you'll see some pre-defined variable names like ctx, vars, props, etc.
For instance, vars stands for the JMeterVariables class instance, so you can set a variable in the BSF Sampler and access it in the Beanshell PreProcessor as follows:
In BSF:
vars.put('json', json);
In Beanshell:
out.write(vars.get("json"));
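Putting it together, the Beanshell PreProcessor from the question could then be rewritten roughly as follows (a sketch assuming the BSF Sampler stored the JSON under the name json as shown above; the file path is taken from the original question):
import java.io.BufferedWriter;
import java.io.FileWriter;
// read the JSON stored by the BSF Sampler via vars.put('json', json)
String json = vars.get("json");
// write it to the file from the original question
FileWriter fstream = new FileWriter("C:\\apache-jmeter-2.13\\detail_log7.txt", false);
BufferedWriter out = new BufferedWriter(fstream);
out.write(json);
out.close();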
If the above approach doesn't help, update the question with a screenshot of your test plan.
NB:
You don't need to switch to Beanshell just for writing JSON to a file. See the Using Java From Scripts guide to learn how you can access Java classes from JavaScript and vice versa.
Pre-defined variables like ctx, vars, props, etc. are described in detail in the How to use BeanShell: JMeter's Favorite Built-in Component guide.
Both JavaScript and Beanshell have performance problems, so if this code is executed with more than, say, 10 threads, it is recommended to use JSR223 test elements and Groovy for scripting where required.
