How to log the current time in JMeter Webdriver sampler - jmeter

I want to log the current time to the JMeter log from the JMeter WebDriver Sampler.
I am using the code below in the WebDriver Sampler, which logs the time in milliseconds, but the problem is that my WebDriver Sampler has a waiting time of 2 minutes, and the getStartTime() and getEndTime() functions include that waiting time as well.
Is there any other way to get the current time from the WebDriver Sampler? I want to measure the time taken between two actions in the browser.
WDS.log.info(WDS.sampleResult.getStartTime())
WDS.log.info(WDS.sampleResult.getEndTime())

You can record the start and end time using the Date.prototype.getTime() function, like:
var before = new Date().getTime()
// here is the code which duration you would like to measure
var after = new Date().getTime()
WDS.log.info('Time taken = ' + (after - before) + ' ms')
See The WebDriver Sampler: Your Top 10 Questions Answered for more WebDriver sampler tips and tricks

Exclude the wait time by wrapping only the required code block with the start-time and end-time functions.
Ex:-
WDS.log.info(WDS.sampleResult.getStartTime())
// ...some code
WDS.log.info(WDS.sampleResult.getEndTime())
// wait for two minutes, e.g. wait(2)
WDS.log.info(WDS.sampleResult.getStartTime())
// ...some code
WDS.log.info(WDS.sampleResult.getEndTime())
In this way, the wait time is excluded from the calculation.
This is not the best solution, but it is the one I can think of.

Related

How to add timer in jmeter script, which we can start at first call, poll the status & stop once the first request is completed & add assertions

I am doing load testing on report generation, and the requirement is that the report should be generated within 10 mins.
It includes one HTTP POST request for report generation, and then a status-check call which keeps checking the status of the first request. Once the status of the first request changes to complete, the report generation is successful.
Basically I want to start the timer at the beginning of the first request, stop the timer once the status is complete, and add an assertion: if the time is less than 10 mins the test passes, else it fails.
I tried multiple approaches, like using a Transaction Controller and adding all the requests under it. But this doesn't give the sum, only the average response time of all the requests under it.
Also, I tried a Beanshell listener, extracting the response time of every request and adding them all...
props.put("responseTime", sampleResult.getTime());
log.info("responseTime ::: " + props.get("responseTime"));
log.info("time: " + sampleResult.getTime());
props.put("responseTime", (sampleResult.getTime() + (Long) props.get("responseTime")));
log.info("new responseTime ::: " + props.get("responseTime"));
However, I am not interested in adding up the response times of these requests; I just need to know the time elapsed from when the report is triggered until its status is complete.
All the JMeter timers add delays; I don't wish to add a delay, I need a timer instead.
Any help is highly appreciated.
Thank you
Since JMeter 3.1 it's recommended to use JSR223 Test Elements and the Groovy language for scripting, mainly for performance reasons, so I'll provide one possible solution in Groovy.
Add JSR223 PostProcessor as a child of the HTTP Request which kicks off the report generation and put the following code there:
vars.putObject('start', System.currentTimeMillis())
Add JSR223 Sampler after checking the status and put the following code there:
def now = System.currentTimeMillis()
def start = vars.getObject('start')
def elapsed = now - start
if (elapsed >= 600000) {
    SampleResult.setSuccessful(false)
    SampleResult.setResponseMessage('Report generation took: ' + (elapsed / 1000 / 60) + ' minutes instead of 10')
}
Example setup:
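The pass/fail logic above boils down to comparing the elapsed time against the 10-minute budget. A minimal standalone sketch of that check, in plain Java rather than JMeter's Groovy bindings (the method name `withinSla` is my own, not a JMeter API):

```java
public class SlaCheck {
    // Returns true when the elapsed time stays within the SLA budget,
    // mirroring the 'elapsed >= 600000' failure condition above
    public static boolean withinSla(long startMillis, long endMillis, long slaMillis) {
        return (endMillis - startMillis) < slaMillis;
    }

    public static void main(String[] args) {
        long tenMinutes = 600000L; // 10 minutes, as in the answer above
        System.out.println(withinSla(0L, 540000L, tenMinutes));  // report took 9 minutes
        System.out.println(withinSla(0L, 660000L, tenMinutes));  // report took 11 minutes
    }
}
```

In JMeter itself the two timestamps come from the JSR223 PostProcessor (`vars.putObject('start', ...)`) and the JSR223 Sampler shown above.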

JSR223 Timer strange(?) behavior

I'm using the JSR223 Timer (JMeter 5.4.1) with the Groovy language, trying to add delays/pauses to my threads.
I'm following the instructions by BlazeMeter (How to Easily Implement Pacing).
The strange(?) behavior is that the actual delay is double the required one.
The script is as follows:
Long pacing = 5000 - prev.getTime();
Integer iPacing = pacing != null ? pacing.intValue() : null;
log.info("Transaction Pacing: " + String.valueOf(iPacing));
vars.put("myDelay", String.valueOf(iPacing));
return iPacing;
I get the duration of the sampler action, then calculate "myDelay" as the difference from a base duration of 5,000 ms. myDelay is a variable I use in the Flow Control Action sampler.
Now the strange result:
The actual delay I get is TWICE the calculated one. In this example, the calculated delay is 5,000 ms, but the actual delay is 10,000 ms.
Now here is the really strange issue:
If I comment out the return iPacing, the delay is 5,000 ms as required (with a warning message in the log file).
See the output below.
Why does the Flow Control Action sampler add the myDelay and iPacing values?
In the first block, iPacing is returned; the overall pause is myDelay + iPacing.
In the second block, iPacing is commented out; the delay is myDelay only.
Your delay is twice as long simply because you're setting it twice.
This statement:
return iPacing;
will create a delay before each sampler in the JSR223 Timer's scope, so there is no need to use the Flow Control Action sampler: you're already creating the delay in the JSR223 Timer.
In general, pacing is not implemented in JMeter because there is an easier way of creating the load in terms of X requests per second: the Constant Throughput Timer and friends.
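To make the pacing arithmetic concrete: the timer computes the remaining delay as the target pacing minus the previous sampler's duration, and once it returns that value there is nothing left for a Flow Control Action sampler to add. A standalone sketch of that calculation in plain Java (the `remainingDelay` helper and the clamp to zero for samplers that overrun the target are my own additions, not part of the question's script):

```java
public class Pacing {
    // Remaining delay so that sampler time + delay == target pacing,
    // clamped to zero when the sampler was already slower than the target
    public static long remainingDelay(long targetPacingMillis, long samplerMillis) {
        return Math.max(0L, targetPacingMillis - samplerMillis);
    }

    public static void main(String[] args) {
        System.out.println(remainingDelay(5000L, 1200L)); // sampler took 1.2 s -> wait 3800 ms
        System.out.println(remainingDelay(5000L, 7000L)); // sampler overran -> wait 0 ms
    }
}
```

Returning this value from a JSR223 Timer is the whole job; feeding it into a Flow Control Action sampler as well is what doubled the pause in the question.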

How to calculate time for two different actions in JMeter Webdriver sampler

I want to log the time for two different actions in the JMeter webdriver sampler.
The issue i am facing is, it is logging the same time for both. Here is my code.
WDS.sampleResult.sampleStart()
WDS.log.info('Click on baseline icon and start time for device'+'${DeviceName}'+':-' +WDS.sampleResult.getStartTime())
WDS.browser.findElement(pkg.By.xpath("//a[@id='baseline-icon-${DeviceName}']")).click()
WDS.sampleResult.sampleEnd()
WDS.log.info('Click on baseline icon and end time for device'+'${DeviceName}'+':-' + WDS.sampleResult.getEndTime())
WDS.sampleResult.sampleStart()
WDS.log.info('Baseline commit start time for device'+'${DeviceName}'+':-' +WDS.sampleResult.getStartTime())
wait.until(pkg.ExpectedConditions.elementToBeClickable(pkg.By.id("commitToLib"))).click()
wait.until(pkg.ExpectedConditions.invisibilityOfElementLocated(pkg.By.xpath("//*[@id='device-name-${DeviceName}']/../../../../../../..//div[contains(text(),'Manage Library is in progress')]")))
WDS.sampleResult.sampleEnd()
WDS.log.info('Baseline commit end time for device'+'${DeviceName}'+':-' + WDS.sampleResult.getEndTime())
The time logged by getStartTime() is the same in both cases, and so is the one logged by getEndTime().
You cannot call WDS.sampleResult.sampleEnd() function twice in the same instance of the WebDriver Sampler.
I would recommend splitting your actions into 2 WebDriver Samplers, i.e. action 1 goes into WebDriver Sampler 1 and action 2 into WebDriver Sampler 2.
The WebDriver instance is shared across all WebDriver Samplers, so you can basically continue where you left off; from the WebDriver perspective there will be no difference.
If you want the cumulative time of action 1 and action 2 in the report, put the relevant WebDriver Samplers under a Transaction Controller.
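If you would rather keep both actions inside a single sampler, the manual-timestamp approach from the first answer applies: record the current time around each action and log the differences yourself instead of calling sampleStart()/sampleEnd() twice. A minimal sketch in plain Java (the Thread.sleep calls are stand-ins for the browser actions; `elapsed` is a hypothetical helper, not a JMeter API):

```java
public class TwoActionTimer {
    // Difference between two millisecond timestamps
    public static long elapsed(long startMillis, long endMillis) {
        return endMillis - startMillis;
    }

    public static void main(String[] args) throws InterruptedException {
        long t1 = System.currentTimeMillis();
        Thread.sleep(50);   // stand-in for action 1: click the baseline icon
        long t2 = System.currentTimeMillis();
        Thread.sleep(50);   // stand-in for action 2: wait for the commit to finish
        long t3 = System.currentTimeMillis();
        System.out.println("Action 1 took " + elapsed(t1, t2) + " ms");
        System.out.println("Action 2 took " + elapsed(t2, t3) + " ms");
    }
}
```

The downside is that these per-action timings only appear in the log, not as separate sampler results in the report, which is why two WebDriver Samplers remain the cleaner option.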

How to send the requests in batches using Jmeter

I have a scenario where I need to send requests in batches of a user-defined size (for example 1K, 5K, 10K, etc.) with a specified interval between batches.
Assume the interval between batches is 30 seconds and I have to send N requests per batch, for example 1K. If sending 1K requests finishes within 10 seconds, then no request should go out for the next 20 seconds. Once the interval is over, another batch of 1K should be sent.
Input: data flows from a CSV; the 2nd batch should ideally start from row 1001.
Options tried: Constant Throughput Timer. With this I'm restricting the speed of the requests, which I do not want to do.
Can someone help me with another option I can try?
Add JSR223 Samplers before and after your requests. Your test plan should look like this:
JSR223 Sampler 1
Your requests
JSR223 Sampler 2
Add this code to your first JSR223 Sampler:
interval = 30000 //Specify the desired interval here
startTime = System.currentTimeMillis()
vars.put("startTime", startTime.toString())
vars.put("interval", Long.toString(interval))
Add this code to your second JSR223 Sampler:
startTime = Long.parseLong(vars.get("startTime"))
interval = Long.parseLong(vars.get("interval"))
endTime = System.currentTimeMillis()
duration = endTime - startTime
if (duration < interval) {
sleepTime = interval - duration
log.info("Sleeping for ${sleepTime} ms")
Thread.sleep(sleepTime)
}
This will make your threads sleep until the interval is over (if they've already completed their work).
If you need more precision you can modify this solution to make all of your threads respect the same time interval.
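The sleep calculation in the second JSR223 Sampler reduces to "interval minus time already spent, but never negative". A standalone sketch of just that arithmetic in plain Java, using the batch and interval values from the question (the `sleepTime` helper name is my own):

```java
public class BatchInterval {
    // How long a thread should sleep so that each batch starts a full interval apart
    public static long sleepTime(long intervalMillis, long batchDurationMillis) {
        long remaining = intervalMillis - batchDurationMillis;
        return remaining > 0 ? remaining : 0L;
    }

    public static void main(String[] args) {
        // A 1K-request batch finished in 10 s with a 30 s interval -> sleep 20 s
        System.out.println(sleepTime(30000L, 10000L));
        // The batch overran the interval -> start the next batch immediately
        System.out.println(sleepTime(30000L, 35000L));
    }
}
```

The Groovy code above implements the same formula, with the start time carried between the two samplers in a JMeter variable.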
You may also use a Beanshell/JSR223 Timer (after all the samplers in a thread group) instead of the sampler or post-processor proposed above, as well as a pre-processor (before all the samplers in the thread group) to set the start-time variable instead of the first sampler.
In such a timer, you simply return the delay to be applied, e.g. return (interval - (endTime - startTime));

To manually calculate the total duration of a JMeter test plan from the log file

I want to manually calculate the duration of a JMeter test plan from the CSV log file. I was following the calculation of last timestamp minus first timestamp, and it looks correct if I am running 1 thread group. For more than 1 thread group the samplers will be repeating, and I think that should not be the right way to calculate the duration. I tried using a Transaction Controller, thinking that the corresponding timestamp would give me the duration of all contained samplers, but got confused when I saw multiple Transaction Controller entries in the log file for more than one thread group. I am a newcomer to performance testing and to JMeter. Any help will be appreciated.
JMeter provides a variable which holds the test start timestamp: ${TESTSTART.MS}
You can use a tearDown Thread Group, which is designed to run post-test actions. Under the tearDown Thread Group you can use a Beanshell Sampler to print the test duration to the jmeter.log file as follows:
long start = Long.parseLong(vars.get("TESTSTART.MS"));
long end = System.currentTimeMillis();
log.info("Test duration: " + (end - start) / 1000 + " seconds");
By the end of the test you should see something like:
2015/06/17 22:20:15 INFO - jmeter.util.BeanShellTestElement: Test duration: 300 seconds
See How to use BeanShell: JMeter's favorite built-in component guide for more Beanshell scripting tips and tricks.
If you only have the results file, another option is to open the .jtl results file with Excel, Google Sheets, or an equivalent, sort the timestamp column (usually the first one), and subtract the first-row value from the last-row value - this way you'll get the test duration in milliseconds.
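The spreadsheet calculation can also be scripted: read the timestamp column from the CSV results, then take the latest value minus the earliest. A minimal sketch in plain Java (it assumes a comma-separated file whose first column is the epoch-millis timestamp with a header row, which is JMeter's default CSV layout; the sample data and class name are my own):

```java
import java.io.BufferedReader;
import java.io.IOException;
import java.io.StringReader;

public class JtlDuration {
    // Test duration in ms: latest timestamp minus earliest timestamp.
    // Assumes the first CSV column is the epoch-millis timeStamp and
    // the first line is a header, as in JMeter's default CSV output.
    public static long duration(BufferedReader reader) throws IOException {
        long min = Long.MAX_VALUE, max = Long.MIN_VALUE;
        String line = reader.readLine(); // skip the header row
        while ((line = reader.readLine()) != null) {
            long ts = Long.parseLong(line.split(",")[0]);
            if (ts < min) min = ts;
            if (ts > max) max = ts;
        }
        return max - min;
    }

    public static void main(String[] args) throws IOException {
        String sample = "timeStamp,elapsed,label\n"
                + "1434572415000,120,HTTP Request\n"
                + "1434572715000,95,HTTP Request\n";
        long seconds = duration(new BufferedReader(new StringReader(sample))) / 1000;
        System.out.println("Test duration: " + seconds + " seconds");
    }
}
```

Note this only needs a sorted-or-unsorted scan of the timestamps, so repeated samplers from multiple thread groups don't matter: the span from the earliest to the latest sample is the test duration either way.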
