Automate Endurance Testing Using JMeter

I want to automate endurance testing using JMeter. I have a test plan consisting of 2 thread groups, each containing multiple requests.
I want to run the test plan first for 10 users, then for 50, then for 100, and so on, with a 20-minute pause between runs, in an automated manner, so that I do not have to sit and wait 20 minutes, type 50 users as the command-line argument, wait another 20 minutes, type 100 users, and so on.

What you are asking for is possible using the Ultimate Thread Group.
A sample configuration would look like the sketch below.
I have added steps for 10, 50, and 100 users, each for 20 minutes. If you need more runs, you can add more rows with customized settings.
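As a rough illustration (the numbers below are placeholders, not taken from the original screenshot), the Ultimate Thread Group schedule rows might look like this, with each step holding its load for 20 minutes (1200 seconds) and the Initial Delay staggering the steps; increase the delays if you also want a 20-minute gap between steps:
Row 1: Start Threads Count 10, Initial Delay 0 s, Startup Time 0 s, Hold Load For 1200 s, Shutdown Time 0 s
Row 2: Start Threads Count 50, Initial Delay 1200 s, Startup Time 0 s, Hold Load For 1200 s, Shutdown Time 0 s
Row 3: Start Threads Count 100, Initial Delay 2400 s, Startup Time 0 s, Hold Load For 1200 s, Shutdown Time 0 s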

I figured it out. I can have an Excel file with a column containing the thread counts, e.g. 10, 20, 30 and so on. Then, in a Java program (written in Eclipse/IntelliJ), I can read those values from the Excel file in a loop and assign each one to a String variable called "value". Inside the loop, I can then invoke the .jmx file as below:
Runtime rt = Runtime.getRuntime();
Process pr = rt.exec("C:\\apache-jmeter-2.13\\bin\\jmeter.bat -t \"C:\\jmeter scripts\\test.jmx\" -Jusers=" + value + " -n -l \"C:\\jmeter scripts\\nonGUI.csv\" -j \"C:\\jmeter scripts\\jmeterLogs.log\"");
pr.waitFor();         // wait for the JMeter run to finish
Thread.sleep(120000); // pause before the next run (adjust to your 20-minute gap)
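A fuller sketch of that loop, assuming for brevity that the thread counts are kept in a plain text file (one count per line) rather than Excel, and that the thread groups read the count via ${__P(users)}; the paths, the per-run results file name and the 20-minute pause are illustrative:

import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Paths;
import java.util.List;

public class EnduranceRunner {
    public static void main(String[] args) throws IOException, InterruptedException {
        // one thread count per line, e.g. 10, 50, 100
        List<String> counts = Files.readAllLines(Paths.get("C:\\jmeter scripts\\users.txt"));
        for (String value : counts) {
            ProcessBuilder pb = new ProcessBuilder(
                    "C:\\apache-jmeter-2.13\\bin\\jmeter.bat",
                    "-n",
                    "-t", "C:\\jmeter scripts\\test.jmx",
                    "-Jusers=" + value.trim(),
                    "-l", "C:\\jmeter scripts\\nonGUI_" + value.trim() + ".csv",
                    "-j", "C:\\jmeter scripts\\jmeterLogs.log");
            pb.inheritIO();                // show JMeter's console output
            pb.start().waitFor();          // let the current run finish
            Thread.sleep(20 * 60 * 1000L); // 20-minute pause between runs
        }
    }
}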
Hope this helps others who want to perform endurance testing using JMeter + Java.

Related

JMeter request and report validation

My requirement is to execute 10 requests per second for 5 minutes. The configuration I have used is:
Thread Group properties:
Number of threads: 600
Ramp-up period: 60
Loop count: 5
Add Sampler -> Flow Control Action
Select Logical Action on Thread as Pause
Duration (milliseconds): 60000
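In other words, the intent of this configuration is: 600 threads over a 60-second ramp-up means 10 new threads start per second; each thread sends its request and is then paused for 60,000 ms by the Flow Control Action, and with a loop count of 5 each thread repeats this 5 times, which should give roughly 10 requests per second for about 5 minutes.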
I used the command below to run the .jmx from the command line, generating the results file and the HTML report.
/jmeter.sh -n -t demo.jmx -l demo.xls -e -o ./report
I need to know whether the configuration I have set up is correct.
I have also looked at the Constant Throughput Timer and the Runtime Controller to stop the execution after 5 minutes.
I am not able to validate the end result.
You can see what actual throughput your test generated using the Transactions Per Second listener or the relevant chart in the HTML Reporting Dashboard.
With regards to configuring JMeter to send 10 requests per second for 5 minutes, I would rather recommend going for the Concurrency Thread Group and Throughput Shaping Timer combination.
Configurations:
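A rough sketch of what those configurations might look like (the values are assumed from the 10-requests-per-second-for-5-minutes requirement, not taken from the original screenshots):
Throughput Shaping Timer, one schedule row: Start RPS 10, End RPS 10, Duration 300 seconds
Concurrency Thread Group: Target Concurrency high enough to sustain 10 requests per second (for example 20-30, depending on response times), Hold Target Rate Time 5 minutes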

Is it possible to run threads at different time intervals? - JMeter

I have 8 threads in JMeter, which I am executing every 5 minutes using Task Scheduler.
Now I have included 2 threads which should run only 5 times per day (e.g. at 12 AM, 5 AM, 10 AM, ...).
At those moments the execution should be 8 + 2 threads; the rest of the time it should be only 8 threads.
Is it possible to configure such a use case in JMeter?
If you're going to use the same .jmx script and want to execute either 8 or 10 "threads" (whatever they are), you can go for:
If Controller - for conditional execution of certain test elements
__groovy() function - to check the current time; an example condition which triggers the extra elements at, say, 5 AM would be:
${__groovy(Calendar.getInstance().get(Calendar.HOUR_OF_DAY) == 5 && Calendar.getInstance().get(Calendar.MINUTE) == 0,)}
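A possible layout for the test plan, sketched in outline form (the element names are illustrative):
+ Thread Group
+ Request 1 ... Request 8 (the samplers that always run)
+ If Controller (condition: the __groovy() expression above)
| + Extra Request 9
| + Extra Request 10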

JMeter test works differently from CLI than GUI - why?

I'm creating a small test using JMeter. So far I have one Thread Group that executes an HTTP request, waits for 10 seconds, then executes another HTTP request and checks what was returned. If I start 100 such threads with a 1-second ramp-up period from the JMeter GUI, it works fine: I get the expected values and the whole test finishes in 22 seconds. However, when I start the very same .jmx file from the command line, the test runs for more than 120 seconds and some threads (at the last run, 36 out of the 100) don't get the expected value. This might indicate a bug in the system I test, but I don't understand why the test takes that long from the CLI and why I get errors from the CLI. What is the difference between running the test from the GUI and from the CLI? Does the CLI run the tests "more in parallel"? By the way, this is the command line I'm using:
/home/nar/apache-jmeter-3.3/bin/jmeter -n -t test_transactions.jmx -l test_transactions.out
I'm afraid I cannot share the test plan, but I can share the "outline":
+ Thread Group
+ CSV Data Set Config
+ HTTP Request
| + JSON Extractor
+ Constant timer
+ HTTP Request
| + JSON Extractor
| + Response Assertion
+ View Results Tree
+ Save Responses to a file
+ View Results in Table
+ Summary Report
The Constant timer waits for 10 seconds. The first HTTP Request sends in some data and initiates a computation, the second checks the result.
I think you should disable the following listeners in a non-GUI test:
View Results Tree
Save Responses to a file
View Results in Table
Summary Report
After disabling them you will still have results via -l test_transactions.out, which you can later view in GUI mode using the Browse button of your listener.
In non-GUI mode you can also generate a dashboard report, if you want, by adding -e -o /path/dashboardfolder
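For example (the dashboard output folder is illustrative; it must be empty or not yet exist):
/home/nar/apache-jmeter-3.3/bin/jmeter -n -t test_transactions.jmx -l test_transactions.out -e -o /home/nar/dashboard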
It actually does indicate a bug in the system under test. The reason is that you must run JMeter in non-GUI mode, as the GUI creates huge overhead in terms of resource consumption, especially when you're using listeners, and especially if one of them is a View Results Tree.
So my expectation is that in non-GUI mode you're basically creating a much higher load, which your application cannot handle. You can check this using, for example, the Active Threads Over Time and Transactions Per Second listeners.

How to get the responses of only the spike added in the soak test in JMeter?

My scenario is: I'm running 50 threads for 15 minutes and then 100 threads for 15 minutes. The total time of the test is 21 minutes.
The 50 threads start after 10 seconds and ramp up slowly; for 5 minutes the 50 threads run simultaneously, and then after 5 minutes the 100 threads start ramping up slowly and run for 15 minutes.
After the 100 threads finish, the 50 threads continue running.
The images below show the jp@gc Ultimate Thread Group configuration.
I only want the responses (mainly in graph format) drilled down to the period when the 100 users are present; I don't want an aggregate of the whole soak test. How can this be done? I have also tried loading the jtl.gz file on https://loadosophia.org, but it also gives the aggregate report, which I don't want.
I only want the specific report for the spike of 100 users for 15 minutes.
Please let me know.
Thanks in advance.
You can filter (grep) your results file to select only the time interval you want and use that filtered file to generate the report.
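If the results are in CSV format, a small program can do that filtering. A minimal sketch, assuming the default CSV JTL layout where the first column is the epoch timestamp in milliseconds; the file names and the time window are placeholders:

import java.io.IOException;
import java.io.PrintWriter;
import java.nio.file.Files;
import java.nio.file.Paths;
import java.util.stream.Stream;

public class JtlTimeFilter {
    public static void main(String[] args) throws IOException {
        long windowStart = 1_500_000_000_000L;          // placeholder: start of the 100-user window (epoch ms)
        long windowEnd = windowStart + 15 * 60 * 1000L; // 15 minutes later
        try (Stream<String> lines = Files.lines(Paths.get("results.jtl"));
             PrintWriter out = new PrintWriter("spike-only.jtl")) {
            lines.filter(line -> {
                if (line.startsWith("timeStamp")) return true;   // keep the header row
                long ts = Long.parseLong(line.split(",", 2)[0]); // first column is the timestamp
                return ts >= windowStart && ts <= windowEnd;
            }).forEach(out::println);
        }
    }
}

The filtered file can then be fed to the report generator instead of the full one.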
Another option is to use this method:
http://www.ubik-ingenierie.com/blog/automatically-generating-nice-graphs-at-end-of-your-load-test-with-apache-jmeter-and-jmeter-plugins/
With this plugin:
http://jmeter-plugins.org/wiki/GraphsGeneratorListener/
And use the fields:
Start Offset
End Offset
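For example, assuming the offsets are given in seconds and that the 100-user spike begins roughly 6 minutes into the run and lasts 15 minutes (adjust to your actual timings):
Start Offset: 360
End Offset: 1260 (360 + 15 * 60)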

Alternatives to using CSV in JMeter (for generating usernames)

I have a JMeter Test Plan with the following structure:
Test Plan
  ThreadGroup1
    CSV Data Config-001
      SimpleController
        LoginRequest
        Action-abc-Request
  ThreadGroup2
    CSV Data Config-002
      SimpleController
        LoginRequest
        Action-xyz-Request
I have two CSV files which contain lists of users like this:
CSV-001:
Username1
Username2
...
Username50
CSV-002:
Username51
Username52
...
Username100
In my scenario, I need to run a load test with, say, 100 users: 50 users log in from ThreadGroup1 and the other 50 log in from ThreadGroup2. Users from both thread groups log in simultaneously.
Currently, I have to go through the process of manually creating/editing these CSV files whenever I change the total number of users.
Please suggest any alternative, time-saving and performance-efficient approaches through which I can fulfil my scenario requirements (without using CSV files).
I would appreciate it if you could explain the alternative solution in some detail, as I am quite new to JMeter. Thanks.
Another idea is to use
Username${__threadNum}
for the first thread group and
Username${__BeanShell(ctx.getThreadNum()+Z+1)}
for the second, where Z equals the total number of threads in thread group 1. You also need to add 1 since ctx.getThreadNum() returns a thread number using a 0-based index, whereas the __threadNum function is 1-based.
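For example, with 50 threads in ThreadGroup1 (Z = 50), the first thread of ThreadGroup2 (ctx.getThreadNum() = 0) would use
Username${__BeanShell(ctx.getThreadNum()+50+1)}
which evaluates to Username51, matching the first entry of CSV-002.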
You can use a Counter in each thread group. The start value for the counter in the first thread group would be 1, and in the second 51. Be sure the 'Track counter independently for each user' check box is unchecked.
If you set the reference names to thread1Count and thread2Count respectively, you can use
Username${thread1Count}
for the first thread group and
Username${thread2Count}
for the second.
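As an illustration (assuming the standard Counter element fields), the two counters might be configured like this:
Counter in ThreadGroup1: Start 1, Increment 1, Reference Name thread1Count, 'Track counter independently for each user' unchecked
Counter in ThreadGroup2: Start 51, Increment 1, Reference Name thread2Count, 'Track counter independently for each user' unchecked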
