When increasing the number of user threads in my JMeter performance test, I found that some requests are sent before login, so those requests fail.
My test plan is below.
A CSV file contains 1000 users as {email, password} pairs and feeds the Login HTTP Request via a CSV Data Set Config. I extract {uid} from the login response and store it in the uid variable.
Next, a Random Controller holds multiple HTTP Requests, and every one of them needs {uid}.
This works with up to about 50 user threads, but when I increase the thread count, some of the Random Controller's HTTP Requests are sent without the uid.
See, this request is sent with the uid:
But this one is sent without the uid:
Can anyone help me with how to fix this?
I can see 2 possible reasons:
Your application gets overloaded, therefore your Login Request doesn't return a correct uid and your extractor fails
Your JMeter instance gets overloaded, so JMeter cannot handle 50+ users because you are running it in a non-optimal way
I would start with point 1 by temporarily enabling the storing of request and response data in the .jtl results file, by adding the following lines to the user.properties file:
jmeter.save.saveservice.response_data=true
jmeter.save.saveservice.samplerData=true
jmeter.save.saveservice.requestHeaders=true
jmeter.save.saveservice.url=true
jmeter.save.saveservice.responseHeaders=true
This way you will be able to see the response data for the failed requests using a View Results Tree listener and identify where exactly the test fails.
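If writing the full response data for every sampler turns out to be too heavy at higher thread counts, as far as I remember there is also a property (commented out by default in jmeter.properties) that keeps response data only for failed samplers:
jmeter.save.saveservice.response_data.on_error=true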
Other things worth checking/implementing:
check your application logs
set up monitoring of baseline OS health metrics (CPU, RAM, Disk/Network IO) on both the JMeter and the application under test sides; you can use the JMeter PerfMon Plugin for this
Make sure you're following JMeter Best Practices, i.e.
Run your test in command-line non-GUI mode
Increase JVM heap size
Switch to JSR223 Test elements instead of Beanshell
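For instance, a Beanshell PostProcessor that pulls the uid out of the login response could be replaced with a JSR223 PostProcessor using Groovy. A minimal sketch, assuming the login response is JSON with a top-level uid field (adjust the key to whatever your response actually contains):
import groovy.json.JsonSlurper

// prev is the SampleResult of the parent sampler (the login request here)
def json = new JsonSlurper().parseText(prev.getResponseDataAsString())
if (json?.uid) {
    vars.put('uid', json.uid as String)   // exposes ${uid} to the following samplers
} else {
    log.warn('uid not found in login response for thread ' + ctx.getThreadNum())
}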
Related
I am using JMeter to load test a website. I have tested from 1 to 400 users; however, while testing with 500 and more users/threads, I am getting 401/Unauthorized errors for a few users. I hope you can help me find a solution to this problem.
I can think of 3 possible reasons:
There is a parameterization problem with your test, i.e. it sends wrong credentials. You can use, for example, a Simple Data Writer configured to save request and response data for failing requests and inspect them using a View Results Tree listener (see the listener sketch after this list).
JMeter gets overloaded and cannot properly perform the correlation. In addition to point 1, check whether you're following JMeter Best Practices and ensure that JMeter has enough headroom to operate in terms of CPU, RAM, etc., using, for example, the JMeter PerfMon Plugin.
Your application gets overloaded and cannot handle 500+ users; check your application logs, resource usage, the output of APM tools, etc.
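Regarding the first point: a quick way to see which credentials were actually sent when a 401 comes back is to temporarily add a JSR223 Listener with a Groovy script along these lines (a sketch only; the email variable name is an assumption, rename it to match your CSV Data Set Config):
// log the offending user whenever a sampler returns 401
if (prev.getResponseCode() == '401') {
    log.warn('401 for user=' + vars.get('email') + ' on sampler "' + prev.getSampleLabel() + '" (thread ' + ctx.getThreadNum() + ')')
}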
I executed a load testing script in JMeter for 50 users, but it is not adding data to the database for all 50 users, only for 20. What is the issue?
Open your .jtl results file with a listener like Aggregate Report and check the number of failures; if it's around 60% (i.e. 30 of the 50 requests failing), then you need to investigate the cause of the failures and see what needs to be done to fix them, either from the JMeter or from the application under test perspective.
If there are no failures in the .jtl file but your test is not doing what it is supposed to be doing - most probably it "silently" fails somewhere somehow. I would recommend adding Response Assertions to each and every step to validate that each response contains the anticipated data (or alternatively doesn't contain errors).
JMeter automatically treats HTTP status codes below 400 as successful and doesn't do any other checks, so when an error occurs but the application responds with a positive HTTP status code, JMeter will still consider the request passed; you need to specify your own pass/fail criteria.
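A Response Assertion can be configured entirely in the GUI; if you prefer a scripted check, an equivalent JSR223 Assertion in Groovy could look like the sketch below (the "status":"ok" marker is purely an assumption, replace it with whatever a successful response from your application actually contains):
// fail the sampler when the success marker is missing from the response body
def body = prev.getResponseDataAsString()
if (!body.contains('"status":"ok"')) {
    AssertionResult.setFailure(true)
    AssertionResult.setFailureMessage('Expected "status":"ok" but the response was: ' + body.take(200))
}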
And last but not least, you can temporarily amend the Results File Configuration so that JMeter stores all the request/response details; after the test run you will be able to examine the flows using a View Results Tree listener and identify where and how exactly your script fails.
I am working on migrating scripts from Performance Center to JMeter 5.2.1.
As part of this migration, we are using the same functional flow we had in Performance Center.
My scenario consists of users logging in to the web application, performing 10-15 iterations, and then logging out.
This is my test plan:
TestPlan
--ThreadGroup1
----Once Only Controller (login of users)
----Loop Controller (10 iterations)
------HTTP1
------HTTP2
------HTTP3
------...
----Once Only Controller (logout of users)
--CSV Data Set Config (username/password)
--CSV Data Set Config (unique data for the Loop Controller)
With this approach I am noticing that the time taken to complete the test in JMeter is much longer than what we have in Performance Center (I took care of think times and added similar values).
Why is my test run slower in JMeter?
Is the Loop Controller sequential, meaning that at a given time it can run only one request?
If not the Loop Controller, what other options do we have to satisfy my scenario?
If I use different thread groups, the JSESSIONID needs to be carried across thread groups, which is not a best practice.
Update:
Comparison between Performance Center and JMeter settings (screenshots of the Thread Group, the Test Plan, the Performance Center results and the entire Test Plan omitted).
Below are the settings in JMeter:
HTTP Cookie Manager in the Thread Group
CSV Data Set Configs at the Test Plan level
Once Only Controllers for login and logout
Loop Controller for the iterations
HTTP Request Defaults (even without "Retrieve All Embedded Resources" and parallel downloads checked, it takes more than an hour for 3 users)
Every sampler has an HTTP Header Manager
Given you send the same requests you should have the same response times, no matter which tool is being used under the hood.
It's hard to say what the differences are without seeing the full scripts from both tools, so the generic advice is to use a third-party sniffer tool like Wireshark or Fiddler to identify the differences and configure JMeter to behave exactly like the "performance center" (whatever it is).
For example, I fail to see an HTTP Cache Manager; without it JMeter will download embedded resources (images, scripts, styles, sounds, fonts) for each and every HTTP request, while a real browser does it only once.
I also don't see an HTTP Header Manager, which might be very important; for example, if you send an Accept-Encoding header the server will know that the client can handle gzipped responses, which greatly reduces network traffic.
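As an illustration, the HTTP Header Manager could carry headers similar to what a desktop browser sends; the values below come from a typical Chrome session and are only an example, copy whatever your own browser actually sends (visible in the developer tools Network tab):
User-Agent: Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/120.0.0.0 Safari/537.36
Accept: text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8
Accept-Encoding: gzip, deflate, br
Accept-Language: en-US,en;q=0.9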
More information: How to make JMeter behave more like a real browser
How can I execute those recorded JMeter HTTP Samplers in parallel, the way embedded HTTP resources are downloaded in parallel by the HTTP Sampler?
While recording I also checked "Retrieve All Embedded Resources" with a pool of 6. Because of this I am getting incorrect response times (they differ from the browser's page timings).
Is there any way that we could execute our recorded HTTP Samplers in parallel?
As of JMeter version 3.2 it is not possible to kick off extra threads to run a specific sampler (or group of samplers) in parallel; each thread (virtual user) executes its samplers sequentially, from top to bottom.
So you should be very careful with what you are recording. For example, you must not record any embedded resource calls; there is a "URL Patterns to Exclude" input on the "Requests Filtering" tab of the HTTP(S) Test Script Recorder where you can define which resources should be excluded from the recording.
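For instance, a single pattern like the one below in "URL Patterns to Exclude" keeps images, styles, scripts and fonts out of the recording (an example only, extend the extension list to match your application's static content):
.*\.(css|js|png|jpe?g|gif|ico|svg|woff2?|ttf)(\?.*)?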
Remember that well-behaved JMeter test should be producing response time similar to real browser, but you need to configure JMeter to mimic real browser, to wit:
Add HTTP Header Manager to represent browser headers
Add HTTP Cookie Manager to represent browser cookies and deal with cookie-based authentication
Add HTTP Cache Manager to act like browser's memory and disk caches
See How to make JMeter behave more like a real browser guide for more information on JMeter fine tuning.
When I put a load of 500 concurrent users on my server via JMeter, the server throws an error message, but when I send the same request via a browser at the same time I get a proper response. How is that possible? Is there any setting in JMeter to avoid this?
It is hard to say what can go wrong without seeing your JMeter configuration, the full server response, the JMeter and application under test logs, and a network dump for both the browser and JMeter.
The whole idea of performance testing is mimicking a real user as closely as possible, so you need at least to:
Add HTTP Request Defaults and set JMeter to:
Download embedded resources
Use concurrent pool of 2-5 threads
Add HTTP Cookie Manager
Add HTTP Cache Manager
Add HTTP Header Manager
Correlate any dynamic parameters (see the extractor sketch after this list)
Simulate any specific application behaviour (i.e. AJAX calls)
etc.
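For the correlation point, a hypothetical Regular Expression Extractor for a CSRF token embedded in the HTML could be configured like this (all field values are assumptions, adapt the expression to whatever dynamic value your application actually returns):
Name of created variable: csrfToken
Regular Expression: name="csrf_token" value="(.+?)"
Template: $1$
Match No.: 1
Default Value: CSRF_TOKEN_NOT_FOUND
The extracted value can then be referenced in the subsequent requests as ${csrfToken}.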
In addition to the above recommendations: ideally, with a properly configured JMeter you shouldn't be storing or inspecting individual response messages at all, only the number of errors in the final report, so double check that you:
Run JMeter in non-GUI mode (see the example command after this list)
Store only those metrics which are absolutely required
Follow other recommendations from 9 Easy Solutions for a JMeter Load Test “Out of Memory” Failure
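For reference, a non-GUI run that also generates the HTML dashboard could look like this (test_plan.jmx and the output locations are placeholders):
jmeter -n -t test_plan.jmx -l result.jtl -e -o html-report
The heap can be raised by editing the HEAP line in the jmeter (or jmeter.bat) startup script, for example HEAP="-Xms1g -Xmx4g".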
Besides what Dmitri described above, I would also check the actual throughput the server delivers in both cases.
Throughput depends a lot on the timers you have configured in JMeter to simulate think time.
JMeter has no rendering and no JavaScript engine, so each of its threads is much, much faster than a real browser.
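As a rough, purely hypothetical illustration: with 500 threads, about 1 second of server response time per request and 9 seconds of accumulated think time per iteration, each thread completes roughly one iteration every 10 seconds, so the offered load is around 500 / 10 = 50 iterations per second. Remove the timers and the same 500 threads would fire roughly 500 requests per second at the server, which is a very different test.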