When I put a load of 500 concurrent users on my server via JMeter, the server throws an error message, but at the same time the same request made via a browser returns a proper response. How is this possible? Is there any setting in JMeter to avoid this?
It is hard to say what can go wrong without seeing your JMeter configuration, the full server response, the JMeter and application-under-test logs, and a network dump for both the browser and JMeter.
The whole idea of performance testing is mimicking a real user as closely as possible, so you need at least to:
Add HTTP Request Defaults and set JMeter to:
Download embedded resources
Use a concurrent pool of 2-5 threads
Add HTTP Cookie Manager
Add HTTP Cache Manager
Add HTTP Header Manager
Correlate any dynamic parameters
Simulate any specific application behaviour (e.g. AJAX calls)
etc.
In addition to the above recommendations: ideally, given a "good" JMeter configuration, you shouldn't be looking at individual response messages; you should only see the number of errors in the final report. So double check that you:
Run JMeter in non-GUI mode (a minimal command-line sketch follows this list)
Store only those metrics which are absolutely required
Follow other recommendations from 9 Easy Solutions for a JMeter Load Test “Out of Memory” Failure
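For illustration, a non-GUI run that trims what gets written to the results file could look like the sketch below; the test plan and results file names are placeholders, and the -J overrides simply switch off the bulkier per-sample data via JMeter's standard save-service properties.

```bash
# Non-GUI run writing a lightweight CSV results file.
# load_test.jmx and results.jtl are placeholder names.
jmeter -n -t load_test.jmx -l results.jtl \
  -Jjmeter.save.saveservice.output_format=csv \
  -Jjmeter.save.saveservice.response_data=false \
  -Jjmeter.save.saveservice.samplerData=false \
  -Jjmeter.save.saveservice.requestHeaders=false \
  -Jjmeter.save.saveservice.responseHeaders=false
```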
Besides what Dmitri described above, I would also check the actual throughput the server returns in either case.
Throughput depends a lot on the timers you configured in JMeter to simulate think time.
JMeter has no rendering and no JavaScript engine, so each thread is much faster than a real browser.
I am using JMeter to load test a website. I have tested from 1 to 400 users. However, while testing with more than 500 users/threads, I am getting a 401/Unauthorized error for a few users. I hope you'll help me find a solution to this problem.
I can think of 3 possible reasons:
There is a parameterization problem with your test, i.e. it sends the wrong credentials. You can use e.g. a Simple Data Writer configured to save request and response data for failing requests and inspect them using a View Results Tree listener (see the sketch after this list).
JMeter gets overloaded and cannot properly perform correlation. In addition to point 1, check whether you're following JMeter Best Practices and ensure that JMeter has enough headroom to operate in terms of CPU, RAM, etc., using e.g. the JMeter PerfMon Plugin
Your application gets overloaded and cannot handle 500+ users; check your application logs, resource usage, the output of APM tools, etc.
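As an illustration of point 1, one way to capture the failing requests without storing everything is to override the relevant save-service properties on the command line; the sketch below uses placeholder file names, and note that response data is only written when the XML result format is used.

```bash
# Illustrative non-GUI run that stores request/response details only for failed samplers,
# so the 401 responses can later be inspected in a View Results Tree listener.
# login_test.jmx and failures.jtl are placeholder names.
jmeter -n -t login_test.jmx -l failures.jtl \
  -Jjmeter.save.saveservice.output_format=xml \
  -Jjmeter.save.saveservice.response_data.on_error=true
```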
I am new to performance testing. I have been given the task of measuring a web application's performance. I need to find out which modules/calls are causing deadlock, timeout and memory issues.
Q1. How can I use JMeter to find out deadlock, memory and timeout issues? If I do the following steps, is it the right way to trace those issues?
Create a test plan in JMeter which contains multiple Thread Groups.
In each Thread Group, include multiple HTTP requests and 200 or more users plus an infinite loop.
Monitor the JMeter results and SQL Profiler for deadlocks.
Q2. Is JMeter the right tool for tracking those issues? Or should I use a browser-based performance testing tool such as LoadNinja or LoadView?
Thanks
Bonnie
Q1. JMeter per se doesn't provide any toolchain to detect deadlock and memory issues. The HTTP Request sampler (or, even better, HTTP Request Defaults) provides the possibility to set the timeouts; if the value is blank, it will default to the operating system timeout or the web server timeout, whichever comes first.
If you conduct some form of stress test, i.e. start with 1 virtual user and gradually increase the load, at some point you will see that the response time starts growing and the number of requests per second starts decreasing. That is the point of maximum system performance; after it, performance will degrade.
To monitor the memory of the application under test you can use the JMeter PerfMon Plugin; it will allow you to state whether a lack of RAM is the cause of the performance issue.
With regard to deadlocks, a deadlock should result in an HTTP Request sampler failure (or timeout). JMeter won't give you the underlying reason, but it will give you the timestamp, and you should be able to check what happened to your application/database at that moment.
Q2. A well-behaved JMeter test must produce the same network footprint as a real browser; if your test plan is good enough, the system under test shouldn't be able to distinguish whether it's being hit by JMeter or by a real user using a real browser. JMeter will not give you client-side performance metrics like page rendering time or JavaScript execution time, as:
JMeter is not a browser, it works at protocol level. As far as web-services and remote services are concerned, JMeter looks like a browser (or rather, multiple browsers); however JMeter does not perform all the actions supported by browsers. In particular, JMeter does not execute the Javascript found in HTML pages. Nor does it render the HTML pages as a browser does (it's possible to view the response as HTML etc., but the timings are not included in any samples, and only one sample in one thread is ever displayed at a time).
How can I execute those JMeter recorded scripts in parallel, as we do when creating an HTTP Sampler (embedded HTTP resources)?
While recording I also checked Retrieve All Embedded Resources with a pool of 6. Because of this I am getting an incorrect response time (it varies from the browser's timing for the page).
Is there any way that we could execute our recorded HTTP Samplers in parallel?
As of JMeter version 3.2 it is not possible to kick off extra threads to run a specific group of samplers in parallel; each thread (virtual user) executes its samplers sequentially, from top to bottom.
So you should be very careful with what you are recording. For example, you must not record any embedded resource calls; there is a "URL Patterns to Exclude" input on the "Requests Filtering" tab of the HTTP(S) Test Script Recorder where you can define which resources need to be excluded from the recording.
Remember that a well-behaved JMeter test should produce response times similar to a real browser's, but you need to configure JMeter to mimic a real browser, to wit:
Add HTTP Header Manager to represent browser headers
Add HTTP Cookie Manager to represent browser cookies and deal with cookie-based authentication
Add HTTP Cache Manager to act like browser's memory and disk caches
See the How to make JMeter behave more like a real browser guide for more information on JMeter fine-tuning.
There is a response time difference between the JMeter run results and the response time captured manually on the local system with a stopwatch against the web application.
Browse the web app from a local Windows system and use a stopwatch to measure the response time to load the page.
Run JMeter in non-GUI/GUI mode and observe the response time (listeners were used only for debugging; when running the script no listener was added).
I can see there is a difference between the two. Please suggest how to know whether JMeter has captured the correct response time.
Given that you properly configure JMeter, you should get the same or a similar response time for the same request. "Proper" configuration stands for:
You should configure JMeter to retrieve embedded resources and to use a parallel thread pool of ~5 threads for this
This option "tells" JMeter to fetch the images, styles and scripts referenced in the main HTML page and to do it in parallel, like real browsers do
Add HTTP Cache Manager to your Test Plan. Real browsers download embedded resources, but only once; on subsequent requests the resources are returned from the disk cache and no actual request is made. The HTTP Cache Manager enables cache simulation and cache-control header handling.
Add HTTP Cookie Manager to represent cookies/sessions and deal with cookie-based authentication
Add HTTP Header Manager to your test plan to represent browser headers. It might be important, as e.g. a missing Accept-Encoding header may disable server-side compression, so the client will receive much more data and it will take more time.
Assuming a "good" JMeter configuration you should see more or less the same behavior
If there are still differences, capture the requests sent by JMeter and by the browser using a sniffer tool like Wireshark and amend the JMeter configuration to eliminate the differences
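If a command-line capture is more convenient, a rough equivalent using tshark (Wireshark's terminal tool) might look like the sketch below; it assumes tshark is installed and app.example.com stands in for your application's host.

```bash
# Capture the traffic produced by JMeter and by the browser into separate files,
# then compare the two captures in Wireshark.
tshark -i any -f "host app.example.com" -w jmeter_run.pcap    # run while the JMeter test executes
tshark -i any -f "host app.example.com" -w browser_run.pcap   # run while browsing the page manually
```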
JMeter has three basic measurements it captures per request:
Elapsed Time (the overall timespan from the point when it starts sending the request to the last bit received)
Latency (which starts at the same point in time and ends when the server starts responding)
And Connect Time (which is included in latency and is basically the time for the handshakes with the server, including SSL/TLS negotiation)
So if you set a data writer among your listeners (e.g. Simple Data Writer, although Aggregate Report & Summary Report can do it as well), you'll see these metrics in your data file (while standard listeners/visualisers/aggregators stick to elapsed time only).
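If latency or connect time don't show up in your results file, they can be switched on through the standard save-service properties; the sketch below uses placeholder file names.

```bash
# Illustrative non-GUI run that makes sure latency and connect time are written
# alongside elapsed time in the results file.
# page_test.jmx and timings.jtl are placeholder names.
jmeter -n -t page_test.jmx -l timings.jtl \
  -Jjmeter.save.saveservice.latency=true \
  -Jjmeter.save.saveservice.connect_time=true
```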
But mind that these metrics don't include response rendering, and especially not any code to be executed by the browser.
JMeter just doesn't do it at all: obviously, it measures just the combined performance of Server + Network, skipping everything on the client side (except for bare necessities, like protocol negotiations).
That might explain the difference you've experienced.
The same goes for the difference between logged server processing times & the response times measured by JMeter: the server just doesn't count the time the network adds.
PS And you don't have to sit and click a stopwatch alongside your browser: modern ones have Dev Tools capable of showing you precise timings broken down by stage. E.g., just press Ctrl+Shift+I in Chrome, switch to the Network tab & behold the timings right there as you make your requests.
We are using Apache JMeter for the performance testing of a web application. Apparently the response time is too high in comparison with loading the page in a browser during the load: when we open the page during the load it opens in 2 seconds, however JMeter reports 70 seconds. I understand that the browser uses its memory cache and disk cache, but doesn't the JMeter HTTP Cache Manager do the same? How can I verify this; comparing response headers is one option. Any thoughts on this will be appreciated.
It may be a wrong configuration in the script. There shouldn't be much difference between web browser and JMeter response times (browser rendering time is ignored in JMeter; not a big factor, but it must be considered).
If you are using a single HTTP Sampler for a web page and retrieving all the resources on that page, then set the "Parallel downloads" option to '6' in the HTTP Sampler's advanced section. That way you are simulating the browser behaviour of downloading resources like .js, .css, images, etc. in parallel.
If you recorded the script using the Test Script Recorder, then there will be an HTTP Sampler for each resource request on that page, and these will be sent sequentially, hence the increase in response time. You might be in this case; as of now there is no feature/option to send the HTTP Samplers in parallel, so I suggest the approach of adding one sampler and using the parallel download of resources option in the HTTP Sampler's advanced section.
Also, caching is an important factor which affects the response time. Adding an HTTP Cache Manager can solve the issue in JMeter; it simulates the browser's caching behaviour. I don't think there will be a huge difference between the browser and JMeter in how caching is implemented, although it may not be perfect.
The reason could be a lack of resources on the JMeter machine or JMeter not being properly configured for producing high loads. JMeter's default configuration is good for load test development and debugging; you can run load tests with up to a certain number of virtual users, but if you need to conduct a really high load you need to make some configuration changes.
First of all, double check my guess using jvisualvm and your operating system's monitoring tools (there is the PerfMon JMeter Plugin which can be used for monitoring different metrics on both the application-under-test and JMeter load generator sides). If that is the case, take the following steps to get the most performance out of your JMeter installation:
Increase the JVM heap size and NewSize, and switch to the ConcurrentMarkSweep garbage collector (see the sketch after this list).
Run your test in non-GUI mode.
Disable all the listeners during the test.
Follow other recommendations from the 9 Easy Solutions for a JMeter Load Test “Out of Memory” Failure article.
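On Linux or macOS the jmeter startup script honours the HEAP and JVM_ARGS environment variables, so one way to apply these settings for a single run is sketched below; the memory values and file names are placeholders, and -XX:+UseConcMarkSweepGC only works on JVMs that still ship the CMS collector.

```bash
# Illustrative non-GUI run with a larger heap and the CMS garbage collector,
# without any listeners in the test plan.
# high_load.jmx and high_load_results.jtl are placeholder names.
HEAP="-Xms1g -Xmx4g" \
JVM_ARGS="-XX:+UseConcMarkSweepGC" \
  jmeter -n -t high_load.jmx -l high_load_results.jtl
```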