I am new to JMeter; I have only been using it for two weeks, and I am running into some issues with a test I have created.
The test is designed to hit a Lambda in AWS via an API call to generate a pre-signed URL, which is required for placing an object into an S3 bucket; for this to succeed, a signature is required.
Below is the JMeter test:
bzm - Concurrency Thread Group:
User Defined Variables
HTTP Header Manager
jp@gc - Throughput Shaping Timer
HTTP Request:
JSR223 PreProcessor (generates a random GUID for the object)
JSR223 PreProcessor (generates the required signature)
I am using the above to perform the following load test: start with a baseline of 1 request per second and, every 20 minutes, increase to 30 requests per second for two minutes, then return to 1 request per second; this cycle repeats over a 2-hour period.
This test runs across 10 Fargate tasks, so the total load hitting the Lambda should be 10 requests per second at the baseline and 300 requests per second during each burst.
My problem is that when I reach the third burst in the cycle, my test starts returning 403 errors; for these, JMeter reports a 'Signature expired ... is now earlier than ...' message.
I am unclear why my requests suddenly start failing with this error after running successfully for an hour. The only root cause I have found mentioned for it is clock skew; however, as the test runs successfully for an hour before this happens and everything is hosted in AWS, I don't believe this is a clock skew issue, and if it is, I am not sure how to resolve it.
Has anyone else run into similar problems?
As per the Authenticating Requests (AWS Signature Version 4) article:
Protect against reuse of the signed portions of the request – The signed portions (using AWS Signatures) of requests are valid within 15 minutes of the timestamp in the request. An unauthorized party who has access to a signed request can modify the unsigned portions of the request without affecting the request's validity in the 15 minute window. Because of this, we recommend that you maximize protection by signing request headers and body, making HTTPS requests to Amazon S3, and by using the s3:x-amz-content-sha256 condition key (see Amazon S3 Signature Version 4 Authentication Specific Policy Keys) in AWS policies to require users to sign S3 request bodies.
So you need to check the timestamp field of your request and compare it to the current time on the machine.
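One way to sanity-check this is to compare the X-Amz-Date value your PreProcessor signs with against the current clock. The sketch below assumes the standard SigV4 timestamp format and its documented 15-minute validity window; the class and method names are illustrative, not part of any AWS SDK:

```java
import java.time.Duration;
import java.time.Instant;
import java.time.LocalDateTime;
import java.time.ZoneOffset;
import java.time.format.DateTimeFormatter;

public class SigV4Clock {
    // SigV4 timestamps use the ISO 8601 "basic" format in UTC, e.g. 20240101T120000Z
    private static final DateTimeFormatter AMZ_DATE =
            DateTimeFormatter.ofPattern("yyyyMMdd'T'HHmmss'Z'");

    /** Returns true if the signed timestamp is older than the 15-minute SigV4 window. */
    public static boolean isExpired(String amzDate, Instant now) {
        Instant signedAt = LocalDateTime.parse(amzDate, AMZ_DATE).toInstant(ZoneOffset.UTC);
        return Duration.between(signedAt, now).compareTo(Duration.ofMinutes(15)) > 0;
    }

    public static void main(String[] args) {
        Instant now = Instant.parse("2024-01-01T12:20:00Z");
        System.out.println(isExpired("20240101T120000Z", now)); // 20 minutes old -> true
        System.out.println(isExpired("20240101T121000Z", now)); // 10 minutes old -> false
    }
}
```

If signatures generated early in the test are being cached or re-used by later requests, a check like this will show them falling outside the window.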
Also be aware that you can create a GUID using the __UUID() function, so there is no need to write custom code for that.
Make sure to use the Groovy language, tick the Cache compiled script if available box, and avoid inlining JMeter functions or variables into your script body.
You can see an example of generating an AWS signature in the How to Handle Dynamic AWS SigV4 in JMeter for API Testing article.
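For reference, the heart of a SigV4 signer is the HMAC-SHA256 signing-key derivation chain described in the AWS documentation. This is only the key-derivation step, sketched in Java with illustrative names; a full signer also builds the canonical request and string-to-sign:

```java
import javax.crypto.Mac;
import javax.crypto.spec.SecretKeySpec;
import java.nio.charset.StandardCharsets;

public class SigV4Key {
    static byte[] hmacSha256(byte[] key, String data) throws Exception {
        Mac mac = Mac.getInstance("HmacSHA256");
        mac.init(new SecretKeySpec(key, "HmacSHA256"));
        return mac.doFinal(data.getBytes(StandardCharsets.UTF_8));
    }

    /** Derives the SigV4 signing key: HMAC chain over date, region, service, "aws4_request". */
    static byte[] signingKey(String secretKey, String dateStamp, String region, String service)
            throws Exception {
        byte[] kDate = hmacSha256(("AWS4" + secretKey).getBytes(StandardCharsets.UTF_8), dateStamp);
        byte[] kRegion = hmacSha256(kDate, region);
        byte[] kService = hmacSha256(kRegion, service);
        return hmacSha256(kService, "aws4_request");
    }

    public static void main(String[] args) throws Exception {
        // Placeholder credentials; the key must be re-derived when the date changes
        byte[] key = signingKey("exampleSecretKey", "20240101", "eu-west-1", "s3");
        System.out.println(key.length); // HMAC-SHA256 output is always 32 bytes
    }
}
```

Note that the derivation includes the date, which is one more reason a signature generated at the start of a long test cannot simply be re-used later.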
I'm using the BlazeMeter plugin to create JMeter scripts. I've been able to create multiple scenarios and merge them into one JMeter test. All of the scenarios have an initial auth followed by a series of GETs and PUTs. The JMeter tests import successfully and I'm able to run them and get results, but when I try to run them on day two (the next day), the auth works fine but then I get a series of 401 Unauthorized errors on the GETs and PUTs. I'm trying to figure out which setting causes them to run on the same day I create them but not the day after. I've messed with the cookie, cache and auth manager settings to no avail (i.e. tried clearing/not clearing), but it doesn't work. Trying to understand why it would work on the day I created it but not the next. Any help would be appreciated.
Most probably your application uses some form of security token, like a Bearer Token, which has a limited lifetime (e.g. one hour or one day), so when you record your test scenario you're able to replay it successfully only for a limited amount of time.
Once the token expires you're no longer able to replay the scenario successfully because you're not authorized anymore.
The solution is to perform correlation: the process of extracting a dynamic value using a suitable JMeter Post-Processor and saving it into a JMeter Variable. Once done, replace the recorded hard-coded value with the dynamic variable from the Post-Processor. This time, when the virtual user logs in, they will get a "fresh" token, so you will be able to replay your test successfully.
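For illustration, this is essentially what a Regular Expression Extractor does under the hood; the response body and token field name below are made up for the example:

```java
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class TokenCorrelation {
    /** Mimics a Regular Expression Extractor: returns the first capture group, or null if no match. */
    static String extract(String body, String regex) {
        Matcher m = Pattern.compile(regex).matcher(body);
        return m.find() ? m.group(1) : null;
    }

    public static void main(String[] args) {
        // Hypothetical login response; in JMeter the extracted value would land in a
        // variable such as ${authToken} and replace the recorded hard-coded token.
        String body = "{\"access_token\":\"abc123\",\"expires_in\":3600}";
        System.out.println(extract(body, "\"access_token\":\"([^\"]+)\"")); // abc123
    }
}
```

In a real test plan you would use a JSON Extractor or Regular Expression Extractor element rather than scripting this by hand; the sketch just shows the extraction idea.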
You can try the BlazeMeter Proxy Recorder; it's capable of exporting recorded scripts in "SmartJMX" mode with automatic detection and correlation of dynamic parameters. See the How to Cut Your JMeter Scripting Time by 80% article for more details.
I am using JMeter to perform load testing on my dev cluster. I have used the HTTP Authorization Manager and passed the base URL, username, password, etc. My thread group has:
Number of threads: 100,
Ramp-up period: 1 and Loop Count: 1
For the first 50-60 calls the authorization works as expected, but then it starts failing with 403 errors. Can anyone please tell me how to fix this?
I am using JMeter 5.4.1.
Is there any response data? Perhaps you are getting rate limited. I also noticed you are reading in variables from a CSV. Are you sure the data in each row is correct? If you are providing auth credentials through the CSV, an incorrect row could lead to a 403.
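A quick way to rule out bad CSV rows before the test is to validate the file offline. Here is a sketch assuming a simple comma-separated username,password layout with no quoted fields; adjust the expected column count to your actual file:

```java
import java.util.ArrayList;
import java.util.List;

public class CsvCheck {
    /** Returns the 1-based indexes of rows that don't have exactly the expected column count. */
    static List<Integer> badRows(List<String> lines, int expectedColumns) {
        List<Integer> bad = new ArrayList<>();
        for (int i = 0; i < lines.size(); i++) {
            // split with limit -1 keeps trailing empty fields, so "user," counts as 2 columns
            if (lines.get(i).split(",", -1).length != expectedColumns) {
                bad.add(i + 1);
            }
        }
        return bad;
    }

    public static void main(String[] args) {
        List<String> rows = List.of("alice,secret1", "bob,secret2", "charlie"); // last row broken
        System.out.println(badRows(rows, 2)); // row 3 is missing the password column
    }
}
```

A single malformed row like this would produce exactly the pattern described: most logins succeed, then a 403 when the bad credentials come up.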
As a side note, JMeter 5.4.1 has a log4j vulnerability, you might want to upgrade it to the latest version.
If the problem occurs under load, I can think of 2 possible reasons:
Your application gets overloaded and cannot properly handle the requests. Check your application logs and resource usage (CPU, RAM, etc.)
JMeter gets overloaded and is not able to properly send the requests. You're violating several JMeter Best Practices, in particular:
you're running JMeter in GUI mode; the GUI is only for test development and debugging, and when it comes to execution you should be using command-line non-GUI mode (e.g. `jmeter -n -t test.jmx -l results.jtl`)
you're using Listeners, especially View Results Tree; they don't add any value and just consume resources
you're not using the latest version of JMeter; the current stable release is 5.4.3
I run the same API 4 times in the same JMeter script. On the first run the API takes a long time, and after that the same API takes less time:
User Create API - 2067 ms
User Create API 1 - 948 ms
User Create API 2 - 869 ms
User Create API 3 - 902 ms
User Create API 4 - 993 ms
Why does this kind of scenario happen in JMeter?
JMeter only sends requests, waits for responses, measures the time in between, and writes down the performance metrics and KPIs.
If the first request takes longer than the following ones, the reasons could be:
Your application under test uses lazy initialization pattern
Your application under test needs to warm up its caches
The first request takes longer due to the process of establishing the connection, and subsequent requests simply re-use the connection if you're sending a Keep-Alive header
Your API endpoint response is cached at the database or in-memory level
etc. The reasons could be numerous; you need to monitor everything you can on both the JMeter and the system-under-test sides to understand this.
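The lazy-initialization case is easy to picture: the first call pays a one-off cost and later calls hit a cached value. A toy sketch, with the expensive step simulated by a counter rather than real work:

```java
import java.util.function.Supplier;

public class LazyCache {
    private final Supplier<String> loader; // hypothetical expensive resource loader
    private String cached;

    LazyCache(Supplier<String> loader) {
        this.loader = loader;
    }

    /** First call runs the loader (slow); subsequent calls return the cached value (fast). */
    synchronized String get() {
        if (cached == null) {
            cached = loader.get(); // e.g. open pools, compile templates, warm caches
        }
        return cached;
    }

    public static void main(String[] args) {
        final int[] loads = {0};
        LazyCache cache = new LazyCache(() -> { loads[0]++; return "ready"; });
        cache.get();
        cache.get();
        System.out.println(loads[0]); // loader ran only once
    }
}
```

A server built this way shows exactly the response-time pattern above: one slow first sample, then consistently faster ones.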
JMeter initializes TCP connections and handles SSL handshakes for the first request. For subsequent requests it re-uses connections according to its configuration properties httpclient4.time_to_live and httpclient.reset_state_on_thread_group_iteration.
You can refer to JMeter's properties reference for more information.
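For example, these can be tuned in jmeter.properties (or overridden in user.properties). The values shown are the defaults documented for recent JMeter versions; check the properties reference for your version before changing them:

```properties
# Keep-alive time-to-live for pooled HTTP connections, in milliseconds
httpclient4.time_to_live=60000
# Reset connection state (closing connections) at the start of each thread group iteration
httpclient.reset_state_on_thread_group_iteration=true
```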
I recorded login and logoff requests using BlazeMeter. After the recording, nearly 10 requests had been created by BlazeMeter, some of which include .../signalr/.../connectionToken labels.
When I run the test, these labels return errors.
The test includes 10 users. The users have different usernames and passwords. The other labels (apart from these SignalR labels) return success.
So, I wonder now: can I disable these pages and not include them in the tests? Or
is there any solution for this issue?
if i can disable these pages and not include in the tests
Theoretically yes: you can ask around whether SignalR is in scope for performance testing and, if not, you could disable those requests. However, I believe a well-behaved JMeter test should act exactly like a real browser, so if the real browser sends these requests to .../signalr/.../connectionToken, JMeter should also send them.
or any solution for this issue
I don't think you can record and successfully replay anything in Web 2.0 times; the majority of web applications heavily use dynamic tokens for various reasons (saving client state, security, tracking, etc.). When you record a request you get a hard-coded value which can expire or be invalidated by other means. You need to identify all the dynamic values like this SignalR token and perform correlation: the process of extracting dynamic values from previous responses using suitable JMeter Post-Processors and re-using them in subsequent requests in the form of JMeter Variables or Functions.
If you record the same user scenario a second time and compare the two resulting test plans, all the request parameters which change will require correlation. There is also an alternative recording option capable of exporting recorded requests in "SmartJMX" mode with automatic detection and correlation of dynamic parameters; see How to Cut Your JMeter Scripting Time by 80% for more details.
I am working on migrating scripts from Performance Center to JMeter 5.2.1.
As part of this migration, we are using the same functional flow we had in Performance Center.
My scenario consists of users logging in to the web application, performing 10-15 iterations, and then logging out.
This is my Test Plan:
TestPlan
--ThreadGroup1
----Once Only Controller (login of users)
----Loop Controller (10 iterations)
------HTTP1
------HTTP2
------HTTP3
------...
----Once Only Controller (logout of users)
--CSV Data Set Config (username/password)
--CSV Data Set Config (unique data for the Loop Controller)
With this approach I am noticing that the time taken to complete the test in JMeter is much more than what we had in Performance Center (I took care of think times and added similar values).
Why does my test run slowly in JMeter?
Is the Loop Controller sequential, meaning that at a given time it can run only one request?
If not the Loop Controller, what other options do we have to satisfy my scenario?
If I use different thread groups, JSESSIONIDs need to be carried across thread groups, which is not a best practice.
Update:
Comparison between Performance Center and JMeter settings.
Below are the settings in JMeter:
HTTP Cookie Manager in the Thread Group
CSV data files at the Test Plan level
Once Only Controllers for login and logout
Loop Controller for the iterations
Every sampler has an HTTP Header Manager
HTTP Request Defaults (even without checking "Retrieve all embedded resources" and parallel downloads, it is taking more than an hour for 3 users)
Given you send the same requests, you should see the same response times, no matter which tool is used under the hood.
It's hard to say what the differences are without seeing the full scripts from both tools, so the generic advice is to use a third-party sniffer tool like Wireshark or Fiddler in order to identify the differences and configure JMeter to behave exactly like Performance Center.
For example, I fail to see an HTTP Cache Manager, and without it JMeter will download embedded resources (images, scripts, styles, sounds, fonts) for each and every HTTP request, while a real browser does it only once.
I also don't see an HTTP Header Manager, which might be very important; for example, if you send an Accept-Encoding header the server will be aware that the client can understand gzipped responses, which greatly reduces network traffic.
More information: How to make JMeter behave more like a real browser