Use case:
We have an application that can have multiple users logged in simultaneously on a single database, or users logged in across multiple databases; the relationship can be 1:M, M:M or M:1. The issue is that we have very rigorous business logic which authenticates these users before letting them log in. Each user has a token of its own and generates its own session accordingly. I cannot fake users, as the app under test will not let them log in.
I can put up a load test using some authentic users that are already present in a single database, generate load using HTTP threads / virtual users from different machines, and make the session count go up periodically.
How do I go about this specific condition: testing for 5x, i.e. 150k concurrent users and 250k sessions/min? I cannot have enough databases present to give me a window of 150k concurrent users. Please advise.
If the app allows concurrent logins that might be one option: keep X pairs of credentials in a CSV file and set up a CSV Data Set Config to read them. By default each thread reads the next line on its next iteration, so if the app doesn't kick out the previously logged-in user this might be a viable solution.
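Just to illustrate (the file and column names here are arbitrary), a credentials.csv like
    user1@example.com,password1
    user2@example.com,password2
with Variable Names set to username,password in the CSV Data Set Config lets the login request send ${username} and ${password} as parameters.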
Another option is to log in once and save the token/session into a JMeter property. JMeter properties are global and can be read by multiple virtual users; it can be done using e.g. the __setProperty() function.
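A minimal sketch, assuming the login response body is the raw token and using an arbitrary property name authToken: put a JSR223 PostProcessor (Groovy engine, plain Java syntax works) under the login sampler with
    String token = prev.getResponseDataAsString().trim(); // assumption: the response body is just the token
    props.put("authToken", token);                        // JMeter properties are visible to all threads
Other threads can then reference it as ${__P(authToken)} in a header or request parameter.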
The best solution would be generating as many test users in the system as you need, because each JMeter thread must represent a real user using a real browser as closely as possible.
Generating sessions is not an issue, but you cannot have data for 150k concurrent users. Plus there are conditions such as: will we target multiple databases or a single one, since we have provision for both in our target app? I need an answer from someone who has executed a use case like this one. I could set user IDs, passwords and other required information in a CSV file and later read the data from it for each user, but the question is how many users; I cannot create 150k users and then set up each one for each iteration. Is there a way around it?
Related
I have a login scenario to be tested with 10,000 users, which takes a phone number as input supplied from a CSV file. I'm able to perform the load test for 10,000 users and to show the report with total samples, throughput etc. after exporting it to a file. However, the issue is that the customer wants proof that we are giving input for 10,000 users and not using a loop, e.g. (500 threads * 2 loops). Kindly help with suggestions if we have any option to show that we are using 10,000 unique users.
There is a listener called Active Threads Over Time (it can be installed using the JMeter Plugins Manager) which shows how many users were active during each phase of the test.
The jmeter.log file contains information on how many threads were started for each Thread Group.
The .jtl results file contains the number of active threads, both overall and in the current Thread Group.
The HTML Reporting Dashboard contains the Threads Over Time chart.
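If you also need a standalone proof, one hedged sketch (assuming a CSV-format .jtl that still contains the standard threadName column, and using naive comma splitting, which is fine as long as your labels contain no commas) is to count distinct thread names offline:
    import java.nio.file.Files;
    import java.nio.file.Paths;
    import java.util.Arrays;
    import java.util.List;

    public class DistinctThreads {
        public static void main(String[] args) throws Exception {
            List<String> lines = Files.readAllLines(Paths.get("results.jtl"));
            String[] header = lines.get(0).split(",");
            int idx = Arrays.asList(header).indexOf("threadName"); // standard saved field
            long distinct = lines.stream()
                    .skip(1)
                    .map(l -> l.split(",")[idx])
                    .distinct()
                    .count();
            System.out.println("Distinct threads seen: " + distinct);
        }
    }
This only proves the number of distinct threads, not the phone numbers themselves; for that you would have to surface the CSV variable in the results, e.g. by including it in the sampler label.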
I'm currently new to Talend and I'm learning through videos and documentation, so I'm just not sure how to approach/implement this with best practices.
Goal
Integrate Magento and QuickBooks using Talend.
My thoughts
Initially my first thought was to set up a direct DB connection to Magento, take the relevant data I need, process it, and send it to QuickBooks using its REST APIs (specifically the bulk APIs, in batches).
But then I thought it would be a little hectic for me to query the Magento database (multiple joins), so another option is to use Magento's REST API.
As I'm not very familiar with the tool, I'm struggling a little to find the most suitable approach, so any help is appreciated.
What I've done so far
I've saved my auth (for QB) and DB (Magento) credentials in a file, and using tFileInputDelimited and tContextLoad I'm storing them in context variables so they are accessible globally.
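For reference, tContextLoad expects a two-column flow (variable name and value), so the file read by tFileInputDelimited can look like the following; the variable names and the semicolon delimiter are just placeholders that have to match your context and component settings:
    qb_client_id;your-client-id
    qb_client_secret;your-client-secret
    magento_db_host;localhost
    magento_db_user;magento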
I've successfully configured the database connection and DB input, but I've not used metadata for the connection (should I use it, and if yes, how can I pass dynamic values there?). I've used my context variables in the DB connection settings.
I've taken the relevant fields for now, but if I want more fields a simple query is not enough, as Magento stores data for customers etc. across multiple tables. It's not a big deal, I know, but I think it might increase my work.
For now that's what I've built, and my next step is to send the data to QB using REST, getting an access_token and saving it to a context variable, and then storing the QB reference back into the Magento DB.
Also, I've decided to use the QB bulk APIs, but I'm not sure how I can process data in chunks in Talend (I tried to check multiple resources but had no luck), i.e. if Magento returns 500 rows I want to process them in chunks of 30, as the QB batch max limit is 30. I will send them to QB using REST and, as I said, I also want to store the QB reference ID back in Magento (so I can update it later).
Also, all of this will be local; how can I then do the same in production? How can I maintain separate development and production environments?
Resources I'm referring to
For REST and Auth best practices - https://community.talend.com/t5/How-Tos-and-Best-Practices/Using-OAuth-2-0-with-Talend-to-Access-Goo...
Nice example for batch processing here:
https://community.talend.com/t5/Design-and-Development/Batch-processing-in-talend-job/td-p/51952
Redirect your input to a tFileOutputDelimited.
Enter the output filename, tick the "Split output in several files" option in the "Advanced settings" and enter 1000 in the "Rows in each output file" field. This will create n files based on the filename, with 1000 rows in each.
In the next subjob, use a tFileList to iterate over this file list and read the records from each file.
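If you would rather keep the batching in memory than split files (for example to honour the 30-row QuickBooks batch limit directly), a tJavaFlex or a small routine can buffer rows and flush them in groups; the sketch below only shows the buffering idea, and sendBatchToQuickBooks is a hypothetical placeholder for your REST call:
    import java.util.ArrayList;
    import java.util.List;

    public class BatchBuffer {
        private static final int BATCH_SIZE = 30;           // QuickBooks batch limit per the question
        private final List<String> buffer = new ArrayList<>();

        public void add(String row) {
            buffer.add(row);
            if (buffer.size() == BATCH_SIZE) {
                flush();
            }
        }

        public void flush() {                                // call once more after the last row
            if (!buffer.isEmpty()) {
                sendBatchToQuickBooks(new ArrayList<>(buffer));
                buffer.clear();
            }
        }

        private void sendBatchToQuickBooks(List<String> batch) {
            // placeholder: build and send the QuickBooks batch request here
            System.out.println("Sending batch of " + batch.size() + " rows");
        }
    }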
I am new to JMeter and need your help with a problem.
I have 4 test scenarios and I need to run them with a 30-user load distributed as 30, 10, 30 and 30 percent. Out of the 4 scenarios, 1 scenario creates a customer ID and that ID is used in the rest of the scenarios. To test this, I have created test data of customer IDs with my first scenario and saved it in a CSV file. Now my question is: when I run my test, how do I handle the customer IDs generated at run time, and how do I manage them together with the test data I have already created? Please help me.
With regards to reusing the data generated at runtime: you can extract the required data, i.e. the customer ID, using a suitable JMeter Post-Processor and store it in a JMeter Variable. Once done, the variable can be re-used in other scenarios. The process is known as correlation and there is a lot of information on its implementation, with examples, on the web.
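For instance, if the create-customer response contained something like "customerId":"12345" (the actual response format will differ, so treat this as an assumption), a Regular Expression Extractor under that request could be configured with Reference Name customerId, Regular Expression "customerId":"(\d+)" and Template $1$, after which later requests can simply use ${customerId}.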
With regards to the distribution, there are different approaches as well:
Throughput Controller
Switch Controller
Weighted Switch Controller
With regards to "manage test data you created" - you can read the values from a CSV file using CSV Data Set Config or __CSVRead() function
I have been using the JMeter plugin Ultimate Thread Group for concurrent requests.
But now I'm finding it difficult to use, because the scenario is:
Each request has a tracking number (the tracking numbers are already generated in the system when a form is submitted, so I have to use the generated tracking numbers from the DB), which is passed as a POST parameter in the HTTP request. These tracking numbers are unique, and I have configured a CSV config for passing them. Once a tracking number has been used it can't be used again (it would give me an error message). So can someone please suggest how to stress test this scenario, where I have to hit a particular URL (with a unique tracking number from the CSV file) for approximately 60/30 minutes (with a varying number of threads) until I find the crash point of the system?
1st way:
You can pass the tracking numbers via a CSV file as follows:
Allocate all the tracking numbers to specific users (this can be done with a database query; see the export sketch after these steps).
Copy those tracking numbers into a CSV file.
Pass the tracking numbers as a parameter via the CSV Data Set Config.
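A hedged sketch of that export step, where the connection URL, the table name tracking_numbers and the columns tracking_no / used are all hypothetical placeholders for your own schema:
    import java.io.PrintWriter;
    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.PreparedStatement;
    import java.sql.ResultSet;

    public class ExportTrackingNumbers {
        public static void main(String[] args) throws Exception {
            try (Connection con = DriverManager.getConnection(
                         "jdbc:mysql://localhost:3306/app", "user", "password");    // adjust to your DB
                 PreparedStatement ps = con.prepareStatement(
                         "SELECT tracking_no FROM tracking_numbers WHERE used = 0"); // hypothetical query
                 ResultSet rs = ps.executeQuery();
                 PrintWriter out = new PrintWriter("trackingnumbers.csv")) {
                while (rs.next()) {
                    out.println(rs.getString(1)); // one tracking number per line for the CSV Data Set Config
                }
            }
        }
    }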
2nd way:
Fill in the form; the generated tracking number can then be fetched via a regular expression (see the sketch below).
Set the allocation logic to a specific user each time (disable the other users).
Log in with this user and pass the fetched tracking number.
Hope this will be helpful to you.
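A minimal sketch of that extraction step: a Regular Expression Extractor works without code, but the same can be done in a JSR223 PostProcessor under the form-submit sampler (Groovy engine, plain Java syntax); the pattern below is only a guess at the response format:
    java.util.regex.Matcher m = java.util.regex.Pattern
            .compile("Tracking Number: (\\d+)")              // assumed response text
            .matcher(prev.getResponseDataAsString());
    if (m.find()) {
        vars.put("trackingNumber", m.group(1));              // use ${trackingNumber} in the next request
    }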
Out of 50 users, 10 users should perform one activity, 5 users should perform another, and so on. I distributed the users across the different activities.
The scenarios are as follows:
Activity / Users:
Full text search: 10 users
Keyword search: 3 users
Wildcard search: 3 users
Logical operator search: 4 users
Enable: 5 users
Disable: 5 users
View Config: 5 users
View Log history: 5 users
View Graph: 5 users
View Alerts: 5 users
Total: 50 users
Can anyone please help me with this?
Thanks in advance.
You could use a CSV Data Set Config. For each thread, one line from a CSV file is read and stored in JMeter variables. The threads can then branch depending on the values.
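For example, assuming the CSV column is named activity (an arbitrary name), an If Controller in front of each branch could use a condition such as ${__jexl3("${activity}" == "fulltext")}, so each thread only executes the branch matching the value it read from its CSV line.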
Take a look into the following Test Elements:
Throughput Controller - to distribute requests between keyword search, wildcard search, etc.
Constant Throughput Timer - to set an exact load of 50 requests per second (you'll need to put "3000" into the "Target throughput" field, as the value is given per minute: 50 * 60 = 3000).
I'd create a Thread Group for every type of activity required by the test. If you need custom input data, you can use CSV data, configured outside JMeter, for each test.
This way you have all the operations in the same script and can simply run it to execute them.