I need to run a load test on an application built on the ZK framework.
I recorded a script that performs the following actions:
a. User Login
b. Select Role
c. Open and Create Record
d. Log out.
When I run the script with multiple users, say 10, it creates 10 records in the application.
But after some random duration, say 4-5 hours, the same script no longer creates any records, even though all requests are shown as passed. The script also records a COMET request (AJAX push).
I am not able to figure out the reason.
Read these, which explain how the ids work:
http://books.zkoss.org/index.php?title=Small_Talks/2012/January/Execute_a_Loading_or_Performance_Test_on_ZK_using_JMeter
http://blog.zkoss.org/index.php/2013/08/06/zk-jmeter-plugin/
Related
I have to perform load and performance testing of my new site, which requires login functionality. I am using JMeter to test performance and load. Can you please tell me how I can create multiple users in the database using JMeter, so that I can use them to log in multiple users at a time?
Thanks.
JMeter is a load-testing tool, not a data-creation tool.
It should be used for load testing, not for functional aspects.
That said, it can be used for data creation thanks to its record-and-replay feature (with parametrization, of course).
To create the data:
Record the create-user scenario using JMeter.
Parametrize the add-user request (username, details) with a CSV Data Set Config element.
Add a CSV Data Set Config with the required number of entries (500 in your case), where each line represents one user's details, e.g.
user1,India,passwd1
user2,US,passwd2
Run the script with 500 threads (not advisable, but a possible way) or with a single thread and a loop count of 500.
This will create the users from the CSV data. After that you can load test your website.
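If you don't already have such a CSV file, generating one is trivial; here is a minimal Java sketch (the file name and column values are assumptions based on the example above):

```java
import java.io.IOException;
import java.io.PrintWriter;

// Minimal sketch: writes the 500-line CSV consumed by the CSV Data Set Config.
// File name, country value and password scheme are assumptions for illustration.
public class GenerateUsersCsv {
    public static void main(String[] args) throws IOException {
        try (PrintWriter out = new PrintWriter("users.csv")) {
            for (int i = 1; i <= 500; i++) {
                out.printf("user%d,India,passwd%d%n", i, i);
            }
        }
    }
}
```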
First of all, you need to set up the following in your JMeter test plan:
1 - Thread Group - here you set the number of parallel users (e.g. 450), the ramp-up period (e.g. 300 seconds) and the loop count (e.g. 5)
2 - The login page requests, with GET and POST methods
3 - A listener like Summary Report, Table or Tree, or another that suits you
For a detailed reference on the login process, see: How to do login using Jmeter
Please let me know if anything is unclear during your load testing.
If you need to create users directly in the database, the correct JMeter test element is the JDBC Request sampler, which allows you to execute arbitrary SQL queries against any database that supports the JDBC protocol.
Download the relevant JDBC driver for your database and drop it into the /lib folder of your JMeter installation.
Restart JMeter if it's running (jar loading isn't dynamic; it happens at startup).
Add a JDBC Connection Configuration element to your Test Plan or Thread Group and populate at least the following fields:
Variable Name
Database URL
JDBC Driver Class
Username and Password
Configure the JDBC Request to insert the email/password pairs. You can use it in conjunction with a CSV Data Set Config so that the user credentials are read from a CSV file.
See the Building a Database Test Plan guide, or select File -> Templates -> JDBC Load Test, to get an idea of how your Test Plan should look.
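For reference, the sampler ends up executing a plain parameterized INSERT; a standalone Java sketch of the equivalent work (the URL, credentials, table and column names are assumptions) could look like this:

```java
import java.nio.file.Files;
import java.nio.file.Paths;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.util.List;

// Minimal sketch: inserts email/password pairs read from a CSV file, i.e. the
// same work the JDBC Request sampler performs. Requires the JDBC driver on the
// classpath; URL, credentials, table and column names are assumptions.
public class CreateTestUsers {
    public static void main(String[] args) throws Exception {
        List<String> lines = Files.readAllLines(Paths.get("users.csv")); // email,password per line
        try (Connection con = DriverManager.getConnection(
                "jdbc:mysql://localhost:3306/myapp", "dbuser", "dbpass");
             PreparedStatement ps = con.prepareStatement(
                     "INSERT INTO users (email, password) VALUES (?, ?)")) {
            for (String line : lines) {
                String[] parts = line.split(",");
                ps.setString(1, parts[0]);
                ps.setString(2, parts[1]);
                ps.addBatch();
            }
            ps.executeBatch(); // one round trip for all rows
        }
    }
}
```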
I am using JMeter to test the performance of the following server infrastructure. The code base uses the ICEfaces framework and hence generates dynamic IDs each time there is a new build.
I record the scripts and run them for different load levels (10 users, 20 users, 30 users and so on). Whenever a new code base is deployed, the IDs change, so I have to re-record the scripts before I can run the tests again.
As of now I am able to satisfactorily get my job done.
I wish to take my job to a whole new level by testing performance on the following server infrastructure.
My issues are the following:
Because there are two different nodes (Node1 and Node2), each node has its own unique set of dynamic IDs. When I record a script in a particular login session, I cannot be sure which node my session is pinned to, so the recorded script is tailor-made for a single node, not for the cluster.
When the load balancer gets into action, I cannot be sure which node JMeter hits during a performance run, and for obvious reasons the run fails to generate results.
I want a clever way to record a script that can successfully run against a multi-server configuration.
How do I perform performance testing on this configuration?
Problem
Users can submit data to generate a report, which triggers a spring-batch job. If the same data is submitted (by the same user or another user), the same report should be returned, such that Spring Batch doesn't start a new job, on the premise that the report has already been generated.
To make matters a little more complicated, generated reports expire after 90 days. The idea behind this is that the data gleaned from the various web services used to build the report is likely out of date by then. Therefore, after 90 days the report should be regenerated using fresh data from those web services.
Questions
When a job has already run, how can I discover the job execution id for that job? This id is used in the URL to uniquely identify a report. JobExplorer is severely limited in its ability to query Spring Batch data.
How can I trigger another instance of the job only after 90 days? The issue is that, given duplicate job parameters, a JobInstanceAlreadyCompleteException will be thrown. Must I encode the 90 days as an extra identifying parameter, or is there an easier way?
Cleaning up old jobs must be done using business methods, and the same goes for expired reports.
Given this premise, you can try a different path to solve your problem:
Every user launches a different job, with the same report properties but an extra job parameter to make every job unique (see the launcher sketch after this list).
The first step checks, using a business method, whether there is already a running job for that report; in that case, notify the user that he has to wait or retry later (use a decider; a sketch follows below).
The second step checks, using a business method, whether there is a completed, not-yet-expired report; if so, retrieve it and show it to the user (use a decider, as in the previous step).
Generate the report (deleting the old one, if necessary).
Show the report to the user.
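To make every launch unique, as mentioned above, here is a hedged sketch with hypothetical names:

```java
import org.springframework.batch.core.Job;
import org.springframework.batch.core.JobParametersBuilder;
import org.springframework.batch.core.launch.JobLauncher;

// Minimal sketch: reportKey identifies the report, while launchTime makes every
// launch a distinct JobInstance, so JobInstanceAlreadyCompleteException cannot
// occur. Report uniqueness is enforced by the business checks in the deciders,
// not by Spring Batch's job-parameter identity. Names are hypothetical.
public class ReportLauncher {
    private final JobLauncher jobLauncher;
    private final Job reportJob;

    public ReportLauncher(JobLauncher jobLauncher, Job reportJob) {
        this.jobLauncher = jobLauncher;
        this.reportJob = reportJob;
    }

    public void launch(String reportKey) throws Exception {
        jobLauncher.run(reportJob, new JobParametersBuilder()
                .addString("reportKey", reportKey)                  // same for identical submissions
                .addLong("launchTime", System.currentTimeMillis())  // always unique
                .toJobParameters());
    }
}
```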
Of course, the generated-report metadata tables are separate from the Spring Batch tables and should be accessed through DAOs belonging to your domain context (the report, in your case).
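Along those lines, a minimal decider sketch for the completed-report check, assuming a hypothetical ReportRepository DAO like the one just described:

```java
import org.springframework.batch.core.JobExecution;
import org.springframework.batch.core.StepExecution;
import org.springframework.batch.core.job.flow.FlowExecutionStatus;
import org.springframework.batch.core.job.flow.JobExecutionDecider;

// Minimal sketch: routes the flow depending on whether a completed,
// not-yet-expired report already exists. ReportRepository is a hypothetical
// domain DAO; wire the statuses into your flow ("show existing" vs "generate").
public class ReportExistsDecider implements JobExecutionDecider {

    interface ReportRepository {
        boolean existsCompletedAndNotExpired(String reportKey);
    }

    private final ReportRepository reports;

    public ReportExistsDecider(ReportRepository reports) {
        this.reports = reports;
    }

    @Override
    public FlowExecutionStatus decide(JobExecution jobExecution, StepExecution stepExecution) {
        String reportKey = jobExecution.getJobParameters().getString("reportKey");
        if (reports.existsCompletedAndNotExpired(reportKey)) {
            return new FlowExecutionStatus("REPORT_EXISTS"); // skip generation, show stored report
        }
        return new FlowExecutionStatus("GENERATE"); // continue to the generation step
    }
}
```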
Could this be a valid alternative?
I would like to develop an application that listens to a database table and processes the data immediately. The batch does not execute only once; it listens to the table regularly, for example at a 10-second interval. Whenever new data is inserted into the table, the batch starts processing.
Is it possible to develop such an application with Spring Batch? If yes, can you please give some advice on it?
Thank you.
This goal can be achieved using Quartz: just launch the job every 10 seconds and, in your step, process the new data. I haven't tried it myself and can't check right now (I'm writing from my mobile), but there are a lot of examples on SO.
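If you'd rather not set up Quartz, Spring's own @Scheduled support achieves the same polling; a minimal sketch, assuming a hypothetical job that reads only unprocessed rows (and @EnableScheduling on a configuration class):

```java
import org.springframework.batch.core.Job;
import org.springframework.batch.core.JobParametersBuilder;
import org.springframework.batch.core.launch.JobLauncher;
import org.springframework.scheduling.annotation.Scheduled;
import org.springframework.stereotype.Component;

// Minimal sketch: launches the job every 10 seconds, using Spring's @Scheduled
// instead of Quartz for brevity; the idea is identical. "processNewRowsJob" is
// a hypothetical job whose reader picks up only not-yet-processed rows.
@Component
public class TablePoller {

    private final JobLauncher jobLauncher;
    private final Job processNewRowsJob;

    public TablePoller(JobLauncher jobLauncher, Job processNewRowsJob) {
        this.jobLauncher = jobLauncher;
        this.processNewRowsJob = processNewRowsJob;
    }

    @Scheduled(fixedDelay = 10_000)
    public void poll() throws Exception {
        // A fresh timestamp parameter makes each launch a new JobInstance,
        // avoiding JobInstanceAlreadyCompleteException on repeated runs.
        jobLauncher.run(processNewRowsJob, new JobParametersBuilder()
                .addLong("launchTime", System.currentTimeMillis())
                .toJobParameters());
    }
}
```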
What is the best way to save data in session variables in a classic ASP web site?
I am maintaining a classic ASP web site and want to allow my users to demo all the functionality of the site; this means allowing them to delete records.
The closest examples I have seen so far are the Telerik control demos, where the dataset is saved in session on first load and the user is allowed to manipulate the data.
How can I achieve the same in ASP with an MS Access backend?
If you want to persist the state over multiple pages (e.g. to demo your complete application), then it's a bit tricky.
I would suggest copying the MDB file for each session and using the copied version. This ensures that every session uses its own data:
Create a version of your Access DB that will be used as a fresh template for each user.
On session start, copy the template and name it after the user's session ID.
Use that individual MDB for the session.
Note: the only drawback I can see here is that you need to remove the unused MDB files, as they can pile up after a while. You could do it with a scheduled task, or even on session start before you create the new copy.
I am not sure what you can use to check whether a file is still in use, but check the file's creation date; the LDB lock file may help you as well (if it does not exist, the MDB is unused).
You can store a connection or even an object in a session variable, as long as you remember at retrieval time what kind of variable you stored. I have never stored a whole dataset in a session variable, but I have stored a lot of arrays in them, so you can use the ADO GetRows method to load a complete recordset into a session variable as an array.
How big is the Access database? If it is small enough (relative to the server capacity, expected number of users, and so forth), then I like the idea of using a fresh copy of the database for each user who runs the demo.
With this approach you simplify your possible code paths; otherwise the "are we in demo mode or not?" logic will permeate a heck of a lot of your code.
I'd do it like this...
When the user begins the demo, make a copy of the Access DB for that user to use. If your db is foo.mdb, copy it to /tempdb/foo_1234567890.mdb, where 1234567890 is the user's session ID.
Alter the user's connection string to point to the fresh database copy. From this point on, your app can operate as normal with no further modifications.
Have a scheduled task that deletes all files in /tempdb whose last-modified times are more than __ hours in the past. If you don't have the ability to schedule tasks on the server (perhaps you're in a shared hosting environment, etc.), you could do this at the same time as step #1.
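If the server's task scheduler can run arbitrary executables, the cleanup itself is only a few lines in any language; here is a hedged sketch in Java (the directory path and the age threshold are assumptions, so adjust them):

```java
import java.io.File;
import java.time.Duration;
import java.time.Instant;

// Minimal sketch: deletes per-session MDB copies not modified for MAX_AGE_HOURS.
// Run it from the server's task scheduler. Path and threshold are assumptions.
public class TempDbCleanup {
    private static final long MAX_AGE_HOURS = 24; // assumed threshold

    public static void main(String[] args) {
        File dir = new File("C:\\inetpub\\wwwroot\\tempdb");
        Instant cutoff = Instant.now().minus(Duration.ofHours(MAX_AGE_HOURS));
        File[] files = dir.listFiles((d, name) -> name.endsWith(".mdb"));
        if (files == null) return;
        for (File f : files) {
            if (Instant.ofEpochMilli(f.lastModified()).isBefore(cutoff)) {
                f.delete(); // ignore failures; a locked file is still in use
            }
        }
    }
}
```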