How does one go about testing the server-side performance of a web application?
I'm working on a small web application (specifically, it will solely be responding to AJAX requests with database rows). I want to see how it performs under load. However, I cannot upload it to an internet host right away. The development environment is, however, part of the local intranet, so I can use as many machines as I want to hammer the development server, presumably using Python in conjunction with urllib2.
My question is, is such an approach really accurate for determining the high-load performance of a server-side script? Is there a better way to do this? Am I missing something here?
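For concreteness, the kind of urllib2 hammer I have in mind is something like this (a rough sketch; the URL and the counts are placeholders, not my real endpoint):

```python
# A minimal multi-threaded load generator (Python 2-era urllib2).
import threading
import time
import urllib2

URL = "http://dev-server/app/rows"  # placeholder AJAX endpoint
WORKERS = 10                        # concurrent simulated clients
REQUESTS_PER_WORKER = 100

latencies = []

def worker():
    for _ in range(REQUESTS_PER_WORKER):
        start = time.time()
        urllib2.urlopen(URL).read()
        latencies.append(time.time() - start)  # list.append is thread-safe in CPython

threads = [threading.Thread(target=worker) for _ in range(WORKERS)]
for t in threads:
    t.start()
for t in threads:
    t.join()

print("%d requests, average latency %.3fs"
      % (len(latencies), sum(latencies) / len(latencies)))
```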
Have you tried a tool like Apache JMeter? http://jmeter.apache.org/
I want to conduct an HTTP flood against a test website that I have designed in Visual Studio 2017. It is an ASP.NET WebForms site, so I want to ask whether Apache JMeter is a proper tool for such a project. I have done some research and found reports from other users that Apache JMeter has problems with ASP.NET apps in some cases, so I'm a little confused. I am also considering using two computers, one to run the website and the other to run the JMeter script, in order to avoid the resource contention that could lead to inaccurate metrics. Is it possible to pull off the HTTP flood this way? Any other suggestions are welcome.
Thanks.
JMeter doesn't have any problems with ASP.NET websites (or any other websites). JMeter is backend-agnostic and knows nothing about the server-side technology stack, as it basically just gets HTML and headers from the server.
Just make sure to perform correlation of dynamic parameters like VIEWSTATE, EVENTVALIDATION, etc., and you should be good to go.
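In JMeter that correlation is usually done with a Regular Expression Extractor on the hidden fields. Outside JMeter, the idea looks like this minimal Python sketch (the URL and the TextBox1 field are placeholders; __VIEWSTATE and __EVENTVALIDATION are the standard ASP.NET names):

```python
# Sketch of correlation: fetch the page, pull out the dynamic ASP.NET hidden
# fields, and echo them back on the next POST (Python 2-era urllib2).
import re
import urllib
import urllib2

page = urllib2.urlopen("http://test-site/Default.aspx").read()

def hidden_field(name, html):
    # ASP.NET renders e.g. <input type="hidden" name="__VIEWSTATE" value="..."/>
    match = re.search(r'name="%s"[^>]*value="([^"]*)"' % name, html)
    return match.group(1) if match else ""

data = urllib.urlencode({
    "__VIEWSTATE": hidden_field("__VIEWSTATE", page),
    "__EVENTVALIDATION": hidden_field("__EVENTVALIDATION", page),
    "TextBox1": "load test value",  # placeholder form field
})
response = urllib2.urlopen("http://test-site/Default.aspx", data)  # POST
print(response.getcode())
```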
With regards to the "flood" approach, I would rather recommend implementing real-life user scenarios; that is, the JMeter test should represent real usage of your web application by a real user with a real browser, including both the business steps (login, browse, search, etc.) and the technical side of things (cookies, embedded resources, headers, cache).
I'm trying to build a simulator that simulates hundreds of users on a web application.
I usually use Microsoft Load Simulator and WebTests.
If the web page has some complexity, I use WebTest plugins to adjust it correctly.
Now I have a web page that makes tons of AJAX requests per URL I visit, and those AJAX requests are based on complex calculations done in the browser.
If I just browse to the URL and record all the traffic, then even after I parameterize some dynamic values I still won't be able to simulate it correctly, since different requests may be sent depending on the server's previous responses.
If I build a WebTest that simulates it correctly, it will require a lot of WebTest plugins and will be impossible to maintain, since the site will probably change every few weeks.
I thought about using Selenium, but it would require far too many hardware resources to run hundreds of users.
So I came up with the idea of using a headless browser such as PhantomJS, SimpleBrowser, or HtmlUnit.
SimpleBrowser and HtmlUnit do not support executing JavaScript/AJAX, which makes them useless for me.
I tried PhantomJS, but I had a problem running multiple users in parallel: localStorage is shared between instances, so they all keep the same session and I can't simulate different users in parallel.
Does anyone have experience with load testing complex AJAX web applications?
I will love you for eternity if you can help me with this issue.
P.S. I usually code in C#, but I'm open to new languages/technologies.
Using Selenium for performance testing is not practical. I recommend using Locust for real performance testing. For getting and using dynamic data you can check this answer. You need to write a simple Python script to simulate the users.
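A minimal Locust file is just a few lines of Python; here is a sketch against the current HttpUser API (the endpoint path is a placeholder):

```python
# locustfile.py - a minimal sketch of a simulated user.
from locust import HttpUser, task, between

class AjaxUser(HttpUser):
    wait_time = between(1, 3)  # simulated think time in seconds

    @task
    def fetch_rows(self):
        # One of the AJAX calls the page would make; the path is a placeholder.
        self.client.get("/api/rows")
```

Run it with `locust -f locustfile.py --host=http://your-test-server` and set the number of simulated users from the web UI.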
I am investigating www.loadbooster.com, which can import a Selenium script and run it as a load test in a headless browser (PhantomJS).
It is still a work in progress for me, so I can't comment on how good it is, but you could investigate it.
I am not able to find anywhere how we can do performance testing manually.
Please help me out with this query.
Thanks!
Maybe you are looking for JMeter or a similar tool.
What browser? Most current browsers support the W3C Navigation Timing spec and expose performance data directly on the DOM. You can access it from the console, from JavaScript on your pages, or from browser extensions that display the information.
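For example, the same Navigation Timing numbers can be pulled programmatically; a sketch using Selenium from Python (assumes the selenium package and a matching driver are installed; the URL is a placeholder):

```python
# Sketch: read W3C Navigation Timing data from the browser via Selenium.
from selenium import webdriver

driver = webdriver.Chrome()            # assumes chromedriver is on PATH
driver.get("http://test-server/page")  # placeholder URL

timing = driver.execute_script(
    "var t = window.performance.timing;"
    "return {navigationStart: t.navigationStart, loadEventEnd: t.loadEventEnd};")
print("page load: %d ms" % (timing["loadEventEnd"] - timing["navigationStart"]))
driver.quit()
```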
If you want more detail like a resource load waterfall then you can usually access that directly from the dev tools provided by the various browsers.
One thing you will want to be really careful of is to make sure you do your testing in a configuration that is similar to your users'. If you are running a server locally and testing from a browser on the same machine, or even the same network, then your performance data will be pretty worthless (unless it's an intranet app).
You can perform manual (performance) testing of any web page by optimizing your CSS, JavaScript, and image sizes.
I think JMeter is the best tool for this kind of web page testing, and you can add scripting if you want to.
You can also check the YSlow add-on for Firefox; it gives you filtered data to help optimize your page performance.
There are also some online tools available.
You can simply use the GTmetrix tool, which will report on your site's overall performance in detail.
The best way to do performance testing without any tool is to define a standard loading time for each page based on experience, or else ask the client to provide an ideal time for each page, and then verify the actual loading time against it (see the sketch after the list below). But for multiple simultaneous users, JMeter is the best hands-on approach available. It's open source, easy to understand, and you get reports too.
But of course there are multiple factors that can hinder performance. They are:
Your network speed
The speed of the server your application is hosted on
The number of simultaneous users
Heavy images in pages
Last but not least, unnecessary links and code; in short, memory consumption in the code, such as loops that are not required. All gifts from the development teams!
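To make the per-page check above concrete, a small script can time each page against the agreed standard; a sketch (the URLs and limits are placeholders):

```python
# Sketch: verify each page's load time against an agreed standard.
import time
import urllib2

STANDARDS = {                      # page URL -> maximum acceptable seconds
    "http://test-site/home": 2.0,
    "http://test-site/search": 3.0,
}

for url, limit in STANDARDS.items():
    start = time.time()
    urllib2.urlopen(url).read()
    elapsed = time.time() - start
    status = "OK" if elapsed <= limit else "TOO SLOW"
    print("%-35s %.2fs (limit %.1fs) %s" % (url, elapsed, limit, status))
```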
I need to deploy an application onto some Windows machines for purposes of data collection from a group of people (i.e. the application will be used to gather responses to a series of survey questions). The process is interactive, alternating between displays of text and images with specific timing requirements. I have put together a prototype application using HTML and JavaScript that implements the survey. However, there are some unique constraints on the deployment environment that have me stuck:
While the machine is Internet-connected, the client requires that the survey application must run fully local to the PC that it runs on. Therefore, sending the survey results to a remote server is not permissible. Obviously, saving to a local file from a Web browser is typically not permitted for security reasons.
Installation of applications onto the machines that will run the survey is not permitted.
The configuration of the machines is not known specifically a priori, but I can assume some recent version of Windows with IE8+.
The "no remote access" requirement was a late comer, and has thrown a wrench into the plan of just writing a simple Web application that could post results to an HTTP server. I'm now looking for the easiest way forward. Two main approaches come to mind:
Use a GUI framework that provides a control that can display HTML/JavaScript; running a full-blown application on the PC would allow me to save the results to the filesystem. I've never done this, but it seems like in this day and age it shouldn't be too difficult. This would allow me to reuse much of my existing prototype implementation, but I would need some way of transferring the results (which would be stored in a JavaScript data structure) outside of the Web control to where the rest of the application could access it.
Reimplement the entire application using some GUI framework (I've used PyQt successfully before, although not on Windows). This approach is obviously less desirable than #1 due to the lack of reuse. However, it may be necessary if #1 isn't feasible.
Any recommendations for the best way to go? Ideally, I'm looking for a solution that can be run in a "portable" manner from a USB thumbdrive or similar.
Have you looked at HTML Applications (HTA)? They work in IE5+ and can use Windows Scripting Host to write to local drives and UNC shares...
Maybe you can use a portable web server with a scripting language on the server side, such as Mongoose (http://code.google.com/p/mongoose/), which can run PHP, CGI, etc. scripts. Then simply create a script that saves a file to your hard drive, and leave the rest of the application as it is.
Use a script to start the web server, and perhaps a portable web browser like K-Meleon (http://kmeleon.sourceforge.net/), which is highly configurable, to start the application; or open the system browser at your localhost URL.
The only problem may be that the user has to allow the server through the firewall the first time it runs.
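The same idea also works with Python's standard-library HTTP server instead of Mongoose + PHP, if a portable Python is acceptable; a minimal sketch (Python 2-era module names; the port and file name are arbitrary):

```python
# Sketch: a tiny local server that appends POSTed survey results to a file.
import BaseHTTPServer

class SurveyHandler(BaseHTTPServer.BaseHTTPRequestHandler):
    def do_POST(self):
        length = int(self.headers.getheader('content-length') or 0)
        body = self.rfile.read(length)
        with open('results.txt', 'ab') as out:  # local file next to the script
            out.write(body + '\n')
        self.send_response(200)
        self.end_headers()

BaseHTTPServer.HTTPServer(('127.0.0.1', 8000), SurveyHandler).serve_forever()
```

Binding to 127.0.0.1 keeps everything on the local machine and may also sidestep the firewall prompt.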
I have been tasked with looking for a performance testing solution for one of our Java applications running on a Weblogic server. The requirement is to record production requests (both GET and POST including POST data) and then run these requests in a performance test environment with a copy of the production database.
The reasons for using production requests instead of a test script are:
It is a large application with no existing test scripts, so it would be a large amount of work to write scripts covering the entire application.
Some performance issues only appear when users do a number of actions in a particular order.
To test using actual user interaction with the system, not an estimate of how users may interact with the system. We all know that users will do things we have not thought of.
I want to be able to fix performance issues and rerun the requests against the fixed code before releasing to production.
I have looked at using JMeter's Access Log Sampler with the server access logs; however, the access logs do not contain POST data, and the Access Log Sampler only looks at the request URL, so it cannot simulate users submitting form data.
I have also looked at using the JMeter HTTP Proxy Server; however, it can record the actions of only one user and requires the user to configure their browser to use the proxy. The same limitation exists with Tsung and The Grinder.
I have looked at using Wireshark and Tcpreplay, but recording at the packet level is excessive and will not give any useful reports at the request level.
Is there a better way to analyze production performance considering I need to be able to test fixes before releasing to production?
That is going to be a hard ask. I work with Visual Studio Test Edition to load test my applications, and we are only able to "estimate" the users' activity on the site.
It is possible to look at the logs and gather information on the likelihood of certain paths through your app. You can then look at the production database to see the values likely to be entered in any POST requests. From that you will have to build load tests that approximate the usage patterns of your production site.
With any current tools, I don't think it is possible to record and play back actual user interaction.
It is possible to alter your web app so that it records and logs every request and POST against the session and a timestamp. This custom logging could then be used to generate load test requests against a test website. This would be a serious code change to your existing site and would likely have performance impacts.
That said, I have worked with web apps that do this level of logging and the ability to analyse the exact series of page posts/requests that caused an error is quite valuable to a developer.
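To illustrate the shape of that custom logging: in Python/WSGI terms it would be a middleware like the sketch below. The app in question is Java on WebLogic, where a servlet filter would play the same role, so this is only to show the idea:

```python
# Sketch: WSGI middleware that logs every request (timestamp, method, path,
# cookies, body) so the traffic can be replayed against a test environment.
import datetime
import logging
from io import BytesIO

logging.basicConfig(filename='requests.log', level=logging.INFO)

class RequestLogger(object):
    def __init__(self, app):
        self.app = app

    def __call__(self, environ, start_response):
        length = int(environ.get('CONTENT_LENGTH') or 0)
        body = environ['wsgi.input'].read(length) if length else b''
        environ['wsgi.input'] = BytesIO(body)  # re-expose the body to the app
        logging.info('%s %s %s cookies=%s body=%r',
                     datetime.datetime.now().isoformat(),
                     environ['REQUEST_METHOD'],
                     environ.get('PATH_INFO', ''),
                     environ.get('HTTP_COOKIE', ''),  # session id lives here
                     body)
        return self.app(environ, start_response)
```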
So in summary: It is possible, but I have not heard of any off the shelf tools that do it.
Please check out this whitepaper by Impetus Technologies: http://www.impetus.com/plabs/sandstorm.html
Honestly, I'm not sure the task you're being asked to do is even possible, let alone a good idea. Depending on how complex the application's backend is, and how perfectly you can recreate the state (i.e., all the way down to external SOA services or the time/clock), it may not be possible to make those GET and POST requests reproduce the same behavior.
That said, performance testing against production data is always great, but it usually requires application-specific knowledge that will stress said data. Simply repeating HTTP GETs and POSTs will almost certainly not yield useful results.
Good luck!
I would suggest the following to capture the production requests and simulate an accurate workload:
1) Use Coremetrics: Coremetrics provides solutions with which you can learn the application's usage patterns. That helps in coming up with an accurate workload model, which can then be converted into test scripts and executed against a masked copy of the production database. This will give you accurate results about the application's performance in real time.
2) Another option would be to create a small utility using AOP (aspect-oriented programming) so that it can trace all the requests and the corresponding method calls. This helps in identifying the production usage pattern and, in turn, in accurately simulating the workload. AOP frameworks such as AspectJ can be used. This would not require any changes to the code; the instrumentation can be done on the fly. The other benefit is that it can be enabled only for a specific time window and then turned off.
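AspectJ is the natural fit for the Java application here; purely to illustrate the shape of such tracing, the analogous idea as a Python decorator (the handler function below is hypothetical):

```python
# Sketch: trace calls and timings without touching the functions' own code,
# analogous to what an AspectJ around-advice would do in the Java app.
import functools
import time

def traced(func):
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        start = time.time()
        try:
            return func(*args, **kwargs)
        finally:
            print("%s took %.3fs" % (func.__name__, time.time() - start))
    return wrapper

@traced
def handle_request(path):  # hypothetical request handler
    time.sleep(0.05)       # stand-in for real work
    return "rows for " + path
```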
Regards,
batterywalam