Using Apache JMeter to conduct an HTTP flood against an ASP.NET WebForms site

I want to conduct an HTTP flood against a test website that I have designed in Visual Studio 2017. It is an ASP.NET WebForms site, so I want to ask whether Apache JMeter is a suitable tool for such a project. I have done some research and found reports from other users that Apache JMeter has problems with ASP.NET apps in some cases, so I'm a little confused. Also, I am considering using two computers, one for running the website and the other for running the JMeter script, in order to avoid the resource contention that may lead to inaccurate metrics. Is it possible to carry out the HTTP flood in such a way? Any other suggestions are welcome.
Thanks.

JMeter doesn't have any problems with ASP.NET websites (or any other websites). JMeter is backend-agnostic and knows nothing about the server-side technology stack, as it basically just gets HTML and headers from the server.
Just make sure to perform correlation of dynamic parameters like __VIEWSTATE, __EVENTVALIDATION, etc., and you should be good to go.
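As an illustration of what that correlation step involves, here is a minimal Python sketch (using the third-party requests library, standing in for what a Regular Expression Extractor does inside JMeter) that pulls the hidden __VIEWSTATE and __EVENTVALIDATION fields out of a WebForms page and echoes them back on the POST; the URL and form field names are hypothetical:

```python
import re
import requests

BASE_URL = "http://localhost:8080/Default.aspx"  # hypothetical test site

session = requests.Session()

# GET the page and extract the dynamic hidden fields WebForms injects.
page = session.get(BASE_URL).text
viewstate = re.search(r'id="__VIEWSTATE" value="([^"]*)"', page).group(1)
eventvalidation = re.search(r'id="__EVENTVALIDATION" value="([^"]*)"', page).group(1)

# POST the form back, echoing the extracted values - this is the
# "correlation" step; without it WebForms rejects the postback.
response = session.post(BASE_URL, data={
    "__VIEWSTATE": viewstate,
    "__EVENTVALIDATION": eventvalidation,
    "TextBox1": "hello",   # hypothetical form field
    "Button1": "Submit",   # hypothetical submit button
})
print(response.status_code)
```

In JMeter itself the equivalent is a Regular Expression Extractor (or CSS Selector Extractor) on the first request, with the extracted variables referenced in the follow-up request's parameters.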
With regards to the "flood" approach, I would rather recommend implementing real-life user scenarios; that is, the JMeter test should represent real usage of your web application by a real user with a real browser, including both the business steps (login, browse, search, etc.) and the technical side of things (cookies, embedded resources, headers, cache).

Related

How can we run performance testing manually for any webpage?

I am not able to find anywhere how we can do performance testing manually.
Please help me out with this query.
Thanks!
Maybe you are looking for JMeter or a similar tool.
What browser? Most current browsers support the W3C Navigation Timing spec and expose performance data directly on the DOM. You can access it from the console, from JavaScript on your pages, or from browser extensions that display the information.
If you want more detail, like a resource load waterfall, you can usually access that directly from the dev tools provided by the various browsers.
One thing you will want to be really careful of is to make sure you do your testing in a configuration that is similar to your users'. If you are running a server locally and testing from a browser on the same machine, or even the same network, then your performance data will be pretty worthless (unless it's an intranet app).
You can perform manual (performance) testing of any webpage by optimizing your CSS, JavaScript, and image sizes.
I think JMeter is one of the best tools for checking webpage performance, and you can add your own scripting if you want.
You can also check the YSlow add-on for Firefox. This add-on gives you filtered data to help optimize your page performance.
There are also some online tools available.
You can simply use the GTmetrix tool, which will report on your site's overall performance in detail.
The best way to go about performance testing without any tool is to define a standard loading time for each page based on experience, or else ask the client to provide an ideal time for each page, against which the actual loading time can be verified. But for multiple simultaneous users, JMeter is the best hands-on approach available. It's open source, easy to understand, and you get reports too.
But of course there are multiple factors that can hinder performance:
Your network speed
The speed of the server on which your application is hosted
The number of simultaneous users
Heavy images in pages
Last but not least, unnecessary links and code - in short, memory consumption in the code, such as loops that are not required. All gifts from the development teams!

Include static resources like images, CSS, JS, etc. in tests

I've recently started using JMeter to create load tests for my web applications. I really like the tool, and after watching some videos it was really easy to get started with creating tests.
There's however one thing that I'm not clear about.
Reading the JMeter homepage, there's a "Best practices" section. Among other things, it says:
The most important thing to do is filter out all requests you aren't interested in. For instance, there's no point in recording image requests (JMeter can be instructed to download all images on a page - see HTTP Request ). These will just clutter your test plan.
I've seen this on other pages as well, saying that you shouldn't include requests for images or any other static resources in your tests. However, I've still not been able to find a single page that gives a good explanation as to WHY you shouldn't include static resources.
Sure, JMeter isn't a browser, but requests for static resources would no doubt affect the performance of your application? Can someone please give me a good explanation :-)
It all depends on what you are trying to test.
In general, there are two types of performance test I do with JMeter: specific tests, where I look at things that I'm worried about, and "safety net" tests, where I measure the entire application to make sure it does indeed work the way I expect it to.
Specific tests nearly always deal with the dynamic aspects of the web application - the server-side code (.aspx, .php, .jsp etc.). This is where most applications have their bottlenecks - the effort to run a server-side script is many, many times higher than the effort to retrieve a CSS file from disk and serve it up to the browser without any additional processing. If I'm testing the server-side scripts, I don't want to also load the assets - because they clutter up the tests, and consume bandwidth at the test client end. I don't want my tests to fail because my JMeter server is downloading a 5MB video file on each thread and consuming all the bandwidth, when what I'm actually trying to do is see how many logins per second the server can support.
There's very little point in testing your webserver's ability to serve static files - Microsoft, the Apache team, whoever, have already done a brilliant job at that; unless you have a very specific concern, there are better ways to spend your testing budget.
Safety net tests put the whole thing together to prove that it all really does work the way I expect it. Usually, I run these on a production(like) environment, so I have a CDN, production-grade hardware, and the "live" application config. I usually employ a cloud-based testing service for this, so I can see performance from different locations, and generate enough load to stress production-grade kit. You could use JMeter for this (and there are a couple of JMeter Cloud services I've used in the past). It's expensive, it may require downtime, and you should only do it as a safety net.
When you want to do a proper performance test (especially a stress test), where you need to produce your application's response time as a function of the number of users/threads over time, you need to include all static resources, just as the JMeter proxy saved them when you recorded your test.
To take the browser cache into account you can use either the HTTP Cache Manager or a Once Only Controller, so that each thread downloads the static stuff only once, with its first request.
The HTTP Cache Manager is the recommended way to go and much easier to set up: simply include it in your test as the first child of a Thread Group.
The Once Only Controller is more often used when you need to log users in only on their first request.
BTW, parametrization of non-static HTTP requests is recommended - you won't, for example, search for the same product name or buy the same book every time. That's usually the starting point, which can give you a general idea of the performance efficiency of your app.
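To illustrate the cache idea outside JMeter, here is a rough Python sketch (not JMeter itself; the URLs and counts are placeholders) in which each simulated user downloads the static assets only on its first iteration, which is roughly what the HTTP Cache Manager models per thread:

```python
import threading
import urllib.request

PAGE_URL = "http://localhost:8080/products"       # placeholder page
STATIC_URLS = ["http://localhost:8080/site.css",  # placeholder assets
               "http://localhost:8080/app.js"]
USERS = 5
ITERATIONS = 10

def user() -> None:
    cached = False  # each simulated user starts with a cold cache
    for _ in range(ITERATIONS):
        urllib.request.urlopen(PAGE_URL).read()
        if not cached:
            # First iteration only: fetch the static assets, like a
            # browser priming its cache; later iterations skip them.
            for url in STATIC_URLS:
                urllib.request.urlopen(url).read()
            cached = True

threads = [threading.Thread(target=user) for _ in range(USERS)]
for t in threads:
    t.start()
for t in threads:
    t.join()
```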
Hope this helps...
Unless your app is used by casual visitors who only look at one page and then leave, there is a good chance that the static resources are downloaded once and then served from the browser cache.
Moreover, although static resources affect the bandwidth and the overall response time for the user, they should have a small impact on the server load, and they might not be the kind of thing you want to measure.
I guess you need to try mimicking what actual, real users would do with the application.

Load-testing a web application

How does one go about testing the server-side performance of a web application?
I'm working on a small web application (specifically, it will solely be responding to AJAX requests with database rows). I want to see how it performs under load. I cannot upload it to an internet host right away; however, the development environment is part of the local intranet, so I can use as many machines as I want to hammer the development server, presumably using Python in conjunction with urllib2.
My question is, is such an approach really accurate for determining the high-load performance of a server-side script? Is there a better way to do this? Am I missing something here?
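For reference, a minimal sketch of the kind of script the question describes, using Python 3's urllib.request (the successor to urllib2); the endpoint and thread counts are placeholders:

```python
import threading
import time
import urllib.request

URL = "http://dev-server.local/api/rows"  # placeholder endpoint
THREADS = 20
REQUESTS_PER_THREAD = 100

errors = 0
lock = threading.Lock()

def worker() -> None:
    global errors
    for _ in range(REQUESTS_PER_THREAD):
        try:
            urllib.request.urlopen(URL, timeout=10).read()
        except Exception:
            with lock:
                errors += 1

start = time.perf_counter()
threads = [threading.Thread(target=worker) for _ in range(THREADS)]
for t in threads:
    t.start()
for t in threads:
    t.join()
elapsed = time.perf_counter() - start

total = THREADS * REQUESTS_PER_THREAD
print(f"{total} requests in {elapsed:.1f}s "
      f"({total / elapsed:.0f} req/s), {errors} errors")
```

One caveat with this approach: a single Python process tops out well below what a dedicated load generator can produce, which is one reason purpose-built tools exist for this job.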
Have you tried a tool like Apache JMeter? http://jmeter.apache.org/

Performance testing an application for bottlenecks using production data

I have been tasked with looking for a performance testing solution for one of our Java applications running on a Weblogic server. The requirement is to record production requests (both GET and POST including POST data) and then run these requests in a performance test environment with a copy of the production database.
The reasons for using production requests instead of a test script are:
It is a large application with no existing test scripts, so it would be a large amount of work to write scripts to cover the entire application.
Some performance issues only appear when users do a number of actions in a particular order.
To test using actual user interaction with the system, not an estimate of how the users may interact with the system. We all know that users will do things we have not thought of.
I want to be able to fix performance issues and rerun the requests against the fixed code before releasing to production.
I have looked at using JMeter's Access Log Sampler with server access logs; however, the access logs do not contain POST data, and the Access Log Sampler only looks at the request URL, so it cannot simulate users submitting form data.
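To make that limitation concrete, here is a rough Python sketch of access-log replay against the standard Common Log Format (the log path and target host are placeholders): the log stores only the request line, so GETs can be reconstructed but POST bodies were never captured:

```python
import re
import urllib.request

LOG_PATH = "access.log"             # placeholder log file
BASE_URL = "http://test-env.local"  # placeholder test environment

# Common Log Format: host ident user [time] "METHOD /path HTTP/x.x" status size
REQUEST_LINE = re.compile(r'"(GET|POST) (\S+) HTTP/[\d.]+"')

with open(LOG_PATH) as log:
    for line in log:
        match = REQUEST_LINE.search(line)
        if not match:
            continue
        method, path = match.groups()
        if method == "GET":
            urllib.request.urlopen(BASE_URL + path).read()
        else:
            # The POST body was never written to the access log,
            # so there is nothing to replay for form submissions.
            print(f"skipping POST {path}: body not in access log")
```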
I have also looked at using the JMeter HTTP Proxy Server; however, this can record the actions of only one user and requires the user to configure their browser to use the proxy. The same limitation exists with Tsung and The Grinder.
I have looked at Wireshark and tcpreplay, but recording at the packet level is excessive and will not give any useful reports at the request level.
Is there a better way to analyze production performance considering I need to be able to test fixes before releasing to production?
That is going to be a hard ask. I work with Visual Studio Test Edition to load test my applications, and we are only able to "estimate" the users' activity on the site.
It is possible to look at the logs and gather information on the likelihood of certain paths through your app. You can then look at the production database for the likely values entered in any POST requests. From that, you will have to build load tests that approximate the usage patterns of your production site.
With any current tools, I don't think it is possible to record and play back actual user interaction.
It is possible to alter your web app so that it records and logs every request and POST against session and datetime. This custom logging could then be used to generate load-test requests against a test website. This would require some serious code changes to your existing site and would likely have performance impacts.
That said, I have worked with web apps that do this level of logging, and the ability to analyse the exact series of page posts/requests that caused an error is quite valuable to a developer.
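As a rough illustration of that replay idea, here is a Python sketch assuming a hypothetical tab-separated log format (timestamp, session, method, path, form body); unlike the access-log replay above, the logged body makes POSTs replayable:

```python
import urllib.request

LOG_PATH = "requests.log"           # hypothetical custom request log
BASE_URL = "http://test-env.local"  # placeholder test environment

with open(LOG_PATH) as log:
    for line in log:
        # Hypothetical format written by the custom logging, e.g.:
        # 2019-01-01T12:00:00<TAB>session42<TAB>POST<TAB>/checkout<TAB>qty=2&sku=123
        timestamp, session, method, path, body = line.rstrip("\n").split("\t")
        if method == "GET":
            urllib.request.urlopen(BASE_URL + path).read()
        else:
            # Passing data makes urllib issue a POST with the logged body.
            urllib.request.urlopen(BASE_URL + path,
                                   data=body.encode("utf-8")).read()
```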
So in summary: It is possible, but I have not heard of any off the shelf tools that do it.
Please check out this whitepaper by Impetus Technologies on this page: http://www.impetus.com/plabs/sandstorm.html
Honestly, I'm not sure the task you're being asked to do is even possible, let alone a good idea. Depending on how complex the application's backend is, and how perfectly you can recreate the state (i.e. all the way down to external SOA services or the time/clock), it may not be possible to make those GET and POST requests reproduce the same behavior.
That said, performance testing against production data is always great, but it usually requires application-specific knowledge that will stress said data. Simply repeating HTTP GETs and POSTs will almost certainly not yield useful results.
Good luck!
I would suggest the following to capture the production requests and simulate an accurate workload:
1) Use Coremetrics: Coremetrics provides solutions with which you can learn the application's usage patterns. This helps in coming up with an accurate workload model, which can then be converted into test scripts and executed against a masked copy of the production database. This will give you accurate results about the application's performance.
2) Another option is to create a small utility using AOP (aspect-oriented programming) to trace all the requests and the corresponding method traces. This helps in identifying the production usage pattern and, in turn, accurately simulating the workload. AOP frameworks such as AspectJ can be used; this requires no changes to the code, since the instrumentation can be done on the fly. Another benefit is that it can be enabled for a specific time window only and then turned off.
Regards,
batterywalam

Best way to check the performance of a web application

What is the best tool (open source or commercial) currently available that lets me send customized requests to a web server and get back a response, to check the performance?
I will be sending it a load of more than 20K requests per second, but I need to get numbers for each call made. Also, the numbers might be in microseconds or nanoseconds. With such small measurement units, how can I work out a baseline and a benchmark?
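On the measurement-granularity point, here is a minimal Python sketch of per-call timing using a monotonic nanosecond clock, with a warm-up pass before recording baseline percentiles (the endpoint and sample counts are placeholders):

```python
import time
import urllib.request

URL = "http://localhost:8080/api"  # placeholder endpoint
WARMUP, SAMPLES = 50, 1000

def timed_call() -> int:
    """Return the latency of one request in nanoseconds."""
    start = time.perf_counter_ns()  # monotonic, nanosecond resolution
    urllib.request.urlopen(URL).read()
    return time.perf_counter_ns() - start

# Warm up connections and caches before recording the baseline.
for _ in range(WARMUP):
    timed_call()

samples = sorted(timed_call() for _ in range(SAMPLES))
for pct in (50, 95, 99):
    value = samples[int(len(samples) * pct / 100) - 1]
    print(f"p{pct}: {value / 1000:.0f} µs")
```

Note that real HTTP round trips rarely land below the millisecond range; sub-microsecond figures usually come from in-process benchmarks rather than network calls, so nanosecond resolution reflects the precision of the clock, not of the measurement.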
If you're using Apache, Apache ab is a benchmarking tool that tests how many requests your server can serve per second and how well it handles load and concurrency. It's an open-source project that ships with the Apache HTTP Server.
In addition, Wikipedia has a nice list of benchmarking software for testing servers.
You can use Microsoft's Web Application Stress Tool (WAS).
The Microsoft WAS web stress tool is designed to realistically simulate multiple browsers requesting pages from a web site. You can use this tool to gather performance and stability information about your web application. This tool simulates a large number of requests with a relatively small number of client machines. The goal is to create an environment that is as close to production as possible so that you can find and eliminate problems in the web application prior to deployment.
You can find lists of open-source performance-testing software online (most of it is for the web and sends custom requests to a web server).
I don't know if either of these has granularity better than milliseconds, but check out JMeter (open source) and LoadRunner (commercial). LoadRunner is not cheap, but it allows you to span load generation across multiple machines with aggregated results.
