Load Testing in JMeter

Hi, since I was not aware of load testing, I have a doubt that came up while learning; even if it is not a valid question, please help me out.
In JMeter we can simply record and run a load test, right? If that is the case, and I hit some unknown application with lots of load from my client side, it might cause that server to crash. What would the owners do if their server crashed because of an unknown person's load test?
Are there specific requirements for doing a load test, or can we simply load test any website? Please let me know, even if my query is not a valid one... thanks in advance.

The majority of web applications are protected from DoS attacks, therefore most likely you will not be able to "crash" the server; the traffic from your IP will simply get blocked and your IP will get banned.
Moreover, your actions fall under the Computer Misuse Act and you may be subject to imprisonment of up to 1 year and a fine of up to 5,000 pounds. That law applies to the UK, but I'm pretty sure an equivalent exists in every country around the globe.
So don't load test an application without the explicit permission of the application's owner, or you will run into trouble.
Check out Websites Forbidden to Test Using BlazeMeter for an explicit list of web sites you must not test by any means. There are some sites you can use for practicing, like http://newtours.demoaut.com/ or http://blazedemo.com/, however I would recommend using something you can deploy locally, as this is the safest way to practice load testing; moreover, you will be able to see the server-side impact of your test.

Related

Is there any possible way to load test WOWZA Cloud Live streaming video?

I am using JMeter to test the load/performance of live streaming servers, including the WOWZA live streaming engine. However, I am unable to test live streaming on WOWZA Cloud, since I am getting a lot of timeout errors. I am well aware that the timeouts are not caused by a delay in the response, because the live stream runs smoothly when opened from an external network. I found out that, after some period of load requests being sent to Wowza Cloud, the domain name itself changes (it is dynamic). I have created the JMeter config in such a way that the URL path, playlist.m3u8, chunklist.m3u8, and the corresponding stream (ts) files are all dynamic. But since the domain name itself changes partway through the load test, the requests being sent partially fail (maybe because the domain name I am sending requests to is no longer responsible for handling them). Can anybody suggest what to do? Is there any way to load test WOWZA Cloud?
You can use this JMeter plugin.
It is a plugin that does URL extraction automatically from the manifest, without you needing to use JMeter extractors; as a consequence, even when segment URLs change due to Wowza Cloud scaling, this is taken into account.
Besides, it accurately simulates how players request the server and gives metrics on user experience.
Still, as written in the other answer, ensure you:
ask for authorization, to avoid your test being treated as a DDoS attack
disable Java DNS caching for JMeter's JVM (sketched below)
Disclaimer: I work for the company that develops it.
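On the DNS caching point: below is a minimal, illustrative Java snippet showing the JVM security properties that control DNS caching, assuming you can set them before the first lookup happens. With JMeter you would normally pass these as JVM/security properties at startup (or add a DNS Cache Manager to the test plan) rather than call them from code; treat this strictly as a sketch.

    import java.security.Security;

    // Sketch only: these security properties control how long the JVM caches
    // successful and failed DNS lookups. A TTL of 0 forces a fresh lookup every
    // time, which matters when the target hostname changes under load.
    public class DisableDnsCache {
        public static void main(String[] args) {
            Security.setProperty("networkaddress.cache.ttl", "0");          // don't cache successful lookups
            Security.setProperty("networkaddress.cache.negative.ttl", "0"); // don't cache failed lookups
        }
    }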
As you are testing a multi-tenant cloud environment, the first thing you must do is get permission from Wowza. Almost all cloud applications have restrictions on the use of automation outside of their published interfaces. Your point of contact inside Wowza will work with you on your testing window and scale, and will approve your performance test plan, pacing, and think times to ensure that they are reasonable and will not impact their service to other tenants on the system.
They can also provide technical insight on how to construct your tests given some unique features/capabilities/engineering for the site. They may even be able to provide you with sample code.
As a general rule of thumb, you don't point and fire tactical nuclear software at sites you don't own, manage, control or have direct written permission from those that do have those rights.

Sending 100s of page requests at the same time

I want to test the performance of my website. I have hosted it on GoDaddy and I want to see how it performs when hundreds of users are trying to access it.
Is there a way to do the above? Is there a script that can be developed to send multiple page requests?
Thanks
Consider trying JMeter or Siege.
Apache Bench is commonly used for doing load testing (which is pretty much what you are describing). There are also a bunch of services that will do it for you (some free, most with varying costs).
You could simply script curl or wget to beat on it in parallel, but just throwing load at it isn't terribly useful if you don't also track how the site performs under that load (which is where the other tools come in).
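For illustration, here is a minimal sketch of that do-it-yourself approach in Java: fire a burst of concurrent GET requests and report the average response time. The URL and user count are placeholders, and a real test would ramp up load gradually and watch server-side metrics as well, which is exactly where the dedicated tools earn their keep.

    import java.net.URI;
    import java.net.http.HttpClient;
    import java.net.http.HttpRequest;
    import java.net.http.HttpResponse;
    import java.util.ArrayList;
    import java.util.List;
    import java.util.concurrent.ExecutorService;
    import java.util.concurrent.Executors;
    import java.util.concurrent.Future;

    public class SimpleLoad {
        public static void main(String[] args) throws Exception {
            String url = "https://example.com/";  // placeholder: a site you own
            int users = 100;                      // simulated concurrent users

            HttpClient client = HttpClient.newHttpClient();
            HttpRequest request = HttpRequest.newBuilder(URI.create(url)).GET().build();

            ExecutorService pool = Executors.newFixedThreadPool(users);
            List<Future<Long>> timings = new ArrayList<>();
            for (int i = 0; i < users; i++) {
                timings.add(pool.submit(() -> {
                    long start = System.nanoTime();
                    client.send(request, HttpResponse.BodyHandlers.discarding());
                    return (System.nanoTime() - start) / 1_000_000;  // elapsed ms
                }));
            }

            long total = 0;
            for (Future<Long> f : timings) {
                total += f.get();  // waits for each request to complete
            }
            pool.shutdown();
            System.out.println("average response time: " + (total / users) + " ms");
        }
    }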
One thing to watch out for is whether you test just the base page/application or use a real browser engine to fetch the full page (including images and other static resources).

Include static resources like images, CSS, JS, etc. in tests

I've recently started using JMeter to create load tests for my web applications. I really like the tool, and after watching some videos it was really easy to get started with creating tests.
There's however one thing that I'm not clear about.
Reading the JMeter homepage, there's a "Best Practices" section. Among other things, it says:
The most important thing to do is filter out all requests you aren't interested in. For instance, there's no point in recording image requests (JMeter can be instructed to download all images on a page - see HTTP Request ). These will just clutter your test plan.
I've seen this on other pages as well, saying that you shouldn't include requests for images or any other static resources in your tests. However, I've still not been able to find a single page that gives a good explanation as to WHY you shouldn't include static resources.
Sure, JMeter isn't a browser, but requests for static resources would no doubt affect the performance of your application? Can someone please give me a good explanation :-)
It all depends on what you are trying to test.
In general, there are two types of performance test I do with JMeter: specific tests, where I look at things that I'm worried about, and "safety net" tests, where I measure the entire application to make sure it does indeed work the way I expect it to.
Specific tests nearly always deal with the dynamic aspects of the web application - the server-side code (.aspx, .php, .jsp etc.). This is where most applications have their bottlenecks - the effort to run a server-side script is many, many times higher than the effort to retrieve a CSS file from disk and serve it up to the browser without any additional processing. If I'm testing the server-side scripts, I don't want to also load the assets - because they clutter up the tests, and consume bandwidth at the test client end. I don't want my tests to fail because my JMeter server is downloading a 5MB video file on each thread and consuming all the bandwidth, when what I'm actually trying to do is see how many logins per second the server can support.
There's very little point in testing your webserver's ability to serve static files - Microsoft, the Apache team, whoever, have already done a brilliant job at that; unless you have a very specific concern, there are better ways to spend your testing budget.
Safety net tests put the whole thing together to prove that it all really does work the way I expect it to. Usually, I run these on a production(-like) environment, so I have a CDN, production-grade hardware, and the "live" application config. I usually employ a cloud-based testing service for this, so I can see performance from different locations, and generate enough load to stress production-grade kit. You could use JMeter for this (and there are a couple of JMeter cloud services I've used in the past). It's expensive, it may require downtime, and you should only do it as a safety net.
When you want to do a proper performance test (especially a stress test), where you need to produce your application's response time as a function of the number of users/threads over time, you need to include all static resources, just as the JMeter proxy saved them when you recorded your test.
To take the browser cache into account you can use either the HTTP Cache Manager or a Once Only Controller, so that each thread only downloads the static content once, with its first request.
The HTTP Cache Manager is the recommended way to go and much easier to set up: simply include it in your test as the first child of the Thread Group.
The Once Only Controller is more commonly used when you need to log users in only on their first request.
BTW, parametrization of the non-static HTTP requests is recommended; you won't, e.g., search for the same product name or buy the same book every time. That is usually the starting point, which can give you a general idea of the performance efficiency of your app.
Hope this helps...
Unless your app is used by casual visitors who only look at one page and then leave, there is a good chance that the static resources are downloaded once and then served from the browser cache.
Moreover, although static resources affect the bandwidth and the overall response time for the user, they should have a small impact on the server load, and they might not be the kind of thing that you want to measure.
I guess you need to try mimicking what actual, real users would do with the application.

Load-testing a web application

How does one go about testing the server-side performance of a web application?
I'm working on a small web application (specifically, it will solely be responding to AJAX requests with database rows). I want to see how it performs under load. However, I cannot upload it to an internet host right away. The development environment is, however, part of the local intranet, so I can use as many machines as I want to hammer the development server, presumably using Python in conjunction with urllib2.
My question is, is such an approach really accurate for determining the high-load performance of a server-side script? Is there a better way to do this? Am I missing something here?
Have you tried a tool like Apache JMeter? http://jmeter.apache.org/

Performance testing an application for bottlenecks using production data

I have been tasked with looking for a performance testing solution for one of our Java applications running on a Weblogic server. The requirement is to record production requests (both GET and POST including POST data) and then run these requests in a performance test environment with a copy of the production database.
The reasons for using production requests instead of a test script are:
It is a large application with no existing test scripts, so it would be a large amount of work to write scripts to cover the entire application.
Some performance issues only appear when users do a number of actions in a particular order.
To test using actual user interaction with the system, not an estimation of how the users may interact with the system. We all know that users will do things we have not thought of.
I want to be able to fix performance issues and rerun the requests against the fixed code before releasing to production.
I have looked at using JMeter's Access Log Sampler with the server access logs, however the access logs do not contain POST data and the Access Log Sampler only looks at the request URL, so it cannot simulate users submitting form data.
I have also looked at using the JMeter HTTP Proxy Server, however this can record the actions of only one user and requires the user to configure their browser to use the proxy. The same limitation exists with Tsung and The Grinder.
I have looked at using Wireshark and tcpreplay, but recording at the packet level is excessive and will not give any useful reports at the request level.
Is there a better way to analyze production performance considering I need to be able to test fixes before releasing to production?
That is going to be a hard ask. I work with Visual Studio Test Edition to load test my applications, and we are only able to "estimate" the users' activity on the site.
It is possible to look at the logs and gather information on the likelihood of certain paths through your app. You can then look at the production database to find the likely values entered in any POST requests. From that you will have to build load tests that approach the usage patterns of your production site.
With any current tools I don't think it is possible to record and play back actual user interaction.
It is possible to alter your web app so that it records and logs every request and POST against session and datetime. This custom logging could then be used to generate load test requests against a test website. It would be a serious code change to your existing site and would likely have a performance impact (a rough sketch of the idea follows at the end of this answer).
That said, I have worked with web apps that do this level of logging, and the ability to analyse the exact series of page posts/requests that caused an error is quite valuable to a developer.
So in summary: it is possible, but I have not heard of any off-the-shelf tools that do it.
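For illustration, here is a rough sketch of that request-logging idea, assuming a standard Servlet container (which WebLogic provides). The class name and the use of standard output are made up for the example; a production version would also wrap the request so the POST body can be buffered, logged, and still read by the application.

    import java.io.IOException;
    import javax.servlet.Filter;
    import javax.servlet.FilterChain;
    import javax.servlet.FilterConfig;
    import javax.servlet.ServletException;
    import javax.servlet.ServletRequest;
    import javax.servlet.ServletResponse;
    import javax.servlet.http.HttpServletRequest;

    // Sketch only: logs a timestamp, session id, method, URI and query string for
    // every request so the traffic can later be replayed against a test environment.
    public class ReplayLogFilter implements Filter {

        @Override
        public void init(FilterConfig config) throws ServletException { }

        @Override
        public void doFilter(ServletRequest req, ServletResponse resp, FilterChain chain)
                throws IOException, ServletException {
            HttpServletRequest http = (HttpServletRequest) req;
            System.out.printf("%d %s %s %s %s%n",
                    System.currentTimeMillis(),
                    http.getSession(true).getId(),
                    http.getMethod(),
                    http.getRequestURI(),
                    http.getQueryString());  // null for requests without a query string
            // Capturing the POST body here would require an HttpServletRequestWrapper
            // that buffers the input stream, otherwise the application can't read it.
            chain.doFilter(req, resp);
        }

        @Override
        public void destroy() { }
    }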
Please check out this whitepaper by Impetus Technologies on this page: http://www.impetus.com/plabs/sandstorm.html
Honestly, I'm not sure the task you're being asked to do is even possible, let alone a good idea. Depending on how complex the application's backend is, and how perfectly you can recreate the state (i.e. all the way down to external SOA services or the time/clock), it may not be possible to make those GET and POST requests reproduce the same behavior.
That said, performance testing against production data is always great, but it usually requires application-specific knowledge that will stress said data. Simply repeating HTTP GETs and POSTs will almost certainly not yield useful results.
Good luck!
I would suggest the following to get the production requests and simulate an accurate workload:
1) Use Coremetrics: Coremetrics provides solutions with which you can learn the application's usage patterns. This would help in coming up with an accurate workload model. That model can then be converted into test scripts and executed against a masked copy of the production database. This will give you accurate results about the application's performance in real time.
2) Another option would be to create a small utility using AOP (aspect-oriented programming) so that it can trace all the requests and the corresponding method calls. This would help in identifying the production usage pattern and, in turn, in accurately simulating the workload. AOP frameworks such as AspectJ can be used. This would not require any changes to the code; the instrumentation can be done on the fly. Another benefit is that it can be enabled only for a specific time window and then turned off (a minimal sketch follows below).
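For illustration, a minimal annotation-style AspectJ aspect along those lines. The package in the pointcut (com.example.app.web) is a placeholder for your own web/controller layer, and you would still need to enable load-time weaving (e.g. with the aspectjweaver agent) for it to fire without touching the application code.

    import org.aspectj.lang.ProceedingJoinPoint;
    import org.aspectj.lang.annotation.Around;
    import org.aspectj.lang.annotation.Aspect;

    // Sketch only: times every public method call in the (placeholder) web layer
    // and logs it with a timestamp, giving a trace of production usage patterns.
    @Aspect
    public class RequestTraceAspect {

        @Around("execution(public * com.example.app.web..*.*(..))")
        public Object trace(ProceedingJoinPoint pjp) throws Throwable {
            long start = System.currentTimeMillis();
            try {
                return pjp.proceed();
            } finally {
                long elapsed = System.currentTimeMillis() - start;
                System.out.println(start + " " + pjp.getSignature().toLongString()
                        + " took " + elapsed + " ms");
            }
        }
    }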
