Performance load testing using JMeter and how asset files may affect the results - performance

Good Day Folks,
I have been playing around with JMeter for a load testing project. I was looking for a way to simulate full, real user behavior: loading the home page of the app, logging in, sending a predefined message to a certain user, and loading all asset files and images, not just the direct API calls.
Scripting these steps by hand would be complicated and would take a long time.
I found a Chrome extension (BlazeMeter | The Continuous Testing Platform) that helps with recording a set of actions and exporting it in JMeter format.
But it only records the direct API calls and does not load any assets or socket/WebSocket sessions.
I figured that using the Selenium WebDriver with JMeter might solve this, but it won't be a good approach if you're going to test with 10k users or more (resource limitations).
Any recommendation or workaround?

Two options:
Distributed mode with Selenium using Grid.
Cloud-based execution that can provide parallel execution with a large number of virtual users.
The first one is hectic: you need a very powerful system and plenty of resources to make it work, along with dealing with synchronization problems and result consolidation. I think the easier option is to go with Selenium and use a cloud-based system to run the script in parallel with 10k virtual users.
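For the Grid option, the driver side looks the same as a local Selenium script; you just point it at the hub. A minimal sketch in Python, assuming Selenium 4 bindings and a hub already running at a hypothetical address (the hub and app URLs below are placeholders, not from the question):

# Minimal sketch: each "virtual user" is a real browser session obtained from a
# Selenium Grid hub. The hub URL and target app URL are placeholders.
from selenium import webdriver

options = webdriver.ChromeOptions()
driver = webdriver.Remote(
    command_executor="http://grid-hub.example:4444/wd/hub",
    options=options,
)
try:
    driver.get("https://app.example.com/")  # loads the page plus all assets, like a real user
    # ... log in, send the predefined message, etc.
finally:
    driver.quit()

Keep in mind that every virtual user here costs a full browser, which is exactly the resource limitation raised in the question; that is why the 10k-user case usually ends up on a cloud service rather than a self-managed Grid.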

Related

100k HTTP Requests simultaneously

I have to test my REST API such that 100k API calls are made simultaneously (within 500 ms). Any idea how to simulate this? Which utility should I use?
I would suggest using JMeter too; it lets your test drive multiple concurrent JMeter servers. Just to be clear, you can control multiple remote JMeter engines from a single JMeter client and replicate a test across many computers, thus simulating a larger load on the target server.
To be honest, your target is quite high (100k API calls simultaneously within 500 ms), i.e. you'll need a lot of JMeter servers. When you create stress tests, there are no magical recipes, guides or manuals; trial and error is a fundamental method for solving this kind of problem.
In my experience, I first try with a few concurrent users and see how the server reacts, then increase the number of concurrent users until I reach an intolerable performance decrease or, worse, a bottleneck.
http://jmeter.apache.org/usermanual/remote-test.html
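Not JMeter itself, but a rough Python sketch of that trial-and-error ramp-up (the endpoint URL, step sizes and timeouts here are made up) just to show the shape of the approach; once a single load generator tops out, that is where the remote engines described in the manual above come in:

# Illustrative sketch only: step the number of concurrent callers up and watch how
# response times degrade. The endpoint below is a placeholder, not from the question.
import time
from concurrent.futures import ThreadPoolExecutor
from urllib.request import urlopen

URL = "http://localhost:8080/api/ping"  # hypothetical endpoint

def one_call():
    start = time.perf_counter()
    with urlopen(URL, timeout=10) as resp:
        resp.read()
    return time.perf_counter() - start

for users in (10, 50, 100, 500):  # grow the concurrency step by step
    with ThreadPoolExecutor(max_workers=users) as pool:
        latencies = list(pool.map(lambda _: one_call(), range(users)))
    print(f"{users:>4} concurrent calls: avg {sum(latencies) / len(latencies):.3f}s, "
          f"max {max(latencies):.3f}s")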
You will obviously need a load testing tool that can be run in distributed mode, i.e. one controller and X load generators executing the same test.
Grinder - scripts are written in a Python dialect (Jython); see the sketch after this list
Apache JMeter - doesn't require any specific programming knowledge; you can create tests using a simple GUI
Tsung - written in Erlang, known for its capability to produce high loads even on low-end hardware
See the Open Source Load Testing Tools: Which One Should You Use? article for more information on the above tools.
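For reference, a Grinder 3 script is just a Jython class with a __call__ method. A minimal sketch, assuming the Grinder runtime supplies the net.grinder.* packages and with the target URL as a placeholder:

# Minimal Grinder 3 script sketch (Jython). Grinder provides the net.grinder.* packages
# on its classpath at runtime; the URL below is a placeholder.
from net.grinder.script import Test
from net.grinder.plugin.http import HTTPRequest

test = Test(1, "Home page")
request = test.wrap(HTTPRequest())  # wrapping lets Grinder record timings for this request

class TestRunner:
    def __call__(self):
        request.GET("http://localhost:8080/")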
JMeter
The Apache JMeter™ application is open source software, a 100% pure
Java application designed to load test functional behavior and measure
performance. It was originally designed for testing Web Applications
but has since expanded to other test functions.

How can we run performance testing manually for any webpage?

I am not able to find anywhere how we can do performance testing manually.
Please help me out with this query.
Thanks!
Maybe you are looking for JMeter or a similar tool.
What browser? Most current browsers support the W3C Navigation Timing spec and expose performance data directly on the DOM. You can access it from the console, from JavaScript on your pages, or from browser extensions that display the information.
If you want more detail like a resource load waterfall then you can usually access that directly from the dev tools provided by the various browsers.
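If you would rather collect those Navigation Timing numbers programmatically than read them off the console, one option is to pull them out through WebDriver. A sketch assuming Selenium's Python bindings and a locally installed chromedriver; the URL is a placeholder:

# Sketch: read W3C Navigation Timing data through Selenium instead of the console.
from selenium import webdriver

driver = webdriver.Chrome()
driver.get("https://example.com/")
timing = driver.execute_script("return window.performance.timing.toJSON()")
ttfb_ms = timing["responseStart"] - timing["navigationStart"]
load_ms = timing["loadEventEnd"] - timing["navigationStart"]
print(f"time to first byte: {ttfb_ms} ms, full page load: {load_ms} ms")
driver.quit()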
One thing you will want to be really careful of is to make sure you do your testing in a configuration that is similar to your users'. If you are running a server locally and testing from a browser on the same machine, or even the same network, then your performance data will be pretty worthless (unless it's an intranet app).
You can perform manual (performance) testing for any webpage by optimizing your CSS, JavaScript and image sizes.
I think JMeter is the best tool for this kind of webpage testing; if you want to add some scripting, you can do that too.
You can also check the YSlow add-on for Firefox. This add-on gives you filtered data to help optimize your page performance.
There are also some online tools available.
You can simply use the GTmetrix tool, which will report on your site's overall performance in detail.
The best way to do performance testing without any tool is to define a standard loading time for each page based on experience, or to ask the client to provide an ideal time for each page, against which the actual loading time can be verified. But for multiple simultaneous users, JMeter is the best hands-on approach available. It's open source, easy to understand, and you get reports too.
Of course, there are multiple factors that can hinder performance:
Your network speed
The speed of the server on which your application is hosted
The number of simultaneous users
Heavy images in the pages
Last but not least, unnecessary links and code: in short, memory consumption in the code, such as loops that are not required. All gifts from the development team!

Sending 100s of page requests at the same time

I want to test the performance of my website. I have hosted it on GoDaddy and I want to see how it performs when hundreds of users are trying to access it.
Is there a way to do this? Is there a script that can be developed to send multiple page requests?
Thanks
Consider trying JMeter or Siege.
Apache Bench is commonly used for doing load testing (which is pretty much what you are describing). There are also a bunch of services that will do it for you (some free, most with varying costs).
You could simply script curl or wget to hammer it in parallel, but just throwing load at it isn't terribly useful if you don't also track how the site performs under that load (which is where the other tools come in).
One thing to watch out for is whether you test just the base page/application or use a real browser engine to test the full page (including images and static resources).
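To make that difference concrete, here is a rough sketch (plain Python, not a real browser engine; the base URL is a placeholder) that times the base HTML alone and then the HTML plus the static resources it references:

# Sketch: compare "base page only" timing with "page plus referenced assets" timing.
import time
from html.parser import HTMLParser
from urllib.parse import urljoin
from urllib.request import urlopen

BASE = "http://localhost:8080/"  # hypothetical site under test

class AssetCollector(HTMLParser):
    def __init__(self):
        super().__init__()
        self.assets = []
    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag in ("img", "script") and attrs.get("src"):
            self.assets.append(attrs["src"])
        elif tag == "link" and attrs.get("href"):
            self.assets.append(attrs["href"])

start = time.perf_counter()
html = urlopen(BASE, timeout=10).read().decode("utf-8", errors="replace")
base_time = time.perf_counter() - start

collector = AssetCollector()
collector.feed(html)
for ref in collector.assets:
    urlopen(urljoin(BASE, ref), timeout=10).read()
full_time = time.perf_counter() - start

print(f"base page: {base_time:.3f}s, page + {len(collector.assets)} assets: {full_time:.3f}s")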

Performance testing application for bottle necks using production data

I have been tasked with looking for a performance testing solution for one of our Java applications running on a WebLogic server. The requirement is to record production requests (both GET and POST, including POST data) and then run these requests in a performance test environment with a copy of the production database.
The reasons for using production requests instead of a test script are:
It is a large application with no existing test scripts, so it would be a large amount of work to write scripts covering the entire application.
Some performance issues only appear when users do a number of actions in a particular order.
To test using actual user interaction with the system, not an estimate of how users may interact with it. We all know that users will do things we have not thought of.
I want to be able to fix performance issues and rerun the requests against the fixed code before releasing to production.
I have looked at using JMeter's Access Log Sampler with server access logs; however, the access logs do not contain POST data, and the Access Log Sampler only looks at the request URL, so it cannot simulate users submitting form data.
I have also looked at using the JMeter HTTP Proxy Server; however, this can record the actions of only one user and requires that user to configure their browser to use the proxy. The same limitation exists with Tsung and The Grinder.
I have looked at using Wireshark and TCReplay but recording at the packet level is excessive and will not give any useful reports at a request level.
Is there a better way to analyze production performance considering I need to be able to test fixes before releasing to production?
That is going to be a hard ask. I work with Visual Studio Test Edition to load test my applications, and we are only able to "estimate" the users' activity on the site.
It is possible to look at the logs and gather information on the likelihood of certain paths through your app. You can then look at the production database to find the likely values entered in any POST requests. From that you will have to build load tests that approximate the usage patterns of your production site.
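As a sketch of that log-mining step (assuming a common/combined-format access log; the file name is a placeholder), counting request frequencies gives you the weights for the load test:

# Sketch: count how often each method/path is hit so the load test can weight its
# requests the same way. Assumes combined log format; "access.log" is a placeholder.
from collections import Counter

paths = Counter()
with open("access.log") as log:
    for line in log:
        try:
            request = line.split('"')[1]        # e.g. 'GET /some/path HTTP/1.1'
            method, path, _ = request.split()
        except (IndexError, ValueError):
            continue                             # skip malformed lines
        paths[(method, path)] += 1

total = sum(paths.values())
for (method, path), count in paths.most_common(10):
    print(f"{count / total:6.2%}  {method} {path}")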
With any current tools, I don't think it is possible to record and play back actual user interaction.
It is possible to alter your web app so that it records and logs every request and POST against session and datetime. This custom logging could then be used to generate load test requests against a test website. This would be a serious code change to your existing site and would likely have performance impacts.
That said, I have worked with web apps that do this level of logging and the ability to analyse the exact series of page posts/requests that caused an error is quite valuable to a developer.
So in summary: It is possible, but I have not heard of any off the shelf tools that do it.
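For the custom-logging idea above, the app in the question is Java on WebLogic, where this would be a servlet filter; purely to illustrate the shape of it, here is a hedged Python WSGI middleware sketch that records each request with its POST body, session cookie and timestamp so it could be replayed later:

# Sketch only: wrap a WSGI app and append every request to a tab-separated log.
import io
import time

class RequestRecorder:
    def __init__(self, app, log_path="requests.log"):
        self.app = app
        self.log_path = log_path

    def __call__(self, environ, start_response):
        body = b""
        length = int(environ.get("CONTENT_LENGTH") or 0)
        if length:
            body = environ["wsgi.input"].read(length)
            environ["wsgi.input"] = io.BytesIO(body)  # hand the body back to the app
        with open(self.log_path, "a") as log:
            log.write("\t".join([
                time.strftime("%Y-%m-%dT%H:%M:%S"),
                environ.get("HTTP_COOKIE", "-"),      # the session id lives in the cookie
                environ.get("REQUEST_METHOD", "GET"),
                environ.get("PATH_INFO", "/"),
                body.decode("utf-8", errors="replace"),
            ]) + "\n")
        return self.app(environ, start_response)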
Please check out this whitepaper by Impetus Technologies: http://www.impetus.com/plabs/sandstorm.html
Honestly, I'm not sure the task you're being asked to do is even possible, let alone a good idea. Depending on how complex the application's backend is, and how perfectly you can recreate its state (i.e. all the way down to external SOA services or the time/clock), it may not be possible to make those GET and POST requests reproduce the same behavior.
That said, performance testing against production data is always great, but it usually requires application-specific knowledge that will stress said data. Simply repeating HTTP GETs and POSTs will almost certainly not yield useful results.
Good luck!
I would suggest the following to capture the production requests and simulate an accurate workload:
1) Use Coremetrics: Coremetrics provides solutions that let you understand the application's usage patterns. This helps in coming up with an accurate workload model, which can then be converted into test scripts and executed against a masked copy of the production database. This will give you accurate results about the application's performance.
2) Another option is to create a small utility using AOP (aspect-oriented programming) so that it can trace all the requests and the corresponding method executions. This helps in identifying the production usage pattern and, in turn, in simulating the workload accurately. AOP frameworks such as AspectJ can be used. This does not require any changes to the code; the instrumentation can be done on the fly. Another benefit is that it can be enabled only for a specific time window and then turned off.
Regards,
batterywalam

performance testing tool

We are using Watir integrated with VS 2008 via Ruby In Steel, and we have automated our web application; it is awesome.
Is there a way to use the same scripts to do performance testing, or is there a better tool?
It's hard to tell if you want something that analyzes the performance of your website (ie: profiler) or a load/stress testing tool. I'm going to assume you want a load testing tool and not a profiler, given that you're talking about script reuse.
All load testing tools, except for one (disclaimer: my company is that one), work by recording HTTP traffic and then replaying it. The script is very different from a functional testing script like one you'd have for Watir.
You can either record the HTTP traffic generated by your Watir script or try to run your functional tests directly.
If you're also using FireWatir, you can use Firebug, which is an excellent web developer tool and shows you the recorded traffic for each page. If you're using IE primarily, check out HttpWatch. It's commercial, but provides great network timings for IE and can export to various data formats. Alternatively, many load testing tools provide a proxy that can record traffic and generate a load script for you.
Once you've got the network data, you can likely turn it into a script that Pylot, Grinder, JMeter, etc. can understand fairly quickly. The problem with this method is that you need to re-record your script whenever any part of the site or the test changes. And if your app is anything more than basic HTML (e.g. Ajax, .NET ViewState, etc.), then you may have to use some advanced features of your load testing tool. See my article on Ajax load testing for more info.
Shameless plug: if you were using Selenium (or were willing to convert a couple Watir scripts to Selenium scripts), which is another open source functional testing tool, you could use BrowserMob, which provides a load testing service that uses real browsers to play back the load and functional test scripts (Selenium) to drive them. It uses a lot more resources, but thanks to cloud computing the price point is still very low.
There's rawk that you could run over the log files. This gives a pretty comprehensive summary of what's taking so long.
Alternatively there's NewRelic which provides monitoring for your rails app and gives you a detailed breakdown of what every request is doing.
And finally there's FiveRuns which does things very similar to NewRelic.
Have a look at LoadWise; you can reuse existing functional test scripts for performance testing.
With the same test scripts, without change, you can either preview them in Firefox (via FireWatir) or hit your web site with X number of virtual users (via Celerity).
http://testwisely.com/en/loadwise
If you are truly talking just about 'performance', then you could alter the scripts to start capturing and recording the page load times. Any time you do some action that causes a page to load (like navigating to a link), Watir returns the time to load the page.
You just need to have the scripts implement some kind of simple logging method to record the time to load each page, and then alter the scripts so that the return value is captured, à la:
loadtime = browser.goto(someurl)  # Watir returns the page load time (in seconds) from goto
perflogger(someurl, loadtime)     # your own simple logging method described above
