Request simulation plugin for Grails - performance

I have been searching for a Grails plugin that can be used to simulate requests. I have a Jersey service that calls some helper methods, and I'm trying to check whether any of these methods is a bottleneck. I found the JavaMelody Grails Plugin, but that tool does not simulate requests; rather, it measures the actual traffic.
Does anybody know of a free profiler that can drill down to the helper-method level?

A combination of a load simulator, like JMeter, and a profiling tool is your best bet for the sort of information I think you are after. Though I try not to promote proprietary solutions, New Relic will, I think, give you the kind of real-time profiling you are looking for.

JMeter can generate request load. You can also use Badboy, which lets you record a session and then replay it in threads, or optionally export it as a JMeter file so it can be used for more complex request generation.
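If you only need a quick way to fire simulated requests at the Jersey service before reaching for JMeter or Badboy, a few lines of plain Java can do it. This is just a minimal sketch, assuming Java 11+ and a hypothetical endpoint URL; the thread and request counts are arbitrary.

import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;

public class SimpleLoadSketch {
    public static void main(String[] args) {
        HttpClient client = HttpClient.newHttpClient();
        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create("http://localhost:8080/myservice/resource")) // hypothetical endpoint
                .GET()
                .build();

        ExecutorService pool = Executors.newFixedThreadPool(20); // 20 concurrent "users"
        for (int i = 0; i < 200; i++) {                          // 200 requests in total
            pool.submit(() -> {
                long start = System.currentTimeMillis();
                HttpResponse<String> response =
                        client.send(request, HttpResponse.BodyHandlers.ofString());
                System.out.println(response.statusCode() + " in "
                        + (System.currentTimeMillis() - start) + " ms");
                return null;
            });
        }
        pool.shutdown();
    }
}

This only shows response times from the client side; pairing it with a profiler (or New Relic, as suggested above) is what will tell you which helper method is the bottleneck.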

Related

Trying to Performance test an application developed in OJET technology. Which tool/protocol should I use for scripting?

I am trying to performance test an application developed with OJET. I tried the HTTP/Web protocol with JMeter and LoadRunner, but that doesn't capture all the requests and responses at the JavaScript/browser level, so I am facing issues correlating the dynamic values during test design, and the scripts fail during replay. I am currently trying the TruClient Web protocol as an alternative, but I need to know which tool/protocol I should use for scripting.
Looking at OJET, it appears to be a web app generator.
If you choose to start with JMeter, use a post-processor such as the Regular Expression Extractor to capture and save every value that is needed as an argument in the next request (see the sketch after this answer).
Don't be afraid of these dynamic values. Try to follow the articles below to get the idea.
No tool will give you automatic correlation without issues, neither LoadRunner nor JMeter. It is always tricky.
Ask more specific questions when you start facing issues.
JMeter catch correlations
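To illustrate what a correlation boils down to, here is a minimal sketch in plain Java rather than JMeter itself; the token name, markup and regex are invented for the example. In JMeter the equivalent would be a Regular Expression Extractor post-processor saving the captured group into a variable that the next sampler references.

import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class CorrelationSketch {
    public static void main(String[] args) {
        // Pretend this is the body of the previous response (hypothetical token).
        String previousResponse = "<input type=\"hidden\" name=\"csrfToken\" value=\"abc123\"/>";

        // Same idea as a Regular Expression Extractor: capture the dynamic value...
        Pattern p = Pattern.compile("name=\"csrfToken\" value=\"([^\"]+)\"");
        Matcher m = p.matcher(previousResponse);
        if (m.find()) {
            String token = m.group(1);
            // ...and feed it into the next request as a parameter.
            String nextRequestBody = "action=save&csrfToken=" + token;
            System.out.println(nextRequestBody);
        }
    }
}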
You need to simulate a real user using your application with 100% accuracy in terms of network footprint.
Neither JMeter nor LoadRunner is capable of executing client-side JavaScript, so the options are:
Implement these JavaScript-driven network calls using scripting (in JMeter this would be JSR223 Test Elements)
Use a real browser. LoadRunner's TruClient protocol is basically a headless web browser; JMeter can be integrated with the Selenium browser automation framework via the WebDriver Sampler (see the sketch below)
With regards to "which protocol/tool" to use:
Implementing the JavaScript calls manually takes extra effort, but your test will consume fewer resources (CPU, RAM, etc.)
Using real browsers takes less effort, but the test will consume far more resources (something like 1 CPU core and 2 GB of RAM per user/browser instance) and you won't have metrics like connect time, latency, etc.
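As a rough illustration of the "real browser" option, here is a minimal Selenium WebDriver sketch in Java. It is plain Selenium rather than the JMeter WebDriver Sampler (which wraps the same idea); the URL is hypothetical and a chromedriver binary is assumed to be on the PATH.

import org.openqa.selenium.WebDriver;
import org.openqa.selenium.chrome.ChromeDriver;

public class BrowserTimingSketch {
    public static void main(String[] args) {
        WebDriver driver = new ChromeDriver();   // one real browser ~= one virtual user
        try {
            long start = System.currentTimeMillis();
            driver.get("https://your-app.example.com/login");   // hypothetical page
            long elapsed = System.currentTimeMillis() - start;
            System.out.println("Page load took " + elapsed + " ms");
        } finally {
            driver.quit();   // always release the browser
        }
    }
}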
LoadRunner TruClient. This will handle all of the JavaScript execution and the dynamic elements related to session, state, date/time, object identifiers, and so on. You will still need to handle user input items appropriately.

How to create a performance testing framework in JMeter?

For functional automation we usually create a framework that is reusable for automating applications. Is there any way to create a performance testing framework in JMeter, so that we can use the same framework for performance testing of different applications?
Please help if anyone knows, and provide more information about it.
You can consider JMeter itself a "framework": it already comes with test elements for building requests over different protocols/transports, applying assertions, generating reports, etc.
It is highly unlikely that you will be able to re-use an existing script for another application, as JMeter acts at the protocol level and the requests will therefore differ from application to application.
JMeter does have a mechanism for re-using pieces of a test plan as modules so you won't have to duplicate your work; check out Test Fragments and the Module Controller. However, this is more applicable within a single application.
The only "framework-like" approach I can think of is adding your JMeter tests to your continuous integration process, so you have a build step that executes the performance tests and publishes the reports. That way you re-use the same test setup and reporting routine, and the only thing that changes from application to application is the .jmx test script(s). See the JMeter Maven Plugin and/or the JMeter Ant Task for more details.
You must first ask yourself: how dynamic is the conversation I am attempting to replicate? If you have a very stable services API where the exposed external interface is static, but the back-end code handling it is changing, then you have a good shot at building something with a long life.
But if you are like the majority of web sites in the universe, then you are dealing with developers who are always changing something: adding a resource, adding or deleting form values (hidden or not), headers, etc. In this case you should consider your scripts perishable, with a limited life, and you will need to rebuild them at some point.
Having noted the limited lifetime of a piece of code that tests a piece of code with a limited lifetime, are there techniques you can use to insulate yourself? Yes. The rule of thumb is that the higher up the stack you build your test scripts, the more insulated you are from changes under the covers (assuming the layer you build to is stable). The trade-off is that with more of the intelligence under the covers of your test interface comes a higher resource cost per virtual user, which dictates more hosts for test execution and more skew from client-side code that can distort your view of what is coming from the server. As an example, run a Selenium script instead of a base JMeter script: a browser is invoked, you get the benefit of all the local JavaScript processing to handle the dynamic changes, and your script has a longer life.

performance testing tool with record & use function

I am looking for a performance tool that has a record-and-use function; I have looked into JMeter. Is there any other tool that has a record option too?
I understand there will still be manual editing even if we record and play back, but at least I don't have to enter the URLs manually. I have used JMeter and ManageEngine QEngine too.
I also tried converting my Selenium scripts for JMeter performance testing, but I am not able to capture every URL's request/response.
Requirements:
HTTP/HTTPS
record and playback - I can write scripts
option to substitute parameters from responses etc. (which are dynamic)
You may try Silk Performer or HP LoadRunner, but both tools are expensive to buy. Open-source alternatives include The Grinder.

Performance testing an application for bottlenecks using production data

I have been tasked with looking for a performance testing solution for one of our Java applications running on a WebLogic server. The requirement is to record production requests (both GET and POST, including POST data) and then run these requests in a performance test environment against a copy of the production database.
The reasons for using production requests instead of a test script are:
It is a large application with no existing test scripts, so it would be a large amount of work to write scripts covering the entire application.
Some performance issues only appear when users do a number of actions in a particular order.
To test using actual user interaction with the system, not an estimate of how users may interact with it. We all know that users will do things we have not thought of.
I want to be able to fix performance issues and rerun the requests against the fixed code before releasing to production.
I have looked at using JMeter's Access Log Sampler with the server access logs; however, the access logs do not contain POST data, and the Access Log Sampler only looks at the request URL, so it cannot simulate users submitting form data.
I have also looked at using the JMeter HTTP Proxy Server; however, this can only record the actions of one user and requires the user to configure their browser to use the proxy. The same limitation exists with Tsung and The Grinder.
I have looked at using Wireshark and Tcpreplay, but recording at the packet level is excessive and will not give any useful reports at the request level.
Is there a better way to analyze production performance considering I need to be able to test fixes before releasing to production?
That is going to be a hard ask. I work with Visual Studio Test Edition to load test my applications, and we are only able to "estimate" user activity on the site.
It is possible to look at the logs and gather information on the likelihood of certain paths through your app. You can then look at the production database to see the likely values entered in any POST requests. From that you will have to build load tests that approximate the usage patterns of your production site.
With any current tools I don't think it is possible to record and play back actual user interaction.
It is possible to alter your web app so that it records and logs every request and POST against session and date/time (see the sketch after this answer). This custom logging could then be used to generate load-test requests against a test site. This would be a serious code change to your existing site and would likely have performance impacts.
That said, I have worked with web apps that do this level of logging, and the ability to analyse the exact series of page requests/posts that caused an error is quite valuable to a developer.
So in summary: it is possible, but I have not heard of any off-the-shelf tools that do it.
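As a sketch of the custom request logging mentioned above, a servlet Filter along these lines could capture each request against the session and timestamp. The class name and log format are made up for illustration; capturing POST bodies would additionally require wrapping the request, which adds overhead.

import java.io.IOException;
import javax.servlet.Filter;
import javax.servlet.FilterChain;
import javax.servlet.FilterConfig;
import javax.servlet.ServletException;
import javax.servlet.ServletRequest;
import javax.servlet.ServletResponse;
import javax.servlet.http.HttpServletRequest;

// Hypothetical filter: logs timestamp, method, URI, query string and session id per request.
public class RequestLogFilter implements Filter {

    public void init(FilterConfig config) throws ServletException { }

    public void doFilter(ServletRequest req, ServletResponse res, FilterChain chain)
            throws IOException, ServletException {
        HttpServletRequest http = (HttpServletRequest) req;
        System.out.println(new java.util.Date() + " "
                + http.getMethod() + " " + http.getRequestURI()
                + (http.getQueryString() != null ? "?" + http.getQueryString() : "")
                + " session=" + http.getSession(true).getId());
        chain.doFilter(req, res);   // continue normal request processing
    }

    public void destroy() { }
}

The resulting log could later be replayed against the test environment, which is exactly the "serious code change" trade-off described above.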
Please check out this whitepaper by Impetus Technologies: http://www.impetus.com/plabs/sandstorm.html
Honestly, I'm not sure the task you're being asked to do is even possible, let alone a good idea. Depending on how complex the application's back end is, and how perfectly you can recreate the state (i.e. all the way down to external SOA services or the time/clock), it may not be possible to make those GET and POST requests reproduce the same behavior.
That said, performance testing against production data is always great, but it usually requires application-specific knowledge of how to stress that data. Simply repeating HTTP GETs and POSTs will almost certainly not yield useful results.
Good luck!
I would suggest the following to get the production requests and simulate an accurate workload:
1) Use Coremetrics: Coremetrics provides solutions that let you understand the application's usage patterns. This helps in coming up with an accurate workload model, which can then be converted into test scripts and executed against a masked copy of the production database. This will give you accurate results about the application's performance in real time.
2) Another option would be to create a small utility using AOP (aspect-oriented programming) to trace all the requests and the corresponding method calls. This would help in identifying the production usage pattern and, in turn, accurately simulating the workload. AOP frameworks such as AspectJ can be used; this does not require any changes to the code, as the instrumentation can be done on the fly. The other benefit is that it can be enabled only for a specific time window and then turned off (see the sketch after this answer).
Regards,
batterywalam
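For option 2 above, a minimal AspectJ sketch might look like the following. The pointcut package is hypothetical, and the aspect would typically be applied with load-time weaving so that no application code changes are needed.

import org.aspectj.lang.ProceedingJoinPoint;
import org.aspectj.lang.annotation.Around;
import org.aspectj.lang.annotation.Aspect;

@Aspect
public class MethodTimingAspect {

    // Times every public method in the (hypothetical) service package and its subpackages.
    @Around("execution(public * com.example.myapp.service..*.*(..))")
    public Object timeMethod(ProceedingJoinPoint pjp) throws Throwable {
        long start = System.nanoTime();
        try {
            return pjp.proceed();   // run the intercepted method
        } finally {
            long elapsedMs = (System.nanoTime() - start) / 1_000_000;
            System.out.println(pjp.getSignature().toShortString() + " took " + elapsedMs + " ms");
        }
    }
}

Writing the timings to a dedicated log (rather than stdout) would make it easier to correlate them with the request log and derive the workload model.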

performance testing tool

We are using Watir integrated with VS 2008 via Ruby In Steel, and we have automated our web application; it's awesome.
Is there a way to use the same scripts for performance testing, or is there a better tool?
It's hard to tell whether you want something that analyzes the performance of your website (i.e. a profiler) or a load/stress testing tool. I'm going to assume you want a load testing tool and not a profiler, given that you're talking about script reuse.
All load testing tools, except for one (disclaimer: my company is that one), work by recording HTTP traffic and then replaying it. The resulting script is very different from a functional testing script like the one you'd have for Watir.
You can either record the HTTP traffic generated by your Watir script or try to run your functional tests directly.
If you're also using FireWatir, you can use Firebug, which is an excellent web developer tool and shows you the recorded traffic for each page. If you're using IE primarily, check out HttpWatch. It's commercial, but provides great network timings for IE and can export to various data formats. Alternatively, many load testing tools provide a proxy that can record traffic and generate a load script for you.
Once you've got the network data, you can likely turn it into a script that Pylot, The Grinder, JMeter, etc. can understand fairly quickly. The problem with this method is that you need to re-record your script whenever any part of the site or the test changes. And if your app is anything more than basic HTML (i.e. Ajax, .NET ViewState, etc.), then you may have to use some advanced features of your load testing tool. See my article on Ajax load testing for more info.
Shameless plug: if you were using Selenium (or were willing to convert a couple of Watir scripts to Selenium scripts), which is another open-source functional testing tool, you could use BrowserMob, which provides a load testing service that uses real browsers to play back the load, driven by functional test scripts (Selenium). It uses a lot more resources, but thanks to cloud computing the price point is still very low.
There's rawk, which you could run over the log files. This gives a pretty comprehensive summary of what's taking so long.
Alternatively, there's New Relic, which provides monitoring for your Rails app and gives you a detailed breakdown of what every request is doing.
And finally there's FiveRuns, which does things very similar to New Relic.
Have a look at LoadWise; you can reuse existing functional test scripts for performance testing.
With the same test scripts, unchanged, you can either preview them in Firefox (via FireWatir) or hit your web site with X virtual users (via Celerity).
http://testwisely.com/en/loadwise
If you are truly talking just 'performance', then you could alter the scripts to start capturing and recording the page load times. Any time you perform an action that causes a page to load (like navigating to a link), Watir returns the time it took to load the page.
You just need to have the scripts implement some kind of simple logging method to record the time to load each page, and then alter the scripts so that the return value is captured, a la:
loadtime = browser.goto(someurl)  # Watir's goto returns the page load time
perflogger(someurl, loadtime)     # your own simple logging method
