One scriptlet instance for multiple reports - performance

For multiple reports, I have a scriptlet that calls many web services. The calls are made in the afterReportInit() method to fill a couple of HashMaps, which in turn are used in the reports. The problem is that the web service calls are performed on every report generation, which results in very low performance.
I'm looking for a workaround so that those web services are called only once for all reports.
Note that the reports are deployed on JasperServer.

I'm looking into this. So far, what I've found is that JasperReports has a custom class loader, JRClassLoader, that would need to be extended:
http://grepcode.com/file/repo1.maven.org/maven2/net.sf.jasperreports/jasperreports/4.1.2/net/sf/jasperreports/engine/util/JRClassLoader.java#JRClassLoader.%3Cinit%3E%28java.lang.ClassLoader%29
I personally wouldn't bother with the class loading mechanism and would instead try caching the results of the web service calls, using memcache if possible... Depending on your setup, Apache could probably do this as well.
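As a hedged sketch of the caching idea (not from the original answer, and simpler than memcache): the scriptlet itself can hold the results in a static, thread-safe map, so the web services are called at most once per JVM. Class and method names below are illustrative. Note that on JasperServer a per-report-unit class loader can defeat a static cache, which is exactly why the JRClassLoader remark above matters.

```java
// Minimal sketch: cache the web service results inside the scriptlet so the
// remote calls happen once per JVM instead of on every report fill.
// Class, method and key names are illustrative, not from the question.
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

import net.sf.jasperreports.engine.JRDefaultScriptlet;
import net.sf.jasperreports.engine.JRScriptletException;

public class CachingScriptlet extends JRDefaultScriptlet {

    // Shared across all report fills served by this JVM / class loader.
    private static final Map<String, Map<String, String>> CACHE = new ConcurrentHashMap<>();

    @Override
    public void afterReportInit() throws JRScriptletException {
        // computeIfAbsent guarantees the web services are called at most once per key.
        Map<String, String> lookups = CACHE.computeIfAbsent("lookups", key -> callWebServices());
        // ... expose 'lookups' to the report, e.g. via setVariableValue(...) or a parameter map
    }

    private Map<String, String> callWebServices() {
        // hypothetical: the expensive remote calls that used to run on every fill
        return new ConcurrentHashMap<>();
    }
}
```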

Related

ResolveRequestCache state takes too much time

I have a C# MVC application hosted in an IIS test environment, with only one action method in an APIController. Clients call this single method and, depending on the parameters, different small processes are performed.
I am using IIS 10.0.17763. The application is built on .NET Framework 4.6.
I have disabled these modules, as I don't need them:
WebDAVModule
WindowsAuthentication
ScriptModule-4.0
DefaultAuthentication
ServiceModel-4.0
UrlAuthorization
FileAuthorization
The problem is that under a load test from JMeter, all calls somehow stay too long in the ResolveRequestCache state.
Can someone explain what might be causing this, or suggest something to check? I am not using any kind of caching, due to a business requirement.
Here is a screenshot of the request states from IIS.
Edit: I have removed some other modules too, to check the effect.
Here is the list of loaded modules in my application.

How to create a performance testing framework in JMeter?

For functional automation, we usually create a framework that can be reused for automating applications. Is there any way to create a performance testing framework in JMeter, so that we can use the same framework for performance testing of different applications?
Please help if anyone knows, and provide more information about it.
You can consider JMeter itself a "framework": it already comes with test elements for building requests over different protocols/transports, applying assertions, generating reports, etc.
It is highly unlikely you will be able to re-use an existing script for another application, as JMeter acts at the protocol level, so there will be different requests for different applications.
There is a mechanism in JMeter that allows re-using pieces of a test plan as modules, so you won't have to duplicate your code; check out Test Fragments and the Module Controller. However, it is more applicable to a single application.
The only "framework-like" approach I can think of is adding your JMeter tests to a continuous integration process, so you have a build step that executes the performance tests and publishes reports. Basically, you can re-use the same test setup and reporting routine, and the only thing that changes from application to application is the .jmx test script(s). See the JMeter Maven Plugin and/or the JMeter Ant Task for more details; a rough sketch of the same idea driven from plain Java follows below.
You must first ask yourself how dynamic the conversation you are attempting to replicate is. If you have a very stable services API where the exposed external interface is static, but the code that handles it on the back end is changing, then you have a good shot at building something with a long life.
But if you are like the majority of web sites in the universe, then you are dealing with developers who are always changing something: adding a resource, adding or deleting form values (hidden or not), headers, etc. In this case you should consider your scripts perishable, with a limited life, and expect to rebuild them at some point.
Having noted the limited lifetime of a piece of code that tests a piece of code with a limited lifetime, are there techniques you can use to insulate yourself? Yes. The rule of thumb is that the higher up the stack you go to build your test scripts, the more insulated you are from changes under the covers (assuming the layer you build to is stable). The trade-off is that the more intelligence sits under the covers of your test interface, the higher the resource cost for each individual virtual user, which in turn dictates more hosts for test execution and more skew from client-side code that can distort your view of what is coming from the server. As an example, run a Selenium script instead of a base JMeter script (see the sketch below): a browser is invoked, you get the benefit of all the local JavaScript processing to handle the dynamic changes, and your script has a longer life.
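To make the "higher up the stack" point concrete, here is a minimal Selenium WebDriver journey in Java; the URL, element locators and credentials are hypothetical, and a ChromeDriver binary on the PATH is assumed.

```java
// Minimal Selenium journey driving a real browser instead of raw HTTP requests.
import org.openqa.selenium.By;
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.chrome.ChromeDriver;

public class LoginJourney {
    public static void main(String[] args) {
        WebDriver driver = new ChromeDriver();      // real browser, so the page's JS runs locally
        try {
            driver.get("https://app.example.com/login");                       // hypothetical URL
            driver.findElement(By.name("username")).sendKeys("perf-user");
            driver.findElement(By.name("password")).sendKeys("secret");
            driver.findElement(By.cssSelector("button[type=submit]")).click();
            // Because the page's own JavaScript fills in hidden fields, tokens, etc.,
            // this script is less sensitive to back-end churn than a protocol-level script,
            // at the cost of far more resources per virtual user.
        } finally {
            driver.quit();
        }
    }
}
```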

How to load test a web application developed on ASP.NET MVC 3?

I want to load test the web application that we're working on. Can you name some automated load testing tools for a website developed on ASP.NET MVC 3? I would like to test it with 100 concurrent users, 200 users, and so on. We want to test it with many users and measure the load that creates on the application and server.
BTW, we're also running a profiler at the same time to find the application bottlenecks, so that we can identify slow code that we can improve.
There are a number of different options; they vary in all kinds of exciting ways.
I use the open source Apache JMeter for this kind of testing - it's not hugely user friendly, but is very powerful once you get used to it, and the lack of licensing restrictions means you can use it in all sorts of configurations.
Some of our projects have glued JMeter into the continuous integration cycle, running performance tests on nightly builds. Some projects need to scale to huge numbers of users, and we use JMeter in the cloud (there are some service providers who can do this too).
It works nicely with ASP.NET MVC apps.
We are currently load testing our MVC application and the external company uses a product called LoadRunner.
However, depending on how intricate your testing is, you could use the WebClient class as a base to run your own volume tests.
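That answer refers to .NET's WebClient; purely to illustrate the same "roll your own volume test" idea, here is a rough sketch in Java using java.net.http.HttpClient. The target URL and the user/request counts are arbitrary placeholders, and this is no substitute for a real load testing tool.

```java
// Rough home-grown volume test: N concurrent "users" each firing M GET requests,
// then reporting total elapsed time.
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;

public class SimpleVolumeTest {
    public static void main(String[] args) throws Exception {
        int users = 100;                                               // arbitrary concurrency
        int requestsPerUser = 50;                                      // arbitrary request count
        URI target = URI.create("https://app.example.com/");           // placeholder URL

        HttpClient client = HttpClient.newHttpClient();
        ExecutorService pool = Executors.newFixedThreadPool(users);
        long start = System.nanoTime();

        for (int u = 0; u < users; u++) {
            pool.submit(() -> {
                for (int r = 0; r < requestsPerUser; r++) {
                    try {
                        HttpRequest req = HttpRequest.newBuilder(target).GET().build();
                        client.send(req, HttpResponse.BodyHandlers.discarding());
                    } catch (Exception e) {
                        System.err.println("request failed: " + e.getMessage());
                    }
                }
            });
        }
        pool.shutdown();
        pool.awaitTermination(10, TimeUnit.MINUTES);
        System.out.printf("done in %d ms%n", (System.nanoTime() - start) / 1_000_000);
    }
}
```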
Web Performance Load Tester works very well with .NET apps. We test a lot of them (disclaimer: I work for Web Performance and am closely involved with the product).
We have a load testing tool called StresStimulus that is integrated with Fiddler.

Web service vs. class file - performance

I am trying to figure out the best way to go about this: I am working on a project and I'm putting all my data access layer code into .ASMX files to keep it separated from my presentation layer. I am calling all my methods from the code-behind and using the web services like class files. I am following this practice based on another developer's work. There are two opinions on this so far: one says that when the code-behind calls a method from the web service, there is a performance hit because it has to make an HTTP request; the other says there is no performance hit. The ASMX files are within the same project on the same server. Is there actually a performance hit or not? I tend to think not.
Any help or opinion on this would be appreciated.
If you call it as a web service, you still have to go through the proxy and argument marshalling even if you are calling within the same server; there is a performance hit compared to calling the same class directly, and the call overhead may be orders of magnitude higher. You wouldn't want to do this unless the called method is doing some substantial work.

Should I build a REST backend for GWT application

I am planning a new application and have been experimenting with GWT as a possible frontend. The design question I am facing is this:
Should I use
Option A: GWT-RPC and build the app quickly
Option B: Build a REST backend using Spring MVC 3.0, with all the great @Controller, @Service, @Repository annotations, and build a client-side library to talk to the backend using the GWT overlay features and the GWT RequestBuilder?
I am interested in all the pros and cons, and in people's experiences with this type of design.
Ask yourself the question: "Will I need to reuse the server-side interface with a non-GWT front-end?"
If the answer is "no, I'll just have a GWT client": You can use GWT-RPC, and take advantage of the fact that you can use your Java objects both on the server and the client-side. This can also make the communication a bit more efficient, at least when used with <inherits name="com.google.gwt.user.RemoteServiceObfuscateTypeNames" />, which shortens the type names to small numeric values. You'll also get the advantage of better error handling (using Exceptions), type safety, etc.
If the answer is "yes, I'll make my service accessible for multiple kinds of front-ends": You can use REST with JSON (or XML), which can also be understood by non-GWT clients. In addition to switching clients, this would also allow you to switch to a different server implementation (maybe non-Java) in the future more easily. The disadvantage is, that you'll probably have to write wrappers (JavaScript Overlay Types) or transformation code on the GWT client side to build nice Java objects from the JSON objects. You'll have to be especially careful when you deploy a new version of the service, which brings us back to the lack of type safety.
The third option, of course, would be to build both. I'd choose this option if the public REST interface should differ from the GWT-RPC interface anyway, perhaps providing just a subset of easy-to-use services.
You can do both if you also use the RestyGWT project. It will make calling REST-based JSON resources as easy as using GWT-RPC. Plus, you can typically reuse the same request/response DTOs from the server side on the client side.
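A RestyGWT client is just an annotated interface; here is a rough sketch, with a hypothetical /employees/{id} resource and DTO (you would obtain an instance with GWT.create(EmployeeRestService.class)).

```java
// Rough RestyGWT sketch: JAX-RS annotations on an interface plus a callback,
// targeting a hypothetical JSON resource.
import javax.ws.rs.GET;
import javax.ws.rs.Path;
import javax.ws.rs.PathParam;

import org.fusesource.restygwt.client.MethodCallback;
import org.fusesource.restygwt.client.RestService;

@Path("/employees")
public interface EmployeeRestService extends RestService {

    @GET
    @Path("/{id}")
    void getEmployee(@PathParam("id") long id, MethodCallback<Employee> callback);
}

// Plain DTO that RestyGWT can encode/decode as JSON (needs a no-arg constructor).
class Employee {
    private String name;
    public Employee() { }
    public String getName() { return name; }
    public void setName(String name) { this.name = name; }
}
```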
We ran into the same issue when we created the Spiffy UI Framework. We chose REST and I would never go back. I'd even say GWT-RPC is a GWT Anti-pattern.
REST is a good idea even if you never intend to expose your REST endpoints. Creating a REST API will make your UI faster, your API better, and your entire application more maintainable.
I would say build a REST backend. In my last project we started by developing with GWT-RPC for the first few months because we wanted fast bootstrapping. Later on, when we needed the REST API, the refactoring was so expensive that we ended up with two backend APIs (REST and RPC).
If you build a proper REST backend and a deserialization infrastructure on the client side (to transform the JSON/XML into GWT Java objects), then the benefit of RPC is almost nothing.
Another sometimes-forgotten advantage of the REST approach is that it's more natural to the browser running the client. RPC is a proprietary protocol in which all requests use POST, whereas with REST you can benefit from client-side caching when reading resources in the standard way.
Answering ams' comments:
Regarding the RPC protocol, the last time I "sniffed" it using Firebug it didn't look like JSON, so I don't know about that. Even if it is JSON-based, though, it still uses only the HTTP POST method to communicate with the server, so my point about caching still stands: the browser won't cache POST requests.
Regarding the retrospective and what could have been done better: writing the RPC service in a resource-oriented architecture could later make porting to REST easier. Remember that in REST one usually exposes resources with the basic CRUD operations; if you focus on that approach when writing the RPC service, then you should be fine.
The REST architectural style promotes inspectable messages (which aids debugging and security), API evolution, multiple platforms, simple interfaces, failure recovery, high scalability, and (optionally) extensible systems via code on demand. It trades per-interaction performance for overall network efficiency. It reduces the server's control over consistent application behavior.
The "RPC style" (as we speak of it in opposition to REST) promotes platform uniformity, interface variability, code generation (and thereby the ability to pretend the network doesn't exist, but see the Fallacies), and customized interactions. It trades overall network efficiency for high per-interaction performance. It increases the server's control over consistent application behavior.
If your application desires the former qualities, use the REST style. If it desires the latter, use the RPC style.
If you're planning on using Hibernate/JPA on the server side and sending the resulting POJOs with relational data in them to the client (i.e. an Employee object with a collection of Phones), definitely go with the REST implementation.
I started my GWT project a month ago using GWT-RPC. All was well until I tried to serialize an object from the underlying DB with a one-to-many relationship in it, and got the dreaded:
com.google.gwt.user.client.rpc.SerializationException: Type 'org.hibernate.collection.PersistentList' was not included in the set of types which can be serialized by this SerializationPolicy
If you encounter this and want to stay with GWT RPC you will have to use something like:
GWT Request Factory (www.gwtproject.org/doc/latest/DevGuideRequestFactory.html) - which forces you to write 3+ classes/interfaces per POJO you want to share with the client. OUCH!
Gilead (sourceforge.net/projects/gilead/) - which appears to be a dead project.
I'm now using RestyGWT. The switch was fairly painless and my POJOs serialize without issue.
I would say that this depends on the scope of your overall application. If your backend should be used by other clients, needs to be extensible, etc., then create a separate module using REST. If the backend is to be used only by this client, then go for the GWT-RPC solution.

Resources