I have a BIRT report with performance problems: it takes approximately 5 minutes to run.
At the beginning I thought the problem was the database: this report uses a fairly complex SQL Server stored procedure to retrieve data. After a lot of SQL optimization this procedure now takes ~20 seconds to run (in the management console).
However, the report itself still takes too much time (several minutes). How do I identify the other bottlenecks in BIRT report generation? Is there a way to profile the entire process? I'm running it using the web viewer (inside Tomcat 5.5), and I don't have any Java event handlers; everything is done using standard SQL and JavaScript.
I watched the webinar "Designing High Performance BIRT Reports"; it has some interesting considerations, but it didn't help much...
As I write this answer the question is getting close to 2 years old, so presumably you found a way around the problem. No one has offered a profiler for the entire process, so here are some ways of identifying bottlenecks.
Start-up time - about a minute can be spent here.
Running a couple of reports one after the other, or starting a second report while the first is still running, can help diagnose issues.
SQL query run time - good solutions are mentioned in the question.
Any SQL trace and performance testing will identify issues.
Building the report - this is where I notice the lion's share of the time being taken. Run a SQL trace while the report is being created. Even a relatively simple table with lots of data can take around a minute to build and display (HTML via Apache Tomcat) after the SQL trace indicates the query is done.
Simplify the report, or create a clone with fewer graphs or tables; run it with and without pieces to see if any of them make a notable difference.
Modify the query to bring back fewer records; fewer records are easier to display.
Delivery method - PDF, Excel and HTML can each have different issues.
Try exporting the report to different formats.
If one is significantly slower, try different emitters.
For anyone else having problems with BIRT performance, here are a few more hints.
Profiling a BIRT report can be done using any Java profiler - write a simple Java test that runs your report and then profile that (a sketch follows these hints).
As an example I use the unit tests from the SpudSoft BIRT Excel Emitters and run JProfiler from within Eclipse.
The problem isn't with the difficulty in profiling it, it's in understanding the data produced :)
Scripts associated with DataSources can absolutely kill performance. Even a script that looks as though it should only have an impact up-front can really drag the whole thing down. This is the biggest performance killer I've found (so big I rewrote quite a chunk of the Excel Emitters to make it unnecessary).
The emitter you use has an impact.
If you are trying to narrow down performance problems, always run separate Run and Render tasks (as in the sketch below) so you can easily see where to concentrate your efforts.
Different emitter options can impact performance, particularly with the third party emitters (the SpudSoft emitters now have a few options for making large reports faster).
The difference between Fixed-Layout and Auto-Layout is significant; try both.
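For the profiling and Run/Render points above, here is a minimal sketch using the BIRT Report Engine API with separate run and render tasks, so that executing the report and rendering it can be timed (and profiled) independently. The file names and the HTML output format are illustrative, not taken from the original report:

import org.eclipse.birt.core.framework.Platform;
import org.eclipse.birt.report.engine.api.EngineConfig;
import org.eclipse.birt.report.engine.api.HTMLRenderOption;
import org.eclipse.birt.report.engine.api.IRenderTask;
import org.eclipse.birt.report.engine.api.IReportDocument;
import org.eclipse.birt.report.engine.api.IReportEngine;
import org.eclipse.birt.report.engine.api.IReportEngineFactory;
import org.eclipse.birt.report.engine.api.IReportRunnable;
import org.eclipse.birt.report.engine.api.IRunTask;

public class BirtRunRenderTiming {
    public static void main(String[] args) throws Exception {
        EngineConfig config = new EngineConfig();
        Platform.startup(config);
        IReportEngineFactory factory = (IReportEngineFactory) Platform
                .createFactoryObject(IReportEngineFactory.EXTENSION_REPORT_ENGINE_FACTORY);
        IReportEngine engine = factory.createReportEngine(config);

        // Run phase: executes the data sets and writes an intermediate report document.
        long runStart = System.currentTimeMillis();
        IReportRunnable design = engine.openReportDesign("myreport.rptdesign");
        IRunTask runTask = engine.createRunTask(design);
        runTask.run("myreport.rptdocument");
        runTask.close();
        System.out.println("Run phase:    " + (System.currentTimeMillis() - runStart) + " ms");

        // Render phase: turns the report document into the final output format.
        long renderStart = System.currentTimeMillis();
        IReportDocument document = engine.openReportDocument("myreport.rptdocument");
        IRenderTask renderTask = engine.createRenderTask(document);
        HTMLRenderOption options = new HTMLRenderOption();
        options.setOutputFileName("myreport.html");
        options.setOutputFormat("html");
        renderTask.setRenderOption(options);
        renderTask.render();
        renderTask.close();
        document.close();
        System.out.println("Render phase: " + (System.currentTimeMillis() - renderStart) + " ms");

        engine.destroy();
        Platform.shutdown();
    }
}

Running something like this under JProfiler (or any other Java profiler) is then the easiest way to see where the time goes inside each phase.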
Have you checked how much memory you are using in Tomcat? You may not be assigning enough memory (for Tomcat this is typically raised via the -Xmx setting in JAVA_OPTS/CATALINA_OPTS; for the Designer, in eclipse.ini). A quick test is to launch the BIRT Designer and assign it additional memory. Then, within the BIRT Designer software, run the report.
Is there any tool or plugin that we can incorporate into an existing functional automation test suite?
Until 2011-12 we were using Dynatrace AJAX Edition, a plugin for Chrome and IE. We had configured this plugin into our Selenium test cases. While the Selenium tests were executing all the functional scenarios of the application, the Dynatrace tool was capturing performance stats for each action performed on the application in parallel. At the end we got a consolidated performance report along with the Selenium test results.
But now I am not able to find any such tool or plugin that will help us capture performance stats of the application during a Selenium suite run.
Please help me find a suitable way to do this.
Yes, if you are using ChromeDriver you can enable performance logging. I believe this is what you are looking for.
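// Requires the Selenium Java client: LoggingPreferences/LogType are in org.openqa.selenium.logging,
// DesiredCapabilities/CapabilityType/RemoteWebDriver in org.openqa.selenium.remote, Level in java.util.logging.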
DesiredCapabilities cap = DesiredCapabilities.chrome();
LoggingPreferences logPrefs = new LoggingPreferences();
logPrefs.enable(LogType.PERFORMANCE, Level.ALL);
cap.setCapability(CapabilityType.LOGGING_PREFS, logPrefs);
RemoteWebDriver driver = new RemoteWebDriver(new URL("http://127.0.0.1:9515"), cap);
The above code is what you will use to enable the logging for your session.
You don't need any kind of plugin/extensions for this. It uses Chrome's own performance logging feature.
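Once the session has run, you can read the captured entries back through Selenium's log API. Here is a minimal sketch, using the driver created above (LogEntries and LogEntry also come from org.openqa.selenium.logging):

LogEntries entries = driver.manage().logs().get(LogType.PERFORMANCE);
for (LogEntry entry : entries) {
    // Each entry's message is a JSON payload from Chrome's DevTools performance log.
    System.out.println(entry.getTimestamp() + " " + entry.getMessage());
}

In practice you would parse each message with a JSON library to pull out the specific timings you care about.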
Let me know if you are looking for this specifically.
My answer is more of an opinion than a true answer to your question, but IMHO it's not so useful to measure the performance as part of your normal functional test cycle.
Measuring performance as part of a regular functional test cycle can produce a lot of performance data, but it won't tell you much. If you want to measure performance effectively, you should start with the question: which operation do I care to measure? And "everything" is not the right answer... Then you have to define what performance you expect and under which circumstances, and build a test and a corresponding environment to match those requirements. In addition, performance is usually not a fixed value, as it can be affected by many factors that we can't control (like external processes that may be running in the background). Therefore you should usually define the expected performance in statistical terms, for example: 90% of the time, the measured operation should take no more than 3 seconds. This means that you should run the test at least 10 times (actually many more, to be safe) in order to determine whether the performance is good enough or not.
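To illustrate that last point, here is a minimal sketch of what such a statistical check could look like. The measuredOperation() method is a hypothetical placeholder for whatever single operation you decide to measure, and the 50 runs and 3-second threshold are just example numbers:

import java.util.Arrays;

public class PercentileCheck {
    // Hypothetical placeholder for the single operation under test (e.g. a Selenium interaction).
    static void measuredOperation() throws InterruptedException {
        Thread.sleep(100);
    }

    public static void main(String[] args) throws Exception {
        int runs = 50;                       // well above 10, to be on the safe side
        long[] millis = new long[runs];
        for (int i = 0; i < runs; i++) {
            long start = System.nanoTime();
            measuredOperation();
            millis[i] = (System.nanoTime() - start) / 1_000_000;
        }
        Arrays.sort(millis);
        long p90 = millis[(int) Math.ceil(0.9 * runs) - 1];   // 90th-percentile sample
        System.out.println("90th percentile: " + p90 + " ms");
        if (p90 > 3000) {                    // "no more than 3 seconds, 90% of the time"
            throw new AssertionError("Performance requirement not met: " + p90 + " ms");
        }
    }
}

In a real suite you would drive the measured operation through your test framework and probably persist the samples for trend analysis rather than asserting immediately.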
I have a classic ASP (JScript) web site that is running slow. Are there any profilers that can help me identify what is taking the time?
Other hints on how to optimize or debug ASP performance issues would be helpful.
Assuming you're looking for free solutions, here are some suggestions I used in a past (very old) project of mine:
ASP Profiler component. This is a line-level performance profiler for Active Server Pages (with VBScript) code. It shows how your ASP page runs, which lines are executed how many times, and how many milliseconds each take. Especially for heavy data-driven pages, you can see exactly which lines slow down the page, and optimize where necessary.
Googling around I have also found a couple of very old articles on timing/profiling ASP execution code: have a look here and here.
If you have an issue with server side code being slow I have found it is almost always the database causing the issue. You need to check for SQL which is slow to return a result; if you find any you need to look at applying new indexes to your tables. If your app is too chatty with the database you need to look at reducing the number of calls to the database. To find these problems you can always use SQL Server Profiler; this comes bundled with SQL Server 2005/2008 Developer edition.
You can also use the free SQL profiler available at xsqlsoftware.com.
I was looking for an ETL tool, and on Google I found a lot about Pentaho Kettle.
I also need a data analyzer to run on a star schema so that business users can play around and generate any kind of report or matrix. Again, Pentaho Analyzer is looking good.
The rest of the application will be developed in Java, and the application should be database agnostic.
Is Pentaho good enough, or are there other tools I should check?
Pentaho seems to be pretty solid, offering the whole suite of BI tools, with improved integration reportedly on the way. But... chances are that companies wanting to go the open source route for their BI solution are also most likely to end up using open source database technology... and in that sense "database agnostic" can easily be a double-edged sword. For instance, you can develop a cube in Microsoft's Analysis Services in the comfortable knowledge that whatever MDX/XMLA your cube sends to the database will be interpreted consistently, holding very little in the way of nasty surprises.
Compare that to the Pentaho stack, which will typically end up interacting with PostgreSQL or MySQL. I can't vouch for how PostgreSQL performs in the OLAP realm, but I do know from experience that MySQL - for all its undoubted strengths - has "issues" with the types of SQL that typically crop up all over the place in an OLAP solution (you can't get far in a cube without using GROUP BY or COUNT DISTINCT). So part of what you save in licence costs will almost certainly be spent solving issues arising from the fact that Pentaho doesn't always know which database it is talking to - robbing Peter to (at least partially) pay Paul, so to speak.
Unfortunately, more info is needed. For example:
will you need to exchange data with well-known apps (Oracle Financials, Remedy, etc)? If so, you can save a ton of time & money with an ETL solution that has support for that interface already built-in.
what database products (and versions) and file types do you need to talk to?
do you need to support querying of web-services?
do you need near real-time trickling of data?
do you need rule-level auditing and counts to account for every single row?
do you need delta processing?
what kinds of machines do you need this to run on? linux? windows? mainframe?
what kind of version control, testing and build processes will this tool have to comply with?
what kind of performance & scalability do you need?
do you mind if the database ends up driving the transformations?
do you need this to run in userspace?
do you need to run parts of it on various networks disconnected from the rest? (not uncommon for extract processes)
how many interfaces and of what complexity do you need to support?
You can spend a lot of time deploying and learning an ETL tool - only to discover that it really doesn't meet your needs very well. You're best off taking a couple of hours to figure that out first.
I've used Talend before with some success. You create your translation by chaining operations together in a graphical designer. There were definitely some WTF's and it was difficult to deal with multi-line records, but it worked well otherwise.
Talend also generates Java and you can access the ETL processes remotely. The tool is also free, although they provide enterprise training and support.
There are lots of choices. Look at BIRT, Talend and Pentaho, if you want free tools. If you want much more robustness, look at Tableau and BIRT Analytics.
We are migrating our test report data (unit, regression, integration, etc.) from an XML format to a database format for better analysis. Right now the majority of our test analysis is done using the CruiseControl.NET dashboard, but this is limited primarily to the most recent test data. Older test data can be accessed but not easily compared to new test data. We want to pinpoint problem components and better narrow down bugs. With the flood of information brought on by our newly implemented regression and integration testing, I would like to see some better metrics generated (possibly performance and the like). Have you worked with any business intelligence systems that will provide a framework for accurately and easily implementing some sort of analysis and reporting?
I have looked into JasperReports and Pentaho, but I'm struggling with the implementation of Pentaho at the moment. Should I continue my fight with the system? Is this what I'm looking for?
You could always just use SQL Server Reporting Services and Report Builder (MS's web-based designer) or Report Designer (a component of Visual Studio). It's pretty easy to get this set up too.
Report Builder: http://msdn.microsoft.com/en-us/library/ms155933.aspx
Report Designer: http://msdn.microsoft.com/en-us/library/ms157166.aspx
Tutorial: http://www.simple-talk.com/sql/learn-sql-server/beginning-sql-server-2005-reporting-services-part-1/
How to add Reporting Services to an existing SQL Server: http://www.mssqltips.com/tip.asp?tip=1444
There are a few end user reporting solutions around as well that make it easier to dynamically create reports, if you're willing to invest a bit of cash.
My company produces one: http://www.rsinteract.com. It has a very cheap standard edition with a limited number of reports (30-day free trial). It reports directly off SQL Server with Reporting Services installed. It won Best of TechEd 2006 - http://windowsitpro.com/article/articleid/53944/best-of-tech-ed-2006-winners.html
We actually use ours to analyse the support requests from clients, i.e. which component is failing most, who reports the most bugs, etc. I haven't tried it on test data.
There's also Proclarity, ApexSQL Report, and Tableau all of which are good.
You could try looking at rolling your own (if you know what you're looking for) using Processing, written by Ben Fry. It's best accompanied by his book "Visualizing Data".
The tool is free, and I guess you can get a free 45-day trial of O'Reilly Books Online to get a head start and see if it's right for you. I do know there are chapters on reading and crunching data from all kinds of sources (including XML and databases) and then making meaningful and useful visualisations from them.
I'm currently using it to get my head round the dependency complexities of an inherited code base, and it's been massively useful.
Which part of Pentaho?
The Kettle project has stuff to convert your Cruise Control info and load it into a relational database. That's probably a good module to get working properly, especially if you're almost done figuring it out. I hope you'll share this stuff. I could use it too.
The Platform will autoschedule stuff once Kettle has it loading.
To make Mondrian really useful you'll need to work out a fact / dimension organization to your test data. That may or may not be worth your trouble at this point.
Once you have your data loaded you'll probably be able to get a lot of benefit out of simple SQL queries like this...
select *
from test
where failed='yes'
order by testno, date desc
and this...
select max(date), min(date), testno
from test
where failed='yes'
group by testno
order by testno
and stuff like that. You might consider creating views in your table server for your favorite queries.
There are myriad ways to convert your SQL queries into reports, including the Pentaho reporting module, BIRT (an Eclipse plugin), Crystal Reports, and all kinds of PHP or JSP stuff you could put together.
Does anyone know of a better GUI client for displaying Windows System Monitor log files? (System Monitor is sometimes called Performance Monitor.) I'm trying to track a long-term memory leak in a C# application running on Windows XP or 2K3 by comparing memory usages to run logs.
Specifically I want a client that will allow me to see the following (because System Monitor is unable or difficult):
Specify exact date time ranges for viewing data (or at least finer granularity than hours)
Show time intervals along the horizontal axis
Show max, min, average for the time range
Somewhere show the interval on which source data was captured (1 sec, 5 min, etc.)
(If no such thing exists I'm willing to hear recommendations for better long term performance/memory capturing tools.)
Edit: I've done Google searches and haven't found anything except tutorials on how to create System Monitor logs.
See this question.
The PAL tool does a nice job of creating an HTML report with charts and graphs. By creating your own Threshold file you can control what goes into the report.
While I accepted Patrick Cuff's answer, for my needs I found a better way to graph the data: Excel
It still doesn't provide everything I need, but it is a marked improvement over the System Monitor GUI. I use the relog command-line tool to convert the log into a CSV (something like relog mylog.blg -f csv -o mylog.csv), and then import the CSV into Excel. Excel does not automatically handle the third item on my list (max, min and average for the time range), but I can add new columns to graph, and it does give me better control over which data I'm displaying.
One of the tricks that I have used in the past is to use Performance/System Monitor to log this data to a SQL database; SQL Server Express works great for this. Then you can generate reports using Reporting Services, or for the more adventurous types you can do some cube analysis with Analysis Services. So while this does not solve the UI problem, it does allow you to make your own UI. When I did this previously I just used a simple Reporting Services graph.
SCOM 2007 with Reporting Services actually does a pretty good job of this. If not, the SQLH2 tool is almost as good, and it's free. You will probably have to customize the reports yourself, though.