Does anybody know how to improve the performance of a Crystal Report that has a subreport? The subreport uses ADO.NET objects...and takes FOREVER to generate.
Is the subreport being called for every record of your report, i.e. does it live in the details section? If so, there's not much you can do other than reduce what the report displays or calculates. Subreports in details sections are a blight on performance.
However, if it's elsewhere then optimizations would necessarily depend on the design of the subreport itself. For example, running totals or the use of distinct counts can hit performance depending on design or report size.
I have added conditional formatting to a Cognos report, and it seems to have slowed down
The report was running okay before I added the formatting, and I have not changed anything else on the report, other than the conditional formatting.
Does conditional formatting, as a general rule, cause Cognos to run slower?
As a general concept, conditional formatting will not slow down a report.
That said, I can envision one scenario where the addition of conditional formatting could have an impact: you base your conditional formatting on a query item that wasn't previously included in the main data container (list, crosstab, etc.).
Cognos' SQL generation is opportunistic. If your report only references one query, all other queries will be left out of the SQL statement sent to the data source. If you include a data item that comes from another query (assuming there is an established join between the two), Cognos will now include the second query in the SQL statement, constructing a join with the original query in accordance with how you defined the relationship. Joining tables inevitably results in some slowdown.
If your original report took 10 seconds to generate and you then add conditional formatting that forces a join, the result will inevitably take longer. It could be an imperceptible amount of time or a considerable slowdown, depending on the query joined and the nature of the join.
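As an illustrative sketch of the difference (all table and column names here are hypothetical, not anything Cognos actually names), the generated SQL before and after the formatting might change like this:

```sql
-- Before: the report only references the main query's table
SELECT o.order_id, o.order_total
FROM   orders o;

-- After: the conditional formatting references an item from a second
-- query, so Cognos folds that query's table into the statement via
-- the defined relationship (hypothetical join)
SELECT o.order_id, o.order_total, c.credit_rating
FROM   orders o
JOIN   customers c ON c.customer_id = o.customer_id;
```

The extra join is where the added time comes from, not the formatting rule itself.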
Barring the scenario I described, I would generate the tabular data for the query and see how long it takes to come back. When you generate the tabular data, conditional formatting is ignored. If the tabular data is slow, then you know it's not the conditional formatting causing the problem.
If you want to really track Cognos performance, check out the article on my blog regarding automatic report timing: Automated Cognos Report Performance Measurement
I inherited a report from a developer where he combined 5 reports into one SSRS report. It looks like he just copied and pasted each tablix from the original reports one below the other. This was done so that when the user exports to Excel they can have each report on a separate tab. I've never done a multiple SSRS report like this before so I'm just now analyzing how this whole thing works. A major problem I'm finding is that it runs extremely slow, about 10 minutes, seemingly because it has to run all 5 queries. Each stored procedure is listed separately as a data set. Does anyone know a better way to create multiple SSRS reports onto one page, or at least how to make this thing faster?
The first step to improving performance for an SSRS report is to determine what the bottleneck is. Run a query against the view named ExecutionLog4 in the ReportServer database. For each recent execution of a report, the view will give you a record that includes 3 critical fields: TimeDataRetrieval, TimeProcessing, and TimeRendering.
TimeDataRetrieval indicates how long (in milliseconds) it takes for all of the queries to run and return your datasets. If this number is high, then you will need to tune your queries or eliminate some of them to improve performance. You can run a profiler trace to identify which of the procedures is running slowly.
Keep in mind also that subreports fire their dataset queries each time they are rendered in the report. So even a minor performance hiccup in a subreport's dataset gets magnified by the number of executions.
TimeProcessing indicates how much time the report server spends manipulating the retrieved data. If this number is high, consider moving aggregate calculations that run many times within the report to the SQL side.
TimeRendering indicates how long the server takes to actually render the report. If this number is high, consider avoiding or simplifying expressions used on visual properties that repeat over and over again. This scenario is less common than the other two, in my experience.
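As a sketch, the execution-log query could look like the following. The three timing columns are as described above; ItemPath and TimeStart are assumed to be present, as in the standard ExecutionLog views shipped with the ReportServer database:

```sql
-- Most recent report executions, with the three timing components
-- (all times are in milliseconds)
SELECT TOP 50
       ItemPath,
       TimeDataRetrieval,
       TimeProcessing,
       TimeRendering,
       TimeStart
FROM   dbo.ExecutionLog4
ORDER BY TimeStart DESC;
```

Whichever of the three columns dominates for your report tells you which of the tuning paths below to pursue first.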
Furthermore, here are some tips I've picked up that help to avoid performance issues:
- Avoid using row visibility expressions if you expect a large number of rows to be returned.
- Hiding an object does not prevent its dataset from executing. If your datasets have similar structure, consider combining them and using object filters to limit what is displayed in different sections. Or use an IF statement in your stored procedure if you only intend to display one of several choices depending on data or parameters.
- Try to limit the number of column groupings in a large tablix. Each additional grouping multiplies the number of rows of data that may be returned to pivot into those groupings.
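The stored-procedure branching mentioned above could be sketched like this (the procedure, parameter, and view names are all hypothetical), so that only the dataset the report will actually display is ever executed:

```sql
-- Hypothetical procedure: return only the branch the report will show,
-- instead of running every dataset and hiding the unused ones
CREATE PROCEDURE dbo.GetReportData
    @Section INT   -- which section the report has been asked to display
AS
BEGIN
    IF @Section = 1
        SELECT SummaryField1, SummaryField2
        FROM   dbo.SummaryView;
    ELSE
        SELECT DetailField1, DetailField2
        FROM   dbo.DetailView;
END;
```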
More info on SSRS performance can be found at
https://technet.microsoft.com/en-us/library/bb522806(v=sql.105).aspx
This was written for 2008 R2, but seems mostly applicable to 2012 as well.
Give all that a shot, then post back here with a more specific question if you get stuck.
I'm trying to figure out how SSRS handles caching of subreports. The report I have has a lot going on: dynamic formatting, dynamic links, dynamic graphs, etc. Because of all this, it takes quite a while to load (about 10 seconds), and every time you click on something in the report, it has to reload everything (another 10 seconds). I looked into caching options, but the problem is I need the data shown in the graphs to be the live data (the rest of the report doesn't display any live data).
I came up with an idea to put the graphs in a subreport, so that I could cache the main report, and only the subreport would have to be reprocessed on every load. My thinking was that this would significantly cut down on the processing time for the whole report, and I could schedule the cache to be preloaded every night.
I know that Reporting Services isn't the ideal way to deliver this type of thing, and creating it as a website would yield much better performance, but my company wants to try and make it work with SSRS first.
Does anyone know if this would work? Does SSRS cache subreports separately from their parent reports, or would caching the parent report also cache the subreport?
Subreports are processed at the same time as the main report you are calling, so you would need to set up caching on the main report to cache the contents of the subreport.
For the components that do not require live data, you can cache shared datasets to improve report processing time. Separate the data sources for the items that need live data feeds from the items you can cache.
https://msdn.microsoft.com/en-us/library/ee636149.aspx
To open the Caching properties page for a shared dataset:
1. Open Report Manager and locate the shared dataset for which you want to configure caching properties.
2. Point to the shared dataset and click the drop-down arrow.
3. In the drop-down list, click Manage. The General properties page opens.
4. Click the Caching tab.
http://i.stack.imgur.com/vFHQP.png
I have a stored procedure that returns about 50,000 records in 10 seconds, using at most 2 cores, in SSMS. The SSRS report using the stored procedure was taking 20 minutes and would max out the processor on an 8-core server for the entire time. The report was relatively simple (i.e. no graphs or calculations). The report did not appear to be the issue, as I wrote the 50K rows to a temp table and the report could display the data in a few seconds. I tried many different ideas for testing, altering the stored procedure each time but keeping the original code in a separate window to revert back to. After one ALTER of the stored procedure, and then going back to the original code, the report and server utilization started running fast, comparable to the performance of the stored procedure alone. Everything is fine for now, but I would like to get to the bottom of what caused this in case it happens again. Any ideas?
I'd start with a SQL Profiler trace of both the stored procedure when you execute it normally, and then the same SP when it's called by SSRS. Make sure you include the execution plans involved, so you can see if it's making some bad decisions (though that seems unlikely - the SQL Server should execute an optimal - or at least consistent - plan regardless of the query's source).
We used to have cases where Business Objects would execute stored procs dozens of times for no apparent reason, and it led to occasionally horrible performance, though I've never seen that same behavior with SSRS. It may be somewhere to start, though. You'll also see the execution begin/end times - that will make it clear whether it's the database layer that's hanging up, or if SQL Server hands back the data in 10 seconds and then it's the SSRS service that's choking somewhere.
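Alongside a Profiler trace, one way to check whether the two callers are getting different cached plans is to query the plan cache directly with the standard SQL Server DMVs (substitute your actual procedure name for the hypothetical one below):

```sql
-- Compare cached plan statistics for the stored procedure's statements,
-- regardless of whether SSMS or SSRS issued the call
SELECT st.text,
       qs.creation_time,
       qs.execution_count,
       qs.total_elapsed_time / qs.execution_count AS avg_elapsed_microsec,
       qp.query_plan
FROM   sys.dm_exec_query_stats qs
CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle)  st
CROSS APPLY sys.dm_exec_query_plan(qs.plan_handle) qp
WHERE  st.text LIKE '%YourProcName%';  -- hypothetical procedure name
```

Multiple rows for the same statement with very different average elapsed times would be a sign that different plans were compiled for different callers.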
The primary solution to speeding up SSRS reports is to cache the reports. If one does this (by preloading the cache at 7:30 am, for instance) or caches the reports on hit, one will see massive gains in load speed.
You may also find that monthly restarts of the SSRS application domain resolve your issue.
Please note that I do this daily and professionally, and am not simply waxing poetic on SSRS.
Caching in SSRS
http://msdn.microsoft.com/en-us/library/ms155927.aspx
Pre-loading the Cache
http://msdn.microsoft.com/en-us/library/ms155876.aspx
If you do not want initial report runs to take long and your data is relatively static over the day (a daily general ledger or the like), you may increase the cache lifespan.
Finally, you may also opt for business managers to instead receive these reports via email subscriptions, which will send them a point-in-time Excel report which they may find easier and more systematic.
You can also use parameters in SSRS to allow for easy filtering by the user and faster queries. In the query builder, type IN(@SSN) under the Filter column for the field you wish to parameterize; you will then find the parameter created in the Parameters folder just above Data Sources in the upper left of your BIDS GUI.
(If you do not see the data source section in SSRS, hit CTRL+ALT+D.)
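The resulting dataset query then carries a parameterized filter along these lines (the table and column names here are hypothetical):

```sql
-- SSRS substitutes the user's selection(s) into @SSN at run time,
-- so only the requested rows come back from the server
SELECT e.SSN, e.FirstName, e.LastName
FROM   dbo.Employees e
WHERE  e.SSN IN (@SSN);
```

Because the filtering happens on the SQL side rather than in the report, less data has to be retrieved and processed per run.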
See a nearly identical question here: Performance Issues with SSRS
I need to compose a report using multiple subreports, "chained" together at runtime in a C# Forms project.
The subreports each represent a subtest of a product, and the data needs special formatting to make sense to the report users (special graphs, sensible column names with/without engineering details, etc.).
I imagine that every subreport has a subreport field into which I can insert the next subreport at runtime. Obviously the first (main) report has a subreport field as well, and the finalizing (summary) subreport does not.
Is it possible to chain subreports together at runtime?
Does anyone out there have a sample?
Kind Regards
Jes
I imagine this is possible with the Reporting Services product, but I don't know how to do it. In our experience on the ActiveReports team, we've found that subreports are not always the most performant and memory-efficient way to accomplish this.
For information about how we suggest doing this with our ActiveReports product, see the following explanation:
http://www.datadynamics.com/Help/ActiveReports6/arHOWInsertOrAddPages.html
Scott Willeke
GrapeCity - Data Dynamics