Crosstab large data limit error - performance

I have a Jasper crosstab report that returns the following error:
Crosstab bucket/measure limit (1,000,000) exceeded
How do I make the crosstab accommodate large data? I understand that net.sf.jasperreports.crosstab.bucket.measure.limit needs to be set in the report, but raising it also carries a risk of running out of memory.
How do I solve this?
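For reference, a minimal sketch of raising the limit in the JRXML template (the property name comes from the question; the value shown is only illustrative):

<property name="net.sf.jasperreports.crosstab.bucket.measure.limit" value="2000000"/>

Whatever value you choose, test it against your JVM heap size, since the limit exists precisely to stop the crosstab from exhausting memory.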

Related

Adding multiple SSRS reports into one report is very slow

I inherited a report from a developer where he combined 5 reports into one SSRS report. It looks like he just copied and pasted each tablix from the original reports one below the other. This was done so that when the user exports to Excel they can have each report on a separate tab. I've never done a multiple SSRS report like this before so I'm just now analyzing how this whole thing works. A major problem I'm finding is that it runs extremely slow, about 10 minutes, seemingly because it has to run all 5 queries. Each stored procedure is listed separately as a data set. Does anyone know a better way to create multiple SSRS reports onto one page, or at least how to make this thing faster?
The first step to improving performance for an SSRS report is to determine what the bottleneck is. Run a query against the view named ExecutionLog4 in the ReportServer database. For each recent execution of a report, the view will give you a record that includes 3 critical fields: TimeDataRetrieval, TimeProcessing, and TimeRendering.
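For example, a starting-point query might look like the following sketch (on many SSRS versions the newest view is named ExecutionLog3, so adjust the view name to whatever your ReportServer database exposes):

SELECT TOP 50 ItemPath, TimeStart, TimeDataRetrieval, TimeProcessing, TimeRendering
FROM dbo.ExecutionLog4  -- use dbo.ExecutionLog3 if ExecutionLog4 is not present
ORDER BY TimeStart DESC;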
TimeDataRetrieval indicates how long (in milliseconds) it takes for all of the queries to run and return your datasets. If this number is high, then you will need to tune your queries or eliminate some of them to improve performance. You can run a profiler trace to identify which of the procedures is running slowly.
Keep in mind also that subreports fire their dataset queries each time they are rendered in the report. So even a minor performance hiccup in a subreport's dataset gets magnified by the number of executions.
TimeProcessing indicates how much time the report server spends manipulating the retrieved data. If this number is high, consider moving aggregate calculations that run many times within the report to the SQL side.
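For instance, a sketch of pre-aggregating in the dataset query so the report only displays the result (the table and column names here are hypothetical):

-- Aggregate once in SQL instead of summing per group in the report
SELECT Region, SUM(SalesAmount) AS TotalSales
FROM dbo.Sales
GROUP BY Region;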
TimeRendering indicates how long the server takes to actually render the report. If this number is high, consider avoiding or simplifying expressions used on visual properties that repeat over and over again. This scenario is less common than the other two, in my experience.
Furthermore, here are some tips I've picked up that help to avoid performance issues:
-Avoid using row visibility expressions if you expect a large number of rows to be returned.
-Hiding an object does not prevent dataset execution. If your datasets have a similar structure, consider combining them and using object filters to limit what is displayed in different sections. Or use an IF statement in your stored procedure if you only intend to display one of several choices depending on data or parameters (see the sketch after this list).
-Try to limit the number of column groupings in a large tablix. Each additional grouping multiplies the number of data rows that must be returned and pivoted into those groups.
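Here is a hypothetical sketch of the stored-procedure branching mentioned above (all object names are made up):

-- Branch inside the procedure so only the result set the report needs is produced
CREATE PROCEDURE dbo.GetReportSection
    @Section varchar(20)
AS
BEGIN
    IF @Section = 'Summary'
        SELECT Region, SUM(SalesAmount) AS TotalSales
        FROM dbo.Sales
        GROUP BY Region;
    ELSE
        SELECT Region, OrderDate, SalesAmount
        FROM dbo.Sales;
END;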
More info on SSRS performance can be found at
https://technet.microsoft.com/en-us/library/bb522806(v=sql.105).aspx
This was written for 2008R2, but seems mostly applicable to 2012 as well.
Give all that a shot, then post back here with a more specific question if you get stuck.

Performance issues in SSRS? - Filtering measures

I am working on a report that I had no problems executing until I added two extra measures to my dataset. When I tried to preview my report it just kept loading and loading...
My dataset includes the two measures I am interested in here: Date of 2:nd markdown and Date of 3:rd markdown.
The dataset filter is nothing fancy, just [Date_of_2_nd_markdown] > 20120101. I only want to retrieve articles that have been marked down twice, and this should filter out the 0's.
Am I doing something wrong?
I have found a solution! My dataset was filtering through a lot of rows, and the best way to speed this up was to use an MDX query to retrieve and filter the data directly in the Query Designer.
I do not know why the filtering is faster here, but this is the query that I used:
FILTER([Product].[Article].MEMBERS, [Measures].[Date of 2:nd markdown] > 20120101)
I also removed any unnecessary fields that weren't being used in the report.
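For context, a complete query built around that filter might look like the following sketch (the cube name [Sales] is a placeholder; the measure and hierarchy names come from the question):

SELECT
    { [Measures].[Date of 2:nd markdown], [Measures].[Date of 3:rd markdown] } ON COLUMNS,
    FILTER( [Product].[Article].MEMBERS,
            [Measures].[Date of 2:nd markdown] > 20120101 ) ON ROWS
FROM [Sales]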

Out of Memory Exception in MVC 5?

I need to display 40,000 records, but I get a System.OutOfMemoryException in MVC 5. Sometimes 70,000 records load correctly and sometimes not even 40,000 records load. I need to display all of the records and export them to MS Excel.
I used a Kendo grid to display the records.
I have read that the Kendo grid doesn't handle a huge number of records.
From the Telerik forum:
When OpenAccess executes a query, the actual retrieval of results is split into chunks. A fetch size determines the number of records that are read from the database in a single pass. With a query that returns a lot of records, this means the fetch size is not exceeded and not all 40,000 records are retrieved into memory at one time. As you iterate over the result data, several reads from the database occur until the iteration is over. However, those subsequent reads accumulate in memory if you keep references to the objects being iterated.
An out-of-memory exception may be caused when you operate on all the records from the grid. The way to avoid such an error is to work with the data in chunks. For example, paging the grid and exporting data sequentially from each page will achieve this. The goal is to reduce the number of objects kept in memory at a time and let garbage collection free unneeded memory. A LINQ query with Skip() and Take() is ideal in cases where having all the data in memory is costly.
and from http://docs.telerik.com/devtools/aspnet-ajax/controls/grid/functionality/exporting/overview
We strongly recommend not to export large amounts of data, since there is a chance of encountering an exception (Timeout or OutOfMemory) if more than one user tries to export the same data simultaneously. RadGrid is not suitable for such scenarios, and we therefore suggest that you limit the number of columns and rows. It is also important to note that hierarchy and nested controls have a considerable effect on performance in this scenario.
What the above is basically saying is: reduce your result set via paging, and/or reduce the number of columns fetched from the DB to only what is actually needed.
Not really sure what else you could do. You have too much data and you're running out of memory, so you have to reduce the data to reduce the memory used.
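As an illustration of the chunked approach, here is the SQL equivalent of the LINQ Skip()/Take() pattern mentioned above (table and column names are hypothetical; OFFSET/FETCH requires SQL Server 2012 or later):

-- Page 3 of the result set at 1,000 rows per page: skip 2,000 rows, take the next 1,000
SELECT Id, Name, CreatedDate
FROM dbo.Records
ORDER BY Id
OFFSET 2000 ROWS
FETCH NEXT 1000 ROWS ONLY;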
Please go for paging, and try to export all 40,000 records without loading them onto the page. Loading all of the data at once is what takes time and leads to the out-of-memory exception.

How to improve DB load produced by SSRS reports

I would like to know whether there is a possibility of reducing the DB load produced by SSRS reports.
I have an SSRS report consisting of several sub-reports, every one of which has at least one DB query.
Many of them query the same data, since most sub-reports share a kind of template header filled with dynamic data.
Some sub-reports are shown only if a query returns data. So the data is queried once to determine whether to show the sub-report at all, and then the sub-report itself queries the same data again to show it in a table.
In general, I need a mechanism to pass queried DB data from a parent report to a sub-report. The parent report queries some data, iterates over the data sets, and for every data set shows a sub-report, passing the current data set as a parameter.
I could not find a mechanism to pass the data set (data row) itself. That's why I show the sub-report by passing a kind of data set ID. The sub-report then queries the same data again, filters by the passed data set ID, and shows only the relevant data set. This causes huge load on the DB.
Thank you in advance!
The design you describe is fairly standard and I would not expect it to cause "huge load on the DB". I would expect the DB load of running 10 filtered sub-reports to only be about 10-20% more than running one report covering the same 10 items.
I would add an index on the "data set ID" column to make that filter more efficient.
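For example (the index, table, and column names here are hypothetical):

-- Support the sub-report's filter on the data set ID
CREATE NONCLUSTERED INDEX IX_ReportDetail_DataSetId
    ON dbo.ReportDetail (DataSetId);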
Depending on the complexity of your sub-reports, using the Lookup function may be an acceptable, faster solution. The earlier advice about hiding rows or sub-reports with no data applies here too.

Handle large dataset in BIRT

I am running into issues with out-of-memory exceptions. I need to display a large set of data in a crosstab: 5,277,888 rows aggregated into 403,920 rows. I don't think BIRT can handle this and would like some advice.
These are the options I am considering:
1. Somehow fetch some data at a time and aggregate it (might still run out of memory).
2. Find a different reporting framework that renders HTML.
3. Not use a crosstab; do all of the aggregation server-side and display it in a pseudo crosstab.
Fetching a large amount of data and handing it to BIRT increases data traffic and often (as in your case) leads to the system / report engine hanging.
Your thinking is correct (Option 3): it is preferable to use aggregate functions in your DB and give BIRT already-summarized data.
SQL also provides options for crosstab output (the SQL PIVOT function) if required.
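For example, a hypothetical SQL Server PIVOT sketch that pre-aggregates rows into crosstab shape before BIRT ever sees them (all object names are made up):

-- Pivot sales into one row per article with a column per quarter
SELECT Article, [Q1], [Q2], [Q3], [Q4]
FROM (
    SELECT Article, Quarter, SalesAmount
    FROM dbo.Sales
) AS src
PIVOT (
    SUM(SalesAmount) FOR Quarter IN ([Q1], [Q2], [Q3], [Q4])
) AS pvt
ORDER BY Article;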
