How does SSRS cache reports that contain subreports?

I'm trying to figure out how SSRS handles caching of subreports. The report I have has a lot going on: dynamic formatting, dynamic links, dynamic graphs, etc. Because of all this, it takes quite a while to load (about 10 seconds), and every time you click on something in the report, it has to reload everything (another 10 seconds). I looked into caching options, but the problem is I need the data shown in the graphs to be the live data (the rest of the report doesn't display any live data).
I came up with an idea to put the graphs in a subreport, so that I could cache the main report, and only the subreport would have to be reprocessed on every load. My thinking was that this would significantly cut down on the processing time for the whole report, and I could schedule the cache to be preloaded every night.
I know that Reporting Services isn't the ideal way to deliver this type of thing, and creating it as a website would yield much better performance, but my company wants to try and make it work with SSRS first.
Does anyone know if this would work? Does SSRS cache subreports separately from their parent reports, or would caching the parent report also cache the subreport?

Subreports are processed at the same time as the main report you are calling, so you would need to set up caching on the main report to cache the contents of the subreport.
For the components that do not require live data, you can cache shared datasets to improve report processing time. Separate the data sources for the items that need live data feeds from the items you can cache.
https://msdn.microsoft.com/en-us/library/ee636149.aspx
To open the Caching properties page for a shared dataset
Open Report Manager, and locate the shared dataset for which you want to configure caching properties:
Point to the shared dataset, and click the drop-down arrow.
In the drop-down list, click Manage. The General properties page for the shared dataset opens.
Click the Caching tab.
http://i.stack.imgur.com/vFHQP.png
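To make the "separate the data sources" idea concrete, here is a minimal sketch of two dataset queries. All table, column, and parameter names here are placeholders, not anything from the original report: the first query would back a shared dataset with caching enabled for the static content, while the second stays a regular, uncached dataset feeding the graphs.

    -- Dataset 1: static content, suitable for a cached shared dataset.
    -- Table, column, and parameter names are hypothetical placeholders.
    SELECT RegionName, ProductLine, MonthlyTotal, FormattingFlag
    FROM   dbo.SalesSummary            -- refreshed nightly, safe to cache
    WHERE  ReportMonth = @ReportMonth;

    -- Dataset 2: live data for the graphs, left uncached so every run hits the source.
    SELECT MetricTime, MetricValue
    FROM   dbo.LiveMetrics             -- must always reflect current data
    WHERE  MetricTime >= DATEADD(HOUR, -24, GETDATE());

With this split, a nightly cache refresh covers the expensive static datasets, and only the small live query runs on every view.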

Related

Automating Validation of PDF Prepared Reports

Our team uses Spotfire to host online analyses and also prepare monthly reports. One pain point that we have is around validation. The reports are all prepared reports, and the process for creating them each month is as simple as 1) refresh the data (through an Infolink connected to Oracle) and 2) press a button to export each report. The format of the final product is a PDF.
The issue is that there are a lot of small things that can go wrong with the reports (filter accidentally applied, wrong month selected, data didn't refresh, new department not grouped correctly, etc.) meaning that someone on our team has to manually validate each of the reports. We create almost 20 reports each month and some of them are as many as 100 pages.
We've done a great job automating the creation of the reports, but now we have this weird imbalance where it takes like 25 minutes to create all the reports but 4+ hours to validate each one.
Does anyone know of a good way to automate, or even cut down, the time we have to spend each month validating the reports? I did a brief Google search and all I could find was in the realm of validating reports to meet government regulation standards.
It depends on 2 factors:
Do your reports have the same template (format) each time you extract them? You said that you pull them out automatically so I guess the answer is Yes.
What exactly are you trying to check/validate? You need to have a clear list of what you are validating. You mentioned month, grouping, and data values (for the refresh). The clearer the picture you have for validation, the more likely the process can be fully automated.
There are so-called RPA (robotic process automation) tools that can automate complex workflows.
A "data extract" task, which is part of a workflow, can detect and collect data from documents (PDF for example).
A robot that runs on the validating machine can:
batch read all your PDF reports from specified locations on your computer (or on another computer);
based on predefined templates it can read through the documents for specific fields that you specify (through defined anchors on the templates) and collect the exact data from there;
compare the extracted data with the baseline that you set (compare the month to be correct, compare a data field to confirm proper refresh of the data, another data field to confirm grouping, etc.);
It takes a bit of time to dissect the PDF for each report template and correctly set the anchors, but then it runs seamlessly each time.
One such tool I used is called Atomatik. It has a studio environment where you design the robot (or robots) and run the process.

How to make a Spotfire link open faster?

I've published a Spotfire file with 70 '.txt' files linked to it. The total size of the files is around 2 GB. When the users open it in their web browser it takes roughly 27 minutes to load the linked tables.
I need an option that improves opening performance. The issue seems to be the amount of data and the way the files are linked to Spotfire.
This runs on a server and the users open the BI in their browser.
I've tried embedding the data; it lowers the time, but forces me to interact with the software every time I want to update the data. The solution is supposed to run automatically.
I need to open this in less than 5 minutes.
Update:
- I need the data to be updated at least twice a day.
- The embedded link is acceptable from the time perspective, but the system needs to run without my intervention.
- I've never used Spotfire automation services.
Schedule the report to cache twice a day on the Spotfire server by setting up a rule under scheduling and routing. The good thing about this is that while it is updating the analysis for the second time during the day, it will still let users quickly open the older data until the update is complete. To the end user it will open in seconds, but behind the scenes you have just pre-opened the report. Once you set up the rule this will run automatically with no intervention needed.
All functionality and scripting within the report will work the same, and it can be opened up many times at the same time from different users. This is really the best way if you have to link to that many files. Otherwise, try collapsing files, aggregating data, removing all unnecessary columns and data tables for the data to pull through faster.

Exporting time series response data for VS2013 load tests

I am trying to figure out how to export and then analyze the results of a load test, but after the test is over it seems I cannot find the data for each individual request by URL. This data shows during the load test itself, but after it is over it seems as if that data is no longer accessible and all I can find are totals. The data that I want is under the "Page response time" graph on the graphs window during the test. I know this is not the response time for every single request and is probably averaged, but that would suffice for the calculations I want to make.
I have looked in the database on my local machine (LoadTest2010, where all of the summary data is stored) and I cannot find the data I'm looking for. I am load testing a single-page application, FYI.
My goal is to plot (probably in Excel) each request URL against the user load and analyze the slope of the response time averages to determine which requests scale the worst (and best). During the load test I can see this data and get a visual idea, but when it ends I cannot seem to find it to export.
A) Can this data be exported from within Visual Studio? Is there a setting required to make VS persist this data to the database? Under Run Settings, in the "Results" section, I have "Timing Details Storage" set to "All individual details" and the Storage Type set to "Database".
B) Is this data in any of the tables in the LoadTest2010 database where all of the summary data is stored? It might be easier to query manually if it's not spread out too much, but all I was able to find was summary data.
I was able to find the data in the database that I wanted. The tables I needed were WebLoadTestRequestMap (which has the request URIs in it) and LoadTestPageDetail (which has the individual response times themselves). They can be joined on webloadtestrequestmap.requestId and loadtestpagedetail.pageId (unintuitively).
I do have the "Results" section "Timing Details Storage" set to "All individual details" and the Storage Type set to "Database"; even so, it did not seem like every load test's results were available, maybe because of this setting.
More data on the layout of the load test database here: http://blogs.msdn.com/b/slumley/archive/2010/02/12/description-of-tables-and-columns-in-vs-2010-load-test-database.aspx
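Based on that table layout, a query along these lines should pull per-URL response times out of the LoadTest2010 database for plotting. The join keys come from the answer above; the URI and response-time column names and the run-id filter are assumptions, so verify them against the table description in the linked post.

    -- Per-request response times for one load test run (LoadTest2010 database).
    -- Join keys (RequestId = PageId) are as described above; the RequestUri,
    -- ResponseTime, and LoadTestRunId column names are assumptions to verify.
    SELECT m.RequestUri,
           d.ResponseTime
    FROM   dbo.WebLoadTestRequestMap AS m
           JOIN dbo.LoadTestPageDetail AS d
             ON d.PageId = m.RequestId
    WHERE  m.LoadTestRunId = @LoadTestRunId   -- restrict to a single test run
    ORDER BY m.RequestUri;

The result set can be copied straight into Excel and pivoted by request URL against user load.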

Reporting software for a huge database that has 20 million records

Is there any reporting or BI tool that can generate and preview a report almost instantly against a huge database? Imagine I create a report that pulls data from a single table that has 20 million records.
What I would like from a reporting or BI tool is for it to fetch only the data that a single page or document needs, so that it can show that page to the end user immediately and process the other pages in the background. Also, when navigating between pages, it should dispose of pages sensibly so that they do not stay in memory. Imagine the report has 10,000 pages (I know this is not practical), but the end users don't know the actual number of pages.
DBxtra is able to generate reports from big databases; it can create the first page of the report quickly and generate the rest of the pages in the background. But honestly, reports of more than 1,000 pages are not practical unless you plan to put them in storage for later audits.
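Whichever tool you pick, the "only fetch the current page" behaviour usually comes down to a paged query underneath. On SQL Server 2012 or later that is typically OFFSET/FETCH; a minimal sketch, where the table, columns, and page size are placeholders:

    -- Fetch only the rows needed for one report page from a 20-million-row table.
    -- Table/column names and the page size are hypothetical placeholders.
    DECLARE @PageNumber INT = 1,      -- 1-based page requested by the viewer
            @PageSize   INT = 50;     -- rows per report page

    SELECT OrderId, CustomerName, OrderDate, Amount
    FROM   dbo.Orders
    ORDER BY OrderId                   -- a stable sort order is required for paging
    OFFSET (@PageNumber - 1) * @PageSize ROWS
    FETCH NEXT @PageSize ROWS ONLY;

With an index supporting the ORDER BY, each page costs roughly the same regardless of how many total rows the table holds.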

SSRS Performance Mystery

I have a stored procedure that returns about 50,000 records in 10 seconds, using at most 2 cores in SSMS. The SSRS report using the stored procedure was taking 20 minutes and would max out the processor on an 8-core server for the entire time. The report was relatively simple (i.e. no graphs or calculations). The report did not appear to be the issue, as I wrote the 50K rows to a temp table and the report could display the data in a few seconds. I tried many different ideas for testing, altering the stored procedure each time but keeping the original code in a separate window to revert back to. After one ALTER of the stored procedure, going back to the original code, the report and server utilization started running fast, comparable to the performance of the stored procedure alone. Everything is fine for now, but I would like to get to the bottom of what caused this in case it happens again. Any ideas?
I'd start with a SQL Profiler trace of both the stored procedure when you execute it normally and the same SP when it's called by SSRS. Make sure you include the execution plans involved, so you can see if it's making some bad decisions (though that seems unlikely; SQL Server should execute an optimal, or at least consistent, plan regardless of the query's source).
We used to have cases where Business Objects would execute stored procs dozens of times for no apparent reason, and it led to occasionally horrible performance, though I've never seen that same behavior with SSRS. It may be somewhere to start, though. You'll also see the execution begin/end times; that will make it clear whether it's the database layer that's hanging up, or whether SQL Server hands back the data in 10 seconds and then it's the SSRS service that's choking somewhere.
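Alongside the trace, you can also check whether SSRS and SSMS ended up with different cached plans for the procedure, since differing SET options or parameter sniffing produce separate plan cache entries. A rough sketch against the procedure's own database; the procedure name is a placeholder:

    -- Compare cached plans and aggregate stats for the stored procedure across callers.
    -- 'dbo.MyReportProc' is a placeholder for the actual procedure name.
    SELECT ps.execution_count,
           ps.cached_time,
           ps.total_elapsed_time / 1000 AS total_elapsed_ms,
           ps.total_worker_time  / 1000 AS total_cpu_ms,
           qp.query_plan                -- inspect each plan for differences
    FROM   sys.dm_exec_procedure_stats AS ps
           CROSS APPLY sys.dm_exec_query_plan(ps.plan_handle) AS qp
    WHERE  ps.object_id = OBJECT_ID('dbo.MyReportProc');

Two rows with very different elapsed times would point at the database layer rather than SSRS rendering.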
The primary solution for speeding up SSRS reports is to cache the reports. If one does this (either by preloading the cache at 7:30 am, for instance, or by caching the reports on first hit), one will find massive gains in load speed.
You may also find that monthly restarts of the SSRS application domain resolve your issue.
Please note that I do this daily and professionally and am not simply waxing poetic on SSRS.
Caching in SSRS
http://msdn.microsoft.com/en-us/library/ms155927.aspx
Pre-loading the Cache
http://msdn.microsoft.com/en-us/library/ms155876.aspx
If you do not like initial reports taking long and your data is static (i.e. a daily general ledger or the like, where the data is relatively static over the day), you may increase the cache life-span.
Finally, you may also opt for business managers to instead receive these reports via email subscriptions, which will send them a point-in-time Excel report that they may find easier and more systematic.
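To verify that the cache is actually being hit and to see where the time goes (data retrieval vs. processing vs. rendering), you can look at the execution log in the ReportServer catalog. A sketch, assuming the default ReportServer database name and a placeholder report path:

    -- Recent executions of one report: Source = 'Cache' means the cached copy was served.
    -- Assumes the default ReportServer catalog database; the report path is a placeholder.
    SELECT TOP (50)
           ItemPath,
           TimeStart,
           Source,                -- 'Live', 'Cache', 'Snapshot', ...
           TimeDataRetrieval,     -- ms spent fetching data
           TimeProcessing,        -- ms spent processing the report
           TimeRendering,         -- ms spent rendering
           Status
    FROM   ReportServer.dbo.ExecutionLog3
    WHERE  ItemPath = '/Reports/MyMainReport'   -- hypothetical path
    ORDER BY TimeStart DESC;

If TimeDataRetrieval dominates, caching or dataset work will help most; if TimeProcessing or TimeRendering dominates, the report design itself is the bottleneck.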
You can also use parameters in SSRS to allow for easy filtering by the user and faster queries. In the query builder, type IN(@SSN) under the Filter column for the field you wish to parameterize; you will then find the parameter created in the Parameters folder just above Data Sources in the upper left of your BIDS GUI.
If you do not see the data source section in SSRS, hit CTRL+ALT+D.
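The dataset query that the query builder produces ends up along these lines; the table and column names below are placeholders, only the IN(@SSN) pattern is the point:

    -- Parameterized dataset query: SSRS creates a report parameter named SSN
    -- and substitutes the user's selected value(s) into the IN clause at run time.
    -- Table and column names are hypothetical placeholders.
    SELECT p.SSN, p.LastName, p.FirstName, a.AccountBalance
    FROM   dbo.Person  AS p
           JOIN dbo.Account AS a ON a.PersonId = p.PersonId
    WHERE  p.SSN IN (@SSN);   -- filter applied in the database instead of the report

Pushing the filter into the query keeps the result set small, which is usually much faster than filtering inside the report.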
See a nearly identical question here: Performance Issues with SSRS
