Access 2013 report opens very slowly - performance

I have a weird problem here with a report that I use every day.
I moved from XP to Windows 7 some time ago and use Access 2013.
(The language is German, so sorry, I can only guess what the views are called in English.)
"Suddenly" (I really can't say when this started), opening the report in report view takes VERY long, around 1 minute or so. Then switching to page view and formatting the report takes only 2 or 3 seconds. Switching back to report view again takes 1 minute.
The report has a complex query as its data source (in fact, a UNION of 8 sub-queries). Opening this query displays the data after 1 second, which is fine.
All tables are linked from the same ODBC data source, which points to a MySQL server on our network.
For further testing, I opened every table the queries use, one after another. I noticed that opening each table takes around 9 seconds, no matter whether it's a small or a big table. Always these 9 seconds.
The ODBC data source is defined using the IP address of the server, not the name, so I don't consider it a name-server problem or timeout.
What could cause this slowdown when opening tables?
I'm puzzled.

Here are a few steps I would try:
- Take a fresh copy of the Access app running on one of those "fast clients" and see if that solves the issue.
- Try comparing performance with a fast client after setting the same default printer.
- Check the version of the ODBC driver on both machines, and if you rely on a DSN, compare them.
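One more diagnostic worth trying (an assumption on my part, not one of the steps above): since the record source is a UNION of 8 sub-queries over the linked tables, recreate it as a pass-through query so the MySQL server executes the whole thing and Access only fetches the finished rows. A minimal sketch, with placeholder table and column names:

-- Pass-through query text (MySQL dialect; orders_2013 and orders_archive
-- stand in for the report's real source tables).
SELECT order_id, order_date, amount FROM orders_2013
UNION ALL   -- UNION ALL skips the duplicate-removal sort a plain UNION forces
SELECT order_id, order_date, amount FROM orders_archive;

If the pass-through version opens in a second or two while the linked-table version still takes a minute, that would point to Access's per-table ODBC connection handling rather than the query itself.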

Related

How to make a Spotfire link open faster?

I've published a Spotfire file with 70 .txt files linked to it. The total size of the files is around 2 GB. When the users open it in their web browser, it takes roughly 27 minutes to load the linked tables.
I need an option that improves opening performance. The issue seems to be the amount of data and the way the files are linked to Spotfire.
This runs on a server and the users open the BI in their browser.
I've tried embedding the data; it lowers the load time, but it forces me to interact with the software every time I want to update the data, and the solution is supposed to run automatically.
I need this to open in less than 5 minutes.
Update:
- I need the data to be updated at least twice a day.
- The embedded link is acceptable from the time perspective, but the system needs to run without my intervention.
- I've never used Spotfire Automation Services.
Schedule the report to cache twice a day on the Spotfire server by setting up a rule under Scheduling and Routing. The good thing about this is that while it is updating the analysis for the second time during the day, it will still let users quickly open the older data until the update is complete. To the end user it will open in seconds, but behind the scenes you have just pre-opened the report. Once you set up the rule, this runs automatically with no intervention needed.
All functionality and scripting within the report will work the same, and it can be opened many times at the same time by different users. This is really the best way if you have to link to that many files. Otherwise, try collapsing files, aggregating data, and removing all unnecessary columns and data tables so the data pulls through faster (see the sketch below).
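To make the aggregation suggestion concrete, here is a minimal sketch. It assumes the 70 .txt extracts could be staged into a database table and exposed to Spotfire through an information link instead of 70 file links; staged_sales, region, product, amount and load_date are all placeholder names, and the date arithmetic syntax varies by database:

-- Aggregate and trim the data before Spotfire ever loads it.
SELECT region,
       product,
       SUM(amount) AS total_amount
FROM   staged_sales
WHERE  load_date >= CURRENT_DATE - INTERVAL '30' DAY   -- keep only the window the analysis uses
GROUP  BY region, product;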

Oracle Apex Interactive Report bad performance while loading

I have an interactive report (IR) in one of my APEX applications. The SQL query used in the IR runs fine when executed in SQL Developer.
But at times in the application it gets stuck and takes longer than usual to load the IR. (Usually it takes less than 5 seconds to load, but at times more than 50 seconds.)
What might be the possible reasons for it to load slowly?
The query is well tuned, and the IR has default settings with no modifications. I have also checked the stats on the tables, and they are fresh.
The SQL query used in the IR fetches 10k records.
If you go into Component View and then click the Interactive Report under Regions, there is a setting near the bottom, under the Performance heading, called Maximum Rows To Process. Limiting the number of rows to display also sped things up for me.
Sorry, but I can't write comments. Is there a database view in your query?
I had a similar situation where a query against a database view with 6 million records took around 3 minutes to complete in an Oracle APEX IR but only 10-15 seconds in SQL Developer. After some research I tried putting the SQL from the view directly into the IR, and the result was almost the same as in SQL Developer.
You can also remove pagination from the IR, or change it from "x to y of z" to just "x to y".
I hope this can help you.
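To make that suggestion concrete, a sketch with placeholder names (my_report_view, orders and customers are all hypothetical):

-- IR source before: the optimizer only sees the view.
SELECT * FROM my_report_view

-- IR source after: the view's defining query pasted directly into the
-- region source, so the filters and sorts the IR adds can be merged
-- into the base query.
SELECT o.id, o.status, c.customer_name
FROM   orders o
JOIN   customers c ON c.id = o.customer_id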
Query response time in SQL Developer versus a web browser cannot be compared directly. Some of the reasons for the sluggishness could be related to server setup, server load, current user traffic, page load processes, page and region rendering, the number of regions, components and plugins, the navigation menu query, the report query, the number of columns and rows being displayed, row content length, APEX items (especially LOVs with SQL queries), etc.
From your question it looks like the performance issue is not consistent, so I think it may be related to server setup or traffic. Check whether you see any difference in load time after bouncing the server, if that's an option. Try to isolate the problem: if the issue is specific to the interactive report, build a classic report and compare times.
Another thing that has helped me in the past is to compare and verify compute times using the APEX debugger.
Also look at the Network and Timeline tabs in the Chrome developer tools.
Implement indexes on your tables (see the sketch after this list)
Verify with your DBA whether you have database locks
Check the amount of logging in the database
Switch to classic reports.
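For the first two points, a couple of hedged Oracle sketches (table, column and index names are placeholders):

-- Index the columns the IR filters and sorts on most often.
CREATE INDEX orders_status_date_ix ON orders (status, order_date);

-- Check for blocking locks yourself before going to the DBA.
SELECT sid, blocking_session, event, seconds_in_wait
FROM   v$session
WHERE  blocking_session IS NOT NULL;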
Regards

Why does using MS Access to query data from a remote Oracle DB into a local table take so long?

I am connecting to a remote Oracle DB using MS Access 2010 and the ODBC for Oracle driver.
In MS Access it takes about 10 seconds to execute:
SELECT * FROM SFMFG_SACIQ_ISC_DRAWING_REVS
But takes over 20 minutes to execute:
SELECT * INTO saciq_isc_drawing_revs FROM SFMFG_SACIQ_ISC_DRAWING_REVS
Why does it take so long to build a local table with the same data?
Is this normal?
The first query only reads the data, and you might not be getting the full result set back in one go (Access can display the first rows before the fetch finishes). The second both reads and writes the data, which will always take longer.
You haven't said how many records you're retrieving and inserting. If it's tens of thousands, then 20 minutes (roughly 1,200 seconds) seems quite good. If it's hundreds, then you may have a problem.
Have a look here https://stackoverflow.com/search?q=insert+speed+ms+access for some hints on how to improve the response, and perhaps change some of the variables - e.g. use SQL Server Express instead of MS Access.
You could also do a quick speed comparison test by trying to insert the records from a CSV file and/or Excel cut and paste.
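One way to run that comparison without leaving Access (a diagnostic sketch, reusing the question's own table names; run the two statements as separate queries and time each): split the single SELECT INTO into a structure-only step and an append step.

SELECT * INTO saciq_isc_drawing_revs
FROM SFMFG_SACIQ_ISC_DRAWING_REVS
WHERE 1 = 0;   -- copies the structure only; should return almost instantly

INSERT INTO saciq_isc_drawing_revs
SELECT * FROM SFMFG_SACIQ_ISC_DRAWING_REVS;   -- the append step carries the write cost

If the append step accounts for nearly all of the 20 minutes, that would suggest the bottleneck is Access writing rows into the local database, not the Oracle read.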

Cognos report performance and cache

I am working on Cognos 8. One of my reports takes roughly 1 minute to run, but sometimes only 20 seconds, as it loads from cache. Now I want to prove that the report ran from cache the second time. How can I prove that? Is the performance logged somewhere?
Cognos 8 uses the old 32-bit CQM engine.
The cache of this engine is very primitive:
- The cache only works within the same session.
- It only works if the query is identical.
- By default it caches the last 5 queries.
Based on the limitations above, you can do either of the following:
- Run the report in a different session (a different browser or user).
- Change any prompt value to a different value.
This will ensure the report is not running from cache.
If you want to trace the performance of queries, then using the database to capture them is the most efficient way (see the sketch after the link below). The alternative is activating the Cognos IPF trace:
Cognos 8 report performance issues
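For the database-capture approach, a sketch assuming the report's source is Oracle (for other databases the equivalent would be the general/query log). If the second run truly came from the Cognos cache, the execution count at the database will not move:

-- Run this before and after the second report execution; an unchanged
-- EXECUTIONS count proves the database never saw the query again.
SELECT sql_id, executions, last_active_time
FROM   v$sql
WHERE  sql_text LIKE '%some distinctive text from the report query%';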

SSRS Performance Mystery

I have a stored procedure that returns about 50,000 records in 10 seconds, using at most 2 cores, in SSMS. The SSRS report using the stored procedure was taking 20 minutes and would max out the processor on an 8-core server for the entire time. The report was relatively simple (i.e. no graphs or calculations). The report itself did not appear to be the issue, as I wrote the 50K rows to a temp table and the report could display that data in a few seconds. I tried many different ideas for testing, altering the stored procedure each time but keeping the original code in a separate window to revert back to. After one ALTER of the stored procedure, then going back to the original code, the report and server utilization started running fast, comparable to the performance of the stored procedure alone. Everything is fine for now, but I would like to get to the bottom of what caused this in case it happens again. Any ideas?
I'd start with a SQL Profiler trace of both the stored procedure when you execute it normally, and then the same SP when it's called by SSRS. Make sure you include the execution plans involved, so you can see if it's making some bad decisions (though that seems unlikely - the SQL Server should execute an optimal - or at least consistent - plan regardless of the query's source).
We used to have cases where Business Objects would execute stored procs dozens of times for no apparent reason, and it led to occasionally horrible performance, though I've never seen that same behavior with SSRS. It may be somewhere to start, though. You'll also see the execution begin/end times - that will make it clear whether it's the database layer that's hanging up, or whether the SQL Server hands back the data in 10 seconds and it's the SSRS service that's choking somewhere.
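One classic cause that fits these symptoms exactly (fast in SSMS, slow from SSRS, cured by any ALTER of the procedure) is a stale parameter-sniffed plan: SSRS connects with different SET options than SSMS, so it gets its own cached plan, and ALTER PROCEDURE evicts that plan. This is my guess at the cause, not something the question confirms. A minimal sketch of the usual mitigation, with a hypothetical procedure and table:

-- OPTION (RECOMPILE) compiles a fresh plan per run, so a plan sniffed
-- for unrepresentative parameter values can never be reused.
ALTER PROCEDURE dbo.GetReportRows
    @StartDate date
AS
BEGIN
    SELECT r.id, r.amount, r.created_at
    FROM   dbo.ReportRows AS r
    WHERE  r.created_at >= @StartDate
    OPTION (RECOMPILE);
END;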
The primary solution for speeding up SSRS reports is to cache them. If one does this (either by preloading the cache at 7:30 am, for instance, or by caching the reports on first hit), one will find massive gains in load speed.
You may also find that monthly restarts of the SSRS application domain resolve your issue.
Please note that I do this daily and professionally and am not simply waxing poetic on SSRS.
Caching in SSRS
http://msdn.microsoft.com/en-us/library/ms155927.aspx
Pre-loading the Cache
http://msdn.microsoft.com/en-us/library/ms155876.aspx
If you do not like initial reports taking long and your data is relatively static over the day - a daily general ledger or the like - you may increase the cache life-span.
Finally, you may also opt for business managers to instead receive these reports via email subscriptions, which will send them a point-in-time Excel report which they may find easier and more systematic.
You can also use parameters in SSRS to allow for easy filtering by the user and faster queries. In the query builder, type IN(@SSN) under the Filter column for the field you wish to parameterize; you will then find the parameter created in the Parameters folder, just above Data Sources, in the upper left of your BIDS GUI.
(If you do not see the Data Sources section in SSRS, hit CTRL+ALT+D.)
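A sketch of what such a dataset query ends up looking like (table and column names are placeholders):

-- @SSN becomes a report parameter that SSRS creates automatically;
-- the IN (@SSN) filter is expanded to the user's selections at run time.
SELECT p.ssn, p.last_name, p.hire_date
FROM   dbo.Personnel AS p
WHERE  p.ssn IN (@SSN);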
See a nearly identical question here: Performance Issues with SSRS
