I have a tabular report whose dataset returns 3 million records, and while running the report I get an Out of Memory exception. We are using SSRS 2012 and the Microsoft Jet database engine as well. Please suggest how I can export the report without this issue.
I have a rudimentary question about connecting an SSAS-TM (SQL Server Analysis Services - Tabular Model) database (on-premises) on my local machine to Power BI Desktop (also on my local machine) via Import mode.
I am not at all familiar with the memory allocation parameters.
The relational database I have is a very simple AdventureWorksDW. I developed a SQL Server Analysis Services - Tabular Model project using Visual Studio 2015 and deployed the project as a new database on the Analysis Services engine. I am able to query tables in this SSAS-TM database from SSMS (SQL Server Management Studio) using DAX, in the following format:
EVALUATE 'tablename'
However, when I try to connect this SSAS-TM database to my Power BI Desktop via an Import connection, I get the following error.
AnalysisServices: The operation has been cancelled because there is not enough memory available for the application. If using a 32-bit version of the product, consider upgrading to the 64-bit version or increasing the amount of memory available on the machine.
I have the properties from the SSMS in this image file.
I tried following some links on learn.microsoft.com about setting the VertiPaq memory parameters, but they have not been useful to me.
My simple question is this:
What properties do I need to change in the image file above to make this connection successful? This is only for training, so I am using AdventureWorksDW here; size should not be an issue. My laptop has plenty of memory and disk space and is 64-bit, and Power BI Desktop is 64-bit.
Can someone help me?
The Power BI connector for Analysis Services that has the table picker will generate an MDX query instead of a DAX query, and if you try to extract more than a handful of rows, it will fail. It's a known issue, but considered low priority.
Don't import from SSAS. Use Live Connect. You've already got a Cube/Data Set, you can just connect to it and write reports.
If you absolutely must import from SSAS, use a DAX query, e.g.
In M:
AnalysisServices.Database("MySSAS", "AdventureWorksDW", [Query="evaluate FactResellerSales", Implementation="2.0"])
or in the UI (paste the same DAX query into the "MDX or DAX query (optional)" box of the Analysis Services connector dialog).
Use Live Connect if you are only getting data from the cube. If you are also getting data from Excel files, etc., then you are forced to use Import. I have used Import to get many tables from a cube with no memory errors. What you can do is import 3 tables at a time, then in the Power BI Advanced Editor choose the option to add more tables from the cube, add another 3 tables, and see how that goes. With Live Connect, even if the relationship columns are hidden you still get them; with Import, if they are hidden you can't select them, so you can't create the relationships.
I am in a situation where I have to check and confirm whether the SSAS partition queries run in parallel while the SSAS cube is being processed by an SSIS job. The SSIS job/package uses an 'Analysis Services Processing Task' to process the cube by selecting each object (dimensions and partitions) individually instead of selecting the SSAS database directly.
Can anyone please explain how to check the parallelism using SQL Profiler?
Also, can anyone point out why cube processing done this way takes longer than cube processing by an SSIS job in which the 'Analysis Services Processing Task' selects the SSAS database name directly?
Please help with any comments/suggestions.
Many Thanks
Regards,
Update: The back-end database from which the partitions fetch their data is Oracle.
I think there is an easier way than using SQL Profiler: you can use the excellent stored procedure sp_whoisactive to check which queries are currently running on the server (the data source's SQL Database Engine) while the Analysis Services Processing Task is executing.
Just download the stored procedure and create it in your master database.
sp_whoisactive homepage
How to Log Activity Using sp_whoisactive
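A minimal usage sketch while the processing task is running (it assumes sp_WhoIsActive has already been created in master, as described above):

-- Run this on the relational data source while the SSIS processing task executes.
EXEC master.dbo.sp_WhoIsActive
     @get_additional_info = 1,   -- extra session details
     @find_block_leaders  = 1;   -- highlight blocking chains, if any
-- Several concurrent sessions issuing the partitions' SELECT statements
-- indicate that the partitions are being read in parallel.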
Hint: In SQL Server Management Studio, go to the data source properties and check the 'Maximum number of connections' property, since it may limit parallel query execution.
If you are looking for an answer using SQL Profiler, you can simply create a trace that monitors the SQL Server instance hosting the data sources used by the partitions. If, while the partitions are being processed, many SQL queries are executed at the same time, then parallelism is being achieved. (A quick cross-check using the DMVs is sketched after the links below.)
If you are new to SQL Profiler you can refer to the following links to learn how to monitor only T-SQL commands:
How to monitor just t-sql commands in SQL Profiler?
Use SQL Server Profiler to trace database calls from third party applications
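As an alternative to a full trace, a quick cross-check is to query the engine's DMVs on the data-source server while processing is running. This is only a sketch and assumes the data source is a SQL Server instance; per the update above, an Oracle source would need Oracle's own session views (e.g. V$SESSION) instead.

-- List the requests currently executing against the data source;
-- several concurrent partition SELECTs at the same time suggest parallel processing.
SELECT r.session_id,
       r.status,
       r.start_time,
       t.text AS query_text
FROM sys.dm_exec_requests AS r
CROSS APPLY sys.dm_exec_sql_text(r.sql_handle) AS t
WHERE r.session_id <> @@SPID;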
But if you are looking for a simpler solution, then the other answer is what you are looking for.
I've been trying to create an SSIS project to read from an Oracle 11.x database to an SQL Server database.
When I set this up in Visual Studio 10 Shell, I do not receive any logs. It gives me a success message, but nothing happens.
I tried to connect to an Oracle 12c database and the same happened.
I tried to get data from the Oracle 11.x database and dump it into an Excel file. I also tried to get data from an Oracle 11.x table and dump it into a new Oracle 11.x table (in the same database), and in both cases I got the following error:
> TITLE: Microsoft Visual Studio
> Failed to start project
> ------------------------------
> ADDITIONAL INFORMATION:
> Exception deserializing the package "The package failed to load due to error 0xC0011008 "Error loading from XML. No further detailed error information can be specified for this problem because no Events object was passed where detailed error information can be stored.". This occurs when CPackage::LoadFromXML fails. ". (Microsoft.DataTransformationServices.VsIntegration)
> The package failed to load due to error 0xC0011008 "Error loading from XML. No further detailed error information can be specified for this problem because no Events object was passed where detailed error information can be stored.". This occurs when CPackage::LoadFromXML fails. (Package)
> ------------------------------
> BUTTONS:
> OK
Can anyone help me please?
Thank you
You haven't posted exactly how you are trying to get the data from Oracle, so I can't say much about the error. I can only give my solution for 2008 R2:
Create an Oracle linked server on your SQL Server, and then use OPENQUERY in the SSIS package to pull anything you need.
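A minimal sketch of that approach (the linked-server name, TNS alias, credentials and table are placeholders, and it assumes the Oracle OLE DB provider OraOLEDB.Oracle is installed on the SQL Server machine):

-- Create the Oracle linked server (placeholder names).
EXEC master.dbo.sp_addlinkedserver
     @server     = N'ORA_LINK',
     @srvproduct = N'Oracle',
     @provider   = N'OraOLEDB.Oracle',
     @datasrc    = N'MyTnsAlias';      -- TNS alias from tnsnames.ora

-- Map a SQL Server login to an Oracle account (placeholder credentials).
EXEC master.dbo.sp_addlinkedsrvlogin
     @rmtsrvname  = N'ORA_LINK',
     @useself     = N'False',
     @rmtuser     = N'oracle_user',
     @rmtpassword = N'oracle_password';

-- Pull the rows through OPENQUERY; this can be used as the source query in the SSIS package.
SELECT *
FROM OPENQUERY(ORA_LINK, 'SELECT * FROM MY_SCHEMA.MY_TABLE');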
I have created a report using Pentaho Report Designer; the report data is retrieved from an Oracle database. Once I upload the report to the Pentaho BI Server and try to retrieve the report, I get this error:
Report validation failed.
The Pentaho Tomcat log shows this error:
18:10:09,019 ERROR [AbstractReportProcessor] 1291807259: Report processing failed.
18:10:09,020 ERROR [SimpleReportingComponent] [execute] Component execution failed.
org.pentaho.reporting.engine.classic.core.ReportDataFactoryException: Failed at query: SELECT * FROM testTable
Am I missing a library that should be added to Tomcat, or is there any configuration I should change?
Would anyone who has come across this error explain how I can overcome it?
Have you put the Oracle connector file into the tomcat/lib folder?
You have to download the Oracle JDBC connector appropriate to your Oracle version and place it there, for example ojdbc14.jar.
I am using the Attunity Oracle drivers to connect to an Oracle database on a remote server, retrieve data, and dump it into an Excel file.
Everything works fine in Visual Studio (BIDS); from VS I can connect directly to the remote Oracle server and retrieve the data.
But when I deploy this ETL to my production server (64-bit Windows Server 2008 & SQL Server 2012), the ETL always gets stuck in the execution phase. After running for some time (20-30 mins), it gives the following warning and then keeps running without raising any errors:
[SSIS.Pipeline] Information: The buffer manager detected that the system was low on virtual memory, but was unable to swap out any buffers. 0 buffers were considered and 0 were locked.
Either not enough memory is available to the pipeline because not enough is installed, other processes are using it, or too many buffers are locked.
Some more info -
I have checked server memory, only 3GB is in use out of total 12GB.
I have already set SQL Server to use a maximum of 8 GB (see the memory-cap sketch after this list).
I am using SQL Server Agent job to run the ETL periodically every 15 mins.
I have tried stopping all other ETLs on the server and running this ETL through the Execute Package Utility, but the result is still the same.
I am using a date range in the Oracle query to retrieve the data; when the query for a particular date range does not return any data, the ETL execution is always successful!
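For reference, capping the database engine's memory is typically done with sp_configure; a minimal sketch of the 8 GB cap mentioned above (assuming it was applied this way) is:

-- Cap the database engine at 8 GB so SSIS and the OS keep the remaining memory (value in MB).
EXEC sys.sp_configure N'show advanced options', 1;
RECONFIGURE;
EXEC sys.sp_configure N'max server memory (MB)', 8192;
RECONFIGURE;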
Progress log (Execute Package Utility) -
Any pointers/suggestions?
Hope I am able to describe the issue properly.
Update (5/Mar/2014) -
I tried reducing the amount of data I am retrieving, and the ETL was successful.
I have also set the DefaultBufferSize to 10 MB (max size).
But if the query data exceeds the DefaultBufferSize, then why does the package succeed on my development machine but not on the server?
Thanks,
Prateek