Power BI Oracle on-prem gateway issue - Unable to find the requested .Net Framework Data Provider - Oracle

I have a Power BI dashboard that runs direct queries against an Oracle database, where I pull the data in with SQL queries. In my local .pbix file everything works fine. When I publish it to our enterprise powerbi.com site and try to refresh the data, I get the following error:
{"error":{"code":"DM_GWPipeline_Gateway_ProviderDataAccessArgumentError","pbi.error":
{"code":"DM_GWPipeline_Gateway_ProviderDataAccessArgumentError","parameters":{},"details":
[{"code":"DM_ErrorDetailNameCode_UnderlyingErrorMessage","detail":{"type":1,"value":"Unable to
find the requested .Net Framework Data Provider. It may not be installed."}},
{"code":"DM_ErrorDetailNameCode_UnderlyingHResult","detail":
{"type":1,"value":"-2147024809"}}],"exceptionCulprit":1}}}
Does anyone have any idea what could be causing the issue?
I have trawled through the Power BI forums and there does not seem to be a definitive remedy.
I don't have this issue using TIBCO Spotfire; however, we are being pushed to use Power BI.

I found a workaround. It seems that after you create the views in Oracle, in Power BI you should use Import mode rather than DirectQuery when you connect. Just pick the Import option and select the table for the view you created.
I still see the same error if I manually refresh the data in the Power BI service, but I can at least see my dashboard working now, with the tables and visuals fully visible.
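For illustration, here is a minimal M sketch of such an Import-mode query; the host, service name, and MY_SCHEMA.MY_VIEW are placeholders for your own connection details, not values from the original post:

let
    // Import mode: the query runs once per refresh rather than per visual.
    // Replace the server string and the view name with your own.
    Source = Oracle.Database(
        "myhost:1521/myservice",
        [Query = "SELECT * FROM MY_SCHEMA.MY_VIEW"]
    )
in
    Source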

Related

From local Excel sheets to an online dashboard

I have around 30,000 Excel sheets containing a standardized report.
The goal is to provide an online dashboard to view the data.
My first thought as a programmer is to create a database, find a way to import the data, and then build a front end on top to create a customized dashboard.
Is there an easier way to do this?
If your reports are always going to be in Excel, I don't think converting them to a database is a good approach. It would only add the overhead of converting the data and building a front end for it; that would be over-engineering.
Instead, you can use a data visualization tool like Retool, which already offers visualization components on top of any source data. The source can be anything, be it an Excel sheet or a database.
There are other alternatives as well, like Redash, which also offers connections to CSV files.
However, if you still feel that creating a database such as MySQL (or any other database of your choice) is the better approach, then you should consider Metabase or Superset. These are excellent data visualization tools that only need database credentials to connect to the DB. They are feature-rich, widely accepted in the industry, and can produce just about every visualization you might need.
Also, the best part is that all the tools mentioned here are either fully open source or have a generous trial period.
FYI, for a better and more effective solution, I would recommend running a SQL Server instance on your machine, uploading the Excel reports to it, and then connecting that SQL Server instance to Metabase. That covers everything you need without you having to write any code to manage all of this.
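As a rough sketch of the "upload the Excel report to SQL Server" step, something like the following T-SQL works, assuming the Microsoft ACE OLE DB provider is installed and ad hoc distributed queries are enabled; the file path, sheet name, and target table name are placeholders:

-- One-off import of a worksheet into a new table
SELECT *
INTO dbo.StandardReport
FROM OPENROWSET(
    'Microsoft.ACE.OLEDB.12.0',
    'Excel 12.0;Database=C:\Reports\report0001.xlsx;HDR=YES',
    'SELECT * FROM [Sheet1$]'
);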

Microsoft SSAS-Tabular Model (TM) connection to Power BI via Import mode - 'not enough memory available for the application'

I have a rudimentary question about connecting an SSAS-TM (SQL Server Analysis Services - Tabular Model) database (on-premises) on my local machine to Power BI Desktop (also on my local machine) via Import mode.
I am not at all familiar with the memory allocation parameters.
The relational database I have is the very simple AdventureWorksDW. I developed a SQL Server Analysis Services - Tabular Model project using Visual Studio 2015 and deployed the project as a new database to the Analysis Services engine. I am able to query tables in this SSAS-TM database in SSMS (SQL Server Management Studio) using the DAX language, in the following format:
EVALUATE 'tablename'
However, when I try to connect this SSAS-TM database to my Power BI Desktop via an Import connection, I get the following error.
AnalysisServices: The operation has been cancelled because there is not enough memory available for the application. If using a 32-bit version of the product, consider upgrading to the 64-bit version or increasing the amount of memory available on the machine.
I have captured the server properties from SSMS in this image file.
I tried setting some of the VertiPaq memory parameters after reading a few articles on learn.microsoft.com, but that has not been useful to me.
My simple question is this:
What properties do I need to change in the image file above to make this connection successful? This is only for training, so I am using AdventureWorksDW; size is not an issue. My laptop has plenty of memory and disk space and is 64-bit, and my Power BI Desktop is 64-bit.
Can someone help me?
The Power BI connector for Analysis Services that has the table picker generates an MDX query rather than a DAX query, and if you try to extract more than a handful of rows, it will fail. It's a known issue, but a low-priority one.
Don't import from SSAS. Use Live Connect: you've already got a cube/dataset, so you can just connect to it and write reports.
If you absolutely must import from SSAS, use a DAX query, e.g.
In M:
AnalysisServices.Database("MySSAS", "AdventureWorksDW", [Query="evaluate FactResellerSales", Implementation="2.0"])
or enter the same DAX query in the connector's optional query box in the UI.
Use Live Connect if you are only getting data from the cube. If you are also getting data from Excel files, etc., then you are forced to use Import. I have used Import to get many tables from a cube with no memory errors. What you can do is import 3 tables at a time, then in the Power BI Advanced Editor add another 3 tables from the cube, and see how that goes. With Live Connect, even if the relationship columns are hidden, you still get them. With Import, if they are hidden, you can't select them, so you can't create the relationships.
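For reference, a minimal sketch of one such Import query as it would appear in the Advanced Editor, reusing the server and database names from the answer above (DimProduct is an assumed table name; you would create one query like this per table):

let
    // Each table gets its own query with its own EVALUATE statement
    Source = AnalysisServices.Database(
        "MySSAS",
        "AdventureWorksDW",
        [Query = "EVALUATE DimProduct", Implementation = "2.0"]
    )
in
    Source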

Connecting Power BI to Oracle? Filling out tnsnames.ora

We are using Oracle Cloud CRM. Our organization has been using it for quite a few years, and the people who set it up have already left. I am new here and am trying to connect the CRM data to Power BI. I installed the ODAC drivers and everything. However, I do not know what to enter in my tnsnames.ora file. That file asks for a service name, a server name, and a host ID/name. No one in the organization has this information. I reached out to Oracle support, and here is the response they gave:
Oracle’s response to the service request: “These details are not found in Documentation because they cannot be provided. You'll need to reach out to Power BI support to see if there are alternate ways to create this integration without these details.”
Does anyone know why Oracle would not share these details with us? Is there any other way to find out the server and service name? How should I proceed in such a scenario?
As of now, we use a link to log in to the service, and we do not have much documentation.
Let me attempt to translate.
We are using Oracle cloud CRM
Oracle is hosting our application.
I am new here and am trying to connect the CRM data to Power BI
We want to query the database being used to store our application data.
Oracle’s response to the service request: “These details are not found in Documentation because they cannot be provided..."
This is where it gets fun. They are saying: we do not give clients direct access to the database where their data is hosted. In other words, you CANNOT connect your tool directly to the database.
So I think your best bet is to look into the REST APIs that have been published for you as a subscriber to the service. These are often provided in lieu of direct access to your hosted environment.
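As a rough sketch of what that could look like in Power BI's M, assuming a hypothetical endpoint URL and an "items" array in the JSON response (check the REST API documentation for your subscription for the real shape):

let
    // Hypothetical endpoint; credentials are set in Power BI's data source settings
    Raw = Json.Document(Web.Contents("https://yourpod.example.com/api/contacts")),
    // Assumes the response wraps the rows in an "items" array of records
    Contacts = Table.FromRecords(Raw[items])
in
    Contacts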
Your other bet is to contact someone in your organisation who has the Oracle connection details in a file that you could load into Oracle's SQL Developer and explore the connections there. Most likely this will be a data engineer or an IT contact in your organisation. Once you can see the connection info, you can enter it directly in Power BI after creating an Oracle connection.
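For reference, once you do have the host, port, and service name, a tnsnames.ora entry generally looks like this (every value below is a placeholder):

MYCRM =
  (DESCRIPTION =
    (ADDRESS = (PROTOCOL = TCP)(HOST = db.example.com)(PORT = 1521))
    (CONNECT_DATA =
      (SERVICE_NAME = myservice.example.com)
    )
  )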

Power BI dynamic filtering according to user account

I want to develop indicators using Microsoft Power BI.
I have a data warehouse on SQL Server and I query the DW directly.
I would like to display different information according to the user accessing the report, but I haven't found a way to dynamically filter the data according to the user account.
Is it possible?
If not, then what are the possible solutions to do it?
I believe what you are looking for is Row-Level Security. It shipped with SQL Server 2016 and Azure SQL DB; check here: https://msdn.microsoft.com/en-us/library/dn765131.aspx
I too have been looking for a similar solution to the same problem. I am using Azure SQL DW, and I have been told that Row-Level Security will come to Azure SQL DW with GA, but I am not sure of the date.
So, back to your question: as far as I know, the only possible solutions are to use SQL Server 2016 Row-Level Security or to put SSAS in front of your data warehouse.
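For illustration, a minimal SQL Server 2016 Row-Level Security sketch; the dbo.FactSales table and its UserLogin column are assumptions for the example, not from the original question:

-- Predicate function: a row is visible only when its UserLogin matches the caller's login
CREATE FUNCTION dbo.fn_UserFilter(@UserLogin AS sysname)
RETURNS TABLE
WITH SCHEMABINDING
AS
RETURN
    SELECT 1 AS AllowRead
    WHERE @UserLogin = SUSER_SNAME();
GO

-- Attach the predicate so reads on the table are filtered automatically
CREATE SECURITY POLICY dbo.UserFilterPolicy
    ADD FILTER PREDICATE dbo.fn_UserFilter(UserLogin) ON dbo.FactSales
    WITH (STATE = ON);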
Hope that helps.

DB connection issue with Excel spreadsheet and Oracle ODBC connection

I have a spreadsheet located on a centrally hosted server. Each day, a view on my Oracle database exports a report to this Excel file. To access the document, various users across a number of locations can log in and review it. However, for no obvious reason, a number of locations have started being prompted for the server name. Most locations have the three fields filled in by default, but at the problem locations users have to enter the server name each time.
Does anyone have an idea how I can solve this? No configuration changes have been made recently, and since I am fairly new to the system as a whole, I cannot come up with an explanation.
See attached screenshot for what I mean.
Thanks!
http://i.imgur.com/eymlE.jpg
I browsed to Admin Tools > Data Sources (ODBC) and created a File DSN. From there I was able to specify the following:
[ODBC]
DRIVER=Microsoft ODBC for Oracle
UID=xxxxxxxxx
PWD=***************
server=xxxx.xxx.xx
This meant that when creating new database queries inside Excel, I could point to this File DSN as a reference and connect externally. More importantly for my issue, remote users could now access the server-based Excel file without being prompted for the server name. I still don't know how they lost this functionality in the first place, but this solved my issue. :)
It turns out this doesn't work - I was opening a different version of the file. I'm still not sure where the connection box in Excel is populated from!
