Trouble exporting my Application Insights data to a SQL table - visual-studio

I am viewing Application Insights through Visual Studio 2017 via View -> Other Windows -> Application Insights.
Can I export this data from Visual Studio to a SQL table to perform more detailed queries?

You don't need to import the data into a SQL table, since Application Insights keeps it in its own tables, and you can perform more detailed queries against them using the SQL-like Kusto Query Language (KQL).
To do that, go to the Azure portal -> your Application Insights resource -> Logs. In Logs you can see tables like traces, requests, etc., and you can write your own query to do anything you would do against a SQL table.
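For example, a minimal KQL query against the standard requests table (a sketch only, following the pattern in Microsoft's documented examples) might look like this:

// count failed requests per operation over the last 24 hours
requests
| where timestamp > ago(1d) and success == false
| summarize failedCount = sum(itemCount) by name
| order by failedCount desc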
By the way, there is no direct way to export this data into SQL. If you really want to, you have to export it into Azure Storage first (a feature called Continuous Export) and then load it into SQL by other means. But I don't recommend that; the built-in log tables in Application Insights are usually all you need.

Related

From local Excel sheets to an online dashboard

I have around 30,000 Excel sheets containing standardized reports.
The goal is to provide an online dashboard to view the data.
My first thought as a programmer is to create a database, find a way to import the data, and then build a front end on top to create a customized dashboard.
Is there an easier way to do this?
If your reports are always going to be in Excel, I don't think converting them into a database is a good approach. It only adds the overhead of converting the data and building a frontend for it; that would be over-engineering.
Instead, you can use a data visualization tool like Retool, which already offers visualization components on top of almost any source. The source can be anything, be it an Excel sheet or a database.
There are other alternatives as well, such as Redash, which also offers connections to CSV files.
However, if you still feel that creating a database like MySQL (or any other database of your choice) is the better approach, then consider Metabase or Superset. These are excellent data visualization tools that only require database credentials to connect to the DB. They are feature-rich, widely accepted in the industry, and let you create practically every visualization you might need.
Also, the best part is that all the tools mentioned here are either fully open source or have a decent trial period.
FYI, for a better and more effective solution, I would recommend running a SQL Server instance on your machine, uploading the Excel reports to it, and then connecting that SQL Server instance to Metabase. That does everything you need without your having to write any code to manage it all.
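If you take the SQL Server route, one way to pull a sheet into a table is an OPENROWSET query. This is only a sketch: it assumes the Microsoft ACE OLE DB provider is installed on the server, and the file path, sheet name, and destination table name are placeholders.

-- enable ad hoc distributed queries once per server (requires sysadmin)
EXEC sp_configure 'show advanced options', 1; RECONFIGURE;
EXEC sp_configure 'Ad Hoc Distributed Queries', 1; RECONFIGURE;

-- load one standardized report sheet into a new table
SELECT *
INTO dbo.MonthlyReport   -- placeholder destination table
FROM OPENROWSET(
    'Microsoft.ACE.OLEDB.12.0',
    'Excel 12.0;Database=C:\Reports\report_0001.xlsx;HDR=YES',
    'SELECT * FROM [Sheet1$]');

For 30,000 files you would script this over the directory, or use a bulk loading tool such as SSIS instead.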

Microsoft SSAS-Tabular Model (TM) connection to Power BI via Import mode - 'not enough memory available for the application'

I have a rudimentary question about connecting an SSAS-TM (SQL Server Analysis Services - Tabular Model) database (on-premises, on my own local machine) to Power BI Desktop (also on my local machine) via Import mode.
I am not at all familiar with the memory allocation parameters.
The relational database I have is the very simple AdventureWorksDW. I developed a SQL Server Analysis Services - Tabular Model project using Visual Studio 2015 and deployed it as a new database on the Analysis Services engine. I am able to query tables in this SSAS-TM database from SSMS (SQL Server Management Studio) using the DAX language, in the following format:
EVALUATE 'tablename'
However, when I try to connect this SSAS-TM database to my Power BI Desktop via an Import connection, I get the following error:
AnalysisServices: The operation has been cancelled because there is not enough memory available for the application. If using a 32-bit version of the product, consider upgrading to the 64-bit version or increasing the amount of memory available on the machine.
I have the server properties from SSMS in this image file.
I tried to follow some links on learn.microsoft.com about setting the VertiPaq memory parameters, but they have not been useful to me.
My simple question is this:
What properties do I need to change in the image file above to make this connection succeed? This is only for training, so I am using AdventureWorksDW, and size is not an issue. My laptop has plenty of memory and disk space and is 64-bit, and my Power BI Desktop is 64-bit.
Can someone help me?
The Power BI connector for Analysis Services that has the table picker generates an MDX query instead of a DAX query, and if you try to extract more than a handful of rows, it fails. It's a known issue, but a low-priority one.
Don't import from SSAS. Use Live Connect: you've already got a cube/dataset, so you can just connect to it and write reports.
If you absolutely must import from SSAS, use a DAX query, e.g.
In M:
// pass an explicit DAX query so Power BI does not generate MDX (server and database names as in the question)
AnalysisServices.Database("MySSAS", "AdventureWorksDW", [Query="EVALUATE FactResellerSales", Implementation="2.0"])
or in the UI
Use Live Connect if you are only getting data from the cube. If you are also getting data from Excel files, etc., then you are forced to use Import. I have used Import to get many tables from a cube with no memory errors. What you can do is import 3 tables at a time, then in the Power BI Advanced Editor select the option to add more tables from the cube, add another 3 tables, and see how that goes. With Live Connect, even if the relationship columns are hidden, you still get them. With Import, if they are hidden, you can't select them, so you can't create the relationships.

Power BI Oracle on prem gateway issue - Unable to find the requested .Net Framework Data Provider

I have a Power BI dashboard with direct queries to an Oracle database, from which I import data using SQL queries. In my local pbix file everything is fine. When I publish it to our enterprise powerbi.com site and try to refresh the data, I get the following error:
{"error":{"code":"DM_GWPipeline_Gateway_ProviderDataAccessArgumentError","pbi.error":
{"code":"DM_GWPipeline_Gateway_ProviderDataAccessArgumentError","parameters":{},"details":
[{"code":"DM_ErrorDetailNameCode_UnderlyingErrorMessage","detail":{"type":1,"value":"Unable to
find the requested .Net Framework Data Provider. It may not be installed."}},
{"code":"DM_ErrorDetailNameCode_UnderlyingHResult","detail":
{"type":1,"value":"-2147024809"}}],"exceptionCulprit":1}}}
Does anyone have any idea what could be causing this?
I have trawled through the Power BI forums and there does not seem to be a definitive remedy.
I don't have this issue using TIBCO Spotfire; however, we are being pushed to use Power BI.
I found a workaround. It seems that after you create the views in Oracle, in Power BI you should use Import mode rather than DirectQuery. Just pick the Import option and select the table for the view you created.
I still see the same error if I manually refresh the data in the Power BI service, but at least my dashboard now works, with the tables and graphics fully visible.

Power BI dynamic filtering according to user account

I want to develop indicators using Microsoft Power BI.
I have a data warehouse on SQL Server, and I query the DW directly.
I would like to display different information according to the user accessing the report, but I haven't found a way to dynamically filter the data according to the user account.
Is it possible? If not, what are the possible solutions?
I believe what you are looking for is Row-Level Security, which shipped with SQL Server 2016 and Azure SQL DB; see here: https://msdn.microsoft.com/en-us/library/dn765131.aspx
I have been looking for a solution to the same problem. I am using Azure SQL DW, and I have been told that Row-Level Security will come to Azure SQL DW at GA, but I am not sure of the date.
So, back to your question: as far as I know, the only possible solutions are to use SQL Server 2016 Row-Level Security or to put SSAS in front of your data warehouse.
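As a minimal sketch of what Row-Level Security looks like in SQL Server 2016 (the table and column names here are hypothetical), you define a predicate function and bind it to the table with a security policy:

-- inline table-valued function: a row is visible only when its owner matches the connected user
CREATE FUNCTION dbo.fn_ReportPredicate(@OwnerUser AS sysname)
RETURNS TABLE
WITH SCHEMABINDING
AS
RETURN SELECT 1 AS fn_result
       WHERE @OwnerUser = USER_NAME();
GO

-- attach the predicate so every query against the fact table is filtered automatically
CREATE SECURITY POLICY dbo.ReportFilter
ADD FILTER PREDICATE dbo.fn_ReportPredicate(OwnerUser)
ON dbo.FactIndicators
WITH (STATE = ON);

With that in place, each report user connecting under their own account only ever sees their own rows, and the report itself needs no extra filtering logic.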
Hope that helps.

Linq DataContext.CreateDatabase on Azure

This came up once before: Use DataContext.CreateDatabase in SQL Azure
The accepted answer was "maybe it's not possible", which didn't seem like a full answer.
I have a set of classes fully defined, and I want to create a database on Azure for them. It's not working because the USE statement is not supported: http://msdn.microsoft.com/en-us/library/azure/ee336288.aspx
So the database gets created blank, and internally LINQ generates a USE statement to switch to that database and start adding tables. This fails and throws an exception.
So how can I create my database? Can I use LINQ to add tables to an existing database? Can I enable USE on Azure somehow? It seems ridiculous that this doesn't work.
After messing around with this for a while, I ended up creating the database against a local SQL Server instance, then used SQL Server Management Studio -> Tasks -> Script Database with the export type set to Microsoft Azure. That gave me the script file I needed to run on the Azure server. I'll leave the question open for a day or two because I am curious whether this can work with Azure directly somehow; if I don't hear anything, I will close it.
The USE statement does not switch between databases in Azure SQL Database. You have to connect directly to the target database in order to create tables in it.
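As a rough sketch (server and object names are placeholders), the pattern on Azure SQL Database is two separate connections rather than one connection issuing USE:

-- connection 1: opened against the master database
CREATE DATABASE MyAppDb;

-- connection 2: opened directly against MyAppDb (e.g. Initial Catalog=MyAppDb in the connection string);
-- USE MyAppDb is not supported, so the DDL must run on this second connection
CREATE TABLE dbo.Items (
    Id   INT PRIMARY KEY,
    Name NVARCHAR(100) NOT NULL
);

DataContext.CreateDatabase tries to do both steps over a single connection, which is why it fails partway through.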
Regards
Dhruv
