TODAY() in Power BI Desktop gives me the date in my local time zone, but TODAY() in the Power BI service gives me a different date and time.
Does anyone know how to fix this?
The Power BI service uses a different time zone (GMT/UTC). That's why your date differs from Desktop, where TODAY() uses your local time zone as configured in your machine's settings.
I couldn't find any API query to get the list of projects along with the project creation date (not the last analysis date).
GET api/projects/search
The above query returns only project names, without the creation date.
How can I get a report for this?
I tried running a SQL query directly against the PostgreSQL database, but there the last analysis date also shows up as the created_at date, and I get multiple records for the same project.
In the official documentation there is the following parameter:
analyzedBefore
Filter the projects for which the last analysis of all branches are
older than the given date (exclusive). Either a date (server timezone)
or datetime can be provided.
Perhaps you can somehow connect projects with branch creation dates and build a report from that. Otherwise, I don't see an option for searching by project creation date (at least in the official documentation).
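If you do query the PostgreSQL database directly, as you attempted, a minimal sketch might look like the following. Note that the table and column names here are assumptions and vary between SonarQube versions; this assumes a schema where the projects table holds one row per component, with scope and qualifier columns identifying top-level projects.
-- Assumed schema: one row per component in "projects"; scope = 'PRJ' and
-- qualifier = 'TRK' pick out top-level projects, avoiding duplicate rows.
SELECT kee        AS project_key,
       name       AS project_name,
       created_at AS project_created_at
FROM   projects
WHERE  scope = 'PRJ'
  AND  qualifier = 'TRK'
ORDER  BY created_at;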
I have a rudimentary question about connecting an SSAS-TM (SQL Server Analysis Services Tabular Model) database (on-premises) on my local machine to Power BI Desktop (also on my local machine) via Import mode.
I am not at all familiar with the memory allocation parameters.
The relational database I have is a very simple AdventureWorksDW. I developed a SQL Server Analysis Services Tabular Model project using Visual Studio 2015 and deployed it as a new database to the Analysis Services engine. I am able to query tables in this SSAS-TM database in SSMS (SQL Server Management Studio), using DAX, in the following format:
EVALUATE 'tablename'
However, when I try to connect this SSAS-TM database to my Power BI Desktop via an Import connection, I get the following error.
AnalysisServices: The operation has been cancelled because there is not enough memory available for the application. If using a 32-bit version of the product, consider upgrading to the 64-bit version or increasing the amount of memory available on the machine.
I have captured the server properties from SSMS in this image file.
I tried following some links on learn.microsoft.com about setting the VertiPaq memory parameters, but that has not been useful to me.
My simple question is this:
What properties do I need to change in the image file above to make this connection succeed? This is only for training, so I am using AdventureWorksDW here and size is not an issue. My laptop has plenty of memory and disk space and is 64-bit, and Power BI Desktop is 64-bit.
Can someone help me?
The Power BI connector for Analysis Services with the table picker generates an MDX query instead of a DAX query, and if you try to extract more than a handful of rows, it will fail. It's a known issue, but a low-priority one, since the recommended approach is not to import at all:
Don't import from SSAS. Use Live Connect. You've already got a cube/dataset, so you can just connect to it and write reports.
If you absolutely must import from SSAS, use a DAX query, e.g.
In M:
// Run a DAX query against the Tabular database; Implementation="2.0" selects the newer connector
AnalysisServices.Database("MySSAS", "AdventureWorksDW", [Query="evaluate FactResellerSales", Implementation="2.0"])
or in the UI
Use Live Connect if you are only getting data from the cube. If you are also getting data from Excel files, etc., then you are forced to use Import. I have used Import to get many tables from a cube with no memory errors. What you can do is import 3 tables at a time, then in the Power BI Advanced Editor select the option to add more tables from the cube, add another 3 tables, and see how that goes. With Live Connect, even if the relationship columns are hidden, you still get them. With Import, if they are hidden, you can't select them, so you can't create the relationships.
I use Oracle BI 12c and I am facing a strange problem. There is an ETL process that transfers data from one database to another, and that target database (the DWH) is used by OBIEE.
Even though the transfer succeeds and yesterday's data is present in the tables, creating an analysis for that day returns no data. The only way I can get the data to show up is by restarting the BI servers through Enterprise Manager.
So my question is: how can I schedule the BI servers to restart every morning, or is there another solution for this problem?
EDIT
Today, while tracing the log messages produced when creating an analysis, I noticed two error messages repeating. One is the following:
POST
/bisearch/rest/BISearchEventService/postEvent?eventType=web_catalog&eventSubType=web_catalog_object_updated&objectKey=%2f&tenantID=ssi&ownerID=18446744073709551615&sessionID=0000&locale=en_US
and the other one is
[43065] The connection with Cluster Controller
::ffff:10.11.10.19.34713 was lost.
I guess the last one is the issue. Has anybody faced this?
I want to develop indicators using Microsoft Power BI.
I have a data warehouse on SQL Server and I query the DW directly.
I would like to display different information according to the user accessing the report.
I haven't found a way to dynamically filter the data according to the user account.
Is it possible?
If not, then what are the possible solutions to do it?
I believe what you are looking for is Row-Level Security, which is available in SQL Server 2016 and Azure SQL DB; check here: https://msdn.microsoft.com/en-us/library/dn765131.aspx
I have been looking for a solution to the same problem. I am using Azure SQL DW and have been told that Row-Level Security will be coming to Azure SQL DW at GA, but I'm not sure of the date.
So, back to your question: as far as I know, the only possible solutions are to use SQL Server 2016 Row-Level Security or to put SSAS in front of your data warehouse, as sketched below.
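A minimal sketch of the SQL Server 2016 Row-Level Security approach, assuming a hypothetical dbo.Sales table whose SalesRep column holds each user's database user name:
-- Predicate function: a row is visible only when its SalesRep matches the connected user.
CREATE FUNCTION dbo.fn_SalesRepPredicate (@SalesRep AS sysname)
RETURNS TABLE
WITH SCHEMABINDING
AS
RETURN SELECT 1 AS fn_result
       WHERE @SalesRep = USER_NAME();
GO
-- Security policy applying the predicate to the hypothetical dbo.Sales table.
CREATE SECURITY POLICY SalesFilter
ADD FILTER PREDICATE dbo.fn_SalesRepPredicate(SalesRep) ON dbo.Sales
WITH (STATE = ON);
Note that this only varies by user when Power BI connects with each report viewer's own credentials (e.g. DirectQuery with single sign-on through the gateway); with Import, the data is loaded under a single account, so a database-level filter would not change per viewer.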
Hope that helps.
Now I want to see some values like total wait time, total physical reads, logical reads, disk activity and I/O, and total sorts done by queries for a specific time window, like 9:00 AM to 10:00 AM, so I can assess the state of the database.
How can I find the current memory status?
And how can I also monitor user activity?
You need to look at the dynamic performance views, such as V$SYSSTAT and V$SESSION, for your day-to-day monitoring.
Please search the web for "dynamic performance views" for more details.
I hope that helps.
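For example, here are a few starting-point queries (a sketch only; the statistic names are standard V$SYSSTAT names, and these counters are cumulative since instance startup, so for a 9:00-10:00 window you would need to sample them twice and take the difference):
-- Instance-wide workload statistics (cumulative since startup)
SELECT name, value
FROM   v$sysstat
WHERE  name IN ('physical reads', 'session logical reads',
                'sorts (memory)', 'sorts (disk)', 'user I/O wait time');

-- Current memory status
SELECT * FROM v$sga;
SELECT name, value, unit FROM v$pgastat;

-- What user sessions are doing right now
SELECT username, status, sql_id, event, machine
FROM   v$session
WHERE  type = 'USER';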
Can you please specify which database version you are using, and whether you have Standard Edition or Enterprise Edition?
I think you can get most of this information from Oracle Enterprise Manager Database Control, e.g. for 11g.
Here is some information from Oracle:
Getting Started with Database Administration