I am a Power BI user, but new to Power Automate. I have a Power BI dashboard in the cloud and want to create a Power Automate flow that executes the Power BI subscription on the 5th business day of each month, so I need to exclude weekends and holidays. Is this possible in Power Automate? Do I need to create an Excel calendar and store it somewhere for the flow to read? If so, where do I store this file? Or do I need to create a SharePoint list? Thanks for any help.
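(For reference, the date logic itself is simple to express. Below is a minimal Python sketch of the "5th business day" calculation, assuming a hypothetical hard-coded holiday list; in Power Automate you would reproduce the same idea with a Recurrence trigger plus condition logic, reading the holidays from an Excel table or SharePoint list instead.)

```python
from datetime import date, timedelta

# Hypothetical holiday list -- in a real flow this would come from an
# Excel table or SharePoint list that the flow reads at run time.
HOLIDAYS = {date(2024, 1, 1), date(2024, 7, 4), date(2024, 12, 25)}

def nth_business_day(year: int, month: int, n: int = 5) -> date:
    """Return the n-th business day of the month, skipping weekends and holidays."""
    day = date(year, month, 1)
    count = 0
    while True:
        if day.weekday() < 5 and day not in HOLIDAYS:  # Mon-Fri and not a holiday
            count += 1
            if count == n:
                return day
        day += timedelta(days=1)

# The flow would fire daily and only continue when today equals this date.
print(nth_business_day(2024, 7))  # -> 2024-07-08 (July 4th is skipped as a holiday)
```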
Our team uses Spotfire to host online analyses and also to prepare monthly reports. One pain point we have is around validation. The reports are all pre-built, and the process for creating them each month is as simple as 1) refreshing the data (through Infolink connected to Oracle) and 2) pressing a button to export each report. The format of the final product is a PDF.
The issue is that there are a lot of small things that can go wrong with the reports (filter accidentally applied, wrong month selected, data didn't refresh, new department not grouped correctly, etc.) meaning that someone on our team has to manually validate each of the reports. We create almost 20 reports each month and some of them are as many as 100 pages.
We've done a great job automating the creation of the reports, but now we have this weird imbalance where it takes like 25 minutes to create all the reports but 4+ hours to validate each one.
Does anyone know of a good way to automate, or even cut down on, the time we have to spend each month validating the reports? I did a brief Google search and all I could find was in the realm of validating reports to meet government regulation standards.
It depends on 2 factors:
Do your reports have the same template (format) each time you extract them? You said that you pull them out automatically so I guess the answer is Yes.
What exactly are you trying to check/validate? You need a clear list of what you are validating. You mentioned the month, the grouping, and data values (for the refresh). The clearer the picture you have for validation, the more likely the process can be fully automated.
There are so-called RPA (robotic process automation) tools that can automate complex workflows.
A "data extract" task, which is part of a workflow, can detect and collect data from documents (PDF for example).
A robot that runs on the validating machine can:
- batch read all your PDF reports from specified locations on your computer (or on another computer);
- based on predefined templates, read through the documents for the specific fields you specify (through anchors defined on the templates) and collect the exact data from there;
- compare the extracted data with the baseline that you set (compare the month to be correct, compare a data field to confirm a proper refresh of the data, another data field to confirm grouping, etc.).
It takes a bit of time to dissect the PDF for each report template and correctly set the anchors, but then it runs seamlessly each time.
One such tool I used is called Atomatik. It has a studio environment where you design the robot (or robots) and run the process.
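To make the extract-and-compare idea concrete outside of any particular RPA product, here is a minimal Python sketch of the same pattern using the pypdf library; the file-name patterns, page choice, and expected strings are made-up placeholders, and a real template-anchor setup would be more precise than a plain substring check.

```python
from pathlib import Path
from pypdf import PdfReader  # third-party: pip install pypdf

# Hypothetical baseline: report file pattern -> strings that must appear on page 1.
EXPECTED = {
    "sales_report_*.pdf": ["Report month: 2024-06", "Grouping: Department"],
    "ops_report_*.pdf":   ["Report month: 2024-06", "Data refreshed:"],
}

def validate_reports(folder: str) -> list[str]:
    """Return a list of human-readable problems found across all PDF reports."""
    problems = []
    for pattern, expected_strings in EXPECTED.items():
        for pdf_path in Path(folder).glob(pattern):
            first_page_text = PdfReader(pdf_path).pages[0].extract_text() or ""
            for expected in expected_strings:
                if expected not in first_page_text:
                    problems.append(f"{pdf_path.name}: missing '{expected}'")
    return problems

if __name__ == "__main__":
    # Print anything that failed validation; an empty output means all checks passed.
    for problem in validate_reports("./monthly_reports"):
        print(problem)
```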
I am trying MS Dataverse (CDS) for the first time. Say I have a table called "User" which has a column called "credits" that is a whole number. Every 21 days, I want to increment the value of "credits" for all users by 5, i.e. increase the value of the "credits" column for all users by 5 every 21 days.
Is there a way to automate this?
The table can also have additional columns such as "StartDate" and "CreditDate", which can be used for the 21-day calculation.
I was looking into the Rollup and Calculated column types, but they don't seem like the right answer, since my case has more to do with automation than calculation.
Help would be greatly appreciated.
This can definitely be done, and for exactly this kind of scenario Microsoft provides a very nice cloud service called Power Automate (cloud flows).
Power Automate is very powerful and can do a lot, as it has connectors to a huge number of different systems, and CDS is supported out of the box.
For your use case you need a flow with a scheduled (Recurrence) trigger.
Then you need the CDS (current environment) connector to list the user rows and update each one.
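For a rough picture of what the flow does under the hood, here is a Python sketch of the same two steps against the Dataverse Web API; the organization URL, access token, entity set name, and column logical names are placeholders for your environment. In the flow itself you would use the connector's list and update actions rather than raw HTTP calls.

```python
import requests

# Placeholders -- replace with your environment URL, a valid OAuth token,
# and the real entity set / column logical names of your custom table.
ORG_URL = "https://yourorg.crm.dynamics.com"
TOKEN = "<oauth-access-token>"
ENTITY_SET = "cr123_users"        # hypothetical entity set name for the "User" table
ID_ATTRIBUTE = "cr123_userid"     # hypothetical primary key column
CREDITS_COLUMN = "cr123_credits"  # hypothetical logical name of the "credits" column

headers = {"Authorization": f"Bearer {TOKEN}", "Content-Type": "application/json"}

# 1. List all user rows with their current credits (the flow's "list rows" step).
rows = requests.get(
    f"{ORG_URL}/api/data/v9.2/{ENTITY_SET}?$select={CREDITS_COLUMN}",
    headers=headers,
).json()["value"]

# 2. Add 5 credits to each row (the flow's "update row" step inside an apply-to-each loop).
for row in rows:
    requests.patch(
        f"{ORG_URL}/api/data/v9.2/{ENTITY_SET}({row[ID_ATTRIBUTE]})",
        headers=headers,
        json={CREDITS_COLUMN: row[CREDITS_COLUMN] + 5},
    )
```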
We use Metabase to monitor our KPIs, such as total current active items.
It's a great tool, but when we need to look at trends over time, such as how the total of currently active items changes by day, Metabase can't help much, since it only displays what the database currently has.
I know there are many data-warehousing stacks I can use to store time-series data.
But I'm just curious whether there's a framework that can store time-series data for configured KPIs.
What I imagine is something like an advanced Metabase; the user scenario would be like this:
- I can set up the metrics I'd like to track (like DAU, Total Active Items)
- The framework automatically saves these metrics at a specified time granularity (minute, day, week, month, per the user's configuration)
- It visualizes this data
Would like to have some input and thoughts :D
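One way to get the trend data Metabase needs is to snapshot the KPIs yourself: a scheduled job runs each configured KPI query and appends the result to a metrics table, which Metabase can then chart over time. Below is a minimal Python/SQLite sketch of that idea, with made-up table names and KPI queries.

```python
import sqlite3
from datetime import datetime, timezone

# Hypothetical KPI configuration: name -> SQL that returns a single number.
KPIS = {
    "total_active_items": "SELECT COUNT(*) FROM items WHERE status = 'active'",
    "dau": "SELECT COUNT(DISTINCT user_id) FROM events WHERE event_date = DATE('now')",
}

def snapshot_kpis(db_path: str = "app.db") -> None:
    """Run every configured KPI query and append the results to kpi_snapshots."""
    conn = sqlite3.connect(db_path)
    conn.execute(
        "CREATE TABLE IF NOT EXISTS kpi_snapshots ("
        "  captured_at TEXT, kpi_name TEXT, kpi_value REAL)"
    )
    now = datetime.now(timezone.utc).isoformat()
    for name, query in KPIS.items():
        (value,) = conn.execute(query).fetchone()
        conn.execute(
            "INSERT INTO kpi_snapshots (captured_at, kpi_name, kpi_value) VALUES (?, ?, ?)",
            (now, name, value),
        )
    conn.commit()
    conn.close()

# Schedule snapshot_kpis() with cron/Airflow at whatever granularity you need;
# Metabase can then plot kpi_snapshots as a time series.
```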
I am developing a report application in Power BI Desktop. I successfully created a dataset using a query and applying filters on the result data. But now I have to get data from the database in real time with user filters, i.e. the dataset would be created on the basis of inputs given by users. We need this because the database is quite huge and we cannot load all the data, then apply filters and create reports.
The same can easily be done in a .NET application, but we have to achieve this in Power BI.
Please suggest if this can be done.
I would use the Query Parameters feature for this. You add them in the Edit Queries window, from Home / Manage Parameters; then you can use them in calculated columns or to replace a "hard-coded" filter.
There's a detailed write-up in a recent blog post:
https://powerbi.microsoft.com/de-de/blog/deep-dive-into-query-parameters-and-power-bi-templates/
I have a requirement wherein I have to calculate the hit count for the application day-wise and generate a report for it. What I have been told is to create a table and then keep updating the count whenever a user accesses the application. Can anyone guide me on how to proceed? Any guidance will be very much appreciated.
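A common way to do this is a small day-wise counter table keyed by date that is upserted on every request. Below is a minimal Python/SQLite sketch of that idea; the database, table, and function names are only illustrative, the increment call would sit in your request-handling code, and the upsert syntax would change with your database.

```python
import sqlite3
from datetime import date

def record_hit(db_path: str = "hits.db") -> None:
    """Increment today's hit counter (one row per day)."""
    conn = sqlite3.connect(db_path)
    conn.execute(
        "CREATE TABLE IF NOT EXISTS daily_hits (hit_date TEXT PRIMARY KEY, hit_count INTEGER)"
    )
    # Insert a new row for today, or add 1 to the existing row (SQLite upsert).
    conn.execute(
        "INSERT INTO daily_hits (hit_date, hit_count) VALUES (?, 1) "
        "ON CONFLICT(hit_date) DO UPDATE SET hit_count = hit_count + 1",
        (date.today().isoformat(),),
    )
    conn.commit()
    conn.close()

def daily_report(db_path: str = "hits.db"):
    """Return (date, count) rows for the day-wise report."""
    conn = sqlite3.connect(db_path)
    return conn.execute(
        "SELECT hit_date, hit_count FROM daily_hits ORDER BY hit_date"
    ).fetchall()
```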