Run workflow on 1:N related records - dynamics-crm

I have developed an entity intended for staging imported data and doing some validation, as well as an entity that will be used as a batch reference, so that I can ensure all staged records in a batch will go into the system together. How can I run a workflow on all the related child records from the batch entity form?

Because standard CRM workflow steps don't support iterating over 1:N relationships, you need to build a custom workflow activity.
Luckily for us, there is already one available:
CRM 2011 Distribute Workflow Activity
There is also a small tutorial (PowerPoint format); the link is on the same page (the last download link at the bottom).
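What an activity like that does, conceptually, is fetch the child records of the batch and start the workflow against each one. Purely to illustrate that pattern, here is a rough Python sketch using the modern Dataverse Web API and its ExecuteWorkflow bound action; this API does not exist in CRM 2011, the entity and field names are hypothetical placeholders, and token acquisition is left out.

```python
import requests

# Illustrative only: org URL, entity/field names, and GUIDs are hypothetical placeholders.
ORG = "https://yourorg.crm.dynamics.com"
HEADERS = {
    "Authorization": "Bearer <access token>",
    "OData-MaxVersion": "4.0",
    "OData-Version": "4.0",
    "Accept": "application/json",
    "Content-Type": "application/json",
}
BATCH_ID = "<batch record guid>"
WORKFLOW_ID = "<on-demand workflow guid>"

# 1) Fetch the staging records whose lookup (new_batchid) points at the batch record.
children = requests.get(
    f"{ORG}/api/data/v9.2/new_stagingrecords"
    f"?$select=new_stagingrecordid&$filter=_new_batchid_value eq {BATCH_ID}",
    headers=HEADERS,
).json()["value"]

# 2) Start the on-demand workflow against each child via the ExecuteWorkflow bound action.
for child in children:
    requests.post(
        f"{ORG}/api/data/v9.2/workflows({WORKFLOW_ID})/Microsoft.Dynamics.CRM.ExecuteWorkflow",
        headers=HEADERS,
        json={"EntityId": child["new_stagingrecordid"]},
    ).raise_for_status()
```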

Related

dbt schema cleanup in Snowflake

When we create Pull Requests in GitHub it auto triggers a dbt cloud job that runs a test build of our models. The database in Snowflake for this build is called "Continuous Integration". In this database we have hundreds of schemas going back almost 2 years. Is there any reason to keep these schemas and tables? I sure would like to do some cleanup.
You should be able to delete these old schemas with no consequence.
Each of these schemas was built from the change introduced in an earlier version of the code and (depending on how you set up your GitHub Action) either from pre-defined test data or from the raw data available at the time the test run began.
These CI jobs can serve two use cases:
1) (primary) verify that the code works and that the data validation tests pass
2) act as a way to do time travel, which I'll describe below.
The first use case does not need the artifact to be preserved after the job runs.
The second use case may be important to you when trying to debug reports that were generated many months ago.
Example: let's say the finance department wants to know why a historical value of active users has changed in the latest report. This may have been an error that was since fixed in your dbt logic, or perhaps active users was pulled with an incorrect filter in your BI layer. If you had dbt artifacts built from that era, you could use them to look for any dbt-level changes.
How far back do you think you'd need the artifacts for time travel? Check with your stakeholders and come up with a time frame that works for your business, and you can delete all the CI artifacts built prior to that date.
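Once you have agreed on a retention window, the cleanup itself is easy to script. A minimal sketch using the Snowflake Python connector, assuming a 180-day window and that the CI database is addressable as CONTINUOUS_INTEGRATION (adjust the identifier quoting if the name really contains a space):

```python
import snowflake.connector
from datetime import datetime, timedelta

# Sketch only: connection details, the CONTINUOUS_INTEGRATION identifier, and the
# 180-day retention window are assumptions; confirm the window with stakeholders first.
conn = snowflake.connector.connect(
    account="<account>",
    user="<user>",
    password="<password>",
    role="SYSADMIN",
)
cutoff = (datetime.utcnow() - timedelta(days=180)).strftime("%Y-%m-%d")

cur = conn.cursor()
# INFORMATION_SCHEMA.SCHEMATA records when each CI schema was created.
cur.execute(
    """
    SELECT schema_name
    FROM CONTINUOUS_INTEGRATION.INFORMATION_SCHEMA.SCHEMATA
    WHERE created < %s
      AND schema_name NOT IN ('INFORMATION_SCHEMA', 'PUBLIC')
    """,
    (cutoff,),
)
for (schema_name,) in cur.fetchall():
    # CASCADE drops the schema together with all of its tables and views.
    cur.execute(f'DROP SCHEMA IF EXISTS CONTINUOUS_INTEGRATION."{schema_name}" CASCADE')
conn.close()
```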

How to push data from one Common Data Service to another CDS triggered by a status change of a record?

All environments are in the same tenant, same Azure Active Directory.
I need to push data from one environment's Common Data Service (Line of Business) to another environment's Common Data Service (Central Enterprise CDS), which reporting runs from.
I've looked into using OData dataflows; however, this seems like more of a manually triggered option.
The dataflows OData connector is meant for and designed to support migration and synchronization of large datasets in Common Data Service in scenarios such as:
A one-time cross-environment or cross-tenant migration is needed (for example, geo-migration).
A developer needs to update an app that is being used in production. Test data is needed in their development environment to easily build out changes.
Reference: Migrate data between Common Data Service environments using the dataflows OData connector
For continuous data synchronization, use the CDS connector in Power Automate with attribute filters, so that updates to source CDS records are pushed to the corresponding target CDS entities.
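If you prefer to see the push step spelled out in code rather than in a flow, here is a minimal sketch of the same idea against the CDS/Dataverse Web API: read the changed record from the source environment and upsert it into the target by an alternate key. The environment URLs, entity set, field names, and alternate key below are hypothetical, and acquiring the two OAuth tokens is left out.

```python
import requests

# Minimal sketch of the push itself; a Power Automate trigger (or any scheduler) is assumed
# to have already detected the status change in the source environment.
SOURCE = "https://lob-env.crm.dynamics.com/api/data/v9.1"
TARGET = "https://enterprise-env.crm.dynamics.com/api/data/v9.1"

def headers(token):
    return {
        "Authorization": f"Bearer {token}",
        "OData-MaxVersion": "4.0",
        "OData-Version": "4.0",
        "Accept": "application/json",
        "Content-Type": "application/json",
    }

def sync_record(record_id, source_token, target_token):
    # Read only the attributes the reporting environment needs from the source record.
    record = requests.get(
        f"{SOURCE}/new_orders({record_id})?$select=new_ordernumber,new_amount,statuscode",
        headers=headers(source_token),
    ).json()

    # Upsert into the target by alternate key (new_ordernumber), so the same call
    # handles both first-time pushes and updates to previously synced records.
    requests.patch(
        f"{TARGET}/new_orders(new_ordernumber='{record['new_ordernumber']}')",
        headers=headers(target_token),
        json={"new_amount": record["new_amount"], "statuscode": record["statuscode"]},
    ).raise_for_status()
```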

How to automate publishing of Power BI (PBIX) from desktop to Power BI Web with config for environment

I currently have a desktop PBIX file that I manually publish to Power BI Web.
I have to keep different versions of the same PBIX file just to keep track of the different sources for each environment (Dev/QA/UAT/Prod, etc.).
I have more than one data source per environment, i.e. in the same PBIX file I have data coming from, say, Application Insights and a REST API.
I scanned through the Power BI community to see how to do this but can't find relevant information. All the pointers are about refreshing either the local PBIX or using the Scheduled Refresh option in Power BI Web.
Someone even wrote code to hit Publish via OLE automation, but that's not an acceptable solution.
https://community.powerbi.com
I would like to automate this process such that
A. I can provide the data source connection string/credentials externally based on the environment I want to publish to.
B. The report is published to Power BI Web using a service account instead of my own.
Our current build and deployment tool set does allow the use of PowerShell/Azure CLI etc., so it would be helpful if the solution used those.
Fetching data from SQL Azure won't need a refresh, but it's expensive.
In one of the organizations I worked for, they used views on SQL Azure to accomplish this task.
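The publish step itself (requirements A and B) can be scripted against the Power BI REST API; the MicrosoftPowerBIMgmt PowerShell module covers much of the same ground if you would rather stay in PowerShell. A rough Python sketch, in which the tenant, app registration, workspace ID, and parameter handling are all assumptions to replace with your own values:

```python
import msal
import requests

# Rough sketch, not a drop-in script: app registration (service principal), workspace id,
# and dataset parameter names are assumptions you would replace with your own values.
TENANT_ID = "<tenant guid>"
CLIENT_ID = "<app registration guid>"
CLIENT_SECRET = "<client secret>"
WORKSPACE_ID = "<target workspace guid>"  # one workspace per environment (Dev/QA/UAT/Prod)
PBIX_PATH = "report.pbix"

# 1) Authenticate as a service principal (requirement B).
app = msal.ConfidentialClientApplication(
    CLIENT_ID,
    authority=f"https://login.microsoftonline.com/{TENANT_ID}",
    client_credential=CLIENT_SECRET,
)
token = app.acquire_token_for_client(
    scopes=["https://analysis.windows.net/powerbi/api/.default"]
)
headers = {"Authorization": f"Bearer {token['access_token']}"}

# 2) Publish the PBIX to the workspace for the chosen environment.
import_url = (
    f"https://api.powerbi.com/v1.0/myorg/groups/{WORKSPACE_ID}/imports"
    "?datasetDisplayName=MyReport&nameConflict=CreateOrOverwrite"
)
with open(PBIX_PATH, "rb") as f:
    resp = requests.post(import_url, headers=headers, files={"file": f})
resp.raise_for_status()

# 3) (Requirement A) Once the import completes, point the dataset at the right sources,
#    e.g. by calling the dataset's Default.UpdateParameters endpoint with that
#    environment's values, and update data source credentials via the Gateways APIs.
```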

How to manage MS Dynamics 365 CRM Portal custom codes?

I would like to know how to manage the portal's custom code automatically, just like in TFS/VSTS.
At present I am using the XrmToolBox to manage, push, or pull the portal's code into the CRM instance, but the disadvantage is the lack of check-in and check-out.
Can anyone help me manage the code with automatic pull and push into the CRM instance, with check-in/check-out options?
Thanks in advance!
I'm afraid the XRMToolbox plugin doesn't support it yet.
Ref: https://github.com/MscrmTools/MscrmTools.PortalCodeEditor/issues/13
But there is nothing stopping you from creating your own pipeline: at the end of the day, portal code is just a bunch of CRM entities. The Configuration Migration tool is part of the CRM SDK; the latest version is here:
https://www.nuget.org/packages/Microsoft.CrmSdk.XrmTooling.ConfigurationMigration.Wpf
So the idea is:
1) Get this tool
2) Define the entities you want to back up and create a schema XML file for them. I think you'd want adx_webpage, adx_webfile, adx_pagetemplate (and all attributes on them).
3) Export the data using this schema. This exports everything to a .zip package with a simple structure (a schema file and a data file), so you can unzip it and store the contents in your Git branch (pull); see the sketch below.
4) For push, zip the files back up and again use the Configuration Migration tool to import the data.
This also gives you the opportunity to keep a separate dev version and production version of the portal code (which is always a good thing).
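The pull/push plumbing around the Configuration Migration tool (steps 3 and 4) is trivial to script. A small sketch, in which the paths are hypothetical:

```python
import shutil
import zipfile
from pathlib import Path

# Hypothetical paths: adjust to wherever the Configuration Migration tool wrote its export
# and wherever your repository lives.
EXPORT_ZIP = Path("PortalData.zip")      # produced by step 3
REPO_DIR = Path("portal-code/data")      # folder tracked in Git
IMPORT_ZIP = Path("PortalData_import")   # re-packaged for step 4 (".zip" is appended below)

# Pull: unzip the exported package into the repo so the schema and data XML can be
# diffed and committed like any other source file.
with zipfile.ZipFile(EXPORT_ZIP) as zf:
    zf.extractall(REPO_DIR)

# Push: zip the tracked folder back up so the Configuration Migration tool can import it.
shutil.make_archive(str(IMPORT_ZIP), "zip", root_dir=REPO_DIR)
```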
Portal code is made up of configuration changes to a solution (which can be extracted as XML) and data (records such as web pages, web roles, etc.).
There are several tools available to help you source control both.
xrm-ci-framework provides automation tools to extract your CRM solution as xml, and then source control it. You can do this locally or in the cloud with Azure DevOps or other.
msbuild-xrm-sourcecontrol is similar. It integrates into Visual Studio to help you extract CRM customisations locally. It also has a partner project xrm-datamigration which helps you extract data from CRM, version control it and deploy it to other environments in your release pipeline. Both have documentation on the GitHub pages I've linked; this blog post is informative too.

Deploy Microsoft Portals

We are just configuring our first Microsoft CRM Portal using the Portals Add-In.
We are doing this in our sandbox, obviously. Question: How do we deploy this to our production system? As far as I see, most of the data is not saved in a solution but in entity records. Is there a practical way to deploy the stuff?
You’ll want to use the Portal Records Mover in the XRMToolbox. It will allow you to export/import the relevant records for the Portal.
https://community.dynamics.com/crm/b/dynamicscrmtools/archive/2017/06/13/new-xrmtoolbox-plugin-portal-records-mover
Alternatively, you can use the Configuration Migration tool (available in the SDK NuGet package) to move the records. I prefer the Portal Records Mover because it provides more granular control and supports updates without overwriting.
As suggested by @Nicknow, use the Configuration Migration tool and the Portal Records Mover.
I would do the first transfer with the Configuration Migration tool and continue with the Portal Records Mover, where you can filter for records that were updated in a given time frame.
You can also build your own SSIS package to filter the modified records and move them to the production system.
