Deploy Microsoft Portals

We are just configuring our first Microsoft CRM Portal using the Portals Add-In.
We are doing this in our sandbox, obviously. Question: how do we deploy this to our production system? As far as I can see, most of the data is not saved in a solution but in entity records. Is there a practical way to deploy this configuration?

You'll want to use the Portal Records Mover in the XrmToolBox. It allows you to export/import the relevant records for the Portal.
https://community.dynamics.com/crm/b/dynamicscrmtools/archive/2017/06/13/new-xrmtoolbox-plugin-portal-records-mover
Alternatively, you can use the Configuration Migration tool (available in the SDK NuGet package) to move the records. I prefer the Portal Records Mover because it provides more granular control and supports updates without overwriting.
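If you want to script the Configuration Migration route, a minimal sketch with the Microsoft.Xrm.Tooling.ConfigurationMigration PowerShell module might look like this (the connection strings, file paths, and schema file are placeholders you'd supply; parameter names may vary slightly between module versions):

    # One-time install of the Configuration Migration tooling module.
    Install-Module Microsoft.Xrm.Tooling.ConfigurationMigration -Scope CurrentUser

    # Export the portal records from the sandbox using a schema file that
    # lists the portal entities (adx_webpage, adx_webfile, and so on).
    $source = Get-CrmConnection -ConnectionString "AuthType=OAuth;Url=https://sandbox.crm.dynamics.com;..."
    Export-CrmDataFile -CrmConnection $source -SchemaFile ".\portal-schema.xml" -DataFile ".\portal-data.zip"

    # Import the exported package into production.
    $target = Get-CrmConnection -ConnectionString "AuthType=OAuth;Url=https://prod.crm.dynamics.com;..."
    Import-CrmDataFile -CrmConnection $target -DataFile ".\portal-data.zip"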

As suggested by @Nicknow, use the Configuration Migration tool and the Portal Records Mover.
I would do the first transfer with the Configuration Migration tool and continue with the Portal Records Mover, where you can filter records that were updated within a given time frame.
You can also build your own SSIS package to filter the modified records and move them to the production system.
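If a full SSIS package is more than you need, the same "only what changed" filter can be sketched with the Microsoft.Xrm.Data.PowerShell module and a FetchXML date filter; the connection string, entity, and seven-day window below are assumptions for illustration:

    # Pull portal pages modified in the last 7 days from the sandbox.
    $source = Get-CrmConnection -ConnectionString "AuthType=OAuth;Url=https://sandbox.crm.dynamics.com;..."
    $fetch = '<fetch><entity name="adx_webpage"><all-attributes />' +
             '<filter><condition attribute="modifiedon" operator="last-x-days" value="7" /></filter>' +
             '</entity></fetch>'
    $changed = (Get-CrmRecordsByFetch -conn $source -Fetch $fetch).CrmRecords
    Write-Host "$($changed.Count) web pages changed in the last 7 days"
    # Records keep their GUIDs across environments, so the same ids can be
    # upserted into production with Set-CrmRecord or the Records Mover.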

Related

How to push data from one Common Data Service to another CDS triggered by a status change of a record?

All environments are in the same tenant, same Azure Active Directory.
Need to push data from one environment's (Line of Business) Common Data Service to another environment's Common Data Service (Central Enterprise CDS) where reporting is running from.
I've looked into using OData dataflows; however, this seems like more of a manually triggered option.
OData dataflows are designed to support migration and synchronization of large datasets in Common Data Service in scenarios such as:
A one-time cross-environment or cross-tenant migration is needed (for example, geo-migration).
A developer needs to update an app that is being used in production.
Test data is needed in a development environment to easily build out changes.
Reference: Migrate data between Common Data Service environments using the dataflows OData connector
For continuous data synchronization, use the CDS connector in Power Automate with filtering attributes on the trigger, so that updates to source CDS records are pushed to the target CDS entities.
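The flow itself is assembled in the Power Automate designer rather than written as code, but the underlying pattern - react to a change in the source environment and upsert the record into the target - looks roughly like this sketch against the CDS Web API (the environment URL, entity set, and token handling are illustrative placeholders, not the connector's internals):

    # Hypothetical push of one changed record into the Central Enterprise CDS.
    # $token is an OAuth bearer token for the target environment (acquisition omitted).
    $recordId = "00000000-0000-0000-0000-000000000000"   # id of the changed source record
    $body = @{ name = "Updated account name" } | ConvertTo-Json

    # PATCH against the id is an upsert: update if the record exists, create it if not.
    Invoke-RestMethod -Method Patch `
        -Uri "https://central-enterprise.crm.dynamics.com/api/data/v9.1/accounts($recordId)" `
        -Headers @{ Authorization = "Bearer $token" } `
        -ContentType "application/json" `
        -Body $body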

How to automate publishing of Power BI (PBIX) from desktop to Power BI Web with config for environment

I currently have a desktop PBIX file that I manually publish to Power BI Web.
I have to keep different versions of the same PBIX file just to keep track of the different sources for each environment, such as Dev/QA/UAT/Prod.
I have more than one data source for each environment, i.e. in the same PBIX file I have data coming from, say, Application Insights and a REST API.
I scanned through the Power BI community to see how to do this but can't find relevant information. All the pointers are for refreshing either the local PBIX or using the Scheduled Refresh option in Power BI Web.
Someone even wrote code to trigger Publish via OLE automation, but that's not an acceptable solution.
https://community.powerbi.com
I would like to automate this process such that
A. I can provide the data source connection string/credentials externally based on the environment I want to publish to.
B. Publish the report to Power BI web using a service account instead of my own.
Our current build and deployment tool set does allow use of PowerShell/Azure CLI etc., hence it would be helpful if the solution used those.
Fetching data directly from SQL Azure won't need a refresh, but it's expensive.
In one of the organizations I worked for, they used views on SQL Azure to accomplish this task.
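For what it's worth, point B (publishing under a service account) can be scripted with the MicrosoftPowerBIMgmt PowerShell module; below is a minimal sketch, where the tenant id, workspace name, and file path are placeholders:

    # Sign in as a service principal rather than a personal account.
    Install-Module MicrosoftPowerBIMgmt -Scope CurrentUser
    $cred = Get-Credential   # app (client) id as the user name, client secret as the password
    Connect-PowerBIServiceAccount -ServicePrincipal -Credential $cred -Tenant "<tenant-guid>"

    # Publish the PBIX, overwriting any existing report with the same name.
    $workspace = Get-PowerBIWorkspace -Name "Sales-QA"
    New-PowerBIReport -Path ".\Sales.pbix" -Name "Sales" -WorkspaceId $workspace.Id -ConflictAction CreateOrOverwrite

For point A, the usual pattern is to expose the connection details as Power BI parameters and set them per environment after publishing, e.g. via the datasets "Update Parameters" REST API.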

How to manage MS Dynamics 365 CRM Portal custom code?

I would like to know how to manage the portal's custom code automatically, just like in TFS/VSTS.
At present, I am using XrmToolBox to manage, push, or pull the portal's code into the CRM instance, but the disadvantage is the lack of check-in and check-out.
Can anyone help me manage the code with automatic pull and push into the CRM instance, with check-in/check-out options?
Thanks in advance!
I'm afraid the XrmToolBox plugin doesn't support that yet.
Ref: https://github.com/MscrmTools/MscrmTools.PortalCodeEditor/issues/13
But there is nothing stopping you from creating your own pipeline - at the end of the day, portal code is just a bunch of CRM entities. The Configuration Migration tool is part of the CRM SDK - the latest version is here:
https://www.nuget.org/packages/Microsoft.CrmSdk.XrmTooling.ConfigurationMigration.Wpf
So the idea is:
1) Get this tool.
2) Define the entities you want to back up and create a schema XML file for them. I think you'd want adx_webpage, adx_webfile, adx_pagetemplate (and all their attributes).
3) Export the data using this schema - this produces a .zip package with a simple structure (a schema file and a data file), so you can unzip it and store it in your git branch (pull).
4) For push, zip the files again and use the Configuration Migration tool to import the data.
This also gives you the opportunity to keep separate dev and production versions of the portal code (which is always a good thing).
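As a rough illustration, steps 3 and 4 can be scripted so pull and push become repeatable; the sketch below assumes the Configuration Migration tooling is installed as the Microsoft.Xrm.Tooling.ConfigurationMigration PowerShell module, and the connection strings and paths are placeholders:

    # "Pull": export from dev and keep the unzipped files in source control.
    $dev = Get-CrmConnection -ConnectionString "AuthType=OAuth;Url=https://dev.crm.dynamics.com;..."
    Export-CrmDataFile -CrmConnection $dev -SchemaFile ".\portal-schema.xml" -DataFile ".\portal.zip"
    Expand-Archive ".\portal.zip" -DestinationPath ".\portal-src" -Force
    git add portal-src; git commit -m "Portal code snapshot"

    # "Push": re-zip the tracked files and import into the target instance.
    Compress-Archive -Path ".\portal-src\*" -DestinationPath ".\portal.zip" -Force
    $prod = Get-CrmConnection -ConnectionString "AuthType=OAuth;Url=https://prod.crm.dynamics.com;..."
    Import-CrmDataFile -CrmConnection $prod -DataFile ".\portal.zip"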
Portal code is made up of configuration changes to a solution (which can be extracted as XML) and data (records such as web pages, web roles, etc.).
There are several tools available to help you source control both.
xrm-ci-framework provides automation tools to extract your CRM solution as XML and then source control it. You can do this locally or in the cloud with Azure DevOps or other tools.
msbuild-xrm-sourcecontrol is similar. It integrates into Visual Studio to help you extract CRM customisations locally. It also has a partner project xrm-datamigration which helps you extract data from CRM, version control it and deploy it to other environments in your release pipeline. Both have documentation on the GitHub pages I've linked; this blog post is informative too.

How to setup SonarQube for a large organization

I am in the process of setting up SonarQube for a large organization. I plan to have two separate systems: one for update testing and rule development, and a second production system where the real work occurs. I will be using SQL Server 2014; typically in that case I use a SQL Always On availability group to sync to a DR server in another datacenter. My question is: with a SonarQube instance, does it make sense to provide DR for the application at that level? If my organization can wait for a period of time to stand up a new server in a DR event, would that be possible with a proper backup of the DB? Further, if there were no backups of the DB, what would be lost with a fresh new SonarQube server besides setup/configuration time? Is there historical value in the code scans that would be lost, or would the next scan of the code base put us right back where we were in terms of critical issues found, etc.?
Thanks for your replies.
All the data is stored in the database, so using DR on the database is a good idea. Making backups of the database and restoring them is also a good solution (note that you should also back up the installed plugins).
If you lose the database, you will also lose all the configuration (quality profiles, credentials, etc.) and the history of the analyzed projects.
So to restore a SonarQube instance, you have to:
Restore the database
Restore SonarQube or install the same version
Restore the plugins (${SONAR_HOME}/extensions/plugins)
During the first start, the ES files (${SONAR_HOME}/data/es) will be regenerated and your instance will then be up and running.
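As a concrete (hypothetical) Windows/SQL Server example of that sequence - server names, paths, and the backup location are placeholders:

    # 1. Restore the database backup on the DR SQL Server.
    Invoke-Sqlcmd -ServerInstance "DR-SQL01" -Query "RESTORE DATABASE [sonar] FROM DISK = N'E:\backups\sonar.bak' WITH REPLACE"

    # 2. Install the same SonarQube version, then 3. restore the plugins.
    Copy-Item "\\backup\sonarqube\extensions\plugins\*" "C:\sonarqube\extensions\plugins\" -Recurse

    # First start regenerates the Elasticsearch files under data\es.
    & "C:\sonarqube\bin\windows-x86-64\StartSonar.bat"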
If you have commercial plugins or you are working with a large SonarQube instance, you may contact the sales team to get support on this setup.
Disclaimer: I work at SonarSource.

How can I update the new code on my site in the Windows Azure cloud?

I am a new Windows Azure user. I signed up for the 90-day trial account and was able to upload my ASP.NET MVC3 application; my site is running now. After I published the site, I added more models, views, and controllers to my program. Now I cannot find a way to update my application. I can publish the application again, but there is no update option, and I want to update only my new code; the package option creates the full application. How can I update the new code on my site in the Windows Azure cloud?
With Windows Azure you can publish/update an application in the following ways:
Log into your Windows Azure account. Select your hosted service name and in the top panel you will see the "Upgrade" option; when you use this option you will be given the chance to select your CSPKG and CSCFG files from the local file system or from Windows Azure storage. Once you have selected the new or updated CSPKG, your currently running service will be upgraded.
You can also use the Windows Azure PowerShell Cmdlets to upgrade your currently running hosted service with the "Update-Deployment" command:
2.1 http://wappowershell.codeplex.com/
You can also use 3rd-party applications built on the Windows Azure Service Management API to upgrade/manage your currently running hosted service.
3.1 http://wapmmc.codeplex.com/
3.2 http://www.cerebrata.com/Products/CloudStorageStudio/Default.aspx
Note: If you publish your application again from Visual Studio, it will delete the currently running hosted service and then create a new one, so it is not a good option for updates.
Finally, regarding your question about partial updates: that is not supported. Even when you make a single-line change in your code, the deployment is considered a full deployment, even when the action is "update/upgrade". There is no diff-package deployment, so every time you update your Windows Azure application you will use the newly created CSPKG file to upgrade your hosted application; a scripted example follows.
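For reference, the scripted upgrade with the later official (classic) Azure Service Management PowerShell module looks roughly like this; the service name, file paths, and slot are placeholders:

    # Upload the new package and upgrade the running deployment in place.
    Set-AzureDeployment -Upgrade `
        -ServiceName "myhostedservice" `
        -Package ".\MyApp.cspkg" `
        -Configuration ".\ServiceConfiguration.Cloud.cscfg" `
        -Slot Production `
        -Mode Auto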
Regarding partial updates: if you have multiple roles, you may choose to upgrade a single role (which would be a partial update of the deployment). For a given role, all code is redeployed. If you're running more than one instance, the update will be rolled out across groups of instances, not all instances at once.
For updates such as static content: if you move these into blob storage (a great place for CSS, jQuery, images, etc.), then you may update this content simply by uploading new items to blob storage individually. These updates don't require any code to be rebuilt or redeployed.
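For example, refreshing one stylesheet becomes a single upload rather than a redeployment; a sketch with the classic Azure storage cmdlets, where the account, key, container, and file are placeholders:

    # Overwrite a single static asset in blob storage; no package rebuild needed.
    $ctx = New-AzureStorageContext -StorageAccountName "mystatics" -StorageAccountKey $key
    Set-AzureStorageBlobContent -File ".\site.css" -Container "assets" -Context $ctx -Force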
If you're in dev mode (i.e. non-production), you may enable Web Deploy, which then allows very fast updates of your app on the running instance. This only works in single-instance mode, and it's great for frequent code+test cycles.
