I would like to know how to automatically manage a portal's custom code, just like in TFS/VSTS.
At present I am using XrmToolBox to manage, push, or pull the portal's code into the CRM instance, but the disadvantage is that it offers no check-in and check-out.
Can anyone help me manage this code with automatic pull and push into the CRM instance, along with check-in and check-out options?
Thanks in advance!
I'm afraid the XrmToolBox plugin doesn't support it yet.
Ref: https://github.com/MscrmTools/MscrmTools.PortalCodeEditor/issues/13
But there is nothing stopping you from creating your own pipeline; at the end of the day, portal code is just a bunch of CRM entities. The Configuration Migration tool is part of the CRM SDK; the latest version is here:
https://www.nuget.org/packages/Microsoft.CrmSdk.XrmTooling.ConfigurationMigration.Wpf
So the idea is:
1) Get this tool.
2) Define the entities you want to back up and create a schema XML file for them. I think you'd want adx_webpage, adx_webfile, adx_pagetemplate (and all attributes of them); see the sketch after this list.
3) Export the data using this schema. This exports it to a .zip package with a simple structure (a schema file and a data file), so you can unzip it and store it in your Git branch (pull).
4) For push, zip the files again and use the Configuration Migration tool to import the data.
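To give an idea, here is a minimal hand-written sketch of such a schema file (the field lists and type names are illustrative assumptions; in practice, let the Configuration Migration tool generate the schema by selecting the entities in its UI):

<entities>
  <entity name="adx_webpage" displayname="Web Page" primaryidfield="adx_webpageid" primarynamefield="adx_name">
    <fields>
      <field name="adx_name" displayname="Name" type="string" />
      <field name="adx_copy" displayname="Copy (HTML)" type="memo" />
    </fields>
  </entity>
  <entity name="adx_webfile" displayname="Web File" primaryidfield="adx_webfileid" primarynamefield="adx_name">
    <fields>
      <field name="adx_name" displayname="Name" type="string" />
    </fields>
  </entity>
</entities>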
This also gives you the opportunity to keep a separate dev version and a production version of the portal code (which is always a good thing).
Portal code is made up of configuration changes to a solution (which can be extracted as XML) and data (records such as web pages, web roles, etc.).
There are several tools available to help you source control both.
xrm-ci-framework provides automation tools to extract your CRM solution as XML and then source-control it. You can do this locally or in the cloud with Azure DevOps or another CI service.
msbuild-xrm-sourcecontrol is similar. It integrates into Visual Studio to help you extract CRM customisations locally. It also has a partner project, xrm-datamigration, which helps you extract data from CRM, version-control it, and deploy it to other environments in your release pipeline. Both have documentation on the GitHub pages I've linked; this blog post is informative too.
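Under the hood, the solution-to-XML step in tools like these typically drives the SDK's SolutionPackager; a minimal sketch of the idea (file and folder names are placeholders):

# Unpack an exported solution .zip into source-controllable XML files
SolutionPackager.exe /action:Extract /zipfile:MySolution.zip /folder:src\MySolution

# Rebuild the .zip from source control for import into the target instance
SolutionPackager.exe /action:Pack /zipfile:MySolution.zip /folder:src\MySolution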
I currently have a desktop PBIX file that I manually publish to Power BI Web.
I have to keep different versions of the same PBIX file just to keep track of different sources for each environment (Dev/QA/UAT/Prod, etc.).
I have more than one data source for each environment; i.e., in the same PBIX file I have data coming from, say, Application Insights and a REST API.
I scanned through the Power BI community to see how to do this but can't find relevant information. All pointers are about either refreshing the local PBIX or using the Scheduled Refresh option in Power BI Web.
Someone even wrote code to trigger Publish via OLE automation, but that's not an acceptable solution.
https://community.powerbi.com
I would like to automate this process such that
A. I can provide the data source connection string/credentials externally, based on the environment I want to publish to.
B. The report is published to Power BI Web using a service account instead of my own.
Our current build and deployment toolset does allow the use of PowerShell, Azure CLI, etc., so it would be helpful if the solution used those.
Fetching data from SQL Azure won't need a refresh, but it's expensive.
In one of the organizations I worked for, they used views on SQL Azure to accomplish this task.
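For the publishing side of the question, here is a hedged PowerShell sketch using the MicrosoftPowerBIMgmt module (cmdlet and parameter names are from that module as I recall them; the GUIDs, parameter name, and environment variables are placeholders):

# Sign in as a service principal instead of a personal account
$secret = ConvertTo-SecureString $env:PBI_CLIENT_SECRET -AsPlainText -Force
$cred = New-Object System.Management.Automation.PSCredential($env:PBI_CLIENT_ID, $secret)
Connect-PowerBIServiceAccount -ServicePrincipal -Credential $cred -TenantId $env:PBI_TENANT_ID

$workspaceId = "<workspace-guid>"   # placeholder
$datasetId = "<dataset-guid>"       # placeholder

# Publish (or overwrite) the PBIX in the target workspace
New-PowerBIReport -Path .\Report.pbix -Name "Report" -WorkspaceId $workspaceId -ConflictAction CreateOrOverwrite

# Re-point the dataset per environment via a Power Query parameter
$body = '{"updateDetails":[{"name":"Environment","newValue":"Prod"}]}'
Invoke-PowerBIRestMethod -Method Post -Body $body -Url "groups/$workspaceId/datasets/$datasetId/Default.UpdateParameters"

Note this assumes the PBIX exposes its sources as Power Query parameters (so requirement A becomes an UpdateParameters call), and an import-mode dataset still needs a refresh afterwards.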
I am trying to create VDMs using an EDMX from SFSF, following this blog.
I create an SCP Business Application template, and then from the srv module I try to add a new data model from an external source, in this case the API Business Hub.
I try to use SuccessFactors Employee Central - Personal Information.
https://api.sap.com/api/ECPersonalInformation/overview
The process starts and fails with the message: "OData models with multiple schemas are not supported" and then "Could not generate Virtual Data Model classes."
The external folder is generated as expected with the XML in the EDMX folder but the csn folder is empty.
As I understand it, this should work with any API from the API Business Hub. Am I doing something wrong, or am I missing something?
Thanks.
Update:
There seems to be an issue with the conversion from EDMX into CSN used by the Web IDE (which is not part of the SAP Cloud SDK).
The Java VDM generated by the OData Generator from the SAP Cloud SDK (used as a component by the Web IDE) should work without any problem.
This looks like an unexpected behavior. We will investigate this further.
In the meantime, as a workaround, you can use our Maven plugin or CLI to create the data model for you. This is described in detail in this blog post; a rough sketch of the Maven configuration also follows at the end of this answer.
The tl;dr version (for the CLI) is:
Determine which version of the SAP Cloud SDK you are using (search for sdk-bom in your parent pom.xml). I assume this to be version 2.16.0 for this example.
Download the CLI library from maven central: https://search.maven.org/artifact/com.sap.cloud.s4hana.datamodel/odata-generator-cli/2.16.0/jar
Download the metadata file (edmx) from the API Business Hub (as linked in your question)
Run the CLI with e.g. the following command:
java -jar odata-generator-cli-2.16.0.jar -i <input-directory> -o <output-directory> -b <base-path>
The <base-path> is the service-independent prefix that sits between your host configuration and the actual service name in the URL.
Add the generated code manually to your project.
I will update this answer with the results of the investigation.
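For the Maven plugin route mentioned above, a rough sketch of what the configuration could look like (the plugin coordinates and parameter names are my assumptions, mirroring the CLI flags; please verify against the linked blog post):

<plugin>
    <groupId>com.sap.cloud.s4hana.datamodel</groupId>
    <artifactId>odata-generator-maven-plugin</artifactId>
    <version>2.16.0</version>
    <executions>
        <execution>
            <goals>
                <goal>generate</goal>
            </goals>
            <configuration>
                <inputDirectory>${project.basedir}/edmx</inputDirectory>
                <outputDirectory>${project.build.directory}/vdm</outputDirectory>
                <defaultBasePath>odata/v2/</defaultBasePath>
            </configuration>
        </execution>
    </executions>
</plugin>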
We are just configuring our first Microsoft CRM Portal using the Portals Add-In.
We are doing this in our sandbox, obviously. Question: How do we deploy this to our production system? As far as I see, most of the data is not saved in a solution but in entity records. Is there a practical way to deploy the stuff?
You'll want to use the Portal Records Mover in the XrmToolBox. It will allow you to export/import the relevant records for the Portal.
https://community.dynamics.com/crm/b/dynamicscrmtools/archive/2017/06/13/new-xrmtoolbox-plugin-portal-records-mover
Alternatively, you can use the Configuration Migration tool (available in the SDK NuGet package) to move the records. I prefer the Portal Records Mover because it provides more granular control and supports updates without overwriting.
As suggested by @Nicknow, use the Configuration Migration tool and the Portal Records Mover.
I would do the first transfer with the Configuration Migration tool and continue with the Portal Records Mover, where you can filter records that were updated within a given time frame.
You can also build your own SSIS package to filter the modified records and move them to the production system.
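Whichever tool you use, the incremental filter boils down to a condition on modifiedon; a minimal FetchXML sketch (entity, attribute list, and date are illustrative):

<fetch>
  <entity name="adx_webpage">
    <attribute name="adx_name" />
    <filter>
      <condition attribute="modifiedon" operator="on-or-after" value="2019-01-01" />
    </filter>
  </entity>
</fetch>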
Does anyone know about this problem: any new fields I add work fine in the local back office, but when I use WebMatrix to publish to the server (discountASP.net), the fields don't show up. I did a view-source in the browser and they're just not there!
For example, @Umbraco.Field("comments")
Thanks!
Daniel
If you add new fields, they are only added in the local database. That means you would need to update the database on the production website; WebMatrix doesn't do this for you (by default).
There are a few ways to handle this scenario:
copy your database to the production server (I would advise against this, because you might overwrite content and media changes on the production server)
create the fields manually on the production server (easy solution)
use a commercial package like Courier (personally I believe it's a good solution, but only if you have a content-staging workflow)
use a free package like uSync (http://our.umbraco.org/projects/developer-tools/usync)
I am trying to create a custom workflow in MS CRM 4 so that when a task is completed, it takes some of the task's attributes and adds an entry on a timesheet in Project Server. I am able to access the Project Server web services (PSI) and create a timesheet entry from a C# console app, and I can build other custom workflows in CRM that aren't related to Project Server. However, when using the PSI I have to reference and include three Office Project DLLs, and I am unsure how to get those registered in CRM when I do the custom workflow plugin registration. Any thoughts would be helpful.
Thanks
In my experience, you're either going to have to deploy those DLLs to the Server\bin directory or merge them with your DLL using something like ILMerge and register it all as one big chunk.
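If you go the ILMerge route, the invocation is roughly as follows (the Project Server assembly names are placeholders; substitute the three DLLs you actually reference, and keep your workflow assembly first, since ILMerge treats the first input as the primary assembly):

ILMerge.exe /out:MyWorkflow.Merged.dll MyWorkflow.dll ProjectServerRef1.dll ProjectServerRef2.dll ProjectServerRef3.dll

You would then register MyWorkflow.Merged.dll with the plugin registration tool. Depending on the target CLR you may also need ILMerge's /targetplatform switch.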