I am trying to create a custom workflow in MS CRM 4 so that when a task is completed, it takes some of the attributes of the task and adds an entry to a timesheet in Project Server. I am able to access the Project Server web services (PSI) and create a timesheet entry from a C# console app, and I can build other custom workflows in CRM that are not related to Project Server. When using the Project Server web services (PSI) I have to reference and include three Office Project DLLs, but I am unsure how to get those registered in CRM when I do the custom workflow plugin registration. Any thoughts would be helpful.
Thanks
In my experience, you're either going to have to deploy those DLLs to the Server\bin directory or merge them with your DLL using something like ILMerge and register it all as one assembly.
I would like to know how to manage the portal's custom code automatically, just like in TFS/VSTS.
At present I am using XRMToolbox to manage, push, or pull the portal's code into the CRM instance, but the disadvantage is that there is no code check-in and check-out.
Can anyone help me manage the code with automatic pull and push into the CRM instance, with check-in and check-out options?
Thanks in Advance!
I'm afraid the XRMToolbox plugin doesn't support it yet.
Ref: https://github.com/MscrmTools/MscrmTools.PortalCodeEditor/issues/13
But there is nothing stopping you from creating your own pipeline - at the end of the day, portal code is just a bunch of CRM entities. Part of the CRM SDK is the Configuration Migration tool - the latest version is here:
https://www.nuget.org/packages/Microsoft.CrmSdk.XrmTooling.ConfigurationMigration.Wpf
So the idea is:
1) Get this tool
2) Define the entities you want to back up and create a schema xml file for them. I think you'd want adx_webpage, adx_webfile, adx_pagetemplate (and all attributes on them)
3) Export the data using this schema - this produces a .zip package with a simple structure (a schema file and a data file), so you can unzip it and store it in your git branch (pull)
4) For push, zip the folder again and use the Configuration Migration tool to import the data (see the sketch below)
This also gives you the opportunity to keep a separate dev version and a production version of the portal code (which is always a good thing).
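If it helps, here is a rough C# sketch of the unzip (pull) and re-zip (push) parts of steps 3 and 4; all paths are placeholders, and the actual export/import is still done with the Configuration Migration tool:

using System.IO;
using System.IO.Compression; // ZipFile lives in System.IO.Compression.FileSystem (.NET 4.5+)

class PortalCodeSync
{
    // Placeholder paths - adjust to your environment and repository layout.
    const string ExportZip = @"C:\Exports\PortalData.zip";  // produced by the Configuration Migration tool
    const string RepoFolder = @"C:\Repos\portal-code\data"; // working copy of your git branch
    const string ImportZip = @"C:\Imports\PortalData.zip";  // package to feed back into the tool

    // Pull: unpack the exported package (schema and data xml files) into the repo
    // so every change shows up as a normal git diff.
    static void Pull()
    {
        if (Directory.Exists(RepoFolder))
            Directory.Delete(RepoFolder, recursive: true);
        ZipFile.ExtractToDirectory(ExportZip, RepoFolder);
    }

    // Push: re-create the package from the repo and import it with the
    // Configuration Migration tool.
    static void Push()
    {
        if (File.Exists(ImportZip))
            File.Delete(ImportZip);
        ZipFile.CreateFromDirectory(RepoFolder, ImportZip);
    }
}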
Portal code is made up of configuration changes to a solution (which can be extracted as XML) and data (records such as web pages, web roles, etc.).
There are several tools available to help you source control both.
xrm-ci-framework provides automation tools to extract your CRM solution as XML and then source control it. You can do this locally or in the cloud with Azure DevOps or other services.
msbuild-xrm-sourcecontrol is similar. It integrates into Visual Studio to help you extract CRM customisations locally. It also has a partner project xrm-datamigration which helps you extract data from CRM, version control it and deploy it to other environments in your release pipeline. Both have documentation on the GitHub pages I've linked; this blog post is informative too.
I'm having trouble accessing an external database from a CRM plugin.
The error I receive is:
"Request for the permission of type 'System.Data.SqlClient.SqlClientPermission, System.Data, Version=4.0.0.0, Culture=neutral, PublicKeyToken=xxx' failed."
The code runs great locally within a "unit test". I made sure to set the plugin isolation mode to "none".
I tried looking at this article for help, and tried everything it suggested with no luck.
Here is the current code I'm using:
var conn = new SqlConnection(@"Server=MyServer\Instance;DataBase=MyDB;User Id=MyUser;Password=MyPassword;Integrated Security=false;");
conn.Open();
I also tried this connection string and giving the NT AUTHORITY\NETWORK SERVICE user access to the database.
var conn = new SqlConnection(@"Data Source=MyDS\Instance;Initial Catalog=MyDB;Integrated Security=SSPI;");
conn.Open();
I'm on Dynamics CRM 2015 On-Premise.
Update: I found out it works when I don't debug, but I get the error when I try to debug it through the Plugin Registration Tool. Any idea why that would happen?
A SQL connection requires full trust to establish, and the CRM plugin sandbox does not run with full trust.
We run CRM 2013 On-Premise and I frequently make calls to external databases from custom plugins and workflows; to get around the security issues, I created a web service which handles these requests.
For example, a call to update a record in DB2 when an account is updated would work like this:
Account record updated in CRM
Account plugin fired
Establish connection to MyCompanyWebService
Call UpdateDB2 (method within MyCompanyWebService)
Of course you have the overhead of developing a separate web service, but (on the bright side) it lets you separate the logic and you can fully control the trust level within your web service.
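For illustration, a minimal sketch of the plugin side might look like the following; MyCompanyWebService, its generated client class, and the UpdateDB2 signature are placeholders for whatever your own service actually exposes (added to the plugin assembly as a normal WCF service reference):

using System;
using Microsoft.Xrm.Sdk;

public class AccountPostUpdatePlugin : IPlugin
{
    public void Execute(IServiceProvider serviceProvider)
    {
        var context = (IPluginExecutionContext)
            serviceProvider.GetService(typeof(IPluginExecutionContext));

        var target = context.InputParameters.Contains("Target")
            ? context.InputParameters["Target"] as Entity
            : null;
        if (target == null || target.LogicalName != "account")
            return;

        // Hand the actual DB2 work to the externally hosted web service,
        // which runs in full trust and owns the database connection.
        using (var client = new MyCompanyWebService.MyCompanyWebServiceClient())
        {
            client.UpdateDB2(target.Id, target.GetAttributeValue<string>("name"));
        }
    }
}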
The Plugin Registration Tool has only limited debugging capabilities. It was designed for CRM Online, where you cannot use the debugging options of Visual Studio. In on-premise deployments use either remote debugging or install Visual Studio on the CRM server; the latter is the recommended approach.
That error indicates your code is running in partial trust (sandbox); it fails even before trying to connect to the SQL Server instance because it doesn't have permission to instantiate the SqlClient classes.
In Dynamics CRM 2015 On-Premise you don't have to run plugins in the sandbox if you don't want to; the sandbox is a requirement for Dynamics CRM Online only.
Did you try running outside the sandbox? Did you do an iisreset after changing the plugin isolation mode?
Here is an article with more details.
How can I go about obtaining the latest WSDL files from a CRM 4 deployment?
Currently we have a deployment in place on a hosted solution where there are two WSDL files available via Settings > Customization > Download Web Service Description Files
From this location there are two files available:
- CrmService.asmx
- MetadataService.asmx
If I click on these files, they open the following URLs:
- http://be-crm4.domain.co.uk/MSCrmServices/2007/CrmServiceWsdl.aspx
- http://be-crm4.domain.co.uk/MSCrmServices/2007/MetadataService.asmx?WSDL
However, looking at the Visual Studio C# connector tool that a previous developer wrote to interact with the current CRM instance, I can see that it references three WSDLs:
CrmService
CRMMetaService
CrmDiscoveryService
The Discovery service URL is as follows:
- http://be-sql-live01/MSCRMServices/2007/AD/CrmDiscoveryService.asmx
We are currently in the process of moving to another server and I am testing the web service component of this, but as there have been changes I want to regenerate the WSDL files.
How can I save the WSDL files from the browser? And how can I find the discovery URL of the web services, as only two of them seem to appear?
I know it's a bit late but maybe someone will find this helpful.
On an on-premise instance using AD, the address of the discovery service is:
http[s]://<hostname>[:port]/mscrmservices/2007/AD/CrmDiscoveryService.asmx
On an IFD instance:
http[s]://<hostname>[:port]/mscrmservices/2007/IFD/CrmDiscoveryService.asmx
You can get the WSDL by adding ?WSDL to the end of the web service address, for example:
http[s]://<hostname>[:port]/mscrmservices/2007/AD/CrmDiscoveryService.asmx?WSDL
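If you prefer to save the WSDL files to disk rather than going through the browser (for example to feed them to wsdl.exe when regenerating proxies), a small sketch like this should do it; the host name is the one from the question and needs replacing with your own server:

using System.Net;

class SaveCrm4Wsdl
{
    static void Main()
    {
        // Use the current user's Windows (AD) credentials against the on-premise deployment.
        using (var client = new WebClient { UseDefaultCredentials = true })
        {
            client.DownloadFile("http://be-crm4.domain.co.uk/MSCrmServices/2007/CrmServiceWsdl.aspx", "CrmService.wsdl");
            client.DownloadFile("http://be-crm4.domain.co.uk/MSCrmServices/2007/MetadataService.asmx?WSDL", "MetadataService.wsdl");
            client.DownloadFile("http://be-crm4.domain.co.uk/MSCRMServices/2007/AD/CrmDiscoveryService.asmx?WSDL", "CrmDiscoveryService.wsdl");
        }
    }
}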
I am a new Windows Azure user. I was selected for the 90-day trial account and I was able to upload my ASP.NET MVC3 application to my account. My site is now running. After publishing the site I added more models, views, and controllers to my program, and now I cannot find a way to update my application. I can publish the application again, but there is no update option; I want to update only my new code, yet the package option builds the full application. How can I update the new code on my site in the Windows Azure cloud?
With Windows Azure you can publish/update an application in the following ways:
1. Log into your Windows Azure account. Select your hosted service and in the top panel you will see an "Upgrade" option. When you use this option you will be given the chance to select your CSPKG and CSCFG files from the local file system or from Windows Azure storage. Once you have selected the new or updated CSPKG, your currently running service will be upgraded.
2. You can also use the Windows Azure PowerShell Cmdlets to upgrade your currently running hosted service with the "Update-Deployment" command:
2.1 http://wappowershell.codeplex.com/
3. You can use other 3rd-party applications created with the Windows Azure Service Management API to upgrade/manage your currently running hosted service.
3.1 http://wapmmc.codeplex.com/
3.2 http://www.cerebrata.com/Products/CloudStorageStudio/Default.aspx
Note: If you publish your application again from Visual Studio, it will delete the currently running hosted service and then create a new one, so it is not a good option for updates.
Finally, regarding your question about partial updates: they are not supported. Even when you make a single-line change in your code, the deployment is considered a full deployment, even when the action is "update/upgrade". There is no diff-package deployment, so every time you update your Windows Azure application you will use the newly created CSPKG file and upgrade your hosted application.
Regarding partial updates: if you have multiple roles, you may choose to upgrade a single role (which would be a partial update of the deployment). For a given role, all code is redeployed. If you're running more than one instance, the update will be rolled out across groups of instances, not all instances at once.
For updates such as static content: if you move it into blob storage (a great place for CSS, jQuery, images, etc.), then you can update that content by simply uploading new items to blob storage individually. These updates don't require any code to be rebuilt or redeployed.
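As a rough sketch of that blob-storage route, using the older StorageClient library that shipped with the Azure SDK of that era (the account, container, and file names are placeholders):

using Microsoft.WindowsAzure;
using Microsoft.WindowsAzure.StorageClient;

class UploadStaticContent
{
    static void Main()
    {
        // Placeholder connection string - use your own storage account name and key.
        var account = CloudStorageAccount.Parse(
            "DefaultEndpointsProtocol=https;AccountName=<account>;AccountKey=<key>");
        var client = account.CreateCloudBlobClient();

        var container = client.GetContainerReference("static");
        container.CreateIfNotExist();
        container.SetPermissions(new BlobContainerPermissions
        {
            PublicAccess = BlobContainerPublicAccessType.Blob
        });

        // Uploading a new css file replaces the old one immediately,
        // with no rebuild or redeploy of the CSPKG.
        var blob = container.GetBlobReference("css/site.css");
        blob.Properties.ContentType = "text/css";
        blob.UploadFile(@"Content\site.css");
    }
}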
If you're in dev mode (e.g. non-production), you may enable Web Deploy, which then allows very fast updates of your app to the running instance. This only works in single-instance mode, and it's great when doing frequent code+test cycles.
I have a solution that contains multiple integration test projects and one web application project. Each integration project connects to the web application when running its tests, and I would like each test project to access the website with its own database connection. I have been trying to use the Web Deploy functionality built into Visual Studio, but I have been unable to figure out what I need to add to the deployment package that is created and/or to the post-build events of the test projects to declare the binding port for the website when it is deployed. For example, I want integration project A to create and access the website at http://localhost:83 and integration project B to create and access the website at http://localhost:82. Could someone please explain:
Is there anything I need to do to the deployment package?
What do I need to add to the post-build events of my integration projects when deploying the package, so that the website is created on the correct port when building the project?
Update:
I'm wanting to deploy the same site to two different locations on my machine so that I can run both sets of integration tests at the same time.
Update 2:
I have researched the Web Deploy tool and it allows you to specify parameters that modify what is deployed when you call it from the command line. However, I have found the documentation very confusing. http://technet.microsoft.com/en-us/library/dd568968(WS.10).aspx
Update 3:
I expect these to be two different websites, each pointing to its own database. If possible I would like a single package that can be deployed using msdeploy, which will then be called in a post-build event from each of the integration test projects. I would like to specify the connection string and deployment location from the post-build script of the integration project.
You can try the WebDev.WebServer included with Visual Studio; Visual Studio uses it to start a web server when you debug. With it you can start a web server on the desired port (if the port is not currently in use).
I made a bat file to change some options; check it out:
::Begin of bat file
cd C:\Program Files\Common Files\microsoft shared\DevServer\10.0\
WebDev.WebServer40.exe /port:80 /path:"C:\PATHTOYOURWEBPROJECT" /vpath:"/NAMEOFYOURWEBPROJECT"
::End of bat file
You can access it at: http://localhost:80
I use WebServer40, but if you don't have .NET 4 or VS2010 you can try to find WebDev.WebServer[xx version].exe
I hope this is helpful, and sorry for my broken English.
First off, you're approaching this the wrong way.
> I would like for each test project to access the website with its own database connection.
Who is creating the DB connection? Your web site or the test project? For the rest of your question to make sense, I presume it's the web site (otherwise, Project A and Project B cannot share a connection out of the box).
If your website is making the connection, then unless you're caching or holding a static connection, a new connection is made as each request hits your site on a new thread. A simpler alternative is to take a query parameter and initiate a new connection based on it. If you seed it off the caller, you can also use it for more detailed logging.
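As a rough sketch of that query-parameter idea (the parameter name and connection string keys below are made up), the site could pick its connection per request like this:

using System;
using System.Configuration;
using System.Data.SqlClient;

public partial class ReportPage : System.Web.UI.Page
{
    protected void Page_Load(object sender, EventArgs e)
    {
        // e.g. http://localhost:83/myApp/ReportPage.aspx?db=IntegrationA
        var key = Request.QueryString["db"] ?? "Default";
        var connectionString = ConfigurationManager.ConnectionStrings[key].ConnectionString;

        using (var conn = new SqlConnection(connectionString))
        {
            conn.Open();
            // ... run this request's queries against the chosen database ...
        }
    }
}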
Web Deployment projects are meant for deploying to integration servers, which means you cannot access them via http://localhost... but only via the full FQDN of the server.
Most importantly, http://localhost:82/myApp and http://localhost:83/myApp are two different sites (unless you redirect from one of them to the other, which can itself cause additional issues) running the same codebase.
Having said that, you would then need to deploy your website twice, and then all you need to do is change the config/settings entry in Project A and B to point to these two different sites.
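For that last step, the config/settings entry can be as simple as an appSetting that each test project carries in its own app.config and reads in a tiny helper; a sketch (the key name is made up):

using System.Configuration;

// Project A's app.config: <add key="SiteBaseUrl" value="http://localhost:83/myApp" />
// Project B's app.config: <add key="SiteBaseUrl" value="http://localhost:82/myApp" />
static class TestSettings
{
    public static string SiteBaseUrl
    {
        get { return ConfigurationManager.AppSettings["SiteBaseUrl"]; }
    }
}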
Hope this makes sense.
You can define a virtual host configuration.
Refer to this guide for more information:
http://docs.jboss.org/jbossas/guides/webguide/r2/en/html/ch07.html