Is there any way to track changes to Metadata, like new fields, new entities and so on?
It is difficult to control a very large project in the same environment, so sometimes there are customizations that should not be deployed to production (mostly mistakes or tests made in a development environment).
And is there a way to know who made a given customization?
I am looking to track every possible change, not any one in particular.
You have to use RetrieveMetadataChangesRequest, and it is not possible to know who made the change.
This is available only from Microsoft Dynamics CRM 2011 Update Rollup 12 onwards.
This request is intended to be used to cache metadata and work offline, but we can use it to track changes to metadata in complex projects with large teams.
Examples on the internet are not very friendly, so this is how you can use the request.
The request can be executed by filling in only one parameter:
RetrieveMetadataChangesRequest req = new RetrieveMetadataChangesRequest()
{
ClientVersionStamp = null
};
var response = (RetrieveMetadataChangesResponse)service.Execute(req);
The first time you execute this request, ClientVersionStamp needs to be null, because no metadata request has been made before and there is no version stamp yet. This parameter represents the last time you queried for metadata changes; if it is null, the request returns every customization ever made, so it will probably not complete in time and we need to tune it up.
var EntityFilter = new MetadataFilterExpression(LogicalOperator.And);
EntityFilter.Conditions.Add(new MetadataConditionExpression("SchemaName", MetadataConditionOperator.Equals, "ServiceAppointment"));
var entityQueryExpression = new EntityQueryExpression()
{
Criteria = EntityFilter
};
RetrieveMetadataChangesRequest req = new RetrieveMetadataChangesRequest()
{
Query = entityQueryExpression,
ClientVersionStamp = null
};
var response = (RetrieveMetadataChangesResponse)service.Execute(req);
This will query all metadata changes for "ServiceAppointment" (feel free to use the entity you want). What we need is the ServerVersionStamp from the response; it will look like "22319800!09/13/2017 16:17:46". If you try to send a version stamp on the first call, it will throw an exception, so it is necessary to query first to get a server version stamp.
Now you can use the request and the version stamp to retrieve all changes made since "22319800!09/13/2017 16:17:46":
RetrieveMetadataChangesRequest req = new RetrieveMetadataChangesRequest()
{
Query = entityQueryExpression,
ClientVersionStamp = "22319800!09/13/2017 16:17:46"
};
var response = (RetrieveMetadataChangesResponse)service.Execute(req);
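To act on the returned delta, you can read EntityMetadata and DeletedMetadata from the response and keep the new ServerVersionStamp for the next call. A minimal sketch, assuming you only want to log what changed (the console output and the nextVersionStamp variable are illustrative):
// Changed or new metadata comes back in EntityMetadata
foreach (EntityMetadata em in response.EntityMetadata)
{
    Console.WriteLine("Changed entity: " + em.LogicalName);
}

// Deleted items are reported by id, grouped by component type
if (response.DeletedMetadata != null && response.DeletedMetadata.ContainsKey(DeletedMetadataFilters.Attribute))
{
    foreach (Guid deletedId in response.DeletedMetadata[DeletedMetadataFilters.Attribute])
    {
        Console.WriteLine("Deleted attribute metadata id: " + deletedId);
    }
}

// Store this and pass it as ClientVersionStamp on the next request
string nextVersionStamp = response.ServerVersionStamp;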
You can filter the query to match your needs: search only for specific entities, labels, relationships, keys and attributes, or for specific properties.
EntityQueryExpression entityQueryExpression = new EntityQueryExpression()
{
Criteria = EntityFilter,
Properties = EntityProperties,
RelationshipQuery = new RelationshipQueryExpression()
{
Properties = RelationshipProperties,
Criteria = RelationshipFilter
},
AttributeQuery = new AttributeQueryExpression()
{
Properties = AttributeProperties,
Criteria = AttributeFilter
}
};
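The placeholder property lists and filters above (EntityProperties, AttributeFilter and so on) are not defined in the snippet; as an example, they could be built roughly like this (the chosen property names and the picklist condition are only illustrative):
var EntityProperties = new MetadataPropertiesExpression("LogicalName", "DisplayName", "Attributes");
var AttributeProperties = new MetadataPropertiesExpression("LogicalName", "AttributeType", "DisplayName");

// Example: only bring back picklist attributes
var AttributeFilter = new MetadataFilterExpression(LogicalOperator.And);
AttributeFilter.Conditions.Add(new MetadataConditionExpression("AttributeType", MetadataConditionOperator.Equals, AttributeTypeCode.Picklist));

var RelationshipProperties = new MetadataPropertiesExpression("SchemaName");
var RelationshipFilter = new MetadataFilterExpression(LogicalOperator.And); // no conditions = all relationships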
Use this request and implement it the way you need.
A couple more options:
Register a plugin on Publish and Publish All, and track who did the publish and when. That may help you narrow down who was making changes, although someone could technically make a change without publishing it, so it's not perfect information; a minimal sketch of such a plugin follows.
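As a sketch (register it on the Publish and PublishAll messages; here the information only goes to the plugin trace log, but you could write it to a custom entity instead):
using System;
using Microsoft.Xrm.Sdk;

public class TrackPublishPlugin : IPlugin
{
    public void Execute(IServiceProvider serviceProvider)
    {
        var context = (IPluginExecutionContext)serviceProvider.GetService(typeof(IPluginExecutionContext));
        var tracing = (ITracingService)serviceProvider.GetService(typeof(ITracingService));

        // InitiatingUserId is the user who actually triggered the publish
        tracing.Trace("Message: {0}, User: {1}, Time (UTC): {2}",
            context.MessageName, context.InitiatingUserId, DateTime.UtcNow);
    }
}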
If you're using Dynamics OnPremise, the metadata tables sometimes store information about who made a change that is not visible through a metadata retrieve. I've found this to be very spotty though; not all metadata has a Modified By user stored.
I have a question regarding a small issue that I'm having. I've created a widget that will live on the Service Portal to allow an admin to Accept or Reject requests.
The data for the widget is pulled from the Approvals (sysapproval_approver) table. In my GlideRecord, I have a query that checks for the state 'requested' (e.g. addQuery('state', 'requested')).
To narrow down the search, I tried entering addQuery('sys_id', current.sys_id). When I use this query, my script breaks and I get an error on the Service Portal end.
Here's a sample of the GlideRecord script I've written to Accept.
//Accept Request
if(input && input.action=="acceptApproval") {
    var inRec1 = new GlideRecord('sysapproval_approver');
    inRec1.addQuery('state', 'requested');
    //inRec1.get('sys_id', current.sys_id);
    inRec1.query();
    if(inRec1.next()) {
        inRec1.setValue('state', 'Approved');
        inRec1.setValue('approver', gs.getUserID());
        gs.addInfoMessage("Accept Approval Processed");
        inRec1.update();
    }
}
I've researched the web and tried using $sp.getParameter() as a workaround, with no change.
I would really appreciate any help or insight on what I can do differently to get the script to work and filter the right records.
If I understand your question correctly, you are asking how to get the sysId of the sysapproval_approver record from the client-side in a widget.
Unless you have defined current elsewhere in your server script, current is undefined. Secondly, $sp.getParameter() is used to retrieve URL parameters. So unless you've included the sysId as a URL parameter, that will not get you what you are looking for.
One pattern that I've used is to pass an object to the client after the initial query that gets the list of requests.
When you're ready to send input to the server from the client, you can add relevant information to the input object. See the simplified example below. For the sake of brevity, the code below does not include error handling.
// Client-side function
approveRequest = function(sysId) {
    $scope.server.get({
        action: "acceptApproval", // must match the action checked in the server script
        sysId: sysId
    })
    .then(function(response) {
        console.log("Request approved");
    });
};
// Server-side
var requestGr = new GlideRecord('sysapproval_approver');
requestGr.addQuery("SOME_QUERY");
requestGr.query(); // Retrieve initial list of requests to display in the template
data.requests = []; // Add array of requests to data object to be passed to the client via the controller
while(requestGr.next()) {
    data.requests.push({
        "number": requestGr.getValue("number"),
        "state" : requestGr.getValue("state"),
        "sysId" : requestGr.getValue("sys_id")
    });
}

if(input && input.action=="acceptApproval") {
    var sysapprovalGr = new GlideRecord('sysapproval_approver');
    if(sysapprovalGr.get(input.sysId)) {
        sysapprovalGr.setValue('state', 'Approved');
        sysapprovalGr.setValue('approver', gs.getUserID());
        sysapprovalGr.update();
        gs.addInfoMessage("Accept Approval Processed");
    }
    ...
I'm using the NEST client to programmatically execute requests against an Elasticsearch index. I need to use the UpdateByQuery API to update existing data in my index. To improve performance on large data sets, the recommended approach is to use slicing. In my case I'd like to use the automatic slicing feature documented here.
I've tested this out in the Kibana dev console and it works beautifully. I'm struggling with how to set this property in code through the NEST client interface. Here's a code snippet:
var request = new Nest.UpdateByQueryRequest(indexModel.Name);
request.Conflicts = Elasticsearch.Net.Conflicts.Proceed;
request.Query = filterQuery;
// TODO Need to set slices to auto but the current client doesn't allow it and the server
// rejects a value of 0
request.Slices = 0;
var elasticResult = await _elasticClient.UpdateByQueryAsync(request, cancellationToken);
The comments on that property indicate that it can be set to "auto", but it expects a long so that's not possible.
// Summary:
// The number of slices this task should be divided into. Defaults to 1, meaning
// the task isn't sliced into subtasks. Can be set to `auto`.
public long? Slices { get; set; }
Setting it to 0 just throws an error on the server. Has anyone else tried doing this? Is there some other way to configure this behavior? Other APIs seem to have the same problem, like ReindexOnServerAsync.
This was a bug in the spec and an unfortunate consequence of generating this part of the client from the spec.
The spec has been fixed and the change will be reflected in a future version of the client. For now though, it can be set with the following
var request = new Nest.UpdateByQueryRequest(indexModel.Name);
request.Conflicts = Elasticsearch.Net.Conflicts.Proceed;
request.Query = filterQuery;
((IRequest)request).RequestParameters.SetQueryString("slices", "auto");
var elasticResult = await _elasticClient.UpdateByQueryAsync(request, cancellationToken);
Due to some bad practices from one of our internal users, we need to transfer all activities (emails, notes, etc.) from one contact to another contact. I was trying to achieve this via the UI and I could not find a way to do it.
Is this possible? I'm looking for any way to achieve this, whether it is a CRM tool, SSIS, the UI or any other way. Only admins will do this, so we do not need anything fancy, as it will be done maybe 4 times a year to clean up some data.
Thanks a lot :)
I tried using the UI but with no success.
I can think of two ways to do these updates.
The first method is selecting a view of an activity type where the owner is listed (i.e. All Phone Calls) and exporting it to Excel. This downloads an XLSX with some hidden columns at the start where the IDs of the records are kept. You then update the owner column with the new owner (take care to copy the exact full name) and import the spreadsheet again. You would need to repeat these export/import steps for each activity type (phone calls, emails, etc.), so this might be impractical if you have a large volume of data, both because of the repetition and because there is a maximum number of records you can export.
The other way to do this is using some .NET code; for that you will need Visual Studio 2019 or a similar C# development environment.
If that's the case, this will do the trick:
using System;
using Microsoft.Xrm.Sdk;
using Microsoft.Xrm.Sdk.Query;
using Microsoft.Xrm.Tooling.Connector;
namespace ChangeActivitiesOwner
{
class Program
{
static void Main(string[] args)
{
string connectionString = "AuthType=Office365;Url=<TODO:URL>;Username=<TODO:User>;Password=<TODO:Pass>;";
string oldUserFullname = ""; // TODO: place here fullname for the user you want to overwrite
string newUserFullname = ""; // TODO: place here fullname for the user you want to overwrite with
CrmServiceClient client = new CrmServiceClient(connectionString);
IOrganizationService service = client.OrganizationWebProxyClient != null ? client.OrganizationWebProxyClient : (IOrganizationService)client.OrganizationServiceProxy;
QueryByAttribute qbyaOldUser = new QueryByAttribute("systemuser");
qbyaOldUser.AddAttributeValue("fullname", oldUserFullname);
Guid olduserid = (Guid)service.RetrieveMultiple(qbyaOldUser)[0].Attributes["systemuserid"];
QueryByAttribute qbyaNewUser = new QueryByAttribute("systemuser");
qbyaNewUser.AddAttributeValue("fullname", newUserFullname);
Guid newuserid = (Guid)service.RetrieveMultiple(qbyaNewUser)[0].Attributes["systemuserid"];
foreach (string activity in new string[]{ "task", "phonecall", "email", "fax", "appointment", "letter", "campaignresponse", "campaignactivity" }) // TODO: Add other activities as needed!!!
{
QueryExpression query = new QueryExpression(activity)
{
ColumnSet = new ColumnSet("activityid", "ownerid")
};
query.Criteria.AddCondition(new ConditionExpression("ownerid", ConditionOperator.Equal, olduserid));
foreach (Entity e in service.RetrieveMultiple(query).Entities)
{
e.Attributes["ownerid"] = new EntityReference("systemuser", newuserid);
service.Update(e);
}
}
}
}
}
Please complete the lines marked with "TODO" with your info.
You will need to add the packages Microsoft.CrmSdk.CoreAssemblies, Microsoft.CrmSdk.Deployment, Microsoft.CrmSdk.Workflow, Microsoft.CrmSdk.XrmTooling.CoreAssembly, Microsoft.IdentityModel.Clients.ActiveDirectory and Newtonsoft.Json to your solution, and target .NET Framework 4.6.2.
Hope this helps.
I have a WCF service which connects to CRM (on-premise) to retrieve an account record. I can see that when the entity is retrieved it does not hold the current record, i.e. some fields still hold the old column values. I tried various merge options to no avail. Please see the code below.
using (XrmServiceContext cContext = new XrmServiceContext(con))
{
Entity ent = cContext.Retrieve(ConstantKVP.AccountSchema.ENTITY_LOGICAL_NAME, AccountId, new ColumnSet(true));
}
Any suggestions?
Is it possible the data is being cached?
cContext.TryAccessCache(cache => cache.Mode = OrganizationServiceCacheMode.Disabled);
I took this approach for a CrmOrganizationServiceContext, so perhaps the same theory applies.
After a save, clear the tracked changes with cContext.ClearChanges();
For retrieves, use MergeOption.OverwriteChanges (a short sketch of both follows the code below).
Or
Create a new XrmServiceContext object by passing a newed up organizationservice:
var uncachedOrganizationService = new OrganizationService("Xrm");
var uncachedXrmServiceContext = new XrmServiceContext(uncachedOrganizationService);
var ent = uncachedXrmServiceContext.Retrieve(ConstantKVP.AccountSchema.ENTITY_LOGICAL_NAME,AccountId,new ColumnSet(true));
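For reference, the first two suggestions could look something like this in code (a sketch; MergeOption and ClearChanges come from the underlying OrganizationServiceContext that XrmServiceContext derives from):
// Make retrieves overwrite any values already tracked locally
cContext.MergeOption = MergeOption.OverwriteChanges;

// After saving, drop the tracked entities so the next retrieve
// goes back to the server instead of returning the cached copy
cContext.SaveChanges();
cContext.ClearChanges();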
I am trying to do some synchronization work with an Exchange calendar. I want to keep another external calendar in sync with Exchange. Currently, when the other app triggers a creation or update of some sort in Exchange, that change is then sent back to the other calendar, creating an endless loop.
I had hoped to use the AppointmentSequenceNumber property when binding the Appointment item, but it always has the value of 0 no matter how many times it is updated. I am including AppointmentSequenceNumber in my PropertySet.
If anyone knows of a way to catch these updates and keep them from being sent back, that would be very helpful.
Thank you.
PropertySet pSet = new PropertySet(BasePropertySet.FirstClassProperties, ItemSchema.Subject, ItemSchema.Body, AppointmentSchema.Start, AppointmentSchema.End,AppointmentSchema.ArchiveTag, AppointmentSchema.InstanceKey, AppointmentSchema.AppointmentSequenceNumber);
ChangeCollection<ItemChange> changes = null;
.....
ExchangeService service = new ExchangeService(ExchangeVersion.Exchange2013)
{
Url = new Uri(exInfo.ServiceURL),
Credentials = new WebCredentials(exInfo.UserName, exInfo.Password)
};
//Pull Subscription Info
Microsoft.Exchange.WebServices.Data.PullSubscription sub = service.SubscribeToPullNotifications(
new FolderId[] { WellKnownFolderName.Calendar }, 30, "",
EventType.Created, EventType.Modified, EventType.Deleted);
syncState = exInfo.SyncState;
//Pull Changes
while (!syncComplete) // && Count < MaxItems
{
changes = service.SyncFolderItems(new FolderId(WellKnownFolderName.Calendar),
PropertySet.FirstClassProperties, null, 100, SyncFolderItemsScope.NormalItems, syncState);
foreach (ItemChange change in changes)
{
if (change.ChangeType != ChangeType.Delete) { eventItem = Appointment.Bind(service, change.ItemId, pSet); }
switch (change.ChangeType)
{
case ChangeType.Update:
...
break;
case ChangeType.Create:
...
break;
case ChangeType.Delete:
...
break;
}
Count++;
}
syncState = changes.SyncState;
syncComplete = !changes.MoreChangesAvailable;
}...
The AppointmentSequenceNumber would only be valid for Meetings; on normal Appointments it isn't used.
I had hoped to use the AppointmentSequenceNumber property when binding the Appointment item
That wouldn't work even if it were incrementing. Exchange will always provide you with the current version, and the only things valid in a Bind are the EWS Id of the appointment (or the Recurrence Sequence).
If anyone knows of a way to catch these updates and keep them from being sent back, that would be very helpful.
Synchronization is complicated but (from a notification perspective) if you modify an item in Exchange it's going to fire a notification and the ChangeKey attribute on the Item will be updated (quote):
"When you work with items in Exchange, another value to keep in mind is the ChangeKey attribute. This value, in addition to the item ID, is used to keep track of the state of an item. Any time an item is changed, a new change key is generated. When you perform an UpdateItem operation, for example, you can use the ChangeKey attribute to let the server know that your update is being applied to the most current version of the item. If another application made a change to the item you’re updating, the change keys won’t match and you will not be able to perform the update."