How can I determine the revision of a calendar item in EWS when using a PullSubscription?

I am trying to do some synchronization work with an Exchange Calendar. I want to keep another external calendar in sync with Exchange. Currently, when the other app triggers a creation or update in Exchange, that change is then sent back to the other calendar, creating an endless loop.
I had hoped to use the AppointmentSequenceNumber property when binding the Appointment item, but it always has a value of 0 no matter how many times the item is updated. I am including AppointmentSequenceNumber in my PropertySet.
If anyone knows of a way to catch these updates and keep them from being sent back, that would be very helpful.
Thank you.
PropertySet pSet = new PropertySet(BasePropertySet.FirstClassProperties,
    ItemSchema.Subject, ItemSchema.Body, AppointmentSchema.Start, AppointmentSchema.End,
    AppointmentSchema.ArchiveTag, AppointmentSchema.InstanceKey,
    AppointmentSchema.AppointmentSequenceNumber);
ChangeCollection<ItemChange> changes = null;
.....
ExchangeService service = new ExchangeService(ExchangeVersion.Exchange2013)
{
    Url = new Uri(exInfo.ServiceURL),
    Credentials = new WebCredentials(exInfo.UserName, exInfo.Password)
};

// Pull subscription info
Microsoft.Exchange.WebServices.Data.PullSubscription sub = service.SubscribeToPullNotifications(
    new FolderId[] { WellKnownFolderName.Calendar }, 30, "",
    EventType.Created, EventType.Modified, EventType.Deleted);
syncState = exInfo.SyncState;

// Pull changes
while (!syncComplete) // && Count < MaxItems)
{
    changes = service.SyncFolderItems(new FolderId(WellKnownFolderName.Calendar),
        PropertySet.FirstClassProperties, null, 100, SyncFolderItemsScope.NormalItems, syncState);
    foreach (ItemChange change in changes)
    {
        if (change.ChangeType != ChangeType.Delete)
        {
            eventItem = Appointment.Bind(service, change.ItemId, pSet);
        }
        switch (change.ChangeType)
        {
            case ChangeType.Update:
                ...
                break;
            case ChangeType.Create:
                ...
                break;
            case ChangeType.Delete:
                ...
                break;
        }
        Count++;
    }
    syncState = changes.SyncState;
    syncComplete = !changes.MoreChangesAvailable;
}...

The AppointmentSequenceNumber would only be valid for Meetings; on normal Appointments it isn't used.
I had hoped to use the AppointmentSequenceNumber property when binding the Appointment item
That wouldn't work even if it were incrementing. Exchange will always provide you with the current version, and the only thing valid in a Bind is the EWS Id of the appointment (or the recurrence sequence).
If anyone knows of a way to catch these updates and keep them from being sent back, that would be very helpful.
Synchronization is complicated but (from a notification perspective) if you modify an item in Exchange it's going to fire a notification and the ChangeKey attribute on the Item will be updated (quote):
"When you work with items in Exchange, another value to keep in mind is the ChangeKey attribute. This value, in addition to the item ID, is used to keep track of the state of an item. Any time an item is changed, a new change key is generated. When you perform an UpdateItem operation, for example, you can use the ChangeKey attribute to let the server know that your update is being applied to the most current version of the item. If another application made a change to the item you’re updating, the change keys won’t match and you will not be able to perform the update."

Related

Is it possible to track changes to Entity Metadata in Dynamics CRM?

Is there any way to track changes to Metadata, like new fields, new entities and so on?
It is difficult to control a very large project in the same environment, so sometimes there are customizations that should not be deployed to production (mostly mistakes or tests made in a development environment).
And is there a way to know who made that customization?
I am looking to know every possible change, not any in particular.
You have to use the RetrieveMetadataChangesRequest, and it is not possible to know who made the change.
This is available only from Microsoft Dynamics CRM 2011 Update Rollup 12.
This request is intended to be used to cache metadata information and work offline, but we can use it to track changes to metadata in complex projects and complex teams.
Examples on the internet are not very friendly, so this is how you can use the request.
The request can be completed by filling in only one parameter:
RetrieveMetadataChangesRequest req = new RetrieveMetadataChangesRequest()
{
    ClientVersionStamp = null
};
var response = (RetrieveMetadataChangesResponse)service.Execute(req);
The first time you execute this request, ClientVersionStamp needs to be null, because no metadata request has been made before and there is no ClientVersionStamp yet. This parameter represents the last time you queried for metadata changes; if it is null, the response will bring all customizations from all time, so the request probably won't complete in time and we need to tune it up.
var EntityFilter = new MetadataFilterExpression(LogicalOperator.And);
EntityFilter.Conditions.Add(new MetadataConditionExpression("SchemaName",
    MetadataConditionOperator.Equals, "ServiceAppointment"));
var entityQueryExpression = new EntityQueryExpression()
{
    Criteria = EntityFilter
};
RetrieveMetadataChangesRequest req = new RetrieveMetadataChangesRequest()
{
    Query = entityQueryExpression,
    ClientVersionStamp = null
};
var response = (RetrieveMetadataChangesResponse)service.Execute(req);
This will query all metadata changes for "ServiceAppointment" (feel free to use whichever entity you want), but what we need is the ServerVersionStamp from the response. It will look like "22319800!09/13/2017 16:17:46". If you try to send a time stamp on the very first call, it will throw an exception, so it is necessary to query first to get a server time stamp.
Now you can use the request and the time stamp to retrieve all new changes since "22319800!09/13/2017 16:17:46":
RetrieveMetadataChangesRequest req = new RetrieveMetadataChangesRequest()
{
    Query = entityQueryExpression,
    ClientVersionStamp = @"22319800!09/13/2017 16:17:46"
};
var response = (RetrieveMetadataChangesResponse)service.Execute(req);
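In practice you would persist the stamp between runs so each call only returns what changed since the last one. A minimal sketch, where LoadSavedStamp and SaveStamp are hypothetical persistence helpers:

string stamp = LoadSavedStamp(); // null on the very first run
RetrieveMetadataChangesRequest nextReq = new RetrieveMetadataChangesRequest()
{
    Query = entityQueryExpression,
    ClientVersionStamp = stamp
};
var nextResp = (RetrieveMetadataChangesResponse)service.Execute(nextReq);
// nextResp.EntityMetadata holds what changed; nextResp.DeletedMetadata what was removed.
SaveStamp(nextResp.ServerVersionStamp); // remember the stamp for the next run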
You can filter the query to match your needs: only search for specific entities, labels, relationships, keys and attributes, or specific properties.
EntityQueryExpression entityQueryExpression = new EntityQueryExpression()
{
    Criteria = EntityFilter,
    Properties = EntityProperties,
    RelationshipQuery = new RelationshipQueryExpression()
    {
        Properties = RelationshipProperties,
        Criteria = RelationshipFilter
    },
    AttributeQuery = new AttributeQueryExpression()
    {
        Properties = AttributeProperties,
        Criteria = AttributeFilter
    }
};
Use this request and implement it the way you need.
A couple more options:
Register a plugin on Publish and Publish All, and track who did the publish and when (a sketch follows below). That may help you narrow down who was making changes, although someone could technically make a change without publishing it, so it is not perfect information.
If you're using Dynamics OnPremise, the Metadata tables sometimes store information about who made a change that is not visible through a Metadata retrieve. I've found this to be very spotty though, not all Metadata has a Modified By user stored.
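For the first option, here is a minimal sketch of such a plugin, registered on the Publish and PublishAll messages; persisting the trail to a custom audit entity is left out:

using System;
using Microsoft.Xrm.Sdk;

public class PublishAuditPlugin : IPlugin
{
    public void Execute(IServiceProvider serviceProvider)
    {
        var context = (IPluginExecutionContext)
            serviceProvider.GetService(typeof(IPluginExecutionContext));
        var tracing = (ITracingService)
            serviceProvider.GetService(typeof(ITracingService));
        // InitiatingUserId identifies who triggered the publish.
        tracing.Trace("{0} ran {1} at {2:u}",
            context.InitiatingUserId, context.MessageName, DateTime.UtcNow);
        // For a durable trail, create a record in a custom audit entity here.
    }
}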

How to save a record and immediately use its GUID

I'm executing some JavaScript from a ribbon button, and what I want to do is save the record I am creating and then immediately use its GUID a bit further on in the code. Each time I try it, the GUID comes back null even though I'm requesting it after the record has been saved. If I try the button again after I've saved, it works - but not while I'm saving.
Is there a way to do this?
function RibbonButton_AddProduct()
{
    // Save the record
    Xrm.Page.data.entity.save();
    LoadProductCreate();
}

function LoadProductCreate()
{
    var serverUrl;
    var errorMessage = "Context to retrieve the Server URL is not available.";
    if (typeof GetGlobalContext != "undefined") {
        serverUrl = GetGlobalContext().getServerUrl();
    } else {
        if (typeof Xrm != "undefined") {
            serverUrl = Xrm.Page.context.getServerUrl();
        } else {
            alert(errorMessage);
            return;
        }
    }
    if (serverUrl.match(/\/$/)) {
        serverUrl = serverUrl.substring(0, serverUrl.length - 1);
    }
    var recordId = Xrm.Page.data.entity.getId();
    alert(recordId);
    var url = serverUrl + "/main.aspx?etc=10030&extraqs=%3f_CreateFromId%3d%" + recordId
        + "%257d%26_CreateFromType%3d10029%26etc%3d10030%26"
        + "pagemode%3diframe%26preloadcache%3d1345465354543&pagetype=entityrecord";
    window.open(url);
}
Here’s a different approach to solving this problem.
What you are trying to do is 'working against the system' - you are effectively making two save buttons. In the rest of CRM, when the Id is required for a ribbon button, the record must first be saved. E.g. you can't use the dialog or workflow buttons on an unsaved record, and you also can't 'add new/existing' to an unsaved record.
So my solution would be to disable the button on unsaved forms, force the user to save the record manually, and then allow them to use your button - this is the way CRM is meant to be used, and is the way the rest of CRM works.
You should not work against the system, you should work with it; you have a product to customise and extend, not change.
If this doesn't meet your requirement, I would suggest using Greg's suggestion (1) of having flags, though it sounds a bit messy - but then this is a requirement that inherently is.
You could try one of two things:
1) Add a hidden boolean attribute to your form (e.g. "new_launchProductCreate"), set it in code prior to save and then read it onLoad.
2) Instead of setting the value prior to create (and therefore potentially committing it to the database), you could create a plugin registered against the "Create" step of your record that injects a boolean value into the Entity.Attributes collection as the record is returned to the user (see the sketch after this list). This would prevent the value persisting into the database and running every time your form loads.
You can instead use AJAX to reset the value as you launch your onLoad code so that it doesn't trigger on every form load.
Alternatively, assign the record guid manually, use AJAX to save your record, pop your new window using the new guid, and then reload your original form (so that the form is no longer in an "unsaved" state).
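A minimal sketch of the injection idea from option 2, shown here as a variation on the post-operation stage of the Retrieve message so the flag reaches the caller without ever touching the database (attribute and class names are illustrative):

using System;
using Microsoft.Xrm.Sdk;

public class InjectFlagPlugin : IPlugin
{
    public void Execute(IServiceProvider serviceProvider)
    {
        var context = (IPluginExecutionContext)
            serviceProvider.GetService(typeof(IPluginExecutionContext));
        // On Retrieve, the entity handed back to the caller sits in OutputParameters.
        if (context.OutputParameters.Contains("BusinessEntity"))
        {
            var entity = (Entity)context.OutputParameters["BusinessEntity"];
            entity["new_launchproductcreate"] = true; // injected, never persisted
        }
    }
}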
At the risk of being proven wrong as I cannot verify this right away... you will need to save and then reload the page.
The value returned by Xrm.Page.data.entity.getId() is set when the page is loaded/initialised and hence won't be updated when you access it after you have called Save().
It is also why it does work when you reload the page.
Perhaps you could call save and then reload the window adding a querystring variable of your own, to indicate that this event has just occurred?
e.g.
function DoSomething() {
    // do your stuff
    Xrm.Page.data.entity.save();
    // something like - sure someone can do better!
    window.location = window.location.href + '&foo=bar';
}
and then register something like this onFormLoad
function OnLoad() {
    var queryStringParams = Xrm.Page.context.getQueryStringParameters();
    // test to see if your query string param exists here
    for (var i in queryStringParams) {
        // if you find the query string, do extra processing here
    }
}

Dynamics CRM 2011 Bulk Update

Running Dynamics CRM 2011 rollup 3. We need to update millions of customer records periodically (delta updates). Using the standard update (one by one) takes a few weeks. Also, we don't want to touch the DB directly as it may break stuff in the future.
Is there a bulk update method in the Dynamics CRM 2011 web service/REST API we can use? (What/where/how?)
I realize this post is over 2 years old, but I can add to it in case someone else reads it and has a similar need.
Peter Majeed's answer is on target in that CRM processes requests one record at a time. There is no bulk edit that works the way you are looking for. I encourage you not to touch the DB directly if you need/want Microsoft support.
If you are looking at periodic updates of millions of records, you have a few options. Consider using Scribe or develop your own custom import utility or script using the CRM SDK.
Scribe is probably going to be your best option since it is cost effective for data imports and will allow you to easily update and insert from the same file.
If you write your own .NET/SDK based utility, I'd suggest making it multithreaded: programmatically break up your input file in memory or on disk and have each thread work with its own subset of the data - that is, of course, if the order of execution does not have to be chronological according to the contents of the input file. If you can divide and conquer the input file over multiple threads, you can reduce the overall execution time considerably.
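A rough sketch of that divide-and-conquer shape, assuming createService is a hypothetical factory that opens a fresh connection (service proxies are not thread safe, so each thread gets its own):

using System;
using System.Collections.Generic;
using System.Linq;
using System.Threading.Tasks;
using Microsoft.Xrm.Sdk;

static class ParallelUpdater
{
    public static void UpdateAll(IList<Entity> records, int threadCount,
                                 Func<IOrganizationService> createService)
    {
        // Deal records round-robin into one batch per thread.
        var batches = records
            .Select((entity, index) => new { entity, index })
            .GroupBy(x => x.index % threadCount, x => x.entity);
        Parallel.ForEach(batches, batch =>
        {
            IOrganizationService service = createService();
            foreach (Entity record in batch)
            {
                service.Update(record); // still one record per call, just N threads wide
            }
        });
    }
}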
Also, if your corporate policy allows you to have access to one of the CRM Servers and you can place your code directly on the server and execute it from there - you can eliminate the network latency between a workstation running the code and the CRM web services.
Last but not least, if this large volume of import data is coming from another system, you can write a CRM plug-in to run on the Retrieve and RetrieveMultiple messages (events) in CRM for your specific entity, programmatically retrieve the desired data from the other system (and if the other system is unavailable - just use the cached copy in CRM), and keep CRM up to date in real-time or on a 'last-cached-on' basis. This is certainly more coding effort, but it potentially eliminates the need for a large synchronization job to be run every few weeks.
Yes and no, mostly no. Someone can correct me if I'm mistaken, in which case I'll gladly edit/delete my answer, but everything that's done in Dynamics CRM is done one at a time. It doesn't even try to handle set-based inserts/updates/deletes. So unless you go straight to direct DB operations, it will take you weeks.
The webservice does allow for "bulk" inserts/deletes/updates, but I put "bulk" in quotes because all it does is set up an asynchronous process where it does all the relevant data operations - yep - one at a time. There's a section of the SDK that addresses this sort of data management (linked). And to update the records this way, you'd have to first suffer the overhead of selecting all the data you want to update, then creating an xml file that contains the data, and finally updating the data (remember: one row at a time). So it would actually be more efficient to just loop through your data and issue an Update request for each yourself.
(I will note that our org hasn't experienced any memorable issues regarding direct DB access to handle what the SDK doesn't, nor have I seen anything in my personal internet readings that suggest others have.)
Edit:
See iFirefly's answer below for some other excellent ways to address this issue.
I realize this is an old question, but it comes up high on "CRM Bulk Update", so the Update Rollup 12 feature ExecuteMultiple needs to be mentioned here. It's not going to work around your issue (massive volume), because, as iFirefly and Peter point out, CRM does everything one at a time. What it does do is package all your requests into a single envelope, letting CRM handle the execution of each update and reducing the number of round trips between your app and the server if you do end up issuing an Update request for every record.
This is quite an old question, but nobody mentioned the fastest way (but also the most challenging) of updating/creating huge amounts of records in CRM 201X - the built-in import feature, which is totally doable using the CRM SDK. There is a perfect MSDN article about that:
https://msdn.microsoft.com/en-us/library/gg328321(v=crm.5).aspx. In short you have to:
1) Build an Excel file containing the data you want to import (simply export some data from CRM 201X and check how the structure looks; remember that the first 3 columns are hidden)
2) Create an Import Map entity (specify the file you created)
3) Create column mappings if necessary
4) Create Import and ImportFile entities, providing proper mappings
5) Parse data using ParseImportRequest
6) Transform data using TransformImportRequest
7) Import data using ImportRecordsImportRequest
These were the steps for CRM 2011; now in 2017 we have more versions available, and there are slight differences between them. Check the sample that is available on MSDN and in the SDK:
https://msdn.microsoft.com/en-us/library/hh547396(v=crm.5).aspx
Of course point 1 will be the most difficult part, because you have to build an XML or docx file perfectly corresponding to what CRM expects, but I'm assuming you are doing it from an external app, so you can use some great .NET libraries that will make things much simpler.
I never saw anything faster than standard CRM import when it comes to updating/creating records, even if you go for parallelism and Batch Update requests.
If something goes wrong with the MSDN sites, I'm posting here an example from the link above that shows how to import data to CRM programmatically:
using System;
using System.ServiceModel;
using System.Collections.Generic;
using System.Linq;
// These namespaces are found in the Microsoft.Xrm.Sdk.dll assembly
// located in the SDK\bin folder of the SDK download.
using Microsoft.Xrm.Sdk;
using Microsoft.Xrm.Sdk.Query;
using Microsoft.Xrm.Sdk.Client;
using Microsoft.Xrm.Sdk.Messages;
using Microsoft.Xrm.Sdk.Metadata;
// These namespaces are found in the Microsoft.Crm.Sdk.Proxy.dll assembly
// located in the SDK\bin folder of the SDK download.
using Microsoft.Crm.Sdk.Messages;
namespace Microsoft.Crm.Sdk.Samples
{
/// <summary>
/// This sample shows how to define a complex mapping for importing and then use the
/// Microsoft Dynamics CRM 2011 API to bulk import records with that mapping.
/// </summary>
public class ImportWithCreate
{
#region Class Level Members
private OrganizationServiceProxy _serviceProxy;
private DateTime _executionDate;
#endregion
/// <summary>
/// This method first connects to the organization service. Afterwards,
/// auditing is enabled on the organization, account entity, and a couple
/// of attributes.
/// </summary>
/// <param name="serverConfig">Contains server connection information.</param>
/// <param name="promptforDelete">When True, the user will be prompted to delete all
/// created entities.</param>
public void Run(ServerConnection.Configuration serverConfig, bool promptforDelete)
{
using (_serviceProxy = ServerConnection.GetOrganizationProxy(serverConfig))
{
// This statement is required to enable early bound type support.
_serviceProxy.EnableProxyTypes();
// Log the start time to ensure deletion of records created during execution.
_executionDate = DateTime.Today;
ImportRecords();
DeleteRequiredRecords(promptforDelete);
}
}
/// <summary>
/// Imports records to Microsoft Dynamics CRM from the specified .csv file.
/// </summary>
public void ImportRecords()
{
// Create an import map.
ImportMap importMap = new ImportMap()
{
Name = "Import Map " + DateTime.Now.Ticks.ToString(),
Source = "Import Accounts.csv",
Description = "Description of data being imported",
EntitiesPerFile =
new OptionSetValue((int)ImportMapEntitiesPerFile.SingleEntityPerFile),
EntityState = EntityState.Created
};
Guid importMapId = _serviceProxy.Create(importMap);
// Create column mappings.
#region Column One Mappings
// Create a column mapping for a 'text' type field.
ColumnMapping colMapping1 = new ColumnMapping()
{
// Set source properties.
SourceAttributeName = "src_name",
SourceEntityName = "Account_1",
// Set target properties.
TargetAttributeName = "name",
TargetEntityName = Account.EntityLogicalName,
// Relate this column mapping with the data map.
ImportMapId =
new EntityReference(ImportMap.EntityLogicalName, importMapId),
// Force this column to be processed.
ProcessCode =
new OptionSetValue((int)ColumnMappingProcessCode.Process)
};
// Create the mapping.
Guid colMappingId1 = _serviceProxy.Create(colMapping1);
#endregion
#region Column Two Mappings
// Create a column mapping for a 'lookup' type field.
ColumnMapping colMapping2 = new ColumnMapping()
{
// Set source properties.
SourceAttributeName = "src_parent",
SourceEntityName = "Account_1",
// Set target properties.
TargetAttributeName = "parentaccountid",
TargetEntityName = Account.EntityLogicalName,
// Relate this column mapping with the data map.
ImportMapId =
new EntityReference(ImportMap.EntityLogicalName, importMapId),
// Force this column to be processed.
ProcessCode =
new OptionSetValue((int)ColumnMappingProcessCode.Process),
};
// Create the mapping.
Guid colMappingId2 = _serviceProxy.Create(colMapping2);
// Because we created a column mapping of type lookup, we need to specify lookup details in a lookupmapping.
// One lookupmapping will be for the parent account, and the other for the current record.
// This lookupmapping is important because without it the current record
// cannot be used as the parent of another record.
// Create a lookup mapping to the parent account.
LookUpMapping parentLookupMapping = new LookUpMapping()
{
// Relate this mapping with its parent column mapping.
ColumnMappingId =
new EntityReference(ColumnMapping.EntityLogicalName, colMappingId2),
// Force this column to be processed.
ProcessCode =
new OptionSetValue((int)LookUpMappingProcessCode.Process),
// Set the lookup for an account entity by its name attribute.
LookUpEntityName = Account.EntityLogicalName,
LookUpAttributeName = "name",
LookUpSourceCode =
new OptionSetValue((int)LookUpMappingLookUpSourceCode.System)
};
// Create the lookup mapping.
Guid parentLookupMappingId = _serviceProxy.Create(parentLookupMapping);
// Create a lookup on the current record's "src_name" so that this record can
// be used as the parent account for another record being imported.
// Without this lookup, no record using this account as its parent will be imported.
LookUpMapping currentLookUpMapping = new LookUpMapping()
{
// Relate this lookup with its parent column mapping.
ColumnMappingId =
new EntityReference(ColumnMapping.EntityLogicalName, colMappingId2),
// Force this column to be processed.
ProcessCode =
new OptionSetValue((int)LookUpMappingProcessCode.Process),
// Set the lookup for the current record by its src_name attribute.
LookUpAttributeName = "src_name",
LookUpEntityName = "Account_1",
LookUpSourceCode =
new OptionSetValue((int)LookUpMappingLookUpSourceCode.Source)
};
// Create the lookup mapping
Guid currentLookupMappingId = _serviceProxy.Create(currentLookUpMapping);
#endregion
#region Column Three Mappings
// Create a column mapping for a 'picklist' type field
ColumnMapping colMapping3 = new ColumnMapping()
{
// Set source properties
SourceAttributeName = "src_addresstype",
SourceEntityName = "Account_1",
// Set target properties
TargetAttributeName = "address1_addresstypecode",
TargetEntityName = Account.EntityLogicalName,
// Relate this column mapping with its parent data map
ImportMapId =
new EntityReference(ImportMap.EntityLogicalName, importMapId),
// Force this column to be processed
ProcessCode =
new OptionSetValue((int)ColumnMappingProcessCode.Process)
};
// Create the mapping
Guid colMappingId3 = _serviceProxy.Create(colMapping3);
// Because we created a column mapping of type picklist, we need to specify picklist details in a picklistMapping
PickListMapping pickListMapping1 = new PickListMapping()
{
SourceValue = "bill",
TargetValue = 1,
// Relate this column mapping with its column mapping data map
ColumnMappingId =
new EntityReference(ColumnMapping.EntityLogicalName, colMappingId3),
// Force this column to be processed
ProcessCode =
new OptionSetValue((int)PickListMappingProcessCode.Process)
};
// Create the mapping
Guid picklistMappingId1 = _serviceProxy.Create(pickListMapping1);
// Need a picklist mapping for every address type code expected
PickListMapping pickListMapping2 = new PickListMapping()
{
SourceValue = "ship",
TargetValue = 2,
// Relate this column mapping with its column mapping data map
ColumnMappingId =
new EntityReference(ColumnMapping.EntityLogicalName, colMappingId3),
// Force this column to be processed
ProcessCode =
new OptionSetValue((int)PickListMappingProcessCode.Process)
};
// Create the mapping
Guid picklistMappingId2 = _serviceProxy.Create(pickListMapping2);
#endregion
// Create Import
Import import = new Import()
{
// IsImport is obsolete; use ModeCode to declare Create or Update.
ModeCode = new OptionSetValue((int)ImportModeCode.Create),
Name = "Importing data"
};
Guid importId = _serviceProxy.Create(import);
// Create Import File.
ImportFile importFile = new ImportFile()
{
Content = BulkImportHelper.ReadCsvFile("Import Accounts.csv"), // Read contents from disk.
Name = "Account record import",
IsFirstRowHeader = true,
ImportMapId = new EntityReference(ImportMap.EntityLogicalName, importMapId),
UseSystemMap = false,
Source = "Import Accounts.csv",
SourceEntityName = "Account_1",
TargetEntityName = Account.EntityLogicalName,
ImportId = new EntityReference(Import.EntityLogicalName, importId),
EnableDuplicateDetection = false,
FieldDelimiterCode =
new OptionSetValue((int)ImportFileFieldDelimiterCode.Comma),
DataDelimiterCode =
new OptionSetValue((int)ImportFileDataDelimiterCode.DoubleQuote),
ProcessCode =
new OptionSetValue((int)ImportFileProcessCode.Process)
};
// Get the current user to set as record owner.
WhoAmIRequest systemUserRequest = new WhoAmIRequest();
WhoAmIResponse systemUserResponse =
(WhoAmIResponse)_serviceProxy.Execute(systemUserRequest);
// Set the owner ID.
importFile.RecordsOwnerId =
new EntityReference(SystemUser.EntityLogicalName, systemUserResponse.UserId);
Guid importFileId = _serviceProxy.Create(importFile);
// Retrieve the header columns used in the import file.
GetHeaderColumnsImportFileRequest headerColumnsRequest = new GetHeaderColumnsImportFileRequest()
{
ImportFileId = importFileId
};
GetHeaderColumnsImportFileResponse headerColumnsResponse =
(GetHeaderColumnsImportFileResponse)_serviceProxy.Execute(headerColumnsRequest);
// Output the header columns.
int columnNum = 1;
foreach (string headerName in headerColumnsResponse.Columns)
{
Console.WriteLine("Column[" + columnNum.ToString() + "] = " + headerName);
columnNum++;
}
// Parse the import file.
ParseImportRequest parseImportRequest = new ParseImportRequest()
{
ImportId = importId
};
ParseImportResponse parseImportResponse =
(ParseImportResponse)_serviceProxy.Execute(parseImportRequest);
Console.WriteLine("Waiting for Parse async job to complete");
BulkImportHelper.WaitForAsyncJobCompletion(_serviceProxy, parseImportResponse.AsyncOperationId);
BulkImportHelper.ReportErrors(_serviceProxy, importFileId);
// Retrieve the first two distinct values for column 1 from the parse table.
// NOTE: You must create the parse table first using the ParseImport message.
// The parse table is not accessible after ImportRecordsImportResponse is called.
GetDistinctValuesImportFileRequest distinctValuesRequest = new GetDistinctValuesImportFileRequest()
{
columnNumber = 1,
ImportFileId = importFileId,
pageNumber = 1,
recordsPerPage = 2,
};
GetDistinctValuesImportFileResponse distinctValuesResponse =
(GetDistinctValuesImportFileResponse)_serviceProxy.Execute(distinctValuesRequest);
// Output the distinct values. In this case: (column 1, row 1) and (column 1, row 2).
int cellNum = 1;
foreach (string cellValue in distinctValuesResponse.Values)
{
Console.WriteLine("(1, " + cellNum.ToString() + "): " + cellValue);
Console.WriteLine(cellValue);
cellNum++;
}
// Retrieve data from the parse table.
// NOTE: You must create the parse table first using the ParseImport message.
// The parse table is not accessible after ImportRecordsImportResponse is called.
RetrieveParsedDataImportFileRequest parsedDataRequest = new RetrieveParsedDataImportFileRequest()
{
ImportFileId = importFileId,
PagingInfo = new PagingInfo()
{
// Specify the number of entity instances returned per page.
Count = 2,
// Specify the number of pages returned from the query.
PageNumber = 1,
// Specify a total number of entity instances returned.
PagingCookie = "1"
}
};
RetrieveParsedDataImportFileResponse parsedDataResponse =
(RetrieveParsedDataImportFileResponse)_serviceProxy.Execute(parsedDataRequest);
// Output the first two rows retrieved.
int rowCount = 1;
foreach (string[] rows in parsedDataResponse.Values)
{
int colCount = 1;
foreach (string column in rows)
{
Console.WriteLine("(" + rowCount.ToString() + "," + colCount.ToString() + ") = " + column);
colCount++;
}
rowCount++;
}
// Transform the import
TransformImportRequest transformImportRequest = new TransformImportRequest()
{
ImportId = importId
};
TransformImportResponse transformImportResponse =
(TransformImportResponse)_serviceProxy.Execute(transformImportRequest);
Console.WriteLine("Waiting for Transform async job to complete");
BulkImportHelper.WaitForAsyncJobCompletion(_serviceProxy, transformImportResponse.AsyncOperationId);
BulkImportHelper.ReportErrors(_serviceProxy, importFileId);
// Upload the records.
ImportRecordsImportRequest importRequest = new ImportRecordsImportRequest()
{
ImportId = importId
};
ImportRecordsImportResponse importResponse =
(ImportRecordsImportResponse)_serviceProxy.Execute(importRequest);
Console.WriteLine("Waiting for ImportRecords async job to complete");
BulkImportHelper.WaitForAsyncJobCompletion(_serviceProxy, importResponse.AsyncOperationId);
BulkImportHelper.ReportErrors(_serviceProxy, importFileId);
}
/// <summary>
/// Deletes any entity records that were created for this sample.
/// <param name="prompt">Indicates whether to prompt the user
/// to delete the records created in this sample.</param>
/// </summary>
public void DeleteRequiredRecords(bool prompt)
{
bool toBeDeleted = true;
if (prompt)
{
// Ask the user if the created entities should be deleted.
Console.Write("\nDo you want these entity records deleted? (y/n) [y]: ");
String answer = Console.ReadLine();
if (answer.StartsWith("y") ||
answer.StartsWith("Y") ||
answer == String.Empty)
{
toBeDeleted = true;
}
else
{
toBeDeleted = false;
}
}
if (toBeDeleted)
{
// Retrieve all account records created in this sample.
QueryExpression query = new QueryExpression()
{
EntityName = Account.EntityLogicalName,
Criteria = new FilterExpression()
{
Conditions =
{
new ConditionExpression("createdon", ConditionOperator.OnOrAfter, _executionDate),
}
},
ColumnSet = new ColumnSet(false)
};
var accountsCreated = _serviceProxy.RetrieveMultiple(query).Entities;
// Delete all records created in this sample.
foreach (var account in accountsCreated)
{
_serviceProxy.Delete(Account.EntityLogicalName, account.Id);
}
Console.WriteLine("Entity record(s) have been deleted.");
}
}
#region Main method
/// <summary>
/// Standard Main() method used by most SDK samples.
/// </summary>
/// <param name="args"></param>
static public void Main(string[] args)
{
try
{
// Obtain the target organization's web address and client logon
// credentials from the user.
ServerConnection serverConnect = new ServerConnection();
ServerConnection.Configuration config = serverConnect.GetServerConfiguration();
var app = new ImportWithCreate();
app.Run(config, true);
}
catch (FaultException<Microsoft.Xrm.Sdk.OrganizationServiceFault> ex)
{
Console.WriteLine("The application terminated with an error.");
Console.WriteLine("Timestamp: {0}", ex.Detail.Timestamp);
Console.WriteLine("Code: {0}", ex.Detail.ErrorCode);
Console.WriteLine("Message: {0}", ex.Detail.Message);
Console.WriteLine("Trace: {0}", ex.Detail.TraceText);
Console.WriteLine("Inner Fault: {0}",
null == ex.Detail.InnerFault ? "No Inner Fault" : "Has Inner Fault");
}
catch (System.TimeoutException ex)
{
Console.WriteLine("The application terminated with an error.");
Console.WriteLine("Message: {0}", ex.Message);
Console.WriteLine("Stack Trace: {0}", ex.StackTrace);
Console.WriteLine("Inner Fault: {0}",
null == ex.InnerException.Message ? "No Inner Fault" : ex.InnerException.Message);
}
catch (System.Exception ex)
{
Console.WriteLine("The application terminated with an error.");
Console.WriteLine(ex.Message);
// Display the details of the inner exception.
if (ex.InnerException != null)
{
Console.WriteLine(ex.InnerException.Message);
FaultException<Microsoft.Xrm.Sdk.OrganizationServiceFault> fe = ex.InnerException
as FaultException<Microsoft.Xrm.Sdk.OrganizationServiceFault>;
if (fe != null)
{
Console.WriteLine("Timestamp: {0}", fe.Detail.Timestamp);
Console.WriteLine("Code: {0}", fe.Detail.ErrorCode);
Console.WriteLine("Message: {0}", fe.Detail.Message);
Console.WriteLine("Trace: {0}", fe.Detail.TraceText);
Console.WriteLine("Inner Fault: {0}",
null == fe.Detail.InnerFault ? "No Inner Fault" : "Has Inner Fault");
}
}
}
// Additional exceptions to catch: SecurityTokenValidationException, ExpiredSecurityTokenException,
// SecurityAccessDeniedException, MessageSecurityException, and SecurityNegotiationException.
finally
{
Console.WriteLine("Press <Enter> to exit.");
Console.ReadLine();
}
}
#endregion Main method
}
}
Not sure how this would go with millions of records, but you can select your records, then click the Edit button in the ribbon. This will bring up the "Edit Multiple Records" dialog. Any changes you make will be applied to all your records.
The BulkUpdate API works well for me; it is 10 times faster than updating records one at a time. Following is a snippet that performs a bulk update:
public override ExecuteMultipleResponse BulkUpdate(List<Entity> entities)
{
    ExecuteMultipleRequest request = new ExecuteMultipleRequest()
    {
        Settings = new ExecuteMultipleSettings()
        {
            ContinueOnError = true,
            ReturnResponses = true
        },
        Requests = new OrganizationRequestCollection()
    };
    for (int i = 0; i < entities.Count; i++)
    {
        request.Requests.Add(new UpdateRequest() { Target = entities[i] });
    }
    return (ExecuteMultipleResponse)ServiceContext.Execute(request);
}
I worked on a very large data migration project for Dynamics CRM 2011. We needed to load about 3 million records over a weekend. I ended up building a console application (single thread) and ran multiple instances on multiple machines. Each console application had an id (1, 2, etc.) and was responsible for loading segments of the data based on a unique SQL WHERE clause that matched the application's id.
You could do the same thing with updates. Each instance could query a subset of the records to update and can perform the updates via the SDK. Since we loaded millions of records over a weekend I think you could perform millions of updates (if relatively small) in just a few hours.
The Microsoft PFE team for Dynamics CRM wrote a new CRM SDK library that makes use of parallelization to bulk execute requests while ensuring thread safety.
You may try: Parallel Execute Requests
I would be interested to know if it works and scales to millions of records.
CRM doesn't implement a way to update data in bulk; there are three ways to improve bulk update performance, but internally they cannot change the fact that CRM updates records one by one.
Basically the ideas are:
reduce the time wasted on communicating to CRM server
use parallelism to do multiple operations at the same time
make sure the update process does NOT trigger any workflows/plugins. Otherwise you might never see the end of the process...
3 ways to improve bulk operation performance:
After Rollup 12 there is the ExecuteMultipleRequest feature, which allows you to send up to 1000 requests at once. This means you may save some time by not sending 1000 separate requests to the CRM web service; however, these requests are still processed one after another. So if your CRM server is well configured, most likely this method won't help too much.
You may use an OrganizationServiceContext instance to do the bulk update. OrganizationServiceContext implements the unit-of-work pattern, so you can do multiple updates and transmit these operations to the server in one call (see the sketch below). Compared to ExecuteMultipleRequest, it doesn't have a limit on the number of requests, but if it encounters a failure during the update, it will roll back all the changes.
Use multithreading or multitasking. Either way will improve the speed, but both are likely to generate some connection failures or SQL errors, so you will need to add some retry logic to the code.
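A minimal sketch of the second option, batching updates through OrganizationServiceContext and sending them in one SaveChanges call (assumes an existing IOrganizationService connection):

using System.Collections.Generic;
using Microsoft.Xrm.Sdk;
using Microsoft.Xrm.Sdk.Client;

public static void BatchUpdate(IOrganizationService service, IEnumerable<Entity> entitiesToUpdate)
{
    var context = new OrganizationServiceContext(service);
    foreach (Entity entity in entitiesToUpdate)
    {
        context.Attach(entity);
        context.UpdateObject(entity);
    }
    // One round trip transmits all queued updates; a failure here rolls
    // back the whole set, as noted above.
    context.SaveChanges();
}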
One of my clients had exactly the same problem. He solved it by creating a custom ETL and using parallelism, attacking two front-end servers. The whole thing was made in C#. Nowadays, it could be possible with KingswaySoft or Scribe.

Updating an Appointment causes it to change to a Meeting in EWS 1.1

Here's what I'm trying to do:
get all items on a user's calendar between two dates
update the Location or Subject for some items
I get the items with:
FindItemsResults<Appointment> findResults = calendar.FindAppointments(new CalendarView(startDate, endDate));
This query works fine. But whenever I call Update to save the item I get an exception:
Microsoft.Exchange.WebServices.Data.ServiceResponseException: One or more recipients are invalid.
Even though I get an exception, the item is saved and gets changed to have IsMeeting set to true! Now the updated item is a meeting with an organizer etc... This is, effectively, data corruption for me.
Here's the code. It is no more complicated than this. I've tested it by just changing Location or Subject and both cause the problem.
Appointment a = Appointment.Bind(_service, new ItemId(id));
a.Location = newLocation;
a.Update(ConflictResolutionMode.AlwaysOverwrite);
Am I missing some concept or something? This seems like a pretty egregious problem.
FWIW, this is EWS 1.1 against an Office 365 server.
I figured it out with help from this Question:
Exchange Appointment Types
The key is that the Update method needs to be called with the SendInvitationsOrCancellationsMode.SendToNone flag set in the 2nd parameter.
Like this:
a.Update(ConflictResolutionMode.AlwaysOverwrite, SendInvitationsOrCancellationsMode.SendToNone);
So tig's answer works when you never want to send appointment updates out to the other attendees. However, to answer this properly, you actually need to get the attendee state loaded.
By default Exchange tries to send appointment updates to the attendees, but your appointment object doesn't have the attendee state loaded and hence blows up. When you do the bind, you should load the attendee properties. You should probably also load the organizer to cover another edge case:
AppointmentSchema.RequiredAttendees
AppointmentSchema.OptionalAttendees
AppointmentSchema.Resources
AppointmentSchema.Organizer
This will get the attendees populated if you want to do an update that sends out updates to the attendees.
However there is then another edge case that you have to worry about. If you have an appointment with no attendees added to it (just the organizer), then EWS may still complain and throw this error. It will actually work for appointments in some states, but fail in other states.
So the most complete solution is a combination of:
Loading the attendee state.
Inspecting the attendee state to see if there are any attendees other than the organizer (depending on how the appointment was created, the organizer may or may not appear in the RequiredAttendees collection). If there are none, then you must use SendInvitationsOrCancellationsMode.SendToNone.
So the full sample would look something like:
Appointment a = Appointment.Bind(_service, new ItemId(id), new PropertySet(
    AppointmentSchema.RequiredAttendees, AppointmentSchema.OptionalAttendees,
    AppointmentSchema.Resources, AppointmentSchema.Organizer));
a.Location = newLocation;

// Check if the appointment has attendees other than the organizer. The organizer may
// or may not appear in the required attendees list.
if (HasNoOtherAttendee(a.Organizer.Address, a.RequiredAttendees) &&
    (a.OptionalAttendees.Count == 0) && (a.Resources.Count == 0))
{
    a.Update(ConflictResolutionMode.AlwaysOverwrite, SendInvitationsOrCancellationsMode.SendToNone);
}
else
{
    // We have other attendees in the appointment, so we can use SendToAllAndSaveCopy so
    // they will see the update.
    a.Update(ConflictResolutionMode.AlwaysOverwrite, SendInvitationsOrCancellationsMode.SendToAllAndSaveCopy);
}

bool HasNoOtherAttendee(string email, AttendeeCollection attendees)
{
    bool emptyOrOnlyMe = true;
    foreach (var a in attendees)
    {
        if (!string.Equals(email, a.Address, StringComparison.OrdinalIgnoreCase))
        {
            emptyOrOnlyMe = false;
            break;
        }
    }
    return emptyOrOnlyMe;
}
To answer this bit of the question
"Even though I get an exception, the item is saved and gets changed to
have IsMeeting set to true! Now the updated item is a meeting with an
organizer etc... This is, effectively, data corruption for me."
The Microsoft documentation states, in the small print, "A meeting request is just an appointment that has attendees. You can convert an appointment into a meeting request by adding required attendees, optional attendees, or resources to the appointment" - as seen here
http://msdn.microsoft.com/en-us/library/office/dd633641%28v=exchg.80%29.aspx
In other words, as soon as you have any attendees, Exchange converts it to a meeting automatically.
public static bool UpdateAppointment(ExchangeCredential credentials,
    ItemId appointmentId, string newLocation, string newSubject,
    DateTime startTime, DateTime endTime)
{
    ExchangeService service = GetExchangeService(credentials);
    try
    {
        Appointment appt = Appointment.Bind(service, appointmentId,
            new PropertySet(BasePropertySet.IdOnly, AppointmentSchema.Start,
                AppointmentSchema.ReminderDueBy, AppointmentSchema.End,
                AppointmentSchema.StartTimeZone, AppointmentSchema.TimeZone));
        appt.Location = newLocation;
        appt.Start = startTime;
        appt.End = endTime;
        appt.Subject = newSubject;
        // very important! you must load the new time zone
        appt.StartTimeZone = TimeZoneInfo.Local;
        //appt.Body.Text = newBody; // if needed
        appt.Update(ConflictResolutionMode.AlwaysOverwrite);
    }
    catch (Exception)
    {
        throw; // rethrow without resetting the stack trace
    }
    return true;
}

Saving Data Locally and Remotely (Syncing)

When data is entered, it ultimately needs to be saved remotely on a server. I do want the app to work if there is no data connection at the time also, so I need to save everything locally on the phone too. The app can then sync with the server when it gets a connection.
This brings up a little issue. I'm used to saving everything on the server and then getting the records back with ids generated for them by the server. If there is no connection, the app will save locally to the phone but not the server. When syncing with the server, I don't see a way for the phone to know which local record an incoming record is associated with. There isn't enough unique data to figure this out.
What is the best way to handle this?
One way I've been thinking is to change the id of the records to a GUID and let the phone set the id. This way, all records will have an id locally, and when saving to the server, it should still be a unique id.
I'd like to know what other people have been doing, and what works and what doesn't from experience.
This is how we did it in our first Windows Phone 7 app, finished a few days ago with my friend.
It might not be the best solution, but until further refactoring it works just fine.
It's an application for a web app like mint.com, called slamarica.
If we have a feature like saving a transaction, we first check if we have a connection to the internet.
// Check if application is in online or in offline mode
if (NetworkDetector.IsOnline)
{
    // Save through REST API
    _transactionBl.AddTransaction(_currentTransaction);
}
else
{
    // Save to phone database
    SaveTransactionToPhone(_currentTransaction);
}
If the transaction is successfully saved via REST, the response contains the transaction object, and then we save it to the local database. If the REST save failed, we save the data to the local database.
private void OnTransactionSaveCompleted(bool isSuccessful, string message, Transaction savedTransaction)
{
    MessageBox.Show(message);
    if (isSuccessful)
    {
        // save new transaction to local database
        DatabaseBl.Save(savedTransaction);
        // save to observable collection Transactions in MainViewModel
        App.ViewModel.Transactions.Add(App.ViewModel.TransactionToTransactionViewModel(savedTransaction));
        App.ViewModel.SortTransactionList();
        // go back to Transaction List
        NavigationService.GoBack();
    }
    else
    {
        // if REST failed, save the unsent transaction to the phone database
        SaveTransactionToPhone(_currentTransaction);
        // save to observable collection Transactions in MainViewModel
        App.ViewModel.Transactions.Add(App.ViewModel.TransactionToTransactionViewModel(_currentTransaction));
        App.ViewModel.SortTransactionList();
    }
}
Every Transaction object has an IsInSync property. It is set to false by default until we get confirmation from the REST API that it was saved successfully on the server.
The user has the ability to refresh transactions: they can click a Refresh button to fetch new data from the server. We do the syncing in the background like this:
private void RefreshTransactions(object sender, RoutedEventArgs e)
{
    if (NetworkDetector.IsOnline)
    {
        var notSyncTransactions = DatabaseBl.GetData<Transaction>().Where(x => x.IsInSync == false).ToList();
        if (notSyncTransactions.Count > 0)
        {
            // we must sync all transactions
            _isAllInSync = true;
            _transactionSyncCount = notSyncTransactions.Count;
            _transactionBl.AddTransactionCompleted += OnSyncTransactionCompleted;
            if (_progress == null)
            {
                _progress = new ProgressIndicator();
            }
            foreach (var notSyncTransaction in notSyncTransactions)
            {
                _transactionBl.AddTransaction(notSyncTransaction);
            }
            _progress.Show();
        }
        else
        {
            // just refresh transactions
            DoTransactionRefresh();
        }
    }
    else
    {
        MessageBox.Show(ApplicationStrings.NETWORK_OFFLINE);
    }
}
private void DoTransactionRefresh()
{
    if (_progress == null)
    {
        _progress = new ProgressIndicator();
    }
    // after all data is sent, do a full reload
    App.ViewModel.LoadMore = true;
    App.ViewModel.ShowButton = false;
    ApplicationBl<Transaction>.GetDataLoadingCompleted += OnTransactionsRefreshCompleted;
    ApplicationBl<Transaction>.GetData(0, 10);
    _progress.Show();
}
In OnTransactionsRefreshCompleted we delete all transaction data in the local database and get the latest 10 transactions. We don't need all the data, and this way the user has synced data. They can always load more by tapping "load more" at the end of the transaction list, similar to those Twitter apps.
private void OnTransactionsRefreshCompleted(object entities)
{
    if (entities is IList<Transaction>)
    {
        // save transactions
        var transactions = (IList<Transaction>)entities;
        DatabaseBl.TruncateTable<Transaction>();
        DatabaseBl.Save(transactions);
        ((MainViewModel)DataContext).Transactions.Clear();
        // reset offset
        _offset = 1;
        // update list with new transactions
        App.ViewModel.LoadDataForTransactions(transactions);
        App.ViewModel.LoadMore = false;
        App.ViewModel.ShowButton = true;
    }
    if (entities == null)
    {
        App.ViewModel.ShowButton = false;
        App.ViewModel.LoadMore = false;
    }
    // hide progress
    _progress.Hide();
    // remove event handler
    ApplicationBl<Transaction>.GetDataLoadingCompleted -= OnTransactionsRefreshCompleted;
}
Caveat - I haven't tried this with Windows Phone development, but using GUID identities is something I usually do when faced with similar situations, e.g. creating records when I only have a one-way connection to the database, such as via a message bus or queue.
It works fine, albeit with a minor penalty in record sizes, and it can also make indexes less performant. I suggest you just give it a shot.
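For illustration, the client-generated id idea boils down to something like this (class and property names are just examples):

using System;

public class TransactionRecord
{
    public Guid Id { get; private set; }   // assigned on the device, unique everywhere
    public bool IsInSync { get; set; }     // false until the server confirms the save

    public TransactionRecord()
    {
        Id = Guid.NewGuid();               // same key used locally and remotely
    }
}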
