I am trying to create a simple Xamarin.Forms app which allows the user to browse for or take a photo and have Azure Cognitive Services tag the photo using a custom vision model.
I am unable to get the client to successfully authenticate or find a resource, per the error message in the exception produced by the VisionServiceClient. Am I missing something? What would be the correct values to use for the arguments to VisionServiceClient?
All keys have been removed from the images below; they are populated in the actual code.
Exception thrown in VS2017:
'Microsoft.ProjectOxford.Vision.ClientException' in System.Private.CoreLib.dll
Call to VisionServiceClient:
private const string endpoint = @"https://eastus2.api.cognitive.microsoft.com/vision/prediction/v1.0";
private const string key = "";

VisionServiceClient visionClient = new VisionServiceClient(key, endpoint);
VisualFeature[] features = { VisualFeature.Tags, VisualFeature.Categories, VisualFeature.Description };

try
{
    AnalysisResult temp = await visionClient.AnalyzeImageAsync(imageStream,
        features.ToList(), null);
    return temp;
}
catch (Exception ex)
{
    return null;
}
VS Exception Error: (screenshot omitted)
Azure Portal for Cognitive Services: (screenshot omitted)
Custom Vision Portal: (screenshot omitted)
It looks like you're confusing the Computer Vision and Custom Vision APIs: you are attempting to use the client SDK for the former with the API key of the latter.
For .NET languages, you'll want the Microsoft.Azure.CognitiveServices.Vision.CustomVision.Prediction NuGet package.
Your code will end up looking something like this:
ICustomVisionPredictionClient client = new CustomVisionPredictionClient()
{
    ApiKey = PredictionKey,
    Endpoint = "https://southcentralus.api.cognitive.microsoft.com"
};
ImagePrediction prediction = await client.PredictImageAsync(ProjectId, stream, IterationId);
Thank you to cthrash for the extended help and for talking with me in chat. Using his post along with a little troubleshooting, I have figured out what works for me. The code is super clunky, but it was just to test and make sure I'm able to do this. To answer the question:
NuGet packages and namespaces
Using cthrash's post I was able to get both the training and prediction NuGet packages installed, which are the correct packages for this particular application. I needed the following namespaces:
Microsoft.Azure.CognitiveServices.Vision.CustomVision.Prediction
Microsoft.Azure.CognitiveServices.Vision.CustomVision.Prediction.Models
Microsoft.Azure.CognitiveServices.Vision.CustomVision.Training
Microsoft.Azure.CognitiveServices.Vision.CustomVision.Training.Models
Endpoint Root
Following some of the steps here, I determined that the endpoint URLs only need to be the root, not the full URL provided in the Custom Vision Portal. For instance,
https://southcentralus.api.cognitive.microsoft.com/customvision/v2.0/Prediction/
Was changed to
https://southcentralus.api.cognitive.microsoft.com
I used the key and endpoint from the Custom Vision Portal, and after making that change I was able to use both a training and a prediction client to pull the projects and iterations.
Getting Project Id
In order to use CustomVisionPredictionClient.PredictImageAsync, you need a Guid for the project id, plus an iteration id if a default iteration is not set in the portal.
I tested two ways to get the project id:
Using project id string from portal
Grab the project id string from the portal under the project settings.
For the first argument to PredictImageAsync pass
Guid.Parse(projectId)
Using the training client
Create a new CustomVisionTrainingClient
To get a List<Project>, use
TrainingClient.GetProjects().ToList()
In my case I only had a single project, so I just needed the first element:
Guid projectId = projects[0].Id;
Getting Iteration Id
To get the iteration id of a project you need the CustomVisionTrainingClient.
Create the client
To get a List<Iteration>, use
client.GetIterations(projectId).ToList()
In my case I had only a single iteration, so I just needed the first element:
Guid iterationId = iterations[0].Id;
I am now able to use my model to classify images. In the code below, fileStream is the image stream passed to the model.
public async Task<string> Predict(Stream fileStream)
{
    string projectId = "";
    //string trainingEndpoint = "https://southcentralus.api.cognitive.microsoft.com/customvision/v2.2/Training/";
    string trainingEndpoint = "https://southcentralus.api.cognitive.microsoft.com/";
    string trainingKey = "";
    //string predictionEndpoint = "https://southcentralus.api.cognitive.microsoft.com/customvision/v2.0/Prediction/";
    string predictionEndpoint = "https://southcentralus.api.cognitive.microsoft.com";
    string predictionKey = "";

    CustomVisionTrainingClient trainingClient = new CustomVisionTrainingClient
    {
        ApiKey = trainingKey,
        Endpoint = trainingEndpoint
    };

    List<Project> projects = new List<Project>();
    try
    {
        projects = trainingClient.GetProjects().ToList();
    }
    catch (Exception ex)
    {
        Debug.WriteLine("Unable to get projects:\n\n" + ex.Message);
        return "Unable to obtain projects.";
    }

    Guid ProjectId = Guid.Empty;
    if (projects.Count > 0)
    {
        ProjectId = projects[0].Id;
    }
    if (ProjectId == Guid.Empty)
    {
        Debug.WriteLine("Unable to obtain project ID");
        return "Unable to obtain project id.";
    }

    List<Iteration> iterations = new List<Iteration>();
    try
    {
        iterations = trainingClient.GetIterations(ProjectId).ToList();
    }
    catch (Exception ex)
    {
        Debug.WriteLine("Unable to obtain iterations:\n\n" + ex.Message);
        return "Unable to obtain iterations.";
    }

    foreach (Iteration itr in iterations)
    {
        Debug.WriteLine(itr.Name + "\t" + itr.Id + "\n");
    }

    Guid iteration = Guid.Empty;
    if (iterations.Count > 0)
    {
        iteration = iterations[0].Id;
    }
    if (iteration == Guid.Empty)
    {
        Debug.WriteLine("Unable to obtain project iteration.");
        return "Unable to obtain project iteration";
    }

    CustomVisionPredictionClient predictionClient = new CustomVisionPredictionClient
    {
        ApiKey = predictionKey,
        Endpoint = predictionEndpoint
    };

    // Use the Guid resolved above rather than parsing the (empty) projectId string.
    var result = await predictionClient.PredictImageAsync(ProjectId, fileStream, iteration);

    string resultStr = string.Empty;
    foreach (PredictionModel pred in result.Predictions)
    {
        if (pred.Probability >= 0.85)
            resultStr += pred.TagName + " ";
    }
    return resultStr;
}
I've tried a number of different options, but no matter what I do it either does nothing or always returns the newValue error:
newValue cannot be null.
It seems I'm not the only one to hit this, although the library has had updates since the question linked below.
docX ReplaceText works incorrect
Below is my original example:-
if (sur.RequestType)
{
    templateDoc.ReplaceText("[#1]", "x");
    templateDoc.ReplaceText("[#2]", "");
}
else
{
    templateDoc.ReplaceText("[#1]", "");
    templateDoc.ReplaceText("[#2]", "x");
}
When debugging this it would get to line 4, then jump to line 9, where it would return the "newValue cannot be null" error on the next step.
So I tried:-
string temp1 = "temp1";

if (sur.RequestType)
{
    templateDoc.ReplaceText("[#1]", "x");
    templateDoc.ReplaceText("[#2]", temp1, false, RegexOptions.IgnoreCase, paraFormat, paraFormat, MatchFormattingOptions.SubsetMatch);
}
else
{
    templateDoc.ReplaceText("[#1]", "x.x");
    templateDoc.ReplaceText("[#2]", "x", false, RegexOptions.IgnoreCase, paraFormat, paraFormat, MatchFormattingOptions.SubsetMatch);
}
Along with a couple of other tweaks, but all returned the same error.
Prior to using ReplaceText I'd used the example from the sample project:-
templateDoc.AddCustomProperty( new CustomProperty( "CompanySlogan", "Always with you" ) );
templateDoc.AddCustomProperty( new CustomProperty( "ClientName", "James Doh" ) );
Here it would step through each line, but the produced document wouldn't have replaced anything.
Lastly, and more off-topic: if anybody has a better solution, I'd been stuck going back and forth trying to output the file without saving it, but had issues converting it from the Xceed DocX type to an HttpResponseMessage.
Below is my least favoured implementation, as I'd either like to save the document to a database, or skip saving the file and just provide it directly to the user to save where they want, instead of keeping a server-side copy.
[HttpGet]
public HttpResponseMessage DownloadRecord(int id)
{
    SURequest sur = _sURequestsService.GetRequestData(id);

    var fullPath = System.Web.Hosting.HostingEnvironment.MapPath(@"~/Content/RequestForm.docx");
    var fullPath2 = System.Web.Hosting.HostingEnvironment.MapPath(@"~/Content/RequestFormUpdated.docx");

    var templateDoc = DocX.Load(fullPath);
    var template = CreateRequestFromTemplate(templateDoc, sur);
    template.SaveAs(fullPath2);

    //using (FileStream fs2 = new FileStream(@"~/Content/RequestFormUpdated.docx", FileMode.Create))
    //{
    //    template.SaveAs(fs2);
    //}

    HttpResponseMessage result = new HttpResponseMessage(HttpStatusCode.OK);
    var stream = new FileStream(fullPath2, FileMode.Open);
    result.Content = new StreamContent(stream);
    result.Content.Headers.ContentDisposition = new ContentDispositionHeaderValue("attachment");
    result.Content.Headers.ContentDisposition.FileName = Path.GetFileName(fullPath2);
    result.Content.Headers.ContentType = new MediaTypeHeaderValue("application/octet-stream");
    result.Content.Headers.ContentLength = stream.Length;
    return result;

    //return fs2;
}
I'm stuck with no clue how to proceed further with Xceed, so I'm going to branch my present code and try OpenXML to see if I have any better luck. Alternatively, can someone spot what I'm doing wrong, or how to get past the issue in Xceed?
Any help would be much appreciated.
Turned out to be an issue with VS2017, which was behaving strangely with ReplaceText and seemed to have cached an earlier issue in its compiler.
It behaved as if the issue was somewhere it wasn't, and could only be resolved by manually stopping the compiler process.
Still no resolution for AddCustomProperty or with skipping generating a local file.
I'm going to work on trying to get it not to generate a local file, but will likely need to either open a new question specific to that or set up something else to clean up old files.
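For skipping the local file, the direction I'm considering is saving into a MemoryStream instead of to disk, assuming the DocX SaveAs(Stream) overload is available in the installed version. A rough, untested sketch reusing the names from my controller above (needs System.IO, System.Net, System.Net.Http and System.Net.Http.Headers):
[HttpGet]
public HttpResponseMessage DownloadRecord(int id)
{
    SURequest sur = _sURequestsService.GetRequestData(id);
    var fullPath = System.Web.Hosting.HostingEnvironment.MapPath(@"~/Content/RequestForm.docx");
    var templateDoc = DocX.Load(fullPath);
    var template = CreateRequestFromTemplate(templateDoc, sur);

    // Write the finished document into memory instead of saving a server-side copy.
    var ms = new MemoryStream();
    template.SaveAs(ms);
    ms.Position = 0; // rewind before handing the stream to the response

    var result = new HttpResponseMessage(HttpStatusCode.OK)
    {
        Content = new StreamContent(ms)
    };
    result.Content.Headers.ContentDisposition = new ContentDispositionHeaderValue("attachment")
    {
        FileName = "RequestFormUpdated.docx"
    };
    result.Content.Headers.ContentType = new MediaTypeHeaderValue("application/octet-stream");
    return result;
}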
I am writing an application in C# (.NET 4.0) which has to integrate with another, much older application. Part of the requirement is to integrate with a much older program that uses Pervasive PSQL Version 9. I asked this question about accessing the database without having to install an ODBC DSN. Part of the answer (thanks very much) is that I need to create a database using DTO.
I've used COM interop to access the dto2.dll COM library, and have read the samples, but I am having problems creating the database. Here is a summary of the code I'm using.
var session = new DtoSession();
var result = session.Connect("localhost", "", "");
Assert.AreEqual(dtoResult.Dto_Success, result);

testDB = new DtoDatabase {
    Session = session,
    Name = "Test1",
    DdfPath = @"C:\TEMP\DATA\DDF",
    DataPath = @"C:\TEMP\DATA",
};
result = session.Databases.Add(testDB);
Assert.AreEqual(dtoResult.Dto_Success, result);
No matter what values I use for the Name and paths, that final Assert always fails. The error code is Dto_errDuplicateName. If I do not include the Session property I get a different error code (7039).
Has anyone done this successfully? What am I doing wrong?
I believe you are missing the Flags property of the DtoDatabase object. I had the following code in my archive as an example of adding a database with DTO. This code was probably written when DTO was first released, but it works, and the only difference I can see is the Flags property.
DtoSession session = new DtoSession();
dtoResult result;

result = session.Connect("localhost", "", "");
if (result != dtoResult.Dto_Success)
{
    Console.WriteLine("Error connecting. Error code: " + result.ToString());
    return;
}

DtoDatabase testDB = new DtoDatabase();
testDB.Name = "Test1";
testDB.DataPath = @"C:\DATA";
testDB.DdfPath = @"C:\DATA";
testDB.Flags = dtoDbFlags.dtoDbFlagNotApplicable;

result = session.Databases.Add(testDB);
if (result != dtoResult.Dto_Success)
{
    Console.WriteLine("Error Adding. Error code: " + result.ToString());
    return;
}
Console.WriteLine("DB Added.");
Running Dynamics CRM 2011 Rollup 3. We need to periodically update millions of customer records (delta updates). Using the standard update (one by one) takes a few weeks. Also, we don't want to touch the DB directly, as that may break things in the future.
Is there a bulk update method in the Dynamics CRM 2011 web service/REST API we can use? (What/where/how?)
I realize this post is over 2 years old, but I can add to it in case someone else reads it and has a similar need.
Peter Majeed's answer is on target in that CRM processes requests one record at a time. There is no bulk edit that works the way you are looking for. I encourage you not to touch the DB directly if you need/want Microsoft support.
If you are looking at periodic updates of millions of records, you have a few options: consider using Scribe, or developing your own custom import utility or script using the CRM SDK.
Scribe is probably going to be your best option since it is cost effective for data imports and will allow you to easily update and insert from the same file.
If you write your own .Net/SDK based utility, I'd suggest making it multithreaded and either programmatically break up your input file in memory or on disk and have each thread work with its own subset of the data - that is, of course, if the order of execution does not have to be chronological according to the contents of the input file. If you can divide and conquer the input file over multiple threads, you can reduce the overall execution time considerably.
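As a rough illustration of the multithreaded approach (a sketch, not a production implementation): the org URI, credentials and the entitiesToUpdate list are assumed to exist, and each worker thread gets its own OrganizationServiceProxy because the proxy class is not thread-safe.
using System;
using System.Collections.Generic;
using System.ServiceModel.Description;
using System.Threading.Tasks;
using Microsoft.Xrm.Sdk;
using Microsoft.Xrm.Sdk.Client;

static void ParallelUpdate(Uri orgUri, ClientCredentials credentials,
                           IEnumerable<Entity> entitiesToUpdate)
{
    Parallel.ForEach(
        entitiesToUpdate,
        // localInit: one proxy per worker thread
        () => new OrganizationServiceProxy(orgUri, null, credentials, null),
        (entity, loopState, proxy) =>
        {
            proxy.Update(entity); // still one record at a time, just in parallel
            return proxy;
        },
        // localFinally: dispose the thread's proxy once its partition is done
        proxy => proxy.Dispose());
}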
Also, if your corporate policy allows you to have access to one of the CRM Servers and you can place your code directly on the server and execute it from there - you can eliminate the network latency between a workstation running the code and the CRM web services.
Last but not least, if this large volume of import data is coming from another system, you can write a CRM plug-in to run on the Retrieve and RetrieveMultiple messages (events) in CRM for your specific entity, programmatically retrieve the desired data from the other system (and if the other system is unavailable - just use the cached copy in CRM), and keep CRM up to date in real-time or on a 'last-cached-on' basis. This is certainly more coding effort, but it potentially eliminates the need for a large synchronization job to be run every few weeks.
Yes and no, mostly no. Someone can correct me if I'm mistaken, in which case I'll gladly edit/delete my answer, but everything that's done in Dynamics CRM is done one at a time. It doesn't even try to handle set-based inserts/updates/deletes. So unless you go straight to direct DB operations, it will take you weeks.
The webservice does allow for "bulk" inserts/deletes/updates, but I put "bulk" in quotes because all it does is set up an asynchronous process where it does all the relevant data operations - yep - one at a time. There's a section of the SDK that addresses this sort of data management (linked). And to update the records this way, you'd have to first suffer the overhead of selecting all the data you want to update, then creating an xml file that contains the data, and finally updating the data (remember: one row at a time). So it would actually be more efficient to just loop through your data and issue an Update request for each yourself.
(I will note that our org hasn't experienced any memorable issues regarding direct DB access to handle what the SDK doesn't, nor have I seen anything in my personal internet readings that suggest others have.)
Edit:
See iFirefly's answer below for some other excellent ways to address this issue.
I realize this is an old question, but it comes up high for "CRM Bulk Update", so the Update Rollup 12 feature ExecuteMultiple needs to be mentioned here. It's not going to work around your issue (massive volume), because, as iFirefly and Peter point out, CRM does everything one at a time. What it does do is package all your requests into a single envelope, letting CRM handle the execution of each update and reducing the number of round trips between your app and the server if you do end up issuing an Update request for every record.
This is quite an old question, but nobody mentioned the fastest (though also the most challenging) way of updating/creating huge amounts of records in CRM 201X: using the built-in import feature, which is totally doable using the CRM SDK. There is a perfect MSDN article about that:
https://msdn.microsoft.com/en-us/library/gg328321(v=crm.5).aspx. In short you have to:
1) Build an Excel file containing the data you want to import (simply export some data from CRM 201X and check what the structure looks like; remember that the first 3 columns are hidden)
2) Create an Import Map entity (specify the file you created)
3) Create column mappings if necessary
4) Create Import and ImportFile entities, providing proper mappings
5) Parse data using ParseImportRequest
6) Transform data using TransformImportRequest
7) Import data using ImportRecordsImportRequest
These were the steps for CRM 2011; in 2017 we now have more versions available, and there are slight differences between them. Check the sample that is available on MSDN and in the SDK:
https://msdn.microsoft.com/en-us/library/hh547396(v=crm.5).aspx
Of course point 1 will be the most difficult part, because you have to build an XML or docx file perfectly corresponding to what CRM expects, but I'm assuming you are doing it from an external app, so you can use some great .NET libraries that will make things much simpler.
I never saw anything faster than standard CRM import when it comes to updating/creating records, even if you go for parallelism and Batch Update requests.
If something goes wrong with the MSDN sites, I'm posting here the example from the link above that shows how to import data to CRM programmatically:
using System;
using System.ServiceModel;
using System.Collections.Generic;
using System.Linq;
// These namespaces are found in the Microsoft.Xrm.Sdk.dll assembly
// located in the SDK\bin folder of the SDK download.
using Microsoft.Xrm.Sdk;
using Microsoft.Xrm.Sdk.Query;
using Microsoft.Xrm.Sdk.Client;
using Microsoft.Xrm.Sdk.Messages;
using Microsoft.Xrm.Sdk.Metadata;
// These namespaces are found in the Microsoft.Crm.Sdk.Proxy.dll assembly
// located in the SDK\bin folder of the SDK download.
using Microsoft.Crm.Sdk.Messages;
namespace Microsoft.Crm.Sdk.Samples
{
/// <summary>
/// This sample shows how to define a complex mapping for importing and then use the
/// Microsoft Dynamics CRM 2011 API to bulk import records with that mapping.
/// </summary>
public class ImportWithCreate
{
#region Class Level Members
private OrganizationServiceProxy _serviceProxy;
private DateTime _executionDate;
#endregion
/// <summary>
/// This method first connects to the organization service. Afterwards,
/// auditing is enabled on the organization, account entity, and a couple
/// of attributes.
/// </summary>
/// <param name="serverConfig">Contains server connection information.</param>
/// <param name="promptforDelete">When True, the user will be prompted to delete all
/// created entities.</param>
public void Run(ServerConnection.Configuration serverConfig, bool promptforDelete)
{
using (_serviceProxy = ServerConnection.GetOrganizationProxy(serverConfig))
{
// This statement is required to enable early bound type support.
_serviceProxy.EnableProxyTypes();
// Log the start time to ensure deletion of records created during execution.
_executionDate = DateTime.Today;
ImportRecords();
DeleteRequiredRecords(promptforDelete);
}
}
/// <summary>
/// Imports records to Microsoft Dynamics CRM from the specified .csv file.
/// </summary>
public void ImportRecords()
{
// Create an import map.
ImportMap importMap = new ImportMap()
{
Name = "Import Map " + DateTime.Now.Ticks.ToString(),
Source = "Import Accounts.csv",
Description = "Description of data being imported",
EntitiesPerFile =
new OptionSetValue((int)ImportMapEntitiesPerFile.SingleEntityPerFile),
EntityState = EntityState.Created
};
Guid importMapId = _serviceProxy.Create(importMap);
// Create column mappings.
#region Column One Mappings
// Create a column mapping for a 'text' type field.
ColumnMapping colMapping1 = new ColumnMapping()
{
// Set source properties.
SourceAttributeName = "src_name",
SourceEntityName = "Account_1",
// Set target properties.
TargetAttributeName = "name",
TargetEntityName = Account.EntityLogicalName,
// Relate this column mapping with the data map.
ImportMapId =
new EntityReference(ImportMap.EntityLogicalName, importMapId),
// Force this column to be processed.
ProcessCode =
new OptionSetValue((int)ColumnMappingProcessCode.Process)
};
// Create the mapping.
Guid colMappingId1 = _serviceProxy.Create(colMapping1);
#endregion
#region Column Two Mappings
// Create a column mapping for a 'lookup' type field.
ColumnMapping colMapping2 = new ColumnMapping()
{
// Set source properties.
SourceAttributeName = "src_parent",
SourceEntityName = "Account_1",
// Set target properties.
TargetAttributeName = "parentaccountid",
TargetEntityName = Account.EntityLogicalName,
// Relate this column mapping with the data map.
ImportMapId =
new EntityReference(ImportMap.EntityLogicalName, importMapId),
// Force this column to be processed.
ProcessCode =
new OptionSetValue((int)ColumnMappingProcessCode.Process),
};
// Create the mapping.
Guid colMappingId2 = _serviceProxy.Create(colMapping2);
// Because we created a column mapping of type lookup, we need to specify lookup details in a lookupmapping.
// One lookupmapping will be for the parent account, and the other for the current record.
// This lookupmapping is important because without it the current record
// cannot be used as the parent of another record.
// Create a lookup mapping to the parent account.
LookUpMapping parentLookupMapping = new LookUpMapping()
{
// Relate this mapping with its parent column mapping.
ColumnMappingId =
new EntityReference(ColumnMapping.EntityLogicalName, colMappingId2),
// Force this column to be processed.
ProcessCode =
new OptionSetValue((int)LookUpMappingProcessCode.Process),
// Set the lookup for an account entity by its name attribute.
LookUpEntityName = Account.EntityLogicalName,
LookUpAttributeName = "name",
LookUpSourceCode =
new OptionSetValue((int)LookUpMappingLookUpSourceCode.System)
};
// Create the lookup mapping.
Guid parentLookupMappingId = _serviceProxy.Create(parentLookupMapping);
// Create a lookup on the current record's "src_name" so that this record can
// be used as the parent account for another record being imported.
// Without this lookup, no record using this account as its parent will be imported.
LookUpMapping currentLookUpMapping = new LookUpMapping()
{
// Relate this lookup with its parent column mapping.
ColumnMappingId =
new EntityReference(ColumnMapping.EntityLogicalName, colMappingId2),
// Force this column to be processed.
ProcessCode =
new OptionSetValue((int)LookUpMappingProcessCode.Process),
// Set the lookup for the current record by its src_name attribute.
LookUpAttributeName = "src_name",
LookUpEntityName = "Account_1",
LookUpSourceCode =
new OptionSetValue((int)LookUpMappingLookUpSourceCode.Source)
};
// Create the lookup mapping
Guid currentLookupMappingId = _serviceProxy.Create(currentLookUpMapping);
#endregion
#region Column Three Mappings
// Create a column mapping for a 'picklist' type field
ColumnMapping colMapping3 = new ColumnMapping()
{
// Set source properties
SourceAttributeName = "src_addresstype",
SourceEntityName = "Account_1",
// Set target properties
TargetAttributeName = "address1_addresstypecode",
TargetEntityName = Account.EntityLogicalName,
// Relate this column mapping with its parent data map
ImportMapId =
new EntityReference(ImportMap.EntityLogicalName, importMapId),
// Force this column to be processed
ProcessCode =
new OptionSetValue((int)ColumnMappingProcessCode.Process)
};
// Create the mapping
Guid colMappingId3 = _serviceProxy.Create(colMapping3);
// Because we created a column mapping of type picklist, we need to specify picklist details in a picklistMapping
PickListMapping pickListMapping1 = new PickListMapping()
{
SourceValue = "bill",
TargetValue = 1,
// Relate this column mapping with its column mapping data map
ColumnMappingId =
new EntityReference(ColumnMapping.EntityLogicalName, colMappingId3),
// Force this column to be processed
ProcessCode =
new OptionSetValue((int)PickListMappingProcessCode.Process)
};
// Create the mapping
Guid picklistMappingId1 = _serviceProxy.Create(pickListMapping1);
// Need a picklist mapping for every address type code expected
PickListMapping pickListMapping2 = new PickListMapping()
{
SourceValue = "ship",
TargetValue = 2,
// Relate this column mapping with its column mapping data map
ColumnMappingId =
new EntityReference(ColumnMapping.EntityLogicalName, colMappingId3),
// Force this column to be processed
ProcessCode =
new OptionSetValue((int)PickListMappingProcessCode.Process)
};
// Create the mapping
Guid picklistMappingId2 = _serviceProxy.Create(pickListMapping2);
#endregion
// Create Import
Import import = new Import()
{
// IsImport is obsolete; use ModeCode to declare Create or Update.
ModeCode = new OptionSetValue((int)ImportModeCode.Create),
Name = "Importing data"
};
Guid importId = _serviceProxy.Create(import);
// Create Import File.
ImportFile importFile = new ImportFile()
{
Content = BulkImportHelper.ReadCsvFile("Import Accounts.csv"), // Read contents from disk.
Name = "Account record import",
IsFirstRowHeader = true,
ImportMapId = new EntityReference(ImportMap.EntityLogicalName, importMapId),
UseSystemMap = false,
Source = "Import Accounts.csv",
SourceEntityName = "Account_1",
TargetEntityName = Account.EntityLogicalName,
ImportId = new EntityReference(Import.EntityLogicalName, importId),
EnableDuplicateDetection = false,
FieldDelimiterCode =
new OptionSetValue((int)ImportFileFieldDelimiterCode.Comma),
DataDelimiterCode =
new OptionSetValue((int)ImportFileDataDelimiterCode.DoubleQuote),
ProcessCode =
new OptionSetValue((int)ImportFileProcessCode.Process)
};
// Get the current user to set as record owner.
WhoAmIRequest systemUserRequest = new WhoAmIRequest();
WhoAmIResponse systemUserResponse =
(WhoAmIResponse)_serviceProxy.Execute(systemUserRequest);
// Set the owner ID.
importFile.RecordsOwnerId =
new EntityReference(SystemUser.EntityLogicalName, systemUserResponse.UserId);
Guid importFileId = _serviceProxy.Create(importFile);
// Retrieve the header columns used in the import file.
GetHeaderColumnsImportFileRequest headerColumnsRequest = new GetHeaderColumnsImportFileRequest()
{
ImportFileId = importFileId
};
GetHeaderColumnsImportFileResponse headerColumnsResponse =
(GetHeaderColumnsImportFileResponse)_serviceProxy.Execute(headerColumnsRequest);
// Output the header columns.
int columnNum = 1;
foreach (string headerName in headerColumnsResponse.Columns)
{
Console.WriteLine("Column[" + columnNum.ToString() + "] = " + headerName);
columnNum++;
}
// Parse the import file.
ParseImportRequest parseImportRequest = new ParseImportRequest()
{
ImportId = importId
};
ParseImportResponse parseImportResponse =
(ParseImportResponse)_serviceProxy.Execute(parseImportRequest);
Console.WriteLine("Waiting for Parse async job to complete");
BulkImportHelper.WaitForAsyncJobCompletion(_serviceProxy, parseImportResponse.AsyncOperationId);
BulkImportHelper.ReportErrors(_serviceProxy, importFileId);
// Retrieve the first two distinct values for column 1 from the parse table.
// NOTE: You must create the parse table first using the ParseImport message.
// The parse table is not accessible after ImportRecordsImportResponse is called.
GetDistinctValuesImportFileRequest distinctValuesRequest = new GetDistinctValuesImportFileRequest()
{
columnNumber = 1,
ImportFileId = importFileId,
pageNumber = 1,
recordsPerPage = 2,
};
GetDistinctValuesImportFileResponse distinctValuesResponse =
(GetDistinctValuesImportFileResponse)_serviceProxy.Execute(distinctValuesRequest);
// Output the distinct values. In this case: (column 1, row 1) and (column 1, row 2).
int cellNum = 1;
foreach (string cellValue in distinctValuesResponse.Values)
{
Console.WriteLine("(1, " + cellNum.ToString() + "): " + cellValue);
Console.WriteLine(cellValue);
cellNum++;
}
// Retrieve data from the parse table.
// NOTE: You must create the parse table first using the ParseImport message.
// The parse table is not accessible after ImportRecordsImportResponse is called.
RetrieveParsedDataImportFileRequest parsedDataRequest = new RetrieveParsedDataImportFileRequest()
{
ImportFileId = importFileId,
PagingInfo = new PagingInfo()
{
// Specify the number of entity instances returned per page.
Count = 2,
// Specify the number of pages returned from the query.
PageNumber = 1,
// Specify a total number of entity instances returned.
PagingCookie = "1"
}
};
RetrieveParsedDataImportFileResponse parsedDataResponse =
(RetrieveParsedDataImportFileResponse)_serviceProxy.Execute(parsedDataRequest);
// Output the first two rows retrieved.
int rowCount = 1;
foreach (string[] rows in parsedDataResponse.Values)
{
int colCount = 1;
foreach (string column in rows)
{
Console.WriteLine("(" + rowCount.ToString() + "," + colCount.ToString() + ") = " + column);
colCount++;
}
rowCount++;
}
// Transform the import
TransformImportRequest transformImportRequest = new TransformImportRequest()
{
ImportId = importId
};
TransformImportResponse transformImportResponse =
(TransformImportResponse)_serviceProxy.Execute(transformImportRequest);
Console.WriteLine("Waiting for Transform async job to complete");
BulkImportHelper.WaitForAsyncJobCompletion(_serviceProxy, transformImportResponse.AsyncOperationId);
BulkImportHelper.ReportErrors(_serviceProxy, importFileId);
// Upload the records.
ImportRecordsImportRequest importRequest = new ImportRecordsImportRequest()
{
ImportId = importId
};
ImportRecordsImportResponse importResponse =
(ImportRecordsImportResponse)_serviceProxy.Execute(importRequest);
Console.WriteLine("Waiting for ImportRecords async job to complete");
BulkImportHelper.WaitForAsyncJobCompletion(_serviceProxy, importResponse.AsyncOperationId);
BulkImportHelper.ReportErrors(_serviceProxy, importFileId);
}
/// <summary>
/// Deletes any entity records that were created for this sample.
/// <param name="prompt">Indicates whether to prompt the user
/// to delete the records created in this sample.</param>
/// </summary>
public void DeleteRequiredRecords(bool prompt)
{
bool toBeDeleted = true;
if (prompt)
{
// Ask the user if the created entities should be deleted.
Console.Write("\nDo you want these entity records deleted? (y/n) [y]: ");
String answer = Console.ReadLine();
if (answer.StartsWith("y") ||
answer.StartsWith("Y") ||
answer == String.Empty)
{
toBeDeleted = true;
}
else
{
toBeDeleted = false;
}
}
if (toBeDeleted)
{
// Retrieve all account records created in this sample.
QueryExpression query = new QueryExpression()
{
EntityName = Account.EntityLogicalName,
Criteria = new FilterExpression()
{
Conditions =
{
new ConditionExpression("createdon", ConditionOperator.OnOrAfter, _executionDate),
}
},
ColumnSet = new ColumnSet(false)
};
var accountsCreated = _serviceProxy.RetrieveMultiple(query).Entities;
// Delete all records created in this sample.
foreach (var account in accountsCreated)
{
_serviceProxy.Delete(Account.EntityLogicalName, account.Id);
}
Console.WriteLine("Entity record(s) have been deleted.");
}
}
#region Main method
/// <summary>
/// Standard Main() method used by most SDK samples.
/// </summary>
/// <param name="args"></param>
static public void Main(string[] args)
{
try
{
// Obtain the target organization's web address and client logon
// credentials from the user.
ServerConnection serverConnect = new ServerConnection();
ServerConnection.Configuration config = serverConnect.GetServerConfiguration();
var app = new ImportWithCreate();
app.Run(config, true);
}
catch (FaultException<Microsoft.Xrm.Sdk.OrganizationServiceFault> ex)
{
Console.WriteLine("The application terminated with an error.");
Console.WriteLine("Timestamp: {0}", ex.Detail.Timestamp);
Console.WriteLine("Code: {0}", ex.Detail.ErrorCode);
Console.WriteLine("Message: {0}", ex.Detail.Message);
Console.WriteLine("Trace: {0}", ex.Detail.TraceText);
Console.WriteLine("Inner Fault: {0}",
null == ex.Detail.InnerFault ? "No Inner Fault" : "Has Inner Fault");
}
catch (System.TimeoutException ex)
{
Console.WriteLine("The application terminated with an error.");
Console.WriteLine("Message: {0}", ex.Message);
Console.WriteLine("Stack Trace: {0}", ex.StackTrace);
Console.WriteLine("Inner Fault: {0}",
null == ex.InnerException.Message ? "No Inner Fault" : ex.InnerException.Message);
}
catch (System.Exception ex)
{
Console.WriteLine("The application terminated with an error.");
Console.WriteLine(ex.Message);
// Display the details of the inner exception.
if (ex.InnerException != null)
{
Console.WriteLine(ex.InnerException.Message);
FaultException<Microsoft.Xrm.Sdk.OrganizationServiceFault> fe = ex.InnerException
as FaultException<Microsoft.Xrm.Sdk.OrganizationServiceFault>;
if (fe != null)
{
Console.WriteLine("Timestamp: {0}", fe.Detail.Timestamp);
Console.WriteLine("Code: {0}", fe.Detail.ErrorCode);
Console.WriteLine("Message: {0}", fe.Detail.Message);
Console.WriteLine("Trace: {0}", fe.Detail.TraceText);
Console.WriteLine("Inner Fault: {0}",
null == fe.Detail.InnerFault ? "No Inner Fault" : "Has Inner Fault");
}
}
}
// Additional exceptions to catch: SecurityTokenValidationException, ExpiredSecurityTokenException,
// SecurityAccessDeniedException, MessageSecurityException, and SecurityNegotiationException.
finally
{
Console.WriteLine("Press <Enter> to exit.");
Console.ReadLine();
}
}
#endregion Main method
}
}
Not sure how this would go with millions of records, but you can select your records, then click the Edit button in the ribbon. This will bring up the "Edit Multiple Records" dialog. Any changes you make will be applied to all your records.
The BulkUpdate API works well for me; it is 10 times faster than updating records one at a time. Following is a snippet that performs a bulk update:
public override ExecuteMultipleResponse BulkUpdate(List<Entity> entities)
{
    ExecuteMultipleRequest request = new ExecuteMultipleRequest()
    {
        Settings = new ExecuteMultipleSettings()
        {
            ContinueOnError = true,
            ReturnResponses = true
        },
        Requests = new OrganizationRequestCollection()
    };

    for (int i = 0; i < entities.Count; i++)
    {
        request.Requests.Add(new UpdateRequest() { Target = entities[i] });
    }

    return (ExecuteMultipleResponse)ServiceContext.Execute(request);
}
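One caveat if you adapt this for very large volumes: a single ExecuteMultipleRequest is capped at 1,000 requests, so the input has to be fed in chunks. A rough sketch of a wrapper reusing the BulkUpdate method above (Skip/Take require System.Linq):
public void BulkUpdateAll(List<Entity> allEntities)
{
    const int batchSize = 1000; // documented per-request cap for ExecuteMultiple
    for (int offset = 0; offset < allEntities.Count; offset += batchSize)
    {
        List<Entity> chunk = allEntities.Skip(offset).Take(batchSize).ToList();
        ExecuteMultipleResponse response = BulkUpdate(chunk);
        // With ReturnResponses = true, response.Responses can be inspected
        // here for per-record faults.
    }
}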
I worked on a very large data migration project for Dynamics CRM 2011. We needed to load about 3 million records over a weekend. I ended up building a console application (single thread) and ran multiple instances on multiple machines. Each console application had an id (1, 2, etc.) and was responsible for loading segments of the data based on a unique SQL WHERE clause that matched the application's id.
You could do the same thing with updates. Each instance could query a subset of the records to update and can perform the updates via the SDK. Since we loaded millions of records over a weekend I think you could perform millions of updates (if relatively small) in just a few hours.
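A rough sketch of that slicing scheme, with hypothetical staging table and column names; 'service' is assumed to be an already-connected IOrganizationService, and each console instance is launched with its own id (0 to instanceCount - 1) so the slices never overlap:
using System;
using System.Data.SqlClient;
using Microsoft.Xrm.Sdk;

static void UpdateSlice(IOrganizationService service, string sqlConnString,
                        int instanceId, int instanceCount)
{
    // A modulo filter over a stable hash gives each instance a disjoint slice.
    const string sql =
        "SELECT CrmAccountId, NewName FROM dbo.AccountDeltas " +
        "WHERE ABS(CHECKSUM(CrmAccountId)) % @count = @id";

    using (var conn = new SqlConnection(sqlConnString))
    using (var cmd = new SqlCommand(sql, conn))
    {
        cmd.Parameters.AddWithValue("@count", instanceCount);
        cmd.Parameters.AddWithValue("@id", instanceId);
        conn.Open();
        using (SqlDataReader reader = cmd.ExecuteReader())
        {
            while (reader.Read())
            {
                // One-by-one update via the SDK, as described above.
                var account = new Entity("account") { Id = reader.GetGuid(0) };
                account["name"] = reader.GetString(1);
                service.Update(account);
            }
        }
    }
}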
The Microsoft PFE team for Dynamics CRM wrote a new CRM SDK library that makes use of parallelization to bulk execute requests while ensuring thread safety.
You may try: Parallel Execute Requests
I would be interested to know if it works and scales to millions of records.
CRM doesn't implement a way to update data in bulk; there are 3 ways to improve bulk update performance, but internally they cannot change the fact that CRM updates records one by one.
Basically the ideas are:
reduce the time wasted on communication with the CRM server
use parallelism to do multiple operations at the same time
make sure the update process does NOT trigger any workflows/plugins. Otherwise you might never see the end of the process...
3 ways to improve bulk operation performance:
After Rollup 12 there is the ExecuteMultipleRequest feature, which allows you to send up to 1000 requests at once. This means you can save the time spent sending 1000 individual requests to the CRM web service; however, these requests are still processed one after another on the server. So if your CRM server is well configured, most likely this method won't help much.
You may use an OrganizationServiceContext instance to do bulk updates. OrganizationServiceContext implements the unit-of-work pattern, so you can queue multiple updates and transmit these operations to the server in one call (see the sketch after this list). Compared to ExecuteMultipleRequest, it doesn't have a limit on the request count, but if it encounters a failure during the update, it will roll back all the changes.
Use multithreading or multitasking. Either way would improve the speed, but both are likely to generate some connection failures or SQL errors, so you would need to add some retry logic to the code.
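A rough sketch of the second option, assuming 'service' is a connected IOrganizationService and 'entities' holds the modified records:
using System.Collections.Generic;
using Microsoft.Xrm.Sdk;
using Microsoft.Xrm.Sdk.Client;

static void BatchUpdate(IOrganizationService service, IEnumerable<Entity> entities)
{
    var context = new OrganizationServiceContext(service);
    foreach (Entity entity in entities)
    {
        context.Attach(entity);       // start tracking the existing record
        context.UpdateObject(entity); // mark it as modified
    }
    // One call transmits all pending updates; as noted above, a failure
    // rolls the whole batch back.
    context.SaveChanges();
}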
One of my clients had exactly the same problem. He solved it by creating a custom ETL and running it in parallel against two front-end servers. The whole thing was made in C#. Nowadays, it could be possible with KingswaySoft or Scribe.