Plugin performance in Microsoft Dynamics CRM 2013/2015

Time to leave shy mode behind and make my first post on Stack Overflow.
After doing loads of research (plugins, performance, indexes, types of update, friends) and trying several approaches, I was unable to find a proper answer/solution.
So, if possible, I would like your feedback/help with a Microsoft Dynamics CRM 2013/2015 plugin performance issue (or coding technique).
Scenario:
Microsoft Dynamics CRM 2013/2015
2 Entities with Relationship 1:N
EntityA
EntityB
EntityB has the following columns:
Id | EntityAId | ColumnDemoX (decimal) | ColumnDemoY (currency)
Entity A has: 500 records
Entity B has: 150 records per each Entity A record. So 500*150 = 75000 records.
Objective:
Create a post-update plugin on Entity A that "mimics" the following SQL command:
Update EntityB
Set ColumnDemoX = (some quantity), ColumnDemoY = (some quantity) * (some value)
Where EntityAId = (some id)
One approach could be:
using (var serviceContext = new XrmServiceContext(service))
{
    // Retrieve the EntityB records that belong to the updated EntityA record.
    var query = from b in serviceContext.EntityBSet
                where b.EntityAId.Equals(someId)
                select b;

    foreach (EntityB entB in query)
    {
        entB.ColumnDemoX = (some quantity);
        entB.ColumnDemoY = (some quantity) * (some value);
        serviceContext.UpdateObject(entB);
    }

    serviceContext.SaveChanges();
}
Problem:
The foreach over 150 records in the post-update plugin takes 20 seconds or more, while the SQL statement
Update EntityB Set ColumnDemoX = (some quantity), ColumnDemoY = (some quantity) * (some value) Where EntityAId = (some id)
takes about 0.00001 seconds.
Any suggestion/solution?
Thank you all for reading.
H

You can use ExecuteMultipleRequest: as you iterate over the 150 entities, collect the entities you need to update, and then execute a single request at the end. That way you only call the service once, which is much better for performance.
If the process may keep growing, you should consider making it asynchronous, either as an asynchronous plug-in or as a custom workflow activity.
This is an example:
// Create an ExecuteMultipleRequest object.
requestWithResults = new ExecuteMultipleRequest()
{
    // Assign settings that define execution behavior: continue on error, return responses.
    Settings = new ExecuteMultipleSettings()
    {
        ContinueOnError = false,
        ReturnResponses = true
    },
    // Create an empty organization request collection.
    Requests = new OrganizationRequestCollection()
};

// Add an UpdateRequest for each entity to the request collection.
foreach (var entity in input.Entities)
{
    UpdateRequest updateRequest = new UpdateRequest { Target = entity };
    requestWithResults.Requests.Add(updateRequest);
}

// Execute all the requests in the request collection using a single web method call.
ExecuteMultipleResponse responseWithResults =
    (ExecuteMultipleResponse)_serviceProxy.Execute(requestWithResults);
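Since ReturnResponses is set to true above, you can check each item for a fault after the call; a short follow-up sketch using the same responseWithResults variable:
// Inspect every response item; Fault is non-null when that request failed.
foreach (var responseItem in responseWithResults.Responses)
{
    if (responseItem.Fault != null)
    {
        // RequestIndex maps the fault back to the request that caused it.
        Console.WriteLine("Request {0} failed: {1}",
            responseItem.RequestIndex, responseItem.Fault.Message);
    }
}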

A few solutions come to mind, but I don't think they will please you...
Is this really a problem? Yes, it's slow, and a database update can be so much faster. However, if you can run it as a background process (asynchronous), you'll get your numbers anyway. Is it really an "I need these numbers within a second of clicking or business will go down" situation?
It can be a reason to ditch 2013. In CRM 2015 you can use a calculated field. If you only need these numbers to show up on forms (e.g. you don't use them in reporting), you could also compute them in JavaScript.
Warning: this one is for the desperate. If you really need your update to be synchronous and immediate, you can't use calculated fields, and you really know what you're doing... why not do it directly in the database? I know this is very bad advice. There are a lot of reasons not to do it this way (you can read a few here). It's unsupported, and if you do something wrong it could go really badly. But if your real situation is as simple as your example (just a calculated value, no entity creation, no relationship modification), you could do it this way. You'll have to consider many things: you won't have any auditing on the fields, no security, caching issues, no "modified by", etc. Actually, I pretty much advise against this solution.

1 - Move this logic to an asynchronous workflow.
OR
2 - Don't call
serviceContext.UpdateObject(entA);
serviceContext.SaveChanges();
for each record. Get all the records (150) in the post stage, update the fields, and use a single ExecuteMultipleRequest to update the CRM records in one go (see the sketch below).
Don't send an update request for each and every record.
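A minimal late-bound sketch of that second option, assuming the question's schema (logical names such as "entityb", "entityaid" and "columndemox" are illustrative) and an IOrganizationService called service; someId, someQuantity and someValue stand in for the question's placeholders:
// Placeholder values mirroring the question's (some quantity) / (some value).
decimal someQuantity = 1.0m, someValue = 2.0m;
Guid someId = new Guid("00000000-0000-0000-0000-000000000000"); // the updated EntityA id

// Retrieve only the ids of the related EntityB records.
var query = new QueryExpression("entityb") { ColumnSet = new ColumnSet(false) };
query.Criteria.AddCondition("entityaid", ConditionOperator.Equal, someId);
EntityCollection related = service.RetrieveMultiple(query);

// Batch all updates into a single ExecuteMultipleRequest.
var batch = new ExecuteMultipleRequest
{
    Settings = new ExecuteMultipleSettings { ContinueOnError = false, ReturnResponses = false },
    Requests = new OrganizationRequestCollection()
};

foreach (Entity entB in related.Entities)
{
    // Send only the changed attributes, not the whole retrieved record.
    var update = new Entity("entityb") { Id = entB.Id };
    update["columndemox"] = someQuantity;
    update["columndemoy"] = new Money(someQuantity * someValue);
    batch.Requests.Add(new UpdateRequest { Target = update });
}

service.Execute(batch);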

Related

dart - Sort a list of futures

So I have a set of options that each contain an int value representing their ordinal.
These options are stored in a remote database with each option being a record.
As such, when I fetch them from the db I end up with a list of futures:
e.g. List<Future<Option>>
I need to be able to sort these Options.
The following dart pad shows a simplified view of what I'm trying to achieve:
https://dartpad.dartlang.org/a5175401516dbb9242a0edec4c89fef6
The Options MUST be futures.
My original solution was to copy the Options into a list, 'complete' them, then sort the list.
This however caused other problems, and as such I need to do an in-situ sort on the original list.
You cannot sort the futures before they have completed, and even then, you need to extract the values first.
If you need to have a list of futures afterwards, this is what I would do:
import 'dart:async';

List<Future<T>> sortFutures<T>(List<Future<T>> input, [int compare(T a, T b)]) {
  // One completer per input future; their futures are what we return.
  var completers = [for (var i = 0; i < input.length; i++) Completer<T>()];
  // Wait for all inputs, sort the values, then complete the output
  // futures in sorted order.
  Future.wait(input).then((values) {
    values.sort(compare);
    for (int i = 0; i < values.length; i++) {
      completers[i].complete(values[i]);
    }
  });
  return [for (var c in completers) c.future];
}
This does not return the original futures because you don't know the ordering at the time you have to return them. It does return futures which complete with the same value.
If any of the futures completes with an error, then this blows up. You'll need more error handling if that is possible.
Gentlefolk,
thanks for the help.
julemand101's suggestion of using Future.wait() led me to the answer.
It also helped me better understand the problem.
I've done a new gist that more accurately shows what I was attempting to do.
Essentially, when we do a db request over the network we get an entity back.
The problem is that the entity will often have references to other entities.
This can end up with a whole tree of entities needing to be returned.
Often you don't need any of these entities.
So the solution we went for is to only return the database 'id' of each child entity (only the immediate children).
We then store those id's in a class RefId (see below).
The RefId is essentially a future that has the entities id and knows how to fetch the entity from the db.
When we actually need to access a child entity we force the RefId to complete (i.e. retrieve the entity across the network boundary).
We have a whole caching scheme to keep this performant as well as the ability to force the fetching of child elements, as part of the parent request, where we know up front they will be needed.
The options in my example are essentially menu items that need to be sorted.
But of course I can't sort them until they have been retrieved.
So a re-written example and answer:
https://dartpad.dartlang.org/369e71bb173ba3c19d28f6d6fec2072a
Here is the actual IdRef class we use:
https://dartpad.dartlang.org/ba892873a94d9f6f3924436e9fcd1b42
It now has a static resolveList method to help with this type of problem.
Thanks for your assistance.

Dynamics 365 implementation auto number with plugin

I have already implemented auto-numbering with a plugin, going through another entity type to ensure all operations happen in one transaction.
But I have another question: can we use a process lock (e.g. a mutex) instead? That would be more flexible, wouldn't it?
Dynamics 365 has native support for auto-numbering fields since v9.0, with the minor inconvenience of having to manipulate them through code only.
Unless it's a broken feature (it happens), there's no need for custom autonumbers anymore.
https://learn.microsoft.com/en-us/dynamics365/customer-engagement/developer/create-auto-number-attributes
Example:
CreateAttributeRequest widgetSerialNumberAttributeRequest = new CreateAttributeRequest
{
    EntityName = "newWidget",
    Attribute = new StringAttributeMetadata
    {
        // Define the format of the attribute.
        AutoNumberFormat = "WID-{SEQNUM:5}-{RANDSTRING:6}-{DATETIMEUTC:yyyyMMddhhmmss}",
        LogicalName = "new_serialnumber",
        SchemaName = "new_SerialNumber",
        RequiredLevel = new AttributeRequiredLevelManagedProperty(AttributeRequiredLevel.None),
        // MaxLength must be large enough to fit any value the AutoNumberFormat can generate.
        MaxLength = 100,
        DisplayName = new Label("Serial Number", 1033),
        Description = new Label("Serial Number of the widget.", 1033)
    }
};
_serviceProxy.Execute(widgetSerialNumberAttributeRequest);
The docs point out that:
The sequential segment is generated by SQL and hence uniqueness is guaranteed by SQL.
XrmToolbox should have a plugin to manage auto-number fields (thus making it easier), but I haven't tried it.
Plugins can run on multiple machines concurrently depending on your installation - that's why the entity (database) lock is regularly used: a process lock such as a mutex only protects a single process on a single machine, so it can't guarantee uniqueness across servers.
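To illustrate, here is a minimal sketch of that entity-lock pattern, assuming a hypothetical counter entity "new_counter" with attributes "new_currentnumber" (integer) and "new_lockedon" (datetime); run inside the plugin's transaction, the first update takes a database lock on the counter row, so concurrent executions on any server queue up behind it:
// Names are illustrative; must run inside a plugin transaction.
private static int GetNextNumber(IOrganizationService service, Guid counterId)
{
    // Updating the counter record acquires a database lock on that row,
    // serializing concurrent plugin executions across all servers.
    var takeLock = new Entity("new_counter") { Id = counterId };
    takeLock["new_lockedon"] = DateTime.UtcNow;
    service.Update(takeLock);

    // Safe to read and increment now; the lock holds until the transaction ends.
    Entity counter = service.Retrieve("new_counter", counterId,
        new ColumnSet("new_currentnumber"));
    int next = counter.GetAttributeValue<int>("new_currentnumber") + 1;

    var increment = new Entity("new_counter") { Id = counterId };
    increment["new_currentnumber"] = next;
    service.Update(increment);

    return next;
}
A mutex can't give you this guarantee because it only exists within one process on one machine.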

Select Count very slow using EF with Oracle

I'm using EF 5 with an Oracle database.
I'm doing a select count on a table with a specific parameter. When I use EF, the query returns the value 31, as expected, but the result takes about 10 seconds to come back.
using (var serv = new Aperam.SIP.PXP.Negocio.Modelos.SIP_PA())
{
    var teste = (from ens in serv.PA_ENSAIOS_UM
                 where ens.COD_IDENT_UNMET == "FBLDY3840"
                 select ens).Count();
}
If I execute the simple query below, the result is the same (31), but it comes back in 500 milliseconds.
SELECT
    count(*)
FROM
    PA_ENSAIOS_UM
WHERE
    COD_IDENT_UNMET = 'FBLDY3840'
Is there a way to improve the performance when I'm using EF?
Note: there are 13,000,000 rows in this table.
Here are some things you can try:
Capture the query that is being generated and see if it is the same as the one you are using. Details can be found here, but essentially, you will instantiate your DbContext (let's call it "_context") and then set the Database.Log property to be the logging method. It's fine if this method doesn't actually do anything--you can just set a breakpoint in there and see what's going on.
So, as an example: define a logging function (I have a static class called "Logging" which uses nLog to write to files)
public static void LogQuery(string queryData)
{
    if (string.IsNullOrWhiteSpace(queryData))
        return;

    var message = string.Format("{0}{1}",
        queryData.Trim().Contains(Environment.NewLine) ?
            Environment.NewLine : "", queryData);

    _sqlLogger.Info(message);
    _genLogger.Trace($"EntityFW query (len {message.Length} chars)");
}
Then when you create your context point to LogQuery:
_context.Database.Log = Logging.LogQuery;
When you do your tests, remember that often the first run is the slowest because the server has to actually do the work, but on the subsequent runs, it often uses cached data. Try running your tests 2-3 times back to back and see if they don't start to run in the same time.
I don't know if it generates the same query or not, but try this other form (which should be functionally equivalent, but may provide better time)
var teste = serv.PA_ENSAIOS_UM.Count(ens=>ens.COD_IDENT_UNMET == "FBLDY3840");
I'm wondering if the version you have pulls data from the DB and THEN counts it. If so, this other syntax may leave all the work to be done at the server, where it belongs. Not sure, though, esp. since I haven't ever used EF with Oracle and I don't know if it behaves the same as SQL or not.
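One quick way to check what EF sends, without wiring up the logger above: calling ToString() on a DbContext LINQ query returns the SQL EF generated. A sketch reusing the question's serv context; note that Count() is translated separately, so use Database.Log to see the final COUNT query:
// Build the query without executing it and dump the generated SELECT.
var query = serv.PA_ENSAIOS_UM
    .Where(ens => ens.COD_IDENT_UNMET == "FBLDY3840");
Console.WriteLine(query.ToString());

// The aggregate itself still runs server-side when written like this:
int total = query.Count();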

Using "Any" or "Contains" when context not saved yet

Why isn't the exception triggered? Is LINQ's Any() not considering the new entries?
MyContext db = new MyContext();
foreach (string email in new[] { "asdf@gmail.com", "asdf@gmail.com" })
{
    Person person = new Person();
    person.Email = email;
    if (db.Persons.Any(p => p.Email.Equals(email)))
    {
        throw new Exception("Email already used!");
    }
    db.Persons.Add(person);
}
db.SaveChanges();
Shouldn't the exception be triggered on the second iteration?
The previous code is adapted for the question, but the real scenario is the following:
I receive an excel of persons and I iterate over it adding every row as a person to db.Persons, checking their emails aren't already used in the db. The problem is when there are repeated emails in the worksheet itself (two rows with the same email)
Yes - queries (by design) are only computed against the data source. If you want to query in-memory items you can also query the Local store:
if (db.Persons.Any(p => p.Email.Equals(email)) ||
    db.Persons.Local.Any(p => p.Email.Equals(email)))
However - since YOU are in control of what's added to the store wouldn't it make sense to check for duplicates in your code instead of in EF? Or is this just a contrived example?
Also, throwing an exception for an already existing item seems like a poor design as well - exceptions can be expensive, and if the client does not know to catch them (and in this case compare the message of the exception) they can cause the entire program to terminate unexpectedly.
A call to db.Persons will always trigger a database query, but those new Persons are not yet persisted to the database.
I imagine if you look at the data in debug, you'll see that the new person isn't there on the second iteration. If you were to set MyContext db = new MyContext() again, it would be, but you wouldn't do that in a real situation.
What is the actual use case you need to solve? This example doesn't seem like it would happen in a real situation.
If you're comparing against the db, your code should work. If you need to prevent dups being entered, it should happen elsewhere - on the client, or by checking the C# collection before you start writing it to the db (see the sketch below).
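For example, a short sketch of de-duplicating in memory first (Person and MyContext as in the question):
var emails = new[] { "asdf@gmail.com", "asdf@gmail.com" };

// HashSet.Add returns false for a value already seen in this batch.
var seen = new HashSet<string>(StringComparer.OrdinalIgnoreCase);

using (var db = new MyContext())
{
    foreach (string email in emails)
    {
        if (!seen.Add(email))
            continue; // duplicate within the incoming worksheet itself

        if (db.Persons.Any(p => p.Email == email))
            continue; // already persisted in the database

        db.Persons.Add(new Person { Email = email });
    }

    db.SaveChanges();
}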

Dynamics CRM 2011 LinQ to Find New Records

I have a bunch of custom entity records in a List (which comes from a csv file).
What is the best way to check which records are new and create those that are?
The equality check is based on a single text field in this case, but I need to do the same thing elsewhere where the equality check is based on a lookup and 2 text fields.
For argument's sake, let's say I was inserting Account records; this is what I currently have:
private void CreateAccounts()
{
    var list = this.GetAccounts(); // get the list of Accounts, some may be new

    // get all Account numbers in CRM; linq is a serviceContext variable
    IEnumerable<string> existingAccounts = linq.AccountSet.Select(account => account.AccountNumber);

    // Accounts whose numbers are not in CRM yet
    var newAccounts = list.Where(account => !existingAccounts.Contains(account.AccountNumber));

    // go through the new list again and get all the Account info
    foreach (var accountNumber in newAccounts.Select(a => a.AccountNumber))
    {
        var account = newAccounts.Where(newAccount => newAccount.AccountNumber.Equals(accountNumber)).FirstOrDefault();
        service.Create(account);
    }
}
Is there a better way to do this?
I seem to be iterating through lists too many times, but it must be better than querying CRM multiple times:
foreach (var account in list)
{
// is this Account already in CRM
// if not create the Account
}
Your current method seems a bit backwards (get everything out of CRM, then compare it to what you have locally), but it may not be too bad depending on how many accounts you have, i.e. < 5000.
For your simple example, you should be able to apply a where-in style query.
Joining on multiple fields is a little more tricky (see the sketch below). If you are running CRM 2011 with Update Rollup 12 or later, you should be able to use ExecuteMultipleRequest, creating a separate request for each item in your list and batching them all up, so there is one big request "over the wire" to CRM.
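For the multi-field case, one option is to build a composite key locally and compare against that. A sketch assuming the match is on a lookup (ParentAccountId) plus two text fields (AccountNumber, Name); the field choice is illustrative:
// Pull the key fields for existing records, then build composite keys locally
// (Tuple.Create can't be translated by the CRM LINQ provider, hence AsEnumerable).
var existingKeys = new HashSet<Tuple<Guid, string, string>>(
    linq.AccountSet
        .Select(a => new { a.ParentAccountId, a.AccountNumber, a.Name })
        .AsEnumerable()
        .Select(a => Tuple.Create(
            a.ParentAccountId == null ? Guid.Empty : a.ParentAccountId.Id,
            a.AccountNumber,
            a.Name)));

// Anything from the file whose composite key is unseen is new.
var newAccounts = list.Where(a => !existingKeys.Contains(
    Tuple.Create(
        a.ParentAccountId == null ? Guid.Empty : a.ParentAccountId.Id,
        a.AccountNumber,
        a.Name)));

foreach (var account in newAccounts)
{
    service.Create(account);
}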
