Trigger on Contact to count activity history, with the count also updated on the Account and Opportunity objects - Apex code

We need a field for the number of Activities for each Contact, regardless of Type or Status.
Full details: the requirement is to be able to create reports based on Contacts, Accounts, or Opportunities and see a "Summary" field for the total number of Activities associated with each Contact, Account, or Opportunity.
Note that we do not want to see a line for each activity; we want to see one line for each contact, opportunity, or account with the activity summary count.
Note: the original request also included the ability to have a unique activity count on reports.

I haven't tested this code, but you could do something like this for an insert (you'd need to also cover updates and deletes). In this example, NumberOfActivities__c is your custom Task count field on the Contact object:
Map<Id, Integer> countMap = new Map<Id, Integer>();
List<Contact> contactList = new List<Contact>();
for (Task t : Trigger.new) {
    // get the Ids of all Contacts affected by this batch
    Id w = t.WhoId;
    if (w != null && w.getSObjectType() == Contact.SObjectType) {
        // since there could be more than one Task related to a Contact
        // in a batch, count them per Contact
        if (countMap.containsKey(w)) {
            countMap.put(w, countMap.get(w) + 1);
        } else {
            countMap.put(w, 1);
        }
    }
}
// get the list of Contacts to be updated
contactList = [SELECT Id, NumberOfActivities__c
               FROM Contact
               WHERE Id IN :countMap.keySet()];
// modify the Contacts in the list with the new count (treat a null count as zero)
for (Contact c : contactList) {
    Decimal current = c.NumberOfActivities__c == null ? 0 : c.NumberOfActivities__c;
    c.NumberOfActivities__c = current + countMap.get(c.Id);
}
// do the update
update contactList;

Related

How do I add a member (name and email address) to an existing Outlook Distribution List using C#

I am trying to programmatically add a member (name and email address) to an existing Outlook Distribution List, but I can't figure out how to grab it. I have found many postings describing how to create a new Outlook Distribution List, but none on how to add a member to an existing one. I have been able to retrieve the Items collection of the Contacts folder, but I cannot access the Outlook Distribution List I want. Keep in mind that the Contacts folder contains at least two different object types, Contact items and Distribution List items. Is there a way to retrieve just the Distribution List items from the Contacts folder? Any help would be greatly appreciated. I have no code worth posting.
I have made some progress. I now have the below code:
Outlook.MAPIFolder outlookContactsFolder = outlookNamespace.GetDefaultFolder(Outlook.OlDefaultFolders.olFolderContacts); // Get Contacts folder.
Outlook.Items outlookContactsItems = outlookContactsFolder.Items; // Get the Items collection.
for (int i = 1; i <= outlookContactsItems.Count; i++)
{
if (i == 62)
{
Outlook.DistListItem outlookDistListItem = outlookContactsItems.GetNext();
Outlook.Recipient outlookRecipient = **(Need help creating a Recipient object with a name and email address)**
outlookDistListItem.AddMember(outlookRecipient);
outlookDistListItem.Save();
break;
}
else
{
Outlook.ContactItem outlookContactsItem = outlookContactsItems.GetNext();
}
}
I know this is not the best way, but it works. I can now access the Distribution List without the code blowing up. Now I need to add a new member to it. I know I can do that with the AddMember method, but it takes an Outlook.Recipient object. I can't find anywhere how to create one from a name and email address.
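For what it's worth, a Recipient can be created through the NameSpace object used above and resolved before it is added. This is an untested sketch; the address passed to CreateRecipient is a placeholder:
// Sketch: create a Recipient from an SMTP address (or a resolvable display name) and add it to the list.
// outlookNamespace and outlookDistListItem are assumed to be the objects from the code above;
// the address below is a placeholder.
Outlook.Recipient outlookRecipient = outlookNamespace.CreateRecipient("jane.doe@example.com");
outlookRecipient.Resolve(); // resolve against the address book / transport
if (outlookRecipient.Resolved)
{
outlookDistListItem.AddMember(outlookRecipient);
outlookDistListItem.Save();
}
CreateRecipient should accept either a display name that Outlook can resolve or a plain SMTP address; for a plain address, Resolve generally produces a one-off recipient entry.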

Dynamics CRM 2011 LINQ to Find New Records

I have a bunch of custom entity records in a List (which comes from a csv file).
What is the best way to check which records are new and create those that are?
The equality check is based on a single text field in this case, but I need to do the same thing elsewhere where the equality check is based on a lookup and 2 text fields.
For argument's sake, let's say I was inserting Account records; this is what I currently have:
private void CreateAccounts()
{
var list = this.GetAccounts(); // get the list of Accounts, some may be new
IEnumerable<string> existingAccounts = linq.AccountSet.Select(account => account.AccountNumber); // get all Account numbers in CRM, linq is a serviceContextName variable
var newAccounts = list.Where(account => !existingAccounts.Contains(account.AccountNumber)); // Account numbers not in CRM
foreach (var accountNumber in newAccounts.Select(a => a.AccountNumber)) // go through the new account numbers and look up the full Account info
{
var account = newAccounts.Where(newAccount => newAccount.AccountNumber.Equals(accountNumber)).FirstOrDefault();
service.Create(account);
}
}
Is there a better way to do this?
I seem to be iterating through lists too many times, but it must be better than querying CRM multiple times:
foreach (var account in list)
{
// is this Account already in CRM
// if not create the Account
}
Your current method seems a bit backwards (get everything out of CRM, then compare it to what you have locally), but it may not be too bad depending on how many accounts you have, i.e. fewer than 5,000 or so.
For your simple example, you should be able to apply a where-in style check against the list of existing account numbers, much as you are doing now.
Joining on multiple fields is a little trickier. If you are running CRM 2011 at Rollup 12 or later, you should be able to use ExecuteMultipleRequest, creating a separate request for each item in your list and then batching them all up, so there is one big request "over the wire" to CRM.
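Roughly, and as an untested sketch, the batched create could look like this (types come from Microsoft.Xrm.Sdk and Microsoft.Xrm.Sdk.Messages; service and newAccounts are the variables from the question):
// Sketch: batch all creates into a single ExecuteMultipleRequest (available from CRM 2011 Rollup 12).
// Note that a single batch is limited to 1000 requests, so very large lists need to be chunked.
var batch = new ExecuteMultipleRequest
{
    Settings = new ExecuteMultipleSettings
    {
        ContinueOnError = true,  // keep going if one create fails
        ReturnResponses = false  // we only need the creates to run
    },
    Requests = new OrganizationRequestCollection()
};
foreach (var account in newAccounts)
{
    batch.Requests.Add(new CreateRequest { Target = account });
}
var response = (ExecuteMultipleResponse)service.Execute(batch); // one round trip to CRM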

How to prevent multiple users from adding an item to a SharePoint list simultaneously

I am using a simple form to allow people to sign up for an event. Their details are saved to a SharePoint list. I have a quota of people who can sign up for an event (say, 100 people).
How can I prevent the 100th and the 101st person from signing up concurrently, causing the quota check to allow the 101st person to sign up (because the 100th person isn't in the list yet)?
Place the ItemAdding code inside a lock statement to make sure that only one thread at a time can enter the critical section of code:
private static readonly object _lock = new object(); // static so every receiver instance shares the same lock
public override void ItemAdding(SPItemEventProperties properties)
{
lock(_lock)
{
// check number of the list items and cancel the event if necessary
}
}
I came up with the following idea for a solution for a farm with multiple WFEs - a shared resource (a row in a table, in the pseudo-code below) gets locked while the item is added to the list:
private static readonly object _lock = new object(); // static so every receiver instance shares the same lock
public override void ItemAdding(SPItemEventProperties properties)
{
try
{
// 1. begin a SQL Server transaction
// 2. UPDATE dbo.SEMAPHORE
// SET STATUS = 'Busy'
// WHERE PROCESS = 'EventSignup'
lock(_lock)
{
// 3. check number of the list items and cancel the event if necessary
}
}
finally
{
// 4. UPDATE dbo.SEMAPHORE
// SET STATUS = ''
// WHERE PROCESS = 'EventSignup'
// 5. commit a SQL Server transaction
}
}
I left the lock statement because I'm not sure what will happen if the same front-end server tries to add items #100 and #101 at the same time - will the transaction lock the row, or will it not, because the same connection to SQL Server will be used?
You can use an event receiver's ItemAdding method. At ItemAdding time your item has not been created yet, so you can calculate the current count of signed-up people, and if it has already reached 100 you can cancel the add.
Of course, more than one ItemAdding call can fire at the same time. To guard against that, you can calculate the current count of people, increase it by one, and keep that value somewhere else (on a field on the event item, perhaps); every ItemAdding call can then check that value before adding the item.
The ItemAdded method is too late for these operations.
This is the solution I would use; a minimal sketch of the ItemAdding check is below.
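A minimal, untested sketch of that ItemAdding check (the quota of 100, the class name, and the error text are assumptions; the types come from Microsoft.SharePoint):
// Sketch: cancel the add when the list has already reached the quota.
public class SignUpQuotaReceiver : SPItemEventReceiver
{
private const int Quota = 100; // assumed quota

public override void ItemAdding(SPItemEventProperties properties)
{
using (SPWeb web = properties.OpenWeb())
{
SPList list = web.Lists[properties.ListId];
if (list.ItemCount >= Quota)
{
properties.Cancel = true; // the item is never created
properties.ErrorMessage = "Sorry, the event is already full.";
}
}
}
}
On its own this is still subject to the race the question describes; the counter-field or semaphore ideas above are what close that gap. The sketch only shows where the check and the cancellation go.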
I guess if you are updating a column, let's say "SignUp Count", then one of the users will get a save conflict error. Whoever updates the value first wins, and the second one will fail.
Regards,
Nitin Rastogi

Need advice on designing an index to keep track of users that are not subscribed to a document in the system

I have a table for Projects, a table for Documents (with a FK to Projects), and a table for users with a relationship to projects and documents. Users can be subscribers to a document and that is how they have a relationship to the document.
Users are considered team members on a project. A document can only be assigned to one project at a time. Only users from the "Project Team" can be subscribed to a document.
What I am struggling with, in my application, is that admins are able to add users as subscribers to a document. However, the admin can only pick from a list of unsubscribed users. Those unsubscribed users are members of the project team, like I said above.
Right now, I am creating two queries on my database to pull all the subscribed users and the entire list of team members. I then compare the lists and only pull the team members that are not subscribed.
I am not sure if I should even index this type of data or just pull from the database directly. If I should use an index, this data needs to be updated quickly because the admins need that unsubscribed list rather fast.
This is what my query looks like going against Entity Framework 4.1:
var currentSubscribers = _subscriptionRepository.Get(s => s.DocumentId == documentId).Select(s => s.Contact).ToList();
if (projecTeamMembers != null)
{
var availableSubscribers = (projecTeamMembers.Where(
p => !(currentSubscribers.Select(c => c.Id)).Contains(p.Id))).ToDictionary(c => c.Id, c=> c.LastName + ", " + c.FirstName);
return availableSubscribers;
}
else
{
return null;
}
This works great in EF, but I have been thinking of indexing my data using Lucene.Net and need some advice on if I should do this or not.
Write this query inside your data repository, where you have access to the database context:
var q = from m in dbContext.ProjecTeamMembers
        where !(from s in dbContext.Subscribers
                where s.DocumentId == documentId &&
                      s.Contact.Id == m.Id
                select s).Any()
        select m;
var availableSubscribers = q.ToDictionary(m => m.Id, m => m.LastName + ", " + m.FirstName);
The best way I have found to do something like this, using Lucene.Net, is to keep an index of subscribers and an index of all team members and compare the two. It's faster than pulling from the database each time.
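As a rough, untested sketch of what the lookup side of that could look like with Lucene.Net 3.x (the index path and the "contactId" field name are assumptions, and the subscriber index is assumed to already be built and kept up to date):
// Sketch: treat "no hit in the subscribers index" as "available to subscribe".
var directory = FSDirectory.Open(new DirectoryInfo(@"C:\indexes\subscribers")); // assumed index location
using (var searcher = new IndexSearcher(directory, true)) // read-only searcher
{
var availableSubscribers = projecTeamMembers
    .Where(p => searcher.Search(new TermQuery(new Term("contactId", p.Id.ToString())), 1).TotalHits == 0)
    .ToDictionary(c => c.Id, c => c.LastName + ", " + c.FirstName);
return availableSubscribers;
}
Whether this beats the straight EF query depends on how often the data changes and how quickly you can keep the index current; for modest team sizes the database query above may well be fast enough.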

EF4 Import/Lookup thousands of records - my performance stinks!

I'm trying to setup something for a movie store website (using ASP.NET, EF4, SQL Server 2008), and in my scenario, I want to allow a "Member" store to import their catalog of movies stored in a text file containing ActorName, MovieTitle, and CatalogNumber as follows:
Actor, Movie, CatalogNumber
John Wayne, True Grit, 4577-12 (repeated for each record)
This data will be used to look up an actor and movie and create a "MemberMovie" record, and my import speed is terrible if I import more than 100 or so records using these tables:
Actor Table: Fields = {ID, Name, etc.}
Movie Table: Fields = {ID, Title, ActorID, etc.}
MemberMovie Table: Fields = {ID, CatalogNumber, MovieID, etc.}
My methodology to import data into the MemberMovie table from a text file is as follows (after the file has been uploaded successfully):
Create a context.
For each line in the file, look up the actor in the Actor table.
For each Movie belonging to that Actor, look up the matching title.
If a matching Movie is found, add a new MemberMovie record to the context and call ctx.SaveChanges().
The performance of my implementation is terrible. My expectation is that this can be done with thousands of records in a few seconds (after the file has been uploaded), and I've got something that times out the browser.
My question is this: What is the best approach for performing bulk lookups/inserts like this? Should I call SaveChanges only once rather than for each newly created MemberMovie? Would it be better to implement this using something like a stored procedure?
A snippet of my loop is roughly this (edited for brevity):
while ((fline = file.ReadLine()) != null)
{
string[] token = fline.Split(separator);
string actor = token[0];
string movie = token[1];
string catNumber = token[2];
Actor found_actor = ctx.Actors.Where(a => a.Name.Equals(actor)).FirstOrDefault();
if (found_actor == null)
continue;
Movie found_movie = found_actor.Movies.Where(s => s.Title.Equals(movie, StringComparison.CurrentCultureIgnoreCase)).FirstOrDefault();
if (found_movie == null)
continue;
ctx.MemberMovies.AddObject(new MemberMovie()
{
MemberProfileID = profile_id,
CatalogNumber = catNumber,
Movie = found_movie
});
try
{
ctx.SaveChanges();
}
catch
{
}
}
Any help is appreciated!
Thanks, Dennis
First:
Some time ago I wrote an answer about calling SaveChanges after 1, n or all rows:
When should I call SaveChanges() when creating 1000's of Entity Framework objects? (like during an import)
It is actually better to call SaveChanges after every batch of rows rather than after every single row or only once at the very end.
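As a rough sketch of that pattern plugged into the loop from the question (the batch size of 100 is an arbitrary assumption worth tuning):
// Sketch: call SaveChanges every N rows instead of once per row or once at the very end.
int pending = 0;
const int batchSize = 100; // assumed value; measure to find the sweet spot
while ((fline = file.ReadLine()) != null)
{
// ... look up the actor and movie and AddObject the MemberMovie as in the question ...
pending++;
if (pending >= batchSize)
{
ctx.SaveChanges(); // flush a batch of inserts in one go
pending = 0;
}
}
if (pending > 0)
{
ctx.SaveChanges(); // flush whatever is left over
}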
Second:
Make sure you have an index on Name in the Actors table and on Title in Movies; that should help. Also, you shouldn't select the whole Actor if you only need its ID:
Instead of:
Actor found_actor = ctx.Actors.Where(a => a.Name.Equals(actor)).FirstOrDefault();
you can select:
int? found_actor_id = ctx.Actors.Where(a => a.Name.Equals(actor)).Select(a => (int?)a.ID).FirstOrDefault(); // the cast keeps "not found" as null rather than 0
and then
Something.ActorID = found_actor_id;
This can be faster because it doesn't require materializing the whole Actor entity and doesn't require additional lookups, especially when combined with an index.
Third:
If you send a very large file, there is still a chance of a timeout, even with good performance. You should run this import on a separate thread and return the response immediately. You can give every import some kind of identifier and allow the user to check its status by that ID, as sketched below.
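A rough sketch of that idea (the status dictionary, the RunImport method, and the status strings are all assumptions for illustration; ConcurrentDictionary and Task come from System.Collections.Concurrent and System.Threading.Tasks):
// Sketch: kick off the import in the background and hand back an id the page can poll for status.
private static readonly ConcurrentDictionary<Guid, string> ImportStatus = new ConcurrentDictionary<Guid, string>();

public Guid StartImport(string uploadedFilePath)
{
Guid importId = Guid.NewGuid();
ImportStatus[importId] = "Running";
Task.Factory.StartNew(() =>
{
try
{
RunImport(uploadedFilePath); // the lookup/insert loop from the question, with batched SaveChanges
ImportStatus[importId] = "Completed";
}
catch (Exception ex)
{
ImportStatus[importId] = "Failed: " + ex.Message;
}
});
return importId; // the page polls GetStatus(importId) to see how the import is going
}

public string GetStatus(Guid importId)
{
string status;
return ImportStatus.TryGetValue(importId, out status) ? status : "Unknown";
}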
