Looking for PendingResult await() equivalent in New Places SDK Client - google-places-api

Background: I have a list of strings containing place IDs. Once a user has selected a location, a loop runs and determines whether each place in the list (I obtain its location from the place ID) is near the selected location. I was able to implement this with the old Places SDK but could not migrate it to the new SDK, because the new SDK seems to have no await() equivalent.
Here is my old code:
// Contains a list of Offices; each has a getId() method that returns the Google Place ID.
List<Office> results = obtained from the database...
// Go through each office and find those near the user's location
for (int i = 0; i < results.size(); i++) {
// Get the place from the placeID
PendingResult<PlaceBuffer> placeResult = Places.GeoDataApi.
getPlaceById(mGoogleApiClient, results.get(i).getId());
// wait for the result to come out (NEED EQUIVALENT IN NEW PLACES SDK)
PlaceBuffer places = placeResult.await();
// Get the latitude and longitude for the specific Location
LatLng latLng = places.get(0).getLatLng();
// Set the location object for the specific business
Location A = new Location("Business");
A.setLatitude(latLng.latitude);
A.setLongitude(latLng.longitude);
// get the distance of the business from the user's selected location
float distance = A.distanceTo(mSelectedLocation);
// if the distance is less than 50m away
if (distance < 50) { /* ... do something in code ... */ }
}
As you can see in the code above, the old Places SDK has a PendingResult class with await() as one of its methods. Per the documentation, await() "blocks until the task is completed." In summary, the code will not proceed until a result is obtained from getPlaceById.
I migrated to the new Places SDK per the documentation and ran into issues. Here is my migrated code, based on the Google documentation: https://developers.google.com/places/android-sdk/client-migration#fetch_a_place_by_id
for (int i = 0; i < results.size(); i++) {
// Get the place Id
String placeId = results.get(i).getId();
// Specify the fields to return.
List<Place.Field> placeFields = Arrays.asList(Place.Field.ID, Place.Field.NAME,
Place.Field.LAT_LNG, Place.Field.ADDRESS);
// Construct a request object, passing the place ID and fields array.
FetchPlaceRequest request = FetchPlaceRequest.builder(placeId, placeFields)
.build();
// Add a listener to handle the response.
placesClient.fetchPlace(request).addOnSuccessListener((response) -> {
Place place = response.getPlace();
// Get the latitude and longitude for the specific location
LatLng latLng = place.getLatLng();
// Set the location object for the specific business
Location A = new Location("Business");
A.setLatitude(latLng.latitude);
A.setLongitude(latLng.longitude);
// get the distance of the business from the selected location
float distance = A.distanceTo(mSelectedLocation);
// if the distance is less than 50m away
if (distance < 50) { /* ... do something in code ... */ }
});
}
The key issue is that in the old code await() blocks until the result is available, so the for loop waits at each iteration. That is not the case with OnSuccessListener: the loop proceeds and completes even though fetchPlace has not yet produced a result for each iteration. As a result, the code is broken and unable to get the results it needs.
Is there a way to block until fetchPlace is completed?

As far as I'm aware, any Google API task can be waited on with Google's Tasks API.
For example, findAutocompletePredictions returns a Task<> object. Instead of adding an OnCompleteListener, you can pass that task to Tasks.await().
Instead of this non-blocking way:
OnCompleteListener<T> onCompleteListener =
        new OnCompleteListener<T>() { ... };
placesClient.findAutocompletePredictions(f)
        .addOnCompleteListener(onCompleteListener);
You could pass it on to Tasks.await() and make the API call blocking:
T results = null;
try {
    // No timeout:
    results = Tasks.await(placesClient.findAutocompletePredictions(f));
    // Or, with a 30-second timeout:
    results = Tasks.await(
            placesClient.findAutocompletePredictions(f), 30, TimeUnit.SECONDS);
} catch (ExecutionException e) {
    // The task failed; handle the underlying exception.
} catch (TimeoutException e) {
    // Only thrown when a timeout is set.
} catch (InterruptedException e) {
    // The waiting thread was interrupted.
}
if (results != null) {
// Do something
} else {
// Do another thing
}
Basically, instead of getting a PendingResult, you're now given a Task<T> that you can use however you like.

I solved the issue by using the Task class. See below:
for (int position = 0; position < results.size(); position++) {
// Get the placeID
String placeId = results.get(position).getId();
// Specify the fields to return.
List<Place.Field> placeFields = Arrays.asList(Place.Field.ID, Place.Field.NAME,
Place.Field.LAT_LNG, Place.Field.ADDRESS);
// Construct a request object, passing the place ID and fields array.
FetchPlaceRequest request = FetchPlaceRequest.builder(placeId, placeFields)
.build();
// create a FetchPlaceResponse task
Task<FetchPlaceResponse> task = placesClient.fetchPlace(request);
try {
FetchPlaceResponse response = Tasks.await(task);
Place place = response.getPlace();
// Get the latitude and longitude for the specific place
LatLng latLng = place.getLatLng();
// Set the location object for the specific business
Location A = new Location("Business");
A.setLatitude(latLng.latitude);
A.setLongitude(latLng.longitude);
// get the distance of the business from the selected location
float distance = A.distanceTo(mSelectedLocation);
        // ... do something with the distance ...
    } catch (ExecutionException | InterruptedException e) {
        // Handle the failed or interrupted task.
    }
}
These two lines ask the system to wait for the response:
Task<FetchPlaceResponse> task = placesClient.fetchPlace(request);
FetchPlaceResponse response = Tasks.await(task);
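One caveat, based on the Tasks API documentation: Tasks.await() cannot be called on the Android main thread (it throws an IllegalStateException there), so a blocking loop like this should run on a background thread, e.g. via an AsyncTask or an executor.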

Related

How does the location cache work?

I have an application that gets the device's GPS location every time the user fills out the form. My problem is that capturing the GPS location takes too long: about 40 to 60 seconds before a fix is captured. I am using jamesmontemagno's Geolocator plugin.
GPS Parameters:
Accuracy: 100 meters
Timeout: 1 minute
Here is my code that I am using right now:
var defaultgpsaccuracy = Convert.ToDouble(Preferences.Get("gpsaccuracy", String.Empty, "private_prefs"));
var defaultgpstimeout = Convert.ToDouble(Preferences.Get("gpstimeout", String.Empty, "private_prefs"));
var locator = CrossGeolocator.Current;
locator.DesiredAccuracy = defaultgpsaccuracy;
position = await locator.GetLastKnownLocationAsync();
if (position != null)
{
string location = position.Latitude + "," + position.Longitude;
lblStartLocation.Text = location;
}
else
{
position = await locator.GetPositionAsync(TimeSpan.FromMinutes(defaultgpstimeout), null, false);
string location = position.Latitude + "," + position.Longitude;
lblStartLocation.Text = location;
}
These are my questions:
I used locator.GetLastKnownLocationAsync(); how long before the location cache refreshes?
Does the last known location refresh when there is a change of location?
And does the cache refresh when the device moves outside the accuracy range? For example, with an accuracy of 100 meters, does the location cache refresh once the device is more than 100 meters from the last known location?
I would highly recommend looking through the documentation for that particular plugin. James Montemagno is a well-known and respected developer employed by Microsoft working on the Xamarin framework, so his plugins, extensions and toolkits tend to be pretty highly optimised for use in cross-platform applications.
Looking at the documentation, it's clear that getting the last 'known' location reads from an internally cached location data set and is not necessarily optimized for near real-time queries. However, it can be used to reduce the number of actual location queries you have to do within your app.
The full snippet from the linked documentation is as follows:
public static async Task<Position> GetCurrentPosition()
{
Position position = null;
try
{
var locator = CrossGeolocator.Current;
locator.DesiredAccuracy = 100;
position = await locator.GetLastKnownLocationAsync();
if (position != null)
{
//got a cached position, so let's use it.
return position;
}
if (!locator.IsGeolocationAvailable || !locator.IsGeolocationEnabled)
{
//not available or enabled
return null;
}
position = await locator.GetPositionAsync(TimeSpan.FromSeconds(20), null, true);
}
catch (Exception ex)
{
Debug.WriteLine("Unable to get location: " + ex);
}
if (position == null)
return null;
var output = string.Format("Time: {0} \nLat: {1} \nLong: {2} \nAltitude: {3} \nAltitude Accuracy: {4} \nAccuracy: {5} \nHeading: {6} \nSpeed: {7}",
position.Timestamp, position.Latitude, position.Longitude,
position.Altitude, position.AltitudeAccuracy, position.Accuracy, position.Heading, position.Speed);
Debug.WriteLine(output);
return position;
}
This method is designed to follow a hierarchical pattern of location retrieval: it starts with the last 'known' position and only queries the device if necessary. You could also take this a step further and add a timeout to the cached-location check if you wanted to.
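For example, here is a minimal sketch of such a timeout with the same plugin (the 2-second figure is an arbitrary illustration, and the fallback mirrors the snippet above):
var locator = CrossGeolocator.Current;
// Give the cached-location lookup at most 2 seconds before
// falling through to a real GPS query.
var cachedTask = locator.GetLastKnownLocationAsync();
var finished = await Task.WhenAny(cachedTask, Task.Delay(TimeSpan.FromSeconds(2)));
Position position = null;
if (finished == cachedTask && cachedTask.Status == TaskStatus.RanToCompletion)
{
    // The cached lookup finished in time (it may still return null).
    position = cachedTask.Result;
}
if (position == null)
{
    // No usable cached fix; fall back to a full query.
    position = await locator.GetPositionAsync(TimeSpan.FromSeconds(20), null, true);
}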
In regards to how often this data is refreshed, I'd refer you to the documentation section titled 'Background Updates'.
James talks about a driving app as an example of how this works. The refresh is handled differently across Android and iOS, but here's the snippet regarding Android:
For this you will want to integrate a foreground service that
subscribes to location changes and the user interface binds to. Please
read through the Xamarin.Android Services documentation
In his code example he shows you how to create a 'listener' that will check for changes to the location periodically, along the lines of the sketch below. This might be a better fit for what you're trying to do, depending on the purpose of your application.
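A minimal sketch of that listener shape with this same plugin (the 5-second/10-meter thresholds are illustrative values, not recommendations):
var locator = CrossGeolocator.Current;
// Fire PositionChanged at most every 5 seconds / 10 meters of movement.
locator.PositionChanged += (sender, e) =>
{
    var pos = e.Position;
    Debug.WriteLine("New fix: " + pos.Latitude + "," + pos.Longitude);
};
await locator.StartListeningAsync(TimeSpan.FromSeconds(5), 10);
// ... later, when updates are no longer needed:
await locator.StopListeningAsync();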

Various errors using VisionServiceClient in XamarinForms

I am trying to create a simple Xamarin.Forms app that allows the user to browse for or take a photo and have Azure Cognitive Services tag the photo using a custom vision model.
I am unable to get the client to successfully authenticate or find a resource, per the error message in the exception produced by the VisionServiceClient. Am I missing something? What would be the correct values to use for the arguments to VisionServiceClient?
All keys have been removed from the code and screenshots below; they are populated in the actual app.
Exception thrown in VS2017:
'Microsoft.ProjectOxford.Vision.ClientException' in System.Private.CoreLib.dll
Call to VisionServiceClient:
private const string endpoint = @"https://eastus2.api.cognitive.microsoft.com/vision/prediction/v1.0";
private const string key = "";
VisionServiceClient visionClient = new VisionServiceClient(key, endpoint);
VisualFeature[] features = { VisualFeature.Tags, VisualFeature.Categories, VisualFeature.Description };
try
{
AnalysisResult temp = await visionClient.AnalyzeImageAsync(imageStream,
features.ToList(), null);
return temp;
}
catch(Exception ex)
{
return null;
}
Screenshots (omitted here): the VS exception details, the Azure portal page for the Cognitive Services resource, and the Custom Vision portal.
It looks like you're confusing the Computer Vision and the Custom Vision APIs. You are attempting to use the client SDK for the former using the API key of the latter.
For .NET languages, you'll want the Microsoft.Azure.CognitiveServices.Vision.CustomVision.Prediction NuGet package.
Your code will end up looking something like this:
ICustomVisionPredictionClient client = new CustomVisionPredictionClient()
{
ApiKey = PredictionKey,
Endpoint = "https://southcentralus.api.cognitive.microsoft.com"
};
ImagePrediction prediction = await client.PredictImageAsync(ProjectId, stream, IterationId);
Thank you to cthrash for the extended help and for talking with me in chat. Using his post, along with a little troubleshooting, I figured out what works for me. The code is super clunky, but it was just to test and make sure I'm able to do this. To answer the question:
Nuget packages and classes
Using cthrash's post I was able to get both the training and prediction NuGet packages installed, which are the correct packages for this particular application. I needed the following namespaces:
Microsoft.Azure.CognitiveServices.Vision.CustomVision.Prediction
Microsoft.Azure.CognitiveServices.Vision.CustomVision.Prediction.Models
Microsoft.Azure.CognitiveServices.Vision.CustomVision.Training
Microsoft.Azure.CognitiveServices.Vision.CustomVision.Training.Models
Endpoint Root
Following some of the steps here, I determined that the endpoint URLs only need to be the root, not the full URL provided in the Custom Vision Portal. For instance,
https://southcentralus.api.cognitive.microsoft.com/customvision/v2.0/Prediction/
Was changed to
https://southcentralus.api.cognitive.microsoft.com
I used the key and endpoint from the Custom Vision Portal, and after making that change I was able to use both a training and a prediction client to pull the projects and iterations.
Getting Project Id
In order to use CustomVisionPredictionClient.PredictImageAsync you need a Guid for the project id and an iteration id if a default iteration is not set in the portal.
I tested two ways to get the project id,
Using project id string from portal
Grab the project id string from the portal under the project settings.
For the first argument to PredictImageAsync pass
Guid.Parse(projectId)
Using the training client
Create a new CustomVisionTrainingClient
To get a List<Project>, use
TrainingClient.GetProjects().ToList()
In my case I only had a single project so I would just need the first element.
Guid projectId = projects[0].Id
Getting Iteration Id
To get the iteration id of a project you need the CustomVisionTrainingClient.
Create the client
To get a List<Iteration>, use
client.GetIterations(projectId).ToList()
In my case I had only a single iteration so I just need the first element.
Guid iterationId = iterations[0].Id
I am now able to use my model to classify images. In the code below, fileStream is the image stream passed to the model.
public async Task<string> Predict(Stream fileStream)
{
string projectId = "";
//string trainingEndpoint = "https://southcentralus.api.cognitive.microsoft.com/customvision/v2.2/Training/";
string trainingEndpoint = "https://southcentralus.api.cognitive.microsoft.com/";
string trainingKey = "";
//string predictionEndpoint = "https://southcentralus.api.cognitive.microsoft.com/customvision/v2.0/Prediction/";
string predictionEndpoint = "https://southcentralus.api.cognitive.microsoft.com";
string predictionKey = "";
CustomVisionTrainingClient trainingClient = new CustomVisionTrainingClient
{
ApiKey = trainingKey,
Endpoint = trainingEndpoint
};
List<Project> projects = new List<Project>();
try
{
projects = trainingClient.GetProjects().ToList();
}
catch(Exception ex)
{
Debug.WriteLine("Unable to get projects:\n\n" + ex.Message);
return "Unable to obtain projects.";
}
Guid ProjectId = Guid.Empty;
if(projects.Count > 0)
{
ProjectId = projects[0].Id;
}
if (ProjectId == Guid.Empty)
{
Debug.WriteLine("Unable to obtain project ID");
return "Unable to obtain project id.";
}
List<Iteration> iterations = new List<Iteration>();
try
{
iterations = trainingClient.GetIterations(ProjectId).ToList();
}
catch(Exception ex)
{
Debug.WriteLine("Unable to obtain iterations.");
return "Unable to obtain iterations.";
}
foreach(Iteration itr in iterations)
{
Debug.WriteLine(itr.Name + "\t" + itr.Id + "\n");
}
Guid iteration = Guid.Empty;
if(iterations.Count > 0)
{
iteration = iterations[0].Id;
}
if(iteration == Guid.Empty)
{
Debug.WriteLine("Unable to obtain project iteration.");
return "Unable to obtain project iteration";
}
CustomVisionPredictionClient predictionClient = new CustomVisionPredictionClient
{
ApiKey = predictionKey,
Endpoint = predictionEndpoint
};
var result = await predictionClient.PredictImageAsync(ProjectId, fileStream, iteration);
string resultStr = string.Empty;
foreach(PredictionModel pred in result.Predictions)
{
if(pred.Probability >= 0.85)
resultStr += pred.TagName + " ";
}
return resultStr;
}

Match EWS Conversation* to Outlook Add-in Conversation*

I wrote an add-in for Outlook years ago that adds entries to a database based on the Item's ConversationIndex/ConversationId properties. This works great and remains uniform across all clients interacting with the messages (e.g. "Bob" can see that "Mary" already processed this message because an entry with the ConversationIndex already exists).
I'm now trying to move this piece to a service (and connect via the EWS API) but I'm not having good luck matching these properties with the values coming from Outlook. For example:
The Outlook Add-In will give me the following values for a specific email I'm targeting:
ConversationID: 6B6369F5023EA646AA7BC161274BDAE8
ConversationIndex: 0101CF3C7EEC6B6369F5023EA646AA7BC161274BDAE8
However, from the EWS API I get the following:
ConversationID: AAQkADFhZThkNmJmLTlkODItNDQyZS1hM2YxLTQ2NWNkMTllYjhjOQAQAGtjafUCPqZGqnvBYSdL2ug=
ConversationIndex: new byte[]{1,1,207,60,126,236,107,99,105,245,2,62,166,70,170,123,193,97,39,75,218,232}
I recognize the first as a Base64-encoded string; however, what I get after decoding doesn't look like anything I recognize (or can decipher). Is anyone familiar with this, or can anyone help get these two values to align? I can only imagine these properties come from the Exchange server in some fashion, but the client probably performs some cleansing, whereas the EWS API just gives me the raw value (minus the Base64, which I presume is for transport purposes given the XML medium).
If anyone is familiar with this or has done it before I would greatly appreciate any guidance.
Side Note:
There are probably better ways to identify emails, but for now I'm stuck with keeping these two synonymous. Modifying the Outlook add-in isn't really an option, and once I migrate a 1:1 translation to the server (and drop the add-in) I'll have flexibility to change how it works. But for now I need them to run side by side: I need to be able to see, from the web server, processing done within Outlook, and vice versa.
Just found out (I think).
Breakdown
With more Googling and a bit more effort I believe I was able to make them align 1:1 using the following:
ConversationId
This is apparently an assembled value made up of several properties. Luckily, I was able to find a method Woodman posted here that re-implements the original algorithm used by Outlook. With some minor modifications (to work with EWS instead of Outlook) I was able to get it to work.
ConversationIndex
This turned out to simply be a matter of using the BitConverter (and removing the hyphens). Easy peasy.
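To make the mapping concrete: running the question's EWS byte array through BitConverter reproduces exactly the value the add-in reports.
// The ConversationIndex bytes EWS returned in the question:
byte[] convIndex = { 1, 1, 207, 60, 126, 236, 107, 99, 105, 245, 2, 62,
                     166, 70, 170, 123, 193, 97, 39, 75, 218, 232 };
// Hex-encode each byte and strip the hyphens:
string outlookStyle = BitConverter.ToString(convIndex).Replace("-", string.Empty);
// outlookStyle == "0101CF3C7EEC6B6369F5023EA646AA7BC161274BDAE8"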
Final Result:
public static class EwsEmailMessageExtensions
{
private const int c_ulConvIndexIDOffset = 6;
private const int c_ulConvIndexIDLength = 16;
private static ExtendedPropertyDefinition PidTagConversationIndexTracking = new ExtendedPropertyDefinition(0x3016, MapiPropertyType.Boolean);
// HUGE props to Woodman
// https://stackoverflow.com/a/21625224/298053
public static string GetOutlookConversationId(this EmailMessage emailMessage)
{
Boolean convTracking;
if (!emailMessage.TryGetProperty(PidTagConversationIndexTracking, out convTracking))
{
convTracking = true;
}
var convIndex = emailMessage.ConversationIndex;
byte[] idBytes;
if (convTracking && convIndex != null && convIndex.Length > 0)
{
// get Id from Conversation index
idBytes = new byte[c_ulConvIndexIDLength];
Array.Copy(convIndex, c_ulConvIndexIDOffset, idBytes, 0, c_ulConvIndexIDLength);
}
else
{
// get Id from Conversation topic
var topic = emailMessage.ConversationTopic;
if (string.IsNullOrEmpty(topic))
{
return string.Empty;
}
if (topic.Length >= 265)
{
topic = topic.Substring(0, 256);
}
topic = topic.ToUpper();
using (var md5 = new System.Security.Cryptography.MD5CryptoServiceProvider())
{
idBytes = md5.ComputeHash(Encoding.Unicode.GetBytes(topic));
}
}
return BitConverter.ToString(idBytes).Replace("-", string.Empty);
}
public static String GetOutlookConversationIndex(this EmailMessage emailMessage)
{
var convIndex = emailMessage.ConversationIndex;
return BitConverter.ToString(convIndex).Replace("-", String.Empty);
}
}
Usage:
// Prep
ExchangeService service = new ExchangeService(...);
Folder inbox = Folder.Bind(service, WellKnownFolderName.Inbox);
Item item = /* inbox.FindItems(...).First() */;
// Implementation
EmailMessage emailMessage = item as EmailMessage;
if (emailMessage != null)
{
String conversationId = emailMessage.GetOutlookConversationId();
String conversationIndex = emailMessage.GetOutlookConversationIndex();
/* ... */
}

Dynamics CRM 2011 Bulk Update

Running Dynamics CRM 2011 Update Rollup 3. We need to update millions of customer records periodically (delta updates). Using the standard update (one by one) takes a few weeks. We also don't want to touch the DB directly, as it may break things in the future.
Is there a bulk update method in the Dynamics CRM 2011 web service/REST API we can use? (What, where, how?)
I realize this post is over 2 years old, but I can add to it in case someone else reads it and has a similar need.
Peter Majeed's answer is on target in that CRM processes requests one record at a time. There is no bulk edit that works the way you are looking for. I encourage you not to touch the DB directly if you need/want Microsoft support.
If you are looking at periodic updates of millions of records, you have a few options: consider using Scribe, or develop your own custom import utility or script using the CRM SDK.
Scribe is probably going to be your best option since it is cost effective for data imports and will allow you to easily update and insert from the same file.
If you write your own .NET/SDK-based utility, I'd suggest making it multithreaded: either programmatically break up your input file in memory or on disk and have each thread work with its own subset of the data - that is, of course, if the order of execution does not have to be chronological according to the contents of the input file. If you can divide and conquer the input file over multiple threads, you can reduce the overall execution time considerably, along the lines of the sketch below.
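A minimal sketch of that divide-and-conquer shape with the SDK (CreateService() and LoadRecordsToUpdate() are hypothetical helpers standing in for your connection setup and data access):
// Split the records across threads; each thread pushes its own subset.
List<Entity> records = LoadRecordsToUpdate();
var chunks = records
    .Select((e, i) => new { e, i })
    .GroupBy(x => x.i % Environment.ProcessorCount)
    .Select(g => g.Select(x => x.e).ToList())
    .ToList();
Parallel.ForEach(chunks, chunk =>
{
    // One connection per thread, to avoid sharing a service proxy across threads.
    IOrganizationService service = CreateService();
    foreach (Entity entity in chunk)
    {
        service.Update(entity); // still one record at a time, just in parallel
    }
});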
Also, if your corporate policy allows you to have access to one of the CRM Servers and you can place your code directly on the server and execute it from there - you can eliminate the network latency between a workstation running the code and the CRM web services.
Last but not least, if this large volume of import data is coming from another system, you can write a CRM plug-in to run on the Retrieve and RetrieveMultiple messages (events) in CRM for your specific entity, programmatically retrieve the desired data from the other system (and, if the other system is unavailable, just use the cached copy in CRM), and keep CRM up to date in real time or on a 'last-cached-on' basis. This is certainly more coding effort, but it potentially eliminates the need for a large synchronization job to be run every few weeks.
Yes and no, mostly no. Someone can correct me if I'm mistaken, in which case I'll gladly edit/delete my answer, but everything that's done in Dynamics CRM is done one at a time. It doesn't even try to handle set-based inserts/updates/deletes. So unless you go straight to direct DB operations, it will take you weeks.
The web service does allow for "bulk" inserts/deletes/updates, but I put "bulk" in quotes because all it does is set up an asynchronous process that performs all the relevant data operations (yep, one at a time). There's a section of the SDK that addresses this sort of data management (linked). To update records this way, you'd first have to suffer the overhead of selecting all the data you want to update, then create an XML file that contains the data, and finally update the data (remember: one row at a time). So it would actually be more efficient to just loop through your data and issue an Update request for each record yourself.
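A minimal sketch of that plain loop (assuming an already-configured IOrganizationService and a pre-selected set of records):
// Issue one Update per record - slow, but with none of the import-file overhead.
foreach (Entity entity in entitiesToUpdate)
{
    service.Update(entity);
}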
(I will note that our org hasn't experienced any memorable issues regarding direct DB access to handle what the SDK doesn't, nor have I seen anything in my personal internet readings that suggest others have.)
Edit:
See iFirefly's answer below for some other excellent ways to address this issue.
I realize this is an old question, but it comes up high for "CRM Bulk Update", so the Update Rollup 12 feature ExecuteMultiple needs to be mentioned here. It's not going to work around your issue (massive volume), because, as iFirefly and Peter point out, CRM does everything one at a time. What it does do is package all your requests into a single envelope, letting CRM handle the execution of each update and reducing the number of round trips between your app and the server if you do end up issuing an Update request for every record.
This is quite an old question, but nobody has mentioned the fastest (though also the most challenging) way of updating/creating huge numbers of records in CRM 201X: the built-in import feature, which is entirely doable with the CRM SDK. There is a good MSDN article about it:
https://msdn.microsoft.com/en-us/library/gg328321(v=crm.5).aspx. In short, you have to:
1) Build an Excel file containing the data you want to import (simply export some data from CRM 201X and check how the structure looks; remember that the first 3 columns are hidden)
2) Create an Import Map entity (specifying the file you created)
3) Create column mappings if necessary
4) Create Import and ImportFile entities, providing the proper mappings
5) Parse the data using ParseImportRequest
6) Transform the data using TransformImportRequest
7) Import the data using ImportRecordsImportRequest
These were the steps for CRM 2011; now, in 2017, we have more versions available, and there are slight differences between them. Check the sample that is available on MSDN and in the SDK:
https://msdn.microsoft.com/en-us/library/hh547396(v=crm.5).aspx
Of course point 1 will be the most difficult part, because you have to build an XML or Excel file corresponding exactly to what CRM expects, but I'm assuming you are doing it from an external app, so you can use some great .NET libraries that will make things much simpler.
I never saw anything faster than the standard CRM import when it comes to updating/creating records, even when using parallelism and batched update requests.
In case something goes wrong with the MSDN sites, here is the example from the link above showing how to import data into CRM programmatically:
using System;
using System.ServiceModel;
using System.Collections.Generic;
using System.Linq;
// These namespaces are found in the Microsoft.Xrm.Sdk.dll assembly
// located in the SDK\bin folder of the SDK download.
using Microsoft.Xrm.Sdk;
using Microsoft.Xrm.Sdk.Query;
using Microsoft.Xrm.Sdk.Client;
using Microsoft.Xrm.Sdk.Messages;
using Microsoft.Xrm.Sdk.Metadata;
// These namespaces are found in the Microsoft.Crm.Sdk.Proxy.dll assembly
// located in the SDK\bin folder of the SDK download.
using Microsoft.Crm.Sdk.Messages;
namespace Microsoft.Crm.Sdk.Samples
{
/// <summary>
/// This sample shows how to define a complex mapping for importing and then use the
/// Microsoft Dynamics CRM 2011 API to bulk import records with that mapping.
/// </summary>
public class ImportWithCreate
{
#region Class Level Members
private OrganizationServiceProxy _serviceProxy;
private DateTime _executionDate;
#endregion
/// <summary>
/// This method first connects to the organization service. Afterwards,
/// auditing is enabled on the organization, account entity, and a couple
/// of attributes.
/// </summary>
/// <param name="serverConfig">Contains server connection information.</param>
/// <param name="promptforDelete">When True, the user will be prompted to delete all
/// created entities.</param>
public void Run(ServerConnection.Configuration serverConfig, bool promptforDelete)
{
using (_serviceProxy = ServerConnection.GetOrganizationProxy(serverConfig))
{
// This statement is required to enable early bound type support.
_serviceProxy.EnableProxyTypes();
// Log the start time to ensure deletion of records created during execution.
_executionDate = DateTime.Today;
ImportRecords();
DeleteRequiredRecords(promptforDelete);
}
}
/// <summary>
/// Imports records to Microsoft Dynamics CRM from the specified .csv file.
/// </summary>
public void ImportRecords()
{
// Create an import map.
ImportMap importMap = new ImportMap()
{
Name = "Import Map " + DateTime.Now.Ticks.ToString(),
Source = "Import Accounts.csv",
Description = "Description of data being imported",
EntitiesPerFile =
new OptionSetValue((int)ImportMapEntitiesPerFile.SingleEntityPerFile),
EntityState = EntityState.Created
};
Guid importMapId = _serviceProxy.Create(importMap);
// Create column mappings.
#region Column One Mappings
// Create a column mapping for a 'text' type field.
ColumnMapping colMapping1 = new ColumnMapping()
{
// Set source properties.
SourceAttributeName = "src_name",
SourceEntityName = "Account_1",
// Set target properties.
TargetAttributeName = "name",
TargetEntityName = Account.EntityLogicalName,
// Relate this column mapping with the data map.
ImportMapId =
new EntityReference(ImportMap.EntityLogicalName, importMapId),
// Force this column to be processed.
ProcessCode =
new OptionSetValue((int)ColumnMappingProcessCode.Process)
};
// Create the mapping.
Guid colMappingId1 = _serviceProxy.Create(colMapping1);
#endregion
#region Column Two Mappings
// Create a column mapping for a 'lookup' type field.
ColumnMapping colMapping2 = new ColumnMapping()
{
// Set source properties.
SourceAttributeName = "src_parent",
SourceEntityName = "Account_1",
// Set target properties.
TargetAttributeName = "parentaccountid",
TargetEntityName = Account.EntityLogicalName,
// Relate this column mapping with the data map.
ImportMapId =
new EntityReference(ImportMap.EntityLogicalName, importMapId),
// Force this column to be processed.
ProcessCode =
new OptionSetValue((int)ColumnMappingProcessCode.Process),
};
// Create the mapping.
Guid colMappingId2 = _serviceProxy.Create(colMapping2);
// Because we created a column mapping of type lookup, we need to specify lookup details in a lookupmapping.
// One lookupmapping will be for the parent account, and the other for the current record.
// This lookupmapping is important because without it the current record
// cannot be used as the parent of another record.
// Create a lookup mapping to the parent account.
LookUpMapping parentLookupMapping = new LookUpMapping()
{
// Relate this mapping with its parent column mapping.
ColumnMappingId =
new EntityReference(ColumnMapping.EntityLogicalName, colMappingId2),
// Force this column to be processed.
ProcessCode =
new OptionSetValue((int)LookUpMappingProcessCode.Process),
// Set the lookup for an account entity by its name attribute.
LookUpEntityName = Account.EntityLogicalName,
LookUpAttributeName = "name",
LookUpSourceCode =
new OptionSetValue((int)LookUpMappingLookUpSourceCode.System)
};
// Create the lookup mapping.
Guid parentLookupMappingId = _serviceProxy.Create(parentLookupMapping);
// Create a lookup on the current record's "src_name" so that this record can
// be used as the parent account for another record being imported.
// Without this lookup, no record using this account as its parent will be imported.
LookUpMapping currentLookUpMapping = new LookUpMapping()
{
// Relate this lookup with its parent column mapping.
ColumnMappingId =
new EntityReference(ColumnMapping.EntityLogicalName, colMappingId2),
// Force this column to be processed.
ProcessCode =
new OptionSetValue((int)LookUpMappingProcessCode.Process),
// Set the lookup for the current record by its src_name attribute.
LookUpAttributeName = "src_name",
LookUpEntityName = "Account_1",
LookUpSourceCode =
new OptionSetValue((int)LookUpMappingLookUpSourceCode.Source)
};
// Create the lookup mapping
Guid currentLookupMappingId = _serviceProxy.Create(currentLookUpMapping);
#endregion
#region Column Three Mappings
// Create a column mapping for a 'picklist' type field
ColumnMapping colMapping3 = new ColumnMapping()
{
// Set source properties
SourceAttributeName = "src_addresstype",
SourceEntityName = "Account_1",
// Set target properties
TargetAttributeName = "address1_addresstypecode",
TargetEntityName = Account.EntityLogicalName,
// Relate this column mapping with its parent data map
ImportMapId =
new EntityReference(ImportMap.EntityLogicalName, importMapId),
// Force this column to be processed
ProcessCode =
new OptionSetValue((int)ColumnMappingProcessCode.Process)
};
// Create the mapping
Guid colMappingId3 = _serviceProxy.Create(colMapping3);
// Because we created a column mapping of type picklist, we need to specify picklist details in a picklistMapping
PickListMapping pickListMapping1 = new PickListMapping()
{
SourceValue = "bill",
TargetValue = 1,
// Relate this column mapping with its column mapping data map
ColumnMappingId =
new EntityReference(ColumnMapping.EntityLogicalName, colMappingId3),
// Force this column to be processed
ProcessCode =
new OptionSetValue((int)PickListMappingProcessCode.Process)
};
// Create the mapping
Guid picklistMappingId1 = _serviceProxy.Create(pickListMapping1);
// Need a picklist mapping for every address type code expected
PickListMapping pickListMapping2 = new PickListMapping()
{
SourceValue = "ship",
TargetValue = 2,
// Relate this column mapping with its column mapping data map
ColumnMappingId =
new EntityReference(ColumnMapping.EntityLogicalName, colMappingId3),
// Force this column to be processed
ProcessCode =
new OptionSetValue((int)PickListMappingProcessCode.Process)
};
// Create the mapping
Guid picklistMappingId2 = _serviceProxy.Create(pickListMapping2);
#endregion
// Create Import
Import import = new Import()
{
// IsImport is obsolete; use ModeCode to declare Create or Update.
ModeCode = new OptionSetValue((int)ImportModeCode.Create),
Name = "Importing data"
};
Guid importId = _serviceProxy.Create(import);
// Create Import File.
ImportFile importFile = new ImportFile()
{
Content = BulkImportHelper.ReadCsvFile("Import Accounts.csv"), // Read contents from disk.
Name = "Account record import",
IsFirstRowHeader = true,
ImportMapId = new EntityReference(ImportMap.EntityLogicalName, importMapId),
UseSystemMap = false,
Source = "Import Accounts.csv",
SourceEntityName = "Account_1",
TargetEntityName = Account.EntityLogicalName,
ImportId = new EntityReference(Import.EntityLogicalName, importId),
EnableDuplicateDetection = false,
FieldDelimiterCode =
new OptionSetValue((int)ImportFileFieldDelimiterCode.Comma),
DataDelimiterCode =
new OptionSetValue((int)ImportFileDataDelimiterCode.DoubleQuote),
ProcessCode =
new OptionSetValue((int)ImportFileProcessCode.Process)
};
// Get the current user to set as record owner.
WhoAmIRequest systemUserRequest = new WhoAmIRequest();
WhoAmIResponse systemUserResponse =
(WhoAmIResponse)_serviceProxy.Execute(systemUserRequest);
// Set the owner ID.
importFile.RecordsOwnerId =
new EntityReference(SystemUser.EntityLogicalName, systemUserResponse.UserId);
Guid importFileId = _serviceProxy.Create(importFile);
// Retrieve the header columns used in the import file.
GetHeaderColumnsImportFileRequest headerColumnsRequest = new GetHeaderColumnsImportFileRequest()
{
ImportFileId = importFileId
};
GetHeaderColumnsImportFileResponse headerColumnsResponse =
(GetHeaderColumnsImportFileResponse)_serviceProxy.Execute(headerColumnsRequest);
// Output the header columns.
int columnNum = 1;
foreach (string headerName in headerColumnsResponse.Columns)
{
Console.WriteLine("Column[" + columnNum.ToString() + "] = " + headerName);
columnNum++;
}
// Parse the import file.
ParseImportRequest parseImportRequest = new ParseImportRequest()
{
ImportId = importId
};
ParseImportResponse parseImportResponse =
(ParseImportResponse)_serviceProxy.Execute(parseImportRequest);
Console.WriteLine("Waiting for Parse async job to complete");
BulkImportHelper.WaitForAsyncJobCompletion(_serviceProxy, parseImportResponse.AsyncOperationId);
BulkImportHelper.ReportErrors(_serviceProxy, importFileId);
// Retrieve the first two distinct values for column 1 from the parse table.
// NOTE: You must create the parse table first using the ParseImport message.
// The parse table is not accessible after ImportRecordsImportResponse is called.
GetDistinctValuesImportFileRequest distinctValuesRequest = new GetDistinctValuesImportFileRequest()
{
columnNumber = 1,
ImportFileId = importFileId,
pageNumber = 1,
recordsPerPage = 2,
};
GetDistinctValuesImportFileResponse distinctValuesResponse =
(GetDistinctValuesImportFileResponse)_serviceProxy.Execute(distinctValuesRequest);
// Output the distinct values. In this case: (column 1, row 1) and (column 1, row 2).
int cellNum = 1;
foreach (string cellValue in distinctValuesResponse.Values)
{
Console.WriteLine("(1, " + cellNum.ToString() + "): " + cellValue);
Console.WriteLine(cellValue);
cellNum++;
}
// Retrieve data from the parse table.
// NOTE: You must create the parse table first using the ParseImport message.
// The parse table is not accessible after ImportRecordsImportResponse is called.
RetrieveParsedDataImportFileRequest parsedDataRequest = new RetrieveParsedDataImportFileRequest()
{
ImportFileId = importFileId,
PagingInfo = new PagingInfo()
{
// Specify the number of entity instances returned per page.
Count = 2,
// Specify the number of pages returned from the query.
PageNumber = 1,
// Specify a total number of entity instances returned.
PagingCookie = "1"
}
};
RetrieveParsedDataImportFileResponse parsedDataResponse =
(RetrieveParsedDataImportFileResponse)_serviceProxy.Execute(parsedDataRequest);
// Output the first two rows retrieved.
int rowCount = 1;
foreach (string[] rows in parsedDataResponse.Values)
{
int colCount = 1;
foreach (string column in rows)
{
Console.WriteLine("(" + rowCount.ToString() + "," + colCount.ToString() + ") = " + column);
colCount++;
}
rowCount++;
}
// Transform the import
TransformImportRequest transformImportRequest = new TransformImportRequest()
{
ImportId = importId
};
TransformImportResponse transformImportResponse =
(TransformImportResponse)_serviceProxy.Execute(transformImportRequest);
Console.WriteLine("Waiting for Transform async job to complete");
BulkImportHelper.WaitForAsyncJobCompletion(_serviceProxy, transformImportResponse.AsyncOperationId);
BulkImportHelper.ReportErrors(_serviceProxy, importFileId);
// Upload the records.
ImportRecordsImportRequest importRequest = new ImportRecordsImportRequest()
{
ImportId = importId
};
ImportRecordsImportResponse importResponse =
(ImportRecordsImportResponse)_serviceProxy.Execute(importRequest);
Console.WriteLine("Waiting for ImportRecords async job to complete");
BulkImportHelper.WaitForAsyncJobCompletion(_serviceProxy, importResponse.AsyncOperationId);
BulkImportHelper.ReportErrors(_serviceProxy, importFileId);
}
/// <summary>
/// Deletes any entity records that were created for this sample.
/// <param name="prompt">Indicates whether to prompt the user
/// to delete the records created in this sample.</param>
/// </summary>
public void DeleteRequiredRecords(bool prompt)
{
bool toBeDeleted = true;
if (prompt)
{
// Ask the user if the created entities should be deleted.
Console.Write("\nDo you want these entity records deleted? (y/n) [y]: ");
String answer = Console.ReadLine();
if (answer.StartsWith("y") ||
answer.StartsWith("Y") ||
answer == String.Empty)
{
toBeDeleted = true;
}
else
{
toBeDeleted = false;
}
}
if (toBeDeleted)
{
// Retrieve all account records created in this sample.
QueryExpression query = new QueryExpression()
{
EntityName = Account.EntityLogicalName,
Criteria = new FilterExpression()
{
Conditions =
{
new ConditionExpression("createdon", ConditionOperator.OnOrAfter, _executionDate),
}
},
ColumnSet = new ColumnSet(false)
};
var accountsCreated = _serviceProxy.RetrieveMultiple(query).Entities;
// Delete all records created in this sample.
foreach (var account in accountsCreated)
{
_serviceProxy.Delete(Account.EntityLogicalName, account.Id);
}
Console.WriteLine("Entity record(s) have been deleted.");
}
}
#region Main method
/// <summary>
/// Standard Main() method used by most SDK samples.
/// </summary>
/// <param name="args"></param>
static public void Main(string[] args)
{
try
{
// Obtain the target organization's web address and client logon
// credentials from the user.
ServerConnection serverConnect = new ServerConnection();
ServerConnection.Configuration config = serverConnect.GetServerConfiguration();
var app = new ImportWithCreate();
app.Run(config, true);
}
catch (FaultException<Microsoft.Xrm.Sdk.OrganizationServiceFault> ex)
{
Console.WriteLine("The application terminated with an error.");
Console.WriteLine("Timestamp: {0}", ex.Detail.Timestamp);
Console.WriteLine("Code: {0}", ex.Detail.ErrorCode);
Console.WriteLine("Message: {0}", ex.Detail.Message);
Console.WriteLine("Trace: {0}", ex.Detail.TraceText);
Console.WriteLine("Inner Fault: {0}",
null == ex.Detail.InnerFault ? "No Inner Fault" : "Has Inner Fault");
}
catch (System.TimeoutException ex)
{
Console.WriteLine("The application terminated with an error.");
Console.WriteLine("Message: {0}", ex.Message);
Console.WriteLine("Stack Trace: {0}", ex.StackTrace);
Console.WriteLine("Inner Fault: {0}",
null == ex.InnerException.Message ? "No Inner Fault" : ex.InnerException.Message);
}
catch (System.Exception ex)
{
Console.WriteLine("The application terminated with an error.");
Console.WriteLine(ex.Message);
// Display the details of the inner exception.
if (ex.InnerException != null)
{
Console.WriteLine(ex.InnerException.Message);
FaultException<Microsoft.Xrm.Sdk.OrganizationServiceFault> fe = ex.InnerException
as FaultException<Microsoft.Xrm.Sdk.OrganizationServiceFault>;
if (fe != null)
{
Console.WriteLine("Timestamp: {0}", fe.Detail.Timestamp);
Console.WriteLine("Code: {0}", fe.Detail.ErrorCode);
Console.WriteLine("Message: {0}", fe.Detail.Message);
Console.WriteLine("Trace: {0}", fe.Detail.TraceText);
Console.WriteLine("Inner Fault: {0}",
null == fe.Detail.InnerFault ? "No Inner Fault" : "Has Inner Fault");
}
}
}
// Additional exceptions to catch: SecurityTokenValidationException, ExpiredSecurityTokenException,
// SecurityAccessDeniedException, MessageSecurityException, and SecurityNegotiationException.
finally
{
Console.WriteLine("Press <Enter> to exit.");
Console.ReadLine();
}
}
#endregion Main method
}
}
Not sure how this would go with millions of records, but you can select your records, then click the Edit button in the ribbon. This will bring up the "Edit Multiple Records" dialog. Any changes you make will be applied to all your records.
The BulkUpdate API works well for me; it is 10 times faster than updating records one at a time. Following is a snippet that performs a bulk update:
public override ExecuteMultipleResponse BulkUpdate(List<Entity> entities)
{
ExecuteMultipleRequest request = new ExecuteMultipleRequest()
{
Settings = new ExecuteMultipleSettings()
{
ContinueOnError = true,
ReturnResponses = true
},
Requests = new OrganizationRequestCollection()
};
for (int i = 0; i < entities.Count; i++)
{
request.Requests.Add(new UpdateRequest() { Target = entities[i] });
}
return (ExecuteMultipleResponse) ServiceContext.Execute(request);
}
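One caveat worth noting: ExecuteMultipleRequest accepts at most 1,000 requests per call, so for very large volumes the input should be chunked. A sketch around the method above (allEntities is assumed to hold the records to update):
const int batchSize = 1000;
for (int offset = 0; offset < allEntities.Count; offset += batchSize)
{
    // Send at most 1,000 requests per envelope.
    List<Entity> batch = allEntities.Skip(offset).Take(batchSize).ToList();
    ExecuteMultipleResponse response = BulkUpdate(batch);
    // With ReturnResponses = true, response.Responses can be inspected
    // for per-record faults.
}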
I worked on a very large data migration project for Dynamics CRM 2011. We needed to load about 3 million records over a weekend. I ended up building a console application (single thread) and ran multiple instances on multiple machines. Each console application had an id (1, 2, etc.) and was responsible for loading segments of the data based on a unique SQL WHERE clause that matched the application's id.
You could do the same thing with updates. Each instance could query a subset of the records to update and can perform the updates via the SDK. Since we loaded millions of records over a weekend I think you could perform millions of updates (if relatively small) in just a few hours.
The Microsoft PFE team for Dynamics CRM wrote another CRM SDK library that makes use of parallelization to bulk-execute requests while ensuring thread safety.
You may try it: Parallel Execute Requests.
I would be interested to know whether it works and scales to millions of records.
CRM doesn't implement a way to update data in bulk; there are 3 ways to improve bulk update performance, but internally none of them change the fact that CRM updates records one by one.
Basically the ideas are:
reduce the time wasted on communicating to CRM server
use parallelism to do multiple operations at the same time
make sure the update process does NOT trigger any workflows/plugins. Otherwise you might never see the end of the process...
3 ways to improve bulk operation performance:
After Rollup 12 there is the ExecuteMultipleRequest feature, which allows you to send up to 1,000 requests at once. This means you can save some time by not sending 1,000 separate requests to the CRM web service; however, the requests are still processed one after another. So if your CRM server is well configured, this method most likely won't help much.
You may use an OrganizationServiceContext instance to do bulk updates. OrganizationServiceContext implements the unit-of-work pattern, so you can queue multiple updates and transmit the operations to the server in one call (see the sketch after this list). Compared to ExecuteMultipleRequest, it doesn't have a limit on the number of requests, but if it encounters a failure during the update it will roll back all the changes.
Use multithreading or multitasking. Either way improves speed, but both are likely to generate some connection failures or SQL errors, so you will need to add some retry logic to the code.
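As an illustration of the OrganizationServiceContext approach, here is a minimal sketch of the unit-of-work pattern (assuming an existing IOrganizationService; error handling omitted):
var context = new OrganizationServiceContext(service);
foreach (Entity entity in entitiesToUpdate)
{
    // Queue each update; nothing is sent yet.
    context.Attach(entity);
    context.UpdateObject(entity);
}
// Transmit all queued operations to the server in one call.
context.SaveChanges();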
One of my clients had exactly the same problem. He solved it by creating a custom ETL and using parallelism, attacking two front-ends. The whole thing was written in C#. Nowadays it might be possible with KingswaySoft or Scribe.

How to call webservice methods in Windows Phone 7?

To connect to the web service I wrote the following code:
WebClient wc = new WebClient();
// Attach the handler before starting the async download.
wc.DownloadStringCompleted += new DownloadStringCompletedEventHandler(wc_DownloadStringCompleted);
wc.DownloadStringAsync(new Uri("http://www.Webservices.asmx"));
void wc_DownloadStringCompleted(object sender,DownloadStringCompletedEventArgs e)
{
Debug.WriteLine("Web service says: " + e.Result);
using (var reader = new StringReader(e.Result))
{
String str = reader.ReadToEnd();
}
}
Using the above code I get the string result. But I want to inspect the web service (e.g. in an HTML visualizer) so that I know what methods it exposes; then I can easily access a particular method.
Please tell me how to call a web service method in Windows Phone 7. The web service has 5 web methods; how do I discover them, and how do I call a particular web method?
Thanks in advance.
@venkateswara, are you talking about obtaining a list of known WebReference methods so you know which one to call in your code? Do you not see the list of known method calls when you add the WebReference to your WP7 project? Since you will be developing the WP7 app in VS, I can't see a reason you would need to do this. Even if you don't own the web service yourself, you will need to connect to it from VS in order to add the reference to your project.
Below is the screen in VS2010 where a WebReference is added. The Operations are listed on the right.
Once added you can use the ObjectBrowser to understand how the methods should be called.
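For completeness, once the reference is added, calling one of the generated methods follows the usual WP7 event-based async pattern. A minimal sketch, where ServiceSoapClient and GetData* are hypothetical names standing in for whatever your generated proxy actually exposes in the Object Browser:
// Hypothetical generated proxy; substitute your service's real names.
var client = new MyService.ServiceSoapClient();
client.GetDataCompleted += (s, e) =>
{
    if (e.Error == null)
    {
        Debug.WriteLine("Web service says: " + e.Result);
    }
};
client.GetDataAsync("some parameter"); // raises GetDataCompleted when done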
Please let me know if I have missed something from your question.
@Jason James
Step 1: add the service reference, as Jason James has described in detail above.
Step 2: open App.xaml.cs and create the service client in the application's constructor:
public Apps()
{
// Global handler for uncaught exceptions.
UnhandledException += Application_UnhandledException;
// Show graphics profiling information while debugging.
if (System.Diagnostics.Debugger.IsAttached)
{
// Display the current frame rate counters.
Application.Current.Host.Settings.EnableFrameRateCounter = true;
// Show the areas of the app that are being redrawn in each frame.
//Application.Current.Host.Settings.EnableRedrawRegions = true;
// Enable non-production analysis visualization mode,
// which shows areas of a page that are being GPU accelerated with a colored overlay.
//Application.Current.Host.Settings.EnableCacheVisualization = true;
}
// You can declare here the service-client objects you will use.
// Example (names come from your generated service reference):
Ws_Function = new Nameservices.ServiceSoapClient();
}
Step 3: in MainPage.xaml.cs, subscribe to the Completed event and start the call:
GlobalVariables.Ws_advertise.getLinkAdvertiseIndexCompleted += new EventHandler<advertise.getLinkAdvertiseIndexCompletedEventArgs>(Ws_advertise_getLinkAdvertiseIndexCompleted);
GlobalVariables.Ws_advertise.getLinkAdvertiseIndexAsync(/* parameters to be passed */);
Step 4: handle the result in the Completed callback:
void Ws_advertise_getLinkAdvertiseIndexCompleted(object sender, advertise.getLinkAdvertiseIndexCompletedEventArgs e)
{
    // The callback hands you the results; in this example it is an array.
    string[] array = null;
    try
    {
        array = e.Result;
        if (array != null)
        {
            // ... use the results ...
        }
    }
    catch (Exception ex)
    {
        // Handle the service error.
    }
    finally
    {
        array = null;
        GlobalVariables.Ws_advertise.getLinkAdvertiseIndexCompleted -= new EventHandler<advertise.getLinkAdvertiseIndexCompletedEventArgs>(Ws_advertise_getLinkAdvertiseIndexCompleted);
    }
}
