RFC on HttpWebRequest vs RESTSharp from Windows CE / Compact Framework 3.5 - asp.net-web-api

I created a simple WebAPI service in .NET4 and Visual Studio 2010.
I need to consume that service in a Windows CE / CF 3.5 app
I do have HttpWebRequest available to me, but am not sure whether this is the way to go or whether I should use RestSharp. Does anybody have insight or experience that would help me decide? I prefer to avoid third-party code when possible, but am willing to use it if there is a clear advantage over the "raw" offerings.
Does anyone have, or know of, examples for accessing WebApi services using HttpWebRequest OR RestSharp from CF 3.5?
I have this code (adapted from http://www.asp.net/web-api/overview/getting-started-with-aspnet-web-api/tutorial-your-first-web-api) for sample WebAPI methods:
public class VendorItemsController : ApiController
{
VendorItem[] vendorItems = new VendorItem[]
{
new VendorItem { VendorId = "1", VendorItemId = "Tomato Soup", ItemId = "Groceries", PackSize = 1 },
new VendorItem { VendorId = "2", VendorItemId = "V8", ItemId = "Groceries", PackSize = 6 },
new VendorItem { VendorId = "3", VendorItemId = "Garlic", ItemId = "Groceries", PackSize = 1 },
};
public IEnumerable<VendorItem> GetAllProducts()
{
return vendorItems;
}
public VendorItem GetProductById(string id)
{
var vendorItem = vendorItems.FirstOrDefault((p) => p.VendorId == id);
if (vendorItem == null)
{
throw new HttpResponseException(HttpStatusCode.NotFound);
}
return vendorItem;
}
}
...but don't know how to consume this using, if possible, HttpWebRequest.
Note: HttpClient is not available to me (HttpWebRequest is, though).
UPDATE
I start the VS2010 app that has the WebAPI method, but when I run the VS2008 Windows CE / Compact Framework 3.5 app with this code:
Uri _baseAddress = new Uri("http://localhost:48614/");
string localFile = "fetchedVendorItems.txt";
HttpWebRequest req = (HttpWebRequest)WebRequest.Create(_baseAddress + "api/vendoritems/");
req.Method = "GET";
HttpWebResponse resp = (HttpWebResponse) req.GetResponse();
// Retrieve response stream and wrap in StreamReader
Stream respStream = resp.GetResponseStream();
StreamReader rdr = new StreamReader(respStream);
// Create the local file
StreamWriter wrtr = new StreamWriter(localFile);
// loop through response stream reading each line and writing to the local file
string inLine = rdr.ReadLine();
while (inLine != null)
{
wrtr.WriteLine(inLine);
inLine = rdr.ReadLine();
}
rdr.Close();
wrtr.Close();
(which I adapted from here: http://msdn.microsoft.com/en-us/library/aa446517.aspx)
...I get, "Unable to connect to the remote server"
UPDATE 2
This does work directly in the browser on the dev machine:
http://localhost:48614/api/redemptions/
It returns these values from a Controller:
readonly Redemption[] redemptions =
{
new Redemption { RedemptionId = "1", RedemptionName = "Old", RedemptionItemId = "ABC", RedemptionAmount = 0.25M, RedemptionDept = "2.0", RedemptionSubDept = "42" },
new Redemption { RedemptionId = "2", RedemptionName = "Damaged", RedemptionItemId = "BCD", RedemptionAmount = 5.00M, RedemptionDept = "42.0", RedemptionSubDept = "76" },
new Redemption { RedemptionId = "3", RedemptionName = "Rebate", RedemptionItemId = "DEF", RedemptionAmount = 42.75M, RedemptionDept = "76.0", RedemptionSubDept = "112" }
};
...like so:
<ArrayOfRedemption xmlns:i="http://www.w3.org/2001/XMLSchema-instance" xmlns="http://schemas.datacontract.org/2004/07/HHSServerWebAPI.Models">
<Redemption>
<RedemptionAmount>0.25</RedemptionAmount>
<RedemptionDept>2.0</RedemptionDept>
<RedemptionId>1</RedemptionId>
<RedemptionItemId>ABC</RedemptionItemId>
<RedemptionName>Old</RedemptionName>
<RedemptionSubDept>42</RedemptionSubDept>
</Redemption>
<Redemption>
<RedemptionAmount>5.00</RedemptionAmount>
<RedemptionDept>42.0</RedemptionDept>
<RedemptionId>2</RedemptionId>
<RedemptionItemId>BCD</RedemptionItemId>
<RedemptionName>Damaged</RedemptionName>
<RedemptionSubDept>76</RedemptionSubDept>
</Redemption>
<Redemption>
<RedemptionAmount>42.75</RedemptionAmount>
<RedemptionDept>76.0</RedemptionDept>
<RedemptionId>3</RedemptionId>
<RedemptionItemId>DEF</RedemptionItemId>
<RedemptionName>Rebate</RedemptionName>
<RedemptionSubDept>112</RedemptionSubDept>
</Redemption>
</ArrayOfRedemption>
...even when the VS2008 project is not running - is that because this data was cached (the first time I entered:
http://localhost:48614/api/redemptions/
...in the browser, the Web API app was running)?
I get that the emulator won't recognize "localhost" as the desktop instance, considering itself someone/somewhere else. So how can I test this on an emulator? What IP address can I use?
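Presumably the fix is to point the request at the development PC's LAN IP address rather than localhost, and to host the Web API somewhere the device/emulator can actually reach (for example IIS, since the Visual Studio Development Server only accepts local connections). A rough, untested sketch with a made-up address, which also shows pulling values out of the returned XML with XmlReader (available on CF 3.5, assuming System.Net and System.Xml are referenced):
Uri baseAddress = new Uri("http://192.168.1.23:48614/"); // placeholder - substitute the dev PC's actual IP and port
HttpWebRequest req = (HttpWebRequest)WebRequest.Create(baseAddress + "api/vendoritems/");
req.Method = "GET";
req.Accept = "application/xml"; // ask Web API for XML; without an Accept header it will usually return JSON

HttpWebResponse resp = (HttpWebResponse)req.GetResponse();
XmlReader reader = XmlReader.Create(resp.GetResponseStream());
while (reader.Read())
{
    if (reader.NodeType == XmlNodeType.Element && reader.Name == "VendorItemId")
    {
        reader.Read();                      // advance to the text node inside the element
        string vendorItemId = reader.Value; // e.g. "Tomato Soup"
        // ...collect the value, bind to a list, etc....
    }
}
reader.Close();
resp.Close();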

Avoiding 3rd party code out of hand is just plain silly. Why reinvent the wheel? If you are a company whose IP is making REST calls, then sure, roll your own, but I suspect your core business offering is in solving some other problem. I mean, why use the CF itself and not C? Why use C and not assembly? Why use a third-party processor and not design your own?
All that aside, RestSharp comes with source and it's free, so there's little risk in using it. There are some things I like about it - primarily that most of the grunt work for REST calls is done for you. I'm a big fan of not reinventing things. It has some quirks that I've "fixed" locally (I'm meaning to do a pull request, just haven't found the time yet), but they were minor and not what I'd consider to be "typical" cases.
As for calling Web APIs with RestSharp, there's a pretty thorough coverage over in this article.
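For what it's worth, a RestSharp call against the sample controller above would look roughly like the sketch below (untested, written against the classic RestSharp API; exact type names and the Method enum vary a little between versions, and you'd need a build that targets CF 3.5):
// Rough sketch - base URL and id value are placeholders.
var client = new RestClient("http://192.168.1.23:48614/");
var request = new RestRequest("api/vendoritems/{id}", Method.GET);
request.AddUrlSegment("id", "1");
var response = client.Execute(request);
string payload = response.Content; // raw XML or JSON, depending on the negotiated format
RestSharp can also deserialize for you via Execute<T>(), which returns a typed response with a Data property, so you wouldn't have to parse the payload by hand.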

Related

Various errors using VisionServiceClient in XamarinForms

I am trying to create a simple Xamarin.Forms app which allows the user to browse for or take a photo and have Azure Cognitive Services tag the photo using a custom vision model.
I am unable to get the client to successfully authenticate or find a resource per the error message in the exception produced by the VisionServiceClient. Am I missing something? What would be the correct values to use for the arguments to VisionServiceClient?
All keys have been removed from the images below; they are populated in the actual code.
Exception thrown in VS2017:
'Microsoft.ProjectOxford.Vision.ClientException' in System.Private.CoreLib.dll
Call to VisionServiceClient:
private const string endpoint = @"https://eastus2.api.cognitive.microsoft.com/vision/prediction/v1.0";
private const string key = "";
VisionServiceClient visionClient = new VisionServiceClient(key, endpoint);
VisualFeature[] features = { VisualFeature.Tags, VisualFeature.Categories, VisualFeature.Description };
try
{
AnalysisResult temp = await visionClient.AnalyzeImageAsync(imageStream,
features.ToList(), null);
return temp;
}
catch(Exception ex)
{
return null;
}
VS Exception Error:
Azure Portal for cognitive services:
Custom Vision Portal:
It looks like you're confusing the Computer Vision and the Custom Vision APIs. You are attempting to use the client SDK for the former using the API key of the latter.
For .NET languages, you'll want the Microsoft.Azure.CognitiveServices.Vision.CustomVision.Prediction NuGet package.
Your code will end up looking something like this:
ICustomVisionPredictionClient client = new CustomVisionPredictionClient()
{
ApiKey = PredictionKey,
Endpoint = "https://southcentralus.api.cognitive.microsoft.com"
};
ImagePrediction prediction = await client.PredictImageAsync(ProjectId, stream, IterationId);
Thank you to cthrash for the extended help and talking with me in chat. Using his post along with a little troubleshooting I have figured out what works for me. The code is super clunky but it was just to test and make sure I'm able to do this. To answer the question:
Nuget packages and classes
Using cthrash's post I was able to get both the training and prediction NuGet packages installed, which are the correct packages for this particular application. I needed the following namespaces:
Microsoft.Azure.CognitiveServices.Vision.CustomVision.Prediction
Microsoft.Azure.CognitiveServices.Vision.CustomVision.Prediction.Models
Microsoft.Azure.CognitiveServices.Vision.CustomVision.Training
Microsoft.Azure.CognitiveServices.Vision.CustomVision.Training.Models
Endpoint Root
Following some of the steps here, I determined that the endpoint URLs only need to be the root, not the full URL provided in the Custom Vision Portal. For instance,
https://southcentralus.api.cognitive.microsoft.com/customvision/v2.0/Prediction/
Was changed to
https://southcentralus.api.cognitive.microsoft.com
Using the key and endpoint from the Custom Vision Portal and making that change, I was able to use both a training and a prediction client to pull the projects and iterations.
Getting Project Id
In order to use CustomVisionPredictionClient.PredictImageAsync you need a Guid for the project id and an iteration id if a default iteration is not set in the portal.
I tested two ways to get the project id:
Using project id string from portal
Grab the project id string from the portal under the project settings.
For the first argument to PredictImageAsync pass
Guid.Parse(projectId)
Using the training client
Create a new CustomVisionTrainingClient
To get a List<Project>, use
TrainingClient.GetProjects().ToList()
In my case I only had a single project so I would just need the first element.
Guid projectId = projects[0].Id
Getting Iteration Id
To get the iteration id of a project you need the CustomVisionTrainingClient.
Create the client
To get a List<Iteration>, use
client.GetIterations(projectId).ToList()
In my case I had only a single iteration so I just need the first element.
Guid iterationId = iterations[0].Id
I am now able to use my model to classify images. In the code below, fileStream is the image stream passed to the model.
public async Task<string> Predict(Stream fileStream)
{
string projectId = "";
//string trainingEndpoint = "https://southcentralus.api.cognitive.microsoft.com/customvision/v2.2/Training/";
string trainingEndpoint = "https://southcentralus.api.cognitive.microsoft.com/";
string trainingKey = "";
//string predictionEndpoint = "https://southcentralus.api.cognitive.microsoft.com/customvision/v2.0/Prediction/";
string predictionEndpoint = "https://southcentralus.api.cognitive.microsoft.com";
string predictionKey = "";
CustomVisionTrainingClient trainingClient = new CustomVisionTrainingClient
{
ApiKey = trainingKey,
Endpoint = trainingEndpoint
};
List<Project> projects = new List<Project>();
try
{
projects = trainingClient.GetProjects().ToList();
}
catch(Exception ex)
{
Debug.WriteLine("Unable to get projects:\n\n" + ex.Message);
return "Unable to obtain projects.";
}
Guid ProjectId = Guid.Empty;
if(projects.Count > 0)
{
ProjectId = projects[0].Id;
}
if (ProjectId == Guid.Empty)
{
Debug.WriteLine("Unable to obtain project ID");
return "Unable to obtain project id.";
}
List<Iteration> iterations = new List<Iteration>();
try
{
iterations = trainingClient.GetIterations(ProjectId).ToList();
}
catch(Exception ex)
{
Debug.WriteLine("Unable to obtain iterations.");
return "Unable to obtain iterations.";
}
foreach(Iteration itr in iterations)
{
Debug.WriteLine(itr.Name + "\t" + itr.Id + "\n");
}
Guid iteration = Guid.Empty;
if(iterations.Count > 0)
{
iteration = iterations[0].Id;
}
if(iteration == Guid.Empty)
{
Debug.WriteLine("Unable to obtain project iteration.");
return "Unable to obtain project iteration";
}
CustomVisionPredictionClient predictionClient = new CustomVisionPredictionClient
{
ApiKey = predictionKey,
Endpoint = predictionEndpoint
};
var result = await predictionClient.PredictImageAsync(ProjectId, fileStream, iteration);
string resultStr = string.Empty;
foreach(PredictionModel pred in result.Predictions)
{
if(pred.Probability >= 0.85)
resultStr += pred.TagName + " ";
}
return resultStr;
}
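A call site for this might look something like the sketch below (the "classifier" instance and the file path are hypothetical; in the actual Xamarin.Forms app the stream would come from the photo picker or camera rather than a file on disk):
// Hypothetical usage, inside an async method - "classifier" is an instance of the class containing Predict.
using (Stream imageStream = File.OpenRead("test-image.jpg"))
{
    string tags = await classifier.Predict(imageStream);
    Debug.WriteLine("Predicted tags: " + tags);
}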

How to connect Stanford Core Server from dotnet

I am trying to use Stanford NLP for .NET. I am very new to this.
How can I connect to the Stanford CoreNLP server from a C# program?
My NLP server runs on localhost:9000
You can connect via the .NET HttpClient or another equivalent .NET call to a web endpoint. You need to set up your NLP endpoint, properties for the NLP server, and the text content you want it to parse. There is additional information on the Stanford NLP Server page, as well as information on what properties can be set depending on what NLP pipeline you want to run.
The following code is from a .NET Core console application and returns Named Entity Recognition, Dependency Parser, and OpenIE results.
FYI I've had some occasions where my NLP endpoint didn't work when waking a laptop from sleep mode (Docker for Windows pre 17.12 on Win10). Resetting Docker did the trick for me... if you can't browse to your http://localhost:9000 website, then the endpoint definitely won't work either!
using System;
using System.Collections.Generic;
using System.Net.Http;
using System.Threading.Tasks;

// String to process
string s = "This is the sentence to provide to NLP.";

// Set up the endpoint
string nlpBaseAddress = "http://localhost:9000";

// Create the query string params for NLPCore
string jsonOptions = "{\"annotators\": \"ner, depparse, openie\", \"outputformat\": \"json\"}";
var qstringProperties = new Dictionary<string, string>();
qstringProperties.Add("properties", jsonOptions);
string qString = ToQueryString(qstringProperties);

// Add the query string to the base address
string urlPlusQuery = nlpBaseAddress + qString;

// Create the content to submit
var content = new StringContent(s);
content.Headers.Clear();
content.Headers.Add("Content-Type", "application/x-www-form-urlencoded");

// Submit for processing
var client = new HttpClient();
Task<HttpResponseMessage> tResponse = client.PostAsync(urlPlusQuery, content);
tResponse.Wait();
HttpResponseMessage response = tResponse.Result;

// Check the response
if (response.StatusCode != System.Net.HttpStatusCode.OK)
{
    // Do something better than throwing an app exception here!
    throw new ApplicationException("Subject-Object tuple extraction returned an unexpected response from the subject-object service");
}

Task<string> rString = response.Content.ReadAsStringAsync();
rString.Wait();
string jsonResult = rString.Result;
Utility function used within this call to generate a QueryString:
private string ToQueryString(Dictionary<string, string> nvc)
{
    System.Text.StringBuilder sb = new System.Text.StringBuilder("?");
    bool first = true;
    foreach (KeyValuePair<string, string> key in nvc)
    {
        // Check if this is the first value
        if (!first)
        {
            sb.Append("&");
        }
        sb.AppendFormat("{0}={1}", Uri.EscapeDataString(key.Key), Uri.EscapeDataString(key.Value));
        first = false;
    }
    return sb.ToString();
}
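If you then want to pull the OpenIE triples back out of jsonResult, a rough sketch with Json.NET looks like the following (this assumes the Newtonsoft.Json package is referenced and that the server's JSON uses the usual sentences / openie / subject / relation / object layout; adjust the property names if your pipeline output differs):
using Newtonsoft.Json.Linq;

JObject doc = JObject.Parse(jsonResult);
foreach (JToken sentence in doc["sentences"])
{
    foreach (JToken triple in sentence["openie"])
    {
        // Each OpenIE triple carries plain-text subject / relation / object strings.
        Console.WriteLine("{0} | {1} | {2}",
            (string)triple["subject"],
            (string)triple["relation"],
            (string)triple["object"]);
    }
}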

How to automate Package Manager Console in Visual Studio 2013

My specific problem is how I can automate "add-migration" in a build process for Entity Framework. In researching this, it seems the most likely approach is something along the lines of automating these steps:
Open a solution in Visual Studio 2013
Execute "Add-Migration blahblah" in the Package Manager Console (most likely via an add-in vsextention)
Close the solution
This initial approach is based on my own research and this question; the PowerShell script ultimately behind Add-Migration requires quite a bit of setup to run. Visual Studio performs that setup automatically when creating the Package Manager Console and making the DTE object available. I would prefer not to attempt to duplicate that setup outside of Visual Studio.
One possible path to a solution is this unanswered Stack Overflow question.
In researching the NuGet API, it does not appear to have a "send this text and it will be run as if it were typed in the console" capability. I am not clear on the lines between Visual Studio and NuGet, so I am not sure this is something that would be there.
Ironically enough, I am able to find the "Package Manager Console" via the "$dte.Windows" command in the Package Manager Console, but in a VS 2013 window that collection gives me objects which are "Microsoft.VisualStudio.Platform.WindowManagement.DTE.WindowBase". If there is a way to stuff text into it, I think I need to get it to be a "NuGetConsole.Implementation.PowerConsoleToolWindow", but from reviewing the source code I am not clear how the text would be stuffed, and I am not at all familiar with what I am seeing.
Worst case, I will fall back to trying to stuff keys into it along the lines of this question, but I would prefer not to since that will substantially complicate the automation surrounding the build process.
All of that being said,
Is it possible to stream commands via code to the Package Manager Console in Visual Studio which is fully initialized and able to support an Entity Framework "add-migration" command?
Thanks for any suggestions, advice, help, non-abuse in advance,
John
The approach that worked for me was to trace into the Entity Framework code, starting with AddMigrationCommand.cs in the EntityFramework.PowerShell project, to find the hooks into the EntityFramework project and then make those hooks work so there is no PowerShell dependency.
You can get something like...
public static void RunIt(EnvDTE.Project project, Type dbContext, Assembly migrationAssembly, string migrationDirectory,
string migrationsNamespace, string contextKey, string migrationName)
{
DbMigrationsConfiguration migrationsConfiguration = new DbMigrationsConfiguration();
migrationsConfiguration.AutomaticMigrationDataLossAllowed = false;
migrationsConfiguration.AutomaticMigrationsEnabled = false;
migrationsConfiguration.CodeGenerator = new CSharpMigrationCodeGenerator(); //same as default
migrationsConfiguration.ContextType = dbContext; //data
migrationsConfiguration.ContextKey = contextKey;
migrationsConfiguration.MigrationsAssembly = migrationAssembly;
migrationsConfiguration.MigrationsDirectory = migrationDirectory;
migrationsConfiguration.MigrationsNamespace = migrationsNamespace;
System.Data.Entity.Infrastructure.DbConnectionInfo dbi = new System.Data.Entity.Infrastructure.DbConnectionInfo("DataContext");
migrationsConfiguration.TargetDatabase = dbi;
MigrationScaffolder ms = new MigrationScaffolder(migrationsConfiguration);
ScaffoldedMigration sf = ms.Scaffold(migrationName, false);
}
You can use this question to get to the DTE object and from there find the project object to pass into the call.
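Putting that together, a call to the helper might look roughly like this (every name below is a placeholder; the context type, assembly, and namespaces must match your own data project):
// Hypothetical call - all names are placeholders for your own project.
RunIt(project,                                // EnvDTE.Project obtained via the DTE
      typeof(MyDataContext),                  // your DbContext type
      typeof(MyDataContext).Assembly,         // assembly containing the Migrations folder
      "Migrations",                           // migrations directory
      "MyApp.Data.Migrations",                // migrations namespace
      "MyApp.Data.Migrations.Configuration",  // context key
      "AutomatedMigration");                  // migration name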
This is an update to John's answer, whom I have to thank for the "hard part". Here is a complete example which creates a migration and adds it to the supplied project (the project must be built first), the same way Add-Migration InitialBase -IgnoreChanges would:
public void ScaffoldedMigration(EnvDTE.Project project)
{
var migrationsNamespace = project.Properties.Cast<Property>()
.First(p => p.Name == "RootNamespace").Value.ToString() + ".Migrations";
var assemblyName = project.Properties.Cast<Property>()
.First(p => p.Name == "AssemblyName").Value.ToString();
var rootPath = Path.GetDirectoryName(project.FullName);
var assemblyPath = Path.Combine(rootPath, "bin", assemblyName + ".dll");
var migrationAssembly = Assembly.Load(File.ReadAllBytes(assemblyPath));
Type dbContext = null;
foreach(var type in migrationAssembly.GetTypes())
{
if(type.IsSubclassOf(typeof(DbContext)))
{
dbContext = type;
break;
}
}
var migrationsConfiguration = new DbMigrationsConfiguration()
{
AutomaticMigrationDataLossAllowed = false,
AutomaticMigrationsEnabled = false,
CodeGenerator = new CSharpMigrationCodeGenerator(),
ContextType = dbContext,
ContextKey = migrationsNamespace + ".Configuration",
MigrationsAssembly = migrationAssembly,
MigrationsDirectory = "Migrations",
MigrationsNamespace = migrationsNamespace
};
var dbi = new System.Data.Entity.Infrastructure
.DbConnectionInfo("ConnectionString", "System.Data.SqlClient");
migrationsConfiguration.TargetDatabase = dbi;
var scaffolder = new MigrationScaffolder(migrationsConfiguration);
ScaffoldedMigration migration = scaffolder.Scaffold("InitialBase", true);
var migrationFile = Path.Combine(rootPath, migration.Directory,
migration.MigrationId + ".cs");
File.WriteAllText(migrationFile, migration.UserCode);
var migrationItem = project.ProjectItems.AddFromFile(migrationFile);
var designerFile = Path.Combine(rootPath, migration.Directory,
migration.MigrationId + ".Designer.cs");
File.WriteAllText(designerFile, migration.DesignerCode);
var designerItem = project.ProjectItems.AddFromFile(designerFile);
foreach(Property prop in designerItem.Properties)
{
if (prop.Name == "DependentUpon")
prop.Value = Path.GetFileName(migrationFile);
}
var resxFile = Path.Combine(rootPath, migration.Directory,
migration.MigrationId + ".resx");
using (ResXResourceWriter resx = new ResXResourceWriter(resxFile))
{
foreach (var kvp in migration.Resources)
resx.AddResource(kvp.Key, kvp.Value);
}
var resxItem = project.ProjectItems.AddFromFile(resxFile);
foreach (Property prop in resxItem.Properties)
{
if (prop.Name == "DependentUpon")
prop.Value = Path.GetFileName(migrationFile);
}
}
I execute this in my project template's IWizard implementation, where I run a migration with IgnoreChanges because of shared entities with the base project. Change scaffolder.Scaffold("InitialBase", true) to scaffolder.Scaffold("InitialBase", false) if you want to include the changes.

SignalR .Net client fails to connect (upd: how to set auth. cookie?)

This thing is driving me nuts.
I have a .NET 4.0 console app and I have an MVC web app.
JavaScript clients can connect and talk to the server - no problems there...
but my .NET client throws System.AggregateException with InnerException = "Unexpected character encountered while parsing value: <. Path...
So I created an empty MVC3 app, added the SignalR libraries, and the .NET client surprisingly connects to that. But for some reason it doesn't connect to the other one. I've checked everything: both MVC3 apps use the same SignalR libs and the same Newtonsoft.Json... I thought it must be something with the routing, but I guess not - the JS client works.
var connection = new HubConnection("http://localhost:58746");
var hubProxy = connection.CreateProxy("myProxy");
connection.Start().Wait(); // it fails here on Wait
What could it be?
UPD: I have figured it out... it's because of FormsAuthentication on the server. Now, is there any way to feed the .ASPXAUTH cookie to SignalR so it can connect to the server?
The solution by Agzam was really helpful, but if anyone else uses the posted code it is critical that you close the HttpWebResponse before exiting GetAuthCookie. If you don't, you will find that whenever you use SignalR to invoke a method on the server, the request (under most circumstances) will queue indefinitely on the client and will neither succeed nor fail.
Note: the original code worked in the test environment when everything was on my PC, but failed consistently when the website was hosted on a remote server.
Here is the modified code I ended up using:
private Cookie GetAuthCookie(string user, string pass)
{
var http = WebRequest.Create(_baseUrl+"Users/Login") as HttpWebRequest;
http.AllowAutoRedirect = false;
http.Method = "POST";
http.ContentType = "application/x-www-form-urlencoded";
http.CookieContainer = new CookieContainer();
var postData = "UserName=" + user + "&Password=" + pass + "&RememberMe=true&RememberMe=false&ReturnUrl=www.google.com";
byte[] dataBytes = System.Text.Encoding.UTF8.GetBytes(postData);
http.ContentLength = dataBytes.Length;
using (var postStream = http.GetRequestStream())
{
postStream.Write(dataBytes, 0, dataBytes.Length);
}
var httpResponse = http.GetResponse() as HttpWebResponse;
var cookie = httpResponse.Cookies[FormsAuthentication.FormsCookieName];
httpResponse.Close();
return cookie;
}
It's a very minor change, but it will save you a lot of debugging time.
Ok... stupid me... SignalR failed to connect because it couldn't get past the server's Forms authentication. So what needed to be done was to get the auth cookie and stick it into the HubConnection.CookieContainer...
So I wrote this method to log in with a username and get the cookie:
private Cookie GetAuthCookie(string user, string pass)
{
var http = WebRequest.Create(_baseUrl+"Users/Login") as HttpWebRequest;
http.AllowAutoRedirect = false;
http.Method = "POST";
http.ContentType = "application/x-www-form-urlencoded";
http.CookieContainer = new CookieContainer();
var postData = "UserName=" + user + "&Password=" + pass + "&RememberMe=true&RememberMe=false&ReturnUrl=www.google.com";
byte[] dataBytes = System.Text.Encoding.UTF8.GetBytes(postData);
http.ContentLength = dataBytes.Length;
using (var postStream = http.GetRequestStream())
{
postStream.Write(dataBytes, 0, dataBytes.Length);
}
var httpResponse = http.GetResponse() as HttpWebResponse;
var cookie = httpResponse.Cookies[FormsAuthentication.FormsCookieName];
httpResponse.Close();
return cookie;
}
And used it like this:
var connection = new HubConnection(_baseUrl)
{
CookieContainer = new CookieContainer()
};
connection.CookieContainer.Add(GetAuthCookie(_user, _pass));
Works perfectly!
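Once the connection is started with the cookie in place, calls go through the hub proxy as usual; a minimal sketch with the same older client API (hub and method names below are placeholders):
// Hypothetical hub call - "myHub" and "Send" are placeholder names.
var hubProxy = connection.CreateProxy("myHub");
connection.Start().Wait();
hubProxy.Invoke("Send", "Hello from the .NET client").Wait();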
Just use this for reading cookies:
var cookie = response.Cookies[".AspNet.ApplicationCookie"];

Windows Workflow Foundation 4.0 and Tracking

I'm working with the Beta 2 version of Visual Studio 2010 to get some advanced learning using WF4. I've been working with the SqlTracking sample in the WF_WCF_Samples SDK and have gotten a pretty good understanding of how to emit and store tracking data in a SQL database, but haven't seen anything on how to query the data when needed. Does anyone know if there are any .NET classes that are meant to be used for querying the tracking data, and if so, are there any known samples, tutorials, or articles that describe how to do it?
According to Matt Winkler, from the Microsoft WF4 Team, there isn't any built-in API for querying the tracking data; the developer must write his/her own.
These can help:
WorkflowInstanceQuery Class
Workflow Tracking and Tracing
Tracking Participants in .NET 4 Beta 1
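Since there is no built-in query API, one common starting point is a custom tracking participant that captures records as they are emitted and writes them somewhere you can query later; a bare-bones sketch (class name and output are my own) would be:
using System;
using System.Activities.Tracking;

public class ConsoleTrackingParticipant : TrackingParticipant
{
    protected override void Track(TrackingRecord record, TimeSpan timeout)
    {
        // Write each record somewhere queryable (console, log file, your own SQL table, etc.).
        var instanceRecord = record as WorkflowInstanceRecord;
        if (instanceRecord != null)
        {
            Console.WriteLine("Instance {0} -> {1}", instanceRecord.InstanceId, instanceRecord.State);
        }
    }
}
It gets registered on the host via WorkflowApplication.Extensions.Add(new ConsoleTrackingParticipant()), optionally together with a TrackingProfile to filter which records you receive.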
Old question, I know, but there is actually a more or less official API in AppFabric: Windows Server AppFabric Class Library
You'll have to find the actual DLLs in %SystemRoot%\AppFabric (after installing AppFabric, of course). Pretty weird place to put it.
The key classes to look at are SqlInstanceQueryProvider and InstanceQueryExecuteArgs. The query API is asynchronous and can be used something like this (C#):
public InstanceInfo GetWorkflowInstanceInformation(Guid workflowInstanceId, string connectionString)
{
var instanceQueryProvider = new SqlInstanceQueryProvider();
// Connection string to the instance store needs to be set like this:
var parameters = new NameValueCollection()
{
{"connectionString", connectionString}
};
instanceQueryProvider.Initialize("Provider", parameters);
var queryArgs = new InstanceQueryExecuteArgs()
{
InstanceId = new List<Guid>() { workflowInstanceId }
};
// Totally ruins the asynchronous advantages by blocking on a ManualResetEvent.
var waitEvent = new ManualResetEvent(false);
IEnumerable<InstanceInfo> retrievedInstanceInfos = null;
var query = instanceQueryProvider.CreateInstanceQuery();
query.BeginExecuteQuery(
queryArgs,
TimeSpan.FromSeconds(10),
ar =>
{
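// Note: "synchronizer" is assumed to be a private object field on the containing class, used only as a lock target.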
lock (synchronizer)
{
retrievedInstanceInfos = query.EndExecuteQuery(ar).ToList();
}
waitEvent.Set();
},
null);
var waitResult = waitEvent.WaitOne(5000);
if (waitResult)
{
List<InstanceInfo> instances = null;
lock (synchronizer)
{
if (retrievedInstanceInfos != null)
{
instances = retrievedInstanceInfos.ToList();
}
}
if (instances != null)
{
if (instances.Count() == 1)
{
return instances.Single();
}
if (!instances.Any())
{
Log.Warning("Request for non-existing WorkflowInstanceInfo: {0}.", workflowInstanceId);
return null;
}
Log.Error("More than one(!) WorkflowInstanceInfo for id: {0}.", workflowInstanceId);
}
}
Log.Error("Time out retrieving information for id: {0}.", workflowInstanceId);
return null;
}
And just to clarify - this does NOT give you access to the tracking data, which are stored in the Monitoring Database. This API is only for the Persistence Database.
