When calling REST services using System.Net.Http.HttpClient, I have code like
var response = client.GetAsync("api/MyController").Result;
if(response.IsSuccessStatusCode)
...
Is that proper, or should I be doing
client.GetAsync("api/MyController").ContinueWith(task => { var response = task.Result; ... });
It is much safer to do the second. There are a variety of scenarios where the first option can cause a deadlock.
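For illustration, a minimal sketch of that second form written out in full (the response handling is left as a placeholder); if the surrounding method can itself be marked async, awaiting client.GetAsync(...) is another way to avoid blocking on .Result:

// A sketch of the continuation-based call; task.Result is safe inside the
// continuation because the task has already completed by that point.
client.GetAsync("api/MyController").ContinueWith(task =>
{
    var response = task.Result;
    if (response.IsSuccessStatusCode)
    {
        // handle the successful response here
    }
});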
I am beginning with MassTransit, for a publisher/consumer scenario. In production we will be using SQS; however, I would like to be able to use the "In Memory" transport for local development.
I am having trouble forming the correct Uri for the call to ISendEndpointProvider.GetSendEndpoint(), as per:
//THE SET UP CODE:
x.AddConsumer<MTConsumer, MTMessageConsumerDefinition>()
    .Endpoint(e =>
    {
        // override the default endpoint name
        e.Name = "process-input-item";
        //... more configurations as per docs here...
    });

x.UsingInMemory((context, cfg) =>
{
    cfg.ConfigureEndpoints(context);
});
});
//The Publish Code:
var endpoint = await SendEndpointProvider.GetSendEndpoint(new Uri("/ProcessInputItem"));
await endpoint.Send(new MTMessage { InputItemId = item.Id});
Note
I have tried the various cases for the endpoint string.
I do not want to capture the instance of IBus to call Send, as that is not the 'closest' instance to the consumer, which, according to the docs, is important to consider.
MassTransit documentation reference: https://masstransit-project.com/usage/configuration.html#receive-endpoints
Thank you for any guidance with this,
Dylan
As explained in the documentation, there are short endpoint addresses which can be used. In your case:
await SendEndpointProvider.GetSendEndpoint(new Uri("queue:process-input-item"));
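Putting that together with the publish code from the question (a sketch, assuming the endpoint name is left as "process-input-item"):

// Resolve the send endpoint via the short "queue:" address, then send the message.
var endpoint = await SendEndpointProvider.GetSendEndpoint(new Uri("queue:process-input-item"));
await endpoint.Send(new MTMessage { InputItemId = item.Id });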
I am trying to get the data stored in my ObjectStore, and I want to do this synchronously. So instead of using onsuccess I want to use async/await.
I have implemented the code below, but somehow it is not returning the data.
async function viewNotes() {
    const tx = db.transaction("personal_notes", "readonly")
    const pNotes = tx.objectStore("personal_notes")
    const items = await db.transaction("personal_notes").objectStore("personal_notes").getAllKeys()
    console.log("And the Items are ", items.result)

    let NotesHere = await pNotes.getAll().onsuccess
    console.log("Ans this are the logs", NotesHere)
}
I am not getting the data through items.result, nor from NotesHere.
When I inspect items in debug mode, its readyState is still pending even after using await.
What am I missing ?
The IndexedDB API does not natively support async/await. You need to either manually wrap the event handlers in promises, or (much better solution) use a library like https://github.com/jakearchibald/idb that does it for you.
With v5, the PublishRequest extension was removed from the IBus interface.
We used the callback to handle multiple response types that could be returned from the consumer (faults, validations, actual responses, etc.).
What is the equivalent way of publishing a message and wiring up multiple response types?
// Request/Response contracts, may also return validation failure or fault contract
Request<TMessage> request = await bus.PublishRequest<TMessage>(msg, context =>
{
    context.Handle<TResponse>(value => ...);
    context.Handle<TValidation>(value => ...);
    context.Handle<Fault>(value => ...);

    context.CorrelationId = ...
    context.Headers.Set(...);
});

await request.Task;
You can use the new syntax, which is much cleaner overall.
var client = Bus.CreateRequestClient<RegisterMember>();

var (registered, existing) =
    await client.GetResponse<MemberRegistered, ExistingMemberFound>(
        new RegisterMember() { MemberId = "Johnny5" });
This will return either of the two responses; if a fault occurs, awaiting either response will throw the faulted request exception.
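For instance, one hedged way to branch on which response arrived (this assumes the two tuple members behave like tasks and that a consumer fault surfaces as a RequestFaultException; adjust to the actual API of your MassTransit version):

try
{
    var (registered, existing) =
        await client.GetResponse<MemberRegistered, ExistingMemberFound>(
            new RegisterMember() { MemberId = "Johnny5" });

    if (registered.Status == TaskStatus.RanToCompletion)
    {
        var response = await registered;
        // handle the MemberRegistered response here
    }
    else
    {
        var response = await existing;
        // handle the ExistingMemberFound response here
    }
}
catch (RequestFaultException)
{
    // the consumer faulted while handling the request; handle or log it here
}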
You can also use a request handle to add headers, etc.
var client = Bus.CreateRequestClient<RegisterMember>();

var request = client.Create(new RegisterMember { MemberId = "Johnny5" });

// the request is also the send pipe configurator, so...
request.UseExecute(context => context.CorrelationId = someId);

var (registered, existing) =
    await request.GetResponse<MemberRegistered, ExistingMemberFound>();
You can see a working test case in the Futures tests:
https://github.com/MassTransit/MassTransit/blob/develop/src/MassTransit.Futures.Tests/Request_Specs.cs#L170
So I'm building a multipart form uploader over AJAX on Node.js, and sending progress events back to the client over socket.io to show the status of their upload. Everything works just fine until I have multiple clients trying to upload at the same time.

Originally, while one upload was in progress and a second one started up, the second client began receiving progress events from both of the forms being parsed. The original form was not affected and only received progress updates for itself.

I tried creating a new formidable form object and storing it in an array along with the socket's session id to try to fix this, but now the first form stops receiving events while the second form gets processed. Here is my server code:
var http = require('http'),
    formidable = require('formidable'),
    fs = require('fs'),
    io = require('socket.io'),
    mime = require('mime'),
    forms = {};

var server = http.createServer(function (req, res) {
    if (req.url.split("?")[0] == "/upload") {
        console.log("hit upload");
        if (req.method.toLowerCase() === 'post') {
            socket_id = req.url.split("sid=")[1];
            forms[socket_id] = new formidable.IncomingForm();
            form = forms[socket_id];
            form.addListener('progress', function (bytesReceived, bytesExpected) {
                progress = (bytesReceived / bytesExpected * 100).toFixed(0);
                socket.sockets.socket(socket_id).send(progress);
            });
            form.parse(req, function (err, fields, files) {
                file_name = escape(files.upload.name);
                fs.writeFile(file_name, files.upload, 'utf8', function (err) {
                    if (err) throw err;
                    console.log(file_name);
                });
            });
        }
    }
});

var socket = io.listen(server);
server.listen(8000);
If anyone could be of any help on this, I would greatly appreciate it. I've been banging my head against my desk for a few days trying to figure this one out, and would really just like to get this solved so that I can move on. Thank you so much in advance!
Can you try putting console.log(socket_id);
after form = forms[socket_id]; and
after progress = (bytesReceived / bytesExpected * 100).toFixed(0);, please?
I get the feeling that you might have to wrap that socket_id in a closure, like this:
form.addListener(
    'progress',
    (function (socket_id) {
        return function (bytesReceived, bytesExpected) {
            progress = (bytesReceived / bytesExpected * 100).toFixed(0);
            socket.sockets.socket(socket_id).send(progress);
        };
    })(socket_id)
);
The problem is that you aren't declaring socket_id and form with var, so they're actually global.socket_id and global.form rather than local variables of your request handler. Consequently, separate requests step over each other since the callbacks are referring to the globals rather than being proper closures.
rdrey's solution works because it bypasses that problem (though only for socket_id; if you were to change the code in such a way that one of the callbacks referenced form you'd get in trouble). Normally you only need to use his technique if the variable in question is something that changes in the course of executing the outer function (e.g. if you're creating closures within a loop).
Is there any way to use server methods non-asynchronously in a Windows Phone 7 application?
I have a list of data. In a foreach loop, I send a request to the server for each item, but the requests do not complete in the order in which I call them. How can I do that?
Well you can effectively make them synchronous - get rid of it being an actual foreach loop, and instead set up the appropriate asynchronous callbacks so that when the first response comes in, you send the second request, etc, until there are no more requests. (You might want to use a Queue<T> to queue up the requests to send.)
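For example, a rough sketch of that queued approach using WebClient (the names StartRequests, SendNext, and pendingUris are made up for illustration; this assumes the code lives inside your page or view-model class):

// using System.Collections.Generic; using System.Net;
// Queue the URIs, then only issue the next request once the previous
// response has arrived, so the requests complete in order.
private readonly Queue<Uri> pendingUris = new Queue<Uri>();

private void StartRequests(IEnumerable<Uri> urisToRequest)
{
    foreach (var uri in urisToRequest)
        pendingUris.Enqueue(uri);
    SendNext();
}

private void SendNext()
{
    if (pendingUris.Count == 0)
        return;

    var client = new WebClient();
    client.DownloadStringCompleted += (sender, args) =>
    {
        // handle args.Result for this request here...
        SendNext(); // start the next request only after this one completes
    };
    client.DownloadStringAsync(pendingUris.Dequeue());
}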
Do not revert to synchronous ways simply because something appears to not work. There are many benefits to working in an asynchronous world. There are also some dangers. The key is knowing how to avoid those dangers.
Here is an example using WebClient that will have harmful effects.
foreach (var item in items)
{
    WebClient client = new WebClient();
    client.DownloadStringCompleted += (sender, args) =>
    {
        item.Stuff = args.Result;
    };
    client.DownloadStringAsync(new Uri("http://mydomain.com/stuff"));
}
When the callback is invoked, there is no guarantee that item is the same item that "created" the client request. This is known as "access to a modified closure", which simply means you might be trying to modify something that no longer exists, or is not the same object.
The proper approach is to capture your item within the foreach, like so:
foreach (var item in items)
{
    var internalItem = item;
    WebClient client = new WebClient();
    client.DownloadStringCompleted += (sender, args) =>
    {
        internalItem.Stuff = args.Result;
    };
    client.DownloadStringAsync(new Uri("http://mydomain.com/stuff"));
}
This ensures that you are using the correct item because it has been captured within the scope of the foreach.