I'm using SignalR's latest version and it's great. However, I've recently hit an interesting scaling issue: SignalR sends all of a client's groups in the query string. My system requires that a user join numerous groups, one for each project the user is subscribed to, and receive notifications on any of them.
This large number of groups (combined with my use of GUIDs for IDs) has caused the query string to reach its maximum size, and SignalR to stop working.
This is what Fiddler shows in the request webform (using long polling to work around the Azure bug):
transport longPolling
connectionId bbed6f35-8379-4be3-ac28-ef3e618382ae
connectionData [{"name":"jethub"}]
messageId 85
groups ["JetHub.f9f81bcc-8417-46bd-bae5-c4134972601d","JetHub.5139a8de-04c2-48da-9427-39666e52fabd","JetHub.6b12e333-4d22-47c3-8587-7a9ad5026184","JetHub.252ea279-7a71-40e7-b03c-7d63e69f07ab","JetHub.a4843a77-1e6d-4693-b3de-b392ef465410","JetHub.27feb53a-3c2d-4b11-92f7-dbdffb874b25","JetHub.8840dfcf-e6be-4b72-965b-b282a60446e8","JetHub.bf7d3301-6fc0-4499-bee8-fe22f1bc2281","JetHub.655cba0e-7f72-402c-b80b-dcb740546163","JetHub.85d817e2-67a3-4291-b564-5320598339f6","JetHub.e3079263-3f6e-4a54-ad88-0dfc5dd2ce18","JetHub.33f00a67-9b05-4293-8119-4617e2fed9b0","JetHub.6323cfe8-fb81-4716-b553-79b9d72641a5","JetHub.b4359f8a-030a-4ac9-aacd-c05b42163bcc", ... many more]
I know I can increase the maximum query string size in IIS, but is there a better way to manage groups? Should I try to create my own server-side grouping scheme and broadcast to each client separately? Can PersistentConnections help in that regard?
Thanks.
As groups are round-tripped via the query string in SignalR 0.5.3, you have the following options:
a) increase the maximum query string size
b) use shorter group names
c) handle grouping yourself on the server and broadcast separately to each user
PersistentConnections won't help here as the Hub API is built on top of them, so you'll run into the same problem.
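For option (c), here is a minimal sketch of what a server-side group registry could look like (all names here are hypothetical; this is plain C#, not SignalR API). The idea is that group membership lives on the server, keyed by connection id, so nothing has to round-trip through the query string:

```csharp
using System;
using System.Collections.Concurrent;
using System.Collections.Generic;
using System.Linq;

// Hypothetical server-side group registry. Group membership never leaves the
// server, so the query string stays small no matter how many projects a user follows.
public class GroupRegistry
{
    private readonly ConcurrentDictionary<string, HashSet<string>> _groups =
        new ConcurrentDictionary<string, HashSet<string>>();

    public void Join(string groupName, string connectionId)
    {
        var members = _groups.GetOrAdd(groupName, _ => new HashSet<string>());
        lock (members) members.Add(connectionId);
    }

    public void Leave(string groupName, string connectionId)
    {
        HashSet<string> members;
        if (_groups.TryGetValue(groupName, out members))
            lock (members) members.Remove(connectionId);
    }

    public IList<string> GetMembers(string groupName)
    {
        HashSet<string> members;
        if (!_groups.TryGetValue(groupName, out members)) return new List<string>();
        lock (members) return members.ToList();
    }
}
```

On broadcast you would call GetMembers for the group and send to each connection id individually.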
I know that this has already been answered to your satisfaction, but there is a way of managing groups on the server without sending messages to each client individually. You can implement your own HubDispatcher:
using System.Collections.Generic;
using SignalR;
using SignalR.Hubs;

namespace My.Hubs
{
    public class MyHubDispatcher : HubDispatcher
    {
        public MyHubDispatcher() : base("/myhubs") { }

        protected override Connection CreateConnection(string connectionId, IEnumerable<string> signals, IEnumerable<string> groups)
        {
            // Look up the connection's groups on the server instead of trusting the query string.
            // ex: IEnumerable<string> myGroups = new string[] { "MyHub.MyGroup", "MyHub.MyOtherGroup", "MyOtherHub.MyGroup" };
            IEnumerable<string> myGroups = GetGroups(connectionId);
            return base.CreateConnection(connectionId, signals, myGroups);
        }
    }
}
You can then set up routing like any other PersistentConnection:
using System.Web;
using System.Web.Routing;
using SignalR;

namespace My
{
    // Note: For instructions on enabling IIS6 or IIS7 classic mode,
    // visit http://go.microsoft.com/?LinkId=9394801
    public class Application : HttpApplication
    {
        protected void Application_Start()
        {
            RouteTable.Routes.MapConnection<Hubs.MyHubDispatcher>("myhubs", "myhubs/{*operation}");
            RouteConfig.RegisterRoutes(RouteTable.Routes);
        }
    }
}
Then you can use groups on your Hubs like you normally would:
using SignalR.Hubs;

namespace My.Hubs
{
    public class MyHub : Hub
    {
        public void AlertClients(string id, int duration)
        {
            Clients["MyGroup"].Alert("MyGroup");
            Clients["MyOtherGroup"].Alert("MyOtherGroup");
        }
    }
}
If you're using the JS client you can simply include the script from ~/myhubs/hubs instead of ~/signalr/hubs. If you are using the .NET client you just use new Client.Hubs.HubConnection("http://foo/myhubs", useDefaultUrl: false);
Related
I have set up a SignalR website in .NET Core. The function in my hub is:
public async Task Notify(int id)
{
    await Clients.All.InvokeAsync("Notified", id);
}
I have also tested this with the following js:
let connection = new signalR.HubConnection(myURL);
connection.on('Notified', data => {
console.log(4, data);
});
connection.start();
The js code seems to work fine and I see the log when I try connection.Invoke('Notify').
Now I have a console app that needs to make the invoke. I am trying this in two ways and don't mind either solution:
1. A mvc controller within the signalR website that can take the id and invoke 'Notified'.
2. Use the client library Microsoft.AspNetCore.SignalR.Client in the console app.
Way 1 I have only done in classic ASP.NET, like this:
GlobalHost.ConnectionManager.GetHubContext(hubName)
But I couldn't find a way to do this in .NET Core.
For way 2, I have used the library and tried this so far:
var con = new HubConnectionBuilder();
con.WithUrl(myURL);
var connection = con.Build();
connection.InvokeAsync("Notify",args[0]).Wait();
This is the closest I have come to creating a connection the same way the js code does. However, this code throws a NullReferenceException when calling connection.InvokeAsync. The connection object itself is not null; it seems to be an internal object that is null. According to the stack trace, the exception is thrown when a MoveNext() function is internally called.
Well, it looks like both are not currently possible. For now I have used a forced workaround which is hopefully temporary.
I have created and used the following base class for hubs:
public abstract class MyHub : Hub
{
    // Note: a static Dictionary is not thread-safe; this is a temporary workaround.
    private static Dictionary<string, IHubClients> _clients = new Dictionary<string, IHubClients>();

    public override Task OnConnectedAsync()
    {
        var c = base.OnConnectedAsync();
        _clients.Remove(Name);
        _clients.Add(Name, Clients);
        return c;
    }

    public static IHubClients GetClients(string Name)
    {
        return _clients.GetValueOrDefault(Name);
    }
}
GlobalHost is gone. You need to inject IHubContext<THub> like in this sample.
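A sketch of the injection approach (the controller and route names are illustrative; MyHub stands for your concrete hub class, and the InvokeAsync call mirrors the alpha-era client API used in the question — on current SignalR this is SendAsync):

```csharp
using System.Threading.Tasks;
using Microsoft.AspNetCore.Mvc;
using Microsoft.AspNetCore.SignalR;

// Hypothetical controller inside the SignalR website (way 1): the framework
// constructor-injects IHubContext<MyHub>, replacing the old GlobalHost lookup.
public class NotifyController : Controller
{
    private readonly IHubContext<MyHub> _hubContext;

    public NotifyController(IHubContext<MyHub> hubContext)
    {
        _hubContext = hubContext;
    }

    [HttpPost("notify/{id}")]
    public async Task<IActionResult> Notify(int id)
    {
        // Broadcast to all connected clients, same as the hub method does.
        await _hubContext.Clients.All.InvokeAsync("Notified", id);
        return Ok();
    }
}
```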
This could be a bug in SignalR alpha1. Can you file an issue on https://github.com/aspnet/signalr and include a simplified repro?
ASP.NET apps using OWIN permit multiple identity sources (Facebook, Google, etc.). Most of the provider-specific information those sources provide is irrelevant to my app, potentially even large, and I don't want it in my cookies for the whole session. My app is primarily WebAPI, but I suspect the question applies equally to MVC and WebForms.
For now, all I need is an integer account ID. Where/when should I reconstruct the identity, after external authentication?
For example, here is one way I could filter claims:
public void ReplaceExistingClaims(ClaimsIdentity identity)
{
    Claim customClaim = GetCustomClaimFromDbForIdentity(identity);
    // Materialize the list first (requires System.Linq) so we can remove while enumerating safely.
    foreach (Claim claim in identity.Claims.ToList()) identity.RemoveClaim(claim);
    identity.AddClaim(customClaim);
}
And following are two different places I could inject those claims changes:
var facebookAuthenticationOptions = new FacebookAuthenticationOptions
{
    Provider = new FacebookAuthenticationProvider
    {
        OnAuthenticated = context =>
        {
            ReplaceExistingClaims(context.Identity);
            return Task.FromResult(0);
        }
    }
};
Above, I know I can hook an individual provider from Startup IF it provides an Authenticated event. I have two conceptual problems with this. One: it requires me to write and wire up my code separately for each provider I plug in. Two: there is no requirement for providers to provide this event. Both of these make me feel like there must be a different intended insertion point for my code.
public ActionResult ExternalLoginCallback(string returnUrl)
{
    ReplaceExistingClaims((ClaimsIdentity)User.Identity);
    return new RedirectResult(returnUrl);
}
Above, I know I can put code in ExternalLoginCallback. But this happens too late, for two reasons. One: the user has already been issued a ticket I consider invalid, but the default [Authorize] considers valid because it's signed by me, and now they are making requests to my site with it. There could even be race conditions here. Two: there is no guarantee the browser will visit this redirect, and I'd prefer from a design perspective if it didn't have to, e.g. to simplify my WebAPI client code.
To the best of my knowledge, the best solution will meet these requirements:
same code applies to all providers
client receives my custom ticket from my server (e.g. without image claims)
client never receives another ticket format from my server
the authentication process requires the minimum possible HTTP round-trips
token-refresh and other core identity features are still available
once a user is [Authorize]d, no further account transformation is necessary
database/repository access is feasible during ticket generation
Some pages I'm researching, for my own notes:
How do I access Microsoft.Owin.Security.xyz OnAuthenticated context AddClaims values?
https://katanaproject.codeplex.com/SourceControl/latest#src/Microsoft.Owin.Security.Facebook/FacebookAuthenticationHandler.cs
https://katanaproject.codeplex.com/workitem/82
https://www.simple-talk.com/dotnet/.net-framework/creating-custom-oauth-middleware-for-mvc-5/
You have to implement a DelegatingHandler and put all your authentication routines in it.
Register it at application start (DI usage is enabled):
private static void RegisterHandlers(HttpConfiguration config)
{
    var authHandler = new MyFacebookAuthHandler();
    config.MessageHandlers.Add(authHandler);
}
And this is an example of implementation:
public class MyFacebookAuthHandler : DelegatingHandler
{
    protected sealed override Task<HttpResponseMessage> SendAsync(HttpRequestMessage request,
        CancellationToken cancellationToken)
    {
        try
        {
            // Process credentials.
            // Probably you have to save some auth information to HttpContext.Current,
            // or throw NotAuthorizedException.
        }
        catch (NotAuthorizedException ex)
        {
            return Task.FromResult(request.CreateErrorResponse(HttpStatusCode.Unauthorized, ex));
        }
        catch (Exception ex)
        {
            return Task.FromResult(request.CreateErrorResponse(HttpStatusCode.InternalServerError, ex));
        }
        return base.SendAsync(request, cancellationToken);
    }
}
The ClaimsAuthenticationManager class is specifically for this.
https://msdn.microsoft.com/en-us/library/system.security.claims.claimsauthenticationmanager(v=vs.110).aspx
Code sample from that reference:
class SimpleClaimsAuthenticationManager : ClaimsAuthenticationManager
{
    public override ClaimsPrincipal Authenticate(string resourceName, ClaimsPrincipal incomingPrincipal)
    {
        if (incomingPrincipal != null && incomingPrincipal.Identity.IsAuthenticated)
        {
            ((ClaimsIdentity)incomingPrincipal.Identity).AddClaim(new Claim(ClaimTypes.Role, "User"));
        }
        return incomingPrincipal;
    }
}
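For the manager to run, it also has to be registered in web.config under the system.identityModel section (a sketch; the type and assembly names here are placeholders for your own):

```xml
<configuration>
  <configSections>
    <section name="system.identityModel"
             type="System.IdentityModel.Configuration.SystemIdentityModelSection, System.IdentityModel, Version=4.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089" />
  </configSections>
  <system.identityModel>
    <identityConfiguration>
      <!-- Placeholder namespace/assembly; point this at your own manager class. -->
      <claimsAuthenticationManager type="My.Namespace.SimpleClaimsAuthenticationManager, MyAssembly" />
    </identityConfiguration>
  </system.identityModel>
</configuration>
```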
I am looking for a way to subscribe to events like storing a specific object type in ServiceStack.Redis.
For example, I may:
using (var redisClient = new RedisClient())
using (var redisMyObjects = redisClient.As<MyObject>())
{
    redisMyObjects.Store(myObject); // <-- I want this to trigger an event somehow
}
Is there anything like an OnStore event which I can hook into, anything out of the box? If not, is there any recommendation about how this should be done?
I don't think there is anything you can hook into (could be wrong).
Two options that came to mind:
1 - Make an extension method
2 - Publish a message to store your object and have a handler that listens for a response and does something. This is probably overkill since it's heading into the publish/subscribe realm. But, I think, worth looking into. (Basic example here and see Pub/Sub here).
Extension Method
public static class RedisClientExtensions
{
public static void StoreWithTrigger<T>(this IRedisTypedClient<T> redisClient, T value, Action<T> trigger)
{
redisClient.Store(value);
trigger(value);
}
}
Using ExtensionMethod
public void MyMethod()
{
    using (var redisClient = new RedisClient())
    using (var redisMyObjects = redisClient.As<MyObject>())
    {
        redisMyObjects.StoreWithTrigger<MyObject>(new MyObject(), TriggerEvent); // <-- triggers the event
    }
}

private void TriggerEvent<T>(T value)
{
    // do something
}
Hope this gives you some ideas.
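If you'd rather raise a real .NET event than pass a delegate on every call, a small hypothetical wrapper (not part of ServiceStack; all names are made up) could look like:

```csharp
using System;

// Hypothetical wrapper: raises a .NET event after each store, so several
// subscribers can react without changing the call sites.
public class TriggeringStore<T>
{
    public event Action<T> Stored;

    private readonly Action<T> _store;

    // Pass in the real store action, e.g. value => redisMyObjects.Store(value).
    public TriggeringStore(Action<T> store)
    {
        _store = store;
    }

    public void Store(T value)
    {
        _store(value);
        var handler = Stored; // copy the delegate for thread safety
        if (handler != null) handler(value);
    }
}
```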
So I'm still in the process of learning SNMP, please go easy. I'm using snmp4j, not just the libraries but I've loaded the source code and I'm not against modifying the source if it gets me what I need. I've programmed an agent and a test client. What I want to do is be able to check the requests coming in from the test client and specifically listening for a "set" request to a specific OID.
The current way I'm thinking of doing it is catching the request right after it runs the snmp4j method fireProcessMessage (located in the package org.snmp4j.transport.DefaultUdpTransportMapping), but I don't know how an agent queries its own MIB for an OID. Is there a method that the agent uses to get OID values from its MIB?
Or is there a better way to catch a specific SET request? Is it even possible to do what I want? Basically, I want to run another process if the client sets a certain OID value to 1 (true).
It can be done by extending CommandProcessor and implementing RequestHandler, as I have done:
public class SNMPRequestProcessor extends CommandProcessor
{
    SetHandler setHandler = new SetHandler();

    public SNMPRequestProcessor()
    {
        // Your code
    }

    @Override
    protected void processRequest(CommandResponderEvent command, CoexistenceInfo cinfo, RequestHandler handler)
    {
        synchronized (command) {
            if (command.getPDU().getType() == PDU.SET) {
                // Side-channel: let our handler see the SET first...
                super.processRequest(command, cinfo, setHandler);
            }
            // ...then let the default handler process the request normally.
            super.processRequest(command, cinfo, handler);
        }
    }

    /**
     * Handler for SET requests, e.g. to update the database.
     */
    class SetHandler implements RequestHandler
    {
        @Override
        public boolean isSupported(int mode)
        {
            return mode == PDU.SET;
        }

        @Override
        public void processPdu(Request request, MOServer server)
        {
            // Your code
        }
    }
}
I have no experience with the agent side of snmp4j, but I recommend posing this question on the official mailing list: http://lists.agentpp.org/pipermail/snmp4j/. It is quite active; you'll have a good answer in a few hours.
How can I get all the items being published by code when a directory is published, and which event should I add my handler to: publish:begin or publish:itemProcessing?
If you're looking to set up a custom event handler, start with the web.config reference.
<event name="publish:begin">
<handler type="YourNamespace.YourClass, YourLibrary" method="YourHandlerMethod" />
</event>
Then create a class that will support this reference.
using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;
using System.Collections;
using Sitecore.Diagnostics;
using Sitecore.Sites;
using Sitecore.Configuration;
using Sitecore.Caching;
using Sitecore.Events;
using Sitecore.Publishing;
using Sitecore.Data.Events;
using Sitecore.Data;
using Sitecore.Data.Items;
namespace YourNamespace {
    public class YourClass {
        public void YourHandlerMethod(object sender, EventArgs args) {
            Assert.ArgumentNotNull(sender, "sender");
            Assert.ArgumentNotNull(args, "args");
            // Try to get the Sitecore event args.
            if (args.GetType().ToString().Equals("Sitecore.Events.SitecoreEventArgs")) {
                SitecoreEventArgs sargs = (SitecoreEventArgs)args;
                foreach (object o in sargs.Parameters) {
                    // Try to get the publisher object.
                    if (o.GetType().ToString().Equals("Sitecore.Publishing.Publisher")) {
                        Publisher p = (Publisher)o;
                        if (p != null) {
                            Item root = p.Options.RootItem;
                            bool b = p.Options.RepublishAll;
                            if (p.Options.Mode.Equals(PublishMode.SingleItem)) {
                                // Only one item was published.
                            }
                        }
                    }
                }
            }
        }
    }
}
From this class you can access the Publisher object, which gives you the root item published and the publish options. The publish options will tell you whether a single item was published or whether all language versions were published.
Depending on your real needs, it might make more sense to inject a custom processor into the publishItem pipeline rather than use the publish:itemProcessing event. If you take a closer look at that pipeline (search for "&lt;publishItem" in web.config), you'll see that those events (publish:itemProcessing and publish:itemProcessed) are generated by the appropriate processors of that pipeline.
NOTE: the publishing process is rather complex, and I would not recommend doing anything with the item being published that could influence the process in general. I can't give you an example here; only your imagination sets the limits...
Note also that with those events, as well as the pipeline I mentioned, you operate on one item at a time: your code will be called for each item being published. This can become performance-critical...
UPDATE: You can read more about pipeline in this blog post. Apart from being useful itself, it contains more useful links on the subject.