Subscribe to a channel by criteria - Ruby

I'm looking for a tool that provides a pub/sub model but, instead of string channels, allows subscribing to data by criteria.
I need to publish a message to WebSocket connections, each of which corresponds to an authenticated user who matches a numeric-range MongoDB query.

Read this: http://redis.io/topics/pubsub
Redis allows pattern-based subscriptions (not regular expressions, but glob-style patterns with an asterisk wildcard).
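For illustration (in Go, to match the other snippets in this thread), here is a minimal sketch of a pattern subscription with the go-redis client (v8, where calls take a context). The "score.*" naming scheme and the localhost address are assumptions; note that glob patterns only match channel names, so criteria such as numeric ranges still have to be encoded into how you name your channels:

package main

import (
	"context"
	"fmt"

	"github.com/go-redis/redis/v8"
)

func main() {
	ctx := context.Background()
	rdb := redis.NewClient(&redis.Options{Addr: "localhost:6379"}) // assumed address

	// PSUBSCRIBE score.* receives everything published to score.1, score.42, ...
	pubsub := rdb.PSubscribe(ctx, "score.*")
	defer pubsub.Close()

	for msg := range pubsub.Channel() {
		fmt.Println(msg.Pattern, msg.Channel, msg.Payload)
	}
}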

Related

Is there a way to check the number of subscribers connected to a pattern in Redis?

I want to check how many users are connected to my pubsub pattern. Is there a simple way to do it in Go? Thank you.
pubsub := env.redisCli.PSubscribe(name)
defer pubsub.Close()
I have tried this:
val, _ := env.redisCli.Do("pubsub", "numpat").Int()
But that returns the count across all patterns, and I want to count only the subscribers to that specific pattern.
The Redis documentation states you can limit the result to a single subscription with the NUMSUB command, but this will not list clients that are subscribed to patterns:
Returns the number of subscribers (not counting clients subscribed to patterns) for the specified channels.
NUMPAT, on the other hand, counts all patterns across all clients:
Note that this is not just the count of clients subscribed to patterns but the total number of patterns all the clients are subscribed to.
(from https://redis.io/commands/pubsub)
I don't see a way to list all subscribers together with their subscriptions in Redis, so the only option I can think of is to store that information in Redis (or somewhere else) independently and manage it yourself.
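If you do manage the counts yourself, a minimal sketch (in Go with go-redis v8; the "pattern_subscribers" hash key name is arbitrary) is to increment a per-pattern counter when you subscribe and decrement it when you close:

package main

import (
	"context"
	"fmt"

	"github.com/go-redis/redis/v8"
)

// subscribeTracked records the subscription in a Redis hash before subscribing.
func subscribeTracked(ctx context.Context, rdb *redis.Client, pattern string) (*redis.PubSub, error) {
	if err := rdb.HIncrBy(ctx, "pattern_subscribers", pattern, 1).Err(); err != nil {
		return nil, err
	}
	return rdb.PSubscribe(ctx, pattern), nil
}

// closeTracked closes the subscription and decrements the counter.
func closeTracked(ctx context.Context, rdb *redis.Client, pattern string, ps *redis.PubSub) error {
	if err := ps.Close(); err != nil {
		return err
	}
	return rdb.HIncrBy(ctx, "pattern_subscribers", pattern, -1).Err()
}

func main() {
	ctx := context.Background()
	rdb := redis.NewClient(&redis.Options{Addr: "localhost:6379"})

	ps, err := subscribeTracked(ctx, rdb, "orders.*")
	if err != nil {
		panic(err)
	}
	defer closeTracked(ctx, rdb, "orders.*", ps)

	// How many subscribers does this specific pattern have right now?
	n, _ := rdb.HGet(ctx, "pattern_subscribers", "orders.*").Int64()
	fmt.Println("subscribers for orders.*:", n)
}

The counter can drift if a client dies without decrementing, so in practice you would also want some expiry or heartbeat mechanism.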

Wildcard topic names and capturing messages from all topics

We are using AWS IoT.
We have predefined topics (+/device/) where the devices publish their messages.
However, the devices could also publish messages to any other topics.
I want to count the number of messages published to all the topics by these individual devices and implement throttling.
I tried to create IoT rules using wildcard topic names like ( +/* or /), but none of these wildcard topics seem to work.
Is there any wildcard topic name which I can use to capture the messages from all the topics?
Or is there any way to dump all the messages on all the topics into DynamoDB or S3 and calculate the number of messages from individual devices in a specific time period?
I tried to create IoT rules using wildcard topic names like ( +/* or /), but none of these wildcard topics seem to work.
Is there any wildcard topic name which I can use to capture the messages from all the topics?
+ and # are the relevant wildcards for AWS IoT rules. See https://docs.aws.amazon.com/iot/latest/developerguide/iot-sql-from.html
You can configure a rule with the following statement to capture messages from all topics.
SELECT * FROM '#'
Or is there any way to dump all the messages on all the topics into DynamoDB or S3 and calculate the number of messages from individual devices in a specific time period?
One approach is to create a rule based on the one above that also passes the client ID on every message (using the clientid() function). The action for this rule could write the client ID to DynamoDB or S3; that information is then available for your calculation.
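For example (the client_id alias here is just illustrative), the rule statement could be:
SELECT *, clientid() AS client_id FROM '#'
The rule's DynamoDB or S3 action would then have the originating device's client ID available in every record it writes.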
An alternative approach might be to write the messages and client IDs to a Kinesis Data Stream and use Kinesis Data Analytics to detect the errant devices.

Looking For A Scalable PubSub Solution Or Alternative

I'm currently looking for the best architecture for an IM app I'm trying to build.
The app consists of channels, each having a couple of thousand subscribed users. Each user is subscribed to only one channel at a time and can publish to and read from that channel. Users may move rapidly between channels.
I initially considered using XMPP PubSub (via ejabberd or MongooseIM), but as far as I understand it was added as an afterthought and is not very scalable.
I also thought about using a message queue protocol like AMQP, but I'm not sure that's what I'm looking for from the IM aspect.
Is my concern regarding the XMPP PubSub justified? And if so, do you know of a better solution?
Take a look at Redis and Kafka. Both are scalable and performant.
Based on your description, I see the following primary use cases for this IM application.
Use cases
- 1) Many new users keep registering with the system and subscribing to one of the channels
- 2) Many existing users change their subscription from one channel to another
- 3) Many existing users keep publishing messages to channels
- 4) Many existing users keep receiving messages as subscribers
XMPP is a natural fit for the 3rd and 4th use cases, and ejabberd is one of the proven, highly scalable platforms for it.
For the 2nd use case, you will probably have logic something like this:
- a) update the user's channel info in the DB
- b) subscribe the user to the new channel
- c) switch the user's publishing topic to the other channel, and so on
Whenever you need to perform multiple operations like this, I strongly recommend using Kafka to run them asynchronously.
For the 1st use case, provide registration through REST APIs so that registration can be done from any device. While registering a user, you may have several operations, such as:
- 1) register the user in the DB
- 2) create the internal IM account
- 3) send an email or SMS for confirmation, and so on
Here too, perform the 1st operation as part of the REST API service logic, and the 2nd and 3rd operations asynchronously using Kafka. That is, the service performs the 1st operation synchronously and raises an event to Kafka; consumers then handle the 2nd and 3rd operations asynchronously, as sketched below.
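A minimal sketch of that registration flow in Go (using the sarama Kafka client; the topic name, event fields, and broker address are all assumptions):

package main

import (
	"encoding/json"
	"log"

	"github.com/Shopify/sarama"
)

// UserRegistered is the event raised after the synchronous DB write;
// consumers create the IM account and send the confirmation email/SMS.
type UserRegistered struct {
	UserID string `json:"user_id"`
	Email  string `json:"email"`
}

func main() {
	// 1) Register the user in the DB synchronously (omitted here).

	// 2) Raise an event to Kafka for the async steps.
	cfg := sarama.NewConfig()
	cfg.Producer.Return.Successes = true // required for SyncProducer
	producer, err := sarama.NewSyncProducer([]string{"localhost:9092"}, cfg)
	if err != nil {
		log.Fatal(err)
	}
	defer producer.Close()

	payload, _ := json.Marshal(UserRegistered{UserID: "u-123", Email: "user@example.com"})
	_, _, err = producer.SendMessage(&sarama.ProducerMessage{
		Topic: "user-registered",
		Value: sarama.ByteEncoder(payload),
	})
	if err != nil {
		log.Fatal(err)
	}
}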
The system can scale well only if all layers/subsystems scale well. From that perspective, the tech stack below may help:
REST APIs + Kafka + ejabberd (XMPP)

Consume multiple messages from a queue in Tibco EMS

Is it possible to consume multiple messages in one call from a TIBCO EMS queue? I'm currently using the Receive method of the MessageConsumer class, but it returns only one Message. Is there something that returns an array of Message objects?
Thanks
A queue should not be treated as an inbound array object, mostly because the number of such objects could be massive, and such behavior would directly contradict the basic "atomic unit of information" notion of messaging. A queue should really be seen as an input "faucet" providing a flow of information.
That said: you might be looking for the javax.jms.QueueBrowser facility. It does contradict typical messaging patterns, but it can be useful. (Rules are meant to be broken sometimes, are they not?)
Here is a link to many related examples.
EMS is a JMS provider, so these examples can be used with it.
To retrieve specific messages (to put in an array?), you could then use receive with message selectors (e.g., on the message ID).
UPDATE: there is also this non-JMS response: use the native EMS API to purge.

What is the right approach for an async work queue with results?

I have a REST server on Heroku. It will have N dynos for the REST service and N dynos for workers.
Essentially, I have some long-running REST requests. When these come in, I want to delegate them to one of the workers, give the client a redirect to poll the operation, and eventually return the result of the operation.
I'm going to use Jedis/Redis from RedisToGo for this. As far as I can tell, there are two ways I can do this:
1. Use the PUB/SUB functionality: have the publisher create unique identities for the work results and return these in a redirect URI to the REST client.
2. Essentially the same thing, but instead of PUB/SUB use RPUSH/BLPOP.
I'm not sure what the advantage of #1 is. For example, if I have a task called LongMathOperation, it seems like I can simply have a list for it. The list elements are JSON objects containing the math operation's arguments as well as a UUID, generated by the REST server, for where the results should be placed. Then all the worker dynos just make blocking BLPOP calls; the first one there gets the job, processes it, and puts the result in Redis using the UUID as the key.
Make sense? So my question is "why would using PUB/SUB be better than this?" What does PUB/SUB bring to the table here that I am missing?
Thanks!
I would also use lists, because pub/sub messages are not persistent: if you have no subscribers, the messages are lost. In other words, if for whatever reason you do not have any workers listening, the client won't get served properly. Lists, on the other hand, are persistent. (Pub/sub obviously doesn't take as much memory as lists, for the same reason: there is nothing to store.)
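A minimal sketch of the list-based worker loop (shown in Go with go-redis v8 purely for illustration; the question's stack is Jedis, and the list and key names here are assumptions):

package main

import (
	"context"
	"encoding/json"
	"time"

	"github.com/go-redis/redis/v8"
)

// MathJob is what the REST dyno RPUSHes onto the job list.
type MathJob struct {
	ResultKey string  `json:"result_key"` // UUID chosen by the REST server
	A         float64 `json:"a"`
	B         float64 `json:"b"`
}

func worker(ctx context.Context, rdb *redis.Client) error {
	for {
		// BLPOP blocks until a job arrives; the first idle worker gets it.
		vals, err := rdb.BLPop(ctx, 0, "jobs:long_math_operation").Result()
		if err != nil {
			return err
		}
		var job MathJob
		if err := json.Unmarshal([]byte(vals[1]), &job); err != nil {
			continue // skip malformed jobs
		}
		result := job.A + job.B // placeholder for the long-running work

		// Store the result where the polling REST endpoint expects it.
		rdb.Set(ctx, job.ResultKey, result, 24*time.Hour)
	}
}

func main() {
	rdb := redis.NewClient(&redis.Options{Addr: "localhost:6379"})
	if err := worker(context.Background(), rdb); err != nil {
		panic(err)
	}
}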
