We are using AWS IoT.
We have predefined topics (+/device/) where the devices publish their messages.
But there is a possibility that the devices publish messages to other topics as well.
I want to count the number of messages published to all topics by each individual device and implement throttling.
I tried to create IoT rules using wildcard topic names like +/* or /, but none of these wildcard topics seem to work.
Is there any wildcard topic name, which I can use to capture the messages from all the topics?
Or is there any way to dump all the messages on all topics into DynamoDB or S3 and calculate the number of messages from individual devices in a specific time period?
I tried to create IoT rules using wildcard topic names like +/* or /, but none of these wildcard topics seem to work.
Is there any wildcard topic name, which I can use to capture the messages from all the topics?
+ and # are the relevant wildcards for AWS IoT rules. See https://docs.aws.amazon.com/iot/latest/developerguide/iot-sql-from.html
You can configure a rule with the following statement to capture messages from all topics.
SELECT * FROM '#'
Or is there any way to dump all the messages on all topics into DynamoDB or S3 and calculate the number of messages from individual devices in a specific time period?
One approach is to create a rule based on the one above that also captures the client ID on every message (using the clientid() function). The action for this rule could write the client ID to DynamoDB or S3. Then this information is available for your calculation.
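A sketch of such a rule statement; clientid() and timestamp() are built-in AWS IoT SQL functions, and the alias names here are arbitrary:

```sql
SELECT *, clientid() AS clientId, timestamp() AS receivedAt FROM '#'
```

The rule action can then route each enriched message to a DynamoDB table or an S3 bucket, keyed by the client ID, so counting messages per device in a time window becomes a simple query.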
An alternate approach might be to write the messages and clientid to a Kinesis Data Stream and use Kinesis Data Analytics to detect the errant devices.
Related
I understand that Topics have the additional features of subscriptions and filters, which Queues do not.
In which case, when would I absolutely need to use a queue over a topic?
For consistency, could I use topics everywhere, including as a replacement for queues?
A topic is not a replacement for a queue; the combination of a topic and a subscription is. A topic allows "replicating" the same message to multiple subscriptions. A subscription is what actually holds messages, and it is identical to a queue in its attributes and behaviour. You could replace a queue with a topic+subscription combo if you'd like, creating two entities per use case instead of a single queue. Just keep in mind there's a finite number of entities per namespace.
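The distinction above can be sketched as a toy in-memory model (this is an illustration, not the Service Bus API): the topic itself holds nothing and only fans messages out, while each subscription holds its own copy like a queue.

```python
from collections import deque

class Topic:
    """Toy model of a topic: it replicates messages but holds none itself."""
    def __init__(self):
        self.subscriptions = {}

    def add_subscription(self, name):
        # Each subscription behaves like its own queue.
        self.subscriptions[name] = deque()

    def publish(self, message):
        # The same message is copied to every subscription.
        for queue in self.subscriptions.values():
            queue.append(message)

    def receive(self, name):
        return self.subscriptions[name].popleft()

topic = Topic()
topic.add_subscription("audit")
topic.add_subscription("billing")
topic.publish("order-created")
print(topic.receive("audit"), topic.receive("billing"))  # both get a copy
```

Replacing a plain queue with this combo means creating both a topic and a subscription, which is where the two-entities-per-use-case cost comes from.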
I want to check how many users are connected to my pubsub pattern. Is there a simple way to do it in Go? Thank you.
pubsub := env.redisCli.PSubscribe(name)
defer pubsub.Close()
I have tried this:
val, _ := env.redisCli.Do("pubsub", "numpat").Int()
But it shows me other patterns as well, and I want to count only that specific pattern.
The redis documentation states you can limit the result to a single subscription with the NUMSUB command, but this will not list clients that are subscribed to patterns:
Returns the number of subscribers (not counting clients subscribed to patterns) for the specified channels.
NUMPAT, on the other hand, counts all the patterns that all clients are connected to:
Note that this is not just the count of clients subscribed to patterns but the total number of patterns all the clients are subscribed to.
(from https://redis.io/commands/pubsub)
I can't find a way to list all subscribers with their subscriptions in Redis. So the only way I can think of is to store that information in Redis (or somewhere else) independently and manage it yourself.
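One way to do that bookkeeping yourself, sketched here in plain Python (in practice you might keep the same mapping in a Redis set per pattern, updated whenever a client subscribes or unsubscribes; the names below are illustrative):

```python
from collections import defaultdict

class SubscriptionRegistry:
    """Track which clients are subscribed to which pattern, independently of Redis."""
    def __init__(self):
        self._clients_by_pattern = defaultdict(set)

    def subscribe(self, client_id, pattern):
        self._clients_by_pattern[pattern].add(client_id)

    def unsubscribe(self, client_id, pattern):
        self._clients_by_pattern[pattern].discard(client_id)

    def count(self, pattern):
        # Number of clients currently subscribed to exactly this pattern.
        return len(self._clients_by_pattern[pattern])

reg = SubscriptionRegistry()
reg.subscribe("client-1", "news.*")
reg.subscribe("client-2", "news.*")
reg.subscribe("client-3", "sports.*")
print(reg.count("news.*"))  # 2
```

Calling subscribe/unsubscribe from the same code paths that call PSubscribe keeps the registry consistent with the actual Redis subscriptions.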
I have a JMS Queue which will be flooded with event messages from another system. I need to write a program/component which reads the messages and compares the event data against set of rules. Multiple rules can match a message from JMS. If they match, the system should be able to send notifications.
Example:
The JMS queue will be flooded with strings.
Users of this system are interested in specific type of strings.
User1 will create a rule - "String that contains no special chars"
User2 will create a rule - "String with only capital letters"
and so on...
I should design a system which consumes the strings from JMS, checks each string against all rules, and alerts the respective user that a string matching his rule has arrived.
Some considerations :
Hundreds of users.
Each user can create hundreds of rules.
In total, thousands of rules need to be run against a single string from JMS.
So, the comparison needs to be extremely fast.
Also, the users are allowed to create more rules while the system is running.
Which framework will help me achieve this?
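Whatever framework ends up doing the matching, the core idea behind making it fast can be sketched with precompiled predicates: compile each rule once when the user creates it, then evaluate every compiled predicate per incoming string. The user names and rule patterns below are illustrative.

```python
import re

# Hypothetical rule store: each (user, rule name) maps to a compiled predicate.
# Compiling once at rule-creation time, not per message, keeps evaluation fast.
rules = {
    ("user1", "no special chars"): re.compile(r"^[A-Za-z0-9 ]*$").fullmatch,
    ("user2", "only capitals"):    re.compile(r"^[A-Z]+$").fullmatch,
}

def matching_users(message):
    """Return the users whose rules match this message."""
    return [user for (user, _name), predicate in rules.items() if predicate(message)]

print(matching_users("HELLO"))   # both rules match
print(matching_users("hello!"))  # neither matches
```

Because users can add rules at runtime, the rule store needs to support concurrent updates; in this sketch that would simply mean inserting a newly compiled predicate into the dict.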
I have read in the ActiveMQ documentation, that subtopics can be created by using wildcards. So for instance I could create the topics:
physicalEnvironmet.Conditions
physicalEnvironmet.Infrastructure
physicalEnvironmet.Location
I could then subscribe to either one of the topics, or to all of them (physicalEnvironmet.>)
But how does it work for more complex structures, like this:
Would the topic for Flickering be called:
physicalEnvironmet.Conditions.Light.Flickering
And could I still have a precise selection, like only subscribing to topics considered with light:
physicalEnvironmet.Conditions.Light.>
So basically I am asking if there is a level restriction for subtopics, and if there is maybe an easier way to create hierarchical topic orders.
In my 10+ yrs of messaging, every hierarchical topic structure ends up being replaced, because the taxonomy never works out. Your overall message pattern suggests a moderate total volume, so I suggest a flexible event model where you use fields to define the variance instead of topic names, e.g. eventType="Environmental", sensorType="Light". This allows you to add new ones and then have the option of filtering out what clients want and do not want, without having to mess with the broker.
Another option is to use JMS headers to do the same. This would allow you to use selectors to do broker-side filtering.
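A toy illustration of that header-based filtering (this is plain Python, not the JMS API; in JMS the same selection would be a selector string such as sensorType = 'Light' evaluated by the broker):

```python
# Hypothetical messages carrying type information in properties instead of topic names.
messages = [
    {"eventType": "Environmental", "sensorType": "Light", "body": "flickering"},
    {"eventType": "Environmental", "sensorType": "Temperature", "body": "22C"},
]

def select(msgs, **wanted):
    """Keep messages whose properties match every wanted key/value pair."""
    return [m for m in msgs if all(m.get(k) == v for k, v in wanted.items())]

print(select(messages, sensorType="Light"))  # only the Light event
```

The advantage of this shape is that adding a new sensor type means adding a new property value, not a new topic in the broker's hierarchy.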
I've read the docs; most examples are for basic use cases,
where simply one process publishes event X and another subscribes to event X.
But in my application X is variable, so let's say X identifies a user.
Can I publish from one server an event like user-ID? If I have 1000s of users connected to a server, is it okay to publish and subscribe to that many dynamic topics, and then have another 20 servers subscribe to those 1000s of topics on this server?
Let's see an example.
I have 10 servers, each with 1000 users connected, so 10k users in total.
I need to send X data from each user to another user.
So I did this:
Server X publishes user-ID data (1 publish per connected user, so 1k publishes)
Server Y subscribes to user-ID data (10k subscribe requests sent to each server)
What would be the optimal way to do pub/sub with dynamic topics so that less bandwidth is used among the servers?
Note:
user-ID is just an example where ID is a dynamic number, and it publishes some real-time data which can't be stored anywhere.
In ZeroMQ subscription matching is implemented in the PUB socket with a prefix-matching trie. This is a very efficient data structure, and I would expect that 10K subscriptions and 10K msg/sec would be no problem at all.
The PUB socket only sends messages for matching subscriptions (so there is no "waste"). If a message doesn't match any subscription then the PUB socket will drop it. Matching messages are only sent to SUB sockets that have subscribed to them.
When you add or remove a subscription, the SUB socket sends a message to its connected PUB socket(s). Each PUB socket then updates its topic trie.
My guess is 10k subs and 10k msgs/s is no problem, but the best thing to do would be to write some test code and try it out. One nice thing about ZeroMQ is that it's not much work to test different architectures.
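The prefix-matching rule described above can be sketched in a few lines of plain Python (names are illustrative; the real matching happens inside the PUB socket's trie): a message is delivered to a subscriber when its topic starts with that subscriber's subscription prefix, otherwise it is dropped.

```python
subscriptions = ["user-1", "user-2"]  # hypothetical per-user topic prefixes

def deliver(topic, subs):
    """Return the subscriptions that would receive a message on `topic`."""
    return [s for s in subs if topic.startswith(s)]

print(deliver("user-1:update", subscriptions))    # ['user-1']
print(deliver("user-999:update", subscriptions))  # [] (message is dropped)
```

A trie makes this lookup independent of the number of subscriptions, which is why 10k dynamic topics is not a concern for the PUB socket itself.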
As far as I know, in the pyzmq API a publisher can send messages on any topic:
socket.send_string("%d %d" % (topic, messagedata))
and subscribers set a filter for the topics of their interest with setsockopt:
topicfilter = "10001"
socket.setsockopt_string(zmq.SUBSCRIBE, topicfilter)
So I think you can fully implement your plan.