We have a scenario where we need to notify a user X who belongs to a subscribed group Y whenever other group members are active, in real time.
We decided on the tech stack as:
MongoDB -- store user activity,
Kafka -- push activity events to Kafka (message queue) as well,
Spring Boot -- backend API,
Angular2/Android/iOS -- front end,
WebSockets -- real-time data updates.
So, whenever there is activity from a user, it will be logged in both MongoDB and Kafka.
The client will subscribe to the WebSocket endpoint /activity/{group-id}.
The WebSocket layer will talk to Kafka through a Kafka consumer and send a notification to the client whenever there is a new message in Kafka.
My question is:
How do I keep the Kafka consumer process listening to the Kafka topic (I know how to read messages from Kafka) and push any new message to the clients over the socket?
In other words, one-way communication from the server to the clients of a subscribed group.
Thanks
Pari
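One way to wire this is a long-running consumer loop that fans each record out to the WebSocket sessions subscribed to that group. A minimal sketch of just the fan-out logic, with a plain callback standing in for a WebSocket session and the record key assumed to be the group id (all names here are illustrative, not from the question):

```java
import java.util.*;
import java.util.concurrent.*;
import java.util.function.Consumer;

// Sketch: a registry mapping group ids to connected "sessions"
// (here just callbacks), plus the body of a consumer loop that
// fans each incoming activity event out to the matching group.
public class GroupNotifier {
    private final Map<String, List<Consumer<String>>> sessionsByGroup =
            new ConcurrentHashMap<>();

    // Called when a client subscribes to /activity/{group-id}.
    public void subscribe(String groupId, Consumer<String> session) {
        sessionsByGroup.computeIfAbsent(groupId, g -> new CopyOnWriteArrayList<>())
                       .add(session);
    }

    // Stand-in for the body of the Kafka poll loop: the record key is
    // assumed to be the group id, the value the activity payload.
    public void onRecord(String groupId, String payload) {
        for (Consumer<String> session : sessionsByGroup.getOrDefault(groupId, List.of())) {
            session.accept(payload);   // push to the client over the socket
        }
    }
}
```

With Spring, the loop itself could be a spring-kafka @KafkaListener method (the listener container keeps polling, so the consumer stays alive for you), and the push to subscribed clients could go through SimpMessagingTemplate.convertAndSend("/activity/" + groupId, payload).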
Context and Problem
We are building a notification system.
Publisher users can send messages to other online subscriber users.
Online subscriber users receive the sent messages.
Publisher users and subscriber users are on different instances and have no direct way to reach each other.
It is acceptable for subscribers to miss some notifications in rare scenarios (other means of retrieving all notifications are provided).
Solution
- Publishing
Publisher user publishes a message into RabbitMQ.
Business logic is applied to the message in RabbitMQ consumer.
RabbitMQ consumer publishes the message to Redis event notification_[subscriber_id].
- Subscribing
Subscriber user connects to a WebSocket server.
WebSocket server has a connection to Redis and subscribes to Redis event notification_[subscriber_id] on user connection.
Upon receiving a message on notification_[subscriber_id], it is sent to the user over the WebSocket.
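The subscribe/publish/cleanup lifecycle above might be sketched as follows, with an in-memory map standing in for Redis pub/sub (in Redis these calls would be SUBSCRIBE, UNSUBSCRIBE, and PUBLISH on the notification_[subscriber_id] channel; class and method names are illustrative):

```java
import java.util.*;
import java.util.concurrent.*;
import java.util.function.Consumer;

// Sketch of the per-subscriber channel pattern: subscribe on WebSocket
// connect, unsubscribe on disconnect, publish from the RabbitMQ consumer.
public class NotificationBus {
    private final Map<String, Consumer<String>> channels = new ConcurrentHashMap<>();

    public void subscribe(String subscriberId, Consumer<String> wsSession) {
        channels.put("notification_" + subscriberId, wsSession);
    }

    public void unsubscribe(String subscriberId) {   // cleanup on disconnect
        channels.remove("notification_" + subscriberId);
    }

    // Called by the RabbitMQ consumer after business logic is applied.
    public boolean publish(String subscriberId, String message) {
        Consumer<String> session = channels.get("notification_" + subscriberId);
        if (session == null) return false;           // subscriber offline: drop
        session.accept(message);
        return true;
    }
}
```

Dropping when no channel exists matches the stated requirement that subscribers may miss notifications in rare scenarios.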
Question
Both publisher and subscriber users can be of any number (effectively unbounded); from my research it seems Redis has no practical limit on the number of subscriptions (around 4 billion, if there is one), so:
Is this "dynamic" way of creating subscriptions in Redis scalable?
Yes, you can scale horizontally in Redis Cluster mode, which will allow you to continue serving requests during the scaling process.
Also, it would be smart to design your application to clean up subscriptions, since you seem to be planning to deal with millions or billions of them; good planning before implementation is important.
I am developing a new notification microservice using Spring Boot and Kafka. It works like this:
The notification service has an API where consumers can send a notification message request.
The notification service puts this request on a Kafka topic and sends back an "accepted" response.
The notification service consumes from the Kafka topic and sends the notifications to an external email/SMS service.
For point 1, the message details (e.g. sender, message body) are put into the Kafka message. Ideally I don't want to put all these details in the Kafka message. Instead, I could insert the notification into a DB and put only the returned notification ID into the Kafka message. My only concern is that, given the volume of requests, I don't want to overload the DB, and if there is a DB issue I lose requests.
Any advice/recommendations of what is best practice?
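For comparison, the ID-only variant the question describes is roughly "write to the DB, then enqueue the ID". A toy sketch, with an in-memory map and queue standing in for the DB and the Kafka topic (all names illustrative):

```java
import java.util.*;
import java.util.concurrent.*;
import java.util.concurrent.atomic.AtomicLong;

// Sketch of "store the details, enqueue only the ID".
public class NotificationIngest {
    private final Map<Long, String> store = new ConcurrentHashMap<>();  // stands in for the DB
    private final Queue<Long> topic = new ConcurrentLinkedQueue<>();    // stands in for Kafka
    private final AtomicLong ids = new AtomicLong();

    // API side: persist the details first, then enqueue only the ID.
    public long accept(String details) {
        long id = ids.incrementAndGet();
        store.put(id, details);   // DB insert; a failure here rejects the request cleanly
        topic.offer(id);          // only the small ID travels through Kafka
        return id;                // echoed back in the "accepted" response
    }

    // Consumer side: resolve the ID back to the full details before sending.
    public String next() {
        Long id = topic.poll();
        return id == null ? null : store.get(id);
    }
}
```

Note the trade-off the question already hints at: with the full payload in the Kafka message, Kafka itself is the durable buffer and a DB outage doesn't lose requests; with the ID-only variant, the DB write sits on the request path, so DB availability directly limits ingestion.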
How is it possible for a REST microservice to communicate with another microservice that is a hybrid, meaning it can communicate both via REST and via a message queue? For example an API gateway: to the outside world it can talk to an app or a mobile phone via REST, but communication with the backend happens via a message queue.
Use case:
My homepage wants to get a Vehicle from the database. It asks the API gateway via a GET request. The API gateway takes the GET request and publishes it to the message queue. The other microservice takes the message and publishes the result. The API gateway then consumes the result and sends it back as the response.
How can I implement this? Should I use Spring Boot with Apache Kafka? Do I need to implement asynchronous communication?
(Sorry for my English; I'm German.)
There are a few approaches to this situation.
You might create a topic for each client request and wait for the reply on the other side; e.g., DriverService would read the request message, fetch all your data, and publish it to your client's request topic. As soon as you consume the response message, you destroy that topic.
BUT "temporary" topics might take too long to be deleted in a request-response interaction (unless broker configuration such as the delete.topic.enable property allows it), and you need to monitor possible topic overgrowth.
WebSockets are another possible solution: your client starts listening on a specific topic, previously agreed with your server, and waits for the response within a specific timeout, while your DriverService publishes to that specific socket channel.
Spring Boot offers great starters for both Kafka and WebSockets. If you are expecting a large volume of transactions, I would go with a mixed strategy: use Kafka to help the backend scale and process all transactions, then respond to the client via WebSocket.
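A third option, which avoids per-request topics entirely, is a shared reply topic plus a correlation id: the gateway parks each request in a map of pending futures, and the reply-topic consumer completes whichever future matches. A sketch with in-memory queues standing in for the Kafka topics (names illustrative):

```java
import java.util.*;
import java.util.concurrent.*;

// Sketch of gateway-side request/reply over a message queue using a
// correlation id instead of temporary per-request topics.
public class RequestReplyGateway {
    private final Map<String, CompletableFuture<String>> pending = new ConcurrentHashMap<>();

    // Gateway side: publish the request tagged with a correlation id and
    // return a future the HTTP handler can await with a timeout.
    public CompletableFuture<String> send(String correlationId, String request,
                                          Queue<String[]> requestTopic) {
        CompletableFuture<String> reply = new CompletableFuture<>();
        pending.put(correlationId, reply);
        requestTopic.offer(new String[] {correlationId, request});
        return reply;
    }

    // Reply-topic consumer: complete whichever pending request matches.
    public void onReply(String correlationId, String result) {
        CompletableFuture<String> reply = pending.remove(correlationId);
        if (reply != null) reply.complete(result);
    }
}
```

If you'd rather not hand-roll this, spring-kafka packages the same pattern as ReplyingKafkaTemplate.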
We are currently implementing a simple chat app that allows users to create conversations and exchange messages.
Our basic setup involves AngularJS on the front-end and SignalR hub on the back end. It works like this:
Client app opens a Websockets connection to our real-time service (based on SignalR) and subscribes to chat updates
User starts sending messages. For each new message, client app calls HTTP API to send it
The API stores the message in the database and notifies our real-time service that there is a new message
Real-time service pushes the message via Websockets to subscribed Clients
However, we noticed that opening so many HTTP connections, one for each new message, may not be a good idea, so we were wondering whether WebSockets should be used to both send and receive messages.
The new setup would look like this:
Client app opens a Websockets connection with real-time service
User starts sending messages. Client app pushes the messages to real-time service using Websockets
Real-time service picks up the message, notifies our persistence service it needs to be stored, then delivers the message to other subscribed Clients
Persistence service stores the message
Which of these options is more typical when setting up an efficient and performant chat system? Thanks!
You don't need a separate HTTP or Web API to persist messages. Persist them in the hub method that broadcasts the message; you can use async methods in the hub and create async tasks to save the message.
Using a separate persistence API and then calling SignalR to broadcast isn't efficient, and why duplicate all the effort?
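The "persist inside the broadcasting hub method" idea can be sketched like this (plain Java rather than SignalR/C#, with callbacks standing in for connected clients; all names are illustrative):

```java
import java.util.*;
import java.util.concurrent.*;
import java.util.function.Consumer;

// Sketch: the same handler that receives a chat message over the socket
// both broadcasts it and kicks off the persistence write asynchronously,
// so no extra HTTP round-trip per message is needed.
public class ChatHub {
    private final List<Consumer<String>> subscribers = new CopyOnWriteArrayList<>();
    private final List<String> db = Collections.synchronizedList(new ArrayList<>()); // stands in for the DB
    private final ExecutorService persister = Executors.newSingleThreadExecutor();

    public void join(Consumer<String> client) { subscribers.add(client); }

    // The "hub method": broadcast to connected clients right away and
    // hand the write to a background task so the socket isn't blocked.
    public CompletableFuture<Void> sendMessage(String message) {
        for (Consumer<String> client : subscribers) client.accept(message);
        return CompletableFuture.runAsync(() -> db.add(message), persister);
    }

    public List<String> stored() { return db; }

    public void shutdown() { persister.shutdown(); }
}
```

One consequence of this ordering: clients see the message even if the background save later fails, so you would still want retry or dead-letter handling on the persistence task.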
My goal is to create an application that I can use to manage pub/sub for various clients. The application should be able to receive new topics via an API and then accept subscribers via a websocket connection.
I have it working, but am aware the current solution has many flaws. It works currently as follows:
I have a chicago_boss app with a WebSocket endpoint for clients to connect to. Once a client connects, I add the Pid for that WebSocket connection to a list in Redis:
1. Client connects to "ws://localhost:8001/websocket/game_notifications"
2. The Pid for that WebSocket connection is added to Redis using LPUSH game_notifications_pids "<0.201.0>"
3. The last 10 messages in Redis for game_notifications are sent to the WebSocket Pid
4. A new message is posted to "/game_notifications/create"
5. The message is added to Redis using LPUSH game_notifications "new message"
6. All Pids in Redis under the key game_notifications_pids are sent this new message
7. On closing of the WebSocket, the Pid is deleted from the Redis list
Please let me know what problems you see with this setup. Thanks!
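The replay-then-subscribe flow above can be sketched language-agnostically (Java here, since Erlang specifics like Pids don't carry over; a Deque stands in for the Redis list game_notifications and a Set for game_notifications_pids):

```java
import java.util.*;
import java.util.concurrent.*;
import java.util.function.Consumer;

// Sketch of the question's flow: on connect, replay the last 10
// messages and register the connection; on publish, prepend to history
// (LPUSH-style) and fan out; on close, deregister.
public class GameNotifications {
    private final Deque<String> history = new ConcurrentLinkedDeque<>();
    private final Set<Consumer<String>> connections = ConcurrentHashMap.newKeySet();

    public void connect(Consumer<String> ws) {
        history.stream().limit(10).forEach(ws);  // replay the 10 newest messages
        connections.add(ws);
    }

    public void disconnect(Consumer<String> ws) { connections.remove(ws); }

    public void publish(String message) {
        history.addFirst(message);               // LPUSH game_notifications
        for (Consumer<String> ws : connections) ws.accept(message);
    }
}
```

One thing this in-process sketch sidesteps, and the Redis setup does not: a Pid serialized into Redis stays there after the owning process dies, so stale entries accumulate unless they are validated or expired.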