gRPC endpoint that sends initial data followed by a stream of data - protocol-buffers

I want to define a gRPC endpoint that, when called, returns some initial data and afterwards a stream of data. For example, in a game it could create a lobby, return some initial data about the lobby to the creator, and afterwards stream an event every time a player joins.
I want to achieve something like this:
message LobbyData {
  string id = 1;
}

message PlayerJoin {
  string id = 2;
}

service LobbyService {
  rpc OpenLobbyAndListenToPlayerJoins(Empty) returns (LobbyData, stream PlayerJoin);
}
Unfortunately this is not possible so I have 2 options:
Option 1 (not what I want)
Create two separate RPCs and call them sequentially on the client:
service LobbyService {
  rpc OpenLobby(Empty) returns (LobbyData);
  rpc ListenToPlayerJoins(LobbyData) returns (stream PlayerJoin);
}
This, however, creates a problem: players could join the lobby before the client's second RPC (ListenToPlayerJoins) reaches the server. So the server would need additional logic to open the lobby only after the creator's ListenToPlayerJoins RPC has arrived.
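A sketch of that extra server-side logic in plain Java (class and method names are illustrative, not from any gRPC API): the lobby buffers joins that arrive before the creator's listener has attached, then replays them.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.function.Consumer;

// Buffers player joins until the creator's ListenToPlayerJoins call attaches,
// so no join event is lost in the gap between the two RPCs.
class Lobby {
    private final List<String> buffered = new ArrayList<>();
    private Consumer<String> listener; // null until the listener RPC arrives

    public synchronized void onPlayerJoin(String playerId) {
        if (listener == null) {
            buffered.add(playerId); // joined before the creator is listening
        } else {
            listener.accept(playerId);
        }
    }

    public synchronized void attachListener(Consumer<String> l) {
        this.listener = l;
        buffered.forEach(l); // replay joins that happened in the gap
        buffered.clear();
    }
}
```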
Option 2 (also not what I want)
Use a single RPC with a sum type:
message LobbyDataOrPlayerJoin {
  oneof type {
    LobbyData lobby_data = 1;
    PlayerJoin player_join = 2;
  }
}

service LobbyService {
  rpc OpenLobbyAndListenToPlayerJoins(Empty) returns (stream LobbyDataOrPlayerJoin);
}
This allows a single RPC where the first element of the stream is a LobbyData object and all subsequent elements are PlayerJoins. What is not nice is that every streamed event after the first is a PlayerJoin, yet the client still receives it as the sum type LobbyDataOrPlayerJoin, which is not clean.
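The oneof dispatch the client would have to do can at least be isolated in one place, so the rest of the code works with concrete types. A sketch with plain Java classes standing in for the generated protobuf messages (all names here are illustrative):

```java
import java.util.Iterator;
import java.util.function.Consumer;

// Stand-in for the generated sum type: exactly one field is non-null
class LobbyDataOrPlayerJoin {
    final String lobbyDataId;  // set if this element is the LobbyData
    final String playerJoinId; // set if this element is a PlayerJoin
    LobbyDataOrPlayerJoin(String lobbyDataId, String playerJoinId) {
        this.lobbyDataId = lobbyDataId;
        this.playerJoinId = playerJoinId;
    }
}

class LobbyClient {
    // Enforces the invariant once: first element is LobbyData, the rest are PlayerJoins
    static String demux(Iterator<LobbyDataOrPlayerJoin> stream, Consumer<String> onJoin) {
        LobbyDataOrPlayerJoin first = stream.next();
        if (first.lobbyDataId == null) {
            throw new IllegalStateException("expected LobbyData as first element");
        }
        while (stream.hasNext()) {
            LobbyDataOrPlayerJoin next = stream.next();
            if (next.playerJoinId == null) {
                throw new IllegalStateException("expected PlayerJoin");
            }
            onJoin.accept(next.playerJoinId);
        }
        return first.lobbyDataId;
    }
}
```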
Both options seem to me like workarounds. Is there a real solution to this problem?

Related

How to segregate data from Flux without the underlying call again?

I am working with the reactive Spring Boot framework and I have a flow like this in my application (pseudocode):
Flux<String> keys; // getting flux of keys from some part of code
Flux<RedisData> response = getDataFromRedisForKeys(keys);
Flux<RedisData> responseTypeA = filterATypeResponseFromRedisDataFlux(response);
Flux<RedisData> responseTypeB = filterBTypeResponseFromRedisDataFlux(response);
Flux<RedisData> responseTypeC = filterCTypeResponseFromRedisDataFlux(response);
Flux<RedisData> responseTypeD = filterDTypeResponseFromRedisDataFlux(response);
Now, when I do flatMap operations on the four fluxes after the filters, I see that the data is fetched from Redis four times. What I want is to fetch it once, reactively, and segregate it without blocking.
getDataFromRedisForAllKeys returns the Redis data corresponding to your incoming keys; you have to implement it yourself. You just need to query Redis for the keys you need first, and then filter or subscribe. You can try the subscribe method and process the data inside it:
Flux<RedisData> result = getDataFromRedisForAllKeys(keys);
result.subscribe(data -> {
    switch (data.key) {
        case "A":
            // do something
            break;
        case "B":
            // do something
            break;
        case "C":
            // do something
            break;
        case "D":
            // do something
            break;
    }
});
Or use the Flux filter method
Flux<RedisData> responseTypeA = result.filter(redisData -> "A".equals(redisData.key));
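For the original problem (the Redis fetch running four times), the usual Reactor fix is to make the upstream shared before filtering, e.g. Flux<RedisData> response = getDataFromRedisForKeys(keys).cache(); (or .publish().refCount(4)), so all four filters subscribe to one replayed sequence instead of re-triggering the fetch. The idea, sketched with the JDK's built-in Flow API so the source is produced once while several filtering subscribers each see every element:

```java
import java.util.List;
import java.util.concurrent.CopyOnWriteArrayList;
import java.util.concurrent.CountDownLatch;
import java.util.concurrent.Flow;
import java.util.concurrent.SubmissionPublisher;
import java.util.concurrent.TimeUnit;

class MulticastDemo {
    // Keeps only the items starting with a given prefix, like the filter* methods above
    static class FilterSubscriber implements Flow.Subscriber<String> {
        final String prefix;
        final List<String> received = new CopyOnWriteArrayList<>();
        final CountDownLatch done = new CountDownLatch(1);

        FilterSubscriber(String prefix) { this.prefix = prefix; }

        public void onSubscribe(Flow.Subscription s) { s.request(Long.MAX_VALUE); }
        public void onNext(String item) { if (item.startsWith(prefix)) received.add(item); }
        public void onError(Throwable t) { done.countDown(); }
        public void onComplete() { done.countDown(); }
    }

    public static void main(String[] args) throws InterruptedException {
        SubmissionPublisher<String> source = new SubmissionPublisher<>();
        FilterSubscriber typeA = new FilterSubscriber("A");
        FilterSubscriber typeB = new FilterSubscriber("B");
        source.subscribe(typeA);
        source.subscribe(typeB);

        // The "fetch" happens once; every subscriber sees every element
        for (String item : List.of("A:1", "B:2", "A:3")) {
            source.submit(item);
        }
        source.close();

        typeA.done.await(5, TimeUnit.SECONDS);
        typeB.done.await(5, TimeUnit.SECONDS);
        System.out.println(typeA.received); // [A:1, A:3]
        System.out.println(typeB.received); // [B:2]
    }
}
```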

Reactive Redis does not continually publish changes to the Flux

I am trying to get live updates from my Redis ordered list, without success.
It seems to fetch all the items and just end on the last item.
I would like the client to keep getting updates whenever a new order is added to my ordered list.
What am I missing?
This is my code:
@RestController
class LiveOrderController {
    @Autowired
    lateinit var redisOperations: ReactiveRedisOperations<String, LiveOrder>

    @GetMapping(produces = [MediaType.TEXT_EVENT_STREAM_VALUE], value = "/orders")
    fun getLiveOrders(): Flux<LiveOrder> {
        val zops = redisOperations.opsForZSet()
        return zops.rangeByScore("orders", Range.unbounded())
    }
}
There is no such feature in Redis. Retrieving a sorted set, even through the reactive API, just takes a snapshot; making the call reactively does not turn it into a live feed. You need a subscription instead.
If you opt in for keyspace notifications like this (K enables keyspace notifications, z includes zset commands):
config set notify-keyspace-events Kz
And subscribe to them in your service like this:
ReactiveRedisMessageListenerContainer reactiveRedisMessages;
// ...
reactiveRedisMessages.receive(new PatternTopic("__keyspace@0__:orders"))
    .map(m -> {
        System.out.println(m);
        return m;
    })
    // <further processing>
You would see messages like this: PatternMessage{channel=__keyspace@0__:orders, pattern=__keyspace@0__:orders, message=zadd}. It notifies you that something has been added, and you can react to it somehow: get the full set again, or only some part (head/tail). You might even remember the previous set, get the new one, and send the diff.
But what I would really suggest is rearchitecting the flow to use Redis Pub/Sub functionality directly. For example: instead of calling zadd directly, the publisher service calls eval, which issues two commands: zadd orders 1 x and publish orders "1:x" (any custom message you want, maybe JSON).
Then in your code you will subscribe to your custom topic like this:
return reactiveRedisMessages.receive(new PatternTopic("orders"))
    .map(LiveOrder::fromNotification);
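The answer leaves LiveOrder::fromNotification undefined; a minimal sketch for the "1:x" (score:id) payload suggested above (both the class shape and the payload format are assumptions of this example, not part of any Spring Data API):

```java
// Sketch of the notification payload parser; assumes the publisher sends "score:id"
class LiveOrder {
    final long score;
    final String id;

    LiveOrder(long score, String id) {
        this.score = score;
        this.id = id;
    }

    static LiveOrder fromNotification(String payload) {
        int colon = payload.indexOf(':');
        return new LiveOrder(Long.parseLong(payload.substring(0, colon)),
                             payload.substring(colon + 1));
    }
}
```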

Protobuf: streaming input with common data to all elements

Below is the service spec:
service Cooler {
  rpc saveThing (stream SaveRequest) returns (SaveReply);
}
I need to stream messages to Cooler.saveThing(). All the SaveRequests have a common field author, and the fields unique per Thing are price and name. How can I send the author only once?
Not working attempt - Multiple inputs
This would be a solution, but protobuf does not support multiple inputs yet.
service Cooler {
  rpc saveThing (stream SaveRequest, Author) returns (SaveReply);
}
Not working attempt - Nested message
Every received element of the SaveRequest stream would still contain author alongside the array of Things.
message SaveRequest {
  message Thing {
    int32 price = 1;
    string name = 2;
  }
  repeated Thing things = 1;
  string author = 2;
}
Possible solution
gRPC headers (metadata).
Question
How can I send the author only once?
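Besides headers, a common pattern is to make SaveRequest itself a oneof whose first stream element carries the Author and every later element carries a Thing; the server then folds the stream. A sketch of that fold with plain Java classes in place of the generated protobuf types (all names are illustrative):

```java
import java.util.ArrayList;
import java.util.Iterator;
import java.util.List;

// Stand-in for a oneof request: exactly one of the two fields is set
class SaveRequestMsg {
    final String author;    // set only on the first element of the stream
    final String thingName; // set on every subsequent element
    SaveRequestMsg(String author, String thingName) {
        this.author = author;
        this.thingName = thingName;
    }
}

class CoolerServer {
    // First element must carry the author; the rest are the things to save
    static String saveThing(Iterator<SaveRequestMsg> requests) {
        String author = requests.next().author;
        if (author == null) {
            throw new IllegalStateException("author must be the first element");
        }
        List<String> things = new ArrayList<>();
        while (requests.hasNext()) {
            things.add(requests.next().thingName);
        }
        return "saved " + things.size() + " thing(s) by " + author;
    }
}
```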

Get OpenTok stream by ID

I am creating an app where I want a specific stream to be shown large on another page. There will be one page with all the streams (subscriptions) and one page which shows a SPECIFIC stream on the whole screen.
Send the streamID of a chosen stream to the database.
On the other page, fetch the last added streamID from the database.
Get the stream by ID and show it on the page.
I got the first two steps working; the last step is the problem. I somehow need to fetch a stream object by the given streamId. Is this possible? Or is there another way to achieve this?
I don't know if this is the correct way to do it, but you can add all the stream objects to an array:
var streamContainer = [];

var streamCreatedHandler = function(e) {
    streamContainer.push(e.stream);
};

session.on("streamCreated", streamCreatedHandler);
and when your ID gets fetched, unsubscribe from the current stream, iterate over the array, and subscribe to the new stream:
for (var i = 0; i < streamContainer.length; i++) {
    if (streamContainer[i].streamId === dbStreamId) {
        session.subscribe(streamContainer[i], 'DOMELEMENT', {options});
    }
}
You might not even need a database for what you are trying to do. Simply connect both pages (specific stream and all streams) to the same sessionId. You can put streamId in your url as a parameter like this: ?streamId=1343-thneue...
In your streamCreated event, simply check the stream.streamId in your callback function against your URL streamId. If they match, call session.subscribe on that stream object.

Using Reactives to Merge Chunked Messages

So I'm attempting to use reactives to recompose chunked messages identified by ID, and am having a problem terminating the final observable. I have a Message class which consists of Id, Total Size, Payload, Chunk Number and Type, and have the following client-side code:
I need to calculate the number of messages to Take at runtime
(from messages in
    (from messageArgs in Receive
     select Serializer.Deserialize<Message>(new MemoryStream(Encoding.UTF8.GetBytes(messageArgs.Message))))
 group messages by messages.Id into grouped
 select grouped)
.Subscribe(g =>
{
    var cache = new List<Message>();
    g.TakeWhile((int) Math.Ceiling(MaxPayload / g.First().Size) < cache.Count)
        .Subscribe(cache.Add,
            _ => { /* Rebuild Message Parts From Cache */ });
});
First I create a grouped observable filtering messages by their unique ID and then I am trying to cache all messages in each group until I have collected them all, then I sort them and put them together. The above seems to block on g.First().
I need a way to calculate the number to take from the first (or any) of the messages that come through however am having difficulty doing so. Any help?
First is a blocking operator (how else can it return T and not IObservable<T>?)
I think using Scan (which builds an aggregate over time) could be what you need. Using Scan, you can hide the "state" of your message re-construction in a "builder" object.
MessageBuilder.IsComplete returns true when the total size of the messages it has received reaches MaxPayload (or whatever your requirements are). MessageBuilder.Build() then returns the reconstructed message.
I've also moved your "message building" code into a SelectMany, which keeps the built messages within the monad.
(Apologies for reformatting the code into extension methods, I find it difficult to read/write mixed LINQ syntax)
Receive
    .Select(messageArgs => Serializer.Deserialize<Message>(
        new MemoryStream(Encoding.UTF8.GetBytes(messageArgs.Message))))
    .GroupBy(message => message.Id)
    .SelectMany(group =>
    {
        // Use the builder to "add" message parts to it
        return group.Scan(new MessageBuilder(), (builder, messagePart) =>
            {
                builder.AddPart(messagePart);
                return builder;
            })
            .SkipWhile(builder => !builder.IsComplete)
            .Select(builder => builder.Build());
    })
    .Subscribe(OnMessageReceived);
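The answer leaves MessageBuilder abstract. A minimal sketch of it (in Java rather than C#, and keyed on an assumed total-chunk count instead of MaxPayload, so the names and the completeness test are illustrative):

```java
import java.util.ArrayList;
import java.util.Comparator;
import java.util.List;
import java.util.stream.Collectors;

// Accumulates out-of-order chunks; build() sorts them and concatenates the payloads
class MessageBuilder {
    private static class Part {
        final int chunkNumber;
        final String payload;
        Part(int chunkNumber, String payload) {
            this.chunkNumber = chunkNumber;
            this.payload = payload;
        }
    }

    private final List<Part> parts = new ArrayList<>();
    private final int totalChunks;

    MessageBuilder(int totalChunks) { this.totalChunks = totalChunks; }

    // Returning this mirrors the Scan accumulator in the answer above
    MessageBuilder addPart(int chunkNumber, String payload) {
        parts.add(new Part(chunkNumber, payload));
        return this;
    }

    boolean isComplete() { return parts.size() == totalChunks; }

    String build() {
        return parts.stream()
                .sorted(Comparator.comparingInt(p -> p.chunkNumber))
                .map(p -> p.payload)
                .collect(Collectors.joining());
    }
}
```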