Is there a known way to add new syntax features to Protobuf?

Protobuf provides the service keyword, which defines the RPC interface of an application.
I also want to use the concept of an entity, which is a part of a service (one service contains multiple entities). Each entity type has its own unique identifier, which makes it possible to address different entities within a service.
I would like to write a proto like this:
message UserReq {
  string username = 1;
  string password = 2;
}

message RegReq {
  uint8 result_code = 1;
}

message RemoteEntityInterface {
  MyEntity entity = 1;
}

message GiveItemResult {
  uint8 result_code = 1;
}

service MyService {
  rpc RegisterUser (UserReq) returns (RegReq) {}
  rpc Login (UserReq) returns (RemoteEntityInterface) {}
}

entity MyEntity {
  rpc GiveItem (GiveItemReq) returns (GiveItemResult) {}
}
As you can see in the example, I used the keyword entity, which is unknown to protobuf; it means that MyService can return an interface to some remote object (MyEntity) via the Login remote method.
What are the ways to do this (perhaps writing a plugin, or a known way to modify the protobuf source code)? Or is there a more flexible solution than protobuf?
I would also like to use multiple parameters per rpc; to add Java-like attributes to rpc, service, and entity; and to give an entity a data model (variables/fields) so it can be replicated in real time to another service.
I think this would be very flexible for services in game development.

The only official way to extend .proto syntax is to define custom options.
For example, you could have something like:
import "google/protobuf/descriptor.proto";

extend google.protobuf.ServiceOptions {
  optional bool is_entity = 123456;
}

service MyEntity {
  option (is_entity) = true;
  rpc GiveItem (GiveItemReq) returns (GiveItemResult) {}
}
The default code generator will not do anything special with this option, but you can access it from your own code and from a protoc plugin if you write one.
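As a rough illustration, here is one way to read that option back at runtime. This is a minimal sketch in Python; the generated module name (my_service_pb2) is an assumption, not something defined above:
# Minimal sketch: read the custom ServiceOptions extension at runtime.
# The module name my_service_pb2 is hypothetical; it stands for whatever
# protoc generated from the .proto above.
import my_service_pb2

service_desc = my_service_pb2.DESCRIPTOR.services_by_name['MyEntity']
options = service_desc.GetOptions()

# File-level extensions are exposed as attributes of the generated module.
if options.HasExtension(my_service_pb2.is_entity):
    print('is_entity =', options.Extensions[my_service_pb2.is_entity])
A protoc plugin sees the same information in the FileDescriptorProto messages it receives, so it can react to the option during code generation.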

Related

Returning a list of messages

I have multiple models, and each needs its own create/get/get-list API.
Do I need to add two different message types (single and list) for every model?
For example, if I have a Student type:
message Student {
  string name = 1;
}
and an rpc:
rpc CreateStudent(Student) returns (google.protobuf.Empty) {
  ..............
}
If I'd like to add an rpc to create a list of students, or to get a list of students:
rpc CreateStudents(??????) returns (google.protobuf.Empty) {
  ..............
}
rpc GetAllStudents() returns (??????) {
  ..............
}
Do I also need to define
message StudentList {
  repeated Student students = 1;
}
Or is there a way to use a list type directly in the message input/output?
Yes, basically: you would want a different message type per element type, or perhaps a single root type with oneof-style content. Raw protobuf does not have a concept of generics or templates.
Some libraries do, but that's outside the specification.
You can simply add the stream keyword to your RPCs. There is no need to define a repeated message field; stream will send or receive multiple independent messages.
message Student {
  string name = 1;
}

with RPCs:

rpc CreateStudent(Student) returns (google.protobuf.Empty) {
  ..............
}

rpc CreateStudents(stream Student) returns (google.protobuf.Empty) {
  ..............
}

rpc GetAllStudents(google.protobuf.Empty) returns (stream Student) {
  ..............
}
It's good practice to send/stream a response object rather than Empty. Otherwise, you only have the gRPC status code to indicate a problem, and you will need to dig through the logs to debug.
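To make the stream semantics concrete, here is a rough client-side sketch in Python; the module and service names (students_pb2, students_pb2_grpc, StudentService) are assumptions about how the proto might be generated, not part of the answer above:
# Rough client sketch for the streaming RPCs above (hypothetical generated
# modules students_pb2 / students_pb2_grpc and service name StudentService).
import grpc
from google.protobuf import empty_pb2
import students_pb2
import students_pb2_grpc

channel = grpc.insecure_channel('localhost:50051')
stub = students_pb2_grpc.StudentServiceStub(channel)

# Client streaming: pass any iterable of Student messages.
stub.CreateStudents(iter([
    students_pb2.Student(name='Ada'),
    students_pb2.Student(name='Grace'),
]))

# Server streaming: the call returns an iterator of Student messages.
for student in stub.GetAllStudents(empty_pb2.Empty()):
    print(student.name)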

How to ignore unknown enum values?

I'm wondering what the best way would be to ignore/discard unknown enum values in a GraphQL/Apollo server.
Let's say my GraphQL schema defines an array of enums (enum Service { Supermarket, TicketSales }) and it works fine now, but later another service I'm using adds a new value (e.g. Playground) that my client just doesn't support. I would like to ignore it and return the supported values without an error.
What would be the best way to do this in GraphQL? My first idea was to make a directive that would read the supported values from the schema and ignore everything else, but after googling around I didn't find any good examples of how to do it. Can you point me in the right direction?
If your resolver function will accept arbitrary strings, then you can use a custom scalar type, or just String.
"""
The type of a service. `Supermarket` means..., and
`TicketSales` means...; any other value is ignored.
"""
scalar Service
GraphQL generally places responsibility on the client to conform to the server's expectations, rather than making the server try to support any request. There are a couple of places you can reasonably expect an enum value like this to appear:
enum Service { Supermarket, TicketSales }

type Query {
  inAReturnValue: Service!
  asAQueryParam(service: Service!): Node
}

type Mutation {
  asAMutationInput(service: Service!): Node
}
In particular it may not make sense to tell the server "make the type of this object be a playground" if the server just doesn't understand that. Conversely, if the server knows about "playground", it could return it in cases the client may not expect. Having an enum here makes it explicit what the server knows about. The server has said what it supports and it's the client's responsibility to cooperate.
Note that it's possible for the client to find out if the server supports playgrounds, if it's an enum value, and this might help it inform its behavior.
query GetServiceTypes {
  __type(name: "Service") {
    enumValues { name }
  }
}
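As a small illustration of running that introspection query from the client side, here is a sketch in Python using the requests package; the endpoint URL is hypothetical:
# Sketch: ask the server which Service enum values it knows about.
# The endpoint URL is an assumption; any GraphQL-over-HTTP client works.
import requests

INTROSPECTION = """
query GetServiceTypes {
  __type(name: "Service") {
    enumValues { name }
  }
}
"""

resp = requests.post('http://localhost:4000/graphql', json={'query': INTROSPECTION})
names = [v['name'] for v in resp.json()['data']['__type']['enumValues']]
print(names)  # e.g. ['Supermarket', 'TicketSales']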
After playing around I found something that I can use to get around my original problem, so I will post it here in case somebody else is wondering the same thing.
In short, my original problem was that I receive several different "available services" string arrays from other services, and I was thinking of mapping them to enums for better TypeScript support etc. The problem was that if I get an unknown value from another service, my GraphQL server fails.
So my original idea was to fix it with a directive, which I eventually got working:
# In schema
directive @mapUnknownTo(value: String) on ENUM

enum SomeAttribute @mapUnknownTo(value: "__UNKNOWN__") {
  SomeAttribute1
  AnotherAttribute
  SomethingElse
  __UNKNOWN__
}
And the directive implementation is:
import { SchemaDirectiveVisitor } from 'graphql-tools';
import { GraphQLEnumType } from 'graphql';

export class MapUnknownToDirective extends SchemaDirectiveVisitor {
  visitEnum(type: GraphQLEnumType) {
    const { value = '__UNKNOWN__' } = this.args;
    // Map each known internal enum value to its external name.
    const valueMap = type.getValues().reduce(
      (map, v) => map.set(v.value, v.name),
      new Map<string, string>(),
    );
    // Serialize known values as usual; anything else becomes the fallback.
    type.serialize = (v: string): string => valueMap.get(v) || value;
  }
}
So this maps every value not defined in the schema to some custom value, which is not exactly what I originally wanted, but at least it doesn't raise an error, so it's okay-ish.
I'm still not 100% sure directives are the way to go for cases like this, but at least it's one possible solution.

How to get Method extensions from gRPC

I am using interceptors to perform additional validation on incoming and outgoing RPCs, based on the optional extensions set on an RPC.
Given the following gRPC schema:
extend google.protobuf.MethodOptions {
  string my_option = 50006;
}

service MyService {
  rpc Foo (FooRequest) returns (FooResponse) {
    option (my_option) = "foo";
  }
}
How do I go about getting the value of my_option? At first I had thought to get it from the request using this. However, as this is a MethodOptions, it doesn't seem to be part of that descriptor. Thoughts?
Found the following answer for those who get here in the future.
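For reference, here is a minimal sketch of reading a MethodOptions extension from the method descriptor; it assumes Python and a hypothetical generated module name (my_service_pb2):
# Sketch: look up the custom MethodOptions extension on the descriptor,
# not on the incoming request message.
import my_service_pb2  # hypothetical generated module

method_desc = (my_service_pb2.DESCRIPTOR
               .services_by_name['MyService']
               .methods_by_name['Foo'])
options = method_desc.GetOptions()

if options.HasExtension(my_service_pb2.my_option):
    print(options.Extensions[my_service_pb2.my_option])  # -> "foo"
In an interceptor, the fully qualified method name (e.g. "/package.MyService/Foo") can be mapped back to this descriptor before doing the lookup.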

Can protobuf service method return primitive type?

I'm trying to use Google protobuf and I have the following definitions:
message.proto file:
message Request {
  required int32 id = 1;
  optional string value = 2;
}
service.proto file:
import "message.proto";
service Service {
rpc request (Request) returns (bool);
}
I'm trying to generate C++ sources and I'm getting an error:
$ protoc service.proto --cpp_out=/tmp/proto/build
service.proto:4:40: Expected message type.
Do I have to return only user-defined types? Are primitives (like bool or string) supported? Can I use primitive types as a service method argument (instead of Request in my example)?
No, you cannot use a primitive type as either the request or response. You must use a message type.
This is important because a message type can be extended later, in case you decide you want to add a new parameter or return some additional data.
If you want to return a primitive type, wrap it in a message and return it:
message Name {
  string name = 1;
}
In case you don't want to return anything (i.e. void), you can just create an empty message:
message Void {}

message Name {
  string name = 1;
}

..

service MyService {
  rpc MyFunc(Name) returns (Void);
}
You can return scalar data types like bool, int, etc. by making use of wrappers.proto:
service.proto file:

import "message.proto";
import "google/protobuf/wrappers.proto";

service Service {
  rpc request (Request) returns (.google.protobuf.BoolValue);
}
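To show what the wrapper looks like in application code, here is a small server-side sketch in Python; the generated module name (service_pb2_grpc) and servicer class follow the usual protoc naming, but they are assumptions here:
# Sketch: a servicer that returns google.protobuf.BoolValue instead of a bare bool.
from google.protobuf import wrappers_pb2
import service_pb2_grpc  # hypothetical generated gRPC module

class Service(service_pb2_grpc.ServiceServicer):
    def request(self, request, context):
        # Wrap the primitive result; the caller reads it back via response.value.
        return wrappers_pb2.BoolValue(value=request.id > 0)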

MongoDB - override default Serializer for a C# primitive type

I'd like to change the representation of C# Doubles to rounded Int64 values, with a four-decimal-place shift, in the MongoDB C# driver's serialization stack. In other words, store (Double)29.99 as (Int64)299900.
I'd like this to be transparent to my app. I've had a look at custom serializers but I don't want to override everything and then switch on the Type with fallback to the default, as that's a bit messy.
I can see that RegisterSerializer() won't let me add one for an existing type, and that BsonDefaultSerializationProvider has a static list of primitive serializers and is marked internal with private members, so I can't easily subclass it.
I can also see that it's possible to RepresentAs Int64 for Doubles, but this is a cast, not a conversion. I essentially need a cast AND a conversion in both serialization directions.
I wish I could just give the default serializer a custom serializer to override one of its own, but that would mean a dirty hack.
Am I missing a really easy way?
You can definitely do this, you just have to get the timing right. When the driver starts up there are no serializers registered. When it needs a serializer, it looks it up in the dictionary where it keeps track of the serializers it knows about (i.e. the ones that have been registered). Only if it can't find one in the dictionary does it start figuring out where to get one (including calling the serialization providers), and if it finds one it registers it.
The limitation in RegisterSerializer is there so that you can't replace an existing serializer that has already been used. But that doesn't mean you can't register your own if you do it early enough.
However, keep in mind that registering a serializer is a global operation, so if you register a custom serializer for double it will be used for all doubles, which could lead to unexpected results!
Anyway, you could write the custom serializer something like this:
public class CustomDoubleSerializer : BsonBaseSerializer
{
    public override object Deserialize(BsonReader bsonReader, Type nominalType, Type actualType, IBsonSerializationOptions options)
    {
        // Read the stored Int64 and shift it back four decimal places.
        var rep = bsonReader.ReadInt64();
        return rep / 10000.0;
    }

    public override void Serialize(BsonWriter bsonWriter, Type nominalType, object value, IBsonSerializationOptions options)
    {
        // Shift four decimal places and round, e.g. 29.99 -> 299900.
        var rep = (long)Math.Round((double)value * 10000);
        bsonWriter.WriteInt64(rep);
    }
}
And register it like this:
BsonSerializer.RegisterSerializer(typeof(double), new CustomDoubleSerializer());
You could test it using the following class:
public class C
{
    public int Id;
    public double X;
}
and this code:
BsonSerializer.RegisterSerializer(typeof(double), new CustomDoubleSerializer());
var c = new C { Id = 1, X = 29.99 };
var json = c.ToJson();
Console.WriteLine(json);
var r = BsonSerializer.Deserialize<C>(json);
Console.WriteLine(r.X);
You can also use your own serialization provider to tell Mongo which serializer to use for certain types, which I ended up doing to mitigate some of the timing issues mentioned above when trying to override existing serializers. Here's an example of a serialization provider that overrides how decimals are serialized:
public class CustomSerializationProvider : IBsonSerializationProvider
{
    public IBsonSerializer GetSerializer(Type type)
    {
        if (type == typeof(decimal)) return new DecimalSerializer(BsonType.Decimal128);
        return null; // falls back to Mongo's defaults
    }
}
If you return null from your custom serialization provider, it will fall back to using Mongo's default serialization provider.
Once you've written your provider, you just need to register it:
BsonSerializer.RegisterSerializationProvider(new CustomSerializationProvider());
I looked through the latest iteration of the driver's code and checked if there's some sort of backdoor to set custom serializers. I am afraid there's none; you should open an issue in the project's bug tracker if you think this needs to be looked at for future iterations of the driver (https://jira.mongodb.org/).
Personally, I'd open a ticket, and if a quick workaround is required, I'd subclass DoubleSerializer, implement the new behavior, and then use reflection to inject it into either MongoDB.Bson.Serialization.Serializers.DoubleSerializer.__instance or MongoDB.Bson.Serialization.BsonDefaultSerializationProvider.__serializers.
