We are using Go and .NET Core for our microservices infrastructure; the services communicate with each other.
All data exchanged between the services is based on Protocol Buffers schemas that we have created.
Here is an example of one of our .proto files:
syntax = "proto3";
package Protos;
option csharp_namespace = "Protos";
option go_package="Protos";
message EventMessage {
  string actionType = 1;
  string payload = 2;
  bool auditIsActive = 3;
}
The Go side is working well: the service generates the content as needed and sends it to the SQS queue. Once that happens, the .NET Core service picks up the data and tries to deserialize it.
Here is an example of the contents of an SQS message:
{"#type":"type.googleapis.com/Protos.EventMessage","actionType":"PushPayload","payload":"<<INTERNAL>>"}
But we are getting an exception saying that the wire type is invalid, as shown below:
Google.Protobuf.InvalidProtocolBufferException: Protocol message contained a tag with an invalid wire type.
at Google.Protobuf.UnknownFieldSet.MergeFieldFrom(CodedInputStream input)
at Google.Protobuf.UnknownFieldSet.MergeFieldFrom(CodedInputStream input)
at Google.Protobuf.UnknownFieldSet.MergeFieldFrom(UnknownFieldSet unknownFields, CodedInputStream input)
at Protos.EventMessage.MergeFrom(CodedInputStream input) in /Users/maordavidzon/projects/github_connector/GithubConnector/GithubConnector/obj/Debug/netcoreapp3.0/EventMessage.cs:line 232
at Google.Protobuf.MessageExtensions.MergeFrom(IMessage message, Byte[] data, Boolean discardUnknownFields, ExtensionRegistry registry)
at Google.Protobuf.MessageParser`1.ParseFrom(Byte[] data)
The .proto file is exactly the same in both services.
Are we missing any option or property that we need to add?
It looks like you're using the JSON format rather than the binary format. In that case, you want ParseJson(string json), not ParseFrom(byte[] data).
Note: the binary format is more efficient, if that matters to you. It also has better support across protobuf libraries / tools.
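As a rough sketch of what the .NET side could look like (this is not the poster's code; how the message body is read from SQS is assumed, and the nonstandard "#type" property in the example means the parser probably has to be told to ignore unknown fields):
using Google.Protobuf;
using Protos;

// The JSON body as read from the SQS message (placeholder: obtain it however
// you already receive messages from the queue).
string json = sqsMessage.Body;

// A parser that tolerates properties the schema doesn't declare (e.g. "#type").
var parser = new JsonParser(JsonParser.Settings.Default.WithIgnoreUnknownFields(true));
EventMessage eventMessage = parser.Parse<EventMessage>(json);

Console.WriteLine(eventMessage.ActionType);
If the Go producer is later switched to the binary format (typically base64-encoded, since SQS message bodies are text), the existing ParseFrom(byte[] data) call would then be the right one; the JSON shown above needs the JSON parser.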
Basically, there are two possible scenarios: either the proto files generated for .NET and Go are not the same version, or your data was corrupted while being transferred between the Go and .NET applications.
Protobuf is a binary protocol; check whether you have any HTTP filter or anything else that could alter the incoming or outgoing stream of bytes.
Related
Is there a way to use a Url or Uri data type inside a gRPC message? And if not, what is the best way to do this?
I am using gRPC and Protocol Buffers for this, and I run a backend Go app that triggers popup notifications displayed in my Flutter app. Each notification has a link that takes the user to a webpage when clicked in my Flutter app.
Right now I am using a String for this, like so:
message NotificationResponse {
  string title = 1;
  string url = 2;
}
I can't seem to find a way to use Url/Uri as a type. Is there such a thing?
Storing the URL in a string is a totally viable solution. But if you would like to reduce the payload size a little and make instantiation a bit safer, you could remove the schema part of the URL (e.g. https://).
To do that, you could do the following:
message Url {
  enum Schema {
    UNSPECIFIED = 0;
    HTTP = 1;
    HTTPS = 2;
    // more if needed
  }
  Schema schema = 1;
  string rest = 2;
}
and then you can use it in your message like this:
message NotificationResponse {
  string title = 1;
  Url url = 2;
}
This excludes the unnecessary characters from the string and reduces the payload size (it removes http(s) and ://), and an enum is serialized more efficiently than a string.
It also makes instantiation safer, because you can at least restrict which schemas you and other developers can use (in my example, no ftp; you could even restrict it to https only for security).
There is one inconvenience, though: you will have to concatenate the URL back together in your client code. In my opinion this is worth the effort, since the concatenation is trivial (convert the enum value to its text form and add ://).
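For illustration, here is a hedged sketch of that concatenation in C#, using the classes the Url message above would generate; the equivalent couple of lines work in any other target language:
// Rebuild the full URL string from the Url message.
// A real implementation would also decide what to do for UNSPECIFIED.
static string ToFullUrl(Url url)
{
    // e.g. enum value Https -> "https", then "https" + "://" + url.Rest
    return url.Schema.ToString().ToLowerInvariant() + "://" + url.Rest;
}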
Note: doing the same enum trick for the domain suffix (.com, .net, ...) would not be as trivial and would force you to store the path and the host in different fields (not worth it, since it increases the payload).
Let me know if you need more help.
I'm creating a client-side streaming method whose proto definition should look similar to this:
service DataService {
  rpc Send(stream SendRequest) returns (SendResponse) {}
}

message SendRequest {
  string id = 1;
  bytes data = 2;
}

message SendResponse {
}
The problem here is that the ID is sent with every streamed message even though it is needed only once. What would you recommend as the most optimal way to handle such a use case?
One hacky approach would be to set the ID only in the first message and leave it blank afterwards. But this API is supposed to be used by third parties, and a method definition like the one above doesn't express that well.
I don't think something like this is supported either:
service DataService {
  rpc Send(InitialSendRequest, stream DataOnlyRequest) returns (SendResponse) {}
}
I'm currently considering a SendRequest message like the one below, but I still have to check how optimal it is compared to the first option in terms of proto marshaling:
message SendRequest {
  oneof request {
    string id = 1;
    bytes data = 2;
  }
}
Your approach sounds good to me: use a oneof, and clearly document that id is expected only in the first message on the stream and that server implementations will terminate the stream if id is set on any subsequent message.
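For illustration only (not from the original answer), a client-streaming call against the oneof version of SendRequest could look roughly like this in C#, assuming the usual generated client in an async context; client, chunks, and the id value are placeholders:
// Open the client-streaming call on the generated DataService client.
using var call = client.Send();

// The first message on the stream carries only the id.
await call.RequestStream.WriteAsync(new SendRequest { Id = "upload-123" });

// Every later message carries only data; the server should reject a second id.
foreach (byte[] chunk in chunks)
{
    await call.RequestStream.WriteAsync(new SendRequest { Data = ByteString.CopyFrom(chunk) });
}

await call.RequestStream.CompleteAsync();
SendResponse response = await call;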
This pattern is used in grpc-lb-v1, for example. Even though the gRPC team is moving away from grpc-lb-v1, the pattern itself is a commonly used one.
I'm not very sure about its implications with respect to proto marshaling. That might be a question for the protobuf team.
Hope this helps.
I have a protocol buffer file in a gRPC server, and SayHello is defined under the package hello.v1:
syntax = "proto3";
package hello.v1;
service GreetService {
  rpc SayHello(SayHelloRequest) returns (SayHelloResponse) {}
}

message SayHelloRequest {
  string name = 1;
}

message SayHelloResponse {
}
The function may change over time, like this:
service GreetService {
  rpc SayHello(SayHelloRequest) returns (SayHelloResponse) {}
}

message SayHelloRequest {
  string name = 1;
  uint32 age = 2;
}

message SayHelloResponse {
  string reply_message = 1;
}
But users want a non-breaking service, so I want to minimize the impact on them. My question is: how can I keep both versions of SayHello? Clients could then call them through different namespaces in C++ or different packages in Go.
Generally speaking, you should have a single source of truth for your protobuf files. One solution is a monorepo, i.e. housing both your client and server in the same repository. However, this approach is uncommon outside of certain large companies.
Organizations with multiple repos tend to create a separate repo for all of their protobufs, which is then included in other repos via submodules or pulled down over the network at build time.
Regardless, it is always possible to introduce a breaking change by modifying the proto, even if there is a single source of truth. You might consider running Buf as a presubmit test on all code changes to detect if a breaking change has been introduced.
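For reference, a minimal sketch of such a setup (the exact configuration keys depend on your Buf version, so treat this as illustrative):
# buf.yaml — enable breaking-change checks at FILE granularity
version: v1
breaking:
  use:
    - FILE
The check can then run as a presubmit step with something like buf breaking --against '.git#branch=main', which compares the working tree's protos against the ones on the main branch.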
I would like to send a stream of different protobuf messages over the wire and be able to differentiate them as they arrive.
Let's say I have a *.proto file like this:
message Book {
  // ...
}

message BlueRay {
  // ...
}
And then on the sender side I serialize, let's say, this sequence (pseudocode in C#):
Book1.WriteDelimitedTo(myStream);
BlueRay1.WriteDelimitedTo(myStream);
Book2.WriteDelimitedTo(myStream);
How can I know the order/types of the messages arriving on the receiver side? (The contract is available on both the sender and receiver side, of course.)
Depending on my sender's state, I cannot predict what is going to be sent and in which order...
I understand that there is no built-in way to do this, as stated in the documentation, but for the message size, for instance, there is a helper (the C# WriteDelimitedTo method, which embeds the size).
How can I get/map the type of a received message?
My server will be written in a given language (C#, actually), but my clients should be implementable in any protobuf-supported target, so I do not want to set up something that serializes C#/CLR-specific stuff in between...
Maybe I'm using protobuf in a weird way? I'm trying to set up a kind of protocol.
I think I finally found out how to do that (only in C# at the moment).
Basically, I'm writing the following to the stream:
the message descriptor's full name (as declared in the proto file, since both ends share this definition), using the primitive for string serialization
the message size, using the primitive for length serialization
the serialized message itself
This results in a method like the following:
public static void WriteToStream(Stream outputStream, IMessage message)
{
    MessageDescriptor stateMsgDescriptor = message.Descriptor;
    using (CodedOutputStream codedOutStr = new CodedOutputStream(outputStream, true))
    {
        codedOutStr.WriteString(stateMsgDescriptor.FullName);
        int size = message.CalculateSize();
        codedOutStr.WriteLength(size);
        message.WriteTo(codedOutStr);
        codedOutStr.Flush();
    }
}
As stated in the protobuf documentation,
The Protocol Buffer wire format is not self-delimiting, so protocol
buffer parsers cannot determine where a message ends on their own
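For completeness, the matching read side could look something like the sketch below. This is not part of the original answer; it assumes the receiver knows all the possible message types up front and maps each descriptor's full name to its parser (e.g. Book.Descriptor.FullName to Book.Parser):
using System.Collections.Generic;
using System.IO;
using Google.Protobuf;

public static IEnumerable<IMessage> ReadAllFromStream(
    Stream inputStream,
    IReadOnlyDictionary<string, MessageParser> parsersByTypeName)
{
    // A single CodedInputStream is kept for the whole stream; creating a new one
    // per message could over-read because of its internal buffering.
    var codedInStr = new CodedInputStream(inputStream, true);
    while (!codedInStr.IsAtEnd)
    {
        // 1) the type name written with WriteString(...)
        string typeName = codedInStr.ReadString();
        // 2) the length written with WriteLength(...) plus the message bytes;
        //    ReadBytes() consumes both, since the wire layout is identical.
        ByteString payload = codedInStr.ReadBytes();
        yield return parsersByTypeName[typeName].ParseFrom(payload);
    }
}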
I am trying to write an online message board in Haxe (OpenFL). There are lots of server/client examples online. But I am new to this area and I do not understand any of them. What is the easiest way to send a list of objects between server and client? Could you guys give an example?
You could use JSON.
You can put this in your OpenFL project (the client):
var myData = [1, 2, 3, 4, 5];
var http = new haxe.Http("server.php");
http.addParameter("myData", haxe.Json.stringify(myData));
http.onData = function(resultData) {
    trace('the data has been sent to the server; this is the response: ' + resultData);
};
http.request(true);
If you have a server.php file, you can access the data like this:
$myData = json_decode($_POST["myData"]);
If the server returns JSON data that needs to be read by the client, then in Haxe you call haxe.Json.parse(resultData);
EDIT: I'm still not sure whether the user's problem is really about sending "a list of objects"; see the comment on the question...
The easiest way is to use Haxe Serialization, either with Haxe Remoting or with your own protocol on top of TCP/UDP. The choice of protocol depends on whether you already have something built and whether you will be calling functions or simply getting/posting data.
In either case, haxe.Serializer/Unserializer will give you a format to transmit most (if not all) Haxe objects from client to server with minimal code.
See the following minimal example (from the manual) of how to use the serialization APIs. The format is string-based and has a specification.
import haxe.Serializer;
import haxe.Unserializer;

class Main {
    static function main() {
        var serializer = new Serializer();
        serializer.serialize("foo");
        serializer.serialize(12);
        var s = serializer.toString();
        trace(s); // y3:fooi12

        var unserializer = new Unserializer(s);
        trace(unserializer.unserialize()); // foo
        trace(unserializer.unserialize()); // 12
    }
}
Finally, you could also use other serialization formats like JSON (with haxe.Json.stringify/parse) or XML, but they wouldn't be so convenient if you're dealing with enums, class instances or other data not fully supported by these formats.