Looking into weird protocol-buffer message (decoding and encoding) - protocol-buffers

I'm trying to figure out whether there is a way to create a .proto definition that could encode and decode a message that looks like this:
parent_field {
carrier_field { (field id: 1)
subfield_1: 10 (field id: 1)
subfield_2: 20 (field id: 2)
}
carrier_field: "string" (field id: 1)
}
which means that under the same field identifier I can get either a sub-message or a string.
I tried something like:
message MessageWrapper {
  message ParentField {
    message SubMessage {
      ....
    }
    repeated SubMessage carrier_field = 1;
  }
  ParentField parent_field = 1;
}
but when I try to decode the message I get:
# protoc --experimental_allow_proto3_optional --decode MessageWrapper proto_definitions/test.proto < test.buff
Failed to parse input.
What should the .proto definition look like to be able to encode and decode the message shown above?

The only way of doing that would be for carrier_field to be a bytes field that you would then separately decode as either a string or a protobuf message. Frankly, I'd advise against this: it would be better to use a oneof with different field numbers.
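For reference, a oneof version might look like the sketch below (the field and message names are illustrative, and note that a oneof cannot be repeated, so the repetition would have to move to an enclosing message):

```proto
message ParentField {
  message SubMessage {
    int32 subfield_1 = 1;
    int32 subfield_2 = 2;
  }
  message Carrier {
    // Exactly one of these is set, each with its own field number.
    oneof value {
      SubMessage sub = 1;
      string str = 2;
    }
  }
  repeated Carrier carrier_field = 1;
}
```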

Related

regex as protobuf message field name?

Can we define a regular expression in a protobuf field name? I send the request as a list of dictionaries in my client.py file:
"cur_cur_bin" : [{"cur_cur_bin1_bin3_bin1" : 4,"cur_cur_bin3_bin5_bin8" : 6} ]
I defined the .proto file like:
message cur_cur_BIN {
int32 cur_cur_bin1_bin3_bin1 = 1;
}
message Message {
repeated cur_cur_BIN cur_cur_bin = 1;
}
Can anyone explain how to define this type of field in a .proto file dynamically? Because (bin1) has a range like (1 - [1-8]), and similarly (bin3) has (3 - [8-11]), and so on.
No, as far as I know there is no mechanism to generate field names automatically or dynamically in protoc.
You can create messages dynamically through the python-protobuf reflection API, but that probably doesn't make sense for your use case.
Instead define a reasonable .proto format, and then write some custom Python code to convert your JSON data to that.
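For example, if the dynamically named fields are really key/value pairs (my assumption about the data shape from the JSON above), a map field avoids the problem entirely:

```proto
message Message {
  // Keys like "cur_cur_bin1_bin3_bin1" become map keys instead of field names,
  // so no field has to be declared per bin combination.
  map<string, int32> cur_cur_bin = 1;
}
```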

How to name these Google Protobuf fields so I can use them in GoLang to accept specific JSON?

I am building a gRPC function (called myFunc) that takes the equivalent of the following JSON data as its argument:
{
"id": "ABCD4435010",
"otherId": "WXYZ4435010",
"duration": 30
}
As part of this exercise, I need to design the protobuf message. It looks like this:
1: // MyFunc does something I guess.
2: rpc MyFunc(MyFuncRequest) returns (MyFuncResponse) {
3: option (google.api.http) = {
4: post: "/my.path.to.endpoint/MyFunc"
5: body: "*"
6: };
7: }
8:
9: // MyFuncRequest is the request object for MyFunc.
10: message MyFuncRequest {
11: // id is something.
12: string id = 1;
13: // otherId is something.
14: string otherId = 2;
15: // duration is something.
16: string duration = 3;
17: }
When I try to generate the golang files from this, I get the following errors:
myFile.proto:14:3:Field name "otherId" must be lower_snake_case.
myFile.proto:16:3:Field "duration" must be a google.protobuf.Duration.
2 problems with these errors:
If I change otherId to other_id it will no longer match the key name in the JSON.
If I change duration field's type to google.protobuf.Duration it will no longer match the type of data from the JSON. So that marshalling/unmarshalling will fail.
How can I work around these error messages and get my protocol buffers to compile?
Follow the gRPC quick start guide for Go on their website (protoc version 3): grpc.io/docs/languages/go/quickstart
Update their hello world example: change HelloRequest as follows and regenerate the gRPC code with the command below.
message HelloRequest {
// id is something.
string id = 1;
// otherId is something.
string otherId = 2;
// duration is something.
string duration = 3;
}
regenerate command
protoc --go_out=. --go_opt=paths=source_relative \
--go-grpc_out=. --go-grpc_opt=paths=source_relative \
helloworld/helloworld.proto
It will create grpc request with your fields without an error.
I assume you're getting OtherId and you want otherId.
It may be easiest to create a new type and map it to/from the proto-generated types. This gives you the desired mapping without having to do any 'acrobatics' to keep protoc happy and get the JSON you want.
I'm surprised it detects and forces duration. You want this to be a number rather than a string. Have you tried using e.g. uint32 in your message definition instead of string (which it isn't)?
Specifying the JSON name for a protobuf field: is this an answer to your question?
message TestMessage {
string other_id = 2 [json_name = "otherId"];
}
google.protobuf.Duration should be a string in JSON. Read the comment here.

How serialized protobuf text format looks like?

Given the following proto file
syntax = "proto3";
package tutorial;
message MyMessage {
string my_value = 1;
}
What should the corresponding serialized text file look like?
my_value : "abc"
or
MyMessage {
my_value : "abc"
}
Neither. There are two data formats found in protobuf; the more common is the binary protobuf format, and the second (and rarer) is an opinionated JSON variant. So, if we assume that you're talking about the JSON version, we would expect valid JSON (note that I'm not accounting for whitespace here) similar to:
{
"my_value" : "abc"
}

Could protobuf read text file which has no schema but just data?

For example, the proto file is like this.
message KeyValues {
required int32 key = 1;
repeated int32 value = 2;
}
The text file is like this where the first column indicates key while the others indicates the repeated value.
3391 [ 4847 3948 4849 ]
9483 [ 4938 48497 71 ]
...
Could protobuf read and parse this text file?
No, protobuf has no support for custom text formats.
You'll have to write custom parser code for it, which can then convert to protobuf or whatever other representation you might want.
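A minimal sketch of such a custom parser in Python (the line format is assumed from the example above); the parsed pairs could then be assigned to the protobuf message fields:

```python
import re

def parse_line(line):
    """Parse a line like '3391 [ 4847 3948 4849 ]' into (key, [values])."""
    m = re.match(r"\s*(\d+)\s*\[([\d\s]*)\]\s*$", line)
    if m is None:
        raise ValueError(f"unrecognized line: {line!r}")
    key = int(m.group(1))
    values = [int(v) for v in m.group(2).split()]
    return key, values

print(parse_line("3391 [ 4847 3948 4849 ]"))  # (3391, [4847, 3948, 4849])
```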

Zlib::DataError: incorrect header check

I have a string, but I don't know the type of encoding.
Here's what the raw data looks like:
{
"securityProxyResponseEnvelope":{
"resultCode":"OK",
"apiResponse":"{zlibe}9mtdE350h9rd4h7wlFX3AkeCtNsb40FFaEZAl/CfcNNKrhGXawhgAs2rI4rnEgIwLpgJkKl+qkB0kzJ6+ZFmmz12Pl9/9MPdA1unUKL5OdHcWmAZim3ZjDXFGCW4zruCS/IOSiU1qVKAF5qIbocB4+2rAF7zH18SRtmXM8YW3eYs5w1NPjmYkM31W8x7QvrKkzFscH3kqDwmYn0I2gNNOtfwuKjWd5snunyqxPopZHNX3CBdW/pj4+N0tJXjAoHorCe8Ypmjxnvh3zthkLTbiBLgeULH1hGvVtkI0C9PGMyt/92upVW6qHxqCYoO/LTJK1tq6OpBnMRBNZDDntSRkrzp+1RpvzbBxFtwQ9jh45eSthbG5hq+D2oJkW5zrGi6TM8eG4ztCqRoO9dEvz2JbQsDCTPz70+C6iPYdkvOyqji18ysLjBbGcHw1j45YItcurVxp0FChxXrnHZwu6m430xKEp7ONxvgEZurt3T8qAjrkrbHfd8jRjDydUXYsMoa",
"session":"n3qp6jzHwZkXWSMW3VBF:jitqBjBmlZbrgcEgY7Od",
"parameters":{
}
}
}
I want to decompress the string in data['securityProxyResponseEnvelope']['apiResponse'].
Here's what I'm doing:
clear_string_from_data = '9mtdE350h9rd4h7wlFX3AkeCtNsb40FFaEZAl/CfcNNKrhGXawhgAs2rI4rnEgIwLpgJkKl+qkB0kzJ6+ZFmmz12Pl9/9MPdA1unUKL5OdHcWmAZim3ZjDXFGCW4zruCS/IOSiU1qVKAF5qIbocB4+2rAF7zH18SRtmXM8YW3eYs5w1NPjmYkM31W8x7QvrKkzFscH3kqDwmYn0I2gNNOtfwuKjWd5snunyqxPopZHNX3CBdW/pj4+N0tJXjAoHorCe8Ypmjxnvh3zthkLTbiBLgeULH1hGvVtkI0C9PGMyt/92upVW6qHxqCYoO/LTJK1tq6OpBnMRBNZDDntSRkrzp+1RpvzbBxFtwQ9jh45eSthbG5hq+D2oJkW5zrGi6TM8eG4ztCqRoO9dEvz2JbQsDCTPz70+C6iPYdkvOyqji18ysLjBbGcHw1j45YItcurVxp0FChxXrnHZwu6m430xKEp7ONxvgEZurt3T8qAjrkrbHfd8jRjDydUXYsMoa'
decoded = Base64.decode64(clear_string_from_data)
inflated = Zlib::Inflate.inflate(decoded)
But this returns
#=> Zlib::DataError: incorrect header check
What's causing this and what could I try next to decompress the data?
What's causing it is that it is not zlib data. You should ask whoever is producing that raw data.
I was getting this when trying to call inflate on data that hadn't been deflated by Zlib. In my case it was for a unit test: I sent in a plain string and simply forgot to call .deflate on it first.
In your case, if you do this instead you don't get the error:
decoded = Zlib::Deflate.deflate(clear_string_from_data)
inflated = Zlib::Inflate.inflate(decoded)
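The same failure mode is easy to reproduce (sketched here in Python rather than Ruby): zlib raises "incorrect header check" whenever the bytes do not start with a zlib header. If the producer actually sent raw DEFLATE data without the 2-byte zlib header, a negative wbits value tells the decompressor to skip the header check:

```python
import zlib

# Decompressing bytes that are not zlib data fails with "incorrect header check"
try:
    zlib.decompress(b"definitely not zlib data")
except zlib.error as exc:
    print(exc)

# If the payload is raw DEFLATE (no zlib header), wbits=-15 skips the header:
comp = zlib.compressobj(wbits=-15)
raw = comp.compress(b"hello") + comp.flush()
print(zlib.decompress(raw, wbits=-15))  # b'hello'
```

The first step before trying tricks like this, though, is still the advice above: confirm with the producer what format "{zlibe}" actually denotes.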
