I am using protobuf for Java, with the following .proto files:
// service1.proto
option java_package = "package";
option java_outer_classname = "Proto1";
message M {
... // some definition
}
and
// service2.proto
option java_package = "package";
option java_outer_classname = "Proto2";
message M {
... // some different definition
}
When compiling, an error is thrown for service2.proto saying that "M" is already defined in service1.proto.
But judging from the packages and the generated code, they should be package.Proto1.M and package.Proto2.M, so why is there a conflict?
The "package" is also a .proto concept (not just a language/framework concept); if you need to have both schemas involved in anything, it may be useful to add
package Proto1;
to service1.proto and
package Proto2;
to service2.proto
Alternatively, if the M is actually the same in both places: move M into a single shared file, and import it from both service1.proto and service2.proto.
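To illustrate the shared-definition alternative, here is a minimal sketch (file and package names are placeholders, not from the original question):

```proto
// common.proto
syntax = "proto3";
package common;

message M {
  // ... the shared definition
}
```

Each service file then declares its own package, adds import "common.proto";, and refers to the shared message as common.M.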
Similar to the example at: https://developers.google.com/protocol-buffers/docs/proto#extensions
Suppose I have a proto like:
import "google/protobuf/descriptor.proto";
extend google.protobuf.FieldOptions {
string search_key = 7000;
}
message Person {
string name = 1 [(search_key) = "searchIndex.firstName"];
}
then I use the protobufjs-cli to generate a static module:
pbjs -t static-module -w commonjs -o compiled.js test.proto
How can I then read the descriptor in javascript using the generated module?
As mentioned in a comment, this required access to either the original .proto file or the output of:
pbjs -t proto3 -o compiled.proto myfile.proto
but once I had that it was simply a matter of:
import * as protobuf from 'protobufjs';
const root = protobuf.loadSync('test.proto');
const Person = root.lookupType('main.Person');
console.log(Person.fields.name.options['(search_key)']);
// logs: searchIndex.firstName
I am writing this code in Vala, using Camel
using Camel;
[...]
MimeParser par = new MimeParser();
[...]
par.push_state( MimeParserState.MULTIPART, boundary );
I downloaded camel-1.2.vapi from the GitHub vala-girs repository, put it in a vapi subdirectory, and compiled with:
valac --vapidir=vapi --includedir=/usr/include/evolution-data-server/camel --pkg camel-1.2 --pkg posix --target-glib=2.32 -o prog prog.vala -X -lcamel-1.2
When compiling, I get this error:
error: unknown type name "CamelMimeParserState"
const gchar* camel_mime_parser_state_to_string (CamelMimeParserState self);
Looking at the generated C code, I see that the CamelMimeParserState type is used several times but never defined. It should be a simple enum, because the camel-1.2.vapi file says:
[CCode (cheader_filename = "camel/camel.h", cprefix = "CAMEL_MIME_PARSER_STATE_", has_type_id = false)]
public enum MimeParserState {
INITIAL,
PRE_FROM,
FROM,
HEADER,
BODY,
MULTIPART,
MESSAGE,
PART,
END,
EOF,
PRE_FROM_END,
FROM_END,
HEADER_END,
BODY_END,
MULTIPART_END,
MESSAGE_END
}
So why doesn't the generated C code simply use an enum as the .vapi file says (described by the cprefix CAMEL_MIME_PARSER_STATE_)?
Is there an error in the .vapi file?
I found the solution. The .vapi file is wrong because the cname attribute is missing. Changing the .vapi file by adding cname="camel_mime_parser_state_t":
[CCode (cheader_filename = "camel/camel.h", cname="camel_mime_parser_state_t", cprefix = "CAMEL_MIME_PARSER_STATE_", has_type_id = false)]
public enum MimeParserState {
INITIAL,
[...]
works correctly.
I'm trying to use protocol buffers with ScalaPB. I've added:
import com.trueaccord.scalapb.compiler.Version.scalapbVersion
libraryDependencies += "com.trueaccord.scalapb" %% "scalapb-runtime" % scalapbVersion % "protobuf"

PB.targets in Compile := Seq(
  scalapb.gen() -> (sourceManaged in Compile).value
)
and below lines to plugins.sbt
addSbtPlugin("com.thesamet" % "sbt-protoc" % "0.99.3")
libraryDependencies += "com.trueaccord.scalapb" %% "compilerplugin" % "0.5.47"
and .proto file under src/main/protobuf:
syntax = "proto3";
import "scalapb/scalapb.proto";
import "google/protobuf/wrappers.proto";
package actors;
message ExamplePROTO {
double value = 1;
}
but when I run sbt compile nothing happens: no class is generated in target. The file seems to be ignored, because the project still compiles even with a typo in the .proto file. I've also tried looking at the debug logs using
logLevel in Global := Level.Debug
but the only .proto files mentioned there are internal ones like /target/protobuf_external/google/protobuf/source_context.proto.
How can I proceed with this? Can I get more information about what ScalaPB is looking at?
I am prototyping a meta-model on top of proto3 to generate domain-specific boilerplate, since the proto3 extension syntax is ridiculously expressive. My domain .proto files depend on meta.proto, which contains the extensions.
I can compile these to Go. When including the meta.proto file, the generated Go ends up with the following import block:
import proto "github.com/golang/protobuf/proto"
import fmt "fmt"
import math "math"
import google_protobuf "google/protobuf" <--- this import does not exist !!
My extension file has the following structure(based off this):
syntax = "proto2";
package "...";
option go_package = "...";
import "google/protobuf/descriptor.proto"; <--- this causes the import
// message MyExtensionClass ...
// message MyExtensionField ...
extend google.protobuf.MessageOptions {
optional MyExtensionClass class = 50000;
}
extend google.protobuf.FieldOptions {
optional MyExtensionField field = 50001;
}
I know the solution is likely very simple; the google/protobuf import is presumably meant for C++ generation.
In my workspace the imported package should be "github.com/golang/protobuf/protoc-gen-go/descriptor".
Poor man's solution. It's not ideal, but redirecting it to the relevant Go import works:
sed -i '' -e 's/import google_protobuf \"google\/protobuf\"/import google_protobuf \"github.com\/golang\/protobuf\/protoc-gen-go\/descriptor\"/g' pkg/domain/proto/extensions/*.pb.go
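A less brittle alternative, assuming your protoc-gen-go version supports import-path substitution via M flags (the input file path below is a placeholder for your own layout), is to map the import at generation time rather than patching the output afterwards:

```shell
# Tell protoc-gen-go which Go package implements descriptor.proto,
# so the generated imports are correct from the start (no sed pass needed).
protoc \
  --go_out=Mgoogle/protobuf/descriptor.proto=github.com/golang/protobuf/protoc-gen-go/descriptor:. \
  pkg/domain/proto/extensions/meta.proto
```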
Google protobuf is a nice IDL for RPC, but I want to know how to write my own code generator for protobuf.
The protoc compiler can output a protobuf-formatted description of the .proto file. That way most of the parsing has been done for you already, and you only need to generate the output you want.
The .proto schema for the .proto file description is here:
https://github.com/google/protobuf/blob/master/src/google/protobuf/descriptor.proto
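For example, protoc can write that description to a file with its --descriptor_set_out flag; the result is a serialized FileDescriptorSet (the top-level message in descriptor.proto) that your generator can parse with any protobuf runtime. The input file name here is a placeholder:

```shell
# Dump a serialized FileDescriptorSet for myfile.proto (plus everything
# it imports) into myfile.desc; a generator can then parse that binary
# file instead of having to parse .proto syntax itself.
protoc --include_imports --descriptor_set_out=myfile.desc myfile.proto
```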
As an additional step, you can make your generator runnable via a "--mygenerator_out=." option on protoc itself:
https://developers.google.com/protocol-buffers/docs/reference/other
Here is one (albeit a bit convoluted) example of how a code generator can be written in Python:
https://github.com/nanopb/nanopb/blob/master/generator/nanopb_generator.py
A protoc plugin is a binary that reads a protobuf message of type CodeGeneratorRequest from standard input and writes a message of type CodeGeneratorResponse to standard output.
The binary must be called protoc-gen-NAME and can be used by invoking the protoc command with:
protoc --plugin=./path/to/protoc-gen-NAME --NAME_out=./test/generated ./test.proto
Note specifically that names are important. This will not work; it will invoke the Java generator instead:
protoc --plugin=./path/to/protoc-gen-NAME --java_out=./test/generated ./test.proto
This will not work, because the binary does not have the correct name:
protoc --plugin=./path/to/whatever-NAME --NAME_out=./test/generated ./test.proto
In order to process the incoming CodeGeneratorRequest and generate a valid response, your binary must itself be able to parse the protobuf messages defined in plugin.proto from the protocolbuffers repository.
Historically this was difficult to do in a self-contained manner, but you can now do this end-to-end entirely in Rust with just the protobuf crate, as this trivial example demonstrates:
[dependencies]
protobuf="3.0.2"
use protobuf::plugin::{code_generator_response, CodeGeneratorRequest, CodeGeneratorResponse};
use protobuf::Message;
use std::io;
use std::io::{BufReader, Read, Write};
fn main() {
    // Read the serialized CodeGeneratorRequest from stdin
    let mut reader = BufReader::new(io::stdin());
    let mut incoming_request = Vec::new();
    reader.read_to_end(&mut incoming_request).unwrap();

    // Parse it as a request
    let req = CodeGeneratorRequest::parse_from_bytes(&incoming_request).unwrap();

    // Generate the content for each output file
    let mut response = CodeGeneratorResponse::new();
    for proto_file in req.proto_file.iter() {
        let mut output = String::new();
        output.push_str(&format!("// from file: {:?}\n", &proto_file.name));
        output.push_str(&format!("// package: {:?}\n", &proto_file.package));
        for message in proto_file.message_type.iter() {
            output.push_str(&format!("\nmessage: {:?}\n", &message.name));
            for field in message.field.iter() {
                output.push_str(&format!(
                    "- {:?} {:?} {:?}\n",
                    field.type_,
                    field.type_name,
                    field.name(),
                ));
            }
        }
        // Add it to the response
        let mut output_file = code_generator_response::File::new();
        output_file.content = Some(output);
        output_file.name = Some(format!("{}/out.txt", proto_file.name.as_ref().unwrap()));
        response.file.push(output_file);
    }

    // Serialize the response to a binary message and write it to stdout
    let out_bytes: Vec<u8> = response.write_to_bytes().unwrap();
    io::stdout().write_all(&out_bytes).unwrap();
}
Obviously this trivial example doesn't generate code, just text files, but it shows the basic process. You should also iterate over the service entries and deal with all the additional properties on each type.
What this basically gives you is an AST matching the .proto files; the codegen side of it can be done however you like.
Helpful hints:
Do not log to stdout in your plugin (e.g. for debugging); the only permitted output on stdout is a protobuf-format CodeGeneratorResponse message.
The plugin does not write files; the protoc command does that. Your plugin should generate content and return an array of file entries, each with its name, content, and metadata.
For more information on plugins, carefully read the plugin.proto file mentioned above; it contains extensive details.