Where is the (meta) .proto file which describes .desc files?
I make .desc files with:
protoc --descriptor_set_out=foo.desc --include_imports foo.proto
Am I correct in believing that the .desc files are in protobuf format?
If so, where can I get the .proto file which describes their format?
The format is FileDescriptorSet as defined in descriptor.proto:
https://code.google.com/p/protobuf/source/browse/trunk/src/google/protobuf/descriptor.proto
descriptor.proto is typically installed to /usr/include/google/protobuf/descriptor.proto or /usr/local/include/google/protobuf/descriptor.proto on Unix systems. descriptor.pb.h is installed with the protobuf headers, and descriptor.pb.cc is compiled into the protobuf library, so you don't have to generate them yourself if you are using C++. Similarly, in Java, the com.google.protobuf.DescriptorProtos class is compiled into the base library.
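For illustration, here is a minimal Python sketch that parses a .desc file as a FileDescriptorSet and lists the messages it describes (it assumes the protobuf pip package is installed and that foo.desc was produced by the protoc command above):
# Minimal sketch: parse foo.desc (a serialized FileDescriptorSet) and list
# the .proto files and messages it contains. Assumes `pip install protobuf`
# and a foo.desc produced by `protoc --descriptor_set_out`.
from google.protobuf import descriptor_pb2

with open("foo.desc", "rb") as f:
    fds = descriptor_pb2.FileDescriptorSet()
    fds.ParseFromString(f.read())

for file_proto in fds.file:  # one FileDescriptorProto per .proto file
    print(file_proto.name)
    for message in file_proto.message_type:
        print("  message:", message.name)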
If you install protocol buffers, the definition is in
<PB install directory>/src/google/protobuf/descriptor.proto
Most installation processes (e.g. for Java) will generate the corresponding pb classes from this definition.
As keton said, it is also available at
https://code.google.com/p/protobuf/source/browse/trunk/src/google/protobuf/descriptor.proto
Presumably, it should be in the reference documentation, here:
https://developers.google.com/protocol-buffers/docs/reference/cpp/google.protobuf.descriptor.pb#FileDescriptorSet
Related
I have two .proto files (a.proto and b.proto) with identical contents:
syntax = "proto3";
message env {
  string broker = 1;
}
When I execute the compiler (I want to generate Java source files) and specify both files on the command line
protoc.exe --java_out=. a.proto b.proto
I get error messages:
b.proto:4:10: "env.broker" is already defined in file "a.proto".
b.proto:3:9: "env" is already defined in file "a.proto".
I'd expect the compiler to generate two Java classes (A and B), each having a nested class Env. This is how I understand the docs. But this does not happen.
What am I doing wrong?
Thank you for any hints.
Most protobuf libraries put all messages with the same package name into the same namespace. The Java library is special in generating the outer class, which is needed because Java does not allow multiple top-level classes per file.
Protoc is doing you a service by checking this rule. If you went ahead and used the same message name in multiple files without specifying a package, you would make it very difficult for other people to use your .proto files with different programming languages.
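To illustrate the usual fix, here is a hedged Python sketch (assuming protoc is on your PATH) that writes the two files with distinct package declarations and reruns the same protoc invocation, which then succeeds:
# Sketch: with distinct `package` declarations, protoc accepts two files
# that both define a message named `env`. Assumes `protoc` is on the PATH.
import pathlib
import subprocess
import tempfile

A_PROTO = 'syntax = "proto3";\npackage a;\nmessage env { string broker = 1; }\n'
B_PROTO = 'syntax = "proto3";\npackage b;\nmessage env { string broker = 1; }\n'

with tempfile.TemporaryDirectory() as tmp:
    tmp = pathlib.Path(tmp)
    (tmp / "a.proto").write_text(A_PROTO)
    (tmp / "b.proto").write_text(B_PROTO)
    # Same invocation as in the question; it now succeeds because the two
    # messages are a.env and b.env rather than two top-level `env`s.
    subprocess.run(["protoc", "--java_out=.", "a.proto", "b.proto"],
                   cwd=tmp, check=True)
    print(sorted(p.relative_to(tmp).as_posix() for p in tmp.rglob("*.java")))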
I am trying to have protoc-generated server interface and client implementation in separate packages
The header part of my .proto files is the following:
syntax = "proto3";
option go_package = "github.com/<username>/<myservice>/pkg/grpc";
And I am using this command to generate .go files:
protoc --go_out=. --go_opt=paths=source_relative \
--go-grpc_out=. --go-grpc_opt=paths=source_relative \
pkg/grpc/*.proto
It generates pkg/grpc/<name>.pb.go files containing the models and pkg/grpc/<name>_grpc.pb.go files containing the server interface and the client implementation.
But I want the server code to go to, say, internal/pkg/grpc/, while the client and the models remain inside pkg/grpc/, with the server correctly importing the models.
Versions:
protoc version is libprotoc 3.19.0
protoc-gen-go-grpc version is protoc-gen-go-grpc 1.1.0
protoc-gen-go version is protoc-gen-go v1.27.1
I am new to Go and protobuf, so if whatever I am asking happens to be bad practice, feel free to point me to the idiomatic approach.
It seems there isn't an option to do this. The protoc-gen-go-grpc plugin writes all generated service code to a single file with the _grpc.pb.go suffix, where "service" includes both the client and the server code.
You can only define different output paths per plugin:
protoc-gen-go supports --go_out and --go_opt flags
protoc-gen-go-grpc supports --go-grpc_out and --go-grpc_opt flags
When generating a new library, we can specify dependencies in Android.mk. For example, header dependencies can be specified in LOCAL_C_INCLUDES, and library dependencies in LOCAL_STATIC_LIBRARIES and LOCAL_SHARED_LIBRARIES.
But I could not find anything about LOCAL_HEADER_LIBRARIES in the Android documentation (linked below):
https://source.android.com/devices/architecture/vndk/build-system
I asked my lead about this. He said that LOCAL_HEADER_LIBRARIES references header library modules defined in an Android.bp file; these modules provide the header files for the libraries listed in LOCAL_SHARED_LIBRARIES.
That's what I understand.
Say you want to write a program that consumes a tree-sitter grammar you've written, using the node-tree-sitter package. How do you package the grammar for consumption? What is the minimal set of files that must be included in the node module? There is a tree-sitter-specific section in the module's package.json; is it important to fill that out?
No, that section of the package.json is only used by the tree-sitter CLI tool, when running tree-sitter parse or tree-sitter highlight. It is described here.
To use a Tree-sitter grammar with node-tree-sitter, you just need to ensure that the nan module is included in the dependencies of your package.json. The tree-sitter generate command will generate the other files that are needed for exposing the code to Node.js: binding.gyp and src/binding.cc. If you want to publish your module to npmjs.com, you can do that with the usual commands (e.g. npm publish).
It seems to me that SCons targets are not built in the order they are declared. My problem is that I need to generate some code first: I'm using protoc to process a my.proto file into .h and .cc files. I need some pseudo-code like this (what should the working code look like?):
import os
env = Environment(ENV=os.environ, LIBPATH='/usr/local/lib')
env.ShellExecute('protoc', '--outdir=. --out-lang=cpp', 'my.proto')  # produces my.cc
myObj = Object('my.cc')  # should wait until 'my.cc' is generated by protoc
Dependency(myObj, 'my.cc')
mainObj = Object('main.cpp')
My question is:
How do I specify this shell execution of protoc in a SConstruct/SConscript?
How do I make sure that the compilation of 'main.cpp' depends on the existence of 'my.cc', in other words, that it waits until 'my.cc' has been generated?
Your observations and assumptions are correct: SCons will not execute the individual build commands in the order that you list them in the SConstruct files. It will run them based on the dependencies between the targets and source files of your build, defined either implicitly (header includes in C++, for example) or explicitly (via the Depends() method).
So you have to define and set up your dependencies correctly, such that SCons delivers the output that you want. For the special protoc case in your example, a dedicated Builder exists that will help you get the dependency graph right. It is available in our ToolsIndex, where support for a variety of other languages and dialects can also be found.
These special Builders will emit the correct target nodes when given a *.proto input file, and SCons is then able to automatically detect the dependency between the protoc input and your main program if you write something like:
env=Environment(tools=['default','protoc'])
env.Protoc([], "test.proto")
env.Program('main', ['main.cpp'] + Glob('*.cc'))
The Glob('*.cc') will detect your *.cc files, coming out of the protoc Tool, and include them as dependencies for your final target main.
You can always write your own Builders and Emitters in SCons, which is the canonical way of making new tools and toolchains known to SCons' dependency analysis. You can find more information about this in the UserGuide, sect. "18 Writing Your Own Builders", and especially in our ToolsForFools guide.
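For completeness, here is a minimal sketch of the explicit route (without the protoc Tool), using a plain Command() call so that SCons knows exactly which files protoc produces; the file names are taken from the question, and the protobuf link library is an assumption:
# SConstruct sketch: declare protoc's outputs explicitly with Command(), so
# SCons orders the build correctly without any manual Depends() calls.
import os

env = Environment(ENV=os.environ, LIBPATH=['/usr/local/lib'])

# Tell SCons that my.pb.cc and my.pb.h are produced from my.proto by protoc.
env.Command(
    target=['my.pb.cc', 'my.pb.h'],
    source='my.proto',
    action='protoc --cpp_out=. $SOURCE')

# Listing my.pb.cc as a source makes the program depend on the Command above,
# so protoc runs before any compilation that needs the generated files.
env.Program('main', ['main.cpp', 'my.pb.cc'], LIBS=['protobuf'])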