I am new to Protocol Buffers, and I have just added the library / build requirements for my Maven project. I now have a .proto file in my source repository that has little to nothing in it:
package com.christopher.kade;
option java_package= "protocol";
message Protocol {
  required int32 id = 1;
  required string name = 2;
}
But I've run into a problem with packages: the file above creates a protocol package inside my com.christopher.kade one, and I get an error message stating:
Package name 'protocol' does not correspond to the file path 'com.christopher.kade.protocol'.
What is the right approach to generate my class in my current package? That is, I would have:
com.christopher.kade
|- client.proto
|- MyGeneratedClass.java
|- MyClass.java
The mistake is in this line
option java_package= "protocol";
Change it to
option java_package= "com.christopher.kade";
and you are good!
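For reference, the full corrected file would then look like this (the syntax line is added here for completeness; proto2 is assumed since the file uses required fields):

```proto
syntax = "proto2";

package com.christopher.kade;

// Generate the Java classes into the same package as the rest of the code.
option java_package = "com.christopher.kade";

message Protocol {
  required int32 id = 1;
  required string name = 2;
}
```

With java_package matching the source package, protoc places the generated class alongside MyClass.java.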
I am working on a project that uses some proto sources that were already compiled with a specific protoc version. I also need to compile some custom protos that live in the same project, so my protoc needs to match the one that was used to generate the pre-existing files.
I can see in the pre-generated ones:
#if PROTOBUF_VERSION < 3009000
#if 3009002 < PROTOBUF_MIN_PROTOC_VERSION
In mine:
#if PROTOBUF_VERSION < 3017000
#if 3017000 < PROTOBUF_MIN_PROTOC_VERSION
I don't quite understand which protoc is being used; the one installed on my system is 3.19.4.
Also this is my WORKSPACE:
http_archive(
    name = "rules_proto",
    sha256 = "66bfdf8782796239d3875d37e7de19b1d94301e8972b3cbd2446b332429b4df1",
    strip_prefix = "rules_proto-4.0.0",
    urls = [
        "https://mirror.bazel.build/github.com/bazelbuild/rules_proto/archive/refs/tags/4.0.0.tar.gz",
        "https://github.com/bazelbuild/rules_proto/archive/refs/tags/4.0.0.tar.gz",
    ],
)
load("@rules_proto//proto:repositories.bzl", "rules_proto_dependencies", "rules_proto_toolchains")
rules_proto_dependencies()
rules_proto_toolchains()
http_archive(
    name = "com_github_grpc_grpc",
    urls = [
        "https://github.com/grpc/grpc/archive/refs/tags/v1.44.0.tar.gz",
    ],
    sha256 = "8c05641b9f91cbc92f51cc4a5b3a226788d7a63f20af4ca7aaca50d92cc94a0d",
    strip_prefix = "grpc-1.44.0",
)
load("@com_github_grpc_grpc//bazel:grpc_deps.bzl", "grpc_deps")
grpc_deps()
load("@com_github_grpc_grpc//bazel:grpc_extra_deps.bzl", "grpc_extra_deps")
grpc_extra_deps()
The error I am currently getting is:
In file included from cc/tensorflow/plugin_primeclient/grappler/grappler.cc:7:
bazel-out/aarch64-fastbuild/bin/cc/tensorflow/plugin/protos/graph.pb.h:12:2: error: #error This file was generated by a newer version of protoc which is
12 | #error This file was generated by a newer version of protoc which is
| ^~~~~
bazel-out/aarch64-fastbuild/bin/cc/tensorflow/plugin/protos/graph.pb.h:13:2: error: #error incompatible with your Protocol Buffer headers. Please update
13 | #error incompatible with your Protocol Buffer headers. Please update
| ^~~~~
bazel-out/aarch64-fastbuild/bin/cc/tensorflow/plugin/protos/graph.pb.h:14:2: error: #error your headers.
14 | #error your headers.
| ^~~~~
I'll try to describe the general process I take when tracking down dependency problems in Bazel, as this is a recurring problem that you'll probably run into again.
Before Bazel does anything to do with the build, it looks in your WORKSPACE file to see if it needs to fetch any dependencies. It might not seem like an important detail, but Bazel handles WORKSPACE dependencies from top to bottom: the first declaration of a repository with a given name wins. We can use this behaviour to override the protobuf version used. Check out the maybe macro if you're interested in how this works.
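The maybe macro (from @bazel_tools) is essentially a guard that skips a repository rule when a repository with that name has already been declared; a rough sketch of it:

```starlark
def maybe(repo_rule, name, **kwargs):
    # Only instantiate the repository if nothing earlier in the
    # WORKSPACE has already declared one with the same name.
    if not native.existing_rule(name):
        repo_rule(name = name, **kwargs)
```

This is why declaring com_github_protocolbuffers_protobuf yourself, before rules_proto runs its dependency macros, makes your version take precedence.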
So in your WORKSPACE file, the first dependency that you have is rules_proto at version 4.0.0 via http_archive. Then you are loading two macros from rules_proto here:
load("@rules_proto//proto:repositories.bzl", "rules_proto_dependencies", "rules_proto_toolchains")
rules_proto_dependencies()
rules_proto_toolchains()
So first things first, let's head over to the rules_proto releases page and find your specific release, then click on the commit hash shown for it.
Then click "browse files" in the top right.
This will allow you to browse the state of that repository at that specific version. Now, as you loaded "repositories.bzl", you'll want to open that up and inspect it (Ctrl-F search for protobuf). You'll find that it calls another private macro, i.e.
protobuf_workspace(name = "com_google_protobuf")
If we look for where that macro was loaded, you'll see that you need to follow it through to proto/private/dependencies.bzl:
load("//proto/private:dependencies.bzl", "dependencies", "maven_dependencies", "protobuf_workspace")
After opening that up and searching again for protobuf, you'll find the block that specifies the protobuf version:
"com_github_protocolbuffers_protobuf": {
    # ...
},
So by the looks of it, you are using an older version of protobuf with Bazel than what is installed on your system. In order to override the protobuf version in Bazel, you simply need to add it as a dependency in your WORKSPACE before the rules_proto repository, e.g.
# file: //:WORKSPACE
http_archive(
    name = "com_github_protocolbuffers_protobuf",
    # TODO: Leave this empty and Bazel will tell you what to put here when you build.
    sha256 = "",
    # Note: Same version as your system deps.
    strip_prefix = "protobuf-3.19.4",
    urls = [
        # Note: Same version as your system deps.
        "https://github.com/protocolbuffers/protobuf/releases/download/v3.19.4/protobuf-all-3.19.4.tar.gz",
    ],
)
http_archive(
name = "rules_proto",
# The rest of the WORKSPACE...
In previous projects I used the old protoc-gen-go, which allowed plugins and generated serialization/deserialization and the gRPC client/server in the same pb file.
As far as I understand, protoc-gen-go v1.27.1 no longer allows plugins and demands the --go-grpc_out flag for client/server code.
Running this command:
protoc -I /usr/local/include -I $PWD/api/dummy-proto --go_out=generated --go-grpc_out=generated --go_opt=paths=source_relative proto/v1/foo.proto
I got:
generated
|_proto
|_v1
|_dummy
| |_foo_grpc.pb.go //package dummy
|_foo.pb.go //package dummy
Because of the extra "dummy" folder, the functions in foo_grpc.pb.go cannot see the Request and Response types generated in foo.pb.go.
What am I doing wrong? Is there an option to generate into one place as before? It works properly after moving foo_grpc.pb.go to the same level as foo.pb.go.
Also, is it possible to use the old flag style like --go_out=import_path= and declare the package with M, without slashes and without go options in the proto, like -go_out=import_path=grpc_v1_proto,M$PWD/proto/v1/foo.proto=grpc_v1_proto"?
foo.proto
syntax = "proto3";
package dummy.v1.foo;
option go_package = "proto/v1/dummy";
import "proto/v1/structures.proto";
service FooService {
  rpc reverse(ReverseRequest) returns (ReverseResponse);
  rpc getBar(GetBarRequest) returns (GetBarResponse);
}
message ReverseRequest {
  string text = 1;
}
message ReverseResponse {
  string reversed_text = 1;
}
message GetBarRequest {
}
message GetBarResponse {
  structures.Bar bar = 1;
}
As per the comments, you need to add --go-grpc_opt=paths=source_relative. This is covered in the basics tutorial (though that really just gives the command without much detail).
protoc-gen-go-grpc uses code shared with protoc-gen-go to process most of these options so the documentation for Go Generated Code will probably answer your questions (just change go_opt to go-grpc_opt).
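Putting that together, the command from the question with the extra flag added would look something like this (the include paths are taken from the question and are illustrative):

```sh
protoc -I /usr/local/include -I $PWD/api/dummy-proto \
  --go_out=generated --go_opt=paths=source_relative \
  --go-grpc_out=generated --go-grpc_opt=paths=source_relative \
  proto/v1/foo.proto
```

With paths=source_relative applied to both plugins, foo.pb.go and foo_grpc.pb.go should land in the same directory, so the generated gRPC code can see the message types.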
Have a relatively simple helloworld.proto file down below
syntax = "proto3";
package helloworld;
service Greeter { rpc SayHello(HelloRequest) returns (HelloResponse); }
message HelloRequest { string name = 1; }
message HelloResponse { string message = 1; }
When I run protoc --js_out=import_style=commonjs,binary:. .\helloworld.proto it generates a helloworld_pb.js file, but it doesn't include my Greeter service nor my SayHello rpc function. I've looked around a few other posts and also Google's reference (https://developers.google.com/protocol-buffers/docs/reference/overview), and it seems like I need to include a --plugin option, but I can't seem to find any. Does anybody have a solution for this?
The protoc plugin for Node gRPC is distributed in the grpc-tools npm package. That package provides a tool grpc_tools_node_protoc that is a version of protoc that automatically includes the plugin.
As described in that package's README, when you run the tool you will also need to use the --grpc_out argument to control the plugin. The question is tagged grpc-js, so you will probably want to use the grpc_js option for that argument to generate code that interacts with grpc-js.
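For example, a command along these lines (the output directory name is illustrative) should generate both the message code and the grpc-js service code:

```sh
grpc_tools_node_protoc \
  --js_out=import_style=commonjs,binary:./output \
  --grpc_out=grpc_js:./output \
  helloworld.proto
```

This should produce helloworld_pb.js (the messages) alongside a generated service file containing the Greeter client/server definitions for @grpc/grpc-js.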
For those who have been looking for an example that also generates TypeScript, see below:
grpc_tools_node_protoc.cmd --js_out=import_style=commonjs,binary:.\output --grpc_out=generate_package_definition:.\output *.proto
grpc_tools_node_protoc.cmd --plugin=protoc-gen-ts.cmd=./node_modules/.bin/protoc-gen-ts --ts_out=.\typescript -I .\output *.proto
I am trying to create a gRPC service with a single, very basic action, GetDeployment, which takes a namespace and a name as input and returns a Kubernetes Deployment. The thing is, I do not want to define my own message for the Deployment, as it already exists in the official Kubernetes repository.
I am pretty new to gRPC and probably do not understand well enough how it works, but can I import this message into my own file so that I could then write the following .proto file?
syntax = "proto3";
package api;
import "google/api/annotations.proto";
import "k8s.io/kubernetes/pkg/api/v1/generated.proto";
message GetDeploymentOptions {
  string namespace = 1;
  string name = 2;
}
service AppsV1 {
  rpc GetDeployment(GetDeploymentOptions) returns (k8s.io.kubernetes.pkg.api.v1.Deployment) {}
}
Thank you in advance
gRPC codegen is just a protoc plugin. It generates code for service and rpc definitions, but it follows the normal protobuf rules for imports.
In your example, if your file is in src/api.proto and the k8s api repo is a git submodule checked out into thirdparty/k8s.io/api folder you would generate the files you'd need by running:
root>protoc.exe -I thirdparty k8s.io/api/core/v1/generated.proto --go_out=go
root>protoc.exe -I thirdparty src/api.proto --go_out=plugins=grpc:go
The first command is generating the .pb.go file which contains the k8s messages, while the second command is generating the .pb.go file which contains your messages and your service.
Looking at the transitive imports of that file, you may also need to check out apimachinery into k8s.io/apimachinery and run protoc on that file as well.
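Note that with -I thirdparty, the import path inside api.proto has to match the layout under that include root. Assuming the submodule layout above, the import would look something like:

```proto
syntax = "proto3";
package api;

// Path is resolved relative to the "-I thirdparty" include root.
import "k8s.io/api/core/v1/generated.proto";

message GetDeploymentOptions {
  string namespace = 1;
  string name = 2;
}
```

The fully-qualified message names used in your service then follow whatever package the imported generated.proto declares.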
I've been using protoc to generate golang gRPC client and server code without issues. Since I have multiple gRPC services that use the same data types, I'd like to refer to a base.proto for these types rather than copy and paste, which is extra work and may result in out of sync issues.
Here's a sample of base.proto:
syntax = "proto3";
package base;
message Empty {
}
message Label {
  string Key = 1;
  string Value = 2;
}
Here's a sample specific .proto:
syntax = "proto3";
import "base.proto";
package publisher;
service ClientPublisher {
  rpc Publish(stream base.Label) returns (base.Empty) {}
}
And, here's my command:
protoc -I system-client-go/ system-client-go/client/publisher.proto --go_out=plugins=grpc:system-client-go --proto_path=system-client-go/
No matter what I try, it throws this:
2019/08/01 15:31:31 protoc-gen-go: error:bad Go source code was generated: 273:7: expected type, found '.' (and 10 more errors)
which corresponds to the line:
rpc Publish(stream base.Label) returns (base.Empty) {}
Any ideas?
This kind of error is normally because your relative path is wrong. Try putting the specific proto file in a directory and importing it like:
import "exampleproject/specific.proto";
If both files are in the same directory, the solution is explained in this thread: https://github.com/golang/protobuf/issues/322
Basically, Go only allows one package per directory, so protoc-gen-go is treating them as two separate libraries.
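One common way to resolve this, following the one-package-per-directory rule above (the directory layout and module path here are hypothetical, not taken from the question), is to put the files in separate directories and give each its own go_package option:

```proto
// base/base.proto
syntax = "proto3";
package base;
// Hypothetical module path; adjust to your own Go module.
option go_package = "example.com/system-client-go/base";

message Empty {
}
message Label {
  string Key = 1;
  string Value = 2;
}

// client/publisher.proto
syntax = "proto3";
package publisher;
import "base/base.proto";
option go_package = "example.com/system-client-go/publisher";

service ClientPublisher {
  rpc Publish(stream base.Label) returns (base.Empty) {}
}
```

With distinct go_package values, the generated Go code for base and publisher ends up in separate Go packages, and the publisher code imports the base package instead of trying to mix two packages in one directory.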