Handling errors in a gRPC service commonly requires both a status message and error codes. Each has two definitions:
Google APIs definition (googleapis/go-genproto) - the generated Go packages for common protocol buffer types, and the generated gRPC code for Google's gRPC APIs
gRPC definition (grpc/grpc-go) - the Go implementation of gRPC
The Go packages for both definitions of Status and Codes are:
Google APIs
Status: google.golang.org/genproto/googleapis/rpc/status
Codes: google.golang.org/genproto/googleapis/rpc/code
gRPC
Status: google.golang.org/grpc/status
Codes: google.golang.org/grpc/codes
Since I'm a client of my own gRPC service and not a client of an existing Google gRPC API, I want to use the gRPC definitions of Status and Code.
However, the gRPC proto file for Status is actually copied from the Google APIs definition; see https://github.com/grpc/grpc/tree/master/src/proto/grpc/status. The go_package of status.proto is also unchanged, so both the Google APIs and gRPC definitions use the following Go package:
option go_package = "google.golang.org/genproto/googleapis/rpc/status;status";
The upshot is that the only way to use Status when defining an API is to import it with
import "google/rpc/status.proto";
...and importing the language bindings in Go with
import (
"google.golang.org/genproto/googleapis/rpc/status"
)
// Go server code...
But as stated earlier, this is wrong since I'm not a client of a Google API, but rather my own gRPC service. Therefore the language bindings should be imported with
import (
"google.golang.org/grpc/status"
)
// Go server code...
As expected, if I switch to importing the gRPC language bindings and try to return a Status message to the API client, I get a compile error:
cannot use &(status.Status literal)
(value of type *"google.golang.org/grpc/internal/status".Status) as
*"google.golang.org/genproto/googleapis/rpc/status".Status value
This is caused by my .proto file using the Google API definition of Status while the server implementation (in Go) uses the gRPC definition.
The problem also impacts error codes, since Google APIs uses signed 32-bit integers (int32) whereas gRPC uses unsigned 32-bit integers (uint32).
Questions
Is my assertion that I should be using the gRPC definition of Status and Codes correct?
If my assertion is correct, how can I use the gRPC definition of Status when it's packaged for Google APIs?
We need to distinguish a few cases. Some of them are obvious, some are not.
Just returning Status from a gRPC handler
If your proto schema (.proto files) doesn't define messages that use Status or Code directly, then the gRPC handlers can satisfy the error return type simply with "google.golang.org/grpc/status".Error(), or Newf().Err(). And that's about it.
Example:
// implements SomeServiceServer unary RPC GetFoo
func (s *SomeService) GetFoo(ctx context.Context, req *grpc.FooRequest) (*grpc.FooResponse, error) {
	// status is "google.golang.org/grpc/status", codes is "google.golang.org/grpc/codes"
	return nil, status.Error(codes.Unimplemented, "coming soon")
}
Using Status in your .proto files
In this case, you are forced to use the googleapis implementation. As you already have seen, the status.proto Go package is defined as:
option go_package = "google.golang.org/genproto/googleapis/rpc/status;status";
So let's say you have the following .proto file, where the imported status.proto is just a copy-paste of the gRPC status.proto as per this question:
syntax = "proto3";
package test;
import "status.proto";
option go_package = ".;main";
message Foo {
  string a = 1;
  google.rpc.Status status = 2;
}
with directory structure as:
/protos
|_ status.proto
|_ test.proto
and you compile the above with:
cd protos && protoc -I=. --go_out=. test.proto
...then the generated Go code will have the following import:
import (
status "google.golang.org/genproto/googleapis/rpc/status"
)
and you must satisfy that dependency with go get google.golang.org/genproto.
So about your first question, you can only use Status from googleapis in proto files, because that's how status.proto declares its Go package.
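For orientation, the Foo message generated from the proto above corresponds roughly to the following Go struct (a simplified sketch; the real generated code also carries protobuf struct tags and internal bookkeeping fields):
// Simplified sketch of the protoc-gen-go output for Foo (illustrative only).
type Foo struct {
	A      string         // proto field "a"
	Status *status.Status // proto field "status"; status is "google.golang.org/genproto/googleapis/rpc/status"
}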
Using generated googleapis Status in Go code
Since the imported Go package is from googleapis, that is what you must use in your Go code in order to initialize such messages:
package main

import (
	"fmt"

	"google.golang.org/genproto/googleapis/rpc/code"
	googleapis "google.golang.org/genproto/googleapis/rpc/status"
)

func main() {
	foo := &Foo{
		A: "foo",
		Status: &googleapis.Status{
			Code:    int32(code.Code_OK),
			Message: "all good",
		},
	}
	fmt.Println(foo)
}
Yes, but I must use grpc-go Status in my Go code
You can't. protoc generates code with the packages described above. If you absolutely NEED to construct these Status fields using grpc-go, you can use Status.Proto:
package main

import (
	"fmt"

	"google.golang.org/grpc/codes"
	"google.golang.org/grpc/status"
)

func main() {
	foo := &Foo{
		A:      "foo",
		Status: status.New(codes.OK, "all good").Proto(),
	}
	fmt.Println(foo)
}
Just for the record, the opposite is also possible with status.FromProto:
package main

import (
	"fmt"

	googleapis_codes "google.golang.org/genproto/googleapis/rpc/code"
	googleapis "google.golang.org/genproto/googleapis/rpc/status"
	"google.golang.org/grpc/status"
)

func main() {
	gapisStatus := &googleapis.Status{
		Code:    int32(googleapis_codes.Code_OK),
		Message: "all good",
	}
	grpcStatus := status.FromProto(gapisStatus)
	fmt.Println(grpcStatus)
}
As a less well-behaved alternative, you can simply copy-paste the status.proto sources into your project and manually change the go_package:
option go_package = "google.golang.org/grpc/status;status";
This way protoc will generate the Go code with this import, and your own sources will be able to follow suit. Of course this means you now have your own fork of status.proto.
Related
I'm new to Protocol Buffers and gRPC stuff. Now I'm trying to build a client/server architecture with grpc + grpc-gateway in Go.
I tried to follow some examples but I always end up with the same problem.
After generating the code with protoc, I run go build and get this error:
proto/helloworld/hello_world.pb.gw.go:64:2: cannot use msg (type *HelloReply) as type protoreflect.ProtoMessage in return argument:
*HelloReply does not implement protoreflect.ProtoMessage (missing ProtoReflect method)
proto/helloworld/hello_world.pb.gw.go:98:2: cannot use msg (type *HelloReply) as type protoreflect.ProtoMessage in return argument:
*HelloReply does not implement protoreflect.ProtoMessage (missing ProtoReflect method)
This is go.mod:
module github.com/riccardopedrielli/grpc-gateway-test
go 1.15
require (
github.com/golang/protobuf v1.4.3
github.com/grpc-ecosystem/grpc-gateway/v2 v2.2.0
google.golang.org/genproto v0.0.0-20210207032614-bba0dbe2a9ea
google.golang.org/grpc v1.35.0
google.golang.org/protobuf v1.25.0
)
This is hello_world.proto:
syntax = "proto3";
package helloworld;
import "google/api/annotations.proto";
option go_package = "github.com/riccardopedrielli/grpc-gateway-test/proto/helloworld";
// Here is the overall greeting service definition where we define all our endpoints
service Greeter {
  // Sends a greeting
  rpc SayHello (HelloRequest) returns (HelloReply) {
    option (google.api.http) = {
      get: "/v1/example/echo/{name}"
    };
  }
}
// The request message containing the user's name
message HelloRequest {
string name = 1;
}
// The response message containing the greetings
message HelloReply {
string message = 1;
}
This is the link to the repository: https://github.com/riccardopedrielli/grpc-gateway-test
A difference I see between the generated Go files is that they import different protobuf libraries.
The one generated by protoc-gen-go imports github.com/golang/protobuf/proto.
The one generated by protoc-gen-grpc-gateway imports google.golang.org/protobuf/proto.
Could this be the cause of the problem?
Still, it's not clear to me which one should be used and how to force the same one in both generators.
I'm new to gRPC and quite lost at this point, so I could have omitted some important information. Any suggestion is welcome.
Thank you
OK, I solved the issue.
I had installed protoc via snap, and the stable channel had version 3.11.4.
After upgrading to 3.14.0, everything is working well.
For generating the stubs, we can use either protoc or buf. protoc is the more classic generation experience, used widely in the industry, but it has a pretty steep learning curve. buf is a newer tool built with user experience and speed in mind. It also offers linting and breaking change detection, something protoc doesn't provide.
You should check out the tutorial series on gRPC-Gateway at https://grpc-ecosystem.github.io/grpc-gateway/docs/tutorials/. You can also refer to my simple hello world program, which uses gRPC-Gateway: https://github.com/iamrajiv/helloworld-grpc-gateway.
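As a sketch of a consistent generation setup (assuming the three plugins are installed from their current module paths and that the google/api annotation protos are available on the include path), something along these lines keeps all generated files on the same protobuf runtime:
go install google.golang.org/protobuf/cmd/protoc-gen-go@latest
go install google.golang.org/grpc/cmd/protoc-gen-go-grpc@latest
go install github.com/grpc-ecosystem/grpc-gateway/v2/protoc-gen-grpc-gateway@latest

protoc -I . \
    --go_out . --go_opt paths=source_relative \
    --go-grpc_out . --go-grpc_opt paths=source_relative \
    --grpc-gateway_out . --grpc-gateway_opt paths=source_relative \
    proto/helloworld/hello_world.proto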
Hi, I am trying to generate Go code for this simple protobuf file:
syntax = "proto3";
package gen;
message EvtKeepAlive
{
string SvcName = 2;
}
In the header I see that the generated package uses two different Go protobuf implementations, one from github.com and one from google.golang.org. As far as I understand, the latter supersedes the former, so is this generated file valid?
// versions:
// protoc-gen-go v1.25.0-devel
// protoc v3.13.0
// source: common.proto
package gen
import (
proto "github.com/golang/protobuf/proto"
protoreflect "google.golang.org/protobuf/reflect/protoreflect"
protoimpl "google.golang.org/protobuf/runtime/protoimpl"
reflect "reflect"
sync "sync"
)
The file is valid; see the comments from dsnet in response to this issue:
The only reason the newly generated .pb.go files depend on the
deprecated proto package is to enforce a weak dependency on a
sufficiently new version of the legacy package. This is necessary
because not everyone is using Go modules such that the Go toolchain
would enforce this dependency constraint. I wasn't fond of adding it,
but I think it's probably necessary to keep at least for a few months.
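Concretely, that weak dependency shows up in the generated .pb.go file as a compile-time guard roughly like this (the exact constant depends on the protoc-gen-go version):
// Compile-time assertion that a sufficiently up-to-date version of the
// legacy github.com/golang/protobuf package is being used.
const _ = proto.ProtoPackageIsVersion4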
Is it possible to use go doc to view all sub-packages defined under a specific package?
Say, I want to view all sub-packages under crypto.
go doc crypto only lists what crypto defines, but no information about its sub-packages, like crypto/aes and crypto/cipher:
go doc crypto
package crypto // import "crypto"
Package crypto collects common cryptographic constants.
func RegisterHash(h Hash, f func() hash.Hash)
type Decrypter interface{ ... }
type DecrypterOpts interface{}
type Hash uint
const MD4 Hash = 1 + iota ...
...
If you want to see all sub-packages under a specific package, you can use the go list command:
go list crypto/...
crypto
crypto/aes
crypto/cipher
crypto/des
crypto/dsa
crypto/ecdsa
crypto/ed25519
crypto/ed25519/internal/edwards25519
crypto/elliptic
crypto/hmac
crypto/internal/randutil
crypto/internal/subtle
crypto/md5
crypto/rand
crypto/rc4
crypto/rsa
crypto/sha1
crypto/sha256
crypto/sha512
crypto/subtle
crypto/tls
crypto/x509
crypto/x509/pkix
Finally, for each package you can get the documentation with the go doc command.
go doc crypto/x509
...
You can write a script if you need to iterate over the results returned by go list.
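For example, a minimal shell loop over the go list output (purely illustrative) could be:
for pkg in $(go list crypto/...); do
    go doc "$pkg" | head -n 1
done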
Honestly, I think that the best way to consume the documentation of the standard library is the Go website: https://golang.org/pkg/.
You can also start a local godoc web server to read the doc of your Go code:
godoc -http=:6060
Then open your browser and visit localhost:6060.
Goal: I want to reuse many Go functions from two Go functions with HTTP triggers.
What I have tried and steps to reproduce the problem:
In GCP, create a new Go 1.11 Cloud Function, HTTP Trigger
Name it: MyReusableHelloWorld
In function.go, paste this:
package Potatoes
import (
"net/http"
)
// Potatoes return potatoes
func Potatoes(http.ResponseWriter, *http.Request) {
}
In go.mod, paste this: module example.com/foo
In function to execute, paste this: Potatoes
Click on deploy. It works.
Create another Go serverless function in GCP
In function.go, paste this:
// Package p contains an HTTP Cloud Function.
package p
import (
"encoding/json"
"fmt"
"html"
"net/http"
"example.com/foo/Potatoes"
)
// HelloWorld prints the JSON encoded "message" field in the body
// of the request or "Hello, World!" if there isn't one.
func HelloWorld(w http.ResponseWriter, r *http.Request) {
	var d struct {
		Message string `json:"message"`
	}
	if err := json.NewDecoder(r.Body).Decode(&d); err != nil {
		fmt.Fprint(w, "error here!")
		return
	}
	if d.Message == "" {
		fmt.Fprint(w, "oh boy Hello World!")
		return
	}
	fmt.Fprint(w, html.EscapeString(d.Message))
}
In go.mod, paste this: module example.com/foo
In function to execute, paste this: HelloWorld
Click on deploy. It doesn't work. You have the error: unknown import path "example.com/foo/Potatoes": cannot find module providing package example.com/foo/Potatoes
I have also tried all kinds of combinations for the module/packages to import.
I have tried without the example.com/ part.
Other smaller issue:
The functions I want to reuse could all be in the same file and don't really need any trigger, but it doesn't seem that having no trigger is possible.
Related questions and documentation with which I could not achieve my goal:
How can I use a sub-packages with Go on Google Cloud Functions?
https://github.com/golang/go/wiki/Modules , section go.mod
You can't invoke one Cloud Function from another directly, because each function runs independently in its own container.
So if you want to deploy a function with a dependency that can't be downloaded from a package manager, you need to put the code together, as shown here, and deploy it using the CLI.
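For reference, deploying an HTTP-triggered Go function from a local source directory with the CLI could look roughly like this (function name and runtime are placeholders matching the example above):
gcloud functions deploy HelloWorld \
    --runtime go111 \
    --trigger-http \
    --entry-point HelloWorld \
    --source .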
It is likely that each Cloud Function defined in the console is independent of the others. If you want code reuse, it's best to structure it as per the following document and deploy it using the gcloud command.
https://cloud.google.com/functions/docs/writing/#structuring_source_code
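A rough sketch of such a layout (all names are illustrative, reusing the module name from the question) keeps the shared code in a sub-package of the same module, so every function in that source tree can import it:
function-source/
    go.mod           (module example.com/foo)
    function.go      (package p, contains the HelloWorld entry point)
    potatoes/
        potatoes.go  (package potatoes, the shared helper code)
and in function.go the shared code is imported as example.com/foo/potatoes.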
You are mixing things: package management and function deployment.
When you deploy a Cloud Function, if you want to (re)use it, you have to call it with the http package.
If you build a package that you want to include in your source code, you have to rely on a package manager. With Go, a Git repository, like GitHub, is the best way to achieve this (don't forget to create a release and to name it as expected by Go modules: vX.Y.Z).
Here your code can't work without more engineering and package publication/management.
I achieve the same thing, but with a Dockerfile, and I deployed my code to Cloud Run (which I recommend if you aren't event oriented and only HTTP oriented; I wrote a comparison on Medium). My project layout is:
Root
go.mod
pkg/foo.go
pkg/go.mod
service/Helloworld.go
service/go.mod
In my Helloworld.go, I can reuse the package foo. For this, I have the following in my service/go.mod file:
module service/helloworld
go 1.12
require pkg/foo v0.0.0
replace pkg/foo v0.0.0 => ../pkg
Then, when you build your container, you run go build service/Helloworld.go from the root directory.
I am using wgo for dependency management in Go (although I think wgo has little to do with this). wgo has a folder structure like this:
project/
.gocfg/
gopaths
vendor.json
vendor/
src/
github.com_or_whatever/
I have a library I coded myself which uses a go-nsq type in one of the exported functions:
func AddNsqSubscription(
topic, channel string,
handler nsq.Handler,
config *nsq.Config) error { }
The library is called messi, and I import go-nsq like so: "messi/vendor/src/github.com/bitly/go-nsq"
The problem comes when I try to use this library in another project. For instance, in a project called scribe I have the following code (notice the imports):
import (
"scribe/vendor/src/github.com/bitly/go-nsq"
"scribe/vendor/src/messi"
)
//...
nsqHandler := nsq.HandlerFunc(func(message *nsq.Message) error {
msgHandler(MessiMessage{message})
return nil
})
return messi.AddNsqSubscription(destination, subdestination, nsqHandler, nsq.NewConfig())
When I run go build, the following error is returned:
cannot use nsqHandler (type "scribe/vendor/src/github.com/bitly/go-nsq".HandlerFunc) as type "messi/vendor/src/github.com/bitly/go-nsq".Handler in argument to messi.AddNsqSubscription:
"scribe/vendor/src/github.com/bitly/go-nsq".HandlerFunc does not implement "messi/vendor/src/github.com/bitly/go-nsq".Handler (wrong type for HandleMessage method)
have HandleMessage("scribe/vendor/src/github.com/bitly/go-nsq".Message) error
want HandleMessage("messi/vendor/src/github.com/bitly/go-nsq".Message) error
Why? I do not really know what is going on. The imported go-nsq code is exactly the same, yet Go insists that this code come from the same folder?
What am I doing wrong?
Packages in Go are identified by full import path, not by name.
For example in the standard library there are two different packages with the same name template but different import paths: text/template and html/template.
You should make sure that the go-nsq package is imported using the same path in both projects.
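To make the distinction concrete, here is a minimal sketch using the standard library's two template packages (chosen because the vendor paths in the question are project-specific):
package main

import (
	htmltpl "html/template"
	texttpl "text/template"
)

func main() {
	// Both types are named Template, but they are distinct types
	// because they come from different import paths.
	t := texttpl.New("t")
	// var h *htmltpl.Template = t // compile error: cannot use *text/template.Template as *html/template.Template
	_ = t
	_ = htmltpl.New("h")
}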