How to share Protobuf definitions for gRPC? - protocol-buffers

Since you have to share .proto files to define data and services for gRPC, the service provider and its clients all need access to the same .proto files. Is there a common strategy for distributing these files? I want to avoid a setup where every project keeps its own copy of the .proto files in its Git repository and our team members have to manually edit these files or share them via email.
Is there any common best practice?

Unfortunately, there is no single established practice, but the main goal you should aim for is to store the .proto files in one version-control repository.
During my investigation, I've found some interesting blog posts about that subject:
https://www.bugsnag.com/blog/libraries-for-grpc-services
https://www.crowdstrike.com/blog/improving-performance-and-reliability-of-microservices-communication-with-grpc/
https://medium.com/namely-labs/how-we-build-grpc-services-at-namely-52a3ae9e7c35
They cover many gRPC workflow considerations. Hope that helps!

In terms of best practices for sharing gRPC definitions, I would suggest using the gRPC Server Reflection Protocol (https://github.com/grpc/grpc/blob/master/doc/server-reflection.md) instead of sharing the files at all.
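For example, in Go, enabling reflection is a one-line addition when constructing the server. A minimal sketch (the port is arbitrary; the reflection package is the standard google.golang.org/grpc/reflection from grpc-go):

    package main

    import (
        "log"
        "net"

        "google.golang.org/grpc"
        "google.golang.org/grpc/reflection"
    )

    func main() {
        lis, err := net.Listen("tcp", ":50051")
        if err != nil {
            log.Fatal(err)
        }
        s := grpc.NewServer()
        // Register your service implementations here as usual, then
        // enable the reflection service so that clients (e.g. grpcurl
        // or Kreya) can discover services and message schemas at
        // runtime without having the .proto files.
        reflection.Register(s)
        log.Fatal(s.Serve(lis))
    }

Dynamic clients can then list and call methods through the reflection API instead of compiling stubs from shared .proto files.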

Related

Can I generate a grpc stub file by referring to an external url?

I started learning gRPC / protobuf last week and I want to find the best architecture for microservices. One of the ideas is to have a separate IDL repository, so that any service can generate stub files without copying / pasting .proto files from another service. Is that possible?
IIRC protoc does not support referencing protos via URL, which is unfortunate as it's a reasonable requirement. It's possible that language-specific implementations of the code generation do enable this.
I recommend you publish a project's protos (and possibly cache the code protoc-generated from them) in a separate (proto) repo. This facilitates reuse and independent versioning, and encourages cross-language use.
If protos are bundled in e.g. a repo including a Golang server implementation, it's more difficult to just clone the protos in order to generate e.g. a Python client.

Do you need copies of protobufs in both client and server in web applications?

I'm not sure if this is the right forum to post this question, but I'm trying to learn gRPC/protobufs in the context of a web application. I am building the UI in Flutter and the backend in Go with MongoDB. I was able to get a simple Go service running and query it using Kreya; however, my question now is: how do I integrate the UI with the backend? In order to make the Kreya call, I needed to import the protobufs. Do I need to maintain identical protobufs in both the frontend and the backend? Meaning, do I literally have to copy all of my protobufs from the backend into my UI codebase and compile locally there as well? This seems like a nightmare to maintain, as the protobufs now have to be maintained in two places instead of one.
What is the best way to maintain the protobufs?
Yes, but think of the protos as a shared contract between your clients and servers.
The protos define the interface by which the client is able to communicate with the server. In order for this to be effective, the client and server need to implement the same interface.
One way to do this is to store your protos in a repo that you share with any clients and servers that implement them. This provides a single source of truth for the protos. I also keep copies of the protos compiled (with protoc) to the languages I will use, e.g. Golang, Dart, etc., in this shared protos repo, and import from that repo where needed.
Then, in your case, the client imports the Dart-generated sources and the Golang server imports the Golang-generated sources from the shared repo.
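In Go, the server side of that setup looks roughly like this. A sketch only: the module path github.com/your-org/protos and the Greeter service are hypothetical stand-ins for whatever the shared repo actually publishes:

    package main

    import (
        "context"
        "log"
        "net"

        "google.golang.org/grpc"

        // Hypothetical import path: protoc-generated Go sources
        // published in the shared protos repo.
        pb "github.com/your-org/protos/gen/go/greeter/v1"
    )

    // server implements the generated service interface.
    type server struct {
        pb.UnimplementedGreeterServer
    }

    func (s *server) SayHello(ctx context.Context, req *pb.HelloRequest) (*pb.HelloReply, error) {
        return &pb.HelloReply{Message: "hello " + req.GetName()}, nil
    }

    func main() {
        lis, err := net.Listen("tcp", ":50051")
        if err != nil {
            log.Fatal(err)
        }
        s := grpc.NewServer()
        pb.RegisterGreeterServer(s, &server{})
        log.Fatal(s.Serve(lis))
    }

The Flutter client would do the equivalent with the Dart-generated sources from the same repo, so both sides stay pinned to one version of the contract.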
Alternatively, your client and your server could protoc compile appropriate sources when they need them, on-the-fly, usually as part of an automated build process.
Try not to duplicate the protos across clients and servers; it will be challenging to ensure every copy remains synchronized.

consuming grpc service using go

I plan to use gRPC as a synchronous inter-service communication protocol.
There are lots of different services and I have generated a pb.go file with all the relevant code for client and server using protoc with the go-rpc plugin.
Now I'm trying to figure out the best way or the common way of consuming this service from another service.
Here is what I have so far:
Option 1
use the .proto file from the service (download it)
run the protoc compiler and generate the ...pb.go file for the consumer to use
Option 2
because the ...pb.go is already generated on the gRPC service side to implement the server, and my client is another service written in Go, I can expose this as a submodule (another .mod file in a sub-directory)
use go get github.com/usr/my-cool-grpc-service/client
Option 2 seems more appealing to me because it makes consuming the service very easy and available to all other services that may require it.
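For illustration, the consumer side of Option 2 would look roughly like this. A sketch only: the generated client and method names are hypothetical:

    package main

    import (
        "context"
        "log"
        "time"

        "google.golang.org/grpc"
        "google.golang.org/grpc/credentials/insecure"

        // The go-gettable submodule from Option 2; the client and
        // request types below are hypothetical generated names.
        client "github.com/usr/my-cool-grpc-service/client"
    )

    func main() {
        conn, err := grpc.Dial("my-cool-grpc-service:50051",
            grpc.WithTransportCredentials(insecure.NewCredentials()))
        if err != nil {
            log.Fatal(err)
        }
        defer conn.Close()

        ctx, cancel := context.WithTimeout(context.Background(), time.Second)
        defer cancel()

        c := client.NewMyCoolServiceClient(conn)
        resp, err := c.DoSomething(ctx, &client.DoSomethingRequest{})
        if err != nil {
            log.Fatal(err)
        }
        log.Printf("response: %v", resp)
    }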
On the other hand, I know that the .proto file is the contract from which clients can be generated for many different languages, and that it should be used as the source of truth.
I fear that by choosing Option 2 I may run into pitfalls I'm unaware of, with regard to backwards compatibility or other topics.
So, what is the idiomatic way of consuming a gRPC service?

Organization of protobuf files in a microservice architecture

In my company, we have a system organized as microservices, with a dedicated Git repository per service. We would like to introduce gRPC, and we were wondering how to share protobuf files and build libs for our various languages. Based on some examples we collected, we decided in the end to go for a single repository containing all our protobuf files; it seems to be the most common way of doing it, and it seems easier to maintain and use.
I would like to know if you have some examples on your side?
Do you have some counter-examples of companies doing the exact opposite, i.e. hosting protobuf files in a distributed way?
We have a distinct repo for proto files (called schema) and separate repos for the individual microservices. Also, we never store generated code: server and client files are generated from scratch by protoc during every build on CI.
Actually this approach works and fits our needs well. But there are two potential pitfalls:
Inconsistency between the schema and microservice repositories. Commits to two different Git repos are not atomic, so at the time of a schema update there is always a short period during which the schema has been updated but the microservice's repo has not yet caught up.
If you use Go, there is a potential problem with the move to Go modules introduced in Go 1.11. We haven't done comprehensive research on it yet.
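As a sketch of the generate-on-every-build approach in Go (assuming, hypothetically, that the schema repo is checked out at ../schema and that protoc, protoc-gen-go and protoc-gen-go-grpc are installed on the CI image), generation can be wired up with a go:generate directive so CI simply runs go generate ./... before building:

    // Package userpb holds the protoc-generated code; nothing
    // generated is ever committed, CI regenerates it on every build.
    package userpb

    // Hypothetical layout: the schema repo checkout at ../schema
    // contains user.proto.
    //go:generate protoc -I ../schema --go_out=. --go_opt=paths=source_relative --go-grpc_out=. --go-grpc_opt=paths=source_relative user.proto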
Each of our microservices has its own API (one protobuf file or several). For each API we have a separate repository. We also have a CI job which builds the proto classes into a jar (and not only for Java but for other languages too) and publishes it to our central repository. Then you just add a dependency on the API you need.
For example, for microservice A we have a repository a-api (containing only proto files), which the CI job builds into a jar (and into the other languages) published as com.api.a-service.<version>

Best practices for sharing contracts when using Google Protocol Buffers?

What are some effective ways to share contracts between peer applications when using Google Protocol Buffers as the transport? Have any best practices emerged?
If you are talking between different platforms, your best bet is to simply put the .proto schema definition somewhere accessible - it could be documentation, it could be a download. Each platform can then generate its code from there.
