gRPC and clean architecture - Where to place proto files - microservices

I'm using gRPC in an ASP.NET project with Clean Architecture. Which layer and which folder should the proto files and the gRPC services go in?

When following Clean Architecture (from Uncle Bob) strictly, gRPC should be treated as a "framework" (a technical detail), so all code that depends on it, including the proto files, belongs outside the domain/application layers, in the outermost layer.
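The question is about ASP.NET, but the layering argument is stack-agnostic. As a rough Kotlin-flavoured sketch (the rest of this thread is JVM-centric): OrderServiceGrpc, GetOrderRequest and OrderReply stand in for protoc-generated types, and GetOrderQuery/OrderDto are hypothetical application-layer types.

import io.grpc.stub.StreamObserver

// Application layer (inner ring): plain interfaces and DTOs, no gRPC or protobuf types.
interface GetOrderQuery {
    fun byId(id: String): OrderDto
}

data class OrderDto(val id: String, val status: String)

// Outermost ring (e.g. a "Grpc"/"Presentation" project): owns the .proto files and the
// protoc-generated classes, and adapts them to the inner-layer port.
class OrderGrpcService(private val getOrder: GetOrderQuery) :
    OrderServiceGrpc.OrderServiceImplBase() {

    override fun getOrder(request: GetOrderRequest, responseObserver: StreamObserver<OrderReply>) {
        val order = getOrder.byId(request.id)   // protobuf messages stop at this boundary
        responseObserver.onNext(
            OrderReply.newBuilder().setId(order.id).setStatus(order.status).build()
        )
        responseObserver.onCompleted()
    }
}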

Related

Best practice - share Spring Boot Service and Repo layer code between applications

I need some best-practice recommendations for a classic requirement: modularising a Spring Boot application by layers.
Some background info:
Small to medium sized Spring Boot project with fewer than 10 developers
Two Spring Boot applications with a shared Service layer, Repo layer and shared models
It's a bit too late to go with a microservice approach with a full Model/Controller/Service/Repo per API.
Currently there is just one web application exposing the APIs for a frontend application.
The requirement is to add a new set of APIs for B2B integration, so the request/response formats will be quite different from the existing APIs, i.e. /webapi/v1/orders for the frontend client and /b2b/v1/orders will need to return different response formats.
The Service and Repository layers, along with the models, need to be shared between the two applications, so three modules were identified, similar to how it's explained in https://stackoverflow.com/a/50352532/907032:
-- Main app
-- webapi (depends on common, jar packaging)
-- b2b (depends on common, jar packaging)
-- common (jar packaging)
The two applications need to be deployed separately and also be separated from a CI/CD perspective, so that not all sub-modules are built every time (a change to a b2b controller should not affect common/webapi).
A change to the common module that is only needed by the latest b2b module should preferably not trigger a build and deploy of webapi, i.e. webapi uses common-1.01 and b2b uses common-1.02. Understood that the new version common-1.02 should not break anything from common-1.01; the aim is just to avoid an unnecessary build and deploy of that module until it is required, if that makes sense.
The challenge
Should the modules be defined in the same repo or in three different repos?
All the talk about mono vs. multi repo is about whether or not to keep different projects together, but here, as you can see, these are modules that are closely related to each other.
If we define these as sub-modules in the same repo, how is versioning of the common module handled? If it always triggers a build of all three sub-modules, do we even gain any advantage from modularising the code?
As per your description, the module named "common" is not actually that common to the other two. I'd go the multi-module way, like this:
First, break that common module into three: common, utils-webapi and utils-b2b.
The first strictly contains the things both webapi and b2b need at the same version. utils-webapi is dedicated strictly to what webapi needs; the same goes for utils-b2b.
b2b depends on utils-b2b, which depends on common. webapi depends on utils-webapi, which depends on common.
Versioning of the common module stays consistent; only the utils-X module version changes from the X module's perspective.
CI is thus independent for each build.
Note: you can go further, keep only utils-webapi and utils-b2b and get rid of common, at the cost of some duplicated code.
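The answer above is phrased in Maven terms; purely as a sketch, the same dependency graph in Gradle (Kotlin DSL) would look roughly like this (module names taken from the answer, everything else assumed):

// settings.gradle.kts -- one repository containing all modules of the main app
rootProject.name = "main-app"
include("common", "utils-webapi", "utils-b2b", "webapi", "b2b")

// utils-webapi/build.gradle.kts (utils-b2b mirrors this)
dependencies {
    api(project(":common"))                    // `api` needs the java-library plugin
}

// webapi/build.gradle.kts -- webapi -> utils-webapi -> common
dependencies {
    implementation(project(":utils-webapi"))
}

// b2b/build.gradle.kts -- b2b -> utils-b2b -> common
dependencies {
    implementation(project(":utils-b2b"))
}

If the independent-versioning requirement from the question matters more than a single build, publish common and the utils-X modules as versioned jars and have webapi and b2b depend on pinned versions instead of project(...) dependencies.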

Sharing Proto or Generated Files across Microservices

I am building Spring Boot microservices using protobuf and gRPC for communication. However, I realised that I will need to define the entities in every microservice that requires them.
Instead of the straightforward method of copying and pasting the raw files (not recommended), I can think of two methods:
sharing the raw proto files
sharing the proto-generated files
Which is the proper way of doing it? If I am sharing raw proto files, how can I share them properly across microservices?
When sharing across languages, copying raw protos between repositories is typical. Some build systems like Bazel don't need to copy the protos, but most do. When copying protos it is important that there is a single well-known canonical copy of the protos and that all other copies are bit-identical to a version of the canonical copy.
But when sharing in a Java-centric collection of projects, creating a canonical Java package for the generated code is superior as it is easier to use and it helps make sure only one copy of the generated code is in the classpath.
A typical Java protobuf Jar will include both the raw protos and the generated code; the raw protos are automatically included by both the Maven and Gradle Protobuf plugins. You then depend on that Jar like normal and it provides the dependency for Java code and Protobuf definitions. The Maven and Gradle Protobuf plugins automatically find .proto files in dependencies and add them to the include path (-I) of protoc when generating code.
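As a concrete sketch of the consuming side in Gradle (Kotlin DSL), with made-up coordinates for the canonical proto jar:

// build.gradle.kts of a microservice that consumes the shared proto jar
plugins {
    java
    id("com.google.protobuf") version "0.9.4"
}

repositories {
    mavenCentral()
}

dependencies {
    // Canonical artifact published from the proto project: it ships both the generated
    // Java classes and the raw .proto files ("com.example:orders-proto" is hypothetical).
    implementation("com.example:orders-proto:1.2.0")
}

protobuf {
    // Only needed if this service compiles protos of its own; those protos can then
    // import the .proto files that the plugin finds inside the dependency above.
    protoc {
        artifact = "com.google.protobuf:protoc:3.25.3"
    }
}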

How are proto files shared among relevant microservices?

I am building a microservice application adopting gRPC as the communication protocol among the microservices. I realised that I have to copy and paste the proto files to all microservices. Furthermore, if there is a change in a definition, I will have to copy and paste it again to all relevant microservices.
How should the proto files be shared? Preferably there would be a common place we can use to upload and download the updated proto files.

Shared data classes in microservice architecture with Spring and Kotlin

Suppose you have two microservices talking to each other, and both share data classes as contracts. Given that you don't want to duplicate all contracts in every microservice's project, what's the best way to share them?
As I see it, there are only two options:
Duplicate shared contracts, while also duplicating code
Compile a first party library and import it
My team had the same scenario some time ago and we decided on a variant of your second option. We used a Maven multi-module project, so that one microservice produced the library, which was then imported by the second microservice.
Maven Multi-Module Project is explained here:
https://www.baeldung.com/maven-multi-module
It is not a good idea to have a project1 > project2 dependency, because it might change in the future and you'll end up having to refactor the whole thing.
What I'd do is create a shared library that's published to an internal Nexus or Artifactory; then you can just add it as a dependency to your projects, so you'll have something like this:
library > project1
library > project2
Another idea to consider is to use something like Protocol Buffers if you are only sharing the data transfer objects. This has the additional advantage of versioning and forward compatibility.
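For the shared-library option in Kotlin, the published artifact can be as thin as the contract classes themselves; a minimal sketch, assuming a hypothetical com.example:orders-contracts module published to the internal repository:

// orders-contracts module; both microservices add it as a dependency
// (e.g. implementation("com.example:orders-contracts:1.0.0")) instead of duplicating the classes.
package com.example.contracts

data class OrderRequest(
    val orderId: String,
    val quantity: Int
)

data class OrderResponse(
    val orderId: String,
    val status: String
)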

How to use gRPC with Spring Boot

I am new to gRPC and I don't know how to use it with Spring Boot, but I am using the links below:
https://github.com/saturnism/grpc-java-by-example/tree/master/simple-grpc-server
https://github.com/saturnism/grpc-java-by-example/tree/master/simple-grpc-client
Note: the first is the server project and the second is the client project.
I have created a gRPC project with Spring Boot, but there is one thing I don't understand: in the gRPC client project, how can I use the classes that protobuf generates in the gRPC server project? No proto file is created in the gRPC client project, so how can I use the classes of the gRPC server project in the client project? Or can we create one project for both the gRPC server and client instead of creating a different project for each?
I have two queries related to this question:
1. How to use the gRPC classes generated by the protobuf compiler in another project, e.g. when the client and server are two different projects and only the server has the generated files but the client wants to use the same classes.
2. How can I create all of this in a single project (client and server in one project), and how can I run that project, with a step-by-step demo?
There are two ways you can do this:
Copy the .proto files between the two projects, and have each one generate their own copies of the generated code. This is probably the easiest, and allows you to avoid checking in the generated code into source control. The downside to this approach is that the .proto files can get out of date if you modify one and not the other.
Keep the .proto files in the same repository as both the client and server, and make both depend on the generated code. This allows the proto to be modified for the client and server at the same time, but requires the code to live in the same repository (this is sometimes called the "monorepo" approach). The downside to this is that the client and server repos may get too big and need to be split up.
Google (the author of Protobuf) typically uses option 2, but many users of Protobuf prefer option 1. I would highly recommend regenerating the classes each time, and not checking the generated code into source control. The ABI of Protobuf classes can change occasionally, and you would lose the backwards compatibility of Protobuf.
I have created a sample Spring Boot gRPC application and posted it here:
https://javabelazy.blogspot.com/
Use the dependency net.devh:grpc-server-spring-boot-starter in your pom.
Create a proto file (sample service code):
service PingPongService {
  rpc ping(PingRequest) returns (PongResponse) {
    option (google.api.http) = { get: "/v1/grpc/{ping}" };
  }
}
Generate stubs for the proto file using io.grpc:protoc-gen-grpc-java:1.30.0:exe.
Use the Netty server.
Set the port to 9090 (the default) with grpc.server.port=9090 in application.properties.
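With that starter in place, exposing the PingPongService above comes down to a single annotated class. A Kotlin sketch, assuming the PingPongServiceGrpc classes generated from the proto and a ping/pong string field on the request/response messages:

import io.grpc.stub.StreamObserver
import net.devh.boot.grpc.server.service.GrpcService

// Picked up by the starter and registered with the embedded gRPC server on grpc.server.port.
@GrpcService
class PingPongServiceImpl : PingPongServiceGrpc.PingPongServiceImplBase() {

    override fun ping(request: PingRequest, responseObserver: StreamObserver<PongResponse>) {
        val reply = PongResponse.newBuilder()
            .setPong("pong: " + request.ping)   // `ping`/`pong` fields are assumptions
            .build()
        responseObserver.onNext(reply)
        responseObserver.onCompleted()
    }
}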
I have used https://github.com/yidongnan/grpc-spring-boot-starter recently. You get most of the Spring features along with gRPC using this library.
There is yidongnan/grpc-spring-boot-starter (DOC), which implements a Spring Boot autoconfiguration starter for both client and server.
It provides @GrpcService and @GrpcClient.
@GrpcService, which will add the service to the gRPC server and start the server automatically.
Annotation that marks gRPC services that should be registered with a gRPC server.
If spring-boot's auto configuration is used, then the server will be created
automatically. This annotation should only be added to implementations of
BindableService (GrpcService-ImplBase).
@GrpcClient, which will create the channel and stub for the client automatically.
Example: @GrpcClient("myClient") <-> grpc.client.myClient.address=static://localhost:9090
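On the client side that is all the wiring needed; a Kotlin sketch reusing the PingPongService definitions from the previous answer (the pong field is an assumption):

import net.devh.boot.grpc.client.inject.GrpcClient
import org.springframework.stereotype.Service

@Service
class PingPongClient {

    // The starter builds the channel and stub from grpc.client.myClient.address=static://localhost:9090
    @GrpcClient("myClient")
    private lateinit var stub: PingPongServiceGrpc.PingPongServiceBlockingStub

    fun ping(text: String): String =
        stub.ping(PingRequest.newBuilder().setPing(text).build()).pong
}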
nils server sample
nils client sample
Based on these samples, I also implemented my own simple server and client samples:
ppdouble/springboot-grpc-server-sample
ppdouble/springboot-grpc-client-sample
Based on those samples, you can implement your project or implement a new Spring Boot autoconfiguration starter.
