I have two gRPC services (service1 and service2) that interact with each other, and in some cases the RPC response of service1 contains a struct defined in service2. After running into several situations where duplication was inevitable, I figured that as the services grow this would become hard to manage, so for now I restructured the proto files like this:
.
├── app
...
├── proto
│   ├── service1
│   │   ├── service1.access.proto
│   │   └── service1.proto
│   ├── service2
│   │   ├── service2.access.proto
│   │   └── service2.proto
│   └── model
│       ├── model.service1.proto
│       └── model.service2.proto
└── proto-gen  // the protoc generated files
    ├── service1
    │   ├── service1.access.pb.go
    │   └── service1.pb.go
    ├── service2
    │   ├── service2.access.pb.go
    │   └── service2.pb.go
    └── model
        ├── model.service1.pb.go
        └── model.service2.pb.go
service1 needs to import the model definitions in model/model.service2.proto, so I am importing it like this:
import "model/model.service2.proto";
option go_package = "proto-gen/service1";
and I generate the .pb.go files using this protoc command:
ls proto | awk '{print "protoc --proto_path=proto proto/"$1"/*.proto --go_out=plugins=grpc:."}' | sh
The command generates the .pb.go files just fine, but the code in service1.access.pb.go doesn't seem to import the model correctly, and I don't know if it's related or not, but when I run the app it throws this error:
cannot load model: malformed module path "model": missing dot in first path element
I've spent a few hours now googling how to properly import another proto file, but I can't seem to find any solution.
The reason you got that error about model is that the generated files use the go_package of the imported file, and model is not a valid Go import path (it has no dot in its first path element, so it cannot be a module path). You have to convince the generated file to use the full import path of the package.
This is how I did it for my source tree: I have a similar tree of proto files importing each other. If your module is named, say, github.com/myapp, then:
- run protoc with --proto_path=<directory containing github.com>
- import other proto files using the full path, that is github.com/myapp/proto/service1/service1.proto
- in service1.proto, define option go_package = "service1"
This setup writes the import paths correctly in my case.
Before settling on this solution, I was using go_package=<full path to proto>, so you might give that a try as well.
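To make that concrete, here is a minimal sketch of the setup described above, assuming a module named github.com/myapp and protoc run from the directory that contains github.com/ (file contents abbreviated, names illustrative):

// github.com/myapp/proto/service1/service1.proto
syntax = "proto3";
package service1;

// import by full path, relative to the --proto_path root
import "github.com/myapp/proto/model/model.service2.proto";

// bare name: sets only the generated Go package name; the import
// path is derived from the file's location under --proto_path
option go_package = "service1";

protoc --proto_path=. --go_out=plugins=grpc:. github.com/myapp/proto/service1/service1.proto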
Building on Burak Serdar's answer, I want to provide my implementation.
Set the go_package option in the proto file you want to import, where the location is your full path. My path is generally github.com/AllenKaplan/[project]/[package]/proto/:

option go_package = "[path]";

In the file you wish to import it into, add an import. My path is generally [package]/proto/[package].proto:

import "[path relative to the protoc --proto_path]";
The last part is the protoc command, where you must define the proto path in a way that connects the import path and the option go_package path.
If executing from the github.com/AllenKaplan/[project] directory, I would call:
protoc -I. --go_out=./[package]/proto [package]/proto/[package].proto
-I. is the same as --proto_path=.
The -I. sets the proto path to the entire project.
One note: when calling protoc on the .proto files that you are importing, you will want to add paths=source_relative: to the output flag; this ensures the output is generated relative to the source root with the package you set.
My protoc invocation for the imported package, when called from github.com/AllenKaplan/[project]/[package]:
protoc -I./proto --go_out=paths=source_relative:./proto [package].proto
I was also facing a similar issue while importing. I changed the go_package option in the .proto file to the following:
option go_package = "./;proto-gen/service1";
The first param is the relative path where the generated code will be placed, interpreted relative to the --go_out you set in your command; the part after the semicolon is used as the Go package name.
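To illustrate the two-part form with a hypothetical pairing of option and command (names are illustrative; older protoc-gen-go releases accept the relative ./ form):

option go_package = "./;service1pb";

protoc --proto_path=proto --go_out=proto-gen/service1 proto/service1/service1.proto

The ./ before the semicolon drops the generated file directly into the --go_out directory, and service1pb after the semicolon becomes the Go package name.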
Related
I currently have two protobuf repos: api and timestamp:
timestamp Repo:
- README.md
- timestamp.proto
- timestamp.pb.go
- go.mod
- go.sum
api Repo:
- README.md
- protos/
  - dto1.proto
  - dto2.proto
Currently, timestamp contains a reference to a timestamp object that I want to use in api but I'm not sure how the import should work or how I should modify the compilation process to handle this. Complicating this process is the fact that the api repo is compiled to a separate, downstream repo for Go called api-go.
For example, consider dto1.proto:
syntax = "proto3";
package api.data;
import "<WHAT GOES HERE?>";
option go_package = "github.com/my-user/api/data"; // golang
message DTO1 {
  string id = 1;
  Timestamp timestamp = 2;
}
And my compilation command is this:
find $GEN_PROTO_DIR -type f -name "*.proto" -exec protoc \
  --go_out=$GEN_OUT_DIR --go_opt=module=github.com/my-user/api-go \
  --go-grpc_out=$GEN_OUT_DIR --go-grpc_opt=module=github.com/my-user/api-go \
  --grpc-gateway_out=$GEN_OUT_DIR --grpc-gateway_opt logtostderr=true \
  --grpc-gateway_opt paths=source_relative \
  --grpc-gateway_opt generate_unbound_methods=true \
  {} \;
Assuming I have a definition in timestamp for each of the programming languages I want to compile api into, how would I import this into the .proto file and what should I do to ensure that the import doesn't break in my downstream repo?
There is no native notion of remote import paths with protobuf. So the import path has to be relative to some indicated local filesystem base path (specified via -I / --proto_path).
Option 1
Generally it is easiest to just have a single repository with protobuf definitions for your organisation - e.g. a repository named acme-contract
.
└── protos
    └── acme
        ├── api
        │   └── data
        │       ├── dto1.proto
        │       └── dto2.proto
        └── timestamp
            └── timestamp.proto
Your dto1.proto will look something like:
syntax = "proto3";
package acme.api.data;
import "acme/timestamp/timestamp.proto";
message DTO1 {
  string id = 1;
  acme.timestamp.Timestamp timestamp = 2;
}
As long as you generate code relative to the protos/ dir of this repository, there shouldn't be an issue.
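For example, a sketch of generating Go code relative to protos/ (assuming go_package options, not shown above, are set in each file):

protoc --proto_path=protos \
  --go_out=gen --go_opt=paths=source_relative \
  protos/acme/api/data/dto1.proto protos/acme/timestamp/timestamp.proto

Because --proto_path points at protos/, the import "acme/timestamp/timestamp.proto" resolves directly against the on-disk layout.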
Option 2
There are various alternatives whereby you continue to have definitions split over various repositories, but you can't really escape the fact that imports are filesystem relative.
Historically that could be handled by manually cloning the various repositories and arranging directories such that the paths are relative, or by using -I to point to various locations that might intentionally or incidentally contain the proto files (e.g. in $GOPATH). Those strategies tend to end up being fairly messy and difficult to maintain.
buf makes things somewhat easier now. If you were to have your timestamp repo:
.
├── buf.gen.yaml
├── buf.work.yaml
├── gen
│   └── acme
│       └── timestamp
│           └── timestamp.pb.go
├── go.mod
├── go.sum
└── protos
    ├── acme
    │   └── timestamp
    │       └── timestamp.proto
    ├── buf.lock
    └── buf.yaml
timestamp.proto looking like:
syntax = "proto3";
package acme.timestamp;
option go_package = "github.com/my-user/timestamp/gen/acme/timestamp";
message Timestamp {
  int64 unix = 1;
}
buf.gen.yaml looking like:
version: v1
plugins:
  - name: go
    out: gen
    opt: paths=source_relative
  - name: go-grpc
    out: gen
    opt:
      - paths=source_relative
      - require_unimplemented_servers=false
  - name: grpc-gateway
    out: gen
    opt:
      - paths=source_relative
      - generate_unbound_methods=true
... and everything under gen/ has been generated via buf generate.
Then in your api repository:
.
├── buf.gen.yaml
├── buf.work.yaml
├── gen
│   └── acme
│       └── api
│           └── data
│               ├── dto1.pb.go
│               └── dto2.pb.go
└── protos
    ├── acme
    │   └── api
    │       └── data
    │           ├── dto1.proto
    │           └── dto2.proto
    ├── buf.lock
    └── buf.yaml
With buf.yaml looking like:
version: v1
name: buf.build/your-user/api
deps:
  - buf.build/your-user/timestamp
breaking:
  use:
    - FILE
lint:
  use:
    - DEFAULT
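The deps entry is resolved and pinned into protos/buf.lock; with the v1 buf CLI that is done by running, from the directory containing buf.yaml:

buf mod update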
dto1.proto looking like:
syntax = "proto3";
package acme.api.data;
import "acme/timestamp/timestamp.proto";
option go_package = "github.com/your-user/api/gen/acme/api/data";
message DTO1 {
  string id = 1;
  acme.timestamp.Timestamp timestamp = 2;
}
and buf.gen.yaml the same as in the timestamp repo.
The code generated via buf generate will depend on the timestamp repository via Go modules:
// Code generated by protoc-gen-go. DO NOT EDIT.
// versions:
// protoc-gen-go v1.28.1
// protoc (unknown)
// source: acme/api/data/dto1.proto
package data
import (
	timestamp "github.com/your-user/timestamp/gen/acme/timestamp"
	protoreflect "google.golang.org/protobuf/reflect/protoreflect"
	protoimpl "google.golang.org/protobuf/runtime/protoimpl"
	reflect "reflect"
	sync "sync"
)
// <snip>
Note that if changes are made to dependencies you'll need to ensure that both buf and Go modules are kept relatively in sync.
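For example, a sketch of the update flow in the api repository after the timestamp definitions change (module path taken from the generated import above):

buf mod update
go get github.com/your-user/timestamp@latest
go mod tidy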
Option 3
If you prefer not to leverage Go modules for importing generated pb code, you could also look to have a similar setup to Option 2, but instead generate all code into a separate repository (similar to what you're doing now, by the sounds of it). This is most easily achieved by using buf managed mode, which essentially makes go_package directives unnecessary and ignores any that are present.
In api-go:
.
├── buf.gen.yaml
├── go.mod
└── go.sum
With buf.gen.yaml containing:
version: v1
managed:
  enabled: true
  go_package_prefix:
    default: github.com/your-user/api-go/gen
plugins:
  - name: go
    out: gen
    opt: paths=source_relative
  - name: go-grpc
    out: gen
    opt:
      - paths=source_relative
      - require_unimplemented_servers=false
  - name: grpc-gateway
    out: gen
    opt:
      - paths=source_relative
      - generate_unbound_methods=true
You'd then need to generate code for each respective repo (pushed to the BSR):
$ buf generate buf.build/your-user/api
$ buf generate buf.build/your-user/timestamp
After which you should have some generated code for both:
.
├── buf.gen.yaml
├── gen
│   └── acme
│       ├── api
│       │   └── data
│       │       ├── dto1.pb.go
│       │       └── dto2.pb.go
│       └── timestamp
│           └── timestamp.pb.go
├── go.mod
└── go.sum
And the imports will be relative to the current module:
// Code generated by protoc-gen-go. DO NOT EDIT.
// versions:
// protoc-gen-go v1.28.1
// protoc (unknown)
// source: acme/api/data/dto1.proto
package data
import (
	timestamp "github.com/your-user/api-go/gen/acme/timestamp"
	protoreflect "google.golang.org/protobuf/reflect/protoreflect"
	protoimpl "google.golang.org/protobuf/runtime/protoimpl"
	reflect "reflect"
	sync "sync"
)
// <snip>
All in all, I'd recommend Option 1 - consolidating your protobuf definitions into a single repository (including vendoring 3rd party definitions) - unless there is a particularly strong reason not to.
I have a repository structure as follows :-
xyz/src
1. abc
   - p
   - q
   - r
2. def
   - t
   - u
   - v
3. etc
   - o
   - m
   - n
I have created a go.mod file in src and run go build ./...
Except for local packages, everything is fine. So if abc/p is used in def, it throws the following exception: cannot find module providing package abc/p. The idea behind keeping the go.mod file in the src directory was to make sure the path is resolved from where the mod file is located. Can anyone suggest where the mod file ideally should be? I also tried placing it one directory above, in xyz, but hit the same issue, and I also tried creating one for each subdirectory. I am a bit confused about this. Will I have to create separate repositories for abc and etc? Considering that the same layout used to work with GOPATH, I would expect modules to be able to handle it too. Any suggestions?
The most common and easiest approach is a single go.mod file in your repository, where that single go.mod file is placed in the root of your repository.
Russ Cox commented in #26664:
For all but power users, you probably want to adopt the usual convention that one repo = one module. It's important for long-term evolution of code storage options that a repo can contain multiple modules, but it's almost certainly not something you want to do by default.
The Modules wiki says:
For example, if you are creating a module for a repository github.com/my/repo that will contain two packages with import paths github.com/my/repo/foo and github.com/my/repo/bar, then the first line in your go.mod file typically would declare your module path as module github.com/my/repo, and the corresponding on-disk structure could be:
repo/
├── go.mod      <<<<< Note go.mod is located in repo root
├── bar
│   └── bar.go
└── foo
    └── foo.go
In Go source code, packages are imported using the full path including the module path. For example, if a module declared its identity in its go.mod as module github.com/my/repo, a consumer could do:

import "github.com/my/repo/bar"
That imports package bar from the module github.com/my/repo.
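For completeness, a minimal go.mod for that layout might look like this (the go directive version is illustrative):

module github.com/my/repo

go 1.16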
I have a single go.mod in the root of my Go application. I am using the following structure, inspired by Kat Zien's talk How Do You Structure Your Go Apps.
At the minute one of my applications looks like this
.
├── bin
├── cmd
│   ├── cli
│   └── server
│       └── main.go
├── pkg
│   ├── http
│   │   └── rest
│   │       # app-specific directories excluded
│   └── storage
│       └── sqlite
All packages are imported via their full path, i.e. import "github.com/myusername/myapp/pkg/http/rest"; otherwise it causes problems all over the place. This was the one change I had to make going from $GOPATH to go mod.
go mod then handles all the dependencies it discovers properly, as far as I've seen so far.
My working tree is like this:
/opt/go/src/tb-to-composer/
├── apis
│   └── rtb.go
├── config.yaml
├── jsondef
│   └── structures.go
├── LICENSE.md
├── README.md
├── tb-to-composer
└── thingsToComposer.go
When I do go build inside /opt/go/src/tb-to-composer/, the build doesn't recompile rtb.go and structures.go even though there were changes in them. To get a correct build I need to run go build -a every time I change rtb.go or structures.go. Is that the expected behavior of go build? How do I recompile only the custom libs inside my package folder without recompiling the whole /opt/go/src tree?
You can try the -i flag. Specifying the files in the directories explicitly as arguments to go build, i.e. go build thingsToComposer.go apis/rtb.go jsondef/structures.go, does not work, sorry.
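For reference, a sketch of the suggested invocation from the package root (-i also installs the packages that the target imports, which is what keeps GOPATH-era builds from going stale):

cd /opt/go/src/tb-to-composer
go build -i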
First of all, and to be clear, I come from the Java world. I have been programming in Go for a while and I really love it.
I have a small question about the packaging system and the imports: if I am importing a library which uses another library, and I am already using that library in my project, how can I eliminate the duplication (if it's possible)?
In other words:
A is the main program, C and B are libraries, and then:
C was added to A
B uses C
and then
B was also added to A
AProject/
    src/
        LibC/
            src/
                somefiles.go
        LibB/
            src/
                LibC/
                    somefiles.go
        app.go
So now I have two copies of library C: one in A since the beginning, and one in B, because B depends on C.
I know it's a little bit confusing, but in the Java world we have Ant and Maven, and those build tools make it really easy for us to handle dependencies.
Any thoughts?
In Go, there is no duplication of packages.
First, you should read about Go workspaces in How to Write Go Code.
From your question, your directory structure should look something like this:
gopath (gopath is the path of a directory in your $GOPATH list)
├── bin
│   └── projecta
├── pkg
│   └── linux_amd64
│       └── projecta
│           ├── libb.a
│           └── libc.a
└── src
    └── projecta
        ├── a.go
        ├── libb
        │   └── b.go
        └── libc
            └── c.go
Where,
gopath/src/projecta/a.go:

package main

import (
	"projecta/libb"
	"projecta/libc"
)

func a() {
	libb.B()
	libc.C()
}

func main() { a() }

gopath/src/projecta/libb/b.go:

package libb

import (
	"projecta/libc"
)

func B() { libc.C() }

gopath/src/projecta/libc/c.go:

package libc

func C() {}
If you are talking about third-party libraries, in Go it is very simple to do that: just put the import in your source code, like:
import "github.com/somepackage/somelib"
and from the command line in your working directory run:
go get
The source code of the libraries will be downloaded into the src directory of your $GOPATH.
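You can also fetch a specific library explicitly, using the import path from above:

go get github.com/somepackage/somelib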
If you want to create your own lib instead, just create a folder named after the lib in $GOPATH/src and put the code in that folder.
The folder structure is:
$GOPATH/
    src/
        github.com/
            somepackage/
                somelib/
                    somelib.go
        yourlibB/
            yourlibB.go    // import somelib here
        yourlibC/
            yourlibC.go    // import libB here
        yourmainprogramA/
            yourmainprogramA.go    // import somelib, libC and libB here
I’ve tried to add a package to Buildroot that uses Qt and Boost. The package uses qmake to generate a Makefile; this part seems to be working. However, I get an error when I build, saying:
Could not find qmake configuration file qws/linux-arm-g++.
Error processing project file: MsgDisplay.pro
The contents of my package is laid out like this:
DummyPgm
├── main.cpp
├── MsgDisplay.pri
├── MsgDisplay.pro
├── MsgDisplay.pro.user
├── MsgHandler.cpp
├── MsgHandler.h
├── MsgServer.cpp
├── MsgServer.h
├── Tcp
│   ├── TcpAddrPort.cpp
│   ├── TcpAddrPort.h
│   ├── TcpServer.cpp
│   ├── TcpServer.h
│   ├── TcpSocket.cpp
│   └── TcpSocket.h
└── Tools
    ├── Banner.cpp
    ├── Banner.h
    ├── IoExt.h
    ├── SeparateArgumentList.cpp
    ├── SeparateArgumentList.h
    └── SysTypes.h
2 directories, 20 files
I have added a package directory, dummypgm, which contains Config.in and dummypgm.mk files. The contents of the files are:
Config.in:
config BR2_PACKAGE_DUMMYPGM
	bool "dummypgm"
	help
	  Foo Software.
	  http://www.foo.com
dummypgm.mk:
DUMMYPGM_VERSION = 0.1.0
DUMMYPGM_SOURCE = DummyPgm-$(DUMMYPGM_VERSION).tar.gz

define DUMMYPGM_CONFIGURE_CMDS
	(cd $(@D); $(QT_QMAKE) MsgDisplay.pro)
endef

define DUMMYPGM_BUILD_CMDS
	$(MAKE) -C $(@D)
endef

$(eval $(generic-package))
Since the package is hosted locally, I’ve simply put the DummyPgm-0.1.0.tar.gz in the dl directory.
I’ve also added the following to package/Config.in:
source "package/dummypgm/Config.in"
I’m a little lost as to why this doesn’t work; if anyone could help me I would be very grateful. Also, is there any way to call $(eval $(qmake-package)) or something?
Are you using Qt4 or Qt5? Your package/dummypgm/Config.in should have a depends on for one of them, and your dummypgm.mk should have DUMMYPGM_DEPENDENCIES = qt or DUMMYPGM_DEPENDENCIES = qt5base.
My intuition is that you are using Qt5. In this case, you shouldn't call $(QT_QMAKE), but $(QT5_QMAKE).
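A sketch of the corresponding changes, assuming Qt5 (the depends on symbol follows Buildroot convention; double-check it against your Buildroot version):

Config.in:

config BR2_PACKAGE_DUMMYPGM
	bool "dummypgm"
	depends on BR2_PACKAGE_QT5
	help
	  Foo Software.
	  http://www.foo.com

dummypgm.mk:

DUMMYPGM_DEPENDENCIES = qt5base

define DUMMYPGM_CONFIGURE_CMDS
	(cd $(@D); $(QT5_QMAKE) MsgDisplay.pro)
endef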
Have a look at http://git.buildroot.net/buildroot/tree/package/qextserialport/qextserialport.mk for an example. Note that this example supports both Qt4 and Qt5, probably in your case you only need one of the two.
Also, you should really subscribe to the Buildroot mailing list, you would get a lot more answers than here.