Undefined function despite being defined in imported .proto file - go

I have two .proto files in the same directory, where second.proto depends on first.proto.
second.proto:
import "first.proto";
enum ThingINeed {
...something
}
I have no problem running these commands:
$ protoc --go_out=generatedsources/first -I. first.proto
$ protoc --go_out=generatedsources/second -I. second.proto
so the directory structure looks something like
src
|-- first.proto
|-- second.proto
`-- generatedsources
    |-- first
    |   `-- first.pb.go
    `-- second
        `-- second.pb.go
My problem is that when I run
$ cd generatedsources/second
$ go build second.pb.go
I receive "./second.pb.go: undefined: ThingINeed", since second.pb.go uses ThingINeed from first.pb.go (which appears in first.proto as well).
I notice that second.pb.go doesn't have an import . "generatedsources/first" line in it. When I put it in manually, it works just fine. But I'm not supposed to edit these .pb.go files, so I was wondering how to fix this. I would also rather not edit the .proto files.
Would appreciate any help!
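For reference, a minimal sketch of one way this import is commonly wired up with the protoc-gen-go plugin: M mappings tell protoc which Go import path each .proto corresponds to, so the generated second.pb.go imports the first package itself. The module path "myproject" below is a hypothetical placeholder, not something from the question:
$ protoc -I. \
    --go_out=Mfirst.proto=myproject/generatedsources/first:generatedsources/first \
    first.proto
$ protoc -I. \
    --go_out=Mfirst.proto=myproject/generatedsources/first,Msecond.proto=myproject/generatedsources/second:generatedsources/second \
    second.proto
With mappings like these (or equivalent option go_package lines in the .proto files), second.pb.go should be generated with an import of "myproject/generatedsources/first", so ThingINeed resolves without hand-editing the generated code, provided "myproject" matches the project's real import path.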

Related

Referencing external protos like google/rpc/status.proto

I'm wondering how to properly reference external proto files. Say I've got a .proto file which references standard protobuf types such as Timestamp:
syntax = "proto3";
package api;
import "google/protobuf/timestamp.proto";
message ServerTimeResponse {
google.protobuf.Timestamp ts = 1;
}
Easy. Timestamp is automatically available when compiling.
Now I add an external type, say google.rpc.Status:
syntax = "proto3";
package api;
import "google/protobuf/timestamp.proto";
import "google/rpc/status.proto";
message ServerTimeResponse {
google.protobuf.Timestamp ts = 1;
google.rpc.Status status = 2;
}
Of course we have to tell protoc where to find this file via -I/--proto_path.
My question is this: What is the best practice for actually referencing this file, in particular to make version control happy? There appears not to be a go mod equivalent for protobufs. I've seen it copied verbatim into projects (such as in grpc-gateway) or just referenced from the local filesystem.
I think you sort of answered your own question here. I've done both successfully: manually copied the necessary files in verbatim (from https://github.com/googleapis/googleapis/tree/master/google and https://github.com/protocolbuffers/protobuf/tree/master/src/google/protobuf), and referenced local copies of the files.
If you want to do this and make version control happy, you could add these two repositories as git submodules inside your repository. Just make sure to pass the right locations to protoc using -I. E.g.:
cd $PROJECT_DIR
mkdir third_party && cd third_party
git submodule add https://github.com/googleapis/googleapis
cd $PROJECT_DIR
<git commit the change>
protoc -I third_party/googleapis <the rest of your protoc command>
As for referencing local copies of the files, and making sure they're present before attempting to build, you may find that adding something like the following to your Makefile will help (this is in a Go build environment):
go get -u github.com/grpc-ecosystem/grpc-gateway/protoc-gen-grpc-gateway
go get -u github.com/golang/protobuf/protoc-gen-go
grpc_gateway_path=$(go list -m -f '{{.Dir}}' github.com/grpc-ecosystem/grpc-gateway)
googleapis_path="$grpc_gateway_path/third_party/googleapis"
protoc -I $googleapis_path --go_out=. <list of input files>
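For concreteness, a rough sketch of how the include paths come together in a single invocation once the googleapis protos are available locally (the input file name api/server_time.proto and the go_out options are assumptions, not taken from the question). The well-known types such as timestamp.proto ship with protoc, so only the googleapis path needs to be added:
protoc \
  -I . \
  -I third_party/googleapis \
  --go_out=paths=source_relative:. \
  api/server_time.proto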

grpc-web compile proto file with wildcard

I have a folder protobuf with a lot of .proto files which I can compile with
protoc -I=protobuf filename.proto --grpc-web_out=import_style=commonjs,mode=grpcwebtext:output
This generates the grpc_web_pb.js into the /output folder, but I'm looking for a way to not have to call protoc for every single file. Is there something like a wildcard?
I tried
protoc -I=protobuf *.proto --grpc-web_out=import_style=commonjs,mode=grpcwebtext:output
but that doesn't work; it fails with no matches found: *.proto
This works for me; maybe just add the path to your .proto files:
protoc -I . --grpc-web_out=import_style=commonjs,mode=grpcwebtext:. ./*.proto
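The no matches found: *.proto error in the question is just the shell reporting that the glob matched nothing in the current directory, because the files live under protobuf/. Assuming the protobuf/ and output/ layout from the question, the same idea would look roughly like:
protoc -I=protobuf --grpc-web_out=import_style=commonjs,mode=grpcwebtext:output protobuf/*.proto
The shell expands protobuf/*.proto to the full file list, and each file still resolves against -I=protobuf because it lives inside that path.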

Include MPI in a makefile

I am trying to include the MPI compiler in my makefile. The makefile is already prepared such that I only need to put the path to the MPI compiler in a separate env file. However, doing so does not work. I can get the cpp file to run manually by typing:
mpicxx Demo_00.cpp -o aprogram
./aprogram
I test where the mpi compiler is located using:
which mpicxx
/usr/bin/mpicxx
In the env file the corresponding line is:
MPICXX=/usr/bin/mpicxx
However, when I try to 'make' the cpp file I get the following error:
make Demo_00
g++ Demo_00.cpp -o Demo_00
Demo_00.cpp:2:17: fatal error: mpi.h: No such file or directory
compilation terminated.
make: *** [Demo_00] Error 1
The cpp file is in the same folder as the env file and the makefile.
I am not quite sure how to identify the error.
Thank you for your help,
Tartaglia
If you want to change the name of the C++ compiler, you have to change the variable CXX. That's the default variable make uses when it wants to compile C++ code.
This line in your log file:
g++ Demo_00.cpp -o Demo_00
says that you are using the g++ compiler instead of mpicxx.
Usually in makefiles the compiler definition is near the beginning of the file and looks like this:
CXX=g++
Just change it to:
CXX=mpicxx
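A minimal sketch of what that looks like, assuming GNU make and the file names from the question (MPICXX is the variable from the env file):
# use the MPI wrapper instead of the default g++ for C++ sources
CXX = mpicxx
# or, if the compiler really is meant to come from the env file:
# CXX = $(MPICXX)

# "make Demo_00" then runs: mpicxx <flags> Demo_00.cpp -o Demo_00
Demo_00: Demo_00.cpp
	$(CXX) $(CXXFLAGS) $< -o $@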
Thank you all for your responses. I took a closer look at the makefile I thought I was using and it turns out, as you have already suggested, that I was not using it at all. The makefile was only able to build one specific cpp file with one specific name, so whenever I typed make *.cpp I was using the standard make rules, as you already pointed out.
Thanks again for your help.

Link command line too long: how to use response files when linking in scons on windows

Like others I have a link line that exceeds the Windows cmd line limit. For most cases we have solved the problem by building intermediate archives (aka static libraries) with subsets of the object files and performed the final link with those archives. However using this strategy with Google Test this causes the tests not to be found, specifically the tests defined in the object files that were archived.
Update: This is why. I will probably use this workaround, but I would still like to understand how to make response files work under scons.
The LongCmdLinesOnWin32 fix is problematic. We have a cygwin environment and pathnames that include spaces, so some compiler absolute paths involve quotes. The script in LongCmdLinesOnWin32 first needs to be extended to handle both the embedded quotes and the spaces (otherwise it turns a single path name into separate tokens). More seriously, when using MS Visual Studio, the compiler command is just 'cl', i.e. it doesn't include the pathname. That is not available in the PATH environment; it appears to be set dynamically (somehow) and is not visible when constructing the cmdline argument to the LongCmdLinesOnWin32 script. But I digress...
There seems to be a much simpler (and to my eyes suitable) solution: response files, which are also supported by gcc.
I wrote a little function to take the list of object names and print them to a text file, one per line, something like:
"""
In place for generating response files
"""
def gen_response_file(filename,file_list):
with open(filename,"w") as f:
for obj_name in file_list:
f.write ('%s\n' %os.path.abspath(str(obj_name)).replace('\\','/'))
return filename
I then tried prepending the '#' character to the file name and added it to the list of options.
The command line echoed was:
link /nologo /MACHINE:x86 /DEBUG #E:\dev\pcoip_view_client\soft_test.rsp /OUT:blah_client\blah_client_tests.exe /LIBPATH:\\sterbkp03\qt\4.8.2\lib ....
If I simply named the file "soft_test" then scons would add the suffix ".obj" and the linker could not find it, so I tried adding the suffix '.rsp'. Now the linker complains it cannot find the file, but it is present. I captured the output from scons and pasted it into a bat file. When I ran the bat file (from the VS 2008 command line env.) the link worked like a charm, so it seems like scons is somehow causing the problem with finding the file.
I tried changing the path, using absolute (#C:\blah\soft_test.rsp), relative (#.\soft_test.rsp) and just #soft_test.rsp, none of them worked.
LINK : fatal error LNK1104: cannot open file '#E:\dev\swift.dev\blah_client\soft_test.rsp'
scons: *** [blah_client\blah_client_tests.exe] Error 1104
I'm using scons v2.1.0.r5357, VS 2008 and python 2.7 under Windows 7-64
My scons file looks like:
test_objects = tenv.Object(test_sources)
xx = gen_response_file('soft_test.rsp',test_objects)
tenv.Append( LINKFLAGS = [ '#%s' % os.path.abspath(xx)])
test_exe = tenv.Program(target = 'blah_client_tests', source = objects + moc_objects + qrc_objects )
Any suggestions greatly appreciated.
Update: I tried with gcc and there was no problem. My guess is that somehow the scons rules associated with Visual Studio tools is different enough to cause grief.
I tried to reproduce this in Linux using gcc, and came across a different problem, whose solution may help.
Originally, I used this SConscript:
import os
"""
In place for generating response files
"""
def gen_response_file(filename, file_list):
    with open(filename, "w") as f:
        for obj_name in file_list:
            f.write('%s\n' % os.path.abspath(str(obj_name)).replace('\\', '/'))
    return filename
env = Environment()
test_objects = env.Object(target = 'testClass', source = 'testClass.cc')
resp_file = gen_response_file('response_file.rsp', test_objects)
env.Append(LINKFLAGS = [ '#%s' % os.path.abspath(resp_file)])
env.Program(target = 'helloWorld', source = 'helloWorld.cc')
Here are the related source files I used:
# tree .
.
|-- SConstruct
|-- helloWorld.cc
|-- testClass.cc
`-- testClass.h
Where helloWorld.cc is the main program; it includes testClass.h and links in testClass.o. When I tried to compile this, the response file was correctly generated (it only contains /some/path/testClass.o) and read by the compiler. The problem I came across was that testClass.o was not compiled, since SCons doesn't appear to recognize the dependency on the objects listed in the response file. Here is the result:
# scons
scons: Reading SConscript files ...
scons: done reading SConscript files.
scons: Building targets ...
g++ -o helloWorld.o -c helloWorld.cc
g++ -o helloWorld #/some/path/response_file.rsp helloWorld.o
g++: /some/path/testClass.o: No such file or directory
scons: *** [helloWorld] Error 1
scons: building terminated because of errors.
This seems like a failure in SCons, because it doesn't analyze the response file. To solve this problem, I had to use the Depends() function as in the following excerpt:
...
bin = env.Program(target = 'helloWorld', source = 'helloWorld.cc')
env.Depends(bin, test_objects)
This worked and gave me the following:
# scons
scons: Reading SConscript files ...
scons: done reading SConscript files.
scons: Building targets ...
g++ -o helloWorld.o -c helloWorld.cc
g++ -o testClass.o -c testClass.cc
g++ -o helloWorld #/some/path/response_file.rsp helloWorld.o
scons: done building targets.
I know this doesn't answer the original question about why the response files can't be found, but once you solve that, you will most likely run into the problem mentioned above and have to use the Depends() function.
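Putting the pieces together, a consolidated sketch of the SConscript (same hypothetical file names as above), with the Depends() call as the key addition:
import os

def gen_response_file(filename, file_list):
    # one absolute object path per line, forward slashes for the linker
    with open(filename, "w") as f:
        for obj_name in file_list:
            f.write('%s\n' % os.path.abspath(str(obj_name)).replace('\\', '/'))
    return filename

env = Environment()
test_objects = env.Object(target='testClass', source='testClass.cc')
resp_file = gen_response_file('response_file.rsp', test_objects)
env.Append(LINKFLAGS=['#%s' % os.path.abspath(resp_file)])
bin = env.Program(target='helloWorld', source='helloWorld.cc')
# SCons does not parse the response file, so declare the dependency explicitly
env.Depends(bin, test_objects)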

Can I make gotest pass compiler flags?

I have just put together a Go package that is going to be part of a fairly large system with a lot of shared packages. I was able to get it to compile by writing its Makefile such that the compiler is called with -I flags:
include $(GOROOT)/src/Make.inc

TARG=foobar
GOFILES=\
	foobar.go\

foobar:
	$(GC) -I$(CURDIR)/../intmath -I$(CURDIR)/../randnum foobar.go

include $(GOROOT)/src/Make.pkg
It compiles just fine, and being a good boy, I wrote a comprehensive set of tests. However, when I try to run the tests with gotest, I get a compile error:
$ gotest
rm -f _test/foobar.a
8g -o _gotest_.8 foobar.go foobar_test.go
foobar.go:4: can't find import: intmath
make: *** [_gotest_.8] Error 1
gotest: "C:\\msys\\bin\\sh.exe -c \"gomake\" \"testpackage\" \"GOTESTFILES=foobar_test.go\"" failed: exit status 2
So, the Go file itself will compile when I use the -I flags to tell it where to find the intmath and randnum packages, but gotest doesn't seem to use the Makefile.
Answering peterSO's question:
foobar.go's import section looks like this:
import (
	"intmath"
	"randnum"
	"container/vector"
)
And the compile works fine as long as I have the -I flags going to the compiler. I have tried to use relative paths, like this:
import (
	"../intmath"
	"../randnum"
	"container/vector"
)
but that just doesn't seem to work.
EDIT: answering further peterSO questions:
GOROOT is set to C:\Go, the directory where I have all of the Go stuff (aside from my source code) installed. I was expecting the relative path to be relative to the directory in which the source file lives.
My source tree looks like this:
server/
    foobar/
    randnum/
    intmath/
So, while I am open to a different, more Go-idiomatic directory structure, my instinct is to arrange them as peers.
Is there some way that I can nudge gotest into compiling foobar.go with the needed flags?
Create the Windows source code directory structure:
C:\server
C:\server\foobar
C:\server\intnum
For intnum.go:
package intnum

func IntNum() int {
	return 42
}
Makefile:
include $(GOROOT)/src/Make.inc
TARG=server/intnum
GOFILES=\
	intnum.go\

include $(GOROOT)/src/Make.pkg
Run:
$ cd /c/server/intnum
$ make install
For foobar.go:
package foobar

import (
	"math"
	"server/intnum"
)

func FooBar() float64 {
	return float64(intnum.IntNum()) * math.Pi
}
Makefile:
include $(GOROOT)/src/Make.inc
TARG=server/foobar
GOFILES=\
	foobar.go\

include $(GOROOT)/src/Make.pkg
Run:
$ cd /c/server/foobar
$ make install
After the install, the intnum.a and foobar.a package files will be in the $GOROOT\pkg\windows_386\server (C:\Go\pkg\windows_386\server) directory.
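For completeness, a hypothetical foobar_test.go (not from the question) that gotest should then compile without extra flags, since server/intnum resolves from the installed package tree:
package foobar

import "testing"

func TestFooBar(t *testing.T) {
	// FooBar() returns 42 * math.Pi, so it should be positive
	if FooBar() <= 0 {
		t.Errorf("FooBar() = %v, want a positive value", FooBar())
	}
}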
