protobuf validator command generates file in wrong path

I am trying to include request validation for gRPC, so I modified the protoc command accordingly.
pkg/test/test.proto contains my schema.
If I run the command below:
protoc --go_out=. \
--proto_path=${GOPATH}/src \
--proto_path=${GOPATH}/src/github.com/gogo/protobuf/gogoproto/ \
--proto_path=${GOPATH}/src/github.com/mwitkow/go-proto-validators/ \
--proto_path=. \
--go_opt=paths=source_relative \
--go-grpc_out=. \
--go-grpc_opt=paths=source_relative \
--govalidators_out=. \
pkg/test/test.proto
The validator file is not generated inside pkg/test; instead, it is generated inside a newly created folder: {source relative pkg}/pkg/test/test.proto/validator.go.
How do I generate the validator.go file in pkg/test, without the extra folder structure?

Analysis
It looks like the *.validator.pb.go files are generated in the wrong directory.
Using the pkg/test/test.proto file with the following content:
syntax = "proto3";
option go_package = "github.com/example-user/example-repository";
service Greeter {
  rpc SayHello (HelloRequest) returns (HelloReply) {}
}
message HelloRequest {
  string name = 1;
}
message HelloReply {
  string message = 1;
}
Running the original command produced the following file system contents:
$ find .
.
./github.com
./github.com/example-user
./github.com/example-user/example-repository
./github.com/example-user/example-repository/test.validator.pb.go
./pkg
./pkg/test
./pkg/test/test_grpc.pb.go
./pkg/test/test.proto
./pkg/test/test.pb.go
Solution
Add the --govalidators_opt=paths=source_relative command-line argument.
Note the parameter name, with the _opt suffix:
--govalidators_opt
The complete command line:
protoc --go_out=. \
--proto_path=. \
--go_opt=paths=source_relative \
--go-grpc_out=. \
--go-grpc_opt=paths=source_relative \
--govalidators_out=. \
--govalidators_opt=paths=source_relative \
pkg/test/test.proto
This produced the following file system contents:
$ find .
.
./pkg
./pkg/test
./pkg/test/test_grpc.pb.go
./pkg/test/test.proto
./pkg/test/test.pb.go
./pkg/test/test.validator.pb.go
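As an aside, the _opt flag is a general protoc convention rather than something specific to go-proto-validators: a plugin installed as protoc-gen-NAME is selected with --NAME_out, and --NAME_opt passes options through to it (whether a given option such as paths=source_relative is understood depends on the plugin itself). A minimal sketch with a hypothetical plugin protoc-gen-foo, used purely for illustration:
# protoc-gen-foo is hypothetical; --foo_out picks the output directory,
# while --foo_opt forwards options to the plugin.
protoc \
--proto_path=. \
--foo_out=. \
--foo_opt=paths=source_relative \
pkg/test/test.proto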
Additional references
Seems to be an example, with a comment.
A GitHub issue created by the asker: How to specify path for generated validator.go? · Issue #121 · mwitkow/go-proto-validators.

Related

--go-grpc_out: protoc-gen-go-grpc: Plugin failed with status code 1

I am getting this error when I compile my proto file:
protoc-gen-go-grpc: program not found or is not executable
Please specify a program using absolute path or make sure the program is available in your PATH system variable
--go-grpc_out: protoc-gen-go-grpc: Plugin failed with status code 1.
I am using these versions:
Binary          Version
protoc-gen-go   v1.25.0-devel
protoc          v3.12.4
I have tried to compile this proto file:
syntax = "proto3";
option go_package = "proto/";
message GreetingRequest {
  string first_name = 1;
  string last_name = 2;
}
message GreetingResponse {
  string result = 1;
}
service AddService {
  rpc Greet(GreetingRequest) returns (GreetingResponse) {};
}
I used this command:
protoc \
--proto_path=proto \
--go_out=proto \
--go_opt=paths=source_relative \
--go-grpc_out=proto \
--go-grpc_opt=paths=source_relative \
service.proto
I installed protoc-gen-go-grpc using:
go install google.golang.org/grpc/cmd/protoc-gen-go-grpc@latest
but it's still showing this error.
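The quoted error means protoc could not find the protoc-gen-go-grpc binary on PATH. A minimal sketch of the usual fix, assuming the plugin was installed into the default Go binary directory:
# Install the plugin (note the @latest version suffix).
go install google.golang.org/grpc/cmd/protoc-gen-go-grpc@latest
# go install places binaries in $GOBIN, or $GOPATH/bin when GOBIN is unset;
# protoc can only find the plugin if that directory is on PATH.
export PATH="$PATH:$(go env GOPATH)/bin"
# Verify the plugin now resolves before re-running protoc.
which protoc-gen-go-grpc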

error with the use of the script $this->load->library('upload', $config)

I have this error: Fatal error: Call to a member function load() on string in C:\xampp\htdocs\cloud.ikwook.com\ikwook_system\libraries\Upload.php on line 1136 when I execute this script:
$this->load->library('upload', $config);
What exactly might it be due to?
I see spaces in between all of this, and I think it is the same in your code as well. It would be really helpful if you pasted the code. The call should be written without the spaces, and with a comma between the arguments:
$this->load->library('upload', $config);

gcloud endpoints deploy error unresolved type

I'm trying to deploy a service which requires Google protobuf's Timestamp, but I am receiving an error.
gcloud endpoints services deploy api_descriptor.pb api_config.yaml --validate-only
ERROR: (gcloud.endpoints.services.deploy) INVALID_ARGUMENT: Cannot
convert to service config.
'ERROR: unknown location: Unresolved type '.google.protobuf.Timestamp''
My command to generate api_descriptor.pb:
protoc \
--plugin=protoc-gen-go=${GOBIN}/protoc-gen-go \
-I . proto/service.proto \
--descriptor_set_out=api_descriptor.pb \
--go_out=plugins=grpc:.
The relevant bit from the proto file, which requires google.protobuf.Timestamp:
syntax = "proto3";
package proto;
import "vendor/github.com/golang/protobuf/ptypes/timestamp/timestamp.proto";
message CandleStick {
  string ID = 1;
  double Open = 2;
  double Close = 3;
  double High = 4;
  double Low = 5;
  google.protobuf.Timestamp TimeStamp = 6;
}
Tried for hours unsuccessfully to resolve this issue. Thanks in advance!
In your protoc command line invocation, I think you need to include all the imports in the generated descriptor. You can do this using --include_imports:
protoc \
--plugin=protoc-gen-go=${GOBIN}/protoc-gen-go \
--include_imports \
-I . proto/service.proto \
--descriptor_set_out=api_descriptor.pb \
--go_out=plugins=grpc:.
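Once api_descriptor.pb has been regenerated with its imports bundled in (including the Timestamp definition), the validate-only deploy command from the question can be re-run to confirm the fix:
# Re-run the validation once api_descriptor.pb includes its imports.
gcloud endpoints services deploy api_descriptor.pb api_config.yaml --validate-only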

AC_CONFIG_FILES not generating Makefiles

I'm writing an app in Vala with support for plugins. The app has the following directory structure:
data/
[data files]
m4/
my_project.m4
plugins/
example/
example.plugin.in
example-plugin.vala
Makefile.am
po/
src/
[source files]
The file "my_project.m4" dynamically adds plugin dirs through a simple function it defines, called MYPROJ_ADD_PLUGIN, and it works fine; I have tested it with some other projects. Basically, it calls:
AC_CONFIG_FILES([plugins/example/Makefile])
[...]
AC_CONFIG_FILES([plugins/example/example.plugin])
The problem is, when I try to configure it, it gives back:
"error: cannot find input file: `plugins/example/Makefile.in'"
The example makefile (plugins/example/Makefile.am) is the following:
include $(top_srcdir)/common.am
plugin_LTLIBRARIES = example-plugin.la
plugin_DATA = example.plugin
example_plugin_la_SOURCES = \
example-plugin.vala
example_plugin_la_VALAFLAGS = \
$(MYPROJ_COMMON_VALAFLAGS) \
--target-glib=2.38
example_plugin_la_CFLAGS = \
$(MYPROJ_COMMON_CFLAGS) \
-I$(srcdir) \
-DG_LOG_DOMAIN='"Example"'
example_plugin_la_LIBADD = \
$(MYPROJ_COMMON_LIBS)
example_plugin_la_LDFLAGS = \
$(MYPROJ_PLUGIN_LINKER_FLAGS) \
-lm
EXTRA_DIST = example.plugin.in
Every variable is correctly generated (in common.am and configure.ac).
I appreciate any advice on this issue. Thanks in advance!
Looks like I found the answer to my own question. Apparently, all I had to do was add a "lib" prefix to my plugin output file. The plugins/example/Makefile.am now looks like:
include $(top_srcdir)/common.am
plugin_LTLIBRARIES = libexample.la
plugin_DATA = example.plugin
libexample_la_SOURCES = \
example-plugin.vala
libexample_la_VALAFLAGS = \
$(MYPROJ_COMMON_VALAFLAGS) \
--target-glib=2.38
libexample_la_CFLAGS = \
$(MYPROJ_COMMON_CFLAGS) \
-I$(srcdir) \
-DG_LOG_DOMAIN='"Example"'
libexample_la_LIBADD = \
$(MYPROJ_COMMON_LIBS)
libexample_la_LDFLAGS = \
$(MYPROJ_PLUGIN_LINKER_FLAGS) \
-lm
EXTRA_DIST = example.plugin.in
This was the only modification I made, and it works as expected now. It seems libtool is very strict about the naming of shared libraries: their names must start with the lib prefix (unless they are built as loadable modules with libtool's -module flag).

specifying own inputformat for streaming job

I defined my own input format as follows, which prevents file splitting:
import org.apache.hadoop.fs.*;
import org.apache.hadoop.mapred.TextInputFormat;
public class NSTextInputFormat extends TextInputFormat {
    @Override
    protected boolean isSplitable(FileSystem fs, Path file) {
        return false;
    }
}
I compiled this using Eclipse into a class NSTextInputFormat.class. I copied this class to the client from which the job is launched. I used the following command to launch the job, passing the above class as the input format:
hadoop jar $HADOOP_HOME/hadoop-streaming.jar -Dmapred.job.queue.name=unfunded -input 24222910/framefile -input 24225109/framefile -output Output -inputformat NSTextInputFormat -mapper ExtractHSV -file ExtractHSV -file NSTextInputFormat.class -numReduceTasks 0
This fails saying:
-inputformat : class not found : NSTextInputFormat
Streaming Job Failed!
I set the PATH and CLASSPATH variables to the directory containing NSTextInputFormat.class, but that still does not work. Any pointers on this would be helpful.
There are a few gotchas here that can get you if you are not familiar with Java.
-inputformat (and the other command-line options that expect class names) expects a fully qualified class name; otherwise it expects to find the class in some org.apache.hadoop... namespace. So you must include a package name in your .java file:
package org.example.hadoop;
import org.apache.hadoop.fs.*;
import org.apache.hadoop.mapred.TextInputFormat;
public class NSTextInputFormat extends TextInputFormat {
    @Override
    protected boolean isSplitable(FileSystem fs, Path file) {
        return false;
    }
}
And then specify the full name on the command line:
-inputformat org.example.hadoop.NSTextInputFormat
When you build the jar file, the .class file must also be in a directory structure that mirrors the package name. I'm sure this is Java Packaging 101, but if you are using Hadoop Streaming you probably aren't too familiar with Java in the first place. Passing the -d option to javac tells it to compile the input files into .class files in directories that match the package name.
javac -classpath `hadoop classpath` -d ./output NSTextInputFormat.java
The compiled .class file will be written to ./output/org/example/hadoop/NSTextInputFormat.class. You will need to create the output directory, but the other sub-directories will be created for you. The jar file can then be created like so:
jar cvf myjar.jar -C ./output/ .
And you should see some output similar to this:
added manifest
adding: org/(in = 0) (out= 0)(stored 0%)
adding: org/example/(in = 0) (out= 0)(stored 0%)
adding: org/example/hadoop/(in = 0) (out= 0)(stored 0%)
adding: org/example/hadoop/NSTextInputFormat.class(in = 372) (out= 252)(deflated 32%)
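As a quick sanity check (a suggestion beyond the original answer), list the jar contents to confirm the class file sits under the package path:
# The class must appear at org/example/hadoop/NSTextInputFormat.class.
jar tf myjar.jar | grep NSTextInputFormat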
Bundle the input format and mapper class into a jar (myjar.jar) and add the -libjars myjar.jar option to the command line:
hadoop jar $HADOOP_HOME/hadoop-streaming.jar \
-libjars myjar.jar \
-Dmapred.job.queue.name=unfunded \
-input 24222910/framefile \
-input 24225109/framefile \
-output Output \
-inputformat org.example.hadoop.NSTextInputFormat \
-mapper ExtractHSV \
-numReduceTasks 0
