openapiv2 imports causes compilation error in generated code - maven

I am using buf to generate grpc server and client code for several languages (go, python, js, java, c#), while using grpc-ecosystem/plugins/openapiv2 plugin to generate swagger documentation from the same proto files.
In some files I'm using a custom option such as
option (grpc.gateway.protoc_gen_openapiv2.options.openapiv2_tag) = {description: "Manage datasets and examples used for training."};
to add additional metadata to the documentation. This requires me to import annotations.proto from the grpc-gateway project, which causes the import to also appear in the generated source files. Languages like Go can handle this with blank imports for side effects:
import (
_ "github.com/grpc-ecosystem/grpc-gateway/v2/protoc-gen-openapiv2/options"
_ "google.golang.org/genproto/googleapis/api/annotations"
)
but in Java and C# the generated code contains lines like
registry.add(com.google.api.AnnotationsProto.http);
registry.add(grpc.gateway.protoc_gen_openapiv2.options.Annotations.openapiv2Tag);
which cause compilation errors, because the package grpc.gateway... does not exist (I was able to import the googleapis package via Maven and NuGet). When I remove the options from the .proto files there are no issues and I can compile the source files into a package for distribution. Is there any way to exclude these imports from the generated code?
I have tried moving the documentation options into separate files, but that is impossible for metadata that is part of Service or Message definitions, as I get duplicate-definition errors.

Since there is no official Java library that corresponds to that annotations.proto file, you need to generate your own or do some shenanigans to modify the compiled descriptors before generating the Java code. I'll explain both.
You can generate Java code for annotations.proto, not just your own proto that imports it.
If you use Buf, you can actually tell it to generate source code for your imports using a --include-imports flag to buf generate (though this generates sources for all imports, not just ones that don't otherwise have a corresponding Java library).
These annotations are available in Buf's Schema Registry, so you could also separately generate these files and compile them into a separate JAR using buf generate buf.build/grpc-ecosystem/grpc-gateway.
One issue with this file is that it does not declare a Java package option in the file. That's why the Java package in the generated code doesn't have a proper reverse domain name. With Buf, you could use managed mode to actually inject a Java package option, to generate these files into whatever package you want (so you could generate them as if they were in a "shaded" package in your own JAR).
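As a sketch of that managed-mode idea (assuming the v1 buf.gen.yaml format and Buf's remote Java plugin; the package prefix com.example.shaded is a placeholder you would replace with your own), the configuration could look like:

```yaml
version: v1
managed:
  enabled: true
  # Inject a Java package option for files that don't declare one,
  # effectively "shading" the generated classes into your own package.
  java_package_prefix: com.example.shaded
plugins:
  - plugin: buf.build/protocolbuffers/java
    out: gen/java
```

With that in place, buf generate buf.build/grpc-ecosystem/grpc-gateway would emit the annotations classes under the prefixed package, ready to compile into a separate JAR.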
This second route is much less advised and not for the faint of heart, but it lets you omit the import in the generated code. You first need to compile your sources to a file descriptor set. (Buf can produce this via buf build; with protoc, use the -o option instead of --java_out.) This file is a binary-encoded FileDescriptorSet.
You could write something that reads this file (unmarshalling its contents into a FileDescriptorSet) and then modifies it. You'd modify it by examining the dependency field of every file in the set and removing the entries like "protoc-gen-openapiv2/options/annotations.proto".
You can then re-marshal this to a file and feed that back in for code generation. So instead of generating Java code from sources, you'd generate them from the modified descriptor set (buf generate <file>#format=bin if using Buf or protoc --descriptor_set_in=<file> --java_out=<output-dir> if using protoc).
Note that this approach can only work if the only things that use the import being removed are custom options. That's because custom options can safely be represented as unrecognized fields in the descriptor (and effectively ignored). If you remove an import that has type definitions that are referenced in the file, the compiler will not accept the modified file descriptor set.
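The filtering step above can be sketched as follows. This uses a stand-in struct so the logic is visible and self-contained; in real code you would unmarshal the file into a descriptorpb.FileDescriptorSet (from google.golang.org/protobuf) and operate on the Dependency field of each FileDescriptorProto in it:

```go
package main

import (
	"fmt"
	"strings"
)

// fileDescriptor is a stand-in for descriptorpb.FileDescriptorProto;
// only the field relevant here is modeled.
type fileDescriptor struct {
	Name       string
	Dependency []string
}

// stripImports removes every dependency whose path starts with prefix.
// Caveat: a real implementation must also fix up the PublicDependency
// and WeakDependency fields, which hold indices into Dependency.
func stripImports(files []fileDescriptor, prefix string) {
	for i := range files {
		kept := files[i].Dependency[:0]
		for _, dep := range files[i].Dependency {
			if !strings.HasPrefix(dep, prefix) {
				kept = append(kept, dep)
			}
		}
		files[i].Dependency = kept
	}
}

func main() {
	files := []fileDescriptor{{
		Name: "my_service.proto",
		Dependency: []string{
			"google/api/annotations.proto",
			"protoc-gen-openapiv2/options/annotations.proto",
		},
	}}
	stripImports(files, "protoc-gen-openapiv2/options/")
	fmt.Println(files[0].Dependency)
}
```

After re-marshalling the modified set to a file, it can be fed back into buf generate or protoc --descriptor_set_in as described above.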
If that last bullet looks like Greek, it's because that is quite advanced descriptor-fiddling. Realistically, I think the first bullet is your best approach.

Related

How to add custom information model XML.file to server and run It?

I'm currently working with open62541. I created an object XML file, and now I want to add that file to the server and run it. When the server runs, the objects contained in the XML file should show up in an OPC UA client application.
In 3 steps:
you need a nodeset.xml file
use cmake command to generate source code from it
call a function in your executable
1. I do not know what kind of "XML" file you have, so I'll assume it is a valid nodeset.xml file.
If you do not know how to create one, you can read this: https://opcua.rocks/custom-information-models/
Personally, I suggest using a GUI tool for that (e.g. the free OPC UA Modeler).
2. Then use the following custom CMake command provided by open62541:
# Generate types and namespace for DI
ua_generate_nodeset_and_datatypes(
    NAME "di" # the name you want
    FILE_CSV "${UA_NODESET_DIR}/DI/OpcUaDiModel.csv"
    FILE_BSD "${UA_NODESET_DIR}/DI/Opc.Ua.Di.Types.bsd"
    NAMESPACE_MAP "2:http://opcfoundation.org/UA/DI/"
    FILE_NS "${UA_NODESET_DIR}/DI/Opc.Ua.Di.NodeSet2.xml"
)
After building, you will find a bunch of ua_xxxx_generated.c and ua_xxxx_generated.h files under the build/src_generated folder.
3. Then, in your program code, include these headers and call
namespace_xxx_nodeset_generated(server);
Please refer to https://github.com/open62541/open62541/tree/master/examples/nodeset
and
http://www.open62541.org/doc/master/nodeset_compiler.html
There are rich examples and code for that.

I am looking for code policy enforcement tool for xml and Python

I have projects that are developed with xml and python code mostly (Odoo modules). There is a bit of .po files for translation and csv fields for data.
I would like to enforce specific policies in xml files, for example:
No duplicate id attributes.
A specific attribute must be present if child elements contain specific tags.
On python, I want to enforce rules like:
Look for SQL queries, and make sure that they use specific parameter methods to prevent SQL injection
Follow a specific naming convention
Some attributes are required in classes that inherit a specific class
I hope that the idea is clear.
Is there any open source solution for this? Preferably linked with github and checks on every commit!
I found a Python package made specifically for this: pylint-odoo.
It can also be installed with pip install pylint-odoo.
An example .pylintrc config file can be found in the OCA web module. They also have another file named .pylintrc-mandatory.
There is even a warning for duplicate XML id attributes, W7902.
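As a minimal sketch, a .pylintrc that loads the plugin and enables that check could look like this (check the option and message names against the pylint-odoo documentation for your version; the OCA files mentioned above are more complete real-world examples):

```ini
[MASTER]
load-plugins=pylint_odoo

[MESSAGES CONTROL]
# Enable selected Odoo checks, e.g. the duplicate XML id warning.
enable=W7902
```

Running pylint with this config on every commit can then be wired into CI (e.g. a GitHub Actions workflow or pre-commit hook).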

How to plugin into grpc-java to modify code generation and add setNameOrClear(null) methods?

We are having too many issues around all this extra code for every database field with regard to
if(databaseObj.getName() != null)
builder.setName(databaseObj.getName());
and I read that Square wired setOrClear methods into protobuf's Java code generation. How do we do this when we also generate using Gradle?
We are using the gradle code from this page right now..
https://github.com/grpc/grpc-java
thanks,
Dean
You can accomplish that via protoc insertion points. When you generate the Java code you will see comments like // @@protoc_insertion_point(...). That is where the insertion will occur.
While appearing useful, this approach has serious drawbacks for .protos used in multiple projects. All projects using the same .proto and in the same language should use the same plugins, otherwise it causes the diamond dependency problem. This is why gRPC did not use this approach and instead generates its classes in separate files from the normal message generation. I strongly discourage against this approach, as it paints you into a corner and you don't know when you will need to "pay the piper."
To insert into a point, your plugin needs to run in the same protoc command-line invocation as the java builtin. Your plugin would then need to set CodeGeneratorResponse.file.insertion_point and content for each file you want to inject code.
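A sketch of what such a plugin sets in its response (stand-in struct for illustration; a real plugin would use CodeGeneratorResponse from google.golang.org/protobuf/types/pluginpb, read the CodeGeneratorRequest from stdin, and write the marshaled response to stdout; the file name, point name, and method body below are hypothetical, so check the // @@protoc_insertion_point comments in your own generated files for the exact names):

```go
package main

import "fmt"

// responseFile mirrors the three CodeGeneratorResponse.File fields a
// plugin sets to inject code at an insertion point.
type responseFile struct {
	Name           string // must match the file the java builtin generates
	InsertionPoint string // the name inside @@protoc_insertion_point(...)
	Content        string // code spliced in at that point
}

func main() {
	f := responseFile{
		Name:           "com/example/MyMessageOuterClass.java",
		InsertionPoint: "builder_scope:example.MyMessage",
		Content: `public Builder setNameOrClear(java.lang.String value) {
  return value == null ? clearName() : setName(value);
}`,
	}
	fmt.Printf("insert into %s at %s\n", f.Name, f.InsertionPoint)
}
```

The key constraint is the one above: the plugin must run in the same protoc invocation as the java builtin, since insertion happens while the Java files are being written.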

Can I override the endpoint prefix set in a Go GRPC client call?

I have a single protobuf which generates out C# and Go code.
The protobuf contains:
syntax = "proto3";
package myprotobuf;
option go_package = "gitlab.example.com/mycompany/myprotobuf.git";
I'm using go-micro and protoc-gen-micro for my Go GRPC. I'm using Go modules for my Go packages. I'm pushing generated Go code to my protobuf repository for a few reasons: (a) Git submodules can be painful to work with (b) a protobuf referencing a type in an external package requires that external package to have a defined absolute package URL and (c) that's how Google do it (ref e.g. structpb) so it seems like that's the "standard".
The C# server / client generated from that proto serve / hit an endpoint at "/myprotobuf.Service/Method", and work fine.
GRPC_TRACE for C# gives:
Decode: ':path: /myprotobuf.Service/Method', elem_interned=1 [1], k_interned=1, v_interned=1 (edited)
The Go / go-micro client calling the C# server gives:
Decode: ':path: /myprotobuf.git.Service/Method', elem_interned=0 [2], k_interned=1, v_interned=0
followed by an error. Note that the path is different. Breakpoints and Console.WriteLine's in the C# GRPC handler never get hit, which makes sense since we're not hitting a known endpoint.
What's the solution for this?
go get seems to require the .git at the end of the package URL.
go modules require the "module" and "package" definitions to match the URL.
C# won't like a "." in the namespace.
So it seems like Go and C# are both always going to prefix the endpoint with what they think the package / namespace is, and they're never going to agree on what the package / namespace should be.
Is there a way to override the namespace prefixed to the GRPC endpoint?
One workaround I've found is to sit the package a level under the protos in a "mypb" directory:
package mypb;
option go_package = "gitlab.example.com/mycompany/myprotobuf.git/mypb";
option csharp_namespace = "MyCompany.Protobuf.MyPB";
It's a bit of a hack, but I don't mind it too much, especially since it sits the generated code out of the way of the proto source that I actually care about. This way the generated C# and Go agree on the namespace / package that they prefix endpoints with. Thankfully the camel case MyPB vs lower case mypb doesn't seem to matter.
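The traces above make sense once you see how the :path is built: gRPC derives it as "/<package>.<Service>/<Method>", so whatever each language believes the package is ends up in the URL. A small illustration (grpcPath is a hypothetical helper written for this sketch, not a library function):

```go
package main

import "fmt"

// grpcPath builds the HTTP/2 :path header gRPC uses for a call:
// "/" + package + "." + service + "/" + method.
func grpcPath(pkg, service, method string) string {
	return fmt.Sprintf("/%s.%s/%s", pkg, service, method)
}

func main() {
	// With the shared "mypb" package, both C# and Go agree:
	fmt.Println(grpcPath("mypb", "Service", "Method"))
	// The mismatched path from the question's GRPC_TRACE corresponds to:
	fmt.Println(grpcPath("myprotobuf.git", "Service", "Method"))
}
```

Once both sides derive the same package name, the paths line up and the handler is found.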

How to dynamically add imports

I want to dynamically create an HTTP router using custom plugins/middleware, currently, I am using a config.yml for example:
plugins:
- waf
- jwt
- cors
After parsing the yml file I create the routes like this:
router := violetear.New()
chain := middleware.New()
for _, plugin := range Plugins {
    chain = chain.Append(plugin)
}
router.Handle("/test", chain.Then(myHandler))
log.Fatal(http.ListenAndServe(":8080", router))
For this to work, I would have to include all the plugins in the import section, something like:
import (
"net/http"
"github.com/nbari/violetear"
"github.com/nbari/violetear/middleware"
// How to deal with this
"github.com/example/waf"
"github.com/example/jwt"
"github.com/example/cors"
)
I would need to change the current config format to be something more useful/generic probably something like:
plugins:
- [github.com/foo, foo]
- [github.com/bar, bar]
But besides that what could be the best approach to "dynamically" create the imports or to generate the code the one later could be compiled?
Any ideas?
Go is a statically linked language. This means if a package is not referenced at compile time (from your .go source files), that package will not be linked / compiled into the executable binary, which implies it will not be available at runtime.
So the easiest is to just use imports.
If you want truly dynamic configuration, plugins introduced in Go 1.8 may be an option for you, but using plugins complicates things and I would only use that as a last resort. Also note that plugins currently only work on Linux.
Related questions:
Dynamic loading in Golang?
Go Plugin variable initialization
go 1.8 plugin use custom interface
