gRPC Go generated .pb.go import was not formatted - go

I imported a proto file (validator.proto) from one of my projects, https://github.com/maanasasubrahmanyam-sd/customValidation, into another project (test.proto), https://github.com/maanasasubrahmanyam-sd/customeValTest/tree/master/interfaces/test_server.
go get github.com/maanasasubrahmanyam-sd/customValidation
protoc \
-I. \
-I $GOPATH/src/ \
--proto_path=${GOPATH}/pkg/mod/github.com/envoyproxy/protoc-gen-validate@v0.1.0 \
--proto_path=${GOPATH}/src/github.com/google/protobuf/src \
--go_out="plugins=grpc:./generated" \
--validate_out="lang=go:./generated" \
--govalidators_out=. \
./interfaces/test_server/*.proto
The correct import should come out as github.com/maanasasubrahmanyam-sd/customValidation/validator, but in test.pb.go the import comes out as _ "./validator", which GoLand underlines in red.
EDIT - All the .pb.go files are showing errors; I suspect it is due to the bad import.
I googled it but did not find any relevant information. Any suggestions, experts?

You can address the proto path in two ways.
One: if the imported proto file is local, move it under a shared parent directory and reference it from that parent path, like this:
- parentDirectory
-- directory1
--- proto1.proto
-- importDirectory
--- proto2.proto
You can build this file (proto1.proto) with this command:
protoc --proto_path=parentDirectory/directory1 --proto_path=parentDirectory --go-grpc_out=***your output path*** --go_out=***your output path*** parentDirectory/directory1/proto1.proto
Also, if you use GoLand, you need to add parentDirectory to your settings (File | Settings | Languages & Frameworks | Protocol Buffers): uncheck "Configure automatically" and add your parent path.
Two: if your imported proto file is referenced by a URL-style import path, you can map it in your build command like this:
protoc --proto_path=src \
--go_opt=Mprotos/buzz.proto=example.com/project/protos/fizz \
--go_opt=Mprotos/bar.proto=example.com/project/protos/foo \
protos/buzz.proto protos/bar.proto
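Applied to the setup in the question, a minimal sketch of that same mapping might look like this (the assumption is that test.proto imports the file simply as "validator.proto"; adjust the M key to the exact import string actually used):
protoc -I. \
--go_out="plugins=grpc,Mvalidator.proto=github.com/maanasasubrahmanyam-sd/customValidation/validator:./generated" \
./interfaces/test_server/*.proto
Alternatively, declaring option go_package = "github.com/maanasasubrahmanyam-sd/customValidation/validator"; inside validator.proto makes the generated import resolve correctly without any M flags.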

Related

The line openapiclient "github.com/GIT_USER_ID/GIT_REPO_ID" added when generating with openapi-generator prompts a login

I am using OpenAPI Generator to generate my REST API client. It generates the line
openapiclient "github.com/GIT_USER_ID/GIT_REPO_ID"
in my imports, but I can't for the life of me understand why. Running go mod vendor prompts me to sign in while this line is in place. What is this trying to import? I'm on an enterprise GitHub, which complicates things. The example README says to add this line but provides no explanation of what it does: https://github.com/OpenAPITools/openapi-generator/blob/master/samples/openapi3/client/petstore/go/go-petstore/README.md#:~:text=%22github.com/GIT_USER_ID/GIT_REPO_ID%22
You can pass the Git user ID and repo ID to the OpenAPI Generator CLI as parameters:
openapi-generator-cli generate \
-i openapi.yaml \
-g go \
-p packageName=mypackage \
-o /src \
--git-repo-id my-go-lib/v1 --git-user-id user1
That will result in:
openapiclient "github.com/user1/my-go-lib/v1"

AsyncAPI: Only generate payload

Is it possible to skip generation of specific files using asyncapi-generator?
I am using the Go generator but I only need the payload.go. Right now it always generates all files:
handlers.go payloads.go publishers.go router.go server.go subscribers.go
The command I am using is:
$ docker run --rm -it \
-v ${PWD}/asyncapi.yaml:/app/asyncapi.yml \
-v ${PWD}/output:/app/output \
asyncapi/generator -o /app/output /app/asyncapi.yml @asyncapi/go-watermill-template --force-write
You cannot selectively generate only some of the files yet. I encourage you to join the related discussion on GitHub.
From what I understand, you are only interested in model generation, so you could just use the Modelina tool directly; it is what go-watermill-template uses under the hood.
Modelina is already integrated with the AsyncAPI CLI, so you can run asyncapi generate models golang asyncapi.yml.
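A rough sketch of that command with an output directory (the --output flag is from my recollection of the CLI and the ./models path is an assumption; check asyncapi generate models --help for your version):
# Generate only the Go models from the AsyncAPI document into ./models
asyncapi generate models golang ./asyncapi.yml --output ./models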

Alpine abuild fails with error "builddir missing"

I have a fairly simple APKBUILD file, fetched from the original repository and altered in just one place to build a static library instead of a dynamic one. However, I'm new to Alpine and APKBUILD, and I'm stuck on what is probably a fairly simple problem to fix, but I couldn't quite figure out where it comes from. When I build using abuild -r -F I get the error:
>>> fc-abseil-cpp: Unpacking /var/cache/distfiles/abseil-cpp-20211102.0.tar.gz...
>>> ERROR: fc-abseil-cpp: Is $builddir set correctly?
>>> ERROR: fc-abseil-cpp: prepare failed
The APKBUILD file is this one:
# Contributor: Bart Ribbers <bribbers@disroot.org>
# Maintainer: Duncan Bellamy <dunk@denkimushi.com>
pkgname=fc-abseil-cpp
pkgver=20211102.0
pkgrel=1
pkgdesc="Abseil Common Libraries (C++) "
url="https://abseil.io/"
arch="all"
license="Apache-2.0"
makedepends="
    cmake
    gtest-dev
    linux-headers
"
subpackages="$pkgname-dev"
source="https://github.com/abseil/abseil-cpp/archive/$pkgver/abseil-cpp-$pkgver.tar.gz
    0002-abseil.patch
"
build() {
    cmake -B build \
        -DCMAKE_CXX_STANDARD=17 \
        -DCMAKE_BUILD_TYPE=MinSizeRel \
        -DCMAKE_INSTALL_PREFIX=/usr/local \
        -DCMAKE_INSTALL_LIBDIR=lib \
        -DBUILD_SHARED_LIBS=OFF \
        -DBUILD_TESTING=ON \
        -DABSL_USE_EXTERNAL_GOOGLETEST=ON \
        -DABSL_PROPAGATE_CXX_STD=ON \
        -DABSL_FIND_GOOGLETEST=ON
    cmake --build build
}
# disable broken tests
check() {
    CTEST_OUTPUT_ON_FAILURE=TRUE ctest --test-dir build -E "absl_str_format_convert_test|absl_mutex_test\
|absl_notification_test|absl_per_thread_sem_test|absl_sysinfo_test|absl_random_beta_distribution_test"
}
package() {
    DESTDIR="$pkgdir" cmake --install build
}
sha512sums="
fed68aa434c02ec6faa9d1c81f1ad35b60ec024b44957e2e0ac31e6075e385c06a3e1b616afeb4bb7c2413191fd7827d82e1f9f0796b52ed21fb2c41dd9031cf abseil-cpp-20211102.0.tar.gz
78bca9372af30624a303b53cbc07b4bfe0ca5a11ef2126c6b3fb34714e3b119fa4bf9a088968b491a7823107df5083c0d4b4aed0e47b8e872ba572543e9a52ea 0002-abseil.patch
"
I've tried to set a builddir variable directly within the APKBUILD file, as well as exporting it in the shell. What's the problem here?
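For reference, when builddir is not set abuild guesses it as "$srcdir/$pkgname-$pkgver". Since the question does not show how the variable was set, the following is only an assumption: renaming the package to fc-abseil-cpp breaks that guess, because the tarball still unpacks to abseil-cpp-$pkgver, so builddir has to be set explicitly near the top of the APKBUILD:
# Assumption: the tarball unpacks to abseil-cpp-$pkgver, not fc-abseil-cpp-$pkgver,
# so the default builddir guess no longer matches after the rename.
builddir="$srcdir/abseil-cpp-$pkgver"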

Start Relay Chain error "substrate: command not found"

I am trying to do a substrate tutorial, "Start your relay chain".
(https://docs.substrate.io/tutorials/v3/cumulus/start-relay/)
Here, I copied the code and ran it to start the Alice validator.
./target/release/polkadot \
--alice \
--validator \
--base-path /tmp/relay/alice \
--chain <path to spec json> \
--port 30333 \
--ws-port 9944
"path to spec json" - What should I replace this?
I tried to build a chain spec file like this.
substrate build-spec > myCustomSpec.json
The error is: "substrate: command not found".
What's the problem? Can anyone help me?
Thank you.
The polkadot binary is used to create the chain spec, as described in the tutorial. You don't have a substrate binary installed, and it would be the wrong tool to use here anyway.
See the tutorial's sections on plain and raw chain spec generation.
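A sketch of that spec-generation step, following the tutorial's flow (the rococo-local chain name and the output file names are assumptions taken from the tutorial; adjust them to your setup):
# Generate a plain chain spec with the polkadot binary you already built,
# then convert it to raw; point --chain at the resulting raw file.
./target/release/polkadot build-spec --chain rococo-local --disable-default-bootnode > rococo-custom-plain.json
./target/release/polkadot build-spec --chain rococo-custom-plain.json --raw --disable-default-bootnode > rococo-custom-raw.json
With that, the --chain flag in the alice command above would point at rococo-custom-raw.json.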

Elasticsearch standalone JDBC river feeder missing main class

I'm trying to set up the feeder following these instructions: https://github.com/jprante/elasticsearch-jdbc#installation
I downloaded and unzipped the feeder
I don't quite understand this step:
run script with a command that starts org.xbib.tools.JDBCImporter with the lib directory on the classpath
What am I supposed to do?
If I try to run a sample script from bin I get:
Bad substitution
Error: Could not find or load main class org.xbib.elasticsearch.plugin.jdbc.feeder.Runner
Where do I get the Java classes org.xbib.elasticsearch.plugin.jdbc.feeder.Runner and org.xbib.elasticsearch.plugin.jdbc.feeder.JDBCFeeder?
I figured out the solution: it was to set the installation folder in the script (not the Elasticsearch folder but the JDBC importer folder!).
#!/bin/bash
#JDBC Directory -> important, change accordingly!
export JDBC_IMPORTER_HOME=~/Downloads/elasticsearch-jdbc-1.6.0.0
bin=$JDBC_IMPORTER_HOME/bin
lib=$JDBC_IMPORTER_HOME/lib
echo '{
...
...
}
}' | java \
-cp "${lib}/*" \
-Dlog4j.configurationFile=${bin}/log4j2.xml \
org.xbib.tools.Runner \
org.xbib.tools.JDBCImporter
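As a side note on the "Bad substitution" message: that error typically appears when a bash script is run by a plain POSIX sh. Invoking the script through bash explicitly avoids it (the script name below is just a placeholder):
# Run the importer script with bash rather than sh to avoid "Bad substitution"
bash bin/your-import-script.sh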
