Best practices for sharing contracts when using Google Protocol Buffers? - protocol-buffers

What are some effective ways to share contracts between peer applications, when using Google Protocol Buffers as the transport? Have any best practices emerged?

If you are talking between different platforms, your best bet is simply to put the .proto schema definition somewhere accessible - it could be documentation, it could be a download. Each platform can generate its code from there.
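For concreteness, the shared contract is just a plain .proto file; each platform runs protoc against it to generate bindings in its own language. The package and message names below are purely illustrative:

```proto
syntax = "proto3";

package demo;

// Shared contract: publish this file (docs page, download, or a
// shared repository) and let each peer generate its own bindings,
// e.g. protoc --java_out=. person.proto  or  --python_out=.
message Person {
  string name  = 1;
  int32  id    = 2;
  string email = 3;
}
```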

Related

Is there any other way to interact with Ethereum's smart contracts via UI besides Etherscan?

I'm aware of Etherscan's capability for interacting with smart contracts on the Ethereum network, but I wonder if there is any other way to read from and write to smart contracts.
I'd expect improved UI/UX, such as input validation and documentation on top of the contract, yet I couldn't find any other service providing it.
You could use https://remix.ethereum.org/
There is no service I know of that provides documentation on top of the contract.
But it's possible to develop one. Are you interested in how it could be done?
The only one I know of is Remix. It is a great tool for smart contract testing and interaction.
If you are planning to develop your own UI with an API, this is not an exact solution, but check out Drizzle. It has some good built-in features that will get you started on the front-end parts and on showing blockchain data.
Both tools presented below load the ABI automatically from the contract address.
eth95.dev
This one looks like an old Windows 95 app. Pretty cool.
https://eth95.dev/
mycrypto.com
https://app.mycrypto.com/interact-with-contracts

How to share Protobuf definitions for gRPC?

Since you have to share .proto files to define data and services for gRPC, the service provider and clients need access to the same .proto files. Is there a common strategy for distributing these files? I want to avoid every project keeping its own copy of the .proto files in its Git repository, with team members manually editing them or sharing them via email.
Is there any common best practice?
Unfortunately, there is no single common practice, although the main goal you should aim for is to store the proto files in one version-control repository.
During my investigation, I've found some interesting blog posts about that subject:
https://www.bugsnag.com/blog/libraries-for-grpc-services
https://www.crowdstrike.com/blog/improving-performance-and-reliability-of-microservices-communication-with-grpc/
https://medium.com/namely-labs/how-we-build-grpc-services-at-namely-52a3ae9e7c35
They cover many gRPC workflow considerations. Hope that helps!
In terms of best practices for sharing gRPC definitions, I would suggest using the gRPC Server Reflection Protocol (https://github.com/grpc/grpc/blob/master/doc/server-reflection.md) instead of sharing the files themselves.
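As a sketch of how reflection removes the need to distribute .proto files: once the server registers the reflection service (in Go, `reflection.Register(grpcServer)` from `google.golang.org/grpc/reflection`), a generic client such as grpcurl can discover and call services at runtime. The address, package, and service names below are made up; a real server must be running for these commands to succeed:

```shell
# List all services the server exposes, with no local .proto files.
grpcurl -plaintext localhost:50051 list

# Describe a service discovered above (hypothetical name).
grpcurl -plaintext localhost:50051 describe my.package.Greeter

# Invoke a method with a JSON request body.
grpcurl -plaintext -d '{"name": "world"}' \
  localhost:50051 my.package.Greeter/SayHello
```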

Proxy API paths

I'm building an API using Koa and have read some best practices on versioning. This answer pointed out that versions should be hidden from the client.
My question is, how would I go about doing this? I've read some mentions of using an API proxy. Would I use something like Squid as a reverse proxy, or are there better Node/Koa-specific solutions for this type of work?
I think GraphQL is a great tool for avoiding API versioning pain.
Yes, at some point it breaks the REST philosophy, but it gives you flexibility.
All you need to build a flexible API with no worries about versions is Koa, Objection, and GraphQL.

How to make a chatbot in Golang with dynamic plugins and adapters?

I'm new to Golang and writing a chat bot as an exercise.
Basically I'm using net/http and gorilla/mux to handle requests.
For now it can only talk to one specific chat platform (I call it an adapter) and has only one plugin (find a picture on google).
How can I make both adapters and plugins be dynamic - so other developers can write their own stuff and just use my bot as a base platform? Are there any good examples?
Also should I have all the plugins and adapters in one repo/static binary or should they be separate? I know I can do both ways but what would you recommend as a better way to have easier collaboration and extensibility?
The Medium post "Standard Package Layout" by Ben Johnson helped me think about this.

Are there any good instant message APIs for the Mac?

Just curious: if you were to build an instant message client for the Mac, what existing API or service would you use to handle the transfer of messages from one user to another? I am looking for something that can be used in conjunction with Objective-C and is compatible with other popular messaging services such as MSN, Yahoo, AIM, Google Talk, etc. I don't want to host the service, but rather connect to existing services and use their "pipes".
Thanks
There are many instant-messenger protocols out there.
There is a good bet you could find a Java API for whichever protocol you would like to use, such as the XMPP Java API.
Or, for C or C++, you could use the libpurple library.
Your question lacks a lot of information, so it's rather hard to answer. Please add some details about your requirements: what protocols do you need, what functionality, and what development language are you using?
As a start:
Adium has been released under the GPL, so you can use its code in your own projects as long as the license fits your needs.
http://trac.adium.im/
Another option: if you don't want to implement multiple networks but prefer a single protocol where the server provides gateways to other networks, you can also check out the Jabber/XMPP libraries available for the Mac.
http://www.google.de/search?q=jabber+library+mac&ie=utf-8&oe=utf-8&aq=t&rls=org.mozilla:de:official&client=firefox-a