Decode Tendermint base64-encoded transactions in Golang

I call the Tendermint endpoint /unconfirmed_txs to get pending transaction data, and I receive a list of base64-encoded transactions like this:
CsQECpAECh8vZXRoZXJtaW50LmV2bS52MS5Nc2dFdGhlcmV1bVR4EuwDCqUDChovZXRoZXJtaW50LmV2bS52MS5MZWdhY3lUeBKGAwgCEg01MDc3OTQ2NTQxMzY0GPbZDCIqMHgxNDU4NjNFYjQyQ2Y2Mjg0N0E2Q2E3ODRlNjQxNkMxNjgyYjFiMkFlKhU0MDAwMDAwMDAwMDAwMDAwMDAwMDAy5AF/82q1AAAAAAAAAAAAAAAAAAAAAAAAAAAAfvEOe1XqTQ9Ynq8AAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAgAAAAAAAAAAAAAAAAK5tNYq1CNfuvRh/6NqF6Zzkg6PtAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAMd7WUoAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAgAAAAAAAAAAAAAAAFx/ilcNV47YTmP9+nse5y3q4a4jAAAAAAAAAAAAAAAA3XPeoQq8K/+ZxgiC7FsrgbsdxbI6AVZCIBX0RwQW5LaC1LZG0W5mHid2Dsx/8FQn0r9IhHrWBAVxSiBLOhtk5OAdcCGBHaLUHK8rKEpL1bgtBRYxG/if+UesIxpCMHhlZGRjOWMxMWVkY2Q1NTY4YWZlOWM0Y2ZmYWNjMmU4ZjBhMGNlNjFiMDczMDE4OTQ1NzEyZDMwNTU3ZDBjMTUw+j8uCiwvZXRoZXJtaW50LmV2bS52MS5FeHRlbnNpb25PcHRpb25zRXRoZXJldW1UeBImEiQKHgoHYmFzZWNybxITMTA1NjgxMjA3ODI5NTU5Mjk1MhD22Qw=
How can I decode these transactions in Golang to get all the transaction data (ideally into go-ethereum's Transaction struct)?
After some research I understood that I have to use a codec and register types on it with some library functions, but I haven't found a good code example for my use case and don't really understand how it works.
I also tried the library github.com/calvinlauyh/cosmosutils to decode transactions, but I get the error: "unable to resolve type URL /ethermint.evm.v1.MsgEthereumTx: tx parse error"
Thanks !

The transaction data is base64-encoded; the JavaScript snippet below decodes it and computes the transaction hash.
const { sha256 } = require("@cosmjs/crypto")
const { toHex } = require("@cosmjs/encoding")
const base64EncodedTrx ="CpIBCo8BChwvY29zbW9zLmJhbmsudjFiZXRhMS5Nc2dTZW5kEm8KLWNvc21vczEyeHQ0eDQ5cDk2bjlhdzR1bWp3eXAzaHVjdDI3bndyMmc0cjZwMhItY29zbW9zMXV0MncwbTN4YTd6MnJ2bmR2MjNwdGh2OXFjN2hrc3g2dGtmOXVxGg8KBXVhdG9tEgYyNTAwMDASaQpRCkYKHy9jb3Ntb3MuY3J5cHRvLnNlY3AyNTZrMS5QdWJLZXkSIwohAyhu2k5/x5GSSsNDGaahT1pMDK7Yk65LnRJo81IH166REgQKAggBGKMdEhQKDgoFdWF0b20SBTczMzU1EIu9BBpAM6HTxgkO1dRiuVU3TD23MzfKcyWOxhM4VlZKulLmFy4dcbpkaauXeuOlptAl9sbRKGozVr0Z87VPT/LIQCBrDw==";
const sha256v = sha256(Buffer.from(base64EncodedTrx, 'base64'));
const txHash = toHex(sha256v)
// transaction hash
console.log(txHash)
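Since the question asks for Go, the same computation can be sketched there. Note this is only a partial answer: it gives you the transaction hash (SHA-256 of the raw bytes, which Tendermint displays as uppercase hex), not the decoded fields — fully decoding MsgEthereumTx still requires registering the ethermint proto types in a codec. `txHash` is a hypothetical helper name, shown here with a tiny placeholder payload:

```go
package main

import (
	"crypto/sha256"
	"encoding/base64"
	"encoding/hex"
	"fmt"
	"strings"
)

// txHash decodes a base64-encoded Tendermint transaction and returns
// its hash: SHA-256 over the raw bytes, hex-encoded, uppercased the
// way Tendermint RPC reports it.
func txHash(b64Tx string) (string, error) {
	raw, err := base64.StdEncoding.DecodeString(b64Tx)
	if err != nil {
		return "", err
	}
	sum := sha256.Sum256(raw)
	return strings.ToUpper(hex.EncodeToString(sum[:])), nil
}

func main() {
	// "aGVsbG8=" is a placeholder; substitute one of the base64
	// strings returned by /unconfirmed_txs.
	h, err := txHash("aGVsbG8=")
	if err != nil {
		panic(err)
	}
	fmt.Println(h)
}
```

This hash is what you would look up with /tx to fetch the confirmed transaction later.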

Related

Why is the signature verification not working when the signature is constructed by node-forge?

I have a Nuxt application that needs to retrieve some information from a Spring Boot-based auth service.
Right now I sign a text message on the Nuxt app (the auth server is aware of that text message), using node-forge, and then I send it encrypted and with the signature for verification on the auth service.
The problem is that the auth service keeps telling me that the size of the signature is wrong, with a java.security.SignatureException: Signature length not correct: got 3XX but was expecting 256.
Here is the code generating the encrypted message and signature on the Nuxt side:
var md = forge.md.sha256.create();
md.update("123"); // for example purposes
var sign = pPrivateKey.sign(md);
var digestBytes = md.digest().bytes();
console.log("Signature:", sign );
console.log("Encrypted:", digestBytes);
console.log("Encrypted B64:", Buffer.from(digestBytes).toString("base64"));
var keyAuthB64Url = Buffer.from(digestBytes).toString("base64url");
var signB64Url = Buffer.from(sign).toString("base64url");
var jwt = await axios.get(process.env.URL + "/auth", { params: { encrypted: keyAuthB64Url, signature: signB64Url } });
On the auth service I have the following code:
byte[] messageBytes = Base64.getUrlDecoder().decode(encryptedMessage);
byte[] signatureBytes = Base64.getUrlDecoder().decode(signature);
Signature sign = Signature.getInstance("SHA256withRSA");
sign.initVerify(certPublicKey);
sign.update(messageBytes);
boolean verified = sign.verify(signatureBytes);
if (!verified) {
throw new Exception("Not verified!");
}
From all the debugging I have done, it seems like the Spring Boot app has a problem with the signature generated by node-forge on the Nuxt side, with a signature generated in the Spring Boot app the verification works.
There are several issues:
First, the bug that was already mentioned in the comment: While the NodeJS code does not hash implicitly, the Java side does. Therefore, hashing must not be done explicitly on the Java side:
byte[] messageBytes = "123".getBytes("utf-8");
...
sign.update(messageBytes); // Fix 1: Don't hash
Also, in the NodeJS code, sign() and digest().bytes() return binary strings, which must therefore be imported into a Node.js Buffer with the 'binary' encoding:
var keyAuthB64Url = Buffer.from(digestBytes, "binary").toString("base64url"); // Fix 2: Import via 'binary' encoding
var signB64Url = Buffer.from(sign, "binary").toString("base64url");
Without explicit specification of the encoding, a UTF-8 encoding is performed by default, which irreversibly corrupts the data.
And third, latin1 is implicitly used as encoding when generating the hash in the NodeJS code. Other encodings must be specified explicitly, e.g. for the common UTF-8 with utf8:
md.update("123", "utf8"); // Fix 3: Specify the encoding
For the example data 123 used here, this fix has no effect, which changes as soon as characters with a Unicode value larger than 0x7f are included, e.g. 123§. Note that there is little margin for error in the specification of the encoding, e.g. utf-8 would be ignored (because of the hyphen) and latin1 would be used silently.
With these fixes, verification with the Java code works.

How to upload an image in chunks with client-side streaming gRPC using grpcurl

I have been trying to upload an image in chunks with client-side streaming using grpcurl. The service runs without error, except that at the server the received image data is 0 bytes.
The command I am using is:
grpcurl -proto image_service.proto -v -d @ -plaintext localhost:3010 imageservice.ImageService.UploadImage < out
This link mentions that the chunk data should be base64-encoded, so the contents of my out file are:
{"chunk_data": "<base64 encoded image data>"}
This is exactly what I am trying to achieve, but using grpcurl.
Please tell me what is wrong with my command and what the best way is to stream via grpcurl.
I have 2 more questions:
Does gRPC handle the splitting of data into chunks?
How can I first send a metadata chunk (of the ImageInfo type) and then the actual image data via grpcurl?
Here is my proto file:
syntax = "proto3";

package imageservice;

import "google/protobuf/wrappers.proto";

option go_package = "...";

service ImageService {
  rpc UploadImage(stream UploadImageRequest) returns (UploadImageResponse) {}
}

message UploadImageRequest {
  oneof data {
    ImageInfo info = 1;
    bytes chunk_data = 3;
  }
}

message ImageInfo {
  string unique_id = 1;
  string image_type = 2;
}

message UploadImageResponse {
  string url = 1;
}
Interesting question. I've not tried streaming messages with (the excellent) grpcurl.
The documentation does not explain how to do this but this issue shows how to stream using stdin.
I recommend you try it that way first to ensure that works for you.
If it does, then bundling various messages into a file (out) should also work.
Your follow-on questions suggest you're approaching this incorrectly:
chunk_data is the result of having split the image file into chunks, i.e. each of these base64-encoded strings should be a subset (a chunk) of your overall image file.
Your first message should be { "info": { ... } } (an ImageInfo), and subsequent messages will be { "chunk_data": "<base64-encoded chunk>" } until EOF.
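To make the expected file content concrete, here is a small Go sketch (chunkMessages is a hypothetical helper, and the JSON field names are taken from the proto above) that splits raw image bytes into the sequence of messages grpcurl would read from the out file, one JSON object per message:

```go
package main

import (
	"encoding/base64"
	"encoding/json"
	"fmt"
)

// chunkMessages builds grpcurl-ready JSON messages for the streaming
// UploadImage RPC: first an "info" message, then one "chunk_data"
// message per chunk of the image bytes.
func chunkMessages(uniqueID, imageType string, data []byte, chunkSize int) []string {
	msgs := []string{
		fmt.Sprintf(`{"info":{"unique_id":%q,"image_type":%q}}`, uniqueID, imageType),
	}
	for off := 0; off < len(data); off += chunkSize {
		end := off + chunkSize
		if end > len(data) {
			end = len(data)
		}
		// Proto "bytes" fields are represented as base64 in JSON.
		b, _ := json.Marshal(map[string]string{
			"chunk_data": base64.StdEncoding.EncodeToString(data[off:end]),
		})
		msgs = append(msgs, string(b))
	}
	return msgs
}

func main() {
	// A fake 5-byte "image" split into 2-byte chunks; redirect this
	// output into the out file used by the grpcurl command.
	for _, m := range chunkMessages("img-1", "png", []byte{1, 2, 3, 4, 5}, 2) {
		fmt.Println(m)
	}
}
```

Note that gRPC itself does not split your payload: each JSON object becomes one stream message, so the chunking is entirely your responsibility.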

CAPL Multiframe handling

I am writing a CAPL script for diagnostic request and response. I can get a response if the data is up to 8 bytes; if the data is multiframe I am not getting a response, and the message in the trace is "Breaking connection between server and tester". How do I handle this? I know about the CAN-TP frames, but in this case it should be handled by CAN/CANoe.
Please read up on the CANoe ISO-TP protocol. In the case of a multiframe response, the tester has to send a flow-control frame (usually 0x30), which provides synchronization between sender and receiver. It also has fields for the block size of consecutive frames and the separation time. Try the CAPL code below.
variables
{
  message 0x710 msg = { dlc = 8, dir = rx };
  byte check_byte0;
}

on message 0x718
{
  check_byte0 = this.byte(0) & 0x30;
  if (check_byte0 == 0x10)    // first frame of a multiframe response
  {
    msg.dword(0) = 0x30;      // flow control: clear to send, BS = 0, STmin = 0
    msg.dword(4) = 0x00;
    output(msg);
  }
}
I was trying to send the request over a message ID in raw form, like 22 XX YY, which is a read-DID request. This works well if the response is less than 8 bytes; if the response is more than 8 bytes it won't work, so we need to use the diagnostic objects for the request and response as defined in the CDD (or other description file) used in the project.
If you are not using a CDD, you need to use CCI (CAPL callback interfaces) in such cases; that is mostly necessary for simulation setups.

Sending an email with ruby gmail api v0.9

Does anyone have a simple example of how to send an email from scratch with the v0.9 API?
I simply want an example of sending the following:
m = Mail.new(
  to: "test1@test.com",
  from: "test2@test.com",
  subject: "Test Subject",
  body: "Test Body")
Now, to create the message object that is required to send, we can use:
msg = Base64.urlsafe_encode64 m.to_s
And then try to send (where message_object = msg):
client = Google::Apis::GmailV1::GmailService.new #Appropriately authorised
client.send_user_message("me", message_object)
The client wants an RFC822 compatible encoded string, which the above should be.
I've tried:
message_object = msg
=> Google::Apis::ClientError: invalidArgument: 'raw' RFC822 payload message string or uploading message via /upload/* URL required
message_object = raw:msg
=>ArgumentError: unknown keyword: raw
message_object = {raw:msg}
=>ArgumentError: unknown keyword: raw
message_object = Google::Apis::GmailV1::Message.new(raw:msg)
=> #<Google::Apis::GmailV1::Message:0x007f9158e5b4b0 #id="15800cd7178d69a4", #thread_id="15800cd7178d69a4">
#But then I get Bounce <nobody@gmail.com> - An error occurred. Your message was not sent.
i.e. None of them work...
Sending the basic encoded string (msg above) through the Gmail API interface tester here works.
I'm obviously missing something obvious here as to how to construct that object required to make it work through the API.
OK, so the answer (thanks for all your help, Holger) was that the documentation is wrong: it asks you to encode to base64, but the base64 encoding is not required (it is done internally by the API client).
The correct way to send is
msg = m.encoded
# or m.to_s
# this doesn't base64 encode. It just turns the Mail::Message object into an appropriate string.
message_object = Google::Apis::GmailV1::Message.new(raw:m.to_s)
client.send_user_message("me", message_object)
Hope that saves someone else from being patronised by an overzealous mod.
Carpela's answer works fine, but the message_object is missing "raw" in Message.new. The correct code should be as follows:
message_object = Google::Apis::GmailV1::Message.new(raw: m.encoded) # or m.to_s
client.send_user_message('me', message_object)

Importing binary data to parse.com

I'm trying to import data to parse.com so I can test my application (I'm new to Parse and I've never used JSON before).
Can you please give me an example of a JSON file that I can use to import binary files (images)?
NB: I'm trying to upload my data in bulk directly from the Data Browser. Here is a screencap: i.stack.imgur.com/bw9b4.png
In the Parse docs, I think two sections could help you, depending on whether you want to use the REST API or the Android SDK:
REST API - see the section on POST, uploading files that can be uploaded to Parse using a REST POST.
SDK - see the section on "files".
Code for REST includes the following: use an HttpClient implementation that has a "ByteArrayEntity" class (or something similar), map your image to a ByteArrayEntity, and POST it with the correct MIME-type headers.
case POST:
  HttpPost httpPost = new HttpPost(url); // url ends with "audio" or "pic"
  httpPost.setProtocolVersion(new ProtocolVersion("HTTP", 1, 1));
  httpPost.setConfig(this.config);
  if (mfile.canRead()) {
    FileInputStream fis = new FileInputStream(mfile);
    FileChannel fc = fis.getChannel(); // get the file's size, then map it into memory
    int sz = (int) fc.size();
    MappedByteBuffer bb = fc.map(FileChannel.MapMode.READ_ONLY, 0, sz);
    byte[] data2 = new byte[bb.remaining()];
    bb.get(data2);
    ByteArrayEntity reqEntity = new ByteArrayEntity(data2);
    httpPost.setEntity(reqEntity);
    fis.close();
  }
  // ...
  request.addHeader("Content-Type", "image/*");
Then post a Runnable to execute the HTTP request off the main thread.
The only binary data allowed to be uploaded to parse.com is images. For other cases, like files or streams, the most suitable solution is to store the binary data in a dedicated external store and keep a link to it in Parse.
