How to encode arguments in AssemblyScript when calling an Aurora contract from the NEAR blockchain?

I'm trying to call a contract located in Aurora from a contract located in NEAR. I'm using AssemblyScript and I'm struggling with passing arguments to the Aurora contract itself. I receive an ERR_BORSH_DESERIALIZE panic from the Aurora contract. Can anybody help me figure out how I should encode the arguments? Here is sample code:
import { ContractPromise, util } from 'near-sdk-as'
import { BorshSerializer } from '@serial-as/borsh'

@serializable
class FunctionCallArgs {
  contract: Uint8Array;
  input: Uint8Array;
}

export function myFunction(): void {
  // contractAddress and abiEncodedFn are defined elsewhere in my contract
  const args: FunctionCallArgs = {
    contract: util.stringToBytes(contractAddress),
    input: util.stringToBytes(abiEncodedFn),
  };
  const argsBorsh = BorshSerializer.encode(args);
  ContractPromise.create("aurora", "call", argsBorsh, 100);
}

I managed to find a solution. The flow of calling the contract was right; however, I had two errors in my implementation:
1. Wrong conversion of the contract address to a 20-byte array. My custom implementation of the function is a bit verbose, so here is a single-line JS script that does the same:
Buffer.from(contractAddress.substring(2), 'hex') // removing 0x prefix is mandatory
"#serial-as/borsh" doesn't deserialize fixed length arrays. So I had to convert contractAddress (which is Uint8Array after converting it to bytes in 1st point) to StaticArray(20), like this:
const contract = hexToBytes(tokenAddress).reduce((memo, v, i) => {
  memo[i] = <u8>v;
  return memo;
}, new StaticArray<u8>(20));
And finally, I monkey-patched the "encode_static_array" method in the library so it doesn't write the length before adding the bytes to the buffer. So I removed:
encode_static_array<T>(value: StaticArray<T>): void {
  ...
  this.buffer.store<u32>(value.length); // remove this line
  ...
}
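For reference, hexToBytes isn't shown above; a minimal AssemblyScript sketch of such a helper (an assumption on my part, for a plain 0x-prefixed hex address string) could look like this:

// Hypothetical helper: convert a 0x-prefixed hex string to a Uint8Array.
function hexToBytes(hex: string): Uint8Array {
  const clean = hex.startsWith('0x') ? hex.substring(2) : hex; // drop the 0x prefix
  const bytes = new Uint8Array(clean.length / 2);
  for (let i = 0; i < bytes.length; i++) {
    bytes[i] = <u8>I32.parseInt(clean.substring(i * 2, i * 2 + 2), 16); // parse each hex pair
  }
  return bytes;
}

With contract typed as StaticArray<u8> and the patched serializer no longer writing a length prefix, the args class from the question would then look roughly like this (a sketch under the same @serial-as/borsh setup, not a verified Aurora layout):

@serializable
class FunctionCallArgs {
  contract: StaticArray<u8>; // fixed 20-byte EVM address, serialized without a length prefix
  input: Uint8Array;         // ABI-encoded call data
}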

Related

In Solidity, can push be used to add an element to a bytes data type? This guy on YT did it, but on the net it says it's only for dynamic arrays.

// SPDX-License-Identifier: MIT
pragma solidity ^0.8.0;

contract dynamicsizedbyte {
    bytes public by1;

    function setvalue() public {
        by1 = "abcdefgh";
    }

    function pushelement() public {
        by1.push(10);
    }
}
I'm getting this error:
TypeError: Member "push" not found or not visible after argument-dependent lookup in bytes storage ref.
The best high-level approach I could come up with is overwriting the whole value with a newly created byte array. Hoping someone finds a more efficient solution; it should be possible using Solidity assembly.
function pushelement() public {
    by1 = abi.encodePacked(by1, bytes1(0x10));
}

How to know which environment/network a NEAR smart-contract is deployed to (AssemblyScript)?

I'm doing some cross contract calls using NEAR and AssemblyScript. I would like to call different accounts based on the environment my smart-contract is deployed to. If the contract is deployed to testnet, I want to call a testnet cross-contract call. If the contract is deployed to mainnet, I want to call a mainnet cross-contract call.
export function callMetaNear(accountId: string): void {
  // how to get correct contract name based on where the contract is deployed?
  let otherContract: string = 'test.testnet';
  if (contractIsDeployedToMainnet) {
    otherContract = 'test.near';
  }

  // cross-contract call example
  const itemArgs: AddItemArgs = {
    accountId,
    itemId: "Sword +9000",
  };
  const promise = ContractPromise.create(
    otherContract,
    "addItem",
    itemArgs.encode(),
    0,
  );
  promise.returnAsResult();
}
I will answer my own question, but I'm not sure if it's the best solution. Better answers are welcome.
I figured we can assume the contract is deployed to mainnet if Context.contractName ends with ".near".
import { Context } from 'near-sdk-core';
...
let otherContract: string = 'test.testnet';
if (Context.contractName.endsWith(".near")) {
  otherContract = 'test.near';
}

How can I mint multiple NFT copies based on the NEP-171 standard?

I'm currently minting tokens like this:
self.tokens.mint(token_id.clone(), account.clone(), Some(token_metadata.clone()));
These are the params that I use to mint new tokens:
'{"token_id":next_tokenid_counter,"account": "'dokxo.testnet'", "token_metadata": { "title": "Some titile", "description": "Some description", "media": "","extra":"","copies":copies_number}}'
That way I can only mint one token with its metadata info, so only one token exists.
I'm looking for a method in NEAR/Rust, like Solidity's, for minting n copies, e.g.:
_mintBatch(address to, uint256[] ids, uint256[] amounts, bytes data)
Any suggestions or examples for this?
The easiest implementation is probably to pre-mint all the tokens in the series.
I don't know Rust, so my example will be in AssemblyScript, and I will call the method nft_mint_series (you can call it whatever you want), following the NEP-171 (and NEP-177 for metadata) standard. We can do something like the following example implementation. I will assume that you have a mint function, which it looks like you already have.
nft_mint_series does one thing: it calls the nft_mint function for all copies. You MUST change the id of each token, but everything else is up to the implementation and logic you want. I also change the title of each token in the method. Though this example is in AssemblyScript, I think it shouldn't be too difficult to find an equivalent in Rust, as it's a simple for loop.
@nearBindgen
export class Contract {
  // Example implementation of how we can mint multiple copies of an NFT
  // Return type is optional. I added it to make it easier to test the function
  nft_mint_series(
    to: string,
    id: string,
    copies: i32,
    metadata: TokenMetadata
  ): Token[] {
    const seriesName = metadata.title; // Store the title in the metadata to change name for each token's metadata.
    const tokens: Token[] = [];
    for (let i = 0; i < copies; i++) {
      const token_id = id + ':' + (i + 1).toString(); // format -> id:edition
      // (optional) Change the title (and other properties) in the metadata
      const title = seriesName + ' #' + (i + 1).toString(); // format -> Title #Edition
      metadata.title = title;
      tokens.push(this.nft_mint(token_id, to, metadata));
    }
    return tokens;
  }
}
The example above is just one simple and straightforward way, and is not necessarily how it should be done in a real application.

Heroku Apollo Server throws "ServerParseError: Unexpected token < in JSON at position 0" only for some queries

I created a GraphQL wrapper for PokeAPI. My queries all work in development fine and most of them work in production. However, I have the following query that works in production for smaller start and end ranges, but throws "ServerParseError: Unexpected token < in JSON at position 0" when I try to query for all of the pokemon with a very large range. This error does not happen in development.
query {
  allPokemon(start: 0, end: 964) {
    id
    name
  }
}
My resolver in my GraphQL for allPokemon only hits one REST endpoint and comes back with an array of objects that have the following structure:
{
  name: "charmander",
  url: "https://pokeapi.co/api/v2/pokemon/4/"
}
My resolver maps over the resulting array to grab the name value and to parse the url value to grab the id number at the end of the string.
Not sure if this is relevant/necessary to include here, but I am using apollo-datasource-rest. I created a class that extends RESTDataSource and abstracts out the functions for my GraphQL resolvers. Then I simply call those methods inside of my resolvers. My getAllPokemon method inside this RESTDataSource class looks like this:
async getAllPokemon(start = 0, end = 964) {
  const response = await this.get(`pokemon?offset=${start}&limit=${end}`);
  const pokemonIds = response.results.map(pokemon =>
    parseUrl(pokemon.url)
  );
  return pokemonIds;
}
parseUrl is a utils function I created that just takes a url and parses it to grab the number at the end of the url after the last /.
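parseUrl itself isn't shown in the question; a minimal sketch of such a helper (assuming URLs shaped like the example above) might be:

// Hypothetical sketch of parseUrl: grab the number at the end of a PokeAPI url,
// e.g. "https://pokeapi.co/api/v2/pokemon/4/" -> 4
const parseUrl = (url) => {
  const segments = url.split('/').filter(Boolean); // drop the empty segment left by the trailing slash
  return Number(segments[segments.length - 1]);    // last segment is the numeric id
};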
Then in my resolvers, I have the following:
const resolvers = {
  Query: {
    allPokemon: (parent, args, { dataSources }) => {
      return dataSources.pokemonAPI.getAllPokemon(args.start, args.end);
    }
  }
}
I can't seem to figure out if this is an issue with Heroku or with Apollo Server. My guess was Heroku, since I have no problems in development getting the expected data for all of the queries. I thought perhaps Heroku must have some limitations as far as timing out or how many iterations of the parsing function it can run, but I have been unable to confirm this theory, let alone find a solution. Any help is appreciated!

C++11 Lambda function compilation error

I am new to using C++11 features and am also trying to use the SDL_Widget-2 lib to build a simple GUI for my project. But I am getting stuck on this problem:
#include "sdl-widgets.h"
class Builder
{
public:
Builder():top_win(nullptr)
,but(nullptr)
{
top_win=new TopWin("Hello",Rect(100,100,120,100),0,0,false,
[]() {
top_win->clear();
draw_title_ttf->draw_string(top_win->render,"Hello world!",Point(20,40));
}
);
but=new Button(top_win,0,Rect(5,10,60,0),"catch me",
[](Button *b) {
static int dy=60;
b->hide();
b->move(0,dy);
b->hidden=false;
dy= dy==60 ? -60 : 60;
});
}
private:
TopWin * top_win;
Button *but;
};
int main(int,char**) {
Builder aViewBuilder;
get_events();
return 0;
}
with the following error at the compilation stage:
In lambda function:
error: 'this' was not captured for this lambda function
error: 'this' was not captured for this lambda function
This error is printed out twice in the console.
I have tried:
[this](){}
[=](){}
and
[&](){}
with different compile errors, but I cannot get any further.
Can anyone see a fix?
You do need to capture with [this] or [&]. I suspect that the TopWin and Button constructors take raw function pointers, and need to take std::functions instead.
A plain vanilla function pointer is not compatible with capturing lambdas. std::function is able to work like a function pointer that also allows safe storage of captured data. (i.e. the captured objects will need to be properly copied or destroyed when the function object is itself copied or destroyed)
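For illustration only (the real sdl-widgets constructor signatures may differ), the difference between the two parameter types looks roughly like this:

#include <functional>

// Hypothetical widget taking a raw function pointer:
// a lambda that captures `this` has no conversion to void(*)().
struct WidgetWithPointer {
    WidgetWithPointer(void (*)()) {}
};

// Hypothetical widget taking std::function:
// any callable fits, including capturing lambdas.
struct WidgetWithFunction {
    WidgetWithFunction(std::function<void()>) {}
};

struct Builder {
    int value = 42;
    void build() {
        // WidgetWithPointer w1([this]() { value++; }); // error: capturing lambda cannot convert to a function pointer
        WidgetWithFunction w2([this]() { value++; });    // OK: the capture is stored inside std::function
    }
};

int main() {
    Builder b;
    b.build();
}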
