How to run near-js-api tests against a localnet with multiple contracts - nearprotocol

I have set up node:
nearup run localnet --binary-path ~/code/nearcore/target/release
I am trying to run a jest test case:
beforeAll(async function () {
// NOTE: nearlib and nearConfig are made available by near-cli/test_environment
const near = await nearlib.connect(nearConfig)
})
However, there is an obvious step missing: how do you create test accounts on your local node? This is the error:
● Test suite failed to run
Can not sign transactions for account test.near, no matching key pair found in Signer.
at node_modules/near-api-js/lib/account.js:83:23
at Object.exponentialBackoff [as default] (node_modules/near-api-js/lib/utils/exponential-backoff.js:7:24)
at Account.signAndSendTransaction (node_modules/near-api-js/lib/account.js:80:24)
at Account.createAndDeployContract (node_modules/near-api-js/lib/account.js:176:9)
at LocalTestEnvironment.setup (node_modules/near-cli/test_environment.js:39:9)
It also looks like near-api-js is hardcoded to deploy only one contract, and I need to test multiple contracts. How can I deploy multiple contracts? From near-cli's test_environment.js:
class LocalTestEnvironment extends NodeEnvironment {
constructor(config) {
super(config);
}
async setup() {
this.global.nearlib = require('near-api-js');
this.global.nearAPI = require('near-api-js');
this.global.window = {};
let config = require('./get-config')();
this.global.testSettings = this.global.nearConfig = config;
const now = Date.now();
// create random number with at least 7 digits
const randomNumber = Math.floor(Math.random() * (9999999 - 1000000) + 1000000);
config = Object.assign(config, {
contractName: 'test-account-' + now + '-' + randomNumber,
accountId: 'test-account-' + now + '-' + randomNumber
});
const keyStore = new nearAPI.keyStores.UnencryptedFileSystemKeyStore(PROJECT_KEY_DIR);
config.deps = Object.assign(config.deps || {}, {
storage: this.createFakeStorage(),
keyStore,
});
const near = await nearAPI.connect(config);
const masterAccount = await near.account(testAccountName);
const randomKey = await nearAPI.KeyPair.fromRandom('ed25519');
const data = [...fs.readFileSync('./out/main.wasm')];
await config.deps.keyStore.setKey(config.networkId, config.contractName, randomKey);
await masterAccount.createAndDeployContract(config.contractName, randomKey.getPublicKey(), data, INITIAL_BALANCE);
await super.setup();
}
near-js-sdk itself is deploying against mysterious shared-test
case 'ci':
return {
networkId: 'shared-test',
nodeUrl: 'https://rpc.ci-testnet.near.org',
masterAccount: 'test.near',
};

How can I deploy multiple contracts?
You can use masterAccount.createAndDeployContract the same way test_environment.js does. There is nothing special there except some common init; you can instead create whatever accounts / contracts are needed for your tests directly.
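For example, here is a minimal sketch of a jest beforeAll that deploys two contracts from the master account (the contract names, wasm paths, and balance are placeholders; nearAPI and nearConfig are assumed to come from the near-cli test environment):
const fs = require('fs');
beforeAll(async function () {
  const near = await nearAPI.connect(nearConfig);
  const masterAccount = await near.account(nearConfig.masterAccount || 'test.near');
  const INITIAL_BALANCE = '500000000000000000000000000'; // placeholder amount in yoctoNEAR
  const contracts = [
    ['contract-a.test.near', './out/contract-a.wasm'],
    ['contract-b.test.near', './out/contract-b.wasm'],
  ];
  for (const [name, wasmPath] of contracts) {
    // each contract account gets its own random key, same as test_environment.js does
    const keyPair = nearAPI.KeyPair.fromRandom('ed25519');
    await nearConfig.deps.keyStore.setKey(nearConfig.networkId, name, keyPair);
    await masterAccount.createAndDeployContract(
      name, keyPair.getPublicKey(), fs.readFileSync(wasmPath), INITIAL_BALANCE);
  }
});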
near-js-sdk itself is deploying against mysterious shared-test
This is a shared network that can be used to run integration tests. Unless you want to keep your dev work private, it is the recommended way to run tests, as it hits the most realistic environment currently available for testing.
However, there is an obvious step missing: how do you create test accounts on your local node?
If you created your project using create-near-app, you likely have the corresponding test.near keys in the neardev/ project folder already. That's why the above mysterious environment generally works out of the box.
For your local environment you need to create test.near yourself:
NODE_ENV=local near create-account test.near --masterAccount some-existing-account
After that you can copy the keys to the local format (or just reconfigure UnencryptedFileSystemKeyStore to use the ~/.near-credentials path):
cp ~/.near-credentials/local/test.near.json project/dir/neardev/local/test.near.json
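Alternatively, instead of copying the key files you can point the key store at ~/.near-credentials directly; a rough sketch (the exact config shape depends on your test setup):
const os = require('os');
const path = require('path');
const nearAPI = require('near-api-js');
// read keys straight from the credentials directory that near-cli writes to
const keyStore = new nearAPI.keyStores.UnencryptedFileSystemKeyStore(
  path.join(os.homedir(), '.near-credentials'));
config.deps = Object.assign(config.deps || {}, { keyStore });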

Related

How to run nextjs in AWS lambda with `experimental-edge` runtime

I'm trying to find a way to run Next.js (v13.0.6) with OG image generation logic (using @vercel/og) in AWS Lambda.
Everything works fine locally (in dev and prod mode), but when I try to execute the lambda handler I get "statusCode": 500.
It only fails for APIs that involve ImageResponse (and runtime: 'experimental-edge', which @vercel/og requires).
I'm pretty sure the problem is caused by the Edge Runtime not being configured correctly.
Here is my handler code.
next build with output: 'standalone' in next.config.js creates the folder .next/standalone.
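For reference, that corresponds roughly to a next.config.js like this (a sketch; merge it into your existing config):
// next.config.js
/** @type {import('next').NextConfig} */
module.exports = {
  output: 'standalone',
};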
Inside standalone, handler.js:
const { parse } = require('url');
const NextServer = require('next/dist/server/next-server').default
const serverless = require('serverless-http');
const path = require('path');
process.env.NODE_ENV = 'production'
process.chdir(__dirname)
const currentPort = parseInt(process.env.PORT, 10) || 3000
const nextServer = new NextServer({
hostname: 'localhost',
port: currentPort,
dir: path.join(__dirname),
dev: false,
customServer: false,
conf: {...} // copied from `server.js` in the same folder
});
const requestHandler = nextServer.getRequestHandler();
// this is an AWS lambda handler that converts the lambda event
// to http request that next server can process
const handler = serverless(async (req, res) => {
// const parsedUrl = parse(req.url, true);
try {
await requestHandler(req, res);
}catch(err){
console.error(err);
res.statusCode = 500
res.end('internal server error')
}
});
module.exports = {
handler
}
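As a quick sanity check, the exported handler can also be invoked directly with a minimal API Gateway-style event (a sketch; the event shape here is the usual v1 proxy format, trimmed down):
const { handler } = require('./handler');
handler({ path: '/', httpMethod: 'GET', headers: {}, body: null }, {})
  .then((res) => console.log(res.statusCode));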
I'm testing it locally with local-lambda, but I get similar results when testing against the AWS-deployed lambda.
What is confusing is that server.js (in .next/standalone) has a similar setup; it only adds an HTTP server on top of it.
update:
AWS Lambda logs show:
ERROR [Error [CompileError]: WebAssembly.compile(): Compiling function #64 failed: invalid value type 'Simd128', enable with --experimental-wasm-simd #+3457 ]
update 2:
the first error was fixed by selecting Node 16 for the AWS lambda; now I'm getting this error:
{
"errorType": "Error",
"errorMessage": "write after end",
"trace": [
"Error [ERR_STREAM_WRITE_AFTER_END]: write after end",
" at new NodeError (node:internal/errors:372:5)",
" at ServerlessResponse.end (node:_http_outgoing:846:15)",
" at ServerlessResponse.end (/var/task/node_modules/next/dist/compiled/compression/index.js:22:783)",
" at NodeNextResponse.send (/var/task/node_modules/next/dist/server/base-http/node.js:93:19)",
" at NextNodeServer.handleRequest (/var/task/node_modules/next/dist/server/base-server.js:332:47)",
" at processTicksAndRejections (node:internal/process/task_queues:96:5)",
" at async /var/task/index.js:34:5"
]
}
At the moment of writing, Vercel's runtime: 'experimental-edge' seems to be unstable (I ran into multiple issues with it).
I ended up recreating the @vercel/og lib without the wasm and Next.js dependencies; it can be found here
and can simply be used in AWS Lambda. It depends on @resvg/resvg-js instead of the wasm version; it uses native binaries, so there should not be much performance loss compared to wasm.
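For illustration, rendering an SVG to a PNG with @resvg/resvg-js in a plain Node.js Lambda might look roughly like this (a sketch, not the actual code of the recreated library):
const { Resvg } = require('@resvg/resvg-js');

exports.handler = async () => {
  const svg = '<svg xmlns="http://www.w3.org/2000/svg" width="1200" height="630">'
    + '<rect width="1200" height="630" fill="#000"/></svg>';
  const resvg = new Resvg(svg, { fitTo: { mode: 'width', value: 1200 } });
  const png = resvg.render().asPng(); // Buffer with the PNG bytes
  return {
    statusCode: 200,
    headers: { 'Content-Type': 'image/png' },
    body: png.toString('base64'),
    isBase64Encoded: true,
  };
};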

How do I invoke a SAM Lambda function locally with X-Ray statements?

I'm receiving the following error when invoking an AWS SAM Lambda function locally:
Missing AWS Lambda trace data for X-Ray. Ensure Active Tracing is enabled and no subsegments are created outside the function handler.
Below you can see my function:
/** Bootstrap */
require('dotenv').config()
const AWSXRay = require('aws-xray-sdk')
/** Libraries*/
const se = require('serialize-error')
/** Internal */
const logger = require('./src/utils/logger')
const ExecuteService = require('./src/service')
/**
*
*/
exports.handler = async (event) => {
const xraySegement = AWSXRay.getSegment()
const message = process.env.NODE_ENV == 'production' ? JSON.parse(event.Records[0].body) : event
try {
await ExecuteService(message)
} catch (err) {
logger.error({
error: se(err)
})
return err
}
}
In addition, I have Tracing set to Active in my template.yml.
What part of the documentation am I clearly misreading, missing, or reading over?
For now you can't invoke a SAM Lambda locally with X-Ray, because it is not supported yet.
See
[Feature Request] Add support for X-Ray on SAM Local
#217
aws-lambda-runtime-interface-emulator#level-of-support
The component does not support X-ray and other Lambda integrations locally.
If you don't care about X-Ray and just want your code to work, you can check the env variable AWS_SAM_LOCAL to prevent X-Ray usage:
let AWSXRay
if (!process.env.AWS_SAM_LOCAL) {
AWSXRay = require('aws-xray-sdk')
}
// ...
if (!process.env.AWS_SAM_LOCAL) {
const xraySegement = AWSXRay.getSegment()
}
I am using SAM CLI, version 1.36.0
I noticed that importing aws-xray-sdk does not create this issue, but invoking its functions does.
To avoid a million if statements, I created a middleman module that imports aws-xray-sdk and then exports no-ops if the env var process.env.AWS_SAM_LOCAL is set.
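A minimal sketch of such a middleman module (the file name and the exact set of no-ops are up to you; stub whichever SDK methods your handlers actually call):
// xray.js (hypothetical file name)
let AWSXRay;
if (!process.env.AWS_SAM_LOCAL) {
  AWSXRay = require('aws-xray-sdk');
} else {
  // no-op stand-ins so handler code can call the same methods locally
  AWSXRay = {
    getSegment: () => ({ addNewSubsegment: () => ({ close: () => {} }) }),
    captureAWS: (sdk) => sdk,
  };
}
module.exports = AWSXRay;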

Lambda Layers not installing with Serverless

Currently getting the following error with MongoDB:
no saslprep library specified. Passwords will not be sanitized
We are using Webpack so simply installing the module doesn't work (Webpack just ignores it). I found this thread which talks about how to exclude it from Webpack compilations, but then I have to manually load it into every Lambda function which led me to Lambda Layers.
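For context, excluding it from the Webpack bundle is typically done via externals (a sketch; serverless-bundle exposes its own options, so your setup may differ):
// webpack.config.js (sketch)
module.exports = {
  // ...existing config...
  externals: {
    // keep saslprep out of the bundle so it is resolved from node_modules
    // (e.g. provided by a Lambda layer) at runtime instead
    saslprep: 'commonjs saslprep',
  },
};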
Following the Serverless guide on using Lambda layers allowed me to get my layer published to AWS and included in all of my functions, but for some reason, it doesn't install the modules. If I download the layer using the AWS GUI, I get a folder with just the package.json and package-lock.json files.
My file structure is:
my-project
|_ layers
   |_ saslprep
      |_ package.json
and my serverless.yml is:
layers:
saslprep:
path: layers/saslprep
compatibleRuntimes:
- nodejs14.x
This is not my preferred solution, as I'd like to use SCRAM-SHA-256, but the way I got around this error/warning was by changing the authMechanism from SCRAM-SHA-256 to SCRAM-SHA-1 in the connection string. The serverless-bundle most likely needs to add this dependency to their package to enable support for Mongo 4.0 SHA-256 (my best guess!).
You can specify this authentication mechanism by setting the authMechanism parameter to the value SCRAM-SHA-1 in the connection string as shown in the following sample code.
const { MongoClient } = require("mongodb");
// Replace the following with values for your environment.
const username = encodeURIComponent("<username>");
const password = encodeURIComponent("<password>");
const clusterUrl = "<MongoDB cluster url>";
const authMechanism = "SCRAM-SHA-1";
// Replace the following with your MongoDB deployment's connection string.
const uri =
`mongodb+srv://${username}:${password}@${clusterUrl}/?authMechanism=${authMechanism}`;
// Create a new MongoClient
const client = new MongoClient(uri);
// Function to connect to the server
async function run() {
try {
// Connect the client to the server
await client.connect();
// Establish and verify connection
await client.db("admin").command({ ping: 1 });
console.log("Connected successfully to server");
} finally {
// Ensures that the client will close when you finish/error
await client.close();
}
}
run().catch(console.dir);

Error compiling OpenZeppelin imported contracts in truffle in vs code

I'm running a Truffle project created with the truffle init command in CMD. Then I created a React app with npx create-react-app in the same folder.
I'm creating an NFT, so I installed the OZ contracts in my React app using the npm install @openzeppelin/contracts command.
Now at this point the whole project looks like this:
root folder
Package.json
and this is the complete truffle-config.js
/**
* Use this file to configure your truffle project. It's seeded with some
* common settings for different networks and features like migrations,
* compilation and testing. Uncomment the ones you need or modify
* them to suit your project as necessary.
*
* More information about configuration can be found at:
*
* trufflesuite.com/docs/advanced/configuration
*
* To deploy via Infura you'll need a wallet provider (like @truffle/hdwallet-provider)
* to sign your transactions before they're sent to a remote public node. Infura accounts
* are available for free at: infura.io/register.
*
* You'll also need a mnemonic - the twelve word phrase the wallet uses to generate
* public/private key pairs. If you're publishing your code to GitHub make sure you load this
* phrase from a file you've .gitignored so it doesn't accidentally become public.
*
*/
// const HDWalletProvider = require('@truffle/hdwallet-provider');
// const infuraKey = "fj4jll3k.....";
//
// const fs = require('fs');
// const mnemonic = fs.readFileSync(".secret").toString().trim();
const path = require("path");
module.exports = {
contracts_build_directory: path.join(__dirname, "client/src/contracts"),
/**
* Networks define how you connect to your ethereum client and let you set the
* defaults web3 uses to send transactions. If you don't specify one truffle
* will spin up a development blockchain for you on port 9545 when you
* run `develop` or `test`. You can ask a truffle command to use a specific
* network from the command line, e.g
*
* $ truffle test --network <network-name>
*/
networks: {
// Useful for testing. The `development` name is special - truffle uses it by default
// if it's defined here and no other network is specified at the command line.
// You should run a client (like ganache-cli, geth or parity) in a separate terminal
// tab if you use this network and you must also set the `host`, `port` and `network_id`
// options below to some value.
//
development: {
host: "127.0.0.1", // Localhost (default: none)
port: 7545, // Standard Ethereum port (default: none)
network_id: "*", // Any network (default: none)
},
// Another network with more advanced options...
// advanced: {
// port: 8777, // Custom port
// network_id: 1342, // Custom network
// gas: 8500000, // Gas sent with each transaction (default: ~6700000)
// gasPrice: 20000000000, // 20 gwei (in wei) (default: 100 gwei)
// from: <address>, // Account to send txs from (default: accounts[0])
// websocket: true // Enable EventEmitter interface for web3 (default: false)
// },
// Useful for deploying to a public network.
// NB: It's important to wrap the provider as a function.
// ropsten: {
// provider: () => new HDWalletProvider(mnemonic, `https://ropsten.infura.io/v3/YOUR-PROJECT-ID`),
// network_id: 3, // Ropsten's id
// gas: 5500000, // Ropsten has a lower block limit than mainnet
// confirmations: 2, // # of confs to wait between deployments. (default: 0)
// timeoutBlocks: 200, // # of blocks before a deployment times out (minimum/default: 50)
// skipDryRun: true // Skip dry run before migrations? (default: false for public nets )
// },
// Useful for private networks
// private: {
// provider: () => new HDWalletProvider(mnemonic, `https://network.io`),
// network_id: 2111, // This network is yours, in the cloud.
// production: true // Treats this network as if it was a public net. (default: false)
// }
},
// Set default mocha options here, use special reporters etc.
mocha: {
// timeout: 100000
},
// Configure your compilers
compilers: {
solc: {
version: "0.6.1", // Fetch exact version from solc-bin (default: truffle's version)
// docker: true, // Use "0.5.1" you've installed locally with docker (default: false)
// settings: { // See the solidity docs for advice about optimization and evmVersion
// optimizer: {
// enabled: false,
// runs: 200
// },
// evmVersion: "byzantium"
// }
}
},
// Truffle DB is currently disabled by default; to enable it, change enabled: false to enabled: true
//
// Note: if you migrated your contracts prior to enabling this field in your Truffle project and want
// those previously migrated contracts available in the .db directory, you will need to run the following:
// $ truffle migrate --reset --compile-all
db: {
enabled: false
}
};
In my Solidity contract file (path: C:\Dapp\Pet-Shop-IPFS-React\contracts\PetShop.sol), I've imported the installed OZ ERC721 contract.
PetShop.sol
pragma solidity >=0.4.22 <0.9.0;
import "#openzeppelin/contracts/token/ERC721/ERC721.sol";
contract PetShop is ERC721 {
uint256 public tokenId;
uint256 public prevOwnerTokenID;
address public owner;
mapping(uint256 => uint256) public priceMapping;
mapping(uint256 => string) tokenIdToOffchainContentHash;
event PetGenerated(address, uint256, uint256, string);
event BuyPet(uint256, address, address);
event SuccessfulEtherWithdrawal(uint256, address, bool);
constructor()
ERC721("ShanBuilders", "SBRS")
{
owner = msg.sender;
}
function generatePet(uint256 _petPrice, string memory contentHash) public returns(bool) {
require(msg.sender != address(0), "Please! Check back! Registration should not be from zero address");
require(msg.sender == owner, "Only contract owner can generate more pets");
tokenId++;
require(tokenId <= 16, "More than 16 pets is not allowed");
priceMapping[tokenId] = _petPrice;
tokenIdToOffchainContentHash[tokenId] = contentHash;
_mint(owner, tokenId);
emit PetGenerated(owner, tokenId, _petPrice, contentHash);
return true;
}
function checkPrice(uint256 _tokenId) public view returns(uint256) {
return priceMapping[_tokenId];
}
function checkHashForAToken(uint256 _tokenId) public view returns(string memory) {
return tokenIdToOffchainContentHash[_tokenId];
}
function buyPet(uint256 _tokenId) public payable returns(bool, string memory) {
prevOwnerTokenID = _tokenId;
address buyer = msg.sender;
address _owner = ownerOf(prevOwnerTokenID);
require(buyer != address(0), "Should not be zero address");
require(_exists(prevOwnerTokenID), "Invalid property Id, not registered");
require(msg.value == checkPrice(prevOwnerTokenID), "Please Send The Required Value");
withDraw(msg.value);
_transfer(_owner, buyer, prevOwnerTokenID);
emit BuyPet(_tokenId, _owner, buyer);
return (true, "Succesful");
}
function withDraw(uint256 _amount) internal returns(bool) {
address _owner = ownerOf(prevOwnerTokenID);
require(_amount > 0, "Amount must be valid");
payable(_owner).transfer(_amount);
emit SuccessfulEtherWithdrawal(_amount, _owner, true);
return true;
}
}
This is the migration:
const PetShop = artifacts.require("PetShop");
module.exports = function (deployer) {
deployer.deploy(PetShop);
};
The question is: after doing all this, something went wrong that I didn't notice. When I run truffle compile, the compilation fails and throws this error:
*** ParserError: Source "@openzeppelin/contracts/token/ERC721/ERC721.sol" not found: File import callback not supported***
How can I work around this issue? It would be great if someone could help me understand what this error says and why it is happening.
I solved it in the following way.
The @openzeppelin library: I installed it in the wrong place (inside the React app). It should have been installed in the root folder; after installing @openzeppelin in the root folder, it creates package-lock.json.
The compiler version: @openzeppelin's contracts always use the latest Solidity version, which at the time was 0.8.0, while the current Truffle version was 5.2.0 and used a Solidity version around 0.5.0 to 0.5.5. So the first thing I did was run npm install solc@0.8.0 in the root folder, which updates package-lock.json like this:
"solc": {
"version": "0.8.0",
"resolved": "https://registry.npmjs.org/solc/-/solc-0.8.0.tgz",
"integrity": "sha512-ypgvqYZhb/i5BM6cw9/5QkSlDJm/rLynsbWGP3kz6HeB6oNxPK6UMiB7jMr+tNVbQbBM/8l47vrI3XaDCHShjQ==",
"requires": {
"command-exists": "^1.2.8",
"commander": "3.0.2",
"follow-redirects": "^1.12.1",
"fs-extra": "^0.30.0",
"js-sha3": "0.8.0",
"memorystream": "^0.3.1",
"require-from-string": "^2.0.0",
"semver": "^5.5.0",
"tmp": "0.0.33"
}
}
Then update Truffle to 5.3 using npm install -g truffle.
The last step I did was to update truffle-config.js with the following:
compilers: {
solc: {
version: "0.8.0", // Fetch exact version from solc-bin (default: truffle's version)
// docker: true, // Use "0.5.1" you've installed locally with docker (default: false)
// settings: { // See the solidity docs for advice about optimization and evmVersion
// optimizer: {
// enabled: false,
// runs: 200
// },
// evmVersion: "byzantium"
// }
}
},
That's it. Then run truffle compile; if it creates a contracts folder in react/src and compiles successfully, there will be no error.
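Once compilation succeeds, a minimal Truffle test can exercise the contract from the question (a sketch; the price and content hash are placeholders):
// test/petshop.test.js
const PetShop = artifacts.require('PetShop');

contract('PetShop', (accounts) => {
  it('mints a pet and stores its price', async () => {
    const petShop = await PetShop.deployed();
    const price = web3.utils.toWei('1', 'ether');
    await petShop.generatePet(price, 'QmSomeContentHash', { from: accounts[0] });
    assert.equal((await petShop.checkPrice(1)).toString(), price);
  });
});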
const path = require("path");
module.exports = {
contracts_build_directory: path.join(__dirname, "client/src/contracts"),
/**
* Networks define how you connect to your ethereum client and let you set the
* defaults web3 uses to send transactions. If you don't specify one truffle
* will spin up a development blockchain for you on port 9545 when you
* run `develop` or `test`. You can ask a truffle command to use a specific
* network from the command line, e.g
*
* $ truffle test --network <network-name>
*/

How to capture the transactions while doing testing using Mocha

I am in the process of writing unit/behavioural tests using Mocha for a particular blockchain network use-case. Based on what I can see, these tests are not hitting the actual fabric, in other words, they seem to be running in some kind of a simulated environment. I don't get to see any of the transactions that took place as a part of the test. Can someone please tell me if it is somehow possible to capture the transactions that take place as part of the Mocha tests?
Initial portion of my code below:
describe('A Network', () => {
// In-memory card store for testing so cards are not persisted to the file system
const cardStore = require('composer-common').NetworkCardStoreManager.getCardStore( { type: 'composer-wallet-inmemory' } );
let adminConnection;
let businessNetworkConnection;
let businessNetworkDefinition;
let businessNetworkName;
let factory;
//let clock;
// Embedded connection used for local testing
const connectionProfile = {
name: 'hlfv1',
'x-type': 'hlfv1',
'version': '1.0.0'
};
before(async () => {
// Generate certificates for use with the embedded connection
const credentials = CertificateUtil.generate({ commonName: 'admin' });
// PeerAdmin identity used with the admin connection to deploy business networks
const deployerMetadata = {
version: 1,
userName: 'PeerAdmin',
roles: [ 'PeerAdmin', 'ChannelAdmin' ]
};
const deployerCard = new IdCard(deployerMetadata, connectionProfile);
console.log("line 63")
const deployerCardName = 'PeerAdmin';
deployerCard.setCredentials(credentials);
console.log("line 65")
// setup admin connection
adminConnection = new AdminConnection({ cardStore: cardStore });
console.log("line 69")
await adminConnection.importCard(deployerCardName, deployerCard);
console.log("line 70")
await adminConnection.connect(deployerCardName);
console.log("line 71")
});
Earlier, my connection profile was using the embedded mode, which I changed to hlfv1 after looking at the answer below. Now, I am getting the error: Error: the string "Failed to import identity. Error: Client.createUser parameter 'opts mspid' is required." was thrown, throw an Error :). This is coming from
await adminConnection.importCard(deployerCardName, deployerCard);. Can someone please tell me what needs to be changed? Any documentation/resource would be helpful.
Yes, you can use a real Fabric, which means you could interact with the created transactions using your test framework, or indeed by other means such as REST or Playground.
In Composer's own test setup, there is an option for testing against an hlfv1 Fabric environment (i.e. you choose whether to use the embedded, web, or real Fabric connector) -> see https://github.com/hyperledger/composer/blob/master/packages/composer-tests-functional/systest/historian.js#L120
Setup is captured here
https://github.com/hyperledger/composer/blob/master/packages/composer-tests-functional/systest/testutil.js#L192
Example of setting up artifacts that you would need to setup to use a real Fabric here
https://github.com/hyperledger/composer/blob/master/packages/composer-tests-functional/systest/testutil.js#L247
Also see this blog for more guidelines -> https://medium.com/@mrsimonstone/debug-your-blockchain-business-network-using-hyperledger-composer-9bea20b49a74
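Once connected to a real (or embedded) Fabric, you can also inspect the transactions your tests produced via the historian registry; a rough sketch using the Composer client API (verify the method names against your Composer version):
// inside a test, after businessNetworkConnection.connect(...) has succeeded
const historian = await businessNetworkConnection.getHistorian();
const records = await historian.getAll();
records.forEach((record) => {
  console.log(record.transactionType, record.transactionTimestamp);
});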
