I'm building on Solana and need some PDAs to store my program state.
The default with Anchor is to constantly serialize/deserialize the accounts, even when just passing the address into instructions, which blows through BPF's 4K stack limit REALLY quickly for my app.
So I found the zero_copy feature, which is exactly what I need, see https://docs.rs/anchor-lang/latest/anchor_lang/attr.account.html.
The examples shown in the anchor docs, as well as some code samples I found through web search, all refer to wallet-owned accounts, not to PDAs. There is literally NOTHING to find online about zero_copy with PDAs, so I'm wondering if it's possible at all?!
Anyway - I feel I really, really need it, otherwise my PDA accounts will be limited to something around 400 bytes or so.
So I gave it a try:
#[program]
mod myapp {
    use super::*;

    pub fn create_aff(ctx: Context<CreateAff>, _i: u8) -> Result<()> {
        let acc = &mut ctx.accounts.aff_account.load_init()?;
        acc.aff_account = ctx.accounts.user.key();
        acc.bump = *ctx.bumps.get("aff_account").unwrap();
        Ok(())
    }
}

#[account(zero_copy)]
pub struct Aff {
    aff_account: Pubkey,
    bump: u8,
}
#[derive(Accounts)]
#[instruction(i: u8)]
pub struct CreateAff<'info> {
    // space: 8 (discriminator) + 32 (Pubkey) + 1 (bump) = 41
    #[account(init, space = 41, payer = user, seeds = [AFFSEED], bump)]
    aff_account: AccountLoader<'info, Aff>,
    #[account(mut)]
    user: Signer<'info>,
    /// CHECK: This is not dangerous because we don't read or write from this account
    system_program: AccountInfo<'info>,
}
So far, so good. It compiles. Running a simple test:
it("Creates the aff account if doesn't exist", async () => {
const [affPDA, bump] = await PublicKey.findProgramAddress([anchor.utils.bytes.utf8.encode(AFFSEED)],program.programId);
console.log("CreateAff: affPDA is [", affPDA.toString(), "], bump is", bump);
const contents = await program.account.aff.fetchNullable(affPDA);
if (contents == null) {
await program.rpc.createAff({
accounts: {
affAccount: affPDA,
user: usr,
systemProgram: SystemProgram.programId,
},
signers: [],
});
const affe = await program.account.counts.fetch(affPDA);
console.log("affe: ", affe);
assert.ok(true);
} else {
assert.ok(true);
}
});
gives me an error:
Creates the aff account if doesn't exist:
Error: Invalid arguments: affAccount not provided.
at /Users/bb/app/nodestore/node_modules/@project-serum/anchor/dist/cjs/program/common.js:39:23
at Array.forEach (<anonymous>)
at validateAccounts (node_modules/@project-serum/anchor/dist/cjs/program/common.js:33:16)
at ix (node_modules/@project-serum/anchor/dist/cjs/program/namespace/instruction.js:38:46)
at txFn (node_modules/@project-serum/anchor/dist/cjs/program/namespace/transaction.js:16:20)
at Object.rpc [as createAff] (node_modules/@project-serum/anchor/dist/cjs/program/namespace/rpc.js:9:24)
at Context.<anonymous> (tests/nodeshop.js:63:25)
at processTicksAndRejections (node:internal/process/task_queues:96:5)
It's complaining affAccount is not provided, even though I'm clearly passing it in.
So the problem seems to be that some part of the runtime cannot handle affAccount being an AccountLoader (for zero_copy) rather than the standard AccountInfo.
Any help on how I can fix or at least further debug this is highly appreciated.
I got it working. Sometimes it helps just posting a question; it forces you to think things through... ;-)
So great news: zero_copy with PDAs is possible! :-)
Here's what it was:
I originally gave the create_aff function (and the corresponding accounts struct) an i argument, even though I'm not using an additional i in the PDA account seeds. This was just a copy/paste error from a previous PDA I had been working on :-/
Since I was consistent with the i, the compiler didn't complain.
I removed the i parameter from create_aff's parameter list as well as the #[instruction(i: u8)] from the CreateAff accounts struct declaration and, voilà, it's working now.
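For reference, the working version ends up looking roughly like this (the same code as above, just without the i plumbing, and with the AccountLoader type that zero_copy requires):

#[program]
mod myapp {
    use super::*;

    pub fn create_aff(ctx: Context<CreateAff>) -> Result<()> {
        // load_init() is how a freshly initialized zero_copy account is borrowed
        let acc = &mut ctx.accounts.aff_account.load_init()?;
        acc.aff_account = ctx.accounts.user.key();
        acc.bump = *ctx.bumps.get("aff_account").unwrap();
        Ok(())
    }
}

#[derive(Accounts)]
pub struct CreateAff<'info> {
    // space: 8 (discriminator) + 32 (Pubkey) + 1 (bump) = 41
    #[account(init, space = 41, payer = user, seeds = [AFFSEED], bump)]
    aff_account: AccountLoader<'info, Aff>,
    #[account(mut)]
    user: Signer<'info>,
    /// CHECK: This is not dangerous because we don't read or write from this account
    system_program: AccountInfo<'info>,
}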
Long live Solana and Anchor. Oh, and a recommendation to the builders of Anchor: just make zero_copy the default and lengthy Borsh the exception!
I have a schema with some simple misuse cases like in the picture below. I was only able to catch them via the stupid hack of running apollo-server, since it performs validation on create; that was the first time I saw these errors.
const { ApolloServer } = require("apollo-server-express");
const { readFileSync } = require("fs");

const typeDefs = readFileSync(process.argv[2]).toString("utf-8");
try {
    new ApolloServer({
        typeDefs: typeDefs,
        debug: true,
        mockEntireSchema: true,
    });
} catch (error) {
    console.error(error.message);
    process.exit(-1);
}
I wasn't really able to find any validation CLI tool that accepts just an SDL definition and does these simple syntax checks. Are there any? I'm trying to build a build pipeline and push as much as I can into compile time rather than runtime. I mean, the hack works, but it's so weird that there are no ready-made solutions for this.
I'm also using graphql-codegen for TypeScript, but it just doesn't care and no errors are thrown. The tool is okay with interfaces being used in unions.
graphql-cli with its validate command requires a document and a schema, but it does a somewhat different kind of validation.
I have an instruction in Anchor code that creates a PDA like this:
#[account]
#[derive(Default)]
pub struct Device {
    pub ipv4: [u8; 4],
    pub hostname: String,
    bump: u8,
    status: DeviceStatus,
    authority: Pubkey,
}

#[derive(Accounts)]
#[instruction(local_key: Pubkey)]
pub struct RegisterDevice<'info> {
    #[account(mut)]
    pub authority: Signer<'info>,
    #[account(init,
        space = 128,
        seeds = [local_key.as_ref()],
        bump,
        payer = authority,
    )]
    pub device: Box<Account<'info, Device>>,
    pub system_program: Program<'info, System>,
}
I want to allow a different key (whose pubkey is known at the time of creation, and does not have access to the priv key of the original authority) to update the created Device PDA, and my first stab at this looks like below:
#[derive(Accounts)]
pub struct UpdateDevice<'info> {
    #[account(mut)]
    pub authority: Signer<'info>,
    #[account(mut,
        has_one = authority,
        seeds = [authority.key().as_ref()],
        bump,
    )]
    pub device: Box<Account<'info, Device>>,
    pub system_program: Program<'info, System>,
}
However, Anchor/the Solana runtime complains about a signing error, which means this approach is wrong. Since it's not a token account, the Device doesn't have an owner, yet the system seems to track that the original authority is the one who can sign for modifications: if I sign the transaction with the originally provided key, the runtime doesn't seem to mind.
How can I implement what I'm looking to do?
OK! Seems I've figured out the issue. h/t to Paulx for help.
As I thought, there is nothing special in the Solana runtime that tracks who is the "authority" on an account by default (Token Program accounts have owners as a property in the account, but that's at the program level...)
Anyway, the issue I ran into is that the Anchor RPC defaults the signers to the provider key (the local wallet) when they are not provided, i.e., when .signers(...) is not called on the rpc. I wasn't passing signers because I had used that magic before and assumed it would still work with a non-default signer, given all the reflection.
So make sure you pass your signers correctly, kids!
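On the program side there is no magic either: the has_one = authority constraint plus the Signer account is the entire authority check, and the client just has to include the matching keypair via .signers([...]). A minimal sketch of an update handler under the UpdateDevice struct above (the status-update body is hypothetical, and assumes DeviceStatus derives Anchor's (de)serialization traits):

pub fn update_device(ctx: Context<UpdateDevice>, status: DeviceStatus) -> Result<()> {
    // has_one = authority has already verified device.authority == authority.key(),
    // and Signer<'info> has verified that this authority signed the transaction.
    let device = &mut ctx.accounts.device;
    device.status = status;
    Ok(())
}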
I am making a native messaging application with the WebExtensions API from Firefox. The extension is supposed to call an application that parses stdin and then calls my other Rust app based on some of the data it parsed, but for no apparent reason the Rust app doesn't accept input from Firefox (it works when I feed it input manually).
This is the code of the extension:
/*
On a click on the browser action, send the app a message.
*/
browser.browserAction.onClicked.addListener(() => {
    console.log("Sending: ping");
    var sending = browser.runtime.sendNativeMessage(
        "themefox_manager",
        "ping");
    sending.then(onResponse, onError);
});

function onResponse(response) {
    console.log("Received " + response);
}

function onError(error) {
    console.log(`Error: ${error}`);
}
and this is the code of the Rust app:
use std::fs;
use std::io;
use std::io::prelude::*;

fn main() {
    let stdin = io::stdin();
    for line in stdin.lock().lines() {
        let mut file = fs::File::create("/home/user/filename.txt").unwrap();
        if line.unwrap() == "ping" {
            file.write_all(b"TEST").expect("Error");
        }
    }
}
The weird thing is that the text file in my home dir only appears when I close Firefox, not when the app gets started. And it also doesn't contain the text TEST.
Thanks for any help!
Cheers
I managed to make my own solution, taking a bit from this crate. The underlying problem: native messaging doesn't send newline-terminated plain text. Each message is a 4-byte native-endian length followed by the JSON payload, so reading stdin line by line blocks until Firefox closes the pipe.
Quick note: "If you want to skip all of the code and immediately start coding from a template repo, scroll to the bottom of this solution, and you should be able to find more info there."
The code that reads the input and then returns it is the following:
pub fn read_input<R: Read>(mut input: R) -> io::Result<serde_json::Value> {
    // Native messaging frames start with a 4-byte native-endian length...
    let length = input.read_u32::<NativeEndian>()?;
    // ...followed by exactly `length` bytes of JSON
    let mut buffer = vec![0; length as usize];
    input.read_exact(&mut buffer)?;
    let json_val: serde_json::Value = serde_json::from_slice(&buffer).unwrap();
    Ok(json_val)
}
What the code does is read from the input passed as a parameter, parse the payload into a JSON value, and return it wrapped in the success/error value of an io::Result.
That code is being used in the main.rs file like this:
let json_val = match lib::read_input(io::stdin()) {
    Err(why) => panic!("{}", why.to_string()),
    Ok(json_val) => json_val,
};
Here the input is being passed as a parameter to the read_input function.
And to send the response I used the following function:
pub fn write_output<W: Write>(mut output: W, value: &serde_json::Value) -> io::Result<()> {
    let msg = serde_json::to_string(value)?;
    let len = msg.len();
    // Chrome won't accept a message larger than 1MB
    if len > 1024 * 1024 {
        panic!("Message was too large: {} bytes", len);
    }
    // Write the 4-byte native-endian length prefix, then the JSON payload
    output.write_u32::<NativeEndian>(len as u32)?;
    output.write_all(msg.as_bytes())?;
    output.flush()?;
    Ok(())
}
It takes the output (normally stdout, but it could also be a file for debugging purposes) and the message as parameters, and writes the length-prefixed message to the output.
The code which calls the function write_output is the following:
let response = serde_json::json!({ "msg": "pong" });
match lib::write_output(io::stdout(), &response) {
    Err(why) => panic!("{}", why.to_string()),
    Ok(_) => (),
};
The project uses these dependencies, so make sure to add them to Cargo.toml:
byteorder = "*"
serde_json = "*"
The imports for the main.rs file are:
mod lib;
use std::io;
and for the lib.rs file, in which both functions reside:
extern crate serde_json;
use byteorder::{NativeEndian, ReadBytesExt, WriteBytesExt};
use std::error::Error;
use std::fs;
use std::io;
use std::io::{Read, Write};
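Putting the pieces together, a minimal main.rs for the ping/pong round trip could look like this (a sketch, assuming the two functions above live in lib.rs and the extension sends the bare string "ping" as shown earlier):

mod lib;
use std::io;

fn main() {
    // Read one length-prefixed JSON message from the browser
    let json_val = match lib::read_input(io::stdin()) {
        Err(why) => panic!("{}", why.to_string()),
        Ok(json_val) => json_val,
    };

    // The extension sends the bare JSON string "ping"; reply with a pong object
    if json_val == serde_json::json!("ping") {
        let response = serde_json::json!({ "msg": "pong" });
        if let Err(why) = lib::write_output(io::stdout(), &response) {
            panic!("{}", why.to_string());
        }
    }
}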
I also created a git template repo so that you can get started really quickly; you can find it here.
If I have the following method to test predecessor_account_id behaviour
pub fn get_pred_acc(&self) -> String {
    let prev_acc = env::predecessor_account_id().to_string();
    return prev_acc;
}
And try to call this from the frontend:
const contract = await this.near.loadContract(window.nearConfig.contractName, {
    viewMethods: ["get_pred_acc"],
    changeMethods: [],
    sender: this.accountId,
});

const acc = await contract.get_pred_acc();
I get the following error:
Uncaught (in promise) Error: Querying call/flux-protocol/get_account_id failed: wasm execution failed with error: FunctionCallError(HostError(ProhibitedInView("predecessor_account_id"))).
{ "error": "wasm execution failed with error: FunctionCallError(HostError(ProhibitedInView(\"predecessor_account_id\")))",
"logs": []
}
That's expected behavior for view calls.
View calls don't have certain context information because such calls are not part of an actual transaction.
Currently, the best option to see which methods are prohibited in the view calls is to take a look at the test: https://github.com/nearprotocol/nearcore/blob/master/runtime/near-vm-logic/tests/test_view_method.rs#L19-L43
To summarize, the following are prohibited in view calls:
- transaction context information and keys (signer, predecessor and signer_public_key)
- gas information
- all promise methods, because they involve other contracts
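In practice the fix for the snippet above is to register get_pred_acc under changeMethods instead of viewMethods, so that it runs inside an actual transaction. The contract method itself doesn't need to change; here is a small near-sdk-rs sketch (the Contract type and the get_self method are hypothetical) of what is and isn't view-safe:

#[near_bindgen]
impl Contract {
    // Fine as a view call: the contract's own account id is always available
    pub fn get_self(&self) -> String {
        env::current_account_id().to_string()
    }

    // Must be invoked as a change method: view calls have no transaction
    // context, so predecessor/signer information doesn't exist
    pub fn get_pred_acc(&self) -> String {
        env::predecessor_account_id().to_string()
    }
}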
I am in the process of writing unit/behavioural tests using Mocha for a particular blockchain network use case. Based on what I can see, these tests are not hitting the actual Fabric; in other words, they seem to be running in some kind of simulated environment. I don't get to see any of the transactions that took place as part of the test. Can someone please tell me if it is somehow possible to capture the transactions that take place as part of the Mocha tests?
Initial portion of my code below:
describe('A Network', () => {
    // In-memory card store for testing so cards are not persisted to the file system
    const cardStore = require('composer-common').NetworkCardStoreManager.getCardStore({ type: 'composer-wallet-inmemory' });
    let adminConnection;
    let businessNetworkConnection;
    let businessNetworkDefinition;
    let businessNetworkName;
    let factory;
    //let clock;

    // Embedded connection used for local testing
    const connectionProfile = {
        name: 'hlfv1',
        'x-type': 'hlfv1',
        'version': '1.0.0'
    };

    before(async () => {
        // Generate certificates for use with the embedded connection
        const credentials = CertificateUtil.generate({ commonName: 'admin' });

        // PeerAdmin identity used with the admin connection to deploy business networks
        const deployerMetadata = {
            version: 1,
            userName: 'PeerAdmin',
            roles: ['PeerAdmin', 'ChannelAdmin']
        };
        const deployerCard = new IdCard(deployerMetadata, connectionProfile);
        console.log("line 63")
        const deployerCardName = 'PeerAdmin';
        deployerCard.setCredentials(credentials);
        console.log("line 65")

        // setup admin connection
        adminConnection = new AdminConnection({ cardStore: cardStore });
        console.log("line 69")
        await adminConnection.importCard(deployerCardName, deployerCard);
        console.log("line 70")
        await adminConnection.connect(deployerCardName);
        console.log("line 71")
    });
Earlier, my connection profile was using the embedded mode, which I changed to hlfv1 after looking at the answer below. Now I am getting the error: the string "Failed to import identity. Error: Client.createUser parameter 'opts mspid' is required." was thrown, throw an Error :). It comes from
await adminConnection.importCard(deployerCardName, deployerCard);. Can someone please tell me what needs to be changed? Any documentation/resource would be helpful.
Yes, you can use a real Fabric, which means you could interact with the created transactions using your test framework, or indeed by other means such as REST or Playground.
In Composer's own test setup, there is an option to test against an hlfv1 Fabric environment (i.e., you choose whether to use the embedded, web, or real Fabric runtime) -> see https://github.com/hyperledger/composer/blob/master/packages/composer-tests-functional/systest/historian.js#L120
The setup is captured here:
https://github.com/hyperledger/composer/blob/master/packages/composer-tests-functional/systest/testutil.js#L192
An example of setting up the artifacts you would need in order to use a real Fabric is here:
https://github.com/hyperledger/composer/blob/master/packages/composer-tests-functional/systest/testutil.js#L247
Also see this blog for more guidelines -> https://medium.com/@mrsimonstone/debug-your-blockchain-business-network-using-hyperledger-composer-9bea20b49a74