Custom program error: 0xbbf on Anchor framework (Solana)

const connection = new anchor.web3.Connection("https://api.devnet.solana.com");
const provider = new anchor.Provider(connection, wallet, {
  preflightCommitment: "recent",
});
const programid = new anchor.web3.PublicKey("EcgzWq52GhdCRJFEXub5LCgdqPs5Eid2XfiZSXM22eQu");

const res = await axios.get(process.env.REACT_APP_API_ENDPOINT + "/stakeidl");
const idl = await res.data;
const program = new anchor.Program(idl, programid, provider);
console.log(program);

const [collections, bump] = await PublicKey.findProgramAddress(["collections"], programid);
console.log(collections, bump);

const tx = await program.rpc.setAdmin(bump, {
  accounts: {
    collections: collections,
    admin: wallet.publicKey,
    systemProgram: anchor.web3.SystemProgram.programId,
  },
});
The above code throws custom program error 0xbbf. I create the Program from the public key and the IDL, and calling setAdmin on the Rust program returns 0xbbf.

0xbbf is the hexadecimal representation of 3007. You can see a description of all Anchor error codes here, for example: https://solanacookbook.com/references/anchor.html#error-codes
My guess is that in this case some of the seeds for the PDA don't match up, or the account is owned by (was created by) a different program than expected.
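Error 3007 means the given account is not owned by the program that was expected to own it, so one quick sanity check (a sketch, not a confirmed fix for this exact program) is to derive the PDA with Buffer seeds and compare the account's owner against the program id:
// Derive the PDA; the seeds must match the on-chain #[account(seeds = ...)] constraint exactly
const [collections, bump] = await anchor.web3.PublicKey.findProgramAddress(
  [Buffer.from("collections")],
  programid
);

// If the account was initialized by this program, its owner should equal the program id
const info = await connection.getAccountInfo(collections);
console.log("PDA owner:", info ? info.owner.toBase58() : "account not found");
console.log("expected :", programid.toBase58());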

Related

How to use the client context to start child spans on the server side

From an Angular client I have injected the context and am sending it through headers:
try {
  const ctx = api.trace.setSpan(api.context.active(), parentSpan);
  const tracer = trace.getTracerProvider().getTracer('Ui-Service');
  let span = tracer.startSpan('start_Client_spans', undefined, ctx);
  const carrier: { traceparent: string } = {
    traceparent: ""
  };
  const propagator = new W3CTraceContextPropagator();
  propagator.inject(
    api.trace.setSpanContext(api.context.active(), span.spanContext()),
    carrier,
    api.defaultTextMapSetter,
  );
  const clone = req.clone({
    headers: req.headers.set('ot-tracer-traceid', carrier.traceparent)
  });
  span.end();
  return clone;
} catch (e) {
  console.log(e);
  throw e;
}
}
The server-side code is below. Here the context is extracted from the headers and injected so that further spans can be started on the server side, with the client request as the parent and the server requests as child spans under the same trace id. Currently, however, the server and the client end up with separate trace ids.
const carrier = req.headers["ot-tracer-traceid"];
const propagator = new W3CTraceContextPropagator();
if (carrier?.length > 0) {
  const clientContext = propagator.extract(
    api.context.active(),
    carrier,
    api.defaultTextMapGetter
  );
  const newTracer = api.trace.getTracer("test", "1.0.0");
  // Create a server span using the client context.
  // The code below does create spans, but they are individual spans; I expect them to use the client context.
  const serverSpan = newTracer.startSpan('operationName', undefined, clientContext);
  console.log(newTracer);
}
Can someone hint at where I am going wrong?
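One thing worth checking (an assumption based on the snippets above, not a verified fix): W3CTraceContextPropagator reads and writes a carrier key literally named traceparent, so on the server the raw header value has to be wrapped in an object under that key (or the header has to be sent as traceparent in the first place) before extract() can find it. A minimal sketch of that pattern:
// Rebuild a carrier object keyed the way the W3C propagator expects
const carrier = { traceparent: req.headers["ot-tracer-traceid"] };
const propagator = new W3CTraceContextPropagator();
const clientContext = propagator.extract(
  api.context.active(),
  carrier,
  api.defaultTextMapGetter
);

// Spans started with this context should share the client's trace id
const serverSpan = api.trace
  .getTracer("test", "1.0.0")
  .startSpan("operationName", undefined, clientContext);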

near-api-js signed contract deploy

I am having trouble getting server-side near-api-js account creation and contract deployment working successfully and would appreciate some input.
I am using an unencrypted local keystore set up as follows:
const privatekey = "__private key__";
const keyPair = KeyPair.fromString(privatekey); // generate key pair
const keyStore = new nearAPI.keyStores.UnencryptedFileSystemKeyStore(credentialsPath); // declare keystore type
const config = Object.assign(require('./config')('testnet'), {
  networkId: networkId,
  deps: { keyStore },
}); // set up config for the NEAR connection with the keystore added

// update keystore with the main account
await keyStore.setKey('testnet', config.masterAccount, nearAPI.utils.KeyPair.fromString(privatekey));

// connect to NEAR with the config details
const near = await nearAPI.connect(config);

// load the master account
const account = await new nearAPI.Account(near.connection, config.masterAccount);
I am creating a subaccount as follows:
const amountInYocto = utils.format.parseNearAmount("5");
// generate new public key
const newPublicKey = await near.connection.signer.createKey(newAccountName,config.networkId);
let keyPair;
let publicKey;
keyPair = await KeyPair.fromRandom('ed25519');
publicKey = keyPair.getPublicKey();
await near.connection.signer.keyStore.setKey(near.connection.networkId, newAccountName, keyPair);
const response = await near.createAccount(newAccountName, publicKey, amountInYocto);
When I try to deploy a contract to the created account, I get the error:
Actor 'master account' doesn't have permission to account 'subaccount' to complete the action
This is how I deploy the contract:
const txs = [transactions.deployContract(fs.readFileSync(`../near/out/out.wasm`))];
const result = await account.signAndSendTransaction({
  receiverId: subaccount,
  actions: txs
});
I'm sure I have something wrong in how I manage the keys, but I cannot work it out. Any suggestions?
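One hedged guess (not verified against this exact setup): a DeployContract action has to be signed by the account it is deployed to, not by the master account, which would produce exactly this permission error. A minimal sketch that deploys from an Account object for the subaccount instead, assuming its key was stored in the keystore during the creation step above:
// Load the subaccount; its full-access key was stored in the keystore during creation
const subAccount = await near.account(newAccountName);

// Deploy with the subaccount as both signer and receiver
const result = await subAccount.deployContract(fs.readFileSync(`../near/out/out.wasm`));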

Chainlink fulfill_alarm() not called back after a successful request is sent to the oracle when testing with Mocha

I'm trying to test the Chainlink alarm oracle following the documentation steps, with Hardhat on a Kovan fork network.
https://docs.chain.link/docs/chainlink-alarm-clock/#:~:text=ala
When I test the contract with npx hardhat test, it looks like it hits the request function to the oracle, but fulfill_alarm() is never called back.
I wonder if that might be an issue with the fork network or with my testing code.
I'm using the Alchemy API as the URL for the fork network on Kovan.
Also, I'm using a timeout with Mocha so it can wait for the callback to execute, but it never hits.
The first it statement executes correctly, but the second one, which is supposed to increment lotteryId by 1 once the lottery is done (fulfill_alarm() called), doesn't pass. It looks like the function never gets executed by the Chainlink oracle.
oracle: 0xAA1DC356dc4B18f30C347798FD5379F3D77ABC5b
jobId: '982105d690504c5d9ce374d040c08654'
Solidity code:
// Starting the oracle alarm
function start_new_lottery(uint256 duration) public {
    require(lottery_state == LOTTERY_STATE.CLOSED, "can't start a new lottery yet");
    lottery_state = LOTTERY_STATE.OPEN;
    Chainlink.Request memory req = buildChainlinkRequest(jobId, address(this), this.fulfill_alarm.selector);
    req.addUint("until", block.timestamp + duration);
    sendChainlinkRequestTo(oracle, req, oraclePayment);
    console.log('LINK REQ SEND');
}

// Callback function after the oracle alarm is fulfilled
function fulfill_alarm(bytes32 _requestId) public recordChainlinkFulfillment(_requestId) {
    require(lottery_state == LOTTERY_STATE.OPEN, "The lottery hasn't started yet");
    lotteryId = lotteryId + 1;
    console.log('ALARM_FULFILLED');
    pickWinner();
}
test code:
const { expect } = require("chai");
const { time } = require("@openzeppelin/test-helpers");

describe("Lottery Contract", function () {
  // Defining global variables for testing.
  const price_lottery = ethers.BigNumber.from(1);
  let owner;
  let Lottery;
  let lottery;

  before(async function () {
    Lottery = await ethers.getContractFactory("Lottery");
    [owner] = await ethers.getSigners();
    lottery = await Lottery.deploy(price_lottery);
    // Starting a new lottery
    await lottery.start_new_lottery(60);
  });

  it("should start a chainlink alarm to init new lottery", async function () {
    let lottery_state = await lottery.lottery_state();
    expect(lottery_state).to.equal(0);
    // Initial value
    let lotteryId = await lottery.lotteryId();
    expect(lotteryId).to.equal(1);
  });

  it("increments lotteryId + 1 when the chainlink alarm is fulfilled after the duration", async function () {
    // Wait until the alarm is fulfilled
    function timeout(ms) {
      return new Promise(resolve => setTimeout(resolve, ms));
    }
    await timeout(120000);
    // Expected value
    let lotteryId = await lottery.lotteryId();
    expect(lotteryId).to.equal(2);
  });
});
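If the root cause is simply that no live Chainlink node watches requests made against a local Hardhat fork, one possible workaround in tests (a sketch only, assuming Hardhat with ethers v5, and replacing the timeout-based second test rather than adding to it) is to impersonate the oracle address from the question and fulfil the pending request manually:
const { ethers, network } = require("hardhat");

const ORACLE = "0xAA1DC356dc4B18f30C347798FD5379F3D77ABC5b"; // oracle address from the question

// start_new_lottery sends the request; ChainlinkClient emits ChainlinkRequested(bytes32 indexed id)
const tx = await lottery.start_new_lottery(60);
const receipt = await tx.wait();
const requestId = receipt.events.find(e => e.event === "ChainlinkRequested").args.id;

// Impersonate the oracle and give it ETH for gas
await network.provider.request({ method: "hardhat_impersonateAccount", params: [ORACLE] });
await network.provider.request({ method: "hardhat_setBalance", params: [ORACLE, "0xde0b6b3a7640000"] });
const oracleSigner = await ethers.getSigner(ORACLE);

// recordChainlinkFulfillment only accepts the call from the oracle that owns this request
await lottery.connect(oracleSigner).fulfill_alarm(requestId);

expect(await lottery.lotteryId()).to.equal(2);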

NativeScript Android: play sound on notification

I'm writing native Android code in a NativeScript Angular app to display a notification with sound.
I followed the recommendations in this answer.
I have a sound file named a.mp3 in /App_Resources/Android/src/main/res/raw/.
Here is the code to configure the sound of the notification:
const uri = new android.net.Uri.Builder()
    .scheme(android.content.ContentResolver.SCHEME_ANDROID_RESOURCE)
    .authority(application.android.nativeApp.getPackageName())
    .appendPath("raw")
    .appendPath("a.mp3")
    .build();

const AudioAttributes = android.media.AudioAttributes;
const audioAttributes = new AudioAttributes.Builder()
    .setContentType(AudioAttributes.CONTENT_TYPE_SONIFICATION)
    .setUsage(AudioAttributes.USAGE_NOTIFICATION)
    .build();
Here is the code to display a notification:
const NOTIFICATION_ID = 234;
const CHANNEL_ID = "my_channel_01";
const name = "my notifications";
const Description = "some desc";
const title = "notif title";
const message = "This notification has been triggered by me";
const NotificationManager = android.app.NotificationManager;

if (android.os.Build.VERSION.SDK_INT >= android.os.Build.VERSION_CODES.O) {
    const importance = android.app.NotificationManager.IMPORTANCE_HIGH;
    const mChannel = new android.app.NotificationChannel(CHANNEL_ID, name, importance);
    mChannel.setSound(uri, audioAttributes);
    mChannel.setDescription(Description);
    mChannel.enableLights(true);
    mChannel.setLightColor(android.graphics.Color.RED);
    mChannel.enableVibration(true);
    mChannel.setVibrationPattern([100, 200, 300, 400, 500, 400, 300, 200, 400]);
    mChannel.setShowBadge(false);
    notificationManager.createNotificationChannel(mChannel);
}

///////// Create an activity on tap (intent)
const Intent = android.content.Intent;
const PendingIntent = android.app.PendingIntent;
const intent = new Intent(context, com.tns.NativeScriptActivity.class);
intent.setFlags(Intent.FLAG_ACTIVITY_RESET_TASK_IF_NEEDED | Intent.FLAG_ACTIVITY_NEW_TASK);
const pendingIntent = PendingIntent.getActivity(context, 0, intent, 0);

///////// PRESERVE NAVIGATION ACTIVITY. To start a "regular activity" from your notification, set up
///////// the PendingIntent using TaskStackBuilder so that it creates a new back stack as follows.
///////// SEE: https://developer.android.com/training/notify-user/navigation.html
const TaskStackBuilder = android.support.v4.app.TaskStackBuilder;
const resultIntent = new Intent(context, com.tns.NativeScriptActivity.class);
const stackBuilder = TaskStackBuilder.create(context);
stackBuilder.addParentStack(com.tns.NativeScriptActivity.class);
stackBuilder.addNextIntent(resultIntent);

///////// Creating a notification
var NotificationCompat = android.support.v4.app.NotificationCompat;
const builder = new NotificationCompat.Builder(context, CHANNEL_ID)
    .setSound(uri)
    .setSmallIcon(android.R.drawable.btn_star_big_on)
    .setContentTitle(title)
    .setContentText(message)
    .setStyle(
        new NotificationCompat.BigTextStyle()
            .bigText("By default, the notification's text content is truncated to fit one line. If you want your notification to be longer, you can enable an expandable notification by adding a style template with setStyle(). For example, the following code creates a larger text area:")
    )
    .setPriority(NotificationCompat.PRIORITY_HIGH)
    // Set the intent that will fire when the user taps the notification
    .setContentIntent(pendingIntent)
    .setAutoCancel(true);

///////// Show the notification
notificationManager.notify(NOTIFICATION_ID, builder.build());
The notification indeed fires, but the sound file is not played.
Can anyone help to solve this, please?
I also tried another approach to acquiring the mp3 file, as recommended here:
const uri = android.net.Uri.parse("android.resource://" + context.getPackageName() + "/raw/a.mp3");
But that didn't help either.
Did I put the 'a.mp3' sound file in the correct folder that is recognized by Android?
Thanks
I found the solution for this. I'm writing an answer so that others can learn from this question.
Regarding the question I asked - Did I put the 'a.mp3' sound file in the correct folder that is recognized by Android?
The answer is yes: /App_Resources/Android/src/main/res/raw/ is where the above code will look for the sound file.
But I needed to make two modifications:
A_ The code needs to be changed so that it does not include the file extension:
const uri = new android.net.Uri.Builder()
    .scheme(android.content.ContentResolver.SCHEME_ANDROID_RESOURCE)
    .authority(application.android.nativeApp.getPackageName())
    .appendPath("raw")
    .appendPath("a") // referring to a.mp3
    .build();
and then the sound uri is used just like in the code in the question:
const mChannel = new android.app.NotificationChannel(CHANNEL_ID, name, importance);
mChannel.setSound(uri, audioAttributes);
or
const builder = new NotificationCompat.Builder(context, CHANNEL_ID)
.setSound(uri)
B_ I needed to tell webpack to pack the .mp3 files from /App_Resources/Android/src/main/res/raw/ and include them in the NativeScript build. To do this, I had to add { from: { glob: "**/*.mp3" } }, to webpack.config.js:
// Copy assets to out dir. Add your own globs as needed.
new CopyWebpackPlugin([
    { from: { glob: "fonts/**" } },
    { from: { glob: "**/*.png" } },
    { from: { glob: "**/*.mp3" } },
], { ignore: [`${relative(appPath, appResourcesFullPath)}/**`] }),
Webpack isn't configured to copy all files from App_Resources automatically. The same is true for any other file extension you use in your project (for example, .jpg files).
More info here (Q&A on 'nativescript-audio' plugin's git).
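Presumably (not verified here) the same no-extension rule applies to the Uri.parse variant tried earlier in the question:
// Hypothetical variant of the question's Uri.parse approach, with the ".mp3" extension dropped
const uri = android.net.Uri.parse("android.resource://" + context.getPackageName() + "/raw/a");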

Windows 8.1 store apps OnCommandsRequested doesn't add ApplicationCommands when async used

In App.xaml.cs I have the following code:
private async void OnCommandsRequested(SettingsPane settingsPane, SettingsPaneCommandsRequestedEventArgs e)
{
    var loader = ResourceLoader.GetForCurrentView();
    var generalCommand = new SettingsCommand("General Settings", "General Settings", handler =>
    {
        var generalSettings = new GeneralSettingsFlyout();
        generalSettings.Show();
    });
    e.Request.ApplicationCommands.Add(generalCommand);

    object data;
    IAuthService _authService = new AuthService();
    if (Global.UserId == 0)
        data = await _authService.GetSettingValueBySettingName(DatabaseType.GeneralDb, ApplicationConstants.GeneralDbSettingNames.ShowSupportInfo);
    else
        data = await _authService.GetSettingValueBySettingName(DatabaseType.UserDb, ApplicationConstants.UserDbSettingNames.ShowSupportInfo);

    if (data != null && data.ToString().Equals("1"))
    {
        var supportCommand = new SettingsCommand("Support", "Support", handler =>
        {
            var supportPane = new SupportFlyout();
            supportPane.Show();
        });
        e.Request.ApplicationCommands.Add(supportCommand);
    }

    var aboutCommand = new SettingsCommand("About", loader.GetString("Settings_OptionLabels_About"), handler =>
    {
        var aboutPane = new About();
        aboutPane.Show();
    });
    e.Request.ApplicationCommands.Add(aboutCommand);
}
This code adds the "General Settings" command, but neither the "Support" nor the "About" commands. Can anyone advise what's wrong with this code?
Instead of querying the commands from your service when they are requested, you'll need to query them ahead of time and then add the already-known commands.
You cannot use await in OnCommandsRequested.
An async method returns control to its caller when it reaches the first await, so only the commands added to the request before that await will be used.
Since SettingsPaneCommandsRequestedEventArgs doesn't provide a deferral, there is no way to tell the requester to wait for internal async calls to complete.
Note also that SettingsPane is deprecated and not recommended for new app development for Windows 10.
