How do I parse the data from the programSubscribe WebSocket endpoint? - solana

JSON.stringify({
  jsonrpc: '2.0',
  id: 1,
  method: 'programSubscribe',
  params: [
    address,
    {
      encoding: 'jsonParsed',
      commitment: 'processed',
    },
  ],
})
Using the above payload (and also with encoding: 'base64'), I get back something like this:
{
  context: { slot: 162051102 },
  value: {
    pubkey: '33DMmrkWEWEDWhPaXfXQxFTssreDjmkSHdo93Yp27fRV',
    account: {
      lamports: 2011440,
      data: [Array],
      owner: 'M2mx93ekt1fmXSVkTrUL9xVFHkmME8HTUi5Cyc5aF7K',
      executable: false,
      rentEpoch: 361
    }
  }
}
The data array looks like this...
['yKSZu3Y8yDPDGxjMPhSKClKTgYkg7frtqzkeSTNsC3TbZp0QRwNCS4mVGWt5DMnpawo5OX1XhzojzVpAp5AkOns3GVU1tI23CK/25BBZJGavm0hr5XZ58vaLQc3cMeAgkndKj2Ni7ROAAFliAAAAADVtgjQOEJ9Azw2AlH80qNd2MINdaKjv/nLVRWodXmGIAQAAAAAAAAD9AAAAAAAAAAA=',
'base64']
How do I turn the above string into something readable? Why is everything on Solana hidden behind more RPC calls?

Decode the account data from base64. This gives you a serialized byte array.
To then deserialize it into the data type (struct, etc.), you need to know the structure of the data and how it was serialized by the program. For example, Borsh and serde produce different serializations.
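As a minimal sketch (assuming Node.js; notification stands for the parsed subscription message shown above, and the field layout below is purely hypothetical, since the real offsets and types depend on how the owning program serialized its state):
// Minimal sketch: decode the base64 payload from the subscription message.
const [encoded, encoding] = notification.value.account.data; // e.g. ['yKSZ...', 'base64']
const raw = Buffer.from(encoded, encoding);

// Everything below is a HYPOTHETICAL layout for illustration only; you must
// know the owning program's serialization (e.g. Borsh) to pick real offsets.
const firstPubkey = raw.subarray(0, 32);   // e.g. a 32-byte public key
const someU64 = raw.readBigUInt64LE(32);   // e.g. a little-endian u64 after it

console.log(raw.length, firstPubkey.toString('hex'), someU64.toString());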

Related

Post data to a graphql server with request-promise

I'm using the request-promise library to make HTTP requests to a GraphQL server. To achieve a query, I'm doing this:
const query = `
  {
    user(id:"123173361311") {
      _id
      name
      email
    }
  }
`
const options = {
  uri: "http://localhost:5000/graphql",
  qs: { query },
  json: true
}
return await request(options)
The above code is working fine. However, I'm confused about how to go about a mutation, since I need to specify both the actual mutation and the input data, like this:
// Input
{
  name: "lomse",
  email: "lomse@lomse.com"
}
const mutation = `
  mutation addUser($input: AddUserInput!){
    addUser(input: $input) {
      _id
      name
      email
    }
  }
`
const option = {
  uri: "http://localhost:5000/graphql",
  formData: { mutation },
  json: true,
  // how to pass the actual data input
}
request.post(option)
Or is it that the request-promise library isn't designed for this use case?
Use body, not formData. Your body should consist of three properties:
query: The GraphQL document you're sending. Even if the operation is a mutation, the property is still named query.
variables: A map of your variable values, serialized as a JSON object. Only required if your operation uses variables.
operationName: Specifies which operation to execute. Only required if your document includes multiple operations.
request.post({
  uri: '...',
  json: true,
  body: {
    query: 'mutation { ... }',
    variables: {
      input: {
        name: '...',
        email: '...',
      },
    },
  },
})
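Put together for the addUser mutation above (a sketch only; it assumes the same local endpoint and that request is the request-promise function), that would look something like:
const request = require('request-promise')

const mutation = `
  mutation addUser($input: AddUserInput!) {
    addUser(input: $input) {
      _id
      name
      email
    }
  }
`

async function addUser(input) {
  return request.post({
    uri: 'http://localhost:5000/graphql',
    json: true,
    body: {
      query: mutation,      // still named "query" even though it's a mutation
      variables: { input }, // values for $input
    },
  })
}

addUser({ name: 'lomse', email: 'lomse@lomse.com' }).then(console.log)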
The graphql-request library seems to do what I needed the request-promise library to do.
import { request } from 'graphql-request'
const variables = {
  name: "lomse",
  email: "lomse@lomse.com"
}
const mutation = `
  mutation addUser($input: AddUserInput!){
    addUser(input: $input) {
      _id
      name
      email
    }
  }
`
const response = await request(uri, mutation, { input: variables })

JSON schema validation with perfect messages

I have divided the data entry in a REST call into 4 parts. Data can be sent to a REST call via:
headers
query params
path params
request body
So, in order to validate the presence of any key in any of the above 4 parts, I have created a schema in this format. If I have to validate anything in the query params, I add the key 'query' and then add the fields that need to be validated inside it:
const schema = {
  id: 'Users_login_post',
  type: 'object',
  additionalProperties: false,
  properties: {
    headers: {
      type: 'object',
      additionalProperties: false,
      properties: {
        Authorization: {
          type: 'string',
          minLength: 10,
          description: 'Bearer token of the user.',
          errorMessages: {
            type: 'should be a string',
            minLength: 'should be atleast of 23 length',
            required: 'should have Authorization'
          }
        }
      },
      required: ['Authorization']
    },
    path: {
      type: 'object',
      additionalProperties: false,
      properties: {
        orgId: {
          type: 'string',
          minLength: 23,
          maxLength: 36,
          description: 'OrgId Id of the Organization.',
          errorMessages: {
            type: 'should be a string',
            minLength: 'should be atleast of 23 length', // ---> B
            maxLength: 'should not be more than 36 length',
            required: 'should have OrgId'
          }
        }
      },
      required: ['orgId']
    }
  }
};
Now, in my Express code, I create a request object so that I can test the validity of the JSON in this format:
router.get("/org/:orgId/abc", function(req, res){
  var request = { //---> A
    path: {
      orgId: req.params.orgId
    },
    headers: {
      Authorization: req.headers.Authorization
    }
  }
  const Ajv = require('ajv');
  const ajv = new Ajv({
    allErrors: true,
  });
  let result = ajv.validate(schema, request);
  console.log(ajv.errorsText());
});
And I validate the above request object (at A) against my schema using AjV.
The output what I get looks something like this:
data/headers should have required property 'Authorization', data/params/orgId should NOT be shorter than 23 characters
Now I have a list of concerns:
Why does the message show the word data in data/headers and data/params/orgId, even though my variable is named request (at A)?
Also, why are my error messages not used? For orgId I specified 'should be atleast of 23 length' (at B) as the message, yet the message that came back was 'should NOT be shorter than 23 characters'.
How can I show request/headers instead of data/headers?
Also, is the way I validate my path params, query params, header params and body params correct? If it is not, what would be a better way of doing the same?
Please shed some light.
Thanks in advance.
Use ajv-keywords
import Ajv from 'ajv';
import AjvKeywords from 'ajv-keywords';
// ajv-errors needed for errorMessage
import AjvErrors from 'ajv-errors';
const ajv = new Ajv.default({ allErrors: true });
AjvKeywords(ajv, "regexp");
AjvErrors(ajv);
// modification of regex by requiring Z https://www.regextester.com/97766
const ISO8601UTCRegex = /^(-?(?:[1-9][0-9]*)?[0-9]{4})-(1[0-2]|0[1-9])-(3[01]|0[1-9]|[12][0-9])T(2[0-3]|[01][0-9]):([0-5][0-9]):([0-5][0-9])(.[0-9]+)?Z$/;
const typeISO8601UTC = {
  "type": "string",
  "regexp": ISO8601UTCRegex.toString(),
  "errorMessage": "must be string of format 1970-01-01T00:00:00Z. Got ${0}",
};
const schema = {
  type: "object",
  properties: {
    foo: { type: "number", minimum: 0 },
    timestamp: typeISO8601UTC,
  },
  required: ["foo", "timestamp"],
  additionalProperties: false,
};
const validate = ajv.compile(schema);
const data = { foo: 1, timestamp: "2020-01-11T20:28:00" }
if (validate(data)) {
  console.log(JSON.stringify(data, null, 2));
} else {
  console.log(JSON.stringify(validate.errors, null, 2));
}
https://github.com/rofrol/ajv-regexp-errormessage-example
AJV cannot know the name of the variable you passed to the validate function.
However you should be able to work out from the errors array which paths have failed (and why) and construct your messages from there.
See https://ajv.js.org/#validation-errors
To use custom error messages in your schema, you need an AJV plugin: ajv-errors.
See https://github.com/epoberezkin/ajv-errors
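Two concrete points, as a small sketch (assuming the ajv and ajv-errors packages): ajv.errorsText accepts a dataVar option that replaces the default data prefix, and ajv-errors looks for the singular errorMessage keyword rather than errorMessages:
const Ajv = require('ajv');
const ajvErrors = require('ajv-errors');

const ajv = new Ajv({ allErrors: true }); // allErrors is required by ajv-errors
ajvErrors(ajv);

const schema = {
  type: 'object',
  properties: {
    orgId: {
      type: 'string',
      minLength: 23,
      // singular "errorMessage", not "errorMessages"
      errorMessage: {
        type: 'should be a string',
        minLength: 'should be atleast of 23 length',
      },
    },
  },
  required: ['orgId'],
};

const validate = ajv.compile(schema);
if (!validate({ orgId: 'too-short' })) {
  // dataVar swaps the "data" prefix for "request" in the summary text
  console.log(ajv.errorsText(validate.errors, { dataVar: 'request' }));
}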

I have confusion on relay and graphql resolve method

Apologies if this is a stupid question. This is the code for Relay/GraphQL pagination that's confusing me:
const GraphQLTodo = new GraphQLObjectType({
  name: 'Todo',
  fields: {
    id: globalIdField('Todo'),
    text: {
      type: GraphQLString,
      resolve: (obj) => obj.text,
    },
    complete: {
      type: GraphQLBoolean,
      resolve: (obj) => obj.complete,
    },
  },
  interfaces: [nodeInterface],
});
/* When pagination is needed, make a connection */
const {
  connectionType: TodosConnection,
  edgeType: GraphQLTodoEdge,
} = connectionDefinitions({
  name: 'Todo',
  nodeType: GraphQLTodo,
});
const GraphQLUser = new GraphQLObjectType({
  name: 'User',
  fields: {
    id: globalIdField('User'),
    todos: {
      type: TodosConnection,
      args: {
        status: {
          type: GraphQLString,
          defaultValue: 'any',
        },
        ...connectionArgs,
      },
      resolve: (obj, {status, ...args}) =>
        connectionFromArray(getTodos(status), args),
    },
    totalCount: {
      type: GraphQLInt,
      resolve: () => getTodos().length,
    },
    completedCount: {
      type: GraphQLInt,
      resolve: () => getTodos('completed').length,
    },
  },
  interfaces: [nodeInterface],
});
const Root = new GraphQLObjectType({
  name: 'Root',
  fields: {
    viewer: {
      type: GraphQLUser,
      resolve: () => getViewer(),
    },
    node: nodeField,
  },
});
You can see that GraphQLTodo has text and complete fields whose resolve functions are passed an obj parameter. How is obj passed there? Is it from the GraphQLUser resolve? I've read in the docs that source (in this case obj) is "The object resolved from the field on the parent type." Is it not from the root query? How is obj created here?
The Connection
Here is where (some of) the magic happens:
const {
  connectionType: TodosConnection,
  edgeType: GraphQLTodoEdge,
} = connectionDefinitions({
  name: 'Todo',
  nodeType: GraphQLTodo,
});
You have now told GraphQL that a TodosConnection is going to be made up of GraphQLTodo nodes. Now, let's take a look at where the objects are actually fetched for the connection in your GraphQLUser object, which is on the todos field:
todos: {
  type: TodosConnection,
  args: {
    status: {
      type: GraphQLString,
      defaultValue: 'any',
    },
    ...connectionArgs,
  },
  resolve: (obj, {status, ...args}) =>
    connectionFromArray(getTodos(status), args),
},
So where does the object come from? The key part here is the getTodos function, which is responsible for actually getting an array of the objects from your data source. Since this field is a TodosConnection and we've already specified in the connection definitions that the nodes are GraphQLTodos, GraphQL knows that the text and complete fields are resolved by getting (in this case) identically named fields on the objects that have been returned. In other words, the returned object is passed to the resolve method on each field.
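For example (illustrative only; the real getTodos belongs to the example app's data layer), if getTodos returned plain objects like these, resolve: (obj) => obj.text on GraphQLTodo would read text straight off each returned object:
// Illustrative stand-in for the example app's data layer.
function getTodos(status = 'any') {
  const todos = [
    { id: '1', text: 'Buy milk', complete: false },
    { id: '2', text: 'Ship release', complete: true },
  ];
  if (status === 'completed') return todos.filter((t) => t.complete);
  return todos;
}
// Each object above becomes the `obj` argument of GraphQLTodo's field
// resolvers, so `resolve: (obj) => obj.text` returns e.g. 'Buy milk'.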
Querying the Root
You have two fields exposed on Root: viewer and node. Ignoring node for a moment, you have just one way to actually query todos. Since viewer is of type GraphQLUser, and GraphQLUser has that todos field, they can be fetched only as a subfield of viewer, like this:
{
  viewer {
    todos(first: 10) {
      edges {
        # each node is a Todo item
        node {
          text
          complete
        }
      }
    }
  }
}
Mystery of the Node
But what about that node field? Relay wants to be able to fetch any object using a top-level query, i.e. on your Root field, when given a unique globalId, which is just a base64 encoding of the type name and the id, so Todo:1 is encoded to VG9kbzox. This is set up in the nodeDefinitions (which you haven't included here, but probably have). In those definitions, the globalId is parsed back into the type (Todo) and id (1), and once again you then tell it how to fetch the correct object from your data source. It might look something like:
const { nodeInterface, nodeField } = nodeDefinitions(
  (globalId) => {
    const { type, id } = fromGlobalId(globalId);
    if (type === 'Todo') {
      return getTodo(id)
    } else if (type === 'User') {
      return getUser(id)
    }
    ...
Because you're implementing the nodeInterface in both your GraphQLTodo and GraphQLUser types, Relay will be able to query for either of them from the Root's node field.
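For instance (a sketch; VG9kbzox is just the base64 encoding of Todo:1 mentioned above), a node query would look like this:
{
  node(id: "VG9kbzox") {
    ... on Todo {
      text
      complete
    }
  }
}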

Each node of data performs a separate database request

Below are the GraphQLObjectType fields:
userId: {
  type: GraphQLID,
  resolve: obj => {
    console.log(obj._id);
    return obj._id;
  }
},
email: { type: GraphQLString },
password: { type: GraphQLString },
firstName: { type: GraphQLString },
lastName: { type: GraphQLString },
My server sends as many requests as I have documents; here it will send 5 different requests.
How can I optimize these requests to fetch all the data in one request?
589800cf39b58b29c4de90dd
--------------------------------
58980673e7c9a733009091d1
--------------------------------
58985339651c4a266848be42
--------------------------------
589aac5f884b062b979389bc
--------------------------------
589aad9d24989c2d50f2a25a
In such a case you could create a query method which would accept an array as a parameter, which would be an array of IDs in this case.
getUsers: {
  type: new GraphQLList(User),
  args: {
    ids: {
      type: new GraphQLNonNull(new GraphQLList(new GraphQLNonNull(GraphQLID)))
    }
  },
  resolve: (root, args, context) => {
    let query = 'SELECT * FROM users WHERE id = ANY($1)';
    // return the promise so GraphQL waits for the rows
    return pgClient.query(query, [args.ids])
      .then(result => result.rows);
  }
}
The query variable would differ depending on what database you are using. In this example I have used the node-postgres module for PostgreSQL. However, the concept is the same: use an array of ids to perform a single query that returns all users.
And then you could call that query:
query getUsers($ids: [ID!]!) {
  getUsers(ids: $ids) {
    id
    email
    ...
  }
}
// and the variables below
{
  ids: ['id#1', 'id#2', 'id#3']
}
This is a job for Dataloader, a library from Facebook specifically for batching together queries like this:
https://github.com/facebook/dataloader
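A minimal sketch of that approach (assuming the same node-postgres pgClient as above): DataLoader collects every .load(id) call made in the same tick and hands the whole key list to one batch function, so the resolvers end up issuing a single SQL query:
const DataLoader = require('dataloader');

// Batch function: receives all ids requested in one tick and must resolve
// to an array of results in the same order as the ids.
const userLoader = new DataLoader(async (ids) => {
  const result = await pgClient.query('SELECT * FROM users WHERE id = ANY($1)', [ids]);
  const byId = new Map(result.rows.map(row => [String(row.id), row]));
  return ids.map(id => byId.get(String(id)) || null);
});

// In a resolver, e.g.:
// resolve: obj => userLoader.load(obj._id)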

ExtJS 5.0.1: Unable to use anonymous models in a Session

I'm getting this error:
[E] Ext.data.Session.checkModelType(): Unable to use anonymous models
in a Session
when trying to use a Session while binding a Grid to a Store via a ViewModel:
ViewModel:
Ext.define('App.view.gridViewModel', {
  extend: 'Ext.app.ViewModel',
  alias: 'viewmodel.gridview',
  stores: {
    gridstore: {
      model: 'gridView',
      autoLoad: true,
      //This triggers the Exception:
      session: true,
      listeners: {
        beforeload: function(store, operation, eOpts) {
          var oProxy = this.getProxy();
          oProxy.setExtraParams({
            tableName: 'dbo.SomeTable',
            identityKey: "id",
            primaryKey: ["id"],
            mode: "readonly",
            lang: "es",
            output: 'json'
          });
        }
      }
    }
  }
});
View:
Ext.define('App.view.gridView', {
  extend: 'Ext.form.Panel',
  //...
  viewModel: {
    type: 'gridview'
  },
  controller: 'gridview',
  // Create a session for this view
  session: true,
  items: [{
    xtype: 'grid',
    reference: 'myGrid',
    bind: '{gridstore}',
    columns: [
      //...
    ]
  }]
  //...
});
The Model's data is fetched through a Proxy:
Model:
Ext.define("App.model.gridView", {
extend: 'Ext.data.Model',
schema: {
namespace: 'App.model'
},
proxy: {
//proxy remote api stuff......
}.
idProperty: 'id'.
primaryKeys: 'id'.
fields: [
//fields
]
});
I have no idea what an anonymous model is and I haven't found anything related on the web. Any ideas?
Thanks in advance!
The reason seems to be that my server's response contains a JSON object called "metaData", which collides with the one handled by the JSON Reader:
Response MetaData
The server can return metadata in its response, in addition to the
record data, that describe attributes of the data set itself or are
used to reconfigure the Reader. To pass metadata in the response you
simply add a metaData attribute to the root of the response data. The
metaData attribute can contain anything, but supports a specific set
of properties that are handled by the Reader if they are present:
http://docs.sencha.com/extjs/5.0/apidocs/#!/api/Ext.data.reader.Json
The curious thing is that I don't use any of the available metaData options for the JSON Reader, and nothing related to any anonymous model, so this might be considered a bug.
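For illustration (all field names other than metaData are hypothetical), a server response shaped like this is enough to trigger the reader's metadata handling, even though none of the documented metaData options are used:
{
  "success": true,
  "metaData": { "someServerInfo": "not an Ext reader option" },
  "data": [
    { "id": 1, "name": "example row" }
  ]
}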
