Is it possible to subscribe to asset creation events in composer? - hyperledger-composer

I realise I can emit events in chaincode and listen for those events in the node SDK.
However, I'd like to be able to receive an event when an asset is created so that I can act on it. Is this possible?
An alternative might be the ability to specify a function which is called when an asset is added.

I can think of one workaround:
1. Create a transaction that creates the asset and emits an event.
2. Restrict the creation of the asset to that transaction only in the .acl file.
Example:
rule CreateAssetOnlyWithTx {
  description: "Allow the asset to be created only through the transaction"
  participant: "ANY"
  operation: CREATE
  resource: "org.example.blah.asset"
  transaction: "org.example.blah.tx"
  action: ALLOW
}

If you add an asset through the transaction function, you can emit an event as soon as it is created. Note that the event is only emitted if the asset was successfully created. See more here: https://hyperledger.github.io/composer/business-network/publishing-events.html
// Add a vehicle to the asset registry and emit an event - not tested.
var factory = getFactory();
// Create the vehicle.
var vehicle = factory.newResource('org.acme', 'Vehicle', 'VEHICLE_1');
vehicle.colour = 'BLUE';
return getAssetRegistry('org.acme.Vehicle')
  .then(function (vehicleAssetRegistry) {
    // Add the vehicle to the vehicle asset registry.
    return vehicleAssetRegistry.add(vehicle);
  })
  .then(function () {
    // Only reached if the add succeeded, so the event is only emitted for a successful creation.
    var addNotification = factory.newEvent('org.acme.trading', 'AddNotification');
    addNotification.vehicle = vehicle; // the object passed into the transaction etc etc
    emit(addNotification);
  })
  .catch(function (error) {
    // Add optional error handling here. If the transaction fails, no event is emitted.
  });
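On the listening side, a rough sketch of subscribing to that event with the composer-client Node SDK (the card name and namespaces below are assumptions; not tested against a live network):
// Subscribe to events emitted by the business network - sketch only, not tested.
const BusinessNetworkConnection = require('composer-client').BusinessNetworkConnection;

const connection = new BusinessNetworkConnection();
connection.connect('admin@my-network') // assumed business network card name
  .then(function () {
    connection.on('event', function (event) {
      // filter for the AddNotification event emitted by the transaction above
      if (event.getFullyQualifiedType() === 'org.acme.trading.AddNotification') {
        console.log('Asset created:', event.vehicle);
      }
    });
  })
  .catch(function (error) {
    console.error(error);
  });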

Related

Apollo conditional data sources & initialization lifecycle

I have a specific use case where a user’s data sources are conditional - e.g based on the data sources saved in the database for every specific user.
This also means every data source has unique credentials for every user, which is fine for RESTDataSource because I can use the willSendRequest to set the Authentication headers before each request.
However, I have custom data sources that have proprietary clients (for example JSForce for Salesforce) - and they have their own fetch mechanism.
As of now, I have a custom transformer directive that fetches the tokens from the database and adds them to the context. However, the directive is run before the dataSource.initialize() method, so I can't use the credentials there because the context doesn't have them yet.
I also don't want to initialize every data source for every user when it isn't used in the current request, but the dataSources() function doesn't accept any parameters and isn't contextual.
Bottom line: is it possible to pass data sources conditionally, even based on the Express request? When is the right time to pass the tokens and credentials to the data source? Maybe I should add my own custom init function and call it from the directive?
So you have options. Here are 2 choices:
1. Just add your dataSources
If you just initialize all dataSources, internally each one can check whether the user has access. You could have a getClient function that resolves to the client or throws an UnauthorizedError, depending.
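A rough sketch of that idea for a proprietary client like JSForce (loadCredentials and UnauthorizedError are placeholder names, not Apollo or JSForce APIs):
const { DataSource } = require('apollo-datasource')
const jsforce = require('jsforce')

class SalesforceAPI extends DataSource {
  initialize({ context }) {
    this.context = context
  }

  // Resolve the proprietary client lazily, only when a resolver actually needs it
  async getClient() {
    const creds = await loadCredentials(this.context.user, 'salesforce') // placeholder credentials lookup
    if (!creds) {
      throw new UnauthorizedError('No Salesforce credentials for this user') // placeholder error class
    }
    return new jsforce.Connection({
      instanceUrl: creds.instanceUrl,
      accessToken: creds.accessToken,
    })
  }
}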
2. Don't just add your dataSources
So if you really don't want to initialize the dataSources at ALL, you can absolutely do this by adding the "dataSources" yourself, just like Apollo does it.
const server = new ApolloServer({
  // this example uses apollo-server-express
  context: async ({ req, res }) => {
    const accessToken = req.headers?.authorization?.split(' ')[1] || ''
    const user = accessToken && buildUser(accessToken)
    const context = { user }
    // You can't use the name "dataSources" in your config because ApolloServer will puke, so I called them "services"
    await addServices(context)
    return context
  }
})
const addServices = async (context) => {
  const { user } = context;
  const services = {
    userAPI: new UserAPI(),
    postAPI: new PostAPI(),
  }
  if (user.isAdmin) {
    services.adminAPI = new AdminAPI()
  }
  const initializers = [];
  for (const service of Object.values(services)) {
    if (service.initialize) {
      initializers.push(
        service.initialize({
          context,
          cache: null, // or add your own cache
        })
      );
    }
  }
  await Promise.all(initializers);
  /**
   * this is where you have to deviate from Apollo.
   * You can't use the name "dataSources" in your config because ApolloServer will puke
   * with the error 'Please use the dataSources config option instead of putting dataSources on the context yourself.'
   */
  context.services = services;
}
Some notes:
1. You can't call them "dataSources"
If you return a property called "dataSources" on your context object, Apollo will not like it very much [meaning it throws an Error]. In my example, I used the name "services", but you can do whatever you want... except "dataSources".
With the above code, in your resolvers, just reference context.services.whatever instead.
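For example, a resolver would then look something like this (postAPI and getPosts are just illustrative names):
const resolvers = {
  Query: {
    posts: (parent, args, context) => {
      // before: context.dataSources.postAPI.getPosts()
      return context.services.postAPI.getPosts()
    },
  },
}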
2. This is what Apollo does
This pattern is copied directly from what Apollo already does for dataSources [source]
3. I recommend you still treat them as DataSources
I recommend you stick to the DataSources pattern and that your "services" all extend DataSource. It's going to be easier for everyone involved.
4. Type safety
If you're using TypeScript or something similar, you're going to lose a bit of type safety, since context.services will be one shape or another. Even if you're not, be careful: you may end up throwing "Cannot read property users of undefined" errors instead of "Unauthorized" errors. You might be better off creating "dummy services" that expose the same object shape but just throw Unauthorized.
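A sketch of that "dummy service" idea, assuming the real AdminAPI exposes something like getUsers and deleteUser (mirror whatever methods yours actually has):
const { DataSource } = require('apollo-datasource')

// Same shape as the real AdminAPI, but every method refuses access
class UnauthorizedAdminAPI extends DataSource {
  getUsers() {
    throw new Error('Unauthorized') // or your own UnauthorizedError class
  }
  deleteUser() {
    throw new Error('Unauthorized')
  }
}

// then, inside addServices:
// services.adminAPI = user.isAdmin ? new AdminAPI() : new UnauthorizedAdminAPI()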

How do you get data back from a react-redux store?

Using React-Redux
I have a select list where, when the user chooses one of the options, an item is created and placed in the database (if it matters, the reason it's a select box is that there are multiple variations of the same item, and which variation was chosen is important).
This is working correctly.
My problem is that I am not sure how I can get the id of the new item out of the redux store.
And just for chuckles, the prior developer set all this up with sagas. So I am still coming up to speed on how it all works together.
So when the select box is checked, the function checkFunction is called, which calls the createItem function in the saga file. These functions are below:
in Repositories.jsx
checkFunction = (data) => {
  const { createItem } = this.props;
  // data holds the info that we need to send to the action
  const created = createItem(data);
  // once data comes back, redirect the user to the details of the created item
  // need id of created item
  // navigateTo(`/item-details/${created.id}`);
}
in Repositories.saga.js
export function* createItem(action) {
  try {
    const { payload: newItem } = action;
    // call api to create item
    const created = yield call(requestAPI.post, ITEMS_URL, newItem);
    // send message that it's been done
    yield put(actions.repositories.createItem.ok(created));
    // send message to refresh item list
    yield put(actions.inventories.fetchItems.start());
  } catch (e) {
    yield put(actions.repositories.createItem.fail(e));
  }
}
I don't understand how to return the id of the created item once it's created. I feel like I am missing something basic here. Can anyone point me in the right direction?
Getting data from a saga back to a React component is actually not trivial. There are multiple approaches to do what you need, although each has its downside.
1. Call navigateTo in the saga.
export function* createItem(action) {
  ...
  const created = yield call(requestAPI.post, ITEMS_URL, newItem);
  yield call(navigateTo, `/item-details/${created.id}`)
}
This would be my recommended solution if you can get the navigateTo function into the saga. Navigation is a side effect, and sagas are there to deal with side effects. Make sure to use the call effect; changing the URL by calling the function directly can lead to issues.
2. Store the latest created item id in redux store
In your reducer, when the actions.repositories.createItem.ok(created) action is dispatched, store the created item info so the latest created item reaches the component (a reducer sketch follows below). Then you can use componentDidUpdate or useEffect to call navigateTo when that value changes.
// in a function component (hooks can't be called inside a class render method)
const created = useSelector(state => state.created);
useEffect(() => {
  if (created) {
    navigateTo(`/item-details/${created.id}`);
  }
}, [created]);
...
This has the disadvantage that the component will rerender because of the changed created value.
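A minimal reducer sketch for this approach (the state shape and the action type constant are assumptions about your setup):
// stores the most recently created item so the component above can react to it
const initialState = { created: null };

export function createdReducer(state = initialState, action) {
  switch (action.type) {
    case 'repositories/createItem/ok': // whatever type actions.repositories.createItem.ok() dispatches
      return { ...state, created: action.payload };
    default:
      return state;
  }
}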
3. Send callback in the createItem action
You can put a function into your action and then call it from the saga, essentially using it as a callback.
Component:
checkFunction = (data) => {
  const { createItem } = this.props;
  // data holds the info that we need to send to the action
  createItem(data, (created) => {
    navigateTo(`/item-details/${created.id}`);
  });
}
Saga:
export function* createItem(action) {
  ...
  const created = yield call(requestAPI.post, ITEMS_URL, newItem);
  action.callback(created)
  ...
}
The problem with this approach is that functions are not serializable, so you should ideally avoid putting them in your actions. Also, technically, there could be multiple sagas handling the same action, and then it gets confusing which one should call the callback.

Fabric.js - Sync object:modified event to another client

Collaboration Mode:
What is the best way to propagate changes from Client #1's canvas to client #2's canvas? Here's how I capture and send events to Socket.io.
$scope.canvas.on('object:modified', function(e) {
  Socket.whiteboardMessage({
    eventId: 'object:modified',
    event: e.target.toJSON()
  });
});
On the receiver side, this code works splendidly for adding new objects to the screen, but I could not find documentation on how to select and update an existing object in the canvas.
fabric.util.enlivenObjects([e.event], function(objects) {
  objects.forEach(function(o) {
    $scope.canvas.add(o);
  });
});
I did see that Objects have individual setters and one bulk setter, but I could not figure out how to select an existing object based on the event data.
Ideally, the flow would be:
Receive event with targeted object data.
Select the existing object in the canvas.
Perform bulk update.
Refresh canvas.
Hopefully someone with Fabric.JS experience can help me figure this out. Thanks!
UPDATED ANSWER - Thanks AJM!
AJM was correct in suggesting a unique ID for every newly created element. I was also able to create a new ID for all newly created drawing paths as well. Here's how it worked:
var t = new fabric.IText('Edit me...', {
  left: $scope.width / 2 - 100,
  top: $scope.height / 2 - 50
});
t.set('id', randomHash());
$scope.canvas.add(t);
I also captured newly created paths and added an id:
$scope.canvas.on('path:created', function(e) {
  if (e.target.id === undefined) {
    e.target.set('id', randomHash());
  }
});
However, I encountered an issue where my ID was visible in console log, but it was not present after executing object.toJSON(). This is because Fabric has its own serialization method which trims down the data to a standardized list of properties. To include additional properties, I had to serialize the data for transport like so:
$scope.canvas.on('object:modified', function(e) {
  Socket.whiteboardMessage({
    object: e.target.toJSON(['id']) // includes "id" in output.
  });
});
Now each object has a unique ID with which to perform updates. On the receiver's side of my code, I added AJM's object-lookup function. I placed this code in the "startup" section of my application so it would only run once (after Fabric.js is loaded, of course!)
fabric.Canvas.prototype.getObjectById = function (id) {
  var objs = this.getObjects();
  for (var i = 0, len = objs.length; i < len; i++) {
    if (objs[i].id == id) {
      return objs[i];
    }
  }
  return 0;
};
Now, whenever a new socket.io message is received with whiteboard data, I am able to find it in the canvas via this line:
var obj = $scope.canvas.getObjectById(e.object.id);
Inserting and removing are easy, but for updating, this final piece of code did the trick:
obj.set(e.object); // Updates properties
$scope.canvas.renderAll(); // Redraws canvas
$scope.canvas.calcOffset(); // Updates offsets
All of this required me to handle the following events. Paths are treated as objects once they're created.
$scope.canvas.on('object:added',function(e) { });
$scope.canvas.on('object:modified',function(e) { });
$scope.canvas.on('object:moving',function(e) { });
$scope.canvas.on('object:removed',function(e) { });
$scope.canvas.on('path:created',function(e) { });
I did something similar involving a single shared canvas between multiple users and ran into this exact issue.
To solve this problem, I added unique IDs (using a javascript UUID generator) to each object added to the canvas (in my case, there could be many users working on a canvas at a time, thus I needed to avoid collisions; in your case, something simpler could work).
Fabric objects' set method will let you add an arbitrary property, like an id: o.set('id', yourid). Before you add() a new Fabric object to your canvas (and send that across the wire), tack on an ID property. Now, you'll have a unique key by which you can pick out individual objects.
From there, you'd need a method to retrieve an object by ID. Here's what I used:
fabric.Canvas.prototype.getObjectById = function (id) {
  var objs = this.getObjects();
  for (var i = 0, len = objs.length; i < len; i++) {
    if (objs[i].id == id) {
      return objs[i];
    }
  }
  return null;
};
When you receive data from your socket, grab that object from the canvas by ID and mutate it using the appropriate set methods or copying properties wholesale (or, if getObjectById returns null, create it).
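Putting it together, a receiver-side handler could look roughly like this (socket.on stands in for however your Socket service delivers whiteboard messages):
socket.on('whiteboardMessage', function(message) {
  var existing = $scope.canvas.getObjectById(message.object.id);
  if (existing) {
    // the object is already on this canvas: copy the new properties over and redraw
    existing.set(message.object);
    $scope.canvas.renderAll();
  } else {
    // the object is new to this client: enliven it and add it to the canvas
    fabric.util.enlivenObjects([message.object], function(objects) {
      objects.forEach(function(o) {
        $scope.canvas.add(o);
      });
    });
  }
});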

IndexedDB error in FireFox when calling createObjectStore

I'm trying to call createObjectStore on a newly created IndexedDB database and getting this error in Firefox: InvalidStateError: A mutation operation was attempted on a database that did not allow mutations.
Here is my code:
var indexed_db = window.indexedDB || window.webkitIndexedDB || window.mozIndexedDB;
if (indexed_db) {
  var request = indexed_db.open("Map Tiles", 1);
  request.onerror = function(event) { };
  request.onsuccess = function(event) {
    var tile_store = event.target.result.createObjectStore("map", {keyPath: ["zoom_level", "tile_column", "tile_row"]});
  };
  request.onupgradeneeded = function(event) { };
}
The error is happening when I call createObjectStore. Any help would be appreciated.
There are basically three types of transactions with indexedDB: readonly, readwrite, and versionchange. You can add/remove objects to/from an object store in a transaction that is the readwrite type. Technically you can also add/remove objects in versionchange but it is not what I consider a best practice. However, you cannot create/remove object stores or indices in a readwrite/readonly type transaction (you get this error). You can only do objectstore/index create/remove in a versionchange transaction.
You can directly create transactions of the readonly and readwrite types, but you cannot create a versionchange transaction yourself; versionchange only happens within an upgradeneeded event callback. Effectively, you can only make schema changes in the onupgradeneeded callback. So, as your comment says, doing schema changes (adding/removing stores or indices) outside of a versionchange transaction triggers this error, and that covers every transaction except the specially typed one provided inside onupgradeneeded.
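A minimal sketch of the fix, moving the schema change from onsuccess into onupgradeneeded (store name and key path taken from the question):
var request = indexed_db.open("Map Tiles", 1);
request.onupgradeneeded = function(event) {
  // runs inside a versionchange transaction, so schema changes are allowed here
  var db = event.target.result;
  if (!db.objectStoreNames.contains("map")) {
    db.createObjectStore("map", {keyPath: ["zoom_level", "tile_column", "tile_row"]});
  }
};
request.onsuccess = function(event) {
  // from here on, only readonly/readwrite transactions against the existing stores
  var db = event.target.result;
};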

Backbone pass object with event

Reading up on tutorials of Backbone, it seems that when the add event is fired from a collection, the item added is sent along with the event (same goes for remove). I can't find any documentation on this feature on the backbonejs.org site and was curious if there was a way I could send an object along with my custom events. Secondly, is something like this possible in Marionette?
Each object defined by Backbone mixes in Backbone.Events which means you can trigger events with object.trigger. It is defined as
trigger object.trigger(event, [*args])
Trigger callbacks for the given event, or space-delimited list of events. Subsequent arguments
to trigger will be passed along to the event callbacks.
You just have to pass additional arguments to get them in your callbacks.
For example,
var m = new Backbone.Model();
m.on('custom', function(more) {
  console.log(more);
});
m.trigger('custom', 'more info');
will log more info
See http://jsfiddle.net/nikoshr/HpwXe/ for a demo
You would trigger an event with a reference to the object to emulate the behavior of Backbone:
var m = new Backbone.Model();
m.on('custom', function(model, more) {
  console.log(arguments);
});
m.trigger('custom', m, 'more info');
http://jsfiddle.net/nikoshr/HpwXe/1/
And in a derived model:
var M = Backbone.Model.extend({
  custom: function() {
    this.trigger('custom', this);
  }
});
var m = new M();
m.on('custom', function(model, more) {
  console.log(model);
});
m.custom();
http://jsfiddle.net/nikoshr/HpwXe/2/
Yes, of course. You can use Backbone.Events:
var Collection = Backbone.Collection.extend();
var collection = new Collection();
collection.on("message", function(message) {
  console.log(message);
});
var model = new Backbone.Model();
collection.add(model);
model.trigger("message", "This is message");
See the Backbone documentation for the types of events you can use.
Also you can use Event Aggregator from Marionette.js
An event aggregator implementation. It extends from Backbone.Events to provide the core event handling code in an object that can itself be extended and instantiated as needed.
var vent = new Backbone.Wreqr.EventAggregator();
vent.on("foo", function() {
  console.log("foo event");
});
vent.trigger("foo");
