How to tell if a streamDestroyed event was triggered by session.disconnect() or closing of browser - opentok

I have a user interface with a "Hang Up" button used to end a call.
The "Hang Up" button calls session.disconnect()
On calling session.disconnect() the other participant in the call listens for the streamDestroyed event which returns a reason of "clientDisconnected"
According to the docs:
"clientDisconnected" — A client disconnected from the session by calling the disconnect() method of the Session object or by closing the browser.
Thing is, I need to differentiate between these two scenarios so that I can provide different messaging to the call participant.
How do I tell if the call ended by 1) explicit hang up (calling session.disconnect()) or 2) closing of the browser?

The only other options for the reason are "networkDisconnected", which means that the other participant lost their internet connection, and "forceDisconnected", which means the participant was force-disconnected using session.forceDisconnect(). Neither of these is what you want.
You could keep track of hangups yourself, though, if you send a signal to everyone in the session before you hang up. Then, on the receiving side, keep track of whether someone has sent the hangup signal, e.g.:
// On the hangup side
function hangup() {
  session.signal({type: 'hangup'}, function() {
    session.disconnect();
  });
}
// On the receiving side...
var hangups = {};
session.on('signal:hangup', function(event) {
  hangups[event.from.connectionId] = true;
});
session.on('streamDestroyed', function(event) {
  if (hangups[event.stream.connection.connectionId] === true) {
    // They pressed the hangup button!
  }
});
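If you also want distinct messaging for the browser-close case, the same streamDestroyed handler can branch on that flag; a "clientDisconnected" reason with no preceding hangup signal then most likely means the browser or tab was closed. A minimal sketch, reusing session and hangups from above, where showMessage is a hypothetical UI helper:
session.on('streamDestroyed', function(event) {
  var id = event.stream.connection.connectionId;
  if (hangups[id] === true) {
    showMessage('The other participant hung up.');              // explicit session.disconnect()
  } else if (event.reason === 'clientDisconnected') {
    showMessage('The other participant closed their browser.'); // no hangup signal was received
  }
});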

Related

How can I use `firstValueFrom` with `WebSocketSubject` without closing the underlying web socket?

I am using a WebSocketSubject, and I often want to block execution until a given event arrives, which is why I use firstValueFrom, like in the following code:
let websocket = new WebSocketSubject<any>(url);
let firstMessage = await firstValueFrom(websocket.pipe(filter(m => true)));
I only have one issue, which is that firstValueFrom calls websocket.unsubscribe() when it resolves the promise, but on a WebSocketSubject that has the effect of closing the underlying Web Socket, which I want to keep open!
Currently, I have thought of a few possible ways out:
Writing an equivalent of firstValueFrom that does not unsubscribe.
Counter argument: I would prefer not reimplementing a function that is nearly perfect, except for one small issue;
Using another Subject that will subscribe to WebSocketSubject, and I will use firstValueFrom on that subject.
Counter argument: In terms of usage, I see potential confusion to have two Subject objects, and having to know which one to use (E.g. Use websocket.next for sending messages upstream, and only use websocketProxy for receiving messages, and never get confused between the two!);
Using multiplex to create temporary Observable objects that will then be closed by firstValueFrom without issue.
Counter argument: As I am not actually multiplexing in this case, I would rather not use that method, whose signature and usage seem overkill for my use case.
In short, I suspect that I am missing something basic (e.g. an appropriate OperatorFunction) that would allow me to make it so that the unsubscribe call made by firstValueFrom does not result in the underlying web socket being closed.
Essentially, you want to always have a subscription so the socket connection stays open. I don't think firstValueFrom is the proper tool for the job; I think it's simpler to just create an explicit subscription.
If the intent is to keep it open for the lifetime of the app, just subscribe at app launch.
Since you want to filter out the first several emissions until some condition is met, you can use skipWhile:
const websocket = new WebSocketSubject<any>(url);
const messages = websocket.pipe(skipWhile(m => m !== 'my special event'));
websocket.subscribe(); // keep socket open
// listen
messages.subscribe(m => console.log('message received:', m));
// send
websocket.next('hello server');
It may be worth creating a light wrapper class around the rxjs websocket that handles keeping the connection open and filtering out the first few events:
class MyWebsocket {
  private websocket: WebSocketSubject<any>;
  public messages: Observable<any>;
  constructor(private url: string) {
    this.websocket = new WebSocketSubject<any>(this.url);
    this.messages = this.websocket.pipe(skipWhile(m => m !== 'my special event'));
    this.websocket.subscribe(); // keep socket open
  }
  public sendMessage(message: any) {
    this.websocket.next(message); // WebSocketSubject sends by calling next()
  }
}
const websocket = new MyWebsocket(url);
// listen
websocket.messages.subscribe(m => console.log('message received:', m));
// send
websocket.sendMessage('hello server');
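If you would rather keep the firstValueFrom-based flow, the same principle still applies: the socket is only closed once the last subscriber unsubscribes (the behaviour described in the RxJS 5 answer below), so a long-lived "keep-alive" subscription makes the temporary subscription created by firstValueFrom harmless. A minimal sketch, assuming RxJS 7+ and a hypothetical 'ready' message as the condition:
import { firstValueFrom, filter } from 'rxjs';
import { WebSocketSubject } from 'rxjs/webSocket';
async function waitForReady(url) {
  const websocket = new WebSocketSubject(url);
  // Long-lived subscription: the socket only closes once its LAST subscriber
  // unsubscribes, so this keeps it open across firstValueFrom calls.
  const keepAlive = websocket.subscribe();
  // firstValueFrom unsubscribes its own temporary subscription after the first
  // matching message, but keepAlive still holds the socket open.
  const firstMessage = await firstValueFrom(
    websocket.pipe(filter(m => m && m.type === 'ready')) // hypothetical predicate
  );
  // ... keep using `websocket` for further sends and receives ...
  // Only when the app is completely done with the socket:
  // keepAlive.unsubscribe();
  return firstMessage;
}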

RxJS5 WebSocketSubject - how to filter and complete messages?

I'm looking for some guidance on the correct way to setup a WebSocket connection with RxJS 5. I am connecting to a WebSocket that uses JSON-RPC 2.0. I want to be able to execute a function which sends a request to the WS and returns an Observable of the associated response from the server.
I set up my initial WebSocketSubject like so:
const ws = Rx.Observable.webSocket("<URL>")
From this observable, I have been able to send requests using ws.next(myRequest), and I have been able to see responses coming back through the ws observable.
I have struggled with creating functions that will filter the ws responses to the correct response and then complete. These seem to complete the source subject, stopping all future ws requests.
My intended output is something like:
function makeRequest(msg) {
// 1. send the message
// 2. return an Observable of the response from the message, and complete
}
I tried the following:
function makeRequest(msg) {
  const id = msg.id;
  ws.next(msg);
  return ws
    .filter(f => f.id === id)
    .take(1);
}
When I do that however, only the first request will work. Subsequent requests won't work, I believe because I am completing with take(1)?
Any thoughts on the appropriate architecture for this type of situation?
There appears to be either a bug or a deliberate design decision to close the WebSocket on unsubscribe if there are no further subscribers. If you are interested, here is the relevant source.
Essentially you need to guarantee that there is always a subscriber otherwise the WebSocket will be closed down. You can do this in two ways.
Route A is the more semantic way, essentially you create a published version of the Observable part of the Subject which you have more fine grained control over.
const ws = Rx.Observable.webSocket("<URL>");
const ws$ = ws.publish();
//When ready to start receiving messages
const totem = ws$.connect();
function makeRequest(msg) {
  const { id } = msg;
  ws.next(msg);
  return ws$.first(f => f.id === id);
}
//When finished
totem.unsubscribe();
Route B is to create a token subscription that simply holds the socket open, but depending on the actual life cycle of your application you would do well to attach to some sort of closing event just to make sure it always gets closed down, e.g.:
const ws = Rx.Observable.webSocket("<URL>");
const totem = ws.subscribe();
//Later when closing:
totem.unsubscribe();
As you can see, both approaches are fairly similar, since they both create a subscription. B's primary disadvantage is that you create an empty subscription which receives all the events only to throw them away. The only advantage of B is that you can refer to the Subject for emission and subscription using the same variable, whereas with A you must be careful to use ws$ for subscription.
If you were really so inclined you could refine Route A using the Subject creation function:
const safeWS = Rx.Subject.create(ws, ws$);
The above would allow you to use the same variable, but you would still be responsible for shutting down ws$ and transitively, the WebSocket, when you are done with it.
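With that combined subject, makeRequest can then be written against the single safeWS variable; here is a hedged sketch mirroring the Route A version above (next is forwarded to ws, while subscriptions read from the published ws$):
function makeRequest(msg) {
  const { id } = msg;
  safeWS.next(msg);                      // forwarded to ws (the underlying WebSocketSubject)
  return safeWS.first(f => f.id === id); // filtered from the published ws$ stream
}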

How to send message or establish 2 way communication between two XUL Overlay Firefox extensions/add-ons

I have an XUL Overlay Firefox extension, and I need to develop a dummy XUL extension that establishes a connection with the original extension and sends a set of parameters (a message) to it. In short, I have to trigger my original extension with my dummy extension.
Probably the easiest way to do this is to have the original extension listening for a custom event on the base browser window. The dummy extension can then create and dispatch the event with whatever custom data is desired.
Creating and dispatching the event from the dummy:
function sendDataToMainExtension(data) {
  if (typeof window === "undefined") {
    //If there is no window defined, get the most recent.
    var window = Components.classes["@mozilla.org/appshell/window-mediator;1"]
                           .getService(Components.interfaces.nsIWindowMediator)
                           .getMostRecentWindow("navigator:browser");
  }
  //This assumes that this event is being both sent from
  // and received by privileged (main add-on) code.
  var event = new CustomEvent('MyExtensionName-From-Dummy', { 'detail': data });
  window.dispatchEvent(event);
}
You may need to take the same steps to make the data visible on the receiving end as would be necessary when firing an event from privileged code to non-privileged code.
Listening for the event in main:
Components.utils.import("resource://gre/modules/Services.jsm");
const Ci = Components.interfaces;
//Listen for the event on all windows as it is unknown on which one
// the event will be sent.
function loadIntoWindow(myWindow) {
  myWindow.addEventListener("MyExtensionName-From-Dummy",
                            receiveMessageFromDummy, false);
}
function unloadFromWindow(myWindow) {
  myWindow.removeEventListener("MyExtensionName-From-Dummy",
                               receiveMessageFromDummy, false);
}
function forEachOpenWindow(fn) {
  // Apply a function to all open browser windows
  var windows = Services.wm.getEnumerator("navigator:browser");
  while (windows.hasMoreElements()) {
    fn(windows.getNext().QueryInterface(Ci.nsIDOMWindow));
  }
}
function receiveMessageFromDummy(event) {
  var dataFromDummy = event.detail;
  //Do whatever was desired with the data.
}
var WindowListener = {
  onOpenWindow: function(aWindow) {
    let domWindow = aWindow.QueryInterface(Ci.nsIInterfaceRequestor)
                           .getInterface(Ci.nsIDOMWindowInternal || Ci.nsIDOMWindow);
    function onWindowLoad() {
      domWindow.removeEventListener("load", onWindowLoad);
      if (domWindow.document.documentElement.getAttribute("windowtype")
          == "navigator:browser") {
        loadIntoWindow(domWindow);
      }
    }
    domWindow.addEventListener("load", onWindowLoad);
  },
  onCloseWindow: function(xulWindow) { }, // Each window has an unload event handler.
  onWindowTitleChange: function(xulWindow, newTitle) { }
};
//Listen for the custom event on all current browser windows.
forEachOpenWindow(loadIntoWindow);
//Listen for the custom event on any new browser window.
Services.wm.addListener(WindowListener);
The data sent should be available as event.detail within the receiveMessageFromDummy() function.
The code above provides one-way communication. Two-way communication is obtained by duplicating the code to communicate in the other direction with a different custom event: have the main extension dispatch a custom event called something like MyExtensionName-From-Main and have the dummy extension listen for that event. The code is exactly the same as above, but with the event name changed and the handler named something like receiveMessageFromMain().
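For instance, the main extension's half of that reverse channel could look like the following sketch, mirroring sendDataToMainExtension() above (the event name and handler name are only suggestions):
// In the main extension: dispatch the reply event for the dummy to pick up.
function sendDataToDummy(data) {
  var event = new CustomEvent('MyExtensionName-From-Main', { 'detail': data });
  window.dispatchEvent(event);
}
// In the dummy extension: listen for the reply on the relevant browser window(s),
// obtained the same way as in sendDataToMainExtension() if window is not defined.
window.addEventListener('MyExtensionName-From-Main', function receiveMessageFromMain(event) {
  var dataFromMain = event.detail;
  //Do whatever was desired with the data.
}, false);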
Alternately, you could use Window.postMessage(). Doing so sends a "message" event for which you can listen. However, it leads to complications that are easier to avoid by using a custom event (e.g. you have to account for the fact that any code, including some other random extension, could be using this event for its own purposes).
Note: The code to loop through windows was originally taken from Converting an old overlay-based Firefox extension into a restartless addon which that author re-wrote as the initial part of How to convert an overlay extension to restartless on MDN. It has been modified multiple times from that code. It may have even earlier versions from other sources.

Connection dropped but endPoints did not change or remained same

How can I easily tell that someone started dragging a connection but dropped it partway, so that no connection was made and the endpoints did not change?
You need to get notified when an endpoint is dragged to make a connection but then dropped such that no connection was made. To achieve this we need to handle 2 events:
1. the event triggered when a connection is made (jsPlumbConnection)
2. the event triggered when the endpoint stops being dragged (dragOptions: {stop})
NOTE: The jsPlumbConnection event is triggered before the stop event.
Maintain a global variable (FLAG) which is initially false and is set to true when a connection is made. When the endpoint stops being dragged, we check FLAG for our result.
FLAG = false; // global variable
jsPlumb.bind("jsPlumbConnection", function(ci) { // Fired when a connection is made
  FLAG = true; // set to true when connection is made
});
var e = jsPlumb.addEndpoint("DOM_ELEMENT", {
  endpoint: "Dot",
  hoverPaintStyle: { fillStyle: "red" },
  anchor: "Right",
  dragOptions: {
    stop: function() { // Fired when the endpoint stops being dragged
      if (FLAG === false)
        alert("No connection was made");
      else
        FLAG = false; // Reset variable for next use
    }
  }
});

NodeJS wait for callback to finish on event emit

I have an application written in NodeJS with Express and am attempting to use EventEmitter to create a kind of plugin architecture, with plugins hooking into the main code by listening to emitted events.
My problem comes when a plugin function makes an async request (to get data from Mongo in this case): the plugin code finishes and returns control back to the original emitter, which then completes execution before the async request in the plugin code finishes.
E.g.:
Main App:
// We want to modify the request object in the plugin
self.emit('plugin-listener', request);
Plugin:
// Plugin function listening to 'plugin-listener', 'request' is an arg
console.log(request);
// Call to DB (async)
this.getFromMongo(some_data, function(response){
  // this may not get called until the plugin function has finished!
});
My reason for avoiding a callback function back to the main code from the 'getFromMongo' function is that there may be 0 or many plugins listening to the event. Ideally I want some way to wait for the DB stuff to finish before returning control to the main app.
Many Thanks
Using the EventEmitter for plugin/middleware management is not ideal, because you cannot ensure that the listeners are executed sequentially if they contain asynchronous code. This is especially a problem when these listeners interact with each other or with the same data.
That's why, for example, connect/express middleware functions are stored in an array and executed one after the other instead of being driven by an EventEmitter; each one needs to call a next() function when it is done with its task.
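For illustration, here is a minimal sketch of that array-based pattern, with plugins registered up front and each calling next() when its (possibly asynchronous) work is done; all names are hypothetical except getFromMongo and some_data, which come from the question:
// Hypothetical sequential plugin runner, in the spirit of connect/express middleware
var plugins = [];
function use(fn) {
  plugins.push(fn);
}
function runPlugins(request, done) {
  var i = 0;
  function next(err) {
    if (err) return done(err);
    var plugin = plugins[i++];
    if (!plugin) return done(null, request); // every plugin has finished
    plugin(request, next);                   // each plugin calls next() when its async work completes
  }
  next();
}
// Example plugin: modifies the request after an async DB call
use(function(request, next) {
  getFromMongo(some_data, function(response) {
    request.fromMongo = response;
    next();
  });
});
runPlugins(request, function(err, finalRequest) {
  // all plugin callbacks have completed here
});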
You can't mix asynchronous calls with synchronous behavior. If you're going to stick with the event emitter (which may not be ideal, as Klovadis pointed out), you'll need to have your plugin emit an event that triggers a function in the main app containing the code you want to 'wait' to execute. You would also have to keep track of all the plugin calls you made and are still waiting on, so that your main code doesn't run until all the plugins have finished their MongoDB callbacks.
var callList = ['pluginArgs1', 'pluginArgs2', 'pluginArgs3'];
var pending = callList.length;
for (var i = 0; i < callList.length; i++) {
  self.emit('plugin-listener', callList[i], i);
}
self.on('plugin-callback', function(i) {
  pending--; // one plugin call has finished its async work
  if (pending < 1) {
    // we're done, do something
  }
});
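On the plugin side, that means emitting the acknowledgement only after the asynchronous work has finished, e.g. (a sketch with hypothetical names, assuming the plugin has a reference to the same emitter and reusing getFromMongo and some_data from the question):
self.on('plugin-listener', function(request, i) {
  getFromMongo(some_data, function(response) {
    request.fromMongo = response;    // modify the request with the DB result
    self.emit('plugin-callback', i); // tell the main app this plugin call is done
  });
});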
I had the same kind of decision to make about some events that I sometimes need to wait for before returning the response to the client and sometimes not (when not in an HTTP request context).
The easiest way for me was to add a callback as the last argument of the event.
Stuff.emit('do_some_stuff', data, data2, callback);
In the event handler, check if there is a callback:
Stuff.on('do_some_stuff', function(data, data2, callback) {
  // stuff to do
  // ...
  if (typeof callback === "function") return callback(err, result);
});
I know that mixing events and callbacks can be messy, but that works fine for what I need.
The other solution I see is the one proposed by @redben: add an emit at the end of the event handler. The problem in an HTTP context is that you need unique keys so your events don't get mixed up if they do different things per user.
I haven't tried it myself, but you could use a property on the event's data object as an array of functions to be executed by the code that emitted the event:
Listeners
foo.on('your-event', function(data) {
  console.log(data);
  // Then add the asynchronous code to a callbacks array
  // in the event data object
  data.callbacks.push(function(next) {
    getFromMongo(some_data, function(err, result) { next(err); });
  });
});
Emitter
var data = { callbacks: [] }; // the data object must carry a callbacks array
self.emit('your-event', data);
// listeners have modified the data object,
// some might have added callbacks to data.callbacks
// (suppose you use the async library)
async.series(data.callbacks);
This seems quite dangerous, but I have to do it anyway...
const ee = new EventEmitter();
if (ee.listeners("async-event").length > 0) {
  await new Promise((resolve) => {
    ee.emit("async-event", data1, data2, resolve);
  });
}
Otherwise, just emit the event back-and-forth.
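For that to resolve, each listener has to invoke the callback it receives as the last argument once its asynchronous work is done, e.g. (a minimal sketch; doSomethingAsync is hypothetical):
ee.on("async-event", function(data1, data2, done) {
  doSomethingAsync(data1, data2, function() {
    done(); // resolves the emitter's Promise
  });
});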
