How to handle an extra subscriber stream on web when a mobile user reconnects to an OpenTok session?

Let's say we have an OpenTok call between a mobile client and a web client.
When the mobile client reconnects to an OpenTok session after losing its internet connection, an extra subscriber stream is received by all participants of that session (including the mobile client). This extra stream is the mobile client's previous stream from before it lost connectivity. As a workaround, we ignore that stream on mobile as mentioned here.
On the web client, how do we recognize the mobile stream that we should ignore?
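For what it's worth, one way to recognize the stale stream on the web side is to key streams by a user identifier embedded in the token's connection data (an assumption; the question doesn't say what the tokens carry) and treat an older stream from the same user on a different connection as the one to ignore. A minimal sketch with the OpenTok.js API:

```javascript
// Sketch only: assumes your server embeds a stable userId string in each
// token's connection data, and that 'subscriber-container' is your target div.
var streamsByUser = {};

session.on('streamCreated', function (event) {
  var stream = event.stream;
  var userId = stream.connection.data; // assumed to be a plain userId string

  var existing = streamsByUser[userId];
  if (existing && existing.connection.connectionId !== stream.connection.connectionId) {
    // Same user, different connection: the older stream is the stale one
    // left over from before the mobile client lost connectivity.
    session.getSubscribersForStream(existing).forEach(function (subscriber) {
      session.unsubscribe(subscriber);
    });
  }

  streamsByUser[userId] = stream;
  session.subscribe(stream, 'subscriber-container', { insertMode: 'append' });
});
```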

Related

How to send an audio stream over SIP

I'm developing an application that receives an audio stream over a WebSocket and needs to forward the audio to a SIP server.
Currently, I've managed to connect to the audio source using a Websocket and receive the media stream (encoded u-law) using Node-Red, but I'm struggling to figure out how to send the media stream to the SIP server. Any advice would be much appreciated.
I looked into this for a similar question a while back; I can't find it now.
As you probably know, the media part of SIP is RTP, so it's a fairly separate stack from the call signalling.
I didn't find any nodes that supported it and the few node.js libraries for RTP were all very incomplete and out of date.
In theory it might be possible to craft your own RTP streams using the UDP nodes and then create the relevant SDP in the SIP response, but I'm not sure how robust or scalable this would be.
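To make the "craft your own RTP" option concrete, here is a rough sketch of what hand-rolling RTP for 8 kHz u-law audio (payload type 0) looks like in Node.js; it mostly illustrates why this is fiddly. The SIP/SDP negotiation that tells the far end where to listen is assumed to have happened elsewhere, and the host/port are placeholders:

```javascript
const dgram = require('dgram');

const socket = dgram.createSocket('udp4');
const ssrc = (Math.random() * 0xffffffff) >>> 0;   // random stream identifier
let sequence = Math.floor(Math.random() * 0xffff);
let timestamp = 0;

// Send one 20 ms u-law frame (160 bytes at 8 kHz) to the negotiated RTP port.
function sendRtpFrame(payload, host, port) {
  const header = Buffer.alloc(12);
  header[0] = 0x80;                   // version 2, no padding/extension/CSRC
  header[1] = 0x00;                   // marker 0, payload type 0 (PCMU)
  header.writeUInt16BE(sequence, 2);  // sequence number
  header.writeUInt32BE(timestamp, 4); // media timestamp (in samples)
  header.writeUInt32BE(ssrc, 8);      // SSRC

  socket.send(Buffer.concat([header, payload]), port, host);

  sequence = (sequence + 1) & 0xffff;
  timestamp = (timestamp + payload.length) >>> 0; // 1 sample per byte for u-law
}
```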
The other option is that there are a couple of programmable comms platforms out there that support both SIP and WebSockets, so you could possibly utilise one of those and connect from Node-RED via WebSocket, letting them do the SIP work.
I've done SIP<->WebSocket work with both the Vonage API (previously Nexmo) and Jambonz (open source).

Architecture for Live Audio Streaming web app

I need your opinions on the architecture for a live audio streaming app.
Currently, I'm testing it on a local network, and everything seems to work, but I have doubts about how good it will be in production.
My architecture:
Broadcaster HTTP client --(1)--> App Server --(2)--> Listening clients (React.js App)
(1) — communication over HTTP; (2) — communication over HTTP and WebSocket
What I want to do:
When the user opens my React App and the Broadcaster is not streaming yet, React should display something like "OFFLINE".
Next, when the Broadcaster starts streaming to the App Server, the React App should display "The stream is started" and automatically start playback.
Finally, when the Broadcaster stops streaming, the React App should display "OFFLINE" again.
How I currently do it:
My App server uses two protocols: HTTP (for audio streaming and other stuff) and WebSocket (only for sending JSON status messages of what happens on the server).
When the Broadcaster starts streaming to the App Server (over HTTP), the App Server sends a WebSocket message to the React App: "the stream has started, you can access it at http://my-domain/stream", i.e. the App Server streams the audio to React over regular HTTP.
The React App sees this message, renders an HTML <audio> element, and starts playing the audio.
When the Broadcaster stops streaming, the App Server sends the WebSocket message "the stream is finished" to the React App, and React hides the player, displaying "OFFLINE" again.
So, I do all streaming (both from Broadcaster to App Server and from App Server to React client) over HTTP and use WebSocket to communicate real-time stream state updates.
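As a concrete illustration of that status channel, here is a minimal sketch of the server side using the ws package; the message shapes and the /stream URL are just the ones described above, not a prescribed format:

```javascript
// Minimal status channel: broadcast JSON state changes to every listener.
const WebSocket = require('ws');
const wss = new WebSocket.Server({ port: 8081 });

function broadcast(message) {
  const json = JSON.stringify(message);
  wss.clients.forEach((client) => {
    if (client.readyState === WebSocket.OPEN) client.send(json);
  });
}

// Call these from the HTTP handlers that see the Broadcaster connect/disconnect.
function onBroadcastStarted() {
  broadcast({ type: 'stream-started', url: 'http://my-domain/stream' });
}

function onBroadcastStopped() {
  broadcast({ type: 'stream-stopped' });
}
```

On the React side, a WebSocket onmessage handler would flip a piece of state between the stream URL and null, rendering either the <audio> element or the "OFFLINE" label.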
How good is this architecture?
It's not so much a matter of good or bad, it's a matter of whether or not it's appropriate for your use case. I'd note that this is basically exactly how internet radio servers such as SHOUTcast and Icecast have worked for 20+ years, so it can't be that bad. :-)

OpenTok TokBox: How can I automatically start a live streaming (rtmp) broadcast of a session when the first connected user publishes?

I've seen the sample app on GitHub. There is an explicit "Start Broadcasting" button that does what it sounds like (starts broadcasting RTMP).
I'd rather not have an explicit button. I'd like to start broadcasting when the first user in a session publishes his or her camera. So if 5 users connect to the session, the broadcast should start when the first of them publishes a stream, but not when any of the others do.
Can I query the session and know whether it is live streaming currently? What is the best practice here? Thanks.
TokBox Developer Evangelist here.
You cannot query the number of active streams in a Session; you would have to store that information on your own as events are dispatched. Please see this SO answer for more details: #OpenTok how enumerate streams in a session?
As for broadcasting, you can start it programmatically when the first person starts publishing instead of using a visual component to trigger the call. For example, on the client side, you can listen for the streamCreated event and then send a request to your application server to start the broadcast. Your application server would then make the startBroadcast call to OpenTok via a Server SDK or the REST API.
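A sketch of what that server endpoint might look like with the OpenTok Node SDK and Express; the /broadcast/start route, the RTMP URL, and the in-memory guard are illustrative (a real app would handle multiple sessions and races between concurrent requests):

```javascript
const express = require('express');
const OpenTok = require('opentok');

const opentok = new OpenTok(process.env.OPENTOK_API_KEY, process.env.OPENTOK_API_SECRET);
const app = express();
app.use(express.json());

let broadcastId = null; // naive guard so only the first streamCreated starts it

// The client calls this from its streamCreated handler.
app.post('/broadcast/start', (req, res) => {
  if (broadcastId) {
    return res.json({ started: false, broadcastId }); // already live, ignore
  }
  opentok.startBroadcast(req.body.sessionId, {
    outputs: { rtmp: [{ serverUrl: 'rtmp://example.com/live', streamName: 'demo' }] }
  }, (err, broadcast) => {
    if (err) return res.status(500).send(err.message);
    broadcastId = broadcast.id;
    res.json({ started: true, broadcastId });
  });
});

app.listen(3000);
```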
Alternatively, you could use Session Monitoring to listen to Stream and Connection events on the server via a webhook, to start the broadcast.
To find out more on how OpenTok Broadcasting works, I recommend checking out the following resources:
OpenTok Broadcasting Developer Guide
OpenTok Broadcasting Webinar

Is it possible to emit stream object through socket.io

For my app I'm streaming audio data from a Raspberry Pi client to my Node.js audio service through socket.io. The problem is, to process the audio, I'm piping the audio stream from the client in my service to an external service. This external service then streams the resulting audio back to my service, and my service emits it to the client.
So my application flow is like
Client ---socket.io-stream---> audio_service ---stream---> external_service
external_service ---stream---> audio_service ---socket.io-stream---> client
My questions is:
Is it possible that when a client connects to my audio_service, my audio_service initiates a connection to external_service and emits that connection back to the client through socket.io? This way the client would stream audio directly to the external_service over the returned connection instead of going through audio_service.
If that is possible, is it also possible that even though the client streams audio directly to the external_service, the resulting stream is still sent back to the audio_service?
Thank you very much for your help
It isn't possible to send a stream through Socket.IO the way it is set up today. Some folks have made some add-ons that emulate streams in the usual evented RPC way, but it isn't very efficient.
The best library I know of for this is Binary.JS. It will give you streams multiplexed over a single binary WebSocket connection. Unfortunately, Binary.js hasn't been maintained in a while, but it still works.
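For reference, the relay the question describes looks roughly like this with socket.io-stream (one of the stream-emulating add-ons alluded to above); connectToExternalService() is a hypothetical stand-in for however your external service accepts a duplex stream:

```javascript
const io = require('socket.io')(3000);
const ss = require('socket.io-stream');

io.on('connection', (socket) => {
  ss(socket).on('audio', (clientStream) => {
    const external = connectToExternalService(); // hypothetical duplex stream
    const resultStream = ss.createStream();

    clientStream.pipe(external); // client audio -> external service
    external.pipe(resultStream); // processed audio -> back to the client
    ss(socket).emit('audio-result', resultStream);
  });
});
```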

Streaming Media to Server using web socket

I am a newbie to WebRTC. I would like to send the media stream from the client (JavaScript) to my server (via WebSockets). On my server I will be doing some processing on that media content.
Could you please show me a client code snippet for sending a media stream to the server via WebSocket?
Thanks
Ganesh.R
Nobody can show you this, because you cannot send the stream via WebSockets. You need to read a little more about WebRTC.
WebRTC gives you the possibility to request access to media devices from JavaScript, and allows you to create a PeerConnection that will establish a connection to another endpoint to send the streams captured from the devices or some raw data (using DataChannel).
You won't have access to the stream data to send via WebSockets. Instead, the browser will send it over UDP or TCP using the SRTP protocol. If you want to get media streams on the server side, you will need to implement this protocol and some negotiation to establish the connection.
HTML5 Rocks has a great introduction with code snippets to get you started.
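To make the distinction concrete, here is a browser-side sketch: the devices are captured and attached to a PeerConnection, and only the signalling (the SDP offer) travels over your own channel; signalingSocket below is a hypothetical WebSocket you would set up yourself:

```javascript
const pc = new RTCPeerConnection({ iceServers: [{ urls: 'stun:stun.l.google.com:19302' }] });

navigator.mediaDevices.getUserMedia({ audio: true, video: false })
  .then((stream) => {
    // The browser sends these tracks over SRTP itself; they never pass
    // through your WebSocket.
    stream.getTracks().forEach((track) => pc.addTrack(track, stream));
    return pc.createOffer();
  })
  .then((offer) => pc.setLocalDescription(offer))
  .then(() => {
    // Only the session description goes over your signalling channel.
    signalingSocket.send(JSON.stringify({ sdp: pc.localDescription }));
  });
```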
