In my app I'm streaming audio data from a Raspberry Pi client to my Node.js audio service through socket.io. To process the audio, my service pipes the incoming audio stream to an external service. The external service then streams the processed audio back to my service, which emits it to the client.
So my application flow looks like this:
Client ---socket.io-stream---> audio_service ---stream---> external_service
external_service --stream---> audio_service ---socket.io-stream---> client
My question is:
Is it possible that when a client connects to my audio_service, the audio_service initiates a connection to external_service and hands that connection back to the client through socket.io? That way the client would stream audio directly to external_service over the returned connection instead of going through audio_service.
If that is possible, is it also possible for the client to stream audio directly to external_service while the resulting stream is still sent back to audio_service?
Thank you very much for your help
It isn't possible to send a stream through Socket.IO the way it is set up today. Some folks have made some add-ons that emulate streams in the usual evented RPC way, but it isn't very efficient.
The best library I know of for this is Binary.JS. It gives you streams multiplexed over a single binary WebSocket connection. Unfortunately, Binary.js hasn't been maintained in a while, but it still works.
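For illustration, here is a minimal sketch of that with Binary.JS; the stream event and createStream call are from its documented API, but treat this as untested given the library's age:

    // Server side: accept multiplexed binary streams over one WebSocket
    var BinaryServer = require('binaryjs').BinaryServer;
    var server = new BinaryServer({ port: 9000 });

    server.on('connection', function (client) {
      client.on('stream', function (stream, meta) {
        // e.g. pipe the incoming audio on to the external service here
        stream.pipe(process.stdout);
      });
    });

    // Client side: open a logical stream and pipe audio into it
    var BinaryClient = require('binaryjs').BinaryClient;
    var client = new BinaryClient('ws://localhost:9000');

    client.on('open', function () {
      // audioSource stands in for whatever readable audio stream you have
      var stream = client.createStream({ type: 'audio' });
      audioSource.pipe(stream);
    });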
Related
I'm developing an application that receives an audio stream over a WebSocket and needs to forward the audio to a SIP server.
Currently, I've managed to connect to the audio source using a WebSocket and receive the media stream (u-law encoded) using Node-RED, but I'm struggling to figure out how to send the media stream to the SIP server. Any advice would be much appreciated.
I looked into this for a similar question a while back; I can't find where it was now.
As you probably know, the media part of SIP is RTP, so it's a fairly separate stack from the call signalling.
I didn't find any nodes that supported it, and the few Node.js libraries for RTP were all very incomplete and out of date.
In theory it might be possible to craft your own RTP streams using the UDP nodes and then create the relevant SDP in the SIP response, but I'm not sure how robust or scalable this would be.
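To give a feel for what crafting your own RTP would involve, here is a rough sketch in plain Node.js: a fixed 12-byte RTP header for 8 kHz u-law audio (payload type 0). The SSRC, host and port are placeholders you would have to match to the SDP you negotiate:

    // Rough sketch: hand-rolled RTP packets over UDP for G.711 u-law audio
    const dgram = require('dgram');
    const sock = dgram.createSocket('udp4');

    let seq = 0;
    let timestamp = 0;
    const ssrc = 0x12345678; // arbitrary stream id, must match your SDP

    function sendRtp(ulawChunk, host, port) {
      const header = Buffer.alloc(12);
      header[0] = 0x80;                         // version 2, no padding/extension/CSRC
      header[1] = 0x00;                         // marker bit 0, payload type 0 (PCMU)
      header.writeUInt16BE(seq++ & 0xffff, 2);  // sequence number
      header.writeUInt32BE(timestamp >>> 0, 4); // media timestamp
      header.writeUInt32BE(ssrc, 8);            // synchronisation source id
      timestamp += ulawChunk.length;            // 8 kHz u-law: one tick per sample byte
      sock.send(Buffer.concat([header, ulawChunk]), port, host);
    }

You would have to call sendRtp with 160-byte chunks every 20 ms to pace a standard PCMU stream, which hints at why a maintained library would be preferable.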
The other option is that there are a couple of programmable comms platforms out there that support both SIP and WebSockets, so you could possibly utilise one of those and connect from Node-RED via WebSocket, letting them do the SIP work.
I've done SIP<->WebSocket work with both the Vonage API (previously Nexmo) and Jambonz (open source).
I am new to this!
I am working on a chat application which requires text + video chat.
I explored Socket.io initially and found it very handy for developing a text-based (web) chat application.
While exploring the video chat element I came across WebRTC's RTCDataChannel for sending arbitrary data between connected peers.
My chat server (preferably NodeJS) will be serving the connections for peers, along with saving text chat history.
Confusion:
Should I use my Socket.io chat server as the signalling server also? [Is that possible?], or
Should I use RTCDataChannel as the signalling channel?, or
Simply forget Socket.io and use WebRTC for both!
Thanks in advance :)
Well, WebRTC data channels and WebSockets are different, complementary concepts where peer connections are concerned.
In order to open a data channel you first need a P2P connection, and in order to establish a P2P connection you need a signaling server. Sockets are used for exactly that: exchanging the metadata (session descriptions and ICE candidates) needed to create the P2P connection. Only once the peer-to-peer connection is established can you use data channels.
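For example, here is a minimal sketch of that exchange using Socket.IO as the signaling transport; the 'offer'/'answer'/'candidate' event names are just a convention assumed here, and the server's only job is to relay them between the two peers:

    const socket = io('https://your-signaling-server.example'); // placeholder URL
    const pc = new RTCPeerConnection({
      iceServers: [{ urls: 'stun:stun.l.google.com:19302' }]
    });

    // Relay our ICE candidates to the other peer through the socket
    pc.onicecandidate = (e) => {
      if (e.candidate) socket.emit('candidate', e.candidate);
    };

    // Callee: answer an incoming offer
    socket.on('offer', async (offer) => {
      await pc.setRemoteDescription(offer);
      const answer = await pc.createAnswer();
      await pc.setLocalDescription(answer);
      socket.emit('answer', answer);
    });

    socket.on('answer', (answer) => pc.setRemoteDescription(answer));
    socket.on('candidate', (c) => pc.addIceCandidate(c));

    // Caller: kick off the session
    async function call() {
      pc.createDataChannel('chat'); // gives the offer something to negotiate
      const offer = await pc.createOffer();
      await pc.setLocalDescription(offer);
      socket.emit('offer', offer);
    }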
As for using the same chat server as the signaling server: that is up to you. WebRTC lets the developer define the signaling architecture; it treats signaling as a black box.
So, no, you can't use data channels for signaling, as you can see.
I'm new to socket.io. For realtime (web) applications, we have had to choose between WebRTC and WebSocket (or even SIP, still?) technologies.
What exactly is socket.io in this case?
WebSockets
socket.io is a popular open-source library with both a server-side (backend) and a client-side implementation. It is based on the WebSocket API, which allows two-way communication between a server and a client.
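A minimal sketch of that server/client model (Node.js on the backend; the event name is arbitrary):

    // Server (Node.js)
    var io = require('socket.io')(3000);
    io.on('connection', function (socket) {
      socket.on('chat message', function (msg) {
        io.emit('chat message', msg); // broadcast to every connected client
      });
    });

    // Client (browser)
    var socket = io('http://localhost:3000');
    socket.emit('chat message', 'hello');
    socket.on('chat message', function (msg) { console.log(msg); });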
WebRTC
On the other hand, WebRTC is a Web API which comes with basically 3 things:
Real-time communication between two browsers over a peer-to-peer (P2P) connection (no server in the media path)
Media streaming (audio and video)
A real-time data channel (stream any data P2P)
The main difference is that WebSocket needs a server and is based on a publish/subscribe pattern where you can send raw data back and forth, with no special data handling by default. In contrast, WebRTC has a lot of functionality already in place for handling audio/video streaming, and for raw data via the data channel.
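To sketch that last point: once the peer connection exists, a data channel is only a couple of lines (signaling omitted here):

    const pc = new RTCPeerConnection();
    const channel = pc.createDataChannel('chat'); // reliable and ordered by default

    channel.onopen = () => channel.send('hello peer'); // strings, Blobs or ArrayBuffers
    channel.onmessage = (e) => console.log('received:', e.data);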
For more info I recommend reading the MDN links I provided above and also checking these very cool slides on sockets and WebRTC.
If you want to build video or audio communication services, use WebRTC for its built-in browser support and write the discovery and signaling yourself. WebRTC has awesome features like P2P connections and data encryption.
WebRTC's client-side (browser) features, like getting video and audio data, have good support in evergreen browsers: http://iswebrtcreadyyet.com/#interop
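For example, capturing audio and video in the browser takes only a few lines with the promise-based API (the video element selector is an assumption):

    navigator.mediaDevices.getUserMedia({ audio: true, video: true })
      .then((stream) => {
        document.querySelector('video').srcObject = stream; // local preview
      })
      .catch((err) => console.error('getUserMedia failed:', err));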
And socket.io is good for building centralized pub/sub apps like text chat.
You can make WebRTC connections without socket.io, but the two work well together if you use socket.io to help with the signaling.
I am a newbie to WebRTC. I would like to send the media stream from the client (JavaScript) to my server via WebSockets. On my server I will be doing some processing on that media content.
Could you please show me a client code snippet for sending a media stream to the server via WebSocket?
Thanks
Ganesh.R
Nobody can show you this, because you cannot send the stream via WebSockets. You need to read a little more about WebRTC.
WebRTC gives you the ability to request access to media devices from JavaScript, and lets you create a PeerConnection that establishes a connection to another endpoint to send the streams captured from the devices, or raw data (using a DataChannel).
You won't have access to the stream's data to send via WebSockets. Instead, the browser sends it over UDP or TCP using the SRTP protocol. If you want to receive media streams on the server side, you will need to implement this protocol, plus some negotiation to establish the connection.
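To make the split concrete, this is roughly all your JavaScript gets to do; everything after addTrack (encoding, SRTP, transport) happens inside the browser:

    const pc = new RTCPeerConnection();

    navigator.mediaDevices.getUserMedia({ audio: true, video: true })
      .then((stream) => {
        // Hand the captured tracks to the peer connection; from here the
        // browser encodes and sends SRTP itself, and the packets never
        // pass through your JavaScript
        stream.getTracks().forEach((track) => pc.addTrack(track, stream));
      });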
HTML5Rocks has a great introduction with code snippets to get started.
I am trying to understand the difference between WebRTC and WebSockets so that I can better understand which scenario calls for which. I am curious about the broad idea of two parties (mainly web-based, but potentially one being a dedicated server application) talking to each other.
Assumption:
Clearly, with regard to ad-hoc networks, WebRTC wins, as it natively supports the ICE protocol/method.
Questions:
Regarding direct communication between two known parties in-browser, if I am not sending multimedia data and am only interested in sending integer data, does WebRTC give me any advantages over WebSockets other than data encryption?
Regarding a dedicated server speaking to a browser-based client, which platform gives me an advantage? I would need to code a WebRTC server (is this possible outside the browser?), or I would need to code a WebSocket server (a quick Google search makes me think this is possible).
There is one significant difference: WebSockets works via TCP, WebRTC works via UDP.
In fact, WebRTC is the SRTP protocol with some additional features like STUN, ICE and DTLS, plus internal VoIP features such as adaptive jitter buffering, AEC and AGC.
So, WebSockets is designed for reliable communication. It is a good choice if you want to send any data that must be sent reliably.
When you use WebRTC, the transmitted stream is unreliable: some packets can get lost in the network. That is bad if you are sending critical data, for example for financial processing, but it is perfectly acceptable for audio or video streams, where some frames can be lost without any noticeable quality issue.
If you want to send data over a WebRTC data channel, you should have some forward error correction algorithm to restore the data if a frame is lost in the network.
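Note that the data channel API lets you pick this trade-off yourself: a channel is reliable by default, but you can deliberately configure a lossy one, which is where forward error correction as described above would come in. A sketch:

    const pc = new RTCPeerConnection();
    const lossy = pc.createDataChannel('telemetry', {
      ordered: false,    // frames may arrive out of order
      maxRetransmits: 0  // never retransmit: lost frames stay lost
    });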
WebRTC specifies media transport over RTP, which can work P2P under certain circumstances. In any case, to establish a WebRTC session you will also need a signaling protocol, and for that WebSocket is a likely choice. In other words: unless you want to stream real-time media, WebSocket is probably a better fit.
Question 1: Yes. The DataChannel part of WebRTC gives you advantages in this case, because it allows you to create a peer-to-peer channel between browsers to send and receive any raw data you want. WebSockets force you to use a server to connect both parties.
Question 2: As I said in the previous answer, WebSockets are better if you want server-client communication, and there are many implementations for this (e.g. jWebSocket). Adding support in a server for establishing a connection over a WebRTC DataChannel may take you some days of life and health. :)
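For comparison, a minimal WebSocket server in Node.js using the widely used 'ws' package (port and echo behaviour are arbitrary choices):

    const WebSocket = require('ws');
    const wss = new WebSocket.Server({ port: 8080 });

    wss.on('connection', (ws) => {
      ws.on('message', (data) => ws.send('echo: ' + data));
    });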