I am new to this!
I am working on a chat application which requires text + video chat.
I explored Socket.io initially and found it very handy for developing a text-based (web) chat application.
While exploring the video chat part, I came across WebRTC's RTCDataChannel for sending arbitrary data between connected peers.
My chat server (preferably Node.js) will be serving the connections for peers, along with saving the text chat history.
Confusion:
Should I use Socket.io / my chat server as the signaling server as well? [Is that possible?] Or
Should I use RTCDataChannel for signaling? Or
Simply forget Socket.io and use WebRTC for both?
Thanks in advance :)
Well, WebRTC data channels and WebSockets are different and complementary concepts when it comes to peer connections.
In order to open a data channel you first need a P2P connection, and in order to establish a P2P connection you need a signaling server. Sockets are used for exactly that purpose: to exchange the metadata necessary to create the P2P connection. First, through sockets, you establish the peer-to-peer connection, and only after that can you use data channels.
As for using the same chat server as the signaling server, that is up to you. WebRTC leaves the signaling architecture to be defined by the developer; to WebRTC it is a black box.
So no, as you can see, you cannot use data channels for signaling.
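To make the flow concrete, here is a minimal sketch of the caller side, assuming a socket.io connection to your chat server and a 'signal' event name that you define yourself (both the endpoint and the message format are assumptions, not part of any standard):

    // Browser sketch: socket.io only carries SDP/ICE metadata; the data channel carries the chat.
    const socket = io('https://your-chat-server.example');   // assumed signaling endpoint
    const pc = new RTCPeerConnection({ iceServers: [{ urls: 'stun:stun.l.google.com:19302' }] });
    const channel = pc.createDataChannel('chat');

    pc.onicecandidate = e => { if (e.candidate) socket.emit('signal', { candidate: e.candidate }); };
    channel.onopen = () => channel.send('hello, peer!');

    socket.on('signal', async msg => {
      if (msg.sdp) await pc.setRemoteDescription(msg.sdp);   // the callee's answer
      else if (msg.candidate) await pc.addIceCandidate(msg.candidate);
    });

    // Kick off the exchange: create an offer and ship it through the signaling server.
    pc.createOffer()
      .then(offer => pc.setLocalDescription(offer))
      .then(() => socket.emit('signal', { sdp: pc.localDescription }));

The callee side does the mirror image (setRemoteDescription with the offer, createAnswer, send the answer back); the server only relays these messages.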
I'm new to socket.io. In real-time (web) applications, we used to choose between WebRTC and WebSocket (or even SIP, still?) technologies.
What exactly is socket.io in this case please?
WebSockets
socket.io is a popular open source library implemented on both the backend and the client side. This library is based on the WebSockets API, which allows communication between a SERVER and a CLIENT.
WebRTC
On the other hand, WebRTC is a Web API which basically comes with 3 things:
Real-time communication between two browsers (no server needed), a peer-to-peer (P2P) connection
Media streaming (audio and video)
Real-time communication data channel (stream any data over P2P)
The main difference is that WebSockets needs A SERVER and is based on a publish/subscribe pattern where you can send raw data back and forth, without any special data handling by default. In contrast, WebRTC has a lot of functionality already in place for handling audio/video streaming, and also handles raw data via the data channel.
For more info I recommend reading the MDN links I provided above and also checking these very cool slides on sockets and WebRTC.
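As a tiny illustration of the media-streaming part mentioned above (browser JS; the video element id is just an example):

    const pc = new RTCPeerConnection();

    navigator.mediaDevices.getUserMedia({ video: true, audio: true })
      .then(stream => {
        document.getElementById('localVideo').srcObject = stream;        // local preview
        stream.getTracks().forEach(track => pc.addTrack(track, stream)); // send to the remote peer
      })
      .catch(err => console.error('getUserMedia failed:', err));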
If you want to build video or audio communication services, use WebRTC for its built-in browser support and write the discovery and signaling yourself. WebRTC has great features like P2P connections and data encryption.
WebRTC's client-side (browser) features, such as getting video and audio data, have good support in evergreen browsers: http://iswebrtcreadyyet.com/#interop
And socket.io is good for building centralized pub/sub apps like text chat.
You can make WebRTC connections without socket.io, but the two work well together if you use socket.io to help with signaling.
So I'm looking to build a chat app that will allow video, audio, and text. I spent some time researching WebSockets and WebRTC to decide which to use. Since there are plenty of video and audio apps built with WebRTC, it sounds like a reasonable choice, but are there other things I should consider?
Feel free to share your thoughts.
Things like:
Being new, WebRTC is available only in some browsers, while WebSockets seems to be supported in more browsers.
Scalability - WebSockets uses a server for the session, while WebRTC seems to be P2P.
Multiplexing/multiple chat rooms - used in Google+ Hangouts, and I'm still looking at demo apps to see how to implement it.
Server - WebSockets needs something like RedisSessionStore or RabbitMQ to scale across multiple machines.
WebRTC is designed for high-performance, high quality communication of video, audio and arbitrary data. In other words, for apps exactly like what you describe.
WebRTC apps need a service via which they can exchange network and media metadata, a process known as signaling. However, once signaling has taken place, video/audio/data is streamed directly between clients, avoiding the performance cost of streaming via an intermediary server.
WebSocket, on the other hand, is designed for bi-directional communication between client and server. It is possible to stream audio and video over WebSocket (see here for example), but the technology and APIs are not inherently designed for efficient, robust streaming in the way that WebRTC is.
As other replies have said, WebSocket can be used for signaling.
I maintain a list of WebRTC resources: I strongly recommend you start by looking at the 2013 Google I/O presentation about WebRTC.
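As a rough illustration of that signaling role, here is a minimal relay sketch in Node using the ws package (the idea is standard, but the "forward to everyone else" policy is just an example):

    // signaling-server.js -- assumes `npm install ws`
    const WebSocket = require('ws');
    const wss = new WebSocket.Server({ port: 8080 });

    wss.on('connection', ws => {
      ws.on('message', data => {
        // Forward every signaling message (offer/answer/ICE candidate) to every other client.
        wss.clients.forEach(client => {
          if (client !== ws && client.readyState === WebSocket.OPEN) client.send(data);
        });
      });
    });

In a real app you would scope this to pairs or rooms of peers rather than broadcasting to everyone.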
WebSockets use the TCP protocol.
WebRTC is mainly UDP.
Thus the main reason for using WebRTC instead of WebSockets is latency.
With WebSocket streaming you will have either high latency or choppy playback with low latency. With WebRTC you can achieve low latency and smooth playback, which is crucial for VoIP communications.
Just try testing these technologies with some network loss, e.g. 2% packet loss. You will see long delays in the WebSocket stream.
WebSockets:
Ratified IETF standard (RFC 6455) with support across all modern browsers and even legacy browsers using the web-socket-js polyfill.
Uses an HTTP-compatible handshake and default ports, making it much easier to use with existing firewall, proxy and web server infrastructure.
Much simpler browser API. Basically one constructor with a couple of callbacks.
Client/browser to server only.
Only supports reliable, in-order transport because it is built on TCP. This means packet drops can delay all subsequent packets.
WebRTC:
Just beginning to be supported by Chrome and Firefox. MS has proposed an incompatible variant. The DataChannel component is not yet compatible between Firefox and Chrome.
WebRTC is browser to browser in ideal circumstances but even then almost always requires a signaling server to setup the connections. The most common signaling server solutions right now use WebSockets.
The transport layer is configurable, with the application able to choose whether the connection is in-order and/or reliable (see the sketch after this list).
Complex and multilayered browser API. There are JS libs that provide a simpler API, but these are young and rapidly changing (just like WebRTC itself).
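To illustrate the difference in API surface and the configurable transport, a short sketch (the unordered/unreliable settings are just one possible configuration):

    // WebSocket: one constructor plus callbacks, always reliable and in-order (TCP).
    const ws = new WebSocket('wss://example.com/chat');
    ws.onopen = () => ws.send('hello');
    ws.onmessage = e => console.log(e.data);

    // WebRTC DataChannel: created from a peer connection, transport behaviour is configurable.
    const pc = new RTCPeerConnection();
    const dc = pc.createDataChannel('game-state', {
      ordered: false,     // allow out-of-order delivery
      maxRetransmits: 0   // no retransmissions: unreliable, UDP-like behaviour
    });
    dc.onmessage = e => console.log(e.data);
    // (signaling, offer/answer and ICE exchange still have to happen before dc opens)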
WebRTC or WebSockets? Why not use both?
When building a video/audio/text chat, WebRTC is definitely a good choice since it uses peer-to-peer technology, and once the connection is up and running you do not need to pass the communication via a server (unless using TURN).
When setting up WebRTC communication you have to involve some sort of signaling mechanism. WebSockets could be a good choice here, but WebRTC is the way to go for the video/audio/text itself. Chat rooms are handled in the signaling.
But, as you mentioned, not every browser supports WebRTC, so WebSockets can sometimes be a good fallback for those browsers.
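For the chat-room part, a rough sketch of room-scoped signaling with socket.io on the server (the event and room names are made up for the example):

    // server.js -- assumes `npm install socket.io`
    const io = require('socket.io')(3000);

    io.on('connection', socket => {
      socket.on('join', room => socket.join(room));
      // Relay offers, answers and ICE candidates only to peers in the same room.
      socket.on('signal', ({ room, payload }) => {
        socket.to(room).emit('signal', payload);
      });
    });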
Security is one aspect you missed.
With WebSockets the data has to go via a central web server which typically sees all the traffic and can access it.
With WebRTC the data is end-to-end encrypted and does not pass through a server (except that TURN servers are sometimes needed, but they have no access to the body of the messages they forward).
Depending on your application this may or may not matter.
If you are sending large amounts of data, the savings in cloud bandwidth costs due to WebRTC's P2P architecture may be worth considering too.
Comparing WebSocket and WebRTC is unfair.
WebSocket is built on top of TCP. Unlike raw TCP, a packet's boundary can be detected from the header information of a WebSocket frame.
Typically, WebRTC makes use of WebSocket. The signaling for WebRTC is not defined by the standard; it is up to the service provider what kind of signaling to use. It may be SIP, HTTP, JSON or any text/binary message.
The signaling messages can be sent and received using WebSocket.
WebRTC is about peer-to-peer connections.
We all know that establishing a peer-to-peer connection requires a handshaking process first.
And WebSockets can play the role of that handshaking process.
WebSocket and WebRTC can be used together: WebSocket as the signaling channel for WebRTC, and WebRTC as the video/audio/text channel. WebRTC can also run over UDP or via a TURN relay; the TURN relay supports TCP, HTTP and also HTTPS.
Many projects use WebSocket and WebRTC together.
I was hugely confused about WebSockets. I read a blog about WebSockets which required a Node websocket server; I downloaded the demo files, but the chat application didn't seem to work. To summarize: what do I need to use WebSockets? Do I need to download a Node server or something? And how does socket.io relate to all this?
WebSockets?
WebSockets is a standard for implementing socket communication (to a server) over the web.
Is node required?
The server that this socket communication talks to can be implemented in any way whatsoever. Node is certainly a popular option for implementing the server side, but it is not the only one; you can use Python, Erlang, Ruby, or any other language in which you can bind a socket connection.
What is socket.io?
socket.io is a JavaScript library which makes socket OR socket-like connections possible over the web. You see, WebSockets is a recent standard; not all browsers support it, only the modern ones do (proof: http://caniuse.com/#search=websockets). What makes socket.io so popular, rainbow-and-fairy-tale-like (and one of the main reasons you happened to stumble upon it while researching WebSockets), is that it makes socket/socket-like communication possible in all browsers.
Socket: when socket.io detects a browser supporting WebSockets, it uses that WebSockets implementation for the socket communication.
Socket-like: when socket.io detects a browser which does NOT support WebSockets, it will still provide you with socket-like communication. Tidbit: the internals of this feature use AJAX polling.
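To make that concrete, a minimal socket.io setup, assuming Node with the socket.io package installed (file names and the event name are just examples):

    // server.js -- run with `node server.js` after `npm install socket.io`
    const http = require('http').createServer();
    const io = require('socket.io')(http);

    io.on('connection', socket => {
      socket.on('chat message', msg => io.emit('chat message', msg)); // broadcast to every client
    });

    http.listen(3000, () => console.log('listening on :3000'));

    <!-- client.html -- socket.io serves its own client script;
         newer socket.io versions may need CORS options if the page comes from another origin -->
    <script src="http://localhost:3000/socket.io/socket.io.js"></script>
    <script>
      var socket = io('http://localhost:3000');
      socket.on('chat message', function (msg) { console.log(msg); });
      socket.emit('chat message', 'hello');
    </script>

Whether this ends up using a real WebSocket or the AJAX fallback is decided by socket.io at connection time, exactly as described above.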
Node is a good place to start for websockets, but by no means the only place.
I would probably start here:
http://www.html5rocks.com/en/tutorials/websockets/basics/
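If you want to stay with plain WebSockets instead of socket.io, a bare-bones echo server in Node looks something like this (assumes the ws package; the port is arbitrary):

    // echo-server.js -- assumes `npm install ws`
    const WebSocket = require('ws');
    const wss = new WebSocket.Server({ port: 8080 });

    wss.on('connection', ws => {
      ws.on('message', msg => ws.send('echo: ' + msg)); // send every message straight back
    });

    // In the browser: new WebSocket('ws://localhost:8080') with onopen/onmessage callbacks.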
I'm experimenting with mqttjs and WebSockets, and I wish to be able to send messages from a webpage using websockets, without a bridge, to an MQTT broker run by mqttjs. I can't find any information on whether this is available or even possible.
I've looked at mosquitto, which has "experimental" websocket support, and I would love to find a Node.js MQTT broker which offers the same.
So far I have got the communication working with pywebsocket and Socket.IO. I would really appreciate pointers in any direction on whether it is possible to use websockets to talk to MQTT without bridging.
Thanks.
This is an old question, but it is good to share my findings.
You can use the Mosca broker, which is written in Node.js and uses mqtt.js.
Mosca supports both classic MQTT connections and MQTT over WS:
MQTT-over-Websockets
Mosca can operate in two modes: standalone and as a Node.js module.
In general, Mosca can support many types of brokers:
Mosca-advanced-usage
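A minimal sketch of Mosca used as a Node.js module with MQTT over WebSockets enabled, based on the docs referenced above (option names may differ between Mosca versions):

    // broker.js -- assumes `npm install mosca`
    var mosca = require('mosca');

    var settings = {
      port: 1883,          // classic MQTT over TCP
      http: {
        port: 8000,        // MQTT over WebSockets for browser clients
        bundle: true,      // serve a browser bundle of the mqtt.js client
        static: './'
      }
    };

    var server = new mosca.Server(settings);
    server.on('ready', function () {
      console.log('Mosca broker up: MQTT on 1883, MQTT over websockets on 8000');
    });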
HiveMQ supports native websockets, which means you can use any JavaScript MQTT library (like Eclipse Paho.js) with websockets. It's perfectly possible to connect some clients via websockets and other clients via a standard TCP connection. The websocket support is stable and used in production.
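For the browser side, a hedged sketch using the Eclipse Paho JavaScript client over websockets (host, port and topic are placeholders):

    // Assumes the Paho MQTT JavaScript client script is loaded in the page.
    var client = new Paho.MQTT.Client("broker.example.com", 8000, "web_" + Date.now());

    client.onMessageArrived = function (message) {
      console.log(message.destinationName + ": " + message.payloadString);
    };

    client.connect({
      onSuccess: function () {
        client.subscribe("chat/room1");
        var msg = new Paho.MQTT.Message("hello over websockets");
        msg.destinationName = "chat/room1";
        client.send(msg);
      }
    });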
The only drawback for you could be that HiveMQ is not written in Node.JS.
Disclosure: I am one of the developers of HiveMQ.