I'm new to socket.io. For realtime (web) applications, we used to choose between WebRTC and WebSocket (or even SIP, still?) technologies.
What exactly is socket.io in this case please?
WebSockets
socket.io is a popular open source library implemented for both the backend and the client side. This library is based on the WebSocket API, which allows communication between a SERVER and a CLIENT.
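As a loose sketch of that server/client pairing (Socket.IO v4 style; the "chat message" event name and the port are made up for illustration, not the only way to wire it up):

```ts
// server.ts - minimal Socket.IO server (assumes `npm install socket.io`)
import { Server } from "socket.io";

const io = new Server(3000); // starts an HTTP server listening on port 3000

io.on("connection", (socket) => {
  // Relay every "chat message" event to all connected clients.
  socket.on("chat message", (msg: string) => {
    io.emit("chat message", msg);
  });
});
```

```ts
// client.ts - minimal Socket.IO client (assumes `npm install socket.io-client`)
import { io } from "socket.io-client";

const socket = io("http://localhost:3000");
socket.on("connect", () => socket.emit("chat message", "hello"));
socket.on("chat message", (msg: string) => console.log("received:", msg));
```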
WebRTC
On the other hand, WebRTC is a WebAPI which comes with basically 3 things:
Real Time Communication between two browsers (no server needed), a peer to peer connection (P2P)
Media Streaming (Audio and Video)
Real Time Communication Data Channel (stream any data over P2P)
The main difference is that WebSockets needs A SERVER and is based on a publish/subscribe pattern where you can send raw data back and forth, without any special data handling by default. In contrast, WebRTC has a lot of functionality already in place for handling audio/video streaming, and also raw data via the data channel.
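For the raw-data part, a rough browser-side sketch looks something like this (the channel label is arbitrary, and the exchange of the offer/answer between the two peers - the signaling - is omitted):

```ts
// Sketch: arbitrary data over a WebRTC data channel (browser RTCPeerConnection API).
const pc = new RTCPeerConnection();

// Create a channel for any application data you like.
const channel = pc.createDataChannel("chat");

channel.onopen = () => channel.send("hello, peer");
channel.onmessage = (event) => console.log("peer says:", event.data);
```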
For more info I recommend reading the MDN links I provided above, and also check these very cool slides on sockets and WebRTC.
If you want to build video or audio communication services, use WebRTC for its built-in browser support and write the discovery and signaling yourself. WebRTC has awesome features like P2P connections and data encryption.
WebRTC client-side (browser) features like getting video and audio data have good support in evergreen browsers: http://iswebrtcreadyyet.com/#interop
And socket.io is good for building centralized pub/sub apps like text chat.
You can make connections with WebRTC without socket.io, but both work fine together if you use socket.io to help with signaling.
Related
I am new to this!
I am working on a chat application which requires text + video chat.
I explored Socket.io initially and found it very handy for developing a text-based (web) chat application.
While exploring the video chat element I came across WebRTC's RTCDataChannel for sending arbitrary data across connected peers.
My chat server (preferably NodeJS) will be serving the connections for peers, along with saving text chat history.
Confusion:
Should I use Socket.io-MyChatServer as the signalling server also? [Possible?], or
Should I use RTCDataChannel as the signalling server?, or
Simply forget Socket.io and use WebRTC for both!
Thanks in advance :)
Well, WebRTC data channels and WebSockets are different and complementary concepts in the case of peer connections.
In order to open a data channel you first need a P2P connection, and in order to establish a P2P connection you need a signaling server. So sockets are used for that purpose: to exchange the metadata necessary to create the P2P connection. First, through sockets, you establish the peer-to-peer connection, and only after that can you use data channels.
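A hedged sketch of what that socket-based exchange might look like on the calling side (the signaling-server URL and the "offer"/"answer"/"ice-candidate" event names are made up for illustration; any naming works as long as both peers and the server agree):

```ts
import { io } from "socket.io-client";

// Hypothetical signaling server reachable over Socket.IO.
const socket = io("https://signaling.example.com");
const pc = new RTCPeerConnection();

// Something must be negotiated, e.g. a data channel (or media tracks).
pc.createDataChannel("chat");

// Forward our ICE candidates to the other peer via the socket.
pc.onicecandidate = (event) => {
  if (event.candidate) socket.emit("ice-candidate", event.candidate);
};

// Caller: create an offer, set it locally, send it through the socket.
async function call() {
  const offer = await pc.createOffer();
  await pc.setLocalDescription(offer);
  socket.emit("offer", offer);
}

// The answer and the remote candidates come back over the same socket.
socket.on("answer", (answer: RTCSessionDescriptionInit) => pc.setRemoteDescription(answer));
socket.on("ice-candidate", (candidate: RTCIceCandidateInit) => pc.addIceCandidate(candidate));

call();
```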
Whether to use the same chat server as the signaling server is up to you. WebRTC lets the signaling architecture be defined by the developer; it's a black box.
So no, as you can see, you can't use data channels for signaling.
I built a realtime application that, thanks to Socket.IO, can serve a lot of different client types (C#, Java, Browser, ...)! I know that there are a lot of Socket.IO alternatives, but from my understanding, everything is more or less based on WebSockets. (I know that Socket.IO has fallbacks if WebSockets are not working, but those are more or less "inferior workarounds", so to speak...)
My question is: Is there any comparable real-time engine available that is NOT based on WebSockets, but can still serve all those different clients?
You don't say what your endpoints are. If one of the endpoints is a browser with purely the built-in capabilities of the browser and Javascript, then a webSocket is your only way to get a continuous connection from the browser to some other destination.
If a webSocket is not supported (in an older browser), then the other socket.io fallbacks (such as xhr-long-polling) are the next best alternatives. As the browser has limited communication capabilities, if you can't use a webSocket, then an ajax call is your only other generally supported option without requiring plug-ins on each browser (such as Flash or Java or something like that). socket.io already supports the next-best options that are available in a browser - you can't do better than that if you're talking about a standard browser with no custom plug-ins.
If your endpoints don't necessarily include a browser and you can use any language or library you want, then you can use plain TCP sockets and then use whatever protocol you want over a TCP socket.
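For instance, with Node.js on both ends you could skip WebSockets entirely and talk over the built-in net module (the port and the message framing here are purely illustrative; a real protocol would need proper message delimiting):

```ts
// Plain TCP with Node's built-in `net` module - no browser, no WebSocket.
import * as net from "net";

// Server: echo every chunk back with a prefix.
const server = net.createServer((conn) => {
  conn.on("data", (chunk) => conn.write(`echo: ${chunk}`));
});
server.listen(4000);

// Client: connect, send one message, print the reply, then shut down.
const client = net.connect(4000, "localhost", () => client.write("ping\n"));
client.on("data", (chunk) => {
  console.log(chunk.toString()); // "echo: ping"
  client.end();
  server.close();
});
```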
The WebSocket protocol establishes a bidirectional communication channel between server and client; they kind of speak more naturally with each other. The server can just send something to the client, and the other way around. With HTTP it goes in only one direction: there's a request and a response, and everything needs to be initiated by a request from the client.
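A small sketch of that server-initiated push, assuming Node with the ws package (the client never has to ask for these messages once the socket is open):

```ts
// Server pushes a message every second without waiting for any request.
// Assumes `npm install ws`.
import { WebSocketServer } from "ws";

const wss = new WebSocketServer({ port: 8080 });

wss.on("connection", (ws) => {
  const timer = setInterval(() => ws.send(`server time: ${Date.now()}`), 1000);
  ws.on("close", () => clearInterval(timer));
});
```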
From my experience, realtime web apps like a multiplayer game or a chat become easier to develop, and it apparently creates less overhead than using HTTP - but you can still do the same things more or less elegantly with HTTP as well (see e.g. long polling).
Look at Gmail or other existing web apps; they all use HTTP (as does Socket.IO as a fallback) and it works quite well.
I'm trying to understand the concepts of socket.io and websockets.
Suppose you have many users connected to a channel over socket.io; can two of them (peers) start a private conversation (with video, for example) without passing their data through the socket.io server?
For instance, browser to browser with WebSocket.
I am asking because I need to let the data (audio, video) flow from browser to browser between two users, so the server will not be saturated with the data of users starting private conversations.
If it is possible, what data needs to be exchanged to make this happen?
You should read this answer on how to make a browser-to-browser connection:
https://stackoverflow.com/a/7933140/3375010
Actually, it's not possible to initiate a P2P communication with socket.io. But WebRTC allows that; it supports browser-to-browser applications for voice, video, file sharing...
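A rough browser-side sketch of that media flow (the signaling that carries the offer/answer between the browsers is omitted, and the #remote element id is just an example):

```ts
// Send the local camera/mic to a peer and play whatever comes back.
const pc = new RTCPeerConnection();

async function startCall() {
  // Capture local audio/video and attach each track to the peer connection.
  const stream = await navigator.mediaDevices.getUserMedia({ audio: true, video: true });
  stream.getTracks().forEach((track) => pc.addTrack(track, stream));
}

// When the remote peer's tracks arrive, play them in a <video id="remote"> element.
pc.ontrack = (event) => {
  const video = document.querySelector<HTMLVideoElement>("#remote");
  if (video) video.srcObject = event.streams[0];
};

startCall();
```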
So I'm looking to build a chat app that will allow video, audio, and text. I spent some time researching into Websockets and WebRTC to decide which to use. Since there are plenty of video and audio apps with WebRTC, this sounds like a reasonable choice, but are there other things I should consider?
Feel free to share your thoughts.
Things like:
Due to being new, WebRTC is available only in some browsers, while WebSockets seems to be available in more browsers.
Scalability - WebSockets uses a server for sessions and WebRTC seems to be P2P.
Multiplexing/multiple chatrooms - used in Google+ Hangouts, and I'm still looking at demo apps to see how to implement it.
Server - WebSockets needs RedisSessionStore or RabbitMQ to scale across multiple machines.
WebRTC is designed for high-performance, high quality communication of video, audio and arbitrary data. In other words, for apps exactly like what you describe.
WebRTC apps need a service via which they can exchange network and media metadata, a process known as signaling. However, once signaling has taken place, video/audio/data is streamed directly between clients, avoiding the performance cost of streaming via an intermediary server.
WebSocket on the other hand is designed for bi-directional communication between client and server. It is possible to stream audio and video over WebSocket (see here for example), but the technology and APIs are not inherently designed for efficient, robust streaming in the way that WebRTC is.
As other replies have said, WebSocket can be used for signaling.
I maintain a list of WebRTC resources: strongly recommend you start by looking at the 2013 Google I/O presentation about WebRTC.
WebSockets use the TCP protocol.
WebRTC is mainly UDP.
Thus the main reason for using WebRTC instead of WebSockets is latency.
With WebSocket streaming you will have either high latency or choppy playback with low latency. With WebRTC you may achieve low latency and smooth playback, which is crucial for VoIP communications.
Just try to test these technologies under network loss, e.g. 2%. You will see high delays in the WebSocket stream.
WebSockets:
Ratified IETF standard (6455) with support across all modern browsers and even legacy browsers using web-socket-js polyfill.
Uses HTTP compatible handshake and default ports making it much easier to use with existing firewall, proxy and web server infrastructure.
Much simpler browser API. Basically one constructor with a couple of callbacks (see the sketch after this list).
Client/browser to server only.
Only supports reliable, in-order transport because it is built on TCP. This means packet drops can delay all subsequent packets.
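To illustrate the "one constructor with a couple of callbacks" point above (the URL is a placeholder):

```ts
// Essentially the whole browser-side WebSocket API.
const ws = new WebSocket("wss://example.com/chat");

ws.onopen = () => ws.send("hello");
ws.onmessage = (event) => console.log("got:", event.data);
ws.onerror = (err) => console.error("socket error", err);
ws.onclose = () => console.log("closed");
```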
WebRTC:
Just beginning to be supported by Chrome and Firefox. MS has proposed an incompatible variant. The DataChannel component is not yet compatible between Firefox and Chrome.
WebRTC is browser to browser in ideal circumstances but even then almost always requires a signaling server to setup the connections. The most common signaling server solutions right now use WebSockets.
The transport layer is configurable, with the application able to choose whether a connection is in-order and/or reliable (see the sketch after this list).
Complex and multilayered browser API. There are JS libs to provide a simpler API but these are young and rapidly changing (just like WebRTC itself).
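As a sketch of that configurable transport (channel labels are arbitrary):

```ts
const pc = new RTCPeerConnection();

// Unordered and lossy - dropped messages are never retransmitted (UDP-like),
// which suits latency-sensitive data such as game state.
const lossy = pc.createDataChannel("game-state", { ordered: false, maxRetransmits: 0 });

// Defaults: reliable and in-order (TCP-like), which suits chat or file transfer.
const reliable = pc.createDataChannel("chat");
```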
webRTC or websockets? Why not use both.
When building a video/audio/text chat, webRTC is definitely a good choice since it uses peer to peer technology and once the connection is up and running, you do not need to pass the communication via a server (unless using TURN).
When setting up the webRTC communication you have to involve some sort of signaling mechanism. WebSockets could be a good choice here, but webRTC is the way to go for the video/audio/text info. Chat rooms are handled in the signaling.
But, as you mention, not every browser supports webRTC, so websockets can sometimes be a good fallback for those browsers.
Security is one aspect you missed.
With Websockets the data has to go via a central webserver which typically sees all the traffic and can access it.
With WebRTC the data is end-to-end encrypted and does not pass through a server (except sometimes TURN servers are needed, but they have no access to the body of the messages they forward).
Depending on your application this may or may not matter.
If you are sending large amounts of data, the saving in cloud bandwidth costs due to webRTC's P2P architecture may be worth considering too.
Comparing websocket and webrtc is unfair.
WebSocket is built on top of TCP. A packet's boundary can be detected from the header information of a WebSocket frame, unlike with raw TCP.
Typically, WebRTC makes use of WebSocket. The signalling for WebRTC is not defined; it is up to the service provider what kind of signalling to use. It may be SIP, HTTP, JSON or any text/binary message.
The signalling messages can be sent/received using WebSocket.
WebRTC is about peer-to-peer connections.
We all know that before creating a peer-to-peer connection, a handshaking process is required to establish it.
And WebSockets can play the role of that handshaking process.
WebSocket and WebRTC can be used together: WebSocket as a signaling channel for WebRTC, and WebRTC as the video/audio/text channel. WebRTC can also run over UDP or via a TURN relay; TURN relays support TCP and HTTP as well as HTTPS.
Many projects use Websocket and WebRTC together.
I am trying to understand the difference between WebRTC and WebSockets so that I can better understand which scenario calls for what. I am curious about the broad idea of two parties (mainly web based, but potentially one being a dedicated server application) talking to each other.
Assumption:
Clearly in regards to ad-hoc networks, WebRTC wins as it natively supports the ICE protocol/method.
Questions:
Regarding direct communication between two known parties in-browser, if I am not relying on sending multimedia data, and I am only interested in sending integer data, does WebRTC give me any advantages over webSockets other than data encryption?
Regarding a dedicated server speaking to a browser based client, which platform gives me an advantage? I would need to code a WebRTC server (is this possible out of browser?), or I would need to code a WebSocket server (a quick google search makes me think this is possible).
There is one significant difference: WebSockets works via TCP, WebRTC works via UDP.
In fact, WebRTC is the SRTP protocol with some additional features like STUN, ICE and DTLS, and internal VoIP features such as adaptive jitter buffer, AEC and AGC.
So, WebSockets is designed for reliable communication. It is a good choice if you want to send any data that must be sent reliably.
When you use WebRTC, the transmitted stream is unreliable. Some packets can get lost in the network. That is bad if you are sending critical data, for example for financial processing, but it is perfectly acceptable when you send an audio or video stream, where some frames can be lost without any noticeable quality issues.
If you want to send data over a WebRTC data channel, you should have some forward error correction algorithm to restore data if a frame is lost in the network.
WebRTC specifies media transport over RTP, which can work P2P under certain circumstances. In any case, to establish a WebRTC session you will also need a signaling protocol, and for that WebSocket is a likely choice. In other words: unless you want to stream real-time media, WebSocket is probably a better fit.
Question 1: Yes. The DataChannel part of WebRTC gives you advantages in this case, because it allows you to create a peer-to-peer channel between browsers to send and receive any raw data you want. WebSockets forces you to use a server to connect both parties.
Question 2: Like I said in the previous response, WebSockets are better if you want server-client communication, and there are many implementations for this (e.g. jWebSocket). Adding support in a server for establishing a connection with a WebRTC DataChannel may take you some days of life and health. :)