Architecture for Live Audio Streaming web app - websocket

Need your opinions on the architecture for my live audio streaming app.
Currently I'm testing it on a local network and everything seems to work, but I have doubts about how well it will hold up in production.
My architecture:
Broadcaster HTTP client --(1)--> App Server --(2)--> Listening clients (React.js App)
(1) communication over HTTP, (2) communication over HTTP and WebSocket
What I want to do:
When the user opens my React App and the Broadcaster is not streaming yet, React should display something like "OFFLINE".
Next, when the Broadcaster starts streaming to the App Server, the React App should display "The stream is started" and automatically start playback.
Finally, when the Broadcaster stops streaming, the React App should display "OFFLINE" again.
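To make it concrete, here is roughly the client behaviour I have in mind. This is only a sketch, not my real code; the WebSocket URL, the status-message shape, and the component name are placeholders:

// Player.jsx, a hypothetical component; URL and message shape are assumptions
import React, { useEffect, useState } from 'react';

export default function Player() {
  const [streamUrl, setStreamUrl] = useState(null);

  useEffect(() => {
    // listen for JSON status messages pushed by the App Server
    const ws = new WebSocket('ws://my-domain/status');
    ws.onmessage = (event) => {
      const status = JSON.parse(event.data);
      setStreamUrl(status.streaming ? status.url : null);
    };
    return () => ws.close();
  }, []);

  if (!streamUrl) return <p>OFFLINE</p>;
  return (
    <div>
      <p>The stream is started</p>
      {/* autoplay may be blocked until the user interacts with the page */}
      <audio src={streamUrl} autoPlay controls />
    </div>
  );
}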
How I currently do it:
My App Server uses two protocols: HTTP (for audio streaming and other stuff) and WebSocket (only for sending JSON status messages about what happens on the server).
When the Broadcaster starts streaming to the App Server (over HTTP), the App Server sends a WebSocket message to the React App: "the stream has started, you can access it at http://my-domain/stream", i.e. the App Server streams the audio to React over regular HTTP.
The React App sees this message, renders an HTML <audio> element, and starts playing the audio.
When the Broadcaster stops streaming, the App Server sends the WebSocket message "the stream is finished" to the React App, and React hides the player and displays "OFFLINE" again.
So, I do all streaming (both from Broadcaster to App Server and from App Server to React client) over HTTP and use WebSocket to communicate real-time stream state updates.
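On the server side, the status push is just a broadcast to every connected WebSocket client whenever the Broadcaster starts or stops. A simplified sketch of what I mean, assuming a Node.js App Server with Express and the ws library (the routes, port, and message shape are placeholders, not my actual endpoints):

// status broadcast sketch; endpoint names and message shape are made up
const express = require('express');
const { WebSocketServer } = require('ws');

const app = express();
const server = app.listen(8080);
const wss = new WebSocketServer({ server, path: '/status' });

function broadcast(status) {
  const payload = JSON.stringify(status);
  wss.clients.forEach((client) => {
    if (client.readyState === 1 /* OPEN */) client.send(payload);
  });
}

// Broadcaster begins pushing audio over HTTP
app.post('/broadcast/start', (req, res) => {
  broadcast({ streaming: true, url: 'http://my-domain/stream' });
  res.sendStatus(200);
});

// Broadcaster stops
app.post('/broadcast/stop', (req, res) => {
  broadcast({ streaming: false });
  res.sendStatus(200);
});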
How good is this architecture?

It's not so much a matter of good or bad, it's a matter of whether or not it's appropriate for your use case. I'd note that this is basically exactly how internet radio servers such as SHOUTcast and Icecast have worked for 20+ years, so it can't be that bad. :-)

Related

Using Redis Pub/Sub directly from the frontend without WebSockets

Can the frontend directly subscribe to Redis pub/sub to get messages? Most blogs on the internet say the client has to talk to the backend over a WebSocket, and the WebSocket service then communicates with Redis. Can the frontend subscribe to Redis directly and get updates without using WebSockets?
We are trying to build a dashboard whose graphs refresh in real time to show current metrics. Will that work, or does this design have any cons?
The browser (the frontend) is stateless by nature (HTTP is stateless). The instance of the (JavaScript) code that "subscribes" to something effectively goes away after a page reload. Also, the browser cannot open a raw TCP connection, so it cannot speak the Redis protocol to the Redis server directly; you need something in between. Web Sockets give you a persistent, full-duplex communication channel between the browser and the server.
Before Web Sockets (and Server-Sent Events), you had to poll the server, i.e. check for messages for your instance/user/etc. in a loop, which eats up a lot of CPU cycles. So, yes, you need Web Sockets or SSE to do async messaging efficiently in a browser.
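For the dashboard case, the usual shape is a small backend piece that subscribes to Redis and fans messages out to browsers over WebSockets. A rough sketch, assuming Node.js with ioredis and ws (the channel name and port are made up):

// relay.js: hypothetical bridge between Redis pub/sub and browser WebSockets
const Redis = require('ioredis');
const { WebSocketServer } = require('ws');

const sub = new Redis();                       // dedicated subscriber connection
const wss = new WebSocketServer({ port: 8081 });

sub.subscribe('metrics');                      // channel your metrics are published on
sub.on('message', (channel, message) => {
  // forward each Redis message to every connected dashboard
  wss.clients.forEach((client) => {
    if (client.readyState === 1 /* OPEN */) client.send(message);
  });
});

The browser then opens a single WebSocket to this relay instead of talking to Redis itself.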

Saving text chat on the server with a WebRTC + socket.io + Node.js architecture

I am building a chat system using WebRTC and socket.io + Node.js. My problem is how to keep a backup of the text chats on my server during 1-to-1 chat.
One approach could be using WebRTC for the chat (peer-to-peer communication) and, with every message successfully sent, hitting a web service to update the DB. But this doesn't seem like a good approach, since I'd have to update the DB from the client every time, and network bandwidth becomes an issue, especially for mobile clients.
Another approach could be sending the messages through socket.io and saving the chat to the DB from the Node.js server.
The second approach makes more sense to me, but I am looking for the best approach.
When I wrote my chat application (using socket.io), persisting chat history was done on the socket.io server side, i.e.:
const io = require('socket.io')(3000);   // attach socket.io to a port (3000 here)

io.on('connection', function (socket) {
  socket.on('chat:message', function (message) {
    // persist the message to the db here, then rebroadcast it to everyone
    io.emit('chat:message', message);
  });
});
It worked fine for me.
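For completeness, the matching browser-side usage is just an emit plus a listener on the same event; the client never touches the database. The server URL and message shape below are placeholders:

// browser client, assuming the socket.io client script is loaded on the page
const socket = io('http://localhost:3000');

socket.emit('chat:message', { from: 'alice', text: 'hi' });

socket.on('chat:message', (message) => {
  console.log('new message', message);   // update the chat UI here
});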

Is socket.io WebRTC, WebSocket, or something else?

I'm new to socket.io. In real-time (web) applications, we used to choose between WebRTC and WebSocket (or even SIP, still?) technologies.
What exactly is socket.io in this case, please?
WebSockets
socket.io is a popular open source library implemented on both the backend and the client side. The library is based on the WebSockets API, which allows communication between a SERVER and a CLIENT.
WebRTC
On the other hand, WebRTC is a WebAPI which comes with basically 3 things:
Real Time Communication between two browsers (no server needed), a peer to peer connection (P2P)
Media Streaming (Audio and Video)
Real Time Communication Data Channel (stream any data over P2P)
The main difference is that WebSockets need A SERVER and are based on a publish/subscribe pattern where you can send raw data back and forth, without any special data handling by default. In contrast, WebRTC has a lot of functionality already in place for handling audio/video streaming, and also raw data via the data channel.
For more info I recommend reading the MDN links I provided above, and also checking these very cool slides on sockets and WebRTC.
If you want to build video or audio communication services, use WebRTC for its built-in browser support, and write the discovery and signaling yourself. WebRTC has awesome features like P2P connections and data encryption.
WebRTC's client-side (browser) features, like getting video and audio data, have good support in evergreen browsers: http://iswebrtcreadyyet.com/#interop
And socket.io is good for building centralized pub/sub apps like text chat.
You can make WebRTC connections without socket.io, but both work fine together if you use socket.io to help with signaling.
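To make the "socket.io for signaling" point concrete: the server's only job is to relay session descriptions and ICE candidates between the two browsers; once that handshake is done, the media flows peer to peer. A rough sketch (the event and room names are made up, not part of any standard):

// signaling server: just forwards SDP offers/answers and ICE candidates
const io = require('socket.io')(3000);

io.on('connection', (socket) => {
  socket.on('join', (room) => socket.join(room));

  // relay whatever signaling payload arrives to the other peer(s) in the room
  socket.on('signal', ({ room, data }) => {
    socket.to(room).emit('signal', data);
  });
});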

Send private message between two peers with socket.io

I'm trying to understand the concepts of socket.io and WebSockets.
Suppose you have many users connected in a channel over socket.io; can two of them (peers) start a private conversation (with video, for example) without passing their data through the socket.io server?
For instance, browser to browser with WebSocket.
I am asking because I need to let the data (audio/video) flow browser to browser between two users, so the server will not be saturated with the data of users starting private conversations.
If it is possible, what data needs to be exchanged to make this happen?
You should read this answer on how to make a browser-to-browser connection:
https://stackoverflow.com/a/7933140/3375010
Actually, it's not possible to initiate P2P communication with socket.io. But WebRTC allows that; it supports browser-to-browser applications for voice, video, file sharing, and more.
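Once the signaling is done (your existing socket.io channel is fine for that), the private messages themselves can travel over an RTCDataChannel and never touch the server. A minimal browser-side sketch; the offer/answer and ICE exchange still has to happen over your signaling channel and is omitted here:

// caller side: create the data channel before creating the offer
const pc = new RTCPeerConnection();
const channel = pc.createDataChannel('private-chat');

channel.onopen = () => channel.send('hello, this never goes through the server');
channel.onmessage = (event) => console.log('peer says:', event.data);

// callee side: the channel arrives via the datachannel event
// pc.ondatachannel = (event) => { event.channel.onmessage = /* ... */; };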

Heroku: Suitable for a Real-Time Game Server?

I am trying to run a real-time game server on Heroku using Java/Netty. The game server uses a non-standard port for communication (4876/tcp). I have built the game client using Unity3D. The game client communicates with the game server using a binary protocol (i.e. it is not using HTTP).
Is it possible for me to host this on Heroku? Heroku looks like it can only host web apps on port 80 or 443 (i.e. the web process in the Procfile).
To complicate things slightly, I also have a web services app built using Java/Embedded Jetty which needs to be able to communicate with the game client and the real-time game server, and which I also want to host on Heroku. Is this possible, given that I know there can be no inter-process communication? What if I create two separate apps (one for web services and one for the real-time game server) on Heroku?
As of the time of this answer, Heroku only supports HTTP and HTTPS, as you have noticed.
You could try modifying your game to use HTTP as a transport for the binary protocol.
