Server-side WebRTC (streaming camera) - media

Use Case (stream UDP video)
Stream a server-side (robot) webcam's video over UDP to a client browser. We would rather lose packets than have the webcam struggle to keep up over a TCP connection on a Wi-Fi link that constantly cuts out.
Attempted solution
Start a Firefox browser under Xvfb on the server and have it stream the webcam media source. I don't like this solution: it's inflexible for non-webcam video and difficult to configure.
I'm looking for something that can stream an arbitrary media source to a WebRTC connection (including the greeting and handshaking). I don't particularly care which language it is; if something already exists in Node.js, Python, C, Java or Scala I'll use it. Otherwise I suppose I'll get to work on the problem (in that case, any guidance would be appreciated).
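One option that fits this: aiortc, a Python WebRTC implementation, handles the offer/answer handshake itself, and its MediaPlayer helper can open anything ffmpeg can (webcam, file, RTSP URL). A minimal sketch, assuming a V4L2 webcam at /dev/video0; the stdin/stdout JSON signalling is just a placeholder for whatever transport you use:

    import asyncio, json, sys
    from aiortc import RTCPeerConnection, RTCSessionDescription
    from aiortc.contrib.media import MediaPlayer

    async def run():
        pc = RTCPeerConnection()
        # MediaPlayer wraps ffmpeg/libav, so any source ffmpeg can open
        # works here, not just a webcam.
        player = MediaPlayer("/dev/video0", format="v4l2",
                             options={"video_size": "640x480"})
        pc.addTrack(player.video)

        # Placeholder signalling: read the browser's offer, print our answer.
        offer = json.loads(sys.stdin.readline())
        await pc.setRemoteDescription(
            RTCSessionDescription(sdp=offer["sdp"], type=offer["type"]))
        await pc.setLocalDescription(await pc.createAnswer())
        print(json.dumps({"sdp": pc.localDescription.sdp,
                          "type": pc.localDescription.type}))
        await asyncio.sleep(3600)  # keep the peer connection alive

    asyncio.run(run())

Since WebRTC media rides on SRTP over UDP, a flaky Wi-Fi link shows up as dropped frames rather than a stalled TCP stream, which matches the requirement above.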

Related

How to send an audio stream over SIP

I'm developing an application that receives an audio stream over a WebSocket and needs to forward the audio to a SIP server.
Currently, I've managed to connect to the audio source using a WebSocket and receive the media stream (encoded u-law) using Node-RED, but I'm struggling to figure out how to send the media stream to the SIP server. Any advice would be much appreciated.
I looked into this for a similar question a while back, though I can't find where it was now.
As you probably know, the media part of SIP is RTP, so it's a fairly separate stack from the call signalling.
I didn't find any nodes that supported it, and the few Node.js libraries for RTP were all very incomplete and out of date.
In theory it might be possible to craft your own RTP streams using the UDP nodes and then create the relevant SDP in the SIP response (see the sketch below), but I'm not sure how robust or scalable this would be.
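For a sense of what crafting RTP by hand involves, here is a rough Python sketch (not Node-RED, and untested against a real SIP stack) that wraps 20 ms u-law frames in hand-built RTP headers; the address and SSRC are made-up examples that would really come from the SDP negotiation:

    import socket, struct

    SIP_MEDIA_ADDR = ("192.0.2.10", 10000)  # example; comes from the SDP answer
    SSRC = 0x12345678                       # example stream identifier
    seq, ts = 0, 0
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)

    def send_ulaw_frame(payload: bytes) -> None:
        """Send one 20 ms u-law frame (160 bytes at 8 kHz) as an RTP packet."""
        global seq, ts
        header = struct.pack("!BBHII",
                             0x80,             # V=2, no padding/extension/CSRC
                             0,                # marker=0, payload type 0 = PCMU
                             seq & 0xFFFF,     # sequence number
                             ts & 0xFFFFFFFF,  # timestamp on an 8 kHz clock
                             SSRC)
        sock.sendto(header + payload, SIP_MEDIA_ADDR)
        seq += 1
        ts += 160                              # 160 samples per 20 ms frame

You would still have to pace the frames at 20 ms intervals and answer the SIP INVITE with matching SDP, which is where the robustness worries come in.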
The other option is that there are a couple of programmable comms platforms out there that support both SIP and WebSockets, so you could possibly use one of those and connect from Node-RED via WebSocket, letting them do the SIP work.
I've done SIP<->WebSocket work with both the Vonage API (previously Nexmo) and Jambonz (open source).

WebRTC H264 live video streaming (with FFmpeg) from OpenGL

I am trying to build a peer-to-peer game-streaming platform. So far I have managed to capture the OpenGL frames, I have a functional Java WebSocket server, and two clients can establish a peer-to-peer connection (I have solved the STUN/TURN part) and transfer text.
I do not quite understand how I could stream video built from the OpenGL frames with low latency (<100 ms). The problem mainly lies in the FFmpeg part: I want to use it to encode the frames, get the result (stdin/stdout redirection for ffmpeg?), and somehow link that to the JS API of the host (maybe a local WebSocket that the hoster's JS connects to).
I tried several FFmpeg arguments/commands with stdin and stdout pipes and they did not work.
What WebRTC client are you using? What is the H264 live stream flowing into?
WebRTC in the browser has a few restrictions (just because the implementation is naive). Try constrained baseline, and use a very small keyframe interval (every second is usually good for a prototype!); the sketch below shows both.
If you don't have a WebRTC client, you can do something like webrtc-remote-screen.
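On the pipe question specifically, a hedged sketch (Python; raw RGBA frames at an assumed 1280x720/60 fps, so adapt to your capture format) that applies the advice above, baseline profile and a one-second keyframe interval:

    import subprocess

    W, H, FPS = 1280, 720, 60  # assumed capture size/rate; match your GL buffer
    ffmpeg = subprocess.Popen([
        "ffmpeg",
        "-f", "rawvideo", "-pix_fmt", "rgba", "-s", f"{W}x{H}",
        "-r", str(FPS), "-i", "pipe:0",   # raw frames in on stdin
        "-c:v", "libx264",
        "-profile:v", "baseline",         # x264 baseline = constrained baseline
        "-preset", "ultrafast", "-tune", "zerolatency",
        "-g", str(FPS),                   # one keyframe per second
        "-f", "h264", "pipe:1",           # Annex B H264 out on stdout
    ], stdin=subprocess.PIPE, stdout=subprocess.PIPE)

    def push_frame(frame_rgba: bytes) -> None:
        ffmpeg.stdin.write(frame_rgba)    # exactly W*H*4 bytes per frame

    # Drain ffmpeg.stdout on a separate thread and hand the NAL units to
    # whatever feeds the WebRTC peer, or the pipe buffer fills and deadlocks.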

Socket programming for audio streaming over Bluetooth or Wi-Fi

I know that the RTSP protocol is used for audio and video streaming (however, I am not aware of what is used with Bluetooth).
However, my question is a little different. I would like to explain it with an example (actually it is not just an example; I am trying to build something similar).
When we connect our device (a mobile running Android) to our PC (running Windows) using Bluetooth, the Control Panel shows an option for streaming audio.
As many of you will be aware, when I play a song on my device it is played through my PC speakers.
So my questions are:
1) Who is the server?
2) Who is the client?
I think that my PC is probably the client. If the PC is the client, then it would open a connection for audio streaming with the device.
Since it opens a connection with the server, the server should have a specific application through which the packet transfer with the client takes place.
But to my surprise, I was able to use any media player on my device to play songs through my laptop speakers.
How is this possible? Is it possible to do the same thing using the RTSP protocol?

Save and re-stream RTSP video as straight UDP

I am trying to write a program that connects to an RTSP video source and redirects the video data to another location using UDP. I am also saving the RTSP packets so that the video stream can be replayed at a later moment in time. Right now my program can connect to the RTSP video stream, redirect, and save, but when I try to look at the redirected video in VLC I get nothing.
Currently the program just strips the datagram out of the RTSP video packets it receives on its open UDP socket and re-sends it with the following code, using the Boost.Asio library:
newVideoSocket->send_to(boost::asio::buffer(dg.data), Endpoint);
When I look at the traffic in Wireshark I see that the data is actually being sent to the new address and recognized as UDP packets, but when I try to view the video in VLC nothing happens. The stream is MPEG-4 with the video encoded as H.264, and VLC can play it.
I have tried to connect to the redirected stream as UDP and as RTP, at both multicast and unicast addresses, but have had no success. Do I need to add something to, or take something out of, the datagram before I resend it? Or is something wrong with how I am trying to view it in VLC? Thanks for the help.
To play a raw UDP stream, VLC needs information about the stream (this information is normally carried by RTSP in the DESCRIBE and SETUP messages). Try creating an SDP file specifying the port number, video type, etc. (you need to read the DESCRIBE response from the server) and then open it in VLC.
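For example, something along these lines saved as stream.sdp and opened in VLC; the addresses, port 5004 and payload type 96 are illustrative, and the real values (including any fmtp line with sprop-parameter-sets) should be copied from the server's DESCRIBE response:

    v=0
    o=- 0 0 IN IP4 127.0.0.1
    s=Restreamed RTSP video
    c=IN IP4 127.0.0.1
    t=0 0
    m=video 5004 RTP/AVP 96
    a=rtpmap:96 H264/90000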
I've managed to make it work, but using VLC like this I encountered problems with synchronization and video output (the video was broken).

WebSocket VP9 video stream based on messages

I want to establish a video stream between a C# application and a browser.
I'm using WebSockets for the communication.
The video source is a webcam.
I am able to request single PNG frames, but it is slow as hell.
The WebSocket server (Ratchet) is message-based, but is it possible to use VP9 compression or something similar by using some kind of buffer?
WebSockets implement a messaging protocol over sockets, which is not desirable for video. I think a better-suited technology for this is WebRTC.

Resources