Live WebRTC streams (getUserMedia) to DASH using WebM - ffmpeg

I'm trying to understand the feasibility of a live streaming solution.
I want to grab WebRTC streams (audio and video), send them to a server and transform them into chunks to send to an HTML5 video tag or a DASH player, using the WebM container (VP8 and Opus codecs).
I also looked into ffmpeg, ffserver and gstreamer but...
My question is: how do I feed in the WebRTC streams (live) and transform them into HTTP chunks (live, DASH-compatible)?
Anyone achieved something like this?
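One possible approach, once the WebRTC tracks have been delivered to the server as a WebM/Matroska source (for example by a media server or recorder writing to a file or pipe), is to let ffmpeg repackage them into live DASH WebM chunks. This is only a sketch under assumptions: input.webm stands for that server-side source, and it assumes a recent ffmpeg build whose dash muxer supports -dash_segment_type webm.
# Repackage a server-side WebM source (placeholder: input.webm) into live DASH WebM chunks
ffmpeg -re -i input.webm \
  -map 0:v -c:v copy -map 0:a -c:a copy \
  -f dash -dash_segment_type webm \
  -seg_duration 2 -use_template 1 -use_timeline 1 -window_size 5 \
  manifest.mpd
Since VP8 and Opus are WebM-native, both streams can be copied without re-encoding; the manifest and segments then only need to be served over HTTP to a DASH player such as dash.js.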

Related

HTML5 live streaming with alpha channel

We are trying to live stream a video file with alpha channel. Adaptive streaming would be great, but it's not a must. Streaming is pretty new territory for us.
We found out that WebM (VP9) seems to be the only format on the web that supports an alpha channel. We tried using the nginx-rtmp-module as the streaming server (MPEG-DASH) and broadcasting the file with ffmpeg, but the alpha channel was lost - probably because RTMP required us to broadcast the video as FLV, which doesn't support alpha.
Does anyone have experience streaming RGBA video on the web? Pointers to compatible commercial solutions would be interesting too. My next approach would be to try Icecast. There is not much information online apart from this article, which suggests that streaming a WebM directly is possible.
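For reference, a rough sketch of the Icecast route: ffmpeg's libvpx-vp9 encoder keeps the alpha plane when fed yuva420p, and ffmpeg can push a live WebM stream to an Icecast mountpoint. The input file, Icecast host, source password and mount name below are all placeholders.
# Encode VP9 with alpha (yuva420p) and push the WebM stream to an Icecast mountpoint
ffmpeg -re -i input_with_alpha.mov \
  -c:v libvpx-vp9 -pix_fmt yuva420p -b:v 2M -deadline realtime \
  -c:a libopus \
  -f webm -content_type video/webm \
  icecast://source:hackme@example.com:8000/alpha.webm
A video tag can then point at http://example.com:8000/alpha.webm; whether the transparency survives playback depends on the browser's VP9 decoder, so this needs testing (Chrome first).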

How to stream a video in H.264?

I'm trying to stream my video as H.264 so I can play it on an HTML5 page through the video tag. I have found a lot of examples showing how to stream a video file to an RTMP stream, but I can barely find an example for H.264.
Here is the only example I can find:
ffmpeg -f dshow -i video="Virtual-Camera" -preset ultrafast -vcodec libx264 -tune zerolatency -b 900k -f mpegts udp://10.1.0.102:1234
This seems to fit my needs. But I don't know what kind of server udp://10.1.0.102:1234 is.
If it started with rtmp://10.1.0.102, then I would know it's an RTMP server, and I would have to set up nginx with the RTMP module. But what's a UDP server? What do I have to do to set one up?
Thanks a lot.
This ffmpeg command line streams MPEG-2 TS over UDP.
So it acts as a live encoder, and ffmpeg is not a bad choice for that role.
So you have a live encoder in place, but to stream to a web page you also need streaming server software that will ingest (receive) this live stream and convert it to a format playable by the HTML5 video tag. That format is likely to be WebRTC.
You can use Wowza or Unreal Media Server - they will ingest your MPEG2-TS stream and output to webpage as WebRTC stream.
udp:// is not a streaming format as such; it just tells you the stream is served over UDP (instead of TCP). The format is actually MPEG-TS, which you can see from -f mpegts.
If you want to play it in a normal browser, you will need to provide it in a different format. For live video there isn't really a universally supported format that you can just use with the video tag. Microsoft Edge and Apple Safari both support HLS natively, but Chrome and Firefox lack any native support for a live streaming format.
With HLS, you can use hls.js and get playback in pretty much all browsers. ffmpeg can output HLS natively; you would just need a web server as well.
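For illustration (reusing the DirectShow device from the question; the segment length, bitrate and output directory are assumptions), the same capture could be packaged as HLS instead of MPEG-TS over UDP:
# Capture from the same DirectShow device and write a rolling HLS playlist
ffmpeg -f dshow -i video="Virtual-Camera" \
  -c:v libx264 -preset ultrafast -tune zerolatency -b:v 900k \
  -f hls -hls_time 2 -hls_list_size 6 -hls_flags delete_segments \
  /var/www/html/live/stream.m3u8
Any plain web server (nginx, Apache) can then serve stream.m3u8 and its segments, and hls.js plays that URL in browsers without native HLS support.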

HTTP Live Streaming: Fragmented MP4 or MPEG-TS?

I have an IP camera which sends out a live stream in RTSP over UDP and I want to display this stream in the browser, and I want it to work on the major browsers and on mobile (both iOs and Android). To achieve this I want to convert the stream to HTTP Live Streaming (HLS) on the server before sending it to the client. Now I've read that not very long ago Apple added support for fragmented MP4 (fMP4) as format for the stream, whereas normally the stream would be sent in MPEG-TS format. And fMP4 is also the format that MPEG-DASH supports, and MPEG-DASH might be the industry standard in a few years.
Now my question is: what are the advantages and disadvantages of fMP4 and MPEG-TS?
EDIT: According to the technical notes for HLS from Apple, live streams must be encoded as MPEG-TS streams (https://developer.apple.com/library/content/technotes/tn2224/_index.html#//apple_ref/doc/uid/DTS40009745-CH1-ENCODEYOURVARIANTS). Is there a reason for this or is this information outdated?
fMP4 is likely to replace TS as a standard. It has less overhead and is required for HEVC, but the main advantage is compatibility with DASH - i.e. you can generate both HLS and DASH from the same files, which helps with compute and storage costs. For your particular use case, HLS with TS probably has more coverage (due to old devices and players) than HLS with fMP4, but HLS+DASH with fMP4 is what I would choose.
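As a sketch only (the camera URL, codecs and segment settings below are assumptions, not from the question), ffmpeg can produce fMP4 HLS from the RTSP feed, and switching -hls_segment_type back to mpegts gives the traditional TS variant for comparison:
# Pull the RTSP feed and package it as HLS with fragmented MP4 segments
ffmpeg -rtsp_transport tcp -i rtsp://camera.local/stream \
  -c:v libx264 -preset veryfast -c:a aac \
  -f hls -hls_time 4 -hls_segment_type fmp4 -hls_fmp4_init_filename init.mp4 \
  -hls_list_size 6 -hls_flags delete_segments \
  stream.m3u8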

RTSP streaming to html5 video

On a project, we have a camera with an RTSP stream (video & audio, encoded in H264). We need to make the stream available in all browsers (desktop & mobile).
I've seen some solutions:
Convert the stream to HLS (iOS) and MPEG-DASH (other browsers) with FFMPEG on a server
Video only with jsmpeg
The problem is that we need truly live streaming (e.g. the user can capture pictures/video live), so a low-latency solution is a requirement of the project.
Any ideas?
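One common starting point (the camera credentials and web root below are placeholders) is to have ffmpeg pull the RTSP feed, copy the camera's H264 video as-is and repackage it as short-segment HLS; -f dash works the same way for the DASH variant:
# Restream RTSP to HLS without re-encoding the H264 video
ffmpeg -rtsp_transport tcp -i "rtsp://user:pass@camera.local/stream" \
  -c:v copy -c:a aac \
  -f hls -hls_time 1 -hls_list_size 5 -hls_flags delete_segments \
  /var/www/html/cam/index.m3u8
Even with one-second segments, segment-based HLS/DASH still adds a few seconds of latency, so if capturing pictures/video "live" really needs sub-second delay, a WebRTC-based restream is probably the better fit.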

Encoding and streaming continuous PNG output image files as live video in a web browser

I have an OpenGL application that renders a simulation animation and outputs several PNG image files per second, saving them to disk. I want to stream these image files as video over HTTP so I can view the animation in a web browser. I already have a robust socket server that handles WebSocket connections, and I can handle all the handshake and message encoding/decoding. My server program and OpenGL application are written in C++.
A couple of questions come to mind:
What is the best way to stream this OpenGL animation output and view it in my web browser? The image frames are dynamically (continuously) generated by the OpenGL application as PNG files. The browser should display the video corresponding to the OpenGL display output (with minimal latency).
How can I encode these PNG image files as a continuous (live) video programmatically in C/C++ (without manually pushing the image files to streaming server software such as Flash Media Live Encoder)? What video format should I produce?
Should I send/receive the animation data over a WebSocket, or is there a better way (like a jQuery Ajax call instead; I am just making this up, so please guide me towards the correct way of implementing this)? It would be great if this live video streaming worked across different browsers.
Does the HTML5 video tag support live video streaming, or does it only work with a complete video file that exists at a particular URL/directory (not a live stream)?
Are there any existing code samples (tutorials) for this kind of live video streaming, where a C/C++/Java application produces image frames and a web browser consumes the output as a video stream? I could barely find tutorials on this topic after spending a few hours searching on Google.
You definitely want to stop writing PNG files to disk and instead feed the frames of image data into a video encoder. A good bet is to use libav/ffmpeg. Next, you will have to encapsulate the encoded video in a network-friendly format. I would recommend x264 as the encoder and MPEG-4 or MPEG-2 TS as the stream format.
To view the video in the web browser, you'll have to choose the streaming format. HLS in HTML5 is supported by Safari, but unfortunately not much else. For wide client support you will need to use a plugin such as flash or a media player.
The easiest way I can think of to do this is to use Wowza for a server-side restream. The GL program would stream MPEG-2 TS to Wowza, which would then prepare streams for HLS, RTMP (Flash), RTSP, and Microsoft Smooth Streaming (Silverlight). Wowza costs about $1000. You could set up an RTMP stream using Red5, which is free. Or you could do RTSP serving with VLC, but RTSP clients are universally terrible.
Unfortunately, at this time, the level of standardization for web video is very poor, and the video tooling is rather cumbersome. It's a large undertaking, but you can get hacking with ffmpeg/libav. A proof of concept could be writing image frames in YUV420p format to a pipe that ffmpeg is listening to and choosing an output stream that you can read with an RTSP client such as VLC, Quicktime, or Windows Media Player.
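A minimal sketch of that proof of concept, assuming the OpenGL program (called glsim here, a made-up name) writes raw 1280x720 YUV420p frames to stdout at 30 fps; the output below is MPEG-TS over UDP rather than RTSP, which VLC can open directly via udp://@:1234:
# glsim writes raw YUV420p frames to stdout; ffmpeg encodes them and streams MPEG-TS over UDP
./glsim | ffmpeg -f rawvideo -pix_fmt yuv420p -s 1280x720 -r 30 -i - \
  -c:v libx264 -preset ultrafast -tune zerolatency \
  -f mpegts udp://127.0.0.1:1234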
Most live video is MPEG-2 internally, wrapped up as RTMP (Flash) or HLS (Apple). There is probably a way to render your OpenGL output to frames and have them converted into MPEG-2 as a live stream, but I don't know exactly how (maybe FFMPEG?). Once that is done you can push the stream through Flash Media Live Encoder (it's free) and stream it out to Flash clients directly using RTMP, or push-publish it into Wowza Media Server to package it for Flash, HTTP Live Streaming (Cupertino), and Smooth Streaming for Silverlight.
Basically you can string together some COTS solutions into a pipeline and play on a standard player without handling the sockets and low level stuff yourself.
