Live streaming of images in a LAN

I have a proprietary library that generates JPEG images at 10-20Hz. I'd like to stream the images as a video stream over the network, so that a remote client (VLC for example) will be able to view it.
The clients are all in a LAN and there are no restrictions on the streaming protocol and the video format. The environment is Windows 7/XP and the library DLL exports a C-only API.
Is there a recommended library that allows streaming image frames injected in real time? The streaming libraries I know of (VLC and Live555) do not allow this, AFAIK.

M-JPEG defines streaming over HTTP by sending individual images. This protocol is understood by VLC.
From Wikipedia:

M-JPEG over HTTP: HTTP streaming separates each image into individual HTTP replies on a specified marker. RTP streaming creates packets of a sequence of JPEG images that can be received by clients such as QuickTime or VLC. The server software mentioned above streams the sequence of JPEGs over HTTP. A special MIME content type, multipart/x-mixed-replace;boundary=, informs the browser to expect several parts as the answer, separated by a special boundary. This boundary is defined within the MIME type. For M-JPEG streams, the JPEG data is sent to the client with a correct HTTP header. The TCP connection is not closed as long as the client wants to receive new frames and the server wants to provide new frames. Two basic implementations of such a server are the test server "cambozola" and the webcam server "MJPG-Streamer".

Client software: Browsers such as Safari, Google Chrome and Opera stream M-JPEG natively.
See: http://en.wikipedia.org/wiki/Motion_JPEG#M-JPEG_over_HTTP
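For illustration, here is a minimal sketch of such an M-JPEG-over-HTTP server in Python. The names are assumptions: get_jpeg_frame() stands in for the proprietary library call (which you would wrap with ctypes, given the C-only DLL), and the port and pacing are arbitrary.

```python
# Minimal M-JPEG-over-HTTP server sketch.
# get_jpeg_frame() is a hypothetical stand-in for the proprietary
# library call (e.g. wrapped with ctypes); it must return one
# complete JPEG image as bytes.
import time
from http.server import BaseHTTPRequestHandler, ThreadingHTTPServer

BOUNDARY = "frameboundary"

def get_jpeg_frame() -> bytes:
    raise NotImplementedError("replace with the real frame source")

class MJPEGHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        self.send_response(200)
        self.send_header("Content-Type",
                         f"multipart/x-mixed-replace; boundary={BOUNDARY}")
        self.end_headers()
        try:
            while True:
                frame = get_jpeg_frame()
                # One multipart "part" per JPEG frame.
                self.wfile.write(f"--{BOUNDARY}\r\n".encode())
                self.wfile.write(b"Content-Type: image/jpeg\r\n")
                self.wfile.write(f"Content-Length: {len(frame)}\r\n\r\n".encode())
                self.wfile.write(frame)
                self.wfile.write(b"\r\n")
                time.sleep(1 / 15)  # crude pacing; a real source would block on new frames
        except (BrokenPipeError, ConnectionResetError):
            pass  # client disconnected

if __name__ == "__main__":
    ThreadingHTTPServer(("", 8080), MJPEGHandler).serve_forever()
```

A client such as VLC should then be able to open http://<server>:8080/ directly.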

Related

How to send an audio stream over SIP

I'm developing an application that receives an audio stream over a WebSocket and needs to forward the audio to a SIP server.
Currently, I've managed to connect to the audio source using a WebSocket and receive the media stream (u-law encoded) using Node-RED, but I'm struggling to figure out how to send the media stream to the SIP server. Any advice would be much appreciated.
I looked into this for a similar question a while back; I can't find where it was now.
As you probably know, the media part of SIP is RTP, so it's a fairly separate stack from the call signalling.
I didn't find any nodes that supported it and the few node.js libraries for RTP were all very incomplete and out of date.
In theory it might be possible to craft your own RTP streams using the UDP nodes and then create the relevant SDP in the SIP response, but I'm not sure how robust or scalable this would be.
The other option is that there are a couple of programmable communications platforms out there that support both SIP and WebSockets, so you could possibly utilise one of those and connect from Node-RED via WebSocket, letting them do the SIP work.
I've done SIP<->WebSocket work with both the Vonage API (previously Nexmo) and Jambonz (open source).
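To make the "craft your own RTP streams" option concrete, here is a rough sketch of what a hand-built RTP packet for u-law (PCMU, payload type 0) audio involves; Python is used for illustration, and the destination host/port plus the ulaw_chunks_of_160_bytes() source are hypothetical, standing in for whatever the SDP negotiation and the WebSocket feed provide.

```python
# Sketch of hand-crafted RTP packets for u-law (PCMU) audio.
import socket
import struct

RTP_VERSION = 2
PAYLOAD_TYPE_PCMU = 0     # RFC 3551 static payload type: PCMU/8000
SSRC = 0x12345678         # arbitrary stream identifier

def rtp_packet(seq: int, timestamp: int, payload: bytes) -> bytes:
    # 12-byte fixed header: V=2, P=0, X=0, CC=0, M=0, PT=0.
    header = struct.pack("!BBHII",
                         RTP_VERSION << 6,      # version + flag bits
                         PAYLOAD_TYPE_PCMU,     # marker bit clear
                         seq & 0xFFFF,
                         timestamp & 0xFFFFFFFF,
                         SSRC)
    return header + payload

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
seq, ts = 0, 0
# ulaw_chunks_of_160_bytes() is a hypothetical source yielding
# 20 ms of 8 kHz u-law audio (160 bytes) per chunk.
for chunk in ulaw_chunks_of_160_bytes():
    # Host/port are whatever the SIP/SDP negotiation produced.
    sock.sendto(rtp_packet(seq, ts, chunk), ("sip-media-host", 10000))
    seq += 1
    ts += 160   # 160 samples per 20 ms packet at 8 kHz
```

Even this ignores jitter, RTCP and timing, which is why the hosted platforms are usually the more robust route.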

WebRTC H.264 video live streaming (with FFmpeg) from OpenGL

I am trying to make a peer-to-peer game streaming platform. At this point I have managed to capture the OpenGL frames and I have a functional Java WebSocket server; I can have two clients establish a peer-to-peer connection (I have solved the STUN/TURN server part) and transfer text.
I do not quite understand how I could stream video made out of the OpenGL frames with low latency (<100 ms). The problem mainly lies in the FFmpeg part: I want to use it to encode the frames, get the result (stdin/stdout redirection for ffmpeg?), and somehow link that to the JS API of the host (maybe a local WebSocket that the host's JS connects to).
I have tried several FFmpeg arguments/commands with stdin and stdout pipes, and they did not work.
What WebRTC client are you using? What is the H.264 live stream flowing into?
WebRTC in the browser has a few restrictions (just because the implementation is naive). Try the constrained-baseline profile, and use a very small keyframe interval (one every second is usually good for a prototype!).
If you don't have a WebRTC client you can do something like webrtc-remote-screen
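To make the pipe approach concrete, here is a sketch of driving ffmpeg through stdin/stdout with the constrained-baseline profile and one-second keyframe interval suggested above; the frame size and rate are assumed values, and Python stands in for whatever the host process is.

```python
# Sketch: pipe raw OpenGL frames into ffmpeg, read H.264 from stdout.
# Width/height/fps are assumed values; adjust to the real capture.
import subprocess

WIDTH, HEIGHT, FPS = 1280, 720, 30

ffmpeg = subprocess.Popen(
    ["ffmpeg",
     "-f", "rawvideo", "-pix_fmt", "rgb24",
     "-s", f"{WIDTH}x{HEIGHT}", "-r", str(FPS),
     "-i", "-",                       # raw frames arrive on stdin
     "-c:v", "libx264",
     "-profile:v", "baseline",        # libx264's baseline is constrained baseline
     "-tune", "zerolatency",          # no lookahead/B-frames
     "-g", str(FPS),                  # keyframe every second
     "-f", "h264", "-"],              # Annex B bitstream on stdout
    stdin=subprocess.PIPE, stdout=subprocess.PIPE)

def push_frame(rgb_bytes: bytes) -> None:
    ffmpeg.stdin.write(rgb_bytes)     # one WIDTH*HEIGHT*3 buffer per frame

def read_encoded(n: int = 4096) -> bytes:
    return ffmpeg.stdout.read(n)      # forward to the WebRTC transport
```

Note that writing stdin and reading stdout from the same thread can deadlock once the pipe buffers fill; in practice the two ends belong on separate threads.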

What decides whether an FTP file is downloaded or streamed?

While trying to set up a streaming server on my Raspberry Pi, the instructions seem to consist of just installing an FTP server.
This made me wonder: what decides whether a file stored on the FTP server is downloaded or streamed?
In other words, is the choice of downloading or streaming dependent on the client side and not the server side?
If using FTP, streaming is implemented client side using the REST command (to set the start position), as explained at "How does a FTP server resume a download?" and (in more detail) at http://cr.yp.to/ftp/retr.html .
Your server therefore needs to allow the REST verb (most do by default). Throttling (flow control) is also managed client side.
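For illustration, this is what the client-side seek looks like with Python's standard ftplib, whose retrbinary() exposes REST through its rest argument (the server and file names here are hypothetical):

```python
# Client-side "streaming" from FTP: resume/seek via the REST command.
from ftplib import FTP

ftp = FTP("example.com")          # hypothetical server
ftp.login("user", "password")

offset = 1_000_000                # start 1 MB into the file
with open("video.mp4", "ab") as out:
    # retrbinary issues "REST 1000000" before "RETR video.mp4",
    # so the server starts sending from that byte offset.
    ftp.retrbinary("RETR video.mp4", out.write, rest=offset)
ftp.quit()
```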
Long story:
This mechanism is similar to the strategy used by HTTP. Streaming, however, is a wide subject, and there are other approaches. Some protocols provide extra verbs to signal events like changes of bandwidth/resolution to account for unstable connections (as in videoconference / desktop-share protocols). Some are more suitable for live broadcasting, and others for buffered/stored video.
Nowadays, most streaming players, like YouTube's, are web based and built on top of the HTTP protocol. Streaming is achieved using the HTTP Range header and by dividing the media into chunks that can be retrieved separately, as explained in this magnificent video: https://www.youtube.com/watch?v=OqQk7kLuaK4 .
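For comparison, the HTTP equivalent is a Range request; a minimal sketch with Python's standard library against a hypothetical URL:

```python
# Sketch: fetch one chunk of a media file with an HTTP Range request.
import urllib.request

req = urllib.request.Request(
    "http://example.com/video.mp4",          # hypothetical URL
    headers={"Range": "bytes=0-1048575"})    # first 1 MiB only
with urllib.request.urlopen(req) as resp:
    assert resp.status == 206                # 206 Partial Content
    chunk = resp.read()
```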

Server-side WebRTC (streaming camera)

Use Case (stream UDP video)
Stream a server-side webcam (robot) video over UDP to a client browser. We would rather lose packets than have the webcam struggle to keep up over a TCP connection via Wi-Fi, which constantly cuts out.
Attempted solution
Start an Xvfb-hosted Firefox browser on the server and have it stream the webcam media source. I don't like this solution, as it's not flexible for non-webcam video and is difficult to configure.
I'm looking for something that can stream an arbitrary media source to a WebRTC connection (including the greeting and handshaking). I don't particularly care which language it is; if something already exists in Node.js, Python, C, Java or Scala, I'll use it. Otherwise I suppose I'll get to work on the problem (in that case any guidance would be appreciated).
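One existing option worth noting: in Python, aiortc implements server-side WebRTC (ICE/DTLS handshaking included), and its MediaPlayer can feed any ffmpeg-readable source, not just a webcam. Since WebRTC media rides on SRTP over UDP, it also matches the prefer-to-lose-packets requirement. A rough sketch, with the signalling exchange (receive_offer()/send_answer()) left as hypothetical stand-ins since it is application-specific:

```python
# Sketch: server-side WebRTC with aiortc streaming a webcam track.
import asyncio
from aiortc import RTCPeerConnection, RTCSessionDescription
from aiortc.contrib.media import MediaPlayer

async def run():
    pc = RTCPeerConnection()
    # Any ffmpeg-readable source works here; a V4L2 webcam is one example.
    player = MediaPlayer("/dev/video0", format="v4l2",
                         options={"video_size": "640x480"})
    pc.addTrack(player.video)

    offer_sdp = await receive_offer()            # hypothetical signalling
    await pc.setRemoteDescription(
        RTCSessionDescription(sdp=offer_sdp, type="offer"))
    answer = await pc.createAnswer()
    await pc.setLocalDescription(answer)
    await send_answer(pc.localDescription.sdp)   # hypothetical signalling
    await asyncio.Event().wait()                 # keep streaming

asyncio.run(run())
```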

WebSocket VP9 video stream based on messages

I want to establish a video stream between a C# application and a browser.
I'm using WebSockets for the communication.
The video source is a webcam.
I am able to request single PNG frames, but it is slow as hell.
The WebSocket server (Ratchet) is message-based, but is it possible to use VP9 compression or something similar by using some kind of buffer?
WebSockets implement a messaging protocol over sockets, which is not desirable for video. I think a better-suited technology for this is WebRTC.
