How can I send a JS MediaStream to the server and return the processed stream? - websocket

I want to send a live webcam stream from a website to my server; the server will do some processing on the frames and return the processed stream. I'm thinking about using WebRTC to send the live stream to the server (with the server acting as a peer), and returning the processed frames as images via WebSocket. Is there an easier way to do this?

getUserMedia (navigator.mediaDevices.getUserMedia) can capture a video stream, but you cannot send that stream to your server directly. (You could, if your server were able to parse the data back into complete pictures.)
So the easier way to solve this problem is to post the pictures captured from the getUserMedia stream to the server, and have the server return the processed pictures. You can use a canvas to grab and display those pictures.
You can read Computer Vision on the Web with WebRTC and TensorFlow for a full walkthrough of this approach.
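For illustration, a rough sketch of that frame-by-frame approach; the endpoint URL, the #out image element, and the ~10 fps rate are assumptions, not part of the original answer:

    // Draw the camera video to a canvas, encode each frame as a JPEG blob,
    // and ship it over a WebSocket; the server replies with a processed image.
    const ws = new WebSocket('wss://example.com/frames'); // hypothetical endpoint
    const video = document.querySelector('video');
    const canvas = document.createElement('canvas');

    navigator.mediaDevices.getUserMedia({ video: true }).then((stream) => {
      video.srcObject = stream;
      video.play();
    });

    function sendFrame() {
      if (!video.videoWidth) return; // camera not ready yet
      canvas.width = video.videoWidth;
      canvas.height = video.videoHeight;
      canvas.getContext('2d').drawImage(video, 0, 0);
      canvas.toBlob((blob) => ws.send(blob), 'image/jpeg', 0.7);
    }

    // Display each processed frame the server sends back.
    ws.onmessage = (event) => {
      document.querySelector('#out').src = URL.createObjectURL(event.data);
    };

    setInterval(sendFrame, 100); // ~10 frames per second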

Related

WebRTC H264 video live streaming (w FFMPEG) from OpenGL

I am trying to make a peer-to-peer game streaming platform. At this point I have managed to capture the OpenGL frames, and I have a functional Java WebSocket server; I can have two clients establish a peer-to-peer connection (I have solved the STUN/TURN server part) and transfer text at this point.
I do not quite understand how I could stream a video made out of the OpenGL frames with low latency (<100 ms). The problem mainly lies in the FFmpeg part: I want to use it to encode the frames, get the result (stdin/stdout redirection for ffmpeg?), and somehow link that to the JS API of the host (maybe a local WebSocket that the hoster's JS will connect to).
I tried several FFmpeg arguments/commands with stdin and stdout pipes and they did not work.
What WebRTC client are you using? What is the H264 live stream flowing into?
WebRTC in the browser has a few restrictions (just because the implementation is naive). Try the constrained-baseline profile, and use a very small keyframe interval (one per second is usually good for a prototype!); see the ffmpeg sketch below.
If you don't have a WebRTC client you can use something like webrtc-remote-screen.
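For the stdin/stdout piping, a minimal sketch (shown in Node.js for brevity; the same piping works from Java's ProcessBuilder). The frame size, pixel format, and frame rate are assumptions to replace with your capture settings:

    // Spawn ffmpeg, feed raw OpenGL frames on stdin, read H264 on stdout.
    const { spawn } = require('child_process');

    const WIDTH = 1280, HEIGHT = 720, FPS = 30; // assumed capture settings

    const ffmpeg = spawn('ffmpeg', [
      '-f', 'rawvideo',         // input: raw pixel data on stdin
      '-pix_fmt', 'rgba',       // glReadPixels commonly yields RGBA
      '-s', `${WIDTH}x${HEIGHT}`,
      '-r', String(FPS),
      '-i', '-',                // read frames from stdin
      '-c:v', 'libx264',
      '-profile:v', 'baseline', // constrained baseline, as suggested above
      '-preset', 'ultrafast',
      '-tune', 'zerolatency',
      '-g', String(FPS),        // one keyframe per second
      '-f', 'h264',             // raw Annex B bitstream
      '-',                      // write encoded output to stdout
    ]);

    ffmpeg.stdout.on('data', (chunk) => {
      // hand the encoded H264 to your WebRTC/WebSocket layer here
    });

    // For each captured frame: ffmpeg.stdin.write(frameBuffer);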

How would you be able to achieve real-time video with WebSocket?

Suppose I have a web application client that connects/subscribes to a WebSocket server, and this WebSocket server broadcasts the binary data it receives from clients to all subscribers.
The client sends chunks of WebM-recorded video (e.g. every 1 second), and the server sends those chunks to every client to display the video stream.
My issue is that when the network slows down, the unsent "buffer" of WebM chunks piles up and a noticeable lag appears. So if there's a connection problem for 15 seconds, those 15 chunks get sent afterwards, the WebSocket server broadcasts them all to subscribers, and those clients play back a stream that is 15 seconds in the past; this is no longer real-time at all.
What is the general approach for achieving real-time video with WebSocket?
P.S. I understand there's WebRTC, but in my case I have to use WebSocket.
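One common mitigation is to drop stale chunks for slow subscribers instead of queueing them, so playback stays near real time at the cost of skipped frames. A rough sketch using the Node ws package; the port and the 512 KB threshold are arbitrary assumptions:

    // Broadcast each incoming chunk, but skip subscribers whose socket send
    // buffer is already backed up; slow clients drop frames instead of lagging.
    const WebSocket = require('ws');
    const wss = new WebSocket.Server({ port: 8080 });
    const MAX_BUFFERED = 512 * 1024; // bytes of unsent data we tolerate

    wss.on('connection', (ws) => {
      ws.on('message', (chunk) => {
        for (const client of wss.clients) {
          if (client === ws || client.readyState !== WebSocket.OPEN) continue;
          if (client.bufferedAmount > MAX_BUFFERED) continue; // drop, don't queue
          client.send(chunk);
        }
      });
    });

Note that a WebM stream can only be rejoined cleanly at a keyframe, so dropping chunks pairs best with a short keyframe interval on the encoder.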

Is it possible to emit a stream object through socket.io?

For my app I'm streaming audio data from a Raspberry Pi client to my Node.js audio service through socket.io. The problem is that, to process the audio, I'm piping the audio stream from the client in my service to an external service. This external service then gives the resulting audio stream back to my service, and my service emits it to the client.
So my application flow looks like this:
Client ---socket.io-stream---> audio_service ---stream---> external_service
external_service ---stream---> audio_service ---socket.io-stream---> client
My questions are:
Is it possible that when a client connects to my audio_service, audio_service initiates a connection to external_service and emits that connection back to the client through socket.io? This way the client would stream audio directly to external_service using the returned connection, instead of going through audio_service.
If that is possible, is it also possible that, even though the client streams audio directly to external_service, the result stream is still sent back to audio_service?
Thank you very much for your help.
It isn't possible to send a stream through Socket.IO the way it is set up today. Some folks have made add-ons that emulate streams in the usual evented RPC way, but they aren't very efficient.
The best library I know of for this is Binary.js. It gives you streams multiplexed over a single binary WebSocket connection. Unfortunately, Binary.js hasn't been maintained in a while, but it still works.
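For reference, a rough sketch of the relay pattern with the socket.io-stream add-on the question already uses, assuming io is your Socket.IO server instance and connectExternal() stands in for however you open the duplex connection to external_service:

    // Inside audio_service: pipe the client's audio out to the external
    // service, and pipe the processed result back to the client.
    const ss = require('socket.io-stream');

    io.on('connection', (socket) => {
      const external = connectExternal(); // placeholder duplex stream

      // client ---socket.io-stream---> audio_service ---stream---> external_service
      ss(socket).on('audio-in', (stream) => {
        stream.pipe(external);
      });

      // external_service ---stream---> audio_service ---socket.io-stream---> client
      const out = ss.createStream();
      ss(socket).emit('audio-out', out);
      external.pipe(out);
    });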

Streaming Media to Server using WebSocket

I am a newbie to WebRTC. I would like to send the media stream from the client (JavaScript) to my server via WebSockets. On the server I will be doing some processing on that media content.
Could you please show me a client code snippet for sending a media stream to the server via WebSocket?
Thanks,
Ganesh.R
Nobody can show you this, because you cannot send the stream via WebSockets. You need to read a little more about WebRTC.
WebRTC gives you the possibility to request access to media devices from JavaScript, and allows you to create a PeerConnection that will establish a connection to another endpoint to send the streams captured from the devices, or some raw data (using a DataChannel).
You won't have access to the stream's data to send via WebSockets. Instead, the browser will send it over UDP or TCP using the SRTP protocol. If you want to receive media streams on the server side, you will need to implement this protocol and some negotiation to establish the connection.
HTML5Rocks has a great introduction with code snippets to get started.
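As a rough sketch of what that looks like on the client; sendToServer() is a placeholder for your own signaling channel, which can happily be a WebSocket:

    // Capture the devices and attach the tracks to a PeerConnection; only the
    // signaling travels over the WebSocket, the media itself goes over SRTP.
    const pc = new RTCPeerConnection({
      iceServers: [{ urls: 'stun:stun.l.google.com:19302' }],
    });

    pc.onicecandidate = ({ candidate }) => {
      if (candidate) sendToServer({ candidate }); // trickle ICE to the server
    };

    navigator.mediaDevices.getUserMedia({ video: true, audio: true })
      .then((stream) => {
        stream.getTracks().forEach((track) => pc.addTrack(track, stream));
        return pc.createOffer();
      })
      .then((offer) => pc.setLocalDescription(offer))
      .then(() => sendToServer({ sdp: pc.localDescription }));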

Live streaming of images in a LAN

I have a proprietary library that generates JPEG images at 10-20 Hz. I'd like to stream the images as a video stream over the network, so that a remote client (VLC for example) will be able to view it.
The clients are all on a LAN and there are no restrictions on the streaming protocol or the video format. The environment is Windows 7/XP, and the library DLL exports a C-only API.
Is there a recommended library that allows streaming image frames injected in real time? The streaming libraries I know of (VLC and Live555) do not allow this, AFAIK.
M-JPEG defines streaming over HTTP by sending individual images. This protocol is understood by VLC.
From Wikipedia:
M-JPEG over HTTP

HTTP streaming separates each image into individual HTTP replies on a specified marker. RTP streaming creates packets of a sequence of JPEG images that can be received by clients such as QuickTime or VLC. The server software mentioned above streams the sequence of JPEGs over HTTP. A special MIME content type, multipart/x-mixed-replace;boundary=, informs the browser to expect several parts as the answer, separated by a special boundary. This boundary is defined within the MIME type. For M-JPEG streams the JPEG data is sent to the client with a correct HTTP header. The TCP connection is not closed as long as the client wants to receive new frames and the server wants to provide new frames. Two basic implementations of such a server are the test server "cambozola" and the webcam server "MJPG-Streamer".

Client software

Browsers such as Safari, Google Chrome and Opera stream M-JPEG natively.
See: http://en.wikipedia.org/wiki/Motion_JPEG#M-JPEG_over_HTTP
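To make the protocol concrete, a small illustrative server in Node (the question's frame source is a C DLL, so treat getNextJpeg() as a placeholder for your library's output):

    // Serve an M-JPEG stream: one multipart/x-mixed-replace response whose
    // parts are individual JPEG frames separated by a declared boundary.
    const http = require('http');
    const BOUNDARY = 'frame';

    http.createServer((req, res) => {
      res.writeHead(200, {
        'Content-Type': `multipart/x-mixed-replace; boundary=${BOUNDARY}`,
        'Cache-Control': 'no-cache',
      });

      const timer = setInterval(() => {
        const jpeg = getNextJpeg(); // placeholder: Buffer holding one JPEG
        res.write(`--${BOUNDARY}\r\n` +
                  'Content-Type: image/jpeg\r\n' +
                  `Content-Length: ${jpeg.length}\r\n\r\n`);
        res.write(jpeg);
        res.write('\r\n');
      }, 1000 / 15); // ~15 Hz, within the library's 10-20 Hz range

      req.on('close', () => clearInterval(timer)); // client went away
    }).listen(8080);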
