HTTP streaming TO server FROM client (mobile device) - client-server

I know it's possible to stream content from a server to a web browser. I'm curious whether there is a method to do the same in reverse: once a page is requested and the connection established, keep it open and stream content from the client to the server without continually re-initiating a connection or POSTing to the server. I have tried searching for such a thing, but I always find information about HTTP streaming from server to client and not the other way around. This is specifically to target mobile devices and stream chunks of text up to the server.
Thanks in advance for any info or advice!
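One modern option is a streaming request body: fetch() accepts a ReadableStream as the POST body, so a single connection stays open while chunks are produced. A minimal sketch, assuming a runtime with streaming-upload support (in browsers this currently needs HTTPS/HTTP2 and the duplex option); the endpoint URL is a placeholder:

```ts
// Hedged sketch: one long-lived POST whose body is produced chunk by chunk.
// Assumes streaming request-body support (fetch + ReadableStream). The
// endpoint URL is a placeholder.
const encoder = new TextEncoder();
const body = new ReadableStream<Uint8Array>({
  async start(controller) {
    for (const piece of ["first ", "second ", "third"]) {
      controller.enqueue(encoder.encode(piece));
      await new Promise((r) => setTimeout(r, 1000)); // simulate text arriving over time
    }
    controller.close(); // ends the upload
  },
});

await fetch("https://example.com/ingest", {
  method: "POST",
  body,
  duplex: "half", // required for streaming bodies; absent from older DOM typings
} as RequestInit & { duplex: "half" });
```

Where that isn't available (notably on older mobile browsers), a plain WebSocket gives the same effect: open it once and call send() for each chunk of text as it is produced.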

Related

Dynamic URL that forwards all requests to dedicated Pod in Kubernetes

I am trying to use a Kubernetes cluster to do on-demand live transcoding of a video stream from multiple IP cameras and send it via WebSocket to a website.
I've modified a project I found online, written in Go, which takes a web request with a payload of the RTSP feed URL. It then uses that URL to start an FFmpeg process that accesses the stream, transcodes it to MPEG, and sends the MPEG data to another endpoint on the Go app, which starts a WebSocket connection. The response to the original request includes a WebSocket URL where the stream can be accessed; that URL can be put into an MPEG JS player and viewed in a browser over the WebSocket. The main advantage is that multiple clients can view the stream while only one stream comes from the camera, which reduces mobile data. I will also add that FFmpeg stops automatically after 60 seconds if no request is sent to the endpoint.
The problem I can't find a solution to is how to scale the above application in a Kubernetes cluster so that when a request comes in, it does the following:
-> Checks to see if someone is already viewing the stream
----> If someone is viewing, a pod and WebSocket connection already exist, so a URL pointing to that pod just needs to be sent back to the client.
----> If no one is viewing, a streaming/WebSocket pod needs to be created, and once created, a URL sent back to the client to access the stream.
I've looked at ingress controllers; perhaps dynamically updating the ingress resource is a solution, or possibly using a service mesh, but these are all new to me.
If anyone has any input to give me a bit of direction down a path, it would be much appreciated.
Many Thanks
Luke
Having your application dynamically configure K8s sounds like a complicated solution, with possible performance implications due to a very large number of ingress objects. You would also have to look into how to clean those up, and doing this requires a K8s client in your app.
I would suggest solving this without the help of K8s resources by simply having your application return an HTTP redirect (a temporary one such as 302/307 rather than a 301, since the pod URL can change) to that WebSocket URL if need be.
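A hedged sketch of that idea: the app keeps its own registry of live streams and answers with a redirect. The names activeStreams and startTranscoderPod are illustrative placeholders, not part of any real project:

```ts
// Hedged sketch of the redirect approach suggested above.
import * as http from "node:http";

const activeStreams = new Map<string, string>(); // camera id -> WebSocket URL

async function startTranscoderPod(cameraId: string): Promise<string> {
  // Create the pod / start FFmpeg here; a fake URL stands in for the result.
  return `wss://stream.example.com/${cameraId}`;
}

http.createServer(async (req, res) => {
  const cameraId =
    new URL(req.url ?? "/", "http://placeholder").searchParams.get("camera") ?? "";
  let wsUrl = activeStreams.get(cameraId);
  if (!wsUrl) {
    // No one is viewing yet: spin up the streaming pod and remember its URL.
    wsUrl = await startTranscoderPod(cameraId);
    activeStreams.set(cameraId, wsUrl);
  }
  // 307: temporary, because the target pod may be torn down after 60 s idle.
  res.writeHead(307, { Location: wsUrl });
  res.end();
}).listen(8080);
```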

Is it possible to emit a stream object through socket.io

For my app, I'm streaming audio data from a Raspberry Pi client to my Node.js audio service through Socket.IO. The problem is that, to process the audio, my service pipes the audio stream from the client to an external service. The external service then gives the resulting audio stream back to my service, and my service emits it to the client.
So my application flow is like
Client ---socket.io-stream---> audio_service ---stream---> external_service
external_service ---stream---> audio_service ---socket.io-stream---> client
My questions are:
Is it possible that when a client connects to my audio_service, the audio_service initiates a connection to the external_service and emits that connection back to the client through Socket.IO? That way the client would stream audio directly to the external_service over the returned connection instead of going through the audio_service.
If so, is it also possible that even though the client streams audio directly to the external_service, the result stream is still sent back to the audio_service?
Thank you very much for your help
It isn't possible to send a stream through Socket.IO the way it is set up today. Some folks have made some add-ons that emulate streams in the usual evented RPC way, but it isn't very efficient.
The best library I know for this is Binary.JS. It gives you streams multiplexed over a single binary WebSocket connection. Unfortunately, Binary.js hasn't been maintained in a while, but it still works.
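For illustration, a hedged server-side sketch of that Binary.js setup (the package ships no TypeScript typings, so it is loaded untyped; port and file naming are placeholders):

```ts
// Hedged sketch of the Binary.js approach, server side on Node.
import * as fs from "node:fs";
const { BinaryServer } = require("binaryjs");

const server = new BinaryServer({ port: 9000 });
server.on("connection", (client: any) => {
  // Each incoming stream arrives multiplexed over the client's single WebSocket.
  client.on("stream", (stream: any, meta: any) => {
    stream.pipe(fs.createWriteStream(`upload-${meta?.name ?? "audio"}.raw`));
  });
});
```

On the Raspberry Pi side, new BinaryClient(url) followed by client.send(readableAudioStream, { name: "mic" }) pipes the audio up the same multiplexed socket.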

What decides FTP download vs. stream?

While trying to set up a streaming server on my Raspberry Pi, the instructions seem to consist of just installing an FTP server.
This made me wonder: what decides whether a file stored on the FTP server is downloaded or streamed?
In other words, is the choice of downloading or streaming dependent on the client side and not the server side?
If using FTP, streaming is implemented client-side using the REST command (for the start position), as explained at How does a FTP server resume a download? and (in more detail) at http://cr.yp.to/ftp/retr.html .
Your server therefore needs to allow the REST verb (most do by default). Throttling (flow control) is also managed client-side.
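In practice most FTP client libraries expose that restart offset directly. A minimal sketch, assuming the basic-ftp npm package, whose downloadTo() start offset is sent to the server as a REST command before RETR; host, credentials, and paths are placeholders:

```ts
// Hedged sketch: read a remote file starting from a byte offset over FTP.
import * as fs from "node:fs";
import { Client } from "basic-ftp";

const client = new Client(30_000); // 30 s timeout
await client.access({ host: "ftp.example.com", user: "pi", password: "secret" });
// Start reading video.mp4 at byte 1,048,576; the client issues "REST 1048576".
await client.downloadTo(fs.createWriteStream("tail.mp4"), "video.mp4", 1_048_576);
client.close();
```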
Long story:
This mechanism is similar to the strategy used by HTTP too. Streaming, however, is a wide subject, and there are other approaches. Some protocols provide extra verbs to signal events like changes of bandwidth/resolution to account for unstable connections (as in videoconference/desktop-share protocols). Some are more suitable for live broadcasting, others for buffered/stored video.
Nowadays, most streaming players, like YouTube's, are web-based and built on top of HTTP. Streaming is achieved using the HTTP Range header and by dividing the media into chunks that can be retrieved separately, as explained in this magnificent video: https://www.youtube.com/watch?v=OqQk7kLuaK4 .
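For the HTTP side, a single ranged request looks like the sketch below (URL and byte offsets are placeholders):

```ts
// Hedged sketch: fetch one chunk of a media file with an HTTP Range request.
const res = await fetch("https://example.com/video.mp4", {
  headers: { Range: "bytes=1048576-2097151" }, // the second 1 MiB chunk
});
const chunk = new Uint8Array(await res.arrayBuffer());
// A range-capable server answers 206 Partial Content plus a Content-Range header.
console.log(res.status, res.headers.get("content-range"), chunk.byteLength);
```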

Streaming Media to Server using web socket

I am a newbie to WebRTC. I would like to send the media stream from the client (JavaScript) to my server (via WebSockets). On my server I will be doing some processing on that media content.
Could you please show me a client code snippet for sending a media stream to the server via WebSocket?
Thanks
Ganesh.R
Nobody can show you this, because you cannot send the stream via WebSockets. You need to read a little more about WebRTC.
WebRTC gives you the ability to request access to media devices from JavaScript, and lets you create a PeerConnection that establishes a connection to another endpoint to send the streams captured from those devices, or some raw data (using a DataChannel).
You won't have access to the stream's data to send via WebSockets. Instead, the browser will send it over UDP or TCP using the SRTP protocol. If you want to get media streams on the server side, you will need to implement this protocol and some negotiation to establish the connection.
HTML5Rocks has a great introduction with code snippets to get started.
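For a feel of the flow just described, here is a hedged browser-side sketch; the signalling URL and message shape are placeholders, since WebRTC leaves signalling entirely up to you:

```ts
// Hedged sketch of the WebRTC flow described above (browser side).
const signalling = new WebSocket("wss://example.com/signal");
const stream = await navigator.mediaDevices.getUserMedia({ audio: true, video: true });
const pc = new RTCPeerConnection();
// The media itself travels over SRTP to the peer, not over the WebSocket.
stream.getTracks().forEach((track) => pc.addTrack(track, stream));
const offer = await pc.createOffer();
await pc.setLocalDescription(offer);
// Only the SDP offer/answer (and ICE candidates) use your own channel.
signalling.addEventListener("open", () =>
  signalling.send(JSON.stringify({ type: "offer", sdp: offer.sdp }))
);
```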

Icecast server status

Does anybody know how to check whether a broadcast is online or offline on an Icecast2 server?
Ruby preferred.
I guess you can make a TCP (HTTP) connection to the specified server. An Icecast server works as a regular HTTP server, but the data transfers are actually streams. So all you need is to make a regular socket connection and send a request (you can grab one from the Live HTTP Headers extension in Firefox). You might also want to set a timeout in case the server is down. If the server responds with an HTTP OK (200) code, the broadcast is live.
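The question asks for Ruby, but as a hedged illustration of the idea (TypeScript on Node; host, port, and mount name are placeholders): probe the mount point, treat a timely 200 as online, and anything else (404, timeout, connection error) as offline.

```ts
// Hedged sketch: treat HTTP 200 from the mount point as "broadcast online".
// Icecast answers 404 on a mount with no connected source.
import * as http from "node:http";

function broadcastOnline(host: string, port: number, mount: string): Promise<boolean> {
  return new Promise((resolve) => {
    const req = http.request(
      { host, port, path: mount, method: "GET", timeout: 5000 },
      (res) => {
        resolve(res.statusCode === 200); // headers are enough; skip the stream body
        req.destroy();
      }
    );
    req.on("timeout", () => req.destroy(new Error("timeout"))); // server down/unreachable
    req.on("error", () => resolve(false));
    req.end();
  });
}

broadcastOnline("icecast.example.com", 8000, "/live").then((ok) =>
  console.log(ok ? "online" : "offline")
);
```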
