Serve video while recording (Spring Boot)

I am trying to figure out how I can stream a video, which is currently being recorded and sent to a Spring Boot-based server, to a client.
I am not sure how to receive the data (maybe save it as an array of bytes and append to it each time?).
And how should the client request it? Would a "give me the last 100 bytes" request work?
Thank you
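One common shape for this is close to what you describe: append incoming chunks to a growing byte buffer (or file) on the server, and have each client poll from its own current offset rather than asking for "the last 100 bytes", so nothing is skipped or duplicated when polls and appends interleave. Below is a minimal sketch of that append-and-offset idea, in JavaScript for brevity; in Spring Boot the same shape would map onto a POST endpoint that appends and a GET endpoint that honors an offset parameter or an HTTP Range header. The class and method names are illustrative, not from any framework.

```javascript
// Toy in-memory recording buffer: the server appends chunks as they
// arrive from the recorder, and each client reads from its own offset.
class RecordingBuffer {
  constructor() {
    this.chunks = []; // appended Buffers
    this.length = 0;  // total bytes stored so far
  }

  // Called whenever a new chunk arrives from the recording device.
  append(chunk) {
    this.chunks.push(chunk);
    this.length += chunk.length;
  }

  // Called by a client: "give me everything from `offset` onward".
  // Returns the new data plus the offset to use on the next poll.
  readFrom(offset) {
    const all = Buffer.concat(this.chunks, this.length);
    const data = all.subarray(Math.min(offset, this.length));
    return { data, nextOffset: this.length };
  }
}

// Usage: the recorder appends, a client polls with its last offset.
const rec = new RecordingBuffer();
rec.append(Buffer.from('frame1'));
let { data, nextOffset } = rec.readFrom(0);        // gets "frame1"
rec.append(Buffer.from('frame2'));
({ data, nextOffset } = rec.readFrom(nextOffset)); // gets only "frame2"
```

For anything beyond a prototype you would append to a file instead of memory and let the GET endpoint serve byte ranges from it, since the recording grows without bound.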

Related

Dynamic URL that forwards all requests to dedicated Pod in Kubernetes

I am trying to use a Kubernetes cluster to do on-demand live transcoding of video streams from multiple IP cameras, and send them via WebSocket to a website.
I've modified a project I found online, written in Go. It takes a web request whose payload is the RTSP feed URL, then uses that URL to start an FFmpeg process that accesses the stream, transcodes it to MPEG, and sends the MPEG data to another endpoint on the Go app, which starts a WebSocket connection. The response to the original request includes the WebSocket URL where the stream can be accessed. That URL can be put into an MPEG JS player and viewed in a browser over the WebSocket. The main advantage is that multiple clients can view the stream while only one stream comes from the camera, which reduces mobile data. I will also add that FFmpeg automatically stops after 60 seconds if no request is sent to the endpoint.
The problem I can't find a solution to is how to scale the above application in a Kubernetes cluster, so that when a request comes in it does the following:
-> Check whether someone is already viewing the stream.
----> If someone is viewing, a pod and WebSocket connection already exist, so a URL pointing to that pod just needs to be sent back to the client.
----> If no one is viewing, a streaming/WebSocket pod needs to be created, and once it is ready a URL is sent back to the client to access the stream.
I've looked at ingress controllers; dynamically updating the ingress resource might be a solution, or possibly using a service mesh, but these are all new to me.
If anyone has any input to give me a bit of direction down a path, it would be much appreciated.
Many Thanks
Luke
Having your application dynamically configure K8s sounds like a complicated solution, with possible performance implications due to a very large number of ingress objects. You might have to look into how to clean those up. Also, doing this requires a K8s client in your app.
I would suggest solving this without the help of K8s resources, by simply having your application return an HTTP redirect (301) to that WebSocket URL if need be.
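A sketch of that redirect idea, with the stream registry kept inside the app rather than in K8s objects. The names `resolveStream` and `startTranscoderPod` are hypothetical placeholders for your own routing handler and pod-creation call; pod creation would be asynchronous in reality, but is shown synchronously here for brevity.

```javascript
// In-app registry mapping camera feed URLs to live WebSocket endpoints.
// No K8s client is needed in the request path: if a stream is already
// live we redirect to it; otherwise we start one, then redirect.
const liveStreams = new Map(); // rtspUrl -> WebSocket URL of its pod

function resolveStream(rtspUrl, startTranscoderPod) {
  if (!liveStreams.has(rtspUrl)) {
    // No viewer yet: create the streaming pod (stubbed here) and
    // remember its WebSocket URL for later viewers.
    liveStreams.set(rtspUrl, startTranscoderPod(rtspUrl));
  }
  // Either way, answer with a redirect to the pod's WebSocket URL.
  return { status: 301, location: liveStreams.get(rtspUrl) };
}
```

When FFmpeg's 60-second idle timeout fires, the pod's shutdown path would delete its entry from the registry so the next request creates a fresh pod.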

How would you be able to achieve real-time video with Websocket?

Suppose I have a web application client that connects/subscribes to a WebSocket server, and this WebSocket server relays the binary sent from clients to subscribers.
The client sends chunks of WebM-recorded video (e.g. every 1 second), and the server sends those chunks to every client to display the video stream.
My issue is that when the network slows down, the outgoing WebM "buffer" piles up and a noticeable lag appears. So if there is a connection problem for 15 seconds, there will be 15 chunks queued, and the WebSocket server will just broadcast those chunks to subscribers, causing those clients to play back a stream that is 15 seconds in the past; this is no longer real-time at all.
What is the general approach to be able to achieve real-time video with Websocket?
P.S. I understand there's WebRTC, but in my case I have to use WebSocket.
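The general approach is to stop treating the stream as a reliable queue: when a consumer falls behind, drop stale chunks and resume near-live instead of replaying everything. Here is a toy sketch of a bounded outgoing buffer that keeps only the newest chunks (the class name is illustrative, not from any library):

```javascript
// Bounded outgoing buffer: when the network stalls and more than
// `capacity` chunks pile up, the oldest are dropped so playback
// resumes near-live instead of 15 seconds in the past.
class LatestChunksBuffer {
  constructor(capacity) {
    this.capacity = capacity;
    this.chunks = [];
    this.dropped = 0; // count of stale chunks discarded
  }

  push(chunk) {
    this.chunks.push(chunk);
    while (this.chunks.length > this.capacity) {
      this.chunks.shift(); // discard the oldest (stale) chunk
      this.dropped++;
    }
  }

  // Drain whatever is current once the socket is writable again.
  drain() {
    const out = this.chunks;
    this.chunks = [];
    return out;
  }
}
```

One caveat: WebM chunks are not independently decodable, so after a drop the client generally has to be restarted from a keyframe/cluster boundary. Handling exactly this (jitter, congestion, frame dropping) is what WebRTC does for you; over plain WebSocket you have to implement the dropping policy yourself, as sketched above.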

How can I send js mediaStream to server and return the processed stream?

I want to send a live webcam stream from a website to my server, where the server will do some processing on the frames and return the processed stream. I'm thinking about using WebRTC to send the live stream to the server (with the server as a peer), and returning the processed frames as images via WebSocket. Is there an easier way to do this?
getUserMedia can capture a video stream, but you cannot send the stream itself to your server (you could, if your server were able to parse it back into complete pictures).
So the easier way to solve this problem is to POST the pictures captured via getUserMedia to the server, and have the server return the processed pictures. You can use a canvas to display them.
You can read Computer Vision on the Web with WebRTC and TensorFlow.
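A sketch of that picture-by-picture cycle, with the browser pieces injected as functions so the control flow is visible on its own. `captureFrame`, `postFrame`, and `showFrame` are stand-ins: in a real page, `captureFrame` would draw the `<video>` element onto a canvas and return `canvas.toDataURL('image/jpeg')`, `postFrame` would be an awaited `fetch` POST returning the processed picture, and `showFrame` would paint it onto a second canvas.

```javascript
// One round of the capture -> POST -> display cycle described above.
// The POST would be asynchronous in the browser; it is shown as a
// plain call here so the sequencing is easy to follow.
function processOneFrame(captureFrame, postFrame, showFrame) {
  const raw = captureFrame();       // grab the current webcam frame
  const processed = postFrame(raw); // server does the vision work
  showFrame(processed);             // draw the result for the user
  return processed;
}
```

Running this on a timer (e.g. `setInterval` at 5-10 fps) gives a simple processed "stream" without any WebRTC server-side peer.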

Is it possible to emit stream object through socket.io

For my app I'm streaming audio data from a Raspberry Pi client to my Node.js audio service through Socket.IO. The problem is that, to process the audio, my service pipes the audio stream from the client to an external service. The external service then streams the resulting audio back to my service, and my service emits it to the client.
So my application flow is like
Client ---socket.io-stream---> audio_service ---stream---> external_service
external_service ---stream---> audio_service ---socket.io-stream---> client
My questions are:
Is it possible that when a client connects to my audio_service, the audio_service initiates a connection to the external_service and emits that connection back to the client through Socket.IO? That way, the client would stream audio directly to the external_service using the returned connection, instead of going through the audio_service.
If that is possible, is it also possible that even though the client streams audio directly to the external_service, the resulting stream is still sent back to the audio_service?
Thank you very much for your help
It isn't possible to send a stream through Socket.IO the way it is set up today. Some folks have made add-ons that emulate streams in the usual evented RPC way, but they aren't very efficient.
The best library I know of for this is Binary.js. It gives you streams multiplexed over a single binary WebSocket connection. Unfortunately, Binary.js hasn't been maintained in a while, but it still works.
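For context, the core trick such libraries perform is multiplexing: tagging every binary chunk with a stream id so that many logical streams share one WebSocket. A toy framing (this is not Binary.js's actual wire format, just an illustration of the idea) makes it concrete:

```javascript
// Toy multiplexing frame: 1 byte of stream id followed by the payload,
// so several logical streams can share one binary WebSocket connection.
function encodeFrame(streamId, payload) {
  const frame = Buffer.alloc(1 + payload.length);
  frame.writeUInt8(streamId, 0);
  payload.copy(frame, 1);
  return frame;
}

function decodeFrame(frame) {
  return { streamId: frame.readUInt8(0), payload: frame.subarray(1) };
}

// Demultiplexer: route each incoming frame to its stream's handler.
function demux(handlers) {
  return (frame) => {
    const { streamId, payload } = decodeFrame(frame);
    if (handlers[streamId]) handlers[streamId](payload);
  };
}
```

This is also why the "hand the client a direct connection" idea in the question doesn't work as stated: a socket is a kernel resource on the service's machine and can't be emitted over Socket.IO; what you can send is an address (or token) that lets the client open its own connection to the external_service.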

HTTP streaming TO server FROM client (mobile device)

I know it's possible to stream content from a server to a web browser. I'm curious whether there is a method to do the same in reverse: once a page is requested and a connection established, keep it open and stream content from the client to the server without continually re-initiating a connection or POSTing. I have searched for such a thing, but I always find information about HTTP streaming from server to client, not the other way around. This is specifically targeting mobile devices, streaming chunks of text up to the server.
Thanks in advance for any info or advice!
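At the HTTP level, the mechanism that allows a request body to be produced over time is chunked transfer encoding (HTTP/1.1, RFC 7230 §4.1): the client opens one request with `Transfer-Encoding: chunked` and keeps writing length-prefixed chunks until a zero-length chunk terminates the body. Browser support for streaming request bodies has historically been the sticking point, which is why WebSockets are the usual practical route from a mobile browser. A toy encoder showing the chunked wire framing:

```javascript
// HTTP/1.1 chunked transfer encoding: each chunk is its byte length in
// hex, CRLF, the bytes themselves, CRLF; a zero-length chunk ends the
// body. A client that can keep one request open can stream text to the
// server by emitting frames like these over time.
function encodeChunk(data) {
  return `${Buffer.byteLength(data).toString(16)}\r\n${data}\r\n`;
}

const lastChunk = '0\r\n\r\n'; // terminates the streamed body

// Streaming "hello", then "world", then closing the body:
const wire = encodeChunk('hello') + encodeChunk('world') + lastChunk;
```

Native mobile apps (and Node.js clients via `http.request`) can write request bodies incrementally like this; from a web page, a WebSocket gives you the same "one connection, many upward chunks" shape with much broader support.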
