I am trying to use a Kubernetes cluster to do on-demand live transcoding of a video stream from multiple IP cameras and send it via websocket to a website.
I've modified a project I found online, written in Go, which takes a web request with a payload of the RTSP feed URL. It then uses the URL to start an FFmpeg process that accesses the stream, transcodes it to MPEG, and sends the MPEG data to another endpoint on the Go app, which starts a websocket connection. The response to the original request includes a websocket URL for when the stream can be accessed; that URL can be put into an MPEG JS player and viewed in a browser over the websocket. The main advantage is that multiple clients can view the stream while there is only one stream coming from the camera, which reduces mobile data. I will also add that FFmpeg automatically stops after 60 seconds if a request isn't sent to an endpoint.
The problem I can't find a solution to is how to scale the above application in a Kubernetes cluster, so that when a request comes in it does the below:
-> Checks to see if someone is already viewing stream
----> If someone is viewing, a pod and websocket connection have already been created, so a URL pointing to that pod just needs to be sent back to the client.
----> If no one is viewing, a streaming/websocket pod needs to be created, and once it is ready a URL is sent back to the client to access the stream.
I've looked at ingress controllers, and possibly dynamically updating the ingress resource is a solution, or maybe even using a service mesh, but these are all new to me.
If anyone has any input to give me a bit of direction down a path, it would be much appreciated.
Many Thanks
Luke
Having your application dynamically configure K8s sounds like a complicated solution with possible performance implications due to a very large number of ingress objects. You might have to look into how to clean those up. Also, doing this requires a K8s client in your app.
I would suggest solving this without the help of K8s resources by simply having your application return an HTTP redirect (301) to that websocket URL if need be.
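A minimal sketch of that approach in Go, the language the app is already written in. The in-memory registry, the /watch endpoint and the startStreamingPod helper are made up for illustration; in a real cluster the lookup would live somewhere shared (Redis, a ConfigMap, etc.) rather than in process memory:

package main

import (
	"net/http"
	"sync"
)

// streams maps an RTSP feed URL to the websocket URL of the pod that is
// already transcoding it.
var (
	mu      sync.RWMutex
	streams = map[string]string{}
)

func handleWatch(w http.ResponseWriter, r *http.Request) {
	feed := r.URL.Query().Get("feed")

	mu.RLock()
	wsURL, ok := streams[feed]
	mu.RUnlock()

	if ok {
		// Someone is already viewing: just point this client at the
		// existing pod's websocket endpoint. A 302 is used here so the
		// browser doesn't cache the mapping permanently.
		http.Redirect(w, r, wsURL, http.StatusFound)
		return
	}

	// No one is viewing: launch a streaming pod, remember its websocket
	// URL, then redirect once it is reachable.
	wsURL = startStreamingPod(feed)
	mu.Lock()
	streams[feed] = wsURL
	mu.Unlock()
	http.Redirect(w, r, wsURL, http.StatusFound)
}

// startStreamingPod is a placeholder for whatever actually creates the
// ffmpeg/websocket pod and returns its externally reachable URL.
func startStreamingPod(feed string) string {
	return "ws://streams.example.com/" + feed
}

func main() {
	http.HandleFunc("/watch", handleWatch)
	http.ListenAndServe(":8080", nil)
}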
Related
I'm playing around with nginx and wanted to know if there is a way to take an incoming RTMP stream and redirect it to another server based on the URL used. For example:
rtmp://ingress.foo.com/live/<stream_key> would forward to rtmp://<stream_key>.internal.foo.com
Thanks.
While trying to set up a streaming server on my Raspberry Pi, the instructions seemed to consist of just installing an FTP server.
This made me wonder: what decides whether a file stored on the FTP server is downloaded or streamed?
In other words, is the choice of downloading or streaming dependent on the client side and not the server side?
If using FTP, streaming is implemented client-side using the REST command (which sets the start position), as explained at "How does a FTP server resume a download?" and (in more detail) at http://cr.yp.to/ftp/retr.html .
Your server therefore needs to allow the REST verb (most do by default). Throttling (flow control) is also managed client side.
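To make the mechanism concrete, here is a rough sketch of the control-connection exchange a streaming/resuming FTP client performs, written in Go with only the standard library. The host, credentials, file name and offset are placeholders, and error handling is reduced to panics to keep it short:

package main

import (
	"fmt"
	"io"
	"net"
	"net/textproto"
	"os"
	"strings"
)

func main() {
	ctrl, err := textproto.Dial("tcp", "ftp.example.com:21")
	if err != nil {
		panic(err)
	}
	ctrl.ReadResponse(2) // 220 greeting

	cmd := func(expect int, format string, args ...interface{}) string {
		if _, err := ctrl.Cmd(format, args...); err != nil {
			panic(err)
		}
		_, msg, err := ctrl.ReadResponse(expect)
		if err != nil {
			panic(err)
		}
		return msg
	}

	cmd(3, "USER anonymous")
	cmd(2, "PASS anonymous@")
	cmd(2, "TYPE I") // binary transfer

	// PASV tells us where to open the data connection:
	// "227 Entering Passive Mode (h1,h2,h3,h4,p1,p2)".
	msg := cmd(2, "PASV")
	var h1, h2, h3, h4, p1, p2 int
	fmt.Sscanf(msg[strings.Index(msg, "(")+1:], "%d,%d,%d,%d,%d,%d",
		&h1, &h2, &h3, &h4, &p1, &p2)
	data, err := net.Dial("tcp",
		fmt.Sprintf("%d.%d.%d.%d:%d", h1, h2, h3, h4, p1*256+p2))
	if err != nil {
		panic(err)
	}

	// The "streaming" part: the client decides the byte offset (and the
	// pace), then retrieves from there.
	cmd(3, "REST 1000000")
	cmd(1, "RETR video.mp4")
	io.Copy(os.Stdout, data) // bytes arrive on the data connection
}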
Long story:
This mechanism is similar to the strategy used by HTTP too. Streaming, however, is a wide subject, and there are other approaches to it. Some protocols provide extra verbs to signal other events, like changes of bandwidth/resolution to account for unstable connections (as in videoconference/desktop-sharing protocols). Some are more suitable for live broadcasting and others for buffered/stored video.
Nowadays, most streaming players, YouTube's included, are web based and built on top of the HTTP protocol. Streaming is achieved using the HTTP Range header and by dividing the media into chunks that can be retrieved separately, as explained in this magnificent video: https://www.youtube.com/watch?v=OqQk7kLuaK4 .
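As a concrete illustration of the Range mechanism, here is a small Go sketch that fetches just the first megabyte of a (made-up) media URL; a player simply keeps issuing requests for later ranges as playback advances:

package main

import (
	"fmt"
	"io"
	"net/http"
)

func main() {
	// Ask the server for only the first megabyte of the media file.
	req, err := http.NewRequest("GET", "https://media.example.com/video.mp4", nil)
	if err != nil {
		panic(err)
	}
	req.Header.Set("Range", "bytes=0-1048575")

	resp, err := http.DefaultClient.Do(req)
	if err != nil {
		panic(err)
	}
	defer resp.Body.Close()

	// A server that supports ranges answers 206 Partial Content and
	// says which slice (and total size) it returned.
	fmt.Println(resp.Status)                      // e.g. "206 Partial Content"
	fmt.Println(resp.Header.Get("Content-Range")) // e.g. "bytes 0-1048575/73451234"

	chunk, _ := io.ReadAll(resp.Body)
	fmt.Println("received", len(chunk), "bytes")
}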
I know it's possible to stream content from a server to a web browser. I'm curious whether there is a method to do the same in reverse: once a page is requested and a connection established, keep it open and stream content from the client to the server without continually re-initiating a connection or POSTing to the server. I have tried searching for such a thing, but I always find information about HTTP streaming from server to client and not the other way around. This is specifically to target mobile devices and stream chunks of text up to the server.
Thanks in advance for any info or advice!
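What is being described maps naturally onto a websocket where the client is the sender. As a rough sketch, here is the receiving end in Go using the third-party github.com/gorilla/websocket package; the endpoint name and port are arbitrary, and the mobile page would open one WebSocket and call send(chunk) repeatedly on the same connection:

package main

import (
	"log"
	"net/http"

	"github.com/gorilla/websocket"
)

var upgrader = websocket.Upgrader{
	// Allow any origin for the sake of the sketch.
	CheckOrigin: func(r *http.Request) bool { return true },
}

func uplink(w http.ResponseWriter, r *http.Request) {
	conn, err := upgrader.Upgrade(w, r, nil)
	if err != nil {
		log.Println("upgrade:", err)
		return
	}
	defer conn.Close()

	// One connection, many messages: each chunk the client sends
	// arrives here without a new request being initiated.
	for {
		_, chunk, err := conn.ReadMessage()
		if err != nil {
			log.Println("client gone:", err)
			return
		}
		log.Printf("received %d bytes", len(chunk))
	}
}

func main() {
	http.HandleFunc("/uplink", uplink)
	log.Fatal(http.ListenAndServe(":8080", nil))
}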
I want to implement a P2P photo sharing application. The scenario is like this:
A is online and he would like to share his photos with B. Through some server, B gets A's IP address and accesses A's photos directly.
Is it possible to implement this using WebRTC or WebSockets? Please give me some input.
Thanks
I implemented P2P file transfer over websockets with a very small node.js server, but it works fine only in Chrome, thanks to the "download" attribute. I also have to have a middleware server, so it's not STRICTLY P2P. My node.js server never has the full file; it just transfers the current file chunk from one websocket connection to another.
I hope WebRTC will help you implement what you wish more smoothly, without any middleware.
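For reference, the relay pattern described above, where the server forwards each chunk from the sender's websocket straight to the receiver's and never holds the whole file, looks roughly like this. The original was node.js; this is an equivalent sketch in Go using the third-party github.com/gorilla/websocket package, with a single hard-coded sender/receiver pair to keep it short:

package main

import (
	"log"
	"net/http"

	"github.com/gorilla/websocket"
)

var upgrader = websocket.Upgrader{
	CheckOrigin: func(r *http.Request) bool { return true },
}

// One sender and one receiver, to keep the sketch short. A real server
// would key connections by room or transfer ID.
var receiver *websocket.Conn

func recvHandler(w http.ResponseWriter, r *http.Request) {
	conn, err := upgrader.Upgrade(w, r, nil)
	if err != nil {
		return
	}
	receiver = conn
}

func sendHandler(w http.ResponseWriter, r *http.Request) {
	conn, err := upgrader.Upgrade(w, r, nil)
	if err != nil {
		return
	}
	defer conn.Close()

	for {
		// Read one chunk from the sender...
		msgType, chunk, err := conn.ReadMessage()
		if err != nil {
			return
		}
		// ...and pass it straight on. The server never buffers more
		// than the current chunk.
		if receiver != nil {
			if err := receiver.WriteMessage(msgType, chunk); err != nil {
				log.Println("receiver gone:", err)
			}
		}
	}
}

func main() {
	http.HandleFunc("/send", sendHandler)
	http.HandleFunc("/recv", recvHandler)
	log.Fatal(http.ListenAndServe(":8080", nil))
}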
This is sort of a theoretical question; however, I need to add file sharing capabilities to my websocket-powered chat application. I could use a service like Amazon S3 to upload a file to share by posting a link to it, but that involves uploading a file that may already be accessible over the local network (sharing a file between co-workers, for example).
So I had the idea that it might be possible to somehow tunnel the upload/download/transfer through the already existing web socket connection. However, I don't know enough about HTTP file transfer to know the next step of how to implement it. Is there a limitation to web sockets that would prevent this from being possible?
I'm using Ruby and EventMachine for my current web socket implementation. If you were able to provide a high level overview to get me started, that would be very much appreciated.
Here's an example of a project that uses only Web Sockets and the javascript File API for transferring files: http://www.github.com/thirtysixthspan/waterunderice
To allow files to be shared without the need to upload them to the server (e.g. between co-workers), you can now use the WebRTC DataChannel API to create a peer-to-peer connection.
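In the browser that means RTCPeerConnection plus createDataChannel. The same idea is available outside the browser as well; below is a heavily trimmed sketch of the data-channel side using the third-party Pion library (github.com/pion/webrtc/v3). Signalling is deliberately omitted, since the existing chat websocket is a natural place to exchange the offer/answer:

package main

import (
	"fmt"

	"github.com/pion/webrtc/v3"
)

func main() {
	// A public STUN server lets the two peers discover their addresses.
	pc, err := webrtc.NewPeerConnection(webrtc.Configuration{
		ICEServers: []webrtc.ICEServer{
			{URLs: []string{"stun:stun.l.google.com:19302"}},
		},
	})
	if err != nil {
		panic(err)
	}

	// The sending side opens a channel and pushes the file in chunks.
	dc, err := pc.CreateDataChannel("file", nil)
	if err != nil {
		panic(err)
	}
	dc.OnOpen(func() {
		chunk := []byte("...file bytes, sliced into smallish messages...")
		if err := dc.Send(chunk); err != nil {
			fmt.Println("send failed:", err)
		}
	})
	dc.OnMessage(func(msg webrtc.DataChannelMessage) {
		// Data channels are bidirectional, so this side can receive too;
		// the peer reassembles the file from these messages.
		fmt.Printf("got %d bytes\n", len(msg.Data))
	})

	// Exchanging the offer/answer and ICE candidates with the peer is
	// left out; that is where the existing chat connection comes in.
	offer, err := pc.CreateOffer(nil)
	if err != nil {
		panic(err)
	}
	if err := pc.SetLocalDescription(offer); err != nil {
		panic(err)
	}
	fmt.Println("send this offer to the peer:", offer.SDP)

	select {} // keep the process alive while the channel is in use
}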