What are the possible ways to play a live stream (RTSP) in a web browser, without using a video player plugin like the VLC or VXG players?
I have a web application written with the Laravel framework. The application plays live streams from IP cameras. Using video player plugins could work, but they have high latency and are not reliable: after running for some time, they crash without any useful error message.
I tried opencv.js but failed: the VideoCapture constructor does not accept a URL (the documentation says it does, but perhaps opencv.js differs from the native library).
Is there any other alternative?
There are no other options. Web browsers cannot open raw sockets directly, so the only ways to get data into a browser are HTTP, WebSockets, or WebRTC.
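Those three transports are enough, though: a common workaround is to transcode the RTSP feed server-side and relay it over a WebSocket. Below is a minimal Node.js sketch of that relay, assuming ffmpeg is installed and the `ws` npm package is available; the camera URL and port are placeholders.

```javascript
// Minimal RTSP -> WebSocket relay sketch (assumes ffmpeg on PATH and the `ws` package).
const { spawn } = require('child_process');
const WebSocket = require('ws');

const RTSP_URL = 'rtsp://camera.example.com/stream1'; // hypothetical camera URL
const wss = new WebSocket.Server({ port: 9999 });

// Transcode RTSP to MPEG-1 video in an MPEG-TS container, written to stdout.
// A pure-JavaScript decoder in the browser can play this format.
const ffmpeg = spawn('ffmpeg', [
  '-rtsp_transport', 'tcp',  // often more reliable than UDP through NAT/Wi-Fi
  '-i', RTSP_URL,
  '-f', 'mpegts',
  '-codec:v', 'mpeg1video',
  '-b:v', '1000k',
  '-r', '25',
  '-an',                     // drop audio for simplicity
  '-'                        // write to stdout
]);

// Fan each transport-stream chunk out to every connected browser.
ffmpeg.stdout.on('data', (chunk) => {
  for (const client of wss.clients) {
    if (client.readyState === WebSocket.OPEN) client.send(chunk);
  }
});

ffmpeg.stderr.on('data', (d) => process.stderr.write(d));
```

On the client, a pure-JavaScript decoder such as JSMpeg can play this stream straight from the WebSocket (`new JSMpeg.Player('ws://host:9999', { canvas })`). For lower latency, a WebRTC media gateway is the other main route.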
Related
I am trying to make a peer-to-peer game-streaming platform. At this point I have managed to capture the OpenGL frames, and I have a functional Java WebSocket server; I can have two clients establish a peer-to-peer connection (I have solved the STUN/TURN servers part) and transfer text at this point.
I do not quite understand how I could stream video made out of the OpenGL frames with low latency (<100ms). The problem mainly lies in the FFmpeg part: I want to use it to encode the frames, get the result (stdin/stdout redirection for ffmpeg?), and somehow link it to the JS API of the host (maybe a local WebSocket to which the hoster's JS will connect).
I tried several FFmpeg arguments/commands with stdin and stdout pipes and they did not work.
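For reference, a stdin/stdout piping arrangement that does work looks roughly like the following Node.js sketch; the resolution, pixel format, frame rate, and the `sendToPeer` helper are all assumptions for illustration.

```javascript
// Pipe raw frames into ffmpeg's stdin and read encoded H.264 back from stdout.
// Resolution, pixel format, and fps below are assumptions for illustration.
const { spawn } = require('child_process');

const WIDTH = 1280, HEIGHT = 720, FPS = 60;

const ffmpeg = spawn('ffmpeg', [
  // Input: raw frames on stdin
  '-f', 'rawvideo',
  '-pix_fmt', 'rgb24',
  '-s', `${WIDTH}x${HEIGHT}`,
  '-r', String(FPS),
  '-i', '-',
  // Output: low-latency H.264 elementary stream on stdout
  '-c:v', 'libx264',
  '-preset', 'ultrafast',
  '-tune', 'zerolatency',
  '-g', String(FPS),         // a keyframe every second
  '-f', 'h264',
  '-'
]);

// Forward encoded chunks to whatever transport you use.
ffmpeg.stdout.on('data', (encoded) => {
  sendToPeer(encoded); // hypothetical: your WebSocket/WebRTC send function
});

// Call this with one raw RGB frame (WIDTH * HEIGHT * 3 bytes) per capture.
function writeFrame(rgbBuffer) {
  ffmpeg.stdin.write(rgbBuffer);
}
```

Note that a browser cannot play a bare H.264 elementary stream arriving over a WebSocket; to stay under ~100ms, the encoded stream usually has to go to a WebRTC stack that packetizes it as RTP.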
What WebRTC client are you using? What is the H.264 live stream flowing into?
WebRTC in the browser has a few restrictions (just because the implementation is naive). Try using constrained-baseline H.264, and use a very small keyframe interval (every second is usually good for a prototype!).
If you don't have a WebRTC client, you can use something like webrtc-remote-screen.
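If ffmpeg/libx264 is producing the stream, those two suggestions translate to roughly these flags (an illustrative sketch, assuming 30 fps input):

```javascript
// Illustrative libx264 flags for browser-friendly WebRTC H.264:
const h264Args = [
  '-c:v', 'libx264',
  '-profile:v', 'baseline', // libx264 emits constrained-baseline-compatible output here
  '-level', '3.1',
  '-g', '30',               // keyframe interval = 1 second at 30 fps
];
```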
I have a WebRTC video-call SDK and a WebRTC server. I want to register more than 10,000 virtual users with the WebRTC server and run video calls between them using my SDK. Basically, I want to do load/stress/performance testing with my SDK.
Could you please suggest how to do load/stress/performance testing using my own WebRTC video-call SDK?
I'm new to socket.io. In real-time (web) applications, we used to choose between WebRTC and WebSocket (or even SIP, still?) technologies.
What exactly is socket.io in this case?
WebSockets
socket.io is a popular open-source library implemented on both the backend and the client side. It is based on the WebSocket API, which allows communication between a SERVER and a CLIENT.
WebRTC
On the other hand, WebRTC is a WebAPI which comes with basically 3 things:
Real-time communication between two browsers over a peer-to-peer (P2P) connection (no server in the media path, though a signaling channel is still needed to set it up)
Media Streaming (Audio and Video)
A real-time communication data channel (stream any data over P2P)
The main difference is that WebSockets need a server and follow a publish/subscribe pattern where you send raw data back and forth, without any special data handling by default. In contrast, WebRTC has a lot of functionality already in place for handling audio/video streaming, plus raw data via the data channel.
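The difference shows up directly in code; here is a minimal sketch of both sides (the server URL is a placeholder, and the WebRTC signaling step is omitted):

```javascript
// WebSockets (via socket.io): every message goes through the server.
const socket = io('https://example.com');        // hypothetical server
socket.emit('chat', 'hello via the server');
socket.on('chat', (msg) => console.log(msg));

// WebRTC: after signaling (not shown), data flows peer-to-peer.
const pc = new RTCPeerConnection();
const channel = pc.createDataChannel('game');
channel.onopen = () => channel.send('hello directly to the peer');
channel.onmessage = (e) => console.log(e.data);
```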
For more info I recommend reading the MDN links I provided above, and also check these very good slides on sockets and WebRTC.
If you want to build video or audio communication services, use WebRTC for its built-in browser support and write the discovery and signaling yourself. WebRTC has great features like P2P connections and data encryption.
Client-side (browser) WebRTC features, like getting video and audio data, have good support in evergreen browsers: http://iswebrtcreadyyet.com/#interop
socket.io is good for building centralized pub/sub apps like text chat.
You can make WebRTC connections without socket.io, but the two work well together when socket.io helps with signaling.
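For example, here is a minimal caller-side sketch of that division of labor; the event names ('offer', 'answer', 'ice') and the server URL are placeholder conventions, not part of either library:

```javascript
// socket.io as the signaling channel for WebRTC (caller side).
const socket = io('https://signaling.example.com'); // hypothetical server
const pc = new RTCPeerConnection({
  iceServers: [{ urls: 'stun:stun.l.google.com:19302' }],
});

// Relay our ICE candidates to the remote peer through the server.
pc.onicecandidate = ({ candidate }) => {
  if (candidate) socket.emit('ice', candidate);
};
socket.on('ice', (candidate) => pc.addIceCandidate(candidate));

// Create and send the offer, then apply the peer's answer.
async function call() {
  const offer = await pc.createOffer();
  await pc.setLocalDescription(offer);
  socket.emit('offer', offer);
}
socket.on('answer', (answer) => pc.setRemoteDescription(answer));
```

Once the offer/answer and ICE exchange completes, the media and data flow peer-to-peer; socket.io is only needed again if you want to renegotiate.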
I know that the RTSP protocol is used for audio and video streaming (however, I am not aware of what is used over Bluetooth).
My question is a little different, and I would like to explain it with an example (actually it is not just an example; I am trying to build something similar).
When we connect our device (a mobile running Android) over Bluetooth to our PC (running Windows), the Control Panel shows an option for streaming audio.
As many of you will be aware, when I play a song on my device it is played through my PC's speakers.
So my questions are:
1) Who is the server?
2) Who is the client?
I think my PC is probably the client. If the PC is the client, it would open a connection for audio streaming with the device.
Since it opens a connection with the server, the server should need a specific application through which the packet transfer with the client takes place.
But to my surprise, I was able to use any media player on my device to play songs through my laptop's speakers.
How is this possible? Is it possible to do the same thing using the RTSP protocol?
Use Case (stream UDP video)
Stream a server-side webcam (on a robot) over UDP to a client browser. We would rather lose packets than have the webcam struggle to keep up over a TCP connection on Wi-Fi that constantly cuts out.
Attempted solution
Start an Xvfb Firefox browser on the server and have it stream the webcam media source. I don't like this solution, as it's not flexible for non-webcam video and is difficult to configure.
I'm looking for something that can stream an arbitrary media source over a WebRTC connection (including the signaling handshake). I don't particularly care which language it is; if something already exists in Node.js, Python, C, Java, or Scala, I'll use it. Otherwise I suppose I'll get to work on the problem (in that case, any guidance would be appreciated).
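One existing option in Node.js is node-webrtc (`wrtc`), whose nonstandard RTCVideoSource accepts raw decoded frames, with ffmpeg doing the decoding; and since WebRTC media travels over SRTP/UDP, it drops packets rather than stalling, which fits this use case. A rough sketch, where the Linux device path, resolution, and omitted signaling are all assumptions:

```javascript
// Push an arbitrary media source into WebRTC with node-webrtc (wrtc).
// Signaling (offer/answer exchange with the browser) is omitted here.
const { spawn } = require('child_process');
const { RTCPeerConnection, nonstandard } = require('wrtc');

const WIDTH = 640, HEIGHT = 480;
const FRAME_SIZE = WIDTH * HEIGHT * 1.5; // one I420 frame in bytes

const source = new nonstandard.RTCVideoSource();
const pc = new RTCPeerConnection();
pc.addTrack(source.createTrack());

// Decode the webcam (Linux v4l2 device path is an assumption)
// to raw I420 frames on stdout.
const ffmpeg = spawn('ffmpeg', [
  '-f', 'v4l2',
  '-i', '/dev/video0',
  '-f', 'rawvideo',
  '-pix_fmt', 'yuv420p',
  '-s', `${WIDTH}x${HEIGHT}`,
  '-'
]);

// Reassemble stdout chunks into whole frames and hand them to WebRTC,
// which does its own congestion control and drops frames under load.
let buf = Buffer.alloc(0);
ffmpeg.stdout.on('data', (chunk) => {
  buf = Buffer.concat([buf, chunk]);
  while (buf.length >= FRAME_SIZE) {
    const frame = buf.subarray(0, FRAME_SIZE);
    buf = buf.subarray(FRAME_SIZE);
    source.onFrame({
      width: WIDTH,
      height: HEIGHT,
      data: new Uint8ClampedArray(frame),
    });
  }
});
```

In Python, aiortc offers a similar server-side route with its MediaPlayer class.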