I am trying to stream content from my browser's webcam implementation to an arbitrary RTMP server. I have got it working to the point where the browser sends WebM (VP8, I believe) encoded blobs of video to my server every 2 seconds, but the tricky part is getting from those blobs to an RTMP stream.
A bit of fiddling with FFmpeg showed that it can successfully stream to the server I want to stream to, but so far I have only managed to get it working with regular files. Attempting to stream the blobs is unsuccessful; it simply does not upload anything. The server also only seems to accept MP4 encoded with the H.264 codec.
The question: what is the best way to get the raw video data from my web browser's webcam implementation, encode it with the H.264 codec, and send it to an RTMP server?
Without using a server to convert your blobs to an RTMP stream, the only way is to use Flash. RTMP is an Adobe protocol that no browser supports natively. Another option is WebRTC, which uses the RTP protocol.
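If you do go the server route, the conversion step can be quite small: pipe the received WebM chunks into FFmpeg and re-encode to H.264 for RTMP. A rough sketch, assuming the chunks are written to FFmpeg's stdin in arrival order; the RTMP URL and stream key are placeholders:

    # Hypothetical relay: WebM blobs arrive on stdin (piped in by whatever
    # process receives them), are re-encoded to H.264/AAC and pushed out
    # as an FLV stream over RTMP.
    ffmpeg -i - \
      -c:v libx264 -preset veryfast -tune zerolatency \
      -c:a aac \
      -f flv rtmp://example.com/live/STREAM_KEY

The browser side stays as it is; only the server-side handling of the blobs changes.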
Related
I am currently working on a program that plays an MP4 video from disk over an RTSP server. I have created a simple RTSP server and I can publish the stream to it using FFmpeg, but I was wondering whether there is a less complicated way to publish the stream to the server, so that I can put it into the batch file and won't have to run FFmpeg by hand every time.
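For reference, a typical one-line FFmpeg invocation for publishing a file to an RTSP server, which could go straight into the batch file, looks roughly like this; the file name and server URL are placeholders:

    ffmpeg -re -i video.mp4 -c copy -f rtsp rtsp://127.0.0.1:8554/stream

Here -re reads the file at its native frame rate and -c copy avoids re-encoding when the codecs are already suitable for the server.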
I use FFmpeg to stream a video using the RTP protocol.
Why can't I send multiple streams to one RTP port, while RTSP, which uses RTP, can?
I started an RTSP server that listens on TCP port 8554 and on ports 8000/8001 for RTP/RTCP. It can easily receive both video and audio streams on the single port 8000; I have checked this with Wireshark. But when I try to do the same with pure RTP using FFmpeg, it prints an error:
Only one stream supported in the RTP muxer
So if I want to stream a video with sound, I have to split it into two streams and send them to different RTP ports. Or can I somehow make it receive multiple streams on one port via plain RTP?
Could you explain why this happens?
The RFC for RTP explains this a bit:
For example, in a teleconference composed of audio and video media encoded separately, each medium SHOULD be carried in a separate RTP session with its own destination transport address.

Separate audio and video streams SHOULD NOT be carried in a single RTP session and demultiplexed based on the payload type or SSRC fields. Interleaving packets with different RTP media types but using the same SSRC would introduce several problems.
Also see the multiplexing guidelines draft RFC. So it is possible to multiplex RTP streams, but the software you are using may not support it.
I can't answer how RTSP handles this, however; the answer is likely in the RTSP RFC.
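As a concrete sketch of the workaround described in the question, the audio and video can be mapped to two separate RTP outputs with FFmpeg; the input file, addresses, and ports below are placeholders:

    # Video and audio go to separate RTP sessions on different ports;
    # -sdp_file writes an SDP file that a player such as VLC can open.
    ffmpeg -re -i input.mp4 -sdp_file streams.sdp \
      -map 0:v -c:v copy -f rtp rtp://127.0.0.1:8000 \
      -map 0:a -c:a copy -f rtp rtp://127.0.0.1:8002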
I have a live stream of TS data. I streamed it with an RTP header and received it on an Android device using VLC. Now I would like to stream the same content using the RTSP protocol. How can I do this? Is there some way of reusing this RTP data for RTSP streaming?
RTSP is just a stream initiation protocol for RTP; it is very simple and can be implemented easily after reading the RFC. It will take some more time (and reading), though, to implement the RTCP protocol.
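To give a sense of how small RTSP is, a stripped-down client/server exchange looks roughly like the following; the URL, ports, and session ID are illustrative only, and real messages carry more headers:

    C->S: DESCRIBE rtsp://server/stream RTSP/1.0
          CSeq: 1
    S->C: RTSP/1.0 200 OK                 (body: SDP describing the RTP stream)
    C->S: SETUP rtsp://server/stream RTSP/1.0
          CSeq: 2
          Transport: RTP/AVP;unicast;client_port=8000-8001
    S->C: RTSP/1.0 200 OK
          Transport: RTP/AVP;unicast;client_port=8000-8001;server_port=9000-9001
          Session: 12345678
    C->S: PLAY rtsp://server/stream RTSP/1.0
          CSeq: 3
          Session: 12345678
    S->C: RTSP/1.0 200 OK                 (RTP packets now flow to ports 8000/8001)

Everything after PLAY is the same RTP data you are already sending; RTSP only negotiates where it goes and describes what it contains.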
I am trying to write a program that connects to an RTSP video source and redirects the video data to another location using UDP. I am also saving the packets so that I can replay the video stream at a later moment in time. Right now my program can connect to the RTSP video stream, redirect, and save, but when I try to look at the redirected video in VLC I get nothing.
Currently the program just strips the datagram payload out of the video packets it receives on its open UDP socket and re-sends it with this code, using the Boost.Asio library:
// Forward the received datagram payload unchanged to the new endpoint.
newVideoSocket->send_to(boost::asio::buffer(dg.data), Endpoint);
When I look at the traffic using Wireshark I see that the data is actually being sent to the new address and is recognized as a UDP packet, but when I try to view the video using VLC nothing happens. The video stream is MPEG-4 with the video encoded as H.264, and VLC can play it.
I have tried to connect to the redirected stream as UDP and as RTP, at both multicast and unicast addresses, but have had no success. Do I need to add something to or remove something from the datagram before I resend it? Or is something wrong with how I am trying to view it in VLC? Thanks for the help.
To play a raw UDP stream VLC needs information about the stream (this information is normally carried by RTSP in the DESCRIBE and SETUP messages). Try creating an SDP file specifying the port number, video type, etc. (you can take these from the server's DESCRIBE response) and then open that file in VLC.
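A minimal SDP file for this case might look like the following; the address, port, and payload type are assumptions and should be taken from the DESCRIBE response:

    v=0
    o=- 0 0 IN IP4 127.0.0.1
    s=Redirected H.264 stream
    c=IN IP4 127.0.0.1
    t=0 0
    m=video 5004 RTP/AVP 96
    a=rtpmap:96 H264/90000

Save it as, say, stream.sdp and open that file in VLC so the player knows which port to listen on and how to interpret the packets.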
I've managed to make it work, but using VLC like this I encountered problems with synchronization and video output (the video was broken).
I'm trying to extract some information from an MJPEG stream that is being transmitted wirelessly over HTTP. I am most interested in the timestamp at which each received image was captured. Does anyone have an idea of how to do this?
(Note: I am using MJPG-streamer to stream the images to the PC)
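One way to start is to dump the multipart headers that precede each JPEG part of the stream; whether a capture timestamp appears there depends on the MJPG-streamer output plugin and build, so the URL and the X-Timestamp header below are assumptions about the setup:

    # Hypothetical: print only the per-part headers of the MJPEG multipart stream.
    curl -sN "http://camera-host:8080/?action=stream" \
      | grep --line-buffered -a -E "^(Content-Type|Content-Length|X-Timestamp)"

If no timestamp header is present, the capture time has to be recorded on the camera side or approximated from the arrival time on the PC.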