I have installed a Red5 server on Ubuntu 12.04 LTS for live and VOD video streaming. I want to convert my RTMP stream to RTSP and HTTP streams. I have studied and searched around FFmpeg, but I did not understand it correctly, so please can anyone guide me? Thanks in advance.
The sample URL:
rtmp://xxxxx.com/live
to
rtsp://xxxxx.com/live and http://xxxxx.com/live
There are Red5 plugins for HLS (HTTP Live Streaming) and RTSP. I don't know how stable they are, so you can try them. Here are the links:
https://github.com/Red5/red5-plugins/tree/master/rtspplugin
https://github.com/Red5/red5-hls-plugin
You could write a transcoding application that uses Xuggler and converts your streams on the fly to RTSP (RTMP -> RTSP). That isn't meant to sound simple, but it can be done, as can the other way around (RTSP -> RTMP). A rough sketch is below.
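Something along these lines, using Xuggler's mediatool API; the stream name and the RTSP publish URL are placeholders, and the RTSP side assumes a server that accepts published streams:

    import com.xuggle.mediatool.IMediaReader;
    import com.xuggle.mediatool.IMediaWriter;
    import com.xuggle.mediatool.ToolFactory;

    public class RtmpToRtsp {
        public static void main(String[] args) {
            // Read from the live RTMP stream (placeholder URL/stream name).
            IMediaReader reader = ToolFactory.makeReader("rtmp://xxxxx.com/live/mystream");
            // Write everything the reader decodes out to the RTSP endpoint.
            IMediaWriter writer = ToolFactory.makeWriter("rtsp://xxxxx.com/live/mystream", reader);
            reader.addListener(writer);
            // Pump packets until the input stream ends or errors.
            while (reader.readPacket() == null) {
                // readPacket() decodes and dispatches each packet to the listeners.
            }
        }
    }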
Could anybody give me a tip on how to stream from the main aircraft camera to a remote server? We have our own app running on a Raspberry Pi 4 mounted on a Matrice; we can get a live view from the camera and can download the H.264 file to the SD card, but we haven't found any description or sample of how to stream it out.
Is it possible to use the aircraft-to-remote-controller connection and then go from the remote controller over WiFi? Or should we rather use the Raspberry Pi's WiFi (which will cut the range, I assume)?
Set up an RTMP server.
Stream to the RTMP server from the MSDK (which runs on the remote controller side).
See the MSDK example project.
Anyway, if you search for the class "LiveStreamManager" in the MSDK example app on GitHub, you will find the method getLiveStreamManager:
LiveStreamManager getLiveStreamManager()
Provides access to the LiveStreamManager. It can be used to stream the video to an RTMP server to do live streaming with DJI products.
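Following the pattern in the MSDK sample app, a minimal sketch looks like this; it assumes an already-registered DJI Android MSDK application, and the RTMP URL is a placeholder for the server you set up:

    import dji.sdk.sdkmanager.DJISDKManager;
    import dji.sdk.sdkmanager.LiveStreamManager;

    public class DroneStream {
        // Placeholder: point this at the RTMP server you set up in step 1.
        private static final String RTMP_URL = "rtmp://example.com/live/drone";

        public static void startLiveStream() {
            LiveStreamManager manager = DJISDKManager.getInstance().getLiveStreamManager();
            if (manager == null || manager.isStreaming()) {
                return; // SDK not registered yet, or a stream is already running
            }
            manager.setLiveUrl(RTMP_URL);
            int result = manager.startStream(); // 0 means success
            manager.setStartTime();
        }
    }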
It is possible to do so. The RTMP streaming is done using FFmpeg: we stream a section of the desktop to a WebRTC server. We use OpenCV to control the XT2 image box on the desktop and then live-stream that region. A normal 4G-based point-to-point connection may have around 30 s of latency, so we use a WebRTC video server to make the stream real-time. A sketch of the capture command is below.
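For the capture step, an FFmpeg invocation along these lines grabs a region of an X11 desktop and publishes it over RTMP; the display offset, region size, frame rate, and ingest URL are all placeholders:

    ffmpeg -f x11grab -video_size 1280x720 -framerate 30 -i :0.0+100,200 \
           -c:v libx264 -preset veryfast -tune zerolatency \
           -f flv rtmp://example.com/live/xt2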
I am trying to make a peer-to-peer game streaming platform. At this point I have managed to capture the OpenGL frames, and I have a functional Java websockets server; I can have two clients establish a peer-to-peer connection (I have solved the STUN/TURN server part) and transfer text between them.
I do not quite understand how I could stream video made out of the OpenGL frames with low latency (<100 ms). The problem mainly lies in the FFmpeg part: I want to use it to encode the frames and get the result back (stdin/stdout redirection for ffmpeg?), then somehow link that to the JS API of the host (maybe a local websocket that the hoster's JS connects to).
I have tried several FFmpeg arguments/commands with stdin and stdout pipes and they did not work; the sort of wiring I am aiming for is sketched below.
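Roughly this, where the frame size, pixel format, and frame rate are placeholders that must match the captured frames:

    import java.io.InputStream;
    import java.io.OutputStream;

    public class FfmpegPipe {
        public static void main(String[] args) throws Exception {
            // Raw RGB frames go in on stdin (pipe:0); an H.264 elementary
            // stream comes out on stdout (pipe:1).
            Process ffmpeg = new ProcessBuilder(
                    "ffmpeg",
                    "-f", "rawvideo", "-pix_fmt", "rgb24",
                    "-s", "1280x720", "-r", "60",
                    "-i", "pipe:0",
                    "-c:v", "libx264", "-preset", "ultrafast", "-tune", "zerolatency",
                    "-f", "h264", "pipe:1")
                    .redirectError(ProcessBuilder.Redirect.INHERIT)
                    .start();

            OutputStream toFfmpeg = ffmpeg.getOutputStream();  // write glReadPixels frames here
            InputStream fromFfmpeg = ffmpeg.getInputStream();  // read encoded H.264 here

            // In a real program, read stdout on a separate thread so the
            // encoder never blocks on a full pipe.
            byte[] frame = new byte[1280 * 720 * 3]; // one raw RGB frame
            toFfmpeg.write(frame);
            toFfmpeg.flush();

            byte[] buf = new byte[64 * 1024];
            int n = fromFfmpeg.read(buf); // forward these bytes to the peer
        }
    }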
What WebRTC client are you using? What is the H.264 live stream flowing into?
WebRTC in the browser has a few restrictions (just because the implementation is naive). Try the constrained baseline profile, and use a very small keyframe interval (one keyframe every second is usually good for a prototype!).
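With an x264-based encoder, for example, that advice maps onto options like these (x264's baseline profile is signalled as constrained baseline; the GOP length of 30 assumes 30 fps input, so match it to your frame rate for a one-second keyframe interval):

    ffmpeg -i <input> -c:v libx264 -profile:v baseline \
           -g 30 -keyint_min 30 -tune zerolatency \
           -f flv rtmp://example.com/live/stream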
If you don't have a WebRTC client you can do something like webrtc-remote-screen
I am trying to stream content from my browser's webcam implementation to an arbitrary RTMP server. I got it working to the point where it sends blobs of WebM-encoded (VP8, I believe) bits of video to my server every 2 seconds, but the tricky part is getting it from there to an RTMP server.
A bit of fiddling with FFmpeg showed that it can successfully stream to the server I want to reach, but so far I have only managed to get it working with regular files. Attempts to stream the blobs are unsuccessful; it simply does not upload anything. The server also only seems to accept MP4 encoded with the H.264 codec.
The question: what is the best way to get the raw video data from my web browser's webcam implementation, encode it with the H.264 codec, and send it to an RTMP server?
Without using a server to convert your blobs to an RTMP stream, the only way is to use Flash; RTMP is an Adobe protocol that no browser supports natively. Another option is WebRTC, which uses the RTP protocol. If you do use a server, the conversion step can be sketched as below.
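A minimal sketch of that server-side conversion; it assumes the MediaRecorder blobs arrive in order (they are fragments of one continuous WebM stream, so they can simply be appended to ffmpeg's stdin), and the RTMP URL is a placeholder:

    import java.io.OutputStream;

    public class BlobToRtmp {
        private final Process ffmpeg;
        private final OutputStream stdin;

        public BlobToRtmp(String rtmpUrl) throws Exception {
            ffmpeg = new ProcessBuilder(
                    "ffmpeg",
                    "-i", "pipe:0",                    // WebM/VP8 blobs in
                    "-c:v", "libx264", "-preset", "veryfast",
                    "-c:a", "aac",
                    "-f", "flv", rtmpUrl)              // FLV/H.264 out over RTMP
                    .redirectError(ProcessBuilder.Redirect.INHERIT)
                    .start();
            stdin = ffmpeg.getOutputStream();
        }

        // Call this for each blob received over the websocket, in order.
        public void onBlob(byte[] blob) throws Exception {
            stdin.write(blob);
            stdin.flush();
        }
    }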
I am trying to write a program that connects to an RTSP video source and redirects the video data to another location using UDP. I am also saving the packets so that I can replay the video stream at a later moment in time. Right now my program can connect to the RTSP video stream, redirect, and save, but when I try to look at the redirected video in VLC I get nothing.
Currently the program just strips the datagram out of the RTP video packets it receives on its open UDP socket and re-sends it using this code (with the Boost.Asio library):
// Forward the raw payload unchanged to the new endpoint.
newVideoSocket->send_to(boost::asio::buffer(dg.data), Endpoint);
When I look at the traffic in Wireshark I can see that the data is actually being sent to the new address and is recognized as UDP packets, but when I try to view the video in VLC nothing happens. The video stream is MPEG-4, with the video encoded as H.264, and VLC can play it.
I have tried to connect to the redirected stream as UDP and as RTP, at both multicast and unicast addresses, but have had no success. Do I need to add something to, or remove something from, the datagram before I resend it? Or is something wrong with how I am trying to view it in VLC? Thanks for the help.
To play a raw UDP stream, VLC needs information about the stream (this information is normally transferred through RTSP in the DESCRIBE and SETUP messages). Try creating an SDP file specifying the port number, video type, and so on (you need to read the DESCRIBE response from the server), and then open that file in VLC.
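For example, a minimal SDP file for this case might look like the following, where the address, port, and payload type are placeholders that must match what the server returned in its DESCRIBE response:

    v=0
    o=- 0 0 IN IP4 127.0.0.1
    s=Redirected RTP stream
    c=IN IP4 192.168.1.50
    t=0 0
    m=video 5004 RTP/AVP 96
    a=rtpmap:96 H264/90000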
I've managed to make it work, but using VLC like this I encountered problems with synchronization, and the video output was broken.
Is there any way to cache RTMP streams?
Normally Squid successfully caches HTTP-based videos. What about RTMP?
Do you know of any other tool?
About RTMP: http://en.wikipedia.org/wiki/Real_Time_Messaging_Protocol
Thank you.