Hi everyone, most of the time I have seen methods that convert an IP camera source (RTSP) to HTTP (HLS), but nobody seems to try converting HLS to RTSP. I have an HLS test URL:
https://multiplatform-f.akamaihd.net/i/multi/will/bunny/big_buck_bunny_,640x360_400,640x360_700,640x360_1000,950x540_1500,.f4v.csmil/master.m3u8
and I use this command to convert HLS to RTSP:
ffmpeg -i https://multiplatform-f.akamaihd.net/i/multi/will/bunny/big_buck_bunny_,640x360_400,640x360_700,640x360_1000,950x540_1500,.f4v.csmil/master.m3u8 -f rtsp rtsp://localhost:8554/test
but it does not work. Can someone help me? Thank you very much!
You need a live streaming server (like Wowza SE) to serve the live stream to other clients (players).
Also, when publishing the live stream to the streaming server, some authentication parameters are usually required, like rtsp://user:password@server:port/app/stream .
You can check the WordPress - Broadcast Live Video plugin, that handles various protocols, streaming methods and FFmpeg transcoding (if you have the necessary streaming server).
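For example, assuming an RTSP server is already listening on port 8554 (MediaMTX, formerly rtsp-simple-server, is one option; the host, port, path and credentials below are placeholders), a publish command would look roughly like this:
ffmpeg -re -i "https://multiplatform-f.akamaihd.net/i/multi/will/bunny/big_buck_bunny_,640x360_400,640x360_700,640x360_1000,950x540_1500,.f4v.csmil/master.m3u8" -c copy -rtsp_transport tcp -f rtsp rtsp://user:password@localhost:8554/test
Here -re reads the input at its native rate, -c copy avoids re-encoding, and -rtsp_transport tcp forces interleaved TCP, which tends to be more firewall-friendly; drop the user:password part if the server does not require authentication.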
Basically, there are two inputs for ffmpeg, which I plan to run on Windows 10: an RTSP stream and a static image. While saving frames, if the RTSP stream is not available or a timeout occurs, ffmpeg should immediately switch to the second input. If the RTSP stream comes back, it should continue with RTSP.
The key expectation is that the process should not have to reconnect to the RTSP source and should not restart.
FFmpeg commands with or without pipes, GStreamer pipelines, a simple proxy server, or any fresh solutions with scripts/commands/code to try out are highly appreciated.
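One direction to explore, as a sketch rather than a tested solution: GStreamer's fallbackswitch element (from the gst-plugins-rs project) is designed for exactly this kind of automatic failover; it switches to a fallback input after a timeout and back when the primary recovers, without restarting the process. Element, pad and property names vary between plugin versions, and the two branches may need matching caps (videoscale plus a capsfilter), so treat this pipeline as illustrative only:
gst-launch-1.0 fallbackswitch name=fs timeout=3000000000 ! videoconvert ! jpegenc ! multifilesink location=frame%05d.jpg \
  rtspsrc location=rtsp://camera_ip/stream ! rtph264depay ! avdec_h264 ! videoconvert ! fs. \
  filesrc location=fallback.png ! pngdec ! imagefreeze ! videoconvert ! fs.
Here the RTSP branch is the primary input, the static image is turned into a video stream by imagefreeze, and frames are saved as JPEGs; the timeout is in nanoseconds (3 s here).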
I have a use case where I need to restream an RTSP URL.
For all the cases below, Live555 is built locally by cloning the source, and the required RTSP ports are open on the respective servers.
First I set up a file to be streamed using Live555MediaServer. This is in an AWS instance (AWS1).
./Live555MediaServer ashi.webm gives
rtsp://public_IP_of_instance:8554/ashi.web
I check whether an incoming stream is working by trying to play it in VLC. This incoming stream is working fine and plays well in VLC.
Now, to restream, I tried Live555ProxyServer: the incoming stream from AWS1 is restreamed to another URL by running Live555ProxyServer on my Mac.
./Live555ProxyServer rtsp://public_IP_of_instance_one:8554/ashi.web gives,
rtsp://localhost:8554/ProxyStream
This URL is also playable in VLC.
Now I set up another AWS instance (AWS2) and run ProxyServer on it, listening to AWS1.
That is,
./Live555ProxyServer rtsp://public_IP_of_instance_one:8554/ashi.web gives,
rtsp://public_IP_of_instance_two:8554/ProxyStream
This URL is not playable.
I tried using the -t flag to use TCP instead of UDP. I tried checking the incoming RTSP stream with ffprobe, and the stream shows all the required details.
What could be the possible reason? What is the missing piece in this pipeline? Are there any solid, industry-grade open-source RTSP servers?
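For reference, the ffprobe check mentioned above was along these lines (with TCP transport forced as one of the variations tried):
ffprobe -rtsp_transport tcp rtsp://public_IP_of_instance_two:8554/ProxyStream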
EDIT:
I don't know exactly what happened, but using VLC as the client to check the stream was the problem. I moved to ffmpeg and tried to write the stream to a file using
ffmpeg -i rtsp://public_IP_of_instance_two:8554/proxyStream -acodec copy -vcodec copy abc.webm
So the stream from ProxyServer is fine.
What led to the change of client was the repeated warning from live555ProxyServer about outdated firmware, explained here.
I'm currently using VLC version 3.0.3 Vetinari (Intel 64bit) on Mac, which is the latest version. Maybe VLC is using an old version of Live555 internally.
I have an RTSP server that provides access to a multicast stream. I can play it fine in VLC and various other players; however, in ffmpeg / ffplay I get no data back.
After investigation, this seems to be caused by ffmpeg's RTSP SETUP listing only unicast as a transport method.
If I use:
ffplay -rtsp_transport udp_multicast rtsp://<blah>
then it works fine.
How does ffmpeg decide which transport methods to put into its SETUP call? Is it perhaps some part of the SDP returned by the DESCRIBE call? Ideally I'd like to find a way to allow ffplay to play the multicast stream without the need to force the transport mode.
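For what it's worth, the SDP returned by DESCRIBE is the usual place where multicast is signalled: per RFC 4566, the connection ("c=") line carries a multicast address with a TTL suffix for a multicast session, whereas unicast sessions use a unicast or 0.0.0.0 address. A multicast-announcing SDP would contain lines like these (addresses and ports are made up):
c=IN IP4 224.2.36.42/127
m=video 5000 RTP/AVP 96
If the server's SDP only advertises a unicast connection address, that could explain why ffmpeg requests unicast in SETUP, though I can't say for certain that this is ffmpeg's decision logic.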
I'm very new to ffmpeg. Consider the following case:
I have several ONVIF IP cameras connected to a network that also contains an IIS server. I'd like to allow clients to stream from any of the IP cameras inside the network, but it must go through the IIS server.
So basically, each IP camera will stream to the IIS server in a single stream, and the IIS server will redistribute it to the many clients who request it. My question is: how do I set up the IIS server to work with this scenario? And what would an example ffmpeg command line look like that reads from an RTSP IP camera and sends the stream to the IIS server, which then redistributes it to clients?
You can use HTTP live streaming for this scenario, either HLS or DASH. HTTP streaming adds some latency, so you need to do a bit of research on how to tweak the encoding parameters for low latency.
The basic idea is that you need to segment the incoming stream and make those segments and playlist/manifest available via your existing web server infrastructure.
Example for FFmpeg and HLS:
ffmpeg -i rtsp://input_stream.sdp -c:v libx264 -r 25 -g 25 -c:a libfdk_aac -hls_time 1 -hls_list_size 4 -hls_wrap 8 /path/to/webroot/live/playlist.m3u8
On the client you will then use the URL http://domain.com/live/playlist.m3u8. HLS is not supported natively on all devices, so get a web player like JWPlayer or Clappr. The client needs 3 segments to start the playback.
FFmpeg HLS
For DASH the idea is similar, but you also need to use MP4Box.
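A rough sketch of the DASH side (segment duration, file names and paths are placeholders, and this is the VOD-style flow; MP4Box also has a -dash-live mode, so check the documentation of your version): first capture a piece of the stream to an MP4 with ffmpeg, then let MP4Box generate the segments and the MPD manifest.
ffmpeg -i rtsp://input_stream.sdp -c:v libx264 -r 25 -g 25 -c:a aac -t 60 live.mp4
MP4Box -dash 4000 -rap -out /path/to/webroot/live/manifest.mpd live.mp4
The client then points a DASH-capable player at http://domain.com/live/manifest.mpd.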
My requirement is to get the iPhone camera feed, encode it into H264 format, and send it to a server.
In my research, I found that the encoding part is possible with the ffmpeg library with x264 (libx264). But now the next task is to send the encoded data to a Wowza server using RTSP.
Please share some code or useful document if anyone is aware about this.
There is one other library, live555, that might serve this purpose. But I am not sure it can send the data to the server using RTSP.
Actually, I made an iOS streaming app (with Wowza as the streaming server).
I believe you can stream video only (no audio) with FFmpeg over the RTSP protocol, although FFmpeg doesn't fully support it.
However, with ffmpeg you can get a valid SDP and pass it to Wowza using the RTSP protocol (ANNOUNCE, OPTIONS, SETUP, RECORD).
I didn't use FFmpeg for encoding, but if you can get the raw H264 data, you can packetize it into valid RTP packets following RFC 6184.
Edit:
Here is a sample to connect to Wowza:
// Build the RTSP ANNOUNCE request that carries the SDP to the server
NSString* response = [NSString stringWithFormat:@"ANNOUNCE %@ RTSP/1.0\r\n", self->addr];
response = [response stringByAppendingFormat:@"CSeq: %d\r\n", self->cseq];
// The SDP goes in the message body, so declare its type and length
response = [response stringByAppendingFormat:@"Content-Type: application/sdp\r\nContent-Length: %d\r\n\r\n", (int)[self->sdp length]];
response = [response stringByAppendingString:self->sdp];
NSString* result = [self sendAndRecvData:response];
where sendAndRecvData writes to and reads from a TCP socket connected to wowza_ip:1935.
You can use the same kind of code for SETUP, which will send back the RTP (+RTCP) ports where you should send your data.
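To make that concrete, a SETUP request built the same way would look roughly like this on the wire (URL, CSeq and client ports are made-up values; mode=record tells the server you intend to publish, and the reply's Transport header carries the server_port pair to send RTP/RTCP to):
SETUP rtsp://wowza_ip:1935/live/myStream/trackID=0 RTSP/1.0
CSeq: 3
Transport: RTP/AVP/UDP;unicast;client_port=6970-6971;mode=record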
If you're using live555, you can use a live555 server living on the device to send both audio and video; this will give you an RTSP+RTCP stream to Wowza. For ANNOUNCE and RECORD, live555 has an unsupported DSS module.
Wowza has an iPhone app called GoCoder which will send a live encoded stream to a Wowza server.
You can stream directly to a Wowza server using RTMP instead of RTSP. The ffmpeg command is something like:
ffmpeg -re -i localFile.mp4 -c copy -f flv rtmp://server/live/streamName
As long as you specify your output format as flv and the output destination as rtmp://xxx, you should be OK.
Source:
ffmpeg streaming