How to make ffmpeg exit when rtmp input stream ends?

I am using ffmpeg to transcode a stream from my local nginx rtmp server, and send the transcoded media back to the same local rtmp server. When the stream goes offline, ffmpeg stays active. When the stream starts again, ffmpeg picks up the transcoding work.
I'd like ffmpeg to exit when the input rtmp stream stops, but I cannot find out how to configure it to do that. I've looked through the manual page and the online documentation.
For completeness, these are the arguments I currently provide to ffmpeg:
ffmpeg -i 'rtmp://localhost/live/ijsbeer' -s '1280x720' -vcodec 'libx264' -preset 'veryfast' -crf '25' -maxrate '2000k' -bufsize '1000.0k' -force_key_frames '0:00:02' -max_muxing_queue_size '4096' -acodec 'copy' -copyts -copytb '0' -f 'flv' 'rtmp://localhost/live/ijsbeer_720p'
Making ffmpeg exit by itself would solve two of my problems. First, ffmpeg sometimes runs into non-monotonic DTS issues when the encoder (OBS) streaming to the rtmp server restarts. Second, I'd like to implement a basic monitoring system to notify me of issues in the transcoder. Making ffmpeg exit when the input stream stops seems like the most reliable way to determine whether ffmpeg is doing work. Parsing the stderr/stdout isn't reliable, as there is no clear output format.
When transcoding I receive these regular stderr lines:
frame= 241 fps= 30 q=30.0 size= 38kB time=00:00:09.74 bitrate= 32.0kbits/s speed= 1.2x
These pause when I stop the source stream, but there is no log output stating that the source stream has ended.
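One way to get the monitoring half of this without parsing the human-readable log is ffmpeg's -progress output, which writes machine-readable key=value blocks. Below is a hedged sketch of a watchdog wrapper around the exact command above; the 15-second idle limit is an arbitrary choice, not something ffmpeg prescribes:

#!/bin/bash
# Watchdog sketch: ffmpeg writes a key=value progress block roughly once a second
# to stdout (-progress pipe:1); if no block arrives within IDLE_LIMIT seconds,
# assume the input stream has stopped and kill the transcoder.
IDLE_LIMIT=15

ffmpeg -nostats -progress pipe:1 \
    -i 'rtmp://localhost/live/ijsbeer' -s '1280x720' -vcodec 'libx264' -preset 'veryfast' \
    -crf '25' -maxrate '2000k' -bufsize '1000.0k' -force_key_frames '0:00:02' \
    -max_muxing_queue_size '4096' -acodec 'copy' -copyts -copytb '0' \
    -f 'flv' 'rtmp://localhost/live/ijsbeer_720p' |
while true; do
    # read fails when no line arrives within IDLE_LIMIT seconds, or when ffmpeg exits
    if ! read -r -t "$IDLE_LIMIT" line; then
        echo "no progress for ${IDLE_LIMIT}s, stopping transcoder" >&2
        pkill -P $$ -x ffmpeg   # ffmpeg is a direct child of this script's shell
        exit 1
    fi
done

ffmpeg also has a generic -rw_timeout protocol option (in microseconds) that can make a stalled network read error out instead of blocking; whether it fires for an idle-but-still-connected rtmp session depends on the server, so treat it as something to test rather than a guaranteed fix.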

Related

How to convert MP4 frame rate like 14.939948fps to 15fps

Description
I pushed a USB camera stream with ffmpeg to an RTMP stream server called SRS.
SRS saved an MP4 file for me. The frame rate is not a common value in VLC - it's 14.939948. I've checked it out; it seems to be the 'ntsc' format.
Meanwhile, I received the stream with OpenCV and saved it as another MP4 file. The two files are not synchronized.
I have tried to convert the frame rate with ffmpeg, but the result was still not synchronized. The only way I could make it work was to put the file in Adobe Premiere and modify the frame rate there. Here is the ffmpeg command I executed:
ffmpeg -i 1639444871684_copy.mp4 -filter:v fps=15 out.mp4
Aside from the stream server, how can I convert the frame rate to a normal value and keep the files synchronized at the same time?
Note: For live streaming, you should never depend on the FPS, because RTMP/FLV always uses a fixed TBN of 1k, so some deviation is always introduced when publishing a stream over RTMP or recording it to another format like TS/MP4.
Note: For WebRTC, the fps is variable; see 'Would WebRTC use a constant frame rate to capture video frames' or read about Variable Frame Rate (VFR).
It's not a problem with SRS or the FPS; you can reproduce it with FFmpeg alone.
Use FFmpeg to transcode doc/source.flv from 25fps to 15fps, then publish it to SRS over RTMP (15fps).
Use FFmpeg to record the RTMP stream (15fps) as output.mp4 (15fps).
Use VLC to play output.mp4 (15fps); it shows the fps IS NOT 15fps.
First, start SRS with the config below; note that DVR is disabled:
# ./objs/srs -c test.conf
listen 1935;
daemon off;
srs_log_tank console;
vhost __defaultVhost__ {
}
Run FFmpeg to transcode and publish to SRS, changing the fps to 15:
cd srs/trunk
ffmpeg -re -i doc/source.flv -c:v libx264 -r 15 -c:a copy \
-f flv rtmp://localhost/live/livestream
Record the RTMP stream (15fps) to output.mp4; note that in the FFmpeg logs the fps is 15fps:
ffmpeg -f flv -i rtmp://localhost/live/livestream -c copy -y output.mp4
Use VLC to play output.mp4 (nominally 15fps), open Window -> Media Information, and you will find that the fps hovers around 14.8fps, not 15fps!
It's because the TBN of RTMP/FLV is fixed at 1000 (1k tbn), so the deviation is introduced when the MP4 is published as an RTMP stream. At 15fps the ideal frame interval is 1000/15 ≈ 66.67ms, which cannot be represented exactly in whole milliseconds; a frame interval rounded to 67ms corresponds to 1000/67 ≈ 14.93fps, which is roughly what VLC reports. It's not caused by DVR; it's caused by the RTMP/FLV TBN.
Note: However, for SRS, using a fixed TBN of 1k is not a good choice, because it's not friendly to the MP4 duration, so I reopened issue srs#2790.
Ultimately, the framerate/fps is not a fixed property; it's just a number that gives a hint about the stream. The player always uses the DTS/PTS to decide when and how to render each picture.
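Not part of the original answer, but to see this numerically instead of through VLC, ffprobe can print both the stream-level frame-rate fields and the per-packet timestamps of the recording (output.mp4 is the file produced in the steps above):

# nominal vs. averaged frame rate of the video stream
ffprobe -v error -select_streams v:0 \
    -show_entries stream=r_frame_rate,avg_frame_rate \
    -of default=noprint_wrappers=1 output.mp4

# per-packet presentation timestamps; with a 1k-tbn source you will see steps
# of roughly 0.066/0.067s rather than an exact 1/15s
ffprobe -v error -select_streams v:0 \
    -show_entries packet=pts_time -of csv=p=0 output.mp4 | head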
Answering myself. Here is my method: read the frames with OpenCV and write them to a new file at 15FPS. The files are then synchronized.
with -r
ffmpeg -i 1639444871684_copy.mp4 -r 15 out.mp4

Ffmpeg hangs while transcoding HDHomerun Prime on channel change

I'm using ffmpeg to transcode a live stream from a HDHomerun Prime. Everything is working beautifully. However, one thing I would like, if possible, is the ability to change the channel on the HDHomerun without having to stop and restart the ffmpeg transcoding process.
I start the ffmpeg process to begin reading the UDP feed from the HDHomerun. It writes the stream to a series of *.ts files with a m3u8 playlist.
The second I use hdhomerun_config to change the channel on the device, ffmpeg immediately reports the following and hangs:
[mpegts # 0000018b4e05be60] New video stream 0:3 at pos:295211888 and DTS:40884.7s=108 drop=0 speed=1.02x
[mpegts # 0000018b4e05be60] New audio stream 0:5 at pos:295279568 and DTS:40884.4s
frame= 4488 fps= 29 q=23.0 q=27.0 q=23.0 size=N/A time=00:02:28.94 bitrate=N/A dup=108 drop=0 speed=0.959x
The command I am using to launch ffmpeg is:
ffmpeg.exe -t 03:00:00 -i "udp://192.168.1.150:5000?fifo_size=1000000&overrun_nonfatal=1" -vf yadif=0:-1:1 -y -threads 4 -c:v libx264 -s 1280x720 -r 30 -b:v 4500k -force_key_frames expr:gte(t,n_forced*2) -profile:v high -preset fast -x264opts level=41 -c:a libfdk_aac -b:a 96k -ac 2 -hls_time 10 -hls_list_size 6 -hls_wrap 6 -hls_base_url /stream/ -hls_flags temp_file -hls_playlist_type event "C:\temp\streams\4500-stream.m3u8"
Is there a specific command that I can pass to allow ffmpeg to "recover" from this? Or, is there an arg that prevents this hang in the first place? I'm using the latest version v3.4 of ffmpeg cross-compiled for Windows from Ubuntu. The issue also occurred using the stable version v3.4 of ffmpeg for Windows from ffmpeg.org.
Edit:
A new discovery about the issue, though it is still not resolved:
If I change the channel BACK to the original channel, ffmpeg is able to continue writing the stream.
Example: I start on channel X. Ffmpeg is recording to a file. I change to channel Y. Ffmpeg outputs a message similar to the one posted above and "hangs." I change back to channel X and ffmpeg picks up where it left off, no problem.
I have no answer for your issue and am too new to comment, but I had been trying to do this for a while based on another software package at https://github.com/bkirkman/hdhrtv
Basically, what that implementation does is open a new stream for each channel. I suspect that you will need to do the same, and that also explains why the channel resumes itself when you go "back". Unfortunately, unless you make a playlist and buffer every channel, you may not be able to get a solution, since the input stream changes.
I'd be interested in taking a look and testing if you're willing to share your code.
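Not from the answer above, just a rough bash sketch of that "new stream per channel" idea: stop the running ffmpeg before retuning, then launch a fresh one against the new stream. The hdhomerun_config arguments and the output path are placeholders; substitute whatever invocation and paths you already use:

#!/bin/bash
# Sketch only: restart ffmpeg around every channel change instead of expecting
# it to survive the stream switch on its own.

start_ffmpeg() {
    ffmpeg -t 03:00:00 -i "udp://192.168.1.150:5000?fifo_size=1000000&overrun_nonfatal=1" \
        -vf yadif=0:-1:1 -y -threads 4 -c:v libx264 -s 1280x720 -r 30 -b:v 4500k \
        -force_key_frames "expr:gte(t,n_forced*2)" -profile:v high -preset fast \
        -x264opts level=41 -c:a libfdk_aac -b:a 96k -ac 2 \
        -hls_time 10 -hls_list_size 6 -hls_wrap 6 -hls_base_url /stream/ \
        -hls_flags temp_file -hls_playlist_type event \
        /tmp/streams/4500-stream.m3u8 &   # placeholder output path
    FFMPEG_PID=$!
}

change_channel() {
    kill "$FFMPEG_PID" 2>/dev/null
    wait "$FFMPEG_PID" 2>/dev/null
    hdhomerun_config FFFFFFFF set /tuner0/vchannel "$1"   # placeholder device id / tuner path
    start_ffmpeg
}

start_ffmpeg
change_channel 5.1   # example channel number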

FFMPEG: rtsp stream to a udp stream

I am looking for advice on using ffmpeg to convert an RTSP stream to a UDP stream. What would be the simplest general command to do so? This is what I have right now:
ffmpeg -i rtsp://192.168.1.247/play1.sdp -f mpegts -vcodec mpeg4 -acodec mp2 udp://127.0.0.1:1234
The error I'm getting:
UDP timeout, retrying TCP
method PAUSE failed: 405 PAUSE
rtsp://192....: operation not permitted
Finishing stream 0:0 without any data written to it.
I'm running Ubuntu 14.04. Thank you!
Looks like the ffmpeg command you are using is good enough. I suspect your RTSP input stream is not valid. Have you verified it? You can do so using the command below, or in VLC:
ffplay -i rtsp://192.168.1.247:port/filename
One change in the command could be: instead of play1.sdp, directly give the stream filename, i.e. a playable stream rather than an sdp file. Hope it helps.
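Not part of this answer, but since the log shows "UDP timeout, retrying TCP", one hedged variation worth trying is forcing the RTSP transport to TCP with ffmpeg's -rtsp_transport input option (everything else is the original command):

ffmpeg -rtsp_transport tcp -i rtsp://192.168.1.247/play1.sdp \
    -f mpegts -vcodec mpeg4 -acodec mp2 udp://127.0.0.1:1234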

Seek and pause video stream from ffserver

I managed to successfully feed ffserver from ffmpeg. ffmpeg takes its input from a pipe:
ffmpeg -loglevel fatal -f image2pipe -re -vcodec png -i - -vcodec libx264 http://localhost:8090/%s.ffm
An external Java process generates svg/png images and pushes them to ffmpeg.
My ffserver config allows me to buffer live feeds in an ffm file without defining the size of the file.
My stream configuration looks like this:
<Stream live2.mjpg>
Feed feed2.ffm
Format mpjpeg
VideoFrameRate 25
VideoSize 640x880
VideoQMin 1
VideoQMax 5
NoAudio
Strict -1
</Stream>
The problem is that, although I can watch the stream in VLC by opening the network URL:
http://0.0.0.0:8090/live2.mjpg
I cannot seek through the already buffered movie.
Is there a way to achieve seeking through the movie, pausing, and resuming playback from "now"? I have already tried rtsp with h264, mpg and sdp, but without success:
<Stream test1.mpg/sdp/h264>
Format rtp
Feed feed2.ffm
VideoCodec libx264
VideoSize 640x880
VideoQMin 1
VideoQMax 5
NoAudio
Strict -1
VideoFrameRate 25
</Stream>
Is rtsp the solution to this problem, or do I need something else?
Can this be achieved from a dynamic file, since I am using a pipe?
RTSP
RTSP support in ffserver seems a bit sketchy; you could try Darwin Streaming Server or the Live555 media server. The two seem to support some forms of trick play, at least for VOD. Since you're using a pipe, this probably won't help.
RTMP
Some RTMP servers/clients support in-buffer seeking (Smart Seeking).
About Smart Seek: "Adobe Media Server 3.5.3 and Flash Player 10.1 work together to support smart seeking in VOD streams and in live streams that have a buffer." [Source]
ffserver doesn't support RTMP output but you can use your ffmpeg command to push your stream directly to the server:
ffmpeg -re -i <input> -f flv rtmp://...
There's an Nginx RTMP module and a C++ RTMP server, although it's not very clear whether they support smart seeking. VLC seems to be able to seek a bit while paused, and there are usually options to modify the size of the client RTMP buffer.
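For reference, a minimal nginx-rtmp configuration that would accept the ffmpeg push above looks roughly like this; it's a sketch built from the module's standard directives, and the application name live is an assumption:

rtmp {
    server {
        listen 1935;

        application live {
            live on;       # accept live publishing
            record off;    # don't record incoming streams to disk
        }
    }
}

You would then publish with the ffmpeg command shown above, using a URL of the form rtmp://<server>/live/<stream-name>.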

Wowza error: Failed to play myStream; stream not found.

I am using ffmpeg to encode a video which will then be restreamed using Wowza. I am new to streaming. First I started Wowza using the command:
/etc/init.d/WowzaMediaServer start
After that I started streaming an MP4 file using the RTSP protocol. I used the command:
ffmpeg -re -i /usr/local/WowzaMediaServer/content/sample.mp4 -acodec libfdk_aac -vcodec libx264 -f rtsp -muxdelay 0.1 rtsp://myuser:mypassword#127.0.0.1:1935/live/myStream.sdp
The video starts streaming. Before all of this I changed admin.password and added a username (myuser) and password (mypassword). When I run the above command it streams, but after that the instructions say to go to
WowzaMediaServer/examples/LiveVideoStreaming/FlashRTMPPlayer/Player.html
and fill the Server field with rtmp://localhost:1935/live
and the Stream field with myStream.
When I click on Connect it gives me the status:
"Failed to play myStream; stream not found."
i am following this article http://www.wowza.com/forums/content.php?354-How-to-set-up-live-streaming-using-an-RTSP-RTP-based-encoder
Where I am wrong I don't know; I am unable to figure it out. I am not getting satisfactory answers from the Wowza support team, so if anyone works with Wowza, please help me! Why am I not able to connect my video stream to Wowza? Please respond, I'm badly stuck.
So it appears there are some basic issues with the RTSP from ffmpeg, and then the published stream name doesn't match the play request.
You have
ffmpeg -re -i /usr/local/WowzaMediaServer/content/sample.mp4 -acodec libfdk_aac -vcodec libx264 -f rtsp -muxdelay 0.1 rtsp://myuser:mypassword#127.0.0.1:1935/live/myStream.sdp
You need to make sure your ffmpeg has the libx264 and libfdk_aac libraries available. You should be able to determine this by running just
ffmpeg
and it should print out the libraries available.
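A slightly more direct check is to list the compiled-in encoders and filter for the two you need (the grep pattern is only illustrative):

ffmpeg -hide_banner -encoders | grep -E 'libx264|libfdk_aac'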
If you have all the libraries then you are publishing a stream called
myStream.sdp
You then have instructions that say
and fill server with rtmp://localhost:1935/live
and Stream field with myStream
So you should either change your ffmpeg command to
ffmpeg -re -i /usr/local/WowzaMediaServer/content/sample.mp4 -acodec libfdk_aac -vcodec libx264 -f rtsp -muxdelay 0.1 rtsp://myuser:mypassword#127.0.0.1:1935/live/myStream
Note there is no longer a .sdp in the stream name. Alternatively, use a Stream field in the player of
myStream.sdp
When publishing a stream and then attempting to play it back, the names must match; otherwise you get Stream Not Found.
One way to successfully do this is to specify only the server and a port number in your ffmpeg command (65000 in this example; make sure it isn't 1935), then create a mystream.stream file in the content directory of your Wowza server with ONLY the following line:
udp://0.0.0.0:65000
Then, in Wowza/conf/startupstreams.xml, add the following:
<!-- Native RTP example (SDP file is myStream.sdp) -->
<StartupStream>
<Application>live/_definst_</Application>
<MediaCasterType>rtp</MediaCasterType>
<StreamName>mystream.stream</StreamName>
</StartupStream>
Restart Wowza and ffmpeg and then retry your URL with the stream name mystream.stream.
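The answer doesn't spell out the matching publish side. A hedged sketch, assuming the udp:// entry in the .stream file is meant to ingest an MPEG-TS stream on that port:

ffmpeg -re -i /usr/local/WowzaMediaServer/content/sample.mp4 \
    -acodec libfdk_aac -vcodec libx264 \
    -f mpegts udp://127.0.0.1:65000

If the ingest is actually native RTP rather than MPEG-TS, the output format and target would differ, so treat this only as a starting point.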
