How to convert MP4 frame rate like 14.939948fps to 15fps - ffmpeg

Description
I pushed a USB camera stream with ffmpeg to an RTMP streaming server called SRS.
SRS saved an MP4 file for me. In VLC the frame rate is not a common value - it's 14.939948. I've checked it out; it seems to be the 'ntsc' format.
Meanwhile, I had received the same stream with OpenCV and saved it as another MP4 file. The two files are not synchronized.
I have tried to convert the frame rate with ffmpeg, but the files were still not synchronized. The only way that worked was to load the file into Adobe Premiere and modify the frame rate there. Here is the ffmpeg command I executed:
ffmpeg -i 1639444871684_copy.mp4 -filter:v fps=15 out.mp4
Aside from changing the stream server, how can I convert the frame rate to a normal value and keep the files synchronized at the same time?

Note: For live streaming, you should never depend on the FPS, because RTMP/FLV always uses a fixed TBN of 1k, so some deviation is always introduced when you publish a stream as RTMP or record it to another format like TS/MP4.
Note: For WebRTC, the fps is variable; please read 'Would WebRTC use a constant frame rate to capture video frames' or read about Variable Frame Rate (VFR).
This is not a problem of SRS or of FPS handling; you can reproduce it with FFmpeg alone:
Use FFmpeg to transcode doc/source.flv from 25fps to 15fps, then publish it to SRS over RTMP (15fps).
Use FFmpeg to record the RTMP stream (15fps) as output.mp4 (15fps).
Use VLC to play output.mp4 (15fps); it shows that the fps IS NOT 15fps.
First, start SRS with the config below; note that DVR is disabled:
# ./objs/srs -c test.conf
listen 1935;
daemon off;
srs_log_tank console;
vhost __defaultVhost__ {
}
Run FFmpeg to transcode and publish to SRS, changing the fps to 15:
cd srs/trunk
ffmpeg -re -i doc/source.flv -c:v libx264 -r 15 -c:a copy \
-f flv rtmp://localhost/live/livestream
Record the RTMP stream (15fps) to output.mp4; note that in the FFmpeg logs the fps is 15:
ffmpeg -f flv -i rtmp://localhost/live/livestream -c copy -y output.mp4
Use VLC to play output.mp4, which should be 15fps, and open Window -> Media Information; you will find that the fps hovers around 14.8fps, not 15fps!
This is because the TBN of RTMP/FLV is fixed at 1000 (1k tbn). At 15fps each frame interval is 1000/15 ≈ 66.67ms, which cannot be represented exactly in a 1ms timebase, so frame timestamps get rounded and a deviation is introduced when the stream is published over RTMP. It's not caused by the DVR; it's caused by the RTMP/FLV TBN.
Note: However, for SRS, using a fixed TBN of 1k is not a good choice, because it's not friendly to MP4 duration, so I reopened issue srs#2790.
Ultimately, the framerate/fps is not a fixed property; it's just a number that gives a hint about the stream. The player always uses the DTS/PTS to decide when and how to render each picture.
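If you want to see the millisecond-rounded timestamps yourself, a hedged sketch: ffprobe can dump the per-frame presentation timestamps of the recording (newer builds expose the field as pts_time, older ones as pkt_pts_time):
ffprobe -v error -select_streams v:0 -show_entries frame=pts_time -of csv=p=0 output.mp4
The intervals between successive values should alternate around 66 and 67 milliseconds rather than being exactly 1/15 of a second.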

Answering myself. Here is my method: read the file with OpenCV and write the frames out to a new file at 15 FPS. The two files are then synchronized.

With -r:
ffmpeg -i 1639444871684_copy.mp4 -r 15 out.mp4
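To check the result, a hedged example: ffprobe can report the nominal and average frame rates of the output's video stream:
ffprobe -v error -select_streams v:0 -show_entries stream=r_frame_rate,avg_frame_rate -of default=noprint_wrappers=1 out.mp4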

Related

FFmpeg and ffplay can access an RTSP stream from one IP, but not from another

The situation is kind of complex. I was archiving several CCTV camera feeds (RTSP, H.264, no audio) through OpenCV, which worked, but the CPU utilization was too high and it started to drop frames from time to time.
To reduce the CPU utilization, I started to use FFmpeg to skip the decoding and encoding processes, which worked perfectly on my home machine. However, when I connected to my university VPN and tried to deploy it on our lab server, FFmpeg couldn't read any frames and ffplay couldn't get anything either. However, OpenCV, VLC Player and IINA Player could still read and display the feed.
In summary:
1 FFmpeg/ffplay
1.1 can only read the feed from my home network (Wi-Fi, Optimum)
1.2 from the other two networks, the error message says: "Could not find codec parameters for stream 0 (Video: h264, none): unspecified size. Consider increasing the value for the 'analyzeduration' (0) and 'probesize' (5000000) options"
2 IINA/VLC Player, OpenCV
These tools can get the video all the time.
I'm wondering whether it's related to some specific port access that ffmpeg requires but the others don't. I'd appreciate any suggestions.
As references, the tested ffplay command is simple:
ffplay 'the rtsp address'
Thanks
Update
More tests have been performed.
By specifying rtsp_transport as TCP, ffplay can play the video, but FFmpeg can't access the video. (In the beginning, when both FFmpeg and ffplay worked through my home network, it was UDP)
The FFmpeg command is as follows:
ffmpeg -i rtsp://the_ip_address/axis-media/media.amp -hide_banner -c:v copy -s 1920x1080 -segment_time 00:30:00 -f segment -strftime 1 -reset_timestamps 1 -rtsp_transport tcp "%Y-%m-%d-%H-%M-%S_Test.mp4"
Please help...
Solved by forcing it to use "-rtsp_transport tcp" placed right before -i (input options have to appear before the input they apply to).
ffmpeg -rtsp_transport tcp -i rtsp://the_ip_address/axis-media/media.amp -hide_banner -c:v copy -s 1920x1080 -segment_time 00:30:00 -f segment -strftime 1 -reset_timestamps 1 "%Y-%m-%d-%H-%M-%S_Test.mp4"
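If forcing TCP alone is not enough, the error message quoted above hints at another knob. A hedged follow-up example that also raises the probing limits on the input (the 10M values are arbitrary placeholders):
ffplay -rtsp_transport tcp -probesize 10M -analyzeduration 10M 'the rtsp address'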

FFmpeg -c copy not carrying over audio track from MKV to HLS stream

I am using ffmpeg to create an HLS stream. The source is an MKV with multiple audio tracks. I have tried using -map to specify the audio stream as well. I also found that when I point ffmpeg at any other audio stream in the file it works; it's just the first audio stream that does not. At one point I replaced -c copy with -acodec aac -ac 6 on the first stream and I got sound, which is great, but I am only looking to copy the stream and not re-encode it. The next thing I tried was using other MKV videos I have; all show the same issue. The MKVs by themselves play both audio and video fine in VLC. When playing output.m3u8 in VLC, the option to choose different audio tracks is greyed out. Here is the command I'm using:
ffmpeg -i "./video.mkv" -ss 00:00:00 -t 00:00:30 -c copy -f hls "output.m3u8"
I want the audio of my HLS stream to reflect that of the MKV source, but what I get back from the command above gives me no sound, as MediaInfo also shows.
I've also noticed that HLS does not support PCM. Is it possible DASH could work with this stream, since the audio is PCM?
HLS segments can be either MPEG-TS or fragmented MP4. Neither officially support PCM audio, so you'll have to convert it.
DASH uses fragmented MP4 as its segment format.
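A hedged sketch of what that conversion could look like while still copying the video (the stream indexes are assumptions; adjust the -map options to the audio track you actually want):
ffmpeg -i "./video.mkv" -ss 00:00:00 -t 00:00:30 -map 0:v:0 -map 0:a:0 -c:v copy -c:a aac -b:a 192k -f hls "output.m3u8"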

FFMPEG screen capture outputting very poor and inconsistent framerate as webm with no audio

I've been testing different parameters to capture my desktop video and audio (desktop audio, not mic) and I find that no matter what settings I have, the resulting webm file's framerate is around 5fps and is horribly inconsistent. It starts at around 20fps and slowly drops over time until about 4-5fps. I'm not really sure what I'm doing wrong, but here is the basic command I'm using:
ffmpeg -y -video_size 1920x1080 -f gdigrab -framerate 60 -i desktop -c:v libvpx-vp9 -acodec libvorbis -c:a libopus -b:v 2M -threads 4 output.webm
I've tried anywhere between 30-60 fps and tested different bitrates but nothing seems to affect the output framerate.
Also, I know that acodec and c:a are for audio but I'm not sure how to specify the audio device to use.
So my issues are horrible framerate for webm and how to include desktop audio in the recording.
You can use arecord, pipe its output through stdout, and have ffmpeg read it from stdin.
aplay piping to arecord using a file instead of stdin and stdout
Replace the aplay command with your ffmpeg command. Don't forget to add '-i -' to ffmpeg.
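A minimal sketch of that pipe, assuming an ALSA/Linux setup with a working default capture device (on Windows you would need a different audio source, e.g. a dshow device):
arecord -f cd -t raw | ffmpeg -f s16le -ar 44100 -ac 2 -i - -c:a libopus desktop-audio.opus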
One doubt: why are you defining the audio encoder twice?
It's impossible to say from the question why the video frame rate is low. It could be an issue with the encoder, or an issue reading the input. Remove the video encoding option and see if the issue persists. If it works fine, try some other encoders.
Use -c:v libx264 instead of -c:v libvpx-vp9. libvpx-vp9's realtime encoding quality is really bad; even regular libvpx (i.e. VP8) is much better. If you insist on using libvpx, use options like -deadline realtime and -cpu-used -4.
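For example, a hedged variant of the original command using libx264 (the preset, framerate and bitrate are assumptions; capturing desktop audio on Windows would still require a separate dshow/WASAPI audio device, whose name varies per system):
ffmpeg -y -f gdigrab -framerate 30 -video_size 1920x1080 -i desktop -c:v libx264 -preset ultrafast -pix_fmt yuv420p -b:v 2M output.mp4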

How to encode video using the HEVC/H.265 codec via ffmpeg on OS X

I am trying to encode a video using the HEVC codec:
./ffmpeg -i 1234.mp4 -vcodec hevc_videotoolbox -vb 1000k -acodec aac -ab 192k -sn 2.mp4
error:
[hevc_videotoolbox @ 0x7fc681813a00] Error: cannot create compression session: -12908
[hevc_videotoolbox @ 0x7fc681813a00] Try -allow_sw 1. The hardware encoder may be busy, or not supported.
Error initializing output stream 0:0 -- Error while opening encoder for output stream #0:0 - maybe incorrect parameters such as bit_rate, rate, width or height
I tried changing the bitrate, width and height, but the error still exists.
Is it possible to encode video this way on my MacBook Air 2015?
Is it that VideoToolbox can't use my old GPU, and it is only possible on newer MacBooks?
I had the same question recently, and I found this while looking for answers:
Screenshot from this PDF (link not preserved).
It's possible that an old Mac doesn't support HEVC hardware acceleration natively. I don't have a new Mac to test whether ffmpeg exposes anything related to it; maybe someone whose Mac has a 6th-gen Intel CPU can help you address the problem.
Edit:
I tested the following command on the latest 2018 MacBook Pro and it worked:
ffmpeg -i VIDEO_PATH -vcodec hevc_videotoolbox -tag:v hvc1 OUT_PATH
The size of the hevc_videotoolbox-encoded video #1 is smaller than the original test file (H.264) but larger than the libx265-encoded video #2 (using default parameters).
Unexpectedly, the quality of video #1 is much worse than that of the original, whereas video #2 looks untouched. Besides, hevc_videotoolbox doesn't support the -crf option, so I'm still sticking with libx265, even though it is really slow.
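Two hedged examples of the alternatives discussed above (the bitrates and the CRF value are arbitrary placeholders). The first lets VideoToolbox fall back to a software session, as the error message suggests; the second is a plain libx265 encode with -crf:
ffmpeg -i 1234.mp4 -c:v hevc_videotoolbox -allow_sw 1 -b:v 1000k -tag:v hvc1 -c:a aac -b:a 192k out_vt.mp4
ffmpeg -i 1234.mp4 -c:v libx265 -crf 28 -preset medium -tag:v hvc1 -c:a copy out_x265.mp4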

FFmpeg Stream Transcoding

I have got a streaming application that displays the stream sent from a Flash Media Server.
I want to grab that stream and transcode it to an output stream with a different bitrate using ffmpeg.
Can this kind of thing be done using ffmpeg?
This will get input from a feed, and transcode it to an MKV file with default audio and video codecs, and 1024k bitrate for the video stream (audio bitrate is specified with '-ab'):
ffmpeg -i "http://my_server/video_feed" -b 1024k output.mkv
For a live feed try this (not sure if it'll work, I don't have ffmpeg to test it right now):
ffmpeg -i "http://my_server/input_video_feed" -b 1024 -f flv "http://my_server/output_video_feed"
This should create an FLV feed.
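As a hedged, more explicit variant (the RTMP URLs, codecs and bitrates here are assumptions; Flash Media Server typically serves RTMP rather than HTTP):
ffmpeg -i rtmp://my_server/app/input_stream -c:v libx264 -b:v 1024k -c:a aac -b:a 128k -f flv rtmp://my_server/app/output_stream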
