Why does ffmpeg ignore the output fps?

I have the following ffmpeg command that streams an input to an RTMP endpoint:
ffmpeg
-re
-i -
-r 30
-vf scale=852:480
-c:v libx264
-pix_fmt yuv420p
-profile:v main
-preset veryfast
-x264opts "nal-hrd=cbr:no-scenecut"
-minrate 3000
-maxrate 3000
-g 60
-c:a aac
-b:a 160k
-ac 2
-ar 44100
-f flv
Some RTMPURL/ Some RTMPKey
This command works, but the output frame rate is not respected: it drops to 6 fps.
I need it to stay at 30 fps.
Does anyone know why it is not respected?
Thanks
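A possible starting point, assuming the piped input can actually be decoded faster than 6 fps (if whatever feeds the pipe only delivers 6 frames per second, no output option can invent the missing frames): force a constant output rate inside the filter chain with the fps filter. The rate-control options are left out of this sketch to keep it focused on the frame rate:
ffmpeg -re -i - -vf "scale=852:480,fps=30" -c:v libx264 -pix_fmt yuv420p -profile:v main -preset veryfast -g 60 -c:a aac -b:a 160k -ac 2 -ar 44100 -f flv "Some RTMPURL/ Some RTMPKey"
The fps filter duplicates or drops frames based on their timestamps, so whatever reaches the encoder is a constant 30 fps.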

Related

ffmpeg "steam" cbr gop

It's about live video streaming to Steam with ffmpeg.
I have this command:
ffmpeg -re -i file-from-webcam.webm -deinterlace
-c:v libx264 -pix_fmt yuv420p -preset veryfast
-tune zerolatency -c:a aac -b:a 128k -ac 2 -strict -2 -crf 18
-r 30 -g 60 -vb 1369k -minrate 1369k -maxrate 1369 -ar 44100 -x264-params "nal-hrd=cbr"
-vf "scale=1280:720" -profile:v main
-f flv "rtmp://ingest-rtmp.broadcast.steamcontent.com/app/steam_...."
but after a few seconds, the stream stops and the log of steam says
Make sure your upload key-frame interval is set to 2 seconds
and use constant bitrate (CBR).
Limit your encoders group of picture (GOP) to at most two times your framerate.
but I do have -x264-params "nal-hrd=cbr" and -r 30 -g 60 (frame rate 30, GOP 60)...
Is there something wrong in the ffmpeg command?
Or is it related to the Linux server?
The same ffmpeg command works very nicely on YouTube, Twitter, Twitch, DLive, Facebook, etc.,
so what am I missing to get it to work for Steam?
ffmpeg -re -i file.webm -deinterlace -c:v libx264 -preset veryfast -tune zerolatency -c:a aac -b:a 128k -ac 2 -r 30 -g 60 -vb 1369k -minrate 1369k -maxrate 1369k -bufsize 2730k -ar 44100 -x264-params "nal-hrd=cbr" -vf "scale=1280:720,format=yuv420p" -profile:v main -f flv "rtmp://ingest-rtmp.broadcast.steamcontent.com/app/___key___"
-crf and -b:v/-vb are mutually exclusive, so your -vb was likely being ignored. Since you want a specific bitrate, remove -crf.
-maxrate 1369 was missing the k.
Add -bufsize. See FFmpeg Wiki: Encoding for Streaming Sites.
No need for -strict -2. Users always add that without knowing why. (It was for the old AAC encoder, before 2015.)
Make sure your input has audio. Some sites like YouTube require audio. If it does not have audio, use the anullsrc source filter to generate silent audio, as in the sketch below.
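A sketch of that last point, assuming a video-only input file (the file name is a placeholder; the ingest URL is the masked one from above): anullsrc generates a silent stereo track that is mapped in alongside the video, and -shortest stops the infinite silence when the video ends:
ffmpeg -re -i video-only.webm -f lavfi -i anullsrc=channel_layout=stereo:sample_rate=44100 -map 0:v -map 1:a -c:v libx264 -preset veryfast -pix_fmt yuv420p -c:a aac -b:a 128k -ar 44100 -shortest -f flv "rtmp://ingest-rtmp.broadcast.steamcontent.com/app/___key___"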

I want to mix multiple crops in my command, but ffmpeg gives an error

I want to mix multiple crops in my command, but ffmpeg gives the error: Filter overlay has an unconnected output.
ffmpeg -y -i "tetcrop.mp4" -i fulla.png -filter_complex "[0:v]setpts=PTS/1,scale=854:480,select='lt(mod(t,20),20)',setpts=N/FRAME_RATE/TB,setdar=16/9[vm];[vm]crop=155:176:229:150,scale=854:480,setdar=16/9[v2];[0:v]scale=854:480,setdar=16/9,setpts=PTS/1,select='lt(mod(t,20),20)',setpts=N/FRAME_RATE/TB[vc];[vc][v2]overlay=shortest=1:enable='lt(mod(t,4),2)*gte(t,2)'[v3];[0:v]setpts=PTS/1,scale=854:480,select='lt(mod(t,20),20)',setpts=N/FRAME_RATE/TB,setdar=16/9[ve];[ve]crop=155:176:229:150,scale=854:480,setdar=16/9[v4];[0:v]scale=854:480,setdar=16/9,setpts=PTS/1,select='lt(mod(t,20),20)',setpts=N/FRAME_RATE/TB[vd];[vd][v4]overlay=shortest=1:enable='lt(mod(t,4),2)*gte(t,2)'[vout];[vout]overlay =main_w-overlay_w-5:5;[0:a]aformat=sample_fmts=fltp:sample_rates=44100:channel_layouts=stereo,atempo=1,aecho=0.5:0.3:2:0.5,aecho=0.5:0.3:2:0.5,aecho=0.5:0.3:2:0.5,highpass=f=10,treble=g=0,volume=10,volume=+25dB,aselect='lt(mod(t,20),20)',asetpts=N/SR/TB[a1];amovie=uottro.mp4:loop=999,volume=0.03[a2];[a1][a2]amix=duration=shortest" -vcodec libx264 -pix_fmt yuv420p -b:v 1000k -bf 2 -r 25 -g 60 -acodec libmp3lame -b:a 128k -ar 44100 -ac 2 -preset veryfast "tetcropok.mp4"
You made [vc][v2]overlay=shortest=1:enable='lt(mod(t,4),2)*gte(t,2)'[v3] but never told ffmpeg what to do with [v3]. It is orphaned, and every label produced inside -filter_complex must be consumed, either by another filter or by mapping it to the output.
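A stripped-down sketch of the pattern, reusing the input name from the question: the video is scaled once, split into two copies, one copy is cropped and overlaid on the other as a picture-in-picture, and the final label [vout] is explicitly mapped so nothing is left dangling:
ffmpeg -i tetcrop.mp4 -filter_complex "[0:v]scale=854:480,split[base][copy];[copy]crop=155:176:229:150[pip];[base][pip]overlay=main_w-overlay_w-5:5[vout]" -map "[vout]" -map 0:a -c:v libx264 -pix_fmt yuv420p -c:a aac out.mp4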

FFMPEG SCREEN RECORDING: How to get H265 (libx265) recording using ffmpeg with xorg?

I really would appreciate all the help I can get here.
I'm trying to use the libx265 codec for recording an xorg dummy screen. The command that currently works for H264 (libx264 codec) is:
ffmpeg -y -v info -f x11grab -draw_mouse 0 -r 30 -s 1280x720
-thread_queue_size 4096 -i :0.0+0,0 -f alsa -acodec aac -strict -2 -ar 44100 -b:a 128k -af aresample=async=1 -c:v libx264 -preset fast
-profile:v main -level 3.1 -pix_fmt yuv420p -r 30 -crf 21 -g 60 -tune zerolatency -f mp4 capture.mp4
In trying to get H265 instead, I first changed the codec to libx265 like below:
ffmpeg -y -v info -f x11grab -draw_mouse 0 -r 30 -s 1280x720
-thread_queue_size 4096 -i :0.0+0,0 -f alsa -acodec aac -strict -2 -ar 44100 -b:a 128k -af aresample=async=1 -c:v libx265 -preset fast
-profile:v main -level 3.1 -pix_fmt yuv420p -r 30 -crf 21 -g 60 -tune zerolatency -f mp4 capture.mp4
But that didn't do it. Although it didn't error, it produced a file that played back at twice the recorded speed.
Then I tried using -x265-params to specify the parameters like this:
ffmpeg -y -v info -f x11grab -draw_mouse 0 -r 30 -s 1280x720
-thread_queue_size 4096 -i :0.0+0,0 -f alsa -acodec aac -strict -2 -ar 44100 -b:a 128k -af aresample=async=1 -c:v libx264 -preset fast
-x265-params profile=main:level=3.1:crf=21 -pix_fmt yuv420p -r 30 -g 60 -tune zerolatency -f mp4 capture.mp4
And this gave me an error with the following message:
"Output file #0 does not contain any stream"
I've tried all sorts of combinations and searched extensively online (both for how to set x265 parameters and for the output-file error), but I'm not making any headway. I'm really new to all this. Can anyone please help (in the simplest terms and directions)?
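One thing that stands out in the last command: it still selects -c:v libx264 while passing -x265-params, so the x265 options never reach an x265 encoder. A hedged sketch that only switches the encoder, drops the H.264-specific -profile:v main -level 3.1, and keeps -crf 21 -g 60 (libx265 accepts both); everything else is copied unchanged from the question:
ffmpeg -y -v info -f x11grab -draw_mouse 0 -r 30 -s 1280x720 -thread_queue_size 4096 -i :0.0+0,0 -f alsa -acodec aac -strict -2 -ar 44100 -b:a 128k -af aresample=async=1 -c:v libx265 -preset fast -pix_fmt yuv420p -r 30 -crf 21 -g 60 -tune zerolatency -f mp4 capture.mp4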

FFMPEG -re Insurance

FFmpeg's -re option, according to the ffmpeg docs:
Read input at native frame rate. Mainly used to simulate a grab
device, or live input stream (e.g. when reading from a file). Should
not be used with actual grab devices or live input streams (where it
can cause packet loss).
My ffmpeg stream command is:
ffmpeg -re -i https://www.example.com/video.mp4 -filter_complex tpad=start_duration=10:stop_duration=15:start_mode=add:color=black:stop_mode=add -af adelay=10000|10000 -maxrate 2M -crf 24 -bufsize 6000k -c:v libx264 -preset superfast -tune zerolatency -strict -2 -c:a aac -ar 44100 -attempt_recovery 1 -max_recovery_attempts 5 -drop_pkts_on_overflow 1 -f flv rtmp://live.example.com/123453
Except this does not always work; sometimes my livestreams end early because ffmpeg runs ahead of the real frame rate. Is there another way to make sure ffmpeg streams the video in real time?
Remove -re and throttle inside the filtergraph instead: the realtime and arealtime filters slow the video and audio chains down to real time, while fifo and afifo buffer frames ahead of them.
ffmpeg -i https://www.example.com/video.mp4 -filter_complex "tpad=start_duration=10:stop_duration=15:start_mode=add:color=black:stop_mode=add,fifo,realtime" -af "adelay=10000|10000,afifo,arealtime" -maxrate 2M -crf 24 -bufsize 6000k -c:v libx264 -preset superfast -tune zerolatency -strict -2 -c:a aac -ar 44100 -attempt_recovery 1 -max_recovery_attempts 5 -drop_pkts_on_overflow 1 -f flv rtmp://live.example.com/123453
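The same pattern in isolation, with a placeholder input and RTMP URL, in case it is easier to see without the padding and recovery options:
ffmpeg -i input.mp4 -vf "fifo,realtime" -af "afifo,arealtime" -c:v libx264 -preset superfast -c:a aac -ar 44100 -f flv rtmp://live.example.com/stream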

FFMPEG image not updating

THE INPUT FILES
An overlay image that is being updated every 5 seconds by a Python script
A small MP4 file that will be looped by a concat input
An MP3 file as audio source
THE COMMAND (UPDATED)
This is the command I'm currently using to combine and stream the inputs.
ffmpeg -re -i music.mp3 -f concat -i videoincludes.txt
-r 1 -loop 1 -f image2 -i overlay.png
-c:v libx264 -c:a aac -shortest -crf 23 -pix_fmt yuv420p
-maxrate 2500k -bufsize 2500k -preset ultrafast -r 30 -g 60 -b:v 2000k -b:a 192k -ar 44100
-filter_complex "[1:v][2:v] overlay=0:0" -map 0:a -strict -2
-f flv rtmp://a.rtmp.youtube.com/live2/{key}
Also tried using -framerate 1 instead of -r 1.
THE ISSUE
So the issue is that the image doesn't always update. Sometimes it updates every couple of seconds at the start but stops after 10-20 seconds, with no difference in the log output; sometimes it just doesn't update at all.
I can, however, confirm that the image is being updated by the Python script; FFmpeg is just not picking this up.
I read that setting the input format of the image to image2 should allow it to update, so I am not sure what is wrong or what I can do to improve it.
I'm working on the same task, and I think I finally found the answer.
Because the streams differ from each other, we must reset their timestamps with setpts=PTS-STARTPTS so they all begin at the same zero timestamp. Also, try using image2pipe instead of image2.
This is your command with the timestamp reset:
ffmpeg -re -i music.mp3 -f concat -i videoincludes.txt
-r 1 -loop 1 -f image2pipe -i overlay.png
-c:v libx264 -c:a aac -shortest -crf 23 -pix_fmt yuv420p
-maxrate 2500k -bufsize 2500k -preset ultrafast -r 30 -g 60 -b:v 2000k -b:a 192k -ar 44100
-filter_complex "[1:v]setpts=PTS-STARTPTS[out_main]; [2:v]setpts=PTS-STARTPTS[out_overlay]; [out_main][out_overlay]overlay=0:0" -map 0:a -strict -2
-f flv rtmp://a.rtmp.youtube.com/live2/{key}
P.S. I think there is no need for -r or -framerate on the image input anymore.
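If image2pipe is used, the more common arrangement is to have the Python script write each fresh PNG to ffmpeg's stdin rather than pointing image2pipe at a file on disk. A rough sketch, assuming a hypothetical write_overlay.py that prints a complete PNG to stdout every few seconds (-c:v png before -i - forces the PNG decoder for the piped images):
python write_overlay.py | ffmpeg -re -i music.mp3 -f concat -i videoincludes.txt -f image2pipe -c:v png -i - -filter_complex "[1:v]setpts=PTS-STARTPTS[main];[2:v]setpts=PTS-STARTPTS[ovl];[main][ovl]overlay=0:0" -map 0:a -c:v libx264 -crf 23 -pix_fmt yuv420p -maxrate 2500k -bufsize 2500k -preset ultrafast -g 60 -c:a aac -b:a 192k -ar 44100 -f flv rtmp://a.rtmp.youtube.com/live2/{key}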
