Recording Screen with FFmpeg

I recently used the FFmpeg bash commands below to capture a screen recording.
Are there any improvements that I could make to streamline this process?
Or perhaps some settings to reduce the size of the output files?
Ideally, I'd like to capture to mp4 directly. Is this possible?
Any general advice about FFmpeg screen recording would be most appreciated.
ffmpeg -f x11grab -y -r 30 -s 1920x1080 -i :0.0 -vcodec huffyuv out.avi
# Then convert it to .mp4
ffmpeg -y -i out.avi -s 1920x1080 -f mp4 -vcodec libx264 -preset slow -crf 18 -b:v 3000k -maxrate 4000k -bufsize 512k -c:a aac -b:a 128k -strict -2 out.mp4
# and remove the .avi
rm out.avi

In general, for FFmpeg, input formats and output formats aren't tied to one another, so you can save in any format as long as the codecs are acceptable in the output format.
So, this will do:
ffmpeg -f x11grab -y -framerate 30 -s 1920x1080 -i :0.0 -c:v libx264 -preset superfast -crf 18 out.mp4
You may need to add -pix_fmt yuv420p after -i :0.0 for player compatibility.
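For example, a minimal variant of the command above with the pixel format set explicitly (same display, size, and CRF as before; untested here):
ffmpeg -f x11grab -y -framerate 30 -s 1920x1080 -i :0.0 -pix_fmt yuv420p -c:v libx264 -preset superfast -crf 18 out.mp4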

Related

FFMPEG SCREEN RECORDING: How to get H265 (libx265) recording using ffmpeg with xorg?

I really would appreciate all the help I can get here.
I'm trying to use the libx265 codec for recording an xorg dummy screen. The command that currently works for H264 (libx264 codec) is:
ffmpeg -y -v info -f x11grab -draw_mouse 0 -r 30 -s 1280x720 \
  -thread_queue_size 4096 -i :0.0+0,0 -f alsa -acodec aac -strict -2 -ar 44100 -b:a 128k -af aresample=async=1 -c:v libx264 -preset fast \
  -profile:v main -level 3.1 -pix_fmt yuv420p -r 30 -crf 21 -g 60 -tune zerolatency -f mp4 capture.mp4
In trying to get H265 instead, I first changed the codec to libx265 like below:
ffmpeg -y -v info -f x11grab -draw_mouse 0 -r 30 -s 1280x720 \
  -thread_queue_size 4096 -i :0.0+0,0 -f alsa -acodec aac -strict -2 -ar 44100 -b:a 128k -af aresample=async=1 -c:v libx265 -preset fast \
  -profile:v main -level 3.1 -pix_fmt yuv420p -r 30 -crf 21 -g 60 -tune zerolatency -f mp4 capture.mp4
But that didn't do it. Although it didn't produce an error, the output file played back at twice the speed of the clip that was recorded.
Then I tried using -x265-params to specify the parameters like this:
ffmpeg -y -v info -f x11grab -draw_mouse 0 -r 30 -s 1280x720 \
  -thread_queue_size 4096 -i :0.0+0,0 -f alsa -acodec aac -strict -2 -ar 44100 -b:a 128k -af aresample=async=1 -c:v libx264 -preset fast \
  -x265-params profile=main:level=3.1:crf=21 -pix_fmt yuv420p -r 30 -g 60 -tune zerolatency -f mp4 capture.mp4
And this gave me an error with the following message:
"output file #0 does not contain any stream ffmpeg"
I've tried all sorts of combinations and searched extensively online (both for how to set x265 parameters and for the output file error), but I'm not making any headway. I'm really new to all this. Can anyone please help (in the simplest possible terms and directions)?
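For reference, -x265-params is an option of the libx265 encoder, so it is normally paired with -c:v libx265; the last attempt above still selects -c:v libx264. A rough, untested sketch of the video-only part with the codec switched, using ffmpeg's own -crf and -g options instead of -x265-params (audio omitted for brevity, and this does not address the double-speed playback):
ffmpeg -y -f x11grab -draw_mouse 0 -framerate 30 -video_size 1280x720 -thread_queue_size 4096 -i :0.0+0,0 -c:v libx265 -preset fast -crf 21 -pix_fmt yuv420p -g 60 -tune zerolatency -f mp4 capture.mp4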

FFMPEG Add watermark to MP4

I have this command to add a watermark to an mp4:
ffmpeg -i junai-blvaz.mp4 -i evercam-logo-white.png -filter_complex "[1]scale=iw/2:-1[wm];[0][wm]overlay=x=main_w-overlay_w-10:y=main_h-overlay_h-10" -codec:a copy output.mp4
But I am creating the video using:
ffmpeg -r 6 -i /tmp/%d.jpg -c:v h264_nvenc -r 6 -preset slow -bufsize 1000k -pix_fmt yuv420p -y junai-blvaz.mp4
Is there any way to merge this watermark-adding part
-i evercam-logo-white.png -filter_complex '[1]scale=iw/2:-1[wm];[0][wm]overlay=x=main_w-overlay_w-10:y=main_h-overlay_h-10'
into the very first command, through which the mp4 video is created?
Combine the two commands:
ffmpeg -y -framerate 6 -i /tmp/%d.jpg -i evercam-logo-white.png -filter_complex "[1]scale=iw/2:-1[wm];[0][wm]overlay=x=main_w-overlay_w-10:y=main_h-overlay_h-10,format=yuv420p" -c:v h264_nvenc -preset slow -bufsize 1000k junai-blvaz.mp4
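To confirm the combined command still produces the expected resolution, pixel format, and frame rate, a quick ffprobe check can be used (a generic example, not part of the original answer):
ffprobe -v error -select_streams v:0 -show_entries stream=width,height,pix_fmt,avg_frame_rate -of default=noprint_wrappers=1 junai-blvaz.mp4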

Record and stream desktop to Youtube by ffmpeg with HD resolution

I want to record and stream my desktop to YouTube Live with FFmpeg, but the output resolution is very low, 360p at most.
What options do I need to change?
ffmpeg -framerate 30 -f x11grab -i :1 -f pulse -i default -c:v libx264 -s 1920x1080 -r 60 -b:v 5000k -crf 10 -vf format=yuv420p -c:a aac -b:a 128k -f flv rtmp://a.rtmp.youtube.com/live2/stream_key
Problem
The default size for x11grab is the full desktop or window (640x480 for old ffmpeg versions). Your ffmpeg is old, so it is capturing at 640x480. You are then upscaling 640x480 to 1920x1080, which is bad and looks ugly.
Solution 1: Upgrade ffmpeg
Fix this by using a modern ffmpeg version, which will grab the full desktop or window size by default. See the FFmpeg Download page for links, or the FFmpeg compile and install guides.
Solution 2: Use -video_size input option
ffmpeg -framerate 30 -video_size 1920x1080 -f x11grab -i :0.0 -f pulse -i default -c:v libx264 -b:v 5000k -maxrate 5000k -bufsize 10000k -g 60 -vf format=yuv420p -c:a aac -b:a 128k -f flv rtmp://a.rtmp.youtube.com/live2/stream_key
See the FFmpeg x11grab documentation for more info and options.
For streaming it is recommended to add -g, -bufsize, and -maxrate to enable VBV.
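To sanity-check the capture and encoder settings without going live, the same command can write to a short local file first; a hedged example limited to 10 seconds (test.flv is just a placeholder name):
ffmpeg -framerate 30 -video_size 1920x1080 -f x11grab -i :0.0 -f pulse -i default -c:v libx264 -b:v 5000k -maxrate 5000k -bufsize 10000k -g 60 -vf format=yuv420p -c:a aac -b:a 128k -t 10 -f flv test.flv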

FFMPEG image not updating

THE INPUT FILES
An overlay image that is being updated every 5 seconds by a Python script
A small MP4 file that will be looped by a concat input
An MP3 file as audio source
THE COMMAND (UPDATED)
This is the command I'm currently using to combine and stream the inputs.
ffmpeg -re -i music.mp3 -f concat -i videoincludes.txt \
  -r 1 -loop 1 -f image2 -i overlay.png \
  -c:v libx264 -c:a aac -shortest -crf 23 -pix_fmt yuv420p \
  -maxrate 2500k -bufsize 2500k -preset ultrafast -r 30 -g 60 -b:v 2000k -b:a 192k -ar 44100 \
  -filter_complex "[1:v][2:v] overlay=0:0" -map 0:a -strict -2 \
  -f flv rtmp://a.rtmp.youtube.com/live2/{key}
Also tried using -framerate 1 instead of -r 1.
THE ISSUE
So the issue is that the image doesn't always update. Sometimes it updates every couple of seconds at the start, but stops updating after 10-20 seconds without any difference in the log output; and sometimes it just doesn't update at all.
I can, however, confirm that the image is being updated by the Python script; FFmpeg is just not picking this up.
I read that setting the input format of the image to image2 should allow it to update, so I am not sure what is wrong or what I can do to improve it.
I'm working on the same task, and I think I finally found the answer.
Because the streams differ from each other, we must reset their timestamps with setpts=PTS-STARTPTS so that they all begin at the same zero timestamp. Also, try using image2pipe instead of image2.
This is your command with the timestamp reset:
ffmpeg -re -i music.mp3 -f concat -i videoincludes.txt \
  -r 1 -loop 1 -f image2pipe -i overlay.png \
  -c:v libx264 -c:a aac -shortest -crf 23 -pix_fmt yuv420p \
  -maxrate 2500k -bufsize 2500k -preset ultrafast -r 30 -g 60 -b:v 2000k -b:a 192k -ar 44100 \
  -filter_complex "[1:v]setpts=PTS-STARTPTS[out_main]; [2:v]setpts=PTS-STARTPTS[out_overlay]; [out_main][out_overlay]overlay=0:0" -map 0:a -strict -2 \
  -f flv rtmp://a.rtmp.youtube.com/live2/{key}
P.S. I think there is no need for -r or -framerate anymore.

ffmpeg converted .mp4 videos are not playing on windows

I am converting videos with the extensions "flv", "avi", "mp4", "mkv", "mpg", "wmv", "asf", "webm", "mov", "3gp", and "3gpp" into "mp4" for better quality.
Command I am using:
ffmpeg -i <server_path>/g9zyy2qg54qp1l5spo2-mergedFile.webm -strict -2 -vcodec libx264 -preset slow -vb 500k -maxrate 500k -bufsize 1000k -vf 'scale=-1:480 ' -threads 0 -ab 64k -s 640x480 -movflags faststart -metadata:s:v:0 rotate=0 <server_path>/g9zyy2qg54qp1l5spo2-mergedFile7.mp4
Videos are working fine everywhere except on Windows. No video plays on the Windows platform. I tried playing them in Firefox and Opera, and even downloaded them and played them in media player software, but they didn't work at all.
Can you please tell me which codecs I should use to make the videos play on Windows as well?
My video plays in Windows 10 after adding parameters for pix_fmt and resolution (the width and height should be even numbers):
ffmpeg -i temp-%d.png -c:v libx264 -strict -2 -preset slow -pix_fmt yuv420p -vf "scale=trunc(iw/2)*2:trunc(ih/2)*2" -f mp4 output.mp4
Use
"ffmpeg -i {$audioFile} -i {$videoFile} -map 1:0 -map 0:0 -strict -2 -vcodec libx264 -preset slow -vb 500k -maxrate 500k -bufsize 1000k -vf 'scale=-1:480 ' -threads 0 -ab 64k -s 640x480 -movflags +faststart -metadata:s:v:0 rotate=0 -fflags +genpts <server_path>/g9zyy2qg54qp1l5spo2-mergedFile7.mp4
(this uses the original command in your question)
Found a fix here: convert webm to mp4. Now, after getting the merged webm file, I convert it to mp4 using the command "ffmpeg -fflags +genpts -i 1.webm -r 24 1.mp4". This mp4 file plays in Windows browsers.
For the above process I have to use 2 ffmpeg commands:
1. To merge the audio and video files into one webm file:
"ffmpeg -i {$audioFile} -i {$videoFile} -map 0:0 -map 1:0 -strict -2 {$mergedFileName}"
2. To make the mp4 file:
"ffmpeg -fflags +genpts -i {$mergedFile} -strict -2 -r 24 {$mp4File}"
Can I combine the above 2 commands, which take the audio and video files as input, into one that gives me a single mp4 file?
Edit:
I have combined the above 2 commands:
"ffmpeg -fflags +genpts -i {$videoFile} -i {$audioFile} -strict -2 -r 24 {$mp4File}"
It's working well for me. The resulting mp4 video plays in Windows 7 browsers (Chrome, Firefox, Opera) and also works in Linux browsers (Firefox, Opera).
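If the default encoder settings still give playback trouble on some Windows players, a hedged variant of the combined command that re-encodes with the compatibility settings mentioned earlier in this thread (explicit libx264, yuv420p, even dimensions, +faststart) might look like this; it is a sketch, not a tested drop-in:
ffmpeg -fflags +genpts -i {$videoFile} -i {$audioFile} -c:v libx264 -preset slow -pix_fmt yuv420p -vf "scale=trunc(iw/2)*2:trunc(ih/2)*2" -c:a aac -strict -2 -b:a 128k -movflags +faststart -r 24 {$mp4File}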
