FFmpeg images to video + overlay video

I am trying to make a 15-second video where the background layer is a video made up of 2 images; the first command below creates that 15-second background video from the 2 images. I chose a small frame rate so it renders an mp4 quickly. I then overlay a webm video (which has transparency) over the images. The final video seems to keep the frame rate of 2, but I would rather keep the 24 fps of the webm video.
Is this possible? And is it also possible to turn the two commands below into one?
ffmpeg -loop 1 -framerate 2 -t 11 -i image1.png -loop 1 -framerate 2 -t 4 -i image2.png -filter_complex "[0][1]concat=n=2" backgroundvideo.mp4;
ffmpeg -i backgroundvideo.mp4 -c:v libvpx-vp9 -i overlayvideo.webm -filter_complex overlay newvid.mp4

You can use the fps filter to raise your background's frame rate, and the whole thing can be done in one command:
ffmpeg \
-loop 1 -framerate 2 -t 11 -i image1.png \
-loop 1 -framerate 2 -t 4 -i image2.png \
-c:v libvpx-vp9 -i overlayvideo.webm \
-filter_complex '[0][1]concat,fps=24[bg];[bg][2]overlay' \
newvid.mp4

Related

ffmpeg can loop png but not audio

I'm using the following to stream an image to YouTube:
ffmpeg -threads:v 2 -threads:a 8 -filter_threads 2 -thread_queue_size 1080 \
-loop 1 -re -i ./image.png \
-i ./track.mp3 \
-pix_fmt yuv420p -c:v libx264 -qp:v 19 -profile:v high -rc:v cbr_ld_hq -level:v 4.2 -r:v 60 -g:v 120 -bf:v 3 -refs:v 16 -preset fast -f flv rtmp://a.rtmp.youtube.com/live2/xxx
And the looping of the image (to keep the stream running) works, but the audio does not loop.
Remember that FFmpeg input options are applied per input. So -loop 1 is only applied to the -i ./image.png input, and -i ./track.mp3 has no input options at all. To loop the audio track, you need the -stream_loop input option, like this:
ffmpeg -threads:v 2 -threads:a 8 -filter_threads 2 -thread_queue_size 1080 \
-loop 1 -re -i ./image.png \
-stream_loop -1 -i ./track.mp3 \
...
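A minimal way to test the same idea locally (hypothetical filenames, writing to a file rather than streaming; -t 60 simply caps the otherwise endless output):
ffmpeg -loop 1 -framerate 30 -i image.png \
-stream_loop -1 -i track.mp3 \
-c:v libx264 -pix_fmt yuv420p -c:a aac -t 60 out.mp4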

ffmpeg adjust contrast, show histogram

I'm trying to first adjust the contrast of a frame extracted from an mp4, then overlay the histogram of the resultant frame on top. My command here does all of this, but also adjusts the contrast of the histogram itself. Is there a single ffmpeg command that can do what I wish?
ffmpeg -ss 3.5 -i in.mp4 -an -y -vf \
"split=2[a][b],[b]eq=0.5:0:1:1:1:1:1,histogram=levels_mode=logarithmic:\
components=1:level_height=100, [a]overlay,eq=0.5:0:1:1:1:1:1" \
-vframes 1 -q:v 2 out.jpg
Apply the eq adjustment before the split, so the histogram is computed from the already-adjusted frame and the adjustment is not applied a second time to the overlaid histogram:
ffmpeg -ss 3.5 -i in.mp4 -an -y -filter_complex \
"eq=0.5:0:1:1:1:1:1,split=2[a][b];[b]histogram=levels_mode=logarithmic:\
components=1:level_height=100[b];[a][b]overlay" -vframes 1 -q:v 2 out.jpg

Change image overlay on demand

I need your help. I stream to Twitch with this command:
ffmpeg -i input.mp4 -i image.jpg -filter_complex 'overlay=x=10:x=10' -s \
1920x1200 -framerate 15 -c:v libx264 -preset ultrafast -pix_fmt yuv420p \
-threads 0 -f flv 'rtmp://'
How is it possible to change image.jpg to another picture at a variable time? I don't want to restart the FFmpeg command.
Add the -f image2 -loop 1 input options for the image input, then atomically replace image.jpg when desired such as by using mv.
Basic example:
ffmpeg -i input.mp4 -f image2 -loop 1 -i image.jpg -filter_complex overlay output.mp4
Streaming example:
ffmpeg -re -i input.mp4 -f image2 -loop 1 -i image.jpg -filter_complex "overlay,format=yuv420p" -c:v libx264 -preset fast -g 50 -b:v 4000k -maxrate 4000k -bufsize 8000k -f flv 'rtmp://'
To answer the "variable time" part of your question, use a cron job to run a script that updates the overlay image at a set interval, e.g. every 5 minutes. For example, you can keep a folder of various overlays, pick one at random every 5 minutes, and copy it to image.jpg; FFmpeg will then render the new image into your stream (a minimal script along these lines is sketched below).
It is important to use -f image2 -loop 1 -thread_queue_size 512 -i image.jpg, especially when rendering other image formats.
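A rough sketch of such a script, assuming a hypothetical overlays/ directory of JPEGs next to image.jpg and GNU shuf being available; copying to a temporary file and then renaming with mv keeps the swap atomic, so FFmpeg never reads a half-written image:
#!/bin/sh
# Pick a random overlay and swap it in as image.jpg atomically
new="$(ls overlays/*.jpg | shuf -n 1)"
cp "$new" image.tmp.jpg
mv image.tmp.jpg image.jpg
Run it from cron, for example with a crontab entry such as */5 * * * * /path/to/swap-overlay.sh (script name is hypothetical) to rotate the overlay every 5 minutes.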

Merge videos and images using ffmpeg

I'm trying to compile one .webm file that contains this:
10 seconds showing image1.jpg
Show a movie (an .mp4 file), which lasts about 20 seconds
10 seconds showing image2.jpg
10 seconds showing image3.jpg
I was unable to find out how/if the concatenate functionality of ffmpeg could do such a thing. Any clues?
You can use the concat filter.
Without audio
ffmpeg \
-loop 1 -framerate 24 -t 10 -i image1.jpg \
-i video.mp4 \
-loop 1 -framerate 24 -t 10 -i image2.jpg \
-loop 1 -framerate 24 -t 10 -i image3.jpg \
-filter_complex "[0][1][2][3]concat=n=4:v=1:a=0" out.mp4
Match -framerate with frame rate from video.mp4.
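One way to check the frame rate of video.mp4 is with ffprobe, which ships with FFmpeg:
ffprobe -v error -select_streams v:0 \
-show_entries stream=avg_frame_rate -of default=noprint_wrappers=1:nokey=1 video.mp4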
With audio
If there is audio in video.mp4 you'll need to provide audio for the images as well for it to be able to concatenate. Example of generating silence:
ffmpeg \
-loop 1 -framerate 24 -t 10 -i image1.jpg \
-i video.mp4 \
-loop 1 -framerate 24 -t 10 -i image2.jpg \
-loop 1 -framerate 24 -t 10 -i image3.jpg \
-f lavfi -t 0.1 -i anullsrc=channel_layout=stereo:sample_rate=44100 \
-filter_complex "[0:v][4:a][1:v][1:a][2:v][4:a][3:v][4:a]concat=n=4:v=1:a=1" out.mp4
Match channel_layout with audio channel layout (stereo, mono, 5.1, etc) from video.mp4.
Match sample_rate with audio sample rate from video.mp4.
No need to match the -t duration from anullsrc with any associated video input: the concat filter will automatically pad it to match video duration.
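Likewise, the sample rate and channel layout of the audio in video.mp4 can be read with ffprobe:
ffprobe -v error -select_streams a:0 \
-show_entries stream=sample_rate,channel_layout -of default=noprint_wrappers=1 video.mp4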

Adding splash screen using FFMPEG

I'm trying to use FFmpeg to add a splash screen that fades out after 2 seconds to the start of a video.
I'm using the following command:
ffmpeg -loop 1 -framerate 2 -t 2 -i image.png \
-i video.mp4 \
-filter_complex "[0:v]fade=t=in:st=0:d=0.500000,fade=t=out:st=4.500000:d=0.500000,setsar=1; \
[0:0] [1:0] concat=n=2:v=1:a=0" \
-c:v libx264 -crf 23 output.mp4
but it generates a video whose duration is correct, yet it only plays for 2 seconds, exactly the splash screen duration.
Since I don't have much experience with FFmpeg and got this command from the internet, I don't know where the problem is...
Use the following. The faded image chain is labeled [i] and that label is fed into concat; in your command the fade chain's output was never passed to concat, and the fade-out was timed at 4.5 s, after the 2-second image has already ended:
ffmpeg -i video.mp4 -loop 1 -t 2 -i image.png \
-filter_complex \
"[1]fade=t=in:st=0:d=0.500000,fade=t=out:st=1.500000:d=0.500000,setsar=1[i]; \
[i][0]concat=n=2:v=1:a=0" \
-c:v libx264 -crf 23 output.mp4
The image should be the same resolution as the video. It will fade-in for 0.5 seconds, remain for 1 second, then fade out for 0.5 seconds.
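If the image does not match the video's resolution, one option is to scale it in the same filter chain. A sketch assuming the video is 1280x720 (substitute the actual dimensions):
ffmpeg -i video.mp4 -loop 1 -t 2 -i image.png \
-filter_complex \
"[1]scale=1280:720,fade=t=in:st=0:d=0.5,fade=t=out:st=1.5:d=0.5,setsar=1[i]; \
[i][0]concat=n=2:v=1:a=0" \
-c:v libx264 -crf 23 output.mp4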
