I've been playing around with ffmpeg over the past few months and can't get rid of an issue I'm facing when adding a GIF file as an overlay.
Basically what I'm trying to achieve is to add a transparent GIF animation as an overlay on top of a MP4 video.
Please find below an example command that I'm using:
ffmpeg \
-i 0689a8a9-43b5-45d2-b0e8-acbea6905ce1.mp4 \
-ignore_loop 0 \
-i 02a6e696-969b-4a90-9444-e4b0b4d6f6da.gif \
-t 10.000000 \
-filter_complex "[0:v][1:v]overlay=enable='between(t, 1, 3)'[overlay]" \
-map '[overlay]' \
-pix_fmt yuv420p \
output.mp4
For a better understanding, please note that:
-ignore_loop 0 allows me to loop the animation as long as the overlay is enabled
-t makes my video last 10s
overlay=enable='between(t, 1.0, 3.0)' sets the interval during which it's visible
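For reference, the enable expression uses ffmpeg's between(), which is inclusive on both endpoints. A minimal Python sketch of its semantics (an illustration, not ffmpeg's actual evaluator):

```python
def between(t: float, lo: float, hi: float) -> int:
    """Mirrors ffmpeg's between(x, min, max): returns 1 if lo <= t <= hi, else 0.
    Both endpoints are inclusive, so the overlay is visible at exactly t=1 and t=3."""
    return 1 if lo <= t <= hi else 0

# The overlay from the command above is enabled only inside [1, 3]:
print(between(0.5, 1, 3), between(1, 1, 3), between(3, 1, 3), between(3.5, 1, 3))  # 0 1 1 0
```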
However, when I run this command, a few milliseconds before the GIF disappears (at 3 s), it starts blinking. If I look at it frame by frame, it actually disappears from the video, then comes back, and eventually goes away as expected.
Please find an example with a black background and a random GIF from giphy at this link. The assets can be found here.
I'm probably missing something here. Do you have any hints?
I'm running ffmpeg 4.3.1.
Thank you in advance.
I can replicate this with an arbitrary GIF. I suspect a bug in the overlay filter. Feel free to report it at https://trac.ffmpeg.org.
This happens as soon as temporal filtering is enabled (the overlay filter is listed as having timeline support), and the behaviour moreover changes depending on the time boundaries. The latter should never be the case.
MWE
ffmpeg \
-t 10 -s qcif -f rawvideo -pix_fmt rgb24 -r 25 -i /dev/zero \
-ignore_loop 0 -i 'https://media.tenor.com/images/c50ca435dffdb837914e7cb32c1e7edf/tenor.gif' \
-filter_complex "overlay=enable='between(t,3,7)'" \
-f flv - | ffplay -
As a workaround, you could try converting the GIF to an MP4 (ffmpeg -re -i <gif> [...]) and setting the white areas to transparent.
There is a ticket for this in the official FFmpeg tracker, which hasn't been fixed yet:
https://trac.ffmpeg.org/ticket/4803
The ticket mentions that the GIF is shown before the specified enable time. In my tests, given a 60 fps video, a 10.42 fps GIF (that needs to be shown from 5 to 10 s) blinks once 5 frames before the desired time (at 4.933 s) and becomes visible again right at 5 s. It seems to be connected to the GIF's frame rate not matching the video's.
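The mismatch can be sketched numerically. The snippet below is a hypothetical illustration (the exact blink frame also depends on the overlay filter's internals): it shows that the GIF frame active at t = 5 s starts slightly before the enable window opens.

```python
import math

video_fps = 60.0   # frame rate of the main video, from the test above
gif_fps = 10.42    # approximate frame rate of the GIF

# Index of the GIF frame that is on screen at t = 5 s, and its start time.
# Because the GIF's frame grid is not aligned to whole seconds, that frame
# begins a little before the enable window at 5 s.
frame_at_5s = math.floor(5 * gif_fps)   # 52nd GIF frame
frame_start = frame_at_5s / gif_fps     # ~4.99 s, i.e. before t = 5

print(frame_at_5s, round(frame_start, 3))  # 52 4.99
```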
Anyway, I've found an elegant workaround which solves the problem in a single pass and doesn't require converting GIFs to temporary MP4s (which in some cases could be undesirable). So, given the video fps, to overlay a GIF at a certain position (x=10, y=20) from 5 to 10 s without blinking, we can use the following:
ffmpeg -y -i "video.mp4" -ignore_loop 0 -i "giphy.gif" \
-filter_complex "[1:v]fps=60[gif];[0:v][gif]overlay=x=10:y=20:enable='between(t,5,10)'" \
-c:a copy -shortest "overlay.mp4"
We can go further and come up with a command line which doesn't require prior knowledge of the input video's fps (but you should know the output video's fps instead, which is 60 fps in this case):
ffmpeg -y -i "video.mp4" -ignore_loop 0 -i "giphy.gif" \
-filter_complex "[0:v]fps=60[video];[1:v]fps=60[gif];[video][gif]overlay=x=10:y=20:enable='between(t,5,10)'" \
-acodec copy -shortest "overlay.mp4"
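To avoid hard-coding the frame rate at all, you can probe it and assemble the filtergraph in a small script. A sketch, assuming ffprobe is on the PATH; the build_filter helper is hypothetical (it reproduces the single-stream variant of the filtergraph above), not part of ffmpeg:

```python
import subprocess
from fractions import Fraction

def parse_rate(rate: str) -> float:
    """Convert an ffprobe rational like '30000/1001' or '60/1' to a float fps."""
    return float(Fraction(rate.strip()))

def probe_fps(path: str) -> float:
    """Ask ffprobe for the first video stream's frame rate (needs ffprobe installed)."""
    out = subprocess.check_output([
        "ffprobe", "-v", "error", "-select_streams", "v:0",
        "-show_entries", "stream=r_frame_rate",
        "-of", "default=nokey=1:noprint_wrappers=1", path,
    ])
    return parse_rate(out.decode())

def build_filter(fps: float, x: int = 10, y: int = 20,
                 start: float = 5, end: float = 10) -> str:
    """Build the fps-normalising overlay filtergraph from the workaround above."""
    return (f"[1:v]fps={fps:g}[gif];"
            f"[0:v][gif]overlay=x={x}:y={y}:enable='between(t,{start:g},{end:g})'")

print(build_filter(parse_rate("60/1")))
```

With probe_fps("video.mp4") in place of the literal rate, the same -filter_complex string works for any input.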
Related
I'd like to create a zoom effect where I start at 1 * inputwidth and end with 1.1 * inputwidth over a set time.
I tried using zoompan but it:
pauses the frame, zooms in over the amount of time specified, continues sound while zooming.
zooms to the top-left corner, instead of center.
leaves the frame smaller than it was before. Do I need to upscale afterwards?
$ ffmpeg -i in.mp4 -t 10 \
-vf "fps=60,zoompan=z='if(lte(it,2),it/2*0.1+1,1.1)':d=1,scale=iw:ih" out.mp4
I tried using scale with crop, but then I get the error shown below. I assume these filters can't be evaluated as a function of time.
$ ffmpeg -i in.mp4 -t 10 \
-vf "fps=60, scale='t/5*2*iw':-1, crop='iw/2*t/5':'ih/2*t/5'" out.mp4
Expressions with frame variables 'n', 't', 'pos' are not valid in init eval_mode.
If there's an other tool besides ffmpeg, that would be fine, except it should not be a gui-only way.
You are looking in the right direction for what you want. There are a few things off with the commands you've specified:
zoompan itself needs an fps option; the fps filter only interpolates the outcome of zoompan. You need to add fps=60 to zoompan (match this to the input video).
ffmpeg -i in.mp4 -t 10 \
-vf "zoompan=fps=60:z='if(lte(it,2),it/2*0.1+1,1.1)':d=1,scale=iw:ih" out.mp4
This keeps the video from going out of sync.
The top-left zoom is because x and y both default to 0. To zoom at the center, add x='iw/2-(iw/zoom/2)':y='ih/2-(ih/zoom/2)'
ffmpeg -i in.mp4 -t 10 \
-vf "zoompan=fps=60:z='if(lte(it,2),it/2*0.1+1,1.1)':x='iw/2-(iw/zoom/2)':y='ih/2-(ih/zoom/2)':d=1,scale=iw:ih" out.mp4
Assuming your video is bigger than 720p: the default zoompan output size is s=hd720, which is why the result is smaller. To keep it at 1080p, add s=hd1080
ffmpeg -i in.mp4 -t 10 \
-vf "zoompan=fps=60:z='if(lte(it,2),it/2*0.1+1,1.1)':x='iw/2-(iw/zoom/2)':y='ih/2-(ih/zoom/2)':d=1:s=hd1080,scale=iw:ih" out.mp4
This should be the full command
ffmpeg -i in.mp4 -t 10 \
-vf "zoompan=fps=60:z='if(lte(it,2),it/2*0.1+1,1.1)':x='iw/2-(iw/zoom/2)':y='ih/2-(ih/zoom/2)':d=1:s=hd1080,scale=iw:ih" out.mp4
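To sanity-check what those expressions do, here is a small Python sketch that evaluates z, x and y the way zoompan would per input time it (a hypothetical re-implementation for illustration, assuming a 1920x1080 frame):

```python
def zoom(it: float) -> float:
    """z='if(lte(it,2),it/2*0.1+1,1.1)': ramp from 1.0 to 1.1 over the first 2 s."""
    return it / 2 * 0.1 + 1 if it <= 2 else 1.1

def center_offsets(iw: int, ih: int, z: float):
    """x='iw/2-(iw/zoom/2)', y='ih/2-(ih/zoom/2)': top-left corner of a centred crop."""
    return iw / 2 - iw / z / 2, ih / 2 - ih / z / 2

print(zoom(0), zoom(2), zoom(5))            # 1.0 1.1 1.1
print(center_offsets(1920, 1080, zoom(0)))  # (0.0, 0.0): no zoom, no offset
print(center_offsets(1920, 1080, zoom(5)))  # offsets grow as the crop shrinks
```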
I'm completely new to working with FFmpeg. What I'm trying to achieve is overlaying graphics at certain positions and times, and cutting out sections of a single input video.
I've worked out the overlaying graphics, so this code is working:
ffmpeg -i /Users/username/projectdir/static/video.mp4 \
-i overlay.png -i overlay2.png \
-filter_complex "[0:v][1:v] overlay=192:108:enable='between(t, 0, 5)'[ov0];
[ov0] overlay=192:108:enable='between(t, 5, 10)'" \
-pix_fmt yuv420p output_overlayed.mp4
But when I try to cut out sections using this code:
ffmpeg -i /Users/username/projectdir/static/video.mp4 \
-i overlay.png -i overlay2.png \
-filter_complex "[0:v][1:v] overlay=192:108:enable='between(t, 0, 5)'[ov0]; \
[ov0] overlay=192:108:enable='between(t, 5, 10)', \
select='between(t,0,5)+between(t,10,15)', \
setpts='N/FRAME_RATE/TB'" \
-pix_fmt yuv420p output_overlayed_trimmed.mp4
It seems to cut correctly: the original video plays from 0 seconds until 5 seconds, then from 10 seconds until 15 seconds, and cuts out. But after the point where the video cuts out, there's just a black screen for the remaining duration. I can't seem to get the trimming to affect the overall duration of the video.
(The values being passed in are just examples, by the way; e.g. I've got it to start an overlay 5 seconds in but also cut 5 seconds in.)
I have the timestamps for when the overlays should appear on the non-trimmed video, so the overlaying should happen first and then the trimming. If the video is trimmed first then the overlays will appear at the wrong times.
An alternative way of achieving this that currently works is to run the first command (which produces a new video file with the overlay) and then take that new file and perform the trimming independently:
ffmpeg -ss 0 -to 5 -i /Users/username/projectdir/static/output_overlayed.mp4 \
-ss 15 -to 20 -i /Users/username/projectdir/static/output_overlayed.mp4 \
-filter_complex "[0][1]concat=n=2:v=1:a=1" output_trimmed.mp4
But this means working with 2 separate files and then having to remove the first after the 2nd execution is complete. Ideally I'd combine them into one command which doesn't produce multiple files.
Would appreciate any help - thanks!
How about reading the input twice with different trims (so both video and audio are cut in sync), then concatenating after overlaying? Like this:
ffmpeg -t 5 -i /Users/username/projectdir/static/video.mp4 \
-ss 10 -to 15 -i /Users/username/projectdir/static/video.mp4 \
-i overlay.png -i overlay2.png \
-filter_complex "[0:v][2:v] overlay=192:108[ov0]; \
[1:v][3:v] overlay=192:108[ov1]; \
 [ov0][0:a][ov1][1:a] concat=n=2:v=1:a=1[vout][aout]" \
-map "[vout]" -map "[aout]" -pix_fmt yuv420p output_overlayed_trimmed.mp4
ffmpeg noob here, trying to help my mother with some videos for real estate walkthroughs. I'd like to set up a simple pipeline that I can run videos through, producing output structured like this:
5 second (silent) title card ->
xfade transition ->
property walk through ->
xfade transition ->
5 second (silent) title card
Considerations:
The intro / outro card will be the same content.
The input walkthrough videos will be of variable length so, if possible, a dynamic solution accounting for this would be ideal. If this requires me to script something using ffprobe, I can do that - just need to gain an understanding of the syntax and order of operations.
The video clip will come in with some audio already overlaid. I would like for the title cards to be silent, and have the video/audio clip fade in/out together.
I have gotten a sample working without the transitions:
ffmpeg -loop 1 -t 5 -i title_card.jpg \
-i walkthrough.MOV \
-f lavfi -t 0.1 -i anullsrc \
-filter_complex "[0][2][1:v][1:a][0][2]concat=n=3:v=1:a=1[v][a]" \
-map "[v]" -map "[a]" \
-vcodec libx265 \
-crf 18 \
-vsync 2 \
output_without_transitions.mp4
I have been unable to get it to work with transitions. See below for the latest iteration:
ffmpeg -loop 1 -t 5 -r 60 -i title_card.jpg \
-r 60 -i walkthrough.MOV \
-f lavfi -t 0.1 -i anullsrc \
-filter_complex \
"[0][1:v]xfade=transition=fade:duration=0.5:offset=4.5[v01]; \
[v01][0]xfade=transition=fade:duration=0.5:offset=12.8[v]" \
-map "[v]" \
-vcodec libx265 \
-crf 18 \
-vsync 2 \
output_with_transitions.mp4
This half-works: the initial title card fades into the video, but the second title card never appears. Note that I also removed any references to audio, in an effort to get the transitions alone to work.
I have been beating my head against the wall on this, so help would be appreciated :)
Assuming walkthrough.MOV is 10 seconds long:
ffmpeg -loop 1 -t 5 -framerate 30 -i title_card.jpg -i walkthrough.MOV \
-filter_complex \
"[0]settb=AVTB,split[begin][end]; \
 [1:v]settb=AVTB[main]; \
 [begin][main]xfade=transition=fade:duration=1:offset=4[xf]; \
 [xf][end]xfade=transition=fade:duration=1:offset=13,format=yuv420p[v]; \
 [1:a]adelay=4s:all=1,afade=t=in:start_time=4:duration=1,afade=t=out:start_time=13:duration=1,apad=pad_dur=4[a]" \
-map "[v]" -map "[a]" -c:v libx265 -crf 18 -movflags +faststart output.mp4
You will need to upgrade your ffmpeg for this to work. The current release version (4.3 as of this answer) is too old, so get a build from the git master branch. See FFmpeg Download for links to builds for your OS, or see FFmpeg Wiki: Compile Guide.
title_card.jpg frame rate, width, and height must match walkthrough.MOV.
See Merging multiple video files with ffmpeg and xfade filter to see how to calculate xfade and afade offsets.
See FFmpeg Filter documentation for details on each filter.
See How to get video duration in seconds? which can help you automate this via scripting.
apad is supposed to automatically work with -shortest, but it doesn't with -filter_complex. So pad_dur is used to add the additional silence to the last title image, but whole_dur can be used instead if that is easier for you. Another method is to use anullsrc as in your question, then concatenate audio only with the concat filter, but I wanted to show adelay+apad as a viable alternative.
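The offset arithmetic used above (4 and 13 for a 10-second walkthrough) is easy to script once you have the main clip's duration. A sketch with a hypothetical helper, assuming a 5 s title card and 1 s fades as in the command above:

```python
def xfade_offsets(main_dur: float, title_dur: float = 5.0, fade: float = 1.0):
    """Offsets for the title->main and main->title xfade transitions.

    Each xfade starts `fade` seconds before the end of the output so far,
    so every transition shortens the running total by `fade` seconds.
    """
    first = title_dur - fade                # 5 - 1 = 4
    second = first + main_dur - fade        # 4 + 10 - 1 = 13
    return first, second

print(xfade_offsets(10.0))  # (4.0, 13.0)
```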
I have an mp4 that I want to overlay on top of a jpeg. The command I'm using is:
ffmpeg -y -i background.jpg -i video.mp4 -filter_complex "overlay=(main_w-overlay_w)/2:(main_h-overlay_h)/2" -codec:a copy output.mp4
But for some reason, the output is 0 seconds long, though the thumbnail does show the first frame of the video centred on the image properly.
I have tried using -t 4 to set the output's length to 4 seconds, but that does not work.
I am doing this on Windows.
You need to loop the image. Since it then loops indefinitely, you must use the shortest option in overlay so the output ends when video.mp4 ends.
ffmpeg -loop 1 -i background.jpg -i video.mp4 -filter_complex \
"overlay=(main_w-overlay_w)/2:(main_h-overlay_h)/2:shortest=1" \
-codec:a copy -movflags +faststart output.mp4
See overlay documentation for more info.
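The centring expression is just arithmetic on the two frame sizes. A quick sketch of what (main_w-overlay_w)/2:(main_h-overlay_h)/2 evaluates to, using hypothetical sizes:

```python
def centered(main_w: int, main_h: int, overlay_w: int, overlay_h: int):
    """Top-left corner that centres an overlay_w x overlay_h frame on the main frame."""
    return (main_w - overlay_w) // 2, (main_h - overlay_h) // 2

# e.g. a 640x360 video centred on a 1920x1080 background image:
print(centered(1920, 1080, 640, 360))  # (640, 360)
```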
Well, you should loop the image for the duration of the video. To do that, add -loop 1 before the input image; the image will then have an infinite duration. To control it, specify -shortest before the output file, which trims all streams to the shortest duration among them. Alternatively, you can use -t to trim the image duration to the video length. This will do what you want.
Hope this helps!
I need to add a watermark for the first 3 seconds of the video using ffmpeg. Here's what I have right now:
ffmpeg -y -i '255871.mov' -qscale:v 0 -qscale:a 0 -vf '[in] transpose=1 [out];movie=watermark.png , select=lte(t\,3) [bg]; [out][bg] overlay=x=20:y=main_h-60 [out]' output.mp4
It rotates the video to the right and adds a watermark at the bottom of the video for the first 3 seconds. The problem is that the watermark is visible during the whole video.
I thought that select didn't work at all, so I tried the following command:
ffmpeg -y -i '255871.mov' -qscale:v 0 -qscale:a 0 -vf '[in] transpose=1 [out];movie=watermark.png , select=0 [bg]; [out][bg] overlay=x=20:y=main_h-60 [out]' output.mp4
The watermark is not visible. This is correct and proves that the select filter works as expected. As I understand it, this is how ffmpeg works: it leaves the last frame of the shortest video visible.
How can I force ffmpeg to stop showing the watermark after N seconds?
I have to answer it myself; the ffmpeg mailing list helped me solve the issue.
The main idea is to convert the existing watermark into a video using the Apple Animation codec (which supports transparency) and fade out the last frame of the created video using the fade filter.
Example:
ffmpeg -loop 1 -i watermark.png -t 3 -c qtrle -vf 'fade=out:73:1:alpha=1' watermark.mov
ffmpeg -y -i '255871.mov' -qscale:v 0 -qscale:a 0 -vf '[in] transpose=1 [out];movie=watermark.mov [bg]; [out][bg] overlay=x=20:y=main_h-60 [out]' output.mp4
A fade out is required because ffmpeg uses the last frame of the overlaid video for the rest of the main video. This filter makes the last frame fully transparent via the alpha=1 parameter. In fact it should be fade=out:74:1:alpha=1, but that didn't work for me; I don't know why.
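For reference, the frame index in the fade filter can be derived from the clip length and frame rate (a looped PNG input defaults to 25 fps). A sketch of that arithmetic:

```python
def last_frame_index(duration_s: float, fps: float = 25.0) -> int:
    """Index of a clip's last frame: a clip has duration_s * fps frames, numbered from 0."""
    return int(duration_s * fps) - 1

# A 3 s watermark clip at 25 fps has frames 0..74, so fading the final frame
# would be fade=out:74:1:alpha=1; the answer above found 73 worked in practice.
print(last_frame_index(3))  # 74
```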