I've got an MKV whose first 5 seconds I would like to replace with a static PNG image that fades in and out from black. How can I accomplish this with just ffmpeg?
An easy method is to overlay the image:
ffmpeg -i input.mkv -loop 1 -t 5 -i image.png -filter_complex "[1]fade=type=in:duration=1,fade=type=out:duration=1:start_time=4[fg];[0]drawbox=t=fill:enable='lte(t,5)'[bg];[bg][fg]overlay=eof_action=pass:x=(W-w)/2:y=(H-h)/2" -c:a copy output.mkv
I added the drawbox filter to make a black background because I didn't know the size of your image.
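If you know the video's resolution (assuming 1920x1080 here purely for illustration), you could instead scale and pad the image to fill the frame and drop the drawbox, along these lines:
ffmpeg -i input.mkv -loop 1 -t 5 -i image.png -filter_complex "[1]scale=1920:1080:force_original_aspect_ratio=decrease,pad=1920:1080:(ow-iw)/2:(oh-ih)/2,fade=type=in:duration=1,fade=type=out:duration=1:start_time=4[fg];[0][fg]overlay=eof_action=pass" -c:a copy output.mkv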
See FFmpeg Filter Documentation.
I have this example video, recorded by Kazam:
https://user-images.githubusercontent.com/1997316/178513325-98513d4c-49d4-4a45-bcb2-196e8a76fa5f.mp4
It's a 1022x728 video.
I need to add a drop shadow identical to the one generated by the "Drop shadow (legacy)" filter of Gimp with the default settings. So, I generated with Gimp a PNG containing only the drop shadow. It's a 1052x758 image:
Now I want to put the video over the image to get a new video with the drop shadow. The wanted effect for the first frame is:
So, the video must be placed over the image. The top-left corner of the video must be in the position 11x11 of the background image.
How can I achieve this result?
I tried without success the following command. What's wrong?
ffmpeg -i shadow.png -i example.mp4 -filter_complex "[0:v][1:v] overlay=11:11'" -pix_fmt yuv420p output.mp4
About the transparency of the PNG background image: if it can't be maintained, it's okay for the shadow to be on a white background. But if it can be maintained by using an animated GIF as the output format, that would be better.
The solution is to remove the transparency from shadow.png. Then:
ffmpeg -i example.mp4 -filter_complex "[0:v] palettegen" palette.png
ffmpeg -loop 1 -i shadow.png -i example.mp4 -i palette.png -filter_complex "[1:v] fps=1,scale=1022:-1[inner];[0:v][inner]overlay=11:11:shortest=1[new];[new][2:v] paletteuse[out]" -map '[out]' -y output.gif
The result is exactly what I wanted:
This solution is inspired by the answer https://stackoverflow.com/a/66318325 and by the article https://www.baeldung.com/linux/convert-videos-gifs-ffmpeg
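If you would rather flatten the transparency with ffmpeg itself instead of an image editor, a minimal sketch (assuming a white background and the 1052x758 size mentioned above; shadow_flat.png is just a placeholder name):
ffmpeg -i shadow.png -filter_complex "color=white:s=1052x758[bg];[bg][0]overlay=format=auto,format=rgb24" -frames:v 1 shadow_flat.png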
I'm using this ffmpeg command to overlay a video on an image (while removing the black background):
ffmpeg -loop 1 -i image.png -i video.mp4 -filter_complex "[1:v]colorkey=0x000000:0.1:0.1[ckout];[0:v][ckout]overlay[out]" -map "[out]" -t 5 -c:a copy -c:v libx264 -y result.mp4
But as you can see in the picture, the black parts of the ball have also disappeared. How can I solve this problem?
Not possible with colorkey/chromakey alone. The background is too similar to the color you want to remove. You have two options.
Mask
Use a mask. If the video comes with an alpha mask you can use it to cut out the background using the alphamerge filter:
ffmpeg -i bg.jpg -i video.mp4 -i alpha.mp4 -filter_complex "[1][2]alphamerge[alf];[0][alf]overlay" output.mp4
Use a different color
Replace the video with one whose background color is different from the colors you want to keep (for example a green screen), and key out that color instead.
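For example, if you can re-render the clip against a green background (green_video.mp4 here is a hypothetical file name), you could key out green instead of black, roughly:
ffmpeg -loop 1 -i image.png -i green_video.mp4 -filter_complex "[1:v]chromakey=0x00FF00:0.1:0.1[ckout];[0:v][ckout]overlay[out]" -map "[out]" -t 5 -c:a copy -c:v libx264 -y result.mp4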
When overlaying text or an image (test.jpg) onto a background image (bg.png), the output video's background color gets darker.
Here the input image has an alpha channel.
ffmpeg -loop 1 -i bg.png -i test.jpg -y -filter_complex "[0:v][1:v] overlay=25:25" -pix_fmt yuv420p -shortest -t 10 tesy.mp4
Output expected: https://i.stack.imgur.com/e5C3x.png
Output I got is
https://i.stack.imgur.com/reujS.png
As you can see, there is a huge difference in the background color of the output I got.
Below is the background image (bg.png) onto which I am overlaying an image (test.jpg).
Use this image and overlay a test image, and you will see the difference in background color.
Add the premultiply filter for these particular PNG files:
ffmpeg -y -loop 1 -i bg.png -i test.jpg -filter_complex "[0]premultiply=inplace=1[bg];[bg][1:v]overlay=25:25:format=auto,format=yuv420p" -t 10 output.mp4
Or create the PNG with no alpha.
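For instance, one way to do that with ffmpeg itself is to convert to a pixel format without alpha (this simply discards the alpha channel rather than compositing it; bg_noalpha.png is a placeholder name):
ffmpeg -i bg.png -vf format=rgb24 bg_noalpha.png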
Or use the color filter to make the background:
ffmpeg -y -f lavfi -i color=c=purple:s=1280x720 -i test.jpg -filter_complex "[0][1]overlay=25:25:format=auto,format=yuv420p" -t 10 output.mp4
I have a bunch of images that I have to convert into a slideshow with a curtain effect. Currently I am running this command to convert the images to a video:
ffmpeg -r 1/5 -i img%d.png -c:v libx264 -vf "fps=25,format=yuv420p" video.mp4
But how do I achieve this kind of effect with ffmpeg? (See the linked image for the required result.)
I searched online but did not find any solution. I have a clue that an alpha mask is involved, but no idea how to use it for such a result.
ffmpeg -y -i img1.png -i img2.png -i img3.png -filter_complex "[0:v]zoompan=z='zoom+0.0000':d=50[img1];[1:v]zoompan=z='if(lte(zoom,1.0),1.1,max(1.001,zoom-0.0030))':d=200[img2];[img1][img2]blend=all_expr='if(lte((H/2-sqrt((Y-H/2)*(Y-H/2)))+N*8*SH,H/2),A,B)'[img1img2];[1:v]zoompan=z='zoom+0.0000':d=50[img2];[2:v]zoompan=z='if(lte(zoom,1.0),1.1,max(1.001,zoom-0.0030))':d=200[img3];[img2][img3]blend=all_expr='if(lte((H/2-sqrt((Y-H/2)*(Y-H/2)))+N*8*SH,H/2),A,B)'[img2img3];[img1img2][img2img3]concat=n=2[final]" -map "[final]" out.mp4
This ffmpeg command will generate a door-open (curtain) effect.
Here is the logic.
Suppose you have three images you want to apply this effect to. First create a blend effect of img1 and img2. Then create another blend effect with img2 and img3. Then concatenate these two generated segments.
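As a minimal sketch of that logic with just two images (assuming img1.png and img2.png are the same size; the 1280x720 output size, 100-frame duration, and wipe speed are placeholder values to tune), a centre-opening wipe from A to B could look like:
ffmpeg -y -i img1.png -i img2.png -filter_complex "[0:v]zoompan=z=1:d=100:s=1280x720[a];[1:v]zoompan=z=1:d=100:s=1280x720[b];[a][b]blend=all_expr='if(between(X,W/2-N*8*SW,W/2+N*8*SW),B,A)',format=yuv420p[out]" -map "[out]" out.mp4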
I have an mp4 that I want to overlay on top of a jpeg. The command I'm using is:
ffmpeg -y -i background.jpg -i video.mp4 -filter_complex "overlay=(main_w-overlay_w)/2:(main_h-overlay_h)/2" -codec:a copy output.mp4
But for some reason, the output is 0 seconds long, although the thumbnail does show the first frame of the video centred on the image properly.
I have tried using -t 4 to set the output's length to 4 seconds but that does not work.
I am doing this on windows.
You need to loop the image. Since it loops indefinitely, you then must use the shortest option in overlay so the output ends when video.mp4 ends.
ffmpeg -loop 1 -i background.jpg -i video.mp4 -filter_complex \
"overlay=(main_w-overlay_w)/2:(main_h-overlay_h)/2:shortest=1" \
-codec:a copy -movflags +faststart output.mp4
See overlay documentation for more info.
Well, you should loop the image for the duration of the video. To do that, add -loop 1 before the input image; the image will then have an infinite duration. To control it, specify -shortest before the output file, which will trim all streams to the shortest duration among them. Alternatively, you can use -t to trim the looped image to the video length. This will do what you want.
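For example, a sketch using -shortest (assuming video.mp4 has an audio track to bound the output; otherwise the shortest=1 overlay option shown in the previous answer is the safer way to stop):
ffmpeg -loop 1 -i background.jpg -i video.mp4 -filter_complex "overlay=(main_w-overlay_w)/2:(main_h-overlay_h)/2" -shortest -codec:a copy output.mp4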
Hope this helps!