I have two webm vp9 files that I am trying to blend using FFmpeg's blending functionality. I get a darker background, almost like a placeholder for the asset being blended, whereas the background should be transparent and show the same colour as the rest of the background.
One of the videos is a zoom animation that starts at 1 pixel in size and grows to 50 x 50 pixels.
The other is a solid red background video, 640 x 360 pixels.
At frame 1 the result looks like this:-
At about 0.5 seconds through, the result looks like this:-
At the end of the sequence, the zoom animation webm fills that darker square you see (50 x 50 pixels).
The code to do the blending of the two webm files looks like this:-
filter_complex.extend([
"[0:v][1:v]overlay=0:0:enable='between(t,0,2)'[out1];",
'[out1]split[out1_1][out1_2];',
'[out1_1]crop=50:50:231:251:exact=1,setsar=1[cropped1];',
'[cropped1][2:v]blend=overlay[blended1];',
"[out1_2][blended1]overlay=231:251:enable='between(t,0,2)'[out2]"
])
This overlays the red background onto the white background, making red the new background colour.
It then splits that result into two streams, so that there is one output for cropping and another for the final overlay.
It then crops, out of the red background, the region at the location and size of the layer to be blended. We do this because blending only works on inputs of the same size.
It then blends the zoom animation onto the cropped background.
Finally, it overlays the blended result onto the red background.
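One detail worth noting about why a darker patch can appear: blend operates on the pixel planes and does not composite through the alpha plane the way overlay does, and fully transparent pixels in the zoom video decode as black. In a YUV pipeline the luma of pure red is well below mid-grey, so the overlay blend formula drives it towards zero. A minimal sketch of that formula (my own illustration, not ffmpeg code):

```python
def overlay_blend(base, top):
    """Per-channel 'overlay' blend mode, values normalised to 0..1."""
    if base < 0.5:
        return 2 * base * top
    return 1 - 2 * (1 - base) * (1 - top)

# BT.601 luma of pure red is roughly 0.30; a fully transparent pixel
# in the zoom webm decodes as black (0.0).
luma_red = 0.299
print(overlay_blend(luma_red, 0.0))  # 0.0 -> the patch goes dark
```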
Unfortunately I'm unable to attach videos on Stack Overflow, otherwise I would have included them.
The full command looks like this:-
ffmpeg -i v1_background.webm -itsoffset 0 -c:v libvpx-vp9 -i v2_red.webm -itsoffset 0 -c:v libvpx-vp9 -i v3_zoom.webm -filter_complex "[0:v][1:v]overlay=0:0[out1];[out1]split[out1_1][out1_2];[out1_1]crop=50:50:231:251:exact=1,setsar=1[cropped1];[cropped1][2:v]blend=overlay[blended1];[out1_2][blended1]overlay=231:251" output_video_with_blended_overlaid_asset.mp4
I have checked the input vp9 webm zoom video file by extracting the first frame of the video
ffmpeg -vcodec libvpx-vp9 -i zoom.webm first_frame.png
and inspecting the colours in all the channels in GIMP. The colours (apart from the opaque pixel in the middle) are all zero, including the alpha channel.
Note that I tried adding all_mode, so that the blend filter becomes blend=all_mode=overlay, but this still shows the darker placeholder under the animation asset. In other words, this command
ffmpeg -i v1_background.webm -itsoffset 0 -c:v libvpx-vp9 -i v2_red.webm -itsoffset 0 -c:v libvpx-vp9 -i v3_zoom.webm -filter_complex "[0:v][1:v]overlay=0:0[out1];[out1]split[out1_1][out1_2];[out1_1]crop=50:50:231:251:exact=1,setsar=1[cropped1];[cropped1][2:v]blend=all_mode=overlay[blended1];[out1_2][blended1]overlay=231:251" output_video_with_blended_all_mode_overlay_asset.mp4
also doesn't work.
Converting the inputs to rgba first doesn't help either; the command below is simplified a bit:
ffmpeg -c:v libvpx-vp9 -i v2_red.webm -c:v libvpx-vp9 -i v3_zoom.webm -filter_complex "[0:v]format=pix_fmts=rgba[out1];[1:v]format=pix_fmts=rgba[out2];[out1]split[out1_1][out1_2];[out1_1]crop=50:50:0:0:exact=1,setsar=1[cropped1];[cropped1][out2]blend=all_mode=dodge[blended1];[out1_2][blended1]overlay=50:50" output_video_with_blended_all_mode_dodge_rgba_and_alpha_premultiplied_overlay.mp4
Adding an alpha premultiply didn't help either:
ffmpeg -c:v libvpx-vp9 -i v2_red.webm -c:v libvpx-vp9 -i v3_zoom.webm -filter_complex "[0:v]format=pix_fmts=rgba[out1];[1:v]setsar=1,format=pix_fmts=rgba,geq=r='r(X,Y)*alpha(X,Y)/255':g='g(X,Y)*alpha(X,Y)/255':b='b(X,Y)*alpha(X,Y)/255'[out2];[out1]split[out1_1][out1_2];[out1_1]crop=50:50:0:0:exact=1,setsar=1[cropped1];[cropped1][out2]blend=all_mode=dodge[blended1];[out1_2][blended1]overlay=50:50" output_video_with_blended_all_mode_dodge_rgba_and_alpha_premultiplied_overlay.mp4
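For reference, the per-pixel operation that geq expression performs is plain straight-to-premultiplied alpha conversion; in integer Python terms (my own sketch, not ffmpeg code):

```python
def premultiply(r, g, b, a):
    """Multiply each colour channel by alpha/255, as the geq expression does."""
    return (r * a // 255, g * a // 255, b * a // 255, a)

print(premultiply(255, 0, 0, 128))    # half-opaque red -> (128, 0, 0, 128)
print(premultiply(255, 255, 255, 0))  # fully transparent -> (0, 0, 0, 0)
```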
Is there a workaround I could use so that the background stays transparent?
I was looking for a way of changing the input pixel format inside the filter_complex graph to see if that would work, but couldn't find anything about this.
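If the blend modes themselves aren't essential, one workaround is to let overlay do the compositing, since overlay honours the overlay input's alpha channel. A sketch in the same style as the filter_complex list above (my own untested suggestion, not the original code):

```python
# Hypothetical alternative graph: let overlay, which is alpha-aware,
# composite the zoom animation directly instead of cropping + blending.
filter_complex = [
    "[0:v][1:v]overlay=0:0:enable='between(t,0,2)'[out1];",
    "[out1][2:v]overlay=231:251:enable='between(t,0,2)'[out2]",
]
print("".join(filter_complex))
```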
Related
I am creating a video with a transparent background using the following command. After creating it, I want to overlay this video on another video.
ffmpeg -f lavfi -i color=red:s=1920x1080,colorkey=red,format=rgba -loop 1 -t 0.08 -i "CreditWhite.png" -filter_complex "[1:v]scale=1920:-2,setpts=if(eq(N\,0)\,0\,1+1/0.02/TB),fps=60[fg]; [0:v][fg]overlay=y=-'t*h*0.02':eof_action=endall[v]" -map "[v]" -pix_fmt yuva444p10le -vcodec prores_ks credits.mov
Creating the video works fine, but when I overlay it on another video (using OpenShot) I get a lot of colour bleeding of the background colour around the edges. Any suggestions to improve the ffmpeg command to stop this from happening? I tried very slightly increasing the opacity (0.06), as mentioned in another thread, without success.
Video uploaded to youtube for reference
UPDATE
Using different colours had the same effect
I have this example video, recorded by Kazam:
https://user-images.githubusercontent.com/1997316/178513325-98513d4c-49d4-4a45-bcb2-196e8a76fa5f.mp4
It's a 1022x728 video.
I need to add a drop shadow identical to the one generated by the "Drop shadow (legacy)" filter of GIMP with the default settings. So, I generated with GIMP a PNG containing only the drop shadow. It's a 1052x758 image:
Now I want to put the video over the image to get a new video with the drop shadow. The wanted effect for the first frame is:
So, the video must be placed over the image. The top-left corner of the video must be in the position 11x11 of the background image.
How can I achieve this result?
I tried without success the following command. What's wrong?
ffmpeg -i shadow.png -i example.mp4 -filter_complex "[0:v][1:v] overlay=11:11'" -pix_fmt yuv420p output.mp4
About the transparency of the PNG background image: if it can't be maintained, then it's okay for the shadow to be on a white background. Otherwise, if it can be maintained by using an animated GIF as the output format, that would be better.
The solution is to remove the transparency from shadow.png. Then:
ffmpeg -i example.mp4 -filter_complex "[0:v] palettegen" palette.png
ffmpeg -loop 1 -i shadow.png -i example.mp4 -i palette.png -filter_complex "[1:v] fps=1,scale=1022:-1[inner];[0:v][inner]overlay=11:11:shortest=1[new];[new][2:v] paletteuse[out]" -map '[out]' -y output.gif
The result is exactly what I wanted:
This solution is inspired by the answer https://stackoverflow.com/a/66318325 and by the article https://www.baeldung.com/linux/convert-videos-gifs-ffmpeg
I am trying to convert an mp4 video with a completely uniform pink color to a mov file such that the pink color is transparent.
I have run:
ffmpeg -i input.mp4 -vf "chromakey=0xf25b98:0.01:0" -c copy -c:v png output.mov
I confirmed that #f25b98 is the color I am replacing. This makes absolutely nothing transparent. When I try:
ffmpeg -i input.mp4 -vf "chromakey=0xf25b98:0.02:0" -c copy -c:v png output.mov
I get some weird transparent dots in my pink but still nothing is changed (see attached screenshot from video).
Why would ffmpeg exhibit this behavior?
The short answer is that it is not possible to use FFmpeg to cleanly key out a colour from an already compressed video stream, because the compression introduces bleed which makes for an imprecise transparency.
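To make the bleed concrete: chromakey compares each pixel's chroma (U/V) against the key colour and keys it out only when the normalised distance is below the similarity threshold. Lossy compression shifts pixel values by a few units, which is already enough to exceed a threshold as tight as 0.01. A rough illustration using BT.601 conversion (the exact ffmpeg internals may differ):

```python
import math

def uv(r, g, b):
    """Approximate BT.601 chroma components for an 8-bit RGB pixel."""
    u = -0.169 * r - 0.331 * g + 0.5 * b + 128
    v = 0.5 * r - 0.419 * g - 0.081 * b + 128
    return u, v

def chroma_distance(rgb1, rgb2):
    """Chroma-plane distance, normalised to 0..1 for comparison with similarity."""
    u1, v1 = uv(*rgb1)
    u2, v2 = uv(*rgb2)
    return math.hypot(u1 - u2, v1 - v2) / 255

pink = (0xF2, 0x5B, 0x98)    # the key colour, #f25b98
compressed = (238, 98, 148)  # the same pink after a small codec shift
print(chroma_distance(pink, compressed) > 0.01)  # True -> pixel is NOT keyed out
```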
I'm trying to overlay an animated GIF over a video with no success.
My goals are the following:
the GIF animation has to loop until the video ends.
the GIF is scaled so it covers the whole video.
the GIF preserves transparency.
The most I have achieved regarding this is that the GIF covers the whole video with the scale filter and loops until the video ends (though not in the best way, I guess).
Regarding the loop, I know I can use the -ignore_loop 0 GIF input option with shortest=1 in overlay, but that way it is not working, so I ended up with -frames:v 900 (my video is 30 fps and 30 seconds long, so 900 is the number of frames).
My most important issue is that I'm not able to keep the GIF transparency, and everything I've tried resulted in no success.
This is my ffmpeg command with arguments, so I hope anybody can help (I'm using ffmpeg 4.1).
ffmpeg -y
-i videoin.mp4
-i anim01.gif
-filter_complex "[1:v]scale=1080:1920[ovrl];[0:v][ovrl]overlay=main_w-overlay_w:main_h-overlay_h"
-frames:v 900
-codec:a copy
-codec:v libx264
-preset ultrafast
video.mp4
OK, I'll answer my own question. The first part, not being able to achieve GIF transparency: such a silly issue!! The GIF I was using was not transparent and I didn't realize!! OMG. So this is the first thing to check whenever you have a transparency issue.
The second part, looping the GIF until the video ends: I wasn't able to do it with -ignore_loop 0 along with shortest=1, but what I did was -ignore_loop 0 plus -frames:v 900, and that worked like a charm.
What was not working was not -ignore_loop 0 but shortest=1, so ffmpeg never finished encoding; setting it to stop at a certain number of frames resolves the problem.
900 comes from 30 fps x 30 sec of video.
In the end, my complete ffmpeg command line parameters ended up as follows:
ffmpeg -y -i xxx.mp4 -ignore_loop 0 -i xxx.gif -filter_complex "[1:v]scale=1080:1920[ovrl];[0:v][ovrl]overlay=0:0" -frames:v 900 -codec:a copy -codec:v libx264 -max_muxing_queue_size 2048 video.mp4
Hello guys, if anyone wants to add a GIF to a video, use this command. You will definitely get the right answer.
String strFilter = "[1:v]scale=h=-1:w=100[overlay_scaled],"
+ "[0:v][overlay_scaled]overlay=shortest=1:x=W*0:y=H*0";
String[] complexCommand = new String[] {
"-i",
yourRealPath,
"-itsoffset",
String.valueOf(0),
"-ignore_loop", "0", "-i",
fullPath,
"-filter_complex",
strFilter,
"-frames:v", "900", "-preset",
"ultrafast",
"-g",
"120",
dest.getAbsolutePath()
};
I have 3 inputs (1st, 2nd and 3rd block):
1st: an mp4 video, 600x400
2nd: a png image, 600x400
3rd: a jpeg image with a red background
Output (4th block)
I need an mp4 video of 600x400 as output; it should contain the video resized to 422x282 and merge all three as shown in the image.
Can we implement this via ffmpeg command line?
I'm able to resize the video and the image separately but I'm having issues creating the desired output.
Use
ffmpeg -i 1.mp4 -i red.jpg -i frame.png
-filter_complex "[0]scale=422:-1[vid];[1][vid]overlay=(W-w)/2:(H-h)/2[bg];
[bg][2]overlay=(W-w)/2:(H-h)/2" out.mp4
First, the video is resized. Then that resized video is overlaid on the red background. Then, on top of that result, the PNG frame is overlaid.
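The (W-w)/2:(H-h)/2 expressions just centre the smaller input on the larger one. Evaluated for these sizes (a sketch; it assumes the scaled video comes out at 422x282):

```python
def centered_offsets(bg_w, bg_h, fg_w, fg_h):
    """Pixel offsets that centre a fg_w x fg_h input on a bg_w x bg_h one."""
    return (bg_w - fg_w) // 2, (bg_h - fg_h) // 2

print(centered_offsets(600, 400, 422, 282))  # (89, 59)
```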
With no red frame and white BG,
ffmpeg -i 1.mp4 -i frame.png
-filter_complex "[0]scale=422:-1,pad=600:400:(ow-iw)/2:(oh-ih)/2:color=white[vid];[vid][1]overlay=(W-w)/2:(H-h)/2" out.mp4