First frame not getting blur from ffmpeg filter

I am trying to run a combination of the overlay and boxblur filters to blur the background pixels behind a person (using ffmpeg). My command is as follows:
ffmpeg -y \
-ss 00:00:02.000 -t 00:00:02.000 -i Adeel_Abbas_20210423T191428-1X.MP4 \
-ss 00:00:02.000 -t 00:00:02.000 -i Adeel_Abbas_20210423T191428-1X-SM.MP4 \
-filter_complex " \
[0:v][1:v] alphamerge[fg0]; [0:v]boxblur=luma_radius=3:chroma_radius=0:alpha_radius=0:luma_power=1:chroma_power=0:alpha_power=0[bv0]; [bv0][fg0]overlay,crop=1920:1080:0:78,setpts=PTS-STARTPTS[vs0]" -map "[vs0]" \
-b:v 8709120 -bf 2 -threads 8 -tag:v avc1 -c:v libx264 -preset medium -g 144 -color_primaries bt709 -color_trc bt709 -colorspace smpte170m output.mp4
Here are sample files that can be used to reproduce this issue.
With ffmpeg 5.0, the background in the first frame does not get blurred, but successive frames are blurred properly. Can someone please help?

This only happens if I seek from a point within the original file using the -ss 00:00:02.000 -t 00:00:02.000 options. If I start processing from the beginning of the file by omitting the -ss flag, everything looks fine.
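The thread records no answer, but a common cause of this symptom (an assumption on my part, not confirmed in the thread) is that input-level seeking can leave the two inputs with slightly different starting timestamps, so the mask and the main video are misaligned for the very first frame alphamerge sees. A hedged sketch of the usual workaround, resetting each input's timestamps before merging (filenames are the asker's; untested against the sample files):

```shell
# Sketch only: normalize timestamps on both seeked inputs so their
# first frames line up before alphamerge runs.
ffmpeg -y \
  -ss 00:00:02.000 -t 00:00:02.000 -i Adeel_Abbas_20210423T191428-1X.MP4 \
  -ss 00:00:02.000 -t 00:00:02.000 -i Adeel_Abbas_20210423T191428-1X-SM.MP4 \
  -filter_complex "\
  [0:v]setpts=PTS-STARTPTS,split[m1][m2]; \
  [1:v]setpts=PTS-STARTPTS[mask]; \
  [m1][mask]alphamerge[fg0]; \
  [m2]boxblur=luma_radius=3:luma_power=1[bv0]; \
  [bv0][fg0]overlay,crop=1920:1080:0:78[vs0]" \
  -map "[vs0]" -c:v libx264 output.mp4
```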

Related

Concatenating video clip with static image causes buffer errors

I'm trying to concatenate a 15 second clip of a video (MOVIE.mp4) with 5 seconds (no audio) of an image (IMAGE.jpg) using FFmpeg.
Something seems to be wrong with my filtergraph, although I'm unable to determine what. The command I've put together is the following:
ffmpeg \
-loop 1 -t 5 -i IMAGE.jpg \
-t 15 -i MOVIE.mp4 \
-filter_complex "[0:v]scale=480:640[1_v];anullsrc[1_a];[1:v][1:a][1_v][1_a]concat=n=2:v=1:a=1[out]" \
-map "[out]" \
-strict experimental tst_full.mp4
Unfortunately, this seems to be creating some strange results:
On my personal computer (FFmpeg 4.2.1), it correctly concatenates the movie with the static image; however, the static image lasts for an unbounded length of time. (After pressing Ctrl-C, the movie is still viewable, but is of an extremely long length, e.g. 35 minutes, depending on when I interrupt the process.)
On a remote machine where I need to do the ultimate video processing (FFmpeg 2.8.15-0ubuntu0.16.04.1), the command does not terminate, and instead, I get cascading errors of the following form:
Past duration 0.611458 too large
...
[output stream 0:0 @ 0x21135a0] 100 buffers queued in output stream 0:0, something may be wrong.
...
[output stream 0:0 @ 0x21135a0] 100000 buffers queued in output stream 0:0, something may be wrong.
I haven't been able to find much documentation that elucidates what these errors mean, so I don't know what's going wrong.
As Gyan pointed out, you only have to add atrim to your audio:
anullsrc,atrim=0:5[silent-audio]
Instead of scale you could use scale2ref and setsar to automatically make your image the same size and aspect ratio as the video.
ffmpeg \
-loop 1 -t 5 -i IMAGE.jpg \
-t 15 -i MOVIE.mp4 \
-filter_complex "[0:v][1:v]scale2ref[img][v];[img]setsar=1[img]; \
anullsrc,atrim=0:5[silent-audio];[v][1:a][img]
[silent-audio]concat=n=2:v=1:a=1[out]" \
-map "[out]" \
-strict experimental tst_full.mp4
Alternatively you could use anullsrc as a 3rd input:
ffmpeg \
-t 15 -i MOVIE.mp4 \
-loop 1 -t 5 -i IMAGE.jpg \
-f lavfi -t 5 -i anullsrc \
-filter_complex "[1:v][0:v]scale2ref[img][v];\
[img]setsar=1[img];[v][0:a][img][2:a]concat=n=2:v=1:a=1[out]" \
-map "[out]" \
-strict experimental tst_full.mp4
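To sanity-check the pattern without the asker's media, here is a minimal self-contained sketch of the same two-segment video+audio concat, built entirely from lavfi test sources (the output filename demo_concat.mp4 is illustrative):

```shell
# Two 1-second segments (test pattern + tone, red frame + silence)
# concatenated into one 2-second clip, mirroring the answer's pattern.
ffmpeg -y \
  -f lavfi -t 1 -i "testsrc=size=320x240:rate=25" \
  -f lavfi -t 1 -i "sine=frequency=440:sample_rate=44100" \
  -f lavfi -t 1 -i "color=red:size=320x240:rate=25" \
  -f lavfi -t 1 -i "anullsrc=sample_rate=44100:channel_layout=mono" \
  -filter_complex "[0:v][1:a][2:v][3:a]concat=n=2:v=1:a=1[v][a]" \
  -map "[v]" -map "[a]" -pix_fmt yuv420p -c:v libx264 -c:a aac demo_concat.mp4
```

Because both segments have matching resolutions and audio layouts, concat needs no extra scaling or resampling here.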

FFMPEG zoom-pan multiple images

I am trying to stitch multiple images with some zoom-pan happening on the images to create a video.
Command:-
ffmpeg -f lavfi -r 30 -t 10 -i \
color=#000000:1920x1080 \
-f lavfi \
-r 30 -t 10 \
-i aevalsrc=0 \
-i "image-1.png" \
-i "image-2.png" \
-y -filter_complex \
"[0:v]fifo[bg];\
[2:v]setpts=PTS-STARTPTS+0/TB,scale=4455:2506:force_original_aspect_ratio=decrease,zoompan=z='min(zoom+0.0015,2.5)':x='iw/2-(iw/zoom/2)':y='ih/2-(ih/zoom/2)':d=150:fps='30':s='1920x1080'[v2];\
[bg][v2]overlay=0:0:enable='between(t,0, 5)'[bg];\
[3:v]setpts=PTS-STARTPTS+5.07/TB,scale=3840:2160:force_original_aspect_ratio=decrease,zoompan=z='min(zoom+0.0015,2.5)':x='iw/2-(iw/zoom/2)':y='ih/2-(ih/zoom/2)':d=150:fps='30':s='1920x1080'[v3];\
[bg][v3]overlay=0:0:enable='between(t,5, 10)'[bg];\
[1:a]amix=inputs=1:duration=first:dropout_transition=0" \
-map "[bg]" -vcodec "libx264" -preset "veryfast" -crf "15" "output.mp4"
The output is not as expected: it only zooms on the first image; the second image is just static.
FFmpeg version: 4.1
Use
ffmpeg -f lavfi -i color=#000000:1920x1080:r=30:d=10 \
-f lavfi -t 10 -i anullsrc \
-i "image-1.png" \
-i "image-2.png" \
-filter_complex \
"[2:v]scale=4455:2506:force_original_aspect_ratio=decrease,zoompan=z='min(zoom+0.0015,2.5)':x='iw/2-(iw/zoom/2)':y='ih/2-(ih/zoom/2)':d=150:fps=30:s='1920x1080'[v2];\
[0:v][v2]overlay=0:0:enable='between(t,0,5)'[bg];\
[3:v]scale=3840:2160:force_original_aspect_ratio=decrease,zoompan=z='min(zoom+0.0015,2.5)':x='iw/2-(iw/zoom/2)':y='ih/2-(ih/zoom/2)':d=150:fps=30:s='1920x1080',setpts=PTS+5/TB[v3];\
[bg][v3]overlay=0:0:enable='between(t,5,10)'[out]" \
-map "[out]" -map 1:a -vcodec libx264 -preset veryfast -crf 15 -y "output.mp4"
For lavfi sources, it's best to set frame rate and duration where applicable within the filter.
Since you're not looping the images, -t has no effect on them. Since zoompan sets the fps of its output, you can skip the input rate setting. And since each input is a single image, setpts before zoompan has no relevance; it should be applied only after the zoompan whose timestamps need to be shifted.
Since you have only one audio stream, there's no point sending it to amix; there's nothing to mix with! Just map it directly.

ffmpeg adjust contrast, show histogram

I'm trying to first adjust the contrast of a frame extracted from an mp4, then overlay the histogram of the resultant frame on top. My command here does all of this, but also adjusts the contrast of the histogram itself. Is there a single ffmpeg command that can do what I wish?
ffmpeg -ss 3.5 -i in.mp4 -an -y -vf \
"split=2[a][b],[b]eq=0.5:0:1:1:1:1:1,histogram=levels_mode=logarithmic:\
components=1:level_height=100, [a]overlay,eq=0.5:0:1:1:1:1:1" \
-vframes 1 -q:v 2 out.jpg
Use
ffmpeg -ss 3.5 -i in.mp4 -an -y -filter_complex \
"eq=0.5:0:1:1:1:1:1,split=2[a][b];[b]histogram=levels_mode=logarithmic:\
components=1:level_height=100[b];[a][b]overlay" -vframes 1 -q:v 2 out.jpg

Combining multiple image files into a video while using filter_complex to apply a watermark

I'm trying to combine two ffmpeg operations into a single one.
Currently I have two sets of ffmpeg commands that first generate a video from existing images, then runs that video through ffmpeg again to apply a watermark.
I'd like to see if it's possible to combine these into a single operation.
# Create the source video
ffmpeg -y \
-framerate 1/1 \
-i layer-%d.png \
-r 30 -vcodec libx264 -preset ultrafast -crf 23 -pix_fmt yuv420p \
output.mp4
# Apply the watermark and render the final output
ffmpeg -y \
-i output.mp4 \
-i logo.png \
-filter_complex "[1:v][0:v]scale2ref=40:40[a][b];[b][a]overlay=(80):(main_h-200-80)" \
final.mp4
Use
ffmpeg -y \
-framerate 1/1 -i layer-%d.png \
-i logo.png \
-filter_complex "[0:v]fps=30[img];
[1:v][img]scale2ref=40:40[a][b];[b][a]overlay=(80):(main_h-200-80)" \
final.mp4
(The use of scale2ref doesn't make sense since you're scaling to a fixed size).
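Following that note, a plain scale does the same job; a sketch (my rewrite, not from the original answer, using the question's hypothetical filenames):

```shell
# The logo is forced to 40x40 regardless of the video, so scale2ref
# adds nothing here; a fixed-size scale is simpler.
ffmpeg -y \
  -framerate 1/1 -i layer-%d.png \
  -i logo.png \
  -filter_complex "[0:v]fps=30[img];[1:v]scale=40:40[wm];\
[img][wm]overlay=80:main_h-200-80" \
  final.mp4
```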

Adding splash screen using FFMPEG

Hi everyone!
I'm trying to add a splash screen to fade out after 2 seconds into a video using FFMPEG.
I'm using the following command:
ffmpeg -loop 1 -framerate 2 -t 2 -i image.png \
-i video.mp4 \
-filter_complex "[0:v]fade=t=in:st=0:d=0.500000,fade=t=out:st=4.500000:d=0.500000,setsar=1; \
[0:0] [1:0] concat=n=2:v=1:a=0" \
-c:v libx264 -crf 23 output.mp4
but it generates a video whose reported duration is correct, yet it plays for just 2 seconds, exactly the splash screen duration.
Since I don't have much experience with FFmpeg and got this command from the internet, I don't know where the problem is...
Use
ffmpeg -i video.mp4 -loop 1 -t 2 -i image.png \
-filter_complex \
"[1]fade=t=in:st=0:d=0.500000,fade=t=out:st=1.500000:d=0.500000,setsar=1[i]; \
[i][0]concat=n=2:v=1:a=0" \
-c:v libx264 -crf 23 output.mp4
The image should be the same resolution as the video. It will fade-in for 0.5 seconds, remain for 1 second, then fade out for 0.5 seconds.
