Why does the zoom effect only apply to the first image? - windows

ffmpeg -i img%03d.jpeg -i 1.mp3 -vf zoompan=z='zoom+0.002':d=25*5:s=1280x800 -pix_fmt yuv420p -c:v libx264 -t 01:05:00 out12345.mp4
I have 3 images and 1 audio file, and I am trying to create a video, expecting each image to have the zoom effect.
Here is what I am getting: the first image shows the zoom effect, then the 2nd image shows up for a split second, and then the last image stays without any effect.
What am I doing wrong?

The zoompan filter operates per frame, so normally the command should produce the desired result, i.e. each input image gets zoomed in over 125 frames.
However, when an image in the stream has different properties, the filtergraph is reinitialized, so a new zoompan instance is created, which starts on the changed frame as if from scratch. This new output has the same timestamps as frames that were already emitted, so its frames are dropped.
There are two workarounds to prevent reinitialization:
1) make sure all frames in the input are uniform in properties (see the sketch after this list),
or
2) forcibly prevent reinitialization by adding -reinit_filter 0 before the input. Only a few filters can handle frames with changing properties, so avoid doing this unless you are sure.
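For instance, a minimal sketch of workaround 1 (untested; the scale/pad/setsar chain is my addition to the asker's command): normalize every image to the same dimensions and sample aspect ratio before zoompan, so no frame triggers reinitialization:
ffmpeg -i img%03d.jpeg -i 1.mp3 -vf "scale=1280:800:force_original_aspect_ratio=decrease,pad=1280:800:(ow-iw)/2:(oh-ih)/2,setsar=1,zoompan=z='zoom+0.002':d=25*5:s=1280x800" -pix_fmt yuv420p -c:v libx264 -t 01:05:00 out12345.mp4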

Related

Using blurdetect to filter blurred keyframes in ffmpeg5

I want to extract the keyframes of a video with ffmpeg and determine whether each keyframe is blurred using a predefined threshold. I noticed the new blurdetect filter in ffmpeg 5, so I tried the following command:
ffmpeg -i test.mp4 -filter_complex "select=eq(pict_type,I),blurdetect=block_width=32:block_height=32:block_pct=80" -vsync vfr -qscale:v 2 -f image2 ./I_frames_ffmpeg/image%08d.jpg
Using this command I can get the keyframes, and at the end the average blur value of those frames is printed in the terminal.
My question is, can I use the blurdetect filter to get the blur value for each frame? Can I use this blur value as a precondition for keyframe selection, e.g. only select this frame as a keyframe if the blur value is less than 5?
Yes, the blurdetect filter pushes the blur value of each frame to stream metadata, which you can capture with the metadata filter. Try the following filtergraph:
select=eq(pict_type,I),\
blurdetect=block_width=32:block_height=32:block_pct=80,\
metadata=print:file=-
The metadata filter outputs to stdout, so you'll see 2 lines for each frame like:
frame:1295 pts:1296295 pts_time:43.2098
lavfi.blur=4.823009
Note that the terminal may get cluttered with other logs, but these lines should be the only ones actually on stdout (regular logs go to stderr), so you should be able to capture them easily. From there a simple regex should retrieve the blur values.
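For instance, a rough capture sketch (untested; assumes a POSIX shell and that only the metadata filter writes to stdout; blur_values.txt is a hypothetical output file):
ffmpeg -i test.mp4 -filter_complex "select=eq(pict_type,I),blurdetect=block_width=32:block_height=32:block_pct=80,metadata=print:file=-" -vsync vfr -qscale:v 2 -f image2 ./I_frames_ffmpeg/image%08d.jpg | grep -o 'lavfi.blur=[0-9.]*' | cut -d= -f2 > blur_values.txt
This saves the keyframes as before and writes one blur value per line to blur_values.txt.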
Can I use this blur value as a precondition for keyframe selection, e.g. only select this frame as a keyframe if the blur value is less than 5?
I believe (not verified) that the metadata filter can do exactly this:
metadata=select:key=lavfi.blur:value=5:function=less
Not the best documentation, but it's all there
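Combined with the extraction command from the question, the whole thing might look like this (an unverified sketch; ./I_frames_sharp/ is a hypothetical output directory):
ffmpeg -i test.mp4 -filter_complex "select=eq(pict_type,I),blurdetect=block_width=32:block_height=32:block_pct=80,metadata=select:key=lavfi.blur:value=5:function=less" -vsync vfr -qscale:v 2 -f image2 ./I_frames_sharp/image%08d.jpg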

Detect scene change on part of the frame

I have a video file of an online lecture consisting of a slideshow with audio in the background.
I want to save images of each slide as well as the timestamp of that slide.
I do this using the scene and metadata filters:
ffmpeg -i week-01.mp4 -filter_complex "select='gt(scene,0.011)',metadata=print:file=frames/time.txt" -vsync vfr frames/img%03d.jpg
This works fine except for one thing: there is an onscreen timer on the right side of the video.
If I set the threshold small enough to pick up all the slide changes, it also picks up the timer changes.
So here is my question, can I ask ffmpeg to:
analyze only part of the frame (everything except the timer, i.e. roughly the left 75% of the frame);
then, on detecting a scene change in that area, save the entire frame and the timestamp?
I thought of making a script that:
crops the video and saves it alongside the original,
analyzes the cropped video for scene changes and saves the timestamps,
extracts the frames from the original video using the timestamps.
Is there a better/faster/shorter way to do this?
Thanks in advance!
You can do it in one command, like this:
ffmpeg -i week-01.mp4 -filter_complex "[0]split=2[full][no_timer];[no_timer]drawbox=w=0.25*iw:h=ih:x=0.75*iw:y=0[no_timer];[no_timer]select='gt(scene,0.011)',metadata=print:file=frames/time.txt[no_timer];[no_timer][full]overlay" -vsync vfr frames/img%03d.jpg
Basically, make two copies of the video; use drawbox on one copy to paint solid black over the right quarter of the screen, then analyze scene changes on that copy and record scores to a file; finally, overlay the full unpainted frame on top of the painted ones. Due to how overlay syncs frames, only the full frames with timestamps corresponding to the selected base frames will be used in the overlay.
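If you still prefer your own three-step plan, here is a rough two-pass sketch (untested; it assumes the timer sits in the right quarter, and times.txt and frames/ are hypothetical names) that skips saving a cropped copy by discarding the analysis video:
ffmpeg -i week-01.mp4 -vf "crop=0.75*iw:ih:0:0,select='gt(scene,0.011)',metadata=print:file=times.txt" -vsync vfr -f null -
grep -o 'pts_time:[0-9.]*' times.txt | cut -d: -f2 | while read t; do
  ffmpeg -ss "$t" -i week-01.mp4 -frames:v 1 "frames/img_${t}.jpg"
done
The first pass analyzes only the left 75% of each frame and records timestamps; the loop then pulls the full frame at each timestamp from the original video.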

ffmpeg how to crop and scale at the same time?

I'm trying to convert a video with black bars to one without, and if the source is 4K, I want the video converted to 1080p.
Now to do this, I'm using the following command:*
ffmpeg -i input ... -filter:v "crop=..." -filter:V "scale=1920:-1" output
But running this, I found that the end product still has said black bars and is 1920x1080 as opposed to the 1920x800 I'd expect.
What gives, why does this not work?
*: Other settings have been left out for convenience.
I got it to work by putting both the crop and the scale in the same -vf option. I was cropping and then upscaling footage of an old video game, and I just did this:
-vf crop=256:192:2:16,scale=-2:1080:flags=neighbor
I knew it worked as soon as I saw it display the output file size as 1440x1080 (4:3 ratio at 1080p).
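For context, the reason the original two-option command fails: ffmpeg applies only one filtergraph per stream, so the second -filter:v (the scale) overrides the first (the crop) instead of adding to it. Chaining both in one -vf runs them in sequence; for the 4K-to-1920x800 case in the question it might look like this (crop values are assumed, since the actual bar size wasn't given):
ffmpeg -i input.mp4 -vf "crop=3840:1600,scale=1920:-2" -c:a copy output.mp4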

How to use timeline editing with a single image input in ffmpeg?

A small image should be animated over a background video in a simple way:
change position: move along a straight line, no easing, starting at frame A and ending at frame B (e.g. frames 11 to 31);
zoom in: between frames C and D (e.g. frames 45 and 55).
Filters I intend to use:
the overlay filter has x and y parameters for the image position;
the zoompan filter allows zooming (preceded by a static scale-up to avoid jitter).
My filtergraph:
video.avi >----------------------------------->|-------|
                                               |overlay|-> out.mp4
image.png >-> scale >-> zoompan >-> zoompan >->|-------|
The problem is with timeline editing. Both filters support the enable option. I thought I could add instructions like enable='between(n,11,31)' to "place" the animations at the right times.
It appears that the image input has only two values of n: zero and 1. I checked this by wrapping n in print(n) inside the zoompan filter, so it is printed during rendering.
Inside the overlay filter, by contrast, n counts up as expected.
Question: how can I make the single image input "look" like a normal video stream to ffmpeg filters – so that every generated frame has its unique number?
One of my latest tests: the video is hd720, the image is a 1000x200 transparent PNG with the logo occupying about a 150x50 area in the center, so that it is not cropped out when zoomed in.
ffmpeg -i $FOOTAGE -loop 1 -i $IMAGE -filter_complex \
"
[1:v]
scale=10*iw:-2
,zoompan=
z='1'
:x='iw/2-(iw/zoom/2)+80'
:y='ih/2-(ih/zoom/2)'
:d=26
:s=500x100
:enable='lt(print(n),24)'
,zoompan=
z='min(zoom+1.3/18,2.3)'
:x='iw/2-(iw/zoom/2)'
:y='ih/2-(ih/zoom/2)'
:d=20
:s=500x100
:enable='between(n,24,42)'
[name];
[0:v][name]
overlay=
x=1005-250
:y=406-50
:enable='lte(n,173)'
" -t 7 -y -hide_banner out.mp4
It appears the zoompan filter does not support timeline editing: commit aa26258f (August 27, 2017) updated the ffmpeg documentation so that it no longer lists zoompan as a timeline-enabled filter.
The workaround is to write expressions that depend on the in ("input frame count") variable and output the desired zoom factor.
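As a sketch of that workaround (untested; the frame numbers are copied from the enable expressions above): with -loop 1 every looped copy of the image arrives as a new input frame, so in keeps counting, and the gating can move into the z expression itself, e.g. with d=1 so zoompan emits one output frame per input frame:
zoompan=z='if(lt(in,24),1,min(zoom+1.3/18,2.3))':x='iw/2-(iw/zoom/2)':y='ih/2-(ih/zoom/2)':d=1:s=500x100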

FFmpeg fade effects between frames

I want to create a slideshow of my images with fade-in and fade-out transitions between them, and I am using the FFmpeg fade filter.
If I use the command:
ffmpeg -i input.mp4 -vf "fade=in:5:8" output.mp4
to create the output video with a fade effect, it produces a video whose first 5 frames are black, after which the images are shown with a fade-in effect; but I want a fade-in/fade-out effect at each image change.
How can I do that?
Please suggest a solution for a CentOS server, because I am using FFmpeg on that server only.
To create a video with fade effects, break the video into parts and create a separate video for each image. For instance, if you have 5 images, then first create 50-60 copies of each image and turn each set into a video:
$command= "ffmpeg -r 20 -i images/%d.jpg -y -s 320x240 -aspect 4:3 slideshow/frame.mp4";
exec($command." 2>&1", $output);
This gives you 5 different videos. Then take 10-12 copies of each of those five images and again create separate videos, this time with fade effects:
ffmpeg -i input.mp4 -vf "fade=in:5:8" output.mp4
After this you will have videos like: a video for image 1 with its fade effect, then one for image 2 with its fade effect, and so on. Now combine those videos in order to get the whole video.
For combining the videos you need:
$command = "cat pass.mpg slideshow/frame.mpg > final.mpg";
That is, join the videos using cat: convert them to mpg, concatenate them, then re-encode the result to mp4 or avi to view it properly. The intermediate mpg videos will not play properly on their own, so don't worry about that; once converted to mp4, the result works fine.
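As a concrete sketch of that per-image approach (untested; filenames, durations, and frame numbers are assumptions, and it uses ffmpeg's concat demuxer instead of cat on mpg files):
ffmpeg -loop 1 -i images/1.jpg -t 4 -r 25 -vf "fade=in:0:10,fade=out:90:10" -pix_fmt yuv420p slideshow/1.mp4
(repeat for images/2.jpg, images/3.jpg, ...)
printf "file '1.mp4'\nfile '2.mp4'\nfile '3.mp4'\n" > slideshow/list.txt
ffmpeg -f concat -i slideshow/list.txt -c copy slideshow/final.mp4
Each 4-second clip fades in over its first 10 frames and out over its last 10 (at 25 fps, frames 90-99 of 100).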
You can make a slideshow with crossfading between the pictures by using the framerate filter. In the following example, 0.25 is the framerate used for reading in the pictures, in this case 4 seconds for each picture. The fps parameter sets the output framerate. The interp_start and interp_end parameters control the fading effect:
interp_start=128:interp_end=128 means no fading at all.
interp_start=0:interp_end=255 means continuous fading: when one picture has faded out and the next has fully faded in, the third immediately begins to fade in, with no pause for showing the second picture.
interp_start=64:interp_end=192 means half of the time is a pause showing the picture and the other half is fading. Unfortunately it won't be a full fade from 0 to 100%, but only from 25% to 75%. That's not exactly what you might want, but better than no fading at all.
ffmpeg -framerate 0.25 -i IMG_%3d.jpg -vf "framerate=fps=30:interp_start=64:interp_end=192:scene=100" test.mp4
You can use gifblender to create the blended intermediate frames from your images and then convert those to a movie with ffmpeg.
