Showing in-video visual progress bar with FFmpeg?

As OBS Studio lacks a visual indicator to show how far a video has progressed (and when you need to advance to the next scene), I was wondering whether there is a command-line option (or another solution) to get FFmpeg to re-encode the video and render a progress bar at the bottom that shows how long the video has been playing so far.
Is there such a feature?

Here's a simple example using an animated overlay, assuming a 10 second input:
ffmpeg -i input.mp4 -filter_complex "color=c=red:s=1280x10[bar];[0][bar]overlay=-w+(w/10)*t:H-h:shortest=1" -c:a copy output.mp4
What you will have to change:
In the color filter I used 1280 as an example to match the width of input.mp4. You can use ffprobe to get the width, or the scale2ref filter to resize the bar to match input.mp4 (see the ffprobe examples below).
In the overlay filter I used 10 as an example for the total duration in seconds of input.mp4. You can use ffprobe to get the duration (also shown below).
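For reference, here are ffprobe one-liners (a hedged sketch; input.mp4 stands in for your file) that print the width and the duration in seconds:
ffprobe -v error -select_streams v:0 -show_entries stream=width -of csv=p=0 input.mp4
ffprobe -v error -show_entries format=duration -of csv=p=0 input.mp4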

Related

ffmpeg glitch while converting DNG sequence into mp4 video

I have two problems with FFmpeg when I use it to join a DNG file sequence into an mp4 video. I also need to downscale the video from 6016x3200 to 2030x1080.
First of all, I got an almost black screen in the resulting video. I had to play with the gamma and brightness options, but it was not enough!
New problems:
Something strange happens with the aspect ratio in the resulting video: in the first frame the aspect is normal, just like in the original picture, but all the remaining frames are squeezed. I can't figure out why this happens (see the attached picture).
The colors are desaturated, despite the fact that I set the "saturation" option to its maximum value. Also, the first frame of the video is different from the rest (while the DNG files are all similar; the first is no exception).
I tried the prores codec as well, with the same result.
The command I use is simple:
ffmpeg.exe -start_number 1 -i "K:\video\copter_R%5d.dng" -c:v libx264 -vf "fps=25,format=yuv420p, eq=gamma=3.2:brightness=0.2:contrast=1.6:saturation=3, scale=w=2030:h=1080" e:\output.mp4
I also tried different variants of the scale parameter, such as "scale=-1:1080".
Illustration: (picture attached in the original post)
UPDATE: ffmpeg log report for operation:
https://drive.google.com/file/d/1H6bdpU0Eo4WfR3h-SRtgf7WBNYVFRwz2/view?usp=sharing
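Not from the original thread, but one hedged thing to rule out for the squeeze: ffmpeg may be carrying over a non-square sample aspect ratio from the source frames. Forcing setsar=1 after scaling is a cheap test (everything else copied from the command above):
ffmpeg.exe -start_number 1 -i "K:\video\copter_R%5d.dng" -c:v libx264 -vf "fps=25,format=yuv420p,eq=gamma=3.2:brightness=0.2:contrast=1.6:saturation=3,scale=2030:1080,setsar=1" e:\output.mp4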

ffmpeg how to crop and scale at the same time?

I'm trying to convert a video with black bars to one without, and if the source is 4K I want it converted to 1080p.
To do this, I'm using the following command:*
ffmpeg -i input ... -filter:v "crop=..." -filter:V "scale=1920:-1" output
But running this, I found that the end product still has said black bars and is 1920x1080 as opposed to the 1920x800 I'd expect.
What gives, why does this not work?
*: Other settings have been left out for convenience.
Only one filtergraph is applied per stream, so when -filter:v is passed twice the last one wins and the crop never runs. I got it to work by putting both the crop and the scale in the same -vf option. I was cropping and then upscaling an old video game, and I just did this:
-vf crop=256:192:2:16,scale=-2:1080:flags=neighbor
I knew it worked as soon as I saw it display the output file size as 1440x1080 (4:3 ratio at 1080p).
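For the original 4K letterbox case, a minimal sketch along the same lines (the crop geometry is hypothetical; measure your bars first, e.g. with the cropdetect filter):
ffmpeg -i input.mp4 -vf "crop=3840:1600:0:280,scale=1920:-2" output.mp4
A 3840x2160 source with 280-pixel bars top and bottom leaves a 3840x1600 active area, which scales to the expected 1920x800.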

Produce waveform video from audio using FFMPEG

I am trying to create a waveform video from audio. My goal is to produce a video that looks something like this (reference image in the original post).
For my test I have an mp3 that plays a short clipped sound: 4 bars of 1/4 notes and 4 bars of 1/8 notes at 120 bpm. I am having trouble coming up with the right combination of preprocessing and filtering to produce a video that looks like the image. The colors don't have to be exact; I am more concerned with the shape of the beats. I tried a couple of different approaches using showwaves and showspectrum. I can't quite wrap my head around why the beats go past so quickly with showwaves, while showspectrum produces a video where I can see each individual beat.
ShowWaves
ffmpeg -i beat_test.mp3 -filter_complex "[0:a]showwaves=s=1280x100:mode=cline:rate=25:scale=sqrt,format=yuv420p[v]" -map "[v]" -map 0:a output_wav.mp4
This link will download the output of that command.
ShowSpectrum
ffmpeg -i beat_test.mp3 -filter_complex "[0:a]showspectrum=s=1280x100:mode=combined:color=intensity:saturation=5:slide=1:scale=cbrt,format=yuv420p[v]" -map "[v]" -map 0:a output_spec.mp4
This link will download the output of that command.
I posted the simple examples because I didn't want to confuse the issue by adding all the variations I have tried.
In practice I suppose I can get away with the output from showspectrum but I'd like to understand where/how I am thinking about this incorrectly. Thanks for any advice.
Here is a link to the source audio file.
What showwaves does is show the waveform in real time, and the display window is 1/framerate, i.e. if the video output is 25 fps, then each frame shows the waveform of 40 ms of audio. There's no 'history' or 'memory', so you can't (directly) get a scrolling output like the one your reference video shows.
The workaround for this is to use the showwavespic filter to produce a single frame showing the entire waveform at a high enough horizontal resolution. Then do a scrolling overlay of that picture over a desired background, at a speed such that the scroll lasts as long as the audio.
Basic command template would be:
ffmpeg -loop 1 -i bg.png -loop 1 -i wavespic.png -i audio.mp3 -filter_complex "[0][1]overlay=W-w*t/mp3dur:y=SOMEFIXEDVALUE" -shortest waves.mp4
mp3dur above should be replaced with the duration of the audio file.
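A concrete sketch of the whole pipeline, hedged: the 30 in the overlay expression is an assumed audio duration in seconds, and the image sizes and y offset are illustrative.
# 1. Render the entire waveform into one wide still image
ffmpeg -i audio.mp3 -filter_complex "showwavespic=s=6000x240" -frames:v 1 wavespic.png
# 2. Generate a plain background frame
ffmpeg -f lavfi -i color=c=black:s=1280x720 -frames:v 1 bg.png
# 3. Scroll the waveform across the background for the length of the audio
ffmpeg -loop 1 -i bg.png -loop 1 -i wavespic.png -i audio.mp3 -filter_complex "[0][1]overlay=W-w*t/30:y=240" -pix_fmt yuv420p -shortest waves.mp4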

FFmpeg fade effects between frames

I want to create a slideshow from my images with fade-in and fade-out transitions between them, and I am using the FFmpeg fade filter.
If I use the command:
ffmpeg -i input.mp4 -vf "fade=in:5:8" output.mp4
to create the output video with a fade effect, it gives an output video whose first 5 frames are black, after which the images are shown with a fade-in effect. But I want a fade-in/fade-out effect between frame changes.
How can I do that?
Please suggest a solution for a CentOS server, because that is the only place I am running FFmpeg.
To create a video with fade effects, break the video into parts and create separate videos for each image. For instance, if you have 5 images, first create 50-60 copies of each image and render a video from them:
$command= "ffmpeg -r 20 -i images/%d.jpg -y -s 320x240 -aspect 4:3 slideshow/frame.mp4";
exec($command." 2>&1", $output);
This gives you 5 different videos. Then make 10-12 copies of each of those five images and again create separate videos, this time with fade effects (see the per-clip fade sketch after this answer):
ffmpeg -i input.mp4 -vf "fade=in:5:8" output.mp4
After this you will have videos like: a video for image 1 and its fade effect, then for image 2 and its fade effect, and so on. Now combine those videos in order to get the whole video.
For combining the videos you need:
$command = "cat pass.mpg slideshow/frame.mpg > final.mpg";
That is, join the videos using cat: convert each clip to mpg, concatenate them, and then re-encode the result back to mp4 or avi to view it properly. The intermediate mpg videos will not play correctly on their own, so do not worry about that; once converted to mp4 the result plays fine.
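A minimal sketch of the per-clip fade step, assuming (hypothetically) 100-frame clips at 20 fps: fade in over the first 20 frames and out over the last 20.
ffmpeg -i slideshow/frame.mp4 -vf "fade=in:0:20,fade=out:80:20" slideshow/frame_faded.mp4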
You can make a slideshow with crossfading between the pictures by using the framerate filter. In the following example 0.25 is the framerate used for reading in the pictures, in this case 4 seconds for each picture. The fps parameter sets the output framerate.
The parameters interp_start and interp_end control the fading effect: interp_start=128:interp_end=128 means no fading at all, while interp_start=0:interp_end=255 means continuous fading: as soon as one picture has faded out and the next has fully faded in, the third picture immediately begins to fade in, with no pause to show the second picture. interp_start=64:interp_end=191 means half of the time is a pause for showing the pictures and the other half is fading. Unfortunately it won't be a full fade from 0 to 100%, but only from 25% to 75%. That's not exactly what you might want, but better than no fading at all.
ffmpeg -framerate 0.25 -i IMG_%3d.jpg -vf "framerate=fps=30:interp_start=64:interp_end=192:scene=100" test.mp4
You can use gifblender to create the blended, intermediary frames from your images and then convert those to a movie with ffmpeg.

Overlaying video with ffmpeg

I'm attempting to write a script that will merge 2 separate video files into 1 wider one, in which both videos play back simultaneously. I have it mostly figured out, but when I view the final output, the video that I'm overlaying plays back extremely slowly.
Here's what I'm doing:
Expand the left video to the final video dimensions
ffmpeg -i left.avi -vf "pad=640:240:0:0:black" left_wide.avi
Overlay the right video on top of the left one
ffmpeg -i left_wide.avi -vf "movie=right.avi [mv]; [in][mv] overlay=320:0" combined_video.avi
In the resulting video, the playback on the right video is about half the speed of the left video. Any idea how I can get these files to sync up?
As user 65Fbef05 said, both videos must have the same framerate.
Use -r to set the framerate, and the framerate must be the same in both videos.
To find the framerate use:
ffmpeg -i video1
ffmpeg -i video2
and look for the line which contains "Stream #0.0: Video:"; on that line you'll find the fps of the movie.
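(A hedged alternative: ffprobe can print just the rate, where r_frame_rate is the stream's base frame rate.)
ffprobe -v error -select_streams v:0 -show_entries stream=r_frame_rate -of csv=p=0 video1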
Also, I don't know what problems you'll encounter by mixing 2 audio tracks. For my part, I would use the audio from the movie being overlaid and discard the rest; see the single-pass sketch below.
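A single-pass sketch that combines the pad and overlay steps and forces a common framerate (hedged: 25 fps is an assumption; the 640x240 canvas and 320:0 offset are taken from the question):
ffmpeg -i left.avi -i right.avi -filter_complex "[0:v]fps=25,pad=640:240:0:0[bg];[1:v]fps=25[ov];[bg][ov]overlay=320:0[v]" -map "[v]" -map 1:a combined_video.avi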
