Produce waveform video from audio using FFmpeg

I am trying to create a waveform video from audio. My goal is to produce a video that looks something like this
For my test I have an mp3 that plays a short clipped sound: 4 bars of 1/4 notes followed by 4 bars of 1/8 notes, played at 120 bpm. I am having some trouble coming up with the right combination of preprocessing and filtering to produce a video that looks like the image. The colors don't have to be exact; I am more concerned with the shape of the beats. I tried a couple of different approaches using showwaves and showspectrum. I can't quite wrap my head around why the beats go past so quickly when using showwaves, yet showspectrum produces a video where I can see each individual beat.
ShowWaves
ffmpeg -i beat_test.mp3 -filter_complex "[0:a]showwaves=s=1280x100:mode=cline:rate=25:scale=sqrt,format=yuv420p[v]" -map "[v]" -map 0:a output_wav.mp4
This link will download the output of that command.
ShowSpectrum
ffmpeg -i beat_test.mp3 -filter_complex "[0:a]showspectrum=s=1280x100:mode=combined:color=intensity:saturation=5:slide=1:scale=cbrt,format=yuv420p[v]" -map "[v]" -an -map 0:a output_spec.mp4
This link will download the output of that command.
I posted the simple examples because I didn't want to confuse the issue by adding all the variations I have tried.
In practice I suppose I can get away with the output from showspectrum, but I'd like to understand where/how I am thinking about this incorrectly. Thanks for any advice.
Here is a link to the source audio file.

What showwaves does is show the waveform in real time, and the display window is 1/framerate, i.e. if the video output is 25 fps, then each frame shows the waveform of 40 ms of audio. There's no 'history' or 'memory', so you can't (directly) get a scrolling output like the one your reference video shows.
The workaround for this is to use the showwavespic filter to produce a single frame showing the entire waveform at a high enough horizontal resolution. Then do a scrolling overlay of that picture over a desired background, at a speed such that the scroll lasts as long as the audio.
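As a sketch of the first step (the filenames and the 6000x100 canvas size are assumptions; pick a width roughly proportional to the audio duration), the picture can be generated with:
ffmpeg -i audio.mp3 -filter_complex "showwavespic=s=6000x100" -frames:v 1 wavespic.png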
Basic command template for the scrolling overlay would be:
ffmpeg -loop 1 -i bg.png -loop 1 -i wavespic.png -i audio.mp3
-filter_complex "[0][1]overlay=W-w*t/mp3dur:y=SOMEFIXEDVALUE" -shortest waves.mp4
mp3dur above should be replaced with the duration of the audio file.
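If it helps, one common way to read that duration off the file (assuming ffprobe is installed alongside ffmpeg) is:
ffprobe -v error -show_entries format=duration -of default=noprint_wrappers=1:nokey=1 audio.mp3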

Related

ffmpeg scale down video dynamically (squeeze-back) or zoompan out to smaller than original

I have 2 videos, I'm trying to overlay one on top of the other, and have it shrink down in an animated fashion until it appears like a picture-in-picture setup. Then, after a few seconds it should scale back up.
This is what I am trying to achieve (these would be videos, not images):
This is the closest I've been able to get, but crucially, zoompanning "out" (as opposed to "in") does not appear to work, so of course this does not do what I want:
ffmpeg -i bg.mov -i top.mov -filter_complex "[0:v]zoompan=z='pzoom-0.1':d=1, setpts=PTS-STARTPTS[top]; [1:v]setpts=PTS-STARTPTS+2/TB, scale=1920x1080, format=yuva420p,colorchannelmixer=aa=1.0[bottom]; [top][bottom]overlay=shortest=0" -vcodec libx264 out.mp4
Is this achievable with ffmpeg?
Use the scale filter with animation, available since v4.3.
Here's something to get you started. This will expand the top layer from a height of 480 px to 1080 px in 2 seconds and then back to 480 px in 2 seconds.
ffmpeg -i bg.mov -i top.mov -filter_complex "[0:v]scale=1920x1080,setpts=PTS-STARTPTS[bg]; [1:v]setpts=PTS-STARTPTS+2/TB, scale=-1:'480+600*abs(sin((t-2)*2*PI/8))':eval=frame[top]; [bg][top]overlay" -vcodec libx264 out.mp4
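Since the animated scale expression needs v4.3 or later, it may be worth confirming the installed build first:
ffmpeg -version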

FFmpeg concat two videos quality issue

I'm trying to concat two mp4 video files using ffmpeg (with the command below): a main video and a secondary one. The main video always has 1080x1920 resolution, and the resulting video should have the same resolution.
val concat = "-i ${mainVideoPath} -i ${secondVideoPath} -filter_complex [0:v]scale=1080:1920:force_original_aspect_ratio=decrease,setsar=1:1,pad=1080:1920:(ow-iw)/2:(oh-ih)/2[s0];[1:v]scale=1080:1920:force_original_aspect_ratio=decrease,setsar=1:1,pad=1080:1920:(ow-iw)/2:(oh-ih)/2[s1];[s0][s1]concat=n=2:v=1[v] -map [v] $resultVideoPath"
The concat works fine, but the main part of my resulting video always loses quality, even though the resulting video has the same resolution.
Any help will be appreciated.
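A possible factor, offered only as a sketch: the concat filter forces a full re-encode, and with no encoder options ffmpeg falls back to its defaults (CRF 23 for libx264), which can visibly soften the main clip even at the same resolution. Appending explicit quality settings may help; e.g. replace the trailing -map [v] $resultVideoPath with the following (the -crf and -preset values are assumptions to tune):
-map [v] -c:v libx264 -crf 18 -preset slow $resultVideoPath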

ffmpeg output doesn't play on multiple devices

I have read the other Stack Overflow posts regarding this topic, so I am fairly certain this is not an exact duplicate.
ffmpeg exports a video that seems to play only on select players. I want to export a video that plays on iPhone/Mac/general players. I have seen the suggestions for the -pix_fmt yuv420p flag, but this does not seem to work anymore; I read that Mac has since changed its systems in a way that makes it incompatible.
I am running:
ffmpeg -start_number 1 -framerate 4 -pix_fmt yuv420p -i screen%01d.png output.mp4
This all works fine and I can see the video by doing:
ffplay output.mp4
But I would like to be able to transfer this to mobile or general playback. Is there any way to do this, ideally using ffmpeg? I'd rather not use two tools to do one job.
Works in Gmail
Doesn't work on QuickTime Player
Doesn't work on Flip Player
Doesn't work on iPhone
Order of options is important. It should be,
ffmpeg -start_number 1 -framerate 4 -i screen%01d.png -pix_fmt yuv420p output.mp4
Now pix_fmt is set as an output option. Originally it was applied as an input option, but since PNGs are image files with metadata rather than raw pixel data, the option had no effect there. Additionally, for web use, it's good to also set -movflags +faststart as an output option.
Note that old versions of VLC couldn't play videos with framerate < 6. Could possibly be an issue with a few other players as well. Add -r 8 as an output option to avoid that.
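Putting those suggestions together, the full command might look like this (the -r 8 and -movflags +faststart flags are the optional additions mentioned above):
ffmpeg -start_number 1 -framerate 4 -i screen%01d.png -pix_fmt yuv420p -movflags +faststart -r 8 output.mp4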

FFMPEG Start 2 videos at the same time but hide one until the other is finished

I have a generic intro sequence (no audio) and a main video clip. I want the audio from the main clip to play while the intro sequence is playing, then have the video switch from the finished intro sequence to the main video. So it's almost like playing both videos at the same time but hiding one until the other is finished. Is this possible with ffmpeg? Almost like a send-to-back function for the video of the main clip (but keeping its audio rolling so it's in sync when it shows as the intro clip finishes).
Looks like you want a J-cut. This can be done using the overlay filter.
ffmpeg -i main.mp4 -i intro.mp4 -filter_complex "[1][0]scale2ref[intro][base]; \
[base][intro]overlay=eof_action=pass[v]" -map "[v]" -map 0:a -c:a copy out.mp4
The scale2ref filter ensures that the intro is the same resolution as the main video. Then the intro is overlaid on top of the main video, in sync, and vanishes when it ends, leaving the main video on display. The audio is copied over - no processing required.

Overlaying video with ffmpeg

I'm attempting to write a script that will merge 2 separate video files into 1 wider one, in which both videos play back simultaneously. I have it mostly figured out, but when I view the final output, the video that I'm overlaying is extremely slow.
Here's what I'm doing:
Expand the left video to the final video dimensions
ffmpeg -i left.avi -vf "pad=640:240:0:0:black" left_wide.avi
Overlay the right video on top of the left one
ffmpeg -i left_wide.avi -vf "movie=right.avi [mv]; [in][mv] overlay=320:0" combined_video.avi
In the resulting video, the playback on the right video is about half the speed of the left video. Any idea how I can get these files to sync up?
As user 65Fbef05 said, both videos must have the same framerate.
Use the -r option to set the framerate; it must be the same in both videos.
To find the framerate use:
ffmpeg -i video1
ffmpeg -i video2
and look for the line that contains "Stream #0.0: Video:"; the fps of the movie is listed on that line.
Also, I don't know what problems you'll encounter by mixing the 2 audio tracks.
For my part, I would use the audio from the movie that is being overlaid and discard the rest.
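As a sketch of that fix (the 25 fps target is an assumption; use whatever rate the left video reports), the right video can be conformed before the overlay step:
ffmpeg -i right.avi -r 25 right_fixed.avi
ffmpeg -i left_wide.avi -vf "movie=right_fixed.avi [mv]; [in][mv] overlay=320:0" combined_video.avi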
