Morphing images into each other using ffmpeg

I have a directory containing jpegs. Using ffmpeg I want to read the images and create a video out of them. The images should morph into each other, and each image should be visible for ~2 seconds.
So far, I came up with this:
ffmpeg -i %d.jpg -c:v libx264 -pix_fmt yuv420p ./bla.mp4
Now what I'm missing is the morphing part. I know there is a "morph" filter, however no matter how I try it, I either get a No such filter: 'morph' error (even though I compiled ffmpeg on my own, making sure the needed libs are included) or I get a random 0xfff-something error.
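There is no filter named morph in a stock ffmpeg build; a common substitute is a plain crossfade with the xfade filter (available in recent ffmpeg versions). A minimal sketch for two images, assuming 1.jpg and 2.jpg exist and have identical dimensions:
ffmpeg -loop 1 -t 3 -i 1.jpg -loop 1 -t 3 -i 2.jpg -filter_complex "[0][1]xfade=transition=fade:duration=1:offset=2,format=yuv420p" -c:v libx264 crossfade.mp4
Each image is held for ~2 seconds and the 1-second fade starts at offset=2; more images need one additional xfade stage per pair, with the offsets adjusted accordingly.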

Related

ffmpeg record screen file formats

Good day,
I'm currently writing a bash script which records the screen under certain conditions. The problem is that only avi works as a file extension for recording the screen. This script is going to be used on a Raspberry Pi, and currently I only get 10-20 fps on a decent virtual machine (the goal would be around 30 fps). I think .avi is not suited for my project, but .mpeg and .mp4 are not working for recording. I tried recording with .avi and then converting it to .mp4, but I have limited memory and .avi is just too big. I currently use the following command:
ffmpeg -f x11grab -y -r 30 -s 960x750 -i :0.0+0,100 -vcodec huffyuv ./Videos/out_$now.avi
//$now is the current date and time
So I wanted to know whether I need some special packages for ffmpeg to record to, for example, .mp4, or whether there are other file formats available for ffmpeg screen recording.
Edit:
I found that the libx264 codec works for mp4, but the fps drops until it hits 5 fps, which is definitely too low. The recorded video looks like a fast-forward version of the recorded screen.
With mpeg4 for mpeg I reached over 30 fps, but the video quality was very bad.
It appears that even my big .avi files look like they are played in fast forward. Is there something I am doing wrong?
Is there a good middle way, where I get decent video quality, good fps (20+) and a file which isn't too big?
Edit 2:
I tried recording it with .avi and converting it afterwards. Just converting with ffmpeg -i test.avi -c:a aac -b:a 128k -c:v libx264 -crf 23 output.mp4
resulted in the same frame drops as if I was recording with .mp4. But when I cut a little bit off the beginning of the video and named the output file .mp4, the size became much smaller. However, when I started the cut at 0:00:00 (so effectively just converting), it only changed the file format without converting it (the size stayed the same). Any ideas?
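A possible middle way is to skip the intermediate .avi entirely and record straight to H.264 with the ultrafast preset so the encoder can keep up; this is only a sketch (the -crf value is a guess to trade size against quality, and a Raspberry Pi may still struggle):
ffmpeg -f x11grab -framerate 30 -video_size 960x750 -i :0.0+0,100 -c:v libx264 -preset ultrafast -crf 28 -pix_fmt yuv420p ./Videos/out_$now.mp4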

ffmpeg: concatenating mp4 files. Video freezes for few seconds on first frame on output mp4

I created several MP4 files using ffmpeg. All of the videos have the same settings and codec; the only difference is the frame rate and the duration. I then concatenated the videos using the command below.
ffmpeg -f concat -i myList.txt -c copy output.mp4
I notice that when opening the output.mp4 file in Windows Media Player, it freezes on the first frame for about three to four seconds and then starts playing; the rest of the video has the correct fps and runs smoothly. Has anyone encountered this issue? I would like the video to start as soon as it is launched. Any suggestions to mitigate this issue?
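For reference, the concat demuxer expects myList.txt to be a plain text list of entries like the one below (the filenames are placeholders):
file 'part1.mp4'
file 'part2.mp4'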
Update: So far, I have found that the video length is exactly what I expect it to be.
ffprobe -i output.mp4
When I ffplay the video, it runs smoothly, but when I use Windows Media Player, it gets stuck on the first frame for about 4-5 seconds and then plays smoothly. So I am going to assume that this issue is related to the media player (buffering/loading before playing). Can't be sure though.
I solved this problem by converting my input files to avi and resizing them to the same size.
And then running:
ffmpeg -i "concat:file1.avi|file2.avi" -c copy out.avi

ffmpeg output doesn't play on multiple devices

I have read the other Stack Overflow posts regarding this topic, so I am fairly certain this is not an exact duplicate.
ffmpeg exports a video that seems to only play on select players. I want to export a video that plays on iPhone/Mac/general players. I have seen the suggestions for the -pix_fmt yuv420p option, but this does not seem to work anymore - I read that Apple has since changed their systems in a way that makes it incompatible.
I am running:
ffmpeg -start_number 1 -framerate 4 -pix_fmt yuv420p -i screen%01d.png output.mp4
This all works fine and I can see the video by doing:
ffplay output.mp4
But I would like to be able to transfer this to mobile or general playback, any way to do this, ideally using ffmpeg? I'd rather not use two tools to do 1 job.
Works in Gmail
Doesn't work on QuickTime Player
Doesn't work on Flip Player
Doesn't work on iPhone
The order of options is important. It should be:
ffmpeg -start_number 1 -framerate 4 -i screen%01d.png -pix_fmt yuv420p output.mp4
Now pix_fmt is set as an output option. Originally, it was trying to force the input format, but since PNGs are images with metadata and not raw pixel data, the option had no effect. Additionally, for web use, it's good to also set -movflags +faststart as an output option.
Note that old versions of VLC couldn't play videos with a framerate below 6. This could possibly be an issue with a few other players as well. Add -r 8 as an output option to avoid that.
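Putting the answer's suggestions together, the complete command could look like this (a sketch combining the reordered -pix_fmt, -movflags +faststart and -r 8 discussed above):
ffmpeg -start_number 1 -framerate 4 -i screen%01d.png -r 8 -pix_fmt yuv420p -movflags +faststart output.mp4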

Produce waveform video from audio using FFMPEG

I am trying to create a waveform video from audio. My goal is to produce a video that looks something like this
For my test I have an mp3 that plays a short clipped sound. There are 4 bars of 1/4 notes and 4 bars of 1/8 notes played at 120 bpm. I am having some trouble coming up with the right combination of preprocessing and filtering to produce a video that looks like the image. The colors don't have to be exact; I am more concerned with the shape of the beats. I tried a couple of different approaches using showwaves and showspectrum. I can't quite wrap my head around why, when using showwaves, the beats go past so quickly, yet showspectrum produces a video where I can see each individual beat.
ShowWaves
ffmpeg -i beat_test.mp3 -filter_complex "[0:a]showwaves=s=1280x100:mode=cline:rate=25:scale=sqrt,format=yuv420p[v]" -map "[v]" -map 0:a output_wav.mp4
This link will download the output of that command.
ShowSpectrum
ffmpeg -i beat_test.mp3 -filter_complex "[0:a]showspectrum=s=1280x100:mode=combined:color=intensity:saturation=5:slide=1:scale=cbrt,format=yuv420p[v]" -map "[v]" -an -map 0:a output_spec.mp4
This link will download the output of that command.
I posted the simple examples because I didn't want to confuse the issue by adding all the variations I have tried.
In practice I suppose I can get away with the output from showspectrum but I'd like to understand where/how I am thinking about this incorrectly. Thanks for any advice.
Here is a link to the source audio file.
What showwaves does is show the waveform in realtime, and the display window is 1/framerate, i.e. if the video output is 25 fps, then each frame shows the waveform of 40 ms of audio. There's no 'history' or 'memory', so you can't (directly) get a scrolling output like the one your reference video seems to show.
The workaround for this is to use the showwavespic filter to produce a single frame showing the entire waveform at a high enough horizontal resolution. Then do a scrolling overlay of that picture over a desired background, at a speed such that the scroll lasts as long as the audio.
Basic command template would be:
ffmpeg -loop 1 -i bg.png -loop 1 -i wavespic.png -i audio.mp3 \
-filter_complex "[0][1]overlay=W-w*t/mp3dur:y=SOMEFIXEDVALUE" -shortest waves.mp4
mp3dur above should be replaced with the duration of the audio file.
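As a sketch of the first step, the waveform picture itself could be generated with showwavespic; the 6000x200 size is an assumption and should be picked so that the later scroll speed looks right:
ffmpeg -i audio.mp3 -filter_complex "showwavespic=s=6000x200" -frames:v 1 wavespic.png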

ffmpeg GIF to WebM decoding issue

I'm trying to convert GIF files into WebM (ffmpeg, libvpx) and getting some strange ffmpeg behaviour.
ffmpeg is installed on my mac from MacPorts.
Converting with:
ffmpeg -i srcFilename.gif -b:v 600K -qmin 0 -qmax 50 -crf 5 destFilename.webm
If my GIF file has some frame(s) with a 1-2s duration somewhere in the middle of the animation, like this, the conversion result is fine - it plays with the "pause" in the middle.
But if I have a GIF like this with the "pause" on the last frame, ffmpeg decodes it without the delay.
I have no idea why; I spent some time reading the ffmpeg manual and trying different conversion options, with no success.
Any ideas? Thanks in advance!
I wrote an email to the GIF decoder author and he answered that he knows about this issue. It's located somewhere deep inside ffmpeg and he has no idea how to fix it right now.
So, I'm using a "dirty hack" in my project - just adding a copy of the last frame with zero delay to the GIF file before encoding.
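One way to script that hack could be ImageMagick (an assumption on my part, not necessarily what the author used): clone the last frame, give the clone a zero delay, and append it before converting:
convert input.gif \( -clone -1 -set delay 0 \) output.gif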
