QuickTime plays the video correctly, but ffmpeg sometimes produces an upside-down thumbnail:
ffmpeg -i input.MOV -ss 00:00:00.002 -vframes 1 -y output.png
I'd like the generated thumbnails to have the correct orientation. What are some good remedies?
Thanks
Videos recorded on mobile devices have a rotation attribute in their metadata, which QuickTime uses to decide whether it needs to rotate the video on playback. Use MediaInfo to inspect your file's metadata; you can download it from https://mediaarea.net/en/MediaInfo/Download
If the video is rotated 90°, use -vf 'transpose=1' (rotate 90° clockwise); for 180°, use -vf 'hflip,vflip'.
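For example, applying that to the thumbnail command from the question (assuming MediaInfo reports a 90° rotation for this clip):
ffmpeg -i input.MOV -ss 00:00:00.002 -vframes 1 -vf 'transpose=1' -y output.png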
I have a video with an image overlay. The picture is shown nearly full screen for the first 10 seconds; afterwards I need to zoom the image out to a smaller size and place it in the bottom-left corner for the rest of the video.
At the moment I split the input image and overlay it twice on the video, without transitions: once for the "big" picture and once for the "small" picture. This works fine.
What I would like is a "zoom out and move" effect, so the transition from the big central picture to the small one is smoother.
This is my current complex filter:
-i "video.mp4"
-i "img.jpg"
-filter_complex "[1:v]split=2[img10][img20];[img10]scale=1469:856[img11];[img20]scale=293:171[img21];[0:v][img11]overlay=(main_w-overlay_w)/2:(main_h - overlay_h)/2:enable='between(t,0,10)',fade=out:st=9:d=1:alpha=1[vid];[vid][img21]overlay=10:(main_h-overlay_h-40):enable='gte(t,10)'"
-crf 18 -c:a copy "out.mp4"
How can I do this with a single image overlay and a zoom-out + move effect?
With ffmpeg 4.3 or newer, you can animate the scale filter's dimensions and the overlay filter's position with per-frame expressions:
ffmpeg
-i "video.mp4"
-loop 1 -i "img.jpg"
-filter_complex "[1:v]scale=w='if(between(t,10,14),1469-(1469-293)*(t-10)/4,if(lt(t,10),1469,293))':h='if(between(t,10,14),856-(856-171)*(t-10)/4,if(lt(t,10),856,171))':eval=frame[img];[0:v][img]overlay=x='if(between(t,10,14),(W-w)/2-((W-w)/2-10)*(t-10)/4,if(lt(t,10),(W-w)/2,10))':y='if(between(t,10,14),(H-h)/2-((H-h)/2-(H-h-40))*(t-10)/4,if(lt(t,10),(H-h)/2,H-h-40))':shortest=1"
-crf 18 -c:a copy "out.mp4"
I have to create a 'preview' shown when hovering over the video progress bar. I'm doing it with a sprite image and a WebVTT file, using ffmpeg and ImageMagick. However, generating the thumbnails from an mp4 video is really damn slow (20-30 minutes for a video that is 2 hours 20 minutes long). The video is Full HD, H.264 encoded, about 2 GB. The command used:
"ffmpeg.exe -i largevideo.mp4 -f image2 -bt 20M -vf fps=1/5 thumbs-%03d.jpg"
which produces a thumbnail for every 5 seconds of the video. Is there a way to make it faster? Videos in production can be even bigger.
OS: Windows 10. ImageMagick is used afterwards to create the sprite from all the thumbnails generated by ffmpeg.
Skip everything except keyframes:
ffmpeg -skip_frame nokey -i input.mp4 -vsync passthrough thumbs-%03d.jpg
Also see:
Get keyframe intervals
Control JPG quality
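For example, JPEG quality can be set with -q:v (roughly 2-31, lower is better quality); a sketch combining it with the keyframe-only extraction above:
ffmpeg -skip_frame nokey -i largevideo.mp4 -vsync passthrough -q:v 2 thumbs-%03d.jpg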
How to mirror + add a logo to a video?
How do I mirror a video and then add a logo to it using ffmpeg?
To flip a video horizontally, you can use -vf hflip (or vflip for vertical flipping).
Adding an image as a watermark is more complex, especially if you want to position it:
-i logo.png -filter_complex "overlay=x=(main_w-overlay_w)/2:y=(main_h-overlay_h)/2"
There are a lot of things you can do with the overlay filter, check the documentation. It gets extremely complex if you also want to scale the overlay, so make sure your logo file is the right size before that.
However, you cannot mix -vf and -filter_complex, so the flipping has to become part of the complex filter. So, for your desired result, you'd have to do this (assuming you want the logo to be at position 10/10):
ffmpeg -i input.mp4 -i logo.png -filter_complex "hflip[flipped];[flipped]overlay=x=10:y=10" out.mp4
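If you do need to scale the overlay inside the filtergraph, a minimal sketch (the 120-pixel logo width is an arbitrary example, not from the question) would be:
ffmpeg -i input.mp4 -i logo.png -filter_complex "[1:v]scale=120:-1[lg];[0:v]hflip[flipped];[flipped][lg]overlay=x=10:y=10" out.mp4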
I am trying to create a waveform video from audio. My goal is to produce a video that looks something like this
For my test I have an mp3 that plays a short clipped sound. There are 4 bars of 1/4 notes and 4 bars of 1/8 notes played at 120 bpm. I am having trouble coming up with the right combination of preprocessing and filtering to produce a video that looks like the image. The colors don't have to be exact; I am more concerned with the shape of the beats. I tried a couple of different approaches using showwaves and showspectrum. I can't quite wrap my head around why the beats go past so quickly with showwaves, while showspectrum produces a video where I can see each individual beat.
ShowWaves
ffmpeg -i beat_test.mp3 -filter_complex "[0:a]showwaves=s=1280x100:mode=cline:rate=25:scale=sqrt,format=yuv420p[v]" -map "[v]" -map 0:a output_wav.mp4
This link will download the output of that command.
ShowSpectrum
ffmpeg -i beat_test.mp3 -filter_complex "[0:a]showspectrum=s=1280x100:mode=combined:color=intensity:saturation=5:slide=1:scale=cbrt,format=yuv420p[v]" -map "[v]" -an -map 0:a output_spec.mp4
This link will download the output of that command.
I posted the simple examples because I didn't want to confuse the issue by adding all the variations I have tried.
In practice I suppose I can get away with the output from showspectrum but I'd like to understand where/how I am thinking about this incorrectly. Thanks for any advice.
Here is a link to the source audio file.
What showwaves does is render the waveform in real time, and the display window is 1/framerate, i.e. if the video output is 25 fps, each frame shows the waveform of 40 ms of audio. There's no 'history' or 'memory', so you can't (directly) get a scrolling output like the one your reference video shows.
The workaround for this is to use the showwavespic filter to produce a single frame showing the entire waveform at a high enough horizontal resolution. Then do a scrolling overlay of that picture over a desired background, at a speed such that the scroll lasts as long as the audio.
A basic command template would be:
ffmpeg -loop 1 -i bg.png -loop 1 -i wavespic.png -i audio.mp3
-filter_complex "[0][1]overlay=W-w*t/mp3dur:y=SOMEFIXEDVALUE" -shortest waves.mp4
mp3dur above should be replaced with the duration of the audio file.
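A concrete sketch under assumptions that aren't in the original question (the audio is taken to be 8 seconds long; bg.png, the 6000x200 waveform size, and the y offset of 260 are placeholders):
ffmpeg -i audio.mp3 -filter_complex "[0:a]showwavespic=s=6000x200[v]" -map "[v]" -frames:v 1 wavespic.png
ffmpeg -loop 1 -i bg.png -loop 1 -i wavespic.png -i audio.mp3 -filter_complex "[0][1]overlay=x='W-w*t/8':y=260" -shortest waves.mp4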
So I thought to embed the image as the video content while converting the m4a to mp4. The output file has the image as its video track and plays as expected in VLC, but when the same file is streamed from my CDN through JW Player, I do not see the image; the video is black while the audio still plays. I'm not sure what the issue with the embedding is. I used ffmpeg to embed the image as the video content.
Alternatively, is there any possibility with JW Player to overlay an image as the video content while the audio plays in the background?
You need to loop the image:
ffmpeg -loop 1 -r 1 -i img.png -i audio.m4a -shortest -filter:v \
'crop=trunc(iw/2)*2:trunc(ih/2)*2' out.mp4
-loop 1: repeat the image over and over
-r 1: once per second (input frame rate)
-shortest: stop the video when the audio stops
crop filter: cut the image to even dimensions if necessary
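Since the file plays in VLC but shows a black picture in the browser player, one common cause worth checking is the pixel format. This is an assumption about your case rather than a confirmed diagnosis, but explicitly encoding H.264 with yuv420p is a widely compatible variant of the same command:
ffmpeg -loop 1 -r 1 -i img.png -i audio.m4a -shortest -filter:v \
'crop=trunc(iw/2)*2:trunc(ih/2)*2' -c:v libx264 -pix_fmt yuv420p -c:a copy out.mp4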