ffmpeg: mix audio and video of different lengths

I have 2 files: one video file (without sound) with a length of 6 seconds, and one audio file with a length of 10 seconds.
Both the audio and the video contain the same conversation, but the audio starts 4 seconds earlier and the video starts after that.
[----------] audio
[------] video
So I want to mix them together into a 10-second video file where the first 4 seconds are a black screen with the audio, and then the real video and audio follow.
[====------] audio+video (where '=' is black screen)
I hope my description was clear enough. :)
How can I do this with ffmpeg or gstreamer?

Let's say the video's resolution is WxH, its frame rate is F, and the difference in durations is D seconds. Then the command is:
ffmpeg -i video.mp4 -i audio.mp3 -f lavfi -i color=s=WxH:r=F -filter_complex \
"[0]setpts=PTS-STARTPTS+D/TB[v];[2][v]overlay=eof_action=endall[vid]" \
-map "[vid]" -map 1:a output.mp4

Related

How to create a full length video from images with FFmpeg?

I have more than a thousand images that I want to transform into a 3-minute video. I tried using this line: ffmpeg -r 30 -i "E:/White-box-Cartoonization/test_code/cartoonized_images/$flower%03d.bmp" -c:v libx264 -pix_fmt yuv420p out.mp4. It worked, but it only creates a 5-second video. What do I need to do to turn it into a full-length 3-minute video?
If you have 1250 images and want an output duration of 180 seconds:
ffmpeg -framerate 1250/180 -i input%03d.bmp -c:v libx264 -vf format=yuv420p output.mp4
This example results in a frame rate of about 6.94 fps. Some players can't handle such a low frame rate. If your player does not like it, add the -r output option to produce a normal output frame rate. ffmpeg will duplicate frames, but the output will look the same.
ffmpeg -framerate 1250/180 -i input%03d.bmp -c:v libx264 -vf format=yuv420p -r 25 output.mp4
For 3 minutes of video at 30 frames per second (-r parameter) you'd need 30*60*3 images: 5400 images.
Your source parameter specifies there would be only 3 digits, so you have a maximum of 1000 source images:
$flower%03d.bmp => $flower000.bmp .. $flower999.bmp
1000 images at 30 frames per second would give about 33 seconds of video ... if you actually have $flowerxxx.bmp files.
You might need a 4th digit in there somewhere.
$flower%04d.bmp
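Purely to illustrate that arithmetic: if the images were renamed to a four-digit pattern and there really were 5400 of them, the question's own command would become (the path and $flower prefix are copied verbatim from the question)
ffmpeg -r 30 -i "E:/White-box-Cartoonization/test_code/cartoonized_images/$flower%04d.bmp" -c:v libx264 -pix_fmt yuv420p out.mp4
and 5400 frames at 30 frames per second works out to exactly 180 seconds of video.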

FFmpeg: add a short audio at the end of an mp4 video

What would be the command to add a short audio clip at the end of an mp4 video?
Let's say the overlay audio lasts 3 seconds and my video lasts 26 minutes.
At 25:50 (video duration minus 10 seconds) the overlay audio starts. The original audio of the video is silenced for those 3 seconds while the overlay audio runs; then the original video sound comes back after the 3 seconds.
Basic syntax is
ffmpeg -i video -i audio \
-filter_complex "[0]volume=0:enable='between(t,1550,1553)'[v];\
[1]adelay=1550000|1550000,apad[a];\
[v][a]amix=duration=first:dropout_transition=0,volume=2[mix]" \
-map 0:v -map '[mix]' -c:v copy output
Here 1550 and 1553 are the overlay's start and end times in seconds (25:50 to 25:53 of a 26-minute video), and 1550000 is the same start time in milliseconds, which is what adelay expects. The volume filter mutes the original audio during that window, apad extends the delayed overlay audio with silence so amix doesn't drop an input, amix=duration=first trims the mix to the original audio's length, and the final volume=2 compensates for amix halving the levels of its two inputs.

FFMPEG is doubling audio length when extracting from video

I've got a video file, video.mp4. It is 18 minutes 23 seconds in duration. I am looking to extract the audio only from this video, and create an MP3 of the highest possible quality from the audio in the video.
Some googlefu led me to this command: ffmpeg -i video.mp4 audio.mp3
The problem is that this command doubles the length of the output audio (its duration is 36 minutes 46 seconds). It loops the audio track once, so the output contains the entire 18 minutes 23 seconds of audio, then immediately starts the 18 minutes 23 seconds of audio over again.
Some more googlefu led me to this flag, -write_xing 0, from this SO question, but even with that flag it still loops the audio.
EDIT: Additional googlefu, and my suspicion that it has something to do with the 2 audio channels (perhaps looping channel 2 immediately after channel 1 rather than merging the two), led me to this flag: -ac 1 to force it to merge stereo to mono. This did not work either, and it still outputs a 36-minute 46-second MP3 file.
How can I extract (to MP3) the audio from a video file, without doubling the duration?
Your googlefu must be malfunctioning.
If you have a single audio track:
ffmpeg -i movie.mp4 -map 0:a -c:a mp3 audio.mp3
If you have multiple audio tracks:
Identify the track:
Run ffprobe -i movie.mp4 and look for an audio stream, Stream #0:x, where x is an integer.
Then use the above command with -map 0:x. Example for x = 2:
ffmpeg -i movie.mp4 -map 0:2 -c:a mp3 audio.mp3
How to use the -map option
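Since the question also asks for the highest possible quality, a sketch using libmp3lame's best VBR setting (this assumes your ffmpeg build encodes MP3 with libmp3lame, which is the usual case; it is not part of the original answer):
ffmpeg -i movie.mp4 -map 0:a -c:a libmp3lame -q:a 0 audio.mp3
-q:a 0 selects LAME's highest-quality VBR mode, typically around 245 kbps.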

FFmpeg Slideshow issues

I'm trying to get my head around ffmpeg to create a slideshow where each image is displayed for ~5 seconds with some audio. I created a bat file to run the following so far:
ffmpeg -f image2 -i image-%%03d.jpg -i music.mp3 output.mpg
It gets the images and displays them all very fast in the first second of the video, then plays out the rest of the audio while showing the last image.
I want to make the images stay up longer (about 5 seconds) and stop the video after the last frame (not playing the rest of the song). Are either of these things possible? I could hack the frame rate, I guess, by having hundreds of copies of the same image in order to keep each one up longer, but this is far from ideal!
Thanks
The default encoder for mpg output, mpeg1video, is strict about the allowed frame rates, so an input and an output -r are required:
ffmpeg -r 1/5 -i image-%03d.jpg -i music.mp3 -r 25 -qscale:v 2 -shortest -codec:a copy output.mpg
The input images will have a frame rate of 1 frame every 5 seconds and the output will duplicate frames to reach 25 frames per second.
-f image2 is generally not required.
-qscale:v can control output quality. A sane range is 2-5.
-shortest will make the output duration the same as the shortest input duration.
-codec:a copy will copy your MP3 audio instead of re-encoding it.
MPEG-1 video has more modern alternatives. See the FFmpeg and x264 Encoding Guide for more info.
Also see:
* FFmpeg FAQ: How do I encode single pictures into movies?
* FFmpeg Wiki: Create a video slideshow from images
You could use the fps filter instead of the output -r option:
ffmpeg -r 1/5 -i img%03d.png -i musicfile -c:v libx264 -vf fps=25 -pix_fmt yuv420p out.mp4
However, this strangely skips the last image for me.

Show watermark at the beginning of the video

I need to add a watermark for the first 3 seconds of the video using ffmpeg. Here's what I've got right now:
ffmpeg -y -i '255871.mov' -qscale:v 0 -qscale:a 0 -vf '[in] transpose=1 [out];movie=watermark.png , select=lte(t\,3) [bg]; [out][bg] overlay=x=20:y=main_h-60 [out]' output.mp4
It rotates the video to the right and is supposed to add the watermark at the bottom of the video for the first 3 seconds. The problem is that the watermark is visible during the whole video.
I thought the select filter doesn't work at all, so I tried the following command:
ffmpeg -y -i '255871.mov' -qscale:v 0 -qscale:a 0 -vf '[in] transpose=1 [out];movie=watermark.png , select=0 [bg]; [out][bg] overlay=x=20:y=main_h-60 [out]' output.mp4
The watermark is not visible. This is correct and proves that the select filter works as expected. As I understand it, this is how ffmpeg works: it keeps showing the last frame of the shorter video for the rest of the output.
How can I force ffmpeg to stop showing the watermark after N seconds?
I have to answer it myself; the ffmpeg mailing list helped me solve the issue.
The main idea is to convert the existing watermark into a video using the Apple Animation codec (it supports transparency) and to fade out the last frame of the created video using the fade filter.
Example:
ffmpeg -loop 1 -i watermark.png -t 3 -c qtrle -vf 'fade=out:73:1:alpha=1' watermark.mov
ffmpeg -y -i '255871.mov' -qscale:v 0 -qscale:a 0 -vf '[in] transpose=1 [out];movie=watermark.mov [bg]; [out][bg] overlay=x=20:y=main_h-60 [out]' output.mp4
The fade-out is required because ffmpeg uses the last frame of the overlaid video for the rest of the main video. The fade filter makes that last frame fully transparent via the alpha=1 parameter. In fact it should be fade=out:74:1:alpha=1, but that didn't work for me; I don't know why.
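On reasonably recent ffmpeg builds the intermediate QuickTime file is not needed, because the overlay filter supports timeline editing. As a sketch of that alternative approach (not the mailing-list solution above), reusing the filenames from the question:
ffmpeg -y -i '255871.mov' -i watermark.png -filter_complex \
"[0:v]transpose=1[rot];[rot][1]overlay=x=20:y=main_h-60:enable='lte(t,3)'" \
output.mp4
The enable='lte(t,3)' expression switches the overlay off after the first 3 seconds, so no fade trick is required.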
