Cutting audio and adding an overlay in a single FFmpeg command

How can I add an overlay and cut the audio from a particular time range in any type of video?
Here is what I am trying:
ffmpeg -ss 5 -t 30 -i Happier.mp4 -i Watermark.png -filter_complex "[0:v][1:v] overlay=0:0:enable='between(t,5,30)'" -preset ultrafast -pix_fmt yuv420p -c:a copy output.mp4

Use
ffmpeg -i Happier.mp4 -i Watermark.png \
-filter_complex "[0:v][1:v] overlay=0:0:enable='between(t,5,30)'[v]; \
[0]volume=0:enable='between(t,5,30)'[a]" \
-map "[v]" -map "[a]" -preset ultrafast -pix_fmt yuv420p output.mp4

Related

Multiple sounds + watermark overlay not working with ffmpeg

I have a problem with an ffmpeg command.
I want to add the same sound several times in the final video and then add a watermark above.
When I do the full command, it doesn't work correctly because the sound is only played once (the first reference):
ffmpeg -i "assets/frame%05d.png" -i "assets/sound.mp3" -loop 1 -i
"assets/watermark.png" -filter_complex
"[1:a]adelay=1000|1000[s1];[1:a]adelay=3000|3000[s2];[s1][s2]amix=2[a];[0:v][2:v]overlay=shortest=1[outv]"
-map "[outv]" -map "[a]" -c:v libx264 -pix_fmt yuv420p -preset ultrafast -y "result.mp4"
When I don't add the watermark, it works correctly:
ffmpeg -i "assets/frame%05d.png" -i "assets/sound.mp3" -filter_complex
"[1:a]adelay=1000|1000[s1];[1:a]adelay=3000|3000[s2];[s1][s2]amix=2[a]"
-map 0:v -map "[a]" -c:v libx264 -pix_fmt yuv420p -preset ultrafast -y "result.mp4"
Try this:
ffmpeg -i "assets/frame%05d.png" \
-stream_loop -1 -i "assets/sound.mp3" \
-loop 1 -i "assets/watermark.png" \
-filter_complex "[1:a]adelay=1000|1000[s1];\
[1:a]adelay=3000|3000[s2];\
[s1][s2]amix=2[a];\
[0:v][2:v]overlay=shortest=1[outv]" \
-map "[outv]" -map "[a]" -shortest \
-c:v libx264 -pix_fmt yuv420p -preset ultrafast -y "result.mp4"
The -stream_loop -1 input option loops the audio input indefinitely, and the -shortest output option stops the output once the video is done.
P.S. I think an aecho filter could replace the second adelay and the amix filter.
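For what it's worth, here is a rough, untested sketch of that aecho idea: aecho takes in_gain:out_gain:delays:decays, so a full-gain 2000 ms echo after the first adelay recreates the second copy at t=3s (note that amix normalizes levels while aecho does not, so the mix level differs slightly):
ffmpeg -i "assets/frame%05d.png" \
-stream_loop -1 -i "assets/sound.mp3" \
-loop 1 -i "assets/watermark.png" \
-filter_complex "[1:a]adelay=1000|1000,aecho=1:1:2000:1[a];\
[0:v][2:v]overlay=shortest=1[outv]" \
-map "[outv]" -map "[a]" -shortest \
-c:v libx264 -pix_fmt yuv420p -preset ultrafast -y "result.mp4"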
I found that the underlying problem was that the frames making up the video were not all encoded the same way, which is what caused the problem with the sounds.
ffmpeg couldn't read the indexed PNGs correctly.
You therefore need to force a non-indexed color type when writing the frames (for those like me who use Imagick):
// Force PNG color type 6 (truecolor with alpha) so every frame uses the same, non-indexed pixel format
$imagick->setOption('png:color-type', 6);
$imagick->writeImage('frame00000.png');

ffmpeg: overlay multiple images onto a video

To overlay a single image onto a video, I can do:
ffmpeg -i vid00.mp4 -i img00.png -filter_complex "[0:v][1:v]overlay=0:0:enable='between(t, 1, 2)'" -c:v libx264 -preset ultrafast -qp 20 -c:a copy -y vid01.mp4
How can I overlay multiple images to a video in a single ffmpeg call?
I've tried stuff like:
ffmpeg -i vid00.mp4 -i img00.png -i img01.png -filter_complex "\
[0:v][1:v]overlay=0:0:enable='between(t, 1, 2)'[v0]; \
[2:v][3:v]overlay=0:0:enable='between(t, 3, 4)'[v1]; \
[v0][v1]concat=n=2:v=1:a=0,format=yuv420p[v]" -map "[v]" -map 0:a -c:v libx264 -preset ultrafast -qp 20 -c:a copy -y vid01.mp4
and variations thereof (by messing with the [0:v][1:v] indices), but to no avail.
Combined command, chaining the overlays so the output of the first overlay becomes the base of the second:
ffmpeg -i vid00.mp4 -i img00.png -i img01.png -filter_complex "[0:v][1:v]overlay=0:0:enable='between(t, 1, 2)'[v0];[v0][2:v]overlay=0:0:enable='between(t, 3, 4)'" -c:v libx264 -preset ultrafast -qp 20 -c:a copy -y vid01.mp4
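The same chaining pattern extends to more images; as a hypothetical example with a third image img02.png shown from t=5 to t=6:
ffmpeg -i vid00.mp4 -i img00.png -i img01.png -i img02.png -filter_complex "\
[0:v][1:v]overlay=0:0:enable='between(t, 1, 2)'[v0]; \
[v0][2:v]overlay=0:0:enable='between(t, 3, 4)'[v1]; \
[v1][3:v]overlay=0:0:enable='between(t, 5, 6)'" -c:v libx264 -preset ultrafast -qp 20 -c:a copy -y vid01.mp4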

Can I combine these 2 commands? (or am I fighting a losing battle?)

I'm very new to ffmpeg, but so far I'm enjoying it. I'm stuck on something, though: I want to combine these two commands into one, which I'm sure must be possible, but after countless hours and no luck, here I am :)
ffmpeg -y -f concat -safe 0 -protocol_whitelist "file,http,https,tcp,tls" -i "tmp.images.txt" -i "tmp.audio.mp3" -filter_complex "drawbox=y=ih-38:color=black#0.6:width=iw:height=38:t=fill, drawtext=fontfile=Assets/calibrib.ttf:text='%%~ni':fontcolor=white:fontsize=14:x=(w-tw)/2:y=(h)-24" -c:v libx264 -preset veryfast -tune stillimage -shortest -pix_fmt yuv420p "tmp.slide.mp4"
ffmpeg -loop 1 -framerate 2 -i "Assets/studio.jpg" -i tmp.slide.mp4 -filter_complex "[1]scale=879:496[inner];[0][inner]overlay=207:49:shortest=1[out]" -map "[out]" -map 1:a -c:a aac -y tmp.output.mp4
The first command creates a slideshow and places text at the bottom.
The second takes the slideshow video and inserts it into a background image before outputting the final video.
Use
ffmpeg -y -f concat -safe 0 -protocol_whitelist "file,http,https,tcp,tls" -i "tmp.images.txt" -i "tmp.audio.mp3" -i "Assets/studio.jpg" -filter_complex "[0]drawbox=y=ih-38:color=black#0.6:width=iw:height=38:t=fill, drawtext=fontfile=Assets/calibrib.ttf:text='%%~ni':fontcolor=white:fontsize=14:x=(w-tw)/2:y=(h)-24,scale=879:496[inner];[2][inner]overlay=207:49" -c:v libx264 -preset veryfast -tune stillimage -c:a aac -shortest -pix_fmt yuv420p "tmp.slide.mp4"
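If automatic stream selection ever picks the wrong streams, the same command can be made explicit by labelling the overlay output and mapping it together with the MP3 audio (a defensive variant of the command above, otherwise identical):
ffmpeg -y -f concat -safe 0 -protocol_whitelist "file,http,https,tcp,tls" -i "tmp.images.txt" -i "tmp.audio.mp3" -i "Assets/studio.jpg" -filter_complex "[0]drawbox=y=ih-38:color=black#0.6:width=iw:height=38:t=fill, drawtext=fontfile=Assets/calibrib.ttf:text='%%~ni':fontcolor=white:fontsize=14:x=(w-tw)/2:y=(h)-24,scale=879:496[inner];[2][inner]overlay=207:49[out]" -map "[out]" -map 1:a -c:v libx264 -preset veryfast -tune stillimage -c:a aac -shortest -pix_fmt yuv420p "tmp.slide.mp4"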

ffmpeg: convert WMV to MP4 and add a logo image in the same command

The script I use to add a logo:
ffmpeg -i input.mp4 -framerate 30000/1001 -loop 1 -i test.png \
-filter_complex "[1:v] fade=out:st=30:d=1:alpha=1 [ov]; \
[0:v][ov] overlay=10:10 [v]" -map "[v]" -map 0:a \
-c:v libx264 -c:a copy -shortest output.mp4
The command I use to convert the video (it produces the WebM, the MP4, and a thumbnail picture in one go):
ffmpeg -i input.wmv -c:v libvpx -crf 10 -b:v 1M -c:a libvorbis \
outputwebm.webm -c:v libx264 -crf 35 outputmp4.mp4 \
-vf "thumbnail,scale=640:360" -frames:v 1 outputpng.png
I want to add the logo image during that same conversion.
The command I tried:
ffmpeg -i input.wmv -c:v libvpx -crf 10 -b:v 1M \
-c:a libvorbis outputwebm.webm -c:v libx264 \
-crf 35 -framerate 30000/1001 -loop 1 -i test.png \
-filter_complex "[1:v] fade=out:st=30:d=1:alpha=1 [ov]; \
[0:v][ov] overlay=10:10 [v]" -map "[v]" -map 0:a \
-c:v libx264 -c:a copy -shortest outputmp4.mp4 \
-vf "thumbnail,scale=640:360" -frames:v 1 outputpng.png
Group all inputs at the front of the command, and remove the encoding options that were meant for the temporary MP4 file:
ffmpeg -i input.wmv -framerate 30000/1001 -loop 1 -i test.png \
-c:v libvpx -crf 10 -b:v 1M -c:a libvorbis outputwebm.webm \
-filter_complex "[1:v] fade=out:st=30:d=1:alpha=1 [ov]; [0:v][ov] overlay=10:10 [v]" \
-map "[v]" -map 0:a -c:v libx264 -c:a copy -shortest outputmp4.mp4 \
-vf "thumbnail,scale=640:360" -frames:v 1 outputpng.png
If your PNG has a greater resolution than the WMV, then you'll need to map the video stream explicitly for the WebM and PNG outputs.
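A rough sketch of what that explicit mapping could look like (untested; it simply pins input 0's streams to the WebM and PNG outputs so a larger PNG is never auto-selected as the video):
ffmpeg -i input.wmv -framerate 30000/1001 -loop 1 -i test.png \
-map 0:v -map 0:a -c:v libvpx -crf 10 -b:v 1M -c:a libvorbis outputwebm.webm \
-filter_complex "[1:v] fade=out:st=30:d=1:alpha=1 [ov]; [0:v][ov] overlay=10:10 [v]" \
-map "[v]" -map 0:a -c:v libx264 -c:a copy -shortest outputmp4.mp4 \
-map 0:v -vf "thumbnail,scale=640:360" -frames:v 1 outputpng.png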

FFmpeg - Combine multiple filter_complex and overlay passes

I am having trouble combining these 3 passes in ffmpeg into a single process.
Is this even possible?
Pass 1
ffmpeg -y -i C:\Users\MJ\Downloads\20151211_pmoney_pmpod.mp3 -loop 1 -i C:\Users\MJ\Documents\pm1080.png -filter_complex "[0:a]showwaves=s=1920x1080:mode=line,colorkey=0x000000:0.01:0.1,format=yuva420p[v];[1:v][v]overlay=0:270[outv]" -map "[outv]" -pix_fmt yuv420p -map 0:a -c:v libx264 -c:a copy -shortest C:\Users\MJ\Documents\20151211_pmoney_pmpod4.mp4
Pass 2
ffmpeg -i "C:\Users\MJ\Documents\20151211_pmoney_pmpod4.mp4" -vf drawtext="fontsize=50:fontcolor=white:fontfile=/Windows/Fonts/impact.ttf:text=Planet Money Podcast on NPR - A/B Split Testing:x=(w-text_w)/2:y=200" -acodec copy "C:\Users\MJ\Documents\20151211_pmoney_pmpod-overlay-text.mp4"
Pass 3
ffmpeg -i "C:\Users\MJ\Documents\20151211_pmoney_pmpod-overlay-text.mp4" -i C:\Users\MJ\Downloads\6.png -filter_complex "overlay=10:10" C:\Users\MJ\Documents\20151211_pmoney_pmpod-overlay-text1.mp4"
Thanks!
Join filters within a chain with commas, and separate filterchains with semicolons:
ffmpeg -i audio.mp3 -i image1.png -i image2.png -filter_complex \
"[0:a]showwaves=s=1920x1080:mode=line[fg]; \
[1:v][fg]overlay=0:270,drawtext=fontsize=50:fontcolor=white:fontfile=/Windows/Fonts/impact.ttf:text='Planet Money Podcast on NPR - A/B Split Testing':x=(w-text_w)/2:y=200[bg]; \
[bg][2:v]overlay=10:10,format=yuv420p[outv]" \
-map "[outv]" -map 0:a -c:v libx264 -c:a copy -movflags +faststart -shortest out.mp4
