Merge video and audio while delaying audio by x seconds - ffmpeg

This works for merging audio and video
ffmpeg -i video.mp4 -i audio.ogg -filter_complex "[0:a][1:a]amerge=inputs=2[a]" -map 0:v -map "[a]" -c:v copy -c:a libvorbis -ac 2 -shortest out.mp4 -y -nostdin
I can't figure out how to delay the audio so it starts x seconds into the video. I have tried -itsoffset but it doesn't work.

Use
ffmpeg -i video.mp4 -i audio.ogg -filter_complex "[1:a]adelay=1000|1000[a1];[0:a][a1]amerge=inputs=2[a]" -map 0:v -map "[a]" -c:v copy -c:a libvorbis -ac 2 -shortest out.mp4 -y -nostdin
The adelay filter adds 1000 ms of silence to the start of both channels of the OGG.
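To delay by some other amount, change the millisecond values passed to adelay (one value per channel). A minimal sketch, assuming a 5-second delay and the same stereo OGG input:
ffmpeg -i video.mp4 -i audio.ogg -filter_complex "[1:a]adelay=5000|5000[a1];[0:a][a1]amerge=inputs=2[a]" -map 0:v -map "[a]" -c:v copy -c:a libvorbis -ac 2 -shortest out.mp4 -y -nostdin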

This is more of a workaround, but you could concatenate 1 second of silence with your ogg first:
https://trac.ffmpeg.org/wiki/Concatenate
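A minimal sketch of that workaround, assuming audio.ogg is stereo at 44.1 kHz (adjust the anullsrc parameters to match) and using delayed.ogg as a hypothetical intermediate file name: generate 1 second of silence with the anullsrc lavfi source, concatenate it in front of the audio, then feed the result to the amerge command above in place of audio.ogg.
ffmpeg -f lavfi -t 1 -i anullsrc=channel_layout=stereo:sample_rate=44100 -i audio.ogg -filter_complex "[0:a][1:a]concat=n=2:v=0:a=1[a]" -map "[a]" -c:a libvorbis delayed.ogg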

Related

adding background music to video using ffmpeg

I am trying to add background music to a video using ffmpeg, and it works fine, but if the video is longer than the music file I want the music to start playing again until the video is over:
ffmpeg -i 1.mp4 -i 2.mp3 -map 0:v:0 -map 1:a:0 output.mp4
Is there any way to perform this action?
You can use the following example:
ffmpeg -i ./INPUT_VIDEO.mp4 -filter_complex "amovie=./INPUT_MUSIC.M4A:loop=0,asetpts=N/SR/TB[aud];[0:a][aud]amix[a]" -map 0:v -map '[a]' -c:v copy -c:a aac -b:a 256k -shortest ./OUTPUT_VIDEO.mp4
You can set/change the audio bitrate with the -b:a flag; for more detail, take a look at this document.
EDIT:
To replace the video's audio with new audio (looping the audio until the end of the video):
ffmpeg -i VIDEO.mp4 -stream_loop -1 -i MUSIC.mp3 -c copy -shortest -map 0:v:0 -map 1:a:0 OUTPUT.mp4
To replace the video's audio with new audio (looping the video until the end of the audio):
ffmpeg -stream_loop -1 -i VIDEO.mp4 -i MUSIC.mp3 -c copy -shortest -map 0:v:0 -map 1:a:0 OUTPUT.mp4
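A hedged variant, assuming VIDEO.mp4 has its own audio track that you want to keep and mix the looped music underneath rather than replace it (note that amix lowers the volume of each input by default):
ffmpeg -i VIDEO.mp4 -stream_loop -1 -i MUSIC.mp3 -filter_complex "[0:a][1:a]amix=inputs=2:duration=first[a]" -map 0:v -map "[a]" -c:v copy -c:a aac -shortest OUTPUT.mp4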

ffmpeg - cannot find matching stream for unlabeled input pad 1 on filter Parsed_amix_0

I have an mp4 and I am trying to merge multiple audio files into it. The command is generated dynamically given the number of audio files and their offsets. Everything works when I have multiple audio files to merge, but not when I have only one.
Here is the command to merge a single audio file into the mp4:
ffmpeg -y -i ./video.mp4 -itsoffset 00:00:1.753 -i ./audio0.mp3 -filter_complex amix -map 0:v -map 1:a -c:v copy -async 1 -c:a aac -strict experimental -t 10 ./finalVideo.mp4
Here is the command to merge multiple audio files, which works:
ffmpeg -y -i ./video.mp4 -itsoffset 00:00:1.753 -i ./audio0.mp3 -itsoffset 00:00:0.113 -i ./audio/audio1.mp3 -filter_complex amix -map 0:v -map 1:a -map 2:a -c:v copy -async 1 -c:a aac -strict experimental -t 10 ./finalVideo.mp4
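For what it's worth, amix defaults to inputs=2, which would explain the complaint about input pad 1 when only one audio stream is available. A minimal sketch for the single-file case, assuming you either set amix=inputs=1 or drop amix entirely and just map the offset audio:
ffmpeg -y -i ./video.mp4 -itsoffset 00:00:1.753 -i ./audio0.mp3 -filter_complex "[1:a]amix=inputs=1[a]" -map 0:v -map "[a]" -c:v copy -c:a aac -t 10 ./finalVideo.mp4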

FFMPEG - Merge 2 Files (video_video), with TC and watermark

I need to merge two video files and add a watermark and a timecode burned into the video.
I found this (by #llogan):
ffmpeg -i video.mp4 -i audio.mp3 -i watermark.png -filter_complex "[0:v:0]drawtext=fontfile=/usr/share/fonts/TTF/DejaVuSansMono.ttf:timecode='01\:23\:45\:00':r=25:x=(w-text_w)/2:y=h-text_h-20:fontsize=20:fontcolor=white:box=1:boxborderw=4:boxcolor=black[bg];[bg][2]overlay=W-w-10:H-h-12:format=auto[v]" -map "[v]" -map 1:a -shortest output.mp4
But I can't apply it to two videos because of the mapping. Can someone help me, please? My last attempt was:
ffmpeg -i [video1] -i [video2] -i [image-overlay] -filter_complex "[0:v:0]drawtext=fontfile=/Windows/Fonts/arial.ttf: timecode='00\:00\:00\:00': r=25: x=(w-tw)/2: y=h-(2*lh): fontcolor=0xccFFFF#1: fontsize=85: box=1: boxcolor=0x000000#0.2[bg];concat=n=2:v=1:a=1[vv][a];[vv][2:v]overlay=0:0[v];[vv][bg]overlay=0:0" -map "[v]" -map "[a]" -c:v libx264 -b 2000k -preset fast -c:a aac [output file]
Assuming you want to draw the timecode on video1 only, concatenate video1 with video2, and add the image overlay over the combined video:
ffmpeg -i [video1] -i [video2] -i [image-overlay] -filter_complex "[0:v:0]drawtext=fontfile=/Windows/Fonts/arial.ttf: timecode='00\:00\:00\:00': r=25: x=(w-tw)/2: y=h-(2*lh): fontcolor=0xccFFFF#1: fontsize=85: box=1: boxcolor=0x000000#0.2[tc];[tc][0:a][1:v][1:a]concat=n=2:v=1:a=1[cat][a];[cat][2:v]overlay=0:0[v]" -map "[v]" -map "[a]" -c:v libx264 -b:v 2000k -preset fast -c:a aac [output file]

FFMPEG map multiple audio input files to 1 single image file in order to create multiple video output files

I am attempting to map multiple output files to multiple inputs in ffmpeg. However, instead of each input mapping to a unique output, I get the first input mapping to the first video, and then the next videos never get created at all. I will describe exactly what I am trying to achieve below.
I need to:
create multiple video output files
from multiple audio input files
which all use the same common image file to create the video
I will post my command below; any help would be greatly appreciated. Thanks.
-y -i input1.mp3 -i input2.mp3 -f image2 -loop 1 -r 2 -i imagefile.png -shortest -c:a aac -c:v mpeg4 -crf 18 -preset veryfast -movflags faststart -map 0 output1.mp4 -map1 output2.mp4
Basic command
ffmpeg -i input1.mp3 -i input2.mp3 -loop 1 -framerate 10 -i imagefile.png -map 2:v -map 0:a -vf format=yuv420p -shortest output1.mp4 -map 2:v -map 1:a -vf format=yuv420p -shortest output2.mp4
The video is filtered and encoded once per output.
With the split filter
ffmpeg -i input1.mp3 -i input2.mp3 -loop 1 -framerate 10 -i imagefile.png -filter_complex "[2:v]format=yuv420p,split=outputs=2[v0][v1]" -map "[v0]" -map 0:a -shortest -movflags +faststart output1.mp4 -map "[v1]" -map 1:a -shortest -movflags +faststart output2.mp4
The video is filtered once total, and encoded separately for each output.
Pipe
ffmpeg -y -v error -loop 1 -framerate 10 -i imagefile.png -filter_complex "[0:v]format=yuv420p[v]" -map "[v]" -c:v libx264 -f nut - | ffmpeg -y -i - -i input1.mp3 -i input2.mp3 -map 0:v -map 1:a -c:v copy -shortest -movflags +faststart output1.mp4 -map 0:v -map 2:a -c:v copy -shortest -movflags +faststart output2.mp4
The video is filtered and encoded only once and each output stream copies it.

ffmpeg parallel encoding to make mp4 qualities

I want to make different qualities from a video in one command.
I used the code below.
But there is an issue: the output files do not have details.
ffmpeg -i input.mp4 -filter_complex "[0:v]format=yuv420p,split=2[s0][s1];[s0]scale=hd480[v0];[s1]scale=nhd[v1]" -map "[v0]" -map "[v1]" -map 0:a? -c:v libx264 -c:a aac -f tee -threads 0 "[select='v\:0,a':f=mp4]1/480.mp4|[select='v\:1,a':f=mp4]1/360.mp4"
What must I do?
With the guidance and help of #Mulvya, the answer is as follows:
ffmpeg -i input.mp4 -filter_complex "[0:v]format=yuv420p,split=2[s0][s1];[s0]scale=hd480[v0];[s1]scale=nhd[v1]" -map "[v0]" -map "[v1]" -map 0:a? -c:v libx264 -c:a aac -f tee -flags +global_header -threads 0 "[select='v\:0,a':f=mp4]1/480.mp4|[select='v\:1,a':f=mp4]1/360.mp4"
