FFmpeg multiple file patterns not working in a single command

I want to use multiple image file sequences in a single ffmpeg command. Below is my code. The video is created, but only the first image sequence is used; the second is ignored.
ffmpeg -y -i input.mp4 -start_number 0000001 -i 1/%07d.png -i 2/%07d.png -filter_complex "[0][1]overlay=x=10:y=10:enable='between(t,0,3)'[v1];[v1][2]overlay=x=10:y=10:enable='between(t,3.8561422222222,6.9761777777778)'[v2]" -map "[v2]" -map 0:a out.mp4
The problem is that ffmpeg wants a continuous image sequence, which I don't have. I have images starting from 0000001.png in each folder. How can I accomplish this without changing my images much?

Try the glob pattern to deal with the inconsistent numbering, and shift the second sequence's timestamps with setpts so its frames aren't consumed before the second overlay window starts:
ffmpeg -y -i input.mp4 -pattern_type glob -i "1/*.png" -pattern_type glob -i "2/*.png" -filter_complex "[0][1]overlay=x=10:y=10:enable='between(t,0,3)'[v1];[2]setpts=PTS+3.856/TB[fg];[v1][fg]overlay=x=10:y=10:enable='between(t,3.8561422222222,6.9761777777778)'[v2]" -map "[v2]" -map 0:a out.mp4
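Note that the image demuxer assumes 25 fps for the PNG inputs unless told otherwise; if your frames were rendered at, say, 30 fps (an assumed value, not something from the question), add -framerate 30 before each glob input in the command above. A quick standalone check of how one folder plays back at that rate:
ffmpeg -framerate 30 -pattern_type glob -i "1/*.png" -c:v libx264 -pix_fmt yuv420p seq1.mp4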

Can you pipe the images to ffmpeg with -f image2pipe? A single pipe shows up as one input, so both folders arrive as a single stream on [1]; split it to keep the two overlay windows:
cat $(find 1 2 -name '*.png' -print | sort) | ffmpeg -y -i input.mp4 \
-f image2pipe -vcodec png -i - \
-filter_complex "[1]split[i1][i2];[0][i1]overlay=x=10:y=10:enable='between(t,0,3)'[v1];[v1][i2]overlay=x=10:y=10:enable='between(t,3.8561422222222,6.9761777777778)'[v2]" \
-map "[v2]" -map 0:a out.mp4

Related

ffmpeg combining multiple commands into one

Just wondering how to combine the commands below into one. I have read about combining simple filters with commas and separating filter chains in a filter_complex with semicolons, but I'm not sure how to do it here.
Basically I want the output of the first command to be the input of the second.
Command 1: concatenate multiple clips into one with different xfade transitions.
Command 2: add a fade in for a video
ffmpeg -i input0.mp4 -i input1.mp4 -i input2.mp4 -i input3.mp4 -i input4.mp4 -filter_complex "[0:v][1:v]xfade=transition=fade:duration=0.500:offset=27.486[v01];[v01][2:v]xfade=transition=fadeblack:duration=1.000:offset=31.531[v02];[v02][3:v]xfade=transition=fadeblack:duration=1.000:offset=42.972[v03];[v03][4:v]xfade=transition=fade:duration=0.500:offset=94.149,format=yuv420p[video];[0:a][1:a]acrossfade=d=0.500:c1=tri:c2=tri[a01];[a01][2:a]acrossfade=d=1.000:c1=tri:c2=tri[a02];[a02][3:a]acrossfade=d=1.000:c1=tri:c2=tri[a03];[a03][4:a]acrossfade=d=0.500:c1=tri:c2=tri[audio]" -map [video] -map [audio] -movflags +faststart output.mp4
ffmpeg -i input.mp4 -vf "fade=t=in:st=0.000:d=1.000:color=black" -c:a copy output.mp4
Combined command:
ffmpeg -i input0.mp4 -i input1.mp4 -i input2.mp4 -i input3.mp4 -i input4.mp4 -filter_complex "[0:v][1:v]xfade=transition=fade:duration=0.500:offset=27.486[v01];[v01][2:v]xfade=transition=fadeblack:duration=1.000:offset=31.531[v02];[v02][3:v]xfade=transition=fadeblack:duration=1.000:offset=42.972[v03];[v03][4:v]xfade=transition=fade:duration=0.500:offset=94.149,format=yuv420p,fade=t=in:st=0.000:d=1.000:color=black[video];[0:a][1:a]acrossfade=d=0.500:c1=tri:c2=tri[a01];[a01][2:a]acrossfade=d=1.000:c1=tri:c2=tri[a02];[a02][3:a]acrossfade=d=1.000:c1=tri:c2=tri[a03];[a03][4:a]acrossfade=d=0.500:c1=tri:c2=tri[audio]" -map [video] -map [audio] -movflags +faststart output.mp4
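The general rule is that any -vf chain can be folded into -filter_complex by appending it, comma-separated, to the chain that produces the final video label, which is exactly what happens with fade=t=in above. A stripped-down sketch of the pattern with two inputs and made-up offsets (not the real values from the question):
ffmpeg -i in0.mp4 -i in1.mp4 -filter_complex "[0:v][1:v]xfade=transition=fade:duration=0.5:offset=10,format=yuv420p,fade=t=in:st=0:d=1[video];[0:a][1:a]acrossfade=d=0.5[audio]" -map "[video]" -map "[audio]" out.mp4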

ffmpeg, apply fade only to the video input using filter_complex while concatenating

ffmpeg -i foo.mp4 -filter_complex "fade=d=0.5, reverse, fade=d=0.5, reverse" output.mp4
can be used to fade foo.mp4's video in and out (we do not care about audio), according to https://video.stackexchange.com/questions/19867/how-to-fade-in-out-a-video-audio-clip-with-unknown-duration
That works, but only in the simple situation of one input video and one output. How can I apply the fade in and out effect in the following more complex situation? I'm trying to concat a.jpg (a picture) with bar.mp4, and I want only the bar.mp4 portion to fade in and out.
ffmpeg -loop 1 -t 2 -framerate 1 -i a.jpg -f lavfi -t 2 -i anullsrc -r 24 -i bar.mp4 -filter_complex "[0][1][2:v][2:a] concat=n=2:v=1:a=1 [vpre][a];[vpre]fps=24[v]" -map "[v]" -map "[a]" out.mp4 -y
Of course, I could first create a temporary temp.mp4 from bar.mp4 by running the first command, then use temp.mp4 as the input to the second command, but that involves an extra step and extra encoding. Could anyone help fix the commands or suggest something even better?
Use
ffmpeg -loop 1 -t 2 -framerate 24 -i a.jpg -f lavfi -t 2 -i anullsrc -r 24 -i bar.mp4 -filter_complex "[2:v]fade=d=0.5, reverse, fade=d=0.5, reverse[v2];[0][1][v2][2:a] concat=n=2:v=1:a=1 [v][a]" -map "[v]" -map "[a]" out.mp4 -y
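If bar.mp4's duration is known, the reverse trick isn't needed: give fade=t=out an explicit start time instead. A sketch assuming the clip is 10 seconds long, with the duration looked up via ffprobe first:
ffprobe -v error -show_entries format=duration -of default=noprint_wrappers=1:nokey=1 bar.mp4
ffmpeg -loop 1 -t 2 -framerate 24 -i a.jpg -f lavfi -t 2 -i anullsrc -r 24 -i bar.mp4 -filter_complex "[2:v]fade=t=in:d=0.5,fade=t=out:st=9.5:d=0.5[v2];[0][1][v2][2:a] concat=n=2:v=1:a=1 [v][a]" -map "[v]" -map "[a]" out.mp4 -y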

ffmpeg read from a file and apply filter_complex at once

I am feeding fls.txt into ffmpeg -i and applying concat and a speedup.
fls.txt
file 'input1.mp4'
file 'input2.mp4'
file 'input3.mp4'
The command in one go looks as follows:
ffmpeg -i fls.txt \
-filter_complex "[0:v][0:a][1:v][1:a][2:v][2:a] concat=n=3:v=1:a=1 [v][a];\
[v]setpts=0.5*PTS[v1];[a]atempo=2,asetpts=N/SR/TB[a1]" \
-c:v h264_nvenc -map "[v1]" -map "[a1]" x2.mp4
The output is really weird and says something like a stream is not found. It also looks as if ffmpeg is trying to interpret fls.txt itself rather than the files listed inside it.
What am I doing wrong here and how can I correct it?
Also, this is a simplified example; in reality I can't write the input file paths by hand. I need them to be read from a file. I'm on Windows 10, if that matters.
EDIT:
After applying the suggested edits and expanding the -filter_complex, I get the error below.
ffmpeg -f concat -safe 0 -i fls.txt \
-filter_complex "[0:v]setpts=0.5*PTS[v1];[v1]setpts=0.5*PTS[v2];[0:a]atempo=2,asetpts=N/SR/TB[a1];[a1]atempo=2,asetpts=N/SR/TB[a2]" \
-c:v h264_nvenc -map "[v1]" -map "[a1]" x2.mp4 \
-c:v h264_nvenc -map "[v2]" -map "[a2]" x4.mp4
error:
Output with label 'v1' does not exist in any defined filter graph, or was already used elsewhere.
Stream specifier ':a' in filtergraph description … matches no streams.
To enable the concat demuxer you have to use -f concat before -i fls.txt.
ffmpeg -f concat -i fls.txt \
-filter_complex "[0:v]setpts=0.5*PTS[v1];[0:a]atempo=2,asetpts=N/SR/TB[a1]" \
-c:v h264_nvenc -map "[v1]" -map "[a1]" x2.mp4
Because you're attempting to use the concat demuxer there is no need for the concat filter as well, so you can simplify the command.
You may also have to use -safe 0 before -i which you can read about in the documentation.
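Since you can't write the list by hand, generate fls.txt instead. On Windows 10 a cmd one-liner does it (the *.mp4 pattern is just an example; double the % signs if you put this in a .bat file):
(for %i in (*.mp4) do @echo file '%i') > fls.txt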
Follow-up question: Output with label 'v1' does not exist in any defined filter graph, or was already used elsewhere
You can't reuse a filter output label once it has been consumed, so this example avoids that:
ffmpeg -f concat -safe 0 -i fls.txt \
-filter_complex "[0:v]setpts=0.5*PTS[2xv];[0:v]setpts=PTS/4[4xv];[0:a]atempo=2,asetpts=N/SR/TB[2xa];[0:a]atempo=4,asetpts=N/SR/TB[4xa]" \
-c:v h264_nvenc -map "[2xv]" -map "[2xa]" x2.mp4 \
-c:v h264_nvenc -map "[4xv]" -map "[4xa]" x4.mp4

combine two -filter_complex commands together

Below, I have two ffmpeg commands (1, 2) to be combined into (3).
add sounds from 1.mp3 and 1.3gp into muted 1.mp4
code works without error:
ffmpeg -i 1.mp3 -i 1.3gp -i 1.mp4 \
-filter_complex "[1]adelay=640|640[s1];[0][s1]amix=2[mixout];" \
-map 2:v -map [mixout] -c:v copy result.mp4
add watermark to top-right of 1.mp4
code works without error:
ffmpeg -i 1.mp4 -i logo.png \
-filter_complex "overlay=x=main_w-overlay_w:y=1" \
result.mp4
combine above two commands into one
My code fails
ffmpeg -i 1.mp3 -i 1.3gp -i 1.mp4 -i logo.png \
-filter_complex "[1]adelay=640|640[s1];[0][s1]amix=2[mixout];[2:v][3]overlay=x=main_w-overlay_w:y=1[outv]" \
-map [outv] -map [mixout] -c:v copy result.mp4
What am I doing wrong here?
Use
ffmpeg -i 1.mp3 -i 1.3gp -i 1.mp4 -i logo.png \
-filter_complex "[1]adelay=640|640[s1];[0][s1]amix=2[mixout];
[2:v][3]overlay=x=main_w-overlay_w:y=1[outv]" \
-map [outv] -map [mixout] result.mp4
If you're filtering the video stream, e.g. adding an overlay, then you can't stream-copy (-c:v copy) that video stream.
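Because the overlay forces a video re-encode (and amix an audio one), you may want to pick the encoders explicitly rather than rely on the mp4 muxer defaults; libx264 and aac below are common choices, not something the question requires:
ffmpeg -i 1.mp3 -i 1.3gp -i 1.mp4 -i logo.png \
-filter_complex "[1]adelay=640|640[s1];[0][s1]amix=2[mixout];
[2:v][3]overlay=x=main_w-overlay_w:y=1[outv]" \
-map [outv] -map [mixout] -c:v libx264 -c:a aac result.mp4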

FFmpeg filter_complex merging two commands

I'm having trouble merging these two commands. If anyone can help me merge the first command's transposing and watermarking into the second one, I would really appreciate it. I've tried a few things, such as:
ffmpeg -i .\test1.flv -i .\test2.flv -loop 1 -i .\watermark.png -filter_complex "[0]transpose=1[a];[1]transpose=1[b];[a][b]hstack[c];[c][2]overlay=W-w-5:H-h-5:shortest=1; [0:v]setpts=PTS-STARTPTS, pad=iw*2:ih[bg]; [1:v]setpts=PTS-STARTPTS[fg]; amerge,pan=stereo:c0<c0+c2:c1<c1+c3" -c:v libx264 -f mp4 -threads 24 -y matt.mp4
ffmpeg -i .\test1.flv -i .\test2.flv -loop 1 -i .\watermark.png -filter_complex "[0:v]setpts=PTS-STARTPTS, pad=iw*2:ih[bg]; [1:v]setpts=PTS-STARTPTS[fg]; amerge,pan=stereo:c0<c0+c2:c1<c1+c3;[0]transpose=1[a];[1]transpose=1[b];[a][b]hstack[c];[c][2]overlay=W-w-5:H-h-5:shortest=1" -c:v libx264 -f mp4 -threads 24 -y matt.mp4
Thanks everyone!
If I understand your intent right, this is what you want:
ffmpeg -i .\test1.flv -i .\test2.flv -loop 1 -i .\watermark.png -filter_complex
"[0]setpts=PTS-STARTPTS,transpose=1[a];
[1]setpts=PTS-STARTPTS,transpose=1[b];
[a][b]hstack[c];
[c][2]overlay=W-w-5:H-h-5:shortest=1[v];
[0]asetpts=PTS-STARTPTS[x];[1]asetpts=PTS-STARTPTS[y];
[x][y]amerge,pan=stereo:c0<c0+c2:c1<c1+c3[a]"
-map "[v]" -map "[a]" -c:v libx264 -f mp4 -threads 24 -y matt.mp4
