FFmpeg: video loop from image files

I am trying to make a video from a bundle of image files and then apply an overlay on top of it. Another requirement is to make the video loop 3x. It is simply not working.
The first three paths point to the same image bundle (a folder containing images such as DSC0001_0013.jpg, DSC0002_0013.jpg, etc.).
Observed symptoms:
- The script runs indefinitely.
- It produces a video file of 0 KB.
- I have to abort the script with CTRL+C.
This is my script:
ffmpeg
-start_number 1 -framerate 3/1
-i "C:\Users\xxx\AppData\Local\xxx\xxx\xxx\xxx\xxx\xxx\xxx\963d9d9b8e1\DSC%04d_0013.jpg"
-i "C:\Users\xxx\AppData\Local\xxx\xxx\projects\xxx\xxx\xxx\xxx\963d9d9b8e1\DSC%04d_0013.jpg"
-i "C:\Users\xxx\AppData\Local\xxx\xxx\projects\xxx\xxx\xxx\xxx\963d9d9b8e1\DSC%04d_0013.jpg"
-i "C:\Users\xxx\AppData\Local\xxx\xxx\projects\1237\1138\overlay.png"
-i "C:\Users\xxx\AppData\Local\xxx\xxx\projects\1237\1138\overlay.png"
-i "C:\Users\xxx\AppData\Local\xxx\xxx\projects\1237\1138\overlay.png"
-filter_complex " [0:v]scale=600x900[scaled1]; [1:v]scale=600x900[scaled2]; [2:v]scale=600x900[scaled3]; [scaled1][3:v]overlay[tmp1]; [scaled2][4:v]overlay[tmp2]; [scaled3][5:v]overlay[tmp3]; [tmp1][tmp2][tmp3]concat=n=3[scaled] "
-map [scaled] -r 10 -vcodec libx264 -pix_fmt yuv420p -crf 23 "C:\Users\xxx\Documents\Projets\2020\xxx\video test ffmpeg\test.mp4"

Use the -stream_loop option:
ffmpeg -stream_loop 3 -framerate 3/1 -i DSC%04d_0013.jpg -i overlay.png -filter_complex "[0]scale=600:900[bg];[bg][1]overlay=format=auto,format=yuv420p[v]" -map "[v]" -r 10 -c:v libx264 -crf 23 output.mp4
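A note on the loop count (my reading of the FFmpeg documentation, not part of the original answer): -stream_loop sets the number of additional repetitions, so -stream_loop 3 plays the sequence four times. If the goal is exactly three passes, -stream_loop 2 should be closer; a minimal sketch using the same relative filenames as above:
ffmpeg -stream_loop 2 -framerate 3/1 -i DSC%04d_0013.jpg -i overlay.png -filter_complex "[0]scale=600:900[bg];[bg][1]overlay=format=auto,format=yuv420p[v]" -map "[v]" -r 10 -c:v libx264 -crf 23 output.mp4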

@Ilogan, this is our solution:
ffmpeg
-start_number 1 -framerate 3/1
-i DSC%04d_0013.jpg
-loop 1 -i overlay.png
-filter_complex "
[0:v]scale=600x900[scaled];
[scaled][1:v]overlay,trim=duration=3,loop=loop=2:size=9[tmp]
" -map [tmp] -r 10 -vcodec libx264 -pix_fmt yuv420p -crf 23
test.mp4
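For reference, the arithmetic behind that filter chain: at -framerate 3/1 the input runs at 3 fps, trim=duration=3 keeps the first 3 seconds (9 frames), and loop=loop=2:size=9 appends two more copies of those 9 frames, so the overlaid segment plays three times. The same command joined onto a single line (relative filenames as above, not the full Windows paths from the question):
ffmpeg -start_number 1 -framerate 3/1 -i DSC%04d_0013.jpg -loop 1 -i overlay.png -filter_complex "[0:v]scale=600x900[scaled];[scaled][1:v]overlay,trim=duration=3,loop=loop=2:size=9[tmp]" -map "[tmp]" -r 10 -vcodec libx264 -pix_fmt yuv420p -crf 23 test.mp4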

Related

Multiple sounds + watermark overlay not working with ffmpeg

I have a problem with an ffmpeg command.
I want to add the same sound several times in the final video and then add a watermark above.
When I do the full command, it doesn't work correctly because the sound is only played once (the first reference):
ffmpeg -i "assets/frame%05d.png" -i "assets/sound.mp3" -loop 1 -i
"assets/watermark.png" -filter_complex
"[1:a]adelay=1000|1000[s1];[1:a]adelay=3000|3000[s2];[s1][s2]amix=2[a];[0:v][2:v]overlay=shortest=1[outv]"
-map "[outv]" -map "[a]" -c:v libx264 -pix_fmt yuv420p -preset ultrafast -y "result.mp4"
When I don't add the watermark, it works correctly:
ffmpeg -i "assets/frame%05d.png" -i "assets/sound.mp3" -filter_complex
"[1:a]adelay=1000|1000[s1];[1:a]adelay=3000|3000[s2];[s1][s2]amix=2[a]"
-map 0:v -map "[a]" -c:v libx264 -pix_fmt yuv420p -preset ultrafast -y "result.mp4"
Try this:
ffmpeg -i "assets/frame%05d.png" \
-stream_loop -1 -i "assets/sound.mp3" \
-loop 1 -i "assets/watermark.png" \
-filter_complex "[1:a]adelay=1000|1000[s1];\
[1:a]adelay=3000|3000[s2];\
[s1][s2]amix=2[a];\
[0:v][2:v]overlay=shortest=1[outv]" \
-map "[outv]" -map "[a]" -shortest \
-c:v libx264 -pix_fmt yuv420p -preset ultrafast -y "result.mp4"
The -stream_loop -1 input option loops the audio input indefinitely, and the -shortest output option stops the output when the video is done.
P.S. I think an aecho filter can combine the second adelay and the amix filter.
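A rough sketch of that P.S. (my interpretation, untested): keep one adelay of 1000 ms and let aecho add a second full-level copy 2000 ms later, so the sound plays at about 1 s and 3 s without amix:
ffmpeg -i "assets/frame%05d.png" -stream_loop -1 -i "assets/sound.mp3" -loop 1 -i "assets/watermark.png" -filter_complex "[1:a]adelay=1000|1000,aecho=1.0:1.0:2000:1.0[a];[0:v][2:v]overlay=shortest=1[outv]" -map "[outv]" -map "[a]" -shortest -c:v libx264 -pix_fmt yuv420p -preset ultrafast -y "result.mp4"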
I found that the problem was that the frames making up the video were not all correctly encoded, which caused the problem with the sounds.
ffmpeg can't read indexed PNGs correctly.
You must therefore set this option (for those who, like me, use Imagick):
// color-type 6 = truecolor with alpha (non-indexed), which ffmpeg decodes correctly
$imagick->setOption('png:color-type', 6);
$imagick->writeImage('frame00000.png');
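For anyone using the ImageMagick command-line tools instead of the PHP extension, roughly the same fix (my suggestion, not from the original post) is to rewrite the frames in place with the same define before running ffmpeg:
mogrify -define png:color-type=6 assets/frame*.png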

ffmpeg images list (text file) to video with overlay watermark

I have 250 images per day, each 4000x3000 pixels, listed in a text file:
file '/home/user/camdata/nonseqdata.jpg'
file '/home/user/camdata/strangedata.jpg'
I created an MP4 video with this command:
ffmpeg -y -f concat -safe 0 -i ecam.001_20210525.txt -c:v libx264 -vf "scale=1280:720,fps=25,format=yuv420p" out.mp4
Now I need to add a watermark to the video (in the same command).
The closest example I found on the web, which I'm trying to modify for my case, is:
ffmpeg -r 25 -f image2 -s 1280x720 -i ecam.001_20210525.txt -i wm.png -filter_complex "[0:v][1:v] overlay=0:0" -vcodec libx264 -crf 25 -pix_fmt yuv420p test_overlay.mp4
OR
ffmpeg -r 25 -f concat -safe 0 -s 1280x720 -i ecam.001_20210525.txt -i wm.png -filter_complex "[0:v]pad=width=mainw:height=mainh:x=0:y=0,[1:v] overlay=0:0" -c:v libx264 test_overlay.mp4
But it errors out with: Decoder (codec none) not found for input stream #0:0
Q: How exactly do I fix this? I need the output to be 720p or 1080p.
Use
ffmpeg -y -f concat -safe 0 -i ecam.001_20210525.txt -i wm.png -filter_complex "[0]scale=1280:720[v];[v][1]overlay=x=0:y=0,fps=25,format=yuv420p" -c:v libx264 out.mp4
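If 1080p output is wanted instead of 720p, only the scale size should need to change (note that the 4000x3000 source is 4:3, so scaling straight to 1280:720 or 1920:1080 changes the aspect ratio unless you crop or pad):
ffmpeg -y -f concat -safe 0 -i ecam.001_20210525.txt -i wm.png -filter_complex "[0]scale=1920:1080[v];[v][1]overlay=x=0:y=0,fps=25,format=yuv420p" -c:v libx264 out.mp4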

Can I combine these 2 commands? (or am I fighting a losing battle?)

I'm very new to ffmpeg but so far I'm enjoying it. But I'm stuck on something. I want to combine these two commands into one, something I'm sure must be possible, but after countless hours and no luck, here I am :)
ffmpeg -y -f concat -safe 0 -protocol_whitelist "file,http,https,tcp,tls" -i "tmp.images.txt" -i "tmp.audio.mp3" -filter_complex "drawbox=y=ih-38:color=black#0.6:width=iw:height=38:t=fill, drawtext=fontfile=Assets/calibrib.ttf:text='%%~ni':fontcolor=white:fontsize=14:x=(w-tw)/2:y=(h)-24" -c:v libx264 -preset veryfast -tune stillimage -shortest -pix_fmt yuv420p "tmp.slide.mp4"
ffmpeg -loop 1 -framerate 2 -i "Assets/studio.jpg" -i tmp.slide.mp4 -filter_complex "[1]scale=879:496[inner];[0][inner]overlay=207:49:shortest=1[out]" -map "[out]" -map 1:a -c:a aac -y tmp.output.mp4
The first command creates a slideshow and places text at the bottom.
The second command takes the slideshow video and inserts it into a background image before outputting the final video.
Use
ffmpeg -y -f concat -safe 0 -protocol_whitelist "file,http,https,tcp,tls" -i "tmp.images.txt" -i "tmp.audio.mp3" -i "Assets/studio.jpg" -filter_complex "[0]drawbox=y=ih-38:color=black#0.6:width=iw:height=38:t=fill, drawtext=fontfile=Assets/calibrib.ttf:text='%%~ni':fontcolor=white:fontsize=14:x=(w-tw)/2:y=(h)-24,scale=879:496[inner];[2][inner]overlay=207:49" -c:v libx264 -preset veryfast -tune stillimage -c:a aac -shortest -pix_fmt yuv420p "tmp.slide.mp4"

FFMPEG : Cut Video AND Include Ending Video/Image

I am using this command line to add a five second image on end of video:
ffmpeg -i "f:\output\input.mov" -loop 1 -t 5 -i "f:\output\taff.jpg" -f lavfi -t 5 -i anullsrc -filter_complex "[0:v] [0:a] [1:v] [2:a] concat=n=2:v=1:a=1 [v] [a]" -c:v libx264 -c:a aac -strict -2 -map "[v]" -map "[a]" f:\output\output.mp4
It works great, but sometimes I want to cut the video and then add the five seconds. So, make a 120 second video 110 seconds, then add the 5 second ending.
Possibly in one command line? I've tried to break it into two, by starting with cutting the video, but then I get an "Unable to parse option value "-1" pixel format" error if I try to re-encode the video I cut with ffmpeg using this:
ffmpeg -i f:\output\input.mov -vcodec copy -acodec copy -ss 00:00:00.000 -t 00:01:50.000 f:\output\output.mov
That output video will then give an error if I try to run the first command line against it.
All feedback appreciated on shortening a video, and then adding ending.
Cheers!
Ryan
Use -t as an input option to limit how much of the first input is read (110 seconds here), which replaces the separate cutting step:
ffmpeg -t 110 -i "f:\output\input.mov"
-loop 1 -t 5 -i "f:\output\taff.jpg"
-f lavfi -t 5 -i anullsrc
-filter_complex "[0:v][0:a][1:v][2:a]concat=n=2:v=1:a=1[v][a]"
-c:v libx264 -c:a aac -strict -2 -map "[v]" -map "[a]" f:\output\output.mp4
With scale2ref, which scales the image to match the video's dimensions so the two concat segments agree in size, it should be
ffmpeg -t 110 -i "f:\output\input.mov"
-loop 1 -t 5 -i "f:\output\taff.jpg"
-f lavfi -t 5 -i anullsrc
-filter_complex "[1][0]scale2ref[2nd][ref];[ref][0:a][2nd][2:a]concat=n=2:v=1:a=1[v][a]"
-c:v libx264 -c:a aac -strict -2 -map "[v]" -map "[a]" f:\output\output.mp4
If the image has a different aspect ratio, use
ffmpeg -t 110 -i "f:\output\input.mov"
-loop 1 -t 5 -i "f:\output\taff.jpg"
-f lavfi -t 5 -i anullsrc
-filter_complex "[0]split[base][full];[base]trim=0:5,drawbox=t=fill[base];[1][base]scale2ref='if(lt(mdar,dar),oh*mdar/sar,iw)':'if(lt(mdar,dar),ih,ow*sar/mdar)'[2nd][base];[base][2nd]overlay='(W-w)/2':'(H-h)/2'[padded];[full][0:a][padded][2:a]concat=n=2:v=1:a=1[v][a]"
-c:v libx264 -c:a aac -strict -2 -map "[v]" -map "[a]" f:\output\output.mp4
This last command requires ffmpeg version >= 3.4
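As a quick sanity check (my addition, not part of the original answer), the trimmed video plus the 5-second ending should come out to roughly 115 seconds, which ffprobe can confirm:
ffprobe -v error -show_entries format=duration -of default=noprint_wrappers=1:nokey=1 f:\output\output.mp4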

FFmpeg filter_complex merging two commands

I'm having trouble merging these two commands. If anyone can help me merge the transposing and watermarking from the first one into the second, I would really appreciate it. I've tried a few things, such as:
ffmpeg -i .\test1.flv -i .\test2.flv -loop 1 -i .\watermark.png -filter_complex "[0]transpose=1[a];[1]transpose=1[b];[a][b]hstack[c];[c][2]overlay=W-w-5:H-h-5:shortest=1; [0:v]setpts=PTS-STARTPTS, pad=iw*2:ih[bg]; [1:v]setpts=PTS-STARTPTS[fg]; amerge,pan=stereo:c0<c0+c2:c1<c1+c3" -c:v libx264 -f mp4 -threads 24 -y matt.mp4
ffmpeg -i .\test1.flv -i .\test2.flv -loop 1 -i .\watermark.png -filter_complex "[0:v]setpts=PTS-STARTPTS, pad=iw*2:ih[bg]; [1:v]setpts=PTS-STARTPTS[fg]; amerge,pan=stereo:c0<c0+c2:c1<c1+c3;[0]transpose=1[a];[1]transpose=1[b];[a][b]hstack[c];[c][2]overlay=W-w-5:H-h-5:shortest=1" -c:v libx264 -f mp4 -threads 24 -y matt.mp4
Thanks everyone!
If I understand your intent right, this is what you want:
ffmpeg -i .\test1.flv -i .\test2.flv -loop 1 -i .\watermark.png -filter_complex
"[0]setpts=PTS-STARTPTS,transpose=1[a];
[1]setpts=PTS-STARTPTS,transpose=1[b];
[a][b]hstack[c];
[c][2]overlay=W-w-5:H-h-5:shortest=1[v];
[0]asetpts=PTS-STARTPTS[x];[1]asetpts=PTS-STARTPTS[y];
[x][y]amerge,pan=stereo:c0<c0+c2:c1<c1+c3[a]"
-map "[v]" -map "[a]" -c:v libx264 -f mp4 -threads 24 -y matt.mp4
