Set HLS attributes separately for each output - ffmpeg

I used this command to create multiple outputs with different qualities in HLS format, and I mapped the audio for each of them.
ffmpeg -i kata.mp4 -filter_complex "[0:v]split=4[s0][s1][s2][s3];
[s0]scale=hd720[v0];[s1]scale=hd480[v1];[s2]scale=nhd[v2];[s3]scale=cga[v3]"
-map [v0] -map [v1] -map [v2] -map [v3] -map 0:a -c:v libx264 -c:a aac -f tee
-hls_list_size 0 -g 48 "[select=\'v:0,a\':f=hls]out.m3u8| [select=\'v:1,a\':f=hls]out-480.m3u8| [select=\'v:2,a\':f=hls]out-360.m3u8| [select=\'v:3,a\':f=hls]out-200.m3u8"
In my command, -hls_list_size does not work.
I think I have to set this option separately for each output, but I don't know how to do that, or what the correct syntax is.
Can anyone help me?

It works when the command is written this way, with hls_list_size=0 placed inside each output's option string for the tee muxer:
ffmpeg -i kata.mp4 -filter_complex "[0:v]split=4[s0][s1][s2][s3];
[s0]scale=hd720[v0];[s1]scale=hd480[v1];[s2]scale=nhd[v2];[s3]scale=cga[v3]"
-map [v0] -map [v1] -map [v2] -map [v3] -map 0:a -c:v libx264 -c:a aac -f tee -g 48 -threads 0
"[select='v\:0,a':f=hls:hls_list_size=0]../video/720p/out.m3u8|
[select='v\:1,a':f=hls:hls_list_size=0]../video/480p/out.m3u8|
[select='v\:2,a':f=hls:hls_list_size=0]../video/360p/out.m3u8|
[select='v\:3,a':f=hls:hls_list_size=0]../video/200p/out.m3u8"
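Other per-output HLS options can be chained the same way inside each tee slave's option string. A minimal sketch with two renditions, assuming an illustrative 4-second segment length (hls_time=4):
ffmpeg -i kata.mp4 -filter_complex "[0:v]split=2[s0][s1];[s0]scale=hd720[v0];[s1]scale=hd480[v1]" -map [v0] -map [v1] -map 0:a -c:v libx264 -c:a aac -f tee -g 48 "[select='v\:0,a':f=hls:hls_list_size=0:hls_time=4]../video/720p/out.m3u8|[select='v\:1,a':f=hls:hls_list_size=0:hls_time=4]../video/480p/out.m3u8"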

Related

Combining 2 FFMPEG Commands

I'm trying to combine 2 ffmpeg commands, one which creates the video, and another which adds a simple fade to the beginning of the created video. Here's what I have:
ffmpeg -y -stream_loop -1 -i "video.mp4" -stream_loop -1 -i "music.mp3" -i "audio.mp3" -filter_complex "[1:a]volume=0.1[a1];[2:a]adelay=5000|5000,apad=pad_dur=10[a2];[a1][a2]amerge=inputs=2,afade=in:st=0:d=5[audio]" -map "0:v" -map "[audio]" -c:v libx264 -c:a aac -ac 2 -ar 22050 -preset veryfast -shortest "output.mp4"
ffmpeg -y -i "output.mp4" -filter_complex "[0:v]fade=in:0:d=5" -c:a copy -preset veryfast -movflags faststart -fflags genpts "done.mp4"
The two commands work perfectly fine; however, the second one takes about the same amount of time to process as the first, and I feel it should be relatively easy to do the fade-in during the first encode. For my skill set at least, I was wrong. Could someone with more experience please lend a helping hand?
Thanks.
Add a simple filterchain for the video.
ffmpeg -y -stream_loop -1 -i "video.mp4" -stream_loop -1 -i "music.mp3" -i "audio.mp3" -filter_complex "[1:a]volume=0.1[a1];[2:a]adelay=5000|5000,apad=pad_dur=10[a2];[a1][a2]amerge=inputs=2,afade=in:st=0:d=5[audio]" -vf "fade=in:0:d=5" -map "0:v" -map "[audio]" -c:v libx264 -c:a aac -ac 2 -ar 22050 -preset veryfast -shortest -movflags faststart "done.mp4"
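If you prefer to keep everything in the filter_complex instead of mixing it with -vf, a sketch of an equivalent graph (only the [0:v]fade chain and -map "[vid]" differ; the label name vid is arbitrary):
ffmpeg -y -stream_loop -1 -i "video.mp4" -stream_loop -1 -i "music.mp3" -i "audio.mp3" -filter_complex "[0:v]fade=in:0:d=5[vid];[1:a]volume=0.1[a1];[2:a]adelay=5000|5000,apad=pad_dur=10[a2];[a1][a2]amerge=inputs=2,afade=in:st=0:d=5[audio]" -map "[vid]" -map "[audio]" -c:v libx264 -c:a aac -ac 2 -ar 22050 -preset veryfast -shortest -movflags faststart "done.mp4"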

FFMPEG map multiple audio input files to 1 single image file in order to create multiple video output files

I am attempting to have multiple output files in ffmpeg map to multiple inputs. However, instead of each input mapping to a unique output, I get the first input mapping to the first video, and the remaining videos never get created at all. I will describe exactly what I am trying to achieve below.
I need to:
create multiple video output files
from multiple audio input files
which all use the same one common image file to create the video
I will post my command below. Any help would be greatly appreciated, thanks.
-y -i input1.mp3 -i input2.mp3 -f image2 -loop 1 -r 2 -i imagefile.png -shortest -c:a aac -c:v mpeg4 -crf 18 -preset veryfast -movflags faststart -map 0 output1.mp4 -map1 output2.mp4
Basic command
ffmpeg -i input1.mp3 -i input2.mp3 -loop 1 -framerate 10 -i imagefile.png -map 2:v -map 0:a -vf format=yuv420p -shortest output1.mp4 -map 2:v -map 1:a -vf format=yuv420p -shortest output2.mp4
The video is filtered and encoded once per output.
With the split filter
ffmpeg -i input1.mp3 -i input2.mp3 -loop 1 -framerate 10 -i imagefile.png -filter_complex "[2:v]format=yuv420p,split=outputs=2[v0][v1]" -map "[v0]" -map 0:a -shortest -movflags +faststart output1.mp4 -map "[v1]" -map 1:a -shortest -movflags +faststart output2.mp4
The video is filtered once total, and encoded separately for each output.
Pipe
ffmpeg -y -v error -loop 1 -framerate 10 -i imagefile.png -filter_complex "[0:v]format=yuv420p[v]" -map "[v]" -c:v libx264 -f nut - | ffmpeg -y -i - -i input1.mp3 -i input2.mp3 -map 0:v -map 1:a -c:v copy -shortest -movflags +faststart output1.mp4 -map 0:v -map 2:a -c:v copy -shortest -movflags +faststart output2.mp4
The video is filtered and encoded only once and each output stream copies it.

ffmpeg parallel encoding to make multiple MP4 qualities

I want to make different qualities from a video in one command.
I used the code below, but there is an issue: the output files do not have details.
ffmpeg -i input.mp4 -filter_complex "[0:v]format=yuv420p,split=2[s0][s1];
[s0]scale=hd480[v0];
[s1]scale=nhd[v1]"
-map [v0] -map [v1] -map 0:a? -c:v libx264 -c:a aac -f tee -threads 0
"[select='v\:0,a':f=mp4]1/480.mp4|[select='v\:1,a':f=mp4]1/360.mp4"
What must I do?
With the guidance and help of @Mulvya, the answer is this (the only change is the added -flags +global_header, which the MP4 outputs need when written through the tee muxer):
ffmpeg -i input.mp4 -filter_complex "[0:v]format=yuv420p,split=2[s0][s1];
[s0]scale=hd480[v0];
[s1]scale=nhd[v1]"
-map [v0] -map [v1] -map 0:a? -c:v libx264 -c:a aac -f tee -flags +global_header -threads 0
"[select='v\:0,a':f=mp4]1/480.mp4|[select='v\:1,a':f=mp4]1/360.mp4"

Merge video and audio while delaying audio by x seconds

This works for merging audio and video
ffmpeg -i video.mp4 -i audio.ogg -filter_complex "[0:a][1:a]amerge=inputs=2[a]" -map 0:v -map "[a]" -c:v copy -c:a libvorbis -ac 2 -shortest out.mp4 -y -nostdin
I can't figure out how to delay the audio so it starts x seconds into the video. I have tried -itsoffset but it doesn't work.
Use
ffmpeg -i video.mp4 -i audio.ogg -filter_complex "[1:a]adelay=1000|1000[a1];[0:a][a1]amerge=inputs=2[a]" -map 0:v -map "[a]" -c:v copy -c:a libvorbis -ac 2 -shortest out.mp4 -y -nostdin
The adelay adds 1000 ms of silence to both channels of the OGG.
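For a delay of x seconds, the same pattern applies with x*1000 milliseconds per channel. For example, a 5-second offset (5000 is only an illustrative value):
ffmpeg -i video.mp4 -i audio.ogg -filter_complex "[1:a]adelay=5000|5000[a1];[0:a][a1]amerge=inputs=2[a]" -map 0:v -map "[a]" -c:v copy -c:a libvorbis -ac 2 -shortest out.mp4 -y -nostdin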
This is more of a workaround, but you could concatenate 1 second of silence with your ogg first:
https://trac.ffmpeg.org/wiki/Concatenate
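A minimal sketch of that workaround without an intermediate file, assuming the OGG is stereo at 44.1 kHz (anullsrc generates 1 second of silence and the concat filter prepends it before the merge):
ffmpeg -i video.mp4 -f lavfi -t 1 -i anullsrc=channel_layout=stereo:sample_rate=44100 -i audio.ogg -filter_complex "[1:a][2:a]concat=n=2:v=0:a=1[a1];[0:a][a1]amerge=inputs=2[a]" -map 0:v -map "[a]" -c:v copy -c:a libvorbis -ac 2 -shortest out.mp4 -y -nostdin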

FFMpeg - Combine multiple filter_complex and overlay functions

I am having trouble combining these 3 passes in ffmpeg into a single process.
Is this even possible?
Pass 1
ffmpeg -y -i C:\Users\MJ\Downloads\20151211_pmoney_pmpod.mp3 -loop 1 -i C:\Users\MJ\Documents\pm1080.png -filter_complex "[0:a]showwaves=s=1920x1080:mode=line,colorkey=0x000000:0.01:0.1,format=yuva420p[v];[1:v][v]overlay=0:270[outv]" -map "[outv]" -pix_fmt yuv420p -map 0:a -c:v libx264 -c:a copy -shortest C:\Users\MJ\Documents\20151211_pmoney_pmpod4.mp4
Pass 2
ffmpeg -i "C:\Users\MJ\Documents\20151211_pmoney_pmpod4.mp4" -vf drawtext="fontsize=50:fontcolor=white:fontfile=/Windows/Fonts/impact.ttf:text=Planet Money Podcast on NPR - A/B Split Testing:x=(w-text_w)/2:y=200" -acodec copy "C:\Users\MJ\Documents\20151211_pmoney_pmpod-overlay-text.mp4"
Pass 3
ffmpeg -i "C:\Users\MJ\Documents\20151211_pmoney_pmpod-overlay-text.mp4" -i C:\Users\MJ\Downloads\6.png -filter_complex "overlay=10:10" C:\Users\MJ\Documents\20151211_pmoney_pmpod-overlay-text1.mp4"
Thanks!
Join filters with a comma and filterchains with a semicolon:
ffmpeg -i audio.mp3 -i image1.png -i image2.png -filter_complex \
"[0:a]showwaves=s=1920x1080:mode=line[fg]; \
[1:v][fg]overlay=0:270,drawtext=fontsize=50:fontcolor=white:fontfile=/Windows/Fonts/impact.ttf:text='Planet Money Podcast on NPR - A/B Split Testing':x=(w-text_w)/2:y=200[bg]; \
[bg][2:v]overlay=10:10,format=yuv420p[outv]" \
-map "[outv]" -map 0:a -c:v libx264 -c:a copy -movflags +faststart -shortest out.mp4
