Trying to loop a playlist in folder with ffmpeg - bash

When trying to play all of the *.mp4 files in a folder, and then repeat from the first video again, I get this error:
start_betterme_playlist: line 35: 25659 Killed ffmpeg $SOURCE -filter_complex "$filter" -map "[v]" -map "[a]" -deinterlace -vcodec libx264 -pix_fmt yuv420p -preset $QUAL -r $FPS -g $(($FPS * 2)) -b:v $VBR -acodec libmp3lame -ar 44100 -threads 6 -qscale 3 -b:a 712000 -bufsize 512k -f flv "$YOUTUBE_URL/$KEY"
The code is below. Does anybody know how to fix this problem? I would really like to have these multiple videos played on the stream continuously. Thanks in advance!
#! /bin/bash
VBR="2500k"
FPS="30"
QUAL="medium"
YOUTUBE_URL="rtmp://live.twitch.tv/app/"
FOLDER="videos"
KEY="live_16xxxxxxxxxxx_xxxvO7C"
SOURCE=""
n=0
filter=""
for f in $FOLDER/*.mp4
do
    SOURCE="$SOURCE -i $f"
    filter="$filter [$n:v:0] [$n:a:0]"
    ((n++))
done
filter="$filter concat=n=$n:v=1:a=1 [v] [a]"
echo "ffmpeg $SOURCE -filter_complex '$filter'"
ffmpeg \
$SOURCE -filter_complex "$filter" \
-map "[v]" -map "[a]" -deinterlace \
-vcodec libx264 -pix_fmt yuv420p -preset $QUAL -r $FPS -g $(($FPS * 2)) -b:v $VBR \
-acodec libmp3lame -ar 44100 -threads 6 -qscale 3 -b:a 712000 -bufsize 512k \
-f flv "$YOUTUBE_URL/$KEY"

You can use ffmpeg's built-in concat demuxer as a playlist:
Create a playlist of all .mp4 files in your directory (you can insert a filler video between the videos).
Use the playlist as input for ffmpeg (edit the ffmpeg options to taste; I created a multicast stream that can easily be tested locally).
Here is a simple shell script:
#!/bin/bash
# create playlist from all .mp4 and play in loop
playlist="playlist.txt"
# create/reset playlist file
echo "#ffmpeg playlist" > ${playlist}
# this video goes in between two video files
use_filler="true" # activate filler with "true". Use "false" or any other string to deactivate filler.
filler="filler.mp4"
# create playlist for ffmpeg from all videos in the current directory
for f in *.mp4; do
    # exclude filler from playlist
    if [ "${f}" != "${filler}" ]; then
        echo "file '${f}'" >> ${playlist}
        # if activated add filler after every video file
        if [[ "${use_filler}" = "true" ]]; then # This condition is false for anything but the literal string "true".
            echo "file '${filler}'" >> ${playlist}
        fi
    fi
done
# stream playlist immediately with ffmpeg
ffmpeg -f concat -safe 0 -stream_loop -1 \
-i "${playlist}" \
-c copy \
-f mpegts \
"udp://239.253.253.1:1234?pkt_size=1384"
Playback with ffplay...
ffplay udp://239.253.253.1:1234
...or VLC:
udp://@239.253.253.1:1234
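If you want to push the looped playlist to the RTMP endpoint from the question instead of multicast, the same concat input can feed the original encoder settings. A rough sketch, assuming QUAL, FPS, VBR, YOUTUBE_URL and KEY are defined as in the question's script and playlist.txt was generated as above (the audio bitrate here is an arbitrary choice):
# sketch: loop playlist.txt forever and re-encode for the RTMP endpoint
ffmpeg -re -f concat -safe 0 -stream_loop -1 -i "${playlist}" \
    -vcodec libx264 -pix_fmt yuv420p -preset "$QUAL" -r "$FPS" -g $((FPS * 2)) -b:v "$VBR" \
    -acodec libmp3lame -ar 44100 -b:a 160k -bufsize 512k \
    -f flv "$YOUTUBE_URL/$KEY"
Because everything is re-encoded into one continuous stream, the input files should share the same resolution and frame rate (or be scaled to a common size first), as the concat demuxer expects matching streams.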

Related

Change ffmpeg input while streaming

Is there a way to change ffmpeg input while streaming to rtmp?
I have this bash script
#! /bin/bash
VBR="1500k"
FPS="24"
QUAL="superfast"
RTMP_URL="rtmp://live.live/live"
KEY="xxxxxxxxxxxxxxxxxxxxx"
VIDEO_SOURCE="video.mp4"
AUDIO_SOURCE="song.mp3"
NP_SOURCE="song.txt"
FONT="font.ttf"
ffmpeg \
-re -f lavfi -i "movie=filename=$VIDEO_SOURCE:loop=0, setpts=N/(FRAME_RATE*TB)" \
-thread_queue_size 512 -i "$AUDIO_SOURCE" \
-map 0:v:0 -map 1:a:0 \
-map_metadata:g 1:g \
-vf drawtext="fontsize=25: fontfile=$FONT: \
box=1: boxcolor=black#0.5: boxborderw=20: \
textfile=$NP_SOURCE: reload=1: fontcolor=white#0.8: x=50: y=50" \
-vcodec libx264 -pix_fmt yuv420p -preset $QUAL -r $FPS -g $(($FPS * 2)) -b:v $VBR \
-acodec libmp3lame -ar 44100 -threads 6 -qscale:v 3 -b:a 320000 -bufsize 512k \
-f flv "$RTMP_URL/$KEY"
What I want to do is to be able to change VIDEO_SOURCE on the fly. I was thinking it might be possible to make the input a directory and then change the video in that directory on the fly, but I'm new to dealing with scripts, so I don't know how to do that.
This is a complete guess, based on what little I know about how ffmpeg handles interactive input:
while :; do
ffmpeg \
-re -f lavfi -i "movie=filename=$VIDEO_SOURCE:loop=0, setpts=N/(FRAME_RATE*TB)" \
-thread_queue_size 512 -i "$AUDIO_SOURCE" \
-map 0:v:0 -map 1:a:0 \
-map_metadata:g 1:g \
-vf drawtext="fontsize=25: fontfile=$FONT: \
box=1: boxcolor=black#0.5: boxborderw=20: \
textfile=$NP_SOURCE: reload=1: fontcolor=white#0.8: x=50: y=50" \
-vcodec libx264 -pix_fmt yuv420p -preset $QUAL -r $FPS -g $(($FPS * 2)) -b:v $VBR \
-acodec libmp3lame -ar 44100 -threads 6 -qscale:v 3 -b:a 320000 -bufsize 512k \
-f flv "$RTMP_URL/$KEY"
read -p "Next movie?" VIDEO_SOURCE
[ "$VIDEO_SOURCE" = q ] && break
done
ffmpeg should(?) exit if you send q to standard input. Your script will then prompt you for a new value for VIDEO_SOURCE. If you type q again, the loop exits. Otherwise, it restarts ffmpeg with the new video source file.
If this works, you can perhaps adapt it for something closer to your needs.
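If typing the next filename defeats the purpose, a variation on the same guess is to loop over whatever is currently in a directory, so that dropping a new file in changes what plays next. A rough sketch under the same assumptions (VIDEO_DIR is a hypothetical directory name; AUDIO_SOURCE, QUAL, FPS, VBR, RTMP_URL and KEY are as in the original script; the lavfi movie source and the drawtext overlay are replaced by a plain input here to keep it short):
# hypothetical: play every .mp4 currently in $VIDEO_DIR, then rescan and repeat
VIDEO_DIR="videos"
while :; do
    for VIDEO_SOURCE in "$VIDEO_DIR"/*.mp4; do
        [ -e "$VIDEO_SOURCE" ] || continue   # skip if the glob matched nothing
        ffmpeg \
            -re -i "$VIDEO_SOURCE" \
            -thread_queue_size 512 -i "$AUDIO_SOURCE" \
            -map 0:v:0 -map 1:a:0 \
            -vcodec libx264 -pix_fmt yuv420p -preset $QUAL -r $FPS -g $(($FPS * 2)) -b:v $VBR \
            -acodec libmp3lame -ar 44100 -b:a 320000 -bufsize 512k \
            -f flv "$RTMP_URL/$KEY"
    done
done
Each ffmpeg run still ends when its file does, so there will be a brief gap between videos; the original movie=...:loop=0 trick only makes sense when a single file should repeat forever.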

How to fix this script - recursive ffmpeg encoding

I've downloaded some videos from my course and I think they are too big. I need to re-encode them to something better and smaller.
The problem is that the script I made saves the output files in the first folder. I want the output files in the same folder as the input files. For example:
Folder 1
- script.sh
Folder 2
- file1.mp4
- file1.new
Folder 3
- file2.mp4
- file2.new
I've tried using a plain for loop, and it actually worked, but it couldn't encode the files recursively. Using find solved that problem, but now the output files all end up in the same directory, which is the directory where the script is located (the working directory).
IFS=$'\n'; set -f
# $width and $fps are assumed to be set earlier for each file (not shown here)
for i in $(find . -name '*.mp4'); do
    if ($width > 600) && ($width < 800); then
        echo "$i is a 720p video. Let's encode it to VP9."
        notify-send Shrinker "Beginning encoding filename "$i""
        ffmpeg -i "$i" -vf mpdecimate,setpts=N/FRAME_RATE/TB -vf scale=1280x720 -b:v 1800k \
            -minrate 900k -maxrate 2610k -tile-columns 2 -g 240 -threads 8 \
            -quality good -crf 32 -c:v libvpx-vp9 -c:a libopus \
            -pass 1 -speed 4 "$(basename "${i/.mp4}")".webm && \
        ffmpeg -y -i "$i" -vf mpdecimate,setpts=N/FRAME_RATE/TB -vf scale=1280x720 -b:v 1800k \
            -minrate 900k -maxrate 2610k -tile-columns 2 -g 240 -threads 8 \
            -quality good -crf 32 -c:v libvpx-vp9 -c:a libopus \
            -pass 2 -speed 4 -y "$(basename "${i/.mp4}")".webm
    elif ($width > 800) && ($width < 1081) && ($fps < 31.000); then
        echo "$i is a 1080p video with 30fps or maybe less. Let's encode it to VP9."
        ffmpeg -i "$i" -vf mpdecimate,setpts=N/FRAME_RATE/TB -vf scale=1920x1080 -b:v 1800k \
            -minrate 900k -maxrate 2610k -tile-columns 2 -g 240 -threads 8 \
            -quality good -crf 31 -c:v libvpx-vp9 -c:a libopus \
            -pass 1 -speed 4 "$(basename "${i/.mp4}")".webm && \
        ffmpeg -y -i "$i" -vf mpdecimate,setpts=N/FRAME_RATE/TB -vf scale=1920x1080 -b:v 1800k \
            -minrate 900k -maxrate 2610k -tile-columns 4 -g 240 -threads 8 \
            -quality good -crf 31 -c:v libvpx-vp9 -c:a libopus \
            -pass 2 -speed 4 -y "$(basename "${i/.mp4}")".webm
        notify-send Shrinker "Beginning encoding filename "$i""
    elif ($width > 800) && ($width < 1081) && ($fps > 49.000); then
        echo "$i is a 1080p video with 50fps or maybe more. Let's encode it to VP9."
        ffmpeg -i "$i" -vf mpdecimate,setpts=N/FRAME_RATE/TB -vf scale=1920x1080 -b:v 3000k \
            -minrate 1500k -maxrate 4350k -tile-columns 2 -g 240 -threads 8 \
            -quality good -crf 31 -c:v libvpx-vp9 -c:a libopus \
            -pass 1 -speed 4 "$(basename "${i/.mp4}")".webm && \
        ffmpeg -y -i "$i" -vf mpdecimate,setpts=N/FRAME_RATE/TB -vf scale=1920x1080 -b:v 3000k \
            -minrate 1500k -maxrate 4350k -tile-columns 4 -g 240 -threads 8 \
            -quality good -crf 31 -c:v libvpx-vp9 -c:a libopus \
            -pass 2 -speed 4 -y "$(basename "${i/.mp4}")".webm
        notify-send Shrinker "Beginning encoding filename "$i""
    else
        echo "no file found"
    fi
done
My script finds every file and encodes it, BUT the output files are saved into Folder 1. They should be saved in Folder 2 and Folder 3.
I get this:
Folder 1
- script.sh
- file1.webm
- file2.webm
Folder 2
- file1.mp4
Folder 3
- file2.mp4
I want this:
Folder 1
- script.sh
Folder 2
- file1.mp4
- file1.webm
Folder 3
- file2.mp4
- file2.webm
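One way to get that layout is to derive the output path from the input path instead of stripping it with basename. A minimal sketch of the idea (a simplified single-pass encode stands in for the real two-pass commands; in the actual script only the output argument and, ideally, -passlogfile would change):
# hypothetical sketch: write each .webm next to its source .mp4
find . -type f -name '*.mp4' -print0 | while IFS= read -r -d '' i; do
    dir=$(dirname "$i")               # folder the input lives in
    name=$(basename "$i" .mp4)        # file name without extension
    out="${dir}/${name}.webm"         # output lands in the same folder as the input
    echo "Encoding $i -> $out"
    ffmpeg -y -i "$i" -vf scale=1280x720 -c:v libvpx-vp9 -crf 32 -b:v 1800k -c:a libopus "$out"
done
With the two-pass commands, adding -passlogfile "${dir}/${name}" also keeps the pass log files out of the working directory.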

Add image on ffmpeg stream

I'm coming to you today because I just want to add an overlay to my YouTube feed. I already have the code that plays the video, but I cannot add an image. Here is the code that I am currently using; I have seen on another post how to add an image, but I cannot get it to work:
function goto {
    VBR="2000k" # output video bitrate
    FPS="30" # output video FPS
    QUAL="fast" # FFmpeg quality preset
    YOUTUBE_URL="rtmp://x.rtmp.youtube.com/live2" # YouTube RTMP base URL
    result="$(ls Video | shuf -n 1)"
    SOURCE="Video/${result}" # UDP source (see the SAP announcements)
    KEY="ergtre498ter64t" # key from the YouTube event
    ffmpeg \
        -i "$SOURCE" -deinterlace -vf realtime -af arealtime \
        -vcodec libx264 -pix_fmt yuv420p -preset $QUAL -r $FPS -g $(($FPS * 2)) -b:v $VBR \
        -acodec libmp3lame -ar 44100 -threads 6 -qscale 3 -b:a 712000 -bufsize 512k \
        -framerate 2 -f flv "$YOUTUBE_URL/$KEY"
    goto
}
goto
And the code I found:
ffmpeg -i input.mp4 -i image.png \
-filter_complex "[0:v][1:v] overlay=25:25:enable='between(t,0,20)'" \
-pix_fmt yuv420p -c:a copy \
output.mp4
The FFmpeg command would be:
ffmpeg \
-i "$SOURCE" -i "$IMAGE" -filter_complex "[0]yadif[m];[m][1]overlay=25:25,realtime" -af arealtime \
-vcodec libx264 -pix_fmt yuv420p -preset $QUAL -r $FPS -g $(($FPS * 2)) -b:v $VBR \
-acodec libmp3lame -ar 44100 -threads 6 -qscale 3 -b:a 712000 -bufsize 512k \
-f flv "$YOUTUBE_URL/$KEY"
$IMAGE should be set to your image file URL.
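For example, inside the goto function from the question, the only additions would be defining the variable (overlay.png is a hypothetical file name) and swapping the ffmpeg call for the one above:
IMAGE="overlay.png" # hypothetical image to overlay, relative to the script's directory
If the overlay should only appear for part of the stream, the enable='between(t,0,20)' clause from the snippet found by the asker can be appended to the overlay filter.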

How to reset PTS for Stream

I am trying to reset the PTS on an input stream, generate new PTS, and publish the stream to RTMP.
ffmpeg -re -f lavfi -i "movie=${SOURCE}:s=0+1[out0][out1];[0:v]setpts=N/(FRAME_RATE*TB),[0:a]asetpts=N/(FRAME_RATE*TB)" \
-r 24 -crf 20 \
-c:v libx264 \
-c:a aac -ar 44100 -ab 128k -ac 2 -strict -2 \
-f flv ${DEST}
If I remove the setpts and asetpts filters, the command works. But I need to apply setpts and asetpts at the source, before it is handed to the encoder.
Please help.
Alter the PTS outside the source graph.
ffmpeg -re -f lavfi -i "movie=${SOURCE}:s=0+1" \
-vf setpts=N/FRAME_RATE/TB -af asetpts=N/SR/TB \
-r 24 -crf 20 \
-c:v libx264 \
-c:a aac -ar 44100 -ab 128k -ac 2 -strict -2 \
-f flv ${DEST}
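To sanity-check the regenerated timestamps, you can write a short test run to a local file instead of ${DEST} and dump the first packet timestamps with ffprobe (test.flv is a hypothetical local output name):
# inspect the first video packet timestamps of a local test output
ffprobe -v error -select_streams v:0 -show_entries packet=pts_time -of csv=p=0 test.flv | head
They should start near 0 and step by roughly 1/24 per frame, matching the -r 24 output rate.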

MP4 to DASH (bash script)

I have a web site on which users can upload video files. I want to stream all of them using DASH to get adaptive bitrate streaming. So I wrote a bash script (run by cron) that converts all mp4 files to DASH, but it doesn't work properly. What is wrong?
For example, using the following script, I obtained:
https://www.informatica-libera.net/dash_faq/stream.mpd
It validates, but it doesn't play. I tested it on:
http://dash-mse-test.appspot.com/dash-player.html?url=https%3A%2F%2Fwww.informatica-libera.net%2Fdash_faq%2Fstream.mpd&autoplay=on&adapt=auto&flavor=
Thank you for any help.
The code:
#!/bin/bash
# THIS SCRIPT CONVERTS EVERY MP4 (IN THE CURRENT FOLDER AND SUBFOLDER)
# TO A MULTI-BITRATE VIDEO IN MP4-DASH
# For each file "videoname.mp4" it creates a folder "dash_videoname"
# containing a dash manifest file "stream.mpd" and subfolders containing
# video segments.
# mp4dash documentation and download: https://www.bento4.com/developers/dash/
MYDIR=$(dirname $(readlink -f ${BASH_SOURCE[0]}))
SAVEDIR=$(pwd)
# Check programs
if [ -z "$(which ffmpeg)" ]; then
echo "Error: ffmpeg is not installed"
exit 1
fi
if [ -z "$(which mp4dash)" ]; then
echo "Error: mp4dash is not installed"
exit 1
fi
cd "$MYDIR"
TARGET_FILES=$(find ./ -type f -name "*.mp4")
for f in $TARGET_FILES
do
f=$(basename "$f") # fullname of the file
f="${f%.*}" # name without extension
if [ ! -d "dash_${f}" ]; then
echo "Converting \"$f\" to multi-bitrate video in MPEG-DASH"
ffmpeg -y -i "${f}.mp4" -c:a libfdk_aac -ac 2 -ab 128k -c:v libx264 -x264opts 'keyint=24:min-keyint=24:no-scenecut' -b:v 1500k -vf "scale=-2:720" -f mp4 -pass 1 -y /dev/null
ffmpeg -y -i "${f}.mp4" -c:a libfdk_aac -ac 2 -ab 128k -c:v libx264 -x264opts 'keyint=24:min-keyint=24:no-scenecut' -b:v 1500k -vf "scale=-2:720" -f mp4 -pass 2 "${f}_1500.mp4"
ffmpeg -y -i "${f}.mp4" -c:a libfdk_aac -ac 2 -ab 128k -c:v libx264 -x264opts 'keyint=24:min-keyint=24:no-scenecut' -b:v 800k -vf "scale=-2:540" -f mp4 -pass 1 -y /dev/null
ffmpeg -y -i "${f}.mp4" -c:a libfdk_aac -ac 2 -ab 128k -c:v libx264 -x264opts 'keyint=24:min-keyint=24:no-scenecut' -b:v 800k -vf "scale=-2:540" -f mp4 -pass 2 "${f}_800.mp4"
ffmpeg -y -i "${f}.mp4" -c:a libfdk_aac -ac 2 -ab 128k -c:v libx264 -x264opts 'keyint=24:min-keyint=24:no-scenecut' -b:v 400k -vf "scale=-2:360" -f mp4 -pass 1 -y /dev/null
ffmpeg -y -i "${f}.mp4" -c:a libfdk_aac -ac 2 -ab 128k -c:v libx264 -x264opts 'keyint=24:min-keyint=24:no-scenecut' -b:v 400k -vf "scale=-2:360" -f mp4 -pass 2 "${f}_400.mp4"
ffmpeg -y -i "${f}.mp4" -c:a libfdk_aac -ac 2 -ab 128k -c:v libx264 -x264opts 'keyint=24:min-keyint=24:no-scenecut' -b:v 200k -vf "scale=-2:180" -f mp4 -pass 1 -y /dev/null
ffmpeg -y -i "${f}.mp4" -c:a libfdk_aac -ac 2 -ab 128k -c:v libx264 -x264opts 'keyint=24:min-keyint=24:no-scenecut' -b:v 200k -vf "scale=-2:180" -f mp4 -pass 2 "${f}_200.mp4"
rm -f ffmpeg*log*
mp4fragment "${f}_1500.mp4" "${f}_1500_fragmented.mp4"
mp4fragment "${f}_800.mp4" "${f}_800_fragmented.mp4"
mp4fragment "${f}_400.mp4" "${f}_400_fragmented.mp4"
mp4fragment "${f}_200.mp4" "${f}_200_fragmented.mp4"
rm -f "${f}_1500.mp4" "${f}_800.mp4" "${f}_400.mp4" "${f}_200.mp4"
mp4dash -v -o "dash_${f}" "${f}_1500_fragmented.mp4" "${f}_800_fragmented.mp4" "${f}_400_fragmented.mp4" "${f}_200_fragmented.mp4"
rm -f "${f}_1500_fragmented.mp4" "${f}_800_fragmented.mp4" "${f}_400_fragmented.mp4" "${f}_200_fragmented.mp4"
fi
done
cd "$SAVEDIR"
I solved it like this (however, it's not fully compatible with every modern browser; for example, on my Linux version of Firefox the audio doesn't play):
#!/bin/bash
# THIS SCRIPT CONVERTS EVERY MP4 (IN THE CURRENT FOLDER AND SUBFOLDER) TO A MULTI-BITRATE VIDEO IN MP4-DASH
# For each file "videoname.mp4" it creates a folder "dash_videoname" containing a dash manifest file "stream.mpd" and subfolders containing video segments.
# Validation tool:
# http://dashif.org/conformance.html
# Documentation:
# https://tdngan.wordpress.com/2016/11/17/how-to-encode-multi-bitrate-videos-in-mpeg-dash-for-mse-based-media-players/
# Remember to add the following mime-types (uncommented) to .htaccess:
# AddType video/mp4 m4s
# AddType application/dash+xml mpd
# DASH-264 JavaScript Reference Client
# https://github.com/Dash-Industry-Forum/dash.js
# https://github.com/Dash-Industry-Forum/dash.js/wiki
MYDIR=$(dirname $(readlink -f ${BASH_SOURCE[0]}))
SAVEDIR=$(pwd)
# Check programs
if [ -z "$(which ffmpeg)" ]; then
echo "Error: ffmpeg is not installed"
exit 1
fi
if [ -z "$(which MP4Box)" ]; then
echo "Error: MP4Box is not installed"
exit 1
fi
cd "$MYDIR"
TARGET_FILES=$(find ./ -type f -name "*.mp4")
for f in $TARGET_FILES
do
f=$(basename "$f") # fullname of the file
f="${f%.*}" # name without extension
if [ ! -d "dash_${f}" ]; then
echo "Converting \"$f\" to multi-bitrate video in MPEG-DASH"
ffmpeg -y -i "${f}.mp4" -c:a libfdk_aac -ac 2 -ab 128k -vn "${f}_audio.m4a"
ffmpeg -y -i "${f}.mp4" -an -c:v libx264 -x264opts 'keyint=24:min-keyint=24:no-scenecut' -b:v 1500k -vf "scale=-2:720" -f mp4 -pass 1 -y /dev/null
ffmpeg -y -i "${f}.mp4" -an -c:v libx264 -x264opts 'keyint=24:min-keyint=24:no-scenecut' -b:v 1500k -vf "scale=-2:720" -f mp4 -pass 2 "${f}_1500.mp4"
ffmpeg -y -i "${f}.mp4" -an -c:v libx264 -x264opts 'keyint=24:min-keyint=24:no-scenecut' -b:v 800k -vf "scale=-2:540" -f mp4 -pass 1 -y /dev/null
ffmpeg -y -i "${f}.mp4" -an -c:v libx264 -x264opts 'keyint=24:min-keyint=24:no-scenecut' -b:v 800k -vf "scale=-2:540" -f mp4 -pass 2 "${f}_800.mp4"
ffmpeg -y -i "${f}.mp4" -an -c:v libx264 -x264opts 'keyint=24:min-keyint=24:no-scenecut' -b:v 400k -vf "scale=-2:360" -f mp4 -pass 1 -y /dev/null
ffmpeg -y -i "${f}.mp4" -an -c:v libx264 -x264opts 'keyint=24:min-keyint=24:no-scenecut' -b:v 400k -vf "scale=-2:360" -f mp4 -pass 2 "${f}_400.mp4"
ffmpeg -y -i "${f}.mp4" -an -c:v libx264 -x264opts 'keyint=24:min-keyint=24:no-scenecut' -b:v 200k -vf "scale=-2:180" -f mp4 -pass 1 -y /dev/null
ffmpeg -y -i "${f}.mp4" -an -c:v libx264 -x264opts 'keyint=24:min-keyint=24:no-scenecut' -b:v 200k -vf "scale=-2:180" -f mp4 -pass 2 "${f}_200.mp4"
rm -f ffmpeg*log*
MP4Box -dash 2000 -rap -frag-rap -profile onDemand "${f}_1500.mp4" "${f}_800.mp4" "${f}_400.mp4" "${f}_200.mp4" "${f}_audio.m4a" -out "${f}_MP4.mpd"
rm "${f}_1500.mp4" "${f}_800.mp4" "${f}_400.mp4" "${f}_200.mp4" "${f}_audio.m4a"
fi
done
cd "$SAVEDIR"
I also tried VP9 instead of H.264, but in this case as well there isn't compatibility with all browsers (in my Linux distro it plays correctly only in Firefox, while it doesn't play at all in Chrome):
#!/bin/bash
MYDIR=$(dirname $(readlink -f ${BASH_SOURCE[0]}))
SAVEDIR=$(pwd)
# Check that the required programs are installed
if [ -z "$(which ffmpeg)" ]; then
echo "Error: ffmpeg is not installed"
exit 1
fi
cd "$MYDIR"
TARGET_FILES=$(find ./ -type f -name "*.mp4")
for f in $TARGET_FILES
do
f=$(basename "$f") # full name of the file
f="${f%.*}" # name without extension
if [ ! -f "${f}.mpd" ]; then
echo "Converting \"$f\" to Adaptive WebM using DASH"
echo "Reference: http://wiki.webmproject.org/adaptive-streaming/instructions-to-playback-adaptive-webm-using-dash"
# http://wiki.webmproject.org/adaptive-streaming/instructions-to-playback-adaptive-webm-using-dash
VP9_DASH_PARAMS="-tile-columns 4 -frame-parallel 1"
ffmpeg -i "${f}.mp4" -c:v libvpx-vp9 -vf scale=-1:90 -b:v 250k -keyint_min 150 -g 150 ${VP9_DASH_PARAMS} -an -f webm -dash 1 -pass 1 -y /dev/null
ffmpeg -i "${f}.mp4" -c:v libvpx-vp9 -vf scale=-1:90 -b:v 250k -keyint_min 150 -g 150 ${VP9_DASH_PARAMS} -an -f webm -dash 1 -pass 2 "${f}_160px_250k.webm"
ffmpeg -i "${f}.mp4" -c:v libvpx-vp9 -vf scale=-1:180 -b:v 500k -keyint_min 150 -g 150 ${VP9_DASH_PARAMS} -an -f webm -dash 1 -pass 1 -y /dev/null
ffmpeg -i "${f}.mp4" -c:v libvpx-vp9 -vf scale=-1:180 -b:v 500k -keyint_min 150 -g 150 ${VP9_DASH_PARAMS} -an -f webm -dash 1 -pass 2 "${f}_320px_500k.webm"
ffmpeg -i "${f}.mp4" -c:v libvpx-vp9 -vf scale=-1:360 -b:v 750k -keyint_min 150 -g 150 ${VP9_DASH_PARAMS} -an -f webm -dash 1 -pass 1 -y /dev/null
ffmpeg -i "${f}.mp4" -c:v libvpx-vp9 -vf scale=-1:360 -b:v 750k -keyint_min 150 -g 150 ${VP9_DASH_PARAMS} -an -f webm -dash 1 -pass 2 "${f}_640px_750k.webm"
ffmpeg -i "${f}.mp4" -c:v libvpx-vp9 -vf scale=-1:360 -b:v 1000k -keyint_min 150 -g 150 ${VP9_DASH_PARAMS} -an -f webm -dash 1 -pass 1 -y /dev/null
ffmpeg -i "${f}.mp4" -c:v libvpx-vp9 -vf scale=-1:360 -b:v 1000k -keyint_min 150 -g 150 ${VP9_DASH_PARAMS} -an -f webm -dash 1 -pass 2 "${f}_640px_1000k.webm"
ffmpeg -i "${f}.mp4" -c:v libvpx-vp9 -vf scale=-1:720 -b:v 1500k -keyint_min 150 -g 150 ${VP9_DASH_PARAMS} -an -f webm -dash 1 -pass 1 -y /dev/null
ffmpeg -i "${f}.mp4" -c:v libvpx-vp9 -vf scale=-1:720 -b:v 1500k -keyint_min 150 -g 150 ${VP9_DASH_PARAMS} -an -f webm -dash 1 -pass 2 "${f}_1280px_1500k.webm"
ffmpeg -i "${f}.mp4" -c:a libvorbis -b:a 128k -vn -f webm -dash 1 "${f}_audio_128k.webm"
rm -f ffmpeg*.log
ffmpeg \
-f webm_dash_manifest -i "${f}_160px_250k.webm" \
-f webm_dash_manifest -i "${f}_320px_500k.webm" \
-f webm_dash_manifest -i "${f}_640px_750k.webm" \
-f webm_dash_manifest -i "${f}_640px_1000k.webm" \
-f webm_dash_manifest -i "${f}_1280px_1500k.webm" \
-f webm_dash_manifest -i "${f}_audio_128k.webm" \
-c copy -map 0 -map 1 -map 2 -map 3 -map 4 -map 5 \
-f webm_dash_manifest \
-adaptation_sets "id=0,streams=0,1,2,3,4 id=1,streams=5" \
"${f}.mpd"
fi
done
cd "$SAVEDIR"
I didn't find a way to serve the audio/video content to all browsers. Here is how I've done my tests:
<!DOCTYPE html>
<html>
<head>
    <script src="http://cdn.dashjs.org/latest/dash.all.min.js"></script>
    <style>
        video {
            width: 640px;
            height: 360px;
        }
    </style>
</head>
<body>
    <div>
        <video data-dashjs-player autoplay src="test.mpd" controls></video>
    </div>
</body>
</html>
