Create and update HLS playlist programmatically - macOS

I have a C++ application that records audio from my default input device, encodes it in AAC format, and writes it to a .aac file. I want to use HTTP Live Streaming to live-stream this AAC file. According to this question, I have to create an FFmpeg script to split my audio file into several .ts files.
# bitrate, width, and height, you may want to change this
BR=512k
WIDTH=432
HEIGHT=240
input=${1}
# strip off the file extension
output=$(echo ${input} | sed 's/\..*//' )
# works for most videos
ffmpeg -y -i ${input} -f mpegts -acodec libmp3lame -ar 48000 -ab 64k -s ${WIDTH}x${HEIGHT} -vcodec libx264 -b ${BR} -flags +loop -cmp +chroma -partitions +parti4x4+partp8x8+partb8x8 -subq 7 -trellis 0 -refs 0 -coder 0 -me_range 16 -keyint_min 25 -sc_threshold 40 -i_qfactor 0.71 -bt 200k -maxrate ${BR} -bufsize ${BR} -rc_eq 'blurCplx^(1-qComp)' -qcomp 0.6 -qmin 30 -qmax 51 -qdiff 4 -level 30 -aspect ${WIDTH}:${HEIGHT} -g 30 -async 2 ${output}-iphone.ts
(It is slightly different in my case because I only work with audio)
Can I do so programmatically in C++, and if so, do I have to use a third-party library or does macOS provide native functions for this?

No, you don’t have to split it (you can put byte offsets into the m3u8), and no, you don’t need to use a .ts container; HLS supports .aac files. And yes, you can programmatically make an m3u8 without a library. C++ or literally any other programming language is capable of that.
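As a rough illustration (a minimal sketch, not tested against real players): an m3u8 media playlist is just text, so writing one from C++ needs nothing beyond standard file I/O. The names (Segment, writePlaylist, audio.aac, audio.m3u8) and the segment durations, offsets and lengths below are made-up placeholders; a real recorder would fill them in as it encodes.

#include <cstdint>
#include <fstream>
#include <string>
#include <vector>

// One playlist entry: a byte range of audio.aac holding `duration` seconds of audio.
struct Segment {
    double   duration;  // seconds of encoded audio in this segment
    uint64_t length;    // segment length in bytes
    uint64_t offset;    // byte offset of the segment inside audio.aac
};

// Rewrites the whole playlist; call it again whenever a new segment has been encoded.
void writePlaylist(const std::string& path, const std::vector<Segment>& segments, bool ended)
{
    std::ofstream m3u8(path, std::ios::trunc);
    m3u8 << "#EXTM3U\n";
    m3u8 << "#EXT-X-VERSION:4\n";          // EXT-X-BYTERANGE requires version 4 or later
    m3u8 << "#EXT-X-TARGETDURATION:10\n";  // must be >= the longest segment duration
    m3u8 << "#EXT-X-MEDIA-SEQUENCE:0\n";
    for (const Segment& s : segments) {
        m3u8 << "#EXTINF:" << s.duration << ",\n";
        m3u8 << "#EXT-X-BYTERANGE:" << s.length << "@" << s.offset << "\n";
        m3u8 << "audio.aac\n";
    }
    if (ended)
        m3u8 << "#EXT-X-ENDLIST\n";        // omit while the live stream is still running
}

int main()
{
    // Two made-up 10-second segments of the growing AAC file.
    std::vector<Segment> segments = {
        {10.0, 81920, 0},
        {10.0, 81920, 81920},
    };
    writePlaylist("audio.m3u8", segments, /*ended=*/false);
}

For live playback you keep appending entries and rewriting the file, and only emit #EXT-X-ENDLIST once the recording has finished; HLS clients poll the playlist for updates on their own.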

Related

Streaming a webpage to YouTube using FFmpeg

I'm trying to stream a webpage to YouTube using phantomjs and ffmpeg.
Long story short:
This works. The video is saved to test.flv:
phantomjs runner.js|ffmpeg -y -f image2pipe -r 10 -s 1280x720 -i - -deinterlace -vcodec libx264 -pix_fmt yuv420p -preset ultrafast -r 10 -g 20 -vb 400k -maxrate 400k -minrate 400k -bufsize 800k -threads 6 -q:v 0 -t 10 -f flv test.flv
This doesn't. Despite no errors, nothing is streamed to YouTube.
phantomjs runner.js|ffmpeg -f image2pipe -r 10 -s 1280x720 -i - -deinterlace -vcodec libx264 -pix_fmt yuv420p -preset ultrafast -r 10 -g 2 -vb 400k -maxrate 400k -minrate 400k -bufsize 800k -threads 6 -q:v 0 -f flv rtmp://a.rtmp.youtube.com/live2/xxxxxxx
Remarks:
I'm aware phantomjs isn't actively developed anymore, but this doesn't seem relevant since the phantomjs script works as intended;
phantomjs script: runner.js;
I've tried different ffmpeg settings, like frame-rates, bit-rates and bufsize to no avail.
Both commands are similar, but the 1st outputs to the local file test.flv while the 2nd streams to YouTube;
I've used the YouTube streaming key on OBS Studio and it works normally;
ffmpeg output while streaming to YouTube:
frame= 13 fps=0.0 q=42.0 size= 94kB time=00:00:00.50 bitrate=1531.0kbits/s
frame= 18 fps= 16 q=40.0 size= 130kB time=00:00:01.00 bitrate=1063.6kbits/s
frame= 23 fps= 14 q=44.0 size= 149kB time=00:00:01.50 bitrate= 810.8kbits/s
ffmpeg version 4.2.1-static https://johnvansickle.com/ffmpeg/ Copyright (c) 2000-2019 the FFmpeg developers running on Ubuntu-1804
Any idea of what can be wrong?
You need to add an audio stream. It can be from a file, or you can generate a silent/dummy audio stream using the anullsrc filter:
phantomjs runner.js | ffmpeg -f image2pipe -framerate 10 -video_size 1280x720 -re -i - -f lavfi -i anullsrc -c:v libx264 -preset ultrafast -g 20 -b:v 400k -maxrate 400k -bufsize 800k -vf format=yuv420p -c:a aac -f flv rtmp://a.rtmp.youtube.com/live2/xxxx-xxxx-xxxx-xxxx
Unrelated changes:
Use the image2pipe input options instead of the generic ones. See ffmpeg -h demuxer=image2pipe
Removed -deinterlace. I doubt the input is interlaced. If it is, use a filter instead (-deinterlace uses the yadif filter but is less customizable than using yadif directly).
You don't need to add the -r output option if the input -framerate option is the same value, so it has been removed from your command.
Let the encoder automatically choose the optimal number of threads by omitting the -threads option.
-q:v 0 is ignored by libx264. Remove it.
-g can be set to frame rate x 2.
You can use ffmpeg to capture the screen if you don't want to use additional JavaScript.
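For example, an untested sketch using the x11grab device (assumes a Linux desktop with an X11 display at :0.0, matching the Ubuntu setup from the question; the remaining options are taken from the command above):
ffmpeg -f x11grab -framerate 10 -video_size 1280x720 -i :0.0 -f lavfi -i anullsrc -c:v libx264 -preset ultrafast -g 20 -b:v 400k -maxrate 400k -bufsize 800k -vf format=yuv420p -c:a aac -f flv rtmp://a.rtmp.youtube.com/live2/xxxx-xxxx-xxxx-xxxx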

Failed to read the 'buffered' property from 'SourceBuffer': This SourceBuffer has been removed from the parent media source

I'm trying to play a movie by converting it to an HLS stream.
I convert the movie to HLS with ffmpeg using this command:
nice -n 19 /usr/bin/ffmpeg -y -i 9e36f618-5775-40ba-b045-66fd16badc3e.mkv -vf 'movie=/home/thanhtv/data/bitbucket/php/ps/public/img/default/phimsobiz.png [logo]; [in][logo] overlay=5:10' -flags -global_header -f segment -segment_format mpeg_ts -segment_list ./9e36f618-5775-40ba-b045-66fd16badc3e.m3u8 -r 22 -maxrate 2M -bufsize 1M -segment_time 20 -threads 12 -vcodec libx264 -acodec aac -refs 6 -coder 1 -sc_threshold 40 -flags +loop -me_range 16 -subq 7 -i_qfactor 0.71 -qcomp 0.6 -qdiff 4 -trellis 1 ./9e36f618-5775-40ba-b045-66fd16badc3e%09d.ts
but when I play it using hls.js (this one) I get an error:
hls.js:5051 Uncaught DOMException: Failed to read the 'buffered' property from 'SourceBuffer': This SourceBuffer has been removed from the parent media source.
at r.onSBUpdateEnd (https://phimso.biz/assets/js/hls.js:5051:50)
With a similar command I got another video to play, but this one doesn't. I don't know why. Can anyone help me, please?
Here is the movie,
and here is the page where I'm trying to play that movie (in HLS, of course):
https://phimso.biz/admin/test
(My English is quite bad, so please forgive me if I made some grammar errors.)

Convert AVI to 3GP using ffmpeg

I want to convert an AVI file to 3GP with the codec as MPEG-4 Simple Profile Level 0, but I am not able to do it with ffmpeg; it gives this error: Requested output format '-vcodec' is not a suitable output format. How do I fix this? Thanks in advance.
Note: the input AVI is 720x480, generated from BMP images using ffmpeg with the FFV1 codec. The output 3GP should be MPEG-4 Simple Profile Level 0.
Original from: http://forum.videohelp.com/threads/322328-libfaac-encoding-with-ffmpeg
This worked for me:
tools/ffmpeg/./ffmpeg -i debug/assets/videos/sample_iPod.mp4 -acodec libvo_aacenc -vcodec libx264 debug/assets/videos/sample_iPod.3gp
Try this; hope it helps:
ffmpeg -i input.AVI -acodec libfaac -ab 128k -ar 44100 -s 704x400 -r 20 -vcodec libx264 -b 256000 -cmp +chroma -partitions +parti4x4+partp8x8+partb8x8 -me_method umh -subq 5 -trellis 1 -refs 2 -bf 1 -coder 1 -me_range 16 -g 300 -keyint_min 25 -sc_threshold 40 -i_qfactor 0.71 -bt 256000 -maxrate 4M -bufsize 4M -rc_eq 'blurCplx^(1-qComp)' -qcomp 0.6 -qmin 10 -qmax 51 -qdiff 4 -level 21 output.3gp
This will convert your AVI to 3GP. Try different input and output formats as per your needs.
NOTE: change parameters such as 'level' as per your needs.
You can try this:
ffmpeg -y -i test.mpeg -r 20 -s 352x288 -b 400k -acodec aac -strict experimental -ac 1 -ar 8000 -ab 24k test.3gp
The OP wants it in the mpeg4 codec, not x264.
Here's how:
ffmpeg.exe -i input.avi -acodec libvo_aacenc -ab 64k -vcodec mpeg4 -s 320x240 -b:v 400k -r 23.976 output.3gp
On some phones 320x240 may look stretched; in that case use 320x180 instead.

How to change bitrate mode from VBR to CBR for an MPEG-4/H.264 file?

I've tried to convert the bitrate mode from VBR to CBR with the FFmpeg library, but the bitrate mode doesn't change.
My command line:
ffmpeg -i <in file> -f mpegts -acodec libmp3lame -ar 48000 -ab 64k -s 320x240 -vcodec libx264 -b 96k -flags +loop -cmp +chroma -partitions +parti4x4+partp8x8+partb8x8 -subq 5 -trellis 1 -refs 1 -coder 0 -me_range 16 -keyint_min 25 -sc_threshold 40 -i_qfactor 0.71 -bt 200k -maxrate 96k -bufsize 96k -rc_eq 'blurCplx^(1-qComp)' -qcomp 0.6 -qmin 10 -qmax 51 -qdiff 4 -level 30 -aspect 320:240 -g 30 -async 2 <output file>
I found some apps, but none of them lets me change the bitrate mode.
Can anyone tell me why my command cannot change the mode, or point me to an app that can?
I think CBR is a mode where the bitrate stays the same the whole time, is that right?
Thanks
If I understand correctly, you are just looking to get a constant bitrate? If so, adding -minrate 96k will probably do the trick, for example:
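(Untested sketch; the bracketed part stands for the remaining options from your command, unchanged:)
ffmpeg -i <in file> -f mpegts -acodec libmp3lame -ar 48000 -ab 64k -s 320x240 -vcodec libx264 -b 96k -minrate 96k -maxrate 96k -bufsize 96k [rest of your options unchanged] <output file>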

Live Video Encoding in ffmpeg

Actually I am trying to compress raw video to MPEG-4 AVC/H.264 BD-compatible High Profile / Level 4.1 video. I am using ffmpeg.
ffmpeg -threads 2 -f rawvideo -pix_fmt bgr24 -re -s 720x576 -i - -threads 2 -vcodec libx264 -deinterlace -s 720x576 -coder 1 -flags +loop -cmp +chroma -partitions -parti8x8-parti4x4-partp8x8-partb8x8 -me_method dia -subq 1 -me_range 16 -g 250 -keyint_min 25 -sc_threshold 40 -i_qfactor 0.71 -b_strategy 1 -qcomp 0.6 -qmin 10 -qmax 51 -qdiff 4 -bf 3 -refs 1 -directpred 1 -trellis 0 -flags2 +bpyramid-mixed_refs+wpred-dct8x8+fastpskip-mbtree -wpredp 0 -b 3000k -g 300 -an -f flv -y -
This works (the output video is 21 MB for 1 min), but when using -f mp4 an error shows (muxer does not support non seekable output). Can anyone help me? Is this the right way to do it?
Thank you very much.
Actually the error message is correct.
If I understand the command line correctly, you want to write to STDOUT. This is impossible for mp4, as mp4 is a file format that can't be written without seeking back afterwards to update the header/footer.
So change to mpegts or something else that can be written via a pipe.
Alternatively, give a filename for the output; then ffmpeg can write an mp4 file.
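For illustration (untested), the tail of the original command would become either of these:
... -b 3000k -g 300 -an -f mpegts -y -   (stream MPEG-TS to stdout)
... -b 3000k -g 300 -an -f mp4 -y output.mp4   (write a seekable mp4 file; output.mp4 is a placeholder name)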
