Failed to read the 'buffered' property from 'SourceBuffer': This SourceBuffer has been removed from the parent media source - hls.js

I'm trying to play a movie by converting it to an HLS stream.
I convert the movie to HLS with ffmpeg using this command:
nice -n 19 /usr/bin/ffmpeg -y -i 9e36f618-5775-40ba-b045-66fd16badc3e.mkv -vf 'movie=/home/thanhtv/data/bitbucket/php/ps/public/img/default/phimsobiz.png [logo]; [in][logo] overlay=5:10' -flags -global_header -f segment -segment_format mpegts -segment_list ./9e36f618-5775-40ba-b045-66fd16badc3e.m3u8 -r 22 -maxrate 2M -bufsize 1M -segment_time 20 -threads 12 -vcodec libx264 -acodec aac -refs 6 -coder 1 -sc_threshold 40 -flags +loop -me_range 16 -subq 7 -i_qfactor 0.71 -qcomp 0.6 -qdiff 4 -trellis 1 ./9e36f618-5775-40ba-b045-66fd16badc3e%09d.ts
But when I play it using hls.js (this one), I get an error:
hls.js:5051 Uncaught DOMException: Failed to read the 'buffered' property from 'SourceBuffer': This SourceBuffer has been removed from the parent media source.
at r.onSBUpdateEnd (https://phimso.biz/assets/js/hls.js:5051:50)
With a similar command I got another video to play, but this one doesn't, and I don't know why. Can anyone help me, please?
Here is the movie, and here is the page where I'm trying to play it (in HLS, of course):
https://phimso.biz/admin/test
(My English is quite bad, so please forgive me if I made some grammar errors.)
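One thing worth checking (a debugging sketch only, not a confirmed cause): compare the stream parameters of a segment from the video that plays with one from the video that fails, since a codec, profile or parameter the browser's MediaSource implementation rejects can tear down the SourceBuffer. The segment name below is just an example of the %09d naming from the command above:
ffprobe -v error -show_entries stream=codec_name,profile,level,pix_fmt,width,height,sample_rate,channels -of default=noprint_wrappers=1 ./9e36f618-5775-40ba-b045-66fd16badc3e000000000.ts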

Related

ffmpeg: create slideshow of unknown number of images with transition

Hope someone can help me with this. I'm kinda new to ffmpeg and a bit stumped.
Given an input of a set of numbered images in a folder, I want to generate a video with each image shown for 60 seconds. I would love to use a default transition between images.
The following command correctly generates an mp4 without transitions:
ffmpeg -framerate 1/60 -pattern_type glob -i "*.png" -vcodec libx264 \
-pix_fmt yuv420p -r 30 -threads 4 -crf 25 -refs 1 -bf 0 -coder 0 -g 25 \
-keyint_min 15 -movflags +faststart no_audio_output.mp4
But when I try to add a default transition (supported in the version of ffmpeg I'm using, which is 5.1):
ffmpeg -framerate 1/60 -pattern_type glob -i "WC*.png" -filter_complex \
xfade=transition=circleopen:duration=5:offset=55 -vcodec libx264 \
-pix_fmt yuv420p -r 30 -threads 4 -crf 25 -refs 1 -bf 0 -coder 0 -g 25 \
-keyint_min 15 -movflags +faststart no_audio_output.mp4
I get this error:
Cannot find a matching stream for unlabeled input pad 1 on filter Parsed_xfade_0
I googled a lot, but the solution is still unclear. All the examples I found are designed for a slideshow/input with a defined number of pieces, while in my case the folder could contain any number of images.
Thanks all for your help!
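For what it's worth, one way around the unknown image count is to generate the xfade chain with a small shell script, feeding each image as its own looped input and chaining the transitions pairwise. This is a sketch rather than a tested recipe: it assumes all images share the same dimensions, reuses the 60-second display time, 5-second circleopen fade and WC*.png glob from the question, and uses the usual cumulative offset formula offset_i = i * (60 - 5).
#!/usr/bin/env bash
# Sketch: chain xfade over however many images match the glob.
DUR=60   # seconds each image is shown
FADE=5   # seconds per cross-fade
imgs=(WC*.png)
n=${#imgs[@]}

# One looped input per image, all at the same frame rate so xfade accepts them.
inputs=()
for img in "${imgs[@]}"; do
  inputs+=(-loop 1 -t "$DUR" -framerate 30 -i "$img")
done

# Build [0:v][1:v]xfade=...[v1]; [v1][2:v]xfade=...[v2]; ...
filter=""
last="0:v"
for ((i = 1; i < n; i++)); do
  offset=$(( i * (DUR - FADE) ))
  filter+="[$last][$i:v]xfade=transition=circleopen:duration=$FADE:offset=$offset[v$i];"
  last="v$i"
done
filter=${filter%;}

ffmpeg "${inputs[@]}" -filter_complex "$filter" -map "[$last]" \
  -vcodec libx264 -pix_fmt yuv420p -r 30 -crf 25 -movflags +faststart no_audio_output.mp4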

Create and update HLS playlist programmatically

I have a C++ application that records audio from my default input device, encodes it in AAC format and writes it to a .aac file. I want to use HTTP Live Streaming to live-stream this AAC file. According to this question, I have to create an FFmpeg script to split my audio file into several .ts files.
# bitrate, width, and height, you may want to change this
BR=512k
WIDTH=432
HEIGHT=240
input=${1}
# strip off the file extension
output=$(echo ${input} | sed 's/\..*//' )
# works for most videos
ffmpeg -y -i ${input} -f mpegts -acodec libmp3lame -ar 48000 -ab 64k -s ${WIDTH}x${HEIGHT} -vcodec libx264 -b ${BR} -flags +loop -cmp +chroma -partitions +parti4x4+partp8x8+partb8x8 -subq 7 -trellis 0 -refs 0 -coder 0 -me_range 16 -keyint_min 25 -sc_threshold 40 -i_qfactor 0.71 -bt 200k -maxrate ${BR} -bufsize ${BR} -rc_eq 'blurCplx^(1-qComp)' -qcomp 0.6 -qmin 30 -qmax 51 -qdiff 4 -level 30 -aspect ${WIDTH}:${HEIGHT} -g 30 -async 2 ${output}-iphone.ts
(It is slightly different in my case because I only work with audio)
Can I do this programmatically in C++, and if so, do I have to use a third-party library or does macOS provide native functions for it?
No, you don't have to split it (you can put byte offsets into the m3u8), and no, you don't need to use a .ts; HLS supports .aac files. Yes, you can programmatically create an m3u8 without a library; C++ or literally any other programming language is capable of it.
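To illustrate the byte-offset idea, here is a minimal sketch of a playlist that serves slices of a single .aac file via EXT-X-BYTERANGE (playlist version 4 or later). The file name, durations and byte ranges are made-up placeholders; your code would compute them from the encoder output, rewrite the playlist as new audio arrives, and omit #EXT-X-ENDLIST while the stream is still live.
# Sketch: an HLS playlist is just a text file, so any language can emit it.
cat > stream.m3u8 <<'EOF'
#EXTM3U
#EXT-X-VERSION:4
#EXT-X-TARGETDURATION:10
#EXT-X-MEDIA-SEQUENCE:0
#EXTINF:10.0,
#EXT-X-BYTERANGE:163840@0
recording.aac
#EXTINF:10.0,
#EXT-X-BYTERANGE:163840@163840
recording.aac
#EXT-X-ENDLIST
EOF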

FFMpeg rtsp to m3u8

I am using FFmpeg (version ffmpeg-20170330-ad7aff0-win64-static) to convert RTSP to .m3u8.
The command is:
ffmpeg -rtsp_transport tcp -i {RTSP} -c:v libx264 -crf 35 -preset ultrafast -maxrate 3M -bufsize 300k -r 10 -g 20 -movflags +faststart -tune zerolatency -hls_time 1 -hls_list_size 4 -hls_wrap 4 -start_number 1 -hls_allow_cache 0 -threads 1 -loglevel warning -y {PLAYLISTM3U8LOCATION}
I am getting the following warning constantly:
Duplicated segment filename detected: playlist1.ts
or
Duplicated segment filename detected: playlist2.ts
or
Duplicated segment filename detected: playlist3.ts
In between, it also shows the warning:
cseq 10 expected, 8 received
Any help on this?
I have met a similar problem. My solution was to set the hls_wrap value larger than hls_list_size. Maybe it can help you.
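Concretely, that suggestion only changes the wrap/list-size pair in the command above; an untested sketch with everything else kept as in the question:
ffmpeg -rtsp_transport tcp -i {RTSP} -c:v libx264 -crf 35 -preset ultrafast -maxrate 3M -bufsize 300k -r 10 -g 20 -movflags +faststart -tune zerolatency -hls_time 1 -hls_list_size 4 -hls_wrap 10 -start_number 1 -hls_allow_cache 0 -threads 1 -loglevel warning -y {PLAYLISTM3U8LOCATION}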

Creating a simulated HLS live stream from multiple MP4 sources with ffmpeg

I've already created an HLS stream from a continuous UDP input stream; that was fairly easy. Now I want to create a simulated live HLS stream from multiple MP4 sources with ffmpeg. The idea is to build a TV channel out of non-live data, so the input must loop to simulate the continuity of a live stream. I tried to do it with the command below, but after the first round ffmpeg exits with this error:
concat:1.mp4|2.mp4|3.mp4" Resource temporarily unavailable.
ffmpeg command:
ffmpeg -i "concat:1.mp4|2.mp4|3.mp4" -strict experimental -sn -ac 2 -map_metadata -1 -s 720x576 -g 250 -c:v libx264 -pix_fmt yuv420p -flags -global_header -hls_time 10 -hls_list_size 5 -hls_wrap 12 -hls_flags delete_segments -f hls -strftime 1 -segment_time 10 -segment_format mpegts -segment_list_flags +live -hls_allow_cache 0 -segment_wrap 12 -segment_list_size 5 -hls_base_url http://192.168.1.100/0/ -hls_segment_filename /data/0/live_0_%02d.ts /data/0/live_0.m3u8
If anyone has a nice solution to this issue, I would appreciate any input.
Cheers,
Navid
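One approach worth trying (a sketch only, not verified against this exact setup): use the concat demuxer with a list file and loop the whole input with -stream_loop, so ffmpeg never runs out of data between rounds. It assumes all the MP4s share the same codecs and resolution; filenames and HLS options are taken from the question.
# Sketch: describe the loop as a concat list, then repeat it forever.
cat > list.txt <<'EOF'
file '1.mp4'
file '2.mp4'
file '3.mp4'
EOF

ffmpeg -re -stream_loop -1 -f concat -safe 0 -i list.txt -sn -ac 2 -s 720x576 -g 250 -c:v libx264 -pix_fmt yuv420p -f hls -hls_time 10 -hls_list_size 5 -hls_flags delete_segments -hls_base_url http://192.168.1.100/0/ -hls_segment_filename /data/0/live_0_%02d.ts /data/0/live_0.m3u8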

Live Video Encoding in ffmpeg

I am trying to compress raw video to MPEG-4 AVC/H.264 BD-compatible High Profile / Level 4.1 video. I am using ffmpeg.
ffmpeg -threads 2 -f rawvideo -pix_fmt bgr24 -re -s 720x576 -i - -threads 2 -vcodec libx264 -deinterlace -s 720x576 -coder 1 -flags +loop -cmp +chroma -partitions -parti8x8-parti4x4-partp8x8-partb8x8 -me_method dia -subq 1 -me_range 16 -g 250 -keyint_min 25 -sc_threshold 40 -i_qfactor 0.71 -b_strategy 1 -qcomp 0.6 -qmin 10 -qmax 51 -qdiff 4 -bf 3 -refs 1 -directpred 1 -trellis 0 -flags2 +bpyramid-mixed_refs+wpred-dct8x8+fastpskip-mbtree -wpredp 0 -b 3000k -g 300 -an -f flv -y -
It works (the output video is 21 MB for 1 min), but when using -f mp4 an error shows (muxer does not support non-seekable output). Can anyone help me? Is this the right way to do it?
Thank you very much.
Actually the error message is correct.
If I understand the command line correctly, you want to write to STDOUT. This is impossible for mp4, as mp4 is a file format which can't be written without seeking back afterwards and updating the header/footer.
So change to mpegts or something else that can be written via a pipe.
Alternatively, give a filename for the output; then ffmpeg can write an mp4 file.
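To make that concrete, here is a minimal sketch with most of the encoding flags from the question omitted for brevity: either keep piping but switch the container to MPEG-TS, or point the output at a real file so ffmpeg can seek back and finish the header.
# Option 1: pipe a streamable container instead of mp4
ffmpeg -f rawvideo -pix_fmt bgr24 -s 720x576 -re -i - -vcodec libx264 -b:v 3000k -an -f mpegts -
# Option 2: write the mp4 to an actual file so the muxer can seek
ffmpeg -f rawvideo -pix_fmt bgr24 -s 720x576 -re -i - -vcodec libx264 -b:v 3000k -an -y output.mp4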
