How to stream WebM video via the Media Source Extensions API - ffmpeg

I'm developing a video streaming website using MSE.
Each video is converted to fragmented MP4 (h264, aac => avc1, mp4a).
That works very well, but what if I want to use the WebM format? YouTube and Facebook sometimes use it.
I want to know how to get the index (like the sidx atom in fMP4) for VP8, VP9 or Vorbis streams.
I use Bento4 and ffmpeg to get metadata from the video and audio,
but Bento4 is for MP4 only, and I use MP4BoxJS to parse the index in the browser with JavaScript.
What should I use (ffmpeg or something else) to create fragmented WebM, and how do I get the index and stream info so I can append segments to an MSE SourceBuffer? The stream also needs to be seekable.
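One possible sketch (not from the original thread; the input name, bitrate and keyframe interval are placeholders): ffmpeg's WebM muxer has a dash option that writes WebM files laid out for DASH/MSE playback, and in WebM the Cues element plays the role the sidx box plays in fragmented MP4.

# video-only and audio-only DASH-ready WebM files (VP9 + Opus)
ffmpeg -i input.mp4 -an -c:v libvpx-vp9 -b:v 1M -g 150 -f webm -dash 1 video.webm
ffmpeg -i input.mp4 -vn -c:a libopus -b:a 128k -f webm -dash 1 audio.webm

If your build includes the webm_dash_manifest demuxer, it can report the byte ranges of the initialization segment and the Cues for each file, which is the index information to hand to the JavaScript side.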

Related

FFMPEG-API Add Subtitle to an MPEG-TS Stream

We have a video file that we want to stream over UDP as MPEG-TS.
This works perfectly with the ffmpeg API.
Now we want to add subtitles to this stream.
I tried loading a subtitle file and adding its stream to our output context, but that does not work.
I also tried adding the subtitles dynamically, but that does not work either.
Does anyone have an idea?
Thanks to all.
The MPEG-TS format only supports the DVB subtitle (DVBSUB) and DVB teletext formats for subtitles; DVB subtitles are bitmap-based.
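As a hedged illustration of that constraint (the input file and UDP address are placeholders): subtitles that already exist as a DVB subtitle stream can be stream-copied into the TS output, whereas a text format such as SRT cannot simply be muxed in.

ffmpeg -i input_with_dvbsub.ts -map 0:v -map 0:a -map 0:s -c:v copy -c:a copy -c:s copy -f mpegts udp://239.0.0.1:1234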

Change the audio delay while streaming video with ffmpeg?

I have two RTMP inputs, one for audio playback and one for streaming video. I mix them with ffmpeg, but I want to change the delay between the audio and the video while ffmpeg is running. How can I do that?
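One possible sketch (the URLs and the offset value are placeholders, and it assumes the video is already H.264 so it can be copied into FLV): a fixed audio delay can be set when the command starts by putting -itsoffset in front of the audio input; as far as I know, changing it on the fly means restarting the command with a new value.

ffmpeg -i rtmp://example.com/live/video -itsoffset 0.5 -i rtmp://example.com/live/audio -map 0:v -map 1:a -c:v copy -c:a aac -f flv rtmp://example.com/live/out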

Libavformat - Passing a set of images to libavformat to generate a video

I am trying to generate a video with libavformat/libavcodec from a bunch of images that are in memory.
Can someone point me in the right direction, please?
Thanks in advance.
First, the basics of creating a video from images with FFmpeg are explained here.
If you simply want to change/force the format and codec of your video, here is a good start.
For the raw FFmpeg documentation you can use the Video and Audio Format Conversion section, the Codec Documentation, the Format Documentation and the image2 demuxer documentation (this demuxer handles images as input).
If you just want to take images and make a simple video out of them, look at the first two links. FFmpeg's documentation gives you powerful tools, but don't use them if you don't need them.
A sample command to create a video from images is:
ffmpeg -i image-%03d.png video.mp4
This will take all the files in sequence from image-000.png up to the highest number available and make a video out of them.
You can force the format with the extension of the output file. To force the video codec, use -c:v followed by a codec name from the codec documentation.
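For example, a variation on the command above (the frame rate and pixel format are just common choices, not requirements):

ffmpeg -framerate 25 -i image-%03d.png -c:v libx264 -pix_fmt yuv420p video.mp4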

How to convert SRT (text-based subtitles) to DVB (bitmap-based subtitles)?

I have a live stream with video and audio substreams. I want to add a third stream, an SRT subtitle stream, and generate an output stream with a DVB subtitle stream along with the video and audio.
I am able to add the third stream (the SRT subtitle stream) to the output as DVB subtitles. The third stream is generated with DVB subtitles, but I cannot see any subtitle data in the VLC player. The problem is that ffmpeg is not able to convert SRT data to DVB subtitle data; it can only convert "SRT to SRT" or "DVB (bitmap) to DVB (bitmap)". My concern is how to transcode (convert) SRT subtitle data to DVB subtitle (bitmap) data. I need some logic or an idea to step forward.
Or is there any other way of doing this?
I have tried a tool that converts SRT to DVB (bitmap) format and added the converted file as an input to ffmpeg. That way I can create the third stream as DVB subtitles and even see the subtitles in VLC. Now I want to do the conversion myself, since my input is a live stream; I can't use an external tool, I need proper logic to convert it.
I have tried multiple ffmpeg commands to convert SRT to DVB format, but ffmpeg only handles the case where the input and output subtitle formats are the same; I have found no code in ffmpeg that can convert SRT to DVB subtitles.
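As a sketch of the external-tool workaround described above (the file names are placeholders, and it assumes the subtitles have already been rendered to DVB bitmaps): the pre-converted DVB subtitle stream can be supplied as a second input and stream-copied next to the live video and audio.

ffmpeg -i live_input.ts -i converted_dvbsub.ts -map 0:v -map 0:a -map 1:s -c:v copy -c:a copy -c:s copy -f mpegts output.ts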

How to create video using QTRLE codec in ffmpeg?

I want to generate a video using the QTRLE codec with the ARGB pixel format using the ffmpeg libraries.
I am able to create a video using H264 with the YUV420P pixel format, but I am unable to do the same with QTRLE. How can I do this?
I suppose that your first video, the one you create using H264 with the YUV420P pixel format, is an MP4, and that you also want to generate an MP4 video with the QTRLE codec and ARGB pixel format.
The problem is that the QTRLE codec is not compatible with the MP4 container, so you cannot have an MP4 file encoded in QTRLE. The container I know to be compatible with it is MOV. I tried it and there was no problem.
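With the ffmpeg command-line tool the equivalent would look something like this (input and output names are placeholders); the same container, codec and pixel-format choice applies when you set up the output context with the libraries.

ffmpeg -i input.mp4 -c:v qtrle -pix_fmt argb output.mov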
