For my HW decoder I need to know additional codec specification details about a video codec such as H.264. How can I extract low-level codec info with libav/FFmpeg?
The data that I need: https://www.ffmpeg.org/doxygen/2.7/structSPS.html
You can either look at each NALU in the stream, searching for an SPS (nalu[0] & 0x1F == 7), or you can take a look at the AVCC data pointed to by AVCodecContext.extradata and parse it as described here:
http://aviadr1.blogspot.com/2010/05/h264-extradata-partially-explained-for.html
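A minimal sketch of the first approach (scanning an Annex B byte stream for the SPS NAL unit); buf/size are assumed to hold raw H.264 byte-stream data, e.g. an AVPacket payload. Note that FFmpeg's internal SPS struct is not public API, so to read the individual fields you still have to Exp-Golomb-decode the SPS payload yourself:

#include <stddef.h>
#include <stdint.h>

/* Scan an Annex B buffer for an SPS NAL unit (nal_unit_type == 7). */
static const uint8_t *find_sps(const uint8_t *buf, int size, int *sps_size)
{
    for (int i = 0; i + 4 < size; i++) {
        /* look for a 3- or 4-byte start code */
        if (buf[i] == 0 && buf[i + 1] == 0 &&
            (buf[i + 2] == 1 || (buf[i + 2] == 0 && buf[i + 3] == 1))) {
            int off = (buf[i + 2] == 1) ? i + 3 : i + 4;
            if ((buf[off] & 0x1F) == 7) {            /* SPS */
                /* length = up to the next start code (or end of buffer) */
                int end = off + 1;
                while (end + 3 < size &&
                       !(buf[end] == 0 && buf[end + 1] == 0 &&
                         (buf[end + 2] == 1 ||
                          (buf[end + 2] == 0 && buf[end + 3] == 1))))
                    end++;
                if (end + 3 >= size)
                    end = size;
                *sps_size = end - off;
                return buf + off;
            }
            i = off - 1;
        }
    }
    return NULL;
}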
Related
There is an H.264 RTP stream passing through a server. I capture every RTP packet, strip the RTP header to extract each NALU, and then write every NALU to a file record.h264, prepending the H.264 start code 0x00000001. However, record.h264 cannot be played by ffplay and cannot be parsed by ffprobe. Where did I go wrong?
Here is my record file: https://github.com/sshsu/record_h264_file
You might be skipping the decoding of the NAL header within the RTP packet from its aggregated/fragmented form back to its full form.
I have an implementation in C# here:
https://github.com/juliusfriedman/net7mma_core/blob/master/RtspServer/MediaTypes/RFC6184Media.cs if it helps you.
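In case a C sketch is easier to follow than the C# code, this is roughly what RFC 6184 depacketization looks like (the function name and the FILE-based output are my own; payload/len are assumed to point past the RTP header). Writing the bare payloads as-is breaks exactly in the FU-A and STAP-A cases, and the file will also fail to parse if the SPS/PPS only ever appeared in the SDP (sprop-parameter-sets) and never in-band:

#include <stdint.h>
#include <stdio.h>

static const uint8_t start_code[4] = { 0, 0, 0, 1 };

void handle_h264_payload(const uint8_t *payload, int len, FILE *out)
{
    uint8_t nal_type = payload[0] & 0x1F;

    if (nal_type >= 1 && nal_type <= 23) {
        /* single NAL unit packet: start code + payload as-is */
        fwrite(start_code, 1, 4, out);
        fwrite(payload, 1, len, out);
    } else if (nal_type == 28) {
        /* FU-A fragmentation unit: rebuild the original NAL header */
        uint8_t fu_indicator = payload[0];
        uint8_t fu_header    = payload[1];
        if (fu_header & 0x80) {                       /* start bit */
            uint8_t orig_hdr = (fu_indicator & 0xE0) | (fu_header & 0x1F);
            fwrite(start_code, 1, 4, out);
            fwrite(&orig_hdr, 1, 1, out);
        }
        fwrite(payload + 2, 1, len - 2, out);         /* fragment data only */
    } else if (nal_type == 24) {
        /* STAP-A: several NAL units, each prefixed by a 16-bit size */
        const uint8_t *p = payload + 1;
        while (p + 2 <= payload + len) {
            int nal_size = (p[0] << 8) | p[1];
            p += 2;
            fwrite(start_code, 1, 4, out);
            fwrite(p, 1, nal_size, out);
            p += nal_size;
        }
    }
}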
I want to read and write a subtitle stream.
ffmpeg -i E:/Video/Waka.mp4 -vf subtitles=E:/Video/Waka.srt out.mp4
What is the equivalent code in C or C++?
Please also explain how to add a subtitle stream with its encoding parameters, and what the procedure is to read a subtitle stream and render it on screen.
The subtitle stream is encoded like any other stream, and while demuxing you will find it as a stream in the file; just match the packet's stream index while reading packets to get the subtitle data.
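A minimal sketch of that, assuming a recent libav API (error handling trimmed): open the input, find the subtitle stream, and decode its packets with avcodec_decode_subtitle2(). Burning the subtitles into the video the way -vf subtitles= does additionally requires libavfilter's "subtitles" filter (built with libass); this sketch only shows the reading/decoding side:

#include <libavformat/avformat.h>
#include <libavcodec/avcodec.h>

int main(int argc, char **argv)
{
    AVFormatContext *fmt = NULL;
    if (argc < 2 || avformat_open_input(&fmt, argv[1], NULL, NULL) < 0)
        return 1;
    avformat_find_stream_info(fmt, NULL);

    /* locate the subtitle stream */
    int sub_idx = av_find_best_stream(fmt, AVMEDIA_TYPE_SUBTITLE, -1, -1, NULL, 0);
    if (sub_idx < 0)
        return 1;

    const AVCodec *dec = avcodec_find_decoder(fmt->streams[sub_idx]->codecpar->codec_id);
    AVCodecContext *ctx = avcodec_alloc_context3(dec);
    avcodec_parameters_to_context(ctx, fmt->streams[sub_idx]->codecpar);
    avcodec_open2(ctx, dec, NULL);

    AVPacket *pkt = av_packet_alloc();
    AVSubtitle sub;
    int got;
    while (av_read_frame(fmt, pkt) >= 0) {
        if (pkt->stream_index == sub_idx &&
            avcodec_decode_subtitle2(ctx, &sub, &got, pkt) >= 0 && got) {
            /* sub.rects[] now holds text or bitmap rectangles to render or re-encode */
            avsubtitle_free(&sub);
        }
        av_packet_unref(pkt);
    }
    av_packet_free(&pkt);
    avcodec_free_context(&ctx);
    avformat_close_input(&fmt);
    return 0;
}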
I read what an elementary stream is on Wikipedia. A tool I am using, Live555, demands an "H.264 Video Elementary Stream File". So when exporting a video from a video application, do I have to choose specific preferences to generate an elementary stream?
If you're using ffmpeg you could use something similar to the following:
ffmpeg -f video4linux2 -s 320x240 -i /dev/video0 -vcodec libx264 -f h264 test.264
You'll have to adapt the command line for the file type you're exporting the video from.
This generates a file containing H.264 access units where each access unit consists of one or more NAL units, each NAL unit prefixed with a start code (00 00 00 01 or 00 00 01). You can open the file in a hex editor to take a look at it.
You can also create an H.264 elementary stream file (.264) from raw YUV input files by using the H.264 reference encoder.
If you copy the generated .264 file into the live555 testOnDemandRTSPServer directory, you can test streaming the file over RTSP/RTP.
Can you give some references to read more about NAL / H.264 elementary Stream. How can I quickly check if the stream is an elementary stream?
Generally anything in a container (AVI or MP4) is not an elementary stream. The typical extension used for elementary streams is ".264". The quickest way to double-check that a file is an elementary stream is to open it in a hex editor and look for a start code at the beginning of the file (00000001). Note that there should be 3-byte (000001) and 4-byte (00000001) start codes throughout the file (before every NAL unit).
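If you'd rather check programmatically than with a hex editor, a quick-and-dirty sketch (not a full parser; the function name is my own) is to look for the Annex B start code at the very beginning of the file:

#include <stdio.h>
#include <stdint.h>

/* Returns non-zero if the file starts with a 3- or 4-byte Annex B start code. */
int looks_like_annexb(const char *path)
{
    uint8_t b[4];
    FILE *f = fopen(path, "rb");
    if (!f)
        return 0;
    if (fread(b, 1, 4, f) < 4) {
        fclose(f);
        return 0;
    }
    fclose(f);
    return (b[0] == 0 && b[1] == 0 && b[2] == 1) ||
           (b[0] == 0 && b[1] == 0 && b[2] == 0 && b[3] == 1);
}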
Why does live555 not play h264 streams which are not elementary?
That is purely because live555 has not implemented the required demultiplexing (e.g. for AVI or MP4). AFAIK live555 does support demuxing H.264 from the Matroska container.
I am trying to stream data encoded with FFmpeg using live555. I have a custom FramedSource that sends the data to the sink, but I am unable to figure out how to set the SPS and PPS in the framer. I understand that extradata contains this information, but I saw only the SPS in it. Does extradata change while FFmpeg is encoding? If yes, how and when do we need to update this information in the live555 framer?
Does anyone have a working sample that uses FFmpeg and live555 to stream H.264?
Live555 is simply a streaming tool; it does not do any encoding.
The SPS and PPS are NAL units within the encoded H264 stream (or the output from your FFMPEG implementation) (see some info here: http://www.cardinalpeak.com/blog/the-h-264-sequence-parameter-set/).
If you want to change the SPS or PPS information you need to do it in FFMPEG.
Examples of FFMPEG and Live555 working together to stream MPG2 and H264 streams are here:
https://github.com/alm865/FFMPEG-Live555-H264-H265-Streamer/
As for streaming an H.264 stream, you need to break the output from FFmpeg into NAL units before you send it off to the discrete framer for it to work correctly. You must also strip the start-code bytes from each packet (i.e. remove the NAL prefix 0x00 0x00 0x00 0x01).
Live555 will automatically read these in and update as necessary.
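A sketch of that splitting step (the callback name deliver_nal is hypothetical; in practice it would be whatever feeds your H264VideoStreamDiscreteFramer subclass):

#include <stdint.h>

typedef void (*nal_cb)(const uint8_t *nal, int size, void *opaque);

/* Find the next 3- or 4-byte start code; returns its offset or -1. */
static int next_start_code(const uint8_t *p, int size, int *sc_len)
{
    for (int i = 0; i + 3 <= size; i++) {
        if (p[i] == 0 && p[i + 1] == 0) {
            if (p[i + 2] == 1) { *sc_len = 3; return i; }
            if (i + 4 <= size && p[i + 2] == 0 && p[i + 3] == 1) { *sc_len = 4; return i; }
        }
    }
    return -1;
}

/* Split an Annex B buffer into NAL units and deliver each one WITHOUT its start code. */
void split_annexb(const uint8_t *buf, int size, nal_cb deliver_nal, void *opaque)
{
    int sc_len;
    int pos = next_start_code(buf, size, &sc_len);
    while (pos >= 0) {
        int nal_start = pos + sc_len;
        int next_len;
        int next = next_start_code(buf + nal_start, size - nal_start, &next_len);
        int nal_end = (next < 0) ? size : nal_start + next;
        deliver_nal(buf + nal_start, nal_end - nal_start, opaque);
        if (next < 0)
            break;
        pos = nal_start + next;
        sc_len = next_len;
    }
}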
I created a simple DirectShow source filter using FFmpeg. I read RTP packets from an RTSP source and give them to the decoder. It works for an H.264 stream.
MyRtspSourceFilter[H264 Stream] ---> h264 Decoder --> Video Renderer
The bad news is that it does not work for MPEG-4. I am able to connect my RTSP source filter to the MPEG-4 decoder. I get no exception, but the video renderer does not show anything. Actually, it shows just one frame and then nothing [it just stops]... The decoders and renderers are third-party, so I cannot debug them.
MyRtspSourceFilter[MP4 Stream] ---> MPEG-4 Decoder --> Video Renderer
I am able to get RTP packets from the MPEG-4 RTSP source using FFmpeg successfully. There is no problem with that.
It seems that I have not set something in my RTSP source filter which is not necessary for an H.264 stream but may be important for an MPEG-4 stream.
What may cause this difference between the H.264 and MPEG-4 streams in a DirectShow RTSP source filter? Any ideas?
More Info:
-- First, I tried some other RTSP source filters for the MPEG-4 stream... Although my RTSP source is the same, I see different subtypes in their pin connections.
-- Secondly, I got suspicious about whether the source really is MPEG-4, so I checked with FFmpeg... FFmpeg reports the source codec ID as "CODEC_ID_MPEG4".
Update:
[ Hack ]
I just set m_bmpInfo.biCompression = DWORD('xvid') and it worked fine... But it is static. How can I get/determine this value dynamically using FFmpeg or some other way?
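One possible way to replace the hard-coded 'xvid' tag, sketched against the current libavformat API (older builds expose the same data through AVStream.codec instead of codecpar): ask libavformat for a RIFF/FourCC tag matching the demuxed codec ID, falling back to the stream's own codec_tag if the demuxer already set one:

#include <stdint.h>
#include <libavformat/avformat.h>

/* Return a FourCC suitable for BITMAPINFOHEADER.biCompression, or 0 if unknown. */
uint32_t fourcc_for_stream(const AVStream *st)
{
    if (st->codecpar->codec_tag)               /* some demuxers already provide one */
        return st->codecpar->codec_tag;

    const struct AVCodecTag *const tags[] = { avformat_get_riff_video_tags(), NULL };
    return av_codec_get_tag(tags, st->codecpar->codec_id);
}

For AV_CODEC_ID_MPEG4 this yields an MPEG-4 ASP FourCC from FFmpeg's RIFF table; whether the downstream decoder accepts that particular tag is up to the decoder, so you may still need a small mapping of your own.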
I am on the RTSP-server side, a different use case that requires per-frame conversions:
MP4 file ---> MPEG-4 Decoder --> H264 Encoder --> RTSP Stream
I will deploy libav, which is the core of FFmpeg.
EDIT:
With an H.264-encoded video layer, the video just needs to be remuxed from the length-prefixed file format ("AVCC") to the byte-stream format ("Annex B" of the H.264/MPEG-4 AVC specification). libav provides the required bitstream filter, "h264_mp4toannexb":
MP4 file ---> h264_mp4toannexb_bsf --> RTSP Stream
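A sketch of driving that filter through the modern av_bsf_* API (libavcodec/bsf.h in recent FFmpeg; older releases used the now-removed av_bitstream_filter_* functions). Error handling is trimmed and 'st' is assumed to be the H.264 video stream of the opened MP4:

#include <libavformat/avformat.h>
#include <libavcodec/bsf.h>

AVBSFContext *open_annexb_filter(const AVStream *st)
{
    const AVBitStreamFilter *f = av_bsf_get_by_name("h264_mp4toannexb");
    AVBSFContext *bsf = NULL;
    av_bsf_alloc(f, &bsf);
    avcodec_parameters_copy(bsf->par_in, st->codecpar);
    bsf->time_base_in = st->time_base;
    av_bsf_init(bsf);
    return bsf;
}

/* Per packet: push the length-prefixed (AVCC) packet in, pull Annex B packets out. */
void convert_packet(AVBSFContext *bsf, AVPacket *pkt)
{
    av_bsf_send_packet(bsf, pkt);             /* moves the packet's data into the filter */
    while (av_bsf_receive_packet(bsf, pkt) == 0) {
        /* pkt->data now starts with 00 00 00 01 ...; hand it to the RTSP sender */
        av_packet_unref(pkt);
    }
}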
Now, for decoding RTSP:
Video and audio come in separate channels. Parsing and decoding of the H.264 stream is done here: my basic h264 decoder using libav.
Audio is a different thing:
The RTP transport suggests that AAC frames are encapsulated in ADTS, whereas RTSP players like VLC expect plain AAC; accordingly, available RTSP server implementations (AACSource::HandleFrame()) pinch the ADTS header off.
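Stripping that header is simple; a sketch (function name is my own), assuming each frame starts with an ADTS header of 7 bytes, or 9 when a CRC is present:

#include <stdint.h>
#include <stddef.h>

/* Return a pointer to the raw AAC payload inside an ADTS frame, or NULL. */
const uint8_t *strip_adts(const uint8_t *frame, int size, int *raw_size)
{
    if (size < 7 || frame[0] != 0xFF || (frame[1] & 0xF0) != 0xF0)
        return NULL;                           /* no ADTS syncword */
    int hdr = (frame[1] & 0x01) ? 7 : 9;       /* protection_absent == 1 -> no CRC */
    *raw_size = size - hdr;
    return frame + hdr;
}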
Another different thing is "time stamps and RTP":
VLC does not compensate for time offsets between audio and video. Nearly every RTSP producer or consumer has constraints or undocumented assumptions about time offsets; you might consider an additional delay pipe to compensate for the offset of an RTSP source.