Stream H264 using live555 and FFmpeg

I am trying to stream data encoded with FFmpeg using live555. I have a custom FramedSource that sends the data to the sink, but I am unable to figure out how to set the SPS and PPS in the framer. I understand that extradata contains this information, but I only saw the SPS in it. Does extradata change while FFmpeg is encoding? If so, how and when do we need to update this information in the live555 framer?
Does anyone have a working sample that uses FFmpeg and live555 to stream H264?

Live555 is simply a streaming tool; it does not do any encoding.
The SPS and PPS are NAL units within the encoded H264 stream (i.e. the output of your FFmpeg implementation); see some info here: http://www.cardinalpeak.com/blog/the-h-264-sequence-parameter-set/
If you want to change the SPS or PPS information you need to do it in FFmpeg.
Examples of FFmpeg and Live555 working together to stream MPEG2 and H264 streams are here:
https://github.com/alm865/FFMPEG-Live555-H264-H265-Streamer/
As for streaming an H264 stream, you need to break the output from FFmpeg into NAL units before you send it off to the discrete framer for it to work correctly. You must also strip the start code from each NAL unit (i.e. remove the 0x00 0x00 0x00 0x01 prefix), as shown in the sketch below.
Live555 will automatically read these in and update as necessary.
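Here is a minimal sketch of that splitting step, assuming the encoder output is in Annex-B format (start-code delimited), e.g. the data of an AVPacket produced through FFmpeg's libx264 encoder. The NalUnit struct and splitAnnexB function are purely illustrative names, not part of live555 or FFmpeg:

// Split an Annex-B buffer into individual NAL units with the
// 00 00 00 01 / 00 00 01 start codes removed, so each unit can be handed
// to an H264VideoStreamDiscreteFramer one at a time.
#include <cstddef>
#include <cstdint>
#include <vector>

struct NalUnit {
    const uint8_t* data;  // first byte after the start code (the NAL header byte)
    size_t size;          // length of the NAL unit, start code excluded
};

std::vector<NalUnit> splitAnnexB(const uint8_t* buf, size_t len) {
    std::vector<NalUnit> nals;
    size_t i = 0;
    size_t start = SIZE_MAX;  // start of the current NAL payload, if any
    while (i + 3 <= len) {
        // Detect a 3- or 4-byte start code at position i.
        bool sc3 = buf[i] == 0 && buf[i + 1] == 0 && buf[i + 2] == 1;
        bool sc4 = i + 4 <= len && buf[i] == 0 && buf[i + 1] == 0 &&
                   buf[i + 2] == 0 && buf[i + 3] == 1;
        if (sc3 || sc4) {
            if (start != SIZE_MAX)
                nals.push_back({buf + start, i - start});  // close the previous NAL
            i += sc4 ? 4 : 3;
            start = i;                                     // a new NAL begins here
        } else {
            ++i;
        }
    }
    if (start != SIZE_MAX && start < len)
        nals.push_back({buf + start, len - start});        // last NAL in the buffer
    return nals;
}

Inside your FramedSource subclass you would then copy one NAL unit at a time into fTo (respecting fMaxSize), set fFrameSize, and call FramedSource::afterGetting(), which is roughly how the discrete framer expects to be fed.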

Related

Format h264 detected only with low score of 1, misdetection possible

There is an H264 RTP stream passing through my server. I catch every RTP packet and extract each NALU by stripping the RTP header, then write every NALU to the file record.h264, prepending the H264 start code 0x00000001. But record.h264 cannot be played by ffplay and cannot be parsed by ffprobe. Where is this going wrong?
here is my record file:https://github.com/sshsu/record_h264_file
You might be skipping the decoding of the NAL header within the RtpPacket back to its full format from the aggregated form.
I have an implementation in C# here:
https://github.com/juliusfriedman/net7mma_core/blob/master/RtspServer/MediaTypes/RFC6184Media.cs if it helps you.
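For reference, a rough sketch of what that unpacking has to do per RFC 6184. This is not taken from the C# code above; writeAnnexB is a hypothetical callback that prepends 0x00000001 and appends the bytes to record.h264, and payload is the RTP payload with the RTP header already stripped:

#include <cstddef>
#include <cstdint>
#include <functional>
#include <vector>

void depacketizeH264(const uint8_t* payload, size_t len,
                     std::vector<uint8_t>& fuBuffer,  // state kept across FU-A fragments
                     const std::function<void(const uint8_t*, size_t)>& writeAnnexB) {
    if (len < 1) return;
    uint8_t nalType = payload[0] & 0x1F;
    if (nalType >= 1 && nalType <= 23) {
        // Single NAL unit packet: the payload is the NAL unit itself.
        writeAnnexB(payload, len);
    } else if (nalType == 24) {
        // STAP-A: several NAL units, each preceded by a 16-bit big-endian size.
        size_t off = 1;
        while (off + 2 <= len) {
            size_t nalSize = (payload[off] << 8) | payload[off + 1];
            off += 2;
            if (off + nalSize > len) break;
            writeAnnexB(payload + off, nalSize);
            off += nalSize;
        }
    } else if (nalType == 28) {
        // FU-A: a fragmented NAL unit. Byte 0 is the FU indicator, byte 1 the FU header.
        if (len < 2) return;
        uint8_t fuHeader = payload[1];
        bool start = (fuHeader & 0x80) != 0;
        bool end   = (fuHeader & 0x40) != 0;
        if (start) {
            fuBuffer.clear();
            // Rebuild the original NAL header from the indicator's F/NRI bits
            // and the FU header's type bits.
            fuBuffer.push_back(static_cast<uint8_t>((payload[0] & 0xE0) | (fuHeader & 0x1F)));
        }
        fuBuffer.insert(fuBuffer.end(), payload + 2, payload + len);
        if (end && !fuBuffer.empty()) {
            writeAnnexB(fuBuffer.data(), fuBuffer.size());
            fuBuffer.clear();
        }
    }
    // STAP-B, MTAP and FU-B are left out of this sketch.
}

If STAP-A aggregates are not split and FU-A fragments are not reassembled like this, the resulting file is not a valid Annex-B stream, which would explain the ffplay/ffprobe failure.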

I am using ffmpeg to stream a h264 encoded avi file to a player and the player supports only packetization mode 0

I am using ffmpeg to stream an H264-encoded AVI file to a player, and the player supports only packetization mode 0 (single NAL unit mode). But ffmpeg always uses packetization mode 1 and sends FU-A NAL unit types, and the player does not play the video on receiving an FU-A payload; it just displays a blank screen. I understand that non-interleaved mode supports both single NAL unit types (1-23) and FU-A, but how can I force ffmpeg to use only single NAL unit mode? Can someone help me?
I'm assuming you mean H264 over RTP here. FFmpeg's RTP muxer can be forced to use mode 0 with the flag -rtpflags h264_mode0, though if you are seeing FU-A (type 28) packets, chances are some NAL units don't fit in a single RTP packet and mode 0 won't work.
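For example (a hypothetical invocation; adjust the input file and destination address to your setup):
ffmpeg -re -i input.avi -c:v copy -rtpflags h264_mode0 -f rtp rtp://192.168.1.10:5004
Keep in mind that mode 0 only works when every NAL unit plus the RTP/UDP overhead fits within the MTU, so if you are re-encoding you may also need to limit the slice size (e.g. via x264's slice-max-size setting).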

What does Elementary Stream mean in Terms of H264

I read what an Elementary Stream is on Wikipedia. A tool I am using, Live555, is demanding an "H.264 Video Elementary Stream File". So when exporting a video from a video application, do I have to choose specific settings to generate an "Elementary Stream"?
If you're using ffmpeg you could use something similar to the following:
ffmpeg -f video4linux2 -s 320x240 -i /dev/video0 -vcodec libx264 -f h264 test.264
You'll have to adapt the command line for the file type you're exporting the video from.
This generates a file containing H.264 access units, where each access unit consists of one or more NAL units and each NAL unit is prefixed with a start code (00 00 00 01 or 00 00 01). You can open the file in a hex editor to take a look at it.
You can also create an H.264 elementary stream file (.264) by using the H.264 reference encoder from raw YUV input files.
If you copy the generated .264 file into the live555 testOnDemandRTSPServer directory, you can test streaming the file over RTSP/RTP.
Can you give some references to read more about NAL units / the H.264 elementary stream? How can I quickly check whether a stream is an elementary stream?
Generally anything in a container (AVI or MP4) is not an elementary stream. The typical extension used for elementary streams is ".264". The quickest way to double-check that a file is an elementary stream is to open it in a hex editor and look for a start code at the beginning of the file (00 00 00 01). Note that there should be 3-byte (00 00 01) and 4-byte (00 00 00 01) start codes throughout the file, before every NAL unit.
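A quick programmatic version of that hex-editor check (a small sketch; the function name is just illustrative): read the first few bytes of the file and see whether they form an Annex-B start code. A container such as AVI or MP4 will fail this test, since those files start with "RIFF" or an "ftyp" box instead.

#include <cstdint>
#include <cstdio>

bool looksLikeH264ElementaryStream(const char* path) {
    std::FILE* f = std::fopen(path, "rb");
    if (!f) return false;
    uint8_t b[4] = {0};
    size_t n = std::fread(b, 1, 4, f);
    std::fclose(f);
    // Accept either a 3-byte (00 00 01) or 4-byte (00 00 00 01) start code.
    bool sc3 = n >= 3 && b[0] == 0 && b[1] == 0 && b[2] == 1;
    bool sc4 = n == 4 && b[0] == 0 && b[1] == 0 && b[2] == 0 && b[3] == 1;
    return sc3 || sc4;
}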
Why does live555 not play h264 streams which are not elementary?
That is simply because live555 has not implemented the required demultiplexer (e.g. for AVI or MP4). AFAIK live555 does support demuxing H.264 from the Matroska container.

Read H264 SPS & PPS NAL bytes using libavformat APIs

How to read H264 SPS & PPS NAL bytes using libavformat APIs?
I tried reading video data into an AVPacket structure using the av_read_frame(input_avFormatContext, &avPkt) API, from a .mp4 video file (codec is h264).
I dumped avPkt->data to a file, but the first frame read is an IDR frame.
A file generated with "ffmpeg -i video.mp4 video.h264" will contain the SPS & PPS at the start, before the first IDR.
I want to extract the raw .h264 video from the .mp4 file and dump it in SPS, PPS, IDR, P1, P2 ... order.
I want to get this done programmatically using libavformat APIs.
Any idea on it?
Thanks.
In an MP4 container (MKV as well) the PPS/SPS are stored separately from the frame data, in global headers. To access them from the libav* APIs you need to look at the extradata field of the AVCodecContext (or AVCodecParameters) of the AVStream that corresponds to the video stream. Also note that extradata can be in a different format from standard Annex-B H.264 NALs (avcC, i.e. length-prefixed), so look at the MP4 container specs for the format description.
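A minimal sketch of doing this programmatically, assuming a reasonably recent FFmpeg where the av_bsf_* API lives in libavcodec/bsf.h. It runs every video packet through the h264_mp4toannexb bitstream filter, which takes the SPS/PPS from extradata, prepends them to the keyframes and converts the length-prefixed NALs to start-code-prefixed ones; the file names are placeholders:

extern "C" {
#include <libavcodec/bsf.h>
#include <libavformat/avformat.h>
}
#include <cstdio>

int main() {
    AVFormatContext* fmt = nullptr;
    if (avformat_open_input(&fmt, "video.mp4", nullptr, nullptr) < 0) return 1;
    if (avformat_find_stream_info(fmt, nullptr) < 0) return 1;
    int vIdx = av_find_best_stream(fmt, AVMEDIA_TYPE_VIDEO, -1, -1, nullptr, 0);
    if (vIdx < 0) return 1;

    // The SPS/PPS live in codecpar->extradata (avcC format for MP4).
    const AVBitStreamFilter* filter = av_bsf_get_by_name("h264_mp4toannexb");
    AVBSFContext* bsf = nullptr;
    av_bsf_alloc(filter, &bsf);
    avcodec_parameters_copy(bsf->par_in, fmt->streams[vIdx]->codecpar);
    av_bsf_init(bsf);

    std::FILE* out = std::fopen("video.h264", "wb");
    AVPacket* pkt = av_packet_alloc();
    while (av_read_frame(fmt, pkt) >= 0) {
        if (pkt->stream_index == vIdx) {
            av_bsf_send_packet(bsf, pkt);                  // the filter takes the packet
            while (av_bsf_receive_packet(bsf, pkt) == 0) {
                std::fwrite(pkt->data, 1, pkt->size, out); // SPS/PPS precede the IDR
                av_packet_unref(pkt);
            }
        } else {
            av_packet_unref(pkt);
        }
    }
    std::fclose(out);
    av_packet_free(&pkt);
    av_bsf_free(&bsf);
    avformat_close_input(&fmt);
    return 0;
}

The output should start with the SPS and PPS followed by the IDR frame, the same order you get from "ffmpeg -i video.mp4 video.h264".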

How to get the SPS struct from an h264 video with libav/ffmpeg

For my HW decoder I need to know additional codec details for a video codec like H264. How can I extract low-level codec info with libav/FFmpeg?
The data that I need: https://www.ffmpeg.org/doxygen/2.7/structSPS.html
You can either look at each NALU in the stream, checking for an SPS ((nalu[0] & 0x1F) == 7), or you can take a look at the AVCC data pointed to by AVCodecContext.extradata and parse it as described here:
http://aviadr1.blogspot.com/2010/05/h264-extradata-partially-explained-for.html
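Note that the SPS struct linked above is an internal FFmpeg structure (it lives in libavcodec's private headers, not the installed public API), so to get at those fields yourself you first have to pull the raw SPS NAL out of the avcC extradata and then parse its Exp-Golomb-coded fields. A rough sketch of walking the avcC layout (also covered in the blog post above); onNal is just a placeholder callback:

#include <cstddef>
#include <cstdint>
#include <functional>

// Calls onNal(data, size, isSps) for every SPS/PPS found in avcC extradata.
bool parseAvcC(const uint8_t* extradata, size_t size,
               const std::function<void(const uint8_t*, size_t, bool)>& onNal) {
    if (size < 7 || extradata[0] != 1) return false;  // configurationVersion must be 1
    size_t off = 5;                                   // skip profile/level/length-size bytes
    size_t numSps = extradata[off++] & 0x1F;          // lower 5 bits: SPS count
    for (size_t i = 0; i < numSps; ++i) {
        if (off + 2 > size) return false;
        size_t len = (extradata[off] << 8) | extradata[off + 1];  // 16-bit big-endian length
        off += 2;
        if (off + len > size) return false;
        onNal(extradata + off, len, true);            // SPS NAL (type 7)
        off += len;
    }
    if (off >= size) return false;
    size_t numPps = extradata[off++];
    for (size_t i = 0; i < numPps; ++i) {
        if (off + 2 > size) return false;
        size_t len = (extradata[off] << 8) | extradata[off + 1];
        off += 2;
        if (off + len > size) return false;
        onNal(extradata + off, len, false);           // PPS NAL (type 8)
        off += len;
    }
    return true;
}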
