I want to use ffmpeg to convert a raw YUV video file into a TS stream video file. So I do this in my code:
avcodec_find_encoder(AV_CODEC_ID_MPEG2TS);
But when I run it, I get this error:
[NULL # 0x8832020] No codec provided to avcodec_open2()
If I change AV_CODEC_ID_MPEG2TS to AV_CODEC_ID_MPEG2VIDEO, it works well and generates an .mpg file that also plays fine. So I want to ask: why can't I use AV_CODEC_ID_MPEG2TS?
I'm also looking into streaming a file with ffmpeg, so I'm not completely sure, but this is my understanding:
MPEG-TS (Transport Stream) is not a codec, it is an encapsulation (container) format, so you have to encode the stream with some codec (I'm not sure whether any codec is allowed) and then you can encapsulate it with MPEG-TS before transmitting it over the network.
If you don't need to transmit the stream over the network, maybe you don't need MPEG-TS at all.
I hope this helps!
Look here: ffmpeg doxygen
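To make that concrete, here is a minimal sketch of the libav* calls involved (error handling omitted; the output name, resolution, and frame rate are placeholders). The codec is AV_CODEC_ID_MPEG2VIDEO, while the MPEG-TS encapsulation is selected through the "mpegts" muxer in libavformat, not through a codec ID:

#include <libavformat/avformat.h>
#include <libavcodec/avcodec.h>

int main(void)
{
    AVFormatContext *oc = NULL;
    /* "mpegts" picks the Transport Stream container (a libavformat muxer) */
    avformat_alloc_output_context2(&oc, NULL, "mpegts", "out.ts");

    /* the codec is still MPEG-2 video; there is no "MPEG2TS" encoder */
    const AVCodec *codec = avcodec_find_encoder(AV_CODEC_ID_MPEG2VIDEO);
    AVStream *st = avformat_new_stream(oc, NULL);
    AVCodecContext *enc = avcodec_alloc_context3(codec);
    enc->width     = 1280;                   /* placeholder resolution */
    enc->height    = 720;
    enc->pix_fmt   = AV_PIX_FMT_YUV420P;
    enc->time_base = (AVRational){1, 25};    /* placeholder frame rate */
    avcodec_open2(enc, codec, NULL);
    avcodec_parameters_from_context(st->codecpar, enc);
    st->time_base = enc->time_base;

    avio_open(&oc->pb, "out.ts", AVIO_FLAG_WRITE);
    avformat_write_header(oc, NULL);
    /* ... feed your YUV frames to the encoder (avcodec_send_frame /
       avcodec_receive_packet) and write each packet with
       av_interleaved_write_frame(oc, &pkt); the muxer does the TS packetisation ... */
    av_write_trailer(oc);
    return 0;
}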
I currently have an IP camera streaming video (RTSP) with H.264 encoding.
I want to use FFmpeg to convert this H.264 encoded stream to another RTSP stream but MPEG-2 encoded. How exactly do I do this? Which commands should I use?
Sure, you can do this. You can use ffmpeg to convert your RTSP stream to an MPEG-2 stream. Take a look at this answer; it solves your problem: https://stackoverflow.com/a/41836896/2000107
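If it helps, an untested sketch of the kind of command I mean (the camera URL, bitrate, and output URL are placeholders; the output side needs an RTSP server that accepts the published stream):

ffmpeg -rtsp_transport tcp -i rtsp://camera/stream -c:v mpeg2video -b:v 4M -f rtsp rtsp://server:8554/out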
I am developing a player based on ffmpeg.
Now I am trying to decode HLS video. The video stream has several programs (AVProgram), separated by quality. I want to select one specific program with the desired quality, but ffmpeg reads packets from all programs (all streams).
How can I tell ffmpeg which streams to read?
Solved by using the discard field in the AVStream structure:
_stream->discard = AVDISCARD_ALL;
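For example, something along these lines (a sketch; fmt_ctx is the opened AVFormatContext and wanted_program is the index of the AVProgram you picked):

#include <libavformat/avformat.h>

/* Discard every stream first, then re-enable only the streams that belong
   to the chosen program, so av_read_frame() stops returning the others. */
static void select_program(AVFormatContext *fmt_ctx, unsigned wanted_program)
{
    for (unsigned i = 0; i < fmt_ctx->nb_streams; i++)
        fmt_ctx->streams[i]->discard = AVDISCARD_ALL;

    AVProgram *prog = fmt_ctx->programs[wanted_program];
    for (unsigned i = 0; i < prog->nb_stream_indexes; i++)
        fmt_ctx->streams[prog->stream_index[i]]->discard = AVDISCARD_DEFAULT;
}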
I have the code of a simple h264 encoder, which outputs a raw 264 file. I want to extend it to directly output the video in a playable container; it doesn't matter which one as long as it is playable by VLC. So, what is the easiest way to include a wrapper around this raw H264 file?
Everywhere I looked on the web, people used ffmpeg and libavformat, but I would prefer to have standalone code. I do not want fancy stuff like audio, subtitles, chapters, etc., just the video stream.
Thanks!
You can output a .264 file directly by writing the elementary stream to a file in Annex B format. That is, write each NALU to the file separated by start codes (0x00000001). But make sure the stream writes SPS and PPS before the first IDR.
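A sketch of that in plain C (no library needed; nalu/nalu_size are whatever your encoder hands you for a single NAL unit, without a start code):

#include <stdio.h>
#include <stdint.h>
#include <stddef.h>

/* Write one NAL unit in Annex B form: 4-byte start code, then the payload.
   Emit the SPS and PPS this way before the first IDR slice. */
static void write_nalu_annexb(FILE *f, const uint8_t *nalu, size_t nalu_size)
{
    static const uint8_t start_code[4] = { 0x00, 0x00, 0x00, 0x01 };
    fwrite(start_code, 1, sizeof start_code, f);
    fwrite(nalu, 1, nalu_size, f);
}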
mkv, mpeg-ts, mp4 (you can use libMP4v2)
I'm looking for a way to mux mjpeg (compressed) video data into a video container like mp4 or avi. (I'll also need to add audio in the future).
Since I use FFmpeg in other parts of my project as well, I'd like to do it using those libraries if possible.
I'm not looking for command-line FFmpeg use!
I've tried the muxing example in ffmpeg, but with it I can only create a (very large) .mjpeg file containing the video data. This is not what I am looking for.
Examples would be very welcome, but a pointer in the right direction also works!
edit:
I have output the yuvj422p stream to JPEG images and I want to put them into an MP4 container. Using the ffmpeg command line, this works:
ffmpeg -i yuvy%01d.jpg -vcodec mjpeg out.mp4
I would like to do this directly in my code (without creating JPEG images first, of course).
I fixed it by doing the following:
I used the muxing example, but instead of calling the encode functions I skipped them and loaded the JPEG data directly into the packet. To set up the output context I used the guess-format functions and set the codec to MJPEG. I set the PTS to a frame counter, since all frames are in chronological order anyway.
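Roughly, the packet-writing part looks like this (a sketch only; error handling omitted, jpeg_buf/jpeg_size/frame_index are placeholders, and the output stream must have been created with codec_id set to AV_CODEC_ID_MJPEG):

#include <libavformat/avformat.h>

/* The JPEG data is already compressed, so it is wrapped in an AVPacket and
   handed straight to the muxer; no encode call is involved. */
static int write_jpeg_frame(AVFormatContext *oc, AVStream *st,
                            uint8_t *jpeg_buf, int jpeg_size,
                            int64_t frame_index)
{
    AVPacket *pkt = av_packet_alloc();
    pkt->data = jpeg_buf;              /* one complete JPEG image */
    pkt->size = jpeg_size;
    pkt->stream_index = st->index;
    pkt->flags |= AV_PKT_FLAG_KEY;     /* every MJPEG frame is a keyframe */
    /* a simple frame counter as PTS, rescaled to the stream time base
       (assuming 25 fps here) */
    pkt->pts = pkt->dts = av_rescale_q(frame_index,
                                       (AVRational){1, 25}, st->time_base);
    int ret = av_interleaved_write_frame(oc, pkt);
    av_packet_free(&pkt);
    return ret;
}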
Three major steps:
decode (using avcodec_decode_video())
convert the raw frame to yuv420p format (using sws_scale())
encode (using avcodec_encode_video())
I can provide sample code if you need it.
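For the middle step, the conversion looks roughly like this (a sketch; src is a decoded AVFrame in whatever pixel format the decoder produced):

#include <libswscale/swscale.h>
#include <libavutil/frame.h>

/* Convert a decoded frame to yuv420p so it can be fed to the encoder. */
static AVFrame *to_yuv420p(const AVFrame *src)
{
    AVFrame *dst = av_frame_alloc();
    dst->format = AV_PIX_FMT_YUV420P;
    dst->width  = src->width;
    dst->height = src->height;
    av_frame_get_buffer(dst, 0);

    struct SwsContext *sws = sws_getContext(src->width, src->height, src->format,
                                            dst->width, dst->height, AV_PIX_FMT_YUV420P,
                                            SWS_BILINEAR, NULL, NULL, NULL);
    sws_scale(sws, (const uint8_t * const *)src->data, src->linesize,
              0, src->height, dst->data, dst->linesize);
    sws_freeContext(sws);
    return dst;
}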
Problem:
I have to save live video stream data that comes as RTP packets from an RTSP server.
The data comes in two formats: MPEG-4 and H.264.
I do not want to encode/decode the input stream.
I just want to write it to a file that is playable with the proper codecs.
Any advice?
Best Wishes
History:
My solutions and their problems:
First attempt: FFmpeg
I use the FFmpeg library to get the audio and video RTP packets.
But in order to write the packets I have to use av_write_frame,
which seems to mean that decoding/encoding takes place.
Also, when I set the output format to mp4 (av_guess_format("mp4", NULL, NULL)),
the output file is unplayable.
[Anyway, ffmpeg has poor documentation; it is hard to find out what is wrong.]
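For clarity, here is a stripped-down sketch of what I am doing in that first attempt (stream copy only, no transcode; the input URL and output file name are placeholders, error handling omitted):

#include <libavformat/avformat.h>
#include <libavcodec/avcodec.h>

int main(void)
{
    AVFormatContext *in = NULL, *out = NULL;

    avformat_open_input(&in, "rtsp://server/stream", NULL, NULL);
    avformat_find_stream_info(in, NULL);
    avformat_alloc_output_context2(&out, NULL, "mp4", "record.mp4");

    /* copy the codec parameters of every input stream; nothing is re-encoded */
    for (unsigned i = 0; i < in->nb_streams; i++) {
        AVStream *os = avformat_new_stream(out, NULL);
        avcodec_parameters_copy(os->codecpar, in->streams[i]->codecpar);
        os->codecpar->codec_tag = 0;
    }

    avio_open(&out->pb, "record.mp4", AVIO_FLAG_WRITE);
    avformat_write_header(out, NULL);

    AVPacket pkt;
    while (av_read_frame(in, &pkt) >= 0) {
        av_packet_rescale_ts(&pkt,
                             in->streams[pkt.stream_index]->time_base,
                             out->streams[pkt.stream_index]->time_base);
        av_interleaved_write_frame(out, &pkt);
        av_packet_unref(&pkt);
    }

    av_write_trailer(out);   /* without this the mp4 index is never written */
    avio_closep(&out->pb);
    avformat_close_input(&in);
    return 0;
}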
Second attempt: DirectShow
Then I decided to use DirectShow. I found an RTSP source filter,
then a mux and a file writer,
and created a single graph:
RTSP Source --> MPEG Mux --> File Writer
It worked, but the problem is that the output file is not playable if the graph is not stopped properly. If something happens (for example, the graph crashes), the output file is not playable.
Also, I am able to write H264 data, but the resulting video is completely unplayable.
The MP4 file format has an index that is required for correct playback, and the index can only be created once you've finished recording. So any solution using MP4 container files (and other indexed files) is going to suffer from the same problem. You need to stop the recording to finalise the file, or it will not be playable.
One solution that might help is to break the graph up into two parts, so that you can keep recording to a new file while stopping the current one. There's an example of this at www.gdcl.co.uk/gmfbridge.
If you try the GDCL MP4 multiplexor and you are having problems with H264 streams, see the related question: GDCL Mpeg-4 Multiplexor Problem.