I know we can combine videos using ffmpeg, but can I use ffmpeg to extract the H.264 content from a decrypted RTP stream? If so, how?
Related
Can ffmpeg take the G.729-encoded payload packets from RTP and make a playable WAV file out of them?
If so, do I just need to create a basic binary file with just the payload bytes in it?
Thank you.
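One approach worth trying (a sketch, not a verified recipe): describe the RTP session in an SDP file and hand that to ffmpeg as input, whitelisting the protocols it needs. The file name stream.sdp and the output names below are placeholders.
ffmpeg -protocol_whitelist file,udp,rtp -i stream.sdp -an -c:v copy captured.h264
ffmpeg -protocol_whitelist file,udp,rtp -i stream.sdp -vn -acodec pcm_s16le captured.wav
The first line copies the H.264 elementary stream out of the RTP session without re-encoding; the second decodes an audio payload (ffmpeg ships a G.729 decoder) to PCM in a WAV file, so writing the raw payload bytes to a binary file yourself should not be necessary.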
The issue: I need to convert an H.264 stream sent over RTP into MJPEG, but for very convoluted reasons I am required to use the libjpeg-turbo library, not the MJPEG encoder that comes with ffmpeg. So the only thing FFmpeg needs to do is convert the H.264 RTP stream to rawvideo in RGBA and output it to a socket, where I then do the transcoding manually.
However, libjpeg-turbo expects complete frames, which means I need to collect the rawvideo packet fragments and somehow synchronize them. Putting the incoming raw video fragments into a buffer as they arrive results in heavily broken images.
Is there some way of saving the header information of the initial H.264 RTP packets? The command I'm currently using is very straightforward:
-i rtsp://: -vcodec rawvideo -f rawvideo udp://:
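If the consumer reads from a byte stream rather than from lossy UDP datagrams, re-framing becomes trivial, because rawvideo RGBA frames have a fixed size of width × height × 4 bytes. A hedged sketch (the camera URL is a placeholder):
ffmpeg -i rtsp://<camera>/stream -an -pix_fmt rgba -f rawvideo pipe:1
Reading exactly width*height*4 bytes at a time from that pipe yields one complete frame per read, which can then be handed to libjpeg-turbo. Rawvideo itself carries no headers or frame markers, so byte counting over a lossless transport is what keeps frames aligned.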
Here's how I stream MPEG-TS to a relay using ffmpeg:
ffmpeg -re -i out.ts -f mpegts -vcodec copy -acodec copy http://localhost:8081/secret
My question is about the internals of ffmpeg: I want to understand the core process of how ffmpeg streams MPEG-TS. What does it do to the file in order to stream it? Does it manipulate the bytes it streams, or does it stream them as-is?
In this case, the transport stream is parsed, and the audio and video elementary streams are read and depacketized. They are then repacketized, remuxed into a new transport stream, and sent over HTTP.
If you changed containers, the elementary streams might be converted to a slightly different format, depending on the codec and on how the container stores global headers, before being remuxed.
And if you transcoded, the elementary streams would have been decoded to raw pixels and PCM, then re-encoded into new elementary streams.
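For contrast, a hedged sketch of the two cases using the same source and destination as in the question:
ffmpeg -re -i out.ts -c:v copy -c:a copy -f mpegts http://localhost:8081/secret
ffmpeg -re -i out.ts -c:v libx264 -c:a aac -f mpegts http://localhost:8081/secret
The first only demuxes and remuxes, so the compressed elementary streams pass through untouched apart from repacketization; the second decodes to raw frames and PCM and re-encodes them before remuxing.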
I currently have an IP camera streaming video (RTSP) with H.264 encoding.
I want to use FFmpeg to convert this H.264 encoded stream to another RTSP stream but MPEG-2 encoded. How exactly do I do this? Which commands should I use?
Sure, you can do this. You can use ffmpeg to convert your RTSP stream to an MPEG-2 stream. Take a look at this answer; it solves your problem: https://stackoverflow.com/a/41836896/2000107
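As a starting point, a hedged sketch (the addresses, port, and bitrate are placeholders, and the receiving RTSP server must accept published streams):
ffmpeg -i rtsp://<camera-ip>/stream -c:v mpeg2video -b:v 4M -an -f rtsp rtsp://<server-ip>:8554/out
This decodes the incoming H.264 video, re-encodes it with the mpeg2video encoder, and pushes the result to the target RTSP URL.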
I have a query regarding using ffmpeg to encode a raw video (YUV sequence) to raw Theora packets,
i.e. some kind of 'elementary bit-stream' without the Ogg container.
I am able to use ffmpeg to encode a raw video to an Ogg Theora bit stream, but I need to obtain a Theora bit stream consisting of raw Theora packets with no Ogg container headers.
1) How can I achieve this?
2) If not with ffmpeg, is there any other way/solution/tool to obtain what I need?
Thank you.
-AD.
I tried this out and think it will do what you're looking for; I can tell it's not in an Ogg container but haven't found a good way to play it back.
ffmpeg -i inputfile -vcodec libtheora -f rawvideo outputfile