For several days now I've been searching for a way to convert an MJPEG RTP stream to an MP4 RTP stream.
I already tried something like this:
ffmpeg -i rtsp://192.168.10.8:554/stream1/mobotix.mjpeg -rtsp_transport tcp -f H264 udp://192.168.10.5:8554
ffmpeg then produces output as if it's doing something:
frame= 612 fps= 11 q=25.0 size= 3243kB time=00:00:56.00 bitrate= 474.4kbits/s dup=275 drop=0 speed=0.981x
Then I tried to open udp://192.168.10.5:8554 with VLC, but nothing plays; the loading bar just runs back and forth forever.
Do I need something like Simple Realtime Server (SRS, https://github.com/ossrs/srs) and then stream to that?
Ideally, ffmpeg could host the RTP stream itself...
Here is what I used to stream a local mkv to RTP
ffmpeg -re -thread_queue_size 4 -i input.mkv -strict -2 -vcodec copy -an -f rtp rtp://127.0.0.1:6005 -acodec copy -vn -sdp_file my_sdp_file -f rtp rtp://127.0.0.1:7005
I then had to copy the generated sdp file to the client and used ffmpeg to save the stream to disk
ffmpeg -protocol_whitelist "file,rtp,udp" -i my_sdp_file -strict -2 saved_rtp_stream.mp4
For completeness, here are the contents of the sdp file
$ cat my_sdp_file
SDP:
v=0
o=- 0 0 IN IP4 127.0.0.1
s=No Name
t=0 0
a=tool:libavformat 57.83.100
m=video 6005 RTP/AVP 96
c=IN IP4 127.0.0.1
a=rtpmap:96 H264/90000
a=fmtp:96 packetization-mode=1; sprop-parameter-sets=Z01AHuiAWh7f+AEAANiAAAH0gABdwHAwABAFgABVc0lGAPFi0SA=,aOvssg==; profile-level-id=4D401E
m=audio 7005 RTP/AVP 97
c=IN IP4 127.0.0.1
a=rtpmap:97 MPEG4-GENERIC/48000/2
a=fmtp:97 profile-level-id=1;mode=AAC-hbr;sizelength=13;indexlength=3;indexdeltalength=3; config=1190
Related
I am building a service that needs to convert RTP streams into HLS streams. The requirements recently shifted, and I now have to create an RTMP stream from the RTP stream and then convert the RTMP to HLS using two separate FFmpeg processes. The problem is that the RTP to RTMP process doesn't actually output anything to the specified RTMP URL.
Going directly from RTP to HLS with the following command (some options removed for brevity) works as expected:
ffmpeg -f sdp \
-protocol_whitelist file,udp,rtp \
-i example.sdp \
-g 2 \
-hls_time 2.0 \
-hls_list_size 5 \
-vcodec libx264 \
-acodec aac \
-f hls chunks/test-master.m3u8
However, converting RTP to RTMP with the following command yields no output, nor does it seem to be receiving any input despite the use of an identical SDP file:
ffmpeg -f sdp \
-protocol_whitelist pipe,udp,rtp \
-i example.sdp \
-g 2 \
-vcodec libx264 \
-acodec aac \
-f flv rtmp://localhost/test-stream
This is an example of what the SDP file looks like:
v=0
o=- 0 0 IN IP4 127.0.0.1
s=Test
c=IN IP4 127.0.0.1
t=0 0
m=audio 37000 RTP/AVPF 97
a=rtpmap:97 opus/48000/2
a=fmtp:97 minptime=10;useinbandfec=1
m=video 37002 RTP/AVPF 96
a=rtpmap:96 H264/90000
a=fmtp:96 level-asymmetry-allowed=1;packetization-mode=1;profile-level-id=64001E
MediaSoup generates the RTP stream, and the ports and host match up. I've verified that there is actually a stream of data coming through the ports in question using nc. There are no error messages. Am I missing something obvious here?
MediaSoup is built as an SFU (Selective Forwarding Unit). I can see that you're using:
m=video 37002 RTP/AVPF 96
a=rtpmap:96 H264/90000
a=fmtp:96 level-asymmetry-allowed=1;packetization-mode=1;profile-level-id=64001E
You need to make sure the consumer for the video stream is using the same video codec; you can check this with console.log(consumer.rtpParameters.codecs);
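A quick way to cross-check this from the shell is to pull the advertised codecs straight out of the SDP and compare them with what console.log(consumer.rtpParameters.codecs) prints. A minimal sketch (the SDP content is copied from the question; payload type, codec name and clock rate must all match what mediasoup is actually sending):

```shell
# Recreate the SDP from the question, then extract its media and codec
# lines for comparison with the mediasoup consumer's rtpParameters.
cat > example.sdp <<'EOF'
v=0
o=- 0 0 IN IP4 127.0.0.1
s=Test
c=IN IP4 127.0.0.1
t=0 0
m=audio 37000 RTP/AVPF 97
a=rtpmap:97 opus/48000/2
m=video 37002 RTP/AVPF 96
a=rtpmap:96 H264/90000
EOF
# Prints the m= and a=rtpmap lines, i.e. ports, payload types and codecs.
grep -E '^(m=|a=rtpmap)' example.sdp
```

Also double-check the -protocol_whitelist value: when the SDP is read from a file on disk, the whitelist has to include file (as in the working HLS command), not just pipe,udp,rtp.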
I need some help getting ffplay to receive and decode a Real Time stream encoded in h264.
I'm trying to make a point-to-point stream between computer A receiving video frames from a Kinect and computer B running ffplay to show the livestream.
These are the commands I'm running on both computers.
Computer A (RPI 3)
ffmpeg -f rawvideo -vcodec rawvideo -pix_fmt rgb24 -s 640x480 -i - -threads 4 -preset ultrafast -codec:v libx264 -an -f rtp rtp://192.168.0.100:2000
This is what ffmpeg outputs:
fps= 14 q=12.0 size=856kB time=00:00:05.56 bitrate=1261.4kbits/s speed=0.54x
The output stream runs at 10-20 fps. It's not great, but I can work with that.
Computer B
ffplay -protocol_whitelist "file,udp,rtp" -probesize 32 -sync ext -i streaming.sdp
streaming.sdp
v=0
o=- 0 0 IN IP4 127.0.0.1
s=No Name
c=IN IP4 192.168.0.100
t=0 0
a=tool:libavformat 57.56.100
m=video 2000 RTP/AVP 96
a=rtpmap:96 H264/90000
a=fmtp:96 packetization-mode=1
I'm getting the stream, but at about 0.0001 fps, which is clearly bad. My guess is that I'm missing something in the ffplay command, since ffmpeg reports a more stable and faster stream, but I can't seem to find what.
The problem wasn't ffmpeg but the code I wrote to grab data from the device: I was receiving the same frame multiple times and blocking the capture thread, which made most frames duplicates of the first one.
I am streaming a webcam/audio with the command:
ffmpeg.exe -f dshow -framerate 30 -i video="xxx" -c:v libx264 -an -f rtp rtp://localhost:50041 -f dshow -i audio="xxx" -c:a aac -vn -f rtp rtp://localhost:50043
This outputs the following sdp info:
v=0
o=- 0 0 IN IP4 127.0.0.1
s=No Name
t=0 0
a=tool:libavformat 57.65.100
m=video 50041 RTP/AVP 96
c=IN IP6 ::1
a=rtpmap:96 H264/90000
a=fmtp:96 packetization-mode=1
m=audio 50043 RTP/AVP 97
c=IN IP6 ::1
b=AS:128
a=rtpmap:97 MPEG4-GENERIC/44100/2
a=fmtp:97 profile-level-id=1;mode=AAC-hbr;sizelength=13;indexlength=3;indexdeltalength=3; config=121056E500
And I read the stream with the command:
ffmpeg.exe -protocol_whitelist file,udp,rtp -i D:\test.sdp -c:v libx264 -c:a aac d:\out.mp4
In the resulting file, the audio is slightly ahead of the video. I have read that RTCP runs on the RTP port + 1, and contains synchronization information. I don't see any RTCP information in the SDP file though.
Do I need to specify something to include RTCP?
If that's not the issue, what else can I do to sync the audio and video?
Not sure if RTCP is your issue, but I would start by trying a single DirectShow input split into two outputs, like this:
ffmpeg.exe -f dshow -framerate 30 -i video="XX":audio="YY" -an -vcodec libx264 -f rtp rtp://localhost:50041 -acodec aac -vn -f rtp rtp://localhost:50043
The ffmpeg DirectShow documentation mentions synchronization issues when multiple inputs are used. It also suggests trying the -copy_ts flag to resolve sync issues if you want to keep the inputs separate.
I have a live RTP stream with H.264 video that I want to copy to a file. I use:
avconv -i rtp://#192.168.0.34:60005 -an -acodec copy -vcodec copy abc.mp4
But I have an error:
[rtp # 0x1f6cfe0] Unable to receive RTP payload type 96 without an SDP file describing it
That's expected, because avconv doesn't know what's inside the stream.
My sdp file:
v=0
o=- 20966096445 1 IN IP4 0.0.0.0
t=0 0
a=type:broadcast
a=control:*
a=x-qt-text-nam:brovotech
a=x-qt-text-inf:live/sub
a=range:npt=0-
m=video 0 RTP/AVP 96
c=IN IP4 0.0.0.0
b=AS:8
a=rtpmap:96 H264/90000
a=fmtp:96 packetization-mode=1;profile-level-id=4d001e;sprop-parameter-sets=Z00AHpWoLASZ,aO48gA==
How can I attach sdp file for FFmpeg? Or set some arguments that will describe the stream?
Just use avconv -i camera.sdp
camera.sdp:
...
o=- 20966096445 1 IN IP4 my_ip
...
m=video my_port RTP/AVP 96
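Putting the pieces together, a minimal sketch of building camera.sdp from the SDP in the question (using the address and port from the original command; substitute your own):

```shell
# Address and port taken from the rtp:// URL in the question;
# replace these with your camera's actual values.
MY_IP=192.168.0.34
MY_PORT=60005
# Write the session description, substituting the real address and port
# for the 0.0.0.0 / port-0 placeholders the camera's SDP contained.
cat > camera.sdp <<EOF
v=0
o=- 20966096445 1 IN IP4 $MY_IP
t=0 0
m=video $MY_PORT RTP/AVP 96
c=IN IP4 $MY_IP
a=rtpmap:96 H264/90000
a=fmtp:96 packetization-mode=1;profile-level-id=4d001e;sprop-parameter-sets=Z00AHpWoLASZ,aO48gA==
EOF
# Then record the stream, e.g. (modern ffmpeg additionally requires the
# protocol whitelist when the input is an SDP file):
# ffmpeg -protocol_whitelist file,rtp,udp -i camera.sdp -an -vcodec copy abc.mp4
```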
I am streaming live video using rtp and ffmpeg using this command:
ffmpeg -re -f v4l2 -framerate 30 -video_size 640x480 -i /dev/video0 -c:v libx265 -tune zerolatency -s 320x240 -preset ultrafast -pix_fmt yuv420p -r 10 -strict experimental -f rtp rtp://127.0.0.1:49170 > ffmpeg.sdp
The generated sdp file is:
v=0
o=- 0 0 IN IP4 127.0.0.1
s=No Name
c=IN IP4 127.0.0.1
t=0 0
a=tool:libavformat 56.36.100
m=video 49170 RTP/AVP 96
a=rtpmap:96 H265/90000
Vlc gives the following error:
The format of 'file:///home/username/ffmpeg.sdp' cannot be detected. Have a look at the log for details.
Terminal gives the following error:
[0xaf801010] ps demux error: cannot peek
[0xaf801010] mjpeg demux error: cannot peek
[0xaf801010] mpgv demux error: cannot peek
[0xaf801010] ps demux error: cannot peek
[0xb6d00618] main input error: no suitable demux module for `file/:///home/username/ffmpeg.sdp'
If I simply change libx265 to libx264 in the command and H265 to H264 in the SDP, the stream runs perfectly fine.
However, I need to stream H.265. Any suggestions?
I guess the problem is that VLC (or ffplay) never receives the VPS, SPS and PPS NAL units.
To start decoding an H.265 stream, a decoder needs a VPS, an SPS, a PPS and an IDR frame.
To ask libx265 to repeat these configuration headers before each IDR frame, add this to your streaming command:
-x265-params keyint=30:repeat-headers=1
Then the command becomes :
ffmpeg -re -f v4l2 -framerate 30 -video_size 640x480 -i /dev/video0 -c:v libx265 -tune zerolatency -x265-params keyint=30:repeat-headers=1 -s 320x240 -preset ultrafast -pix_fmt yuv420p -r 10 -strict experimental -f rtp rtp://127.0.0.1:49170 > ffmpeg.sdp
It generates the following ffmpeg.sdp file:
SDP:
v=0
o=- 0 0 IN IP4 127.0.0.1
s=No Name
c=IN IP4 127.0.0.1
t=0 0
a=tool:libavformat 56.36.100
m=video 49170 RTP/AVP 96
a=rtpmap:96 H265/90000
I was able to display the stream with ffplay ffmpeg.sdp and with VLC ffmpeg.sdp (after removing the first SDP: line of ffmpeg.sdp).
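The removal of that first line can be scripted. A minimal sketch (recreating the question's ffmpeg.sdp for illustration; when ffmpeg's stdout is redirected to a file, the dump starts with a literal SDP: line that is log decoration, not part of the session description):

```shell
# ffmpeg.sdp as captured by redirecting ffmpeg's stdout.
cat > ffmpeg.sdp <<'EOF'
SDP:
v=0
o=- 0 0 IN IP4 127.0.0.1
s=No Name
c=IN IP4 127.0.0.1
t=0 0
a=tool:libavformat 56.36.100
m=video 49170 RTP/AVP 96
a=rtpmap:96 H265/90000
EOF
# Delete the first line only if it is the "SDP:" marker, so a clean
# file passed through this command is left untouched.
sed -i '1{/^SDP:/d}' ffmpeg.sdp
head -1 ffmpeg.sdp   # → v=0
```

After this, both ffplay ffmpeg.sdp and VLC should accept the file directly.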
Don't shoot me down in flames, as I don't use VLC for this kind of thing, but I recall that to get GStreamer working with H.265 I had to install:
libde265 and gstreamer1.0-libde265
There is also a vlc-plugin-libde265 package listed in the Ubuntu repositories.
See: https://github.com/strukturag/libde265