I use ffmpeg to send an RTP stream to Kurento Media Server, which then forwards it to browsers via WebRTC.
ffmpeg (h264 RTP) -> Kurento -> (h264 WebRTC) Browser
I'm capturing a virtual Xorg display.
This is my ffmpeg command:
ffmpeg -y -v info -fflags +genpts -f x11grab -draw_mouse 0 -r 25 -s 1280x720 -thread_queue_size 4096 -i :0.0+0,0 -an -c:v libx264 -preset veryfast -crf 25 -g 50 -pix_fmt yuv420p -maxrate 2976k -bufsize 5952k -ssrc 112233 -payload_type 103 -tune zerolatency -f rtp rtp://172.16.1.115:40258
This is my fake sdp offer used in negotiation with kurento RtpEndpoint
v=0
o=- 0 0 IN IP4 127.0.0.1
s=Stream
c=IN IP4 127.0.0.1
t=0 0
m=video 9 RTP/AVP 103
a=rtpmap:103 H264/90000
a=fmtp:103 packetization-mode=1
a=sendonly
a=direction:active
a=ssrc:112233 cname:user#example.com
Here is the problem:
Some I-frames show corruption in the bottom half of the frame, while others are fine.
The picture is sometimes corrected when the next I-frame arrives, but most of the time it stays corrupted.
When choppiness happened, Kms log says:
kmsutils kmsutils.c:483:gap_detection_probe:kmsagnosticbin2-108:sink Stream gap detected, timestamp: 0:51:22.574766908, duration: 0:00:00.000008237
(Screenshots in the original post: a normal stream with no choppiness; a choppy stream; a stream corrected when an I-frame arrives, which sometimes happens.)
I have no clue what could be causing the issue.
Things I tried to solve the problem:
Adding ?pkt_size=1000 to the RTP URL (also 1100, 900 and 1200, since Kurento's default MTU is 1200)
Changing -crf to different values in between 18-35
Changing -preset between medium and ultrafast
Changing framerate
Changing GOP length (when I lower the GOP length -more I-frames- the choppiness becomes shorter in duration but more frequent)
When I disable sliced-threads, there is no issue with the bottom half, but the whole frame freezes in the same scenario
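The attempts above can be sketched as one modified command; the pkt_size value and sliced-threads=0 are example settings from those attempts, not a known fix (note that -tune zerolatency enables sliced threads by default, so -x264-params has to override it):

```shell
ffmpeg -y -v info -fflags +genpts -f x11grab -draw_mouse 0 -r 25 -s 1280x720 \
  -thread_queue_size 4096 -i :0.0+0,0 -an \
  -c:v libx264 -preset veryfast -crf 25 -g 50 -pix_fmt yuv420p \
  -maxrate 2976k -bufsize 5952k -tune zerolatency \
  -x264-params sliced-threads=0 \
  -ssrc 112233 -payload_type 103 \
  -f rtp "rtp://172.16.1.115:40258?pkt_size=1200"
```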
Any help would be appreciated. Thanks.
I tried various ways to write a live-streaming script with a bitrate matching YouTube's recommendation of 4500 Kbps.
The code:
ffmpeg -re -stream_loop -1 -i live1.mp4 -c copy -preset veryfast -b:v 7000k -maxrate 3000k -bufsize 6000k -pix_fmt yuv420p -g 50 -c:a aac -b:a 160k -ac 2 -ar 44100 -f flv -flvflags no_duration_filesize rtmp://a.rtmp.youtube.com/live2/(streamkey)
and when live, my current code produces this error: "Use a keyframe frequency of four seconds or less. Currently, keyframes are sent infrequently, which can cause buffering. The current keyframe frequency is 5.0 seconds. Please note that errors in the transfer process can cause the size of the GOP (group of pictures) to be incorrect."
How to fix my code?
I've tried several ways, but the bitrate is still high and the error still appears in YouTube Studio.
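One likely cause: with -c copy the video is not re-encoded, so -b:v, -preset and -g are ignored and the source file's own 5-second GOP passes through unchanged. A sketch that re-encodes so the keyframe interval actually applies (values assume a 25 fps input, so -g 50 gives a 2-second GOP; adjust to your file):

```shell
ffmpeg -re -stream_loop -1 -i live1.mp4 \
  -c:v libx264 -preset veryfast -b:v 4500k -maxrate 4500k -bufsize 9000k \
  -pix_fmt yuv420p -g 50 -keyint_min 50 -sc_threshold 0 \
  -c:a aac -b:a 160k -ac 2 -ar 44100 \
  -f flv -flvflags no_duration_filesize rtmp://a.rtmp.youtube.com/live2/(streamkey)
```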
I need some help getting ffplay to receive and decode a real-time stream encoded in H264.
I'm trying to make a point-to-point stream between computer A receiving video frames from a Kinect and computer B running ffplay to show the livestream.
These are the commands I'm running on both computers.
Computer A (RPI 3)
ffmpeg -f rawvideo -vcodec rawvideo -pix_fmt rgb24 -s 640x480 -i - -threads 4 -preset ultrafast -codec:v libx264 -an -f rtp rtp://192.168.0.100:2000
This is what ffmpeg outputs:
fps= 14 q=12.0 size=856kB time=00:00:05.56 bitrate=1261.4kbits/s speed=0.54x
The output stream runs at 10-20 fps. It's not good, but I can work with that.
Computer B
ffplay -protocol_whitelist "file,udp,rtp" -probesize 32 -sync ext -i streaming.sdp
streaming.sdp
v=0
o=- 0 0 IN IP4 127.0.0.1
s=No Name
c=IN IP4 192.168.0.100
t=0 0
a=tool:libavformat 57.56.100
m=video 2000 RTP/AVP 96
a=rtpmap:96 H264/90000
a=fmtp:96 packetization-mode=1
I'm getting the stream, but at about 0.0001fps which is clearly bad. My guess is I'm missing something on the ffplay command, since ffmpeg shows a more stable and fast stream, but I can't seem to find what I'm missing.
The problem wasn't in ffmpeg but in the code I wrote to grab data from the device: I was receiving the same frame multiple times and blocking the thread capturing data, which made most of the frames duplicates of the first one.
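The fix described above amounts to never handing the encoder a frame it has already seen. A minimal sketch of that idea (a hypothetical latest-frame buffer, not the actual Kinect capture API): the producer overwrites a single slot, and the consumer only gets a frame whose sequence number is newer than the last one it read, so a stalled producer yields None instead of a duplicate.

```python
import threading

class LatestFrameBuffer:
    """Holds only the most recent frame; stale frames are dropped."""
    def __init__(self):
        self._cond = threading.Condition()
        self._frame = None
        self._seq = -1        # sequence number of the stored frame
        self._last_read = -1  # sequence number last handed out

    def push(self, frame):
        # Producer side: overwrite the slot, never queue up old frames.
        with self._cond:
            self._seq += 1
            self._frame = frame
            self._cond.notify()

    def pop_new(self, timeout=1.0):
        """Block until a frame newer than the last one read arrives."""
        with self._cond:
            if not self._cond.wait_for(lambda: self._seq > self._last_read,
                                       timeout):
                return None  # no new frame: caller must NOT resend the old one
            self._last_read = self._seq
            return self._frame

buf = LatestFrameBuffer()
buf.push(b"frame-0")
assert buf.pop_new() == b"frame-0"
assert buf.pop_new(timeout=0.05) is None   # same frame is never returned twice
buf.push(b"frame-1")
buf.push(b"frame-2")                       # frame-1 is dropped, newest kept
assert buf.pop_new() == b"frame-2"
```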
I'm trying to configure ffmpeg to do a real-time video streaming using a webcam. The ffmpeg encoder command I use is as follows.
ffmpeg -f v4l2 -input_format yuyv422 -s 640x480 -i /dev/video0 -c:v libx264 -profile:v baseline -trellis 0 -subq 1 -level 32 -preset superfast -tune zerolatency -me_method epzs -crf 30 -threads 0 -bufsize 1 -refs 4 -coder 0 -b_strategy 0 -bf 0 -sc_threshold 0 -x264-params vbv-maxrate=2000:slice-max-size=1500:keyint=30:min-keyint=10 -pix_fmt yuv420p -an -f mpegts udp://192.168.1.8:5001
The ffplay command used to display the video feed is,
ffplay -analyzeduration 1 -fflags -nobuffer -i udp://192.168.1.8:5001
However, I'm experiencing 0.5-1.0 s of latency in the video stream. Is there a way to reduce this to under 100 ms? Also, when I replace the v4l2 camera capture with a screen capture using x11grab, the stream is almost real-time and I experience no noticeable delays. Moreover, changing the encoder from x264 to mpeg2 had no effect on the latency. In addition, the statistics from ffmpeg show the encoder performing at 30 fps, which I believe indicates that the encoding is real-time. This leaves me with only one explanation for the delay.
Is there a significant delay in buffers when using v4l2 during video capturing in a webcam?
I don't think the transmission delay is in effect in this case as I see no latencies when screen capture is used under the same conditions.
Can this latency be further reduced? Can someone suggest a different encoder configuration to use instead of the one I've been using?
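On the playback side, a few generic ffplay low-latency flags sometimes shave off buffering delay; this is only a sketch of the player configuration, not a fix for delay introduced by v4l2 capture itself:

```shell
ffplay -fflags nobuffer -flags low_delay -framedrop \
  -probesize 32 -analyzeduration 0 -sync ext \
  -i udp://192.168.1.8:5001
```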
I also had many problems setting up a low-latency video streaming system between an Odroid SBC and a Windows PC. Finally I found settings resulting in approx. 500 ms to at most 1 s of latency.
Setup: ffserver on an Odroid XU4 with Ubuntu 18.04, connected to the network via a wifi dongle. A Windows 10 PC in the same wifi network streams from the Odroid.
I run the following ffserver config (/etc/ffserver.conf) on my odroid
HTTPPort 8090
HTTPBindAddress 0.0.0.0
MaxClients 1000
MaxBandwidth 10000
<Feed feed1.ffm>
File /tmp/feed1.ffm
FileMaxSize 50M
ACL allow 127.0.0.1
ACL allow localhost
</Feed>
<Stream test1.asf>
Format asf
Feed feed1.ffm
VideoFrameRate 30
VideoSize 640x480
VideoBitRate 600
#VideoBufferSize 400
VideoQMin 1
VideoQMax 20
NoAudio
ACL allow 127.0.0.1
ACL allow localhost
ACL allow 192.168.0.0 192.168.255.255
</Stream>
<Stream stat.html>
Format status
ACL allow 127.0.0.1
ACL allow localhost
</Stream>
and start the camera stream on the odroid with
ffserver -f /etc/ffserver.conf & ffmpeg -f v4l2 -s 640x480 -r 15 -i /dev/video0 -vcodec libx265 -threads 2 -tune zerolatency http://localhost:8090/feed1.ffm
On my Windows PC I tried several settings to get low latency. With VLC-Player I could not manage anything below 8 to 10 seconds.
With the following ffplay command I got about 500ms latency:
ffplay -fflags -nobuffer -probesize 32 -i mmsh://ubuntu1804:8090/test1.asf
So -sync ext and -analyzeduration 1 did not help in reducing the latency.
The "stream production" on the Odroid also runs with the same low latency when using libx264 instead of libx265 and removing the -threads 2 flag. But increasing the framerate to 30, or increasing the resolution, leads to significant delays.
I used the same send command, tried the following with ffplay, and it worked for me:
ffplay -analyzeduration 1 -fflags -nobuffer -probesize 32 -sync ext -i rtmp://localhost/live/STREAM_NAME
I am streaming live video using rtp and ffmpeg using this command:
ffmpeg -re -f v4l2 -framerate 30 -video_size 640x480 -i /dev/video0 -c:v libx265 -tune zerolatency -s 320x240 -preset ultrafast -pix_fmt yuv420p -r 10 -strict experimental -f rtp rtp://127.0.0.1:49170 > ffmpeg.sdp
The generated sdp file is:
v=0
o=- 0 0 IN IP4 127.0.0.1
s=No Name
c=IN IP4 127.0.0.1
t=0 0
a=tool:libavformat 56.36.100
m=video 49170 RTP/AVP 96
a=rtpmap:96 H265/90000
Vlc gives the following error:
The format of 'file:///home/username/ffmpeg.sdp' cannot be detected. Have a look at the log for details.
Terminal gives the following error:
[0xaf801010] ps demux error: cannot peek
[0xaf801010] mjpeg demux error: cannot peek
[0xaf801010] mpgv demux error: cannot peek
[0xaf801010] ps demux error: cannot peek
[0xb6d00618] main input error: no suitable demux module for `file/:///home/username/ffmpeg.sdp'
If I simply change libx265 to libx264 in the command, and H265 to H264 in the SDP, the stream runs perfectly fine.
However, I need to stream H265. Any suggestions?
I guess the problem is that VLC (or ffplay) doesn't get the VPS, SPS and PPS frames.
In order to start decoding an H265 stream, the decoder needs a VPS, an SPS, a PPS and an IDR frame.
To ask libx265 to repeat these configuration frames before each IDR frame, you can add this to your streaming command:
-x265-params keyint=30:repeat-headers=1
Then the command becomes :
ffmpeg -re -f v4l2 -framerate 30 -video_size 640x480 -i /dev/video0 -c:v libx265 -tune zerolatency -x265-params keyint=30:repeat-headers=1 -s 320x240 -preset ultrafast -pix_fmt yuv420p -r 10 -strict experimental -f rtp rtp://127.0.0.1:49170 > ffmpeg.sdp
It generates the following ffmpeg.sdp file:
SDP:
v=0
o=- 0 0 IN IP4 127.0.0.1
s=No Name
c=IN IP4 127.0.0.1
t=0 0
a=tool:libavformat 56.36.100
m=video 49170 RTP/AVP 96
a=rtpmap:96 H265/90000
I was able to display the stream with ffplay ffmpeg.sdp and with VLC ffmpeg.sdp (after removing the first line, SDP:, from ffmpeg.sdp).
Don't shoot me down in flames, as I do not use VLC for this type of thing but I recall that to get Gstreamer working with H265, I had to install:
libde265 and gstreamer1.0-libde265
There is also a vlc-plugin-libde265 listed in the ubuntu repositories.
See: https://github.com/strukturag/libde265
I'm using FFmpeg on Windows with DirectShow.
I'm streaming RTMP (command below) and I need very low latency.
Once run, I get the following error: [dshow @ 024ce800] real-time buffer 204% full! frame dropped!
ffmpeg -threads 6 -f dshow -i video=UScreenCapture -s 1920x1080 -an -vcodec libx264 -x264opts keyint=25:min-keyint=20 -b:v 1024k -preset ultrafast -tune zerolatency -crf 22 -r 10 -pix_fmt yuv420p -f flv rtmp://server...
Do you have an idea how to handle this kind of error?
Thanks
Ronen
That message means "dshow got an incoming packet, but you haven't finished sending the previous packet yet," so in reality this should be contributing to as low a latency as possible. If your goal is to avoid dropping packets, then increase rtbufsize. With RTMP, hopefully there will be some improvements soon so it has better throughput.
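Increasing the dshow buffer might look like this; 100M is an example size, and the option must appear before -i because it applies to the capture device:

```shell
ffmpeg -f dshow -rtbufsize 100M -i video=UScreenCapture \
  -s 1920x1080 -an -vcodec libx264 -x264opts keyint=25:min-keyint=20 \
  -b:v 1024k -preset ultrafast -tune zerolatency -crf 22 -r 10 \
  -pix_fmt yuv420p -f flv rtmp://server...
```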