Live streaming from h.264 IP camera to Web browser - ffmpeg

I want to live stream video from an h.264/h.265 IP camera to a browser with little to no delay and in decent quality (Full HD). I know there are a couple of questions like this one, but the answers seem to be either incomplete or outdated. So far I've tried ffmpeg and ffserver and had some success with them, but there are problems:
When I stream to mjpg the quality isn't great; if I use webm the quality is better, but there is significant delay (approx. 5 seconds), probably due to transcoding from h264 to vp9. How can I improve it? Is it possible to stream h264 without transcoding it to a different format? Are there any better solutions than ffserver and ffmpeg?
Here is the config I've used for mjpg:
ffmpeg -rtsp_transport tcp -i rtsp://rtsp_user:Rtsp_pass@192.168.3.83:554/Streaming/Channels/101 -q:v 3 http://localhost:8090/feed3.ffm
on ffserver:
<Feed feed3.ffm>
File /tmp/feed3.ffm
FileMaxSize 1G
ACL allow 127.0.0.1
</Feed>
<Stream cam3.mjpg>
Feed feed3.ffm
Format mpjpeg
VideoCodec mjpeg
VideoFrameRate 25
VideoIntraOnly
VideoBufferSize 8192
VideoBitRate 8192
VideoSize 1920x1080
VideoQMin 5
VideoQMax 15
NoAudio
Strict -1
</Stream>
And for webm:
ffmpeg -rtsp_transport tcp -i rtsp://rtsp_user:Rtsp_pass@192.168.3.83:554/Streaming/Channels/101 -c:v libvpx http://127.0.0.1:8090/feed4.ffm
ffserver:
<Stream cam4.webm>
Feed feed4.ffm
Format webm
# Audio settings
NoAudio
# Video settings
VideoCodec libvpx
VideoSize 720x576
VideoFrameRate 25
AVOptionVideo qmin 10
AVOptionVideo qmax 42
AVOptionAudio flags +global_header
PreRoll -1
StartSendOnKey
VideoBitRate 400
</Stream>
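For reference, one way to sidestep the transcoding question entirely is to remux the camera's h.264 stream into HLS segments and play them in the browser with a JavaScript player such as hls.js: the video stays h264, only the container changes. A minimal sketch (the output directory and the short-segment settings are assumptions, and HLS still adds a few seconds of buffering from segmenting):
ffmpeg -rtsp_transport tcp -i rtsp://rtsp_user:Rtsp_pass@192.168.3.83:554/Streaming/Channels/101 -c copy -an -f hls -hls_time 1 -hls_list_size 3 -hls_flags delete_segments /var/www/html/stream/cam.m3u8
Any static web server can then serve cam.m3u8 and the .ts segments to the page.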

Related

very low latency streaming with ffmpeg using a webcam

I'm trying to configure ffmpeg to do real-time video streaming using a webcam. The ffmpeg encoder command I use is as follows.
ffmpeg -f v4l2 -input_format yuyv422 -s 640x480 -i /dev/video0 -c:v libx264 -profile:v baseline -trellis 0 -subq 1 -level 32 -preset superfast -tune zerolatency -me_method epzs -crf 30 -threads 0 -bufsize 1 -refs 4 -coder 0 -b_strategy 0 -bf 0 -sc_threshold 0 -x264-params vbv-maxrate=2000:slice-max-size=1500:keyint=30:min-keyint=10: -pix_fmt yuv420p -an -f mpegts udp://192.168.1.8:5001
The ffplay command used to display the video feed is,
ffplay -analyzeduration 1 -fflags -nobuffer -i udp://192.168.1.8:5001
However, I'm experiencing 0.5-1.0 s of latency in the video stream. Is there a way to reduce this to under 100 ms? Also, when I replace the v4l2 camera capture with a screen capture using x11grab, the stream is almost real-time and I experience no noticeable delays. Moreover, changing the encoder from x264 to mpeg2 had no effect on the latency. In addition, the statistics from ffmpeg show that the encoder is performing at a 30fps rate, which I believe indicates that the encoding is real-time. This leaves me with only one reason for the experienced delay.
Is there a significant delay in buffers when using v4l2 during video capturing in a webcam?
I don't think the transmission delay is in effect in this case as I see no latencies when screen capture is used under the same conditions.
Can this latency be further reduced? Can someone think of a different encoder configuration to use instead of the one I've been using?
I also had many problems setting up a low-latency video streaming system between an Odroid SBC and a Windows PC. Finally I found settings resulting in approx. 500 ms to at most 1 s latency.
Setup: ffserver on an Odroid XU4 with Ubuntu 18.04, connected to the network via a wifi dongle. A Windows 10 PC in the same wifi network streams from the Odroid.
I run the following ffserver config (/etc/ffserver.conf) on my Odroid:
HTTPPort 8090
HTTPBindAddress 0.0.0.0
MaxClients 1000
MaxBandwidth 10000
<Feed feed1.ffm>
File /tmp/feed1.ffm
FileMaxSize 50M
ACL allow 127.0.0.1
ACL allow localhost
</Feed>
<Stream test1.asf>
Format asf
Feed feed1.ffm
VideoFrameRate 30
VideoSize 640x480
VideoBitRate 600
#VideoBufferSize 400
VideoQMin 1
VideoQMax 20
NoAudio
ACL allow 127.0.0.1
ACL allow localhost
ACL allow 192.168.0.0 192.168.255.255
</Stream>
<Stream stat.html>
Format status
ACL allow 127.0.0.1
ACL allow localhost
</Stream>
and start the camera stream on the odroid with
ffserver -f /etc/ffserver.conf & ffmpeg -f v4l2 -s 640x480 -r 15 -i /dev/video0 -vcodec libx265 -threads 2 -tune zerolatency http://localhost:8090/feed1.ffm
On my Windows PC I tried several settings to get low latency. With VLC-Player I could not manage anything below 8 to 10 seconds.
With the following ffplay command I got about 500ms latency:
ffplay -fflags -nobuffer -probesize 32 -i mmsh://ubuntu1804:8090/test1.asf
So -sync ext and -analyzeduration 1 did not help in reducing the latency.
The "stream production" on the Odroid also runs with the same low latency when using libx264 instead of libx265 and removing the -threads 2 flag. But increasing the framerate to 30 or even increasing the resolution leads to significant delays.
I used the same send instruction, and I tried this with ffplay and it worked for me:
ffplay -analyzeduration 1 -fflags -nobuffer -probesize 32 -sync ext -i rtmp://localhost/live/STREAM_NAME
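Going back to the original webcam question: if the remaining delay is in the v4l2 capture path rather than in the encoder, two things that are sometimes worth trying are requesting the camera's compressed MJPEG mode (see what it offers with v4l2-ctl --list-formats-ext) and disabling all buffering on the player side. A sketch, assuming the webcam actually exposes MJPEG at 640x480:
ffmpeg -f v4l2 -input_format mjpeg -framerate 30 -video_size 640x480 -i /dev/video0 -c:v libx264 -preset ultrafast -tune zerolatency -bf 0 -g 30 -pix_fmt yuv420p -an -f mpegts udp://192.168.1.8:5001
ffplay -fflags nobuffer -flags low_delay -framedrop -probesize 32 -analyzeduration 0 -i udp://192.168.1.8:5001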

ffmpeg error: Data doesn't look like RTP packets, make sure the RTP muxer is used

I am trying to stream both video & audio from a USB cam & mic through ffmpeg over ffserver.
I got 2 errors:
- ffmpeg seems to be functioning but shows "Data doesn't look like RTP packets, make sure the RTP muxer is used"
- I can connect to ffserver only for static files
Here is the server.conf file:
HTTPPort 1235
RTSPPort 1234
HTTPBindAddress 0.0.0.0
MaxHTTPConnections 2000
MaxClients 1000
MaxBandwidth 100000
#CustomLog -
########################################
## static file for testing
########################################
#HTTP requests
<Stream media.flv>
File "/home/username/media.flv"
Format flv
</Stream>
#RTSP requests
<Stream media.mpg>
#preconverted file:
File "/home/username/media.mpg"
Format rtp
VideoFrameRate 30
VideoCodec libx264
VideoSize 720x720
StartSendOnKey
Preroll 0
</Stream>
##################################################
## usb cam
###################################################
<Feed test.ffm>
File /tmp/test.ffm
FileMaxSize 20K
ACL allow 192.168.1.149
</Feed>
<Stream usbcam.mpg>
Feed test.ffm
Format rtp
VideoFrameRate 25
VideoCodec libx264
VideoSize 720x720
PreRoll 0
StartSendOnKey
</Stream>
My ffmpeg command is
ffmpeg -s 720x720 -f video4linux2 -i /dev/video0 -r 25 -f alsa -i hw:0 -c:v libx264 -c:a aac -strict -2 rtp://192.168.1.149:1234/test.ffm
It seems to work but shows this error:
"Data doesn't look like RTP packets, make sure the RTP muxer is used"
When I stream the static files it works, but when I try to play the usbcam stream through ffplay and VLC nothing works.
Thank you in advance.
You can try telling ffmpeg what your output muxer format is (-f rtp):
ffmpeg -s 720x720 -f video4linux2 -i /dev/video0 -r 25 -f alsa -i hw:0 -c:v libx264 -c:a aac -strict -2 -f rtp rtp://192.168.1.149:1234/test.ffm
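That said, the usual ffserver workflow is to push the feed to ffserver's HTTP port (1235 in the config above) and let ffserver serve the RTSP stream itself, rather than sending RTP at it directly. A sketch using the feed and stream names from the config, assuming the machine running ffmpeg is allowed by the feed's ACL:
ffmpeg -s 720x720 -f video4linux2 -i /dev/video0 -r 25 -f alsa -i hw:0 -c:v libx264 -c:a aac -strict -2 http://192.168.1.149:1235/test.ffm
Clients would then play rtsp://192.168.1.149:1234/usbcam.mpg.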

FFMpeg/FFServer P2P streaming between embedded device and smartphone

I've been playing for the last couple of days with FFMpeg and FFServer as I am considering them as candidates for livestreaming between an embedded device (with an integrated HD camera) and various clients (smartphones).
I've managed to achieve the stream with the following config for FFServer:
HTTPPort 1234
RTSPPort 1235
MaxHTTPConnections 2000
MaxClients 1000
MaxBandwidth 512000 # Maximum bandwidth per client
# set this high enough to exceed stream bitrate
CustomLog -
NoDaemon # Remove this if you want FFserver to daemonize after start
<Feed feed.ffm> # This is the input feed where FFmpeg will send
File /tmp/feed.ffm # video stream.
FileMaxSize 512K
</Feed>
<Stream test.h264> # Output stream URL definition
Feed feed.ffm # Feed from which to receive video
Format rtp
# Video settings
VideoCodec libvpx
VideoSize 720x576 # Video resolution
VideoFrameRate 60 # Video FPS
AVOptionVideo flags +global_header # Parameters passed to encoder
# (same as ffmpeg command-line parameters)
AVOptionVideo cpu-used 0
AVOptionVideo qmin 10
AVOptionVideo qmax 42
AVOptionVideo quality good
AVOptionAudio flags +global_header
PreRoll 15
StartSendOnKey
VideoBitRate 400 # Video bitrate
NoAudio
</Stream>
And the following FFMpeg command to send the stream to FFServer:
ffmpeg -rtbufsize 2100M -f dshow -i video="Integrated Camera" -vcodec libx264 http://127.0.0.1:1234/feed.ffm
I also have a simple Android client that plays the RTSP stream using the following URL:
rtsp://mylocalnetworkip:1235/test.h264
But now I am trying to achieve a P2P connection between the embedded device and a smartphone. This has to work over the internet (not in the LAN) and be capable of UDP hole punching (such as Skype does for p2p video-calling).
Is this achievable with ffmpeg alone?
Can ffmpeg integrate with a Stun/Turn server such as Coturn to bypass symmetric NATs?
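As far as I know, ffmpeg itself has no ICE/STUN/TURN support, so NAT traversal is normally delegated to a WebRTC gateway (Janus, mediasoup and similar) that is pointed at coturn and receives the encoded video from ffmpeg over plain RTP on the same host. A sketch of the ffmpeg side only (the local port 8004 and the gateway configuration are assumptions):
ffmpeg -rtbufsize 2100M -f dshow -i video="Integrated Camera" -c:v libx264 -profile:v baseline -tune zerolatency -an -f rtp rtp://127.0.0.1:8004
The gateway then terminates the WebRTC/ICE session with the smartphone and falls back to relaying through the TURN server when hole punching fails.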

How can I configure ffserver to support rtmp instead of http?

I grab the image from my camera using ffmpeg and the following command:
ffmpeg -y -f vfwcap -r 25 -i 0 http://10.172.180.235:8090/feed2.ffm
and on another machine (with the IP mentioned above) I have ffserver running with the following config file:
HttpPort 8090
HttpBindAddress 0.0.0.0
MaxHTTPConnections 2000
MaxClients 1000
MaxBandwidth 3000
CustomLog -
<Stream stat.html>
Format status
ACL allow localhost
ACL allow 192.168.0.0 192.168.255.255
ACL allow 10.172.180.199
ACL allow 10.172.180.216
ACL allow 10.172.180.215
</Stream>
<Feed feed2.ffm>
File /tmp/feed2.ffm
FileMaxSize 1G
ACL allow 127.0.0.1
ACL allow 10.172.180.199
ACL allow 10.172.180.216
ACL allow 10.172.180.236
ACL allow 10.172.180.109
</Feed>
<Stream live.flv>
Format flv
Feed feed2.ffm
VideoCodec libx264
VideoFrameRate 30
VideoBitRate 800
VideoSize 1280x720
AVOptionVideo crf 23
AVOptionVideo preset medium
AVOptionVideo me_range 16
AVOptionVideo qdiff 4
AVOptionVideo qmin 10
AVOptionVideo qmax 51
AVOptionVideo flags +global_header
NoAudio
AudioCodec aac
Strict -2
AudioBitRate 128
AudioChannels 2
AudioSampleRate 44100
AVOptionAudio flags +global_header
</Stream>
And that works: I can stream video in flv over http... But now I would like to use rtmp, because I want to display the live stream on my webpage with some player. I wanted to use video.js, but it seems the latest versions don't support live video any more... I found mediaelement.js, but to stream live content there I need the rtmp protocol, hence my question.
Thanks for your help
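For what it's worth, ffserver itself only speaks HTTP and RTSP, so getting rtmp usually means pushing from ffmpeg straight to an RTMP server such as nginx with the nginx-rtmp module instead of going through ffserver. A minimal sketch (the application name live and the stream key cam are assumptions):
ffmpeg -y -f vfwcap -r 25 -i 0 -c:v libx264 -preset veryfast -pix_fmt yuv420p -f flv rtmp://10.172.180.235/live/cam
with an nginx-rtmp block along these lines on the receiving machine:
rtmp {
    server {
        listen 1935;
        application live {
            live on;
        }
    }
}
The web player would then point at rtmp://10.172.180.235/live/cam.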

Streaming webm with ffserver error

I have set up ffserver to stream mpeg-ts and flv from a live rtsp feed via ffmpeg, but when I also include the webm format in the configuration and try to play the webm file in a browser, I get the following error in the log:
"Only VP8,VP9 video and Vorbis,Opus(experimental, use -strict -2) audio and WebVTT subtitles are supported for WebM"
The ffmpeg command I use is
ffmpeg -i rtsp://192.168.1.1:5543/lowQ.sdp -c copy http://xxx.xxxx.xxxx:8080/feed1.ffm
The ffserver configuration is
Feed feed1.ffm
Format webm
NoAudio
AVOptionVideo flags +global_header
VideoBitRate 500k
VideoBufferSize 40
VideoFrameRate 25
VideoCodec libvpx
StartSendOnKey
Preroll 15
Appreciate your help in this!
Try compiling ffmpeg with libvpx support (../configure --enable-libvpx).
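Note also that the error itself comes from the webm muxer: a webm stream can only carry VP8/VP9 video (and Vorbis/Opus audio), so the stream block has to re-encode the feed with libvpx rather than pass the camera's h.264 through. A sketch of a complete stream block, with the stream name and bitrate as assumptions:
<Stream cam.webm>
Feed feed1.ffm
Format webm
VideoCodec libvpx
VideoFrameRate 25
VideoBitRate 500
NoAudio
StartSendOnKey
</Stream>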
