How can I configure ffserver to support rtmp instead of http?

I grab the image from my camera using ffmpeg and the following command:
ffmpeg -y -f vfwcap -r 25 -i 0 http://10.172.180.235:8090/feed2.ffm
and on another machine (with the IP mentioned above) I have ffserver running with the following config file:
HttpPort 8090
HttpBindAddress 0.0.0.0
MaxHTTPConnections 2000
MaxClients 1000
MaxBandwidth 3000
CustomLog -
<Stream stat.html>
Format status
ACL allow localhost
ACL allow 192.168.0.0 192.168.255.255
ACL allow 10.172.180.199
ACL allow 10.172.180.216
ACL allow 10.172.180.215
</Stream>
<Feed feed2.ffm>
File /tmp/feed2.ffm
FileMaxSize 1G
ACL allow 127.0.0.1
ACL allow 10.172.180.199
ACL allow 10.172.180.216
ACL allow 10.172.180.236
ACL allow 10.172.180.109
</Feed>
<Stream live.flv>
Format flv
Feed feed2.ffm
VideoCodec libx264
VideoFrameRate 30
VideoBitRate 800
VideoSize 1280x720
AVOptionVideo crf 23
AVOptionVideo preset medium
AVOptionVideo me_range 16
AVOptionVideo qdiff 4
AVOptionVideo qmin 10
AVOptionVideo qmax 51
AVOptionVideo flags +global_header
NoAudio
AudioCodec aac
Strict -2
AudioBitRate 128
AudioChannels 2
AudioSampleRate 44100
AVOptionAudio flags +global_header
</Stream>
And that works: I can stream video in FLV over HTTP. But now I would like to use RTMP, because I want to display the live stream on my webpage with some player. I wanted to use video.js, but it seems the latest versions no longer support live video. I found mediaelement.js, but to stream live content there I need the RTMP protocol, so that's the cause of my question.
Thanks for your help
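As far as I can tell, ffserver itself only speaks HTTP and RTSP, so it cannot serve RTMP directly. A minimal sketch of the usual workaround is to publish from ffmpeg straight to a separate RTMP server (nginx with the RTMP module is a common choice); the application/stream names here are assumptions, not anything from the config above:

```shell
# Sketch: push the capture directly to an external RTMP server instead
# of ffserver. "live/stream" is a placeholder application/stream name.
ffmpeg -y -f vfwcap -r 25 -i 0 \
  -c:v libx264 -preset veryfast -b:v 800k -g 50 \
  -f flv rtmp://10.172.180.235/live/stream
```

A web player such as mediaelement.js could then be pointed at rtmp://10.172.180.235/live/stream.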

Related

Live streaming from h.264 IP camera to Web browser

I want to live stream video from an h.264/h.265 IP camera to a browser with little to no delay and in decent quality (Full HD). I know there are a couple of questions like this one, but the answers seem to be either incomplete or outdated. So far I've tried ffmpeg and ffserver and had some success with them, but there are problems:
When I stream to mjpg the quality isn't great; if I use webm the quality is better, but there is a significant delay (approx. 5 seconds), probably due to transcoding from h264 to vp9. How can I improve it? Is it possible to stream h264 without transcoding it to a different format? Are there any better solutions than ffserver and ffmpeg?
Here is the config I've used for mjpg:
ffmpeg -rtsp_transport tcp -i rtsp://rtsp_user:Rtsp_pass@192.168.3.83:554/Streaming/Channels/101 -q:v 3 http://localhost:8090/feed3.ffm
on ffserver:
<feed feed3.ffm>
file /tmp/feed3.ffm
filemaxsize 1G
acl allow 127.0.0.1
</feed>
<Stream cam3.mjpg>
Feed feed3.ffm
Format mpjpeg
VideoCodec mjpeg
VideoFrameRate 25
VideoIntraOnly
VideoBufferSize 8192
VideoBitRate 8192
VideoSize 1920x1080
VideoQMin 5
VideoQMax 15
NoAudio
Strict -1
</Stream>
And for webm:
ffmpeg -rtsp_transport tcp -i rtsp://rtsp_user:Rtsp_pass@192.168.3.83:554/Streaming/Channels/101 -c:v libvpx http://127.0.0.1:8090/feed4.ffm
ffserver:
<Stream cam4.webm>
Feed feed4.ffm
Format webm
# Audio settings
NoAudio
# Video settings
VideoCodec libvpx
VideoSize 720x576
VideoFrameRate 25
AVOptionVideo qmin 10
AVOptionVideo qmax 42
AVOptionAudio flags +global_header
PreRoll -1
StartSendOnKey
VideoBitRate 400
</Stream>
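On the "stream h264 without transcoding" part of the question: ffserver generally re-encodes, but ffmpeg alone can remux the camera's H.264 unchanged. A sketch using HLS output (the segment options and output path are assumptions; the RTSP URL is the one from the question):

```shell
# Sketch: copy the H.264 stream into HLS segments without re-encoding,
# avoiding the transcoding delay described above.
ffmpeg -rtsp_transport tcp \
  -i "rtsp://rtsp_user:Rtsp_pass@192.168.3.83:554/Streaming/Channels/101" \
  -c:v copy -an \
  -f hls -hls_time 2 -hls_list_size 5 -hls_flags delete_segments \
  /var/www/html/cam3.m3u8
```

Note that HLS adds a few seconds of segment latency of its own, so this trades transcoding CPU for some delay.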

CANNOT get FFserver stream going

I want to preface this question with the fact that I am very very new to ffmpeg and even newer to ffserver.
I cannot, for the life of me, get this thing going.
I get:
"Too large number of skipped frames 882933314374 > 60000"
Also, ffplay gives me "first frame is no keyframe".
Here is my ffserver.conf file
HTTPPort 8090
HTTPBindAddress 0.0.0.0
MaxHTTPConnections 1000
MaxClients 10
MaxBandwidth 2000000
NoDefaults
###############################################################################################
<Feed test.ffm>
File /tmp/test.ffm
FileMaxSize 10000M
ACL ALLOW localhost
</Feed>
<Stream status.html>
Format status
# Only allow local people to get the status
ACL allow localhost
</Stream>
<Stream test.avi>
Feed test.ffm
Format avi
ACL ALLOW localhost
ACL ALLOW 192.168.1.0
NoAudio
VideoSize 3840x2160
VideoFrameRate 30
Preroll 10
</Stream>
###############################################################################################
And here is my ffmpeg command
ffmpeg -i smaller.avi http://localhost:8090/test.ffm
I've been fighting with this thing all day, googling like a madman the entire time. What am I doing wrong? Any help will be welcomed enthusiastically.
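One hedged suggestion: the implausibly large skipped-frames number often points at a stale feed file left over from an earlier run, and the "first frame is no keyframe" message can come from a client connecting mid-GOP. A sketch of a clean restart (paths taken from the config and command above):

```shell
# Sketch: start from a fresh feed file and pace the input in real time.
rm -f /tmp/test.ffm                      # drop the stale feed file
ffserver -f /etc/ffserver.conf &         # restart ffserver with a clean feed
ffmpeg -re -i smaller.avi http://localhost:8090/test.ffm   # -re = read at native frame rate
```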
These are my notes as I'm currently working through a similar process:
Video Streaming from ffserver for Raspberry PI - Unoptimized
Follow this tutorial (I know people don't like links, but this tutorial worked):
https://oscarliang.com/webcam-streaming-video-raspberry-pi-via-browser/
Download ffmpeg for windows (or linux)
git clone https://git.ffmpeg.org/ffmpeg.git ffmpeg
// Keep your ffserver.conf simple at first
HTTPPort 8090
HTTPBindAddress 0.0.0.0
MaxHTTPConnections 1000
MaxClients 10
MaxBandwidth 2000000
NoDefaults
###############################################################################################
<Feed test.ffm>
File /tmp/test.ffm
FileMaxSize 10M
</Feed>
<Stream test.avi>
Feed test.ffm
Format mjpeg
VideoSize 640x480
VideoFrameRate 20
VideoBitRate 2000
VideoQMin 2
VideoQMax 10
</Stream>
The endpoint will be at http://<localhost>/webcam.mjpeg (make sure the feed and stream names in ffserver.conf match the URLs used below).
Make sure webcam.sh contains:
ffserver -f /etc/ffserver.conf \
& ffmpeg -v verbose \
-r 30 \
-s 640x480 \
-f video4linux2 \
-i /dev/video0 http://localhost/webcam.ffm
Run the following:
// Use the following instead of vlc as this has faster streaming
Win:
ffplay.exe http://localhost/webcam.mjpeg
Linux:
ffplay http://localhost/webcam.mjpeg

FFMpeg/FFServer P2P streaming between embedded device and smartphone

I've been playing for the last couple of days with FFMpeg and FFServer as I am considering them as candidates for livestreaming between an embedded device(with an integrated HD camera) and various clients(smartphones).
I've managed to achieve the stream using the following config for FFServer:
HTTPPort 1234
RTSPPort 1235
MaxHTTPConnections 2000
MaxClients 1000
MaxBandwidth 512000 # Maximum bandwidth per client
# set this high enough to exceed stream bitrate
CustomLog -
NoDaemon # Remove this if you want FFserver to daemonize after start
<Feed feed.ffm> # This is the input feed where FFmpeg will send
File /tmp/feed.ffm # video stream.
FileMaxSize 512K
</Feed>
<Stream test.h264> # Output stream URL definition
Feed feed.ffm # Feed from which to receive video
Format rtp
# Video settings
VideoCodec libvpx
VideoSize 720x576 # Video resolution
VideoFrameRate 60 # Video FPS
AVOptionVideo flags +global_header # Parameters passed to encoder
# (same as ffmpeg command-line parameters)
AVOptionVideo cpu-used 0
AVOptionVideo qmin 10
AVOptionVideo qmax 42
AVOptionVideo quality good
AVOptionAudio flags +global_header
PreRoll 15
StartSendOnKey
VideoBitRate 400 # Video bitrate
NoAudio
</Stream>
And the following FFMpeg command to send the stream to FFServer:
ffmpeg -rtbufsize 2100M -f dshow -i video="Integrated Camera" -vcodec libx264 http://127.0.0.1:1234/feed.ffm
I also have a simple Android client that plays the RTSP stream using the following URL:
rtsp://mylocalnetworkip:1235/test.h264
But now I am trying to achieve a P2P connection between the embedded device and a smartphone. This has to work over the internet (not just in the LAN) and be capable of UDP hole punching (as Skype does for P2P video calling).
Is this achievable with ffmpeg alone?
Can ffmpeg integrate with a Stun/Turn server such as Coturn to bypass symmetric NATs?

ffmpeg stream rc buffer underflow

At the moment I'm setting up a screen-sharing platform with the open-source tools ffmpeg / ffserver. At the beginning of the sharing everything is fine. After around one and a half minutes I get the following exception in the output:
[flv # 0x3a47aa0] rc buffer underflow
[flv # 0x3a47aa0] max bitrate possibly too small or try trellis with large lmax or increase qmax
I've tried to set a very high lmax & qmax, but this hasn't changed anything. Additionally, I've tried to increase the bitrate and the buffer size.
I use the following command with ffmpeg:
ffmpeg -f x11grab -s 1920x1080 -r 20 -i :0.0+1680,0 "http://localserver.de:8080/input1.ffm"
The config file for the ffserver is:
HTTPPort 8080
HTTPBindAddress 0.0.0.0
MaxHTTPConnections 2000
MaxClients 1000
MaxBandwidth 50000
CustomLog -
<Feed input1.ffm>
File /var/ffserver/input1.ffm
FileMaxSize 20M
ACL allow *FROM IP* *TO IP*
</Feed>
<Stream screen1.swf>
Feed input1.ffm
Format swf
VideoCodec flv
VideoFrameRate 20
VideoBufferSize 8000
VideoBitRate 250
VideoQMin 1
VideoQMax 5
VideoSize 640x400
PreRoll 0
StartSendOnKey
NoAudio
</Stream>
Another streaming format would also be a possibility, but I don't know which ones are suitable for live streaming.
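For what it's worth, the underflow message usually means the rate controller cannot stay inside the configured bitrate with such a tight VideoQMax. A sketch of a looser stream section (the numbers are assumptions to tune, not known-good values):

```
<Stream screen1.swf>
Feed input1.ffm
Format swf
VideoCodec flv
VideoFrameRate 20
VideoBufferSize 2000   # more headroom for the rate controller
VideoBitRate 1000      # raise the ceiling for 1080p screen content
VideoQMin 1
VideoQMax 15           # let quality drop instead of underflowing
VideoSize 640x400
PreRoll 0
StartSendOnKey
NoAudio
</Stream>
```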

av_interleaved_write_frame() unknown error when streaming WebM

I'm following a guide to live WebM streaming through FFMpeg / FFServer and running into an interesting error. I have tried using a DirectShow webcam source, and also an existing WebM (finite length) video using -vcodec copy. Initially, both will manage to connect to the FFServer (I can see the POST 200 OKs to /feed1.ffm), and maybe even send a frame or two, but then FFMpeg crashes with av_interleaved_write_frame(): Unknown error. (Meanwhile, FFServer appears to be fine.)
This appears to be an unusual variant of the error - normally it's more common to get, say, av_interleaved_write_frame(): I/O error (which indicates file corruption). Has anyone seen this error, and better yet, can anyone tell me how to fix it?
FFMpeg commands
ffmpeg -re -i univac.webm -vcodec copy -acodec copy -f webm http://[my server]/feed1.ffm
ffmpeg -f dshow -i video="[my dshow source]" -f webm http://[my server]/feed1.ffm
FFserver command
ffserver -f ffserver.conf
ffserver.conf
This is only a slight variation on the one provided in the aforementioned guide.
Port 8080
BindAddress 0.0.0.0
MaxHTTPConnections 2000
MaxClients 5
# MaxBandwidth 10000
CustomLog -
NoDaemon
<Feed feed1.ffm>
File ./feed1.ffm
FileMaxSize 1G
ACL allow [IP of the machine with ffmpeg]
</Feed>
<Stream test.webm>
Feed feed1.ffm
Format webm
# Audio settings
AudioCodec vorbis
AudioBitRate 64
# Video settings
VideoCodec libvpx
VideoSize 640x480
VideoFrameRate 30
AVOptionVideo flags +global_header
AVOptionVideo cpu-used 0
AVOptionVideo qmin 10
AVOptionVideo qmax 42
AVOptionVideo quality good
AVOptionAudio flags +global_header
VideoBitRate 400
# Streaming settings
PreRoll 15
StartSendOnKey
</Stream>
FFserver logs
avserver version 0.8.6-6:0.8.6-1ubuntu2, Copyright (c) 2000-2013 the Libav developers
built on Mar 30 2013 with gcc 4.7.2
AVserver started
[current time] - [GET] "/feed1.ffm HTTP/1.1" 200 4149
[current time] - [POST] "/feed1.ffm HTTP/1.1" 200 4096
This is probably caused by using different versions of ffmpeg and ffserver.
Try to use the same version; they should then work without a problem.
In addition, use either libav or ffmpeg exclusively, not a mix, because the two projects' tools are probably not quite compatible with each other.
The connection was established over TCP, and after a while I got the error 'av_interleaved_write_frame(): Unknown error' on the client and 'Connection timed out' on the server.
In my case, I found that another process was listening on the same port that ffmpeg was configured to use on the client.
To check which ports are in use:
(windows) netstat -a -b
(ubuntu) netstat -a -p
I was also using a custom ffmpeg build inside a folder; the plain 'ffmpeg' command picked up the wrong binary, so I changed it to './ffmpeg'.
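To check the version-mismatch theory above (the logs show avserver, i.e. Libav), comparing the banners of both binaries is a quick sketch; note that ffserver was removed from FFmpeg after the 3.4 release, so on newer installs it will be missing entirely:

```shell
# Sketch: verify both tools come from the same project and build.
ffmpeg -version | head -n 1
ffserver -version | head -n 1
```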
