Not able to configure FFserver for RTSP - ffmpeg

I am kind of new to the streaming world, so please forgive me if I ask a dumb question.
I am trying to stream a pre-recorded file over RTSP through FFserver.
My config file is:
RTSPPort 8544
<Feed feed2.ffm>
File /home/xyz/tmp/feed2.ffm
FileMaxSize 200K
ACL allow 127.0.0.1
</Feed>
<Stream test.sdp>
Feed feed2.ffm
Format rtsp
VideoFrameRate 15
VideoSize 352x240
VideoBitRate 256
VideoBufferSize 40
VideoGopSize 30
AudioBitRate 64
StartSendOnKey
</Stream>
After starting the server, it gives the following log:
$ ./ffserver -f doc/ffserver.conf
ffserver version 0.11.1 Copyright (c) 2000-2012 the FFmpeg developers
built on Sep 17 2012 19:46:38 with gcc 4.1.2 20080704 (Red Hat 4.1.2-52)
configuration: --enable-gpl --enable-libmp3lame --enable-libtheora --enable-libvo-aacenc --enable-libvorbis --enable-libvpx --enable-libx264 --enable-version3
libavutil 51. 54.100 / 51. 54.100
libavcodec 54. 23.100 / 54. 23.100
libavformat 54. 6.100 / 54. 6.100
libavdevice 54. 0.100 / 54. 0.100
libavfilter 2. 77.100 / 2. 77.100
libswscale 2. 1.100 / 2. 1.100
libswresample 0. 15.100 / 0. 15.100
libpostproc 52. 0.100 / 52. 0.100
Wed Sep 19 17:03:32 2012 FFserver started.
Now, from my VLC client, I try to open the URL: rtsp://xxx.xxx.xxx.xxx:8554/test.sdp
But nothing happens: there is no response from ffserver.
I have no clue what the problem might be. Thanks in advance.

You don't have anything to stream yet.
You need to start
ffmpeg -i <source> http://localhost:8090/feed2.ffm
after enabling port 8090 for HTTP with the directive (in your config file):
Port 8090
This has been asked before, but it was badly tagged, so I can't find it.
If anybody finds it, please link it here, as starting an empty server seems to be a common problem.
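Putting that together, a rough sketch (it reuses the feed name from the question; the input path is only a placeholder):
# In ffserver.conf, alongside RTSPPort 8544, enable the HTTP feed port:
Port 8090
# Then, with ffserver running, push a source into the feed:
ffmpeg -i /path/to/input.mp4 http://localhost:8090/feed2.ffm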

Matthias is right: currently you are not streaming anything.
The given ffmpeg command should work, but consider that the feed section may be counterproductive.
If the video file is already stored on the server, you don't need the feed (the file itself will serve as the source):
<Stream test.sdp>
File "path_to_your_file" #instead of the Feed
...
</Stream>
If the video file is on a different computer, you have to stream it to the server first (see Matthias' answer).
Edit:
You also need a feed if you want to manipulate the stream before serving it.
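For reference, a file-backed stream section along those lines might look like this sketch (the path is a placeholder, and Format rtp mirrors the file-backed RTSP examples further down this page):
<Stream test.sdp>
Format rtp
File "/home/xyz/tmp/video.mpg"   # serve directly from this file; no Feed needed
</Stream>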

Related

ffmpeg video to jpg frames poor quality

The quality of JPGs extracted by ffmpeg from an mp4 is much poorer than a paused frame in a video player (VLC). I am looking for an ffmpeg command-line option to improve the output quality.
I am using the following command:
/home/tools/bin/ffmpeg -i Merkurtransit_20191111_crf20_8fps_crop.mp4 Merkurtransit_20191111_crf20_8fps_crop_%04d.jpg -hide_banner
The ffmpeg command is from the instructions found here:
https://www.bugcodemaster.com/article/extract-images-frame-frame-video-file-using-ffmpeg
A comparison screenshot is here:
http://skywatcher.space/download/vlc_player_vs_ffmpeg_bug.png
A few items of note: I created the mp4 myself from high-res PNGs (originally 16-bit TIFFs) using ffmpeg:
/home/tools/bin/ffmpeg -framerate 8.0 -i ./AS_P10_RS6_png_reg/Merkurtransit_20191111_%03d.png -vf "crop=760:560:20:40" -pix_fmt yuv420p -crf 20 -r 24 -y ./Video/Merkurtransit_20191111_crf20_8fps_crop.mp4
CRF 20 is pretty high quality, close to 100%, so the recovered frames should be close to the originals. The paused frame in the video player shows adequate quality (though I can't say whether it is a keyframe or not).
ffmpeg version info:
/home/tools/bin/ffmpeg -v
ffmpeg version N-80251-g0c7fa15 Copyright (c) 2000-2016 the FFmpeg developers
built with gcc 4.8 (Ubuntu 4.8.4-2ubuntu1~14.04.3)
configuration: --prefix=/home/tools/ffmpeg_build --pkg-config-flags=--static --extra-cflags=-I/home/tools/ffmpeg_build/include --extra-ldflags=-L/home/tools/ffmpeg_build/lib --bindir=/home/tools/bin --enable-gpl --enable-libass --enable-libfdk-aac --enable-libfreetype --enable-libmp3lame --enable-libopus --enable-libtheora --enable-libvorbis --enable-libvpx --enable-libx264 --enable-libx265 --enable-nonfree
libavutil 55. 24.100 / 55. 24.100
libavcodec 57. 45.100 / 57. 45.100
libavformat 57. 37.101 / 57. 37.101
libavdevice 57. 0.101 / 57. 0.101
libavfilter 6. 46.101 / 6. 46.101
libswscale 4. 1.100 / 4. 1.100
libswresample 2. 0.101 / 2. 0.101
libpostproc 54. 0.100 / 54. 0.100
FFmpeg is primarily a video converter, and JPEG output is the result of the MJPEG encoder generating a single image. When no rate control parameters are set, a default bitrate of 200 kbps is selected.
For better quality output, use
ffmpeg -i in.mp4 -q:v 1 -qmin 1 -qmax 1 out%d.jpg
The quantizer is clamped to exactly 1.
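Applied to the command from the question, that would look something like this (same files, with the quality options added and -hide_banner moved before the input):
/home/tools/bin/ffmpeg -hide_banner -i Merkurtransit_20191111_crf20_8fps_crop.mp4 -q:v 1 -qmin 1 -qmax 1 Merkurtransit_20191111_crf20_8fps_crop_%04d.jpg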

ffserver - invalid codec name libvpx

I have the following configuration of ffserver.conf:
Port 8090 # Port to bind the server to
BindAddress 0.0.0.0
MaxHTTPConnections 2000
MaxClients 1000
MaxBandwidth 10000 # Maximum bandwidth per client
# set this high enough to exceed stream bitrate
CustomLog -
NoDaemon # Remove this if you want FFserver to daemonize after start
<Feed feed1.ffm> # This is the input feed where FFmpeg will send
File ./feed1.ffm # video stream.
FileMaxSize 1G # Maximum file size for buffering video
ACL allow 127.0.0.1 # Allowed IPs
</Feed>
<Stream test.webm> # Output stream URL definition
Feed feed1.ffm # Feed from which to receive video
Format webm
# Audio settings
AudioCodec vorbis
AudioBitRate 64 # Audio bitrate
# Video settings
VideoCodec libvpx
VideoSize 720x576 # Video resolution
VideoFrameRate 25 # Video FPS
AVOptionVideo flags +global_header # Parameters passed to encoder
# (same as ffmpeg command-line parameters)
AVOptionVideo cpu-used 0
AVOptionVideo qmin 10
AVOptionVideo qmax 42
AVOptionVideo quality good
AVOptionAudio flags +global_header
PreRoll 15
StartSendOnKey
VideoBitRate 400 # Video bitrate
</Stream>
<Stream status.html> # Server status URL
Format status
# Only allow local people to get the status
ACL allow localhost
ACL allow 192.168.0.0 192.168.255.255
</Stream>
<Redirect index.html> # Just an URL redirect for index
# Redirect index.html to the appropriate site
URL http://www.ffmpeg.org/
</Redirect>
When I run the server with that config file, I get the following errors:
ffserver version 2.6.2 Copyright (c) 2000-2015 the FFmpeg developers
built with Apple LLVM version 6.1.0 (clang-602.0.49) (based on LLVM 3.6.0svn)
configuration: --prefix=/usr/local/Cellar/ffmpeg/2.6.2 --enable-shared --enable-pthreads --enable-gpl --enable-version3 --enable-hardcoded-tables --enable-avresample --cc=clang --host-cflags= --host-ldflags= --enable-libx264 --enable-libmp3lame --enable-libvo-aacenc --enable-libxvid --enable-vda
libavutil 54. 20.100 / 54. 20.100
libavcodec 56. 26.100 / 56. 26.100
libavformat 56. 25.101 / 56. 25.101
libavdevice 56. 4.100 / 56. 4.100
libavfilter 5. 11.102 / 5. 11.102
libavresample 2. 1. 0 / 2. 1. 0
libswscale 3. 1.101 / 3. 1.101
libswresample 1. 1.100 / 1. 1.100
libpostproc 53. 3.100 / 53. 3.100
ffserver.conf:1: Port option is deprecated. Use HTTPPort instead.
ffserver.conf:2: BindAddress option is deprecated. Use HTTPBindAddress instead.
ffserver.conf:8: NoDaemon option has no effect. You should remove it.
ffserver.conf:13: ACL refers to invalid host or IP address '#'
ffserver.conf:26: Invalid codec name: 'libvpx'
ffserver.conf:31: Option not found: 'cpu-used'
ffserver.conf:31: If 'cpu-used' is a codec private option, then prefix it with codec name, for example 'vp8:cpu-used 0' or define codec earlier.
ffserver.conf:34: Option not found: 'quality'
ffserver.conf:39: Setting default value for audio sample rate = 22050. Use NoDefaults to disable it.
ffserver.conf:39: Setting default value for audio channel count = 1. Use NoDefaults to disable it.
How can I run it successfully? I want to stream live WebM video, but so far I'm stuck at the point of starting ffserver.
FFmpeg lists its configure options on every invocation, and there is no --enable-libvpx option in your configuration. Try building FFmpeg with --enable-libvpx.
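A quick way to check, plus a rough sketch of the fix (the Homebrew option name is from memory and may vary between formula versions):
# Check whether a VP8 encoder is actually available in this build:
ffmpeg -encoders 2>/dev/null | grep -i vpx
# Homebrew (macOS), if the formula still exposes the option:
brew reinstall ffmpeg --with-libvpx
# Or rebuild FFmpeg from source with libvpx enabled:
./configure --enable-gpl --enable-libvpx && make && make install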

FFmpeg: NetStream.Play.StreamNotFound on RTMP stream

I want to periodically take snapshots of an RTMP live video stream.
I can watch the RTMP video stream using VLC. This is the RTMP URL:
rtmp://antena3fms35livefs.fplive.net/antena3fms35live-live/stream-antena3_1
I'm using the following command to capture the snapshots, according to the official FFmpeg site here:
ffmpeg -i rtmp://antena3fms35livefs.fplive.net/antena3fms35live-live/stream-antena3_1 -f image2 -vf fps=fps=1 out%d.png
The command produces the following output:
ffmpeg version N-64667-gd595361 Copyright (c) 2000-2014 the FFmpeg developers
built on Jul 14 2014 22:09:48 with gcc 4.8.3 (GCC)
configuration: --enable-gpl --enable-version3 --disable-w32threads --enable-avisynth --enable-bzl
libilbc --enable-libmodplug --enable-libmp3lame --enable-libopencore-amrnb --enable-libopencore-amr
enable-libvo-amrwbenc --enable-libvorbis --enable-libvpx --enable-libwavpack --enable-libwebp --ena
libavutil 52. 92.100 / 52. 92.100
libavcodec 55. 69.100 / 55. 69.100
libavformat 55. 47.100 / 55. 47.100
libavdevice 55. 13.102 / 55. 13.102
libavfilter 4. 10.100 / 4. 10.100
libswscale 2. 6.100 / 2. 6.100
libswresample 0. 19.100 / 0. 19.100
libpostproc 52. 3.100 / 52. 3.100
HandShake: client signature does not match!
Closing connection: NetStream.Play.StreamNotFound
rtmp://antena3fms35livefs.fplive.net/antena3fms35live-live/stream-antena3_1: Unknown error occurred
I've tried it with other RTMP streams, but I'm still getting the exact same error.
What could be the problem?
Thank you!
I just tried your command and it worked fine for me. Maybe it is something about your FFmpeg installation? I am using version 2.4 on a Mac (tessus build).
I know other/older versions used "librtmp" for RTMP connections, which required some extra options appended after the stream URL. See the ffmpeg docs here:
ffmpeg documentation on librtmp
And librtmp docs here:
librtmp documentation
For an unprotected live stream, you may want to try quoting the stream URL and appending " live=1" within the quotes:
ffmpeg -i "rtmp://antena3fms35livefs.fplive.net/antena3fms35live-live/stream-antena3_1 live=1" -f image2 -vf fps=fps=1 out%d.png

Stream from MP4 file over RTSP with ffserver

I'm trying to stream an mp4 file over RTSP using ffserver, with no luck so far. I just want to stream directly from the file, without feeding it from ffmpeg (no transcoding involved). I have made it work with mpg video, though.
Here is my ffserver config file:
Port 8090
BindAddress 0.0.0.0
MaxHTTPConnections 2000
MaxClients 1000
MaxBandwidth 500000
CustomLog -
NoDaemon
RTSPPort 7654
RTSPBindAddress 0.0.0.0
<Stream test1-rtsp>
Format rtp
File "/home/g/video_streaming/sample3-mpeg2.mpg"
</Stream>
<Stream test2-rtsp>
Format rtp
File "/home/g/video.mp4"
</Stream>
When I launch ffserver, everything seems fine based on the log output:
$ ./dev/ffmpeg/ffserver -f ffserver-sample.conf
ffserver version N-45673-gd0c27e8 Copyright (c) 2000-2012 the FFmpeg developers
built on Oct 18 2012 10:36:52 with gcc 4.6 (Ubuntu/Linaro 4.6.3-1ubuntu5)
configuration:
libavutil 51. 76.100 / 51. 76.100
libavcodec 54. 66.100 / 54. 66.100
libavformat 54. 33.100 / 54. 33.100
libavdevice 54. 3.100 / 54. 3.100
libavfilter 3. 19.103 / 3. 19.103
libswscale 2. 1.101 / 2. 1.101
libswresample 0. 16.100 / 0. 16.100
Thu Oct 18 11:54:22 2012 Opening file '/home/g/video.mp4'
Thu Oct 18 11:54:22 2012 Opening file '/home/g/video.mp4'
Thu Oct 18 11:54:23 2012 Opening file '/home/g/video_streaming/sample3-mpeg2.mpg'
Thu Oct 18 11:54:23 2012 [mpeg @ 0x1dae3c0]max_analyze_duration 5000000 reached at 5005000
Thu Oct 18 11:54:23 2012 Opening file '/home/g/video_streaming/sample3-mpeg2.mpg'
Thu Oct 18 11:54:23 2012 [mpeg @ 0x1dae3c0]max_analyze_duration 5000000 reached at 5005000
Thu Oct 18 11:54:23 2012 FFserver started.
Finally, if I run ffplay in order to test the server, everything works fine for the mpg file, but not for the mp4:
$ ffplay rtsp://192.168.1.99:7654/test2-rtsp
ffplay version N-45656-g916352f Copyright (c) 2003-2012 the FFmpeg developers
built on Oct 17 2012 16:14:14 with gcc 4.4.5 (Ubuntu/Linaro 4.4.4-14ubuntu5.1)
configuration:
libavutil 51. 76.100 / 51. 76.100
libavcodec 54. 66.100 / 54. 66.100
libavformat 54. 33.100 / 54. 33.100
libavdevice 54. 3.100 / 54. 3.100
libavfilter 3. 19.103 / 3. 19.103
libswscale 2. 1.101 / 2. 1.101
libswresample 0. 16.100 / 0. 16.100
rtsp://192.168.1.99:7654/test2-rtsp: Invalid data found when processing input
Server's output:
Thu Oct 18 11:57:51 2012 FFserver started.
Thu Oct 18 11:58:01 2012 192.168.1.101 - - [DESCRIBE] "rtsp://192.168.1.99:7654/test2-rtsp RTSP/1.0" 200 167
Segmentation fault (core dumped)
I don't really know what I could be missing. I've read in the official docs that streaming from a file is somewhat broken, but since I don't know whether that information is up to date, I decided to give it a try here.
Any help or suggestions? Alternatives?
If you are looking for alternatives, live555 (http://www.live555.com/) and the Darwin streaming server are good options. I have used them both and they behave well when streaming from a file.
In the above case you can also try debugging by analyzing the core dump. Looking at the logs, I think the server is crashing even before receiving a PLAY command, so it may be a small hiccup somewhere.
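A minimal sketch of inspecting that core dump (assuming core files are enabled, the ffserver binary was built with debug symbols, and the dump is named 'core'):
ulimit -c unlimited              # allow core files, then reproduce the crash
gdb ./dev/ffmpeg/ffserver core   # load the binary together with the dump
(gdb) bt                         # print the backtrace at the crash site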
Only 4 years later...
You can't stream mp4 videos using ffserver because the format stores global metadata in the file header, making random stream access impossible. [source]
Possible alternative:
// convert awesome.mp4 to awesome.flv
$ ffmpeg -i awesome.mp4 -c:v libx264 -ar 22050 -crf 28 awesome.flv
For more on FFmpeg... go to blog.
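If that conversion is used, one way to wire the resulting file back into ffserver might be the following untested sketch, mirroring the stream sections from the question (the stream name and path are placeholders, and the answer does not say whether ffserver then serves the FLV over RTSP cleanly):
<Stream test3-rtsp>
Format rtp
File "/home/g/awesome.flv"
</Stream>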

Save continuous RTSP stream to 5-10 minute long mp4 files

How can I save a stream (RTSP protocol, H.264 codec) to files (MP4 container)? That is, the input is an endless stream (from a CCTV camera), and the output should be MP4 files each covering 5-10 minutes of recording time.
OS: debian, ubuntu
Software: vlc, ffmpeg (avconv)
Currently I use this scheme:
cvlc rtsp://admin:admin@10.1.1.1:554/ch1-s1 --sout=file/ts:stream.ts
ffmpeg -i stream.ts -vcodec copy -f mp4 stream.mp4
But it cannot record video continuously (between restarts, VLC loses about 10 seconds of live video).
See this question and answer on Server Fault. In short, switch tools. avconv will do what you want. (ffmpeg has become avconv.)
The feature you are looking for is called segmentation. Your command line would look something like this:
avconv -i rtsp://10.2.2.19/live/ch01_0 -c copy -map 0 -f segment -segment_time 300 -segment_format mp4 "capture-%03d.mp4"
Alexander Garden's solution works for ffmpeg using the version below; just replace avconv with ffmpeg.
./ffmpeg -i rtsp://10.2.2.19/live/ch01_0 -c copy -map 0 -f segment -segment_time 300 -segment_format mp4 "capture-%03d.mp4"
I'm including the version header because of the confusion over FFmpeg versions, the Ubuntu schism, and rapid development.
ffmpeg version N-80023-gd55568d Copyright (c) 2000-2016 the FFmpeg developers
built with gcc 4.8 (Ubuntu 4.8.4-2ubuntu1~14.04.1)
configuration: --prefix=/home/rhinchley/q10/ffmpeg_build --pkg-config-flags=--static --extra-cflags=-I/home/rhinchley/q10/ffmpeg_build/include --extra-ldflags=-L/home/rhinchley/q10/ffmpeg_build/lib --bindir=/home/rhinchley/q10/bin --enable-gpl --enable-libass --enable-libfdk-aac --enable-libfreetype --enable-libmp3lame --enable-libopus --enable-libtheora --enable-libvorbis --enable-libvpx --enable-libx264 --enable-libx265 --enable-nonfree
libavutil 55. 24.100 / 55. 24.100
libavcodec 57. 42.100 / 57. 42.100
libavformat 57. 36.100 / 57. 36.100
libavdevice 57. 0.101 / 57. 0.101
libavfilter 6. 45.100 / 6. 45.100
libswscale 4. 1.100 / 4. 1.100
libswresample 2. 0.101 / 2. 0.101
libpostproc 54. 0.100 / 54. 0.100
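One addition that is not part of the original answers: if the individual segments start with a non-zero timestamp in some players, the segment muxer's reset_timestamps option may help:
ffmpeg -i rtsp://10.2.2.19/live/ch01_0 -c copy -map 0 -f segment -segment_time 300 -segment_format mp4 -reset_timestamps 1 "capture-%03d.mp4"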
Teamwork: split the video source and have two processes alternate recording the time frame. You'll want to test how long the startup time is and how variable it is. You might want to set the processes' priority to realtime to reduce start-time variance. There will be some overlap, but from what I infer that sounds like it might be OK for your application. Example timeline (a rough command sketch follows the legend below):
p1: sRRRRRRRRRwwwwwwwwsRRRRRRRRRwwwwwwwwsRRRRRRRRR...
p2: wwwwwwwwwsRRRRRRRRRwwwwwwwwsRRRRRRRRRwwwwwwwww...
time -->
s: startup
R: running
w: wait
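A rough command sketch of that idea (my own illustration, not from the answer; the camera URL, durations, and sleep values are placeholders that need tuning against the real startup delay):
# Each recorder captures for 10 minutes, then idles while the other records;
# the second recorder is staggered so their capture windows overlap slightly.
record_loop () {
  local name=$1
  while true; do
    ffmpeg -i rtsp://CAMERA/stream -c copy -t 600 "${name}-$(date +%s).mp4"
    sleep 580    # idle a bit less than the record time to preserve overlap
  done
}
record_loop p1 &
sleep 590        # stagger the second recorder by roughly one slot
record_loop p2 &
wait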
