I wish to capture an rtsp stream and convert it to an mjpeg (over http) stream using ffmpeg. I am running Ubuntu 20. I have searched and searched for the solution, and mostly find:
a) solutions requiring ffserver (deprecated)
b) solutions converting from mjpeg to rtsp
c) solutions converting from rtsp to hls (nginx, wowza, etc...) which doesn't work in my application. I need http output as mjpeg.
d) vlc - which does work but requires way too much of my available processor (80%)
e) rtsp2mjpg - a GitHub project which I installed but could not get working, and for which I can't get any support.
I am not an ffmpeg expert, so if someone could step me through an ffmpeg solution to this, if it exists, I'd really appreciate it.
I've very recently solved this myself, after finding the exact same things as you. The two parts you need are (1) the ffmpeg conversion in a script, and (2) something like lighttpd+cgi-bin or nginx+FastCGI to serve it over HTTP/HTTPS. I don't expect you'll be able to do much better than vlc in terms of CPU use, though.
This bash script will do the ffmpeg conversion to MJPEG, and send the output to stdout. Put this in lighttpd's cgi-bin folder (/var/www/cgi-bin for me). Call it something like "webcamstream", and adjust the rtsp:// URL to suit your camera:
#!/bin/bash
echo "Content-Type: multipart/x-mixed-replace;boundary=ffmpeg"
echo "Cache-Control: no-cache"
echo ""
ffmpeg -i "rtsp://192.168.60.13:554/user=admin&password=SECRET&channel=1&stream=0.sdp" -c:v mjpeg -q:v 1 -f mpjpeg -an -
Enable cgi-bin for lighttpd:
ln -s /etc/lighttpd/conf-available/10-cgi.conf /etc/lighttpd/conf-enabled/10-cgi.conf
...and then adjust lighttpd's cgi-bin configuration (/etc/lighttpd/conf-enabled/10-cgi.conf) as shown below. The stream-response-body setting is important: it both stops the stream when the client disconnects, and avoids having lighttpd try to buffer the entire infinite stream before sending anything to the client.
server.modules += ( "mod_cgi" )
$HTTP["url"] =~ "^/cgi-bin/" {
server.stream-response-body = 2
cgi.assign = ( "" => "" )
alias.url += ( "/cgi-bin/" => "/var/www/cgi-bin/" )
}
Make the cgi-bin script executable and restart lighttpd:
chmod +x /var/www/cgi-bin/webcamstream
systemctl restart lighttpd
...and that should be it. You can then access the MJPEG stream at a URL like this, where the last part is your script's name:
http://serveraddress/cgi-bin/webcamstream
I've written it up in more detail here: Converting RTSP to HTTP on demand
As far as I can tell, you can't avoid taking the CPU hit of the conversion: the source's codec (typically H.264 over RTSP) and MJPEG are different, so every frame has to be re-encoded. I reduced my CPU load by configuring the camera to lower the source's framerate and resolution until the load on ffmpeg was acceptable. You can also change the resolution and framerate with ffmpeg arguments, but ffmpeg would still have to decode the full frames first and do the work of resizing.
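As a sketch of the latter, the extra arguments could be assembled like this (build_scale_filter is a hypothetical helper; the width and frame-rate values are only illustrative):

```shell
#!/bin/bash
# Build the -vf filter string for downscaling before MJPEG encoding.
# scale=WIDTH:-2 keeps the height divisible by 2 (the encoder requires
# even dimensions); fps=N drops frames before encoding, saving CPU.
build_scale_filter() {
    local width="$1" fps="$2"
    echo "scale=${width}:-2,fps=${fps}"
}

build_scale_filter 640 10   # prints: scale=640:-2,fps=10
```

The result would go just before -c:v mjpeg in the cgi-bin script, e.g. -vf "$(build_scale_filter 640 10)".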
The paths above are on Debian, so you may need to adjust them to suit your Ubuntu system.
Convert RTSP to MJPEG via FFSERVER
ffmpeg download:
https://ffmpeg.org/releases/
Choose an old version from before 3.4 (ffserver was removed in that release); 3.2.16 is recommended.
Compile
./configure --prefix=/u/tool/ffserver
make && make install
FFSERVER
cd /u/tool/ffserver/bin
Edit ffserver.conf:
HTTPPort 8090
HTTPBindAddress 0.0.0.0
MaxHTTPConnections 2000
MaxClients 1000
MaxBandwidth 10000
<Feed feed.ffm>
File /tmp/feed.ffm
FileMaxSize 50M
</Feed>
<Stream live.mjpeg>
Feed feed.ffm
Format mpjpeg
VideoFrameRate 5
VideoIntraOnly
VideoSize 720x405
VideoQMin 5
VideoQMax 20
NoAudio
Strict -1
NoDefaults
</Stream>
<Stream still.jpg>
Feed feed.ffm
Format jpeg
VideoFrameRate 2
VideoSize 720x404
VideoQMin 1
VideoQMax 15
VideoIntraOnly
NoAudio
Strict -1
NoDefaults
</Stream>
Run FFSERVER
./ffserver -f ./ffserver.conf
Pixel error
1280x720 scaled to a width of 720 gives 720x405.
If you use VideoSize 720x405, ffserver fails at startup with this message:
ffserver.conf "Image size is not a multiple of 2"
Fix this by changing 405 to 404.
FEED STREAMING
For the ffmpeg feed, do not use the system-built ffmpeg: after version 3.4 the feed support was removed along with ffserver.
Use the ffmpeg you just compiled, in the same directory as ffserver.
./ffmpeg -rtsp_transport tcp -i "rtsp://127.0.0.1:8554/000FFC52F1D3" -r 15 -an http://127.0.0.1:8090/feed.ffm
Browse the MJPEG stream:
http://192.168.1.17:8090/live.mjpeg
Browse the snapshot image:
http://192.168.1.17:8090/still.jpg
mjpeg status
http://localhost/tool/mjpeg.htm
To keep the MJPEG image updating after the RTSP source stops, reload the image path in JS every N seconds (e.g. 15):
setInterval(function() {
    $('#myJpeg').attr('src', "http://192.168.1.17:8090/live.mjpeg?rand=" + Math.random());
}, 15000);
run server
ffserver -f /etc/ffserver.conf
run debug mode
ffserver -d -f /etc/ffserver.conf
If you want to run ffmpeg in the background from a console, pass -nostdin to ffmpeg, or run it in a terminal multiplexer like screen or tmux.
Related
The situation is kind of complex. I was archiving several CCTV camera feeds (RTSP, H264, no audio) through OpenCV, which worked, but the CPU utilization was too high and it started to lose some frames from time to time.
To reduce the CPU utilization, I started using FFmpeg to skip the decoding and encoding processes, which worked perfectly on my home machine. However, when I connected to my university VPN and tried to deploy it on our lab server, FFmpeg couldn't read any frames, and ffplay couldn't get anything either. OpenCV, VLC Player and IINA Player, however, could still read and display the feed.
In summary:
1 FFMPEG/ffplay
1.1 can only read the feed from my home network(Wi-Fi, optimum)
1.2 from the other two networks, the error message says: "Could not find codec parameters for stream 0 (Video: h264, none): unspecified size
Consider increasing the value for the 'analyzeduration' (0) and 'probesize' (5000000) options"
2 IINA/VLC Player, OpenCV
These tools can get the video all the time.
I'm wondering whether it's related to some specific port access that ffmpeg requires but the others don't. I'd appreciate it if anyone can provide any suggestions.
As references, the tested ffplay command is simple:
ffplay 'the rtsp address'
Thanks
Update
More tests have been performed.
By specifying rtsp_transport as TCP, ffplay can play the video, but FFmpeg can't access the video. (In the beginning, when both FFmpeg and ffplay worked through my home network, it was UDP)
The FFmpeg command is as follows:
ffmpeg -i rtsp://the_ip_address/axis-media/media.amp -hide_banner -c:v copy -s 1920x1080 -segment_time 00:30:00 -f segment -strftime 1 -reset_timestamps 1 -rtsp_transport tcp "%Y-%m-%d-%H-%M-%S_Test.mp4"
Please help...
Solved by forcing it to use "-rtsp_transport tcp" right before -i.
ffmpeg -rtsp_transport tcp -i rtsp://the_ip_address/axis-media/media.amp -hide_banner -c:v copy -s 1920x1080 -segment_time 00:30:00 -f segment -strftime 1 -reset_timestamps 1 "%Y-%m-%d-%H-%M-%S_Test.mp4"
I'm running the following command using FFMpeg to extract an image every 1 second from a UDP stream:
ffmpeg -i "udp://224.1.2.123:9001" -s 256x144 -vf fps=1 -update 1 test.jpg -y
This works well, but it takes about 5 seconds to actually start producing images. Is there any way to lower the startup time?
The UDP stream uses mpegts format and is encoded with H264/AAC.
Thanks!
I managed to successfully feed ffserver from ffmpeg. ffmpeg takes its input from a pipe:
ffmpeg -loglevel fatal -f image2pipe -re -vcodec png -i - -vcodec libx264 http://localhost:8090/%s.ffm
An external Java process generates SVG/PNG images and pushes them to ffmpeg.
My ffserver config allows me to buffer live feeds in an ffm file without defining the size of the file.
My stream configuration looks like this:
<Stream live2.mjpg>
Feed feed2.ffm
Format mpjpeg
VideoFrameRate 25
VideoSize 640x880
VideoQMin 1
VideoQMax 5
NoAudio
Strict -1
</Stream>
The problem is that, although I can watch the stream in VLC by opening the network URL:
http://0.0.0.0:8090/live2.mjpg
I cannot seek through the already-buffered movie.
Is there a way to seek through the movie, pause, and resume playing from "now"? I have already tried RTSP with h264, mpg and sdp, but without success:
<Stream test1.mpg/sdp/h264>
Format rtp
Feed feed2.ffm
VideoCodec libx264
VideoSize 640x880
VideoQMin 1
VideoQMax 5
NoAudio
Strict -1
VideoFrameRate 25
</Stream>
Is RTSP the solution to this problem, or do I need something else?
Can this be achieved from a dynamic file, since I am using a pipe?
RTSP
RTSP support in ffserver seems a bit sketchy; you could try Darwin Streaming Server or the Live555 media server. The two seem to support some forms of trick-play, at least for VOD. Since you're using a pipe, this probably won't help.
RTMP
Some RTMP servers/clients support in-buffer seeking (Smart Seeking).
About Smart Seek
Adobe Media Server 3.5.3 and Flash Player 10.1 work together to
support smart seeking in VOD streams and in live streams that have a
buffer. [Source].
ffserver doesn't support RTMP output but you can use your ffmpeg command to push your stream directly to the server:
ffmpeg -re -i <input> -f flv rtmp://...
There's a Nginx RTMP module and a C++ RTMP server although it's not very clear if they support smart seeking. VLC seems to be able to seek a bit while paused and there are usually options to modify the size of the client RTMP buffer.
I'm following a guide to live WebM streaming through FFMpeg / FFServer and running into an interesting error. I have tried using a DirectShow webcam source, and also an existing WebM (finite length) video using -vcodec copy. Initially, both will manage to connect to the FFServer (I can see the POST 200 OKs to /feed1.ffm), and maybe even send a frame or two, but then FFMpeg crashes with av_interleaved_write_frame(): Unknown error. (Meanwhile, FFServer appears to be fine.)
This appears to be an unusual variant of the error; it's more common to get, say, av_interleaved_write_frame(): I/O error (which indicates file corruption). Has anyone seen this error, and better yet, can anyone tell me how to fix it?
FFMpeg commands
ffmpeg -re -i univac.webm -vcodec copy -acodec copy -f webm http://[my server]/feed1.ffm
ffmpeg -f dshow -i video="[my dshow source]" -f webm http://[my server]/feed1.ffm
FFserver command
ffserver -f ffserver.conf
ffserver.conf
This is only a slight variation on the one provided in the aforementioned guide.
Port 8080
BindAddress 0.0.0.0
MaxHTTPConnections 2000
MaxClients 5
# MaxBandwidth 10000
CustomLog -
NoDaemon
<Feed feed1.ffm>
File ./feed1.ffm
FileMaxSize 1G
ACL allow [IP of the machine with ffmpeg]
</Feed>
<Stream test.webm>
Feed feed1.ffm
Format webm
# Audio settings
AudioCodec vorbis
AudioBitRate 64
# Video settings
VideoCodec libvpx
VideoSize 640x480
VideoFrameRate 30
AVOptionVideo flags +global_header
AVOptionVideo cpu-used 0
AVOptionVideo qmin 10
AVOptionVideo qmax 42
AVOptionVideo quality good
AVOptionAudio flags +global_header
VideoBitRate 400
# Streaming settings
PreRoll 15
StartSendOnKey
</Stream>
FFserver logs
avserver version 0.8.6-6:0.8.6-1ubuntu2, Copyright (c) 2000-2013 the Libav developers
built on Mar 30 2013 with gcc 4.7.2
AVserver started
[current time] - [GET] "/feed1.ffm HTTP/1.1" 200 4149
[current time] - [POST] "/feed1.ffm HTTP/1.1" 200 4096
This is probably caused by using different versions of ffmpeg and ffserver.
Try to use the same version; then they should work without a problem.
In addition, use only Libav tools or only FFmpeg tools (your log shows avserver, which is Libav's version), because the two projects are not fully compatible with each other.
The connection was established over TCP, and after that I got the error 'av_interleaved_write_frame(): Unknown error' on the client and 'Connection timed out' on the server.
In my case, I found that another process was listening on the same port that ffmpeg was configured to use on the client.
To check which ports are in use:
(Windows) netstat -a -b
(Ubuntu) netstat -a -p
I was also using a custom ffmpeg build inside a folder; running plain 'ffmpeg' picked up the wrong binary, so I changed it to './ffmpeg'.
ffmpeg handles RTMP streaming as input or output, and it's working well.
I want to stream some videos (a dynamic playlist managed by a Python script) to an RTMP server. I'm currently doing something quite simple: streaming my videos one by one to the RTMP server with FFmpeg, but this causes a connection break every time a video ends, and the stream is only ready again when the next video begins.
I would like to stream those videos continuously, without any connection breaks, so the stream can be viewed correctly.
I use this command to stream my videos one by one to the server
ffmpeg -re -y -i myvideo.mp4 -vcodec libx264 -b:v 600k -r 25 -s 640x360 \
-filter:v yadif -ab 64k -ac 1 -ar 44100 -f flv \
"rtmp://mystreamingserver/app/streamName"
I looked for workarounds on the internet for many days, and I found some people talking about using a named pipe as input to ffmpeg. I tried it, but it didn't work well, since ffmpeg not only closes the RTMP stream when a new video arrives but also exits itself.
Is there any way to do this? (Stream a dynamic playlist of videos to an RTMP server with ffmpeg, without connection breaks.)
Update (as I can't delete the accepted answer): the proper solution is to implement a custom demuxer, similar to the concat one. There's currently no other clean way. You have to get your hands dirty and code!
Below is an ugly hack. This is a very bad way to do it, just don't!
The solution uses the concat demuxer and assumes all your source media files use the same codec. The example is based on MPEG-TS but the same can be done for RTMP.
Make a playlist file holding a huge list of entry points for your dynamic playlist, with the following format:
file 'item_1.ts'
file 'item_2.ts'
file 'item_3.ts'
[...]
file 'item_[ENOUGH_FOR_A_LIFETIME].ts'
These files are just placeholders.
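Generating that placeholder list can itself be scripted; a small sketch (make_playlist is a hypothetical name, and 10000 is an arbitrary stand-in for ENOUGH_FOR_A_LIFETIME):

```shell
#!/bin/bash
# Emit a concat-demuxer playlist with N placeholder entries.
make_playlist() {
    local count="$1"
    for i in $(seq 1 "$count"); do
        echo "file 'item_${i}.ts'"
    done
}

make_playlist 10000 > playlist.txt
```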
Make a script that keeps track of your current playlist index and creates symbolic links on the fly for current_index + 1:
ln -s /path/to/what/to/play/next.ts item_1.ts
ln -s /path/to/what/to/play/next.ts item_2.ts
ln -s /path/to/what/to/play/next.ts item_3.ts
[...]
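A minimal sketch of such an index-tracking script (the variable and function names here are assumptions, not from the original answer):

```shell
#!/bin/bash
# Track the current playlist index in a state file and relink the
# placeholder for the next slot. PLAYLIST_DIR, STATE_FILE and
# advance_playlist are hypothetical names; adapt paths to your setup.
PLAYLIST_DIR="${PLAYLIST_DIR:-.}"
STATE_FILE="$PLAYLIST_DIR/current_index"

advance_playlist() {
    local next_clip="$1"
    local idx
    idx=$(cat "$STATE_FILE" 2>/dev/null || echo 0)
    idx=$((idx + 1))
    # -sfn replaces any existing link in place
    ln -sfn "$next_clip" "$PLAYLIST_DIR/item_${idx}.ts"
    echo "$idx" > "$STATE_FILE"
}
```

Each time ffmpeg finishes an item, the script would call advance_playlist /path/to/what/to/play/next.ts so that the placeholder one step ahead always points at real media.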
Start playing
ffmpeg -f concat -i playlist.txt -c copy -f mpegts udp://<ip>:<port>
Get chased and called names by an angry system administrator
You need to create two playlist files, and at the end of each file specify a link to the other file.
list_1.txt
ffconcat version 1.0
file 'item_1.mp4'
file 'list_2.txt'
list_2.txt
ffconcat version 1.0
file 'item_2.mp4'
file 'list_1.txt'
Now all you need to do is dynamically change the contents of the next playlist file.
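A sketch of that rewrite step (write_next_list is a hypothetical helper); writing to a temporary file and renaming it means ffmpeg can never read a half-written list:

```shell
#!/bin/bash
# Rewrite the playlist ffmpeg will read next: one clip, then a chain
# back to the other list. The function name is hypothetical.
write_next_list() {
    local list_file="$1" next_clip="$2" other_list="$3"
    {
        echo "ffconcat version 1.0"
        echo "file '$next_clip'"
        echo "file '$other_list'"
    } > "${list_file}.tmp"
    mv "${list_file}.tmp" "$list_file"   # rename is atomic on one filesystem
}

# Example: while list_1.txt is playing, prepare list_2.txt.
write_next_list list_2.txt item_3.mp4 list_1.txt
```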
You can pipe your loop to a buffer, and from this buffer you pipe to your streaming instance.
In shell it would look like:
#!/bin/bash
for i in *.mp4; do
ffmpeg -hide_banner -nostats -i "$i" -c:v mpeg2video \
[proper settings] -f mpegts -
done | mbuffer -q -c -m 20000k | ffmpeg -hide_banner \
-nostats -re -fflags +igndts \
-thread_queue_size 512 -i pipe:0 -fflags +genpts \
[proper codec setting] -f flv rtmp://127.0.0.1/live/stream
Of course you can use any kind of loop, also looping through a playlist.
I found that mpeg2video is a bit more stable than x264 for the input stream.
I don't know why, but a minimum of 2 threads for the mpeg compression works better.
The input compression needs to be faster than the output frame rate, so we get new input fast enough.
Because of the non-continuous timestamps, we have to skip them and generate new ones in the output (hence -fflags +igndts and +genpts).
The buffer size needs to be big enough to give the loop enough time to fetch the next clip.
Here is a Rust-based solution that uses this technique: ffplayout
It uses a JSON playlist format. The playlist is dynamic, in that you can always edit the current playlist, changing tracks or adding new ones.
Very late answer, but I recently ran into the exact same issue as the poster above.
I solved this problem by using OBS and the OBS websockets plugin.
First, set up your RTMP streaming app as you have it now, but stream to a local RTMP server.
Then have OBS load this local RTMP stream as a VLC source layer.
Then (in your app), using the OBS websockets plugin, have your VLC source switch to a static black video or PNG file when a video ends, and switch back to the RTMP stream once the next video starts. This prevents the final RTMP stream from stopping when a video ends: OBS goes black during the short transition, but the final OBS RTMP output never stops.
There is surely a way to do this by manually setting up an intermediate RTMP server that pushes to a final RTMP server, but I find using OBS easier, with little overhead.
I hope this helps others; this solution has been working incredibly well for me.