av_interleaved_write_frame() unknown error when streaming WebM - ffmpeg

I'm following a guide to live WebM streaming through FFMpeg / FFServer and running into an interesting error. I have tried using a DirectShow webcam source, and also an existing WebM (finite length) video using -vcodec copy. Initially, both will manage to connect to the FFServer (I can see the POST 200 OKs to /feed1.ffm), and maybe even send a frame or two, but then FFMpeg crashes with av_interleaved_write_frame(): Unknown error. (Meanwhile, FFServer appears to be fine.)
This appears to be an unusual variant of the error - the more common form is, say, av_interleaved_write_frame(): I/O error (which indicates file corruption). Has anyone seen this error, and better yet, can anyone tell me how to fix it?
FFMpeg commands
ffmpeg -re -i univac.webm -vcodec copy -acodec copy -f webm http://[my server]/feed1.ffm
ffmpeg -f dshow -i video="[my dshow source]" -f webm http://[my server]/feed1.ffm
FFserver command
ffserver -f ffserver.conf
ffserver.conf
This is only a slight variation on the one provided in the aforementioned guide.
Port 8080
BindAddress 0.0.0.0
MaxHTTPConnections 2000
MaxClients 5
# MaxBandwidth 10000
CustomLog -
NoDaemon
<Feed feed1.ffm>
File ./feed1.ffm
FileMaxSize 1G
ACL allow [IP of the machine with ffmpeg]
</Feed>
<Stream test.webm>
Feed feed1.ffm
Format webm
# Audio settings
AudioCodec vorbis
AudioBitRate 64
# Video settings
VideoCodec libvpx
VideoSize 640x480
VideoFrameRate 30
AVOptionVideo flags +global_header
AVOptionVideo cpu-used 0
AVOptionVideo qmin 10
AVOptionVideo qmax 42
AVOptionVideo quality good
AVOptionAudio flags +global_header
VideoBitRate 400
# Streaming settings
PreRoll 15
StartSendOnKey
</Stream>
FFserver logs
avserver version 0.8.6-6:0.8.6-1ubuntu2, Copyright (c) 2000-2013 the Libav developers
built on Mar 30 2013 with gcc 4.7.2
AVserver started
[current time] - [GET] "/feed1.ffm HTTP/1.1" 200 4149
[current time] - [POST] "/feed1.ffm HTTP/1.1" 200 4096

This is probably caused by using different versions of ffmpeg and ffserver.
Try to use the same version of both; matching builds should work without a problem.
In addition, stick to either Libav or FFmpeg tools throughout, because the two projects are not fully compatible with each other. Note that your server log above reports avserver (Libav), while the client command is FFmpeg's ffmpeg.
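A quick way to compare the two builds (assuming both binaries are on your PATH and support the standard -version flag):
ffmpeg -version | head -n 1
ffserver -version | head -n 1
If the version strings don't match (or one of them reports avserver/Libav), install or build a matching pair before retrying.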

The connection was established over TCP, and afterwards I got 'av_interleaved_write_frame(): Unknown error' on the client and 'Connection timed out' on the server.
In my case, another process was listening on the same port that ffmpeg was configured to use on the client.
To check which ports are in use:
(Windows) netstat -a -b
(Ubuntu) netstat -a -p
I was also using a custom ffmpeg build inside a folder, and invoking plain 'ffmpeg' picked up the wrong binary; changing it to './ffmpeg' fixed that.
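A quick sanity check for which binary the shell actually resolves (on Linux; on Windows use 'where ffmpeg' instead):
which ffmpeg
./ffmpeg -version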

Related

How to convert rtsp stream to mjpeg (http) using ffmpeg

I wish to capture an rtsp stream and convert it to an mjpeg (over http) stream using ffmpeg. I am running Ubuntu 20. I have searched and searched for the solution, and mostly find:
a) solutions requiring ffserver (deprecated)
b) solutions converting from mjpeg to rtsp
c) solutions converting from rtsp to hls (nginx, wowza, etc...) which doesn't work in my application. I need http output as mjpeg.
d) vlc - which does work but requires way too much of my available processor (80%)
e) rtsp2mjpg - github project which I installed, but could not get to work and can't get any support.
I am not an ffmpeg expert, so if someone could step me through an ffmpeg solution to this, if it exists, I'd really appreciate it.
I've very recently solved this myself, after finding the exact same things as you. The two parts you need are (1) an ffmpeg conversion in a script, and (2) something like lighttpd+cgi-bin or nginx+fastcgi to serve it over http/https. I don't expect you'll be able to do much better than vlc in terms of CPU use, though.
This bash script will do the ffmpeg conversion to MJPEG, and send the output to stdout. Put this in lighttpd's cgi-bin folder (/var/www/cgi-bin for me). Call it something like "webcamstream", and adjust the rtsp:// URL to suit your camera:
#!/bin/bash
# CGI script: emit the multipart MJPEG HTTP headers, then hand
# ffmpeg's output straight to stdout for lighttpd to pass through.
echo "Content-Type: multipart/x-mixed-replace;boundary=ffmpeg"
echo "Cache-Control: no-cache"
echo ""
# -f mpjpeg writes multipart JPEG (boundary "ffmpeg" by default); -an drops audio.
ffmpeg -i "rtsp://192.168.60.13:554/user=admin&password=SECRET&channel=1&stream=0.sdp" -c:v mjpeg -q:v 1 -f mpjpeg -an -
Enable cgi-bin for lighttpd:
ln -s /etc/lighttpd/conf-available/10-cgi.conf /etc/lighttpd/conf-enabled/10-cgi.conf
...and then adjust lighttpd's cgi-bin configuration (/etc/lighttpd/conf-enabled/10-cgi.conf) as shown below. The stream-response-body setting is important: it both stops the stream when the client disconnects and prevents lighttpd from trying to buffer the entire infinite stream before sending anything to the client.
server.modules += ( "mod_cgi" )
$HTTP["url"] =~ "^/cgi-bin/" {
server.stream-response-body = 2
cgi.assign = ( "" => "" )
alias.url += ( "/cgi-bin/" => "/var/www/cgi-bin/" )
}
Make the cgi-bin script executable and restart lighttpd:
chmod +x /var/www/cgi-bin/webcamstream
systemctl restart lighttpd
...and that should be it. You can then access the MJPEG stream at a URL like this, where the last part is your script's name:
http://serveraddress/cgi-bin/webcamstream
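To sanity-check the stream without a browser, you can point ffplay at the same URL (substituting your real server address):
ffplay http://serveraddress/cgi-bin/webcamstream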
I've written it up in more detail here: Converting RTSP to HTTP on demand
As far as I can tell, you can't avoid taking the CPU hit of the conversion -- the codec arriving over RTSP (typically H.264) is not MJPEG, so every frame has to be decoded and re-encoded. I reduced my CPU load by configuring the camera to lower the source's framerate and resolution until the load on ffmpeg was acceptable. You can also change the resolution and framerate with ffmpeg arguments, but ffmpeg would still have to decode the full frames first and do the work of resizing.
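For example, something like this would downscale and reduce the framerate before encoding (a sketch; the scale=640:360 and -r 10 values are illustrative):
ffmpeg -i "rtsp://192.168.60.13:554/user=admin&password=SECRET&channel=1&stream=0.sdp" -vf scale=640:360 -r 10 -c:v mjpeg -q:v 5 -f mpjpeg -an -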
The paths above are on Debian, so you may need to adjust them to suit your Ubuntu system.
Convert RTSP to MJPEG via FFSERVER
ffmpeg download:
https://ffmpeg.org/releases/
Choose a version older than 3.4 (ffserver was removed in 3.4); 3.2.16 is recommended.
Compile
./configure --prefix=/u/tool/ffserver
make && make install
FFSERVER
cd /u/tool/ffserver/bin
Edit ffserver.conf:
HTTPPort 8090
HTTPBindAddress 0.0.0.0
MaxHTTPConnections 2000
MaxClients 1000
MaxBandwidth 10000
<Feed feed.ffm>
File /tmp/feed.ffm
FileMaxSize 50M
</Feed>
<Stream live.mjpeg>
Feed feed.ffm
Format mpjpeg
VideoFrameRate 5
VideoIntraOnly
VideoSize 720x405
VideoQMin 5
VideoQMax 20
NoAudio
Strict -1
NoDefaults
</Stream>
<Stream still.jpg>
Feed feed.ffm
Format jpeg
VideoFrameRate 2
VideoSize 720x404
VideoQMin 1
VideoQMax 15
VideoIntraOnly
NoAudio
Strict -1
NoDefaults
</Stream>
Run FFSERVER
./ffserver -f ./ffserver.conf
Pixel size error
Scaling 1280x720 down to a width of 720 gives 720x405 (405 = 720 x 720 / 1280). If you use VideoSize 720x405, ffserver fails at startup with:
ffserver.conf "Image size is not a multiple of 2"
405 is odd, so change it to 404.
FEED STREAMING
Do not use the system ffmpeg build for the feed: ffserver support was removed after version 3.4.
Use the ffmpeg you just compiled, from the same directory as ffserver.
./ffmpeg -rtsp_transport tcp -i "rtsp://127.0.0.1:8554/000FFC52F1D3" -r 15 -an http://127.0.0.1:8090/feed.ffm
Browse mjpeg:
http://192.168.1.17:8090/live.mjpeg
Browse snap image:
http://192.168.1.17:8090/still.jpg
mjpeg status
http://localhost/tool/mjpeg.htm
To keep the MJPEG image updating even if the RTSP source stops, reload the image path in JavaScript every N seconds (e.g. 15):
setInterval(function() {
var myImg = $('#myJpeg').attr('src', "http://192.168.1.17:8090/live.mjpeg?rand=" + Math.random());
}, 15000);
run server
ffserver -f /etc/ffserver.conf
run debug mode
ffserver -d -f /etc/ffserver.conf
If you want to run ffmpeg in the background of a console session, pass -nostdin to ffmpeg, or run it in a terminal multiplexer like screen or tmux.
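For example (a sketch reusing the feed command above):
./ffmpeg -nostdin -rtsp_transport tcp -i "rtsp://127.0.0.1:8554/000FFC52F1D3" -r 15 -an http://127.0.0.1:8090/feed.ffm &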

very low latency streaming with ffmpeg using a webcam

I'm trying to configure ffmpeg to do a real-time video streaming using a webcam. The ffmpeg encoder command I use is as follows.
ffmpeg -f v4l2 -input_format yuyv422 -s 640x480 -i /dev/video0 -c:v libx264 -profile:v baseline -trellis 0 -subq 1 -level 32 -preset superfast -tune zerolatency -me_method epzs -crf 30 -threads 0 -bufsize 1 -refs 4 -coder 0 -b_strategy 0 -bf 0 -sc_threshold 0 -x264-params vbv-maxrate=2000:slice-max-size=1500:keyint=30:min-keyint=10: -pix_fmt yuv420p -an -f mpegts udp://192.168.1.8:5001
The ffplay command used to display the video feed is,
ffplay -analyzeduration 1 -fflags -nobuffer -i udp://192.168.1.8:5001
However, I'm experiencing 0.5 - 1.0 s of latency in the video stream. Is there a way to reduce this to under 100 ms? Also, when I replace the v4l2 camera capture with a screen capture using x11grab, the stream is almost real-time and I experience no noticeable delays. Moreover, changing the encoder from x264 to mpeg2 had no effect on the latency. In addition, the statistics from ffmpeg show that the encoder is performing at 30 fps, which I believe indicates that the encoding is real-time. This leaves me with only one possible cause for the delay.
Is there a significant delay in buffers when using v4l2 during video capturing in a webcam?
I don't think the transmission delay is in effect in this case as I see no latencies when screen capture is used under the same conditions.
Can this latency be further reduced? Can someone suggest a different encoder configuration to use instead of the one above?
I also had many problems setting up a low-latency video streaming system between an Odroid SBC and a Windows PC. I finally found settings resulting in roughly 500 ms to at most 1 s of latency.
Setup: ffserver on an Odroid XU4 with Ubuntu 18.04, connected to the network via a WiFi dongle; a Windows 10 PC in the same WiFi network streams from the Odroid.
I run the following ffserver config (/etc/ffserver.conf) on my odroid
HTTPPort 8090
HTTPBindAddress 0.0.0.0
MaxClients 1000
MaxBandwidth 10000
<Feed feed1.ffm>
File /tmp/feed1.ffm
FileMaxSize 50M
ACL allow 127.0.0.1
ACL allow localhost
</Feed>
<Stream test1.asf>
Format asf
Feed feed1.ffm
VideoFrameRate 30
VideoSize 640x480
VideoBitRate 600
#VideoBufferSize 400
VideoQMin 1
VideoQMax 20
NoAudio
ACL allow 127.0.0.1
ACL allow localhost
ACL allow 192.168.0.0 192.168.255.255
</Stream>
<Stream stat.html>
Format status
ACL allow 127.0.0.1
ACL allow localhost
</Stream>
and start the camera stream on the odroid with
ffserver -f /etc/ffserver.conf & ffmpeg -f v4l2 -s 640x480 -r 15 -i /dev/video0 -vcodec libx265 -threads 2 -tune zerolatency http://localhost:8090/feed1.ffm
On my Windows PC I tried several settings to get low latency. With VLC-Player I could not manage anything below 8 to 10 seconds.
With the following ffplay command I got about 500ms latency:
ffplay -fflags -nobuffer -probesize 32 -i mmsh://ubuntu1804:8090/test1.asf
so -sync ext and -analyzeduration 1 did not help in reducing the latency.
The "stream production" on the Odroid also runs with the same low latency when using libx264 instead of libx265 and removing the -threads 2 flag. But increasing the framerate to 30 or increasing the resolution leads to significant delays.
I used the same send instruction, and this ffplay command worked for me:
ffplay -analyzeduration 1 -fflags -nobuffer -probesize 32 -sync ext -i rtmp://localhost/live/STREAM_NAME

CANNOT get FFserver stream going

I want to preface this question with the fact that I am very very new to ffmpeg and even newer to ffserver.
I cannot, for the life of me, get this thing going.
I get:
"Too large number of skipped frames 882933314374 > 60000"
Also, ffplay gives me "first frame is no keyframe".
Here is my ffserver.conf file
HTTPPort 8090
HTTPBindAddress 0.0.0.0
MaxHTTPConnections 1000
MaxClients 10
MaxBandwidth 2000000
NoDefaults
###############################################################################################
<Feed test.ffm>
File /tmp/test.ffm
FileMaxSize 10000M
ACL ALLOW localhost
</Feed>
<Stream status.html>
Format status
# Only allow local people to get the status
ACL allow localhost
</Stream>
<Stream test.avi>
Feed test.ffm
Format avi
ACL ALLOW localhost
ACL ALLOW 192.168.1.0
NoAudio
VideoSize 3840x2160
VideoFrameRate 30
Preroll 10
</Stream>
###############################################################################################
And here is my ffmpeg command
ffmpeg -i smaller.avi http://localhost:8090/test.ffm
I've been fighting with this thing all day, googling like a madman the entire time. What am I doing wrong? Any help will be welcomed enthusiastically.
These are my notes as I'm currently working through a similar process:
Video Streaming from ffserver for Raspberry PI - Unoptimized
Follow this tutorial: (I know people don't like links but this tut worked)
https://oscarliang.com/webcam-streaming-video-raspberry-pi-via-browser/
Download ffmpeg for windows (or linux)
git clone https://git.ffmpeg.org/ffmpeg.git ffmpeg
// Keep your ffserver.conf simple at first
HTTPPort 8090
HTTPBindAddress 0.0.0.0
MaxHTTPConnections 1000
MaxClients 10
MaxBandwidth 2000000
NoDefaults
###############################################################################################
<Feed test.ffm>
File /tmp/test.ffm
FileMaxSize 10M
</Feed>
<Stream test.avi>
Feed test.ffm
Format mjpeg
VideoSize 640x480
VideoFrameRate 20
VideoBitRate 2000
VideoQMin 2
VideoQMax 10
</Stream>
With the simplified config above, the endpoint ends up at http://localhost:8090/test.avi (the tutorial's config names it webcam.mjpeg instead).
Make sure webcam.sh contains:
ffserver -f /etc/ffserver.conf \
& ffmpeg -v verbose \
-r 30 \
-s 640x480 \
-f video4linux2 \
-i /dev/video0 http://localhost:8090/test.ffm
Run the following:
// Use the following instead of vlc as this has faster streaming
Win:
ffplay.exe http://localhost:8090/test.avi
Linux:
ffplay http://localhost:8090/test.avi

FFMpeg/FFServer P2P streaming between embedded device and smartphone

I've been playing for the last couple of days with FFMpeg and FFServer, as I am considering them as candidates for livestreaming between an embedded device (with an integrated HD camera) and various clients (smartphones).
I've managed to achieve the stream using the following config for FFServer:
HTTPPort 1234
RTSPPort 1235
MaxHTTPConnections 2000
MaxClients 1000
MaxBandwidth 512000 # Maximum bandwidth per client
# set this high enough to exceed stream bitrate
CustomLog -
NoDaemon # Remove this if you want FFserver to daemonize after start
<Feed feed.ffm> # This is the input feed where FFmpeg will send
File /tmp/feed.ffm # video stream.
FileMaxSize 512K
</Feed>
<Stream test.h264> # Output stream URL definition
Feed feed.ffm # Feed from which to receive video
Format rtp
# Video settings
VideoCodec libvpx
VideoSize 720x576 # Video resolution
VideoFrameRate 60 # Video FPS
AVOptionVideo flags +global_header # Parameters passed to encoder
# (same as ffmpeg command-line parameters)
AVOptionVideo cpu-used 0
AVOptionVideo qmin 10
AVOptionVideo qmax 42
AVOptionVideo quality good
AVOptionAudio flags +global_header
PreRoll 15
StartSendOnKey
VideoBitRate 400 # Video bitrate
NoAudio
</Stream>
And the following FFMpeg command to send the stream to FFServer:
ffmpeg -rtbufsize 2100M -f dshow -i video="Integrated Camera" -vcodec libx264 http://127.0.0.1:1234/feed.ffm
I also have a simple Android client that plays the RTSP stream using the following URL:
rtsp://mylocalnetworkip:1235/test.h264
But now I am trying to achieve a P2P connection between the embedded device and a smartphone. This has to work over the internet (not just in the LAN) and be capable of UDP hole punching (as Skype does for P2P video calling).
Is this achievable with ffmpeg alone?
Can ffmpeg integrate with a STUN/TURN server such as Coturn to bypass symmetric NATs?

How to convert RTSP stream into flv/swf Stream (w. ffmpeg)?

I want to embed a webcam stream (from a Geovision video server) into a website. Unfortunately only the RTSP stream gives direct access to the video data.
I tried a bunch of different variants. With this version I got no errors:
openRTSP -b 50000 -w 352 -h 288 -f 5 -v -c -u admin password rtsp://xxxxxx.dyndns.org:8554/CH001.sdp | \
ffmpeg -r 5 -b 256000 -f mp4 -i - http://127.0.0.1:8090/feed1.ffm
Unfortunately I get no video. Sometimes I see a single frame of the webcam, but no livestream.
This is my ffserver.conf
Port 8090
BindAddress 0.0.0.0
MaxClients 200
MaxBandwidth 20000
CustomLog /var/log/flvserver/access.log
NoDaemon
# Server Status
<Stream stat.html>
Format status
</Stream>
<Feed feed1.ffm>
File /tmp/feed1.ffm
FileMaxSize 200K
ACL allow 127.0.0.1
</Feed>
# SWF output - great for testing
<Stream test.swf>
# the source feed
Feed feed1.ffm
# the output stream format - SWF = flash
Format swf
#VideoCodec flv
# this must match the ffmpeg -r argument
VideoFrameRate 5
# another quality tweak
VideoBitRate 256K
# quality ranges - 1-31 (1 = best, 31 = worst)
VideoQMin 1
VideoQMax 3
VideoSize 352x288
# webcams don't have audio
NoAudio
</Stream>
What am I doing wrong? The test.swf seems to load forever...
Tried something like this with VLC and it worked for me:
vlc.exe -I http -vv camURL :sout=#transcode{vcodec=h264,vb=0,scale=0,acodec=mp4a,ab=128,channels=2,samplerate=44100}:http{mux=ffmpeg{mux=flv},dst=addr:availablePort}
where camURL is the URL of the camera, addr is the address you want the HTTP stream sent to, and availablePort is the port you want it served on.
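For example, with hypothetical values filled in (camera at 192.168.1.50, serving on all interfaces on port 8081):
vlc.exe -I http -vv rtsp://192.168.1.50:554/CH001.sdp :sout=#transcode{vcodec=h264,vb=0,scale=0,acodec=mp4a,ab=128,channels=2,samplerate=44100}:http{mux=ffmpeg{mux=flv},dst=0.0.0.0:8081}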
You need to insert the quotes in the right way:
cvlc rtsp://192.168.13.162:554/ :sout='#transcode{vcodec=FLV1,vb=2048,fps=25,scale=1,acodec=none,deinterlace}:http{mime=video/x-flv,mux=ffmpeg{mux=flv},dst=0.0.0.0:5555/}' :no-sout-standard-sap :ttl=5 :sout-keep :no-audio --rtsp-caching 10200 --video --no-sout-audio --udp-caching=30000 --http-caching=5000
