
What steps are needed to stream RTSP from FFmpeg?
Streaming over UDP is not a problem, but I want to stream to mobile devices, which can natively read RTSP streams, and I couldn't find any guide that explains exactly what is needed. Do I need an RTSP streaming server like LIVE555, or can I use FFmpeg alone?
My Command:
ffmpeg -i space.mp4 -vcodec libx264 -tune zerolatency -crf 18 -f rtsp -muxdelay 0.1 rtsp://192.168.1.200:1234
I get an Input/Output error.
Do I need an SDP description to use RTSP?
And if so, where do I have to put it?

You can use FFserver to stream a video using RTSP.
Just change the command to something like this:
ffmpeg -i space.mp4 -vcodec libx264 -tune zerolatency -crf 18 http://localhost:1234/feed1.ffm
Create an ffserver.config file (sample) in which you declare HTTPPort, RTSPPort, and the SDP stream. Your config file could look like this (some important stuff might be missing):
HTTPPort 1234
RTSPPort 1235
<Feed feed1.ffm>
File /tmp/feed1.ffm
FileMaxSize 2M
ACL allow 127.0.0.1
</Feed>
<Stream test1.sdp>
Feed feed1.ffm
Format rtp
Noaudio
VideoCodec libx264
AVOptionVideo flags +global_header
AVOptionVideo me_range 16
AVOptionVideo qdiff 4
AVOptionVideo qmin 10
AVOptionVideo qmax 51
ACL allow 192.168.0.0 192.168.255.255
</Stream>
With such a setup you can watch the stream with e.g. VLC by opening:
rtsp://192.168.0.xxx:1235/test1.sdp
Here is the FFserver documentation.
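Putting the pieces together, a minimal sketch of the full workflow might look like this (the config path and running ffserver in the background are my assumptions, not part of the original answer):
# start ffserver with the config above (adjust the path to wherever you saved it)
ffserver -f ./ffserver.config &
# push the encoded feed into ffserver; feed1.ffm must match the <Feed> name in the config
ffmpeg -i space.mp4 -vcodec libx264 -tune zerolatency -crf 18 http://localhost:1234/feed1.ffm
# then watch from another machine on the LAN
vlc rtsp://192.168.0.xxx:1235/test1.sdp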

FWIW, I was able to set up a local RTSP server for testing purposes using simple-rtsp-server and ffmpeg, following these steps:
Create a configuration file for the RTSP server called rtsp-simple-server.yml with this single line:
protocols: [tcp]
Start the RTSP server as a Docker container:
$ docker run --rm -it -v $PWD/rtsp-simple-server.yml:/rtsp-simple-server.yml -p 8554:8554 aler9/rtsp-simple-server
Use ffmpeg to stream a video file (looping forever) to the server:
$ ffmpeg -re -stream_loop -1 -i test.mp4 -f rtsp -rtsp_transport tcp rtsp://localhost:8554/live.stream
Once you have that running you can use ffplay to view the stream:
$ ffplay -rtsp_transport tcp rtsp://localhost:8554/live.stream
Note that simple-rtsp-server can also handle UDP streams (instead of TCP), but that's tricky when running the server as a Docker container.
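If you do want UDP, one possible workaround (untested, and an assumption on my part) is to run the container with host networking, so the RTP/RTCP UDP ports don't each need to be published, and to drop the protocols line from the config:
$ docker run --rm -it --network=host -v $PWD/rtsp-simple-server.yml:/rtsp-simple-server.yml aler9/rtsp-simple-server
With that, the -rtsp_transport tcp option on the ffmpeg and ffplay commands can be omitted.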

Another streaming command I've had good results with is piping the ffmpeg output to vlc to create a stream. If you don't have these installed, you can add them:
sudo apt install vlc ffmpeg
In this example I use an MPEG transport stream (TS) over HTTP instead of RTSP. I've tried both, but the HTTP TS stream seems to work glitch-free on my playback devices.
I'm using an HDMI-to-USB video capture device that sets itself up as an input on the video4linux2 driver. Piping through vlc must be CPU-friendly, because my old dual-core Pentium CPU is able to do the real-time encoding with no dropped frames. I've also had audio-sync issues with some of the other methods, whereas this method always has perfect audio-sync.
You will have to adjust the command for your device or file. If you're using a file as input, you won't need all that v4l2 and ALSA stuff (a file-based sketch follows the playback examples below). Here's the ffmpeg|vlc command:
ffmpeg -thread_queue_size 1024 -f video4linux2 -input_format mjpeg -i /dev/video0 -r 30 -f alsa -ac 1 -thread_queue_size 1024 -i hw:1,0 -acodec aac -vcodec libx264 -preset ultrafast -crf 18 -s hd720 -vf format=yuv420p -profile:v main -threads 0 -f mpegts -|vlc -I dummy - --sout='#std{access=http,mux=ts,dst=:8554}'
For example, let's say your server PC's IP is 192.168.0.10; then the stream can be played with either of these commands:
ffplay http://192.168.0.10:8554
#or
vlc http://192.168.0.10:8554
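For reference, here is a stripped-down sketch of the same pipe with a file as input instead of the capture device (untested on your setup; space.mp4 stands in for your own file):
ffmpeg -re -i space.mp4 -acodec aac -vcodec libx264 -preset ultrafast -crf 18 -vf format=yuv420p -f mpegts -|vlc -I dummy - --sout='#std{access=http,mux=ts,dst=:8554}'
-re makes ffmpeg read the file at its native frame rate so the stream plays in real time; playback works the same way as above.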
UPDATE:
Here is a command that uses VLC for RTSP, instead of rtsp-simple-server:
ffmpeg -thread_queue_size 1024 -f video4linux2 -input_format mjpeg -video_size 1280x720 -r 30 -i /dev/video0 -f alsa -thread_queue_size 1024 -i plughw:CARD=MS2109,DEV=0 -acodec mp2 -vcodec libx264 -preset ultrafast -crf 20 -s hd720 -vf format=yuv420p -profile:v main -f mpegts -|vlc -I dummy - --sout '#rtp{sdp=rtsp://:8554/}' --sout-all --sout-keep
If your PC's IP is 192.168.0.10, then the RTSP stream can be played with this command:
vlc rtsp://192.168.0.10:8554/
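The stream should also open in ffplay (my addition, not part of the original answer; same address as above):
ffplay rtsp://192.168.0.10:8554/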

An alternative that I used instead of FFServer was Red5 Pro. On Ubuntu, I used this line:
ffmpeg -f pulse -i default -f video4linux2 -thread_queue_size 64 -framerate 25 -video_size 640x480 -i /dev/video0 -pix_fmt yuv420p -bsf:v h264_mp4toannexb -profile:v baseline -level:v 3.2 -c:v libx264 -x264-params keyint=120:scenecut=0 -c:a aac -b:a 128k -ar 44100 -f rtsp -muxdelay 0.1 rtsp://localhost:8554/live/paul
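As a quick sanity check (my addition, not part of the original answer), the published stream should be playable from the same machine with ffplay, using the same path as in the publish URL:
ffplay rtsp://localhost:8554/live/paul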

Related

Streaming to localhost using FFmpeg doesn't work

I'm trying to stream my screen using FFmpeg, but I can't access it using VLC player - it keeps loading the stream and doesn't show anything.
The command I use:
ffmpeg -f gdigrab -s 1920x1080 -i desktop -preset ultrafast -vcodec libx264 -tune zerolatency -b 900k -f rtp rtp://localhost:1234
The network URL I put in VLC:
udp://localhost:1234
What am I doing wrong?
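One likely mismatch (a guess, since no accepted answer is quoted here): the output format is RTP, but VLC is pointed at a plain UDP URL. An RTP stream is normally described by an SDP file, which the player opens instead of the raw socket (the same approach as in the X11grab answer further down). A minimal sketch:
# sender: same command, but also ask ffmpeg to write the SDP description to a file
ffmpeg -f gdigrab -s 1920x1080 -i desktop -preset ultrafast -vcodec libx264 -tune zerolatency -b:v 900k -f rtp rtp://127.0.0.1:1234 -sdp_file stream.sdp
# player: open the SDP file rather than udp://localhost:1234
ffplay -protocol_whitelist file,udp,rtp -i stream.sdp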

Minimal SRT Stream Example with ffmpeg

I'm having a hard time finding a simple way to showcase the SRT streaming protocol with FFmpeg. The only article I've found jumps through multiple hoops to set up a stream. Is there no way to do a simple sender/receiver setup like in the old days with UDP?
Sender:
ffmpeg -i myfile.mp4 -vcodec libx264 -crf 12 -f mpegts udp://192.168.1.5:1234
Receiver:
ffplay udp://192.168.1.5:1234
Your ffmpeg needs to be compiled with --enable-libsrt to support the SRT protocol. See the output of ffmpeg -protocols to determine if it supports SRT.
Untested examples:
# stream copy
ffmpeg -re -i input.mp4 -c copy -f mpegts srt://192.168.1.5:1234
# re-encode
ffmpeg -re -i input.mp4 -c:v libx264 -b:v 4000k -maxrate 4000k -bufsize 8000k -g 50 -f mpegts srt://192.168.1.5:1234
See FFmpeg Protocols Documentation: SRT.
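For completeness, a matching receiver for those examples might look like this (equally untested; it assumes your ffplay build also has libsrt, and the listener must be running before the caller connects):
# run on 192.168.1.5; the sender commands above use the default caller mode
ffplay 'srt://0.0.0.0:1234?mode=listener'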
I also find comprehensive documentation and explanation of how to use SRT lacking. However, here is a minimal example I use:
Sender which acts as a listener and waits for connections:
ffmpeg -i test.mp4 -c:v libx264 -f mpegts 'srt://:40052?mode=listener&latency=20000000'
Receiver which acts as a caller:
ffmpeg -i 'srt://192.168.1.45:40052?mode=caller' -c copy output.mkv
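If you only want to watch the stream rather than record it, ffplay can take the caller role as well (a sketch under the same assumptions; substitute the listener's real address):
ffplay 'srt://<sender-ip>:40052?mode=caller'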

ffmpeg raw video over udp

I am trying to stream the desktop with as little latency as possible. I am using this command to stream:
ffmpeg -sn -f avfoundation -i '1' -r 10 -vf scale=1920x1080 -tune zerolatency -f rawvideo udp://224.3.0.11:5000
and this command for the client side:
ffplay -f rawvideo -pixel_format uyvy422 -framerate 10 -video_size 1920x1080 -fs -i udp://224.3.0.11:5000
The issue I am having is shown in this screenshot from the client side. Does anyone know what I can do to stop this issue?
Because of UDP.
UDP gives no ordering (or delivery) guarantees, so the video decoder may receive the video packets out of order, or not at all, causing the glitches in your video stream.
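One mitigation worth trying (my suggestion, not part of the original answer): instead of sending raw frames, encode and wrap the video in a container such as MPEG-TS, which carries timestamps and continuity counters so the decoder can resynchronize after lost or reordered datagrams. A sketch based on the commands above:
# sender: encode and mux into MPEG-TS over UDP
ffmpeg -sn -f avfoundation -i '1' -r 10 -vf scale=1920x1080 -c:v libx264 -preset ultrafast -tune zerolatency -f mpegts udp://224.3.0.11:5000
# client: the container is self-describing, so no raw-video parameters are needed
ffplay udp://224.3.0.11:5000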

FFmpeg send stream on a web server

To stream my screen, I used:
ffmpeg -s 1920x1080 -f X11grab -i :0.0+0,0 -codec:v libvpx -b:v 4M -b:a libvorbis -crf 20 capture.webm
This command saves the stream to a file: capture.webm.
But now I want to send the stream to a UDP server, so I tried this command:
ffmpeg -s 1920x1080 -f X11grab -i :0.0+0,0 -codec:v libvpx -b:v 4M -b:a libvorbis -crf 20 -f webm udp://192.168.232.2:8080
But it doesn't work.
To send a stream to a server instead of
-f webm udp://192.168.232.2:8080
use
-f rtp rtp://192.168.232.2:32200
where 32200 is some unused port.
To play it from there you can use:
1. ffplay with sdp
2. Set up WebRTC with Janus
3. Publish the video in FLV format to an RTMP server and play it using Flash:
ffmpeg -protocol_whitelist file,udp,rtp -loglevel repeat+info -i source.sdp -flags +global_header -f flv rtmp://127.0.0.1/mystream/mystream1
where source.sdp comes from the output of ffmpeg -s 1920x1080 -f X11grab -i :0.0+0,0 -codec:v libvpx -b:v 4M -b:a libvorbis -crf 20 -f rtp rtp://192.168.232.2:32200 and looks like this:
v=0
o=- 0 0 IN IP4 127.0.1.1
s=No Name
c=IN IP4 192.168.232.2
t=0 0
a=tool:libavformat 57.71.100
m=video 32200 ....
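For option 1, the player side might look like this (a sketch; it assumes source.sdp is saved locally where you run the command):
ffplay -protocol_whitelist file,udp,rtp -i source.sdp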
If you don't want to build a media server and viewing logic in the browser, you can instead send the stream from your screen to one of the existing media servers, for example Wowza, or to Facebook Live (following their instructions).

Seek issue of HEVC streams with ffplay

I'm attempting to perform a seek operation in an MPEG-TS stream that contains an HEVC-encoded bit stream. The HEVC stream is encoded using the following command:
ffmpeg -s:v 1920x1080 -i kimono.yuv -c:v libx265 -x265-params crf=23:fps=30:keyint=10:min-keyint=10 -c:a copy -f mpegts testhevc.ts
The seek operation is attempted with ffplay as:
ffplay testhevc.ts -ss 5 -vf showinfo
The information shown gives multiple initial errors about missing POCs, such as:
Could not find ref with POC 126
However, everything works fine when the same operation is performed with H.264. The encoding with H.264/AVC is performed as:
ffmpeg -s:v 1920x1080 -i kimono.yuv -c:v libx264 -crf 23 -r 30 -keyint_min 10 -g 10 -c:a copy -f mpegts testh264.ts
Is this an issue with the ffmpeg tools for HEVC, or am I missing something in these commands?
Thanks.
