ffmpeg raw video over udp - macos

I am trying to stream my desktop with as little latency as possible. I am using this command to stream:
ffmpeg -sn -f avfoundation -i '1' -r 10 -vf scale=1920x1080 -tune zerolatency -f rawvideo udp://224.3.0.11:5000
and for client side this command
ffplay -f rawvideo -pixel_format uyvy422 -framerate 10 -video_size 1920x1080 -fs -i udp://224.3.0.11:5000
The issue I am having is shown in this screenshot from the client side. Does anyone know what I can do to stop this issue?

Because of UDP.
UDP is an unordered, unreliable protocol: datagrams can arrive out of order or be dropped entirely. The video decoder therefore did not receive the video packets in the correct order, which causes the glitches in your video stream.
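A common mitigation, sketched here as an assumption rather than a guaranteed fix: instead of sending raw frames, encode the video and wrap it in a container that carries timing and sync information, such as MPEG-TS over UDP. Raw video has no packet boundaries the decoder can resynchronize on, so a single lost or reordered datagram corrupts every following frame, while MPEG-TS packets are self-synchronizing. The device index and multicast address below are taken from your commands:
ffmpeg -f avfoundation -i '1' -r 10 -vf scale=1920:1080 -c:v libx264 -tune zerolatency -f mpegts udp://224.3.0.11:5000
ffplay -fflags nobuffer udp://224.3.0.11:5000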

Related

Extracting frames from video while recording using ffmpeg

I am using ffmpeg to record a video using a Raspberry Pi with its camera module.
I would like to run an image classifier at a regular interval, for which I need to extract a frame from the stream.
This is the command I currently use for recording:
$ ffmpeg -f video4linux2 -input_format h264 -video_size 1280x720 -framerate 30 -i /dev/video0 -vcodec copy -an test.h264
In other threads this command is recommended:
ffmpeg -i file.mpg -r 1/1 $filename%03d.bmp
I don't think this is intended to be used with files that are still being appended to, and I get the error "Cannot use -sseof, duration of test.h264 not known".
Is there any way that ffmpeg allows this?
I don't have a Raspberry Pi set up with a camera at the moment to test with, but you should be able to simply append a second output to your original command, as follows, to get, say, 1 frame per second of BMP images:
ffmpeg -f video4linux2 -input_format h264 -video_size 1280x720 -framerate 30 -i /dev/video0 -vcodec copy -an test.h264 -r 1 frame-%03d.bmp
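If the classifier only ever needs the most recent frame, a hedged variant (assuming your ffmpeg build's image2 muxer supports the -update option; the filename latest.bmp is just an example) overwrites a single image instead of accumulating numbered files:
ffmpeg -f video4linux2 -input_format h264 -video_size 1280x720 -framerate 30 -i /dev/video0 -vcodec copy -an test.h264 -r 1 -update 1 latest.bmp
The classifier can then read latest.bmp on its own schedule.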

How to get non-decoded h264 stream from the webcam using ffmpeg?

I want to get a file in non-decoded (raw) H.264 format to use in another client application. I know how to save to disk using the command below from the docs.
Example to encode video from /dev/video0:
ffmpeg -f v4l2 -framerate 25 -video_size 640x480 -i /dev/video0 output.mp4
High-level diagram
This is a typical producer and consumer problem:
Webcam =============> ffmpeg writes the video stream into a file (producer)
                                     ^
                                     |
                                     |
Client ______________________________|
(consumer)
// reads only the non-decoded H.264 format from the file
Use
ffmpeg -f v4l2 -framerate 25 -video_size 640x480 -i /dev/video0 output.mp4 -c copy out.h264
out.h264 is the received H.264 bitstream, saved as a file.
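Note that -c copy can only produce an H.264 bitstream if the camera actually delivers H.264. A hedged variant, assuming your webcam exposes a native H.264 format (you can check with v4l2-ctl --list-formats), requests it explicitly at the input:
ffmpeg -f v4l2 -input_format h264 -framerate 25 -video_size 640x480 -i /dev/video0 -c copy out.h264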
I found this as a solution:
ffmpeg -pix_fmt yuv420p -y -f v4l2 -vcodec h264 -i /dev/video0 out.h264

Sending BlackMagic DeckLink Studio 4K over RTMP streams with FFmpeg

I'm trying to send a stream of video that's coming into a BlackMagic DeckLink Studio 4K capture card over a few different RTMP streams at once with FFmpeg. The command I am using is this:
ffmpeg -re -format_code Hi59 -f decklink -i 'DeckLink Studio 4K' -map 0 -flags +global_header -vcodec libx264 -crf 25 -preset medium -pix_fmt yuv422p -acodec aac -f tee "[f=flv]rtmp://ip1/live/test|[f=flv]rtmp://ip2/live/test"
However, whenever I send this video out, I just get color bars when looking at the stream. I tried using a different video source (the testsrc supplied by FFmpeg), and that sends out fine over RTMP to multiple stream destinations.
Is there something weird with how tee and the decklink stuff work in FFmpeg? Or is there an issue with my command?
If you see color bars, that means ffmpeg is connecting to the card and streaming fine, but the card itself is generating the bars. Your command says ffmpeg expects 1920x1080 at 29.97 fps interlaced (format code Hi59); make sure that is the format going into the DeckLink. You can also try explicitly setting the connection type, for example:
ffmpeg -re -format_code Hi59 -video_input sdi -f decklink -i 'DeckLink Studio 4K' -map 0 -flags +global_header -vcodec libx264 -crf 25 -preset medium -pix_fmt yuv422p -acodec aac -f tee "[f=flv]rtmp://ip1/live/test|[f=flv]rtmp://ip2/live/test"
If you are still running into issues, make sure that the BlackMagic software can see the video signal, and it's the format you expect.
One last thing to check: if it's an HDMI input, make sure the source is not HDCP-protected; HDCP is not supported.
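If you are unsure what signal the card is receiving, a hedged check (assuming your ffmpeg build includes the decklink device) is to ask the input to list the formats it supports or detects:
ffmpeg -f decklink -list_formats 1 -i 'DeckLink Studio 4K'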

Stream RTSP from ONVIF general camera to Youtube

I have a wifi camera that uses the RTSP/ONVIF protocol, and after reading the FFmpeg docs and some threads on Google, I am trying to broadcast the stream to YouTube. So I started a broadcast on YouTube, and on my computer I executed this ffmpeg command:
ffmpeg -f lavfi -i anullsrc -rtsp_transport udp -i rtsp://200.193.21.176:6002/onvif1 -tune zerolatency -vcodec libx264 -t 12:00:00 -pix_fmt + -c:v copy -c:a aac -strict experimental -f flv rtmp://x.rtmp.youtube.com/live2/private_key
The command above looks like it's correct, because it constantly outputs something like this:
The problem is that YouTube still says I am offline. Why?
Try replacing the first part with ffmpeg -re -i somefile.mp4; that way you will know whether there are any problems with your camera or not.
ffmpeg and VLC are very similar and even use the same code for the codecs, but they handle RTSP differently. Try just ffmpeg -i rtsp://200.193.21.176:6002/onvif1 and nothing more as the source.
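Also note that the original command mixes -vcodec libx264 with -c:v copy, which conflict, and -pix_fmt + is incomplete. As a hedged sketch of a cleaned-up command, assuming the camera already outputs H.264 (the URL and stream key are from the question; -shortest is an assumption added so the infinite silent anullsrc audio ends with the camera stream):
ffmpeg -f lavfi -i anullsrc -rtsp_transport udp -i rtsp://200.193.21.176:6002/onvif1 -c:v copy -c:a aac -shortest -f flv rtmp://x.rtmp.youtube.com/live2/private_key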

What steps are needed to stream RTSP from FFmpeg?

Streaming UDP is not a problem, but since I want to stream to mobile devices, which can natively read RTSP streams, I couldn't find any setup explaining what exactly is needed. Do I need an RTSP streaming server like LIVE555, or can I use FFmpeg alone?
My Command:
ffmpeg -i space.mp4 -vcodec libx264 -tune zerolatency -crf 18 -f rtsp -muxdelay 0.1 rtsp://192.168.1.200:1234
I get an Input/Output error.
Do I need an SDP description to use RTSP?
And if yes where do I have to put it?
You can use FFserver to stream a video using RTSP.
Just change the console syntax to something like this:
ffmpeg -i space.mp4 -vcodec libx264 -tune zerolatency -crf 18 http://localhost:1234/feed1.ffm
Create an ffserver.config file (sample) where you declare HTTPPort, RTSPPort and the SDP stream. Your config file could look like this (some important stuff might be missing):
HTTPPort 1234
RTSPPort 1235
<Feed feed1.ffm>
File /tmp/feed1.ffm
FileMaxSize 2M
ACL allow 127.0.0.1
</Feed>
<Stream test1.sdp>
Feed feed1.ffm
Format rtp
Noaudio
VideoCodec libx264
AVOptionVideo flags +global_header
AVOptionVideo me_range 16
AVOptionVideo qdiff 4
AVOptionVideo qmin 10
AVOptionVideo qmax 51
ACL allow 192.168.0.0 192.168.255.255
</Stream>
With such a setup you can watch the stream with e.g. VLC by typing:
rtsp://192.168.0.xxx:1235/test1.sdp
Here is the FFserver documentation.
FWIW, I was able to set up a local RTSP server for testing purposes using simple-rtsp-server and ffmpeg, following these steps:
Create a configuration file for the RTSP server called rtsp-simple-server.yml with this single line:
protocols: [tcp]
Start the RTSP server as a Docker container:
$ docker run --rm -it -v $PWD/rtsp-simple-server.yml:/rtsp-simple-server.yml -p 8554:8554 aler9/rtsp-simple-server
Use ffmpeg to stream a video file (looping forever) to the server:
$ ffmpeg -re -stream_loop -1 -i test.mp4 -f rtsp -rtsp_transport tcp rtsp://localhost:8554/live.stream
Once you have that running you can use ffplay to view the stream:
$ ffplay -rtsp_transport tcp rtsp://localhost:8554/live.stream
Note that simple-rtsp-server can also handle UDP streams (instead of TCP), but that's tricky when running the server as a Docker container.
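For reference, a hedged sketch of what the UDP variant could look like, assuming the server's default RTP/RTCP UDP ports 8000 and 8001 (check the documentation for your server version), and with the protocols: [tcp] line removed from the config so UDP is allowed:
$ docker run --rm -it -v $PWD/rtsp-simple-server.yml:/rtsp-simple-server.yml -p 8554:8554 -p 8000:8000/udp -p 8001:8001/udp aler9/rtsp-simple-server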
Another streaming command I've had good results with is piping the ffmpeg output to vlc to create a stream. If you don't have these installed, you can add them:
sudo apt install vlc ffmpeg
In the example I use an MPEG transport stream (TS) over HTTP, instead of RTSP. I've tried both, but the HTTP TS stream seems to work glitch-free on my playback devices.
I'm using an HDMI-to-USB video capture device that sets itself up as an input on the video4linux2 driver. Piping through vlc must be CPU-friendly, because my old dual-core Pentium CPU is able to do the real-time encoding with no dropped frames. I've also had audio-sync issues with some of the other methods, whereas this method always has perfect audio-sync.
You will have to adjust the command for your device or file. If you're using a file as input, you won't need all that v4l2 and alsa stuff. Here's the ffmpeg|vlc command:
ffmpeg -thread_queue_size 1024 -f video4linux2 -input_format mjpeg -i /dev/video0 -r 30 -f alsa -ac 1 -thread_queue_size 1024 -i hw:1,0 -acodec aac -vcodec libx264 -preset ultrafast -crf 18 -s hd720 -vf format=yuv420p -profile:v main -threads 0 -f mpegts -|vlc -I dummy - --sout='#std{access=http,mux=ts,dst=:8554}'
For example, let's say your server PC's IP is 192.168.0.10; then the stream can be played by this command:
ffplay http://192.168.0.10:8554
#or
vlc http://192.168.0.10:8554
UPDATE:
Here is a command to use VLC for RTSP, instead of using rtsp-simple-server:
ffmpeg -thread_queue_size 1024 -f video4linux2 -input_format mjpeg -video_size 1280x720 -r 30 -i /dev/video0 -f alsa -thread_queue_size 1024 -i plughw:CARD=MS2109,DEV=0 -acodec mp2 -vcodec libx264 -preset ultrafast -crf 20 -s hd720 -vf format=yuv420p -profile:v main -f mpegts -|vlc -I dummy - --sout='#rtp{sdp=rtsp://:8554/}' --sout-all --sout-keep
If your PC's IP is 192.168.0.10, then the RTSP stream is played by this command:
vlc rtsp://192.168.0.10:8554/
An alternative that I used instead of FFserver was Red5 Pro. On Ubuntu, I used this line:
ffmpeg -f pulse -i default -f video4linux2 -thread_queue_size 64 -framerate 25 -video_size 640x480 -i /dev/video0 -pix_fmt yuv420p -bsf:v h264_mp4toannexb -profile:v baseline -level:v 3.2 -c:v libx264 -x264-params keyint=120:scenecut=0 -c:a aac -b:a 128k -ar 44100 -f rtsp -muxdelay 0.1 rtsp://localhost:8554/live/paul
