Streaming to localhost using FFmpeg doesn't work

I'm trying to stream my screen using FFmpeg, but I can't access it using VLC player - it keeps loading the stream and doesn't show anything.
The command I use:
ffmpeg -f gdigrab -s 1920x1080 -i desktop -preset ultrafast -vcodec libx264 -tune zerolatency -b 900k -f rtp rtp://localhost:1234
The network URL I put in VLC:
udp://localhost:1234
What am I doing wrong?
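The output format (-f rtp) and the URL opened in VLC have to match. As a sketch of one way to line them up (not taken from an answer in the thread): mux to MPEG-TS over plain UDP, which VLC can open directly without an SDP file. The port 1234 and the gdigrab input are the ones from the question:
ffmpeg -f gdigrab -framerate 30 -video_size 1920x1080 -i desktop -vcodec libx264 -preset ultrafast -tune zerolatency -b:v 900k -f mpegts udp://127.0.0.1:1234
Then open the network URL udp://@:1234 in VLC (the @ tells VLC to listen on that port). If you keep -f rtp instead, VLC generally needs the SDP that ffmpeg prints at startup, saved to a file and opened as the stream.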

Related

ffmpeg only works with 2 instances

I have 3 USB web cameras. I'm using Windows 7.
I need to create 3 video streams:
ffmpeg -f dshow -i video="Full HD webcam" -vcodec libx264 -tune zerolatency -threads 0 -b 900k -f mpegts udp://localhost:1234
ffmpeg -f dshow -i video="#device_pnp_\?\usb#vid_1908&pid_2311&mi_00#8&134fde2a&0&0000#{65e8773d-8f56-11d0-a3b9-00a0c9223196}\global" -vcodec libx264 -tune zerolatency -threads 0 -b 900k -f mpegts udp://localhost:1233
ffmpeg -f dshow -i video="#device_pnp_\?\usb#vid_1908&pid_2311&mi_00#8&962d85&0&0000#{65e8773d-8f56-11d0-a3b9-00a0c9223196}\global" -vcodec libx264 -tune zerolatency -threads 0 -b 900k -f mpegts udp://localhost:1232
But only 2 of the video streams work at once. When I try to create the third video stream I get an error:
Can anybody help me, please?
I tried to create 3 video streams, but could only create 2 instances of the ffmpeg process.
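Not an answer from the thread, just a sketch for double-checking the device setup: ffmpeg can list the DirectShow device names and the formats each camera supports, which is useful here because identical cameras only differ by their alternative (device_pnp) names, and USB bandwidth limits often surface as a format or instance problem. The "Full HD webcam" name is taken from the question:
ffmpeg -list_devices true -f dshow -i dummy
ffmpeg -list_options true -f dshow -i video="Full HD webcam"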

ffmpeg to YouTube with tee mux giving resolution warning (65535x65535), which is not optimal

I am streaming to YouTube and Facebook using ffmpeg, and also writing the data to disk (recording).
It works fine for Facebook and the recording, but YouTube gives this warning:
Please check the video resolution. The current resolution is (65535x65535), which is not optimal.
The output on YouTube also has a 1:1 aspect ratio because of that resolution.
I am using the tee muxer in the ffmpeg command:
ffmpeg -f dshow -framerate 30 -i video="Integrated Webcam":audio="Microphone Array (Intel® Smart Sound Technology (Intel® SST))" -s 1920x1080 -c:v libx264 -r 30 -preset ultrafast -tune zerolatency -crf 28 -pix_fmt yuv420p -c:a aac -strict -2 -ac 2 -b:a 128k -t 4 -map 0 -f tee "[f=ismv]pipe:1 | [f=flv]rtmps://youtube | [f=flv]facebook"
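As a sketch only, not a confirmed fix: the tee muxer examples in the FFmpeg documentation add -flags +global_header when one encode is split across several outputs, and that flag is a common suggestion when an FLV output ends up reporting a bogus resolution. The command stays the same apart from the extra flag; the rtmps://youtube and facebook URLs are the placeholders from the question:
ffmpeg -f dshow -framerate 30 -i video="Integrated Webcam":audio="Microphone Array (Intel® Smart Sound Technology (Intel® SST))" -s 1920x1080 -c:v libx264 -r 30 -preset ultrafast -tune zerolatency -crf 28 -pix_fmt yuv420p -flags +global_header -c:a aac -strict -2 -ac 2 -b:a 128k -t 4 -map 0 -f tee "[f=ismv]pipe:1 | [f=flv]rtmps://youtube | [f=flv]facebook"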

FFmpeg - Streaming from an RTSP server to an RTMP server - losing packets

I'm trying to stream a video from an RTSP server to an RTMP one using FFmpeg.
I've tried multiple arguments for my command:
ffmpeg.exe -re -i "rtsp://10.65.28.251:11442/video/live" -pix_fmt yuv420p -codec:v libx264 -tune animation -preset fast -crf 23 -maxrate 4M -bufsize 8M -f flv "rtmp://10.65.58.21:1935/rec/XB"
ffmpeg.exe -re -i "rtsp://10.65.28.251:11442/video/live" -preset ultrafast -vcodec libx264 -tune zerolatency -b 900k -f flv "rtmp://10.65.52.131:1935/rec/XB"
I'm losing a lot of packets, as seen in the picture. I'm pretty new to FFmpeg, so I'm pretty sure I'm messing up the parameters somehow.
My goal is to get a video on RTMP with at least 30 fps and as few lost packets as possible. If needed, reducing the video quality would be fine.
Any idea what I'm doing wrong?
Thanks!
As kesh pointed out above, removing -re made a big difference. I ended up with this command, which holds pretty good quality at 30 fps.
ffmpeg.exe -i "rtsp://serversource:11442" -filter:v fps=fps=30 -crf 40 -preset ultrafast -vcodec libx264 -f flv "rtmp://servertarget:1935"
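One more knob worth mentioning, as a sketch rather than part of the accepted fix: ffmpeg pulls RTSP sources over UDP by default, and forcing TCP transport on the input side is a common way to stop packets from being dropped between the source and ffmpeg. The URLs are the placeholders used above:
ffmpeg.exe -rtsp_transport tcp -i "rtsp://serversource:11442" -filter:v fps=30 -crf 40 -preset ultrafast -vcodec libx264 -f flv "rtmp://servertarget:1935"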

What steps are needed to stream RTSP from FFmpeg?

Streaming UDP is not a problem, but since I want to stream to mobile devices that can natively read RTSP streams, I couldn't find any guide explaining exactly what is needed. Do I need an RTSP streaming server like LIVE555, or can I use FFmpeg only?
My Command:
ffmpeg -i space.mp4 -vcodec libx264 -tune zerolatency -crf 18 -f rtsp -muxdelay 0.1 rtsp://192.168.1.200:1234
I get an Input/Output error.
Do I need an SDP description to use RTSP?
And if so, where do I have to put it?
You can use FFserver to stream a video using RTSP.
Just change console syntax to something like this:
ffmpeg -i space.mp4 -vcodec libx264 -tune zerolatency -crf 18 http://localhost:1234/feed1.ffm
Create an ffserver.config file (sample) where you declare HTTPPort, RTSPPort and the SDP stream. Your config file could look like this (some important settings might be missing):
HTTPPort 1234
RTSPPort 1235
<Feed feed1.ffm>
File /tmp/feed1.ffm
FileMaxSize 2M
ACL allow 127.0.0.1
</Feed>
<Stream test1.sdp>
Feed feed1.ffm
Format rtp
Noaudio
VideoCodec libx264
AVOptionVideo flags +global_header
AVOptionVideo me_range 16
AVOptionVideo qdiff 4
AVOptionVideo qmin 10
AVOptionVideo qmax 51
ACL allow 192.168.0.0 192.168.255.255
</Stream>
With such a setup you can watch the stream with e.g. VLC by typing:
rtsp://192.168.0.xxx:1235/test1.sdp
Here is the FFserver documentation.
FWIW, I was able to set up a local RTSP server for testing purposes using simple-rtsp-server and ffmpeg, following these steps:
Create a configuration file for the RTSP server called rtsp-simple-server.yml with this single line:
protocols: [tcp]
Start the RTSP server as a Docker container:
$ docker run --rm -it -v $PWD/rtsp-simple-server.yml:/rtsp-simple-server.yml -p 8554:8554 aler9/rtsp-simple-server
Use ffmpeg to stream a video file (looping forever) to the server:
$ ffmpeg -re -stream_loop -1 -i test.mp4 -f rtsp -rtsp_transport tcp rtsp://localhost:8554/live.stream
Once you have that running you can use ffplay to view the stream:
$ ffplay -rtsp_transport tcp rtsp://localhost:8554/live.stream
Note that simple-rtsp-server can also handle UDP streams (instead of TCP), but that's tricky when running the server as a Docker container.
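If you do want UDP, the extra RTP/RTCP ports have to be published as well. A sketch, assuming the server's default UDP ports of 8000 (RTP) and 8001 (RTCP); check your rtsp-simple-server.yml for the actual values:
$ docker run --rm -it -v $PWD/rtsp-simple-server.yml:/rtsp-simple-server.yml -p 8554:8554 -p 8000:8000/udp -p 8001:8001/udp aler9/rtsp-simple-server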
Another streaming command I've had good results with is piping the ffmpeg output to vlc to create a stream. If you don't have these installed, you can add them:
sudo apt install vlc ffmpeg
In the example I use an mpeg transport stream (ts) over http, instead of rtsp. I've tried both, but the http ts stream seems to work glitch-free on my playback devices.
I'm using an HDMI-to-USB video capture device that sets itself up as an input on the video4linux2 driver. Piping through vlc must be CPU-friendly, because my old dual-core Pentium CPU is able to do the real-time encoding with no dropped frames. I've also had audio-sync issues with some of the other methods, whereas this method always has perfect audio sync.
You will have to adjust the command for your device or file. If you're using a file as input, you won't need all that v4l2 and alsa stuff. Here's the ffmpeg|vlc command:
ffmpeg -thread_queue_size 1024 -f video4linux2 -input_format mjpeg -i /dev/video0 -r 30 -f alsa -ac 1 -thread_queue_size 1024 -i hw:1,0 -acodec aac -vcodec libx264 -preset ultrafast -crf 18 -s hd720 -vf format=yuv420p -profile:v main -threads 0 -f mpegts -|vlc -I dummy - --sout='#std{access=http,mux=ts,dst=:8554}'
For example, let's say your server PC's IP is 192.168.0.10; then the stream can be played with either of these commands:
ffplay http://192.168.0.10:8554
#or
vlc http://192.168.0.10:8554
UPDATE:
Here is a command to use VLC for RTSP, instead of using rtsp-simple-server:
ffmpeg -thread_queue_size 1024 -f video4linux2 -input_format mjpeg -video_size 1280x720 -r 30 -i /dev/video0 -f alsa -thread_queue_size 1024 -i plughw:CARD=MS2109,DEV=0 -acodec mp2 -vcodec libx264 -preset ultrafast -crf 20 -s hd720 -vf format=yuv420p -profile:v main -f mpegts -|vlc -I dummy - --sout='#rtp{sdp=rtsp://:8554/}' --sout-all --sout-keep
If your PC ip is 192.168.0.10, then the rtsp stream is played by this command:
vlc rtsp://192.168.0.10:8554/
An alternative that I used instead of FFServer was Red5 Pro. On Ubuntu, I used this line:
ffmpeg -f pulse -i default -f video4linux2 -thread_queue_size 64 -framerate 25 -video_size 640x480 -i /dev/video0 -pix_fmt yuv420p -bsf:v h264_mp4toannexb -profile:v baseline -level:v 3.2 -c:v libx264 -x264-params keyint=120:scenecut=0 -c:a aac -b:a 128k -ar 44100 -f rtsp -muxdelay 0.1 rtsp://localhost:8554/live/paul

FFmpeg not printing SDP information to console

I was trying to create an FFmpeg stream and read it from VLC player, but I am getting an error saying an SDP is required. However, FFmpeg is not printing the SDP information to the console as it is supposed to. How can I get the SDP file for the stream?
Here is the command I am using to stream:
ffmpeg -f dshow -r 10000/1001 -i video="screen-capture-recorder" -vcodec libx264 -tune zerolatency -b 900k -f mpegts rtp://127.0.0.1:1234
I found out what I had done wrong. I was previously using this command:
ffmpeg -f dshow -r 10000/1001 -i video="screen-capture-recorder" -vcodec libx264 -tune zerolatency -b 900k -f mpegts rtp://127.0.0.1:1234
Which I changed to this:
ffmpeg -f dshow -r 10000/1001 -i video="screen-capture-recorder" -vcodec libx264 -tune zerolatency -b 900k -f rtp rtp://127.0.0.1:1234
I just changed "mpegts" to "rtp" since that is the protocol I was using.
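If you need the SDP as a file (for example to open the stream in VLC), ffmpeg can also write it out directly with the -sdp_file option instead of copying it from the console. This is a sketch based on the same command, with stream.sdp as an arbitrary file name:
ffmpeg -f dshow -r 10000/1001 -i video="screen-capture-recorder" -vcodec libx264 -tune zerolatency -b 900k -sdp_file stream.sdp -f rtp rtp://127.0.0.1:1234
Opening stream.sdp in VLC then plays the RTP stream.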
