I am trying to capture the output from a DeckLink Studio card with ffmpeg on Windows. How would this be done? Is it even possible?
ffmpeg -video_size 720x576 -rtbufsize 702000k -pixel_format uyvy422 -framerate 25 -f dshow -i video="Decklink Video Capture":audio="Decklink Audio Capture" -c:v libx264 -an -b:v 1200k -s 1280x720 -f flv "rtmp://yourURL/Key"
Of course, it depends on the streaming protocol.
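For example, to record to a local file instead of streaming, only the output side changes. A quick sketch reusing the same dshow device names (the output file name is just an example):
ffmpeg -f dshow -video_size 720x576 -rtbufsize 702000k -pixel_format uyvy422 -framerate 25 -i video="Decklink Video Capture":audio="Decklink Audio Capture" -c:v libx264 -c:a aac output.mkv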
Sure. You just need to use the DeckLink SDK.
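In practice that means using an ffmpeg build compiled against the DeckLink SDK (configured with --enable-decklink), which gives you a native decklink input instead of going through DirectShow. A minimal sketch, assuming the card shows up as "Decklink Studio" and is running PAL; list what your build actually sees first:
ffmpeg -f decklink -list_devices 1 -i dummy
ffmpeg -f decklink -list_formats 1 -i "Decklink Studio"
ffmpeg -f decklink -format_code pal -i "Decklink Studio" -c:v libx264 -c:a aac output.mkv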
I want to add audio to my ffmpeg command line below.
ffmpeg -framerate 25 -video_size 1920x1080 -f x11grab -i :0.0 -vf format=yuv420p http://localhost:8080/feed.ffm
I tried -acodec libmp3lame and -c:a libmp3lame, but it does not seem to work.
You have to provide an audio input. Assuming ALSA:
ffmpeg -framerate 25 -video_size 1920x1080 -f x11grab -i :0.0 -f alsa -sample_rate 48000 -channels 2 -i hw:0 -c:v libx264 -c:a aac -vf format=yuv420p http://localhost:8080/feed.ffm
See FFmpeg ALSA input documentation and FFmpeg Wiki: ALSA.
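If your system uses PulseAudio instead of ALSA, the equivalent sketch (assuming the default source) would be:
ffmpeg -framerate 25 -video_size 1920x1080 -f x11grab -i :0.0 -f pulse -i default -c:v libx264 -c:a aac -vf format=yuv420p http://localhost:8080/feed.ffm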
I want to record my desktop and stream it to YouTube Live with FFmpeg, but the output resolution is very low, 360p at most.
What options do I need to change?
ffmpeg -framerate 30 -f x11grab -i :1 -f pulse -i default -c:v libx264 -s 1920x1080 -r 60 -b:v 5000k -crf 10 -vf format=yuv420p -c:a aac -b:a 128k -f flv rtmp://a.rtmp.youtube.com/live2/stream_key
Problem
By default, x11grab captures the full desktop or window in modern ffmpeg, but only 640x480 in old versions. Your ffmpeg is old, so it is capturing at 640x480, and you are then upscaling 640x480 to 1920x1080, which looks ugly.
Solution 1: Upgrade ffmpeg
Fix this by using a modern ffmpeg version, which will grab the full desktop or window size by default. See the FFmpeg Download page for links, or the FFmpeg compile and install guides.
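You can check which version and build you currently have with:
ffmpeg -version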
Solution 2: Use -video_size input option
ffmpeg -framerate 30 -video_size 1920x1080 -f x11grab -i :0.0 -f pulse -i default -c:v libx264 -b:v 5000k -maxrate 5000k -bufsize 10000k -g 60 -vf format=yuv420p -c:a aac -b:a 128k -f flv rtmp://a.rtmp.youtube.com/live2/stream_key
See the FFmpeg x11grab documentation for more info and options.
For streaming it is recommended to add -g, -bufsize, and -maxrate to enable VBV.
I want to get a file in non-decoded (raw) H.264 format to use in another client application. I know how to record to disk using the command below from the docs.
Example to encode video from /dev/video0:
ffmpeg -f v4l2 -framerate 25 -video_size 640x480 -i /dev/video0 output.mp4
High-level diagram
This is a typical producer/consumer problem:
Webcam =============> ffmpeg writes the video stream into a file. (producer)
                         ^
                         |
                         |
Client __________________|
(consumer: reads only the non-decoded H.264 bitstream from that file)
Use
ffmpeg -f v4l2 -framerate 25 -video_size 640x480 -i /dev/video0 -c:v libx264 output.mp4 -c:v libx264 -f h264 out.h264
out.h264 is the raw (non-decoded) H.264 bitstream, saved as a file.
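You can sanity-check the resulting bitstream with ffprobe, or play it back directly:
ffprobe out.h264
ffplay -f h264 out.h264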
I found this as a solution:
ffmpeg -pix_fmt yuv420p -y -f v4l2 -vcodec h264 -i /dev/video0 out.h264
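If your webcam really does expose H.264 natively (which the -vcodec h264 input option above suggests), you can also avoid re-encoding entirely by copying the camera's bitstream. A sketch, assuming the device advertises an h264 input format (check the first command's output):
ffmpeg -f v4l2 -list_formats all -i /dev/video0
ffmpeg -f v4l2 -input_format h264 -framerate 25 -video_size 640x480 -i /dev/video0 -c:v copy out.h264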
I'm trying to stream my desktop with RTP using ffmpeg.
libx264 seems to work fine. But I would like to test the performance of a hardware accelerated codec.
ffmpeg -re -f dshow -i video="screen-capture-recorder" -vcodec libx264 -tune zerolatency -preset ultrafast -an -f rtp rtp://192.168.0.1
The NVENC encoder works fine in other situations, like this:
ffmpeg -y -rtbufsize 2000M -f gdigrab -framerate 60 -offset_x 0 -offset_y 0 -video_size 1280x1080 -i desktop -c:v nvenc -preset:v fast -pix_fmt nv12 out.mp4
The codec also appears in the list of available codecs.
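To double-check which NVENC encoders your build actually exposes, you can filter the encoder list (on Windows):
ffmpeg -encoders | findstr nvenc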
However, this command doesn't work:
ffmpeg -re -f dshow -i video="screen-capture-recorder" -vcodec nvenc -preset llhq -an -f rtp rtp://192.168.0.1
My machine is a Windows 10 box with a GTX 760.
I found out what was wrong a few weeks ago.
FFmpeg was expecting a GPU with CUDA 8 support; unfortunately, GPUs with the Kepler architecture, like my 760, have only limited CUDA 8 features.
So NVENC could not work properly. The "solution" is to find an FFmpeg release built with CUDA 7 or 7.5 support.
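A quick way to verify that a given build's NVENC encoder works on your GPU at all, independent of screen capture, is to encode a short synthetic test source (the output file name is just an example):
ffmpeg -f lavfi -i testsrc=size=1280x720:rate=30 -t 5 -c:v nvenc -pix_fmt nv12 nvenc_test.mp4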
What steps are needed to stream RTSP from FFmpeg?
Streaming over UDP is not a problem, but since I want to stream to mobile devices that can natively read RTSP streams, I couldn't find any guide explaining exactly what is needed. Do I need an RTSP streaming server like LIVE555, or can I use FFmpeg alone?
My Command:
ffmpeg -i space.mp4 -vcodec libx264 -tune zerolatency -crf 18 -f rtsp -muxdelay 0.1 rtsp://192.168.1.200:1234
I get an Input/Output error.
Do I need a SDP description to use RTSP?
And if yes where do I have to put it?
You can use FFserver to stream a video using RTSP.
Just change the console syntax to something like this:
ffmpeg -i space.mp4 -vcodec libx264 -tune zerolatency -crf 18 http://localhost:1234/feed1.ffm
Create an ffserver configuration file (e.g. ffserver.conf; see the sample) where you declare the HTTPPort, RTSPPort, and the SDP stream. Your config file could look like this (some important settings might be missing):
HTTPPort 1234
RTSPPort 1235
<Feed feed1.ffm>
File /tmp/feed1.ffm
FileMaxSize 2M
ACL allow 127.0.0.1
</Feed>
<Stream test1.sdp>
Feed feed1.ffm
Format rtp
Noaudio
VideoCodec libx264
AVOptionVideo flags +global_header
AVOptionVideo me_range 16
AVOptionVideo qdiff 4
AVOptionVideo qmin 10
AVOptionVideo qmax 51
ACL allow 192.168.0.0 192.168.255.255
</Stream>
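Then launch ffserver with that configuration before running the ffmpeg command above (assuming you saved it as ffserver.conf):
ffserver -f ffserver.conf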
With this setup you can watch the stream with, for example, VLC by opening:
rtsp://192.168.0.xxx:1235/test1.sdp
Here is the FFserver documentation.
FWIW, I was able to set up a local RTSP server for testing purposes using simple-rtsp-server and ffmpeg, following these steps:
Create a configuration file for the RTSP server called rtsp-simple-server.yml with this single line:
protocols: [tcp]
Start the RTSP server as a Docker container:
$ docker run --rm -it -v $PWD/rtsp-simple-server.yml:/rtsp-simple-server.yml -p 8554:8554 aler9/rtsp-simple-server
Use ffmpeg to stream a video file (looping forever) to the server:
$ ffmpeg -re -stream_loop -1 -i test.mp4 -f rtsp -rtsp_transport tcp rtsp://localhost:8554/live.stream
Once you have that running you can use ffplay to view the stream:
$ ffplay -rtsp_transport tcp rtsp://localhost:8554/live.stream
Note that simple-rtsp-server can also handle UDP streams (instead of TCP), but that's tricky when running the server as a Docker container.
Another approach I've had good results with is piping the ffmpeg output to VLC to create the stream. If you don't have these installed, you can add them:
sudo apt install vlc ffmpeg
In this example I use an MPEG transport stream (TS) over HTTP instead of RTSP. I've tried both, but the HTTP TS stream seems to work glitch-free on my playback devices.
I'm using an HDMI-to-USB video capture device that shows up as a video4linux2 input. Piping through VLC must be CPU-friendly, because my old dual-core Pentium CPU is able to do the real-time encoding with no dropped frames. I've also had audio-sync issues with some of the other methods, whereas this method always has perfect audio-sync.
You will have to adjust the command for your device or file. If you're using a file as input, you won't need all that v4l2 and alsa stuff. Here's the ffmpeg|vlc command:
ffmpeg -thread_queue_size 1024 -f video4linux2 -input_format mjpeg -i /dev/video0 -r 30 -f alsa -ac 1 -thread_queue_size 1024 -i hw:1,0 -acodec aac -vcodec libx264 -preset ultrafast -crf 18 -s hd720 -vf format=yuv420p -profile:v main -threads 0 -f mpegts -|vlc -I dummy - --sout='#std{access=http,mux=ts,dst=:8554}'
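If you are streaming a file instead of the capture device, the same pipeline reduces to something like this (input.mp4 is just an example name):
ffmpeg -re -i input.mp4 -c:a aac -c:v libx264 -preset ultrafast -crf 18 -vf format=yuv420p -f mpegts -|vlc -I dummy - --sout='#std{access=http,mux=ts,dst=:8554}'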
For example, let's say your server PC's IP is 192.168.0.10; then the stream can be played with this command:
ffplay http://192.168.0.10:8554
#or
vlc http://192.168.0.10:8554
UPDATE:
Here is a command that uses VLC for RTSP, instead of using rtsp-simple-server:
ffmpeg -thread_queue_size 1024 -f video4linux2 -input_format mjpeg -video_size 1280x720 -r 30 -i /dev/video0 -f alsa -thread_queue_size 1024 -i plughw:CARD=MS2109,DEV=0 -acodec mp2 -vcodec libx264 -preset ultrafast -crf 20 -s hd720 -vf format=yuv420p -profile:v main -f mpegts -|vlc -I dummy - --sout='#rtp{sdp=rtsp://:8554/}' --sout-all --sout-keep
If your PC's IP is 192.168.0.10, then the RTSP stream can be played with this command:
vlc rtsp://192.168.0.10:8554/
An alternative that I used instead of FFServer was Red5 Pro. On Ubuntu, I used this line:
ffmpeg -f pulse -i default -f video4linux2 -thread_queue_size 64 -framerate 25 -video_size 640x480 -i /dev/video0 -pix_fmt yuv420p -bsf:v h264_mp4toannexb -profile:v baseline -level:v 3.2 -c:v libx264 -x264-params keyint=120:scenecut=0 -c:a aac -b:a 128k -ar 44100 -f rtsp -muxdelay 0.1 rtsp://localhost:8554/live/paul
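To check that the publish is working, you can try opening the same path in a player; whether Red5 Pro re-serves the stream at the publish URL depends on the server setup, so treat this only as a sketch:
ffplay -rtsp_transport tcp rtsp://localhost:8554/live/paul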