Encode a YouTube live stream to UDP output using youtube-dl and ffmpeg

I am trying to encode a YouTube live stream to a UDP destination using youtube-dl and ffmpeg with the command below:
youtube-dl -f best --buffer-size 2M -o - "https://www.youtube.com/watch?v=tkUvWJiTf9A" | ffmpeg -re -f mp4 -i pipe:0 -codec copy -f mpegts udp://192.168.1.107:1234?pkt_size=1316
But it is not working; it just downloads the TS segments of the live stream.
When I try the same pipeline with a regular (non-live) YouTube video, it works fine:
youtube-dl -f best --buffer-size 2M -o - "https://www.youtube.com/watch?v=snDI6AaL04g" | ffmpeg -re -f mp4 -i pipe:0 -codec copy -f mpegts udp://192.168.1.107:1234?pkt_size=1316
Any help or suggestions appreciated.

I got it solved with the command below, using Streamlink together with ffmpeg. Sharing it so anyone who needs it can refer to it.
streamlink --hls-segment-threads 10 --ringbuffer-size 10M https://www.youtube.com/watch?v=NMre6IAAAiU 140p,worst --stdout | ffmpeg -i pipe:0 -codec copy -bsf:v h264_mp4toannexb -f mpegts udp://192.168.2.7:1234?pkt_size=1316
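For reference, a variant of the same pipeline that picks the best available quality instead of 140p,worst (same stream URL and destination as above, adjust to your setup; this is an untested sketch):
streamlink --ringbuffer-size 10M --stdout "https://www.youtube.com/watch?v=NMre6IAAAiU" best | ffmpeg -i pipe:0 -codec copy -bsf:v h264_mp4toannexb -f mpegts udp://192.168.2.7:1234?pkt_size=1316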

Related

Restream m3u8 with python, ffmpeg or nginx

I need to restream m3u8 videos to my service. The project is written in Django. I can't get my m3u8 stream working with this ffmpeg command:
ffmpeg -fflags +igndts -hide_banner -i https://link/stream.m3u8 -c copy -f flv rtmp:127.0.0.1/live/stream
Please tell me how I can do this.
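One thing worth checking first, as an untested sketch rather than a confirmed fix, is whether ffmpeg can read the HLS source at all and whether the RTMP URL is well formed (the scheme is normally followed by a double slash):
ffprobe -hide_banner https://link/stream.m3u8
ffmpeg -fflags +igndts -hide_banner -i https://link/stream.m3u8 -c copy -f flv rtmp://127.0.0.1/live/stream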

ffmpeg play RTSP stream while recording

I successfully record an RTSP stream to a file using ffmpeg with the following command:
ffmpeg -i "rtsp://1.1.1.1:554/user=admin&password=admin&channel=1&stream=1" -acodec copy -vcodec copy -movflags frag_keyframe+empty_moov -y http://www.example.com/rec/1.mp4
Now I need to play the video while ffmpeg is still writing to the file. Is that possible, even if it means changing the file format?
Pipe a second output to ffplay:
ffmpeg -i "rtsp://1.1.1.1:554/user=admin&password=admin&channel=1&stream=1" -acodec copy -vcodec copy -movflags frag_keyframe+empty_moov -y http://www.example.com/rec/1.mp4 -c copy -f nut - | ffplay -

Capturing and streaming with ffmpeg while displaying locally

I can capture from a device with ffmpeg, transcode the audio/video, and stream it to ffserver.
How can I capture and stream with ffmpeg while showing locally what is captured?
Up to now I've been using VLC to capture and stream to localhost, then ffmpeg to get that stream, transcode it again, and stream to ffserver.
I'd like to do this using ffmpeg only.
Thank you.
Option A: Use ffmpeg with multiple outputs and a separate player:
output 1: copy source without transcoding and pipe it or send it to a local port
output 2: transcode and send to server
Example using ffplay
ffmpeg -f x11grab [grab parameters] -i :0.0 \
[transcode parameters] -f [transcode output] \
-f rawvideo - | ffplay -f rawvideo [grab parameters] -i -
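For instance, a filled-in version of that command, assuming a 1920x1080 desktop grabbed at 30 fps and a placeholder RTMP ingest URL (all values to adapt):
ffmpeg -f x11grab -framerate 30 -video_size 1920x1080 -i :0.0 \
-c:v libx264 -preset veryfast -f flv rtmp://example.com/live/stream \
-pix_fmt yuv420p -f rawvideo - | ffplay -f rawvideo -pixel_format yuv420p -video_size 1920x1080 -framerate 30 -i -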
Option B: ffmpeg only, with OpenGL output in an SDL window (requires SDL and a build configured with --enable-opengl):
ffmpeg -f x11grab [grab parameters] -i :0.0 \
[transcode parameters] -f [transcode output] \
-f opengl "Window title"
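Similarly, a filled-in version of Option B under the same assumptions (the opengl device is only present in builds configured with --enable-opengl):
ffmpeg -f x11grab -framerate 30 -video_size 1920x1080 -i :0.0 \
-c:v libx264 -preset veryfast -f flv rtmp://example.com/live/stream \
-f opengl "Local preview"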
You can also use the tee command separately, which in my case was less error prone (I couldn't get aergistal's solution to work):
cat file | tee >(program_1) [...] >(program_n) | destination
In this case:
ffmpeg -i rtsp://url -codec:a aac -b:a 192k -codec:v copy -f mpegts - | \
tee >(ffplay -f mpegts -i -) | \
ffmpeg -y -f mpegts -i - -c copy /path/to/file.mp4
(Tested with ffmpeg v:3.2.11 [current in Debian stable])

How to push a video list to an RTMP server and keep the connection alive

Current method:
ffmpeg -re -i 1.mp4 -f flv "rtmp://example.com/live"
ffmpeg -re -i 2.mp4 -f flv "rtmp://example.com/live"
ffmpeg -re -i 3.mp4 -f flv "rtmp://example.com/live"
...
But after 1.mp4 finishes pushing, the client and server get disconnected. I want to keep the connection alive.
Try the concat demuxer.
Create a list (myfiles.txt):
file '1.mp4'
file '2.mp4'
file '3.mp4'
...
Then
ffmpeg -f concat -i myfiles.txt -f flv "rtmp://example.com/live"
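If the playlist should also repeat forever so the RTMP connection never drops, a variant along these lines may work (here -re paces reading at native speed, -stream_loop -1 loops the input, and -c copy avoids re-encoding; exact behaviour can vary between ffmpeg versions):
ffmpeg -re -stream_loop -1 -f concat -safe 0 -i myfiles.txt -c copy -f flv "rtmp://example.com/live"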

How to stream webcam video using ffmpeg?

I am very new to ffmpeg and just read some examples on how to open a video file and decode its stream.
But is it possible to open a webcam's stream, something like:
http://192.168.1.173:80/live/0/mjpeg.jpg?x.mjpeg
Is there any examples/tutorials on this?
I need to use ffmpeg as the decoder for this stream in my own Qt-based program.
Nyaruko,
First, check whether your webcam is supported:
ffmpeg -y -f vfwcap -i list
Next, to capture and encode:
ffmpeg -y -f vfwcap -r 25 -i 0 out.mp4
This site has helpful info:
http://www.area536.com/projects/streaming-video/
Best of luck.
This works for live video streaming:
ffplay -f dshow -video_size 1280x720 -i video="your webcam name"
The other option using ffmpeg is:
ffmpeg -f dshow -video_size 1280x720 -i video="your webcam name" -f sdl2 -
Both of the above solutions are provided by FFmpeg.
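Since the question mentions a network camera exposing an MJPEG URL, ffplay and ffmpeg can usually open such a URL directly as well; a rough sketch using the URL from the question, with illustrative encoder settings and a placeholder UDP destination:
ffplay -i "http://192.168.1.173:80/live/0/mjpeg.jpg?x.mjpeg"
ffmpeg -i "http://192.168.1.173:80/live/0/mjpeg.jpg?x.mjpeg" -c:v libx264 -preset veryfast -tune zerolatency -f mpegts udp://192.168.1.107:1234?pkt_size=1316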
