How to pipe FFmpeg output to multiple ffplay instances? - Windows

I use the following command to pipe the FFmpeg output to two ffplay instances, but it doesn't work:
ffmpeg -ss 5 -t 10 -i input.avi -force_key_frames 00:00:00.000 -tune zerolatency -s 1920x1080 -r 25 -f mpegts output.ts -f avi -vcodec copy -an - | ffplay -i - -f mpeg2video - | ffplay -i -
How can I pipe the FFmpeg output to two (or more) ffplay instances?
I saw this page, but it doesn't work with ffplay (it is for Linux, and my OS is Windows).
Thanks.

PowerShell has a Tee-Object cmdlet (alias tee), but it is not a drop-in for the Unix one: the >(...) syntax below is bash process substitution, so on Windows this needs a bash-like shell such as Git Bash or WSL. You can try:
ffmpeg -re -i [...] -f mpegts - | tee >(ffplay -) | ffplay -
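To see what that line is doing, here is a minimal stand-in (plain text instead of an MPEG-TS stream) showing how tee plus process substitution duplicates one stream to two consumers; run it in bash:

```shell
# duplicate one input stream to two consumers; tee also forwards a copy
# to stdout, which we discard here
printf 'stream-data\n' | tee >(cat > copy1.txt) >(cat > copy2.txt) > /dev/null

# process substitutions run asynchronously, so give them a moment to finish
sleep 1
cat copy1.txt copy2.txt
```

Both files end up with the same bytes, just as each ffplay would receive the same TS stream.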
An alternative is to output to a multicast port on the local subnetwork:
ffmpeg -re -i [...] -f mpegts udp://224.0.0.1:10000
You can then connect as many clients as you require on the same address/port:
ffplay udp://224.0.0.1:10000
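FFmpeg also has a built-in tee muxer that duplicates one encode to several outputs, which sidesteps the shell entirely. A sketch, assuming the input.avi from the question and two arbitrary local ports; note the tee muxer requires explicit -map and codec options:

```shell
# encode once, send the same MPEG-TS to two local UDP ports
ffmpeg -re -i input.avi -map 0 -c:v libx264 -c:a aac \
  -f tee "[f=mpegts]udp://127.0.0.1:10000|[f=mpegts]udp://127.0.0.1:10001"
```

Each ffplay then reads its own port, e.g. ffplay udp://127.0.0.1:10000.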

Related

ffmpeg fails with: pipe:1: Invalid argument

I am trying to record my desktop through a pipe, but ffmpeg fails.
On Windows:
ffmpeg -filter_complex ddagrab=output_idx=0:framerate=5,hwdownload,format=bgra -c:v libx264 -crf 18 -y pipe:1 | cat > test.mp4
On macOS:
ffmpeg -f avfoundation -framerate 5 -capture_cursor 1 pipe:1 | cat > output.mkv
However, on Windows, this command works:
ffmpeg -f gdigrab -i desktop -f mpegts pipe:1 | cat > out.mp4
It turned out that adding -f mpegts solves the problem: the MP4 muxer needs a seekable output to finalize the file, while MPEG-TS can be written straight to a pipe.
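A related trick (an addition, not part of the original answer): MP4 itself can be written to a pipe if the muxer is told to produce a fragmented file, since that layout never seeks back to write the index:

```shell
# frag_keyframe+empty_moov makes the MP4 muxer stream-friendly, so pipe:1 works;
# the capture chain is copied from the question above
ffmpeg -filter_complex ddagrab=output_idx=0:framerate=5,hwdownload,format=bgra \
  -c:v libx264 -crf 18 -movflags frag_keyframe+empty_moov -f mp4 -y pipe:1 | cat > test.mp4
```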

Encode a YouTube live stream to UDP output using youtube-dl and ffmpeg

I am trying to encode a YouTube live stream to a UDP destination using youtube-dl and ffmpeg with the command below:
youtube-dl -f best --buffer-size 2M -o - "https://www.youtube.com/watch?v=tkUvWJiTf9A" | ffmpeg -re -f mp4 -i pipe:0 -codec copy -f mpegts udp://192.168.1.107:1234?pkt_size=1316
But it's not working; it just downloads the .ts segments of the live stream.
When I try with a regular YouTube video, it works fine with the command below:
youtube-dl -f best --buffer-size 2M -o - "https://www.youtube.com/watch?v=snDI6AaL04g" | ffmpeg -re -f mp4 -i pipe:0 -codec copy -f mpegts udp://192.168.1.107:1234?pkt_size=1316
Any help or suggestion appreciated.
I got it solved with the command below, using Streamlink with ffmpeg. Sharing so anyone who needs it can refer to it.
streamlink --hls-segment-threads 10 --ringbuffer-size 10M https://www.youtube.com/watch?v=NMre6IAAAiU 140p,worst --stdout | ffmpeg -i pipe:0 -codec copy -bsf:v h264_mp4toannexb -f mpegts udp://192.168.2.7:1234?pkt_size=1316

ffmpeg: How to end a specific process?

If I have multiple ffmpeg processes running in the background, for example:
process 1
ffmpeg -re -i "https://mnmedias.api.telequebec.tv/m3u8/29880.m3u8" -filter_complex "null" -acodec aac -vcodec libx264 -f flv ./videos/cut-videos/standard/happens.mp4
process 2
ffmpeg -re -i "https://mnmedias.api.telequebec.tv/m3u8/29880.m3u8" -filter_complex "null" -acodec aac -vcodec libx264 -f flv ./videos/cut-videos/standard/happens2.mp4
process 3
ffmpeg -re -i "https://mnmedias.api.telequebec.tv/m3u8/29880.m3u8" -filter_complex "null" -acodec aac -vcodec libx264 -f flv ./videos/cut-videos/standard/happens3.mp4
How can I end process 3 specifically?
You can use pgrep alone to get the PID:
pgrep -f happens3.mp4
Example with kill:
kill "$(pgrep -f happens3.mp4)"
Try this:
ps -ef | grep happens3.mp4 | grep -v grep | awk '{ print $2 }'
This should give you the PID of that exact process. (The grep -v grep filters out the grep command itself, which would otherwise also match.)
For this example, let's say your PID is 1234.
To kill it, run the following
kill 1234
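A shorthand worth knowing: pkill combines the pgrep lookup and the kill into one step. A minimal sketch, using sleep jobs as stand-ins for the three ffmpeg processes:

```shell
# three background jobs standing in for the three ffmpeg processes
sleep 101 &
sleep 102 &
sleep 103 &
sleep 1   # let the jobs start

# -f matches against the full command line, so this ends only "process 3"
pkill -f 'sleep 103'

# clean up the two remaining stand-ins
pkill -f 'sleep 10'
```

With the real commands, pkill -f happens3.mp4 sends SIGTERM to just that one ffmpeg, which, unlike kill -9, gives ffmpeg a chance to finalize the output file.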

From Unix ffmpeg bash pipe to the Windows "universe"

I am trying to translate a script from bash to Windows PowerShell.
I tried to pass the simplest case:
ffmpeg -i video.mp4 -vn -sn -map 0:a:0 -f flac - | ffmpeg -i - -c:a aac oiji.m4a
The result is a failure.
Using -f wav instead: another failure.
Is there a way to make it work?
Thank you.
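No answer is recorded above, but one known pitfall fits the symptoms: classic Windows PowerShell (5.x and earlier) decodes everything flowing through | as text, which corrupts binary streams such as FLAC or WAV; the same pipeline typically works under cmd.exe or a bash shell. And when both ends are ffmpeg anyway, the pipe can be avoided entirely; a sketch using the filenames from the question:

```shell
# do the extract + AAC encode in a single ffmpeg invocation, no pipe involved
ffmpeg -i video.mp4 -vn -sn -map 0:a:0 -c:a aac oiji.m4a
```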

Capturing and streaming with ffmpeg while displaying locally

I can capture with ffmpeg from a device, I can transcode the audio/video, I can stream it to ffserver.
How can I capture and stream with ffmpeg while showing locally what is captured?
Up to now I've been using VLC to capture and stream to localhost, then ffmpeg to pick up that stream, transcode it again, and stream it to ffserver.
I'd like to do this using ffmpeg only.
Thank you.
Option A: Use ffmpeg with multiple outputs and a separate player:
output 1: copy source without transcoding and pipe it or send it to a local port
output 2: transcode and send to server
Example using ffplay:
ffmpeg -f x11grab [grab parameters] -i :0.0 \
[transcode parameters] -f [transcode output] \
-f rawvideo - | ffplay -f rawvideo [grab parameters] -i -
Option B: ffmpeg only, rendering with OpenGL into an SDL window (requires SDL and a build configured with --enable-opengl):
ffmpeg -f x11grab [grab parameters] -i :0.0 \
[transcode parameters] -f [transcode output] \
-f opengl "Window title"
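A similar single-process variant, assuming a build that includes the sdl output device (check ffmpeg -devices; the UDP address here is an arbitrary example): send the transcode to the network and a second, raw output to a local preview window:

```shell
# output 1: transcoded stream to the network; output 2: SDL preview window
ffmpeg -f x11grab -framerate 25 -i :0.0 \
  -c:v libx264 -f mpegts udp://192.168.1.107:1234 \
  -pix_fmt yuv420p -f sdl "Local preview"
```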
You can also use tee separately, which proved more reliable for me (I couldn't get aergistal's solution to work):
cat file | tee >(program_1) [...] >(program_n) | destination
In this case:
ffmpeg -i rtsp://url -codec:a aac -b:a 192k -codec:v copy -f mpegts - | \
tee >(ffplay -f mpegts -i -) | \
ffmpeg -y -f mpegts -i - -c copy /path/to/file.mp4
(Tested with ffmpeg v:3.2.11 [current in Debian stable])
