How to end a specific ffmpeg process?

If I have multiple ffmpeg processes running in the background, for example:
process 1
ffmpeg -re -i "https://mnmedias.api.telequebec.tv/m3u8/29880.m3u8" -filter_complex "null" -acodec aac -vcodec libx264 -f flv ./videos/cut-videos/standard/happens.mp4
process 2
ffmpeg -re -i "https://mnmedias.api.telequebec.tv/m3u8/29880.m3u8" -filter_complex "null" -acodec aac -vcodec libx264 -f flv ./videos/cut-videos/standard/happens2.mp4
process 3
ffmpeg -re -i "https://mnmedias.api.telequebec.tv/m3u8/29880.m3u8" -filter_complex "null" -acodec aac -vcodec libx264 -f flv ./videos/cut-videos/standard/happens3.mp4
How can I end process 3 specifically?

You can use pgrep alone to get the PID:
pgrep -f happens3.mp4
Example with kill:
kill "$(pgrep -f happens3.mp4)"
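If you want the lookup and the kill in a single step, pkill works too. A small stand-in demo (since no real ffmpeg job is assumed to be running here, a `bash -c 'sleep 30'` process plants the marker on a command line):

```shell
# stand-in for an ffmpeg process whose command line contains the
# output name happens3.mp4 (the extra argument becomes bash's $0)
bash -c 'sleep 30' happens3.mp4 &
target=$!

# -f matches against the full command line, so the unique output
# filename is enough to single out process 3; pgrep + kill in one
pkill -f happens3.mp4
```

pkill sends SIGTERM by default, which ffmpeg catches so it can finalize the output file before exiting.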

Try this:
ps -ef | grep '[h]appens3.mp4' | awk '{ print $2 }'
This should give you the PID of that exact process (the [h] bracket trick keeps the grep process itself out of the match).
For this example, let's say your PID is 1234.
To kill it, run the following:
kill 1234
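A note on signal choice: ffmpeg catches SIGINT and SIGTERM and finishes writing the output file before exiting, whereas SIGKILL (-9) aborts immediately and can leave the recording truncated or unplayable. A stand-in sketch (sleep plays the role of the ffmpeg process whose PID you found):

```shell
# launch a stand-in long-running job and capture its PID
sleep 30 &
pid=$!

# graceful shutdown, comparable to pressing q in ffmpeg's console;
# reserve kill -9 for a process that is truly stuck
kill -TERM "$pid"
```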

Related

Can i change audio/video sync in a stream generated by ffmpeg, while it runs?

I am trying this:
ffmpeg -v verbose -re -y -i syncTest.mp4 -af azmq,volume=1 \
-c:v copy -c:a aac ./output.mp4
then invoke
echo 'Parsed_volume_1 volume 0' | ./zmqsend
It works: audio is muted until I invoke it again with volume 1.
But with
ffmpeg -v verbose -re -y -i syncTest.mp4 -af \
azmq,adelay=delays=0S:all=1 -c:v copy -c:a aac ./output.mp4
and then doing something like
echo Parsed_adelay_1 delays 20000S | ./zmqsend
echo Parsed_adelay_1 all 1 | ./zmqsend
it does not work; ffmpeg prints:
78 Function not implemented
Is there really no way to do it?
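That error suggests adelay simply does not implement those commands. One way to check, assuming a reasonably recent ffmpeg build: in the per-filter help, options flagged with T are runtime-settable, and a filter whose options all lack that flag cannot be driven through azmq/zmqsend.

```shell
# compare the option flags of the two filters: volume marks its
# options as runtime-settable (a T in the flag column on recent
# builds), while adelay does not
ffmpeg -hide_banner -h filter=volume
ffmpeg -hide_banner -h filter=adelay
```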

How to use the tee muxer to output RTMP and stdout in one ffmpeg command?

I want to output ffmpeg to one rtmp stream and, at the same time, handle the H264 stream in my own program.
I already tried
ffmpeg -f dshow -i video="Webcam C110" -vcodec libx264 -f tee -map 0:v "xx.mkv|-"
but it did not work.
The base syntax would be:
ffmpeg -f dshow -i video="Webcam C110" -vcodec libx264 -f tee -map 0:v "[f=flv]rtmp://url|[f=h264]pipe:1"
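To try the same tee layout without the webcam, a synthetic source can stand in for it. In this sketch a null branch replaces the RTMP URL, MPEG-TS replaces the H.264 elementary stream (so it runs on any build, since mpeg4 is a native encoder), and wc -c plays the role of the program reading stdin:

```shell
# testsrc replaces the dshow webcam; the first tee branch discards
# its data (stand-in for [f=flv]rtmp://url), the second goes to
# stdout, where your own program would read it
ffmpeg -v error -f lavfi -i testsrc=duration=1:size=320x240:rate=25 \
  -c:v mpeg4 -f tee -map 0:v "[f=null]/dev/null|[f=mpegts]pipe:1" | wc -c
```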

How to pipe the FFmpeg output to multiple ffplay?

I use the following command to pipe the FFmpeg output to two ffplay instances, but it doesn't work:
ffmpeg -ss 5 -t 10 -i input.avi -force_key_frames 00:00:00.000 -tune zerolatency -s 1920x1080 -r 25 -f mpegts output.ts -f avi -vcodec copy -an - | ffplay -i - -f mpeg2video - | ffplay -i -
How can I pipe the FFmpeg output to two (or more) ffplay instances?
I saw this page, but it doesn't work for ffplay (it is for Linux, but my OS is Windows).
There's a Tee-Object (alias tee) in PowerShell, but the process substitution below is a bash feature, so on Windows you would need a bash-like shell (e.g. Git Bash or WSL). You can try:
ffmpeg -re -i [...] -f mpegts - | tee >(ffplay -) | ffplay -
An alternative is to output to a multicast port on the local subnetwork:
ffmpeg -re -i [...] -f mpegts udp://224.0.0.1:10000
You can then connect as many clients as you require on the same address/port:
ffplay udp://224.0.0.1:10000

shell script ffmpeg stops after 2 jobs

I have a pretty simple shell script, and after finishing the first two jobs it just stops and sits there. It doesn't seem to matter what the third job is; if I switch the order, it still never finishes.
Any ideas would be great.
Here is my shell script
for f in "$@"
do
name=$(basename "$f")
dir=$(dirname "$f")
/opt/local/bin/ffmpeg -i "$f" -y -b 250k -deinterlace -vcodec vp8 -acodec libvorbis -nostdin "$dir/webm/${name%.*}.webm"
/opt/local/bin/ffmpeg -i "$f" -y -b 250k -strict experimental -deinterlace -vcodec h264 -acodec aac -nostdin "$dir/mp4/${name%.*}.mp4"
/opt/local/bin/ffmpeg -i "$f" -y -ss 00:00:15.000 -deinterlace -vcodec mjpeg -vframes 1 -an -f rawvideo -s 720x480 "$dir/img/${name%.*}.jpg"
done
Your final ffmpeg line needs -nostdin too: ffmpeg reads interactive commands from stdin by default, which can make scripted jobs stall. Redirecting stdin (`</dev/null`) has the same effect on builds that lack the option.

ffmpeg output to multiple files simultaneously

What format/syntax is needed for ffmpeg to output the same input to several different "output" files? For instance different formats/different bitrates? Does it support parallelism on the output?
The ffmpeg documentation has been updated with lots more information about this and options depend on the version of ffmpeg you use: http://ffmpeg.org/trac/ffmpeg/wiki/Creating%20multiple%20outputs
From the FFmpeg documentation: FFmpeg writes to an arbitrary number of output "files".
Just make sure each output file (or stream) is preceded by the proper output options.
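A minimal runnable sketch of that rule (a synthetic test source and the built-in mpeg4 encoder are used so it works on any build; the filenames are examples): the options immediately before each filename apply to that output only.

```shell
# one input decoded once, two encodes at different bitrates;
# each output's options precede its own filename
ffmpeg -v error -y -f lavfi -i testsrc=duration=1:size=320x240:rate=25 \
  -c:v mpeg4 -b:v 1000k out_high.mp4 \
  -c:v mpeg4 -b:v 300k  out_low.mp4
```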
I use
ffmpeg -f lavfi -re -i 'life=s=300x200:mold=10:r=25:ratio=0.1:death_color=#C83232:life_color=#00ff00,scale=1200:800:flags=16' \
-f lavfi -re -i sine=frequency=1000:sample_rate=44100 -pix_fmt yuv420p \
-c:v libx264 -b:v 1000k -g 30 -keyint_min 60 -profile:v baseline -preset veryfast -c:a aac -b:a 96k \
-f flv "rtmp://yourname.com:1935/live/stream1" \
-f flv "rtmp://yourname.com:1935/live/stream2" \
-f flv "rtmp://yourname.com:1935/live/stream3"
Is there any reason you can't just run more than one instance of ffmpeg? I've had great results with that.
Generally, I run ffmpeg once on the source file to get it to a base standard (say, a higher-quality H.264 MP4 file). This makes the other jobs run more quickly, since any issues in the source file are cleaned up in this first pass.
Then use that new file as the input to x number of ffmpeg jobs, for example in bash...
Where you see "..." would be where you'd put all your encoding options.
# create 'base' file
ffmpeg -loglevel error -er 4 -i $INPUT_FILE ... INPUT.mp4 >> $LOG_FILE 2>&1
# the command above will run and then move to start 3 background jobs
# text output will be sent to a log file
echo "base file done!"
# note & at the end to send job to the background
ffmpeg ... -i INPUT.mp4 ... FILENAME1.mp4 ... >/dev/null 2>&1 &
ffmpeg ... -i INPUT.mp4 ... FILENAME2.mp4 ... >/dev/null 2>&1 &
ffmpeg ... -i INPUT.mp4 ... FILENAME3.mp4 ... >/dev/null 2>&1 &
# wait until you have no more background jobs running
wait
echo "done!"
Each of the background jobs will run in parallel and will be (essentially) balanced over your cpus, so you can maximize each core.
based on http://sonnati.wordpress.com/2011/08/30/ffmpeg-–-the-swiss-army-knife-of-internet-streaming-–-part-iv/ and http://ffmpeg-users.933282.n4.nabble.com/Multiple-output-files-td2076623.html
ffmpeg -re -i rtmp://server/live/high_FMLE_stream \
  -acodec copy -vcodec libx264 -s 640x360 -b 500k -vpre medium -vpre baseline rtmp://server/live/baseline_500k \
  -acodec copy -vcodec libx264 -s 480x272 -b 300k -vpre medium -vpre baseline rtmp://server/live/baseline_300k \
  -acodec copy -vcodec libx264 -s 320x200 -b 150k -vpre medium -vpre baseline rtmp://server/live/baseline_150k \
  -acodec libfaac -vn -ab 48k rtmp://server/live/audio_only_AAC_48k
Or you could pipe the output to a tee and have "X" other processes do the actual encoding, like
ffmpeg -i input - | tee ...
which might save CPU, since it lets the encoders run in parallel as separate processes, something a single multi-output ffmpeg apparently doesn't otherwise give you.
see http://ffmpeg.org/trac/ffmpeg/wiki/Creating%20multiple%20outputs and here
I have done it like this:
ffmpeg -re -i nameoffile.mp4 -vcodec libx264 -c:a aac -b:a 160k -ar 44100 -strict -2 -f flv \
-f flv rtmp://rtmp.1.com/code \
-f flv rtmp://rtmp.2.com/code \
-f flv rtmp://rtmp.3.com/code \
-f flv rtmp://rtmp.4.com/code \
-f flv rtmp://rtmp.5.com/code \
but it is not working as well as I expected when restreaming with nginx.
