FFmpeg live stream capturing: is there any option to detect that the stream is down?

I am capturing live video stream using ffmpeg through the following command:
ffmpeg -re -i STREAM_URL -t 3600 c:/test.mp4
Is there any option in ffmpeg that generates a message whenever the stream is down?

It really depends on the input's streaming type; you must first define what kinds of streaming your setup allows (such as HLS or RTMP).
When the streaming URL becomes unavailable mid-stream, FFmpeg in general just returns an EOF flag, so you can't tell whether the stream went down or simply ended.
It can be even trickier with live streaming: when the streaming URL is alive but no stream is being pushed to it, FFmpeg just waits forever without printing any message.
The funny thing is that even at the source-code level you can only catch a few of these conditions. So from the command line alone there is really no way to know whether the stream is down; you must check the server side too.
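Since ffmpeg itself only reports EOF or a nonzero exit code, a common workaround is to wrap the capture in a script that watches the exit status, and to set a read timeout so a silent live source doesn't hang forever. A minimal sketch, assuming a POSIX shell; the wrapper name and messages are made up, and `-rw_timeout` only applies to protocols that support it:

```shell
#!/bin/sh
# Hypothetical wrapper: run a capture command and report when it exits,
# since ffmpeg only signals problems through its exit code.
capture_with_alert() {
    # "$@" is the full capture command (an ffmpeg invocation in real use)
    if "$@"; then
        echo "capture finished normally"
    else
        echo "capture FAILED (stream may be down)"
    fi
}

# Real-world use (STREAM_URL is a placeholder):
#   capture_with_alert ffmpeg -re -rw_timeout 10000000 -i "$STREAM_URL" -t 3600 out.mp4
# -rw_timeout (in microseconds) makes ffmpeg abort instead of waiting
# forever when a live source stops pushing data.
```

The wrapper only distinguishes "exited zero" from "exited nonzero"; as noted above, a clean EOF and a dead stream can look identical, so server-side checks are still needed.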

This works :
ffmpeg -f dshow -benchmark -t 00:00:30 -s 1280x720 -i video0 -b 5000k -report c:/main/sample.avi
Run it in CMD; it will capture 30 seconds of live video and save it along with a report.

Related

FFmpeg CLI - swap RTMP source using ZMQ (zmqsend)

My setup is as follows:
Nginx with the RTMP module
Multiple RTMP stream pairs, each one with a primary and backup RTMP endpoint (so streaming to rtmp://localhost/main/$STREAM_NAME and rtmp://localhost/backup/$STREAM_NAME)
Using the Nginx RTMP module exec_publish and exec_publish_done hooks, I push either main or backup to an FFmpeg CLI proc that restreams it to a remote RTMP endpoint (Wowza server in this case, though it's not very relevant to my question)
My problem is that currently, if the main stream is stopped, I have to stop the FFmpeg CLI process that restreams to Wowza and start another with a new input source (the backup stream). This often causes issues on the Wowza side so I'm looking for a way to avoid that.
After some research, I found that FFmpeg has ZMQ support, but the documentation seems very sparse. Is it possible to send a message to the running FFmpeg process telling it to switch its source to a different RTMP stream?
Thanks a lot,
In case it's of interest to anyone, I solved my problem in a different way.
I now use named pipes, like so:
PIPE_FILE=/path/to/pipe/file
mkfifo $PIPE_FILE
exec 7<>$PIPE_FILE
ffmpeg -nostdin -i /path/to/source -acodec copy -vcodec copy -vbsf h264_mp4toannexb -f mpegts pipe:1 > $PIPE_FILE
/path/to/source can be a media file on the FS or an RTMP stream, for that matter.
I then re-stream from the pipe to the final RTMP endpoint:
ffmpeg -re -i $PIPE_FILE -c:v libx264 -preset veryfast -r 25 -g 50 -f flv $RTMP_ENDPOINT
When $PIPE_FILE stops receiving data (i.e., when streaming stops or, in the case of a local media file, when EOF is reached), I immediately launch a different FFmpeg CLI process and feed the pipe data from the backup media file/stream.
That keeps the re-streaming FFmpeg CLI process continuously up and running.
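The feeder side of this setup can be sketched as a small supervisor script; the paths, the helper name, and the commented failover loop below are assumptions, not the exact script the author used:

```shell
#!/bin/sh
# Sketch of the failover feeder: keep the fifo supplied with data so the
# re-streaming ffmpeg process never sees EOF. Paths are placeholders.
PIPE_FILE=/tmp/stream.pipe
[ -p "$PIPE_FILE" ] || mkfifo "$PIPE_FILE"

feed() {
    # Feed one source into the pipe; returns when that source ends.
    ffmpeg -nostdin -re -i "$1" -c copy -f mpegts - >> "$PIPE_FILE"
}

# In real use, loop forever and fall back to the backup when main stops:
#   while true; do
#       feed "$MAIN_SOURCE" || true
#       feed "$BACKUP_SOURCE" || true
#   done
```

The loop is commented out because `feed` blocks until a reader is attached to the fifo; the consuming ffmpeg process from the answer above must already be running.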
Interesting approach. I've got something similar. Instead of the Pipe, I'm using another local rtmp destination.
I've got an nginx rtmp setup with 3 apps. One is the main app, another the backup app, and another is the distribute app.
So I send the main stream to the main app from my streaming software.
I have an ffmpeg process running:
ffmpeg -i rtmp://127.0.0.1/main/stream1 -c copy rtmp://127.0.0.1/distribute/stream1
If this process breaks due to the input shutting down, I run a similar command to pull input from the backup:
ffmpeg -i rtmp://127.0.0.1/backup/stream1 -c copy rtmp://127.0.0.1/distribute/stream1
From my distribute app I stream to my external outputs.
The only issue here is that I get the non-monotonic DTS error after the switch, so I've had to add a few flags when streaming from distribute to my outputs. The command is:
ffmpeg -fflags +genpts+igndts+ignidx -avoid_negative_ts make_zero -use_wallclock_as_timestamps 1 -i rtmp://127.0.0.1/distribute/stream1 -c:v libx264 -preset veryfast -r 25 -g 50 -c:a aac -b:a 128k -f flv $RTMP_ENDPOINT
I've also noticed I get some warnings in the ffmpeg process when I switch if the main and backup streams come in with different x264 profiles, say one on High and the other on Baseline or Main.
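The switch between main and backup can be driven by a small supervisor loop. Everything below is a hypothetical sketch (the flag file and helper name are made up); the real pull commands from the comment above are shown as comments:

```shell
#!/bin/sh
# Hypothetical supervisor for the main/backup/distribute setup above:
# decide which app to pull from based on whether main is marked live.
pick_source() {
    # $1 = path to a flag file; echo "main" if it exists, else "backup".
    if [ -f "$1" ]; then echo main; else echo backup; fi
}

# In real use the flag file would be touched/removed by nginx-rtmp's
# exec_publish / exec_publish_done hooks, and the loop would restart
# the pull whenever ffmpeg exits:
#   while true; do
#       APP=$(pick_source /tmp/main_is_live)
#       ffmpeg -i "rtmp://127.0.0.1/$APP/stream1" -c copy \
#              "rtmp://127.0.0.1/distribute/stream1"
#   done
```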

How to get continuous live streaming without buffering in Azure Media Player using FFmpeg (latency is not an issue)?

I am streaming from an IP camera which uses the RTSP protocol and ingesting the feed over RTMP (to an Azure media server) using the following command:
ffmpeg -f lavfi -i anullsrc -rtsp_transport tcp -i rtsp://CloudAppUser:admin@192.168.8.145/MediaInput/h264/stream_1 -t 12:00:00 -c:v copy -c:a aac -strict experimental -f flv rtmp://channel1-cloudstream-inso.channel.media.azure.net:1934/live/980b582afc12e421b85b4jifd8e8662b/df
I am able to watch the stream, but it buffers once every 30 seconds, and I want to know the reason behind this buffering.
Could anyone adjust this command so that it does not buffer?
I am executing this command from my terminal.
I would like to watch my live stream in Azure Media Player without any buffering; latency below 1 minute is not an issue.
As documented here, when on-premise encoders are set up to push a contribution feed into a Channel, we recommend that these encoders use fixed 2-second GOPs. If your IP camera is not sending 2-second GOPs, you'd have to modify the ffmpeg command line to re-encode the input video bitstream, not just copy it. If that doesn't help, we recommend contacting us via amshelp@microsoft.com with the (output) stream URL and other details, like the Media Services account name, region used, and date/time/timezone when you attempted to stream the feed.
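In practice, a fixed 2-second GOP means the x264 keyframe interval must be twice the frame rate. A hedged sketch of the re-encode variant (the frame rate and the commented command line are assumptions; `$RTSP_URL` and `$RTMP_INGEST_URL` are placeholders):

```shell
#!/bin/sh
# Compute the keyframe interval for a fixed 2-second GOP.
FPS=25                 # assumed camera frame rate
GOP=$((FPS * 2))       # 2-second GOP -> 50 frames at 25 fps
echo "keyframe interval: $GOP"

# Re-encode instead of copying, forcing regular keyframes
# (a sketch, not the exact Azure-recommended command):
#   ffmpeg -rtsp_transport tcp -i "$RTSP_URL" \
#       -c:v libx264 -r $FPS -g $GOP -keyint_min $GOP -sc_threshold 0 \
#       -c:a aac -f flv "$RTMP_INGEST_URL"
```

`-sc_threshold 0` disables scene-cut keyframes so the GOP length stays exactly fixed, which is what the ingest side expects.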

Capturing and processing a live RTMP stream

I'm trying to download a live stream (not a file) coming from a live camera feed available at the following website: http://www.dot.ca.gov/video/.
I used Wireshark for sniffing the TCP packets and was able to extract the RTMP parameters, but wasn't able to use them with FFMPEG/VLC for downloading / playing the stream on VLC (I guess I didn't construct the URL correctly).
For example, for the camera feed available here, I got the following parameters:
swfUrl: http://www.dot.ca.gov/research/its/StrobeMediaPlayback.swf
pageUrl: http://www.dot.ca.gov/d4/d4cameras/ct-cam-pop-N17_at_Saratoga_Rd.html
tcUrl: rtmp://wzmedia.dot.ca.gov:1935/D4
Play : E37_at_Lakeville_Rd.stream.
Is there a chance someone is familiar with this and can help with understanding how I can use the above for downloading the stream?
Thanks a lot! Yaniv
ffmpeg -re -i "rtmp://wzmedia.dot.ca.gov:1935/D4/E37_at_Lakeville_Rd.stream" -acodec copy -vcodec libx264 -f flv -y ~/save_stream.flv
"-i" means input file and "-y" means overwrite output files.
You can run ffmpeg -h to see the options.

FFMPEG: How to stream to multiple outputs with the same encoding independently

How my stream is working right now:
Input:
A switcher program that captures the camera and screenshots and makes different layouts. One of the windows from the software is used as the input in the ffmpeg command line.
Output:
- Facebook (example)
- Youtube (example)
At the beginning, I thought it might be better to create two different ffmpeg processes to stream independently to each output. The problem was that it used too much CPU.
The answer was to encode once and copy the result to the different outputs. OK, great, that solves the CPU problem, but what if one of the outputs fails? Both fail.
I'm trying to make one encoding feed two outputs, so that if one of those outputs becomes unavailable, the other keeps going.
Does anybody have any idea how to solve this?
Thanks!
I found the solution following what @LordNeckbeard said.
Here is a sample code to:
Save a local file
Stream to your server
Stream to Facebook server
Every stream is independent of the others and will try to recover itself every second if something happens. If the internet connection drops, it keeps saving locally and tries to recover when access comes back; if the destination server is not available yet, it restarts the streaming process when the server comes back:
-i ... -f tee "[onfail=ignore]'C:\Users\blablabla.mp4'|
[f=fifo:fifo_format=flv:drop_pkts_on_overflow=1:attempt_recovery=1:recovery_wait_time=1]rtmp://yourServer...|
[f=fifo:fifo_format=flv:drop_pkts_on_overflow=1:attempt_recovery=1:recovery_wait_time=1]rtmp://facebook..."
Example using the tee muxer with the onfail option and also output a local file:
ffmpeg -i input -map 0 -c:v libx264 -c:a aac -maxrate 1000k -bufsize 2000k -g 50 -f tee "[f=flv:onfail=ignore]rtmp://facebook|[f=flv:onfail=ignore]rtmp://youtube|[f=segment:strftime=1:segment_time=60]local_%F_%H-%M-%S.mkv"
Also see:
FFmpeg Documentation: tee muxer
FFmpeg Documentation: segment muxer
FFmpeg Wiki: Encoding for Streaming Sites

Using FFMPEG to stream continuously videos files to a RTMP server

ffmpeg handles RTMP streaming as input or output, and it's working well.
I want to stream some videos (a dynamic playlist managed by a Python script) to an RTMP server, and I'm currently doing something quite simple: streaming my videos one by one with FFmpeg to the RTMP server. However, this causes a connection break every time a video ends, and the stream is only ready to go again when the next video begins.
I would like to stream those videos continuously, without any connection breaks, so the stream can be viewed correctly.
I use this command to stream my videos one by one to the server
ffmpeg -re -y -i myvideo.mp4 -vcodec libx264 -b:v 600k -r 25 -s 640x360 \
-filter:v yadif -ab 64k -ac 1 -ar 44100 -f flv \
"rtmp://mystreamingserver/app/streamName"
I looked for workarounds on the internet for many days and found some people talking about using a named pipe as input to ffmpeg. I tried it, and it didn't work well, since ffmpeg not only closes the RTMP stream when a new video comes in but also exits itself.
Is there any way to do this (stream a dynamic playlist of videos with ffmpeg to an RTMP server without connection breaks)?
Update (as I can't delete the accepted answer): the proper solution is to implement a custom demuxer, similar to the concat one. There's currently no other clean way. You have to get your hands dirty and code!
Below is an ugly hack. This is a very bad way to do it, just don't!
The solution uses the concat demuxer and assumes all your source media files use the same codec. The example is based on MPEG-TS but the same can be done for RTMP.
Make a playlist file holding a huge list of entry points for your dynamic playlist, in the following format:
file 'item_1.ts'
file 'item_2.ts'
file 'item_3.ts'
[...]
file 'item_[ENOUGH_FOR_A_LIFETIME].ts'
These files are just placeholders.
Make a script that keeps track of your current playlist index and creates symbolic links on the fly for current_index + 1:
ln -s /path/to/what/to/play/next.ts item_1.ts
ln -s /path/to/what/to/play/next.ts item_2.ts
ln -s /path/to/what/to/play/next.ts item_3.ts
[...]
Start playing
ffmpeg -f concat -i playlist.txt -c copy -f mpegts udp://<ip>:<port>
Get chased and called names by an angry system administrator
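The on-the-fly relinking step can be sketched like this; the directory, file names, and helper are hypothetical, and in real use the link target would be the next real media file:

```shell
#!/bin/sh
# Sketch of the relinking step: point the next placeholder entry at
# whatever should play after the current clip. Paths are placeholders.
DIR=/tmp/concat_demo
mkdir -p "$DIR"
echo demo > "$DIR/next_clip.ts"   # stand-in for a real .ts file

relink_next() {
    # $1 = index of the placeholder to prepare, $2 = real media path.
    # -f replaces an existing link, -n avoids following it.
    ln -sfn "$2" "$DIR/item_$1.ts"
}

# If item_1.ts is playing, prepare item_2.ts:
relink_next 2 "$DIR/next_clip.ts"
```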
You need to create two playlist files, and at the end of each file, reference the other file.
list_1.txt
ffconcat version 1.0
file 'item_1.mp4'
file 'list_2.txt'
list_2.txt
ffconcat version 1.0
file 'item_2.mp4'
file 'list_1.txt'
Now all you need is to dynamically change the contents of the next playlist file.
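The dynamic rewrite of the next playlist file can be sketched as a small helper; the directory and helper name below are placeholders following the example above:

```shell
#!/bin/sh
# Sketch of the ping-pong playlist update: after list_1 starts playing,
# rewrite list_2 with the next track, and vice versa.
DIR=/tmp/pingpong_demo
mkdir -p "$DIR"

write_list() {
    # $1 = list number to write, $2 = media file, $3 = the other list's number
    printf "ffconcat version 1.0\nfile '%s'\nfile 'list_%s.txt'\n" \
        "$2" "$3" > "$DIR/list_$1.txt"
}

write_list 1 item_1.mp4 2
write_list 2 item_2.mp4 1
```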
You can pipe your loop to a buffer, and from this buffer you pipe to your streaming instance.
In shell it would look like:
#!/bin/bash
for i in *.mp4; do
  ffmpeg -hide_banner -nostats -i "$i" -c:v mpeg2video \
    [proper settings] -f mpegts -
done | mbuffer -q -c -m 20000k | ffmpeg -hide_banner \
  -nostats -re -fflags +igndts \
  -thread_queue_size 512 -i pipe:0 -fflags +genpts \
  [proper codec setting] -f flv rtmp://127.0.0.1/live/stream
Of course you can use any kind of loop, also looping through a playlist.
I found that mpeg2video is a bit more stable than x264 for the input stream.
I don't know why, but at least 2 threads for the mpeg compression works better.
The input compression needs to be faster than the output frame rate, so we get new input fast enough.
Because of the non-continuous timestamps, we have to skip them and generate new ones in the output.
The buffer size needs to be big enough to give the loop enough time to fetch the next clip.
Here is a Rust based solution, which uses this technique: ffplayout
This uses a JSON playlist format. The playlist is dynamic, in the sense that you can always edit the current playlist, change tracks, or add new ones.
Very Late Answer, but I recently ran into the exact same issue as the poster above.
I solved this problem by using OBS and the OBS websockets plugin.
First, set up your RTMP streaming app as you have it now, but stream to a LOCAL RTMP endpoint.
Then have OBS load this RTMP stream as a VLC source layer, with the local RTMP as the source.
Then (in your app), using the OBS websockets plugin, have your VLC source switch to a static black video or PNG file when the video ends, and switch back to the RTMP stream once the next video starts. This prevents the RTMP stream from stopping when a video ends. OBS will go black during the short transition, but the final OBS RTMP output will never stop.
There is surely a way to do this by manually setting up an intermediate RTMP server that pushes to a final RTMP server, but I find using OBS easier, with little overhead.
I hope this helps others; this solution has been working incredibly well for me.
