FFmpeg CLI - swap RTMP source using ZMQ (zmqsend)

My setup is as follows:
Nginx with the RTMP module
Multiple RTMP stream pairs, each one with a primary and backup RTMP endpoint (so streaming to rtmp://localhost/main/$STREAM_NAME and rtmp://localhost/backup/$STREAM_NAME)
Using the Nginx RTMP module exec_publish and exec_publish_done hooks, I push either main or backup to an FFmpeg CLI proc that restreams it to a remote RTMP endpoint (Wowza server in this case, though it's not very relevant to my question)
My problem is that currently, if the main stream is stopped, I have to stop the FFmpeg CLI process that restreams to Wowza and start another with a new input source (the backup stream). This often causes issues on the Wowza side so I'm looking for a way to avoid that.
After some research, I found that FFmpeg has built-in ZMQ support (the zmq filter and the zmqsend tool), but the documentation seems very sparse. Is it possible to send a message to the running FFmpeg process to alert it that it must change its source to a different RTMP stream?
Thanks a lot,

In case it's of interest to anyone, I solved my problem in a different way.
I now use named pipes, like so:
PIPE_FILE=/path/to/pipe/file
mkfifo $PIPE_FILE
exec 7<>$PIPE_FILE
ffmpeg -nostdin -i /path/to/source -acodec copy -vcodec copy -vbsf h264_mp4toannexb -f mpegts pipe:1 > $PIPE_FILE
/path/to/source can be a media file on the FS or an RTMP stream, for that matter.
I then re-stream from the pipe to the final RTMP endpoint:
ffmpeg -re -i $PIPE_FILE -c:v libx264 -preset veryfast -r 25 -g 50 -f flv $RTMP_ENDPOINT
When $PIPE_FILE stops receiving data (i.e. when streaming stops or, in the case of sending data from a local media file, when EOF is reached), I immediately launch a different FFmpeg CLI proc and feed the pipe data from the backup media file/stream.
That keeps the re-streaming FFmpeg CLI proc continuously up and running.
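A minimal sketch of how the feeding side could be wired up (the stream URLs and paths are placeholders, not my exact script):

PIPE_FILE=/path/to/pipe/file
mkfifo "$PIPE_FILE"
exec 7<>"$PIPE_FILE"                      # keep the FIFO open so readers never see EOF

feed() {                                  # remux one source into the pipe as MPEG-TS
    ffmpeg -nostdin -i "$1" -c:a copy -c:v copy \
           -bsf:v h264_mp4toannexb -f mpegts - > "$PIPE_FILE"
}

while true; do
    feed rtmp://localhost/main/stream1    # blocks until the main feed stops
    feed rtmp://localhost/backup/stream1  # then fall back to the backup feed
    sleep 1                               # avoid a tight loop if neither source is live
done

The re-streaming ffmpeg keeps reading from $PIPE_FILE the whole time, so the outgoing RTMP connection never drops.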

Interesting approach. I've got something similar: instead of the pipe, I'm using another local RTMP destination.
I've got an nginx-rtmp setup with 3 apps: the main app, the backup app, and the distribute app.
So I send the main stream to the main app from my streaming software.
I have an ffmpeg process running:
ffmpeg -i rtmp://127.0.0.1/main/stream1 -c copy rtmp://127.0.0.1/distribute/stream1
If this process breaks due to the input shutting down, I run a similar command to pull input from the backup:
ffmpeg -i rtmp://127.0.0.1/backup/stream1 -c copy rtmp://127.0.0.1/distribute/stream1
From my distribute app I stream to my external outputs.
The only issue here is that I get the non-monotonic DTS error after the switch, so I've had to add a few flags when streaming from distribute to my outputs. The command is:
ffmpeg -fflags +genpts+igndts+ignidx -avoid_negative_ts make_zero -use_wallclock_as_timestamps 1 -i rtmp://127.0.0.1/distribute/stream1 -c:v libx264 -preset veryfast -r 25 -g 50 -c:a aac -b:a 128k -f flv $RTMP_ENDPOINT
I've also noticed some warnings in the ffmpeg process when I switch if the main and backup streams come in with different H.264 profiles, say one on High and the other on Baseline or Main.
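A rough sketch of how that failover could be supervised (same app and stream names as above; the loop is just an illustration, not a production script):

while true; do
    # pull from main until that ffmpeg exits, then immediately fall back to backup
    ffmpeg -i rtmp://127.0.0.1/main/stream1   -c copy -f flv rtmp://127.0.0.1/distribute/stream1
    ffmpeg -i rtmp://127.0.0.1/backup/stream1 -c copy -f flv rtmp://127.0.0.1/distribute/stream1
    sleep 1   # avoid a tight loop if neither source is publishing
done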

Related

Pause/Resume/Seek mp4 on FFMPEG while transmitting to RTMP

I'm currently streaming an mp4 to an RTMP server just fine. I'm trying to figure out if I can pause the MP4 and resume while staying connected to the RTMP server the entire time.
I'm streaming with the following command:
ffmpeg -re -i myfile.mp4 -acodec copy -vcodec copy -preset:v ultrafast -f flv rtmp://myrtmp.server
The idea is that we are showing a video that stays in sync for everyone on our platform. If someone has a question, I would like to be able to pause the mp4. Once the question is answered, tell FFmpeg to resume playing the mp4.
I'm currently using PHP, sockets, and obviously FFmpeg.

Streaming RTMP to JANUS-Gateway only showing bitrate but no video

I'm currently using the streaming plugin as follows
Fancy architecture here:
OBS--------RTMP--------->NGINX-Server------FFMPEG(input RTMP output RTP)--------->JANUS---------webrtc-------->Client
When using the ffmpeg command (below), the Janus streaming interface only shows a bitrate that corresponds to the ffmpeg output in the console, but we don't see any video.
ffmpeg -i rtmp://localhost/live/test -an -c:v copy -flags global_header -bsf dump_extra -f rtp rtp://localhost:8004
(using "-c:v copy" so that no encoding is used and hence reducing the
latency)
The video shows fine if I use "-c:v libx264"; the only issue is that it is CPU intensive and adds latency.
Previously I had tried using RTSP as input for FFMPEG, and in that case the video shows fine with almost no latency even though I use "-c:v copy".
So I don't really get why the copy works fine for RTSP, but for RTMP I have to use the libx264 codec. If anyone has an idea about this I am all ears :)
I had a similar issue, and my problem was that the stream/video I used had a large GOP size.
For WebRTC, latency is sub-second, so the input source should have short intervals between I frames. It's better to remove B frames as well, since they refer both backward and forward.
Here are commands that you could use for small GOP size (4) and remove B frames.
Using RTMP streaming src:
ffmpeg -i rtmp://<your_src> -c:v libx264 -g 4 -bf 0 -f rtp -an rtp://<your_dst>
Using a mp4 file:
ffmpeg -re -i test.mp4 -c:v libx264 -g 4 -bf 0 -f rtp -an rtp://<your_dst>
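To confirm whether the source really has a large GOP, one option (the URL is a placeholder) is to dump the picture type of each frame with ffprobe and look at the spacing between the I frames:

ffprobe -v error -select_streams v:0 -show_frames \
        -show_entries frame=pict_type -of default=noprint_wrappers=1 \
        rtmp://localhost/live/test | grep -n 'pict_type=I'

The gap between consecutive matching line numbers is roughly the GOP size in frames.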
-c:v copy does not reduce latency. It merely tells ffmpeg not to transcode.

How to get continuous live streaming without buffering in Azure Media Player using FFMPEG (latency is not an issue)?

I am streaming from an IP camera which uses the RTSP protocol, and ingesting the feed to RTMP (to the Azure media server) using the following command:
ffmpeg command: ffmpeg -f lavfi -i anullsrc -rtsp_transport tcp -i rtsp://CloudAppUser:admin@192.168.8.145/MediaInput/h264/stream_1 -vcodec libx264 -t 12:00:00 -pix_fmt + -c:v copy -c:a aac -strict experimental -f flv rtmp://channel1-cloudstream-inso.channel.media.azure.net:1934/live/980b582afc12e421b85b4jifd8e8662b/df
I am able to watch the stream, but it buffers once every 30 seconds, and I want to know the reason behind this buffering.
Can anyone please adjust this command so that it does not buffer?
I am executing this command from my terminal.
I would like to watch my live stream in Azure Media Player without any buffering; latency below 1 minute is not an issue.
As documented here, when on-premise encoders are set up to push a contribution feed into a Channel, we recommend that these encoders use fixed 2-second GOPs. If your IP camera is not sending 2-second GOPs, you'll have to modify the ffmpeg command line to re-encode the input video bitstream, not just copy it. If that doesn't help, I recommend contacting us via amshelp@microsoft.com with the (output) stream URL and other details such as the Media Services account name, the region used, and the date/time/timezone at which you attempted to stream the feed.
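For reference, a hedged example of what such a re-encode could look like, forcing a fixed 2-second GOP at 25 fps (the camera credentials and the Azure ingest URL are placeholders; adjust the frame rate and bitrate to your feed):

ffmpeg -f lavfi -i anullsrc -rtsp_transport tcp \
       -i "rtsp://user:password@192.168.8.145/MediaInput/h264/stream_1" \
       -map 1:v -map 0:a \
       -c:v libx264 -preset veryfast -r 25 -g 50 -keyint_min 50 -sc_threshold 0 \
       -c:a aac -b:a 128k -f flv "$AZURE_INGEST_URL"

-g 50 with -r 25 gives one keyframe every 2 seconds, and -sc_threshold 0 stops x264 from inserting extra keyframes on scene changes, so the GOP stays fixed.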

Wowza error: Failed to play myStream; stream not found.

I am using ffmpeg to encode a video which will then be restreamed using Wowza. I am new to streaming. First I started Wowza using the command
/etc/init.d/WowzaMediaServer start
After that I started streaming an MP4 file using the RTSP protocol. I used the command
ffmpeg -re -i /usr/local/WowzaMediaServer/content/sample.mp4 -acodec libfdk_aac -vcodec libx264 -f rtsp -muxdelay 0.1 rtsp://myuser:mypassword@127.0.0.1:1935/live/myStream.sdp
The video starts streaming. Before all of this, I changed admin.password and added the username myuser with the password mypassword. When I run the above command it streams, but after that the instructions say to go to
WowzaMediaServer/examples/LiveVideoStreaming/FlashRTMPPlayer/Player.html
and fill server with rtmp://localhost:1935/live
and Stream field with myStream
When I click Connect, it gives me the status
"Failed to play myStream; stream not found."
I am following this article: http://www.wowza.com/forums/content.php?354-How-to-set-up-live-streaming-using-an-RTSP-RTP-based-encoder
Where I am wrong, I don't know; I am unable to figure it out, and I am not getting satisfactory answers from the Wowza support team. So if someone works with Wowza, please help me: why am I not able to connect my video stream to Wowza? Please respond, I am stuck badly.
So it appears there are some basic issues with the RTSP output from ffmpeg, and then the published stream name does not match the play request.
You have
ffmpeg -re -i /usr/local/WowzaMediaServer/content/sample.mp4 -acodec libfdk_aac -vcodec libx264 -f rtsp -muxdelay 0.1 rtsp://myuser:mypassword@127.0.0.1:1935/live/myStream.sdp
You need to make sure your ffmpeg has libx264 and libfdk_aac plugins available. You should be able to determine this with just
ffmpeg
and it should print out the libraries available.
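A more direct check (on any reasonably recent ffmpeg build) is to list the compiled-in encoders and filter for the two you need:

ffmpeg -hide_banner -encoders | grep -E 'libx264|libfdk_aac'

If either line is missing, that encoder was not compiled in.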
If you have all the libraries then you are publishing a stream called
myStream.sdp
You then have instructions that say
and fill server with rtmp://localhost:1935/live
and Stream field with myStream
So you should either change your ffmpeg command to
ffmpeg -re -i /usr/local/WowzaMediaServer/content/sample.mp4 -acodec libfdk_aac -vcodec libx264 -f rtsp -muxdelay 0.1 rtsp://myuser:mypassword@127.0.0.1:1935/live/myStream
Note there is no .sdp in the stream name any more. Alternatively, use a Stream field in the player of
myStream.sdp
When publishing a stream and then attempting to play it back they must match, otherwise you get back Stream Not Found.
One way to successfully do this is to specify only the port number (65000 in this example, making sure it isn't 1935) as the server in your ffmpeg command, then create a mystream.stream file in the content directory of your Wowza server with ONLY the following line:
udp://0.0.0.0:65000
Then, in Wowza/conf/startupstreams.xml, add the following:
<!-- Native RTP example (SDP file is myStream.sdp) -->
<StartupStream>
    <Application>live/_definst_</Application>
    <MediaCasterType>rtp</MediaCasterType>
    <StreamName>mystream.stream</StreamName>
</StartupStream>
Restart Wowza and ffmpeg, and then re-try your URL with the stream name mystream.stream.
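For completeness, a sketch of the ffmpeg side that would match that .stream setup, pushing MPEG-TS to the port Wowza listens on (the file path is the one from the example above; pick codecs your build actually has):

ffmpeg -re -i /usr/local/WowzaMediaServer/content/sample.mp4 \
       -c:v libx264 -c:a aac -f mpegts udp://127.0.0.1:65000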

Using FFMPEG to stream video files continuously to an RTMP server

ffmpeg handles RTMP streaming as input or output, and it's working well.
I want to stream some videos (a dynamic playlist managed by a Python script) to an RTMP server. I'm currently doing something quite simple: streaming my videos one by one with FFmpeg to the RTMP server. However, this causes a connection break every time a video ends, and the stream only comes back up when the next video begins.
I would like to stream those videos continuously, without any connection breaks, so the stream can be viewed correctly.
I use this command to stream my videos one by one to the server
ffmpeg -re -y -i myvideo.mp4 -vcodec libx264 -b:v 600k -r 25 -s 640x360 \
-filter:v yadif -ab 64k -ac 1 -ar 44100 -f flv \
"rtmp://mystreamingserver/app/streamName"
I looked for workarounds on the internet for many days, and I found people talking about using a named pipe as input in ffmpeg. I've tried it and it didn't work well, since ffmpeg not only closes the RTMP stream when a new video comes but also closes itself.
Is there any way to do this (stream a dynamic playlist of videos with ffmpeg to an RTMP server without connection breaks)?
Update (as I can't delete the accepted answer): the proper solution is to implement a custom demuxer, similar to the concat one. There's currently no other clean way. You have to get your hands dirty and code!
Below is an ugly hack. This is a very bad way to do it, just don't!
The solution uses the concat demuxer and assumes all your source media files use the same codec. The example is based on MPEG-TS but the same can be done for RTMP.
Make a playlist file holding a huge list of entry points for your dynamic playlist with the following format:
file 'item_1.ts'
file 'item_2.ts'
file 'item_3.ts'
[...]
file 'item_[ENOUGH_FOR_A_LIFETIME].ts'
These files are just placeholders.
Make a script that keeps track of your current playlist index and creates symbolic links on the fly for current_index + 1:
ln -s /path/to/what/to/play/next.ts item_1.ts
ln -s /path/to/what/to/play/next.ts item_2.ts
ln -s /path/to/what/to/play/next.ts item_3.ts
[...]
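A minimal sketch of such an index-tracking script (the state file and the get_next_clip helper are hypothetical stand-ins for however your playlist logic decides what comes next):

#!/bin/bash
INDEX_FILE=./current_index                 # hypothetical state file holding the last prepared index
NEXT_CLIP=$(get_next_clip)                 # hypothetical helper: prints the path of the next item
IDX=$(( $(cat "$INDEX_FILE") + 1 ))
ln -sfn "$NEXT_CLIP" "item_${IDX}.ts"      # point the next placeholder at the real media
echo "$IDX" > "$INDEX_FILE"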
Start playing
ffmpeg -f concat -i playlist.txt -c copy -f mpegts udp://<ip>:<port>
Get chased and called names by an angry system administrator
You need to create two playlist files, and at the end of each file specify a link to the other file.
list_1.txt
ffconcat version 1.0
file 'item_1.mp4'
file 'list_2.txt'
list_2.txt
ffconcat version 1.0
file 'item_2.mp4'
file 'list_1.txt'
Now all you need is to dynamically change the contents of the next playlist file.
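One way to run that chain (the output settings here are only illustrative) could be:

ffmpeg -re -f concat -safe 0 -i list_1.txt -c:v libx264 -preset veryfast \
       -c:a aac -b:a 128k -f flv "rtmp://mystreamingserver/app/streamName"

As long as each list ends by pointing at the other, ffmpeg keeps reading without reopening the RTMP connection.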
You can pipe your loop to a buffer, and from this buffer you pipe to your streaming instance.
In shell it would look like:
#!/bin/bash
for i in *.mp4; do
    ffmpeg -hide_banner -nostats -i "$i" -c:v mpeg2video \
        [proper settings] -f mpegts -
done | mbuffer -q -c -m 20000k | ffmpeg -hide_banner \
    -nostats -re -fflags +igndts \
    -thread_queue_size 512 -i pipe:0 -fflags +genpts \
    [proper codec setting] -f flv rtmp://127.0.0.1/live/stream
Of course you can use any kind of loop, including looping through a playlist.
I figured out that mpeg2video is a bit more stable than x264 for the input stream.
I don't know why, but a minimum of 2 threads for the mpeg compression works better.
The input compression needs to be faster than the output frame rate, so we get new input fast enough.
Because of the non-continuous timestamps, we have to ignore them and generate new ones in the output.
The buffer size needs to be big enough that the loop has enough time to fetch the new clip.
Here is a Rust-based solution which uses this technique: ffplayout
It uses a JSON playlist format. The playlist is dynamic, in the sense that you can always edit the current playlist and change tracks or add new ones.
Very Late Answer, but I recently ran into the exact same issue as the poster above.
I solved this problem by using OBS and the OBS websockets plugin.
First, set up your RTMP streaming app as you have it now, but stream to a LOCAL RTMP endpoint.
Then have OBS load this local RTMP stream as a VLC source layer.
Then (in your app), using the OBS websockets plugin, have your VLC source switch to a static black video or PNG file when the video ends, and switch back to the RTMP stream once the next video starts. This prevents the RTMP stream from stopping when a video ends. OBS will go black during the short transition, but the final OBS RTMP output will never stop.
There is surely a way to do this by manually setting up an intermediate RTMP server that pushes to a final RTMP server, but I find using OBS easier, with little overhead.
I hope this helps others; this solution has been working incredibly well for me.
