FFmpeg multicast packet filtering with UDP - ffmpeg

Problem Scenario:
I am trying to capture multiple multicast cameras on a network with FFmpeg. Upon receiving the streams, I find that each running FFmpeg instance is receiving and decoding the packets destined for the other instances. This causes the video of each instance to flicker between the correct image and the images of all the other cameras. Each camera uses the same destination UDP port, and I believe this is why it's happening.
Example
Process 1
ffmpeg -rtsp_transport udp_multicast -i "rtsp://192.168.1.1/stream1m" test1.mp4
Process 2
ffmpeg -rtsp_transport udp_multicast -i "rtsp://192.168.1.2/stream1m" test2.mp4
Expected Output
Each MP4 contains a single, uninterrupted stream
Actual Output
As described above
I've trawled through FFmpeg's docs and done extensive googling, but I can only see a way to filter incoming packets when using an rtp:// or udp:// input. That isn't possible in this application, as I want to use RTSP for the SDP it provides.
Any help is greatly appreciated!
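For reference, the kind of incoming-packet filtering referred to above is the sources option of the rtp:// and udp:// protocols, which only accepts packets sent from the listed addresses. A minimal sketch, assuming the camera at 192.168.1.1 sends to the hypothetical multicast group 239.0.0.1 on port 5000 (these values are illustrative, not taken from the setup above):
ffmpeg -i "udp://239.0.0.1:5000?sources=192.168.1.1" test1.mp4
That option is only exposed on rtp:// and udp:// inputs, which is why it can't be combined with the rtsp:// input used here.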

I got this to work by using different multicast IPs and ports. The trick is the ports: they have to be different on the stream side, even when the IPs already differ.
This seems to be a problem with ffmpeg.
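To make this concrete, here is a sketch of what each camera's SDP would need to announce after reconfiguration; the 239.x.x.x groups and ports below are hypothetical values, not ones from the original setup:
Camera 1 (DESCRIBE excerpt)
c=IN IP4 239.1.1.1/255
m=video 5000 RTP/AVP 96
Camera 2 (DESCRIBE excerpt)
c=IN IP4 239.1.1.2/255
m=video 6000 RTP/AVP 96
With distinct ports in the m= lines, each ffmpeg instance binds a different local port, so it no longer sees the datagrams addressed to the other camera's group.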

Related

Switch between rtsp stream and static image input on the fly based on rtsp stream availability in ffmpeg command while recording

Basically there are two inputs for ffmpeg, which I plan to run on Windows 10: an RTSP stream and a static image. While saving frames, if the RTSP stream becomes unavailable or a timeout occurs, ffmpeg should immediately switch to the second input. If the RTSP stream comes back, it should continue with RTSP.
The key expectation is that the process should not have to reconnect to the RTSP source and should not restart.
FFmpeg commands with or without pipes, gstreamer, a simple proxy server, or any other fresh solutions (scripts/commands/code) to try out are highly appreciated.

ffmpeg Convert hls to rtsp protocol

Hi everyone. Most of the time I have seen methods that convert an IP camera source (RTSP) to HTTP (HLS), but nobody seems to try converting HLS to RTSP. I have an HLS test URL:
https://multiplatform-f.akamaihd.net/i/multi/will/bunny/big_buck_bunny_,640x360_400,640x360_700,640x360_1000,950x540_1500,.f4v.csmil/master.m3u8
and I use this command to convert HLS to RTSP:
ffmpeg -i https://multiplatform-f.akamaihd.net/i/multi/will/bunny/big_buck_bunny_,640x360_400,640x360_700,640x360_1000,950x540_1500,.f4v.csmil/master.m3u8 -f rtsp rtsp://localhost:8554/test
but it does not work. Can someone help me? Thank you very much!
You need a live streaming server (like Wowza SE) to serve the live stream to other clients (players).
Also, when publishing the live stream to the streaming server, some authentication parameters are required, like rtsp://user:password@server:[port]/app/stream.
You can check the WordPress - Broadcast Live Video plugin, which handles various protocols, streaming methods and FFmpeg transcoding (if you have the necessary streaming server).
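As a hedged sketch of the publishing step, assuming an RTSP server such as MediaMTX (formerly rtsp-simple-server) is already listening on localhost:8554 and that the path /test is arbitrary:
ffmpeg -re -i "https://multiplatform-f.akamaihd.net/i/multi/will/bunny/big_buck_bunny_,640x360_400,640x360_700,640x360_1000,950x540_1500,.f4v.csmil/master.m3u8" -c copy -f rtsp -rtsp_transport tcp rtsp://localhost:8554/test
Here -re paces the HLS input at its native rate and -c copy avoids re-encoding; the -f rtsp output only publishes (ANNOUNCE/RECORD) to the server, it does not itself serve clients, which is the point made in the answer above.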

SRT : No Room to Store Incoming Packets when we stream SRT streams through VLC

Hope you are all doing well.
I was trying to stream a media file using VLC over the SRT protocol. For this, srt-live-transmit is used as a converter between the SRT listener and VLC's UDP output; srt-live-transmit converts the UDP (MPEG-TS) stream to SRT. But when I tried that, after a few seconds I got this error in the srt-live-transmit terminal:
No room to store incoming packet:
What could be the reason for this error? If anyone knows anything about this problem, please share any info; it would be helpful.
Thank you.
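For context, the pipeline described above would look roughly like the following sketch; the file name, addresses, ports and latency value are placeholders, not taken from the original setup:
cvlc input.mp4 --sout "#std{access=udp,mux=ts,dst=127.0.0.1:1234}"
srt-live-transmit "udp://:1234" "srt://:4201?mode=listener&latency=200"
A player would then connect to srt://<host>:4201. The "No room to store incoming packet" text is SRT reporting that its receive buffer filled up faster than it was drained, so the latency/buffer parameters on the SRT URI are usually the first thing to check.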

Proxying an RTSP url using an RTSP Proxy Server

I have a use case where I need to restream an RTSP URL.
For all the cases below, Live555 is built locally by cloning the source.
For all the cases below, the required RTSP ports are open on the respective servers.
First I set up a file to be streamed using Live555MediaServer. This is in an AWS instance (AWS1).
./Live555MediaServer ashi.webm gives
rtsp://public_IP_of_instance:8554/ashi.web
I check whether the incoming stream is working by trying to play it in VLC. This incoming stream works fine and plays well in VLC.
Now, to restream, I tried Live555ProxyServer: the incoming stream from AWS1 is restreamed to another URL by running Live555ProxyServer on my Mac.
./Live555ProxyServer rtsp://public_IP_of_instance_one:8554/ashi.web gives,
rtsp://localhost:8554/ProxyStream
This URL is also playable in VLC.
Now I set up another AWS instance (AWS2) and run the ProxyServer on it, listening to AWS1.
That is,
./Live555ProxyServer rtsp://public_IP_of_instance_one:8554/ashi.web gives,
rtsp://public_IP_of_instance_two:8554/ProxyStream
This URL is not playable.
I tried using the -t flag for TCP instead of UDP. I also checked the incoming RTSP stream with ffprobe, and it shows all the required details.
What could be the possible reason? What is the missing piece in this pipeline? Are there any good industry-grade open-source RTSP servers?
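For reference, the ffprobe check mentioned above might look something like this (a sketch; forcing TCP with -rtsp_transport is optional, and the URL mirrors the AWS1 source from the question):
ffprobe -rtsp_transport tcp rtsp://public_IP_of_instance_one:8554/ashi.web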
EDIT:
I don't know exactly what happened, but using VLC as the client to check the stream was the problem. I moved to ffmpeg and tried writing the stream to a file using
ffmpeg -i rtsp://public_IP_of_instance_two:8554/proxyStream -acodec copy -vcodec copy abc.webm
So the stream from ProxyServer is fine.
What led me to change the client was the repeated warning from live555ProxyServer about outdated firmware, explained here.
I'm currently using VLC version 3.0.3 Vetinari (Intel 64-bit) on Mac, which is the latest version. Maybe VLC is using an old version of Live555 internally.

Joining a multicast stream via RTSP using ffplay

I have an RTSP server that provides access to a multicast stream. I can play it fine in VLC and various other players; however, in ffmpeg/ffplay I get no data back.
After investigation, this seems to be caused by the RTSP SETUP listing only unicast as a transport method.
If I use:
ffplay -rtsp_transport udp_multicast rtsp://<blah>
then it works fine.
How does ffmpeg decide which transport methods to put into its SETUP call? Is it perhaps some part of the SDP returned by the DESCRIBE call? Ideally I'd like to find a way to allow ffplay to play the multicast stream without the need to force the transport mode.
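One way to see what ffplay is actually negotiating is to raise the log level, which prints the SDP returned by DESCRIBE (and, at trace level, the raw RTSP request/reply exchange), for example:
ffplay -v debug -rtsp_transport udp_multicast "rtsp://<blah>"
In the dumped SDP, a multicast session is normally signalled by a connection line carrying a multicast address, e.g. c=IN IP4 239.0.0.1/255 (the address here is only an example), so checking whether the DESCRIBE reply actually advertises a multicast address is a reasonable first step toward answering whether the SDP drives the transport choice.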
