Live streaming does not work when YuvData is enabled in the Mobile SDK - dji-sdk

I am trying to send YUV data over the WebRTC protocol and also want to send a stream with RTMP at the same time.
To do this, I have to enable YUV data with the "enabledYuvData" function in DJICodecManager, but after that I get state -3 when going live with RTMP.
I want to use both "enabledYuvData" and "Live Stream with RTMP" together.
Currently I am only able to use either "enabledYuvData" or "Live Stream with RTMP" at a time.

This is not supported in the current version; MSDK 5.0+ will fix this.

Related

How to switch streams in live streaming for fault tolerance

When live streaming outdoors under weak network conditions, there are devices that aggregate networks, such as MPTCP routers. But packet loss in MPTCP will still cause live streaming problems, such as stream interruption.
So publishing multiple live streams to the server is much more robust than a single stream. My question is: how can I switch between streams, when one stream has a problem, without the player reconnecting?
For example, publish two streams:
streamA rtmp://xxx/app/streamA
streamB rtmp://xxx/app/streamB
Play the stream, whichever one is selected:
stream rtmp://xxx/app/stream
If streamA has poor quality or is interrupted, the administrator can switch to streamB, and the player still plays stream without reconnecting.
In addition, there is no need to refresh the playback side; the stream is not interrupted, and the video content neither repeats nor skips frames.
To support fault tolerance across multiple streams, there are several solutions:
Server solution: Publish streams with the same name, play only one stream.
Server solution: Publish streams with different names, play only one stream.
Client solution: Publish streams with different names, play a list of streams.
Client Solution
This is the simplest solution. For example, let's say we publish two streams:
streamA rtmp://xxx/app/streamA
streamB rtmp://xxx/app/streamB
Player get the playlist from your backend server:
streamA rtmp://xxx/app/streamA
streamB rtmp://xxx/app/streamB
If streamA is unavailable, the user switches to streamB. I highly recommend this solution, because it is simple and robust, and no changes need to be made to the streaming system.
Note: The player could also play HTTP-FLV/HLS or WebRTC; please read this post.
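To make this concrete, here is a rough sketch of the client-side failover in shell, using ffplay as a stand-in for a real player; the URLs are the placeholders from above, and a real app would implement the same loop in its playlist logic:

#!/bin/sh
# Try each published stream in turn; when the current one ends or
# fails, fall back to the next, looping forever.
while true; do
  for url in rtmp://xxx/app/streamA rtmp://xxx/app/streamB; do
    # ffplay exits when the stream errors or ends (-autoexit).
    ffplay -autoexit "$url"
  done
  sleep 1  # avoid a tight reconnect loop when both streams are down
done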
Server solution: Multiple URLs
If you publish multiple streams, each with its own URL, like this:
streamA rtmp://xxx/app/streamA
streamB rtmp://xxx/app/streamB
The media server merges these URLs into one URL and switches automatically when a stream becomes unavailable, so that the player can always use one URL to play the stream:
stream rtmp://xxx/app/stream
Note: The player could also play HTTP-FLV/HLS or WebRTC; please read this post.
For SRS, this feature is named Alias for Stream, which is not supported right now (as of 2022.01), but there is a workaround: use FFmpeg to forward streamA to stream:
ffmpeg -f flv -i rtmp://xxx/app/streamA -c copy -f flv rtmp://xxx/app/stream
If streamA becomes unavailable, the administrator should switch to streamB by starting a new FFmpeg process after killing the previous one:
killall -9 ffmpeg
ffmpeg -f flv -i rtmp://xxx/app/streamB -c copy -f flv rtmp://xxx/app/stream
This solution is quite simple: nothing needs to be done to the streaming system or the player, and it should work well.
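The manual switch can also be automated. Here is a hedged sketch of a small supervisor script, assuming the same placeholder URLs as above:

#!/bin/sh
# Keep rtmp://xxx/app/stream fed from streamA, falling back to streamB
# whenever the active forward dies; each ffmpeg exits on stream failure.
while true; do
  ffmpeg -f flv -i rtmp://xxx/app/streamA -c copy -f flv rtmp://xxx/app/stream
  # Primary failed: forward the backup until it also fails, then retry.
  ffmpeg -f flv -i rtmp://xxx/app/streamB -c copy -f flv rtmp://xxx/app/stream
  sleep 1
done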
Server solution: One URL
If you publish multiple streams to the same URL:
streamA rtmp://xxx/app/stream
streamB rtmp://xxx/app/stream
This requires the media server or cluster to support the feature. SRS has no plan to implement it, because it is not as simple as the previous solutions.
However, some video cloud platforms already support this feature, like Tencent Cloud Streaming Services.
Other Issues
Whenever you switch between streams, there will always be some issues, such as content repeating or lagging. Consider the first solution, where the player plays a playlist with two streams: if it switches to the other one, it is impossible to resume at the same timestamp, because it is a live stream.
You could try HLS, switching between streams and playing from the next ts file of the other stream, but I'm not sure whether that works.
I think that when stream switching happens, it indicates a serious problem such as a network failure or a device fault, so these glitches seem acceptable as long as the stream itself stays available.

Recording an RTP stream in segments based on traffic

I'm looking to record multiple multicast RTP audio streams into chunked, timestamped files based on the traffic on each live stream.
For example, the application would listen on the IP address/port, start recording when RTP traffic begins streaming, then stop the recording and save the file when the RTP traffic stops.
I've been trying to find examples of how FFmpeg or GStreamer could do this but have not found anything concrete. Is this possible with one of these tools? If so, could you provide an example?
You can add a probe to the src element and trigger start/stop recording in the probe's callback function.
Alternatively, a simpler version: a pipeline along the following lines will do the trick:
udpsrc -> rtpbin -> decoder if required -> filesink location=recording.mp4
The above pipeline will record for the complete duration the pipeline is active. Have a look at the rtpbin element to see some more pipeline examples.
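For a concrete starting point, here is a minimal gst-launch-1.0 sketch, assuming a PCMU (payload type 0) multicast stream on 239.1.1.1:5004; substitute your own address, port, and depayloader. It uses rtpjitterbuffer in place of the full rtpbin since there is only one stream, and it records continuously: the traffic-based start/stop still needs the pad-probe approach in application code.

# -e makes Ctrl-C send EOS so wavenc can finalize the file header.
gst-launch-1.0 -e \
  udpsrc address=239.1.1.1 port=5004 \
    caps="application/x-rtp, media=(string)audio, clock-rate=(int)8000, encoding-name=(string)PCMU, payload=(int)0" \
  ! rtpjitterbuffer ! rtppcmudepay ! mulawdec ! audioconvert \
  ! wavenc ! filesink location=recording.wav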

How to Play Multiple Streams with FFplay at the Same Time Using an RTSP URL

I am trying to play a live stream coming from the server via an RTSP URL. A sample RTSP URL is given below:
rtsp://username:password@machine_ip/42331536059e9f21
This stream is actually a call between two participants (caller and called). But when I play this URL with ffplay, I get just one stream (the called party), while I should get both streams (caller and called). I am using the following command:
ffplay rtsp://username:password@machine_ip/42331536059e9f21
Am I missing some parameters in this command to fetch all the streams?
It's actually a limitation of ffplay: it doesn't support playing multiple streams at the same time.
FFplay has currently no support for playing two audio streams simultaneously.
Here is the reference for this answer.
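One possible workaround, assuming the RTSP session really exposes two audio streams (0:a:0 and 0:a:1), is to mix them with ffmpeg and pipe the result to ffplay:

# Mix both call legs into one track; ffplay then plays the single
# mixed stream read from stdin.
ffmpeg -rtsp_transport tcp -i "rtsp://username:password@machine_ip/42331536059e9f21" \
  -filter_complex "[0:a:0][0:a:1]amix=inputs=2[mix]" \
  -map "[mix]" -f wav - | ffplay -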

How to restream RTSP H.264 as "live DVR" for iOS using ffserver?

I would like to grab an existing stream from an IP camera delivering an H.264-encoded RTSP stream and restream it for iPhone/iPad, where the user would have the opportunity to jump back in time approximately 1 minute, and later jump back to the "live" feed.
Essentially, I would like to do the same as Wowza's nDVR add-on (http://www.wowza.com/addons/wowza-ndvr-addon) but with ff** software.
Thank you for all your hints!
As I recall, ffserver does not support an HTTP streaming protocol, so what are you restreaming the video as? I know the live555 server can be configured for HTTP Live Streaming.
Converting to HTTP Live Streaming is the only reason I can think of that you would want to restream at all.
There are frameworks that can be used to play live RTSP feeds.
Dropcam is one and is based on live555; we have one here based on FFmpeg:
https://github.com/mooncatventures-group
Neither of these has the "scrub back x seconds" feature you want, but you could easily take the extracted frames, put them into a ring buffer, and play from the buffer.
Take a look at ffplayer-tests, which records video (not audio just yet) to a new H.264 MOV and stores it in the photo album.
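For the restreaming side itself, a plain ffmpeg process (ffserver has since been deprecated and removed) can repackage the camera's H.264 as a sliding-window HLS playlist, which iOS plays natively. This is a hedged sketch with a placeholder camera URL and web root, where 15 segments of 4 seconds keep roughly one minute seekable behind the live edge:

# Repackage RTSP/H.264 as HLS without re-encoding; old segments are
# deleted so the playlist stays a ~60-second DVR window.
ffmpeg -rtsp_transport tcp -i "rtsp://camera_ip/stream" \
  -c copy -f hls \
  -hls_time 4 -hls_list_size 15 -hls_flags delete_segments \
  /var/www/html/live/playlist.m3u8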

What are the performance differences between RTP over UDP and RTSP/RTP?

I realize that RTSP uses RTP; what I want to compare is plain RTP over UDP versus RTSP using RTP. This would be on the publishing side of the stream, and in this specific scenario bandwidth is extremely limited. Will removing RTSP from the mix actually gain me anything?
RTSP really only deals with the setup, pausing, resuming, and teardown of the stream. Its bandwidth is usually tiny in comparison to the media (which is sent over RTP).
So no, removing RTSP from the mix won't help you.
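For comparison, publishing plain RTP needs no session protocol at all. Here is a sketch with a placeholder receiver address, sending a single video stream (the RTP muxer carries one stream per port); the receiver plays it using the SDP that ffmpeg prints at startup:

# -re paces reading at real time; -an drops audio so only the video
# stream goes out over RTP.
ffmpeg -re -i input.mp4 -an -c:v copy -f rtp "rtp://receiver_ip:5004"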
