I need to play a video coming from a stream of bytes arriving live via UDP on Mac. I have looked at AVPlayer, but it only seems possible to get the data from a given URL. How could I do this?
Thank you
Could anybody give me a tip on how to stream from the main aircraft camera to a remote server? We have our own app running on a RASPI 4 mounted on the Matrice; we can get the live view from the camera and can download the h264 file to the SD card, but we haven't found any description/sample of how to stream it out.
Is it possible to use the aircraft-RemoteController connection and then RemoteController to WiFi? Or should we rather use the RASPI's WiFi (which I assume will cut the range)?
Set up an RTMP server.
Stream to the RTMP server from the MSDK (which runs on the remote controller).
See the MSDK example project.
In any case, search for the class "LiveStreamManager" in the MSDK example app on GitHub:
LiveStreamManager getLiveStreamManager()
Provides access to the LiveStreamManager. It can be used to stream video to an RTMP server to do live streaming with DJI products.
It is possible to do so, as shown in the figure below. The RTMP streaming is done with FFmpeg: we stream a section of the desktop to a WebRTC server. We use OpenCV to control the XT2 image box on the desktop and then perform the live streaming. A normal 4G point-to-point connection may have ~30 s of latency, so we use a WebRTC video server to make the stream realtime.
I'm running the basic_video_chat application from the OpenTok Linux SDK examples. There was an audio problem on the hardware, so I had set otc_publisher_set_publish_audio(g_publisher, false) and otc_publisher_set_audio_fallback_enabled(g_publisher, false); it then created a session and started to stream video, but I get black video on the OpenTok Playground.
I tested my webcam with another application and it works fine, and the webcam activity LED turns on while running the application, so the webcam is being accessed. Also, I can hear audio on the subscriber side, but the subscriber can't see my published video.
Found the solution: I needed to allow the VP8 codec in the API Key configuration.
Hello, I am producing an HLS live stream from my phone and sending it to a Chromecast over local WiFi. I see huge buffering before the Chromecast starts playing frames.
How do I force the Chromecast to start the livestream right away (after receiving the first chunk of the ts file)?
How do I control the buffer size on the Chromecast? I can clone the sample Chromecast receiver app and hack on it if need be.
I know that the RTSP protocol is used for audio and video streaming (however, I am not aware of what is used in Bluetooth).
My question is a little different, though. I would like to explain it with an example (actually it is not just an example; I am trying to build something similar).
When we connect our device (a mobile running Android) via Bluetooth to our PC (operating system: Windows), the Control Panel shows an option for streaming audio.
As many of you will be aware, when I play a song on my device it is played on my PC speakers.
So my questions are:
1) Who is the server?
2) Who is the client?
I think that my PC is the client. If the PC is the client, then it would open a connection for audio streaming with the device.
Since it opens a connection with the server, the server should have a specific application through which the transfer of packets to the client takes place.
But to my surprise, I was able to use any media player on my device to play songs on my laptop speakers.
How is this possible? And is it possible to do the same thing using the RTSP protocol?
I am trying to write a program that connects to an RTSP video source and redirects the video data to another location using UDP. I am also saving the RTSP packets so I can replay the video stream at a later moment in time. Right now my program can connect to the RTSP video stream, redirect, and save, but when I try to look at the redirected video in VLC I get nothing.
Currently the program just strips the datagram payload out of the video packets it receives on its open UDP socket and resends it with this code, using the Boost.Asio library:
newVideoSocket->send_to(boost::asio::buffer(dg.data), Endpoint);
When I look at the traffic in Wireshark I see that the data is actually being sent to the new address and is recognized as UDP packets, but when I try to view the video in VLC nothing happens. The video stream is MPEG-4 with the video encoded as H.264, and VLC can play it.
I have tried to connect to the redirected stream as UDP and as RTP, at both multicast and unicast addresses, but have had no success. Do I need to add something to, or remove something from, the datagram before I resend it? Or is something wrong with how I am trying to view it in VLC? Thanks for the help.
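The strip-and-resend step can be sketched in Python (the ports and the payload below are made up for the demonstration; real code would use the UDP port negotiated during RTSP SETUP and the address VLC listens on). Note that the payload is forwarded byte-for-byte, which is why Wireshark sees valid UDP traffic while VLC still has no way of knowing what codec or packetization the stream uses:

```python
import socket

def relay(src_sock, out_sock, dst_addr, count):
    """Receive `count` datagrams on src_sock and resend each raw payload,
    unchanged, to dst_addr -- the same thing the boost send_to call does."""
    for _ in range(count):
        data, _ = src_sock.recvfrom(65535)
        out_sock.sendto(data, dst_addr)

# Loopback demonstration with OS-assigned ports.
src = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
src.bind(("127.0.0.1", 0))
dst = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
dst.bind(("127.0.0.1", 0))
out = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)

# Simulate one incoming RTP datagram arriving on the source socket.
out.sendto(b"\x80\x60fake-rtp-payload", src.getsockname())
relay(src, out, dst.getsockname(), 1)
forwarded, _ = dst.recvfrom(65535)
print(forwarded == b"\x80\x60fake-rtp-payload")  # the payload is untouched
```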
To play a raw UDP stream, VLC needs information about the stream (this information is normally transferred over RTSP in the DESCRIBE and SETUP messages). Try creating an sdp file specifying the port number, video type, etc. (you can read these from the server's DESCRIBE response) and then open it in VLC.
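As a sketch only, such an sdp file for an H.264 RTP stream might look like this (the address, port, and payload format number here are placeholders; take the real values, including any fmtp parameters such as sprop-parameter-sets, from the server's DESCRIBE response):

```
v=0
o=- 0 0 IN IP4 127.0.0.1
s=Redirected RTSP stream
c=IN IP4 127.0.0.1
t=0 0
m=video 5004 RTP/AVP 96
a=rtpmap:96 H264/90000
a=fmtp:96 packetization-mode=1
```

Save it as e.g. stream.sdp and open that file in VLC; it tells VLC which port to listen on and how to interpret the RTP payload.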
I've managed to make it work, but using VLC like this I encountered problems with synchronization and video output (the video was broken).