Use WebRTC video call SDK for load/stress/performance testing - jmeter

I have a WebRTC video call SDK and a WebRTC server. I want to register more than 10,000 virtual users on the WebRTC server and have them make video calls between each other using my SDK. Basically, I want to do load/stress/performance testing with my SDK.
Could you please suggest how we can do load/stress/performance testing using my own WebRTC video call SDK?
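One possible approach (a sketch, not a definitive answer, since JMeter has no built-in WebRTC support): wrap the SDK's registration and call setup in a JMeter Java Request sampler and drive it with as many threads as you need virtual users. The JMeter classes below are real; the comments mark where your own SDK calls would go, since those depend entirely on your SDK's API.

```java
// Sketch of a JMeter Java Request sampler; the WebRTC SDK calls are left as
// placeholders because they depend on your own SDK's API.
import org.apache.jmeter.config.Arguments;
import org.apache.jmeter.protocol.java.sampler.AbstractJavaSamplerClient;
import org.apache.jmeter.protocol.java.sampler.JavaSamplerContext;
import org.apache.jmeter.samplers.SampleResult;

public class WebRtcRegisterAndCallSampler extends AbstractJavaSamplerClient {

    @Override
    public Arguments getDefaultParameters() {
        Arguments args = new Arguments();
        args.addArgument("serverUrl", "wss://your-webrtc-server.example"); // placeholder
        args.addArgument("userPrefix", "loaduser");
        return args;
    }

    @Override
    public SampleResult runTest(JavaSamplerContext ctx) {
        SampleResult result = new SampleResult();
        String user = ctx.getParameter("userPrefix") + Thread.currentThread().getId();
        result.sampleStart();
        try {
            // Placeholder: register `user` with the WebRTC server through your
            // SDK, then start (or answer) a call with a paired virtual user, e.g.
            //   sdk.register(ctx.getParameter("serverUrl"), user);
            //   sdk.call(user, user + "-peer");
            result.setSuccessful(true);
            result.setResponseMessage("Registered and called as " + user);
        } catch (Exception e) {
            result.setSuccessful(false);
            result.setResponseMessage(e.getMessage());
        } finally {
            result.sampleEnd();
        }
        return result;
    }
}
```

Packaged as a jar in JMeter's lib/ext, this runs as a Java Request sampler inside a thread group sized to the number of virtual users. Bear in mind that pushing real encoded video between thousands of endpoints costs far more CPU and bandwidth on the load generators than the signalling does, so the test machines themselves need to be sized accordingly.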

Related

How to send an audio stream over SIP

I'm developing an application that receives an audio stream over a WebSocket and needs to forward the audio to a SIP server.
Currently, I've managed to connect to the audio source over a WebSocket and receive the media stream (u-law encoded) using Node-RED, but I'm struggling to figure out how to send the media stream to the SIP server. Any advice would be much appreciated.
I looked into this for a similar question a while back; I can't find where it was now.
As you probably know, the media part of SIP is RTP, so it's a fairly separate stack from the call signalling.
I didn't find any nodes that supported it, and the few Node.js libraries for RTP were all very incomplete and out of date.
In theory it might be possible to craft your own RTP streams using the UDP nodes and then create the relevant SDP in the SIP response, but I'm not sure how robust or scalable this would be.
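To make the "craft your own RTP stream" idea concrete, here is a minimal sketch (not from the original answer) of what that involves for u-law audio: a 12-byte RFC 3550 RTP header in front of each 20 ms chunk, sent over UDP to whatever address and port the SIP side advertises in its SDP. It is written in Java just to show the wire format; in Node-RED you would build the same bytes with a Buffer and a UDP node. The destination address, port and audio source are placeholders.

```java
// Minimal RTP sender for u-law (PCMU) audio: 12-byte RTP header + 20 ms payload.
import java.net.DatagramPacket;
import java.net.DatagramSocket;
import java.net.InetAddress;
import java.nio.ByteBuffer;

public class UlawRtpSender {
    public static void main(String[] args) throws Exception {
        InetAddress dest = InetAddress.getByName("192.0.2.10"); // placeholder: media address from the SDP
        int destPort = 4000;                                    // placeholder: media port from the SDP
        DatagramSocket socket = new DatagramSocket();

        int ssrc = 0x12345678;             // arbitrary stream identifier
        int seq = 0;
        long timestamp = 0;
        byte[] audioChunk = new byte[160]; // 20 ms of u-law @ 8000 Hz (fill from your WebSocket)

        for (int i = 0; i < 50; i++) {               // ~1 second of audio
            ByteBuffer packet = ByteBuffer.allocate(12 + audioChunk.length);
            packet.put((byte) 0x80);                 // V=2, no padding/extension/CSRC
            packet.put((byte) 0x00);                 // marker=0, payload type 0 = PCMU (u-law)
            packet.putShort((short) seq++);
            packet.putInt((int) timestamp);
            packet.putInt(ssrc);
            packet.put(audioChunk);
            socket.send(new DatagramPacket(packet.array(), packet.position(), dest, destPort));
            timestamp += 160;                        // 160 samples per 20 ms packet
            Thread.sleep(20);                        // crude pacing
        }
        socket.close();
    }
}
```

A real implementation also has to generate the matching SDP in the SIP signalling and keep the packet pacing accurate, which is where the robustness concerns above come from.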
The other option is that there are a couple of programmable comms platforms out there that support both SIP and WebSockets, so you could possibly use one of those and connect from Node-RED via WebSocket, letting them do the SIP work.
I've done SIP<->WebSocket work with both the Vonage API (previously Nexmo) and Jambonz (open source).

DJI OSDK - how to stream to remote server

Could anybody give me a tip on how to stream from the main aircraft camera to a remote server? We have our own app running on a Raspberry Pi 4 mounted on a Matrice; we can get the live view from the camera and can download the H.264 file to the SD card, but we haven't found any description/sample of how to stream outside.
Is it possible to use the aircraft-to-RemoteController connection and then the RemoteController to Wi-Fi? Or should we rather use the Raspberry Pi's Wi-Fi (which will cut range, I assume)?
Set up an RTMP server.
Stream to that RTMP server from the MSDK (which runs on the remote controller side).
See the MSDK example project.
If you search for the class LiveStreamManager in the MSDK example app on GitHub, you will find the relevant method:
LiveStreamManager getLiveStreamManager()
Provides access to the LiveStreamManager. It can be used to stream the video to an RTMP server to do live streaming with DJI products.
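For reference, a minimal sketch of that LiveStreamManager usage with the DJI Mobile SDK for Android (the RTMP URL is a placeholder, and SDK registration and error handling are omitted):

```java
// Sketch of starting/stopping an RTMP push via the DJI Mobile SDK's LiveStreamManager.
import dji.sdk.sdkmanager.DJISDKManager;
import dji.sdk.sdkmanager.LiveStreamManager;

public class DroneRtmpStreamer {

    private static final String RTMP_URL = "rtmp://your-server.example/live/drone"; // placeholder

    public void startRtmpStream() {
        LiveStreamManager manager = DJISDKManager.getInstance().getLiveStreamManager();
        if (manager == null || manager.isStreaming()) {
            return; // SDK not ready, or a stream is already running
        }
        manager.setLiveUrl(RTMP_URL);
        // startStream() connects and begins pushing; 0 means success.
        // The MSDK sample app calls this from a background thread, not the UI thread.
        int result = manager.startStream();
    }

    public void stopRtmpStream() {
        LiveStreamManager manager = DJISDKManager.getInstance().getLiveStreamManager();
        if (manager != null && manager.isStreaming()) {
            manager.stopStream();
        }
    }
}
```

Since the MSDK runs on the mobile device attached to the remote controller, the video goes over the RC link and out through that device's internet connection, not through the Raspberry Pi.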
It is possible to do so. The RTMP streaming is done using FFmpeg: we stream a section of the desktop to a WebRTC server. We use OpenCV to control the XT2 image box on the desktop and then perform the live streaming. A normal 4G-based point-to-point connection may have around 30 seconds of latency, so we use a WebRTC video server to make the stream real-time.

OpenTok displays black video

I'm running the basic_video_chat application from the OpenTok Linux SDK examples. There was an audio problem on the hardware, so I set otc_publisher_set_publish_audio(g_publisher, false) and otc_publisher_set_audio_fallback_enabled(g_publisher, false). It then created a session and started to stream video, but I get black video on the OpenTok Playground.
I tested my webcam with another application and it works fine, and the webcam activity LED turns on while the application is running, so the webcam is being accessed. Also, I can hear audio on the subscriber side, but the subscriber can't see my published video.
Found the solution: I needed to allow the VP8 codec in the API key configuration.

What are the possible ways to play RTSP live stream on Web?

What are the possible ways we can play a live stream (RTSP) on the web browser, without using any video player plugin like VLC or VXG players?
I have a web application written with the Laravel framework. The application needs to play live streams from IP cameras. Using video player plugins could work, but it has high latency. It is also not reliable: after running for some time, it crashes without any useful error message.
I tried to use opencv.js but failed. The VideoCapture function does not accept a URL (although according to the documentation it should; perhaps it is different in opencv.js).
Any other alternative?
There is no way to play RTSP directly. Web browsers cannot open raw sockets, so the only options for getting data into a browser are HTTP, WebSockets, or WebRTC.
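In practice that usually means restreaming on the server. As a sketch (assuming ffmpeg is installed, with the camera URL and output path as placeholders), you can pull the RTSP feed and repackage it as HLS, which the browser then fetches over plain HTTP:

```java
// Sketch: shell out to ffmpeg to restream an RTSP camera as HLS segments
// that a web server can serve over HTTP.
import java.io.IOException;

public class RtspToHlsRelay {
    public static void main(String[] args) throws IOException, InterruptedException {
        String rtspUrl = "rtsp://camera.example/stream1";   // placeholder camera URL
        String playlist = "/var/www/html/live/stream.m3u8"; // placeholder path served by your web app

        Process ffmpeg = new ProcessBuilder(
                "ffmpeg",
                "-rtsp_transport", "tcp",   // more reliable than UDP through NAT/firewalls
                "-i", rtspUrl,
                "-c:v", "copy",             // no transcoding if the camera already sends H.264
                "-an",                      // drop audio; remove if you need it
                "-f", "hls",
                "-hls_time", "2",
                "-hls_list_size", "5",
                "-hls_flags", "delete_segments",
                playlist)
                .inheritIO()
                .start();

        ffmpeg.waitFor();
    }
}
```

Note that HLS typically adds several seconds of latency; if latency matters, a WebRTC gateway or a WebSocket relay into Media Source Extensions is the lower-latency route.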

Socket Programming For Audio Streaming using Bluetooth or wifi

I know that the RTSP protocol is used for audio and video streaming (however, I am not aware of what is used in Bluetooth).
However, my question is a little different. I would like to explain it with an example (actually it is not just an example; I am trying to build something similar).
When we connect our device (a mobile running Android) via Bluetooth to our PC (running Windows), the Control Panel shows an option for streaming audio.
As many of you will be aware, when I play a song on my device it is played through my PC speakers.
So my questions are:
1) Who is the server?
2) Who is the client?
I think my PC is probably the client. If the PC is the client, then it would open a connection for audio streaming with the device.
Since it opens a connection with the server, the server should have a specific application through which the packets are transferred to the client.
But to my surprise, I was able to use any media player on my device to play songs through my laptop speakers.
How is this possible? Is it possible to do the same thing using the RTSP protocol?
