Agora: switch from audio to video call during a voice call - Laravel

I have implemented the Agora Web SDK for calling in Laravel and Vue.js, and it is working great.
I want to know if there is any way to switch from audio to video while an audio call is in progress.
If I createStream with video = false, I get an audio-only call, but I cannot switch it to a video call with rtc.localStream.unmuteVideo(), because that only works when the stream was created with video = true.
If I createStream with video = true, I can turn the camera off with rtc.localStream.muteVideo(), and the user can turn it back on when required. But the camera light stays on.
// create local stream
this.rtc.localStream = AgoraRTC.createStream({
  streamID: this.rtc.params.uid,
  audio: true,
  video: true,
  screen: false,
  microphoneId: this.option.microphoneId,
  cameraId: this.option.cameraId
});
This means it is still a video call; the video track is just a blank screen.
I am looking for a way to remove the audio-only stream and add an audio-video stream without interruption.
I have looked through the Agora docs but couldn't find anything about this.
I think that once a stream is created, its video option cannot be changed.
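For reference, the mute workaround described above comes down to two calls (legacy Agora Web SDK; both assume the stream was created with video = true):
// Blanks the video track for the remote side, but the camera stays
// claimed by the browser, so its indicator light remains on.
this.rtc.localStream.muteVideo();
// Re-enables the video track; has no effect on an audio-only stream.
this.rtc.localStream.unmuteVideo();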
UPDATE:
After reading Hermes' answer below, I did some R&D and found some useful events.
The trick is the same as mentioned in the answer.
To switch a call between audio, video, and screen sharing:
1. Unpublish the current local stream using client.unpublish.
2. Create a new stream to add or remove video or screen sharing.
3. Wait for the old local stream to be unpublished, then init the new stream and publish it.
These three steps trigger the following Agora events:
client.unpublish triggers the stream-removed event.
client.publish triggers the stream-added event. In that event listener, subscribe to the new stream; once you subscribe, the stream-subscribed event is triggered.
Based on these events, you can update the UI and the local and remote streams.
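A minimal sketch of this event wiring, assuming the legacy (callback-style) Agora Web SDK; addRemoteView/removeRemoteView are hypothetical UI helpers:
// Fired on the other side after client.unpublish removes the old stream.
client.on('stream-removed', function (evt) {
  removeRemoteView(evt.stream.getId()); // hypothetical UI helper
});
// Fired after the remote peer publishes its new stream.
client.on('stream-added', function (evt) {
  client.subscribe(evt.stream, function (err) {
    console.error('subscribe failed', err);
  });
});
// Fired once the subscription succeeds; attach the new stream to the UI.
client.on('stream-subscribed', function (evt) {
  addRemoteView(evt.stream); // hypothetical UI helper
});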

The best way to handle this is to use two Agora stream objects.
First create the audio-only stream and publish it using client.publish. Then, when you want to change to a video call, call client.unpublish on the audio stream, create a second stream with the video option enabled, and call client.publish with the video stream.
There will be only a momentary interruption as the one stream is unpublished and the new one is published to the channel.
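A minimal sketch of that swap, assuming the legacy Agora Web SDK and a client that has already joined the channel; the 'local-player' element id is an assumption:
// 1. Unpublish and close the audio-only stream.
client.unpublish(rtc.localStream, function (err) {
  console.error('unpublish failed', err);
});
rtc.localStream.close();
// 2. Create a replacement stream with video enabled.
var videoStream = AgoraRTC.createStream({
  streamID: rtc.params.uid,
  audio: true,
  video: true,
  screen: false
});
// 3. Init the new stream, then publish it to the channel.
videoStream.init(function () {
  videoStream.play('local-player'); // hypothetical element id
  client.publish(videoStream, function (err) {
    console.error('publish failed', err);
  });
  rtc.localStream = videoStream;
}, function (err) {
  console.error('init failed', err);
});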

Related

FFMPEG handling h264 stream that does not send a frame consistently

I am working with a raw h264 stream; this is a live stream coming from a device, but when the device is showing a static menu page it does not send out any frames. I am feeding the stream into a v4l2 loopback instance and then consuming it on a webpage via getUserMedia. The issue I have is that ffmpeg does not send frames to v4l2 when the hardware device is not sending frames. I have tried setting the output of ffmpeg to CFR at 60 fps, but this does not make it send out duplicates of the last frame. Is there any way to achieve this?
Thanks in advance
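For context, the kind of invocation described above might look like the sketch below; the input URL and loopback device are assumptions, and older ffmpeg builds spell -fps_mode cfr as -vsync cfr:
ffmpeg -i tcp://192.168.1.50:5000 -fps_mode cfr -r 60 -pix_fmt yuv420p -f v4l2 /dev/video2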

DirectShow WAV file source does not produce any sound when graph runs

We have a DirectShow application where we capture video input from USB, multiplex with audio from a WAV file (backing music), overlay audio and video effects, compress and write to an MP4 file.
Originally we were using an audio input source (microphone) and mixing our backing music and sound effects over the top, but the decision was made not to capture live audio, so I thought it would make more sense to use the backing music WAV file itself as the audio source.
Here is the filter graph we have:
backing.wav is a simple WAV file (stored locally), and was added to the graph using IFilterGraph::AddSourceFilter.
The problem is that when the graph is run, no audio samples are delivered from the WAV file. The video part of the graph runs as normal, but it's as if the audio part of the graph simply isn't running.
If I stop the graph in GraphEdit, add the Default DirectSound Device audio renderer, hook that up in place of the AAC Encoder filter, and then run the graph again, the audio plays as you would expect.
Additionally, if backing.wav is replaced with an audio capture source like a microphone, audio data flows through as normal.
Does anyone have any ideas why the above graph, using a WAV file as the audio source, would fail to produce any audio samples?
I suppose the title incorrectly identifies/summarizes the problem. There is no evidence that audio is not being produced: it is likely produced equally well with the DirectSound Renderer and with the AAC Encoder. Specifically, the data is probably reaching the output pin of the Mixing Transform Filter (is this your filter? You should be able to trace its flow and see media samples passing through).
With the information given, I would say it is likely that the custom AAC encoder does not like the feed and either drops data or switches to an erroneous state. You should be able to debug this further by inserting a Sample Grabber (or similar) filter¹ before the AAC encoder, tracing the media samples, and comparing them to data from another source. The encoder might be sensitive to small details such as media sample duration or the discontinuity flag on the first sample streamed.
¹ With GraphStudioNext (which has made GraphEdit largely obsolete) you can use the internal Analyzer Filter and review the media sample flow interactively from the filter property page.

How to save video into Camera Roll

Can I use this code to save a recorded video in the Camera Roll?
MediaLibrary.SavePictureToCameraRoll(fileName, stream);
where stream is a photo stream or a video stream.
You can't save videos to the camera roll from a third-party application at the moment. That API can be used only to save pictures.
Apparently it is possible. Have you tried it yet?
A quick Google search turned up this:
http://wp.qmatteoq.com/how-to-save-a-picture-captured-with-the-new-cameras-api-in-the-camera-roll-in-windows-phone-8/

Playing videos sequentially

I want to play video in WP7.
This is my code:
MediaPlayerLauncher player = new MediaPlayerLauncher();
player.Media = new Uri("video link", UriKind.RelativeOrAbsolute);
player.Location = MediaLocationType.Data;
player.Controls = MediaPlaybackControls.All;
player.Show();
This is working fine.
After this video finishes, I want to continue with another video, i.e., play two videos one after another.
Is this possible in WP7? How can I accomplish this?
The title asks how to play videos in general. Are you aware of MediaElement? It can be used to play back video as well, it has an event that tells you when playback ends, and it can also give you the video length.
This blog post has an example of both MediaElement and MediaPlayerLauncher.
The MediaPlayerLauncher does not expose an event or callback for you to find out if and/or when the video has ended. I am afraid it is not possible to hook into these events.

How to get the MJPEG stream from a NC541 camera?

I have an NC541 IP camera, which supposedly does have an MJPEG stream; the manual says "The video is compressed by MJPEG". But I cannot find a way to get that stream from the camera. It seems to want to work only with the built-in program, while I need the raw MJPEG stream instead.
Any ideas? Thanks!
I don't have this camera, but on many cameras you can simply right-click on the video window in your browser, select Properties, and it will tell you the URL of the raw stream. If this is a multi-codec camera, you may or may not get the MJPEG stream, depending on which one is chosen for the camera's home page. This often works for me.
