TokBox screen sharing audio is not working - opentok

I'm using the TokBox / OpenTok screen sharing API for web browsers, and when publishing a screen I'm passing "publishAudio: true", but the subscriber does not receive any audio. The subscriber does receive the screen video, though.
Does anyone know how to solve this audio issue? I'm using Google Chrome on macOS Catalina.

I'd need to see more of your concrete code, but I hope this helps.
You need to set the audioSource. Example:
if (videoSource) { // in my case videoSource is null or 'screen'
  opts.videoSource = videoSource;
}
console.log('micStream getAudioTracks:');
console.log(micStream.getAudioTracks());
opts.audioSource = micStream.getAudioTracks()[0]; // you can choose your audio source here
const target = document.getElementById(targetId);
const pub = OT.initPublisher(target, opts, err => {
  // resolve/reject come from the enclosing Promise executor
  if (err) {
    console.log('err');
    reject(err);
  } else {
    console.log('resolve');
    resolve(pub);
  }
});
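The key point above is that a screen-capture publisher has no audio source of its own, so a microphone track has to be attached explicitly. A minimal sketch of how the options could be assembled (the helper name `buildScreenPublisherOptions` is hypothetical; in a browser, `micStream` would come from `navigator.mediaDevices.getUserMedia({ audio: true })`):

```javascript
// Hypothetical helper: pair a 'screen' video source with the first
// audio track of a separately captured microphone stream.
function buildScreenPublisherOptions(micStream) {
  const opts = {
    videoSource: 'screen', // capture the screen instead of a camera
    publishAudio: true,
  };
  const audioTracks = micStream.getAudioTracks();
  if (audioTracks.length > 0) {
    // Without this line, the screen publisher has no audio to send.
    opts.audioSource = audioTracks[0];
  }
  return opts;
}
```

The resulting `opts` object is what you would pass to `OT.initPublisher(target, opts, callback)` as in the snippet above.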

Related

AWS Chime SDK js does not recognize video and audio elements

I am attempting to get the basic tutorial for the AWS Chime SDK to work in our application, but meetingSession.audioVideo.listVideoInputDevices() always returns nothing/null.
I am running this on the latest Chrome; my operating system is a Windows 10 workspace instance. I have headphones plugged in, but that shouldn't make a difference.
My expected result is at least one device for the video. Here is the output from the Logger.
2020-08-26T15:29:19.127Z [INFO] MyLogger - attempting to trigger media device labels since they are hidden
chime-bundle.js:1 2020-08-26T15:29:19.133Z [INFO] MyLogger - unable to get media device labels
chime-bundle.js:1 2020-08-26T15:29:19.134Z [INFO] MyLogger - API/DefaultDeviceController/listVideoInputDevices null -> []
chime-bundle.js:1 Uncaught (in promise) TypeError: Cannot read property 'deviceId' of undefined
*Note: the video and audio elements are not hidden.
I have tried the code snippets from various demos, which are all just copies of AWS's walkthrough, so there is pretty much zero information there. I have researched how audio devices work in HTML5, and after looking through the files provided in sdk-js I am even more confused. Can someone point me in the right direction?
Here is the basic code; you can get it, along with a description, from the link above.
var fetchResult = await window.fetch(
  window.encodeURI("<our endpoint for backend (running c# instead of node)>"),
  {
    method: 'POST'
  }
);
let result = await fetchResult.json();
console.log("Result from Chime API:", result);
const logger = new ConsoleLogger('MyLogger', LogLevel.INFO);
const deviceController = new DefaultDeviceController(logger);
const meetingResponse = result.JoinInfo.Meeting;
const attendeeResponse = result.JoinInfo.Attendee;
const configuration = new MeetingSessionConfiguration(meetingResponse, attendeeResponse);
// In the usage examples below, you will use this meetingSession object.
const meetingSession = new DefaultMeetingSession(
  configuration,
  logger,
  deviceController
);
console.log("MEETING SESSION", meetingSession);
//SETUP AUDIO
const audioElement = document.getElementById('notary-audio');
meetingSession.audioVideo.bindAudioElement(audioElement);
const videoElement = document.getElementById('notary-video');
// Make sure you have chosen your camera. In this use case, you will choose the first device.
const videoInputDevices = await meetingSession.audioVideo.listVideoInputDevices();
// The camera LED light will turn on indicating that it is now capturing.
// See the "Device" section for details.
await meetingSession.audioVideo.chooseVideoInputDevice(videoInputDevices[0].deviceId);
const observer = {
  audioVideoDidStart: () => {
    console.log('Started');
  },
  audioVideoDidStop: sessionStatus => {
    // See the "Stopping a session" section for details.
    console.log('Stopped with a session status code: ', sessionStatus.statusCode());
  },
  audioVideoDidStartConnecting: reconnecting => {
    if (reconnecting) {
      // e.g. the WiFi connection is dropped.
      console.log('Attempting to reconnect');
    }
  },
  // videoTileDidUpdate is called whenever a new tile is created or tileState changes.
  videoTileDidUpdate: tileState => {
    // Ignore a tile without an attendee ID and other attendees' tiles.
    if (!tileState.boundAttendeeId || !tileState.localTile) {
      return;
    }
    // videoTileDidUpdate is also invoked when you call startLocalVideoTile or tileState changes.
    console.log(`If you called stopLocalVideoTile, ${tileState.active} is false.`);
    meetingSession.audioVideo.bindVideoElement(tileState.tileId, videoElement);
    localTileId = tileState.tileId;
  },
  videoTileWasRemoved: tileId => {
    if (localTileId === tileId) {
      console.log(`You called removeLocalVideoTile. videoElement can be bound to another tile.`);
      localTileId = null;
    }
  }
};
meetingSession.audioVideo.addObserver(observer);
meetingSession.audioVideo.start();
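Note that the logged TypeError comes from dereferencing `videoInputDevices[0].deviceId` on an empty array, and the "unable to get media device labels" log suggests the browser has not yet granted media permissions (an empty device list is the usual symptom; requesting camera access first, e.g. via `navigator.mediaDevices.getUserMedia({ video: true })`, typically lets the list populate). A defensive sketch of the device-selection step (the helper name `chooseFirstVideoInput` is hypothetical; `chooseVideoInputDevice` is the method used in the code above):

```javascript
// Hypothetical helper: guard against an empty device list before
// dereferencing [0].deviceId, which is exactly what throws the
// "Cannot read property 'deviceId' of undefined" error in the logs.
async function chooseFirstVideoInput(audioVideo) {
  const devices = await audioVideo.listVideoInputDevices();
  if (!devices || devices.length === 0) {
    throw new Error('No video input devices found - check camera permissions');
  }
  await audioVideo.chooseVideoInputDevice(devices[0].deviceId);
  return devices[0].deviceId;
}
```

With this guard you get a readable error about permissions instead of an opaque TypeError deep inside the SDK call chain.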

How to play sound with a music discord bot on heroku

I'm creating a Discord music bot in discord.js. I've already installed ffmpeg and everything seems to work normally, but when I execute the play command the bot joins a channel and sends a message, but doesn't play anything. I've checked the console and it doesn't log anything.
I know it's not a problem with the code, since it works perfectly locally; the problem appears when I use Heroku. I thought it could be an opusscript problem, but I don't know.
I don't think the code is relevant here, since it works perfectly on localhost, but when I host it on Heroku nothing happens.
Here it is anyway; maybe there's an error, but as I said, I think the problem is with opusscript or node-opus.
Here are my Heroku buildpacks
And this is my code:
const ytdl = require('ytdl-core');

let voiceChn = message.member.voiceChannel;
if (!voiceChn) return message.channel.send('Join a voice channel first!');
if (!args) return message.channel.send('Add a YouTube URL to play it.');
voiceChn.join()
  .then(connection => {
    const url = ytdl(args.join(' '), { filter: 'audioonly' });
    const dispatcher = connection.playStream(url);
    message.delete();
    message.channel.send('Now playing: ' + args);
  })
  .catch(console.error);
For what it's worth, I am seeing a very similar issue. My bot should join the channel, play a sound clip from an S3 bucket (which is public), and then leave.
Here's my code:
async function executePlaySoundCommand(message, filePath) {
  try {
    const voiceChannel = message.member.voiceChannel;
    const connection = await voiceChannel.join();
    console.log(`filePath: ${filePath}`);
    const file = `${process.env.S3_URL}/${filePath}`;
    console.log(`file: ${file}`);
    const dispatcher = await connection.playArbitraryInput(file);
    console.log('Playback finished');
    dispatcher.on('end', () => {
      voiceChannel.leave();
    });
  } catch (err) {
    console.log(err);
  }
}
Locally the bot will join the channel, play the sound, and then leave as expected. However, on Heroku the bot joins the channel, then immediately leaves.
Below are the sanitized logs from heroku:
Executing <command-name> command
filePath: <audio-file>.mp3
file: https://s3-eu-west-1.amazonaws.com/<s3-bucket-name>/<audio-file>.mp3
Playback finished
I don't think there's anything wrong with my code(?); I'm looking into ffmpeg protocols to see if I have missed something.
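Since both reports point at the audio toolchain rather than the bot code, a common culprit on Heroku is a missing ffmpeg binary (Heroku's Node.js buildpack does not ship one). The asker's actual buildpack list isn't shown above, so the following is only a typical setup, using the widely used community ffmpeg buildpack:

```shell
# Typical buildpack setup for a discord.js music bot on Heroku.
# (Assumption: the app currently only has the Node.js buildpack.)
heroku buildpacks:add heroku/nodejs
heroku buildpacks:add https://github.com/jonathanong/heroku-buildpack-ffmpeg-latest.git
heroku buildpacks   # verify both are listed, then redeploy the app
```

After redeploying, `ffmpeg` is on the dyno's PATH, which is what discord.js needs to transcode the stream before it can be encoded with opusscript/node-opus.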

MS Bot Framework VideoCard

Can someone please explain how you use the MediaUrl for the VideoCard?
I've tried just adding the video URL to the CardMedia, which loads the media player, but I can't play the video.
And whatever I try with MediaUrl, I just get an error saying MediaUrl is not a function.
var url = new MediaUrl("Test", "https://www.youtube.com/watch?v=0i4v0Texqco");
var vid = new builder.Message(session)
  .textFormat(builder.TextFormat.xml)
  .attachments([
    new builder.MediaCard(session)
      .title("Test title")
      .media([
        builder.CardMedia.create(session, url)
      ])
  ]);
session.send(vid);
Thanks for any help!
For the JavaScript solution, the following code works pretty well; note that it has been tested in the Bot Framework Emulator.
const { ActivityTypes } = require('botbuilder');

async playYoutube(context) {
  const reply = { type: ActivityTypes.Message };
  reply.attachments = [this.getInlineAttachment()];
  await context.sendActivity(reply);
}
Get the attachment object:
getInlineAttachment() {
  return {
    name: 'YoutubeVideo',
    contentType: 'video/mp4',
    contentUrl: 'https://www.youtube.com/watch?v=-2JRiv3Mycs'
  };
}
Hope this helps. Cheers.
Per the VideoCard documentation and the MediaUrl documentation, the URL must point to the source of the video.
Using a YouTube URL in the VideoCard is not supported.
However, a "YouTube card" was recently added to the WebChat/Emulator. If you are using WebChat, you might consider using that. See this and this for more information.

addStream in Firefox doesn't work - webrtc

I'm trying to use WebRTC in an app for real-time communication. It works fine in Chrome, but in Firefox I get an error in the addStream function. I'm using adapter.js, which I assumed would resolve all compatibility errors, but the error persists.
pc = new RTCPeerConnection(pc_config);
pc.onicecandidate = function (evt) {
  // my code here
};
pc.onnegotiationneeded = function (evt) {
  // my code here
};
if (isChromium) {
  object_user.pc.onaddstream = function (evt) {
  };
} else {
  object_user.pc.ontrack = function (evt) {
  };
}
if (isChromium) {
  object_user.pc.addStream(window.localstream); // <- get error in firefox
} else {
  object_user.pc.addTrack(window.localstream);
}
I tried to change addStream to Firefox's addTrack, but I get "Not enough arguments to RTCPeerConnection.addTrack."
The documentation for addTrack requires two arguments, track and stream, which is probably why you get an error.
Syntax

rtpSender = RTCPeerConnection.addTrack(track, stream...);

Parameters

track
A MediaStreamTrack object representing the media track to add to the peer connection.

stream...
One or more MediaStream objects in which the specified track is to be contained.
https://developer.mozilla.org/en-US/docs/Web/API/RTCPeerConnection/addTrack
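In other words, addTrack takes a single track plus the stream it belongs to, so a stream has to be unpacked track by track rather than passed whole. A sketch of a small compatibility helper (the name `addStreamViaTracks` is hypothetical):

```javascript
// Hypothetical helper: replicate addStream(stream) on browsers that
// only support the standard addTrack(track, stream) API, by adding
// each of the stream's tracks individually.
function addStreamViaTracks(pc, stream) {
  stream.getTracks().forEach(track => pc.addTrack(track, stream));
}
```

Used in place of the failing line above, this would be `addStreamViaTracks(object_user.pc, window.localstream);` and works in both Chrome and Firefox, so the `isChromium` branch becomes unnecessary.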

Cordova record audio not working on Windows 10 Mobile

I have been working for some time with the Cordova media plugin on Android and iOS to record audio files, and it works well. On Windows, however, no error occurs while recording, but no file is created. Only when the application goes into the background and we return to it does an error appear, with a code like 2147483648 (I have not found any information relevant to my problem for this code).
function recordAudio() {
  var src = "ms-appdata:///temp/sound.m4a";
  var mediaRec = new Media(src,
    // success callback
    function () {
      console.log("recordAudio():Audio Success");
    },
    // error callback
    function (err) {
      console.log("recordAudio():Audio Error: " + err.code);
    });
  // Record audio
  mediaRec.startRecord();
}
I cannot find solutions or similar problems; the plugin's GitHub issues don't cover this problem either.
In MediaProxy.js (plugins/cordova-plugin-media/mediaProxy.js)
There is this constant:
var PARAMETER_IS_INCORRECT = -2147024809;
Is this the error you are getting? If so, this only seems to be used in one place, if there is no scheme for the path. Take a look at setTemporaryFsByDefault() in that same file.
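Since that branch fires when the path has no scheme, one way to rule it out is to normalize the recording path before handing it to the Media constructor. A sketch (the helper name `ensureAppDataScheme` and the `ms-appdata:///temp/` default are assumptions based on the code above):

```javascript
// Hypothetical helper: make sure the recording path carries a URI scheme,
// defaulting to the temp folder used in the question's code. A bare file
// name like "sound.m4a" is what trips the no-scheme branch in mediaProxy.js.
function ensureAppDataScheme(src) {
  return /^[a-z][a-z0-9+.-]*:/i.test(src) ? src : 'ms-appdata:///temp/' + src;
}
```

For example, `ensureAppDataScheme('sound.m4a')` yields `'ms-appdata:///temp/sound.m4a'`, while an already-schemed path is returned unchanged.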
