I am using RecordRTC to let my application users record video and upload it. This works perfectly in Chrome and Mozilla Firefox, but Safari has trouble playing the recorded videos.
I researched this and found that mimeType: 'video/webm\;codecs=vp8' should be used for Safari. However, it still doesn't work in Safari (macOS and iOS).
Can someone please help me with this? I have multiple users on different operating systems and different browsers, and I want to make sure RecordRTC works for all of them.
I am open to switching to any other alternative, if one exists, that implements this feature with cross-browser support.
I found Twilio, but I am not sure whether it supports standalone recordings; I just want a single user to go to a page, record a video, and upload it.
Please see my RecordRTC configuration below, in case it helps:
const options = {
    type: 'video',
    mimeType: 'video/webm\;codecs=vp8',
    bitsPerSecond: 128000, // target overall bitrate passed through to the recorder
    timeSlice: 1000        // emit a recorded chunk every second
};
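Since Safari's MediaRecorder doesn't record WebM, one approach worth noting here (a sketch using the standard MediaRecorder.isTypeSupported API, not something RecordRTC does for you) is to probe for a container the current browser can actually record:

// pick the first mimeType the current browser claims it can record
function pickMimeType() {
    const candidates = [
        'video/webm;codecs=vp9',
        'video/webm;codecs=vp8',
        'video/mp4' // Safari records H.264/MP4 rather than WebM
    ];
    return (window.MediaRecorder &&
        candidates.find(t => MediaRecorder.isTypeSupported(t))) || '';
}

Passing mimeType: pickMimeType() instead of the hard-coded WebM string lets each browser record in its own native container.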
After the user completes the recording, I convert it to a blob and append it to the FormData to save it to the server:
var recordedBlob: Blob = recordRTC.getBlob();
formData.append('files', recordedBlob, this.courseComponent?.courseComponent?.name + '.webm');
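A related aside (an assumption about Safari's behavior, not from the original code): Safari's recorder hands back MP4 rather than WebM, so hard-coding the '.webm' extension mislabels Safari uploads. Deriving it from the blob's actual type avoids that:

// hypothetical: name the upload after what the browser actually produced
const baseName = 'recording'; // stands in for the course component name
const ext = recordedBlob.type.indexOf('mp4') !== -1 ? '.mp4' : '.webm';
formData.append('files', recordedBlob, baseName + ext);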
The recorded video plays fine in Chrome and Firefox, but in Safari it fails.
If you can guide me through this, it will be a great help.
Thanks.
Below is what the console log screenshots showed:
1. "browser does not support media recorder api"
2. When trying to play the recorded video: "browser does not support media recorder api", with a suggestion to try using WhammyRecorder.
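That message means RecordRTC could not find the MediaRecorder API at all, so it falls back to its canvas-based WhammyRecorder, which produces WebM that Safari generally can't play. A quick check you can run in Safari's Web Inspector console (on older Safari versions the API may only exist behind Develop > Experimental Features):

typeof MediaRecorder // 'undefined' means the API is missing in this browser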
Related
Folks,
I am trying to build a simple Web MIDI app.
I looked it up and found that Google Chrome is the only browser that supports it. So I installed Chrome, but I still get this:
WebMidi could not be enabled Error: The Web MIDI API is not supported by your browser.
    at WebMidi.enable (webmidi.min.js:30)
    at script.js:430
(anonymous) @ script.js:432
WebMidi.enable @ webmidi.min.js:30
(anonymous) @ script.js:430
Promise.then (async)
(anonymous) @ script.js:154
Mac - 10.15.2
Chrome - 79.0.3945.117
According to https://www.midi.org/17-the-mma/99-web-midi , Chrome definitely supports it.
Important note - if I run the code directly on CodePen, it works just fine, so the browser is capable. But when I try to run it locally, I get the error.
https://codepen.io/teropa/pen/JLjXGK
WebMidi.enable(err => {
    if (err) {
        console.error('WebMidi could not be enabled', err);
        return;
    }
    // ...MIDI setup continues here once enable succeeds
});
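Once enable succeeds, a quick sanity check that the browser actually sees your devices (WebMidi.inputs is part of the same webmidi.js API the snippet uses; run it inside the success branch of the callback):

WebMidi.inputs.forEach(input => console.log('MIDI input:', input.name));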
What am I missing here? Is this a Chrome issue, a Mac issue, or some permission issue? Or is there something specific I need to do to make the MIDI code run locally?
I am using this server, https://www.npmjs.com/package/http-server, to run the code locally.
(I have looked at other questions but did not find anything that relates to Chrome on Mac)
I've used Web MIDI with Chromium and Opera on 10.12.6, so I wouldn't say that Chrome is the only browser that supports it.
With Opera I think I had to enable experimental features:
chrome://flags/#enable-experimental-web-platform-features
Maybe see if Chrome needs that too?
Or maybe it's just a side-effect of all the lock-down in 10.15?
If you serve the dev site from 0.0.0.0 you won't get any MIDI in the browser, but when loading from 127.0.0.1 it should work (as commented by user Alex above).
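If you're using http-server as in the question, you can bind it to the loopback interface explicitly (-a and -p are documented http-server flags; the port here is just an example):

$ http-server -a 127.0.0.1 -p 8080

and then open http://127.0.0.1:8080 rather than the 0.0.0.0 address it prints.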
UPDATE (July 2020): I was able to get a mostly working version of this. See my answer to the related question here.
In my Angular NativeScript iOS app, I have a webview set up to play YouTube videos. Following a method similar to this question here, I can get the YouTube video to load and play automatically using the YouTube IFrame API. But how do I detect when the video actually starts playing (after it is loaded)?
The IFrame API has events for this purpose, like onStateChange. But because my YouTube code is "stuck" inside the webview, I am currently not able to tell when events are fired from that webview.
In NativeScript, there is a nativescript-webview-interface plugin for this purpose, but I can't get it to work. I have put my code below. If that is the way to go, what is the correct code to get it working?
(I don't want to use the nativescript-youtube plugin because that brings in YouTube's quotas, which are regularly shrinking.)
Code I Have Tried:
To get the YouTube player to activate, I have put all of the relevant YouTube code inside the webview. That works to play the video, but not to get the event when the player starts playing. To do what I want, I need some way of inserting that code into my app WITHOUT trapping it in the webview, or some way of communicating with what's inside the webview.
To try to communicate with the webview, I have tried the nativescript-webview-interface plugin:
$ tns plugin add nativescript-webview-interface
html:
<web-view src="{{youtubeCode}}" #webView ></web-view>
ts:
import { WebView, LoadEventData } from "tns-core-modules/ui/web-view";
let webViewInterfaceModule = require('nativescript-webview-interface');
...
export class ... {
    @ViewChild('webView') webView: ElementRef;
    private oWebViewInterface: any; // instance of the plugin's interface

    public youtubeCode = [code that youtube provides in its IFrame API]

    ngOnInit(): void {
        this.setupWebViewInterface();
    }

    setupWebViewInterface() {
        let webView: WebView = this.webView.nativeElement;
        this.oWebViewInterface = new webViewInterfaceModule.WebViewInterface(webView, '~/www/index.html');
        this.youtubeListen();
    }

    youtubeListen() {
        // 'onStateChange' is the event name from the YouTube IFrame API
        this.oWebViewInterface.on('onStateChange', (eventData) => {
            console.log('event data = ' + eventData); // ***this is the key part I want to work
        });
    }
}
RESULT: ERROR TypeError: undefined is not an object (evaluating 'this.webView.ios.constructor')
This is a problem in the plugin's index.ios.js file. If I comment out the offending line, the errors go away, but then nothing happens.
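One likely missing piece, sketched from the plugin's README rather than taken from the question's code: the page loaded inside the webview must itself include the plugin's browser-side script and emit the event, otherwise the NativeScript-side on('onStateChange', ...) listener has nothing to receive. Here onPlayerStateChange is the handler you would register with the IFrame API's events option, and the script path depends on where you copied the plugin's www file:

<script src="lib/nativescript-webview-interface.js"></script>
<script>
    // forward the IFrame API's state changes to the NativeScript side
    function onPlayerStateChange(event) {
        window.nsWebViewInterface.emit('onStateChange', event.data);
    }
</script>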
I'm experiencing an issue when trying to capture the audio levels from the OpenTok publisher. My code works perfectly in Chrome (Version 70.0.3538.110) but does not work as expected in Safari (Version 12.0.1). I'm using @opentok/client Version 2.15.4 and the OpenTok Node server SDK Version 2.8.0.
Here is my code:
this.publisher.on('audioLevelUpdated', (event) => {
    console.log("event.audioLevel: " + event.audioLevel);
    // etc...
});
In Chrome, I get a steady stream of audio level values, as expected. In Safari, the value of event.audioLevel becomes 0 after a short period of time (about 5 seconds) for some reason.
Any thoughts as to why this is happening? Any help is much appreciated!
I just tried this using Safari 12 and it was working fine. Try updating to the latest Safari 12 and see whether you are still seeing this problem; quite a few WebRTC-related bugs have been fixed in Safari 12.
Here is the jsbin I put together to test the audio level.
https://output.jsbin.com/sugeyim
const publisher = OT.initPublisher();
const audioEl = document.querySelector('#audioLevel');
publisher.on('audioLevelUpdated', ({ audioLevel }) => {
    audioEl.innerHTML = audioLevel; // display the current level (0 to 1)
});
There are some flags to create a fake media stream for webcam recording. For example:
'--use-fake-device-for-media-stream', '--use-fake-ui-for-media-stream' (Chrome);
'media.navigator.streams.fake' (Firefox).
Are there any flags for the Edge/IE browsers that can be passed as a WebDriver capability (Selenium::WebDriver.new :remote, { url: url, desired_caps: caps })?
Currently, there is no BrowserStack-specific custom capability to pass a fake media stream on IE and Edge. However, you can use the flags on Chrome and Firefox as you mentioned above. The Chrome option 'use-fake-device-for-media-stream' substitutes fake capture devices, and the 'use-fake-ui-for-media-stream' option additionally handles the permission prompts ('Use your microphone' or 'Use your camera' popup).
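For reference, a minimal sketch of passing those Chrome flags through Selenium, shown with Node's selenium-webdriver bindings rather than the Ruby bindings from the question:

const { Builder } = require('selenium-webdriver');
const chrome = require('selenium-webdriver/chrome');

async function buildDriver() {
    // Chrome substitutes generated streams for the real camera/mic and
    // auto-accepts the getUserMedia permission prompt
    const options = new chrome.Options().addArguments(
        '--use-fake-device-for-media-stream',
        '--use-fake-ui-for-media-stream'
    );
    return new Builder().forBrowser('chrome').setChromeOptions(options).build();
}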
I'm having some cross-browser compatibility joys. I have a Ruby WEBrick server with a couple of servlets, one of which is used to stream media (Ogg and MP3). The servlet gets a couple of query parameters: key (a base64 urlsafe string), user (a small string), and sid (a small number). When I put the URL verbatim into Chrome, I get the QuickTime extension and it works. When I put the same into Firefox, I get the expected unsupported-codec error. When I put it into Safari, it works.

However, the URL is not being accessed directly. I have a jQuery Mobile app that uses the JavaScript Buzz library and uses these servlet streaming URLs as sources. The code works in Safari, allowing me to play the sounds. In Firefox, instead of falling back to Ogg, it gives unsupported errors, and in Chrome, it does nothing. Here's the code that serves the MP3:
res.status = 200
# read the whole MP3 into memory in binary mode and send it as one response;
# note this ignores HTTP Range requests, which browser media players
# (Safari's in particular) send when streaming and seeking
str = File.new("Music/#{req.query['sid']}.mp3", 'r:BINARY:BINARY').read
res.body = str
res['content-type'] = 'audio/mpeg'
Can anyone tell me how to get audio streaming to all browsers through a Ruby servlet?
Note: This is not a duplicate. I've been through a lot of SO answers, but none work because they are designed for static files, not servlets.
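For the client side, whether through Buzz's format fallback or a plain audio element, the standard pattern is to offer both containers and let each browser pick the first type it can play. A sketch, where the fmt parameter is hypothetical and the servlet would have to choose the file and content-type from it:

<audio controls>
    <!-- Firefox picks the Ogg stream; Safari falls through to MP3 -->
    <source src="/stream?sid=123&fmt=ogg" type="audio/ogg">
    <source src="/stream?sid=123&fmt=mp3" type="audio/mpeg">
</audio>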