I'm trying to record video/audio files using the MediaRecorder API in Firefox.
When I use the Web Audio API to create the nodes (source -> gain -> destination), the recorded file contains only audio, because the stream returned by the destination node is an audio-only stream, per this documentation:
https://developer.mozilla.org/en-US/docs/Web/API/MediaStreamAudioDestinationNode
Any suggestions for getting an audio/video stream out of the destination node, so that I can record audio and video rather than audio only?
var mediaStream; // the getUserMedia() stream (audio + video)
var ctx = new AudioContext();
var mediaStreamSource = ctx.createMediaStreamSource(mediaStream);
var destination = ctx.createMediaStreamDestination();

ObjectStore.VolumeGainNode = ctx.createGain();
ObjectStore.VolumeGainNode.gain.value = 0.5;

mediaStreamSource.connect(ObjectStore.VolumeGainNode);
ObjectStore.VolumeGainNode.connect(destination);

// destination.stream carries only the audio routed through the graph
mediaStream = destination.stream;
You need a stream composed of the gUM video track and your gain-modified audio track.
Following the standard, Firefox lets you modify tracks in a stream using stream.addTrack and stream.removeTrack, as well as compose new streams out of tracks with new MediaStream([tracks]).
This lets you solve your problem by replacing the gUM audio track with your gain-manipulated one:
var constraints = { video: true, audio: true };

var start = () => navigator.mediaDevices.getUserMedia(constraints)
  .then(stream => modifyGain(video.srcObject = stream, 0.5))
  .catch(e => console.error(e));

var modifyGain = (stream, gainValue) => {
  var audioTrack = stream.getAudioTracks()[0];

  // Route just the audio track through a gain node: src -> gain -> dst.
  var ctx = new AudioContext();
  var src = ctx.createMediaStreamSource(new MediaStream([audioTrack]));
  var dst = ctx.createMediaStreamDestination();
  var gainNode = ctx.createGain();
  gainNode.gain.value = gainValue;
  [src, gainNode, dst].reduce((a, b) => a && a.connect(b));

  // Swap the original audio track for the gain-adjusted one;
  // the video track in the stream is left untouched.
  stream.removeTrack(audioTrack);
  stream.addTrack(dst.stream.getAudioTracks()[0]);
};
Here's the fiddle (Firefox 44 or newer): https://jsfiddle.net/7wd2z8rz/
Again with MediaRecorder: https://jsfiddle.net/j33xmkcq/
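For completeness, hooking the modified stream up to MediaRecorder looks roughly like the sketch below. This is a minimal outline rather than the fiddle's exact code; the video/webm mimeType and the five-second timeout are assumptions.

var chunks = [];
var record = stream => {
  // stream is the composed (gain-modified) stream produced by start()/modifyGain above
  var recorder = new MediaRecorder(stream, { mimeType: 'video/webm' }); // mimeType is an assumption
  recorder.ondataavailable = e => chunks.push(e.data);
  recorder.onstop = () => {
    // Assemble the recorded chunks and play them back in the same <video> element.
    video.srcObject = null;
    video.src = URL.createObjectURL(new Blob(chunks, { type: 'video/webm' }));
  };
  recorder.start();
  setTimeout(() => recorder.stop(), 5000); // stop after five seconds (arbitrary)
};

Calling record(video.srcObject) once start() has resolved would then record the composed stream.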
I am trying to record a session using MediaRecorder.
I am using a canvas captureStream() for the video and getUserMedia() for the audio stream.
This code works fine with the laptop's built-in speakers and microphone, but when I use a headset it doesn't record the audio coming through the headset.
Please find sample code below (getTracks is a small helper; an assumed version is sketched after the snippet):
navigator.mediaDevices.getUserMedia({
  audio: { echoCancellation: false, noiseSuppression: false, channelCount: 2 }
}).then(function (audioStream) {
  var canvas = document.getElementById('canvas');
  var canvasStream = canvas.captureStream(35); // 35 fps

  // Combine the microphone audio and the canvas video into one stream.
  var o = new MediaStream();
  getTracks(audioStream, 'audio').forEach(function (track) {
    o.addTrack(track);
  });
  getTracks(canvasStream, 'video').forEach(function (track) {
    o.addTrack(track);
  });

  let recorder = new MediaRecorder(o, { mimeType: 'video/webm' });
});
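The getTracks helper isn't defined in the question; presumably it just filters a stream's tracks by kind, roughly like this assumed implementation:

// Assumed helper: return the tracks of the given kind ('audio' or 'video') from a stream.
function getTracks(stream, kind) {
  return stream.getTracks().filter(function (track) {
    return track.kind === kind;
  });
}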
Is there a property on the AVAudioPlayer class that I can use to get the samples? If not, is there another class I can use to get this information?
Here's what I have:
var openDialog = NSOpenPanel.OpenPanel;
openDialog.CanChooseFiles = true;
openDialog.CanChooseDirectories = false;
openDialog.AllowedFileTypes = new string[] { "wav" };

if (openDialog.RunModal() == 1)
{
    var url = openDialog.Urls[0];
    if (url != null)
    {
        var path = url.Path;
        var audioplayer = AVFoundation.AVAudioPlayer.FromUrl(url);
        var samples = audioplayer.SAMPLES?; // <-- is there such a property?
    }
}
(I'm using Visual Studio for Mac, C# / Xamarin.)
AVAudioPlayer does not give you access to the sample data, but if you switch playback to AVPlayer you can use an MTAudioProcessingTap to "tap" the samples as they are played.
If you simply want to examine the samples in your file, you can use AVAudioFile.
// Get the total number of sample frames in the file.
NSError error;
var audioFile = new AVAudioFile(url, out error); // url is the NSUrl chosen in the open panel
var samples = audioFile.Length;
I am trying to compress an image client-side (Angular 2 / Ionic 3). When I log the file that is created by the camera, it says "type": null when it should say "type": "image/jpeg".
I am using the Ionic camera plugin to handle taking a picture or choosing one from the photo library; after that (I assume) the file is created without a type. Every compression tool I have tried has had this problem, and I have run out of options. Is there a way to change the type of a File object?
I created a new Blob with this method and made sure to give it an image type:
dataURItoBlob(dataURI, callback): Promise<any> {
  return new Promise((resolve, reject) => {
    let byteString = atob(dataURI);
    //console.log(byteString);

    // separate out the mime component
    //let mimeString = dataURI.split(',')[0].split(':')[1].split(';')[0]

    // write the bytes of the string to an ArrayBuffer
    let ab = new ArrayBuffer(byteString.length);
    let ia = new Uint8Array(ab);
    for (let i = 0; i < byteString.length; i++) {
      ia[i] = byteString.charCodeAt(i);
    }

    // write the ArrayBuffer to a blob, and you're done
    let bb = new Blob([ab], {type: 'image/jpeg'});
    resolve(bb);
  });
}
I used it after reading the contents of the image as a base64 data URL, like this, and got the resulting blob with the image type:
var readerZ = new FileReader();
readerZ.onload = (e) => {
  // strip the "data:*/*;base64," prefix before decoding
  let data = readerZ.result.split(',')[1];
  //console.log(data);
  self.dataURItoBlob(data, null).then(blob => {
    // blob now has type 'image/jpeg'
  });
};
readerZ.readAsDataURL(file); // file is the image File/Blob from the camera plugin
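As a footnote to the original question ("Is there a way to change the type of a File object?"): a File's type can't be changed in place, but where the File constructor is available you can wrap the same bytes in a new File with an explicit type, for example:

// Wrap an untyped blob (or file) in a new File object with an explicit MIME type.
// The file name here is only illustrative.
let typedFile = new File([blob], 'photo.jpg', { type: 'image/jpeg' });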
After downloading the Xamarin.SignaturePad sample, I just want to get the image from the signature pad into a memory stream and show it in my image view. Here is my code; it works fine on iOS, but on Android it shows an empty stream:
var image = await padView.GetImageStreamAsync(SignatureImageFormat.Png);
var stream = new MemoryStream();
await image.CopyToAsync(stream);
var imageByteArray = stream.ToArray();
img_result.Source = ImageSource.FromStream(() => new MemoryStream(imageByteArray));
Just cast your image stream to a MemoryStream; it should be valid:
var imageStream = await padView.GetImageStreamAsync(SignatureImageFormat.Png);

// The returned stream is actually a MemoryStream, so the cast is valid.
var mstream = (MemoryStream)imageStream;

// Unfortunately the stream above isn't usable until you copy its bytes out into a fresh stream.
mstream = new MemoryStream(mstream.ToArray());

// Now you can use it as the image source.
img_result.Source = ImageSource.FromStream(() => mstream);
I'm attempting to take a photo with my device's camera, but images taken with the device held in portrait mode come out sideways. I'd like to rotate them before saving, but the solution I keep coming across isn't working for me.
Windows.Storage.Streams.InMemoryRandomAccessStream stream = new Windows.Storage.Streams.InMemoryRandomAccessStream();
imagePreview.Source = null;

await stream.WriteAsync(currentImage.AsBuffer());
stream.Seek(0);

BitmapDecoder decoder = await BitmapDecoder.CreateAsync(stream);
BitmapEncoder encoder = await BitmapEncoder.CreateForTranscodingAsync(stream, decoder);
encoder.BitmapTransform.Rotation = BitmapRotation.Clockwise90Degrees;
encoder.IsThumbnailGenerated = false;
await encoder.FlushAsync();

//save the image
StorageFolder folder = KnownFolders.SavedPictures;
StorageFile capturefile = await folder.CreateFileAsync("photo_" + DateTime.Now.Ticks.ToString() + ".bmp", CreationCollisionOption.ReplaceExisting);
string captureFileName = capturefile.Name;

//store stream in file
using (var fileStream = await capturefile.OpenStreamForWriteAsync())
{
    try
    {
        //because of the using statement the stream will be closed automatically after copying finishes
        await Windows.Storage.Streams.RandomAccessStream.CopyAsync(stream, fileStream.AsOutputStream());
    }
    catch
    {
    }
}
This produces the original image with no rotation applied to it. I've looked at a lot of samples and can't figure out what I'm doing wrong.