Chrome ARC AAC Encoder crashing - google-chrome-arc

I encode audio using the AAC encoder in my app and everything works fine on Android, but the same APK crashes when running in Chrome with ARC Welder.
My code looks like this:
mRecorder = new MediaRecorder();
mRecorder.setOnInfoListener(infoListener);
String audioFilePath = mUri.getPath();
mRecorder.setAudioSource(MediaRecorder.AudioSource.DEFAULT);
mRecorder.setOutputFormat(MediaRecorder.OutputFormat.MPEG_4);
mRecorder.setOutputFile(audioFilePath);
mRecorder.setAudioEncoder(MediaRecorder.AudioEncoder.AAC);
mRecorder.setAudioEncodingBitRate(256 * 1024);
mRecorder.setAudioSamplingRate(44100);
mRecorder.setMaxDuration(600000); // 600000ms = 10 minutes
try {
    mRecorder.prepare();
} catch (IOException ex) {
    Log.e(TAG, "Prepare() failed", ex);
}
mRecorder.start();
It crashes on
mRecorder.start();
I've found that by changing the encoder and format to AMR-WB I can record and play back the audio OK. Is there any way I can get AAC to work?
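One likely explanation is that ARC's media stack simply lacks an AAC encoder, so start() fails at runtime. A minimal, platform-independent sketch of the fallback policy described above (the `EncoderFallback` class and the string constants are hypothetical stand-ins for the `MediaRecorder.AudioEncoder` values; real Android code would attempt prepare()/start() and catch the failure):

```java
import java.util.List;

class EncoderFallback {
    // Hypothetical helper: prefer AAC, fall back to AMR-WB when the
    // runtime (e.g. ARC) does not support it.
    static String pickEncoder(List<String> supported) {
        if (supported.contains("AAC")) return "AAC";
        if (supported.contains("AMR_WB")) return "AMR_WB";
        throw new IllegalStateException("No supported audio encoder");
    }
}
```

On a normal Android device both choices work; under ARC only the AMR-WB branch would be taken.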

Related

Xam.Plugin.Media is giving a crash on iOS devices while picking a video

I am using Plugin.Media to pick a video from my device. When I upload the video, it shows it is being compressed and the app immediately crashes. How do I successfully upload a video from my iOS device?
Error Message:
Access to the path 'private/var/mobile/Containers/Data/PluginKitPlugin/../tmp/file1.mov' is denied
Xam.Plugin version: 4.4.10-beta
iOS Version: 13.4
if (!CrossMedia.Current.IsPickVideoSupported)
{
await DisplayAlert("Videos Not Supported", ":( Permission not granted to videos.", "OK");
return;
}
videoFile = await CrossMedia.Current.PickVideoAsync();
if (videoFile == null)
return;
using (MemoryStream memory = new MemoryStream())
using (Stream stream = videoFile.GetStream()) // dispose the picked stream too
{
    stream.CopyTo(memory);
    videoArray = memory.ToArray();
}
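The using block above just buffers the whole picked video into memory before upload. The same pattern, sketched in Java for illustration (the `StreamBuffer` helper is hypothetical; it mirrors the C# MemoryStream copy):

```java
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InputStream;

class StreamBuffer {
    // Copy an arbitrary InputStream fully into a byte array,
    // the Java analogue of the C# MemoryStream pattern above.
    static byte[] toByteArray(InputStream in) throws IOException {
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        byte[] buf = new byte[8192];
        int n;
        while ((n = in.read(buf)) != -1) {
            out.write(buf, 0, n);
        }
        return out.toByteArray();
    }
}
```

Note that this holds the entire video in memory; for large files, streaming straight to the upload request avoids that cost.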

Xamarin iOS ReadToEnd() on stream from HttpWebResponse not working

I have this code, which works fine on Android after I get a response from the web service but fails on iOS. It was also working fine on iOS before I updated Visual Studio. It fails during reader.ReadToEnd().
try
{
    HttpWebResponse response = await GetWebResponseAsync();
}
catch (WebException ex) // WebException exposes the failed response
{
    var stream = ex.Response.GetResponseStream();
    using (var reader = new StreamReader(stream))
    {
        retVal.ResponseText = reader.ReadToEnd(); // It fails during ReadToEnd()
    }
}
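For illustration, the drain-the-stream step can be sketched in Java (the `ResponseReader` helper is hypothetical; it only mirrors what StreamReader.ReadToEnd() does, reading the response or error stream fully into a string):

```java
import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStream;
import java.io.InputStreamReader;
import java.nio.charset.StandardCharsets;

class ResponseReader {
    // Drain a response stream into a String, the Java analogue of
    // StreamReader.ReadToEnd(). The caller decides which stream to pass
    // (e.g. the error stream of a failed HTTP call).
    static String readToEnd(InputStream stream) throws IOException {
        StringBuilder sb = new StringBuilder();
        try (BufferedReader reader = new BufferedReader(
                new InputStreamReader(stream, StandardCharsets.UTF_8))) {
            char[] buf = new char[4096];
            int n;
            while ((n = reader.read(buf)) != -1) {
                sb.append(buf, 0, n);
            }
        }
        return sb.toString();
    }
}
```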

Discord.NET 1.0.2 sending voice to voice channel not working

I did everything like the Discord.Net Documentation guide on voice -
https://discord.foxbot.me/latest/guides/voice/sending-voice.html
but it didn't work: the bot joins the voice channel but doesn't make any sound.
I have ffmpeg installed in PATH and ffmpeg.exe in my bot directory along with opus.dll and libsodium.dll, so I don't know what the problem is...
public class gil : ModuleBase<SocketCommandContext>
{
[Command("join")]
public async Task JoinChannel(IVoiceChannel channel = null)
{
// Get the audio channel
channel = channel ?? (Context.Message.Author as IGuildUser)?.VoiceChannel;
if (channel == null) { await Context.Message.Channel.SendMessageAsync("User must be in a voice channel, or a voice channel must be passed as an argument."); return; }
// For the next step with transmitting audio, you would want to pass this Audio Client in to a service.
var audioClient = await channel.ConnectAsync();
await SendAsync(audioClient,"audio/hello.mp3");
}
private Process CreateStream(string path)
{
return Process.Start(new ProcessStartInfo
{
FileName = "ffmpeg.exe",
Arguments = $"-hide_banner -loglevel panic -i \"{path}\" -ac 2 -f s16le -ar 48000 pipe:1",
UseShellExecute = false,
RedirectStandardOutput = true,
});
}
private async Task SendAsync(IAudioClient client, string path)
{
// Create FFmpeg using the previous example
using (var ffmpeg = CreateStream(path))
using (var output = ffmpeg.StandardOutput.BaseStream)
using (var discord = client.CreatePCMStream(AudioApplication.Mixed))
{
try { await output.CopyToAsync(discord); }
finally { await discord.FlushAsync(); }
}
}
}
please help
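The CreateStream() arguments above are doing the real work: ffmpeg decodes the file to raw 48 kHz stereo 16-bit little-endian PCM on stdout, which Discord.NET then pipes into the voice channel. A minimal Java sketch of the same command line (the `FfmpegCommand` class is hypothetical; only the argument list mirrors the snippet above):

```java
import java.util.Arrays;
import java.util.List;

class FfmpegCommand {
    // Build the same ffmpeg invocation the C# CreateStream() uses:
    // decode `path` to raw 48 kHz stereo 16-bit PCM on stdout.
    static List<String> build(String path) {
        return Arrays.asList(
            "ffmpeg", "-hide_banner", "-loglevel", "panic",
            "-i", path,
            "-ac", "2",     // two channels (stereo)
            "-f", "s16le",  // signed 16-bit little-endian PCM
            "-ar", "48000", // 48 kHz sample rate, what Discord expects
            "pipe:1");      // write raw audio to stdout
    }
}
```

If any of these output parameters differ from what CreatePCMStream() expects, the bot sits in the channel silently, which matches the symptom described; another common cause is ffmpeg writing no output at all (e.g. a bad input path), which `-loglevel panic` hides.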

SignaturePad Xamarin Forms PCL unable to capture Image

I am using SignaturePad Nuget for PCL.
How do I convert this System.IO.Stream into an image?
Any idea would be appreciated.. Thank you.
Here's what I found after 2 weeks of research about this.
This is the C# code to save a signature to a PNG file on an Android phone.
async void Enregistrer(object sender, EventArgs e)
{
    // Get the image as a stream; this returns the bitmap from SignaturePad.
    var img = padView.GetImage(Acr.XamForms.SignaturePad.ImageFormatType.Png);
    img.Seek(0, SeekOrigin.Begin); // rewind the stream to the beginning
    // Create a new file stream for writing.
    using (FileStream fs = new FileStream(
        Android.OS.Environment.ExternalStorageDirectory + "/imagefilename.png",
        FileMode.Create, FileAccess.Write))
    {
        await img.CopyToAsync(fs); // write the image stream to disk
    }
    img.Dispose(); // release the resources used by the stream
}
Have fun!
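The save step above is just stream-to-file copying. The same idea sketched in Java for illustration (the `PngSaver` helper and the target path are hypothetical; the caller supplies the image stream, e.g. from a signature pad):

```java
import java.io.IOException;
import java.io.InputStream;
import java.nio.file.Files;
import java.nio.file.Path;

class PngSaver {
    // Write an image stream to a file, mirroring the C# snippet above.
    // Files.copy streams the bytes to disk and returns the byte count.
    static long save(InputStream img, Path target) throws IOException {
        return Files.copy(img, target);
    }
}
```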

How to encode picture to H264 use AVFoundation on Mac, not use x264

I'm trying to make a Mac broadcast client that encodes to H.264 for FFmpeg, but without using the x264 library.
So basically, I am able to get raw frames out of AVFoundation, as either CMSampleBufferRef or AVPicture. Is there a way to encode a series of those pictures into H.264 frames using an Apple framework, like AVVideoCodecH264?
I know how to encode using AVAssetWriter, but that only saves the video into a file. I don't want the file; instead, I'd like to have AVPackets so I can send them out using FFmpeg. Does anyone have any idea? Thank you.
After referring to the VideoCore project, I'm able to use Apple's VideoToolbox framework to encode in hardware.
Start a VTCompressionSession:
// Create compression session
err = VTCompressionSessionCreate(kCFAllocatorDefault,
                                 frameW,
                                 frameH,
                                 kCMVideoCodecType_H264,
                                 encoderSpecifications,
                                 NULL,
                                 NULL,
                                 (VTCompressionOutputCallback)vtCallback,
                                 self,
                                 &compression_session);
if (err == noErr) {
    // compression_session is now ready to encode frames
}
Push the raw frames into the VTCompressionSession:
// Delegate method from the AVCaptureVideoDataOutputSampleBufferDelegate
- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection {
// get pixelbuffer out from samplebuffer
CVPixelBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
//set up all the info for the frame
// call VT to encode frame
VTCompressionSessionEncodeFrame(compression_session, pixelBuffer, pts, dur, NULL, NULL, NULL);
}
Get the encoded frames in the VT callback; this is a C function passed as a parameter to VTCompressionSessionCreate():
void vtCallback(void *outputCallbackRefCon,
void *sourceFrameRefCon,
OSStatus status,
VTEncodeInfoFlags infoFlags,
CMSampleBufferRef sampleBuffer ) {
// Do whatever you want with the sampleBuffer
CMBlockBufferRef blockBuffer = CMSampleBufferGetDataBuffer(sampleBuffer);
}
