App MediaFrameReader always returns null Bitmap - xamarin

I am using MediaCapture to preview the camera on the screen, which is working fine.
However, I need to access the current frames in my app, so I added a MediaFrameReader to the MediaCapture to receive the Reader_FrameArrived event.
The event fires, but the bitmap itself is always null.
I implemented the event callback just as shown in the examples:
private void Reader_FrameArrived(MediaFrameReader sender, MediaFrameArrivedEventArgs args)
{
    var mediaFrameReference = sender.TryAcquireLatestFrame();
    var videoMediaFrame = mediaFrameReference?.VideoMediaFrame;
    var softwareBitmap = videoMediaFrame?.SoftwareBitmap;
    if (softwareBitmap != null)
    {
        Debug.WriteLine("here");
    }
    else
    {
        return;
    }
}
And this is how I initialize the reader:
var allGroups = await MediaFrameSourceGroup.FindAllAsync();
var eligibleGroups = allGroups.Select(g => new
{
    Group = g,
    // For each source kind, find the source which offers that kind of media frame,
    // or null if there is no such source.
    SourceInfos = new MediaFrameSourceInfo[]
    {
        g.SourceInfos.FirstOrDefault(info => info.SourceKind == MediaFrameSourceKind.Color),
        g.SourceInfos.FirstOrDefault(info => info.SourceKind == MediaFrameSourceKind.Depth),
        g.SourceInfos.FirstOrDefault(info => info.SourceKind == MediaFrameSourceKind.Infrared),
    }
}).Where(g => g.SourceInfos.Any(info => info != null)).ToList();

if (eligibleGroups.Count == 0)
{
    System.Diagnostics.Debug.WriteLine("No source group with color, depth or infrared found.");
    return;
}

var selectedGroupIndex = 0; // Select the first eligible group
MediaFrameSourceGroup selectedGroup = eligibleGroups[selectedGroupIndex].Group;
// The SourceInfos array above is ordered Color, Depth, Infrared.
MediaFrameSourceInfo colorSourceInfo = eligibleGroups[selectedGroupIndex].SourceInfos[0];
MediaFrameSourceInfo depthSourceInfo = eligibleGroups[selectedGroupIndex].SourceInfos[1];
MediaFrameSourceInfo infraredSourceInfo = eligibleGroups[selectedGroupIndex].SourceInfos[2];

// FrameSources is a dictionary; TryGetValue avoids a KeyNotFoundException
// if the id is not present.
if (_mediaCapture.FrameSources.TryGetValue(colorSourceInfo.Id, out var colorFrameSource))
{
    _reader = await _mediaCapture.CreateFrameReaderAsync(colorFrameSource, MediaEncodingSubtypes.Argb32);
    _reader.FrameArrived += Reader_FrameArrived;
    MediaFrameReaderStartStatus result = await _reader.StartAsync();
    Debug.WriteLine(result.ToString());
}
Any ideas why the bitmap could be null?

For this scenario, we suggest you use the MediaCapture class to capture the bitmap directly: it provides a GetPreviewFrameAsync method that gets a preview frame from the capture device, which you can then convert to a SoftwareBitmap.
private async Task GetPreviewFrameAsSoftwareBitmapAsync()
{
    // Get information about the preview
    var previewProperties = _mediaCapture.VideoDeviceController.GetMediaStreamProperties(MediaStreamType.VideoPreview) as VideoEncodingProperties;

    // Create the video frame to request a SoftwareBitmap preview frame
    var videoFrame = new VideoFrame(BitmapPixelFormat.Bgra8, (int)previewProperties.Width, (int)previewProperties.Height);

    // Capture the preview frame
    using (var currentFrame = await _mediaCapture.GetPreviewFrameAsync(videoFrame))
    {
        // Collect the resulting frame
        SoftwareBitmap previewFrame = currentFrame.SoftwareBitmap;

        // Show the frame information
        FrameInfoTextBlock.Text = String.Format("{0}x{1} {2}", previewFrame.PixelWidth, previewFrame.PixelHeight, previewFrame.BitmapPixelFormat);

        // Add a simple green filter effect to the SoftwareBitmap
        if (GreenEffectCheckBox.IsChecked == true)
        {
            ApplyGreenFilter(previewFrame);
        }

        // Show the frame (as is, no rotation is being applied)
        if (ShowFrameCheckBox.IsChecked == true)
        {
            // Create a SoftwareBitmapSource to display the SoftwareBitmap to the user
            var sbSource = new SoftwareBitmapSource();
            await sbSource.SetBitmapAsync(previewFrame);

            // Display it in the Image control
            PreviewFrameImage.Source = sbSource;
        }

        // Save the frame (as is, no rotation is being applied)
        if (SaveFrameCheckBox.IsChecked == true)
        {
            var file = await _captureFolder.CreateFileAsync("PreviewFrame.jpg", CreationCollisionOption.GenerateUniqueName);
            Debug.WriteLine("Saving preview frame to " + file.Path);
            await SaveSoftwareBitmapAsync(previewFrame, file);
        }
    }
}
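The snippet above calls a SaveSoftwareBitmapAsync helper that the official sample defines elsewhere. In case you need one, here is a minimal sketch using BitmapEncoder; the JPEG encoding and Bgra8 conversion shown are the standard UWP pattern, not code from the sample itself:
private static async Task SaveSoftwareBitmapAsync(SoftwareBitmap bitmap, StorageFile file)
{
    using (var outputStream = await file.OpenAsync(FileAccessMode.ReadWrite))
    {
        var encoder = await BitmapEncoder.CreateAsync(BitmapEncoder.JpegEncoderId, outputStream);

        // BitmapEncoder expects Bgra8 with premultiplied alpha; convert if needed.
        if (bitmap.BitmapPixelFormat != BitmapPixelFormat.Bgra8 ||
            bitmap.BitmapAlphaMode == BitmapAlphaMode.Straight)
        {
            bitmap = SoftwareBitmap.Convert(bitmap, BitmapPixelFormat.Bgra8, BitmapAlphaMode.Premultiplied);
        }

        encoder.SetSoftwareBitmap(bitmap);
        await encoder.FlushAsync();
    }
}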
And here is the official code sample that you can refer to directly.

Related

Player Notifications with Xamarin MediaManager plugin

I'm using the MediaManager NuGet plugin and it's great, but for the life of me, I can't get the lock screen or car Bluetooth to show the notifications. I'm using the following to display the notifications (set within OnAppearing):
ViewModel.PropertyChanged += (sender, e) =>
{
    switch (e.PropertyName)
    {
        case "RadioSchedule":
            if (listData != null)
            {
                listData.ItemsSource = null;
                var first = ViewModel.RadioSchedule[0];
                Device.BeginInvokeOnMainThread(() =>
                {
                    listData.ItemsSource = ViewModel.RadioSchedule;
                    MediaFile.Metadata.Artist = MediaFile.Metadata.DisplaySubtitle = MediaFile.Metadata.AlbumArtist = first.Artist;
                    MediaFile.Metadata.Title = MediaFile.Metadata.DisplayTitle = first.Track;
                    MediaFile.Metadata.DisplayIcon = new Image { Source = "icon".CorrectedImageSource() };
                    MediaFile.Metadata.BluetoothFolderType = "1";
                    MediaFile.Type = MediaFileType.Audio;
                    MediaFile.Url = Constants.RadioStream;
                    MediaFile.Availability = ResourceAvailability.Remote;
                    MediaFile.MetadataExtracted = true;
                    MediaFile.Metadata.Date = DateTime.Now;
                    MediaFile.Metadata.Duration = 300;
                    MediaFile.Metadata.Genre = "Rock";
                    MediaFile.Metadata.TrackNumber = MediaFile.Metadata.NumTracks = 1;
                    MediaFile.Metadata.DisplayDescription = "Radio Station";
                    if (!ViewModel.NotificationStarted)
                    {
                        if (CrossMediaManager.Current.MediaNotificationManager != null)
                            CrossMediaManager.Current.MediaNotificationManager.StartNotification(MediaFile);
                        ViewModel.NotificationStarted = true;
                    }
                    CrossMediaManager.Current.MediaNotificationManager?.UpdateNotifications(MediaFile, MediaPlayerStatus.Playing);
                });
            }
            break;
    }
};
The code itself is being hit (I can set breakpoints and they are hit). I've tried it on and off the UI thread as well.
The playlist comes from a web API, which works fine. The notifier shows unknown/unknown on the device media player (both iOS and Android) and nothing in-car. For Android, the permissions the readme file says to use have also been set.
Is there some sort of magic I have to do to get this to work? This is a Xam.Forms package rather than something native.
The MediaPlayer is started further down in the class using the following code:
CrossMediaManager.Current.Play(Constants.RadioStream, MediaFileType.Audio, ResourceAvailability.Remote);
Where Constants.RadioStream is the URL of the radio stream.

Xamarin.Android Record Video - Quality Poor

I'm using the following Xamarin tutorial https://developer.xamarin.com/recipes/android/media/video/record_video/
I can successfully record video and audio; however, the quality is not very good. Can anyone suggest/explain how I can increase the quality, please?
I know the device can record in higher quality, because the native camera app records in much higher quality.
EDIT: here is my code so far.
protected override void OnCreate(Bundle savedInstanceState)
{
    base.OnCreate(savedInstanceState);

    // Set our view from the "main" layout resource
    SetContentView(Resource.Layout.RecordVideo);
    var record = FindViewById<Button>(Resource.Id.Record);
    var stop = FindViewById<Button>(Resource.Id.Stop);
    var play = FindViewById<Button>(Resource.Id.Play);
    var video = FindViewById<VideoView>(Resource.Id.SampleVideoView);
    var videoPlayback = FindViewById<VideoView>(Resource.Id.PlaybackVideoView);
    string path = Android.OS.Environment.ExternalStorageDirectory.AbsolutePath + "/test.mp4";

    if (Camera.NumberOfCameras < 2)
    {
        Toast.MakeText(this, "Front camera missing", ToastLength.Long).Show();
        return;
    }

    video.Visibility = ViewStates.Visible;
    videoPlayback.Visibility = ViewStates.Gone;

    _camera = Camera.Open(1);
    _camera.SetDisplayOrientation(90);
    _camera.Unlock();

    recorder = new MediaRecorder();
    recorder.SetCamera(_camera);
    recorder.SetAudioSource(AudioSource.Mic);
    recorder.SetVideoSource(VideoSource.Camera);
    recorder.SetOutputFormat(OutputFormat.Default);
    recorder.SetAudioEncoder(AudioEncoder.Default);
    recorder.SetVideoEncoder(VideoEncoder.Default);
    //var cameraProfile = CamcorderProfile.Get(CamcorderQuality.HighSpeed1080p);
    //recorder.SetProfile(cameraProfile);
    recorder.SetOutputFile(path);
    recorder.SetOrientationHint(270);
    recorder.SetPreviewDisplay(video.Holder.Surface);

    record.Click += delegate
    {
        recorder.Prepare();
        recorder.Start();
    };

    stop.Click += delegate
    {
        if (recorder != null)
        {
            video.Visibility = ViewStates.Gone;
            videoPlayback.Visibility = ViewStates.Visible;
            recorder.Stop();
            recorder.Release();
        }
    };

    play.Click += delegate
    {
        video.Visibility = ViewStates.Gone;
        videoPlayback.Visibility = ViewStates.Visible;
        var uri = Android.Net.Uri.Parse(path);
        videoPlayback.SetVideoURI(uri);
        videoPlayback.Start();
    };
}
I don't see the example specifying a CamcorderProfile anywhere, so you might want to start from that. It's possible that the default framerate, bitrate and video frame size are lower than you'd expect. I'm not at a computer right now, but try setting the profile to, for example, QUALITY_1080p using the SetProfile method on MediaRecorder.
You need to set the profile after setting the video and audio sources but before calling the SetOutputFile method.
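A minimal sketch of that ordering, assuming camera id 1 (the front camera from the code above) and checking for 1080p support before requesting it. Note that SetProfile configures the output format and both encoders itself, so it replaces the SetOutputFormat/SetAudioEncoder/SetVideoEncoder calls; calling both will throw:
recorder = new MediaRecorder();
recorder.SetCamera(_camera);
recorder.SetAudioSource(AudioSource.Mic);
recorder.SetVideoSource(VideoSource.Camera);

// SetProfile must come after the sources and before SetOutputFile.
// Fall back to CamcorderQuality.High if this camera can't do 1080p.
var quality = CamcorderProfile.HasProfile(1, CamcorderQuality.Q1080p)
    ? CamcorderQuality.Q1080p
    : CamcorderQuality.High;
recorder.SetProfile(CamcorderProfile.Get(1, quality));

recorder.SetOutputFile(path);
recorder.SetOrientationHint(270);
recorder.SetPreviewDisplay(video.Holder.Surface);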

Unable to port Lumia Imaging SDK 2.0 to SDK 3.0 (UWP)

I am having a tough time converting Lumia Imaging SDK 2.0 code to SDK 3.0 in the specific case below. I used to increase/decrease the image quality of a JPG file using the code below in Windows Phone 8.1 RT apps:
using (StreamImageSource source = new StreamImageSource(fileStream.AsStreamForRead()))
{
    IFilterEffect effect = new FilterEffect(source);
    using (JpegRenderer renderer = new JpegRenderer(effect))
    {
        renderer.Quality = App.COMPRESSION_RATIO / 100.0; // higher value means better quality
        compressedImageBytes = await renderer.RenderAsync();
    }
}
Now, since the FilterEffect class has been replaced in SDK 3.0 with EffectList, I changed the code to:
using (BufferProviderImageSource source = new BufferProviderImageSource(fileStream.AsBufferProvider()))
{
    using (JpegRenderer renderer = new JpegRenderer())
    {
        IImageProvider2 source1 = new EffectList() { Source = source };
        renderer.Source = source1;
        renderer.Quality = App.COMPRESSION_RATIO / 100.0;
        try
        {
            var img = await renderer.RenderAsync();
        }
        catch (Exception ex)
        {
            ;
        }
    }
}
I am getting an InvalidCastException. I have tried several combinations, but no luck.
I don't really know what is going on with the InvalidCastException; we can continue that discussion in the comments, as it will most likely need some back-and-forth.
That said, you could continue without the effect list and chain effects in the normal way. So, to rewrite your scenario:
using (var source = new StreamImageSource(...))
using (var renderer = new JpegRenderer(source))
{
    renderer.Quality = App.COMPRESSION_RATIO / 100.0;
    var img = await renderer.RenderAsync();
}
If you wanted to add an effect (for example a CartoonEffect), just do:
using (var source = new StreamImageSource(...))
using (var cartoonEffect = new CartoonEffect(source))
using (var renderer = new JpegRenderer(cartoonEffect))
{
    renderer.Quality = App.COMPRESSION_RATIO / 100.0;
    var img = await renderer.RenderAsync();
}
and so on. If you had effects A, B, C and D, just make a chain Source -> A -> B -> C -> D -> JpegRenderer, as sketched below.
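For illustration, a minimal sketch of such a chain in the same constructor style; GrayscaleEffect and SharpnessEffect here are just stand-ins for A and B, so substitute whatever effects you actually need:
using (var source = new StreamImageSource(fileStream.AsStreamForRead()))
using (var effectA = new GrayscaleEffect(source))   // Source -> A
using (var effectB = new SharpnessEffect(effectA))  // A -> B
using (var renderer = new JpegRenderer(effectB))    // B -> JpegRenderer
{
    renderer.Quality = App.COMPRESSION_RATIO / 100.0;
    compressedImageBytes = await renderer.RenderAsync();
}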
I am on VS 2015 Community. While digging around, I got the code below working, which behaves exactly the same as SDK 2.0. All I did was specify the Size of the JpegRenderer. It works for all landscape images but fails to transform portrait images to the correct orientation. There is no exception, but the result for a portrait image is a wildly stretched landscape image.
I initialized the Size for portrait images to Size(765, 1024), but it had no impact.
using (JpegRenderer renderer = new JpegRenderer(source))
{
    renderer.Quality = App.COMPRESSION_RATIO / 100.0;
    try
    {
        var info = await source.GetInfoAsync();
        renderer.Size = new Size(1024, 765);
        compressedImageBytes = await renderer.RenderAsync();
    }
    catch (Exception ex)
    {
        new MessageDialog("Error while compressing.").ShowAsync();
    }
}
Sorry, the working code was using BufferProviderImageSource instead of StreamImageSource. Below is the snippet. A few points here:
1) If I don't use the Size property, I get a "The component cannot be found" exception.
2) GetInfoAsync(): yes, it was useless in the code above, but I need it to know whether the image is landscape or portrait, so that I can initialize the Size property of the resulting image.
3) If the Size goes beyond 1024x1024 for portrait images, I get the exception "Value does not fall within the expected range".
Why did Lumia make this version so tricky? :(
var buffer = await FileIO.ReadBufferAsync(file);
using (var source = new BufferProviderImageSource(buffer.AsBufferProvider()))
{
    EffectList list = new EffectList() { Source = source };
    using (JpegRenderer renderer = new JpegRenderer(list))
    {
        renderer.Quality = App.COMPRESSION_RATIO / 100.0;
        renderer.OutputOption = OutputOption.PreserveAspectRatio;
        try
        {
            var info = await source.GetInfoAsync();
            double width = 0;
            double height = 0;
            if (info.ImageSize.Width > info.ImageSize.Height) // landscape
            {
                width = 1024;
                height = 765;
                if (info.ImageSize.Width < 1024)
                    width = info.ImageSize.Width;
                if (info.ImageSize.Height < 765)
                    height = info.ImageSize.Height;
            }
            else // portrait
            {
                width = 765;
                height = 1024;
                if (info.ImageSize.Width < 765)
                    width = info.ImageSize.Width;
                if (info.ImageSize.Height < 1024)
                    height = info.ImageSize.Height;
            }
            renderer.Size = new Size(width, height);
            compressedImageBytes = await renderer.RenderAsync();
        }
        catch (Exception ex)
        {
            new MessageDialog(ex.Message).ShowAsync();
        }
    }
}

Why does live camera capture control with Xamarin Forms on iOS freeze?

I downloaded the source for Xamarin Moments from GitHub, and now I'm trying to convert the CameraPage renderer from a Page to a ContentView.
Then I refactored the code to make it a ContentView renderer. Most of the actual setup of the live preview and image capture comes from the Moments app, with some refactoring where needed/preferred.
The live preview shows up, but when I press the button to take the picture, the app freezes without an exception; nothing shows up even in Xcode's console view.
// this is how it's called:
btnTakePicture.Clicked += (s, e) => { GetCameraImage().Wait(); };

// this method freezes
public async Task<byte[]> GetCameraImage()
{
    byte[] imageBuffer = null;
    if (captureDeviceInput != null)
    {
        var videoConnection = stillImageOutput.ConnectionFromMediaType(AVMediaType.Video);
        Console.WriteLine("[HASFIQWRPPOA] This message shows up");

        // this is where the app freezes, even though the live preview still moves.
        var sampleBuffer = await stillImageOutput.CaptureStillImageTaskAsync(videoConnection);
        Console.WriteLine("[CLKJFADSFQXW] THIS DOESN'T SHOW UP");

        // var jpegImageAsBytes = AVCaptureStillImageOutput.JpegStillToNSData (sampleBuffer).ToArray ();
        var jpegImageAsNsData = AVCaptureStillImageOutput.JpegStillToNSData(sampleBuffer);
        Console.WriteLine("[ROIAJDGNQWTG]");

        // var image = new UIImage (jpegImageAsNsData);
        // var image2 = new UIImage (image.CGImage, image.CurrentScale, UIImageOrientation.UpMirrored);
        // var data = image2.AsJPEG ().ToArray ();
        imageBuffer = jpegImageAsNsData.ToArray();
        Console.WriteLine("[FIOUJGAIDGUQ] Image buffer: " + imageBuffer.Length);
    }

    if (imageBuffer != null && imageBuffer.Length > 100)
    {
        using (var ms = new MemoryStream(imageBuffer))
        {
            var uiimg = UIImage.LoadFromData(NSData.FromStream(ms));
            this.Add(new UIImageView(uiimg));
        }
    }
    return imageBuffer;
}
Here is how I set up the live preview.
// This method runs fine and the camera preview is started as expected
public void SetupLiveCameraStream()
{
    try
    {
        // add a UIView to the renderer
        liveCameraStream = new UIView()
        {
            Frame = new CGRect(0f, 0f, Element.Width, Element.Height),
        };
        this.Add(liveCameraStream);

        // find a camera
        var captureDevice = AVCaptureDevice.DefaultDeviceWithMediaType(AVMediaType.Video);
        if (captureDevice != null)
        {
            Console.WriteLine("[ZKSDJGWEHSY] Capture device found"); // not the case on simulator
            captureSession = new AVCaptureSession();
            videoPreviewLayer = new AVCaptureVideoPreviewLayer(captureSession)
            {
                Frame = liveCameraStream.Bounds
            };
            liveCameraStream.Layer.AddSublayer(videoPreviewLayer);

            ConfigureCameraForDevice(captureDevice);
            captureDeviceInput = AVCaptureDeviceInput.FromDevice(captureDevice);

            var dictionary = new NSMutableDictionary();
            dictionary[AVVideo.CodecKey] = new NSNumber((int)AVVideoCodec.JPEG);
            stillImageOutput = new AVCaptureStillImageOutput()
            {
                OutputSettings = new NSDictionary()
            };

            captureSession.AddInput(captureDeviceInput);
            captureSession.AddOutput(stillImageOutput);
            captureSession.StartRunning();
            Console.WriteLine("[OIGAJGUWRJHWY] Camera session started");
        }
        else
        {
            Console.WriteLine("[OASDFUJGOR] Could not find a camera device");
        }
    }
    catch (Exception x)
    {
        Console.WriteLine("[QWKRIFQEAHJF] ERROR:" + x);
    }
}
I had this issue, and it turned out I was deadlocking because of a combination of using async/await with Task.Result. At a guess you could be experiencing something similar with your usage of Task.Wait().
The two sections of code:
btnTakePicture.Clicked += (s, e) => { GetCameraImage().Wait(); };
And:
var sampleBuffer = await stillImageOutput.CaptureStillImageTaskAsync(videoConnection);
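If that's the cause, the usual fix is to keep the chain asynchronous all the way up instead of blocking on the task. A minimal sketch of what I mean (untested against your code):
// Await inside an async event handler instead of calling .Wait(),
// so the UI thread isn't blocked while the capture's continuation
// tries to marshal back onto it.
btnTakePicture.Clicked += async (s, e) =>
{
    var imageBuffer = await GetCameraImage();
    // use imageBuffer here
};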

Pictures keep going to the top left (ActionScript 3)

So my project was to make two gallery pages, which I called "gallery1" and "gallery2". The gallery pages each have 5 thumbnails that act like buttons, so when you click on them, an SWF of the picture opens. The problem is that it always opens at the top left; I want the pictures to be in the middle of the page. This is the code for gallery1 (gallery2 is the same thing but with different pics); b1-b5 are the thumbnails. Please help.
var swfRequest1:URLRequest = new URLRequest("image1.swf");
var swfRequest2:URLRequest = new URLRequest("image2.swf");
var swfRequest3:URLRequest = new URLRequest("image3.swf");
var swfRequest4:URLRequest = new URLRequest("image4.swf");
var swfRequest5:URLRequest = new URLRequest("image5.swf");
var swfLoader:Loader = new Loader();

function opengal(evt:MouseEvent):void
{
    var pTarget:String = evt.currentTarget.name;
    if (pTarget == "b1")
    {
        swfLoader.load(swfRequest1);
        addChild(swfLoader);
    }
    else if (pTarget == "b2")
    {
        swfLoader.load(swfRequest2);
        addChild(swfLoader);
    }
    else if (pTarget == "b3")
    {
        swfLoader.load(swfRequest3);
        addChild(swfLoader);
    }
    else if (pTarget == "b4")
    {
        swfLoader.load(swfRequest4);
        addChild(swfLoader);
    }
    else if (pTarget == "b5")
    {
        swfLoader.load(swfRequest5);
        addChild(swfLoader);
    }
}

b1.addEventListener(MouseEvent.CLICK, opengal);
b2.addEventListener(MouseEvent.CLICK, opengal);
b3.addEventListener(MouseEvent.CLICK, opengal);
b4.addEventListener(MouseEvent.CLICK, opengal);
b5.addEventListener(MouseEvent.CLICK, opengal);
To place your loaded content in the middle of the stage, you can do it like this:
// store file paths in an array
var files:Array = ['image01.swf', 'image02.swf', 'image03.swf', 'image04.swf', 'image05.swf'];
var loader:Loader = new Loader();

// use loader.contentLoaderInfo to listen for Event.INIT
loader.contentLoaderInfo.addEventListener(Event.INIT, function(e:Event):void {
    // center our loader
    loader.x = (stage.stageWidth - loader.width) / 2;
    loader.y = (stage.stageHeight - loader.height) / 2;
});
addChild(loader);

function opengal(evt:MouseEvent):void {
    // here pTarget should be between 0 and 4, used as an array index
    var pTarget:int = Number((evt.currentTarget.name).substr(-1, 1)) - 1;
    loader.load(new URLRequest(files[pTarget]));
}
...
I hope this can help you.
You have to position your swfLoader in the middle of the screen.
You can do this with this code:
swfLoader.x = (stage.stageWidth - swfLoader.width)/2; // and the y axis in a similar way
The required width and height properties can be retrieved when the contentLoaderInfo dispatches an Event.INIT.
