I have a sample iPhone application taken from https://github.com/KhaosT/CBPeripheralManager-Demo/tree/master/PeripheralModeTest. I am advertising my peripheral service as follows.
- (void)peripheralManager:(CBPeripheralManager *)peripheral didAddService:(CBService *)service error:(NSError *)error
{
NSLog(#"didAddService start");
NSLog(#"Added");
NSDictionary *advertisingData = #{CBAdvertisementDataLocalNameKey : #"BTService", CBAdvertisementDataServiceUUIDsKey : #[[CBUUID UUIDWithString:#"EBA38950-0D9B-4DBA-B0DF-BC7196DD44FC"]]};
[peripheral startAdvertising:advertisingData];
NSLog(#"didAddService end");
}
I have taken the Windows app code sample from https://code.msdn.microsoft.com/windowsapps/Bluetooth-Generic-5a99ef95 and replaced the heart-rate UUID with my service UUID. But on running it, it is unable to find the service.
void Scenario1_DeviceEvents::RunButton_Click(Object ^ /* sender */, RoutedEventArgs ^ /* e */)
{
RunButton->IsEnabled = false;
Vector<String^>^ additionalProperties = ref new Vector<String^>;
additionalProperties->Append("System.Devices.ContainerId");
Platform::Guid serviceUuid(0xEBA38950, 0x0D9B, 0x4DBA, 0xB0, 0xDF, 0xBC, 0x71, 0x96, 0xDD, 0x44, 0xFC);
create_task(DeviceInformation::FindAllAsync(
GattDeviceService::GetDeviceSelectorFromUuid(serviceUuid), additionalProperties))
.then([this](Windows::Devices::Enumeration::DeviceInformationCollection^ devices)
{
this->devices = devices;
if (devices->Size > 0)
{
Dispatcher->RunAsync(CoreDispatcherPriority::Normal, ref new DispatchedHandler([this, devices]()
{
DevicesListBox->Items->Clear();
auto serviceNames = ref new Vector<String^>();
for_each(begin(devices), end(devices), [=](DeviceInformation^ device)
{
serviceNames->Append(device->Name);
});
cvs->Source = serviceNames;
DevicesListBox->Visibility = Windows::UI::Xaml::Visibility::Visible;
}));
}
else
{
MainPage::Current->NotifyUser(L"Could not find any Heart Rate devices. Please make sure your " +
"device is paired and powered on!",
NotifyType::StatusMessage);
}
Dispatcher->RunAsync(CoreDispatcherPriority::Normal, ref new DispatchedHandler([this]()
{
RunButton->IsEnabled = true;
}));
}).then([](task<void> finalTask)
{
try
{
// Capture any errors and exceptions that occurred during device discovery
finalTask.get();
}
catch (COMException^ e)
{
MainPage::Current->NotifyUser("Error: " + e->Message, NotifyType::ErrorMessage);
}
});
}
I tried pairing the device with Windows, but still no luck. It has been so frustrating; there seems to be no proper support for discovering devices.
For my study project, I need to build an application that has a CameraView or a CameraPage with a custom design. However, I'm not able to figure out how to do it.
I found a lot of information, to be honest, but it is either obsolete or incomplete, so I would like to sum up the situation in this thread!
How to implement a Camera?
Well, two solutions can be considered based on what I read.
Camera Page
Let's say it's the first "official" solution. It's proposed by Xamarin itself, in the Customizing a ContentPage tutorial/documentation, which explains how to implement the camera service with a cross-platform approach.
I then tried the UWP solution:
<?xml version="1.0" encoding="utf-8" ?>
<ContentPage xmlns="http://xamarin.com/schemas/2014/forms"
xmlns:x="http://schemas.microsoft.com/winfx/2009/xaml"
x:Class="CameraPreviewProject.Sources.Pages.CameraPage">
<ContentPage.Content>
<AbsoluteLayout>
<Button Text="Click me !" AbsoluteLayout.LayoutBounds="0.5, 0.5, 0.1, 0.1" AbsoluteLayout.LayoutFlags="All" />
</AbsoluteLayout>
</ContentPage.Content>
</ContentPage>
Finally, the C# side gives us this:
public partial class CameraPage : ContentPage
{
public CameraPage()
{
InitializeComponent();
}
}
Then, we create a renderer on the UWP side:
using CameraPreviewProject.Sources.Pages;
using CameraPreviewProject.UWP.Sources.PageRenderers;
using System;
using System.Diagnostics;
using System.Linq;
using System.Threading.Tasks;
using Windows.ApplicationModel;
using Windows.Devices.Enumeration;
using Windows.Devices.Sensors;
using Windows.Foundation;
using Windows.Graphics.Display;
using Windows.Graphics.Imaging;
using Windows.Media;
using Windows.Media.Capture;
using Windows.Media.MediaProperties;
using Windows.Storage;
using Windows.Storage.FileProperties;
using Windows.Storage.Streams;
using Windows.System.Display;
using Windows.UI.Core;
using Windows.UI.Xaml;
using Windows.UI.Xaml.Controls;
using Windows.UI.Xaml.Media;
using Xamarin.Forms.Platform.UWP;
[assembly: ExportRenderer(typeof(CameraPage), typeof(CameraPageRenderer))]
namespace CameraPreviewProject.UWP.Sources.PageRenderers
{
public class CameraPageRenderer : PageRenderer
{
private readonly DisplayInformation displayInformation = DisplayInformation.GetForCurrentView();
private readonly SimpleOrientationSensor orientationSensor = SimpleOrientationSensor.GetDefault();
private readonly DisplayRequest displayRequest = new DisplayRequest();
private SimpleOrientation deviceOrientation = SimpleOrientation.NotRotated;
private DisplayOrientations displayOrientation = DisplayOrientations.Portrait;
// Rotation metadata to apply to preview stream (https://msdn.microsoft.com/en-us/library/windows/apps/xaml/hh868174.aspx)
private static readonly Guid RotationKey = new Guid("C380465D-2271-428C-9B83-ECEA3B4A85C1"); // (MF_MT_VIDEO_ROTATION)
private StorageFolder captureFolder = null;
private readonly SystemMediaTransportControls systemMediaControls = SystemMediaTransportControls.GetForCurrentView();
private MediaCapture mediaCapture;
private CaptureElement captureElement;
private bool isInitialized;
private bool isPreviewing;
private bool externalCamera;
private bool mirroringPreview;
private Page page;
private AppBarButton takePhotoButton;
private Application app;
protected override void OnElementChanged(ElementChangedEventArgs<Xamarin.Forms.Page> e)
{
base.OnElementChanged(e);
if (e.OldElement != null || Element == null)
{
return;
}
try
{
app = Application.Current;
app.Suspending += OnAppSuspending;
app.Resuming += OnAppResuming;
SetupUserInterface();
SetupCamera();
this.Children.Add(page);
}
catch (Exception ex)
{
Debug.WriteLine(#" ERROR: ", ex.Message);
}
}
protected override Size ArrangeOverride(Size finalSize)
{
page.Arrange(new Rect(0, 0, finalSize.Width, finalSize.Height));
return finalSize;
}
private void SetupUserInterface()
{
takePhotoButton = new AppBarButton
{
VerticalAlignment = VerticalAlignment.Center,
HorizontalAlignment = HorizontalAlignment.Center,
Icon = new SymbolIcon(Symbol.Camera)
};
var commandBar = new CommandBar();
commandBar.PrimaryCommands.Add(takePhotoButton);
captureElement = new CaptureElement();
captureElement.Stretch = Stretch.UniformToFill;
var stackPanel = new StackPanel();
stackPanel.Children.Add(captureElement);
page = new Page();
page.BottomAppBar = commandBar;
page.Content = stackPanel;
page.Unloaded += OnPageUnloaded;
}
private async void SetupCamera()
{
await SetupUIAsync();
await InitializeCameraAsync();
}
#region Event Handlers
private async void OnSystemMediaControlsPropertyChanged(SystemMediaTransportControls sender, SystemMediaTransportControlsPropertyChangedEventArgs args)
{
await Dispatcher.RunAsync(CoreDispatcherPriority.Normal, async () =>
{
// Only handle event if the page is being displayed
if (args.Property == SystemMediaTransportControlsProperty.SoundLevel && page.Frame.CurrentSourcePageType == typeof(MainPage))
{
// Check if the app is being muted. If so, it's being minimized
// Otherwise if it is not initialized, it's being brought into focus
if (sender.SoundLevel == SoundLevel.Muted)
{
await CleanupCameraAsync();
}
else if (!isInitialized)
{
await InitializeCameraAsync();
}
}
});
}
private void OnOrientationSensorOrientationChanged(SimpleOrientationSensor sender, SimpleOrientationSensorOrientationChangedEventArgs args)
{
// Only update the orientation if the device is not parallel to the ground
if (args.Orientation != SimpleOrientation.Faceup && args.Orientation != SimpleOrientation.Facedown)
{
deviceOrientation = args.Orientation;
}
}
private async void OnDisplayInformationOrientationChanged(DisplayInformation sender, object args)
{
displayOrientation = sender.CurrentOrientation;
if (isPreviewing)
{
await SetPreviewRotationAsync();
}
}
private async void OnTakePhotoButtonClicked(object sender, RoutedEventArgs e)
{
await TakePhotoAsync();
}
/*async void OnHardwareCameraButtonPressed(object sender, CameraEventArgs e)
{
await TakePhotoAsync();
}*/
#endregion Event Handlers
#region Media Capture
private async Task InitializeCameraAsync()
{
if (mediaCapture == null)
{
var devices = await DeviceInformation.FindAllAsync(DeviceClass.VideoCapture);
var cameraDevice = devices.FirstOrDefault(c => c.EnclosureLocation != null && c.EnclosureLocation.Panel == Windows.Devices.Enumeration.Panel.Back);
// Get any camera if there isn't one on the back panel
cameraDevice = cameraDevice ?? devices.FirstOrDefault();
if (cameraDevice == null)
{
Debug.WriteLine("No camera found");
return;
}
mediaCapture = new MediaCapture();
try
{
await mediaCapture.InitializeAsync(new MediaCaptureInitializationSettings
{
VideoDeviceId = cameraDevice.Id,
AudioDeviceId = string.Empty,
StreamingCaptureMode = StreamingCaptureMode.Video,
PhotoCaptureSource = PhotoCaptureSource.Photo
});
isInitialized = true;
}
catch (UnauthorizedAccessException)
{
Debug.WriteLine("Camera access denied");
}
catch (Exception ex)
{
Debug.WriteLine("Exception initializing MediaCapture - {0}: {1}", cameraDevice.Id, ex.ToString());
}
if (isInitialized)
{
if (cameraDevice.EnclosureLocation == null || cameraDevice.EnclosureLocation.Panel == Windows.Devices.Enumeration.Panel.Unknown)
{
externalCamera = true;
}
else
{
// Camera is on device
externalCamera = false;
// Mirror preview if camera is on front panel
mirroringPreview = (cameraDevice.EnclosureLocation.Panel == Windows.Devices.Enumeration.Panel.Front);
}
await StartPreviewAsync();
}
}
}
private async Task StartPreviewAsync()
{
// Prevent the device from sleeping while the preview is running
displayRequest.RequestActive();
// Setup preview source in UI and mirror if required
captureElement.Source = mediaCapture;
captureElement.FlowDirection = mirroringPreview ? FlowDirection.RightToLeft : FlowDirection.LeftToRight;
// Start preview
await mediaCapture.StartPreviewAsync();
isPreviewing = true;
if (isPreviewing)
{
await SetPreviewRotationAsync();
}
}
private async Task StopPreviewAsync()
{
isPreviewing = false;
await mediaCapture.StopPreviewAsync();
// Use dispatcher because sometimes this method is called from non-UI threads
await Dispatcher.RunAsync(CoreDispatcherPriority.Normal, () =>
{
// UI cleanup
captureElement.Source = null;
// Allow device screen to sleep now preview is stopped
displayRequest.RequestRelease();
});
}
private async Task SetPreviewRotationAsync()
{
// Only update the orientation if the camera is mounted on the device
if (externalCamera)
{
return;
}
// Derive the preview rotation
int rotation = ConvertDisplayOrientationToDegrees(displayOrientation);
// Invert if mirroring
if (mirroringPreview)
{
rotation = (360 - rotation) % 360;
}
// Add rotation metadata to preview stream
var props = mediaCapture.VideoDeviceController.GetMediaStreamProperties(MediaStreamType.VideoPreview);
props.Properties.Add(RotationKey, rotation);
await mediaCapture.SetEncodingPropertiesAsync(MediaStreamType.VideoPreview, props, null);
}
private async Task TakePhotoAsync()
{
var stream = new InMemoryRandomAccessStream();
await mediaCapture.CapturePhotoToStreamAsync(ImageEncodingProperties.CreateJpeg(), stream);
try
{
var file = await captureFolder.CreateFileAsync("photo.jpg", CreationCollisionOption.GenerateUniqueName);
var orientation = ConvertOrientationToPhotoOrientation(GetCameraOrientation());
await ReencodeAndSavePhotoAsync(stream, file, orientation);
}
catch (Exception ex)
{
Debug.WriteLine("Exception when taking photo: " + ex.ToString());
}
}
private async Task CleanupCameraAsync()
{
if (isInitialized)
{
if (isPreviewing)
{
await StopPreviewAsync();
}
isInitialized = false;
}
if (mediaCapture != null)
{
mediaCapture.Dispose();
mediaCapture = null;
}
}
#endregion Media Capture
#region Helpers
private async Task SetupUIAsync()
{
// Lock page to landscape to prevent the capture element from rotating
DisplayInformation.AutoRotationPreferences = DisplayOrientations.Landscape;
/*// Hide status bar
if (ApiInformation.IsTypePresent("Windows.UI.ViewManagement.StatusBar"))
{
await Windows.UI.ViewManagement.StatusBar.GetForCurrentView().HideAsync();
}*/
displayOrientation = displayInformation.CurrentOrientation;
if (orientationSensor != null)
{
deviceOrientation = orientationSensor.GetCurrentOrientation();
}
RegisterEventHandlers();
var picturesLibrary = await StorageLibrary.GetLibraryAsync(KnownLibraryId.Pictures);
// Fallback to local app storage if no pictures library
captureFolder = picturesLibrary.SaveFolder ?? ApplicationData.Current.LocalFolder;
}
private async Task CleanupUIAsync()
{
UnregisterEventHandlers();
/*if (ApiInformation.IsTypePresent("Windows.UI.ViewManagement.StatusBar"))
{
await Windows.UI.ViewManagement.StatusBar.GetForCurrentView().ShowAsync();
}*/
// Revert orientation preferences
DisplayInformation.AutoRotationPreferences = DisplayOrientations.None;
}
private void RegisterEventHandlers()
{
/*if (ApiInformation.IsTypePresent("Windows.Phone.UI.Input.HardwareButtons"))
{
HardwareButtons.CameraPressed += OnHardwareCameraButtonPressed;
}*/
if (orientationSensor != null)
{
orientationSensor.OrientationChanged += OnOrientationSensorOrientationChanged;
}
displayInformation.OrientationChanged += OnDisplayInformationOrientationChanged;
systemMediaControls.PropertyChanged += OnSystemMediaControlsPropertyChanged;
takePhotoButton.Click += OnTakePhotoButtonClicked;
}
private void UnregisterEventHandlers()
{
/*if (ApiInformation.IsTypePresent("Windows.Phone.UI.Input.HardwareButtons"))
{
HardwareButtons.CameraPressed -= OnHardwareCameraButtonPressed;
}*/
if (orientationSensor != null)
{
orientationSensor.OrientationChanged -= OnOrientationSensorOrientationChanged;
}
displayInformation.OrientationChanged -= OnDisplayInformationOrientationChanged;
systemMediaControls.PropertyChanged -= OnSystemMediaControlsPropertyChanged;
takePhotoButton.Click -= OnTakePhotoButtonClicked;
}
private static async Task ReencodeAndSavePhotoAsync(IRandomAccessStream stream, StorageFile file, PhotoOrientation orientation)
{
using (var inputStream = stream)
{
var decoder = await BitmapDecoder.CreateAsync(inputStream);
using (var outputStream = await file.OpenAsync(FileAccessMode.ReadWrite))
{
var encoder = await BitmapEncoder.CreateForTranscodingAsync(outputStream, decoder);
var properties = new BitmapPropertySet
{
{
"System.Photo.Orientation", new BitmapTypedValue(orientation, Windows.Foundation.PropertyType.UInt16)
}
};
await encoder.BitmapProperties.SetPropertiesAsync(properties);
await encoder.FlushAsync();
}
}
}
#endregion Helpers
#region Rotation
private SimpleOrientation GetCameraOrientation()
{
if (externalCamera)
{
// Cameras that aren't attached to the device do not rotate along with it
return SimpleOrientation.NotRotated;
}
var result = deviceOrientation;
// On portrait-first devices, the camera sensor is mounted at a 90 degree offset to the native orientation
if (displayInformation.NativeOrientation == DisplayOrientations.Portrait)
{
switch (result)
{
case SimpleOrientation.Rotated90DegreesCounterclockwise:
result = SimpleOrientation.NotRotated;
break;
case SimpleOrientation.Rotated180DegreesCounterclockwise:
result = SimpleOrientation.Rotated90DegreesCounterclockwise;
break;
case SimpleOrientation.Rotated270DegreesCounterclockwise:
result = SimpleOrientation.Rotated180DegreesCounterclockwise;
break;
case SimpleOrientation.NotRotated:
result = SimpleOrientation.Rotated270DegreesCounterclockwise;
break;
}
}
// If the preview is mirrored for a front-facing camera, invert the rotation
if (mirroringPreview)
{
// Rotating 0 and 180 degrees is the same clockwise and counterclockwise
switch (result)
{
case SimpleOrientation.Rotated90DegreesCounterclockwise:
return SimpleOrientation.Rotated270DegreesCounterclockwise;
case SimpleOrientation.Rotated270DegreesCounterclockwise:
return SimpleOrientation.Rotated90DegreesCounterclockwise;
}
}
return result;
}
private static int ConvertDeviceOrientationToDegrees(SimpleOrientation orientation)
{
switch (orientation)
{
case SimpleOrientation.Rotated90DegreesCounterclockwise:
return 90;
case SimpleOrientation.Rotated180DegreesCounterclockwise:
return 180;
case SimpleOrientation.Rotated270DegreesCounterclockwise:
return 270;
case SimpleOrientation.NotRotated:
default:
return 0;
}
}
private static int ConvertDisplayOrientationToDegrees(DisplayOrientations orientation)
{
switch (orientation)
{
case DisplayOrientations.Portrait:
return 90;
case DisplayOrientations.LandscapeFlipped:
return 180;
case DisplayOrientations.PortraitFlipped:
return 270;
case DisplayOrientations.Landscape:
default:
return 0;
}
}
private static PhotoOrientation ConvertOrientationToPhotoOrientation(SimpleOrientation orientation)
{
switch (orientation)
{
case SimpleOrientation.Rotated90DegreesCounterclockwise:
return PhotoOrientation.Rotate90;
case SimpleOrientation.Rotated180DegreesCounterclockwise:
return PhotoOrientation.Rotate180;
case SimpleOrientation.Rotated270DegreesCounterclockwise:
return PhotoOrientation.Rotate270;
case SimpleOrientation.NotRotated:
default:
return PhotoOrientation.Normal;
}
}
#endregion Rotation
#region Lifecycle
private async void OnAppSuspending(object sender, SuspendingEventArgs e)
{
var deferral = e.SuspendingOperation.GetDeferral();
await CleanupCameraAsync();
await CleanupUIAsync();
deferral.Complete();
}
private async void OnAppResuming(object sender, object o)
{
await SetupUIAsync();
await InitializeCameraAsync();
}
private async void OnPageUnloaded(object sender, RoutedEventArgs e)
{
await CleanupCameraAsync();
await CleanupUIAsync();
}
#endregion Lifecycle
}
}
This idea makes sense: you have a basic page whose renderer previews the camera in the background. At least, that is how I understood it. However, it only gives me a white screen and throws an exception (x86):
Exception initializing MediaCapture - \\?\USB#VID_045E&PID_0779&MI_00#6&2E9BBB25&0&0000#{e5323777-f976-4f5b-9b55-b94699c46e44}\GLOBAL: System.Runtime.InteropServices.COMException (0xC00DABE6): The current capture source does not have an independent photo stream.
The current capture source does not have an independent photo stream.
at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess(Task task)
at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)
at System.Runtime.CompilerServices.TaskAwaiter.GetResult()
at CameraPreviewProject.UWP.Sources.PageRenderers.CameraPageRenderer.<InitializeCameraAsync>d__25.MoveNext()
Then I click the button in the bottom command bar (the soft menu) and get:
Exception thrown: 'System.Runtime.InteropServices.COMException' in System.Private.CoreLib.ni.dll
WinRT information: This object needs to be initialized before the requested operation can be carried out.
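Reading the first error, I suspect my webcam simply does not expose an independent photo stream, so forcing PhotoCaptureSource.Photo in InitializeCameraAsync is probably what fails (and the second error follows, since MediaCapture never got initialized). A minimal sketch of an alternative initialization I might try, reusing the cameraDevice and mediaCapture fields from the renderer above (not verified on my machine):

await mediaCapture.InitializeAsync(new MediaCaptureInitializationSettings
{
    VideoDeviceId = cameraDevice.Id,
    StreamingCaptureMode = StreamingCaptureMode.Video,
    // Let MediaCapture pick the photo source instead of requiring a dedicated photo pipe.
    PhotoCaptureSource = PhotoCaptureSource.Auto // or PhotoCaptureSource.VideoPreview
});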
I'm a Xamarin fan, but not for this part. This link about MediaCapture may be interesting, though!
CameraView
To be honest, it would be so much easier to have a control that can be used like a button!
<Camera/>
Well, let’s have a look at it! I found a couple of solutions:
Moment MVVM logic - it seems to work only with Android & iOS.
XLabs Camera - unable to try it, since I can't open the .sln in VS2017; I also couldn't test the UWP side because it relies on MVVM logic.
Xam.Plugin.Media - this solution works on UWP! But it launches a new activity/instance/page with a native design, so it isn't the solution I'm looking for.
So, my question is: "Could someone create an element, public class Camera, that can be used and declared like a simple Xamarin.Forms button?"
I also saw two other projects about this: one I can't remember, and the second is Barcode Scanning, but I'm not able to understand or reuse its code to implement what I want…
It seems so easy, yet it's so hard to get. Why? In the end, we're talking about a view/image that displays the stream from a camera. A camera is just a service exposing methods such as TakePictureAsync(), Rotate(), Switch(ViewSide side), and so on.
So, I searched for how to grab a frame or display the camera stream in an image or a view. I started from these links:
UWP get live webcam video stream by David Pine
UWP stream Webcam over socket to mediaElement - I just made some changes because the subject is a bit different, but I couldn't make it work.
To be honest, I don't know what to try now… I'm lost because I tried some Xamarin.Forms solutions as well as some pure UWP solutions, and… nothing. Maybe my point of view is wrong, maybe my idea is off, maybe I should try another approach; I don't know at all.
I was also thinking about creating a class with an interface that I would implement in each platform renderer, but still nothing…
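To illustrate what I mean, here is a rough sketch of the kind of shared contract I have in mind, which each platform renderer (or a DependencyService implementation) would fulfil. All the names here (ICameraService, ViewSide, UwpCameraService) are placeholders of mine, not an existing API:

using System.Threading.Tasks;

public enum ViewSide { Front, Back }

public interface ICameraService
{
    Task StartPreviewAsync();        // bind the native preview to a Forms view
    Task<byte[]> TakePictureAsync(); // return the captured frame, e.g. as JPEG bytes
    void Switch(ViewSide side);      // toggle front/back camera
    void Rotate(int degrees);        // rotate the preview
}

// Each platform would then register its implementation, e.g. on UWP:
// [assembly: Xamarin.Forms.Dependency(typeof(UwpCameraService))]
// public class UwpCameraService : ICameraService { /* wrap MediaCapture here */ }
// and the shared code would resolve it with DependencyService.Get<ICameraService>().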
Do you have any idea or any approach, please?
Note: I have cross-posted this to the Xamarin forums.
Our application should be able to save application files to Google Drive, using the locally configured account.
I tried to figure out the flow from the Android API, but the Android API with the Xamarin implementation seems very tough for me.
I have installed Google Play Services - Drive from the Xamarin Components store, but there are no examples listed from which we can learn the flow and functionality.
The basic steps (see the link below for full details):
Create GoogleApiClient with the Drive API and Scope
Try to connect (login) the GoogleApiClient
The first time you try to connect it will fail as the user has not selected a Google Account that should be used
Use StartResolutionForResult to handle this condition
When GoogleApiClient is connected
Request a Drive content (DriveContentsResult) to write the file contents to.
When the result is obtained, write data into the Drive content.
Set the metadata for the file
Create the Drive-based file with the Drive content
Note: This example assumes that you have Google Drive installed on your device/emulator and you have registered your app in Google's Developer API Console with the Google Drive API Enabled.
C# Example:
[Activity(Label = "DriveOpen", MainLauncher = true, Icon = "#mipmap/icon")]
public class MainActivity : Activity, GoogleApiClient.IConnectionCallbacks, IResultCallback, IDriveApiDriveContentsResult
{
const string TAG = "GDriveExample";
const int REQUEST_CODE_RESOLUTION = 3;
GoogleApiClient _googleApiClient;
protected override void OnCreate(Bundle savedInstanceState)
{
base.OnCreate(savedInstanceState);
SetContentView(Resource.Layout.Main);
Button button = FindViewById<Button>(Resource.Id.myButton);
button.Click += delegate
{
if (_googleApiClient == null)
{
_googleApiClient = new GoogleApiClient.Builder(this)
.AddApi(DriveClass.API)
.AddScope(DriveClass.ScopeFile)
.AddConnectionCallbacks(this)
.AddOnConnectionFailedListener(onConnectionFailed)
.Build();
}
if (!_googleApiClient.IsConnected)
_googleApiClient.Connect();
};
}
protected void onConnectionFailed(ConnectionResult result)
{
Log.Info(TAG, "GoogleApiClient connection failed: " + result);
if (!result.HasResolution)
{
GoogleApiAvailability.Instance.GetErrorDialog(this, result.ErrorCode, 0).Show();
return;
}
try
{
result.StartResolutionForResult(this, REQUEST_CODE_RESOLUTION);
}
catch (IntentSender.SendIntentException e)
{
Log.Error(TAG, "Exception while starting resolution activity", e);
}
}
public void OnConnected(Bundle connectionHint)
{
Log.Info(TAG, "Client connected.");
DriveClass.DriveApi.NewDriveContents(_googleApiClient).SetResultCallback(this);
}
protected override void OnActivityResult(int requestCode, Result resultCode, Intent data)
{
base.OnActivityResult(requestCode, resultCode, data);
if (requestCode == REQUEST_CODE_RESOLUTION)
{
switch (resultCode)
{
case Result.Ok:
_googleApiClient.Connect();
break;
case Result.Canceled:
Log.Error(TAG, "Unable to sign in, is app registered for Drive access in Google Dev Console?");
break;
case Result.FirstUser:
Log.Error(TAG, "Unable to sign in: RESULT_FIRST_USER");
break;
default:
Log.Error(TAG, "Should never be here: " + resultCode);
return;
}
}
}
void IResultCallback.OnResult(Java.Lang.Object result)
{
var contentResults = (result).JavaCast<IDriveApiDriveContentsResult>();
if (!contentResults.Status.IsSuccess) // handle the error
return;
Task.Run(() =>
{
var writer = new OutputStreamWriter(contentResults.DriveContents.OutputStream);
writer.Write("Stack Overflow");
writer.Close();
MetadataChangeSet changeSet = new MetadataChangeSet.Builder()
.SetTitle("New Text File")
.SetMimeType("text/plain")
.Build();
DriveClass.DriveApi
.GetRootFolder(_googleApiClient)
.CreateFile(_googleApiClient, changeSet, contentResults.DriveContents);
});
}
public void OnConnectionSuspended(int cause)
{
throw new NotImplementedException();
}
public IDriveContents DriveContents
{
get
{
throw new NotImplementedException();
}
}
public Statuses Status
{
get
{
throw new NotImplementedException();
}
}
}
Ref: https://developers.google.com/drive/android/create-file
Is there a way to detect, at the moment a call is attempted, whether the phone can only make emergency calls (using Xamarin)?
This is platform specific. Unfortunately, I don't know of any plugin for this, so you have to use the native APIs in Xamarin.
On Android it is the TelephonyManager, as shown here: https://stackoverflow.com/a/14355706/1489968. It's Java, but it can easily be translated to C#:
public class MainActivity : Activity
{
protected override void OnCreate(Bundle bundle)
{
base.OnCreate(bundle);
SetContentView(Resource.Layout.Main);
var telMng = (TelephonyManager) GetSystemService(TelephonyService);
var myPhoneStateListener = new MyPhoneStateListener();
myPhoneStateListener.ServiceStateChanged += (s, e) => Console.WriteLine("State: {0}", e);
telMng.Listen(myPhoneStateListener, PhoneStateListenerFlags.ServiceState);
}
}
public class MyPhoneStateListener : PhoneStateListener
{
public event EventHandler<ServiceState> ServiceStateChanged;
public override void OnServiceStateChanged(ServiceState serviceState)
{
base.OnServiceStateChanged(serviceState);
ServiceStateChanged?.Invoke(this, serviceState);
}
}
On iOS: I'm not sure that information is available; I have never seen it exposed via the SDK. (Maybe add the iOS tag to this question, or ask an iOS-only question; the answer might be in Objective-C/Swift, but you can translate it.)
On Android: The info you are looking for is contained within the ServiceState of the phone:
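// Note: in practice this should be the ServiceState instance delivered to
// OnServiceStateChanged above; a ServiceState created with new does not
// reflect the actual radio state.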
var callState = new ServiceState ();
switch (callState.State) {
case PhoneState.InService:
{
var uri = Android.Net.Uri.Parse ("tel:555-2368"); // Jim Rockford's number ;-)
var intent = new Intent (Intent.ActionDial, uri);
StartActivity (intent);
break;
}
case PhoneState.EmergencyOnly:
{
Toast.MakeText (this, "Emergency Calls Only", ToastLength.Long).Show();
break;
}
case PhoneState.OutOfService:
{
Toast.MakeText (this, "Out of Service", ToastLength.Long).Show();
break;
}
case PhoneState.PowerOff:
{
Toast.MakeText (this, "Cell/Modem Power Off", ToastLength.Long).Show();
break;
}
default:
{
Toast.MakeText (this, "Should never be shown on a real device", ToastLength.Long).Show();
break;
}
}
Ref: http://developer.android.com/reference/android/telephony/ServiceState.html
For testing on the emulator, you can set the voice/data state to denied via the emulator console (telnet to the emulator's console port, or adb emu):
gsm voice denied
gsm data denied
Ref: https://developer.android.com/tools/devices/emulator.html
I'm building an API-based application. I use the Pebble.sendAppMessage function to send data from the API to the watch. Each time I try to send data containing accented characters and log it, the console prints the following empty message:
[xx:xx:xx] javascript>
and after that, no interaction between the phone and the watch works anymore.
I first tried to encode each piece of API data as UTF-8, but the crash still occurs.
Can we send strings containing accents using Pebble.sendAppMessage? What exactly did I miss?
C code:
#include <pebble.h>
static Window *window;
static TextLayer *text_layer;
static char message_buffer[128];
static void message_provider(DictionaryIterator* iterator, void* context) {
Tuple* tuple = dict_find(iterator, 0);
if (tuple) {
// Copy the value: the dictionary memory is only valid inside this callback,
// and text_layer_set_text() does not copy the string it is given.
snprintf(message_buffer, sizeof(message_buffer), "%s", tuple->value->cstring);
text_layer_set_text(text_layer, message_buffer);
}
}
static void window_load(Window *window) {
Layer *window_layer = window_get_root_layer(window);
GRect bounds = layer_get_bounds(window_layer);
text_layer = text_layer_create((GRect) { .origin = { 0, 72 }, .size = { bounds.size.w, 20 } });
text_layer_set_text(text_layer, "Loading...");
text_layer_set_text_alignment(text_layer, GTextAlignmentCenter);
layer_add_child(window_layer, text_layer_get_layer(text_layer));
}
static void window_unload(Window *window) {
text_layer_destroy(text_layer);
}
static void init(void) {
window = window_create();
window_set_window_handlers(window, (WindowHandlers) {
.load = window_load,
.unload = window_unload,
});
const bool animated = true;
window_stack_push(window, animated);
app_message_register_inbox_received(message_provider);
app_message_open(app_message_inbox_size_maximum(), app_message_outbox_size_maximum());
}
static void deinit(void) {
window_destroy(window);
}
int main(void) {
init();
app_event_loop();
deinit();
}
JS code:
Pebble.addEventListener("ready", function() {
Pebble.sendAppMessage({"dummy" : "Gérard example is always the best."}, function() {
console.log("message sent successfully !");
}, function() {
console.log("Cannot send message with accent.");
});
});
I'm trying to get a list of the applications installed on the Wear device using
packageManager.getInstalledPackages(PackageManager.GET_PERMISSIONS);
But many apps are missing from that list (all Motorola apps, SetAlarm, SetTimer, ShowAlarms).
Does anybody know what I should do to get all of them?
That's because these apps are intended to be launched only with voice.
Here is how I get them in Wear Mini Launcher:
PackageManager manager = getPackageManager(); // assuming this runs inside an Activity/Context
try {
ApplicationInfo app = manager.getApplicationInfo("com.google.android.deskclock", 0);
String name = manager.getApplicationLabel(app).toString();
Intent intentAI = new Intent();
intentAI.setPackage(app.packageName);
List<ResolveInfo> listRI = manager.queryIntentActivities(intentAI, 0);
// Launchable app
if (listRI.size() > 0) {
for (ResolveInfo resolveInfo : listRI) {
if (name != null) {
//Do your stuff here
}
}
}
} catch (PackageManager.NameNotFoundException e) {
e.printStackTrace();
}