The Android device I am using does not have a touchscreen. All it has is a touch pad that can generate KEYCODE_DPAD_UP, KEYCODE_DPAD_DOWN, etc. It appears that, by default, a Xamarin-based app ignores these key codes. I am wondering if there is some way to map them to navigation among the displayed controls. Regards.
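UPDATE: to illustrate the kind of mapping I have in mind, here is a rough, untested sketch in the Xamarin.Android MainActivity that intercepts the D-pad key codes and moves focus manually (whether this is enough presumably depends on the controls being focusable):

using Android.Views;

// Inside MainActivity: catch the D-pad keys and ask Android's focus system
// to move to the next focusable control in that direction.
public override bool DispatchKeyEvent(KeyEvent e)
{
    if (e.Action == KeyEventActions.Down && CurrentFocus != null)
    {
        switch (e.KeyCode)
        {
            case Keycode.DpadDown:
                CurrentFocus.FocusSearch(FocusSearchDirection.Down)?.RequestFocus();
                return true;
            case Keycode.DpadUp:
                CurrentFocus.FocusSearch(FocusSearchDirection.Up)?.RequestFocus();
                return true;
        }
    }
    return base.DispatchKeyEvent(e);
}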
Related
Has anyone gotten the Android keyboard to support GIF and image insertion in a Xamarin app? Let's say I want to build a chat application, have a custom EditText view to capture user input, and want it to behave similarly to Android's built-in chat application. Currently the keyboard shows a popup. Is it possible to make this work? I would prefer not to create a custom keyboard renderer, and was hoping this could be done within a custom EditText renderer.
UPDATE: For anyone looking for a similar solution, I have been making progress using the CommitContentSample at https://github.com/xamarin/monodroid-samples/tree/main/android-n/CommitContentSample. Essentially, set the content MIME types and wrap an IOnCommitContentListener, then process OnCommitContent().
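Roughly, simplified from that sample (the class and listener names are my own, error handling is omitted, and the Support-library namespace may differ if you are on AndroidX):

using Android.Content;
using Android.OS;
using Android.Support.V13.View.Inputmethod;
using Android.Util;
using Android.Views.InputMethods;
using Android.Widget;

public class ImageEditText : EditText
{
    public ImageEditText(Context context, IAttributeSet attrs) : base(context, attrs) { }

    public override IInputConnection OnCreateInputConnection(EditorInfo outAttrs)
    {
        var ic = base.OnCreateInputConnection(outAttrs);

        // Advertise which MIME types this editor accepts from the keyboard.
        EditorInfoCompat.SetContentMimeTypes(outAttrs, new[] { "image/gif", "image/png" });

        // Wrap the connection so the listener fires when the keyboard commits content.
        return InputConnectionCompat.CreateWrapper(ic, outAttrs, new CommitContentListener());
    }

    class CommitContentListener : Java.Lang.Object, InputConnectionCompat.IOnCommitContentListener
    {
        public bool OnCommitContent(InputContentInfoCompat info, int flags, Bundle opts)
        {
            // On API 25+ the app must request permission before reading the content URI.
            if ((flags & InputConnectionCompat.InputContentGrantReadUriPermission) != 0)
                info.RequestPermission();

            // info.ContentUri now points at the committed GIF/image; hand it to the chat UI.
            return true;
        }
    }
}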
I have Xamarin buttons and CollectionView items that don't show text strings, and I'm working to let Android Voice Access interact with those elements. Voice Access works fine with controls that do show text strings, but I've not found a way to interact with the controls that don't. I've set appropriate AutomationProperties.Name values on the controls, but Voice Access doesn't react when I speak those names.
What can I set on the controls to have Voice Access work with them when they don't show text?
A snippet of code might help generate more answers, but without seeing the code: you might need android:importantForAccessibility="yes" or android:focusable="true", and possibly android:clickable="true".
See also What is the difference between Android focusable and importantForAccessibility when using TalkBack?
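For example, you could apply those flags from shared Xamarin.Forms code with a platform effect. This is an untested sketch; the effect and group names are made up, and it assumes Voice Access matches the spoken label against the native ContentDescription:

using Android.Views;
using Xamarin.Forms;
using Xamarin.Forms.Platform.Android;

[assembly: ResolutionGroupName("MyApp")]
[assembly: ExportEffect(typeof(MyApp.Droid.VoiceAccessEffect), "VoiceAccessEffect")]
namespace MyApp.Droid
{
    public class VoiceAccessEffect : PlatformEffect
    {
        protected override void OnAttached()
        {
            Android.Views.View view = Control ?? Container;

            // The native equivalents of the XML attributes mentioned above.
            view.ImportantForAccessibility = ImportantForAccessibility.Yes;
            view.Focusable = true;
            view.Clickable = true;

            // Give Voice Access a label to match against the spoken command.
            view.ContentDescription = AutomationProperties.GetName(Element);
        }

        protected override void OnDetached() { }
    }
}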
I have an Entry which is placed in a ContentView, and this ContentView is placed in a Grid. When the Entry is focused, the keyboard is placed over the ContentView, preventing the user from seeing the Entry.
I would like to know if there is a way to determine whether a View is visible and, if not, make sure it becomes visible (i.e., prevent the keyboard from covering it).
Any thoughts on how I could do this?
I need this to work on iOS specifically; Android and Windows don't seem to have this issue in my use case.
On the Android platform, the official documentation covers the soft keyboard input mode; see https://learn.microsoft.com/en-us/xamarin/xamarin-forms/platform/android/soft-keyboard-input-mode for details.
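From that doc, something like this in shared code tells Android to resize the page when the keyboard appears instead of overlaying it:

using Xamarin.Forms;
using Xamarin.Forms.PlatformConfiguration;
using Xamarin.Forms.PlatformConfiguration.AndroidSpecific;

// e.g. in the App constructor: resize the page when the keyboard appears,
// so the focused Entry stays visible instead of being covered.
App.Current.On<Android>().UseWindowSoftInputModeAdjust(WindowSoftInputModeAdjust.Resize);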
On the iOS platform, you can install the KeyboardOverlap package from NuGet and then add KeyboardOverlapRenderer.Init(); in the AppDelegate to achieve the same effect.
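The package's Init call goes in FinishedLaunching, after Forms.Init(); a sketch:

using Foundation;
using UIKit;

public partial class AppDelegate : global::Xamarin.Forms.Platform.iOS.FormsApplicationDelegate
{
    public override bool FinishedLaunching(UIApplication app, NSDictionary options)
    {
        global::Xamarin.Forms.Forms.Init();

        // From the KeyboardOverlap package: slides the page up when the
        // keyboard would otherwise cover the focused Entry.
        KeyboardOverlapRenderer.Init();

        LoadApplication(new App());
        return base.FinishedLaunching(app, options);
    }
}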
I have tried my UWP app, which is built using Xamarin.Forms, on Xbox One, and it generally works fine, but by default mouse mode is active. I changed it using
RequiresPointerMode = Windows.UI.Xaml.ApplicationRequiresPointerMode.WhenRequested;
at the UWP level in App.xaml.cs, and it disabled the mouse pointer. The problem is that, using the game pad, I am not able to select items in the UI. I can navigate through textboxes and buttons, but not the toolbar (CommandBar in UWP), ListView, MasterDetail, Tabs, etc.
I created a blank native UWP application and added a CommandBar with AppBarButtons and a NavigationView with NavigationViewItems. It works perfectly; I am able to navigate between the menu items and CommandBar items using the game pad.
Why is this not working for Xamarin.Forms? Is Xamarin.Forms not actually native on UWP?
Xamarin.Forms MasterDetailPage was written before NavigationView existed, and it doesn't use it at all (and especially not NavigationViewItems; that would limit the flexibility, so I don't think it will ever be used).
As SplitView has a focus bug that I can confirm, it doesn't come as a surprise that it doesn't work on Xbox as expected. However, UWP doesn't guarantee that an app will work properly when you disable mouse mode, even with native controls; that's why it is enabled by default. There are properties like XYFocusLeft that must be set if the app doesn't behave properly. You probably need to write a custom renderer to expose those properties and set them correctly. That's quite a lot of work, but it's up to you to decide...
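A skeleton of such a renderer (the names are illustrative, and the XYFocus targets are placeholders you would have to fill in with real native elements):

using Xamarin.Forms.Platform.UWP;

[assembly: ExportRenderer(typeof(Xamarin.Forms.Button), typeof(MyApp.UWP.GamepadButtonRenderer))]
namespace MyApp.UWP
{
    public class GamepadButtonRenderer : ButtonRenderer
    {
        protected override void OnElementChanged(ElementChangedEventArgs<Xamarin.Forms.Button> e)
        {
            base.OnElementChanged(e);
            if (Control != null)
            {
                // Make sure the native control can take focus at all.
                Control.IsTabStop = true;

                // Point focus at explicit neighbors where the automatic XY
                // algorithm fails; leftNeighbor/rightNeighbor are placeholders
                // for the real native elements focus should jump to.
                // Control.XYFocusLeft = leftNeighbor;
                // Control.XYFocusRight = rightNeighbor;
            }
        }
    }
}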
I have a Windows Phone app with several buttons on a page. I want to be able to press two (or more) of them at a time. Unfortunately, when I press two buttons (with two fingers), only one of them gets the Click or Tap events and the other does not. How can I know that two buttons are pressed?
According to the post below, you could achieve this by using the Touch.FrameReported event.
Pressing multiple buttons simultaneously
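A minimal sketch of that approach in Silverlight for Windows Phone (the handler name is mine):

using System.Windows.Input;

// Subscribe once, e.g. in the page constructor:
// Touch.FrameReported += OnFrameReported;

// The event fires once per touch frame with every active touch point, so two
// fingers on two buttons show up together instead of collapsing into one event.
private void OnFrameReported(object sender, TouchFrameEventArgs e)
{
    foreach (TouchPoint point in e.GetTouchPoints(null)) // null = coordinates relative to the whole page
    {
        if (point.Action == TouchAction.Down || point.Action == TouchAction.Move)
        {
            // Hit-test point.Position against each button's bounds to work out
            // which buttons are currently held down.
        }
    }
}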
What underlying framework are you using? Is it C++ or C#? XNA or WPF/XAML? WP7 or WP8?
As Kulasangar pointed out, Touch.FrameReported should be the way to do it in C#.
For XNA you will have to enable multi-touch and then use the TouchCollection class. That will give you access to multiple touches.
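For example, in XNA you would poll the touch panel inside Update(); a rough sketch:

using Microsoft.Xna.Framework;
using Microsoft.Xna.Framework.Input.Touch;

protected override void Update(GameTime gameTime)
{
    // Read every active touch this frame; each TouchLocation is one finger.
    TouchCollection touches = TouchPanel.GetState();
    foreach (TouchLocation touch in touches)
    {
        if (touch.State == TouchLocationState.Pressed || touch.State == TouchLocationState.Moved)
        {
            // touch.Position (a Vector2 in screen coordinates) can be tested
            // against each button's rectangle to detect simultaneous presses.
        }
    }
    base.Update(gameTime);
}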
In DirectX you might have to use managed C++ and listen to one of the system events.
Things might change based on the platform (WP7 or WP8).