How can I get VisualElement screen coordinates on iOS in Xamarin?

I need to know a VisualElement's coordinates in device screen coordinate space on iOS in a Xamarin project.
Android has its own method, GetLocationOnScreen, but iOS doesn't.
I found this solution:
public static Point GetScreenCoords(this VisualElement view)
{
var result = new Point(view.X, view.Y);
while (view.Parent is VisualElement parent)
{
result = result.Offset(parent.X, parent.Y);
view = parent;
}
return result;
}
but the X and Y properties of VisualElement are relative to the parent's bounds and don't provide the required values.

You can use DependencyService to implement this. I wrote a simple example that gets the coordinates of a VisualElement. In my sample, I get the bounds of the element relative to a parent element (the child element is inside the parent element). If no parent element is specified, the method returns the bounds of the element relative to the window. You can refer to this:
Create a dependent interface:
public interface IMyLocation
{
RectangleF GetCoordinates(VisualElement element, VisualElement parentElement);
}
iOS implementation:
[assembly: Xamarin.Forms.Dependency(typeof(MyLocation))]
namespace FormsDemoWang.iOS
{
public class MyLocation : IMyLocation
{
public RectangleF GetCoordinates(VisualElement element, VisualElement parentElement)
{
// Resolve the native UIView behind the Forms element.
IVisualElementRenderer renderer = Platform.GetRenderer(element);
UIView elementNativeView = renderer.NativeView;
// Only resolve the parent's native view when a parent element is passed in;
// a null target view makes ConvertRectToView return window coordinates.
UIView parentNativeView = null;
if (parentElement != null) {
parentNativeView = Platform.GetRenderer(parentElement).NativeView;
}
// Convert the element's own bounds into the target coordinate space.
CGRect rect = elementNativeView.ConvertRectToView(elementNativeView.Bounds, parentNativeView);
float x = (float)Math.Round(rect.X);
float y = (float)Math.Round(rect.Y);
float width = (float)Math.Round(rect.Width);
float height = (float)Math.Round(rect.Height);
return new RectangleF(x, y, width, height);
}
}
}
Used in Forms:
var coordinates = DependencyService.Get<IMyLocation>().GetCoordinates(myElement, parentElement);
var xLabel = new Label { Text = $"X:{coordinates.X}" };
var yLabel = new Label { Text = $"Y:{coordinates.Y}" };
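For completeness, here is a hedged sketch of what the matching Android implementation could look like, using the GetLocationOnScreen method mentioned in the question (the namespace is an assumption mirroring the iOS class, and the values come back in pixels rather than device-independent units):
[assembly: Xamarin.Forms.Dependency(typeof(MyLocation))]
namespace FormsDemoWang.Droid
{
    public class MyLocation : IMyLocation
    {
        public RectangleF GetCoordinates(VisualElement element, VisualElement parentElement)
        {
            // Resolve the native Android view behind the Forms element.
            var nativeView = Platform.GetRenderer(element).View;

            // GetLocationOnScreen fills an int[2] with absolute screen coordinates in pixels.
            var location = new int[2];
            nativeView.GetLocationOnScreen(location);
            float x = location[0];
            float y = location[1];

            // If a parent element was passed in, make the result relative to it.
            if (parentElement != null)
            {
                var parentLocation = new int[2];
                Platform.GetRenderer(parentElement).View.GetLocationOnScreen(parentLocation);
                x -= parentLocation[0];
                y -= parentLocation[1];
            }

            return new RectangleF(x, y, nativeView.Width, nativeView.Height);
        }
    }
}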

Related

Can't Trigger MouseEntered on NSButton in Xamarin.Mac

I'm trying to programmatically add a MouseEntered event to a custom NSButton class, and I can't seem to get it to fire. I'm writing a Mac OS application in Visual Studio for Mac, using Xamarin.Mac. I need to add the event in code because I'm creating the buttons dynamically.
Here's my ViewController where these buttons are being created. They're instantiated in the DrawHexMap method near the bottom.
public partial class MapController : NSViewController
{
public MapController (IntPtr handle) : base (handle)
{
}
public override void ViewDidLoad()
{
base.ViewDidLoad();
DrawHexMap();
}
partial void GoToHex(Foundation.NSObject sender)
{
string[] coordinateStrings = ((NSButton)sender).Title.Split(',');
Hex chosenHex = HexRepo.GetHex(coordinateStrings[0], coordinateStrings[1]);
HexService.currentHex = chosenHex;
NSTabViewController tp = (NSTabViewController)ParentViewController;
tp.TabView.SelectAt(1);
}
private void DrawHexMap()
{
double height = 60;
for (int x = 0; x < 17; x++) {
for (int y = 0; y < 13; y++) {
HexButton button = new HexButton(x, y, height);
var handle = ObjCRuntime.Selector.GetHandle("GoToHex:");
button.Action = ObjCRuntime.Selector.FromHandle(handle);
button.Target = this;
View.AddSubview(button);
}
}
}
}
And here's the custom button class.
public class HexButton : NSButton
{
public NSTrackingArea _trackingArea;
public HexButton(int x, int y, double height)
{
double width = height/Math.Sqrt(3);
double doubleWidth = width * 2;
double halfHeight = height/2;
double columnNudge = decimal.Remainder(x, 2) == 0 ? 0 : halfHeight;
Title = x + "," + y;
//Bordered = false;
//ShowsBorderOnlyWhileMouseInside = true;
SetFrameSize(new CGSize(width, height));
SetFrameOrigin(new CGPoint(width + (x * doubleWidth), (y * height) + columnNudge));
_trackingArea = new NSTrackingArea(Frame, NSTrackingAreaOptions.ActiveInKeyWindow | NSTrackingAreaOptions.MouseEnteredAndExited, this, null);
AddTrackingArea(_trackingArea);
}
public override void MouseEntered(NSEvent theEvent)
{
base.MouseEntered(theEvent);
Console.WriteLine("mouse enter");
}
}
So as you can see, I'm creating a tracking area for the button and adding it in the constructor. Yet I can't seem to get a MouseEntered to fire. I know the MouseEntered override in this class works, because when I call button.MouseEntered() directly from my code, the method fires.
A few other things I've tried: commenting out the lines that set the Action and Target in the ViewController, in case those were overriding the MouseEntered handler somehow; setting those values inside the HexButton constructor so that the Target was the button instead of the ViewController; putting the MouseEntered override in the ViewController instead of the button class; and creating the tracking area after the button was added as a subview of the ViewController's view. None of these made a difference.
Any help would be much appreciated! It's quite difficult to find documentation for Xamarin.Mac...
Thanks!
You are adding the Frame as the tracked region, but you are attaching the tracking area to the same view that the region is defined from, so you need to use the Bounds coordinates:
_trackingArea = new NSTrackingArea(Bounds, NSTrackingAreaOptions.ActiveInKeyWindow | NSTrackingAreaOptions.MouseEnteredAndExited, this, null);
Note: if you were tracking from the view that you add the buttons to, then you would track the Frame, since the tracking region is relative to the view that owns the tracking area, not to its children.
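For illustration, a hedged sketch of that alternative: attaching the tracking area to the parent view inside DrawHexMap, in which case the button's Frame is the right rectangle because it is expressed in the parent's coordinate space (the placement and the choice of owner are assumptions):
// Hypothetical variant inside MapController.DrawHexMap(), after View.AddSubview(button):
var parentTracking = new NSTrackingArea(
    button.Frame, // region expressed in the parent view's coordinates
    NSTrackingAreaOptions.ActiveInKeyWindow | NSTrackingAreaOptions.MouseEnteredAndExited,
    this,         // the owner is what receives MouseEntered/MouseExited
    null);
View.AddTrackingArea(parentTracking);
With this placement the MouseEntered override would live on the controller (the owner) rather than on HexButton.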

How to get device DPI in a PCL in Xamarin.Forms?

I need to get the device DPI in my Xamarin PCL class. I do not want to use Xamarin.Essentials. Can I do this using native interfaces, and if it's possible, how can I do it?
In your PCL, create a new interface called IDisplayInfo:
public interface IDisplayInfo
{
int GetDisplayWidth();
int GetDisplayHeight();
int GetDisplayDpi();
}
In your Android implementation, add a new class:
[assembly: Dependency(typeof(DisplayInfo))]
namespace YourAppNamespace.Droid
{
public class DisplayInfo : IDisplayInfo
{
public int GetDisplayWidth()
{
return (int)Android.App.Application.Context.Resources.DisplayMetrics.WidthPixels;
}
public int GetDisplayHeight()
{
return (int)Android.App.Application.Context.Resources.DisplayMetrics.HeightPixels;
}
public int GetDisplayDpi()
{
return (int)Android.App.Application.Context.Resources.DisplayMetrics.DensityDpi;
}
}
}
And in the iOS implementation, add the same class:
[assembly: Dependency(typeof(DisplayInfo))]
namespace YourNamespace.iOS
{
public class DisplayInfo : IDisplayInfo
{
public int GetDisplayWidth()
{
return (int)UIScreen.MainScreen.Bounds.Width;
}
public int GetDisplayHeight()
{
return (int)UIScreen.MainScreen.Bounds.Height;
}
public int GetDisplayDpi()
{
// Note: UIScreen.MainScreen.Scale is the screen scale factor (1, 2, or 3), not a true DPI value.
return (int)UIScreen.MainScreen.Scale;
}
}
}
Now in your shared code, you can call
int dpi = DependencyService.Get<IDisplayInfo>().GetDisplayDpi();
and you should be good to go. Note that I also added methods for getting the screen width and height, because I already had them in my code and you are probably going to need them sooner or later anyway.
Device display information is currently available via the official Xamarin.Essentials NuGet package; see:
https://learn.microsoft.com/en-us/xamarin/essentials/device-display
// Get Metrics
var mainDisplayInfo = DeviceDisplay.MainDisplayInfo;
// Orientation (Landscape, Portrait, Square, Unknown)
var orientation = mainDisplayInfo.Orientation;
// Rotation (0, 90, 180, 270)
var rotation = mainDisplayInfo.Rotation;
// Width (in pixels)
var width = mainDisplayInfo.Width;
// Height (in pixels)
var height = mainDisplayInfo.Height;
// Screen density
var density = mainDisplayInfo.Density;
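As a small hedged usage note: Width and Height are reported in pixels and Density is the scaling factor, so dividing by the density gives device-independent units, for example:
// Convert the pixel dimensions to device-independent units,
// which is what Xamarin.Forms layout sizes are expressed in.
var widthInDiu = mainDisplayInfo.Width / mainDisplayInfo.Density;
var heightInDiu = mainDisplayInfo.Height / mainDisplayInfo.Density;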
I think it will help you.
I have a static class Core that stores some shared values, defined in the shared code.
On app start it receives values for later use everywhere:
Android MainActivity OnCreate:
Core.IsAndroid = true;
Core.DisplayDensity = Resources.DisplayMetrics.Density;
iOS AppDelegate FinishedLaunching:
Core.IsIOS = true;
Core.DisplayDensity = (float)(UIScreen.MainScreen.NativeBounds.Width / UIScreen.MainScreen.Bounds.Width);
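The Core class itself isn't shown here; a minimal sketch of what it might look like (an assumption, since only the assignments above are given):
// Hypothetical shared-code holder for platform values captured at startup.
public static class Core
{
    public static bool IsAndroid { get; set; }
    public static bool IsIOS { get; set; }
    public static float DisplayDensity { get; set; }
}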

Adding a bottom border to an Entry in Xamarin Forms iOS with an image at the end

Now, before anyone dismisses this as a duplicate, please read till the end. What I want to achieve is this:
I've been doing some googling and looking at Objective-C and Swift responses on Stack Overflow as well, and this response StackOverFlowPost seemed to point me in the right direction. The author even told me to use ClipsToBounds to clip the subview and ensure it's within the parent's bounds. Now here's my problem: if I want to show an image on the right side of the entry (the Gender field), I can't, because I'm clipping the subview.
For clipping, I'm setting the property IsClippedToBounds="True" on the parent StackLayout for all the text boxes.
This is the code I'm using to add the bottom border
Control.BorderStyle = UITextBorderStyle.None;
var myBox = new UIView(new CGRect(0, 40, 1000, 1))
{
BackgroundColor = view.BorderColor.ToUIColor(),
};
Control.AddSubview(myBox);
This is the code I'm using to add an image at the beginning or end of an entry
private void SetImage(ExtendedEntry view)
{
if (!string.IsNullOrEmpty(view.ImageWithin))
{
UIImageView icon = new UIImageView
{
Image = UIImage.FromFile(view.ImageWithin),
Frame = new CGRect(0, -12, view.ImageWidth, view.ImageHeight),
ClipsToBounds = true
};
switch (view.ImagePos)
{
case ImagePosition.Left:
Control.LeftView.AddSubview(icon);
Control.LeftViewMode = UITextFieldViewMode.Always;
break;
case ImagePosition.Right:
Control.RightView.AddSubview(icon);
Control.RightViewMode = UITextFieldViewMode.Always;
break;
}
}
}
After analysing and debugging, I figured out that when the OnElementChanged function of the custom renderer is called, the control is still not drawn, so it doesn't have a size. So I subclassed UITextField like this:
public class ExtendedUITextField : UITextField
{
public UIColor BorderColor;
public bool HasBottomBorder;
public override void Draw(CGRect rect)
{
base.Draw(rect);
if (HasBottomBorder)
{
BorderStyle = UITextBorderStyle.None;
var myBox = new UIView(new CGRect(0, 40, Frame.Size.Width, 1))
{
BackgroundColor = BorderColor
};
AddSubview(myBox);
}
}
public void InitInhertedProperties(UITextField baseClassInstance)
{
TextColor = baseClassInstance.TextColor;
}
}
And I passed the HasBottomBorder and BorderColor values like this:
protected override void OnElementChanged(ElementChangedEventArgs<Entry> e)
{
base.OnElementChanged(e);
var view = e.NewElement as ExtendedEntry;
if (view != null && Control != null)
{
if (view.HasBottomBorder)
{
var native = new ExtendedUITextField
{
BorderColor = view.BorderColor.ToUIColor(),
HasBottomBorder = view.HasBottomBorder
};
native.InitInhertedProperties(Control);
SetNativeControl(native);
}
}
}
But after doing this, now no events fire :(
Can someone please point me in the right direction? I've already built this for Android, but iOS seems to be giving me a problem.
I figured out that when the OnElementChanged function of the custom renderer is called, the control is still not drawn, so it doesn't have a size.
In older versions of Xamarin.Forms and iOS 9, obtaining the control's size within OnElementChanged worked....
You do not need the ExtendedUITextField; to obtain the size of the control, override Frame in your original renderer:
public override CGRect Frame
{
get
{
return base.Frame;
}
set
{
if (value.Width > 0 && value.Height > 0)
{
// Use the frame size now to update any of your subview/layer sizes, etc...
}
base.Frame = value;
}
}
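For illustration, here is a hedged sketch of how that override could be combined with the original bottom-border code in a single renderer, without replacing the native control (the ExtendedEntryRenderer class name, the _bottomBorder field, and the 1-point border thickness are assumptions):
public class ExtendedEntryRenderer : EntryRenderer
{
    UIView _bottomBorder; // hypothetical field holding the border view

    protected override void OnElementChanged(ElementChangedEventArgs<Entry> e)
    {
        base.OnElementChanged(e);
        var view = e.NewElement as ExtendedEntry;
        if (view != null && Control != null && view.HasBottomBorder)
        {
            Control.BorderStyle = UITextBorderStyle.None;
            _bottomBorder = new UIView { BackgroundColor = view.BorderColor.ToUIColor() };
            Control.AddSubview(_bottomBorder);
        }
    }

    public override CGRect Frame
    {
        get { return base.Frame; }
        set
        {
            base.Frame = value;
            // Once a real size is known, stretch the border across the bottom edge.
            if (_bottomBorder != null && value.Width > 0 && value.Height > 0)
            {
                _bottomBorder.Frame = new CGRect(0, value.Height - 1, value.Width, 1);
            }
        }
    }
}
Because the existing Control is kept, the Entry's events keep firing, which sidesteps the problem the question hit after calling SetNativeControl.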

Scale UI for multiple resolutions/different devices

I have a quite simple Unity GUI that has the following scheme:
Where Brekt and so on are buttons.
The GUI works just fine on PC and the canvas is set to Screen Space - Overlay, so it is supposed to adapt automatically to fit every screen.
But on a tablet the whole GUI is smaller and squeezed into the center of the screen, with huge margins around the elements (I can't attach a screenshot right now).
What is the way to fix that? Is it something in player settings or in project settings?
Automatically scaling the UI requires a combination of anchors, the pivot points of the RectTransforms, and the Canvas Scaler component. It is hard to understand without images or videos, so it is very important that you thoroughly understand how to do this; Unity provides a full video tutorial for this, which you can watch here.
Also, when using scrollbars, scroll views and other similar UI controls, the Content Size Fitter component is used to make sure they fit in the layout.
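If you prefer to set this up from code rather than in the Inspector, here is a minimal hedged sketch of configuring the Canvas Scaler so the UI scales with screen size (the UiScaleSetup name and the reference resolution are assumed values):
using UnityEngine;
using UnityEngine.UI;

// Attach to the Canvas: switches the Canvas Scaler to Scale With Screen Size
// so the UI keeps its relative size on tablets instead of shrinking to the center.
public class UiScaleSetup : MonoBehaviour
{
    void Awake()
    {
        var scaler = GetComponent<CanvasScaler>();
        scaler.uiScaleMode = CanvasScaler.ScaleMode.ScaleWithScreenSize;
        scaler.referenceResolution = new Vector2(1920, 1080); // assumed design resolution
        scaler.matchWidthOrHeight = 0.5f; // blend between matching width and height
    }
}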
There is a problem with MovementRange: we must scale this value too.
I did it like this:
public int MovementRange = 100;
public AxisOption axesToUse = AxisOption.Both; // The options for the axes that the stick will use
public string horizontalAxisName = "Horizontal"; // The name given to the horizontal axis for the cross platform input
public string verticalAxisName = "Vertical"; // The name given to the vertical axis for the cross platform input
private int _MovementRange = 100;
Vector3 m_StartPos;
bool m_UseX; // Toggle for using the x axis
bool m_UseY; // Toggle for using the Y axis
CrossPlatformInputManager.VirtualAxis m_HorizontalVirtualAxis; // Reference to the joystick in the cross platform input
CrossPlatformInputManager.VirtualAxis m_VerticalVirtualAxis; // Reference to the joystick in the cross platform input
void OnEnable()
{
CreateVirtualAxes();
}
void Start()
{
m_StartPos = transform.position;
Canvas c = GetComponentInParent<Canvas>();
_MovementRange = (int)(MovementRange * c.scaleFactor);
Debug.Log("Range:"+ _MovementRange);
}
void UpdateVirtualAxes(Vector3 value)
{
var delta = m_StartPos - value;
delta.y = -delta.y;
delta /= _MovementRange;
if (m_UseX)
{
m_HorizontalVirtualAxis.Update(-delta.x);
}
if (m_UseY)
{
m_VerticalVirtualAxis.Update(delta.y);
}
}
void CreateVirtualAxes()
{
// set axes to use
m_UseX = (axesToUse == AxisOption.Both || axesToUse == AxisOption.OnlyHorizontal);
m_UseY = (axesToUse == AxisOption.Both || axesToUse == AxisOption.OnlyVertical);
// create new axes based on axes to use
if (m_UseX)
{
m_HorizontalVirtualAxis = new CrossPlatformInputManager.VirtualAxis(horizontalAxisName);
CrossPlatformInputManager.RegisterVirtualAxis(m_HorizontalVirtualAxis);
}
if (m_UseY)
{
m_VerticalVirtualAxis = new CrossPlatformInputManager.VirtualAxis(verticalAxisName);
CrossPlatformInputManager.RegisterVirtualAxis(m_VerticalVirtualAxis);
}
}
public void OnDrag(PointerEventData data)
{
Vector3 newPos = Vector3.zero;
if (m_UseX)
{
int delta = (int)(data.position.x - m_StartPos.x);
delta = Mathf.Clamp(delta, -_MovementRange, _MovementRange);
newPos.x = delta;
}
if (m_UseY)
{
int delta = (int)(data.position.y - m_StartPos.y);
delta = Mathf.Clamp(delta, -_MovementRange, _MovementRange);
newPos.y = delta;
}
transform.position = new Vector3(m_StartPos.x + newPos.x, m_StartPos.y + newPos.y, m_StartPos.z + newPos.z);
UpdateVirtualAxes(transform.position);
}

SwapChainBackgroundPanel letterboxing Monogame Windows Store App

I am porting my space shooter game from Windows Phone to a Windows Store app. In WP it always plays in full portrait orientation.
For the Windows Store app, though, while in landscape mode I want to center the game screen with letterboxing on the left and right. The problem is I can't adjust the Margin property of SwapChainBackgroundPanel, so the game is always aligned to the left and the black screen is on the right.
Here's my code
public Game1()
{
graphics = new GraphicsDeviceManager(this);
GamePage.Current.SizeChanged += OnWindowSizeChanged;
Content.RootDirectory = "Content";
}
private void OnWindowSizeChanged(object sender, Windows.UI.Xaml.SizeChangedEventArgs e)
{
var CurrentViewState = Windows.UI.ViewManagement.ApplicationView.Value;
double width = e.NewSize.Width;
double height = e.NewSize.Height;
// using Windows.Graphics.Display;
ResolutionScale resolutionScale = DisplayProperties.ResolutionScale;
string orientation = null;
if (ApplicationView.Value == ApplicationViewState.FullScreenLandscape)
{
orientation = "FullScreenLandscape";
//Does not work because it's start on the center of the screen
//Black screen is on the left and place the game screen on the right
GamePage.Current.HorizontalAlignment = Windows.UI.Xaml.HorizontalAlignment.Center;
//Gives error - WinRT information: Setting 'Margin' property is
//not supported on SwapChainBackgroundPanel.
GamePage.Current.Margin = new Thickness(centerMargin, 0, 0, 0);
}
else if (ApplicationView.Value == ApplicationViewState.FullScreenPortrait)
{
orientation = "FullScreenPortrait";
}
else if (ApplicationView.Value == ApplicationViewState.Filled)
{
orientation = "Filled";
}
else if (ApplicationView.Value == ApplicationViewState.Snapped)
{
orientation = "Snapped";
}
Debug.WriteLine("{0} x {1}. Scale: {2}. Orientation: {3}",
width.ToString(), height.ToString(), resolutionScale.ToString(),
orientation);
}
The GamePage.xaml is the default
<SwapChainBackgroundPanel
x:Class="SpaceShooterXW8.GamePage"
xmlns="http://schemas.microsoft.com/winfx/2006/xaml/presentation"
xmlns:x="http://schemas.microsoft.com/winfx/2006/xaml"
xmlns:local="using:SpaceShooterXW8"
xmlns:d="http://schemas.microsoft.com/expression/blend/2008"
xmlns:mc="http://schemas.openxmlformats.org/markup-compatibility/2006"
mc:Ignorable="d">
</SwapChainBackgroundPanel>
After some research I think I've figured it out, thanks to this blog post. To those who are in a similar situation, here's what I did.
The beauty of the solution is that the letterboxing is automatically managed by the Resolution class. All I have to do is update the batch.Begin() calls in my code to something like:
batch.Begin(SpriteSortMode.Deferred,
null, SamplerState.LinearClamp,
null,
null,
null,
Resolution.getTransformationMatrix());
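The transformation matrix returned by the Resolution helper is essentially a scale from the virtual resolution to the actual back buffer; here is a minimal hedged sketch of the idea (not the blog's exact implementation, which also accounts for the letterbox offset):
// Everything is drawn in virtual coordinates (480x800 here, set below in Game1's
// constructor) and the matrix scales it up to the real back-buffer size.
Matrix GetScaleMatrix(GraphicsDevice device, int virtualWidth, int virtualHeight)
{
    float scaleX = (float)device.PresentationParameters.BackBufferWidth / virtualWidth;
    float scaleY = (float)device.PresentationParameters.BackBufferHeight / virtualHeight;
    return Matrix.CreateScale(scaleX, scaleY, 1f);
}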
To handle resolution changes as the orientation changes, I use this in my Game1.cs:
public Game1()
{
graphics = new GraphicsDeviceManager(this);
GamePage.Current.SizeChanged += OnWindowSizeChanged;
Content.RootDirectory = "Content";
Resolution.Init(ref graphics);
Resolution.SetVirtualResolution(480, 800);
}
private void OnWindowSizeChanged(object sender, Windows.UI.Xaml.SizeChangedEventArgs e)
{
var CurrentViewState = Windows.UI.ViewManagement.ApplicationView.Value;
App.AppWidth = (int)e.NewSize.Width;
App.AppHeight = (int)e.NewSize.Height;
Resolution.SetResolution(App.AppWidth, App.AppHeight, true);
}
The initial values of App.AppWidth and App.AppHeight are set in GamePage.xaml.cs.
public GamePage(string launchArguments)
{
this.InitializeComponent();
App.AppWidth = (int)Window.Current.Bounds.Width;
App.AppHeight = (int)Window.Current.Bounds.Height;
Current = this;
// Create the game.
_game = XamlGame<Game1>.Create(launchArguments, Window.Current.CoreWindow, this);
}
Both are global static properties created in App.xaml.cs:
public static int AppWidth { get; set; }
public static int AppHeight { get; set; }
The only problem I've encountered so far is that the mouse input does not scale to the screen resolution change. I do not have a touch screen to test on, unfortunately, but I think touch input should scale. If anyone has tested touch, please share your findings. Thanks.
Update
I've managed to scale the mouse input using the following:
public static Vector2 ScaleGesture(Vector2 position)
{
int x = (int)(position.X / (float)App.AppWidth * (float)Screen.ScreenWidth);
int y = (int)(position.Y / (float)App.AppHeight * (float)Screen.ScreenHeight);
var scaledPosition = new Vector2(x, y);
return scaledPosition;
}
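A hedged usage example, assuming the mouse position comes from MonoGame's Mouse class:
// Convert the raw window-space mouse position into the game's virtual resolution.
var mouseState = Mouse.GetState();
var virtualPosition = ScaleGesture(new Vector2(mouseState.X, mouseState.Y));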
