How to add a bezel to the WearOS emulator

Samsung has started using WearOS in its latest smartwatches, e.g. the Galaxy Watch 4, and I need to test bezel functionality since that model has one. However, I couldn't find any WearOS devices in the AVD Manager that support a bezel.
I've also tried creating a new hardware profile, but there is no bezel option there either; none of the available navigation options is related to a bezel.
I've also tried to find a skin for the Galaxy Watch 4, but with no luck so far. The code below doesn't work according to a Galaxy Watch 4 owner. You can of course suggest how to fix the code, but I still want to know how to test it without buying a watch.
view.setOnGenericMotionListener { v, ev ->
    if (ev.action == MotionEvent.ACTION_SCROLL &&
        ev.isFromSource(InputDeviceCompat.SOURCE_ROTARY_ENCODER)
    ) {
        val delta = -ev.getAxisValue(MotionEventCompat.AXIS_SCROLL) *
            ViewConfigurationCompat.getScaledVerticalScrollFactor(
                ViewConfiguration.get(this), this
            )
        if (Math.abs(delta) > 2f) {
            val np = if (delta > 0) Util.nextAccount(mAccount) else Util.prevAccount(mAccount)
            Util.d(TAG, mAccount + np.toString())
            switchAccount(np)
        }
        true
    } else {
        false
    }
}
nextAccount and prevAccount are custom functions that switch the view. According to the user, neither of them is ever called.
Here is a Tizen Studio emulator with a bezel that can be rotated by dragging the white dot:

I've finally fixed the problem. In the layout of the view that is supposed to process the rotary events, I've added a requestFocus tag:
<ScrollView
    android:layout_width="match_parent"
    android:layout_height="match_parent"
    android:fillViewport="true"
    android:scrollbars="vertical"
    android:fadeScrollbars="false"
    android:id="@+id/token_scroll"
    >
    <requestFocus />
    ...
To test the bezel, I've used the menu on the right of the emulator, as shown in the picture below. Bezel events are processed correctly, at least in the emulator. I'll let you know whether it works on a real Galaxy Watch 4 when I hear from the user.
UPDATE
A Galaxy Watch 4 user has just confirmed that the bezel works after the fix, which confirms that both the fix and the testing method were correct.

Related

Input.GetAxis("Mouse X") not working on some Windows configurations

public float GetAxis()
{
    if (inputDevice == InputDevice.MouseKeyboard)
    {
        return Input.GetAxis(this.buttonName);
    }
    // Other input devices are handled elsewhere in the original class;
    // returning 0 here keeps this excerpt compilable.
    return 0f;
}
This code works perfectly on my Windows 7 x64 PC. My project's Input settings are the defaults:
But I've watched some videos on YouTube of people playing my game, and they can't use the mouse in it. It looks like Input.GetAxis("Mouse X") and Input.GetAxis("Mouse Y") are not returning proper values for them, so they can't control the camera in the game.
Other input works fine for them.
My Unity version is 5.6.0f3 and I can't upgrade to a current version because the game's code is too complex.
How can I troubleshoot and fix this? I have only built for Windows x86 and x64, not for other platforms.
The input object is constructed like this:
public GenericInput rotateCameraXInput = new GenericInput("Mouse X", "RightAnalogHorizontal");
To read the mouse movement delta, I run this method in LateUpdate():
protected virtual void CameraInput()
{
    if (tpCamera == null || cc.lockCamera)
        return;
    var Y = rotateCameraYInput.GetAxis();
    var X = rotateCameraXInput.GetAxis();
}
I've come across this issue myself while using an RDP session or some kind of remote viewer such as TeamViewer. Mouse X and Mouse Y read the output directly from a device; if the device is not directly plugged into the machine that the player is being run on, the inputs will not be retrieved properly. I'm not sure if this is the case for you, but it is the only instance I can think of where these are not picked up.
Maybe you should add a bit of code that gets the mouse position each frame and outputs the difference; this would bypass the Mouse X/Y inputs anyway. A rough sketch is below.
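Something along these lines could work as a fallback. This is only a sketch; the MouseDeltaFallback class and its method names are made up for illustration and are not part of the project above:
using UnityEngine;

// Tracks Input.mousePosition between frames and exposes the per-frame delta,
// bypassing the "Mouse X"/"Mouse Y" axes entirely.
public class MouseDeltaFallback : MonoBehaviour
{
    private Vector3 lastMousePosition;

    void Start()
    {
        lastMousePosition = Input.mousePosition;
    }

    // Horizontal mouse movement (in pixels) since the previous frame.
    public float GetDeltaX()
    {
        return Input.mousePosition.x - lastMousePosition.x;
    }

    // Vertical mouse movement (in pixels) since the previous frame.
    public float GetDeltaY()
    {
        return Input.mousePosition.y - lastMousePosition.y;
    }

    void LateUpdate()
    {
        // Remember the current position so the next frame's delta is measured against it.
        lastMousePosition = Input.mousePosition;
    }
}
Note that if the camera code also reads the delta in LateUpdate(), script execution order decides whether it sees the value before or after it is reset, so sampling the position in Update() may be the safer choice.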
Update: It is not a bug; I had just missed something in my project.
I have a script that controls the speed of the mouse-controlled camera, with a PlayerPrefs variable to change it. Under some conditions that variable was set to 0. On my own PC the value had already been saved to the registry, so everything kept working fine for me.
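In other words, the axis value was being multiplied by a stored sensitivity of zero. A minimal illustration of the pitfall and a simple guard; the PlayerPrefs key name here is only an example, not the actual name from my project:
// If the stored sensitivity is 0, the camera appears dead even though
// Input.GetAxis("Mouse X") itself works fine.
float sensitivity = PlayerPrefs.GetFloat("MouseSensitivity", 1f);

// Clamp to a sane minimum so a bad saved value can never zero out the input.
sensitivity = Mathf.Max(sensitivity, 0.01f);

float x = Input.GetAxis("Mouse X") * sensitivity;
float y = Input.GetAxis("Mouse Y") * sensitivity;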
Maybe I should delete this question, because it did not provide enough data.
I found this thread on the Unity forum. Some people there encountered the same issue on real Windows PCs with different Unity versions and different mouse drivers.
This is an old Unity hardware compatibility bug. It looks like it can't be fixed other than by upgrading Unity or using a different input system.

Input.Touch not working on OSX

I'm a total beginner in Unity and created a function that simply sets the text of a text element when a touch is detected, like this:
// Update is called once per frame
void Update () {
    if (didTap ()) {
        print ("did tap!");
        tapText.text = "TAP!";
    }
}

private bool didTap() {
    print ("checking didtap");
    if (Input.touchCount > 0) {
        Touch initialTouch = Input.GetTouch (0);
        return initialTouch.phase == TouchPhase.Began;
    }
    return false;
}
When I run this on an actual device (an Android phone) it works perfectly, but when I 'click' in the Unity Editor while running the game, nothing happens. I'm running OSX 10.12.2. Is there anything I need to configure to have the mouse mimic touch events?
You can duplicate Unity's Play Mode window onto your mobile device by using the Unity Remote app. This lets you interact with your phone's features while still running Unity on your computer.
Unity Remote requires some setup in Unity along with downloading the app. Consider watching this video: Unity Remote Setup Tutorial
Alternatively, see the answer from Hellium on this question: read mouse clicks as well, and compile only the code that matches the device being used.
There's an odd difference between "touch" input and "mouse" input in Unity. Input.touchCount will always return 0 while you're running in the Editor, but Input.GetMouseButtonDown(0) works both in the Editor and for the first touch of one or more touches on a mobile build.
You have two options to solve this in code (your other option is the Unity Remote approach ryemoss mentioned):
If you're only ever using one touch, use Input.GetMouseButtonDown(0) and Input.GetMouseButtonUp(0).
Use #if UNITY_IOS, #if UNITY_ANDROID and #if UNITY_EDITOR directives to write separate input readers for your platforms and then have them all call into the same functionality; a sketch combining both ideas follows.
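For example (only a sketch for the single-touch case; the TapInput class and TapBegan method are illustrative names, not Unity API):
using UnityEngine;

public static class TapInput
{
    // True on the frame a tap (device) or left click (Editor) begins.
    public static bool TapBegan()
    {
#if UNITY_EDITOR
        // Input.touchCount is always 0 in the Editor, so fall back to the mouse.
        return Input.GetMouseButtonDown(0);
#elif UNITY_IOS || UNITY_ANDROID
        // On device, use the first touch.
        return Input.touchCount > 0 && Input.GetTouch(0).phase == TouchPhase.Began;
#else
        return Input.GetMouseButtonDown(0);
#endif
    }
}
The Update() method in the question could then call TapInput.TapBegan() instead of didTap() and behave the same way in the Editor and on the device.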

App did not run at iPhone resolution when reviewed on iPad running iOS 10.2.3

This is my first iOS app, designed explicitly for the iPhone, not the iPad, but apparently Apple won't put it on their store unless it also runs on an iPad. I'm at a total loss as to how to "fix" this. I've searched everywhere for suggestions and nothing I've tried works (including here on Stack Overflow). I can run the app in the iPad simulator and get the same results as Apple does, but I can't seem to find a fix. This is proving extremely frustrating, especially when one considers that this app isn't really usable on an iPad because it needs access to a cellular network.
I'm using the latest Xamarin for Visual Studio, with Visual Studio 2013. There is no mention of the iPad in Info.plist or anywhere else, for that matter.
Does anyone have any suggestions?
R/
Prescott ....
You are receiving this from Apple because you may have indicated that your app is a Universal App.
To indicate that your application will only run on iPhones, you need to set this in the Info.plist file if you're using Xamarin Studio; when using Visual Studio it is in the project properties. The option is called "Devices"; make sure you select iPhone/iPod from the dropdown list.
Device selection in VS2017
I know you said you are using 2013, but this might give you an idea.
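For reference, that dropdown corresponds to the UIDeviceFamily entry in Info.plist, so you can also set it there by hand; an iPhone/iPod-only app would contain roughly this (1 means iPhone/iPod touch, 2 means iPad):
<key>UIDeviceFamily</key>
<array>
    <integer>1</integer>
</array>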
All,
Apple requires that apps run on all iOS platforms. Accordingly, I had to add constraints to my storyboard to adjust the location of the subviews on each screen. Because adding each constraint by hand is tedious, I used Cirrious Fluent Layout, which worked very well for me. Below is the code I used on the screen that includes an image view. This was the most complicated screen to "fix" because (apparently) the image view somehow changed all the screen sizes; being an iOS novice, I had no idea why this worked, only that it did.
First I needed to add the reference:
using Cirrious.FluentLayouts.Touch;
Code:
// This line is required to turn off all autosizing/positioning
View.SubviewsDoNotTranslateAutoresizingMaskIntoConstraints();

// Get the screen dimensions and the middle of the screen
// for button positioning
var barheight = this.NavigationController.NavigationBar.Bounds.Height; // Height of the navigation bar
var height = UIScreen.MainScreen.Bounds.Height;
var width = UIScreen.MainScreen.Bounds.Width;
int middle = (int) UIScreen.MainScreen.Bounds.Width / 2;
// We cast to int to truncate the float;
// int will convert implicitly to float when used (Visual Studio).

// 74 is the height of the banner, 47 is the height of the buttons and
// 26 is the height of the title label plus a 5px gap. The rest of the
// screen is available for use by the image view, so set
// heightavailabletoimageviw to this value.
// Had to subtract 60 because the image view still overlapped
// the buttons, no idea why. Anyone?
var heightavailabletoimageviw = height - 74 - 47 - 26 - 60;

// Had to add a constraint to the image view because if I didn't,
// it automatically scaled to the size of the image, which is not good.
ThePhoto.AddConstraints(
    ThePhoto.Width().EqualTo(UIScreen.MainScreen.Bounds.Width),
    ThePhoto.Height().EqualTo(heightavailabletoimageviw)
);

// Had to fix the size of the image button, otherwise the button size
// scaled to the size of the image
btnPhoto.AddConstraints(
    btnPhoto.Width().EqualTo(62f),
    btnPhoto.Height().EqualTo(47f)
);

// Now we add the constraints to the view controller to finish up.
View.AddConstraints(
    // Can't cover the navigation bar (unless it isn't there; mine is),
    // this value sets all other relative positions
    Banner.AtTopOf(View, barheight),
    Banner.AtRightOf(View, 0),
    Banner.AtLeftOf(View, 0),
    lblTitle.Below(Banner, 0),
    lblTitle.WithSameWidth(Banner),
    ThePhoto.Below(lblTitle, 5),
    ThePhoto.WithSameWidth(lblTitle),
    // I have no idea why, but I had to use negative
    // values for the buttons to appear on the screen,
    // otherwise they were off screen.
    // If anyone could explain this, I would appreciate it.
    btnUpload.AtBottomOf(View),
    btnUpload.ToLeftOf(View, -60),
    // Same here, had to use negative values for the button to
    // position correctly on the screen
    btnPhoto.AtBottomOf(View),
    btnPhoto.ToLeftOf(View, -(middle + 31)),
    // Again, same thing.
    btnMainMenu.AtBottomOf(View),
    btnMainMenu.ToRightOf(View, -80)
);
This is how I solved my problem. I re-submitted the app and it now appears on the App Store at: https://itunes.apple.com/us/app/oml-photo-manager/id1212622377?mt=8.
Hope this helps someone ....
R/
Prescott ...

Today Widget indented / doesn't use all space

I'm building a Today Widget in iOS, and my view doesn't take up the full width of the screen:
As you can see in the screenshot above, anything that goes beyond that point of the screen gets cut off.
I have set this up using the standard storyboard provided with the Today Widget extension:
This is confusing to me because I see plenty of other apps' Today extensions use the full width of the screen.
You need to implement widgetMarginInsetsForProposedMarginInsets: like this:
func widgetMarginInsetsForProposedMarginInsets(defaultMarginInsets: UIEdgeInsets) -> UIEdgeInsets {
    return UIEdgeInsetsZero
}
Check the Apple doc - Design the UI.

Cause for a clipped keyboard in landscape

I have a page with a Pivot that contains TextBox controls. In landscape, the SIP (the virtual keyboard) is offset to the right by 42 pixels, and thus clipped on its right edge.
Another app of mine also has a similar page, without the offset-keyboard problem. Before I dig further into the differences between the two: has anyone encountered this problem before? Can we consider this a bug in Windows Phone 7.1?
(it does occur on a real device too)
It is a bug in Windows Phone:
If you set the Mode property on the app bar to Minimised and then turn the thing to landscape, the app bar pops back out. The code that figures out where to show the keyboard doesn't realise this and displays the keyboard as if the app bar is still minimised.
I solved it by changing the mode of the app bar as the orientation changes:
private void phoneApplicationPage_OrientationChanged(object sender, OrientationChangedEventArgs e)
{
    if (e.Orientation == PageOrientation.LandscapeLeft || e.Orientation == PageOrientation.LandscapeRight)
    {
        this.ApplicationBar.Mode = Microsoft.Phone.Shell.ApplicationBarMode.Default;
    }
    else
    {
        this.ApplicationBar.Mode = Microsoft.Phone.Shell.ApplicationBarMode.Minimized;
    }
}
This seems to solve the problem.
Try checking whether you have set a common right offset margin somewhere in your code. That is the most likely reason for this behavior.
I have never experienced this error myself.
